Sample records for task 2 laboratory scale

  1. Guidance, Navigation and Control Digital Emulation Technology Laboratory. Volume 1. Part 2. Task 1: Digital Emulation Technology Laboratory

    DTIC Science & Technology

    1991-09-27

    AD-A241 692. Annual Report, Volume 1, Part 2, Task 1: Digital Emulation Technology Laboratory. Report No. AR-0142-91-001, September 27, 1991. Contract No. DASG60-89-C-0142, sponsored by the United States Army Strategic Defense Command. Authors: Thomas R. Collins and Stephen R. Wachtel.

  2. The AAPT Advanced Laboratory Task Force Report

    NASA Astrophysics Data System (ADS)

    Dunham, Jeffrey

    2008-04-01

    In late 2005, the American Association of Physics Teachers (AAPT) assembled a seven-member Advanced Laboratory Task Force^ to recommend ways that AAPT could increase the degree and effectiveness of its interactions with physics teachers of upper-division physics laboratories, with the ultimate goal of improving the teaching of advanced laboratories. The task force completed its work during the first half of 2006, and its recommendations were presented to the AAPT Executive Committee in July 2006. This talk will present the recommendations of the task force and actions taken by AAPT in response to them. The curricular goals of the advanced laboratory course at various institutions will also be discussed. The talk will conclude with an appeal to the APS membership to support ongoing efforts to revitalize advanced laboratory course instruction. ^Members of the Advanced Laboratory Task Force: Van Bistrow, University of Chicago; Bob DeSerio, University of Florida; Jeff Dunham, Middlebury College (Chair); Elizabeth George, Wittenberg University; Daryl Preston, California State University, East Bay; Patricia Sparks, Harvey Mudd College; Gerald Taylor, James Madison University; and David Van Baak, Calvin College.

  3. Laboratory and Pilot Scale Evaluation of Coagulation, Clarification, and Filtration for Upgrading Sewage Lagoon Effluents.

    DTIC Science & Technology

    1980-08-01

    AD-AGAB 906. Army Engineer Waterways Experiment Station, Vicksburg. FIG 14/2. Laboratory and Pilot Scale Evaluation of Coagulation, Clarification, and Filtration for Upgrading Sewage Lagoon Effluents. M. John Cullinane, Jr., and Richard A. Shafer, Environmental Laboratory, U.S. Army Engineer Waterways Experiment Station.

  4. Inquiring Scaffolds in Laboratory Tasks: An Instance of a "Worked Laboratory Guide Effect"?

    ERIC Educational Resources Information Center

    Schmidt-Borcherding, Florian; Hänze, Martin; Wodzinski, Rita; Rincke, Karsten

    2013-01-01

    The study explores whether established support devices for paper-pencil problem solving, namely worked examples and incremental scaffolds, are applicable to laboratory tasks. N = 173 grade eight students solved a physics laboratory task in dyads in one of three conditions. In condition A (unguided problem solving), students were asked to determine the…

  5. Rankings, Standards, and Competition: Task vs. Scale Comparisons

    ERIC Educational Resources Information Center

    Garcia, Stephen M.; Tor, Avishalom

    2007-01-01

    Research showing how upward social comparison breeds competitive behavior has so far conflated local comparisons in "task" performance (e.g. a test score) with comparisons on a more general "scale" (i.e. an underlying skill). Using a ranking methodology (Garcia, Tor, & Gonzalez, 2006) to separate task and scale comparisons, Studies 1-2 reveal that…

  6. Data Services and Transnational Access for European Geosciences Multi-Scale Laboratories

    NASA Astrophysics Data System (ADS)

    Funiciello, Francesca; Rosenau, Matthias; Sagnotti, Leonardo; Scarlato, Piergiorgio; Tesei, Telemaco; Trippanera, Daniele; Spires, Chris; Drury, Martyn; Kan-Parker, Mirjam; Lange, Otto; Willingshofer, Ernst

    2016-04-01

    The EC policy for research in the new millennium supports the development of European-scale research infrastructures. In this perspective, existing research infrastructures are being integrated with the objective of increasing their accessibility and enhancing the usability of their multidisciplinary data. Building up integrated Earth Sciences infrastructures in Europe is the mission of the Implementation Phase (IP) of the European Plate Observing System (EPOS) project (2015-2019). The integration of European multi-scale laboratories - analytical, experimental petrology and volcanology, magnetic and analogue laboratories - plays a key role in this context and represents a specific task of EPOS IP. Within EPOS IP work package 16 (WP16), European geosciences multi-scale laboratories are being linked, merging local infrastructures into a coherent and collaborative network. In particular, EPOS IP WP16 task 4, "Data services", aims to standardize data and data products, both existing and newly produced by the participating laboratories, and to make them available through a new digital platform.
The following data and repositories have been selected for the purpose: 1) analytical and properties data a) on volcanic ash from explosive eruptions, of interest to the aviation industry, meteorological and government institutes, b) on magmas in the context of eruption and lava flow hazard evaluation, and c) on rock systems of key importance in mineral exploration and mining operations; 2) experimental data describing: a) rock and fault properties of importance for modelling and forecasting natural and induced subsidence, seismicity and associated hazards, b) rock and fault properties relevant for modelling the containment capacity of rock systems for CO2, energy sources and wastes, c) crustal and upper mantle rheology as needed for modelling sedimentary basin formation and crustal stress distributions, d) the composition, porosity, permeability, and

  7. Factors Predicting Smoking in a Laboratory-Based Smoking-Choice Task

    PubMed Central

    Bold, Krysten W.; Yoon, Haewon; Chapman, Gretchen B.; McCarthy, Danielle E.

    2013-01-01

    This study aimed to expand the current understanding of smoking maintenance mechanisms by examining how putative relapse risk factors relate to a single behavioral smoking choice using a novel laboratory smoking-choice task. After 12 hours of nicotine deprivation, participants were exposed to smoking cues and given the choice between smoking up to two cigarettes in a 15-minute window or waiting and receiving four cigarettes after a delay of 45 minutes. Greater nicotine dependence, higher impulsivity, and lower distress tolerance were hypothesized to predict earlier and more intensive smoking. Out of 35 participants (n=9 female), 26 chose to smoke with a median time to a first puff of 1.22 minutes (standard deviation=2.62 min, range=0.03–10.62 min). Survival analyses examined latency to first puff, and results indicated that greater pre-task craving and smoking more cigarettes per day were significantly related to smoking sooner in the task. Greater behavioral disinhibition predicted shorter smoking latency in the first two minutes of the task, but not at a delay of more than two minutes. Lower distress tolerance (reporting greater regulation efforts to alleviate distress) was related to more puffs smoked and greater nicotine dependence was related to more time spent smoking in the task. This novel laboratory smoking-choice paradigm may be a useful laboratory analog for the choices smokers make during cessation attempts and may help identify factors that influence smoking lapses. PMID:23421357

  8. Hydrodynamic Scalings: from Astrophysics to Laboratory

    NASA Astrophysics Data System (ADS)

    Ryutov, D. D.; Remington, B. A.

    2000-05-01

    A surprisingly general hydrodynamic similarity has recently been described in Refs. [1,2]. One can call it the Euler similarity because it works for the Euler equations (with MHD effects included). Although the dissipation processes are assumed to be negligible, the presence of shocks is allowed. For a polytropic medium (i.e., a medium where the energy density is proportional to the pressure), the evolution of an arbitrarily chosen 3D initial state can be scaled to another system if a single dimensionless parameter (the Euler number) is the same for both initial states. The Euler similarity allows one to properly design laboratory experiments modeling astrophysical phenomena. We discuss several examples of such experiments related to the physics of supernovae [3]. For problems with a single spatial scale, the condition of the smallness of dissipative processes can be adequately described in terms of the Reynolds, Peclet, and magnetic Reynolds numbers related to this scale (all three numbers must be large). However, if the system develops small-scale turbulence, dissipation may become important at these smaller scales, thereby affecting the gross behavior of the system. We analyze the corresponding constraints. We also discuss constraints imposed by the presence of interfaces between substances with different polytropic indices. Another set of similarities governs the evolution of photoevaporation fronts in astrophysics. Convenient scaling laws exist in situations where the density of the ablated material is very low compared to the bulk density. We conclude that a number of hydrodynamical problems related to such objects as the Eagle Nebula can be adequately simulated in the laboratory. We also discuss possible scalings for radiative astrophysical jets (see Ref. [3] and references therein). This work was performed under the auspices of the U.S. Department of Energy by University of California Lawrence Livermore National Laboratory under contract W-7405-Eng-48.
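The abstracts above (records 8 and 24) turn on matching a single dimensionless parameter between laboratory and astrophysical states. A minimal sketch, assuming the usual definition Eu = v·sqrt(ρ/p) from the similarity literature; all numerical values below are hypothetical, chosen only to illustrate matching Eu between a laboratory state and a scaled astrophysical state, not taken from the records:

```python
import math

def euler_number(v, rho, p):
    # Eu = v * sqrt(rho / p): the single dimensionless parameter of the
    # Euler similarity (velocity v, density rho, pressure p, SI units).
    return v * math.sqrt(rho / p)

# Hypothetical laboratory state: ~30 km/s flow, solid density, ~0.3 Mbar.
eu_lab = euler_number(v=3.0e4, rho=1.0e3, p=3.0e10)

# Hypothetical astrophysical state: choose the pressure that matches Eu,
# so the two initial states evolve identically in scaled variables.
v_astro, rho_astro = 2.0e6, 1.0e-21
p_astro = rho_astro * (v_astro / eu_lab) ** 2
assert math.isclose(euler_number(v_astro, rho_astro, p_astro), eu_lab)
```

The point of the sketch is only that one free thermodynamic quantity can always be tuned to equalize Eu; whether the dissipative constraints (large Reynolds, Peclet, and magnetic Reynolds numbers) then hold must be checked separately, as both abstracts stress.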

  9. Exploration of task performance tests in a physics laboratory

    NASA Astrophysics Data System (ADS)

    Liu, Dan; El Turkey, Houssein

    2017-11-01

    In this article, we investigate the implementation of task performance tests in an undergraduate physics laboratory. Two performance tests were carried out over two semesters using the task of building a DC circuit. The first implementation in Spring 2014 raised certain concerns, such as the privacy of students’ testing and their ‘trial and error’ attempts. These concerns were addressed in Fall 2015 through a second performance test, which was administered differently but had the same content. We discuss the validity of both implementations and present the correlation (or lack thereof) between the time students needed to complete the tests and their grades from a paper-based laboratory assessment method.

  10. Implementing the Science Assessment Standards: Developing and validating a set of laboratory assessment tasks in high school biology

    NASA Astrophysics Data System (ADS)

    Saha, Gouranga Chandra

    Very often a number of factors, especially time, space, and money, deter many science educators from using inquiry-based, hands-on, laboratory practical tasks as alternative assessment instruments in science. A shortage of valid inquiry-based laboratory tasks for high school biology has been cited. Driven by this need, this study addressed the following three research questions: (1) How can laboratory-based performance tasks be designed and developed so that they are doable by the students for whom they are written? (2) Do student responses to the laboratory-based performance tasks validly represent at least some of the intended process skills that new biology learning goals want students to acquire? (3) Are the laboratory-based performance tasks psychometrically consistent as individual tasks and as a set? To answer these questions, three tasks were used from the six biology tasks initially designed and developed by an iterative process of trial testing. Analyses of data from 224 students showed that performance-based laboratory tasks that are doable by all students require a careful and iterative process of development. Although the students demonstrated more skill in performing than in planning and reasoning, their performance at the item level was very poor for some items. Possible reasons for the poor performances are discussed, and suggestions are made on how to remediate the deficiencies. Empirical evidence for the validity and reliability of the instrument is presented from both the classical and the modern validity criteria points of view. Limitations of the study are identified. Finally, implications of the study and directions for further research are discussed.

  11. An Examination of Preservice Primary Teachers' Written Arguments in an Open Inquiry Laboratory Task

    ERIC Educational Resources Information Center

    McDonald, Christine V.

    2013-01-01

    This study assessed the quality of preservice primary teachers' written arguments in an open inquiry laboratory task. An analysis of the features of the laboratory task was also undertaken to ascertain the characteristics of the task that facilitated or constrained the development of participants' written arguments. Australian preservice primary…

  12. [The National Reference Centres and Reference Laboratories. Importance and tasks].

    PubMed

    Laude, G; Ammon, A

    2005-09-01

    Since 1995, the German Federal Ministry for Health and Social Security funds National Reference Centres (NRC) for the laboratory surveillance of important pathogens and syndromes. Which pathogens or syndromes are selected to be covered by a NRC depends on their epidemiological relevance, the special diagnostic tools, problems with antimicrobial resistance and necessary infection control measures. Currently, there are 15 NRC, which are appointed for a period of 3 years (currently from January 2005 through December 2007). Towards the end of their appointment all NRC are evaluated by a group of specialists. The assessment of their achievements is guided by a catalogue of tasks for the NRC. In addition to the NRC, a total of 50 laboratories are appointed which provide specialist expertise for additional pathogens in order to have a broad range of pathogens for which specialist laboratories are available. Their predominant task is to give advice and support for special diagnostic problems. Both NRC and the specialist laboratories are important parts of the network for infectious disease epidemiology.

  13. Cognitive task analysis for teaching technical skills in an inanimate surgical skills laboratory.

    PubMed

    Velmahos, George C; Toutouzas, Konstantinos G; Sillin, Lelan F; Chan, Linda; Clark, Richard E; Theodorou, Demetrios; Maupin, Fredric

    2004-01-01

    The teaching of surgical skills is based mostly on the traditional "see one, do one, teach one" resident-to-resident method. Surgical skills laboratories provide a new environment for teaching skills but their effectiveness has not been adequately tested. Cognitive task analysis is an innovative method to teach skills, used successfully in nonmedical fields. The objective of this study is to evaluate the effectiveness of a 3-hour surgical skills laboratory course on central venous catheterization (CVC), taught by the principles of cognitive task analysis to surgical interns. Upon arrival to the Department of Surgery, 26 new interns were randomized to either receive a surgical skills laboratory course on CVC ("course" group, n = 12) or not ("traditional" group, n = 14). The course consisted mostly of hands-on training on inanimate CVC models. All interns took a 15-item multiple-choice question test on CVC at the beginning of the study. Within two and a half months all interns performed CVC on critically ill patients. The outcome measures were cognitive knowledge and technical-skill competence on CVC. These outcomes were assessed by a 14-item checklist evaluating the interns while performing CVC on a patient and by the 15-item multiple-choice-question test, which was repeated at that time. There were no differences between the two groups in the background characteristics of the interns or the patients having CVC. The scores at the initial multiple-choice test were similar (course: 7.33 +/- 1.07, traditional: 8 +/- 2.15, P = 0.944). However, the course interns scored significantly higher in the repeat test compared with the traditional interns (11 +/- 1.86 versus 8.64 +/- 1.82, P = 0.03). Also, the course interns achieved a higher score on the 14-item checklist (12.6 +/- 1.1 versus 7.5 +/- 2.2, P <0.001). 
They required fewer attempts to find the vein (3.3 +/- 2.2 versus 6.4 +/- 4.2, P = 0.046) and showed a trend toward less time to complete the procedure (15.4 +/- 9

  14. The suppression of scale-free fMRI brain dynamics across three different sources of effort: aging, task novelty and task difficulty.

    PubMed

    Churchill, Nathan W; Spring, Robyn; Grady, Cheryl; Cimprich, Bernadine; Askren, Mary K; Reuter-Lorenz, Patricia A; Jung, Mi Sook; Peltier, Scott; Strother, Stephen C; Berman, Marc G

    2016-08-08

    There is growing evidence that fluctuations in brain activity may exhibit scale-free ("fractal") dynamics. Scale-free signals follow a spectral-power curve of the form P(f) ∝ f^(−β), where spectral power decreases in a power-law fashion with increasing frequency. In this study, we demonstrated that fractal scaling of the BOLD fMRI signal is consistently suppressed for different sources of cognitive effort. Decreases in the Hurst exponent (H), which quantifies scale-free signal, were related to three different sources of cognitive effort/task engagement: 1) task difficulty, 2) task novelty, and 3) aging effects. These results were consistently observed across multiple datasets and task paradigms. We also demonstrated that estimates of H are robust across a range of time-window sizes. H was also compared to alternative metrics of BOLD variability (SDBOLD) and global connectivity (Gconn), with effort-related decreases in H producing similar decreases in SDBOLD and Gconn. These results indicate a potential global brain phenomenon that unites research from different fields and indicates that fractal scaling may be a highly sensitive metric for indexing cognitive effort/task engagement.

  15. The suppression of scale-free fMRI brain dynamics across three different sources of effort: aging, task novelty and task difficulty

    PubMed Central

    Churchill, Nathan W.; Spring, Robyn; Grady, Cheryl; Cimprich, Bernadine; Askren, Mary K.; Reuter-Lorenz, Patricia A.; Jung, Mi Sook; Peltier, Scott; Strother, Stephen C.; Berman, Marc G.

    2016-01-01

    There is growing evidence that fluctuations in brain activity may exhibit scale-free (“fractal”) dynamics. Scale-free signals follow a spectral-power curve of the form P(f) ∝ f^(−β), where spectral power decreases in a power-law fashion with increasing frequency. In this study, we demonstrated that fractal scaling of the BOLD fMRI signal is consistently suppressed for different sources of cognitive effort. Decreases in the Hurst exponent (H), which quantifies scale-free signal, were related to three different sources of cognitive effort/task engagement: 1) task difficulty, 2) task novelty, and 3) aging effects. These results were consistently observed across multiple datasets and task paradigms. We also demonstrated that estimates of H are robust across a range of time-window sizes. H was also compared to alternative metrics of BOLD variability (SDBOLD) and global connectivity (Gconn), with effort-related decreases in H producing similar decreases in SDBOLD and Gconn. These results indicate a potential global brain phenomenon that unites research from different fields and indicates that fractal scaling may be a highly sensitive metric for indexing cognitive effort/task engagement. PMID:27498696
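The spectral exponent β in the two records above can be illustrated with a simple estimator: fit a line to the log-log periodogram of a signal. The sketch below is not the authors' pipeline (they work with the related Hurst exponent H; for fractional Gaussian noise the standard relation is β = 2H − 1), just a minimal NumPy demonstration that recovers β ≈ 0 for white noise and β ≈ 2 for integrated (brown) noise:

```python
import numpy as np

def spectral_slope(x):
    """Estimate beta in P(f) ~ f**(-beta) via a log-log periodogram fit."""
    n = len(x)
    freqs = np.fft.rfftfreq(n)[1:]                          # drop f = 0
    power = (np.abs(np.fft.rfft(x - x.mean()))[1:] ** 2) / n
    slope, _ = np.polyfit(np.log(freqs), np.log(power), 1)
    return -slope                                           # P ~ f**(-beta)

rng = np.random.default_rng(0)
white = rng.standard_normal(8192)   # uncorrelated noise: beta ~ 0
brown = np.cumsum(white)            # integrated noise:   beta ~ 2
beta_white = spectral_slope(white)
beta_brown = spectral_slope(brown)
```

A suppression of scale-free dynamics, as reported in the study, corresponds to β (equivalently H) moving toward the white-noise end of this range.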

  16. Laboratory studies of 2H evaporator scale dissolution in dilute nitric acid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oji, L.

    The rate of 2H evaporator scale solids dissolution in dilute nitric acid has been experimentally evaluated under laboratory conditions in the SRNL shielded cells. The 2H scale sample used for the dissolution study came from the bottom of the evaporator cone section and the wall section of the evaporator cone. The accumulation rates of aluminum and silicon, assumed to be the two principal elemental constituents of the 2H evaporator scale aluminosilicate mineral, were monitored in solution. Aluminum and silicon concentration changes with heating time, at a constant oven temperature of 90 °C, were used to ascertain the extent of dissolution of the 2H evaporator scale mineral. The 2H evaporator scale solids, assumed to be composed mostly of aluminosilicate mineral, readily dissolve in 1.5 and 1.25 M dilute nitric acid solutions, yielding the principal elemental components aluminum and silicon in solution. The 2H scale dissolution rate constants, based on aluminum accumulation in 1.5 and 1.25 M dilute nitric acid solution, are, respectively, 9.21E-04 ± 6.39E-04 min⁻¹ and 1.07E-03 ± 7.51E-05 min⁻¹. The silicon accumulation rate in solution tracks the aluminum accumulation profile during the first few minutes of scale dissolution; however, it diverges towards the end of the dissolution. This divergence means the aluminum-to-silicon ratio in the first phase of the scale dissolution (non-steady-state conditions) differs from the ratio towards the end of the dissolution. Possible causes of this change in silicon accumulation as the dissolution progresses include silicon precipitation from solution, or the 2H evaporator scale being a heterogeneous mixture of aluminosilicate minerals with several impurities. The average half-life for the decomposition of the 2H evaporator scale mineral in 1.5 M nitric acid is 12.5 hours, while the half-life for the decomposition of the 2H evaporator scale in 1.25 M nitric acid is
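The reported rate constant and half-life are mutually consistent if the dissolution follows simple first-order kinetics, where t½ = ln 2 / k. A quick check of the 1.5 M figure:

```python
import math

# Reported first-order rate constant in 1.5 M HNO3 (central value)
k_15M = 9.21e-4                     # min^-1
t_half_min = math.log(2) / k_15M    # t_1/2 = ln 2 / k for first-order decay
t_half_h = t_half_min / 60.0
print(round(t_half_h, 1))           # 12.5 -> matches the reported ~12.5 h
```

The same arithmetic on the 1.25 M constant (1.07E-03 min⁻¹) gives roughly 10.8 h, which is presumably where the truncated final sentence was headed, though the record does not state the number.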

  17. Design of a laboratory scale fluidized bed reactor

    NASA Astrophysics Data System (ADS)

    Wikström, E.; Andersson, P.; Marklund, S.

    1998-04-01

    The aim of this project was to construct a laboratory scale fluidized bed reactor that simulates the behavior of full scale municipal solid waste combustors. The design of this reactor is thoroughly described. The size of the laboratory scale fluidized bed reactor is 5 kW, which corresponds to a fuel-feeding rate of approximately 1 kg/h. The reactor system consists of four parts: a bed section, a freeboard section, a convector (postcombustion zone), and an air pollution control (APC) device system. The inside diameter of the reactor is 100 mm at the bed section and it widens to 200 mm in diameter in the freeboard section; the total height of the reactor is 1760 mm. The convector part consists of five identical sections; each section is 2700 mm long and has an inside diameter of 44.3 mm. The reactor is flexible regarding the placement and number of sampling ports. At the beginning of the first convector unit and at the end of each unit there are sampling ports for organic micropollutants (OMP). This makes it possible to study the composition of the flue gases at various residence times. Sampling ports for inorganic compounds and particulate matter are also placed in the convector section. All operating parameters, reactor temperatures, concentrations of CO, CO2, O2, SO2, NO, and NO2 are continuously measured and stored at selected intervals for further evaluation. These unique features enable full control over the fuel feed, air flows, and air distribution as well as over the temperature profile. Elaborate details are provided regarding the configuration of the fuel-feeding systems, the fluidized bed, the convector section, and the APC device. This laboratory reactor enables detailed studies of the formation mechanisms of OMP, such as polychlorinated dibenzo-p-dioxins (PCDDs), polychlorinated dibenzofurans (PCDFs), poly-chlorinated biphenyls (PCBs), and polychlorinated benzenes (PCBzs). With this system formation mechanisms of OMP occurring in both the combustion
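One back-of-envelope consistency check on the stated design point (this calculation is mine, not the record's): a 5 kW thermal rating at a 1 kg/h feed rate implies a fuel heating value of about 18 MJ/kg, a plausible figure for municipal solid waste or refuse-derived fuel.

```python
# Implied fuel heating value for a 5 kW reactor fed at 1 kg/h
power_kW = 5.0                  # reactor thermal rating
feed_kg_per_h = 1.0             # reported fuel-feeding rate
energy_per_kg_MJ = power_kW * 3600.0 / (feed_kg_per_h * 1000.0)
print(energy_per_kg_MJ)         # 18.0 (MJ/kg)
```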

  18. The Work Tasks Motivation Scale for Teachers (WTMST)

    ERIC Educational Resources Information Center

    Fernet, Claude; Senecal, Caroline; Guay, Frederic; Marsh, Herbert; Dowson, Martin

    2008-01-01

    The authors developed and validated a measure of teachers' motivation toward specific work tasks: The Work Tasks Motivation Scale for Teachers (WTMST). The WTMST is designed to assess five motivational constructs toward six work tasks (e.g., class preparation, teaching). The authors conducted a preliminary (n = 42) and a main study among…

  19. Student Off-Task Electronic Multitasking Predictors: Scale Development and Validation

    ERIC Educational Resources Information Center

    Qian, Yuxia; Li, Li

    2017-01-01

    In an attempt to better understand factors contributing to students' off-task electronic multitasking behavior in class, the research included two studies that developed a scale of students' off-task electronic multitasking predictors (the SOTEMP scale), and explored relationships between the scale and various classroom communication processes and…

  20. Achieving across-laboratory replicability in psychophysical scaling

    PubMed Central

    Ward, Lawrence M.; Baumann, Michael; Moffat, Graeme; Roberts, Larry E.; Mori, Shuji; Rutledge-Taylor, Matthew; West, Robert L.

    2015-01-01

    It is well known that, although psychophysical scaling produces good qualitative agreement between experiments, precise quantitative agreement between experimental results, such as that routinely achieved in physics or biology, is rarely or never attained. A particularly galling example of this is the fact that power function exponents for the same psychological continuum, measured in different laboratories but ostensibly using the same scaling method, magnitude estimation, can vary by a factor of three. Constrained scaling (CS), in which observers first learn a standardized meaning for a set of numerical responses relative to a standard sensory continuum and then make magnitude judgments of other sensations using the learned response scale, has produced excellent quantitative agreement between individual observers’ psychophysical functions. Theoretically it could do the same for across-laboratory comparisons, although this needs to be tested directly. We compared nine different experiments from four different laboratories as an example of the level of across experiment and across-laboratory agreement achievable using CS. In general, we found across experiment and across-laboratory agreement using CS to be significantly superior to that typically obtained with conventional magnitude estimation techniques, although some of its potential remains to be realized. PMID:26191019

  1. H2@Scale Workshop Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pivovar, Bryan

    2017-03-31

    Final report from the H2@Scale Workshop held November 16-17, 2016, at the National Renewable Energy Laboratory in Golden, Colorado. The U.S. Department of Energy's National Renewable Energy Laboratory hosted a technology workshop to identify the current barriers and research needs of the H2@Scale concept. H2@Scale is a concept regarding the potential for wide-scale impact of hydrogen produced from diverse domestic resources to enhance U.S. energy security and enable growth of innovative technologies and domestic industries. Feedback received from a diverse set of stakeholders at the workshop will guide the development of an H2@Scale roadmap for research, development, and early-stage demonstration activities that can enable hydrogen as an energy carrier at a national scale.

  2. Students' Task Interpretation and Conceptual Understanding in an Electronics Laboratory

    ERIC Educational Resources Information Center

    Rivera-Reyes, Presentacion; Lawanto, Oenardi; Pate, Michael L.

    2017-01-01

    Task interpretation is a critical first step for students in the process of self-regulated learning, and a key determinant when they set goals in their learning and select strategies in assigned work. This paper focuses on the explicit and implicit aspects of task interpretation based on Hadwin's model. Laboratory activities improve students'…

  3. EFFECTS OF LARVAL STOCKING DENSITY ON LABORATORY-SCALE AND COMMERCIAL-SCALE PRODUCTION OF SUMMER FLOUNDER, PARALICHTHYS DENTATUS

    EPA Science Inventory

    Three experiments investigating larval stocking densities of summer flounder from hatch to metamorphosis, Paralichthys dentatus, were conducted at laboratory-scale (75-L aquaria) and at commercial scale (1,000-L tanks). Experiments 1 and 2 at commercial scale tested the densities...

  4. MHD scaling: from astrophysics to the laboratory

    NASA Astrophysics Data System (ADS)

    Ryutov, Dmitri

    2000-10-01

    During the last few years, considerable progress has been made in simulating astrophysical phenomena in laboratory experiments with high power lasers [1]. Astrophysical phenomena that have drawn particular interest include supernovae explosions; young supernova remnants; galactic jets; the formation of fine structures in late supernova remnants by instabilities; and the ablation driven evolution of molecular clouds illuminated by nearby bright stars, which may affect star formation. A question may arise as to what extent the laser experiments, which deal with targets of a spatial scale 0.01 cm and occur at a time scale of a few nanoseconds, can reproduce phenomena occurring at spatial scales of a million or more kilometers and time scales from hours to many years. Quite remarkably, if dissipative processes (like, e.g., viscosity, Joule dissipation, etc.) are subdominant in both systems, and the matter behaves as a polytropic gas, there exists a broad hydrodynamic similarity (the "Euler similarity" of Ref. [2]) that allows a direct scaling of laboratory results to astrophysical phenomena. Following a review of relevant earlier work (in particular, [3]-[5]), discussion is presented of the details of the Euler similarity related to the presence of shocks and to a special case of a strong drive. After that, constraints stemming from possible development of small-scale turbulence are analyzed. Generalization of the Euler similarity to the case of a gas with spatially varying polytropic index is presented. A possibility of scaled simulations of ablation front dynamics is one more topic covered in this paper. It is shown that, with some additional constraints, a simple similarity exists. This, in particular, opens up the possibility of scaled laboratory simulation of the aforementioned ablation (photoevaporation) fronts. A nonlinear transformation [6] that establishes a duality between implosion and explosion processes is also discussed in the paper. 1. B.A.
Remington et

  5. The Impact of Differentiated Instructional Materials on English Language Learner (ELL) Students' Comprehension of Science Laboratory Tasks

    ERIC Educational Resources Information Center

    Manavathu, Marian; Zhou, George

    2012-01-01

    Through a qualitative research design, this article investigates the impacts of differentiated laboratory instructional materials on English language learners' (ELLs) laboratory task comprehension. The factors affecting ELLs' science learning experiences are further explored. Data analysis reveals a greater degree of laboratory task comprehension…

  6. LABORATORY-SCALE ANALYSIS OF AQUIFER REMEDIATION BY IN-WELL VAPOR STRIPPING 2. MODELING RESULTS. (R825689C061)

    EPA Science Inventory

    The removal of volatile organic compounds (VOCs) from groundwater through in-well vapor stripping has been demonstrated by Gonen and Gvirtzman (1997, J. Contam. Hydrol., 00: 000-000) at the laboratory scale. The present study compares experimental breakthrough...

  7. Students' Progression in Monitoring Anomalous Results Obtained in Inquiry-Based Laboratory Tasks

    NASA Astrophysics Data System (ADS)

    Crujeiras-Pérez, Beatriz; Jiménez-Aleixandre, Maria Pilar

    2017-07-01

    This paper examines students' engagement in monitoring anomalous results across a 2-year longitudinal study with 9th and 10th graders (14-15 and 15-16 years of age). The context is a set of five inquiry-based laboratory tasks requiring students to plan and carry out investigations. The study examines students' interpretation of data, in particular anomalous results they generated while solving the tasks, and their ability to monitor them. Data collected include video and audio recordings as well as students' written products. For the analysis, two rubrics were developed drawing on Chinn and Brewer (Cognition and Instruction, 19, 323-393, 2001) and Hmelo-Silver et al. (Science Education, 86, 219-243, 2002). The findings point to a pattern of progress in students' responses across the 2 years: (a) responses revealing a low capacity for monitoring, due either to not recognizing the data as anomalous or to recognizing them as anomalous but being unable to explain their causes, are more frequent in the first tasks, and (b) responses revealing an improved capacity for monitoring are more frequent in the last tasks. The factors influencing students' regulation of their performances, such as the requirement of planning and specific scaffolding based on activity theory, are discussed.

  8. Beyond-laboratory-scale prediction for channeling flows through subsurface rock fractures with heterogeneous aperture distributions revealed by laboratory evaluation

    NASA Astrophysics Data System (ADS)

    Ishibashi, Takuya; Watanabe, Noriaki; Hirano, Nobuo; Okamoto, Atsushi; Tsuchiya, Noriyoshi

    2015-01-01

    The present study evaluates aperture distributions and fluid flow characteristics for variously sized laboratory-scale granite fractures under confining stress. As a significant result of the laboratory investigation, the contact area in the fracture plane was found to be virtually independent of scale. By combining this characteristic with the self-affine fractal nature of fracture surfaces, a novel method for predicting fracture aperture distributions beyond laboratory scale is developed. The validity of this method is demonstrated by reproducing the results of the laboratory investigation and the maximum aperture-fracture length relations reported in the literature for natural fractures. The present study finally predicts conceivable scale dependencies of fluid flows through joints (fractures without shear displacement) and faults (fractures with shear displacement). Both joint and fault aperture distributions are characterized by a scale-independent contact area, a scale-dependent geometric mean, and a scale-independent geometric standard deviation of aperture. The contact areas for joints and faults are approximately 60% and 40%, respectively. Changes in the geometric means of joint and fault apertures (µm), e_m,joint and e_m,fault, with fracture length (m), l, are approximated by e_m,joint = 1 × 10^2 l^0.1 and e_m,fault = 1 × 10^3 l^0.7, whereas the geometric standard deviations of both joint and fault apertures are approximately 3. Fluid flows through both joints and faults are characterized by the formation of preferential flow paths (i.e., channeling flows) with scale-independent flow areas of approximately 10%, whereas the joint and fault permeabilities (m^2), k_joint and k_fault, are scale dependent and are approximated as k_joint = 1 × 10^-12 l^0.2 and k_fault = 1 × 10^-8 l^1.1.
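
    The fitted scaling relations quoted in the abstract can be applied directly. A minimal sketch (the function names are mine; the coefficients and exponents are exactly those quoted above):

```python
def geometric_mean_aperture(length_m, fracture_type="joint"):
    """Geometric mean aperture in micrometres vs fracture length l in metres,
    using the quoted fits:
      e_m,joint = 1e2 * l**0.1,   e_m,fault = 1e3 * l**0.7
    """
    if fracture_type == "joint":
        return 1e2 * length_m ** 0.1
    return 1e3 * length_m ** 0.7

def permeability(length_m, fracture_type="joint"):
    """Permeability in m^2 from the quoted fits:
      k_joint = 1e-12 * l**0.2,   k_fault = 1e-8 * l**1.1
    """
    if fracture_type == "joint":
        return 1e-12 * length_m ** 0.2
    return 1e-8 * length_m ** 1.1

# At l = 1 m each relation reduces to its prefactor:
# 100 um (joint), 1000 um (fault), 1e-12 m^2 (joint), 1e-8 m^2 (fault).
```

The much larger exponents for faults (0.7 and 1.1 versus 0.1 and 0.2) are what make fault transmissivity grow so strongly with scale.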

  9. Computational simulation of laboratory-scale volcanic jets

    NASA Astrophysics Data System (ADS)

    Solovitz, S.; Van Eaton, A. R.; Mastin, L. G.; Herzog, M.

    2017-12-01

    Volcanic eruptions produce ash clouds that may travel great distances, significantly impacting aviation and communities downwind. Atmospheric hazard forecasting relies partly on numerical models of the flow physics, which incorporate data from eruption observations and analogue laboratory tests. As numerical tools continue to increase in complexity, they must be validated to fine-tune their effectiveness. Since eruptions are relatively infrequent and challenging to observe in great detail, analogue experiments can provide important insights into expected behavior over a wide range of input conditions. Unfortunately, laboratory-scale jets cannot easily attain the high Reynolds numbers (of order 10^9) of natural volcanic eruption columns. Comparisons between the computational models and analogue experiments can help bridge this gap. In this study, we investigate a 3-D volcanic plume model, the Active Tracer High-resolution Atmospheric Model (ATHAM), which has been used to simulate a variety of eruptions. However, it has not been previously validated using laboratory-scale data. We conducted numerical simulations of three flows that we have studied in the laboratory: a vertical jet in a quiescent environment, a vertical jet in horizontal cross flow, and a particle-laden jet. We considered Reynolds numbers from 10,000 to 50,000, jet-to-cross flow velocity ratios of 2 to 10, and particle mass loadings of up to 25% of the exit mass flow rate. Vertical jet simulations produce Gaussian velocity profiles in the near exit region by 3 diameters downstream, matching the mean experimental profiles. Simulations of air entrainment are of the correct order of magnitude, but they show decreasing entrainment with vertical distance from the vent. Cross flow simulations reproduce experimental trajectories for the jet centerline initially, although confinement appears to impact the response later. 
Particle-laden simulations display minimal variation in concentration profiles between cases with

  10. Scaling of theory-of-mind tasks.

    PubMed

    Wellman, Henry M; Liu, David

    2004-01-01

    Two studies address the sequence of understandings evident in preschoolers' developing theory of mind. The first, preliminary study provides a meta-analysis of research comparing different types of mental state understandings (e.g., desires vs. beliefs, ignorance vs. false belief). The second, primary study tests a theory-of-mind scale for preschoolers. In this study 75 children (aged 2 years, 11 months to 6 years, 6 months) were tested on 7 tasks tapping different aspects of understanding persons' mental states. Responses formed a consistent developmental progression, where for most children if they passed a later item they passed all earlier items as well, as confirmed by Guttman and Rasch measurement model analyses.
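
    The Guttman-scale property reported above (passing a later item implies passing all earlier ones) can be checked mechanically. The sketch below is a generic implementation of a reproducibility check, not the authors' analysis code; the example response patterns are invented:

```python
def guttman_errors(pattern):
    """Minimum number of responses to flip so that `pattern` (items ordered
    easiest -> hardest, 1 = pass, 0 = fail) becomes a perfect step pattern
    1..1 0..0, i.e. the Guttman ideal."""
    n = len(pattern)
    best = n
    for cut in range(n + 1):
        # Ideal pattern for this cut: passes on the first `cut` items only.
        errors = sum(1 for i, r in enumerate(pattern)
                     if r != (1 if i < cut else 0))
        best = min(best, errors)
    return best

def reproducibility(patterns):
    """Guttman coefficient of reproducibility: 1 - (errors / total responses)."""
    total = sum(len(p) for p in patterns)
    errs = sum(guttman_errors(p) for p in patterns)
    return 1 - errs / total

# [1, 1, 0] fits a perfect scale; [0, 1, 1] needs one flip to fit.
```

Values of the coefficient near 1 indicate that the items form the kind of consistent developmental progression the abstract describes.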

  11. LOW OZONE-DEPLETING HALOCARBONS AS TOTAL-FLOOD AGENTS: VOLUME 2. LABORATORY-SCALE FIRE SUPPRESSION AND EXPLOSION PREVENTION TESTING

    EPA Science Inventory

    The report gives results from (1) flame suppression testing of potential Halon-1301 (CF3Br) replacement chemicals in a laboratory cup burner using n-heptane fuel and (2) explosion prevention (inertion) testing in a small-scale explosion sphere using propane and methane as fuels. ...

  12. EPOS-WP16: A Platform for European Multi-scale Laboratories

    NASA Astrophysics Data System (ADS)

    Spiers, Chris; Drury, Martyn; Kan-Parker, Mirjam; Lange, Otto; Willingshofer, Ernst; Funiciello, Francesca; Rosenau, Matthias; Scarlato, Piergiorgio; Sagnotti, Leonardo; WP16 Participants

    2016-04-01

    The participant countries in EPOS embody a wide range of world-class laboratory infrastructures ranging from high temperature and pressure experimental facilities, to electron microscopy, micro-beam analysis, analogue modeling and paleomagnetic laboratories. Most data produced by the various laboratory centres and networks are presently available only in limited "final form" in publications. As such many data remain inaccessible and/or poorly preserved. However, the data produced at the participating laboratories are crucial to serving society's need for geo-resources exploration and for protection against geo-hazards. Indeed, to model resource formation and system behaviour during exploitation, we need an understanding from the molecular to the continental scale, based on experimental data. This contribution will describe the work plans that the laboratories community in Europe is making, in the context of EPOS. The main objectives are: - To collect and harmonize available and emerging laboratory data on the properties and processes controlling rock system behaviour at multiple scales, in order to generate products accessible and interoperable through services for supporting research activities. - To co-ordinate the development, integration and trans-national usage of the major solid Earth Science laboratory centres and specialist networks. The length scales encompassed by the infrastructures included range from the nano- and micrometer levels (electron microscopy and micro-beam analysis) to the scale of experiments on centimetre sized samples, and to analogue model experiments simulating the reservoir scale, the basin scale and the plate scale. - To provide products and services supporting research into Geo-resources and Geo-storage, Geo-hazards and Earth System Evolution.

  13. Field Testing of a Wet FGD Additive for Enhanced Mercury Control - Task 3 Full-scale Test Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gary Blythe

    2007-05-01

    This Topical Report summarizes progress on Cooperative Agreement DE-FC26-04NT42309, 'Field Testing of a Wet FGD Additive'. The objective of the project is to demonstrate the use of a flue gas desulfurization (FGD) additive, Degussa Corporation's TMT-15, to prevent the reemission of elemental mercury (Hg^0) in flue gas exiting wet FGD systems on coal-fired boilers. Furthermore, the project intends to demonstrate whether the additive can be used to precipitate most of the mercury (Hg) removed in the wet FGD system as a fine TMT salt that can be separated from the FGD liquor and bulk solid byproducts for separate disposal. The project is conducting pilot- and full-scale tests of the TMT-15 additive in wet FGD absorbers. The tests are intended to determine the additive dosages required to prevent Hg^0 reemissions and to separate mercury from the normal FGD byproducts for three coal types: a Texas lignite/Powder River Basin (PRB) coal blend, high-sulfur Eastern bituminous coal, and low-sulfur Eastern bituminous coal. The project team consists of URS Group, Inc., EPRI, TXU Generation Company LP, Southern Company, and Degussa Corporation. TXU Generation has provided the Texas lignite/PRB cofired test site for pilot FGD tests, Monticello Steam Electric Station Unit 3. Southern Company is providing the low-sulfur Eastern bituminous coal host site for wet scrubbing tests, as well as the pilot- and full-scale jet bubbling reactor (JBR) FGD systems to be tested. IPL, an AES company, provided the high-sulfur Eastern bituminous coal full-scale FGD test site and cost sharing. Degussa Corporation is providing the TMT-15 additive and technical support to the test program as cost sharing. The project is being conducted in six tasks. Of the six project tasks, Task 1 involves project planning and Task 6 involves management and reporting. The other four tasks involve field testing on FGD systems, either at pilot or full scale. The four tasks include: Task 2 - Pilot Additive

  14. 30 CFR 14.21 - Laboratory-scale flame test apparatus.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Laboratory-scale flame test apparatus. 14.21 Section 14.21 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR TESTING... Technical Requirements § 14.21 Laboratory-scale flame test apparatus. The principal parts of the apparatus...

  15. 30 CFR 14.21 - Laboratory-scale flame test apparatus.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Laboratory-scale flame test apparatus. 14.21 Section 14.21 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR TESTING... Technical Requirements § 14.21 Laboratory-scale flame test apparatus. The principal parts of the apparatus...

  16. [Tasks and duties of veterinary reference laboratories for food borne zoonoses].

    PubMed

    Ellerbroek, Lüppo; Alter, T; Johne, R; Nöckler, K; Beutin, L; Helmuth, R

    2009-02-01

    Reference laboratories are of central importance for consumer protection. Field expertise and high scientific competence are basic requirements for the nomination of a national reference laboratory. To ensure a common approach in the analysis of zoonotic hazards, standards have been developed by the reference laboratories together with national official laboratories on the basis of Art. 33 of Regulation (EC) No. 882/2004. Reference laboratories act as arbitration boards in the case of ambivalent or disputed results. New methods for the detection of zoonotic agents are developed and validated to provide tools for analysis, e.g., in legal cases where results from different parties are disputed. Besides these tasks, national reference laboratories offer capacity building and advanced training courses, and organize ring trials to ensure consistent quality of analyses in official laboratories. All reference laboratories work according to ISO standard 17025, which defines strict laboratory quality rules, and in cooperation with the respective Community Reference Laboratories (CRL). Within the group of veterinary reference laboratories for food-borne zoonoses, the national reference laboratories are responsible for Listeria monocytogenes, for Campylobacter, for the surveillance and control of viral and bacterial contamination of bivalve molluscs, for E. coli, for the performance of analysis and tests on zoonoses (Salmonella), and, from the group of parasitological zoonotic agents, for Trichinella.

  17. LABORATORY SCALE STEAM INJECTION TREATABILITY STUDIES

    EPA Science Inventory

    Laboratory scale steam injection treatability studies were first developed at The University of California-Berkeley. A comparable testing facility has been developed at USEPA's Robert S. Kerr Environmental Research Center. Experience has already shown that many volatile organic...

  18. Relationship between Adolescent Risk Preferences on a Laboratory Task and Behavioral Measures of Risk-taking

    PubMed Central

    Rao, Uma; Sidhartha, Tanuj; Harker, Karen R.; Bidesi, Anup S.; Chen, Li-Ann; Ernst, Monique

    2010-01-01

    Purpose: The goal of the study was to assess individual differences in risk-taking behavior among adolescents in the laboratory. A second aim was to evaluate whether laboratory-based risk-taking behavior is associated with other behavioral and psychological measures of risk-taking. Methods: Eighty-two adolescents with no personal history of psychiatric disorder completed a computerized decision-making task, the Wheel of Fortune (WOF). By offering choices between clearly defined probabilities and real monetary outcomes, this task assesses risk preferences when participants are confronted with potential rewards and losses. The participants also completed a variety of behavioral and psychological measures associated with risk-taking behavior. Results: Performance on the task varied based on the probability and anticipated outcomes. In the winning sub-task, participants selected the low-probability/high-magnitude reward (high-risk choice) less frequently than the high-probability/low-magnitude reward (low-risk choice). In the losing sub-task, participants selected the low-probability/high-magnitude loss more often than the high-probability/low-magnitude loss. On average, the selection of probabilistic rewards was optimal and similar to performance in adults. There were, however, individual differences in performance, and one-third of the adolescents made the high-risk choice more frequently than the low-risk choice when selecting a reward. After controlling for sociodemographic and psychological variables, high-risk choices on the winning task predicted “real-world” risk-taking behavior and substance-related problems. Conclusions: These findings highlight individual differences in risk-taking behavior. Preliminary data on the face validity of the WOF task suggest that it might be a valuable laboratory tool for studying behavioral and neurobiological processes associated with risk-taking behavior in adolescents. PMID:21257113
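
    The high-risk/low-risk contrast can be illustrated with expected values. The probabilities and magnitudes below are hypothetical placeholders (the actual WOF parameters are not given in this abstract); the point is that when the two options have equal expected value, choice frequency isolates risk preference:

```python
def expected_value(prob, magnitude):
    """Expected monetary outcome of a wheel option."""
    return prob * magnitude

# Hypothetical options with matched expected value:
low_risk = (0.75, 2.0)    # high probability, low magnitude
high_risk = (0.25, 6.0)   # low probability, high magnitude

ev_low = expected_value(*low_risk)
ev_high = expected_value(*high_risk)
# Both expected values are equal, so a preference for `high_risk`
# reflects risk seeking rather than value maximization.
```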

  19. Report of the Fermilab ILC Citizens' Task Force

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Fermi National Accelerator Laboratory convened the ILC Citizens' Task Force to provide guidance and advice to the laboratory to ensure that community concerns and ideas are included in all public aspects of planning and design for a proposed future accelerator, the International Linear Collider. In this report, the members of the Task Force describe the process they used to gather and analyze information on all aspects of the proposed accelerator and its potential location at Fermilab in northern Illinois. They present the conclusions and recommendations they reached as a result of the learning process and their subsequent discussions and deliberations. While the Task Force was charged to provide guidance on the ILC, it became clear during the process that the high cost of the proposed accelerator made a near-term start for the project at Fermilab unlikely. Nevertheless, based on a year of extensive learning and dialogue, the Task Force developed a series of recommendations for Fermilab to consider as the laboratory develops all successor projects to the Tevatron. The Task Force recognizes that bringing a next-generation particle physics project to Fermilab will require both a large international effort and the support of the local community. While the Task Force developed its recommendations in response to the parameters of a future ILC, the principles they set forth apply directly to any large project that may be conceived at Fermilab, or at other laboratories, in the future. With this report, the Task Force fulfills its task of guiding Fermilab from the perspective of the local community on how to move forward with a large-scale project while building positive relationships with surrounding communities. The report summarizes the benefits, concerns and potential impacts of bringing a large-scale scientific project to northern Illinois.

  20. H2@Scale Laboratory CRADA Call | Hydrogen and Fuel Cells | NREL

    Science.gov Websites

    Announcement of a cooperative research and development agreement (CRADA) call for projects with the Hydrogen at Scale (H2@Scale) national laboratories, including the H2@Scale CRADA agreement template and responses to CRADA call questions.

  1. Laboratory-Scale Evidence for Lightning-Mediated Gene Transfer in Soil

    PubMed Central

    Demanèche, Sandrine; Bertolla, Franck; Buret, François; Nalin, Renaud; Sailland, Alain; Auriol, Philippe; Vogel, Timothy M.; Simonet, Pascal

    2001-01-01

    Electrical fields and current can permeabilize bacterial membranes, allowing for the penetration of naked DNA. Given that the environment is subjected to regular thunderstorms and lightning discharges that induce enormous electrical perturbations, the possibility of natural electrotransformation of bacteria was investigated. We demonstrated with soil microcosm experiments that the transformation of added bacteria could be increased locally via lightning-mediated current injection. The incorporation of three genes coding for antibiotic resistance (plasmid pBR328) into the Escherichia coli strain DH10B recipient previously added to soil was observed only after the soil had been subjected to laboratory-scale lightning. Laboratory-scale lightning had an electrical field gradient (700 versus 600 kV m^-1) and current density (2.5 versus 12.6 kA m^-2) similar to those of full-scale lightning. Controls handled identically except for not being subjected to lightning produced no detectable antibiotic-resistant clones. In addition, simulated storm cloud electrical fields (in the absence of current) did not produce detectable clones (transformation detection limit, 10^-9). Natural electrotransformation might be a mechanism involved in bacterial evolution. PMID:11472916

  2. Constructing a Scale to Assess L2 Written Speech Act Performance: WDCT and E-Mail Tasks

    ERIC Educational Resources Information Center

    Chen, Yuan-shan; Liu, Jianda

    2016-01-01

    This study reports the development of a scale to evaluate the speech act performance by intermediate-level Chinese learners of English. A qualitative analysis of the American raters' comments was conducted on learner scripts in response to a total of 16 apology and request written discourse completion task (WDCT) situations. The results showed…

  3. Experimental methods for the simulation of supercritical CO2 injection at laboratory scale aimed to investigate capillary trapping

    NASA Astrophysics Data System (ADS)

    Trevisan, L.; Illangasekare, T. H.; Rodriguez, D.; Sakaki, T.; Cihan, A.; Birkholzer, J. T.; Zhou, Q.

    2011-12-01

    Geological storage of carbon dioxide in deep geologic formations is being considered as a technical option to reduce greenhouse gas loading to the atmosphere. The processes associated with movement and stable trapping are complex in deep, naturally heterogeneous formations. Three primary mechanisms contribute to trapping: capillary entrapment due to immobilization of supercritical CO2 within soil pores, dissolution of CO2 in the formation water, and mineralization. Natural heterogeneity in the formation is expected to affect all three mechanisms. A research project is in progress with the primary goal of improving our understanding of capillary and dissolution trapping during injection and post-injection processes, focusing on formation heterogeneity. It is expected that this improved knowledge will help to develop site-characterization methods targeted at obtaining the most critical parameters that capture the heterogeneity, in order to design strategies and schemes to maximize trapping. This research combines experiments at the laboratory scale with multiphase modeling to upscale relevant trapping processes to the field scale. This paper presents the results from a set of experiments conducted in intermediate-scale test tanks. Intermediate-scale testing provides an attractive alternative for investigating these processes under controlled conditions in the laboratory. Conducting these types of experiments is highly challenging, as methods have to be developed to extrapolate data from experiments conducted under ambient laboratory conditions to the high-temperature, high-pressure settings of deep geologic formations. We explored the use of a combination of surrogate fluids that have density and viscosity contrasts, solubility, and interfacial tension analogous to those of supercritical CO2-brine in deep formations. The extrapolation approach involves the use of dimensionless numbers such as the capillary number (Ca) and the Bond number (Bo). 
A set of
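
    The dimensionless-number matching mentioned above can be sketched as follows. The definitions are the standard ones; the fluid property values in the example are illustrative placeholders, not the study's surrogate fluids:

```python
def capillary_number(mu, v, sigma):
    """Ca = mu*v/sigma: ratio of viscous to capillary (interfacial) forces.
    mu in Pa*s, v in m/s, sigma in N/m."""
    return mu * v / sigma

def bond_number(delta_rho, g, length, sigma):
    """Bo = delta_rho*g*L^2/sigma: ratio of gravitational (buoyancy) to
    capillary forces.  delta_rho in kg/m^3, g in m/s^2, length in m,
    sigma in N/m."""
    return delta_rho * g * length ** 2 / sigma

# Illustrative values: water-like viscosity, a slow Darcy velocity,
# a 200 kg/m^3 density contrast, and a 1 mm pore-scale length.
ca = capillary_number(mu=1e-3, v=1e-5, sigma=0.03)
bo = bond_number(delta_rho=200.0, g=9.81, length=1e-3, sigma=0.03)
```

Matching Ca and Bo between the surrogate-fluid experiment and the deep-formation system is what justifies extrapolating ambient-condition results to supercritical conditions.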

  4. Subjective scaling of mental workload in a multi-task environment

    NASA Technical Reports Server (NTRS)

    Daryanian, B.

    1982-01-01

    Those factors in a multi-task environment that contribute to operators' "sense" of mental workload were identified. Subjective judgment, as the conscious experience of mental effort, was chosen as the appropriate method of measurement. Thurstone's law of comparative judgment was employed to construct interval scales of subjective mental workload from paired-comparisons data. An experimental paradigm (Simulated Multi-Task Decision-Making Environment) was employed to represent an ideal, experimentally controlled environment in which human operators were asked to "attend" to different cases of Tulga's decision-making tasks. Various statistical analyses showed that, in general, a lower number of tasks to be processed per unit time (a condition associated with longer interarrival times) results in a lower mental workload, a higher consistency of judgments within a subject, a higher degree of agreement among the subjects, and larger distances between the cases on the Thurstone scale of subjective mental workload. The effects of various control variables and their interactions, and of different subject characteristics, on the variation of subjective mental workload are demonstrated.
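
    Thurstone scaling from paired comparisons can be sketched compactly: convert each pairwise choice proportion to a normal deviate and average down the columns. This is a generic Case V implementation (equal discriminal dispersions), not the author's original analysis; the 2×2 proportion matrix is invented for illustration:

```python
from statistics import NormalDist

def thurstone_case_v(P):
    """Thurstone Case V scale values from P, where P[i][j] is the proportion
    of judgments in which stimulus j was rated higher than stimulus i
    (diagonal entries are 0.5).  The scale value of stimulus j is the
    column mean of the inverse-normal-transformed proportions."""
    nd = NormalDist()
    n = len(P)
    return [sum(nd.inv_cdf(P[i][j]) for i in range(n)) / n
            for j in range(n)]

# Invented example: stimulus 1 is judged more demanding 80% of the time,
# so it lands higher on the interval scale than stimulus 0.
scale = thurstone_case_v([[0.5, 0.8],
                          [0.2, 0.5]])
```

The resulting values form an interval scale (differences are meaningful, the zero point is arbitrary), which is exactly what the abstract's "distances between the cases" refers to.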

  5. Effects of scaling task constraints on emergent behaviours in children's racquet sports performance.

    PubMed

    Fitzpatrick, Anna; Davids, Keith; Stone, Joseph A

    2018-04-01

    Manipulating task constraints by scaling key features like space and equipment is considered an effective method for enhancing performance development and refining movement patterns in sport. Despite this, it is currently unclear whether scaled manipulation of task constraints would impact emergent movement behaviours in young children, affording learners opportunities to develop relevant skills. Here, we sought to investigate how scaling task constraints during 8 weeks of mini tennis training shaped backhand stroke development. Two groups, control (n = 8, age = 7.2 ± 0.6 years) and experimental (n = 8, age 7.4 ± 0.4 years), underwent practice using constraints-based manipulations, with a specific field of affordances designed for backhand strokes as the experimental treatment. To evaluate intervention effects, pre- and post-test match-play characteristics (e.g. forehand and backhand percentage strokes) and measures from a tennis-specific skills test (e.g. forehand and backhand technical proficiency), were evaluated. Post intervention, the experimental group performed a greater percentage of backhand strokes out of total number of shots played (46.7 ± 3.3%). There was also a significantly greater percentage of backhand winners out of total backhand strokes observed (5.5 ± 3.0%), compared to the control group during match-play (backhands = 22.4 ± 6.5%; backhand winners = 1.0 ± 3.6%). The experimental group also demonstrated improvements in forehand and backhand technical proficiency and the ability to maintain a rally with a coach, compared to the control group. In conclusion, scaled manipulations implemented here elicited more functional performance behaviours than standard Mini Tennis Red constraints. Results suggested how human movement scientists may scale task constraint manipulations to augment young athletes' performance development. Crown Copyright © 2018. Published by Elsevier B.V. All rights reserved.

  6. Building a Laboratory-Scale Biogas Plant and Verifying its Functionality

    NASA Astrophysics Data System (ADS)

    Boleman, Tomáš; Fiala, Jozef; Blinová, Lenka; Gerulová, Kristína

    2011-01-01

    The paper deals with the process of building a laboratory-scale biogas plant and verifying its functionality. The laboratory-scale prototype was constructed in the Department of Safety and Environmental Engineering at the Faculty of Materials Science and Technology in Trnava, Slovak University of Technology. The Department has already built a solar laboratory to promote and utilise solar energy, and has designed the SETUR hydro engine. The biogas laboratory is the next step in the Department's activities in the field of renewable energy sources and biomass. The Department is also involved in a European Union project whose goal is to upgrade all existing renewable energy sources used in the Department.

  7. Robonaut 2 in the U.S. Laboratory

    NASA Image and Video Library

    2013-01-02

    ISS034-E-013990 (2 Jan. 2013) --- In the International Space Station’s Destiny laboratory, Robonaut 2 is pictured during a round of testing for the first humanoid robot in space. Ground teams put Robonaut through its paces as they remotely commanded it to operate valves on a task board. Robonaut is a testbed for exploring new robotic capabilities in space, and its form and dexterity allow it to use the same tools and control panels as its human counterparts do aboard the station.

  8. Predicting the performance uncertainty of a 1-MW pilot-scale carbon capture system after hierarchical laboratory-scale calibration and validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Zhijie; Lai, Canhai; Marcy, Peter William

    2017-05-01

    A challenging problem in designing pilot-scale carbon capture systems is to predict, with uncertainty, the adsorber performance and capture efficiency under various operating conditions where no direct experimental data exist. Motivated by this challenge, we previously proposed a hierarchical framework in which relevant parameters of physical models were sequentially calibrated from different laboratory-scale carbon capture unit (C2U) experiments. Specifically, three models of increasing complexity were identified based on the fundamental physical and chemical processes of the sorbent-based carbon capture technology. Results from the corresponding laboratory experiments were used to statistically calibrate the physical model parameters while quantifying some of their inherent uncertainty. The parameter distributions obtained from laboratory-scale C2U calibration runs are used in this study to facilitate prediction at a larger scale where no corresponding experimental results are available. In this paper, we first describe the multiphase reactive flow model for a sorbent-based 1-MW carbon capture system, then analyze results from an ensemble of simulations with the upscaled model. The simulation results are used to quantify uncertainty regarding the design's predicted efficiency in carbon capture. In particular, we determine the minimum gas flow rate necessary to achieve 90% capture efficiency with 95% confidence.
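
    The "90% capture with 95% confidence" criterion can be illustrated with a toy Monte Carlo screen. The surrogate efficiency model and its parameter distribution below are invented for illustration (the paper's actual model is a multiphase reactive flow simulation); only the confidence logic carries over:

```python
import math
import random

def efficiency_sample(flow, rng):
    """Toy surrogate: capture efficiency approaches 1 as `flow` grows,
    with an uncertain kinetic parameter k ~ N(1.0, 0.1).
    Invented for illustration only."""
    k = rng.gauss(1.0, 0.1)
    return 1.0 - math.exp(-k * flow)

def min_flow_with_confidence(target=0.90, confidence=0.95,
                             flows=(1.0, 2.0, 3.0, 4.0),
                             n=2000, seed=0):
    """Smallest candidate flow whose Monte Carlo ensemble meets the target
    efficiency in at least `confidence` of the samples."""
    rng = random.Random(seed)
    for flow in flows:
        effs = [efficiency_sample(flow, rng) for _ in range(n)]
        if sum(e >= target for e in effs) / n >= confidence:
            return flow
    return None
```

The same pattern applies with the real upscaled model in place of `efficiency_sample`: propagate the calibrated parameter distributions through the simulator and read off the smallest operating point that clears the confidence threshold.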

  9. Field Testing of a Wet FGD Additive for Enhanced Mercury Control - Task 5 Full-Scale Test Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gary Blythe; MariJon Owens

    2007-12-01

    The project 'Field Testing of a Wet FGD Additive' (Cooperative Agreement DE-FC26-04NT42309) is being conducted in six tasks. Of the six project tasks, Task 1 involves project planning and Task 6 involves management and reporting. The other four tasks involve field testing on FGD systems, either at pilot or full scale. The four tasks include: Task 2 - Pilot Additive Testing in Texas Lignite Flue Gas; Task 3 - Full-scale FGD Additive Testing in High-sulfur Eastern Bituminous Flue Gas; Task 4 - Pilot Wet Scrubber Additive Tests at Plant Yates; and Task 5 - Full-scale Additive Tests at Plant Yates. The pilot-scale tests and the full-scale test using high-sulfur coal were completed in 2005 and 2006 and have been previously reported. This topical report presents the results from the Task 5 full-scale additive tests, conducted at Southern Company's Plant Yates Unit 1. Both additives were tested there.

  10. A comparison of refuse attenuation in laboratory and field scale lysimeters.

    PubMed

    Youcai, Zhao; Luochun, Wang; Renhua, Hua; Dimin, Xu; Guowei, Gu

    2002-01-01

    For this study, small- and middle-scale laboratory lysimeters, and a large-scale field lysimeter in situ in the Shanghai Refuse Landfill, with refuse weights of 187, 600, and 10,800,000 kg, respectively, were constructed. These lysimeters are compared in terms of leachate quality (pH, concentrations of COD, BOD, and NH3-N), refuse composition (biodegradable matter and volatile solids), and surface settlement over a monitoring period of 0-300 days. The objectives of this study were to explore both the similarities and disparities between laboratory and field scale lysimeters, and to compare degradation behaviors of refuse in the intensive reaction phase in the different-scale lysimeters. Quantitative relationships of leachate quality and refuse composition with placement time show that degradation behaviors of refuse depend heavily on the scale of the lysimeter and the parameters of concern, especially in the starting period of 0-6 months. However, some similarities exist between laboratory and field lysimeters after 4-6 months of placement, because COD and BOD concentrations in leachate in the field lysimeter decrease regularly in a pattern parallel to those in the laboratory lysimeters. NH3-N, volatile solids (VS), and biodegradable matter (BDM) also gradually decrease in parallel in this intensive reaction phase for lysimeters of all scales as the refuse ages. Though the specific values differ among the different-scale lysimeters, laboratory lysimeters of sufficient scale are basically applicable for a rough simulation of a real landfill, especially for illustrating the degradation pattern and mechanism. Settlement of the refuse surface is roughly proportional to the initial refuse height.

  11. Integration and segregation of large-scale brain networks during short-term task automatization

    PubMed Central

    Mohr, Holger; Wolfensteller, Uta; Betzel, Richard F.; Mišić, Bratislav; Sporns, Olaf; Richiardi, Jonas; Ruge, Hannes

    2016-01-01

    The human brain is organized into large-scale functional networks that can flexibly reconfigure their connectivity patterns, supporting both rapid adaptive control and long-term learning processes. However, it has remained unclear how short-term network dynamics support the rapid transformation of instructions into fluent behaviour. Comparing fMRI data of a learning sample (N=70) with a control sample (N=67), we find that increasingly efficient task processing during short-term practice is associated with a reorganization of large-scale network interactions. Practice-related efficiency gains are facilitated by enhanced coupling between the cingulo-opercular network and the dorsal attention network. Simultaneously, short-term task automatization is accompanied by decreasing activation of the fronto-parietal network, indicating a release of high-level cognitive control, and a segregation of the default mode network from task-related networks. These findings suggest that short-term task automatization is enabled by the brain's ability to rapidly reconfigure its large-scale network organization involving complementary integration and segregation processes. PMID:27808095

  12. Note: Measurement system for the radiative forcing of greenhouse gases in a laboratory scale.

    PubMed

    Kawamura, Yoshiyuki

    2016-01-01

    The radiative forcing of greenhouse gases has been studied on the basis of computational simulations or meteorological observations of the real atmosphere. To understand the greenhouse effect more deeply and to study it from various viewpoints, laboratory-scale study is important. We have developed a direct measurement system for the infrared back radiation from carbon dioxide (CO2) gas. The system configuration is similar to that of the practical earth-atmosphere-space system. Using this system, the back radiation from the CO2 gas was directly measured at laboratory scale; the measured value roughly coincides with the meteorologically predicted value.

  13. HD-MTL: Hierarchical Deep Multi-Task Learning for Large-Scale Visual Recognition.

    PubMed

    Fan, Jianping; Zhao, Tianyi; Kuang, Zhenzhong; Zheng, Yu; Zhang, Ji; Yu, Jun; Peng, Jinye

    2017-02-09

    In this paper, a hierarchical deep multi-task learning (HD-MTL) algorithm is developed to support large-scale visual recognition (e.g., recognizing thousands or even tens of thousands of atomic object classes automatically). First, multiple sets of multi-level deep features are extracted from different layers of deep convolutional neural networks (deep CNNs), and they are used to achieve more effective accomplishment of the coarse-to-fine tasks for hierarchical visual recognition. A visual tree is then learned by assigning the visually-similar atomic object classes with similar learning complexities into the same group, which can provide a good environment for determining the interrelated learning tasks automatically. By leveraging the inter-task relatedness (inter-class similarities) to learn more discriminative group-specific deep representations, our deep multi-task learning algorithm can train more discriminative node classifiers for distinguishing the visually-similar atomic object classes effectively. Our hierarchical deep multi-task learning (HD-MTL) algorithm can integrate two discriminative regularization terms to control the inter-level error propagation effectively, and it can provide an end-to-end approach for jointly learning more representative deep CNNs (for image representation) and more discriminative tree classifier (for large-scale visual recognition) and updating them simultaneously. Our incremental deep learning algorithms can effectively adapt both the deep CNNs and the tree classifier to the new training images and the new object classes. Our experimental results have demonstrated that our HD-MTL algorithm can achieve very competitive results on improving the accuracy rates for large-scale visual recognition.
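    The grouping step behind the visual tree, assigning visually similar classes to the same group so that related tasks are learned together, can be illustrated with a minimal sketch. This is a hypothetical simplification, not the authors' code: class-mean feature vectors are compared by cosine similarity and greedily merged above an assumed threshold.

    ```python
    import numpy as np

    def group_similar_classes(class_means, threshold=0.8):
        """Greedily group classes whose mean feature vectors have cosine
        similarity above `threshold` (a stand-in for the visual-tree
        construction described in the abstract)."""
        n = len(class_means)
        # normalise rows to unit length so dot products are cosine similarities
        normed = class_means / np.linalg.norm(class_means, axis=1, keepdims=True)
        sim = normed @ normed.T
        groups, assigned = [], [False] * n
        for i in range(n):
            if assigned[i]:
                continue
            group = [i]
            assigned[i] = True
            for j in range(i + 1, n):
                if not assigned[j] and sim[i, j] >= threshold:
                    group.append(j)
                    assigned[j] = True
            groups.append(group)
        return groups

    # four toy classes: 0 and 1 share a direction, 2 and 3 share another
    means = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
    print(group_similar_classes(means))  # → [[0, 1], [2, 3]]
    ```

    Each resulting group would then get its own node classifier, mirroring the paper's idea that inter-class similarity defines the interrelated learning tasks.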

  14. Fracture induced electromagnetic emissions: extending laboratory findings by observations at the geophysical scale

    NASA Astrophysics Data System (ADS)

    Potirakis, Stelios M.; Contoyiannis, Yiannis; Kopanas, John; Kalimeris, Anastasios; Antonopoulos, George; Peratzakis, Athanasios; Eftaxias, Konstantinos; Nomicos, Constantinos

    2014-05-01

    Under natural conditions, it is practically impossible to install an experimental network on the geophysical scale using the same instrumentation as in laboratory experiments for understanding, through the states of stress and strain and their time variation, the laws that govern friction during the last stages of EQ generation, or to monitor (much less to control) the principal characteristics of a fracture process. Fracture-induced electromagnetic emissions (EME) in a wide range of frequency bands are sensitive to micro-structural changes. Thus, their study constitutes a nondestructive method for monitoring the evolution of the damage process at the laboratory scale. It has been suggested that fracture-induced MHz-kHz electromagnetic (EM) emissions, which emerge from a few days up to a few hours before the main seismic shock occurrence, permit real-time monitoring of the damage process during the last stages of earthquake preparation, as happens at the laboratory scale. Since EME are produced both in laboratory-scale fracture and in the EQ preparation process (geophysical-scale fracture), they should present similar characteristics at these two scales. Therefore, both the laboratory experimenting scientists and the experimental scientists studying pre-earthquake EME could benefit from each other's results. Importantly, it is noted that when studying the fracture process by means of laboratory experiments, the fault growth process normally occurs violently in a fraction of a second. However, a major difference between the laboratory and natural processes is the order-of-magnitude differences in scale (in space and time), allowing the possibility of experimental observation at the geophysical scale for a range of physical processes which are not observable at the laboratory scale. 
Therefore, the study of fracture-induced EME is expected to reveal more information, especially for the last stages of the fracture process, when it

  15. The design of dapog rice seeder model for laboratory scale

    NASA Astrophysics Data System (ADS)

    Purba, UI; Rizaldi, T.; Sumono; Sigalingging, R.

    2018-02-01

    The dapog system is a method of seeding rice using a special nursery tray. Rice seedings with dapog systems can produce seedlings in the form of higher quality and uniform seed rolls. This study aims to reduce the cost of making a large-scale apparatus by designing a small-scale model that can be used for learning in the laboratory. Parameters observed were the uniformity of soil, seed and fertilizer distribution; losses of soil, seed and fertilizer; effective capacity of the apparatus; and power requirements. The results showed high uniformity for soil, seed and fertilizer: 92.8%, 1-3 seeds/cm2 and 82%, respectively. The scattered material losses for soil, seed and fertilizer were 6.23%, 2.7% and 2.23%, respectively. The effective capacity of the apparatus was 360 boxes/hour with 237.5 kWh of required power.

  16. Exploring students' perceptions and performance on predict-observe-explain tasks in high school chemistry laboratory

    NASA Astrophysics Data System (ADS)

    Vadapally, Praveen

    This study sought to understand the impact of gender and reasoning level on students' perceptions and performances of Predict-Observe-Explain (POE) laboratory tasks in a high school chemistry laboratory. Several literature reviews have reported that students at all levels have not developed the specific knowledge and skills that were expected from their laboratory work. Studies conducted over the last several decades have found that boys tend to be more successful than girls in science and mathematics courses. However, some recent studies have suggested that girls may be reducing this gender gap. This gender difference is the focal point of this research study, which was conducted at a mid-western, rural high school. The participants were 24 boys and 25 girls enrolled in two physical science classes taught by the same teacher. In this mixed methods study, qualitative and quantitative methods were implemented simultaneously over the entire period of the study. MANOVA statistics revealed significant effects due to gender and level of reasoning on the outcome variables, which were POE performances and perceptions of the chemistry laboratory environment. There were no significant interactions between these effects. For the qualitative method, IRB-approved information was collected, coded, grouped, and analyzed. This method was used to derive themes from students' responses on questionnaires and semi-structured interviews. Students with different levels of reasoning and gender were interviewed, and many of them expressed positive themes, a clear indication that they had enjoyed participating in the POE learning tasks and had developed positive perceptions toward the POE inquiry laboratory learning environment. When students are capable of formal reasoning, they can use an abstract scientific concept effectively and then relate it to the ideas they generate in their minds. 
Thus, instructors should factor the nature of students' thinking abilities into their

  17. A Teacher-Report Measure of Children's Task-Avoidant Behavior: A Validation Study of the Behavioral Strategy Rating Scale

    ERIC Educational Resources Information Center

    Zhang, Xiao; Nurmi, Jari-Erik; Kiuru, Noona; Lerkkanen, Marja-Kristiina; Aunola, Kaisa

    2011-01-01

    This study aims to validate a teacher-report measure of children's task-avoidant behavior, namely the Behavioral Strategy Rating Scale (BSRS), in a sample of 352 Finnish children. In each of the four waves from Kindergarten to Grade 2, teachers rated children's task-avoidant behavior using the BSRS, children completed reading and mathematics…

  18. Life sciences payload definition and integration study, task C and D. Volume 2: Payload definition, integration, and planning studies

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The Life Sciences Payload Definition and Integration Study was composed of four major tasks. Tasks A and B, the laboratory definition phase, were the subject of prior NASA study. The laboratory definition phase included the establishment of research functions, equipment definitions, and conceptual baseline laboratory designs. These baseline laboratories were designated as Maxi-Nom, Mini-30, and Mini-7. The outputs of Tasks A and B were used by the NASA Life Sciences Payload Integration Team to establish guidelines for Tasks C and D, the laboratory integration phase of the study. A brief review of Tasks A and B is presented to provide background continuity. The Tasks C and D effort is the subject of this report. The Task C effort stressed the integration of the NASA selected laboratory designs with the shuttle sortie module. The Task D effort updated and developed costs that could be used by NASA for preliminary program planning.

  19. Intermediate Scale Laboratory Testing to Understand Mechanisms of Capillary and Dissolution Trapping during Injection and Post-Injection of CO2 in Heterogeneous Geological Formations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Illangasekare, Tissa; Trevisan, Luca; Agartan, Elif

    2015-03-31

    Carbon Capture and Storage (CCS) represents a technology aimed at reducing atmospheric loading of CO2 from power plants and heavy industries by injecting it into deep geological formations, such as saline aquifers. A number of trapping mechanisms contribute to effective and secure storage of the injected CO2 in supercritical fluid phase (scCO2) in the formation over the long term. The primary trapping mechanisms are structural, residual, dissolution and mineralization. Knowledge gaps exist on how the heterogeneity of the formation, manifested at all scales from the pore to the site scale, affects trapping and the parameterization of contributing mechanisms in models. An experimental and modeling study was conducted to fill these knowledge gaps. Experimental investigation of fundamental processes and mechanisms in field settings is not possible, as it is not feasible to fully characterize the geologic heterogeneity at all relevant scales or to gather data on migration, trapping and dissolution of scCO2. Laboratory experiments using scCO2 under ambient conditions are also not feasible, as it is technically challenging and cost prohibitive to develop large, two- or three-dimensional test systems with controlled high pressures to keep the scCO2 as a liquid. Hence, an innovative approach that used surrogate fluids in place of scCO2 and formation brine in multi-scale, synthetic aquifer test systems ranging from centimeter to meter scale was developed and used. New modeling algorithms were developed to capture the processes controlled by the formation heterogeneity, and they were tested using the data from the laboratory test systems. The results and findings are expected to contribute toward better conceptual models, future improvements to DOE numerical codes, more accurate assessment of storage capacities, and optimized placement strategies. This report presents the experimental and modeling methods and research results.

  20. Effects of a no-go Task 2 on Task 1 performance in dual-tasking: From benefits to costs.

    PubMed

    Janczyk, Markus; Huestegge, Lynn

    2017-04-01

    When two tasks are combined in a dual-task experiment, characteristics of Task 2 can influence Task 1 performance, a phenomenon termed the backward crosstalk effect (BCE). Besides instances depending on the (spatial) compatibility of both responses, a particularly interesting example was introduced by Miller (2006): If Task 2 was a no-go task (i.e., one not requiring any action at all), responses were slowed in Task 1. Subsequent work, however, also reported the opposite result, namely faster Task 1 responses in cases of no-go Task 2 trials. We report three experiments aiming to more precisely identify the conditions under which a no-go Task 2 facilitates or impedes Task 1 performance. The results suggest that an adverse no-go BCE is only observed when the Task 2 response(s) are sufficiently prepared in advance, yielding strong inhibitory control demands for Task 2 that eventually hamper Task 1 processing as well (i.e., inhibitory costs). If this is not the case, encountering a no-go Task 2 trial facilitates Task 1 performance, suggesting that the underlying task representation is reduced to a single task. These results are discussed in the context of other recent work on BCEs and of recently suggested accounts of the no-go BCE.

  1. Full-scale and laboratory-scale anaerobic treatment of citric acid production wastewater.

    PubMed

    Colleran, E; Pender, S; Philpott, U; O'Flaherty, V; Leahy, B

    1998-01-01

    This paper reviews the operation of a full-scale, fixed-bed digester treating a citric acid production wastewater with a COD:sulphate ratio of 3-4:1. Support matrix pieces were removed from the digester at intervals during the first 5 years of operation in order to quantify the vertical distribution of biomass within the digester. Detailed analysis of the digester biomass after 5 years of operation indicated that H2- and propionate-utilising SRB had outcompeted hydrogenophilic methanogens and propionate syntrophs. Acetoclastic methanogens were shown to play the dominant role in acetate conversion. Butyrate- and ethanol-degrading syntrophs also remained active in the digester after 5 years of operation. Laboratory-scale hybrid reactor treatment at 55 degrees C of a diluted molasses influent, with and without sulphate supplementation, showed that the reactors could be operated with high stability at volumetric loading rates of 24 kgCOD.m-3.d-1 (12 h HRT). In the presence of sulphate (2 g.l-1; COD:sulphate ratio of 6:1), acetate conversion was severely inhibited, resulting in effluent acetate concentrations of up to 4000 mg.l-1.
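    As a rough consistency check on the figures above (relying on the standard definition of volumetric loading rate, not on a calculation given in the paper), the loading rate times the hydraulic retention time gives the implied influent COD concentration:

    ```python
    def influent_cod(vlr_kg_m3_d, hrt_days):
        """Volumetric loading rate (kg COD/m3/d) times hydraulic retention
        time (d) gives the influent COD concentration (kg/m3, i.e. g/l)."""
        return vlr_kg_m3_d * hrt_days

    # 24 kgCOD.m-3.d-1 at a 12 h (0.5 d) HRT
    print(influent_cod(24, 0.5))  # → 12.0 (g COD per litre)
    ```

    An influent of roughly 12 g COD/l is consistent with a diluted molasses feed.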

  2. A Simple Laboratory Scale Model of Iceberg Dynamics and its Role in Undergraduate Education

    NASA Astrophysics Data System (ADS)

    Burton, J. C.; MacAyeal, D. R.; Nakamura, N.

    2011-12-01

    Lab-scale models of geophysical phenomena have a long history in research and education. For example, at the University of Chicago, Dave Fultz developed laboratory-scale models of atmospheric flows. The results from his laboratory were so stimulating that similar laboratories were subsequently established at a number of other institutions. Today, the Dave Fultz Memorial Laboratory for Hydrodynamics (http://geosci.uchicago.edu/~nnn/LAB/) teaches general circulation of the atmosphere and oceans to hundreds of students each year. Following this tradition, we have constructed a lab model of iceberg-capsize dynamics for use in the Fultz Laboratory, which focuses on the interface between glaciology and physical oceanography. The experiment consists of a 2.5 meter long wave tank containing water and plastic "icebergs". The motion of the icebergs is tracked using digital video. Movies can be found at: http://geosci.uchicago.edu/research/glaciology_files/tsunamigenesis_research.shtml. We have had 3 successful undergraduate interns with backgrounds in mathematics, engineering, and geosciences perform experiments, analyze data, and interpret results. In addition to iceberg dynamics, the wave-tank has served as a teaching tool in undergraduate classes studying dam-breaking and tsunami run-up. Motivated by the relatively inexpensive cost of our apparatus (~1K-2K dollars) and positive experiences of undergraduate students, we hope to serve as a model for undergraduate research and education that other universities may follow.

  3. Application of peer instruction in the laboratory task of measuring the effective mass of a spring

    NASA Astrophysics Data System (ADS)

    Zhang, Chun-Ling; Hou, Zhen-Yu; Si, Yu-Chang; Wen, Xiao-Qing; Tang, Lei

    2017-11-01

    Peer instruction (PI) is an effective interactive approach to teaching and learning that has principally been used to modify the experience of learning in traditional physics lecture settings. This article further illustrates how the concept of PI can be effectively applied in the physics student laboratory setting. The setting used is a laboratory task that calls for the measurement of the effective mass of the spring of a Jolly balance. Through PI the students gain a better understanding of what is meant by the construct ‘effective mass of a spring’, and thereby competently work out how the mass, shape, wire diameter, and number of turns of the spring can all affect the effective mass of the spring. Furthermore, using stopwatches the students were also able to appreciate how recorded times at the equilibrium position had greater uncertainty than measurements made at the maximum displacement. This led to their calculations of the effective mass of the spring being impressively close to the theoretical value. Such laboratory tasks are extremely challenging to introductory level students and the success attained by the students in this study indicates that there is much potential in the application of PI in laboratory settings. PI should be used to teach in the laboratory and results should be reported in order for our community to build on these experiences. This article is a contribution to that effort.
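    For context, a conventional way to extract a spring's effective mass in this kind of laboratory task (a generic sketch, not the specific protocol described in the article) is to time oscillations for several hanging masses M and fit T² = (4π²/k)(M + m_eff); the intercept-to-slope ratio of the line is m_eff:

    ```python
    import numpy as np

    def effective_mass_from_periods(masses, periods):
        """Fit T^2 = (4*pi^2/k) * (M + m_eff). The slope gives the spring
        constant k; the intercept/slope ratio gives the effective mass."""
        slope, intercept = np.polyfit(masses, np.asarray(periods) ** 2, 1)
        k = 4 * np.pi ** 2 / slope
        m_eff = intercept / slope
        return m_eff, k

    # synthetic data: k = 20 N/m, spring mass 0.30 kg → m_eff ≈ 0.10 kg
    # (using the standard result that m_eff is about one third of the spring mass)
    k_true, m_eff_true = 20.0, 0.30 / 3
    M = np.array([0.1, 0.2, 0.3, 0.4, 0.5])
    T = 2 * np.pi * np.sqrt((M + m_eff_true) / k_true)
    m_eff, k = effective_mass_from_periods(M, T)
    print(round(m_eff, 3), round(k, 1))  # → 0.1 20.0
    ```

    With real stopwatch data the scatter of the T² vs. M line also makes the timing uncertainty discussed above directly visible to students.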

  4. Using Mental Transformation Strategies for Spatial Scaling: Evidence from a Discrimination Task

    ERIC Educational Resources Information Center

    Möhring, Wenke; Newcombe, Nora S.; Frick, Andrea

    2016-01-01

    Spatial scaling, or an understanding of how distances in different-sized spaces relate to each other, is fundamental for many spatial tasks and relevant for success in numerous professions. Previous research has suggested that adults use mental transformation strategies to mentally scale spatial input, as indicated by linear increases in response…

  5. Task driven optimal leg trajectories in insect-scale legged microrobots

    NASA Astrophysics Data System (ADS)

    Doshi, Neel; Goldberg, Benjamin; Jayaram, Kaushik; Wood, Robert

    Origami-inspired layered manufacturing techniques and 3D-printing have enabled the development of highly articulated legged robots at the insect scale, including the 1.43 g Harvard Ambulatory MicroRobot (HAMR). Research on these platforms has expanded its focus from manufacturing aspects to include design optimization and control for application-driven tasks. Consequently, the choice of gait selection, body morphology, leg trajectory, foot design, etc. have become areas of active research. HAMR has two controlled degrees-of-freedom per leg, making it an ideal candidate for exploring leg trajectory. We will discuss our work towards optimizing HAMR's leg trajectories for two different tasks: climbing using electroadhesives and level-ground running (5-10 BL/s). These tasks demonstrate the ability of a single platform to adapt to vastly different locomotive scenarios: quasi-static climbing with controlled ground contact, and dynamic running with uncontrolled ground contact. We will utilize trajectory optimization methods informed by existing models and experimental studies to determine leg trajectories for each task. We also plan to discuss how task specifications and choice of objective function have contributed to the shape of these optimal leg trajectories.

  6. Towards Better Computational Models of the Balance Scale Task: A Reply to Shultz and Takane

    ERIC Educational Resources Information Center

    van der Maas, Han L. J.; Quinlan, Philip T.; Jansen, Brenda R. J.

    2007-01-01

    In contrast to Shultz and Takane [Shultz, T.R., & Takane, Y. (2007). Rule following and rule use in the balance-scale task. "Cognition", in press, doi:10.1016/j.cognition.2006.12.004.] we do not accept that the traditional Rule Assessment Method (RAM) of scoring responses on the balance scale task has advantages over latent class analysis (LCA):…

  7. The Use of Specially Designed Tasks to Enhance Student Interest in the Cadaver Dissection Laboratory

    ERIC Educational Resources Information Center

    Kang, Seok Hoon; Shin, Jwa-Seop; Hwang, Young-il

    2012-01-01

    Cadaver dissection is a key component of anatomy education. Unfortunately, students sometimes regard the process of dissection as uninteresting or stressful. To make laboratory time more interesting and to encourage discussion and collaborative learning among medical students, specially designed tasks were assigned to students throughout…

  8. SCALE Code System 6.2.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including 3 deterministic and 3 Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 represents one of the most comprehensive revisions in the history of SCALE, providing several new capabilities and significant improvements in many existing features.

  9. EPOS-WP16: A coherent and collaborative network of Solid Earth Multi-scale laboratories

    NASA Astrophysics Data System (ADS)

    Calignano, Elisa; Rosenau, Matthias; Lange, Otto; Spiers, Chris; Willingshofer, Ernst; Drury, Martyn; van Kan-Parker, Mirjam; Elger, Kirsten; Ulbricht, Damian; Funiciello, Francesca; Trippanera, Daniele; Sagnotti, Leonardo; Scarlato, Piergiorgio; Tesei, Telemaco; Winkler, Aldo

    2017-04-01

    Laboratory facilities are an integral part of Earth Science research. The diversity of methods employed in such infrastructures reflects the multi-scale nature of the Earth system and is essential for the understanding of its evolution, for the assessment of geo-hazards and for the sustainable exploitation of geo-resources. In the frame of EPOS (European Plate Observing System), Working Package 16 represents a developing community of European Geoscience Multi-scale laboratories. The participant and collaborating institutions (Utrecht University, GFZ, RomaTre University, INGV, NERC, CSIC-ICTJA, CNRS, LMU, C4G-UBI, ETH, CNR*) embody several types of laboratory infrastructures, engaged in different fields of interest of Earth Science: from high temperature and pressure experimental facilities, to electron microscopy, micro-beam analysis, analogue tectonic and geodynamic modelling and paleomagnetic laboratories. The length scales encompassed by these infrastructures range from the nano- and micrometre levels (electron microscopy and micro-beam analysis) to the scale of experiments on centimetre-sized samples, and to analogue model experiments simulating the reservoir scale, the basin scale and the plate scale. The aim of WP16 is to provide two services by the year 2019: first, providing virtual access to data from laboratories (data service) and, second, providing physical access to laboratories (transnational access, TNA). Regarding the development of a data service, the current status is that most data produced by the various laboratory centres and networks are available only in limited "final form" in publications, while many data remain inaccessible and/or poorly preserved. 
Within EPOS the TCS Multi-scale laboratories is collecting and harmonizing available and emerging laboratory data on the properties and process controlling rock system behaviour at all relevant scales, in order to generate products accessible and interoperable through services for supporting

  10. Fate of estrone in laboratory-scale constructed wetlands

    USDA-ARS?s Scientific Manuscript database

    A horizontal, subsurface, laboratory-scale constructed wetland (CW) consisting of four cells in series was used to determine the attenuation of the steroid hormone estrone (E1) present in animal wastewater. Liquid swine manure diluted 1:80 with farm pond water and dosed with [14C]E1 flowed through ...

  11. Differences between child and adult large-scale functional brain networks for reading tasks.

    PubMed

    Liu, Xin; Gao, Yue; Di, Qiqi; Hu, Jiali; Lu, Chunming; Nan, Yun; Booth, James R; Liu, Li

    2018-02-01

    Reading is an important high-level cognitive function of the human brain, requiring interaction among multiple brain regions. Revealing differences between children's large-scale functional brain networks for reading tasks and those of adults helps us to understand how the functional network changes over reading development. Here we used functional magnetic resonance imaging data of 17 adults (19-28 years old) and 16 children (11-13 years old), and graph theoretical analyses to investigate age-related changes in large-scale functional networks during rhyming and meaning judgment tasks on pairs of visually presented Chinese characters. We found that: (1) adults had stronger inter-regional connectivity and nodal degree in occipital regions, while children had stronger inter-regional connectivity in temporal regions, suggesting that adults rely more on visual orthographic processing whereas children rely more on auditory phonological processing during reading. (2) Only adults showed between-task differences in inter-regional connectivity and nodal degree, whereas children showed no task differences, suggesting the topological organization of adults' reading network is more specialized. (3) Children showed greater inter-regional connectivity and nodal degree than adults in multiple subcortical regions; the hubs in children were more distributed in subcortical regions while the hubs in adults were more distributed in cortical regions. These findings suggest that reading development is manifested by a shift from reliance on subcortical to cortical regions. Taken together, our study suggests that Chinese reading development is supported by developmental changes in brain connectivity properties, and some of these changes may be domain-general while others may be specific to the reading domain. © 2017 Wiley Periodicals, Inc.
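    The nodal degree measure central to the abstract's graph-theoretical analysis is standard practice: binarise the functional connectivity (correlation) matrix at a threshold and count each region's suprathreshold connections. A minimal sketch with illustrative values (not the study's data or threshold):

    ```python
    import numpy as np

    def nodal_degree(conn, threshold=0.3):
        """Binarise a functional connectivity (correlation) matrix at
        `threshold` and return each region's nodal degree, as in standard
        graph-theoretical fMRI analyses."""
        adj = (np.abs(conn) >= threshold).astype(int)
        np.fill_diagonal(adj, 0)  # ignore self-connections
        return adj.sum(axis=1)

    # toy 4-region correlation matrix
    conn = np.array([
        [1.0, 0.6, 0.2, 0.5],
        [0.6, 1.0, 0.1, 0.4],
        [0.2, 0.1, 1.0, 0.0],
        [0.5, 0.4, 0.0, 1.0],
    ])
    print(nodal_degree(conn))  # → [2 2 0 2]
    ```

    Regions with unusually high degree are then flagged as hubs, which is how the abstract's cortical vs. subcortical hub comparison between adults and children would be computed.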

  12. 14th International Congress on Antiphospholipid Antibodies Task Force. Report on antiphospholipid syndrome laboratory diagnostics and trends.

    PubMed

    Bertolaccini, Maria Laura; Amengual, Olga; Andreoli, Laura; Atsumi, Tatsuya; Chighizola, Cecilia B; Forastiero, Ricardo; de Groot, Philip; Lakos, Gabriella; Lambert, Marc; Meroni, Pierluigi; Ortel, Thomas L; Petri, Michelle; Rahman, Anisur; Roubey, Robert; Sciascia, Savino; Snyder, Melissa; Tebo, Anne E; Tincani, Angela; Willis, Rohan

    2014-09-01

    Current classification criteria for definite Antiphospholipid Syndrome (APS) require the use of three laboratory assays to detect antiphospholipid antibodies (aCL, anti-β2GPI and LA) in the presence of at least one of the two major clinical manifestations (i.e. thrombosis or pregnancy morbidity) of the syndrome. However, several other autoantibodies shown to be directed against other proteins or their complexes with phospholipids have been proposed to be relevant to APS, but their clinical utility and diagnostic value remain elusive. This report summarizes the findings, conclusions and recommendations of the "APS Task Force 3-Laboratory Diagnostics and Trends" meeting that took place during the 14th International Congress on Antiphospholipid Antibodies (APLA 2013, September 18-21, Rio de Janeiro, RJ, Brazil). Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Wellbore Completion Systems Containment Breach Solution Experiments at a Large Scale Underground Research Laboratory : Sealant placement & scale-up from Lab to Field

    NASA Astrophysics Data System (ADS)

    Goodman, H.

    2017-12-01

    This investigation seeks to develop sealant technology that can restore containment to completed wells suffering CO2 gas leakages that are currently untreatable using conventional technologies. Experimentation is performed at the Mont Terri Underground Research Laboratory (MT-URL) located in NW Switzerland. The laboratory affords investigators an intermediate-scale test site that bridges the gap between the laboratory bench and full field-scale conditions. The project focus is the development of CO2 leakage remediation capability using sealant technology. The experimental concept includes the design and installation of a field-scale completion package designed to mimic the heating-cooling conditions of well systems, which may result in the development of micro-annuli detachments between the casing-cement-formation boundaries (Figure 1). Of particular interest is testing novel sealants that can be injected into relatively narrow micro-annuli flow paths of less than 120 microns aperture. Per a special report on CO2 storage submitted to the IPCC[1], active injection wells, along with inactive wells that have been abandoned, are identified as one of the most probable sources of leakage pathways for CO2 escape to the surface. Pressure leakage common to injection well and completions architecture often originates from tensile cracking due to temperature cycles, micro-annulus formation by casing contraction (differential casing to cement sheath movement) and cement sheath channel development. This discussion summarizes the experiment capability and sealant testing results. The experiment concludes with overcoring of the entire mock-completion test site to assess sealant performance in 2018. [1] IPCC Special Report on Carbon Dioxide Capture and Storage (September 2005), section 5.7.2 Processes and pathways for release of CO2 from geological storage sites, page 244

  14. Dual tasking and stuttering: from the laboratory to the clinic.

    PubMed

    Metten, Christine; Bosshardt, Hans-Georg; Jones, Mark; Eisenhuth, John; Block, Susan; Carey, Brenda; O'Brian, Sue; Packman, Ann; Onslow, Mark; Menzies, Ross

    2011-01-01

    The aim of the three studies in this article was to develop a way to include dual tasking in speech restructuring treatment for persons who stutter (PWS). It is thought that this may help clients maintain the benefits of treatment in the real world, where attentional resources are frequently diverted away from controlling fluency by the demands of other tasks. In Part 1, 17 PWS performed a story-telling task and a computer semantic task simultaneously. Part 2 reports the incorporation of the Part 1 protocol into a handy device for use in a clinical setting (the Dual Task and Stuttering Device, DAS-D). Part 3 is a proof of concept study in which three PWS reported on their experiences of using the device during treatment. In Part 1, stuttering frequency and errors on the computer task both increased under dual task conditions, indicating that the protocol would be appropriate for use in a clinical setting. All three participants in Part 3 reported positively on their experiences using the DAS-D. Dual tasking during treatment using the DAS-D appears to be a viable clinical procedure. Further research is required to establish effectiveness.

  15. Characterization of seismic properties across scales: from the laboratory- to the field scale

    NASA Astrophysics Data System (ADS)

    Grab, Melchior; Quintal, Beatriz; Caspari, Eva; Maurer, Hansruedi; Greenhalgh, Stewart

    2016-04-01

    When exploring geothermal systems, the main interest is in the factors controlling the efficiency of the heat exchanger. These include the energy state of the pore fluids and the presence of permeable structures forming part of the fluid transport system. Seismic methods are amongst the most common exploration techniques used to image the deep subsurface in order to evaluate such a geothermal heat exchanger. They make use of the fact that a seismic wave carries information on the properties of the rocks in the subsurface through which it passes. This enables the derivation of the stiffness and the density of the host rock from the seismic velocities. Moreover, it is well known that the seismic waveforms are modulated while propagating through the subsurface by visco-elastic effects due to wave-induced fluid flow, hence delivering information about the fluids in the rock's pore space. To constrain the interpretation of seismic data, that is, to link seismic properties with the fluid state and host rock permeability, it is common practice to measure the rock properties of small rock specimens in the laboratory under in-situ conditions. However, in magmatic geothermal systems or in systems situated in the crystalline basement, the host rock is often highly impermeable and fluid transport predominantly takes place in fracture networks consisting of fractures larger than the rock samples investigated in the laboratory. Therefore, laboratory experiments only provide the properties of relatively intact rock, and an up-scaling procedure is required to characterize the seismic properties of large rock volumes containing fractures and fracture networks and to study the effects of fluids in such fractured rock. We present a technique to parameterize fractured rock volumes as typically encountered in Icelandic magmatic geothermal systems, by combining laboratory experiments with effective medium calculations. The resulting models can be used to calculate the frequency-dependent bulk

  16. Rule Following and Rule Use in the Balance-Scale Task

    ERIC Educational Resources Information Center

    Shultz, Thomas R.; Takane, Yoshio

    2007-01-01

    Quinlan et al. [Quinlan, P., van der Maas, H., Jansen, B., Booij, O., & Rendell, M. (this issue). Re-thinking stages of cognitive development: An appraisal of connectionist models of the balance scale task. "Cognition", doi:10.1016/j.cognition.2006.02.004] use Latent Class Analysis (LCA) to criticize a connectionist model of development on the…

  17. SIMILARITY PROPERTIES AND SCALING LAWS OF RADIATION HYDRODYNAMIC FLOWS IN LABORATORY ASTROPHYSICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Falize, E.; Bouquet, S.; Michaut, C., E-mail: emeric.falize@cea.fr

    The spectacular recent development of modern high-energy-density laboratory facilities, which concentrate more and more energy in millimetric volumes, allows the astrophysical community to reproduce and explore, in millimeter-scale targets and during very short times, astrophysical phenomena in which radiation and matter are strongly coupled. The astrophysical relevance of these experiments can be checked through similarity properties and especially the establishment of scaling laws, which constitutes the keystone of laboratory astrophysics. From the radiating optically thin regime to the so-called optically thick radiative-pressure regime, we present in this paper, for the first time, a complete analysis of the main radiating regimes encountered in laboratory astrophysics within a single formalism based on Lie group theory. The Lie group method allows us to construct the scaling laws of a given problem easily and systematically. This powerful tool permits us to unify recent major advances on scaling laws, to identify new similarity concepts that we discuss in this paper, and to suggest important applications for present and future laboratory astrophysics experiments. All these results enable us to demonstrate theoretically that astrophysical phenomena in such radiating regimes can be explored experimentally thanks to powerful facilities. Consequently, the results presented here are a fundamental tool for the high-energy-density laboratory astrophysics community for quantifying astrophysical relevance and justifying laser experiments. Moreover, relying on Lie group theory, this paper constitutes the starting point of any analysis of the self-similar dynamics of radiating fluids.
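    The abstract does not reproduce its equations, but the classic Euler similarity for ideal, non-radiating hydrodynamics (Ryutov et al. 1999), which the Lie-group treatment above generalises to radiating regimes, illustrates what such a scaling law looks like: two systems whose lengths, densities and pressures differ by factors a, b, c evolve identically provided velocities and times are rescaled accordingly.

    ```latex
    % Classic Euler similarity (Ryutov et al. 1999) -- the non-radiating
    % baseline that the Lie-group analysis extends to radiating regimes.
    \[
      r_2 = a\,r_1, \quad \rho_2 = b\,\rho_1, \quad p_2 = c\,p_1
      \;\Longrightarrow\;
      v_2 = \sqrt{c/b}\;v_1, \quad t_2 = a\,\sqrt{b/c}\;t_1,
    \]
    \[
      \mathrm{Eu} \;=\; v\,\sqrt{\rho/p} \;=\; \text{invariant between the two systems.}
    \]
    ```

    Substituting the rescaled variables into the Euler equations term by term confirms the invariance; this conservation of the Euler number is what lets a millimeter-scale laser target stand in for an astrophysical flow.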

  18. Robonaut 2 performs tests in the U.S. Laboratory

    NASA Image and Video Library

    2013-01-17

    ISS034-E-031125 (17 Jan. 2013) --- In the International Space Station's Destiny laboratory, Robonaut 2 is pictured during a round of testing for the first humanoid robot in space. Ground teams put Robonaut through its paces as they remotely commanded it to operate valves on a task board. Robonaut is a testbed for exploring new robotic capabilities in space, and its form and dexterity allow it to use the same tools and control panels as its human counterparts do aboard the station.

  19. Robonaut 2 performs tests in the U.S. Laboratory

    NASA Image and Video Library

    2013-01-17

    ISS034-E-031124 (17 Jan. 2013) --- In the International Space Station's Destiny laboratory, Robonaut 2 is pictured during a round of testing for the first humanoid robot in space. Ground teams put Robonaut through its paces as they remotely commanded it to operate valves on a task board. Robonaut is a testbed for exploring new robotic capabilities in space, and its form and dexterity allow it to use the same tools and control panels as its human counterparts do aboard the station.

  20. Transitioning EEG experiments away from the laboratory using a Raspberry Pi 2.

    PubMed

    Kuziek, Jonathan W P; Shienh, Axita; Mathewson, Kyle E

    2017-02-01

    Electroencephalography (EEG) experiments are typically performed in controlled laboratory settings to minimise noise and produce reliable measurements. These controlled conditions also reduce the applicability of the obtained results to more varied environments and may limit their relevance to everyday situations. Advances in computer portability may increase the mobility and applicability of EEG results while decreasing costs. In this experiment we show that stimulus presentation using a Raspberry Pi 2 computer provides a low-cost, reliable alternative to a traditional desktop PC in the administration of EEG experimental tasks. Significant and reliable MMN and P3 activity, typical event-related potentials (ERPs) associated with an auditory oddball paradigm, was measured while experiments were administered using the Raspberry Pi 2. While latency differences in ERP triggering were observed between systems, these differences reduced power only marginally, likely owing to the reduced processing power of the Raspberry Pi 2. An auditory oddball task administered using the Raspberry Pi 2 produced ERPs similar to those derived from a desktop PC in a laboratory setting. Despite temporal differences and slight increases in the number of trials needed for similar statistical power, the Raspberry Pi 2 can be used to design and present auditory experiments comparable to a PC. Our results show that the Raspberry Pi 2 is a low-cost alternative to the desktop PC when administering EEG experiments and, due to its small size and low power consumption, will enable mobile EEG experiments unconstrained by a traditional laboratory setting. Copyright © 2016 Elsevier B.V. All rights reserved.
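    The oddball paradigm administered here is simple enough to sketch. Below is a minimal, hypothetical Python generator for the kind of trial list such a stimulus script might present; the 80/20 split and the no-leading-deviant and no-adjacent-deviant constraints are common conventions in oddball designs, not parameters reported in the abstract.

    ```python
    import random

    def oddball_sequence(n_trials=200, p_deviant=0.2, seed=0):
        """Generate a pseudo-random auditory oddball trial list.

        Standards ('std') and deviants ('dev') are interleaved so that the
        sequence never starts with a deviant and no two deviants occur back
        to back -- typical constraints when eliciting MMN/P3 components.
        """
        rng = random.Random(seed)
        n_dev = round(n_trials * p_deviant)
        seq = ['std'] * n_trials
        positions = []
        candidates = list(range(1, n_trials))  # never start with a deviant
        while len(positions) < n_dev:
            pos = rng.choice(candidates)
            # reject positions adjacent to (or equal to) an existing deviant
            if all(abs(pos - p) > 1 for p in positions):
                positions.append(pos)
        for pos in positions:
            seq[pos] = 'dev'
        return seq

    seq = oddball_sequence()
    print(seq.count('dev'), len(seq))  # prints: 40 200
    ```

    At presentation time each label would be mapped to a tone and a hardware trigger pulse; on a Raspberry Pi the GPIO pins are a natural place to emit those triggers.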

  1. A Content Analysis of General Chemistry Laboratory Manuals for Evidence of Higher-Order Cognitive Tasks

    NASA Astrophysics Data System (ADS)

    Domin, Daniel S.

    1999-01-01

    The science laboratory instructional environment is ideal for fostering the development of problem-solving, manipulative, and higher-order thinking skills: the skills today's learners need to compete in an increasingly technology-based society. This paper reports the results of a content analysis of ten general chemistry laboratory manuals. Three experiments from each manual were examined for evidence of higher-order cognitive activities. Analysis was based upon the six major cognitive categories of Bloom's Taxonomy of Educational Objectives: knowledge, comprehension, application, analysis, synthesis, and evaluation. The results of this study show that the overwhelming majority of general chemistry laboratory manuals provide tasks that require the use of only the lower-order cognitive skills: knowledge, comprehension, and application. Two of the laboratory manuals stood apart in having activities that utilized higher-order cognition. I describe the instructional strategies used within these manuals to foster higher-order cognitive development.

  2. Bioreactor Scalability: Laboratory-Scale Bioreactor Design Influences Performance, Ecology, and Community Physiology in Expanded Granular Sludge Bed Bioreactors

    PubMed Central

    Connelly, Stephanie; Shin, Seung G.; Dillon, Robert J.; Ijaz, Umer Z.; Quince, Christopher; Sloan, William T.; Collins, Gavin

    2017-01-01

    Studies investigating the feasibility of new, or improved, biotechnologies, such as wastewater treatment digesters, inevitably start with laboratory-scale trials. However, it is rarely determined whether laboratory-scale results reflect full-scale performance or microbial ecology. The Expanded Granular Sludge Bed (EGSB) bioreactor, which is a high-rate anaerobic digester configuration, was used as a model to address that knowledge gap in this study. Two laboratory-scale idealizations of the EGSB (a one-dimensional and a three-dimensional scale-down of a full-scale design) were built and operated in triplicate under near-identical conditions to a full-scale EGSB. The laboratory-scale bioreactors were seeded using biomass obtained from the full-scale bioreactor, and spent water from the distillation of whisky from maize was applied as substrate at both scales. Over 70 days, bioreactor performance, microbial ecology, and microbial community physiology were monitored at various depths in the sludge-beds using 16S rRNA gene sequencing (V4 region), specific methanogenic activity (SMA) assays, and a range of physical and chemical monitoring methods. SMA assays indicated dominance of the hydrogenotrophic pathway at full-scale, whilst a more balanced activity profile developed during the laboratory-scale trials. At each scale, Methanobacterium was the dominant methanogenic genus present. Bioreactor performance overall was better at laboratory-scale than full-scale. We observed that bioreactor design at laboratory-scale significantly influenced spatial distribution of microbial community physiology and taxonomy in the bioreactor sludge-bed, with 1-D bioreactor types promoting stratification of each. In the 1-D laboratory bioreactors, increased abundance of Firmicutes was associated with both granule position in the sludge bed and increased activity against acetate and ethanol as substrates. We further observed that stratification in the sludge-bed in 1-D laboratory-scale

  3. A comparison of relative toxicity rankings by some small-scale laboratory tests

    NASA Technical Reports Server (NTRS)

    Hilado, C. J.; Cumming, H. J.

    1977-01-01

    Small-scale laboratory tests for fire toxicity, suitable for use in the average laboratory hood, are needed for screening and ranking materials on the basis of relative toxicity. The performance of wool, cotton, and aromatic polyamide under several test procedures is presented.

  4. Explorative Function in Williams Syndrome Analyzed through a Large-Scale Task with Multiple Rewards

    ERIC Educational Resources Information Center

    Foti, F.; Petrosini, L.; Cutuli, D.; Menghini, D.; Chiarotti, F.; Vicari, S.; Mandolesi, L.

    2011-01-01

    This study aimed to evaluate spatial function in subjects with Williams syndrome (WS) by using a large-scale task with multiple rewards and comparing the spatial abilities of WS subjects with those of mental-age-matched control children. In the present spatial task, WS participants had to explore an open space to search for nine rewards placed in…

  5. Scale-Free and Multifractal Time Dynamics of fMRI Signals during Rest and Task

    PubMed Central

    Ciuciu, P.; Varoquaux, G.; Abry, P.; Sadaghiani, S.; Kleinschmidt, A.

    2012-01-01

    Scaling temporal dynamics in functional MRI (fMRI) signals have been evidenced for a decade as intrinsic characteristics of ongoing brain activity (Zarahn et al., 1997). Recently, scaling properties were shown to fluctuate across brain networks and to be modulated between rest and task (He, 2011): notably, the Hurst exponent, quantifying long memory, decreases under task in activating and deactivating brain regions. In most cases, such results were obtained, first, from univariate (voxelwise or regionwise) analysis, hence focusing on specific cognitive systems such as Resting-State Networks (RSNs) and raising the issue of the specificity of this scale-free dynamics modulation in RSNs; and second, using analysis tools designed to measure a single scaling exponent related to the second-order statistics of the data, thus relying on models that either implicitly or explicitly assume Gaussianity and (asymptotic) self-similarity, while fMRI signals may significantly depart from either of those two assumptions (Ciuciu et al., 2008; Wink et al., 2008). To address these issues, the present contribution elaborates on the analysis of the scaling properties of fMRI temporal dynamics by proposing two significant variations. First, scaling properties are technically investigated using the recently introduced Wavelet Leader-based Multifractal formalism (WLMF; Wendt et al., 2007). This measures a collection of scaling exponents, thus enabling a richer and more versatile description of scale invariance (beyond correlation and Gaussianity), referred to as multifractality. It also benefits from improved estimation performance compared to tools previously used in the literature. Second, scaling properties are investigated in both RSN and non-RSN structures (e.g., artifacts), at a broader spatial scale than the voxel one, using a multivariate approach, namely the Multi-Subject Dictionary Learning (MSDL) algorithm (Varoquaux et al., 2011) that produces a set of spatial components that
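    The single-exponent analyses the authors move beyond can be illustrated with the simplest of them, the aggregated-variance estimate of the Hurst exponent. This sketch (not the WLMF of the paper) shows what "one scaling exponent from second-order statistics" means in practice; the block sizes and series length are illustrative choices.

    ```python
    import numpy as np

    def hurst_aggvar(x, block_sizes=(4, 8, 16, 32, 64)):
        """Estimate the Hurst exponent by the aggregated-variance method.

        For a self-similar series, the variance of block means of size m
        scales as m**(2H - 2); H is read off the slope of a log-log
        regression of variance against block size.
        """
        x = np.asarray(x, dtype=float)
        log_m, log_v = [], []
        for m in block_sizes:
            n_blocks = len(x) // m
            means = x[:n_blocks * m].reshape(n_blocks, m).mean(axis=1)
            log_m.append(np.log(m))
            log_v.append(np.log(means.var()))
        slope = np.polyfit(log_m, log_v, 1)[0]
        return 1 + slope / 2

    rng = np.random.default_rng(42)
    h = hurst_aggvar(rng.standard_normal(4096))
    print(round(h, 2))  # white noise: H should come out close to 0.5
    ```

    A single number like this assumes one exponent governs all moments; the wavelet-leader multifractal formalism used in the paper instead estimates a whole spectrum of exponents.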

  6. Laboratory and pilot-scale bioremediation of pentaerythritol tetranitrate (PETN) contaminated soil.

    PubMed

    Zhuang, Li; Gui, Lai; Gillham, Robert W; Landis, Richard C

    2014-01-15

    PETN (pentaerythritol tetranitrate), a munitions constituent, is commonly encountered in munitions-contaminated soils and poses a serious threat to aquatic organisms. This study investigated anaerobic remediation of PETN-contaminated soil at a site near Denver, Colorado. Both granular iron and organic carbon amendments were used in both laboratory and pilot-scale tests. The laboratory results showed that, with various organic carbon amendments, PETN at initial concentrations of between 4500 and 5000 mg/kg was effectively removed within 84 days. In the field trial, after a test period of 446 days, PETN mass removal of up to 53,071 mg/kg (80%) was achieved with an organic carbon amendment (DARAMEND) of 4% by weight. In previous laboratory studies, granular iron had been shown to be highly effective in degrading PETN. However, in both the laboratory and pilot-scale tests here, granular iron proved to be ineffective, a consequence of passivation of the iron surfaces caused by the very high concentrations of nitrate in the contaminated soil. This study indicated that a low concentration of organic carbon was a key factor limiting bioremediation of PETN in the contaminated soil. Furthermore, the addition of organic carbon amendments such as the DARAMEND materials or brewers grain proved to be highly effective in stimulating the biodegradation of PETN and could provide the basis for full-scale remediation of PETN-contaminated sites. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. Job Analysis Techniques for Restructuring Health Manpower Education and Training in the Navy Medical Department. Attachment 9. Laboratory QPCB Task Sort for Medical Laboratory Technology.

    ERIC Educational Resources Information Center

    Technomics, Inc., McLean, VA.

    This publication is Attachment 9 of a set of 16 computer listed QPCB task sorts, by career level, for the entire Hospital Corps and Dental Technician fields. Statistical data are presented in tabular form for a detailed listing of job duties in medical laboratory technology. (BT)

  8. Replicating the microbial community and water quality performance of full-scale slow sand filters in laboratory-scale filters.

    PubMed

    Haig, Sarah-Jane; Quince, Christopher; Davies, Robert L; Dorea, Caetano C; Collins, Gavin

    2014-09-15

    Previous laboratory-scale studies to characterise the functional microbial ecology of slow sand filters have suffered from methodological limitations that could compromise their relevance to full-scale systems. Therefore, to ascertain whether laboratory-scale slow sand filters (L-SSFs) can replicate the microbial community and water quality production of industrially operated full-scale slow sand filters (I-SSFs), eight cylindrical L-SSFs were constructed and used to treat water from the same source as the I-SSFs. Half of the L-SSF sand beds were composed of sterilized sand (sterile) from the industrial filters and the other half of sand taken directly from the same industrial filter (non-sterile). All filters were operated for 10 weeks, with the microbial community and water quality parameters sampled and analysed weekly. To characterize the microbial community, phyla-specific qPCR assays and 454 pyrosequencing of the 16S rRNA gene were used in conjunction with an array of statistical techniques. The results demonstrate that it is possible to mimic both the water quality production and the structure of the microbial community of full-scale filters in the laboratory - at all levels of taxonomic classification except OTU - thus allowing comparison of L-SSF experiments with full-scale units. Further, it was found that the sand type composing the filter bed (non-sterile or sterile), the water quality produced, the age of the filters and the depth of sand samples were all significant factors in explaining observed differences in the structure of the microbial consortia. This study is the first, to the authors' knowledge, to demonstrate that scaled-down slow sand filters can accurately reproduce the water quality and microbial consortia of full-scale slow sand filters. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Recovery. Oxygen Transport Membrane-Based OxyCombustion for CO 2 Capture from Power Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly, Sean; Geary, Joan; Chakravrti, Shrikar

    2015-12-22

    This Final report documents and summarizes all of the work performed for the DOE award DE-FC26-07NT43088 during the period from April 2007 - June 2012. This report outlines accomplishments for the following tasks: Task 1 – Process and Systems Engineering, Task 2 – OTM Performance Improvement, Task 3 – OTM Manufacturing Development, Task 4 - Laboratory Scale Testing and Task 5 – Project Management.

  10. Task2 potassium channels set central respiratory CO2 and O2 sensitivity

    PubMed Central

    Gestreau, Christian; Heitzmann, Dirk; Thomas, Joerg; Dubreuil, Véronique; Bandulik, Sascha; Reichold, Markus; Bendahhou, Saïd; Pierson, Patricia; Sterner, Christina; Peyronnet-Roux, Julie; Benfriha, Chérif; Tegtmeier, Ines; Ehnes, Hannah; Georgieff, Michael; Lesage, Florian; Brunet, Jean-Francois; Goridis, Christo; Warth, Richard; Barhanin, Jacques

    2010-01-01

    Task2 K+ channel expression in the central nervous system is surprisingly restricted to a few brainstem nuclei, including the retrotrapezoid (RTN) region. All Task2-positive RTN neurons were lost in mice bearing a Phox2b mutation that causes the human congenital central hypoventilation syndrome. In plethysmography, Task2−/− mice showed disturbed chemosensory function with hypersensitivity to low CO2 concentrations, leading to hyperventilation. Task2 probably is needed to stabilize the membrane potential of chemoreceptive cells. In addition, Task2−/− mice lost the long-term hypoxia-induced respiratory decrease whereas the acute carotid-body-mediated increase was maintained. The lack of anoxia-induced respiratory depression in the isolated brainstem–spinal cord preparation suggested a central origin of the phenotype. Task2 activation by reactive oxygen species generated during hypoxia could silence RTN neurons, thus contributing to respiratory depression. These data identify Task2 as a determinant of central O2 chemoreception and demonstrate that this phenomenon is due to the activity of a small number of neurons located at the ventral medullary surface. PMID:20133877

  11. Laboratory Scale Coal And Biomass To Drop-In Fuels (CBDF) Production And Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lux, Kenneth; Imam, Tahmina; Chevanan, Nehru

    This Final Technical Report describes the work and accomplishments of the project entitled, “Laboratory Scale Coal and Biomass to Drop-In Fuels (CBDF) Production and Assessment.” The main objective of the project was to fabricate and test a lab-scale system producing liquid fuel at a rate of 2 liters per day from coal blended with different percentages of biomass, such as corn stover and switchgrass. The system utilizes the patented Altex fuel-production technology, which incorporates advanced catalysts developed by Pennsylvania State University. The system was designed, fabricated, tested, and assessed for economic and environmental feasibility relative to competing technologies.

  12. The Intersection of Task-Based Interaction, Task Complexity, and Working Memory: L2 Question Development through Recasts in a Laboratory Setting

    ERIC Educational Resources Information Center

    Kim, YouJin; Payant, Caroline; Pearson, Pamela

    2015-01-01

    The extent to which individual differences in cognitive abilities affect the relationship among task complexity, attention to form, and second language development has been addressed only minimally in the cognition hypothesis literature. The present study explores how reasoning demands in tasks and working memory (WM) capacity predict learners'…

  13. Child Proportional Scaling: Is 1/3 = 2/6 = 3/9 = 4/12?

    ERIC Educational Resources Information Center

    Boyer, Ty W.; Levine, Susan C.

    2012-01-01

    The current experiments examined the role of scale factor in children's proportional reasoning. Experiment 1 used a choice task and Experiment 2 used a production task to examine the abilities of kindergartners through fourth-graders to match equivalent, visually depicted proportional relations. The findings of both experiments show that accuracy…

  14. Recurrence quantification analysis of electroencephalograph signals during standard tasks of Waterloo-Stanford group scale of hypnotic susceptibility.

    PubMed

    Yargholi, Elahe'; Nasrabadi, Ali Motie

    2015-01-01

    The purpose of this study was to apply RQA (recurrence quantification analysis) to hypnotic electroencephalograph (EEG) signals recorded after hypnotic induction while subjects performed standard tasks of the Waterloo-Stanford Group Scale (WSGS) of hypnotic susceptibility. Recurrence quantifiers were then used to analyse the influence of hypnotic depth on the EEGs. By the application of this method, the capability of the tasks to distinguish subjects of different hypnotizability levels was determined. In addition, medium-hypnotizable subjects showed the highest disposition to hypnotic induction. Similarities between the dynamics governing the brain during tasks of the same type were also observed. The present study offers two notable innovations: investigating the EEGs of hypnotized subjects performing mental tasks of the WSGS, and applying RQA to hypnotic EEGs.
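    As a concrete illustration of what RQA computes, here is a hedged Python sketch of two standard recurrence quantifiers, recurrence rate (RR) and determinism (DET), on a toy signal. The embedding, radius and minimum line length used in the study are not given in the abstract, so the values below are illustrative only, and no time-delay embedding is applied for brevity.

    ```python
    import numpy as np

    def rqa_measures(x, radius=0.3, min_line=2):
        """Recurrence rate and determinism of a 1-D signal.

        Builds the recurrence matrix R[i, j] = 1 when |x[i] - x[j]| < radius,
        then reports (a) the fraction of recurrent points (RR) and (b) the
        fraction of those lying on diagonal lines of length >= min_line (DET).
        """
        x = np.asarray(x, dtype=float)
        n = len(x)
        R = (np.abs(x[:, None] - x[None, :]) < radius).astype(int)
        np.fill_diagonal(R, 0)                 # drop the trivial identity line
        rr = R.sum() / (n * n - n)             # recurrence rate
        diag_points = 0
        for k in range(-(n - 1), n):           # scan every off-diagonal
            if k == 0:
                continue
            run = 0
            for v in list(np.diagonal(R, k)) + [0]:   # sentinel ends last run
                if v:
                    run += 1
                else:
                    if run >= min_line:
                        diag_points += run
                    run = 0
        det = diag_points / R.sum() if R.sum() else 0.0
        return rr, det

    t = np.linspace(0, 8 * np.pi, 200)
    rr, det = rqa_measures(np.sin(t))
    print(round(rr, 2), round(det, 2))  # a periodic signal yields high DET
    ```

    Comparing such quantifiers between hypnotic-depth groups is the kind of analysis the study describes; a production analysis would add delay embedding and a principled radius choice.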

  15. Leveraging Large-Scale Semantic Networks for Adaptive Robot Task Learning and Execution.

    PubMed

    Boteanu, Adrian; St Clair, Aaron; Mohseni-Kabir, Anahita; Saldanha, Carl; Chernova, Sonia

    2016-12-01

    This work seeks to leverage semantic networks containing millions of entries encoding assertions of commonsense knowledge to enable improvements in robot task execution and learning. The specific application we explore in this project is object substitution in the context of task adaptation. Humans easily adapt their plans to compensate for missing items in day-to-day tasks, substituting a wrap for bread when making a sandwich, or stirring pasta with a fork when out of spoons. Robot plan execution, however, is far less robust, with missing objects typically leading to failure if the robot is not aware of alternatives. In this article, we contribute a context-aware algorithm that leverages the linguistic information embedded in the task description to identify candidate substitution objects without reliance on explicit object affordance information. Specifically, we show that the task context provided by the task labels within the action structure of a task plan can be leveraged to disambiguate information within a noisy large-scale semantic network containing hundreds of potential object candidates to identify successful object substitutions with high accuracy. We present two extensive evaluations of our work on both abstract and real-world robot tasks, showing that the substitutions made by our system are valid, accepted by users, and lead to a statistically significant reduction in robot learning time. In addition, we report the outcomes of testing our approach with a large number of crowd workers interacting with a robot in real time.

  16. Report of the Task Force on SSC Magnet System Test Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1984-10-01

    The Task Force on SSC Magnet System Test Site was appointed by Maury Tigner, Director of the SSC, Phase 1, in August 1984. In brief, the charge asked the Task Force to make a critical evaluation of potential test sites for a major SSC Magnet System Test Facility (STF) with regard to: (1) availability of the needed space, utilities, staff and other requirements on the desired time scale; and (2) the cost of preparing the sites for the tests and of operating the facilities during the test period. The charge further suggests that, by virtue of existing facilities and availability of experienced staff, BNL and FNAL are the two best candidate sites, and that it therefore appears appropriate to restrict the considerations of the Task Force to these sites. During the subsequent deliberations of the Task Force, no new facts were revealed that altered the assumptions of the charge in this regard. The charge does not ask for a specific site recommendation for the STF. Indeed, an agreement on such a recommendation would be difficult to achieve considering the composition of the Task Force, wherein a large fraction of the membership is drawn from the two contending laboratories. Instead, we have attempted to describe the purpose of the facility, outline a productive test program, list the major facilities required, carefully review the laboratories' responses to the facility requirements, and make objective comparisons of the specific features and capabilities offered.

  17. Large-Scale Cooperative Task Distribution on Peer-to-Peer Networks

    DTIC Science & Technology

    2012-01-01

    …of agents, and each agent attempts to form a coalition with its most profitable partner. The second algorithm builds upon the Shapley formula [37]… clusters at the second layer. These Category Layer clusters each represent a single resource, and agents join one or more clusters based on their…

  18. Analysis Of 2H-Evaporator Scale Pot Bottom Sample [HTF-13-11-28H

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oji, L. N.

    2013-07-15

    Savannah River Remediation (SRR) is planning to remove a buildup of sodium aluminosilicate scale from the 2H-evaporator pot by loading and soaking the pot with heated 1.5 M nitric acid solution. Sampling and analysis of the scale material from the 2H evaporator has been performed so that the evaporator can be chemically cleaned beginning July of 2013. Historically, since the operation of the Defense Waste Processing Facility (DWPF), silicon in the DWPF recycle stream combines with aluminum in the typical tank farm supernate to form sodium aluminosilicate scale mineral deposits in the 2H-evaporator pot and gravity drain line. The 2H-evaporator scale samples analyzed by Savannah River National Laboratory (SRNL) came from the bottom cone sections of the 2H-evaporator pot. The sample holder from the 2H-evaporator wall was virtually empty and was not included in the analysis. It is worth noting that, after the delivery of these 2H-evaporator scale samples to SRNL for the analyses, the plant customer determined that the 2H evaporator could be operated for an additional period prior to requiring cleaning. Therefore, there was no need for expedited sample analysis as was presented in the Technical Task Request. However, a second set of 2H-evaporator scale samples was expected in May of 2013, which would need expedited sample analysis. X-ray diffraction (XRD) analysis confirmed the bottom cone section sample from the 2H-evaporator pot consisted of nitrated cancrinite (a crystalline sodium aluminosilicate solid), clarkeite and uranium oxide. There were also mercury-compound XRD peaks which could not be matched, and further X-ray fluorescence (XRF) analysis of the sample confirmed the existence of elemental mercury or mercuric oxide. On an "as received" basis, the scale contained an average of 7.09E+00 wt % total uranium (n = 3; st.dev. = 8.31E-01 wt %) with a U-235 enrichment of 5.80E-01 % (n = 3; st.dev. = 3.96E-02 %). The measured U-238 concentration was 7.05E+00 wt

  19. Development and Confirmatory Factor Analysis of the Achievement Task Value Scale for University Students

    ERIC Educational Resources Information Center

    Lou, Yu-Chiung; Lin, Hsiao-Fang; Lin, Chin-Wen

    2013-01-01

    The aims of the study were (a) to develop a scale to measure university students' task value and (b) to use confirmatory factor analytic techniques to investigate the construct validity of the scale. The questionnaire items were developed based on theoretical considerations and the final version contained 38 items divided into 4 subscales.…

  20. The Scaling of Water Governance Tasks: A Comparative Federal Analysis of the European Union and Australia

    NASA Astrophysics Data System (ADS)

    Benson, David; Jordan, Andrew

    2010-07-01

    Conflicts over how to “scale” policy-making tasks have characterized environmental governance since time immemorial. They are particularly evident in the area of water policy and raise important questions over the democratic legitimacy, economic efficiency and effectiveness of allocating (or “scaling”) tasks to some administrative levels as opposed to others. This article adopts a comparative federalism perspective to assess the “optimality” of scaling—either upward or downward—in one issue area, namely coastal recreational water quality. It does so by comparing the scaling of recreational water quality tasks in the European Union (EU) and Australia. It reveals that the two systems have adopted rather different approaches to scaling and that this difference can partly be accounted for in federal theoretical terms. However, a much greater awareness of the inescapably political nature of scaling processes is nonetheless required. Finally, some words of caution are offered with regard to transferring policy lessons between these two jurisdictions.

  1. Mother Earth Chemistry: A Laboratory Course for Nonmajors.

    ERIC Educational Resources Information Center

    Roberts, J. L.; And Others

    1996-01-01

    Describes a laboratory course that introduces students to chemistry using examples commonly encountered in the supermarket and on the dinner table. Acquaints students with simple chemical tasks that can be practiced at home, including the making of wine, ale, soap, cheese, and yogurt, and introduces them to the small-scale production of…

  2. The creation of a superstitious belief regarding putters in a laboratory-based golfing task

    PubMed Central

    Churchill, Andrew; Taylor, Jamie A.; Parkes, Royston

    2015-01-01

    The objective was to determine the extent to which it was possible to induce superstitious behaviour and beliefs in a golf putting task in a laboratory. Participants (N = 28) took part in a putting task using three identical clubs in which visual feedback regarding performance was restricted. Participants were provided with verbal feedback of their performance, which was honest when they used one putter, negative with a second putter (they did better than they were told) and positive with a third (they did worse than they were told). After this initial acquisition phase, a competition was announced and participants were asked to select a putter they would like to use. The participants were then asked to rate various qualities of the putters. Significantly more participants selected the “positive” putter for the competition (N = 22) compared to the “negative” putter (N = 1), p < .001. In addition, participants claimed that the positive putter had a better weight, was more comfortable and easier to use than the negative putter (all p < .001). Overall, this evidence can be taken to show that a superstitious belief can be formed in a short amount of time within a laboratory setting and that it can affect both the perceptions and choices of an individual. PMID:26877722

  3. The creation of a superstitious belief regarding putters in a laboratory-based golfing task.

    PubMed

    Churchill, Andrew; Taylor, Jamie A; Parkes, Royston

    2015-10-02

    The objective was to determine the extent to which it was possible to induce superstitious behaviour and beliefs in a golf putting task in a laboratory. Participants ( N  = 28) took part in a putting task using three identical clubs in which visual feedback regarding performance was restricted. Participants were provided with verbal feedback of their performance, which was honest when they used one putter, negative with a second putter (they did better than they were told) and positive with a third (they did worse than they were told). After this initial acquisition phase, a competition was announced and participants were asked to select a putter they would like to use. The participants were then asked to rate various qualities of the putters. Significantly more participants selected the "positive" putter for the competition ( N  = 22) compared to the "negative" putter ( N  = 1), p  < .001. In addition, participants claimed that the positive putter had a better weight, was more comfortable and easier to use than the negative putter (all p  < .001). Overall, this evidence can be taken to show that a superstitious belief can be formed in a short amount of time within a laboratory setting and that it can affect both the perceptions and choices of an individual.

  4. Medical Laboratory Assistant. Laboratory Occupations Cluster.

    ERIC Educational Resources Information Center

    Michigan State Univ., East Lansing. Coll. of Agriculture and Natural Resources Education Inst.

    This task-based curriculum guide for medical laboratory assistant is intended to help the teacher develop a classroom management system where students learn by doing. Introductory materials include a Dictionary of Occupational Titles job code and title sheet, a career ladder, a matrix relating duty/task numbers to job titles, and a task list. Each…

  5. Scale-Up of GRCop: From Laboratory to Rocket Engines

    NASA Technical Reports Server (NTRS)

    Ellis, David L.

    2016-01-01

    GRCop is a high temperature, high thermal conductivity copper-based series of alloys designed primarily for use in regeneratively cooled rocket engine liners. It began with laboratory-level production of a few grams of ribbon produced by chill block melt spinning and has grown to commercial-scale production of large-scale rocket engine liners. Along the way, a variety of methods of consolidating and working the alloy were examined, a database of properties was developed and a variety of commercial and government applications were considered. This talk will briefly address the basic material properties used for selection of compositions to scale up, the methods used to go from simple ribbon to rocket engines, the need to develop a suitable database, and the issues related to getting the alloy into a rocket engine or other application.

  6. Countercurrent fixed-bed gasification of biomass at laboratory scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di Blasi, C.; Signorelli, G.; Portoricco, G.

    1999-07-01

    A laboratory-scale countercurrent fixed-bed gasification plant has been designed and constructed to produce data for process modeling and to compare the gasification characteristics of several biomasses (beechwood, nutshells, olive husks, and grape residues). The composition of the producer gas and spatial temperature profiles were measured for biomass gasification at different air flow rates. The gas heating value always attains a maximum as a function of this operating variable, associated with a decrease of the air-to-fuel ratio. Optimal gasification conditions of wood and agricultural residues give rise to comparable gas heating values, in the range 5-5.5 MJ/Nm³, with 28-30% CO, 5-7% CO₂, 6-8% H₂, 1-2% CH₄, and small amounts of C₂ hydrocarbons (apart from nitrogen). However, gasification of agricultural residues is more difficult because of bed transport, partial ash sintering, nonuniform flow distribution, and the presence of a muddy phase in the effluents, so proper pretreatments are needed for large-scale applications.
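    As a rough consistency check on the reported range, the heating value of a producer gas can be estimated as a mole-fraction-weighted sum of component heating values. A minimal sketch, using standard literature volumetric LHVs rather than values from the paper:

```python
# Estimate the lower heating value (LHV) of producer gas from its composition,
# as a cross-check on the 5-5.5 MJ/Nm^3 range reported in the abstract.
# Volumetric LHVs (MJ/Nm^3) are standard literature values, not from the paper.
LHV = {"CO": 12.6, "H2": 10.8, "CH4": 35.8}  # MJ/Nm^3

def gas_heating_value(fractions):
    """Mole-fraction-weighted LHV of a fuel gas mixture (MJ/Nm^3).
    Inert species (N2, CO2) contribute nothing."""
    return sum(fractions.get(sp, 0.0) * lhv for sp, lhv in LHV.items())

# Mid-range composition from the abstract: 29% CO, 7% H2, 1.5% CH4
composition = {"CO": 0.29, "H2": 0.07, "CH4": 0.015}
print(round(gas_heating_value(composition), 2))
```

    The estimate lands just below the reported 5-5.5 MJ/Nm³ range, consistent with the small unquantified C₂ hydrocarbon contribution.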

  7. A meta-analytic review of the impact of intranasal oxytocin administration on cortisol concentrations during laboratory tasks: moderation by method and mental health.

    PubMed

    Cardoso, Christopher; Kingdon, Danielle; Ellenbogen, Mark A

    2014-11-01

    A large body of research has examined the acute effects of intranasal oxytocin administration on social cognition and stress regulation. While progress has been made with respect to understanding the effect of oxytocin administration on social cognition in clinical populations (e.g. autism, schizophrenia), less is known about its impact on the functioning of the hypothalamic-pituitary-adrenal (HPA) axis among individuals with a mental disorder. We conducted a meta-analysis on the acute effect of intranasal oxytocin administration on the cortisol response to laboratory tasks. The search yielded eighteen studies employing a randomized, placebo-controlled design (k = 18, N = 675). Random-effects models and moderator analyses were performed using the metafor package for the statistical program R. The overall effect size estimate was modest and not statistically significant (Hedges g = -0.151, p = 0.11), with moderate heterogeneity in this effect across studies (I² = 31%). Controlling for baseline differences in cortisol concentrations, moderation analyses revealed that this effect was larger in response to challenging laboratory tasks that produced a robust stimulation of the HPA axis (Hedges g = -0.433, 95% CI [-0.841, -0.025]) and in clinical populations relative to healthy controls (Hedges g = -0.742, 95% CI [-1.405, -0.078]). Overall, oxytocin administration showed greater attenuation of the cortisol response to laboratory tasks that strongly activated the HPA axis, relative to tasks that did not. The effect was more robust among clinical populations, suggesting possible increased sensitivity to oxytocin among those with a clinical diagnosis and concomitant social difficulties. These data support the view that oxytocin may play an important role in HPA dysfunction associated with psychopathology.
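    The pooled Hedges g above comes from a random-effects model. A minimal sketch of the DerSimonian-Laird estimator, the classic method-of-moments approach for such models (the study used R's metafor package; the effect sizes and variances below are illustrative, not the study's data):

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate via the DerSimonian-Laird method."""
    k = len(effects)
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q and the method-of-moments between-study variance tau^2
    Q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / c)
    # Re-weight including tau^2 to get the random-effects pooled estimate
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2

# Illustrative Hedges g values and sampling variances (not the study's data)
g = [-0.40, -0.10, 0.05, -0.25]
v = [0.04, 0.03, 0.05, 0.06]
pooled, se, tau2 = dersimonian_laird(g, v)
print(f"pooled g = {pooled:.3f}, 95% CI = [{pooled - 1.96 * se:.3f}, {pooled + 1.96 * se:.3f}]")
```

    When Q falls below its degrees of freedom, as here, tau² truncates to zero and the estimate reduces to the fixed-effect (inverse-variance) result.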

  8. Cross-flow turbines: progress report on physical and numerical model studies at large laboratory scale

    NASA Astrophysics Data System (ADS)

    Wosnik, Martin; Bachant, Peter

    2016-11-01

    Cross-flow turbines show potential in marine hydrokinetic (MHK) applications. A research focus is on accurately predicting device performance and wake evolution to improve turbine array layouts for maximizing overall power output, i.e., minimizing wake interference or taking advantage of constructive wake interaction. Experiments were carried out with large laboratory-scale cross-flow turbines of diameter D ~ O(1 m) using a turbine test bed in a large-cross-section tow tank, designed to achieve sufficiently high Reynolds numbers for the results to be Reynolds-number independent with respect to turbine performance and wake statistics, such that they can be reliably extrapolated to full scale and used for model validation. Several turbines of varying solidity were employed, including the UNH Reference Vertical Axis Turbine (RVAT) and a 1:6 scale model of the DOE-Sandia Reference Model 2 (RM2) turbine. To improve parameterization in array simulations, an actuator line model (ALM) was developed to provide a computationally feasible method for simulating full turbine arrays inside Navier-Stokes models. Results are presented for the simulation of performance and wake dynamics of cross-flow turbines and compared with experiments and body-fitted-mesh, blade-resolving CFD. Supported by NSF-CBET Grant 1150797, Sandia National Laboratories.

  9. Construction of the Propulsion Systems Laboratory No. 1 and 2

    NASA Image and Video Library

    1951-01-21

    Construction of the Propulsion Systems Laboratory No. 1 and 2 at the National Advisory Committee for Aeronautics (NACA) Lewis Flight Propulsion Laboratory. When it began operation in late 1952, the Propulsion Systems Laboratory was the NACA’s most powerful facility for testing full-scale engines at simulated flight altitudes. The facility contained two altitude-simulating test chambers which were a technological combination of the static sea-level test stands and the complex Altitude Wind Tunnel, which recreated actual flight conditions on a larger scale. NACA Lewis began designing the new facility in 1947 as part of a comprehensive plan to improve the altitude testing capabilities across the lab. The exhaust, refrigeration, and combustion air systems of all the major test facilities were linked so that different facilities could be used to complement one another's capabilities. Propulsion Systems Laboratory construction began in late summer 1949 with the installation of an overhead exhaust pipe connecting the facility to the Altitude Wind Tunnel and Engine Research Building. The large test section pieces arrived in early 1951, when this photograph was taken. The two primary coolers for the altitude exhaust are in place within the framework near the center of the photograph.

  10. Validation of laboratory-scale recycling test method of paper PSA label products

    Treesearch

    Carl Houtman; Karen Scallon; Richard Oldack

    2008-01-01

    Starting with test methods and a specification developed by the U.S. Postal Service (USPS) Environmentally Benign Pressure Sensitive Adhesive Postage Stamp Program, a laboratory-scale test method and a specification were developed and validated for pressure-sensitive adhesive labels. By comparing results from this new test method and pilot-scale tests, which have been...

  11. Transformation of pristine and citrate-functionalized CeO2 nanoparticles in a laboratory-scale activated sludge reactor.

    PubMed

    Barton, Lauren E; Auffan, Melanie; Bertrand, Marie; Barakat, Mohamed; Santaella, Catherine; Masion, Armand; Borschneck, Daniel; Olivi, Luca; Roche, Nicolas; Wiesner, Mark R; Bottero, Jean-Yves

    2014-07-01

    Engineered nanomaterials (ENMs) are used to enhance the properties of many manufactured products and technologies. Increased use of ENMs will inevitably lead to their release into the environment. An important route of exposure is through the waste stream, where ENMs will enter wastewater treatment plants (WWTPs), undergo transformations, and be discharged with treated effluent or biosolids. To better understand the fate of a common ENM in WWTPs, experiments with laboratory-scale activated sludge reactors and pristine and citrate-functionalized CeO2 nanoparticles (NPs) were conducted. Greater than 90% of the CeO2 introduced was observed to associate with biosolids. This association was accompanied by reduction of the Ce(IV) NPs to Ce(III). After 5 weeks in the reactor, 44 ± 4% reduction was observed for the pristine NPs and 31 ± 3% for the citrate-functionalized NPs, illustrating surface functionality dependence. Thermodynamic arguments suggest that the likely Ce(III) phase generated would be Ce2S3. This study indicates that the majority of CeO2 NPs (>90% by mass) entering WWTPs will be associated with the solid phase, and a significant portion will be present as Ce(III). At maximum, 10% of the CeO2 will remain in the effluent and be discharged as a Ce(IV) phase, governed by cerianite (CeO2).

  12. Using a million cell simulation of the cerebellum: network scaling and task generality.

    PubMed

    Li, Wen-Ke; Hausknecht, Matthew J; Stone, Peter; Mauk, Michael D

    2013-11-01

    Several factors combine to make it feasible to build computer simulations of the cerebellum and to test them in biologically realistic ways. These simulations can be used to help understand the computational contributions of various cerebellar components, including the relevance of the enormous number of neurons in the granule cell layer. In previous work we have used a simulation containing 12,000 granule cells to develop new predictions and to account for various aspects of eyelid conditioning, a form of motor learning mediated by the cerebellum. Here we demonstrate the feasibility of scaling up this simulation to over one million granule cells using parallel graphics processing unit (GPU) technology. We observe that this increase in the number of granule cells requires only twice the execution time of the smaller simulation on the GPU. We demonstrate that this simulation, like its smaller predecessor, can emulate certain basic features of conditioned eyelid responses, with a slight improvement in performance in one measure. We also use this simulation to examine the generality of the computational properties that we have derived from studying eyelid conditioning. We demonstrate that this scaled-up simulation can learn a high level of performance in a classic machine learning task, the cart-pole balancing task. These results suggest that this parallel GPU technology can be used to build very large-scale simulations whose connectivity ratios match those of the real cerebellum, and that these simulations can be used to guide future studies on cerebellar-mediated tasks and on machine learning problems.
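    For context, the cart-pole balancing task mentioned above has simple, well-known dynamics. A minimal sketch of the classic Barto-Sutton-Anderson formulation (generic benchmark physics, not the authors' simulation code):

```python
import math

# Minimal cart-pole dynamics (Barto, Sutton & Anderson formulation):
# a pole hinged to a cart must be balanced by pushing the cart left or right.
G, M_CART, M_POLE, L_HALF, DT = 9.8, 1.0, 0.1, 0.5, 0.02

def step(x, x_dot, theta, theta_dot, force):
    """Advance the cart-pole state one Euler step under an applied force (N)."""
    total_m = M_CART + M_POLE
    temp = (force + M_POLE * L_HALF * theta_dot ** 2 * math.sin(theta)) / total_m
    theta_acc = (G * math.sin(theta) - math.cos(theta) * temp) / (
        L_HALF * (4.0 / 3.0 - M_POLE * math.cos(theta) ** 2 / total_m))
    x_acc = temp - M_POLE * L_HALF * theta_acc * math.cos(theta) / total_m
    return (x + DT * x_dot, x_dot + DT * x_acc,
            theta + DT * theta_dot, theta_dot + DT * theta_acc)

# With no control force, a small initial tilt grows and the pole falls over;
# a controller (here, the cerebellar simulation) must supply corrective forces.
state = (0.0, 0.0, 0.05, 0.0)
for _ in range(100):
    state = step(*state, force=0.0)
print(f"angle after 2 s with no control: {state[2]:.3f} rad")
```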

  13. Fuzzy-TLX: using fuzzy integrals for evaluating human mental workload with NASA-Task Load indeX in laboratory and field studies.

    PubMed

    Mouzé-Amady, Marc; Raufaste, Eric; Prade, Henri; Meyer, Jean-Pierre

    2013-01-01

    The aim of this study was to assess mental workload in situations where various load sources must be integrated to derive reliable workload estimates. We report a new algorithm for computing weights from qualitative fuzzy integrals and apply it to the National Aeronautics and Space Administration Task Load Index (NASA-TLX) subscales in order to replace the standard pair-wise weighting technique (PWT). Two empirical studies are reported: (1) in a laboratory experiment, age- and task-related variables were investigated in 53 male volunteers, and (2) in a field study, task- and job-related variables were studied on aircrews during 48 commercial flights. The results were as follows: (i) in the experimental setting, fuzzy estimates were highly correlated with classical (PWT-based) estimates; (ii) in real work conditions, replacing PWT by automated fuzzy treatments simplified the NASA-TLX completion; (iii) the algorithm for computing fuzzy estimates provides a new classification procedure sensitive to various variables of work environments; and (iv) subjective and objective measures can be used for the fuzzy aggregation of NASA-TLX subscales. NASA-TLX, a classical tool for mental workload assessment, is based on a weighted sum of ratings from six subscales. A new algorithm, which impacts input data collection and computes weights and indexes from qualitative fuzzy integrals, is evaluated through laboratory and field studies. Pros and cons are discussed.
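    For reference, the standard NASA-TLX computation that the fuzzy-integral algorithm replaces combines six 0-100 subscale ratings with weights from 15 pairwise comparisons. A minimal sketch with hypothetical ratings (the subscale names are standard; the numbers are illustrative):

```python
from itertools import combinations

SUBSCALES = ["Mental", "Physical", "Temporal", "Performance", "Effort", "Frustration"]

def tlx_weighted_score(ratings, pair_choices):
    """Standard NASA-TLX: each subscale's weight is the number of times it was
    judged the larger load source across the 15 pairwise comparisons (PWT)."""
    counts = {s: 0 for s in SUBSCALES}
    for winner in pair_choices:
        counts[winner] += 1
    return sum(ratings[s] * counts[s] for s in SUBSCALES) / 15.0

# Hypothetical ratings on the 0-100 subscales
ratings = {"Mental": 70, "Physical": 20, "Temporal": 55,
           "Performance": 40, "Effort": 60, "Frustration": 30}
# One hypothetical choice per pair of subscales (15 pairs in total); here the
# subject simply picks whichever subscale they rated higher
pair_choices = [max(a, b, key=ratings.get) for a, b in combinations(SUBSCALES, 2)]
print(round(tlx_weighted_score(ratings, pair_choices), 2))
```

    The fuzzy-integral variant replaces the 15-comparison weighting step, which is the most burdensome part of administering the instrument in the field.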

  14. Fabrication Method for Laboratory-Scale High-Performance Membrane Electrode Assemblies for Fuel Cells.

    PubMed

    Sassin, Megan B; Garsany, Yannick; Gould, Benjamin D; Swider-Lyons, Karen E

    2017-01-03

    Custom catalyst-coated membranes (CCMs) and membrane electrode assemblies (MEAs) are necessary for the evaluation of advanced electrocatalysts, gas diffusion media (GDM), ionomers, polymer electrolyte membranes (PEMs), and electrode structures designed for use in next-generation fuel cells, electrolyzers, or flow batteries. This Feature provides a reliable and reproducible fabrication protocol for laboratory-scale (10 cm²) fuel cells based on ultrasonic spray deposition of a standard Pt/carbon electrocatalyst directly onto a perfluorosulfonic acid PEM.

  15. Task-Induced Development of Hinting Behaviors in Online Task-Oriented L2 Interaction

    ERIC Educational Resources Information Center

    Balaman, Ufuk

    2018-01-01

    Technology-mediated task settings are rich interactional domains in which second language (L2) learners manage a multitude of interactional resources for task accomplishment. The affordances of these settings have been repeatedly addressed in computer-assisted language learning (CALL) literature mainly based on theory-informed task design…

  16. Chaos-chaos transition of left hemisphere EEGs during standard tasks of Waterloo-Stanford Group Scale of hypnotic susceptibility.

    PubMed

    Yargholi, Elahe'; Nasrabadi, Ali Motie

    2015-01-01

    A recent study, "Recurrence quantification analysis of EEG signals during standard tasks of the Waterloo-Stanford Group Scale of hypnotic susceptibility", investigated recurrence quantifiers (RQs) of hypnotic electroencephalograph (EEG) signals recorded after hypnotic induction while subjects performed the standard tasks of the Waterloo-Stanford Group Scale (WSGS) of hypnotic susceptibility, in order to distinguish subjects of different hypnotizability levels. Following the same analysis, the current study determines the capability of different RQs to distinguish subjects of low, medium and high hypnotizability and studies the influence of hypnotizability level on the underlying dynamics of the tasks. In addition, EEG channels were sorted according to the number of their RQs that differed significantly among subjects of different hypnotizability levels. Another valuable result was the determination of the major brain regions showing significant differences across task types (ideomotor, hallucination, challenge and memory).

  17. Potential for improved radiation thermometry measurement uncertainty through implementing a primary scale in an industrial laboratory

    NASA Astrophysics Data System (ADS)

    Willmott, Jon R.; Lowe, David; Broughton, Mick; White, Ben S.; Machin, Graham

    2016-09-01

    A primary temperature scale requires realising a unit in terms of its definition. For high temperature radiation thermometry in terms of the International Temperature Scale of 1990 this means extrapolating from the signal measured at the freezing temperature of gold, silver or copper using Planck’s radiation law. The difficulty in doing this means that primary scales above 1000 °C require specialist equipment and careful characterisation in order to achieve the extrapolation with sufficient accuracy. As such, maintenance of the scale at high temperatures is usually only practicable for National Metrology Institutes, and calibration laboratories have to rely on a scale calibrated against transfer standards. At lower temperatures it is practicable for an industrial calibration laboratory to have its own primary temperature scale, which reduces the number of steps between the primary scale and end user. Proposed changes to the SI that will introduce internationally accepted high temperature reference standards might make it practicable to have a primary high temperature scale in a calibration laboratory. In this study such a scale was established by calibrating radiation thermometers directly to high temperature reference standards. The possible reduction in uncertainty to an end user as a result of the reduced calibration chain was evaluated.
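    The extrapolation step described above can be sketched with Planck's law: at a single operating wavelength, the ratio of spectral radiances ties an unknown temperature to a fixed-point temperature. A minimal sketch, assuming a 650 nm operating wavelength and the ITS-90 copper freezing point (function names and the example temperature are illustrative, not from the paper):

```python
import math

C2 = 0.014388    # second radiation constant c2, m*K
T_CU = 1357.77   # copper freezing point, K (an ITS-90 fixed point)

def planck_ratio(T, T_ref, wavelength):
    """Ratio of spectral radiances L(T)/L(T_ref) from Planck's law at one
    wavelength -- the extrapolation underlying a primary radiation scale."""
    return (math.expm1(C2 / (wavelength * T_ref)) /
            math.expm1(C2 / (wavelength * T)))

def temperature_from_ratio(ratio, T_ref, wavelength):
    """Invert the radiance ratio to recover the unknown temperature."""
    return C2 / (wavelength * math.log1p(
        math.expm1(C2 / (wavelength * T_ref)) / ratio))

# Example at 650 nm: extrapolate from the copper point up to 2000 K
wl = 650e-9
r = planck_ratio(2000.0, T_CU, wl)
print(round(temperature_from_ratio(r, T_CU, wl), 2))
```

    The difficulty the abstract alludes to is not this arithmetic but characterizing the real thermometer (nonlinearity, spectral bandwidth, size-of-source effect) well enough that the ideal single-wavelength model holds.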

  18. Identifying and individuating cognitive systems: a task-based distributed cognition alternative to agent-based extended cognition.

    PubMed

    Davies, Jim; Michaelian, Kourken

    2016-08-01

    This article argues for a task-based approach to identifying and individuating cognitive systems. The agent-based extended cognition approach faces a problem of cognitive bloat and has difficulty accommodating both sub-individual cognitive systems ("scaling down") and some supra-individual cognitive systems ("scaling up"). The standard distributed cognition approach can accommodate a wider variety of supra-individual systems but likewise has difficulties with sub-individual systems and faces the problem of cognitive bloat. We develop a task-based variant of distributed cognition designed to scale up and down smoothly while providing a principled means of avoiding cognitive bloat. The advantages of the task-based approach are illustrated by means of two parallel case studies: re-representation in the human visual system and in a biomedical engineering laboratory.

  19. TASK 2: QUENCH ZONE SIMULATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fusselman, Steve

    Aerojet Rocketdyne (AR) has developed an innovative gasifier concept incorporating advanced technologies in an ultra-dense-phase dry feed system, a rapid mix injector, and advanced component cooling to significantly improve gasifier performance, life, and cost compared to commercially available state-of-the-art systems. A key feature of the AR gasifier design is the transition from the gasifier outlet into the quench zone, where the raw syngas is cooled to ~400°C by injection and vaporization of atomized water. Earlier pilot plant testing revealed a propensity for the original gasifier outlet design to accumulate slag in the outlet, leading to erratic syngas flow from the outlet. Subsequent design modifications successfully resolved this issue in the pilot plant gasifier. In order to gain greater insight into the physical phenomena occurring within this zone, AR worked with Coanda Research & Development to develop a cold flow simulation apparatus with a high degree of similitude to hot-fire conditions in the pilot-scale gasifier design, capable of accommodating a scaled-down quench zone for a demonstration-scale gasifier. The objective of this task was to validate the similitude of the cold flow simulation model by comparison with pilot-scale outlet design performance, and to assess demonstration-scale gasifier design feasibility from testing of a scaled-down outlet design. Test results exhibited a strong correspondence with the two pilot-scale outlet designs, indicating credible similitude for the cold flow simulation device. Testing of the scaled-down outlet revealed important considerations in the design and operation of the demonstration-scale gasifier, in particular pertaining to the relative momentum between the downcoming raw syngas and the sprayed quench water and the associated impacts on flow patterns within the quench zone. This report describes key findings from the test program, including assessment of pilot plant configuration simulations relative to

  20. A surgical skills laboratory improves residents' knowledge and performance of episiotomy repair.

    PubMed

    Banks, Erika; Pardanani, Setul; King, Mary; Chudnoff, Scott; Damus, Karla; Freda, Margaret Comerford

    2006-11-01

    This study was undertaken to assess whether a surgical skills laboratory improves residents' knowledge and performance of episiotomy repair. Twenty-four first- and second-year residents were randomly assigned to either a surgical skills laboratory on episiotomy repair or traditional teaching alone. Pre- and posttests assessed basic knowledge. Blinded attending physicians assessed performance, evaluating residents on second-degree laceration/episiotomy repairs in the clinical setting with 3 validated tools: a task-specific checklist, global rating scale, and a pass-fail grade. Postgraduate year 1 (PGY-1) residents participating in the laboratory scored significantly better on all 3 surgical assessment tools: the checklist, the global score, and the pass/fail analysis. All the residents who had the teaching laboratory demonstrated significant improvements on knowledge and the skills checklist. PGY-2 residents did not benefit as much as PGY-1 residents. A surgical skills laboratory improved residents' knowledge and performance in the clinical setting. Improvement was greatest for PGY-1 residents.

  1. R² dark energy in the laboratory

    NASA Astrophysics Data System (ADS)

    Brax, Philippe; Valageas, Patrick; Vanhove, Pierre

    2018-05-01

    We analyze the role, on large cosmological scales and in laboratory experiments, of the leading curvature-squared contributions to the low-energy effective action of gravity. We argue for a natural relationship c₀λ² ≃ 1 at low energy between the coefficient c₀ of the Ricci scalar squared (R²) term in this expansion and the dark energy scale Λ = (λ M_Pl)⁴ in four-dimensional Planck mass units. We show how the compatibility between the acceleration of the expansion rate of the Universe, local tests of gravity, and the quantum stability of the model all converge to select such a relationship, up to a coefficient which should be determined experimentally. When embedding this low-energy theory of gravity into candidates for its ultraviolet completion, we find that the proposed relationship is guaranteed in string-inspired supergravity models with modulus stabilization and supersymmetry breaking leading to de Sitter compactifications. In this case, the scalar degree of freedom of R² gravity is associated to a volume modulus. Once written in terms of a scalar-tensor theory, the effective theory corresponds to a massive scalar field coupled with the universal strength β = 1/√6 to the matter stress-energy tensor. When the relationship c₀λ² ≃ 1 is realized, we find that on astrophysical scales and in cosmology the scalar field is ultralocal and therefore no effect arises on such large scales. On the other hand, the scalar field mass is tightly constrained by the nonobservation of fifth forces in torsion pendulum experiments such as Eöt-Wash. It turns out that the observation of the dark energy scale in cosmology implies that the scalar field could be detectable by fifth-force experiments in the near future.

  2. UV/TiO2 photocatalytic disinfection of carbon-bacteria complexes in activated carbon-filtered water: Laboratory and pilot-scale investigation.

    PubMed

    Zhao, Jin Hui; Chen, Wei; Zhao, Yaqian; Liu, Cuiyun; Liu, Ranbin

    2015-01-01

    The occurrence of carbon-bacteria complexes in activated carbon-filtered water poses a public health problem regarding the biological safety of drinking water. The application of a combined process of ultraviolet radiation and nanostructured titanium dioxide (UV/TiO2) photocatalysis for the disinfection of carbon-bacteria complexes was assessed in this study. Results showed that a 1.07 log disinfection rate can be achieved using a UV dose of 20 mJ cm⁻², while the optimal UV intensity was 0.01 mW cm⁻². Particle sizes ≥8 μm decreased the disinfection efficiency, whereas variation in particle number in activated carbon-filtered water did not significantly affect it. The photoreactivation ratio was reduced from 12.07% to 1.69% when the UV dose was increased from 5 mJ cm⁻² to 20 mJ cm⁻². Laboratory and on-site pilot-scale experiments demonstrated that UV/TiO2 photocatalytic disinfection technology is capable of controlling the risk posed by carbon-bacteria complexes and securing drinking water safety.
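    The disinfection performance above is expressed in base-10 log units. A minimal sketch of the arithmetic (the photoreactivation formula shown is one common definition and may differ from the one used in the paper):

```python
import math

def log_reduction(n0, n):
    """Disinfection performance as a base-10 log reduction:
    1 log = 90% inactivation, 2 log = 99%, and so on."""
    return math.log10(n0 / n)

def photoreactivation_ratio(n_reactivated, n0, n_surviving):
    """Percentage of inactivated organisms that repair themselves under
    visible light (one common definition, shown for illustration)."""
    return (n_reactivated - n_surviving) / (n0 - n_surviving) * 100

# The reported 1.07 log reduction corresponds to ~91.5% inactivation
print(round((1 - 10 ** -1.07) * 100, 1))
```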

  3. Comparison of NASA-TLX scale, Modified Cooper-Harper scale and mean inter-beat interval as measures of pilot mental workload during simulated flight tasks.

    PubMed

    Mansikka, Heikki; Virtanen, Kai; Harris, Don

    2018-04-30

    The sensitivity of the NASA-TLX scale, the modified Cooper-Harper (MCH) scale and the mean inter-beat interval (IBI) of successive heart beats, as measures of pilot mental workload (MWL), was evaluated in a flight training device (FTD). Operational F/A-18C pilots flew instrument approaches with varying task loads. Pilots' performance, subjective MWL ratings and IBI were measured. Based on the pilots' performance, three performance categories were formed: high, medium and low performance. Values of the subjective rating scales and IBI were compared between categories. It was found that all measures were able to differentiate most task conditions, and there was a strong, positive correlation between the NASA-TLX and MCH scales. An explicit link between IBI, NASA-TLX, MCH and performance was demonstrated. While NASA-TLX, MCH and IBI have all been previously used to measure MWL, this study is the first to investigate their association in a modern FTD, using a realistic flying mission and operational pilots.

  4. Moving out of the laboratory: does nicotine improve everyday attention?

    PubMed

    Rusted, J M; Caulfield, D; King, L; Goode, A

    2000-11-01

    The most robust demonstrations of the nicotine-related performance effects on human cognitive processes are seen in tasks that measure attention. If nicotine does have some potential for enhancing attention, the obvious question to ask is whether the effects demonstrated in the laboratory hold any significance for real-life performance. This paper describes three studies that compare the effects in smokers of a single own brand cigarette on laboratory tests of attention and on everyday analogues of these laboratory tasks. In the laboratory measures of sustained attention and in the everyday analogue, performance advantages were registered in the smoking condition. These benefits were observed in smokers who abstained for a self-determined period of not less than 2 h. The studies were unable to replicate previous research reporting positive effects of smoking on a laboratory task of selective attention, the Stroop task. Small but significant improvements in performance were registered in the everyday analogues, which involved sustaining attention in a dual task situation, a telephone directory search task and a map search task. In addition, smokers showed a significant colour-naming decrement for smoking-related stimuli in the Stroop task. This attentional bias towards smoking-related words occurred independent of whether they had abstained or recently smoked an own brand cigarette. The effect is discussed in terms of the two-component model of processing bias for emotionally valenced stimuli.

  5. Laboratory-Scale Internal Wave Apparatus for Studying Copepod Behavior

    NASA Astrophysics Data System (ADS)

    Jung, S.; Webster, D. R.; Haas, K. A.; Yen, J.

    2016-02-01

    Internal waves are ubiquitous features of coastal marine environments and have been observed to mediate vertical distributions of zooplankton in situ. Internal waves create fine-scale hydrodynamic cues that copepods and other zooplankton are known to sense, such as fluid density gradients and velocity gradients (quantified as shear deformation rate). The role of copepod behavior in response to cues associated with internal waves is largely unknown. The objective is to provide insight into the bio-physical interaction and the role of biological versus physical forcing in mediating organism distributions. We constructed a laboratory-scale internal wave apparatus to facilitate fine-scale observations of copepod behavior in flows that replicate in situ conditions of internal waves in a two-layer stratification. Two cases were chosen, with density jumps of 1 and 1.5 sigma-t units. Analytical treatment of the two-layer system provided guidance on the target forcing frequency needed to generate a standing internal wave with a single dominant frequency of oscillation. Flow visualization and signal processing of the interface location were used to quantify the wave characteristics; the results show a close match to the target wave parameters. Marine copepod (mixed population of Acartia tonsa, Temora longicornis, and Eurytemora affinis) behavior assays were conducted for three different physical arrangements: (1) no density stratification, (2) stagnant two-layer density stratification, and (3) two-layer density stratification with internal wave motion. Digitized trajectories of copepod swimming behavior indicate that in the control (case 1) the animals showed no preferential direction of motion. In the stagnant density jump treatment (case 2) copepods preferentially moved horizontally, parallel to the density interface. In the internal wave treatment (case 3) copepods demonstrated orbital trajectories near the density interface.
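
    The analytical guidance mentioned above presumably rests on the standard linear dispersion relation for interfacial waves in a two-layer fluid. The sketch below implements that textbook relation as an assumption about the kind of analysis used; the layer depths and densities in the example are illustrative, not the experiment's actual values:

```python
import math

def interfacial_wave_frequency(k, h1, h2, rho1, rho2, g=9.81):
    """Angular frequency (rad/s) of a linear interfacial wave with
    wavenumber k (rad/m) at the interface of a two-layer fluid with
    layer depths h1, h2 (m) and densities rho1 < rho2 (kg/m³)."""
    numerator = g * k * (rho2 - rho1)
    denominator = rho1 / math.tanh(k * h1) + rho2 / math.tanh(k * h2)
    return math.sqrt(numerator / denominator)

# A 1.5 sigma-t jump (1.5 kg/m³) gives a higher natural frequency than a
# 1 sigma-t jump for the same wavenumber and layer depths.
```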

  6. Laboratory safety aspects of SARS at Biosafety Level 2.

    PubMed

    Barkham, T M S

    2004-03-01

    The severe acute respiratory syndrome (SARS)-associated coronavirus causes severe disease, is transmissible in the community and has no effective prophylaxis or treatment, perhaps fulfilling the criteria for biohazard group 3 or 4. The recommendation to use Biosafety Level (BSL) 3 practices within a BSL2 environment appears to have been a practical decision based on available resources; most diagnostic laboratories operate at BSL2. Safety is achieved with controls in administration, engineering, and personal protective equipment and behaviour. At the heart of every safety policy is a risk assessment based on the exact manipulations employed. Excessive administrative and engineering controls are less important than the training and personal attitudes, abilities and understanding of the staff. The SARS outbreak focused attention on the safety aspects of common mundane tasks, such as decapping blood tubes. Laboratories often claim they follow certain practices, but casual observation does not always support these claims. Guidelines differed and created uncertainty, which was stressful for the laboratory staff held accountable for their implementation. Attempts to categorise risks and their management into neatly wrapped parcels are attractive, but closer inspection reveals a subjective element that allows doubt to creep in with varying interpretations of the literature. Staff most at risk were those handling respiratory samples. Staff receiving samples via pneumatic tubes had the least control over their exposure and were potentially exposed to aerosols from leaking samples. Risk assessment remains a balance between cost and benefit.

  7. Symbiotic Sensing for Energy-Intensive Tasks in Large-Scale Mobile Sensing Applications.

    PubMed

    Le, Duc V; Nguyen, Thuong; Scholten, Hans; Havinga, Paul J M

    2017-11-29

    Energy consumption is a critical performance and user experience metric when developing mobile sensing applications, especially with the significantly growing number of sensing applications in recent years. As proposed a decade ago when mobile applications were still not popular and most mobile operating systems were single-tasking, conventional sensing paradigms such as opportunistic sensing and participatory sensing do not explore the relationship among concurrent applications for energy-intensive tasks. In this paper, inspired by social relationships among living creatures in nature, we propose a symbiotic sensing paradigm that can conserve energy, while maintaining equivalent performance to existing paradigms. The key idea is that sensing applications should cooperatively perform common tasks to avoid acquiring the same resources multiple times. By doing so, this sensing paradigm executes sensing tasks with very little extra resource consumption and, consequently, extends battery life. To evaluate and compare the symbiotic sensing paradigm with the existing ones, we develop mathematical models in terms of the completion probability and estimated energy consumption. The quantitative evaluation results using various parameters obtained from real datasets indicate that symbiotic sensing performs better than opportunistic sensing and participatory sensing in large-scale sensing applications, such as road condition monitoring, air pollution monitoring, and city noise monitoring.
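
    The energy argument above (one shared acquisition instead of each application paying the full cost) can be captured with a toy model. This is not the paper's mathematical formulation, only an illustration with hypothetical parameters:

```python
def conventional_energy(n_apps, e_acquire):
    """Each application independently performs the energy-intensive task."""
    return n_apps * e_acquire

def symbiotic_energy(n_apps, e_acquire, e_share):
    """One shared acquisition plus a small per-app cost to share the result."""
    return e_acquire + n_apps * e_share

# With 5 apps, a 100 mJ acquisition and a 2 mJ sharing cost, the shared
# approach uses 110 mJ instead of 500 mJ.
```

The saving grows with the number of concurrent applications, which matches the abstract's claim that the benefit appears in large-scale sensing applications.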

  9. Anaerobic treatment of animal byproducts from slaughterhouses at laboratory and pilot scale.

    PubMed

    Edström, Mats; Nordberg, Ake; Thyselius, Lennart

    2003-01-01

    Different mixtures of animal byproducts, other slaughterhouse waste (i.e., rumen, stomach and intestinal content), food waste, and liquid manure were codigested under mesophilic conditions (37 °C) at laboratory and pilot scale. Animal byproducts, including blood, represent 70-80% of the total biogas potential from waste generated during slaughter of animals. The total biogas potential from waste generated during slaughter is about 1300 MJ per head of cattle and about 140 MJ per pig. Fed-batch digestion of pasteurized (70 °C, 1 h) animal byproducts resulted in a fourfold increase in biogas yield (1.14 L/g of volatile solids [VS]) compared with nonpasteurized animal byproducts (0.31 L/g of VS). Mixtures with animal byproducts representing 19-38% of the total dry matter were digested in continuous-flow stirred tank reactors at laboratory and pilot scale. Stable processes at organic loading rates (OLRs) exceeding 2.5 g of VS/(L·d) and hydraulic retention times (HRTs) less than 40 d could be obtained with total ammonia nitrogen concentrations (NH4-N + NH3-N) in the range of 4.0-5.0 g/L. After operating one process for more than 1.5 yr at total ammonia nitrogen concentrations >4 g/L, an increase in OLR to 5 g of VS/(L·d) and a decrease in HRT to 22 d were possible without accumulation of volatile fatty acids.
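
    The OLR and HRT figures quoted above follow from two standard reactor definitions; this sketch uses made-up reactor numbers purely for illustration:

```python
def organic_loading_rate(vs_feed_g_per_day, reactor_volume_l):
    """Organic loading rate in g VS/(L·d): daily VS feed over reactor volume."""
    return vs_feed_g_per_day / reactor_volume_l

def hydraulic_retention_time(reactor_volume_l, feed_flow_l_per_day):
    """Hydraulic retention time in days: reactor volume over daily feed flow."""
    return reactor_volume_l / feed_flow_l_per_day

# e.g. a hypothetical 100 L digester fed 250 g VS/day runs at an OLR of
# 2.5 g VS/(L·d); fed 100/22 L/day, it has a 22 d HRT as in the abstract.
```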

  10. The Children's Social Understanding Scale: construction and validation of a parent-report measure for assessing individual differences in children's theories of mind.

    PubMed

    Tahiroglu, Deniz; Moses, Louis J; Carlson, Stephanie M; Mahy, Caitlin E V; Olofson, Eric L; Sabbagh, Mark A

    2014-11-01

    Children's theory of mind (ToM) is typically measured with laboratory assessments of performance. Although these measures have generated a wealth of informative data concerning developmental progressions in ToM, they may be less useful as the sole source of information about individual differences in ToM and their relation to other facets of development. In the current research, we aimed to expand the repertoire of methods available for measuring ToM by developing and validating a parent-report ToM measure: the Children's Social Understanding Scale (CSUS). We present 3 studies assessing the psychometric properties of the CSUS. Study 1 describes item analysis, internal consistency, test-retest reliability, and relation of the scale to children's performance on laboratory ToM tasks. Study 2 presents cross-validation data for the scale in a different sample of preschool children with a different set of ToM tasks. Study 3 presents further validation data for the scale with a slightly older age group and a more advanced ToM task, while controlling for several other relevant cognitive abilities. The findings indicate that the CSUS is a reliable and valid measure of individual differences in children's ToM that may be of great value as a complement to standard ToM tasks in many different research contexts.

  11. Laboratory formation of a scaled protostellar jet by coaligned poloidal magnetic field.

    PubMed

    Albertazzi, B; Ciardi, A; Nakatsutsumi, M; Vinci, T; Béard, J; Bonito, R; Billette, J; Borghesi, M; Burkley, Z; Chen, S N; Cowan, T E; Herrmannsdörfer, T; Higginson, D P; Kroll, F; Pikuz, S A; Naughton, K; Romagnani, L; Riconda, C; Revet, G; Riquier, R; Schlenvoigt, H-P; Skobelev, I Yu; Faenov, A Ya; Soloviev, A; Huarte-Espinosa, M; Frank, A; Portugall, O; Pépin, H; Fuchs, J

    2014-10-17

    Although bipolar jets are seen emerging from a wide variety of astrophysical systems, the issue of their formation and morphology beyond their launching is still under study. Our scaled laboratory experiments, representative of young stellar object outflows, reveal that stable and narrow collimation of the entire flow can result from the presence of a poloidal magnetic field whose strength is consistent with observations. The laboratory plasma becomes focused with an interior cavity. This gives rise to a standing conical shock from which the jet emerges. Following simulations of the process at the full astrophysical scale, we conclude that it can also explain recently discovered x-ray emission features observed in low-density regions at the base of protostellar jets, such as the well-studied jet HH 154.

  12. A Connectionist Model of a Continuous Developmental Transition in the Balance Scale Task

    ERIC Educational Resources Information Center

    Schapiro, Anna C.; McClelland, James L.

    2009-01-01

    A connectionist model of the balance scale task is presented which exhibits developmental transitions between "Rule I" and "Rule II" behavior [Siegler, R. S. (1976). Three aspects of cognitive development. "Cognitive Psychology," 8, 481-520.] as well as the "catastrophe flags" seen in data from Jansen and van der Maas [Jansen, B. R. J., & van der…

  13. ExM:System Support for Extreme-Scale, Many-Task Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katz, Daniel S

    The ever-increasing power of supercomputer systems is both driving and enabling the emergence of new problem-solving methods that require the efficient execution of many concurrent and interacting tasks. Methodologies such as rational design (e.g., in materials science), uncertainty quantification (e.g., in engineering), parameter estimation (e.g., for chemical and nuclear potential functions, and in economic energy systems modeling), massive dynamic graph pruning (e.g., in phylogenetic searches), Monte-Carlo-based iterative fixing (e.g., in protein structure prediction), and inverse modeling (e.g., in reservoir simulation) all have these requirements. These many-task applications frequently have aggregate computing needs that demand the fastest computers. For example, proposed next-generation climate model ensemble studies will involve 1,000 or more runs, each requiring 10,000 cores for a week, to characterize model sensitivity to initial condition and parameter uncertainty. The goal of the ExM project is to achieve the technical advances required to execute such many-task applications efficiently, reliably, and easily on petascale and exascale computers. In this way, we will open up extreme-scale computing to new problem-solving methods and application classes. In this document, we report on the combined technical progress of the collaborative ExM project, and the institutional financial status of the portion of the project at the University of Chicago, over the first 8 months (through April 30, 2011).

  14. Bridge-in-a-Backpack(TM). Task 2.1 and 2.2 : investigate alternative shapes with varying radii.

    DOT National Transportation Integrated Search

    2015-02-01

    This report includes fulfillment of Tasks 2.1 and 2.2 of a multi-task contract to further enhance concrete filled FRP : tubes, or the Bridge in a Backpack. Task 2 is an investigation of alternative shapes for the FRP tubes with varying : radii. Task ...

  15. FLARE: A New User Facility for Laboratory Studies of Multiple-Scale Physics of Magnetic Reconnection and Related Phenomena in Heliophysics and Astrophysics

    NASA Astrophysics Data System (ADS)

    Ji, H.; Bhattacharjee, A.; Goodman, A.; Prager, S.; Daughton, W.; Cutler, R.; Fox, W.; Hoffmann, F.; Kalish, M.; Kozub, T.; Jara-Almonte, J.; Myers, C.; Ren, Y.; Sloboda, P.; Yamada, M.; Yoo, J.; Bale, S. D.; Carter, T.; Dorfman, S.; Drake, J.; Egedal, J.; Sarff, J.; Wallace, J.

    2017-10-01

    The FLARE device (Facility for Laboratory Reconnection Experiments; flare.pppl.gov) is a new laboratory experiment under construction at Princeton, with first plasmas expected in the fall of 2017, based on the design of the Magnetic Reconnection Experiment (MRX; mrx.pppl.gov) with much extended parameter ranges. Its main objective is to provide an experimental platform for studies of magnetic reconnection and related phenomena in the multiple X-line regimes directly relevant to space, solar, astrophysical and fusion plasmas. The main diagnostic is an extensive set of magnetic probe arrays, simultaneously covering multiple scales from local electron scales (~2 mm), to intermediate ion scales (~10 cm), and global MHD scales (~1 m). Specific example space physics topics which can be studied on FLARE will be discussed.

  16. Dual-Task Processing When Task 1 Is Hard and Task 2 Is Easy: Reversed Central Processing Order?

    ERIC Educational Resources Information Center

    Leonhard, Tanja; Fernandez, Susana Ruiz; Ulrich, Rolf; Miller, Jeff

    2011-01-01

    Five psychological refractory period (PRP) experiments were conducted with an especially time-consuming first task (Experiments 1, 3, and 5: mental rotation; Experiments 2 and 4: memory scanning) and with equal emphasis on the first task and on the second (left-right tone judgment). The standard design with varying stimulus onset asynchronies…

  17. The development of a model to predict the effects of worker and task factors on foot placements in manual material handling tasks.

    PubMed

    Wagner, David W; Reed, Matthew P; Chaffin, Don B

    2010-11-01

    Accurate prediction of foot placements in relation to hand locations during manual materials handling tasks is critical for prospective biomechanical analysis. To address this need, the effects of lifting task conditions and anthropometric variables on foot placements were studied in a laboratory experiment. In total, 20 men and women performed two-handed object transfers that required them to walk to a shelf, lift an object from the shelf at waist height and carry the object to a variety of locations. Five different changes in the direction of progression following the object pickup were used, ranging from 45° to 180° relative to the approach direction. Object weights of 1.0 kg, 4.5 kg and 13.6 kg were used. Whole-body motions were recorded using a 3-D optical retro-reflective marker-based camera system. A new parametric system for describing foot placements, the Quantitative Transition Classification System, was developed to facilitate the parameterisation of foot placement data. Foot placements chosen by the subjects during the transfer tasks appeared to facilitate a change in the whole-body direction of progression, in addition to aiding in performing the lift. Further analysis revealed that five different stepping behaviours accounted for 71% of the stepping patterns observed. More specifically, the most frequently observed behaviour revealed that the orientation of the lead foot during the actual lifting task was primarily affected by the amount of turn angle required after the lift (R² = 0.53). One surprising result was that the object mass (scaled by participant body mass) was not found to significantly affect any of the individual step placement parameters. Regression models were developed to predict the most prevalent step placements and are included in this paper to facilitate more accurate human motion simulations and ergonomics analyses of manual material lifting tasks. STATEMENT OF RELEVANCE: This study proposes a method for parameterising the steps

  18. Improving laboratory efficiencies to scale-up HIV viral load testing.

    PubMed

    Alemnji, George; Onyebujoh, Philip; Nkengasong, John N

    2017-03-01

    Viral load measurement is a key indicator that determines patients' response to treatment and risk for disease progression. Efforts are ongoing in different countries to scale up access to viral load testing to meet the Joint United Nations Programme on HIV and AIDS target of achieving 90% viral suppression among HIV-infected patients receiving antiretroviral therapy. However, the impact of these initiatives may be challenged by increased inefficiencies along the viral load testing spectrum. This will translate to increased costs and ineffectiveness of scale-up approaches. This review describes different parameters that could be addressed across the viral load testing spectrum aimed at improving efficiencies and utilizing test results for patient management. Though progress is being made in some countries to scale up viral load testing, many others still face numerous challenges that may affect scale-up efficiencies: weak demand creation; ineffective supply chain management systems; poor specimen referral systems; inadequate data and quality management systems; and a weak laboratory-clinical interface leading to diminished uptake of test results. In scaling up access to viral load testing, there should be a renewed focus on addressing efficiencies across the entire spectrum, including factors related to access, uptake, and impact of test results.

  19. Toward Developing Laboratory-Based Parent-Adolescent Conflict Discussion Tasks that Consistently Elicit Adolescent Conflict-Related Stress Responses: Support from Physiology and Observed Behavior.

    PubMed

    Thomas, Sarah A; Wilson, Tristan; Jain, Anjali; Deros, Danielle E; Um, Miji; Hurwitz, Joanna; Jacobs, Irene; Myerberg, Lindsay; Ehrlich, Katherine B; Dunn, Emily J; Aldao, Amelia; Stadnik, Ryan; De Los Reyes, Andres

    2017-12-01

    Parent-adolescent conflict poses risk for youth maladjustment. One potential mechanism of this risk is that stress in the form of increased arousal during conflict interactions results in adolescents' impaired decision-making. However, eliciting consistent adolescent stress responses within laboratory-based tasks of parent-adolescent conflict (i.e., conflict discussion tasks) is hindered by task design. This limitation may stem from how conflict topics are assessed and selected for discussion. Within a sample of 47 adolescents (ages 14-17) and parents, we investigated whether a modified version of a conflict discussion task could elicit physiological (i.e., arousal) and behavioral (i.e., hostility) displays of adolescents' conflict-related stress responses. We assessed parent-adolescent conflict via structured interview to identify topics for dyads to discuss during the task. We randomly assigned dyads to complete a 5-minute task to discuss either a putatively benign topic (i.e., control condition) or a conflict topic while undergoing direct assessments of continuous arousal. Trained raters coded dyad members' hostile behavior during the task. Adolescents in the conflict condition exhibited significantly greater levels of arousal than adolescents in the control condition. We observed an interaction between discussion condition and baseline conflict. Specifically, higher baseline conflict predicted greater hostile behavior for adolescents in the conflict condition, yet we observed the inverse relation for adolescents in the control condition. Our modified laboratory discussion task successfully elicited both physiological and behavioral displays of adolescent conflict-related stress. These findings have important implications for leveraging experimental paradigms to understand causal links between parent-adolescent conflict and adolescent psychopathology, and their underlying mechanisms.

  20. DeepSkeleton: Learning Multi-Task Scale-Associated Deep Side Outputs for Object Skeleton Extraction in Natural Images

    NASA Astrophysics Data System (ADS)

    Shen, Wei; Zhao, Kai; Jiang, Yuan; Wang, Yan; Bai, Xiang; Yuille, Alan

    2017-11-01

    Object skeletons are useful for object representation and object detection. They are complementary to the object contour, and provide extra information, such as how object scale (thickness) varies among object parts. But object skeleton extraction from natural images is very challenging, because it requires the extractor to be able to capture both local and non-local image context in order to determine the scale of each skeleton pixel. In this paper, we present a novel fully convolutional network with multiple scale-associated side outputs to address this problem. By observing the relationship between the receptive field sizes of the different layers in the network and the skeleton scales they can capture, we introduce two scale-associated side outputs to each stage of the network. The network is trained by multi-task learning, where one task is skeleton localization to classify whether a pixel is a skeleton pixel or not, and the other is skeleton scale prediction to regress the scale of each skeleton pixel. Supervision is imposed at different stages by guiding the scale-associated side outputs toward the ground-truth skeletons at the appropriate scales. The responses of the multiple scale-associated side outputs are then fused in a scale-specific way to detect skeleton pixels using multiple scales effectively. Our method achieves promising results on two skeleton extraction datasets, and significantly outperforms other competitors. Additionally, the usefulness of the obtained skeletons and scales (thickness) is verified on two object detection applications: foreground object segmentation and object proposal detection.
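
    The two supervised tasks described above (per-pixel skeleton classification and per-pixel scale regression) can be combined into one training objective. The sketch below is a generic multi-task loss in the same spirit, not the paper's exact formulation; the weighting parameter `lam` and the list-based inputs are illustrative assumptions:

```python
import math

def multitask_skeleton_loss(p_skel, y_skel, s_pred, s_true, lam=1.0):
    """Binary cross-entropy for per-pixel skeleton localization, plus a
    scale-regression MSE evaluated only at ground-truth skeleton pixels
    (weighted by lam). Inputs are flat lists of per-pixel values."""
    eps = 1e-7  # guards against log(0)
    bce = -sum(y * math.log(p + eps) + (1 - y) * math.log(1 - p + eps)
               for p, y in zip(p_skel, y_skel)) / len(y_skel)
    skel = [(sp, st) for sp, st, y in zip(s_pred, s_true, y_skel) if y > 0]
    mse = sum((sp - st) ** 2 for sp, st in skel) / len(skel) if skel else 0.0
    return bce + lam * mse
```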

  1. EPOS Multi-Scale Laboratory platform: a long-term reference tool for experimental Earth Sciences

    NASA Astrophysics Data System (ADS)

    Trippanera, Daniele; Tesei, Telemaco; Funiciello, Francesca; Sagnotti, Leonardo; Scarlato, Piergiorgio; Rosenau, Matthias; Elger, Kirsten; Ulbricht, Damian; Lange, Otto; Calignano, Elisa; Spiers, Chris; Drury, Martin; Willingshofer, Ernst; Winkler, Aldo

    2017-04-01

    With the continuous progress of scientific research, large amounts of data have been and will be produced. Data access and sharing, along with storage and homogenization within a unique and coherent framework, is a new challenge for the whole scientific community. This is particularly true for geo-scientific laboratories, which encompass the most diverse Earth Science disciplines and types of data. To this aim, the "Multiscale Laboratories" Work Package (WP16), operating in the framework of the European Plate Observing System (EPOS), is developing a virtual platform of geo-scientific data and services for the worldwide community of laboratories. This long-term project aims to merge top-class multidisciplinary laboratories in Geoscience into a coherent and collaborative network, facilitating the standardization of virtual access to data, data products and software. This will help our community evolve beyond the stage in which most of the data produced by the different laboratories are available only within the related scholarly publications (often as print-version only) or remain unpublished and inaccessible on local devices. The EPOS multi-scale laboratory platform will provide the possibility to easily share and discover data by means of open-access, DOI-referenced, online data publication, including long-term storage, management and curation services, and to establish a cohesive community of laboratories. WP16 is starting with three pilot-case laboratories: (1) rock physics, (2) palaeomagnetism, and (3) analogue modelling. As a proof of concept, the first analogue modelling datasets have been published via GFZ Data Services (http://doidb.wdc-terra.org/search/public/ui?&sort=updated+desc&q=epos). The datasets include rock analogue material properties (e.g. friction data, rheology data, SEM imagery), as well as supplementary figures, images and movies from experiments on tectonic processes.
A metadata catalogue tailored to the specific communities

  2. Children's Understanding of Large-Scale Mapping Tasks: An Analysis of Talk, Drawings, and Gesture

    ERIC Educational Resources Information Center

    Kotsopoulos, Donna; Cordy, Michelle; Langemeyer, Melanie

    2015-01-01

    This research examined how children represent motion in large-scale mapping tasks that we referred to as "motion maps". The underlying mathematical content was transformational geometry. In total, 19 children, 8 to 10 years old, created motion maps and digitally captured them with accompanying verbal descriptions. Analysis of…

  3. Examining Infants' Cortisol Responses to Laboratory Tasks among Children Varying in Attachment Disorganization: Stress Reactivity or Return to Baseline?

    ERIC Educational Resources Information Center

    Bernard, Kristin; Dozier, Mary

    2010-01-01

    Cortisol is a hormone involved in mounting a stress response in humans. The evidence of stress reactivity among young children has been mixed, however. In the present study, the order of two laboratory tasks (i.e., Strange Situation and play) was counterbalanced, and home saliva samples were obtained. Saliva samples were also collected upon the…

  4. Formulation and development of tablets based on Ludipress and scale-up from laboratory to production scale.

    PubMed

    Heinz, R; Wolf, H; Schuchmann, H; End, L; Kolter, K

    2000-05-01

    In spite of the wealth of experience available in the pharmaceutical industry, tablet formulations are still largely developed on an empirical basis, and the scale-up from laboratory to production is a time-consuming and costly process. Using Ludipress greatly simplifies formulation development and the manufacturing process because only the active ingredient, Ludipress and a lubricant need to be mixed briefly before being compressed into tablets. The studies described here were designed to investigate the scale-up of Ludipress-based formulations from laboratory to production scale, and to predict changes in tablet properties due to changes in format, compaction pressure, and the use of different tablet presses. It was found that the tensile strength of tablets made of Ludipress increased linearly with compaction pressure up to 300 MPa. It was also independent of the geometry of the tablets (diameter, thickness, shape). It is therefore possible to give an equation with which the compaction pressure required to achieve a given hardness can be calculated for a given tablet form. The equation has to be modified slightly to convert from a single-punch press to a rotary tableting machine. Tablets produced in the rotary machine at the same pressure have a slightly higher tensile strength. The rate of increase in pressure, and therefore the throughput, has no effect on the tensile strength of Ludipress tablets. It is thought that a certain minimum dwell time is responsible for this difference. The production of tablets based on Ludipress can be scaled up from one rotary press to another without problems if the powder mixtures are prepared with the same mixing energy. The tensile strength curve determined for tablets made with Ludipress alone can also be applied to tablets with a small quantity (< 10%) of an active ingredient.
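
    A linear tensile-strength/pressure relation of the kind described above can be inverted to read off the compaction pressure needed for a target hardness. The slope and intercept below are hypothetical placeholders, not the paper's fitted values:

```python
def required_compaction_pressure(target_strength, slope, intercept=0.0):
    """Invert a linear tensile-strength/pressure fit,
    strength = slope * pressure + intercept, valid only up to ~300 MPa
    as reported in the abstract."""
    pressure = (target_strength - intercept) / slope
    if pressure > 300:
        raise ValueError("outside the reported linear range (<= 300 MPa)")
    return pressure

# With a hypothetical slope of 0.01 (MPa tensile strength per MPa pressure),
# a 2 MPa target strength needs roughly 200 MPa of compaction pressure.
```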

  5. Risky Decision Making in a Laboratory Driving Task Is Associated with Health Risk Behaviors during Late Adolescence but Not Adulthood

    ERIC Educational Resources Information Center

    Kim-Spoon, Jungmeen; Kahn, Rachel; Deater-Deckard, Kirby; Chiu, Pearl; Steinberg, Laurence; King-Casas, Brooks

    2016-01-01

    Adolescence is characterized by increasing incidence of health risk behaviors, including experimentation with drugs and alcohol. To fill the gap in our understanding of the associations between risky decision-making and health risk behaviors, we investigated associations between laboratory-based risky decision-making using the Stoplight task and…

  6. Fluid dynamics structures in a fire environment observed in laboratory-scale experiments

    Treesearch

    J. Lozano; W. Tachajapong; D.R. Weise; S. Mahalingam; M. Princevac

    2010-01-01

    Particle Image Velocimetry (PIV) measurements were performed in laboratory-scale experimental fires spreading across horizontal fuel beds composed of aspen (Populus tremuloides Michx) excelsior. The continuous flame, intermittent flame, and thermal plume regions of a fire were investigated. Utilizing a PIV system, instantaneous velocity fields for...

  7. Task Selection, Task Switching and Multitasking during Computer-Based Independent Study

    ERIC Educational Resources Information Center

    Judd, Terry

    2015-01-01

    Detailed logs of students' computer use, during independent study sessions, were captured in an open-access computer laboratory. Each log consisted of a chronological sequence of tasks representing either the application or the Internet domain displayed in the workstation's active window. Each task was classified using a three-tier schema…

  8. On the dominant noise components of tactical aircraft: Laboratory to full scale

    NASA Astrophysics Data System (ADS)

    Tam, Christopher K. W.; Aubert, Allan C.; Spyropoulos, John T.; Powers, Russell W.

    2018-05-01

    This paper investigates the dominant noise components of a full-scale high performance tactical aircraft. The present study uses acoustic measurements of the exhaust jet from a single General Electric F414-400 turbofan engine installed in a Boeing F/A-18E Super Hornet aircraft operating from flight idle to maximum afterburner. The full-scale measurements are to the ANSI S12.75-2012 standard employing about 200 microphones. By comparing measured noise spectra with those from hot supersonic jets observed in the laboratory, the dominant noise components specific to the F/A-18E aircraft at different operating power levels are identified. At intermediate power, it is found that the dominant noise components of an F/A-18E aircraft are essentially the same as those of high temperature supersonic laboratory jets. However, at military and afterburner powers, there are new dominant noise components. Their characteristics are then documented and analyzed. This is followed by an investigation of their origin and noise generation mechanisms.

  9. Effect of nacelle on wake meandering in a laboratory scale wind turbine using LES

    NASA Astrophysics Data System (ADS)

    Foti, Daniel; Yang, Xiaolei; Guala, Michele; Sotiropoulos, Fotis

    2015-11-01

    Wake meandering, a large-scale motion in wind turbine wakes, has considerable effects on the velocity deficit and turbulence intensity in the turbine wake, from laboratory-scale to utility-scale wind turbines. In the dynamic wake meandering model, wake meandering is assumed to be caused by large-scale atmospheric turbulence. On the other hand, Kang et al. (J. Fluid Mech., 2014) demonstrated that the nacelle geometry has a significant effect on the wake meandering of a hydrokinetic turbine, through the interaction of the inner wake of the nacelle vortex with the outer wake of the tip vortices. In this work, the significance of the nacelle on the wake meandering of a miniature wind turbine previously used in experiments (Howard et al., Phys. Fluids, 2015) is demonstrated with large eddy simulations (LES) using an immersed boundary method with grids fine enough to resolve the turbine's geometric characteristics. The three-dimensionality of the wake meandering is analyzed in detail through turbulent spectra and meander reconstruction. The computed flow fields exhibit wake dynamics similar to those observed in the wind tunnel experiments and are analyzed to shed new light on the role of the energetic nacelle vortex in wake meandering. This work was supported by Department of Energy DOE (DE-EE0002980, DE-EE0005482 and DE-AC04-94AL85000), and Sandia National Laboratories. Computational resources were provided by Sandia National Laboratories and the University of Minnesota Supercomputing.

  10. Scaled vibratory feedback can bias muscle use in children with dystonia during a redundant, one-dimensional myocontrol task

    PubMed Central

    Liyanagamage, Shanie A.; Bertucco, Matteo; Bhanpuri, Nasir H.; Sanger, Terence D.

    2016-01-01

    Vibratory feedback can be a useful tool for rehabilitation. We examined its use in children with dystonia to understand how it affects muscle activity in a population that does not respond well to standard rehabilitation. We predicted scaled vibration (i.e. vibration that was directly or inversely proportional to muscle activity) would increase use of the vibrated muscle because of task-relevant sensory information, while non-scaled vibration would not change muscle use. The study was conducted on 11 subjects with dystonia and 14 controls. Each subject underwent 4 different types of vibration on the more dystonic biceps muscle (or non-dominant arm in controls) in a one-dimensional, bimanual myocontrol task. Our results showed that only scaled vibratory feedback could bias muscle use without changing overall performance in children with dystonia. We believe there may be a role in rehabilitation for scaled vibratory feedback to retrain abnormal muscle patterns. PMID:27798370
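The "scaled" versus "non-scaled" vibration conditions can be pictured as a mapping from the normalized EMG envelope of the vibrated muscle to vibration amplitude. The mode names, gains, and clamping range below are illustrative assumptions; the abstract does not give the study's actual parameters:

```python
# Illustrative mapping from normalized EMG to vibration amplitude for the
# directly scaled, inversely scaled, and non-scaled conditions described
# above. Gains and the clamping range are assumptions, not study parameters.

def vibration_amplitude(emg, mode, gain=1.0, constant_level=0.5):
    emg = min(max(emg, 0.0), 1.0)  # clamp normalized EMG to [0, 1]
    if mode == "direct":           # amplitude grows with muscle activity
        return gain * emg
    if mode == "inverse":          # amplitude shrinks with muscle activity
        return gain * (1.0 - emg)
    if mode == "constant":         # non-scaled: carries no task-relevant info
        return constant_level
    return 0.0                     # vibration off
```

Only the first two modes carry task-relevant information about muscle activity, which is the property the abstract ties to the observed biasing of muscle use.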

  11. Decision paths in complex tasks

    NASA Technical Reports Server (NTRS)

    Galanter, Eugene

    1991-01-01

    Complex real-world action and its prediction and control have escaped analysis by the classical methods of psychological research. The reason is that psychologists have no procedures to parse complex tasks into their constituents. Where such a division can be made, based say on expert judgment, there is no natural scale to measure the positive or negative values of the components. Even if we could assign numbers to task parts, we lack rules, i.e., a theory, to combine them into a total task representation. We compare here two plausible theories for the amalgamation of the value of task components. Both of these theories require a numerical representation of motivation, for motivation is the primary variable that guides choice and action in well-learned tasks. We address this problem of motivational quantification and performance prediction by developing psychophysical scales of the desirability or aversiveness of task components based on utility scaling methods (Galanter 1990). We modify methods used originally to scale sensory magnitudes (Stevens and Galanter 1957), which have been applied recently to the measure of task 'workload' by Gopher and Braune (1984). Our modification uses utility comparison scaling techniques which avoid the unnecessary assumptions made by Gopher and Braune. Formulas for the utility of complex tasks based on the theoretical models are used to predict decision and choice of alternate paths to the same goal.
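The idea of comparing amalgamation rules for component utilities can be made concrete with a toy example. The additive and averaging rules and all numbers below are generic illustrations, not the paper's actual models, which the abstract does not specify:

```python
# A generic sketch of comparing two amalgamation rules for the utilities of
# task components. The rules and numbers are illustrative only.

def additive_utility(components):      # total utility is the sum
    return sum(components)

def averaged_utility(components):      # total utility is the mean
    return sum(components) / len(components)

# Two alternate paths to the same goal, each a list of component utilities
# (positive = desirable, negative = aversive).
path_a = [0.8, -0.2, 0.5]
path_b = [0.4, 0.4]

# The two rules can predict different choices for the same task structure,
# which is what makes the comparison empirically testable.
print(additive_utility(path_a) > additive_utility(path_b))  # True
print(averaged_utility(path_a) > averaged_utility(path_b))  # False
```

The point of the sketch is that rival combination rules diverge precisely when paths differ in the number of components, so observed choices between such paths can discriminate between the theories.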

  12. Source Code Analysis Laboratory (SCALe)

    DTIC Science & Technology

    2012-04-01

    True positives (TP) versus flagged nonconformities (FNC), by software system (TP/FNC ratio): Mozilla Firefox version 2.0, 6/12 (50%); Linux kernel version 2.6.15, 10/126 (8%)...is inappropriately tuned for analysis of the Linux kernel, which has anomalous results. Customizing SCALe to work with software for a particular...servers support a collection of virtual machines (VMs) that can be configured to support analysis in various environments, such as Windows XP and Linux. A

  13. Task Effects on Linguistic Complexity and Accuracy: A Large-Scale Learner Corpus Analysis Employing Natural Language Processing Techniques

    ERIC Educational Resources Information Center

    Alexopoulou, Theodora; Michel, Marije; Murakami, Akira; Meurers, Detmar

    2017-01-01

    Large-scale learner corpora collected from online language learning platforms, such as the EF-Cambridge Open Language Database (EFCAMDAT), provide opportunities to analyze learner data at an unprecedented scale. However, interpreting the learner language in such corpora requires a precise understanding of tasks: How does the prompt and input of a…

  14. Task 2 Report - A GIS-Based Technical Potential Assessment of Domestic Energy Resources for Electricity Generation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Nathan; Grue, Nicholas W; Rosenlieb, Evan

    The purpose of this report is to support the Lao Ministry of Energy and Mines in assessing the technical potential of domestic energy resources for utility scale electricity generation in the Lao PDR. Specifically, this work provides assessments of technical potential, and associated maps of developable areas, for energy technologies of interest. This report details the methodology, assumptions, and datasets employed in this analysis to provide a transparent, replicable process for future analyses. The methodology and results presented are intended to be a fundamental input to subsequent decision making and energy planning-related analyses. This work concentrates on domestic energy resources for utility-scale electricity generation and considers solar photovoltaic, wind, biomass, and coal resources. This work does not consider potentially imported energy resources (e.g., natural gas) or domestic energy resources that are not present in sufficient quantity for utility-scale generation (e.g., geothermal resources). A technical potential assessment of hydropower resources is currently not feasible due to the absence of required data including site-level assessments of multiple characteristics (e.g., geology environment and access) as well as spatial data on estimated non-exploited hydropower resources. This report is the second output of the Energy Alternatives Study for the Lao PDR, a collaboration led by the Lao Ministry of Energy and Mines and the United States Agency for International Development under the auspices of the Smart Infrastructure for the Mekong program. The Energy Alternatives Study is composed of five successive tasks that collectively support the project's goals. This work is focused on Task 2 - Assess technical potential of domestic energy resources for electricity generation. The work was carried out by a team from the U.S. Department of Energy's National Renewable Energy Laboratory (NREL) in collaboration with the Lao Ministry of Energy
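The kind of GIS-based technical-potential arithmetic such assessments use (developable area after exclusions, times an assumed capacity density, times a capacity factor) can be sketched briefly. All numbers and names below are illustrative placeholders, not values from the report:

```python
# A hedged sketch of per-technology technical-potential arithmetic:
# developable area (after exclusions) * capacity density -> capacity;
# capacity * capacity factor * hours per year -> annual generation.
# Every number below is an invented placeholder.

HOURS_PER_YEAR = 8760

def technical_potential(area_km2, excluded_fraction, mw_per_km2, capacity_factor):
    developable = area_km2 * (1.0 - excluded_fraction)   # km2 left after exclusions
    capacity_mw = developable * mw_per_km2               # installable capacity
    generation_gwh = capacity_mw * capacity_factor * HOURS_PER_YEAR / 1000.0
    return capacity_mw, generation_gwh

# Illustrative region: 1000 km2, 80% excluded, 30 MW/km2, 18% capacity factor.
cap, gen = technical_potential(1000.0, 0.8, 30.0, 0.18)
print(round(cap), round(gen))  # → 6000 9461  (MW, GWh/yr)
```

In a real GIS workflow the exclusion step is a raster/vector overlay per technology (slope, land cover, protected areas), but the downstream arithmetic reduces to this multiplication.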

  15. Final report for 105-N Basin sediment disposition task, phase 2 -- samples BOMPC8 and BOMPC9

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Esch, R.A.

    1998-02-05

    This document is the final report deliverable for Phase 2 analytical work for the 105-N Basin Sediment Disposition Task. On December 23, 1997, ten samples were received at the 222-S Laboratory as follows: two (2) bottles of potable water, six (6) samples for process control testing and two (2) samples for characterization. Analyses were performed in accordance with the Letter of Instruction for Phase 2 Analytical Work for the 105-N Basin Sediment Disposition Task (Logan and Kessner, 1997) (Attachment 7) and the 105-N Basin Sediment Disposition Phase-Two Sampling and Analysis Plan (SAP) (Smith, 1997). The analytical results are included in Table 1. This document provides the values of X/Qs for the onsite and offsite receptors, taking into account the building wake and the atmospheric stability effects. X/Q values for the potential fire accident were also calculated. In addition, the unit doses were calculated for the mixtures of isotopes.

  16. Identification of oxidative coupling products of xylenols arising from laboratory-scale phytoremediation.

    PubMed

    Poerschmann, J; Schultze-Nobre, L; Ebert, R U; Górecki, T

    2015-01-01

    Oxidative coupling reactions take place during the passage of xylenols through a laboratory-scale helophyte-based constructed wetland system. Typical coupling product groups including tetramethyl-[1,1'-biphenyl] diols and tetramethyl diphenylether monools as stable organic intermediates could be identified by a combination of pre-chromatographic derivatization and GC/MS analysis. Structural assignment of individual analytes was performed by an increment system developed by Zenkevich to pre-calculate retention sequences. The most abundant analyte turned out to be 3,3',5,5'-tetramethyl-[1,1'-biphenyl]-4,4'-diol, which can be formed by a combination of radicals based on 2,6-xylenol or by an attack of a 2,6-xylenol-based radical on 2,6-xylenol. Organic intermediates originating from oxidative coupling could also be identified in anaerobic constructed wetland systems. This finding suggested the presence of (at least partly) oxic conditions in the rhizosphere. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Fast laboratory-based micro-computed tomography for pore-scale research: Illustrative experiments and perspectives on the future

    NASA Astrophysics Data System (ADS)

    Bultreys, Tom; Boone, Marijn A.; Boone, Matthieu N.; De Schryver, Thomas; Masschaele, Bert; Van Hoorebeke, Luc; Cnudde, Veerle

    2016-09-01

    Over the past decade, the widespread implementation of laboratory-based X-ray micro-computed tomography (micro-CT) scanners has revolutionized both experimental and numerical research on pore-scale transport in geological materials. The availability of these scanners has made it possible for many researchers to image a rock's pore space in 3D almost routinely. While challenges do persist in this field, we address the next frontier in laboratory-based micro-CT scanning: in-situ, time-resolved imaging of dynamic processes. Extremely fast (even sub-second) micro-CT imaging has become possible at synchrotron facilities over the last few years; however, the restricted accessibility of synchrotrons limits the number of experiments which can be performed. The much smaller X-ray flux in laboratory-based systems bounds the time resolution which can be attained at these facilities. Nevertheless, progress is being made to improve the quality of measurements performed on the sub-minute time scale. We illustrate this by presenting cutting-edge pore-scale experiments visualizing two-phase flow and solute transport in real time with a lab-based environmental micro-CT set-up. To outline the current state of this young field and its relevance to pore-scale transport research, we critically examine its current bottlenecks and their possible solutions, both on the hardware and the software level. Further developments in laboratory-based, time-resolved imaging could prove greatly beneficial to our understanding of transport behavior in geological materials and to the improvement of pore-scale modeling by providing valuable validation.

  18. Brain-wave measures of workload in advanced cockpits: The transition of technology from laboratory to cockpit simulator, phase 2

    NASA Technical Reports Server (NTRS)

    Horst, Richard L.; Mahaffey, David L.; Munson, Robert C.

    1989-01-01

    The present Phase 2 small business innovation research study was designed to address issues related to scalp-recorded event-related potential (ERP) indices of mental workload and to transition this technology from the laboratory to cockpit simulator environments for use as a systems engineering tool. The project involved five main tasks: (1) Two laboratory studies confirmed the generality of the ERP indices of workload obtained in the Phase 1 study and revealed two additional ERP components related to workload. (2) A task analysis of flight scenarios and pilot tasks in the Advanced Concepts Flight Simulator (ACFS) defined cockpit events (i.e., displays, messages, alarms) that would be expected to elicit ERPs related to workload. (3) Software was developed to support ERP data analysis. An existing ARD-proprietary package of ERP data analysis routines was upgraded, new graphics routines were developed to enhance interactive data analysis, and routines were developed to compare alternative single-trial analysis techniques using simulated ERP data. (4) Working in conjunction with NASA Langley research scientists and simulator engineers, preparations were made for an ACFS validation study of ERP measures of workload. (5) A design specification was developed for a general purpose, computerized, workload assessment system that can function in simulators such as the ACFS.

  19. Use of task-shifting to rapidly scale-up HIV treatment services: experiences from Lusaka, Zambia

    PubMed Central

    Morris, Mary B; Chapula, Bushimbwa Tambatamba; Chi, Benjamin H; Mwango, Albert; Chi, Harmony F; Mwanza, Joyce; Manda, Handson; Bolton, Carolyn; Pankratz, Debra S; Stringer, Jeffrey SA; Reid, Stewart E

    2009-01-01

    The World Health Organization advocates task-shifting, the process of delegating clinical care functions from more specialized to less specialized health workers, as a strategy to achieve the United Nations Millennium Development Goals. However, there is a dearth of literature describing task shifting in sub-Saharan Africa, where services for antiretroviral therapy (ART) have scaled up rapidly in the face of generalized human resource crises. As part of ART services expansion in Lusaka, Zambia, we implemented a comprehensive task-shifting program among existing health providers and community-based workers. Training begins with didactic sessions targeting specialized skill sets. This is followed by an intensive period of practical mentorship, where providers are paired with trainers before working independently. We provide on-going quality assessment using key indicators of clinical care quality at each site. Program performance is reviewed with clinic-based staff quarterly. When problems are identified, clinic staff members design and implement specific interventions to address targeted areas. From 2005 to 2007, we trained 516 health providers in adult HIV treatment; 270 in pediatric HIV treatment; 341 in adherence counseling; 91 in a specialty nurse "triage" course, and 93 in an intensive clinical mentorship program. On-going quality assessment demonstrated improvement across clinical care quality indicators, despite rapidly growing patient volumes. Our task-shifting strategy was designed to address current health care worker needs and to sustain ART scale-up activities. While this approach has been successful, long-term solutions to the human resource crisis are also urgently needed to expand the number of providers and to slow staff migration out of the region. PMID:19134202

  20. Laboratory and theoretical models of planetary-scale instabilities and waves

    NASA Technical Reports Server (NTRS)

    Hart, John E.; Toomre, Juri

    1991-01-01

    Meteorologists and planetary astronomers interested in large-scale planetary and solar circulations recognize the importance of rotation and stratification in determining the character of these flows. The two outstanding problems of interest are: (1) the origins and nature of chaos in baroclinically unstable flows; and (2) the physical mechanisms responsible for high speed zonal winds and banding on the giant planets. The methods used to study these problems, and the insights gained, are useful in more general atmospheric and climate dynamic settings. Because the planetary curvature or beta-effect is crucial in the large scale nonlinear dynamics, the motions of rotating convecting liquids in spherical shells were studied using electrohydrodynamic polarization forces to generate radial gravity and centrally directed buoyancy forces in the laboratory. The Geophysical Fluid Flow Cell (GFFC) experiments performed on Spacelab 3 in 1985 were analyzed. The interpretation and extension of these results have led to the construction of efficient numerical models of rotating convection with an aim to understand the possible generation of zonal banding on Jupiter and the fate of banana cells in rapidly rotating convection as the heating is made strongly supercritical. Efforts to pose baroclinic wave experiments for future space missions using a modified version of the 1985 instrument have led us to develop theoretical and numerical models of baroclinic instability. Some surprising properties of both these models were discovered.

  1. Manual versus automated γ-H2AX foci analysis across five European laboratories: can this assay be used for rapid biodosimetry in a large scale radiation accident?

    PubMed

    Rothkamm, Kai; Barnard, Stephen; Ainsbury, Elizabeth A; Al-Hafidh, Jenna; Barquinero, Joan-Francesc; Lindholm, Carita; Moquet, Jayne; Perälä, Marjo; Roch-Lefèvre, Sandrine; Scherthan, Harry; Thierens, Hubert; Vral, Anne; Vandersickel, Veerle

    2013-08-30

    The identification of severely exposed individuals and reassurance of the 'worried well' are of prime importance for initial triage following a large scale radiation accident. We aim to develop the γ-H2AX foci assay into a rapid biomarker tool for use in accidents. Here, five laboratories established a standard operating procedure and analysed 100 ex vivo γ-irradiated, 4 or 24h incubated and overnight-shipped lymphocyte samples from four donors to generate γ-H2AX reference data, using manual and/or automated foci scoring strategies. In addition to acute, homogeneous exposures to 0, 1, 2 and 4Gy, acute simulated partial body (4Gy to 50% of cells) and protracted exposures (4Gy over 24h) were analysed. Data from all laboratories could be satisfactorily fitted with linear dose response functions. Average yields observed at 4h post exposure were 2-4 times higher than at 24h and varied considerably between laboratories. Automated scoring caused larger uncertainties than manual scoring and was unable to identify partial exposures, which were detectable in manually scored samples due to their overdispersed foci distributions. Protracted exposures were detectable but doses could not be accurately estimated with the γ-H2AX assay. We conclude that the γ-H2AX assay may be useful for rapid triage following a recent acute radiation exposure. The potentially higher speed and convenience of automated relative to manual foci scoring needs to be balanced against its compromised accuracy and inability to detect partial body exposures. Regular re-calibration or inclusion of reference samples may be necessary to ensure consistent results between laboratories or over long time periods. Crown Copyright © 2013. Published by Elsevier B.V. All rights reserved.
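The calibration-and-triage step described, fitting a linear dose response to reference foci yields and inverting it for an unknown sample, can be sketched as follows. The reference yields and the 12.5 foci-per-cell sample below are invented for illustration; they are not data from the inter-laboratory exercise:

```python
# Sketch of the calibration step described above: fit a linear dose-response
# (foci per cell = a*dose + c) to reference data, then invert it to estimate
# the dose of a new sample for triage. All data values are invented.

def fit_linear(doses, yields):
    """Ordinary least-squares fit of yields = a*dose + c."""
    n = len(doses)
    mx = sum(doses) / n
    my = sum(yields) / n
    a = sum((x - mx) * (y - my) for x, y in zip(doses, yields)) / \
        sum((x - mx) ** 2 for x in doses)
    c = my - a * mx
    return a, c

def estimate_dose(observed_yield, a, c):
    return (observed_yield - c) / a

# Hypothetical 4 h reference data: foci per cell at 0, 1, 2 and 4 Gy.
a, c = fit_linear([0, 1, 2, 4], [0.5, 8.4, 16.6, 32.3])
dose = estimate_dose(12.5, a, c)  # estimated dose in Gy for 12.5 foci/cell
print(round(dose, 2))
```

Because yields at 4 h and 24 h differ severalfold, a real triage workflow would keep separate calibration curves per incubation time and per laboratory, as the abstract's call for regular re-calibration implies.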

  2. Pilot-scale laboratory waste treatment by supercritical water oxidation.

    PubMed

    Oshima, Yoshito; Hayashi, Rumiko; Yamamoto, Kazuo

    2006-01-01

    Supercritical water oxidation (SCWO) is a reaction in which organics in an aqueous solution can be oxidized by O2 to CO2 and H2O at a very high reaction rate. In 2003, The University of Tokyo constructed a facility for the SCWO process, with a capacity of approximately 20 kl/year, for the purpose of treating organic laboratory waste. Through the operation of this facility, we have demonstrated that most of the organics in laboratory waste, including halogenated organic compounds, can be successfully treated without the formation of dioxins, suggesting that SCWO is useful as an alternative to the conventional incineration process.

  3. Comparing Treatment Effect Measurements in Narcolepsy: The Sustained Attention to Response Task, Epworth Sleepiness Scale and Maintenance of Wakefulness Test.

    PubMed

    van der Heide, Astrid; van Schie, Mojca K M; Lammers, Gert Jan; Dauvilliers, Yves; Arnulf, Isabelle; Mayer, Geert; Bassetti, Claudio L; Ding, Claire-Li; Lehert, Philippe; van Dijk, J Gert

    2015-07-01

    To validate the Sustained Attention to Response Task (SART) as a treatment effect measure in narcolepsy, and to compare the SART with the Maintenance of Wakefulness Test (MWT) and the Epworth Sleepiness Scale (ESS). Validation of treatment effect measurements within a randomized controlled trial (RCT). Ninety-five patients with narcolepsy with or without cataplexy. The RCT comprised a double-blind, parallel-group, multicenter trial comparing the effects of 8-w treatments with pitolisant (BF2.649), modafinil, or placebo (NCT01067222). MWT, ESS, and SART were administered at baseline and after an 8-w treatment period. The severity of excessive daytime sleepiness and cataplexy was also assessed using the Clinical Global Impression scale (CGI-C). The SART, MWT, and ESS all had good reliability, obtained for the SART and MWT using two to three sessions in 1 day. The ability to distinguish responders from nonresponders, classified using the CGI-C score, was high for all measures, with a high performance for the SART (r = 0.61) and the ESS (r = 0.54). The Sustained Attention to Response Task is a valid and easy-to-administer measure to assess treatment effects in narcolepsy, enhanced by combining it with the Epworth Sleepiness Scale. © 2015 Associated Professional Sleep Societies, LLC.

  4. On the differentiation of N2 components in an appetitive choice task: evidence for the revised Reinforcement Sensitivity Theory.

    PubMed

    Leue, Anja; Chavanon, Mira-Lynn; Wacker, Jan; Stemmler, Gerhard

    2009-11-01

    Task- and personality-related modulations of the N2 were probed within the framework of the revised Reinforcement Sensitivity Theory (RST). Using an appetitive choice task, we investigated 58 students with extreme scores on the behavioral inhibition system and behavioral approach system (BIS/BAS) scales. The baseline-to-peak N2 amplitude was sensitive to the strength of decision conflict and demonstrated RST-related personality differences. In addition to the baseline N2 amplitude, temporal PCA results suggested two N2 components accounting for a laterality effect and capturing different N2 patterns for BIS/BAS groups with increasing conflict level. Evidence for RST-related personality differences was obtained for baseline-to-peak N2 and tPCA components in the present task. The results support the RST prediction that BAS sensitivity modulates conflict processing and confirm the cognitive-motivational conflict concept of RST.

  5. Is the Go/No-Go Lexical Decision Task Preferable to the Yes/No Task with Developing Readers?

    ERIC Educational Resources Information Center

    Moret-Tatay, Carmen; Perea, Manuel

    2011-01-01

    The lexical decision task is probably the most common laboratory visual word identification task together with the naming task. In the usual setup, participants need to press the "yes" button when the stimulus is a word and the "no" button when the stimulus is not a word. A number of studies have employed this task with developing readers;…

  6. Geospace Plasma Dynamics Laboratory Annual Task Report (FY11)

    DTIC Science & Technology

    2012-03-01

    Site Contractors: Nagendra Singh, Ph.D., Physicist, 0.5 MY; Neil Grossbard, M.S., Mathematician, 0.7 MY Visitors: Publications: Articles in...PhD Project Manager Division Chief, RVB This report is published in the interest of scientific and technical...Annual Task Report (FY11) 5c. PROGRAM ELEMENT NUMBER 61102F 6. AUTHOR(S) 5d. PROJECT NUMBER 2311 Daniel Ober 5e. TASK NUMBER

  7. The Effect of Focus on Form and Task Complexity on L2 Learners' Oral Task Performance

    ERIC Educational Resources Information Center

    Salimi, Asghar

    2015-01-01

    Second language learners' oral task performance has been one of the most interesting and research-generating areas of investigation in the field of second language acquisition, especially task-based language teaching and learning. The main purpose of the present study is to investigate the effect of focus on form and task complexity on L2 learners' oral…

  8. TASK-2: a K2P K+ channel with complex regulation and diverse physiological functions

    PubMed Central

    Cid, L. Pablo; Roa-Rojas, Hugo A.; Niemeyer, María I.; González, Wendy; Araki, Masatake; Araki, Kimi; Sepúlveda, Francisco V.

    2013-01-01

    TASK-2 (K2P5.1) is a two-pore domain K+ channel belonging to the TALK subgroup of the K2P family of proteins. TASK-2 has been shown to be activated by extra- and intracellular alkalinization. The extra- and intracellular pH-sensors reside at arginine 224 and lysine 245 and might act on separate gates, the selectivity filter and the inner gate, respectively. TASK-2 is modulated by changes in cell volume, and regulation by direct G-protein interaction has also been proposed. Activation by extracellular alkalinization has been associated with a role of TASK-2 in kidney proximal tubule bicarbonate reabsorption, whilst intracellular pH-sensitivity might be the mechanism for its participation in central chemosensitive neurons. In addition to these functions, TASK-2 has been proposed to play a part in apoptotic volume decrease in kidney cells and in volume regulation of glial cells and T-lymphocytes. TASK-2 is present in chondrocytes of hyaline cartilage, where it is proposed to play a central role in stabilizing the membrane potential. Additional sites of expression are dorsal root ganglion neurons, endocrine and exocrine pancreas and intestinal smooth muscle cells. TASK-2 has been associated with the regulation of proliferation of breast cancer cells and could become a target for breast cancer therapeutics. Further work in native tissues and cells together with genetic modification will no doubt reveal the details of TASK-2 functions that we are only starting to suspect. PMID:23908634

  9. Jena Reference Air Set (JRAS): a multi-point scale anchor for isotope measurements of CO2 in air

    NASA Astrophysics Data System (ADS)

    Wendeberg, M.; Richter, J. M.; Rothe, M.; Brand, W. A.

    2013-03-01

    The need for a unifying scale anchor for isotopes of CO2 in air was brought to light at the 11th WMO/IAEA Meeting of Experts on Carbon Dioxide in Tokyo 2001. During discussions about persistent discrepancies in isotope measurements between the world's leading laboratories, it was concluded that a unifying scale anchor for Vienna Pee Dee Belemnite (VPDB) of CO2 in air was desperately needed. Ten years later, at the 2011 Meeting of Experts on Carbon Dioxide in Wellington, it was recommended that the Jena Reference Air Set (JRAS) become the official scale anchor for isotope measurements of CO2 in air (Brailsford, 2012). The source of CO2 used for JRAS is two calcites. After releasing CO2 by reaction with phosphoric acid, the gases are mixed into CO2-free air. This procedure ensures both isotopic stability and longevity of the CO2. That the reference CO2 is generated from calcites and supplied as an air mixture is unique to JRAS; this is done to ensure that any measurement bias arising from the extraction procedure is eliminated. As every laboratory has its own procedure for extracting the CO2, this is of paramount importance if the local scales are to be unified with a common anchor. For a period of four years, JRAS has been evaluated through the IMECC1 program, which made it possible to distribute sets of JRAS gases to 13 laboratories worldwide. A summary of data from the six laboratories that have reported the full set of results is given here along with a description of the production and maintenance of the JRAS scale anchors. 1 IMECC refers to the EU project "Infrastructure for Measurements of the European Carbon Cycle" (http://imecc.ipsl.jussieu.fr/).

  10. Distributed cerebellar plasticity implements generalized multiple-scale memory components in real-robot sensorimotor tasks.

    PubMed

    Casellato, Claudia; Antonietti, Alberto; Garrido, Jesus A; Ferrigno, Giancarlo; D'Angelo, Egidio; Pedrocchi, Alessandra

    2015-01-01

The cerebellum plays a crucial role in motor learning and acts as a predictive controller. Modeling it and embedding it into sensorimotor tasks allows us to create functional links between plasticity mechanisms, neural circuits, and behavioral learning. Moreover, when applied to real-time control of a neurorobot, the cerebellar model has to deal with a real, noisy, and changing environment, thus showing its robustness and effectiveness in learning. A biologically inspired cerebellar model with distributed plasticity, at both cortical and nuclear sites, has been used. Two cerebellum-mediated paradigms have been designed: an associative Pavlovian task and a vestibulo-ocular reflex, with multiple sessions of acquisition and extinction and with different stimuli and perturbation patterns. The cerebellar controller succeeded in generating conditioned responses and finely tuned eye movement compensation, thus reproducing human-like behaviors. Through a productive plasticity transfer from cortical to nuclear sites, the distributed cerebellar controller showed in both tasks the capability to optimize learning on multiple time scales, to store motor memory, and to adapt effectively to dynamic ranges of stimuli.

  11. Task-specific modulation of adult humans' tool preferences: number of choices and size of the problem.

    PubMed

    Silva, Kathleen M; Gross, Thomas J; Silva, Francisco J

    2015-03-01

In two experiments, we examined the effect of modifications to the features of a stick-and-tube problem on the stick lengths that adult humans used to solve the problem. In Experiment 1, we examined whether people's tool preferences for retrieving an out-of-reach object in a tube might more closely resemble those reported with laboratory crows if people could modify a single stick to an ideal length to solve the problem. In contrast to prior findings in which adult humans selected a tool from a set of ten sticks, asking people to modify a single stick to retrieve an object did not generally result in a stick whose length was related to the object's distance. Consistent with the prior research, though, the working length of the stick was related to the object's distance. In Experiment 2, we examined the effect of increasing the scale of the stick-and-tube problem on people's tool preferences. Increasing the scale of the task led people to select relatively shorter tools than had been selected in previous studies. Although the causal structures of the tasks used in the two experiments were identical, their results were not. This underscores the necessity of studying physical cognition in relation to a particular causal structure by using a variety of tasks and methods.

  12. Novel TASK channels inhibitors derived from dihydropyrrolo[2,1-a]isoquinoline.

    PubMed

    Noriega-Navarro, R; Lopez-Charcas, O; Hernández-Enríquez, B; Reyes-Gutiérrez, P E; Martínez, R; Landa, A; Morán, J; Gomora, J C; Garcia-Valdes, J

    2014-04-01

TASK channels belong to the family of K(+) channels with 4 transmembrane segments and 2 pore domains (4TM/2P) per subunit. These channels have been related to apoptosis in cerebellar granule neurons (CGN), as well as to cancer in other tissues. TASK current is regulated by hormones, neurotransmitters, anesthetics, and divalent cations, none of which is selective. Recently, some organic compounds have been found that inhibit TASK current selectively. In order to find other modulators, we report here a group of five dihydropyrrolo[2,1-a]isoquinolines (DPIs), four of them with putative anticancer activity, that were evaluated on TASK-1 and TASK-3 channels. Compounds 1, 2, and 3 showed IC50 < 320 μM on TASK-1 and TASK-3, intermediate activity on the TASK-1/TASK-3 heterodimer, a moderate effect on hSlo and TREK-1 (500 μM), and practically no inhibition of Shaker-IR, hERG, and IRK2.1 potassium channels when expressed heterologously in Xenopus laevis oocytes. In rat CGN, 500 μM of these three compounds decreased the TASK-carried leak current by >39%. Finally, only compound 1 showed significant protection (∼36%) against apoptotic death of CGN induced by K(+) deprivation. These results suggest that DPI compounds could be potential candidates for designing new selective inhibitors of TASK channels. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Visual Experience Enhances Infants' Use of Task-Relevant Information in an Action Task

    ERIC Educational Resources Information Center

    Wang, Su-hua; Kohne, Lisa

    2007-01-01

    Four experiments examined whether infants' use of task-relevant information in an action task could be facilitated by visual experience in the laboratory. Twelve- but not 9-month-old infants spontaneously used height information and chose an appropriate (taller) cover in search of a hidden tall toy. After watching examples of covering events in a…

  14. Individual differences in situation awareness: Validation of the Situationism Scale

    PubMed Central

    Roberts, Megan E.; Gibbons, Frederick X.; Gerrard, Meg; Klein, William M. P.

    2015-01-01

    This paper concerns the construct of lay situationism—an individual’s belief in the importance of a behavior’s context. Study 1 identified a 13-item Situationism Scale, which demonstrated good reliability and validity. In particular, higher situationism was associated with greater situation-control (strategies to manipulate the environment in order to avoid temptation). Subsequent laboratory studies indicated that people higher on the situationism subscales used greater situation-control by sitting farther from junk food (Study 2) and choosing to drink non-alcoholic beverages before a cognitive task (Study 3). Overall, findings provide preliminary support for the psychometric validity and predictive utility of the Situationism Scale and offer this individual difference construct as a means to expand self-regulation theory. PMID:25329242

  15. Recent H- diagnostics, plasma simulations, and 2X scaled Penning ion source developments at the Rutherford Appleton Laboratory

    NASA Astrophysics Data System (ADS)

    Lawrie, S. R.; Faircloth, D. C.; Smith, J. D.; Sarmento, T. M.; Whitehead, M. O.; Wood, T.; Perkins, M.; Macgregor, J.; Abel, R.

    2018-05-01

    A vessel for extraction and source plasma analyses is being used for Penning H- ion source development at the Rutherford Appleton Laboratory. A new set of optical elements including an einzel lens has been installed, which transports over 80 mA of H- beam successfully. Simultaneously, a 2X scaled Penning source has been developed to reduce cathode power density. The 2X source is now delivering a 65 mA H- ion beam at 10% duty factor, meeting its design criteria. The long-term viability of the einzel lens and 2X source is now being evaluated, so new diagnostic devices have been installed. A pair of electrostatic deflector plates is used to correct beam misalignment and perform fast chopping, with a voltage rise time of 24 ns. A suite of four quartz crystal microbalances has shown that the cesium flux in the vacuum vessel is only increased by a factor of two, despite the absence of a dedicated cold trap. Finally, an infrared camera has demonstrated good agreement with thermal simulations but has indicated unexpected heating due to beam loss on the downstream electrode. These types of diagnostics are suitable for monitoring all operational ion sources. In addition to experimental campaigns and new diagnostic tools, the high-performance VSim and COMSOL software packages are being used for plasma simulations of two novel ion thrusters for space propulsion applications. In parallel, a VSim framework has been established to include arbitrary temperature and cesium fields to allow the modeling of surface physics in H- ion sources.

  16. GeoVision Exploration Task Force Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doughty, Christine; Dobson, Patrick F.; Wall, Anna

The GeoVision study effort included ground-breaking, detailed research on current and future market conditions and geothermal technologies in order to forecast and quantify the electric and non-electric deployment potentials under a range of scenarios, in addition to their impacts on the Nation's jobs, economy, and environment. Coordinated by the U.S. Department of Energy's (DOE's) Geothermal Technologies Office (GTO), the GeoVision study development relied on the collection, modeling, and analysis of robust datasets through seven national laboratory partners, which were organized into eight technical Task Force groups. The purpose of this report is to provide a central repository for the research conducted by the Exploration Task Force. The Exploration Task Force consists of four individuals representing three national laboratories: Patrick Dobson (task lead) and Christine Doughty of Lawrence Berkeley National Laboratory, Anna Wall of National Renewable Energy Laboratory, Travis McLing of Idaho National Laboratory, and Chester Weiss of Sandia National Laboratories. As part of the GeoVision analysis, our team conducted extensive scientific and financial analyses on a number of topics related to current and future geothermal exploration methods. The GeoVision Exploration Task Force complements the drilling and resource technology investigations conducted as part of the Reservoir Maintenance and Development Task Force. The Exploration Task Force, however, has focused primarily on early-stage R&D technologies in exploration and confirmation drilling, along with an evaluation of geothermal financing challenges and assumptions, and innovative "blue-sky" technologies. This research was used to develop geothermal resource supply curves (through the use of GETEM) for use in the ReEDS capacity expansion modeling that determines geothermal technology deployment potential. It also catalogues and explores the large array of early-stage R&D technologies with the

  17. Experiments with an ESOL Reading Laboratory.

    ERIC Educational Resources Information Center

    Ahrens, Patricia

    The reading laboratory has been developed to supplement intensive reading work for adult foreign students developing English-as-a-second-language skills at the American Language Institute. The laboratory is designed to suggest to students that there is a variety of reading tasks and a variety of reading strategies related to the tasks, to offer…

  18. Altered segregation between task-positive and task-negative regions in mild traumatic brain injury.

    PubMed

    Sours, Chandler; Kinnison, Joshua; Padmala, Srikanth; Gullapalli, Rao P; Pessoa, Luiz

    2018-06-01

    Changes in large-scale brain networks that accompany mild traumatic brain injury (mTBI) were investigated using functional magnetic resonance imaging (fMRI) during the N-back working memory task at two cognitive loads (1-back and 2-back). Thirty mTBI patients were examined during the chronic stage of injury and compared to 28 control participants. Demographics and behavioral performance were matched across groups. Due to the diffuse nature of injury, we hypothesized that there would be an imbalance in the communication between task-positive and Default Mode Network (DMN) regions in the context of effortful task execution. Specifically, a graph-theoretic measure of modularity was used to quantify the extent to which groups of brain regions tended to segregate into task-positive and DMN sub-networks. Relative to controls, mTBI patients showed reduced segregation between the DMN and task-positive networks, but increased functional connectivity within the DMN regions during the more cognitively demanding 2-back task. Together, our findings reveal that patients exhibit alterations in the communication between and within neural networks during a cognitively demanding task. These findings reveal altered processes that persist through the chronic stage of injury, highlighting the need for longitudinal research to map the neural recovery of mTBI patients.
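
    The graph-theoretic modularity the abstract relies on can be illustrated with a minimal, self-contained sketch of Newman's Q for an unweighted graph. The toy "regions" and edges below are invented for illustration and are not the study's connectivity data:

    ```python
    from itertools import combinations

    def modularity(edges, communities):
        """Newman modularity Q for an undirected, unweighted graph.

        Q = sum over communities c of [ L_c/m - (d_c / 2m)^2 ], where L_c is
        the number of edges inside c, d_c the total degree of its nodes,
        and m the total edge count. Higher Q means stronger segregation.
        """
        m = len(edges)
        degree = {}
        for a, b in edges:
            degree[a] = degree.get(a, 0) + 1
            degree[b] = degree.get(b, 0) + 1
        q = 0.0
        for comm in communities:
            members = set(comm)
            l_c = sum(1 for a, b in edges if a in members and b in members)
            d_c = sum(degree[n] for n in members)
            q += l_c / m - (d_c / (2 * m)) ** 2
        return q

    # Toy connectivity graph: two dense sub-networks ("task-positive" vs
    # "DMN" style); node names are purely illustrative.
    tp = ["dlPFC", "IPS", "FEF"]
    dmn = ["PCC", "mPFC", "AG"]
    edges = list(combinations(tp, 2)) + list(combinations(dmn, 2)) + [("dlPFC", "PCC")]
    print(round(modularity(edges, [tp, dmn]), 3))  # -> 0.357
    ```

    Reduced segregation of the kind reported for mTBI patients would appear in such a measure as a lower Q for the task-positive/DMN partition, driven by more between-network edges.
    
    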

  19. The Time on Task Effect in Reading and Problem Solving Is Moderated by Task Difficulty and Skill: Insights from a Computer-Based Large-Scale Assessment

    ERIC Educational Resources Information Center

    Goldhammer, Frank; Naumann, Johannes; Stelter, Annette; Tóth, Krisztina; Rölke, Heiko; Klieme, Eckhard

    2014-01-01

    Computer-based assessment can provide new insights into behavioral processes of task completion that cannot be uncovered by paper-based instruments. Time presents a major characteristic of the task completion process. Psychologically, time on task has 2 different interpretations, suggesting opposing associations with task outcome: Spending more…

  20. Balancing the Demands of Two Tasks: An Investigation of Cognitive-Motor Dual-Tasking in Relapsing Remitting Multiple Sclerosis.

    PubMed

    Butchard-MacDonald, Emma; Paul, Lorna; Evans, Jonathan J

    2018-03-01

People with relapsing remitting multiple sclerosis (PwRRMS) suffer disproportionate decrements in gait under dual-task conditions, when walking and a cognitive task are combined. There has been much less investigation of the impact of cognitive demands on balance. This study investigated whether: (1) PwRRMS show disproportionate decrements in postural stability under dual-task conditions compared to healthy controls, and (2) dual-task decrements are associated with everyday dual-tasking difficulties. The impact of mood, fatigue, and disease severity on dual-tasking was also examined. A total of 34 PwRRMS and 34 matched controls completed cognitive (digit span) and balance (movement of center of pressure on a Biosway platform on stable and unstable surfaces) tasks under single- and dual-task conditions. Everyday dual-tasking was measured using the Dual-Tasking Questionnaire, mood by the Hospital Anxiety & Depression Scale, and fatigue via the Modified Fatigue Impact Scale. There were no differences in age, gender, years of education, estimated pre-morbid IQ, or baseline digit span between the groups. Compared with controls, PwRRMS showed a significantly greater decrement in postural stability under dual-task conditions on an unstable surface (p=.007), but not on a stable surface (p=.679). Balance decrement scores were not correlated with everyday dual-tasking difficulties or fatigue. Stable-surface balance decrement scores were significantly associated with levels of anxiety (rho=0.527; p=.001) and depression (rho=0.451; p=.007). RRMS causes dual-tasking difficulties, impacting balance under challenging conditions, which may contribute to increased risk of gait difficulties and falls. The relationship between anxiety/depression and dual-task decrement suggests that emotional factors may be contributing to dual-task difficulties. (JINS, 2018, 24, 247-258).

  1. Development and psychometric validation of the Task-Specific Self-Efficacy Scale for Chinese people with mental illness.

    PubMed

    Chou, Chih Chin; Cardoso, Elizabeth Da Silva; Chan, Fong; Tsang, Hector W H; Wu, Mingyi

    2007-12-01

The aim of this study was to validate a Task-Specific Self-Efficacy Scale for Chinese people with mental illness. The study included 79 men and 77 women with chronic mental illness. The Task-Specific Self-Efficacy Scale for People with Mental Illness (TSSES-PMI) and the Change Assessment Questionnaire for People with Severe and Persistent Mental Illness were used as measures for the study. Factor analysis of the TSSES-PMI resulted in four subscales: Symptom Management Skills, Work-Related Skills, Help-Seeking Skills, and Self-Emotional-Regulation Skills. These community living skills were found to be related to the level of readiness for psychiatric rehabilitation among Chinese people with mental illness. In conclusion, the results support the construct validity of the TSSES-PMI for the Chinese population, and the TSSES-PMI can be a useful instrument for working with Chinese people with mental illnesses.

  2. Numerical modeling of seismic anomalies at impact craters on a laboratory scale

    NASA Astrophysics Data System (ADS)

    Wuennemann, K.; Grosse, C. U.; Hiermaier, S.; Gueldemeister, N.; Moser, D.; Durr, N.

    2011-12-01

Almost all terrestrial impact craters exhibit a typical geophysical signature. The usually observed circular negative gravity anomaly and reduced seismic velocities in the vicinity of crater structures are presumably related to an approximately hemispherical zone underneath craters where rocks have experienced intense brittle-plastic deformation and fracturing during formation (see Fig. 1). In the framework of the "MEMIN" (multidisciplinary experimental and modeling impact crater research network) project, we carried out hypervelocity cratering experiments at the Fraunhofer Institute for High-Speed Dynamics on a decimeter scale to study the spatiotemporal evolution of the damage zone using ultrasound, acoustic emission techniques, and numerical modeling of crater formation. Iron projectiles of 2.5-10 mm were shot at 2-5.5 km/s onto dry and water-saturated sandstone targets. The target material was characterized before, during, and after the impact with high-spatial-resolution acoustic techniques to detect the extent of the damage zone and the state of rocks therein, and to record the growth of cracks. The ultrasound measurements are applied in a manner analogous to seismic surveys at natural craters, but on a much smaller scale. We compare the measured data with dynamic models of crater formation; shock, plastic, and elastic wave propagation; and tensile/shear failure of rocks in the impacted sandstone blocks. The presence of porosity and pore water significantly affects the propagation of waves; in particular, the crushing of pores due to shock compression has to be taken into account. We present preliminary results showing good agreement between the experiments and the numerical model. As a next step, we plan to use the numerical models to upscale the results from laboratory dimensions to the scale of natural impact craters.

  3. Numerical Investigation of Earthquake Nucleation on a Laboratory-Scale Heterogeneous Fault with Rate-and-State Friction

    NASA Astrophysics Data System (ADS)

    Higgins, N.; Lapusta, N.

    2014-12-01

    Many large earthquakes on natural faults are preceded by smaller events, often termed foreshocks, that occur close in time and space to the larger event that follows. Understanding the origin of such events is important for understanding earthquake physics. Unique laboratory experiments of earthquake nucleation in a meter-scale slab of granite (McLaskey and Kilgore, 2013; McLaskey et al., 2014) demonstrate that sample-scale nucleation processes are also accompanied by much smaller seismic events. One potential explanation for these foreshocks is that they occur on small asperities - or bumps - on the fault interface, which may also be the locations of smaller critical nucleation size. We explore this possibility through 3D numerical simulations of a heterogeneous 2D fault embedded in a homogeneous elastic half-space, in an attempt to qualitatively reproduce the laboratory observations of foreshocks. In our model, the simulated fault interface is governed by rate-and-state friction with laboratory-relevant frictional properties, fault loading, and fault size. To create favorable locations for foreshocks, the fault surface heterogeneity is represented as patches of increased normal stress, decreased characteristic slip distance L, or both. Our simulation results indicate that one can create a rate-and-state model of the experimental observations. Models with a combination of higher normal stress and lower L at the patches are closest to matching the laboratory observations of foreshocks in moment magnitude, source size, and stress drop. In particular, we find that, when the local compression is increased, foreshocks can occur on patches that are smaller than theoretical critical nucleation size estimates. The additional inclusion of lower L for these patches helps to keep stress drops within the range observed in experiments, and is compatible with the asperity model of foreshock sources, since one would expect more compressed spots to be smoother (and hence have
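
    The rate-and-state friction framework named above has a standard form that can be written down directly. As a hedged illustration (parameter values are invented, and only the common Dieterich "aging law" for state evolution is shown, not the study's full 3D elastodynamic model), the friction coefficient at constant sliding rate relaxes to a well-known analytic steady state:

    ```python
    import math

    # Standard rate-and-state friction: mu = mu0 + a*ln(V/V0) + b*ln(V0*theta/L),
    # with aging-law state evolution d(theta)/dt = 1 - V*theta/L.
    # Parameter values below are illustrative only.
    mu0, a, b = 0.6, 0.010, 0.015   # b > a -> velocity weakening (can nucleate)
    V0, L = 1e-6, 1e-5              # reference slip rate (m/s), characteristic slip distance (m)

    def friction(V, theta):
        return mu0 + a * math.log(V / V0) + b * math.log(V0 * theta / L)

    # Slide at a constant rate V and evolve the state variable with
    # simple explicit Euler steps until it reaches steady state.
    V, theta, dt = 1e-5, L / V0, 1e-3
    for _ in range(100000):
        theta += dt * (1.0 - V * theta / L)

    mu_ss = mu0 + (a - b) * math.log(V / V0)   # analytic steady-state friction
    print(round(friction(V, theta), 4), round(mu_ss, 4))  # both ~0.5885
    ```

    Because b > a here, steady-state friction decreases with sliding rate, the velocity-weakening condition under which patches of the fault can nucleate seismic slip.
    
    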

  4. Laboratory Waste Management. A Guidebook.

    ERIC Educational Resources Information Center

    American Chemical Society, Washington, DC.

    A primary goal of the American Chemical Society Task Force on Laboratory Waste Management is to provide laboratories with the information necessary to develop effective strategies and training programs for managing laboratory wastes. This book is intended to present a fresh look at waste management from the laboratory perspective, considering both…

  5. Re-Thinking Stages of Cognitive Development: An Appraisal of Connectionist Models of the Balance Scale Task

    ERIC Educational Resources Information Center

    Quinlan, Philip T.; van der Maas, Han L. J.; Jansen, Brenda R. J.; Booij, Olaf; Rendell, Mark

    2007-01-01

    The present paper re-appraises connectionist attempts to explain how human cognitive development appears to progress through a series of sequential stages. Models of performance on the Piagetian balance scale task are the focus of attention. Limitations of these models are discussed and replications and extensions to the work are provided via the…

  6. The effects of 2 landing techniques on knee kinematics, kinetics, and performance during stop-jump and side-cutting tasks.

    PubMed

    Dai, Boyi; Garrett, William E; Gross, Michael T; Padua, Darin A; Queen, Robin M; Yu, Bing

    2015-02-01

Anterior cruciate ligament (ACL) injuries commonly occur during jump landing and cutting tasks. Attempts to land softly and to land with greater knee flexion are associated with decreased ACL loading; however, their effects on performance are unclear. The hypotheses were that attempts to land softly would decrease peak posterior ground-reaction force (PPGRF) and knee extension moment at PPGRF compared with a natural landing during stop-jump and side-cutting tasks, and that attempts to land with greater knee flexion at initial ground contact would increase knee flexion at PPGRF compared with a natural landing during both tasks. In addition, both landing techniques were expected to increase stance time and lower extremity mechanical work as well as decrease jump height and movement speed compared with a natural landing during both tasks. Controlled laboratory study. A total of 18 male and 18 female recreational athletes participated in the study. Three-dimensional kinematic and kinetic data were collected during stop-jump and side-cutting tasks under three conditions: natural landing, soft landing, and landing with greater knee flexion at initial ground contact. Attempts to land softly decreased PPGRF and knee extension moment at PPGRF compared with a natural landing during stop-jump tasks. Attempts to land softly decreased PPGRF compared with a natural landing during side-cutting tasks. Attempts to land with greater knee flexion at initial ground contact increased knee flexion angle at PPGRF compared with a natural landing during both stop-jump and side-cutting tasks. Attempts to land softly and to land with greater knee flexion at initial ground contact increased stance time and lower extremity mechanical work, as well as decreased jump height and movement speed, during both stop-jump and side-cutting tasks. Although landing softly and landing with greater knee flexion at initial ground contact may reduce ACL loading during stop-jump and side-cutting tasks, the performance of these tasks decreased, as indicated by

  7. Dispatching the wandering mind? Toward a laboratory method for cuing “spontaneous” off-task thought

    PubMed Central

    McVay, Jennifer C.; Kane, Michael J.

    2013-01-01

    Cognitive psychologists and neuroscientists study most phenomena of attention by measuring subjects' overt responses to discrete environmental stimuli that can be manipulated to test competing theories. The mind wandering experience, however, cannot be locally instigated by cleverly engineered stimuli. Investigators must therefore rely on correlational and observational methods to understand subjects' flow of thought, which is only occasionally and indirectly monitored. In an effort toward changing this state of affairs, we present four experiments that develop a method for inducing mind wandering episodes—on demand—in response to task-embedded cues. In an initial laboratory session, subjects described their personal goals and concerns across several life domains (amid some filler questionnaires). In a second session, 48 h later, subjects completed a go/no-go task in which they responded to the perceptual features of words; unbeknownst to subjects, some stimulus words were presented in triplets to represent the personal concerns they had described in session 1. Thought probes appearing shortly after these personal-goal triplets indicated that, compared to control triplets, priming subjects' concerns increased mind wandering rate by about 3–4%. We argue that this small effect is, nonetheless, a promising development toward the pursuit of an experimentally informed, theory-driven science of mind wandering. PMID:24027542

  8. A Note on the Factor Structure of Some Piagetian Tasks.

    ERIC Educational Resources Information Center

    Lawson, Anton E.; Nordland, Floyd H.

    Some evidence supports the hypothesis that formal operational reasoning ability (at least that measured by Piagetian tasks) is a unified process. The purpose of this research was to determine: (1) if conservation tasks, such as conservation of number, liquid amount, weight and volume, are unifactor; and (2) if conservation tasks form a scale of…

  9. Infrared Imagery of Shuttle (IRIS). Task 2. [indium antimonide sensors

    NASA Technical Reports Server (NTRS)

    Chocol, C. J.

    1978-01-01

An opto-electronic breadboard of 10 channels of the IR temperature measuring system was produced, as well as a scaled-up portion of the tracking system reticle, in order to verify Task 1 assumptions. The breadboards and the tests performed on them are described, and both raw and reduced data are presented. Tests show that the electronics portion of the imaging system will provide a DC to 10,000 Hz bandwidth that is flat and contributes no more than 0.4% of full-scale uncertainty to the measurement. Conventional packaging is adequate for the transresistance amplifier design. Measurement errors expected from all sources tested are discussed.

  10. Task conflict effect in task switching.

    PubMed

    Braverman, Ami; Meiran, Nachshon

    2010-11-01

A part of action preparation is deciding what the relevant task is. This task-decision process is conceptually separate from response selection. To show this, the authors manipulated task conflict in a spatial task-switching paradigm, using conflict stimuli that appeared during trials with univalent targets (affording one task). The conflict stimuli afforded task identity because they were used as task cues with bivalent targets (affording two tasks) that were intermixed with the univalent targets. Thus, for univalent targets, irrelevant stimuli caused either low or high task conflict. In three experiments, the authors found poorer performance in high-task-conflict trials than in low-task-conflict trials. Task conflict was introduced during target appearance (Experiment 1) or task preparation (Experiments 2 and 3). In the latter case, the task conflict effect decreased with increasing task preparation time, showing that task preparation involves task decision.

  11. Clinical Laboratory Helper.

    ERIC Educational Resources Information Center

    Szucs, Susan C.; And Others

    This curriculum guide provides competencies and tasks for the position of clinical laboratory helper; it serves as both a career exploration experience and/or entry-level employment training. A list of 25 validated competencies and tasks covers careers from entry level to those that must be mastered to earn an associate degree in clinical…

  12. Community-aware task allocation for social networked multiagent systems.

    PubMed

    Wang, Wanyuan; Jiang, Yichuan

    2014-09-01

In this paper, we propose a novel community-aware task allocation model for social networked multiagent systems (SN-MASs), where each agent's cooperation domain is constrained to its community and each agent can negotiate only with its intracommunity member agents. Under such community-aware scenarios, we prove that it remains NP-hard to maximize system overall profit. To solve this problem effectively, we present a heuristic algorithm that is composed of three phases: 1) task selection: select the desirable task to be allocated preferentially; 2) allocation to community: allocate the selected task to communities based on a significant-task-first heuristic; and 3) allocation to agent: negotiate resources for the selected task based on a non-overlapping agent-first and breadth-first resource negotiation mechanism. Through theoretical analyses and experiments, the advantages of our heuristic algorithm and community-aware task allocation model are validated: 1) our heuristic algorithm performs very closely to the benchmark exponential brute-force optimal algorithm and the network flow-based greedy algorithm in terms of system overall profit in small-scale applications; moreover, in large-scale applications, it achieves approximately the same overall system profit but significantly reduces the computational load compared with the greedy algorithm; and 2) our community-aware task allocation model reduces the system communication cost compared with the previous global-aware task allocation model and greatly improves the system overall profit compared with the previous local neighbor-aware task allocation model.
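
    The three-phase structure described in the abstract can be sketched greedily. This is a rough illustration only: the data shapes, names, and the capacity-based community choice are assumptions, and the paper's actual resource-negotiation mechanism is richer than this:

    ```python
    def allocate(tasks, communities):
        """Greedy sketch of a three-phase community-aware allocation.

        Phase 1: pick the most profitable ('significant') task first.
        Phase 2: choose the community with the most spare resource.
        Phase 3: consume agents' resources inside that community only
                 (cooperation never crosses community boundaries).
        """
        result = {}
        # Phase 1: significant-task-first ordering (by profit, descending)
        for name, need, profit in sorted(tasks, key=lambda t: -t[2]):
            # Phase 2: community with the largest spare capacity
            best = max(communities, key=lambda c: sum(c["agents"].values()))
            if sum(best["agents"].values()) < need:
                continue  # no single community can serve this task
            # Phase 3: drain agents inside the chosen community, largest first
            for agent in sorted(best["agents"], key=best["agents"].get, reverse=True):
                if need == 0:
                    break
                take = min(best["agents"][agent], need)
                best["agents"][agent] -= take
                need -= take
            result[name] = best["name"]
        return result

    # Tiny invented example: two communities of agents with resource capacities,
    # three tasks given as (name, resource need, profit).
    communities = [
        {"name": "C1", "agents": {"a1": 3, "a2": 2}},
        {"name": "C2", "agents": {"a3": 4}},
    ]
    tasks = [("t1", 4, 10), ("t2", 5, 20), ("t3", 3, 5)]
    plan = allocate(tasks, communities)
    print(plan)  # -> {'t2': 'C1', 't1': 'C2'}; t3 is dropped (no capacity left)
    ```

    Dropping t3 once both communities are drained mirrors the intracommunity constraint: even though three units of resource exist in total across communities, no single community can supply them.
    
    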

  13. Validation of mathematical model for CZ process using small-scale laboratory crystal growth furnace

    NASA Astrophysics Data System (ADS)

    Bergfelds, Kristaps; Sabanskis, Andrejs; Virbulis, Janis

    2018-05-01

This work focuses on the modelling of a small-scale laboratory NaCl-RbCl crystal growth furnace. First steps towards fully transient simulations are taken in the form of stationary simulations that deal with the optimization of material properties to match the model to experimental conditions. For this purpose, simulation software primarily used for modelling the industrial-scale silicon crystal growth process was successfully applied. Finally, transient simulations of the crystal growth are presented, showing sufficient agreement with experimental results.

  14. Simulation of large scale motions and small scale structures in planetary atmospheres and oceans: From laboratory to space experiments on ISS

    NASA Astrophysics Data System (ADS)

    Egbers, Christoph; Futterer, Birgit; Zaussinger, Florian; Harlander, Uwe

    2014-05-01

Baroclinic waves are responsible for the transport of heat and momentum in the oceans, in the Earth's atmosphere, and in other planetary atmospheres. The talk will give an overview of possibilities to simulate such large-scale as well as co-existing small-scale structures with the help of well-defined laboratory experiments such as the baroclinic wave tank (annulus experiment). The analogy between the Earth's atmosphere and the rotating cylindrical annulus experiment, driven only by rotation and differential heating between polar and equatorial regions, is obvious. Single vortices separate from the Gulf Stream from time to time; the same dynamics, the co-existence of small- and large-scale structures, and their separation can also be observed in laboratory experiments such as the rotating cylindrical annulus experiment. This experiment represents mid-latitude dynamics quite well and serves as a central reference experiment in the Germany-wide DFG priority research programme ("METSTRÖM", SPP 1276), providing a benchmark for many different numerical methods. On the other hand, such laboratory experiments in cylindrical geometry are limited by the fact that the surface and the real interaction between polar and equatorial regions, with their different dynamics, cannot be fully studied. Therefore, I demonstrate how to use the very successful Geoflow I and Geoflow II space experiment hardware on the ISS, with future modifications, for simulations of small- and large-scale planetary atmospheric motion in spherical geometry with differential heating between inner and outer spheres as well as between the polar and equatorial regions. References: Harlander, U., Wenzel, J., Wang, Y., Alexandrov, K. & Egbers, Ch., 2012, Simultaneous PIV- and thermography measurements of partially blocked flow in a heated rotating annulus, Exp. in Fluids, 52 (4), 1077-1087; Futterer, B., Krebs, A., Plesa, A.-C., Zaussinger, F., Hollerbach, R., Breuer, D. & Egbers, Ch., 2013, Sheet-like and

  15. Engineering development of selective agglomeration: Task 5, Bench- scale process testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-09-01

    Under the overall objectives of DOE Contract "Engineering Development of Selective Agglomeration," there were a number of specific objectives in the Task 5 program. The prime objectives of Task 5 are highlighted below: (1) Maximize process performance in pyritic sulfur rejection and BTU recovery, (2) Produce a low ash product, (3) Compare the performance of the heavy agglomerant process based on diesel and the light agglomerant process using heptane, (4) Define optimum processing conditions for engineering design, (5) Provide first-level evaluation of product handleability, and (6) Explore and investigate process options/ideas which may enhance process performance and/or product handleability.

  17. Comparing field investigations with laboratory models to predict landfill leachate emissions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fellner, Johann; Doeberl, Gernot; Allgaier, Gerhard

    2009-06-15

    Investigations into laboratory reactors and landfills are used for simulating and predicting emissions from municipal solid waste landfills. We examined water flow and solute transport through the same waste body at different volumetric scales (laboratory experiment: 0.08 m³; landfill: 80,000 m³), and assessed the differences in water flow and in leachate emissions of chloride, total organic carbon, and Kjeldahl nitrogen. The results indicate that, due to preferential pathways, the flow of water in field-scale landfills is less uniform than in laboratory reactors. Based on tracer experiments, it can be discerned that in laboratory-scale experiments around 40% of pore water participates in advective solute transport, whereas this fraction amounts to less than 0.2% in the investigated full-scale landfill. Consequences of the difference in water flow and moisture distribution are: (1) leachate emissions from full-scale landfills decrease faster than predicted by laboratory experiments, and (2) the stock of materials remaining in the landfill body, and thus the long-term emission potential, is likely to be underestimated by laboratory landfill simulations.

  18. The Multi-Feature Hypothesis: Connectionist Guidelines for L2 Task Design

    ERIC Educational Resources Information Center

    Moonen, Machteld; de Graaff, Rick; Westhoff, Gerard; Brekelmans, Mieke

    2014-01-01

    This study focuses on the effects of task type on the retention and ease of activation of second language (L2) vocabulary, based on the multi-feature hypothesis (Moonen, De Graaff, & Westhoff, 2006). Two tasks were compared: a writing task and a list-learning task. It was hypothesized that performing the writing task would yield higher…

  19. Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism,

    DTIC Science & Technology

    1985-10-07

    [OCR-garbled DTIC report front matter. Recoverable information: "Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism," G. Agha et al., Artificial Intelligence Laboratory, Massachusetts Institute of Technology, 545 Technology Square, Cambridge, MA.]

  20. Laboratory Experiments On Continually Forced 2d Turbulence

    NASA Astrophysics Data System (ADS)

    Wells, M. G.; Clercx, H. J. H.; Van Heijst, G. J. F.

    There has been much recent interest in the advection of tracers by 2D turbulence in geophysical flows. While there is a large body of literature on decaying 2D turbulence and on forced 2D turbulence in unbounded domains, there have been very few studies of forced turbulence in bounded domains. In this study we present new experimental results from a continuously forced quasi-2D turbulent field. The experiments are performed in a square Perspex tank filled with water. The flow is made quasi-2D by a steady background rotation. The rotation rate of the tank has a small (<8%) sinusoidal perturbation which leads to the periodic formation of eddies in the corners of the tank. When the oscillation period of the perturbation is greater than an eddy roll-up time-scale, dipole structures are observed to form. The dipoles can migrate away from the walls, and the interior of the tank is continually filled with vortices. From experimental visualizations, the length scale of the vortices appears to be largely controlled by the initial formation mechanism, and large-scale structures are not observed to form at large times. Thus the experiments provide a simple way of creating a continuously forced 2D turbulent field. The resulting structures are in contrast with most previous laboratory experiments on 2D turbulence, which have investigated decaying turbulence and have observed the formation of large-scale structure. In those experiments, decaying turbulence had been produced by a variety of methods, such as decay in the wake of a comb of rods (Massen et al. 1999), organization of vortices in thin conducting liquids (Cardoso et al. 1994), or in rotating systems subject to sudden changes in angular rotation rate (Konijnenberg et al. 1998). Results of dye visualizations, particle tracking experiments, and a direct numerical simulation will be presented and discussed in terms of their oceanographic application. Bibliography: Cardoso, O., Marteau, D. & Tabeling, P

  1. Dissociation in the laboratory: a comparison of strategies.

    PubMed

    Leonard, K N; Telch, M J; Harrington, P J

    1999-01-01

    Several methods for inducing dissociation in the laboratory were examined in a sample of 78 undergraduate students. Participants scoring high or low on the Dissociative Experiences Scale participated in three dissociation challenge conditions: (a) dot-staring task, (b) administration of pulsed photic and audio stimulation and (c) stimulus deprivation. Participants recorded their dissociative experiences both before and after each of the three challenge conditions. Across conditions, high DES participants reported significantly more dissociative sensations than low DES participants, even after controlling for pre-challenge dissociation. Moreover, regardless of DES status, pulsed photic and audio stimulation produced the greatest level of dissociative symptoms. The findings suggest that the induction of dissociative symptoms in a nonclinical sample is easily accomplished in the laboratory and that those who report more dissociative symptoms in their day-to-day life exhibit more pronounced dissociative symptoms when undergoing dissociative challenge in the laboratory. Implications for the study and treatment of dissociative symptoms are discussed.

  2. Munition Burial by Local Scour and Sandwaves: large-scale laboratory experiments

    NASA Astrophysics Data System (ADS)

    Garcia, M. H.

    2017-12-01

    Our effort has been the direct observation and monitoring of the burial process of munitions induced by the combined action of waves, currents, and pure oscillatory flows. The experimental conditions have made it possible to observe the burial process due both to local scour around model munitions and to the passage of sandwaves. One experimental facility is the Large Oscillating Water Sediment Tunnel (LOWST), constructed with DURIP support. LOWST can reproduce field-like conditions near the sea bed. The second facility is a multipurpose wave-current flume which is 4 feet (1.20 m) deep, 6 feet (1.8 m) wide, and 161 feet (49.2 m) long. More than two hundred experiments were carried out in the wave-current flume. The main task completed within this effort has been the characterization of the burial process induced by local scour as well as in the presence of dynamic sandwaves with superimposed ripples. It is found that the burial of a finite-length model munition (cylinder) is determined by local scour around the cylinder and by a more global process associated with the formation and evolution of sandwaves having superimposed ripples on them. Depending on the ratio of the amplitude of these features to the body's diameter (D), a model munition can progressively become partially or totally buried as such bedforms migrate. Analysis of the experimental data indicates that existing semi-empirical formulae for predicting equilibrium burial depth, the geometry of the scour hole around a cylinder, and time-scales, developed for pipelines, are not suitable for the case of a cylinder of finite length. Relative burial depth (Bd/D) is found to be mainly a function of two parameters: the Keulegan-Carpenter number, KC, and the Shields parameter, θ. Munition burial under either waves or combined flow is influenced by two different processes. One is related to the local scour around the object, which takes place within the first few hundred minutes of flow action (i.e. short
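    The two governing parameters named in the abstract have standard definitions in oscillatory-flow sediment transport. A minimal sketch of how they are computed (the velocity, period, grain size, and shear stress values below are illustrative assumptions, not data from these experiments):

    ```python
    # KC = U_m * T / D : Keulegan-Carpenter number for a cylinder of diameter D
    #   in oscillatory flow with velocity amplitude U_m and period T.
    # theta = tau_b / ((rho_s - rho) * g * d50) : Shields parameter, i.e. the
    #   dimensionless bed shear stress for sediment of median grain size d50.

    def keulegan_carpenter(u_m, period, diameter):
        """Keulegan-Carpenter number KC = U_m * T / D (all SI units)."""
        return u_m * period / diameter

    def shields(tau_b, d50, rho_s=2650.0, rho=1000.0, g=9.81):
        """Shields parameter for bed shear stress tau_b [Pa] and grain size d50 [m]."""
        return tau_b / ((rho_s - rho) * g * d50)

    # Hypothetical wave and sediment conditions (not the study's values).
    kc = keulegan_carpenter(u_m=0.5, period=8.0, diameter=0.1)  # -> 40.0
    th = shields(tau_b=1.0, d50=2e-4)                           # ~0.309
    print(kc, th)
    ```

    Larger KC means the orbital excursion is long relative to the cylinder diameter, which changes the scour regime; θ above its critical value indicates a mobile bed.
    
    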

  3. Laboratory quality improvement in Tanzania.

    PubMed

    Andiric, Linda R; Massambu, Charles G

    2015-04-01

    The article describes the implementation and improvement in the first groups of medical laboratories in Tanzania selected to participate in the training program on Strengthening Laboratory Management Toward Accreditation (SLMTA). As in many other African nations, the selected improvement plan consisted of formalized hands-on training (SLMTA) that teaches the tasks and skills of laboratory management and provides the tools for implementation of best laboratory practice. Implementation of the improvements learned during training was verified before and after SLMTA with the World Health Organization African Region Stepwise Laboratory Improvement Process Towards Accreditation checklist. During a 4-year period, the selected laboratories described in this article demonstrated improvement with a range of 2% to 203% (cohort I) and 12% to 243% (cohort II) over baseline scores. The article describes the progress made in Tanzania's first cohorts, the obstacles encountered, and the lessons learned during the pilot and subsequent implementations. Copyright© by the American Society for Clinical Pathology.

  4. Acoustic Emission Patterns and the Transition to Ductility in Sub-Micron Scale Laboratory Earthquakes

    NASA Astrophysics Data System (ADS)

    Ghaffari, H.; Xia, K.; Young, R.

    2013-12-01

    We report observation of a transition from the brittle to ductile regime in precursor events from different rock materials (Granite, Sandstone, Basalt, and Gypsum) and Polymers (PMMA, PTFE and CR-39). Acoustic emission patterns associated with sub-micron scale laboratory earthquakes are mapped into network parameter spaces (functional damage networks). The sub-classes hold nearly constant timescales, indicating dependency of the sub-phases on the mechanism governing the previous evolutionary phase, i.e., deformation and failure of asperities. Based on our findings, we propose that the signature of the non-linear elastic zone around a crack tip is mapped into the details of the evolutionary phases, supporting the formation of a strongly weak zone in the vicinity of crack tips. Moreover, we recognize sub-micron to micron ruptures with signatures of 'stiffening' in the deformation phase of acoustic-waveforms. We propose that the latter rupture fronts carry critical rupture extensions, including possible dislocations faster than the shear wave speed. Using 'template super-shear waveforms' and their network characteristics, we show that the acoustic emission signals are possible super-shear or intersonic events. Ref. [1] Ghaffari, H. O., and R. P. Young. "Acoustic-Friction Networks and the Evolution of Precursor Rupture Fronts in Laboratory Earthquakes." Nature Scientific reports 3 (2013). [2] Xia, Kaiwen, Ares J. Rosakis, and Hiroo Kanamori. "Laboratory earthquakes: The sub-Rayleigh-to-supershear rupture transition." Science 303.5665 (2004): 1859-1861. [3] Mello, M., et al. "Identifying the unique ground motion signatures of supershear earthquakes: Theory and experiments." Tectonophysics 493.3 (2010): 297-326. [4] Gumbsch, Peter, and Huajian Gao. "Dislocations faster than the speed of sound." Science 283.5404 (1999): 965-968. [5] Livne, Ariel, et al. "The near-tip fields of fast cracks." Science 327.5971 (2010): 1359-1363. [6] Rycroft, Chris H., and Eran Bouchbinder

  5. WAG 2 remedial investigation and site investigation site-specific work plan/health and safety checklist for the soil and sediment task. Environmental Restoration Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holt, V.L.; Burgoa, B.B.

    1993-12-01

    This document is a site-specific work plan/health and safety checklist (WP/HSC) for a task of the Waste Area Grouping 2 Remedial Investigation and Site Investigation (WAG 2 RI&SI). Title 29 CFR Part 1910.120 requires that a health and safety program plan that includes site- and task-specific information be completed to ensure conformance with health- and safety-related requirements. To meet this requirement, the health and safety program plan for each WAG 2 RI&SI field task must include (1) the general health and safety program plan for all WAG 2 RI&SI field activities and (2) a WP/HSC for that particular field task. These two components, along with all applicable referenced procedures, must be kept together at the work site and distributed to field personnel as required. The general health and safety program plan is the Health and Safety Plan for the Remedial Investigation and Site Investigation of Waste Area Grouping 2 at the Oak Ridge National Laboratory, Oak Ridge, Tennessee (ORNL/ER-169). The WP/HSCs are being issued as supplements to ORNL/ER-169.

  6. The effect of single-task and dual-task balance exercise programs on balance performance in adults with osteoporosis: a randomized controlled preliminary trial.

    PubMed

    Konak, H E; Kibar, S; Ergin, E S

    2016-11-01

    Osteoporosis is a serious disease characterized by muscle weakness in the lower extremities, shortened trunk length, and increased dorsal kyphosis leading to poor balance performance. Although balance impairment increases in adults with osteoporosis, falls and fall-related injuries have been shown to occur mainly during dual-task performance. Several studies have shown that dual-task performance was improved with specific repetitive dual-task exercises. The aims of this study were to compare the effect of single- and dual-task balance exercise programs on static balance, dynamic balance, and activity-specific balance confidence in adults with osteoporosis and to assess the effectiveness of dual-task balance training on gait speed under dual-task conditions. Older adults (N = 42) (age range, 45-88 years) with osteoporosis were randomly assigned into two groups. The single-task balance training group was given single-task balance exercises for 4 weeks, whereas the dual-task balance training group received dual-task balance exercises. Participants received a 45-min individualized training session, three times a week. Static balance was evaluated by one-leg stance (OLS) and a kinesthetic ability trainer (KAT) device. Dynamic balance was measured by the Berg Balance Scale (BBS), Timed Up and Go (TUG) test, and gait speed. Self-confidence was assessed with the Activities-specific Balance Confidence (ABC-6) scale. Assessments were performed at baseline and after the 4-week program. At the end of the treatment periods, KAT score, BBS score, time in OLS and TUG, gait speeds under single- and dual-task conditions, and ABC-6 scale scores improved significantly in all patients (p < 0.05). However, BBS and gait speeds under single- and dual-task conditions showed significantly greater improvement in the dual-task balance training group than in the single-task balance training group (p < 0.05). ABC-6 scale scores improved more in the single-task balance training group than

  7. Development of L2 Interactional Resources for Online Collaborative Task Accomplishment

    ERIC Educational Resources Information Center

    Balaman, Ufuk; Sert, Olcay

    2017-01-01

    Technology-mediated task environments have long been considered integral parts of L2 learning and teaching processes. However, the interactional resources that the learners deploy to complete tasks in these environments have remained largely unexplored due to an overall focus on task design and outcomes rather than task engagement processes. With…

  8. "One Task Fits All"? The Roles of Task Complexity, Modality, and Working Memory Capacity in L2 Performance

    ERIC Educational Resources Information Center

    Zalbidea, Janire

    2017-01-01

    The present study explores the independent and interactive effects of task complexity and task modality on linguistic dimensions of second language (L2) performance and investigates how these effects are modulated by individual differences in working memory capacity. Thirty-two intermediate learners of L2 Spanish completed less and more complex…

  9. A classroom activity and laboratory on astronomical scale

    NASA Astrophysics Data System (ADS)

    LoPresto, Michael

    2017-10-01

    The four basics "scales" at which astronomy is studied, that of (1) the Earth-Moon system, (2) the solar system, (3) the galaxy, and (4) the universe (Fig. 1), are a common place to start an intro astronomy course. In fact, courses and textbooks are often divided into approximately four sections based on these scales.

  10. SCALE Code System 6.2.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.

  11. The Effects of Cognitive Task Complexity on L2 Oral Production

    ERIC Educational Resources Information Center

    Levkina, Mayya; Gilabert, Roger

    2012-01-01

    This paper examines the impact of task complexity on L2 production. The study increases task complexity by progressively removing pre-task planning time and increasing the number of elements. The combined effects of manipulating these two variables of task complexity simultaneously are also analyzed. Using a repeated measures design, 42…

  12. Scaling of metabolic rate on body mass in small laboratory mammals

    NASA Technical Reports Server (NTRS)

    Pace, N.; Rahlmann, D. F.; Smith, A. H.

    1980-01-01

    The scaling of metabolic heat production rate on body mass is investigated for five species of small laboratory mammal in order to select animals with metabolic rates and a size range appropriate for measuring changes in the scaling relationship upon exposure to weightlessness in a Shuttle/Spacelab experiment. Metabolic rates were measured from oxygen consumption and carbon dioxide production for individual male and female Swiss-Webster mice, Syrian hamsters, Simonsen albino rats, Hartley guinea pigs, and New Zealand white rabbits, which range from 0.05 to 5 kg in mature body mass, at ages of 1, 2, 3, 5, 8, 12, 18 and 24 months. The metabolic intensity, defined as the heat produced per hour per kg body mass, is found to decrease dramatically with age until the animals are 6 to 8 months old, with little or no sex difference. When plotted on a logarithmic graph, the relation of metabolic rate to total body mass is found to obey a power law of index 0.676, which differs significantly from the classical value of 0.75. When the values for the mice are removed, however, an index of 0.749 is obtained. It is thus proposed that six male animals, 8 months of age, of each of the four remaining species be used to study the effects of gravitational loading on the metabolic energy requirements of terrestrial animals.
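    The "index" reported above is the exponent b of an allometric power law, rate = a·massᵇ, which appears as the slope of a least-squares line on a log-log plot. A minimal sketch of that fit, using synthetic data generated with the classical exponent 0.75 rather than the study's measurements:

    ```python
    import numpy as np

    def fit_power_law(mass, rate):
        """Fit rate = a * mass**b by least squares in log-log space; return (a, b)."""
        # polyfit returns coefficients highest degree first: (slope, intercept)
        b, log_a = np.polyfit(np.log(mass), np.log(rate), 1)
        return np.exp(log_a), b

    # Synthetic data following rate = 3.5 * mass**0.75 exactly (a = 3.5 is an
    # arbitrary illustrative coefficient, not a value from the paper).
    mass = np.array([0.05, 0.1, 0.5, 1.0, 5.0])  # kg, roughly mouse to rabbit
    rate = 3.5 * mass ** 0.75
    a, b = fit_power_law(mass, rate)
    print(round(b, 3))  # -> 0.75
    ```

    With real, noisy measurements the recovered exponent would carry a confidence interval, which is how an index of 0.676 can be judged significantly different from 0.75.
    
    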

  13. WAG 2 remedial investigation and site investigation site-specific work plan/health and safety checklist for the sediment transport modeling task

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holt, V.L.; Baron, L.A.

    1994-05-01

    This site-specific Work Plan/Health and Safety Checklist (WP/HSC) is a supplement to the general health and safety plan (HASP) for Waste Area Grouping (WAG) 2 remedial investigation and site investigation (WAG 2 RI&SI) activities [Health and Safety Plan for the Remedial Investigation and Site Investigation of Waste Area Grouping 2 at the Oak Ridge National Laboratory, Oak Ridge, Tennessee (ORNL/ER-169)] and provides specific details and requirements for the WAG 2 RI&SI Sediment Transport Modeling Task. This WP/HSC identifies specific site operations, site hazards, and any recommendations by Oak Ridge National Laboratory (ORNL) health and safety organizations [i.e., Industrial Hygiene (IH), Health Physics (HP), and/or Industrial Safety] that would contribute to the safe completion of the WAG 2 RI&SI. Together, the general HASP for the WAG 2 RI&SI (ORNL/ER-169) and the completed site-specific WP/HSC meet the health and safety planning requirements specified by 29 CFR 1910.120 and the ORNL Hazardous Waste Operations and Emergency Response (HAZWOPER) Program Manual. In addition to the health and safety information provided in the general HASP for the WAG 2 RI&SI, details concerning the site-specific task are elaborated in this site-specific WP/HSC, and both documents, as well as all pertinent procedures referenced therein, will be reviewed by all field personnel prior to beginning operations.

  14. Non-destructive evaluation of laboratory scale hydraulic fracturing using acoustic emission

    NASA Astrophysics Data System (ADS)

    Hampton, Jesse Clay

    The primary objective of this research is to develop techniques to characterize hydraulic fractures and fracturing processes using acoustic emission monitoring based on laboratory scale hydraulic fracturing experiments. Individual microcrack AE source characterization is performed to understand the failure mechanisms associated with small failures along pre-existing discontinuities and grain boundaries. Individual microcrack analysis methods include moment tensor inversion techniques to elucidate the mode of failure, crack slip and crack normal direction vectors, and relative volumetric deformation of an individual microcrack. Differentiation between individual microcrack analysis and AE cloud based techniques is studied in efforts to refine discrete fracture network (DFN) creation and regional damage quantification of densely fractured media. Regional damage estimations from combinations of individual microcrack analyses and AE cloud density plotting are used to investigate the usefulness of weighting cloud based AE analysis techniques with microcrack source data. Two granite types were used in several sample configurations including multi-block systems. Laboratory hydraulic fracturing was performed with sample sizes ranging from 15 × 15 × 25 cm³ to 30 × 30 × 25 cm³ in both unconfined and true-triaxially confined stress states using different types of materials. Hydraulic fracture testing in rock block systems containing a large natural fracture was investigated in terms of AE response throughout fracture interactions. Investigations of differing scale analyses showed the usefulness of individual microcrack characterization as well as DFN and cloud based techniques. Individual microcrack characterization weighting cloud based techniques correlated well with post-test damage evaluations.

  15. Hypercube matrix computation task

    NASA Technical Reports Server (NTRS)

    Calalo, R.; Imbriale, W.; Liewer, P.; Lyons, J.; Manshadi, F.; Patterson, J.

    1987-01-01

    The Hypercube Matrix Computation (Year 1986-1987) task investigated the applicability of a parallel computing architecture to the solution of large scale electromagnetic scattering problems. Two existing electromagnetic scattering codes were selected for conversion to the Mark III Hypercube concurrent computing environment. They were selected so that the underlying numerical algorithms utilized would be different, thereby providing a more thorough evaluation of the appropriateness of the parallel environment for these types of problems. The first code was a frequency domain method of moments solution, NEC-2, developed at Lawrence Livermore National Laboratory. The second code was a time domain finite difference solution of Maxwell's equations to solve for the scattered fields. Once the codes were implemented on the hypercube and verified to obtain correct solutions by comparing the results with those from sequential runs, several measures were used to evaluate the performance of the two codes. First, a comparison was provided of the problem size possible on the hypercube with 128 megabytes of memory for a 32-node configuration with that available in a typical sequential user environment of 4 to 8 megabytes. Then, the performance of the codes was analyzed for the computational speedup attained by the parallel architecture.

  16. Task Parallel Incomplete Cholesky Factorization using 2D Partitioned-Block Layout

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Kyungjoo; Rajamanickam, Sivasankaran; Stelle, George Widgery

    We introduce a task-parallel algorithm for sparse incomplete Cholesky factorization that utilizes a 2D sparse partitioned-block layout of a matrix. Our factorization algorithm follows the idea of algorithms-by-blocks by using the block layout. The algorithms-by-blocks approach induces a task graph for the factorization. These tasks are inter-related to each other through their data dependences in the factorization algorithm. To process the tasks on various manycore architectures in a portable manner, we also present a portable tasking API that incorporates different tasking backends and device-specific features using an open-source framework for manycore platforms, i.e., Kokkos. A performance evaluation is presented on both Intel Sandybridge and Xeon Phi platforms for matrices from the University of Florida sparse matrix collection to illustrate the merits of the proposed task-based factorization. Experimental results demonstrate that our task-parallel implementation delivers about 26.6x speedup (geometric mean) over single-threaded incomplete Cholesky-by-blocks and 19.2x speedup over serial Cholesky performance, which does not carry tasking overhead, using 56 threads on the Intel Xeon Phi processor for sparse matrices arising from various application problems.
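    In the algorithms-by-blocks formulation, the matrix is partitioned into blocks and the factorization is written as block-level operations (factor, triangular solve, trailing update) whose data dependences form the task graph. A minimal dense, sequential sketch of that block structure follows; the paper's algorithm is sparse, incomplete, and dispatches these block operations as concurrent tasks through Kokkos, none of which this toy version attempts:

    ```python
    import numpy as np

    def cholesky_by_blocks(A, nb):
        """Right-looking blocked Cholesky; returns lower-triangular L with A = L @ L.T.

        Each block operation below is one 'task' in the algorithms-by-blocks
        view; tasks on disjoint blocks could run concurrently once their
        inputs are ready.
        """
        A = A.astype(float).copy()
        n = A.shape[0]
        for k in range(0, n, nb):
            ke = min(k + nb, n)
            # POTRF task: factor the diagonal block in place.
            A[k:ke, k:ke] = np.linalg.cholesky(A[k:ke, k:ke])
            if ke < n:
                # TRSM tasks: solve L_ik @ L_kk.T = A_ik for the panel rows.
                A[ke:, k:ke] = np.linalg.solve(A[k:ke, k:ke], A[ke:, k:ke].T).T
                # SYRK/GEMM tasks: rank-nb update of the trailing submatrix.
                A[ke:, ke:] -= A[ke:, k:ke] @ A[ke:, k:ke].T
        return np.tril(A)

    rng = np.random.default_rng(0)
    M = rng.standard_normal((6, 6))
    A = M @ M.T + 6 * np.eye(6)      # symmetric positive definite test matrix
    L = cholesky_by_blocks(A, nb=2)
    print(np.allclose(L @ L.T, A))   # -> True
    ```

    The dependence pattern is visible in the indices: each TRSM on panel block (i, k) needs the POTRF of block (k, k), and each trailing update of block (i, j) needs the panel blocks from step k, which is exactly the task graph the abstract describes.
    
    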

  17. Deep ART Neural Model for Biologically Inspired Episodic Memory and Its Application to Task Performance of Robots.

    PubMed

    Park, Gyeong-Moon; Yoo, Yong-Ho; Kim, Deok-Hwa; Kim, Jong-Hwan

    2018-06-01

    Robots are expected to perform smart services and to undertake various troublesome or difficult tasks in the place of humans. Since these human-scale tasks consist of a temporal sequence of events, robots need episodic memory to store and retrieve the sequences to perform the tasks autonomously in similar situations. As episodic memory, in this paper we propose a novel Deep adaptive resonance theory (ART) neural model and apply it to the task performance of the humanoid robot, Mybot, developed in the Robot Intelligence Technology Laboratory at KAIST. Deep ART has a deep structure to learn events, episodes, and even more like daily episodes. Moreover, it can retrieve the correct episode from partial input cues robustly. To demonstrate the effectiveness and applicability of the proposed Deep ART, experiments are conducted with the humanoid robot, Mybot, for performing the three tasks of arranging toys, making cereal, and disposing of garbage.

  18. Effect of Feed Composition on Cold-Cap Formation in Laboratory-Scale Melter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dixon, Derek R.; Schweiger, Michael J.; Lee, Seung Min

    The development of advanced glass formulations is part of the plan for reducing the cost and time for treatment and vitrification of the 210,000 m³ of nuclear waste at the Hanford Site in southeastern Washington State. One property of interest in this development is melt viscosity, which has a decisive influence on the rate of glass production. In an electric melter, the feed-to-glass conversion process above the melt pool occurs in the cold cap. At the final stage of conversion, when the glass-forming melt becomes connected, gas-evolving reactions cause foaming. The melt viscosity affects foam stability. Three glasses were formulated with viscosities of 1.5, 3.5, and 9.5 Pa·s at 1150°C by varying the SiO2 content at the expense of B2O3, Li2O, and Na2O, which were kept at constant proportions. Cold caps were produced by charging simulated high-alumina, high-level waste feeds in a laboratory-scale melter (LSM). The spread of the feed on the cold cap during charging and the cross-sectional structure of the final cold caps were compared. The amount of foam and the size of the bubbles increased as the viscosity increased.

  19. User Needs, Benefits, and Integration of Robotic Systems in a Space Station Laboratory

    NASA Technical Reports Server (NTRS)

    Dodd, W. R.; Badgley, M. B.; Konkel, C. R.

    1989-01-01

    The methodology, results and conclusions of all tasks of the User Needs, Benefits, and Integration Study (UNBIS) of Robotic Systems in a Space Station Laboratory are summarized. Study goals included the determination of user requirements for robotics within the Space Station, United States Laboratory. In Task 1, three experiments were selected to determine user needs and to allow detailed investigation of microgravity requirements. In Task 2, a NASTRAN analysis of Space Station response to robotic disturbances, and acceleration measurement of a standard industrial robot (Intelledex Model 660) resulted in selection of two ranges of microgravity manipulation: Level 1 (10⁻³ to 10⁻⁵ G at greater than 1 Hz) and Level 2 (≤10⁻⁶ G at 0.1 Hz). This task included an evaluation of microstepping methods for controlling stepper motors and concluded that an industrial robot actuator can perform milli-G motion without modification. Relative merits of end-effectors and manipulators were studied in Task 3 in order to determine their ability to perform a range of tasks related to the three microgravity experiments. An Effectivity Rating was established for evaluating these robotic system capabilities. Preliminary interface requirements for an orbital flight demonstration were determined in Task 4. Task 5 assessed the impact of robotics.

  20. Predicting dual-task performance with the Multiple Resources Questionnaire (MRQ).

    PubMed

    Boles, David B; Bursk, Jonathan H; Phillips, Jeffrey B; Perdelwitz, Jason R

    2007-02-01

    The objective was to assess the validity of the Multiple Resources Questionnaire (MRQ) in predicting dual-task interference. Subjective workload measures such as the Subjective Workload Assessment Technique (SWAT) and NASA Task Load Index are sensitive to single-task parameters and dual-task loads but have not attempted to measure workload in particular mental processes. An alternative is the MRQ. In Experiment 1, participants completed simple laboratory tasks and the MRQ after each. Interference between tasks was then correlated to three different task similarity metrics: profile similarity, based on r² between ratings; overlap similarity, based on summed minima; and overall demand, based on summed ratings. Experiment 2 used similar methods but more complex computer-based games. In Experiment 1 the MRQ moderately predicted interference (r = +.37), with no significant difference between metrics. In Experiment 2 the metric effect was significant, with overlap similarity excelling in predicting interference (r = +.83). Mean ratings showed high diagnosticity in identifying specific mental processing bottlenecks. The MRQ shows considerable promise as a cognitive-process-sensitive workload measure. Potential applications of the MRQ include the identification of dual-processing bottlenecks as well as process overloads in single tasks, preparatory to redesign in areas such as air traffic management, advanced flight displays, and medical imaging.
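The three similarity metrics are simple functions of two tasks' MRQ rating vectors; a minimal sketch, where the five-dimension ratings are invented for illustration and the metric definitions follow the abstract's one-line descriptions:

```python
import numpy as np

def profile_similarity(a, b):
    """Profile similarity: squared Pearson correlation (r^2) between
    the two tasks' resource-demand rating vectors."""
    r = np.corrcoef(a, b)[0, 1]
    return r ** 2

def overlap_similarity(a, b):
    """Overlap similarity: sum of the element-wise minima of the ratings."""
    return float(np.minimum(a, b).sum())

def overall_demand(a, b):
    """Overall demand: sum of all ratings across both tasks."""
    return float(a.sum() + b.sum())

# Hypothetical ratings for two tasks over five resource dimensions
task_a = np.array([4.0, 0.0, 3.0, 1.0, 2.0])
task_b = np.array([3.0, 1.0, 2.0, 0.0, 4.0])
print(profile_similarity(task_a, task_b))  # ~0.36 for these ratings
print(overlap_similarity(task_a, task_b))  # 7.0
print(overall_demand(task_a, task_b))      # 20.0
```

Each metric is then correlated against observed dual-task interference across task pairings, which is how the r = +.37 and r = +.83 figures above were obtained.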

  1. Kennedy's Biomedical Laboratory Makes Multi-Tasking Look Easy

    NASA Technical Reports Server (NTRS)

    Dunn, Carol Anne

    2009-01-01

    If there is one thing that Florida has in abundance, it is sunshine, and with that sunshine come heat and humidity. For workers at the Kennedy Space Center who have to work outside in the heat and humidity, heat exhaustion/stroke is a real possibility. It might help people to know that Kennedy's Biomedical Laboratory has been testing some new Koolvests(TM) that can be worn underneath SCAPE suits. They have also been working on how to block out high noise levels; in fact, Don Doerr, chief of the Biomedical Lab, says, "The most enjoyable aspect is knowing that the Biomedical Lab and the skills of its employees have been used to support safe space flight, not only for the astronaut flight crew, but just as important for the ground processing personnel as well." The NASA Biomedical Laboratory has existed in Kennedy's Operations and Checkout Building since the Apollo Program. The primary mission of this laboratory has been biomedical support to major manned space programs, including Apollo, Apollo-Soyuz, Skylab, and Shuttle. In this mission, the laboratory has been responsible for accomplishing much of the technical design, planning, provision, fabrication, and maintenance of flight and ground biomedical monitoring instrumentation. This includes the electronics in the launch flight suit and similar instrumentation systems in the spacecraft. (Note: The Lab checked out the system for STS-128 at Pad A using Firing Room 4 and ground support equipment in the lab.) During Apollo, there were six engineers and ten technicians in the facility. This has evolved today to two NASA engineers and two NASA technicians, a Life Science Support contract physiologist, and part-time support from an LSSC nurse and physician. Over the years, the lab has enjoyed collaboration with outside agencies and investigators. 
These have included on-site support to the Ames Research Center bed rest studies (seven years) and the European Space Agency studies in Toulouse, France (two

  2. Automating a Detailed Cognitive Task Analysis for Structuring Curriculum

    DTIC Science & Technology

    1991-08-01

    Title: Automating a Detailed Cognitive Task Analysis for Structuring Curriculum. Activities: To date we have completed task...The Institute for Management Sciences. Although the particular application of the modified GOMS cognitive task analysis technique under development is...Laboratories. Automating a Detailed Cognitive Task Analysis for Structuring Curriculum: Research Plan, Year 1, Task 1.0 Design, Task 1.1 Conduct...

  3. Deicer scaling resistance of concrete mixtures containing slag cement. Phase 2 : evaluation of different laboratory scaling test methods.

    DOT National Transportation Integrated Search

    2012-07-01

    With the use of supplementary cementing materials (SCMs) in concrete mixtures, salt scaling tests such as ASTM C672 have been found to be overly aggressive and do not correlate well with field scaling performance. The reasons for this are thought to be b...

  4. Comparative Cognitive Task Analyses of Experimental Science and Instructional Laboratory Courses

    NASA Astrophysics Data System (ADS)

    Wieman, Carl

    2015-09-01

    Undergraduate instructional labs in physics generate intense opinions. Their advocates are passionate as to their importance for teaching physics as an experimental activity and providing "hands-on" learning experiences, while their detractors (often but not entirely students) offer harsh criticisms that they are pointless, confusing and unsatisfying, and "cookbook." Here, both to help understand the reason for such discrepant views and to aid in the design of instructional lab courses, I compare the mental tasks or types of thinking ("cognitive task analysis") associated with a physicist doing tabletop experimental research with the cognitive tasks of students in an introductory physics instructional lab involving traditional verification/confirmation exercises.

  5. Laboratory and theoretical models of planetary-scale instabilities and waves

    NASA Technical Reports Server (NTRS)

    Hart, John E.; Toomre, Juri

    1990-01-01

    Meteorologists and planetary astronomers interested in large-scale planetary and solar circulations recognize the importance of rotation and stratification in determining the character of these flows. In the past it has been impossible to accurately model the effects of sphericity on these motions in the laboratory because of the invariant relationship between the uni-directional terrestrial gravity and the rotation axis of an experiment. Researchers studied motions of rotating convecting liquids in spherical shells using electrohydrodynamic polarization forces to generate radial gravity, and hence centrally directed buoyancy forces, in the laboratory. The Geophysical Fluid Flow Cell (GFFC) experiments performed on Spacelab 3 in 1985 were analyzed. Recent efforts at interpretation led to numerical models of rotating convection with an aim to understand the possible generation of zonal banding on Jupiter and the fate of banana cells in rapidly rotating convection as the heating is made strongly supercritical. In addition, efforts to pose baroclinic wave experiments for future space missions using a modified version of the 1985 instrument led to theoretical and numerical models of baroclinic instability. Rather surprising properties were discovered, which may be useful in generating rational (rather than artificially truncated) models for nonlinear baroclinic instability and baroclinic chaos.

  6. Scaling of Sediment Dynamics in a Reach-Scale Laboratory Model of a Sand-Bed Stream with Riparian Vegetation

    NASA Astrophysics Data System (ADS)

    Gorrick, S.; Rodriguez, J. F.

    2011-12-01

    A movable bed physical model was designed in a laboratory flume to simulate both bed and suspended load transport in a mildly sinuous sand-bed stream. Model simulations investigated the impact of different vegetation arrangements along the outer bank to evaluate rehabilitation options. Preserving similitude in the 1:16 laboratory model was very important. In this presentation the scaling approach, as well as the successes and challenges of the strategy are outlined. Firstly a near-bankfull flow event was chosen for laboratory simulation. In nature, bankfull events at the field site deposit new in-channel features but cause only small amounts of bank erosion. Thus the fixed banks in the model were not a drastic simplification. Next, and as in other studies, the flow velocity and turbulence measurements were collected in separate fixed bed experiments. The scaling of flow in these experiments was simply maintained by matching the Froude number and roughness levels. The subsequent movable bed experiments were then conducted under similar hydrodynamic conditions. In nature, the sand-bed stream is fairly typical; in high flows most sediment transport occurs in suspension and migrating dunes cover the bed. To achieve similar dynamics in the model equivalent values of the dimensionless bed shear stress and the particle Reynolds number were important. Close values of the two dimensionless numbers were achieved with lightweight sediments (R=0.3) including coal and apricot pips with a particle size distribution similar to that of the field site. Overall the moveable bed experiments were able to replicate the dominant sediment dynamics present in the stream during a bankfull flow and yielded relevant information for the analysis of the effects of riparian vegetation. There was a potential conflict in the strategy, in that grain roughness was exaggerated with respect to nature. The advantage of this strategy is that although grain roughness is exaggerated, the similarity of
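The scaling strategy described above combines Froude similarity for the flow with matching of the dimensionless bed shear stress (Shields parameter) and particle Reynolds number for the sediment. A sketch of the arithmetic, where the shear stress, grain size, and fluid properties are assumed values for illustration, not numbers from the study:

```python
import math

# Froude scaling for a 1:16 undistorted model (illustrative values only)
Lr = 16.0                                  # prototype/model length ratio
print("velocity ratio:", math.sqrt(Lr))    # U scales with sqrt(L) when Fr is matched
print("time ratio:", math.sqrt(Lr))
print("discharge ratio:", Lr ** 2.5)       # Q scales with L^2.5

def shields_parameter(tau, R, d, g=9.81, rho_w=1000.0):
    """Dimensionless bed shear stress theta = tau / (rho_w * R * g * d),
    with R = (rho_s - rho_w) / rho_w the submerged specific gravity."""
    return tau / (rho_w * R * g * d)

def particle_reynolds(u_star, d, nu=1.0e-6):
    """Particle Reynolds number Re* = u* d / nu."""
    return u_star * d / nu

# Hypothetical model-sediment values: coal-like lightweight grain with R = 0.3
tau = 0.5                            # bed shear stress, Pa
d = 0.0008                           # grain diameter, m
u_star = math.sqrt(tau / 1000.0)     # shear velocity u* = sqrt(tau/rho), m/s
print(shields_parameter(tau, R=0.3, d=d))
print(particle_reynolds(u_star, d))
```

Because lightweight sediment (small R) raises the Shields parameter for a given stress, the model can reach prototype-like transport stages at laboratory scale, which is the rationale behind the coal and apricot-pip sediments mentioned above.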

  7. PWC - PAIRWISE COMPARISON SOFTWARE: SOFTWARE PROGRAM FOR PAIRWISE COMPARISON TASK FOR PSYCHOMETRIC SCALING AND COGNITIVE RESEARCH

    NASA Technical Reports Server (NTRS)

    Ricks, W. R.

    1994-01-01

    PWC is used for pair-wise comparisons in both psychometric scaling techniques and cognitive research. The cognitive tasks and processes of a human operator of automated systems are now prominent considerations when defining system requirements. Recent developments in cognitive research have emphasized the potential utility of psychometric scaling techniques, such as multidimensional scaling, for representing human knowledge and cognitive processing structures. Such techniques involve collecting measurements of stimulus-relatedness from human observers. When data are analyzed using this scaling approach, an n-dimensional representation of the stimuli is produced. This resulting representation is said to describe the subject's cognitive or perceptual view of the stimuli. PWC applies one of the many techniques commonly used to acquire the data necessary for these types of analyses: pair-wise comparisons. PWC administers the task, collects the data from the test subject, and formats the data for analysis. It therefore addresses many of the limitations of the traditional "pen-and-paper" methods. By automating the data collection process, subjects are prevented from going back to check previous responses, the possibility of erroneous data transfer is eliminated, and the burden of the administration and taking of the test is eased. By using randomization, PWC ensures that subjects see the stimuli pairs presented in random order, and that each subject sees pairs in a different random order. PWC is written in Turbo Pascal v6.0 for IBM PC compatible computers running MS-DOS. The program has also been successfully compiled with Turbo Pascal v7.0. A sample executable is provided. PWC requires 30K of RAM for execution. The standard distribution medium for this program is a 5.25 inch 360K MS-DOS format diskette. Two electronic versions of the documentation are included on the diskette: one in ASCII format and one in MS Word for Windows format. PWC was developed in 1993.
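The core of the administration logic, generating every unordered stimulus pair and randomizing presentation order per subject, can be sketched in a few lines. The original PWC was Turbo Pascal; this is a modern re-sketch with hypothetical stimulus labels, not the original code:

```python
import itertools
import random

def pairwise_trials(stimuli, seed=None):
    """All unordered stimulus pairs, shuffled into a random presentation
    order; giving each subject a different seed yields a different order."""
    pairs = list(itertools.combinations(stimuli, 2))
    rng = random.Random(seed)
    rng.shuffle(pairs)
    return pairs

stimuli = ["A", "B", "C", "D"]        # hypothetical stimulus labels
trials = pairwise_trials(stimuli, seed=1)
print(len(trials))                    # n*(n-1)/2 = 6 pairs for 4 stimuli
for left, right in trials:
    # here the program would display the pair, record the subject's
    # relatedness judgment, and write it out for later MDS analysis
    pass
```

Because trials are generated fresh per subject and responses are captured programmatically, subjects cannot revisit earlier answers and no manual data transfer is needed, which is the advantage over pen-and-paper methods noted above.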

  8. Description and Prediction of Age-Related Change in Everyday Task Performance.

    ERIC Educational Resources Information Center

    Marsiske, Michael; Willis, Sherry L.

    Traditionally, assessment of the cognitive competencies of older adults has focused on abstract laboratory tests, which have often seemed quite unlike the demands of tasks encountered in everyday activities. Consequently, external validity of these laboratory tasks has been questioned, and their utility for assessing real-world competence has been…

  9. CO2 geosequestration at the laboratory scale: Combined geophysical and hydromechanical assessment of weakly-cemented shallow Sleipner-like reservoirs

    NASA Astrophysics Data System (ADS)

    Falcon-Suarez, I.; North, L. J.; Best, A. I.

    2017-12-01

    To date, the most promising mitigation strategy for reducing global carbon emissions is Carbon Capture and Storage (CCS). The storage technology (i.e., CO2 geosequestration, CGS) consists of injecting CO2 into deep geological formations specifically selected for such massive-scale storage. To guarantee the mechanical stability of the reservoir during and after injection, it is crucial to improve existing monitoring techniques for controlling CGS activities. We developed a comprehensive experimental program to investigate the integrity of the Sleipner CO2 storage site in the North Sea - the first commercial CCS project in history, where 1 Mt/y of CO2 has been injected since 1996. We assessed hydro-mechanical effects and the related geophysical signatures of three synthetic sandstones and samples from the Utsira Sand formation (the main reservoir at Sleipner), at realistic pressure-temperature (PT) conditions and fluid compositions. Our experimental approach consists of brine-CO2 flow-through tests simulating variable inflation/depletion scenarios, performed in the CGS-rig (Fig. 1; Falcon-Suarez et al., 2017) at the National Oceanography Centre (NOC) in Southampton. The rig is designed for simultaneous monitoring of ultrasonic P- and S-wave velocities and attenuations, electrical resistivity, axial and radial strains, pore pressure and flow, during the co-injection of up to two fluids under controlled PT conditions. Our results show velocity-resistivity and seismic-geomechanical relations of practical importance for the distinction between pore pressure and pore fluid distribution during CGS activities. By combining geophysical and thermo-hydro-mechano-chemical coupled information, we can provide laboratory datasets that complement in situ seismic, geomechanical and electrical survey information, useful for CO2 plume monitoring at the Sleipner site and other shallow weakly-cemented sand CCS reservoirs. Falcon-Suarez, I., Marín-Moreno, H., Browning, F., Lichtschlag, A

  10. Application of simultaneous saccharification and fermentation (SSF) from viscosity reducing of raw sweet potato for bioethanol production at laboratory, pilot and industrial scales.

    PubMed

    Zhang, Liang; Zhao, Hai; Gan, Mingzhe; Jin, Yanlin; Gao, Xiaofeng; Chen, Qian; Guan, Jiafa; Wang, Zhongyan

    2011-03-01

    The aim of this work was to research a bioprocess for bioethanol production from raw sweet potato by Saccharomyces cerevisiae at laboratory, pilot and industrial scales. The fermentation mode, inoculum size and pressure from different gases were determined in the laboratory. The maximum ethanol concentration, average ethanol productivity rate and yield of ethanol after fermentation at laboratory scale (128.51 g/L, 4.76 g/L/h and 91.4%) were satisfactory, with a small decrease at pilot scale (109.06 g/L, 4.89 g/L/h and 91.24%) and industrial scale (97.94 g/L, 4.19 g/L/h and 91.27%). When scaled up, viscosity hindered fermentation; 1.56 AUG/g (sweet potato mash) of xylanase decreased the viscosity from approximately 30,000 to 500 cP. Overall, sweet potato is an attractive feedstock for bioethanol production from both economic and environmental standpoints. Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. Age-related Multiscale Changes in Brain Signal Variability in Pre-task versus Post-task Resting-state EEG.

    PubMed

    Wang, Hongye; McIntosh, Anthony R; Kovacevic, Natasa; Karachalios, Maria; Protzner, Andrea B

    2016-07-01

    Recent empirical work suggests that, during healthy aging, the variability of network dynamics changes during task performance. Such variability appears to reflect the spontaneous formation and dissolution of different functional networks. We sought to extend these observations into resting-state dynamics. We recorded EEG in young, middle-aged, and older adults during a "rest-task-rest" design and investigated if aging modifies the interaction between resting-state activity and external stimulus-induced activity. Using multiscale entropy as our measure of variability, we found that, with increasing age, resting-state dynamics shifts from distributed to more local neural processing, especially at posterior sources. In the young group, resting-state dynamics also changed from pre- to post-task, where fine-scale entropy increased in task-positive regions and coarse-scale entropy increased in the posterior cingulate, a key region associated with the default mode network. Lastly, pre- and post-task resting-state dynamics were linked to performance on the intervening task for all age groups, but this relationship became weaker with increasing age. Our results suggest that age-related changes in resting-state dynamics occur across different spatial and temporal scales and have consequences for information processing capacity.
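Multiscale entropy, the variability measure used above, coarse-grains the signal at successive scales and computes the sample entropy of each coarse-grained series. A simplified sketch follows; note this common variant recomputes the tolerance r from each coarse-grained series' own standard deviation, whereas some implementations fix r from the scale-1 series:

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """SampEn(m, r): negative log of the conditional probability that
    sequences matching for m points also match for m + 1 points,
    with tolerance r = r_frac * std of the input series."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()
    n = len(x)

    def count_matches(length):
        # count ordered template pairs (i != j) within Chebyshev distance r
        templates = np.array([x[i:i + length] for i in range(n - length + 1)])
        count = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates - templates[i]), axis=1)
            count += int(np.sum(d < r)) - 1  # drop the self-match
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, max_scale=5, m=2, r_frac=0.2):
    """Coarse-grain the series at each scale (non-overlapping window
    means) and compute the sample entropy of each coarse series."""
    x = np.asarray(x, dtype=float)
    result = []
    for s in range(1, max_scale + 1):
        n = len(x) // s
        coarse = x[:n * s].reshape(n, s).mean(axis=1)
        result.append(sample_entropy(coarse, m, r_frac))
    return result

rng = np.random.default_rng(0)
print(multiscale_entropy(rng.standard_normal(1000), max_scale=3))
```

Fine-scale entropy corresponds to small s (local, fast dynamics) and coarse-scale entropy to large s (distributed, slow dynamics), which is the distinction the age-related shifts above are expressed in.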

  12. The Subsurface Flow and Transport Laboratory: A New Department of Energy User's Facility for Intermediate-Scale Experimentation

    NASA Astrophysics Data System (ADS)

    Wietsma, T. W.; Oostrom, M.; Foster, N. S.

    2003-12-01

    Intermediate-scale experiments (ISEs) for flow and transport are a valuable tool for simulating subsurface features and conditions encountered in the field at government and private sites. ISEs offer the ability to study, under controlled laboratory conditions, complicated processes characteristic of mixed wastes and heterogeneous subsurface environments, in multiple dimensions and at different scales. ISEs may, therefore, result in major cost savings if employed prior to field studies. A distinct advantage of ISEs is that researchers can design physical and/or chemical heterogeneities in the porous media matrix that better approximate natural field conditions and therefore address research questions that contain the additional complexity of processes often encountered in the natural environment. A new Subsurface Flow and Transport Laboratory (SFTL) has been developed for ISE users in the Environmental Spectroscopy & Biogeochemistry Facility in the Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). The SFTL offers a variety of columns and flow cells, a new state-of-the-art dual-energy gamma system, a fully automated saturation-pressure apparatus, and analytical equipment for sample processing. The new facility, including qualified staff, is available for scientists interested in collaboration on conducting high-quality flow and transport experiments, including contaminant remediation. Close linkages exist between the SFTL and numerical modelers to aid in experimental design and interpretation. This presentation will discuss the facility and outline the procedures required to submit a proposal to use this unique facility for research purposes. The W. R. Wiley Environmental Molecular Sciences Laboratory, a national scientific user facility, is sponsored by the U.S. Department of Energy's Office of Biological and Environmental Research and located at Pacific Northwest National Laboratory.

  13. The Effects of Task Complexity on Heritage and L2 Spanish Development

    ERIC Educational Resources Information Center

    Torres, Julio

    2018-01-01

    Manipulating cognitive demands on second language (L2) tasks, along with the provision of recasts and its effects on L2 development, has motivated recent inquiry within task-based research. However, empirical evidence remains inconclusive as to the impact of task complexity, and it is unknown how it may affect heritage language (HL) development.…

  14. Embedding measurement within existing computerized data systems: scaling clinical laboratory and medical records heart failure data to predict ICU admission.

    PubMed

    Fisher, William P; Burton, Elizabeth C

    2010-01-01

    This study employs existing data sources to develop a new measure of intensive care unit (ICU) admission risk for heart failure patients. Outcome measures were constructed from laboratory, accounting, and medical record data for 973 adult inpatients with primary or secondary heart failure. Several scoring interpretations of the laboratory indicators were evaluated relative to their measurement and predictive properties. Cases were restricted to tests within first lab draw that included at least 15 indicators. After optimizing the original clinical observations, a satisfactory heart failure severity scale was calibrated on a 0-1000 continuum. Patients with unadjusted CHF severity measures of 550 or less were 2.7 times more likely to be admitted to the ICU than those with higher measures. Patients with low HF severity measures (550 or less) adjusted for demographic and diagnostic risk factors are about six times more likely to be admitted to the ICU than those with higher adjusted measures. A nomogram facilitates routine clinical application. Existing computerized data systems could be programmed to automatically structure clinical laboratory reports using the results of studies like this one to reduce data volume with no loss of information, make laboratory results more meaningful to clinical end users, improve the quality of care, reduce errors and unneeded tests, prevent unnecessary ICU admissions, lower costs, and improve patient satisfaction. Existing data typically examined piecemeal form a coherent scale measuring heart failure severity sensitive to increased likelihood of ICU admission. Marked improvements in ROC curves were found for the aggregate measures relative to individual clinical indicators.
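The 0-1000 severity scale is a linear transformation of the underlying calibrated measures, with the 550-point cutoff flagging the elevated-risk group. A minimal sketch; the logit endpoints here are assumptions for illustration, not values from the study:

```python
def to_user_scale(measure, lo=-5.0, hi=5.0, new_lo=0.0, new_hi=1000.0):
    """Linearly map a calibrated (logit) measure in [lo, hi] onto the
    0-1000 reporting scale; the logit endpoints are assumed values."""
    return (measure - lo) / (hi - lo) * (new_hi - new_lo) + new_lo

def elevated_icu_risk(user_scale_measure, cutoff=550.0):
    """Flag the elevated-risk group: severity measures at or below the
    study's unadjusted 550-point cutoff."""
    return user_scale_measure <= cutoff

print(to_user_scale(0.0))        # the midpoint logit maps to 500.0
print(elevated_icu_risk(540.0))  # True
```

A rescaling of this kind preserves the interval properties of the measures while giving clinical end users a round-number range, which is what makes the nomogram application above routine.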

  15. The Psychometric Anatomy of Two Unidimensional Workload Scales

    DTIC Science & Technology

    2004-10-01

    for the subjects to complete. In Task I, the subject takes a deck of shuffled flashcards, each card having one of the scale's descriptors printed on...workload scale. The size of the flashcards is somewhat arbitrary, but 2.5 by 3 inches should accommodate the length of most descriptors. Task I is

  16. Energy efficiency in California laboratory-type facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mills, E.; Bell, G.; Sartor, D.

    The central aim of this project is to provide knowledge and tools for increasing the energy efficiency and performance of new and existing laboratory-type facilities in California. We approach the task along three avenues: (1) identification of current energy use and savings potential, (2) development of a Design Guide for Energy-Efficient Research Laboratories, and (3) development of a research agenda for focused technology development and for improving our understanding of the market. Laboratory-type facilities use a considerable amount of energy resources. They are also important to the local and state economy, and energy costs are a factor in the overall competitiveness of industries utilizing laboratory-type facilities. Although the potential for energy savings is considerable, improving energy efficiency in laboratory-type facilities is no easy task, and there are many formidable barriers to improving energy efficiency in these specialized facilities. Insufficient motivation for individual stakeholders to invest in improving energy efficiency using existing technologies, as well as in conducting related R&D, is indicative of the "public goods" nature of the opportunity to achieve energy savings in this sector. Due to demanding environmental control requirements and specialized processes, laboratory-type facilities epitomize the important intersection between energy demands in the buildings sector and the industrial sector. Moreover, given the high importance and value of the activities conducted in laboratory-type facilities, they represent one of the most powerful contexts in which energy efficiency improvements stand to yield abundant non-energy benefits if properly applied.

  17. Nucleus accumbens dopamine D2-receptor expressing neurons control behavioral flexibility in a place discrimination task in the IntelliCage.

    PubMed

    Macpherson, Tom; Morita, Makiko; Wang, Yanyan; Sasaoka, Toshikuni; Sawa, Akira; Hikida, Takatoshi

    2016-07-01

    Considerable evidence has demonstrated a critical role for the nucleus accumbens (NAc) in the acquisition and flexibility of behavioral strategies. These processes are guided by the activity of two discrete neuron types, dopamine D1- or D2-receptor expressing medium spiny neurons (D1-/D2-MSNs). Here we used the IntelliCage, an automated group-housing experimental cage apparatus, in combination with a reversible neurotransmission blocking technique to examine the role of NAc D1- and D2-MSNs in the acquisition and reversal learning of a place discrimination task. We demonstrated that NAc D1- and D2-MSNs do not mediate the acquisition of the task, but that suppression of activity in D2-MSNs impaired reversal learning and increased perseverative errors. Additionally, global knockout of the dopamine D2L receptor isoform produced a similar behavioral phenotype to D2-MSN-blocked mice. These results suggest that D2L receptors and NAc D2-MSNs act to suppress the influence of previously correct behavioral strategies, allowing transfer of behavioral control to new strategies. © 2016 Macpherson et al.; Published by Cold Spring Harbor Laboratory Press.

  18. Heat pump concepts for nZEB Technology developments, design tools and testing of heat pump systems for nZEB in the USA: Country report IEA HPT Annex 40 Task 2, Task 3 and Task 4 of the USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baxter, Van D.; Payne, W. Vance; Ling, Jiazhen

    The IEA HPT Annex 40 "Heat pump concepts for Nearly Zero Energy Buildings" deals with the application of heat pumps as a core component of the HVAC system for Nearly or Net Zero Energy Buildings (nZEB). This report covers Task 2, on system comparison and optimisation, and Task 3, dedicated to the development of adapted technologies for nZEB and field monitoring results of heat pump systems in nZEB. Three institutions in the US team are involved and have worked on the following projects: The Oak Ridge National Laboratory (ORNL) will summarize development activities through the field demonstration stage for several integrated heat pump (IHP) systems: electric ground-source (GS-IHP) and air-source (AS-IHP) versions and an engine-driven AS-IHP version. The first commercial GS-IHP product was just introduced to the market in December 2012. This work is a contribution to Task 3 of the Annex. The University of Maryland will contribute a software development project to Task 2 of the Annex. The software ThermCom evaluates occupied-space thermal comfort conditions, accounting for all radiative and convective heat transfer effects as well as local air properties. The National Institute of Standards and Technology (NIST) is working on a field study effort on the NIST Net Zero Energy Residential Test Facility (NZERTF). This residential building was constructed on the NIST campus and officially opened in summer 2013. During the first year, between July 2013 and June 2014, baseline performance of the NZERTF was monitored under a simulated occupancy protocol. The house was equipped with an air-to-air heat pump which included a dedicated dehumidification operating mode. Outdoor conditions, internal loads and modes of heat pump operation were monitored. Field study results with respect to heat pump operation will be reported and recommendations on heat pump optimization for a net zero energy building will be provided. This work is a contribution to Task 3 of the Annex.

  19. SHC Project 3.63, Task 2, Beneficial Use of Waste Materials

    EPA Science Inventory

    SHC Project 3.63, Task 2, “Beneficial Use of Waste Materials”, is designed to conduct research and analyses to characterize and quantify the risks and benefits of using or reusing waste materials. There are 6 primary research areas in Task 2 that cover a broad spectr...

  20. Efficacy of a novel educational curriculum using a simulation laboratory on resident performance of hysteroscopic sterilization.

    PubMed

    Chudnoff, Scott G; Liu, Connie S; Levie, Mark D; Bernstein, Peter; Banks, Erika H

    2010-09-01

    To assess whether a novel educational curriculum using a simulation teaching laboratory improves resident knowledge, comfort with, and surgical performance of hysteroscopic sterilization. An educational prospective, pretest/posttest study. The Montefiore Institute of Minimally Invasive Surgery Laboratory. PATIENT(S)/SUBJECT(S): Thirty-four OB/GYN residents in an academic medical center. Hysteroscopic sterilization simulation laboratory and a brief didactic lecture. Differences in scores on validated skill assessment tools: task-specific checklist, Global Rating Scale (GRS), pass/fail assessment, and a multiple-choice examination to evaluate knowledge and attitude. In the entire cohort, improvements were observed on all evaluation tools after the simulation laboratory, with scores 31 percentage points higher (SD ±11.5, 95% confidence interval [CI] 27.3-35.3) on the written evaluation, 63 percentage points higher (SD ±15.7, 95% CI 57.8-68.8) on the task-specific checklist, and 54 percentage points higher (SD ±13.6, 95% CI 48.8-58.3) on the GRS. Higher PGY status was correlated with better pretest performance, but was not statistically significant in posttest scores. Residents reported an improvement in comfort performing the procedure after the laboratory. Simulation laboratory teaching significantly improved resident knowledge, comfort level, and technical skill performance of hysteroscopic sterilization. Copyright (c) 2010 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  1. Transitioning glass-ceramic scintillators for diagnostic x-ray imaging from the laboratory to commercial scale

    NASA Astrophysics Data System (ADS)

    Beckert, M. Brooke; Gallego, Sabrina; Elder, Eric; Nadler, Jason

    2016-10-01

    This study sought to mitigate risk in transitioning newly developed glass-ceramic scintillator technology from a laboratory concept to a commercial product by identifying the most significant hurdles to increased scale. These included selection of cost-effective raw material sources, investigation of the process parameters with the most significant impact on performance, and synthesis steps that could see the greatest benefit from participation of an industry partner that specializes in glass or optical component manufacturing. Efforts focused on enhancing the performance of glass-ceramic nanocomposite scintillators developed specifically for medical imaging via composition and process modifications that ensured efficient capture of incident X-ray energy and emission of scintillation light. The use of cost-effective raw materials and existing manufacturing methods demonstrated proof-of-concept for economically viable alternatives to existing benchmark materials, as well as possible disruptive applications afforded by novel geometries and comparatively lower cost per volume. The authors now seek the expertise of industry to effectively navigate the transition from laboratory demonstrations to pilot-scale production and testing to convince industry of the viability and usefulness of composite-based scintillators.

  2. Multi-level multi-task learning for modeling cross-scale interactions in nested geospatial data

    USGS Publications Warehouse

    Yuan, Shuai; Zhou, Jiayu; Tan, Pang-Ning; Fergus, Emi; Wagner, Tyler; Sorrano, Patricia

    2017-01-01

    Predictive modeling of nested geospatial data is a challenging problem as the models must take into account potential interactions among variables defined at different spatial scales. These cross-scale interactions, as they are commonly known, are particularly important to understand relationships among ecological properties at macroscales. In this paper, we present a novel, multi-level multi-task learning framework for modeling nested geospatial data in the lake ecology domain. Specifically, we consider region-specific models to predict lake water quality from multi-scaled factors. Our framework enables distinct models to be developed for each region using both its local and regional information. The framework also allows information to be shared among the region-specific models through their common set of latent factors. Such information sharing helps to create more robust models especially for regions with limited or no training data. In addition, the framework can automatically determine cross-scale interactions between the regional variables and the local variables that are nested within them. Our experimental results show that the proposed framework outperforms all the baseline methods in at least 64% of the regions for 3 out of 4 lake water quality datasets evaluated in this study. Furthermore, the latent factors can be clustered to obtain a new set of regions that is more aligned with the response variables than the original regions that were defined a priori from the ecology domain.
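
The shared-latent-factor idea described above can be sketched in a few lines. This is a minimal, hypothetical illustration — the dimensions, variable names, and the low-rank construction of region weights are assumptions for exposition, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 5 regions, 8 local/regional predictors, 4 shared latent factors.
n_regions, n_features, n_factors = 5, 8, 4

# A shared latent factor matrix U couples the region-specific models;
# each region r has its own combination weights (column r of V).
U = rng.normal(size=(n_features, n_factors))
V = rng.normal(size=(n_factors, n_regions))

# Region-specific weight vectors are low-rank combinations of the shared factors,
# which is how information is shared across regions.
W = U @ V  # shape: (n_features, n_regions)

# Prediction for a lake in region 2 with feature vector x:
x = rng.normal(size=n_features)
y_hat = x @ W[:, 2]

# A region with little or no training data can still borrow strength:
# its column of W lies in the span of factors learned from other regions.
assert W.shape == (n_features, n_regions)
```

The key design choice illustrated here is that per-region models are never fully independent: they are constrained to a common low-dimensional subspace, which is also what makes the latent factors clusterable into data-driven regions.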

  3. Smartphone App–Based Assessment of Gait During Normal and Dual-Task Walking: Demonstration of Validity and Reliability

    PubMed Central

    Yu, Wanting; Zhu, Hao; Harrison, Rachel; Lo, On-Yee; Lipsitz, Lewis; Travison, Thomas; Pascual-Leone, Alvaro; Zhou, Junhong

    2018-01-01

    Background: Walking is a complex cognitive motor task that is commonly completed while performing another task such as talking or making decisions. Gait assessments performed under normal and “dual-task” walking conditions thus provide important insights into health. Such assessments, however, are limited primarily to laboratory-based settings. Objective: The objective of our study was to create and test a smartphone-based assessment of normal and dual-task walking for use in nonlaboratory settings. Methods: We created an iPhone app that used the phone’s motion sensors to record movements during walking under normal conditions and while performing a serial-subtraction dual task, with the phone placed in the user’s pants pocket. The app provided the user with multimedia instructions before and during the assessment. Acquired data were automatically uploaded to a cloud-based server for offline analyses. A total of 14 healthy adults completed 2 laboratory visits separated by 1 week. On each visit, they used the app to complete three 45-second trials each of normal and dual-task walking. Kinematic data were collected with the app and a gold-standard–instrumented GAITRite mat. Participants also used the app to complete normal and dual-task walking trials within their homes on 3 separate days. Within laboratory-based trials, GAITRite-derived heel strikes and toe-offs of the phone-side leg aligned with smartphone acceleration extrema, following filtering and rotation to the earth coordinate system. We derived stride times—a clinically meaningful metric of locomotor control—from GAITRite and app data, for all strides occurring over the GAITRite mat. We calculated stride times and the dual-task cost to the average stride time (ie, percentage change from normal to dual-task conditions) from both measurement devices. We calculated similar metrics from home-based app data. For these trials, periods of potential turning were identified via custom-developed algorithms

  4. Preparing Laboratory and Real-World EEG Data for Large-Scale Analysis: A Containerized Approach

    PubMed Central

    Bigdely-Shamlo, Nima; Makeig, Scott; Robbins, Kay A.

    2016-01-01

    Large-scale analysis of EEG and other physiological measures promises new insights into brain processes and more accurate and robust brain–computer interface models. However, the absence of standardized vocabularies for annotating events in a machine-understandable manner, the welter of collection-specific data organizations, the difficulty in moving data across processing platforms, and the unavailability of agreed-upon standards for preprocessing have prevented large-scale analyses of EEG. Here we describe a “containerized” approach and freely available tools we have developed to facilitate the process of annotating, packaging, and preprocessing EEG data collections to enable data sharing, archiving, large-scale machine learning/data mining and (meta-)analysis. The EEG Study Schema (ESS) comprises three data “Levels,” each with its own XML-document schema and file/folder convention, plus a standardized (PREP) pipeline to move raw (Data Level 1) data to a basic preprocessed state (Data Level 2) suitable for application of a large class of EEG analysis methods. Researchers can ship a study as a single unit and operate on its data using a standardized interface. ESS does not require a central database and provides all the metadata necessary to execute a wide variety of EEG processing pipelines. The primary focus of ESS is automated in-depth analysis and meta-analysis of EEG studies. However, ESS can also encapsulate meta-information for other modalities, such as eye tracking, that are increasingly used in both laboratory and real-world neuroimaging. ESS schema and tools are freely available at www.eegstudy.org and a central catalog of over 850 GB of existing data in ESS format is available at studycatalog.org. These tools and resources are part of a larger effort to enable data sharing at sufficient scale for researchers to engage in truly large-scale EEG analysis and data mining (BigEEG.org). PMID:27014048

  5. Cluster Matrices for Health Occupations. Education for Employment Task Lists.

    ERIC Educational Resources Information Center

    Lathrop, Janice

    These cluster matrices provide duties and tasks that form the basis of instructional content for secondary, postsecondary, and adult training programs for health occupations. The eight clusters (and the job titles included in each cluster) are as follows: (1) dental assisting (dental assistant); (2) dental laboratory technology (dental laboratory…

  6. The Effect of N-3 on N-2 Repetition Costs in Task Switching

    ERIC Educational Resources Information Center

    Schuch, Stefanie; Grange, James A.

    2015-01-01

    The n-2 task repetition cost is the response time and error cost incurred when returning to a task recently performed after one intervening trial (i.e., an ABA task sequence), compared with returning to a task not recently performed (i.e., a CBA task sequence). This cost is considered a robust measure of inhibitory control during task switching. The present article…
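
With hypothetical response times, the cost defined above reduces to a difference of means between ABA and CBA sequences. The numbers and function name here are illustrative only, not data from the study:

```python
# n-2 repetition cost: mean RT on ABA sequences (returning to a recently
# abandoned, presumably still-inhibited task) minus mean RT on CBA sequences.
def n2_repetition_cost(rt_aba, rt_cba):
    mean = lambda xs: sum(xs) / len(xs)
    return mean(rt_aba) - mean(rt_cba)

rt_aba = [710, 700, 720]  # ms, hypothetical ABA-sequence response times
rt_cba = [660, 670, 650]  # ms, hypothetical CBA-sequence response times

print(n2_repetition_cost(rt_aba, rt_cba))  # → 50.0 (a positive cost)
```

A positive value is interpreted as residual inhibition of the abandoned task; the same subtraction can be applied to error rates.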

  7. Streamlining workflow and automation to accelerate laboratory scale protein production.

    PubMed

    Konczal, Jennifer; Gray, Christopher H

    2017-05-01

    Protein production facilities are often required to produce diverse arrays of proteins for demanding methodologies, including crystallography, NMR, ITC, and other reagent-intensive techniques. It is common for these teams to become a bottleneck in the pipeline of ambitious projects. This pressure to deliver has resulted in the evolution of many novel methods to increase capacity and throughput at all stages in the pipeline for generation of recombinant proteins. This review aims to describe current and emerging options to accelerate the success of protein production in Escherichia coli. We emphasize technologies that have been evaluated and implemented in our laboratory, including innovative molecular biology and expression vectors, small-scale expression screening strategies, and the automation of parallel and multidimensional chromatography. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  8. Fermentative lactic acid production from coffee pulp hydrolysate using Bacillus coagulans at laboratory and pilot scales.

    PubMed

    Pleissner, Daniel; Neu, Anna-Katrin; Mehlmann, Kerstin; Schneider, Roland; Puerta-Quintero, Gloria Inés; Venus, Joachim

    2016-10-01

    In this study, the lignocellulosic residue coffee pulp was used as a carbon source in fermentative l(+)-lactic acid production using Bacillus coagulans. After thermo-chemical treatment at 121 °C for 30 min in the presence of 0.18 mol L(-1) H2SO4, followed by enzymatic digestion using Accellerase 1500, carbon-rich hydrolysates were obtained. Two different coffee pulp materials with comparable biomass composition were used, but sugar concentrations in the hydrolysates varied. The primary sugars were (in g L(-1)) glucose (20-30), xylose (15-25), sucrose (5-11), and arabinose (0.7-10). Fermentations were carried out at laboratory (2 L) and pilot (50 L) scales in the presence of 10 g L(-1) yeast extract. At pilot scale, carbon utilization and lactic acid yield per gram of sugar consumed were 94.65% and 0.78 g g(-1), respectively. The productivity was 4.02 g L(-1) h(-1). Downstream processing resulted in a pure formulation containing 937 g L(-1) l(+)-lactic acid with an optical purity of 99.7%. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
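
The two headline fermentation metrics are simple ratios: yield is grams of lactic acid produced per gram of sugar consumed, and volumetric productivity is final titer over fermentation time. A worked check with hypothetical concentrations (not the paper's raw data, which are not given in the abstract):

```python
# Yield per gram of sugar consumed, g/g.
def yield_per_sugar(lactic_acid_g_per_l, sugar_consumed_g_per_l):
    return lactic_acid_g_per_l / sugar_consumed_g_per_l

# Volumetric productivity, g L^-1 h^-1.
def productivity(lactic_acid_g_per_l, hours):
    return lactic_acid_g_per_l / hours

# Hypothetical run: 39 g/L lactic acid from 50 g/L sugar consumed in 10 h.
print(yield_per_sugar(39.0, 50.0))  # → 0.78 g/g, same order as reported
print(productivity(39.0, 10.0))     # → 3.9 g/(L·h)
```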

  9. Analysis and Test Support for Phillips Laboratory Precision Structures

    DTIC Science & Technology

    1998-11-01

    Air Force Research Laboratory (AFRL), Phillips Research Site. Task objectives centered around analysis and structural dynamic test support on experiments within the Space Vehicles Directorate at Kirtland Air Force Base. These efforts helped provide "Analysis and Test Support for Phillips Laboratory Precision Structures." Mr. James Goodding of CSA Engineering was the principal investigator for this task.

  10. An evaluation of nursing tasks.

    PubMed

    Baptiste, Andrea

    2011-01-01

    Functional capacity evaluations have been criticized as being too general in theory and not accurate enough to determine what tasks an employee can perform. This paper describes the results of a descriptive study conducted in a laboratory setting to objectively determine the physical demands of patient transfer tasks performed by nurses. Fifty-three tasks were analyzed and broken down into sub-tasks to quantify the peak force required to perform each sub-task, in order to determine which tasks place healthcare workers at the highest risk of injury. Dissecting the transfer task into segments allows us to see which part of the task requires high forces on the part of the caregiver. The task can then be modified to eliminate the risk of injury to the caregiver. This modification can be accomplished by using healthcare technology, such as floor-based or overhead lifts, friction-reducing devices, sit-to-stand lifts, properly designed slings, and motorized beds/trolleys. Technological solutions are available for some of these high-risk tasks and should be implemented where applicable to reduce the force demand and eliminate or reduce the risk of injury to healthcare workers in nursing.

  11. Application of Pulse Spark Discharges for Scale Prevention and Continuous Filtration Methods in Coal-Fired Power Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cho, Young; Fridman, Alexander

    2012-06-30

    The overall objective of the present work was to develop a new scale-prevention technology by continuously precipitating and removing dissolved mineral ions (such as calcium and magnesium) in cooling water while the COC could be doubled from the present standard value of 3.5. The hypothesis of the present study was that if we could successfully precipitate and remove the excess calcium ions in cooling water, we could prevent condenser-tube fouling and at the same time double the COC. The approach in the study was to utilize pulse spark discharges directly in water to precipitate dissolved mineral ions in recirculating cooling water into relatively large suspended particles, which could be removed by a self-cleaning filter. The present study began with basic scientific research to better understand the mechanism of pulse spark discharges in water and conducted a series of validation experiments using hard water in a laboratory cooling tower. Task 1 of the present work was to demonstrate if the spark discharge could precipitate the mineral ions in water. Task 2 was to demonstrate if the self-cleaning filter could continuously remove these precipitated calcium particles such that the blowdown could be eliminated or significantly reduced. Task 3 was to demonstrate if the scale could be prevented or minimized at condenser tubes with a COC of 8 or (almost) zero blowdown. In Task 1, we successfully completed the validation study that confirmed the precipitation of dissolved calcium ions in cooling water with the supporting data of calcium hardness over time as measured by a calcium ion probe. In Task 2, we confirmed through experimental tests that the self-cleaning filter could continuously remove precipitated calcium particles in a simulated laboratory cooling tower such that the blowdown could be eliminated or significantly reduced. In addition, chemical water analysis data were obtained which were used to confirm the COC calculation. In Task 3, we conducted a

  12. Air Force Research Laboratory

    DTIC Science & Technology

    2009-06-08

    Air Force Research Laboratory, 8 June 2009. Mr. Leo Marple, Air Force Research Laboratory (Leo.Marple@wpafb.af.mil), Wright-Patterson AFB. DISTRIBUTION STATEMENT A.

  13. Evaluation of 2 cognitive abilities tests in a dual-task environment

    NASA Technical Reports Server (NTRS)

    Vidulich, M. A.; Tsang, P. S.

    1986-01-01

    Most real-world operators are required to perform multiple tasks simultaneously. In some cases, such as flying a high-performance aircraft or troubleshooting a failing nuclear power plant, the operator's ability to "time share" or "process in parallel" can be driven to extremes. This has created interest in selection tests of cognitive abilities. Two tests that have been suggested are the Dichotic Listening Task and the Cognitive Failures Questionnaire. Correlations between these test results and time-sharing performance were obtained and the validity of these tests was examined. The primary task was a tracking task with dynamically varying bandwidth. This was performed either alone or concurrently with either another tracking task or a spatial transformation task. The results were: (1) an unexpected negative correlation was detected between the two tests; (2) the lack of correlation between either test and task performance made the predictive utility of the test scores appear questionable; (3) pilots made more errors on the Dichotic Listening Task than college students.

  14. Reduction of product-related species during the fermentation and purification of a recombinant IL-1 receptor antagonist at the laboratory and pilot scale.

    PubMed

    Schirmer, Emily B; Golden, Kathryn; Xu, Jin; Milling, Jesse; Murillo, Alec; Lowden, Patricia; Mulagapati, Srihariraju; Hou, Jinzhao; Kovalchin, Joseph T; Masci, Allyson; Collins, Kathryn; Zarbis-Papastoitsis, Gregory

    2013-08-01

    By tracking product quality through fermentation and purification development in parallel, a robust process was designed to reduce the levels of product-related species. Three biochemically similar product-related species were identified as byproducts of host-cell enzymatic activity. To modulate intracellular proteolytic activity, key fermentation parameters (temperature, pH, trace metals, EDTA levels, and carbon source) were evaluated through bioreactor optimization, while balancing negative effects on growth, productivity, and oxygen demand. The purification process was based on three non-affinity steps and resolved product-related species by exploiting small charge differences. Using statistical design of experiments for elution conditions, a high-resolution cation exchange capture column was optimized for resolution and recovery. Further reduction of product-related species was achieved by evaluating a matrix of conditions for a ceramic hydroxyapatite column. The optimized fermentation process was transferred from the 2-L laboratory scale to the 100-L pilot scale and the purification process was scaled accordingly to process the fermentation harvest. The laboratory- and pilot-scale processes resulted in similar process recoveries of 60 and 65%, respectively, and in a product that was of equal quality and purity to that of small-scale development preparations. The parallel approach for up- and downstream development was paramount in achieving a robust and scalable clinical process. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Analysis of applied forces and electromyography of back and shoulders muscles when performing a simulated hand scaling task.

    PubMed

    Porter, William; Gallagher, Sean; Torma-Krajewski, Janet

    2010-05-01

    Hand scaling is a physically demanding task responsible for numerous overexertion injuries in underground mining. Scaling requires the miner to use a long pry bar to remove loose rock, reducing the likelihood of rock fall injuries. The experiments described in this article simulated "rib" scaling (scaling a mine wall) from an elevated bucket to examine force generation and electromyographic responses using two types of scaling bars (steel and fiberglass-reinforced aluminum) at five target heights ranging from floor level to 176 cm. Ten male and six female subjects were tested in separate experiments. Peak and average force applied at the scaling bar tip and normalized electromyography (EMG) of the left and right pairs of the deltoid and erectores spinae muscles were obtained. Work height significantly affected peak prying force during scaling activities with highest force capacity at the lower levels. Bar type did not affect force generation. However, use of the lighter fiberglass bar required significantly more muscle activity to achieve the same force. Results of these studies suggest that miners scale points on the rock face that are below their knees, and reposition the bucket as often as necessary to do so. Published by Elsevier Ltd.

  16. Laboratory study of sonic booms and their scaling laws. [ballistic range simulation

    NASA Technical Reports Server (NTRS)

    Toong, T. Y.

    1974-01-01

    This program sought a basic understanding of the non-linear effects associated with caustics through laboratory simulation experiments of sonic booms in a ballistic range and a coordinated theoretical study of scaling laws. Two cases of superbooms, or enhanced sonic booms at caustics, have been studied. The first case, referred to as acceleration superbooms, is related to the enhanced sonic booms generated during the acceleration maneuvers of supersonic aircraft. The second case, referred to as refraction superbooms, involves the superbooms that are generated as a result of atmospheric refraction. Important theoretical and experimental results are briefly reported.

  17. Cardiac data increase association between self-report and both expert ratings of task load and task performance in flight simulator tasks: An exploratory study.

    PubMed

    Lehrer, Paul; Karavidas, Maria; Lu, Shou-En; Vaschillo, Evgeny; Vaschillo, Bronya; Cheng, Andrew

    2010-05-01

    Seven professional airplane pilots participated in a one-session test in a Boeing 737-800 simulator. Mental workload for 18 flight tasks was rated by experienced test pilots (hereinafter called "expert ratings") and by study participants' self-report on NASA's Task Load Index (TLX) scale. Pilot performance was rated by a check pilot. The standard deviation of R-R intervals (SDNN) significantly added 3.7% improvement over the TLX in distinguishing high from moderate-load tasks and 2.3% improvement in distinguishing high from combined moderate and low-load tasks. Minimum RRI in the task significantly discriminated high- from medium- and low-load tasks, but did not add significant predictive variance to the TLX. The low-frequency/high-frequency (LF:HF) RRI ratio based on spectral analysis of R-R intervals, and ventricular relaxation time were each negatively related to pilot performance ratings independently of TLX values, while minimum and average RRI were positively related, showing added contribution of these cardiac measures for predicting performance. Cardiac results were not affected by controlling either for respiration rate or motor activity assessed by accelerometry. The results suggest that cardiac assessment can be a useful addition to self-report measures for determining flight task mental workload and risk for performance decrements. Replication on a larger sample is needed to confirm and extend the results. Copyright 2010 Elsevier B.V. All rights reserved.
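
SDNN, the time-domain heart-rate-variability measure used above, is simply the standard deviation of the R-R intervals, and minimum RRI is the shortest interval in the task window. A minimal sketch with hypothetical interval data (not the study's recordings):

```python
import statistics

def sdnn(rr_ms):
    """SDNN: sample standard deviation of R-R intervals, in ms."""
    return statistics.stdev(rr_ms)

rr_ms = [850, 860, 840, 855, 845]  # hypothetical R-R intervals, ms

print(round(sdnn(rr_ms), 2))  # → 7.91 ms
print(min(rr_ms))             # → 840 ms (minimum RRI in the window)
```

The spectral measures in the abstract (the LF:HF ratio) would additionally require resampling the interval series and estimating band powers, which is beyond this sketch.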

  18. Smartphone App-Based Assessment of Gait During Normal and Dual-Task Walking: Demonstration of Validity and Reliability.

    PubMed

    Manor, Brad; Yu, Wanting; Zhu, Hao; Harrison, Rachel; Lo, On-Yee; Lipsitz, Lewis; Travison, Thomas; Pascual-Leone, Alvaro; Zhou, Junhong

    2018-01-30

    Walking is a complex cognitive motor task that is commonly completed while performing another task such as talking or making decisions. Gait assessments performed under normal and "dual-task" walking conditions thus provide important insights into health. Such assessments, however, are limited primarily to laboratory-based settings. The objective of our study was to create and test a smartphone-based assessment of normal and dual-task walking for use in nonlaboratory settings. We created an iPhone app that used the phone's motion sensors to record movements during walking under normal conditions and while performing a serial-subtraction dual task, with the phone placed in the user's pants pocket. The app provided the user with multimedia instructions before and during the assessment. Acquired data were automatically uploaded to a cloud-based server for offline analyses. A total of 14 healthy adults completed 2 laboratory visits separated by 1 week. On each visit, they used the app to complete three 45-second trials each of normal and dual-task walking. Kinematic data were collected with the app and a gold-standard-instrumented GAITRite mat. Participants also used the app to complete normal and dual-task walking trials within their homes on 3 separate days. Within laboratory-based trials, GAITRite-derived heel strikes and toe-offs of the phone-side leg aligned with smartphone acceleration extrema, following filtering and rotation to the earth coordinate system. We derived stride times-a clinically meaningful metric of locomotor control-from GAITRite and app data, for all strides occurring over the GAITRite mat. We calculated stride times and the dual-task cost to the average stride time (ie, percentage change from normal to dual-task conditions) from both measurement devices. We calculated similar metrics from home-based app data. For these trials, periods of potential turning were identified via custom-developed algorithms and omitted from stride-time analyses
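
The two derived metrics named above reduce to simple arithmetic: stride time is the interval between successive heel strikes of the same leg, and the dual-task cost is the percentage change in mean stride time from normal to dual-task walking. A toy sketch with hypothetical timestamps (not the authors' pipeline):

```python
# Stride time = interval between successive heel strikes of one leg, in seconds.
def stride_times(heel_strike_times):
    return [b - a for a, b in zip(heel_strike_times, heel_strike_times[1:])]

# Dual-task cost: percentage change in mean stride time, normal -> dual-task.
def dual_task_cost(normal, dual):
    mean = lambda xs: sum(xs) / len(xs)
    return 100.0 * (mean(dual) - mean(normal)) / mean(normal)

normal = stride_times([0.0, 1.10, 2.21, 3.30])  # ~1.10 s strides
dual = stride_times([0.0, 1.21, 2.44, 3.63])    # slower strides under dual task

print(round(dual_task_cost(normal, dual), 1))   # → 10.0 (% slower)
```

In the study itself, heel-strike times come either from the GAITRite mat or from smartphone acceleration extrema after filtering and rotation to the earth coordinate system; the arithmetic afterwards is the same.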

  19. Laboratory and field measurements and evaluations of vibration at the handles of riveting hammers

    PubMed Central

    McDOWELL, THOMAS W.; WARREN, CHRISTOPHER; WELCOME, DANIEL E.; DONG, REN G.

    2015-01-01

    The use of riveting hammers can expose workers to harmful levels of hand-transmitted vibration (HTV). As a part of efforts to reduce HTV exposures through tool selection, the primary objective of this study was to evaluate the applicability of a standardized laboratory-based riveting hammer assessment protocol for screening riveting hammers. The second objective was to characterize the vibration emissions of reduced vibration riveting hammers and to make approximations of the HTV exposures of workers operating these tools in actual work tasks. Eight pneumatic riveting hammers were selected for the study. They were first assessed in a laboratory using the standardized method for measuring vibration emissions at the tool handle. The tools were then further assessed under actual working conditions during three aircraft sheet metal riveting tasks. Although the average vibration magnitudes of the riveting hammers measured in the laboratory test were considerably different from those measured in the field study, the rank orders of the tools determined via these tests were fairly consistent, especially for the lower vibration tools. This study identified four tools that consistently exhibited lower frequency-weighted and unweighted accelerations in both the laboratory and workplace evaluations. These observations suggest that the standardized riveting hammer test is acceptable for identifying tools that could be expected to exhibit lower vibrations in workplace environments. However, the large differences between the accelerations measured in the laboratory and field suggest that the standardized laboratory-based tool assessment is not suitable for estimating workplace riveting hammer HTV exposures. Based on the frequency-weighted accelerations measured at the tool handles during the three work tasks, the sheet metal mechanics assigned to these tasks at the studied workplace are unlikely to exceed the daily vibration exposure action value (2.5 m s−2) using any of the
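
Under the standard hand-arm vibration convention (ISO 5349-1), the daily exposure A(8) scales the frequency-weighted acceleration at the handle by the square root of exposure time over an 8-hour reference, and is compared against the 2.5 m/s² action value cited above. A sketch with a hypothetical tool acceleration and exposure duration:

```python
import math

EAV = 2.5  # m/s^2, daily exposure action value cited in the abstract

def a8(a_hv_ms2, exposure_hours, reference_hours=8.0):
    """Daily vibration exposure A(8) per the ISO 5349-1 convention."""
    return a_hv_ms2 * math.sqrt(exposure_hours / reference_hours)

# Hypothetical reduced-vibration riveting hammer: 5.0 m/s^2 frequency-weighted
# acceleration at the handle, used for 1 h per shift.
exposure = a8(5.0, 1.0)
print(round(exposure, 2), exposure < EAV)  # → 1.77 True (below the action value)
```

This illustrates the study's closing point: even a tool with a seemingly high handle acceleration can keep workers under the action value if daily trigger time is short.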

  20. Performance goals on simulators boost resident motivation and skills laboratory attendance.

    PubMed

    Stefanidis, Dimitrios; Acker, Christina E; Greene, Frederick L

    2010-01-01

    To assess the impact of setting simulator training goals on resident motivation and skills laboratory attendance. Residents followed a proficiency-based laparoscopic curriculum on the 5 Fundamentals of Laparoscopic Surgery and 9 virtual reality tasks. Training goals consisted of the average expert performance on each task + 2 SD (mandatory) and best expert performance (optional). Residents rated the impact of the training goals on their motivation on a 20-point visual analog scale. Performance and attendance data were analyzed and correlated (Spearman's). Data are reported as medians (range). General Surgery residency program at a regional referral Academic Medical Center. General surgery residents (n = 15). During the first 5 months of the curriculum, weekly attendance rate was 51% (range, 8-96). After 153 (range, 21-412) repetitions, resident speed improved by 97% (range, 18-230), errors improved by 17% (range, 0-24), and motion efficiency by 59% (range, 26-114) compared with their baseline. Nine (60%) residents achieved proficiency in 7 (range, 3-14) and the best goals in 3.5 (range, 1-9) tasks; the other 6 residents had attendance rates <30%. Residents rated the impact of setting performance goals on their motivation as 15 (range, 1-18) and setting a best goal as 13 (range, 1-18). Motivation ratings correlated positively with attendance rates, number of repetitions, performance improvement, and achievement of proficiency and best goals (r = 0.59-0.75; p < 0.05) but negatively with postgraduate year (PGY) (-0.67; p = 0.02). Setting training goals on simulators is associated with improved resident motivation to participate in a simulator curriculum. While more stringent goals may potentiate this effect, they have a limited impact on senior residents. Further research is needed to investigate ways to improve skills laboratory attendance. Copyright 2010 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
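
The goal-setting rule described above (mandatory goal = mean expert performance + 2 SD; optional goal = best expert performance) can be computed directly. The scores below are hypothetical expert task-completion times, where lower is better, so "best" is the minimum:

```python
import statistics

def training_goals(expert_scores):
    """Mandatory goal: expert mean + 2 SD; optional goal: best expert score.
    Assumes a lower-is-better metric (e.g., completion time), so the +2 SD
    band is the more lenient threshold and the minimum is the best score."""
    mean = statistics.mean(expert_scores)
    sd = statistics.stdev(expert_scores)
    return {"mandatory": mean + 2 * sd, "best": min(expert_scores)}

# Hypothetical expert completion times on one task, in seconds:
goals = training_goals([60, 64, 56, 62, 58])
print(round(goals["mandatory"], 2), goals["best"])  # → 66.32 56
```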

  1. Development of flight experiment task requirements. Volume 2: Technical Report. Part 2: Appendix H: Tasks-skills data series

    NASA Technical Reports Server (NTRS)

    Hatterick, G. R.

    1972-01-01

    The data sheets presented contain the results of the task analysis portion of the study to identify skill requirements of space shuttle crew personnel. A comprehensive database is provided of crew functions, operating environments, task dependencies, and task-skills applicable to a representative cross section of earth orbital research experiments.

  2. Indomethacin nanocrystals prepared by different laboratory scale methods: effect on crystalline form and dissolution behavior

    NASA Astrophysics Data System (ADS)

    Martena, Valentina; Censi, Roberta; Hoti, Ela; Malaj, Ledjan; Di Martino, Piera

    2012-12-01

    The objective of this study was to select very simple and well-known laboratory-scale methods able to reduce the particle size of indomethacin to the nanometric scale. The effect on the crystalline form and the dissolution behavior of the different samples was deliberately evaluated in the absence of any surfactants as stabilizers. Nanocrystals of indomethacin (IDM; native crystals are in the γ form) were obtained by three laboratory-scale methods: A (Batch A: crystallization by solvent evaporation in a nano-spray dryer), B (Batches B-15 and B-30: wet milling and lyophilization), and C (Batches C-20-N and C-40-N: cryo-milling in the presence of liquid nitrogen). Nanocrystals obtained by method A (Batch A) crystallized into a mixture of α and γ polymorphic forms. IDM obtained by the two other methods remained in the γ form, and differing degrees of crystallinity reduction were observed, with the most considerable decrease in crystalline degree for IDM milled for 40 min in the presence of liquid nitrogen. The intrinsic dissolution rate (IDR) revealed a higher dissolution rate for Batches A and C-40-N, due to the higher IDR of the α form compared with the γ form for Batch A, and the lower crystallinity degree of both Batches A and C-40-N. These factors, as well as the decrease in particle size, influenced the IDM dissolution rate from the particle samples. Modifications in the solid physical state that may occur using different particle size reduction treatments have to be taken into consideration during the scale-up and industrial development of new solid dosage forms.

  3. Laboratory Scale Electrodeposition. Practice and Applications.

    ERIC Educational Resources Information Center

    Bruno, Thomas J.

    1986-01-01

    Discusses some aspects of electrodeposition and electroplating. Emphasizes the materials, techniques, and safety precautions necessary to make electrodeposition work reliably in the chemistry laboratory. Describes some problem-solving applications of this process. (TW)

  4. The Use of Conjunctions in Cognitively Simple versus Complex Oral L2 Tasks

    ERIC Educational Resources Information Center

    Michel, Marije C.

    2013-01-01

    The present study explores the use of conjunctions in simple versus complex argumentative tasks performed by second language (L2) learners as a specific measure for the amount of reasoning involved in task performance. The Cognition Hypothesis (Robinson, 2005) states that an increase in cognitive task complexity promotes improvements in L2…

  5. Knowledge of Previous Tasks: Task Similarity Influences Bias in Task Duration Predictions

    PubMed Central

    Thomas, Kevin E.; König, Cornelius J.

    2018-01-01

    Bias in predictions of task duration has been attributed to misremembering previous task duration and using previous task duration as a basis for predictions. This research sought to further examine how previous task information affects prediction bias by manipulating task similarity and assessing the role of previous task duration feedback. Task similarity was examined through participants performing two tasks 1 week apart that were the same or different. Duration feedback was provided to all participants (Experiment 1), its recall was manipulated (Experiment 2), and its provision was manipulated (Experiment 3). In all experiments, task similarity influenced bias on the second task, with predictions being less biased when the first task was the same task. However, duration feedback did not influence bias. The findings highlight the pivotal role of knowledge about previous tasks in task duration prediction and are discussed in relation to the theoretical accounts of task duration prediction bias. PMID:29881362

  6. The Effect of CO2 Pressure on Chromia Scale Microstructure at 750°C

    NASA Astrophysics Data System (ADS)

    Pint, B. A.; Unocic, K. A.

    2018-06-01

    To understand and model performance in supercritical CO2 (sCO2) for high-efficiency, concentrating solar power (CSP) and fossil energy power cycles, reaction rates are compared at 750°C in 0.1 MPa CO2 and 30 MPa sCO2, as well as in laboratory air as a baseline, on structural materials such as Ni-based alloy 625. Because the reaction products remained thin even after 5000 h, scanning transmission electron microscopy was used to study the Cr-rich surface oxide scale. The scales formed in CO2 and sCO2 had a much finer grain size, with more voids observed in CO2. However, the observations on alloy 625 were complicated by Mo- and Nb-rich precipitates in the adjacent substrate and by Al internal oxidation. To simplify the system, a binary Ni-22Cr alloy was exposed for 1000 h in similar environments. After exposure in sCO2, there was an indication of carbon segregation on the Cr2O3 grain boundaries. After exposure in air, metallic Ni precipitates were observed in the scale that were not observed in the scale formed on alloy 625. The scale formed in air on a second Ni-22Cr model alloy with Mn and Si additions did not contain Ni precipitates, suggesting caution when drawing conclusions from model alloys.

  7. A collaborative approach to lean laboratory workstation design reduces wasted technologist travel.

    PubMed

    Yerian, Lisa M; Seestadt, Joseph A; Gomez, Erron R; Marchant, Kandice K

    2012-08-01

    Lean methodologies have been applied in many industries to reduce waste. We applied Lean techniques to redesign laboratory workstations with the aim of reducing the number of times employees must leave their workstations to complete their tasks. At baseline in 68 workflows (aggregates or sequence of process steps) studied, 251 (38%) of 664 tasks required workers to walk away from their workstations. After analysis and redesign, only 59 (9%) of the 664 tasks required technologists to leave their workstations to complete these tasks. On average, 3.4 travel events were removed for each workstation. Time studies in a single laboratory section demonstrated that workers spend 8 to 70 seconds in travel each time they step away from the workstation. The redesigned workstations will allow employees to spend less time travelling around the laboratory. Additional benefits include employee training in waste identification, improved overall laboratory layout, and identification of other process improvement opportunities in our laboratory.
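    The percentages above follow directly from the task counts reported in the abstract; a short sanity check (counts taken from the abstract, nothing else assumed):

    ```python
    # Travel-task counts from the Lean workstation study.
    total_tasks = 664
    travel_before = 251   # tasks requiring a walk away from the workstation, at baseline
    travel_after = 59     # after the Lean redesign

    pct_before = 100 * travel_before / total_tasks
    pct_after = 100 * travel_after / total_tasks

    print(f"baseline: {pct_before:.0f}% of tasks required travel")  # 38%
    print(f"redesign: {pct_after:.0f}% of tasks required travel")   # 9%
    ```

    At the reported 8 to 70 seconds per travel event, removing an average of 3.4 events per workstation translates directly into recovered technologist time.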

  8. A laboratory scale model of abrupt ice-shelf disintegration

    NASA Astrophysics Data System (ADS)

    Macayeal, D. R.; Boghosian, A.; Styron, D. D.; Burton, J. C.; Amundson, J. M.; Cathles, L. M.; Abbot, D. S.

    2010-12-01

    An important mode of Earth's disappearing cryosphere is the abrupt disintegration of ice shelves along the Antarctic Peninsula. This disintegration process may be triggered by climate change; however, the work needed to produce the spectacular, explosive results witnessed in the Larsen B and Wilkins ice-shelf events of the last decade comes from the large potential energy release associated with iceberg capsize and fragmentation. To gain further insight into the underlying exchanges of energy involved in massed iceberg movements, we have constructed a laboratory-scale model designed to explore the physical and hydrodynamic interactions between icebergs in a confined channel of water. The experimental apparatus consists of a 2-meter water tank that is 30 cm wide. Within the tank, we introduce fresh water and approximately 20-100 rectangular plastic 'icebergs' having the appropriate density contrast with water to mimic ice. The blocks are initially deployed in a tight pack, with all blocks arranged to represent the initial state of an intact ice shelf or ice tongue. The system is allowed to evolve through time under the driving forces associated with iceberg hydrodynamics. Digitized videography is used to quantify how the system of plastic icebergs evolves from states of quiescence to states of mobilization. Initial experiments show that, after a single 'agitator' iceberg begins to capsize, an 'avalanche' of capsizing icebergs ensues, which drives horizontal expansion of the massed icebergs across the water surface and stimulates other icebergs to capsize. A surprise initially evident in the experiments is that the kinetic energy of the expanding mass of icebergs is only a small fraction of the net potential energy released by the rearrangement of mass via capsize. Approximately 85-90% of the energy released by the system goes into water motion modes, including a pervasive, easily observed seiche mode of the tank
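    The "appropriate density contrast with water to mimic ice" comes down to Archimedes' principle: a floating block's submerged fraction equals its density ratio with the fluid. A minimal sketch (the density values are standard handbook numbers, not parameters from the study):

    ```python
    # A floating block is submerged in proportion to its density ratio with the
    # fluid (Archimedes). The plastic 'icebergs' must roughly match glacial ice's
    # ~0.9 ratio so that capsize dynamics scale correctly.
    rho_ice = 917.0      # kg/m^3, typical glacial ice (assumed value)
    rho_water = 1000.0   # kg/m^3, fresh water, as used in the tank

    submerged_fraction = rho_ice / rho_water
    print(f"submerged fraction: {submerged_fraction:.3f}")  # 0.917
    ```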

  9. Hypercube matrix computation task

    NASA Technical Reports Server (NTRS)

    Calalo, Ruel H.; Imbriale, William A.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Lyons, James R.; Manshadi, Farzin; Patterson, Jean E.

    1988-01-01

    A major objective of the Hypercube Matrix Computation effort at the Jet Propulsion Laboratory (JPL) is to investigate the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Three scattering analysis codes are being implemented and assessed on a JPL/California Institute of Technology (Caltech) Mark 3 Hypercube. The codes, which utilize different underlying algorithms, give a means of evaluating the general applicability of this parallel architecture. The three analysis codes being implemented are a frequency domain method of moments code, a time domain finite difference code, and a frequency domain finite elements code. These analysis capabilities are being integrated into an electromagnetics interactive analysis workstation which can serve as a design tool for the construction of antennas and other radiating or scattering structures. The first two years of work on the Hypercube Matrix Computation effort are summarized. This includes both new developments and results as well as work previously reported in the Hypercube Matrix Computation Task: Final Report for 1986 to 1987 (JPL Publication 87-18).

  10. Scaled laboratory experiments explain the kink behaviour of the Crab Nebula jet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, C. K.; Tzeferacos, P.; Lamb, D.

    X-ray images from the Chandra X-ray Observatory show that the South-East jet in the Crab nebula changes direction every few years. This remarkable phenomenon is also observed in jets associated with pulsar wind nebulae and other astrophysical objects, and therefore is a fundamental feature of astrophysical jet evolution that needs to be understood. Theoretical modeling and numerical simulations have suggested that this phenomenon may be a consequence of magnetic fields (B) and current-driven magnetohydrodynamic (MHD) instabilities taking place in the jet, but until now there has been no verification of this process in a controlled laboratory environment. Here we report the first such experiments, using scaled laboratory plasma jets generated by high-power lasers to model the Crab jet and monoenergetic-proton radiography to provide direct visualization and measurement of magnetic fields and their behavior. The toroidal magnetic field embedded in the supersonic jet triggered plasma instabilities and resulted in considerable deflections throughout the jet propagation, mimicking the kinks in the Crab jet. We also demonstrated that these kinks are stabilized by high jet velocity, consistent with the observation that instabilities alter the jet orientation but do not disrupt the overall jet structure. We successfully modeled these laboratory experiments with a validated three-dimensional (3D) numerical simulation, which in conjunction with the experiments provides compelling evidence that we have an accurate model of the most important physics of magnetic fields and MHD instabilities in the observed, kinked jet in the Crab nebula. The experiments initiate a novel approach in the laboratory for visualizing fields and instabilities associated with jets observed in various astrophysical objects, ranging from stellar to extragalactic systems. We expect that future work along this line will have important impact on the study and understanding of such fundamental

  11. Scaled laboratory experiments explain the kink behaviour of the Crab Nebula jet

    DOE PAGES

    Li, C. K.; Tzeferacos, P.; Lamb, D.; ...

    2016-10-07

    X-ray images from the Chandra X-ray Observatory show that the South-East jet in the Crab nebula changes direction every few years. This remarkable phenomenon is also observed in jets associated with pulsar wind nebulae and other astrophysical objects, and therefore is a fundamental feature of astrophysical jet evolution that needs to be understood. Theoretical modeling and numerical simulations have suggested that this phenomenon may be a consequence of magnetic fields (B) and current-driven magnetohydrodynamic (MHD) instabilities taking place in the jet, but until now there has been no verification of this process in a controlled laboratory environment. Here we report the first such experiments, using scaled laboratory plasma jets generated by high-power lasers to model the Crab jet and monoenergetic-proton radiography to provide direct visualization and measurement of magnetic fields and their behavior. The toroidal magnetic field embedded in the supersonic jet triggered plasma instabilities and resulted in considerable deflections throughout the jet propagation, mimicking the kinks in the Crab jet. We also demonstrated that these kinks are stabilized by high jet velocity, consistent with the observation that instabilities alter the jet orientation but do not disrupt the overall jet structure. We successfully modeled these laboratory experiments with a validated three-dimensional (3D) numerical simulation, which in conjunction with the experiments provides compelling evidence that we have an accurate model of the most important physics of magnetic fields and MHD instabilities in the observed, kinked jet in the Crab nebula. The experiments initiate a novel approach in the laboratory for visualizing fields and instabilities associated with jets observed in various astrophysical objects, ranging from stellar to extragalactic systems. We expect that future work along this line will have important impact on the study and understanding of such fundamental

  12. TASK-2 Channels Contribute to pH Sensitivity of Retrotrapezoid Nucleus Chemoreceptor Neurons

    PubMed Central

    Wang, Sheng; Benamer, Najate; Zanella, Sébastien; Kumar, Natasha N.; Shi, Yingtang; Bévengut, Michelle; Penton, David; Guyenet, Patrice G.; Lesage, Florian

    2013-01-01

    Phox2b-expressing glutamatergic neurons of the retrotrapezoid nucleus (RTN) display properties expected of central respiratory chemoreceptors; they are directly activated by CO2/H+ via an unidentified pH-sensitive background K+ channel and, in turn, facilitate brainstem networks that control breathing. Here, we used a knock-out mouse model to examine whether TASK-2 (K2P5), an alkaline-activated background K+ channel, contributes to RTN neuronal pH sensitivity. We made patch-clamp recordings in brainstem slices from RTN neurons that were identified by expression of GFP (directed by the Phox2b promoter) or β-galactosidase (from the gene trap used for TASK-2 knock-out). Whereas nearly all RTN cells from control mice were pH sensitive (95%, n = 58 of 61), only 56% of GFP-expressing RTN neurons from TASK-2−/− mice (n = 49 of 88) could be classified as pH sensitive (>30% reduction in firing rate from pH 7.0 to pH 7.8); the remaining cells were pH insensitive (44%). Moreover, none of the recorded RTN neurons from TASK-2−/− mice selected based on β-galactosidase activity (a subpopulation of GFP-expressing neurons) were pH sensitive. The alkaline-activated background K+ currents were reduced in amplitude in RTN neurons from TASK-2−/− mice that retained some pH sensitivity but were absent from pH-insensitive cells. Finally, using a working heart–brainstem preparation, we found diminished inhibition of phrenic burst amplitude by alkalization in TASK-2−/− mice, with apneic threshold shifted to higher pH levels. In conclusion, alkaline-activated TASK-2 channels contribute to pH sensitivity in RTN neurons, with effects on respiration in situ that are particularly prominent near apneic threshold. PMID:24107938
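    The pH-sensitivity percentages quoted above follow from the cell counts given in the abstract; a quick check (counts from the abstract only):

    ```python
    # pH-sensitive RTN neuron counts reported in the abstract.
    control_sensitive, control_total = 58, 61   # control mice
    ko_sensitive, ko_total = 49, 88             # TASK-2 knockout mice

    pct_control = 100 * control_sensitive / control_total
    pct_ko = 100 * ko_sensitive / ko_total

    print(f"control: {pct_control:.0f}% of RTN neurons pH sensitive")   # 95%
    print(f"TASK-2 knockout: {pct_ko:.0f}% pH sensitive")               # 56%
    ```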

  13. Task frequency influences stimulus-driven effects on task selection during voluntary task switching.

    PubMed

    Arrington, Catherine M; Reiman, Kaitlin M

    2015-08-01

    Task selection during voluntary task switching involves both top-down (goal-directed) and bottom-up (stimulus-driven) mechanisms. The factors that shift the balance between these two mechanisms are not well characterized. In the present research, we studied the role that task frequency plays in determining the extent of stimulus-driven task selection. In two experiments, we used the basic paradigm adapted from Arrington (Memory & Cognition, 38, 991-997, 2008), in which the effect of stimulus availability serves as a marker of stimulus-driven task selection. A number and letter appeared on each trial with varying stimulus onset asynchronies, and participants performed either a consonant/vowel or an even/odd judgment. In Experiment 1, participants were instructed as to the relative frequency with which each task was to be performed (i.e., 50/50, 60/40, or 75/25) and were further instructed to make their transitions between tasks unpredictable. In Experiment 2, participants were given no instructions about how to select tasks, resulting in naturally occurring variation in task frequency. With both instructed (Exp. 1) and naturally occurring (Exp. 2) relative task frequencies, the less frequently performed task showed a greater effect of stimulus availability on task selection, suggestive of a larger influence of stimulus-driven mechanisms during task performance for the less frequent task. When goal-directed mechanisms of task choice are engaged less frequently, the relative influence of the stimulus environment increases.

  14. Life science payload definition and integration study, task C and D. Volume 3: Appendices

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Research equipment requirements were based on the Mini-7 and Mini-30 laboratory concepts defined in Tasks A and B of the initial LSPD contract. Modified versions of these laboratories and the research equipment within them were to be used in three missions of Shuttle/Sortie Module. These were designated (1) the shared 7-day laboratory (a mission with the life sciences laboratory sharing the sortie module with another scientific laboratory), (2) the dedicated 7-day laboratory (full use of the sortie module), and (3) the dedicated 30-day laboratory (full sortie module use with a 30-day mission duration). In defining the research equipment requirements of these laboratories, the equipment was grouped according to its function, and equipment unit data packages were prepared.

  15. "How To" Videos Improve Residents Performance of Essential Perioperative Electronic Medical Records and Clinical Tasks.

    PubMed

    Zoghbi, Veronica; Caskey, Robert C; Dumon, Kristoffel R; Soegaard Ballester, Jacqueline M; Brooks, Ari D; Morris, Jon B; Dempsey, Daniel T

    The ability to use electronic medical records (EMR) is an essential skill for surgical residents. However, frustration and anxiety surrounding EMR tasks may detract from clinical performance. We created a series of brief, 1- to 3-minute "how to" videos demonstrating 7 key perioperative EMR tasks: booking OR cases, placing preprocedure orders, ordering negative-pressure wound dressing supplies, updating day-of-surgery history and physical notes, writing brief operative notes, discharging patients from the postanesthesia care unit, and checking vital signs. Additionally, we used "Cutting Insights," a locally developed responsive mobile application for surgical trainee education, as a platform for providing interns with easy access to these videos. We hypothesized that exposure to these videos would lead to increased resident efficiency and confidence in performing essential perioperative tasks, ultimately leading to improved clinical performance. Eleven surgery interns participated in this initiative. Before watching the "how to" videos, each intern was timed performing the aforementioned 7 key perioperative EMR tasks. They also underwent a simulated perioperative emergency requiring the performance of 3 of these EMR tasks in conjunction with 5 other required interventions (including notifying the chief resident, the anesthesia team, and the OR coordinator; and ordering fluid boluses, appropriate laboratories, and blood products). These simulations were scored on a scale from 0 to 8. The interns were then directed to watch the videos. Two days later, their times for performing the 7 tasks and their scores for a similar perioperative emergency simulation were once again recorded. Before and after watching the videos, participants were surveyed to assess their confidence in performing each EMR task using a 5-point Likert scale. We also elicited their opinions of the videos and web-based mobile application using a 5-point scale. Statistical analyses to assess for

  16. Effects of task structure on category priming in patients with Parkinson's disease and in healthy individuals.

    PubMed

    Brown, Gregory G; Brown, Sandra J; Christenson, Gina; Williams, Rebecca E; Kindermann, Sandra S; Loftis, Christopher; Olsen, Ryan; Siple, Patricia; Shults, Clifford; Gorell, Jay M

    2002-05-01

    Lexical decision tasks have been used to study both shifts of attention and semantic processing in Parkinson's Disease (PD). Whereas other laboratories have reported normal levels of semantic priming among PD patients, our laboratory has reported abnormally large levels. In this study, two experiments were performed to determine the influence of task structure on the extent of semantic priming during lexical decision-making and pronunciation tasks among PD patients and neurologically healthy controls. In Experiment 1, the effect of Prime Dominance (the ratio of category to neutral trials) on lexical decision-making was studied. Although equal numbers of word and nonword trials were presented, half of the PD patients and controls were studied under Category Prime Dominance (category : neutral prime ratio of 2:1) and half were studied under Neutral Prime Dominance (category : neutral prime ratio of 1:2). In Experiment 2, PD and control participants were studied on lexical decision-making and pronunciation tasks where twice as many words as nonword trials were presented, consistent with other studies from our laboratory. In Experiment 1, we found no group differences in the magnitude of priming and no effect of Prime Dominance. Moreover, the findings were similar in pattern and magnitude to results published by Neely (1977). In Experiment 2, we observed larger priming effects among PD patients than among controls, but only on the lexical decision (LD) task. These results support the hypothesis that abnormally large category-priming effects appear in LD studies of PD patients when the number of word trials exceeds the number of nonword trials. Furthermore, increased lexical priming in PD appears to be due to processes operating during the decision-making period that follows presentation of the lexical target.

  17. Stationary Engineering Laboratory Manual--2.

    ERIC Educational Resources Information Center

    Steingress, Frederick M.; Frost, Harold J.

    The Stationary Engineering Laboratory Manual 2 was designed for vocational/technical high school students who have received instruction in the basics of stationary engineering. It was developed for students who will be operating a live plant and who will be responsible for supplying steam for heating, cooking, and baking. Each lesson in the manual…

  18. Experimental investigation on gaseous emissions from the combustion of date palm residues in laboratory scale furnace.

    PubMed

    El may, Yassine; Jeguirim, Mejdi; Dorge, Sophie; Trouvé, Gwenaelle; Said, Rachid

    2013-03-01

    Emissions characteristics from the combustion of five date palm residues (DPR: Date Palm Leaflets, Date Palm Rachis, Date Palm Trunk, Date Stones, and fruitstalk prunings) in a laboratory-scale furnace were investigated. Release of gaseous products such as CO2, CO, VOC, NOx and SO2 was measured at 600-800°C. The main goal was to analyze thermal behaviors and gaseous emissions in order to select the most convenient biofuel for an application in domestic boiler installations. With regard to biofuel characteristics, date stones have the highest energy density (11.4 GJ/m³) and the lowest ash content (close to 1.2%). Combustion tests show that, among the tested date palm residues, date stones may be the most promising biofuel for the design of combustion processing systems. However, special attention should be given to the design of the secondary air supply to prevent high emissions of CO and volatile matter. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Using Prewriting Tasks in L2 Writing Classes: Insights from Three Experiments

    ERIC Educational Resources Information Center

    McDonough, Kim; Neumann, Heike

    2014-01-01

    Even though collaborative prewriting tasks are frequently used in second language (L2) writing classes (Fernández Dobao, 2012; Storch, 2005), they have not been as widely researched as other tasks, such as collaborative writing and peer review. This article examines the effectiveness of collaborative prewriting tasks at encouraging English for…

  20. Some relevant parameters for assessing fire hazards of combustible mine materials using laboratory scale experiments

    PubMed Central

    Litton, Charles D.; Perera, Inoka E.; Harteis, Samuel P.; Teacoach, Kara A.; DeRosa, Maria I.; Thomas, Richard A.; Smith, Alex C.

    2018-01-01

    When combustible materials ignite and burn, the potential for fire growth and flame spread represents an obvious hazard, but during these processes of ignition and flaming, other life hazards present themselves and should be included to ensure an effective overall analysis of the relevant fire hazards. In particular, the gases and smoke produced both during the smoldering stages of fires leading to ignition and during the advanced flaming stages of a developing fire serve to contaminate the surrounding atmosphere, potentially producing elevated levels of toxicity and high levels of smoke obscuration that render the environment untenable. In underground mines, these hazards may be exacerbated by the existing forced ventilation that can carry the gases and smoke to locations far-removed from the fire location. Clearly, materials that require high temperatures (above 1400 K) and that exhibit low mass loss during thermal decomposition, or that require high heat fluxes or heat transfer rates to ignite represent less of a hazard than materials that decompose at low temperatures or ignite at low levels of heat flux. In order to define and quantify some possible parameters that can be used to assess these hazards, small-scale laboratory experiments were conducted in a number of configurations to measure: 1) the toxic gases and smoke produced both during non-flaming and flaming combustion; 2) mass loss rates as a function of temperature to determine ease of thermal decomposition; and 3) mass loss rates and times to ignition as a function of incident heat flux. This paper describes the experiments that were conducted, their results, and the development of a set of parameters that could possibly be used to assess the overall fire hazard of combustible materials using small scale laboratory experiments. PMID:29599565
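    The third measurement listed above, time to ignition versus incident heat flux, is commonly summarized with the textbook thermally thick ignition correlation t_ig = (π/4)·kρc·((T_ig − T_0)/q″)². This relation and the material values below are illustrative assumptions, not parameters taken from the paper:

    ```python
    import math

    def time_to_ignition(k_rho_c, T_ig, T_0, q_flux):
        """Thermally thick ignition-time estimate (textbook correlation):
        t_ig = (pi/4) * k*rho*c * ((T_ig - T_0) / q'')^2.
        k_rho_c in W^2*s/(m^4*K^2), temperatures in K, q_flux in W/m^2."""
        return (math.pi / 4) * k_rho_c * ((T_ig - T_0) / q_flux) ** 2

    # Illustrative, roughly PMMA-like values (assumed): k*rho*c = 3.2e5,
    # ignition temperature 650 K, ambient 300 K, incident flux 20 kW/m^2.
    print(f"{time_to_ignition(3.2e5, 650.0, 300.0, 2.0e4):.0f} s")  # ~77 s
    ```

    The correlation reflects the qualitative point made in the abstract: materials that ignite only at high heat flux (large denominator) or high temperature (large numerator) take much longer to ignite and so pose less of a hazard.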

  1. Some relevant parameters for assessing fire hazards of combustible mine materials using laboratory scale experiments.

    PubMed

    Litton, Charles D; Perera, Inoka E; Harteis, Samuel P; Teacoach, Kara A; DeRosa, Maria I; Thomas, Richard A; Smith, Alex C

    2018-04-15

    When combustible materials ignite and burn, the potential for fire growth and flame spread represents an obvious hazard, but during these processes of ignition and flaming, other life hazards present themselves and should be included to ensure an effective overall analysis of the relevant fire hazards. In particular, the gases and smoke produced both during the smoldering stages of fires leading to ignition and during the advanced flaming stages of a developing fire serve to contaminate the surrounding atmosphere, potentially producing elevated levels of toxicity and high levels of smoke obscuration that render the environment untenable. In underground mines, these hazards may be exacerbated by the existing forced ventilation that can carry the gases and smoke to locations far-removed from the fire location. Clearly, materials that require high temperatures (above 1400 K) and that exhibit low mass loss during thermal decomposition, or that require high heat fluxes or heat transfer rates to ignite represent less of a hazard than materials that decompose at low temperatures or ignite at low levels of heat flux. In order to define and quantify some possible parameters that can be used to assess these hazards, small-scale laboratory experiments were conducted in a number of configurations to measure: 1) the toxic gases and smoke produced both during non-flaming and flaming combustion; 2) mass loss rates as a function of temperature to determine ease of thermal decomposition; and 3) mass loss rates and times to ignition as a function of incident heat flux. This paper describes the experiments that were conducted, their results, and the development of a set of parameters that could possibly be used to assess the overall fire hazard of combustible materials using small scale laboratory experiments.

  2. FLARE: a New User Facility for Studies of Magnetic Reconnection Through Simultaneous, in-situ Measurements on MHD Scales, Ion Scales and Electron Scales

    NASA Astrophysics Data System (ADS)

    Ji, H.; Bhattacharjee, A.; Goodman, A.; Prager, S.; Daughton, W. S.; Cutler, R.; Fox, W.; Hoffmann, F.; Kalish, M.; Kozub, T.; Jara-Almonte, J.; Myers, C. E.; Ren, Y.; Sloboda, P.; Yamada, M.; Yoo, J.; Bale, S. D.; Carter, T.; Dorfman, S. E.; Drake, J. F.; Egedal, J.; Sarff, J.; Wallace, J.

    2017-12-01

    The FLARE device (Facility for Laboratory Reconnection Experiments; flare.pppl.gov) is a new laboratory experiment under construction at Princeton for studies of magnetic reconnection in the multiple X-line regimes directly relevant to space, solar, astrophysical, and fusion plasmas, as guided by a reconnection phase diagram [Ji & Daughton (2011)]. The whole device has been successfully assembled, and a rough leak check has been completed. First plasmas are expected between fall and winter. The main diagnostic is an extensive set of magnetic probe arrays covering multiple scales, from local electron scales (~2 mm) to intermediate ion scales (~10 cm) and global MHD scales (~1 m), simultaneously providing in-situ measurements over all these relevant scales. By using these laboratory data, not only are the detailed spatial profiles around each reconnecting X-line available for direct comparison with spacecraft data, but the global conditions and consequences of magnetic reconnection, which are often difficult to quantify in space, can also be controlled or studied systematically. The planned procedures and example topics as a user facility will be discussed in detail.
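    The electron scale quoted above (~2 mm) is the electron skin depth, d_e = c/ω_pe. As a rough sketch of where such a number comes from (the formula is standard; the plasma density used below is an assumed laboratory-typical value, not a published FLARE parameter):

    ```python
    import math

    # Electron skin depth d_e = c / omega_pe, with the electron plasma frequency
    # omega_pe = sqrt(n_e * e^2 / (eps0 * m_e)) in SI units.
    c = 2.998e8        # speed of light, m/s
    e = 1.602e-19      # elementary charge, C
    m_e = 9.109e-31    # electron mass, kg
    eps0 = 8.854e-12   # vacuum permittivity, F/m

    def skin_depth(n_e):
        """Electron skin depth in metres for electron density n_e in m^-3."""
        omega_pe = math.sqrt(n_e * e**2 / (eps0 * m_e))
        return c / omega_pe

    # An assumed density of 1e19 m^-3 (1e13 cm^-3), typical of laboratory
    # reconnection plasmas, gives a skin depth of order a couple of millimetres.
    print(f"{skin_depth(1e19) * 1e3:.1f} mm")  # ~1.7 mm
    ```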

  3. Simulation of the 3-D Evolution of Electron-Scale Magnetic Reconnection - Motivated by Laboratory Experiments - Predictions for MMS

    NASA Astrophysics Data System (ADS)

    Buechner, J.; Jain, N.; Sharma, A.

    2013-12-01

    The four spacecraft of the Magnetospheric Multiscale (MMS) mission, to be launched in 2014, will use the Earth's magnetosphere as a laboratory to study the microphysics of three fundamental plasma processes. One of them is magnetic reconnection, an essentially multi-scale process. While laboratory experiments and past theoretical investigations have shown that important processes necessary to understand magnetic reconnection take place at electron scales, the MMS mission will for the first time be able to resolve these scales by in-space observations. For the measurement strategy of MMS, it is important to make specific predictions of the behavior of current sheets with a thickness of the order of the electron skin depth, which play an important role in the evolution of collisionless magnetic reconnection. Since these processes are highly nonlinear and non-local, numerical simulation is needed to specify the current sheet evolution. Here we present new results about the nonlinear evolution of electron-scale current sheets, starting from the linear stage and using 3-D electron-magnetohydrodynamic (EMHD) simulations. The growth rates of the simulated instabilities compared well with the growth rates obtained from linear theory. Mechanisms and conditions of the formation of flux ropes and of current filamentation will be discussed in comparison with the results of fully kinetic simulations. In 3-D, the X- and O-point configurations of the magnetic field formed in reconnection planes alternate along the out-of-reconnection-plane direction with the wavelength of the unstable mode. In the presence of multiple reconnection sites, the out-of-plane magnetic field can develop a nested structure of quadrupoles in reconnection planes, similar to the 2-D case, but now with variations in the out-of-plane direction. The structures of the electron flow and magnetic field in 3-D simulations will be compared with those in 2-D simulations to discriminate the essentially 3-D features. We also discuss

  4. Evaluation of Surface Runoff Generation Processes Using a Rainfall Simulator: A Small Scale Laboratory Experiment

    NASA Astrophysics Data System (ADS)

    Danáčová, Michaela; Valent, Peter; Výleta, Roman

    2017-12-01

    Nowadays, rainfall simulators are being used by many researchers in field and laboratory experiments. The main objective of most of these experiments is to better understand the underlying runoff generation processes and to use the results in the process of calibration and validation of hydrological models. Many research groups have assembled their own rainfall simulators, which comply with their understanding of rainfall processes and the requirements of their experiments. Most often, the existing rainfall simulators differ mainly in the size of the irrigated area and the way they generate rain drops. They can be characterized by the accuracy with which they produce a rainfall of a given intensity, the size of the irrigated area, and the rain-drop generating mechanism. Rainfall simulation experiments can provide valuable information about the genesis of surface runoff, infiltration of water into soil, and rainfall erodibility. Apart from the impact of the physical properties of soil, its moisture, and its compaction on the generation of surface runoff and the amount of eroded particles, some studies also investigate the impact of the vegetation cover of the whole area of interest. In this study, the rainfall simulator was used to simulate the impact of the slope gradient of the irrigated area on the amount of generated runoff and sediment yield. In order to eliminate the impact of external factors and to improve the reproducibility of the initial conditions, the experiments were conducted in laboratory conditions. The laboratory experiments were carried out using a commercial rainfall simulator connected to an external peristaltic pump. The pump maintained a constant and adjustable inflow of water, which made it possible to exceed the maximum simulated-precipitation volume of 2.3 l imposed by the construction of the rainfall simulator while maintaining constant characteristics of the simulated precipitation. In this study a 12-minute rainfall with a constant intensity

  5. Measuring preschool learning engagement in the laboratory.

    PubMed

    Halliday, Simone E; Calkins, Susan D; Leerkes, Esther M

    2018-03-01

    Learning engagement is a critical factor for academic achievement and successful school transitioning. However, current methods of assessing learning engagement in young children are limited to teacher report or classroom observation, which may limit the types of research questions one could assess about this construct. The current study investigated the validity of a novel assessment designed to measure behavioral learning engagement among young children in a standardized laboratory setting and examined how learning engagement in the laboratory relates to future classroom adjustment. Preschool-aged children (N = 278) participated in a learning-based tangrams task and a story sequencing task and were observed on seven behavioral indicators of engagement. Confirmatory factor analysis supported the construct validity of a behavioral engagement factor composed of six of the original behavioral indicators: attention to instructions, on-task behavior, enthusiasm/energy, persistence, monitoring progress/strategy use, and negative affect. Concurrent validity for this behavioral engagement factor was established through its associations with parent-reported mastery motivation and pre-academic skills in math and literacy measured in the laboratory, and predictive validity was demonstrated through its associations with teacher-reported classroom learning behaviors and performance in math and reading in kindergarten. These associations were found when behavioral engagement was observed during both the nonverbal tangrams task and the verbal story sequencing task, and they persisted even after controlling for child minority status, gender, and maternal education. Learning engagement in preschool appears to be successfully measurable in a laboratory setting. This finding has implications for future research on the mechanisms that support successful academic development. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Reducing the language content in ToM tests: A developmental scale.

    PubMed

    Burnel, Morgane; Perrone-Bertolotti, Marcela; Reboul, Anne; Baciu, Monica; Durrleman, Stephanie

    2018-02-01

    The goal of the current study was to statistically evaluate the reliable scalability of a set of tasks designed to assess Theory of Mind (ToM) without language as a confounding variable. This tool might be useful to study ToM in populations where language is impaired or to study links between language and ToM. Low verbal versions of the ToM tasks proposed by Wellman and Liu (2004) for their scale were tested in 234 children (2.5 years to 11.9 years). Results showed that 5 of the tasks formed a scale according to both Guttman and Rasch models whereas all 6 tasks could form a scale according to the Rasch model only. The main difference from the original scale was that the Explicit False Belief task could be included whereas the Knowledge Access (KA) task could not. The authors argue that the more verbal version of the KA task administered in previous studies could have measured language understanding rather than ToM. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
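
    The Guttman criterion used above can be checked mechanically: a response pattern is scale-consistent if no harder task is passed once an easier one has been failed. A minimal sketch in Python, using hypothetical pass/fail data rather than the study's:

```python
# Sketch: Guttman scalability check for a set of developmentally ordered
# tasks. Data below are hypothetical illustrations, not the study's.

def guttman_errors(pattern):
    """Count Guttman errors: passes occurring after the first failure."""
    failed = False
    errors = 0
    for passed in pattern:  # pattern ordered from easiest to hardest task
        if not passed:
            failed = True
        elif failed:
            errors += 1
    return errors

def reproducibility(patterns):
    """Coefficient of reproducibility: 1 - errors / total responses."""
    total = sum(len(p) for p in patterns)
    errors = sum(guttman_errors(p) for p in patterns)
    return 1 - errors / total

# Hypothetical responses of 4 children on 5 tasks (True = pass)
patterns = [
    (True, True, True, False, False),   # consistent
    (True, True, False, False, False),  # consistent
    (True, False, True, False, False),  # one Guttman error
    (True, True, True, True, True),     # consistent
]
print(round(reproducibility(patterns), 3))  # → 0.95
```

A coefficient near 1 indicates the tasks form a cumulative scale; conventionally, values of .90 or above are taken as acceptable.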

  7. The Effect of Tool Handle Shape on Hand Muscle Load and Pinch Force in a Simulated Dental Scaling Task

    PubMed Central

    Dong, Hui; Loomer, Peter; Barr, Alan; LaRoche, Charles; Young, Ed; Rempel, David

    2007-01-01

    Work-related upper extremity musculoskeletal disorders, including carpal tunnel syndrome, are prevalent among dentists and dental hygienists. An important risk factor for developing these disorders is forceful pinching which occurs during periodontal work such as dental scaling. Ergonomically designed dental scaling instruments may help reduce the prevalence of carpal tunnel syndrome among dental practitioners. In this study, 8 custom-designed dental scaling instruments with different handle shapes were used by 24 dentists and dental hygienists to perform a simulated tooth scaling task. The muscle activity of two extensors and two flexors in the forearm was recorded with electromyography while thumb pinch force was measured by pressure sensors. The results demonstrated that the instrument handle with a tapered, round shape and a 10 mm diameter required the least muscle load and pinch force when performing simulated periodontal work. The results from this study can guide dentists and dental hygienists in selection of dental scaling instruments. PMID:17156742

  8. An Integrated Knowledge Framework to Characterize and Scaffold Size and Scale Cognition (FS2C)

    NASA Astrophysics Data System (ADS)

    Magana, Alejandra J.; Brophy, Sean P.; Bryan, Lynn A.

    2012-09-01

    Size and scale cognition is a critical ability associated with reasoning with concepts in different disciplines of science, technology, engineering, and mathematics. As such, researchers and educators have identified the need for young learners and their educators to become scale-literate. Informed by developmental psychology literature and recent findings in nanoscale science and engineering education, we propose an integrated knowledge framework for characterizing and scaffolding size and scale cognition called the FS2C framework. Five ad hoc assessment tasks were designed informed by the FS2C framework with the goal of identifying participants' understandings of size and scale. Findings identified participants' difficulties to discern different sizes of microscale and nanoscale objects and a low level of sophistication on identifying scale worlds among participants. Results also identified that as bigger the difference between the sizes of the objects is, the more difficult was for participants to identify how many times an object is bigger or smaller than another one. Similarly, participants showed difficulties to estimate approximate sizes of sub-macroscopic objects as well as a difficulty for participants to estimate the size of very large objects. Participants' accurate location of objects on a logarithmic scale was also challenging.

  9. The Relationship between Rating Scales Used to Evaluate Tasks from Task Inventories for Licensure and Certification Examinations

    ERIC Educational Resources Information Center

    Cadle, Adrienne Woodley

    2012-01-01

    The first step in developing or updating a licensure or certification examination is to conduct a job or task analysis. Following completion of the job analysis, a survey validation study is performed to validate the results of the job analysis and to obtain task ratings so that an examination blueprint may be created. Psychometricians and job…

  10. Vertex evoked potentials in a rating-scale detection task: Relation to signal probability

    NASA Technical Reports Server (NTRS)

    Squires, K. C.; Squires, N. K.; Hillyard, S. A.

    1974-01-01

    Vertex evoked potentials were recorded from human subjects performing in an auditory detection task with rating scale responses. Three values of a priori probability of signal presentation were tested. The amplitudes of the N1 and P3 components of the vertex potential associated with correct detections of the signal were found to be systematically related to the strictness of the response criterion and independent of variations in a priori signal probability. No similar evoked potential components were found associated with signal absent judgements (misses and correct rejections) regardless of the confidence level of the judgement or signal probability. These results strongly support the contention that the form of the vertex evoked response is closely correlated with the subject's psychophysical decision regarding the presence or absence of a threshold level signal.
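
    The response-criterion analysis described above rests on standard signal detection theory. As a sketch of the underlying arithmetic (the hit and false-alarm rates below are hypothetical, not the study's data):

```python
# Sketch: sensitivity d' and response criterion c from hit and
# false-alarm rates -- the standard signal-detection quantities behind
# a rating-scale detection analysis. Rates here are hypothetical.
from statistics import NormalDist

z = NormalDist().inv_cdf  # probit (inverse standard normal CDF)

def dprime_and_criterion(hit_rate, fa_rate):
    """d' = z(H) - z(FA); criterion c = -(z(H) + z(FA)) / 2."""
    d = z(hit_rate) - z(fa_rate)
    c = -(z(hit_rate) + z(fa_rate)) / 2
    return d, c

# Hypothetical rates from collapsing the rating scale at one cut-off
d, c = dprime_and_criterion(0.84, 0.16)
```

Each confidence cut-off on the rating scale yields one (hit rate, false-alarm rate) pair, so a rating-scale experiment traces out several criterion values for a single underlying sensitivity.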

  11. Italian version of the task and ego orientation in sport questionnaire.

    PubMed

    Bortoli, Laura; Robazza, Claudio

    2005-02-01

    The 1992 Task and Ego Orientation in Sport Questionnaire developed by Duda and Nicholls was translated into Italian and administered to 802 young athletes, 248 girls and 554 boys aged 8 to 14 years, drawn from a range of individual and team sports, to examine its factor structure. Data sets of a calibration sample (boys 12-14 years) and of four cross-validation samples (boys 8-11 years, girls 8-11 years, boys 12-14 years, and girls 12-14 years) were subjected to confirmatory factor analysis specifying, as in the original questionnaire, an Ego Orientation scale (6 items) and a Task Orientation scale (7 items). Results across sex and age yielded chi2/df ratios ranging from 1.95 to 3.57, GFI indices above .90, AGFI indices ranging from .90 to .92, and RMSEA values not above .10. Findings provided acceptable support for the two-dimensional structure of the test. In the whole sample, the Ego factor accounted for 27.2% of the variance and the Task factor for 33.5%. Acceptable internal consistency of the two scales was also shown, with Cronbach alpha values ranging from .73 to .85.
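
    The fit statistics reported above (chi2/df and RMSEA) follow standard formulas. A small sketch, with illustrative inputs rather than the questionnaire's actual model output:

```python
# Sketch: approximate-fit indices for a CFA model, from its chi-square,
# degrees of freedom, and sample size. The formulas are standard
# (Steiger-Lind RMSEA); the input values are hypothetical.
import math

def chi2_df_ratio(chi2, df):
    """Relative chi-square; values around 2-3 are often deemed acceptable."""
    return chi2 / df

def rmsea(chi2, df, n):
    """Root mean square error of approximation."""
    return math.sqrt(max(chi2 - df, 0) / (df * (n - 1)))

chi2, df, n = 128.7, 64, 802  # hypothetical model, sample size as in study
print(round(chi2_df_ratio(chi2, df), 2))  # → 2.01
print(round(rmsea(chi2, df, n), 3))       # → 0.036
```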

  12. Epistemic Stance in Spoken L2 English: The Effect of Task and Speaker Style

    ERIC Educational Resources Information Center

    Gablasova, Dana; Brezina, Vaclav; Mcenery, Tony; Boyd, Elaine

    2017-01-01

    The article discusses epistemic stance in spoken L2 production. Using a subset of the Trinity Lancaster Corpus of spoken L2 production, we analysed the speech of 132 advanced L2 speakers from different L1 and cultural backgrounds taking part in four speaking tasks: one largely monologic presentation task and three interactive tasks. The study…

  13. Effects of n-dominance and group composition on task efficiency in laboratory triads.

    NASA Technical Reports Server (NTRS)

    Lampkin, E. C.

    1972-01-01

    Task-oriented triads were formed into various homogeneous and heterogeneous combinations according to their scores on the n-dominance personality trait of the Edwards Personal Preference Schedule. Five group categories were used. The group task required a consensus decision on each trial. High cooperation and interdependence were reinforced by partially restricting the communication network. Results showed heterogeneous groups significantly better at organizing their group communication processes. They consequently performed the task more efficiently than homogeneous triads.

  14. Simulating flow in karst aquifers at laboratory and sub-regional scales using MODFLOW-CFP

    NASA Astrophysics Data System (ADS)

    Gallegos, Josue Jacob; Hu, Bill X.; Davis, Hal

    2013-12-01

    Groundwater flow in a well-developed karst aquifer dominantly occurs through bedding planes, fractures, conduits, and caves created by and/or enlarged by dissolution. Conventional groundwater modeling methods assume that groundwater flow is described by Darcian principles where primary porosity (i.e. matrix porosity) and laminar flow are dominant. However, in well-developed karst aquifers, the assumption of Darcian flow can be questionable. While Darcian flow generally occurs in the matrix portion of the karst aquifer, flow through conduits can be non-laminar where the relation between specific discharge and hydraulic gradient is non-linear. MODFLOW-CFP is a relatively new modeling program that accounts for non-laminar and laminar flow in pipes, like karst caves, within an aquifer. In this study, results from MODFLOW-CFP are compared to those from MODFLOW-2000/2005, a numerical code based on Darcy's law, to evaluate the accuracy that CFP can achieve when modeling flows in karst aquifers at laboratory and sub-regional (Woodville Karst Plain, Florida, USA) scales. In comparison with laboratory experiments, simulation results by MODFLOW-CFP are more accurate than MODFLOW 2005. At the sub-regional scale, MODFLOW-CFP was more accurate than MODFLOW-2000 for simulating field measurements of peak flow at one spring and total discharges at two springs for an observed storm event.
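
    The contrast between Darcian matrix flow and non-laminar conduit flow can be illustrated numerically. The sketch below uses Darcy's law for the matrix and the Darcy-Weisbach relation for a full circular conduit; the parameter values are illustrative assumptions, not calibrated to any aquifer, and this is not MODFLOW-CFP's actual implementation:

```python
# Sketch: linear (Darcian) vs. non-linear (turbulent conduit) relations
# between specific discharge/velocity and hydraulic gradient.
import math

G = 9.81  # gravitational acceleration, m/s^2

def darcy_specific_discharge(K, dh_dl):
    """Laminar matrix flow: specific discharge is linear in the gradient."""
    return -K * dh_dl

def darcy_weisbach_velocity(d, f, dh_dl):
    """Turbulent conduit flow: mean velocity grows with the square root
    of the hydraulic gradient, i.e. non-linear in dh/dl."""
    return math.sqrt(2 * G * d * abs(dh_dl) / f)

# Doubling the hydraulic gradient doubles the Darcian discharge ...
q1 = darcy_specific_discharge(1e-4, -0.01)  # K = 1e-4 m/s (assumed)
q2 = darcy_specific_discharge(1e-4, -0.02)
# ... but raises turbulent conduit velocity only by a factor of sqrt(2)
v1 = darcy_weisbach_velocity(1.0, 0.03, -0.01)  # 1 m conduit, f = 0.03
v2 = darcy_weisbach_velocity(1.0, 0.03, -0.02)
```

This square-root dependence is one reason a Darcy-only code can misestimate peak spring discharge in a conduit-dominated karst system.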

  15. Task-driven dictionary learning.

    PubMed

    Mairal, Julien; Bach, Francis; Ponce, Jean

    2012-04-01

    Modeling data with linear combinations of a few elements from a learned dictionary has been the focus of much recent research in machine learning, neuroscience, and signal processing. For signals such as natural images that admit such sparse representations, it is now well established that these models are well suited to restoration tasks. In this context, learning the dictionary amounts to solving a large-scale matrix factorization problem, which can be done efficiently with classical optimization tools. The same approach has also been used for learning features from data for other purposes, e.g., image classification, but tuning the dictionary in a supervised way for these tasks has proven to be more difficult. In this paper, we present a general formulation for supervised dictionary learning adapted to a wide variety of tasks, and present an efficient algorithm for solving the corresponding optimization problem. Experiments on handwritten digit classification, digital art identification, nonlinear inverse image problems, and compressed sensing demonstrate that our approach is effective in large-scale settings, and is well suited to supervised and semi-supervised classification, as well as regression tasks for data that admit sparse representations.
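
    The sparse coding subproblem at the core of dictionary learning can be sketched with iterative soft-thresholding (ISTA). This toy example fixes a tiny orthonormal dictionary; the paper's task-driven formulation additionally learns the dictionary for a supervised cost, which is not shown here:

```python
# Sketch: solving min_a 0.5*||x - D a||^2 + lam*||a||_1 by ISTA
# (iterative soft-thresholding). Pure-Python, small-scale illustration.

def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def transpose(M):
    return [list(col) for col in zip(*M)]

def soft(x, t):
    """Soft-thresholding operator, the proximal map of the l1 norm."""
    return max(x - t, 0.0) if x > 0 else min(x + t, 0.0)

def ista(D, x, lam, step, iters=100):
    Dt = transpose(D)
    a = [0.0] * len(D[0])
    for _ in range(iters):
        r = [xi - di for xi, di in zip(x, matvec(D, a))]  # residual
        g = matvec(Dt, r)                                 # fit-term gradient
        a = [soft(ai + step * gi, step * lam) for ai, gi in zip(a, g)]
    return a

# Dictionary with two orthonormal atoms; the signal equals the first
# atom, so the sparse code should select only the first coefficient.
D = [[1.0, 0.0], [0.0, 1.0]]
x = [1.0, 0.0]
a = ista(D, x, lam=0.1, step=1.0)
print([round(ai, 2) for ai in a])  # → [0.9, 0.0]
```

The shrinkage by `lam * step` (1.0 recovered as 0.9) is the bias the l1 penalty introduces in exchange for sparsity.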

  16. Analyze the beta waves of electroencephalogram signals from young musicians and non-musicians in major scale working memory task.

    PubMed

    Hsu, Chien-Chang; Cheng, Ching-Wen; Chiu, Yi-Shiuan

    2017-02-15

    Electroencephalograms can record wave variations in any brain activity. Beta waves are produced when an external stimulus induces logical thinking, computation, and reasoning during consciousness. This work uses the beta waves recorded during major scale working memory N-back tasks to analyze the differences between young musicians and non-musicians. The feature analysis applies signal filtering, Hilbert-Huang transformation, and feature extraction to identify differences, after which the k-means clustering algorithm is used to group the features into clusters. The results of the feature analysis showed that beta waves differ significantly between young musicians and non-musicians at the low memory load of the working memory task. Copyright © 2017 Elsevier B.V. All rights reserved.
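
    The clustering stage can be sketched with a minimal one-dimensional k-means on hypothetical beta-band power features; the filtering and Hilbert-Huang steps are not reproduced here:

```python
# Sketch: grouping per-subject beta-band features with k-means (k = 2).
# Feature values are hypothetical band powers standing in for the
# extracted EEG features.
import random

def kmeans_1d(values, k=2, iters=50, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    for _ in range(iters):
        # Assignment step: each value joins its nearest center
        clusters = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[idx].append(v)
        # Update step: recompute each center as its cluster mean
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Hypothetical beta-band power features for two groups of subjects
features = [0.9, 1.1, 1.0, 0.95, 3.0, 3.2, 2.9, 3.1]
print(kmeans_1d(features))
```

With well-separated groups, the two recovered centers land near the group means regardless of the random initialization.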

  17. The role of rehearsal in a novel call center-type task.

    PubMed

    Perham, Nick; Banbury, Simon

    2012-01-01

    Laboratory research has long demonstrated the disruptive effects of background sound on task performance, yet the real-world implications of such effects are less well known. We report two experiments that demonstrate the importance of the role of rehearsal in a novel call center-type task. In Experiment 1, performance of a novel train timetable task, in which participants identified four train journeys following presentation of train journey information, was disrupted by realistic office noise. However, in Experiment 2, when the need for rehearsal was reduced by presenting the information and the timetable at the same time, no disruption occurred. Results are discussed in terms of interference-by-process and interference-by-content approaches to short-term memory.

  18. Novel laboratory simulations of astrophysical jets

    NASA Astrophysics Data System (ADS)

    Brady, Parrish Clawson

    This thesis was motivated by the promise that some physical aspects of astrophysical jets and collimation processes can be scaled to laboratory parameters through hydrodynamic scaling laws. The simulation of astrophysical jet phenomena with laser-produced plasmas was attractive because the laser-target interaction can inject energetic, repeatable plasma into an external environment. Novel laboratory simulations of astrophysical jets involved constructing and using the YOGA laser, a 1064 nm, 8 ns pulse laser with energies up to 3.7 ± 0.2 J. Laser-produced plasmas were characterized using Schlieren imaging, interferometry, and ICCD photography for their use in simulating jet and magnetosphere physics. The evolution of the laser-produced plasma in various conditions was compared with self-similar solutions and HYADES computer simulations. Millimeter-scale magnetized collimated outflows were produced by a centimeter-scale cylindrically symmetric electrode configuration triggered by a laser-produced plasma. A cavity with a flared nozzle surrounded the center electrode, and the electrode ablation created supersonic uncollimated flows. This flow became collimated when the center electrode changed from an anode to a cathode. The plasma jets were in axially directed permanent magnetic fields with strengths up to 5000 Gauss. The collimated magnetized jets were 0.1-0.3 cm wide, up to 2.0 cm long, and had velocities of ~4.0 × 10^6 cm/s. The dynamics of the evolution of the jet were compared qualitatively and quantitatively with fluxtube simulations from Bellan's formulation [6], giving a calculated estimate of ~2.6 × 10^6 cm/s for the jet evolution velocity and evidence for jet rotation. The density measured with interferometry was 1.9 ± 0.2 × 10^17 cm^-3, compared with 2.1 × 10^16 cm^-3 calculated with Bellan's pressure balance formulation. Kinks in the jet column were produced consistent with the Kruskal-Shafranov condition, which allowed stable and symmetric jets to form with

  19. Accuracy Maximization Analysis for Sensory-Perceptual Tasks: Computational Improvements, Filter Robustness, and Coding Advantages for Scaled Additive Noise

    PubMed Central

    Burge, Johannes

    2017-01-01

    Accuracy Maximization Analysis (AMA) is a recently developed Bayesian ideal observer method for task-specific dimensionality reduction. Given a training set of proximal stimuli (e.g. retinal images), a response noise model, and a cost function, AMA returns the filters (i.e. receptive fields) that extract the most useful stimulus features for estimating a user-specified latent variable from those stimuli. Here, we first contribute two technical advances that significantly reduce AMA’s compute time: we derive gradients of cost functions for which two popular estimators are appropriate, and we implement a stochastic gradient descent (AMA-SGD) routine for filter learning. Next, we show how the method can be used to simultaneously probe the impact on neural encoding of natural stimulus variability, the prior over the latent variable, noise power, and the choice of cost function. Then, we examine the geometry of AMA’s unique combination of properties that distinguish it from better-known statistical methods. Using binocular disparity estimation as a concrete test case, we develop insights that have general implications for understanding neural encoding and decoding in a broad class of fundamental sensory-perceptual tasks connected to the energy model. Specifically, we find that non-orthogonal (partially redundant) filters with scaled additive noise tend to outperform orthogonal filters with constant additive noise; non-orthogonal filters and scaled additive noise can interact to sculpt noise-induced stimulus encoding uncertainty to match task-irrelevant stimulus variability. Thus, we show that some properties of neural response thought to be biophysical nuisances can confer coding advantages to neural systems. Finally, we speculate that, if repurposed for the problem of neural systems identification, AMA may be able to overcome a fundamental limitation of standard subunit model estimation. As natural stimuli become more widely used in the study of psychophysical and

  20. 2. COLD FLOW LABORATORY, VIEW TOWARDS NORTH. Glenn L. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. COLD FLOW LABORATORY, VIEW TOWARDS NORTH. - Glenn L. Martin Company, Titan Missile Test Facilities, Cold Flow Laboratory Building B, Waterton Canyon Road & Colorado Highway 121, Lakewood, Jefferson County, CO

  1. Using grasping tasks to evaluate hand force coordination in children with hemiplegic cerebral palsy.

    PubMed

    Mackenzie, Samuel J; Getchell, Nancy; Modlesky, Christopher M; Miller, Freeman; Jaric, Slobodan

    2009-08-01

    The objective was to assess force coordination in children with hemiplegic cerebral palsy (CP) using a device that allows testing of both unimanual and bimanual manipulation tasks performed under static and dynamic conditions. The study used a nonequivalent-groups design, conducted in a university motor control research laboratory. Six children with hemiplegic CP (age, mean ± SD, 11.6 ± 1.8 y) and 6 typically developing controls (11.6 ± 1.6 y) participated. Children performed simple lifting and force-matching static ramp tasks by way of both unimanual and bimanual pulling, using a device that measures grip force (force acting perpendicularly at the digits-device contact area) and load force (tangential force). Main outcome measures were grip/load force ratios (grip force scaling) and correlation coefficients (force coupling). CP subjects showed significantly higher grip/load force ratios (P<.05) and slightly lower correlation coefficients than the control group, with more pronounced differences for most tasks when using their involved hand. For subjects with CP, switching from unimanual to bimanual conditions did not change scaling or coupling for the involved hand (P>.05). Compared with healthy children, the impaired hand function of the hemiplegic CP pediatric population could be reflected in excessive grip force that is also decoupled from ongoing changes in load force. Therefore, the bimanual grip-load device used in this study could provide a sensitive measure of grip force coordination in CP, although nonmotor deficits should be taken into account when asking children to perform more complex tasks.

  2. Diffusion Activities in College Laboratory Manuals

    ERIC Educational Resources Information Center

    Tweedy, Maryanne E.; Hoese, William J.

    2005-01-01

    Many have called for reform of the science curriculum to incorporate the process of inquiry: this has been shown to improve student understanding of biological concepts. Laboratory activities provide excellent opportunities to incorporate inquiry into the curriculum. This study used a modified version of the Laboratory Task Analysis Instrument…

  3. Simulated Laboratory in Digital Logic.

    ERIC Educational Resources Information Center

    Cleaver, Thomas G.

    Design of computer circuits used to be a pencil and paper task followed by laboratory tests, but logic circuit design can now be done in half the time as the engineer accesses a program which simulates the behavior of real digital circuits, and does all the wiring and testing on his computer screen. A simulated laboratory in digital logic has been…

  4. Chapter 1.1 Process Scale-Up of Cellulose Nanocrystal Production to 25 kg per Batch at the Forest Products Laboratory

    Treesearch

    Richard S. Reiner; Alan W. Rudie

    2013-01-01

    The Fiber and Chemical Sciences Research Work Unit at the Forest Products Laboratory began working out the preparation of cellulose nanocrystals in 2006, using the method of Dong, Revol, and Gray. Initial samples were provided to several scientists within the Forest Service. Continued requests for this material forced scale-up from the initial 20 g scale to kg...

  5. Separate Gating Mechanisms Mediate the Regulation of K2P Potassium Channel TASK-2 by Intra- and Extracellular pH*

    PubMed Central

    Niemeyer, María Isabel; Cid, L. Pablo; Peña-Münzenmayer, Gaspar; Sepúlveda, Francisco V.

    2010-01-01

    TASK-2 (KCNK5 or K2P5.1) is a background K+ channel that is opened by extracellular alkalinization and plays a role in renal bicarbonate reabsorption and central chemoreception. Here, we demonstrate that in addition to its regulation by extracellular protons (pHo), TASK-2 is gated open by intracellular alkalinization. The following pieces of evidence suggest that the gating process controlled by intracellular pH (pHi) is independent from that under the command of pHo. It was not possible to overcome closure by extracellular acidification by means of intracellular alkalinization. The mutant TASK-2-R224A, which lacks sensitivity to pHo, had normal pHi-dependent gating. Increasing the extracellular K+ concentration acid-shifted the pHo activity curve of TASK-2 yet did not affect its pHi gating. pHo modulation of TASK-2 is voltage-dependent, whereas pHi gating was not altered by membrane potential. These results suggest that pHo, which controls a selectivity filter external gate, and pHi act on different gating processes to open and close TASK-2 channels. We speculate that pHi regulates an inner gate. We demonstrate that neutralization of a lysine residue (Lys245) located at the C-terminal end of transmembrane domain 4 by mutation to alanine abolishes gating by pHi. We postulate that this lysine acts as an intracellular pH sensor, as its mutation to histidine acid-shifts the pHi-dependence curve of TASK-2, as expected from its lower pKa. We conclude that intracellular pH, together with pHo, is a critical determinant of TASK-2 activity and therefore of its physiological function. PMID:20351106

  6. Drive Scaling of hohlraums heated with 2ω light

    NASA Astrophysics Data System (ADS)

    Oades, Kevin; Foster, John; Slark, Gary; Stevenson, Mark; Kauffman, Robert; Suter, Larry; Hinkel, Denise; Miller, Mike; Schneider, Marilyn; Springer, Paul

    2002-11-01

    We report on experiments using a single beam from AWE's HELEN laser to study the scaling of hohlraum drive with hohlraum scale size. The hohlraums were heated with 400 J in a 1 ns square pulse, with and without a phase plate. The drive was measured using a PCD and an FRD. Scattered light was measured using a full-aperture backscatter system. Drive is consistent with hohlraum scaling and LASNEX modeling using the absorbed laser energy. Bremsstrahlung from fast electrons and M-shell x-ray production were also measured. This work was performed under the auspices of the U.S. Department of Energy by the University of California, Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48.

  7. Developing Intentionality and L2 Classroom Task-Engagement

    ERIC Educational Resources Information Center

    Stelma, Juup

    2014-01-01

    This paper extends work on "intentionality", from philosophy, psychology and education to an exploration of learners' meaning-making in L2 classroom task-engagement. The paper draws on both phenomenological and folk-psychological perspectives on intentionality, and employs John R. Searle's intrinsic (mental) and derived (observable)…

  8. Inhibition in task switching: The reliability of the n - 2 repetition cost.

    PubMed

    Kowalczyk, Agnieszka W; Grange, James A

    2017-12-01

    The n - 2 repetition cost seen in task switching is the effect of slower response times when performing a recently completed task (e.g. an ABA sequence) compared to performing a task that was not recently completed (e.g. a CBA sequence). This cost is thought to reflect cognitive inhibition of task representations, and as such the n - 2 repetition cost has begun to be used as an assessment of individual differences in inhibitory control; however, the reliability of this measure has not been investigated in a systematic manner. The current study addressed this important issue. Seventy-two participants performed three task switching paradigms; participants were also assessed on rumination traits and processing speed, measures of individual differences potentially modulating the n - 2 repetition cost. We found significant n - 2 repetition costs for each paradigm. However, split-half reliability tests revealed that this cost was not reliable at the individual-difference level. Neither rumination tendencies nor processing speed predicted this cost. We conclude that the n - 2 repetition cost is not reliable as a measure of individual differences in inhibitory control.
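
    The two measures discussed above, the n - 2 repetition cost and its split-half reliability, are straightforward to compute. A sketch with hypothetical reaction-time data (not the study's):

```python
# Sketch: n - 2 repetition cost and split-half reliability with the
# Spearman-Brown correction. All RT values are hypothetical.
from statistics import mean

def n2_cost(aba_rts, cba_rts):
    """Mean RT on ABA (n-2 repetition) minus CBA (n-2 switch) trials."""
    return mean(aba_rts) - mean(cba_rts)

def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return sxy / (sx * sy)

def spearman_brown(r):
    """Step a half-test correlation up to full-length reliability."""
    return 2 * r / (1 + r)

# Hypothetical per-participant costs (ms) from odd and even trial halves
odd_half = [30.0, 12.0, 25.0, 5.0, 41.0]
even_half = [28.0, 15.0, 20.0, 8.0, 35.0]
r = pearson(odd_half, even_half)
reliability = spearman_brown(r)
```

A significant group-level cost with a low corrected split-half correlation is exactly the pattern the abstract reports: real on average, but too noisy to rank individuals.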

  9. Impact decapitation from laboratory to basin scales

    NASA Technical Reports Server (NTRS)

    Schultz, P. H.; Gault, D. E.

    1991-01-01

    Although vertical hypervelocity impacts result in the annihilation (melting/vaporization) of the projectile, oblique impacts (less than 15 deg) fundamentally change the partitioning of energy, with fragments as large as 10 percent of the original projectile surviving. Laboratory experiments reveal that both ductile and brittle projectiles produce very similar results, where limiting disruption depends on stresses proportional to the vertical velocity component. Failure of the projectile at laboratory impact velocities (6 km/s) is largely controlled by stresses established before the projectile has penetrated a significant distance into the target. The planetary surface record exhibits numerous examples of oblique impacts with evidence for projectile failure and downrange sibling collisions.

  10. Dysautonomia rating scales in Parkinson's disease: sialorrhea, dysphagia, and constipation--critique and recommendations by movement disorders task force on rating scales for Parkinson's disease.

    PubMed

    Evatt, Marian L; Chaudhuri, K Ray; Chou, Kelvin L; Cubo, Ester; Hinson, Vanessa; Kompoliti, Katie; Yang, Chengwu; Poewe, Werner; Rascol, Olivier; Sampaio, Cristina; Stebbins, Glenn T; Goetz, Christopher G

    2009-04-15

    Upper and lower gastrointestinal dysautonomia symptoms (GIDS) - sialorrhea, dysphagia, and constipation - are common in Parkinson's disease (PD) and often socially as well as physically disabling for patients. Available invasive quantitative measures for assessing these symptoms and their response to therapy are time-consuming, require specialized equipment, can cause patient discomfort, and present patients with risk. The Movement Disorders Society commissioned a task force to assess available clinical rating scales, critique their clinimetric properties, and make recommendations regarding their clinical utility. Six clinical researchers and a biostatistician systematically searched the literature for scales of sialorrhea, dysphagia, and constipation and evaluated the scales' previous use, performance parameters, and quality of validation data (if available). A scale was designated "Recommended" if the scale was used in clinical studies beyond the group that developed it, has been specifically used in PD reports, and clinimetric studies have established that it is valid, reliable, and sensitive. "Suggested" scales met at least part of the above criteria, but fell short of meeting all. Based on the systematic review, scales for individual symptoms of sialorrhea, dysphagia, and constipation were identified along with three global scales that include these symptoms in the context of assessing dysautonomia or nonmotor symptoms. Three sialorrhea scales met criteria for Suggested: Drooling Severity and Frequency Scale (DSFS), Drooling Rating Scale, and Sialorrhea Clinical Scale for PD (SCS-PD). Two dysphagia scales, the Swallowing Disturbance Questionnaire (SDQ) and Dysphagia-Specific Quality of Life (SWAL-QOL), met criteria for Suggested.
Although the Rome III constipation module is widely accepted in the gastroenterology community, and the earlier version from the Rome II criteria has been used in a single study of PD patients, neither met criteria for Suggested or Recommended.

  11. Space debris measurement program at Phillips Laboratory

    NASA Technical Reports Server (NTRS)

    Dao, Phan D.; Mcnutt, Ross T.

    1992-01-01

    Ground-based optical sensing was identified as a technique for measuring space debris complementary to radar in the critical debris size range of 1 to 10 cm. The Phillips Laboratory is building a staring optical sensor for space debris measurement and considering search and track optical measurement at additional sites. The staring sensor is implemented in collaboration with Wright Laboratory using the 2.5 m telescope at Wright Patterson AFB, Dayton, Ohio. The search and track sensor is designed to detect and track orbital debris in tasked orbits. A progress report and a discussion of sensor performance and search and track strategies will be given.

  12. Plant and mycorrhizal weathering at the laboratory mesocosm scale

    NASA Astrophysics Data System (ADS)

    Andrews, M. Y.; Leake, J.; Banwart, S. A.; Beerling, D. J.

    2011-12-01

    The evolutionary development of large vascular land plants in the Paleozoic is hypothesized to have enhanced weathering of Ca and Mg silicate minerals. This plant-centric view overlooks the fact that plants and their associated mycorrhizal fungi co-evolved. Many weathering processes usually ascribed to plants may actually be driven by the combined activities of roots and mycorrhizal fungi. This study focuses on two key evolutionary events in plant and fungal evolution: 1) the transition from gymnosperm-only to mixed angiosperm-gymnosperm forests in the Mesozoic and 2) the similarly timed rise of ectomycorrhizal fungi (EM) in a previously arbuscular mycorrhizal (AM) only world. Here we present results from a novel mesocosm-scale laboratory experiment designed to allow investigation of plant- and mycorrhizae-driven carbon fluxes and mineral weathering at different soil depths, and under ambient (400 ppm) and elevated (1500 ppm) atmospheric CO2. To test our hypothesis that photosynthetic carbon flux from the plant to the roots and fungal partner drives biological weathering of minerals, we studied five mycorrhizal plant species: the gymnosperms Sequoia sempervirens (AM), Pinus sylvestris (EM) and Ginkgo biloba (AM), and two angiosperms, Magnolia grandiflora (AM) and Betula pendula (EM). In this long-term (7-9 month) experiment, plants were grown in controlled-environment chambers, with replicated systems at two atmospheric CO2 levels. Each mycorrhizal plant had access to isolated horizontal mesh cores containing crushed granite and basalt at three depths, in a compost:sand (50:50 vol:vol) bulk substrate, with appropriate plant-free and mineral-free controls. 14CO2 pulse-labeling provided a snapshot of the magnitude, timing, and allocation of carbon through the atmosphere-plant-fungi-soil system and also measured mycorrhizal fungal activity associated with the target granite and basalt. Total plant and fungal biomass were also assessed in relation to +/- mineral treatments and

  13. Managing laboratory automation

    PubMed Central

    Saboe, Thomas J.

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A big-picture, or continuum, view is presented and some of the reasons for success or failure of the various examples cited are explored. Finally, some comments on future automation needs are discussed. PMID:18925018

  14. Managing laboratory automation.

    PubMed

    Saboe, T J

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A big-picture, or continuum, view is presented and some of the reasons for success or failure of the various examples cited are explored. Finally, some comments on future automation needs are discussed.

  15. Anaerobic Digestion of Laminaria japonica Waste from Industrial Production Residues in Laboratory- and Pilot-Scale

    PubMed Central

    Barbot, Yann Nicolas; Thomsen, Claudia; Thomsen, Laurenz; Benz, Roland

    2015-01-01

    The cultivation of macroalgae to supply the biofuel, pharmaceutical or food industries generates a considerable amount of organic residue, which represents a potential substrate for biomethanation. Its use optimizes the total resource exploitation by the simultaneous disposal of waste biomaterials. In this study, we explored the biochemical methane potential (BMP) and biomethane recovery of industrial Laminaria japonica waste (LJW) in batch, continuous laboratory and pilot-scale trials. Thermo-acidic pretreatment with industry-grade HCl or industrial flue gas condensate (FGC), as well as a co-digestion approach with maize silage (MS) did not improve the biomethane recovery. BMPs between 172 mL and 214 mL g−1 volatile solids (VS) were recorded. We proved the feasibility of long-term continuous anaerobic digestion with LJW as sole feedstock showing a steady biomethane production rate of 173 mL g−1 VS. The quality of fermentation residue was sufficient to serve as biofertilizer, with enriched amounts of potassium, sulfur and iron. We further demonstrated the upscaling feasibility of the process in a pilot-scale system where a CH4 recovery of 189 L kg−1 VS was achieved and a biogas composition of 55% CH4 and 38% CO2 was recorded. PMID:26393620

  16. SHC Project 3.63, Task 2, Beneficial Use of Waste Materials ...

    EPA Pesticide Factsheets

    SHC Project 3.63, Task 2, “Beneficial Use of Waste Materials”, is designed to conduct research and analyses to characterize and quantify the risks and benefits of using or reusing waste materials. Task 2 comprises 6 primary research areas that cover a broad spectrum of topics germane to the beneficial use of waste materials and address Agency, Office, Region and other client needs. The 6 research areas are: 1) Materials Recovery Technology, 2) Beneficial Use of Materials Optimization, 3) Novel Products from Waste Materials, 4) Land Application of Biosolids, 5) Soil Remediation Amendments and 6) Improved Leaching Methods for More Accurate Prediction of Environmental Release of Metals. The objectives of each research area, their intended products and progress to date will be presented. The products of this Task will enable communities and the Agency to better protect and enhance human health, well-being and the environment for current and future generations, through the reduction in material consumption, reuse, and recycling of materials. This presentation is designed to convey the rationale, purpose and planned research in EPA's Safe and Healthy Communities (SHC) National Research Program Project 3.63 (Sustainable Materials Management) Task 2, “Beneficial Use of Waste Materials”, which is designed to conduct research and analyses to characterize and quantify the risks and benefits of using or reusing waste materials. This presentation has bee

  17. Prospective memory in young and older adults: the effects of task importance and ongoing task load.

    PubMed

    Smith, Rebekah E; Hunt, R Reed

    2014-01-01

    Remembering to perform an action in the future, called prospective memory (PM), often shows age-related differences in favor of young adults when tested in the laboratory. Recently, Smith, Horn, and Bayen (2012; Aging, Neuropsychology, and Cognition, 19, 495) embedded a PM task in an ongoing color-matching task and manipulated the difficulty of the ongoing task by varying the number of colors on each trial. Smith et al. found that age-related differences in PM performance (lower PM performance for older adults relative to young adults) persisted even when older adults could perform the ongoing task as well as or better than the young adults. The current study investigates a possible explanation for this pattern by including a manipulation of task emphasis: for half of the participants the prospective memory task was emphasized, while for the other half the ongoing color-matching task was emphasized. Older adults performed a 4-color version of the ongoing color-matching task, while young adults completed either the 4-color or a more difficult 6-color version of the ongoing task. Older adults failed to perform as well as the young adults on the prospective memory task regardless of task emphasis, even when they performed as well as or better than the young adults on the ongoing color-matching task. The current results indicate that the lack of an effect of ongoing task load on prospective memory performance is not due to a perception that one task is more important than the other.

  18. Pilot-scale evaluation of a novel TiO2-supported V2O5 catalyst for DeNOx at low temperatures at a waste incinerator.

    PubMed

    Jung, Hyounduk; Park, Eunseuk; Kim, Minsu; Jurng, Jongsoo

    2017-03-01

    The removal of NOx by catalytic technology at low temperatures is significant for the treatment of flue gas in waste incineration plants, especially at temperatures below 200°C. A novel, highly active TiO₂-supported vanadium oxide catalyst for low temperatures (200-250°C) has been developed for the selective catalytic reduction (SCR) de-NOx process with ammonia. The catalyst was evaluated in pilot-scale equipment, and the results were compared with those obtained in our previous work using laboratory-scale (small-volume test) equipment as well as bench-scale laboratory equipment. In the present work, we performed our experiments in pilot-scale equipment using part of the effluent flue gas obtained from the flue gas cleaning equipment of a full-scale waste incineration plant in South Korea. Based on our previous work, we prepared a TiO₂-supported V₂O₅ catalyst (with a loading of 7 wt% impregnated V₂O₅) coated on a honeycomb cordierite monolith to remove NOx from waste incinerator flue gas at low temperatures. The NOx (nitrogen oxides) removal efficiency of the SCR catalyst bed was measured in a catalyst fixed-bed reactor (flow rate: 100 m³ h⁻¹) using real exhaust gas from the waste incinerator. The experimental results showed that the V₂O₅/TiO₂ SCR catalyst exhibited good DeNOx performance (over 98% conversion at an operating temperature of 300°C, 95% at 250°C, and 70% at 200°C), much better than the performance of commercial SCR catalysts (as low as 55% conversion at 250°C) under the same operating conditions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Analyzing Web pages visual scanpaths: between and within tasks variability.

    PubMed

    Drusch, Gautier; Bastien, J M Christian

    2012-01-01

    In this paper, we propose a new method for comparing scanpaths in a bottom-up approach, and a test of the scanpath theory. To do so, we conducted a laboratory experiment in which 113 participants were invited to accomplish a set of tasks on two different websites. For each site, they had to perform two tasks, each of which had to be repeated once. The data were analyzed using a procedure similar to the one used by Duchowski et al. [8]. The first step was to automatically identify, then label, AOIs with the mean-shift clustering procedure [19]. Then, scanpaths were compared two by two with a modified version of the string-edit method, which takes into account the order of AOI visualizations [2]. Our results show that scanpath variability between tasks but within participants seems to be lower than the variability within task for a given participant. In other words, participants seem to be more consistent when they perform different tasks than when they repeat the same task. In addition, participants viewed more of the same AOIs when they performed a different task on the same Web page than when they repeated the same task. These results are quite different from what the scanpath theory predicts.
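    The string-edit comparison described above can be sketched as follows: scanpaths are encoded as strings of AOI labels and compared with a plain Levenshtein distance. This minimal version omits the order-weighted modification of [2], and all function and variable names are illustrative, not from the study.

    ```python
    # Minimal sketch: compare two scanpaths encoded as AOI-label strings.

    def edit_distance(a: str, b: str) -> int:
        """Classic Levenshtein distance between two AOI-label sequences."""
        m, n = len(a), len(b)
        prev = list(range(n + 1))
        for i in range(1, m + 1):
            curr = [i] + [0] * n
            for j in range(1, n + 1):
                cost = 0 if a[i - 1] == b[j - 1] else 1
                curr[j] = min(prev[j] + 1,         # deletion
                              curr[j - 1] + 1,     # insertion
                              prev[j - 1] + cost)  # substitution
            prev = curr
        return prev[n]

    def similarity(a: str, b: str) -> float:
        """Normalised similarity in [0, 1]; 1 means identical scanpaths."""
        if not a and not b:
            return 1.0
        return 1.0 - edit_distance(a, b) / max(len(a), len(b))

    # Two fixation sequences over AOIs labelled A-D:
    print(similarity("ABCD", "ABDD"))  # 0.75
    ```

    A similarity matrix built from such pairwise scores is what would then be compared across within-task repetitions and between-task pairs.
    
    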

  20. Unsteady Aero Computation of a 1 1/2 Stage Large Scale Rotating Turbine

    NASA Technical Reports Server (NTRS)

    To, Wai-Ming

    2012-01-01

    This report is the documentation of the work performed for the Subsonic Rotary Wing Project under NASA's Fundamental Aeronautics Program. It was funded through Task Number NNC10E420T under GESS-2 Contract NNC06BA07B in the period of 10/1/2010 to 8/31/2011. The objective of the task is to provide support for the development of variable speed power turbine technology through application of computational fluid dynamics analyses. This includes work elements in mesh generation, multistage URANS simulations, and post-processing of the simulation results for comparison with the experimental data. The unsteady CFD calculations were performed with the TURBO code running in multistage single passage (phase lag) mode. Meshes for the blade rows were generated with the NASA-developed TCGRID code. The CFD performance is assessed and improvements are recommended for future research in this area. The United Technologies Research Center's 1 1/2 stage Large Scale Rotating Turbine was selected to be the candidate engine configuration for this computational effort because of the completeness and availability of the data.

  1. Monitoring supports performance in a dual-task paradigm involving a risky decision-making task and a working memory task

    PubMed Central

    Gathmann, Bettina; Schiebener, Johannes; Wolf, Oliver T.; Brand, Matthias

    2015-01-01

    Performing two cognitively demanding tasks at the same time is known to decrease performance. The current study investigates the underlying executive functions of a dual-tasking situation involving the simultaneous performance of decision making under explicit risk and a working memory task. It is suggested that making a decision and performing a working memory task at the same time should particularly require monitoring—an executive control process supervising behavior and the state of processing on two tasks. To test the role of a supervisory/monitoring function in such a dual-tasking situation we investigated 122 participants with the Game of Dice Task plus 2-back task (GDT plus 2-back task). This dual task requires participants to make decisions under risk and to perform a 2-back working memory task at the same time. Furthermore, a task measuring a set of several executive functions gathered in the term concept formation (Modified Card Sorting Test, MCST) and the newly developed Balanced Switching Task (BST), measuring monitoring in particular, were used. The results demonstrate that concept formation and monitoring are involved in the simultaneous performance of decision making under risk and a working memory task. In particular, the mediation analysis revealed that BST performance partially mediates the influence of MCST performance on the GDT plus 2-back task. These findings suggest that monitoring is one important subfunction for superior performance in a dual-tasking situation including decision making under risk and a working memory task. PMID:25741308

  2. Laboratory meter-scale seismic monitoring of varying water levels in granular media

    NASA Astrophysics Data System (ADS)

    Pasquet, S.; Bodet, L.; Bergamo, P.; Guérin, R.; Martin, R.; Mourgues, R.; Tournat, V.

    2016-12-01

    Laboratory physical modelling and non-contacting ultrasonic techniques are frequently proposed to tackle theoretical and methodological issues related to geophysical prospecting. Following recent developments illustrating the ability of seismic methods to image spatial and/or temporal variations of water content in the vadose zone, we developed laboratory experiments aimed at testing the sensitivity of seismic measurements (i.e., pressure-wave travel times and surface-wave phase velocities) to water saturation variations. Ultrasonic techniques were used to simulate typical seismic acquisitions on small-scale controlled granular media presenting different water levels. Travel times and phase velocity measurements obtained at the dry state were validated with both theoretical models and numerical simulations and serve as reference datasets. The increasing water level clearly affects the recorded wave field in both its phase and amplitude, but the collected data cannot yet be inverted in the absence of a comprehensive theoretical model for such partially saturated and unconsolidated granular media. The differences in travel time and phase velocity observed between the dry and wet models show patterns that are interestingly coincident with the observed water level and depth of the capillary fringe, thus offering attractive perspectives for studying soil water content variations in the field.

  3. Alcohol impairment of performance on steering and discrete tasks in a driving simulator

    DOT National Transportation Integrated Search

    1974-12-01

    In this program a simplified laboratory simulator was developed to test two types of tasks used in driving on the open road: a continuous "steering task" to regulate against gust induced disturbances and an intermittent "discrete response task" requi...

  4. Robonaut 2 in the U.S. Laboratory

    NASA Image and Video Library

    2012-02-15

    ISS030-E-074059 (15 Feb. 2012) --- Robonaut 2, nicknamed R2, is pictured in the Destiny laboratory of the International Space Station while NASA astronaut Dan Burbank (mostly out of frame at left) uses a computer during R2's initial checkouts. R2 later went on to make history with the first human/robotic handshake to be performed in space.

  5. Staffing benchmarks for histology laboratories.

    PubMed

    Buesa, René J

    2010-06-01

    This article summarizes annual workloads for staff positions and work flow productivity (WFP) values from 247 human pathology, 31 veterinary, and 35 forensic histology laboratories (histolabs). There are single summaries for veterinary and forensic histolabs, but the data from human pathology are divided into 2 groups because of statistically significant differences between those from Spain and 6 Hispano American countries (SpHA) and the rest from the United States and 17 other countries. The differences reflect the way the work is organized, but the histotechnicians and histotechnologists (histotechs) from SpHA have the same task productivity levels as those from any other country (Buesa RJ. Productivity standards for histology laboratories. [YADPA 50,552]). The information is also segregated by groups of histolabs with increasing workloads; this aspect also showed statistical differences. The information from human pathology histolabs other than those from SpHA was used to calculate staffing annual benchmarks for pathologists (from 3700 to 6500 cases depending on the histolab annual workload), pathology assistants (20,000 cases), staff histotechs (9900 blocks), cutting histotechs (15,000 blocks), histotechs doing special procedures (9500 slides if done manually or 15,000 slides with autostainers), dieners (100 autopsies), laboratory aides and transcriptionists (15,000 cases each), and secretaries (20,000 cases). There are also recommendations about workload limits for supervisory staff (lead techs and supervisors) and when neither is required. Each benchmark was related with the productivity of the different tasks they include (Buesa RJ. Productivity standards for histology laboratories. [YADPA 50,552]) to calculate the hours per year required to complete them. The relationship between workload and benchmarks allows the director of pathology to determine the staff needed for the efficient operation of the histolab.
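    The workload-to-benchmark relationship described above reduces to a ceiling division: positions needed = annual workload / benchmark, rounded up. A minimal sketch, using the benchmark figures quoted in the summary; the dictionary keys and function name are ours, not from the article.

    ```python
    import math

    # Annual benchmarks per position, as quoted in the summary above
    # (cases/year or blocks/year depending on the role).
    BENCHMARKS = {
        "pathologist": 6500,           # upper bound of the quoted range
        "pathology_assistant": 20000,
        "staff_histotech": 9900,
        "cutting_histotech": 15000,
    }

    def staff_needed(annual_workload: int, role: str) -> int:
        """Whole positions required to cover the annual workload."""
        return math.ceil(annual_workload / BENCHMARKS[role])

    # A histolab cutting 30,000 blocks/year needs 4 staff histotechs:
    print(staff_needed(30000, "staff_histotech"))  # 4
    ```
    
    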

  6. CORRELATIONS BETWEEN HOMOLOGUE CONCENTRATIONS OF PCDD/FS AND TOXIC EQUIVALENCY VALUES IN LABORATORY-, PACKAGE BOILER-, AND FIELD-SCALE INCINERATORS

    EPA Science Inventory

    The toxic equivalency (TEQ) values of polychlorinated dibenzo-p-dioxins and polychlorinated dibenzofurans (PCDD/Fs) are predicted with a model based on the homologue concentrations measured from a laboratory-scale reactor (124 data points), a package boiler (61 data points), and ...

  7. The biomechanical demands of manual scaling on the shoulders & neck of dental hygienists.

    PubMed

    La Delfa, Nicholas J; Grondin, Diane E; Cox, Jocelyn; Potvin, Jim R; Howarth, Samuel J

    2017-01-01

    The purpose of this study was to evaluate the postural and muscular demands placed on the shoulders and neck of dental hygienists when performing a simulated manual scaling task. Nineteen healthy female dental hygienists performed 30 min of simulated manual scaling on a manikin head in a laboratory setting. Surface electromyography was used to monitor muscle activity from several neck and shoulder muscles, and neck and arm elevation kinematics were evaluated using motion capture. The simulated scaling task resulted in a large range of neck and arm elevation angles and excessive low-level muscular demands in the neck extensor and scapular stabilising muscles. The physical demands varied depending on the working position of the hygienists relative to the manikin head. These findings are valuable in guiding future ergonomics interventions aimed at reducing the physical exposures of dental hygiene work. Practitioner Summary: Given that this study evaluates the physical demands of manual scaling, a procedure that is fundamental to dental hygiene work, the findings are valuable to identify ergonomics interventions to reduce the prevalence of work-related injuries, disability and the potential for early retirement among this occupational group.

  8. Task Preference, Affective Response, and Engagement in L2 Use in a US University Context

    ERIC Educational Resources Information Center

    Phung, Linh

    2017-01-01

    While learners' engagement has been recognized as important for second language (L2) learning in task-based language teaching (TBLT), how engagement is manifest in learners' L2 use during task performance and how tasks can be designed to facilitate better engagement have not received enough attention in the L2 research. This study investigates the…

  9. Reproducibility of risk figures in 2nd-trimester maternal serum screening for down syndrome: comparison of 2 laboratories.

    PubMed

    Benn, Peter A; Makowski, Gregory S; Egan, James F X; Wright, Dave

    2006-11-01

    Analytical error affects 2nd-trimester maternal serum screening for Down syndrome risk estimation. We analyzed the between-laboratory reproducibility of risk estimates from 2 laboratories. Laboratory 1 used Bayer ACS180 immunoassays for alpha-fetoprotein (AFP) and human chorionic gonadotropin (hCG), Diagnostic Systems Laboratories (DSL) RIA for unconjugated estriol (uE3), and DSL enzyme immunoassay for inhibin-A (INH-A). Laboratory 2 used Beckman immunoassays for AFP, hCG, and uE3, and DSL enzyme immunoassay for INH-A. Analyte medians were separately established for each laboratory. We used the same computational algorithm for all risk calculations, and we used Monte Carlo methods for computer modeling. For 462 samples tested, risk figures from the 2 laboratories differed >2-fold for 44.7%, >5-fold for 7.1%, and >10-fold for 1.7%. Between-laboratory differences in analytes were greatest for uE3 and INH-A. The screen-positive rates were 9.3% for laboratory 1 and 11.5% for laboratory 2, with a significant difference in the patients identified as screen-positive vs screen-negative (McNemar test, P<0.001). Computer modeling confirmed the large between-laboratory risk differences. Differences in performance of assays and laboratory procedures can have a large effect on patient-specific risks. Screening laboratories should minimize test imprecision and ensure that each assay performs in a manner similar to that assumed in the risk computational algorithm.
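    The kind of computer modeling referred to above can be illustrated with a toy Monte Carlo simulation. This is a sketch under strong simplifying assumptions: each laboratory's risk estimate is treated as the true risk perturbed by an independent log-normal error with an illustrative standard deviation (in log10 risk units), not the paper's actual four-analyte model; function and parameter names are ours.

    ```python
    import math
    import random

    def fold_difference_rates(n=100_000, sd_log10=0.25, folds=(2, 5, 10), seed=1):
        """Fraction of paired risk estimates differing by more than each fold.

        Each lab's log10 risk is the true value plus independent N(0, sd_log10)
        noise, so the between-lab log10 difference is the difference of two
        such draws.
        """
        rng = random.Random(seed)
        counts = {k: 0 for k in folds}
        for _ in range(n):
            diff = abs(rng.gauss(0, sd_log10) - rng.gauss(0, sd_log10))
            for k in folds:
                if diff > math.log10(k):
                    counts[k] += 1
        return {k: counts[k] / n for k in folds}

    rates = fold_difference_rates()
    for k, r in rates.items():
        print(f">{k}-fold: {r:.1%}")
    ```

    Tuning `sd_log10` shows how quickly modest assay imprecision inflates the share of >2-fold discrepancies, the qualitative effect the study reports.
    
    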

  10. Effects of Task Complexity on L2 Writing Behaviors and Linguistic Complexity

    ERIC Educational Resources Information Center

    Révész, Andrea; Kourtali, Nektaria-Efstathia; Mazgutova, Diana

    2017-01-01

    This study investigated whether task complexity influences second language (L2) writers' fluency, pausing, and revision behaviors and the cognitive processes underlying these behaviors; whether task complexity affects linguistic complexity of written output; and whether relationships between writing behaviors and linguistic complexity are…

  11. Direct geoelectrical evidence of mass transfer at the laboratory scale

    NASA Astrophysics Data System (ADS)

    Swanson, Ryan D.; Singha, Kamini; Day-Lewis, Frederick D.; Binley, Andrew; Keating, Kristina; Haggerty, Roy

    2012-10-01

    Previous field-scale experimental data and numerical modeling suggest that the dual-domain mass transfer (DDMT) of electrolytic tracers has an observable geoelectrical signature. Here we present controlled laboratory experiments confirming the electrical signature of DDMT and demonstrate the use of time-lapse electrical measurements in conjunction with concentration measurements to estimate the parameters controlling DDMT, i.e., the mobile and immobile porosity and rate at which solute exchanges between mobile and immobile domains. We conducted column tracer tests on unconsolidated quartz sand and a material with a high secondary porosity: the zeolite clinoptilolite. During NaCl tracer tests we collected nearly colocated bulk direct-current electrical conductivity (σb) and fluid conductivity (σf) measurements. Our results for the zeolite show (1) extensive tailing and (2) a hysteretic relation between σf and σb, thus providing evidence of mass transfer not observed within the quartz sand. To identify best-fit parameters and evaluate parameter sensitivity, we performed over 2700 simulations of σf, varying the immobile and mobile domain and mass transfer rate. We emphasized the fit to late-time tailing by minimizing the Box-Cox power transformed root-mean square error between the observed and simulated σf. Low-field proton nuclear magnetic resonance (NMR) measurements provide an independent quantification of the volumes of the mobile and immobile domains. The best-fit parameters based on σf match the NMR measurements of the immobile and mobile domain porosities and provide the first direct electrical evidence for DDMT. Our results underscore the potential of using electrical measurements for DDMT parameter inference.
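    The fitting objective described above (a root-mean-square error computed on Box-Cox power-transformed series, which weights the low-magnitude late-time tail more heavily than a plain RMSE) can be sketched as follows. The Box-Cox exponent and all names are illustrative choices, not values from the study.

    ```python
    import math

    def boxcox(x: float, lam: float) -> float:
        """One-parameter Box-Cox power transform (natural log when lam == 0)."""
        return math.log(x) if lam == 0 else (x ** lam - 1.0) / lam

    def boxcox_rmse(observed, simulated, lam=0.3) -> float:
        """RMSE between Box-Cox-transformed observed and simulated series."""
        sq = sum((boxcox(o, lam) - boxcox(s, lam)) ** 2
                 for o, s in zip(observed, simulated))
        return math.sqrt(sq / len(observed))

    # A late-time tail mismatch contributes more after the transform:
    obs = [10.0, 5.0, 1.0, 0.2, 0.05]   # observed fluid conductivity (arb. units)
    sim = [10.0, 5.0, 1.0, 0.1, 0.02]   # one candidate simulation
    print(boxcox_rmse(obs, sim))
    ```

    In a parameter search like the one described, each of the candidate (mobile porosity, immobile porosity, mass transfer rate) combinations would be scored with this objective and the minimizer taken as the best fit.
    
    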

  12. Direct geoelectrical evidence of mass transfer at the laboratory scale

    USGS Publications Warehouse

    Swanson, Ryan D.; Singha, Kamini; Day-Lewis, Frederick D.; Binley, Andrew; Keating, Kristina; Haggerty, Roy

    2012-01-01

    Previous field-scale experimental data and numerical modeling suggest that the dual-domain mass transfer (DDMT) of electrolytic tracers has an observable geoelectrical signature. Here we present controlled laboratory experiments confirming the electrical signature of DDMT and demonstrate the use of time-lapse electrical measurements in conjunction with concentration measurements to estimate the parameters controlling DDMT, i.e., the mobile and immobile porosity and rate at which solute exchanges between mobile and immobile domains. We conducted column tracer tests on unconsolidated quartz sand and a material with a high secondary porosity: the zeolite clinoptilolite. During NaCl tracer tests we collected nearly colocated bulk direct-current electrical conductivity (σb) and fluid conductivity (σf) measurements. Our results for the zeolite show (1) extensive tailing and (2) a hysteretic relation between σf and σb, thus providing evidence of mass transfer not observed within the quartz sand. To identify best-fit parameters and evaluate parameter sensitivity, we performed over 2700 simulations of σf, varying the immobile and mobile domain and mass transfer rate. We emphasized the fit to late-time tailing by minimizing the Box-Cox power transformed root-mean square error between the observed and simulated σf. Low-field proton nuclear magnetic resonance (NMR) measurements provide an independent quantification of the volumes of the mobile and immobile domains. The best-fit parameters based on σf match the NMR measurements of the immobile and mobile domain porosities and provide the first direct electrical evidence for DDMT. Our results underscore the potential of using electrical measurements for DDMT parameter inference.

  13. Activated carbon enhanced anaerobic digestion of food waste - Laboratory-scale and Pilot-scale operation.

    PubMed

    Zhang, Le; Zhang, Jingxin; Loh, Kai-Chee

    2018-05-01

    Effects of activated carbon (AC) supplementation on anaerobic digestion (AD) of food waste were elucidated at lab and pilot scales. Lab-scale AD was performed in 1 L and 8 L digesters, while pilot-scale AD was conducted in a 1000 L digester. Based on the optimal dose of 15 g AC per working volume derived from the 1 L digester, for the same AC dosage in the 8 L digester, improved operational stability coupled with a higher methane yield was achieved even when digesters without AC supplementation failed after 59 days due to accumulation of substantial organic intermediates. At the same time, color removal from the liquid phase of the digestate was dramatically enhanced and the particle size of the digestate solids was increased by 53% through AC supplementation after running for 59 days. Pyrosequencing of the 16S rRNA gene showed that the abundance of the predominant phyla Firmicutes, Elusimicrobia and Proteobacteria was selectively enhanced by 1.7-fold, 2.9-fold and 2.1-fold, respectively. The pilot-scale digester without AC gave an average methane yield of 0.466 L·(g VS)⁻¹·d⁻¹ at a composition of 53-61% v/v methane. With AC augmentation, an increase of 41% in methane yield was achieved in the 1000 L digester under the optimal organic loading rate (1.6 g VS_FW·L⁻¹·d⁻¹). Copyright © 2018 Elsevier Ltd. All rights reserved.

  14. Behavioral Economic Laboratory Research in Tobacco Regulatory Science.

    PubMed

    Tidey, Jennifer W; Cassidy, Rachel N; Miller, Mollie E; Smith, Tracy T

    2016-10-01

    Research that can provide a scientific foundation for the United States Food and Drug Administration (FDA) tobacco policy decisions is needed to inform tobacco regulatory policy. One factor that affects the impact of a tobacco product on public health is its intensity of use, which is determined, in part, by its abuse liability or reinforcing efficacy. Behavioral economic tasks have considerable utility for assessing the reinforcing efficacy of current and emerging tobacco products. This paper provides a narrative review of several behavioral economic laboratory tasks and identifies important applications to tobacco regulatory science. Behavioral economic laboratory assessments, including operant self-administration, choice tasks and purchase tasks, can be used to generate behavioral economic data on the effect of price and other constraints on tobacco product consumption. These tasks could provide an expedited simulation of the effects of various tobacco control policies across populations of interest to the FDA. Tobacco regulatory research questions that can be addressed with behavioral economic tasks include assessments of the impact of product characteristics on product demand, assessments of the abuse liability of novel and potential modified risk tobacco products (MRTPs), and assessments of the impact of conventional and novel products in vulnerable populations.

  15. Task inhibition, conflict, and the n-2 repetition cost: A combined computational and empirical approach.

    PubMed

    Sexton, Nicholas J; Cooper, Richard P

    2017-05-01

    Task inhibition (also known as backward inhibition) is an hypothesised form of cognitive inhibition evident in multi-task situations, with the role of facilitating switching between multiple, competing tasks. This article presents a novel cognitive computational model of a backward inhibition mechanism. By combining aspects of previous cognitive models in task switching and conflict monitoring, the model instantiates the theoretical proposal that backward inhibition is the direct result of conflict between multiple task representations. In a first simulation, we demonstrate that the model produces two effects widely observed in the empirical literature, specifically, reaction time costs for both (n-1) task switches and n-2 task repeats. Through a systematic search of parameter space, we demonstrate that these effects are a general property of the model's theoretical content, and not specific parameter settings. We further demonstrate that the model captures previously reported empirical effects of inter-trial interval on n-2 switch costs. A final simulation extends the paradigm of switching between tasks of asymmetric difficulty to three tasks, and generates novel predictions for n-2 repetition costs. Specifically, the model predicts that n-2 repetition costs associated with hard-easy-hard alternations are greater than for easy-hard-easy alternations. Finally, we report two behavioural experiments testing this hypothesis, with results consistent with the model predictions. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
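    The n-2 repetition cost discussed above is, empirically, the mean reaction-time difference between ABA-type trials (the task at trial n matches the task at n-2) and CBA-type trials. A minimal sketch of that computation on made-up trial data; the function name and numbers are illustrative only.

    ```python
    def n2_repetition_cost(tasks, rts):
        """Mean RT on ABA-type switch trials minus mean RT on CBA-type trials.

        tasks: task label per trial; rts: reaction time per trial (ms).
        """
        aba, cba = [], []
        for i in range(2, len(tasks)):
            if tasks[i] == tasks[i - 1]:
                continue  # only task switches enter the analysis
            (aba if tasks[i] == tasks[i - 2] else cba).append(rts[i])
        return sum(aba) / len(aba) - sum(cba) / len(cba)

    # Toy sequence of three tasks A/B/C with per-trial RTs:
    tasks = ["A", "B", "A", "C", "B", "A", "B", "C", "A"]
    rts   = [600, 650, 720, 640, 660, 630, 700, 640, 650]
    print(n2_repetition_cost(tasks, rts))  # 66.0 (ms)
    ```

    A positive cost, as here, is the signature of backward inhibition: returning to a recently abandoned task is slower because that task's representation was inhibited.
    
    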

  16. Biological treatment of whey by Tetrahymena pyriformis and impact study on laboratory-scale wastewater lagoon process.

    PubMed

    Bonnet, J L; Bogaerts, P; Bohatier, J

    1999-06-01

    A procedure based on a biological treatment of whey was tested as part of research on waste treatment at the scale of small cheesemaking units. We studied the potential biodegradation of whey by a protozoan ciliate, Tetrahymena pyriformis, and evaluated the functional, microbiological and physiological disturbances caused by crude whey and the biodegraded whey in laboratory-scale pilots mimicking a natural lagoon treatment. The results show that T. pyriformis can strongly reduce the pollutant load of whey. In the lagoon pilots serving as examples of receptor media, crude whey gradually but completely arrested operation, whereas with the biodegraded whey the adverse effects were only temporary, and normal operation relative to a control was gradually recovered within a few days.

  17. Measuring listening effort: driving simulator vs. simple dual-task paradigm

    PubMed Central

    Wu, Yu-Hsiang; Aksan, Nazan; Rizzo, Matthew; Stangl, Elizabeth; Zhang, Xuyang; Bentler, Ruth

    2014-01-01

    Objectives The dual-task paradigm has been widely used to measure listening effort. The primary objectives of the study were to (1) investigate the effect of hearing aid amplification and a hearing aid directional technology on listening effort measured by a more complicated, real-world dual-task paradigm, and (2) compare the results obtained with this paradigm to a simpler laboratory-style dual-task paradigm. Design The listening effort of adults with hearing impairment was measured using two dual-task paradigms, wherein participants performed a speech recognition task simultaneously with either a driving task in a simulator or a visual reaction-time task in a sound-treated booth. The speech materials and road noises for the speech recognition task were recorded in a van traveling on the highway in three hearing aid conditions: unaided, aided with omnidirectional processing (OMNI), and aided with directional processing (DIR). The change in the driving task or the visual reaction-time task performance across the conditions quantified the change in listening effort. Results Compared to the driving-only condition, driving performance declined significantly with the addition of the speech recognition task. Although the speech recognition score was higher in the OMNI and DIR conditions than in the unaided condition, driving performance was similar across these three conditions, suggesting that listening effort was not affected by amplification and directional processing. Results from the simple dual-task paradigm showed a similar trend: hearing aid technologies improved speech recognition performance, but did not affect performance in the visual reaction-time task (i.e., reduce listening effort). The correlation between listening effort measured using the driving paradigm and the visual reaction-time task paradigm was significant. The finding showing that our older (56 to 85 years old) participants’ better speech recognition performance did not result in reduced

  18. Geometry Laboratory (GEOLAB) surface modeling and grid generation technology and services

    NASA Technical Reports Server (NTRS)

    Kerr, Patricia A.; Smith, Robert E.; Posenau, Mary-Anne K.

    1995-01-01

    The facilities and services of the GEOmetry LABoratory (GEOLAB) at the NASA Langley Research Center are described. Included in this description are the laboratory functions, the surface modeling and grid generation technologies used in the laboratory, and examples of the tasks performed in the laboratory.

  19. Modelling high Reynolds number wall–turbulence interactions in laboratory experiments using large-scale free-stream turbulence

    PubMed Central

    Dogan, Eda; Hearst, R. Jason

    2017-01-01

    A turbulent boundary layer subjected to free-stream turbulence is investigated in order to ascertain the scale interactions that dominate the near-wall region. The results are discussed in relation to a canonical high Reynolds number turbulent boundary layer because previous studies have reported considerable similarities between these two flows. Measurements were acquired simultaneously from four hot wires mounted to a rake which was traversed through the boundary layer. Particular focus is given to two main features of both canonical high Reynolds number boundary layers and boundary layers subjected to free-stream turbulence: (i) the footprint of the large scales in the logarithmic region on the near-wall small scales, specifically the modulating interaction between these scales, and (ii) the phase difference in amplitude modulation. The potential for a turbulent boundary layer subjected to free-stream turbulence to ‘simulate’ high Reynolds number wall–turbulence interactions is discussed. The results of this study have encouraging implications for future investigations of the fundamental scale interactions that take place in high Reynolds number flows, as they demonstrate that these interactions can be achieved at typical laboratory scales. This article is part of the themed issue ‘Toward the development of high-fidelity models of wall turbulence at large Reynolds number’. PMID:28167584

  20. Modelling high Reynolds number wall-turbulence interactions in laboratory experiments using large-scale free-stream turbulence.

    PubMed

    Dogan, Eda; Hearst, R Jason; Ganapathisubramani, Bharathram

    2017-03-13

    A turbulent boundary layer subjected to free-stream turbulence is investigated in order to ascertain the scale interactions that dominate the near-wall region. The results are discussed in relation to a canonical high Reynolds number turbulent boundary layer because previous studies have reported considerable similarities between these two flows. Measurements were acquired simultaneously from four hot wires mounted to a rake which was traversed through the boundary layer. Particular focus is given to two main features of both canonical high Reynolds number boundary layers and boundary layers subjected to free-stream turbulence: (i) the footprint of the large scales in the logarithmic region on the near-wall small scales, specifically the modulating interaction between these scales, and (ii) the phase difference in amplitude modulation. The potential for a turbulent boundary layer subjected to free-stream turbulence to 'simulate' high Reynolds number wall-turbulence interactions is discussed. The results of this study have encouraging implications for future investigations of the fundamental scale interactions that take place in high Reynolds number flows, as they demonstrate that these interactions can be achieved at typical laboratory scales. This article is part of the themed issue 'Toward the development of high-fidelity models of wall turbulence at large Reynolds number'. © 2017 The Author(s).

  1. Enhancing Lexical Knowledge through L2 Medium Tasks

    ERIC Educational Resources Information Center

    Vosoughi, Marjan

    2012-01-01

    The purpose of the present study was to investigate the L2 use rate in EFL classrooms through introducing three task conditions in learning lexical items. Data were collected from a group of freshman university students (male and female) studying in Islamic Azad university of Sabzevar, Iran (N = 73). Based on their performance on a Michigan TOEFL…

  2. Improvement of workload estimation techniques in piloting tasks

    NASA Technical Reports Server (NTRS)

    Wierwille, W. W.

    1984-01-01

    The Modified Cooper-Harper (MCH) scale has been shown to be a sensitive indicator of workload in several different types of aircrew tasks. The MCH scale, which is a 10 point scale, and five newly devised scales were examined in two different aircraft simulator experiments in which pilot loading was treated as an independent variable. The five scales included a 15 point scale, computerized versions of the MCH and 15 point scales, a scale in which the decision tree was removed, and one in which a 15 point left-to-right format was used. The results indicate that while one of the new scales may be more sensitive in a given experiment, task dependency is a problem. The MCH scale, on the other hand, exhibits consistent sensitivity and remains the scale recommended for general use. The MCH scale results are also consistent with those of earlier experiments. The results of the rating scale experiments are presented, and the questionnaire results, which were directed at obtaining a better understanding of the reasons for the relative sensitivity of the MCH scale and its variations, are described.

  3. Multicenter validation of a bedside antisaccade task as a measure of executive function

    PubMed Central

    Hellmuth, J.; Mirsky, J.; Heuer, H.W.; Matlin, A.; Jafari, A.; Garbutt, S.; Widmeyer, M.; Berhel, A.; Sinha, L.; Miller, B.L.; Kramer, J.H.

    2012-01-01

    Objective: To create and validate a simple, standardized version of the antisaccade (AS) task that requires no specialized equipment for use as a measure of executive function in multicenter clinical studies. Methods: The bedside AS (BAS) task consisted of 40 pseudorandomized AS trials presented on a laptop computer. BAS performance was compared with AS performance measured using an infrared eye tracker in normal elders (NE) and individuals with mild cognitive impairment (MCI) or dementia (n = 33). The neuropsychological domain specificity of the BAS was then determined in a cohort of NE, MCI, and dementia (n = 103) at UCSF, and the BAS was validated as a measure of executive function in a 6-center cohort (n = 397) of normal adults and patients with a variety of brain diseases. Results: Performance on the BAS and laboratory AS task was strongly correlated and BAS performance was most strongly associated with neuropsychological measures of executive function. Even after controlling for disease severity and processing speed, BAS performance was associated with multiple assessments of executive function, most strongly the informant-based Frontal Systems Behavior Scale. Conclusions: The BAS is a simple, valid measure of executive function in aging and neurologic disease. PMID:22573640

  4. Effects of dual tasks and dual-task training on postural stability: a systematic review and meta-analysis

    PubMed Central

    Ghai, Shashank; Ghai, Ishan; Effenberg, Alfred O

    2017-01-01

    The use of the dual-task training paradigm to enhance postural stability in patients with balance impairments is an emerging area of interest. The differential effects of dual tasks and dual-task training on postural stability still remain unclear. A systematic review and meta-analysis were conducted to analyze the effects of dual task and training application on static and dynamic postural stability among various population groups. Systematic identification of published literature was performed adhering to Preferred Reporting Items for Systematic Reviews and Meta-analysis (PRISMA) guidelines, from inception until June 2016, on the online databases Scopus, PEDro, MEDLINE, EMBASE, and SportDiscus. Experimental studies analyzing the effects of dual task and dual-task training on postural stability were extracted, critically appraised using the PEDro scale, and then summarized according to modified PEDro level of evidence. Of 1,284 records, 42 studies involving 1,480 participants met the review’s inclusion criteria. Of the studies evaluating the effects of dual-task training on postural stability, 87.5% of the studies reported significant enhancements, whereas 30% of the studies evaluating acute effects of dual tasks on posture reported significant enhancements, 50% reported significant decrements, and 20% reported no effects. Meta-analysis of the pooled studies revealed moderate but significant enhancements of dual-task training in elderly participants (95% CI: 1.16–2.10) and in patients suffering from chronic stroke (−0.22 to 0.86). The adverse effects of complexity of dual tasks on postural stability were also revealed among patients with multiple sclerosis (−0.74 to 0.05). The review also discusses the significance of verbalization in a dual-task setting for increasing cognitive–motor interference. Clinical implications are discussed with respect to practical applications in rehabilitation settings. PMID:28356727

  5. Effects of dual tasks and dual-task training on postural stability: a systematic review and meta-analysis.

    PubMed

    Ghai, Shashank; Ghai, Ishan; Effenberg, Alfred O

    2017-01-01

    The use of the dual-task training paradigm to enhance postural stability in patients with balance impairments is an emerging area of interest. The differential effects of dual tasks and dual-task training on postural stability still remain unclear. A systematic review and meta-analysis were conducted to analyze the effects of dual task and training application on static and dynamic postural stability among various population groups. Systematic identification of published literature was performed adhering to Preferred Reporting Items for Systematic Reviews and Meta-analysis (PRISMA) guidelines, from inception until June 2016, on the online databases Scopus, PEDro, MEDLINE, EMBASE, and SportDiscus. Experimental studies analyzing the effects of dual task and dual-task training on postural stability were extracted, critically appraised using the PEDro scale, and then summarized according to modified PEDro level of evidence. Of 1,284 records, 42 studies involving 1,480 participants met the review's inclusion criteria. Of the studies evaluating the effects of dual-task training on postural stability, 87.5% of the studies reported significant enhancements, whereas 30% of the studies evaluating acute effects of dual tasks on posture reported significant enhancements, 50% reported significant decrements, and 20% reported no effects. Meta-analysis of the pooled studies revealed moderate but significant enhancements of dual-task training in elderly participants (95% CI: 1.16-2.10) and in patients suffering from chronic stroke (-0.22 to 0.86). The adverse effects of complexity of dual tasks on postural stability were also revealed among patients with multiple sclerosis (-0.74 to 0.05). The review also discusses the significance of verbalization in a dual-task setting for increasing cognitive-motor interference. Clinical implications are discussed with respect to practical applications in rehabilitation settings.

  6. Slurry spray distribution within a simulated laboratory scale spray dryer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertone, P.C.

    1979-12-20

    It was found that the distribution of liquid striking the sides of a simulated room temperature spray dryer was not significantly altered by the choice of nozzles, nor by a variation in nozzle operating conditions. Instead, it was found to be a function of the spray dryer's configuration. A cocurrent flow of air down the drying cylinder, not possible with PNL's closed top, favorably altered the spray distribution by both decreasing the amount of liquid striking the interior of the cylinder from 72 to 26% of the feed supplied, and by shifting the zone of maximum impact from 1.0 to 1.7 feet from the nozzle. These findings led to the redesign of the laboratory scale spray dryer to be tested at the Savannah River Plant. The diameter of the drying chamber was increased from 5 to 8 inches, and a cocurrent flow of air was established with a closed recycle. Finally, this investigation suggested a drying scheme which offers all the advantages of spray drying without many of its limitations.

  7. The Influence of Complexity in Monologic versus Dialogic Tasks in Dutch L2

    ERIC Educational Resources Information Center

    Michel, Marije C.; Kuiken, Folkert; Vedder, Ineke

    2007-01-01

    This study puts the Cognition Hypothesis (Robinson 2005) to the test with respect to its predictions of the effects of changes in task complexity ([plus or minus] few elements) and task condition ([plus or minus] monologic) on L2 performance. 44 learners of Dutch performed both a simple and a complex oral task in either a monologic or a dialogic…

  8. Task complexity, student perceptions of vocabulary learning in EFL, and task performance.

    PubMed

    Wu, Xiaoli; Lowyck, Joost; Sercu, Lies; Elen, Jan

    2013-03-01

    The study deepened our understanding of how students' self-efficacy beliefs contribute to the context of teaching English as a foreign language, in the framework of the cognitive mediational paradigm, at a fine-tuned, task-specific level. The aim was to examine the relationship among task complexity, self-efficacy beliefs, domain-related prior knowledge, learning strategy use, and task performance as they were applied to English vocabulary learning from reading tasks. Participants were 120 second-year university students (mean age 21) from a Chinese university. This experiment had two conditions (simple/complex). A vocabulary level test was first conducted to measure participants' prior knowledge of English vocabulary. Participants were then randomly assigned to one of the learning tasks. Participants were administered task booklets together with the self-efficacy scales, measures of learning strategy use, and post-tests. Data obtained were submitted to multivariate analysis of variance (MANOVA) and path analysis. Results from the MANOVA model showed a significant effect of vocabulary level on self-efficacy beliefs, learning strategy use, and task performance. Task complexity showed no significant effect; however, an interaction effect between vocabulary level and task complexity emerged. Results from the path analysis showed self-efficacy beliefs had an indirect effect on performance. Our results highlighted the mediating role of self-efficacy beliefs and learning strategy use. Our findings indicate that students' prior knowledge plays a crucial role in both self-efficacy beliefs and task performance, and that the predictive power of self-efficacy on task performance may lie in its association with learning strategy use. © 2011 The British Psychological Society.

  9. Activity flow over resting-state networks shapes cognitive task activations.

    PubMed

    Cole, Michael W; Ito, Takuya; Bassett, Danielle S; Schultz, Douglas H

    2016-12-01

    Resting-state functional connectivity (FC) has helped reveal the intrinsic network organization of the human brain, yet its relevance to cognitive task activations has been unclear. Uncertainty remains despite evidence that resting-state FC patterns are highly similar to cognitive task activation patterns. Identifying the distributed processes that shape localized cognitive task activations may help reveal why resting-state FC is so strongly related to cognitive task activations. We found that estimating task-evoked activity flow (the spread of activation amplitudes) over resting-state FC networks allowed prediction of cognitive task activations in a large-scale neural network model. Applying this insight to empirical functional MRI data, we found that cognitive task activations can be predicted in held-out brain regions (and held-out individuals) via estimated activity flow over resting-state FC networks. This suggests that task-evoked activity flow over intrinsic networks is a large-scale mechanism explaining the relevance of resting-state FC to cognitive task activations.
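    The activity-flow estimate described in this record has a simple schematic form: a held-out region's task activation is predicted as the sum of all other regions' activations weighted by their resting-state FC with that region. A hedged sketch with random placeholder data (the array shapes, variable names, and FC values below are invented for illustration, not the authors' pipeline):

```python
import numpy as np

# Schematic activity-flow mapping: predicted activation of region j is
# sum_i FC[j, i] * activation[i] over all other regions i != j.
rng = np.random.default_rng(0)
n_regions = 8
fc = rng.normal(size=(n_regions, n_regions))  # placeholder resting-state FC
np.fill_diagonal(fc, 0.0)                     # exclude self-connections
activations = rng.normal(size=n_regions)      # placeholder task activations

predicted = np.array([
    fc[j] @ activations     # FC-weighted sum of the other regions' activity
    for j in range(n_regions)
])
```

In the empirical version described above, each region is held out in turn and the predicted value is compared with the observed activation; high prediction accuracy is what links resting-state FC to task activations.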

  10. Activity flow over resting-state networks shapes cognitive task activations

    PubMed Central

    Cole, Michael W.; Ito, Takuya; Bassett, Danielle S.; Schultz, Douglas H.

    2016-01-01

    Resting-state functional connectivity (FC) has helped reveal the intrinsic network organization of the human brain, yet its relevance to cognitive task activations has been unclear. Uncertainty remains despite evidence that resting-state FC patterns are highly similar to cognitive task activation patterns. Identifying the distributed processes that shape localized cognitive task activations may help reveal why resting-state FC is so strongly related to cognitive task activations. We found that estimating task-evoked activity flow (the spread of activation amplitudes) over resting-state FC networks allows prediction of cognitive task activations in a large-scale neural network model. Applying this insight to empirical functional MRI data, we found that cognitive task activations can be predicted in held-out brain regions (and held-out individuals) via estimated activity flow over resting-state FC networks. This suggests that task-evoked activity flow over intrinsic networks is a large-scale mechanism explaining the relevance of resting-state FC to cognitive task activations. PMID:27723746

  11. Cognitive task analysis: harmonizing tasks to human capacities.

    PubMed

    Neerincx, M A; Griffioen, E

    1996-04-01

    This paper presents the development of a cognitive task analysis that assesses the task load of jobs and provides indicators for the redesign of jobs. General principles of human task performance were selected and, subsequently, integrated into current task modelling techniques. The resulting cognitive task analysis centres around four aspects of task load: the number of actions in a period, the ratio between knowledge- and rule-based actions, lengthy uninterrupted actions, and momentary overloading. The method consists of three stages: (1) construction of a hierarchical task model, (2) a time-line analysis and task load assessment, and (3), if necessary, adjustment of the task model. An application of the cognitive task analysis in railway traffic control showed its benefits over the 'old' task load analysis of the Netherlands Railways. It provided a provisional standard for traffic control jobs, conveyed two load risks -- momentary overloading and underloading -- and resulted in proposals to satisfy the standard and to diminish the two load risks.
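    Two of the four load aspects named here, actions per period and momentary overloading, lend themselves to a small time-line illustration. The action timeline, function names, and thresholds below are hypothetical, not taken from the method itself:

```python
# Hypothetical time-line analysis sketch: each action is (start_s, duration_s).
actions = [
    (0, 10), (8, 5), (30, 20), (55, 4),
]

def actions_in_period(actions, t0, t1):
    """Count actions whose execution window overlaps the period [t0, t1)."""
    return sum(1 for s, d in actions if s < t1 and s + d > t0)

def momentary_overloads(actions):
    """Pairs of actions whose execution intervals overlap in time,
    a simple stand-in for the 'momentary overloading' indicator."""
    overlaps = []
    for i, (s1, d1) in enumerate(actions):
        for s2, d2 in actions[i + 1:]:
            if s1 < s2 + d2 and s2 < s1 + d1:
                overlaps.append(((s1, d1), (s2, d2)))
    return overlaps
```

In a full analysis these counts would be compared against a load standard per time period, flagging intervals where concurrent actions exceed operator capacity.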

  12. New paradigm for task switching strategies while performing multiple tasks: entropy and symbolic dynamics analysis of voluntary patterns.

    PubMed

    Guastello, Stephen J; Gorin, Hillary; Huschen, Samuel; Peters, Natalie E; Fabisch, Megan; Poston, Kirsten

    2012-10-01

    It has become well established in laboratory experiments that switching tasks, perhaps due to interruptions at work, incurs costs in response time to complete the next task. Conditions are also known that exaggerate or lessen the switching costs. Although switching costs can contribute to fatigue, task switching can also be an adaptive response to fatigue. The present study introduces a new research paradigm for studying the emergence of voluntary task switching regimes, self-organizing processes therein, and the possibly conflicting roles of switching costs and minimum entropy. Fifty-four undergraduates performed 7 different computer-based cognitive tasks producing sets of 49 responses under instructional conditions requiring task quotas or no quotas. The sequences of task choices were analyzed using orbital decomposition to extract pattern types and lengths, which were then classified and compared with regard to Shannon entropy, topological entropy, number of task switches involved, and overall performance. Results indicated that similar but different patterns were generated under the two instructional conditions, and better performance was associated with lower topological entropy. Both entropy metrics were associated with the amount of voluntary task switching. Future research should explore conditions affecting the trade-off between switching costs and entropy, levels of automaticity between task elements, and the role of voluntary switching regimes on fatigue.
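    The entropy metrics applied to task-choice sequences can be illustrated schematically. The sketch below computes the Shannon entropy of task-choice frequencies and counts voluntary switches for an invented sequence; it is a simplified stand-in, not the orbital decomposition procedure used in the study:

```python
import math
from collections import Counter

def shannon_entropy(seq):
    """Shannon entropy (bits) of the symbol frequencies in a sequence."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def num_switches(seq):
    """Number of adjacent trials where the chosen task changes."""
    return sum(1 for a, b in zip(seq, seq[1:]) if a != b)

# Invented task-choice sequence for illustration.
seq = ["A", "A", "B", "C", "B", "A", "C"]
H = shannon_entropy(seq)
switches = num_switches(seq)
```

More switching tends to spread choices across tasks and raise entropy, which is the trade-off against switching costs that the study examines.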

  13. Oxy-acetylene driven laboratory scale shock tubes for studying blast wave effects

    NASA Astrophysics Data System (ADS)

    Courtney, Amy C.; Andrusiv, Lubov P.; Courtney, Michael W.

    2012-04-01

    This paper describes the development and characterization of modular, oxy-acetylene driven laboratory scale shock tubes. Such tools are needed to produce realistic blast waves in a laboratory setting. The pressure-time profiles measured at 1 MHz using high-speed piezoelectric pressure sensors have relevant durations and show a true shock front and exponential decay characteristic of free-field blast waves. Descriptions are included for shock tube diameters of 27-79 mm. A range of peak pressures from 204 kPa to 1187 kPa (with 0.5-5.6% standard error of the mean) were produced by selection of the driver section diameter and distance from the shock tube opening. The peak pressures varied predictably with distance from the shock tube opening while maintaining both a true blast wave profile and relevant pulse duration for distances up to about one diameter from the shock tube opening. This shock tube design provides a more realistic blast profile than current compression-driven shock tubes, and it does not have a large jet effect. In addition, operation does not require specialized personnel or facilities like most blast-driven shock tubes, which reduces operating costs and effort and permits greater throughput and accessibility. It is expected to be useful in assessing the response of various sensors to shock wave loading; assessing the reflection, transmission, and absorption properties of candidate armor materials; assessing material properties at high rates of loading; assessing the response of biological materials to shock wave exposure; and providing a means to validate numerical models of the interaction of shock waves with structures. All of these activities have been difficult to pursue in a laboratory setting due in part to lack of appropriate means to produce a realistic blast loading profile.

  14. ESTABLISHMENT OF AN ENVIRONMENTAL CONTROL TECHNOLOGY LABORATORY WITH A CIRCULATING FLUIDIZED-BED COMBUSTION SYSTEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei-Ping Pan; Andy Wu; John T. Riley

    This report presents the progress made on the project ''Establishment of an Environmental Control Technology Laboratory (ECTL) with a Circulating Fluidized-Bed Combustion (CFBC) System'' during the period October 1, 2004 through December 31, 2004. The following tasks have been completed. First, the renovation of the new Combustion Laboratory and the construction of the Circulating Fluidized-Bed (CFB) Combustor Building have proceeded well. Second, the detailed design of supporting and hanging structures for the CFBC was completed. Third, the laboratory-scale simulated fluidized-bed facility was modified after completing a series of pretests. The two problems identified during the pretest were solved. Fourth, the carbonization of chicken waste and coal was investigated in a tube furnace and a Thermogravimetric Analyzer (TGA). The experimental results from this study are presented in this report. Finally, the proposed work for the next quarter is outlined in this report.

  15. Stationary Engineering Laboratory--2. Teacher's Guide.

    ERIC Educational Resources Information Center

    Steingress, Frederick M.; Frost, Harold J.

    The Stationary Engineering Laboratory Manual 2 Teacher's Guide was designed as an aid to the instructors of vocational-technical high school students who have received instruction in the basics of stationary engineering. The course of study was developed for students who will be operating a live plant and who will be responsible for supplying…

  16. Experimental and operational modal analysis of a laboratory scale model of a tripod support structure.

    NASA Astrophysics Data System (ADS)

    Luczak, M. M.; Mucchi, E.; Telega, J.

    2016-09-01

    The goal of the research is to develop a vibration-based procedure for the identification of structural failures in a laboratory scale model of a tripod supporting structure of an offshore wind turbine. In particular, this paper presents an experimental campaign on the scale model tested in two stages. Stage one encompassed the model tripod structure tested in air. The second stage was done in water. The tripod model structure allows investigation of the propagation of a representative circumferential crack of a cylindrical upper brace. The in-water test configuration included the tower with a three-bladed rotor. The response of the structure to the different wave loads was measured with accelerometers. Experimental and operational modal analysis was applied to identify the dynamic properties of the investigated scale model in the intact and damaged states with different excitations and wave patterns. A comprehensive test matrix allows assessment of the differences in estimated modal parameters due to damage or as potentially introduced by nonlinear structural response. The presented technique proves effective for detecting and assessing the presence of representative cracks.

  17. Tethered gravity laboratories study

    NASA Technical Reports Server (NTRS)

    Lucchetti, F.

    1990-01-01

    The scope of the study is to investigate ways of controlling the microgravity environment of the International Space Station by means of a tethered system. Four main study tasks were performed. First, researchers analyzed the utilization of tether systems to improve the lowest possible steady gravity level on the Space Station and the tether capability to actively control the center of gravity position in order to compensate for activities that would upset the mass distribution of the Station. The purpose of the second task was to evaluate the full range of experiments performable in a variable gravity environment and the residual acceleration levels beneficial to them, for both pure and applied research in the fields of fluid, materials, and life science, so as to assess the relevance of a variable g-level laboratory. The third task involved the Tethered Variable Gravity Laboratory. The use of a facility that would crawl along a deployed tether and expose experiments to varying intensities of reduced gravity is discussed. Last, a study performed on the Attitude Tether Stabilizer concept is discussed. The stabilization effect of ballast masses tethered to the Space Station was investigated as a means of assisting the attitude control system of the Station.

  18. The Deese-Roediger-McDermott (DRM) Task: A Simple Cognitive Paradigm to Investigate False Memories in the Laboratory.

    PubMed

    Pardilla-Delgado, Enmanuelle; Payne, Jessica D

    2017-01-31

    The Deese, Roediger and McDermott (DRM) task is a false memory paradigm in which subjects are presented with lists of semantically related words (e.g., nurse, hospital, etc.) at encoding. After a delay, subjects are asked to recall or recognize these words. In the recognition memory version of the task, subjects are asked whether they remember previously presented words, as well as related (but never presented) critical lure words ('doctor'). Typically, the critical word is recognized with high probability and confidence. This false memory effect has been robustly demonstrated across short (e.g., immediate, 20 min) and long (e.g., 1, 7, 60 d) delays between encoding and memory testing. A strength of using this task to study false memory is its simplicity and short duration. If encoding and retrieval components of the task occur in the same session, the entire task can take as little as 2 - 30 min. However, although the DRM task is widely considered a 'false memory' paradigm, some researchers consider DRM illusions to be based on the activation of semantic memory networks in the brain, and argue that such semantic gist-based false memory errors may actually be useful in some scenarios (e.g., remembering the forest for the trees; remembering that a word list was about "doctors", even though the actual word "doctor" was never presented for study). Remembering the gist of experience (instead of or along with individual details) is arguably an adaptive process and this task has provided a great deal of knowledge about the constructive, adaptive nature of memory. Therefore, researchers should use caution when discussing the overall reach and implications of their experiments when using this task to study 'false memory', as DRM memory errors may not adequately reflect false memories in the real world, such as false memory in eyewitness testimony, or false memories of sexual abuse.

  19. Estimation of waste component-specific landfill decay rates using laboratory-scale decomposition data.

    PubMed

    De la Cruz, Florentino B; Barlaz, Morton A

    2010-06-15

    The current methane generation model used by the U.S. EPA (Landfill Gas Emissions Model) treats municipal solid waste (MSW) as a homogeneous waste with one decay rate. However, component-specific decay rates are required to evaluate the effects of changes in waste composition on methane generation. Laboratory-scale rate constants, k_lab, for the major biodegradable MSW components were used to derive field-scale decay rates (k_field) for each waste component using the assumption that the average of the field-scale decay rates for each waste component, weighted by its composition, is equal to the bulk MSW decay rate. For an assumed bulk MSW decay rate of 0.04 yr^-1, k_field was estimated to be 0.298, 0.171, 0.015, 0.144, 0.033, 0.02, 0.122, and 0.029 yr^-1 for grass, leaves, branches, food waste, newsprint, corrugated containers, coated paper, and office paper, respectively. The effect of landfill waste diversion programs on methane production was explored to illustrate the use of component-specific decay rates. One hundred percent diversion of yard waste and food waste reduced the year 20 methane production rate by 45%. When a landfill gas collection schedule was introduced, collectable methane was most influenced by food waste diversion at years 10 and 20 and paper diversion at year 40.
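The derivation described above constrains the component field-scale rates so that their composition-weighted average equals the bulk MSW rate, and the components then feed a first-order generation model. A minimal sketch using the reported k_field values but hypothetical composition fractions (the study's actual waste composition is not given in this abstract):

```python
import math

# Field-scale decay rates (yr^-1) reported in the abstract.
k_field = {
    "grass": 0.298, "leaves": 0.171, "branches": 0.015,
    "food_waste": 0.144, "newsprint": 0.033,
    "corrugated": 0.020, "coated_paper": 0.122, "office_paper": 0.029,
}

# Hypothetical mass fractions (illustrative only; they sum to 1 but are
# not the composition used in the study).
frac = {
    "grass": 0.05, "leaves": 0.05, "branches": 0.05,
    "food_waste": 0.15, "newsprint": 0.10,
    "corrugated": 0.40, "coated_paper": 0.05, "office_paper": 0.15,
}

# Composition-weighted bulk decay rate: the constraint used to derive k_field.
k_bulk = sum(frac[c] * k_field[c] for c in k_field)

# First-order methane generation rate at year t, per unit methane potential,
# summed over components: sum_i f_i * k_i * exp(-k_i * t).
def generation_rate(t):
    return sum(frac[c] * k_field[c] * math.exp(-k_field[c] * t) for c in k_field)

print(k_bulk, generation_rate(20))
```

Diverting a component (setting its fraction to zero and renormalizing) directly changes both the bulk rate and the year-20 generation rate, which is the kind of scenario the abstract explores.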

  20. M2 .50-cal. Multishot Capabilities at the US Army Research Laboratory

    DTIC Science & Technology

    2015-09-01

    ARL-TN-0694 ● SEP 2015. US Army Research Laboratory. A platform was designed to simulate the 2-round burst of the M2 .50-caliber Browning machine gun (BMG). The 2-round-burst cyclic rate was achieved...

  1. Safety Precautions and Operating Procedures in an (A)BSL-4 Laboratory: 1. Biosafety Level 4 Suit Laboratory Suite Entry and Exit Procedures

    PubMed Central

    Janosko, Krisztina; Holbrook, Michael R.; Adams, Ricky; Barr, Jason; Bollinger, Laura; Newton, Je T'aime; Ntiforo, Corrie; Coe, Linda; Wada, Jiro; Pusl, Daniela; Jahrling, Peter B.; Kuhn, Jens H.; Lackemeyer, Matthew G.

    2016-01-01

    Biosafety level 4 (BSL-4) suit laboratories are specifically designed to study high-consequence pathogens for which neither infection prophylaxes nor treatment options exist. The hallmarks of these laboratories are: custom-designed airtight doors, dedicated supply and exhaust airflow systems, a negative-pressure environment, and mandatory use of positive-pressure (“space”) suits. The risk for laboratory specialists working with highly pathogenic agents is minimized through rigorous training and adherence to stringent safety protocols and standard operating procedures. Researchers perform the majority of their work in BSL-2 laboratories and switch to BSL-4 suit laboratories when work with a high-consequence pathogen is required. Collaborators and scientists considering BSL-4 projects should be aware of the challenges associated with BSL-4 research both in terms of experimental technical limitations in BSL-4 laboratory space and the increased duration of such experiments. Tasks such as entering and exiting the BSL-4 suit laboratories are considerably more complex and time-consuming compared to BSL-2 and BSL-3 laboratories. The focus of this particular article is to address basic biosafety concerns and describe the entrance and exit procedures for the BSL-4 laboratory at the NIH/NIAID Integrated Research Facility at Fort Detrick. Such procedures include checking external systems that support the BSL-4 laboratory, and inspecting and donning positive-pressure suits, entering the laboratory, moving through air pressure-resistant doors, and connecting to air-supply hoses. We will also discuss moving within and exiting the BSL-4 suit laboratories, including using the chemical shower and removing and storing positive-pressure suits. PMID:27768063

  3. Task Complexity, Language-Related Episodes, and Production of L2 Spanish Vowels

    ERIC Educational Resources Information Center

    Solon, Megan; Long, Avizia Y.; Gurzynski-Weiss, Laura

    2017-01-01

    This study tests the theoretical predictions regarding effects of increasing task complexity (Robinson, 2001a, 2001b, 2007, 2010; Robinson & Gilabert, 2007) for second language (L2) pronunciation. Specifically, we examine whether more complex tasks (a) lead to greater incidence of pronunciation-focused language-related episodes (LREs) and (b)…

  4. Subjective rating scales as a workload

    NASA Technical Reports Server (NTRS)

    Bird, K. L.

    1981-01-01

    A multidimensional bipolar-adjective rating scale is employed as a subjective measure of operator workload in the performance of a one-axis tracking task. The rating scale addressed several dimensions of workload, including cognitive, physical, and perceptual task loading as well as fatigue and stress effects. Eight subjects performed a one-axis tracking task (with six levels of difficulty) and rated these tasks on several workload dimensions. Performance measures were root-mean-square (RMS) tracking error and the standard deviation of control stick output. Significant relationships were observed between these performance measures and skill required, task complexity, attention level, task difficulty, task demands, and stress level.
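The RMS performance measure used above can be sketched directly; the error samples below are hypothetical:

```python
import math

# Root-mean-square (RMS) tracking error, one of the performance measures
# described above, computed from a hypothetical series of error samples.
def rms(errors):
    return math.sqrt(sum(e * e for e in errors) / len(errors))

print(rms([2.0, -2.0, 2.0, -2.0]))  # 2.0
```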

  5. Infrared Imagery of Shuttle (IRIS). Task 2, summary report

    NASA Technical Reports Server (NTRS)

    Chocol, C. J.

    1978-01-01

    End-to-end tests of a 16-element indium antimonide sensor array and 10 channels of associated electronic signal processing were completed. Quantitative data were gathered on system responsivity, frequency response, noise, stray capacitance effects, and sensor paralleling. These tests verify that the temperature accuracies predicted in the Task 1 study can be obtained with a very carefully designed electro-optical flight system. High-quality preflight and in-flight calibration is mandatory to obtain these accuracies. Also, optical crosstalk in the array-dewar assembly must be carefully eliminated by its design. Tests of the scaled-up tracking system reticle also demonstrate that the predicted tracking system accuracies can be met in the flight system. In addition, improvements in the reticle pattern and electronics are possible, which will reduce the complexity of the flight system and increase tracking accuracy.

  6. Resolving task rule incongruence during task switching by competitor rule suppression.

    PubMed

    Meiran, Nachshon; Hsieh, Shulan; Dimov, Eduard

    2010-07-01

    Task switching requires maintaining readiness to execute any task of a given set of tasks. However, when tasks switch, the readiness to execute the now-irrelevant task generates interference, as seen in the task rule incongruence effect. Overcoming such interference requires fine-tuned inhibition that impairs task readiness only minimally. In an experiment involving 2 object classification tasks and 2 location classification tasks, the authors show that irrelevant task rules that generate response conflicts are inhibited. This competitor rule suppression (CRS) is seen in response slowing in subsequent trials, when the competing rules become relevant. CRS is shown to operate on specific rules without affecting similar rules. CRS and backward inhibition, which is another inhibitory phenomenon, produced additive effects on reaction time, suggesting their mutual independence. Implications for current formal theories of task switching as well as for conflict monitoring theories are discussed. (c) 2010 APA, all rights reserved

  7. Defining a roadmap for harmonizing quality indicators in Laboratory Medicine: a consensus statement on behalf of the IFCC Working Group "Laboratory Error and Patient Safety" and EFLM Task and Finish Group "Performance specifications for the extra-analytical phases".

    PubMed

    Sciacovelli, Laura; Panteghini, Mauro; Lippi, Giuseppe; Sumarac, Zorica; Cadamuro, Janne; Galoro, César Alex De Olivera; Pino Castro, Isabel Garcia Del; Shcolnik, Wilson; Plebani, Mario

    2017-08-28

    The improving quality of laboratory testing requires a deep understanding of the many vulnerable steps involved in the total examination process (TEP), along with the identification of a hierarchy of risks and challenges that need to be addressed. From this perspective, the Working Group "Laboratory Errors and Patient Safety" (WG-LEPS) of International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) is focusing its activity on implementation of an efficient tool for obtaining meaningful information on the risk of errors developing throughout the TEP, and for establishing reliable information about error frequencies and their distribution. More recently, the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) has created the Task and Finish Group "Performance specifications for the extra-analytical phases" (TFG-PSEP) for defining performance specifications for extra-analytical phases. Both the IFCC and EFLM groups are working to provide laboratories with a system to evaluate their performances and recognize the critical aspects where improvement actions are needed. A Consensus Conference was organized in Padova, Italy, in 2016 in order to bring together all the experts and interested parties to achieve a consensus for effective harmonization of quality indicators (QIs). A general agreement was achieved and the main outcomes have been the release of a new version of model of quality indicators (MQI), the approval of a criterion for establishing performance specifications and the definition of the type of information that should be provided within the report to the clinical laboratories participating to the QIs project.

  8. Planetary image conversion task

    NASA Technical Reports Server (NTRS)

    Martin, M. D.; Stanley, C. L.; Laughlin, G.

    1985-01-01

    The Planetary Image Conversion Task group processed 12,500 magnetic tapes containing raw imaging data from JPL planetary missions and produced an image data base in consistent format on 1200 fully packed 6250-bpi tapes. The output tapes will remain at JPL. A copy of the entire tape set was delivered to the US Geological Survey, Flagstaff, Ariz. A secondary task converted computer datalogs, which had been stored in project-specific MARK IV File Management System data types and structures, to flat-file, text format that is processable on any modern computer system. The conversion processing took place at JPL's Image Processing Laboratory on an IBM 370-158 with existing software modified slightly to meet the needs of the conversion task. More than 99% of the original digital image data was successfully recovered by the conversion task. However, processing data tapes recorded before 1975 was destructive. This discovery is of critical importance to facilities responsible for maintaining digital archives since normal periodic random sampling techniques would be unlikely to detect this phenomenon, and entire data sets could be wiped out in the act of generating seemingly positive sampling results. Recommended follow-on activities are also included.

  9. Technical Basis of Scaling Relationships for the Pretreatment Engineering Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuhn, William L.; Arm, Stuart T.; Huckaby, James L.

    Pacific Northwest National Laboratory has been tasked by Bechtel National Inc. (BNI) on the River Protection Project-Waste Treatment Plant (RPP-WTP) project to perform research and development activities. The Pretreatment Engineering Platform (PEP) is being designed and constructed as part of a plan to respond to an issue raised by the WTP External Flowsheet Review Team (EFRT) entitled “Undemonstrated Leaching Processes” and numbered M12. The PEP replicates the WTP leaching process using prototypic equipment and control strategies. The approach for scaling PEP performance data to predict WTP performance is critical to the successful resolution of the EFRT issue. This report describes the recommended PEP scaling approach and PEP data interpretation, and provides recommendations on test conduct and data requirements.

  10. TASK-2 K2P K+ channel: thoughts about gating and its fitness to physiological function.

    PubMed

    López-Cayuqueo, Karen I; Peña-Münzenmayer, Gaspar; Niemeyer, María Isabel; Sepúlveda, Francisco V; Cid, L Pablo

    2015-05-01

    TASK-2 (K2P5) was one of the earliest members of the K2P two-pore, four-transmembrane-domain K+ channels to be identified. TASK-2 gating is controlled by changes in both extra- and intracellular pH through separate sensors: arginine 224 and lysine 245, located at the extra- and intracellular ends of transmembrane domain 4. TASK-2 is inhibited by a direct effect of CO2 and is regulated by and interacts with G protein subunits. TASK-2 takes part in regulatory adjustments and is a mediator in the chemoreception process in neurons of the retrotrapezoid nucleus, where its pHi sensitivity could be important in regulating excitability and therefore signalling of the O2/CO2 status. Extracellular pH increases brought about by HCO3- efflux from proximal tubule epithelial cells have been proposed to couple to TASK-2 activation to maintain electrochemical gradients favourable to HCO3- reabsorption. We demonstrate that, as suspected previously, TASK-2 is expressed at the basolateral membrane of the same proximal tubule cells that express the apical membrane Na+-H+ exchanger NHE-3 and the basolateral membrane Na+-HCO3- cotransporter NBCe1-A, the main components of the HCO3- transport machinery. We also discuss critically the mechanism by which TASK-2 is modulated and impacts the process of HCO3- reclaim by the proximal tubule epithelium, concluding that more than a mere shift in extracellular pH is probably involved.

  11. Laboratory Measurements of Cometary Photochemical Phenomena.

    DTIC Science & Technology

    1981-12-04

    William M. Jackson, Laser Chemistry Division, Department of Chemistry, Howard University, Washington, DC 20059. Published by the Jet Propulsion Laboratory. Abstract: Laboratory experiments are described...

  12. Laboratory Modelling of Volcano Plumbing Systems: a review

    NASA Astrophysics Data System (ADS)

    Galland, Olivier; Holohan, Eoghan P.; van Wyk de Vries, Benjamin; Burchardt, Steffi

    2015-04-01

    Earth scientists have, since the 19th century, tried to replicate or model geological processes in controlled laboratory experiments. In particular, laboratory modelling has been used to study the development of volcanic plumbing systems, which sets the stage for volcanic eruptions. Volcanic plumbing systems involve complex processes that act at length scales of microns to thousands of kilometres and at time scales from milliseconds to billions of years, and laboratory models appear very suitable to address them. This contribution reviews laboratory models dedicated to studying the dynamics of volcano plumbing systems (Galland et al., accepted). The foundation of laboratory models is the choice of relevant model materials, both for rock and magma. We outline a broad range of suitable model materials used in the literature. These materials exhibit very diverse rheological behaviours, so their careful choice is a crucial first step in proper experiment design. The second step is model scaling, which successively calls upon (1) the principle of dimensional analysis and (2) the principle of similarity. The dimensional analysis aims to identify the dimensionless physical parameters that govern the underlying processes. The principle of similarity states that "a laboratory model is equivalent to its geological analogue if the dimensionless parameters identified in the dimensional analysis are identical, even if the values of the governing dimensional parameters differ greatly" (Barenblatt, 2003). The application of these two steps ensures a solid understanding and geological relevance of the laboratory models. In addition, this procedure shows that laboratory models are not designed to exactly mimic a given geological system, but to understand underlying generic processes, either individually or in combination, and to identify or demonstrate physical laws that govern these processes. From this perspective, we review the numerous applications of laboratory models to
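The two-step scaling procedure can be sketched numerically: choose a governing dimensionless number from the dimensional analysis, then require it to match between model and nature. The particular stress ratio, material values, and tolerance below are illustrative assumptions, not figures from the review:

```python
# Similarity check sketch: a laboratory model is dynamically similar to its
# geological analogue when the governing dimensionless numbers match, even
# though the dimensional parameters differ by many orders of magnitude.

def pi_gravity_stress(rho, g, length, cohesion):
    """Ratio of gravitational stress (rho * g * length) to material cohesion."""
    return rho * g * length / cohesion

# Nature: ~1 km of crustal rock (rho ~2700 kg/m^3, cohesion ~20 MPa); assumed values.
nature = pi_gravity_stress(rho=2700.0, g=9.81, length=1000.0, cohesion=20e6)

# Laboratory: ~10 cm of cohesive sand (rho ~1400 kg/m^3, cohesion ~1 kPa); assumed values.
lab = pi_gravity_stress(rho=1400.0, g=9.81, length=0.1, cohesion=1000.0)

# Similar if the dimensionless numbers agree within a chosen tolerance.
similar = abs(nature - lab) / nature < 0.5
print(nature, lab, similar)
```

Here the dimensional parameters differ by factors of up to 10^4, yet the dimensionless numbers nearly coincide, which is exactly what the similarity principle requires.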

  13. Clinical laboratory accreditation in India.

    PubMed

    Handoo, Anil; Sood, Swaroop Krishan

    2012-06-01

    Test results from clinical laboratories must ensure accuracy, as these are crucial in several areas of health care. It is necessary that the laboratory implements quality assurance to achieve this goal. The implementation of quality should be audited by independent bodies, referred to as accreditation bodies. Accreditation is a third-party attestation by an authoritative body, which certifies that the applicant laboratory meets the quality requirements of the accreditation body and has demonstrated its competence to carry out specific tasks. Although in most countries accreditation is mandatory, in India it is voluntary. The quality requirements are described in standards developed by many accreditation organizations. The internationally acceptable standard for clinical laboratories is ISO 15189, which is based on ISO/IEC standard 17025. The accreditation body in India is the National Accreditation Board for Testing and Calibration Laboratories, which has signed a Mutual Recognition Agreement with the regional cooperation, the Asia Pacific Laboratory Accreditation Cooperation, and with the apex cooperation, the International Laboratory Accreditation Cooperation.

  14. Risky Decision Making Assessed With the Gambling Task in Adults with HIV

    PubMed Central

    Hardy, David J.; Hinkin, Charles H.; Castellon, Steven A.; Levine, Andrew J.; Lam, Mona N.

    2010-01-01

    Decision making was assessed using a laboratory gambling task in 67 adults with the Human Immunodeficiency Virus (HIV+) and in 19 HIV-seronegative (HIV−) control participants. Neurocognitive test performance across several domains was also analyzed to examine potential cognitive mechanisms of gambling task performance. As predicted, the HIV+ group performed worse on the gambling task, indicating greater risky decision making. Specifically, the HIV+ group selected more cards from the “risky” or disadvantageous deck that included relatively large payoffs but infrequent large penalties. The control group also selected such risky cards but quickly learned to avoid them. Exploratory analyses also indicated that in the HIV+ group, but not in the control group, gambling task performance was correlated with Stroop Interference performance and long delay free recall on the California Verbal Learning Test, suggesting the role of inhibitory processes and verbal memory in the poorer gambling task performance in HIV. These findings indicate the usefulness of the gambling task as a laboratory tool to examine risky decision making and cognition in the HIV population. PMID:16719628
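A common summary score for gambling tasks of this kind is a net score: advantageous-deck picks minus risky (disadvantageous) picks. The abstract does not state the study's exact metric, so both the formula and the counts below are assumptions in the style of standard gambling-task scoring:

```python
# Hypothetical net-score sketch for a 100-trial gambling task.
# Counts are illustrative, not data from the study.
picks = {"advantageous": 45, "risky": 55}
net_score = picks["advantageous"] - picks["risky"]

# Negative values indicate riskier decision making, the pattern reported
# for the HIV+ group relative to controls.
print(net_score)  # -10
```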

  15. The influence of risk perception on biosafety level-2 laboratory workers' hand-to-face contact behaviors.

    PubMed

    Johnston, James D; Eggett, Dennis; Johnson, Michele J; Reading, James C

    2014-01-01

    Pathogen transmission in the laboratory is thought to occur primarily through inhalation of infectious aerosols or by direct contact with mucous membranes on the face. While significant research has focused on controlling inhalation exposures, little has been written about hand contamination and subsequent hand-to-face contact (HFC) transmission. HFC may present a significant risk to workers in biosafety level-2 (BSL-2) laboratories where there is typically no barrier between the workers' hands and face. The purpose of this study was to measure the frequency and location of HFC among BSL-2 workers, and to identify psychosocial factors that influence the behavior. Research workers (N = 93) from 21 BSL-2 laboratories consented to participate in the study. Two study personnel measured workers' HFC behaviors by direct observation during activities related to cell culture maintenance, cell infection, virus harvesting, reagent and media preparation, and tissue processing. Following observations, a survey measuring 11 psychosocial predictors of HFC was administered to participants. Study personnel recorded 396 touches to the face over the course of the study (mean = 2.6 HFCs/hr). Of the 93 subjects, 67 (72%) touched their face at least once, ranging from 0.2-16.0 HFCs/hr. Among those who touched their face, contact with the nose was most common (44.9%), followed by contact with the forehead (36.9%), cheek/chin (12.5%), mouth (4.0%), and eye (1.7%). HFC rates were significantly different across laboratories F(20, 72) = 1.85, p = 0.03. Perceived severity of infection predicted lower rates of HFC (p = 0.03). For every one-point increase in the severity scale, workers had 0.41 fewer HFCs/hr (r = -.27, P < 0.05). This study suggests HFC is common among BSL-2 laboratory workers, but largely overlooked as a major route of exposure. 
Workers' risk perceptions had a modest impact on their HFC behaviors, but other factors not considered in this study, including social modeling and
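The observation figures quoted above can be cross-checked with simple arithmetic; the person-hours value below is derived from the reported numbers, not stated in the abstract:

```python
# Back-of-envelope consistency check: total recorded touches divided by the
# mean hourly rate approximates the person-hours of observation.
touches = 396            # total hand-to-face contacts recorded
mean_rate_per_hr = 2.6   # reported mean HFCs/hr
person_hours = touches / mean_rate_per_hr
print(round(person_hours))  # 152
```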

  16. Source Code Analysis Laboratory (SCALe) for Energy Delivery Systems

    DTIC Science & Technology

    2010-12-01

    the software for reevaluation. Once the reevaluation process is completed, CERT provides the client a report detailing the software's conformance...True positives (TP) versus flagged nonconformities (FNC), by software system (TP/FNC ratio): Mozilla Firefox version 2.0, 6/12 (50%); Linux kernel version 2.6.15, 10/126 (8%); Wine...inappropriately tuned for analysis of the Linux kernel, which has anomalous results. Customizing SCALe to work with energy system software will help

  17. Competency Assessment of Microbiology Medical Laboratory Technologists in Ontario, Canada

    PubMed Central

    Fleming, Christine Ann

    2014-01-01

    Accreditation in Ontario, Canada, requires that licensed clinical laboratories participate in external quality assessment (also known as proficiency testing) and perform competency evaluation of their staff. To assess the extent of ongoing competency assessment practices, the Quality Management Program—Laboratory Services (QMP-LS) Microbiology Committee surveyed all 112 licensed Ontario microbiology laboratories. The questionnaire consisted of a total of 21 questions that included yes/no, multiple-choice, and short-answer formats. Participants were asked to provide information about existing programs, the frequency of testing, what areas are evaluated, and how results are communicated to the staff. Of the 111 responding laboratories, 6 indicated they did not have a formal evaluation program since they perform only limited bacteriology testing. Of the remaining 105 respondents, 87% perform evaluations at least annually or every 2 years, and 61% include any test or task performed, whereas 16% and 10% focus only on problem areas and high-volume complex tasks, respectively. The most common methods of evaluation were review of external quality assessment (EQA) challenges, direct observation, and worksheet review. With the exception of one participant, all communicate results to staff, and most take remedial action to correct the deficiencies. Although most accredited laboratories have a program to assess the ongoing competency of their staff, the methods used are not standardized or consistently applied, indicating that there is room for improvement. The survey successfully highlighted potential areas for improvement and allowed the QMP-LS Microbiology Committee to provide guidance to Ontario laboratories for establishing or improving existing microbiology-specific competency assessment programs. PMID:24899030

  18. Inactivation of Bacillus anthracis Spores during Laboratory-Scale Composting of Feedlot Cattle Manure

    PubMed Central

    Xu, Shanwei; Harvey, Amanda; Barbieri, Ruth; Reuter, Tim; Stanford, Kim; Amoako, Kingsley K.; Selinger, Leonard B.; McAllister, Tim A.

    2016-01-01

    Anthrax outbreaks in livestock have social, economic and health implications, altering farmers' livelihoods, impacting trade and posing a zoonotic risk. Our study investigated the survival of Bacillus thuringiensis and B. anthracis spores sporulated at 15, 21, or 37°C, over 33 days of composting. Spores (∼7.5 log10 CFU g-1) were mixed with manure and composted in laboratory-scale composters. After 15 days, the compost was mixed and returned to the composter for a second cycle. Temperatures peaked at 71°C on day 2 and remained ≥55°C for an average of 7 days in the first cycle, but did not exceed 55°C in the second. For B. thuringiensis, spores generated at 15 and 21°C exhibited reduced (P < 0.05) viability of 2.7 and 2.6 log10 CFU g-1 respectively, as compared to a 0.6 log10 CFU g-1 reduction for those generated at 37°C. For B. anthracis, sporulation temperature did not impact spore survival as there was a 2.5, 2.2, and 2.8 log10 CFU g-1 reduction after composting for spores generated at 15, 21, and 37°C, respectively. For both species, spore viability declined more rapidly (P < 0.05) in the first as compared to the second composting cycle. Our findings suggest that the duration of thermophilic exposure (≥55°C) is the main factor influencing survival of B. anthracis spores in compost. As sporulation temperature did not influence survival of B. anthracis, composting may lower the viability of spores associated with carcasses infected with B. anthracis over a range of sporulation temperatures. PMID:27303388
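The log10 reductions quoted above follow from simple titre arithmetic. The starting titre matches the abstract (~7.5 log10 CFU/g), while the final titre below is a hypothetical value chosen to illustrate a 2.7-log reduction:

```python
import math

# Log10 reduction in spore viability from initial and final plate counts.
def log10_reduction(initial_cfu, final_cfu):
    return math.log10(initial_cfu) - math.log10(final_cfu)

# ~7.5 log10 CFU/g starting titre (as in the study); final titre is illustrative.
reduction = log10_reduction(10 ** 7.5, 10 ** 4.8)
print(round(reduction, 1))  # 2.7
```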

  19. Scaling of Theory-of-Mind Tasks

    ERIC Educational Resources Information Center

    Wellman, Henry M.; Liu, David

    2004-01-01

    Two studies address the sequence of understandings evident in preschoolers' developing theory of mind. The first, preliminary study provides a meta-analysis of research comparing different types of mental state understandings (e.g., desires vs. beliefs, ignorance vs. false belief). The second, primary study tests a theory-of-mind scale for…

  20. The Role of L2 Learner Goal Differences in Task-Generated Oral Production

    ERIC Educational Resources Information Center

    Ben Maad, Mohamed Ridha

    2016-01-01

    In light of the growing interest in the cognitive approach to task in second language (L2) research, comparatively little has been done to examine such effect over time and the role of individual differences (IDs). The present study was designed to verify some longitudinal evidence for the role of tasks in L2 production and whether IDs, in the…

  1. 2.5-year-olds succeed at a verbal anticipatory-looking false-belief task.

    PubMed

    He, Zijing; Bolz, Matthias; Baillargeon, Renée

    2012-03-01

    Recent research suggests that infants and toddlers succeed at a wide range of non-elicited-response false-belief tasks (i.e., tasks that do not require children to answer a direct question about a mistaken agent's likely behaviour). However, one exception to this generalization comes from verbal anticipatory-looking tasks, which have produced inconsistent findings with toddlers. One possible explanation for these findings is that toddlers succeed when they correctly interpret the prompt as a self-addressed utterance (making the task a non-elicited-response task), but fail when they mistakenly interpret the prompt as a direct question (making the task an elicited-response task). Here, 2.5-year-old toddlers were tested in a verbal anticipatory-looking task that was designed to help them interpret the anticipatory prompt as a self-addressed utterance: the experimenter looked at the ceiling, chin in hand, during and after the prompt. Children gave evidence of false-belief understanding in this task, but failed when the experimenter looked at the child during and after the prompt. These results reinforce claims of robust continuity in early false-belief reasoning and provide additional support for the distinction between non-elicited- and elicited-response false-belief tasks. Three accounts of the discrepant results obtained with these tasks - and of early false-belief understanding more generally - are discussed. © 2011 The British Psychological Society.

  2. The development of Metacognition test in genetics laboratory for undergraduate students

    NASA Astrophysics Data System (ADS)

    A-nongwech, Nattapong; Pruekpramool, Chaninan

    2018-01-01

    The purpose of this research was to develop a Metacognition test for a Genetics Laboratory course for undergraduate students. The participants were 30 undergraduate students of a Rajabhat university in the Rattanakosin group in the second semester of the 2016 academic year, selected by purposive sampling. The research instruments consisted of 1) the Metacognition test and 2) an expert evaluation form for the test focused on three main points: the accuracy of the content, the consistency between Metacognitive experiences and questions, and the appropriateness of the test. The quality of the test was analyzed using the Index of Consistency (IOC), discrimination, and reliability. The results were as follows. 1) The Metacognition test contained 56 open-ended items, comprising four scientific situations, each with fourteen open-ended questions evaluating the components of Metacognition, together with four-scale scoring criteria for the fourteen items. The components of Metacognition consisted of Metacognitive knowledge, divided into person knowledge, task knowledge, and strategy knowledge, and Metacognitive experience, divided into planning, monitoring, and evaluating. 2) The item analysis showed that the IOC between Metacognitive experiences and questions ranged from 0.75 to 1.00, the accuracy of content equaled 1.00, and the appropriateness of the test equaled 1.00 for all situations and items. The discrimination of the test ranged from 0.00 to 0.73. Furthermore, the reliability of the test equaled 0.97.
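The IOC statistic used to screen items can be sketched as a mean of expert ratings. The ratings and the 0.5 keep-threshold below are assumptions for illustration, not values from the study:

```python
# Index of Consistency (IOC) sketch: each expert rates an item -1
# (inconsistent), 0 (unsure), or +1 (consistent); the item's IOC is the
# mean rating. A common convention keeps items with IOC >= 0.5
# (an assumption here, not stated in the abstract).
def ioc(ratings):
    return sum(ratings) / len(ratings)

item_ratings = [1, 1, 0, 1]  # four hypothetical expert ratings for one item
print(ioc(item_ratings), ioc(item_ratings) >= 0.5)  # 0.75 True
```

An IOC of 0.75 falls inside the 0.75-1.00 range the abstract reports for the consistency between metacognitive experiences and questions.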

  3. Relationship of Level of Functioning of Institutionalized Women on a Task Analysis of Personal Care for Menstruation and the Adaptive Behavior Scale.

    ERIC Educational Resources Information Center

    Brekke, Beverly W.; And Others

    A 40-item behavior analysis task, the Menstrual Care Scale, was developed and tested with 75 randomly selected institutionalized severely retarded women (13-59 years old). The need for developing personal care skills in menstruation habits had been identified as a priority area for sexuality instruction by staff and confirmed by analysis of…

  4. Task Prioritization in Dual-Tasking: Instructions versus Preferences

    PubMed Central

    Jansen, Reinier J.; van Egmond, René; de Ridder, Huib

    2016-01-01

The role of task prioritization in performance tradeoffs during multi-tasking has received widespread attention. However, little is known about whether people have preferences regarding tasks, and if so, whether these preferences conflict with priority instructions. Three experiments were conducted with a high-speed driving game and an auditory memory task. In Experiment 1, participants did not receive priority instructions. Participants performed different sequences of single-task and dual-task conditions. Task performance was evaluated according to participants’ retrospective accounts of their preferences. These preferences were reformulated as priority instructions in Experiments 2 and 3. The results showed that people differ in their preferences regarding task prioritization in an experimental setting, which can be overruled by priority instructions, but only after increased dual-task exposure. Additional measures of mental effort showed that performance tradeoffs had an impact on mental effort. The interpretation of these findings was used to explore an extension of Threaded Cognition Theory with Hockey’s Compensatory Control Model. PMID:27391779

  5. Walking and cognition, but not symptoms, correlate with dual task cost of walking in multiple sclerosis.

    PubMed

    Motl, Robert W; Sosnoff, Jacob J; Dlugonski, Deirdre; Pilutti, Lara A; Klaren, Rachel; Sandroff, Brian M

    2014-03-01

Performing a cognitive task while walking results in a reduction of walking performance among persons with MS. To date, very little is known about correlates of this dual task cost (DTC) of walking in MS. We examined walking performance, cognitive processing speed, and symptoms of fatigue, depression, anxiety, and pain as correlates of the DTC of walking in MS. Eighty-two persons with MS undertook a 6-min walk test (6MWT) and completed the Symbol Digit Modalities Test (SDMT), Fatigue Severity Scale (FSS), Short-form of the McGill Pain Questionnaire (SF-MPQ), Hospital Anxiety and Depression Scale (HADS), and self-reported Expanded Disability Status Scale (SR-EDSS). The participants completed 4 trials of walking at a self-selected pace on an electronic walkway that recorded spatiotemporal parameters of gait. The first 2 trials were performed without a cognitive task, whereas the second 2 trials were completed while performing a modified Word List Generation task. There were significant and large declines in gait performance with the addition of a cognitive task for velocity (p<.001, η2=.52), cadence (p<.001, η2=.49), and step length (p<.001, η2=.23). 6MWT and SDMT scores correlated with the DTC for velocity (r=-.41, p<.001 and r=-.32, p<.001, respectively) and step length (r=-.45, p<.001 and r=-.37, p<.001, respectively); there were no significant associations between FSS, SF-MPQ, and HADS scores and the DTC of walking. Regression analyses indicated that the 6MWT, but not the SDMT, explained variance in the DTC for velocity (ΔR2=.11, p<.001) and step length (ΔR2=.13, p<.001), after controlling for SR-EDSS scores. Walking performance might be a target of interventions for reducing the DTC of walking in MS. Copyright © 2013 Elsevier B.V. All rights reserved.
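The dual task cost described here is conventionally expressed as the percent decline in a gait parameter from single-task to dual-task walking. A minimal sketch of that calculation, with illustrative velocities rather than the study's data:

```python
def dual_task_cost(single, dual):
    """DTC (%) = (single - dual) / single * 100; positive values
    indicate a decline under dual-task conditions."""
    return (single - dual) / single * 100.0

velocity_single = 120.0  # cm/s, walking only (hypothetical value)
velocity_dual = 96.0     # cm/s, walking + word-list generation (hypothetical)
dtc = dual_task_cost(velocity_single, velocity_dual)  # 20.0 % decline
```

The same formula applies to cadence and step length; the abstract reports effect sizes, not the raw parameter values used here.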

  6. A Comparison of Laboratory and Clinical Working Memory Tests and Their Prediction of Fluid Intelligence

    PubMed Central

    Shelton, Jill T.; Elliott, Emily M.; Hill, B. D.; Calamia, Matthew R.; Gouvier, Wm. Drew

    2010-01-01

    The working memory (WM) construct is conceptualized similarly across domains of psychology, yet the methods used to measure WM function vary widely. The present study examined the relationship between WM measures used in the laboratory and those used in applied settings. A large sample of undergraduates completed three laboratory-based WM measures (operation span, listening span, and n-back), as well as the WM subtests from the Wechsler Adult Intelligence Scale-III and the Wechsler Memory Scale-III. Performance on all of the WM subtests of the clinical batteries shared positive correlations with the lab measures; however, the Arithmetic and Spatial Span subtests shared lower correlations than the other WM tests. Factor analyses revealed that a factor comprising scores from the three lab WM measures and the clinical subtest, Letter-Number Sequencing (LNS), provided the best measurement of WM. Additionally, a latent variable approach was taken using fluid intelligence as a criterion construct to further discriminate between the WM tests. The results revealed that the lab measures, along with the LNS task, were the best predictors of fluid abilities. PMID:20161647

  7. Full-scale laboratory validation of a wireless MEMS-based technology for damage assessment of concrete structures

    NASA Astrophysics Data System (ADS)

    Trapani, Davide; Zonta, Daniele; Molinari, Marco; Amditis, Angelos; Bimpas, Matthaios; Bertsch, Nicolas; Spiering, Vincent; Santana, Juan; Sterken, Tom; Torfs, Tom; Bairaktaris, Dimitris; Bairaktaris, Manos; Camarinopulos, Stefanos; Frondistou-Yannas, Mata; Ulieru, Dumitru

    2012-04-01

    This paper illustrates an experimental campaign conducted under laboratory conditions on a full-scale reinforced concrete three-dimensional frame instrumented with wireless sensors developed within the Memscon project. In particular it describes the assumptions which the experimental campaign was based on, the design of the structure, the laboratory setup and the results of the tests. The aim of the campaign was to validate the performance of Memscon sensing systems, consisting of wireless accelerometers and strain sensors, on a real concrete structure during construction and under an actual earthquake. Another aspect of interest was to assess the effectiveness of the full damage recognition procedure based on the data recorded by the sensors and the reliability of the Decision Support System (DSS) developed in order to provide the stakeholders recommendations for building rehabilitation and the costs of this. With these ends, a Eurocode 8 spectrum-compatible accelerogram with increasing amplitude was applied at the top of an instrumented concrete frame built in the laboratory. MEMSCON sensors were directly compared with wired instruments, based on devices available on the market and taken as references, during both construction and seismic simulation.

  8. Dysautonomia Rating Scales in Parkinson’s Disease: Sialorrhea, Dysphagia, and Constipation—Critique and Recommendations by Movement Disorders Task Force on Rating Scales for Parkinson’s Disease

    PubMed Central

    Evatt, Marian L.; Chaudhuri, K. Ray; Chou, Kelvin L.; Cubo, Ester; Hinson, Vanessa; Kompoliti, Katie; Yang, Chengwu; Poewe, Werner; Rascol, Olivier; Sampaio, Cristina; Stebbins, Glenn T.; Goetz, Christopher G.

    2015-01-01

Upper and lower gastrointestinal dysautonomia symptoms (GIDS)—sialorrhea, dysphagia, and constipation—are common in Parkinson’s disease (PD) and often socially as well as physically disabling for patients. Available invasive quantitative measures for assessing these symptoms and their response to therapy are time-consuming, require specialized equipment, can cause patient discomfort and present patients with risk. The Movement Disorders Society commissioned a task force to assess available clinical rating scales, critique their clinimetric properties, and make recommendations regarding their clinical utility. Six clinical researchers and a biostatistician systematically searched the literature for scales of sialorrhea, dysphagia, and constipation, evaluated the scales’ previous use, performance parameters, and quality of validation data (if available). A scale was designated “Recommended” if the scale was used in clinical studies beyond the group that developed it, has been specifically used in PD reports, and clinimetric studies have established that it is valid, reliable, and sensitive. “Suggested” scales met at least part of the above criteria, but fell short of meeting all. Based on the systematic review, scales for individual symptoms of sialorrhea, dysphagia, and constipation were identified along with three global scales that include these symptoms in the context of assessing dysautonomia or nonmotor symptoms. Three sialorrhea scales met criteria for Suggested: Drooling Severity and Frequency Scale (DSFS), Drooling Rating Scale, and Sialorrhea Clinical Scale for PD (SCS-PD). Two dysphagia scales, the Swallowing Disturbance Questionnaire (SDQ) and Dysphagia-Specific Quality of Life (SWAL-QOL), met criteria for Suggested. Although the Rome III constipation module is widely accepted in the gastroenterology community, and the earlier version from the Rome II criteria has been used in a single study of PD patients, neither met criteria for Suggested.

  9. Functional Task Test: 2. Spaceflight-Induced Cardiovascular Change and Recovery During NASA's Functional Task Test

    NASA Technical Reports Server (NTRS)

    Phillips, Tiffany; Arzeno, Natalia M.; Stenger, Michael; Lee, Stuart M. C.; Bloomberg, Jacob J.; Platts, Steven H.

    2011-01-01

    The overall objective of the functional task test (FTT) is to correlate spaceflight-induced physiological adaptations with changes in performance of high priority exploration mission-critical tasks. This presentation will focus on the recovery from fall/stand test (RFST), which measures the cardiovascular response to the transition from the prone posture (simulated fall) to standing in normal gravity, as well as heart rate (HR) during 11 functional tasks. As such, this test describes some aspects of spaceflight-induced cardiovascular deconditioning and the course of recovery in Space Shuttle and International Space Station (ISS) astronauts. The sensorimotor and neuromuscular components of the FTT are described in two separate abstracts: Functional Task Test 1 and 3.

  10. Nutrient removal by up-scaling a hybrid floating treatment bed (HFTB) using plant and periphyton: From laboratory tank to polluted river.

    PubMed

    Liu, Junzhuo; Wang, Fengwu; Liu, Wei; Tang, Cilai; Wu, Chenxi; Wu, Yonghong

    2016-05-01

The planted floating treatment bed (FTB) is an innovative technique for removing nutrients from polluted water, but its performance is limited in deep water and cold seasons. Periphyton was integrated into the FTB to form a hybrid floating treatment bed (HFTB) and improve its nutrient removal capacity. To assess its potential for treating nutrient-polluted rivers, the HFTB was up-scaled from 5 L laboratory tanks to 350 L outdoor tanks and then to a commercial-scale 900 m section of polluted river. Plants and periphyton interacted in the HFTB, with periphyton limiting plant root growth and plants having shading effects on periphyton. Non-overlapping distribution of plants and periphyton can minimize these negative interactions in the HFTB. The HFTB successfully kept the TN and TP of the river at less than 2.0 and 0.02 mg L(-1), respectively. This study indicates that the HFTB can be easily up-scaled for nutrient removal from polluted rivers in different seasons, providing a long-term, environmentally friendly method to remediate polluted ecosystems. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. The total laboratory solution: a new laboratory E-business model based on a vertical laboratory meta-network.

    PubMed

    Friedman, B A

    2001-08-01

    Major forces are now reshaping all businesses on a global basis, including the healthcare and clinical laboratory industries. One of the major forces at work is information technology (IT), which now provides the opportunity to create a new economic and business model for the clinical laboratory industry based on the creation of an integrated vertical meta-network, referred to here as the "total laboratory solution" (TLS). Participants at the most basic level of such a network would include a hospital-based laboratory, a reference laboratory, a laboratory information system/application service provider/laboratory portal vendor, an in vitro diagnostic manufacturer, and a pharmaceutical/biotechnology manufacturer. It is suggested that each of these participants would add value to the network primarily in its area of core competency. Subvariants of such a network have evolved over recent years, but a TLS comprising all or most of these participants does not exist at this time. Although the TLS, enabled by IT and closely akin to the various e-businesses that are now taking shape, offers many advantages from a theoretical perspective over the current laboratory business model, its success will depend largely on (a) market forces, (b) how the collaborative networks are organized and managed, and (c) whether the network can offer healthcare organizations higher quality testing services at lower cost. If the concept is successful, new demands will be placed on hospital-based laboratory professionals to shift the range of professional services that they offer toward clinical consulting, integration of laboratory information from multiple sources, and laboratory information management. These information management and integration tasks can only increase in complexity in the future as new genomic and proteomics testing modalities are developed and come on-line in clinical laboratories.

  12. A large scale laboratory cage trial of Aedes densonucleosis virus (AeDNV).

    PubMed

    Wise de Valdez, Megan R; Suchman, Erica L; Carlson, Jonathan O; Black, William C

    2010-05-01

    Aedes aegypti (L.) (Diptera: Culicidae) the primary vector of dengue viruses (DENV1-4), oviposit in and around human dwellings, including sites difficult to locate, making control of this mosquito challenging. We explored the efficacy and sustainability of Aedes Densonucleosis Virus (AeDNV) as a biocontrol agent for Ae. aegypti in and among oviposition sites in large laboratory cages (> 92 m3) as a prelude to field trials. Select cages were seeded with AeDNV in a single oviposition site (OPS) with unseeded OPSs established at varied distances. Quantitative real-time polymerase chain reaction was used to track dispersal and accumulation of AeDNV among OPSs. All eggs were collected weekly from each cage and counted. We asked: (1) Is AeDNV dispersed over varying distances and can it accumulate and persist in novel OPSs? (2) Are egg densities reduced in AeDNV treated populations? AeDNV was dispersed to and sustained in novel OPSs. Virus accumulation in OPSs was positively correlated with egg densities and proximity to the initial infection source affected the timing of dispersal and maintenance of viral titers. AeDNV did not significantly reduce Ae. aegypti egg densities. The current study documents that adult female Ae. aegypti oviposition behavior leads to successful viral dispersal from treated to novel containers in large-scale cages; however, the AeDNV titers reached were not sufficient to reduce egg densities.

  13. Scaled laboratory experiments explain the kink behaviour of the Crab Nebula jet

    PubMed Central

    Li, C. K.; Tzeferacos, P.; Lamb, D.; Gregori, G.; Norreys, P. A.; Rosenberg, M. J.; Follett, R. K.; Froula, D. H.; Koenig, M.; Seguin, F. H.; Frenje, J. A.; Rinderknecht, H. G.; Sio, H.; Zylstra, A. B.; Petrasso, R. D.; Amendt, P. A.; Park, H. S.; Remington, B. A.; Ryutov, D. D.; Wilks, S. C.; Betti, R.; Frank, A.; Hu, S. X.; Sangster, T. C.; Hartigan, P.; Drake, R. P.; Kuranz, C. C.; Lebedev, S. V.; Woolsey, N. C.

    2016-01-01

    The remarkable discovery by the Chandra X-ray observatory that the Crab nebula's jet periodically changes direction provides a challenge to our understanding of astrophysical jet dynamics. It has been suggested that this phenomenon may be the consequence of magnetic fields and magnetohydrodynamic instabilities, but experimental demonstration in a controlled laboratory environment has remained elusive. Here we report experiments that use high-power lasers to create a plasma jet that can be directly compared with the Crab jet through well-defined physical scaling laws. The jet generates its own embedded toroidal magnetic fields; as it moves, plasma instabilities result in multiple deflections of the propagation direction, mimicking the kink behaviour of the Crab jet. The experiment is modelled with three-dimensional numerical simulations that show exactly how the instability develops and results in changes of direction of the jet. PMID:27713403

  14. Scaled laboratory experiments explain the kink behaviour of the Crab Nebula jet.

    PubMed

    Li, C K; Tzeferacos, P; Lamb, D; Gregori, G; Norreys, P A; Rosenberg, M J; Follett, R K; Froula, D H; Koenig, M; Seguin, F H; Frenje, J A; Rinderknecht, H G; Sio, H; Zylstra, A B; Petrasso, R D; Amendt, P A; Park, H S; Remington, B A; Ryutov, D D; Wilks, S C; Betti, R; Frank, A; Hu, S X; Sangster, T C; Hartigan, P; Drake, R P; Kuranz, C C; Lebedev, S V; Woolsey, N C

    2016-10-07

    The remarkable discovery by the Chandra X-ray observatory that the Crab nebula's jet periodically changes direction provides a challenge to our understanding of astrophysical jet dynamics. It has been suggested that this phenomenon may be the consequence of magnetic fields and magnetohydrodynamic instabilities, but experimental demonstration in a controlled laboratory environment has remained elusive. Here we report experiments that use high-power lasers to create a plasma jet that can be directly compared with the Crab jet through well-defined physical scaling laws. The jet generates its own embedded toroidal magnetic fields; as it moves, plasma instabilities result in multiple deflections of the propagation direction, mimicking the kink behaviour of the Crab jet. The experiment is modelled with three-dimensional numerical simulations that show exactly how the instability develops and results in changes of direction of the jet.

  15. Clinch River remedial investigation task 9 -- benthic macroinvertebrates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, E.M. Jr.

    1994-08-01

This report summarizes the results of Task 9 of the TVA/Department of Energy (DOE) Interagency Agreement supporting DOE's Clinch River Remedial Investigation. Species lists and densities (numbers/m²) of benthic macroinvertebrates sampled at 16 sites in the Clinch River and Poplar Creek embayments of upper Watts Bar Reservoir near Oak Ridge, Tennessee, in March 1994, are presented and briefly discussed. Data are also analyzed to assess and compare the quality of benthic communities at each site, according to methods developed for TVA's Reservoir Vital Signs Monitoring Program. Results of this study will be incorporated with other program tasks in a comprehensive report prepared by Oak Ridge National Laboratory in 1995, which will, in part, assess the effect of sediment contaminants on benthic macroinvertebrate communities in Watts Bar Reservoir.

  16. Working Memory Training Improves Dual-Task Performance on Motor Tasks.

    PubMed

    Kimura, Takehide; Kaneko, Fuminari; Nagahata, Keita; Shibata, Eriko; Aoki, Nobuhiro

    2017-01-01

The authors investigated whether working memory training improves motor-motor dual-task performance consisting of upper and lower limb tasks. The upper limb task was a simple reaction task and the lower limb task was an isometric knee extension task. Forty-five participants (age = 21.8 ± 1.6 years) were classified into a working memory training group (WM-TRG), a dual-task training group, or a control group. The training duration was 2 weeks (15 min, 4 times/week). Our results indicated that working memory capacity increased significantly only in the WM-TRG. Dual-task performance improved in the WM-TRG and the dual-task training group. Our study provides the novel insight that working memory training improves dual-task performance without specific training on the target motor task.

  17. The validity and scalability of the Theory of Mind Scale with toddlers and preschoolers.

    PubMed

    Hiller, Rachel M; Weber, Nathan; Young, Robyn L

    2014-12-01

Despite the importance of theory of mind (ToM) for typical development, there remain 2 key issues affecting our ability to draw robust conclusions. One is the continued focus on false belief as the sole measure of ToM. The second is the lack of empirically validated measures of ToM as a broad construct. Our key aim was to examine the validity and reliability of the 5-item ToM scale (Peterson, Wellman, & Liu, 2005). In particular, we extended previous research on this scale by assessing its scalability and validity for use with children from 2 years of age. Sixty-eight typically developing children (aged 24 to 61 months) were assessed on the scale's 5 tasks, along with a sixth Sally-Anne false-belief task. Our data replicated the scalability of the 5 tasks as a Rasch, but not a Guttman, scale. Guttman analysis showed that a 4-item scale may be more suitable for this age range. Further, the tasks showed good internal consistency and validity for use with children as young as 2 years of age. Overall, the measure provides a valid and reliable tool for the assessment of ToM, and in particular, the longitudinal assessment of this ability as a construct. (c) 2014 APA, all rights reserved.
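The Guttman analysis mentioned above checks whether each child's responses form a cumulative pattern: once a task is failed, all harder tasks should be failed too. A minimal sketch of the usual error count and coefficient of reproducibility (1 − errors/responses, with ≥ 0.90 the conventional criterion), using hypothetical response patterns rather than the study's data:

```python
def guttman_errors(pattern):
    """Fewest responses that must flip to make the pattern cumulative,
    i.e. pass the first k tasks (in difficulty order) and fail the rest."""
    n = len(pattern)
    return min(
        sum(1 for i, r in enumerate(pattern) if r != (1 if i < k else 0))
        for k in range(n + 1)
    )

def reproducibility(patterns):
    """Coefficient of reproducibility = 1 - errors / total responses."""
    total = sum(len(p) for p in patterns)
    errors = sum(guttman_errors(p) for p in patterns)
    return 1 - errors / total

children = [
    [1, 1, 1, 0, 0],  # perfect cumulative pattern (hypothetical)
    [1, 1, 0, 1, 0],  # one Guttman error
    [1, 0, 0, 0, 0],  # perfect
]
rep = reproducibility(children)
```

This is a generic scalogram check, not the specific analysis reported in the paper; the exact criterion and error-counting convention used there are not given in the abstract.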

  18. Effects of dopamine replacement therapy on lower extremity kinetics and kinematics during a rapid force production task in persons with Parkinson disease.

    PubMed

    Foreman, K Bo; Singer, Madeline L; Addison, Odessa; Marcus, Robin L; LaStayo, Paul C; Dibble, Leland E

    2014-01-01

Postural instability appears to be a dopamine-resistant motor deficit in persons with Parkinson disease (PD); however, little is known about the effects of dopamine replacement on the relative biomechanical contributions of individual lower extremity joints during postural control tasks. To gain insight, we examined persons with PD using both clinical and laboratory measures. As a clinical measure of motor severity, we utilized the motor subsection of the Unified Parkinson Disease Rating Scale during both OFF and ON medication conditions. For the laboratory measure, we utilized data gathered during a rapid lower extremity force production task. Kinematic and kinetic variables at the hip, knee, and ankle were gathered during a countermovement jump during both OFF and ON medication conditions. Sixteen persons with PD with a median Hoehn and Yahr severity of 2.5 completed the study. Medication resulted in significant improvements in angular displacement for the hip, knee, and ankle. Furthermore, significant improvements were revealed only at the hip for peak net moments and average angular velocity compared to the OFF medication condition. These results suggest that dopamine replacement medication decreases clinical motor disease severity and has a greater influence on kinetics and kinematics proximally. This proximally focused improvement may be due to active recruitment of muscle force and reductions in passive restraint during rapid lower extremity force production. Copyright © 2013 Elsevier B.V. All rights reserved.

  19. Laboratory testing of candidate robotic applications for space

    NASA Technical Reports Server (NTRS)

    Purves, R. B.

    1987-01-01

    Robots have potential for increasing the value of man's presence in space. Some categories with potential benefit are: (1) performing extravehicular tasks like satellite and station servicing, (2) supporting the science mission of the station by manipulating experiment tasks, and (3) performing intravehicular activities which would be boring, tedious, exacting, or otherwise unpleasant for astronauts. An important issue in space robotics is selection of an appropriate level of autonomy. In broad terms three levels of autonomy can be defined: (1) teleoperated - an operator explicitly controls robot movement; (2) telerobotic - an operator controls the robot directly, but by high-level commands, without, for example, detailed control of trajectories; and (3) autonomous - an operator supplies a single high-level command, the robot does all necessary task sequencing and planning to satisfy the command. Researchers chose three projects for their exploration of technology and implementation issues in space robots, one each of the three application areas, each with a different level of autonomy. The projects were: (1) satellite servicing - teleoperated; (2) laboratory assistant - telerobotic; and (3) on-orbit inventory manager - autonomous. These projects are described and some results of testing are summarized.

  20. On the origins of the task mixing cost in the cuing task-switching paradigm.

    PubMed

    Rubin, Orit; Meiran, Nachshon

    2005-11-01

Poorer performance in conditions involving task repetition within blocks of mixed tasks, relative to task repetition within single-task blocks, is called mixing cost (MC). In 2 experiments exploring 2 hypotheses regarding the origins of MC, participants either switched between cued shape and color tasks or performed them as single tasks. Experiment 1 supported the hypothesis that mixed-task trials require the resolution of task ambiguity by showing that MC existed only with ambiguous stimuli that afforded both tasks and not with unambiguous stimuli affording only 1 task. Experiment 2 failed to support the hypothesis that holding multiple task sets in working memory (WM) generates MC by showing that systematic manipulation of the number of stimulus-response rules in WM did not affect MC. The results emphasize the role of competition management between task sets during task control.
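The mixing cost defined above is computed from reaction times (RTs) by contrasting repeat trials in mixed blocks with single-task blocks; the familiar switch cost, by contrast, compares switch and repeat trials within mixed blocks. A minimal sketch with illustrative RTs, not data from these experiments:

```python
def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical mean RTs in milliseconds:
single_block_rt = [480, 500, 490]   # pure single-task blocks
mixed_repeat_rt = [610, 640, 625]   # task-repeat trials in mixed blocks
mixed_switch_rt = [720, 700, 710]   # task-switch trials in mixed blocks

# Mixing cost: cost of being in a mixed block at all, even on repeats.
mixing_cost = mean(mixed_repeat_rt) - mean(single_block_rt)   # 135.0 ms
# Switch cost: additional cost of actually changing task.
switch_cost = mean(mixed_switch_rt) - mean(mixed_repeat_rt)   # 85.0 ms
```

The decomposition, not the numbers, is the point: the paper's ambiguity account concerns why `mixing_cost` is nonzero on repeat trials.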

  1. Scaling considerations for a multi-megawatt class supercritical CO2 brayton cycle and commercialization.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fleming, Darryn D.; Holschuh, Thomas Vernon,; Conboy, Thomas M.

    2013-11-01

Small-scale supercritical CO2 demonstration loops are successful at identifying the important technical issues that one must face in order to scale up to larger power levels. The Sandia National Laboratories supercritical CO2 Brayton cycle test loops are identifying the technical needs for scaling the technology to commercial power levels such as 10 MWe. The small Sandia 1 MWth loop has demonstrated the efficiency of the split-flow loop and the effectiveness of the Printed Circuit Heat Exchangers (PCHXs), leading to the design of a fully recuperated, split-flow, supercritical CO2 Brayton cycle demonstration system. However, many problems were encountered, such as high rotational speeds in the units. Additionally, issues with the test-loop turbomachinery concerning the bearings, seals, thermal boundaries, and motor controllers must be resolved for it to prove a reliable power source in the 300 kWe range. Although these issues were anticipated in the smaller demonstration units, commercially scaled hardware would eliminate the problems caused by high rotational speeds at small scale. The economic viability and development of the future scalable 10 MWe system depends on the interest of DOE and private industry. The intellectual property collected by Sandia indicates that a ~10 MWe supercritical CO2 power conversion loop would be very beneficial when coupled to a 20 MWth heat source (solar, geothermal, fossil, or nuclear). This paper identifies a commercialization plan, as well as a roadmap from the simple 1 MWth supercritical CO2 development loop to a power-producing 10 MWe supercritical CO2 Brayton loop.

  2. Two-dimensional systolic-array architecture for pixel-level vision tasks

    NASA Astrophysics Data System (ADS)

    Vijverberg, Julien A.; de With, Peter H. N.

    2010-05-01

This paper presents ongoing work on the design of a two-dimensional (2D) systolic array for image processing. This component is designed to operate on a multi-processor system-on-chip. In contrast with other 2D systolic-array architectures and many other hardware accelerators, we investigate the applicability of executing multiple tasks in a time-interleaved fashion on the Systolic Array (SA). This leads to a lower external memory bandwidth and better load balancing of the tasks on the different processing tiles. To enable the interleaving of tasks, we add a shadow-state register for fast task switching. To reduce the number of accesses to the external memory, we propose to share the communication assist between consecutive tasks. A preliminary, non-functional version of the SA has been synthesized for an XV4S25 FPGA device and yields a maximum clock frequency of 150 MHz, requiring 1,447 slices and 5 memory blocks. Mapping tasks from video content-analysis applications from the literature onto the SA yields reductions in execution time of 1-2 orders of magnitude compared to the software implementation. We conclude that the choice of an SA architecture is useful, but a scaled version of the SA, featuring less logic with fewer processing and pipeline stages and yielding a lower clock frequency, would be sufficient for a video-analysis system-on-chip.
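The shadow-state register mentioned above enables fast task switching by keeping a second, preloaded copy of task context so that switching reduces to a swap. A minimal behavioral sketch of that idea; the class and field names are illustrative and not taken from the paper's SA design:

```python
class ProcessingElement:
    """Toy model of one tile with an active and a shadow task context."""

    def __init__(self):
        self.active_state = None   # context of the task currently executing
        self.shadow_state = None   # preloaded context of the next task

    def preload(self, task_state):
        """Load the next task's context in the background, without
        disturbing the running task."""
        self.shadow_state = task_state

    def switch_task(self):
        """O(1) task switch: swap the active and shadow registers."""
        self.active_state, self.shadow_state = (
            self.shadow_state, self.active_state)

pe = ProcessingElement()
pe.active_state = {"task": "motion-detect", "row": 3}
pe.preload({"task": "edge-detect", "row": 0})
pe.switch_task()   # edge-detect now active; motion-detect context retained
```

In hardware this would be a register bank with a select bit rather than a Python swap, but the scheduling consequence is the same: switching between interleaved tasks costs a cycle, not a context reload from external memory.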

  3. 2. Exterior view of Systems Integration Laboratory Building (T28), looking ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. Exterior view of Systems Integration Laboratory Building (T-28), looking southwest. The low-lying concrete Signal Transfer Building (T-28A) is located in the immediate foreground. - Air Force Plant PJKS, Systems Integration Laboratory, Systems Integration Laboratory Building, Waterton Canyon Road & Colorado Highway 121, Lakewood, Jefferson County, CO

  4. Improving quality management systems of laboratories in developing countries: an innovative training approach to accelerate laboratory accreditation.

    PubMed

    Yao, Katy; McKinney, Barbara; Murphy, Anna; Rotz, Phil; Wafula, Winnie; Sendagire, Hakim; Okui, Scolastica; Nkengasong, John N

    2010-09-01

    The Strengthening Laboratory Management Toward Accreditation (SLMTA) program was developed to promote immediate, measurable improvement in laboratories of developing countries. The laboratory management framework, a tool that prescribes managerial job tasks, forms the basis of the hands-on, activity-based curriculum. SLMTA is implemented through multiple workshops with intervening site visits to support improvement projects. To evaluate the effectiveness of SLMTA, the laboratory accreditation checklist was developed and subsequently adopted by the World Health Organization Regional Office for Africa (WHO AFRO). The SLMTA program and the implementation model were validated through a pilot in Uganda. SLMTA yielded observable, measurable results in the laboratories and improved patient flow and turnaround time in a laboratory simulation. The laboratory staff members were empowered to improve their own laboratories by using existing resources, communicate with clinicians and hospital administrators, and advocate for system strengthening. The SLMTA program supports laboratories by improving management and building preparedness for accreditation.

  5. 2.5-year-olds Succeed at a Verbal Anticipatory-Looking False-Belief Task

    PubMed Central

    He, Zijing; Bolz, Matthias; Baillargeon, Renée

    2012-01-01

    Recent research suggests that infants and toddlers succeed at a wide range of nonelicited-response false-belief tasks (i.e., tasks that do not require children to answer a direct question about a mistaken agent’s likely behavior). However, one exception to this generalization comes from verbal anticipatory-looking tasks, which have produced inconsistent findings with toddlers. One possible explanation for these findings is that toddlers succeed when they correctly interpret the prompt as a self-addressed utterance (making the task a nonelicited-response task), but fail when they mistakenly interpret the prompt as a direct question (making the task an elicited-response task). Here, 2.5-year-old toddlers were tested in a verbal anticipatory-looking task that was designed to help them interpret the anticipatory prompt as a self-addressed utterance: the experimenter looked at the ceiling, chin in hand, during and after the prompt. Children gave evidence of false-belief understanding in this task, but failed when the experimenter looked at the child during and after the prompt. These results reinforce claims of robust continuity in early false-belief reasoning and provide additional support for the distinction between nonelicited- and elicited-response false-belief tasks. Three accounts of the discrepant results obtained with these tasks—and of early false-belief understanding more generally—are discussed. PMID:22429030

  6. Laboratory and Self-Report Methods to Assess Reappraisal and Distraction in Youth.

    PubMed

    Bettis, Alexandra H; Henry, Lauren; Prussien, Kemar V; Vreeland, Allison; Smith, Michele; Adery, Laura H; Compas, Bruce E

    2018-06-07

Coping and emotion regulation are central features of risk and resilience in childhood and adolescence, but research on these constructs has relied on different methods of assessment. The current study aimed to bridge the gap between questionnaire and experimental methods of measuring secondary control coping strategies, specifically distraction and cognitive reappraisal, and examine associations with symptoms of anxiety and depression in youth. A community sample of 70 youth (ages 9-15) completed a novel experimental coping and emotion regulation paradigm and self-report measures of coping and emotion regulation and symptoms. Findings indicate that use of distraction and reappraisal during the laboratory paradigm was associated with lower levels of negative emotion during the task. Youth emotion ratings while implementing distraction, but not reappraisal, during the laboratory task were associated with youth self-reported use of secondary control coping in response to family stress. Youth symptoms of anxiety and depression were also significantly positively associated with negative emotion ratings during the laboratory task, and both laboratory task and self-reported coping and emotion regulation accounted for significant variance in symptoms in youth. Both questionnaire and laboratory methods to assess coping and emotion regulation in youth are important for understanding these processes as possible mechanisms of risk and resilience, and continued integration of these methods is a priority for future research.

  7. The effect of entrapped nonaqueous phase liquids on tracer transport in heterogeneous porous media: Laboratory experiments at the intermediate scale

    USGS Publications Warehouse

    Barth, Gilbert R.; Illangasekare, T.H.; Rajaram, H.

    2003-01-01

    This work considers the applicability of conservative tracers for detecting high-saturation nonaqueous-phase liquid (NAPL) entrapment in heterogeneous systems. For this purpose, a series of experiments and simulations was performed using a two-dimensional heterogeneous system (10 × 1.2 m), which represents an intermediate scale between laboratory and field scales. Tracer tests performed prior to injecting the NAPL provide the baseline response of the heterogeneous porous medium. Two NAPL spill experiments were performed and the entrapped-NAPL saturation distribution measured in detail using a gamma-ray attenuation system. Tracer tests following each of the NAPL spills produced breakthrough curves (BTCs) reflecting the impact of entrapped NAPL on conservative transport. To evaluate significance, the impact of NAPL entrapment on the conservative-tracer breakthrough curves was compared to simulated breakthrough curve variability for different realizations of the heterogeneous distribution. Analysis of the results reveals that the NAPL entrapment has a significant impact on the temporal moments of conservative-tracer breakthrough curves. © 2003 Elsevier B.V. All rights reserved.
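The temporal-moment analysis this abstract relies on is a standard way to summarize a breakthrough curve: the zeroth moment gives recovered mass, the first gives mean arrival time, and the central second moment gives temporal spreading. A minimal sketch, using a synthetic Gaussian pulse rather than the study's measured data:

```python
import numpy as np

def trapz(y, x):
    """Trapezoidal integration (kept local for NumPy-version independence)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def temporal_moments(t, c):
    """Area, mean arrival time, and central second moment of a BTC c(t)."""
    m0 = trapz(c, t)                        # total mass under the curve
    m1 = trapz(t * c, t) / m0               # mean arrival time
    m2c = trapz((t - m1) ** 2 * c, t) / m0  # variance (temporal spreading)
    return m0, m1, m2c

# Synthetic BTC: Gaussian pulse at t = 50 with sigma = 8 (illustrative only,
# not data from the experiments described above)
t = np.linspace(0.0, 200.0, 2001)
c = np.exp(-((t - 50.0) ** 2) / (2.0 * 8.0 ** 2))
m0, m1, m2c = temporal_moments(t, c)
# Entrapped NAPL would be expected to shift m1 (retardation) and
# inflate m2c (tailing) relative to the pre-spill baseline.
```

Comparing these moments before and after a spill, as the study does, isolates the entrapment signal from the heterogeneity-driven variability.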

  8. Potential of Laboratory Execution Systems (LESs) to Simplify the Application of Business Process Management Systems (BPMSs) in Laboratory Automation.

    PubMed

    Neubert, Sebastian; Göde, Bernd; Gu, Xiangyu; Stoll, Norbert; Thurow, Kerstin

    2017-04-01

    Modern business process management (BPM) is increasingly interesting for laboratory automation. End-to-end workflow automation and improved top-level systems integration for information technology (IT) and automation systems are especially prominent objectives. With the ISO Standard Business Process Model and Notation (BPMN) 2.X, a system-independent and interdisciplinarily accepted graphical process control notation is provided, allowing process analysis, while also being executable. The transfer of BPM solutions to structured laboratory automation places novel demands, for example, concerning the real-time-critical process and systems integration. The article discusses the potential of laboratory execution systems (LESs) for an easier implementation of the business process management system (BPMS) in hierarchical laboratory automation. In particular, complex application scenarios, including long process chains based on, for example, several distributed automation islands and mobile laboratory robots for material transport, are difficult to handle in BPMSs. The presented approach deals with the displacement of workflow control tasks into life science specialized LESs, the reduction of numerous different interfaces between BPMSs and subsystems, and the simplification of complex process models. Thus, the integration effort for complex laboratory workflows can be significantly reduced for strictly structured automation solutions. An example application, consisting of a mixture of manual and automated subprocesses, is demonstrated by the presented BPMS-LES approach.
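The delegation idea described here, keeping real-time device control inside a life-science-specific LES so the top-level BPMS only dispatches coarse workflow steps, can be sketched schematically. All class, island, and step names below are hypothetical illustrations, not from the article:

```python
class LabExecutionSystem:
    """Hypothetical LES that executes a whole subprocess locally,
    hiding device-level interfaces from the top-level BPMS."""

    def __init__(self, name):
        self.name = name
        self.log = []

    def run_subprocess(self, steps):
        # Real-time-critical sequencing stays inside the LES; the BPMS
        # never sees individual device calls.
        for step in steps:
            self.log.append(f"{self.name}: {step}")
        return "done"

# The BPMS-level workflow holds one coarse task per automation island
# (plus transport), instead of dozens of device-level interfaces.
bpms_workflow = [
    ("island_A", ["prepare sample", "run assay"]),
    ("transport", ["move sample A -> B"]),   # e.g. a mobile lab robot
    ("island_B", ["measure", "archive data"]),
]

results = [LabExecutionSystem(name).run_subprocess(steps)
           for name, steps in bpms_workflow]
```

The point of the pattern is interface reduction: the BPMS-to-subsystem surface shrinks from one interface per device to one per LES.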

  9. False-belief understanding in 2.5-year-olds: Evidence from two novel verbal spontaneous-response tasks

    PubMed Central

    Scott, Rose M.; He, Zijing; Baillargeon, Renée; Cummins, Denise

    2011-01-01

    Recent research indicates that toddlers and infants succeed at various non-verbal spontaneous-response false-belief tasks; here we asked whether toddlers would also succeed at verbal spontaneous-response false-belief tasks that imposed significant linguistic demands. 2.5-year-olds were tested using two novel tasks: a preferential-looking task in which children listened to a false-belief story while looking at a picture book (with matching and non-matching pictures), and a violation-of-expectation task in which children watched an adult “Subject” answer (correctly or incorrectly) a standard false-belief question. Positive results were obtained with both tasks, despite their linguistic demands. These results (1) support the distinction between spontaneous- and elicited-response tasks by showing that toddlers succeed at verbal false-belief tasks that do not require them to answer direct questions about agents’ false beliefs, (2) reinforce claims of robust continuity in early false-belief understanding as assessed by spontaneous-response tasks, and (3) provide researchers with new experimental tasks for exploring early false-belief understanding in neurotypical and autistic populations. PMID:22356174

  10. False-belief understanding in 2.5-year-olds: evidence from two novel verbal spontaneous-response tasks.

    PubMed

    Scott, Rose M; He, Zijing; Baillargeon, Renée; Cummins, Denise

    2012-03-01

    Recent research indicates that toddlers and infants succeed at various non-verbal spontaneous-response false-belief tasks; here we asked whether toddlers would also succeed at verbal spontaneous-response false-belief tasks that imposed significant linguistic demands. We tested 2.5-year-olds using two novel tasks: a preferential-looking task in which children listened to a false-belief story while looking at a picture book (with matching and non-matching pictures), and a violation-of-expectation task in which children watched an adult 'Subject' answer (correctly or incorrectly) a standard false-belief question. Positive results were obtained with both tasks, despite their linguistic demands. These results (1) support the distinction between spontaneous- and elicited-response tasks by showing that toddlers succeed at verbal false-belief tasks that do not require them to answer direct questions about agents' false beliefs, (2) reinforce claims of robust continuity in early false-belief understanding as assessed by spontaneous-response tasks, and (3) provide researchers with new experimental tasks for exploring early false-belief understanding in neurotypical and autistic populations. © 2011 Blackwell Publishing Ltd.

  11. Chlor-Alkali Industry: A Laboratory Scale Approach

    ERIC Educational Resources Information Center

    Sanchez-Sanchez, C. M.; Exposito, E.; Frias-Ferrer, A.; Gonzalez-Garaia, J.; Monthiel, V.; Aldaz, A.

    2004-01-01

    A laboratory experiment for students in the last year of a degree program in chemical engineering, chemistry, or industrial chemistry is presented. It models the chlor-alkali process, one of the most important industrial applications of electrochemical technology and the second-largest industrial consumer of electricity after the aluminium industry.

  12. 2. View, structures in Systems Integration Laboratory complex, looking north. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. View, structures in Systems Integration Laboratory complex, looking north. The Components Test Laboratory (T-27) is located in the immediate foreground. Immediately uphill to the left of T-27 is the Boiler Chiller Plant (T-28H). To the left of T-28H is the Oxidizer Conditioning Structure (T-28D). Behind the T-28D is the Long-Term Oxidizer Silo (T-28B). The twin gantry structure at the left is the Systems Integration Laboratory (T-28). - Air Force Plant PJKS, Systems Integration Laboratory, Waterton Canyon Road & Colorado Highway 121, Lakewood, Jefferson County, CO

  13. A task-irrelevant stimulus attribute affects perception and short-term memory

    PubMed Central

    Huang, Jie; Kahana, Michael J.; Sekuler, Robert

    2010-01-01

    Selective attention protects cognition against intrusions of task-irrelevant stimulus attributes. This protective function was tested in coordinated psychophysical and memory experiments. Stimuli were superimposed, horizontally and vertically oriented gratings of varying spatial frequency; only one orientation was task relevant. Experiment 1 demonstrated that a task-irrelevant spatial frequency interfered with visual discrimination of the task-relevant spatial frequency. Experiment 2 adopted a two-item Sternberg task, using stimuli that had been scaled to neutralize interference at the level of vision. Despite being visually neutralized, the task-irrelevant attribute strongly influenced recognition accuracy and associated reaction times (RTs). This effect was sharply tuned, with the task-irrelevant spatial frequency having an impact only when the task-relevant spatial frequencies of the probe and study items were highly similar to one another. Model-based analyses of judgment accuracy and RT distributional properties converged on the point that the irrelevant orientation operates at an early stage in memory processing, not at a later one that supports decision making. PMID:19933454

  14. Fate of Salmonella Typhimurium in laboratory-scale drinking water biofilms.

    PubMed

    Schaefer, L M; Brözel, V S; Venter, S N

    2013-12-01

    Investigations were carried out to evaluate and quantify colonization of laboratory-scale drinking water biofilms by a chromosomally green fluorescent protein (gfp)-tagged strain of Salmonella Typhimurium. Gfp encodes the green fluorescent protein and thus allows in situ detection of undisturbed cells and is ideally suited for monitoring Salmonella in biofilms. The fate and persistence of non-typhoidal Salmonella in simulated drinking water biofilms was investigated. The ability of Salmonella to form biofilms in monoculture and the fate and persistence of Salmonella in a mixed aquatic biofilm was examined. In monoculture S. Typhimurium formed loosely structured biofilms. Salmonella colonized established multi-species drinking water biofilms within 24 hours, forming micro-colonies within the biofilm. S. Typhimurium was also released at high levels from the drinking water-associated biofilm into the water passing through the system. This indicated that Salmonella could enter into, survive and grow within, and be released from a drinking water biofilm. The ability of Salmonella to survive and persist in a drinking water biofilm, and be released at high levels into the flow for recolonization elsewhere, indicates the potential for a persistent health risk to consumers once a network becomes contaminated with this bacterium.

  15. 16p11.2 Deletion mice display cognitive deficits in touchscreen learning and novelty recognition tasks.

    PubMed

    Yang, Mu; Lewis, Freeman C; Sarvi, Michael S; Foley, Gillian M; Crawley, Jacqueline N

    2015-12-01

    Chromosomal 16p11.2 deletion syndrome frequently presents with intellectual disabilities, speech delays, and autism. Here we investigated the Dolmetsch line of 16p11.2 heterozygous (+/-) mice on a range of cognitive tasks with different neuroanatomical substrates. Robust novel object recognition deficits were replicated in two cohorts of 16p11.2+/- mice, confirming previous findings. A similarly robust deficit in object location memory was discovered in +/-, indicating impaired spatial novelty recognition. Generalizability of novelty recognition deficits in +/- mice extended to preference for social novelty. Robust learning deficits and cognitive inflexibility were detected using Bussey-Saksida touchscreen operant chambers. During acquisition of pairwise visual discrimination, +/- mice required significantly more training trials to reach criterion than wild-type littermates (+/+), and made more errors and correction errors than +/+. In the reversal phase, all +/+ reached criterion, whereas most +/- failed to reach criterion by the 30-d cutoff. Contextual and cued fear conditioning were normal in +/-. These cognitive phenotypes may be relevant to some aspects of cognitive impairments in humans with 16p11.2 deletion, and support the use of 16p11.2+/- mice as a model system for discovering treatments for cognitive impairments in 16p11.2 deletion syndrome. © 2015 Yang et al.; Published by Cold Spring Harbor Laboratory Press.

  16. Dystonia rating scales: critique and recommendations

    PubMed Central

    Albanese, Alberto; Sorbo, Francesca Del; Comella, Cynthia; Jinnah, H.A.; Mink, Jonathan W.; Post, Bart; Vidailhet, Marie; Volkmann, Jens; Warner, Thomas T.; Leentjens, Albert F.G.; Martinez-Martin, Pablo; Stebbins, Glenn T.; Goetz, Christopher G.; Schrag, Anette

    2014-01-01

    Background Many rating scales have been applied to the evaluation of dystonia, but only a few have been assessed for clinimetric properties. The Movement Disorders Society commissioned this task force to critique existing dystonia rating scales and place them in the clinical and clinimetric context. Methods A systematic literature review was conducted to identify rating scales that have either been validated or used in dystonia. Results Thirty-six potential scales were identified. Eight were excluded because they did not meet review criteria, leaving twenty-eight scales that were critiqued and rated by the task force. Seven scales were found to meet criteria to be “recommended”: the Blepharospasm Disability Index is recommended for rating blepharospasm; the Cervical Dystonia Impact Scale and the Toronto Western Spasmodic Torticollis Rating Scale for rating cervical dystonia; the Craniocervical Dystonia Questionnaire for blepharospasm and cervical dystonia; the Voice Handicap Index (VHI) and the Vocal Performance Questionnaire (VPQ) for laryngeal dystonia; and the Fahn-Marsden Dystonia Rating Scale for rating generalized dystonia. Two “recommended” scales (VHI and VPQ) are generic scales validated on few patients with laryngeal dystonia, whereas the others are disease-specific scales. Twelve scales met criteria for “suggested” and seven scales met criteria for “listed”. All the scales are individually reviewed in the online appendix. Conclusion The task force recommends five specific dystonia scales and suggests further validation in dystonia of the two recommended generic voice-disorder scales. Existing scales for oromandibular, arm, and task-specific dystonia should be refined and fully assessed. Scales should be developed for body regions where no scales are available, such as the lower limbs and trunk. PMID:23893443

  17. Task switching in a hierarchical task structure: evidence for the fragility of the task repetition benefit.

    PubMed

    Lien, Mei-Ching; Ruthruff, Eric

    2004-05-01

    This study examined how task switching is affected by hierarchical task organization. Traditional task-switching studies, which use a constant temporal and spatial distance between each task element (defined as a stimulus requiring a response), promote a flat task structure. Using this approach, Experiment 1 revealed a large switch cost of 238 ms. In Experiments 2-5, adjacent task elements were grouped temporally and/or spatially (forming an ensemble) to create a hierarchical task organization. Results indicate that the effect of switching at the ensemble level dominated the effect of switching at the element level. Experiments 6 and 7, using an ensemble of 3 task elements, revealed that the element-level switch cost was virtually absent between ensembles but was large within an ensemble. The authors conclude that the element-level task repetition benefit is fragile and can be eliminated in a hierarchical task organization.

  18. Task switching in a hierarchical task structure: evidence for the fragility of the task repetition benefit

    NASA Technical Reports Server (NTRS)

    Lien, Mei-Ching; Ruthruff, Eric

    2004-01-01

    This study examined how task switching is affected by hierarchical task organization. Traditional task-switching studies, which use a constant temporal and spatial distance between each task element (defined as a stimulus requiring a response), promote a flat task structure. Using this approach, Experiment 1 revealed a large switch cost of 238 ms. In Experiments 2-5, adjacent task elements were grouped temporally and/or spatially (forming an ensemble) to create a hierarchical task organization. Results indicate that the effect of switching at the ensemble level dominated the effect of switching at the element level. Experiments 6 and 7, using an ensemble of 3 task elements, revealed that the element-level switch cost was virtually absent between ensembles but was large within an ensemble. The authors conclude that the element-level task repetition benefit is fragile and can be eliminated in a hierarchical task organization.

  19. Classes in the Balance: Latent Class Analysis and the Balance Scale Task

    ERIC Educational Resources Information Center

    Boom, Jan; ter Laak, Jan

    2007-01-01

    Latent class analysis (LCA) has been successfully applied to tasks measuring higher cognitive functioning, suggesting the existence of distinct strategies used in such tasks. With LCA it became possible to classify post hoc. This important step forward in modeling and analyzing cognitive strategies is relevant to the overlapping waves model for…

  20. Integrating human and machine intelligence in galaxy morphology classification tasks

    NASA Astrophysics Data System (ADS)

    Beck, Melanie R.; Scarlata, Claudia; Fortson, Lucy F.; Lintott, Chris J.; Simmons, B. D.; Galloway, Melanie A.; Willett, Kyle W.; Dickinson, Hugh; Masters, Karen L.; Marshall, Philip J.; Wright, Darryl

    2018-06-01

    Quantifying galaxy morphology is a challenging yet scientifically rewarding task. As the scale of data continues to increase with upcoming surveys, traditional classification methods will struggle to handle the load. We present a solution through an integration of visual and automated classifications, preserving the best features of both human and machine. We demonstrate the effectiveness of such a system through a re-analysis of visual galaxy morphology classifications collected during the Galaxy Zoo 2 (GZ2) project. We reprocess the top-level question of the GZ2 decision tree with a Bayesian classification aggregation algorithm dubbed SWAP, originally developed for the Space Warps gravitational lens project. Through a simple binary classification scheme, we increase the classification rate nearly 5-fold, classifying 226 124 galaxies in 92 d of GZ2 project time while reproducing labels derived from GZ2 classification data with 95.7 per cent accuracy. We next combine this with a Random Forest machine learning algorithm that learns on a suite of non-parametric morphology indicators widely used for automated morphologies. We develop a decision engine that delegates tasks between human and machine and demonstrate that the combined system provides at least a factor of 8 increase in the classification rate, classifying 210 803 galaxies in just 32 d of GZ2 project time with 93.1 per cent accuracy. As the Random Forest algorithm requires a minimal amount of computational cost, this result has important implications for galaxy morphology identification tasks in the era of Euclid and other large-scale surveys.
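The SWAP-style aggregation described above amounts to repeated Bayesian updating of each galaxy's label probability from volunteer votes, with each volunteer characterized by an estimated confusion matrix. A minimal sketch with invented skill values and labels (the actual SWAP priors, thresholds, and confusion matrices are project-specific):

```python
def update_posterior(prior, vote, skill):
    """One Bayesian update of P(featured) given a binary vote.

    skill = (P(votes 'featured' | truly featured),
             P(votes 'smooth'   | truly smooth)),
    an illustrative stand-in for a volunteer's confusion matrix.
    """
    p_f, p_s = skill
    if vote == "featured":
        like_true, like_false = p_f, 1.0 - p_s
    else:
        like_true, like_false = 1.0 - p_f, p_s
    num = like_true * prior
    return num / (num + like_false * (1.0 - prior))

# Three agreeing votes from moderately skilled volunteers
p = 0.5  # uninformative prior on 'featured'
for vote, skill in [("featured", (0.8, 0.8))] * 3:
    p = update_posterior(p, vote, skill)
# p climbs toward 1; thresholding p retires confident galaxies early,
# which is the source of the classification-rate speed-up.
```

A decision engine like the one in the paper would then route galaxies that remain uncertain after such updates to either more volunteers or the machine classifier.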

  1. Scale-up considerations for surface collecting agent assisted in-situ burn crude oil spill response experiments in the Arctic: Laboratory to field-scale investigations.

    PubMed

    Bullock, Robin J; Aggarwal, Srijan; Perkins, Robert A; Schnabel, William

    2017-04-01

    In the event of a marine oil spill in the Arctic, government agencies, industry, and the public have a stake in the successful implementation of oil spill response. Because large spills are rare events, oil spill response techniques are often evaluated with laboratory and meso-scale experiments. The experiments must yield scalable information sufficient to understand the operability and effectiveness of a response technique under actual field conditions. Since in-situ burning augmented with surface collecting agents ("herders") is one of the few viable response options in ice-infested waters, a series of oil spill response experiments were conducted in Fairbanks, Alaska, in 2014 and 2015 to evaluate the use of herders to assist in-situ burning and the role of experimental scale. This study compares burn efficiency and herder application for three experimental designs for in-situ burning of Alaska North Slope crude oil in cold, fresh waters with ∼10% ice cover. The experiments were conducted in three project-specific constructed venues with varying scales (surface areas of approximately 0.09 square meters, 9 square meters and 8100 square meters). The results from the herder assisted in-situ burn experiments performed at these three different scales showed good experimental scale correlation and no negative impact due to the presence of ice cover on burn efficiency. Experimental conclusions are predominantly associated with application of the herder material and usability for a given experiment scale to make response decisions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Effect of task load and task load increment on performance and workload

    NASA Technical Reports Server (NTRS)

    Hancock, P. A.; Williams, G.

    1993-01-01

    The goal of adaptive automated task allocation is the 'seamless' transfer of work demand between human and machine. Clearly, at the present time, we are far from this objective. One of the barriers to achieving effortless human-machine symbiosis is an inadequate understanding of the way in which operators themselves seek to reallocate demand among their own personal 'resources.' The paper addresses this through an examination of workload response, which scales an individual's reaction to common levels of experienced external demand. The results indicate that the primary driver of performance is the absolute level of task demand rather than the increment in that demand.

  3. H2@Scale Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruth, Mark

    2017-07-12

    'H2@Scale' is a concept based on the opportunity for hydrogen to act as an intermediate between energy sources and uses. Hydrogen has the potential to be used like the primary intermediate in use today, electricity, because it too is fungible. This presentation summarizes the H2@Scale analysis efforts performed during the first third of 2017. Results of technical potential uses and supply options are summarized and show that the technical potential demand for hydrogen is 60 million metric tons per year and that the U.S. has sufficient domestic resources to meet that demand. A high-level infrastructure analysis is also presented that shows an 85% increase in energy on the grid if all hydrogen is produced from grid electricity. However, a preliminary spatial assessment shows that supply is sufficient in most counties across the U.S. The presentation also shows plans for analysis of the economic potential for the H2@Scale concept. Those plans involve developing supply and demand curves for potential hydrogen generation options and comparing them to other options for use of that hydrogen.

  4. On the Origins of the Task Mixing Cost in the Cuing Task-Switching Paradigm

    ERIC Educational Resources Information Center

    Rubin, Orit; Meiran, Nachshon

    2005-01-01

    Poorer performance in conditions involving task repetition within blocks of mixed tasks relative to task repetition within blocks of single task is called mixing cost (MC). In 2 experiments exploring 2 hypotheses regarding the origins of MC, participants either switched between cued shape and color tasks, or they performed them as single tasks.…

  5. Developing a Measurement for Task Complexity in Flight.

    PubMed

    Zheng, Yiyuan; Lu, Yanyu; Wang, Zhen; Huang, Dan; Fu, Shan

    2015-08-01

    Task complexity is regarded as an essential metric that is related to a pilot's performance and workload. Normally, pilots follow Standard Operating Procedures (SOPs) during a flight. In this study, we developed a measurement named Task Complexity in Flight (TCIF) to represent the task complexity in the SOPs. The TCIF measurement combined four complexity components into one index: actions logic complexity (ALC), actions size complexity (ASC), information control exchange complexity (ICEC), and control mode complexity (CMC). To verify the measurement, we calculated TCIF for 11 tasks during the takeoff and landing phases from the SOPs, and invited 10 pilots to perform the same tasks in a flight simulator. After the flights, the TCIF results were compared with two workload measurements: the Bedford scale and heart rate. TCIF values and the four component scores were calculated for the 11 tasks. The TCIF results showed a significant correlation with the Bedford scores (R=0.851) and were also consistent with the difference in heart rate (R=0.816): as TCIF increased, both the Bedford scores and the difference in heart rate increased. TCIF was proposed based on flight operating conditions. Although additional studies of TCIF are necessary, the results of this study suggest this measurement could effectively indicate task complexity in flight, and could also be used to guide pilot training and task allocation on the flight deck.
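The abstract does not state how the four components are folded into the single TCIF index. Purely as an illustration, a weighted-sum composite is one common way to build such an index; the weights and component values below are placeholders, not the paper's:

```python
def composite_index(alc, asc, icec, cmc, weights=(0.25, 0.25, 0.25, 0.25)):
    """Hypothetical weighted-sum composite of four complexity components.

    The actual TCIF combination rule is not specified in the abstract;
    the equal weights here are illustrative placeholders only.
    """
    components = (alc, asc, icec, cmc)
    return sum(w * x for w, x in zip(weights, components))

# Example: a task whose information-exchange complexity dominates
tcif = composite_index(alc=3.0, asc=2.0, icec=5.0, cmc=1.0)
```

In practice the weights for an index like this would be fit or validated against external workload measures, much as the paper validates TCIF against Bedford scores and heart rate.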

  6. Comparative assessment of laparoscopic single-site surgery instruments to conventional laparoscopic in laboratory setting.

    PubMed

    Stolzenburg, Jens-Uwe; Kallidonis, Panagiotis; Oh, Min-A; Ghulam, Nabi; Do, Minh; Haefner, Tim; Dietel, Anja; Till, Holger; Sakellaropoulos, George; Liatsikos, Evangelos N

    2010-02-01

    Laparoendoscopic single-site surgery (LESS) represents the latest innovation in laparoscopic surgery. We compared, in dry-lab and animal-lab settings, the efficacy of recently introduced pre-bent instruments with conventional laparoscopic and flexible instruments in terms of time requirement, maneuverability, and ease of handling. Participants of varying laparoscopic experience were included in the study and divided into groups according to their experience. The participants performed predetermined tasks in the dry laboratory using all sets of instruments. An experienced laparoscopic surgeon performed 24 nephrectomies in 12 pigs using all sets of instruments. A single port was used for all instrument sets except for the conventional instruments, which were inserted through three ports. The time required for the performance of the dry laboratory tasks and the porcine nephrectomies was recorded. Errors in the performance of the dry laboratory tasks were also recorded for each instrument type. Pre-bent instruments had a significant advantage over flexible instruments in terms of the time required to accomplish tasks and procedures, as well as maneuverability. Flexible instruments were more time consuming than the conventional laparoscopic instruments during the performance of the tasks. There were no significant differences in the time required for the accomplishment of dry laboratory tasks or steps of nephrectomy using conventional instruments through the appropriate number of ports in comparison to pre-bent instruments through a single port. Pre-bent instruments were less time consuming and offered better maneuverability than flexible instruments in experimental single-port access surgery. Further clinical investigations would elucidate the efficacy of pre-bent instruments.

  7. The Rat Grimace Scale: A partially automated method for quantifying pain in the laboratory rat via facial expressions

    PubMed Central

    2011-01-01

    We recently demonstrated the utility of quantifying spontaneous pain in mice via the blinded coding of facial expressions. As the majority of preclinical pain research is in fact performed in the laboratory rat, we attempted to modify the scale for use in this species. We present herein the Rat Grimace Scale, and show its reliability, accuracy, and ability to quantify the time course of spontaneous pain in the intraplantar complete Freund's adjuvant, intraarticular kaolin-carrageenan, and laparotomy (post-operative pain) assays. The scale's ability to demonstrate the dose-dependent analgesic efficacy of morphine is also shown. In addition, we have developed software, Rodent Face Finder®, which successfully automates the most labor-intensive step in the process. Given the known mechanistic dissociations between spontaneous and evoked pain, and the primacy of the former as a clinical problem, we believe that widespread adoption of spontaneous pain measures such as the Rat Grimace Scale might lead to more successful translation of basic science findings into clinical application. PMID:21801409

  8. Field Testing of a Wet FGD Additive for Enhanced Mercury Control - Pilot-Scale Test Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gary M. Blythe

    2006-03-01

    This Topical Report summarizes progress on Cooperative Agreement DE-FC26-04NT42309, ''Field Testing of a Wet FGD Additive.'' The objective of the project is to demonstrate the use of a flue gas desulfurization (FGD) additive, Degussa Corporation's TMT-15, to prevent the reemission of elemental mercury (Hg{sup 0}) in flue gas exiting wet FGD systems on coal-fired boilers. Furthermore, the project intends to demonstrate that the additive can be used to precipitate most of the mercury (Hg) removed in the wet FGD system as a fine TMT salt that can be separated from the FGD liquor and bulk solid byproducts for separate disposal. The project will conduct pilot- and full-scale tests of the TMT-15 additive in wet FGD absorbers. The tests are intended to determine the additive dosage required to prevent Hg{sup 0} reemissions and to separate mercury from the normal FGD byproducts for three coal types: Texas lignite/Powder River Basin (PRB) coal blend, high-sulfur Eastern bituminous coal, and low-sulfur Eastern bituminous coal. The project team consists of URS Group, Inc., EPRI, TXU Generation Company LP, Southern Company, and Degussa Corporation. TXU Generation has provided the Texas lignite/PRB co-fired test site for pilot FGD tests, Monticello Steam Electric Station Unit 3. Southern Company is providing the low-sulfur Eastern bituminous coal host site for wet scrubbing tests, as well as the pilot- and full-scale jet bubbling reactor (JBR) FGD systems to be tested. A third utility, to be named later, will provide the high-sulfur Eastern bituminous coal full-scale FGD test site. Degussa Corporation is providing the TMT-15 additive and technical support to the test program. The project is being conducted in six tasks. Task 1 involves project planning and Task 6 involves management and reporting. The other four tasks involve field testing on FGD systems, at either pilot or full scale. The four tasks include: Task 2 - Pilot Additive

  9. Phosphatidylinositol (4,5)-bisphosphate dynamically regulates the K2P background K+ channel TASK-2

    PubMed Central

    Niemeyer, María Isabel; Cid, L. Pablo; Paulais, Marc; Teulon, Jacques; Sepúlveda, Francisco V.

    2017-01-01

    Two-pore domain K2P K+ channels, responsible for the background K+ conductance and the resting membrane potential, are also finely regulated by a variety of chemical, physical and physiological stimuli. Hormones and transmitters acting through Gq protein-coupled receptors (GqPCRs) modulate the activity of various K2P channels, but the signalling involved has remained elusive; in particular, whether dynamic regulation by membrane PI(4,5)P2, common among other classes of K+ channels, affects K2P channels is controversial. Here we show that the K2P K+ channel TASK-2 requires PI(4,5)P2 for activity, a dependence that accounts for its rundown in the absence of intracellular ATP and its full recovery on addition of exogenous PI(4,5)P2, its inhibition by low concentrations of polycation PI scavengers, and its inhibition by PI(4,5)P2 depletion from the membrane. Comprehensive mutagenesis suggests that the PI(4,5)P2 interaction with TASK-2 takes place at the C-terminus, where three basic amino acids are identified as part of a putative binding site. PMID:28358046

  10. Phosphatidylinositol (4,5)-bisphosphate dynamically regulates the K2P background K+ channel TASK-2.

    PubMed

    Niemeyer, María Isabel; Cid, L Pablo; Paulais, Marc; Teulon, Jacques; Sepúlveda, Francisco V

    2017-03-30

    Two-pore domain K2P K+ channels, responsible for the background K+ conductance and the resting membrane potential, are also finely regulated by a variety of chemical, physical and physiological stimuli. Hormones and transmitters acting through Gq protein-coupled receptors (GqPCRs) modulate the activity of various K2P channels, but the signalling involved has remained elusive; in particular, whether dynamic regulation by membrane PI(4,5)P2, common among other classes of K+ channels, affects K2P channels is controversial. Here we show that the K2P K+ channel TASK-2 requires PI(4,5)P2 for activity, a dependence that accounts for its rundown in the absence of intracellular ATP and its full recovery on addition of exogenous PI(4,5)P2, its inhibition by low concentrations of polycation PI scavengers, and its inhibition by PI(4,5)P2 depletion from the membrane. Comprehensive mutagenesis suggests that the PI(4,5)P2 interaction with TASK-2 takes place at the C-terminus, where three basic amino acids are identified as part of a putative binding site.

  11. Investigating the dynamics of Vulcanian explosions using scaled laboratory experiments

    NASA Astrophysics Data System (ADS)

    Clarke, A. B.; Phillips, J. C.; Chojnicki, K. N.

    2005-12-01

    Laboratory experiments were conducted to investigate the dynamics of Vulcanian eruptions. A reservoir containing a mixture of water and methanol plus solid particles was pressurized and suddenly released via a rapid-release valve into a 2 ft by 2 ft by 4 ft plexiglass tank containing fresh water. Water and methanol created a light interstitial fluid to simulate buoyant volcanic gases in erupted mixtures. The duration of the subsequent experiments was not pre-determined, but instead was limited by the potential energy associated with the pressurized fluid, rather than by the volume of available fluid. Suspending liquid density was varied between 960 and 1000 kg m-3 by changing methanol concentrations from 5 to 20%. Particle size (4 & 45 microns) and concentration (1 to 5 vol%) were varied in order to change particle settling characteristics and control bulk mixture density. Variations in reservoir pressure and vent size allowed exploration of the controlling source parameters, buoyancy flux (Bo) and momentum flux (Mo). The velocity-height relationship of each experiment was documented by high-speed video, permitting classification of the laboratory flows, which ranged from long continuously accelerating jets, to starting plumes, to low-energy thermals, to collapsing fountains generating density currents. Field-documented Vulcanian explosions exhibit this same wide range of behavior (Self et al. 1979, Nature 277; Sparks & Wilson 1982, Geophys. J. R. astr. Soc. 69; Druitt et al. 2002, Geol. Soc. London, 21), demonstrating that flows obtained in the laboratory are relevant to natural systems. A generalized framework of results was defined as follows. 
Increasing Mo/Bo for small particles (4 microns; settling time > experiment duration) pushes the system from low-energy thermals toward high-energy, continuously accelerating jets; increasing Mo/Bo for large particles (>45 microns; settling time < experiment duration) pushes the system from a low collapsing fountain to a

  12. Productivity standards for histology laboratories.

    PubMed

    Buesa, René J

    2010-04-01

    The information from 221 US histology laboratories (histolabs) and 104 from 24 other countries with workloads from 600 to 116 000 cases per year was used to calculate productivity standards for 23 technical and 27 nontechnical tasks and for 4 types of work flow indicators. The sample includes 254 human, 40 forensic, and 31 veterinary pathology services. Statistical analyses demonstrate that most productivity standards are not different between services or worldwide. The total workload for the US human pathology histolabs averaged 26 061 cases per year, with 54% between 10 000 and less than 30 000. The total workload for 70% of the histolabs from other countries was less than 20 000, with an average of 15 226 cases per year. The fundamental manual technical tasks in the histolab and their productivity standards are as follows: grossing (14 cases per hour), cassetting (54 cassettes per hour), embedding (50 blocks per hour), and cutting (24 blocks per hour). All the other tasks, each with their own productivity standards, can be completed by auxiliary staff or using automatic instruments. Depending on the level of automation of the histolab, all the tasks derived from a workload of 25 cases will require 15.8 to 17.7 hours of work completed by 2.4 to 2.7 employees, with 18% of their working time not directly dedicated to the production of diagnostic slides. This article explains how to extrapolate this productivity calculation for any workload and different levels of automation. The overall performance standard for all the tasks, including 8 hours for automated tissue processing, is 3.2 to 3.5 blocks per hour, and its best indicator is the gross work flow productivity, which depends essentially on how the work is organized. This article also includes productivity standards for forensic and veterinary histolabs, but the staffing benchmarks for histolabs will be the subject of a separate article. Copyright 2010 Elsevier Inc. All rights reserved.
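    The extrapolation described above can be sketched as a small calculation built on the article's manual-task standards (grossing 14 cases/hour, cassetting 54 cassettes/hour, embedding 50 blocks/hour, cutting 24 blocks/hour). This sketch covers only those four manual tasks; the cassettes-per-case and blocks-per-case ratios are illustrative assumptions, not figures from the article.

    ```python
    # Productivity standards from the abstract: units produced per worked hour.
    STANDARDS = {
        "grossing":   14,   # cases/hour
        "cassetting": 54,   # cassettes/hour
        "embedding":  50,   # blocks/hour
        "cutting":    24,   # blocks/hour
    }

    def labor_hours(cases, cassettes_per_case=3.0, blocks_per_case=3.0):
        """Hours of manual technical work implied by a caseload.

        The per-case ratios are hypothetical example values, not data
        from the article.
        """
        units = {
            "grossing":   cases,
            "cassetting": cases * cassettes_per_case,
            "embedding":  cases * blocks_per_case,
            "cutting":    cases * blocks_per_case,
        }
        return {task: units[task] / rate for task, rate in STANDARDS.items()}

    hours = labor_hours(25)
    print({t: round(h, 2) for t, h in hours.items()})
    print("total manual hours:", round(sum(hours.values()), 2))
    ```

    With these assumed ratios, 25 cases imply roughly 7.8 hours across the four manual tasks alone; the article's 15.8 to 17.7 total hours additionally cover the remaining technical and nontechnical tasks.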

  13. Task Order 7. Use of activated carbon for treatment of explosives-contaminated groundwater at the Milan Army Ammunition Plant (MAAP). Final report, Apr 89-May 90

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dennis, R.M.; Wujcik, W.J.; Lowe, W.L.

    1990-05-01

    The primary objective of this task was to determine the feasibility of using GAC to treat ground water contaminated by explosives at the Milan Army Ammunition Plant (MAAP) in Milan, Tennessee. Laboratory GAC isotherm studies were conducted and two carbons, Atochem, Inc. GAC 830 and Calgon Filtrasorb 300, were selected for further testing in continuous flow GAC columns. Three pilot scale continuous flow GAC column tests were performed at MAAP using the two carbons selected from the laboratory GAC isotherm studies. The results from the laboratory and pilot studies are presented in this report. They show that concurrent removal of explosives such as TNT, RDX, HMX, Tetryl, and nitrobenzenes from ground water using continuous flow granular activated carbon is feasible.
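    Isotherm studies of the kind mentioned above are commonly reduced to Freundlich parameters, q = Kf * C**(1/n), via a log-log linear fit. The sketch below illustrates that reduction on synthetic data; the concentrations, loadings, and resulting parameters are invented for the example and are not the report's results.

    ```python
    import numpy as np

    # Synthetic equilibrium data (NOT from the report): aqueous
    # concentration C (mg/L) and carbon loading q (mg/g) generated
    # from an assumed Freundlich isotherm q = Kf * C**(1/n).
    C = np.array([0.05, 0.1, 0.5, 1.0, 5.0])
    q = 30.0 * C ** (1 / 2.5)

    # Linearize: log10(q) = log10(Kf) + (1/n) * log10(C),
    # then recover Kf and n from the fitted slope and intercept.
    slope, intercept = np.polyfit(np.log10(C), np.log10(q), 1)
    n = 1 / slope
    Kf = 10 ** intercept
    print("Kf =", round(Kf, 1), " n =", round(n, 2))
    ```

    In an actual study the fitted Kf and n would then be compared between candidate carbons (here, the two carbons named in the abstract) to guide selection for column testing.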

  14. Competency assessment of microbiology medical laboratory technologists in Ontario, Canada.

    PubMed

    Desjardins, Marc; Fleming, Christine Ann

    2014-08-01

    Accreditation in Ontario, Canada, requires that licensed clinical laboratories participate in external quality assessment (also known as proficiency testing) and perform competency evaluation of their staff. To assess the extent of ongoing competency assessment practices, the Quality Management Program--Laboratory Services (QMP-LS) Microbiology Committee surveyed all 112 licensed Ontario microbiology laboratories. The questionnaire consisted of a total of 21 questions that included yes/no, multiple-choice, and short-answer formats. Participants were asked to provide information about existing programs, the frequency of testing, what areas are evaluated, and how results are communicated to the staff. Of the 111 responding laboratories, 6 indicated they did not have a formal evaluation program since they perform only limited bacteriology testing. Of the remaining 105 respondents, 87% perform evaluations at least annually or every 2 years, and 61% include any test or task performed, whereas 16% and 10% focus only on problem areas and high-volume complex tasks, respectively. The most common methods of evaluation were review of external quality assessment (EQA) challenges, direct observation, and worksheet review. With the exception of one participant, all communicate results to staff, and most take remedial action to correct the deficiencies. Although most accredited laboratories have a program to assess the ongoing competency of their staff, the methods used are not standardized or consistently applied, indicating that there is room for improvement. The survey successfully highlighted potential areas for improvement and allowed the QMP-LS Microbiology Committee to provide guidance to Ontario laboratories for establishing or improving existing microbiology-specific competency assessment programs. Copyright © 2014, American Society for Microbiology. All Rights Reserved.

  15. An Information Theoretic Model for the Human Processing of Cognitive Tasks.

    ERIC Educational Resources Information Center

    Moser, Gene W.

    An information-theory model of human memory was tested in thirteen experiments which involved children (six years and older) and graduate students. The subjects conducted science investigations in laboratory and non-laboratory settings, solved problems of electrical circuits, and participated in classroom science lessons. The tasks used involved…

  16. Extensions of a Basic Laboratory Experiment: [4+2] and [2+2] Cycloadditions

    ERIC Educational Resources Information Center

    Amarne, Hazem Y.; Bain, Alex D.; Neumann, Karen; Zelisko, Paul M.

    2008-01-01

    We describe an extended third-year undergraduate chemistry laboratory exercise in which a number of techniques and concepts are applied to the same set of chemical reactions. The reactions are the photochemical and thermal cycloadditions of [beta]-nitrostyrene and 2,3-dimethylbutadiene. This can be viewed as a single long lab or a series of…

  17. Fundamental Research on Percussion Drilling: Improved rock mechanics analysis, advanced simulation technology, and full-scale laboratory investigations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael S. Bruno

    This report summarizes the research efforts on the DOE-supported research project Percussion Drilling (DE-FC26-03NT41999), which aims to significantly advance the fundamental understanding of the physical mechanisms involved in combined percussion and rotary drilling, and thereby facilitate more efficient and lower-cost drilling and exploration of hard-rock reservoirs. The project has been divided into multiple tasks: literature reviews, analytical and numerical modeling, full-scale laboratory testing and model validation, and final report delivery. The literature reviews document the history, pros and cons, and rock-failure physics of percussion drilling in the oil and gas industries. Based on the current understanding, a conceptual drilling model is proposed for the modeling efforts. Both analytical and numerical approaches are deployed to investigate drilling processes such as drillbit penetration with compression, rotation and percussion; rock response with stress propagation, damage accumulation and failure; and debris transport inside the annulus after disintegration from the rock. For rock mechanics modeling, a dynamic numerical tool has been developed to describe rock damage and failure, including rock crushing by compressive bit load, rock fracturing by both shearing and tensile forces, and rock weakening by repetitive compression-tension loading. Besides multiple failure criteria, the tool also includes a damping algorithm to dissipate oscillation energy and a fatigue/damage algorithm to update rock properties during each impact. From the model, rate of penetration (ROP) and rock failure history can be estimated. For cuttings transport in the annulus, a 3D numerical particle-flow model has been developed with the aid of analytical approaches. The tool can simulate cuttings movement at particle scale under laminar or turbulent fluid flow conditions and evaluate the efficiency of cuttings removal. 
To calibrate the modeling efforts, a series of full-scale fluid hammer

  18. 2. Exterior view of Components Test Laboratory (T27), looking southeast. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. Exterior view of Components Test Laboratory (T-27), looking southeast. The building wing on the left houses the equipment room and that on the right houses Test Cell 8 (oxidizer) and the oxidizer storage pit or vault. - Air Force Plant PJKS, Systems Integration Laboratory, Components Test Laboratory, Waterton Canyon Road & Colorado Highway 121, Lakewood, Jefferson County, CO

  19. Spatial Scaling of the Profile of Selective Attention in the Visual Field.

    PubMed

    Gannon, Matthew A; Knapp, Ashley A; Adams, Thomas G; Long, Stephanie M; Parks, Nathan A

    2016-01-01

    Neural mechanisms of selective attention must be capable of adapting to variation in the absolute size of an attended stimulus in the ever-changing visual environment. To date, little is known regarding how attentional selection interacts with fluctuations in the spatial expanse of an attended object. Here, we use event-related potentials (ERPs) to investigate the scaling of attentional enhancement and suppression across the visual field. We measured ERPs while participants performed a task at fixation that varied in its attentional demands (attentional load) and visual angle (1.0° or 2.5°). Observers were presented with a stream of task-relevant stimuli while foveal, parafoveal, and peripheral visual locations were probed by irrelevant distractor stimuli. We found two important effects in the N1 component of visual ERPs. First, N1 modulations to task-relevant stimuli indexed attentional selection of stimuli during the load task and further correlated with task performance. Second, with increased task size, attentional modulation of the N1 to distractor stimuli showed a differential pattern that was consistent with a scaling of attentional selection. Together, these results demonstrate that the size of an attended stimulus scales the profile of attentional selection across the visual field, and provide insights into the attentional mechanisms associated with such spatial scaling.

  20. Study of Electron-scale Dissipation near the X-line During Magnetic Reconnection in a Laboratory Plasma

    NASA Astrophysics Data System (ADS)

    Ji, H.; Yoo, J.; Dorfman, S. E.; Jara-Almonte, J.; Yamada, M.; Swanson, C.; Daughton, W. S.; Roytershteyn, V.; Kuwahata, A.; Ii, T.; Inomoto, M.; Ono, Y.; von Stechow, A.; Grulke, O.; Phan, T.; Mozer, F.; Bale, S. D.

    2013-12-01

    Despite its disruptive influences on the large-scale structures of space and solar plasmas, the crucial topological changes and associated dissipation during magnetic reconnection take place only near an X-line within thin singular layers. In modern collisionless models, where electrons and ions are allowed to move separately, it has been predicted that ions exhaust efficiently through a thicker, ion-scale dissipative layer while mobile electrons can evacuate through a thinner, electron-scale dissipation layer, allowing for efficient release of magnetic energy. While ion dissipation layers have been frequently detected, the existence of electron layers near the X-line and the associated dissipation structures and mechanisms are still an open question, and will be a main subject of the upcoming MMS mission. In this presentation, we will summarize our efforts over the past few years to study electron-scale dissipation in a well-controlled and well-diagnosed reconnecting current sheet in a laboratory plasma, with close comparisons with state-of-the-art 2D and 3D fully kinetic simulations. Key results include: (1) positive identification of electromagnetic waves detected at the current sheet center as long-wavelength, lower-hybrid drift instabilities (EM-LHDI); (2) strong evidence, however, that this EM-LHDI cannot provide the required force to support the reconnection electric field; (3) detection of 3D flux-rope-like magnetic structures during impulsive reconnection events; and (4) electron heating through non-classical mechanisms near the X-line with a small but clear temperature anisotropy. These results, unfortunately, do not resolve the outstanding discrepancies on electron layer thickness between the best available experiments and fully kinetic simulations. To make further progress, we are continuing to push both the experimental and numerical frontiers. 
Experimentally, we started investigations on EM-LHDI and electron heating as a function

  1. The Effect of Form-Focussed Pre-Task Activities on Accuracy in L2 Production in an ESP Course in French Higher Education

    ERIC Educational Resources Information Center

    Starkey-Perret, Rebecca; Belan, Sophie; Lê Ngo, Thi Phuong; Rialland, Guillaume

    2017-01-01

    This chapter presents and discusses the results of a large-scale pilot study carried out in the context of a task-based, blended-learning Business English programme in the Foreign Languages and International Trade department of a French university. It seeks to explore the effects of pre-task planned Focus on Form (FonF) on accuracy in students'…

  2. Potential Electrokinetic Remediation Technologies of Laboratory Scale into Field Application- Methodology Overview

    NASA Astrophysics Data System (ADS)

    Ayuni Suied, Anis; Tajudin, Saiful Azhar Ahmad; Nizam Zakaria, Muhammad; Madun, Aziman

    2018-04-01

    Heavy metals in soil are a major contributor to soil contamination, which unbalances the ecosystem. There are many ways and procedures to make the electrokinetic remediation (EKR) method efficient, effective, and viable as a low-cost soil treatment. The electrode compartment for the electrolyte is expected to treat the contaminated soil through electromigration and to enhance the movement of metal ions. Electrokinetics is applicable in many forms, such as electrokinetic remediation (EKR), electrokinetic stabilization (EKS), electrokinetic bioremediation, and others. This paper presents a critical review comparing laboratory-scale EKR, EKS, and EK bioremediation treatments for removing heavy metal contaminants, and proposes a framework for contaminated-soil mapping. The Electrical Resistivity Method (ERM) is a well-established indirect geophysical tool for surface mapping and subsurface profiling; hence, ERM is used to map the migration of heavy metal ions driven by electrokinetics.

  3. Heart Rate Response During Mission-Critical Tasks After Space Flight

    NASA Technical Reports Server (NTRS)

    Arzeno, Natalia M.; Lee, S. M. C.; Stenger, M. B.; Lawrence, E. L.; Platts, S. H.; Bloomberg, J. J.

    2010-01-01

    Adaptation to microgravity could impair crewmembers' ability to perform required tasks upon entry into a gravity environment, such as return to Earth, or during extraterrestrial exploration. Historically, data have been collected in a controlled testing environment, but it is unclear whether these physiologic measures result in changes in functional performance. NASA's Functional Task Test (FTT) aims to investigate whether adaptation to microgravity increases physiologic stress and impairs performance during mission-critical tasks. PURPOSE: To determine whether the well-accepted postflight tachycardia observed during standard laboratory tests also would be observed during simulations of mission-critical tasks during and after recovery from short-duration spaceflight. METHODS: Five astronauts participated in the FTT 30 days before launch, on landing day, and 1, 6, and 30 days after landing. Mean heart rate (HR) was measured during 5 simulations of mission-critical tasks: rising from (1) a chair or (2) recumbent seated position followed by walking through an obstacle course (egress from a space vehicle), (3) translating graduated masses from one location to another (geological sample collection), (4) walking on a treadmill at 6.4 km/h (ambulation on planetary surface), and (5) climbing 40 steps on a passive treadmill ladder (ingress to lander). For tasks 1, 2, 3, and 5, astronauts were encouraged to complete the task as quickly as possible. Time to complete tasks and mean HR during each task were analyzed using repeated measures ANOVA and ANCOVA respectively, in which task duration was a covariate. RESULTS: Landing day HR was higher (P < 0.05) than preflight during the upright seat egress (7%+/-3), treadmill walk (13%+/-3) and ladder climb (10%+/-4), and HR remained elevated during the treadmill walk 1 day after landing. During tasks in which HR was not elevated on landing day, task duration was significantly greater on landing day (recumbent seat egress: 25

  4. Nonanalytic Laboratory Automation: A Quarter Century of Progress.

    PubMed

    Hawker, Charles D

    2017-06-01

    Clinical laboratory automation has blossomed since the 1989 AACC meeting, at which Dr. Masahide Sasaki first showed a western audience what his laboratory had implemented. Many diagnostics and other vendors are now offering a variety of automated options for laboratories of all sizes. Replacing manual processing and handling procedures with automation was embraced by the laboratory community because of the obvious benefits of labor savings and improvement in turnaround time and quality. Automation was also embraced by the diagnostics vendors, who saw automation as a means of incorporating the analyzers purchased by their customers into larger systems in which the benefits of automation were integrated with the analyzers. This report reviews the options that are available to laboratory customers. These options include so-called task-targeted automation: modules that range from single-function devices that automate single tasks (e.g., decapping or aliquoting) to multifunction workstations that incorporate several of the functions of a laboratory sample processing department. The options also include total laboratory automation systems that use conveyors to link sample processing functions to analyzers and often include postanalytical features such as refrigerated storage and sample retrieval. Most importantly, this report reviews a recommended process for evaluating the need for new automation, identifying the specific requirements of a laboratory, and developing solutions that can meet those requirements. The report also discusses some of the practical considerations facing a laboratory in a new implementation and reviews the concept of machine vision to replace human inspections. © 2017 American Association for Clinical Chemistry.

  5. Supplementing the Braden scale for pressure ulcer risk among medical inpatients: the contribution of self-reported symptoms and standard laboratory tests.

    PubMed

    Skogestad, Ingrid Johansen; Martinsen, Liv; Børsting, Tove Elisabet; Granheim, Tove Irene; Ludvigsen, Eirin Sigurdssøn; Gay, Caryl L; Lerdal, Anners

    2017-01-01

    To evaluate medical inpatients' symptom experience and selected laboratory blood results as indicators of their pressure ulcer risk as measured by the Braden scale. Pressure ulcers reduce quality of life and increase treatment costs. The prevalence of pressure ulcers is 6-23% in hospital populations, but literature suggests that most pressure ulcers are avoidable. Prospective, cross-sectional survey. Three hundred and twenty-eight patients admitted to medical wards in an acute hospital in Oslo, Norway consented to participate. Data were collected on 10 days between 2012 and 2014 by registered nurses and nursing students. Pressure ulcer risk was assessed using the Braden scale, and scores <19 indicated pressure ulcer risk. Skin examinations were categorised as normal or stages I-IV using established definitions. Comorbidities were collected by self-report. Self-reported symptom occurrence and distress were measured with 15 items from the Memorial Symptom Assessment Scale, and pain was assessed using two numeric rating scales. Admission laboratory data were collected from medical records. Prevalence of pressure ulcers was 11.9%, and 20.4% of patients were identified as being at risk for developing pressure ulcers. Multivariable analysis showed that pressure ulcer risk was positively associated with age ≥80 years, vomiting, severe pain at rest, urination problems, shortness of breath and low albumin and was negatively associated with nervousness. Our study indicates that using patient-reported symptoms and standard laboratory results as supplemental indicators of pressure ulcer risk may improve identification of vulnerable patients, but replication of these findings in other study samples is needed. Nurses play a key role in preventing pressure ulcers during hospitalisation. A better understanding of the underlying mechanisms may improve the quality of care. Knowledge about symptoms associated with pressure ulcer risk may contribute to a faster clinical judgment of

  6. Assessing dynamics, spatial scale, and uncertainty in task-related brain network analyses.

    PubMed

    Stephen, Emily P; Lepage, Kyle Q; Eden, Uri T; Brunner, Peter; Schalk, Gerwin; Brumberg, Jonathan S; Guenther, Frank H; Kramer, Mark A

    2014-01-01

    The brain is a complex network of interconnected elements, whose interactions evolve dynamically in time to cooperatively perform specific functions. A common technique to probe these interactions involves multi-sensor recordings of brain activity during a repeated task. Many techniques exist to characterize the resulting task-related activity, including establishing functional networks, which represent the statistical associations between brain areas. Although functional network inference is commonly employed to analyze neural time series data, techniques to assess the uncertainty (both in the functional network edges and in the corresponding aggregate measures of network topology) are lacking. To address this, we describe a statistically principled approach for computing uncertainty in functional networks and aggregate network measures in task-related data. The approach is based on a resampling procedure that utilizes the trial structure common in experimental recordings. We show in simulations that this approach successfully identifies functional networks and associated measures of confidence emergent during a task in a variety of scenarios, including dynamically evolving networks. In addition, we describe a principled technique for establishing functional networks based on predetermined regions of interest using canonical correlation. Doing so provides additional robustness to the functional network inference. Finally, we illustrate the use of these methods on example invasive brain voltage recordings collected during an overt speech task. The general strategy described here, appropriate for static and dynamic network inference and different statistical measures of coupling, permits the evaluation of confidence in network measures in a variety of settings common to neuroscience.
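    The trial-resampling idea can be illustrated with a minimal sketch: resample trials with replacement, re-infer a network on each resample, and form a confidence interval on an aggregate measure such as network density. The thresholded-correlation network and all parameter values here are illustrative assumptions, not the authors' exact estimator.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def network_from_trials(trials, threshold=0.2):
        """Toy functional network: threshold the mean across-trial
        sensor-by-sensor correlation. trials: (n_trials, n_sensors, n_time)."""
        n_trials, n_sensors, _ = trials.shape
        corr = np.zeros((n_sensors, n_sensors))
        for tr in trials:                 # np.corrcoef treats rows as variables
            corr += np.corrcoef(tr)
        corr /= n_trials
        adj = (np.abs(corr) > threshold).astype(float)
        np.fill_diagonal(adj, 0)
        return adj

    def bootstrap_density(trials, n_boot=200):
        """Resample trials with replacement; return a 95% bootstrap
        interval for network density (an aggregate topology measure)."""
        n_trials = trials.shape[0]
        dens = []
        for _ in range(n_boot):
            idx = rng.integers(0, n_trials, n_trials)
            adj = network_from_trials(trials[idx])
            k = adj.shape[0]
            dens.append(adj.sum() / (k * (k - 1)))
        return np.percentile(dens, [2.5, 97.5])

    # Synthetic data: 40 trials, 8 sensors, 100 samples; sensors 0 and 1 coupled.
    trials = rng.standard_normal((40, 8, 100))
    trials[:, 1] += 0.8 * trials[:, 0]
    lo, hi = bootstrap_density(trials)
    print("density 95% CI:", lo, hi)
    ```

    The same resampling loop can wrap any edge statistic or topology measure; only `network_from_trials` needs replacing with the coupling measure of interest.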

  7. Developing Metacognitive Behaviour in Physical Education Classes: The Use of Task-Pertinent Learning Strategies

    ERIC Educational Resources Information Center

    Lidor, Ronnie

    2004-01-01

    Research in motor learning and sport pedagogy has shown that task-pertinent learning strategies enhance the learning and performance of self-paced motor tasks. Strategy research has typically been conducted under laboratory conditions in which artificial self-paced tasks were executed under well-controlled conditions. The purpose of this study was…

  8. Dual Arm Work Package performance estimates and telerobot task network simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Draper, J.V.; Blair, L.M.

    1997-02-01

    This paper describes the methodology and results of a network simulation study of the Dual Arm Work Package (DAWP), to be employed for dismantling the Argonne National Laboratory CP-5 reactor. The development of the simulation model was based upon the results of a task analysis for the same system. This study was performed by the Oak Ridge National Laboratory (ORNL), in the Robotics and Process Systems Division. Funding was provided by the US Department of Energy's Office of Technology Development, Robotics Technology Development Program (RTDP). The RTDP is developing methods of computer simulation to estimate telerobotic system performance. Data were collected to provide point estimates to be used in a task network simulation model. Three skilled operators performed six repetitions of a pipe cutting task representative of typical teleoperation cutting operations.

  9. Can Genre Be "Heard" in Scale as Well as Song Tasks? An Exploratory Study of Female Singing in Western Lyric and Musical Theater Styles.

    PubMed

    Kayes, Gillyanne; Welch, Graham F

    2017-05-01

    Using an empirical design, this study investigated perceptual and acoustic differences between the recorded vocal products of songs and scales of professional female singers of classical Western Lyric (WL) and non-legit Musical Theater (MT) styles. A total of 54 audio-recorded samples of songs and scales from professional female singers were rated in a blind randomized testing process by seven expert listeners as being performed by either a WL or MT singer. Songs and scales that were accurately perceived by genre were then analyzed intra- and inter-genre using long-term average spectrum analysis. A high level of agreement was found between judges in ratings for both songs and scales according to genre (P < 0.0001). Judges were more successful in locating WL than MT, but accuracy was always >50%. For the long-term average spectrum analysis intra-genre, song and scale matched better than chance. The highest spectral peak for the WL singers was at the mean fundamental frequency, whereas this spectral area was weaker for the MT singers, who showed a marked peak at 1 kHz. The other main inter-genre difference appeared in the higher frequency region, with a peak in the MT spectrum between 4 and 5 kHz, the region of the "speaker's formant." In comparing female singers of WL and MT styles, scales as well as song tasks appear to be indicative of singer genre behavior. This implied difference in vocal production may be useful to teachers and clinicians dealing with multiple genres. The addition of a scale-in-genre task may be useful in future research seeking to identify genre-distinctive behaviors. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.
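As a concrete illustration of the long-term average spectrum (LTAS) technique used in this study, the sketch below averages windowed magnitude spectra over a whole recording; the synthetic two-tone "voice" signal, window length, and hop size are illustrative assumptions, not the study's settings.

```python
import numpy as np

def ltas(x, fs, win_len=4096, hop=2048):
    """Long-term average spectrum: magnitude spectra of overlapping
    Hann-windowed frames, averaged over the whole recording (in dB)."""
    win = np.hanning(win_len)
    frames = np.asarray([x[i:i + win_len] * win
                         for i in range(0, len(x) - win_len + 1, hop)])
    mags = np.abs(np.fft.rfft(frames, axis=1))
    freqs = np.fft.rfftfreq(win_len, d=1.0 / fs)
    return freqs, 20 * np.log10(mags.mean(axis=0) + 1e-12)

# Synthetic stand-in for a sung vowel: a 220 Hz fundamental plus a
# weaker component near 1 kHz (both values illustrative).
fs = 16000
t = np.arange(0, 2.0, 1.0 / fs)
voice = np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 1000 * t)
freqs, spectrum_db = ltas(voice, fs)
peak_hz = freqs[np.argmax(spectrum_db)]  # strongest spectral peak
```

Comparing such averaged spectra between groups of recordings is what makes genre-linked peaks, like the 1 kHz and 4-5 kHz regions discussed above, visible.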

  10. The Sulcis Storage Project: Status of the First Italian Initiative for Pilot-Scale Geological Sequestration of CO2

    NASA Astrophysics Data System (ADS)

    Plaisant, A.; Maggio, E.; Pettinau, A.

    2016-12-01

    The deep aquifer located at a depth of about 1000-1500 m within fractured carbonate in the Sulcis coal basin (South-West Sardinia, Italy) constitutes a potential reservoir for developing a pilot-scale CO2 storage site. The occurrence of several coal mines and the geology of the basin also provide favourable conditions for installing permanent infrastructure where advanced CO2 storage technologies can be developed. Overall, the Sulcis project will make it possible to characterize the Sulcis coal basin and to develop a permanent infrastructure (know-how, equipment, laboratories, etc.) for advanced international studies on CO2 storage. The research activities are structured in two phases: (i) site characterization, including the construction of an underground laboratory and a fault laboratory, and (ii) the installation of a test site for small-scale injection of CO2. In particular, the underground laboratory will host geochemical and geophysical experiments on rocks, taking advantage of the buried environment and the very well confined conditions in the galleries; in parallel, the fault laboratory will be constructed to study CO2 leakage phenomena in a selected fault. The project is currently ongoing, and this work presents some preliminary results as well as the structure of the project as a whole. In more detail, preliminary activities comprise: (i) geochemical monitoring; (ii) the minero-petrographical, physical and geophysical characterization of the rock samples; (iii) the development of both static and dynamic geological models of the reservoir; (iv) structural geology and fault analysis; (v) the assessment of natural seismicity through a monitoring network; and (vi) the re-processing and analysis of the reflection seismic data. Future activities will comprise: (i) the drilling of shallow exploration wells near the faults; (ii) the construction of both the above-mentioned laboratories; (iii) drilling of a deep exploration well (1,500 m

  11. Continuous microalgal cultivation in a laboratory-scale photobioreactor under seasonal day-night irradiation: experiments and simulation.

    PubMed

    Bertucco, Alberto; Beraldi, Mariaelena; Sforza, Eleonora

    2014-08-01

    In this work, the production of Scenedesmus obliquus in a continuous flat-plate laboratory-scale photobioreactor (PBR) under alternated day-night cycles was tested both experimentally and theoretically. Variation of light intensity according to the four seasons of the year was simulated experimentally by a tunable LED lamp, and its effects on microalgal growth and productivity were measured to evaluate the conversion efficiency of light energy into biomass during the different seasons. These results were used to validate a mathematical model for algae growth that can be applied to simulate a large-scale production unit, carried out in a flat-plate PBR of similar geometry. The cellular concentration in the PBR was calculated in both steady-state and transient conditions, and the value of the maintenance kinetic term was correlated to experimental profiles. The relevance of this parameter was finally outlined.
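The kind of growth model validated here can be sketched as a light-limited chemostat balance with an explicit maintenance term; the kinetic form (light-saturation growth with Beer-Lambert self-shading) and every parameter value below are hypothetical placeholders, not the fitted values from this work.

```python
import numpy as np

# Biomass balance for a continuous PBR with maintenance term m and
# dilution rate D:  dX/dt = (mu(I_av) - m - D) * X.
# Self-shading makes the average light I_av fall as biomass X rises,
# so the culture settles to a steady state. All values illustrative.
mu_max, K_I = 0.06, 100.0    # 1/h, umol photons/m2/s
m, D = 0.005, 0.02           # maintenance and dilution rates, 1/h
k, L = 0.2, 0.05             # attenuation (m2/g), light path (m)

def avg_light(I0, X):
    """Beer-Lambert depth-averaged irradiance in the culture."""
    a = k * X * L
    return I0 if a < 1e-9 else I0 * (1 - np.exp(-a)) / a

def mu(I):
    """Light-saturation specific growth rate, 1/h."""
    return mu_max * I / (K_I + I)

def steady_state_biomass(I0, X0=0.1, hours=2000, dt=0.1):
    """Forward-Euler integration until the chemostat settles."""
    X = X0
    for _ in range(int(hours / dt)):
        X += dt * (mu(avg_light(I0, X)) - m - D) * X
    return X
```

At steady state the specific growth rate balances dilution plus maintenance, mu(I_av) = D + m, which is why the maintenance term can be back-calculated from measured concentration profiles.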

  12. Synthetic spider silk production on a laboratory scale.

    PubMed

    Hsia, Yang; Gnesa, Eric; Pacheco, Ryan; Kohler, Kristin; Jeffery, Felicia; Vierra, Craig

    2012-07-18

    As society progresses and resources become scarcer, it is becoming increasingly important to cultivate new technologies that engineer next generation biomaterials with high performance properties. The development of these new structural materials must be rapid, cost-efficient and involve processing methodologies and products that are environmentally friendly and sustainable. Spiders spin a multitude of different fiber types with diverse mechanical properties, offering a rich source of next generation engineering materials for biomimicry that rival the best manmade and natural materials. Since the collection of large quantities of natural spider silk is impractical, synthetic silk production has the ability to provide scientists with access to an unlimited supply of threads. Therefore, if the spinning process can be streamlined and perfected, artificial spider fibers have potential uses in a broad range of applications, including body armor, surgical sutures, ropes and cables, tires, strings for musical instruments, and composites for aviation and aerospace technology. In order to advance the synthetic silk production process and to yield fibers that display low variance in their material properties from spin to spin, we developed a wet-spinning protocol that integrates expression of recombinant spider silk proteins in bacteria, purification and concentration of the proteins, followed by fiber extrusion and a mechanical post-spin treatment. This is the first visual representation that reveals a step-by-step process to spin and analyze artificial silk fibers on a laboratory scale. It also provides details to minimize the introduction of variability among fibers spun from the same spinning dope. Collectively, these methods will propel the process of artificial silk production, leading to higher quality fibers that surpass natural spider silks.

  13. Synthetic Spider Silk Production on a Laboratory Scale

    PubMed Central

    Hsia, Yang; Gnesa, Eric; Pacheco, Ryan; Kohler, Kristin; Jeffery, Felicia; Vierra, Craig

    2012-01-01

    As society progresses and resources become scarcer, it is becoming increasingly important to cultivate new technologies that engineer next generation biomaterials with high performance properties. The development of these new structural materials must be rapid, cost-efficient and involve processing methodologies and products that are environmentally friendly and sustainable. Spiders spin a multitude of different fiber types with diverse mechanical properties, offering a rich source of next generation engineering materials for biomimicry that rival the best manmade and natural materials. Since the collection of large quantities of natural spider silk is impractical, synthetic silk production has the ability to provide scientists with access to an unlimited supply of threads. Therefore, if the spinning process can be streamlined and perfected, artificial spider fibers have potential uses in a broad range of applications, including body armor, surgical sutures, ropes and cables, tires, strings for musical instruments, and composites for aviation and aerospace technology. In order to advance the synthetic silk production process and to yield fibers that display low variance in their material properties from spin to spin, we developed a wet-spinning protocol that integrates expression of recombinant spider silk proteins in bacteria, purification and concentration of the proteins, followed by fiber extrusion and a mechanical post-spin treatment. This is the first visual representation that reveals a step-by-step process to spin and analyze artificial silk fibers on a laboratory scale. It also provides details to minimize the introduction of variability among fibers spun from the same spinning dope. Collectively, these methods will propel the process of artificial silk production, leading to higher quality fibers that surpass natural spider silks. PMID:22847722

  14. False-Belief Understanding in 2.5-Year-Olds: Evidence from Two Novel Verbal Spontaneous-Response Tasks

    ERIC Educational Resources Information Center

    Scott, Rose M.; He, Zijing; Baillargeon, Renee; Cummins, Denise

    2012-01-01

    Recent research indicates that toddlers and infants succeed at various "non-verbal" spontaneous-response false-belief tasks; here we asked whether toddlers would also succeed at verbal spontaneous-response false-belief tasks that imposed significant linguistic demands. We tested 2.5-year-olds using two novel tasks: a "preferential-looking" task in…

  15. Wiggins Content Scales and the MMPI-2.

    PubMed

    Kohutek, K J

    1992-03-01

    The Wiggins Content Scales were omitted from the MMPI-2 because of the number of items deleted from the MMPI as well as the items added to the MMPI-2. The purpose of this study was to compare scorings of the items on the Wiggins Scales of the MMPI with scorings of the items that remain on these scales in the MMPI-2. The Religious Fundamentalism and Authority Conflict scales appear to be the most seriously affected by the item changes in the MMPI-2. The Depression and Family Conflict scales retained all of their items, and the remaining nine scales were not found to be statistically different when the two scorings were compared.

  16. Anticipatory Effects on Lower Extremity Neuromechanics During a Cutting Task.

    PubMed

    Meinerz, Carolyn M; Malloy, Philip; Geiser, Christopher F; Kipp, Kristof

    2015-09-01

    Continued research into the mechanism of noncontact anterior cruciate ligament injury helps to improve clinical interventions and injury-prevention strategies. A better understanding of the effects of anticipation on landing neuromechanics may benefit training interventions. Objective: To determine the effects of anticipation on lower extremity neuromechanics during a single-legged land-and-cut task. Design: Controlled laboratory study. Setting: University biomechanics laboratory. Participants: Eighteen female National Collegiate Athletic Association Division I collegiate soccer players (age = 19.7 ± 0.8 years, height = 167.3 ± 6.0 cm, mass = 66.1 ± 2.1 kg). Intervention: Participants performed a single-legged land-and-cut task under anticipated and unanticipated conditions. Main outcome measures: Three-dimensional initial contact angles, peak joint angles, and peak internal joint moments; peak vertical ground reaction forces and sagittal-plane energy absorption of the 3 lower extremity joints; and muscle activation of selected hip- and knee-joint muscles. Results: Unanticipated cuts resulted in less knee flexion at initial contact and greater ankle toe-in displacement. Unanticipated cuts were also characterized by greater internal hip-abductor and external-rotator moments and smaller internal knee-extensor and external-rotator moments. Muscle-activation profiles during unanticipated cuts were associated with greater activation of the gluteus maximus during the precontact and landing phases. Conclusions: Performing a cutting task under unanticipated conditions changed lower extremity neuromechanics compared with anticipated conditions. Most of the observed changes in lower extremity neuromechanics indicated the adoption of a hip-focused strategy during the unanticipated condition.

  17. Atomic Oxygen Task

    NASA Technical Reports Server (NTRS)

    Hadaway, James B.

    1997-01-01

    This report details work performed by the Center for Applied Optics (CAO) at the University of Alabama in Huntsville (UAH) on the contract entitled 'Atomic Oxygen Task' for NASA's Marshall Space Flight Center (contract NAS8-38609, Delivery Order 109, modification number 1). Atomic oxygen effects on exposed materials remain a critical concern in designing spacecraft to withstand exposure in the Low Earth Orbit (LEO) environment. The basic objective of atomic oxygen research in NASA's Materials & Processes (M&P) Laboratory is to provide the solutions to material problems facing present and future space missions. The objective of this work was to provide the necessary research for the design of specialized experimental test configurations and development of techniques for evaluating in-situ space environmental effects, including the effects of atomic oxygen and electromagnetic radiation on candidate materials. Specific tasks were performed to address materials issues concerning accelerated environmental testing as well as specifically addressing materials issues of particular concern for LDEF analysis and Space Station materials selection.

  18. Laboratory Measurements of SO2 and N2 Absorption Spectra for Planetary Atmospheres

    NASA Technical Reports Server (NTRS)

    Stark, Glenn

    2003-01-01

    This laboratory project focuses on the following topics: 1) measurement of SO2 ultraviolet absorption cross sections; and 2) N2 band and line oscillator strengths and line widths in the 80 to 100 nm region. Accomplishments for these projects are summarized.

  19. Carpal tunnel syndrome among laboratory technicians in relation to personal and ergonomic factors at work.

    PubMed

    El-Helaly, Mohamed; Balkhy, Hanan H; Vallenius, Laura

    2017-11-25

    Work-related carpal tunnel syndrome (CTS) has been reported in different occupations, including laboratory technicians, so this study was carried out to determine the prevalence of CTS and its associated personal and ergonomic factors among laboratory technicians. A cross-sectional study was conducted among 279 laboratory technicians at King Fahd Hospital, Saudi Arabia, who filled in a self-administered questionnaire, including questions regarding their demographic criteria, occupational history, job tasks, workplace tools, ergonomic factors at work, and symptoms suggestive of CTS. Physical examinations and electrodiagnostic studies were carried out for those who had symptoms suggestive of CTS to confirm the diagnosis. Univariate and multivariate analyses were performed for both personal and physical factors in association with confirmed CTS among laboratory technicians. The prevalence of CTS among the laboratory technicians was 9.7% (27/279). The following were the statistically significant risk factors for CTS among them: gender (all cases of CTS were female, P=0.00), arm/hand exertion (OR: 7.96; 95% CI: 1.84-34.33), pipetting (OR: 7.27; 95% CI: 3.15-16.78), repetitive tasks (OR: 4.60; 95% CI: 1.39-15.70), using unadjustable chairs or desks (OR: 3.35; 95% CI: 1.23-9.15), and working with a biosafety cabinet (OR: 2.49; 95% CI: 1.11-5.59). CTS cases had a significantly longer work duration (17.9 ± 5.6 years) than non-cases (11.5 ± 7.4 years), with a low OR (1.108). This study demonstrates some personal and ergonomic factors associated with CTS among laboratory technicians, including female gender, arm/hand exertion, pipetting, repetitive tasks, working with a biosafety cabinet, and an unadjusted workstation.

  20. Tani in the U.S. Laboratory during Node 2/PMA-2 Relocation

    NASA Image and Video Library

    2007-11-14

    ISS016-E-011253 (14 Nov. 2007) --- Astronaut Daniel Tani, Expedition 16 flight engineer, works the controls of the space station's robotic Canadarm2 in the Destiny laboratory of the International Space Station, during the relocation of the Harmony node and Pressurized Mating Adapter 2 (PMA2) from the Unity node to the front of Destiny.

  1. Subjective Estimation of Task Time and Task Difficulty of Simple Movement Tasks.

    PubMed

    Chan, Alan H S; Hoffmann, Errol R

    2017-01-01

    It has been demonstrated in previous work that the same neural structures are used for both imagined and real movements. To provide a strong test of the similarity of imagined and actual movement times, 4 simple movement tasks were used to determine the relationship between estimated task time and actual movement time. The tasks were single-component visually controlled movements, 2-component visually controlled, low index of difficulty (ID) moves and pin-to-hole transfer movements. For each task there was good correspondence between the mean estimated times and actual movement times. In all cases, the same factors determined the actual and estimated movement times: the amplitudes of movement and the IDs of the component movements; however, the contribution of each of these variables differed for the imagined and real tasks. Generally, the standard deviations of the estimated times were linearly related to the estimated time values. Overall, the data provide strong evidence for the same neural structures being used for both imagined and actual movements.
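The index of difficulty referred to above is the Fitts' law quantity ID = log2(2A/W); the sketch below computes it together with a linear movement-time prediction MT = a + b * ID, where the intercept and slope are hypothetical values, not the study's fitted coefficients.

```python
import math

def index_of_difficulty(amplitude, width):
    """Fitts' law index of difficulty in bits: ID = log2(2A / W)."""
    return math.log2(2.0 * amplitude / width)

def movement_time(amplitude, width, a=0.05, b=0.12):
    """Linear Fitts-type prediction MT = a + b * ID, in seconds.
    The intercept a and slope b are hypothetical, not fitted values."""
    return a + b * index_of_difficulty(amplitude, width)

# Doubling the movement amplitude at a fixed target width adds exactly
# one bit of difficulty, and hence a constant increment b to the MT.
id_near = index_of_difficulty(100, 10)   # log2(20) bits
id_far = index_of_difficulty(200, 10)    # one bit more
```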

  2. Redintegration, task difficulty, and immediate serial recall tasks.

    PubMed

    Ritchie, Gabrielle; Tolan, Georgina Anne; Tehan, Gerald

    2015-03-01

    While current theoretical models remain somewhat inconclusive in their explanation of short-term memory (STM), many theories suggest at least a contribution of long-term memory (LTM) to the short-term system. A number of researchers refer to this process as redintegration (e.g., Schweickert, 1993). Under short-term recall conditions, the current study investigated the effects of redintegration and task difficulty in order to extend research conducted by Neale and Tehan (2007). Thirty participants in Experiment 1 and 26 participants in Experiment 2 completed a serial recall task in which retention interval, presentation rate, and articulatory suppression were used to modify task difficulty. Redintegration was examined by manipulating the characteristics of the to-be-remembered items; lexicality in Experiment 1 and wordlikeness in Experiment 2. Responses were scored based on correct-in-position recall, item scoring, and order accuracy scoring. In line with the Neale and Tehan results, as the difficulty of the task increased so did the effects of redintegration. This was evident in that the advantage for words in Experiment 1 and wordlikeness in Experiment 2 decreased as task difficulty increased. This relationship was observed for item but not order memory, and findings were discussed in relation to the theory of redintegration. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  3. Multi-attribute subjective evaluations of manual tracking tasks vs. objective performance of the human operator

    NASA Technical Reports Server (NTRS)

    Siapkaras, A.

    1977-01-01

    A computational method to deal with the multidimensional nature of tracking and/or monitoring tasks is developed. Operator-centered variables, including the operator's perception of the task, are considered. Matrix ratings are defined based on multidimensional scaling techniques and multivariate analysis. The method consists of two distinct steps: (1) to determine the mathematical space of subjective judgements of a certain individual (or group of evaluators) for a given set of tasks and experimental conditions; and (2) to relate this space to both the task variables and the objective performance criteria used. Results for a variety of second-order tracking tasks with smoothed noise-driven inputs indicate that: (1) many of the internally perceived task variables form a nonorthogonal set; and (2) the structure of the subjective space varies among groups of individuals according to the degree of familiarity they have with such tasks.

  4. Bench-Scale Filtration Testing in Support of the Pretreatment Engineering Platform (PEP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Billing, Justin M.; Daniel, Richard C.; Kurath, Dean E.

    Pacific Northwest National Laboratory (PNNL) has been tasked by Bechtel National Inc. (BNI) on the River Protection Project-Hanford Tank Waste Treatment and Immobilization Plant (RPP-WTP) project to perform research and development activities to resolve technical issues identified for the Pretreatment Facility (PTF). The Pretreatment Engineering Platform (PEP) was designed, constructed and operated as part of a plan to respond to issue M12, “Undemonstrated Leaching Processes.” The PEP is a 1/4.5-scale test platform designed to simulate the WTP pretreatment caustic leaching, oxidative leaching, ultrafiltration solids concentration, and slurry washing processes. The PEP testing program specifies that bench-scale testing is to be performed in support of specific operations, including filtration, caustic leaching, and oxidative leaching.

  5. Assessing dynamics, spatial scale, and uncertainty in task-related brain network analyses

    PubMed Central

    Stephen, Emily P.; Lepage, Kyle Q.; Eden, Uri T.; Brunner, Peter; Schalk, Gerwin; Brumberg, Jonathan S.; Guenther, Frank H.; Kramer, Mark A.

    2014-01-01

    The brain is a complex network of interconnected elements, whose interactions evolve dynamically in time to cooperatively perform specific functions. A common technique to probe these interactions involves multi-sensor recordings of brain activity during a repeated task. Many techniques exist to characterize the resulting task-related activity, including establishing functional networks, which represent the statistical associations between brain areas. Although functional network inference is commonly employed to analyze neural time series data, techniques to assess the uncertainty—both in the functional network edges and the corresponding aggregate measures of network topology—are lacking. To address this, we describe a statistically principled approach for computing uncertainty in functional networks and aggregate network measures in task-related data. The approach is based on a resampling procedure that utilizes the trial structure common in experimental recordings. We show in simulations that this approach successfully identifies functional networks and associated measures of confidence emergent during a task in a variety of scenarios, including dynamically evolving networks. In addition, we describe a principled technique for establishing functional networks based on predetermined regions of interest using canonical correlation. Doing so provides additional robustness to the functional network inference. Finally, we illustrate the use of these methods on example invasive brain voltage recordings collected during an overt speech task. The general strategy described here—appropriate for static and dynamic network inference and different statistical measures of coupling—permits the evaluation of confidence in network measures in a variety of settings common to neuroscience. PMID:24678295
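The trial-resampling strategy described above can be reduced, for illustration, to a bootstrap over trials for a single correlation-based network edge; the synthetic two-channel data and the choice of correlation as the coupling measure are simplifying assumptions, not the authors' full pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def edge_strength(trials):
    """Mean over trials of the correlation between the two channels.
    trials: array of shape (n_trials, 2, n_samples)."""
    return float(np.mean([np.corrcoef(t[0], t[1])[0, 1] for t in trials]))

def bootstrap_ci(trials, n_boot=500, alpha=0.05):
    """Percentile bootstrap CI for the edge, resampling whole trials
    with replacement (the trial structure does the work here)."""
    n = len(trials)
    stats = [edge_strength(trials[rng.integers(0, n, n)])
             for _ in range(n_boot)]
    return np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])

# Synthetic task data: both channels share a common per-trial signal,
# so the true edge is strong and its CI should exclude zero.
common = rng.standard_normal((40, 1, 200))
trials = common + 0.5 * rng.standard_normal((40, 2, 200))
lo, hi = bootstrap_ci(trials)  # an edge is kept if the CI excludes 0
```

Resampling whole trials, rather than individual samples, preserves the within-trial temporal dependence that makes naive sample-level bootstraps invalid for time series.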

  6. Autonomous smart sensor network for full-scale structural health monitoring

    NASA Astrophysics Data System (ADS)

    Rice, Jennifer A.; Mechitov, Kirill A.; Spencer, B. F., Jr.; Agha, Gul A.

    2010-04-01

    The demands of aging infrastructure require effective methods for structural monitoring and maintenance. Wireless smart sensor networks offer the ability to enhance structural health monitoring (SHM) practices through the utilization of onboard computation to achieve distributed data management. Such an approach is scalable to the large number of sensor nodes required for high-fidelity modal analysis and damage detection. While smart sensor technology is not new, the number of full-scale SHM applications has been limited. This slow progress is due, in part, to the complex network management issues that arise when moving from a laboratory setting to a full-scale monitoring implementation. This paper presents flexible network management software that enables continuous and autonomous operation of wireless smart sensor networks for full-scale SHM applications. The software components combine sleep/wake cycling for enhanced power management with threshold detection for triggering network-wide tasks, such as synchronized sensing or decentralized modal analysis, during periods of critical structural response.

  7. Space station data system analysis/architecture study. Task 3: Trade studies, DR-5, volume 2

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Results of a Space Station Data System Analysis/Architecture Study for the Goddard Space Flight Center are presented. This study, which emphasized a system engineering design for a complete, end-to-end data system, was divided into six tasks: (1) functional requirements definition; (2) options development; (3) trade studies; (4) system definitions; (5) program plan; and (6) study maintenance. The task inter-relationships and documentation flow are described. Volume 2 is devoted to Task 3: Trade Studies. Trade studies were carried out in the following areas: (1) software development test and integration capability; (2) fault tolerant computing; (3) space qualified computers; (4) distributed data base management system; (5) system integration test and verification; (6) crew workstations; (7) mass storage; (8) command and resource management; and (9) space communications. Results are presented for each area.

  8. Manufacturing Laboratory | Energy Systems Integration Facility | NREL

    Science.gov Websites

    Researchers in the Energy Systems Integration Facility's Manufacturing Laboratory develop methods and technologies to scale up renewable energy technology manufacturing capabilities.

  9. Retrofit of waste-to-energy facilities equipped with electrostatic precipitators. Volume II: Field and laboratory reports, Part 2 of 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rigo, H.G.; Chandler, A.J.

    Volume II (Part 2 of 2) of "Retrofit of Waste-to-Energy Facilities Equipped with Electrostatic Precipitators" contains the field and laboratory reports, including: (1) field reports, (2) analytic laboratory reports, (3) chain of custody forms, and (4) TCLP laboratory reports.

  10. On the importance of Task 1 and error performance measures in PRP dual-task studies.

    PubMed

    Strobach, Tilo; Schütz, Anja; Schubert, Torsten

    2015-01-01

    The psychological refractory period (PRP) paradigm is a dominant research tool in the literature on dual-task performance. In this paradigm a first and second component task (i.e., Task 1 and Task 2) are presented with variable stimulus onset asynchronies (SOAs) and priority to perform Task 1. The main indicator of dual-task impairment in PRP situations is an increasing Task 2-RT with decreasing SOAs. This impairment is typically explained with some task components being processed strictly sequentially in the context of the prominent central bottleneck theory. This assumption could implicitly suggest that processes of Task 1 are unaffected by Task 2 and bottleneck processing, i.e., decreasing SOAs do not increase reaction times (RTs) and error rates of the first task. The aim of the present review is to assess whether PRP dual-task studies included both RT and error data presentations and statistical analyses and whether studies including both data types (i.e., RTs and error rates) show data consistent with this assumption (i.e., decreasing SOAs and unaffected RTs and/or error rates in Task 1). This review demonstrates that, in contrast to RT presentations and analyses, error data is underrepresented in a substantial number of studies. Furthermore, a substantial number of studies with RT and error data showed a statistically significant impairment of Task 1 performance with decreasing SOA. Thus, these studies produced data that is not primarily consistent with the strong assumption that processes of Task 1 are unaffected by Task 2 and bottleneck processing in the context of PRP dual-task situations; this calls for a more careful report and analysis of Task 1 performance in PRP studies and for a more careful consideration of theories proposing additions to the bottleneck assumption, which are sufficiently general to explain Task 1 and Task 2 effects.
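The central bottleneck account summarized in this record can be illustrated with a toy stage model in which Task 2's central stage must wait for Task 1's central stage to finish; the stage durations below are arbitrary round numbers, not fitted values.

```python
# Stage durations in ms (perceptual A, central B, motor C); all values
# are arbitrary illustrative choices.
A1, B1, C1 = 100, 150, 80   # Task 1
A2, B2, C2 = 100, 150, 80   # Task 2

def rts(soa):
    """Predicted (RT1, RT2) under a strict central bottleneck.

    Task 2's central stage cannot start before Task 1's central stage
    has finished; RT2 is measured from the onset of the second stimulus."""
    rt1 = A1 + B1 + C1
    central2_start = max(soa + A2, A1 + B1)  # both measured from S1 onset
    rt2 = central2_start + B2 + C2 - soa
    return rt1, rt2

# RT2 rises with slope -1 as SOA shrinks below the bottleneck point,
# while RT1 stays flat: the classic PRP signature. The review above asks
# whether real Task 1 data actually honor that flat-RT1 prediction.
prp_cost = rts(0)[1] - rts(600)[1]
```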

  11. On the importance of Task 1 and error performance measures in PRP dual-task studies

    PubMed Central

    Strobach, Tilo; Schütz, Anja; Schubert, Torsten

    2015-01-01

    The psychological refractory period (PRP) paradigm is a dominant research tool in the literature on dual-task performance. In this paradigm a first and second component task (i.e., Task 1 and Task 2) are presented with variable stimulus onset asynchronies (SOAs) and priority to perform Task 1. The main indicator of dual-task impairment in PRP situations is an increasing Task 2-RT with decreasing SOAs. This impairment is typically explained with some task components being processed strictly sequentially in the context of the prominent central bottleneck theory. This assumption could implicitly suggest that processes of Task 1 are unaffected by Task 2 and bottleneck processing, i.e., decreasing SOAs do not increase reaction times (RTs) and error rates of the first task. The aim of the present review is to assess whether PRP dual-task studies included both RT and error data presentations and statistical analyses and whether studies including both data types (i.e., RTs and error rates) show data consistent with this assumption (i.e., decreasing SOAs and unaffected RTs and/or error rates in Task 1). This review demonstrates that, in contrast to RT presentations and analyses, error data is underrepresented in a substantial number of studies. Furthermore, a substantial number of studies with RT and error data showed a statistically significant impairment of Task 1 performance with decreasing SOA. Thus, these studies produced data that is not primarily consistent with the strong assumption that processes of Task 1 are unaffected by Task 2 and bottleneck processing in the context of PRP dual-task situations; this calls for a more careful report and analysis of Task 1 performance in PRP studies and for a more careful consideration of theories proposing additions to the bottleneck assumption, which are sufficiently general to explain Task 1 and Task 2 effects. PMID:25904890

  12. Word Effects in Dual-Task Studies Using Lexical Decision and Naming as Task 2

    NASA Technical Reports Server (NTRS)

    Remington, Roger; McCann, Robert S.; VanSelst, Mark; Shafto, Michael (Technical Monitor)

    1997-01-01

    Word frequency effects in dual-task, lexical decision are variously reported to be additive or under-additive across SOA. We replicate and extend earlier lexical decision studies and find word frequency to be additive across SOA. To more directly capture lexical processing, we examine dual-task naming. Once again we find word frequency to be additive across SOA. Lexical processing appears to be constrained by central processing limitations.

  13. Response of removal rates to various organic carbon and ammonium loads in laboratory-scale constructed wetlands treating artificial wastewater.

    PubMed

    Wu, Shubiao; Kuschk, Peter; Wiessner, Arndt; Kästner, Matthias; Pang, Changle; Dong, Renjie

    2013-01-01

    High levels (92 and 91%) of organic carbon were successfully removed from artificial wastewater by a laboratory-scale constructed wetland under inflow loads of 670 mg/m2 x d (100 mg/d) and 1600 mg/m2 x d (240 mg/d), respectively. Acidification to pH 3.0 was observed at the low organic carbon load, which further inhibited the denitrification process. An increase in carbon load, however, was associated with a significant elevation of pH to 6.0. In general, sulfate and nitrate reduction were relatively high, with mean levels of 87 and 90%, respectively. However, inhibition of nitrification was initiated with an increase in carbon loads. This effect was probably a result of competition for oxygen by heterotrophic bacteria and an inhibitory effect of sulfide (S2-) toxicity (concentration approximately 3 mg/L). In addition, the number of healthy stalks of Juncus effusus (common rush) decreased from 14 000 to 10 000/m2 with an increase in sulfide concentration, indicating the negative effect of sulfide toxicity on the wetland plants.

  14. Task-Based Language Teaching for Beginner-Level Learners of L2 French: An Exploratory Study

    ERIC Educational Resources Information Center

    Erlam, Rosemary; Ellis, Rod

    2018-01-01

    This study investigated the effect of input-based tasks on the acquisition of vocabulary and grammar by beginner-level learners of L2 French and reported the introduction of task-based teaching as an innovation in a state secondary school. The experimental group (n = 19) completed a series of focused input-based language tasks, taught by their…

  15. Pro Free Will Priming Enhances "Risk-Taking" Behavior in the Iowa Gambling Task, but Not in the Balloon Analogue Risk Task: Two Independent Priming Studies.

    PubMed

    Schrag, Yann; Tremea, Alessandro; Lagger, Cyril; Ohana, Noé; Mohr, Christine

    2016-01-01

    Studies indicated that people behave less responsibly after exposure to information containing deterministic statements as compared to free will statements or neutral statements. Thus, deterministic primes should lead to enhanced risk-taking behavior. We tested this prediction in two studies with healthy participants. In experiment 1, we tested 144 students (24 men) in the laboratory using the Iowa Gambling Task. In experiment 2, we tested 274 participants (104 men) online using the Balloon Analogue Risk Task. In the Iowa Gambling Task, the free will priming condition resulted in more risky decisions than both the deterministic and neutral priming conditions. We observed no priming effects on risk-taking behavior in the Balloon Analogue Risk Task. To explain these unpredicted findings, we consider the somatic marker hypothesis, a gain-frequency approach, as well as attention to gains and/or inattention to losses. In addition, we highlight the necessity of considering both pro free will and deterministic priming conditions in future studies. Importantly, our results and those of previous studies indicate that the effects of pro free will and deterministic priming do not oppose each other on a frequently assumed continuum.

  16. Effects of a Mindfulness Task on Women's Sexual Response.

    PubMed

    Velten, Julia; Margraf, Jürgen; Chivers, Meredith L; Brotto, Lori A

    2017-12-20

    Mindfulness-based interventions are effective at improving symptoms of sexual dysfunction in women. The mechanisms by which mindfulness improves sexual function are less clear. The main objective of our study was to investigate the impact of a mindfulness task on sexual response in women. Forty-one women (mean age = 27.2, SD = 5.6) participated in two laboratory sessions that each included two erotic films and one attention task that were presented in counterbalanced order. Both attention tasks consisted of a six-minute audio recording of either a modified body scan, focusing on genital arousal sensations (mindfulness condition), or a visualization exercise. Subjective and genital sexual arousal were measured continuously during stimulus presentation. The mindfulness task led to greater subjective and lower genital arousal. The agreement of subjective and genital sexual arousal (i.e., concordance) was greater in the mindfulness condition. Trait mindfulness was related to lower sexual arousal but also greater sexual concordance in women. Mindfulness-based interventions that encourage women to focus on physical arousal sensations in the here and now may be associated with women's improved sexual function by enhancing feelings of sexual arousal during sexual activity and by increasing concordance between subjective and genital sexual arousal.

  17. Virtual reality robotic surgery warm-up improves task performance in a dry laboratory environment: a prospective randomized controlled study.

    PubMed

    Lendvay, Thomas S; Brand, Timothy C; White, Lee; Kowalewski, Timothy; Jonnadula, Saikiran; Mercer, Laina D; Khorsand, Derek; Andros, Justin; Hannaford, Blake; Satava, Richard M

    2013-06-01

    Preoperative simulation warm-up has been shown to improve performance and reduce errors in novice and experienced surgeons, yet existing studies have only investigated conventional laparoscopy. We hypothesized that a brief virtual reality (VR) robotic warm-up would enhance robotic task performance and reduce errors. In a 2-center randomized trial, 51 residents and experienced minimally invasive surgery faculty in General Surgery, Urology, and Gynecology underwent a validated robotic surgery proficiency curriculum on a VR robotic simulator and on the da Vinci surgical robot (Intuitive Surgical Inc). Once they successfully achieved performance benchmarks, surgeons were randomized to either receive a 3- to 5-minute VR simulator warm-up or read a leisure book for 10 minutes before performing similar and dissimilar (intracorporeal suturing) robotic surgery tasks. The primary outcomes compared were task time, tool path length, economy of motion, technical, and cognitive errors. Task time (-29.29 seconds, p = 0.001; 95% CI, -47.03 to -11.56), path length (-79.87 mm; p = 0.014; 95% CI, -144.48 to -15.25), and cognitive errors were reduced in the warm-up group compared with the control group for similar tasks. Global technical errors in intracorporeal suturing (0.32; p = 0.020; 95% CI, 0.06-0.59) were reduced after the dissimilar VR task. When surgeons were stratified by earlier robotic and laparoscopic clinical experience, the more experienced surgeons (n = 17) demonstrated significant improvements from warm-up in task time (-53.5 seconds; p = 0.001; 95% CI, -83.9 to -23.0) and economy of motion (0.63 mm/s; p = 0.007; 95% CI, 0.18-1.09), and improvement in these metrics was not statistically significantly appreciated in the less-experienced cohort (n = 34). We observed significant performance improvement and error reduction rates among surgeons of varying experience after VR warm-up for basic robotic surgery tasks. In addition, the VR warm-up reduced errors on

  18. Task-set inertia and memory-consolidation bottleneck in dual tasks.

    PubMed

    Koch, Iring; Rumiati, Raffaella I

    2006-11-01

    Three dual-task experiments examined the influence of processing a briefly presented visual object for deferred verbal report on performance in an unrelated auditory-manual reaction time (RT) task. RT was increased at short stimulus-onset asynchronies (SOAs) relative to long SOAs, showing that memory consolidation processes can produce a functional processing bottleneck in dual-task performance. In addition, the experiments manipulated the spatial compatibility of the orientation of the visual object and the side of the speeded manual response. This cross-task compatibility produced relative RT benefits only when the instruction for the visual task emphasized overlap at the level of response codes across the task sets (Experiment 1). However, once the effective task set was in place, it continued to produce cross-task compatibility effects even in single-task situations ("ignore" trials in Experiment 2) and when instructions for the visual task did not explicitly require spatial coding of object orientation (Experiment 3). Taken together, the data suggest a considerable degree of task-set inertia in dual-task performance, which is also reinforced by finding costs of switching task sequences (e.g., AC --> BC vs. BC --> BC) in Experiment 3.

  19. [Postgraduates' training as laboratory physicians/clinical pathologists in Japan--board certification of JSLM as a mandatory requirement for chairpersons of laboratory medicine].

    PubMed

    Kumasaka, Kazunari

    2002-04-01

    The educational committee of the Japanese Society of Laboratory Medicine(JSLM) proposed a revised laboratory medicine residency curriculum in 1999 and again in 2001. The committee believes that present undergraduate clinical training is insufficient and that Japanese medical graduates need clinical training for two years after graduation. This two years training should be a precondition for further postgraduate training in laboratory medicine and should include fundamental clinical skills(communication skills, physical examination and common laboratory procedures such as Gram's stain, Wright-Giemsa stain and urinalysis). After the two years training, the minimal training period of laboratory medicine should be three years, and should include: 1) Principles, instrumentation and techniques of each discipline including clinical chemistry, clinical hematology, clinical microbiology, clinical immunology, blood banking and other specific areas. 2) The use of laboratory information in a medical setting. 3) Interaction of the laboratory physician with laboratory staff, physicians and patients. With good on-the-job training and 24 hours on-call duties, laboratory physicians are expected to perform their tasks, including laboratory management, effectively. They should have appropriate educational background and should be well motivated. The background and duties of the laboratory physicians often reflect the institutional needs and personal philosophy of the chairperson of their department. At the moment, few senior physicians in Japan have qualifications in laboratory medicine and are unable, therefore, to provide the necessary guidance to help the laboratory physicians in their work. I therefore believe that the board certification of JSLM should be regarded as mandatory for chairpersons of laboratory medicine. Our on-call service system can enhance the training in laboratory medicine, and improve not only laboratory quality assurance but patients' care as well.

  20. The prestimulus default mode network state predicts cognitive task performance levels on a mental rotation task.

    PubMed

    Kamp, Tabea; Sorger, Bettina; Benjamins, Caroline; Hausfeld, Lars; Goebel, Rainer

    2018-06-22

    Linking individual task performance to preceding, regional brain activation is an ongoing goal of neuroscientific research. Recently, it could be shown that the activation and connectivity within large-scale brain networks prior to task onset influence performance levels. More specifically, prestimulus default mode network (DMN) effects have been linked to performance levels in sensory near-threshold tasks, as well as cognitive tasks. However, it still remains uncertain how the DMN state preceding cognitive tasks affects performance levels when the period between task trials is long and flexible, allowing participants to engage in different cognitive states. We here investigated whether the prestimulus activation and within-network connectivity of the DMN are predictive of the correctness and speed of task performance on a cognitive (match-to-sample) mental rotation task, employing a sparse event-related functional magnetic resonance imaging (fMRI) design. We found that prestimulus activation in the DMN predicted the speed of correct trials, with a higher amplitude preceding correct fast response trials compared to correct slow response trials. Moreover, we found higher connectivity within the DMN before incorrect trials compared to correct trials. These results indicate that pre-existing activation and connectivity states within the DMN influence performance on cognitive tasks, affecting both the correctness and the speed of task execution. The findings support existing theories and empirical work relating mind-wandering and cognitive task performance to the DMN, and extend these by establishing a relationship between the prestimulus DMN state and the speed of cognitive task performance. © 2018 The Authors. Brain and Behavior published by Wiley Periodicals, Inc.

  1. ISDP salt batch #2 supernate qualification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peters, T. B.; Nash, C. A.; Fink, S. D.

    2009-01-05

    This report covers the laboratory testing and analyses of the second Integrated Salt Disposition Project (ISDP) salt supernate samples, performed in support of initial radioactive operations of the Actinide Removal Process (ARP) and the Modular Caustic-Side Solvent Extraction Unit (MCU). Major goals of this work include characterizing Tank 22H supernate, characterizing Tank 41H supernate, verifying actinide and strontium adsorption with a standard laboratory-scale test using monosodium titanate (MST) and filtration, and checking cesium mass transfer behavior for the MCU solvent when contacted with the liquid produced from MST contact. This study also includes characterization of a post-blend Tank 49H sample as part of the Nuclear Criticality Safety Evaluation (NCSE). This work was specified by a Task Technical Request and by a Task Technical and Quality Assurance Plan (TTQAP). In addition, a sampling plan will be written to guide future analytical work. Safety and environmental aspects of the work were documented in a Hazard Assessment Package.

  2. Is Borg's perceived exertion scale a useful indicator of muscular and cardiovascular load in blue-collar workers with lifting tasks? A cross-sectional workplace study.

    PubMed

    Jakobsen, Markus Due; Sundstrup, Emil; Persson, Roger; Andersen, Christoffer H; Andersen, Lars L

    2014-02-01

    To investigate associations between perceived exertion and objectively assessed muscular and cardiovascular load during a full working day among workers with manual lifting tasks. A total of 159 men and 41 women from 14 workplaces with manual lifting tasks participated. Participants reported perceived exertion (Borg CR10) at midday and after work. Surface electromyography of the thigh, lower back and neck muscles was normalized to maximal isometric voluntary contractions (MVC) to express relative muscle load during the day. Cardiovascular load was measured with electrocardiography and calculated as the average percentage of the heart rate reserve used during the day: ((heart rate during work - resting heart rate) / (maximum heart rate - resting heart rate)) * 100. Using linear regression, significant but weak associations (β < 0.23) were observed between perceived exertion and (1) high muscle activity (>60% of MVC) of the neck muscles, (2) inactivity (<1% of MVC) of the thigh muscles, and (3) cardiovascular load. Using logistic regression, perceived exertion ≥4 (high exertion), referencing <4 (low-to-moderate exertion), was related to high activity of the trapezius muscle [OR 18 (95% CI 2-143)], i.e., the odds of experiencing high exertion during work increased 18-fold for each percentage increase in time above 60% MVC. During a full working day among blue-collar workers with lifting tasks, high neck muscle activity increases the odds of experiencing high perceived physical exertion. Perceived exertion of at least 4 on the Borg CR10 scale appears to be a good indicator that high muscular loading occurs.
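    The heart-rate-reserve formula quoted in this record can be sketched directly in code. This is a minimal illustration of that calculation only; the function name and the example heart rates are ours, not the study's:

```python
def percent_hrr(work_hr: float, resting_hr: float, max_hr: float) -> float:
    """Cardiovascular load as a percentage of heart rate reserve:
    ((work HR - resting HR) / (max HR - resting HR)) * 100."""
    reserve = max_hr - resting_hr
    if reserve <= 0:
        raise ValueError("max_hr must exceed resting_hr")
    return (work_hr - resting_hr) / reserve * 100.0

# Illustrative values: average work HR 120 bpm, resting 60 bpm, maximum 180 bpm.
print(percent_hrr(120, 60, 180))  # 50.0
```

    A resting-level work heart rate gives 0% of reserve and a maximal one gives 100%, which is why the measure is comparable across workers with different resting and maximum heart rates.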

  3. Persistence in soil of Miscanthus biochar in laboratory and field conditions

    PubMed Central

    Budai, Alice; O’Toole, Adam; Ma, Xingzhu; Rumpel, Cornelia; Abiven, Samuel

    2017-01-01

    Evaluating biochars for their persistence in soil under field conditions is an important step towards their implementation for carbon sequestration. Current evaluations might be biased because the vast majority of studies are short-term laboratory incubations of biochars produced in laboratory-scale pyrolyzers. Here our objective was to investigate the stability of a biochar produced with a medium-scale pyrolyzer, first through laboratory characterization and stability tests and then through field experiment. We also aimed at relating properties of this medium-scale biochar to that of a laboratory-made biochar with the same feedstock. Biochars were made of Miscanthus biomass for isotopic C-tracing purposes and produced at temperatures between 600 and 700°C. The aromaticity and degree of condensation of aromatic rings of the medium-scale biochar was high, as was its resistance to chemical oxidation. In a 90-day laboratory incubation, cumulative mineralization was 0.1% for the medium-scale biochar vs. 45% for the Miscanthus feedstock, pointing to the absence of labile C pool in the biochar. These stability results were very close to those obtained for biochar produced at laboratory-scale, suggesting that upscaling from laboratory to medium-scale pyrolyzers had little effect on biochar stability. In the field, the medium-scale biochar applied at up to 25 t C ha-1 decomposed at an estimated 0.8% per year. In conclusion, our biochar scored high on stability indices in the laboratory and displayed a mean residence time > 100 years in the field, which is the threshold for permanent removal in C sequestration projects. PMID:28873471

  4. Measuring ignitability for in situ burning of oil spills weathered under Arctic conditions: from laboratory studies to large-scale field experiments.

    PubMed

    Fritt-Rasmussen, Janne; Brandvik, Per Johan

    2011-08-01

    This paper compares the ignitability of Troll B crude oil weathered under simulated Arctic conditions (0%, 50% and 90% ice cover). The experiments were performed in different scales at SINTEF's laboratories in Trondheim, field research station on Svalbard and in broken ice (70-90% ice cover) in the Barents Sea. Samples from the weathering experiments were tested for ignitability using the same laboratory burning cell. The measured ignitability from the experiments in these different scales showed a good agreement for samples with similar weathering. The ice conditions clearly affected the weathering process, and 70% ice or more reduces the weathering and allows a longer time window for in situ burning. The results from the Barents Sea revealed that weathering and ignitability can vary within an oil slick. This field use of the burning cell demonstrated that it can be used as an operational tool to monitor the ignitability of oil spills. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. Application of Cognitive Task Analysis in User Requirements and Prototype Design Presentation/Briefing

    DTIC Science & Technology

    2005-10-01

    AFRL-HE-WP-TP-2005-0030, Air Force Research Laboratory. Application of Cognitive Task Analysis in User Requirements Definition and Prototype Design (presentation/briefing), Christopher Curtis. Contract No. FA8650-04-C-6406. The briefing addresses the application of cognitive task analysis to user requirements definition and prototype design, including maintainer experience, and closes with a questions-and-answers session.

  6. Laboratory and field scale bioremediation of hexachlorocyclohexane (HCH) contaminated soils by means of bioaugmentation and biostimulation.

    PubMed

    Garg, Nidhi; Lata, Pushp; Jit, Simran; Sangwan, Naseer; Singh, Amit Kumar; Dwivedi, Vatsala; Niharika, Neha; Kaur, Jasvinder; Saxena, Anjali; Dua, Ankita; Nayyar, Namita; Kohli, Puneet; Geueke, Birgit; Kunz, Petra; Rentsch, Daniel; Holliger, Christof; Kohler, Hans-Peter E; Lal, Rup

    2016-06-01

    Hexachlorocyclohexane (HCH) contaminated soils were treated for a period of up to 64 days in situ (HCH dumpsite, Lucknow) and ex situ (University of Delhi) in line with three bioremediation approaches. The first approach, biostimulation, involved addition of ammonium phosphate and molasses, while the second approach, bioaugmentation, involved addition of a microbial consortium consisting of a group of HCH-degrading sphingomonads that were isolated from HCH contaminated sites. The third approach involved a combination of biostimulation and bioaugmentation. The efficiency of the consortium was investigated in laboratory scale experiments, in a pot scale study, and in a full-scale field trial. It turned out that the approach of combining biostimulation and bioaugmentation was most effective in achieving reduction in the levels of α- and β-HCH and that the application of a bacterial consortium as compared to the action of a single HCH-degrading bacterial strain was more successful. Although further degradation of β- and δ-tetrachlorocyclohexane-1,4-diol, the terminal metabolites of β- and δ-HCH, respectively, did not occur by the strains comprising the consortium, these metabolites turned out to be less toxic than the parental HCH isomers.

  7. Eighteen-month-olds' memory interference and distraction in a modified A-not-B task is not associated with their anticipatory looking in a false-belief task.

    PubMed

    Zmyj, Norbert; Prinz, Wolfgang; Daum, Moritz M

    2015-01-01

    Infants' performance in non-verbal false-belief tasks is often interpreted as if they have understood false beliefs. This view has been questioned by a recent account that explains infants' performance in non-verbal false-belief tasks as the result of susceptibility to memory interference and distraction. We tested this alternative account by investigating the relationship between infants' false-belief understanding, susceptibility to memory interference and distraction, and general cognitive development in 18-month-old infants (N = 22). False-belief understanding was tested in an anticipatory looking paradigm of a standard false-belief task. Susceptibility to memory interference and distraction was tested in a modified A-not-B task. Cognitive development was measured via the Mental Scale of the Bayley Scales of Infant Development. We did not find any relationship between infants' performance in the false-belief task and the A-not-B task, even after controlling for cognitive development. This study shows that there is no ubiquitous relation between susceptibility to memory interference and distraction and performance in a false-belief task in infancy.

  9. Fumigation of a laboratory-scale HVAC system with hydrogen peroxide for decontamination following a biological contamination incident.

    PubMed

    Meyer, K M; Calfee, M W; Wood, J P; Mickelsen, L; Attwood, B; Clayton, M; Touati, A; Delafield, R

    2014-03-01

    To evaluate hydrogen peroxide vapour (H2O2) for its ability to inactivate Bacillus spores within a laboratory-scale heating, ventilation and air-conditioning (HVAC) duct system. Experiments were conducted in a closed-loop duct system, constructed of either internally lined or unlined galvanized metal. Bacterial spores were aerosol-deposited onto 18-mm-diameter test material coupons and strategically placed at several locations within the duct environment. Various concentrations of H2O2 and exposure times were evaluated to determine the sporicidal efficacy and minimum exposure needed for decontamination. For the unlined duct, high variability was observed in the recovery of spores between sample locations, likely due to complex, unpredictable flow patterns within the ducts. In comparison, the lined duct exhibited significant desorption of the H2O2 following the fumigant dwell period and thus resulted in complete decontamination at all sampling locations. These findings suggest that decontamination of Bacillus spore-contaminated unlined HVAC ducts by hydrogen peroxide fumigation may require more stringent conditions (higher concentrations, longer dwell duration) than internally insulated ductwork. These data may help emergency responders when developing remediation plans during building decontamination. © 2013 The Society for Applied Microbiology. This article has been contributed to by US Government employees and their work is in the public domain in the USA.

  10. Bridge-in-a-backpack(TM) : task 3.2: investigating soil - structure interaction - modeling and experimental results of steel arches.

    DOT National Transportation Integrated Search

    2015-07-01

    This report includes fulfillment of Task 3.2 of a multi-task contract to further enhance concrete-filled FRP tubes, or the Bridge in a Backpack. Task 3 is an investigation of soil-structure interaction for the FRP tubes. Task 3.2 is the modeling ...

  11. Measuring ability to enhance and suppress emotional expression: The Flexible Regulation of Emotional Expression (FREE) Scale.

    PubMed

    Burton, Charles L; Bonanno, George A

    2016-08-01

    Flexibility in self-regulatory behaviors has proved to be an important quality for adjusting to stressful life events and requires individuals to have a diverse repertoire of emotion regulation abilities. However, the most commonly used emotion regulation questionnaires assess frequency of behavior rather than ability, with little evidence linking these measures to observable capacity to enact a behavior. The aim of the current investigation was to develop and validate a Flexible Regulation of Emotional Expression (FREE) Scale that measures a person's ability to enhance and suppress displayed emotion across an array of hypothetical contexts. In Studies 1 and 2, a series of confirmatory factor analyses revealed that the FREE Scale consists of 4 first-order factors divided by regulation and emotional valence type that can contribute to 2 higher order factors: expressive enhancement ability and suppression ability. In Study 1, we also compared the FREE Scale to other commonly used emotion regulation measures, which revealed that suppression ability is conceptually distinct from suppression frequency. In Study 3, we compared the FREE Scale with a composite of traditional frequency-based indices of expressive regulation to predict performance in a previously validated emotional modulation paradigm. Participants' enhancement and suppression ability scores on the FREE Scale predicted their corresponding performance on the laboratory task, even when controlling for baseline expressiveness. These studies suggest that the FREE Scale is a valid and flexible measure of expressive regulation ability. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  12. Magnetic resonance imaging in laboratory petrophysical core analysis

    NASA Astrophysics Data System (ADS)

    Mitchell, J.; Chandrasekera, T. C.; Holland, D. J.; Gladden, L. F.; Fordham, E. J.

    2013-05-01

    Magnetic resonance imaging (MRI) is a well-known technique in medical diagnosis and materials science. In the more specialized arena of laboratory-scale petrophysical rock core analysis, the role of MRI has undergone a substantial change in focus over the last three decades. Initially, alongside the continual drive to exploit higher magnetic field strengths in MRI applications for medicine and chemistry, the same trend was followed in core analysis. However, the spatial resolution achievable in heterogeneous porous media is inherently limited due to the magnetic susceptibility contrast between solid and fluid. As a result, imaging resolution at the length-scale of typical pore diameters is not practical, and so MRI of core-plugs has often been viewed as an inappropriate use of expensive magnetic resonance facilities. Recently, there has been a paradigm shift in the use of MRI in laboratory-scale core analysis. The focus is now on acquiring data in the laboratory that are directly comparable to data obtained from magnetic resonance well-logging tools (i.e., a common physics of measurement). To maintain consistency with well-logging instrumentation, it is desirable to measure distributions of transverse (T2) relaxation time, the industry-standard metric in well-logging, at the laboratory scale. These T2 distributions can be spatially resolved over the length of a core-plug. The use of low-field magnets in the laboratory environment is optimal for core analysis not only because the magnetic field strength is closer to that of well-logging tools, but also because the magnetic susceptibility contrast is minimized, allowing the acquisition of quantitative image voxel (or pixel) intensities that are directly scalable to liquid volume. Beyond simple determination of macroscopic rock heterogeneity, it is possible to utilize the spatial resolution for monitoring forced displacement of oil by water or chemical agents, determining capillary pressure curves, and estimating

  13. Job Management and Task Bundling

    NASA Astrophysics Data System (ADS)

    Berkowitz, Evan; Jansen, Gustav R.; McElvain, Kenneth; Walker-Loud, André

    2018-03-01

    High Performance Computing is often performed on scarce and shared computing resources. To ensure computers are used to their full capacity, administrators often incentivize large workloads that are not possible on smaller systems. Measurements in Lattice QCD frequently do not scale to machine-size workloads. By bundling tasks together we can create large jobs suitable for gigantic partitions. We discuss METAQ and mpi_jm, software developed to dynamically group computational tasks together, that can intelligently backfill to consume idle time without substantial changes to users' current workflows or executables.
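    The bundling strategy described here can be illustrated with a generic first-fit-decreasing packing sketch. This is an assumption for illustration only, not the actual METAQ or mpi_jm implementation; the function name and packing policy are ours:

```python
def bundle_tasks(task_sizes, partition_size):
    """Group small task sizes into bundles that each fit within a large
    partition, using a first-fit-decreasing heuristic."""
    bundles = []  # each bundle is a list of task sizes
    for size in sorted(task_sizes, reverse=True):
        # Place the task in the first bundle with room, else open a new one.
        for bundle in bundles:
            if sum(bundle) + size <= partition_size:
                bundle.append(size)
                break
        else:
            bundles.append([size])
    return bundles

# Four small tasks packed for a partition of size 4:
print(bundle_tasks([3, 3, 2, 2], 4))  # [[3], [3], [2, 2]]
```

    Dynamic backfilling, as described for mpi_jm, would additionally dispatch new tasks into a bundle whenever one of its running tasks finishes early, rather than packing once up front.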

  14. Final report: survey and removal of radioactive surface contamination at environmental restoration sites, Sandia National Laboratories/New Mexico. Volume 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lambert, K.A.; Mitchell, M.M.; Jean, D.

    1997-09-01

    This report contains the Appendices A-L including Voluntary Corrective Measure Plans, Waste Management Plans, Task-Specific Health and Safety Plan, Analytical Laboratory Procedures, Soil Sample Results, In-Situ Gamma Spectroscopy Results, Radionuclide Activity Summary, TCLP Soil Sample Results, Waste Characterization Memoranda, Waste Drum Inventory Data, Radiological Risk Assessment, and Summary of Site-Specific Recommendations.

  15. 2. VIEW IN ROOM 111, ATOMIC ABSORPTION BERYLLIUM ANALYSIS LABORATORY. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VIEW IN ROOM 111, ATOMIC ABSORPTION BERYLLIUM ANALYSIS LABORATORY. AIR FILTERS AND SWIPES ARE DISSOLVED WITH ACIDS AND THE REMAINING RESIDUES ARE SUSPENDED IN NITRIC ACID SOLUTION. THE SOLUTION IS PROCESSED THROUGH THE ATOMIC ABSORPTION SPECTROPHOTOMETER TO DETECT THE PRESENCE AND LEVELS OF BERYLLIUM. - Rocky Flats Plant, Health Physics Laboratory, On Central Avenue between Third & Fourth Streets, Golden, Jefferson County, CO

  16. Task Repetition Effects on L1 Use in EFL Child Task-Based Interaction

    ERIC Educational Resources Information Center

    Azkarai, Agurtzane; García Mayo, María del Pilar

    2017-01-01

    Research has shown that tasks provide second language (L2) learners with many opportunities to learn the L2. Task repetition has been claimed to benefit L2 learning since familiarity with procedure and/or content gives learners the chance to focus on more specific aspects of language. Most research on task repetition has focused on adult…

  17. Effective Laboratory Experiences for Students with Disabilities: The Role of a Student Laboratory Assistant

    NASA Astrophysics Data System (ADS)

    Pence, Laura E.; Workman, Harry J.; Riecke, Pauline

    2003-03-01

    Two separate experiences with students whose disabilities significantly limited the number of laboratory activities they could accomplish independently have given us a general experience base for determining successful strategies for accommodating students facing these situations. For a student who had substantially limited physical mobility and for a student who had no visual ability, employing a student laboratory assistant allowed the students with disabilities to have a productive and positive laboratory experience. One of the priorities in these situations should be to avoid depersonalizing the student with a disability. Interactions with the instructor and with other students should focus on the disabled student rather than on the student laboratory assistant, who may be carrying out specific tasks. One of the most crucial aspects of a successful project is the selection of a laboratory assistant who has excellent interpersonal skills and who will add his or her creativity to that of the student with a disability to meet unforeseen challenges. Other considerations are discussed, such as the importance of advance notification that a disabled student has enrolled in a course, as well as factors that should contribute to choosing an optimum laboratory station for each situation.

  18. Highly efficient enzymatic synthesis of 2-monoacylglycerides and structured lipids and their production on a technical scale.

    PubMed

    Pfeffer, Jan; Freund, Andreas; Bel-Rhlid, Rachid; Hansen, Carl-Erik; Reuss, Matthias; Schmid, Rolf D; Maurer, Steffen C

    2007-10-01

    We report here a two-step process for the high-yield enzymatic synthesis of 2-monoacylglycerides (2-MAG) of saturated as well as unsaturated fatty acids with different chain lengths. The process consists of two steps: first, the unselective esterification of fatty acids and glycerol, leading to a triacylglyceride, followed by an sn1,3-selective alcoholysis reaction yielding 2-monoacylglycerides. Remarkably, both steps can be catalyzed by lipase B from Candida antarctica (CalB). The whole process, including esterification and alcoholysis, was scaled up in a miniplant to a total volume of 10 L. At this volume, a two-step process catalyzed by CalB for the synthesis of 1,3-oleoyl-2-palmitoylglycerol (OPO) using tripalmitate as starting material was established. On a laboratory scale, we obtained gram quantities of the synthesized 2-monoacylglycerides of polyunsaturated fatty acids such as arachidonic, docosahexaenoic, and eicosapentaenoic acids, with up to 96.4% of the theoretically possible yield at 95% purity. On a technical scale (>100 g of product, >5 L of reaction volume), 97% yield was reached in the esterification and 73% in the alcoholysis, and a promising new process for the enzymatic synthesis of OPO was established.
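
    As a quick consistency check on the technical-scale figures, the two sequential step yields multiply to give the overall yield of the process:

```python
# Overall yield of the two-step process from the reported step yields:
# 97% esterification followed by 73% alcoholysis.
esterification = 0.97
alcoholysis = 0.73
overall = esterification * alcoholysis
print(round(overall, 3))  # → 0.708, i.e. about 71% overall
```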

  19. Solvent-free microwave extraction of essential oil from aromatic herbs: from laboratory to pilot and industrial scale.

    PubMed

    Filly, Aurore; Fernandez, Xavier; Minuti, Matteo; Visinoni, Francesco; Cravotto, Giancarlo; Chemat, Farid

    2014-05-01

    Solvent-free microwave extraction (SFME) has been proposed as a green method for the extraction of essential oil from aromatic herbs that are extensively used in the food industry. This technique is a combination of microwave heating and dry distillation performed at atmospheric pressure without any added solvent or water. The isolation and concentration of volatile compounds are performed in a single stage. In this work, SFME and a conventional technique, hydro-distillation (HD, Clevenger apparatus), are used for the extraction of essential oil from rosemary (Rosmarinus officinalis L.) and are compared. This preliminary laboratory study shows that essential oils extracted by SFME in 30 min were quantitatively (yield and kinetic profile) and qualitatively (aromatic profile) similar to those obtained using conventional hydro-distillation in 2 h. Experiments performed in a 75 L pilot microwave reactor prove the feasibility of scaling up SFME and its potential for industrial applications.

  20. Color temperature's impact on task performance and brainwaves of school-age children.

    PubMed

    Park, YunHee

    2015-10-01

    [Purpose] This study investigated color temperature's impact on task performance. It presents a scientific analysis of brainwave and task performance time changes, and the results of a self-report survey. [Subjects] Twenty-four elementary school fifth-grade boys and girls with no visual problems participated in the experiment. [Methods] Physiological reaction times during task performance were measured in a laboratory in which the color temperature could be fixed and maintained. Brainwave changes and task performance times were measured, and a self-report questionnaire was administered in order to measure emotional reactions. [Results] Regarding the brainwave changes associated with color temperature, alpha waves were emitted in the O2 area when puzzle tasks were illuminated by orange light, and low and high beta waves were emitted in the F3 area under white light. Five items (Brilliant, Soft, Lively, Relaxed, Open) were reported predominantly in responses to orange light in the self-report questionnaire. [Conclusion] The results of this study show that relaxation and stability are not assured when the color temperature is low, and that concentration and cognitive activity are not necessarily easier when the color temperature is high. When performing tasks, the change in color temperature promoted emotional factors more than brainwaves, a biological change.