Sample records for parallel laboratory experiments

  1. A Laboratory Preparation of Aspartame Analogs Using Simultaneous Multiple Parallel Synthesis Methodology

    ERIC Educational Resources Information Center

    Qvit, Nir; Barda, Yaniv; Gilon, Chaim; Shalev, Deborah E.

    2007-01-01

    This laboratory experiment provides a unique opportunity for students to synthesize three analogues of aspartame, a commonly used artificial sweetener. The students are introduced to the powerful and useful method of parallel synthesis while synthesizing three dipeptides in parallel using solid-phase peptide synthesis (SPPS) and simultaneous…

  2. Theoretical and Experimental Study of the Primary Current Distribution in Parallel-Plate Electrochemical Reactors

    ERIC Educational Resources Information Center

    Vazquez Aranda, Armando I.; Henquin, Eduardo R.; Torres, Israel Rodriguez; Bisang, Jose M.

    2012-01-01

    A laboratory experiment is described to determine the primary current distribution in parallel-plate electrochemical reactors. The electrolyte is simulated by conductive paper and the electrodes are segmented to measure the current distribution. Experiments are reported with the electrolyte confined to the interelectrode gap, where the current…

  3. A Laboratory Exercise in Physics: Determining Single Capacitances and Series and Parallel Combinations of Capacitance.

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    This document presents a series of physics experiments which allow students to determine the value of unknown electrical capacitors. The exercises include both parallel and series connected capacitors. (SL)

  4. Experiment E89-044 on Quasi-Elastic 3He(e,e'p) Scattering at Jefferson Laboratory: Analysis of Two-Body Breakup Cross Sections in Parallel Kinematics (in French)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Penel-Nottaris, Emilie

    2004-07-01

    The Jefferson Lab Hall A experiment has measured the 3He(e,e'p) reaction cross sections. The separation of the longitudinal and transverse response functions for the two-body breakup reaction in parallel kinematics allows study of the bound proton's electromagnetic properties in the 3He nucleus and of the nuclear mechanisms involved beyond the impulse approximation. Preliminary cross sections show some disagreement with theoretical predictions for forward-angle kinematics around 0 MeV/c missing momentum, and sensitivity to final-state interactions and the 3He wave function for missing momenta of 300 MeV/c.

  5. The Microgravity Science Glovebox

    NASA Technical Reports Server (NTRS)

    Baugher, Charles R.; Primm, Lowell (Technical Monitor)

    2001-01-01

    The Microgravity Science Glovebox (MSG) provides scientific investigators the opportunity to implement interactive experiments on the International Space Station. The facility has been designed around the concept of an enclosed scientific workbench that allows the crew to assemble and operate an experimental apparatus with participation from ground-based scientists through real-time data and video links. Workbench utilities provided to operate the experiments include power, data acquisition, computer communications, vacuum, nitrogen, and specialized tools. Because the facility work area is enclosed and held at a negative pressure with respect to the crew living area, the requirements on the experiments for containment of small parts, particulates, fluids, and gases are substantially reduced. This environment allows experiments to be constructed in close parallel with bench-type investigations performed in ground-based laboratories. Such an approach enables experimental scientists to develop hardware that more closely parallels their traditional laboratory experience and to transfer these experiments into meaningful space-based research. When delivered to the ISS, the MSG will represent a significant scientific capability that will be continuously available for a decade of evolutionary research.

  6. Parallel collisionless shocks forming in simulations of the LAPD experiment

    NASA Astrophysics Data System (ADS)

    Weidl, Martin S.; Jenko, Frank; Niemann, Chris; Winske, Dan

    2016-10-01

    Research on parallel collisionless shocks, most prominently occurring in the Earth's bow shock region, has so far been limited to satellite measurements and simulations. However, the formation of collisionless shocks depends on a wide range of parameters and scales, which can be accessed more easily in a laboratory experiment. Using a kJ-class laser, an ongoing experimental campaign at the Large Plasma Device (LAPD) at UCLA is expected to produce the first laboratory measurements of the formation of a parallel collisionless shock. We present hybrid kinetic/MHD simulations that show how beam instabilities in the background plasma can be driven by ablating carbon ions from a target, causing non-linear density oscillations which develop into a propagating shock front. The free-streaming carbon ions can excite both the resonant right-hand instability and the non-resonant firehose mode. We analyze their respective roles and discuss optimizing their growth rates to speed up the process of shock formation.

  7. Parallel, staged opening switch power conditioning techniques for flux compression generator applications

    NASA Astrophysics Data System (ADS)

    Reinovsky, R. E.; Levi, P. S.; Bueck, J. C.; Goforth, J. H.

    The Air Force Weapons Laboratory, working jointly with Los Alamos National Laboratory, has conducted a series of experiments directed at exploring composite, or staged, switching techniques for use in opening switches in applications which require the conduction of very high currents (or current densities) with very low losses for relatively long times (several tens of microseconds), and the interruption of these currents in much shorter times (ultimately a few hundred nanoseconds). The results of those experiments are reported.

  8. Solution-Phase Synthesis of Dipeptides: A Capstone Project That Employs Key Techniques in an Organic Laboratory Course

    ERIC Educational Resources Information Center

    Marchetti, Louis; DeBoef, Brenton

    2015-01-01

    A contemporary approach to the synthesis and purification of several UV-active dipeptides has been developed for the second-year organic laboratory. This experiment exposes students to the important technique of solution-phase peptide synthesis and allows an instructor to highlight the parallel between what they are accomplishing in the laboratory…

  9. Metal-Acetylacetonate Synthesis Experiments: Which Is Greener?

    ERIC Educational Resources Information Center

    Ribeiro, M. Gabriela T. C.; Machado, Adlio A. S. C.

    2011-01-01

    A procedure for teaching green chemistry through laboratory experiments is presented in which students are challenged to use the 12 principles of green chemistry to review and modify synthesis protocols to improve greenness. A global metric, green star, is used in parallel with green chemistry mass metrics to evaluate the improvement in greenness.…

  10. X-Ray Spectroscopic Laboratory Experiments in Support of the X-Ray Astronomy Program

    NASA Technical Reports Server (NTRS)

    Kahn, Steven M.

    1997-01-01

    Our program is to perform a series of laboratory investigations designed to resolve significant atomic physics uncertainties that limit the interpretation of cosmic X-ray spectra. Specific goals include a quantitative characterization of Fe L-shell spectra; the development of new techniques to simulate Maxwellian plasmas using an Electron Beam Ion Trap (EBIT); and the measurement of dielectronic recombination rates for photoionized gas. New atomic calculations have also been carried out in parallel with the laboratory investigations.

  11. A laboratory study of ion energization by EIC waves and subsequent upstreaming along diverging magnetic field lines

    NASA Technical Reports Server (NTRS)

    Cartier, S. L.; Dangelo, N.; Merlino, R. L.

    1986-01-01

    A laboratory study related to energetic upstreaming ions in the ionosphere-magnetosphere system is described. The experiment was carried out in a cesium Q machine plasma with a region of nonuniform magnetic field. Electrostatic ion cyclotron waves were excited by drawing an electron current to a small biased exciter electrode. In the presence of the instability, ions are heated in the direction perpendicular to B. Using a gridded retarding potential ion energy analyzer, the evolution of the ion velocity distribution was followed as the ions passed through the heating region and subsequently flowed out along the diverging B field lines. As expected, the heated ions transfer their energy from perpendicular to parallel motion as they move through the region of diverging B field. Both their parallel thermal energy and the parallel drift energy increase at the expense of the perpendicular energy.

  12. A landmark recognition and tracking experiment for flight on the Shuttle/Advanced Technology Laboratory (ATL)

    NASA Technical Reports Server (NTRS)

    Welch, J. D.

    1975-01-01

    The preliminary design of an experiment for landmark recognition and tracking from the Shuttle/Advanced Technology Laboratory is described. It makes use of parallel coherent optical processing to perform correlation tests between landmarks observed passively with a telescope and previously made holographic matched filters. The experimental equipment, including the optics, the low-power laser, the random-access file of matched filters, and the electro-optical readout device, is described. A real-time optically excited liquid crystal device is recommended for performing the input non-coherent-to-coherent optical interface function. A development program leading to a flight experiment in 1981 is outlined.

  13. Accretion shocks in the laboratory: Design of an experiment to study star formation

    DOE PAGES

    Young, Rachel P.; Kuranz, C. C.; Drake, R. P.; ...

    2017-02-13

    Here, we present the design of a laboratory-astrophysics experiment to study magnetospheric accretion relevant to young, pre-main-sequence stars. Spectra of young stars show evidence of hotspots created when streams of accreting material impact the surface of the star and create shocks. The structures that form during this process are poorly understood, as the surfaces of young stars cannot be spatially resolved. Our experiment would create a scaled "accretion shock" at a major (several kJ) laser facility. The experiment drives a plasma jet (the "accretion stream") into a solid block (the "stellar surface"), in the presence of a parallel magnetic field analogous to the star's local field.

  14. Reconstruction of coded aperture images

    NASA Technical Reports Server (NTRS)

    Bielefeld, Michael J.; Yin, Lo I.

    1987-01-01

    The balanced correlation method and the Maximum Entropy Method (MEM) were implemented to reconstruct a laboratory X-ray source as imaged by a Uniformly Redundant Array (URA) system. Although the MEM has advantages over the balanced correlation method, it is computationally time consuming because of the iterative nature of its solution. The Massively Parallel Processor (MPP), with its parallel array structure, is ideally suited for such computations. These preliminary results indicate that it is possible to use the MEM in future coded-aperture experiments with the help of the MPP.
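    The balanced-correlation decoding named in this abstract admits a compact sketch. As an illustrative assumption (not taken from the record), a 1-D cyclic mask built from quadratic residues modulo a prime stands in for the 2-D URA pattern; the decoding array G = 2A - 1 is the standard balanced choice.

```python
import numpy as np

def qr_mask(p):
    # 1-D cyclic mask from quadratic residues mod a prime p (p = 3 mod 4):
    # A[k] = 1 if k is a non-zero quadratic residue mod p, else 0.
    residues = {(k * k) % p for k in range(1, p)}
    return np.array([1 if k in residues else 0 for k in range(p)])

def encode(scene, mask):
    # Detector counts: cyclic convolution of the scene with the mask pattern.
    p = len(mask)
    return np.array([sum(scene[s] * mask[(i - s) % p] for s in range(p))
                     for i in range(p)])

def decode_balanced(detector, mask):
    # Balanced correlation: correlate the detector with G = 2A - 1, so the
    # mask's flat off-peak autocorrelation cancels to a constant background.
    p = len(mask)
    G = 2 * mask - 1
    return np.array([sum(detector[i] * G[(i - j) % p] for i in range(p))
                     for j in range(p)])

p = 31                    # prime length, 31 = 3 mod 4
mask = qr_mask(p)
scene = np.zeros(p)
scene[11] = 5.0           # a single point source at position 11
rec = decode_balanced(encode(scene, mask), mask)
peak = int(np.argmax(rec))  # recovers the source position, 11
```

    Because the quadratic-residue mask has a flat off-peak cyclic autocorrelation, the point source comes back as one sharp peak over a constant background in a single correlation pass, which is the property that makes balanced correlation so much cheaper than iterative MEM reconstruction.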

  15. Professor Created On-line Biology Laboratory Course

    NASA Technical Reports Server (NTRS)

    Bowman, Arthur W.

    2010-01-01

    This paper will share the creation, implementation, and modification of an online college-level general biology laboratory course offered for non-science majors as part of a General Education Curriculum. The ability of professors to develop quality online laboratories will address a growing need in higher education as more institutions combine course sections and look for suitable alternative course-delivery formats because declining departmental budgets require reductions in staffing, equipment, and supplies. There is an equal or greater need for professors to develop the ability to create online laboratory experiences, because many of the currently available online laboratory course packages from publishers do not adequately parallel on-campus laboratory courses or are not well aligned with the companion lecture sections. From a variety of scientific simulation and animation web sites, professors can easily identify material that closely fits the specific needs of their courses, instructional environment, and the students they serve. All too often, on-campus laboratory courses in the sciences provide what are termed confirmation experiences, which do not allow students to experience science as it is carried out by scientists. Creatively developed online laboratory experiences can often provide the type of authentic investigative experiences that are not possible on campus due to the time constraints of a typical laboratory course that meets once per week for two hours. In addition, online laboratory courses can make it easier for students to complete missing laboratory assignments and to extend introductory exercises into more advanced undertakings where a greater sense of scientific discovery can be experienced. Professors are strongly encouraged to begin creating online laboratory exercises for their courses, and to consider issues regarding assessment, copyright, and intellectual property.

  16. Magnetic turbulence in a table-top laser-plasma relevant to astrophysical scenarios

    NASA Astrophysics Data System (ADS)

    Chatterjee, Gourab; Schoeffler, Kevin M.; Kumar Singh, Prashant; Adak, Amitava; Lad, Amit D.; Sengupta, Sudip; Kaw, Predhiman; Silva, Luis O.; Das, Amita; Kumar, G. Ravindra

    2017-06-01

    Turbulent magnetic fields abound in nature, pervading astrophysical, solar, terrestrial and laboratory plasmas. Understanding the ubiquity of magnetic turbulence and its role in the universe is an outstanding scientific challenge. Here, we report on the transition of magnetic turbulence from an initially electron-driven regime to one dominated by ion-magnetization in a laboratory plasma produced by an intense, table-top laser. Our observations of the saturated turbulent spectrum at the magnetized-ion scale bear a striking resemblance to spacecraft measurements of the solar wind magnetic-field spectrum, including the emergence of a spectral kink. Despite originating from diverse energy injection sources (namely, electrons in the laboratory experiment and ion free-energy sources in the solar wind), the turbulent spectra exhibit remarkable parallels. This demonstrates the independence of turbulent spectral properties from the driving source of the turbulence and highlights the potential of small-scale, table-top laboratory experiments for investigating turbulence in astrophysical environments.

  17. The effectiveness of using computer simulated experiments on junior high students' understanding of the volume displacement concept

    NASA Astrophysics Data System (ADS)

    Choi, Byung-Soon; Gennaro, Eugene

    Several researchers have suggested that the computer holds much promise as a tool for science teachers for use in their classrooms (Bork, 1979; Lunetta & Hofstein, 1981). It has also been said that more research is needed to determine the effectiveness of computer software (Tinker, 1983). This study compared the effectiveness of microcomputer-simulated experiences with that of parallel instruction involving hands-on laboratory experiences for teaching the concept of volume displacement to junior high school students. It also assessed the differential effect on students' understanding of the volume displacement concept using sex of the students as another independent variable, and it compared the degree of retention, after 45 days, of both treatment groups. It was found that computer-simulated experiences were as effective as hands-on laboratory experiences, and that males who had hands-on laboratory experiences performed better on the posttest than females who had the hands-on laboratory experiences. There were no significant differences in performance when comparing males with females using the computer simulation in learning the displacement concept. This study also showed no significant differences in retention levels when the retention scores of the computer-simulation groups were compared with those of the hands-on laboratory groups. However, an ANOVA of the retention test scores revealed that males in both treatment conditions retained knowledge of volume displacement better than females.

  18. Experiment E89-044 on the Quasielastic 3He(e,e'p) Reaction at Jefferson Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Penel-Nottaris, Emilie

    The Jefferson Lab Hall A E89-044 experiment has measured the 3He(e,e'p) reaction cross-sections. The extraction of the longitudinal and transverse response functions for the two-body break-up 3He(e,e'p)d reaction in parallel kinematics allows the study of the bound proton electromagnetic properties inside the 3He nucleus and the involved nuclear mechanisms beyond plane wave approximations.

  19. Laboratory plasma probe studies

    NASA Technical Reports Server (NTRS)

    Heikkila, W. J.

    1975-01-01

    Diagnostic experiments performed in a collisionless plasma using CO2 as the working gas are described. In particular, simultaneous measurements that have been performed by means of Langmuir- and RF-probes are presented. A resonance occurring above the parallel resonance in the frequency characteristic of a two electrode system is interpreted as being due to the resonant excitation of electroacoustic waves.

  20. Further studies on the problems of geomagnetic field intensity determination from archaeological baked clay materials

    NASA Astrophysics Data System (ADS)

    Kostadinova-Avramova, M.; Kovacheva, M.

    2015-10-01

    Archaeological baked clay remains provide valuable information about the geomagnetic field in the historical past, but determination of the geomagnetic field characteristics, especially intensity, is often a difficult task. This study was undertaken to elucidate the reasons for unsuccessful intensity-determination experiments on material from two Bulgarian archaeological sites (Nessebar, Early Byzantine period, and Malenovo, Early Iron Age). To this end, artificial clay samples were formed in the laboratory and investigated. The clays used to prepare the artificial samples differed in their initial state: the Nessebar clay had been baked in antiquity, whereas the Malenovo clay was raw, taken from the clay deposit near the site. The artificial samples were repeatedly heated, eight times, to 700 °C in a known magnetic field. X-ray diffraction analyses and rock-magnetic experiments were performed to characterize the mineralogical content and magnetic properties of the initial and laboratory-heated clays. Two different protocols were applied for intensity determination: the Coe version of the Thellier-Thellier method and the multispecimen parallel differential pTRM protocol. Various combinations of laboratory fields and mutual orientations of the laboratory field and the carried thermoremanence were used in the Coe experiment. The results indicate that the failure of this experiment is probably related to unfavourable grain sizes of the prevailing magnetic carriers combined with the chosen experimental conditions. The multispecimen parallel differential pTRM protocol in its original form gives excellent results for the artificial samples, but it failed for the real samples (from the previously studied kilns at the Nessebar and Malenovo sites). Evidently the strong dependence of this method on the homogeneity of the subsamples hinders its application, in its original form, to archaeomaterials, which are often heterogeneous owing to variable heating conditions in different parts of the archaeological structures. The study draws attention to the importance of multiple heating for the stabilization of grain-size distribution in baked clay materials and to the need to elucidate this question further.

  1. Pumping ions: rapid parallel evolution of ionic regulation following habitat invasions.

    PubMed

    Lee, Carol Eunmi; Kiergaard, Michael; Gelembiuk, Gregory William; Eads, Brian Donovan; Posavi, Marijan

    2011-08-01

    Marine-to-freshwater colonizations constitute among the most dramatic evolutionary transitions in the history of life. This study examined the evolution of ionic regulation following saline-to-freshwater transitions in an invasive species. In recent years, the copepod Eurytemora affinis has invaded freshwater habitats multiple times independently. We found parallel evolutionary shifts in ion-motive enzyme activity (V-type H(+)-ATPase, Na(+)/K(+)-ATPase) across independent invasions and in replicate laboratory selection experiments. Freshwater populations exhibited increased V-type H(+)-ATPase activity in fresh water (0 PSU) and declines at higher salinity (15 PSU) relative to saline populations. This shift represented marked evolutionary increases in plasticity. In contrast, freshwater populations displayed reduced Na(+)/K(+)-ATPase activity across all salinities. Most notably, modifying salinity alone during laboratory selection experiments recapitulated the evolutionary shifts in V-type H(+)-ATPase activity observed in nature. Maternal and embryonic acclimation could not account for the observed shifts in enzyme activity. V-type H(+)-ATPase function has been hypothesized to be critical for freshwater and terrestrial adaptations, but evolution of this enzyme's function had not previously been demonstrated in the context of habitat transitions. Moreover, the speed of these evolutionary shifts was remarkable: within a few generations in the laboratory and a few decades in the wild.

  2. Calculation of heat sink around cracks formed under pulsed heat load

    NASA Astrophysics Data System (ADS)

    Lazareva, G. G.; Arakcheev, A. S.; Kandaurov, I. V.; Kasatov, A. A.; Kurkuchekov, V. V.; Maksimova, A. G.; Popov, V. A.; Shoshin, A. A.; Snytnikov, A. V.; Trunev, Yu A.; Vasilyev, A. A.; Vyacheslavov, L. N.

    2017-10-01

    Experimental and numerical simulations were carried out of the conditions that cause intense erosion and are expected to arise in a fusion reactor. The influence of relevant pulsed heat loads on tungsten was simulated using a powerful electron-beam source at BINP. Mechanical destruction, melting, and splashing of the material were observed. The laboratory experiments are accompanied by computational ones: the computational experiments made it possible to quantitatively describe the overheating caused by cracks running parallel to the surface.

  3. Evolution Is an Experiment: Assessing Parallelism in Crop Domestication and Experimental Evolution: (Nei Lecture, SMBE 2014, Puerto Rico).

    PubMed

    Gaut, Brandon S

    2015-07-01

    In this commentary, I make inferences about the level of repeatability and constraint in the evolutionary process, based on two sets of replicated experiments. The first experiment is crop domestication, which has been replicated across many different species. I focus on results of whole-genome scans for genes selected during domestication and ask whether genes are, in fact, selected in parallel across different domestication events. If genes are selected in parallel, it implies that the number of genetic solutions to the challenge of domestication is constrained. However, I find no evidence for parallel selection events either between species (maize vs. rice) or within species (two domestication events within beans). These results suggest that there are few constraints on genetic adaptation, but conclusions must be tempered by several complicating factors, particularly the lack of explicit design standards for selection screens. The second experiment involves the evolution of Escherichia coli to thermal stress. Unlike domestication, this highly replicated experiment detected a limited set of genes that appear prone to modification during adaptation to thermal stress. However, the number of potentially beneficial mutations within these genes is large, such that adaptation is constrained at the genic level but much less so at the nucleotide level. Based on these two experiments, I make the general conclusion that evolution is remarkably flexible, despite the presence of epistatic interactions that constrain evolutionary trajectories. I also posit that evolution is so rapid that we should establish a Speciation Prize, to be awarded to the first researcher who demonstrates speciation with a sexual organism in the laboratory.

  4. High subsonic flow tests of a parallel pipe followed by a large area ratio diffuser

    NASA Technical Reports Server (NTRS)

    Barna, P. S.

    1975-01-01

    Experiments were performed on a pilot model duct system in order to explore its aerodynamic characteristics. The model was scaled from a design projected for the high speed operation mode of the Aircraft Noise Reduction Laboratory. The test results show that the model performed satisfactorily and therefore the projected design will most likely meet the specifications.

  5. Compatibility of photomultiplier tube operation with SQUIDs for a neutron EDM experiment

    NASA Astrophysics Data System (ADS)

    Libersky, Matthew; nEDM Collaboration

    2013-10-01

    An experiment at the Spallation Neutron Source at Oak Ridge National Laboratory with the goal of reducing the experimental limit on the electric dipole moment (EDM) of the neutron will measure the precession frequencies of neutrons when a strong electric field is applied parallel and anti-parallel to a weak magnetic field. A difference in these frequencies would indicate a nonzero neutron EDM. To correct for drifts of the magnetic field in the measurement volume, polarized 3He will be used as a co-magnetometer. In one of the two methods built into the apparatus, superconducting quantum interference devices (SQUIDs) will be used to read out the 3He magnetization. Photomultiplier tubes will be used concurrently to measure scintillation light from neutron capture by 3He. However, the simultaneous noise-sensitive magnetic field measurement by the SQUIDs makes conventional PMT operation problematic due to the alternating current involved in generating the high voltages needed. Tests were carried out at Los Alamos National Laboratory to study the compatibility of simultaneous SQUID and PMT operation, using a custom battery-powered high-voltage power supply developed by Meyer and Smith (NIM A 647.1) to operate the PMT. The results of these tests will be presented.

  6. Air filters from HVAC systems as possible source of volatile organic compounds (VOC) - laboratory and field assays

    NASA Astrophysics Data System (ADS)

    Schleibinger, Hans; Rüden, Henning

    The emission of volatile organic compounds (VOC) from air filters of HVAC systems was evaluated. In a first study, carbonyl compounds (14 aldehydes and two ketones) were measured by reacting them with 2,4-dinitrophenylhydrazine (DNPH); analysis was done by HPLC with UV detection. In laboratory experiments, pieces of used and unused HVAC filters were incubated in test chambers. The filters investigated were taken from a filter bank of a large HVAC system in the centre of Berlin. First results show that, among those compounds, formaldehyde and acetone were found at higher concentrations in the test chambers containing used filters than in those with unused filters. Parallel field measurements were carried out at the prefilter and main filter banks of the two HVAC systems; here, measurements were made simultaneously before and after the filters to investigate whether those aldehydes or ketones arise from the filter material on site. Formaldehyde and acetone increased significantly in concentration after the filters of one HVAC system. In parallel experiments, microorganisms were shown to be able to survive on air filters. A possible source of the formaldehyde and acetone might therefore be microbes.

  7. Simulation of two-dimensional turbulent flows in a rotating annulus

    NASA Astrophysics Data System (ADS)

    Storey, Brian D.

    2004-05-01

    Rotating water tank experiments have been used to study fundamental processes of atmospheric and geophysical turbulence in a controlled laboratory setting. When these tanks are undergoing strong rotation the forced turbulent flow becomes highly two dimensional along the axis of rotation. An efficient numerical method has been developed for simulating the forced quasi-geostrophic equations in an annular geometry to model current laboratory experiments. The algorithm employs a spectral method with Fourier series and Chebyshev polynomials as basis functions. The algorithm has been implemented on a parallel architecture to allow modelling of a wide range of spatial scales over long integration times. This paper describes the derivation of the model equations, numerical method, testing and performance of the algorithm. Results provide reasonable agreement with the experimental data, indicating that such computations can be used as a predictive tool to design future experiments.
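    As a loose illustration of the Fourier-Chebyshev discretization this abstract describes (a sketch under stated assumptions, not the paper's actual solver), the Chebyshev part reduces to a collocation differentiation matrix; the construction below follows the standard recipe and is exact for polynomials up to the grid order:

```python
import numpy as np

def cheb_diff_matrix(N):
    # Chebyshev collocation points x_j = cos(pi*j/N) and the (N+1)x(N+1)
    # differentiation matrix D such that (D @ f(x)) approximates f'(x).
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.ones(N + 1)
    c[0] = c[-1] = 2.0
    c *= (-1.0) ** np.arange(N + 1)       # fold in the (-1)^i sign pattern
    X = np.tile(x, (N + 1, 1)).T
    dX = X - X.T + np.eye(N + 1)          # avoid division by zero on diagonal
    D = np.outer(c, 1.0 / c) / dX
    D -= np.diag(D.sum(axis=1))           # "negative sum" trick for diagonal
    return D, x

D, x = cheb_diff_matrix(16)
# Differentiation of x**3 is exact up to roundoff on this grid:
err = np.max(np.abs(D @ x**3 - 3 * x**2))
```

    In a full quasi-geostrophic solver of the kind described, the periodic azimuthal direction would be handled with an FFT while a matrix like this supplies radial derivatives; here the matrix is only checked against a polynomial it must differentiate exactly.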

  8. Evaluation of Parallel Authentic Research-Based Courses in Human Biology on Student Experiences at Stanford University and the University of Gothenburg

    ERIC Educational Resources Information Center

    Lindh, Jacob; Annerstedt, Claes; Besier, Thor; Matheson, Gordon O.; Rydmark, Martin

    2016-01-01

    Under a previous grant (2005-08), researchers and teachers at Stanford University (SU) and the University of Gothenburg (GU) co-designed a ten-week interdisciplinary, research-based laboratory course in human biology to be taught online to undergraduate students. Essentials in the subject were taught during the first four weeks of this course.…

  9. Astrophysical particle acceleration mechanisms in colliding magnetized laser-produced plasmas

    DOE PAGES

    Fox, W.; Park, J.; Deng, W.; ...

    2017-08-11

    Significant particle energization is observed to occur in numerous astrophysical environments, and in the standard models, this acceleration occurs alongside energy conversion processes including collisionless shocks or magnetic reconnection. Recent platforms for laboratory experiments using magnetized laser-produced plasmas have opened opportunities to study these particle acceleration processes in the laboratory. Through fully kinetic particle-in-cell simulations, we investigate acceleration mechanisms in experiments with colliding magnetized laser-produced plasmas, with geometry and parameters matched to recent high-Mach number reconnection experiments with externally controlled magnetic fields. 2-D simulations demonstrate significant particle acceleration with three phases of energization: first, a "direct" Fermi acceleration driven by approaching magnetized plumes; second, x-line acceleration during magnetic reconnection of anti-parallel fields; and finally, an additional Fermi energization of particles trapped in contracting and relaxing magnetic islands produced by reconnection. Furthermore, the relative effectiveness of these mechanisms depends on plasma and magnetic field parameters of the experiments.

  10. Virtual geotechnical laboratory experiments using a simulator

    NASA Astrophysics Data System (ADS)

    Penumadu, Dayakar; Zhao, Rongda; Frost, David

    2000-04-01

    The details of a test simulator that provides a realistic environment for performing virtual laboratory experiments in soil mechanics are presented. A computer program, Geo-Sim, can be used to perform virtual experiments and allows real-time observation of material response. For a given set of input parameters, the test simulator obtains experimental results using well-trained artificial neural-network (ANN)-based soil models for different soil types and stress paths. Multimedia capabilities are integrated into Geo-Sim using software that links and controls a laser disc player with real-time parallel processing ability. During the simulation of a virtual experiment, relevant portions of the video image of a previously recorded test on an actual soil specimen are displayed along with a graphical presentation of the response predicted by the feedforward ANN model. The pilot simulator developed to date covers all aspects of performing a triaxial test on cohesionless soil under undrained and drained conditions. The benefits of the test simulator are also presented.

  11. Improving the Optical Trapping Efficiency in the 225Ra Electric Dipole Moment Experiment via Monte Carlo Simulation

    NASA Astrophysics Data System (ADS)

    Fromm, Steven

    2017-09-01

    In an effort to study and improve the optical trapping efficiency of the 225Ra Electric Dipole Moment experiment, a fully parallelized Monte Carlo simulation of the laser cooling and trapping apparatus was created at Argonne National Laboratory and is now maintained and upgraded at Michigan State University. The simulation allows us to study optimizations and upgrades without having to use limited quantities of 225Ra (15-day half-life) in the experiment's apparatus. It predicts a trapping efficiency that differs from the value observed in the experiment by approximately a factor of thirty. The effects of varying oven geometry, background gas interactions, laboratory magnetic fields, MOT laser beam configurations, and laser frequency noise were studied and ruled out as causes of the discrepancy between the measured and predicted values of the overall trapping efficiency. Presently, the simulation is being used to help optimize a planned blue slower laser upgrade to the experiment's apparatus, which will increase the overall trapping efficiency by up to two orders of magnitude. This work is supported by Michigan State University, the Director's Research Scholars Program at the National Superconducting Cyclotron Laboratory, and the U.S. DOE, Office of Science, Office of Nuclear Physics, under Contract DE-AC02-06CH11357.
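    At its core, a trapping-efficiency estimate of the kind described above is a Monte Carlo acceptance fraction: sample atoms from the oven's velocity distribution and count the fraction slow enough to be captured. A deliberately simplified, hypothetical sketch (the capture-velocity criterion and all parameters are illustrative, not the ANL/MSU code):

```python
import math
import random

def trapping_efficiency(n_samples, temp_K, mass_kg, v_capture, seed=0):
    """Estimate the fraction of atoms slow enough to be captured, by Monte Carlo.

    Atoms are drawn from a thermal (Maxwellian) velocity distribution; an atom
    counts as 'trapped' if its speed is below v_capture. This toy criterion
    stands in for the full laser-cooling and trapping dynamics.
    """
    k_B = 1.380649e-23                         # Boltzmann constant, J/K
    sigma = math.sqrt(k_B * temp_K / mass_kg)  # 1-D thermal velocity spread, m/s
    rng = random.Random(seed)
    trapped = 0
    for _ in range(n_samples):
        vx, vy, vz = (rng.gauss(0.0, sigma) for _ in range(3))
        if math.hypot(vx, vy, vz) < v_capture:
            trapped += 1
    return trapped / n_samples

# Illustrative (assumed) numbers: a 225 u atom, 700 K oven, 10 m/s capture velocity
eff = trapping_efficiency(100_000, 700.0, 225 * 1.6605e-27, 10.0)
print(eff)  # a very small fraction: most thermal atoms are far too fast
```

    The tiny acceptance fraction is what makes such simulations worth parallelizing: rare trapped atoms demand many samples for a stable estimate.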

  12. Laboratory studies of magnetized collisionless flows and shocks using accelerated plasmoids

    NASA Astrophysics Data System (ADS)

    Weber, T. E.; Smith, R. J.; Hsu, S. C.

    2015-11-01

    Magnetized collisionless shocks are thought to play a dominant role in the overall partition of energy throughout the universe, but have historically proven difficult to create in the laboratory. The Magnetized Shock Experiment (MSX) at LANL creates conditions similar to those found in both space and astrophysical shocks by accelerating hot (100s of eV during translation), dense (10^22 - 10^23 m^-3) Field Reversed Configuration (FRC) plasmoids to high velocities (100s of km/s), resulting in β ~ 1 collisionless plasma flows with sonic and Alfvén Mach numbers of ~10. The FRC subsequently impacts a static target such as a strong parallel or anti-parallel (reconnection-wise) magnetic mirror, a solid obstacle, or a neutral gas cloud to create shocks with characteristic length and time scales that are both large enough to observe yet small enough to fit within the experiment. This enables study of the complex interplay of kinetic and fluid processes that mediate cosmic shocks and can generate non-thermal distributions, produce density and magnetic field enhancements much greater than predicted by fluid theory, and accelerate particles. An overview of the experimental capabilities of MSX will be presented, including diagnostics, selected recent results, and future directions. Supported by the DOE Office of Fusion Energy Sciences under contract DE-AC52-06NA25369.
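    As a back-of-envelope check of the flow regime quoted above, the Alfvén Mach number follows from M_A = v / v_A with v_A = B / sqrt(mu0 * n * m_i). A minimal Python sketch using illustrative values in the reported ranges (density ~10^22 m^-3, flow ~300 km/s) plus an assumed ~0.1 T field that is not given in the record:

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability, H/m
M_P = 1.672e-27           # proton mass, kg

def alfven_speed(B_tesla, n_per_m3):
    """Alfven speed v_A = B / sqrt(mu0 * rho) for a hydrogen plasma of ion density n."""
    rho = n_per_m3 * M_P  # mass density, kg/m^3
    return B_tesla / math.sqrt(MU0 * rho)

def alfven_mach(v_flow, B_tesla, n_per_m3):
    """Alfven Mach number M_A = v / v_A of a flow at speed v_flow (m/s)."""
    return v_flow / alfven_speed(B_tesla, n_per_m3)

# Illustrative parameters: n = 1e22 m^-3, v = 300 km/s, B = 0.1 T (assumed)
M_A = alfven_mach(3e5, 0.1, 1e22)
print(round(M_A, 1))  # of order 10, consistent with the M_A ~ 10 flows described
```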

  13. Playing in parallel: the effects of multiplayer modes in active video game on motivation and physical exertion.

    PubMed

    Peng, Wei; Crouse, Julia

    2013-06-01

    Although multiplayer modes are common among contemporary video games, the bulk of game research focuses on the single-player mode. To fill this gap in the literature, the current study investigated the effects of different multiplayer modes on enjoyment, future play motivation, and actual physical activity intensity in an active video game. One hundred sixty-two participants took part in a one-factor between-subjects laboratory experiment with three conditions: (a) single player, playing against one's own pretest score; (b) cooperation with another player in the same physical space; and (c) parallel competition with another player in a separate physical space. We found that parallel competition in separate physical spaces was the optimal mode, as it yielded both high enjoyment and future play motivation and high physical intensity. Implications for future research on multiplayer modes and play space, as well as for active video game-based physical activity interventions, are discussed.

  14. Field characterization of elastic properties across a fault zone reactivated by fluid injection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeanne, Pierre; Guglielmi, Yves; Rutqvist, Jonny

    In this paper, we studied the elastic properties of a fault zone intersecting the Opalinus Clay formation at 300 m depth in the Mont Terri Underground Research Laboratory (Switzerland). Four controlled water injection experiments were performed in borehole straddle intervals set at successive locations across the fault zone. A three-component displacement sensor, which allowed capturing the borehole wall movements during injection, was used to estimate the elastic properties of representative locations across the fault zone, from the host rock to the damage zone to the fault core. Young's moduli were estimated by both an analytical approach and numerical finite difference modeling. Results show a decrease in Young's modulus from the host rock to the damage zone by a factor of 5 and from the damage zone to the fault core by a factor of 2. In the host rock, our results are in reasonable agreement with laboratory data showing a strong elastic anisotropy characterized by the direction of the plane of isotropy parallel to the laminar structure of the shale formation. In the fault zone, strong rotations of the direction of anisotropy can be observed. Finally, the plane of isotropy can be oriented either parallel to bedding (when few discontinuities are present), parallel to the direction of the main fracture family intersecting the zone, and possibly oriented parallel or perpendicular to the fractures critically oriented for shear reactivation (when repeated past rupture along this plane has created a zone).

  15. Field characterization of elastic properties across a fault zone reactivated by fluid injection

    DOE PAGES

    Jeanne, Pierre; Guglielmi, Yves; Rutqvist, Jonny; ...

    2017-08-12

    In this paper, we studied the elastic properties of a fault zone intersecting the Opalinus Clay formation at 300 m depth in the Mont Terri Underground Research Laboratory (Switzerland). Four controlled water injection experiments were performed in borehole straddle intervals set at successive locations across the fault zone. A three-component displacement sensor, which allowed capturing the borehole wall movements during injection, was used to estimate the elastic properties of representative locations across the fault zone, from the host rock to the damage zone to the fault core. Young's moduli were estimated by both an analytical approach and numerical finite difference modeling. Results show a decrease in Young's modulus from the host rock to the damage zone by a factor of 5 and from the damage zone to the fault core by a factor of 2. In the host rock, our results are in reasonable agreement with laboratory data showing a strong elastic anisotropy characterized by the direction of the plane of isotropy parallel to the laminar structure of the shale formation. In the fault zone, strong rotations of the direction of anisotropy can be observed. Finally, the plane of isotropy can be oriented either parallel to bedding (when few discontinuities are present), parallel to the direction of the main fracture family intersecting the zone, and possibly oriented parallel or perpendicular to the fractures critically oriented for shear reactivation (when repeated past rupture along this plane has created a zone).

  16. Series and parallel arc-fault circuit interrupter tests.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Jay Dean; Fresquez, Armando J.; Gudgel, Bob

    2013-07-01

    While the 2011 National Electrical Code® (NEC) only requires series arc-fault protection, some arc-fault circuit interrupter (AFCI) manufacturers are designing products to detect and mitigate both series and parallel arc-faults. Sandia National Laboratories (SNL) has extensively investigated the electrical differences of series and parallel arc-faults and has offered possible classification and mitigation solutions. As part of this effort, Sandia National Laboratories has collaborated with MidNite Solar to create and test a 24-string combiner box with an AFCI which detects, differentiates, and de-energizes series and parallel arc-faults. In the case of the MidNite AFCI prototype, series arc-faults are mitigated by opening the PV strings, whereas parallel arc-faults are mitigated by shorting the array. A range of different experimental series and parallel arc-fault tests with the MidNite combiner box were performed at the Distributed Energy Technologies Laboratory (DETL) at SNL in Albuquerque, NM. In all the tests, the prototype de-energized the arc-faults in the time period required by the arc-fault circuit interrupt testing standard, UL 1699B. The experimental tests confirm series and parallel arc-faults can be successfully mitigated with a combiner box-integrated solution.

  17. On making laboratory report work more meaningful through criterion-based evaluation.

    PubMed

    Naeraa, N

    1987-05-01

    The purpose of this work was to encourage students to base their laboratory report work on guidelines reflecting a quality criterion set, previously derived from the functional role of the various sections in scientific papers. The materials were developed by a trial-and-error approach and comprise learning objectives, a parallel structure of manual and reports, general and specific report guidelines and a new common starting experiment. The principal contents are presented, followed by an account of the author's experience with them. Most of the author's students now follow the guidelines. Their conclusions are affected by difficulties in adjusting expected results with due regard to the specific conditions of the experimental subject or to their own deviations from the experimental or analytical procedures prescribed in the manual. Also, problems in interpreting data unbiased by explicit expectations are evident, although a clear distinction between expected and actual results has been helpful for them in seeing the relationship between experiments and textbook contents more clearly, and thus in understanding the hypothetico-deductive approach.

  18. Three Dimensional Hybrid Simulations of Super-Alfvénic Laser Ablation Experiments in the Large Plasma Device

    NASA Astrophysics Data System (ADS)

    Clark, Stephen; Winske, Dan; Schaeffer, Derek; Everson, Erik; Bondarenko, Anton; Constantin, Carmen; Niemann, Christoph

    2014-10-01

    We present 3D hybrid simulations of laser-produced expanding debris clouds propagating through a magnetized ambient plasma in the context of magnetized collisionless shocks. New results from the 3D code are compared to simulation results previously obtained with a 2D hybrid code. The 3D code extends a 2D code previously developed at Los Alamos National Laboratory; it has been parallelized and ported to execute in a cluster environment. The new simulations are used to verify scaling relationships, such as shock onset time and coupling parameter (Rm /ρd), developed via 2D simulations. Previous 2D results focus primarily on laboratory shock formation relevant to experiments being performed on the Large Plasma Device, where the shock propagates across the magnetic field. The new 3D simulations show wave structure and dynamics oblique to the magnetic field that introduce new physics to be considered in future experiments.

  19. Cleanup Verification Package for the 100-F-20, Pacific Northwest Laboratory Parallel Pits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    M. J. Appel

    2007-01-22

    This cleanup verification package documents completion of remedial action for the 100-F-20, Pacific Northwest Laboratory Parallel Pits waste site. This waste site consisted of two earthen trenches thought to have received both radioactive and nonradioactive material related to the 100-F Experimental Animal Farm.

  20. Analysis of series resonant converter with series-parallel connection

    NASA Astrophysics Data System (ADS)

    Lin, Bor-Ren; Huang, Chien-Lan

    2011-02-01

    In this study, a parallel inductor-inductor-capacitor (LLC) resonant converter, series-connected on the primary side and parallel-connected on the secondary side, is presented for server power supply systems. Based on series resonant behaviour, the power metal-oxide-semiconductor field-effect transistors are turned on at zero-voltage switching and the rectifier diodes are turned off at zero-current switching, reducing the switching losses in the power semiconductors. In the proposed converter, the primary windings of the two LLC converters are connected in series, so the two converters carry the same primary current and supply balanced load currents. On the output side, the two LLC converters are connected in parallel to share the load current and to reduce the current stress on the secondary windings and the rectifier diodes. In this article, the principle of operation, steady-state analysis, and design considerations of the proposed converter are provided and discussed. Experiments with a laboratory prototype with a 24 V/21 A output for a server power supply were performed to verify the effectiveness of the proposed converter.
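    For a series resonant tank like the one above, the soft-switching region is set by the series resonant frequency f_r = 1/(2π√(L_r C_r)); operating near f_r is what allows zero-voltage turn-on of the MOSFETs and zero-current turn-off of the rectifier diodes. A minimal sketch with hypothetical tank values (not taken from the prototype in the record):

```python
import math

def series_resonant_frequency(L_r, C_r):
    """Series resonant frequency f_r = 1 / (2*pi*sqrt(L_r * C_r)) of the tank."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L_r * C_r))

# Hypothetical tank values (assumed for illustration): L_r = 60 uH, C_r = 47 nF
f_r = series_resonant_frequency(60e-6, 47e-9)
print(round(f_r / 1e3, 1), "kHz")  # about 95 kHz for these assumed values
```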

  1. Mutation-based learning to improve student autonomy and scientific inquiry skills in a large genetics laboratory course.

    PubMed

    Wu, Jinlu

    2013-01-01

    Laboratory education can play a vital role in developing a learner's autonomy and scientific inquiry skills. In an innovative, mutation-based learning (MBL) approach, students were instructed to redesign a teacher-designed standard experimental protocol by a "mutation" method in a molecular genetics laboratory course. Students could choose to delete, add, reverse, or replace certain steps of the standard protocol to explore questions of interest to them in a given experimental scenario. They wrote experimental proposals to address their rationales and hypotheses for the "mutations"; conducted experiments in parallel, according to both standard and mutated protocols; and then compared and analyzed results to write individual lab reports. Various autonomy-supportive measures were provided in the entire experimental process. Analyses of student work and feedback suggest that students using the MBL approach 1) spend more time discussing experiments, 2) use more scientific inquiry skills, and 3) find the increased autonomy afforded by MBL more enjoyable than do students following regimented instructions in a conventional "cookbook"-style laboratory. Furthermore, the MBL approach does not incur an obvious increase in labor and financial costs, which makes it feasible for easy adaptation and implementation in a large class.

  2. Lead-Free Experiment in a Space Environment

    NASA Technical Reports Server (NTRS)

    Blanche, J. F.; Strickland, S. M.

    2012-01-01

    This Technical Memorandum addresses the Lead-Free Technology Experiment in Space Environment that flew as part of the seventh Materials International Space Station Experiment outside the International Space Station for approximately 18 months. Its intent was to provide data on the performance of lead-free electronics in an actual space environment. Its postflight condition is compared to the preflight condition as well as to the condition of an identical package operating in parallel in the laboratory. Some tin whisker growth was seen on a flight board but the whiskers were few and short. There were no solder joint failures, no tin pest formation, and no significant intermetallic compound formation or growth on either the flight or ground units.

  3. Research in mobile robotics at ORNL/CESAR (Oak Ridge National Laboratory/Center for Engineering Systems Advanced Research)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mann, R.C.; Weisbin, C.R.; Pin, F.G.

    1989-01-01

    This paper reviews ongoing and planned research with mobile autonomous robots at the Oak Ridge National Laboratory (ORNL), Center for Engineering Systems Advanced Research (CESAR). Specifically we report on results obtained with the robot HERMIES-IIB in navigation, intelligent sensing, learning, and on-board parallel computing in support of these functions. We briefly summarize an experiment with HERMIES-IIB that demonstrates the capability of smooth transitions between robot autonomy and tele-operation. This experiment results from collaboration among teams at the Universities of Florida, Michigan, Tennessee, and Texas; and ORNL in a program targeted at robotics for advanced nuclear power stations. We conclude by summarizing ongoing R&D with our new mobile robot HERMIES-III which is equipped with a seven degree-of-freedom research manipulator arm. 12 refs., 4 figs.

  4. The VASIMR[registered trademark] VF-200-1 ISS Experiment as a Laboratory for Astrophysics

    NASA Technical Reports Server (NTRS)

    Glover, Tim W.; Squire, Jared P.; Longmier, Benjamin; Cassady, Leonard; Ilin, Andrew; Carter, Mark; Olsen, Chris S.; McCaskill, Greg; Diaz, Franklin Chang; Girimaji, Sharath

    2010-01-01

    The VASIMR[R] Flight Experiment (VF-200-1) will be tested in space aboard the International Space Station (ISS) in about four years. It will consist of two 100 kW parallel plasma engines with opposite magnetic dipoles, resulting in a near zero-torque magnetic system. Electrical energy will come from ISS at low power level, be stored in batteries and used to fire the engine at 200 kW. The VF-200-1 project will provide a unique opportunity on the ISS National Laboratory for astrophysicists and space physicists to study the dynamic evolution of an expanding and reconnecting plasma loop. Here, we review the status of the project and discuss our current plans for computational modeling and in situ observation of a dynamic plasma loop on an experimental platform in low-Earth orbit. The VF-200-1 project is still in the early stages of development and we welcome new collaborators.

  5. The VASIMR® VF-200-1 ISS Experiment as a Laboratory for Astrophysics

    NASA Astrophysics Data System (ADS)

    Glover, T.; Squire, J. P.; Longmier, B. W.; Carter, M. D.; Ilin, A. V.; Cassady, L. D.; Olsen, C. S.; Chang Díaz, F.; McCaskill, G. E.; Bering, E. A.; Garrison, D.; Girimaji, S.; Araya, D.; Morin, L.; Shebalin, J. V.

    2010-12-01

    The VASIMR® Flight Experiment (VF-200-1) will be tested in space aboard the International Space Station (ISS) in about four years. It will consist of two 100 kW parallel plasma engines with opposite magnetic dipoles, resulting in a near zero-torque magnetic system. Electrical energy will come from ISS at low power level, be stored in batteries and used to fire the engine at 200 kW. The VF-200-1 project will provide a unique opportunity on the ISS National Laboratory for astrophysicists and space physicists to study the dynamic evolution of an expanding and reconnecting plasma loop. Here, we review the status of the project and discuss our current plans for computational modeling and in situ observation of a dynamic plasma loop on an experimental platform in low-Earth orbit. The VF-200-1 project is still in the early stages of development and we welcome new collaborators.

  6. Summer research program (1992). Summer faculty research program (SFRP) reports. Volume 6. Arnold Engineering Development Center, Civil Engineering Laboratory, Frank J. Seiler research laboratory, Wilford Hall Medical Center. Annual report, 1 September 1991-31 August 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, G.

    1992-12-28

    The following topics were among those completed in the Air Force Summer Faculty Research Program: Experiences using Model-Based Techniques for the Development of a Large Parallel Instrumentation System; Data Reduction of Laser Induced Fluorescence in Rocket Motor Exhausts; Feasibility of Wavelet Analysis for Plume Data Study; Characterization of Seagrass Meadows in St. Andrew (Crooked Island) Sound, Northern Gulf of Mexico; A Preliminary Study of the Weathering of Jet Fuels in Soil Monitored by SFE with GC Analysis; and a Preliminary Numerical Model of Groundwater Flow at the MADE2 Site.

  7. Experimental verification of internal parameter in magnetically coupled boost used as PV optimizer in parallel association

    NASA Astrophysics Data System (ADS)

    Sawicki, Jean-Paul; Saint-Eve, Frédéric; Petit, Pierre; Aillerie, Michel

    2017-02-01

    This paper presents experimental results verifying a formula for computing the duty cycle under pulse-width-modulation control of a DC-DC converter designed and built in the laboratory. This converter, called the Magnetically Coupled Boost (MCB), is sized to step up the voltage of a single photovoltaic module to supply grid inverters directly. The duty-cycle formula is checked first by identifying an internal parameter, the auto-transformer ratio, and second by checking the stability of the operating point on the photovoltaic-module side. Consideration of the nature of the generator source and of the load connected to the converter suggests additional experiments to decide whether the auto-transformer ratio should be treated as a fixed value or, on the contrary, as an adaptive one. The effects of load variations on converter behavior and the impact of possible shading on the photovoltaic module are also discussed, with the aim of designing robust control laws for parallel association that compensate for unwanted effects due to output-voltage coupling.

  8. [Simultaneous desulfurization and denitrification by TiO2/ACF under different irradiation].

    PubMed

    Han, Jing; Zhao, Yi

    2009-04-15

    The supported TiO2 photocatalysts were prepared in the laboratory, and experiments on simultaneous desulfurization and denitrification were carried out in a self-designed photocatalysis reactor. The optimal experimental conditions were determined, and the efficiencies of simultaneous desulfurization and denitrification under two different light sources were compared. The results show that the oxygen content of the flue gas, reaction temperature, flue gas humidity, and irradiation intensity are the most essential factors in photocatalysis. For TiO2/ACF, removal efficiencies of 99.7% for SO2 and 64.3% for NO were obtained under UV irradiation at optimal experimental conditions, and efficiencies of 97.5% for SO2 and 49.6% for NO were achieved under visible light irradiation at optimal experimental conditions. Results from five parallel experiments indicate that the standard deviation S of the parallel data is small. The removal mechanism for SO2 and NO under the two light sources is proposed on the basis of ion chromatography analysis of the absorption liquid.
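    The repeatability claim above rests on the sample standard deviation of replicate runs. A small sketch with hypothetical removal-efficiency replicates (the five parallel measurements themselves are not given in the record):

```python
import statistics

# Hypothetical SO2 removal efficiencies (%) from five parallel runs, centered on
# the reported 99.7% optimum; these are NOT the authors' measurements.
so2_runs = [99.6, 99.8, 99.7, 99.5, 99.9]

mean = statistics.mean(so2_runs)
S = statistics.stdev(so2_runs)  # sample standard deviation (n - 1 denominator)
print(round(mean, 2), round(S, 3))  # → 99.7 0.158: a small S indicates good repeatability
```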

  9. Refined Source Terms in Wave Watch 3 with Wave Breaking and Sea Spray Forecasts

    DTIC Science & Technology

    2016-08-05

    Farmer at IOS Canada involved a novel scale analysis of breaking waves. This was motivated by the results of the model study of wave breaking onset by... timely development that needs careful examination. 4.11 Highlights of the SPANDEX study: SPANDEX, the Spray Production and Dynamics Experiment, is... speed alone. To accomplish this goal, a parallel laboratory study (SPANDEX II) was undertaken to parameterize sea spray flux dependences on breaking

  10. Parallels, How Many? Geometry Module for Use in a Mathematics Laboratory Setting.

    ERIC Educational Resources Information Center

    Brotherton, Sheila; And Others

    This is one of a series of geometry modules developed for use by secondary students in a laboratory setting. This module was conceived as an alternative approach to the usual practice of giving Euclid's parallel postulate and then mentioning that alternate postulates would lead to an alternate geometry or geometries. Instead, the student is led…

  11. Managing Algorithmic Skeleton Nesting Requirements in Realistic Image Processing Applications: The Case of the SKiPPER-II Parallel Programming Environment's Operating Model

    NASA Astrophysics Data System (ADS)

    Coudarcher, Rémi; Duculty, Florent; Serot, Jocelyn; Jurie, Frédéric; Derutin, Jean-Pierre; Dhome, Michel

    2005-12-01

    SKiPPER is a SKeleton-based Parallel Programming EnviRonment that has been under development since 1996 at the LASMEA laboratory, Blaise Pascal University, France. The main goal of the project was to demonstrate the applicability of skeleton-based parallel programming techniques to the fast prototyping of reactive vision applications. This paper deals with the special features embedded in the latest version of the project: algorithmic skeleton nesting capabilities and a fully dynamic operating model. Through the case study of a complete and realistic image processing application, in which we have pointed out the requirement for skeleton nesting, we present the operating model of this feature. The work described here is one of the few reported experiments showing the application of skeleton nesting facilities to the parallelisation of a realistic application, especially in the area of image processing. The image processing application we have chosen is a 3D face-tracking algorithm from appearance.

  12. Feasibility of establishing a biosafety level 3 tuberculosis culture laboratory of acceptable quality standards in a resource-limited setting: an experience from Uganda.

    PubMed

    Ssengooba, Willy; Gelderbloem, Sebastian J; Mboowa, Gerald; Wajja, Anne; Namaganda, Carolyn; Musoke, Philippa; Mayanja-Kizza, Harriet; Joloba, Moses Lutaakome

    2015-01-15

    Despite recent innovations in tuberculosis (TB) and multi-drug resistant TB (MDR-TB) diagnosis, culture remains vital for difficult-to-diagnose patients and for baseline and end-point determination in novel vaccine and drug trials. Herein, we share our experience of establishing a BSL-3 culture facility in Uganda, its 3-year performance indicators, and our post-TB-vaccine-trial (pioneer) funding experience of sustaining such a facility. Between September 2008 and April 2009, the laboratory was set up with financial support from external partners. After an initial procedure validation phase in parallel with the National TB Reference Laboratory (NTRL) and legal approvals, the laboratory registered for external quality assessment (EQA) from the NTRL, WHO, National Health Laboratories Services (NHLS), and the College of American Pathologists (CAP). The laboratory also instituted a functional quality management system (QMS). Pioneer funding ended in 2012 and the laboratory has remained in self-sustainability mode. The laboratory achieved internationally acceptable standards in both structural and biosafety requirements. Of the 14 patient samples analyzed in the procedural validation phase, agreement with the NTRL for all tests was 90% (P < 0.01). The laboratory started full operations in October 2009, performing smear microscopy, culture, identification, and drug susceptibility testing (DST). The annual culture workload was 7,636, 10,242, and 2,712 inoculations for the years 2010, 2011, and 2012, respectively. Other performance indicators of TB culture laboratories were also monitored. Scores from EQA panels included smear microscopy >80% in all years from the NTRL, CAP, and NHLS; culture was 100% for CAP panels and above regional average scores for all years with NHLS. Quarterly DST scores from WHO-EQA ranged from 78% to 100% in 2010, 80% to 100% in 2011, and 90% to 100% in 2012.
From our experience, it is feasible to set up a BSL-3 TB culture laboratory with acceptable quality performance standards in resource-limited countries. With the demonstrated quality of work, the laboratory attracted more research groups and post-pioneer funding, which helped to ensure sustainability. The highly skilled experts in this research laboratory also continue to provide an excellent resource for the needed national discussion of laboratory and quality management systems.

  13. Competitive Dominance among Strains of Luminous Bacteria Provides an Unusual Form of Evidence for Parallel Evolution in Sepiolid Squid-Vibrio Symbioses

    PubMed Central

    Nishiguchi, Michele K.; Ruby, Edward G.; McFall-Ngai, Margaret J.

    1998-01-01

    One of the principal assumptions in symbiosis research is that associated partners have evolved in parallel. We report here experimental evidence for parallel speciation patterns among several partners of the sepiolid squid-luminous bacterial symbioses. Molecular phylogenies for 14 species of host squids were derived from sequences of both the nuclear internal transcribed spacer region and the mitochondrial cytochrome oxidase subunit I; the glyceraldehyde phosphate dehydrogenase locus was sequenced for phylogenetic determinations of 7 strains of bacterial symbionts. Comparisons of trees constructed for each of the three loci revealed a parallel phylogeny between the sepiolids and their respective symbionts. Because both the squids and their bacterial partners can be easily cultured independently in the laboratory, we were able to couple these phylogenetic analyses with experiments to examine the ability of the different symbiont strains to compete with each other during the colonization of one of the host species. Our results not only indicate a pronounced dominance of native symbiont strains over nonnative strains, but also reveal a hierarchy of symbiont competency that reflects the phylogenetic relationships of the partners. For the first time, molecular systematics has been coupled with experimental colonization assays to provide evidence for the existence of parallel speciation among a set of animal-bacterial associations. PMID:9726861

  14. AFL-1: A programming Language for Massively Concurrent Computers.

    DTIC Science & Technology

    1986-11-01

    Bibliography: Ackley, D.H., Hinton, G.E., Sejnowski, T.J., "A Learning Algorithm for Boltzmann Machines", Cognitive Science, 1985, 9, 147-169. Agre, P.E., "Routines", Memo 828, MIT AI Laboratory, May 1985. Ballard, D.H., Hayes, P.J., "Parallel Logical Inference", Conference of the Cognitive Science... "Experiments on Semantic Memory and Language Comprehension", in L.W. Greg (Ed.), Cognition in Learning and Memory, New York, Wiley, 1972. Collins

  15. Mutation-Based Learning to Improve Student Autonomy and Scientific Inquiry Skills in a Large Genetics Laboratory Course

    PubMed Central

    Wu, Jinlu

    2013-01-01

    Laboratory education can play a vital role in developing a learner's autonomy and scientific inquiry skills. In an innovative, mutation-based learning (MBL) approach, students were instructed to redesign a teacher-designed standard experimental protocol by a “mutation” method in a molecular genetics laboratory course. Students could choose to delete, add, reverse, or replace certain steps of the standard protocol to explore questions of interest to them in a given experimental scenario. They wrote experimental proposals to address their rationales and hypotheses for the “mutations”; conducted experiments in parallel, according to both standard and mutated protocols; and then compared and analyzed results to write individual lab reports. Various autonomy-supportive measures were provided in the entire experimental process. Analyses of student work and feedback suggest that students using the MBL approach 1) spend more time discussing experiments, 2) use more scientific inquiry skills, and 3) find the increased autonomy afforded by MBL more enjoyable than do students following regimented instructions in a conventional “cookbook”-style laboratory. Furthermore, the MBL approach does not incur an obvious increase in labor and financial costs, which makes it feasible for easy adaptation and implementation in a large class. PMID:24006394

  16. National Ignition Facility: Experimental plan

    NASA Astrophysics Data System (ADS)

    1994-05-01

    As part of the Conceptual Design Report (CDR) for the National Ignition Facility (NIF), scientists from Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), Sandia National Laboratory (SNL), the University of Rochester's Laboratory for Laser Energetics (UR/LLE), and EG&G formed an NIF Target Diagnostics Working Group. The purpose of the Target Diagnostics Working Group is to prepare conceptual designs of target diagnostics for inclusion in the facility CDR and to determine how these specifications impact the CDR. To accomplish this, a subgroup has directed its efforts at constructing an approximate experimental plan for the ignition campaign of the NIF CDR. The results of this effort are contained in this document, the Experimental Plan for achieving fusion ignition in the NIF. This group initially concentrated on the flow-down requirements of the experimental campaign leading to ignition, which will dominate the initial efforts of the NIF. It is envisaged, however, that before ignition, there will be parallel campaigns supporting weapons physics, weapons effects, and other research. This plan was developed by analyzing the sequence of activities required to finally fire the laser at the level of power and precision necessary to achieve the conditions of an ignition hohlraum target, and to then use our experience in activating and running Nova experiments to estimate the rate of completing these activities.

  17. Laboratory hydraulic fracturing experiments in intact and pre-fractured rock

    USGS Publications Warehouse

    Zoback, M.D.; Rummel, F.; Jung, R.; Raleigh, C.B.

    1977-01-01

    Laboratory hydraulic fracturing experiments were conducted to investigate two factors which could influence the use of the hydrofrac technique for in-situ stress determinations: the possible dependence of the breakdown pressure upon the rate of borehole pressurization, and the influence of pre-existing cracks on the orientation of generated fractures. The experiments have shown that while the rate of borehole pressurization has a marked effect on breakdown pressures, the pressure at which hydraulic fractures initiate (and thus tensile strength) is independent of the rate of borehole pressurization when the effect of fluid penetration is negligible. Thus, the experiments indicate that use of breakdown pressures rather than fracture initiation pressures may lead to an erroneous estimate of tectonic stresses. A conceptual model is proposed to explain anomalously high breakdown pressures observed when fracturing with high-viscosity fluids. In this model, initial fracture propagation is presumed to be stable due to large differences between the borehole pressure and that within the fracture. In samples which contained pre-existing fractures that were 'leaky' to water, we found it possible to generate hydraulic fractures oriented parallel to the direction of maximum compression if high-viscosity drilling mud was used as the fracturing fluid. © 1977.

  18. Two Non Linear Dynamics Plasma Astrophysics Experiments At LANL

    NASA Astrophysics Data System (ADS)

    Intrator, T.; Weber, T.; Feng, Y.; Sears, J.; Smith, R. J.; Swan, H.; Hutchinson, T.; Boguski, J.; Gao, K.; Chapdelaine, L.; Dunn, J. P.

    2013-12-01

    Two laboratory experiments at Los Alamos National Laboratory (LANL) have been built to gain access to a wide range of fundamental plasma physics issues germane to astrophysical, space, and fusion plasmas. The overarching theme is magnetized plasma dynamics, including currents, MHD forces and instabilities, sheared flows and shocks, along with the creation and annihilation of magnetic field. The Relaxation Scaling Experiment (RSX) creates current sheets and flux ropes that exhibit fully 3D dynamics and are observed to kink, bounce, merge and reconnect, shred, and reform in complicated ways. We show recent movies from a large, detailed data set that describe the 3D magnetic structure and helicity budget of a driven and dissipative system that spontaneously self-saturates a kink instability. The Magnetized Shock Experiment (MSX) uses a field-reversed configuration (FRC) that is ejected at high speed and then stagnated against a stopping mirror field, which drives a collisionless magnetized shock. A plasmoid accelerator will also access supercritical shocks at much larger Alfvén Mach numbers. Unique features include access to parallel, oblique, and perpendicular shocks in regions much larger than the ion gyroradius and inertial length, large magnetic and fluid Reynolds numbers, and volume for turbulence.

  19. Investigation of Yersinia pestis laboratory adaptation through a combined genomics and proteomics approach

    DOE PAGES

    Leiser, Owen P.; Merkley, Eric D.; Clowers, Brian H.; ...

    2015-11-24

    Here, the bacterial pathogen Yersinia pestis, the cause of plague in humans and animals, normally has a sylvatic lifestyle, cycling between fleas and mammals. In contrast, laboratory-grown Y. pestis experiences a more constant environment and conditions that it would not normally encounter. The transition from the natural environment to the laboratory results in a vastly different set of selective pressures, and represents what could be considered domestication. Understanding the kinds of adaptations Y. pestis undergoes as it becomes domesticated will contribute to understanding the basic biology of this important pathogen. In this study, we performed a Parallel Serial Passage Experiment (PSPE) to explore the mechanisms by which Y. pestis adapts to laboratory conditions, hypothesizing that cells would undergo significant changes in virulence and nutrient acquisition systems. Two wild strains were serially passaged in 12 independent populations each for ~750 generations, after which each population was analyzed using whole-genome sequencing. We observed considerable parallel evolution in the endpoint populations, detecting multiple independent mutations in ail, pepA, and zwf, suggesting that specific selective pressures are shaping evolutionary responses. Complementary LC-MS-based proteomic data provide physiological context to the observed mutations, and reveal regulatory changes not necessarily associated with specific mutations, including changes in amino acid metabolism, envelope biogenesis, iron storage and acquisition, and a type VI secretion system. Proteomic data support hypotheses generated by genomic data in addition to suggesting future mechanistic studies, indicating that future whole-genome sequencing studies be designed to leverage proteomics as a critical complement.

  1. Implementation and Assessment of a Virtual Laboratory of Parallel Robots Developed for Engineering Students

    ERIC Educational Resources Information Center

    Gil, Arturo; Peidró, Adrián; Reinoso, Óscar; Marín, José María

    2017-01-01

    This paper presents a tool, LABEL, oriented to the teaching of parallel robotics. The application, organized as a set of tools developed using Easy Java Simulations, enables the study of the kinematics of parallel robotics. A set of classical parallel structures was implemented such that LABEL can solve the inverse and direct kinematic problem of…
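The inverse kinematic problem mentioned above has a simple closed form for planar parallel robots. As an illustration only (this sketch is not taken from LABEL; the 3-RPR geometry, anchor coordinates, and function name below are invented for the example), the inverse kinematics of a planar 3-RPR mechanism returns the three prismatic leg lengths for a given platform pose:

```python
import math

def inverse_kinematics_3rpr(base, platform_local, pose):
    """Inverse kinematics of a planar 3-RPR parallel robot: given the
    platform pose (x, y, phi), return the three leg lengths. Each leg
    length is the distance from a fixed base anchor to the corresponding
    platform anchor expressed in the world frame."""
    x, y, phi = pose
    lengths = []
    for (bx, by), (px, py) in zip(base, platform_local):
        # Platform anchor in the world frame: rotate by phi, then translate.
        wx = x + px * math.cos(phi) - py * math.sin(phi)
        wy = y + px * math.sin(phi) + py * math.cos(phi)
        lengths.append(math.hypot(wx - bx, wy - by))
    return lengths

# Example geometry: a triangular base and a small triangular platform.
base = [(0.0, 0.0), (2.0, 0.0), (1.0, math.sqrt(3.0))]
platform = [(-0.2, 0.0), (0.2, 0.0), (0.0, 0.3)]
legs = inverse_kinematics_3rpr(base, platform, (1.0, 0.6, 0.0))
```

For parallel robots the inverse problem is this direct, while the direct (forward) kinematic problem generally requires solving a nonlinear system, which is why tools like the one described above resort to numerical solution.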

  2. A Next-Generation Parallel File System Environment for the OLCF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dillow, David A; Fuller, Douglas; Gunasekaran, Raghul

    2012-01-01

    When deployed in 2008/2009, the Spider system at the Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) was the world's largest-scale Lustre parallel file system. Envisioned as a shared parallel file system capable of delivering both the bandwidth and capacity requirements of the OLCF's diverse computational environment, Spider has since become a blueprint for shared Lustre environments deployed worldwide. Designed to support the parallel I/O requirements of the Jaguar XT5 system and other smaller-scale platforms at the OLCF, the upgrade to the Titan XK6 heterogeneous system will begin to push the limits of Spider's original design by mid 2013. With a doubling in total system memory and a 10x increase in FLOPS, Titan will require both higher bandwidth and larger total capacity. Our goal is to provide a 4x increase in total I/O bandwidth, from over 240 GB/sec today to 1 TB/sec, and a doubling in total capacity. While aggregate bandwidth and total capacity remain important capabilities, an equally important goal in our efforts is dramatically increasing metadata performance, currently the Achilles' heel of parallel file systems at leadership scale. We present in this paper an analysis of our current I/O workloads, our operational experiences with the Spider parallel file systems, the high-level design of our Spider upgrade, and our efforts in developing benchmarks that synthesize our performance requirements based on our workload characterization studies.

  3. Xyce parallel electronic simulator : users' guide.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mei, Ting; Rankin, Eric Lamont; Thornquist, Heidi K.

    2011-05-01

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: (1) capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; (2) improved performance for all numerical kernels (e.g., time integrator, nonlinear and linear solvers) through state-of-the-art algorithms and novel techniques; (3) device models which are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and (4) object-oriented code design and implementation using modern coding practices that ensure that the Xyce Parallel Electronic Simulator will be maintainable and extensible far into the future. Xyce is a parallel code in the most general sense of the phrase - a message-passing parallel implementation - which allows it to run efficiently on the widest possible number of computing platforms. These include serial, shared-memory, and distributed-memory parallel as well as heterogeneous platforms. Careful attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. The development of Xyce provides a platform for computational research and development aimed specifically at the needs of the Laboratory. With Xyce, Sandia has an 'in-house' capability with which both new electrical (e.g., device model development) and algorithmic (e.g., faster time-integration methods, parallel solver algorithms) research and development can be performed. As a result, Xyce is a unique electrical simulation capability, designed to meet the unique needs of the laboratory.

  4. Early experience shapes vocal neural coding and perception in songbirds

    PubMed Central

    Woolley, Sarah M. N.

    2012-01-01

    Songbirds, like humans, are highly accomplished vocal learners. The many parallels between speech and birdsong and conserved features of mammalian and avian auditory systems have led to the emergence of the songbird as a model system for studying the perceptual mechanisms of vocal communication. Laboratory research on songbirds allows the careful control of early life experience and high-resolution analysis of brain function during vocal learning, production and perception. Here, I review what songbird studies have revealed about the role of early experience in the development of vocal behavior, auditory perception and the processing of learned vocalizations by auditory neurons. The findings of these studies suggest general principles for how exposure to vocalizations during development and into adulthood influences the perception of learned vocal signals. PMID:22711657

  5. Research on the Application of Fast-steering Mirror in Stellar Interferometer

    NASA Astrophysics Data System (ADS)

    Mei, R.; Hu, Z. W.; Xu, T.; Sun, C. S.

    2017-07-01

    For a stellar interferometer, the fast-steering mirror (FSM) is widely used to correct wavefront tilt caused by atmospheric turbulence and internal instrumental vibration, owing to its high resolution and fast response. In this study, the non-coplanar error between the FSM and the actuator deflection axis introduced by manufacture, assembly, and adjustment is analyzed. Via a numerical method, the additional optical path difference (OPD) caused by the above factors is studied, and its effects on the tracking accuracy of the stellar interferometer are discussed. On the other hand, the starlight parallelism between the beams of the two arms is one of the main factors in the loss of fringe visibility. By analyzing the influence of wavefront tilt caused by atmospheric turbulence on fringe visibility, a simple and efficient real-time correction scheme for starlight parallelism is proposed, based on a single array detector. The feasibility of this scheme is demonstrated by a laboratory experiment. The results show that, after correction by the fast-steering mirror, the starlight parallelism preliminarily meets the wavefront-tilt requirements of the stellar interferometer.
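The OPD's sensitivity to pointing and tilt errors follows from simple geometry: two arms separated by a baseline B see an external path difference of B·sin θ for a source at angle θ. A small worked example (the 10 m baseline and 0.1-arcsecond tilt below are illustrative values, not figures from the paper):

```python
import math

def geometric_opd(baseline_m, angle_rad):
    """External optical path difference between two interferometer arms
    separated by baseline B, for a source at angle theta off the normal:
    OPD = B * sin(theta)."""
    return baseline_m * math.sin(angle_rad)

# A 0.1-arcsecond pointing/tilt error over a 10 m baseline:
tilt = math.radians(0.1 / 3600.0)   # arcseconds -> radians
opd = geometric_opd(10.0, tilt)     # on the order of a few micrometers
```

Even this sub-arcsecond error produces an OPD of several micrometers, many wavelengths at visible bands, which is why fast tilt correction matters for fringe visibility.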

  6. Summer Proceedings 2016: The Center for Computing Research at Sandia National Laboratories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carleton, James Brian; Parks, Michael L.

    Solving sparse linear systems from the discretization of elliptic partial differential equations (PDEs) is an important building block in many engineering applications. Sparse direct solvers can solve general linear systems, but are usually slower and use much more memory than effective iterative solvers. To overcome these two disadvantages, a hierarchical solver (LoRaSp) based on H2-matrices was introduced in [22]. Here, we have developed a parallel version of the algorithm in LoRaSp to solve large sparse matrices on distributed memory machines. On a single processor, the factorization time of our parallel solver scales almost linearly with the problem size for three-dimensional problems, as opposed to the quadratic scalability of many existing sparse direct solvers. Moreover, our solver leads to almost constant numbers of iterations when used as a preconditioner for Poisson problems. On more than one processor, our algorithm has significant speedups compared to sequential runs. With this parallel algorithm, we are able to solve large problems much faster than many existing packages, as demonstrated by the numerical experiments.
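The iterative side of the trade-off described above can be illustrated with a plain conjugate-gradient solve of a one-dimensional Poisson system. This is only a toy sketch: the LoRaSp hierarchical preconditioner is far more involved and is not reproduced here; the point is merely what "solving a sparse elliptic system iteratively" looks like.

```python
def cg_poisson_1d(n, b, tol=1e-10, max_iter=1000):
    """Unpreconditioned conjugate gradient for the 1-D Poisson matrix
    A = tridiag(-1, 2, -1), a toy stand-in for the sparse elliptic
    systems discussed above. Returns (solution, iterations used)."""
    def matvec(x):
        y = [0.0] * n
        for i in range(n):
            y[i] = 2.0 * x[i]
            if i > 0:
                y[i] -= x[i - 1]
            if i < n - 1:
                y[i] -= x[i + 1]
        return y

    x = [0.0] * n
    r = b[:]              # residual of the zero initial guess
    p = r[:]
    rs = sum(v * v for v in r)
    for it in range(max_iter):
        Ap = matvec(p)
        alpha = rs / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = sum(v * v for v in r)
        if rs_new ** 0.5 < tol:
            return x, it + 1
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x, max_iter

# Solve A x = b with b = ones; the exact solution is x_i = i*(n+1-i)/2
# (1-indexed), e.g. x_1 = 10 and x_10 = 55 for n = 20.
x, iters = cg_poisson_1d(20, [1.0] * 20)
```

A preconditioner, hierarchical or otherwise, would replace `r` with an approximate solve `M⁻¹r` inside the loop; the paper's result is that a good hierarchical M keeps the iteration count nearly constant as n grows.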

  7. Planetary stations and Abyssal Benthic Laboratories: An overview of parallel approaches for long-term investigation in extreme environments

    NASA Technical Reports Server (NTRS)

    Dipippo, S.; Prendin, W.; Gasparoni, F.

    1994-01-01

    In spite of the apparent great differences between deep ocean and space environment, significant similarities can be recognized when considering the possible solutions and technologies enabling the development of remote automatic stations supporting the execution of scientific activities. In this sense it is believed that mutual benefits shall be derived from the exchange of experiences and results between people and organizations involved in research and engineering activities for hostile environments, such as space, deep sea, and polar areas. A significant example of possible technology transfer and common systematic approach is given, which describes in some detail how the solutions and the enabling technologies identified for an Abyssal Benthic Laboratory can be applied for the case of a lunar or planetary station.

  8. Benard and Marangoni convection in multiple liquid layers

    NASA Technical Reports Server (NTRS)

    Koster, Jean N.; Prakash, A.; Fujita, D.; Doi, T.

    1992-01-01

    Convective fluid dynamics of immiscible double and triple liquid layers are considered. First results on multilayer convective flow, in preparation for spaceflight experiment aboard IML-2 (International Microgravity Laboratory), are discussed. Convective flow in liquid layers with one or two horizontal interfaces with heat flow applied parallel to them is one of the systems investigated. The second system comprises two horizontally layered immiscible liquids heated from below and cooled from above, that is, heat flow orthogonal to the interface. In this system convection results due to the classical Benard instability.

  9. A study of optical scattering methods in laboratory plasma diagnosis

    NASA Technical Reports Server (NTRS)

    Phipps, C. R., Jr.

    1972-01-01

    Electron velocity distributions are deduced along axes parallel and perpendicular to the magnetic field in a pulsed, linear Penning discharge in hydrogen by means of a laser Thomson scattering experiment. Results obtained are numerical averages of many individual measurements made at specific space-time points in the plasma evolution. Because of the high resolution in k-space and the relatively low maximum electron density of 2 × 10^13 cm^-3, special techniques were required to obtain measurable scattering signals. These techniques are discussed and experimental results are presented.

  10. Lineation-parallel c-axis Fabric of Quartz Formed Under Water-rich Conditions

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Zhang, J.; Li, P.

    2014-12-01

    The crystallographic preferred orientation (CPO) of quartz is of great significance because it records much valuable information pertinent to the deformation of quartz-rich rocks in the continental crust. The lineation-parallel c-axis CPO (i.e., c-axes forming a maximum parallel to the lineation) in naturally deformed quartz is generally considered to form under high-temperature (> ~550 ºC) conditions. However, most laboratory deformation experiments on quartzite have failed to produce such a CPO at high temperatures up to 1200 ºC. Here we report a new occurrence of the lineation-parallel c-axis CPO of quartz from kyanite-quartz veins in eclogite. Optical microstructural observations, Fourier transform infrared (FTIR) spectroscopy, and electron backscattered diffraction (EBSD) techniques were integrated to illuminate the nature of the quartz CPOs. Quartz exhibits mostly straight to slightly curved grain boundaries, modest intracrystalline plasticity, and significant shape preferred orientation (SPO) and CPOs, indicating that dislocation creep dominated the deformation of quartz. Kyanite grains in the veins are mostly strain-free, suggestive of their higher strength than quartz. The pronounced SPO and CPOs in kyanite were interpreted to originate from anisotropic crystal growth and/or mechanical rotation during vein-parallel shearing. FTIR results show that quartz contains a trivial amount of structurally bound water (several tens of H/10^6 Si), while kyanite has a water content of 384-729 H/10^6 Si; petrographic observations nevertheless suggest that quartz in the veins was deformed under water-rich conditions. We argue that the observed lineation-parallel c-axis fabric in quartz was inherited from preexisting CPOs as a result of anisotropic grain growth under stress facilitated by water, rather than produced by a dominant c-slip.
The preservation of the quartz CPOs probably benefited from the preexisting orientations, which render most quartz grains unsuitably oriented for easy a-slip at lower temperatures, and from the weak deformation during subsequent exhumation. This hypothesis provides a reasonable explanation for the observations that most lineation-parallel c-axis fabrics of quartz are found in veins and that deformation experiments on quartz-rich rocks at high temperature have failed to produce such CPOs.

  11. Optimizing the effectiveness of a mechanical suture-based anulus fibrosus repair construct in an acute failure laboratory simulation.

    PubMed

    Bartlett, Ashley; Wales, Larry; Houfburg, Rodney; Durfee, William K; Griffith, Steven L; Bentley, Ishmael

    2013-10-01

    In vitro comparative laboratory experiments. This study developed a laboratory apparatus that measured resistance to failure using pressures similar to the intradiscal pressure of a lumbar spinal disk. Various combinations of an anular repair device were compared. Herniated material of the intervertebral disk is removed during a lumbar discectomy; however, the defect in the anulus fibrosus remains and can provide a pathway for future herniation. Repairing the anulus fibrosus could mitigate this reherniation and improve patient outcomes. A pneumatic cylinder was used to increase the pressure of a sealed chamber until artificial nucleus pulposus material was expelled through either a 3-mm-diameter circular or a 6-mm slit anular defect created in a surrogate anulus fibrosus. Each unrepaired condition was compared with 3 repaired conditions using a commercially available soft tissue repair system. The repaired conditions included: (1) a single tension band; (2) 2 tension bands in a cruciate pattern; or (3) 2 tension bands in a parallel pattern. The maximum pressure at the point of extrusion of the internal chamber material and failure or nonfailure of the repair were measured. Significant differences were detected (P<0.05) in maximum failure pressures for the nonrepaired (control) versus repaired conditions. With 1 or 2 tension bands repairing the circular defect, the maximum failure pressure increased by approximately 76% and 131%, respectively. In addition, the failure pressure for 2 tension bands in either a cruciate or parallel configuration was not different, and was approximately 32% higher (P<0.05) than a single tension band in the case of the circular defect. Similar results were seen for the slit defect, with the exception that no difference between the repaired conditions (ie, single vs. 2 tension bands) was detected.
This laboratory simulation demonstrated that repairing the anulus fibrosus after a discectomy procedure can be beneficial for retaining intradiscal material. The use of 2 tension bands, versus a single tension band, in either a cruciate or parallel configuration may further improve the ability to retain disk material.

  12. Assessment of field-related influences on polychlorinated biphenyl exposures and sorbent amendment using polychaete bioassays and passive sampler measurements

    USGS Publications Warehouse

    Janssen, E.M.; Oen, A.M.; Luoma, S.N.; Luthy, R.G.

    2011-01-01

    Field-related influences on polychlorinated biphenyl (PCB) exposure were evaluated by employing caged deposit-feeders, Neanthes arenaceodentata, along with polyoxymethylene (POM) samplers using parallel in situ and ex situ bioassays with homogenized untreated or activated carbon (AC) amended sediment. The AC amendment achieved a remedial efficiency in reducing bioaccumulation by 90% in the laboratory and by 44% in the field transplants. In situ measurements showed that PCB uptake by POM samplers was greater for POM placed in the surface sediment compared with the underlying AC amendment, suggesting that tidal exchange of surrounding material with similar PCB availability as untreated sediment was redeposited in the cages. Polychlorinated biphenyl bioaccumulation with caged polychaetes from untreated sediment was half as large under field conditions compared with laboratory conditions. A biodynamic model was used to confirm and quantify the different processes that could have influenced these results. Three factors appeared most influential in the bioassays: AC amendment significantly reduces bioavailability under laboratory and field conditions; sediment deposition within test cages in the field partially masks the remedial benefit of underlying AC-amended sediment; and deposit-feeders exhibit less PCB uptake from untreated sediment when feeding is reduced. Ex situ and in situ experiments inevitably show some differences that are associated with measurement methods and effects of the environment. Parallel ex situ and in situ bioassays, passive sampler measurements, and quantifying important processes with a model can tease apart these field influences. © 2010 SETAC.

  13. Reduction of product-related species during the fermentation and purification of a recombinant IL-1 receptor antagonist at the laboratory and pilot scale.

    PubMed

    Schirmer, Emily B; Golden, Kathryn; Xu, Jin; Milling, Jesse; Murillo, Alec; Lowden, Patricia; Mulagapati, Srihariraju; Hou, Jinzhao; Kovalchin, Joseph T; Masci, Allyson; Collins, Kathryn; Zarbis-Papastoitsis, Gregory

    2013-08-01

    Through a parallel approach of tracking product quality through fermentation and purification development, a robust process was designed to reduce the levels of product-related species. Three biochemically similar product-related species were identified as byproducts of host-cell enzymatic activity. To modulate intracellular proteolytic activity, key fermentation parameters (temperature, pH, trace metals, EDTA levels, and carbon source) were evaluated through bioreactor optimization, while balancing negative effects on growth, productivity, and oxygen demand. The purification process was based on three non-affinity steps and resolved product-related species by exploiting small charge differences. Using statistical design of experiments for elution conditions, a high-resolution cation exchange capture column was optimized for resolution and recovery. Further reduction of product-related species was achieved by evaluating a matrix of conditions for a ceramic hydroxyapatite column. The optimized fermentation process was transferred from the 2-L laboratory scale to the 100-L pilot scale and the purification process was scaled accordingly to process the fermentation harvest. The laboratory- and pilot-scale processes resulted in similar process recoveries of 60 and 65%, respectively, and in a product that was of equal quality and purity to that of small-scale development preparations. The parallel approach for up- and downstream development was paramount in achieving a robust and scalable clinical process. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Zonal Acoustic Velocimetry in 30-cm, 60-cm, and 3-m Laboratory Models of the Outer Core

    NASA Astrophysics Data System (ADS)

    Rojas, R.; Doan, M. N.; Adams, M. M.; Mautino, A. R.; Stone, D.; Lekic, V.; Lathrop, D. P.

    2016-12-01

    Knowledge of zonal flows and shear is key to understanding magnetic field dynamics in the Earth and in laboratory experiments with Earth-like geometries. Traditional techniques for measuring fluid flow using visualization and particle tracking are not well suited to liquid metal flows. This has led us to develop a flow measurement technique based on acoustic mode velocimetry adapted from helioseismology. As a first step prior to measurements in the liquid sodium experiments, we implement this technique in our 60-cm diameter spherical Couette experiment in air. To account for a more realistic experimental geometry, including deviations from spherical symmetry, we compute predicted frequencies of acoustic normal modes using the finite element method. The higher accuracy of the predicted frequencies allows the identification of over a dozen acoustic modes, and mode identification is further aided by the use of multiple microphones and by analyzing spectra together with those obtained at a variety of nearby Rossby numbers. Differences between the predicted and observed mode frequencies are caused by differences in flow patterns present in the experiment. We compare acoustic mode frequency splittings with theoretical predictions for stationary-fluid and solid-body flow conditions, finding excellent agreement. We also use this technique to estimate the zonal shear in those experiments across a range of Rossby numbers. Finally, we report on initial attempts to use this technique in liquid sodium in the 3-meter diameter experiment and on parallel experiments performed in water in the 30-cm diameter experiment.

  15. Digital Optical Control System

    NASA Astrophysics Data System (ADS)

    Jordan, David H.; Tipton, Charles A.; Christmann, Charles E.; Hochhausler, Nils P.

    1988-09-01

    We describe the digital optical control system (DOCS), a state-of-the-art controller for electrical feedback in an optical system. The need for a versatile optical controller arose from a number of unique experiments being performed by the Air Force Weapons Laboratory. These experiments use similar detectors and actuator-controlled mirrors, but the control requirements vary greatly. The experiments have in common a requirement for parallel control systems. The DOCS satisfies these needs by allowing several control systems to occupy a single chassis with one master controller. The architecture was designed to allow upward compatibility with future configurations. Combinations of off-the-shelf and custom boards are configured to meet the requirements of each experiment. The configuration described here was used to control piston error to λ/80 at a wavelength of 0.51 μm. A peak sample rate of 8 kHz, yielding a closed-loop bandwidth of 800 Hz, was achieved.

  16. On-board landmark navigation and attitude reference parallel processor system

    NASA Technical Reports Server (NTRS)

    Gilbert, L. E.; Mahajan, D. T.

    1978-01-01

    An approach to autonomous navigation and attitude reference for earth observing spacecraft is described along with the landmark identification technique based on a sequential similarity detection algorithm (SSDA). Laboratory experiments undertaken to determine if better than one pixel accuracy in registration can be achieved consistent with onboard processor timing and capacity constraints are included. The SSDA is implemented using a multi-microprocessor system including synchronization logic and chip library. The data is processed in parallel stages, effectively reducing the time to match the small known image within a larger image as seen by the onboard image system. Shared memory is incorporated in the system to help communicate intermediate results among microprocessors. The functions include finding mean values and summation of absolute differences over the image search area. The hardware is a low power, compact unit suitable to onboard application with the flexibility to provide for different parameters depending upon the environment.
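The sequential similarity detection algorithm (SSDA) named above can be sketched as a sum-of-absolute-differences template search with early abandonment; this minimal single-threaded version (function name, shapes, and the row-by-row abandonment schedule are illustrative assumptions, not the flight implementation) shows the core idea:

```python
import numpy as np

def ssda_match(image, template):
    """Brute-force sum-of-absolute-differences (SAD) template search.
    Mirrors the SSDA idea: the error accumulates one template row at a
    time, and a candidate position is abandoned as soon as its running
    SAD exceeds the best SAD found so far. Returns (row, col) of the
    best match of `template` inside `image`."""
    ih, iw = image.shape
    th, tw = template.shape
    best_pos, best_sad = (0, 0), np.inf
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            sad = 0
            for i in range(th):  # accumulate one template row at a time
                sad += int(np.abs(image[r + i, c:c + tw] - template[i]).sum())
                if sad >= best_sad:  # early abandonment: the SSDA trick
                    break
            if sad < best_sad:
                best_sad, best_pos = sad, (r, c)
    return best_pos
```

In the onboard system this inner accumulation (mean values, summed absolute differences over the search area) is what was distributed across the microprocessors, with shared memory holding the intermediate sums.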

  17. A real time, FEM based optimal control algorithm and its implementation using parallel processing hardware (transputers) in a microprocessor environment

    NASA Technical Reports Server (NTRS)

    Patten, William Neff

    1989-01-01

    There is an evident need to discover a means of establishing reliable, implementable controls for systems that are plagued by nonlinear and/or uncertain model dynamics. The development of a generic controller design tool for tough-to-control systems is reported. The method utilizes a moving-grid, time finite element based solution of the necessary conditions that describe an optimal controller for a system. The technique produces a discrete feedback controller. Real time laboratory experiments are now being conducted to demonstrate the viability of the method. The resulting algorithm is being implemented in a microprocessor environment. Critical computational tasks are accomplished using a low cost, on-board multiprocessor (INMOS T800 Transputers) and parallel processing. Progress to date validates the methodology presented. Applications of the technique to the control of highly flexible robotic appendages are suggested.

  18. Evolution of sausage and helical modes in magnetized thin-foil cylindrical liners driven by a Z-pinch

    NASA Astrophysics Data System (ADS)

    Yager-Elorriaga, D. A.; Lau, Y. Y.; Zhang, P.; Campbell, P. C.; Steiner, A. M.; Jordan, N. M.; McBride, R. D.; Gilgenbach, R. M.

    2018-05-01

    In this paper, we present experimental results on axially magnetized (Bz = 0.5 - 2.0 T), thin-foil (400 nm-thick) cylindrical liner-plasmas driven with ~600 kA by the Michigan Accelerator for Inductive Z-Pinch Experiments, which is a linear transformer driver at the University of Michigan. We show that: (1) the applied axial magnetic field, irrespective of its direction (i.e., parallel or anti-parallel to the flow of current), reduces the instability amplitude for pure magnetohydrodynamic (MHD) modes [defined as modes devoid of the acceleration-driven magneto-Rayleigh-Taylor (MRT) instability]; (2) axially magnetized, imploding liners (where MHD modes couple to MRT) generate m = 1 or m = 2 helical modes that persist from the implosion to the subsequent explosion stage; (3) the merging of instability structures is a mechanism that enables the appearance of an exponential instability growth rate for a longer-than-expected period; and (4) an inverse cascade in both the axial and azimuthal wavenumbers, k and m, may be responsible for the final m = 2 helical structure observed in our experiments. These experiments are particularly relevant to the magnetized liner inertial fusion program pursued at Sandia National Laboratories, where helical instabilities have been observed.

  19. Data acquisition for the new muon g-2 experiment at Fermilab

    DOE PAGES

    Gohn, Wesley

    2015-12-23

    A new measurement of the anomalous magnetic moment of the muon, aμ ≡ (g - 2)/2, will be performed at the Fermi National Accelerator Laboratory. The most recent measurement, performed at Brookhaven National Laboratory and completed in 2001, shows a 3.3-3.6 standard deviation discrepancy with the Standard Model predictions for aμ. The new measurement will accumulate 21 times those statistics, measuring aμ to 140 ppb and reducing the uncertainty by a factor of 4. The data acquisition system for this experiment must have the ability to record deadtime-free records from 700 μs muon spills at a raw data rate of 18 GB per second. Data will be collected using 1296 channels of μTCA-based 800 MSPS, 12 bit waveform digitizers and processed in a layered array of networked commodity processors with 24 GPUs working in parallel to perform fast recording and processing of detector signals during the spill. The system will be controlled using the MIDAS data acquisition software package. The described data acquisition system is currently being constructed and will be fully operational before the start of the experiment in 2017.

  20. Data Acquisition for the New Muon g-2 Experiment at Fermilab

    NASA Astrophysics Data System (ADS)

    Gohn, Wesley

    2015-12-01

    A new measurement of the anomalous magnetic moment of the muon, aμ ≡ (g - 2)/2, will be performed at the Fermi National Accelerator Laboratory. The most recent measurement, performed at Brookhaven National Laboratory and completed in 2001, shows a 3.3-3.6 standard deviation discrepancy with the Standard Model predictions for aμ. The new measurement will accumulate 21 times those statistics, measuring aμ to 140 ppb and reducing the uncertainty by a factor of 4. The data acquisition system for this experiment must have the ability to record deadtime-free records from 700 μs muon spills at a raw data rate of 18 GB per second. Data will be collected using 1296 channels of μTCA-based 800 MHz, 12 bit waveform digitizers and processed in a layered array of networked commodity processors with 24 GPUs working in parallel to perform a fast recording and processing of detector signals during the spill. The system will be controlled using the MIDAS data acquisition software package. The described data acquisition system is currently being constructed, and will be fully operational before the start of the experiment in 2017.

  1. Simulation of Martian surface-atmosphere interaction in a space-simulator: Technical considerations and feasibility

    NASA Technical Reports Server (NTRS)

    Moehlmann, D.; Kochan, H.

    1992-01-01

    The Space Simulator of the German Aerospace Research Establishment at Cologne, formerly used for testing satellites, has been, since 1987, the central unit within the research sub-program 'Comet-Simulation' (KOSI). The KOSI team has investigated physical processes relevant to comets and their surfaces. As a byproduct, we gained experience in sample handling under simulated space conditions. To broaden the scope of the research activities of the DLR Institute of Space Simulation, an extension to 'Laboratory-Planetology' is planned. Following the KOSI experiments, a Mars surface simulation with realistic minerals and surface soil in a suitable environment (temperature, pressure, and CO2 atmosphere) is foreseen as the next step. Here, our main interest is centered on the thermophysical properties of the Martian surface and on energy transport (and related gas transport) through the surface. These laboratory simulation activities can be related to space missions as typical pre-mission and during-the-mission support of experiment design and operations (simulation in parallel); post-mission experiments for confirmation and interpretation of results are also of great value. The physical dimensions of the Space Simulator (a cylinder of about 2.5 m diameter and 5 m length) allow for testing and qualification of experimental hardware under realistic Martian conditions.

  2. Two nonlinear dynamics plasma astrophysics experiments at LANL

    NASA Astrophysics Data System (ADS)

    Intrator, T. P.; Weber, T. E.; Feng, Y.; Sears, J. A.; Swan, H.; Hutchinson, T.; Boguski, J.; Gao, K.; Chapdelaine, L.; Dunn, J.

    2013-10-01

    Two laboratory experiments at Los Alamos National Laboratory (LANL) have been built to gain access to a wide range of fundamental plasma physics issues germane to astrophysical, space, and fusion plasmas. The overarching theme is magnetized plasma dynamics, including currents, MHD forces and instabilities, sheared flows and shocks, and the creation and annihilation of magnetic field. The Reconnection Scaling Experiment (RSX) creates current sheets and flux ropes that exhibit fully 3D dynamics: they can kink, bounce, merge and reconnect, shred, and reform in complicated ways. The most recent movies from a large detailed data set describe the 3D magnetic structure and helicity budget of a driven and dissipative system that spontaneously self-saturates a kink instability. The Magnetized Shock Experiment (MSX) uses a field-reversed configuration (FRC) that is ejected at high speed and then stagnated against a stopping mirror field, which drives a collisionless magnetized shock. A plasmoid accelerator will also access supercritical shocks at much larger Alfven Mach numbers. Unique features include access to parallel, oblique, and perpendicular shocks in regions much larger than the ion gyroradius and inertial length, large magnetic and fluid Reynolds numbers, and ample volume for turbulence. Center for Magnetic Self Organization, NASA Geospace NNHIOA044I-Basic, Department of Energy DE-AC52-06NA25369.

  3. Modularized Parallel Neutron Instrument Simulation on the TeraGrid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Meili; Cobb, John W; Hagen, Mark E

    2007-01-01

    In order to build a bridge between the TeraGrid (TG), a national-scale cyberinfrastructure resource, and neutron science, the Neutron Science TeraGrid Gateway (NSTG) is focused on introducing productive HPC usage to the neutron science community, primarily the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory (ORNL). Monte Carlo simulations are used as a powerful tool for instrument design and optimization at SNS. One of the successful efforts of a collaboration team composed of NSTG HPC experts and SNS instrument scientists is the development of a software facility named PSoNI, Parallelizing Simulations of Neutron Instruments. By parallelizing traditional serial instrument simulation on TeraGrid resources, PSoNI quickly computes full instrument simulations at statistical levels sufficient for instrument design. Upon successful SNS commissioning at the end of 2007, three of the five commissioned instruments in the SNS target station will be available for initial users. Advanced instrument study, proposal feasibility evaluation, and experiment planning are on the immediate schedule of SNS, which poses further requirements, such as flexibility and high runtime efficiency, on fast instrument simulation. PSoNI has been redesigned to meet these new challenges and a preliminary version has been developed on TeraGrid. This paper explores the motivation and goals of the new design and the improved software structure. Further, it describes the new features realized in MPI-parallelized McStas running high-resolution design simulations of the SEQUOIA and BSS instruments at SNS. A discussion of future work, which targets fast simulation for automated experiment adjustment and comparison of models to data in analysis, is also presented.

  4. Dynamics of a reconnection-driven runaway ion tail in a reversed field pinch plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, J. K., E-mail: jkanders@wisc.edu; Kim, J.; Bonofiglo, P. J.

    2016-05-15

    While reconnection-driven ion heating is common in laboratory and astrophysical plasmas, the underlying mechanisms for converting magnetic to kinetic energy remain not fully understood. Reversed field pinch discharges are often characterized by rapid ion heating during impulsive reconnection, generating an ion distribution with an enhanced bulk temperature, mainly perpendicular to the magnetic field. In the Madison Symmetric Torus, a subset of discharges with the strongest reconnection events develop a very anisotropic, high energy tail parallel to the magnetic field in addition to bulk perpendicular heating, which produces a fusion neutron flux orders of magnitude higher than that expected from a Maxwellian distribution. Here, we demonstrate that two factors in addition to a perpendicular bulk heating mechanism must be considered to explain this distribution. First, ion runaway can occur in the strong parallel-to-B electric field induced by a rapid equilibrium change triggered by reconnection-based relaxation; this effect is particularly strong on perpendicularly heated ions, which experience a reduced frictional drag relative to bulk ions. Second, the confinement of ions varies dramatically as a function of velocity. Whereas thermal ions are governed by stochastic diffusion along tearing-altered field lines (and radial diffusion increases with parallel speed), sufficiently energetic ions are well confined, only weakly affected by a stochastic magnetic field. High energy ions traveling mainly in the direction of the toroidal plasma current are nearly classically confined, while counter-propagating ions experience an intermediate confinement, greater than that of thermal ions but significantly less than classical expectations. The details of ion confinement tend to reinforce the asymmetric drive of the parallel electric field, resulting in a very asymmetric, anisotropic distribution.

  5. A heating experiment in the argillites in the Meuse/Haute-Marne underground research laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wileveau, Yannick; Su, Kun; Ghoreychi, Mehdi

    2007-07-01

    A heating experiment named TER is being conducted with the objectives of identifying the thermal properties of, and enhancing knowledge of THM processes in, the Callovo-Oxfordian clay at the Meuse/Haute-Marne Underground Research Laboratory (France). The in situ experiment has been in operation since early 2006. The heater, 3 m in length, is designed to inject power into the undisturbed zone 6 m from the gallery wall. A heater packer is inflated in a metallic tubing. During the experiment, numerous sensors emplaced in the surrounding rock monitor the evolution of temperature, pore-water pressure, and deformation. The models and numerical codes applied should be validated by comparing the modeling results with the measurements. In parallel, laboratory tests have been performed in order to compare results at two different scales (centimeter up to meter scale). In this paper, we present a general description of the TER experiment, including installation of the heater equipment and the surrounding instrumentation. Details of the in situ measurements of temperature, pore-pressure, and strain evolution are given for the several heating and cooling phases. The thermal conductivity and some predominant parameters in THM processes (such as the linear thermal expansion coefficient and permeability) will be discussed. (authors)

  6. Final report for the Tera Computer TTI CRADA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidson, G.S.; Pavlakos, C.; Silva, C.

    1997-01-01

    Tera Computer and Sandia National Laboratories have completed a CRADA, which examined the Tera Multi-Threaded Architecture (MTA) for use with large codes of importance to industry and DOE. The MTA is an innovative architecture that uses parallelism to mask latency between memories and processors. The physical implementation is a parallel computer with high cross-section bandwidth and GaAs processors designed by Tera, which support many small computation threads and fast, lightweight context switches between them. When any thread blocks while waiting for memory accesses to complete, another thread immediately begins execution so that high CPU utilization is maintained. The Tera MTA parallel computer has a single, global address space, which is appealing when porting existing applications to a parallel computer. This ease of porting is further enabled by compiler technology that helps break computations into parallel threads. DOE and Sandia National Laboratories were interested in working with Tera to further develop this computing concept. While Tera Computer would continue the hardware development and compiler research, Sandia National Laboratories would work with Tera to ensure that their compilers worked well with important Sandia codes, most particularly CTH, a shock physics code used for weapon safety computations. In addition to that important code, Sandia National Laboratories would complete research on a robotic path-planning code, SANDROS, which is important in manufacturing applications, and would evaluate the MTA performance on this code. Finally, Sandia would work directly with Tera to develop 3D visualization codes appropriate for use with the MTA. Each of these tasks has been completed to the extent possible, given that Tera has only just completed the MTA hardware; all of the CRADA work had to be done on simulators.

  7. Proceedings of the Flat-Plate Solar Array Project Research Forum on the Design of Flat-Plate Photovoltaic Arrays for Central Stations

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The Flat-Plate Solar Array Project focuses on advancing technologies relevant to the design and construction of megawatt-level central station systems. Photovoltaic modules and arrays for flat-plate central station or other large-scale electric power production facilities require the establishment of a technical base that resolves design issues and results in practical and cost-effective configurations. Design, qualification, and maintenance issues related to central station arrays, derived from the engineering and operating experiences of early applications and parallel laboratory research activities, are investigated. Technical issues are examined from the viewpoints of the utility engineer, architect/engineer, and laboratory researcher. Topics on optimum source circuit designs, module insulation design for high system voltages, array safety, structural interface design, measurements, and array operation and maintenance are discussed.

  8. Constituent order and semantic parallelism in online comprehension: eye-tracking evidence from German.

    PubMed

    Knoeferle, Pia; Crocker, Matthew W

    2009-12-01

    Reading times for the second conjunct of and-coordinated clauses are faster when the second conjunct parallels the first conjunct in its syntactic or semantic (animacy) structure than when its structure differs (Frazier, Munn, & Clifton, 2000; Frazier, Taft, Roeper, & Clifton, 1984). What remains unclear, however, is the time course of parallelism effects, their scope, and the kinds of linguistic information to which they are sensitive. Findings from the first two eye-tracking experiments revealed incremental constituent order parallelism across the board, both during structural disambiguation (Experiment 1) and in sentences with unambiguously case-marked constituent order (Experiment 2), as well as for both marked and unmarked constituent orders (Experiments 1 and 2). Findings from Experiment 3 revealed effects of both constituent order and subtle semantic (noun phrase similarity) parallelism. Together our findings provide evidence for an across-the-board account of parallelism for processing and-coordinated clauses, in which both constituent order and semantic aspects of representations contribute towards incremental parallelism effects. We discuss our findings in the context of existing findings on parallelism and priming, as well as mechanisms of sentence processing.

  9. Colonization by aerobic bacteria in karst: Laboratory and in situ experiments

    USGS Publications Warehouse

    Personne, J.-C.; Poty, F.; Mahler, B.J.; Drogue, C.

    2004-01-01

    Experiments were carried out to investigate the potential for bacterial colonization of different substrates in karst aquifers and the nature of the colonizing bacteria. Laboratory batch experiments were performed using limestone and PVC as substrates, a natural bacterial isolate and a known laboratory strain (Escherichia coli [E. coli]) as inocula, and karst ground water and a synthetic formula as growth media. In parallel, fragments of limestone and granite were submerged for more than one year in boreholes penetrating two karst aquifers; the boreholes are periodically contaminated by enteric bacteria from waste water. Once a month, rock samples were removed and the colonizing bacteria quantified and identified. The batch experiments demonstrated that the natural isolate and E. coli both readily colonized limestone surfaces using karst ground water as the growth medium. In contrast, bacterial colonization of both the limestone and granite substrates, when submerged in the karst, was less intense. More than 300 bacterial strains were isolated over the period sampled, but no temporal pattern in colonization with respect to strain was seen, and colonization by E. coli was notably absent, although strains of Salmonella and Citrobacter were each observed once. Samples suspended in boreholes penetrating highly fractured zones were less densely colonized than those in the borehole penetrating a less fractured zone. The results suggest that contamination of karst aquifers by enteric bacteria is unlikely to be persistent. We hypothesize that this may be a result of the high flow velocities found in karst conduits, and of predation of colonizing bacteria by autochthonous zooplankton.

  10. Parallel labeling experiments and metabolic flux analysis: Past, present and future methodologies.

    PubMed

    Crown, Scott B; Antoniewicz, Maciek R

    2013-03-01

    Radioactive and stable isotopes have been applied for decades to elucidate metabolic pathways and quantify carbon flow in cellular systems using mass and isotope balancing approaches. Isotope-labeling experiments can be conducted as a single tracer experiment, or as parallel labeling experiments. In the latter case, several experiments are performed under identical conditions except for the choice of substrate labeling. In this review, we highlight robust approaches for probing metabolism and addressing metabolically related questions through parallel labeling experiments. In the first part, we provide a brief historical perspective on parallel labeling experiments, from the early metabolic studies when radioisotopes were predominant to present-day applications based on stable isotopes. We also elaborate on important technical and theoretical advances that have facilitated the transition from radioisotopes to stable isotopes. In the second part of the review, we focus on parallel labeling experiments for (13)C-metabolic flux analysis ((13)C-MFA). Parallel experiments offer several advantages that include: tailoring experiments to resolve specific fluxes with high precision; reducing the length of labeling experiments by introducing multiple entry-points of isotopes; validating biochemical network models; and improving the performance of (13)C-MFA in systems where the number of measurements is limited. We conclude by discussing some challenges facing the use of parallel labeling experiments for (13)C-MFA and highlight the need to address issues related to biological variability, data integration, and rational tracer selection. Copyright © 2012 Elsevier Inc. All rights reserved.
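The core benefit of fitting parallel tracer datasets in one regression can be shown with a toy example; the two-tracer sensitivities and the "measured" values below are invented for illustration, not taken from any real (13)C-MFA study:

```python
import numpy as np

# Toy model: a single branch-point flux fraction f (0..1) determines the
# labeled fraction of a product under each tracer. The linear sensitivities
# are hypothetical; real MFA models are nonlinear isotopomer balances.
def predicted(f):
    return np.array([0.3 + 0.5 * f,    # response under tracer A
                     0.8 - 0.6 * f])   # response under tracer B

measured = np.array([0.55, 0.50])      # hypothetical parallel-experiment data

# Fit both experiments simultaneously: one least-squares objective pools
# the residuals from the two tracers, as parallel-labeling MFA does.
grid = np.linspace(0.0, 1.0, 10001)
sse = [np.sum((predicted(f) - measured) ** 2) for f in grid]
f_hat = float(grid[int(np.argmin(sse))])
```

Because the two tracers respond to `f` with different (here, opposite-signed) sensitivities, the pooled objective pins down the flux fraction more tightly than either experiment alone, which is the precision argument made in the review.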

  11. Optimized R functions for analysis of ecological community data using the R virtual laboratory (RvLab)

    PubMed Central

    Varsos, Constantinos; Patkos, Theodore; Pavloudi, Christina; Gougousis, Alexandros; Ijaz, Umer Zeeshan; Filiopoulou, Irene; Pattakos, Nikolaos; Vanden Berghe, Edward; Fernández-Guerra, Antonio; Faulwetter, Sarah; Chatzinikolaou, Eva; Pafilis, Evangelos; Bekiari, Chryssoula; Doerr, Martin; Arvanitidis, Christos

    2016-01-01

    Background: Parallel data manipulation using R has previously been addressed by members of the R community; however, most of these studies produce ad hoc solutions that are not readily available to the average R user. Our targeted users, ranging from expert ecologists/microbiologists to computational biologists, often experience difficulties in finding optimal ways to exploit the full capacity of their computational resources. In addition, improving the performance of commonly used R scripts becomes increasingly difficult, especially with large datasets. Furthermore, the implementations described here can be of significant interest to expert bioinformaticians or R developers. Therefore, our goals can be summarized as: (i) description of a complete methodology for the analysis of large datasets by combining the capabilities of diverse R packages, and (ii) presentation of their application through a virtual R laboratory (RvLab) that makes execution of complex functions and visualization of results easy and readily available to the end-user. New information: In this paper, the novelty stems from implementations of parallel methodologies that rely on the processing of data at different levels of abstraction and the availability of these processes through an integrated portal. Parallel-implementation R packages, such as the pbdMPI (Programming with Big Data - Interface to MPI) package, are used to implement Single Program Multiple Data (SPMD) parallelization of primitive mathematical operations, allowing for interplay with functions of the vegan package. The dplyr and RPostgreSQL R packages are further integrated, offering connections to dataframe-like objects (databases) as secondary storage solutions whenever memory demands exceed available RAM resources. The RvLab runs on a PC cluster, using R version 3.1.2 (2014-10-31) on an x86_64-pc-linux-gnu (64-bit) platform, and offers an intuitive virtual environment interface enabling users to perform analysis of ecological and microbial communities based on optimized vegan functions. A beta version of the RvLab is available after registration at: https://portal.lifewatchgreece.eu/ PMID:27932907
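The SPMD pattern this record describes (the same program applied to each worker's slice of the data, followed by a reduce) can be sketched in Python rather than pbdMPI/R; the chunk count and the kernel are arbitrary choices for illustration:

```python
from concurrent.futures import ThreadPoolExecutor

import numpy as np

def spmd_task(chunk):
    # The same program runs on every worker's slice of the data (SPMD);
    # here the "primitive mathematical operation" is a partial sum of squares.
    return float(np.sum(np.square(chunk)))

data = np.arange(10_000, dtype=np.float64)
chunks = np.array_split(data, 4)                  # one slice per worker ("rank")
with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(spmd_task, chunks))  # parallel map over slices
total = sum(partials)                             # reduce step, as MPI allreduce would
```

Threads stand in here for MPI ranks (NumPy releases the GIL inside the kernel); in pbdMPI each rank would run the identical script on its own slice and combine partial results with an MPI reduction.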

  12. Optimized R functions for analysis of ecological community data using the R virtual laboratory (RvLab).

    PubMed

    Varsos, Constantinos; Patkos, Theodore; Oulas, Anastasis; Pavloudi, Christina; Gougousis, Alexandros; Ijaz, Umer Zeeshan; Filiopoulou, Irene; Pattakos, Nikolaos; Vanden Berghe, Edward; Fernández-Guerra, Antonio; Faulwetter, Sarah; Chatzinikolaou, Eva; Pafilis, Evangelos; Bekiari, Chryssoula; Doerr, Martin; Arvanitidis, Christos

    2016-01-01

    Parallel data manipulation using R has previously been addressed by members of the R community; however, most of these studies produce ad hoc solutions that are not readily available to the average R user. Our targeted users, ranging from expert ecologists/microbiologists to computational biologists, often experience difficulties in finding optimal ways to exploit the full capacity of their computational resources. In addition, improving the performance of commonly used R scripts becomes increasingly difficult, especially with large datasets. Furthermore, the implementations described here can be of significant interest to expert bioinformaticians or R developers. Therefore, our goals can be summarized as: (i) description of a complete methodology for the analysis of large datasets by combining the capabilities of diverse R packages, and (ii) presentation of their application through a virtual R laboratory (RvLab) that makes execution of complex functions and visualization of results easy and readily available to the end-user. In this paper, the novelty stems from implementations of parallel methodologies that rely on the processing of data at different levels of abstraction and the availability of these processes through an integrated portal. Parallel-implementation R packages, such as the pbdMPI (Programming with Big Data - Interface to MPI) package, are used to implement Single Program Multiple Data (SPMD) parallelization of primitive mathematical operations, allowing for interplay with functions of the vegan package. The dplyr and RPostgreSQL R packages are further integrated, offering connections to dataframe-like objects (databases) as secondary storage solutions whenever memory demands exceed available RAM resources. The RvLab runs on a PC cluster, using R version 3.1.2 (2014-10-31) on an x86_64-pc-linux-gnu (64-bit) platform, and offers an intuitive virtual environment interface enabling users to perform analysis of ecological and microbial communities based on optimized vegan functions. A beta version of the RvLab is available after registration at: https://portal.lifewatchgreece.eu/.

  13. Preliminary summary of the ETF conceptual studies

    NASA Technical Reports Server (NTRS)

    Seikel, G. R.; Bercaw, R. W.; Pearson, C. V.; Owens, W. R.

    1978-01-01

    Power plant studies have shown the attractiveness of MHD topped steam power plants for baseload utility applications. To realize these advantages, a three-phase development program was initiated. In the first phase, the engineering data and experience were developed for the design and construction of a pilot plant, the Engineering Test Facility (ETF). Results of the ETF studies are reviewed. These three parallel independent studies were conducted by industrial teams led by the AVCO Everett Research Laboratory, the General Electric Corporation, and the Westinghouse Corporation. A preliminary analysis and the status of the critical evaluation of these results are presented.

  14. Implementation of a production Ada project: The GRODY study

    NASA Technical Reports Server (NTRS)

    Godfrey, Sara; Brophy, Carolyn Elizabeth

    1989-01-01

    The use of the Ada language and design methodologies that encourage full use of its capabilities have a strong impact on all phases of the software development project life cycle. At the National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC), the Software Engineering Laboratory (SEL) conducted an experiment in parallel development of two flight dynamics systems in FORTRAN and Ada. The differences observed during the implementation, unit testing, and integration phases of the two projects are described and the lessons learned during the implementation phase of the Ada development are outlined. Included are recommendations for future Ada development projects.

  15. Turbulence-driven anisotropic electron tail generation during magnetic reconnection

    NASA Astrophysics Data System (ADS)

    DuBois, A. M.; Scherer, A.; Almagri, A. F.; Anderson, J. K.; Pandya, M. D.; Sarff, J. S.

    2018-05-01

    Magnetic reconnection (MR) plays an important role in particle transport, energization, and acceleration in space, astrophysical, and laboratory plasmas. In the Madison Symmetric Torus reversed field pinch, discrete MR events release large amounts of energy from the equilibrium magnetic field, a fraction of which is transferred to electrons and ions. Previous experiments revealed an anisotropic electron tail that favors the perpendicular direction and is symmetric in the parallel direction. New profile measurements of x-ray emission show that the tail distribution is localized near the magnetic axis, consistent with modeling of the bremsstrahlung emission. The tail appears first near the magnetic axis and then spreads radially, and the dynamics of the anisotropy and diffusion are discussed. The data presented imply that the electron tail formation likely results from a turbulent wave-particle interaction and provide evidence that high energy electrons are escaping the core-localized region through pitch angle scattering into the parallel direction, followed by stochastic parallel transport to the plasma edge. New measurements also show a strong correlation between high energy x-ray measurements and tearing mode dynamics, suggesting that the coupling between core and edge tearing modes is essential for energetic electron tail formation.

  16. A Technological Acceptance of Remote Laboratory in Chemistry Education

    ERIC Educational Resources Information Center

    Ling, Wendy Sing Yii; Lee, Tien Tien; Tho, Siew Wei

    2017-01-01

    The purpose of this study is to evaluate the technological acceptance of Chemistry students, and the opinions of Chemistry lecturers and laboratory assistants towards the use of remote laboratory in Chemistry education. The convergent parallel design mixed method was carried out in this study. The instruments involved were questionnaire and…

  17. The Intermediate Neutrino Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, C.; Alonso, J. R.; Ankowski, A. M.

    2017-04-03

    The US neutrino community gathered at the Workshop on the Intermediate Neutrino Program (WINP) at Brookhaven National Laboratory February 4-6, 2015 to explore opportunities in neutrino physics over the next five to ten years. Scientists from particle, astroparticle and nuclear physics participated in the workshop. The workshop examined promising opportunities for neutrino physics in the intermediate term, including possible new small to mid-scale experiments, US contributions to large experiments, upgrades to existing experiments, R&D plans and theory. The workshop was organized into two sets of parallel working group sessions, divided by physics topics and technology. Physics working groups covered topics on Sterile Neutrinos, Neutrino Mixing, Neutrino Interactions, Neutrino Properties and Astrophysical Neutrinos. Technology sessions were organized into Theory, Short-Baseline Accelerator Neutrinos, Reactor Neutrinos, Detector R&D and Source, Cyclotron and Meson Decay at Rest sessions. This report summarizes discussion and conclusions from the workshop.

  18. Research and test facilities for development of technologies and experiments with commercial applications

    NASA Technical Reports Server (NTRS)

    1989-01-01

    One of NASA'S agency-wide goals is the commercial development of space. To further this goal NASA is implementing a policy whereby U.S. firms are encouraged to utilize NASA facilities to develop and test concepts having commercial potential. Goddard, in keeping with this policy, will make the facilities and capabilities described in this document available to private entities at a reduced cost and on a noninterference basis with internal NASA programs. Some of these facilities include: (1) the Vibration Test Facility; (2) the Battery Test Facility; (3) the Large Area Pulsed Solar Simulator Facility; (4) the High Voltage Testing Facility; (5) the Magnetic Field Component Test Facility; (6) the Spacecraft Magnetic Test Facility; (7) the High Capacity Centrifuge Facility; (8) the Acoustic Test Facility; (9) the Electromagnetic Interference Test Facility; (10) the Space Simulation Test Facility; (11) the Static/Dynamic Balance Facility; (12) the High Speed Centrifuge Facility; (13) the Optical Thin Film Deposition Facility; (14) the Gold Plating Facility; (15) the Paint Formulation and Application Laboratory; (16) the Propulsion Research Laboratory; (17) the Wallops Range Facility; (18) the Optical Instrument Assembly and Test Facility; (19) the Massively Parallel Processor Facility; (20) the X-Ray Diffraction and Scanning Auger Microscopy/Spectroscopy Laboratory; (21) the Parts Analysis Laboratory; (22) the Radiation Test Facility; (23) the Ainsworth Vacuum Balance Facility; (24) the Metallography Laboratory; (25) the Scanning Electron Microscope Laboratory; (26) the Organic Analysis Laboratory; (27) the Outgassing Test Facility; and (28) the Fatigue, Fracture Mechanics and Mechanical Testing Laboratory.

  19. Experience during early adulthood shapes the learning capacities and the number of synaptic boutons in the mushroom bodies of honey bees (Apis mellifera).

    PubMed

    Cabirol, Amélie; Brooks, Rufus; Groh, Claudia; Barron, Andrew B; Devaud, Jean-Marc

    2017-10-01

    The honey bee mushroom bodies (MBs) are brain centers required for specific learning tasks. Here, we show that environmental conditions experienced as young adults affect the maturation of MB neuropil and performance in an MB-dependent learning task. Specifically, olfactory reversal learning was selectively impaired following early exposure to an impoverished environment lacking some of the sensory and social interactions present in the hive. In parallel, the overall number of synaptic boutons increased within the MB olfactory neuropil, whose volume remained unaffected. This suggests that experience of the rich in-hive environment promotes MB maturation and the development of MB-dependent learning capacities. © 2017 Cabirol et al.; Published by Cold Spring Harbor Laboratory Press.

  20. Experimental evolution and the dynamics of adaptation and genome evolution in microbial populations.

    PubMed

    Lenski, Richard E

    2017-10-01

    Evolution is an on-going process, and it can be studied experimentally in organisms with rapid generations. My team has maintained 12 populations of Escherichia coli in a simple laboratory environment for >25 years and 60 000 generations. We have quantified the dynamics of adaptation by natural selection, seen some of the populations diverge into stably coexisting ecotypes, described changes in the bacteria's mutation rate, observed the new ability to exploit a previously untapped carbon source, characterized the dynamics of genome evolution and used parallel evolution to identify the genetic targets of selection. I discuss what the future might hold for this particular experiment, briefly highlight some other microbial evolution experiments and suggest how the fields of experimental evolution and microbial ecology might intersect going forward.

  1. Calibration of the QCM/SAW Cascade Impactor for Measurement of Ozone in the Stratosphere

    NASA Technical Reports Server (NTRS)

    Wright, Cassandra K.; Sims, S. C.; Peterson, C. B.; Morris, V. R.

    1997-01-01

    The Quartz Crystal Microbalance Surface Acoustic Wave (QCM/SAW) cascade impactor collects size-fractionated distributions of aerosols on a series of 10 MHz quartz crystals and employs SAW devices coated with chemical sensors for gas detection. Presently, we are calibrating the ER-2 certified QCM/SAW cascade impactor in the laboratory for the detection of ozone. Experiments have been performed to characterize the QCM and SAW mass loading, saturation limits, mass frequency relationships, and sensitivity. We are also characterizing sampling efficiency by measuring the loss of ozone on different materials. There are parallel experiments underway to measure the variations in the sensitivity and response of the QCM/SAW crystals as a function of temperature and pressure. Results of the work to date will be shown.

  2. Relationship Between Faults Oriented Parallel and Oblique to Bedding in Neogene Massive Siliceous Mudstones at The Horonobe Underground Research Laboratory, Japan

    NASA Astrophysics Data System (ADS)

    Hayano, Akira; Ishii, Eiichi

    2016-10-01

    This study investigates the mechanical relationship between bedding-parallel and bedding-oblique faults in a Neogene massive siliceous mudstone at the site of the Horonobe Underground Research Laboratory (URL) in Hokkaido, Japan, on the basis of observations of drill-core recovered from pilot boreholes and fracture mapping on shaft and gallery walls. Four bedding-parallel faults with visible fault gouge, named respectively the MM Fault, the Last MM Fault, the S1 Fault, and the S2 Fault (stratigraphically, from the highest to the lowest), were observed in two pilot boreholes (PB-V01 and SAB-1). The distribution of the bedding-parallel faults at 350 m depth in the Horonobe URL indicates that these faults are spread over at least several tens of meters in parallel along a bedding plane. The observation that the bedding-oblique fault displaces the Last MM fault is consistent with the previous interpretation that the bedding-oblique faults formed after the bedding-parallel faults. In addition, the bedding-oblique faults terminate near the MM and S1 faults, indicating that the bedding-parallel faults with visible fault gouge act to terminate the propagation of younger bedding-oblique faults. In particular, the MM and S1 faults, which have a relatively thick fault gouge, appear to have had a stronger control on the propagation of bedding-oblique faults than did the Last MM fault, which has a relatively thin fault gouge.

  3. Ultrareliable fault-tolerant control systems

    NASA Technical Reports Server (NTRS)

    Webster, L. D.; Slykhouse, R. A.; Booth, L. A., Jr.; Carson, T. M.; Davis, G. J.; Howard, J. C.

    1984-01-01

    It is demonstrated that fault-tolerant computer systems, such as those on the Shuttles, based on redundant, independent operation are a viable alternative in fault-tolerant system designs. The ultrareliable fault-tolerant control system (UFTCS) was developed and tested in laboratory simulations of a UH-1H helicopter. UFTCS includes asymptotically stable independent control elements in a parallel, cross-linked system environment. Static redundancy provides the fault tolerance. Polling is performed among the computers, with the results allowing for time-delay channel variations within tight bounds. When compared with laboratory and actual flight data for the helicopter, the probability of a fault during the first 10 hr of flight, given quintuple computer redundancy, was found to be 1 in 290 billion; two weeks of untended Space Station operations would experience a fault probability of 1 in 24 million. Techniques for avoiding channel divergence problems are identified.
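
    The static-redundancy polling described above can be illustrated with a mid-value (median) selector across redundant channels. This is a hedged sketch only: the abstract does not specify the actual UFTCS polling algorithm, and the function name, tolerance bound, and channel values below are assumptions for illustration.

```python
# Illustrative sketch of static-redundancy voting: mid-value (median)
# selection across redundant computer channels. The UFTCS abstract does not
# give its exact polling rule; this is a common scheme, not the actual one.

def mid_value_select(channels, tolerance):
    """Return the median channel output and the indices of channels whose
    output deviates from it by more than the allowed tolerance."""
    ordered = sorted(channels)
    selected = ordered[len(ordered) // 2]  # median is immune to a minority of faults
    faulty = [i for i, value in enumerate(channels)
              if abs(value - selected) > tolerance]
    return selected, faulty

# With quintuple redundancy, one wildly faulty channel is outvoted:
output, suspects = mid_value_select([10.1, 10.0, 55.0, 9.9, 10.2], tolerance=0.5)
```

    The mid-value choice is one reason quintuple redundancy is attractive: up to two simultaneously faulted channels cannot drag the median outside the band spanned by the healthy majority.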

  4. Final report for “Extreme-scale Algorithms and Solver Resilience”

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gropp, William Douglas

    2017-06-30

    This is a joint project with principal investigators at Oak Ridge National Laboratory, Sandia National Laboratories, the University of California at Berkeley, and the University of Tennessee. Our part of the project involves developing performance models for highly scalable algorithms and the development of latency tolerant iterative methods. During this project, we extended our performance models for the Multigrid method for solving large systems of linear equations and conducted experiments with highly scalable variants of conjugate gradient methods that avoid blocking synchronization. In addition, we worked with the other members of the project on alternative techniques for resilience and reproducibility. We also presented an alternative approach for reproducible dot-products in parallel computations that performs almost as well as the conventional approach by separating the order of computation from the details of the decomposition of vectors across the processes.
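
    The reproducible dot-product idea mentioned in the report, separating the order of summation from the decomposition of the vectors across processes, can be sketched as follows. This is a minimal illustration under assumed names and block size, not the project's actual implementation: partial sums are taken over fixed global index blocks and combined in a fixed left-to-right order, so the floating-point result does not depend on how the vectors happen to be distributed.

```python
# Sketch of a reproducible dot product. Floating-point addition is not
# associative, so different process decompositions normally give slightly
# different results. Fixing the summation order by global index block
# removes that dependence. Block size and function names are assumptions.

def blocked_partial_sums(x, y, block=4):
    """Partial dot products over fixed global index blocks."""
    n = len(x)
    return [sum(x[i] * y[i] for i in range(start, min(start + block, n)))
            for start in range(0, n, block)]

def reproducible_dot(x, y, block=4):
    """Combine the block sums in a fixed global order, independent of how
    the vectors are decomposed across processes."""
    total = 0.0
    for s in blocked_partial_sums(x, y, block):
        total += s
    return total
```

    Any decomposition that respects the block boundaries can compute its block sums locally and still reproduce bit-identical results, because the combination order is fixed by global block index rather than by the process layout.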

  5. Neutron capture and neutron-induced fission experiments on americium isotopes with DANCE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jandel, M.; Bredeweg, T. A.; Fowler, M. M.

    2009-01-28

    Neutron capture cross section data on Am isotopes were measured using the Detector for Advanced Neutron Capture Experiments (DANCE) at Los Alamos National Laboratory. The neutron capture cross section was determined for {sup 241}Am for neutron energies between thermal and 320 keV. Preliminary results were also obtained for {sup 243}Am for neutron energies between 10 eV and 250 keV. The results on concurrent neutron-induced fission and neutron-capture measurements on {sup 242m}Am will be presented where the fission events were actively triggered during the experiments. In these experiments, a Parallel-Plate Avalanche Counter (PPAC) detector that surrounds the target located in the center of the DANCE array was used as a fission-tagging detector to separate (n,{gamma}) events from (n,f) events. The first direct observation of neutron capture on {sup 242m}Am in the resonance region in between 2 and 9 eV of the neutron energy was obtained.

  6. Neutron capture and neutron-induced fission experiments on americium isotopes with DANCE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jandel, Marian

    2008-01-01

    Neutron capture cross section data on Am isotopes were measured using the Detector for Advanced Neutron Capture Experiments (DANCE) at Los Alamos National Laboratory. The neutron capture cross section was determined for {sup 241}Am for neutron energies between thermal and 320 keV. Preliminary results were also obtained for {sup 243}Am for neutron energies between 35 eV and 200 keV. The results on concurrent neutron-induced fission and neutron-capture measurements on {sup 242m}Am will be presented, where the fission events were actively triggered during the experiments. In these experiments, the Parallel-Plate Avalanche Counter (PPAC) detector that surrounds the target located in the center of the DANCE array was used as a fission-tagging detector to separate (n,{gamma}) from (n,f) events. The first evidence of neutron capture on {sup 242m}Am in the resonance region in between 2 and 9 eV of the neutron energy was obtained.

  7. Development of gallium arsenide high-speed, low-power serial parallel interface modules: Executive summary

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Final report to NASA LeRC on the development of gallium arsenide (GaAs) high-speed, low-power serial/parallel interface modules. The report discusses the development and test of a family of 16-, 32- and 64-bit parallel-to-serial and serial-to-parallel integrated circuits using a self-aligned-gate MESFET technology developed at the Honeywell Sensors and Signal Processing Laboratory. Lab testing demonstrated 1.3 GHz clock rates at a power of 300 mW. This work was accomplished under contract number NAS3-24676.

  8. Separation and parallel sequencing of the genomes and transcriptomes of single cells using G&T-seq.

    PubMed

    Macaulay, Iain C; Teng, Mabel J; Haerty, Wilfried; Kumar, Parveen; Ponting, Chris P; Voet, Thierry

    2016-11-01

    Parallel sequencing of a single cell's genome and transcriptome provides a powerful tool for dissecting genetic variation and its relationship with gene expression. Here we present a detailed protocol for G&T-seq, a method for separation and parallel sequencing of genomic DNA and full-length polyA(+) mRNA from single cells. We provide step-by-step instructions for the isolation and lysis of single cells; the physical separation of polyA(+) mRNA from genomic DNA using a modified oligo-dT bead capture and the respective whole-transcriptome and whole-genome amplifications; and library preparation and sequence analyses of these amplification products. The method allows the detection of thousands of transcripts in parallel with the genetic variants captured by the DNA-seq data from the same single cell. G&T-seq differs from other currently available methods for parallel DNA and RNA sequencing from single cells, as it involves physical separation of the DNA and RNA and does not require bespoke microfluidics platforms. The process can be implemented manually or through automation. When performed manually, paired genome and transcriptome sequencing libraries from eight single cells can be produced in ∼3 d by researchers experienced in molecular laboratory work. For users with experience in the programming and operation of liquid-handling robots, paired DNA and RNA libraries from 96 single cells can be produced in the same time frame. Sequence analysis and integration of single-cell G&T-seq DNA and RNA data requires a high level of bioinformatics expertise and familiarity with a wide range of informatics tools.

  9. The AMINO experiment: exposure of amino acids in the EXPOSE-R experiment on the International Space Station and in laboratory

    NASA Astrophysics Data System (ADS)

    Bertrand, Marylène; Chabin, Annie; Colas, Cyril; Cadène, Martine; Chaput, Didier; Brack, Andre; Cottin, Herve

    2015-01-01

    In order to confirm the results of previous experiments concerning the chemical behaviour of organic molecules in the space environment, organic molecules (amino acids and a dipeptide) in pure form and embedded in meteorite powder were exposed in the AMINO experiment in the EXPOSE-R facility onboard the International Space Station. After exposure to space conditions for 24 months (2843 h of irradiation), the samples were returned to the Earth and analysed in the laboratory for reactions caused by solar ultraviolet (UV) and other electromagnetic radiation. Laboratory UV exposure was carried out in parallel in the Cologne DLR Center (Deutsches Zentrum für Luft und Raumfahrt). The molecules were extracted from the sample holder and then (1) derivatized by silylation and analysed by gas chromatography coupled to a mass spectrometer (GC-MS) in order to quantify the rate of degradation of the compounds and (2) analysed by high-resolution mass spectrometry (HRMS) in order to understand the chemical reactions that occurred. The GC-MS results confirm that resistance to irradiation is a function of the chemical nature of the exposed molecules and of the wavelengths of the UV light. They also confirm the protective effect of a coating of meteorite powder. The most altered compounds were the dipeptides and aspartic acid while the most robust were compounds with a hydrocarbon chain. The MS analyses document the products of reactions, such as decarboxylation and decarbonylation of aspartic acid, taking place after UV exposure. Given the universality of chemistry in space, our results have a broader implication for the fate of organic molecules that seeded the planets as soon as they became habitable as well as for the effects of UV radiation on exposed molecules at the surface of Mars, for example.

  10. Two LANL laboratory astrophysics experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Intrator, Thomas P.

    2014-01-24

    Two laboratory experiments are described that have been built at Los Alamos (LANL) to gain access to a wide range of fundamental plasma physics issues germane to astro, space, and fusion plasmas. The overarching theme is magnetized plasma dynamics, which includes significant currents, MHD forces and instabilities, magnetic field creation and annihilation, sheared flows and shocks. The Relaxation Scaling Experiment (RSX) creates current sheets and flux ropes that exhibit fully 3D dynamics, and can kink, bounce, merge and reconnect, shred, and reform in complicated ways. Recent movies from a large data set describe the 3D magnetic structure of a driven and dissipative single flux rope that spontaneously self-saturates a kink instability. Examples of a coherent shear flow dynamo driven by colliding flux ropes will also be shown. The Magnetized Shock Experiment (MSX) uses field-reversed configuration (FRC) experimental hardware that forms and ejects FRCs at 150 km/sec. This is sufficient to drive a collisionless magnetized shock when stagnated into a mirror stopping-field region with Alfven Mach number MA=3, so that supercritical shocks can be studied. We are building a plasmoid accelerator to drive Mach numbers MA >> 3 to access solar wind and more exotic astrophysical regimes. Unique features of this experiment include access to parallel, oblique and perpendicular shocks, a shock region much larger than the ion gyroradii and ion inertial length, room for turbulence, and large magnetic and fluid Reynolds numbers.

  11. Genome-wide mapping of mutations at single-nucleotide resolution for protein, metabolic and genome engineering.

    PubMed

    Garst, Andrew D; Bassalo, Marcelo C; Pines, Gur; Lynch, Sean A; Halweg-Edwards, Andrea L; Liu, Rongming; Liang, Liya; Wang, Zhiwen; Zeitoun, Ramsey; Alexander, William G; Gill, Ryan T

    2017-01-01

    Improvements in DNA synthesis and sequencing have underpinned comprehensive assessment of gene function in bacteria and eukaryotes. Genome-wide analyses require high-throughput methods to generate mutations and analyze their phenotypes, but approaches to date have been unable to efficiently link the effects of mutations in coding regions or promoter elements in a highly parallel fashion. We report that CRISPR-Cas9 gene editing in combination with massively parallel oligomer synthesis can enable trackable editing on a genome-wide scale. Our method, CRISPR-enabled trackable genome engineering (CREATE), links each guide RNA to homologous repair cassettes that both edit loci and function as barcodes to track genotype-phenotype relationships. We apply CREATE to site saturation mutagenesis for protein engineering, reconstruction of adaptive laboratory evolution experiments, and identification of stress tolerance and antibiotic resistance genes in bacteria. We provide preliminary evidence that CREATE will work in yeast. We also provide a webtool to design multiplex CREATE libraries.

  12. Observations of a field-aligned ion/ion-beam instability in a magnetized laboratory plasma

    NASA Astrophysics Data System (ADS)

    Heuer, P. V.; Weidl, M. S.; Dorst, R. S.; Schaeffer, D. B.; Bondarenko, A. S.; Tripathi, S. K. P.; Van Compernolle, B.; Vincena, S.; Constantin, C. G.; Niemann, C.; Winske, D.

    2018-03-01

    Collisionless coupling between super Alfvénic ions and an ambient plasma parallel to a background magnetic field is mediated by a set of electromagnetic ion/ion-beam instabilities including the resonant right hand instability (RHI). To study this coupling and its role in parallel shock formation, a new experimental configuration at the University of California, Los Angeles utilizes high-energy and high-repetition-rate lasers to create a super-Alfvénic field-aligned debris plasma within an ambient plasma in the Large Plasma Device. We used a time-resolved fluorescence monochromator and an array of Langmuir probes to characterize the laser plasma velocity distribution and density. The debris ions were observed to be sufficiently super-Alfvénic and dense to excite the RHI. Measurements with magnetic flux probes exhibited a right-hand circularly polarized frequency chirp consistent with the excitation of the RHI near the laser target. We compared measurements to 2D hybrid simulations of the experiment.

  13. Observations of a field-aligned ion/ion-beam instability in a magnetized laboratory plasma

    DOE PAGES

    Heuer, P. V.; Weidl, M. S.; Dorst, R. S.; ...

    2018-03-01

    Collisionless coupling between super Alfvénic ions and an ambient plasma parallel to a background magnetic field is mediated by a set of electromagnetic ion/ion-beam instabilities including the resonant right hand instability (RHI). To study this coupling and its role in parallel shock formation, a new experimental configuration at the University of California, Los Angeles utilizes high-energy and high-repetition-rate lasers to create a super-Alfvénic field-aligned debris plasma within an ambient plasma in the Large Plasma Device. We used a time-resolved fluorescence monochromator and an array of Langmuir probes to characterize the laser plasma velocity distribution and density. The debris ions were observed to be sufficiently super-Alfvénic and dense to excite the RHI. Measurements with magnetic flux probes exhibited a right-hand circularly polarized frequency chirp consistent with the excitation of the RHI near the laser target. To conclude, we compared measurements to 2D hybrid simulations of the experiment.

  14. Enhanced performance of a submerged membrane bioreactor with powdered activated carbon addition for municipal secondary effluent treatment.

    PubMed

    Lin, Hongjun; Wang, Fangyuan; Ding, Linxian; Hong, Huachang; Chen, Jianrong; Lu, Xiaofeng

    2011-09-15

    The aim of this study was to investigate the feasibility of a PAC-MBR process for treating municipal secondary effluent. Two laboratory-scale submerged MBRs (SMBR), with and without powdered activated carbon (PAC) addition, were continuously operated in parallel for secondary effluent treatment. Approximately 63% of TOC, 95% of NH(4)(+)-N and 98% of turbidity in the secondary effluent were removed by the PAC-MBR process. Most organics in the secondary effluent were found to be low-molecular-weight (MW) substances, which could be retained in the reactor and then removed to some extent by the PAC-MBR process. Parallel experiments showed that the addition of PAC significantly increased organic removal and was responsible for the largest fraction of organic removal. Membrane fouling analysis showed enhanced membrane performance, in terms of sustainable operational time and filtration resistance, with PAC addition. Based on these results, the PAC-MBR process is considered an attractive option for the reduction of pollutants in secondary effluent. Copyright © 2011 Elsevier B.V. All rights reserved.

  15. Observations of a field-aligned ion/ion-beam instability in a magnetized laboratory plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heuer, P. V.; Weidl, M. S.; Dorst, R. S.

    Collisionless coupling between super Alfvénic ions and an ambient plasma parallel to a background magnetic field is mediated by a set of electromagnetic ion/ion-beam instabilities including the resonant right hand instability (RHI). To study this coupling and its role in parallel shock formation, a new experimental configuration at the University of California, Los Angeles utilizes high-energy and high-repetition-rate lasers to create a super-Alfvénic field-aligned debris plasma within an ambient plasma in the Large Plasma Device. We used a time-resolved fluorescence monochromator and an array of Langmuir probes to characterize the laser plasma velocity distribution and density. The debris ions were observed to be sufficiently super-Alfvénic and dense to excite the RHI. Measurements with magnetic flux probes exhibited a right-hand circularly polarized frequency chirp consistent with the excitation of the RHI near the laser target. To conclude, we compared measurements to 2D hybrid simulations of the experiment.

  16. Organics removal of combined wastewater through shallow soil infiltration treatment: a field and laboratory study.

    PubMed

    Zhang, Zhiyin; Lei, Zhongfang; Zhang, Zhenya; Sugiura, Norio; Xu, Xiaotian; Yin, Didi

    2007-11-19

    Soil infiltration treatment (SIT) has proved to be an effective and low-cost treatment technique for decentralized effluents in areas without complete sewage systems. Field-scale experiments were conducted under several conditions to assess organics removal through a shallow soil infiltration treatment (SSIT, with an effective depth of 0.3 m) of combined wastewater (discharge from toilets, restaurants and a gas station), while bench-scale soil column experiments were performed in the laboratory in parallel to investigate the biological and abiological effects of this kind of system. From start-up to the 10th month, the field SSIT trenches experienced the lowest and highest temperatures of the operation period in Shanghai and exhibited effective organics removal after maturation, with a highest removal rate of 75.8% for chemical oxygen demand (COD), a highest decrease of 67.2% in ultraviolet absorption at 254 nm (UV(254)) and 35.2-100% removal of phenolic and phthalate pollutants. The laboratory results indicated that more organics could be removed in room-temperature (25±2 °C) SSIT systems under different influent COD concentrations from 45 mg/l to 406 mg/l, and the highest total COD removal rate could reach 94.0%, of which the biological effect accounted for 57.7-71.9%. The results showed that temperature and hydraulic loading rate were the most important factors influencing the removal of COD and organic pollutants in SSIT.

  17. A historical overview of nuclear structure studies in Strasbourg Laboratories: instrumentation, measurements and theory modelling—hand-in-hand

    NASA Astrophysics Data System (ADS)

    Beck, F. A.

    2018-05-01

    This article overviews a long period of important evolution in nuclear structure research at the Strasbourg Laboratories, focussed on tracking ever-weaker experimental signals that carry ever more important physics messages. In particular, we address the search for signatures of collective behaviour of the nucleus, as suggested in the early works of Bohr, Mottelson and Rainwater, at high and very high angular momenta, as well as the competition between collective and non-collective excitation modes. These ambitious goals could be achieved only by constructing powerful spectrometers and developing the related detectors, electronics and data acquisition systems. Theoretical modelling, developed in parallel, provided essential guidance in choosing the right experiments and optimising their realisation; theory calculations were equally helpful in interpreting the experimental results, leading to a more complete understanding of the underlying physics. Moreover, thanks to the development of heavy-ion accelerators, the Strasbourg centre became a crossroads for experimenters from both Western and Central Europe, and the site of the gradual development of increasingly sophisticated European gamma-spectrometers in collaboration with a growing number of laboratories and countries, enabling frontier-level studies of nuclear behaviour at very high angular momenta.

  18. Biodynamic feedback training to assure learning partial load bearing on forearm crutches.

    PubMed

    Krause, Daniel; Wünnemann, Martin; Erlmann, Andre; Hölzchen, Timo; Mull, Melanie; Olivier, Norbert; Jöllenbeck, Thomas

    2007-07-01

    To examine how biodynamic feedback training affects the learning of prescribed partial load bearing (200N). Three pre-post experiments. Biomechanics laboratory in a German university. A volunteer sample of 98 uninjured subjects who had not used crutches recently. There were 24 subjects in experiment 1 (mean age, 23.2y); 64 in experiment 2 (mean age, 43.6y); and 10 in experiment 3 (mean age, 40.3y), parallelized by arm force. Video instruction and feedback training: In experiment 1, 2 varied instruction videos and reduced feedback frequency; in experiment 2, varied frequencies of changing tasks (contextual interference); and in experiment 3, feedback training (walking) and transfer (stair tasks). Vertical ground reaction force. Absolute error of practiced tasks was significantly reduced for all samples (P<.050). Varied contextual interference conditions did not significantly affect retention (P=.798) or transfer (P=.897). Positive transfer between tasks was significant in experiment 2 (P<.001) and was contrary to findings in experiment 3 (P=.071). Biodynamic feedback training is applicable for learning prescribed partial load bearing. The frequency of changing tasks is irrelevant. Despite some support for transfer effects, additional practice in climbing and descending stairs might be beneficial.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kargupta, H.; Stafford, B.; Hamzaoglu, I.

    This paper describes an experimental parallel/distributed data mining system, PADMA (PArallel Data Mining Agents), that uses software agents for local data access and analysis and a web-based interface for interactive data visualization. It also presents the results of applying PADMA to detect patterns in unstructured texts of postmortem reports and laboratory test data for Hepatitis C patients.

  20. Effect of Unsaturated Flow Modes on Partitioning Dynamics of Gravity-Driven Flow at a Simple Fracture Intersection: Laboratory Study and Three-Dimensional Smoothed Particle Hydrodynamics Simulations

    NASA Astrophysics Data System (ADS)

    Kordilla, Jannes; Noffz, Torsten; Dentz, Marco; Geyer, Tobias; Tartakovsky, Alexandre M.

    2017-11-01

    In this work, we study gravity-driven flow of water in the presence of air on a synthetic surface intersected by a horizontal fracture and investigate the importance of droplet and rivulet flow modes on the partitioning behavior at the fracture intersection. We present laboratory experiments, three-dimensional smoothed particle hydrodynamics (SPH) simulations using a heavily parallelized code, and a theoretical analysis. The flow-rate-dependent mode switching from droplets to rivulets is observed in experiments and reproduced by the SPH model, and the transition ranges agree in SPH simulations and laboratory experiments. We show that flow modes heavily influence the "bypass" behavior of water flowing along a fracture junction. Flows favoring the formation of droplets exhibit a much stronger bypass capacity compared to rivulet flows, where nearly the whole fluid mass is initially stored within the horizontal fracture. The effect of fluid buffering within the horizontal fracture is presented in terms of dimensionless fracture inflow so that characteristic scaling regimes can be recovered. For both cases (rivulets and droplets), the flow within the horizontal fracture transitions into a Washburn regime until a critical threshold is reached and the bypass efficiency increases. For rivulet flows, the initial filling of the horizontal fracture is described by classical plug flow. Meanwhile, for droplet flows, a size-dependent partitioning behavior is observed, and the filling of the fracture takes longer. For the case of rivulet flow, we provide an analytical solution that demonstrates the existence of classical Washburn flow within the horizontal fracture.
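
    The Washburn regime referred to above is classical capillary imbibition, in which the penetration length grows as the square root of time. A minimal numerical sketch, with illustrative (assumed) fluid parameters not taken from the study:

```python
import math

def washburn_length(t, gamma=0.072, r=5e-4, theta=0.0, mu=1.0e-3):
    """Classical Washburn imbibition length L(t) = sqrt(gamma*r*cos(theta)*t / (2*mu)).

    gamma: surface tension [N/m], r: effective aperture radius [m],
    theta: contact angle [rad], mu: dynamic viscosity [Pa s].
    Default values are roughly water/air and purely illustrative.
    """
    return math.sqrt(gamma * r * math.cos(theta) * t / (2.0 * mu))

# Hallmark of the Washburn regime: quadrupling the time doubles the length.
print(washburn_length(1.0) / washburn_length(0.25))  # 2.0
```

    The sqrt(t) scaling is what distinguishes the late-time fracture-filling behavior from the initial plug-flow filling described above.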

  1. System testing of a production Ada (trademark) project: The GRODY study

    NASA Technical Reports Server (NTRS)

    Seigle, Jeffrey; Esker, Linda; Shi, Ying-Liang

    1990-01-01

    The use of the Ada language and design methodologies that utilize its features has a strong impact on all phases of the software development project lifecycle. At the National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC), the Software Engineering Laboratory (SEL) conducted an experiment in parallel development of two flight dynamics systems in FORTRAN and Ada. The teams found some qualitative differences between the system test phases of the two projects. Although planning for system testing and conducting of tests were not generally affected by the use of Ada, the solving of problems found in system testing was generally facilitated by Ada constructs and design methodology. Most problems found in system testing were not due to difficulty with the language or methodology but to lack of experience with the application.

  2. Predicting mesoscale microstructural evolution in electron beam welding

    DOE PAGES

    Rodgers, Theron M.; Madison, Jonathan D.; Tikare, Veena; ...

    2016-03-16

    Using the kinetic Monte Carlo simulator Stochastic Parallel PARticle Kinetic Simulator (SPPARKS) from Sandia National Laboratories, a user routine has been developed to simulate mesoscale predictions of grain structure near a moving heat source. Here, we demonstrate the use of this user routine to produce voxelized, synthetic, three-dimensional microstructures for electron-beam welding by comparing them with experimentally produced microstructures. When simulation input parameters are matched to experimental process parameters, qualitative and quantitative agreement for both grain size and grain morphology is achieved. The method is capable of simulating both single- and multipass welds. As a result, the simulations provide an opportunity for not only accelerated design but also the integration of simulation and experiments in design, such that simulations can receive parameter bounds from experiments and, in turn, provide predictions of the resultant microstructure.
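
    The kinetic Monte Carlo simulator named above is, at its core, a lattice spin model; a generic Metropolis update for a q-state Potts grain-growth model can be sketched as follows. This is a sketch of the general technique on a small periodic 2-D lattice, not the Sandia weld routine, and all parameter values are illustrative.

```python
import math
import random

def unlike_neighbors(grid, i, j):
    """Local energy of site (i, j): number of 4-neighbors with a different spin."""
    n = len(grid)
    s = grid[i][j]
    nbrs = [grid[(i - 1) % n][j], grid[(i + 1) % n][j],
            grid[i][(j - 1) % n], grid[i][(j + 1) % n]]
    return sum(1 for t in nbrs if t != s)

def total_energy(grid):
    """Total unlike-neighbor energy (each bond counted from both ends)."""
    n = len(grid)
    return sum(unlike_neighbors(grid, i, j) for i in range(n) for j in range(n))

def potts_sweep(grid, q, kT, rng):
    """One Metropolis sweep of the q-state Potts model with periodic boundaries."""
    n = len(grid)
    for _ in range(n * n):
        i, j = rng.randrange(n), rng.randrange(n)
        old = grid[i][j]
        e_old = unlike_neighbors(grid, i, j)
        grid[i][j] = rng.randrange(q)          # propose a new spin
        dE = unlike_neighbors(grid, i, j) - e_old
        if dE > 0 and (kT == 0.0 or rng.random() >= math.exp(-dE / kT)):
            grid[i][j] = old                   # reject the move

rng = random.Random(1)
q, n = 8, 16
grid = [[rng.randrange(q) for _ in range(n)] for _ in range(n)]
e0 = total_energy(grid)
for _ in range(20):
    potts_sweep(grid, q, kT=0.0, rng=rng)      # quenched dynamics: grains coarsen
e1 = total_energy(grid)
```

    At kT = 0 every accepted move has dE <= 0, so the boundary energy decreases monotonically, which is what drives grain coarsening in this class of models.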

  3. Knowledge representation into Ada parallel processing

    NASA Technical Reports Server (NTRS)

    Masotto, Tom; Babikyan, Carol; Harper, Richard

    1990-01-01

    The Knowledge Representation into Ada Parallel Processing project is a joint NASA and Air Force funded project to demonstrate the execution of intelligent systems in Ada on the Charles Stark Draper Laboratory fault-tolerant parallel processor (FTPP). Two applications were demonstrated - a portion of the adaptive tactical navigator and a real time controller. Both systems are implemented as Activation Framework Objects on the Activation Framework intelligent scheduling mechanism developed by Worcester Polytechnic Institute. The implementations, results of performance analyses showing speedup due to parallelism and initial efficiency improvements are detailed and further areas for performance improvements are suggested.

  4. Laboratory glassware rack for seismic safety

    NASA Technical Reports Server (NTRS)

    Cohen, M. M. (Inventor)

    1985-01-01

    A rack for laboratory bottles and jars for chemicals and medicines has been designed to provide the maximum strength and security to the glassware in the event of a significant earthquake. The rack preferably is rectangular and may be made of a variety of chemically resistant materials including polypropylene, polycarbonate, and stainless steel. It comprises a first plurality of parallel vertical walls, and a second plurality of parallel vertical walls, perpendicular to the first. These intersecting vertical walls comprise a self-supporting structure without a bottom which sits on four legs. The top surface of the rack is formed by the top edges of all the vertical walls, which are not parallel but are skewed in three dimensions. These top edges form a grid matrix having a number of intersections of the vertical walls which define a number of rectangular compartments having varying widths and lengths and varying heights.

  5. The astrobiological mission EXPOSE-R on board of the International Space Station

    NASA Astrophysics Data System (ADS)

    Rabbow, Elke; Rettberg, Petra; Barczyk, Simon; Bohmeier, Maria; Parpart, Andre; Panitz, Corinna; Horneck, Gerda; Burfeindt, Jürgen; Molter, Ferdinand; Jaramillo, Esther; Pereira, Carlos; Weiß, Peter; Willnecker, Rainer; Demets, René; Dettmann, Jan

    2015-01-01

    EXPOSE-R flew as the second of the European Space Agency (ESA) EXPOSE multi-user facilities on the International Space Station. During the mission on the external URM-D platform of the Zvezda service module, samples of eight international astrobiology experiments selected by ESA and one Russian guest experiment were exposed to low Earth orbit space parameters from March 10th, 2009 to January 21st, 2011. EXPOSE-R accommodated a total of 1220 samples for exposure to selected space conditions and combinations, including space vacuum, temperature cycles through 273 K, cosmic radiation, and solar electromagnetic radiation at >110, >170 or >200 nm at various fluences up to GJ m-2. Samples ranged from chemical compounds via unicellular organisms, multicellular mosquito larvae and seeds to passive radiation dosimeters. Additionally, one active radiation measurement instrument was accommodated on EXPOSE-R and commanded from the ground in coordination with the facility itself. Data on ultraviolet radiation, cosmic radiation and temperature were measured every 10 s and downlinked by telemetry and data carrier every few months. The EXPOSE-R trays and samples returned to Earth on March 9th, 2011 with Shuttle flight Space Transportation System (STS)-133/ULF 5, Discovery, after a successful total mission duration of 27 months in space. The samples were analysed in the individual investigators' laboratories. A parallel Mission Ground Reference experiment was performed on the ground with a parallel set of hardware and samples under simulated space conditions, following the data transmitted from the flight mission.

  6. LDRD final report on massively-parallel linear programming : the parPCx system.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parekh, Ojas; Phillips, Cynthia Ann; Boman, Erik Gunnar

    2005-02-01

    This report summarizes the research and development performed from October 2002 to September 2004 at Sandia National Laboratories under the Laboratory-Directed Research and Development (LDRD) project ''Massively-Parallel Linear Programming''. We developed a linear programming (LP) solver designed to use a large number of processors. LP is the optimization of a linear objective function subject to linear constraints. Companies and universities have expended huge efforts over decades to produce fast, stable serial LP solvers. Previous parallel codes run on shared-memory systems and have little or no distribution of the constraint matrix. We have seen no reports of general LP solver runs on large numbers of processors. Our parallel LP code is based on an efficient serial implementation of Mehrotra's interior-point predictor-corrector algorithm (PCx). The computational core of this algorithm is the assembly and solution of a sparse linear system. We have substantially rewritten the PCx code and based it on Trilinos, the parallel linear algebra library developed at Sandia. Our interior-point method can use either direct or iterative solvers for the linear system. To achieve a good parallel data distribution of the constraint matrix, we use a (pre-release) version of a hypergraph partitioner from the Zoltan partitioning library. We describe the design and implementation of our new LP solver, called parPCx, and give preliminary computational results. We summarize a number of issues related to efficient parallel solution of LPs with interior-point methods, including data distribution, numerical stability, and solving the core linear system using both direct and iterative methods.
We describe a number of applications of LP specific to US Department of Energy mission areas, and we summarize our efforts to integrate parPCx (and parallel LP solvers in general) into Sandia's massively-parallel integer programming solver PICO (Parallel Integer and Combinatorial Optimizer). We conclude with directions for long-term future algorithmic research and for near-term development that could improve the performance of parPCx.
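
    The core linear system mentioned above is, in primal-dual interior-point methods, typically the normal-equations system (A D A^T) dy = r with D = diag(x/s). A tiny dense pure-Python illustration of assembling and solving it follows; all numerical values are illustrative, and real solvers such as the one described exploit sparsity and run in parallel.

```python
def mat_mul(A, B):
    """Dense matrix product of small nested-list matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(col) for col in zip(*A)]

def gauss_solve(M, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(M)
    M = [row[:] + [b[i]] for i, row in enumerate(M)]   # augmented matrix
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

# Tiny LP data: A is a 2x3 constraint matrix; x, s are current iterates (illustrative).
A = [[1.0, 1.0, 0.0],
     [0.0, 1.0, 1.0]]
x = [1.0, 2.0, 0.5]                       # primal iterate
s = [0.5, 1.0, 2.0]                       # dual slacks
D = [x[j] / s[j] for j in range(3)]       # D = diag(x/s)

# Assemble the normal-equations matrix M = A * diag(D) * A^T and solve M dy = r.
AD = [[A[i][j] * D[j] for j in range(3)] for i in range(2)]
M = mat_mul(AD, transpose(A))
r = [1.0, -0.5]                           # right-hand-side residual (illustrative)
dy = gauss_solve(M, r)
```

    M is symmetric positive definite whenever A has full row rank and x, s > 0, which is why both direct (Cholesky-type) and iterative (conjugate-gradient-type) solvers are viable, as the report discusses.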

  7. Assessing the relationship between the abundance and properties of microplastics in water and in mussels.

    PubMed

    Qu, Xiaoyun; Su, Lei; Li, Hengxiang; Liang, Mingzhong; Shi, Huahong

    2018-04-15

    Microplastic pollution is increasingly becoming a great environmental concern worldwide. Microplastics have been found in many marine organisms as a result of increasing plastic pollution within marine environments. However, the relationship between microplastics in organisms and those in their living environment is still relatively poorly understood. In the present study, we investigated microplastic pollution in the water and in mussels (Mytilus edulis, Perna viridis) at 25 sites along the coastal waters of China. We also, for the first time, conducted a parallel exposure experiment on the same site using M. edulis in the laboratory. A strong positive linear relationship was found between microplastic levels in the water and in the mussels. Fibers were the dominant microplastics. The sizes of microplastics in the mussels were smaller than those in the water. During the exposure experiments, the abundance of ingested microbeads was significantly higher than that of fibers, even though the nominal abundance of fibers was eight times that of microbeads. In general, our results support positive and quantitative correlations between microplastics in mussels and in their surrounding waters, and indicate that mussels were more likely to ingest smaller microplastics. Laboratory exposure experiments are a good way to understand the relative impacts of microplastics ingested by marine organisms. However, significant differences between the results of exposure experiments and field investigations indicate that further efforts are needed to simulate the diverse environmentally relevant properties of microplastics.

  8. A possibility of parallel and anti-parallel diffraction measurements on a neutron diffractometer employing a bent perfect crystal monochromator at the monochromatic focusing condition

    NASA Astrophysics Data System (ADS)

    Choi, Yong Nam; Kim, Shin Ae; Kim, Sung Kyu; Kim, Sung Baek; Lee, Chang-Hee; Mikula, Pavel

    2004-07-01

    In a conventional diffractometer with a single monochromator, only one position, the parallel position, is used for the diffraction experiment (i.e. detection), because the resolution of the other, anti-parallel position is very poor. However, a bent perfect crystal (BPC) monochromator at the monochromatic focusing condition can provide a quite flat and equal resolution property at both the parallel and anti-parallel positions, and thus one has the chance to use both sides for the diffraction experiment. From the FWHM and Delta d/d data measured in three diffraction geometries (symmetric, asymmetric compression and asymmetric expansion), we conclude that simultaneous diffraction measurement in both the parallel and anti-parallel positions can be achieved.
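
    The quoted Delta d/d follows from differentiating Bragg's law; as a standard textbook reminder (not specific to this instrument), the angular and wavelength contributions combine as:

```latex
\lambda = 2d\sin\theta
\quad\Longrightarrow\quad
\frac{\Delta d}{d}
  = \sqrt{\left(\cot\theta\,\Delta\theta\right)^{2}
        + \left(\frac{\Delta\lambda}{\lambda}\right)^{2}}
```

    Near the focusing condition the effective angular term is minimized, which is consistent with the flat, equal resolution at the parallel and anti-parallel positions reported above.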

  9. Two LANL laboratory astrophysics experiments

    NASA Astrophysics Data System (ADS)

    Intrator, Thomas; Weber, Thomas; Feng, Yan; Hutchinson, Trevor; Dunn, John; Akcay, Cihan

    2014-06-01

    Two laboratory experiments are described that have been built at Los Alamos (LANL) to gain access to a wide range of fundamental plasma physics issues germane to astrophysical, space, and fusion plasmas. The overarching theme is magnetized plasma dynamics, which includes significant currents, MHD forces and instabilities, magnetic field creation and annihilation, sheared flows, and shocks. The Relaxation Scaling Experiment (RSX) creates current sheets and flux ropes that exhibit fully 3D dynamics and can kink, bounce, merge and reconnect, shred, and reform in complicated ways. Recent movies from a large data set describe the 3D magnetic structure of a driven and dissipative single flux rope that spontaneously self-saturates a kink instability. Examples of a coherent shear-flow dynamo driven by colliding flux ropes will also be shown. The Magnetized Shock Experiment (MSX) uses field-reversed configuration (FRC) experimental hardware that forms and ejects FRCs at 150 km/s. This is sufficient to drive a collisionless magnetized shock when stagnated in a mirror stopping-field region with Alfven Mach number MA = 3, so that supercritical shocks can be studied. We are building a plasmoid accelerator to drive Mach numbers MA >> 3 to access solar wind and more exotic astrophysical regimes. Unique features of this experiment include access to parallel, oblique and perpendicular shocks, a shock region much larger than the ion gyroradius and ion inertial length, room for turbulence, and large magnetic and fluid Reynolds numbers. *Supported by the DOE Office of Fusion Energy Sciences under LANS contract DE-AC52-06NA25396, NASA Geospace NNHIOA044I, Basic, and the Center for Magnetic Self-Organization.

  10. Fluctuation dynamics in reconnecting current sheets

    NASA Astrophysics Data System (ADS)

    von Stechow, Adrian; Grulke, Olaf; Ji, Hantao; Yamada, Masaaki; Klinger, Thomas

    2015-11-01

    During magnetic reconnection, a highly localized current sheet forms at the boundary between opposed magnetic fields. Its steep perpendicular gradients and fast parallel drifts can give rise to a range of instabilities which can contribute to the overall reconnection dynamics. In two complementary laboratory reconnection experiments, MRX (PPPL, Princeton) and VINETA.II (IPP, Greifswald, Germany), magnetic fluctuations are observed within the current sheet. Despite the large differences in geometries (toroidal vs. linear), plasma parameters (high vs. low beta) and magnetic configuration (low vs. high magnetic guide field), similar broadband fluctuation characteristics are observed in both experiments. These are identified as Whistler-like fluctuations in the lower hybrid frequency range that propagate along the current sheet in the electron drift direction. They are intrinsic to the localized current sheet and largely independent of the slower reconnection dynamics. This contribution characterizes these magnetic fluctuations within the wide parameter range accessible by both experiments. Specifically, the fluctuation spectra and wave dispersion are characterized with respect to the magnetic topology and plasma parameters of the reconnecting current sheet.

  11. Tracking Connections: An Exercise about Series and Parallel Resistances

    ERIC Educational Resources Information Center

    Jankovic, Srdjan

    2010-01-01

    Unlike many other topics in basic physics, series and parallel resistances are rarely noticed in the real life of an ordinary individual, making it difficult to design a laboratory activity that can simulate something familiar. The activities described here entail minimal costs and are based on a puzzle-like game of tracking wire connections. A…
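
    The series and parallel combination rules the exercise builds on are easy to verify numerically; a minimal sketch:

```python
def series(*rs):
    """Equivalent resistance of resistors in series: R = R1 + R2 + ..."""
    return sum(rs)

def parallel(*rs):
    """Equivalent resistance of resistors in parallel: 1/R = 1/R1 + 1/R2 + ..."""
    return 1.0 / sum(1.0 / r for r in rs)

print(series(100, 220))    # 320
print(parallel(100, 100))  # 50.0
```

    Tracking wire connections in the puzzle amounts to deciding, for each pair of resistors, which of these two rules applies.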

  12. Virtual earthquake engineering laboratory with physics-based degrading materials on parallel computers

    NASA Astrophysics Data System (ADS)

    Cho, In Ho

    For the last few decades, we have obtained tremendous insight into the underlying microscopic mechanisms of degrading quasi-brittle materials from persistent and near-saintly efforts in laboratories, and at the same time we have seen unprecedented evolution in computational technology such as massively parallel computers. Thus, the time is ripe to embark on a novel approach to settle unanswered questions, especially for the earthquake engineering community, by harmoniously combining microphysics mechanisms with advanced parallel computing technology. To begin with, it should be stressed that we placed a great deal of emphasis on preserving the clear meaning and physical counterparts of all the microscopic material models proposed herein, reflecting the belief that the more physical mechanisms we incorporate, the better predictions we can obtain. We departed from reviewing representative microscopic analysis methodologies, selecting the "fixed-type" multidirectional smeared crack model as the base framework for nonlinear quasi-brittle materials, since it is widely believed to best retain the physical nature of actual cracks. Microscopic stress functions are proposed by integrating well-received existing models to update normal stresses on the crack surfaces (three orthogonal surfaces are allowed to initiate herein) under cyclic loading. Unlike the normal stress update, special attention had to be paid to the shear stress update on the crack surfaces, due primarily to the well-known pathological nature of the fixed-type smeared crack model---spurious large stress transfer over the open crack under nonproportional loading. In hopes of exploiting physical mechanisms to resolve this deleterious nature of the fixed crack model, a tribology-inspired three-dimensional (3d) interlocking mechanism has been proposed.
Following the main trend of tribology (i.e., the science and engineering of interacting surfaces), we introduced the base fabric of solid particles in a soft matrix to explain realistic interlocking over rough crack surfaces, and an adopted Gaussian distribution feeds random particle sizes to the entire domain. Validation against a well-documented rough crack experiment reveals promising accuracy of the proposed 3d interlocking model. A consumed-energy-based damage model has been proposed for the weak correlation between the normal and shear stresses on the crack surfaces, and also for describing the nature of irrecoverable damage. Since the evaluation of the consumed energy is directly linked to the microscopic deformation, which can be efficiently tracked on the crack surfaces, the proposed damage model is believed to provide a more physical interpretation than existing damage mechanics, which fundamentally stems from mathematical derivation with few physical counterparts. Another novel point of the present work lies in the topological-transition-based "smart" steel bar model, notably with an evolving compressive buckling length. We presented a systematic framework of information flow between the key ingredients of composite materials (i.e., a steel bar and its surrounding concrete elements). The smart steel model suggested can incorporate smooth transition during reversal loading, tensile rupture, early buckling after reversal from excessive tensile loading, and even compressive buckling. In particular, the buckling length is made to evolve according to the damage states of the surrounding elements of each bar, while all other dominant models leave the length unchanged. What lies behind all the aforementioned novel attempts is, of course, the problem-optimized parallel platform. In fact, parallel computing in our field has been restricted to monotonic shock or blast loading with explicit algorithms, which are characteristically easy to parallelize.
In the present study, efficient parallelization strategies are proposed for a highly demanding implicit nonlinear finite element analysis (FEA) program for real-scale reinforced concrete (RC) structures under cyclic loading. A quantitative comparison of state-of-the-art parallel strategies, in terms of factorization, has been carried out, leading to a problem-optimized solver that successfully embraces the penalty method and the banded nature of the system. In particular, the penalty method employed imparts considerable smoothness to the global response, which yields a practical superiority of the parallel triangular system solver over other advanced solvers such as the parallel preconditioned conjugate gradient method. Other salient issues of parallelization are also addressed. The parallel platform established offers unprecedented access to simulations of real-scale structures, giving new understanding of the physics-based mechanisms adopted and of probabilistic randomness at the entire-system level. In particular, the platform enables bold simulations of real-scale RC structures exposed to cyclic loading---an H-shaped wall system and a 4-story T-shaped wall system. The simulations show the desired capability of accurately predicting global force-displacement responses, postpeak softening behavior, and compressive buckling of longitudinal steel bars. It is fascinating to see that the intrinsic randomness of the 3d interlocking model appears to cause "localized" damage of the real-scale structures, which is consistent with reported observations in different fields such as granular media. Equipped with accuracy, stability and scalability as demonstrated so far, the parallel platform is believed to serve as fertile ground for introducing further physical mechanisms into various research fields as well as the earthquake engineering community. In the near future, it can be further expanded to run in concert with established FEA programs such as FRAME3d or OPENSEES.
Following the central notion of the "multiscale" analysis technique, actual infrastructures exposed to extreme natural hazards can be successfully tackled by this next-generation analysis tool---the harmonious union of the parallel platform and a general FEA program. At the same time, any type of experiment can easily be conducted in this "virtual laboratory."
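
    The penalty method mentioned above enforces a prescribed displacement by adding a very stiff artificial spring to the system, leaving the system matrix symmetric and banded. A toy two-spring illustration in pure Python (the stiffnesses, load, and penalty value are illustrative, not from the dissertation):

```python
def solve2(K, f):
    """Cramer's rule for a 2x2 linear system K u = f."""
    det = K[0][0] * K[1][1] - K[0][1] * K[1][0]
    return [(f[0] * K[1][1] - K[0][1] * f[1]) / det,
            (K[0][0] * f[1] - f[0] * K[1][0]) / det]

k1, k2 = 1000.0, 1000.0   # spring stiffnesses (illustrative)
K = [[k1 + k2, -k2],
     [-k2,      k2]]      # stiffness matrix of a fixed-free two-spring chain
f = [0.0, 0.0]            # no external loads

# Penalty enforcement of the constraint u2 = 0.01:
p, u_bar = 1e9, 0.01      # penalty stiffness >> k, prescribed displacement
K[1][1] += p
f[1] += p * u_bar

u = solve2(K, f)          # u[1] ~ u_bar, u[0] ~ u_bar / 2 for equal springs
```

    The constraint is satisfied only to within k/p, which is the classical trade-off of the penalty method: a larger p enforces the constraint more tightly but worsens the conditioning of K.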

  13. EDITORIAL: Interrelationship between plasma phenomena in the laboratory and in space

    NASA Astrophysics Data System (ADS)

    Koepke, Mark

    2008-07-01

    The premise of investigating basic plasma phenomena relevant to space is that an alliance exists between both basic plasma physicists, using theory, computer modelling and laboratory experiments, and space science experimenters, using different instruments, either flown on different spacecraft in various orbits or stationed on the ground. The intent of this special issue on interrelated phenomena in laboratory and space plasmas is to promote the interpretation of scientific results in a broader context by sharing data, methods, knowledge, perspectives, and reasoning within this alliance. The desired outcomes are practical theories, predictive models, and credible interpretations based on the findings and expertise available. Laboratory-experiment papers that explicitly address a specific space mission or a specific manifestation of a space-plasma phenomenon, space-observation papers that explicitly address a specific laboratory experiment or a specific laboratory result, and theory or modelling papers that explicitly address a connection between both laboratory and space investigations were encouraged. Attention was given to the utility of the references for readers who seek further background, examples, and details. With the advent of instrumented spacecraft, the observation of waves (fluctuations), wind (flows), and weather (dynamics) in space plasmas was approached within the framework provided by theory with intuition provided by the laboratory experiments. Ideas on parallel electric field, magnetic topology, inhomogeneity, and anisotropy have been refined substantially by laboratory experiments. Satellite and rocket observations, theory and simulations, and laboratory experiments have contributed to the revelation of a complex set of processes affecting the accelerations of electrons and ions in the geospace plasma. The processes range from meso-scale of several thousands of kilometers to micro-scale of a few meters to kilometers. 
Papers included in this special issue serve to synthesise our current understanding of processes related to the coupling and feedback at disparate scales. Categories of topics included here are (1) ionospheric physics and (2) Alfvén-wave physics, both of which are related to the particle acceleration responsible for auroral displays, (3) whistler-mode triggering mechanism, which is relevant to radiation-belt dynamics, (4) plasmoid encountering a barrier, which has applications throughout the realm of space and astrophysical plasmas, and (5) laboratory investigations of the entire magnetosphere or the plasma surrounding the magnetosphere. The papers are ordered from processes that take place nearest the Earth to processes that take place at increasing distances from Earth. Many advances in understanding space plasma phenomena have been linked to insight derived from theoretical modeling and/or laboratory experiments. Observations from space-borne instruments are typically interpreted using theoretical models developed to predict the properties and dynamics of space and astrophysical plasmas. The usefulness of customized laboratory experiments for providing confirmation of theory by identifying, isolating, and studying physical phenomena efficiently, quickly, and economically has been demonstrated in the past. The benefits of laboratory experiments to investigating space-plasma physics are their reproducibility, controllability, diagnosability, reconfigurability, and affordability compared to a satellite mission or rocket campaign. Certainly, the plasma being investigated in a laboratory device is quite different from that being measured by a spaceborne instrument; nevertheless, laboratory experiments discover unexpected phenomena, benchmark theoretical models, develop physical insight, establish observational signatures, and pioneer diagnostic techniques. 
Explicit reference to such beneficial laboratory contributions is occasionally left out of the citations in the space-physics literature in favor of theory-paper counterparts and, thus, the scientific support that laboratory results can provide to the development of space-relevant theoretical models is often under-recognized. It is unrealistic to expect the dimensional parameters corresponding to space plasma to be matchable in the laboratory. However, a laboratory experiment is considered well designed if the subset of parameters relevant to a specific process shares the same phenomenological regime as the subset of analogous space parameters, even if less important parameters are mismatched. Regime boundaries are assigned by normalizing a dimensional parameter to an appropriate reference or scale value to make it dimensionless and noting the values at which transitions occur in the physical behavior or approximations. An example of matching regimes for cold-plasma waves is finding a 45° diagonal line on the log-log CMA diagram along which lie both a laboratory-observed wave and a space-observed wave. In such a circumstance, a space plasma and a lab plasma will support the same kind of modes if the dimensionless parameters are scaled properly (Bellan 2006 Fundamentals of Plasma Physics (Cambridge: Cambridge University Press) p 227). The plasma source, configuration geometry, and boundary conditions associated with a specific laboratory experiment are characteristic elements that affect the plasma and plasma processes that are being investigated. Space plasma is not exempt from an analogous set of constraining factors that likewise influence the phenomena that occur. Typically, each morphologically distinct region of space has associated with it plasma that is unique by virtue of the various mechanisms responsible for the plasma's presence there, as if the plasma were produced by a unique source.
Boundary effects that typically constrain the possible parameter values to lie within one or more restricted ranges are inescapable in laboratory plasma. The goal of a laboratory experiment is to examine the relevant physics within these ranges and extrapolate the results to space conditions that may or may not be subject to any restrictions on the values of the plasma parameters. The interrelationship between laboratory and space plasma experiments has been cultivated at a low level and the potential scientific benefit in this area has yet to be realized. The few but excellent examples of joint papers, joint experiments, and directly relevant cross-disciplinary citations are a direct result of the emphasis placed on this interrelationship two decades ago. Building on this special issue Plasma Physics and Controlled Fusion plans to create a dedicated webpage to highlight papers directly relevant to this field published either in the recent past or in the future. It is hoped that this resource will appeal to the readership in the laboratory-experiment and space-plasma communities and improve the cross-fertilization between them.

  14. Large Eddy Simulations of the Tilted Rig Experiment: A Two-dimensional Rayleigh-Taylor Instability Case

    NASA Astrophysics Data System (ADS)

    Rollin, Bertrand; Denissen, Nicholas A.; Reisner, Jon M.; Andrews, Malcolm J.

    2012-11-01

    The tilted rig experiment is a derivative of the rocket rig experiment designed to investigate turbulent mixing induced by the Rayleigh-Taylor (RT) instability. A tank containing two fluids of different densities is accelerated downwards between two parallel guiding rods by rocket motors. The acceleration is such that the pressure and density gradients face opposite directions at the fluid interface, creating a Rayleigh-Taylor unstable configuration. The rig is tilted so that the tank is initially at an angle and the acceleration is not perpendicular to the fluid interface when the rockets fire. This results in a two-dimensional Rayleigh-Taylor instability case in which the fluids experience RT mixing and a bulk overturning motion. The tilted rig is therefore a valuable experiment for calibrating two-dimensional mixing models. Large Eddy Simulations of the tilted rig experiments will be compared to available experimental results. A study of the behavior of turbulence variables relevant to turbulence modeling will be presented. LA-UR 12-23829. This work was performed for the U.S. Department of Energy by Los Alamos National Laboratory under Contract No. DE-AC52-06NA25396.
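
    In the linear stage, the classical growth rate of a Rayleigh-Taylor mode of wavenumber k (a standard textbook result, not specific to the tilted-rig analysis) is set by the Atwood number of the two fluids:

```latex
A = \frac{\rho_{2} - \rho_{1}}{\rho_{2} + \rho_{1}},
\qquad
\sigma = \sqrt{A\,g\,k}
```

    In the tilted configuration, this RT mixing growth is superposed on the bulk overturning motion described above.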

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kasemir, Kay; Pearson, Matthew R

For several years, the Control System Studio (CS-Studio) Scan System has successfully automated the operation of beam lines at the Oak Ridge National Laboratory (ORNL) High Flux Isotope Reactor (HFIR) and Spallation Neutron Source (SNS). As it is applied to additional beam lines, we need to support simultaneous adjustments of temperatures or motor positions. While this can be implemented via virtual motors or similar logic inside the Experimental Physics and Industrial Control System (EPICS) Input/Output Controllers (IOCs), doing so requires a priori knowledge of experimenters' requirements. By adding support for the parallel control of multiple process variables (PVs) to the Scan System, we can better support ad hoc automation of experiments that benefit from such simultaneous PV adjustments.

  16. Grazing-incidence small angle x-ray scattering studies of nanoscale polymer gratings

    NASA Astrophysics Data System (ADS)

    Doxastakis, Manolis; Suh, Hyo Seon; Chen, Xuanxuan; Rincon Delgadillo, Paulina A.; Wan, Lingshu; Williamson, Lance; Jiang, Zhang; Strzalka, Joseph; Wang, Jin; Chen, Wei; Ferrier, Nicola; Ramirez-Hernandez, Abelardo; de Pablo, Juan J.; Gronheid, Roel; Nealey, Paul

    2015-03-01

Grazing-Incidence Small Angle X-ray Scattering (GISAXS) offers the ability to probe large sample areas, providing three-dimensional structural information at high detail in a thin film geometry. In this study we apply GISAXS to structures formed at one step of the LiNe (Liu-Nealey) flow, which uses chemical patterns for directed self-assembly of block copolymer films. Experiments conducted at the Argonne National Laboratory provided scattering patterns probing film characteristics in directions both parallel and normal to the surface. We demonstrate the application of new computational methods to construct models based on the measured scattering. Such analysis allows extraction of structural characteristics at unprecedented detail.

  17. Initial clinical laboratory experience in noninvasive prenatal testing for fetal aneuploidy from maternal plasma DNA samples.

    PubMed

    Futch, Tracy; Spinosa, John; Bhatt, Sucheta; de Feo, Eileen; Rava, Richard P; Sehnert, Amy J

    2013-06-01

    The aim of this study is to report the experience of noninvasive prenatal DNA testing using massively parallel sequencing in an accredited clinical laboratory. Laboratory information was examined for blood samples received for testing between February and November 2012 for chromosome 21 (Chr21), Chr18, and Chr13. Monosomy X (MX) testing was available from July 2012 for cystic hygroma indication. Outcomes were collected from providers on samples with positive results. There were 5974 samples tested, and results were issued within an average of 5.1 business days. Aneuploidy was detected in 284 (4.8%) samples (155 Chr21, 66 Chr18, 19 Chr13, 40 MX, and four double aneuploidy). Follow-ups are available for 245/284 (86%), and 77/284 (27.1%) are confirmed, including one double-aneuploidy case concordant with cytogenetics from maternal malignancy. Fourteen (0.2%) discordant (putative false-positive) results (one Chr21, six Chr18, three Chr13, three MX, and one Chr21/13) have been identified. Five (0.08%) false-negative cases are reported (two trisomy 21, two trisomy 18, and one MX). In 170 (2.8%) cases, the result for a single chromosome was indefinite. This report suggests that clinical testing of maternal cell-free DNA for fetal aneuploidy operates within performance parameters established in validation studies. Noninvasive prenatal testing is sensitive to biological contributions from placental and maternal sources. ©2013 Verinata Health, Inc. Prenatal Diagnosis published by John Wiley & Sons, Ltd.
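The percentages quoted above follow directly from the raw counts; a quick arithmetic check (all counts taken from the abstract):

```python
# Sanity-check the rates reported in the abstract (counts from the text).
samples = 5974
aneuploid = 155 + 66 + 19 + 40 + 4   # Chr21 + Chr18 + Chr13 + MX + double
discordant = 14                       # putative false positives
indefinite = 170                      # single-chromosome indefinite results

detection_rate = 100 * aneuploid / samples
discordant_rate = 100 * discordant / samples
indefinite_rate = 100 * indefinite / samples

print(f"aneuploidy detected: {aneuploid} ({detection_rate:.1f}%)")   # 284 (4.8%)
print(f"discordant:          {discordant} ({discordant_rate:.1f}%)")  # 14 (0.2%)
print(f"indefinite:          {indefinite} ({indefinite_rate:.1f}%)")  # 170 (2.8%)
```

All three rounded percentages match those reported in the abstract.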

  18. Kinesiology Workbook and Laboratory Manual.

    ERIC Educational Resources Information Center

    Harris, Ruth W.

    This manual is written for students in anatomy, kinesiology, or introductory biomechanics courses. The book is divided into two sections, a kinesiology workbook and a laboratory manual. The two sections parallel each other in content and format. Each is divided into three corresponding sections: (1) Anatomical bases for movement description; (2)…

  19. HOMOLOGOUS MEASURES OF COGNITIVE FUNCTION IN HUMAN INFANTS AND LABORATORY ANIMALS TO IDENTIFY ENVIRONMENTAL HEALTH RISKS TO CHILDREN

    EPA Science Inventory

The importance of including neurodevelopmental endpoints in environmental studies is clear. A validated measure of cognitive function in human infants that also has a parallel test in laboratory animal studies will provide a valuable approach for large-scale studies. Such a ho...

  20. A sweep algorithm for massively parallel simulation of circuit-switched networks

    NASA Technical Reports Server (NTRS)

    Gaujal, Bruno; Greenberg, Albert G.; Nicol, David M.

    1992-01-01

    A new massively parallel algorithm is presented for simulating large asymmetric circuit-switched networks, controlled by a randomized-routing policy that includes trunk-reservation. A single instruction multiple data (SIMD) implementation is described, and corresponding experiments on a 16384 processor MasPar parallel computer are reported. A multiple instruction multiple data (MIMD) implementation is also described, and corresponding experiments on an Intel IPSC/860 parallel computer, using 16 processors, are reported. By exploiting parallelism, our algorithm increases the possible execution rate of such complex simulations by as much as an order of magnitude.

  1. Efficient high-throughput biological process characterization: Definitive screening design with the ambr250 bioreactor system.

    PubMed

    Tai, Mitchell; Ly, Amanda; Leung, Inne; Nayar, Gautam

    2015-01-01

    The burgeoning pipeline for new biologic drugs has increased the need for high-throughput process characterization to efficiently use process development resources. Breakthroughs in highly automated and parallelized upstream process development have led to technologies such as the 250-mL automated mini bioreactor (ambr250™) system. Furthermore, developments in modern design of experiments (DoE) have promoted the use of definitive screening design (DSD) as an efficient method to combine factor screening and characterization. Here we utilize the 24-bioreactor ambr250™ system with 10-factor DSD to demonstrate a systematic experimental workflow to efficiently characterize an Escherichia coli (E. coli) fermentation process for recombinant protein production. The generated process model is further validated by laboratory-scale experiments and shows how the strategy is useful for quality by design (QbD) approaches to control strategies for late-stage characterization. © 2015 American Institute of Chemical Engineers.

  2. A Collimated Retarding Potential Analyzer for the Study of Magnetoplasma Rocket Plumes

    NASA Technical Reports Server (NTRS)

    Glover, T. W.; Chan, A. A.; Chang-Diaz, F. R.; Kittrell, C.

    2003-01-01

A gridded retarding potential analyzer (RPA) has been developed to characterize the magnetized plasma exhaust of the 10 kW Variable Specific Impulse Magnetoplasma Rocket (VX-10) experiment at NASA's Advanced Space Propulsion Laboratory. In this system, plasma is energized through coupling of radio frequency waves at the ion cyclotron resonance (ICR). The particles are subsequently accelerated in a magnetic nozzle to provide thrust. Downstream of the nozzle, the RPA's mounting assembly enables the detector to make complete axial and radial scans of the plasma. A multichannel collimator can be inserted into the RPA to remove ions with pitch angles greater than approximately 1 deg. A calculation of the collimator's transmission function over velocity space is presented, which shows the instrument's sensitivity in detecting changes in both the parallel and perpendicular components of the ion energy. Data from initial VX-10 ICRH experiments show evidence of ion heating.

  3. Preliminary Observing System Simulation Experiments for Doppler Wind Lidars Deployed on the International Space Station

    NASA Technical Reports Server (NTRS)

    Kemp, E.; Jacob, J.; Rosenberg, R.; Jusem, J. C.; Emmitt, G. D.; Wood, S.; Greco, L. P.; Riishojgaard, L. P.; Masutani, M.; Ma, Z.; hide

    2013-01-01

NASA Goddard Space Flight Center's Software Systems Support Office (SSSO) is participating in a multi-agency study of the impact of assimilating Doppler wind lidar observations on numerical weather prediction. Funded by NASA's Earth Science Technology Office, SSSO has worked with Simpson Weather Associates to produce time series of synthetic lidar observations mimicking the OAWL and WISSCR lidar instruments deployed on the International Space Station. In addition, SSSO has worked to assimilate a portion of these observations, those drawn from the NASA fvGCM Nature Run, into the NASA GEOS-DAS global weather prediction system in a series of Observing System Simulation Experiments (OSSEs). These OSSEs will complement parallel OSSEs prepared by the Joint Center for Satellite Data Assimilation and by NOAA's Atlantic Oceanographic and Meteorological Laboratory. In this talk, we will describe our procedure and provide available OSSE results.

  4. Electron Pressure Anisotropy in the Terrestrial Reconnection Experiment and the Magnetospheric Multiscale Mission

    NASA Astrophysics Data System (ADS)

    Myers, Rachel; Egedal, Jan; Olson, Joseph; Greess, Samuel; Millet-Ayala, Alexander; Clark, Michael; Nonn, Paul; Wallace, John; Forest, Cary

    2017-10-01

    The NASA Magnetospheric Multiscale (MMS) Mission seeks to measure heating and motion of charged particles from reconnection events in the magnetotail and dayside magnetopause. MMS is paralleled by the Terrestrial Reconnection Experiment (TREX) at the Wisconsin Plasma Astrophysics Laboratory (WiPAL) in its study of collisionless magnetic reconnection. In the regimes seen by TREX and MMS, electron pressure anisotropy should develop, driving large-scale current layer formation. MMS has witnessed anisotropy, but the spatial coverage of the data is too limited to determine how the pressure anisotropy affects jet and current layer creation. Measurements of pressure anisotropy on TREX will be presented, and implications for reconnecting current layer structure in the magnetosphere, as measured by MMS, will be discussed. This research was conducted with support from a UW-Madison University Fellowship as well as the NSF/DOE award DE-SC0013032.

  5. Flight- and ground-test correlation study of BMDO SDS materials: Phase 1 report

    NASA Technical Reports Server (NTRS)

    Chung, Shirley Y.; Brinza, David E.; Minton, Timothy K.; Stiegman, Albert E.; Kenny, James T.; Liang, Ranty H.

    1993-01-01

    The NASA Evaluation of Oxygen Interactions with Materials-3 (EOIM-3) experiment served as a test bed for a variety of materials that are candidates for Ballistic Missile Defense Organization (BMDO) space assets. The materials evaluated on this flight experiment were provided by BMDO contractors and technology laboratories. A parallel ground exposure evaluation was conducted using the FAST atomic-oxygen simulation facility at Physical Sciences, Inc. The EOIM-3 materials were exposed to an atomic oxygen fluence of approximately 2.3 x 10(exp 2) atoms/sq. cm. The ground-exposed materials' fluence of 2.0 - 2.5 x 10(exp 2) atoms/sq. cm permits direct comparison of ground-exposed materials' performance with that of the flight-exposed specimens. The results from the flight test conducted aboard STS-46 and the correlative ground exposure are presented in this publication.

  6. An open-source, extensible system for laboratory timing and control

    NASA Astrophysics Data System (ADS)

    Gaskell, Peter E.; Thorn, Jeremy J.; Alba, Sequoia; Steck, Daniel A.

    2009-11-01

    We describe a simple system for timing and control, which provides control of analog, digital, and radio-frequency signals. Our system differs from most common laboratory setups in that it is open source, built from off-the-shelf components, synchronized to a common and accurate clock, and connected over an Ethernet network. A simple bus architecture facilitates creating new and specialized devices with only moderate experience in circuit design. Each device operates independently, requiring only an Ethernet network connection to the controlling computer, a clock signal, and a trigger signal. This makes the system highly robust and scalable. The devices can all be connected to a single external clock, allowing synchronous operation of a large number of devices for situations requiring precise timing of many parallel control and acquisition channels. Provided an accurate enough clock, these devices are capable of triggering events separated by one day with near-microsecond precision. We have achieved precisions of ˜0.1 ppb (parts per 109) over 16 s.
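The quoted stability figures convert directly into absolute timing error: accumulated drift is simply the fractional frequency error multiplied by the elapsed interval. A sketch of that arithmetic (0.1 ppb = 1e-10):

```python
# Worst-case timing drift for a clock with a given fractional accuracy.
def timing_error(fractional_accuracy: float, interval_s: float) -> float:
    """Drift (in seconds) accumulated over `interval_s` by a clock whose
    frequency is off by `fractional_accuracy` (e.g. 0.1 ppb = 1e-10)."""
    return fractional_accuracy * interval_s

ppb = 1e-9
print(timing_error(0.1 * ppb, 16))     # over 16 s  -> 1.6e-9 s (1.6 ns)
print(timing_error(0.1 * ppb, 86400))  # over 1 day -> 8.64e-6 s
```

The one-day figure of roughly 9 microseconds is consistent with the "near-microsecond precision" over one day claimed above.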

  7. Chloride-induced corrosion of steel in cracked concrete – Part I: Experimental studies under accelerated and natural marine environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Otieno, M., E-mail: Mike.Otieno@wits.ac.za; Beushausen, H.; Alexander, M.

Parallel corrosion experiments were carried out for 2¼ years by exposing one half of 210 beam specimens (120 × 130 × 375 mm long) to accelerated laboratory corrosion (cyclic wetting and drying) while the other half underwent natural corrosion in a marine tidal zone. Experimental variables were crack width w{sub cr} (0, incipient crack, 0.4, 0.7 mm), cover c (20, 40 mm), binder type (PC, PC/GGBS, PC/FA) and w/b ratio (0.40, 0.55). Results show that corrosion rate (i{sub corr}) was affected by the experimental variables in the following manner: i{sub corr} increased with increasing crack width, and decreased with increasing concrete quality and cover depth. The results also show that the corrosion performance of concretes in the field under natural corrosion cannot be inferred from their performance in the laboratory under accelerated corrosion. Other factors such as the corrosion process should be taken into account.

  8. Laboratory research of fracture geometry in multistage HFF in triaxial state

    NASA Astrophysics Data System (ADS)

    Bondarenko, T. M.; Hou, B.; Chen, M.; Yan, L.

    2017-05-01

Multistage hydraulic fracturing of formation (HFF) in wells with horizontal completion is an efficient method for intensifying oil extraction which, as a rule, is used to develop unconventional reservoirs. It is assumed that the complicated character of HFF fractures significantly influences the fracture geometry in the rock matrix. Numerous theoretical models proposed to predict the fracture geometry and the character of interaction of mechanical stresses in multistage HFF have not been proved experimentally. In this paper, we present the results of laboratory modeling of multistage HFF performed on a contemporary laboratory-scale plant in the triaxial stress state, using a gel solution as the HFF agent. As a result of the experiment, a fracturing pattern was formed in the cubic specimen of the model material. The laboratory results showed that a nearly plane fracture is formed at the first HFF stage, while a concave fracture is formed at the second HFF stage. The interaction of the stress fields created by the two principal HFF fractures results in the growth of secondary fractures whose directions turned out to be parallel to the modeled well bore. This stress interference also leads to a decrease in the width of the second principal fracture. It was discovered that the penny-shaped fracture model is more appropriate for predicting the geometry of HFF fractures in horizontal wells than the two-dimensional models of fracture propagation (the PKN and KGD models). A computational experiment based on the boundary element method was carried out to obtain a qualitative description of the multistage HFF processes. As a result, a mechanical model of fracture propagation was constructed, which was used to obtain the mechanical stress field (the stress contrast) and the distribution of the fracture opening angle over fracture length and orientation.
The conclusions drawn from the laboratory modeling of the multistage HFF technology agree well with those of the computational experiment. Special attention must be paid to the design of the HFF stage spacing density when implementing multistage HFF in wells with horizontal completion.

  9. Currents between tethered electrodes in a magnetized laboratory plasma

    NASA Technical Reports Server (NTRS)

    Stenzel, R. L.; Urrutia, J. M.

    1989-01-01

    Laboratory experiments on important plasma physics issues of electrodynamic tethers were performed. These included current propagation, formation of wave wings, limits of current collection, nonlinear effects and instabilities, charging phenomena, and characteristics of transmission lines in plasmas. The experiments were conducted in a large afterglow plasma. The current system was established with a small electron-emitting hot cathode tethered to an electron-collecting anode, both movable across the magnetic field and energized by potential difference up to V approx.=100 T(sub e). The total current density in space and time was obtained from complete measurements of the perturbed magnetic field. The fast spacecraft motion was reproduced in the laboratory by moving the tethered electrodes in small increments, applying delayed current pulses, and reconstructing the net field by a linear superposition of locally emitted wavelets. With this technique, the small-amplitude dc current pattern is shown to form whistler wings at each electrode instead of the generally accepted Alfven wings. For the beam electrode, the whistler wing separates from the field-aligned beam which carries no net current. Large amplitude return currents to a stationary anode generate current-driven microinstabilities, parallel electric fields, ion depletions, current disruptions and time-varying electrode charging. At appropriately high potentials and neutral densities, excess neutrals are ionized near the anode. The anode sheath emits high-frequency electron transit-time oscillations at the sheath-plasma resonance. The beam generates Langmuir turbulence, ion sound turbulence, electron heating, space charge fields, and Hall currents. An insulated, perfectly conducting transmission line embedded in the plasma becomes lossy due to excitation of whistler waves and magnetic field diffusion effects. The implications of the laboratory observations on electrodynamic tethers in space are discussed.

  10. Parallel computing in genomic research: advances and applications

    PubMed Central

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

Today's genomic experiments have to process so-called "biological big data" that is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units (GPUs) requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. PMID:26604801

  11. Parallel computing in genomic research: advances and applications.

    PubMed

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

Today's genomic experiments have to process so-called "biological big data" that is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units (GPUs) requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities.
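The kind of embarrassingly parallel, per-record workload this review surveys can be sketched with Python's standard multiprocessing module. The GC-content task below is an illustrative stand-in, not code from the article; real inputs would be FASTQ records rather than toy strings:

```python
from multiprocessing import Pool

def gc_content(seq: str) -> float:
    """Fraction of G/C bases in one read."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

if __name__ == "__main__":
    # Toy stand-ins for sequencing reads.
    reads = ["ACGTGGCC", "ATATATAT", "GGGGCCCC", "ACGT"]
    # Each read is handed to a worker process; results come back in order.
    with Pool(processes=4) as pool:
        results = pool.map(gc_content, reads)
    print(results)  # [0.75, 0.0, 1.0, 0.5]
```

The same map-over-records pattern transfers to clusters and clouds via job schedulers or frameworks such as Spark; only the executor changes, not the per-record function.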

  12. Examining HPV threat-to-efficacy ratios in the Extended Parallel Process Model.

    PubMed

    Carcioppolo, Nick; Jensen, Jakob D; Wilson, Steven R; Collins, W Bart; Carrion, Melissa; Linnemeier, Georgiann

    2013-01-01

    The Extended Parallel Process Model (EPPM) posits that an effective fear appeal includes both threat and efficacy components; however, research has not addressed whether there is an optimal threat-to-efficacy ratio. It is possible that varying levels of threat and efficacy in a persuasive message could yield different effects on attitudes, beliefs, and behaviors. In a laboratory experiment, women (n = 442) were exposed to human papilloma virus (HPV) prevention messages containing one of six threat-to-efficacy ratios and one of two message frames (messages emphasizing the connection between HPV and cervical cancer or HPV and genital warts). Multiple mediation analysis revealed that a 1-to-1 ratio of threat to efficacy was most effective at increasing prevention intentions, primarily because it caused more fear and risk susceptibility than other message ratios. Response efficacy significantly mediated the relationship between message framing and intentions, such that participants exposed to a genital warts message reported significantly higher intentions, and this association can be explained in part through response efficacy. Implications for future theoretical research as well as campaigns and intervention research are discussed.

  13. Pulsating Magnetic Reconnection Driven by Three-Dimensional Flux-Rope Interactions.

    PubMed

    Gekelman, W; De Haas, T; Daughton, W; Van Compernolle, B; Intrator, T; Vincena, S

    2016-06-10

    The dynamics of magnetic reconnection is investigated in a laboratory experiment consisting of two magnetic flux ropes, with currents slightly above the threshold for the kink instability. The evolution features periodic bursts of magnetic reconnection. To diagnose this complex evolution, volumetric three-dimensional data were acquired for both the magnetic and electric fields, allowing key field-line mapping quantities to be directly evaluated for the first time with experimental data. The ropes interact by rotating about each other and periodically bouncing at the kink frequency. During each reconnection event, the formation of a quasiseparatrix layer (QSL) is observed in the magnetic field between the flux ropes. Furthermore, a clear correlation is demonstrated between the quasiseparatrix layer and enhanced values of the quasipotential computed by integrating the parallel electric field along magnetic field lines. These results provide clear evidence that field lines passing through the quasiseparatrix layer are undergoing reconnection and give a direct measure of the nonlinear reconnection rate. The measurements suggest that the parallel electric field within the QSL is supported predominantly by electron pressure; however, resistivity may play a role.

  14. Experience in highly parallel processing using DAP

    NASA Technical Reports Server (NTRS)

    Parkinson, D.

    1987-01-01

    Distributed Array Processors (DAP) have been in day to day use for ten years and a large amount of user experience has been gained. The profile of user applications is similar to that of the Massively Parallel Processor (MPP) working group. Experience has shown that contrary to expectations, highly parallel systems provide excellent performance on so-called dirty problems such as the physics part of meteorological codes. The reasons for this observation are discussed. The arguments against replacing bit processors with floating point processors are also discussed.

  15. First experiments probing the collision of parallel magnetic fields using laser-produced plasmas

    DOE PAGES

    Rosenberg, M. J.; Li, C. K.; Fox, W.; ...

    2015-04-08

Novel experiments to study the strongly-driven collision of parallel magnetic fields in β~10, laser-produced plasmas have been conducted using monoenergetic proton radiography. These experiments were designed to probe the process of magnetic flux pileup, which has been identified in prior laser-plasma experiments as a key physical mechanism in the reconnection of anti-parallel magnetic fields when the reconnection inflow is dominated by strong plasma flows. In the present experiments using colliding plasmas carrying parallel magnetic fields, the magnetic flux is found to be conserved and slightly compressed in the collision region. Two-dimensional (2D) particle-in-cell (PIC) simulations predict a stronger flux compression and amplification of the magnetic field strength, and this discrepancy is attributed to the three-dimensional (3D) collision geometry. Future experiments may drive a stronger collision and further explore flux pileup in the context of the strongly-driven interaction of magnetic fields.

  16. Speeding up parallel processing

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1988-01-01

    In 1967 Amdahl expressed doubts about the ultimate utility of multiprocessors. The formulation, now called Amdahl's law, became part of the computing folklore and has inspired much skepticism about the ability of the current generation of massively parallel processors to efficiently deliver all their computing power to programs. The widely publicized recent results of a group at Sandia National Laboratory, which showed speedup on a 1024 node hypercube of over 500 for three fixed size problems and over 1000 for three scalable problems, have convincingly challenged this bit of folklore and have given new impetus to parallel scientific computing.
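The folklore and its challenge are easy to state quantitatively: Amdahl's law bounds fixed-size speedup by the serial fraction, while the Sandia results relied on scaling the problem with the machine (the Gustafson view). A sketch using an illustrative 1% serial fraction, which is not a figure from the Sandia study:

```python
def amdahl_speedup(serial_fraction: float, n_procs: int) -> float:
    """Fixed-size problem: speedup is capped at 1/serial_fraction."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_procs)

def gustafson_speedup(serial_fraction: float, n_procs: int) -> float:
    """Scaled problem: the parallel part grows with the machine size."""
    return n_procs - serial_fraction * (n_procs - 1)

n = 1024  # node count of the Sandia hypercube
print(amdahl_speedup(0.01, n))     # ~91: 1% serial code throttles 1024 nodes
print(gustafson_speedup(0.01, n))  # ~1014: scaling the problem recovers it
```

The contrast shows why speedups above 1000 on 1024 nodes are unsurprising for scalable problems even though Amdahl's fixed-size bound makes them look impossible.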

  17. Competitive Genomic Screens of Barcoded Yeast Libraries

    PubMed Central

    Urbanus, Malene; Proctor, Michael; Heisler, Lawrence E.; Giaever, Guri; Nislow, Corey

    2011-01-01

By virtue of advances in next generation sequencing technologies, we have access to new genome sequences almost daily. The tempo of these advances is accelerating, promising greater depth and breadth. In light of these extraordinary advances, the need for fast, parallel methods to define gene function becomes ever more important. Collections of genome-wide deletion mutants in yeasts and E. coli have served as workhorses for the functional characterization of genes, but this approach is not scalable: current gene-deletion approaches require each of the thousands of genes that comprise a genome to be deleted and verified. Only after this work is complete can we pursue high-throughput phenotyping. Over the past decade, our laboratory has refined a portfolio of competitive, miniaturized, high-throughput genome-wide assays that can be performed in parallel. This parallelization is possible because of the inclusion of DNA 'tags', or 'barcodes', in each mutant, with the barcode serving as a proxy for the mutation; barcode abundance can then be measured to assess mutant fitness. In this study, we seek to fill the gap between DNA sequence and barcoded mutant collections. To accomplish this we introduce a combined transposon disruption-barcoding approach that opens up parallel barcode assays to newly sequenced, but poorly characterized microbes. To illustrate this approach we present a new Candida albicans barcoded disruption collection and describe how both microarray-based and next generation sequencing-based platforms can be used to collect 10,000 - 1,000,000 gene-gene and drug-gene interactions in a single experiment. PMID:21860376

  18. A CS1 pedagogical approach to parallel thinking

    NASA Astrophysics Data System (ADS)

    Rague, Brian William

    Almost all collegiate programs in Computer Science offer an introductory course in programming primarily devoted to communicating the foundational principles of software design and development. The ACM designates this introduction to computer programming course for first-year students as CS1, during which methodologies for solving problems within a discrete computational context are presented. Logical thinking is highlighted, guided primarily by a sequential approach to algorithm development and made manifest by typically using the latest, commercially successful programming language. In response to the most recent developments in accessible multicore computers, instructors of these introductory classes may wish to include training on how to design workable parallel code. Novel issues arise when programming concurrent applications which can make teaching these concepts to beginning programmers a seemingly formidable task. Student comprehension of design strategies related to parallel systems should be monitored to ensure an effective classroom experience. This research investigated the feasibility of integrating parallel computing concepts into the first-year CS classroom. To quantitatively assess student comprehension of parallel computing, an experimental educational study using a two-factor mixed group design was conducted to evaluate two instructional interventions in addition to a control group: (1) topic lecture only, and (2) topic lecture with laboratory work using a software visualization Parallel Analysis Tool (PAT) specifically designed for this project. A new evaluation instrument developed for this study, the Perceptions of Parallelism Survey (PoPS), was used to measure student learning regarding parallel systems. 
The results from this educational study show a statistically significant main effect among the repeated measures, implying that student comprehension levels of parallel concepts as measured by the PoPS improve immediately after the delivery of any initial three-week CS1 level module when compared with student comprehension levels just prior to starting the course. Survey results measured during the ninth week of the course reveal that performance levels remained high compared to pre-course performance scores. A second result produced by this study reveals no statistically significant interaction effect between the intervention method and student performance as measured by the evaluation instrument over three separate testing periods. However, visual inspection of survey score trends and the low p-value generated by the interaction analysis (0.062) indicate that further studies may verify improved concept retention levels for the lecture w/PAT group.

  19. Effect of geometrical configuration of sediment replenishment on the development of bed form patterns in a gravel bed channel

    NASA Astrophysics Data System (ADS)

    Battisacco, Elena; Franca, Mário J.; Schleiss, Anton J.

    2016-04-01

    Dams interrupt the longitudinal continuity of river reaches since they store water and trap sediment in the upstream reservoir. With the sediment continuum interrupted, the transport capacity of the downstream stretch exceeds the sediment supply and the flow becomes sediment "hungry". Sediment replenishment is an increasingly used method for restoring continuity in rivers and for re-establishing the sediment regime of such disturbed river reaches. This research evaluates, through systematic laboratory experiments, the effect of different geometrical configurations of sediment replenishment on the evolution of the bed morphology. A typical straight armoured gravel reach is reproduced in a laboratory flume in terms of slope, grain size and cross section. The total amount of replenished sediment is placed in four identical volumes on both channel banks, forming six different geometrical configurations; both alternated and parallel combinations are studied. Preliminary studies demonstrate that complete submergence of the replenishment deposits is the most suitable condition for obtaining complete erosion and high persistence of the replenished material in the channel. The response of the channel bed morphology to replenishment is documented by camera and laser scanners installed on a movable carriage. The parallel configurations create an initially strong narrowing of the channel section; the transport capacity is thus higher and most of the replenished sediment exits the channel. These configurations result in a more spread-out distribution of grains but no clear morphological pattern. Clear bed form patterns can be observed when applying alternated configurations. Furthermore, the wavelength of the depositions corresponds to the replenishment deposit length; these morphological forms can be regarded as mounds. To enhance channel bed morphology on an armoured bed by sediment replenishment, alternated deposit configurations are therefore more favourable and effective. 
The present study is supported by FOEN (Federal Office for the Environment, Switzerland).

  20. Sigmoidal equilibria and eruptive instabilities in laboratory magnetic flux ropes

    NASA Astrophysics Data System (ADS)

    Myers, C. E.; Yamada, M.; Belova, E.; Ji, H.; Yoo, J.

    2013-12-01

    The Magnetic Reconnection Experiment (MRX) has recently been modified to study quasi-statically driven line-tied magnetic flux ropes in the context of storage-and-release eruptions in the corona. Detailed in situ magnetic measurements and supporting MHD simulations permit quantitative analysis of the plasma behavior. We find that the behavior of these flux ropes depends strongly on the properties of the applied potential magnetic field arcade. For example, when the arcade is aligned parallel to the flux rope footpoints, force-free currents induced in the expanding rope modify the pressure and tension in the arcade, resulting in a confined, quiescent discharge with a saturated kink instability. When the arcade is obliquely aligned to the footpoints, on the other hand, a highly sigmoidal equilibrium forms that can dynamically erupt (see Fig. 1 and Fig. 2). To our knowledge, these storage-and-release eruptions are the first of their kind to be produced in the laboratory. A new 2D magnetic probe array is used to map out the internal structure of the flux ropes during both the storage and the release phases of the discharge. The kink instability and the torus instability are studied as candidate eruptive mechanisms, the latter by varying the vertical gradient of the potential field arcade. We also investigate magnetic reconnection events that accompany the eruptions. The long-term objective of this work is to use internal magnetic measurements of the flux rope structure to better understand the evolution and eruption of comparable structures in the corona. This research is supported by DoE Contract Number DE-AC02-09CH11466 and by the Center for Magnetic Self-Organization (CMSO). Fig. 1: Qualitative sketches of flux ropes formed in (1) a parallel potential field arcade; and (2) an oblique potential field arcade. Fig. 2: One-dimensional magnetic measurements from (1) a parallel arcade discharge that is confined; and (2) an oblique arcade discharge that erupts.

  1. Object-Oriented Implementation of the NAS Parallel Benchmarks using Charm++

    NASA Technical Reports Server (NTRS)

    Krishnan, Sanjeev; Bhandarkar, Milind; Kale, Laxmikant V.

    1996-01-01

    This report describes experiences with implementing the NAS Computational Fluid Dynamics benchmarks using a parallel object-oriented language, Charm++. Our main objective in implementing the NAS CFD kernel benchmarks was to develop a code that could be used to easily experiment with different domain decomposition strategies and dynamic load balancing. We also wished to leverage the object-orientation provided by the Charm++ parallel object-oriented language, to develop reusable abstractions that would simplify the process of developing parallel applications. We first describe the Charm++ parallel programming model and the parallel object array abstraction, then go into detail about each of the Scalar Pentadiagonal (SP) and Lower/Upper Triangular (LU) benchmarks, along with performance results. Finally we conclude with an evaluation of the methodology used.
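The decomposition idea the report leverages can be sketched without Charm++ itself. Below is a minimal Python analogue (illustrative, not the benchmark code): the domain is split into blocks, each owned by an object that computes locally and exchanges ghost values with its neighbors; in Charm++ the runtime would schedule such objects across processors.

```python
# Sketch of a parallel-object-array decomposition (sequential stand-in for
# Charm++'s message-driven objects; names here are hypothetical).

class Block:
    """One member of the object array, owning a contiguous slice of the domain."""
    def __init__(self, values):
        self.values = list(values)

    def relax(self, left_ghost, right_ghost):
        # One Jacobi-style averaging sweep using ghost cells from neighbors.
        padded = [left_ghost] + self.values + [right_ghost]
        self.values = [(padded[i - 1] + padded[i + 1]) / 2
                       for i in range(1, len(padded) - 1)]

def decompose(domain, nblocks):
    """Split a 1-D domain into roughly equal contiguous blocks."""
    n = len(domain)
    bounds = [n * i // nblocks for i in range(nblocks + 1)]
    return [Block(domain[bounds[i]:bounds[i + 1]]) for i in range(nblocks)]

def sweep(blocks, left_bc=0.0, right_bc=0.0):
    """Exchange boundary values, then let every block relax once."""
    lefts = [left_bc] + [b.values[-1] for b in blocks[:-1]]
    rights = [b.values[0] for b in blocks[1:]] + [right_bc]
    for b, lg, rg in zip(blocks, lefts, rights):
        b.relax(lg, rg)
```

Changing `decompose` is all that is needed to experiment with a different decomposition strategy, which mirrors the flexibility the authors sought.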

  2. Parallelization of NAS Benchmarks for Shared Memory Multiprocessors

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry C.; Saini, Subhash (Technical Monitor)

    1998-01-01

    This paper presents our experience parallelizing the sequential implementation of the NAS benchmarks using compiler directives on an SGI Origin2000 distributed shared memory (DSM) system. Porting existing applications to new high performance parallel and distributed computing platforms is a challenging task. Ideally, a user develops a sequential version of the application, leaving the task of porting to new generations of high performance computing systems to parallelization tools and compilers. Due to the simplicity of programming shared-memory multiprocessors, compiler developers have provided various facilities to allow users to exploit parallelism. Native compilers on the SGI Origin2000 support multiprocessing directives that allow users to exploit loop-level parallelism in their programs. Additionally, supporting tools can accomplish this process automatically and present the results of parallelization to the users. We experimented with these compiler directives and supporting tools by parallelizing the sequential implementation of the NAS benchmarks. Results reported in this paper indicate that, with minimal effort, the performance gain is comparable with the hand-parallelized, carefully optimized, message-passing implementations of the same benchmarks.
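The transformation the directives perform has a simple shape: a loop whose iterations are independent is handed to a pool of workers. A rough Python analogue (illustrative only; CPU-bound Python threads do not actually speed up under the GIL, but the restructuring is the same):

```python
# Analogue of wrapping a sequential loop in a parallel-for directive.
# The loop body must be free of cross-iteration dependences.
from concurrent.futures import ThreadPoolExecutor

def body(i):
    """The original loop body (hypothetical placeholder computation)."""
    return i * i

def sequential(n):
    return [body(i) for i in range(n)]

def parallelized(n, workers=4):
    # The directive-equivalent: distribute independent iterations to workers.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(body, range(n)))
```

Both versions must produce identical results; verifying that equivalence is exactly what the supporting tools mentioned above help automate.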

  3. COMPARABLE MEASURES OF COGNITIVE FUNCTION IN HUMAN INFANTS AND LABORATORY ANIMALS TO IDENTIFY ENVIRONMENTAL HEALTH RISKS TO CHILDREN

    EPA Science Inventory

    The importance of including neurodevelopmental end points in environmental studies is clear. A validated measure of cognitive function in human infants that also has a homologous or parallel test in laboratory animal studies will provide a valuable approach for large-scale studie...

  4. U.S. Army Aeromedical Research Laboratory Annual Progress Report FY 1986

    DTIC Science & Technology

    1986-10-01

    Contracts ... Small Business Innovation ... universities and businesses which parallels the research requirements of the laboratories under the USAMRDC command. Because of the scientific manpower ... Software is being written to allow double entry verification of data. 2) Small business innovation research Each year, in compliance with the Small

  5. Translational Behavior Analysis: From Laboratory Science in Stimulus Control to Intervention with Persons with Neurodevelopmental Disabilities

    ERIC Educational Resources Information Center

    McIlvane, William J.

    2009-01-01

    Throughout its history, laboratory research in the experimental analysis of behavior has been successful in elucidating and clarifying basic learning principles and processes in both humans and nonhumans. In parallel, applied behavior analysis has shown how fundamental behavior-analytic principles and procedures can be employed to promote…

  6. 3D magnetic field configuration of small-scale reconnection events in the solar plasma atmosphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shimizu, T., E-mail: shimizu@solar.isas.jaxa.jp; Department of Earth and Planetary Science, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033

    2015-10-15

    The outer solar atmosphere, i.e., the corona and the chromosphere, is replete with small energy-release events, which are accompanied by transient brightening and jet-like ejections. These events are considered to be magnetic reconnection events in the solar plasma, and their dynamics have been studied using recent advanced observations from the Hinode spacecraft and other observatories in space and on the ground. These events occur at different locations in the solar atmosphere and vary in their morphology and amount of the released energy. The magnetic field configurations of these reconnection events are inferred based on observations of magnetic fields at the photospheric level. Observations suggest that these magnetic configurations can be classified into two groups. In the first group, two anti-parallel magnetic fields reconnect to each other, yielding a 2D emerging flux configuration. In the second group, helical or twisted magnetic flux tubes are parallel or at a relative angle to each other. Reconnection can occur only between anti-parallel components of the magnetic flux tubes and may be referred to as component reconnection. The latter configuration type may be more important for the larger class of small-scale reconnection events. The two types of magnetic configurations can be compared to counter-helicity and co-helicity configurations, respectively, in laboratory plasma collision experiments.

  7. Long-term morphological developments of river channels separated by a longitudinal training wall

    NASA Astrophysics Data System (ADS)

    Le, T. B.; Crosato, A.; Uijttewaal, W. S. J.

    2018-03-01

    Rivers have been trained for centuries by channel narrowing and straightening, which has caused significant damage to their ecosystems, particularly in the bank areas. We analyze here the possibility of training rivers in a new way: subdividing the channel into a main channel and an ecological channel with a longitudinal training wall. The effectiveness of longitudinal training walls in achieving this goal and their long-term effects on river morphology have not yet been thoroughly investigated. In particular, studies assessing the stability of the two parallel channels separated by the training wall are still lacking. This work studies the long-term morphological development of river channels subdivided by a longitudinal training wall in the presence of steady alternate bars. This type of bar, common in alluvial rivers, alters the flow field and the sediment transport direction and might affect the stability of the bifurcating system. The work comprises both laboratory experiments and numerical simulations (Delft3D). The results show that a system of parallel channels divided by a longitudinal training wall has a tendency to become unstable. An important factor is found to be the location of the upstream termination of the longitudinal wall with respect to a neighboring steady bar. The relative widths of the two parallel channels separated by the wall, and a variable discharge, do not substantially change the final evolution of the system.

  8. A benchtop biorobotic platform for in vitro observation of muscle-tendon dynamics with parallel mechanical assistance from an elastic exoskeleton.

    PubMed

    Robertson, Benjamin D; Vadakkeveedu, Siddarth; Sawicki, Gregory S

    2017-05-24

    We present a novel biorobotic framework comprised of a biological muscle-tendon unit (MTU) mechanically coupled to a feedback controlled robotic environment simulation that mimics in vivo inertial/gravitational loading and mechanical assistance from a parallel elastic exoskeleton. Using this system, we applied select combinations of biological muscle activation (modulated with rate-coded direct neural stimulation) and parallel elastic assistance (applied via closed-loop mechanical environment simulation) hypothesized to mimic human behavior based on previously published modeling studies. These conditions resulted in constant system-level force-length dynamics (i.e., stiffness), reduced biological loads, increased muscle excursion, and constant muscle average positive power output-all consistent with laboratory experiments on intact humans during exoskeleton assisted hopping. Mechanical assistance led to reduced estimated metabolic cost and MTU apparent efficiency, but increased apparent efficiency for the MTU+Exo system as a whole. Findings from this study suggest that the increased natural resonant frequency of the artificially stiffened MTU+Exo system, along with invariant movement frequencies, may underlie observed limits on the benefits of exoskeleton assistance. Our novel approach demonstrates that it is possible to capture the salient features of human locomotion with exoskeleton assistance in an isolated muscle-tendon preparation, and introduces a powerful new tool for detailed, direct examination of how assistive devices affect muscle-level neuromechanics and energetics. Copyright © 2017 Elsevier Ltd. All rights reserved.
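The resonance argument in the abstract above follows from elementary spring mechanics: a spring in parallel with the muscle-tendon unit adds its stiffness directly, raising the natural frequency f = (1/2π)·√(k/m). A back-of-the-envelope sketch with hypothetical numbers (not values from the study):

```python
# Illustrative stiffness/resonance calculation; all parameter values are
# hypothetical, chosen only to show the direction of the effect.
import math

def natural_frequency_hz(stiffness_n_per_m, mass_kg):
    """f = (1/2*pi) * sqrt(k/m) for a lumped spring-mass system."""
    return math.sqrt(stiffness_n_per_m / mass_kg) / (2 * math.pi)

k_mtu = 20_000.0   # effective MTU stiffness, N/m (hypothetical)
k_exo = 10_000.0   # parallel exoskeleton spring, N/m (hypothetical)
mass = 75.0        # supported mass, kg (hypothetical)

# Parallel springs add: k_total = k_mtu + k_exo, so the assisted system
# is stiffer and its resonant frequency is higher.
f_unassisted = natural_frequency_hz(k_mtu, mass)
f_assisted = natural_frequency_hz(k_mtu + k_exo, mass)
```

If movement frequency stays invariant while f rises, the system is driven further from resonance, consistent with the observed limits on assistance benefits.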

  9. Dissolved organic matter fluorescence at wavelength 275/342 nm as a key indicator for detection of point-source contamination in a large Chinese drinking water lake.

    PubMed

    Zhou, Yongqiang; Jeppesen, Erik; Zhang, Yunlin; Shi, Kun; Liu, Xiaohan; Zhu, Guangwei

    2016-02-01

    Surface drinking water sources have been threatened globally and there have been few attempts to detect point-source contamination in these waters using chromophoric dissolved organic matter (CDOM) fluorescence. To determine the optimal wavelength derived from CDOM fluorescence as an indicator of point-source contamination in drinking waters, a combination of field campaigns in Lake Qiandao and a laboratory wastewater addition experiment was used. Parallel factor (PARAFAC) analysis identified six components, including three humic-like, two tryptophan-like, and one tyrosine-like component. All metrics showed strong correlation with wastewater addition (r(2) > 0.90, p < 0.0001). Both the field campaigns and the laboratory contamination experiment revealed that CDOM fluorescence at 275/342 nm was the most responsive wavelength to the point-source contamination in the lake. Our results suggest that pollutants in Lake Qiandao had the highest concentrations in the river mouths of upstream inflow tributaries and the single wavelength at 275/342 nm may be adapted for online or in situ fluorescence measurements as an early warning of contamination events. This study demonstrates the potential utility of CDOM fluorescence to monitor water quality in surface drinking water sources. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. The project ownership survey: measuring differences in scientific inquiry experiences.

    PubMed

    Hanauer, David I; Dolan, Erin L

    2014-01-01

    A growing body of research documents the positive outcomes of research experiences for undergraduates, including increased persistence in science. Study of undergraduate lab learning experiences has demonstrated that the design of the experience influences the extent to which students report ownership of the project, and that project ownership is one of the psychosocial factors involved in student retention in the sciences. To date, methods for measuring project ownership have not been suitable for the collection of larger data sets. The current study aims to rectify this by developing, presenting, and evaluating a new instrument for measuring project ownership. Eighteen scaled items were generated based on prior research and theory related to project ownership and combined with 30 items shown to measure respondents' emotions about an experience, resulting in the Project Ownership survey (POS). The POS was analyzed to determine its dimensionality, reliability, and validity. The POS had a coefficient alpha of 0.92 and thus has high internal consistency. Known-groups validity was assessed through the ability of the instrument to differentiate between students who studied in traditional versus research-based laboratory courses. The POS scales differentiated between the groups, and findings paralleled previous results regarding the characteristics of project ownership.
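The internal-consistency statistic reported above (coefficient alpha, i.e. Cronbach's alpha) can be computed directly from item-level responses. A minimal sketch with hypothetical data:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(totals)).
# Respondent data below is hypothetical, not from the POS study.
from statistics import pvariance

def cronbach_alpha(responses):
    """responses: one list of k item scores per respondent."""
    k = len(responses[0])
    items = list(zip(*responses))                  # per-item score columns
    item_vars = sum(pvariance(col) for col in items)
    total_var = pvariance([sum(r) for r in responses])
    return k / (k - 1) * (1 - item_vars / total_var)
```

When items move together across respondents, total-score variance dominates the summed item variances and alpha approaches 1, which is what a value like 0.92 indicates.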

  11. When Gender Identity Doesn't Equal Sex Recorded at Birth: The Role of the Laboratory in Providing Effective Healthcare to the Transgender Community.

    PubMed

    Goldstein, Zil; Corneil, Trevor A; Greene, Dina N

    2017-08-01

    Transgender is an umbrella term used to describe individuals who identify with a gender incongruent to or variant from their sex recorded at birth. Affirming gender identity through a variety of social, medical, and surgical interventions is critical to the mental health of transgender individuals. In recent years, awareness surrounding transgender identities has increased, which has highlighted the health disparities that parallel this demographic. These disparities are reflected in the experience of transgender patients and their providers when seeking clinical laboratory services. Little is known about the effect of gender-affirming hormone therapy and surgery on optimal laboratory test interpretation. Efforts to diminish health disparities encountered by transgender individuals and their providers can be accomplished by increasing social and clinical awareness regarding sex/gender incongruence and gaining insight into the physiological manifestations and laboratory interpretations of gender-affirming strategies. This review summarizes knowledge required to understand transgender healthcare, including current clinical interventions for gender dysphoria. Particular attention is paid to the subsequent impact of these interventions on laboratory test utilization and interpretation. Common nomenclature and system barriers are also discussed. Understanding gender incongruence, the clinical changes associated with gender transition, and systemic barriers that maintain a gender/sex binary are key to providing adequate healthcare to the transgender community. Transgender-appropriate reference interval studies are virtually absent within the medical literature and should be explored. The laboratory has an important role in improving the physiological understanding, electronic medical system recognition, and overall social awareness of the transgender community. © 2017 American Association for Clinical Chemistry.

  12. Metabolic, endocrine and appetite-related responses to acute and daily milk snack consumption in healthy, adolescent males.

    PubMed

    Green, Benjamin P; Stevenson, Emma J; Rumbold, Penny L S

    2017-01-01

    Comprising two experiments, this study assessed the metabolic, endocrine and appetite-related responses to acute and chronic milk consumption in adolescent males (15-18 y). Eleven adolescents [mean ± SD age: 16.5 ± 0.9 y; BMI: 23.3 ± 3.3 kg/m²] participated in the acute experiment and completed two laboratory visits (milk vs. fruit-juice) in a randomized crossover design, separated by 7-d. Seventeen adolescents [age: 16.1 ± 0.9 y; BMI: 21.8 ± 3.7 kg/m²] completed the chronic experiment. For the chronic experiment, a parallel design with two groups was used. Participants were randomly allocated and consumed milk (n = 9) or fruit-juice (n = 8) for 28-d, completing laboratory visits on the first (baseline, day-0) and last day (follow-up, day-28) of the intervention phase. On laboratory visits (for both experiments), measures of appetite, metabolism and endocrine responses were assessed at regular intervals. In addition, eating behavior was quantified by ad libitum assessment under laboratory conditions and in the free-living environment by weighed food record. Acute milk intake stimulated glucagon (P = 0.027 [16.8 pg/mL; 95% CI: 2.4, 31.3]) and reduced ad libitum energy intake relative to fruit-juice (P = 0.048 [-651.3 kJ; 95% CI: -1294.1, -8.6]), but was comparable in the free-living environment. Chronic milk intake reduced free-living energy intake at the follow-up visit compared to baseline (P = 0.013 [-1910.9 kJ; 95% CI: -554.6, -3267.2]), whereas the opposite was apparent for fruit-juice. Relative to baseline, chronic milk intake increased the insulin response to both breakfast (P = 0.031) and mid-morning milk consumption (P = 0.050) whilst attenuating blood glucose (P = 0.025). Together, these findings suggest milk consumption impacts favorably on eating behavior in adolescent males, potentially through integrated endocrine responses. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Clastogenic effects of radiofrequency radiations on chromosomes of Tradescantia.

    PubMed

    Haider, T; Knasmueller, S; Kundi, M; Haider, M

    1994-06-01

    The clastogenicity of electromagnetic fields (EMF) has so far been studied only under laboratory conditions. We used the Tradescantia-micronucleus (Trad-MCN) bioassay in an in situ experiment to find out whether short-wave electromagnetic fields used for broadcasting (10-21 MHz) may show genotoxic effects. Plant cuttings bearing young flower buds were exposed (30 h) on both sides of a slewable curtain antenna (300/500 kW, 40-170 V/m) and 15 m (90 V/m) and 30 m (70 V/m) distant from a vertical cage antenna (100 kW) as well as at the neighbors living near the broadcasting station (200 m, 1-3 V/m). The exposure at both sides of the slewable curtain antenna was performed simultaneously within cages, one of the Faraday type shielding the field and one non-shielding mesh cage. Laboratory controls were maintained for comparison. Higher MCN frequencies than in laboratory controls were found for all exposure sites in the immediate vicinity of the antennae, where the exposure standards of the electric field strength of the International Radiation Protection Association (IRPA) were exceeded. The results at all exposure sites except one were statistically significant. Since the parallel exposure in a non-shielding and a shielding cage also revealed significant differences in MCN frequencies (the latter showing no significant differences from laboratory controls), the clastogenic effects are clearly attributable to the short-wave radiation from the antennae.

  14. Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Dean N.

    2011-07-20

    This report summarizes work carried out by the Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) Team for the period of January 1, 2011 through June 30, 2011. It discusses highlights, overall progress, period goals, and collaborations and lists papers and presentations. To learn more about our project, please visit our UV-CDAT website (URL: http://uv-cdat.org). This report will be forwarded to the program manager for the Department of Energy (DOE) Office of Biological and Environmental Research (BER), national and international collaborators and stakeholders, and to researchers working on a wide range of other climate model, reanalysis, and observation evaluation activities. The UV-CDAT executive committee consists of Dean N. Williams of Lawrence Livermore National Laboratory (LLNL); Dave Bader and Galen Shipman of Oak Ridge National Laboratory (ORNL); Phil Jones and James Ahrens of Los Alamos National Laboratory (LANL), Claudio Silva of Polytechnic Institute of New York University (NYU-Poly); and Berk Geveci of Kitware, Inc. The UV-CDAT team consists of researchers and scientists with diverse domain knowledge whose home institutions also include the National Aeronautics and Space Administration (NASA) and the University of Utah. All work is accomplished under DOE open-source guidelines and in close collaboration with the project's stakeholders, domain researchers, and scientists. Working directly with BER climate science analysis projects, this consortium will develop and deploy data and computational resources useful to a wide variety of stakeholders, including scientists, policymakers, and the general public. Members of this consortium already collaborate with other institutions and universities in researching data discovery, management, visualization, workflow analysis, and provenance. 
The UV-CDAT team will address the following high-level visualization requirements: (1) Alternative parallel streaming statistics and analysis pipelines - Data parallelism, Task parallelism, Visualization parallelism; (2) Optimized parallel input/output (I/O); (3) Remote interactive execution; (4) Advanced intercomparison visualization; (5) Data provenance processing and capture; and (6) Interfaces for scientists - Workflow data analysis and visualization construction tools, and Visualization interfaces.

  15. Closely Spaced Independent Parallel Runway Simulation.

    DTIC Science & Technology

    1984-10-01

    facility consists of the Central Computer Facility, the Controller Laboratory, and the Simulator Pilot Complex. CENTRAL COMPUTER FACILITY. The Central... Computer Facility consists of a group of mainframes, minicomputers, and associated peripherals which host the operational and data acquisition...in the Controller Laboratory and convert their verbal directives into a keyboard entry which is transmitted to the Central Computer Complex, where

  16. Evidence for the presence of quasi-two-dimensional nearly incompressible fluctuations in the solar wind

    NASA Technical Reports Server (NTRS)

    Matthaeus, William H.; Goldstein, Melvyn L.; Roberts, D. Aaron

    1990-01-01

    Assuming that the slab and isotropic models of solar wind turbulence need modification (largely due to the observed anisotropy of the interplanetary fluctuations and the results of laboratory plasma experiments), this paper proposes a model of the solar wind. The solar wind is seen as a fluid which contains both classical transverse Alfvenic fluctuations and a population of quasi-transverse fluctuations. In quasi-two-dimensional turbulence, the pitch angle scattering by resonant wave-particle interactions is suppressed, and the direction of minimum variance of interplanetary fluctuations is parallel to the mean magnetic field. The assumed incompressibility is consistent with the fact that the density fluctuations are small and anticorrelated, and that the total pressure at small scales is nearly constant.

  17. The geological record of ocean acidification.

    PubMed

    Hönisch, Bärbel; Ridgwell, Andy; Schmidt, Daniela N; Thomas, Ellen; Gibbs, Samantha J; Sluijs, Appy; Zeebe, Richard; Kump, Lee; Martindale, Rowan C; Greene, Sarah E; Kiessling, Wolfgang; Ries, Justin; Zachos, James C; Royer, Dana L; Barker, Stephen; Marchitto, Thomas M; Moyer, Ryan; Pelejero, Carles; Ziveri, Patrizia; Foster, Gavin L; Williams, Branwen

    2012-03-02

    Ocean acidification may have severe consequences for marine ecosystems; however, assessing its future impact is difficult because laboratory experiments and field observations are limited by their reduced ecologic complexity and sample period, respectively. In contrast, the geological record contains long-term evidence for a variety of global environmental perturbations, including ocean acidification, together with their associated biotic responses. We review events exhibiting evidence for elevated atmospheric CO(2), global warming, and ocean acidification over the past ~300 million years of Earth's history, some with contemporaneous extinction or evolutionary turnover among marine calcifiers. Although similarities exist, no past event perfectly parallels future projections in terms of disrupting the balance of ocean carbonate chemistry, a consequence of the unprecedented rapidity of CO(2) release currently taking place.

  18. Cavendish's crocodile and dark horse: the lives of Rutherford and Aston in parallel.

    PubMed

    Downard, Kevin M

    2007-01-01

    Ernest Rutherford and Francis Aston were born a world apart but both would become two of the most influential physicists of their time. Their separate training, under the direction of J.J. Thomson at the Cavendish Laboratory, shaped their future and allowed both men to develop and apply their considerable skills in experimental physics. It also catalyzed their careers and ultimately led to each receiving a Nobel Prize. Although they had very different characters, Rutherford and Aston became close colleagues and confidants who spent considerable time together within the confines of the Cavendish Laboratory, at Trinity College, and elsewhere in Cambridge. They also traveled the world in company, usually as part of a group or British delegation of scientists attending conferences and meetings overseas. This article parallels the lives of the two men. It describes how they came to work at the Cavendish, their scientific accomplishments and accolades, and their activities and interactions away from the laboratory.

  19. Heating-freezing effects on the orientation of kaolin clay particles

    DOE PAGES

    Jaradat, Karam A.; Darbari, Zubin; Elbakhshwan, Mohamed; ...

    2017-09-29

    The effects of temperature changes on the particle orientation of a consolidated kaolin are studied using XRD experiments. Two sets of equipment were utilized in this study: benchtop equipment and a synchrotron beamline at the National Synchrotron Light Source II (NSLS-II) at Brookhaven National Laboratory. The kaolin specimens tested in the benchtop XRD were subjected to elevated and freezing temperatures ex-situ, while those used for the NSLS-II experiment were exposed to the temperature changes in-situ. The temperatures considered in this study range from freezing (-10 °C) to elevated temperature below boiling (90 °C). The thermally-induced reorientation of clay mineral particles is highly dependent on the relative orientation of the clay mineral particles with respect to the applied thermal gradient. For example, kaolin samples with kaolinite particles oriented perpendicular to the thermal gradient, and to the expected thermally-induced pore water flow, experience much greater particle reorientation than samples with particles initially oriented parallel to the thermal gradient. Lastly, freezing kaolin preserved its microstructure as ice crystals formed.


  1. Dual-Wavelength Interferometry and Light Emission Study for Experimental Support of Dual-Wire Ablation Experiments

    NASA Astrophysics Data System (ADS)

    Hamilton, Andrew; Caplinger, James; Sotnikov, Vladimir; Sarkisov, Gennady; Leland, John

    2017-10-01

    In the Plasma Physics and Sensors Laboratory, located at Wright-Patterson Air Force Base, we utilize a pulsed power source to create plasma through a wire ablation process of metallic wires. With a parallel arrangement of wires, the azimuthal magnetic fields generated around each wire, along with the Ohmic current dissipation and heating occurring upon wire evaporation, launch strong radial outflows of magnetized plasmas towards the centralized stagnation region. It is in this region that we investigate two phases of the wire ablation process. Observations in the first phase are collisionless and mostly composed of light ions ejected from the initial corona. The second phase is observed when the wire core is ablated and heavy ions dominate collisions in the stagnation region. In this presentation we will show how dual-wavelength interferometric techniques can provide information about electron and atomic densities from experiments. Additionally, we expect white-light emission to provide a qualitative confirmation of the instabilities observed in our experiments. This material is based upon work supported by the Air Force Office of Scientific Research under Award Number 16RYCOR289.

  2. Natural and laboratory compaction bands in porous carbonates: a three-dimensional characterization using synchrotron X-ray computed microtomography

    NASA Astrophysics Data System (ADS)

    Cilona, A.; Arzilli, F.; Mancini, L.; Emanuele, T.

    2014-12-01

    Porous carbonates form important reservoirs for water and hydrocarbons. The fluid flow properties of carbonate reservoirs may be affected by post-depositional processes (e.g., mechanical and chemical), which need to be quantified. Field-based studies have described bed-parallel compaction bands (CBs) within carbonates with a wide range of porosities. These burial-related structures accommodate volumetric strain by grain rotation, translation, pore collapse and pressure solution. Recently, the same structures have been reproduced for the first time in the laboratory by performing triaxial compaction experiments on porous grainstones. These laboratory studies characterized and compared the microstructures of natural and laboratory CBs, but no analysis of pore connectivity has been performed. In this paper, we use an innovative approach to characterize the pore networks (e.g., porosity, connectivity) of natural and laboratory CBs and compare them with that of the host rock. We collected the data using synchrotron X-ray computed microtomography at the SYRMEP beamline of the Elettra-Sincrotrone Trieste Laboratory (Italy). Quantitative analyses of the samples were performed with the Pore3D software library. The porosity was calculated from segmented 3D images of pristine and deformed carbonates. A skeletonization process was then applied to quantify the number of connected pores within the rock volume. The analysis of the skeleton allowed us to highlight the differences between natural and laboratory CBs, and to investigate how pore connectivity evolves as a function of different deformation pathways. Both pore volume and connectivity are reduced within the CBs with respect to the pristine rock, and the natural CB has a lower porosity than the laboratory one. The grain contacts in the natural CB are welded, whereas in the laboratory one they have more irregular shapes and grain crushing is the predominant process.
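    The porosity figure described above is, at its core, the void-voxel fraction of the binarised tomography volume. The sketch below illustrates that calculation on a synthetic random array standing in for a real segmented microCT volume; it is not the Pore3D implementation, and all values are made up.

    ```python
    # Porosity from a segmented 3-D volume: fraction of voxels marked as pore.
    # A random boolean array stands in for a real microCT segmentation.
    import numpy as np

    def porosity(segmented):
        """segmented: boolean array, True = pore voxel. Returns pore fraction."""
        return segmented.mean()

    rng = np.random.default_rng(1)
    volume = rng.random((50, 50, 50)) < 0.20   # synthetic volume, ~20 % pore space
    phi = porosity(volume)                     # close to 0.20 for this toy input
    ```

    Connectivity analysis (the skeletonization step in the abstract) requires labelling connected pore components, which real toolkits such as Pore3D perform on top of this segmentation.
    
    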

  3. Fringe Capacitance of a Parallel-Plate Capacitor.

    ERIC Educational Resources Information Center

    Hale, D. P.

    1978-01-01

    Describes an experiment designed to measure the forces between charged parallel plates, and determines the relationship among the effective electrode area, the measured capacitance values, and the electrode spacing of a parallel plate capacitor. (GA)
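    For context on the quantities this experiment relates, the ideal parallel-plate model (fringe field neglected) connects plate area, spacing, capacitance, and the force between charged plates. The snippet below is a plain sketch of those textbook relations; the example dimensions are illustrative only.

    ```python
    # Ideal parallel-plate relations (fringe field neglected). The experiment
    # above measures deviations from this ideal model.
    EPS0 = 8.854e-12  # vacuum permittivity, F/m

    def capacitance(area_m2, gap_m):
        """Ideal parallel-plate capacitance C = eps0 * A / d."""
        return EPS0 * area_m2 / gap_m

    def plate_force(area_m2, gap_m, voltage):
        """Attractive force at fixed voltage: F = eps0 * A * V^2 / (2 d^2)."""
        return EPS0 * area_m2 * voltage**2 / (2.0 * gap_m**2)

    # Example: 10 cm x 10 cm plates, 1 mm apart, 100 V applied.
    C = capacitance(0.1 * 0.1, 1e-3)        # about 88.5 pF
    F = plate_force(0.1 * 0.1, 1e-3, 100.0)
    ```
    
    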

  4. ATLAS software configuration and build tool optimisation

    NASA Astrophysics Data System (ADS)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    The ATLAS software code base comprises over 6 million lines of code organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers, and is used by more than 2500 physicists from over 200 universities and laboratories on 6 continents. To meet the challenge of configuring and building this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and the project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance, which was optimised through several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at the package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of the CMT commands used for the build; and introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on optimisation of CMT commands in general, which made them approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation reduced software build time and environment setup time significantly (by several times), increased the efficiency of multi-core computing resource utilisation, and considerably improved the software developer and user experience.
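    The package-level build parallelism described above can be sketched in a few lines: packages whose dependencies are all built may be compiled concurrently, level by level. This is a hedged illustration of the scheduling idea, not CMT's actual interface; the package names and the `build_one` callback are hypothetical.

    ```python
    # Level-by-level parallel build over a dependency graph: at each step,
    # every package whose prerequisites are done is built concurrently.
    import os
    from concurrent.futures import ThreadPoolExecutor

    def parallel_build(deps, build_one, workers=os.cpu_count()):
        """deps: {package: set of prerequisite packages}.
        build_one: callable invoked once per package (hypothetical build action).
        Returns the list of parallel build levels, in order."""
        done, levels = set(), []
        with ThreadPoolExecutor(max_workers=workers) as pool:
            while len(done) < len(deps):
                ready = [p for p in deps if p not in done and deps[p] <= done]
                if not ready:
                    raise ValueError("dependency cycle")
                list(pool.map(build_one, ready))  # independent packages in parallel
                done.update(ready)
                levels.append(sorted(ready))
        return levels

    # Example dependency graph (made-up package names):
    graph = {"Core": set(), "EventModel": {"Core"}, "Reco": {"Core"},
             "Analysis": {"EventModel", "Reco"}}
    levels = parallel_build(graph, build_one=lambda pkg: None)
    # "EventModel" and "Reco" land in the same level: they build concurrently.
    ```
    
    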

  5. Decoupling Principle Analysis and Development of a Parallel Three-Dimensional Force Sensor

    PubMed Central

    Zhao, Yanzhi; Jiao, Leihao; Weng, Dacheng; Zhang, Dan; Zheng, Rencheng

    2016-01-01

    In the development of multi-dimensional force sensors, dimension coupling is the ubiquitous factor restricting the improvement of measurement accuracy. To effectively reduce the influence of dimension coupling on the parallel multi-dimensional force sensor, a novel parallel three-dimensional force sensor is proposed using a mechanical decoupling principle, and the influence of friction on dimension coupling is effectively reduced by replacing sliding friction with rolling friction. In this paper, the mathematical model is established from the structural model of the parallel three-dimensional force sensor, and the modeling and analysis of mechanical decoupling are carried out. The coupling degree (ε) of the designed sensor is defined and calculated, and the calculation results show that the mechanically decoupled parallel structure of the sensor possesses good decoupling performance. A prototype of the parallel three-dimensional force sensor was developed, and FEM analysis was carried out. A load calibration and data acquisition experiment system was built, and calibration experiments were then performed. According to the calibration experiments, the measurement error is less than 2.86% and the coupling error is less than 3.02%. The experimental results show that the sensor system possesses high measuring accuracy, which provides a basis for applied research on parallel multi-dimensional force sensors. PMID:27649194
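    One common way to obtain calibration and coupling figures like those quoted above is to fit a linear calibration matrix by least squares over known applied loads, then read the cross-axis coupling off the off-diagonal entries. The sketch below uses synthetic, made-up numbers and is not the authors' procedure.

    ```python
    # Least-squares calibration of a 3-axis force sensor and a simple
    # coupling metric: largest off-diagonal entry relative to the diagonal.
    import numpy as np

    def calibrate(outputs, forces):
        """outputs: (n, 3) raw readings; forces: (n, 3) applied loads.
        Returns C such that forces ~= outputs @ C."""
        C, *_ = np.linalg.lstsq(outputs, forces, rcond=None)
        return C

    rng = np.random.default_rng(0)
    true_C = np.array([[1.00, 0.02, 0.01],
                       [0.03, 1.00, 0.02],
                       [0.01, 0.01, 1.00]])        # small off-diagonal coupling
    loads = rng.uniform(-50, 50, size=(200, 3))    # applied forces, N (synthetic)
    readings = loads @ np.linalg.inv(true_C)       # ideal noiseless sensor outputs
    C = calibrate(readings, loads)
    coupling = np.abs(C - np.diag(np.diag(C))).max() / np.abs(np.diag(C)).max()
    ```

    With real data the readings carry noise, so the recovered matrix and coupling figure are estimates rather than exact recoveries.
    
    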

  6. SCOUT: a small vacuum chamber for nano-wire grid polarizer tests in the ultraviolet band

    NASA Astrophysics Data System (ADS)

    Landini, F.; Pancrazzi, M.; Totaro, M.; Pennelli, G.; Romoli, M.

    2012-01-01

    Within the Section of Astronomy of the Department of Physics and Astronomy of the University of Firenze (Italy), the XUVLab laboratory has been active since 1998, dedicated to technological development, mainly UV oriented. The technological research is focused on both electronics and optics. Our latest effort is dedicated to the development of innovative wire-grid polarizers optimized to work in transmission at 121.6 nm. The manufacturing of such optical devices requires advanced technological expertise and suitable experimental facilities. First, nanotechnology capability is necessary in order to build many tiny parallel conductive lines, separated by tens of nanometers, over areas wide enough to be macroscopically exploitable in an optical laboratory. Moreover, the characterization of such an advanced optical device has to be performed in vacuum, air being absorptive at 121.6 nm. A dedicated small vacuum chamber, SCOUT (Small Chamber for Optical UV Tests), was developed in our laboratory in order to perform practical and fast measurements. SCOUT hosts an optical bench and is equipped with several opening flanges, in order to be as flexible as possible. The flexibility achieved with SCOUT allows us to use the chamber beyond the goals it was designed for: it can host any compact (within 1 m) optical experiment that investigates the UV band of the spectrum.

  7. Information engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, D.N.

    1997-02-01

    The Information Engineering thrust area develops information technology to support the programmatic needs of Lawrence Livermore National Laboratory's Engineering Directorate. Progress in five programmatic areas is described in separate reports contained herein. These are entitled Three-dimensional Object Creation, Manipulation, and Transport; Zephyr: A Secure Internet-Based Process to Streamline Engineering Procurements; Subcarrier Multiplexing: Optical Network Demonstrations; Parallel Optical Interconnect Technology Demonstration; and Intelligent Automation Architecture.

  8. A Skyline Plugin for Pathway-Centric Data Browsing

    NASA Astrophysics Data System (ADS)

    Degan, Michael G.; Ryadinskiy, Lillian; Fujimoto, Grant M.; Wilkins, Christopher S.; Lichti, Cheryl F.; Payne, Samuel H.

    2016-11-01

    For targeted proteomics to be broadly adopted in biological laboratories as a routine experimental protocol, wet-bench biologists must be able to approach selected reaction monitoring (SRM) and parallel reaction monitoring (PRM) assay design in the same way they approach biological experimental design. Most often, biological hypotheses are envisioned as a set of protein interactions, networks, and pathways. We present a plugin for the popular Skyline tool that presents public mass spectrometry data in a pathway-centric view to assist users in browsing available data and determining how to design quantitative experiments. Selected proteins and their underlying mass spectra are imported to Skyline for further assay design (transition selection). The same plugin can be used for hypothesis-driven data-independent acquisition (DIA) data analysis, again utilizing the pathway view to help narrow down the set of proteins that will be investigated. The plugin is backed by the Pacific Northwest National Laboratory (PNNL) Biodiversity Library, a corpus of 3 million peptides from >100 organisms, and the draft human proteome. Users can upload personal data to the plugin to use the pathway navigation prior to importing their own data into Skyline.

  9. Locus and persistence of capacity limitations in visual information processing.

    PubMed

    Kleiss, J A; Lane, D M

    1986-05-01

    Although there is considerable evidence that stimuli such as digits and letters are extensively processed in parallel and without capacity limitations, recent data suggest that only the features of stimuli are processed in parallel. In an attempt to reconcile this discrepancy, we used the simultaneous/successive detection paradigm with stimuli from experiments indicating parallel processing and with stimuli from experiments indicating that only features can be processed in parallel. In Experiment 1, large differences between simultaneous and successive presentations were obtained with an R target among P and Q distractors and among P and B distractors, but not with digit targets among letter distractors. As predicted by the feature integration theory of attention, false-alarm rates in the simultaneous condition were much higher than in the successive condition with the R/PQ stimuli. In Experiment 2, the possibility that attention is required for any difficult discrimination was ruled out as an explanation of the discrepancy between the digit/letter results and the R/PQ and R/PB results. Experiment 3A replicated the R/PQ and R/PB results of Experiment 1, and Experiment 3B extended these findings to a new set of stimuli. In Experiment 4, we found that large amounts of consistent practice did not generally eliminate capacity limitations. From this series of experiments we strongly conclude that the notion of capacity-free letter perception has limited generality.

  10. Quantitative and qualitative measure of intralaboratory two-dimensional protein gel reproducibility and the effects of sample preparation, sample load, and image analysis.

    PubMed

    Choe, Leila H; Lee, Kelvin H

    2003-10-01

    We investigate one approach to assess the quantitative variability in two-dimensional gel electrophoresis (2-DE) separations based on gel-to-gel variability, sample preparation variability, sample load differences, and the effect of automation on image analysis. We observe that 95% of spots present in three out of four replicate gels exhibit less than a 0.52 coefficient of variation (CV) in fluorescent stain intensity (% volume) for a single sample run on multiple gels. When four parallel sample preparations are performed, this value increases to 0.57. We do not observe any significant change in quantitative value for an increase or decrease in sample load of 30% when using appropriate image analysis variables. Increasing use of automation, while necessary in modern 2-DE experiments, does change the observed level of quantitative and qualitative variability among replicate gels. The number of spots that change qualitatively for a single sample run in parallel varies from a CV = 0.03 for fully manual analysis to CV = 0.20 for a fully automated analysis. We present a systematic method by which a single laboratory can measure gel-to-gel variability using only three gel runs.
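    The variability measure used throughout the study above is the coefficient of variation (CV = standard deviation / mean) of a spot's stain intensity across replicate gels. A minimal illustration, with made-up spot values:

    ```python
    # Coefficient of variation of a 2-DE spot's %-volume across replicate gels.
    import statistics

    def cv(values):
        """Sample coefficient of variation: stdev / mean."""
        return statistics.stdev(values) / statistics.mean(values)

    # One spot's %-volume in four replicate gels of the same sample (synthetic):
    spot = [0.041, 0.038, 0.044, 0.040]
    print(round(cv(spot), 3))
    ```

    A CV of 0.52 at the 95th percentile, as reported above, means that for most spots the replicate-to-replicate spread in intensity is roughly half the mean intensity.
    
    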

  11. Propagation of acoustic shock waves between parallel rigid boundaries and into shadow zones

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Desjouy, C., E-mail: cyril.desjouy@gmail.com; Ollivier, S.; Dragna, D.

    2015-10-28

    The study of acoustic shock propagation in complex environments is of great interest for urban acoustics, but also for source localization, an underlying problem in military applications. To give a better understanding of the phenomena taking place during the propagation of acoustic shocks, laboratory-scale experiments and numerical simulations were performed to study the propagation of weak shock waves between parallel rigid boundaries and into shadow zones created by corners. In particular, this work focuses on the local interactions taking place between incident, reflected, and diffracted waves according to the geometry, in both regular and irregular (also called von Neumann) regimes of reflection. In the latter case, an irregular reflection can lead to the formation of a Mach stem that can modify the spatial distribution of the acoustic pressure. Short-duration acoustic shock waves were produced by a 20 kV electric spark source, and a schlieren optical method was used to visualize the incident shock front and the reflection/diffraction patterns. Experimental results are compared to numerical simulations based on a high-order finite difference solution of the two-dimensional Navier-Stokes equations.

  12. Numerical Analysis of Ginzburg-Landau Models for Superconductivity.

    NASA Astrophysics Data System (ADS)

    Coskun, Erhan

    Thin-film conventional, as well as high-Tc, superconductors of various geometric shapes placed under both uniform and variable-strength magnetic fields are studied using the universally accepted macroscopic Ginzburg-Landau model. A series of new theoretical results concerning the properties of solutions is presented using the semi-discrete time-dependent Ginzburg-Landau equations, a staggered grid setup, and natural boundary conditions. Efficient serial algorithms, including a novel adaptive algorithm, are developed and successfully implemented for solving the governing highly nonlinear parabolic system of equations. The refinement technique used in the adaptive algorithm is based on a modified forward Euler method, which we also developed to ease the restriction on time step size imposed by stability considerations. Stability and convergence properties of the forward and modified forward Euler schemes are studied. Numerical simulations of various recent physical experiments of technological importance, such as vortex motion and pinning, are performed. The numerical code for solving the time-dependent Ginzburg-Landau equations is parallelized using BlockComm-Chameleon and PCN. The parallel code was run on the distributed-memory multiprocessors Intel iPSC/860, IBM SP1, and a cluster of Sun SPARC workstations, all located at the Mathematics and Computer Science Division, Argonne National Laboratory.
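    The explicit time-stepping discussed above can be illustrated on a 1-D real Ginzburg-Landau toy problem, u_t = u_xx + u - u^3, with forward Euler and zero-flux (natural) boundaries; the familiar dt <= dx^2/2 restriction for explicit diffusion schemes is the stability constraint the abstract alludes to. This is a sketch under those assumptions, not the thesis's staggered-grid code.

    ```python
    # Forward Euler for the 1-D real Ginzburg-Landau equation u_t = u_xx + u - u^3
    # with zero-flux boundaries; a small perturbation grows and saturates at |u| ~ 1.
    import numpy as np

    def step(u, dx, dt):
        """One forward-Euler step; reflecting (zero-flux) ends via ghost points."""
        lap = np.empty_like(u)
        lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
        lap[0] = 2 * (u[1] - u[0]) / dx**2
        lap[-1] = 2 * (u[-2] - u[-1]) / dx**2
        return u + dt * (lap + u - u**3)

    x = np.linspace(0, 10, 101)
    dx = x[1] - x[0]
    u = 0.1 * np.cos(np.pi * x / 10)       # small perturbation of u = 0
    for _ in range(2000):
        u = step(u, dx, dt=0.4 * dx**2)    # within the explicit stability bound
    # u relaxes toward the energy-minimising states +-1 instead of blowing up.
    ```

    Taking dt above dx^2/2 makes the same loop diverge, which is the motivation for the modified scheme and adaptive refinement described in the record.
    
    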

  13. A parallel Jacobson-Oksman optimization algorithm. [parallel processing (computers)

    NASA Technical Reports Server (NTRS)

    Straeter, T. A.; Markos, A. T.

    1975-01-01

    A gradient-dependent optimization technique which exploits the vector-streaming or parallel-computing capabilities of some modern computers is presented. The algorithm, derived by assuming that the function to be minimized is homogeneous, is a modification of the Jacobson-Oksman serial minimization method. In addition to describing the algorithm, conditions ensuring the convergence of the iterates of the algorithm and the results of numerical experiments on a group of sample test functions are presented. The results of these experiments indicate that this algorithm will solve optimization problems in less computing time than conventional serial methods on machines having vector-streaming or parallel-computing capabilities.
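    A simple modern analogue of the parallelism exploited above: the components of a finite-difference gradient are independent function evaluations and can be computed concurrently. The sketch illustrates that parallel evaluation only, not the Jacobson-Oksman update itself.

    ```python
    # Central-difference gradient with components evaluated in parallel.
    from concurrent.futures import ThreadPoolExecutor

    def grad(f, x, h=1e-6):
        """Approximate the gradient of f at x; each component is an
        independent pair of function evaluations, run concurrently."""
        def comp(i):
            xp, xm = list(x), list(x)
            xp[i] += h
            xm[i] -= h
            return (f(xp) - f(xm)) / (2 * h)
        with ThreadPoolExecutor() as pool:
            return list(pool.map(comp, range(len(x))))

    # Example on the homogeneous test function f(x) = sum(x_i^2):
    g = grad(lambda v: sum(t * t for t in v), [1.0, -2.0, 3.0])
    # g is approximately [2.0, -4.0, 6.0]
    ```
    
    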

  14. Experimental determination of interfacial tension by different dynamical methods under simple shear flow conditions with a novel computer-controlled parallel band apparatus.

    PubMed

    Megías-Alguacil, David; Fischer, Peter; Windhab, Erich J

    2004-06-15

    We present experimental investigations on droplet deformation under simple shear flow conditions, using a computer-controlled parallel band apparatus and an optical device which allows us to record the time dependence of the droplet shape. Several methods are applied to determine the interfacial tension from the observed shape and relaxation mechanism. Specific software developed in our laboratory allows the droplet to be held in a fixed position for extended, effectively indefinite, times. This is an advantage over most other work done in this area, where only limited observation time is available; in our experiments, the transient deformation of sheared droplets can be observed until it reaches the steady state. Both the droplet and the matrix phases of the measured systems were Newtonian. Droplet deformation, orientation angle and retraction were studied and compared to several models. The interfacial tension of the different systems was calculated using the theories of Taylor, Rallison, and Hinch and Acrivos. The results obtained from the analysis of droplet deformation were in very good agreement with the drop detachment experiments of Feigl and co-workers. The study of the orientation angle shows qualitative agreement with the theory of Hinch and Acrivos but reveals larger quantitative discrepancies for several empirical fitting parameters of the model used. Analysis of the relaxation of sheared drops provided estimates of the interfacial tension that were in very good agreement with the steady-state measurements.
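    Of the models named above, Taylor's first-order small-deformation theory is the simplest: the steady deformation parameter D = (L - B)/(L + B) satisfies D = Ca (19λ + 16)/(16λ + 16), where λ is the droplet-to-matrix viscosity ratio and Ca = μ_c γ̇ a / σ is the capillary number. Inverting it gives the interfacial tension from a measured D. The numbers below are illustrative, not the paper's data.

    ```python
    # Interfacial tension from a measured steady deformation via Taylor's
    # small-deformation theory: D = Ca * (19*lam + 16) / (16*lam + 16).
    def interfacial_tension(D, mu_c, shear_rate, radius, lam):
        """Estimate sigma (N/m): invert Taylor's relation for the capillary
        number Ca = mu_c * shear_rate * radius / sigma implied by D."""
        ca = D * (16 * lam + 16) / (19 * lam + 16)
        return mu_c * shear_rate * radius / ca

    # Illustrative values: 1 Pa.s matrix, shear rate 1 1/s, 0.5 mm droplet,
    # viscosity ratio 1, measured deformation D = 0.05:
    sigma = interfacial_tension(0.05, 1.0, 1.0, 0.5e-3, 1.0)  # ~0.011 N/m
    ```

    The relation holds only for small D; the retraction-based methods mentioned above provide an independent estimate in the same regime.
    
    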

  15. Progress on the Multiphysics Capabilities of the Parallel Electromagnetic ACE3P Simulation Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kononenko, Oleksiy

    2015-03-26

    ACE3P is a 3D parallel simulation suite that is being developed at SLAC National Accelerator Laboratory. Effectively utilizing supercomputer resources, ACE3P has become a key tool for the coupled electromagnetic, thermal and mechanical research and design of particle accelerators. Based on the existing finite-element infrastructure, a massively parallel eigensolver is developed for modal analysis of mechanical structures. It complements a set of the multiphysics tools in ACE3P and, in particular, can be used for the comprehensive study of microphonics in accelerating cavities ensuring the operational reliability of a particle accelerator.

  16. Optimisation of insect cell growth in deep-well blocks: development of a high-throughput insect cell expression screen.

    PubMed

    Bahia, Daljit; Cheung, Robert; Buchs, Mirjam; Geisse, Sabine; Hunt, Ian

    2005-01-01

    This report describes a method to culture insect cells in 24 deep-well blocks for the routine small-scale optimisation of baculovirus-mediated protein expression experiments. Miniaturisation of this process provides the necessary reduction in resource allocation, reagents, and labour to allow extensive and rapid optimisation of expression conditions, with a concomitant reduction in lead time before commencement of large-scale bioreactor experiments. This greatly simplifies the optimisation process and allows the use of liquid-handling robotics in much of the initial optimisation stages, thereby greatly increasing the throughput of the laboratory. We present several examples of the use of deep-well block expression studies in the optimisation of therapeutically relevant protein targets. We also discuss how the enhanced throughput offered by this approach can be adapted to robotic handling systems and the implications this has for the capacity to conduct multi-parallel protein expression studies.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garimella, Sarvesh; Kristensen, Thomas Bjerring; Ignatius, Karolina

    The SPectrometer for Ice Nuclei (SPIN) is a commercially available ice nucleating particle (INP) counter manufactured by Droplet Measurement Technologies in Boulder, CO. The SPIN is a continuous flow diffusion chamber with parallel plate geometry based on the Zurich Ice Nucleation Chamber and the Portable Ice Nucleation Chamber. This study presents a standard description for using the SPIN instrument and also highlights methods to analyze measurements in more advanced ways. It characterizes and describes the behavior of the SPIN chamber, reports data from laboratory measurements, and quantifies uncertainties associated with the measurements. Experiments with ammonium sulfate are used to investigate homogeneous freezing of deliquesced haze droplets and droplet breakthrough. Experiments with kaolinite, NX illite, and silver iodide are used to investigate heterogeneous ice nucleation. SPIN nucleation results are compared to those from the literature. A machine learning approach for analyzing depolarization data from the SPIN optical particle counter is also presented (as an advanced use). Altogether, we report that the SPIN is able to reproduce previous INP counter measurements.

  18. Impact of reproductive laws on maternal mortality: the chilean natural experiment.

    PubMed

    Koch, Elard

    2013-05-01

    Improving maternal health and decreasing morbidity and mortality due to induced abortion are key endeavors in developing countries. One of the most controversial subjects surrounding interventions to improve maternal health is the effect of abortion laws. Chile offers a natural laboratory for investigating the determinants influencing maternal health in a large parallel time series of maternal deaths, analyzing health and socioeconomic indicators and legislative policies, including the banning of abortion in 1989. Interestingly, abortion restriction in Chile was not associated with an increase in overall maternal mortality, abortion deaths, or the total number of abortions. Contrary to the notion that restrictive abortion laws negatively affect maternal health, the abortion mortality ratio did not increase after the abortion ban in Chile. Rather, it decreased by over 96 percent, from 10.8 to 0.39 per 100,000 live births. Thus, the Chilean natural experiment provides, for the first time, strong evidence supporting the hypothesis that legalization of abortion is unnecessary to improve maternal health in Latin America.

  19. Results of the multiwell experiment in situ stresses, natural fractures, and other geological controls on reservoirs

    NASA Astrophysics Data System (ADS)

    Lorenz, John C.; Warpinski, Norman R.; Teufel, Lawrence W.; Branagan, Paul T.; Sattler, Allan R.; Northrop, David A.

    Hundreds of millions of cubic meters of natural gas are locked up in low-permeability, natural gas reservoirs. The Multiwell Experiment (MWX) was designed to characterize such reservoirs, typical of much of the western United States, and to assess and develop a technology for the production of this unconventional resource. Flow-rate tests of the MWX reservoirs indicate a system permeability that is several orders of magnitude higher than laboratory permeability measurements made on matrix-rock sandstones. This enhanced permeability is caused by natural fractures. The single set of fractures present in the reservoirs provides a significant permeability anisotropy that is aligned with the maximum in situ horizontal stress. Hydraulic fractures therefore form parallel to the natural fractures and are consequently an inefficient mechanism for stimulation. Successful stimulation may be possible by perturbing the local stress field with a large hydraulic fracture in one well so that a second hydraulic fracture in an offset well propagates transverse to the natural fracture permeability trend.

  20. CO2-induced chemo-mechanical alteration in reservoir rocks assessed via batch reaction experiments and scratch testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aman, Michael; Espinoza, D. Nicolas; Ilgen, Anastasia G.

    Here, the injection of carbon dioxide (CO2) into geological formations results in a chemical re-equilibration between the mineral assemblage and the pore fluid, with ensuing mineral dissolution and re-precipitation. Hence, target rock formations may exhibit changes of mechanical and petrophysical properties due to CO2 exposure. We conducted batch reaction experiments with Entrada Sandstone and Summerville Siltstone exposed to de-ionized water and synthetic brine under reservoir pressure (9–10 MPa) and temperature (80 °C) for up to four weeks. Samples originate from the Crystal Geyser field site, where naturally occurring CO2 seepage alters portions of these geologic formations. We conducted micro-scratch tests on rock samples without alteration, altered under laboratory conditions, and naturally altered over geologic time. Scratch toughness and hardness decrease as a function of exposure time and water salinity, by up to 52% in the case of Entrada and 87% in the case of Summerville after CO2-induced alteration in the laboratory. Imaging of altered cores with SEM-EDS and X-ray microCT methods shows dissolution of carbonate and silica cements and matrix accompanied by minor dissolution of Fe-oxides, clays, and other silicates. Parallel experiments using powdered samples confirm that dissolution of carbonate and silica are the primary reactions. The batch reaction experiments in the autoclave utilize a high fluid-to-rock volume ratio and represent an end member of possible alteration associated with CO2 storage systems. These types of tests serve as a pre-screening tool to identify the susceptibility of rock facies to CO2-related chemical-mechanical alteration during long-term CO2 storage.

  1. CO2-induced chemo-mechanical alteration in reservoir rocks assessed via batch reaction experiments and scratch testing

    DOE PAGES

    Aman, Michael; Espinoza, D. Nicolas; Ilgen, Anastasia G.; ...

    2017-09-22

    Here, the injection of carbon dioxide (CO2) into geological formations results in a chemical re-equilibration between the mineral assemblage and the pore fluid, with ensuing mineral dissolution and re-precipitation. Hence, target rock formations may exhibit changes of mechanical and petrophysical properties due to CO2 exposure. We conducted batch reaction experiments with Entrada Sandstone and Summerville Siltstone exposed to de-ionized water and synthetic brine under reservoir pressure (9–10 MPa) and temperature (80 °C) for up to four weeks. Samples originate from the Crystal Geyser field site, where naturally occurring CO2 seepage alters portions of these geologic formations. We conducted micro-scratch tests on rock samples without alteration, altered under laboratory conditions, and naturally altered over geologic time. Scratch toughness and hardness decrease as a function of exposure time and water salinity, by up to 52% in the case of Entrada and 87% in the case of Summerville after CO2-induced alteration in the laboratory. Imaging of altered cores with SEM-EDS and X-ray microCT methods shows dissolution of carbonate and silica cements and matrix accompanied by minor dissolution of Fe-oxides, clays, and other silicates. Parallel experiments using powdered samples confirm that dissolution of carbonate and silica are the primary reactions. The batch reaction experiments in the autoclave utilize a high fluid-to-rock volume ratio and represent an end member of possible alteration associated with CO2 storage systems. These types of tests serve as a pre-screening tool to identify the susceptibility of rock facies to CO2-related chemical-mechanical alteration during long-term CO2 storage.

  2. Xyce Parallel Electronic Simulator Users Guide Version 6.2.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R.; Mei, Ting; Russo, Thomas V.

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state of the art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase, a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms. These include serial, shared-memory and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. Trademarks: The information herein is subject to change without notice. Copyright (c) 2002-2014 Sandia Corporation. All rights reserved. Xyce(TM) Electronic Simulator and Xyce(TM) are trademarks of Sandia Corporation. Portions of the Xyce(TM) code are: Copyright (c) 2002, The Regents of the University of California. Produced at the Lawrence Livermore National Laboratory. Written by Alan Hindmarsh, Allan Taylor, Radu Serban. UCRL-CODE-2002-59. All rights reserved. 
    Orcad, Orcad Capture, PSpice and Probe are registered trademarks of Cadence Design Systems, Inc. Microsoft, Windows and Windows 7 are registered trademarks of Microsoft Corporation. Medici, DaVinci and Taurus are registered trademarks of Synopsys Corporation. Amtec and TecPlot are trademarks of Amtec Engineering, Inc. Xyce's expression library is based on that inside Spice 3F5 developed by the EECS Department at the University of California. The EKV3 MOSFET model was developed by the EKV Team of the Electronics Laboratory-TUC of the Technical University of Crete. All other trademarks are property of their respective owners. Contacts: Bug Reports (Sandia only): http://joseki.sandia.gov/bugzilla, http://charleston.sandia.gov/bugzilla. World Wide Web: http://xyce.sandia.gov, http://charleston.sandia.gov/xyce (Sandia only). Email: xyce@sandia.gov (outside Sandia), xyce-sandia@sandia.gov (Sandia only).

  3. Xyce Parallel Electronic Simulator Users Guide Version 6.4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R.; Mei, Ting; Russo, Thomas V.

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state of the art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase -- a message-passing parallel implementation -- which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.
    Trademarks: The information herein is subject to change without notice. Copyright (c) 2002-2015 Sandia Corporation. All rights reserved. Xyce(TM) Electronic Simulator and Xyce(TM) are trademarks of Sandia Corporation. Portions of the Xyce(TM) code are Copyright (c) 2002, The Regents of the University of California; produced at the Lawrence Livermore National Laboratory; written by Alan Hindmarsh, Allan Taylor, Radu Serban; UCRL-CODE-2002-59; all rights reserved. Orcad, Orcad Capture, PSpice and Probe are registered trademarks of Cadence Design Systems, Inc. Microsoft, Windows and Windows 7 are registered trademarks of Microsoft Corporation. Medici, DaVinci and Taurus are registered trademarks of Synopsys Corporation. Amtec and TecPlot are trademarks of Amtec Engineering, Inc. Xyce's expression library is based on that inside Spice 3F5, developed by the EECS Department at the University of California. The EKV3 MOSFET model was developed by the EKV Team of the Electronics Laboratory-TUC of the Technical University of Crete. All other trademarks are property of their respective owners.
    Contacts: Bug reports (Sandia only): http://joseki.sandia.gov/bugzilla, http://charleston.sandia.gov/bugzilla. World Wide Web: http://xyce.sandia.gov, http://charleston.sandia.gov/xyce (Sandia only). Email: xyce@sandia.gov (outside Sandia), xyce-sandia@sandia.gov (Sandia only).

  4. Programming for 1.6 Million cores: Early experiences with IBM's BG/Q SMP architecture

    NASA Astrophysics Data System (ADS)

    Glosli, James

    2013-03-01

    With the stall in clock-cycle improvements a decade ago, the drive for computational performance has continued along a path of increasing core counts per processor. The multi-core evolution has been expressed in both symmetric multiprocessor (SMP) architectures and CPU/GPU architectures. Debates rage in the high-performance computing (HPC) community over which architecture best serves HPC. In this talk I will not attempt to resolve that debate but perhaps fuel it. I will discuss the experience of exploiting Sequoia, a 98,304-node IBM Blue Gene/Q SMP at Lawrence Livermore National Laboratory. The advantages and challenges of leveraging the computational power of BG/Q will be detailed through the discussion of two applications. The first application is a Molecular Dynamics code called ddcMD, developed over the last decade at LLNL and ported to BG/Q. The second application is a cardiac modeling code called Cardioid, recently designed and developed at LLNL to exploit the fine-scale parallelism of BG/Q's SMP architecture. Through the lenses of these efforts I will illustrate the need to rethink how we express and implement our computational approaches. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  5. Microphysics of Magnetic Reconnection: Experiments on RSX and Simulation

    NASA Astrophysics Data System (ADS)

    Intrator, T. P.; Furno, I. G.; Hsu, S. C.; Lapenta, G.; Ricci, P.

    2003-12-01

    Using a unique LANL laboratory facility, the Reconnection Scaling Experiment (RSX), and a state-of-the-art LANL numerical code, CELESTE3D, we are beginning an experimental and numerical study of the microphysics of 2D and 3D "fast magnetic reconnection". RSX at Los Alamos National Laboratory is already operational and producing research plasmas. In RSX, the radial boundaries and thus the reconnection geometry are not constrained to two dimensions. It is capable of investigating 3D magnetic reconnection occurring in a free-boundary 3D linear geometry during the coalescence of two parallel current plasma channels, which are produced by using plasma gun technology. RSX can also scale the guide field (ion gyroradius) independently of other reconnection parameters. Frontier reconnection research invokes (1) `anomalous' microinstability-induced resistivity, which enhances dissipation rates inside the reconnection layer and (2) terms of the two-fluid generalized Ohm's law which introduce whistler and kinetic Alfvén wave dynamics. The two-fluid approach predicts (a) a two-spatial-scale spatial structure of the reconnection layer, with outer (inner) thickness equal to the ion (electron) skin depth and (b) Hall currents in the reconnection plane and out-of-plane magnetic field on the electron scale. We will show spatially resolved RSX experimental measurements of the dynamics of the reconnection layer, and take advantage of our scaling capabilities to address the applicability of the two-fluid approach.

  6. Landfill leachate management in Istanbul: applications and alternatives.

    PubMed

    Calli, Baris; Mertoglu, Bulent; Inanc, Bulent

    2005-05-01

    Treatment alternatives for the leachate of Istanbul's Komurcuoda Landfill (KL), which is currently transported to the nearest central wastewater treatment plant, were comprehensively investigated with laboratory-scale experiments. As the flow rate of leachate increases in parallel with the amount of landfilled solid waste, individual treatment will be needed to reduce the transportation cost and the pollution load on the central plant. However, if the leachate is separately treated and discharged to a brook, more stringent discharge standards will apply, and therefore advanced processes in addition to conventional ones should be included. In the laboratory-scale experiments, the young landfill leachate, having a BOD5/COD ratio above 0.6, was successfully treated with efficiencies above 90% in upflow anaerobic reactors when the pH was kept below the free ammonia inhibition level. Subsequently, nitrification of the anaerobically treated leachate was performed at rates of about 8.5 mg NH4+-N g-1 VSS h-1, and efficiencies above 99% were achieved with automated pH regulation using sodium bicarbonate. Furthermore, denitrification rates as high as 8.1 mg NOx-N g-1 VSS h-1 were obtained when a carbon source was externally supplied. In addition to nitrification and denitrification, air stripping and struvite precipitation were also applied to remove ammonia from the leachate, and average efficiencies of 94% and 98% were achieved, respectively. Finally, on average 85% of the biologically inert COD was successfully removed using either ozone or Fenton's oxidation.
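The free-ammonia inhibition level mentioned in this record depends on pH and temperature through the NH4+/NH3 equilibrium. A minimal sketch of that relationship, using an Emerson-style temperature-dependent pKa; the leachate concentration and temperature below are hypothetical, not values from the paper:

```python
# Free-ammonia fraction: NH3 / (NH3 + NH4+) = 1 / (1 + 10**(pKa - pH)).
# pKa form follows Emerson et al. (1975); all sample values hypothetical.
def free_ammonia_fraction(pH, temp_C=35.0):
    pKa = 0.09018 + 2729.92 / (temp_C + 273.15)
    return 1.0 / (1.0 + 10.0 ** (pKa - pH))

total_ammonia_mg_L = 2000.0  # hypothetical total ammonia nitrogen
for pH in (7.0, 7.5, 8.0):
    fa = free_ammonia_fraction(pH) * total_ammonia_mg_L
    print(f"pH {pH}: free NH3 ~ {fa:.0f} mg/L")
```

The steep rise of the free-ammonia fraction with pH is why the abstract's anaerobic reactors needed the pH held below the inhibition level.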

  7. Cool Flames in Propane-Oxygen Premixtures at Low and Intermediate Temperatures at Reduced-Gravity

    NASA Technical Reports Server (NTRS)

    Pearlman, Howard; Foster, Michael; Karabacak, Devrez

    2003-01-01

    The Cool Flame Experiment aims to address the role of diffusive transport in the structure and stability of gas-phase, non-isothermal, hydrocarbon oxidation reactions, cool flames, and auto-ignition fronts in an unstirred, static reactor. These reactions cannot be studied on Earth, where natural convection due to self-heating during the course of slow reaction dominates diffusive transport and produces spatio-temporal variations in the thermal, and thus species concentration, profiles. On Earth, reactions with associated Rayleigh numbers (Ra) less than the critical Ra for onset of convection (Ra(sub cr) approx. 600) cannot be achieved in laboratory-scale vessels for conditions representative of nearly all low-temperature reactions. In fact, Ra at 1g ranges from 10(exp 4) to 10(exp 5) (or larger), while at reduced gravity these values can be reduced two to six orders of magnitude (below Ra(sub cr)), depending on the reduced-gravity test facility. Currently, laboratory (1g) and NASA's KC-135 reduced-gravity aircraft studies are being conducted in parallel with the development of a detailed chemical kinetic model that includes thermal and species diffusion. Select experiments have also been conducted at partial gravity (Martian, 0.3g(sub Earth)) aboard the KC-135 aircraft. This paper discusses these preliminary results for propane-oxygen premixtures in the low to intermediate temperature range (310-350 C) at reduced gravity.

  8. Process optimization using combinatorial design principles: parallel synthesis and design of experiment methods.

    PubMed

    Gooding, Owen W

    2004-06-01

    The use of parallel synthesis techniques with statistical design of experiment (DoE) methods is a powerful combination for the optimization of chemical processes. Advances in parallel synthesis equipment and easy-to-use software for statistical DoE have fueled a growing acceptance of these techniques in the pharmaceutical industry. As drug candidate structures become more complex at the same time that development timelines are compressed, these enabling technologies promise to become more important in the future.
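A full-factorial design is the simplest DoE layout to pair with parallel synthesis: one reaction vessel per combination of factor levels. A minimal sketch with hypothetical factors and levels (not taken from the paper):

```python
from itertools import product

# Hypothetical two-level factors for a process-optimization screen.
factors = {
    "temperature_C": (25, 60),
    "equiv_reagent": (1.0, 2.0),
    "solvent": ("THF", "DMF"),
}

# Full-factorial design: every combination of factor levels,
# one row per parallel reactor vessel (2 x 2 x 2 = 8 runs).
names = list(factors)
runs = [dict(zip(names, levels)) for levels in product(*factors.values())]

for i, run in enumerate(runs, 1):
    print(f"run {i}: {run}")
```

In practice a fractional factorial or response-surface design would trim this run list, but the full factorial shows how the run count maps directly onto parallel reactor positions.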

  9. Further Development and Validation of the frog Embryo Teratogenesis Assay - Xenopus (FETAX)

    DTIC Science & Technology

    1991-02-28

    ...abnormalities. The teratogenic effects of serotonin in the laboratory rat include anophthalmia, hydrocephalus, exencephaly, omphalocele and vacuolization of myocardial cells. ...ZnSO4 in Xenopus should be tested in parallel with a metabolic activation system... (kinky tail, hemangioma, anophthalmia and scoliosis).

  10. A Framework for Laboratory Pre-Work Based on the Concepts, Tools and Techniques Questioning Method

    ERIC Educational Resources Information Center

    Huntula, J.; Sharma, M. D.; Johnston, I.; Chitaree, R.

    2011-01-01

    Learning in the laboratory is different from learning in other contexts because students have to engage with various aspects of the practice of science. They have to use many skills and knowledge in parallel--not only to understand the concepts of physics but also to use the tools and analyse the data. The question arises, how to best guide…

  11. Forecasting Three-Month Outcomes in a Laboratory School Comparison of Mixed Amphetamine Salts Extended Release (Adderall XR) and Atomoxetine (Strattera) in School-Aged Children with ADHD

    ERIC Educational Resources Information Center

    Faraone, Stephen V.; Wigal, Sharon B.; Hodgkins, Paul

    2007-01-01

    Objective: Compare observed and forecasted efficacy of mixed amphetamine salts extended release (MAS-XR; Adderall) with atomoxetine (Strattera) in ADHD children. Method: The authors analyze data from a randomized, double-blind, multicenter, parallel-group, forced-dose-escalation laboratory school study of children ages 6 to 12 with ADHD combined…

  12. Development of a parallel FE simulator for modeling the whole trans-scale failure process of rock from meso- to engineering-scale

    NASA Astrophysics Data System (ADS)

    Li, Gen; Tang, Chun-An; Liang, Zheng-Zhao

    2017-01-01

    Multi-scale high-resolution modeling of rock failure process is a powerful means in modern rock mechanics studies to reveal the complex failure mechanism and to evaluate engineering risks. However, multi-scale continuous modeling of rock, from deformation, damage to failure, has raised high requirements on the design, implementation scheme and computation capacity of the numerical software system. This study is aimed at developing the parallel finite element procedure, a parallel rock failure process analysis (RFPA) simulator that is capable of modeling the whole trans-scale failure process of rock. Based on the statistical meso-damage mechanical method, the RFPA simulator is able to construct heterogeneous rock models with multiple mechanical properties, deal with and represent the trans-scale propagation of cracks, in which the stress and strain fields are solved for the damage evolution analysis of representative volume element by the parallel finite element method (FEM) solver. This paper describes the theoretical basis of the approach and provides the details of the parallel implementation on a Windows - Linux interactive platform. A numerical model is built to test the parallel performance of FEM solver. Numerical simulations are then carried out on a laboratory-scale uniaxial compression test, and field-scale net fracture spacing and engineering-scale rock slope examples, respectively. The simulation results indicate that relatively high speedup and computation efficiency can be achieved by the parallel FEM solver with a reasonable boot process. In laboratory-scale simulation, the well-known physical phenomena, such as the macroscopic fracture pattern and stress-strain responses, can be reproduced. In field-scale simulation, the formation process of net fracture spacing from initiation, propagation to saturation can be revealed completely. In engineering-scale simulation, the whole progressive failure process of the rock slope can be well modeled. 
It is shown that the parallel FE simulator developed in this study is an efficient tool for modeling the whole trans-scale failure process of rock from meso- to engineering-scale.

  13. Evaluation of selected chemical processes for production of low-cost silicon, phase 3

    NASA Technical Reports Server (NTRS)

    Blocher, J. M., Jr.; Browning, M. F.; Seifert, D. A.

    1981-01-01

    A Process Development Unit (PDU), which consisted of the four major units of the process, was designed, installed, and experimentally operated. The PDU was sized to 50 MT/yr. The deposition took place in a fluidized bed reactor. As a consequence of the experiments, improvements in the design and operation of these units were undertaken and their experimental limitations were partially established. A parallel program of experimental work demonstrated that zinc can be vaporized, for introduction into the fluidized bed reactor, by direct induction-coupled r.f. energy. Residual zinc in the product can be removed by heat treatment below the melting point of silicon. Current efficiencies of 94 percent and above, and power efficiencies around 40 percent, are achievable in the laboratory-scale electrolysis of ZnCl2.
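The reported current efficiency relates deposited mass to charge passed through Faraday's law. A sketch with hypothetical cell parameters; only the 94 percent efficiency figure comes from the abstract:

```python
# Faraday's-law sketch for ZnCl2 electrolysis.
# Current, duration, and cell size below are hypothetical.
F = 96485.0      # Faraday constant, C/mol
M_zn = 65.38     # molar mass of zinc, g/mol
n_e = 2          # electrons per Zn2+ reduced

current_A = 100.0
hours = 8.0
efficiency = 0.94  # current efficiency from the abstract

charge_C = current_A * hours * 3600
zinc_g = efficiency * charge_C * M_zn / (n_e * F)
print(f"zinc deposited: {zinc_g:.1f} g")
```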

  14. Investigations of the Rayleigh-Taylor and Richtmyer-Meshkov Instabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riccardo Bonazza; Mark Anderson; Jason Oakley

    2008-03-14

    The present program is centered on the experimental study of shock-induced interfacial fluid instabilities. Both 2-D (near-sinusoids) and 3-D (spheres) initial conditions are studied in a large, vertical square shock tube facility. The evolution of the interface shape, its distortion, the modal growth rates and the mixing of the fluids at the interface are all objectives of the investigation. In parallel to the experiments, calculations are performed using the Raptor code, on platforms made available by LLNL. These flows are of great relevance to both ICF and stockpile stewardship. The involvement of four graduate students is in line with the national laboratories' interest in the education of scientists and engineers in disciplines and technologies consistent with the labs' missions and activities.

  15. Investigation of the Richtmyer-Meshkov instability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riccardo Bonazza; Mark Anderson; Jason Oakley

    2008-12-22

    The present program is centered on the experimental study of shock-induced interfacial fluid instabilities. Both 2-D (near-sinusoids) and 3-D (spheres) initial conditions are studied in a large, vertical square shock tube facility. The evolution of the interface shape, its distortion, the modal growth rates and the mixing of the fluids at the interface are all objectives of the investigation. In parallel to the experiments, calculations are performed using the Raptor code, on platforms made available by LLNL. These flows are of great relevance to both ICF and stockpile stewardship. The involvement of three graduate students is in line with the national laboratories' interest in the education of scientists and engineers in disciplines and technologies consistent with the labs' missions and activities.

  16. Parallel processing of genomics data

    NASA Astrophysics Data System (ADS)

    Agapito, Giuseppe; Guzzi, Pietro Hiram; Cannataro, Mario

    2016-10-01

    The availability of high-throughput experimental platforms for the analysis of biological samples, such as mass spectrometry, microarrays and Next Generation Sequencing, has made it possible to analyze a whole genome in a single experiment. Such platforms produce an enormous volume of data per experiment, so the analysis of this flow of data poses several challenges in terms of data storage, preprocessing, and analysis. To face these issues, efficient, possibly parallel, bioinformatics software is needed to preprocess and analyze data, for instance to highlight genetic variation associated with complex diseases. In this paper we present a parallel algorithm for the preprocessing and statistical analysis of genomics data that copes with high-dimensional data and achieves good response times. The proposed system finds statistically significant biological markers able to discriminate classes of patients that respond to drugs in different ways. Experiments performed on real and synthetic genomic datasets show good speed-up and scalability.
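Per-sample preprocessing of the kind described is embarrassingly parallel, since each sample can be filtered independently. A minimal sketch, not the authors' algorithm, using Python's multiprocessing with a hypothetical quality filter over (position, quality) variant calls:

```python
from multiprocessing import Pool

# Hypothetical per-sample preprocessing: drop variant calls whose
# quality score falls below a threshold of 30.
def preprocess(sample):
    return [(pos, q) for pos, q in sample if q >= 30]

if __name__ == "__main__":
    samples = [
        [(101, 45), (202, 12), (303, 88)],
        [(150, 29), (250, 99)],
    ]
    # Samples are independent, so a worker pool maps over them in parallel.
    with Pool(processes=2) as pool:
        cleaned = pool.map(preprocess, samples)
    print(cleaned)  # [[(101, 45), (303, 88)], [(250, 99)]]
```

The same map-over-samples pattern extends to cluster schedulers when the per-sample work is heavier than a simple filter.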

  17. Simultaneous hydrogenation and UV-photolysis experiments of NO in CO-rich interstellar ice analogues; linking HNCO, OCN-, NH2CHO, and NH2OH

    NASA Astrophysics Data System (ADS)

    Fedoseev, G.; Chuang, K.-J.; van Dishoeck, E. F.; Ioppolo, S.; Linnartz, H.

    2016-08-01

    The laboratory work presented here simulates the chemistry on icy dust grains as typical for the `CO freeze-out stage' in dark molecular clouds. It differs from previous studies in that solid-state hydrogenation and vacuum UV photoprocessing are applied simultaneously to co-depositing molecules. In parallel, the reactions at play are described for fully characterized laboratory conditions. The focus is on the formation of molecules containing both carbon and nitrogen atoms, starting with NO in CO-, H2CO-, and CH3OH-rich ices at 13 K. The experiments yield three important conclusions. (1) Without UV processing hydroxylamine (NH2OH) is formed, as reported previously. (2) With UV processing (energetic) NH2 is formed through photodissociation of NH2OH. This radical is key in the formation of species with an N-C bond. (3) The formation of three N-C bearing species, HNCO, OCN-, and NH2CHO, is observed. The experiments establish a clear chemical link between these species; OCN- is found to be a direct derivative of HNCO, and the latter is shown to have the same precursor as formamide (NH2CHO). Moreover, the addition of VUV competing channels decreases the amount of NO molecules converted into NH2OH by at least one order of magnitude. Consequently, this decrease in NH2OH formation yield directly influences the amount of NO molecules that can be converted into HNCO, OCN-, and NH2CHO.

  18. Overview of the Focused Isoprene eXperiment at the California Institute of Technology (FIXCIT): mechanistic chamber studies on the oxidation of biogenic compounds

    DOE PAGES

    Nguyen, T. B.; Crounse, J. D.; Schwantes, R. H.; ...

    2014-12-19

    The Focused Isoprene eXperiment at the California Institute of Technology (FIXCIT) was a collaborative atmospheric chamber campaign that occurred during January 2014. FIXCIT is the laboratory component of a synergistic field and laboratory effort aimed toward (1) better understanding the chemical details behind ambient observations relevant to the southeastern United States, (2) advancing the knowledge of atmospheric oxidation mechanisms of important biogenic hydrocarbons, and (3) characterizing the behavior of field instrumentation using authentic standards. Approximately 20 principal scientists from 14 academic and government institutions performed parallel measurements at a forested site in Alabama and at the atmospheric chambers at Caltech. During the 4 week campaign period, a series of chamber experiments was conducted to investigate the dark- and photo-induced oxidation of isoprene, α-pinene, methacrolein, pinonaldehyde, acylperoxy nitrates, isoprene hydroxy nitrates (ISOPN), isoprene hydroxy hydroperoxides (ISOPOOH), and isoprene epoxydiols (IEPOX) in a highly controlled and atmospherically relevant manner. Pinonaldehyde and isomer-specific standards of ISOPN, ISOPOOH, and IEPOX were synthesized and contributed by campaign participants, which enabled explicit exploration into the oxidation mechanisms and instrument responses for these important atmospheric compounds. The present overview describes the goals, experimental design, instrumental techniques, and preliminary observations from the campaign. This work provides context for forthcoming publications affiliated with the FIXCIT campaign. Insights from FIXCIT are anticipated to aid significantly in interpretation of field data and the revision of mechanisms currently implemented in regional and global atmospheric models.

  19. Bacterial Communities of Diverse Drosophila Species: Ecological Context of a Host–Microbe Model System

    PubMed Central

    Bhatnagar, Srijak; Eisen, Jonathan A.; Kopp, Artyom

    2011-01-01

    Drosophila melanogaster is emerging as an important model of non-pathogenic host–microbe interactions. The genetic and experimental tractability of Drosophila has led to significant gains in our understanding of animal–microbial symbiosis. However, the full implications of these results cannot be appreciated without the knowledge of the microbial communities associated with natural Drosophila populations. In particular, it is not clear whether laboratory cultures can serve as an accurate model of host–microbe interactions that occur in the wild, or those that have occurred over evolutionary time. To fill this gap, we characterized natural bacterial communities associated with 14 species of Drosophila and related genera collected from distant geographic locations. To represent the ecological diversity of Drosophilids, examined species included fruit-, flower-, mushroom-, and cactus-feeders. In parallel, wild host populations were compared to laboratory strains, and controlled experiments were performed to assess the importance of host species and diet in shaping bacterial microbiome composition. We find that Drosophilid flies have taxonomically restricted bacterial communities, with 85% of the natural bacterial microbiome composed of only four bacterial families. The dominant bacterial taxa are widespread and found in many different host species despite the taxonomic, ecological, and geographic diversity of their hosts. Both natural surveys and laboratory experiments indicate that host diet plays a major role in shaping the Drosophila bacterial microbiome. Despite this, the internal bacterial microbiome represents only a highly reduced subset of the external bacterial communities, suggesting that the host exercises some level of control over the bacteria that inhabit its digestive tract. Finally, we show that laboratory strains provide only a limited model of natural host–microbe interactions. 
Bacterial taxa used in experimental studies are rare or absent in wild Drosophila populations, while the most abundant associates of natural Drosophila populations are rare in the lab. PMID:21966276

  20. Innovative mathematical modeling in environmental remediation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yeh, Gour T.; National Central Univ.; Univ. of Central Florida

    2013-05-01

    There are two different ways to model reactive transport: ad hoc and innovative reaction-based approaches. The former, such as the Kd simplification of adsorption, has been widely employed by practitioners, while the latter has been mainly used in scientific communities for elucidating mechanisms of biogeochemical transport processes. It is believed that innovative mechanistic-based models could serve as protocols for environmental remediation as well. This paper reviews the development of a mechanistically coupled fluid flow, thermal transport, hydrologic transport, and reactive biogeochemical model and example applications to environmental remediation problems. Theoretical bases are sufficiently described. Four example problems previously carried out are used to demonstrate how numerical experimentation can be used to evaluate the feasibility of different remediation approaches. The first involved the application of a 56-species uranium tailings problem to the Melton Branch Subwatershed at Oak Ridge National Laboratory (ORNL) using the parallel version of the model; simulations were made to demonstrate the potential mobilization of uranium and other chelating agents in the proposed waste disposal site. The second problem simulated a laboratory-scale system to investigate the role of natural attenuation in potential off-site migration of uranium from uranium mill tailings after restoration; it showed the inadequacy of using a single Kd even for a homogeneous medium. The third example simulated laboratory experiments involving extremely high concentrations of uranium, technetium, aluminum, nitrate, and toxic metals (e.g., Ni, Cr, Co). The fourth example modeled microbially mediated immobilization of uranium in an unconfined aquifer using acetate amendment in a field-scale experiment. 
The purposes of these modeling studies were to simulate various mechanisms of mobilization and immobilization of radioactive wastes and to illustrate how to apply reactive transport models for environmental remediation.
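The Kd simplification mentioned in this record folds adsorption into a single retardation factor, R = 1 + (rho_b / theta) * Kd, which slows the solute relative to the pore water. A sketch of that calculation with hypothetical parameter values:

```python
# Kd-based retardation sketch; all parameter values are hypothetical.
rho_b = 1.6    # bulk density, kg/L
theta = 0.3    # volumetric water content (dimensionless)
Kd = 2.0       # distribution coefficient, L/kg

# Retardation factor: R = 1 + (rho_b / theta) * Kd
R = 1 + (rho_b / theta) * Kd

v_water = 0.5            # pore-water velocity, m/day (hypothetical)
v_solute = v_water / R   # retarded solute velocity
print(f"R = {R:.2f}, solute velocity = {v_solute:.4f} m/day")
```

A single constant Kd makes the whole plume move at one retarded velocity, which is exactly the behavior the paper's second example shows to be inadequate even for a homogeneous medium.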

  1. Estimation of small-scale soil erosion in laboratory experiments with Structure from Motion photogrammetry

    NASA Astrophysics Data System (ADS)

    Balaguer-Puig, Matilde; Marqués-Mateu, Ángel; Lerma, José Luis; Ibáñez-Asensio, Sara

    2017-10-01

    The quantitative estimation of changes in terrain surfaces caused by water erosion can be carried out from precise descriptions of surfaces given by means of digital elevation models (DEMs). Some stages of water erosion research are conducted in the laboratory using rainfall simulators and soil boxes with areas less than 1 m2. Under these conditions, erosive processes can lead to very small surface variations, and high-precision DEMs are needed to account for differences measured in millimetres. In this paper, we used a photogrammetric Structure from Motion (SfM) technique to build DEMs of a 0.5 m2 soil box to monitor several simulated rainfall episodes in the laboratory. The technique of DEM of difference (DoD) was then applied using GIS tools to compute estimates of volumetric changes between each pair of rainfall episodes. The aim was to classify the soil surface into three classes: erosion areas, deposition areas, and unchanged or neutral areas, and to quantify the volume of soil that was eroded and deposited. We used a thresholding criterion of changes based on the estimated error of the difference of DEMs, which in turn was obtained from the root mean square error of the individual DEMs. Experimental tests showed that the choice of different threshold values in the DoD can lead to volume differences as large as 60% when compared to the direct volumetric difference, so the choice of this threshold turned out to be a key point in the method. In parallel to the photogrammetric work, we collected sediments from each rain episode and obtained a series of corresponding measured sediment yields. Computed and measured sediment yields were significantly correlated, especially when considering the accumulated value of the five simulations. The computed sediment yield was 13% greater than the measured sediment yield. 
The procedure presented in this paper proved to be suitable for the determination of sediment yields in rainfall-driven soil erosion experiments conducted in the laboratory.
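The DoD workflow described in this record can be sketched in a few lines: propagate each DEM's RMSE into the error of the difference, threshold, classify each cell, and integrate volume. All values below are hypothetical, not the paper's data:

```python
import numpy as np

# Minimal DEM-of-difference sketch (elevations in mm, hypothetical grids).
dem_before = np.array([[10.0, 10.2], [10.1, 9.8]])
dem_after  = np.array([[ 9.4, 10.2], [10.1, 10.6]])

# Error of the difference from the individual-DEM RMSEs:
# sigma_dod = sqrt(rmse1**2 + rmse2**2); threshold at 1.96 * sigma (95%).
rmse1, rmse2 = 0.15, 0.15
threshold = 1.96 * np.hypot(rmse1, rmse2)

dod = dem_after - dem_before
classes = np.full(dod.shape, "neutral", dtype=object)
classes[dod < -threshold] = "erosion"
classes[dod > threshold] = "deposition"

cell_area_m2 = 0.01  # hypothetical 10 cm x 10 cm cells
eroded_volume = -dod[dod < -threshold].sum() * cell_area_m2  # mm * m^2
print(classes)
print(f"eroded volume: {eroded_volume:.4f} mm*m^2")
```

As the abstract notes, the computed volumes are sensitive to the threshold choice, so the 1.96-sigma level here is only one defensible option.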

  2. Event parallelism: Distributed memory parallel computing for high energy physics experiments

    NASA Astrophysics Data System (ADS)

    Nash, Thomas

    1989-12-01

    This paper describes the present and expected future development of distributed memory parallel computers for high energy physics experiments. It covers the use of event-parallel microprocessor farms, particularly at Fermilab, including both ACP multiprocessors and farms of MicroVAXes. These systems have proven very cost effective in the past. A case is made for moving to the more open environment of UNIX and RISC processors. The 2nd Generation ACP Multiprocessor System, which is based on a powerful RISC system, is described. Given the promise of still more extraordinary increases in processor performance, a new emphasis on point-to-point, rather than bussed, communication will be required. Developments in this direction are described.

  3. Digital lock-in amplifier based on soundcard interface for physics laboratory

    NASA Astrophysics Data System (ADS)

    Sinlapanuntakul, J.; Kijamnajsuk, P.; Jetjamnong, C.; Chotikaprakhan, S.

    2017-09-01

    The purpose of this paper is to develop a digital lock-in amplifier based on a soundcard interface for the undergraduate physics laboratory. Both series and parallel RLC circuit laboratories are used as test cases because they are well known, easy to understand, and simple to verify. A sinusoidal signal at frequencies from 10 Hz to 15 kHz is applied to the circuits, and the amplitude and phase of the voltage drop across the resistor R are measured in ten steps per decade. The signals from the soundcard interface and a lock-in amplifier are compared, and the results show good agreement. This indicates that the designed digital lock-in amplifier is promising for the undergraduate physics laboratory.
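The core of any digital lock-in amplifier, hardware or software, is multiplication by in-phase and quadrature references followed by low-pass filtering. A minimal sketch of that demodulation step, with plain averaging standing in for the low-pass filter (the paper's soundcard I/O is not shown):

```python
import math

def lock_in(signal, fs, f_ref):
    """Recover amplitude and phase (degrees) of the component of
    `signal` (sampled at rate fs) at the reference frequency f_ref.

    Averaging over the whole record acts as the low-pass filter, so
    the record should span an integer number of reference periods.
    """
    n = len(signal)
    x = 2.0 / n * sum(s * math.cos(2 * math.pi * f_ref * i / fs)
                      for i, s in enumerate(signal))   # in-phase component
    y = 2.0 / n * sum(s * math.sin(2 * math.pi * f_ref * i / fs)
                      for i, s in enumerate(signal))   # quadrature component
    amplitude = math.hypot(x, y)
    phase = math.degrees(math.atan2(-y, x))
    return amplitude, phase
```

For a test tone s(t) = A cos(2*pi*f*t + phi) the routine returns A and phi; noise and components at other frequencies average towards zero, which is the point of lock-in detection.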

  4. Using HeLa cell stress response to introduce first year students to the scientific method, laboratory techniques, primary literature, and scientific writing.

    PubMed

    Resendes, Karen K

    2015-01-01

    Incorporating scientific literacy into inquiry-driven research is one of the most effective mechanisms for developing an undergraduate student's strength in writing. Additionally, discovery-based laboratories help develop students who approach science as critical thinkers. Thus, a three-week laboratory module for an introductory cell and molecular biology course that couples inquiry-based experimental design with extensive scientific writing was designed at Westminster College to expose first-year students to these concepts early in their undergraduate career. In the module, students used the scientific literature to design and then implement an experiment on the effect of cellular stress on protein expression in HeLa cells. In parallel, the students developed a research paper in the style of the undergraduate journal BIOS to report their results. HeLa cells were used to integrate the research experience with the Westminster College "Next Chapter" first-year program, in which the students explored the historical relevance of HeLa cells from a sociological perspective through reading The Immortal Life of Henrietta Lacks by Rebecca Skloot. In this report I detail the design, delivery, student learning outcomes, and assessment of this module; while the exercise was designed for an introductory course at a small, primarily undergraduate institution, suggestions for modifications at larger universities or for upper-division courses are included. Finally, based on student outcomes, suggestions are provided for improving the module to enhance the link between teaching students skills in experimental design and execution and developing their skills in information literacy and writing. © 2015 The International Union of Biochemistry and Molecular Biology.

  5. Innovative mathematical modeling in environmental remediation.

    PubMed

    Yeh, Gour-Tsyh; Gwo, Jin-Ping; Siegel, Malcolm D; Li, Ming-Hsu; Fang, Yilin; Zhang, Fan; Luo, Wensui; Yabusaki, Steve B

    2013-05-01

    There are two different ways to model reactive transport: ad hoc and innovative reaction-based approaches. The former, such as the Kd simplification of adsorption, has been widely employed by practitioners, while the latter has been used mainly in scientific communities for elucidating the mechanisms of biogeochemical transport processes. It is believed that innovative mechanistic models could serve as protocols for environmental remediation as well. This paper reviews the development of a mechanistically coupled fluid flow, thermal transport, hydrologic transport, and reactive biogeochemical model and example applications to environmental remediation problems. The theoretical bases are described in sufficient detail. Four example problems previously carried out are used to demonstrate how numerical experimentation can be used to evaluate the feasibility of different remediation approaches. The first involved the application of a 56-species uranium tailings problem to the Melton Branch Subwatershed at Oak Ridge National Laboratory (ORNL) using the parallel version of the model; simulations were made to demonstrate the potential mobilization of uranium and other chelating agents at the proposed waste disposal site. The second problem simulated a laboratory-scale system to investigate the role of natural attenuation in potential off-site migration of uranium from uranium mill tailings after restoration; it showed the inadequacy of using a single Kd even for a homogeneous medium. The third example simulated laboratory experiments involving extremely high concentrations of uranium, technetium, aluminum, nitrate, and toxic metals (e.g., Ni, Cr, Co). The fourth example modeled microbially mediated immobilization of uranium in an unconfined aquifer using acetate amendment in a field-scale experiment. The purposes of these modeling studies were to simulate various mechanisms of mobilization and immobilization of radioactive wastes and to illustrate how to apply reactive transport models to environmental remediation. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Initial Experimental Results of a Laboratory Mini-Magnetosphere for Astronaut Protection

    NASA Astrophysics Data System (ADS)

    Bamford, R. A.; Bingham, R.; Gibson, K.; Thornton, A.; Bradford, J.; Hapgood, M.; Gargate, L.; Silva, L.; Norberg, C.; Todd, T.; Wilson, H.; Stamper, R.

    2007-12-01

    Radiation is a major scientific and technological challenge for manned missions to Mars. With an interplanetary flight time of months to years, there is a high probability of Solar Energetic Particle events during the flight. Radiation damage to human tissue could result in acute sickness or death of the occupants of an unprotected spacecraft, so there is much interest in techniques to mitigate the effects of these events and of exposure to cosmic rays. The experimental and modelling work presented here concerns one of several innovative "Active Shield" solutions being proposed [1]. The idea of generating an artificial magnetosphere to recreate the protective shield of the Earth's magnetic field for spacecraft travelling to the Moon or Mars was considered seriously in the 1960s during the Apollo era. With most of the space agencies around the world setting their sights on returning to the Moon and then going on to Mars, the idea of some sort of active field solution is experiencing a resurgence. Results are presented from a laboratory experiment to determine the effectiveness of a mini-magnetosphere barrier in expelling a flowing energetic "solar wind" plasma. These are compared to a 3D hybrid simulation code that has been successfully applied to other astrophysical situations, e.g. the AMPTE artificial comet releases [2]. The experiment and modelling comparisons demonstrate the scalability between the laboratory and astrophysical scales. [1] Adams, J.H. et al., "Revolutionary Concepts of Radiation Shielding for Human Exploration of Space", NASA/TM-2005-213688, March 2005. [2] Gargate, L.; Bingham, R.; Fonseca, R. A.; Silva, L. O., "dHybrid: A massively parallel code for hybrid simulations of space plasmas", Computer Physics Communications, Volume 176, Issue 6, Pages 419-425, 15 March 2007, doi:10.1016/j.cpc.2006.11.013

  7. The role of parallelism in the real-time processing of anaphora.

    PubMed

    Poirier, Josée; Walenski, Matthew; Shapiro, Lewis P

    2012-06-01

    Parallelism effects refer to the facilitated processing of a target structure when it follows a similar, parallel structure. In coordination, a parallelism-related conjunction triggers the expectation that a second conjunct with the same structure as the first conjunct should occur. It has been proposed that parallelism effects reflect the use of the first structure as a template that guides the processing of the second. In this study, we examined the role of parallelism in real-time anaphora resolution by charting activation patterns in coordinated constructions containing anaphora, Verb-Phrase Ellipsis (VPE) and Noun-Phrase Traces (NP-traces). Specifically, we hypothesised that an expectation of parallelism would incite the parser to assume a structure similar to the first conjunct in the second, anaphora-containing conjunct. The speculation of a similar structure would result in early postulation of covert anaphora. Experiment 1 confirms that following a parallelism-related conjunction, first-conjunct material is activated in the second conjunct. Experiment 2 reveals that an NP-trace in the second conjunct is posited immediately where licensed, which is earlier than previously reported in the literature. In light of our findings, we propose an intricate relation between structural expectations and anaphor resolution.

  8. The role of parallelism in the real-time processing of anaphora

    PubMed Central

    Poirier, Josée; Walenski, Matthew; Shapiro, Lewis P.

    2012-01-01

    Parallelism effects refer to the facilitated processing of a target structure when it follows a similar, parallel structure. In coordination, a parallelism-related conjunction triggers the expectation that a second conjunct with the same structure as the first conjunct should occur. It has been proposed that parallelism effects reflect the use of the first structure as a template that guides the processing of the second. In this study, we examined the role of parallelism in real-time anaphora resolution by charting activation patterns in coordinated constructions containing anaphora, Verb-Phrase Ellipsis (VPE) and Noun-Phrase Traces (NP-traces). Specifically, we hypothesised that an expectation of parallelism would incite the parser to assume a structure similar to the first conjunct in the second, anaphora-containing conjunct. The speculation of a similar structure would result in early postulation of covert anaphora. Experiment 1 confirms that following a parallelism-related conjunction, first-conjunct material is activated in the second conjunct. Experiment 2 reveals that an NP-trace in the second conjunct is posited immediately where licensed, which is earlier than previously reported in the literature. In light of our findings, we propose an intricate relation between structural expectations and anaphor resolution. PMID:23741080

  9. Evaluation of fault-tolerant parallel-processor architectures over long space missions

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1989-01-01

    The impact of a five-year space mission environment on fault-tolerant parallel processor architectures is examined. The target application is a Strategic Defense Initiative (SDI) satellite requiring 256 parallel processors to provide the computation throughput. The reliability requirements are that the system still be operational after five years with 0.99 probability and that the probability of system failure during one-half hour of full operation be less than 10^-7. The fault tolerance features an architecture must possess to meet these reliability requirements are presented, many potential architectures are briefly evaluated, and one candidate architecture, the Charles Stark Draper Laboratory's Fault-Tolerant Parallel Processor (FTPP), is evaluated in detail. A methodology for designing a preliminary system configuration to meet the reliability and performance requirements of the mission is then presented and demonstrated by designing an FTPP configuration.
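The stated requirement (256 working processors with 0.99 probability after five years) makes the value of spare processors easy to see with a simple k-of-n binomial model. The per-processor five-year reliability of 0.98 below is an assumed illustrative figure, not one from the paper:

```python
from math import comb

def prob_at_least_k(n, k, r):
    """Probability that at least k of n independent units survive,
    given per-unit survival probability r (binomial model)."""
    return sum(comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))

# 256 processors required; assumed per-processor 5-year reliability of 0.98.
no_spares = prob_at_least_k(256, 256, 0.98)    # every processor must survive
with_spares = prob_at_least_k(270, 256, 0.98)  # 14 spares restore the margin
```

Without spares the system reliability is r^256, far below the target; a modest pool of spares, plus the fault-detection and reconfiguration machinery an architecture like the FTPP provides, recovers it.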

  10. Framework for Parallel Preprocessing of Microarray Data Using Hadoop

    PubMed Central

    2018-01-01

    Nowadays, microarray technology has become one of the popular ways to study gene expression and the diagnosis of disease. The National Center for Biotechnology Information (NCBI) hosts public databases containing large volumes of biological data that must be preprocessed, since they carry high levels of noise and bias. Robust Multiarray Average (RMA) is one of the standard and popular methods utilized to preprocess the data and remove the noise. Most preprocessing algorithms are time-consuming and unable to handle a large number of datasets with thousands of experiments. Parallel processing can be used to address these issues. Hadoop is a well-known distributed file system framework that provides a parallel environment in which to run such experiments. In this research, for the first time, the capability of Hadoop and the statistical power of R have been leveraged to parallelize the RMA preprocessing algorithm and efficiently process microarray data. The experiment was run on a cluster containing 5 nodes, each with 16 cores and 16 GB of memory. We compare the efficiency and performance of parallelized RMA using Hadoop with parallelized RMA using the affyPara package, as well as with sequential RMA. The results show that the speed-up of the proposed approach outperforms both the sequential and affyPara approaches. PMID:29796018
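The quantile-normalization step at the heart of RMA, which a distributed pipeline like the one above parallelizes, is simple to state: every array is replaced rank-for-rank by the mean of all sorted arrays. A pure-Python sketch of just that step, with none of the Hadoop or R machinery:

```python
def quantile_normalize(arrays):
    """Replace each value by the across-array mean of the values
    sharing its rank (the normalization step used in RMA)."""
    n = len(arrays[0])
    sorted_cols = [sorted(a) for a in arrays]
    # Reference distribution: mean of the i-th smallest values across arrays.
    ref = [sum(col[i] for col in sorted_cols) / len(arrays) for i in range(n)]
    result = []
    for a in arrays:
        order = sorted(range(n), key=lambda i: a[i])  # indices in ascending order
        normalized = [0.0] * n
        for rank, idx in enumerate(order):
            normalized[idx] = ref[rank]
        result.append(normalized)
    return result
```

Each array's ranking pass is independent and so parallelizes naturally; only the reference distribution requires a reduction across workers, which is what makes the step a good fit for a map-reduce framework such as Hadoop.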

  11. SPEEDES - A multiple-synchronization environment for parallel discrete-event simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeff S.

    1992-01-01

    Synchronous Parallel Environment for Emulation and Discrete-Event Simulation (SPEEDES) is a unified parallel simulation environment. It supports multiple-synchronization protocols without requiring users to recompile their code. When a SPEEDES simulation runs on one node, all the extra parallel overhead is removed automatically at run time. When the same executable runs in parallel, the user preselects the synchronization algorithm from a list of options. SPEEDES currently runs on UNIX networks and on the California Institute of Technology/Jet Propulsion Laboratory Mark III Hypercube. SPEEDES also supports interactive simulations. Featured in the SPEEDES environment is a new parallel synchronization approach called Breathing Time Buckets. This algorithm uses some of the conservative techniques found in Time Bucket synchronization, along with the optimism that characterizes the Time Warp approach. A mathematical model derived from first principles predicts the performance of Breathing Time Buckets. Along with the Breathing Time Buckets algorithm, this paper discusses the rules for processing events in SPEEDES, describes the implementation of various other synchronization protocols supported by SPEEDES, describes some new ones for the future, discusses interactive simulations, and then gives some performance results.

  12. A parallel orbital-updating based plane-wave basis method for electronic structure calculations

    NASA Astrophysics Data System (ADS)

    Pan, Yan; Dai, Xiaoying; de Gironcoli, Stefano; Gong, Xin-Gao; Rignanese, Gian-Marco; Zhou, Aihui

    2017-11-01

    Motivated by the recently proposed parallel orbital-updating approach in real space method [1], we propose a parallel orbital-updating based plane-wave basis method for electronic structure calculations, for solving the corresponding eigenvalue problems. In addition, we propose two new modified parallel orbital-updating methods. Compared to the traditional plane-wave methods, our methods allow for two-level parallelization, which is particularly interesting for large scale parallelization. Numerical experiments show that these new methods are more reliable and efficient for large scale calculations on modern supercomputers.

  13. A comparison of parallel and diverging screw angles in the stability of locked plate constructs.

    PubMed

    Wähnert, D; Windolf, M; Brianza, S; Rothstock, S; Radtke, R; Brighenti, V; Schwieger, K

    2011-09-01

    We investigated the static and cyclical strength of parallel and angulated locking plate screws using rigid polyurethane foam (0.32 g/cm³) and bovine cancellous bone blocks. Custom-made stainless steel plates with two conically threaded screw holes at different angulations (parallel, 10° and 20° divergent) and 5 mm self-tapping locking screws underwent pull-out and cyclical pull and bending tests. The bovine cancellous blocks were only subjected to static pull-out testing. We also performed finite element analysis for the static pull-out test of the parallel and 20° configurations. In both the foam model and the bovine cancellous bone we found the significantly highest pull-out force for the parallel constructs. In the finite element analysis there was 47% more damage in the 20° divergent constructs than in the parallel configuration. Under cyclical loading, the mean number of cycles to failure was significantly higher for the parallel group, followed by the 10° and 20° divergent configurations. In our laboratory setting we clearly showed the biomechanical disadvantage of a diverging locking screw angle under static and cyclical loading.

  14. Investigating Brittle Rock Failure and Associated Seismicity Using Laboratory Experiments and Numerical Simulations

    NASA Astrophysics Data System (ADS)

    Zhao, Qi

    The rock failure process is a complex phenomenon that involves elastic and plastic deformation, microscopic cracking, macroscopic fracturing, and frictional slipping of fractures. Understanding this complex behaviour has been the focus of a significant amount of research. In this work, the combined finite-discrete element method (FDEM) was first employed to study (1) the influence of rock discontinuities on hydraulic fracturing and associated seismicity and (2) the influence of in-situ stress on seismic behaviour. Simulated seismic events were analyzed using post-processing tools including the frequency-magnitude distribution (b-value), spatial fractal dimension (D-value), seismic rate, and fracture clustering. These simulations demonstrated that at the local scale, fractures tended to propagate following the rock mass discontinuities, while at the reservoir scale they developed in the direction parallel to the maximum in-situ stress. Moreover, the seismic signature (i.e., b-value, D-value, and seismic rate) can help to distinguish different phases of the failure process. The FDEM modelling technique and the developed analysis tools were then coupled with laboratory experiments to further investigate the different phases of the progressive rock failure process. Firstly, a uniaxial compression experiment, monitored using a time-lapse ultrasonic tomography method, was carried out and reproduced by the numerical model. Using this combination of technologies, the entire deformation and failure processes were studied at macroscopic and microscopic scales. The results not only illustrated the rock failure and seismic behaviours at different stress levels, but also suggested several precursory behaviours indicating the catastrophic failure of the rock. Secondly, rotary shear experiments were conducted using a newly developed rock physics experimental apparatus (ERDμ-T) paired with X-ray micro-computed tomography (μCT). This combination of technologies has significant advantages over conventional rotary shear experiments, since it allows for the direct observation of how two rough surfaces interact and deform without perturbing the experimental conditions. Some intriguing observations were made pertaining to key areas of the study of fault evolution, making possible a more comprehensive interpretation of frictional sliding behaviour. Lastly, a carefully calibrated FDEM model built on the rotary experiment was utilized to investigate facets that the experiment could not resolve, for example the time-continuous stress condition and the seismic activity on the shear surface. The model reproduced the mechanical behaviour observed in the laboratory experiment, shedding light on the understanding of fault evolution.
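Of the seismic statistics named in the record above, the b-value is the easiest to compute from a catalogue of event magnitudes. A sketch using the standard Aki maximum-likelihood estimator; the thesis does not state which estimator was used, so this is illustrative:

```python
import math

def b_value_mle(magnitudes, mc):
    """Aki (1965) maximum-likelihood estimate of the Gutenberg-Richter
    b-value, using only events at or above the completeness magnitude mc."""
    events = [m for m in magnitudes if m >= mc]
    mean_mag = sum(events) / len(events)
    return math.log10(math.e) / (mean_mag - mc)
```

A falling b-value during loading, i.e. proportionally more large events, is one of the precursory signatures commonly tracked in progressive-failure studies of this kind.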

  15. Research of influence of open-winding faults on properties of brushless permanent magnets motor

    NASA Astrophysics Data System (ADS)

    Bogusz, Piotr; Korkosz, Mariusz; Powrózek, Adam; Prokop, Jan; Wygonik, Piotr

    2017-12-01

    The paper presents an analysis of the influence of selected fault states on the properties of a brushless DC motor with permanent magnets. The subject of study was a BLDC motor designed by the authors for an unmanned aerial vehicle hybrid drive. Four parallel branches per phase were provided in the discussed 3-phase motor. After an open-winding fault in one or a few parallel branches, operation of the motor can be continued. Waveforms of currents, voltages, and electromagnetic torque were determined in the discussed fault states based on the developed mathematical and simulation models. Laboratory test results concerning the influence of open-winding faults in parallel branches on the properties of the BLDC motor are presented.

  16. Experimental characterization of a binary actuated parallel manipulator

    NASA Astrophysics Data System (ADS)

    Giuseppe, Carbone

    2016-05-01

    This paper describes the BAPAMAN (Binary Actuated Parallel MANipulator) series of parallel manipulators that has been conceived at the Laboratory of Robotics and Mechatronics (LARM). The basic common characteristics of the BAPAMAN series are described. In particular, the use of a reduced number of active degrees of freedom and of design solutions with flexural joints and Shape Memory Alloy (SMA) actuators is outlined, aimed at miniaturization, cost reduction, and easy operation. Given the peculiarities of the BAPAMAN architecture, specific experimental tests have been proposed and carried out with the aim of validating the proposed design and evaluating the practical operation performance and characteristics of a built prototype, in particular in terms of operation and workspace characteristics.

  17. Eigensolver for a Sparse, Large Hermitian Matrix

    NASA Technical Reports Server (NTRS)

    Tisdale, E. Robert; Oyafuso, Fabiano; Klimeck, Gerhard; Brown, R. Chris

    2003-01-01

    A parallel-processing computer program finds a few eigenvalues in a sparse Hermitian matrix that contains as many as 100 million diagonal elements. This program finds the eigenvalues faster, using less memory, than do other, comparable eigensolver programs. This program implements a Lanczos algorithm in the American National Standards Institute/ International Organization for Standardization (ANSI/ISO) C computing language, using the Message Passing Interface (MPI) standard to complement an eigensolver in PARPACK. [PARPACK (Parallel Arnoldi Package) is an extension, to parallel-processing computer architectures, of ARPACK (Arnoldi Package), which is a collection of Fortran 77 subroutines that solve large-scale eigenvalue problems.] The eigensolver runs on Beowulf clusters of computers at the Jet Propulsion Laboratory (JPL).

  18. Parallel gene analysis with allele-specific padlock probes and tag microarrays

    PubMed Central

    Banér, Johan; Isaksson, Anders; Waldenström, Erik; Jarvius, Jonas; Landegren, Ulf; Nilsson, Mats

    2003-01-01

    Parallel, highly specific analysis methods are required to take advantage of the extensive information about DNA sequence variation and of expressed sequences. We present a scalable laboratory technique suitable to analyze numerous target sequences in multiplexed assays. Sets of padlock probes were applied to analyze single nucleotide variation directly in total genomic DNA or cDNA for parallel genotyping or gene expression analysis. All reacted probes were then co-amplified and identified by hybridization to a standard tag oligonucleotide array. The technique was illustrated by analyzing normal and pathogenic variation within the Wilson disease-related ATP7B gene, both at the level of DNA and RNA, using allele-specific padlock probes. PMID:12930977

  19. Parallel Unsteady Overset Mesh Methodology for Adaptive and Moving Grids with Multiple Solvers

    DTIC Science & Technology

    2010-01-01

    Sitaraman, Jayanarayanan (National Institute of Aerospace, Hampton, Virginia); Army Research Laboratory, Hampton, Virginia. This paper describes a new parallel unsteady overset mesh methodology for adaptive and moving grids with multiple solvers. Good linear scalability was observed for all three test cases in the results section (Sections 3.5 and 3.6) up to 12 processors; beyond that the scalability drops off.

  20. Axisymmetric magnetorotational instability in ideal and viscous laboratory plasmas

    NASA Astrophysics Data System (ADS)

    Mikhailovskii, A. B.; Lominadze, J. G.; Churikov, A. P.; Erokhin, N. N.; Pustovitov, V. D.; Konovalov, S. V.

    2008-10-01

    The original analysis of the axisymmetric magnetorotational instability (MRI) by Velikhov (Sov. Phys. JETP 9, 995 (1959)) and Chandrasekhar (Proc. Nat. Acad. Sci. 46, 253 (1960)), applied to an ideally conducting magnetized medium under laboratory conditions and restricted to the incompressible approximation, is extended by allowing for compressibility. Two driving mechanisms of the MRI are thereby revealed in addition to the standard drive due to the negative gradient of the medium rotation frequency (the Velikhov effect): one due to the squared medium pressure gradient and another due to a combined effect of the pressure and density gradients. For laboratory applications, the expression for the MRI boundary incorporating all the above driving mechanisms and the stabilizing magnetoacoustic effect is derived. The effects of parallel and perpendicular viscosities on the MRI in laboratory plasmas are investigated. It is shown that, for strong viscosity, there is a family of MRIs driven under the same condition as the ideal one. The presence of strong viscosity also leads to an additional family of instabilities, called the viscosity-driven MRI. The parallel-viscosity-driven MRI appears as an overstability (oscillatory instability) possessing both a growth rate and a real oscillation frequency, while the perpendicular-viscosity MRI is an aperiodic instability.

  1. Clinical validation of the 50 gene AmpliSeq Cancer Panel V2 for use on a next generation sequencing platform using formalin fixed, paraffin embedded and fine needle aspiration tumour specimens.

    PubMed

    Rathi, Vivek; Wright, Gavin; Constantin, Diana; Chang, Siok; Pham, Huong; Jones, Kerryn; Palios, Atha; Mclachlan, Sue-Anne; Conron, Matthew; McKelvie, Penny; Williams, Richard

    2017-01-01

    The advent of massively parallel sequencing has caused a paradigm shift in the way cancer is treated, as personalised therapy becomes a reality. More and more laboratories are looking to introduce next generation sequencing (NGS) as a tool for mutational analysis, as this technology has many advantages compared to conventional platforms like Sanger sequencing. In Australia, all massively parallel sequencing platforms are still considered in-house in vitro diagnostic tools by the National Association of Testing Authorities (NATA), and a comprehensive analytical validation of all assays, not mere verification, is a strict requirement before accreditation can be granted for clinical testing on these platforms. Analytical validation of assays on NGS platforms can prove extremely challenging for pathology laboratories: although many affordable and easily accessible NGS instruments are available, there are as yet no standardised guidelines for the clinical validation of NGS assays. We present an accreditation development procedure that was both comprehensive and applicable in a hospital laboratory setting for NGS services. This approach may also be applied to other NGS applications in service laboratories. Copyright © 2016 Royal College of Pathologists of Australasia. Published by Elsevier B.V. All rights reserved.

  2. Parallel processing and expert systems

    NASA Technical Reports Server (NTRS)

    Lau, Sonie; Yan, Jerry C.

    1991-01-01

    Whether it be monitoring the thermal subsystem of Space Station Freedom, or controlling the navigation of the autonomous rover on Mars, NASA missions in the 1990s cannot enjoy an increased level of autonomy without the efficient implementation of expert systems. Merely increasing the computational speed of uniprocessors may not be able to guarantee that real-time demands are met for larger systems. Speedup via parallel processing must be pursued alongside the optimization of sequential implementations. Prototypes of parallel expert systems have been built at universities and industrial laboratories in the U.S. and Japan. The state-of-the-art research in progress related to parallel execution of expert systems is surveyed. The survey discusses multiprocessors for expert systems, parallel languages for symbolic computations, and mapping expert systems to multiprocessors. Results to date indicate that the parallelism achieved for these systems is small. The main reasons are (1) the body of knowledge applicable in any given situation and the amount of computation executed by each rule firing are small, (2) dividing the problem solving process into relatively independent partitions is difficult, and (3) implementation decisions that enable expert systems to be incrementally refined hamper compile-time optimization. In order to obtain greater speedups, data parallelism and application parallelism must be exploited.

  3. Xyce parallel electronic simulator users guide, version 6.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state of the art in the following areas: the capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase: a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  4. Xyce parallel electronic simulator users' guide, Version 6.0.1.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state of the art in the following areas: the capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase: a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  5. Xyce parallel electronic simulator users' guide, version 6.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase, a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms. These include serial, shared-memory and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  6. A compositional reservoir simulator on distributed memory parallel computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rame, M.; Delshad, M.

    1995-12-31

    This paper presents the application of distributed-memory parallel computers to field-scale reservoir simulations using a parallel version of UTCHEM, The University of Texas Chemical Flooding Simulator. The model is a general-purpose, highly vectorized chemical compositional simulator that can simulate a wide range of displacement processes at both field and laboratory scales. The original simulator was modified to run on both distributed-memory parallel machines (Intel iPSC/860 and Delta, Connection Machine 5, Kendall Square 1 and 2, and CRAY T3D) and a cluster of workstations. A domain decomposition approach has been taken towards parallelization of the code. A portion of the discrete reservoir model is assigned to each processor by a set-up routine that attempts a data layout as even as possible from the load-balance standpoint. Each of these subdomains is extended so that data can be shared between adjacent processors for stencil computation. The added routines that make parallel execution possible are written in a modular fashion that makes porting to new parallel platforms straightforward. Results of the distributed-memory computing performance of the parallel simulator are presented for field-scale applications such as a tracer flood and a polymer flood. A comparison with the wall-clock times for the same problems on a vector supercomputer is also presented.
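
    The ghost-cell extension of subdomains described above can be sketched in a few lines (a hypothetical 1-D illustration with a 3-point averaging stencil, not UTCHEM's actual routines):

```python
import numpy as np

def decompose(field, nproc, halo=1):
    """Split a 1-D field into per-processor subdomains, each padded
    with `halo` ghost cells copied from its neighbours (zeros at the
    physical boundaries)."""
    chunks = np.array_split(field, nproc)
    padded = []
    for i, c in enumerate(chunks):
        left = chunks[i - 1][-halo:] if i > 0 else np.zeros(halo)
        right = chunks[i + 1][:halo] if i < nproc - 1 else np.zeros(halo)
        padded.append(np.concatenate([left, c, right]))
    return padded

def stencil_step(sub):
    """Apply a 3-point averaging stencil to the interior of a padded
    subdomain; the ghost cells supply the neighbour data."""
    return (sub[:-2] + sub[1:-1] + sub[2:]) / 3.0

field = np.arange(8, dtype=float)
parts = decompose(field, nproc=2)           # two subdomains of 4 cells + halos
updated = [stencil_step(p) for p in parts]  # each processor updates its own cells
```

    In the real simulator the halo copy would be a message-passing exchange between neighbouring processors rather than an in-memory slice, but the data layout is the same.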

  7. Parallel-plate transmission line type of EMP simulators: Systematic review and recommendations

    NASA Astrophysics Data System (ADS)

    Giri, D. V.; Liu, T. K.; Tesche, F. M.; King, R. W. P.

    1980-05-01

    This report presents various aspects of the two-parallel-plate transmission line type of EMP simulator. Much of the work is the result of research efforts conducted during the last two decades at the Air Force Weapons Laboratory, as well as in industry and universities. The principal features of individual simulator components are discussed. The report also emphasizes that it is imperative to combine our understanding of the individual components so that meaningful conclusions can be drawn about the performance of the simulator as a whole.

  8. Performance Evaluation of Parallel Algorithms and Architectures in Concurrent Multiprocessor Systems

    DTIC Science & Technology

    1988-09-01

    HEP and Other Parallel Processors, Report no. ANL-83-97, Argonne National Laboratory, Argonne, Ill., 1983. [19] Davidson, G. S., A Practical Paradigm for... IEEE Comp. Soc., 1986. [24] Peir, Jih-kwon, and D. Gajski, "CAMP: A Programming Aide For Multiprocessors," Proc. 1986 ICPP, IEEE Comp. Soc., pp. 475-482. [25] Pfister, G. F., and V. A. Norton, "Hot Spot Contention and Combining in Multistage Interconnection Networks," IEEE Trans. Comp., C-34, Oct

  9. Final report on EURAMET.L-S21: `Supplementary comparison of parallel thread gauges'

    NASA Astrophysics Data System (ADS)

    Mudronja, Vedran; Šimunovic, Vedran; Acko, Bojan; Matus, Michael; Bánréti, Edit; István, Dicso; Thalmann, Rudolf; Lassila, Antti; Lillepea, Lauri; Bartolo Picotto, Gian; Bellotti, Roberto; Pometto, Marco; Ganioglu, Okhan; Meral, Ilker; Salgado, José Antonio; Georges, Vailleau

    2015-01-01

    The results of the comparison of parallel thread gauges between ten European countries are presented. Three thread plugs and three thread rings were calibrated in one loop. The Croatian National Laboratory for Length (HMI/FSB-LPMD) acted as the coordinator and pilot laboratory of the comparison. Thread angle, thread pitch, simple pitch diameter and pitch diameter were measured. Pitch diameters were calibrated within the 1a, 2a, 1b and 2b calibration categories in accordance with the EURAMET cg-10 calibration guide. The good agreement between the measurement results, and the differences due to the different calibration categories, are analysed in this paper. This was the first EURAMET comparison of parallel thread gauges based on the EURAMET cg-10 calibration guide, and it is a step towards the harmonisation of future comparisons with the registration of CMC values for thread gauges. Main text. To reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCL, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).

  10. A comparison of refuse attenuation in laboratory and field scale lysimeters.

    PubMed

    Youcai, Zhao; Luochun, Wang; Renhua, Hua; Dimin, Xu; Guowei, Gu

    2002-01-01

    For this study, small and middle scale laboratory lysimeters and a large scale field lysimeter in situ in the Shanghai Refuse Landfill, with refuse weights of 187, 600 and 10,800,000 kg, respectively, were created. These lysimeters are compared in terms of leachate quality (pH and concentrations of COD, BOD and NH3-N), refuse composition (biodegradable matter and volatile solids) and surface settlement over a monitoring period of 0-300 days. The objectives of this study were to explore both the similarities and disparities between laboratory and field scale lysimeters, and to compare the degradation behaviors of refuse at the intensive reaction phase in the different scale lysimeters. Quantitative relationships of leachate quality and refuse composition with placement time show that the degradation behavior of refuse depends heavily on the scale of the lysimeter and the parameters of concern, especially in the starting period of 0-6 months. However, some similarities exist between laboratory and field lysimeters after 4-6 months of placement, because COD and BOD concentrations in leachate in the field lysimeter decrease regularly in a pattern parallel to those in the laboratory lysimeters. NH3-N, volatile solids (VS) and biodegradable matter (BDM) also gradually decrease in parallel in this intensive reaction phase for lysimeters of all scales as the refuse ages. Though the actual values differ among the different scale lysimeters, laboratory lysimeters of sufficient scale are basically applicable for a rough simulation of a real landfill, especially for illustrating the degradation pattern and mechanism. Settlement of the refuse surface is roughly proportional to the initial refuse height.

  11. Fast parallel algorithm for slicing STL based on pipeline

    NASA Astrophysics Data System (ADS)

    Ma, Xulong; Lin, Feng; Yao, Bo

    2016-05-01

    In the Additive Manufacturing field, current research on data processing mainly focuses on the slicing of large STL files or complicated CAD models. To improve efficiency and reduce slicing time, a parallel algorithm has great advantages. However, traditional algorithms cannot make full use of multi-core CPU hardware resources. In this paper, a fast parallel algorithm is presented to speed up data processing. A pipeline mode is adopted to design the parallel algorithm, and the complexity of the pipeline algorithm is analyzed theoretically. To evaluate the performance of the new algorithm, the effects of thread count and layer count are investigated in a series of experiments. The experimental results show that thread count and layer count are two significant factors in the speedup ratio. The trend of speedup versus thread count shows a positive relationship that agrees well with Amdahl's law, and the trend of speedup versus layer count also shows a positive relationship, in agreement with Gustafson's law. The new algorithm uses topological information to compute contours in parallel. Another parallel algorithm, based on data parallelism, is used in the experiments to show that the pipeline parallel mode is more efficient. A final case study demonstrates the performance of the new parallel algorithm. Compared with the serial slicing algorithm, the new pipeline parallel algorithm makes full use of multi-core CPU hardware and accelerates the slicing process; compared with the data-parallel slicing algorithm, it achieves a much higher speedup ratio and efficiency.
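
    The two scaling laws invoked by the abstract are easy to state numerically; the following sketch (with an assumed 90% parallelizable fraction, purely illustrative) contrasts the fixed-size speedup bound of Amdahl's law with the scaled speedup of Gustafson's law:

```python
def amdahl_speedup(p, n):
    """Amdahl's law: speedup on n threads for a fixed-size problem
    when fraction p of the work is parallelizable."""
    return 1.0 / ((1.0 - p) + p / n)

def gustafson_speedup(p, n):
    """Gustafson's law: scaled speedup when the problem size grows
    with the thread count (more layers as threads are added)."""
    return (1.0 - p) + p * n

# With 90% of the slicing work parallelizable, on 8 threads:
s_amdahl = amdahl_speedup(0.9, 8)        # about 4.71x, bounded by 1/(1-p) = 10x
s_gustafson = gustafson_speedup(0.9, 8)  # 7.3x, grows without bound
```

    This mirrors the paper's observation: for a fixed model the speedup saturates with thread count, while adding layers (a larger problem) keeps the pipeline stages busy and the speedup climbing.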

  12. Parallel Implementation of a High Order Implicit Collocation Method for the Heat Equation

    NASA Technical Reports Server (NTRS)

    Kouatchou, Jules; Halem, Milton (Technical Monitor)

    2000-01-01

    We combine a high order compact finite difference approximation and collocation techniques to numerically solve the two dimensional heat equation. The resulting method is implicit and can be parallelized with a strategy that allows parallelization across both time and space. We compare the parallel implementation of the new method with a classical implicit method, namely the Crank-Nicolson method, where the parallelization is done across space only. Numerical experiments are carried out on the SGI Origin 2000.
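
    As a reference point for the comparison above, a minimal Crank-Nicolson step for the 1-D heat equation can be written as follows (a simplified stand-in; the paper treats the 2-D equation and a compact collocation scheme):

```python
import numpy as np

def crank_nicolson_step(u, r):
    """One Crank-Nicolson step for u_t = u_xx on the interior points of
    a 1-D grid with zero Dirichlet boundaries; r = dt / dx**2.
    Solves (I + r/2 T) u_new = (I - r/2 T) u with T the usual
    tridiagonal second-difference matrix."""
    n = len(u)
    A = np.eye(n) * (1 + r)
    B = np.eye(n) * (1 - r)
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = -r / 2
        B[i, i + 1] = B[i + 1, i] = r / 2
    return np.linalg.solve(A, B @ u)

# sin(pi x) initial profile on the interior of [0, 1]; it should decay smoothly
u0 = np.sin(np.pi * np.linspace(0.0, 1.0, 21)[1:-1])
u1 = crank_nicolson_step(u0, r=0.5)
```

    The implicit solve at each time level is what makes the method unconditionally stable, and also what the paper parallelizes across space only, in contrast to its new space-time strategy.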

  13. Implementing and measuring the level of laboratory service integration in a program setting in Nigeria.

    PubMed

    Mbah, Henry; Negedu-Momoh, Olubunmi Ruth; Adedokun, Oluwasanmi; Ikani, Patrick Anibbe; Balogun, Oluseyi; Sanwo, Olusola; Ochei, Kingsley; Ekanem, Maurice; Torpey, Kwasi

    2014-01-01

    The surge of donor funds to fight the HIV/AIDS epidemic inadvertently resulted in the setup of laboratories as parallel structures to rapidly respond to the identified need. However, these parallel structures are a threat to the existing fragile laboratory systems. Laboratory service integration is critical to remedy this situation. This paper describes an approach to quantitatively measure and track the integration of HIV-related laboratory services into mainstream laboratory services, and highlights some key intervention steps taken to enhance service integration. A quantitative before-and-after study was conducted in 122 Family Health International (FHI360)-supported health facilities across Nigeria. A minimum service package was identified, covering management structure; trainings; equipment utilization and maintenance; and information, commodity and quality management for laboratory integration. A checklist was used to assess facilities at baseline and at 3 months follow-up. The level of integration was assessed on an ordinal scale (0 = no integration, 1 = partial integration, 2 = full integration) for each service package. A composite score, expressed as a percentage of the total obtainable score of 14, was defined and used to classify facilities (≥ 80% FULL, 25% to 79% PARTIAL and <25% NO integration). Weaknesses were noted and addressed. We analyzed 9 (7.4%) primary, 104 (85.2%) secondary and 9 (7.4%) tertiary level facilities. There were statistically significant differences in integration levels between baseline and the 3-month follow-up period (p<0.01). The baseline median total integration score was 4 (IQR 3 to 5), compared to 7 (IQR 4 to 9) at 3 months follow-up (p = 0.000). Partially and fully integrated laboratory systems numbered 64 (52.5%) and 0 (0.0%) at baseline, compared to 100 (82.0%) and 3 (2.4%) respectively at 3 months follow-up (p = 0.000). This project showcases our novel approach to measuring the status of each laboratory on the integration continuum.
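
    The composite grading described above can be sketched as follows (hypothetical helper, assuming seven service packages each scored 0-2 for a maximum of 14, and the thresholds stated in the abstract):

```python
def classify_integration(package_scores, max_score=14):
    """Classify a facility from per-package ordinal scores
    (0 = no, 1 = partial, 2 = full integration): the composite
    score as a percentage of max_score is graded as
    >= 80% FULL, 25-79% PARTIAL, < 25% NONE."""
    pct = 100.0 * sum(package_scores) / max_score
    if pct >= 80:
        return "FULL"
    if pct >= 25:
        return "PARTIAL"
    return "NONE"

# A facility at the baseline median total score of 4 (e.g. four packages
# partially integrated) lands in the PARTIAL band: 4/14 is about 29%.
baseline = classify_integration([1, 1, 1, 1, 0, 0, 0])
```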

  14. Educating Laboratory Science Learners at a Distance Using Interactive Television

    ERIC Educational Resources Information Center

    Reddy, Christopher

    2014-01-01

    Laboratory science classes offered to students learning at a distance require a methodology that allows for the completion of tactile activities. Literature describes three different methods of solving the distance laboratory dilemma: kit-based laboratory experience, computer-based laboratory experience, and campus-based laboratory experience,…

  15. Xyce Parallel Electronic Simulator : users' guide, version 2.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoekstra, Robert John; Waters, Lon J.; Rankin, Eric Lamont

    2004-06-01

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator capable of simulating electrical circuits at a variety of abstraction levels. Primarily, Xyce has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; improved performance for all numerical kernels (e.g., time integrator, nonlinear and linear solvers) through state-of-the-art algorithms and novel techniques; device models which are specifically tailored to meet Sandia's needs, including many radiation-aware devices; a client-server or multi-tiered operating model wherein the numerical kernel can operate independently of the graphical user interface (GUI); and object-oriented code design and implementation using modern coding practices that ensure that the Xyce Parallel Electronic Simulator will be maintainable and extensible far into the future. Xyce is a parallel code in the most general sense of the phrase, a message-passing parallel implementation, which allows it to run efficiently on the widest possible number of computing platforms. These include serial, shared-memory and distributed-memory parallel as well as heterogeneous platforms. Careful attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. One feature required by designers is the ability to add device models, many specific to the needs of Sandia, to the code. To this end, the device package in the Xyce Parallel Electronic Simulator is designed to support a variety of device model inputs. These input formats include standard analytical models, behavioral models, look-up tables, and mesh-level PDE device models. Combined with this flexible interface is an architectural design that greatly simplifies the addition of circuit models. One of the most important features of Xyce is in providing a platform for computational research and development aimed specifically at the needs of the Laboratory. With Xyce, Sandia now has an 'in-house' capability with which both new electrical (e.g., device model development) and algorithmic (e.g., faster time-integration methods) research and development can be performed. Ultimately, these capabilities are migrated to end users.

  16. A laboratory model for solidification of Earth's core

    NASA Astrophysics Data System (ADS)

    Bergman, Michael I.; Macleod-Silberstein, Marget; Haskel, Michael; Chandler, Benjamin; Akpan, Nsikan

    2005-11-01

    To better understand the influence of rotating convection in the outer core on the solidification of the inner core we have constructed a laboratory model for solidification of Earth's core. The model consists of a 15 cm radius hemispherical acrylic tank concentric with a 5 cm radius hemispherical aluminum heat exchanger that serves as the incipient inner core onto which we freeze ice from salt water. Long exposure photographs of neutrally buoyant particles in illuminated planes suggest reduction of flow parallel to the rotation axis. Thermistors in the tank near the heat exchanger show that in experiments with rotation the temperature near the pole is lower than near the equator, unlike for control experiments without rotation or with a polymer that increases the fluid viscosity. The photographs and thermistors suggest that our observation that ice grows faster near the pole than near the equator for experiments with rotation is a result of colder water not readily convecting away from the pole. Because of the reversal of the thermal gradient, we expect faster equatorial solidification in the Earth's core. Such anisotropy in solidification has been suggested as a cause of inner core elastic (and attenuation) anisotropy, though the plausibility of this suggestion will depend on the core Nusselt number and the slope of the liquidus, and the effects of post-solidification deformation. Previous experiments on hexagonal close-packed alloys such as sea ice and zinc-tin have shown that fluid flow in the melt can result in a solidification texture transverse to the solidification direction, with the texture depending on the nature of the flow. A comparison of the visualized flow and the texture of columnar ice crystals in thin sections from these experiments confirms flow-induced transverse textures. 
This suggests that the convective pattern at the base of the outer core is recorded in the texture of the inner core, and that outer core convection might contribute to the complexity in the seismically inferred pattern of anisotropy in the Earth's inner core.

  17. A survey of pulse shape options for a revised plastic ablator ignition design

    NASA Astrophysics Data System (ADS)

    Clark, Daniel; Eder, David; Haan, Steven; Hinkel, Denise; Jones, Ogden; Marinak, Michael; Milovich, Jose; Peterson, Jayson; Robey, Harold; Salmonson, Jay; Smalyuk, Vladimir; Weber, Christopher

    2014-10-01

    Recent experimental results using the "high foot" pulse shape on the National Ignition Facility (NIF) have shown encouraging progress compared to earlier "low foot" experiments. These results strongly suggest that controlling ablation front instability growth can dramatically improve implosion performance, even in the presence of persistent, large, low-mode distortions. In parallel, Hydro. Growth Radiography experiments have so far validated the techniques used for modeling ablation front growth in NIF experiments. It is timely then to combine these two results and ask how current ignition pulse shapes could be modified so as to improve implosion performance, namely fuel compressibility, while maintaining the stability properties demonstrated with the high foot. This talk presents a survey of pulse shapes intermediate between the low and high foot extremes in search of a more optimal design. From the database of pulse shapes surveyed, a higher-picket version of the original low foot pulse shape shows the most promise for improved compression without loss of stability. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  18. Thermal barrier coating life prediction model development

    NASA Technical Reports Server (NTRS)

    Demasi, J. T.

    1986-01-01

    A methodology is established to predict thermal barrier coating life in an environment similar to that experienced by gas turbine airfoils. Experiments were conducted to determine failure modes of the thermal barrier coating. Analytical studies were employed to derive a life prediction model. A review of experimental and flight service components, as well as laboratory post-test evaluations, indicates that the predominant mode of TBC failure involves thermomechanical spallation of the ceramic coating layer. This ceramic spallation involves the formation of a dominant crack in the ceramic coating parallel to and closely adjacent to the topologically complex metal-ceramic interface. This mechanical failure mode is clearly influenced by thermal exposure effects, as shown in experiments conducted to study thermal pre-exposure and thermal cycle-rate effects. The preliminary life prediction model developed focuses on the two major damage modes identified in the critical experiments tasks. The first of these involves a mechanical driving force, resulting from cyclic strains and stresses caused by thermally induced and externally imposed mechanical loads. The second is an environmental driving force based on experimental results, and is believed to be related to bond coat oxidation. It is also believed that the growth of this oxide scale influences the intensity of the mechanical driving force.

  19. Bubble-detector measurements of neutron radiation in the international space station: ISS-34 to ISS-37

    PubMed Central

    Smith, M. B.; Khulapko, S.; Andrews, H. R.; Arkhangelsky, V.; Ing, H.; Koslowksy, M. R.; Lewis, B. J.; Machrafi, R.; Nikolaev, I.; Shurshakov, V.

    2016-01-01

    Bubble detectors have been used to characterise the neutron dose and energy spectrum in several modules of the International Space Station (ISS) as part of an ongoing radiation survey. A series of experiments was performed during the ISS-34, ISS-35, ISS-36 and ISS-37 missions between December 2012 and October 2013. The Radi-N2 experiment, a repeat of the 2009 Radi-N investigation, included measurements in four modules of the US orbital segment: Columbus, the Japanese experiment module, the US laboratory and Node 2. The Radi-N2 dose and spectral measurements are not significantly different from the Radi-N results collected in the same ISS locations, despite the large difference in solar activity between 2009 and 2013. Parallel experiments using a second set of detectors in the Russian segment of the ISS included the first characterisation of the neutron spectrum inside the tissue-equivalent Matroshka-R phantom. These data suggest that the dose inside the phantom is ∼70 % of the dose at its surface, while the spectrum inside the phantom contains a larger fraction of high-energy neutrons than the spectrum outside the phantom. The phantom results are supported by Monte Carlo simulations that provide good agreement with the empirical data. PMID:25899609

  20. Harmonisation of seven common enzyme results through EQA.

    PubMed

    Weykamp, Cas; Franck, Paul; Gunnewiek, Jacqueline Klein; de Jonge, Robert; Kuypers, Aldy; van Loon, Douwe; Steigstra, Herman; Cobbaert, Christa

    2014-11-01

    Equivalent results between different laboratories enable optimal patient care and can be achieved with harmonisation. We report on EQA-initiated national harmonisation of seven enzymes using commutable samples. EQA samples were prepared from human serum spiked with human recombinant enzymes. Target values were assigned with the IFCC Reference Measurement Procedures. The same samples were included on four occasions in the EQA programmes of 2012 and 2013. Laboratories were encouraged to report IFCC traceable results. A parallel study was done to confirm commutability of the samples. Of the 223 participating laboratories, 95% reported IFCC traceable results, ranging from 98% (ASAT) to 87% (amylase). Users of Roche and Siemens (97%) more frequently reported IFCC traceable results than users of Abbott (91%), Beckman (90%), and Olympus (87%). The success of harmonisation, expressed as the recovery of assigned values and the inter-laboratory CV, was: ALAT (recovery 100%; inter-lab CV 4%), ASAT (102%; 4%), LD (98%; 3%), CK (101%; 5%), GGT (98%; 4%), AP (96%; 6%), amylase (99%; 4%). There were no significant differences between the manufacturers. Commutability was demonstrated in the parallel study. Equal results for the same sample in the 2012 and 2013 EQA programmes demonstrated stability of the samples. The EQA-initiated national harmonisation of seven enzymes, using stable, commutable human serum samples, spiked with human recombinant enzymes, and targeted with the IFCC Reference Measurement Procedures, was successful in terms of implementation of IFCC traceable results (95%), recovery of the target (99%), and inter-laboratory CV (4%).
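
    The two success metrics reported, recovery of the assigned value and inter-laboratory CV, can be computed as in this sketch (the ALAT numbers are invented for illustration):

```python
import statistics

def recovery_pct(results, target):
    """Mean recovery of the assigned target value, in percent."""
    return 100.0 * statistics.mean(results) / target

def interlab_cv_pct(results):
    """Inter-laboratory coefficient of variation (sample SD over mean), in percent."""
    return 100.0 * statistics.stdev(results) / statistics.mean(results)

# Hypothetical ALAT results (U/L) from four laboratories, assigned target 100 U/L:
alat = [98.0, 101.0, 100.0, 103.0]
rec = recovery_pct(alat, 100.0)  # 100.5% recovery
cv = interlab_cv_pct(alat)       # about 2% inter-lab CV
```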

  1. Dockres: a computer program that analyzes the output of virtual screening of small molecules

    PubMed Central

    2010-01-01

    Background This paper describes a computer program named Dockres that is designed to analyze and summarize the results of virtual screening of small molecules. The program is supplemented with utilities that support the screening process. Foremost among these utilities are scripts that run the virtual screening of a chemical library on a large number of processors in parallel. Methods Dockres and some of its supporting utilities are written in Fortran-77; other utilities are written as C-shell scripts. They support the parallel execution of the screening. The current implementation of the program handles virtual screening with Autodock-3 and Autodock-4, but can be extended to work with the output of other programs. Results Analysis of virtual screening by Dockres led to both active and selective lead compounds. Conclusions Analysis of virtual screening was facilitated and enhanced by Dockres in the authors' laboratories as well as in laboratories elsewhere. PMID:20205801
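
    A parallel screening driver of the kind the supporting utilities provide can be sketched as follows (a hypothetical Python stand-in with a fake scoring function; the real utilities are C-shell scripts wrapping Autodock runs):

```python
from concurrent.futures import ThreadPoolExecutor

def dock_ligand(ligand):
    """Stand-in for one docking run. A real driver would launch an
    Autodock process for this ligand and parse its score; here we
    fabricate a score from the SMILES length purely for illustration."""
    return ligand, -0.1 * len(ligand)

def screen_library(ligands, workers=4):
    """Screen a ligand library in parallel and rank by score
    (most negative, i.e. best predicted binding, first)."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(dock_ligand, ligands))
    return sorted(results, key=lambda r: r[1])

hits = screen_library(["CCO", "CCCCO", "C1CCCCC1"])
```

    Threads (rather than processes) are a reasonable choice here because the real work happens in external docking processes, so the driver itself is I/O-bound.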

  2. Responses of mental health professionals to man-made trauma: the Israeli experience.

    PubMed

    Solomon, Z

    1996-09-01

    The reactions and responses of mental health professionals in areas of armed conflict are the focus of this paper. It examines the way the therapeutic community has dealt with the survivors of two catastrophes: the Holocaust and warfare. A parallel process of a gradual change of attitudes towards the survivors was observed: emotional detachment and lack of recognition in the early stages and, eventually, social acceptance and empathy. The origins of these attitudes are discussed, and three explanations are offered. Israel is a small, stress-ridden country that has known seven full-scale wars and countless hostilities during its 47 years of existence. Our national history over 2000 years has been beset with persecution, pogroms and deportations, culminating in the Nazi Holocaust. The establishment of the State of Israel brought with it the hope of a secure existence. Unfortunately, this has not been achieved, and Israel is a natural laboratory of war stress. Presented here is this author's analysis of the way Israeli society and the helping professions in Israel have dealt with two kinds of man-made catastrophic events: the Nazi Holocaust and seven Arab-Israeli wars. In these different events of human violence, a parallel process of a gradual change of attitude towards the survivors was observed. This remarkable parallel presents emotional detachment, lack of recognition and at times blaming of the victims in the early stages and, eventually, social acceptance and empathy. The process of social change becomes complex when the agents of change are themselves members of the social entity undergoing the change. This paper demonstrates that therapists and mental health planners had considerable difficulties in transcending public attitudes toward survivors of the Holocaust and psychiatric casualties of the Israeli-Arab conflict. As a result, they were unable to treat properly those injured by trauma until certain social changes took place. This paper submits that the Israeli experience is not isolated and limited to our part of the globe. It represents a general, universal process, from which parallel processes in other countries and in other man-made traumas can be drawn.

  3. Vipie: web pipeline for parallel characterization of viral populations from multiple NGS samples.

    PubMed

    Lin, Jake; Kramna, Lenka; Autio, Reija; Hyöty, Heikki; Nykter, Matti; Cinek, Ondrej

    2017-05-15

    Next generation sequencing (NGS) technology allows laboratories to investigate virome composition in clinical and environmental samples in a culture-independent way. There is a need for bioinformatic tools capable of parallel processing of virome sequencing data by exactly identical methods: this is especially important in studies of multifactorial diseases, or in parallel comparison of laboratory protocols. We have developed a web-based application allowing direct upload of sequences from multiple virome samples using custom parameters. The samples are then processed in parallel using an identical protocol, and can be easily reanalyzed. The pipeline performs de-novo assembly, taxonomic classification of viruses as well as sample analyses based on user-defined grouping categories. Tables of virus abundance are produced from cross-validation by remapping the sequencing reads to a union of all observed reference viruses. In addition, read sets and reports are created after processing unmapped reads against known human and bacterial ribosome references. Secured interactive results are dynamically plotted with population and diversity charts, clustered heatmaps and a sortable and searchable abundance table. The Vipie web application is a unique tool for multi-sample metagenomic analysis of viral data, producing searchable hits tables, interactive population maps, alpha diversity measures and clustered heatmaps that are grouped in applicable custom sample categories. Known references such as human genome and bacterial ribosomal genes are optionally removed from unmapped ('dark matter') reads. Secured results are accessible and shareable on modern browsers. Vipie is a freely available web-based tool whose code is open source.

  4. Integrated Optoelectronics for Parallel Microbioanalysis

    NASA Technical Reports Server (NTRS)

    Stirbl, Robert; Moynihan, Philip; Bearman, Gregory; Lane, Arthur

    2003-01-01

    Miniature, relatively inexpensive microbioanalytical systems ("laboratory-on-a-chip" devices) have been proposed for the detection of hazardous microbes and toxic chemicals. Each system of this type would include optoelectronic sensors and sensor-output-processing circuitry that would simultaneously look for the optical change, fluorescence, delayed fluorescence, or phosphorescence signatures from multiple redundant sites that have interacted with the test biomolecules, in order to detect which one(s) was present in a given situation. These systems could be used in a variety of settings, including doctors' offices, hospitals, hazardous-material laboratories, biological-research laboratories, military operations, and chemical-processing plants.

  5. Thermal-hydraulic posttest analysis for the ANL/MCTF 360° model heat-exchanger water test under mixed convection. [LMFBR]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, C.I.; Sha, W.T.; Kasza, K.E.

    As a result of the uncertainties in the understanding of the influence of thermal-buoyancy effects on the flow and heat transfer in Liquid Metal Fast Breeder Reactor heat exchangers and steam generators under off-normal operating conditions, an extensive experimental program is being conducted at Argonne National Laboratory to eliminate these uncertainties. Concurrently, a parallel analytical effort is also being pursued to develop a three-dimensional transient computer code (COMMIX-IHX) to study and predict heat exchanger performance under mixed, forced, and free convection conditions. This paper presents computational results from a heat exchanger simulation and compares them with the results from a test case exhibiting strong thermal buoyancy effects. Favorable agreement between experiment and code prediction is obtained.

  6. Catastrophic onset of fast magnetic reconnection with a guide field

    NASA Astrophysics Data System (ADS)

    Cassak, P. A.; Drake, J. F.; Shay, M. A.

    2007-05-01

    It was recently shown that the slow (collisional) Sweet-Parker and the fast (collisionless) Hall magnetic reconnection solutions simultaneously exist for a wide range of resistivities; reconnection is bistable [Cassak, Shay, and Drake, Phys. Rev. Lett., 95, 235002 (2005)]. When the thickness of the dissipation region becomes smaller than a critical value, the Sweet-Parker solution disappears and fast reconnection ensues, potentially explaining how large amounts of magnetic free energy can accrue without significant release before the onset of fast reconnection. Two-fluid numerical simulations extending the previous results for anti-parallel reconnection (where the critical thickness is the ion skin depth) to component reconnection with a large guide field (where the critical thickness is the thermal ion Larmor radius) are presented. Applications to laboratory experiments of magnetic reconnection and the sawtooth crash are discussed.
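The bistability picture above rests on a simple scaling comparison: the Sweet-Parker layer thickness, delta ~ L / sqrt(S) for a sheet of length L and Lundquist number S, against a critical thickness (the ion skin depth for anti-parallel reconnection, the thermal ion Larmor radius with a strong guide field). A minimal sketch of that comparison, with hypothetical function names and no claim to the paper's simulation code:

```python
import math

def sweet_parker_thickness(L, lundquist_S):
    """Sweet-Parker current-sheet thickness: delta ~ L / sqrt(S)."""
    return L / math.sqrt(lundquist_S)

def reconnection_regime(L, lundquist_S, critical_thickness):
    """Classify the regime by comparing the Sweet-Parker thickness to the
    critical thickness (ion skin depth or ion Larmor radius)."""
    delta = sweet_parker_thickness(L, lundquist_S)
    return "fast (Hall)" if delta < critical_thickness else "slow (Sweet-Parker)"
```

When resistivity drops (S grows), delta shrinks below the critical thickness and the slow solution disappears, which is the catastrophic onset described above.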

  7. NASA Bioculture System: From Experiment Definition to Flight Payload

    NASA Technical Reports Server (NTRS)

    Sato, Kevin Y.; Almeida, Eduardo; Austin, Edward M.

    2014-01-01

    Starting in 2015, the NASA Bioculture System will be available to the science community to conduct cell biology and microbiology experiments on ISS. The Bioculture System carries ten environmentally independent Cassettes, which house the experiments. The closed-loop fluid flow path subsystem in each Cassette provides a perfusion-based method for maintaining specimen cultures in a shear-free environment by using a biochamber based on porous hollow-fiber bioreactor technology. Each Cassette contains an incubator and a separate insulated refrigerator compartment for storage of media, samples, nutrients, and additives. The hardware is capable of fully automated or manual specimen culturing and processing, including in-flight experiment initiation, sampling and fixation, culturing of specimens up to BSL-2, and the ability to run up to 10 independent cultures in parallel for statistical analysis. The incubation and culturing of specimens in the Bioculture System is a departure from standard laboratory culturing methods. Therefore, it is critical that the PI understand the pre-flight testing required to use the Bioculture System successfully for an on-orbit experiment. Overall, the PI will conduct a series of ground tests to define flight experiment and on-orbit implementation requirements, verify biocompatibility, and determine baseline bioreactor conditions. The ground test processes for the utilization of the Bioculture System, from experiment selection to flight, will be reviewed. Also, pre-flight test schedules and the use of COTS ground test equipment (CellMax and FiberCell systems) and the Bioculture System will be discussed.

  8. Memory and visual search in naturalistic 2D and 3D environments

    PubMed Central

    Li, Chia-Ling; Aivar, M. Pilar; Kit, Dmitry M.; Tong, Matthew H.; Hayhoe, Mary M.

    2016-01-01

    The role of memory in guiding attention allocation in daily behaviors is not well understood. In experiments with two-dimensional (2D) images, there is mixed evidence about the importance of memory. Because the stimulus context in laboratory experiments and daily behaviors differs extensively, we investigated the role of memory in visual search, in both two-dimensional (2D) and three-dimensional (3D) environments. A 3D immersive virtual apartment composed of two rooms was created, and a parallel 2D visual search experiment composed of snapshots from the 3D environment was developed. Eye movements were tracked in both experiments. Repeated searches for geometric objects were performed to assess the role of spatial memory. Subsequently, subjects searched for realistic context objects to test for incidental learning. Our results show that subjects learned the room-target associations in 3D but less so in 2D. Gaze was increasingly restricted to relevant regions of the room with experience in both settings. Search for local contextual objects, however, was not facilitated by early experience. Incidental fixations to context objects do not necessarily benefit search performance. Together, these results demonstrate that memory for global aspects of the environment guides search by restricting allocation of attention to likely regions, whereas task relevance determines what is learned from the active search experience. Behaviors in 2D and 3D environments are comparable, although there is greater use of memory in 3D. PMID:27299769

  9. Experimental verification of the role of electron pressure in fast magnetic reconnection with a guide field

    DOE PAGES

    Fox, W.; Sciortino, F.; v. Stechow, A.; ...

    2017-03-21

    We report detailed laboratory observations of the structure of a reconnection current sheet in a two-fluid plasma regime with a guide magnetic field. We observe and quantitatively analyze the quadrupolar electron pressure variation in the ion-diffusion region, as originally predicted by extended magnetohydrodynamics simulations. The projection of the electron pressure gradient parallel to the magnetic field contributes significantly to balancing the parallel electric field, and the resulting cross-field electron jets in the reconnection layer are diamagnetic in origin. Furthermore, these results demonstrate how parallel and perpendicular force balance are coupled in guide field reconnection and confirm basic theoretical models of the importance of electron pressure gradients for obtaining fast magnetic reconnection.

  10. New cosmic rays experiments in the underground laboratory of IFIN-HH from Slanic Prahova, Romania

    NASA Astrophysics Data System (ADS)

    Mitrica, Bogdan; Stanca, Denis; Brancus, Iliana; Margineanu, Romul; Blebea-Apostu, Ana-Maria; Gomoiu, Claudia; Saftoiu, Alexandra; Toma, Gabriel; Rebel, Heinigerd; Haungs, Andreas; Sima, Octavian; Gherghel-Lascu, Alexandru; Niculescu-Oglinzanu, Mihai

    2015-02-01

    Since 2006, a modern laboratory has been developed by IFIN-HH in the underground of the Slanic Prahova salt ore. This work presents a short review of previous scientific activities performed in the underground laboratory, together with some plans for the future. A mobile detector for cosmic muon flux measurements has been set up at IFIN-HH, Romania. The device is used to measure the muon flux at different locations at the surface and underground; it consists of two detection layers, each one including four large scintillator plates. A new rotatable detector for measurements of the directional variation of the muon flux has been designed and is presently under preliminary tests. Built from four layers of sensitive material, it collects the signals and directs them to the micro PMTs with a new technique, using optical fibers instead of wavelength shifters, which allows easy discrimination of the arrival directions of muons. Combining the ability to rotate with these directional properties, the underground muon detector acts as a muon tomography device, able to scan, using cosmic muons, the rock material above the detector. In parallel, a new detection system based on SiPMs will also be installed in the following weeks. It is composed of four layers, each layer consisting of 4 scintillator plates, which we consider in the following as a module of detection. For this purpose, the first two scintillator layers, with the optical fibers positioned in perpendicular directions, are put in coincidence with two other layers, 1 m away from the first two, with a similar optical fiber arrangement, thus allowing reconstruction of the muon trajectory. It is also intended to design and construct an experimental device for the investigation of radio antennas and the behavior of the signal in rock salt at the Slanic salt mine in Romania.
    Another method to detect high-energy neutrinos is based on the detection of secondary particles resulting from their interaction with the salt massif. We intend to design and construct a 3D array in the underground of the Slanic Prahova salt ore.

  11. Pilot Non-Conformance to Alerting System Commands During Closely Spaced Parallel Approaches

    NASA Technical Reports Server (NTRS)

    Pritchett, Amy R.; Hansman, R. John

    1997-01-01

    Pilot non-conformance to alerting system commands has been noted in general and to a TCAS-like collision avoidance system in a previous experiment. This paper details two experiments studying collision avoidance during closely-spaced parallel approaches in instrument meteorological conditions (IMC), and specifically examining possible causal factors of, and design solutions to, pilot non-conformance.

  12. Prediction and validation of the energy dissipation of a friction damper

    NASA Astrophysics Data System (ADS)

    Lopez, I.; Nijmeijer, H.

    2009-12-01

    Friction dampers can be a cheap and efficient way to reduce the vibration levels of a wide range of mechanical systems. In the present work it is shown that the maximum energy dissipation, and the corresponding optimum friction force, of friction dampers with stiff localized contacts and large relative displacements within the contact can be determined with sufficient accuracy using a dry (Coulomb) friction model. Both the numerical calculations with more complex friction models and the experimental results in a laboratory test set-up show that these two quantities are relatively robust properties of a system with friction. The numerical calculations are performed with several friction models currently used in the literature. For the stick phase, smooth approximations such as viscous damping or the arctan function are considered, but the non-smooth switch friction model is also used. For the slip phase, several models of the Stribeck effect are used. The test set-up for the laboratory experiments consists of a mass sliding on parallel ball bearings, where additional friction is created by a sledge attached to the mass, which is pre-stressed against a friction plate. The measured energy dissipation is in good agreement with the theoretical results for Coulomb friction.
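As an illustration of the smooth-approximation approach mentioned above, the sketch below integrates a single-degree-of-freedom oscillator with an arctan-regularized Coulomb friction force and accumulates the dissipated energy. All parameter values and function names are hypothetical; this is not the authors' model:

```python
import numpy as np

def dissipated_energy(Ff, m=1.0, k=1.0, F0=1.0, w=0.8,
                      cycles=20, steps_per_cycle=2000, eps=1e-3):
    """Energy dissipated by a regularized Coulomb friction force in a
    forced mass-spring system: m*x'' + k*x = F0*cos(w*t) - friction(x')."""
    def friction(v):
        # smooth arctan approximation of Ff * sign(v) (stick-phase model)
        return Ff * (2.0 / np.pi) * np.arctan(v / eps)

    dt = (2.0 * np.pi / w) / steps_per_cycle
    x, v, t, E = 0.0, 0.0, 0.0, 0.0
    for _ in range(cycles * steps_per_cycle):
        a = (F0 * np.cos(w * t) - k * x - friction(v)) / m
        v += a * dt               # semi-implicit Euler step
        x += v * dt
        E += friction(v) * v * dt  # friction power is non-negative
        t += dt
    return E
```

Sweeping Ff in such a model reproduces the familiar interior maximum of dissipated energy versus friction force: zero friction dissipates nothing, and very large friction locks the contact.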

  13. Electrostatic solitary waves generated by beam injection in LAPD

    NASA Astrophysics Data System (ADS)

    Chen, L.; Gekelman, W. N.; Lefebvre, B.; Kintner, P. M.; Pickett, J. S.; Pribyl, P.; Vincena, S. T.

    2011-12-01

    Spacecraft data have revealed that electrostatic solitary waves are ubiquitous in non-equilibrium collisionless space plasmas. These solitary waves are often the main constituents of the observed electrostatic turbulence. The ubiquitous presence of these solitary waves in space motivated laboratory studies of their generation and evolution in the Large Plasma Device (LAPD) at UCLA. In order to observe these structures, microprobes with scale sizes on the order of the Debye length (30 microns) had to be built using MEMS technology. A suprathermal electron beam was injected into the afterglow plasma, and solitary waves as well as nonlinear wave packets were measured. The solitary waves are interpreted as BGK electron holes based on their width, amplitude, and velocity characteristics. The ensuing turbulence, including the solitary waves and wave packets, exhibits a banded dispersion relation with its central line consistent with the electrostatic whistler mode. One surprise from the laboratory experiments is that the electron holes were not generated through resonant two-stream instabilities, but likely through an instability due to parallel currents. The characteristics of the LAPD electron holes and those observed in space will be compared to motivate further theoretical, simulation, and experimental work.

  14. Dismantling of Highly Contaminated Process Installations of the German Reprocessing Facility (WAK) - Status of New Remote Handling Technology - 13287

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dux, Joachim; Friedrich, Daniel; Lutz, Werner

    2013-07-01

    Decommissioning and dismantling of the former German Pilot Reprocessing Plant Karlsruhe (WAK), including the Vitrification Facility (VEK), is being executed in different project steps related to the reprocessing, HLLW storage, and vitrification complexes /1/. While inside the reprocessing building the total inventory of process equipment has already been dismantled and disposed of, the HLLW storage and vitrification complex has been out of operation since vitrification and tank rinsing procedures were finalized in 2010. This paper describes the progress made in dismantling the shielded boxes of the highly contaminated laboratory as a precondition to gaining access to the hot cells of the HLLW storage. The major challenges in dismantling this laboratory were the high dose rates, up to 700 mSv/h, and the locking technology for the removal of the hot cell installations. In parallel, extensive prototype testing of different carrier systems and power manipulators to be applied in dismantling the HLLW tanks and other hot cell equipment is ongoing. First experiences with the new manipulator carrier system and a new master-slave manipulator with force reflection will be reported. (authors)

  15. A laboratory study of mean flow generation in rotating fluids by Reynolds stress gradients

    NASA Astrophysics Data System (ADS)

    McGuinness, D. S.; Boyer, D. L.; Fernando, H. J. S.

    2001-06-01

    Laboratory experiments were conducted that demonstrate that a mean azimuthal flow can be produced by introducing Reynolds stress gradients to a rotating fluid with zero initial mean flow. This mechanism may play a role in the generation of mean currents in coastal regions. The experiments entail the establishment of turbulence in a thin annular-shaped region centered within a cylindrical test cell through the use of a vertically oscillating grid. This region rests in a horizontal plane perpendicular to the vertical axis of the tank, and the entire system is placed on a turntable to simulate background rotation. Flow visualization techniques are used to depict qualitative features of the resulting flow field. Measurements of the mean and turbulent velocity fields are performed using a two-component laser-Doppler velocimeter. The results show how rectified currents (mean flows) can be generated via Reynolds stress gradients induced by periodic forcing of the grid. In the absence of background rotation, rectified flow is observed in the radial and vertical directions only. The presence of background rotation tends to organize these motions in that the flow tends to move parallel to the turbulent source, i.e., in the azimuthal direction, with the source (strong turbulence) located to the right, facing downstream. The influence of rotation on the Reynolds stresses and their gradients as well as on the ensuing mean flow is evaluated, and the observations are examined by considering individual contributions of the terms in the Reynolds-averaged momentum equations.

  16. Fluvial experiments using inertial sensors.

    NASA Astrophysics Data System (ADS)

    Maniatis, Georgios; Valyrakis, Manousos; Hodge, Rebecca; Drysdale, Tim; Hoey, Trevor

    2017-04-01

    During the last four years we have announced results on the development of a smart pebble that is constructed and calibrated specifically for capturing the dynamics of coarse sediment motion in river beds, at the grain scale. In this presentation we report details of our experimental validation across a range of flow regimes. The smart pebble contains Inertial Measurement Units (IMUs), which are sensors capable of recording the inertial acceleration and the angular velocity of the rigid bodies to which they are attached. IMUs are available across a range of performance levels, with commensurate increases in size, cost and performance as one progresses from integrated-circuit devices for use in commercial applications such as gaming and mobile phones, to larger brick-sized systems sometimes found in industrial applications such as vibration monitoring and quality control, or even the rack-mount equipment used in some aerospace and navigation applications (which can go as far as to include lasers and optical components). In parallel with developments in commercial and industrial settings, geomorphologists have recently started to explore means of deploying IMUs in smart pebbles. The less expensive, chip-scale IMUs have been shown to have adequate performance for this application, as well as offering a sufficiently compact form factor. Four prototype sensors have been developed so far, and the latest (400 g acceleration range, 50-200 Hz sampling frequency) has been tested in fluvial laboratory experiments. We present results from three different experimental regimes designed for the evaluation of this sensor: a) an entrainment threshold experiment; b) a bed impact experiment; and c) a rolling experiment. All experiments used a 100 mm spherical sensor, and set a) was repeated using an equivalent-size elliptical sensor. The experiments were conducted in the fluvial laboratory of the University of Glasgow (0.9 m wide flume) under different hydraulic conditions.
    The use of IMUs results in direct parametrization of the inertial forces on grains, which for the tested grain sizes were, as expected, always comparable to the independently measured hydrodynamic forces. However, the validity of IMU measurements is subject to specific design, processing and experimental considerations, and we present the results of our analysis of these.

  17. Parallel Lattice Basis Reduction Using a Multi-threaded Schnorr-Euchner LLL Algorithm

    NASA Astrophysics Data System (ADS)

    Backes, Werner; Wetzel, Susanne

    In this paper, we introduce a new parallel variant of the LLL lattice basis reduction algorithm. Our new, multi-threaded algorithm is the first to provide an efficient, parallel implementation of the Schnorr-Euchner algorithm for today’s multi-processor, multi-core computer architectures. Experiments with sparse and dense lattice bases show a speed-up factor of about 1.8 for the 2-thread and about 3.2 for the 4-thread version of our new parallel lattice basis reduction algorithm in comparison to the traditional non-parallel algorithm.
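For reference, the sequential algorithm that the parallel variant builds on can be sketched as follows. This is the textbook LLL reduction with floating-point Gram-Schmidt recomputed at each step for clarity; it is not the paper's multi-threaded Schnorr-Euchner implementation, which maintains these quantities incrementally:

```python
import numpy as np

def gram_schmidt(B):
    """Gram-Schmidt orthogonalization of the rows of B.
    Returns the orthogonal vectors B* and the mu coefficients."""
    n = len(B)
    Bs = np.array(B, dtype=float)
    mu = np.zeros((n, n))
    for i in range(n):
        for j in range(i):
            mu[i, j] = np.dot(B[i], Bs[j]) / np.dot(Bs[j], Bs[j])
            Bs[i] -= mu[i, j] * Bs[j]
    return Bs, mu

def lll(B, delta=0.75):
    """Textbook LLL reduction of the rows of an integer basis matrix B."""
    B = np.array(B, dtype=np.int64)
    n = len(B)
    k = 1
    while k < n:
        Bs, mu = gram_schmidt(B)
        # size-reduce row k against rows k-1 .. 0
        for j in range(k - 1, -1, -1):
            q = int(round(mu[k, j]))
            if q:
                B[k] -= q * B[j]
                Bs, mu = gram_schmidt(B)
        # Lovasz condition decides whether to advance or swap
        if np.dot(Bs[k], Bs[k]) >= (delta - mu[k, k - 1] ** 2) * np.dot(Bs[k - 1], Bs[k - 1]):
            k += 1
        else:
            B[[k - 1, k]] = B[[k, k - 1]]
            k = max(k - 1, 1)
    return B
```

The swap step is the sequential bottleneck the Schnorr-Euchner variant and its multi-threaded descendants work around.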

  18. LABORATORY ANALYSES OF CORONA DISCHARGES

    EPA Science Inventory

    The paper discusses an experimental research program to characterize corona generation from different electrode geometries in a range of conditions comparable to those found in electrostatic precipitators (ESPs). A wire-parallel plate device and a wire-cylinder device were used t...

  19. LABORATORY ANALYSIS OF BACK-CORONA DISCHARGE

    EPA Science Inventory

    The paper discusses an experimental research program to characterize back-corona generation and behavior in a range of environments and geometries common to electrostatic precipitators (ESPs). A wire-parallel plate device was used to monitor the intensity and distribution of back...

  20. Measuring the Index of Refraction.

    ERIC Educational Resources Information Center

    Phelps, F. M., III; Jacobson, B. S.

    1980-01-01

    Presents two methods for measuring the index of refraction of glass or lucite. These two methods, used in the freshman laboratory, are based on the fact that a ray of light inside a block will be refracted parallel to the surface. (HM)

  1. Xyce Parallel Electronic Simulator Users' Guide Version 6.8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R.; Aadithya, Karthik Venkatraman; Mei, Ting

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state of the art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase, a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  2. A 3D unstructured grid nearshore hydrodynamic model based on the vortex force formalism

    NASA Astrophysics Data System (ADS)

    Zheng, Peng; Li, Ming; van der A, Dominic A.; van der Zanden, Joep; Wolf, Judith; Chen, Xueen; Wang, Caixia

    2017-08-01

    A new three-dimensional nearshore hydrodynamic model system is developed based on the unstructured-grid version of the third-generation spectral wave model SWAN (Un-SWAN) coupled with the three-dimensional ocean circulation model FVCOM to enable the full representation of wave-current interaction in the nearshore region. A new wave-current coupling scheme is developed by adopting the vortex-force (VF) formalism to represent the wave-current interaction. The GLS turbulence model is also modified to better reproduce wave-breaking-enhanced turbulence, together with a roller transport model to account for the effect of the surface wave roller. This new model system is validated first against a theoretical case of obliquely incident waves on a planar beach, and then applied to three test cases: a laboratory-scale experiment of normal waves on a beach with a fixed breaker bar, a field experiment of obliquely incident waves on a natural, sandy barred beach (Duck'94 experiment), and a laboratory study of normally incident waves propagating around a shore-parallel breakwater. Overall, the model predictions agree well with the available measurements in these tests, illustrating the robustness and efficiency of the present model for very different spatial scales and hydrodynamic conditions. Sensitivity tests indicate the importance of roller effects and wave energy dissipation on the mean flow (undertow) profile over the depth. These tests further suggest adopting a spatially varying value for roller effects across the beach. In addition, the parameter values in the GLS turbulence model should be spatially inhomogeneous, which leads to better prediction of the turbulent kinetic energy and an improved prediction of the undertow velocity profile.

  3. A parallel algorithm for the eigenvalues and eigenvectors for a general complex matrix

    NASA Technical Reports Server (NTRS)

    Shroff, Gautam

    1989-01-01

    A new parallel Jacobi-like algorithm is developed for computing the eigenvalues of a general complex matrix. Most parallel methods for this problem typically display only linear convergence. Sequential norm-reducing algorithms also exist, and they display quadratic convergence in most cases. The new algorithm is a parallel form of the norm-reducing algorithm due to Eberlein. It is proven that the asymptotic convergence rate of this algorithm is quadratic. Numerical experiments are presented which demonstrate the quadratic convergence of the algorithm; certain situations where the convergence is slow are also identified. The algorithm promises to be very competitive on a variety of parallel architectures.
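For intuition, the classical sequential Jacobi iteration for the simpler real symmetric case can be sketched as below; the paper's norm-reducing Eberlein variant for general complex matrices, and its parallel rotation ordering, are more involved. Function and parameter names here are illustrative only:

```python
import numpy as np

def jacobi_eigenvalues(A, max_sweeps=30, tol=1e-12):
    """Eigenvalues of a real symmetric matrix via cyclic Jacobi rotations."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    for _ in range(max_sweeps):
        # off-diagonal Frobenius norm; stop once it is negligible
        off = np.sqrt(np.sum(A ** 2) - np.sum(np.diag(A) ** 2))
        if off < tol:
            break
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(A[p, q]) < tol:
                    continue
                # rotation angle chosen to zero the (p, q) entry
                theta = 0.5 * np.arctan2(2.0 * A[p, q], A[q, q] - A[p, p])
                c, s = np.cos(theta), np.sin(theta)
                J = np.eye(n)
                J[p, p] = J[q, q] = c
                J[p, q], J[q, p] = s, -s
                A = J.T @ A @ J  # similarity transform preserves eigenvalues
    return np.sort(np.diag(A))
```

Rotations on disjoint (p, q) index pairs commute in their effect on distinct rows, which is what parallel Jacobi schemes exploit by processing non-conflicting pairs concurrently.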

  4. Development of a fiber optic compressor blade sensor

    NASA Technical Reports Server (NTRS)

    Dhadwal, Harbans Singh

    1995-01-01

    A complete working prototype of the fiber optic blade tip sensor was first tested in the laboratory, followed by a thorough evaluation at NASA W8 Single Compressor Stage Facility in Lewis Research Center. Subsequently, a complete system with three parallel channels was fabricated and delivered to Dr. Kurkov. The final system was tested in the Subsonic Wind Tunnel Facility, in parallel with The General Electric Company's light probe system. The results at all operating speeds were comparable. This report provides a brief description of the system and presents a summary of the experimental results.

  5. The ethics of wildlife research: a nine R theory.

    PubMed

    Curzer, Howard J; Wallace, Mark C; Perry, Gad; Muhlberger, Peter J; Perry, Dan

    2013-01-01

    The commonsense ethical constraints on laboratory animal research known as the three Rs are widely accepted, but no constraints tailored to research on animals in the wild are available. In this article, we begin to fill that gap. We sketch a set of commonsense ethical constraints on ecosystem research parallel to the constraints that govern laboratory animal research. Then we combine the animal and ecosystem constraints into a single theory to govern research on animals in the wild.

  6. Corridor One: An Integrated Distance Visualization Environment for SSI+ASCI Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christopher R. Johnson, Charles D. Hansen

    2001-10-29

    The goal of Corridor One: An Integrated Distance Visualization Environment for ASCI and SSI Applications was to combine the forces of six leading-edge laboratories working in the areas of visualization, distributed computing, and high-performance networking (Argonne National Laboratory, Lawrence Berkeley National Laboratory, Los Alamos National Laboratory, University of Illinois, University of Utah, and Princeton University) to develop and deploy the most advanced integrated distance visualization environment for large-scale scientific visualization and demonstrate it on applications relevant to the DOE SSI and ASCI programs. The Corridor One team brought world-class expertise in parallel rendering, deep image-based rendering, immersive environment technology, large-format multi-projector wall-based displays, volume and surface visualization algorithms, collaboration tools and streaming media technology, network protocols for image transmission, high-performance networking, quality-of-service technology, and distributed computing middleware. Our strategy was to build on the very successful teams that produced the I-WAY, "Computational Grids", and CAVE technology and to add these to the teams that developed the fastest parallel visualization systems and the most widely used networking infrastructure for multicast and distributed media. Unfortunately, just as we were getting going on the Corridor One project, DOE cut the program after the first year. As such, our final report consists of our progress during year one of the grant.

  7. Laboratory Study of the Displacement Coalbed CH4 Process and Efficiency of CO2 and N2 Injection

    PubMed Central

    Wang, Liguo; Wang, Yongkang

    2014-01-01

    ECBM displacement experiments are a direct way to observe the gas displacement process and efficiency by inspecting the produced gas composition and flow rate. We conducted two sets of ECBM experiments by injecting N2 and CO2 through four large parallel specimens (300 × 50 × 50 mm coal briquettes). N2 or CO2 is injected at pressures of 1.5, 1.8, and 2.2 MPa and various crustal stresses. The changes in pressure along the briquette and the concentration of the gas mixture flowing out of the briquette were analyzed. Gas injection significantly enhances CBM recovery. Experimental recoveries of the original extant gas are in excess of 90% for all cases. The results show that N2 breakthrough occurs earlier than CO2 breakthrough: N2 breaks through at approximately 0.5 displaced volumes, whereas CO2 breaks through at approximately 2 displaced volumes. Coal can adsorb CO2, which results in a later breakthrough. In addition, ground stress significantly influences the displacement effect of the gas injection. PMID:24741346
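The breakthrough points quoted above are expressed in displaced (pore) volumes, i.e. cumulative injected gas normalized by the specimen's pore volume. A trivial sketch of that bookkeeping, with hypothetical names (any consistent units work):

```python
def displaced_pore_volumes(flow_rate, elapsed_time, pore_volume):
    """Cumulative injected gas expressed in displaced (pore) volumes."""
    return flow_rate * elapsed_time / pore_volume

def breakthrough_time(pore_volume, flow_rate, breakthrough_pv):
    """Time at which breakthrough occurs, given the breakthrough point in
    displaced volumes (~0.5 for N2, ~2 for CO2 in these experiments)."""
    return breakthrough_pv * pore_volume / flow_rate
```

Because adsorbing CO2 is retained by the coal, its breakthrough arrives roughly four pore volumes later than that of the weakly adsorbing N2 at the same injection rate.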

  8. High Power Ion Cyclotron Heating in the VASIMR

    NASA Astrophysics Data System (ADS)

    Longmier, B. W.; Brukardt, M. S.; Bering, E. A.; Chang Diaz, F.; Squire, J.

    2009-12-01

    The Variable Specific Impulse Magnetoplasma Rocket (VASIMR®) is an electric propulsion system under development at Ad Astra Rocket Company that utilizes several processes of ion acceleration and heating that occur in the Birkeland currents of an auroral arc system. Among these processes are parallel electric field acceleration, lower hybrid resonance heating, and ion cyclotron resonance heating. The VASIMR® is capable of laboratory simulation of electromagnetic ion cyclotron wave heating during a single pass of ions through the resonance region. The plasma is generated by a helicon discharge of 35 kW then passes through a 176 kW RF booster stage that couples left hand polarized slow mode waves from the high field side of the resonance. VX-200 auroral simulation results from the past year are discussed. Ambipolar acceleration has been shown to produce 35eV argon ions in the helicon exhaust. The effects on the ion exhaust with an addition of 150-200 kW of ion cyclotron heating are presented. The changes to the VASIMR® experiment at Ad Astra Rocket Company's new facility in Webster, Texas will also be discussed, including the possibility of collaborative experiments.

  9. Ion Cyclotron Waves in the VASIMR

    NASA Astrophysics Data System (ADS)

    Brukardt, M. S.; Bering, E. A.; Chang-Diaz, F. R.; Squire, J. P.; Longmier, B.

    2008-12-01

    The Variable Specific Impulse Magnetoplasma Rocket is an electric propulsion system under development at Ad Astra Rocket Company that utilizes several processes of ion acceleration and heating that occur in the Birkeland currents of an auroral arc system. Among these processes are parallel electric field acceleration, lower hybrid resonance heating, and ion cyclotron resonance heating. The VASIMR is capable of laboratory simulation of electromagnetic ion cyclotron wave heating during a single pass of the plasma through the resonance region. The plasma is generated by a helicon discharge of about 25 kW then passes through an RF booster stage that shoots left hand polarized slow mode waves from the high field side of the resonance. This paper will focus on the upgrades to the VX-200 test model over the last year. After summarizing the VX- 50 and VX-100 results, the new data from the VX-200 model will be presented. Lastly, the changes to the VASIMR experiment due to Ad Astra Rocket Company's new facility in Webster, Texas will also be discussed, including the possibility of collaborative experiments at the new facility.

  10. Tortuous pathways: Fundamental characterisation of the anisotropic permeability through clay-rich shales from macro- to nano-scale.

    NASA Astrophysics Data System (ADS)

    Mitchell, T. M.; Backeberg, N. R.; Iacoviello, F.; Rittner, M.; Jones, A. P.; Wheeler, J.; Day, R.; Vermeesch, P.; Shearing, P. R.; Striolo, A.

    2017-12-01

    The permeability of shales is important, because it controls where oil and gas resources can migrate to and where in the Earth hydrocarbons are ultimately stored. Shales have a well-known anisotropic directional permeability that is inherited from the depositional layering of sedimentary laminations, where the highest permeability is measured parallel to laminations and the lowest permeability is perpendicular to laminations. We combine state-of-the-art laboratory permeability experiments with high-resolution X-ray computed tomography and can, for the first time, quantify the three-dimensional interconnected pathways through a rock that define the anisotropic behaviour of shales. Experiments record a physical anisotropy in permeability of one to two orders of magnitude. Two- and three-dimensional analyses of micro- and nano-scale X-ray computed tomography reveal that the directional anisotropy is fundamentally controlled by the bulk rock mineral geometry, which determines the finite length (or tortuosity) of the interconnected pathways through the porous/permeable phases in shales. Understanding the mineral-scale control on permeability will allow for better estimations of the extent of recoverable reserves in shale gas plays globally.
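
    The tortuosity control on directional permeability described above can be illustrated with a back-of-the-envelope calculation. The sketch below uses a generic Kozeny-type scaling (permeability proportional to porosity times pore radius squared, divided by tortuosity squared); the porosity, pore radius, tortuosity, and shape-factor values are hypothetical placeholders, not measurements from this study.

```python
# Illustrative estimate of permeability anisotropy from directional tortuosity.
# Assumes a Kozeny-type scaling k ~ phi * r^2 / (c * tau^2); all numbers
# below are hypothetical, chosen only to show the order-of-magnitude effect.

def permeability(porosity, pore_radius_m, tortuosity, shape_factor=8.0):
    """Kozeny-type permeability estimate in m^2."""
    return porosity * pore_radius_m**2 / (shape_factor * tortuosity**2)

phi = 0.05                # 5 % connected porosity (hypothetical)
r = 50e-9                 # ~50 nm pore throats (hypothetical)
tau_parallel = 2.0        # shorter paths along laminations
tau_perpendicular = 8.0   # much more tortuous paths across laminations

k_par = permeability(phi, r, tau_parallel)
k_perp = permeability(phi, r, tau_perpendicular)

print(f"k parallel:       {k_par:.2e} m^2")
print(f"k perpendicular:  {k_perp:.2e} m^2")
print(f"anisotropy ratio: {k_par / k_perp:.0f}x")
```

    Because only tortuosity differs between the two directions, the anisotropy ratio reduces to (tau_perpendicular / tau_parallel)^2, so a fourfold difference in path length already yields roughly one order of magnitude in permeability, consistent with the range the abstract reports.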

  11. Mechanism of Tennis Racket Spin Performance

    NASA Astrophysics Data System (ADS)

    Kawazoe, Yoshihiko; Okimoto, Kenji; Okimoto, Keiko

    Players often say that some strings provide a better grip and more spin than others, but ball spin did not depend on string type, gauge, or tension in previous laboratory experiments. There was no research work on spin to uncover what is really happening during an actual tennis impact because of the difficulty of performing the appropriate experiments. The present paper clarified the mechanism of top spin and its improvement by lubrication of strings through the use of high-speed video analysis. It also provided a more detailed explanation of spin behavior by comparing a racket with lubricated strings with the famous “spaghetti” strung racket, which was banned in 1978 by the International Tennis Federation because it used plastic spaghetti tubing over the strings to reduce friction, resulting in excessive ball spin. As the main strings stretch and slide sideways more, the ball is given additional spin due to the restoring force parallel to the string face when the main strings spring back and the ball is released from the strings. Herein, we also showed that the additional spin results in a reduction of shock vibrations of the wrist joint during impact.

  12. The SPectrometer for Ice Nuclei (SPIN): An instrument to investigate ice nucleation

    DOE PAGES

    Garimella, Sarvesh; Kristensen, Thomas Bjerring; Ignatius, Karolina; ...

    2016-07-06

    The SPectrometer for Ice Nuclei (SPIN) is a commercially available ice nucleating particle (INP) counter manufactured by Droplet Measurement Technologies in Boulder, CO. The SPIN is a continuous flow diffusion chamber with parallel plate geometry based on the Zurich Ice Nucleation Chamber and the Portable Ice Nucleation Chamber. This study presents a standard description for using the SPIN instrument and also highlights methods to analyze measurements in more advanced ways. It characterizes and describes the behavior of the SPIN chamber, reports data from laboratory measurements, and quantifies uncertainties associated with the measurements. Experiments with ammonium sulfate are used to investigate homogeneous freezing of deliquesced haze droplets and droplet breakthrough. Experiments with kaolinite, NX illite, and silver iodide are used to investigate heterogeneous ice nucleation. SPIN nucleation results are compared to those from the literature. A machine learning approach for analyzing depolarization data from the SPIN optical particle counter is also presented (as an advanced use). Altogether, we report that the SPIN is able to reproduce previous INP counter measurements.

  13. Trade-offs in osmoregulation and parallel shifts in molecular function follow ecological transitions to freshwater in the Alewife

    USGS Publications Warehouse

    Velotta, Jonathan P.; McCormick, Stephen; Schultz, Eric T.

    2015-01-01

    Adaptation to freshwater may be expected to reduce performance in seawater because these environments represent opposing selective regimes. We tested for such a trade-off in populations of the Alewife (Alosa pseudoharengus). Alewives are ancestrally anadromous, and multiple populations have been independently restricted to freshwater (landlocked). We conducted salinity challenge experiments, whereby juvenile Alewives from one anadromous and multiple landlocked populations were exposed to freshwater and seawater on acute and acclimation timescales. In response to acute salinity challenge trials, independently derived landlocked populations varied in the degree to which seawater tolerance has been lost. In laboratory-acclimation experiments, landlocked Alewives exhibited improved freshwater tolerance, which was correlated with reductions in seawater tolerance and hypo-osmotic balance, suggesting that trade-offs in osmoregulation may be associated with local adaptation to freshwater. We detected differentiation between life-history forms in the expression of an ion-uptake gene (NHE3), and in gill Na+/K+-ATPase activity. Trade-offs in osmoregulation, therefore, may be mediated by differentiation in ion-uptake and salt-secreting pathways.

  14. 3D Hybrid Simulations of Interactions of High-Velocity Plasmoids with Obstacles

    NASA Astrophysics Data System (ADS)

    Omelchenko, Y. A.; Weber, T. E.; Smith, R. J.

    2015-11-01

    Interactions of fast plasma streams and objects with magnetic obstacles (dipoles, mirrors, etc) lie at the core of many space and laboratory plasma phenomena ranging from magnetoshells and solar wind interactions with planetary magnetospheres to compact fusion plasmas (spheromaks and FRCs) to astrophysics-in-lab experiments. Properly modeling ion kinetic, finite-Larmor radius and Hall effects is essential for describing large-scale plasma dynamics, turbulence and heating in complex magnetic field geometries. Using an asynchronous parallel hybrid code, HYPERS, we conduct 3D hybrid (particle-in-cell ion, fluid electron) simulations of such interactions under realistic conditions that include magnetic flux coils, ion-ion collisions and the Chodura resistivity. HYPERS does not step simulation variables synchronously in time but instead performs time integration by executing asynchronous discrete events: updates of particles and fields carried out as frequently as dictated by local physical time scales. Simulations are compared with data from the MSX experiment which studies the physics of magnetized collisionless shocks through the acceleration and subsequent stagnation of FRC plasmoids against a strong magnetic mirror and flux-conserving boundary.

  15. Characterization and Simulation of a New Design Parallel-Plate Ionization Chamber for CT Dosimetry at Calibration Laboratories

    NASA Astrophysics Data System (ADS)

    Perini, Ana P.; Neves, Lucio P.; Maia, Ana F.; Caldas, Linda V. E.

    2013-12-01

    In this work, a new extended-length parallel-plate ionization chamber was tested in the standard radiation qualities for computed tomography established according to the half-value layers defined in the IEC 61267 standard, at the Calibration Laboratory of the Instituto de Pesquisas Energéticas e Nucleares (IPEN). The experimental characterization was made following the IEC 61674 standard recommendations. The experimental results obtained with the ionization chamber studied in this work were compared to those obtained with a commercial pencil ionization chamber, showing good agreement. With the use of the PENELOPE Monte Carlo code, simulations were undertaken to evaluate the influence of the cables, insulator, PMMA body, collecting electrode, guard ring, and screws, as well as different materials and geometrical arrangements, on the energy deposited in the ionization chamber's sensitive volume. The maximum influence observed was 13.3% for the collecting electrode, and regarding the use of different materials and designs, the substitutions showed that the original project presented the most suitable configuration. The experimental and simulated results obtained in this work show that this ionization chamber has appropriate characteristics to be used at calibration laboratories, for dosimetry in standard computed tomography and diagnostic radiology quality beams.

  16. Cyclic deformations in the Opalinus clay: a laboratory experiment

    NASA Astrophysics Data System (ADS)

    Huber, Emanuel; Huggenberger, Peter; Möri, Andreas; Meier, Edi

    2015-04-01

    The influence of tunnel climate on deformation cycles of joint openings and closings is often observed immediately after excavation. At the EZ-B niche in the Mt. Terri rock laboratory (Switzerland), a cyclic deformation of the shaly Opalinus clay has been monitored for several years. The deformation cycles of the joints parallel to the clay bedding planes correlate with seasonal variations in relative humidity of the air in the niche. In winter, when the relative humidity is the lowest (down to 65%), the joints open as the clay volume decreases, whereas they tend to close in the summer when the relative humidity reaches up to 100%. Furthermore, in situ measurements have shown the trend of an increasingly smaller aperture of joints with time. A laboratory experiment was carried out to reproduce the observed cyclic deformation in a climate chamber using a core sample of Opalinus clay. The main goal of the experiment was to investigate the influence of the relative humidity on the deformation of the Opalinus clay while excluding the in situ effects (e.g. confining stress). The core sample of Opalinus clay was put into a closed ended PVC tube and the space between the sample and the tube was filled with resin. Then, the sample (size: 28 cm × 14 cm × 6.5 cm) was cut in half lengthways and the open end was cut, so that the half-core sample could move in one direction. The mounted sample was exposed to wetting and drying cycles in a climate chamber. Air temperature, air humidity and sample weight were continuously recorded. Photographs taken at regular time intervals by a webcam allowed the formation/deformation of cracks on the surface of the sample to be monitored. A crackmeter consisting of a double-plate capacitor attached to the core sample was developed to measure the dynamics of the crack opening and closing. 
Preliminary results show that: (1) deformation movements during different climate cycles can be visualized with the webcam; (2) the crackmeter signal gives a relatively precise response for relative humidity below 80%; (3) the sample weight variations are clearly related to the climatic conditions (temperature and relative humidity) and are associated with deformation of the sample (widening and narrowing of the cracks); and (4) controlling the relative humidity in the climate chamber turned out to be difficult in a laboratory without climate conditioning, especially during summer.

  17. MPI_XSTAR: MPI-based Parallelization of the XSTAR Photoionization Program

    NASA Astrophysics Data System (ADS)

    Danehkar, Ashkbiz; Nowak, Michael A.; Lee, Julia C.; Smith, Randall K.

    2018-02-01

    We describe a program for the parallel implementation of multiple runs of XSTAR, a photoionization code that is used to predict the physical properties of an ionized gas from its emission and/or absorption lines. The parallelization program, called MPI_XSTAR, has been developed and implemented in the C++ language by using the Message Passing Interface (MPI) protocol, a conventional standard of parallel computing. We have benchmarked parallel multiprocessing executions of XSTAR, using MPI_XSTAR, against a serial execution of XSTAR, in terms of the parallelization speedup and the computing resource efficiency. Our experience indicates that the parallel execution runs significantly faster than the serial execution; however, the efficiency in terms of computing resource usage decreases as the number of processors used in the parallel computing increases.
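
    The two benchmark metrics named in this abstract, parallelization speedup and computing-resource efficiency, are conventionally defined as S(N) = T(1)/T(N) and E(N) = S(N)/N. The sketch below generates hypothetical timings from Amdahl's law to show why efficiency falls as processors are added; the 5% serial fraction is an assumed illustrative value, not a measured property of MPI_XSTAR.

```python
# Speedup and efficiency metrics as used when benchmarking a parallel code
# against its serial execution. Timings are hypothetical, generated from
# Amdahl's law with an assumed 5 % non-parallelizable fraction.

def amdahl_time(t_serial, serial_fraction, n_procs):
    """Predicted wall time on n_procs under Amdahl's law."""
    return t_serial * (serial_fraction + (1 - serial_fraction) / n_procs)

t1 = 1000.0  # serial wall time (arbitrary units)
f = 0.05     # non-parallelizable fraction (assumed)

for n in (1, 2, 4, 8, 16, 32):
    tn = amdahl_time(t1, f, n)
    speedup = t1 / tn          # S(N) = T(1) / T(N)
    efficiency = speedup / n   # E(N) = S(N) / N
    print(f"N={n:2d}  speedup={speedup:5.2f}  efficiency={efficiency:4.2f}")
```

    Speedup grows monotonically but saturates toward 1/f, so efficiency necessarily declines with processor count, which is the behaviour the abstract reports.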

  18. Parallel Electrochemical Treatment System and Application for Identifying Acid-Stable Oxygen Evolution Electrocatalysts

    DOE PAGES

    Jones, Ryan J. R.; Shinde, Aniketa; Guevarra, Dan; ...

    2015-01-05

    Many energy technologies require electrochemical stability or preactivation of functional materials. Due to the long experiment duration required for either electrochemical preactivation or evaluation of operational stability, parallel screening is required to enable high-throughput experimentation. We found that imposing operational electrochemical conditions on a library of materials in parallel creates several opportunities for experimental artifacts. We discuss the electrochemical engineering principles and operational parameters that mitigate artifacts in the parallel electrochemical treatment system. We also demonstrate the effects of resistive losses within the planar working electrode through a combination of finite element modeling and illustrative experiments. Operation of the parallel-plate, membrane-separated electrochemical treatment system is demonstrated by exposing a composition library of mixed metal oxides to oxygen evolution conditions in 1 M sulfuric acid for 2 h. This application is particularly important because the electrolysis and photoelectrolysis of water are promising future energy technologies inhibited by the lack of highly active, acid-stable catalysts containing only earth-abundant elements.

  19. Concurrent extensions to the FORTRAN language for parallel programming of computational fluid dynamics algorithms

    NASA Technical Reports Server (NTRS)

    Weeks, Cindy Lou

    1986-01-01

    Experiments were conducted at NASA Ames Research Center to define multi-tasking software requirements for multiple-instruction, multiple-data stream (MIMD) computer architectures. The focus was on specifying solutions for algorithms in the field of computational fluid dynamics (CFD). The program objectives were to allow researchers to produce usable parallel application software as soon as possible after acquiring MIMD computer equipment, to provide researchers with an easy-to-learn and easy-to-use parallel software language which could be implemented on several different MIMD machines, and to enable researchers to list preferred design specifications for future MIMD computer architectures. Analysis of CFD algorithms indicated that extensions of an existing programming language, adaptable to new computer architectures, provided the best solution to meeting program objectives. The CoFORTRAN Language was written in response to these objectives and to provide researchers a means to experiment with parallel software solutions to CFD algorithms on machines with parallel architectures.

  20. A Comparison of Parallelism in Interface Designs for Computer-Based Learning Environments

    ERIC Educational Resources Information Center

    Min, Rik; Yu, Tao; Spenkelink, Gerd; Vos, Hans

    2004-01-01

    In this paper we discuss an experiment that was carried out with a prototype, designed in conformity with the concept of parallelism and the Parallel Instruction theory (the PI theory). We designed this prototype with five different interfaces, and ran an empirical study in which 18 participants completed an abstract task. The five basic designs…

  1. Parallel-Processing Test Bed For Simulation Software

    NASA Technical Reports Server (NTRS)

    Blech, Richard; Cole, Gary; Townsend, Scott

    1996-01-01

    Second-generation Hypercluster computing system is multiprocessor test bed for research on parallel algorithms for simulation in fluid dynamics, electromagnetics, chemistry, and other fields with large computational requirements but relatively low input/output requirements. Built from standard, off-the-shelf hardware readily upgraded as improved technology becomes available. System used for experiments with such parallel-processing concepts as message-passing algorithms, debugging software tools, and computational steering. First-generation Hypercluster system described in "Hypercluster Parallel Processor" (LEW-15283).

  2. PRATHAM: Parallel Thermal Hydraulics Simulations using Advanced Mesoscopic Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joshi, Abhijit S; Jain, Prashant K; Mudrich, Jaime A

    2012-01-01

    At the Oak Ridge National Laboratory, efforts are under way to develop a 3D, parallel LBM code called PRATHAM (PaRAllel Thermal Hydraulic simulations using Advanced Mesoscopic Methods) to demonstrate the accuracy and scalability of LBM for turbulent flow simulations in nuclear applications. The code has been developed using FORTRAN-90 and parallelized using the Message Passing Interface (MPI) library. The Silo library is used to compact and write the data files, and the VisIt visualization software is used to post-process the simulation data in parallel. Both the single-relaxation-time (SRT) and multiple-relaxation-time (MRT) LBM schemes have been implemented in PRATHAM. To capture turbulence without prohibitively increasing the grid resolution requirements, an LES approach [5] is adopted, allowing large-scale eddies to be numerically resolved while modeling the smaller (subgrid) eddies. In this work, a Smagorinsky model has been used, which modifies the fluid viscosity by an additional eddy viscosity depending on the magnitude of the rate-of-strain tensor. In LBM, this is achieved by locally varying the relaxation time of the fluid.
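
    The final sentence describes the standard way a Smagorinsky subgrid model enters a BGK lattice Boltzmann solver: the eddy viscosity is folded into a locally varying relaxation time. A generic sketch is shown below (lattice units, c_s² = 1/3, hence τ = 3ν + 1/2); it is a textbook illustration with assumed parameter values, not code from PRATHAM.

```python
# Generic sketch of the Smagorinsky adjustment in a BGK lattice Boltzmann
# model (lattice units, sound speed squared = 1/3): the molecular viscosity
# is augmented by an eddy viscosity nu_t = (C_s * dx)^2 * |S|, where |S| is
# the magnitude of the rate-of-strain tensor, and the local relaxation time
# is recomputed as tau = 3 * nu_total + 0.5. All numbers are illustrative.

def local_relaxation_time(nu_molecular, strain_rate_mag, c_smag=0.17, dx=1.0):
    """Return the locally adjusted BGK relaxation time (lattice units)."""
    nu_eddy = (c_smag * dx) ** 2 * strain_rate_mag  # Smagorinsky eddy viscosity
    return 3.0 * (nu_molecular + nu_eddy) + 0.5

nu0 = 0.01  # molecular viscosity in lattice units (assumed)

# Quiescent region: no strain, so tau reduces to the laminar value 3*nu0 + 0.5.
print(local_relaxation_time(nu0, strain_rate_mag=0.0))
# Strongly sheared region: the eddy viscosity raises tau locally.
print(local_relaxation_time(nu0, strain_rate_mag=0.5))
```

    Because τ is computed per lattice node from the local strain rate, turbulent regions are stabilized without changing the grid, which is exactly the mechanism the abstract attributes to the LES approach.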

  3. Diagnostic approach to hemoglobins with high oxygen affinity: experience from France and Belgium and review of the literature.

    PubMed

    Orvain, Corentin; Joly, Philippe; Pissard, Serge; Badiou, Stéphanie; Badens, Catherine; Bonello-Palot, Nathalie; Couque, Nathalie; Gulbis, Béatrice; Aguilar-Martinez, Patricia

    2017-02-01

    Congenital causes of erythrocytosis are now more easily identified due to the improvement of the molecular characterization of many of them. Among these causes, hemoglobins with high oxygen affinity take a large place. The aim of this work was to reevaluate the diagnostic approach to these disorders. To assess current practices, we sent a questionnaire to the expert laboratories in the diagnosis of hemoglobinopathies in France and Belgium. In parallel, we gathered the methods used for the diagnosis of the hemoglobins with high oxygen affinity indexed in the international database HbVar. Even though they remain a rare cause of erythrocytosis (1 to 5 positive diagnoses every year in each of the questioned specialized laboratories), hemoglobins with high oxygen affinity are increasingly suspected by clinicians. Phenotypic assessment by laboratory techniques remains a main step in their diagnosis, as it enabled the finding of 93% of them in the questioned laboratories (28 of the 30 variants diagnosed during the last 5 years). Among the 96 hemoglobin variants with high oxygen affinity indexed in the international database, 87% could be diagnosed with phenotypic techniques. A direct measure of the p50 with the Hemox-Analyzer is included in the diagnostic approach of only half of the laboratories, because of the poor availability of this apparatus. Comparatively, the estimation of p50 by blood gas analyzers on venous blood is a much more convenient and attractive method, but due to the lack of proof as to its effectiveness in the diagnosis of hemoglobins with high oxygen affinity, it requires further investigation. Beta- and alpha-globin gene analysis by molecular biology techniques is essential, as it either allows a quick and definite identification of the variant or definitely excludes the diagnosis. It is thus systematically performed as a first- or second-step method, according to the laboratory practice.

  4. The optimization of total laboratory automation by simulation of a pull-strategy.

    PubMed

    Yang, Taho; Wang, Teng-Kuan; Li, Vincent C; Su, Chia-Lo

    2015-01-01

    Laboratory results are essential for physicians to diagnose medical conditions. Because of the critical role of medical laboratories, an increasing number of hospitals use total laboratory automation (TLA) to improve laboratory performance. Although the benefits of TLA are well documented, systems occasionally become congested, particularly when hospitals face peak demand. This study optimizes TLA operations. Firstly, value stream mapping (VSM) is used to identify the non-value-added time. Subsequently, batch processing control and parallel scheduling rules are devised and a pull mechanism that comprises a constant work-in-process (CONWIP) is proposed. Simulation optimization is then used to optimize the design parameters and to ensure a small inventory and a shorter average cycle time (CT). For empirical illustration, this approach is applied to a real case. The proposed methodology significantly improves the efficiency of laboratory work and leads to a reduction in patient waiting times and increased service level.
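
    The CONWIP pull mechanism proposed above caps the total work-in-process with a fixed number of "cards": a new specimen is released into the automated line only when a finished one frees a card. The toy sketch below (a deliberately simplified discrete-step model, not the study's simulation-optimization framework) shows how the cap bounds WIP regardless of demand.

```python
# Minimal illustration of a CONWIP (constant work-in-process) pull mechanism:
# the number of jobs inside the system is capped by a fixed card count, and a
# waiting job is pulled in only when a finished job releases its card.
from collections import deque

def run_conwip(arrivals, service_steps, cards):
    """Process jobs through a card-capped system; return max observed WIP."""
    backlog = deque(arrivals)   # jobs waiting outside the system
    in_system = []              # [job_id, remaining service steps]
    max_wip = 0
    while backlog or in_system:
        # Pull jobs in only while free cards are available.
        while backlog and len(in_system) < cards:
            in_system.append([backlog.popleft(), service_steps])
        max_wip = max(max_wip, len(in_system))
        # One time step of service; finished jobs release their cards.
        for job in in_system:
            job[1] -= 1
        in_system = [job for job in in_system if job[1] > 0]
    return max_wip

# 20 specimens arrive, each needing 3 steps, with only 5 CONWIP cards:
print(run_conwip(arrivals=range(20), service_steps=3, cards=5))  # cap of 5 holds
```

    The pull discipline is what keeps congestion bounded during peak demand: demand beyond the card count waits outside the line instead of accumulating as in-process inventory.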

  5. An Environment for Incremental Development of Distributed Extensible Asynchronous Real-time Systems

    NASA Technical Reports Server (NTRS)

    Ames, Charles K.; Burleigh, Scott; Briggs, Hugh C.; Auernheimer, Brent

    1996-01-01

    Incremental parallel development of distributed real-time systems is difficult. Architectural techniques and software tools developed at the Jet Propulsion Laboratory's (JPL's) Flight System Testbed make feasible the integration of complex systems in various stages of development.

  6. Branched Polymers for Enhancing Polymer Gel Strength and Toughness

    DTIC Science & Technology

    2013-02-01

    Molecular Massively Parallel Simulator (LAMMPS) program and the stress-strain relations were calculated with varying strain-rates (figure 6). A...Acronyms: ARL U.S. Army Research Laboratory; D3 hexamethylcyclotrisiloxane; FTIR Fourier transform infrared; GPC gel permeation chromatography; LAMMPS

  7. Integration experiences and performance studies of A COTS parallel archive systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Hsing-bung; Scott, Cody; Grider, Gary

    2010-01-01

    Current and future Archive Storage Systems have been asked to (a) scale to very high bandwidths, (b) scale in metadata performance, (c) support policy-based hierarchical storage management capability, (d) scale in supporting changing needs of very large data sets, (e) support standard interface, and (f) utilize commercial-off-the-shelf (COTS) hardware. Parallel file systems have been asked to do the same thing but at one or more orders of magnitude faster in performance. Archive systems continue to move closer to file systems in their design due to the need for speed and bandwidth, especially metadata searching speeds such as more caching and less robust semantics. Currently the number of extreme highly scalable parallel archive solutions is very small especially those that will move a single large striped parallel disk file onto many tapes in parallel. We believe that a hybrid storage approach of using COTS components and innovative software technology can bring new capabilities into a production environment for the HPC community much faster than the approach of creating and maintaining a complete end-to-end unique parallel archive software solution. In this paper, we relay our experience of integrating a global parallel file system and a standard backup/archive product with a very small amount of additional code to provide a scalable, parallel archive. Our solution has a high degree of overlap with current parallel archive products including (a) doing parallel movement to/from tape for a single large parallel file, (b) hierarchical storage management, (c) ILM features, (d) high volume (non-single parallel file) archives for backup/archive/content management, and (e) leveraging all free file movement tools in Linux such as copy, move, ls, tar, etc. 
We have successfully applied our working COTS Parallel Archive System to the current world's first petaflop/s computing system, LANL's Roadrunner, and demonstrated its capability to address requirements of future archival storage systems.

  8. Integration experiments and performance studies of a COTS parallel archive system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Hsing-bung; Scott, Cody; Grider, Gary

    2010-06-16

    Current and future Archive Storage Systems have been asked to (a) scale to very high bandwidths, (b) scale in metadata performance, (c) support policy-based hierarchical storage management capability, (d) scale in supporting changing needs of very large data sets, (e) support standard interface, and (f) utilize commercial-off-the-shelf (COTS) hardware. Parallel file systems have been asked to do the same thing but at one or more orders of magnitude faster in performance. Archive systems continue to move closer to file systems in their design due to the need for speed and bandwidth, especially metadata searching speeds such as more caching and less robust semantics. Currently the number of extreme highly scalable parallel archive solutions is very small especially those that will move a single large striped parallel disk file onto many tapes in parallel. We believe that a hybrid storage approach of using COTS components and innovative software technology can bring new capabilities into a production environment for the HPC community much faster than the approach of creating and maintaining a complete end-to-end unique parallel archive software solution. In this paper, we relay our experience of integrating a global parallel file system and a standard backup/archive product with a very small amount of additional code to provide a scalable, parallel archive. Our solution has a high degree of overlap with current parallel archive products including (a) doing parallel movement to/from tape for a single large parallel file, (b) hierarchical storage management, (c) ILM features, (d) high volume (non-single parallel file) archives for backup/archive/content management, and (e) leveraging all free file movement tools in Linux such as copy, move, ls, tar, etc. 
We have successfully applied our working COTS Parallel Archive System to the current world's first petaflop/s computing system, LANL's Roadrunner machine, and demonstrated its capability to address requirements of future archival storage systems.
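
    The headline capability in both of these records, moving a single large striped parallel file onto many tapes in parallel, can be sketched as a stripe-then-fan-out pattern. The example below is a hypothetical stand-in: in-memory buffers play the role of tape drives, and the block size and function names are invented for illustration, not taken from the authors' system.

```python
# Sketch of the key idea: a single large file is round-robin striped into
# chunks, and each stripe is written to a different archive target in
# parallel. In-memory byte buffers stand in for tape drives.
from concurrent.futures import ThreadPoolExecutor

def stripe(data: bytes, n_stripes: int):
    """Round-robin the file into n_stripes sequences of fixed-size blocks."""
    block = 4  # tiny block size for the demo (real systems use MB-scale blocks)
    stripes = [bytearray() for _ in range(n_stripes)]
    for i in range(0, len(data), block):
        stripes[(i // block) % n_stripes] += data[i:i + block]
    return [bytes(s) for s in stripes]

def write_to_tape(tape_id: int, payload: bytes) -> int:
    """Stand-in for a tape write; returns the number of bytes 'written'."""
    return len(payload)

data = bytes(range(32))                 # the "single large parallel file"
stripes = stripe(data, n_stripes=4)

# Fan the stripes out to four "tape drives" concurrently.
with ThreadPoolExecutor(max_workers=4) as pool:
    written = list(pool.map(write_to_tape, range(4), stripes))

assert sum(written) == len(data)  # nothing lost across the parallel writes
print(written)  # [8, 8, 8, 8]
```

    Restoring the file is the inverse: read all stripes in parallel and interleave the blocks back in round-robin order.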

  9. Using the Computer as a Laboratory Instrument.

    ERIC Educational Resources Information Center

    Collings, Peter J.; Greenslade, Thomas B., Jr.

    1989-01-01

    Reports experiences during a two-year period in introducing the computer to the laboratory and students to the computer as a laboratory instrument. Describes a working philosophy, data acquisition system, and experiments. Summarizes the laboratory procedures of nine experiments, covering mechanics, heat, electromagnetism, and optics. (YP)

  10. On the costs of parallel processing in dual-task performance: The case of lexical processing in word production.

    PubMed

    Paucke, Madlen; Oppermann, Frank; Koch, Iring; Jescheniak, Jörg D

    2015-12-01

    Previous dual-task picture-naming studies suggest that lexical processes require capacity-limited processes and prevent other tasks from being carried out in parallel. However, studies involving the processing of multiple pictures suggest that parallel lexical processing is possible. The present study investigated the specific costs that may arise when such parallel processing occurs. We used a novel dual-task paradigm by presenting 2 visual objects associated with different tasks and manipulating between-task similarity. With high similarity, a picture-naming task (T1) was combined with a phoneme-decision task (T2), so that lexical processes were shared across tasks. With low similarity, picture naming was combined with a size-decision T2 (nonshared lexical processes). In Experiment 1, we found that a manipulation of lexical processes (lexical frequency of the T1 object name) showed an additive propagation with low between-task similarity and an overadditive propagation with high between-task similarity. Experiment 2 replicated this differential forward propagation of the lexical effect and showed that it disappeared with longer stimulus onset asynchronies. Moreover, both experiments showed backward crosstalk, indexed as worse T1 performance with high between-task similarity compared with low similarity. Together, these findings suggest that conditions of high between-task similarity can lead to parallel lexical processing in both tasks, which, however, does not result in benefits but rather in extra performance costs. These costs can be attributed to crosstalk based on the dual-task binding problem arising from parallel processing. Hence, the present study reveals that capacity-limited lexical processing can run in parallel across dual tasks, but only at the expense of extraordinarily high costs.

  11. The effects of cyproterone acetate on sleeping and waking penile erections in pedophiles: possible implications for treatment.

    PubMed

    Cooper, A J; Cernovovsky, Z

    1992-02-01

    This study reports the short term effects in five pedophiles of the antiandrogenic drug cyproterone acetate (CPA) on nocturnal penile tumescence (NPT); penile responses to erotic stimuli in the laboratory; and sex hormones (testosterone, LH, FSH and prolactin). During the administration of CPA, NPT, laboratory arousal and hormone measures (except prolactin) all decreased. Waking laboratory measures were influenced less and were more variable (one subject showed greater arousal) than NPT measures. The changes in NPT closely paralleled the reduction in testosterone. The results are discussed with reference to the known psycho-neuroendocrinology of sleeping and waking erections.

  12. Exploratory study of the acceptance of two individual practical classes with remote labs

    NASA Astrophysics Data System (ADS)

    Tirado-Morueta, Ramón; Sánchez-Herrera, Reyes; Márquez-Sánchez, Marco A.; Mejías-Borrero, Andrés; Andujar-Márquez, José Manuel

    2018-03-01

    Remote lab experiences are proliferating in higher education, although there are still few studies that manage to build a theoretical framework for the educational assessment and design of this technology. In order to explore to what extent facilitators of proximity to the laboratory and of autonomy in the experiment make remote laboratories a technology accepted by students, two remote labs with different equipment but similar educational conditions were used. A sample of 98 undergraduate students from a degree course in Energy Engineering was used for this study; 57 of these students ran experiments in a laboratory of electrical machines and 41 in a photovoltaic systems laboratory. The data suggest that, under conditions that facilitate proximity to the laboratory and autonomy in carrying out the experiment, the experience in both laboratories was positively valued by the students. The data also suggest that the types of laboratory and experiment influence the usability (autonomy and lab proximity) perceived by students.

  13. Parallel Unsteady Overset Mesh Methodology for a Multi-Solver Paradigm with Adaptive Cartesian Grids

    DTIC Science & Technology

    2008-08-21

Engineer, U.S. Army Research Laboratory, Matthew.W.Floros@nasa.gov, AIAA Member ‡ Senior Research Scientist, Scaled Numerical Physics LLC, awissink...IV.E and IV.D). Good linear scalability was observed for all three cases up to 12 processors. Beyond that the scalability drops off depending on grid...Research Laboratory for the usage of the SUGGAR module and Yikloon Lee at NAVAIR for the usage of the NAVAIR-IHC code.

  14. Parallel Study of HEND, RAD, and DAN Instrument Response to Martian Radiation and Surface Conditions

    NASA Technical Reports Server (NTRS)

Martinez Sierra, Luz Maria; Jun, Insoo; Litvak, Maxim; Sanin, Anton; Mitrofanov, Igor; Zeitlin, Cary

    2015-01-01

Nuclear detection methods are being used to understand the radiation environment at Mars. JPL (Jet Propulsion Laboratory) assets on Mars include: Orbiter - 2001 Mars Odyssey [High Energy Neutron Detector (HEND)]; Mars Science Laboratory Rover - Curiosity [Radiation Assessment Detector (RAD); Dynamic Albedo Neutron (DAN)]. Spacecraft have instruments able to detect ionizing and non-ionizing radiation. Instrument response on orbit and on the surface of Mars to space weather and local conditions is discussed. Data are available at the NASA Planetary Data System (PDS).

  15. Parallel approach in RDF query processing

    NASA Astrophysics Data System (ADS)

    Vajgl, Marek; Parenica, Jan

    2017-07-01

A parallel approach is nowadays a cheap way to increase computational power, owing to the multithreaded computational units that are a standard part of modern personal computers and notebooks. This contribution deals with experiments on how the evaluation of a computationally complex inference algorithm over RDF data can be parallelized on graphics cards to decrease computation time.
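The data-parallel structure described above can be sketched without GPU hardware: split the triple set into chunks and evaluate the same pattern match over each chunk concurrently. A minimal sketch, assuming a hypothetical list-of-tuples triple store (the names `TRIPLES`, `match` and `parallel_query` are illustrative, not from the paper):

```python
from multiprocessing.dummy import Pool  # thread pool; enough to show the idea

# Hypothetical mini triple store: (subject, predicate, object) tuples.
TRIPLES = [
    ("alice", "knows", "bob"),
    ("bob", "knows", "carol"),
    ("alice", "type", "Person"),
    ("carol", "type", "Person"),
]

def match(triple, pattern):
    """A pattern component of None acts as a wildcard (like a SPARQL variable)."""
    return all(p is None or p == t for t, p in zip(triple, pattern))

def parallel_query(triples, pattern, workers=2):
    # Split the data into chunks and evaluate each chunk concurrently; on a
    # GPU the same data-parallel structure maps onto one thread per triple.
    chunks = [triples[i::workers] for i in range(workers)]
    with Pool(workers) as pool:
        results = pool.map(lambda c: [t for t in c if match(t, pattern)], chunks)
    return [t for chunk in results for t in chunk]

print(parallel_query(TRIPLES, (None, "type", "Person")))
# [('alice', 'type', 'Person'), ('carol', 'type', 'Person')]
```

The same map-then-merge shape carries over to a GPU kernel, where each thread tests one triple against the pattern.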

  16. Optimizing transformations of stencil operations for parallel cache-based architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bassetti, F.; Davis, K.

This paper describes a new technique for optimizing serial and parallel stencil- and stencil-like operations for cache-based architectures. The technique takes advantage of the semantic knowledge implicit in stencil-like computations. It is implemented as a source-to-source program transformation; because of its specificity it could not be expected of a conventional compiler. Empirical results demonstrate a uniform factor-of-two speedup. The experiments clearly show the benefits of this technique to be a consequence, as intended, of the reduction in cache misses. The test codes are based on a 5-point stencil obtained by the discretization of the Poisson equation and applied to a two-dimensional uniform grid using the Jacobi method as an iterative solver. Results are presented for 1-D tiling on a single processor, and in parallel using a 1-D data partition. For the parallel case both blocking and non-blocking communication are tested. The same scheme of experiments has been performed for the 2-D tiling case; however, for the parallel case the 2-D partitioning is not discussed here, so the parallel case handled for 2-D is 2-D tiling with 1-D data partitioning.
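The underlying computation can be sketched as the 5-point Jacobi sweep the abstract names, plus a 1-D row-strip traversal of the same sweep; the strip loop below illustrates the tiling idea (processing a strip at a time keeps its rows cache-resident), not the authors' actual source-to-source transformation:

```python
import numpy as np

def jacobi_step(u, f, h2):
    """One 5-point Jacobi sweep for the Poisson equation on a uniform grid."""
    new = u.copy()
    new[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                              u[1:-1, :-2] + u[1:-1, 2:] - h2 * f[1:-1, 1:-1])
    return new

def jacobi_step_tiled(u, f, h2, tile=8):
    """The same sweep traversed in 1-D strips of rows: identical results,
    but each strip's data is reused while it is still in cache."""
    new = u.copy()
    for r0 in range(1, u.shape[0] - 1, tile):
        r1 = min(r0 + tile, u.shape[0] - 1)
        new[r0:r1, 1:-1] = 0.25 * (u[r0-1:r1-1, 1:-1] + u[r0+1:r1+1, 1:-1] +
                                   u[r0:r1, :-2] + u[r0:r1, 2:] - h2 * f[r0:r1, 1:-1])
    return new

rng = np.random.default_rng(0)
u = rng.random((32, 32))
f = np.zeros((32, 32))
assert np.allclose(jacobi_step(u, f, 1.0), jacobi_step_tiled(u, f, 1.0))
```

The assertion confirms the transformation is purely a reordering of the traversal; the arithmetic per grid point is unchanged.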

  17. A Multi-Time Scale Morphable Software Milieu for Polymorphous Computing Architectures (PCA) - Composable, Scalable Systems

    DTIC Science & Technology

    2004-10-01

MONITORING AGENCY NAME(S) AND ADDRESS(ES) Defense Advanced Research Projects Agency AFRL/IFTC 3701 North Fairfax Drive..."Scalable Parallel Libraries for Large-Scale Concurrent Applications," Technical Report UCRL-JC-109251, Lawrence Livermore National Laboratory

  18. Zero-gravity cloud physics laboratory: Experiment program definition and preliminary laboratory concept studies

    NASA Technical Reports Server (NTRS)

    Eaton, L. R.; Greco, E. V.

    1973-01-01

    The experiment program definition and preliminary laboratory concept studies on the zero G cloud physics laboratory are reported. This program involves the definition and development of an atmospheric cloud physics laboratory and the selection and delineations of a set of candidate experiments that must utilize the unique environment of zero gravity or near zero gravity.

  19. Sequence Memory Constraints Give Rise to Language-Like Structure through Iterated Learning

    PubMed Central

    Cornish, Hannah; Dale, Rick; Kirby, Simon; Christiansen, Morten H.

    2017-01-01

    Human language is composed of sequences of reusable elements. The origins of the sequential structure of language is a hotly debated topic in evolutionary linguistics. In this paper, we show that sets of sequences with language-like statistical properties can emerge from a process of cultural evolution under pressure from chunk-based memory constraints. We employ a novel experimental task that is non-linguistic and non-communicative in nature, in which participants are trained on and later asked to recall a set of sequences one-by-one. Recalled sequences from one participant become training data for the next participant. In this way, we simulate cultural evolution in the laboratory. Our results show a cumulative increase in structure, and by comparing this structure to data from existing linguistic corpora, we demonstrate a close parallel between the sets of sequences that emerge in our experiment and those seen in natural language. PMID:28118370

  20. A study of Channeling, Volume Reflection and Volume Capture of 3.35 - 14.0 GeV Electrons in a bent Silicon Crystal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wistisen, T. N.; Uggerhoj, U. I.; Wienands, U.

    2015-12-03

We present the experimental data and analysis of experiments conducted at SLAC National Accelerator Laboratory investigating the processes of channeling, volume reflection and volume capture along the (111) plane in a strongly bent quasi-mosaic silicon crystal. These phenomena were investigated at 5 energies: 3.35, 4.2, 6.3, 10.5 and 14.0 GeV with a crystal with a bending radius of 0.15 m, corresponding to curvatures of 0.070, 0.088, 0.13, 0.22 and 0.29 times the critical curvature, respectively. We have extracted important parameters describing the channeling process such as the dechanneling length, the angle of volume reflection, the surface transmission and the widths of the distribution of channeled particles parallel and orthogonal to the plane.

  1. Issues in human/computer control of dexterous remote hands

    NASA Technical Reports Server (NTRS)

    Salisbury, K.

    1987-01-01

    Much research on dexterous robot hands has been aimed at the design and control problems associated with their autonomous operation, while relatively little research has addressed the problem of direct human control. It is likely that these two modes can be combined in a complementary manner yielding more capability than either alone could provide. While many of the issues in mixed computer/human control of dexterous hands parallel those found in supervisory control of traditional remote manipulators, the unique geometry and capabilities of dexterous hands pose many new problems. Among these are the control of redundant degrees of freedom, grasp stabilization and specification of non-anthropomorphic behavior. An overview is given of progress made at the MIT AI Laboratory in control of the Salisbury 3 finger hand, including experiments in grasp planning and manipulation via controlled slip. It is also suggested how we might introduce human control into the process at a variety of functional levels.

  2. Formation of collisionless shocks in magnetized plasma interaction with kinetic-scale obstacles

    DOE PAGES

    Cruz, F.; Alves, E. P.; Bamford, R. A.; ...

    2017-02-06

We investigate the formation of collisionless magnetized shocks triggered by the interaction between magnetized plasma flows and miniature-sized (order of plasma kinetic-scales) magnetic obstacles resorting to massively parallel, full particle-in-cell simulations, including the electron kinetics. The critical obstacle size to generate a compressed plasma region ahead of these objects is determined by independently varying the magnitude of the dipolar magnetic moment and the plasma magnetization. Here we find that the effective size of the obstacle depends on the relative orientation between the dipolar and plasma internal magnetic fields, and we show that this may be critical to form a shock in small-scale structures. We also study the microphysics of the magnetopause in different magnetic field configurations in 2D and compare the results with full 3D simulations. Finally, we evaluate the parameter range where such miniature magnetized shocks can be explored in laboratory experiments.

  3. A study of selected radiation and propagation problems related to antennas and probes in magneto-ionic media

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Research consisted of computations toward the solution of the problem of the current distribution on a cylindrical antenna in a magnetoplasma. The case of an antenna parallel to the applied magnetic field was investigated. A systematic method of asymptotic expansion was found which simplifies the solution in the general case by giving the field of a dipole even at relatively short range. Some useful properties of the dispersion surfaces in a lossy medium have also been found. A laboratory experiment was directed toward evaluating nonlinear effects, such as those due to power level, bias voltage and electron heating. The problem of reflection and transmission of waves in an electron heated plasma was treated theoretically. The profile inversion problem has been pursued. Some results are very encouraging, however, the general question of stability of the solution remains unsolved.

  4. Behavioural social choice: a status report.

    PubMed

    Regenwetter, Michel; Grofman, Bernard; Popova, Anna; Messner, William; Davis-Stober, Clintin P; Cavagnaro, Daniel R

    2009-03-27

    Behavioural social choice has been proposed as a social choice parallel to seminal developments in other decision sciences, such as behavioural decision theory, behavioural economics, behavioural finance and behavioural game theory. Behavioural paradigms compare how rational actors should make certain types of decisions with how real decision makers behave empirically. We highlight that important theoretical predictions in social choice theory change dramatically under even minute violations of standard assumptions. Empirical data violate those critical assumptions. We argue that the nature of preference distributions in electorates is ultimately an empirical question, which social choice theory has often neglected. We also emphasize important insights for research on decision making by individuals. When researchers aggregate individual choice behaviour in laboratory experiments to report summary statistics, they are implicitly applying social choice rules. Thus, they should be aware of the potential for aggregation paradoxes. We hypothesize that such problems may substantially mar the conclusions of a number of (sometimes seminal) papers in behavioural decision research.

  5. Comparison of electric dipole and magnetic loop antennas for exciting whistler modes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stenzel, R. L.; Urrutia, J. M.

    2016-08-15

The excitation of low frequency whistler modes from different antennas has been investigated experimentally in a large laboratory plasma. One antenna consists of a linear electric dipole oriented across the uniform ambient magnetic field B₀. The other antenna is an elongated loop with dipole moment parallel to B₀. Both antennas are driven by the same rf generator, which produces an rf burst well below the electron cyclotron frequency. The antenna currents as well as the wave magnetic fields from each antenna are measured. Both the antenna currents and the wave fields of the loop antenna exceed those of the electric dipole by two orders of magnitude. The conclusion is that loop antennas are far superior to dipole antennas for exciting large amplitude whistler modes, a result important for active wave experiments in space plasmas.

  6. Sequence Memory Constraints Give Rise to Language-Like Structure through Iterated Learning.

    PubMed

    Cornish, Hannah; Dale, Rick; Kirby, Simon; Christiansen, Morten H

    2017-01-01

    Human language is composed of sequences of reusable elements. The origins of the sequential structure of language is a hotly debated topic in evolutionary linguistics. In this paper, we show that sets of sequences with language-like statistical properties can emerge from a process of cultural evolution under pressure from chunk-based memory constraints. We employ a novel experimental task that is non-linguistic and non-communicative in nature, in which participants are trained on and later asked to recall a set of sequences one-by-one. Recalled sequences from one participant become training data for the next participant. In this way, we simulate cultural evolution in the laboratory. Our results show a cumulative increase in structure, and by comparing this structure to data from existing linguistic corpora, we demonstrate a close parallel between the sets of sequences that emerge in our experiment and those seen in natural language.

  7. Xyce™ Parallel Electronic Simulator Users' Guide, Version 6.5.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R.; Aadithya, Karthik V.; Mei, Ting

This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: Capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors). This includes support for most popular parallel and serial computers. A differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms. This allows one to develop new types of analysis without requiring the implementation of analysis-specific device models. Device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only). Object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase -- a message passing parallel implementation -- which allows it to run efficiently on a wide range of computing platforms. These include serial, shared-memory and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. The information herein is subject to change without notice. Copyright © 2002-2016 Sandia Corporation. All rights reserved.
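The DAE formulation mentioned above can be illustrated with an assumed minimal example (not Xyce's internals): a series RC circuit stepped with backward Euler, where each time step solves the implicit circuit equation rather than applying an explicit update:

```python
# Backward-Euler time stepping of the circuit equation for a series RC
# circuit driven by a constant source Vs: C*v' + v/R = Vs/R at the
# capacitor node. Each step solves the implicit equation
#     C*(v_new - v)/dt + v_new/R = Vs/R
# for v_new, the pattern a DAE-based circuit simulator generalizes.
R, C, Vs, dt = 1e3, 1e-6, 5.0, 1e-5   # 1 kOhm, 1 uF, 5 V, 10 us step
v = 0.0
for _ in range(1000):                  # simulate 10 ms, i.e. 10 time constants
    v = (C / dt * v + Vs / R) / (C / dt + 1.0 / R)

print(round(v, 3))  # capacitor has charged to ~Vs: prints 5.0
```

The implicit solve is what makes the method robust for stiff circuit equations; an explicit update with the same step size would track the same exponential but without that stability margin.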

  8. MPI implementation of PHOENICS: A general purpose computational fluid dynamics code

    NASA Astrophysics Data System (ADS)

    Simunovic, S.; Zacharia, T.; Baltas, N.; Spalding, D. B.

    1995-03-01

    PHOENICS is a suite of computational analysis programs that are used for simulation of fluid flow, heat transfer, and dynamical reaction processes. The parallel version of the solver EARTH for the Computational Fluid Dynamics (CFD) program PHOENICS has been implemented using Message Passing Interface (MPI) standard. Implementation of MPI version of PHOENICS makes this computational tool portable to a wide range of parallel machines and enables the use of high performance computing for large scale computational simulations. MPI libraries are available on several parallel architectures making the program usable across different architectures as well as on heterogeneous computer networks. The Intel Paragon NX and MPI versions of the program have been developed and tested on massively parallel supercomputers Intel Paragon XP/S 5, XP/S 35, and Kendall Square Research, and on the multiprocessor SGI Onyx computer at Oak Ridge National Laboratory. The preliminary testing results of the developed program have shown scalable performance for reasonably sized computational domains.
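The message-passing structure of such a solver can be sketched in miniature: each rank owns a slab of the grid plus ghost cells that are refreshed from its neighbours before every sweep. The sketch below emulates the exchange serially with plain lists; the function names are illustrative stand-ins for the MPI_Send/MPI_Recv pairs an MPI code would use:

```python
# Serial illustration of the 1-D domain decomposition used by message-passing
# CFD solvers: each rank owns a slab of cells plus one ghost cell per side.
def decompose(n, ranks):
    """Split n cells as evenly as possible over the given number of ranks."""
    base, rem = divmod(n, ranks)
    return [base + (1 if r < rem else 0) for r in range(ranks)]

def exchange_ghosts(slabs):
    """Copy boundary cells into the neighbouring slab's ghost cells,
    standing in for matched MPI_Send/MPI_Recv calls."""
    for r in range(len(slabs) - 1):
        slabs[r][-1] = slabs[r + 1][1]   # right ghost <- neighbour's first interior
        slabs[r + 1][0] = slabs[r][-2]   # left ghost  <- neighbour's last interior
    return slabs

# 8 interior cells split over 2 ranks; index 0 and -1 of each slab are ghosts.
data = list(range(8))
slabs, start = [], 0
for size in decompose(len(data), 2):
    slabs.append([None] + data[start:start + size] + [None])
    start += size
exchange_ghosts(slabs)
print(slabs)  # [[None, 0, 1, 2, 3, 4], [3, 4, 5, 6, 7, None]]
```

After the exchange each rank can apply its stencil to its interior cells without touching the neighbour's memory, which is what makes the scheme portable across shared- and distributed-memory machines.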

  9. MPI implementation of PHOENICS: A general purpose computational fluid dynamics code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simunovic, S.; Zacharia, T.; Baltas, N.

    1995-04-01

PHOENICS is a suite of computational analysis programs that are used for simulation of fluid flow, heat transfer, and dynamical reaction processes. The parallel version of the solver EARTH for the Computational Fluid Dynamics (CFD) program PHOENICS has been implemented using Message Passing Interface (MPI) standard. Implementation of MPI version of PHOENICS makes this computational tool portable to a wide range of parallel machines and enables the use of high performance computing for large scale computational simulations. MPI libraries are available on several parallel architectures making the program usable across different architectures as well as on heterogeneous computer networks. The Intel Paragon NX and MPI versions of the program have been developed and tested on massively parallel supercomputers Intel Paragon XP/S 5, XP/S 35, and Kendall Square Research, and on the multiprocessor SGI Onyx computer at Oak Ridge National Laboratory. The preliminary testing results of the developed program have shown scalable performance for reasonably sized computational domains.

  10. Fast experiments for structure elucidation of small molecules: Hadamard NMR with multiple receivers.

    PubMed

    Gierth, Peter; Codina, Anna; Schumann, Frank; Kovacs, Helena; Kupče, Ēriks

    2015-11-01

We propose several significant improvements to the PANSY (Parallel NMR SpectroscopY) experiments: PANSY-COSY and PANSY-TOCSY. The improved versions of these experiments provide sufficient spectral information for structure elucidation of small organic molecules from just two 2D experiments. The PANSY-TOCSY-Q experiment has been modified to allow simultaneous acquisition of three different types of NMR spectra: 1D C-13 of non-protonated carbon sites, 2D TOCSY and multiplicity-edited 2D HETCOR. In addition, the J-filtered 2D PANSY-gCOSY experiment records a 2D HH gCOSY spectrum in parallel with a 1J-filtered HC long-range HETCOR spectrum and offers simplified data processing. Beyond parallel acquisition, further time savings are feasible because of significantly smaller F1 spectral windows as compared to the indirect-detection experiments. Use of cryoprobes and multiple receivers can significantly alleviate the sensitivity issues that are usually associated with so-called direct-detection experiments. In cases where experiments are sampling-limited rather than sensitivity-limited, further reduction of experiment time is achieved by using Hadamard encoding. In favorable cases the total recording time for the two PANSY experiments can be reduced to just 40 s. The proposed PANSY experiments provide sufficient information to allow the CMCse software package (Bruker) to solve structures of small organic molecules. Copyright © 2015 John Wiley & Sons, Ltd.
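The Hadamard-encoding time saving rests on multiplexing: each scan records a +/- combination of all signals at once, and the set is unmixed with the transposed Hadamard matrix. A minimal numerical sketch of that idea (illustrative signal values, not NMR pulse-sequence details):

```python
import numpy as np

def hadamard(n):
    """Sylvester construction: H(2m) = [[H, H], [H, -H]]; n must be a power of 2."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

signals = np.array([3.0, 1.0, 4.0, 1.5])  # four hypothetical 1-D responses
H = hadamard(4)
encoded = H @ signals                      # four scans, each a +/- mix of all lines
decoded = (H.T @ encoded) / 4              # H is orthogonal up to scale: H.T @ H = 4*I
assert np.allclose(decoded, signals)
```

Because every scan carries information about every line, N lines are resolved in N scans while each line is effectively averaged N times, which is the multiplexing advantage the abstract exploits.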

  11. Self-sustained radial oscillating flows between parallel disks

    NASA Astrophysics Data System (ADS)

    Mochizuki, S.; Yang, W.-J.

    1985-05-01

    It is pointed out that radial flow between parallel circular disks is of interest in a number of physical systems such as hydrostatic air bearings, radial diffusers, and VTOL aircraft with centrally located downward-positioned jets. The present investigation is concerned with the problem of instability in radial flow between parallel disks. A time-dependent numerical study and experiments are conducted. Both approaches reveal the nucleation, growth, migration, and decay of annular separation bubbles (i.e. vortex or recirculation zones) in the laminar-flow region. A finite-difference technique is utilized to solve the full unsteady vorticity transport equation in the theoretical procedure, while the flow patterns in the experiments are visualized with the aid of dye-injection, hydrogen-bubble, and paraffin-mist methods. It is found that the separation and reattachment of shear layers in the radial flow through parallel disks are unsteady phenomena. The sequence of nucleation, growth, migration, and decay of the vortices is self-sustained.

  12. Laboratory Mid-frequency (Kilohertz) Range Seismic Property Measurements and X-ray CT Imaging of Fractured Sandstone Cores During Supercritical CO2 Injection

    NASA Astrophysics Data System (ADS)

    Nakagawa, S.; Kneafsey, T. J.; Chang, C.; Harper, E.

    2014-12-01

    During geological sequestration of CO2, fractures are expected to play a critical role in controlling the migration of the injected fluid in reservoir rock. To detect the invasion of supercritical (sc-) CO2 and to determine its saturation, velocity and attenuation of seismic waves can be monitored. When both fractures and matrix porosity connected to the fractures are present, wave-induced dynamic poroelastic interactions between these two different types of rock porosity—high-permeability, high-compliance fractures and low-permeability, low-compliance matrix porosity—result in complex velocity and attenuation changes of compressional waves as scCO2 invades the rock. We conducted core-scale laboratory scCO2 injection experiments on small (diameter 1.5 inches, length 3.5-4 inches), medium-porosity/permeability (porosity 15%, matrix permeability 35 md) sandstone cores. During the injection, the compressional and shear (torsion) wave velocities and attenuations of the entire core were determined using our Split Hopkinson Resonant Bar (short-core resonant bar) technique in the frequency range of 1-2 kHz, and the distribution and saturation of the scCO2 determined via X-ray CT imaging using a medical CT scanner. A series of tests were conducted on (1) intact rock cores, (2) a core containing a mated, core-parallel fracture, (3) a core containing a sheared core-parallel fracture, and (4) a core containing a sheared, core-normal fracture. For intact cores and a core containing a mated sheared fracture, injections of scCO2 into an initially water-saturated sample resulted in large and continuous decreases in the compressional velocity as well as temporary increases in the attenuation. For a sheared core-parallel fracture, large attenuation was also observed, but almost no changes in the velocity occurred. 
In contrast, a sample containing a core-normal fracture exhibited complex behavior of compressional wave attenuation: the attenuation peaked as the leading edge of the scCO2 approached the fracture; followed by an immediate drop as scCO2 invaded the fracture; and by another, gradual increase as the scCO2 infiltrated into the other side of the fracture. The compressional wave velocity declined monotonically, but the rate of velocity decrease changed with the changes in attenuation.

  13. Bubble-detector measurements of neutron radiation in the international space station: ISS-34 to ISS-37.

    PubMed

Smith, M B; Khulapko, S; Andrews, H R; Arkhangelsky, V; Ing, H; Koslowsky, M R; Lewis, B J; Machrafi, R; Nikolaev, I; Shurshakov, V

    2016-02-01

    Bubble detectors have been used to characterise the neutron dose and energy spectrum in several modules of the International Space Station (ISS) as part of an ongoing radiation survey. A series of experiments was performed during the ISS-34, ISS-35, ISS-36 and ISS-37 missions between December 2012 and October 2013. The Radi-N2 experiment, a repeat of the 2009 Radi-N investigation, included measurements in four modules of the US orbital segment: Columbus, the Japanese experiment module, the US laboratory and Node 2. The Radi-N2 dose and spectral measurements are not significantly different from the Radi-N results collected in the same ISS locations, despite the large difference in solar activity between 2009 and 2013. Parallel experiments using a second set of detectors in the Russian segment of the ISS included the first characterisation of the neutron spectrum inside the tissue-equivalent Matroshka-R phantom. These data suggest that the dose inside the phantom is ∼70% of the dose at its surface, while the spectrum inside the phantom contains a larger fraction of high-energy neutrons than the spectrum outside the phantom. The phantom results are supported by Monte Carlo simulations that provide good agreement with the empirical data. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  14. Translation Ambiguity in and out of Context

    ERIC Educational Resources Information Center

    Prior, Anat; Wintner, Shuly; MacWhinney, Brian; Lavie, Alon

    2011-01-01

    We compare translations of single words, made by bilingual speakers in a laboratory setting, with contextualized translation choices of the same items, made by professional translators and extracted from parallel language corpora. The translation choices in both cases show moderate convergence, demonstrating that decontextualized translation…

  15. Spherical harmonic results for the 3D Kobayashi Benchmark suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, P N; Chang, B; Hanebutte, U R

    1999-03-02

    Spherical harmonic solutions are presented for the Kobayashi benchmark suite. The results were obtained with Ardra, a scalable, parallel neutron transport code developed at Lawrence Livermore National Laboratory (LLNL). The calculations were performed on the IBM ASCI Blue-Pacific computer at LLNL.

  16. Parallels in the Arts

    ERIC Educational Resources Information Center

    Laffey, Grace

    1972-01-01

    A mini-course of nine weeks was organized as a laboratory course to survey relationships in literature, music, and art. Three periods in the arts (Romanticism, Impressionism, and Contemporary) were matched with three major activities; the basic areas of study and activity were poetry, short story, and novel. (Author)

  17. Analysis of Mancos shale failure in light of localization theory for transversely isotropic materials.

    NASA Astrophysics Data System (ADS)

    Ingraham, M. D.; Dewers, T. A.; Heath, J. E.

    2016-12-01

Utilizing the localization conditions laid out in Rudnicki 2002, the failure of a series of tests performed on Mancos shale has been analyzed. Shale specimens were tested under constant mean stress conditions in an axisymmetric stress state, with specimens cored both parallel and perpendicular to bedding. Failure data indicate that, for the range of pressures tested, the failure surface is well represented by a Mohr-Coulomb failure surface with a friction angle of 34.4 degrees for specimens cored parallel to bedding, and 26.5 degrees for specimens cored perpendicular to bedding. There is no evidence of a yield cap up to 200 MPa mean stress. Comparison with the theory shows that the best agreement in terms of band angles comes from assuming normality of the plastic strain increment. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
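The two friction angles quoted above translate directly into anisotropic shear strength through the Mohr-Coulomb criterion tau = c + sigma_n * tan(phi). A minimal sketch evaluating the criterion at both coring orientations (the cohesion value of zero is purely illustrative, not a fitted parameter from the study):

```python
import math

def mohr_coulomb_strength(sigma_n, friction_angle_deg, cohesion=0.0):
    """Shear stress at failure: tau = c + sigma_n * tan(phi)."""
    return cohesion + sigma_n * math.tan(math.radians(friction_angle_deg))

# Strength anisotropy at 50 MPa normal stress for the two coring
# orientations reported in the abstract (cohesion assumed zero).
parallel = mohr_coulomb_strength(50.0, 34.4)       # cored parallel to bedding
perpendicular = mohr_coulomb_strength(50.0, 26.5)  # cored perpendicular to bedding
print(round(parallel, 1), round(perpendicular, 1))  # 34.2 24.9 (MPa)
```

At any given normal stress, specimens cored parallel to bedding are predicted to sustain roughly a third more shear stress than those cored perpendicular, which is the anisotropy the friction angles encode.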

  18. Experimental Fluidic Investigation of Degradation of Pico-liter Oil Droplets by Physical and Biological Processes

    NASA Astrophysics Data System (ADS)

    Jalali, Maryam; Sheng, Jian

    2016-11-01

This study used laboratory experiments to assess degradation of crude oil by physical and biological processes including dissolution and consumption. To perform this study, we have developed a bioassay that consists of a flow chamber with a bottom glass substrate printed with an array of pico-liter oil droplets using micro-Transfer Printing. The technique allows the printing of highly homogeneous pico-liter droplet arrays with different dimensions and shapes that can be maintained for weeks. Since the droplets are pinned and stationary on the bottom substrate, the key processes can be evaluated by measuring the change of shape and volume using Atomic Force Microscopy. Parallel microfluidic bioassays are established at the beginning, exposed to abiotic/biotic solutions, and sacrificed for characterization at given time intervals for each experiment. Two processes, dissolution and consumption, are investigated. In addition, the effects of dispersant on these processes are also studied. The results show that the amount of oil degraded by bacteria accounts for almost 50% of the total volume in comparison to 25% via dissolution. Although dispersant has a subtle effect on dissolution, the effect on rates of consumption and its asymptotic behavior are substantial. Experiments involving different bacterial strains, dispersant concentrations, and flow shear rates are on-going.

  19. Extreme Performance Scalable Operating Systems Final Progress Report (July 1, 2008 - October 31, 2011)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malony, Allen D; Shende, Sameer

This is the final progress report for the FastOS (Phase 2) (FastOS-2) project with Argonne National Laboratory and the University of Oregon (UO). The project started at UO on July 1, 2008 and ran until April 30, 2010, at which time a six-month no-cost extension began. The FastOS-2 work at UO delivered excellent results in all research work areas: scalable parallel monitoring; kernel-level performance measurement; parallel I/O system measurement; large-scale and hybrid application performance measurement; online scalable performance data reduction and analysis; and binary instrumentation.

  20. Flexible All-Digital Receiver for Bandwidth Efficient Modulations

    NASA Technical Reports Server (NTRS)

    Gray, Andrew; Srinivasan, Meera; Simon, Marvin; Yan, Tsun-Yee

    2000-01-01

    An all-digital high data rate parallel receiver architecture developed jointly by Goddard Space Flight Center and the Jet Propulsion Laboratory is presented. This receiver utilizes only a small number of high speed components along with a majority of lower speed components operating in a parallel frequency domain structure implementable in CMOS, and can currently process up to 600 Mbps with standard QPSK modulation. Performance results for this receiver for bandwidth efficient QPSK modulation schemes such as square-root raised cosine pulse shaped QPSK and Feher's patented QPSK are presented, demonstrating the flexibility of the receiver architecture.

  1. How to say no: single- and dual-process theories of short-term recognition tested on negative probes.

    PubMed

    Oberauer, Klaus

    2008-05-01

    Three experiments with short-term recognition tasks are reported. In Experiments 1 and 2, participants decided whether a probe matched a list item specified by its spatial location. Items presented at study in a different location (intrusion probes) had to be rejected. Serial position curves of positive, new, and intrusion probes over the probed location's position were mostly parallel. Serial position curves of intrusion probes over their position of origin were again parallel to those of positive probes. Experiment 3 showed largely parallel serial position effects for positive probes and for intrusion probes plotted over positions in a relevant and an irrelevant list, respectively. The results support a dual-process theory in which recognition is based on familiarity and recollection, and recollection uses 2 retrieval routes, from context to item and from item to context.

  2. A Novel Design of 4-Class BCI Using Two Binary Classifiers and Parallel Mental Tasks

    PubMed Central

    Geng, Tao; Gan, John Q.; Dyson, Matthew; Tsui, Chun SL; Sepulveda, Francisco

    2008-01-01

    A novel 4-class single-trial brain computer interface (BCI) based on two (rather than four or more) binary linear discriminant analysis (LDA) classifiers is proposed, which is called a “parallel BCI.” Unlike other BCIs where mental tasks are executed and classified in a serial way one after another, the parallel BCI uses properly designed parallel mental tasks that are executed on both sides of the subject body simultaneously, which is the main novelty of the BCI paradigm used in our experiments. Each of the two binary classifiers only classifies the mental tasks executed on one side of the subject body, and the results of the two binary classifiers are combined to give the result of the 4-class BCI. Data was recorded in experiments with both real movement and motor imagery in 3 able-bodied subjects. Artifacts were not detected or removed. Offline analysis has shown that, in some subjects, the parallel BCI can generate a higher accuracy than a conventional 4-class BCI, although both of them have used the same feature selection and classification algorithms. PMID:18584040
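    The core of the scheme above is that two binary decisions, one per body side, jointly determine one of four classes. A minimal sketch of that combination step (illustrative only; the encoding of sides to bits is an assumption, not the authors' code):

```python
# Illustrative sketch of the "parallel BCI" combination step: two
# binary classifier outputs (one per body side) are merged into a
# single 4-class label. Treating the left-side decision as the high
# bit is an arbitrary assumption for illustration.

def combine_parallel_bci(left_decision: int, right_decision: int) -> int:
    """Map two binary decisions in {0, 1} to one class in {0, 1, 2, 3}."""
    if left_decision not in (0, 1) or right_decision not in (0, 1):
        raise ValueError("binary decisions must be 0 or 1")
    return 2 * left_decision + right_decision

# Example: left classifier outputs 1, right outputs 0 -> class 2
print(combine_parallel_bci(1, 0))
```

Each binary LDA classifier thus only needs to discriminate two mental tasks, which is an easier problem than direct 4-class discrimination.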

  3. Architectures for reasoning in parallel

    NASA Technical Reports Server (NTRS)

    Hall, Lawrence O.

    1989-01-01

    The research conducted has dealt with rule-based expert systems. Algorithms that may lead to their effective parallelization were investigated, covering both the forward- and backward-chained control paradigms, along with the computer architectures best suited to the developed algorithms. Two experimental vehicles were developed to facilitate this research: Backpac, a parallel backward-chained rule-based reasoning system, and Datapac, a parallel forward-chained rule-based reasoning system. Both systems are written in Multilisp, a version of Lisp which contains the parallel construct future. Applying future to a function call causes that call to be evaluated as a task running in parallel with the spawning task. Additionally, Backpac and Datapac have been run on several disparate parallel processors: an Encore Multimax with 10 processors, the Concert Multiprocessor with 64 processors, and a 32-processor BBN GP1000. Both the Concert and the GP1000 are switch-based machines, while the Multimax hangs all its processors off a common bus. All are shared-memory machines, but they differ in how and where the memory is shared. The main results of the investigations come from experiments on the 10-processor Encore and on the Concert with partitions of 32 or fewer processors. Additionally, experiments have been run with a stripped-down version of EMYCIN.

  4. Research on Parallel Three Phase PWM Converters based on RTDS

    NASA Astrophysics Data System (ADS)

    Xia, Yan; Zou, Jianxiao; Li, Kai; Liu, Jingbo; Tian, Jun

    2018-01-01

    Parallel operation of converters can increase the capacity of the system, but it may lead to a potential zero-sequence circulating current, so suppressing this current is an important goal in the design of parallel inverters. In this paper, the Real Time Digital Simulator (RTDS) is used to model the parallel-converter system in real time and to study circulating-current suppression. An equivalent model of two parallel converters and the zero-sequence circulating current (ZSCC) was established and analyzed, and a strategy using variable zero-vector control was then proposed to suppress the circulating current. For two parallel modular converters, a hardware-in-the-loop (HIL) study based on RTDS and a practical experiment were implemented; the results show that the proposed control strategy is feasible and effective.

  5. Integration of next-generation sequencing in clinical diagnostic molecular pathology laboratories for analysis of solid tumours; an expert opinion on behalf of IQN Path ASBL.

    PubMed

    Deans, Zandra C; Costa, Jose Luis; Cree, Ian; Dequeker, Els; Edsjö, Anders; Henderson, Shirley; Hummel, Michael; Ligtenberg, Marjolijn Jl; Loddo, Marco; Machado, Jose Carlos; Marchetti, Antonio; Marquis, Katherine; Mason, Joanne; Normanno, Nicola; Rouleau, Etienne; Schuuring, Ed; Snelson, Keeda-Marie; Thunnissen, Erik; Tops, Bastiaan; Williams, Gareth; van Krieken, Han; Hall, Jacqueline A

    2017-01-01

    The clinical demand for mutation detection within multiple genes from a single tumour sample requires molecular diagnostic laboratories to develop rapid, high-throughput, highly sensitive, accurate and parallel testing within tight budget constraints. To meet this demand, many laboratories employ next-generation sequencing (NGS) based on small amplicons. Building on existing publications and general guidance for the clinical use of NGS and learnings from germline testing, the following guidelines establish consensus standards for somatic diagnostic testing, specifically for identifying and reporting mutations in solid tumours. These guidelines cover the testing strategy, implementation of testing within clinical service, sample requirements, data analysis and reporting of results. In conjunction with appropriate staff training and international standards for laboratory testing, these consensus standards for the use of NGS in molecular pathology of solid tumours will assist laboratories in implementing NGS in clinical services.

  6. The chemistry teaching laboratory: The student perspective

    NASA Astrophysics Data System (ADS)

    Polles, John Steven

    In this study, I investigated the student's experiences in the chemistry teaching laboratory and the meaning that she or he derived from these experiences. This study sought to answer these questions: (1) What was the student's experience in the teaching laboratory? (2) What aspects of the laboratory experience did the student value? (3) What beliefs did the student hold concerning the role of the laboratory experience in developing her or his understanding of chemistry? Students involved in an introductory chemistry course at Purdue University were asked to complete a two-part questionnaire consisting of 16 scaled-response and 5 free-response items, and 685 did so. Fourteen students also participated in a semi-structured individual interview. The questionnaire and interview were designed to probe the students' perceived experience and answer the above questions. I found that students possess strong conceptions of the laboratory experience: a pre-conception that colors their experience from the outset, and a post-conception that is a mix of positive and negative reflections. I also found that the learner holds a deep implicit value in the laboratory experience. The other major finding was that the students' lived experience is dramatically shaped by external agencies, primarily the faculty (and by extension the teaching assistants). There is much debate in the extant literature over the learning value of the science teaching laboratory, but it is all from the perspective of faculty, curriculum designers, and administrators. This study adds the students' voice to the argument.

  7. Partitioning evapotranspiration fluxes with water stable isotopic measurements: from the lab to the field

    NASA Astrophysics Data System (ADS)

    Quade, M. E.; Brueggemann, N.; Graf, A.; Rothfuss, Y.

    2017-12-01

    Water stable isotopes are powerful tools for partitioning net water fluxes such as evapotranspiration (ET) into soil evaporation (E) and plant transpiration (T). The isotopic methodology for ET partitioning is based on the fact that E and T have distinct water stable isotopic compositions, which in turn relies on the fact that each flux is affected differently by isotopic kinetic effects. An important task to be performed in parallel with field measurements is to better characterize these kinetic effects in the laboratory under controlled conditions. A soil evaporation laboratory experiment was conducted to retrieve characteristic values of the kinetic fractionation factor (αK) under varying soil and atmospheric water conditions. For this we used a combined soil and atmosphere column to monitor the soil and atmospheric water isotopic composition profiles at high temporal and vertical resolution in a nondestructive manner by combining micro-porous membranes and laser spectroscopy. αK was calculated by using a well-known isotopic evaporation model in inverse mode with the isotopic composition of E as one input variable, which was determined using a micro-Keeling regression plot. Knowledge of αK was further used in the field (Selhausen, North Rhine-Westphalia, Germany) to partition ET of catch crops and sugar beet (Beta vulgaris) during one growing season. Soil and atmospheric water isotopic profiles were measured automatically across depths and heights following a similar modus operandi as in the laboratory experiment. Additionally, a newly developed continuously moving elevator was used to obtain water vapor isotopic composition profiles with high vertical resolution between the soil surface, plant canopy and atmosphere. Finally, soil and plant samples were collected destructively to provide a comparison with traditional isotopic methods.
Our results illustrate the changing proportions of T and E along the growing season and demonstrate the applicability of our new non-destructive approach to field conditions.

  8. Stress-dependent elastic properties of shales—laboratory experiments at seismic and ultrasonic frequencies

    NASA Astrophysics Data System (ADS)

    Szewczyk, Dawid; Bauer, Andreas; Holt, Rune M.

    2018-01-01

    Knowledge about the stress sensitivity of elastic properties and velocities of shales is important for the interpretation of seismic time-lapse data taken as part of reservoir and caprock surveillance of both unconventional and conventional oil and gas fields (e.g. during 4-D monitoring of CO2 storage). Rock physics models are often developed based on laboratory measurements at ultrasonic frequencies. However, as shown previously, shales exhibit large seismic dispersion, and it is possible that the stress sensitivities of velocities are also frequency dependent. In this work, we report on a series of seismic and ultrasonic laboratory tests in which the stress sensitivity of elastic properties of Mancos shale and Pierre shale I were investigated. The shales were tested at different water saturations. Dynamic rock engineering parameters and elastic wave velocities were examined on core plugs exposed to isotropic loading. Experiments were carried out in an apparatus allowing for static-compaction and dynamic measurements at seismic and ultrasonic frequencies within a single test. For both shale types, we present and discuss experimental results that demonstrate dispersion and stress sensitivity of the rock stiffness, as well as P- and S-wave velocities and stiffness anisotropy. Our experimental results show that the stress sensitivity of shales is different at seismic and ultrasonic frequencies, which can be linked to simultaneously occurring changes in dispersion with applied stress. The measured stress sensitivity of elastic properties for relatively dry samples was higher at seismic frequencies; however, increasing saturation of the shales decreases the difference between seismic and ultrasonic stress sensitivities, and for moist samples the stress sensitivity is higher at ultrasonic frequencies. At the same time, increased saturation strongly increases the dispersion in shales. 
We have also found that the stress-sensitivity is highly anisotropic in both shales and that in some of the cases higher stress-sensitivity of elastic properties can be seen in the direction parallel to the bedding plane.

  9. Combining Experiments and Simulations of Extraction Kinetics and Thermodynamics in Advanced Separation Processes for Used Nuclear Fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nilsson, Mikael

    This 3-year project was a collaboration between the University of California Irvine (UC Irvine), Pacific Northwest National Laboratory (PNNL), Idaho National Laboratory (INL), Argonne National Laboratory (ANL) and an international collaborator at Forschungszentrum Jülich (FZJ). The project was led from UC Irvine under the direction of Profs. Mikael Nilsson and Hung Nguyen. The leads at PNNL, INL, ANL and FZJ were Dr. Liem Dang, Dr. Peter Zalupski, Dr. Nathaniel Hoyt and Dr. Giuseppe Modolo, respectively. Involved in this project at UC Irvine were three full-time PhD graduate students, Tro Babikian, Ted Yoo, and Quynh Vo, and one MS student, Alba Font Bosch. The overall objective of this project was to study how the kinetics and thermodynamics of metal ion extraction can be described by molecular dynamics (MD) simulations and how the simulations can be validated by experimental data. Furthermore, the project included applied separation by testing the extraction systems in a single-stage annular centrifugal contactor and coupling the experimental data with computational fluid dynamics (CFD) simulations. Specific objectives of the proposed research were: Study and establish a rigorous connection between MD simulations based on polarizable force fields and extraction thermodynamic and kinetic data. Compare and validate CFD simulations of extraction processes for An/Ln separation using different sizes (and types) of annular centrifugal contactors. Provide a theoretical/simulation and experimental base for scale-up of batch-wise extraction to continuous contactors. We approached objectives 1 and 2 in parallel. For objective 1 we started by studying a well-established extraction system with a relatively simple extraction mechanism, namely tributyl phosphate (TBP). We found that well-optimized simulations can inform experiments, and new information on TBP behavior was presented in this project, as will be discussed below. 
    The second objective proved a larger challenge and most of the efforts were devoted to experimental studies.

  10. Channel flow and trichloroethylene treatment in a partly iron-filled fracture: experimental and model results.

    PubMed

    Cai, Zuansi; Merly, Corrine; Thomson, Neil R; Wilson, Ryan D; Lerner, David N

    2007-08-15

    Technical developments have now made it possible to emplace granular zero-valent iron (Fe(0)) in fractured media to create a Fe(0) fracture reactive barrier (Fe(0) FRB) for the treatment of contaminated groundwater. To evaluate this concept, we conducted a laboratory experiment in which trichloroethylene (TCE) contaminated water was flushed through a single uniform fracture created between two sandstone blocks. This fracture was partly filled with what was intended to be a uniform thickness of iron. Partial treatment of TCE by iron demonstrated that the concept of a Fe(0) FRB is practical, but was less than anticipated for an iron layer of uniform thickness. When the experiment was disassembled, evidence of discrete channelised flow was noted and attributed to imperfect placement of the iron. To evaluate the effect of the channel flow, an explicit Channel Model was developed that simplifies this complex flow regime into a conceptualised set of uniform and parallel channels. The mathematical representation of this conceptualisation directly accounts for (i) flow channels and immobile fluid arising from the non-uniform iron placement, (ii) mass transfer from the open fracture to iron and immobile fluid regions, and (iii) degradation in the iron regions. A favourable comparison between laboratory data and the results from the developed mathematical model suggests that the model is capable of representing TCE degradation in fractures with non-uniform iron placement. In order to apply this Channel Model concept to a Fe(0) FRB system, a simplified, or implicit, Lumped Channel Model was developed where the physical and chemical processes in the iron layer and immobile fluid regions are captured by a first-order lumped rate parameter. The performance of this Lumped Channel Model was compared to laboratory data, and benchmarked against the Channel Model. 
The advantages of the Lumped Channel Model are that the degradation of TCE in the system is represented by a first-order parameter that can be used directly in readily available numerical simulators.
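    The Lumped Channel Model's key simplification is that all loss processes collapse into one first-order rate, so the outlet concentration follows an exponential decay over the channel residence time. A minimal sketch of that relationship (the rate constant and residence time below are hypothetical values, not from the paper):

```python
import math

# Hedged sketch of first-order lumped degradation of TCE along a flow
# channel: C_out = C_in * exp(-k * tau), where k is the lumped rate
# parameter and tau the residence time. k = 0.5 and tau = 2.0 are
# hypothetical illustration values in matching time units.

def outlet_concentration(c_in: float, k: float, tau: float) -> float:
    """Concentration after first-order decay over residence time tau."""
    return c_in * math.exp(-k * tau)

c_out = outlet_concentration(c_in=1.0, k=0.5, tau=2.0)
print(round(c_out, 3))  # exp(-1) ~ 0.368
```

This is exactly the form that standard numerical simulators accept as a first-order decay parameter, which is why the lumped model is convenient in practice.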

  11. Channel flow and trichloroethylene treatment in a partly iron-filled fracture: Experimental and model results

    NASA Astrophysics Data System (ADS)

    Cai, Zuansi; Merly, Corrine; Thomson, Neil R.; Wilson, Ryan D.; Lerner, David N.

    2007-08-01

    Technical developments have now made it possible to emplace granular zero-valent iron (Fe(0)) in fractured media to create a Fe(0) fracture reactive barrier (Fe(0) FRB) for the treatment of contaminated groundwater. To evaluate this concept, we conducted a laboratory experiment in which trichloroethylene (TCE) contaminated water was flushed through a single uniform fracture created between two sandstone blocks. This fracture was partly filled with what was intended to be a uniform thickness of iron. Partial treatment of TCE by iron demonstrated that the concept of a Fe(0) FRB is practical, but was less than anticipated for an iron layer of uniform thickness. When the experiment was disassembled, evidence of discrete channelised flow was noted and attributed to imperfect placement of the iron. To evaluate the effect of the channel flow, an explicit Channel Model was developed that simplifies this complex flow regime into a conceptualised set of uniform and parallel channels. The mathematical representation of this conceptualisation directly accounts for (i) flow channels and immobile fluid arising from the non-uniform iron placement, (ii) mass transfer from the open fracture to iron and immobile fluid regions, and (iii) degradation in the iron regions. A favourable comparison between laboratory data and the results from the developed mathematical model suggests that the model is capable of representing TCE degradation in fractures with non-uniform iron placement. In order to apply this Channel Model concept to a Fe(0) FRB system, a simplified, or implicit, Lumped Channel Model was developed where the physical and chemical processes in the iron layer and immobile fluid regions are captured by a first-order lumped rate parameter. The performance of this Lumped Channel Model was compared to laboratory data, and benchmarked against the Channel Model. 
The advantages of the Lumped Channel Model are that the degradation of TCE in the system is represented by a first-order parameter that can be used directly in readily available numerical simulators.

  12. Simulating the VUV photochemistry of the upper atmosphere of Titan

    NASA Astrophysics Data System (ADS)

    Tigrine, Sarah; Carrasco, Nathalie; Vettier, Ludovic; Chitarra, Olivia; Cernogora, Guy

    2016-10-01

    The Cassini mission around Titan revealed that the interaction between N2 and CH4 molecules and solar VUV radiation leads to a complex chemistry above an altitude of 800 km, with the detection of heavy organic molecules like benzene (C6H6). This is consistent with an initiation of the aerosols in Titan's upper atmosphere. The presence of those molecules makes Titan a natural laboratory to witness and understand prebiotic-like chemistry, but despite all the data collected, the possible photochemical processes in such a hydrocarbon-nitrogen-rich environment are not precisely understood. This is why Titan atmospheric chemistry experiments are of high interest, especially those focusing on photochemistry, as most Titan-like experiments are based on N2-CH4 plasma techniques. In order to reproduce this VUV photochemistry of N2 and CH4, we designed a photochemical reactor named APSIS which is coupled window-less with a VUV photon source, as N2 needs wavelengths shorter than 100 nm in order to be dissociated. Those wavelengths are available at synchrotron beamlines but are challenging to obtain with common laboratory discharge lamps. At LATMOS, we developed a table-top window-less VUV source using noble gases for the microwave discharge. We started with neon, as it has two resonance lines at 73.6 and 74.3 nm which allow us to dissociate and/or ionize both CH4 and N2. We will present here our first experimental results obtained with APSIS coupled with this VUV source. A range of different pressures below 1 mbar is tested, in parallel with different methane ratios. Moreover, other wavelengths are injected by adding other noble gases to the microwave discharge (He, Kr, Xe, Ar). We will review the mass spectra obtained under those different conditions and then discuss them with regard to the Cassini data and other previous laboratory photochemical studies.

  13. Parallel Event Analysis Under Unix

    NASA Astrophysics Data System (ADS)

    Looney, S.; Nilsson, B. S.; Oest, T.; Pettersson, T.; Ranjard, F.; Thibonnier, J.-P.

    The ALEPH experiment at LEP, the CERN CN division and Digital Equipment Corp. have, in a joint project, developed a parallel event analysis system. The parallel physics code is identical to ALEPH's standard analysis code, ALPHA, only the organisation of input/output is changed. The user may switch between sequential and parallel processing by simply changing one input "card". The initial implementation runs on an 8-node DEC 3000/400 farm, using the PVM software, and exhibits a near-perfect speed-up linearity, reducing the turn-around time by a factor of 8.
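    The "near-perfect speed-up linearity" quoted above is the standard parallel-performance metric: serial turnaround time divided by parallel turnaround time, ideally equal to the node count. A small sketch of the bookkeeping (the timings are hypothetical, only the 8-node farm and factor-of-8 claim come from the abstract):

```python
# Sketch of the speedup and efficiency metrics behind the abstract's
# factor-of-8 claim on an 8-node farm. The timings are hypothetical
# illustration values, not measurements from the ALEPH project.

def speedup(t_serial: float, t_parallel: float) -> float:
    """Classic speedup: serial turnaround time / parallel turnaround time."""
    return t_serial / t_parallel

def efficiency(t_serial: float, t_parallel: float, nodes: int) -> float:
    """Speedup normalized by node count; 1.0 means perfectly linear."""
    return speedup(t_serial, t_parallel) / nodes

t1, t8 = 80.0, 10.0           # hypothetical hours on 1 vs 8 nodes
print(speedup(t1, t8))        # 8.0
print(efficiency(t1, t8, 8))  # 1.0, i.e. perfect linearity
```

Keeping the physics code identical and parallelizing only the I/O organisation is what makes such near-linear scaling plausible for independent-event workloads.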

  14. Code Optimization and Parallelization on the Origins: Looking from Users' Perspective

    NASA Technical Reports Server (NTRS)

    Chang, Yan-Tyng Sherry; Thigpen, William W. (Technical Monitor)

    2002-01-01

    Parallel machines are becoming the main compute engines for high performance computing. Despite their increasing popularity, it is still a challenge for most users to learn the basic techniques to optimize/parallelize their codes on such platforms. In this paper, we present some experiences on learning these techniques for the Origin systems at the NASA Advanced Supercomputing Division. Emphasis of this paper will be on a few essential issues (with examples) that general users should master when they work with the Origins as well as other parallel systems.

  15. How the position of mussels at the intertidal lagoon affects their infection with the larvae of parasitic flatworms (Trematoda: Digenea): A combined laboratory and field experimental study

    NASA Astrophysics Data System (ADS)

    Nikolaev, Kirill E.; Prokofiev, Vladimir V.; Levakin, Ivan A.; Galaktionov, Kirill V.

    2017-10-01

    In the complex trematode life cycle, cercariae transmit infection from the first to the second intermediate host. These short-lived lecithotrophic larvae possess a complex of behavioural responses for finding and infecting the host. We studied strategies used by cercariae of Himasthla elongata and Cercaria parvicaudata (Renicola sp.) infecting mussels Mytilus edulis at the White Sea intertidal. Laboratory and field experiments were conducted in parallel. Geotactic response of cercariae was tested in an experimental chamber. Their distribution in nature was studied by counting larvae infecting mussels in cages installed in pairs (a ground and a suspended cage) in an intertidal lagoon. In the chamber H. elongata cercariae concentrated at the bottom, C. parvicaudata cercariae aged 1 h mostly concentrated near the surface and those aged 6 h sank to the bottom. A few larvae of both species ("evaders") showed behavioural patterns antithetic to the prevalent ones. Infection was the highest in mussels in ground cages. In suspended cages mussel infection with H. elongata cercariae was much lower than with C. parvicaudata cercariae. Our study confirmed that results of experiments on cercarial behaviour could be extrapolated to natural conditions. Cercariae of two species using the same intermediate hosts and co-occurring in a biotope implemented dramatically different strategies. This might be associated with differences in cercarial output by parthenitae groups. The presence of "evaders" might be useful for successful transmission. Our results indicate that mussels cultivated in suspended cultures are at the least risk of infection with trematode larvae.

  16. How do we make models that are useful in understanding partial epilepsies?

    PubMed

    Prince, David A

    2014-01-01

    The goals of constructing epilepsy models are (1) to develop approaches to prophylaxis of epileptogenesis following cortical injury; (2) to devise selective treatments for established epilepsies based on underlying pathophysiological mechanisms; and (3) to use a disease (epilepsy) model to explore brain molecular, cellular and circuit properties. Modeling a particular epilepsy syndrome requires detailed knowledge of key clinical phenomenology and results of human experiments that can be addressed in critically designed laboratory protocols. Contributions to understanding mechanisms and treatment of neurological disorders have often come from research not focused on a specific disease-relevant issue. Much of the foundation for current research in epilepsy falls into this category. Too strict a definition of the relevance of an experimental model to progress in preventing or curing epilepsy may, in the long run, slow progress. Inadequate exploration of the experimental target and basic laboratory results in a given model can lead to a failed effort and false negative or positive results. Models should be chosen based on the specific issues to be addressed rather than on convenience of use. Multiple variables, including maturational age, species and strain, lesion type, severity and location, latency from injury to experiment, and genetic background, will affect results. A number of key issues in clinical and basic research in partial epilepsies remain to be addressed, including the mechanisms active during the latent period following injury, susceptibility factors that predispose to epileptogenesis, injury-induced adaptive versus maladaptive changes, mechanisms of pharmaco-resistance, and strategies to deal with multiple pathophysiological processes occurring in parallel.

  17. Chemistry Graduate Teaching Assistants' Experiences in Academic Laboratories and Development of a Teaching Self-image

    NASA Astrophysics Data System (ADS)

    Gatlin, Todd Adam

    Graduate teaching assistants (GTAs) play a prominent role in chemistry laboratory instruction at research based universities. They teach almost all undergraduate chemistry laboratory courses. However, their role in laboratory instruction has often been overlooked in educational research. Interest in chemistry GTAs has been placed on training and their perceived expectations, but less attention has been paid to their experiences or their potential benefits from teaching. This work was designed to investigate GTAs' experiences in and benefits from laboratory instructional environments. This dissertation includes three related studies on GTAs' experiences teaching in general chemistry laboratories. Qualitative methods were used for each study. First, phenomenological analysis was used to explore GTAs' experiences in an expository laboratory program. Post-teaching interviews were the primary data source. GTAs experiences were described in three dimensions: doing, knowing, and transferring. Gains available to GTAs revolved around general teaching skills. However, no gains specifically related to scientific development were found in this laboratory format. Case-study methods were used to explore and illustrate ways GTAs develop a GTA self-image---the way they see themselves as instructors. Two general chemistry laboratory programs that represent two very different instructional frameworks were chosen for the context of this study. The first program used a cooperative project-based approach. The second program used weekly, verification-type activities. End of the semester interviews were collected and served as the primary data source. A follow-up case study of a new cohort of GTAs in the cooperative problem-based laboratory was undertaken to investigate changes in GTAs' self-images over the course of one semester. Pre-semester and post-semester interviews served as the primary data source. 
Findings suggest that GTAs' construction of their self-image is shaped through the interaction of 1) prior experiences, 2) training, 3) beliefs about the nature of knowledge, 4) beliefs about the nature of laboratory work, and 5) involvement in the laboratory setting. Further, GTAs' self-images are malleable and susceptible to change through their laboratory teaching experiences. Overall, this dissertation contributes to chemistry education by providing a model useful for exploring GTAs' development of a self-image in laboratory teaching. This work may assist laboratory instructors and coordinators in reconsidering, when applicable, GTA training and support. It also holds considerable implications for how teaching experiences are conceptualized as part of the chemistry graduate education experience. Findings suggest that appropriate teaching experiences may contribute towards better preparing graduate students for their journey in becoming scientists.

  18. A scalable parallel black oil simulator on distributed memory parallel computers

    NASA Astrophysics Data System (ADS)

    Wang, Kun; Liu, Hui; Chen, Zhangxin

    2015-11-01

    This paper presents our work on developing a parallel black oil simulator for distributed memory computers based on our in-house parallel platform. The parallel simulator is designed to overcome the performance issues of common simulators that are implemented for personal computers and workstations. The finite difference method is applied to discretize the black oil model. In addition, some advanced techniques are employed to strengthen the robustness and parallel scalability of the simulator, including an inexact Newton method, matrix decoupling methods, and algebraic multigrid methods. A new multi-stage preconditioner is proposed to accelerate the solution of linear systems from the Newton methods. Numerical experiments show that our simulator is scalable and efficient, and is capable of simulating extremely large-scale black oil problems with tens of millions of grid blocks using thousands of MPI processes on parallel computers.
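    The inexact Newton method mentioned above solves each Newton linear system J s = -F only approximately before updating the unknowns, trading per-step accuracy for cheaper iterations. A toy sketch of the idea (in spirit only, not the simulator's code: the 2x2 residual and the fixed-sweep Jacobi inner solve are hypothetical stand-ins for the black-oil Jacobian and the multigrid-preconditioned solver):

```python
# Minimal inexact-Newton sketch: the linear system J*s = -F is solved
# only approximately (a few Jacobi sweeps) at each Newton step.
# The nonlinear system below is a hypothetical 2x2 example whose
# root is (x, y) = (1, 2); it is not from the black oil simulator.

def F(x):
    return [x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0]

def J(x):
    return [[2.0 * x[0], 1.0], [1.0, 2.0 * x[1]]]

def jacobi_solve(A, b, sweeps=10):
    """Approximate solve of A*s = b by a fixed number of Jacobi sweeps."""
    s = [0.0, 0.0]
    for _ in range(sweeps):
        s = [(b[0] - A[0][1] * s[1]) / A[0][0],
             (b[1] - A[1][0] * s[0]) / A[1][1]]
    return s

x = [1.0, 1.0]
for _ in range(20):
    r = F(x)
    s = jacobi_solve(J(x), [-r[0], -r[1]])  # inexact inner solve
    x = [x[0] + s[0], x[1] + s[1]]

print(all(abs(v) < 1e-8 for v in F(x)))  # True: converged
```

In a production simulator the inner solve would be the multi-stage preconditioned Krylov method the paper proposes, with the inner tolerance (the forcing term) chosen adaptively rather than as a fixed sweep count.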

  19. Economic Analysis of Alternative Strategies for Detection of ALK Rearrangements in Non Small Cell Lung Cancer.

    PubMed

    Doshi, Shivang; Ray, David; Stein, Karen; Zhang, Jie; Koduru, Prasad; Fogt, Franz; Wellman, Axel; Wat, Ricky; Mathews, Charles

    2016-01-06

    Identification of alterations in the ALK gene and the development of ALK-directed therapies have increased the need for accurate and efficient detection methodologies. To date, research has focused on the concordance between the two most commonly used technologies, fluorescent in situ hybridization (FISH) and immunohistochemistry (IHC). However, inter-test concordance reflects only one, albeit important, aspect of the diagnostic process; laboratories, hospitals, and payors must understand the cost and workflow of ALK rearrangement detection strategies. Through literature review combined with interviews of pathologists and laboratory directors in the U.S. and Europe, a cost-impact model was developed that compared four alternative testing strategies: IHC only, FISH only, IHC pre-screen followed by FISH confirmation, and parallel testing by both IHC and FISH. Interviews were focused on costs of reagents, consumables, equipment, and personnel. The resulting model showed that testing by IHC alone cost less ($90.07 in the U.S., $68.69 in Europe) than either independent or parallel testing by both FISH and IHC ($441.85 in the U.S. and $279.46 in Europe). The strategies differed in cost of execution, turnaround time, reimbursement, and number of positive results detected, suggesting that laboratories must weigh the costs and the clinical benefit of available ALK testing strategies.
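    The cost advantage of an IHC pre-screen strategy can be sketched with the U.S. figures quoted above. Only the $90.07 (IHC only) and $441.85 (both tests) values come from the abstract; treating the costs as additive and assuming a 5% ALK-positivity rate are illustration assumptions, not the paper's model:

```python
# Back-of-envelope sketch of the IHC pre-screen strategy's expected
# per-patient cost. Only IHC_ONLY and IHC_PLUS_FISH come from the
# abstract; additive costs and the 5% positivity rate are assumptions.

IHC_ONLY = 90.07        # U.S. cost of IHC alone, from the abstract
IHC_PLUS_FISH = 441.85  # U.S. cost of testing by both, from the abstract
FISH_ONLY = IHC_PLUS_FISH - IHC_ONLY  # additive-cost assumption

def prescreen_cost(positivity_rate: float) -> float:
    """Expected cost: everyone gets IHC, only IHC-positives get FISH."""
    return IHC_ONLY + positivity_rate * FISH_ONLY

print(round(prescreen_cost(0.05), 2))  # about 107.66 under these assumptions
```

The pre-screen strategy's expected cost stays well below parallel testing whenever the positivity rate is low, which is the intuition behind the abstract's conclusion that strategies must be weighed against clinical benefit.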

  20. Economic Analysis of Alternative Strategies for Detection of ALK Rearrangements in Non Small Cell Lung Cancer

    PubMed Central

    Doshi, Shivang; Ray, David; Stein, Karen; Zhang, Jie; Koduru, Prasad; Fogt, Franz; Wellman, Axel; Wat, Ricky; Mathews, Charles

    2016-01-01

    Identification of alterations in the ALK gene and the development of ALK-directed therapies have increased the need for accurate and efficient detection methodologies. To date, research has focused on the concordance between the two most commonly used technologies, fluorescence in situ hybridization (FISH) and immunohistochemistry (IHC). However, inter-test concordance reflects only one, albeit important, aspect of the diagnostic process; laboratories, hospitals, and payors must understand the cost and workflow of ALK rearrangement detection strategies. Through a literature review combined with interviews of pathologists and laboratory directors in the U.S. and Europe, a cost-impact model was developed that compared four alternative testing strategies: IHC only, FISH only, IHC pre-screen followed by FISH confirmation, and parallel testing by both IHC and FISH. Interviews focused on the costs of reagents, consumables, equipment, and personnel. The resulting model showed that testing by IHC alone cost less ($90.07 in the U.S., $68.69 in Europe) than either independent or parallel testing by both FISH and IHC ($441.85 in the U.S. and $279.46 in Europe). The strategies differed in cost of execution, turnaround time, reimbursement, and number of positive results detected, suggesting that laboratories must weigh the costs and the clinical benefit of available ALK testing strategies. PMID:26838801

  1. Winter photosynthesis in red spruce (Picea rubens Sarg.): limitations, potential benefits, and risks

    Treesearch

    P.G. Schaberg

    2000-01-01

    Numerous cold-induced changes in physiology limit the capacity of northern conifers to photosynthesize during winter. Studies of red spruce (Picea rubens Sarg.) have shown that rates of field photosynthesis (Pfield) and laboratory measurements of photosynthetic capacity (Pmax) generally parallel seasonal...

  2. Design Considerations of a Virtual Laboratory for Advanced X-ray Sources

    NASA Astrophysics Data System (ADS)

    Luginsland, J. W.; Frese, M. H.; Frese, S. D.; Watrous, J. J.; Heileman, G. L.

    2004-11-01

    The field of scientific computation has greatly advanced in the last few years, resulting in the ability to perform complex computer simulations that can predict the performance of real-world experiments in a number of fields of study. Among the forces driving this new computational capability is the advent of parallel algorithms, allowing calculations in three-dimensional space with realistic time scales. Electromagnetic radiation sources driven by high-voltage, high-current electron beams offer an area to further push the state-of-the-art in high fidelity, first-principles simulation tools. The physics of these x-ray sources combines kinetic plasma physics (electron beams) with dense fluid-like plasma physics (anode plasmas) and x-ray generation (bremsstrahlung). There are a number of mature techniques and software packages for dealing with the individual aspects of these sources, such as Particle-In-Cell (PIC), Magneto-Hydrodynamics (MHD), and radiation transport codes. The current effort is focused on developing an object-oriented software environment using the Rational Unified Process and the Unified Modeling Language (UML) to provide a framework in which multiple 3D parallel physics packages, such as a PIC code (ICEPIC), an MHD code (MACH), and an x-ray transport code (ITS), can co-exist in a system-of-systems approach to modeling advanced x-ray sources. Initial software design and assessments of the various physics algorithms' fidelity will be presented.
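    A system-of-systems framework of this kind can be reduced to a small interface sketch: each physics package exposes a common stepping interface, and a driver advances the packages in an operator-split loop over a shared state. The Python below is a design toy only; PhysicsPackage, Driver, and Decay are invented names, not the actual ICEPIC/MACH/ITS interfaces.

```python
from abc import ABC, abstractmethod

class PhysicsPackage(ABC):
    # Common stepping interface a coupling framework might impose so
    # that PIC, MHD, and transport solvers can advance a shared state.
    # Illustrative design only; not the real ICEPIC/MACH/ITS APIs.
    @abstractmethod
    def advance(self, dt, state):
        ...

class Driver:
    # Operator-split driver: each package advances the state in turn.
    def __init__(self, packages):
        self.packages = packages

    def run(self, dt, steps, state):
        for _ in range(steps):
            for pkg in self.packages:
                state = pkg.advance(dt, state)
        return state

class Decay(PhysicsPackage):
    # Toy "physics" package: exponential decay of a scalar state.
    def __init__(self, rate):
        self.rate = rate

    def advance(self, dt, state):
        return state * (1.0 - self.rate * dt)

final = Driver([Decay(0.1), Decay(0.2)]).run(0.1, 10, 1.0)
```

    The point of the abstract base class is that the driver never needs to know which solver it is stepping, which is what lets independently developed codes co-exist in one environment.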

  3. Redefining Authentic Research Experiences in Introductory Biology Laboratories and Barriers to Their Implementation

    PubMed Central

    Spell, Rachelle M.; Guinan, Judith A.; Miller, Kristen R.; Beck, Christopher W.

    2014-01-01

    Incorporating authentic research experiences in introductory biology laboratory classes would greatly expand the number of students exposed to the excitement of discovery and the rigor of the scientific process. However, the essential components of an authentic research experience and the barriers to their implementation in laboratory classes are poorly defined. To guide future reform efforts in this area, we conducted a national survey of biology faculty members to determine 1) their definitions of authentic research experiences in laboratory classes, 2) the extent of authentic research experiences currently experienced in their laboratory classes, and 3) the barriers that prevent incorporation of authentic research experiences into these classes. Strikingly, the definitions of authentic research experiences differ among faculty members and tend to emphasize either the scientific process or the discovery of previously unknown data. The low level of authentic research experiences in introductory biology labs suggests that more development and support is needed to increase undergraduate exposure to research experiences. Faculty members did not cite several barriers commonly assumed to impair pedagogical reform; however, their responses suggest that expanded support for development of research experiences in laboratory classes could address the most common barrier. PMID:24591509

  4. Redefining authentic research experiences in introductory biology laboratories and barriers to their implementation.

    PubMed

    Spell, Rachelle M; Guinan, Judith A; Miller, Kristen R; Beck, Christopher W

    2014-01-01

    Incorporating authentic research experiences in introductory biology laboratory classes would greatly expand the number of students exposed to the excitement of discovery and the rigor of the scientific process. However, the essential components of an authentic research experience and the barriers to their implementation in laboratory classes are poorly defined. To guide future reform efforts in this area, we conducted a national survey of biology faculty members to determine 1) their definitions of authentic research experiences in laboratory classes, 2) the extent of authentic research experiences currently experienced in their laboratory classes, and 3) the barriers that prevent incorporation of authentic research experiences into these classes. Strikingly, the definitions of authentic research experiences differ among faculty members and tend to emphasize either the scientific process or the discovery of previously unknown data. The low level of authentic research experiences in introductory biology labs suggests that more development and support is needed to increase undergraduate exposure to research experiences. Faculty members did not cite several barriers commonly assumed to impair pedagogical reform; however, their responses suggest that expanded support for development of research experiences in laboratory classes could address the most common barrier.

  5. Developing an online chemistry laboratory for non-chemistry majors

    NASA Astrophysics Data System (ADS)

    Poole, Jacqueline H.

    Distance education, also known as online learning, offers student-centered, self-directed educational opportunities. This style of learning is expanding in scope and is increasingly accepted throughout the academic curriculum as a result of its flexibility for the student and its cost-effectiveness for the institution. Nevertheless, the introduction of online science courses, including chemistry and physics, has lagged behind due to the challenge of re-creating the hands-on laboratory learning experience. This dissertation looks at the effectiveness of the design of a series of chemistry laboratory experiments for possible online delivery that provide students with simulated hands-on experiences. One class of college Chemistry 101 students conducted chemistry experiments inside and outside of the physical laboratory using instructions on Blackboard and Late Nite Labs(TM). Learning outcomes measured by (a) pretests, (b) written laboratory reports, (c) posttest assessments, (d) student reactions as determined by a questionnaire, and (e) a focus group interview were used to compare both types of laboratory experiences. The research findings indicated that learning outcomes achieved by students outside of the traditional physical laboratory were statistically greater than those of the equivalent face-to-face instruction in the traditional laboratory. Evidence from student reactions comparing both laboratory formats (online and traditional face-to-face) indicated student preference for the online format. The results are an initial contribution to the design of a complete sequence of experiments that can be performed independently by online students outside of the traditional face-to-face laboratory and that will satisfy the laboratory requirement for the two-semester college Chemistry 101 laboratory course.

  6. Acoustic simulation in architecture with parallel algorithm

    NASA Astrophysics Data System (ADS)

    Li, Xiaohong; Zhang, Xinrong; Li, Dan

    2004-03-01

    To address the complexity of architectural environments and the demands of real-time simulation of architectural acoustics, a parallel radiosity algorithm was developed. The distribution of sound energy in the scene is solved with this method. The impulse responses between sources and receivers for each frequency segment, calculated with multiple processes, are then combined into the whole frequency response. A numerical experiment shows that the parallel algorithm improves the efficiency of acoustic simulation for complex scenes.
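    The scheme described, solving each frequency segment independently and then assembling the full frequency response, is naturally parallel because the bands do not interact. A toy Python sketch, using a thread pool and a two-patch energy-exchange relaxation as a stand-in for the real radiosity solve; the per-band absorption and form-factor values are invented:

```python
from concurrent.futures import ThreadPoolExecutor

def band_response(args):
    # Hypothetical per-band solve: iterate sound-energy exchange
    # between a source patch and a receiver patch (a toy radiosity
    # relaxation) and return the steady energy at the receiver.
    absorption, form_factor = args
    k = (1 - absorption) * form_factor   # energy kept per bounce
    e = [1.0, 0.0]                       # source patch, receiver patch
    for _ in range(100):                 # Jacobi-style energy exchange
        e = [1.0 + k * e[1], k * e[0]]
    return e[1]

# Assumed (absorption, form factor) per frequency band; higher bands
# are usually absorbed more strongly, so their response is weaker.
bands = [(0.1, 0.4), (0.3, 0.4), (0.5, 0.4)]
with ThreadPoolExecutor() as pool:
    response = list(pool.map(band_response, bands))
```

    The fixed point of the two-patch exchange is k/(1 - k^2), so the per-band results can be checked against the closed form; in the real algorithm each band's solve is a full radiosity system over many patches.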

  7. The Master level optics laboratory at the Institute of Optics

    NASA Astrophysics Data System (ADS)

    Adamson, Per

    2017-08-01

    The master level optics laboratory is a biannual, intensive laboratory course in the fields of geometrical, physical, and modern optics. The course is intended for master level students, though Ph.D. advisors often recommend it to their advisees. The students are required to complete five standard laboratory experiments and an independent project during a semester. The goals of the laboratory experiments are for the students to get hands-on experience setting up optical laboratory equipment, to collect and analyze data, and to communicate key results. The experimental methods, analysis, and results of the standard experiments are submitted in a journal-style report, while an oral presentation is given for the independent project.

  8. Experiences Using OpenMP Based on Compiler Directed Software DSM on a PC Cluster

    NASA Technical Reports Server (NTRS)

    Hess, Matthias; Jost, Gabriele; Mueller, Matthias; Ruehle, Roland

    2003-01-01

    In this work we report on our experiences running OpenMP programs on a commodity cluster of PCs running a software distributed shared memory (DSM) system. We describe our test environment and report on the performance of a subset of the NAS Parallel Benchmarks that have been automatically parallelized for OpenMP. We compare the performance of the OpenMP implementations with that of their message passing counterparts and discuss performance differences.

  9. Development of Accessible Laboratory Experiments for Students with Visual Impairments

    ERIC Educational Resources Information Center

    Kroes, KC; Lefler, Daniel; Schmitt, Aaron; Supalo, Cary A.

    2016-01-01

    Hands-on laboratory experiments are frequently what spark students' interest in science. Students who are blind or have low vision (BLV) typically do not get the same experience while participating in hands-on activities due to accessibility barriers. Over the course of approximately nine months, common chemistry laboratory experiments were adapted and…

  10. Do-It-Yourself Experiments for the Instructional Laboratory

    ERIC Educational Resources Information Center

    Craig, Norman C.; Hill, Cortland S.

    2012-01-01

    A new design for experiments in the general chemistry laboratory incorporates a "do-it-yourself" component for students. In this design, students perform proven experiments to gain experience with techniques for about two-thirds of a laboratory session and then spend the last part in the do-it-yourself component, applying the techniques to an…

  11. A teaching intervention for reading laboratory experiments in college-level introductory chemistry

    NASA Astrophysics Data System (ADS)

    Kirk, Maria Kristine

    The purpose of this study was to determine the effects that a pre-laboratory guide, conceptualized as a "scientific story grammar," has on college chemistry students' learning when they read an introductory chemistry laboratory manual and perform the experiments in the chemistry laboratory. The participants (N = 56) were students enrolled in four existing general chemistry laboratory sections taught by two instructors at a women's liberal arts college. The pre-laboratory guide consisted of eight questions about the experiment, including the purpose, chemical species, variables, chemical method, procedure, and hypothesis. The effects of the intervention were compared with those of the traditional pre-laboratory assignment for the eight chemistry experiments. Measures included quizzes, tests, chemistry achievement test, science process skills test, laboratory reports, laboratory average, and semester grade. The covariates were mathematical aptitude and prior knowledge of chemistry and science processes, on which the groups differed significantly. The study captured students' perceptions of their experience in general chemistry through a survey and interviews with eight students. The only significant differences in the treatment group's performance were in some subscores on lecture items and laboratory items on the quizzes. An apparent induction period was noted, in that significant measures occurred in mid-semester. Voluntary study with the pre-laboratory guide by control students precluded significant differences on measures given later in the semester. The groups' responses to the survey were similar. Significant instructor effects on three survey items were corroborated by the interviews. The researcher's students were more positive about their pre-laboratory tasks, enjoyed the laboratory sessions more, and were more confident about doing chemistry experiments than the laboratory instructor's groups due to differences in scaffolding by the instructors.

  12. Integrated process development-a robust, rapid method for inclusion body harvesting and processing at the microscale level.

    PubMed

    Walther, Cornelia; Kellner, Martin; Berkemeyer, Matthias; Brocard, Cécile; Dürauer, Astrid

    2017-10-21

    Escherichia coli stores large amounts of highly pure product within inclusion bodies (IBs). To take advantage of this beneficial feature, after cell disintegration, the first step to optimal product recovery is efficient IB preparation. This step is also important in evaluating upstream optimization and process development, due to the potential impact of bioprocessing conditions on product quality and on the nanoscale properties of IBs. Proper IB preparation is often neglected, because laboratory-scale methods require large amounts of material and labor. Miniaturization and parallelization can accelerate analyses of individual processing steps and provide a deeper understanding of up- and downstream processing interdependencies. Consequently, reproducible, predictive microscale methods are in demand. In the present study, we complemented a recently established high-throughput cell disruption method with a microscale method for preparing purified IBs. This preparation provided results comparable to laboratory-scale IB processing regarding impurity depletion and product loss. Furthermore, with this method, we performed a "design of experiments" study to demonstrate the influence of fermentation conditions on the performance of subsequent downstream steps and product quality. We showed that this approach provided a 300-fold reduction in material consumption for each fermentation condition and a 24-fold reduction in processing time for 24 samples.

  13. Fragmentation studies of relativistic iron ions using plastic nuclear track detectors.

    PubMed

    Scampoli, P; Durante, M; Grossi, G; Manti, L; Pugliese, M; Gialanella, G

    2005-01-01

    We measured fluence and fragmentation of high-energy (1 or 5 A GeV) 56Fe ions accelerated at the Alternating Gradient Synchrotron or at the NASA Space Radiation Laboratory (Brookhaven National Laboratory, NY, USA) using solid-state CR-39 nuclear track detectors. Different targets (polyethylene, PMMA, C, Al, Pb) were used to produce a large spectrum of charged fragments. CR-39 plastics were exposed both in front of and behind the shielding block (thickness ranging from 5 to 30 g/cm2) at normal incidence and low fluence. The radiation dose deposited by surviving Fe ions and charged fragments was measured behind the shield using an ionization chamber. The distribution of the measured track size was exploited to distinguish the primary 56Fe ion tracks from the lighter fragments. Measurements of projectile fluence in front of the shield were used to determine the dose per incident particle behind the block. Simultaneous measurements of primary 56Fe ion tracks in front of and behind the shield were used to evaluate the fraction of surviving iron projectiles and the total charge-changing fragmentation cross-section. These physical measurements will be used to characterize the beam used in parallel biological experiments.

  14. LIGHT WATER REACTOR ACCIDENT TOLERANT FUELS IRRADIATION TESTING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carmack, William Jonathan; Barrett, Kristine Eloise; Chichester, Heather Jean MacLean

    2015-09-01

    The purpose of Accident Tolerant Fuels (ATF) experiments is to test novel fuel and cladding concepts designed to replace the current zirconium alloy uranium dioxide (UO2) fuel system. The objective of this Research and Development (R&D) is to develop novel ATF concepts that will be able to withstand loss of active cooling in the reactor core for a considerably longer time period than the current fuel system while maintaining or improving the fuel performance during normal operations, operational transients, design basis, and beyond design basis events. It was necessary to design, analyze, and fabricate drop-in capsules to meet the requirements for testing under prototypic LWR temperatures in Idaho National Laboratory's Advanced Test Reactor (ATR). Three industry led teams and one DOE team from Oak Ridge National Laboratory provided fuel rodlet samples for their new concepts for ATR insertion in 2015. As-built projected temperature calculations were performed on the ATF capsules using the BISON fuel performance code. BISON is an application of INL's Multi-physics Object Oriented Simulation Environment (MOOSE), which is a massively parallel finite element based framework used to solve systems of fully coupled nonlinear partial differential equations. Both 2D and 3D models were set up to examine cladding and fuel performance.

  15. Measuring meaningful learning in the undergraduate chemistry laboratory

    NASA Astrophysics Data System (ADS)

    Galloway, Kelli R.

    The undergraduate chemistry laboratory has been an essential component of chemistry education for over a century. The literature includes reports on investigations of singular aspects of laboratory learning, attempts to measure the efficacy of reformed laboratory curricula, and studies of faculty goals for laboratory learning, which found goals common among instructors: for students to learn laboratory skills, techniques, and experimental design, and to develop critical thinking skills. These findings are important for improving teaching and learning in the undergraduate chemistry laboratory, but research is needed to connect the faculty goals to student perceptions. This study was designed to explore students' ideas about learning in the undergraduate chemistry laboratory. Novak's Theory of Meaningful Learning was used as a guide for the data collection and analysis choices for this research. Novak's theory states that in order for meaningful learning to occur, the cognitive, affective, and psychomotor domains must be integrated. The psychomotor domain is inherent in the chemistry laboratory, but the extent to which the cognitive and affective domains are integrated is unknown. For meaningful learning to occur in the laboratory, students must actively integrate both the cognitive and affective domains into the "doing" of their laboratory work. The Meaningful Learning in the Laboratory Instrument (MLLI) was designed to measure students' cognitive and affective expectations and experiences within the context of conducting experiments in the undergraduate chemistry laboratory. Evidence for the validity and reliability of the data generated by the MLLI was collected from multiple quantitative studies: a one-semester study at one university, a one-semester study at 15 colleges and universities across the United States, and a longitudinal study in which the MLLI was administered six times during two years of general and organic chemistry laboratory courses.
Results from these studies revealed students' narrow cognitive expectations for learning that go largely unmet by their experiences and diverse affective expectations and experiences. Concurrently, a qualitative study was carried out to describe and characterize students' cognitive and affective experiences in the undergraduate chemistry laboratory. Students were video recorded while performing one of their regular laboratory experiments and then interviewed about their experiences. The students' descriptions of their learning experiences were characterized by their overreliance on following the experimental procedure correctly rather than developing process-oriented problem solving skills. Future research could use the MLLI to intentionally compare different types of laboratory curricula or environments.

  16. The Effect of Guided-Inquiry Laboratory Experiments on Science Education Students' Chemistry Laboratory Attitudes, Anxiety and Achievement

    ERIC Educational Resources Information Center

    Ural, Evrim

    2016-01-01

    The study aims to search the effect of guided inquiry laboratory experiments on students' attitudes towards chemistry laboratory, chemistry laboratory anxiety and their academic achievement in the laboratory. The study has been carried out with 37 third-year, undergraduate science education students, as a part of their Science Education Laboratory…

  17. Evaluation of an Infiltration Model with Microchannels

    NASA Astrophysics Data System (ADS)

    Garcia-Serrana, M.; Gulliver, J. S.; Nieber, J. L.

    2015-12-01

    The goal of this research is to develop and demonstrate the means by which roadside drainage ditches and filter strips can be assigned appropriate volume reduction credits for infiltration. These vegetated surfaces convey stormwater, infiltrate runoff, and filter and/or settle solids, and are often placed along roads and other impermeable surfaces. Infiltration rates are typically calculated by assuming that water flows as sheet flow over the slope. However, for most intensities water flow occurs in narrow, shallow micro-channels and concentrates in depressions. This channelization reduces the fraction of the soil surface covered by the water coming from the road. The non-uniform distribution of water along a hillslope directly affects infiltration. First, laboratory and field experiments were conducted to characterize the spatial pattern of flow for stormwater runoff entering the surface of a sloped drainage ditch. In the laboratory experiments, different micro-topographies were tested over bare sandy loam soil: a smooth surface, and three and five parallel rills. All the surfaces experienced erosion; the initially smooth surface developed a system of channels over time that increased runoff generation. On average, the initially smooth surfaces infiltrated 10% more volume than the initially rilled surfaces. The field experiments were performed on the side slopes of established roadside drainage ditches. Three rates of runoff from a road surface into the swale slope were tested, representing runoff from 1-, 2-, and 10-year storm events. The average percentage of input runoff water infiltrated in the 32 experiments was 67%, with a 21% standard deviation. Multiple measurements of saturated hydraulic conductivity were conducted to account for its spatial variability. Second, a rate-based coupled infiltration and overland flow model has been designed that calculates the stormwater infiltration efficiency of swales.
The Green-Ampt-Mein-Larson assumptions were implemented to calculate infiltration along with a kinematic wave model for overland flow that accounts for short-circuiting of flow. Additionally, a sensitivity analysis on the parameters implemented in the model has been performed. Finally, the field experiments results have been used to quantify the validity of the coupled model.
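    The Green-Ampt-Mein-Larson step can be illustrated with a minimal explicit time-stepping loop: infiltration proceeds at the supply (rainfall/runoff) rate until the Green-Ampt capacity, which decreases as cumulative infiltration F grows, becomes limiting. In the Python sketch below, K is the saturated hydraulic conductivity, psi the wetting-front suction head, and dtheta the moisture deficit; the parameter values are placeholders, and the real model couples this step to a kinematic-wave overland flow solver.

```python
def green_ampt_capacity(K, psi, dtheta, F):
    # Green-Ampt infiltration capacity, given cumulative infiltration F.
    if F <= 0.0:
        return float("inf")   # dry soil: capacity is effectively unbounded
    return K * (1.0 + psi * dtheta / F)

def simulate(K, psi, dtheta, supply, dt, steps):
    # Mein-Larson extension: before ponding, infiltration is limited by
    # the supply rate; after ponding, by the Green-Ampt capacity.
    F = 0.0
    for _ in range(steps):
        f = min(supply, green_ampt_capacity(K, psi, dtheta, F))
        F += f * dt
    return F
```

    Under a low supply rate everything infiltrates, while a high-intensity supply quickly exceeds the falling capacity and the excess becomes runoff, which is the behavior the swale credit calculation depends on.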

  18. A Transition from a Traditional to a Project-Like Physical Chemistry Laboratory via a Heterogeneous Catalysis Study.

    ERIC Educational Resources Information Center

    Goldwasser, M. R.; Leal, O.

    1979-01-01

    Outlines an approach for instruction in a physical chemistry laboratory which combines traditional and project-like experiments. An outline of laboratory experiments and examples of project-like experiments are included. (BT)

  19. Parallel discrete event simulation: A shared memory approach

    NASA Technical Reports Server (NTRS)

    Reed, Daniel A.; Malony, Allen D.; Mccredie, Bradley D.

    1987-01-01

    With traditional event list techniques, evaluating a detailed discrete event simulation model can often require hours or even days of computation time. Parallel simulation mimics the interacting servers and queues of a real system by assigning each simulated entity to a processor. By eliminating the event list and maintaining only sufficient synchronization to ensure causality, parallel simulation can potentially provide speedups that are linear in the number of processors. A set of shared memory experiments is presented using the Chandy-Misra distributed simulation algorithm to simulate networks of queues. Parameters include queueing network topology and routing probabilities, number of processors, and assignment of network nodes to processors. These experiments show that Chandy-Misra distributed simulation is a questionable alternative to sequential simulation of most queueing network models.
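    The conservative synchronization at the heart of Chandy-Misra can be shown with a tiny sketch: a logical process with several input channels may safely consume only the event with the minimum channel clock, and null messages, which carry a timestamp but no work, advance a channel clock so an idle neighbor cannot block progress. Everything below is an illustrative Python toy, not the benchmarked implementation.

```python
NULL = object()   # null-message payload: advances a channel clock only

def lp_merge(chan_a, chan_b):
    # One logical process with two input channels, each delivering
    # (timestamp, payload) pairs in nondecreasing timestamp order.
    # Conservative rule: consume only from the channel whose next
    # timestamp (its channel clock) is minimal; no earlier event can
    # arrive there later, so causality holds without a global event list.
    out = []
    while chan_a or chan_b:
        ta = chan_a[0][0] if chan_a else float("inf")
        tb = chan_b[0][0] if chan_b else float("inf")
        t, payload = chan_a.pop(0) if ta <= tb else chan_b.pop(0)
        if payload is not NULL:       # null messages produce no output
            out.append((t, payload))
    return out

merged = lp_merge([(1, "a"), (4, NULL), (5, "c")],
                  [(2, "b"), (6, "d")])
```

    In a live run the null message at time 4 is what lets the process handle the time-2 and later events without waiting on a quiet channel; the volume of such null messages is one reason the paper finds the algorithm a questionable win for many queueing models.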

  20. Xyce Parallel Electronic Simulator - Users' Guide Version 2.1.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hutchinson, Scott A; Hoekstra, Robert J.; Russo, Thomas V.

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: the capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; improved performance for all numerical kernels (e.g., time integrator, nonlinear and linear solvers) through state-of-the-art algorithms and novel techniques; device models which are specifically tailored to meet Sandia's needs, including many radiation-aware devices; and object-oriented code design and implementation using modern coding practices that ensure that the Xyce Parallel Electronic Simulator will be maintainable and extensible far into the future. Xyce is a parallel code in the most general sense of the phrase, a message passing parallel implementation, which allows it to run efficiently on the widest possible number of computing platforms. These include serial, shared-memory, and distributed-memory parallel as well as heterogeneous platforms. Careful attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. The development of Xyce provides a platform for computational research and development aimed specifically at the needs of the Laboratory. With Xyce, Sandia has an "in-house" capability with which both new electrical (e.g., device model development) and algorithmic (e.g., faster time-integration methods, parallel solver algorithms) research and development can be performed.
As a result, Xyce is a unique electrical simulation capability, designed to meet the unique needs of the laboratory.

  1. Perturbation Experiments: Approaches for Metabolic Pathway Analysis in Bioreactors.

    PubMed

    Weiner, Michael; Tröndle, Julia; Albermann, Christoph; Sprenger, Georg A; Weuster-Botz, Dirk

    2016-01-01

    In the last decades, targeted metabolic engineering of microbial cells has become one of the major tools in bioprocess design and optimization. For successful application, detailed knowledge is necessary about the relevant metabolic pathways and their regulation inside the cells. Since in vitro experiments cannot properly reproduce process conditions and behavior, process data about the cells' metabolic state have to be collected in vivo. For this purpose, special techniques and methods are necessary; most techniques enabling in vivo characterization of metabolic pathways therefore rely on perturbation experiments, which can be divided into dynamic and steady-state approaches. To avoid any process disturbance, approaches that enable perturbation of cell metabolism in parallel with the continuing production process are preferable. Furthermore, the fast dynamics of microbial production processes amplifies the need for parallelized data generation. These points motivate the development of a parallelized approach for multiple metabolic perturbation experiments outside the operating production reactor. An appropriate approach for in vivo characterization of metabolic pathways is presented and applied exemplarily to a microbial L-phenylalanine production process at the 15 L scale.

  2. Ghost writer | ASCR Discovery

    Science.gov Websites

    the one illustrated here, the outer membrane protein OprF of Pseudomonas aeruginosa in its -1990s, NWChem was designed to run on networked processors, as in an HPC system, using one-sided communication, says Jeff Hammond of Intel Corp.'s Parallel Computing Laboratory. In one-sided communication, a

  3. Practical Application of Fundamental Concepts in Exercise Physiology

    ERIC Educational Resources Information Center

    Ramsbottom R.; Kinch, R. F. T.; Morris, M. G.; Dennis, A. M.

    2007-01-01

    The collection of primary data in laboratory classes enhances undergraduate practical and critical thinking skills. The present article describes the use of a lecture program, running in parallel with a series of linked practical classes, that emphasizes classical or standard concepts in exercise physiology. The academic and practical program ran…

  4. Effects of Early Seizures on Later Behavior and Epileptogenicity

    ERIC Educational Resources Information Center

    Holmes, Gregory L.

    2004-01-01

    Both clinical and laboratory studies demonstrate that seizures early in life can result in permanent behavioral abnormalities and enhance epileptogenicity. Understanding the critical periods of vulnerability of the developing nervous system to seizure-induced changes may provide insights into parallel or divergent processes in the development of…

  5. 75 FR 15675 - Professional Research Experience Program in Chemical Science and Technology Laboratory...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-30

    ... in physics, chemistry, mathematics, computer science, or engineering. Institutions should have a 4..., mathematics, computer science, or engineering with work experiences in laboratories or other settings...-0141-01] Professional Research Experience Program in Chemical Science and Technology Laboratory...

  6. Bioinformatics algorithm based on a parallel implementation of a machine learning approach using transducers

    NASA Astrophysics Data System (ADS)

    Roche-Lima, Abiel; Thulasiram, Ruppa K.

    2012-02-01

    Finite automata in which each transition is augmented with an output label in addition to the familiar input label are known as finite-state transducers. Transducers have been used to analyze some fundamental issues in bioinformatics: weighted finite-state transducers have been applied to pairwise alignments of DNA and protein sequences, as well as to the development of kernels for computational biology. Machine learning algorithms for conditional transducers have been implemented and used for DNA sequence analysis. Transducer learning algorithms are based on the computation of conditional probabilities, which relies on techniques such as pair-database creation, normalization (Maximum-Likelihood normalization), and parameter optimization (Expectation-Maximization, EM). These techniques are intrinsically costly to compute, all the more so in bioinformatics, where database sizes are large. In this work, we describe a parallel implementation of an algorithm that learns conditional transducers using these techniques. The algorithm is oriented to bioinformatics applications such as alignments, phylogenetic trees, and other genome-evolution studies. Several experiments were carried out with the parallel and sequential algorithms on WestGrid (specifically, on the Breezy cluster). The results show that our parallel algorithm is scalable, since execution times are reduced considerably as the data-size parameter is increased. In another experiment, the precision parameter was varied; here, too, the parallel algorithm gave smaller execution times. Finally, the number of threads used to execute the parallel algorithm on the Breezy cluster was varied: speedup increases considerably as more threads are used, but converges for 16 or more threads.

  7. Pretreatment Engineering Platform Phase 1 Final Test Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurath, Dean E.; Hanson, Brady D.; Minette, Michael J.

    2009-12-23

    Pacific Northwest National Laboratory (PNNL) was tasked by Bechtel National Inc. (BNI) on the River Protection Project, Hanford Tank Waste Treatment and Immobilization Plant (RPP-WTP) project to conduct testing to demonstrate the performance of the WTP Pretreatment Facility (PTF) leaching and ultrafiltration processes at an engineering-scale. In addition to the demonstration, the testing was to address specific technical issues identified in Issue Response Plan for Implementation of External Flowsheet Review Team (EFRT) Recommendations - M12, Undemonstrated Leaching Processes.( ) Testing was conducted in a 1/4.5-scale mock-up of the PTF ultrafiltration system, the Pretreatment Engineering Platform (PEP). Parallel laboratory testing was conducted in various PNNL laboratories to allow direct comparison of process performance at an engineering-scale and a laboratory-scale. This report presents and discusses the results of those tests.

  8. Multidimensional Screening as a Pharmacology Laboratory Experience.

    ERIC Educational Resources Information Center

    Malone, Marvin H.; And Others

    1979-01-01

    A multidimensional pharmacodynamic screening experiment that addresses drug interaction is included in the pharmacology-toxicology laboratory experience of pharmacy students at the University of the Pacific. The student handout with directions for the procedure is reproduced, drug compounds tested are listed, and laboratory evaluation results are…

  9. Experiences Using OpenMP Based on Compiler Directed Software DSM on a PC Cluster

    NASA Technical Reports Server (NTRS)

    Hess, Matthias; Jost, Gabriele; Mueller, Matthias; Ruehle, Roland; Biegel, Bryan (Technical Monitor)

    2002-01-01

    In this work we report on our experiences running OpenMP (message passing) programs on a commodity cluster of PCs (personal computers) running a software distributed shared memory (DSM) system. We describe our test environment and report on the performance of a subset of the NAS (NASA Advanced Supercomputing) Parallel Benchmarks that have been automatically parallelized for OpenMP. We compare the performance of the OpenMP implementations with that of their message passing counterparts and discuss performance differences.

  10. Subthreshold parallel pumping experiments on the quasi one-dimensional S = {1}/{2} ferromagnets [C 6H 11NH 3]CuBr 3 and [C 6H 11NH 3]CuCl 3

    NASA Astrophysics Data System (ADS)

    Hoogerbeets, R.; Wiegers, S. A. J.; Van Duyneveldt, A. J.

    1985-04-01

    Subthreshold parallel pumping experiments on [C 6H 11NH 3]CuBr 3 (abbreviated as CHAB) and [C 6H 11NH 3]CuCl 3 (CHAC) at 9.6 and 18.3 GHz are reported. It is shown that the experimental results can be explained using the values of the parameters as have been obtained from previously reported FMR measurements.

  11. Sensing underground coal gasification by ground penetrating radar

    NASA Astrophysics Data System (ADS)

    Kotyrba, Andrzej; Stańczyk, Krzysztof

    2017-12-01

    The paper describes the results of research on the applicability of the ground penetrating radar (GPR) method for remote sensing and monitoring of the underground coal gasification (UCG) processes. The gasification of coal in a bed entails various technological problems and poses risks to the environment. Therefore, in parallel with research on coal gasification technologies, it is necessary to develop techniques for remote sensing of the process environment. One such technique may be the radar method, which allows imaging of regions of mass loss (voids, fissures) in coal during and after carrying out a gasification process in the bed. The paper describes two research experiments. The first one was carried out on a large-scale model constructed on the surface. It simulated a coal seam in natural geological conditions. A second experiment was performed in a shallow coal deposit maintained in a disused mine and kept accessible for research purposes. Tests performed in the laboratory and in situ conditions showed that the method provides valuable data for assessing and monitoring gasification surfaces in the UCG processes. The advantage of the GPR method is its high resolution and the possibility of determining the spatial shape of various zones and forms created in the coal by the gasification process.

  12. Nonlinear Electrostatic Properties of Lunar Dust

    NASA Technical Reports Server (NTRS)

    Irwin, Stacy A.

    2012-01-01

    A laboratory experiment was designed to study the induction charging and charge decay characteristics of small dielectric particles, or glass beads. Initially, the goal of the experiment was further understanding of induction charging of lunar dust particles; however, the mechanism of charging became a point of greater interest as the project continued. Within an environmentally controlled acrylic glove box were placed a large parallel-plate capacitor and a high-voltage (HV) power supply with reversible polarity. Spherical 1-mm and 0.5-mm glass beads, singly, were placed between the plates, and their behaviors recorded on video and quantified. Nearly a hundred trials at various humidities were performed. Analysis of the results indicated a non-linear relationship between humidity and particle charge exchange time (CET) for both sizes of beads. Further, a difference in CET between top-resting and bottom-resting beads hinted at a charging mechanism other than simple induction. Results from the 1-mm bead trials were presented at several space science and physics conferences in 2008 and 2009, and were published as a Master's thesis in August 2009. Tangential work stemming from this project resulted in presentations at other international conferences in 2010, and selection to attend a workshop on granular matter flow in 2011.

  13. Development of the Science Data System for the International Space Station Cold Atom Lab

    NASA Technical Reports Server (NTRS)

    van Harmelen, Chris; Soriano, Melissa A.

    2015-01-01

    Cold Atom Laboratory (CAL) is a facility that will enable scientists to study ultra-cold quantum gases in a microgravity environment on the International Space Station (ISS) beginning in 2016. The primary science data for each experiment consists of two images taken in quick succession. The first image is of the trapped cold atoms and the second image is of the background. The two images are subtracted to obtain optical density. These raw Level 0 atom and background images are processed into the Level 1 optical density data product, and then into the Level 2 data products: atom number, Magneto-Optical Trap (MOT) lifetime, magnetic chip-trap atom lifetime, and condensate fraction. These products can also be used as diagnostics of the instrument health. With experiments being conducted for 8 hours every day, the amount of data being generated poses many technical challenges, such as downlinking and managing the required data volume. A parallel processing design is described, implemented, and benchmarked. In addition to optimizing the data pipeline, accuracy and speed in producing the Level 1 and 2 data products is key. Algorithms for feature recognition are explored, facilitating image cropping and accurate atom number calculations.
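
The optical-density step described above (an atom image and a background image taken in quick succession, then combined) follows the standard absorption-imaging relation OD = ln(I_background / I_atoms). A minimal sketch of that relation, assuming nothing about CAL's actual pipeline beyond the formula (the function name and guard term are illustrative):

```python
import numpy as np

def optical_density(atom_img, bg_img, eps=1e-9):
    """Estimate optical density from an absorption-image pair.

    Standard absorption-imaging relation, evaluated pixel by pixel:
    OD = ln(I_background / I_atoms). `eps` guards against division
    by zero in fully absorbed or dark pixels.
    """
    atom = np.asarray(atom_img, dtype=float)
    bg = np.asarray(bg_img, dtype=float)
    return np.log((bg + eps) / (atom + eps))

# Toy 2x2 example: where the atoms absorb half the light, OD = ln(2);
# where nothing is absorbed, OD = 0.
bg = np.array([[100.0, 100.0], [100.0, 100.0]])
atoms = np.array([[50.0, 100.0], [100.0, 50.0]])
od = optical_density(atoms, bg)
```

The subtraction of the background image mentioned in the abstract happens here in log space, which is what makes the result an optical density rather than a raw intensity difference.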

  14. Fluid Line Evacuation and Freezing Experiments for Digital Radiator Concept

    NASA Technical Reports Server (NTRS)

    Berisford, Daniel F.; Birur, Gajanana C.; Miller, Jennifer R.; Sunada, Eric T.; Ganapathi, Gani B.; Stephan, Ryan; Johnson, Mark

    2011-01-01

    The digital radiator technology is one of three variable heat rejection technologies being investigated for future human-rated NASA missions. The digital radiator concept is based on a mechanically pumped fluid loop with parallel tubes carrying coolant to reject heat from the radiator surface. A series of valves actuate to start and stop fluid flow to different combinations of tubes, in order to vary the heat rejection capability of the radiator by a factor of 10 or more. When the flow in a particular leg is stopped, the fluid temperature drops and the fluid can freeze, causing damage or preventing flow from restarting. For this reason, the liquid in a stopped leg must be partially or fully evacuated upon shutdown. One of the challenges facing fluid evacuation from closed tubes arises from the vapor generated during pumping to low pressure, which can cause pump cavitation and incomplete evacuation. Here we present a series of laboratory experiments demonstrating fluid evacuation techniques that overcome these challenges by applying heat and pumping to partial vacuum. Also presented are results from qualitative testing of the freezing characteristics of several different candidate fluids, which demonstrate significant differences in freezing properties and give insight into the evacuation process.

  15. Performance of the Galley Parallel File System

    NASA Technical Reports Server (NTRS)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    As the input/output (I/O) needs of parallel scientific applications increase, file systems for multiprocessors are being designed to provide applications with parallel access to multiple disks. Many parallel file systems present applications with a conventional Unix-like interface that allows the application to access multiple disks transparently. This interface conceals the parallelism within the file system, which increases the ease of programmability, but makes it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. Furthermore, most current parallel file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic parallel workloads. Initial experiments, reported in this paper, indicate that Galley is capable of providing high-performance I/O to applications that access data in patterns that have been observed to be common.

  16. Bilingual parallel programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foster, I.; Overbeek, R.

    1990-01-01

    Numerous experiments have demonstrated that computationally intensive algorithms support adequate parallelism to exploit the potential of large parallel machines. Yet successful parallel implementations of serious applications are rare. The limiting factor is clearly programming technology. None of the approaches to parallel programming that have been proposed to date -- whether parallelizing compilers, language extensions, or new concurrent languages -- seem to adequately address the central problems of portability, expressiveness, efficiency, and compatibility with existing software. In this paper, we advocate an alternative approach to parallel programming based on what we call bilingual programming. We present evidence that this approach provides an effective solution to parallel programming problems. The key idea in bilingual programming is to construct the upper levels of applications in a high-level language while coding selected low-level components in low-level languages. This approach permits the advantages of a high-level notation (expressiveness, elegance, conciseness) to be obtained without the cost in performance normally associated with high-level approaches. In addition, it provides a natural framework for reusing existing code.

  17. Implementation of DFT application on ternary optical computer

    NASA Astrophysics Data System (ADS)

    Junjie, Peng; Youyi, Fu; Xiaofeng, Zhang; Shuai, Kong; Xinyu, Wei

    2018-03-01

    Owing to its characteristic huge number of data bits and low energy consumption, optical computing may be used in applications such as the DFT, which require a great deal of computation and can be implemented in parallel. Accordingly, DFT implementation methods in full parallel as well as in partial parallel are presented. Based on the resources of a ternary optical computer (TOC), extensive experiments were carried out. Experimental results show that the proposed schemes are correct and feasible. They provide a foundation for further exploration of applications on the TOC that need a large amount of calculation and can be processed in parallel.
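
The partial-parallel decomposition mentioned in the abstract can be illustrated in ordinary Python, away from the optical hardware: the output bins of a DFT are independent sums, so different bins can be assigned to different workers. The function and two-worker split below are a hypothetical illustration, not the paper's scheme:

```python
import numpy as np

def dft_rows(x, rows):
    """Directly evaluate the selected output bins of an N-point DFT.

    Each output bin X[k] = sum_n x[n] * exp(-2j*pi*k*n/N) depends only
    on the input, so disjoint sets of bins can be computed by
    independent workers and concatenated afterwards.
    """
    N = len(x)
    n = np.arange(N)
    return np.array([np.sum(x * np.exp(-2j * np.pi * k * n / N)) for k in rows])

x = np.array([1.0, 2.0, 3.0, 4.0])
# Two "workers", each computing half of the output bins:
X = np.concatenate([dft_rows(x, [0, 1]), dft_rows(x, [2, 3])])
```

Concatenating the two partial results reproduces the full transform, which is the essential property a partial-parallel implementation relies on.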

  18. Rethinking key–value store for parallel I/O optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kougkas, Anthony; Eslami, Hassan; Sun, Xian-He

    2015-01-26

    Key-value stores are being widely used as the storage system for large-scale internet services and cloud storage systems. However, they are rarely used in HPC systems, where parallel file systems are the dominant storage solution. In this study, we examine the architecture differences and performance characteristics of parallel file systems and key-value stores. We propose using key-value stores to optimize overall Input/Output (I/O) performance, especially for workloads that parallel file systems cannot handle well, such as the cases with intense data synchronization or heavy metadata operations. We conducted experiments with several synthetic benchmarks, an I/O benchmark, and a real application. We modeled the performance of these two systems using collected data from our experiments, and we provide a predictive method to identify which system offers better I/O performance given a specific workload. The results show that we can optimize the I/O performance in HPC systems by utilizing key-value stores.

  19. Mining algorithm for association rules in big data based on Hadoop

    NASA Astrophysics Data System (ADS)

    Fu, Chunhua; Wang, Xiaojing; Zhang, Lijun; Qiao, Liying

    2018-04-01

    To address the fact that traditional association-rule mining algorithms cannot meet the efficiency and scalability needs of large data volumes, FP-Growth is taken as an example and parallelized on the Hadoop framework with the MapReduce model. On this basis, it is improved using a transaction-reduction method to further enhance mining efficiency. The experiments, consisting of verification of the parallel mining results, an efficiency comparison between serial and parallel versions, and the relationships between mining time and node count and between mining time and data volume, were carried out on a Hadoop cluster. They show that the parallelized FP-Growth algorithm accurately mines frequent item sets, with better performance and scalability. It can thus better meet the requirements of big-data mining, efficiently extracting frequent item sets and association rules from large datasets.
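
As a toy illustration of the support-counting step that such a Hadoop/MapReduce job distributes across nodes (this is plain itemset counting, not the FP-Growth algorithm itself, and all names below are hypothetical):

```python
from collections import Counter
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Count all 1- and 2-itemsets and keep those meeting min_support.

    A single-process sketch of the counting that a MapReduce job would
    shard across mappers (emit itemset keys) and reducers (sum counts).
    """
    counts = Counter()
    for t in transactions:
        items = sorted(set(t))          # deduplicate and order each basket
        for item in items:
            counts[(item,)] += 1
        for pair in combinations(items, 2):
            counts[pair] += 1
    return {k: v for k, v in counts.items() if v >= min_support}

txns = [["bread", "milk"], ["bread", "butter"], ["bread", "milk", "butter"]]
freq = frequent_itemsets(txns, min_support=2)
```

FP-Growth avoids enumerating candidate pairs like this by compressing the transactions into an FP-tree first; the counting semantics, however, are the same.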

  20. Bringing Undergraduates and Geoscientists Together for Field-Based Geophysical Education and Research at an On-Campus Well Field

    NASA Astrophysics Data System (ADS)

    Day-Lewis, F. D.; Gray, M. B.

    2004-12-01

    Development of our Hydrogeophysics Well Field has enabled new opportunities for field-based undergraduate research and active-learning at Bucknell University. Installed in 2001-2002, the on-campus well field has become a cornerstone of field labs for hydrogeology and applied geophysics courses, and for introductory labs in engineering and environmental geology. In addition to enabling new field experiences, the well field serves as a meeting place for students and practicing geoscientists. In the last three years, we have hosted field demonstrations by alumni working in the environmental, geophysical, and water-well drilling industries; researchers from government agencies; graduate students from other universities; and geophysical equipment vendors seeking to test and demonstrate new instruments. Coordinating undergraduate research and practical course labs with field experiments led by alumni and practicing geoscientists provides students hands-on experience with new technology while educating them about career and graduate-school opportunities. In addition to being an effective pedagogical strategy, these experiences are well received by students -- enrollment in our geophysics course has tripled from three years ago. The Bucknell Hydrogeophysics Well Field consists of five bedrock wells, installed in a fractured-rock aquifer in the Wills Creek Shale. The wells are open in the bedrock, facilitating geophysical and hydraulic measurements. To date, students have helped acquire from one or more wells: (1) open-hole slug- and aquifer-test data; (2) packer-test data from isolated borehole intervals; (3) flow-meter logs; (4) acoustic and optical televiewer logs; (5) standard borehole logs including single-point resistance, caliper, and natural-gamma; (6) borehole video footage; (7) electrical resistivity tomograms; (8) water levels while drilling; and (9) water chemistry and temperature logs.
Preliminary student-led data analysis indicates that sparse discrete fractures dominate the response of water levels to pumping. The three sets of fractures observed in the wells are consistent with those observed in outcrops around Bucknell: (1) bedding sub-parallel fractures; (2) joints; and (3) fractures parallel to rock cleavage. Efforts are ongoing to develop a CD-ROM of field data, photographs and video footage documenting the site and experiments; the CD is intended for publication as a "Virtual Field Laboratory" teaching tool for undergraduate hydrogeology and applied geophysics. We have seen the benefits of merging theory and practice in our undergraduate curriculum, and we seek to make these benefits available to other schools.

  1. 180 MW/180 KW pulse modulator for S-band klystron of LUE-200 linac of IREN installation of JINR

    NASA Astrophysics Data System (ADS)

    Su, Kim Dong; Sumbaev, A. P.; Shvetsov, V. N.

    2014-09-01

    A proposal for the development of a pulse modulator with 180 MW pulse power and 180 kW average power for the pulsed S-band klystrons of the LUE-200 linac of the IREN installation at the Frank Laboratory of Neutron Physics (FLNP) at JINR is formulated. The main requirements, key parameters, and element base of the modulator are presented. A variant of the basic scheme, based on a 14- (or 11-) stage, two-parallel PFN with a thyratron switch (TGI2-10K/50) and six parallel high-voltage power supplies (CCPS), is considered.

  2. Clock Agreement Among Parallel Supercomputer Nodes

    DOE Data Explorer

    Jones, Terry R.; Koenig, Gregory A.

    2014-04-30

    This dataset presents measurements that quantify the clock-synchronization time-agreement characteristics among several high performance computers, including the current world's most powerful machine for open science, the U.S. Department of Energy's Titan machine sited at Oak Ridge National Laboratory. These ultra-fast machines derive much of their computational capability from extreme node counts (over 18000 nodes in the case of the Titan machine). Time agreement is commonly utilized by parallel programming applications and tools, distributed programming applications and tools, and system software. Our time-agreement measurements detail the degree of time variance between nodes and how that variance changes over time. The dataset includes empirical measurements and the accompanying spreadsheets.
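
Clock offset between two nodes is typically estimated with an NTP-style request/reply exchange; the four-timestamp formula below is standard background for such time-agreement measurements, shown only as context for the dataset, not as its actual collection method:

```python
def ntp_offset(t0, t1, t2, t3):
    """NTP-style clock offset and delay from one request/reply exchange.

    t0: client send time, t3: client receive time (client clock);
    t1: server receive time, t2: server send time (server clock).
    offset = server clock minus client clock; delay = round-trip
    network time with the server's processing time removed.
    """
    offset = ((t1 - t0) + (t2 - t3)) / 2.0
    delay = (t3 - t0) - (t2 - t1)
    return offset, delay

# Synthetic example: server clock 5 units ahead, 1 unit of one-way
# latency in each direction, zero server processing time.
off, d = ntp_offset(t0=0.0, t1=6.0, t2=6.0, t3=2.0)
```

The formula cancels symmetric network latency, which is why repeated exchanges like this can resolve inter-node clock variance far below the raw message latency.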

  3. Lower Limb Rehabilitation Using Patient Data

    PubMed Central

    Saadat, Mozafar

    2016-01-01

    The aim of this study is to investigate the performance of a 6-DoF parallel robot in tracking the movement of the foot trajectory of a paretic leg during a single stride. The foot trajectories of nine patients with a paretic leg, both male and female, were measured and analysed with a Vicon system in a gait laboratory. Based on kinematic and dynamic analysis of a 6-DoF UPS parallel robot, an algorithm was developed in MATLAB to calculate the lengths of the actuators and the forces they require during all trajectories. The workspace and singularity points of the robot were then investigated in nine different cases. A 6-DoF UPS parallel robot prototype with high repeatability was designed and built in order to simulate a single stride. Results showed that the robot was capable of tracking all of the trajectories with a maximum position error of 1.2 mm. PMID:27721648
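
For a 6-DoF UPS (Stewart-Gough) platform like the one described, the actuator-length calculation is the standard inverse kinematics L_i = |p + R b_i - a_i|, where a_i and b_i are the base and platform joint positions, p the platform position, and R its orientation. A Python sketch with purely illustrative geometry (not the paper's robot parameters or MATLAB code):

```python
import numpy as np

def leg_lengths(base_pts, plat_pts, p, R):
    """Inverse kinematics of a 6-UPS parallel robot.

    base_pts, plat_pts: (6, 3) joint locations in the base and
    moving-platform frames; p: platform origin in the base frame;
    R: 3x3 platform rotation. Leg i has length |p + R @ b_i - a_i|.
    """
    tips = p + plat_pts @ R.T          # platform joints in the base frame
    return np.linalg.norm(tips - base_pts, axis=1)

# Illustrative geometry: coincident hexagonal joint circles, platform
# raised 1 m above the base with no rotation -> every leg is 1 m long.
ang = np.deg2rad(np.arange(0, 360, 60))
hexagon = np.stack([np.cos(ang), np.sin(ang), np.zeros(6)], axis=1)
L = leg_lengths(hexagon, hexagon, np.array([0.0, 0.0, 1.0]), np.eye(3))
```

Evaluating this along a recorded foot trajectory (a sequence of p and R values) yields the actuator length profile the controller must track for each stride.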

  4. Implementation, capabilities, and benchmarking of Shift, a massively parallel Monte Carlo radiation transport code

    DOE PAGES

    Pandya, Tara M.; Johnson, Seth R.; Evans, Thomas M.; ...

    2015-12-21

    This paper discusses the implementation, capabilities, and validation of Shift, a massively parallel Monte Carlo radiation transport package developed and maintained at Oak Ridge National Laboratory. It has been developed to scale well from laptop to small computing clusters to advanced supercomputers. Special features of Shift include hybrid capabilities for variance reduction such as CADIS and FW-CADIS, and advanced parallel decomposition and tally methods optimized for scalability on supercomputing architectures. Shift has been validated and verified against various reactor physics benchmarks and compares well to other state-of-the-art Monte Carlo radiation transport codes such as MCNP5, CE KENO-VI, and OpenMC. Some specific benchmarks used for verification and validation include the CASL VERA criticality test suite and several Westinghouse AP1000 ® problems. These benchmark and scaling studies show promising results.

  5. Fluid driven fracture mechanics in highly anisotropic shale: a laboratory study with application to hydraulic fracturing

    NASA Astrophysics Data System (ADS)

    Gehne, Stephan; Benson, Philip; Koor, Nick; Enfield, Mark

    2017-04-01

    The finding of considerable volumes of hydrocarbon resources within tight sedimentary rock formations in the UK led to focused attention on the fundamental fracture properties of low permeability rock types and hydraulic fracturing. Despite much research in these fields, there remains a scarcity of available experimental data concerning the fracture mechanics of fluid driven fracturing and the fracture properties of anisotropic, low permeability rock types. In this study, hydraulic fracturing is simulated in a controlled laboratory environment to track fracture nucleation (location) and propagation (velocity) in space and time and assess how environmental factors and rock properties influence the fracture process and the developing fracture network. Here we report data on employing fluid overpressure to generate a permeable network of micro tensile fractures in a highly anisotropic shale (~50% P-wave velocity anisotropy). Experiments are carried out in a triaxial deformation apparatus using cylindrical samples. The bedding planes are orientated either parallel or normal to the major principal stress direction (σ1). A newly developed technique, using a steel guide arrangement to direct pressurised fluid into a sealed section of an axially drilled conduit, allows the pore fluid to contact the rock directly and to initiate tensile fractures from the pre-defined zone inside the sample. Acoustic Emission location is used to record and map the nucleation and development of the micro-fracture network. Indirect tensile strength measurements at atmospheric pressure show a high tensile strength anisotropy (~60%) of the shale. Depending on the relative bedding orientation within the stress field, we find that fluid induced fractures in the sample propagate in two of the three principal fracture orientations: Divider and Short-Transverse. 
The fracture progresses parallel to the bedding plane (Short-Transverse orientation) if the bedding plane is aligned (parallel) with the direction of σ1. Conversely, the crack plane develops perpendicular to the bedding plane if the bedding plane is orientated normal to σ1. Fracture initiation pressures are higher in the Divider orientation (~24 MPa) than in the Short-Transverse orientation (~14 MPa), showing a tensile strength anisotropy (~42%) comparable to the ambient tensile strength results. We then use X-Ray Computed Tomography (CT) 3D-images to evaluate the evolved fracture network in terms of fracture pattern, aperture and post-test water permeability. For both fracture orientations, very fine, axial fractures evolve over the entire length of the sample. For fracturing in the Divider orientation, it has been observed that, in some cases, secondary fractures branch off the main fracture. Test data from fluid driven fracturing experiments suggest that fracture pattern, fracture propagation trajectories and fracturing fluid pressure (initiation and propagation pressure) are predominantly controlled by the interaction between the anisotropic mechanical properties of the shale and the anisotropic stress environment. The orientation of inherent rock anisotropy relative to the principal stress directions seems to be the main control on fracture orientation and required fracturing pressure.

  6. A Novel Implementation of Massively Parallel Three Dimensional Monte Carlo Radiation Transport

    NASA Astrophysics Data System (ADS)

    Robinson, P. B.; Peterson, J. D. L.

    2005-12-01

    The goal of our summer project was to implement the difference formulation for radiation transport into Cosmos++, a multidimensional, massively parallel, magnetohydrodynamics code for astrophysical applications (Peter Anninos - AX). The difference formulation is a new method for Symbolic Implicit Monte Carlo thermal transport (Brooks and Szöke - PAT). Formerly, simultaneous implementation of fully implicit Monte Carlo radiation transport in multiple dimensions on multiple processors had not been convincingly demonstrated. We found that a combination of the difference formulation and the inherent structure of Cosmos++ makes such an implementation both accurate and straightforward. We developed a "nearly nearest neighbor physics" technique to allow each processor to work independently, even with a fully implicit code. This technique, coupled with the increased accuracy of an implicit Monte Carlo solution and the efficiency of parallel computing systems, allows us to demonstrate the possibility of massively parallel thermal transport. This work was performed under the auspices of the U.S. Department of Energy by University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48.

  7. Beyond 2D: Parallel Electric Fields and Dissipation in Guide Field Reconnection

    NASA Astrophysics Data System (ADS)

    Wilder, F. D.; Ergun, R.; Ahmadi, N.; Goodrich, K.; Eriksson, S.; Shimoda, E.; Burch, J. L.; Phan, T.; Torbert, R. B.; Strangeway, R. J.; Giles, B. L.; Lindqvist, P. A.; Khotyaintsev, Y. V.

    2017-12-01

    In 2015, NASA launched the Magnetospheric Multiscale (MMS) mission to study the phenomenon of magnetic reconnection down to the electron scale. Advantages of MMS include a 20 s spin period and long axial booms, which together allow for measurement of 3-D electric fields with accuracy down to 1 mV/m. During the two dayside phases of the prime mission, MMS observed multiple electron and ion diffusion region events at the Earth's subsolar and flank magnetopause, as well as in the magnetosheath, providing an opportunity to study both symmetric and asymmetric reconnection at a variety of guide field strengths. We present a review of parallel electric fields observed by MMS during diffusion region events and discuss their implications for simulations and laboratory observations of reconnection. We find that as the guide field increases, the dissipation in the diffusion region transitions from being due to currents and fields perpendicular to the background magnetic field to being associated with parallel electric fields and currents. Additionally, the observed parallel electric fields are significantly larger than those predicted by simulations of reconnection under strong guide field conditions.

  8. Reference earth orbital research and applications investigations (blue book). Volume 3: Physics

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The definition of physics experiments to be conducted aboard the space station is presented. The four functional program elements are: (1) space physics research laboratory, (2) plasma physics and environmental perturbation laboratory, (3) cosmic ray physics laboratory, and (4) physics and chemistry laboratory. The experiments to be conducted by each facility are defined and the crew member requirements to accomplish the experiments are presented.

  9. AC losses in horizontally parallel HTS tapes for possible wireless power transfer applications

    NASA Astrophysics Data System (ADS)

    Shen, Boyang; Geng, Jianzhao; Zhang, Xiuchang; Fu, Lin; Li, Chao; Zhang, Heng; Dong, Qihuan; Ma, Jun; Gawith, James; Coombs, T. A.

    2017-12-01

    This paper presents the concept of horizontally parallel HTS tapes, together with an AC loss study and an investigation of possible wireless power transfer (WPT) applications. An example with three parallel HTS tapes was proposed, and its AC loss was studied both experimentally, using the electrical method, and numerically, using the 2D H-formulation on the FEM platform of COMSOL Multiphysics. The electromagnetic induction around the three parallel tapes was monitored in the COMSOL simulation. The electromagnetic induction and AC losses generated by a conventional three-turn coil were also simulated and compared with the case of three parallel tapes carrying the same AC transport current. The analysis demonstrates that parallel HTS tapes could potentially be used in wireless power transfer systems and could have lower total AC losses than conventional HTS coils.
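    For a rough analytic cross-check of such simulations, the Norris thin-strip formula gives the self-field transport AC loss of a single thin superconducting tape per cycle; it ignores the magnetic interaction between neighbouring tapes that the 2D H-formulation model captures. A minimal sketch, with illustrative tape parameters rather than values from the paper:

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability (H/m)

def norris_strip_loss(i_peak, i_c):
    """Transport AC loss per cycle and per unit length (J/m/cycle) of a thin
    superconducting strip carrying a sinusoidal current (Norris, 1970)."""
    i = i_peak / i_c  # normalized current amplitude, must satisfy 0 <= i < 1
    if not 0.0 <= i < 1.0:
        raise ValueError("require 0 <= i_peak/i_c < 1")
    if i == 0.0:
        return 0.0
    g = (1 - i) * math.log(1 - i) + (1 + i) * math.log(1 + i) - i * i
    return MU0 * i_c ** 2 / math.pi * g

# Illustrative tape: Ic = 100 A, driven at 50 A peak, 50 Hz.
q_cycle = norris_strip_loss(50.0, 100.0)  # J/m per cycle
p_50hz = q_cycle * 50.0                   # average dissipated power, W/m
```

Multiplying the per-cycle loss by the drive frequency gives the average dissipated power per unit tape length, which is the quantity usually compared against electrical-method measurements.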

  10. A Two-Week Guided Inquiry Protein Separation and Detection Experiment for Undergraduate Biochemistry

    ERIC Educational Resources Information Center

    Carolan, James P.; Nolta, Kathleen V.

    2016-01-01

    A laboratory experiment for teaching protein separation and detection in an undergraduate biochemistry laboratory course is described. This experiment, performed in two, 4 h laboratory periods, incorporates guided inquiry principles to introduce students to the concepts behind and difficulties of protein purification. After using size-exclusion…

  11. Simulated and Virtual Science Laboratory Experiments: Improving Critical Thinking and Higher-Order Learning Skills

    ERIC Educational Resources Information Center

    Simon, Nicole A.

    2013-01-01

    Virtual laboratory experiments using interactive computer simulations are not being employed as viable alternatives to the laboratory science curriculum at extensive enough rates within higher education. Rote traditional lab experiments are currently the norm and do not address inquiry, critical thinking, and cognition throughout the laboratory…

  12. Wiki Laboratory Notebooks: Supporting Student Learning in Collaborative Inquiry-Based Laboratory Experiments

    ERIC Educational Resources Information Center

    Lawrie, Gwendolyn Angela; Grøndahl, Lisbeth; Boman, Simon; Andrews, Trish

    2016-01-01

    Recent examples of high-impact teaching practices in the undergraduate chemistry laboratory that include course-based undergraduate research experiences and inquiry-based experiments require new approaches to assessing individual student learning outcomes. Instructors require tools and strategies that can provide them with insight into individual…

  13. Green Fluorescent Protein-Focused Bioinformatics Laboratory Experiment Suitable for Undergraduates in Biochemistry Courses

    ERIC Educational Resources Information Center

    Rowe, Laura

    2017-01-01

    An introductory bioinformatics laboratory experiment focused on protein analysis has been developed that is suitable for undergraduate students in introductory biochemistry courses. The laboratory experiment is designed to be potentially used as a "stand-alone" activity in which students are introduced to basic bioinformatics tools and…

  14. An Example of a Laboratory Teaching Experience in a Professional Year (Plan B) Program

    ERIC Educational Resources Information Center

    Miller, P. J.; And Others

    1978-01-01

    A laboratory teaching experience (L.T.E.) was designed to focus on three teaching behaviors. It was recognized that a behavioral approach to teaching simplified its complexity by isolating specific teaching behaviors. The development and evaluation of the laboratory teaching experience are discussed. (Author/RK)

  15. Capillary Electrophoresis Analysis of Cations in Water Samples: An Experiment for the Introductory Laboratory

    ERIC Educational Resources Information Center

    Pursell, Christopher J.; Chandler, Bert; Bushey, Michelle M.

    2004-01-01

    Capillary electrophoresis is gradually working its way into the undergraduate laboratory curriculum. Typically, experiments utilizing this newer technology have been introduced into analytical or instrumental courses. The authors of this article have introduced an experiment into the introductory laboratory that utilizes capillary electrophoresis…

  16. An Undergraduate Laboratory Experiment in Bioinorganic Chemistry: Ligation States of Myoglobin

    ERIC Educational Resources Information Center

    Bailey, James A.

    2011-01-01

    Although there are numerous inorganic model systems that are readily presented as undergraduate laboratory experiments in bioinorganic chemistry, there are few examples that explore the inorganic chemistry of actual biological molecules. We present a laboratory experiment using the oxygen-binding protein myoglobin that can be easily incorporated…

  17. Consumer-Oriented Laboratory Activities: A Manual for Secondary Science Students.

    ERIC Educational Resources Information Center

    Anderson, Jacqueline; McDuffie, Thomas E., Jr.

    This document provides a laboratory manual for use by secondary level students in performing consumer-oriented laboratory experiments. Each experiment includes an introductory question outlining the purpose of the investigation, a detailed discussion, detailed procedures, questions to be answered upon completing the experiment, and information for…

  18. Circular dichroism spectroscopy: Enhancing a traditional undergraduate biochemistry laboratory experience.

    PubMed

    Lewis, Russell L; Seal, Erin L; Lorts, Aimee R; Stewart, Amanda L

    2017-11-01

    The undergraduate biochemistry laboratory curriculum is designed to provide students with experience in protein isolation and purification protocols as well as various data analysis techniques, which enhance the biochemistry lecture course and give students a broad range of tools upon which to build in graduate level laboratories or once they begin their careers. One of the most common biochemistry protein purification experiments is the isolation and characterization of cytochrome c. Students across the country purify cytochrome c, lysozyme, or some other well-known protein to learn these common purification techniques. What this series of experiments lacks is the use of sophisticated instrumentation that is rarely available to undergraduate students. To give students a broader background in biochemical spectroscopy techniques, a new circular dichroism (CD) laboratory experiment was introduced into the biochemistry laboratory curriculum. This CD experiment provides students with a means of conceptualizing the secondary structure of their purified protein, and assessments indicate that students' understanding of the technique increased significantly. Students conducted this experiment with ease and in a short time frame, so this laboratory is conducive to merging with other data analysis techniques within a single laboratory period. © 2017 by The International Union of Biochemistry and Molecular Biology, 45(6):515-520, 2017.

  19. Integrated parallel reception, excitation, and shimming (iPRES).

    PubMed

    Han, Hui; Song, Allen W; Truong, Trong-Kha

    2013-07-01

    To develop a new concept for a hardware platform that enables integrated parallel reception, excitation, and shimming. This concept uses a single coil array rather than separate arrays for parallel excitation/reception and B0 shimming. It relies on a novel design that allows a radiofrequency current (for excitation/reception) and a direct current (for B0 shimming) to coexist independently in the same coil. Proof-of-concept B0 shimming experiments were performed with a two-coil array in a phantom, whereas B0 shimming simulations were performed with a 48-coil array in the human brain. Our experiments show that individually optimized direct currents applied in each coil can reduce the B0 root-mean-square error by 62-81% and minimize distortions in echo-planar images. The simulations show that dynamic shimming with the 48-coil integrated parallel reception, excitation, and shimming array can reduce the B0 root-mean-square error in the prefrontal and temporal regions by 66-79% as compared with static second-order spherical harmonic shimming and by 12-23% as compared with dynamic shimming with a 48-coil conventional shim array. Our results demonstrate the feasibility of the integrated parallel reception, excitation, and shimming concept to perform parallel excitation/reception and B0 shimming with a unified coil system as well as its promise for in vivo applications. Copyright © 2013 Wiley Periodicals, Inc.
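    Conceptually, choosing the DC shim currents is a small linear least-squares problem: each coil contributes a known field map per unit current, and the currents are chosen to cancel the measured B0 offset. A toy sketch with an invented four-voxel phantom and two hypothetical coil maps (the actual study optimized 2- and 48-coil arrays over full 3-D field maps):

```python
import math

def rmse(v):
    """Root-mean-square of a field-offset vector."""
    return math.sqrt(sum(x * x for x in v) / len(v))

def solve_shim_currents(b0, fieldmaps):
    """Least-squares currents minimizing ||b0 + sum_i I_i * fieldmap_i||.
    Solves the normal equations by Gaussian elimination (fine for a few coils)."""
    n, m = len(fieldmaps), len(b0)
    # Augmented normal-equation matrix [M^T M | -M^T b0].
    a = [[sum(fieldmaps[r][k] * fieldmaps[c][k] for k in range(m)) for c in range(n)]
         + [-sum(fieldmaps[r][k] * b0[k] for k in range(m))]
         for r in range(n)]
    for col in range(n):                      # forward elimination with pivoting
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n + 1):
                a[r][c] -= f * a[col][c]
    currents = [0.0] * n
    for r in range(n - 1, -1, -1):            # back substitution
        s = a[r][n] - sum(a[r][c] * currents[c] for c in range(r + 1, n))
        currents[r] = s / a[r][r]
    return currents

# Toy phantom: the B0 inhomogeneity is mostly a blend of the two coil maps.
b0 = [3.0, 1.0, -2.0, 0.5]                          # measured offset (a.u.)
maps = [[1.0, 0.0, -1.0, 0.0], [0.0, 1.0, 0.0, 1.0]]  # field per unit current
currents = solve_shim_currents(b0, maps)
shimmed = [b0[k] + sum(currents[i] * maps[i][k] for i in range(2)) for k in range(4)]
```

Comparing `rmse(shimmed)` with `rmse(b0)` gives the same figure of merit (B0 root-mean-square error reduction) that the paper reports for its phantom and brain simulations.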

  20. A multi-satellite orbit determination problem in a parallel processing environment

    NASA Technical Reports Server (NTRS)

    Deakyne, M. S.; Anderle, R. J.

    1988-01-01

    The Engineering Orbit Analysis Unit at GE Valley Forge used an Intel Hypercube parallel processor to investigate the performance of, and gain experience with, parallel processors on a multi-satellite orbit determination problem. A general study was selected in which major blocks of computation for the multi-satellite orbit computations were used as units to be assigned to the various processors on the Hypercube, so that problems encountered or successes achieved in addressing the orbit determination problem would be more likely to be transferable to other parallel processors. The prime objective was to study the algorithm to allow processing of observations later in time than those employed in the state update. Expertise in ephemeris determination was exploited in addressing these problems, and the facility was used to bring a realism to the study that would highlight problems that might not otherwise be anticipated. Secondary objectives were to gain experience with a non-trivial problem in a parallel processing environment; to explore the necessary interplay of serial and parallel sections of the algorithm through timing studies; and to explore granularity (coarse vs. fine grain), for which there is a limit above which there would be a risk of starvation, with the majority of nodes idle, and below which the overhead associated with splitting the problem may require more work and communication time than is useful.
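    The granularity trade-off described above can be caricatured with an Amdahl-style timing model in which each additional node adds a fixed communication/coordination overhead; the serial fraction and overhead values below are illustrative, not measured Hypercube numbers:

```python
def speedup(serial_frac, nodes, overhead_per_node=0.0):
    """Predicted speedup over a single node for a job with a fixed serial
    fraction and a per-node overhead, both expressed as fractions of the
    single-node run time (Amdahl's law plus a linear overhead term)."""
    t = serial_frac + (1.0 - serial_frac) / nodes + overhead_per_node * nodes
    return 1.0 / t

# With per-node overhead, speedup peaks at an intermediate node count:
# too few nodes leave parallel work serialized, too many drown in overhead.
curve = [(n, speedup(0.05, n, overhead_per_node=0.002)) for n in (1, 2, 8, 32, 256)]
best_nodes = max(curve, key=lambda p: p[1])[0]
```

The peak in `curve` is the quantitative version of the granularity window the authors describe: above it the problem is split too finely and communication dominates, below it nodes starve.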

  1. Interactive virtual optical laboratories

    NASA Astrophysics Data System (ADS)

    Liu, Xuan; Yang, Yi

    2017-08-01

    Laboratory experiences are essential for optics education. However, college students have limited access to advanced optical equipment, which is generally expensive and complicated, so there is a need for innovative solutions that expose students to advanced optics laboratories. Here we describe a novel approach, the interactive virtual optical laboratory (IVOL), that allows an unlimited number of students to participate in a lab session remotely through the internet, to improve laboratory education in photonics. Although students are not physically conducting the experiment, IVOL is designed to engage students by actively involving them in the decision-making process throughout the experiment.

  2. Fast Face-Recognition Optical Parallel Correlator Using High Accuracy Correlation Filter

    NASA Astrophysics Data System (ADS)

    Watanabe, Eriko; Kodate, Kashiko

    2005-11-01

    We designed and fabricated a fully automatic fast face-recognition optical parallel correlator [E. Watanabe and K. Kodate: Appl. Opt. 44 (2005) 5666] based on the VanderLugt principle. The implementation of an as-yet unattained ultra-high-speed system was aided by reconfiguring the system to make it suitable for easier parallel processing, as well as by composing a higher-accuracy correlation filter and a high-speed ferroelectric liquid crystal spatial light modulator (FLC-SLM). In running trial experiments using this system (dubbed FARCO), we succeeded in achieving remarkably low error rates of 1.3% for the false match rate (FMR) and 2.6% for the false non-match rate (FNMR). Given the results of our experiments, the aim of this paper is to examine methods of designing correlation filters and arranging database image arrays for even faster parallel correlation, underlining the issues of calculation technique, quantization bit rate, pixel size, and shift from the optical axis. The correlation filter has proved its excellent performance and higher precision than classical correlation and the joint transform correlator (JTC). Moreover, the arrangement of multi-object reference images leads to 10-channel correlation signals as sharply marked as those of a single channel. These results demonstrate great potential for achieving a processing speed of 10,000 faces/s.
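    The matched-filter decision at the heart of a VanderLugt correlator can be mimicked digitally: slide a reference pattern over a signal and take the offset with the strongest normalized cross-correlation peak. A minimal 1-D sketch with made-up data (the real FARCO system performs the correlation optically, in 2-D, at far higher throughput):

```python
import math

def norm_corr(a, b):
    """Zero-mean normalized correlation of two equal-length sequences, in [-1, 1]."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    da = [x - ma for x in a]
    db = [y - mb for y in b]
    num = sum(x * y for x, y in zip(da, db))
    den = math.sqrt(sum(x * x for x in da) * sum(y * y for y in db))
    return num / den if den else 0.0

def best_match(signal, template):
    """Return (offset, score) of the strongest correlation peak of template in signal."""
    scores = [norm_corr(signal[i:i + len(template)], template)
              for i in range(len(signal) - len(template) + 1)]
    off = max(range(len(scores)), key=scores.__getitem__)
    return off, scores[off]

signal = [5, 5, 5, 5, 6, 7, 6, 5, 5, 5]  # reference pattern embedded at offset 3
template = [5, 6, 7, 6, 5]
offset, score = best_match(signal, template)
```

Thresholding the peak score is what separates matches from non-matches; the FMR/FNMR trade-off quoted in the abstract comes from where that threshold is placed.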

  3. The Timing of an Experiment in the Laboratory Program Is Crucial for the Student Laboratory Experience: Acylation of Ferrocene as a Case Study

    ERIC Educational Resources Information Center

    Southam, Daniel C.; Shand, Bradley; Buntine, Mark A.; Kable, Scott H.; Read, Justin R.; Morris, Jonathan C.

    2013-01-01

    An assessment of the acylation of ferrocene laboratory exercise across three successive years resulted in a significant fluctuation in student perception of the experiment. This perception was measured by collecting student responses to an instrument immediately after the experiment, which includes Likert and open-ended responses from the student.…

  4. A comparison of traditional physical laboratory and computer-simulated laboratory experiences in relation to engineering undergraduate students' conceptual understandings of a communication systems topic

    NASA Astrophysics Data System (ADS)

    Javidi, Giti

    2005-07-01

    This study was designed to investigate an alternative to the use of traditional physical laboratory activities in a communication systems course. Specifically, this study examined whether, as an alternative, computer simulation is as effective as physical laboratory activities in teaching college-level electronics engineering education students about the concepts of signal transmission, modulation, and demodulation. Eighty undergraduate engineering students participated in the study, which was conducted at a southeastern four-year university. The students were randomly assigned to two groups, which were compared on understanding of the concepts, retention of the concepts, completion time of the lab experiments, and perception of the laboratory experiments. The physical group's (n = 40) treatment was to conduct laboratory experiments in a physical laboratory; the students in this group used equipment in a controlled electronics laboratory. The simulation group's (n = 40) treatment was to conduct similar experiments in a PC laboratory; the students in this group used a simulation program in a controlled PC lab. Scores on a validated conceptual test were collected once after the treatment and again three weeks later. Attitude surveys and a qualitative study were administered at the completion of the treatment. The findings revealed significant differences, in favor of the simulation group, between the two groups on both the conceptual post-test and the follow-up test. The findings also revealed a significant correlation between the simulation group's attitude toward the simulation program and their post-test scores. Moreover, there were significant differences between the two groups, in favor of the simulation group, in their attitude toward their laboratory experience and in their lab completion time.
At the same time, the qualitative research uncovered several issues not explored by the quantitative research. It was concluded that incorporating the recommendations acquired from the qualitative research into the laboratory pedagogy, especially elements of hardware experience to avoid a lack of hands-on skills, should help improve students' experience regardless of the environment in which the laboratory is conducted.

  5. Thin-Film Nanocapacitor and Its Characterization

    ERIC Educational Resources Information Center

    Hunter, David N.; Pickering, Shawn L.; Jia, Dongdong

    2007-01-01

    An undergraduate thin-film nanotechnology laboratory was designed. Nanocapacitors were fabricated on silicon substrates by sputter deposition. A mask was designed to form the shape of the capacitor and its electrodes. Thin metal layers of Au with an 80 nm thickness were deposited and used as two effectively infinite parallel plates for a capacitor.…

  6. Combinatorial Partial Hydrogenation Reactions of 4-Nitroacetophenone: An Undergraduate Organic Laboratory

    ERIC Educational Resources Information Center

    Kittredge, Kevin W.; Marine, Susan S.; Taylor, Richard T.

    2004-01-01

    A molecule possessing other functional groups that could be hydrogenated is examined, and a variety of metal catalysts are evaluated under similar reaction conditions. Optimizing organic reactions is both time and labor intensive; the use of a combinatorial parallel synthesis reactor was a great time-saving device.

  7. A Convenient Storage Rack for Graduated Cylinders

    ERIC Educational Resources Information Center

    Love, Brian

    2004-01-01

    An attempt is made to solve the occasional problem of storing large numbers of graduated cylinders in teaching and research laboratories. A design is proposed that involves a series of parallel channels used to suspend inverted graduated cylinders by their bases.

  8. Temperament and Attention Deficit Hyperactivity Disorder: The Development of a Multiple Pathway Model

    ERIC Educational Resources Information Center

    Nigg, Joel T.; Goldsmith, H. Hill; Sachek, Jennifer

    2004-01-01

    This article outlines the parallels between major theories of attention deficit hyperactivity disorder (ADHD) and relevant temperament domains, summarizing recent research from our laboratories on (a) child temperament and (b) adult personality traits related to ADHD symptoms. These data are convergent in suggesting a role of effortful control and…

  9. FPGA-Based Laboratory Assignments for NoC-Based Manycore Systems

    ERIC Educational Resources Information Center

    Ttofis, C.; Theocharides, T.; Michael, M. K.

    2012-01-01

    Manycore systems have emerged as being one of the dominant architectural trends in next-generation computer systems. These highly parallel systems are expected to be interconnected via packet-based networks-on-chip (NoC). The complexity of such systems poses novel and exciting challenges in academia, as teaching their design requires the students…

  10. Investigating Student Perceptions of the Chemistry Laboratory and Their Approaches to Learning in the Laboratory

    NASA Astrophysics Data System (ADS)

    Berger, Spencer Granett

    This dissertation explores student perceptions of the instructional chemistry laboratory and the approaches students take when learning in the laboratory environment. To measure student perceptions of the chemistry laboratory, a survey instrument was developed. A total of 413 students responded to the survey during the Fall 2011 semester. Students' perception of the usefulness of the laboratory in helping them learn chemistry in high school was related to several factors regarding their experiences in high school chemistry. Students' perception of the usefulness of the laboratory in helping them learn chemistry in college was also measured. Reasons students provided for the usefulness of the laboratory were categorized. To characterize approaches to learning in the laboratory, students were interviewed midway through the semester (N=18). The interviews were used to create a framework describing learning approaches that students use in the laboratory environment. Students were categorized into three levels: students who view the laboratory as a requirement, students who believe that the laboratory augments their understanding, and students who view the laboratory as an important part of science. These categories describe the types of strategies students used when conducting experiments. To further explore the relationship between students' perception of the laboratory and their approaches to learning, two case studies are described. These case studies involve interviews at the beginning and end of the semester. In the interviews, students reflect on what they have learned in the laboratory and describe their perceptions of the laboratory environment. In order to encourage students to adopt higher-level approaches to learning in the laboratory, a metacognitive intervention was created. The intervention involved supplementary questions that students would answer while completing laboratory experiments. 
The questions were designed to encourage students to think critically about the laboratory procedures. In order to test the effects of the intervention, an experimental group (N=87) completed these supplementary questions during two laboratory experiments while a control group (N=84) performed the same experiments without these additional questions. The effects of the intervention on laboratory exam performance were measured. Students in the experimental group had a higher average on the laboratory exam than students in the control group.

  11. Epigenetic Mechanisms in Learned Fear: Implications for PTSD

    PubMed Central

    Zovkic, Iva B; Sweatt, J David

    2013-01-01

    One of the most exciting discoveries in the learning and memory field in the past two decades is the observation that active regulation of gene expression is necessary for experience to trigger lasting functional and behavioral change, in a wide variety of species, including humans. Thus, as opposed to the traditional view of ‘nature' (genes) being separate from ‘nurture' (environment and experience), it is now clear that experience actively drives alterations in central nervous system (CNS) gene expression in an ongoing fashion, and that the resulting transcriptional changes are necessary for experience to trigger altered long-term behavior. In parallel over the past decade, epigenetic mechanisms, including regulation of chromatin structure and DNA methylation, have been shown to be potent regulators of gene transcription in the CNS. In this review, we describe data supporting the hypothesis that epigenetic molecular mechanisms, especially DNA methylation and demethylation, drive long-term behavioral change through active regulation of gene transcription in the CNS. Specifically, we propose that epigenetic molecular mechanisms underlie the formation and stabilization of context- and cue-triggered fear conditioning based in the hippocampus and amygdala, a conclusion reached in a wide variety of studies using laboratory animals. Given the relevance of cued and contextual fear conditioning to post-traumatic stress, by extension we propose that these mechanisms may contribute to post-traumatic stress disorder (PTSD) in humans. Moreover, we speculate that epigenetically based pharmacotherapy may provide a new avenue of drug treatment for PTSD-related cognitive and behavioral function. PMID:22692566

  12. Step-response of a torsional device with multiple discontinuous non-linearities: Formulation of a vibratory experiment

    NASA Astrophysics Data System (ADS)

    Krak, Michael D.; Dreyer, Jason T.; Singh, Rajendra

    2016-03-01

    A vehicle clutch damper is intentionally designed to contain multiple discontinuous non-linearities, such as multi-staged springs, clearances, pre-loads, and multi-staged friction elements. The main purpose of this practical torsional device is to transmit a wide range of torque while isolating torsional vibration between an engine and transmission. Improved understanding of the dynamic behavior of the device could be facilitated by laboratory measurement, and thus a refined vibratory experiment is proposed. The experiment is conceptually described as a single degree of freedom non-linear torsional system that is excited by an external step torque. The single torsional inertia (consisting of a shaft and torsion arm) is coupled to ground through parallel production clutch dampers, which are characterized by quasi-static measurements provided by the manufacturer. Other experimental objectives address physical dimensions, system actuation, flexural modes, instrumentation, and signal processing issues. Typical measurements show that the step response of the device is characterized by three distinct non-linear regimes (double-sided impact, single-sided impact, and no-impact). Each regime is directly related to the non-linear features of the device and can be described by peak angular acceleration values. Predictions of a simplified single degree of freedom non-linear model verify that the experiment performs well and as designed. Accordingly, the benchmark measurements could be utilized to validate non-linear models and simulation codes, as well as characterize dynamic parameters of the device including its dissipative properties.
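    The device can be idealized, much as the paper does, as a single-degree-of-freedom torsional oscillator whose restoring torque is piecewise linear: a soft stage inside the clearance and a stiff stage beyond it. A minimal fixed-step simulation of the step response; every parameter value below is invented for illustration, not taken from the clutch damper measurements:

```python
import math

def clutch_torque(theta, gap=0.01, k1=50.0, k2=500.0):
    """Piecewise-linear elastic torque (N m): soft stiffness k1 inside the
    clearance +/-gap (rad), stiff stiffness k2 beyond it."""
    if abs(theta) <= gap:
        return k1 * theta
    return math.copysign(k1 * gap + k2 * (abs(theta) - gap), theta)

def step_response(t_step, inertia=0.1, damping=0.5, dt=1e-4, t_end=2.0):
    """Integrate J*theta'' + c*theta' + k(theta) = T_step with semi-implicit
    Euler; return the final angle and the peak angular acceleration."""
    theta, omega, peak_acc = 0.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        acc = (t_step - damping * omega - clutch_torque(theta)) / inertia
        peak_acc = max(peak_acc, abs(acc))
        omega += acc * dt
        theta += omega * dt
    return theta, peak_acc

theta_end, peak_acc = step_response(2.0)  # 2 N m step torque
```

The peak angular acceleration returned here is the same scalar metric the authors use to label the non-linear regimes (double-sided impact, single-sided impact, no-impact) in their measurements.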

  13. Centrifugal LabTube platform for fully automated DNA purification and LAMP amplification based on an integrated, low-cost heating system.

    PubMed

    Hoehl, Melanie M; Weißert, Michael; Dannenberg, Arne; Nesch, Thomas; Paust, Nils; von Stetten, Felix; Zengerle, Roland; Slocum, Alexander H; Steigert, Juergen

    2014-06-01

    This paper introduces a disposable battery-driven heating system for loop-mediated isothermal DNA amplification (LAMP) inside a centrifugally driven DNA purification platform (LabTube). We demonstrate LabTube-based fully automated DNA purification of as few as 100 cell-equivalents of verotoxin-producing Escherichia coli (VTEC) in water, milk, and apple juice in a laboratory centrifuge, followed by integrated and automated LAMP amplification, with a reduction of hands-on time from 45 to 1 min. The heating system consists of two parallel SMD thick-film resistors as heating elements and an NTC as the temperature-sensing element. They are driven by a 3 V battery and controlled by a microcontroller. The LAMP reagents are stored in the elution chamber, and the amplification starts immediately after the eluate is purged into the chamber. The LabTube, including the microcontroller-based heating system, demonstrates contamination-free and automated sample-to-answer nucleic acid testing within a laboratory centrifuge. The heating system can be easily parallelized within one LabTube, and it is deployable for a variety of heating and electrical applications.
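    The control loop such a microcontroller runs reduces to two steps: convert the NTC resistance to a temperature, then decide whether to switch the heating resistors on. A sketch using the standard Beta-parameter NTC model and simple on/off control with hysteresis; the thermistor constants are generic 10 kΩ-part values and the 65 °C setpoint is a typical LAMP temperature, not the LabTube firmware's actual parameters:

```python
import math

def ntc_temperature(r_ohm, r0=10000.0, t0=298.15, beta=3950.0):
    """Beta-equation NTC model: 1/T = 1/T0 + ln(R/R0)/beta (temperatures in K).
    r0 is the nominal resistance at t0 = 25 C; beta is the material constant."""
    return 1.0 / (1.0 / t0 + math.log(r_ohm / r0) / beta)

def heater_on(r_ohm, setpoint_c=65.0, hysteresis_c=0.5):
    """Bang-bang decision for an isothermal hold: heat only while the sensed
    temperature sits below the setpoint minus the hysteresis band."""
    t_c = ntc_temperature(r_ohm) - 273.15
    return t_c < setpoint_c - hysteresis_c
```

The Beta equation is a two-point approximation to the fuller Steinhart-Hart model; for a narrow isothermal hold around one setpoint it is usually accurate to well under a degree.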

  14. Functional Genomics Using the Saccharomyces cerevisiae Yeast Deletion Collections.

    PubMed

    Nislow, Corey; Wong, Lai Hong; Lee, Amy Huei-Yi; Giaever, Guri

    2016-09-01

    Constructed by a consortium of 16 laboratories, the Saccharomyces genome-wide deletion collections have, for the past decade, provided a powerful, rapid, and inexpensive approach for functional profiling of the yeast genome. Loss-of-function deletion mutants were systematically created using a polymerase chain reaction (PCR)-based gene deletion strategy to generate a start-to-stop codon replacement of each open reading frame by homologous recombination. Each strain carries two molecular barcodes that serve as unique strain identifiers, enabling their growth to be analyzed in parallel and the fitness contribution of each gene to be quantitatively assessed by hybridization to high-density oligonucleotide arrays or through the use of next-generation sequencing technologies. Functional profiling of the deletion collections, using either strain-by-strain or parallel assays, provides an unbiased approach to systematically survey the yeast genome. The Saccharomyces yeast deletion collections have proved immensely powerful in contributing to the understanding of gene function, including functional relationships between genes and genetic pathways in response to diverse genetic and environmental perturbations. © 2016 Cold Spring Harbor Laboratory Press.
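    At its core, barcode-based fitness profiling compares each strain's barcode abundance before and after competitive growth: strains whose deletion impairs growth under the test condition are depleted from the pool. A toy sketch of that bookkeeping with invented counts and hypothetical strain names; real Bar-seq pipelines add replicate handling, normalization controls, and noise models:

```python
import math

def fitness_scores(t0_counts, tn_counts, pseudo=1.0):
    """Per-strain log2 fold-change of barcode abundance between the start (t0)
    and end (tn) of a pooled growth, normalized by library size. A pseudocount
    keeps strains that drop to zero reads from blowing up the ratio. Strongly
    negative scores flag strains with a fitness defect under the condition."""
    n0 = sum(t0_counts.values())
    nn = sum(tn_counts.values())
    scores = {}
    for strain, c0 in t0_counts.items():
        f0 = (c0 + pseudo) / n0
        fn = (tn_counts.get(strain, 0) + pseudo) / nn
        scores[strain] = math.log2(fn / f0)
    return scores

# Invented barcode read counts before and after ~3 generations of growth.
t0 = {"yfg1": 1000, "yfg2": 1000, "his3": 1000}
tn = {"yfg1": 1000, "yfg2": 125, "his3": 1900}
scores = fitness_scores(t0, tn)  # yfg2 is strongly depleted
```

Whether the counts come from hybridization intensities on a barcode microarray or from next-generation sequencing reads, the downstream arithmetic is the same.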

  15. Tennis Rackets and the Parallel Axis Theorem

    ERIC Educational Resources Information Center

    Christie, Derek

    2014-01-01

    This simple experiment uses an unusual graph straightening exercise to confirm the parallel axis theorem for an irregular object. Along the way, it estimates experimental values for g and the moment of inertia of a tennis racket. We use Excel to find a 95% confidence interval for the true values.
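    The "graph straightening" works because the parallel axis theorem, I = I_cm + m d², is linear in d²: plotting measured I against d² gives a line whose slope is the mass and whose intercept is the moment of inertia about the centre of mass. A sketch with made-up racket numbers (0.30 kg, 0.015 kg m² are plausible but hypothetical):

```python
def parallel_axis(i_cm, mass, d):
    """Moment of inertia about an axis parallel to one through the centre of
    mass and offset by d: I = I_cm + m*d**2."""
    return i_cm + mass * d * d

def fit_line(xs, ys):
    """Ordinary least-squares fit y = slope*x + intercept."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return slope, (sy - slope * sx) / n

mass, i_cm = 0.30, 0.015            # hypothetical racket: kg, kg m^2
ds = [0.0, 0.1, 0.2, 0.3, 0.4]      # pivot offsets from the CM, m
i_meas = [parallel_axis(i_cm, mass, d) for d in ds]
slope, intercept = fit_line([d * d for d in ds], i_meas)
# slope recovers the mass; intercept recovers I_cm
```

With real pendulum-timing data the points scatter about the line, and the confidence interval on the fitted slope and intercept is exactly the 95% interval the article extracts in Excel.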

  16. Using Motivational Interviewing Techniques to Address Parallel Process in Supervision

    ERIC Educational Resources Information Center

    Giordano, Amanda; Clarke, Philip; Borders, L. DiAnne

    2013-01-01

    Supervision offers a distinct opportunity to experience the interconnection of counselor-client and counselor-supervisor interactions. One product of this network of interactions is parallel process, a phenomenon by which counselors unconsciously identify with their clients and subsequently present to their supervisors in a similar fashion…

  17. Final Technical Report: Application of in situ Neutron Diffraction to Understand the Mechanism of Phase Transitions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chandran, Ravi

    In this research, phase transitions in bulk electrodes for Li-ion batteries were investigated using neutron diffraction (ND) as well as neutron imaging techniques. The objectives of this research were to design a novel in situ electrochemical cell for Rietveld-refinable neutron diffraction experiments on small-volume, laboratory/research-scale electrodes intended for Li-ion batteries. This cell was also to be used to investigate the complexity of phase transitions in Li(Mg) alloy electrodes, either by diffraction or by neutron imaging, which occur under electrochemical lithiation and delithiation, and to determine aspects of phase transition that enable or limit energy storage capacity. An additional objective was to investigate the phase transitions in electrodes made of etched silicon micro-columns and the effect of particle/column size on phase transitions and nonequilibrium structures. An in situ electrochemical cell was designed successfully and was used to study phase transitions under in situ neutron diffraction in both electrodes (anode and cathode) simultaneously, in graphite/LiCoO2 and graphite/LiMn2O4 cells, with two cells of each type. The diffraction patterns fully validated the working of the in situ cell. Additional experiments were performed using the Si micro-columnar electrodes. The results revealed new lithiation phenomena, as evidenced by mosaicity formation in the silicon electrode. These experiments were performed on the Vulcan diffractometer at SNS, Oak Ridge National Laboratory. In parallel, the spatial distribution of Li during lithiation and delithiation in Li-battery electrodes was investigated. For this purpose, the neutron tomographic imaging technique was used for 3D mapping of the Li distribution in bulk Li(Mg) alloy electrodes. It was possible to observe the phase boundary in the Li(Mg) alloy, indicating the phase transition from the Li-rich BCC β-phase to the Li-lean α-phase. These experiments were performed at the CG-1D Neutron Imaging Prototype Station at SNS.

  18. Spies and Bloggers: New Synthetic Biology Tools to Understand Microbial Processes in Soils and Sediments

    NASA Astrophysics Data System (ADS)

    Masiello, C. A.; Silberg, J. J.; Cheng, H. Y.; Del Valle, I.; Fulk, E. M.; Gao, X.; Bennett, G. N.

    2017-12-01

    Microbes can be programmed through synthetic biology to report on their behavior, informing researchers when their environment has triggered changes in their gene expression (e.g. in response to shifts in O2 or H2O), or when they have participated in a specific step of an elemental cycle (e.g. denitrification). This use of synthetic biology has the potential to significantly improve our understanding of microbes' roles in elemental and water cycling, because it allows reporting on the environment from the perspective of a microbe, matching the measurement scale exactly to the scale that a microbe experiences. However, synthetic microbes have not yet seen wide use in soil and sediment laboratory experiments because synthetic organisms typically report by fluorescing, making their signals difficult to detect outside the petri dish. We are developing a new suite of microbial programs that report instead by releasing easily detected gases, allowing the real-time, noninvasive monitoring of behaviors in sediments and soils. Microbial biosensors can, in theory, be programmed to detect dynamic processes that contribute to a wide range of geobiological processes, including C cycling (biofilm production, methanogenesis, and synthesis of extracellular enzymes that degrade organic matter), N cycling (expression of enzymes that underlie different steps of the N cycle) and potentially S cycling. We will provide an overview of the potential uses of gas-reporting biosensors in soil and sediment lab experiments, and will report on the development of the systematics of these sensors. Successful development of gas biosensors for laboratory use will require addressing several issues: engineering the intensity and selectivity of microbial gas production to maximize the signal-to-noise ratio; normalizing the gas reporter signal to cell population size; managing gas diffusion effects on signal shape; and developing multiple gases that can be used in parallel.

  19. Laboratory Evidence of Strength Recovery of Healed Faults

    NASA Astrophysics Data System (ADS)

    Masuda, K.

    2015-12-01

    Fault zones consist of a fault core and a surrounding damage zone. Fault zones are typically characterized by the presence of many healed surfaces, the strength of which is unknown. If a healed fault recovers its strength such that its cohesion is equal to or greater than that of the host rock, repeated cycles of fracture and healing may be one mechanism producing wide fault zones. I present laboratory evidence supporting the strength recovery of a healed fault surface, obtained by AE monitoring, strain measurements, and X-ray CT techniques. The loading experiment was performed on a specimen collected from an exhumed fault zone. Healed surfaces of the rock sample were interpreted to be parallel to slip surfaces. The specimen was a cylinder 50 mm in diameter and 100 mm long. The long axis of the specimen was inclined with respect to the orientation of the healed surfaces. The compression test used a constant loading rate under a confining pressure of 50 MPa. Macroscopic failure occurred when the applied differential stress reached 439 MPa. The macro-fracture surface created during the experiment was very close to the preexisting plane. The AE hypocenters closely match the locations of the preexisting healed surface and the new fault plane. The experiment also revealed details of the initial stage of fault development. The new fault zone developed near, but not precisely on, the preexisting healed fault plane. An area of heterogeneous structure, where stress appears to have concentrated, was where the AEs began and also where the fracture started. This means that the healed surface was not a weak surface and that healing strengthened the fault such that its cohesion was equal to or greater than that of the intact host rock. These results suggest that repeated cycles of fracture and healing may be the main mechanism creating wide fault zones with multiple fault cores and damage zones.

  20. Asking the Next Generation: The Implementation of Pre-University Students' Ideas about Physics Laboratory Preparation Exercises

    ERIC Educational Resources Information Center

    Dunnett, K.; Bartlett, P. A.

    2018-01-01

    It was planned to introduce online pre-laboratory session activities to a first-year undergraduate physics laboratory course to encourage a minimum level of student preparation for experiments outside the laboratory environment. A group of 16 and 17 year old laboratory work-experience students were tasked to define and design a pre-laboratory…

  1. Chemical Remediation of Nickel(II) Waste: A Laboratory Experiment for General Chemistry Students

    ERIC Educational Resources Information Center

    Corcoran, K. Blake; Rood, Brian E.; Trogden, Bridget G.

    2011-01-01

    This project involved developing a method to remediate large quantities of aqueous waste from a general chemistry laboratory experiment. Aqueous Ni(II) waste from a general chemistry laboratory experiment was converted into solid nickel hydroxide hydrate with a substantial decrease in waste volume. The remediation method was developed for a…

  2. A Laboratory Experiment on the Statistical Theory of Nuclear Reactions

    ERIC Educational Resources Information Center

    Loveland, Walter

    1971-01-01

    Describes an undergraduate laboratory experiment on the statistical theory of nuclear reactions. The experiment involves measuring the relative cross sections for formation of a nucleus in its meta stable excited state and its ground state by applying gamma-ray spectroscopy to an irradiated sample. Involves 3-4 hours of laboratory time plus…

  3. Redefining Authentic Research Experiences in Introductory Biology Laboratories and Barriers to Their Implementation

    ERIC Educational Resources Information Center

    Spell, Rachelle M.; Guinan, Judith A.; Miller, Kristen R.; Beck, Christopher W.

    2014-01-01

    Incorporating authentic research experiences in introductory biology laboratory classes would greatly expand the number of students exposed to the excitement of discovery and the rigor of the scientific process. However, the essential components of an authentic research experience and the barriers to their implementation in laboratory classes are…

  4. Synthesis and Biological Testing of Penicillins: An Investigative Approach to the Undergraduate Teaching Laboratory

    ERIC Educational Resources Information Center

    Whitaker, Ragnhild D.; Truhlar, Laura M.; Yksel, Deniz; Walt, David R.; Williams, Mark D.

    2010-01-01

    The development and implementation of a research-based organic chemistry laboratory experiment is presented. The experiment was designed to simulate a scientific research environment, involve students in critical thinking, and develop the student's ability to analyze and present research-based data. In this experiment, a laboratory class…

  5. A Laboratory Experiment for Rapid Determination of the Stability of Vitamin C

    ERIC Educational Resources Information Center

    Adem, Seid M.; Lueng, Sam H.; Elles, Lisa M. Sharpe; Shaver, Lee Alan

    2016-01-01

    Experiments in laboratory manuals intended for general, organic, and biological (GOB) chemistry laboratories include few opportunities for students to engage in instrumental methods of analysis. Many of these students seek careers in modern health-related fields where experience in spectroscopic techniques would be beneficial. A simple, rapid,…

  6. The parallel-antiparallel signal difference in double-wave-vector diffusion-weighted MR at short mixing times: A phase evolution perspective

    NASA Astrophysics Data System (ADS)

    Finsterbusch, Jürgen

    2011-01-01

    Experiments with two diffusion weightings applied in direct succession in a single acquisition, so-called double- or two-wave-vector diffusion-weighting (DWV) experiments at short mixing times, have been shown to be a promising tool to estimate cell or compartment sizes, e.g. in living tissue. The basic theory for such experiments predicts that the signal decays for parallel and antiparallel wave vector orientations differ by a factor of three for small wave vectors. This seems surprising because in standard, single-wave-vector experiments the polarity of the diffusion weighting has no influence on the signal attenuation. Thus, the question of how this difference can be understood more pictorially is often raised. In this rather educational manuscript, the phase evolution during a DWV experiment for simple geometries, e.g. diffusion between parallel, impermeable planes oriented perpendicular to the wave vectors, is considered step by step to demonstrate how the signal difference develops. Considering the populations of the phase distributions obtained, the factor of three between the signal decays that is predicted by the theory can be reproduced. Furthermore, the intermediate signal decay for orthogonal wave vector orientations can be derived by investigating diffusion in a box. Thus, the presented “phase gymnastics” approach may help to understand the signal modulation observed in DWV experiments at short mixing times.

  7. Development, Evaluation and Use of a Student Experience Survey in Undergraduate Science Laboratories: The Advancing Science by Enhancing Learning in the Laboratory Student Laboratory Learning Experience Survey

    NASA Astrophysics Data System (ADS)

    Barrie, Simon C.; Bucat, Robert B.; Buntine, Mark A.; Burke da Silva, Karen; Crisp, Geoffrey T.; George, Adrian V.; Jamie, Ian M.; Kable, Scott H.; Lim, Kieran F.; Pyke, Simon M.; Read, Justin R.; Sharma, Manjula D.; Yeung, Alexandra

    2015-07-01

    Student experience surveys have become increasingly popular to probe various aspects of processes and outcomes in higher education, such as measuring student perceptions of the learning environment and identifying aspects that could be improved. This paper reports on a particular survey for evaluating individual experiments that has been developed over some 15 years as part of a large national Australian study of undergraduate laboratories: Advancing Science by Enhancing Learning in the Laboratory. This paper reports on the development of the survey instrument and the evaluation of the survey using student responses to experiments from different institutions in Australia, New Zealand and the USA. A total of 3153 student responses have been analysed using factor analysis. Three factors, motivation, assessment and resources, have been identified as contributing to improved student attitudes to laboratory activities. A central focus of the survey is to provide feedback to practitioners to iteratively improve experiments. Implications for practitioners and researchers are also discussed.

  8. Clarity: An Open Source Manager for Laboratory Automation

    PubMed Central

    Delaney, Nigel F.; Echenique, José Rojas; Marx, Christopher J.

    2013-01-01

    Software to manage automated laboratories interfaces with hardware instruments, gives users a way to specify experimental protocols, and schedules activities to avoid hardware conflicts. In addition to these basics, modern laboratories need software that can run multiple different protocols in parallel and that can be easily extended to interface with a constantly growing diversity of techniques and instruments. We present Clarity: a laboratory automation manager that is hardware agnostic, portable, extensible and open source. Clarity provides critical features including remote monitoring, robust error reporting by phone or email, and full state recovery in the event of a system crash. We discuss the basic organization of Clarity; demonstrate an example of its implementation for the automated analysis of bacterial growth; and describe how the program can be extended to manage new hardware. Clarity is mature; well documented; actively developed; written in C# for the Common Language Infrastructure; and is free and open source software. These advantages set Clarity apart from currently available laboratory automation programs. PMID:23032169

  9. Design and Implementation of Instructional Videos for Upper-Division Undergraduate Laboratory Courses

    ERIC Educational Resources Information Center

    Schmidt-McCormack, Jennifer A.; Muniz, Marc N.; Keuter, Ellie C.; Shaw, Scott K.; Cole, Renée S.

    2017-01-01

    Well-designed laboratories can help students master content and science practices by successfully completing the laboratory experiments. Upper-division chemistry laboratory courses often present special challenges for instruction due to the instrument intensive nature of the experiments. To address these challenges, particularly those associated…

  10. A 13-Week Research-Based Biochemistry Laboratory Curriculum

    ERIC Educational Resources Information Center

    Lefurgy, Scott T.; Mundorff, Emily C.

    2017-01-01

    Here, we present a 13-week research-based biochemistry laboratory curriculum designed to provide the students with the experience of engaging in original research while introducing foundational biochemistry laboratory techniques. The laboratory experience has been developed around the directed evolution of an enzyme chosen by the instructor, with…

  11. Inducing Mutations in "Paramecium": An Inquiry-Based Approach

    ERIC Educational Resources Information Center

    Elwess, Nancy L.; Latourelle, Sandra L.

    2004-01-01

    A major challenge in teaching any college level general genetics course including a laboratory component is having the students actively understand the research part of an experiment as well as develop the necessary laboratory skills. This laboratory experience furthers the students' knowledge of genetics while improving their laboratory skills.…

  12. Experiences with hypercube operating system instrumentation

    NASA Technical Reports Server (NTRS)

    Reed, Daniel A.; Rudolph, David C.

    1989-01-01

    The difficulty of conceptualizing interactions among a large number of processors makes it hard both to identify the sources of inefficiency and to determine how a parallel program could be made more efficient. This paper describes an instrumentation system that can trace the execution of distributed memory parallel programs by recording the occurrence of parallel program events. The resulting event traces can be used to compile summary statistics that provide a global view of program performance. In addition, visualization tools permit the graphic display of event traces. Visual presentation of performance data is particularly useful, indeed necessary, for large-scale parallel computers; the enormous volume of performance data mandates visual display.

  13. Communications oriented programming of parallel iterative solutions of sparse linear systems

    NASA Technical Reports Server (NTRS)

    Patrick, M. L.; Pratt, T. W.

    1986-01-01

    Parallel algorithms are developed for a class of scientific computational problems by partitioning the problems into smaller problems which may be solved concurrently. The effectiveness of the resulting parallel solutions is determined by the amount and frequency of communication and synchronization and the extent to which communication can be overlapped with computation. Three different parallel algorithms for solving the same class of problems are presented, and their effectiveness is analyzed from this point of view. The algorithms are programmed using a new programming environment. Run-time statistics and experience obtained from the execution of these programs assist in measuring the effectiveness of these algorithms.
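    The partition-and-communicate pattern described above can be sketched in miniature. The following is an illustrative sketch only, not the paper's programming environment: a block-partitioned Jacobi iteration in which each worker updates its own range of rows concurrently, and the shared new iterate plays the role of the per-sweep communication step. The function name and partitioning scheme are invented for the example.

```python
import threading

def jacobi_partitioned(A, b, n_parts=2, iters=200):
    """Solve A x = b (A given as dict-of-dicts sparse rows) by Jacobi iteration,
    with each partition's rows updated concurrently in every sweep."""
    n = len(b)
    x = [0.0] * n
    # partition the rows into contiguous blocks, one per worker
    bounds = [(k * n // n_parts, (k + 1) * n // n_parts) for k in range(n_parts)]
    for _ in range(iters):
        x_new = [0.0] * n

        def sweep(lo, hi):
            for i in range(lo, hi):
                # classic Jacobi update: use only the previous iterate x
                s = sum(a_ij * x[j] for j, a_ij in A[i].items() if j != i)
                x_new[i] = (b[i] - s) / A[i][i]

        threads = [threading.Thread(target=sweep, args=bnd) for bnd in bounds]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        x = x_new  # the "communication": every partition sees the new iterate
    return x
```

    Because each sweep reads only the previous iterate and writes disjoint row ranges, the workers need to synchronize only once per iteration, mirroring the trade-off between computation and communication frequency discussed in the abstract.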

  14. Development, Evaluation and Use of a Student Experience Survey in Undergraduate Science Laboratories: The Advancing Science by Enhancing Learning in the Laboratory Student Laboratory Learning Experience Survey

    ERIC Educational Resources Information Center

    Barrie, Simon C.; Bucat, Robert B.; Buntine, Mark A.; Burke da Silva, Karen; Crisp, Geoffrey T.; George, Adrian V.; Jamie, Ian M.; Kable, Scott H.; Lim, Kieran F.; Pyke, Simon M.; Read, Justin R.; Sharma, Manjula D.; Yeung, Alexandra

    2015-01-01

    Student experience surveys have become increasingly popular to probe various aspects of processes and outcomes in higher education, such as measuring student perceptions of the learning environment and identifying aspects that could be improved. This paper reports on a particular survey for evaluating individual experiments that has been developed…

  15. Professional behaviors, sense of belonging, and professional socialization of early career clinical laboratory scientists

    NASA Astrophysics Data System (ADS)

    Schill, Janna Marie

    Professional socialization is the process individuals experience as members of a profession; it consists of the knowledge, attitudes, and experiences that shape their professional identity. This process has not previously been studied in the clinical laboratory science (CLS) profession. Clinical laboratory science is an allied health profession facing a workforce shortage caused by a decrease in new graduates, decreased retention of qualified professionals, and increased retirements. Other allied health professions, such as nursing, athletic training, and pharmacy, have studied professional socialization to identify factors that may influence the retention of early career professionals. This mixed-methods study combined quantitative data from Hall's Professionalism Scale (1968) with qualitative focus group interviews to identify the professional attitudes and behaviors, sense of belonging, and professional socialization of early career clinical laboratory scientists. Participants were divided into two groups based on work experience: new CLS graduates with less than one year of work experience, and novice clinical laboratory scientists with one to three years. The study found that early career clinical laboratory scientists have established professional identities and view themselves as members of the CLS field within four proposed stages of professional socialization: pre-arrival, encounter, adaptation, and commitment. The two groups were found to be at different stages of this process: new CLS graduates were in the encounter stage, while novice clinical laboratory scientists were in the adaptation stage. For early career clinical laboratory scientists to transition successfully from student to committed professional, increased support from more experienced colleagues needs to be provided. This study provides an initial examination of the professional socialization process in the CLS profession and adds to existing professional socialization studies in allied health.

  16. Inter-laboratory evaluation of the EUROFORGEN Global ancestry-informative SNP panel by massively parallel sequencing using the Ion PGM™.

    PubMed

    Eduardoff, M; Gross, T E; Santos, C; de la Puente, M; Ballard, D; Strobl, C; Børsting, C; Morling, N; Fusco, L; Hussing, C; Egyed, B; Souto, L; Uacyisrael, J; Syndercombe Court, D; Carracedo, Á; Lareu, M V; Schneider, P M; Parson, W; Phillips, C

    2016-07-01

    The EUROFORGEN Global ancestry-informative SNP (AIM-SNPs) panel is a forensic multiplex of 128 markers designed to differentiate an individual's ancestry from amongst the five continental population groups of Africa, Europe, East Asia, Native America, and Oceania. A custom multiplex of AmpliSeq™ PCR primers was designed for the Global AIM-SNPs to perform massively parallel sequencing using the Ion PGM™ system. This study assessed individual SNP genotyping precision using the Ion PGM™; the forensic sensitivity of the multiplex using dilution series, degraded DNA, and simple mixtures; and the ancestry differentiation power of the final panel design, which required substitution of three original ancestry-informative SNPs with alternatives. Fourteen populations that had not previously been analyzed were genotyped using the custom multiplex, and these studies allowed assessment of genotyping performance by comparison of data across five laboratories. Results indicate that a low level of genotyping error can still occur from sequence misalignment caused by homopolymeric tracts close to the target SNP, despite careful scrutiny of candidate SNPs at the design stage. Such sequence misalignment required the exclusion of component SNP rs2080161 from the Global AIM-SNPs panel. However, the overall genotyping precision and sensitivity of this custom multiplex indicate that the Ion PGM™ assay for the Global AIM-SNPs is highly suitable for forensic ancestry analysis with massively parallel sequencing.

  17. Star of Condor - A strontium critical velocity experiment, Peru, 1983

    NASA Technical Reports Server (NTRS)

    Wescott, E. M.; Stenbaek-Nielsen, H. C.; Hallinan, T.; Foeppl, H.; Valenzuela, A.

    1986-01-01

    'Star of Condor' was a critical velocity experiment using Sr vapor produced in a radial shaped charge, which was carried to 571.11 km altitude on a Taurus-Tomahawk rocket launched from Punto Lobos, Peru, and detonated in the plane of the magnetic field lines so that all ranges of pitch angles from parallel to B to perpendicular to B were covered. Sr has a critical velocity of 3.3 km/s, and from observation, 42.5 percent of the neutral Sr gas had a velocity component perpendicular to B exceeding that value. No Sr ion emissions were detected shortly after the burst with usual TV integration times. However, about 10 min after the detonation a faint field-aligned streak was discovered with long TV integration times. The brightness is estimated as 5 R, which, combined with the streak geometry, implies an ion production of 2.4 x 10 to the 19th ions. This is only 0.0036 percent ionization of the Sr vapor. All the ions could easily have been produced by thermal ionization from the original detonation thermal distribution. The breakup of the Sr gas into small bloblike structures may have allowed the high-energy electrons to escape before an ionization cascade could be produced. For whatever reason, the Alfven mechanism proposed for space plasmas in the absence of laboratory walls did not produce an ionization cascade in the experiment.

  18. Observation of single-mode, Kelvin-Helmholtz instability in a supersonic flow

    DOE PAGES

    Wan, W. C.; Malamud, Guy; Shimony, A.; ...

    2015-10-01

    This manuscript reports the first observations of the Kelvin-Helmholtz instability evolving from well-characterized seed perturbations in a steady, supersonic flow. The Kelvin-Helmholtz instability occurs when two fluids move parallel to one another at different velocities, and contributes to an intermixing of fluids and transition to turbulence. It is ubiquitous in nature and engineering, including terrestrial systems such as cloud formations, astrophysical systems such as supernovae, and laboratory systems such as fusion experiments. In a supersonic flow, the growth rate of the instability is inhibited due to effects of compressibility. These effects are still not fully understood, and hold the motivation for the current work. The data presented here were obtained by developing a novel experimental platform capable of sustaining a steady shockwave over a precision-machined interface for unprecedented durations. The chosen interface was a well-characterized, single-mode sine wave, allowing us to document the evolution of individual vortices at high resolution. Understanding the behavior of individual vortices is the first of two fundamental steps towards developing a comprehensive model for the Kelvin-Helmholtz instability in a compressible flow. The results of this experiment were well reproduced with 2D hydrodynamic simulations. The platform has been extended to additional experiments, which study the evolution of different hydrodynamic instabilities in steady, supersonic flows.

  20. BMDO materials testing in the EOIM-3 experiment

    NASA Technical Reports Server (NTRS)

    Chung, Shirley Y.; Brinza, David E.; Minton, Timothy K.; Liang, Ranty H.

    1995-01-01

    The NASA Evaluation of Oxygen Interactions with Materials-3 (EOIM-3) experiment served as a testbed for a variety of materials that are candidates for Ballistic Missile Defense Organization (BMDO) space assets. The materials evaluated on this flight experiment were provided by BMDO contractors and technology laboratories. A parallel ground-based exposure evaluation was conducted using the Fast Atom Sample Tester (FAST) atomic-oxygen simulation facility at Physical Sciences, Inc. The EOIM-3 flight materials were exposed to an atomic oxygen fluence of approximately 2.3 x 10(exp 20) atoms/sq cm. The ground-based exposure fluence of 2.0 - 2.5 x 10(exp 20) atoms/sq cm permits direct comparison with that of the flight-exposed specimens. The results from the flight test conducted aboard STS-46 and the correlative ground-based exposure are summarized here. A more detailed correlation study is presented in JPL Publication 93-31, 'Flight- and Ground-Test Correlation Study of BMDO SDS Materials: Phase 1 Report'. In general, the majority of the materials survived the AO environment with their performance tolerances maintained for the duration of the exposure. Optical materials, baffles, and coatings performed extremely well, as did most of the thermal coatings and tribological materials. A few of the candidate radiator, threat shielding, and structural materials showed significant degradation. Many of the coatings designed to protect against AO erosion of sensitive materials performed this function well.

  1. Control on frontal thrust progression by the mechanically weak Gondwana horizon in the Darjeeling-Sikkim Himalaya

    NASA Astrophysics Data System (ADS)

    Ghosh, Subhajit; Bose, Santanu; Mandal, Nibir; Das, Animesh

    2018-03-01

    This study integrates field evidence with laboratory experiments to show the mechanical effects of a lithologically contrasting stratigraphic sequence on the development of frontal thrusts: Main Boundary Thrust (MBT) and Daling Thrust (DT) in the Darjeeling-Sikkim Himalaya (DSH). We carried out field investigations mainly along two river sections in the DSH: Tista-Kalijhora and Mahanadi, covering an orogen-parallel stretch of 20 km. Our field observations suggest that the coal-shale dominated Gondwana sequence (sandwiched between the Daling Group in the north and Siwaliks in the south) has acted as a mechanically weak horizon to localize the MBT and DT. We simulated a similar mechanical setting in scaled model experiments to validate our field interpretation. In experiments, such a weak horizon at a shallow depth perturbs the sequential thrust progression, and causes a thrust to localize in the vicinity of the weak zone, splaying from the basal detachment. We correlate this weak-zone-controlled thrust with the DT, which accommodates a large shortening prior to activation of the weak zone as a new detachment with ongoing horizontal shortening. The entire shortening in the model is then transferred to this shallow detachment to produce a new sequence of thrust splays. Extrapolating this model result to the natural prototype, we show that the mechanically weak Gondwana Sequence has caused localization of the DT and MBT in the mountain front of DSH.

  2. A privacy-preserving parallel and homomorphic encryption scheme

    NASA Astrophysics Data System (ADS)

    Min, Zhaoe; Yang, Geng; Shi, Jingqi

    2017-04-01

    In order to protect data privacy while allowing efficient access to data in multi-node cloud environments, a parallel homomorphic encryption (PHE) scheme is proposed based on the additive homomorphism of the Paillier encryption algorithm. In this paper we propose a PHE algorithm in which the plaintext is divided into several blocks and the blocks are encrypted in parallel. Experimental results demonstrate that the encryption algorithm can reach a speed-up of about 7.1 in a MapReduce environment with 16 cores and 4 nodes.
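    The additive homomorphism that the scheme relies on can be illustrated with a minimal sketch, assuming the textbook Paillier construction with generator g = n + 1; the demo primes, function names, and the thread pool standing in for the MapReduce cluster are all invented for the example and are far too small for real security.

```python
import random
from math import gcd
from concurrent.futures import ThreadPoolExecutor

# Demo-size primes (real deployments need ~1024-bit primes or larger)
p, q = 999983, 1000003
n = p * q
n2 = n * n
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)  # Carmichael λ = lcm(p-1, q-1)
mu = pow(lam, -1, n)                          # with g = n+1, μ = λ⁻¹ mod n

def encrypt(m):
    """Paillier encryption of an integer block m (0 <= m < n)."""
    r = random.randrange(2, n)
    while gcd(r, n) != 1:
        r = random.randrange(2, n)
    # g^m = (1+n)^m ≡ 1 + m·n (mod n²) when g = n+1
    return (1 + m * n) % n2 * pow(r, n, n2) % n2

def decrypt(c):
    """Paillier decryption: m = L(c^λ mod n²)·μ mod n, with L(x) = (x-1)/n."""
    return (pow(c, lam, n2) - 1) // n * mu % n

def encrypt_blocks(blocks, workers=4):
    """Parallel-mode encryption: independent plaintext blocks, one task each."""
    with ThreadPoolExecutor(max_workers=workers) as ex:
        return list(ex.map(encrypt, blocks))
```

    The homomorphic property is that multiplying ciphertexts adds plaintexts: decrypting the modular product of the block ciphertexts yields the sum of the blocks, which is what lets the cloud aggregate encrypted data without seeing it.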

  3. Parallel Computing:. Some Activities in High Energy Physics

    NASA Astrophysics Data System (ADS)

    Willers, Ian

    This paper examines some activities in High Energy Physics that utilise parallel computing. These span computing from the proposed SIMD front-end detectors, through farming applications and high-powered RISC processors, to the large machines in the computer centers. We start by looking at the motivation behind using parallelism for general-purpose computing. The developments around farming are then described, from its simplest form to the more complex system at Fermilab. Finally, there is a list of some developments that are happening close to the experiments.

  4. A Parallel Trade Study Architecture for Design Optimization of Complex Systems

    NASA Technical Reports Server (NTRS)

    Kim, Hongman; Mullins, James; Ragon, Scott; Soremekun, Grant; Sobieszczanski-Sobieski, Jaroslaw

    2005-01-01

    Design of a successful product requires evaluating many design alternatives in a limited design cycle time. This can be achieved through leveraging design space exploration tools and available computing resources on the network. This paper presents a parallel trade study architecture to integrate trade study clients and computing resources on a network using Web services. The parallel trade study solution is demonstrated to accelerate design of experiments, genetic algorithm optimization, and a cost as an independent variable (CAIV) study for a space system application.
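    The core idea of farming out a design-of-experiments study can be sketched as follows. This is an illustrative sketch only, not the paper's Web-services architecture: a full-factorial grid of design points is evaluated by concurrent workers, with a thread pool standing in for the networked computing resources, and the objective function and variable names invented for the example.

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import product

def evaluate(design):
    # Hypothetical stand-in objective: a mass/cost surrogate for a design point.
    thickness, diameter = design
    return {"design": design, "score": thickness * diameter ** 2}

def run_trade_study(levels_a, levels_b, workers=8):
    """Evaluate a full-factorial design-of-experiments grid in parallel
    and return the design point with the lowest objective value."""
    grid = list(product(levels_a, levels_b))      # all factor combinations
    with ThreadPoolExecutor(max_workers=workers) as ex:
        results = list(ex.map(evaluate, grid))    # one task per design point
    return min(results, key=lambda r: r["score"])
```

    Because every design point is independent, the study parallelizes trivially; the same dispatch pattern applies whether the workers are local threads or, as in the paper, remote services reached over the network.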

  5. Data Acquisition with GPUs: The DAQ for the Muon g-2 Experiment at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gohn, W.

    Graphical Processing Units (GPUs) have recently become a valuable computing tool for the acquisition of data at high rates and at relatively low cost. The devices work by parallelizing the code into thousands of threads, each executing a simple process, such as identifying pulses from a waveform digitizer. The CUDA programming library can be used to write code that effectively parallelizes such tasks on Nvidia GPUs, providing a significant performance upgrade over CPU-based acquisition systems. The muon g-2 experiment at Fermilab relies heavily on GPUs to process its data. The data acquisition system for this experiment must be able to create deadtime-free records from 700 μs muon spills at a raw data rate of 18 GB per second. Data will be collected using 1296 channels of μTCA-based 800 MSPS, 12-bit waveform digitizers and processed in a layered array of networked commodity processors with 24 GPUs working in parallel to perform fast recording of the muon decays during the spill. The described data acquisition system is currently being constructed, and will be fully operational before the start of the experiment in 2017.
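    The per-thread task the abstract names, identifying pulses in a digitized waveform, can be sketched on the CPU. This is a hypothetical threshold-based pulse finder for illustration, not the experiment's actual CUDA kernel; on a GPU, each channel or waveform segment would map to its own thread, and here a thread pool stands in for that parallelism.

```python
from concurrent.futures import ThreadPoolExecutor

def find_pulses(samples, threshold=2):
    """Return (start_index, peak_value) for each run of samples above threshold."""
    pulses, start, peak = [], None, None
    for i, s in enumerate(samples):
        if s > threshold:
            if start is None:
                start, peak = i, s      # pulse begins
            else:
                peak = max(peak, s)     # pulse continues; track its peak
        elif start is not None:
            pulses.append((start, peak))  # pulse ends
            start = None
    if start is not None:
        pulses.append((start, peak))      # pulse ran to end of record
    return pulses

# One digitizer channel per worker, processed independently in parallel
channels = [
    [0, 1, 7, 9, 3, 0, 0, 5, 8, 2, 0],
    [0, 0, 4, 6, 0, 0, 0, 0, 0, 3, 0],
]
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(find_pulses, channels))
```

    Each channel is processed with no shared state, which is exactly the kind of embarrassingly parallel workload that maps well onto thousands of GPU threads.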

  6. Flow Structure and Channel Morphology at a Confluent-Meander Bend

    NASA Astrophysics Data System (ADS)

    Riley, J. D.; Rhoads, B. L.

    2009-12-01

    Flow structure and channel morphology in meander bends have been well documented. Channel curvature subjects flow through a bend to centrifugal acceleration, inducing a counterbalancing pressure-gradient force that initiates secondary circulation. Transverse variations in boundary shear stress and bedload transport parallel cross-stream movement of high velocity flow and determine spatial patterns of erosion along the outer bank and deposition along the inner bank. Laboratory experiments and numerical modeling of confluent-meander bends, a junction planform that develops when a tributary joins a meandering river along the outer bank of a bend, suggest that flow and channel morphology in such bends deviate from typical patterns. The purpose of this study is to examine three-dimensional (3-D) flow structure and channel morphology at a natural confluent-meander bend. Field data were collected in southeastern Illinois where Big Muddy Creek joins the Little Wabash River near a local maximum of curvature along an elongated meander loop. Measurements of 3-D velocity components were obtained with an acoustic Doppler current profiler (ADCP) for two flow events with differing momentum ratios. Channel bathymetry was also resolved from the four-beam depths of the ADCP. Analysis of velocity data reveals a distinct shear layer flanked by dual helical cells within the bend immediately downstream of the confluence. Flow from the tributary confines flow from the main channel along the inner part of the channel cross section, displacing the thalweg inward, limiting the downstream extent of the point bar, protecting the outer bank from erosion and enabling bar-building along this bank. Overall, this pattern of flow and channel morphology is quite different from typical patterns in meander bends, but is consistent with a conceptual model derived from laboratory experiments and numerical modeling.

  7. A parallel variable metric optimization algorithm

    NASA Technical Reports Server (NTRS)

    Straeter, T. A.

    1973-01-01

    An algorithm designed to exploit the parallel computing or vector streaming (pipeline) capabilities of computers is presented. If p is the degree of parallelism, then one cycle of the parallel variable metric algorithm is defined as follows: first, the function and its gradient are computed in parallel at p different values of the independent variable; then the metric is modified by p rank-one corrections; and finally, a single univariate minimization is carried out in the Newton-like direction. Several properties of this algorithm are established. The convergence of the iterates to the solution is proved for a quadratic functional on a real separable Hilbert space. For a finite-dimensional space the convergence is in one cycle when p equals the dimension of the space. Results of numerical experiments indicate that the new algorithm will exploit parallel or pipeline computing capabilities to effect faster convergence than serial techniques.
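    One cycle of such a scheme can be sketched as follows. This is a simplified illustration, not the paper's exact formulation: it uses the common symmetric rank-one correction on a 2-D quadratic, with probe-point gradients evaluated in a thread pool. For a quadratic with p equal to the dimension (here 2), the corrections recover the inverse Hessian, so a single Newton-like step lands on the minimizer, mirroring the one-cycle convergence claim.

```python
from concurrent.futures import ThreadPoolExecutor

# Quadratic test problem f(x) = 0.5 x^T A x - b^T x, gradient g(x) = A x - b
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]

def grad(x):
    return [sum(A[i][j] * x[j] for j in range(2)) - b[i] for i in range(2)]

def matvec(H, v):
    return [sum(H[i][j] * v[j] for j in range(2)) for i in range(2)]

x = [0.0, 0.0]
h = 1.0                       # probe step along each coordinate
H = [[1.0, 0.0], [0.0, 1.0]]  # initial metric (identity)

# Step 1: gradients at p = 2 probe points, evaluated in parallel
probes = [[x[0] + h, x[1]], [x[0], x[1] + h]]
g0 = grad(x)
with ThreadPoolExecutor(max_workers=2) as pool:
    grads = list(pool.map(grad, probes))

# Step 2: p rank-one corrections to the metric (symmetric rank-one update)
for k in range(2):
    s = [probes[k][i] - x[i] for i in range(2)]        # step taken
    y = [grads[k][i] - g0[i] for i in range(2)]        # gradient change
    Hy = matvec(H, y)
    v = [s[i] - Hy[i] for i in range(2)]               # s - H y
    denom = sum(v[i] * y[i] for i in range(2))
    for i in range(2):
        for j in range(2):
            H[i][j] += v[i] * v[j] / denom

# Step 3: Newton-like step; H now approximates A^{-1}, so one step suffices
x_new = [x[i] - matvec(H, g0)[i] for i in range(2)]
# For this quadratic the minimizer is A^{-1} b = [1/11, 7/11]
```

    The line search of the full algorithm is unnecessary here because the metric is already exact after p corrections on a quadratic.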

  8. MPI, HPF or OpenMP: A Study with the NAS Benchmarks

    NASA Technical Reports Server (NTRS)

    Jin, Hao-Qiang; Frumkin, Michael; Hribar, Michelle; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1999-01-01

    Porting applications to new high-performance parallel and distributed platforms is a challenging task. Writing parallel code by hand is time consuming and costly, but the task can be simplified by high-level languages and, better still, automated by parallelizing tools and compilers. The definition of the HPF (High Performance Fortran, based on the data-parallel model) and OpenMP (based on the shared-memory parallel model) standards has offered a great opportunity in this respect. Both provide simple and clear interfaces to languages like Fortran and simplify many tedious tasks encountered in writing message-passing programs. In our study we implemented parallel versions of the NAS Benchmarks with HPF and OpenMP directives. A comparison of their performance with the MPI implementation, and the pros and cons of the different approaches, will be discussed, along with our experience of using computer-aided tools to help parallelize these benchmarks. Based on the study, the potential of applying some of these techniques to realistic aerospace applications will be presented.

  9. MPI, HPF or OpenMP: A Study with the NAS Benchmarks

    NASA Technical Reports Server (NTRS)

    Jin, H.; Frumkin, M.; Hribar, M.; Waheed, A.; Yan, J.; Saini, Subhash (Technical Monitor)

    1999-01-01

    Porting applications to new high-performance parallel and distributed platforms is a challenging task. Writing parallel code by hand is time consuming and costly, but this task can be simplified by high-level languages and, better still, automated by parallelizing tools and compilers. The definition of the HPF (High Performance Fortran, based on the data-parallel model) and OpenMP (based on the shared-memory parallel model) standards has offered a great opportunity in this respect. Both provide simple and clear interfaces to languages like Fortran and simplify many tedious tasks encountered in writing message-passing programs. In our study, we implemented parallel versions of the NAS Benchmarks with HPF and OpenMP directives. A comparison of their performance with the MPI implementation, and the pros and cons of the different approaches, will be discussed, along with our experience of using computer-aided tools to help parallelize these benchmarks. Based on the study, the potential of applying some of these techniques to realistic aerospace applications will be presented.

  10. Plasma physics and environmental perturbation laboratory. [magnetospheric experiments from space shuttle

    NASA Technical Reports Server (NTRS)

    Vogl, J. L.

    1973-01-01

    Current work aimed at identifying the active magnetospheric experiments that can be performed from the Space Shuttle, and at designing a laboratory to carry out these experiments, is described. The laboratory, known as the PPEPL (Plasma Physics and Environmental Perturbation Laboratory), consists of a 35-ft pallet of instruments connected to a 25-ft pressurized control module. The systems deployed from the pallet are two 50-m booms, two subsatellites, a high-power transmitter, a multipurpose accelerator, a set of deployable canisters, and a gimbaled instrument platform. Missions are planned to last seven days, during which two scientists will carry out experiments from within the pressurized module. The types of experiments to be performed are outlined.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gibson, Garth

    Petascale computing infrastructures for scientific discovery make petascale demands on information storage capacity, performance, concurrency, reliability, availability, and manageability. The Petascale Data Storage Institute focuses on the data storage problems found in petascale scientific computing environments, with special attention to community issues such as interoperability, community buy-in, and shared tools. The Petascale Data Storage Institute is a collaboration between researchers at Carnegie Mellon University, National Energy Research Scientific Computing Center, Pacific Northwest National Laboratory, Oak Ridge National Laboratory, Sandia National Laboratory, Los Alamos National Laboratory, University of Michigan, and the University of California at Santa Cruz. Because the Institute focuses on low-level file systems and storage systems, its role in improving SciDAC systems was one of supporting application middleware such as data management and system-level performance tuning. In retrospect, the Petascale Data Storage Institute's most innovative and impactful contribution is the Parallel Log-structured File System (PLFS). Published at SC09, PLFS is middleware that operates in MPI-IO, or embedded in FUSE for non-MPI applications. Its function is to decouple concurrently written files into per-process log files, whose impact (the contents of the single file that the parallel application was concurrently writing) is determined on later reading, rather than during writing. PLFS is transparent to the parallel application, offering a POSIX or MPI-IO interface, and it shows an order-of-magnitude speedup on the Chombo benchmark and two orders of magnitude on the FLASH benchmark. Moreover, LANL production applications see speedups of 5X to 28X, so PLFS has been put into production at LANL.
Originally conceived and prototyped in a PDSI collaboration between LANL and CMU, it has grown to engage many other PDSI institutes and international partners like AWE, and has a large team at EMC supporting and enhancing it. PLFS is open sourced under a BSD license on SourceForge. Post-PDSI funding comes from NNSA and industry sources. Moreover, PLFS has spun out half a dozen or more papers, partnered on research with multiple schools and vendors, and has projects to transparently 1) distribute metadata over independent metadata servers, 2) exploit drastically non-POSIX Hadoop storage for HPC POSIX applications, 3) compress checkpoints on the fly, 4) batch delayed writes for write speed, 5) compress read-back indexes and parallelize their redistribution, 6) double-buffer writes in NAND flash storage to decouple host blocking during checkpoint from disk write time in the storage system, and 7) pack small files into a smaller number of bigger containers. There are two large-scale open-source Linux software projects that PDSI significantly incubated, though neither was initiated in PDSI. These are 1) Ceph, a UCSC parallel object storage research project that has continued to be a vehicle for research and has become a released part of Linux, and 2) Parallel NFS (pNFS), a portion of the IETF's NFSv4.1 that brings the core data parallelism found in Lustre, PanFS, PVFS, and Ceph to the industry-standard NFS, with released code in Linux 3.0 and its vendor offerings, with products from NetApp, EMC, BlueArc and Red Hat. Both are fundamentally supported and advanced by vendor companies now, but were critically transferred from research demonstration to viable product with funding, in part, from PDSI. At this point Lustre remains the primary path to scalable IO in exascale systems, but both Ceph and pNFS are viable alternatives with different fundamental advantages. Finally, research community building was a big success for PDSI.
Through the HECFSIO workshops and the HECURA project with NSF, PDSI stimulated and helped to steer leveraged funding of over $25M. Through the Petascale (now Parallel) Data Storage Workshop series, www.pdsw.org, colocated with SCxy each year, PDSI created and incubated five offerings of this high-attendance workshop. The workshop has gone on without PDSI support with two more highly successful workshops, rewriting its organizational structure to be community managed. More than 70 peer-reviewed papers have been presented at PDSW workshops.
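The core PLFS idea described above, turning concurrent writes to one shared file into cheap per-process appends whose placement is resolved only at read time, can be sketched with in-memory logs and indexes. This is a conceptual toy, not the real middleware: it keeps logs as byte arrays rather than files and ignores overlapping writes, which the real system must also resolve.

```python
# Each writer appends to a private log plus an index of
# (logical_offset, log_offset, length) entries.
logs = {}     # rank -> bytearray (stands in for a per-process log file)
indexes = {}  # rank -> list of (logical_offset, log_offset, length)

def plfs_write(rank, logical_offset, payload):
    """A write is always a cheap append, regardless of its logical offset."""
    log = logs.setdefault(rank, bytearray())
    indexes.setdefault(rank, []).append((logical_offset, len(log), len(payload)))
    log.extend(payload)

def plfs_read(total_size):
    """Reconstruct the logical file on read by replaying every index."""
    out = bytearray(total_size)
    for rank, entries in indexes.items():
        for logical_offset, log_offset, length in entries:
            out[logical_offset:logical_offset + length] = \
                logs[rank][log_offset:log_offset + length]
    return bytes(out)

# Two "ranks" write interleaved strided records into one logical file
plfs_write(0, 0, b"AAAA")
plfs_write(1, 4, b"BBBB")
plfs_write(0, 8, b"aaaa")
plfs_write(1, 12, b"bbbb")
assert plfs_read(16) == b"AAAABBBBaaaabbbb"
```

Decoupling the write path this way is what removes the contention of N processes writing strided records into one file, which is where the reported order-of-magnitude speedups come from.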

  12. Tested Studies for Laboratory Teaching. Proceedings of the Workshop/Conference of the Association for Biology Laboratory Education (ABLE) (12th, Springfield, Missouri, June 4-8, 1990). Volume 12.

    ERIC Educational Resources Information Center

    Goldman, Corey A., Ed.

    The focus of the Association for Biology Laboratory Education (ABLE) is to improve the undergraduate biology laboratory experience by promoting the development and dissemination of interesting, innovative, and reliable laboratory exercises. This proceedings volume includes 13 papers: "Non-Radioactive DNA Hybridization Experiments for the…

  13. Six Years of Parallel Computing at NAS (1987 - 1993): What Have we Learned?

    NASA Technical Reports Server (NTRS)

    Simon, Horst D.; Cooper, D. M. (Technical Monitor)

    1994-01-01

    In the fall of 1987 the age of parallelism at NAS began with the installation of a 32K-processor CM-2 from Thinking Machines. In 1987 this was described as an "experiment" in parallel processing. In the six years since, NAS has acquired a series of parallel machines and conducted an active research and development effort focused on the use of highly parallel machines for applications in the computational aerosciences. In this time period, parallel processing for scientific applications evolved from a fringe research topic into one of the main activities at NAS. In this presentation I will review the history of parallel computing at NAS in the context of the major progress which has been made in the field in general. I will attempt to summarize the lessons we have learned so far, and the contributions NAS has made to the state of the art. Based on these insights I will comment on the current state of parallel computing (including the HPCC effort) and try to predict some trends for the next six years.

  14. Does the Lack of Hands-On Experience in a Remotely Delivered Laboratory Course Affect Student Learning?

    ERIC Educational Resources Information Center

    Abdel-Salam, Tarek; Kauffman, Paul J.; Crossman, Gary

    2006-01-01

    Educators question whether performing a laboratory experiment as an observer (non-hands-on), such as conducted in a distance education context, can be as effective a learning tool as personally performing the experiment in a laboratory environment. The present paper investigates this issue by comparing the performance of distance education…

  15. Expression, Purification, and Characterization of a Carbohydrate-Active Enzyme: A Research-Inspired Methods Optimization Experiment for the Biochemistry Laboratory

    ERIC Educational Resources Information Center

    Willbur, Jaime F.; Vail, Justin D.; Mitchell, Lindsey N.; Jakeman, David L.; Timmons, Shannon C.

    2016-01-01

    The development and implementation of research-inspired, discovery-based experiences into science laboratory curricula is a proven strategy for increasing student engagement and ownership of experiments. In the novel laboratory module described herein, students learn to express, purify, and characterize a carbohydrate-active enzyme using modern…

  16. Incrementally Approaching an Inquiry Lab Curriculum: Can Changing a Single Laboratory Experiment Improve Student Performance in General Chemistry?

    ERIC Educational Resources Information Center

    Cacciatore, Kristen L.; Sevian, Hannah

    2009-01-01

    Many institutions are responding to current research about how students learn science by transforming their general chemistry laboratory curricula to be inquiry-oriented. We present a comparison study of student performance after completing either a traditional or an inquiry stoichiometry experiment. This single laboratory experience was the only…

  17. Quality Assurance Program for Molecular Medicine Laboratories

    PubMed Central

    Hajia, M; Safadel, N; Samiee, S Mirab; Dahim, P; Anjarani, S; Nafisi, N; Sohrabi, A; Rafiee, M; Sabzavi, F; Entekhabi, B

    2013-01-01

    Background: Molecular diagnostic methods have played, and continue to play, a critical role in clinical laboratories in recent years. Standardization is therefore an evolutionary process that needs to be upgraded with increasing scientific knowledge and with improvements in instruments and techniques. The aim of this study was to design a quality assurance program in order to provide similar conditions for all medical laboratories engaged in molecular tests. Methods: We designed a plan covering all four elements: required space conditions, equipment, training, and basic guidelines. The necessary guidelines were prepared and confirmed by the specific committee launched at the Health Reference Laboratory. Results: Several workshops were held for medical laboratory directors and staff, quality control managers of molecular companies, and directors and nominees from universities. Accreditation of equipment and molecular materials proceeded in parallel with the rest of the program. We are now going to accredit medical laboratories and to evaluate the success of the program. Conclusion: Accreditation of medical laboratories will succeed if its basic elements are provided in advance. Professional practice guidelines, training, and accreditation of molecular materials and equipment ensured that laboratories are aware of best practices, proper interpretation, limitations of techniques, and technical issues. Active external auditing can now improve laboratory conditions toward the defined standard level. PMID:23865028

  18. Quality assurance program for molecular medicine laboratories.

    PubMed

    Hajia, M; Safadel, N; Samiee, S Mirab; Dahim, P; Anjarani, S; Nafisi, N; Sohrabi, A; Rafiee, M; Sabzavi, F; Entekhabi, B

    2013-01-01

    Molecular diagnostic methods have played, and continue to play, a critical role in clinical laboratories in recent years. Standardization is therefore an evolutionary process that needs to be upgraded with increasing scientific knowledge and with improvements in instruments and techniques. The aim of this study was to design a quality assurance program in order to provide similar conditions for all medical laboratories engaged in molecular tests. We designed a plan covering all four elements: required space conditions, equipment, training, and basic guidelines. The necessary guidelines were prepared and confirmed by the specific committee launched at the Health Reference Laboratory. Several workshops were held for medical laboratory directors and staff, quality control managers of molecular companies, and directors and nominees from universities. Accreditation of equipment and molecular materials proceeded in parallel with the rest of the program. We are now going to accredit medical laboratories and to evaluate the success of the program. Accreditation of medical laboratories will succeed if its basic elements are provided in advance. Professional practice guidelines, training, and accreditation of molecular materials and equipment ensured that laboratories are aware of best practices, proper interpretation, limitations of techniques, and technical issues. Active external auditing can now improve laboratory conditions toward the defined standard level.

  19. Parallel image-acquisition in continuous-wave electron paramagnetic resonance imaging with a surface coil array: Proof-of-concept experiments

    NASA Astrophysics Data System (ADS)

    Enomoto, Ayano; Hirata, Hiroshi

    2014-02-01

    This article describes a feasibility study of parallel image-acquisition using a two-channel surface coil array in continuous-wave electron paramagnetic resonance (CW-EPR) imaging. Parallel EPR imaging was performed by multiplexing of EPR detection in the frequency domain. The parallel acquisition system consists of two surface coil resonators and radiofrequency (RF) bridges for EPR detection. To demonstrate the feasibility of this method of parallel image-acquisition with a surface coil array, three-dimensional EPR imaging was carried out using a tube phantom. Technical issues in the multiplexing method of EPR detection were also clarified. We found that degradation in the signal-to-noise ratio due to the interference of RF carriers is a key problem to be solved.

  20. Asking the next generation: the implementation of pre-university students’ ideas about physics laboratory preparation exercises

    NASA Astrophysics Data System (ADS)

    Dunnett, K.; Bartlett, P. A.

    2018-01-01

    It was planned to introduce online pre-laboratory activities to a first-year undergraduate physics laboratory course to encourage a minimum level of student preparation for experiments outside the laboratory environment. A group of 16- and 17-year-old laboratory work-experience students was tasked to define and design a pre-laboratory activity based on experiments they had been undertaking. This informed the structure, content and aims of the activities introduced to a first-year physics undergraduate laboratory course, with a particular focus on practising data handling. An implementation study showed how students could try to optimise for high grades, rather than gain efficiency-enhancing experience, if careful controls were not put in place by assessors. However, the work demonstrated that pre-university and first-year physics students can take an active role in developing scaffolding activities that can help to improve the performance of those who follow in their footsteps.

  1. The Effect of Chemistry Laboratory Activities on Students' Chemistry Perception and Laboratory Anxiety Levels

    ERIC Educational Resources Information Center

    Aydogdu, Cemil

    2017-01-01

    Chemistry lessons should be supported with experiments so that the material is understood effectively. For a safe laboratory environment and to prevent laboratory accidents, the properties of chemical substances and the working principles for their use should be learnt. The aim of the present study was to analyze the effect of experiments which depend on…

  2. Lab experiments are a major source of knowledge in the social sciences.

    PubMed

    Falk, Armin; Heckman, James J

    2009-10-23

    Laboratory experiments are a widely used methodology for advancing causal knowledge in the physical and life sciences. With the exception of psychology, the adoption of laboratory experiments has been much slower in the social sciences, although during the past two decades the use of lab experiments has accelerated. Nonetheless, there remains considerable resistance among social scientists who argue that lab experiments lack "realism" and generalizability. In this article, we discuss the advantages and limitations of laboratory social science experiments by comparing them to research based on nonexperimental data and to field experiments. We argue that many recent objections against lab experiments are misguided and that even more lab experiments should be conducted.

  3. A highly efficient multi-core algorithm for clustering extremely large datasets

    PubMed Central

    2010-01-01

    Background: In recent years, the demand for computational power in computational biology has increased due to rapidly growing data sets from microarray and other high-throughput technologies. This demand is likely to increase. Standard algorithms for analyzing data, such as cluster algorithms, need to be parallelized for fast processing. Unfortunately, most approaches to parallelizing algorithms rely largely on network communication protocols connecting, and requiring, multiple computers. One answer to this problem is to utilize the intrinsic capabilities of current multi-core hardware to distribute tasks among the different cores of one computer. Results: We introduce a multi-core parallelization of the k-means and k-modes cluster algorithms, based on the design principles of transactional memory, for clustering gene expression microarray-type data and categorical SNP data. Our new shared-memory parallel algorithms prove to be highly efficient. We demonstrate their computational power and show their utility in cluster stability and sensitivity analysis employing repeated runs with slightly changed parameters. The computation speed of our Java-based algorithm was increased by a factor of 10 for large data sets, while preserving computational accuracy, compared with single-core implementations and a recently published network-based parallelization. Conclusions: Most desktop computers and even notebooks provide at least dual-core processors. Our multi-core algorithms show that, using modern algorithmic concepts, parallelization makes it possible to perform even such laborious tasks as cluster sensitivity and cluster number estimation on the laboratory computer. PMID:20370922
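    The parallel structure the abstract describes, distributing the assignment step of k-means across cores while keeping the data in shared memory, can be sketched as follows. This is a minimal 1-D illustration with threads standing in for cores; the paper's implementation is a Java shared-memory design built on transactional-memory principles, not this code.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy 1-D data with two obvious clusters
data = [1.0, 1.2, 0.8, 9.0, 9.2, 8.8]
centers = [0.0, 10.0]

def assign_chunk(chunk):
    # Assignment step: nearest center for each point in this chunk
    return [min(range(len(centers)), key=lambda c: (x - centers[c]) ** 2)
            for x in chunk]

for _ in range(5):  # Lloyd iterations
    # Parallel assignment: split the data among workers sharing `centers`
    chunks = [data[0:3], data[3:6]]
    with ThreadPoolExecutor(max_workers=2) as pool:
        labels = [l for part in pool.map(assign_chunk, chunks) for l in part]
    # Sequential update step: recompute each center as the mean of its members
    for c in range(len(centers)):
        members = [x for x, l in zip(data, labels) if l == c]
        if members:
            centers[c] = sum(members) / len(members)
```

    The assignment step dominates the cost of k-means and has no cross-point dependencies, so splitting it across cores is where a shared-memory parallelization gains its speed-up.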

  4. Description of the Spacecraft Control Laboratory Experiment (SCOLE) facility

    NASA Technical Reports Server (NTRS)

    Williams, Jeffrey P.; Rallo, Rosemary A.

    1987-01-01

    A laboratory facility for the study of control laws for large flexible spacecraft is described. The facility fulfills the requirements of the Spacecraft Control Laboratory Experiment (SCOLE) design challenge for a laboratory experiment, which will allow slew maneuvers and pointing operations. The structural apparatus is described in detail sufficient for modelling purposes. The sensor and actuator types and characteristics are described so that identification and control algorithms may be designed. The control implementation computer and real-time subroutines are also described.

  5. Description of the Spacecraft Control Laboratory Experiment (SCOLE) facility

    NASA Technical Reports Server (NTRS)

    Williams, Jeffrey P.; Rallo, Rosemary A.

    1987-01-01

    A laboratory facility for the study of control laws for large flexible spacecraft is described. The facility fulfills the requirements of the Spacecraft Control Laboratory Experiment (SCOLE) design challenge for laboratory experiments, which will allow slew maneuvers and pointing operations. The structural apparatus is described in detail sufficient for modelling purposes. The sensor and actuator types and characteristics are described so that identification and control algorithms may be designed. The control implementation computer and real-time subroutines are also described.

  6. Direct numerical simulation of steady state, three dimensional, laminar flow around a wall mounted cube

    NASA Astrophysics Data System (ADS)

    Liakos, Anastasios; Malamataris, Nikolaos A.

    2014-05-01

    The topology and evolution of flow around a surface-mounted cubical object in three-dimensional channel flow is examined for low to moderate Reynolds numbers. Direct numerical simulations were performed via a home-made parallel finite element code. The computational domain was designed according to actual laboratory experiment conditions. Analysis of the results is performed using the three-dimensional theory of separation. Our findings indicate that a tornado-like vortex by the side of the cube is present for all Reynolds numbers for which flow was simulated. A horseshoe vortex upstream from the cube formed at a Reynolds number of approximately 1266. Pressure distributions are shown along with three-dimensional images of the tornado-like vortex and the horseshoe vortex at selected Reynolds numbers. Finally, and in accordance with previous work, our results indicate that the upper limit of the Reynolds number for which steady-state results are physically realizable is roughly 2000.

  7. Progress in FMIT test assembly development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Opperman, E.K.; Vogel, M.A.; Shen, E.J.

    Research and development supporting the completed design of the Fusion Materials Irradiation Test (FMIT) Facility is continuing at the Hanford Engineering Development Laboratory (HEDL) in Richland, Washington. The FMIT, a deuteron-accelerator-based (d + Li) neutron source, will produce an intense flux of high-energy neutrons for use in radiation damage studies of fusion reactor materials. The most intense flux, of magnitude greater than 10¹⁵ n/cm²-s, is located close to the neutron-producing lithium target and is distributed within a volume about the size of an American football. The conceptual design and development of FMIT experiments called Test Assemblies has progressed over the past five years in parallel with the design of the FMIT. The paper describes recent accomplishments in developing test assemblies appropriate for use in the limited volume close to the FMIT target, where high neutron flux and heating rates and the associated spatial gradients significantly impact design considerations.

  8. Three-dimensional MHD (magnetohydrodynamic) flows in rectangular ducts of liquid-metal-cooled blankets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hua, T.Q.; Walker, J.S.; Picologlou, B.F.

    1988-07-01

    Magnetohydrodynamic flows of liquid metals in rectangular ducts with thin conducting walls in the presence of strong nonuniform transverse magnetic fields are examined. The interaction parameter and Hartmann number are assumed to be large, whereas the magnetic Reynolds number is assumed to be small. Under these assumptions, viscous and inertial effects are confined to very thin boundary layers adjacent to the walls. A significant fraction of the fluid flow is concentrated in the boundary layers adjacent to the side walls, which are parallel to the magnetic field. This paper describes the analysis and numerical methods for obtaining 3-D solutions for flow parameters outside these layers, without solving explicitly for the layers themselves. Numerical solutions are presented for cases which are relevant to the flows of liquid metals in fusion reactor blankets. Experimental results obtained from the ALEX experiments at Argonne National Laboratory are used to validate the numerical code. In general, the agreement is excellent. 5 refs., 14 figs.

  9. Conservative and reactive solute transport in constructed wetlands

    USGS Publications Warehouse

    Keefe, Steffanie H.; Barber, Larry B.; Runkel, Robert L.; Ryan, Joseph N.; McKnight, Diane M.; Wass, Roland D.

    2004-01-01

    The transport of bromide, a conservative tracer, and rhodamine WT (RWT), a photodegrading tracer, was evaluated in three wastewater‐dependent wetlands near Phoenix, Arizona, using a solute transport model with transient storage. Coupled sodium bromide and RWT tracer tests were performed to establish conservative transport and reactive parameters in constructed wetlands with water losses ranging from (1) relatively impermeable (15%), (2) moderately leaky (45%), and (3) significantly leaky (76%). RWT first‐order photolysis rates and sorption coefficients were determined from independent field and laboratory experiments. Individual wetland hydraulic profiles influenced the extent of transient storage interaction in stagnant water areas and consequently RWT removal. Solute mixing and transient storage interaction occurred in the impermeable wetland, resulting in 21% RWT mass loss from main channel and storage zone photolysis (10%) and sorption (11%) reactions. Advection and dispersion governed solute transport in the leaky wetland, limiting RWT photolysis removal (1.2%) and favoring main channel sorption (3.6%). The moderately leaky wetland contained islands parallel to flow, producing channel flow and minimizing RWT losses (1.6%).

  10. Simple method for generating adjustable trains of picosecond electron bunches

    NASA Astrophysics Data System (ADS)

    Muggli, P.; Allen, B.; Yakimenko, V. E.; Park, J.; Babzien, M.; Kusche, K. P.; Kimura, W. D.

    2010-05-01

    A simple, passive method for producing an adjustable train of picosecond electron bunches is demonstrated. The key component of this method is an electron beam mask consisting of an array of parallel wires that selectively spoils the beam emittance. This mask is positioned in a high magnetic dispersion, low beta-function region of the beam line. The incoming electron beam striking the mask has a time/energy correlation that corresponds to a time/position correlation at the mask location. The mask pattern is transformed into a time pattern or train of bunches when the dispersion is brought back to zero downstream of the mask. Results are presented of a proof-of-principle experiment demonstrating this novel technique that was performed at the Brookhaven National Laboratory Accelerator Test Facility. This technique allows for easy tailoring of the bunch train for a particular application, including varying the bunch width and spacing, and enabling the generation of a trailing witness bunch.

  11. Experimental design, operation, and results of a 4 kW high temperature steam electrolysis experiment

    DOE PAGES

    Zhang, Xiaoyu; O'Brien, James E.; Tao, Greg; ...

    2015-08-06

    High temperature steam electrolysis (HTSE) is a promising technology for large-scale hydrogen production. However, research on HTSE performance above the kW level is limited. This paper presents the results of a 4 kW HTSE long-term test completed in a multi-kW test facility recently developed at the Idaho National Laboratory (INL). The 4 kW HTSE unit included two solid oxide electrolysis stacks operating in parallel, each of which included 40 electrode-supported planar cells. A current density of 0.41 A/cm2 was used for the long-term operation, resulting in a hydrogen production rate of about 25 slpm. A demonstration of 920 hours of stable operation was achieved. The paper also includes detailed descriptions of the piping layout, steam generation and delivery system, test fixture, heat recuperation system, hot zone, instrumentation, and operating conditions. This successful demonstration of a multi-kW-scale HTSE unit will help advance the technology toward near-term commercialization.
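    As a back-of-the-envelope consistency check on the figures quoted above (80 cells total at 0.41 A/cm2 producing roughly 25 slpm of hydrogen), Faraday's law implies a per-cell current and active area. The abstract does not state the cell area, so the values computed below are inferences under stated assumptions (ideal-gas molar volume at 0 °C, 1 atm; equal current through both parallel stacks), not reported data:

    ```python
    # Hedged Faraday's-law check of the reported 4 kW HTSE operating point.
    # Inputs come from the abstract; the per-cell area is inferred, not reported.

    F = 96485.0        # Faraday constant, C/mol
    N_CELLS = 80       # two 40-cell stacks in parallel
    V_MOLAR = 22.414   # L/mol at 0 C, 1 atm (assumed basis for "standard" liters)

    def implied_cell_current(h2_slpm: float, n_cells: int = N_CELLS) -> float:
        """Per-cell current (A) needed for the given H2 rate.

        Electrolysis transfers 2 electrons per H2 molecule, so the total
        molar rate is n_cells * I / (2F).
        """
        mol_per_s = h2_slpm / V_MOLAR / 60.0
        return mol_per_s * 2.0 * F / n_cells

    def implied_cell_area(h2_slpm: float, j_a_per_cm2: float = 0.41) -> float:
        """Active area per cell (cm^2) implied by the quoted current density."""
        return implied_cell_current(h2_slpm) / j_a_per_cm2

    if __name__ == "__main__":
        print(f"implied per-cell current: {implied_cell_current(25.0):.1f} A")
        print(f"implied active area:      {implied_cell_area(25.0):.0f} cm^2")
    ```

    The result (roughly 45 A per cell over about 110 cm2 of active area) is in the plausible range for electrode-supported planar cells, which is a reasonable sanity check rather than a confirmation of the actual hardware.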

  12. Dynamics and control of flexible spacecraft during and after slewing maneuvers

    NASA Technical Reports Server (NTRS)

    Kakad, Yogendra P.

    1989-01-01

    The dynamics and control of slewing maneuvers of NASA Spacecraft COntrol Laboratory Experiment (SCOLE) are analyzed. The control problem of slewing maneuvers of SCOLE is formulated in terms of an arbitrary maneuver about any given axis. The control system is developed for the combined problem of rigid-body slew maneuver and vibration suppression of the flexible appendage. The control problem formulation incorporates the nonlinear dynamical equations derived previously, and is expressed in terms of a two-point boundary value problem utilizing a quadratic type of performance index. The two-point boundary value problem is solved as a hierarchical control problem with the overall system being split in terms of two subsystems, namely the slewing of the entire assembly and the vibration suppression of the flexible antenna. The coupling variables between the two dynamical subsystems are identified and these two subsystems for control purposes are treated independently in parallel at the first level. Then the state-space trajectory of the combined problem is optimized at the second level.

  13. NEET-AMM Final Technical Report on Laser Direct Manufacturing (LDM) for Nuclear Power Components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Scott; Baca, Georgina; O'Connor, Michael

    2015-12-31

    This final technical report summarizes the program progress and technical accomplishments of the Laser Direct Manufacturing (LDM) for Nuclear Power Components project. A series of experiments varying build process parameters (scan speed and laser power) were conducted at the outset to establish the optimal build conditions for each of the alloys. Fabrication was completed in collaboration with Quad City Manufacturing Laboratory (QCML). The density of all sample specimens was measured and compared to literature values. Optimal build process conditions giving fabricated part densities close to literature values were chosen for making mechanical test coupons. Test coupons whose principal axis is on the x-y plane (perpendicular to the build direction) and on the z plane (parallel to the build direction) were built and tested as part of the experimental build matrix to understand the impact of the anisotropic nature of the process. Investigations of 316L SS, Inconel 600, 718, and 800, and oxide-dispersion-strengthened 316L SS (yttria) alloys are described.

  14. Experimental studies of interactions between Alfvén waves and striated density depletions in the LAPD

    NASA Astrophysics Data System (ADS)

    Auerbach, D. W.; Carter, T. A.; Vincena, S.

    2008-11-01

    Satellite measurements in the earth's magnetosphere have associated Alfvén-frequency fluctuations with density depletions striated along the geomagnetic field. This poster presents laboratory studies in the LAPD experiment at UCLA modeling this phenomenon. Density depletions are pre-formed in the plasma column by selectively blocking a portion of the drive beam, and Alfvén waves are driven in the cavity by means of an inserted antenna. Relevant experimental parameters include an ion cyclotron radius of around a millimeter, an Alfvén parallel wavelength of several meters, an electron inertial length of around 6 mm, and electron thermal speeds of about a third of the Alfvén speed. We report here on modifications to the wave propagation due to the density depletion. We also report on the details of the interactions between the driven wave and the secondary drift-Alfvén wave instabilities that arise on the density boundary, including wave-wave interactions and possible turbulent broadening effects on the main wave.

  15. City transport of the future - the high speed pedestrian conveyor. Part 1: ergonomic considerations of accelerators, decelerators and transfer sections.

    PubMed

    Browning, A C

    1974-12-01

    In this article, an uncommon form of passenger transport is considered, the moving pavement or pedestrian conveyor running at speeds of up to 16 km/h. There is very little relevant ergonomic data for such devices, and some specific laboratory experiments have been carried out using 1000 subjects to represent the general public. It is concluded that whilst high speed pedestrian conveyors are quite feasible, stations along them are likely to be large. The most attractive type is a set of parallel surfaces moving at different speeds and with handholds provided in the form of poles. This type could be extremely convenient for certain locations but will probably have to be restricted in its use to fairly fit adults carrying little luggage, and would find applications in situations where a large number of people need to travel in the same direction. Part 2, Ergonomic considerations of complete conveyor systems, will follow.

  16. Helium release during shale deformation: Experimental validation

    DOE PAGES

    Bauer, Stephen J.; Gardner, W. Payton; Heath, Jason E.

    2016-07-01

    This paper describes initial experimental results of helium tracer release monitoring during deformation of shale. Naturally occurring radiogenic 4He is present in high concentration in most shales. During rock deformation, accumulated helium could be released as fractures open and new transport pathways are created. We present the results of an experimental study in which confined reservoir shale samples, cored parallel and perpendicular to bedding, which were initially saturated with helium to simulate reservoir conditions, are subjected to triaxial compressive deformation. During the deformation experiment, differential stress, axial, and radial strains are systematically tracked. Release of helium is dynamically measured using a helium mass spectrometer leak detector. Helium released during deformation is observable at the laboratory scale and the release is tightly coupled to the shale deformation. These first measurements of dynamic helium release from rocks undergoing deformation show that helium provides information on the evolution of microstructure as a function of changes in stress and strain.

  17. Assessment of bacterial growth and total organic carbon removal on granular activated carbon contactors.

    PubMed

    Bancroft, K; Maloney, S W; McElhaney, J; Suffet, I H; Pipes, W O

    1983-09-01

    The overall growth rate of bacteria on granular activated carbon (GAC) contactors at the Philadelphia Torresdale Water Treatment Pilot Plant facility was found to decrease until steady state was reached. The growth rate was found to fluctuate between 6.94 × 10⁻³ and 8.68 × 10⁻⁴ doublings per h. The microbiological removal of total organic carbon (TOC) was calculated by considering the GAC contactors as semiclosed continuous culture systems and using growth yield factors determined in laboratory experiments. After ozonation, the average TOC entering the contactors was 1,488 micrograms/liter, and the average effluent TOC was 497 micrograms/liter. Microbiological TOC removal was found to average 240 micrograms/liter on GAC contactors, which was not significantly different from microbiological TOC removal (220 micrograms/liter) across a parallel sand contactor where no adsorption took place. Thus, GAC did not appear to enhance biological TOC removal. Bacterial growth and maintenance were responsible for approximately 24% of the TOC removal on GAC under the conditions of this study.
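    The "approximately 24%" figure quoted above should follow arithmetically from the other numbers in the abstract (microbiological removal divided by total removal). A minimal check, using only values stated in the abstract:

    ```python
    # Mass-balance check of the TOC figures quoted in the abstract (ug/L).
    influent_toc = 1488.0       # average TOC entering the GAC contactors
    effluent_toc = 497.0        # average effluent TOC
    microbial_removal = 240.0   # microbiological TOC removal on GAC

    total_removal = influent_toc - effluent_toc          # 991 ug/L
    microbial_fraction = microbial_removal / total_removal

    print(f"total removal:      {total_removal:.0f} ug/L")
    print(f"microbial fraction: {microbial_fraction:.0%}")  # ~24%, matching the abstract
    ```

    The remaining ~76% of removal is attributable to adsorption on the carbon, consistent with the abstract's conclusion that GAC did not enhance biological TOC removal relative to the sand contactor.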

  18. Fluctuation dynamo and turbulent induction at small Prandtl number.

    PubMed

    Eyink, Gregory L

    2010-10-01

    We study the Lagrangian mechanism of the fluctuation dynamo at zero Prandtl number and infinite magnetic Reynolds number, in the Kazantsev-Kraichnan model of white-noise advection. With a rough velocity field corresponding to a turbulent inertial range, flux freezing holds only in a stochastic sense. We show that field lines arriving at the same point which were initially separated by many resistive lengths are important to the dynamo. Magnetic vectors of the seed field that point parallel to the initial separation vector arrive anticorrelated and produce an "antidynamo" effect. We also study the problem of "magnetic induction" of a spatially uniform seed field. We find no essential distinction between this process and fluctuation dynamo, both producing the same growth rates and small-scale magnetic correlations. In the regime of very rough velocity fields where fluctuation dynamo fails, we obtain the induced magnetic energy spectra. We use these results to evaluate theories proposed for magnetic spectra in laboratory experiments of turbulent induction.

  19. Biotechnology Laboratory Methods.

    ERIC Educational Resources Information Center

    Davis, Robert H.; Kompala, Dhinakar S.

    1989-01-01

    Describes a course entitled "Biotechnology Laboratory" which introduces a variety of laboratory methods associated with biotechnology. Describes the history, content, and seven experiments of the course. The seven experiments are selected from microbiology and molecular biology, kinetics and fermentation, and downstream…

  20. Scalable load balancing for massively parallel distributed Monte Carlo particle transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Brien, M. J.; Brantley, P. S.; Joy, K. I.

    2013-07-01

    In order to run computer simulations efficiently on massively parallel computers with hundreds of thousands or millions of processors, care must be taken that the calculation is load balanced across the processors. Examining the workload of every processor leads to an unscalable algorithm, with run time at least as large as O(N), where N is the number of processors. We present a scalable load balancing algorithm, with run time O(log(N)), that involves iterated processor-pair-wise balancing steps, ultimately leading to a globally balanced workload. We demonstrate scalability of the algorithm up to 2 million processors on the Sequoia supercomputer at Lawrence Livermore National Laboratory. (authors)
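    The intuition behind O(log N) pairwise balancing can be illustrated with a hypercube (butterfly) pairing, where at step k processor p averages its load with partner p XOR 2^k; after log2(N) steps every processor holds the global mean. This serial simulation is a toy sketch of that idea, not the authors' Monte Carlo particle transport implementation, and the power-of-two pairing scheme is an assumption made for illustration:

    ```python
    # Toy serial simulation of iterated processor-pair-wise load balancing.
    # Butterfly pairing over a power-of-two "processor" count: after log2(N)
    # averaging steps, every entry equals the global mean (pair averaging
    # preserves the total work at each step).
    import random

    def pairwise_balance(loads):
        """Return loads after log2(N) rounds of pairwise averaging."""
        n = len(loads)
        assert n & (n - 1) == 0, "sketch assumes a power-of-two processor count"
        loads = list(loads)
        k = 1
        while k < n:
            for p in range(n):
                q = p ^ k
                if p < q:  # each pair balances exactly once per round
                    avg = (loads[p] + loads[q]) / 2.0
                    loads[p] = loads[q] = avg
            k <<= 1
        return loads

    if __name__ == "__main__":
        work = [random.uniform(0, 100) for _ in range(1024)]
        balanced = pairwise_balance(work)
        mean = sum(work) / len(work)
        assert all(abs(w - mean) < 1e-6 for w in balanced)
    ```

    Each round touches every processor once, so the total cost is O(N log N) work but only O(log N) parallel rounds, which is the scaling property the abstract highlights.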
