Sample records for particle control program

  1. PROGRAMMABLE TURBIDISTAT FOR SUSPENDED PARTICLES IN LABORATORY AQUARIA

    EPA Science Inventory

    A system for precise control of suspended particle concentrations in laboratory aquaria is described. It comprises an air-lift dosing system, a transmissometer to measure particle concentration, and a microcomputer which calculates the dose required to achieve a programmed turbid...

  2. Acoustic Detection Of Loose Particles In Pressure Sensors

    NASA Technical Reports Server (NTRS)

    Kwok, Lloyd C.

    1995-01-01

    The particle-impact-noise-detector (PIND) apparatus is used in conjunction with a computer program that analyzes its output to detect extraneous particles trapped in pressure sensors. The PIND tester is essentially a shaker equipped with a microphone that measures noise in the pressure sensor or other object being shaken. The shaker applies controlled vibration. The output of the microphone is recorded and expressed in terms of voltage, yielding a noise history that is subsequently processed by the computer program. Data are taken at a sampling rate sufficiently high to enable identification of all impacts of particles on the sensor diaphragm and on the inner surfaces of the sensor cavities.
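
    For illustration, a minimal Python sketch of the kind of post-processing the record describes: flagging particle-impact events in a sampled microphone voltage trace by thresholding against the background noise level. The sampling rate, threshold factor, and synthetic signal are assumptions for the example, not parameters of the actual PIND program.

      import numpy as np

      def detect_impacts(voltage, fs, n_sigma=5.0, min_gap_s=1e-3):
          """Return impact times (s) where |voltage| exceeds n_sigma times the
          background noise level, keeping one time per burst of samples."""
          baseline = np.median(np.abs(voltage)) / 0.6745   # robust noise estimate
          hits = np.flatnonzero(np.abs(voltage) > n_sigma * baseline)
          if hits.size == 0:
              return np.array([])
          gaps = np.diff(hits) > int(min_gap_s * fs)       # start of each burst
          starts = np.concatenate(([hits[0]], hits[1:][gaps]))
          return starts / fs

      # Synthetic trace: background noise plus three short impact bursts
      fs = 200_000.0                                       # 200 kHz sampling, assumed
      t = np.arange(int(0.1 * fs)) / fs
      v = 0.01 * np.random.randn(t.size)
      for t0 in (0.02, 0.05, 0.083):
          v += 0.5 * np.exp(-((t - t0) * fs / 20.0) ** 2)
      print("impact times (s):", detect_impacts(v, fs))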

  3. QUALITY CONTROL OF SEMI-CONTINUOUS MOBILITY SIZE-FRACTIONATED PARTICLE NUMBER CONCENTRATION DATA. (R827352)

    EPA Science Inventory

    Fine and ultrafine particles have been postulated to play an important role in the association between ambient particulate matter and adverse health effects. As part of the EPA Supersite Program, the Southern California Particle Center & Supersite has conducted a series o...

  4. IMPLICIT DUAL CONTROL BASED ON PARTICLE FILTERING AND FORWARD DYNAMIC PROGRAMMING.

    PubMed

    Bayard, David S; Schumitzky, Alan

    2010-03-01

    This paper develops a sampling-based approach to implicit dual control. Implicit dual control methods synthesize stochastic control policies by systematically approximating the stochastic dynamic programming equations of Bellman, in contrast to explicit dual control methods that artificially induce probing into the control law by modifying the cost function to include a term that rewards learning. The proposed implicit dual control approach is novel in that it combines a particle filter with a policy-iteration method for forward dynamic programming. The integration of the two methods provides a complete sampling-based approach to the problem. Implementation of the approach is simplified by making use of a specific architecture denoted as an H-block. Practical suggestions are given for reducing computational loads within the H-block for real-time applications. As an example, the method is applied to the control of a stochastic pendulum model having unknown mass, length, initial position and velocity, and unknown sign of its dc gain. Simulation results indicate that active controllers based on the described method can systematically improve closed-loop performance with respect to other more common stochastic control approaches.
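
    The paper's method couples a particle filter with forward dynamic programming inside an H-block; the sketch below shows only the generic bootstrap-particle-filter building block, for an assumed scalar random-walk state observed in Gaussian noise, to make the sampling-based estimation step concrete. It is an illustration, not the authors' implementation.

      import numpy as np

      rng = np.random.default_rng(0)

      def bootstrap_particle_filter(y, n_particles=1000, q=0.1, r=0.5):
          """Bootstrap particle filter for x[k+1] = x[k] + w,  y[k] = x[k] + v,
          with process noise std q and measurement noise std r (assumed model).
          Returns the posterior-mean state estimate at each step."""
          x = rng.normal(0.0, 1.0, n_particles)          # initial particle cloud
          estimates = []
          for yk in y:
              x = x + rng.normal(0.0, q, n_particles)    # propagate particles
              w = np.exp(-0.5 * ((yk - x) / r) ** 2)     # measurement likelihood
              w /= w.sum()
              estimates.append(np.dot(w, x))             # posterior mean
              x = rng.choice(x, size=n_particles, p=w)   # resample
          return np.array(estimates)

      # Example: track a slow ramp observed in noise
      truth = np.linspace(0.0, 2.0, 50)
      meas = truth + rng.normal(0.0, 0.5, truth.size)
      print(bootstrap_particle_filter(meas)[-5:])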

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: PAINT OVERSPRAY ARRESTOR, FARR COMPANY RIGA-FLO 200

    EPA Science Inventory

    Paint overspray arrestors (POAs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the particle filtration efficiency as a function of size for particles smaller than...

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: PAINT OVERSPRAY ARRESTOR, ATI OSM 200 SYSTEM

    EPA Science Inventory

    Paint overspray arrestors (POAs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the particle filtration efficiency as a function of size for particles smaller than...

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: PAINT OVERSPRAY ARRESTOR, COLUMBUS INDUSTRIES SL-46B

    EPA Science Inventory

    Paint overspray arrestors (POAs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the particle filtration efficiency as a function of size for particles smaller than...

  8. UCLA Tokamak Program Close Out Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Robert John

    2014-02-04

    The results of the UCLA experimental fusion program are summarized, starting with smaller devices such as Microtor, Macrotor, and CCT and ending with research on the large (5 m) Electric Tokamak. CCT was the most heavily diagnosed device for H-mode-like physics and the effects of rotation-induced radial fields. ICRF heating was also studied, but plasma heating of university-type tokamaks did not produce useful results due to plasma edge disturbances of the antennae. The Electric Tokamak produced better energy confinement, in the seconds range; however, it also exhibited very strong particle confinement due to an "electric particle pinch". This effect prevented the attainment of a quasi-steady state. The particle accumulation effect was numerically explained by Shaing's enhanced neoclassical theory. The PI believes that ITER will have a good energy confinement time but a deleteriously large particle confinement time, and that it will disrupt from particle pinching at nominal average densities. The US fusion research program did not study particle transport effects because of its focus on the physics of energy confinement time. Energy confinement time is not an issue for energy-producing tokamaks; controlling the ash flow will be very expensive.

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, PAINT OVERSPRAY ARRESTOR, KOCH FILTER CORPORATION, DUO-PAK 650

    EPA Science Inventory

    Paint overspray arrestors (POAs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the particle filtration efficiency as a function of size for particles smaller than...

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: PAINT OVERSPRAY ARRESTOR, AAF INTERNATIONAL DRI-PAK 40-45%

    EPA Science Inventory

    Paint overspray arrestors (POAs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the particle filtration efficiency as a function of size for particles smaller than...

  11. Evaluation of cloud detection instruments and performance of laminar-flow leading-edge test articles during NASA Leading-Edge Flight-Test Program

    NASA Technical Reports Server (NTRS)

    Davis, Richard E.; Maddalon, Dal V.; Wagner, Richard D.; Fisher, David F.; Young, Ronald

    1989-01-01

    Summary evaluations of the performance of laminar-flow control (LFC) leading edge test articles on a NASA JetStar aircraft are presented. Statistics, presented for the test articles' performance in haze and cloud situations, as well as in clear air, show a significant effect of cloud particle concentrations on the extent of laminar flow. The cloud particle environment was monitored by two instruments, a cloud particle spectrometer (Knollenberg probe) and a charging patch. Both instruments are evaluated as diagnostic aids for avoiding laminar-flow detrimental particle concentrations in future LFC aircraft operations. The data base covers 19 flights in the simulated airline service phase of the NASA Leading-Edge Flight-Test (LEFT) Program.

  12. RIP-REMOTE INTERACTIVE PARTICLE-TRACER

    NASA Technical Reports Server (NTRS)

    Rogers, S. E.

    1994-01-01

    Remote Interactive Particle-tracing (RIP) is a distributed-graphics program which computes particle traces for computational fluid dynamics (CFD) solution data sets. A particle trace is a line which shows the path a massless particle in a fluid will take; it is a visual image of where the fluid is going. The program is able to compute and display particle traces at a speed of about one trace per second because it runs on two machines concurrently. The data used by the program is contained in two files. The solution file contains data on density, momentum and energy quantities of a flow field at discrete points in three-dimensional space, while the grid file contains the physical coordinates of each of the discrete points. RIP requires two computers. A local graphics workstation interfaces with the user for program control and graphics manipulation, and a remote machine interfaces with the solution data set and performs time-intensive computations. The program utilizes two machines in a distributed mode for two reasons. First, the data to be used by the program is usually generated on the supercomputer. RIP avoids having to convert and transfer the data, eliminating any memory limitations of the local machine. Second, as computing the particle traces can be computationally expensive, RIP utilizes the power of the supercomputer for this task. Although the remote site code was developed on a CRAY, it is possible to port this to any supercomputer class machine with a UNIX-like operating system. Integration of a velocity field from a starting physical location produces the particle trace. The remote machine computes the particle traces using the particle-tracing subroutines from PLOT3D/AMES, a CFD post-processing graphics program available from COSMIC (ARC-12779). These routines use a second-order predictor-corrector method to integrate the velocity field. Then the remote program sends graphics tokens to the local machine via a remote-graphics library. The local machine interprets the graphics tokens and draws the particle traces. The program is menu driven. RIP is implemented on the silicon graphics IRIS 3000 (local workstation) with an IRIX operating system and on the CRAY2 (remote station) with a UNICOS 1.0 or 2.0 operating system. The IRIS 4D can be used in place of the IRIS 3000. The program is written in C (67%) and FORTRAN 77 (43%) and has an IRIS memory requirement of 4 MB. The remote and local stations must use the same user ID. PLOT3D/AMES unformatted data sets are required for the remote machine. The program was developed in 1988.
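
    A minimal sketch of the second-order predictor-corrector (Heun) integration that the record attributes to the PLOT3D/AMES tracing routines, applied here to a toy analytic velocity field; RIP itself interpolates velocities from a PLOT3D grid/solution file pair rather than evaluating a formula.

      import numpy as np

      def velocity(p):
          """Toy analytic swirl field standing in for an interpolated CFD solution."""
          x, y, z = p
          return np.array([-y, x, 0.1])

      def trace(p0, dt=0.01, n_steps=500):
          """Second-order predictor-corrector integration of a particle trace."""
          path = [np.asarray(p0, dtype=float)]
          for _ in range(n_steps):
              p = path[-1]
              v1 = velocity(p)                  # predictor step
              v2 = velocity(p + dt * v1)        # corrector evaluation
              path.append(p + 0.5 * dt * (v1 + v2))
          return np.array(path)

      print(trace([1.0, 0.0, 0.0])[-1])         # end point of the trace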

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, PAINT OVERSPRAY ARRESTOR, PUROLATOR PRODUCTS AIR FILTRATION COMPANY, DMK804404 AND PB2424

    EPA Science Inventory

    Paint overspray arrestors (POAs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the particle filtration efficiency as a function of size for particles smaller than...

  14. New numerical methods for open-loop and feedback solutions to dynamic optimization problems

    NASA Astrophysics Data System (ADS)

    Ghosh, Pradipto

    The topic of the first part of this research is trajectory optimization of dynamical systems via computational swarm intelligence. Particle swarm optimization is a nature-inspired heuristic search method that relies on a group of potential solutions to explore the fitness landscape. Conceptually, each particle in the swarm uses its own memory as well as the knowledge accumulated by the entire swarm to iteratively converge on an optimal or near-optimal solution. It is relatively straightforward to implement and unlike gradient-based solvers, does not require an initial guess or continuity in the problem definition. Although particle swarm optimization has been successfully employed in solving static optimization problems, its application in dynamic optimization, as posed in optimal control theory, is still relatively new. In the first half of this thesis particle swarm optimization is used to generate near-optimal solutions to several nontrivial trajectory optimization problems including thrust programming for minimum fuel, multi-burn spacecraft orbit transfer, and computing minimum-time rest-to-rest trajectories for a robotic manipulator. A distinct feature of the particle swarm optimization implementation in this work is the runtime selection of the optimal solution structure. Optimal trajectories are generated by solving instances of constrained nonlinear mixed-integer programming problems with the swarming technique. For each solved optimal programming problem, the particle swarm optimization result is compared with a nearly exact solution found via a direct method using nonlinear programming. Numerical experiments indicate that swarm search can locate solutions to very great accuracy. The second half of this research develops a new extremal-field approach for synthesizing nearly optimal feedback controllers for optimal control and two-player pursuit-evasion games described by general nonlinear differential equations. A notable revelation from this development is that the resulting control law has an algebraic closed-form structure. The proposed method uses an optimal spatial statistical predictor called universal kriging to construct the surrogate model of a feedback controller, which is capable of quickly predicting an optimal control estimate based on current state (and time) information. With universal kriging, an approximation to the optimal feedback map is computed by conceptualizing a set of state-control samples from pre-computed extremals to be a particular realization of a jointly Gaussian spatial process. Feedback policies are computed for a variety of example dynamic optimization problems in order to evaluate the effectiveness of this methodology. This feedback synthesis approach is found to combine good numerical accuracy with low computational overhead, making it a suitable candidate for real-time applications. Particle swarm and universal kriging are combined for a capstone example, a near optimal, near-admissible, full-state feedback control law is computed and tested for the heat-load-limited atmospheric-turn guidance of an aeroassisted transfer vehicle. The performance of this explicit guidance scheme is found to be very promising; initial errors in atmospheric entry due to simulated thruster misfirings are found to be accurately corrected while closely respecting the algebraic state-inequality constraint.
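
    For reference, a minimal global-best particle swarm optimizer in Python, applied to a static test function (the 2-D Rosenbrock function); the thesis applies the same idea to discretized trajectory optimization problems, so this is an illustrative sketch rather than the author's implementation.

      import numpy as np

      rng = np.random.default_rng(1)

      def pso(cost, lo, hi, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5):
          """Minimal global-best particle swarm optimizer on box bounds [lo, hi]."""
          x = rng.uniform(lo, hi, (n_particles, lo.size))
          v = np.zeros_like(x)
          pbest, pbest_f = x.copy(), np.array([cost(p) for p in x])
          gbest = pbest[pbest_f.argmin()].copy()
          for _ in range(n_iter):
              r1, r2 = rng.random(x.shape), rng.random(x.shape)
              v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
              x = np.clip(x + v, lo, hi)
              f = np.array([cost(p) for p in x])
              better = f < pbest_f
              pbest[better], pbest_f[better] = x[better], f[better]
              gbest = pbest[pbest_f.argmin()].copy()
          return gbest, pbest_f.min()

      def rosenbrock(p):
          return (1.0 - p[0]) ** 2 + 100.0 * (p[1] - p[0] ** 2) ** 2

      print(pso(rosenbrock, np.array([-2.0, -2.0]), np.array([2.0, 2.0])))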

  15. Predictive Capability of the Compressible MRG Equation for an Explosively Driven Particle with Validation

    NASA Astrophysics Data System (ADS)

    Garno, Joshua; Ouellet, Frederick; Koneru, Rahul; Balachandar, Sivaramakrishnan; Rollin, Bertrand

    2017-11-01

    An analytic model to describe the hydrodynamic forces on an explosively driven particle is not currently available. The Maxey-Riley-Gatignol (MRG) particle force equation generalized for compressible flows is well-studied in shock-tube applications, and captures the evolution of particle force extracted from controlled shock-tube experiments. In these experiments only the shock-particle interaction was examined, and the effects of the contact line were not investigated. In the present work, the predictive capability of this model is considered for the case where a particle is explosively ejected from a rigid barrel into ambient air. Particle trajectory information extracted from simulations is compared with experimental data. This configuration ensures that both the shock and contact produced by the detonation will influence the motion of the particle. The simulations are carried out using a finite volume, Euler-Lagrange code using the JWL equation of state to handle the explosive products. This work was supported by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program, as a Cooperative Agreement under the Predictive Science Academic Alliance Program,under Contract No. DE-NA0002378.

  16. Manipulation of Micro Scale Particles in Optical Traps Using Programmable Spatial Light Modulation

    NASA Technical Reports Server (NTRS)

    Seibel, Robin E.; Decker, Arthur J. (Technical Monitor)

    2003-01-01

    1064 nm light from an Nd:YAG laser was polarized and incident upon a programmable parallel-aligned liquid crystal spatial light modulator (PAL-SLM), where it was phase modulated according to the program controlling the PAL-SLM. Light reflected from the PAL-SLM was injected into a microscope and focused. At the focus, multiple optical traps were formed in which 9.975 μm spheres were captured. The traps and the spheres were moved by changing the program of the PAL-SLM. The motion of ordered groups of micro particles was clearly demonstrated.
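
    The record does not give the hologram-computation method, so the sketch below uses the standard superposition-of-gratings prescription for a phase-only SLM as one plausible illustration of how a control program can place several lateral traps; the array size and scaling are assumptions, not the paper's values.

      import numpy as np

      def trap_phase(trap_orders, n=512):
          """Wrapped phase pattern (radians) forming one lateral trap per
          (kx, ky) diffraction-order pair, via superposition of blazed gratings."""
          scale = 2.0 * np.pi / n
          yy, xx = np.mgrid[0:n, 0:n]
          field = np.zeros((n, n), dtype=complex)
          for kx, ky in trap_orders:
              field += np.exp(1j * scale * (kx * xx + ky * yy))
          return np.angle(field) % (2.0 * np.pi)

      # Three traps at different lateral offsets; moving a trap means recomputing
      # the pattern with a new (kx, ky) and reloading it onto the SLM.
      phase = trap_phase([(10, 0), (0, 15), (-8, -8)])
      print(phase.shape, float(phase.min()), float(phase.max()))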

  17. An overview of the cosmic dust analogue material production in reduced gravity: the STARDUST experience

    NASA Technical Reports Server (NTRS)

    Ferguson, F.; Lilleleht, L. U.; Nuth, J.; Stephens, J. R.; Bussoletti, E.; Colangeli, L.; Mennella, V.; Dell'Aversana, P.; Mirra, C.

    1993-01-01

    The formation, properties and chemical dynamics of microparticles are important in a wide variety of technical and scientific fields including synthesis of semiconductor crystals from the vapour, heterogeneous chemistry in the stratosphere and the formation of cosmic dust surrounding the stars. Gravitational effects on particle formation from vapors include gas convection and buoyancy and particle sedimentation. These processes can be significantly reduced by studying condensation and agglomeration of particles in microgravity. In addition, to accurately simulate particle formation near stars, which takes place under low gravity conditions, studies in microgravity are desired. We report here the STARDUST experience, a recent collaborative effort that brings together a successful American program of microgravity experiments on particle formation aboard NASA KC-135 Reduced Gravity Research Aircraft and several Italian research groups with expertise in microgravity research and astrophysical dust formation. The program goal is to study the formation and properties of high temperature particles and gases that are of interest in astrophysics and planetary science. To do so we are developing techniques that are generally applicable to study particle formation and properties, taking advantage of the microgravity environment to allow accurate control of system parameters.

  18. An overview of the cosmic dust analogue material production in reduced gravity: the STARDUST experience.

    PubMed

    Ferguson, F; Lilleleht, L U; Nuth, J; Stephens, J R; Bussoletti, E; Colangeli, L; Mennella, V; Dell'Aversana, P; Mirra, C

    1993-01-01

    The formation, properties and chemical dynamics of microparticles are important in a wide variety of technical and scientific fields including synthesis of semiconductor crystals from the vapour, heterogeneous chemistry in the stratosphere and the formation of cosmic dust surrounding the stars. Gravitational effects on particle formation from vapors include gas convection and buoyancy and particle sedimentation. These processes can be significantly reduced by studying condensation and agglomeration of particles in microgravity. In addition, to accurately simulate particle formation near stars, which takes place under low gravity conditions, studies in microgravity are desired. We report here the STARDUST experience, a recent collaborative effort that brings together a successful American program of microgravity experiments on particle formation aboard NASA KC-135 Reduced Gravity Research Aircraft and several Italian research groups with expertise in microgravity research and astrophysical dust formation. The program goal is to study the formation and properties of high temperature particles and gases that are of interest in astrophysics and planetary science. To do so we are developing techniques that are generally applicable to study particle formation and properties, taking advantage of the microgravity environment to allow accurate control of system parameters.

  19. Contamination control program for the Extreme Ultraviolet Explorer instruments

    NASA Technical Reports Server (NTRS)

    Ray, David C.; Malina, Roger F.; Welsh, Barry Y.; Austin, James D.; Teti, Bonnie Gray

    1989-01-01

    A contamination-control program has been instituted for the optical components of the EUV Explorer satellite, whose 80-900 A range performance is easily degraded by particulate and molecular contamination. Cleanliness requirements have been formulated for the design, fabrication, and test phases of these instruments; in addition, contamination-control steps have been taken which prominently include the isolation of sensitive components in a sealed optics cavity. Prelaunch monitoring systems encompass the use of quartz crystal microbalances, particle witness plates, direct flight hardware sampling, and optical witness sampling of EUV scattering and reflectivity.

  20. Visualization of fluid dynamics at NASA Ames

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1989-01-01

    The hardware and software currently used for visualization of fluid dynamics at NASA Ames is described. The software includes programs to create scenes (for example particle traces representing the flow over an aircraft), programs to interactively view the scenes, and programs to control the creation of video tapes and 16mm movies. The hardware includes high performance graphics workstations, a high speed network, digital video equipment, and film recorders.

  1. Dynamic Modeling of Yield and Particle Size Distribution in Continuous Bayer Precipitation

    NASA Astrophysics Data System (ADS)

    Stephenson, Jerry L.; Kapraun, Chris

    Process engineers at Alcoa's Point Comfort refinery are using a dynamic model of the Bayer precipitation area to evaluate options in operating strategies. The dynamic model, a joint development effort between Point Comfort and the Alcoa Technical Center, predicts process yields, particle size distributions and occluded soda levels for various flowsheet configurations of the precipitation and classification circuit. In addition to rigorous heat, material and particle population balances, the model includes mechanistic kinetic expressions for particle growth and agglomeration and semi-empirical kinetics for nucleation and attrition. The kinetic parameters have been tuned to Point Comfort's operating data, with excellent matches between the model results and plant data. The model is written for the ACSL dynamic simulation program with specifically developed input/output graphical user interfaces to provide a user-friendly tool. Features such as a seed charge controller enhance the model's usefulness for evaluating operating conditions and process control approaches.

  2. Fusion plasma theory project summaries

    NASA Astrophysics Data System (ADS)

    1993-10-01

    This Project Summary book is a published compilation consisting of short descriptions of each project supported by the Fusion Plasma Theory and Computing Group of the Advanced Physics and Technology Division of the Department of Energy, Office of Fusion Energy. The summaries contained in this volume were written by the individual contractors with minimal editing by the Office of Fusion Energy. Previous summaries were published in February of 1982 and December of 1987. The Plasma Theory program is responsible for the development of concepts and models that describe and predict the behavior of a magnetically confined plasma. Emphasis is given to the modelling and understanding of the processes controlling transport of energy and particles in a toroidal plasma and supporting the design of the International Thermonuclear Experimental Reactor (ITER). A tokamak transport initiative was begun in 1989 to improve understanding of how energy and particles are lost from the plasma by mechanisms that transport them across field lines. The Plasma Theory program has actively participated in this initiative. Recently, increased attention has been given to issues of importance to the proposed Tokamak Physics Experiment (TPX). Particular attention has been paid to containment and thermalization of fast alpha particles produced in a burning fusion plasma as well as control of sawteeth, current drive, impurity control, and design of improved auxiliary heating. In addition, general models of plasma behavior are developed from physics features common to different confinement geometries. This work uses both analytical and numerical techniques. The Fusion Theory program supports research projects at U.S. government laboratories, universities and industrial contractors. Its support of theoretical work at universities contributes to the office of Fusion Energy mission of training scientific manpower for the U.S. Fusion Energy Program.

  3. Apparatus, Method and Program Storage Device for Determining High-Energy Neutron/Ion Transport to a Target of Interest

    NASA Technical Reports Server (NTRS)

    Wilson, John W. (Inventor); Tripathi, Ram K. (Inventor); Cucinotta, Francis A. (Inventor); Badavi, Francis F. (Inventor)

    2012-01-01

    An apparatus, method and program storage device for determining high-energy neutron/ion transport to a target of interest. Boundaries are defined for calculation of a high-energy neutron/ion transport to a target of interest; the high-energy neutron/ion transport to the target of interest is calculated using numerical procedures selected to reduce local truncation error by including higher order terms and to allow absolute control of propagated error by ensuring truncation error is third order in step size, and using scaling procedures for flux coupling terms modified to improve computed results by adding a scaling factor to terms describing production of j-particles from collisions of k-particles; and the calculated high-energy neutron/ion transport is provided to modeling modules to control an effective radiation dose at the target of interest.

  4. Fine urban and precursor emissions control for diesel urban transit buses.

    PubMed

    Lanni, Thomas

    2003-01-01

    Particulate emissions from diesel engines are among the most important pollutants in urban areas. As a result, particulate emission control from urban bus diesel engines using particle filter technology is being evaluated at several locations in the US. A project entitled "Clean Diesel Air Quality Demonstration Program" has been initiated by the New York City Metropolitan Transit Authority (MTA) under the supervision of the New York State Department of Environmental Conservation and with active participation from Johnson Matthey, Corning, Equilon, Environment Canada and RAD Energy. Under this program, several MTA transit buses with DDC Series 50 engines were equipped with Continuously Regenerating Technology (CRT™) particulate filter systems and have been operated with ultra-low-sulfur diesel (<30 ppm S) in transit service in Manhattan since February 2000. These buses were evaluated over a 9-month period for durability and maintainability of the particulate filter. In addition, an extensive emissions testing program was carried out using transient cycles on a chassis dynamometer to evaluate the emissions reductions obtained with the particle filter. In this paper, the emissions testing data from the Clean Diesel Air Quality Demonstration Program are discussed in detail.

  5. Determination of Electron Optical Properties for Aperture Zoom Lenses Using an Artificial Neural Network Method.

    PubMed

    Isik, Nimet

    2016-04-01

    Multi-element electrostatic aperture lens systems are widely used to control electron or charged particle beams in many scientific instruments. By means of applied voltages, these lens systems can be operated for different purposes. In this context, numerous methods have been performed to calculate focal properties of these lenses. In this study, an artificial neural network (ANN) classification method is utilized to determine the focused/unfocused charged particle beam in the image point as a function of lens voltages for multi-element electrostatic aperture lenses. A data set for training and testing of ANN is taken from the SIMION 8.1 simulation program, which is a well known and proven accuracy program in charged particle optics. Mean squared error results of this study indicate that the ANN classification method provides notable performance characteristics for electrostatic aperture zoom lenses.
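
    A small sketch of the classification step described in the abstract, using scikit-learn's MLPClassifier on synthetic lens-voltage data; the focused/unfocused labelling rule below is invented so the example runs end to end and does not reproduce the SIMION-derived data set used in the study.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(2)

      # Two lens voltages (normalized) as inputs, focused/unfocused label as output.
      V = rng.uniform(0.1, 1.0, (2000, 2))
      focused = (np.abs(V[:, 1] - 0.6 * V[:, 0] - 0.2) < 0.15).astype(int)  # toy rule

      X_train, X_test, y_train, y_test = train_test_split(V, focused, random_state=0)
      clf = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
      clf.fit(X_train, y_train)
      print("test accuracy:", clf.score(X_test, y_test))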

  6. Particle trajectory computer program for icing analysis of axisymmetric bodies

    NASA Technical Reports Server (NTRS)

    Frost, Walter; Chang, Ho-Pen; Kimble, Kenneth R.

    1982-01-01

    General aviation aircraft and helicopters exposed to an icing environment can accumulate ice resulting in a sharp increase in drag and reduction of maximum lift causing hazardous flight conditions. NASA Lewis Research Center (LeRC) is conducting a program to examine, with the aid of high-speed computer facilities, how the trajectories of particles contribute to the ice accumulation on airfoils and engine inlets. This study, as part of the NASA/LeRC research program, develops a computer program for the calculation of icing particle trajectories and impingement limits relative to axisymmetric bodies in the leeward-windward symmetry plane. The methodology employed in the current particle trajectory calculation is to integrate the governing equations of particle motion in a flow field computed by the Douglas axisymmetric potential flow program. The three-degrees-of-freedom (horizontal, vertical, and pitch) motion of the particle is considered. The particle is assumed to be acted upon by aerodynamic lift and drag forces, gravitational forces, and for nonspherical particles, aerodynamic moments. The particle momentum equation is integrated to determine the particle trajectory. Derivation of the governing equations and the method of their solution are described in Section 2.0. General features, as well as input/output instructions for the particle trajectory computer program, are described in Section 3.0. The details of the computer program are described in Section 4.0. Examples of the calculation of particle trajectories demonstrating application of the trajectory program to given axisymmetric inlet test cases are presented in Section 5.0. For the examples presented, the particles are treated as spherical water droplets. In Section 6.0, limitations of the program relative to excessive computer time and recommendations in this regard are discussed.
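
    A compact sketch of the kind of trajectory integration the report describes: a spherical water droplet's two-dimensional equation of motion under aerodynamic drag and gravity, integrated by explicit time stepping. The uniform airflow and the Schiller-Naumann drag correlation are stand-ins for the report's Douglas axisymmetric potential-flow field and its drag data.

      import numpy as np

      RHO_AIR, MU_AIR, RHO_W, G = 1.225, 1.8e-5, 1000.0, 9.81

      def drag_coeff(re):
          """Schiller-Naumann sphere drag correlation (assumed for this sketch)."""
          re = max(re, 1e-6)
          return 24.0 / re * (1.0 + 0.15 * re ** 0.687)

      def droplet_trajectory(d=20e-6, u_air=(80.0, 0.0), dt=1e-4, n=2000):
          """Integrate droplet position/velocity in a uniform airflow u_air (m/s)."""
          u_air = np.asarray(u_air, dtype=float)
          m = RHO_W * np.pi * d ** 3 / 6.0
          x, v = np.zeros(2), u_air.copy()           # released moving with the air
          path = [x.copy()]
          for _ in range(n):
              v_rel = u_air - v
              re = RHO_AIR * np.linalg.norm(v_rel) * d / MU_AIR
              f_drag = (0.5 * RHO_AIR * drag_coeff(re) * np.pi * d ** 2 / 4.0
                        * np.linalg.norm(v_rel) * v_rel)
              a = f_drag / m + np.array([0.0, -G])
              v = v + dt * a
              x = x + dt * v
              path.append(x.copy())
          return np.array(path)

      print(droplet_trajectory()[-1])                # final droplet position (m)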

  7. Programming the composition of polymer blend particles for controlled immunity towards individual protein antigens.

    PubMed

    Zhan, Xi; Shen, Hong

    2015-05-28

    In order for a more precise control over the quality and quantity of immune responses stimulated by synthetic particle-based vaccines, it is critical to control the colloidal stability of particles and the release of protein antigens in both extracellular space and intracellular compartments. Different proteins exhibit different sizes, charges and solubilities. This study focused on modulating the release and colloidal stability of proteins with varied isoelectric points. A polymer particle delivery platform made from the blend of three polymers, poly(lactic-co-glycolic acid) (PLGA) and two random pH-sensitive copolymers, were developed. Our study demonstrated its programmability with respective to individual proteins. We showed the colloidal stability of particles at neutral environment and the release of each individual protein at different pH environments were dependent on the ratio of two charge polymers. Subsequently, two antigenic proteins, ovalbumin (OVA) and Type 2 Herpes Simplex Virus (HSV-2) glycoprotein D (gD) protein, were incorporated into particles with systematically varied compositions. We demonstrated that the level of in vitro CD8(+) T cell and in vivo immune responses were dependent on the ratio of two charged polymers, which correlated well with the release of proteins. This study provided a promising design framework of pH-responsive synthetic vaccines for protein antigens of interest. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Investigation of the Influence of Microgravity on Transport Mechanism in a Virtual Spaceflight Chamber: A Flight Definition Program

    NASA Technical Reports Server (NTRS)

    Trolinger, James D.; Rangel, Roger; Witherow, William; Rogers, Jan; Lal, Ravindra B.

    1999-01-01

    A need exists for understanding precisely how particles move and interact in a fluid in the absence of gravity. Such understanding is required, for example, for modeling and predicting crystal growth in space where crystals grow from solution around nucleation sites as well as for any study of particles or bubbles in liquids or in experiments where particles are used as tracers for mapping microconvection. We have produced an exact solution to the general equation of motion of particles at extremely low Reynolds number in microgravity that covers a wide range of interesting conditions. We have also developed diagnostic tools and experimental techniques to test the validity of the general equation . This program, which started in May, 1998, will produce the flight definition for an experiment in a microgravity environment of space to validate the theoretical model. We will design an experiment with the help of the theoretical model that is optimized for testing the model, measuring g, g-jitter, and other microgravity phenomena. This paper describes the goals, rational, and approach for the flight definition program. The first objective of this research is to understand the physics of particle interactions with fluids and other particles in low Reynolds number flows in microgravity. Secondary objectives are to (1) observe and quantify g-jitter effects and microconvection on particles in fluids, (2) validate an exact solution to the general equation of motion of a particle in a fluid, and (3) to characterize the ability of isolation tables to isolate experiments containing particle in liquids. The objectives will be achieved by recording a large number of holograms of particle fields in microgravity under controlled conditions, extracting the precise three-dimensional position of all of the particles as a function of time and examining the effects of all parameters on the motion of the particles. The feasibility for achieving these results has already been established in the ongoing ground-based NRA, which led to the "virtual spaceflight chamber" concept.

  9. Strong Shock Propagating Over A Random Bed of Spherical Particles

    NASA Astrophysics Data System (ADS)

    Mehta, Yash; Salari, Kambiz; Jackson, Thomas L.; Balachandar, S.; Thakur, Siddharth

    2017-11-01

    The study of shock interaction with particles is largely motivated by its wide-ranging applications. The complex interaction between compressible flow features, such as shock waves and expansion fans, and the dispersed phase makes this multiphase flow very difficult to predict and control. In this talk we present results of fully resolved inviscid simulations of shock interaction with a random bed of particles. One of the fascinating observations from these simulations is the flow-field fluctuations due to the presence of randomly distributed particles. Rigorous (Favre) averaging of the governing equations yields a Reynolds-stress-like term, which can be classified as pseudo-turbulence in this case. We have computed this "Reynolds stress" term along with the individual fluctuations and the turbulent kinetic energy. The average pressure was also computed to characterize the strength of the transmitted and reflected waves. This work was supported by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program, as a Cooperative Agreement under the Predictive Science Academic Alliance Program.
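
    For concreteness, a numpy sketch of one common definition of the Favre average and the pseudo-turbulent Reynolds-stress-like term, applied to synthetic samples; the talk's quantities come from fully resolved simulation fields, and the exact normalization used there may differ.

      import numpy as np

      rng = np.random.default_rng(3)

      def favre_stress(rho, u):
          """Favre mean u_tilde = <rho u>/<rho> and R_ij = <rho u_i'' u_j''>/<rho>
          from density samples rho (N,) and velocity samples u (N, 3)."""
          rho_bar = rho.mean()
          u_tilde = (rho[:, None] * u).mean(axis=0) / rho_bar
          u_dp = u - u_tilde                               # Favre fluctuation u''
          R = (rho[:, None, None] * u_dp[:, :, None] * u_dp[:, None, :]).mean(axis=0) / rho_bar
          return u_tilde, R

      # Synthetic stand-in for field samples extracted behind the particle bed
      rho = 1.2 + 0.1 * rng.standard_normal(10000)
      u = np.array([300.0, 0.0, 0.0]) + 20.0 * rng.standard_normal((10000, 3))
      u_tilde, R = favre_stress(rho, u)
      print(u_tilde, 0.5 * np.trace(R))                    # mean velocity, pseudo-TKE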

  10. Longitudinal bunch monitoring at the Fermilab Tevatron and Main Injector synchrotrons

    DOE PAGES

    Thurman-Keup, R.; Bhat, C.; Blokland, W.; ...

    2011-10-17

    The measurement of the longitudinal behavior of the accelerated particle beams at Fermilab is crucial to the optimization and control of the beam and the maximizing of the integrated luminosity for the particle physics experiments. Longitudinal measurements in the Tevatron and Main Injector synchrotrons are based on the analysis of signals from resistive wall current monitors. This study describes the signal processing performed by a 2 GHz-bandwidth oscilloscope together with a computer running a LabVIEW program which calculates the longitudinal beam parameters.
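
    As a simplified stand-in for the LabVIEW analysis mentioned in the record, the sketch below extracts basic longitudinal parameters (integrated signal, centroid, and RMS bunch length) from a digitized wall-current-monitor trace; the baseline estimate and the synthetic Gaussian bunch are assumptions for the example.

      import numpy as np

      def bunch_parameters(t, i_wall):
          """Baseline-subtracted integral, centroid (s), and RMS length (s)."""
          i = i_wall - np.median(i_wall[:50])      # crude baseline from leading samples
          i = np.clip(i, 0.0, None)
          dt = t[1] - t[0]
          q = i.sum() * dt                         # proportional to bunch charge
          centroid = (t * i).sum() / i.sum()
          sigma_t = np.sqrt(((t - centroid) ** 2 * i).sum() / i.sum())
          return q, centroid, sigma_t

      # Synthetic 2 ns (sigma) Gaussian bunch sampled at 10 GS/s
      t = np.arange(0.0, 100e-9, 0.1e-9)
      trace = 0.5 * np.exp(-0.5 * ((t - 50e-9) / 2e-9) ** 2) + 0.01 * np.random.randn(t.size)
      print(bunch_parameters(t, trace))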

  11. Randomized Trial to Reduce Air Particle Levels in Homes of Smokers and Children.

    PubMed

    Hughes, Suzanne C; Bellettiere, John; Nguyen, Benjamin; Liles, Sandy; Klepeis, Neil E; Quintana, Penelope J E; Berardi, Vincent; Obayashi, Saori; Bradley, Savannah; Hofstetter, C Richard; Hovell, Melbourne F

    2018-03-01

    Exposure to fine particulate matter in the home from sources such as smoking, cooking, and cleaning may put residents, especially children, at risk for detrimental health effects. A randomized clinical trial was conducted from 2011 to 2016 to determine whether real-time feedback in the home plus brief coaching of parents or guardians could reduce fine particle levels in homes with smokers and children. A randomized trial with two groups: intervention and control. A total of 298 participants from predominantly low-income households with an adult smoker and a child aged <14 years. Participants were recruited during 2012-2015 from multiple sources in San Diego, mainly Women, Infants and Children Program sites. The multicomponent intervention consisted of continuous lights and brief sound alerts based on fine particle levels in real time and four brief coaching sessions using particle level graphs and motivational interviewing techniques. Motivational interviewing coaching focused on particle reduction to protect children and other occupants from elevated particle levels, especially from tobacco-related sources. In-home air particle levels were measured by laser particle counters continuously in both study groups. The two outcomes were daily mean particle counts and percentage of time with high particle concentrations (>15,000 particles/0.01 ft³). Linear mixed models were used to analyze the differential change in the outcomes over time by group, during 2016-2017. Intervention homes had significantly larger reductions than controls in daily geometric mean particle concentrations (18.8% reduction vs 6.5% reduction, p<0.001). Intervention homes' average percentage of time with high particle concentrations decreased 45.1% compared with a 4.2% increase among controls (difference between groups p<0.001). Real-time feedback for air particle levels and brief coaching can reduce fine particle levels in homes with smokers and young children. Results set the stage for refining feedback and possible reinforcing consequences for not generating smoke-related particles. This study is registered at www.clinicaltrials.gov NCT01634334. Copyright © 2017 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
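
    The two trial outcomes are simple to compute from a particle-counter time series; the sketch below shows one way to do it in Python with pandas, on synthetic one-minute data (the column names and lognormal test data are assumptions, not the study's data).

      import numpy as np
      import pandas as pd

      def daily_summaries(times, counts, threshold=15000):
          """Daily geometric mean count and percentage of time above threshold
          (counts per 0.01 cubic foot), mirroring the two trial outcomes."""
          df = pd.DataFrame({"t": pd.to_datetime(times), "count": counts})
          grouped = df.groupby(df["t"].dt.date)["count"]
          geo_mean = grouped.apply(lambda x: np.exp(np.log(x.clip(lower=1)).mean()))
          pct_high = grouped.apply(lambda x: 100.0 * (x > threshold).mean())
          return pd.DataFrame({"geo_mean": geo_mean, "pct_time_high": pct_high})

      # One week of synthetic 1-minute particle counts
      t = pd.date_range("2014-01-01", periods=7 * 24 * 60, freq="min")
      c = np.random.lognormal(mean=8.5, sigma=1.0, size=t.size)
      print(daily_summaries(t, c))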

  12. Properties of meso-Erythritol; phase state, accommodation coefficient and saturation vapour pressure

    NASA Astrophysics Data System (ADS)

    Emanuelsson, Eva; Tschiskale, Morten; Bilde, Merete

    2016-04-01

    Introduction Saturation vapour pressure and the associated temperature dependence (enthalpy ΔH), are key parameters for improving predictive atmospheric models. Generally, the atmospheric aerosol community lack experimentally determined values of these properties for relevant organic aerosol compounds (Bilde et al., 2015). In this work we have studied the organic aerosol component meso-Erythritol. Methods Sub-micron airborne particles of meso-Erythritol were generated by nebulization from aqueous solution, dried, and a mono disperse fraction of the aerosol was selected using a differential mobility analyser. The particles were then allowed to evaporate in the ARAGORN (AaRhus Atmospheric Gas phase OR Nano particle) flow tube. It is a temperature controlled 3.5 m long stainless steel tube with an internal diameter of 0.026 m (Bilde et al., 2003, Zardini et al., 2010). Changes in particle size as function of evaporation time were determined using a scanning mobility particle sizer system. Physical properties like air flow, temperature, humidity and pressure were controlled and monitored on several places in the setup. The saturation vapour pressures were then inferred from the experimental results in the MATLAB® program AU_VaPCaP (Aarhus University_Vapour Pressure Calculation Program). Results Following evaporation, meso-Erythriol under some conditions showed a bimodal particle size distribution indicating the formation of particles of two different phase states. The issue of physical phase state, along with critical assumptions e.g. the accommodation coefficient in the calculations of saturation vapour pressures of atmospheric relevant compounds, will be discussed. Saturation vapour pressures from the organic compound meso-Erythritol will be presented at temperatures between 278 and 308 K, and results will be discussed in the context of atmospheric chemistry. References Bilde, M. et al., (2015), Chemical Reviews, 115 (10), 4115-4156. Bilde, M. et. al., (2003), Environmental Science and Technology 37(7), 1371-1378. Zardini, A. A. et al., (2010), Journal of Aerosol Science, 41, 760-770.
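
    For orientation only, a sketch of the simplest (continuum-regime, Maxwell) relation between the measured shrinkage of particle diameter with residence time and the saturation vapour pressure; the Kelvin effect and transition-regime (Fuchs) corrections that a full analysis such as AU_VaPCaP would apply are omitted, and the numbers below are assumed rather than measured.

      import numpy as np

      R = 8.314   # J mol^-1 K^-1

      def p_sat_from_evaporation(t, dp, diff_coeff, molar_mass, density, temp):
          """Continuum-regime estimate: d(Dp^2)/dt = -8 D M p_sat / (rho R T),
          inverted for p_sat (Pa) from diameters dp (m) at residence times t (s)."""
          slope = np.polyfit(t, dp ** 2, 1)[0]     # d(Dp^2)/dt, negative for evaporation
          return -density * R * temp * slope / (8.0 * diff_coeff * molar_mass)

      # Illustrative numbers: ~100 nm particles shrinking over about one minute
      t = np.array([0.0, 20.0, 40.0, 60.0])
      dp = np.array([100e-9, 96e-9, 92e-9, 87e-9])
      print(p_sat_from_evaporation(t, dp, diff_coeff=6e-6, molar_mass=0.122,
                                   density=1450.0, temp=298.0), "Pa")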

  13. Harnessing what lies within: Programming immunity with biocompatible devices to treat human disease

    NASA Astrophysics Data System (ADS)

    Roberts, Reid Austin

    Advances in our mechanistic insight of cellular function and how this relates to host physiology have revealed a world which is intimately connected at the macro and micro level. Our increasing understanding of biology exemplifies this, where cells respond to environmental cues through interconnected networks of proteins which function as receptors and adaptors to elicit gene expression changes that drive appropriate cellular programs for a given stimulus. Consequently, our deeper molecular appreciation of host homeostasis implicates aberrations of these pathways in nearly all major human disease categories, including those of infectious, metabolic, neurologic, oncogenic, and autoimmune etiology. We have come to recognize the mammalian immune system as a common network hub among all these varied pathologies. As such, the major goal of this dissertation is to identify a platform to program immune responses in mammals so that we may enhance our ability to treat disease and improve health in the 21st century. Using advances in materials science, in particular a recently developed particle fabrication technology termed Particle Replication in Non-wetting Templates (PRINT), our studies systematically assess the murine and human immune response to precisely fabricated nano- and microscale particles composed of biodegradable and biocompatible materials. We then build on these findings and present particle design parameters to program a number of clinically attractive immune responses by targeting endogenous cellular signaling pathways. These include control of particle uptake through surface modification, design parameters that modulate the magnitude and kinetics of biological signaling dynamics that can be used to exacerbate or dampen inflammatory responses, as well as particle designs which may be of use in treating allergies and autoimmune disorders. In total, this dissertation provides evidence that rational design of biocompatible nano- and microparticles is a viable means to instruct therapeutic immune responses that may fundamentally improve how we treat human disease.

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - BAGHOUSE FILTRATION PRODUCTS - TETRATEC PTFE TECHNOLOGIES TETRATEX 8005

    EPA Science Inventory

    Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - BAGHOUSE FILTRATION PRODUCTS - INSPEC FIBRES 5512BRF FILTER SAMPLE

    EPA Science Inventory

    Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - BAGHOUSE FILTRATION PRODUCTS - MENARDI-CRISWELL 50-504 FILTER SAMPLE

    EPA Science Inventory

    Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...

  17. Image Analysis Program for Measuring Particles with the Zeiss CSM 950 Scanning Electron Microscope (SEM)

    DTIC Science & Technology

    1990-01-01

    Technical Report NATICK/TR-90/014. The recoverable portions of this record describe an image analysis program for measuring spherical particles with the Zeiss CSM 950 scanning electron microscope and Kontron image analysis system, including step-by-step instructions for invoking the program from the CSM console and tables of the resulting statistical data.

  18. The Relationship between Self-Assembly and Conformal Mappings

    NASA Astrophysics Data System (ADS)

    Duque, Carlos; Santangelo, Christian

    The isotropic growth of a thin sheet has been used as a way to generate programmed shapes through controlled buckling. We discuss how conformal mappings, which are transformations that locally preserve angles, provide a way to quantify the area growth needed to produce a particular shape. A discrete version of the conformal map can be constructed from circle packings, which are maps between packings of circles whose contact network is preserved. This provides a link to the self-assembly of particles on curved surfaces. We performed simulations of attractive particles on a curved surface using molecular dynamics. The resulting particle configurations were used to generate the corresponding discrete conformal map, allowing us to quantify the degree of area distortion required to produce a particular shape by finding particle configurations that minimize the area distortion.

  19. Rotating magnetic field induced oscillation of magnetic particles for in vivo mechanical destruction of malignant glioma.

    PubMed

    Cheng, Yu; Muroski, Megan E; Petit, Dorothée C M C; Mansell, Rhodri; Vemulkar, Tarun; Morshed, Ramin A; Han, Yu; Balyasnikova, Irina V; Horbinski, Craig M; Huang, Xinlei; Zhang, Lingjiao; Cowburn, Russell P; Lesniak, Maciej S

    2016-02-10

    Magnetic particles that can be precisely controlled under a magnetic field and transduce energy from the applied field open the way for innovative cancer treatment. Although these particles represent an area of active development for drug delivery and magnetic hyperthermia, the in vivo anti-tumor effect under a low-frequency magnetic field using magnetic particles has not yet been demonstrated. To-date, induced cancer cell death via the oscillation of nanoparticles under a low-frequency magnetic field has only been observed in vitro. In this report, we demonstrate the successful use of spin-vortex, disk-shaped permalloy magnetic particles in a low-frequency, rotating magnetic field for the in vitro and in vivo destruction of glioma cells. The internalized nanomagnets align themselves to the plane of the rotating magnetic field, creating a strong mechanical force which damages the cancer cell structure inducing programmed cell death. In vivo, the magnetic field treatment successfully reduces brain tumor size and increases the survival rate of mice bearing intracranial glioma xenografts, without adverse side effects. This study demonstrates a novel approach of controlling magnetic particles for treating malignant glioma that should be applicable to treat a wide range of cancers. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Aerial applications dispersal systems control requirements study. [agriculture

    NASA Technical Reports Server (NTRS)

    Bauchspies, J. S.; Cleary, W. L.; Rogers, W. F.; Simpson, W.; Sanders, G. S.

    1980-01-01

    Performance deficiencies in aerial liquid and dry dispersal systems are identified. Five control system concepts are explored: (1) end of field on/off control; (2) manual control of particle size and application rate from the aircraft; (3) manual control of deposit rate on the field; (4) automatic alarm and shut-off control; and (5) fully automatic control. Operational aspects of the concepts and specifications for improved control configurations are discussed in detail. A research plan to provide the technology needed to develop the proposed improvements is presented along with a flight program to verify the benefits achieved.

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - BAGHOUSE FILTRATION PRODUCTS - AIR PURATOR CORPORATION HUYGLAS 1405M FILTER SAMPLE

    EPA Science Inventory

    Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - BAGHOUSE FILTRATION PRODUCTS - BHA GROUP, INC. QG061 FILTER SAMPLE

    EPA Science Inventory

    Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - BAGHOUSE FILTRATION PRODUCTS - STANDARD FILTER CORPORATION PE16ZU FILTER SAMPLE

    EPA Science Inventory

    Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...

  4. USPAS | U.S. Particle Accelerator School

    Science.gov Websites

    The U.S. Particle Accelerator School (USPAS) provides education in beam physics and accelerator technology. Offerings include university-credit courses, the Joint International Accelerator School, and both university-style and symposium-style programs.

  5. Calculation of four-particle harmonic-oscillator transformation brackets

    NASA Astrophysics Data System (ADS)

    Germanas, D.; Kalinauskas, R. K.; Mickevičius, S.

    2010-02-01

    A procedure for precise calculation of the three- and four-particle harmonic-oscillator (HO) transformation brackets is presented. The analytical expressions of the four-particle HO transformation brackets are given. The computer code for the calculation of HO transformation brackets proves to be quick and efficient and produces results with small numerical uncertainties.

    Program summary. Program title: HOTB. Catalogue identifier: AEFQ_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFQ_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 1247. No. of bytes in distributed program, including test data, etc.: 6659. Distribution format: tar.gz. Programming language: FORTRAN 90. Computer: any computer with a FORTRAN 90 compiler. Operating system: Windows, Linux, FreeBSD, Tru64 Unix. RAM: 8 MB. Classification: 17.17. Nature of problem: calculation of the three- and four-particle harmonic-oscillator transformation brackets. Solution method: based on the compact expressions for the three-particle harmonic-oscillator brackets presented in [1] and the expressions for the four-particle harmonic-oscillator brackets presented in this paper. Restrictions: three- and four-particle harmonic-oscillator transformation brackets up to e = 28. Unusual features: possibility of calculating the four-particle harmonic-oscillator transformation brackets. Running time: less than one second for a single harmonic-oscillator transformation bracket. Reference: [1] G.P. Kamuntavičius, R.K. Kalinauskas, B.R. Barrett, S. Mickevičius, D. Germanas, Nuclear Physics A 695 (2001) 191.

  6. Jdpd: an open java simulation kernel for molecular fragment dissipative particle dynamics.

    PubMed

    van den Broek, Karina; Kuhn, Hubert; Zielesny, Achim

    2018-05-21

    Jdpd is an open Java simulation kernel for Molecular Fragment Dissipative Particle Dynamics with parallelizable force calculation, efficient caching options and fast property calculations. It is characterized by an interface and factory-pattern driven design for simple code changes and may help to avoid problems of polyglot programming. Detailed input/output communication, parallelization and process control as well as internal logging capabilities for debugging purposes are supported. The new kernel may be utilized in different simulation environments ranging from flexible scripting solutions up to fully integrated "all-in-one" simulation systems.
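
    Jdpd itself is a Java kernel; as a language-neutral illustration of the physics such a kernel evaluates, the sketch below computes the standard Groot-Warren DPD pair force (conservative, dissipative, and random parts) in Python. It is not Jdpd code, and the parameter values are the usual textbook defaults, assumed here.

      import numpy as np

      rng = np.random.default_rng(4)

      def dpd_pair_force(r_ij, v_ij, a=25.0, gamma=4.5, kT=1.0, dt=0.04, r_c=1.0):
          """Groot-Warren DPD force on particle i from j, given the separation
          vector r_ij and relative velocity v_ij (reduced units)."""
          r = np.linalg.norm(r_ij)
          if r >= r_c or r == 0.0:
              return np.zeros(3)
          e = r_ij / r
          w = 1.0 - r / r_c                        # weight function: w_R = w, w_D = w**2
          sigma = np.sqrt(2.0 * gamma * kT)        # fluctuation-dissipation relation
          f_c = a * w * e
          f_d = -gamma * w ** 2 * np.dot(e, v_ij) * e
          f_r = sigma * w * rng.standard_normal() * e / np.sqrt(dt)
          return f_c + f_d + f_r

      print(dpd_pair_force(np.array([0.5, 0.0, 0.0]), np.array([0.0, 0.1, 0.0])))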

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - BAGHOUSE FILTRATION PRODUCTS - W.L. GORE & ASSOCIATES, INC. L4347 FILTER SAMPLE

    EPA Science Inventory

    Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...

  8. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - BAGHOUSE FILTRATION PRODUCTS - ALBANY INTERNATIONAL CORP. INDUSTRIAL PROCESS TECHNOLOGIES PRIMATEX PLUS I FILTER SAMPLE

    EPA Science Inventory

    Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - BAGHOUSE FILTRATION PRODUCTS - BASF CORPORATION AX/BA-14/9-SAXP FILTER SAMPLE

    EPA Science Inventory

    Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...

  10. The Hubble Space Telescope Servicing Mission 3A Contamination Control Program

    NASA Technical Reports Server (NTRS)

    Hansen, Patricia A.

    2000-01-01

    After nearly 10 years on-orbit, the Hubble Space Telescope (HST) external thermal control materials and paint have degraded due to exposure to the low Earth orbit environment. This presented a potentially large on-orbit contamination source (particles and/or debris). Contamination mitigation techniques were developed to augment existing on-orbit servicing contamination controls. They encompassed mission management, crew training, and crew aids and tools. These techniques were successfully employed during the HST Servicing Mission 3A, December 1999.

  11. Analysis capabilities for plutonium-238 programs

    NASA Astrophysics Data System (ADS)

    Wong, A. S.; Rinehart, G. H.; Reimus, M. H.; Pansoy-Hjelvik, M. E.; Moniz, P. F.; Brock, J. C.; Ferrara, S. E.; Ramsey, S. S.

    2000-07-01

    This presentation gives an overview of the analysis capabilities that support 238Pu programs. These capabilities include neutron emission rate and calorimetric measurements, metallography/ceramography, ultrasonic examination, particle size determination, and chemical analyses. The data obtained from these measurements provide baseline parameters for fuel clad impact testing, fuel processing, product certifications, and waste disposal. Several in-line analysis capabilities will also be utilized for process control in the full-scale 238Pu Aqueous Scrap Recovery line in FY01.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    A test program to collect and analyze size-fractionated stack gas particulate samples for selected inorganic hazardous air pollutants (HAPs) was conducted. Specific goals of the program are (1) the collection of one-gram quantities of size-fractionated stack gas particulate matter for bulk (total) and surface chemical characterization, and (2) the determination of the relationship between particle size, bulk and surface (leachable) composition, and unit load. The information obtained from this program identifies the effects of unit load, particle size, and wet FGD system operation on the relative toxicological effects of exposure to particulate emissions. Field testing was conducted in two phases. The Phase I field program was performed over the period of August 24 through September 20, 1992, at the Tennessee Valley Authority Widows Creek Unit 8 Power Station, located near Stevenson (Jackson County), Alabama, on the Tennessee River. Sampling activities for Phase II were conducted from September 11 through October 14, 1993. Widows Creek Unit 8 is a 575-megawatt plant that uses bituminous coal averaging 3.7% sulfur and 13% ash. Downstream of the boiler, a venturi wet scrubbing system is used for control of both sulfur dioxide and particulate emissions. There is no electrostatic precipitator (ESP) in this system. This system is atypical and represents only about 5% of the US utility industry. However, this site was chosen for this study because of the lack of information available for this particulate emission control system.

  13. Colloquium: Toward living matter with colloidal particles

    NASA Astrophysics Data System (ADS)

    Zeravcic, Zorana; Manoharan, Vinothan N.; Brenner, Michael P.

    2017-07-01

    A fundamental unsolved problem is to understand the differences between inanimate matter and living matter. Although this question might be framed as philosophical, there are many fundamental and practical reasons to pursue the development of synthetic materials with the properties of living ones. There are three fundamental properties of living materials that we seek to reproduce: The ability to spontaneously assemble complex structures, the ability to self-replicate, and the ability to perform complex and coordinated reactions that enable transformations impossible to realize if a single structure acted alone. The conditions that are required for a synthetic material to have these properties are currently unknown. This Colloquium examines whether these phenomena could emerge by programming interactions between colloidal particles, an approach that bootstraps off of recent advances in DNA nanotechnology and in the mathematics of sphere packings. The argument is made that the essential properties of living matter could emerge from colloidal interactions that are specific—so that each particle can be programmed to bind or not bind to any other particle—and also time dependent—so that the binding strength between two particles could increase or decrease in time at a controlled rate. There is a small regime of interaction parameters that gives rise to colloidal particles with lifelike properties, including self-assembly, self-replication, and metabolism. The parameter range for these phenomena can be identified using a combinatorial search over the set of known sphere packings.

  14. UmUTracker: A versatile MATLAB program for automated particle tracking of 2D light microscopy or 3D digital holography data

    NASA Astrophysics Data System (ADS)

    Zhang, Hanqing; Stangner, Tim; Wiklund, Krister; Rodriguez, Alvaro; Andersson, Magnus

    2017-10-01

    We present a versatile and fast MATLAB program (UmUTracker) that automatically detects and tracks particles by analyzing video sequences acquired by either light microscopy or digital in-line holographic microscopy. Our program detects the 2D lateral positions of particles with an algorithm based on the isosceles triangle transform, and reconstructs their 3D axial positions by a fast implementation of the Rayleigh-Sommerfeld model using a radial intensity profile. To validate the accuracy and performance of our program, we first track the 2D position of polystyrene particles using bright field and digital holographic microscopy. Second, we determine the 3D particle position by analyzing synthetic and experimentally acquired holograms. Finally, to highlight the full program features, we profile the microfluidic flow in a 100 μm high flow chamber. This result agrees with computational fluid dynamic simulations. On a regular desktop computer UmUTracker can detect, analyze, and track multiple particles at 5 frames per second for a template size of 201 × 201 in a 1024 × 1024 image. To enhance usability and to make it easy to implement new functions, we used object-oriented programming. UmUTracker is suitable for studies related to particle dynamics, cell localization, colloids, and microfluidic flow measurement.

    Program Files doi: http://dx.doi.org/10.17632/fkprs4s6xp.1
    Licensing provisions: Creative Commons by 4.0 (CC by 4.0)
    Programming language: MATLAB
    Nature of problem: 3D multi-particle tracking is a common technique in physics, chemistry and biology. However, in terms of accuracy, reliable particle tracking is a challenging task since results depend on sample illumination, particle overlap, motion blur and noise from recording sensors. Additionally, computational performance is also an issue if, for example, a computationally expensive process is executed, such as axial particle position reconstruction from digital holographic microscopy data. Versatile, robust tracking programs that handle these concerns and provide powerful post-processing options are significantly limited.
    Solution method: UmUTracker is a multi-functional tool to extract particle positions from long video sequences acquired with either light microscopy or digital holographic microscopy. The program provides an easy-to-use graphical user interface (GUI) for both tracking and post-processing that does not require any programming skills to analyze data from particle tracking experiments. UmUTracker first conducts automatic 2D particle detection, even under noisy conditions, using a novel circle detector based on the isosceles triangle sampling technique with a multi-scale strategy. To reduce the computational load for 3D tracking, it uses an efficient implementation of the Rayleigh-Sommerfeld light propagation model. To analyze and visualize the data, an efficient data analysis step, which can for example show 4D flow visualization using 3D trajectories, is included. Additionally, UmUTracker is easy to modify with user-customized modules due to the object-oriented programming style.
    Additional comments: Program obtainable from https://sourceforge.net/projects/umutracker/
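
    The axial reconstruction step above amounts to numerically refocusing each hologram to a series of depths. A minimal sketch of that propagation step, using the angular-spectrum form of the Rayleigh-Sommerfeld integral in Python rather than UmUTracker's actual MATLAB implementation (pixel pitch, wavelength, and the focus search are illustrative assumptions):

```python
import numpy as np

def refocus(hologram, wavelength, dx, z):
    """Propagate a recorded hologram to axial distance z using the
    angular-spectrum form of the Rayleigh-Sommerfeld diffraction integral."""
    ny, nx = hologram.shape
    fx = np.fft.fftfreq(nx, d=dx)          # spatial frequencies, cycles/length
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    k = 2 * np.pi / wavelength
    arg = k**2 - (2 * np.pi * FX)**2 - (2 * np.pi * FY)**2
    kz = np.sqrt(np.maximum(arg, 0.0))     # keep propagating components only
    transfer = np.exp(1j * z * kz) * (arg > 0)
    return np.fft.ifft2(np.fft.fft2(hologram) * transfer)

# scan z, refocus, and take e.g. the depth that maximizes a sharpness metric
# (UmUTracker instead evaluates a radial intensity profile at each depth)
```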

  15. Implementation and performance of FDPS: a framework for developing parallel particle simulation codes

    NASA Astrophysics Data System (ADS)

    Iwasawa, Masaki; Tanikawa, Ataru; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro

    2016-08-01

    We present the basic idea, implementation, measured performance, and performance model of FDPS (Framework for Developing Particle Simulators). FDPS is an application-development framework which helps researchers to develop simulation programs using particle methods for large-scale distributed-memory parallel supercomputers. A particle-based simulation program for distributed-memory parallel computers needs to perform domain decomposition, exchange of particles which are not in the domain of each computing node, and gathering of the particle information in other nodes which is necessary for the interaction calculation. Also, even if distributed-memory parallel computers are not used, in order to reduce the amount of computation, algorithms such as the Barnes-Hut tree algorithm or the Fast Multipole Method should be used in the case of long-range interactions. For short-range interactions, some method to limit the calculation to neighbor particles is required. FDPS provides all of these functions, which are necessary for efficient parallel execution of particle-based simulations, as "templates" that are independent of the actual data structure of particles and the functional form of the particle-particle interaction. By using FDPS, researchers can write their programs with the amount of work necessary to write a simple, sequential and unoptimized program of O(N²) calculation cost, and yet the program, once compiled with FDPS, will run efficiently on large-scale parallel supercomputers. A simple gravitational N-body program can be written in around 120 lines. We report the actual performance of these programs and the performance model. The weak-scaling performance is very good, and almost linear speed-up was obtained for up to the full system of the K computer. The minimum calculation time per timestep is in the range of 30 ms (N = 10⁷) to 300 ms (N = 10⁹). These are currently limited by the time for the calculation of the domain decomposition and the communication necessary for the interaction calculation. We discuss how we can overcome these bottlenecks.
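
    For reference, the simple, sequential O(N²) program that a user effectively writes looks like the following sketch (Python, G = 1 units, with the softening length as an illustrative parameter); a framework of this kind then supplies the domain decomposition, particle exchange, and tree algorithms around such a pairwise interaction kernel:

```python
import numpy as np

def direct_gravity(pos, mass, eps=1e-3):
    """Direct O(N^2) summation of softened gravitational accelerations (G = 1).
    pos: (N, 3) positions, mass: (N,) masses, eps: softening length."""
    n = len(mass)
    acc = np.zeros_like(pos)
    for i in range(n):
        dr = pos - pos[i]                       # vectors to all other particles
        r2 = np.sum(dr * dr, axis=1) + eps**2   # softened squared distances
        r2[i] = np.inf                          # exclude self-interaction
        acc[i] = np.sum((mass / r2**1.5)[:, None] * dr, axis=0)
    return acc

# usage: acc = direct_gravity(np.random.rand(1000, 3), np.ones(1000))
```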

  16. KENNEDY SPACE CENTER, FLA. - A KSC employee uses a clean-air shower before entering a clean room. Streams of pressurized air directed at the occupant from nozzles in the chamber's ceiling and walls are designed to dislodge particulate matter from hair, clothing and shoes. The adhesive mat on the floor captures soil from shoe soles, as well as particles that fall on its surface. Particulate matter has the potential to contaminate the space flight hardware being stored or processed in the clean room. The shower is part of KSC's Foreign Object Debris (FOD) control program, an important safety initiative.

    NASA Image and Video Library

    2003-08-29

    KENNEDY SPACE CENTER, FLA. - A KSC employee uses a clean-air shower before entering a clean room. Streams of pressurized air directed at the occupant from nozzles in the chamber's ceiling and walls are designed to dislodge particulate matter from hair, clothing and shoes. The adhesive mat on the floor captures soil from shoe soles, as well as particles that fall on its surface. Particulate matter has the potential to contaminate the space flight hardware being stored or processed in the clean room. The shower is part of KSC's Foreign Object Debris (FOD) control program, an important safety initiative.

  17. Layer-by-layer charging in non-volatile memory devices using embedded sub-2 nm platinum nanoparticles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramalingam, Balavinayagam; Zheng, Haisheng; Gangopadhyay, Shubhra, E-mail: gangopadhyays@missouri.edu

    In this work, we demonstrate multi-level operation of a non-volatile memory metal oxide semiconductor capacitor by controlled layer-by-layer charging of platinum nanoparticle (PtNP) floating gate devices with defined gate voltage bias ranges. The device consists of two layers of ultra-fine, sub-2 nm PtNPs integrated between Al2O3 tunneling and separation layers. PtNP size and interparticle distance were varied to control the particle self-capacitance and associated Coulomb charging energy. Likewise, the tunneling layer thicknesses were also varied to control electron tunneling to the first and second PtNP layers. The final device configuration with optimal charging behavior and multi-level programming was attained with a 3 nm Al2O3 initial tunneling layer, an initial PtNP layer with particle size 0.54 ± 0.12 nm and interparticle distance 4.65 ± 2.09 nm, a 3 nm Al2O3 layer to separate the PtNP layers, and a second particle layer with 1.11 ± 0.28 nm PtNP size and interparticle distance 2.75 ± 1.05 nm. In this device, the memory window of the first PtNP layer saturated over a programming bias range of 7 V to 14 V, after which the second PtNP layer starts charging, exhibiting a multi-step memory window with layer-by-layer charging.

  18. DNA Origami Patterned Colloids for Programmed Design and Chirality

    NASA Astrophysics Data System (ADS)

    Ben Zion, Matan Yah; He, Xiaojin; Maass, Corinna; Sha, Ruojie; Seeman, Ned; Chaikin, Paul

    Micron-sized colloidal particles are scientifically important as model systems for equilibrium and active systems in physics, chemistry, and biology, and for technologies ranging from catalysis to photonics. The past decade has seen development of new particles with directional patches, lock-and-key reactions, and specific recognition that guide assembly of structures such as complex crystalline arrays. What remains lacking is the ability to self-assemble structures of arbitrary shape with specific chirality, placement, and orientation of neighbors. Here we demonstrate the adaptation of DNA origami nanotechnology to the micron colloidal scale with designed control of neighbor type, placement, and dihedral angle. We use DNA origami belts with programmed flexibility and functionality to pattern colloidal surfaces and bind particles to specific sites at specific angles, making uniquely right-handed or left-handed structures. The hybrid DNA origami colloid technology should allow the synthesis of designed functional, structural, and active materials. This work was supported as part of the Center for Bio-Inspired Energy Science, an Energy Frontier Research Center funded by the U.S. Department of Energy, Office of Science, Basic Energy Sciences under Award # DE-SC0000989.

  19. Burn Control in Fusion Reactors via Isotopic Fuel Tailoring

    NASA Astrophysics Data System (ADS)

    Boyer, Mark D.; Schuster, Eugenio

    2011-10-01

    The control of plasma density and temperature is among the most fundamental problems in fusion reactors and will be critical to the success of burning plasma experiments like ITER. Economic and technological constraints may require future commercial reactors to operate with low temperature, high-density plasma, for which the burn condition may be unstable. An active control system will be essential for stabilizing such operating points. In this work, a volume-averaged transport model for the energy and the densities of deuterium and tritium fuel ions, as well as the alpha particles, is used to synthesize a nonlinear feedback controller for stabilizing the burn condition. The controller makes use of ITER's planned isotopic fueling capability and controls the densities of these ions separately. The ability to modulate the DT fuel mix is exploited in order to reduce the fusion power during thermal excursions without the need for impurity injection. By moving the isotopic mix in the plasma away from the optimal 50:50 mix, the reaction rate is slowed and the alpha-particle heating is reduced to desired levels. Supported by the NSF CAREER award program (ECCS-0645086).
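
    A heavily simplified volume-averaged (0-D) balance of the kind such controllers are designed against might be sketched as follows (Python; the reactivity fit, confinement times, and the fixed temperature in the reaction-rate term are illustrative placeholders, not the model or control law from this work):

```python
import numpy as np

Q_ALPHA = 5.6e-13   # alpha-particle energy per DT fusion, joules (3.5 MeV)

def sigma_v(t_kev):
    """Placeholder DT reactivity <sigma*v> in m^3/s; a real model would use a
    fitted parameterization rather than this toy power law."""
    return 1.1e-24 * t_kev**2

def burn_rhs(state, tau_e=3.0, tau_alpha=10.0, p_aux=0.0):
    """Volume-averaged balances for deuterium, tritium, alpha density, and
    energy density; the (omitted) fueling terms are where isotopic tailoring
    would act as the control input."""
    n_d, n_t, n_alpha, energy = state
    rate = n_d * n_t * sigma_v(10.0)    # fusion reactions per m^3 per s (fixed T here)
    return np.array([
        -rate,                                    # deuterium burn-up (fueling omitted)
        -rate,                                    # tritium burn-up (fueling omitted)
        rate - n_alpha / tau_alpha,               # alpha build-up and removal
        Q_ALPHA * rate + p_aux - energy / tau_e,  # alpha heating vs. transport loss
    ])

# integrate with any ODE solver; a feedback law would adjust the D/T fueling
# terms to steer the mix away from 50:50 during a thermal excursion
```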

  20. High-temperature LDV seed particle development

    NASA Technical Reports Server (NTRS)

    Frish, Michael B.; Pierce, Vicky G.

    1989-01-01

    The feasibility of developing a method for making monodisperse, unagglomerated spherical particles greater than 50 nm in diameter was demonstrated. Carbonaceous particles were made by pyrolyzing ethylene with a pulsed CO2 laser, thereby creating a non-equilibrium mixture of carbon, hydrogen, hydrocarbon vapors, and unpyrolyzed ethylene. Via a complex series of reactions, the carbon and hydrocarbon vapors quickly condensed into the spherical particles. By cooling and dispersing them in a supersonic expansion immediately after their creation, the hot newly-formed spheres were prevented from colliding and coalescing, thus preventing the problem of agglomeration which has plagued other investigators studying laser-stimulated particle formation. The cold particles could be left suspended in the residual gases indefinitely without agglomerating. Their uniform sizes and unagglomerated nature were visualized by collecting the particles on filters that were subsequently examined using electron microscopy. It was found that the mean particle size can be coarsely controlled by varying the initial ethylene pressure, and can be finely controlled by varying the fluence (energy/unit area) with which the laser irradiates the gas. The motivating application for this research was to manufacture particles that could be used as laser Doppler velocimetry (LDV) seeds in high-temperature high-speed flows. Though the particles made in this program will not evaporate until heated to about 3000 K, and thus could serve as LDV seeds in some applications, they are not ideal when the hot atmosphere is also oxidizing. In that situation, ceramic materials would be preferable. Research performed elsewhere has demonstrated that selected ceramic materials can be manufactured by laser pyrolysis of appropriate supply gases. It is anticipated that, when the same gases are used in conjunction with the rapid cooling technique, unagglomerated spherical ceramic particles can be made with little difficulty. Such particles would also be valuable to manufacturers of ceramic or abrasive products, and this technique may find its greatest commercial potential in those areas.

  1. High-temperature LDV seed particle development

    NASA Astrophysics Data System (ADS)

    Frish, Michael B.; Pierce, Vicky G.

    1989-05-01

    The feasibility of developing a method for making monodisperse, unagglomerated spherical particles greater than 50 nm in diameter was demonstrated. Carbonaceous particles were made by pyrolyzing ethylene with a pulsed CO2 laser, thereby creating a non-equilibrium mixture of carbon, hydrogen, hydrocarbon vapors, and unpyrolyzed ethylene. Via a complex series of reactions, the carbon and hydrocarbon vapors quickly condensed into the spherical particles. By cooling and dispersing them in a supersonic expansion immediately after their creation, the hot newly-formed spheres were prevented from colliding and coalescing, thus preventing the problem of agglomeration which has plagued other investigators studying laser-stimulated particle formation. The cold particles could be left suspended in the residual gases indefinitely without agglomerating. Their uniform sizes and unagglomerated nature were visualized by collecting the particles on filters that were subsequently examined using electron microscopy. It was found that the mean particle size can be coarsely controlled by varying the initial ethylene pressure, and can be finely controlled by varying the fluence (energy/unit area) with which the laser irradiates the gas. The motivating application for this research was to manufacture particles that could be used as laser Doppler velocimetry (LDV) seeds in high-temperature high-speed flows. Though the particles made in this program will not evaporate until heated to about 3000 K, and thus could serve as LDV seeds in some applications, they are not ideal when the hot atmosphere is also oxidizing. In that situation, ceramic materials would be preferable. Research performed elsewhere has demonstrated that selected ceramic materials can be manufactured by laser pyrolysis of appropriate supply gases. It is anticipated that, when the same gases are used in conjunction with the rapid cooling technique, unagglomerated spherical ceramic particles can be made with little difficulty. Such particles would also be valuable to manufacturers of ceramic or abrasive products, and this technique may find its greatest commercial potential in those areas.

  2. 77 FR 3681 - Approval and Promulgation of Air Quality Implementation Plans; Minnesota; Regional Haze

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-25

    ... approach in the BART Guidelines in making a BART determination for a fossil fuel-fired EGU with total... ammonia (NH3), and volatile organic compounds (VOCs). Fine particle precursors react in the atmosphere to... standards, low sulfur fuel, and non-road mobile source control programs. The fourth step is to determine...

  3. Effects of Retrofitting Emission Control Systems on all In-Use Heavy Diesel Trucks

    NASA Astrophysics Data System (ADS)

    Millstein, D.; Harley, R. A.

    2009-12-01

    Diesel exhaust is now the largest source of nitrogen oxide (NOx) emissions nationally in the US, and contributes significantly to emissions of fine particulate black carbon (soot) as well. New national standards call for dramatically lower emissions of exhaust particulate matter (PM) and NOx from new diesel engines starting in 2007 and 2010, respectively. Unfortunately it will take decades for the cleaner new engines to replace those currently in service on existing heavy-duty trucks. The state of California recently adopted a rule to accelerate fleet turnover in the heavy-duty truck sector, requiring that all in-use trucks meet the new exhaust PM standards by 2014. This will entail retrofit of diesel particle filters or replacement for over a million existing diesel engines. Diesel particle filters can replace the muffler on existing trucks, and there is extensive experience with retrofit of this control equipment on public sector fleets such as diesel-powered transit buses. Nitrogen dioxide (NO2) is used as an oxidizing agent to remove carbon particles from the particle filter, to prevent it from becoming plugged. To create the needed NO2, NOx already present in engine exhaust as nitric oxide (NO) is deliberately oxidized to NO2 upstream of the particle filter using a platinum catalyst. The NO2/NOx ratio in exhaust emissions therefore increases to ~35% in comparison to much lower values (~5%) typical of older engines without particle filters. We evaluate the effects on air quality of increased use of diesel particle traps and NOx controls in southern California using the Community Multiscale Air Quality (CMAQ) model. Compared to a reference scenario without the retrofit program, we found black carbon concentrations decreased by ~20%, with small increases (4%) in ambient ozone concentrations. During summer, average NO2 concentrations decrease despite the increase in primary NO2 emissions - because total NOx emissions are reduced as part of a parallel but more gradual program to retrofit NOx control systems on in-use engines. During winter, NO2 concentrations increase by 1-2% at locations with high diesel truck traffic, and larger increases may occur if diesel trucks outfitted with particle traps do not meet the in-use NOx emission reduction requirements. Small changes to fine particulate nitrate are seen as well with increases over the Los Angeles area of 3 and 6% during the summer and fall, respectively. During the summer, but not the fall, downwind nitrate decreased by 2% east of Los Angeles near Riverside. Emissions reductions due to fleet turnover in the reference scenario (without retrofit) may be optimistic, and the air quality benefits of retrofits could therefore be understated, due to slow sales of new engines in recent years. In any case, significant changes in diesel engine emissions of NOx and PM are expected to occur over the next 5 years in California.

  4. Comprehensive Analysis of Established Dyslipidemia-Associated Loci in the Diabetes Prevention Program

    PubMed Central

    Varga, Tibor V.; Winters, Alexandra H.; Jablonski, Kathleen A.; Horton, Edward S.; Khare-Ranade, Prajakta; Knowler, William C.; Marcovina, Santica M.; Renström, Frida; Watson, Karol E.; Goldberg, Ronald; Florez, José C.

    2016-01-01

    Background We assessed whether 234 established dyslipidemia-associated loci modify the effects of metformin treatment and lifestyle intervention (vs. placebo control) on lipid and lipid sub-fraction levels in the Diabetes Prevention Program (DPP) randomized controlled trial. Methods and Results We tested gene-treatment interactions in relation to baseline adjusted follow-up blood lipid concentrations (high and low density lipoprotein cholesterol [HDL-C, LDL-C], total cholesterol, triglycerides) and lipoprotein sub-fraction particle concentrations and size in 2,993 participants with pre-diabetes. Of the previously reported SNP associations, 32.5% replicated at P < 0.05 with baseline lipid traits. Trait-specific genetic risk scores (GRS) were robustly associated (3×10⁻⁴ > P > 1.1×10⁻¹⁶) with their respective baseline traits for all but two traits. Lifestyle modified the effect of the GRS for large HDL particle numbers, such that each risk allele of the GRS_HDL-large was associated with lower concentrations of large HDL particles at follow-up in the lifestyle arm (β = −0.11 μmol/l per GRS risk allele; 95% CI −0.188, −0.033; P = 5×10⁻³; P_interaction = 1×10⁻³ for lifestyle vs. placebo), but not in the metformin or placebo arms (P > 0.05). In the lifestyle arm, participants with high genetic risk had more favorable or similar trait levels at 1 yr compared to participants at lower genetic risk at baseline for 17 of the 20 traits. Conclusions Improvements in large HDL particle concentrations conferred by lifestyle may be diminished by genetic factors. Lifestyle intervention, however, was successful in offsetting unfavorable genetic loading for most lipid traits. PMID:27784733

  5. Dynamic cross correlation studies of wave particle interactions in ULF phenomena

    NASA Technical Reports Server (NTRS)

    Mcpherron, R. L.

    1979-01-01

    Magnetic field observations made by satellites in the earth's magnetic field reveal a wide variety of ULF waves. These waves interact with the ambient particle populations in complex ways, causing modulation of the observed particle fluxes. This modulation is found to be a function of species, pitch angle, energy, and time. The characteristics of this modulation provide information concerning the wave mode and interaction process. One important characteristic of wave-particle interactions is the phase of the particle flux modulation relative to the magnetic field variations. To display this phase as a function of time, a dynamic cross-spectrum program has been developed. The program produces contour maps in the frequency-time plane of the cross-correlation coefficient between any particle flux time series and the magnetic field vector. This program has been utilized in several studies of ULF wave-particle interactions at synchronous orbit.
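
    A minimal sketch of the underlying computation, using SciPy's short-time Fourier transform in Python rather than the original program (sampling rate and window length are illustrative assumptions): the particle-flux and magnetic-field series are transformed, multiplied, and the phase of the resulting cross-spectrum is contoured in the frequency-time plane.

```python
import numpy as np
from scipy.signal import stft

def cross_spectrum_phase(flux, b_field, fs=1.0, nperseg=256):
    """Return frequency bins, time bins, and the cross-spectral phase (rad)
    between a particle-flux series and one magnetic-field component."""
    f, t, X = stft(flux, fs=fs, nperseg=nperseg)
    _, _, Y = stft(b_field, fs=fs, nperseg=nperseg)
    cross = X * np.conj(Y)           # complex cross-spectrum at each (f, t)
    return f, t, np.angle(cross)     # phase of flux modulation relative to B

# contour the returned phase over the (t, f) plane to map wave-particle timing
```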

  6. Proceedings of the International Congress (12th), Corrosion Control for Low-Cost Reliability, Held in Houston, Texas on September 19 -24, 1993. Volume 1. Coatings

    DTIC Science & Technology

    1993-09-24

    Gas-cooled reactors were first developed in Europe and have been built since 1956. HTGR, equipped with a core of ceramic coated particle fuels ... demands must also be covered by nuclear energy in the not-so-long future. Programs on developing the process-heating HTGR have been promoted mainly in Germany ... Material programs for HTGR have been promoted in several countries since the late 1960s, which include the tasks of developing and qualifying materials, e.g.

  7. KSC-2009-1505

    NASA Image and Video Library

    2009-02-03

    CAPE CANAVERAL, Fla. – Mike Curie (left), with NASA Public Affairs, introduces NASA managers following their day-long Flight Readiness Review of space shuttle Discovery for the STS-119 mission. Next to Curie are (from left) William H. Gerstenmaier, associate administrator for Space Operations, John Shannon, Shuttle Program manager, Mike Suffredini, program manager for the International Space Station, and Mike Leinbach, shuttle launch director. NASA managers decided to plan a launch no earlier than Feb. 19, pending additional analysis and particle impact testing associated with a flow control valve in the shuttle's main engine system. Photo credit: NASA/Cory Huston

  8. Magnetic agglomeration method for size control in the synthesis of magnetic nanoparticles

    DOEpatents

    Huber, Dale L. [Albuquerque, NM]

    2011-07-05

    A method for controlling the size of chemically synthesized magnetic nanoparticles that employs magnetic interaction between particles, rather than conventional kinetic control of the reaction, to set particle size. The particles are caused to reversibly agglomerate and precipitate from solution; the size at which this occurs can be well controlled to provide a very narrow particle size distribution. The particle size is controllable through the size of the surfactant employed in the process; controlling the surfactant size allows magnetic control of the agglomeration and precipitation processes. Agglomeration is used to effectively stop particle growth, providing a very narrow range of particle sizes.

  9. Twisting Neutron Waves

    NASA Astrophysics Data System (ADS)

    Pushin, Dmitry

    Most waves encountered in nature can be given a "twist", so that their phase winds around an axis parallel to the direction of wave propagation. Such waves are said to possess orbital angular momentum (OAM). For quantum particles such as photons, atoms, and electrons, this corresponds to the particle wavefunction having angular momentum of Lℏ along its propagation axis. Controlled generation and detection of OAM states of photons began in the 1990s, sparking considerable interest in applications of OAM in light and matter waves. OAM states of photons have found diverse applications such as broadband data multiplexing, massive quantum entanglement, optical trapping, microscopy, quantum state determination and teleportation, and interferometry. OAM states of electron beams have been used to rotate nanoparticles, determine the chirality of crystals, and for magnetic microscopy. Here I discuss the first demonstration of OAM control of neutrons. Using neutron interferometry with a spatially incoherent input beam, we show the addition and conservation of quantum angular momenta and entanglement between the quantum path and OAM degrees of freedom. Neutron-based quantum information science, heretofore limited to spin, path, and energy degrees of freedom, now has access to another quantized variable, and OAM modalities of light, x-ray, and electron beams are extended to a massive, penetrating neutral particle. The methods of neutron phase imprinting demonstrated here expand the toolbox available for development of phase-sensitive techniques of neutron imaging. Financial support provided by the NSERC Create and Discovery programs, CERC, and the NIST Quantum Information Program is acknowledged.

  10. Effects of diesel particle filter retrofits and accelerated fleet turnover on drayage truck emissions at the Port of Oakland.

    PubMed

    Dallmann, Timothy R; Harley, Robert A; Kirchstetter, Thomas W

    2011-12-15

    Heavy-duty diesel drayage trucks have a disproportionate impact on the air quality of communities surrounding major freight-handling facilities. In an attempt to mitigate this impact, the state of California has mandated new emission control requirements for drayage trucks accessing ports and rail yards in the state beginning in 2010. This control rule prompted an accelerated diesel particle filter (DPF) retrofit and truck replacement program at the Port of Oakland. The impact of this program was evaluated by measuring emission factor distributions for diesel trucks operating at the Port of Oakland prior to and following the implementation of the emission control rule. Emission factors for black carbon (BC) and oxides of nitrogen (NO(x)) were quantified in terms of grams of pollutant emitted per kilogram of fuel burned using a carbon balance method. Concentrations of these species along with carbon dioxide were measured in the exhaust plumes of individual diesel trucks as they drove by en route to the Port. A comparison of emissions measured before and after the implementation of the truck retrofit/replacement rule shows a 54 ± 11% reduction in the fleet-average BC emission factor, accompanied by a shift to a more highly skewed emission factor distribution. Although only particulate matter mass reductions were required in the first year of the program, a significant reduction in the fleet-average NO(x) emission factor (41 ± 5%) was observed, most likely due to the replacement of older trucks with new ones.
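
    The carbon balance calculation can be sketched as follows (Python; the diesel carbon mass fraction of roughly 0.87 and the neglect of CO relative to CO2 in the carbon balance are illustrative assumptions rather than values taken from this study):

```python
def fuel_based_emission_factor(delta_pollutant, delta_co2_carbon, w_c=0.87):
    """Fuel-based emission factor in g pollutant per kg fuel burned.
    delta_pollutant: background-subtracted pollutant concentration integrated
        over the plume (e.g. ug/m^3 * s).
    delta_co2_carbon: background-subtracted CO2 over the same interval,
        expressed as carbon mass (ug C/m^3 * s).
    w_c: assumed carbon mass fraction of diesel fuel (kg C per kg fuel)."""
    grams_per_gram_carbon = delta_pollutant / delta_co2_carbon
    return grams_per_gram_carbon * w_c * 1000.0  # convert to per kg of fuel

# e.g. a BC plume integral of 50 against a CO2-carbon integral of 1.0e5
# gives 50 / 1.0e5 * 0.87 * 1000 ~ 0.44 g BC per kg fuel
```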

  11. CRRES Combined Release and Radiation Effects Satellite program

    NASA Technical Reports Server (NTRS)

    Giles, B. L. (Compiler); Mccook, M. A. (Compiler); Mccook, M. W. (Compiler); Miller, G. P. (Compiler)

    1995-01-01

    The various regions of the magnetosphere-ionosphere system are coupled by flows of charged particle beams and electromagnetic waves. This coupling gives rise to processes that affect both technical and non-technical aspects of life on Earth. The CRRES Program sponsored experiments which were designed to produce controlled and known input to the space environment and the effects were measured with arrays of diagnostic instruments. Large amounts of material were used to modify and perturb the environment in a controlled manner, and response to this was studied. The CRRES and PEGSAT satellites were dual-mission spacecraft with a NASA mission to perform active chemical-release experiments, grouped into categories of tracer, modification, and simulation experiments. Two sounding rocket chemical release campaigns completed the study.

  12. Erosion in radial inflow turbines. Volume 5: Computer programs for tracing particle trajectories

    NASA Technical Reports Server (NTRS)

    Clevenger, W. B., Jr.; Tabakoff, W.

    1975-01-01

    Computer programs used to study the trajectories of particles in radial inflow turbines are presented. The general technique of each program is described. A set of subroutines developed during the study is described. Descriptions, listings, and typical examples of each of the main programs are included.
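
    The general technique of such trajectory programs, integrating a drag-driven equation of motion for each particle through a known gas flow field, can be sketched as follows (Python; the linear Stokes-drag law, response time, time step, and flow-field function are illustrative assumptions rather than the models used in the original programs):

```python
import numpy as np

def trace_particle(x0, v0, flow_velocity, tau_p, dt=1e-5, steps=2000):
    """Integrate dx/dt = v, dv/dt = (u(x) - v)/tau_p with forward Euler.
    x0, v0: initial position/velocity (3-vectors); flow_velocity: callable
    returning the local gas velocity; tau_p: particle response time."""
    x, v = np.array(x0, float), np.array(v0, float)
    path = [x.copy()]
    for _ in range(steps):
        u = flow_velocity(x)           # local gas velocity (user-supplied model)
        v = v + dt * (u - v) / tau_p   # drag accelerates particle toward gas velocity
        x = x + dt * v
        path.append(x.copy())
    return np.array(path)

# usage with a uniform toy flow:
# trace_particle([0.1, 0, 0], [0, 0, 0], lambda x: np.array([0.0, 50.0, 0.0]), 1e-4)
```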

  13. Tuning selectivity in catalysis by controlling particle shape

    NASA Astrophysics Data System (ADS)

    Lee, Ilkeun; Delbecq, Françoise; Morales, Ricardo; Albiter, Manuel A.; Zaera, Francisco

    2009-02-01

    A catalytic process for the selective formation of cis olefins would help minimize the production of unhealthy trans fats during the partial hydrogenation of edible oils. Here we report on the design of such a process on the basis of studies with model systems. Temperature programmed desorption data on single crystals showed that the isomerization of trans olefins to their cis counterparts is promoted by (111) facets of platinum, and that such selectivity is reversed on more open surfaces. Quantum mechanics calculations suggested that the extra stability of cis olefins seen on hydrogen-saturated Pt(111) surfaces may be due to a lesser degree of surface reconstruction, a factor found to be significant in the adsorption on close-packed platinum surfaces. Kinetic data using catalysts made out of dispersed tetrahedral Pt nanoparticles corroborated the selective promotion of the trans-to-cis isomerization on the (111) facets of the metal. Our work provides an example for how catalytic selectivity may be controlled by controlling the shape of the catalytic particles.

  14. Bidirectional quantum teleportation of unknown photons using path-polarization intra-particle hybrid entanglement and controlled-unitary gates via cross-Kerr nonlinearity

    NASA Astrophysics Data System (ADS)

    Heo, Jino; Hong, Chang-Ho; Lim, Jong-In; Yang, Hyung-Jin

    2015-05-01

    We propose an arbitrary controlled-unitary (CU) gate and a bidirectional quantum teleportation (BQTP) scheme. The proposed CU gate utilizes photonic qubits (photons) with cross-Kerr nonlinearities (XKNLs), X-homodyne detectors, and linear optical elements, and consists of the consecutive operation of a controlled-path (C-path) gate and a gathering-path (G-path) gate. It is almost deterministic and feasible with current technology when a strong coherent state and weak XKNLs are employed. Based on the CU gate, we present a BQTP scheme that simultaneously teleports two unknown photons between distant users by transmitting only one photon in a path-polarization intra-particle hybrid entangled state. Consequently, it is possible to experimentally implement BQTP with a certain success probability using the proposed CU gate. Project supported by the Ministry of Science, ICT&Future Planning, Korea, under the C-ITRC (Convergence Information Technology Research Center) Support program (NIPA-2013-H0301-13-3007) supervised by the National IT Industry Promotion Agency.

  15. Predictable Particle Engineering: Programming the Energy Level, Carrier Generation, and Conductivity of Core-Shell Particles.

    PubMed

    Yuan, Conghui; Wu, Tong; Mao, Jie; Chen, Ting; Li, Yuntong; Li, Min; Xu, Yiting; Zeng, Birong; Luo, Weiang; Yu, Lingke; Zheng, Gaofeng; Dai, Lizong

    2018-06-20

    Core-shell structures are of particular interest in the development of advanced composite materials as they can efficiently bring different components together at the nanoscale. The advantage of this structure relies greatly on the careful design of both core and shell, thus achieving an intercomponent synergistic effect. In this report, we show that decorating semiconductor nanocrystals with a boronate polymer shell can easily achieve programmable core-shell interactions. Taking ZnO and anatase TiO2 nanocrystals as inner-core examples, the effective core-shell interactions can narrow the band gap of the semiconductor nanocrystals, change the HOMO and LUMO levels of the boronate polymer shell, and significantly improve the carrier density of the core-shell particles. The hole mobility of the core-shell particles can be improved by almost 9 orders of magnitude in comparison with the neat boronate polymer, while the conductivity of the core-shell particles is at most 30-fold that of the nanocrystals. The particle engineering strategy is based on two driving forces, catechol-surface binding and B-N dative bonding, and offers a high ability to control and predict the shell thickness. This approach is also applicable to various inorganic nanoparticles with different components, sizes, and shapes.

  16. Simulation of mixture microstructures via particle packing models and their direct comparison with real mixtures

    NASA Astrophysics Data System (ADS)

    Gulliver, Eric A.

    The objective of this thesis is to identify and develop techniques providing direct comparison between simulated and real packed-particle mixture microstructures containing submicron-sized particles. This entailed devising techniques for simulating powder mixtures, producing real mixtures with known powder characteristics, sectioning real mixtures, interrogating mixture cross-sections, evaluating and quantifying the mixture interrogation process, and comparing interrogation results between mixtures. A drop-and-roll-type particle-packing model was used to generate simulations of random mixtures. The simulated mixtures were then evaluated to establish that they were not segregated and were free from gross defects. A powder processing protocol was established to provide real mixtures for direct comparison and for use in evaluating the simulation. The powder processing protocol was designed to minimize differences between measured particle size distributions and the particle size distributions in the mixture. A sectioning technique was developed that was capable of producing distortion-free cross-sections of fine-scale particulate mixtures. Tessellation analysis was used to interrogate mixture cross-sections, and statistical quality control charts were used to evaluate different types of tessellation analysis and to establish the importance of differences between simulated and real mixtures. The particle-packing program generated crescent-shaped pores below large particles but otherwise realistic-looking mixture microstructures. Focused ion beam milling was the only technique capable of sectioning particle compacts in a manner suitable for stereological analysis. Johnson-Mehl and Voronoi tessellation of the same cross-sections produced tessellation tiles with different tile-area populations. Control chart analysis showed that Johnson-Mehl tessellation measurements are superior to Voronoi tessellation measurements for detecting variations in mixture microstructure, such as altered particle-size distributions or mixture composition. Control charts based on tessellation measurements were used for direct, quantitative comparisons between real and simulated mixtures. Four sets of simulated and real mixtures were examined. Data from a real mixture matched the simulated data when the samples were well mixed and the particle size distributions and volume fractions of the components were identical. Analysis of mixture components that occupied less than approximately 10 vol% of the mixture was not practical unless the particle size of the component was extremely small and excellent-quality, high-resolution compositional micrographs of the real sample were available. These methods of analysis should allow future researchers to systematically evaluate and predict the impact and importance of variables such as component volume fraction and component particle size distribution as they pertain to the uniformity of powder mixture microstructures.
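
    The tessellation-based interrogation can be sketched as follows (Python with SciPy as a stand-in for the tools used in the thesis; taking detected particle centers in a cross-section as the generating points and discarding unbounded edge cells is an illustrative simplification):

```python
import numpy as np
from scipy.spatial import Voronoi, ConvexHull

def voronoi_tile_areas(points):
    """Return the areas of the bounded Voronoi tiles generated by 2-D points
    (e.g. particle centers detected in a sectioned mixture)."""
    vor = Voronoi(points)
    areas = []
    for region_index in vor.point_region:
        region = vor.regions[region_index]
        if len(region) == 0 or -1 in region:
            continue                      # skip unbounded cells at the field edge
        polygon = vor.vertices[region]    # Voronoi cells are convex polygons
        areas.append(ConvexHull(polygon).volume)  # 2-D "volume" is the area
    return np.array(areas)

# usage: areas = voronoi_tile_areas(np.random.rand(200, 2))
```

    Control-chart limits (for example, the mean tile area ± 3 standard deviations per field) can then flag cross-sections whose tile-area population departs from the simulated reference.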

  17. Wave Propagation in 2-D Granular Matrix and Dust Mitigation of Fabrics for Space Exploration Mission

    NASA Technical Reports Server (NTRS)

    Thanh, Phi Hung X.

    2004-01-01

    Wave propagation studies are essential to exploring the soil on Mars or the Moon, and dust mitigation is a necessity for crew health on exploration missions. Dust mitigation has a significant impact on crew health when astronauts track dust back into their living space after exploration trips. We are trying to use piezoelectric fibers to create waves and vibrations at certain critical frequencies and amplitudes so that we can shake particles off the astronauts' fabrics. By shaking off and removing the dust, the astronauts no longer have to worry about breathing in small and possibly hazardous materials when they are back in their living quarters. The Wave Propagation in 2-D Granular Matrix experiment studies how individual particles interact with each other when a pressure wave travels through the matrix. By knowing the details of how particles interact when they act as a medium for waves, we can better understand how waves propagate through soils and other materials. With this experiment, we can study how reduced gravity affects wave propagation and hence devise a way to study soils in space and on the Moon or Mars. Some scientists treat the medium that waves travel through as a "black box" and do not pay much attention to how individual particles behave as a wave travels through them. With these data, I believe we can model ways to measure the properties of different materials, such as density and composition. In order to study how the particles interact with each other, I have continued Juan Agui's experiment on the effects of impacts on a 2-D matrix. By controlling the inputs and measuring the outputs of the system, I will be able to study how the particles in that system interact with each other. I will also try to model this with the software PFC2D in order to obtain theoretical data to compare with the experiment. PFC2D is a program that allows the user to control the number of particles, their characteristics, and the environment of the particles, so I can run simulations that mimic the impulse test. The software uses a scripting language called FISH, which means that models are built from the command terminal rather than through a GUI. I will also use this program to simulate the Moon/Mars simulant adhering to fabric for the Dust Mitigation project. My goals for this summer are to complete preliminary studies of the feasibility of the shaking fabric, learn the PFC2D program, and complete building and testing the wave propagation experiment.

  18. Lepton identification at particle flow oriented detector for the future e+e- Higgs factories

    NASA Astrophysics Data System (ADS)

    Yu, Dan; Ruan, Manqi; Boudry, Vincent; Videau, Henri

    2017-09-01

    Lepton identification is essential for the physics programs at the high-energy frontier, especially for the precise measurement of the Higgs boson. For this purpose, a Toolkit for Multivariate Data Analysis (TMVA) based lepton identification algorithm (LICH) has been developed for detectors using high-granularity calorimeters. Using the conceptual detector geometry for the Circular Electron-Positron Collider (CEPC) and single charged-particle samples with energy larger than 2 GeV, LICH identifies electrons/muons with efficiencies higher than 99.5% and keeps the rate of mis-identifying hadrons as muons/electrons below 1%/0.5%. When the calorimeter granularity is reduced by 1-2 orders of magnitude, the lepton identification performance remains stable for particles with E > 2 GeV. Applied to fully simulated eeH/μμH events, the lepton identification performance is consistent with the single-particle case: the efficiency of identifying all the high-energy leptons in an event is 95.5-98.5%.

  19. Particle mobility size spectrometers: harmonization of technical standards and data structure to facilitate high quality long-term observations of atmospheric particle number size distributions

    NASA Astrophysics Data System (ADS)

    Wiedensohler, A.; Birmili, W.; Nowak, A.; Sonntag, A.; Weinhold, K.; Merkel, M.; Wehner, B.; Tuch, T.; Pfeifer, S.; Fiebig, M.; Fjäraa, A. M.; Asmi, E.; Sellegri, K.; Depuy, R.; Venzac, H.; Villani, P.; Laj, P.; Aalto, P.; Ogren, J. A.; Swietlicki, E.; Roldin, P.; Williams, P.; Quincey, P.; Hüglin, C.; Fierz-Schmidhauser, R.; Gysel, M.; Weingartner, E.; Riccobono, F.; Santos, S.; Grüning, C.; Faloon, K.; Beddows, D.; Harrison, R. M.; Monahan, C.; Jennings, S. G.; O'Dowd, C. D.; Marinoni, A.; Horn, H.-G.; Keck, L.; Jiang, J.; Scheckman, J.; McMurry, P. H.; Deng, Z.; Zhao, C. S.; Moerman, M.; Henzing, B.; de Leeuw, G.

    2010-12-01

    Particle mobility size spectrometers, often referred to as DMPS (Differential Mobility Particle Sizers) or SMPS (Scanning Mobility Particle Sizers), have found wide application in atmospheric aerosol research. However, comparability of measurements conducted world-wide is hampered by a lack of generally accepted technical standards with respect to the instrumental set-up, measurement mode, data evaluation, and quality control. This article results from several instrument intercomparison workshops conducted within the European infrastructure project EUSAAR (European Supersites for Atmospheric Aerosol Research). Under controlled laboratory conditions, the number size distributions from 20 to 200 nm determined by mobility size spectrometers of different design agree within an uncertainty range of ±10% after correcting for internal particle losses, while below and above this size range the discrepancies increased. Instruments of identical design agreed within ±3% in the peak number concentration when all settings were made carefully. Technical standards were developed for a minimum requirement of mobility size spectrometry for atmospheric aerosol measurements. Technical recommendations are given for atmospheric measurements, including continuous monitoring of flow rates, temperature, pressure, and relative humidity for the sheath and sample air in the differential mobility analyser. In cooperation with EMEP (European Monitoring and Evaluation Program), a new uniform data structure was introduced for saving and disseminating the data within EMEP. This structure contains three levels: raw data, processed data, and final particle size distributions. Importantly, we recommend reporting raw measurements including all relevant instrument parameters as well as a complete documentation of all data transformation and correction steps. These technical and data structure standards aim to enhance the quality of long-term size distribution measurements, their comparability between different networks and sites, and their transparency and traceability back to raw data.
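
    A minimal sketch of the three-level structure described above (Python dataclasses; the field names are illustrative assumptions, not the agreed EMEP format specification):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Level0:
    """Raw measurement: counts per mobility bin plus instrument parameters."""
    timestamp: str
    raw_counts: List[float]
    sheath_flow_lpm: float
    sample_flow_lpm: float
    temperature_k: float
    pressure_hpa: float
    relative_humidity: float

@dataclass
class Level1:
    """Processed data: inverted concentrations with documented corrections."""
    timestamp: str
    corrections: List[str]        # e.g. multiple charge, diffusion loss
    dn_dlogdp: List[float]

@dataclass
class Level2:
    """Final particle number size distribution ready for dissemination."""
    timestamp: str
    diameters_nm: List[float]
    dn_dlogdp_cm3: List[float]
```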

  20. Inertial microfluidic physics.

    PubMed

    Amini, Hamed; Lee, Wonhee; Di Carlo, Dino

    2014-08-07

    Microfluidics has experienced massive growth in the past two decades, and, especially with advances in rapid prototyping, researchers have explored a multitude of channel structures, fluid and particle mixtures, and integration with electrical and optical systems towards solving problems in healthcare, biological and chemical analysis, materials synthesis, and other emerging areas that can benefit from the scale, automation, or unique physics of these systems. Inertial microfluidics, which relies on the unconventional use of fluid inertia in microfluidic platforms, is one of the emerging fields that make use of unique physical phenomena accessible in microscale patterned channels. Channel shapes that focus, concentrate, order, separate, transfer, and mix particles and fluids have been demonstrated; however, the physical underpinnings guiding these channel designs have been limited, and much of the development has been based on experimentally derived intuition. Here we aim to provide a deeper understanding of the mechanisms and underlying physics in these systems, which can lead to more effective and reliable designs with less iteration. To place the inertial effects into context, we also discuss related fluid-induced forces present in particulate flows, including forces due to non-Newtonian fluids, particle asymmetry, and particle deformability. We then highlight the inverse situation and describe the effect of the suspended particles acting on the fluid in a channel flow. Finally, we discuss the importance of structured channels, i.e. channels with boundary conditions that vary in the streamwise direction, and their potential as a means to achieve unprecedented three-dimensional control over fluid and particles in microchannels. Ultimately, we hope that an improved fundamental and quantitative understanding of inertial fluid dynamic effects can lead to unprecedented capabilities to program fluid and particle flow towards automation of biomedicine, materials synthesis, and chemical process control.

  1. Ultrasonic control of ceramic membrane fouling: Effect of particle characteristics.

    PubMed

    Chen, Dong; Weavers, Linda K; Walker, Harold W

    2006-02-01

    In this study, the effect of particle characteristics on the ultrasonic control of membrane fouling was investigated. Ultrasound at 20 kHz was applied to a cross-flow filtration system with gamma-alumina membranes in the presence of colloidal silica particles. Experimental results indicated that particle concentration affected the ability of ultrasound to control membrane fouling, with less effective control of fouling at higher particle concentrations. Measurements of sound wave intensity and images of the cavitation region indicated that particles induced additional cavitation bubbles near the ultrasonic source, which resulted in less turbulence reaching the membrane surface and subsequently less effective control of fouling. When silica particles were modified to be hydrophobic, greater inducement of cavitation bubbles near the ultrasonic source occurred for a fixed concentration, also resulting in less effective control of fouling. Particle size influenced the cleaning ability of ultrasound, with better permeate recovery observed with larger particles. Particle size did not affect sound wave intensity, suggesting that the more effective control of fouling by large particles was due to greater lift and cross-flow drag forces on larger particles compared to smaller particles.

  2. Scientific program and abstracts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerich, C.

    1983-01-01

    The Fifth International Conference on High-Power Particle Beams is organized jointly by the Lawrence Livermore National Laboratory and Physics International Company. As in the previous conferences in this series, the program includes the following topics: high-power, electron- and ion-beam acceleration and transport; diode physics; high-power particle beam interaction with plasmas and dense targets; particle beam fusion (inertial confinement); collective ion acceleration; particle beam heating of magnetically confined plasmas; and generation of microwave/free-electron lasers.

  3. Development of a novel drug release system, time-controlled explosion system (TES). I. Concept and design.

    PubMed

    Ueda, S; Hata, T; Asakura, S; Yamaguchi, H; Kotani, M; Ueda, Y

    1994-01-01

    A novel controlled drug release system, the Time-Controlled Explosion System (TES), has been developed. TES has a four-layered spherical structure, which consists of a core, drug, swelling agent, and water-insoluble polymer membrane. TES is characterized by rapid drug release after a precisely programmed lag time; i.e., expansion of the swelling agent by water penetrating through the outer membrane, destruction of the membrane by the stress due to the swelling force, and subsequent rapid drug release. For establishing the concept and development strategy, TES was designed using metoprolol as a model drug and polystyrene balls (3.2 mm in diameter) as core particles. Among the polymers screened, low-substituted hydroxypropylcellulose (L-HPC) and ethylcellulose (EC) were selected as the swelling agent and the outer water-insoluble membrane, respectively. The release profiles of metoprolol from the system were not affected by the pH of the dissolution media. The lag time was controlled by the thickness of the outer EC membrane; thus, a combination of TES particles possessing different lag times could offer any desired release profile of the model compound, metoprolol.

  4. Particle astrophysics

    NASA Technical Reports Server (NTRS)

    Sadoulet, Bernard; Cronin, James; Aprile, Elena; Barish, Barry C.; Beier, Eugene W.; Brandenberger, Robert; Cabrera, Blas; Caldwell, David; Cassiday, George; Cline, David B.

    1991-01-01

    The following scientific areas are reviewed: (1) cosmology and particle physics (particle physics and the early universe, dark matter, and other relics); (2) stellar physics and particles (solar neutrinos, supernovae, and unconventional particle physics); (3) high energy gamma ray and neutrino astronomy; (4) cosmic rays (space and ground observations). Highest scientific priorities for the next decade include implementation of the current program, new initiatives, and longer-term programs. Essential technological developments, such as cryogenic detectors of particles, new solar neutrino techniques, and new extensive air shower detectors, are discussed. Also a certain number of institutional issues (the funding of particle astrophysics, recommended funding mechanisms, recommended facilities, international collaborations, and education and technology) which will become critical in the coming decade are presented.

  5. Multivariable optimization of liquid rocket engines using particle swarm algorithms

    NASA Astrophysics Data System (ADS)

    Jones, Daniel Ray

    Liquid rocket engines are highly reliable, controllable, and efficient compared to other conventional forms of rocket propulsion. As such, they have seen wide use in the space industry and have become the standard propulsion system for launch vehicles, orbit insertion, and orbital maneuvering. Though these systems are well understood, historical optimization techniques are often inadequate due to the highly non-linear nature of the engine performance problem. In this thesis, a Particle Swarm Optimization (PSO) variant was applied to maximize the specific impulse of a finite-area combustion chamber (FAC) equilibrium flow rocket performance model by controlling the engine's oxidizer-to-fuel ratio and de Laval nozzle expansion and contraction ratios. In addition to the PSO-controlled parameters, engine performance was calculated based on propellant chemistry, combustion chamber pressure, and ambient pressure, which are provided as inputs to the program. The performance code was validated by comparison with NASA's Chemical Equilibrium with Applications (CEA) and the commercially available Rocket Propulsion Analysis (RPA) tool. Similarly, the PSO algorithm was validated by comparison with brute-force optimization, which calculates all possible solutions and subsequently determines which is the optimum. Particle Swarm Optimization was shown to be an effective optimizer capable of quick and reliable convergence for complex functions of multiple non-linear variables.
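
    A minimal particle swarm optimizer of the kind used in the thesis might look like the following sketch (Python; the inertia and acceleration coefficients, bounds handling, and the negative-Isp objective wrapper are illustrative assumptions rather than the author's actual variant):

```python
import numpy as np

def pso(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimize objective(x) over box bounds with a basic particle swarm.
    bounds: sequence of (low, high) pairs, one per design variable."""
    rng = np.random.default_rng(1)
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(bounds)
    x = rng.uniform(lo, hi, size=(n_particles, dim))    # particle positions
    v = np.zeros_like(x)                                # particle velocities
    pbest = x.copy()
    pbest_f = np.array([objective(p) for p in x])
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                      # keep particles in bounds
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest

# e.g. maximize specific impulse by minimizing its negative over bounds on
# (oxidizer-to-fuel ratio, nozzle expansion ratio, nozzle contraction ratio)
```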

  6. Decay of super-heavy particles: user guide of the SHdecay program

    NASA Astrophysics Data System (ADS)

    Barbot, C.

    2004-02-01

    I give here a detailed user guide for the C++ program SHdecay, which has been developed for computing the final spectra of stable particles (protons, photons, LSPs, electrons, neutrinos of the three species and their antiparticles) arising from the decay of a super-heavy X particle. It allows the user to compute in great detail the complete decay cascade for any given decay mode into particles of the Minimal Supersymmetric Standard Model (MSSM). In particular, it takes into account all interactions of the MSSM during the perturbative cascade (including not only QCD, but also the electroweak and 3rd generation Yukawa interactions), and includes a detailed treatment of the SUSY decay cascade (for a given set of parameters) and of the non-perturbative hadronization process. All these features allow us to ensure energy conservation over the whole cascade up to a numerical accuracy of a few per mille. Alternatively, the program also allows the user to restrict the computation to QCD or SUSY-QCD frameworks. I detail the input and output files, describe the role of each part of the program, and include some advice for using it best. Program summary Title of program: SHdecay Catalogue identifier: ADSL Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADSL Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Computer and operating system: Program tested on PC running Linux KDE and Suse 8.1 Programming language used: C with STL C++ library and using the standard gnu g++ compiler No. of lines in distributed program: 14 955 No. of bytes in distributed program, including test data, etc.: 624 487 Distribution format: tar gzip file Keywords: Super-heavy particles, fragmentation functions, DGLAP equations, supersymmetry, MSSM, UHECR Nature of physical problem: Obtaining the energy spectra of the final stable decay products (protons, photons, electrons, the three species of neutrinos and the LSPs) of a decaying super-heavy X particle, within the framework of the Minimal Supersymmetric Standard Model (MSSM). It can be done numerically by solving the full set of DGLAP equations in the MSSM for the perturbative evolution of the fragmentation functions Dp2p1(x, Q) of any particle p1 into any other p2 (x is the energy fraction carried by the particle p2 and Q its virtuality), and by treating properly the different decay cascades of all unstable particles and the final hadronization of quarks and gluons. In order to obtain proper results at very low values of x (up to x ˜ 10^-13), NLO color coherence effects have been included by using the Modified Leading Log Approximation (MLLA). Method of solution: the DGLAP equations are solved by a fourth-order Runge-Kutta method with a fixed step. Typical running time: Around 35 hours for the first run, but the most time consuming sub-programs can be run only once for most applications.
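
    The "Method of solution" above refers to a fixed-step fourth-order Runge-Kutta integration of the DGLAP system. The full MSSM evolution is far beyond a short example, so the sketch below (in Python rather than the program's C++) only illustrates the fixed-step RK4 scheme itself; the right-hand side dglap_rhs is a hypothetical placeholder for the discretized splitting-function convolution.

```python
import numpy as np

def rk4_fixed_step(rhs, y0, t0, t1, n_steps):
    """Classical fourth-order Runge-Kutta integration with a fixed step size.

    rhs(t, y) must return dy/dt as an array of the same shape as y.
    """
    y = np.asarray(y0, dtype=float)
    t = t0
    h = (t1 - t0) / n_steps
    for _ in range(n_steps):
        k1 = rhs(t, y)
        k2 = rhs(t + 0.5 * h, y + 0.5 * h * k1)
        k3 = rhs(t + 0.5 * h, y + 0.5 * h * k2)
        k4 = rhs(t + h, y + h * k3)
        y = y + (h / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
        t += h
    return y

# Placeholder right-hand side: a damped linear system standing in for the
# discretized DGLAP evolution of fragmentation functions in the evolution
# variable (e.g. ln Q^2).
def dglap_rhs(t, y):
    return -0.1 * y

y_final = rk4_fixed_step(dglap_rhs, y0=np.ones(5), t0=0.0, t1=10.0, n_steps=1000)
print(y_final)
```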

  7. Nozzles for Focusing Aerosol Particles

    DTIC Science & Technology

    2009-10-01

    Fabrication of the nozzle with the desired shape was accomplished using EDM technology. First, a copper tungsten electrode was turned on a CNC lathe. The...small (0.9-mm diameter). The external portions of the nozzles were machined in a more conventional manner using computer numerical control (CNC)... lathes and milling machines running programs written by computer aided machining (CAM) software. The close tolerance of concentricity of the two

  8. Sensitive Detection Using Microfluidics and Nonlinear Amplification

    DTIC Science & Technology

    2011-07-22

    "Quantification of Nucleic Acids via Simultaneous Chemical Initiation of Recombinase Polymerase Amplification Reactions on SlipChip" 2011, 83, 3533... Grant number: N00014-08-1-0936. Author: Rustem F. Ismagilov. ...concentrations by combining controlled chemical autocatalytic amplification and stochastic confinement of small particles with the microfluidic

  9. Different elution modes and field programming in gravitational field-flow fractionation. III. Field programming by flow-rate gradient generated by a programmable pump.

    PubMed

    Plocková, J; Chmelík, J

    2001-05-25

    Gravitational field-flow fractionation (GFFF) utilizes the Earth's gravitational field as an external force that causes the settlement of particles towards the channel accumulation wall. Hydrodynamic lift forces oppose this action by elevating particles away from the channel accumulation wall. These two counteracting forces enable modulation of the resulting force field acting on particles in GFFF. In this work, force-field programming based on modulating the magnitude of hydrodynamic lift forces was implemented via changes of flow-rate, which was accomplished by a programmable pump. Several flow-rate gradients (step, linear, parabolic, and combined gradients) were tested and evaluated as tools for optimizing the separation of a silica gel particle mixture. The influence of increasing amounts of injected sample on peak resolution under flow-rate gradient conditions was also investigated. This is the first time that flow-rate gradients have been implemented for programming the resulting force field acting on particles in GFFF.
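
    To make the pump programs above concrete, the short Python sketch below generates step, linear, and parabolic flow-rate profiles of the kind evaluated in the study; the start and end flow rates, ramp time, and sampling interval are arbitrary illustrative values, not the experimental settings.

```python
import numpy as np

def flow_gradient(t, q_start, q_end, t_ramp, shape="linear"):
    """Return the programmed flow rate at time t (same units as q_start/q_end).

    shape: "step"      - jump from q_start to q_end at t_ramp
           "linear"    - linear ramp completed at t_ramp
           "parabolic" - quadratic ramp completed at t_ramp
    """
    t = np.asarray(t, dtype=float)
    frac = np.clip(t / t_ramp, 0.0, 1.0)
    if shape == "step":
        ramp = (t >= t_ramp).astype(float)
    elif shape == "linear":
        ramp = frac
    elif shape == "parabolic":
        ramp = frac ** 2
    else:
        raise ValueError("unknown gradient shape")
    return q_start + (q_end - q_start) * ramp

# Example: ramp from 0.2 to 1.0 mL/min over 10 min, sampled every minute.
times = np.arange(0.0, 15.0, 1.0)
print(flow_gradient(times, 0.2, 1.0, 10.0, shape="parabolic"))
```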

  10. PROGRAM TO DEVELOP ENGINEERING DATA FOR FABRIC FILTRATION WITH INTEGRAL PARTICLE CHARGING AND COLLECTION IN A COMBINED ELECTRIC AND FLOW FIELD

    EPA Science Inventory

    The paper discusses an EPA program to develop engineering data for the application of electrostatics to fabric filtration in the form of integral particle charging and collection in a combined electric and flow field, which causes particle deposition to be dominated by electrosta...

  11. Preparation of composite materials in space. Volume 2: Technical report

    NASA Technical Reports Server (NTRS)

    Steurer, W. H.; Kaye, S.

    1973-01-01

    A study to define promising materials, significant processing criteria, and the related processing techniques and apparatus for the preparation of composite materials in space was conducted. The study also established a program for zero gravity experiments and the required developmental efforts. The following composite types were considered: (1) metal-base fiber and particle composites, including cemented compacts, (2) controlled density metals, comprising plain and reinforced metal foams, and (3) unidirectionally solidified eutectic alloys. A program of suborbital and orbital experiments for the 1972 to 1978 time period was established to identify materials, processes, and required experiment equipment.

  12. Nonlinear Burn Control and Operating Point Optimization in ITER

    NASA Astrophysics Data System (ADS)

    Boyer, Mark; Schuster, Eugenio

    2013-10-01

    Control of the fusion power through regulation of the plasma density and temperature will be essential for achieving and maintaining desired operating points in fusion reactors and burning plasma experiments like ITER. In this work, a volume averaged model for the evolution of the density of energy, deuterium and tritium fuel ions, alpha-particles, and impurity ions is used to synthesize a multi-input multi-output nonlinear feedback controller for stabilizing and modulating the burn condition. Adaptive control techniques are used to account for uncertainty in model parameters, including particle confinement times and recycling rates. The control approach makes use of the different possible methods for altering the fusion power, including adjusting the temperature through auxiliary heating, modulating the density and isotopic mix through fueling, and altering the impurity density through impurity injection. Furthermore, a model-based optimization scheme is proposed to drive the system as close as possible to desired fusion power and temperature references. Constraints are considered in the optimization scheme to ensure that, for example, density and beta limits are avoided, and that optimal operation is achieved even when actuators reach saturation. Supported by the NSF CAREER award program (ECCS-0645086).
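
    As a greatly simplified illustration of the volume-averaged burn-control idea (regulating the burn through an actuator such as auxiliary heating), the Python sketch below integrates a crude zero-dimensional stored-energy balance with a saturating proportional controller. The model form, gains, and parameter values are assumptions chosen only for demonstration; they are not the multi-input nonlinear controller synthesized in the work above.

```python
import numpy as np

def simulate_burn(W_ref, W0=0.5, tau_E=3.0, c_alpha=0.2,
                  Kp=5.0, P_aux_max=2.0, dt=0.01, t_end=60.0):
    """Zero-dimensional stored-energy balance with proportional feedback.

    dW/dt = P_alpha(W) + P_aux - W / tau_E, with alpha heating modeled very
    crudely as c_alpha * W**2 and the auxiliary power clipped to the actuator
    range [0, P_aux_max]. All quantities are in arbitrary normalized units,
    and the target W_ref is chosen on the stable branch of this toy model.
    """
    n = int(t_end / dt)
    W = W0
    history = np.empty(n)
    for i in range(n):
        P_aux = np.clip(Kp * (W_ref - W), 0.0, P_aux_max)  # saturating actuator
        P_alpha = c_alpha * W ** 2                          # crude alpha heating
        W += dt * (P_alpha + P_aux - W / tau_E)
        history[i] = W
    return history

W_trace = simulate_burn(W_ref=1.0)
print(W_trace[-1])  # stored energy settles near the requested operating point
```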

  13. Generating heavy particles with energy and momentum conservation

    NASA Astrophysics Data System (ADS)

    Mereš, Michal; Melo, Ivan; Tomášik, Boris; Balek, Vladimír; Černý, Vladimír

    2011-12-01

    We propose a novel algorithm, called REGGAE, for the generation of momenta of a given sample of particle masses, evenly distributed in Lorentz-invariant phase space and obeying energy and momentum conservation. In comparison to other existing algorithms, REGGAE is designed for the use in multiparticle production in hadronic and nuclear collisions where many hadrons are produced and a large part of the available energy is stored in the form of their masses. The algorithm uses a loop simulating multiple collisions which lead to production of configurations with reasonably large weights. Program summaryProgram title: REGGAE (REscattering-after-Genbod GenerAtor of Events) Catalogue identifier: AEJR_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEJR_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 1523 No. of bytes in distributed program, including test data, etc.: 9608 Distribution format: tar.gz Programming language: C++ Computer: PC Pentium 4, though no particular tuning for this machine was performed. Operating system: Originally designed on Linux PC with g++, but it has been compiled and ran successfully on OS X with g++ and MS Windows with Microsoft Visual C++ 2008 Express Edition, as well. RAM: This depends on the number of particles which are generated. For 10 particles like in the attached example it requires about 120 kB. Classification: 11.2 Nature of problem: The task is to generate momenta of a sample of particles with given masses which obey energy and momentum conservation. Generated samples should be evenly distributed in the available Lorentz-invariant phase space. Solution method: In general, the algorithm works in two steps. First, all momenta are generated with the GENBOD algorithm. There, particle production is modeled as a sequence of two-body decays of heavy resonances. After all momenta are generated this way, they are reshuffled. Each particle undergoes a collision with some other partner such that in the pair center of mass system the new directions of momenta are distributed isotropically. After each particle collides only a few times, the momenta are distributed evenly across the whole available phase space. Starting with GENBOD is not essential for the procedure but it improves the performance. Running time: This depends on the number of particles and number of events one wants to generate. On a LINUX PC with 2 GHz processor, generation of 1000 events with 10 particles each takes about 3 s.
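
    The solution method above hinges on one operation: picking a pair of particles and redistributing their momenta isotropically in the pair's center-of-mass frame, which conserves the pair's total four-momentum. The Python sketch below implements just that pair-rescattering step (it is not the REGGAE C++ code); four-momenta are stored as (E, px, py, pz) arrays and the masses are arbitrary example values in GeV.

```python
import numpy as np

def boost(p, beta):
    """Lorentz-boost the four-vector p = (E, px, py, pz) by velocity vector beta."""
    b2 = beta @ beta
    if b2 < 1e-14:
        return p.copy()
    gamma = 1.0 / np.sqrt(1.0 - b2)
    bp = beta @ p[1:]
    E = gamma * (p[0] + bp)
    vec = p[1:] + ((gamma - 1.0) * bp / b2 + gamma * p[0]) * beta
    return np.concatenate(([E], vec))

def rescatter_pair(p1, p2, m1, m2, rng):
    """Redistribute the pair's momenta isotropically in their CM frame."""
    P = p1 + p2
    M = np.sqrt(max(P[0] ** 2 - P[1:] @ P[1:], 0.0))   # pair invariant mass
    beta_cm = P[1:] / P[0]                              # CM velocity in the lab
    # Two-body CM momentum magnitude from the invariant mass.
    q = np.sqrt(max((M**2 - (m1 + m2)**2) * (M**2 - (m1 - m2)**2), 0.0)) / (2.0 * M)
    # Isotropic direction in the CM frame.
    cos_t = rng.uniform(-1.0, 1.0)
    phi = rng.uniform(0.0, 2.0 * np.pi)
    sin_t = np.sqrt(1.0 - cos_t**2)
    n = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
    p1_cm = np.concatenate(([np.sqrt(m1**2 + q**2)],  q * n))
    p2_cm = np.concatenate(([np.sqrt(m2**2 + q**2)], -q * n))
    # Boost back to the lab frame; the total four-momentum is conserved.
    return boost(p1_cm, beta_cm), boost(p2_cm, beta_cm)

rng = np.random.default_rng(1)
m1, m2 = 0.14, 0.94                                   # example masses (GeV)
p1 = np.array([np.sqrt(m1**2 + 1.0), 1.0, 0.0, 0.0])  # pion moving along x
p2 = np.array([m2, 0.0, 0.0, 0.0])                    # proton at rest
q1, q2 = rescatter_pair(p1, p2, m1, m2, rng)
print(q1 + q2, p1 + p2)                               # totals agree
```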

  14. A three-dimensional spacecraft-charging computer code

    NASA Technical Reports Server (NTRS)

    Rubin, A. G.; Katz, I.; Mandell, M.; Schnuelle, G.; Steen, P.; Parks, D.; Cassidy, J.; Roche, J.

    1980-01-01

    A computer code is described which simulates the interaction of the space environment with a satellite at geosynchronous altitude. Employing finite elements, a three-dimensional satellite model has been constructed with more than 1000 surface cells and 15 different surface materials. Free space around the satellite is modeled by nesting grids within grids. Applications of this NASA Spacecraft Charging Analyzer Program (NASCAP) code to the study of a satellite photosheath and the differential charging of the SCATHA (satellite charging at high altitudes) satellite in eclipse and in sunlight are discussed. In order to understand detector response when the satellite is charged, the code is used to trace the trajectories of particles reaching the SCATHA detectors. Particle trajectories from positive and negative emitters on SCATHA also are traced to determine the location of returning particles, to estimate the escaping flux, and to simulate active control of satellite potentials.

  15. ZENO: N-body and SPH Simulation Codes

    NASA Astrophysics Data System (ADS)

    Barnes, Joshua E.

    2011-02-01

    The ZENO software package integrates N-body and SPH simulation codes with a large array of programs to generate initial conditions and analyze numerical simulations. Written in C, the ZENO system is portable between Mac, Linux, and Unix platforms. It is in active use at the Institute for Astronomy (IfA), at NRAO, and possibly elsewhere. ZENO programs can perform a wide range of simulation and analysis tasks. While many of these programs were first created for specific projects, they embody algorithms of general applicability and embrace a modular design strategy, so existing code is easily applied to new tasks. Major elements of the system include: structured data file utilities, which facilitate basic operations on binary data, including import/export of ZENO data to other systems; snapshot generation routines, which create particle distributions with various properties (systems with user-specified density profiles can be realized in collisionless or gaseous form, and multiple spherical and disk components may be set up in mutual equilibrium); snapshot manipulation routines, which permit the user to sift, sort, and combine particle arrays, translate and rotate particle configurations, and assign new values to data fields associated with each particle; simulation codes, including both pure N-body and combined N-body/SPH programs (pure N-body codes are available in both uniprocessor and parallel versions, while SPH codes offer a wide range of options for gas physics, including isothermal, adiabatic, and radiating models); snapshot analysis programs, which calculate temporal averages, evaluate particle statistics, measure shapes and density profiles, compute kinematic properties, and identify and track objects in particle distributions; and visualization programs, which generate interactive displays and produce still images and videos of particle distributions, with user-specified color schemes and viewing transformations.

  16. THERMUS—A thermal model package for ROOT

    NASA Astrophysics Data System (ADS)

    Wheaton, S.; Cleymans, J.; Hauer, M.

    2009-01-01

    THERMUS is a package of C++ classes and functions allowing statistical-thermal model analyses of particle production in relativistic heavy-ion collisions to be performed within the ROOT framework of analysis. Calculations are possible within three statistical ensembles; a grand-canonical treatment of the conserved charges B, S and Q, a fully canonical treatment of the conserved charges, and a mixed-canonical ensemble combining a canonical treatment of strangeness with a grand-canonical treatment of baryon number and electric charge. THERMUS allows for the assignment of decay chains and detector efficiencies specific to each particle yield, which enables sensible fitting of model parameters to experimental data. Program summaryProgram title: THERMUS, version 2.1 Catalogue identifier: AEBW_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEBW_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 17 152 No. of bytes in distributed program, including test data, etc.: 93 581 Distribution format: tar.gz Programming language: C++ Computer: PC, Pentium 4, 1 GB RAM (not hardware dependent) Operating system: Linux: FEDORA, RedHat, etc. Classification: 17.7 External routines: Numerical Recipes in C [1], ROOT [2] Nature of problem: Statistical-thermal model analyses of heavy-ion collision data require the calculation of both primordial particle densities and contributions from resonance decay. A set of thermal parameters (the number depending on the particular model imposed) and a set of thermalized particles, with their decays specified, is required as input to these models. The output is then a complete set of primordial thermal quantities for each particle, together with the contributions to the final particle yields from resonance decay. In many applications of statistical-thermal models it is required to fit experimental particle multiplicities or particle ratios. In such analyses, the input is a set of experimental yields and ratios, a set of particles comprising the assumed hadron resonance gas formed in the collision and the constraints to be placed on the system. The thermal model parameters consistent with the specified constraints leading to the best-fit to the experimental data are then output. Solution method: THERMUS is a package designed for incorporation into the ROOT [2] framework, used extensively by the heavy-ion community. As such, it utilizes a great deal of ROOT's functionality in its operation. ROOT features used in THERMUS include its containers, the wrapper TMinuit implementing the MINUIT fitting package, and the TMath class of mathematical functions and routines. Arguably the most useful feature is the utilization of CINT as the control language, which allows interactive access to the THERMUS objects. Three distinct statistical ensembles are included in THERMUS, while additional options to include quantum statistics, resonance width and excluded volume corrections are also available. THERMUS provides a default particle list including all mesons (up to the K4∗ (2045)) and baryons (up to the Ω) listed in the July 2002 Particle Physics Booklet [3]. For each typically unstable particle in this list, THERMUS includes a text-file listing its decays. 
With thermal parameters specified, THERMUS calculates primordial thermal densities either by performing numerical integrations or else, in the case of the Boltzmann approximation without resonance width in the grand-canonical ensemble, by evaluating Bessel functions. Particle decay chains are then used to evaluate experimental observables (i.e. particle yields following resonance decay). Additional detector efficiency factors allow fine-tuning of the model predictions to a specific detector arrangement. When parameters are required to be constrained, use is made of the 'Numerical Recipes in C' [1] function which applies the Broyden globally convergent secant method of solving nonlinear systems of equations. Since the NRC software is not freely-available, it has to be purchased by the user. THERMUS provides the means of imposing a large number of constraints on the chosen model (amongst others, THERMUS can fix the baryon-to-charge ratio of the system, the strangeness density of the system and the primordial energy per hadron). Fits to experimental data are accomplished in THERMUS by using the ROOT TMinuit class. In its default operation, the standard χ function is minimized, yielding the set of best-fit thermal parameters. THERMUS allows the assignment of separate decay chains to each experimental input. In this way, the model is able to match the specific feed-down corrections of a particular data set. Running time: Depending on the analysis required, run-times vary from seconds (for the evaluation of particle multiplicities given a set of parameters) to several minutes (for fits to experimental data subject to constraints). References:W.H. Press, S.A. Teukolsky, W.T. Vetterling, B.P. Flannery, Numerical Recipes in C: The Art of Scientific Computing, Cambridge University Press, Cambridge, 2002. R. Brun, F. Rademakers, Nucl. Inst. Meth. Phys. Res. A 389 (1997) 81. See also http://root.cern.ch/. K. Hagiwara et al., Phys. Rev. D 66 (2002) 010001.
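
    As noted above, in the Boltzmann approximation without resonance width in the grand-canonical ensemble the primordial density reduces to a Bessel-function expression, n = g m^2 T K_2(m/T) exp(mu/T) / (2 pi^2) in natural units. The Python sketch below evaluates that textbook formula for an example particle; it illustrates the physics only and does not call the THERMUS classes, and the temperature and mass values are arbitrary.

```python
import numpy as np
from scipy.special import kn

HBARC = 0.19733  # GeV*fm

def boltzmann_density(m, T, mu=0.0, g=1.0):
    """Primordial particle density in the Boltzmann approximation.

    n = g/(2*pi^2) * m^2 * T * K_2(m/T) * exp(mu/T) in natural units (GeV^3),
    converted to fm^-3 on return. m, T, mu are in GeV; g is the degeneracy.
    """
    n_gev3 = g / (2.0 * np.pi**2) * m**2 * T * kn(2, m / T) * np.exp(mu / T)
    return n_gev3 / HBARC**3

# Example: a pion-like particle (m = 0.13957 GeV, g = 1) at T = 160 MeV, mu = 0.
print(boltzmann_density(0.13957, 0.160))
```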

  17. Strategies for Controlled Placement of Nanoscale Building Blocks

    PubMed Central

    2007-01-01

    The capability of placing individual nanoscale building blocks on exact substrate locations in a controlled manner is one of the key requirements to realize future electronic, optical, and magnetic devices and sensors that are composed of such blocks. This article reviews some important advances in the strategies for controlled placement of nanoscale building blocks. In particular, we will overview template assisted placement that utilizes physical, molecular, or electrostatic templates, DNA-programmed assembly, placement using dielectrophoresis, approaches for non-close-packed assembly of spherical particles, and recent development of focused placement schemes including electrostatic funneling, focused placement via molecular gradient patterns, electrodynamic focusing of charged aerosols, and others. PMID:21794185

  18. Installation Restoration Program. Preliminary Assessment: Connecticut Air National Guard, 103rd Tactical Fighter Group (TFG), Bradley International Airport, Windsor Locks, Connecticut and 103rd Tactical Control Squadron (TCS), Orange/West Haven, Connectiut

    DTIC Science & Technology

    1988-11-01

    poorly sorted, not compacted, very plastic. Contains siliceous diatoms and spores. Organic content high (17.2 percent of sample lost during...physical character of a rock (e.g., particle size, color, mineral content, primary structures, thickness, weathering characteristics, and other physical

  19. EuCARD 2010: European coordination of accelerator research and development

    NASA Astrophysics Data System (ADS)

    Romaniuk, Ryszard S.

    2010-09-01

    Accelerators are basic tools of the experimental physics of elementary particles, nuclear physics, light sources of the fourth generation. They are also used in myriad other applications in research, industry and medicine. For example, there are intensely developed transmutation techniques for nuclear waste from nuclear power and atomic industries. The European Union invests in the development of accelerator infrastructures inside the framework programs to build the European Research Area. The aim is to build new accelerator research infrastructures, develop the existing ones, and generally make the infrastructures more available to competent users. The paper summarizes the first year of activities of the EU FP7 Project Capacities EuCARD -European Coordination of Accelerator R&D. EuCARD is a common venture of 37 European Accelerator Laboratories, Institutes, Universities and Industrial Partners involved in accelerator sciences and technologies. The project, initiated by ESGARD, is an Integrating Activity co-funded by the European Commission under Framework Program 7 - Capacities for a duration of four years, starting April 1st, 2009. Several teams from this country participate actively in this project. The contribution from Polish research teams concerns: photonic and electronic measurement - control systems, RF-gun co-design, thin-film superconducting technology, superconducting transport infrastructures, photon and particle beam measurements and control.

  20. Nanoparticle Superlattice Engineering with DNA

    NASA Astrophysics Data System (ADS)

    Macfarlane, Robert John

    In this thesis, we describe a set of design rules for using programmable oligonucleotide interactions, elements of both thermodynamic and kinetic control, and an understanding of the dominant forces that are responsible for particle assembly to design and deliberately make a wide variety of nanoparticle-based superlattices. Like the rules for ionic solids developed by Linus Pauling, these rules are guidelines for determining relative nanoparticle superlattice stability, rather than rigorous mathematical descriptions. However, unlike Pauling's rules, the set of rules developed herein allow one to not just predict crystal stability, but also to deliberately and independently control the nanoparticle sizes, interparticle spacings, and crystallographic symmetries of a superlattice. In the first chapter of this thesis, a general background is given for using DNA as a tool in programmable materials synthesis. Chapter 2 demonstrates how altering oligonucleotide length and nanoparticle size can be used to control nanoparticle superlattice lattice parameters with nanometer-scale precision. In the third chapter, the kinetics of crystallization are examined, and a method to selectively stabilize kinetic products is presented. The data in chapter 4 prove that it is the overall hydrodynamic radius of a DNA-functionalized particle, rather than the sizes of the inorganic nanoparticles being assembled, that dictates particle packing behavior. Chapter 5 demonstrates how particles that exhibit non-equivalent packing behavior can be used to control superlattice symmetry, and chapter 6 utilizes these data to develop a phase diagram that predicts lattice stability a priori to synthesis. In chapter 7, the ability to functionalize a particle with multiple types of oligonucleotides is used to synthesize complex lattices, including ternary superlattices that are capable of dynamic symmetry conversion between a binary and a ternary state. The final chapter provides an outlook on other developments in DNA-programmed nanoparticle assembly not covered in this thesis, as well as future challenges for this field. Supplementary information to support the conclusions of the thesis, as well as provide technical details on how these materials are synthesized, are provided in appendices at the end of the thesis. As a whole, this methodology presents a major advance towards nanoparticle superlattice engineering, as it effectively separates the identity of a particle core (and thereby its physical properties) from the variables that control its assembly, enabling the synthesis of designer nanoparticle-based materials.

  1. An experimental and theoretical investigation of deposition patterns from an agricultural airplane

    NASA Technical Reports Server (NTRS)

    Morris, D. J.; Croom, C. C.; Vandam, C. P.; Holmes, B. J.

    1984-01-01

    A flight test program has been conducted with a representative agricultural airplane to provide data for validating a computer program model which predicts aerially applied particle deposition. Test procedures and the data from this test are presented and discussed. The computer program features are summarized, and comparisons of predicted and measured particle deposition are presented. Applications of the computer program for spray pattern improvement are illustrated.

  2. Integral Engine Inlet Particle Separator. Volume 1. Technology Program

    DTIC Science & Technology

    1975-07-01

    inlet particle separators for future Army aircraft gas turbine engines. Appropriate technical personnel of this Directorate have reviewed this report...USAAMRDL-TR-75-31A INTEGRAL ENGINE INLET PARTICLE SEPARATOR Volume I -- Technology Program General Electric Company Aircraft Engine Group...INTRODUCTION The adverse environments in which Army equipment operates impose severe penalties upon gas turbine engine performance

  3. Particle Tracking Model (PTM) with Coastal Modeling System (CMS)

    DTIC Science & Technology

    2015-11-04

    Coastal Inlets Research Program Particle Tracking Model (PTM) with Coastal Modeling System (CMS) The Particle Tracking Model (PTM) is a Lagrangian...currents and waves. The Coastal Inlets Research Program (CIRP) supports the PTM with the Coastal Modeling System (CMS), which provides coupled wave...and current forcing for PTM simulations. CMS-PTM is implemented in the Surface-water Modeling System, a GUI environment for input development

  4. Software for Acquiring Image Data for PIV

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.; Cheung, H. M.; Kressler, Brian

    2003-01-01

    PIV Acquisition (PIVACQ) is a computer program for acquisition of data for particle-image velocimetry (PIV). In the PIV system for which PIVACQ was developed, small particles entrained in a flow are illuminated with a sheet of light from a pulsed laser. The illuminated region is monitored by a charge-coupled-device camera that operates in conjunction with a data-acquisition system that includes a frame grabber and a counter-timer board, both installed in a single computer. The camera operates in "frame-straddle" mode, in which a pair of images can be obtained closely spaced in time (on the order of microseconds). The frame grabber acquires image data from the camera and stores the data in the computer memory. The counter/timer board triggers the camera and synchronizes the pulsing of the laser with acquisition of data from the camera. PIVACQ coordinates all of these functions and provides a graphical user interface, through which the user can control the PIV data-acquisition system. PIVACQ enables the user to acquire a sequence of single-exposure images, display the images, process the images, and then save the images to the computer hard drive. PIVACQ works in conjunction with the PIVPROC program, which processes the images of particles into the velocity field in the illuminated plane.
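
    PIVACQ only acquires the image pairs; turning a pair into velocities (the job of PIVPROC) rests on cross-correlating small interrogation windows from the two frames and locating the correlation peak. The Python sketch below shows that core idea with a generic FFT-based cross-correlation on a synthetic image pair; it illustrates the principle and is not the PIVPROC code, and the window size and synthetic shift are assumptions.

```python
import numpy as np

def window_displacement(win_a, win_b):
    """Estimate the integer-pixel displacement of win_b relative to win_a
    via FFT-based cross-correlation (the core step of PIV processing)."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    corr = np.fft.fftshift(corr)
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    center = np.array(corr.shape) // 2
    dy, dx = np.array(peak) - center
    return dx, dy

# Synthetic test: a random particle image shifted by dx = 3, dy = -2 pixels.
rng = np.random.default_rng(0)
frame_a = rng.random((32, 32))
frame_b = np.roll(frame_a, shift=(-2, 3), axis=(0, 1))
print(window_displacement(frame_a, frame_b))  # expected (3, -2)
```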

  5. Press Room

    Science.gov Websites

  6. A computer program for two-particle intrinsic coefficients of fractional parentage

    NASA Astrophysics Data System (ADS)

    Deveikis, A.

    2012-06-01

    A Fortran 90 program CESOS for the calculation of the two-particle intrinsic coefficients of fractional parentage for several j-shells with isospin and an arbitrary number of oscillator quanta (CESOs) is presented. The implemented procedure for CESOs calculation consistently follows the principles of antisymmetry and translational invariance. The approach is based on a simple enumeration scheme for antisymmetric many-particle states, efficient algorithms for calculation of the coefficients of fractional parentage for j-shells with isospin, and construction of the subspace of the center-of-mass Hamiltonian eigenvectors corresponding to the minimal eigenvalue equal to 3/2 (in ℏω). The program provides fast calculation of CESOs for a given particle number and produces results possessing small numerical uncertainties. The introduced CESOs may be used for calculation of expectation values of two-particle nuclear shell-model operators within the isospin formalism. Program summaryProgram title: CESOS Catalogue identifier: AELT_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AELT_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 10 932 No. of bytes in distributed program, including test data, etc.: 61 023 Distribution format: tar.gz Programming language: Fortran 90 Computer: Any computer with a Fortran 90 compiler Operating system: Windows XP, Linux RAM: The memory demand depends on the number of particles A and the excitation energy of the system E. Computation of the A=6 particle system with the total angular momentum J=0 and the total isospin T=1 requires around 4 kB of RAM at E=0,˜3 MB at E=3, and ˜172 MB at E=5. Classification: 17.18 Nature of problem: The code CESOS generates a list of two-particle intrinsic coefficients of fractional parentage for several j-shells with isospin. Solution method: The method is based on the observation that CESOs may be obtained by diagonalizing the center-of-mass Hamiltonian in the basis set of antisymmetric A-particle oscillator functions with singled out dependence on Jacobi coordinates of two last particles and choosing the subspace of its eigenvectors corresponding to the minimal eigenvalue equal to 3/2. Restrictions: One run of the code CESOS generates CESOs for one specified set of (A,E,J,T) values only. The restrictions on the (A,E,J,T) values are completely determined by the restrictions on the computation of the single-shell CFPs and two-particle multishell CFPs (GCFPs) [1]. The full sets of single-shell CFPs may be calculated up to the j=9/2 shell (for any particular shell of the configuration); the shell with j⩾11/2 cannot get full (it is the implementation constraint). The calculation of GCFPs is limited by A<86 when E=0 (due to the memory constraints); small numbers of particles allow significantly higher excitations. Any allowed values of J and T may be chosen for the specified values of A and E. The complete list of allowed values of J and T for the chosen values of A and E may be generated by the GCFP program - CPC Program Library, Catalogue Id. AEBI_v1_0. The actual scale of the CESOs computation problem depends strongly on the magnitude of the A and E values. 
    Though there are no limitations on the A and E values (within the limits of the single-shell CFP and multishell CFP calculations), the generation of the corresponding list of CESOs is subject to the available computing resources. For example, the computation of CESOs for A=6, (J,T)=(1,0) at E=5 took around 14 hours. The system with A=11, (J,T)=(1/2,3/2) at E=2 requires around 15 hours. These computations were performed on a Pentium 3 GHz PC with 1 GB RAM [2]. Unusual features: It is possible to test the computed CESOs without saving them to a file. This allows the user to learn their number and approximate computation time and to evaluate the accuracy of calculations. Additional comments: The program CESOS uses the code from the GCFP program for calculation of the two-particle multishell coefficients of fractional parentage. Running time: It depends on the size of the problem. The A=6 particle system with (J,T)=(0,1) took around 31 seconds on a Pentium 3 GHz PC with 1 GB RAM at E=3 and about 2.6 hours at E=5.

  7. Exploring the Early Structure of a Rapidly Decompressed Particle Bed

    NASA Astrophysics Data System (ADS)

    Zunino, Heather; Adrian, R. J.; Clarke, Amanda; Johnson, Blair; Arizona State University Collaboration

    2017-11-01

    Rapid expansion of dense, pressurized beds of fine particles subjected to rapid reduction of the external pressure is studied in a vertical shock tube. A near-sonic expansion wave impinges on the particle bed-gas interface and rapidly unloads the particle bed. A high-speed video camera captures events occurring during bed expansion. The particle bed does not expand homogeneously, but breaks down into horizontal slabs and then transforms into a cellular-type structure. Several key parameters affect the particle bed evolution, including particle size and initial bed height. Analyses of this bed structure evolution from experiments with varying particle sizes and initial bed heights are presented. This work is supported by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program, as a Cooperative Agreement under the Predictive Science and Academic Alliance Program, under Contract No. DE-NA0002378.

  8. Effects of retrofitting emission control systems on in-use heavy diesel vehicles.

    PubMed

    Millstein, Dev E; Harley, Robert A

    2010-07-01

    Diesel engines are now the largest source of nitrogen oxides (NO(x)) and fine particulate black carbon (soot) emissions in California. The California Air Resources Board recently adopted a rule requiring that by 2014 all in-use heavy trucks and buses meet current (2007) exhaust particulate matter (PM) emission standards. Also by 2023 all in-use heavy-duty vehicles will have to meet current NO(x) emission standards, with significant progress in achieving the requirements for NO(x) control expected by 2014. This will require retrofit or replacement of older in-use engines. Diesel particle filters (DPF) reduce PM emissions but may increase the NO(2)/NO(x) emission ratio to approximately 35%, compared to approximately 5% typical of diesel engines without particle filters. Additionally, DPF with high oxidative capacity reduce CO and hydrocarbon emissions. We evaluate the effects of retrofitting trucks with DPF on air quality in southern California, using an Eulerian photochemical air quality model. Compared to a 2014 reference scenario without the retrofit program, black carbon concentrations decreased by 12 +/- 2% and 14 +/- 2% during summer and fall, respectively, with corresponding increases in ambient ozone concentrations of 3 +/- 2% and 7 +/- 3%. NO(2) concentrations decreased by 2-4% overall despite the increase in primary NO(2) emissions because total NO(x) emissions were reduced as part of the program to retrofit NO(x) control systems on in-use engines. However, in some cases NO(2) concentrations may increase at locations with high diesel truck traffic.

  9. Development of Austenitic ODS Strengthened Alloys for Very High Temperature Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stubbins, James; Heuser, Brent; Robertson, Ian

    2015-04-22

    This “Blue Sky” project was directed at exploring the opportunities that would be gained by developing Oxide Dispersion Strengthened (ODS) alloys based on the Fe-Cr-Ni austenitic alloy system. A great deal of research effort has been directed toward ferritic and ferritic/martensitic ODS alloys, which has resulted in reasonable advances in alloy properties. Similar gains should be possible with austenitic alloys, which would also take advantage of other superior properties of that alloy system. The research effort was aimed at developing an in-depth understanding of the microstructural-level strengthening effects of ODS particles in austenitic alloys. This was accomplished on a variety of alloy compositions with the main focus on 304SS and 316SS compositions. A further goal was to develop an understanding of the role of ODS particles in crack propagation and creep performance. Since these latter two properties require bulk alloy material, which was not available, this work was carried out on promising austenitic alloy systems which could later be enhanced with ODS strengthening. The research relied on a large variety of micro-analytical techniques, many of which were available through various scientific user facilities. Access to these facilities throughout the course of this work was instrumental in gathering complementary data from various analysis techniques to form a well-rounded picture of the processes which control austenitic ODS alloy performance. Micromechanical testing of the austenitic ODS alloys confirmed their highly superior mechanical properties at elevated temperature from the enhanced strengthening effects. The study analyzed the microstructural mechanisms that provide this enhanced high temperature performance. The findings confirm that the smallest ODS particles provide the most potent strengthening component. Larger particles and other thermally-driven precipitate structures were less effective contributors and, in some cases, limited overall properties. With this understanding, the major materials development challenge is to provide a high, uniformly distributed population of very fine ODS particles to be able to realize the full promise of dispersion strengthening. This should be a major goal of future work. This program had the further goal of developing graduate student researchers with the experience and capabilities to move this field forward. The support in this program was used for graduate student support and for research expenses; none of the program funds directly supported the faculty in the program. In this sense, the program was successful in supporting several very promising graduate researchers. Four of the graduate students supported here will complete their PhDs in 2015.

  10. GASP cloud- and particle-encounter statistics and their application to LPC aircraft studies. Volume 1: Analysis and conclusions

    NASA Technical Reports Server (NTRS)

    Jasperson, W. H.; Nastrom, G. D.; Davis, R. E.; Holdeman, J. D.

    1984-01-01

    Summary studies are presented for the entire cloud observation archive from the NASA Global Atmospheric Sampling Program (GASP). Studies are also presented for GASP particle concentration data gathered concurrently with the cloud observations. Cloud encounters occurred on about 15 percent of the data samples overall, but the probability of cloud encounter is shown to vary significantly with altitude, latitude, and distance from the tropopause. Several meteorological circulation features are apparent in the latitudinal distribution of cloud cover, and the cloud encounter statistics are shown to be consistent with the classical mid-latitude cyclone model. Observations of clouds spaced more closely than 90 minutes are shown to be statistically dependent. The statistics for cloud and particle encounter are utilized to estimate the frequency of cloud encounter on long range airline routes, and to assess the probability and extent of laminar flow loss due to cloud or particle encounter by aircraft utilizing laminar flow control (LFC). It is shown that the probability of extended cloud encounter is too low, of itself, to make LFC impractical.

  11. Simulation of Radioactive Corrosion Product in Primary Cooling System of Japanese Sodium-Cooled Fast Breeder Reactor

    NASA Astrophysics Data System (ADS)

    Matuo, Youichirou; Miyahara, Shinya; Izumi, Yoshinobu

    Radioactive Corrosion Product (CP) is a main cause of personal radiation exposure during maintenance with no breached fuel in fast breeder reactor (FBR) plants. The most important CPs are 54Mn and 60Co. In order to establish techniques of radiation dose estimation for radiation workers in radiation-controlled areas of the FBR, the PSYCHE (Program SYstem for Corrosion Hazard Evaluation) code was developed. We added the Particle Model to the conventional PSYCHE analytical model. In this paper, we performed calculations of CP transfer in JOYO using the improved calculation code in which the Particle Model was added to PSYCHE. The C/E (calculated / experimentally observed) value for CP deposition was improved through use of this improved PSYCHE incorporating the Particle Model. Moreover, CP in particle form was estimated to account for approximately 20% of the total 54Mn deposition and approximately 40% of the total 60Co deposition in the cold-leg region. These calculation results are consistent with the measured results for the actual cold-leg piping in JOYO.

  12. Biological and physical controls on the flux and characteristics of sinking particles on the Northwest Atlantic margin

    NASA Astrophysics Data System (ADS)

    Hwang, Jeomshik; Manganini, Steven J.; Park, JongJin; Montluçon, Daniel B.; Toole, John M.; Eglinton, Timothy I.

    2017-06-01

    matter characteristics and radiocarbon contents of organic carbon (OC) were examined on sinking particle samples intercepted at three nominal depths of 1000 m, 2000 m, and 3000 m (˜50 m above the seafloor) during a 3 year sediment trap program on the New England slope in the Northwest Atlantic. We have sought to characterize the sources of sinking particles in the context of vertical export of biogenic particles from the overlying water column and lateral supply of resuspended sediment particles from adjacent margin sediments. High aluminum (Al) abundances and low OC radiocarbon contents indicated contributions from resuspended sediment which was greatest at 3000 m but also significant at shallower depths. The benthic source (i.e., laterally supplied resuspended sediment) of opal appears negligible based on the absence of a correlation with Al fluxes. In comparison, CaCO3 fluxes at 3000 m showed a positive correlation with Al fluxes. Benthic sources accounted for 42 ˜ 63% of the sinking particle flux based on radiocarbon mass balance and the relationship between Al flux and CaCO3 flux. Episodic pulses of Al at 3000 m were significantly correlated with the near-bottom current at a nearby hydrographic mooring site, implying the importance of current variability in lateral particle transport. However, Al fluxes at 1000 m and 2000 m were coherent but differed from those at 3000 m, implying more than one mode of lateral supply of particles in the water column.
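
    The benthic fraction quoted above follows from a two end-member mass balance: the radiocarbon content of the trapped OC is treated as a flux-weighted mixture of freshly exported OC and older resuspended sediment OC. The Python sketch below shows that arithmetic with purely hypothetical end-member values, not the values measured in the study.

```python
def benthic_fraction(d14c_sample, d14c_vertical, d14c_benthic):
    """Two end-member mass balance:
    d14c_sample = f_b * d14c_benthic + (1 - f_b) * d14c_vertical,
    solved for the benthic fraction f_b."""
    return (d14c_sample - d14c_vertical) / (d14c_benthic - d14c_vertical)

# Hypothetical illustration (per-mil values chosen only for demonstration):
# fresh vertical export at +40 permil, resuspended sediment OC at -200 permil,
# trapped sample measured at -80 permil.
print(benthic_fraction(-80.0, 40.0, -200.0))  # -> 0.5
```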

  13. Generalized Faxén's theorem: Evaluating first-order (hydrodynamic drag) and second-order (acoustic radiation) forces on finite-sized rigid particles, bubbles and droplets in arbitrary complex flows

    NASA Astrophysics Data System (ADS)

    Annamalai, Subramanian; Balachandar, S.

    2016-11-01

    In recent times, study of complex disperse multiphase problems involving several million particles (e.g. volcanic eruptions, spray control etc.) is garnering momentum. The objective of this work is to present an accurate model (termed generalized Faxén's theorem) to predict the hydrodynamic forces on such inclusions (particles/bubbles/droplets) without having to solve for the details of flow around them. The model is developed using acoustic theory and the force obtained as a summation of infinite series (monopole, dipole and higher sources). The first-order force is the time-dependent hydrodynamic drag force arising from the dipole component due to interaction between the gas and the inclusion at the microscale level. The second-order force however is a time-averaged differential force (contributions arise both from monopole and dipole), also known as the acoustic radiation force primarily used to levitate particles. In this work, the monopole and dipole strengths are represented in terms of particle surface and volume averages of the incoming flow properties and therefore applicable to particle sizes of the order of fluid length scale and subjected to any arbitrary flow. Moreover, this model can also be used to account for inter-particle coupling due to neighboring particles. U.S. DoE, NNSA, Advanced Simulation and Computing Program, Cooperative Agreement under PSAAP-II, Contract No. DE-NA0002378.
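
    For orientation, the classical steady Stokes-flow form of Faxén's law that the above work generalizes writes the first-order drag in terms of a surface average of the undisturbed flow; one common statement is sketched below, where a is the particle radius, mu the dynamic viscosity, v_p the particle velocity, and the angle brackets denote an average over the particle surface. The generalized theorem of the abstract extends this to unsteady, compressible (acoustic) flows through surface and volume averages of the incoming flow.

```latex
\mathbf{F}^{(1)} = 6\pi\mu a \left( \langle \mathbf{u} \rangle_S - \mathbf{v}_p \right),
\qquad
\langle \mathbf{u} \rangle_S = \mathbf{u}(\mathbf{x}_p) + \frac{a^2}{6}\,\nabla^2 \mathbf{u}(\mathbf{x}_p) + \mathcal{O}(a^4)
```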

  14. Biomimetic Antigenic Nanoparticles Elicit Controlled Protective Immune Response to Influenza

    PubMed Central

    Patterson, Dustin P.; Rynda-Apple, Agnieszka; Harmsen, Ann L.; Harmsen, Allen G.; Douglas, Trevor

    2013-01-01

    Here we present a biomimetic strategy towards nanoparticle design for controlled immune response through encapsulation of conserved internal influenza proteins on the interior of virus-like particles (VLPs) to direct CD8+ cytotoxic T cell protection. Programmed encapsulation and sequestration of the conserved nucleoprotein (NP) from influenza on the interior of a VLP, derived from the bacteriophage P22, results in a vaccine that provides multi-strain protection against 100 times lethal doses of influenza in an NP-specific CD8+ T cell-dependent manner. VLP assembly and encapsulation of the immunogenic NP cargo protein are the result of a genetically programmed self-assembly, making this strategy amenable to the quick production of vaccines against rapidly emerging pathogens. Addition of adjuvants or targeting molecules was not required for eliciting the protective response. PMID:23540530

  15. The Particle Accelerator Simulation Code PyORBIT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorlov, Timofey V; Holmes, Jeffrey A; Cousineau, Sarah M

    2015-01-01

    The particle accelerator simulation code PyORBIT is presented. The structure, implementation, history, parallel and simulation capabilities, and future development of the code are discussed. The PyORBIT code is a new implementation and extension of algorithms of the original ORBIT code that was developed for the Spallation Neutron Source accelerator at the Oak Ridge National Laboratory. The PyORBIT code has a two level structure. The upper level uses the Python programming language to control the flow of intensive calculations performed by the lower level code implemented in the C++ language. The parallel capabilities are based on MPI communications. The PyORBIT is an open source code accessible to the public through the Google Open Source Projects Hosting service.

  16. Monodisperse Block Copolymer Particles with Controllable Size, Shape, and Nanostructure

    NASA Astrophysics Data System (ADS)

    Shin, Jae Man; Kim, Yongjoo; Kim, Bumjoon; PNEL Team

    Shape-anisotropic particles are an important class of novel colloidal building blocks because their functionality is more strongly governed by their shape, size, and nanostructure compared to conventional spherical particles. Recently, facile strategies for producing non-spherical polymeric particles by interfacial engineering have received significant attention. However, uniform particle size distributions together with controlled shape and nanostructure have not yet been achieved. Here, we introduce a versatile system for producing monodisperse BCP particles with controlled size, shape, and morphology. Polystyrene-b-polybutadiene (PS-b-PB) self-assembled into either onion-like or striped ellipsoidal particles, where the final structure is governed by the amount of sodium dodecyl sulfate (SDS) surfactant adsorbed at the particle/surrounding interface. Further control of molecular weight and particle size enabled fine-tuning of the aspect ratio of the ellipsoidal particles. The free energy of morphology formation and the entropic penalty associated with bending BCP chains strongly affect the resulting particle structure.

  17. A computer program for two-particle generalized coefficients of fractional parentage

    NASA Astrophysics Data System (ADS)

    Deveikis, A.; Juodagalvis, A.

    2008-10-01

    We present a FORTRAN90 program GCFP for the calculation of the generalized coefficients of fractional parentage (generalized CFPs or GCFP). The approach is based on the observation that the multi-shell CFPs can be expressed in terms of single-shell CFPs, while the latter can be readily calculated employing a simple enumeration scheme of antisymmetric A-particle states and an efficient method of construction of the idempotent matrix eigenvectors. The program provides fast calculation of GCFPs for a given particle number and produces results possessing numerical uncertainties below the desired tolerance. A single j-shell is defined by four quantum numbers, (e,l,j,t). A supplemental C++ program parGCFP allows calculation to be done in batches and/or in parallel. Program summaryProgram title:GCFP, parGCFP Catalogue identifier: AEBI_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEBI_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 17 199 No. of bytes in distributed program, including test data, etc.: 88 658 Distribution format: tar.gz Programming language: FORTRAN 77/90 ( GCFP), C++ ( parGCFP) Computer: Any computer with suitable compilers. The program GCFP requires a FORTRAN 77/90 compiler. The auxiliary program parGCFP requires GNU-C++ compatible compiler, while its parallel version additionally requires MPI-1 standard libraries Operating system: Linux (Ubuntu, Scientific) (all programs), also checked on Windows XP ( GCFP, serial version of parGCFP) RAM: The memory demand depends on the computation and output mode. If this mode is not 4, the program GCFP demands the following amounts of memory on a computer with Linux operating system. It requires around 2 MB of RAM for the A=12 system at E⩽2. Computation of the A=50 particle system requires around 60 MB of RAM at E=0 and ˜70 MB at E=2 (note, however, that the calculation of this system will take a very long time). If the computation and output mode is set to 4, the memory demands by GCFP are significantly larger. Calculation of GCFPs of A=12 system at E=1 requires 145 MB. The program parGCFP requires additional 2.5 and 4.5 MB of memory for the serial and parallel version, respectively. Classification: 17.18 Nature of problem: The program GCFP generates a list of two-particle coefficients of fractional parentage for several j-shells with isospin. Solution method: The method is based on the observation that multishell coefficients of fractional parentage can be expressed in terms of single-shell CFPs [1]. The latter are calculated using the algorithm [2,3] for a spectral decomposition of an antisymmetrization operator matrix Y. The coefficients of fractional parentage are those eigenvectors of the antisymmetrization operator matrix Y that correspond to unit eigenvalues. A computer code for these coefficients is available [4]. The program GCFP offers computation of two-particle multishell coefficients of fractional parentage. The program parGCFP allows a batch calculation using one input file. Sets of GCFPs are independent and can be calculated in parallel. Restrictions:A<86 when E=0 (due to the memory constraints); small numbers of particles allow significantly higher excitations, though the shell with j⩾11/2 cannot get full (it is the implementation constraint). 
Unusual features: Using the program GCFP it is possible to determine allowed particle configurations without the GCFP computation. The GCFPs can be calculated either for all particle configurations at once or for a specified particle configuration. The values of GCFPs can be printed out with a complete specification in either one file or with the parent and daughter configurations printed in separate files. The latter output mode requires additional time and RAM memory. It is possible to restrict the ( J,T) values of the considered particle configurations. (Here J is the total angular momentum and T is the total isospin of the system.) The program parGCFP produces several result files the number of which equals to the number of particle configurations. To work correctly, the program GCFP needs to be compiled to read parameters from the standard input (the default setting). Running time: It depends on the size of the problem. The minimum time is required, if the computation and output mode ( CompMode) is not 4, but the resulting file is larger. A system with A=12 particles at E=0 (all 9411 GCFPs) took around 1 sec on a Pentium4 2.8 GHz processor with 1 MB L2 cache. The program required about 14 min to calculate all 1.3×10 GCFPs of E=1. The time for all 5.5×10 GCFPs of E=2 was about 53 hours. For this number of particles, the calculation time of both E=0 and E=1 with CompMode = 1 and 4 is nearly the same, when no other processes are running. The case of E=2 could not be calculated with CompMode = 4, because the RAM memory was insufficient. In general, the latter CompMode requires a longer computation time, although the resulting files are smaller in size. The program parGCFP puts virtually no time overhead. Its parallel version speeds-up the calculation. However, the results need to be collected from several files created for each configuration. References: [1] J. Levinsonas, Works of Lithuanian SSR Academy of Sciences 4 (1957) 17. [2] A. Deveikis, A. Bončkus, R. Kalinauskas, Lithuanian Phys. J. 41 (2001) 3. [3] A. Deveikis, R.K. Kalinauskas, B.R. Barrett, Ann. Phys. 296 (2002) 287. [4] A. Deveikis, Comput. Phys. Comm. 173 (2005) 186. (CPC Catalogue ID. ADWI_v1_0)
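
    The solution method above reduces, at its core, to a linear-algebra step: diagonalize the symmetric, idempotent antisymmetrization-operator matrix Y and keep the eigenvectors belonging to unit eigenvalues, which are the CFP vectors. The Python sketch below shows that selection step on a small made-up projector matrix; the matrix is purely illustrative and is not an actual Y matrix produced by GCFP.

```python
import numpy as np

def cfp_from_projector(Y, tol=1e-8):
    """Return the eigenvectors of a symmetric idempotent matrix Y whose
    eigenvalues equal 1 (the CFP vectors in the method described above)."""
    w, v = np.linalg.eigh(Y)
    keep = np.isclose(w, 1.0, atol=tol)
    return v[:, keep]

# Tiny illustrative projector: a rank-2 orthogonal projection in 4 dimensions.
basis = np.linalg.qr(np.random.default_rng(3).random((4, 2)))[0]
Y = basis @ basis.T           # symmetric, Y @ Y = Y, eigenvalues are {0, 1}
cfp = cfp_from_projector(Y)
print(cfp.shape)              # (4, 2): two eigenvectors with eigenvalue 1
```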

  18. Mobility particle size spectrometers: harmonization of technical standards and data structure to facilitate high quality long-term observations of atmospheric particle number size distributions

    NASA Astrophysics Data System (ADS)

    Wiedensohler, A.; Birmili, W.; Nowak, A.; Sonntag, A.; Weinhold, K.; Merkel, M.; Wehner, B.; Tuch, T.; Pfeifer, S.; Fiebig, M.; Fjäraa, A. M.; Asmi, E.; Sellegri, K.; Depuy, R.; Venzac, H.; Villani, P.; Laj, P.; Aalto, P.; Ogren, J. A.; Swietlicki, E.; Williams, P.; Roldin, P.; Quincey, P.; Hüglin, C.; Fierz-Schmidhauser, R.; Gysel, M.; Weingartner, E.; Riccobono, F.; Santos, S.; Grüning, C.; Faloon, K.; Beddows, D.; Harrison, R.; Monahan, C.; Jennings, S. G.; O'Dowd, C. D.; Marinoni, A.; Horn, H.-G.; Keck, L.; Jiang, J.; Scheckman, J.; McMurry, P. H.; Deng, Z.; Zhao, C. S.; Moerman, M.; Henzing, B.; de Leeuw, G.; Löschau, G.; Bastian, S.

    2012-03-01

    Mobility particle size spectrometers often referred to as DMPS (Differential Mobility Particle Sizers) or SMPS (Scanning Mobility Particle Sizers) have found a wide range of applications in atmospheric aerosol research. However, comparability of measurements conducted world-wide is hampered by lack of generally accepted technical standards and guidelines with respect to the instrumental set-up, measurement mode, data evaluation as well as quality control. Technical standards were developed for a minimum requirement of mobility size spectrometry to perform long-term atmospheric aerosol measurements. Technical recommendations include continuous monitoring of flow rates, temperature, pressure, and relative humidity for the sheath and sample air in the differential mobility analyzer. We compared commercial and custom-made inversion routines to calculate the particle number size distributions from the measured electrical mobility distribution. All inversion routines are comparable within few per cent uncertainty for a given set of raw data. Furthermore, this work summarizes the results from several instrument intercomparison workshops conducted within the European infrastructure project EUSAAR (European Supersites for Atmospheric Aerosol Research) and ACTRIS (Aerosols, Clouds, and Trace gases Research InfraStructure Network) to determine present uncertainties especially of custom-built mobility particle size spectrometers. Under controlled laboratory conditions, the particle number size distributions from 20 to 200 nm determined by mobility particle size spectrometers of different design are within an uncertainty range of around ±10% after correcting internal particle losses, while below and above this size range the discrepancies increased. For particles larger than 200 nm, the uncertainty range increased to 30%, which could not be explained. The network reference mobility spectrometers with identical design agreed within ±4% in the peak particle number concentration when all settings were done carefully. The consistency of these reference instruments to the total particle number concentration was demonstrated to be less than 5%. Additionally, a new data structure for particle number size distributions was introduced to store and disseminate the data at EMEP (European Monitoring and Evaluation Program). This structure contains three levels: raw data, processed data, and final particle size distributions. Importantly, we recommend reporting raw measurements including all relevant instrument parameters as well as a complete documentation on all data transformation and correction steps. These technical and data structure standards aim to enhance the quality of long-term size distribution measurements, their comparability between different networks and sites, and their transparency and traceability back to raw data.
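
    All of the inversion routines compared above solve the same underlying problem: recover a particle number size distribution from measured mobility-channel counts given an instrument kernel (transfer functions, charging probabilities, and losses). The Python sketch below illustrates that step generically with a synthetic near-diagonal kernel and a non-negative least-squares inversion; the kernel shape, size grid, and noise level are arbitrary assumptions, not one of the routines compared in the workshops.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(42)

# Size grid (nm) and a synthetic "true" lognormal-shaped size distribution.
d = np.logspace(1, 3, 60)                       # 10 nm .. 1000 nm
true_n = np.exp(-0.5 * (np.log(d / 80.0) / 0.5) ** 2)

# Toy instrument kernel: each channel responds to a narrow band of sizes
# around its centroid (a crude stand-in for DMA transfer functions,
# charging probabilities, and internal losses).
centroids = np.logspace(1, 3, 60)
K = np.exp(-0.5 * ((np.log(d)[None, :] - np.log(centroids)[:, None]) / 0.1) ** 2)

# Simulate raw channel counts with a little noise, then invert with NNLS.
counts = K @ true_n + 0.001 * rng.standard_normal(centroids.size)
recovered, _ = nnls(K, counts)

# The recovered distribution should peak near the true 80 nm mode.
print(d[np.argmax(true_n)], d[np.argmax(recovered)])
```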

  19. Puget Sound sediment-trap data: 1980-1985. Data report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paulson, A.J.; Baker, E.T.; Feely, R.A.

    1991-12-01

    In 1979, scientists at the Pacific Marine Environmental Laboratory began investigating the sources, transformation, transport, and fate of pollutants in Puget Sound and its watershed under Sec. 202 of the Marine Protection, Research and Sanctuaries Act of 1971 (P.L. 92-532), which called in part for '...a comprehensive and continuing program of research with respect to the possible long range effects of pollution, overfishing, and man-induced changes of ocean ecosystems...' The effort was called the Long-Range Effects Research Program (L-RERP) after language in the Act and was later renamed the PMEL Marine Environmental Quality Program. The Long-Range Effects Research Program consisted of (1) sampling dissolved and particulate constituents in the water column by bottle sampling, (2) sampling settling particles by sediment trap, and (3) sampling sediments by grab, box, gravity, and Kasten corers. This data report presents a variety of data from particles collected in 104 traps deployed on 34 moorings in open waters between 1980 and 1985. The text begins with the sampling and analytical methods and the accompanying quality control/quality assurance data. The data sections summarize the available data and the published literature in which the data are interpreted, along with a catalogue of the data available in the Appendix (on microfiche located in the back pocket of the data report).

  20. Particle size distribution control of Pt particles used for particle gun

    NASA Astrophysics Data System (ADS)

    Ichiji, M.; Akiba, H.; Nagao, H.; Hirasawa, I.

    2017-07-01

    The purpose of this study is particle size distribution (PSD) control of submicron-sized Pt particles used for a particle gun. In this report, a simple reaction crystallization is conducted by mixing H2PtCl6 and ascorbic acid. Without additives, the obtained Pt particles have a broad PSD and the reproducibility of the experiment is low. With seeding, the Pt particles have a narrow PSD and the reproducibility improves. Additionally, the mean particle diameter can be controlled between 100 and 700 nm by changing the seeding amount. The obtained particles are confirmed to be Pt by XRD, and the XRD spectra indicate that they are polycrystalline. These experimental results suggest that seeding suppresses nucleation, as most nuclei attach to the seed surfaces. Because this mechanism effectively restricts nucleation, a narrow PSD can be obtained.
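
    For readers unfamiliar with seeded growth, the reported control of mean diameter by seeding amount can be rationalized with a simple volume balance: if secondary nucleation is negligible and all newly reduced metal deposits on the seeds, the final diameter scales with the cube root of the grown-to-seed mass ratio. The sketch below illustrates only that textbook relation with made-up numbers; it is not taken from the paper.

```python
# Hedged sketch: seeded-growth mass balance. If secondary nucleation is negligible and
# all newly reduced Pt deposits on seed particles of diameter d_seed, conservation of
# volume gives the final diameter. All values below are illustrative only.
def final_diameter(d_seed_nm: float, mass_grown_per_mass_seed: float) -> float:
    """d_final = d_seed * (1 + m_grown/m_seed)**(1/3), assuming equal densities."""
    return d_seed_nm * (1.0 + mass_grown_per_mass_seed) ** (1.0 / 3.0)

# More seed for the same amount of grown Pt (smaller ratio) -> smaller final particles.
for ratio in (1, 10, 100):
    print(ratio, round(final_diameter(50.0, ratio), 1), "nm")
```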

  1. Particle Pusher for the Investigation of Wave-Particle Interactions in the Magnetic Centrifugal Mass Filter (MCMF)

    NASA Astrophysics Data System (ADS)

    Kulp-McDowall, Taylor; Ochs, Ian; Fisch, Nathaniel

    2016-10-01

    A particle pusher was constructed in MATLAB using a fourth order Runge-Kutta algorithm to investigate the wave-particle interactions within theoretical models of the MCMF. The model simplified to a radial electric field and a magnetic field focused in the z direction. Studies on an average velocity calculation were conducted in order to test the program's behavior in the large radius limit. The results verified that the particle pusher was behaving correctly. Waves were then simulated on the rotating particles with a periodic divergenceless perturbation in the Bz component of the magnetic field. Preliminary runs indicate an agreement of the particle's motion with analytical predictions-ie. cyclic contractions of the doubly rotating particle's gyroradius.The next stage of the project involves the implementation of particle collisions and turbulence within the particle pusher in order to increase its accuracy and applicability. This will allow for a further investigation of the alpha channeling electrode replacement thesis first proposed by Abraham Fetterman in 2011. Made possible by Grants from the Princeton Environmental Institute (PEI) and the Program for Plasma Science and Technology (PPST).
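
    The pusher described above was written in MATLAB; the sketch below reproduces only the general idea, a fourth-order Runge-Kutta step for a charged particle in a radial electric field and a uniform axial magnetic field, in Python with arbitrary field strengths and particle parameters. It is not the authors' code.

```python
import numpy as np

# Hedged sketch of an RK4 particle pusher in a radial electric field E = E0 * r_hat and
# a uniform axial magnetic field B = B0 * z_hat (the simplified field model named in
# the abstract). Charge, mass, and field strengths are arbitrary test values.
q, m, E0, B0 = 1.0, 1.0, 0.5, 1.0

def acceleration(x, v):
    r = np.array([x[0], x[1], 0.0])
    r_hat = r / np.linalg.norm(r)
    E = E0 * r_hat
    B = np.array([0.0, 0.0, B0])
    return (q / m) * (E + np.cross(v, B))          # Lorentz force per unit mass

def rk4_step(x, v, dt):
    # classic fourth-order Runge-Kutta for the coupled (position, velocity) system
    k1x, k1v = v, acceleration(x, v)
    k2x, k2v = v + 0.5 * dt * k1v, acceleration(x + 0.5 * dt * k1x, v + 0.5 * dt * k1v)
    k3x, k3v = v + 0.5 * dt * k2v, acceleration(x + 0.5 * dt * k2x, v + 0.5 * dt * k2v)
    k4x, k4v = v + dt * k3v, acceleration(x + dt * k3x, v + dt * k3v)
    x_new = x + dt / 6.0 * (k1x + 2 * k2x + 2 * k3x + k4x)
    v_new = v + dt / 6.0 * (k1v + 2 * k2v + 2 * k3v + k4v)
    return x_new, v_new

x, v = np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.2, 0.0])
for _ in range(1000):
    x, v = rk4_step(x, v, dt=0.01)
print(x, v)
```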

  2. Application of particle swarm optimisation for solving deteriorating inventory model with fluctuating demand and controllable deterioration rate

    NASA Astrophysics Data System (ADS)

    Chen, Yu-Ren; Dye, Chung-Yuan

    2013-06-01

    In most of the inventory models in the literature, the deterioration rate of goods is viewed as an exogenous variable not subject to control. In the real market, however, the retailer can reduce the deterioration rate of a product by making effective capital investments in storage equipment. In this study, we formulate a deteriorating inventory model with time-varying demand in which the preservation technology cost is a decision variable determined jointly with the replenishment policy. The objective is to find the optimal replenishment and preservation technology investment strategies that minimise the total cost over the planning horizon. For any given feasible replenishment scheme, we first prove that the optimal preservation technology investment strategy not only exists but is also unique. A particle swarm optimisation is then coded and used to solve the nonlinear programming problem by employing the properties derived in this article. Numerical examples are used to illustrate the features of the proposed model.
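
    As a rough illustration of the solution approach (not the paper's model), the sketch below shows a bare-bones particle swarm optimiser minimising a placeholder quadratic cost; in the actual application the objective would be the total inventory cost over the planning horizon, evaluated using the properties derived for the preservation-technology investment.

```python
import numpy as np

# Hedged sketch of a basic particle swarm optimiser. The cost function is a stand-in;
# the paper's inventory-cost objective is not reproduced here.
def cost(x):
    return np.sum((x - 1.5) ** 2, axis=1)          # placeholder objective

rng = np.random.default_rng(1)
n_particles, dim, iters = 30, 2, 200
w, c1, c2 = 0.7, 1.5, 1.5                          # inertia and acceleration weights
pos = rng.uniform(0.0, 5.0, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), cost(pos)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    val = cost(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print(gbest)   # should approach the optimum [1.5, 1.5]
```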

  3. Three-dimensional structural dynamics of DNA origami Bennett linkages using individual-particle electron tomography

    DOE PAGES

    Lei, Dongsheng; Marras, Alexander E.; Liu, Jianfang; ...

    2018-02-09

    Scaffolded DNA origami has proven to be a powerful and efficient technique to fabricate functional nanomachines by programming the folding of a single-stranded DNA template strand into three-dimensional (3D) nanostructures, designed to be precisely motion-controlled. Although two-dimensional (2D) imaging of DNA nanomachines using transmission electron microscopy and atomic force microscopy suggested these nanomachines are dynamic in 3D, geometric analysis based on 2D imaging was insufficient to uncover the exact motion in 3D. In this paper, we use the individual-particle electron tomography method and reconstruct 129 density maps from 129 individual DNA origami Bennett linkage mechanisms at ~6-14 nm resolution. The statistical analyses of these conformations lead to understanding the 3D structural dynamics of Bennett linkage mechanisms. Moreover, our effort provides experimental verification of a theoretical kinematics model of DNA origami, which can be used as feedback to improve the design and control of motion via optimized DNA sequences and routing.

  4. Three-dimensional structural dynamics of DNA origami Bennett linkages using individual-particle electron tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lei, Dongsheng; Marras, Alexander E.; Liu, Jianfang

    Scaffolded DNA origami has proven to be a powerful and efficient technique to fabricate functional nanomachines by programming the folding of a single-stranded DNA template strand into three-dimensional (3D) nanostructures, designed to be precisely motion-controlled. Although two-dimensional (2D) imaging of DNA nanomachines using transmission electron microscopy and atomic force microscopy suggested these nanomachines are dynamic in 3D, geometric analysis based on 2D imaging was insufficient to uncover the exact motion in 3D. In this paper, we use the individual-particle electron tomography method and reconstruct 129 density maps from 129 individual DNA origami Bennett linkage mechanisms at ~6-14 nm resolution. The statistical analyses of these conformations lead to understanding the 3D structural dynamics of Bennett linkage mechanisms. Moreover, our effort provides experimental verification of a theoretical kinematics model of DNA origami, which can be used as feedback to improve the design and control of motion via optimized DNA sequences and routing.

  5. Effect of Finite Particle Size on Convergence of Point Particle Models in Euler-Lagrange Multiphase Dispersed Flow

    NASA Astrophysics Data System (ADS)

    Nili, Samaun; Park, Chanyoung; Haftka, Raphael T.; Kim, Nam H.; Balachandar, S.

    2017-11-01

    Point-particle methods are extensively used in simulating Euler-Lagrange multiphase dispersed flow. When particles are much smaller than the Eulerian grid, the point-particle model is on firm theoretical ground. However, this standard approach of evaluating the gas-particle coupling at the particle center fails to converge as the Eulerian grid is refined below the particle size. We present an approach to model the interaction between particles and fluid for finite-size particles that permits convergence. We use the generalized Faxen form to compute the force on a particle and compare the results against the traditional point-particle method. We apportion the different force components on the particle to fluid cells based on the fraction of the particle volume or surface in each cell. The application is a one-dimensional model of shock propagation through a particle-laden field at moderate volume fraction, where convergence is achieved with a well-formulated force model and back coupling for finite-size particles. Comparison with 3D direct fully resolved numerical simulations will be used to check whether the approach also improves accuracy compared to the point-particle model. Work supported by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program, as a Cooperative Agreement under the Predictive Science Academic Alliance Program, under Contract No. DE-NA0002378.
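
    The volume-fraction-based apportionment can be pictured with a one-dimensional sketch: the coupling force of a finite-size particle is distributed over the cells it overlaps in proportion to the overlapped length, rather than deposited entirely in the cell containing its centre. The code below illustrates that idea only; it is not the generalized Faxen force model used in the study.

```python
import numpy as np

# Hedged 1-D sketch: apportion a finite-size particle's back-coupling force to Eulerian
# cells in proportion to the fraction of the particle extent overlapping each cell.
def apportion_force(force, x_p, d_p, x_faces):
    """force: total force on the particle; x_p: particle centre; d_p: diameter;
    x_faces: cell face coordinates (length n_cells + 1). Returns per-cell forces."""
    left, right = x_p - 0.5 * d_p, x_p + 0.5 * d_p
    overlap = np.clip(np.minimum(right, x_faces[1:]) - np.maximum(left, x_faces[:-1]),
                      0.0, None)
    return force * overlap / (right - left)

x_faces = np.linspace(0.0, 1.0, 11)            # ten cells of width 0.1
f_cells = apportion_force(force=2.0, x_p=0.33, d_p=0.25, x_faces=x_faces)
print(np.round(f_cells, 3), f_cells.sum())     # per-cell forces sum back to the total
```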

  6. Tuning Amphiphilicity of Particles for Controllable Pickering Emulsion

    PubMed Central

    Wang, Zhen; Wang, Yapei

    2016-01-01

    Pickering emulsions with the use of particles as emulsifiers have been extensively used in scientific research and industrial production due to their edge in biocompatibility and stability compared with traditional emulsions. The control over Pickering emulsion stability and type plays a significant role in these applications. Among the present methods to build controllable Pickering emulsions, tuning the amphiphilicity of particles is comparatively effective and has attracted enormous attention. In this review, we highlight some recent advances in tuning the amphiphilicity of particles for controlling the stability and type of Pickering emulsions. The amphiphilicity of three types of particles including rigid particles, soft particles, and Janus particles are tailored by means of different mechanisms and discussed here in detail. The stabilization-destabilization interconversion and phase inversion of Pickering emulsions have been successfully achieved by changing the surface properties of these particles. This article provides a comprehensive review of controllable Pickering emulsions, which is expected to stimulate inspiration for designing and preparing novel Pickering emulsions, and ultimately directing the preparation of functional materials. PMID:28774029

  7. Asymmetric Bidirectional Controlled Quantum Information Transmission via Seven-Particle Entangled State

    NASA Astrophysics Data System (ADS)

    Sang, Ming-huang; Nie, Li-ping

    2017-11-01

    We demonstrate that a seven-particle entangled state can be used to realize deterministic asymmetric bidirectional controlled quantum information transmission by performing only Bell-state measurements, two-particle projective measurements, and single-particle measurements. In our protocol, Alice can teleport an arbitrary unknown single-particle state to Bob while, at the same time, Bob can remotely prepare an arbitrary known two-particle state for Alice under the control of the supervisor Charlie.

  8. Particle damping applied research on mining dump truck vibration control

    NASA Astrophysics Data System (ADS)

    Song, Liming; Xiao, Wangqiang; Guo, Haiquan; Yang, Zhe; Li, Zeguang

    2018-05-01

    Vehicle vibration characteristics have become an important evaluation index for mining dump trucks. In this paper, mining dump truck vibration control based on particle damping technology was studied by combining theoretical simulation with field testing, and particle damping was successfully applied to cab vibration control. Analysis of the test results shows that, with a particle damper, cab vibration was reduced appreciably. The work provides methods and a basis for vehicle vibration control research and for the application of particle damping technology.

  9. Particle Engulfment and Pushing By Solidifying Interfaces

    NASA Technical Reports Server (NTRS)

    2003-01-01

    The study of particle behavior at solid/liquid interfaces (SLIs) is at the center of the Particle Engulfment and Pushing (PEP) research program. Interactions of particles with SLIs have been of interest since the 1960s, starting with geological observations, i.e., frost heaving. Since then, this field of research has become significant to such diverse areas as metal matrix composite materials, fabrication of superconductors, and inclusion control in steels. The PEP research effort is geared towards understanding the fundamental physics of the interaction between particles and a planar SLI. Experimental work, including 1-g and mu-g experiments, accompanies the development of analytical and numerical models. The experimental work comprises substantial groundwork with aluminum (Al) and zinc (Zn) matrices containing spherical zirconia particles, mu-g experiments with metallic Al matrices, and the use of transparent organic metal-analogue materials. The modeling efforts have grown from the initial steady-state analytical model to dynamic models accounting for the initial acceleration of a particle at rest by an advancing SLI. To gain a more comprehensive understanding, numerical models were developed to account for the influence of the thermal and solutal fields. Current efforts are geared towards coupling the diffusive 2-D front-tracking model with a fluid flow model to account for differences in the physics of interaction between 1-g and mu-g environments. A significant amount of this theoretical investigation has been and is being performed by co-investigators at NASA MSFC.

  10. On Determination of the Equation of State of Colloidal Suspensions

    NASA Astrophysics Data System (ADS)

    Sirorattanakul, Krittanon; Huang, Hao; Uhl, Christopher; Ou-Yang, Daniel

    Colloidal suspensions are the main ingredients of a variety of materials in our daily life, e.g., milk, salad dressing, skin lotions, and paint for wall coatings. Understanding the material properties of these systems requires knowledge of their equation of state. Our project aims to determine the equation of state of colloidal suspensions experimentally by combining microfluidics, dielectrophoresis (DEP), and optical imaging. We use fluorescent polystyrene latexes as a model system for this study. By placing semi-permeable membranes between microfluidic channels made from PDMS, we control the particle concentration and ionic strength of the suspension. We use the osmotic equilibrium equation to analyze the particle concentration distribution in a potential force field created by DEP, and confocal optical imaging to measure the spatial distribution of the particle concentration. We compare the results of our experimental study with data obtained by computer simulation of osmotic equilibrium of interacting colloids. NSF DMR-0923299, Emulsion Polymer Institute, Department of Physics, Bioengineering Program of Lehigh University.
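
    In the dilute limit, the osmotic-equilibrium analysis reduces to a Boltzmann profile of particle concentration in the DEP potential; deviations from this ideal profile are what carry the equation-of-state information. The sketch below evaluates only the ideal-gas limit with a made-up harmonic DEP well; it is not the group's analysis code.

```python
import numpy as np

# Hedged sketch: in the dilute (ideal-gas) limit, osmotic equilibrium of colloids in a
# DEP potential well U(r) gives a Boltzmann concentration profile n(r) = n0*exp(-U/kT).
# Interacting suspensions deviate from this, which is what an equation-of-state
# measurement probes. The harmonic well stiffness below is a made-up placeholder.
kT = 4.11e-21                         # thermal energy at ~298 K, in joules
k_trap = 1.0e-21                      # hypothetical trap stiffness, J per um^2
r = np.linspace(0.0, 5.0, 101)        # radial distance from the well centre, um
U = 0.5 * k_trap * r ** 2             # DEP potential energy
n = np.exp(-U / kT)                   # concentration relative to the centre value n0
print(np.round(n[::20], 4))
```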

  11. Long-term Results of the UCSF-LBNL Randomized Trial: Charged Particle With Helium Ion Versus Iodine-125 Plaque Therapy for Choroidal and Ciliary Body Melanoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mishra, Kavita K., E-mail: Kavita.mishra@ucsf.edu; Quivey, Jeanne M.; Daftari, Inder K.

    Purpose: Relevant clinical data are needed given the increasing national interest in charged particle radiation therapy (CPT) programs. Here we report long-term outcomes from the only randomized, stratified trial comparing CPT with iodine-125 plaque therapy for choroidal and ciliary body melanoma. Methods and Materials: From 1985 to 1991, 184 patients met eligibility criteria and were randomized to receive particle (86 patients) or plaque therapy (98 patients). Patients were stratified by tumor diameter, thickness, distance to disc/fovea, anterior extension, and visual acuity. Tumors close to the optic disc were included. Local tumor control, as well as eye preservation, metastases due to melanoma, and survival were evaluated. Results: Median follow-up times for particle and plaque arm patients were 14.6 years and 12.3 years, respectively (P=.22), and for those alive at last follow-up, 18.5 and 16.5 years, respectively (P=.81). Local control (LC) for particle versus plaque treatment was 100% versus 84% at 5 years, and 98% versus 79% at 12 years, respectively (log rank: P=.0006). If patients with tumors close to the disc (<2 mm) were excluded, CPT still resulted in significantly improved LC: 100% versus 90% at 5 years and 98% versus 86% at 12 years, respectively (log rank: P=.048). The enucleation rate was lower after CPT: 11% versus 22% at 5 years and 17% versus 37% at 12 years, respectively (log rank: P=.01). Using a Cox regression model with the likelihood ratio test, treatment was the most important predictor of LC (P=.0002) and eye preservation (P=.01). CPT was a significant predictor of prolonged disease-free survival (log rank: P=.001). Conclusions: Particle therapy resulted in significantly improved local control, eye preservation, and disease-free survival, as confirmed by long-term outcomes from the only randomized study available to date comparing radiation modalities in choroidal and ciliary body melanoma.

  12. The ALTEA/ALTEINO projects: studying functional effects of microgravity and cosmic radiation

    NASA Technical Reports Server (NTRS)

    Narici, L.; Belli, F.; Bidoli, V.; Casolino, M.; De Pascale, M. P.; Di Fino, L.; Furano, G.; Modena, I.; Morselli, A.; Picozza, P.; hide

    2004-01-01

    The ALTEA project investigates the risks of functional brain damage induced by particle radiation in space. A modular facility (the ALTEA facility) is being implemented and will be operated on the International Space Station (ISS) to record electrophysiological and behavioral descriptors of brain function and to monitor their time dynamics and correlation with particles and the space environment. The focus of the program will be on abnormal visual perceptions (often reported as "light flashes" by astronauts) and the impact of particles on retinal and brain visual structures in microgravity conditions. The facility will be made available to the international scientific community for human neurophysiological, electrophysiological, and psychophysics experiments, studies on particle fluxes, and dosimetry. A precursor of ALTEA (the 'Alteino' project) helps set the experimental baseline for the ALTEA experiments, while providing novel information on the radiation environment onboard the ISS and on the brain electrophysiology of the astronauts during orbital flights. Alteino was flown to the ISS on the Soyuz TM34 as part of the Marco Polo mission. Controlled ground experiments using mice and accelerator beams complete the experimental strategy of ALTEA. We present here the status of progress of the ALTEA project and preliminary results of the Alteino study on brain dynamics, particle fluxes, and abnormal visual perceptions. © 2004 COSPAR. Published by Elsevier Ltd. All rights reserved.

  13. The ALTEA/ALTEINO projects: studying functional effects of microgravity and cosmic radiation.

    PubMed

    Narici, L; Belli, F; Bidoli, V; Casolino, M; De Pascale, M P; Di Fino, L; Furano, G; Modena, I; Morselli, A; Picozza, P; Reali, E; Rinaldi, A; Ruggieri, D; Sparvoli, R; Zaconte, V; Sannita, W G; Carozzo, S; Licoccia, S; Romagnoli, P; Traversa, E; Cotronei, V; Vazquez, M; Miller, J; Salnitskii, V P; Shevchenko, O I; Petrov, V P; Trukhanov, K A; Galper, A; Khodarovich, A; Korotkov, M G; Popov, A; Vavilov, N; Avdeev, S; Boezio, M; Bonvicini, W; Vacchi, A; Zampa, N; Mazzenga, G; Ricci, M; Spillantini, P; Castellini, G; Vittori, R; Carlson, P; Fuglesang, C; Schardt, D

    2004-01-01

    The ALTEA project investigates the risks of functional brain damage induced by particle radiation in space. A modular facility (the ALTEA facility) is being implemented and will be operated on the International Space Station (ISS) to record electrophysiological and behavioral descriptors of brain function and to monitor their time dynamics and correlation with particles and the space environment. The focus of the program will be on abnormal visual perceptions (often reported as "light flashes" by astronauts) and the impact of particles on retinal and brain visual structures in microgravity conditions. The facility will be made available to the international scientific community for human neurophysiological, electrophysiological, and psychophysics experiments, studies on particle fluxes, and dosimetry. A precursor of ALTEA (the 'Alteino' project) helps set the experimental baseline for the ALTEA experiments, while providing novel information on the radiation environment onboard the ISS and on the brain electrophysiology of the astronauts during orbital flights. Alteino was flown to the ISS on the Soyuz TM34 as part of the Marco Polo mission. Controlled ground experiments using mice and accelerator beams complete the experimental strategy of ALTEA. We present here the status of progress of the ALTEA project and preliminary results of the Alteino study on brain dynamics, particle fluxes, and abnormal visual perceptions. © 2004 COSPAR. Published by Elsevier Ltd. All rights reserved.

  14. The Bermuda Bio-Optics Program (BBOP). Chapter 16

    NASA Technical Reports Server (NTRS)

    Siegel, David A.

    2001-01-01

    The Bermuda Bio-Optics Project (BBOP) is a collaborative effort between the Institute for Computational Earth System Science (ICESS) at the University of California at Santa Barbara (UCSB) and the Bermuda Biological Station for Research (BBSR). This research program is designed to characterize light availability and utilization in the Sargasso Sea, and to provide an optical link by which biogeochemical observations may be used to evaluate bio-optical models for pigment concentration, primary production, and sinking particle fluxes from satellite-based ocean color sensors. The BBOP time-series was initiated in 1992 and is carried out in conjunction with the US JGOFS Bermuda Atlantic Time-series Study (BATS) at the Bermuda Biological Station for Research. The BATS program itself has been observing biogeochemical processes (primary productivity, particle flux, and elemental cycles) in the mesotrophic waters of the Sargasso Sea since 1988. Closely affiliated with BBOP and BATS is a separate NASA-funded study of the spatial variability of biogeochemical processes in the Sargasso Sea using high-resolution Advanced Very High Resolution Radiometer (AVHRR) and Sea-viewing Wide Field-of-view Sensor (SeaWiFS) data collected at Bermuda. The collaboration between BATS and BBOP measurements has resulted in a unique data set that addresses not only the SIMBIOS goals but also the broader issue of the important factors controlling the carbon cycle.

  15. [Design of an HACCP program for a cocoa processing facility].

    PubMed

    López D'Sola, Patrizia; Sandia, María Gabriela; Bou Rached, Lizet; Hernández Serrano, Pilar

    2012-12-01

    The HACCP plan is a food safety management tool used to control physical, chemical, and biological hazards associated with food processing throughout the processing chain. The aim of this work is to design an HACCP plan for a Venezuelan cocoa processing facility. The production of safe food products requires that the HACCP system be built upon a solid foundation of prerequisite programs such as Good Manufacturing Practices (GMP) and Sanitation Standard Operating Procedures (SSOP); the existence and effectiveness of these prerequisite programs were assessed beforehand. Good Agricultural Practices (GAP) audits of cocoa nib suppliers were also performed. To develop the HACCP plan, the five preliminary tasks and the seven HACCP principles were carried out according to Codex Alimentarius procedures. Three Critical Control Points (CCPs) were identified using a decision tree: winnowing (control of ochratoxin A), roasting (Salmonella control), and metallic particle detection. For each CCP, critical limits, monitoring procedures, corrective actions, verification procedures, and documentation covering all procedures and records appropriate to these principles and their application were established. Implementation and maintenance of a HACCP plan for this processing plant is suggested. Recently, ochratoxin A (OTA) has been associated with cocoa beans. Although separation of the shell from the nib has been reported as an effective measure to control this chemical hazard, a study of OTA prevalence in cocoa beans produced in the country is recommended, as is validation of the winnowing step.

  16. Quantitative determination of low-Z elements in single atmospheric particles on boron substrates by automated scanning electron microscopy-energy-dispersive X-ray spectrometry.

    PubMed

    Choël, Marie; Deboudt, Karine; Osán, János; Flament, Pascal; Van Grieken, René

    2005-09-01

    Atmospheric aerosols consist of a complex heterogeneous mixture of particles. Single-particle analysis techniques are known to provide unique information on the size-resolved chemical composition of aerosols. A scanning electron microscope (SEM) combined with a thin-window energy-dispersive X-ray (EDX) detector enables the morphological and elemental analysis of single particles down to 0.1 microm with a detection limit of 1-10 wt %, low-Z elements included. To obtain data statistically representative of the air masses sampled, a computer-controlled procedure can be implemented in order to run hundreds of single-particle analyses (typically 1000-2000) automatically in a relatively short period of time (generally 4-8 h, depending on the setup and on the particle loading). However, automated particle analysis by SEM-EDX raises two practical challenges: the accuracy of the particle recognition and the reliability of the quantitative analysis, especially for micrometer-sized particles with low atomic number contents. Since low-Z analysis is hampered by the use of traditional polycarbonate membranes, an alternate choice of substrate is a prerequisite. In this work, boron is being studied as a promising material for particle microanalysis. As EDX is generally said to probe a volume of approximately 1 microm3, geometry effects arise from the finite size of microparticles. These particle geometry effects must be corrected by means of a robust concentration calculation procedure. Conventional quantitative methods developed for bulk samples generate elemental concentrations considerably in error when applied to microparticles. A new methodology for particle microanalysis, combining the use of boron as the substrate material and a reverse Monte Carlo quantitative program, was tested on standard particles ranging from 0.25 to 10 microm. We demonstrate that the quantitative determination of low-Z elements in microparticles is achievable and that highly accurate results can be obtained using the automatic data processing described here compared to conventional methods.

  17. Nondestructive Testing Magnetic Particle RQA/M1-5330.11.

    ERIC Educational Resources Information Center

    National Aeronautics and Space Administration, Huntsville, AL. George C. Marshall Space Flight Center.

    As one in the series of programmed instruction handbooks, prepared by the U. S. space program, home study material is presented in this volume concerning familiarization and orientation on magnetic particle properties. The subject is presented under the following headings: Magnetism, Producing a Magnetic Field, Magnetizing Currents, Materials and…

  18. μ-PIV measurements of the ensemble flow fields surrounding a migrating semi-infinite bubble.

    PubMed

    Yamaguchi, Eiichiro; Smith, Bradford J; Gaver, Donald P

    2009-08-01

    Microscale particle image velocimetry (μ-PIV) measurements of the ensemble flow fields surrounding a steadily migrating semi-infinite bubble were obtained through the novel adaptation of a computer-controlled linear-motor flow control system. The system was programmed to generate a square-wave velocity input in order to produce accurate, constant bubble propagation repeatedly and effectively through a fused glass capillary tube. We present a novel technique for re-positioning the coordinate axis to the bubble-tip frame of reference in each instantaneous field by analyzing the sudden change in the standard deviation of centerline velocity profiles across the bubble interface. Ensemble averages were then computed in this bubble-tip frame of reference. Combined fluid systems of water/air, glycerol/air, and glycerol/Si-oil were used to investigate flows comparable to the computational simulations described in Smith and Gaver (2008) and to past experimental observations of interfacial shape. Fluorescent particle images were also analyzed to measure the residual film thickness trailing behind the bubble. The flow fields and film thickness agree very well with the computational simulations as well as with existing experimental and analytical results. Particle accumulation and migration associated with the flow patterns near the bubble tip after long experimental durations are discussed as potential sources of error in the experimental method.
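
    The re-referencing idea can be sketched as follows: locate the bubble tip from the jump in the standard deviation of the centreline velocity across the interface, then shift the streamwise coordinate so the tip sits at the origin before ensemble averaging. The profile and threshold below are synthetic placeholders, not the authors' data or code.

```python
import numpy as np

# Hedged sketch of the re-referencing step described in the abstract: find the first
# streamwise index where the centreline-velocity standard deviation exceeds a threshold
# (taken as the bubble interface), then express coordinates in the bubble-tip frame.
def bubble_tip_index(u_centerline_std, threshold):
    """Return the first index where the centreline-velocity std exceeds the threshold."""
    above = np.nonzero(u_centerline_std > threshold)[0]
    return int(above[0]) if above.size else None

x = np.linspace(0.0, 10.0, 200)                 # streamwise coordinate, mm (synthetic)
std_profile = np.where(x < 6.0, 0.02, 0.45)     # low std inside the bubble, high outside
tip = bubble_tip_index(std_profile, threshold=0.2)
x_tip_frame = x - x[tip]                        # coordinates in the bubble-tip frame
print(x[tip], x_tip_frame[:3])
```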

  19. Computer programs for computing particle-size statistics of fluvial sediments

    USGS Publications Warehouse

    Stevens, H.H.; Hubbell, D.W.

    1986-01-01

    Two versions of computer programs for inputting data and computing particle-size statistics of fluvial sediments are presented. The FORTRAN 77 versions are for use on the Prime computer, and the BASIC versions are for use on microcomputers. The size-statistics programs compute Inman, Trask, and Folk statistical parameters from phi values and sizes determined for 10 specified percent-finer values from the input size and percent-finer data. The programs also determine the percentages of gravel, sand, silt, and clay, and the Meyer-Peter effective diameter. Documentation and listings for both versions of the programs are included. (Author's abstract)
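
    The graphic statistics these programs compute are defined from phi values read at fixed percent-finer points. Below is a minimal sketch of the Inman and Folk & Ward parameters using standard textbook formulas rather than the published FORTRAN or BASIC listings; the sieve data are illustrative only.

```python
import numpy as np

# Hedged sketch of graphic particle-size statistics of the kind the USGS programs
# compute, from phi values interpolated at standard percentiles of the cumulative
# percent-finer curve. Only Inman and Folk & Ward parameters are shown.
def phi_percentile(sizes_mm, percent_finer, p):
    """Interpolate the phi value (phi = -log2(d_mm)) at cumulative percent-finer p."""
    phi = -np.log2(np.asarray(sizes_mm, dtype=float))
    order = np.argsort(percent_finer)
    return np.interp(p, np.asarray(percent_finer)[order], phi[order])

def graphic_statistics(sizes_mm, percent_finer):
    p5, p16, p50, p84, p95 = (phi_percentile(sizes_mm, percent_finer, p)
                              for p in (5, 16, 50, 84, 95))
    return {
        "inman_median_phi": p50,
        "inman_mean_phi": 0.5 * (p16 + p84),
        "folk_graphic_mean_phi": (p16 + p50 + p84) / 3.0,
        "folk_inclusive_sorting_phi": (p84 - p16) / 4.0 + (p95 - p5) / 6.6,
    }

# Illustrative sieve data: grain size (mm) and cumulative percent finer than that size.
print(graphic_statistics([2.0, 1.0, 0.5, 0.25, 0.125, 0.0625],
                         [98, 85, 60, 30, 10, 2]))
```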

  20. Magnetic Particle Testing, RQA/M1-5330.16.

    ERIC Educational Resources Information Center

    National Aeronautics and Space Administration, Huntsville, AL. George C. Marshall Space Flight Center.

    As one in the series of classroom training handbooks, prepared by the U.S. space program, instructional material is presented in this volume concerning familiarization and orientation on magnetic particle testing. The subject is divided under the following headings: Introduction, Principles of Magnetic Particle Testing, Magnetic Particle Test…

  1. Real-World Vehicle Emissions: A Summary of the 18th Coordinating Research Council On-Road Vehicle Emissions Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cadle, S. H.; Ayala, A.; Black, K. N.

    2009-02-01

    The Coordinating Research Council (CRC) convened its 18th On-Road Vehicle Emissions Workshop March 31-April 2, 2008, with 104 presentations describing the most recent mobile source-related emissions research. In this paper we summarize the presentations from researchers whose efforts are improving our understanding of the contribution of mobile sources to air quality. Participants in the workshop discussed emission models and emissions inventories, results from gas- and particle-phase emissions studies from spark-ignition and diesel-powered vehicles (with an emphasis in this workshop on particle emissions), effects of fuels on emissions, evaluation of in-use emission-control programs, and efforts to improve our capabilities in performing on-board emissions measurements, as well as topics for future research.

  2. Effects of Initial Particle Distribution on an Energetic Dispersal of Particles

    NASA Astrophysics Data System (ADS)

    Rollin, Bertrand; Ouellet, Frederick; Koneru, Rahul; Garno, Joshua; Durant, Bradford

    2017-11-01

    Accurate prediction of the late-time solid particle cloud distribution ensuing from an explosive dispersal of particles is an extremely challenging problem for compressible multiphase flow simulations. The source of this difficulty is twofold: (i) the complex sequence of events taking place. Indeed, as the blast wave crosses the surrounding layer of particles, compaction occurs shortly before the particles disperse radially at high speed; then, during the dispersion phase, complex multiphase interactions occur between particles and detonation products. (ii) Precise characterization of the explosive and particle distribution is virtually impossible. In this numerical experiment, we focus on the sensitivity of the late-time particle cloud distribution to carefully designed initial distributions, assuming the explosive is well described. Using point-particle simulations, we study the case of a bed of glass particles surrounding an explosive. Constraining our simulations to relatively low initial volume fractions to avoid reaching the close-packing limit, we seek to describe qualitatively and quantitatively the late-time dependence of a solid particle cloud on its distribution before the energy release of the explosive. This work was supported by the U.S. DoE, NNSA, Advanced Simulation and Computing Program, as a Cooperative Agreement under the Predictive Science Academic Alliance Program, under Contract No. DE-NA0002378.

  3. Gas and particle motions in a rapidly decompressed flow

    NASA Astrophysics Data System (ADS)

    Johnson, Blair; Zunino, Heather; Adrian, Ronald; Clarke, Amanda

    2017-11-01

    To understand the behavior of a rapidly decompressed particle bed in response to a shock, an experimental study is performed on a densely packed (packing fraction ρ = 61%) particle bed in a cylindrical (D = 4.1 cm) glass vertical shock tube. The bed is composed of spherical glass particles, ranging from D50 = 44-297 μm between experiments. High-speed pressure sensors are incorporated to capture shock speeds and strengths. High-speed video and particle image velocimetry (PIV) measurements are collected to examine vertical and radial velocities of both the particles and the gas, elucidating features of the shock wave and the resultant expansion wave in the lateral center of the tube, away from boundaries. In addition to optically analyzing the front velocity of the rising particle bed, the interaction between the particle and gas phases is investigated as the flow accelerates and the particle front becomes more dilute. Particle and gas interactions are also considered in exploring the mechanisms through which turbulence develops in the flow. This work is supported by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program, as a Cooperative Agreement under the Predictive Science and Academic Alliance Program, under Contract No. DE-NA0002378.

  4. GASP cloud- and particle-encounter statistics and their application to LFC aircraft studies. Volume 2: Appendixes

    NASA Technical Reports Server (NTRS)

    Jasperson, W. H.; Nastron, G. D.; Davis, R. E.; Holdeman, J. D.

    1984-01-01

    Summary studies are presented for the entire cloud-observation archive from the NASA Global Atmospheric Sampling Program (GASP), along with studies of GASP particle-concentration data gathered concurrently with the cloud observations. Clouds are encountered in about 15 percent of the data samples overall, but the probability of cloud encounter is shown to vary significantly with altitude, latitude, and distance from the tropopause. Several meteorological circulation features are apparent in the latitudinal distribution of cloud cover, and the cloud-encounter statistics are shown to be consistent with the classical mid-latitude cyclone model. Observations of clouds spaced more closely than 90 minutes are shown to be statistically dependent. The statistics for cloud and particle encounter are utilized to estimate the frequency of cloud encounter on long-range airline routes, and to assess the probability and extent of laminar-flow loss due to cloud or particle encounter by aircraft utilizing laminar flow control (LFC). It is shown that the probability of extended cloud encounter is too low, of itself, to make LFC impractical. This report is presented in two volumes. Volume I contains the narrative, analysis, and conclusions. Volume II contains five supporting appendixes.

  5. Natural user interface as a supplement of the holographic Raman tweezers

    NASA Astrophysics Data System (ADS)

    Tomori, Zoltan; Kanka, Jan; Kesa, Peter; Jakl, Petr; Sery, Mojmir; Bernatova, Silvie; Antalik, Marian; Zemánek, Pavel

    2014-09-01

    Holographic Raman tweezers (HRT) manipulate microobjects by controlling the positions of multiple optical traps via a mouse or joystick. Several attempts have appeared recently to exploit touch tablets, 2D cameras, or the Kinect game console instead. We propose a multimodal "Natural User Interface" (NUI) approach integrating hand tracking, gesture recognition, eye tracking, and speech recognition. For this purpose we exploit the low-cost "Leap Motion" and "MyGaze" sensors and a simple speech recognition program, "Tazti". We developed our own NUI software which processes the signals from the sensors and sends control commands to the HRT, which subsequently controls the positions of the trapping beams, the micropositioning stage, and the Raman spectra acquisition system. The system allows various modes of operation suited to specific tasks. Virtual tools (called "pin" and "tweezers") serving for the manipulation of particles are displayed on a transparent "overlay" window above the live camera image. The eye tracker identifies the position of the observed particle and uses it for autofocus. Laser trap manipulation navigated by the dominant hand can be combined with gesture recognition of the secondary hand, and speech command recognition is useful when both hands are busy. The proposed methods make manual control of HRT more efficient and also provide a good platform for its future semi-automated and fully automated operation.

  6. Steric stabilization of nonaqueous silicon slips. I - Control of particle agglomeration and packing. II - Pressure casting of powder compacts

    NASA Technical Reports Server (NTRS)

    Kerkar, Awdhoot V.; Henderson, Robert J. M.; Feke, Donald L.

    1990-01-01

    The application of steric stabilization to control particle agglomeration and packing of silicon powder in benzene and trichloroethylene is reported. The results provide useful guidelines for controlling unfavorable particle-particle interactions during nonaqueous processing of silicon-based ceramic materials. The application of steric stabilization to the control and improvement of green processing of nonaqueous silicon slips in pressure consolidation is also demonstrated.

  7. Fabrication, Characterization, and Biological Activity of Avermectin Nano-delivery Systems with Different Particle Sizes

    NASA Astrophysics Data System (ADS)

    Wang, Anqi; Wang, Yan; Sun, Changjiao; Wang, Chunxin; Cui, Bo; Zhao, Xiang; Zeng, Zhanghua; Yao, Junwei; Yang, Dongsheng; Liu, Guoqiang; Cui, Haixin

    2018-01-01

    Nano-delivery systems for the active ingredients of pesticides can improve the utilization rates of pesticides and prolong their control effects, owing to the nanocarrier envelope and controlled-release function. However, the particles containing active ingredients in controlled-release pesticide formulations are generally large and have wide size distributions, and there have been limited studies on the effect of particle size on the controlled-release properties and biological activities of pesticide delivery systems. In the current study, avermectin (Av) nano-delivery systems were constructed with different particle sizes and their performance was evaluated. The Av release rate in the nano-delivery system could be effectively controlled by changing the particle size, and the biological activity increased with decreasing particle size. These results suggest that Av nano-delivery systems can significantly improve controllable release, photostability, and biological activity, which will improve efficiency and reduce pesticide residues.

  8. DNA-controlled assembly of a NaTl lattice structure from gold nanoparticles and protein nanoparticles

    NASA Astrophysics Data System (ADS)

    Cigler, Petr; Lytton-Jean, Abigail K. R.; Anderson, Daniel G.; Finn, M. G.; Park, Sung Yong

    2010-11-01

    The formation of diamond structures from tailorable building blocks is an important goal in colloidal crystallization because the non-compact diamond lattice is an essential component of photonic crystals for the visible-light range. However, designing nanoparticle systems that self-assemble into non-compact structures has proved difficult. Although several methods have been proposed, single-component nanoparticle assembly of a diamond structure has not been reported. Binary systems, in which at least one component is arranged in a diamond lattice, provide alternatives, but control of interparticle interactions is critical to this approach. DNA has been used for this purpose in a number of systems. Here we show the creation of a non-compact lattice by DNA-programmed crystallization using surface-modified Qβ phage capsid particles and gold nanoparticles, engineered to have similar effective radii. When combined with the proper connecting oligonucleotides, these components form NaTl-type colloidal crystalline structures containing interpenetrating organic and inorganic diamond lattices, as determined by small-angle X-ray scattering. DNA control of assembly is therefore shown to be compatible with particles possessing very different properties, as long as they are amenable to surface modification.

  9. Final Environmental Impact Statement (EIS) for the Space Nuclear Thermal Propulsion (SNTP) program

    NASA Astrophysics Data System (ADS)

    1991-09-01

    A program has been proposed to develop the technology and demonstrate the feasibility of a high-temperature particle bed reactor (PBR) propulsion system to be used to power an advanced second stage nuclear rocket engine. The purpose of this Final Environmental Impact Statement (FEIS) is to assess the potential environmental impacts of component development and testing, construction of ground test facilities, and ground testing. Major issues and goals of the program include the achievement and control of predicted nuclear power levels; the development of materials that can withstand the extremely high operating temperatures and hydrogen flow environments; and the reliable control of cryogenic hydrogen and hot gaseous hydrogen propellant. The testing process is designed to minimize radiation exposure to the environment. Environmental impact and mitigation planning are included for the following areas of concern: (1) Population and economy; (2) Land use and infrastructure; (3) Noise; (4) Cultural resources; (5) Safety (non-nuclear); (6) Waste; (7) Topography; (8) Geology; (9) Seismic activity; (10) Water resources; (11) Meteorology/Air quality; (12) Biological resources; (13) Radiological normal operations; (14) Radiological accidents; (15) Soils; and (16) Wildlife habitats.

  10. Shock Interaction with Random Spherical Particle Beds

    NASA Astrophysics Data System (ADS)

    Neal, Chris; Mehta, Yash; Salari, Kambiz; Jackson, Thomas L.; Balachandar, S. "Bala"; Thakur, Siddharth

    2016-11-01

    In this talk we present results from fully resolved simulations of shock interaction with a randomly distributed bed of particles. Multiple simulations were carried out by varying the number of particles to isolate the effect of volume fraction. The major focus of these simulations was to understand (1) the effect of the shock wave and volume fraction on the forces experienced by the particles, (2) the effect of the particles on the shock wave, and (3) fluid-mediated particle-particle interactions. The peak drag force on particles at different volume fractions shows a downward trend with increasing depth into the bed, which can be attributed to dissipation of energy as the shock wave travels through the bed of particles. One of the fascinating observations from these simulations was the fluctuation in different quantities due to the presence of multiple particles and their random distribution. These are large simulations with hundreds of particles resulting in a large amount of data; we present a statistical analysis of the data and make relevant observations. The average pressure in the computational domain is computed to characterize the strengths of the reflected and transmitted waves. We also present flow-field contour plots to support our observations. U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program, as a Cooperative Agreement under the Predictive Science Academic Alliance Program, under Contract No. DE-NA0002378.

  11. Particle monitoring and control in vacuum processing equipment

    NASA Astrophysics Data System (ADS)

    Borden, Peter G., Dr.; Gregg, John

    1989-10-01

    Particle contamination during vacuum processes has emerged as the largest single source of yield loss in VLSI manufacturing. While a number of tools have been available to help understand the sources and nature of this contamination, only recently has it become possible to monitor free-particle levels within vacuum equipment in real time. As a result, a better picture is available of how particle contamination can affect a variety of processes. This paper reviews some of the work that has been done to monitor particles in vacuum loadlocks and in processes such as etching, sputtering, and ion implantation. The aim has been to make free particles in vacuum equipment a measurable process parameter; achieving this allows particles to be controlled using statistical process control. It will be shown that free-particle levels in load locks correlate with wafer surface counts, device yield, and process conditions, but that these levels are considerably higher during production than when dummy wafers are run to qualify a system. It will also be shown how real-time free-particle monitoring can be used to monitor and control cleaning cycles, how major episodic events can be detected, and how data can be gathered in a format suitable for statistical process control.
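
    Treating free-particle counts as a process parameter suggests a simple control-chart check of the kind alluded to above. The sketch below applies Poisson-based c-chart limits to made-up per-cycle counts; it illustrates the statistical-process-control idea only and is not the monitoring system described in the paper.

```python
import numpy as np

# Hedged sketch: a c-chart flags pump-down or cleaning cycles whose free-particle count
# exceeds the upper control limit derived from a baseline period. Counts are synthetic.
baseline = np.array([3, 5, 2, 4, 6, 3, 4, 5, 2, 3])   # counts from known-good cycles
c_bar = baseline.mean()
ucl = c_bar + 3.0 * np.sqrt(c_bar)                    # Poisson-based 3-sigma limit

new_counts = [4, 6, 3, 19, 5]
for cycle, count in enumerate(new_counts, start=1):
    flag = "OUT OF CONTROL" if count > ucl else "ok"
    print(f"cycle {cycle}: {count} particles ({flag})")
```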

  12. Particle Mass in Deep-Water Benthic Nepheloid Layers: a Global Synthesis

    NASA Astrophysics Data System (ADS)

    Mishonov, A. V.; Gardner, W. D.; Richardson, M. J.

    2016-12-01

    The mass of particles in benthic nepheloid layers in the deep ocean is mapped using profiles of beam attenuation coefficient obtained with transmissometers interfaced with CTDs during WOCE, SAVE, JGOFS, CLIVAR-Repeat Hydrography, and other programs during the last four decades using data from over 8000 profiles from >70 cruises. We map the maximum concentration of particle mass near the seafloor and integrate the particle mass throughout the benthic nepheloid layer. In the Atlantic Ocean particle mass is greater in areas where eddy kinetic energy is high in overlying waters. Areas of high bottom particle concentrations and integrated benthic nepheloid layer particle loads include the western North Atlantic beneath the Gulf Stream meanders and eddies, Argentine Basin, parts of the Southern Ocean and areas around South Africa. Particle concentrations are low in most of the Pacific and tropical and subtropical Atlantic away from margins. This synthesis is useful for GEOTRACES and other global programs where knowing particle distribution is critical for understanding trace metal absorption, sediment-water exchange and near-bottom processes. Additionally, our synthesis provides baseline data to identify where mining of metal-rich nodules and metal sulfides on the seafloor may impact the benthic environment.

  13. Simulations of Shock Wave Interaction with a Particle Cloud

    NASA Astrophysics Data System (ADS)

    Koneru, Rahul; Rollin, Bertrand; Ouellet, Frederick; Annamalai, Subramanian; Balachandar, S.'Bala'

    2016-11-01

    Simulations of a shock wave interacting with a cloud of particles are performed in an attempt to understand similar phenomena observed in the dispersal of solid particles under such extreme environments as an explosion. We conduct numerical experiments in which a particle curtain fills only 87% of the shock tube from bottom to top; upon interaction with the shock wave, the particle curtain is therefore expected to experience Kelvin-Helmholtz (KH) and Richtmyer-Meshkov (RM) instabilities. In this study, the initial volume-fraction profile matches that of the Sandia Multiphase Shock Tube experiments, and the shock Mach number is limited to M = 1.66. In these simulations we use an Eulerian-Lagrangian approach along with state-of-the-art point-particle force and heat transfer models. Measurements of particle dispersion are made at different initial volume fractions of the particle cloud. A detailed analysis of the evolution of the particle curtain with respect to the initial conditions is presented. This work was supported by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program, as a Cooperative Agreement under the Predictive Science Academic Alliance Program, Contract No. DE-NA0002378.

  14. MC-TESTER: a universal tool for comparisons of Monte Carlo predictions for particle decays in high energy physics

    NASA Astrophysics Data System (ADS)

    Golonka, P.; Pierzchała, T.; Wąs, Z.

    2004-02-01

    Theoretical predictions in high energy physics are routinely provided in the form of Monte Carlo generators. Comparisons of predictions from different programs and/or different initialization set-ups are often necessary. MC-TESTER can be used for such tests of decays of intermediate states (particles or resonances) in a semi-automated way. Our test consists of two steps. Different Monte Carlo programs are run; events with decays of a chosen particle are searched for, decay trees are analyzed, and the appropriate information is stored. Then, at the analysis step, a list of all found decay modes is defined and branching ratios are calculated for both runs. Histograms of all scalar Lorentz-invariant masses constructed from the decay products are plotted and compared for each decay mode found in both runs. For each plot a measure of the difference of the distributions is calculated, and its maximal value over all histograms for each decay channel is printed in a summary table. As an example of an MC-TESTER application, we include a test with the τ lepton decay Monte Carlo generators TAUOLA and PYTHIA. The HEPEVT (or LUJETS) common block is used as the exclusive source of information on the generated events.
    Program summary:
    Title of the program: MC-TESTER, version 1.1
    Catalogue identifier: ADSM
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADSM
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Computer: PC, two Intel Xeon 2.0 GHz processors, 512 MB RAM
    Operating system: Linux Red Hat 6.1, 7.2, and also 8.0
    Programming language used: C++, FORTRAN77; gcc 2.96 or 2.95.2 (also 3.2) compiler suite with g++ and g77
    Size of the package: 7.3 MB directory including example programs (2 MB compressed distribution archive), without ROOT libraries (additional 43 MB)
    No. of bytes in distributed program, including test data, etc.: 2 024 425
    Distribution format: tar gzip file
    Additional disk space required: depends on the analyzed particle; 40 MB in the case of τ lepton decays (30 decay channels, 594 histograms, 82-page booklet)
    Keywords: particle physics, decay simulation, Monte Carlo methods, invariant mass distributions, programs comparison
    Nature of the physical problem: The decays of individual particles are well-defined modules of a typical Monte Carlo program chain in high energy physics. A fast, semi-automatic way of comparing results from different programs is often desirable for the development of new programs, to check the correctness of installations, or for discussion of uncertainties.
    Method of solution: A typical HEP Monte Carlo program stores the generated events in an event record such as HEPEVT or PYJETS. MC-TESTER scans, event by event, the contents of the record and searches for the decays of the particle under study. The list of found decay modes is successively incremented, and histograms of all invariant masses which can be calculated from the momenta of the particle decay products are defined and filled. The outputs from the two runs of distinct programs can later be compared. A booklet of comparisons is created: for every decay channel, all histograms present in the two outputs are plotted and a parameter quantifying the shape difference is calculated. Its maximum over every decay channel is printed in the summary table.
    Restrictions on the complexity of the problem: For a list of limitations see Section 6.
    Typical running time: Varies substantially with the analyzed decay particle. On a PC/Linux with 2.0 GHz processors, MC-TESTER increases the run time of the τ-lepton Monte Carlo program TAUOLA by 4.0 seconds for every 100 000 analyzed events (generation itself takes 26 seconds). The analysis step takes 13 seconds; further processing takes an additional 10 seconds. Generation-step runs may be executed simultaneously on multi-processor machines.
    Accessibility: web page: http://cern.ch/Piotr.Golonka/MC/MC-TESTER; e-mails: Piotr.Golonka@CERN.CH, T.Pierzchala@friend.phys.us.edu.pl, Zbigniew.Was@CERN.CH

  15. MPPhys—A many-particle simulation package for computational physics education

    NASA Astrophysics Data System (ADS)

    Müller, Thomas

    2014-03-01

    In a first course on classical mechanics, elementary physical processes like elastic two-body collisions, the mass-spring model, or the gravitational two-body problem are discussed in detail. The continuation to many-body systems, however, is deferred to graduate courses, although the underlying equations of motion are essentially the same and although there is strong motivation, for high-school students in particular, because of the use of particle systems in computer games. The missing link between the simple and the more complex problem is a basic introduction to solving the equations of motion numerically, which could be illustrated by means of the Euler method (a minimal numerical sketch follows the program summary below). The many-particle physics simulation package MPPhys offers a platform to experiment with simple particle simulations. The aim is to give a general idea of how to implement many-particle simulations and how simulation and visualization can be combined for interactive visual explorations.
    Program summary:
    Catalogue identifier: AERR_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERR_v1_0.html
    Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 111327
    No. of bytes in distributed program, including test data, etc.: 608411
    Distribution format: tar.gz
    Programming language: C++, OpenGL, GLSL, OpenCL
    Computer: Linux and Windows platforms with OpenGL support
    Operating system: Linux and Windows
    RAM: source code 4.5 MB; complete package 242 MB
    Classification: 14, 16.9
    External routines: OpenGL, OpenCL
    Nature of problem: integrate N-body simulations and mass-spring models
    Solution method: numerical integration of N-body simulations, 3D rendering via OpenGL
    Running time: problem dependent
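
    As a concrete example of the "missing link" mentioned in the abstract, the sketch below integrates a small mass-spring chain with the explicit Euler method. It is written in Python for brevity and is not part of the MPPhys package (which is C++/OpenGL/OpenCL).

```python
import numpy as np

# Hedged sketch: explicit Euler integration of a small free-free mass-spring chain,
# the kind of minimal many-particle integrator a first course could build on.
n, k, m, dt, steps = 5, 10.0, 1.0, 0.001, 5000
x = np.arange(n, dtype=float)          # rest positions at unit spacing
x[0] -= 0.3                            # compress the first spring to start the motion
v = np.zeros(n)

for _ in range(steps):
    stretch = np.diff(x) - 1.0         # extension of each spring beyond rest length 1
    f = np.zeros(n)
    f[:-1] += k * stretch              # a stretched spring pulls its left mass right
    f[1:] -= k * stretch               # and its right mass left
    x, v = x + dt * v, v + dt * f / m  # explicit Euler update of positions, velocities

print(np.round(x, 3))
```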

  16. Design of prototype charged particle fog dispersal unit

    NASA Technical Reports Server (NTRS)

    Collins, F. G.; Frost, W.; Kessel, P.

    1981-01-01

    The unit was designed to be easily modified so that certain features that influence the output current and particle size distribution could be examined. An experimental program was designed to measure the performance of the unit. The program described includes measurements in a fog chamber and in the field. Features of the nozzle and estimated nozzle characteristics are presented.

  17. Effect of particle momentum transfer on an oblique-shock-wave/laminar-boundary-layer interaction

    NASA Astrophysics Data System (ADS)

    Teh, E.-J.; Johansen, C. T.

    2016-11-01

    Numerical simulations of solid particles seeded into a supersonic flow containing an oblique shock wave reflection were performed. The momentum transfer mechanism between solid and gas phases in the shock-wave/boundary-layer interaction was studied by varying the particle size and mass loading. It was discovered that solid particles were capable of significant modulation of the flow field, including suppression of flow separation. The particle size controlled the rate of momentum transfer while the particle mass loading controlled the magnitude of momentum transfer. The seeding of micro- and nano-sized particles upstream of a supersonic/hypersonic air-breathing propulsion system is proposed as a flow control concept.

  18. "Smart pebble" designs for sediment transport monitoring

    NASA Astrophysics Data System (ADS)

    Valyrakis, Manousos; Alexakis, Athanasios; Pavlovskis, Edgars

    2015-04-01

    Sediment transport, due primarily to the action of water, wind and ice, is one of the most significant geomorphic processes responsible for shaping Earth's surface. It involves entrainment of sediment grains in rivers and estuaries due to the violently fluctuating hydrodynamic forces near the bed. Here an instrumented particle, namely a "smart pebble", is developed to investigate the exact flow conditions under which individual grains may be entrained from the surface of a gravel bed. This could lead to a better understanding of the processes involved, focusing on the response of the particle during a variety of flow entrainment events. The "smart pebble" is a particle instrumented with MEMS sensors appropriate for capturing the hydrodynamic forces a coarse particle might experience during its entrainment from the river bed. A 3-axial gyroscope and accelerometer register data to a memory card via a microcontroller, embedded in a 3D-printed waterproof hollow spherical particle. The instrumented board is appropriately fitted and centred inside the shell of the pebble, so as to achieve a nearly uniform distribution of mass which could otherwise bias its motion. The "smart pebble" is powered by an independent power supply to ensure autonomy and sufficiently long periods of operation appropriate for deployment in the field. Post-processing and analysis of the acquired data are currently performed offline, using scientific programming software. The performance of the instrumented particle is validated by conducting a series of calibration experiments under well-controlled laboratory conditions.
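
    The following is a hedged Python sketch of the kind of offline post-processing described above: flagging candidate entrainment events where the measured acceleration magnitude departs from gravity by more than a threshold. The data layout, threshold, and synthetic record are hypothetical and not the actual device format or analysis workflow.

```python
# Hypothetical post-processing of logged "smart pebble" accelerometer data.
import numpy as np

def detect_events(t, acc, g=9.81, threshold=2.0, min_gap=0.5):
    """Return start times of events where | |a| - g | exceeds threshold (m/s^2),
    merging detections closer together than min_gap seconds."""
    mag = np.linalg.norm(acc, axis=1)
    hits = np.where(np.abs(mag - g) > threshold)[0]
    events = []
    for i in hits:
        if not events or t[i] - events[-1] > min_gap:
            events.append(t[i])
    return events

# Synthetic 100 Hz record: resting pebble with one short jolt at t = 5 s
t = np.arange(0, 10, 0.01)
acc = np.tile([0.0, 0.0, 9.81], (len(t), 1))
acc[500:505, 0] += 6.0                      # brief streamwise impulse
print("candidate entrainment events at t =", detect_events(t, acc))
```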

  19. Final Report for Geometric Observers and Particle Filtering for Controlled Active Vision

    DTIC Science & Technology

    2016-12-15

    Final report (reporting period 01 Sep 2006 - 09 May 2011; dated 15-12-2016): Geometric Observers and Particle Filtering for Controlled Active Vision, by Allen R. Tannenbaum, School of Electrical and Computer Engineering, Georgia Institute of Technology (report 49414-NS.1). The recoverable table-of-contents fragments list sections on conformal area minimizing flows and particle filters.

  20. The Control Unit of KM3NeT data acquisition

    NASA Astrophysics Data System (ADS)

    Bozza, Cristiano

    2016-04-01

    The KM3NeT Collaboration is building a new generation of neutrino telescopes in the Mediterranean Sea. With the telescopes, scientists will search for cosmic neutrinos to study highly energetic objects in the Universe, while one neutrino detector will be dedicated to measure the properties of the high-energy neutrino particles themselves. Control of the KM3NeT data acquisition processes is handled by the KM3NeT Control Unit, which has been designed to maximise the detector live time. The Control Unit features software programs with different roles, following the philosophy of having no single point of failure. While all programs are interconnected, each one can also work alone for most of the time in case other services are unavailable. All services run on the Common Language Runtime, which ensures portability, flexibility and automatic memory management. Each service has an embedded Web server, providing a user interface as well as programmatic access to data and functions. Data to and from detector components for monitoring and management purposes are transmitted using a custom designed protocol. The Control Unit is interfaced to one or more Message Dispatchers to control the data acquisition chain. A Data Base Interface provides fast and fault-tolerant connection to a remote Data Base.

  1. SHAREv2: fluctuations and a comprehensive treatment of decay feed-down

    NASA Astrophysics Data System (ADS)

    Torrieri, G.; Jeon, S.; Letessier, J.; Rafelski, J.

    2006-11-01

    This is the user's manual for SHARE version 2. SHARE [G. Torrieri, S. Steinke, W. Broniowski, W. Florkowski, J. Letessier, J. Rafelski, Comput. Phys. Comm. 167 (2005) 229] (Statistical Hadronization with Resonances) is a collection of programs designed for the statistical analysis of particle production in relativistic heavy-ion collisions. While the structure of the program remains similar to v1.x, v2 provides several new features such as evaluation of statistical fluctuations of particle yields, and a greater versatility, in particular regarding decay feed-down and input/output structure. This article describes all the new features, with emphasis on statistical fluctuations. Program summary Title of program: SHAREv2 Catalogue identifier: ADVD_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVD_v2_0 Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Computer: PC, Pentium III, 512 MB RAM; not hardware dependent Operating system: Linux: RedHat 6.1, 7.2, FEDORA, etc.; not system dependent Programming language: FORTRAN77 Size of the package: 167 KB directory, without libraries (see http://wwwasdoc.web.cern.ch/wwwasdoc/minuit/minmain.html, http://wwwasd.web.cern.ch/wwwasd/cernlib.html for details on library requirements) Number of lines in distributed program, including test data, etc.: 26 101 Number of bytes in distributed program, including test data, etc.: 170 346 Distribution format: tar.gzip file Computer: Any computer with an f77 compiler Nature of the physical problem: Event-by-event fluctuations have been recognized to be the physical observable capable of constraining particle production models. Therefore, consideration of event-by-event fluctuations is required for a decisive falsification or constraining of (variants of) particle production models based on (grand-, micro-) canonical statistical mechanics phase space, the so-called statistical hadronization models (SHM). As in the case of particle yields, to properly compare model calculations to data it is necessary to consistently take into account resonance decays. However, event-by-event fluctuations are more sensitive than particle yields to experimental acceptance issues, and a range of techniques needs to be implemented to extract 'physical' fluctuations from an experimental event-by-event measurement. Method of solving the problem: The techniques used within the SHARE suite of programs [G. Torrieri, S. Steinke, W. Broniowski, W. Florkowski, J. Letessier, J. Rafelski, Comput. Phys. Comm. 167 (2005) 229; SHAREv1] are updated and extended to fluctuations. A full particle data-table, decay tree, and set of experimental feed-down coefficients are provided. Unlike SHAREv1.x, experimental acceptance feed-down coefficients can be entered for any resonance decay. SHAREv2 can calculate yields, fluctuations, and bulk properties of the fireball from provided thermal parameters; alternatively, parameters can be obtained from fits to experimental data, via the MINUIT fitting algorithm [F. James, M. Roos, Comput. Phys. Comm. 10 (1975) 343]. Fits can also be analyzed for significance, parameter and data point sensitivity. Averages and fluctuations at freeze-out of both the stable particles and the hadronic resonances are set according to a statistical prescription, calculated via a series of Bessel functions, using CERN library programs. We also have the option of including finite particle widths of the resonances. 
A χ² minimization algorithm, also from the CERN library programs, is used to perform and analyze the fit. Please see SHAREv1 for more details on these. Purpose: The vast amount of high quality soft hadron production data, from experiments running at the SPS, RHIC, in the past at the AGS, and in the near future at the LHC, offers the opportunity for statistical particle production model falsification. This task has turned out to be difficult when considering solely particle yields addressed in the context of SHAREv1.x. For this reason physical conditions at freeze-out remain contested. Inclusion in the analysis of event-by-event fluctuations appears to resolve this issue. Similarly, a thorough analysis including both fluctuations and average multiplicities gives a way to explore the presence and strength of interactions following hadronization (when hadrons form), ending with thermal freeze-out (when all interactions cease). SHAREv2 with fluctuations will also help determine which statistical ensemble (if any), e.g., canonical or grand-canonical, is more physically appropriate for analyzing a given system. Together with resonances, fluctuations can also be used for a direct estimate of the extent to which the system re-interacts between chemical and thermal freeze-out. We hope and expect that SHAREv2 will contribute to deciding whether any of the statistical hadronization model variants has a genuine physical connection to hadron particle production. Computation time survey: We encounter, in the FORTRAN version, computation times of up to seconds for the evaluation of particle yields. These rise by up to a factor of 300 in the process of minimization, and by a further factor of a few when χ²/N profiles and contours with chemical non-equilibrium are requested. Summary of new features (w.r.t. SHAREv1.x) Fluctuations: In addition to particle yields, ratios and bulk quantities, SHAREv2 can calculate, fit and analyze statistical fluctuations of particles and particle ratios. Decays: SHAREv2 has the flexibility to account for any experimental method of allowing for decay feed-downs to the particle yields. Charm flavor: Charmed particles have been added to the decay tree, allowing, as an option, the study of statistical hadronization of J/ψ, χ, D, etc. Quark chemistry: Chemical non-equilibrium yields for both u and d flavors, as opposed to generically light quarks q, are considered; η-η′ mixing, etc., is properly dealt with, and chemical non-equilibrium can be studied for each flavor separately. Misc: Many new commands and features have been introduced and added to the basic user interface. For example, it is possible to study combinations of particles and their ratios. It is also possible to combine all the input files into one file. SHARE compatibility and manual: This write-up is an update and extension of SHAREv1. The user should consult SHAREv1 regarding the principles of the user interface and for all particle yield related physics and program instructions, other than the parameter additions and minor changes described here. SHAREv2 is downward compatible with respect to the changes of the user interface, offering the user of SHAREv1 computer-generated revised input files compatible with SHAREv2.
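
    As a purely illustrative aside (not SHARE code), the Bessel-function series mentioned above for a primary thermal particle density can be sketched in a few lines of Python. The temperature, mass, and fugacity values are example inputs, and natural units (hbar = c = k_B = 1) are assumed.

```python
# Primary thermal density in the statistical hadronization picture,
# n = (g / 2 pi^2) * sum_k (+/-1)^(k+1) (lambda^k / k) m^2 T K2(k m / T),
# energies in GeV, densities in GeV^3 (illustration only, not SHARE itself).
import numpy as np
from scipy.special import kn   # modified Bessel function of the second kind

def thermal_density(m, g, T, fugacity=1.0, boson=True, kmax=10):
    sign = 1.0 if boson else -1.0
    total = 0.0
    for k in range(1, kmax + 1):
        total += sign ** (k + 1) * fugacity ** k / k * m ** 2 * T * kn(2, k * m / T)
    return g / (2.0 * np.pi ** 2) * total

# Example: pi+ (m = 0.1396 GeV, g = 1) at T = 0.165 GeV, unit fugacity
n_pi = thermal_density(0.1396, g=1, T=0.165, boson=True)
print(f"primary pi+ density ~ {n_pi:.5f} GeV^3 "
      f"(~ {n_pi / 0.1973 ** 3:.3f} fm^-3)")   # 1 GeV = 1/(0.1973 fm)
```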

  2. Neural Networks for Modeling and Control of Particle Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edelen, A. L.; Biedron, S. G.; Chase, B. E.

    Particle accelerators are host to myriad nonlinear and complex physical phenomena. They often involve a multitude of interacting systems, are subject to tight performance demands, and should be able to run for extended periods of time with minimal interruptions. Often times, traditional control techniques cannot fully meet these requirements. One promising avenue is to introduce machine learning and sophisticated control techniques inspired by artificial intelligence, particularly in light of recent theoretical and practical advances in these fields. Within machine learning and artificial intelligence, neural networks are particularly well-suited to modeling, control, and diagnostic analysis of complex, nonlinear, and time-varying systems, as well as systems with large parameter spaces. Consequently, the use of neural network-based modeling and control techniques could be of significant benefit to particle accelerators. For the same reasons, particle accelerators are also ideal test-beds for these techniques. Moreover, many early attempts to apply neural networks to particle accelerators yielded mixed results due to the relative immaturity of the technology for such tasks. The purpose of this paper is to re-introduce neural networks to the particle accelerator community and report on some work in neural network control that is being conducted as part of a dedicated collaboration between Fermilab and Colorado State University (CSU). We also describe some of the challenges of particle accelerator control, highlight recent advances in neural network techniques, discuss some promising avenues for incorporating neural networks into particle accelerator control systems, and describe a neural network-based control system that is being developed for resonance control of an RF electron gun at the Fermilab Accelerator Science and Technology (FAST) facility, including initial experimental results from a benchmark controller.

  3. Neural Networks for Modeling and Control of Particle Accelerators

    NASA Astrophysics Data System (ADS)

    Edelen, A. L.; Biedron, S. G.; Chase, B. E.; Edstrom, D.; Milton, S. V.; Stabile, P.

    2016-04-01

    Particle accelerators are host to myriad nonlinear and complex physical phenomena. They often involve a multitude of interacting systems, are subject to tight performance demands, and should be able to run for extended periods of time with minimal interruptions. Often times, traditional control techniques cannot fully meet these requirements. One promising avenue is to introduce machine learning and sophisticated control techniques inspired by artificial intelligence, particularly in light of recent theoretical and practical advances in these fields. Within machine learning and artificial intelligence, neural networks are particularly well-suited to modeling, control, and diagnostic analysis of complex, nonlinear, and time-varying systems, as well as systems with large parameter spaces. Consequently, the use of neural network-based modeling and control techniques could be of significant benefit to particle accelerators. For the same reasons, particle accelerators are also ideal test-beds for these techniques. Many early attempts to apply neural networks to particle accelerators yielded mixed results due to the relative immaturity of the technology for such tasks. The purpose of this paper is to re-introduce neural networks to the particle accelerator community and report on some work in neural network control that is being conducted as part of a dedicated collaboration between Fermilab and Colorado State University (CSU). We describe some of the challenges of particle accelerator control, highlight recent advances in neural network techniques, discuss some promising avenues for incorporating neural networks into particle accelerator control systems, and describe a neural network-based control system that is being developed for resonance control of an RF electron gun at the Fermilab Accelerator Science and Technology (FAST) facility, including initial experimental results from a benchmark controller.
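
    The following toy Python sketch illustrates the general idea of neural-network-based modeling and control: learn a forward model of a simple scalar plant from data, then choose the action whose predicted next state is closest to the setpoint. It is only a conceptual illustration under assumed toy dynamics, not the FAST resonance-control system described in the paper.

```python
# Toy model-based neural-network control (numpy only; illustration of the concept).
import numpy as np

rng = np.random.default_rng(1)

# Toy plant, unknown to the controller: x_{k+1} = 0.9 x_k + 0.5 u_k + noise
def plant(x, u):
    return 0.9 * x + 0.5 * u + 0.01 * rng.normal()

# One-hidden-layer network: x_next = W2 @ tanh(W1 @ [x, u] + b1) + b2
W1 = rng.normal(scale=0.5, size=(16, 2)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(1, 16)); b2 = np.zeros(1)

def predict(x, u):
    h = np.tanh(W1 @ np.array([x, u]) + b1)
    return (W2 @ h + b2)[0], h

# Train the forward model on random excitation data with plain SGD
lr = 0.01
for _ in range(20000):
    x, u = rng.uniform(-1, 1), rng.uniform(-1, 1)
    target = plant(x, u)
    pred, h = predict(x, u)
    err = pred - target
    W2 -= lr * err * h[None, :]; b2 -= lr * err
    dh = (err * W2[0]) * (1 - h ** 2)
    W1 -= lr * np.outer(dh, [x, u]); b1 -= lr * dh

# Use the learned model to drive the plant toward a setpoint
setpoint, x = 0.5, 0.0
candidates = np.linspace(-1, 1, 201)
for k in range(20):
    u = min(candidates, key=lambda c: abs(predict(x, c)[0] - setpoint))
    x = plant(x, u)
print(f"state after 20 steps: {x:.3f} (setpoint {setpoint})")
```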

  4. Neural Networks for Modeling and Control of Particle Accelerators

    DOE PAGES

    Edelen, A. L.; Biedron, S. G.; Chase, B. E.; ...

    2016-04-01

    Particle accelerators are host to myriad nonlinear and complex physical phenomena. They often involve a multitude of interacting systems, are subject to tight performance demands, and should be able to run for extended periods of time with minimal interruptions. Often times, traditional control techniques cannot fully meet these requirements. One promising avenue is to introduce machine learning and sophisticated control techniques inspired by artificial intelligence, particularly in light of recent theoretical and practical advances in these fields. Within machine learning and artificial intelligence, neural networks are particularly well-suited to modeling, control, and diagnostic analysis of complex, nonlinear, and time-varying systems, as well as systems with large parameter spaces. Consequently, the use of neural network-based modeling and control techniques could be of significant benefit to particle accelerators. For the same reasons, particle accelerators are also ideal test-beds for these techniques. Moreover, many early attempts to apply neural networks to particle accelerators yielded mixed results due to the relative immaturity of the technology for such tasks. The purpose of this paper is to re-introduce neural networks to the particle accelerator community and report on some work in neural network control that is being conducted as part of a dedicated collaboration between Fermilab and Colorado State University (CSU). We also describe some of the challenges of particle accelerator control, highlight recent advances in neural network techniques, discuss some promising avenues for incorporating neural networks into particle accelerator control systems, and describe a neural network-based control system that is being developed for resonance control of an RF electron gun at the Fermilab Accelerator Science and Technology (FAST) facility, including initial experimental results from a benchmark controller.

  5. SEDIDAT: A BASIC program for the collection and statistical analysis of particle settling velocity data

    NASA Astrophysics Data System (ADS)

    Wright, Robyn; Thornberg, Steven M.

    SEDIDAT is a series of compiled IBM-BASIC (version 2.0) programs that direct the collection, statistical calculation, and graphic presentation of particle settling velocity and equivalent spherical diameter for samples analyzed using the settling tube technique. The programs follow a menu-driven format that is understood easily by students and scientists with little previous computer experience. Settling velocity is measured directly (cm/sec) and also converted into Chi units. Equivalent spherical diameter (reported in Phi units) is calculated using a modified Gibbs equation for different particle densities. Input parameters, such as water temperature, settling distance, particle density, run time, and Phi/Chi interval, are changed easily at operator discretion. Optional output to a dot-matrix printer includes a summary of moment and graphic statistical parameters, a tabulation of individual and cumulative weight percents, a listing of major distribution modes, and cumulative and histogram plots of raw time, settling velocity, Chi, and Phi data.
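
    A hedged Python sketch of the unit conversions involved is shown below. It assumes the usual logarithmic scales (Phi = -log2 of the diameter in mm, Chi = -log2 of the settling velocity in cm/s) and substitutes a simple Stokes-law estimate for the modified Gibbs equation used by SEDIDAT, whose coefficients are not reproduced here.

```python
# Settling-velocity and grain-size scale conversions (illustration only).
import numpy as np

def chi_units(w_cm_per_s):
    return -np.log2(w_cm_per_s)

def phi_units(d_mm):
    return -np.log2(d_mm)

def stokes_equivalent_diameter(w_cm_per_s, rho_p=2650.0, rho_f=1000.0,
                               mu=1.0e-3, g=9.81):
    """Equivalent spherical diameter (m) from Stokes' law,
    w = (rho_p - rho_f) g d^2 / (18 mu); valid only at low Reynolds number."""
    w = w_cm_per_s / 100.0                    # convert to m/s
    return np.sqrt(18.0 * mu * w / ((rho_p - rho_f) * g))

w = 0.25                                      # cm/s, a fine-grained example value
d = stokes_equivalent_diameter(w) * 1000.0    # mm
print(f"chi = {chi_units(w):.2f}, d ~ {d:.4f} mm, phi = {phi_units(d):.2f}")
```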

  6. Laser pushing or pulling of absorbing airborne particles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Chuji, E-mail: cw175@msstate.edu; Gong, Zhiyong; Pan, Yong-Le

    2016-07-04

    A single absorbing particle formed by carbon nanotubes in the size range of 10–50 μm is trapped in air by a laser trapping beam and concurrently illuminated by another laser manipulating beam. When the trapping beam is terminated, the movement of the particle controlled by the manipulating beam is investigated. We report our observations of light-controlled pushing and pulling motions. We show that the movement direction has little relationship with the particle size and the manipulating beam's parameters, but is dominated by the particle's orientation and morphology. With this observation, controllable optical manipulation can now be generalized to arbitrary particles, including the irregularly shaped absorbing particles shown in this work.

  7. Evaluation of an Intervention Instructional Program to Facilitate Understanding of Basic Particle Concepts among Students Enrolled in Several Levels of Study

    ERIC Educational Resources Information Center

    Treagust, David F.; Chandrasegaran, A. L.; Zain, Ahmad N. M.; Ong, Eng Tek; Karpudewan, Mageswary; Halim, Lilia

    2011-01-01

    The efficacy of an intervention instructional program was evaluated to facilitate understanding of particle theory concepts among students (N = 190) using a diagnostic instrument consisting of eleven two-tier multiple-choice items in a pre-test--post-test design. The students involved were high school students, undergraduates and postgraduates…

  8. Particle Collections - Skylab Experiment S149

    NASA Technical Reports Server (NTRS)

    1970-01-01

    This photograph shows Skylab's Particle Collection device, a scientific experiment designed to study micro-meteoroid particles in near-Earth space and determine their abundance, mass distribution, composition, and erosive effects. The Marshall Space Flight Center had program management responsibility for the development of Skylab hardware and experiments.

  9. Particle Collection - Skylab Experiment S149

    NASA Technical Reports Server (NTRS)

    1970-01-01

    This chart describes Skylab's Particle Collection device, a scientific experiment designed to study micro-meteoroid particles in near-Earth space and determine their abundance, mass distribution, composition, and erosive effects. The Marshall Space Flight Center had program management responsibility for the development of Skylab hardware and experiments.

  10. Development and Design of a User Interface for a Computer Automated Heating, Ventilation, and Air Conditioning System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, B.; /Fermilab

    1999-10-08

    A user interface is created to monitor and operate the heating, ventilation, and air conditioning system. The interface is networked to the system's programmable logic controller. The controller maintains automated control of the system. The user, through the interface, is able to see the status of the system and override or adjust the automatic control features. The interface is programmed to show digital readouts of system equipment as well as visual cues of system operational statuses. It also provides information for system design and component interaction. The interface is made easier to read by simple designs, color coordination, and graphics. Fermi National Accelerator Laboratory (Fermilab) conducts high energy particle physics research. Part of this research involves collision experiments with protons and anti-protons. These interactions are contained within one of two massive detectors along Fermilab's largest particle accelerator, the Tevatron. The D-Zero Assembly Building houses one of these detectors. At this time detector systems are being upgraded for a second experiment run, titled Run II. Unlike the previous run, systems at D-Zero must be computer automated so operators do not have to continually monitor and adjust these systems during the run. Human intervention should only be necessary for system start-up and shut-down, and for equipment failure. Part of this upgrade includes the heating, ventilation, and air conditioning system (HVAC system). The HVAC system is responsible for controlling two subsystems: the air temperatures of the D-Zero Assembly Building and associated collision hall, as well as six separate water systems used in the heating and cooling of the air and detector components. The HVAC system is automated by a programmable logic controller. In order to provide system monitoring and operator control, a user interface is required. This paper will address methods and strategies used to design and implement an effective user interface. Background material pertinent to the HVAC system will cover the separate water and air subsystems and their purposes. In addition, programming and system automation will also be covered.

  11. Digital quantum simulation of Dirac equation with a trapped ion

    NASA Astrophysics Data System (ADS)

    Shen, Yangchao; Zhang, Xiang; Zhang, Junhua; Casanova, Jorge; Lamata, Lucas; Solano, Enrique; Yung, Man-Hong; Zhang, Jingning; Kim, Kihwan; Department Of Physical Chemistry Collaboration

    2014-05-01

    Recently there has been growing interest in simulating relativistic effects in controllable physical systems. We digitally simulate the Dirac equation in 3+1 dimensions with a single trapped ion. We map four internal levels of the 171Yb+ ion to the Dirac bispinor. The time evolution of the Dirac equation is implemented by a Trotter expansion. In 3+1 dimensions, we can observe a helicoidal motion of a free Dirac particle, which reduces to Zitterbewegung in 1+1 dimensions. This work was supported in part by the National Basic Research Program of China Grants 2011CBA00300 and 2011CBA00301, and the National Natural Science Foundation of China Grants 61033001 and 61061130540. KK acknowledges support from the Recruitment Program of Global Youth Experts.
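
    A minimal numerical sketch of the first-order Trotter (Lie-Trotter) expansion used for such time evolution is given below; the 2x2 Hamiltonians are arbitrary examples, and the code is not the trapped-ion pulse sequence itself.

```python
# First-order Trotter expansion: exp(-i (H1 + H2) t) is approximated by
# alternating short evolutions under H1 and H2.
import numpy as np
from scipy.linalg import expm

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
H1, H2, t = 1.3 * sx, 0.7 * sz, 2.0

def trotter(H1, H2, t, n):
    """[exp(-i H1 t/n) exp(-i H2 t/n)]^n."""
    step = expm(-1j * H1 * t / n) @ expm(-1j * H2 * t / n)
    return np.linalg.matrix_power(step, n)

exact = expm(-1j * (H1 + H2) * t)
for n in (1, 10, 100):
    err = np.linalg.norm(trotter(H1, H2, t, n) - exact)
    print(f"n = {n:4d}  Trotter error = {err:.2e}")   # error shrinks roughly as 1/n
```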

  12. Soft Particle Spectrometer, Langmuir Probe, and Data Analysis for Aerospace Magnetospheric/Thermospheric Coupling Rocket Program

    NASA Technical Reports Server (NTRS)

    Sharber, J. R.; Frahm, R. A.; Scherrer, J. R.

    1997-01-01

    Under this grant two instruments, a soft particle spectrometer and a Langmuir probe, were refurbished and calibrated, and flown on three instrumented rocket payloads as part of the Magnetosphere/Thermosphere Coupling program. The flights took place at the Poker Flat Research Range on February 12, 1994 (T(sub o) = 1316:00 UT), February 2, 1995 (T(sub o) = 1527:20 UT), and November 27, 1995 (T(sub o) = 0807:24 UT). In this report the observations of the particle instrumentation flown on all three of the flights are described, and brief descriptions of relevant geophysical activity for each flight are provided. Calibrations of the particle instrumentation for all ARIA flights are also provided.

  13. Lattice Boltzmann simulation of the gas-solid adsorption process in reconstructed random porous media.

    PubMed

    Zhou, L; Qu, Z G; Ding, T; Miao, J Y

    2016-04-01

    The gas-solid adsorption process in reconstructed random porous media is numerically studied with the lattice Boltzmann (LB) method at the pore scale with consideration of interparticle, interfacial, and intraparticle mass transfer performances. Adsorbent structures are reconstructed in two dimensions by employing the quartet structure generation set approach. To implement boundary conditions accurately, all the porous interfacial nodes are recognized and classified into 14 types using a proposed universal program called the boundary recognition and classification program. The multiple-relaxation-time LB model and single-relaxation-time LB model are adopted to simulate flow and mass transport, respectively. The interparticle, interfacial, and intraparticle mass transfer capacities are evaluated with the permeability factor and interparticle transfer coefficient, Langmuir adsorption kinetics, and the solid diffusion model, respectively. Adsorption processes are performed in two groups of adsorbent media with different porosities and particle sizes. External and internal mass transfer resistances govern the adsorption system. A large porosity leads to an early time for adsorption equilibrium because of the controlling factor of external resistance. External and internal resistances are dominant at small and large particle sizes, respectively. Particle size, under which the total resistance is minimum, ranges from 3 to 7 μm with the preset parameters. Pore-scale simulation clearly explains the effect of both external and internal mass transfer resistances. The present paper provides both theoretical and practical guidance for the design and optimization of adsorption systems.
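
    As a small illustration of the interfacial model mentioned above (the lattice Boltzmann flow and transport solvers are not reproduced), the Langmuir adsorption kinetics d(theta)/dt = k_a C (1 - theta) - k_d theta can be integrated as follows; the rate constants and bulk concentration are example values.

```python
# Langmuir adsorption kinetics for surface coverage theta (illustration only).
import numpy as np

def langmuir_coverage(c, k_a=1.0, k_d=0.2, theta0=0.0, t_end=20.0, dt=1e-3):
    """Explicit Euler integration for a constant bulk concentration c."""
    n = int(t_end / dt)
    t = np.linspace(0.0, t_end, n + 1)
    theta = np.empty(n + 1)
    theta[0] = theta0
    for i in range(n):
        dtheta = k_a * c * (1.0 - theta[i]) - k_d * theta[i]
        theta[i + 1] = theta[i] + dt * dtheta
    return t, theta

t, theta = langmuir_coverage(c=0.5)
theta_eq = (1.0 * 0.5) / (1.0 * 0.5 + 0.2)          # k_a*C / (k_a*C + k_d)
print(f"coverage at t = 20: {theta[-1]:.3f} (equilibrium {theta_eq:.3f})")
```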

  14. Lattice Boltzmann simulation of the gas-solid adsorption process in reconstructed random porous media

    NASA Astrophysics Data System (ADS)

    Zhou, L.; Qu, Z. G.; Ding, T.; Miao, J. Y.

    2016-04-01

    The gas-solid adsorption process in reconstructed random porous media is numerically studied with the lattice Boltzmann (LB) method at the pore scale with consideration of interparticle, interfacial, and intraparticle mass transfer performances. Adsorbent structures are reconstructed in two dimensions by employing the quartet structure generation set approach. To implement boundary conditions accurately, all the porous interfacial nodes are recognized and classified into 14 types using a proposed universal program called the boundary recognition and classification program. The multiple-relaxation-time LB model and single-relaxation-time LB model are adopted to simulate flow and mass transport, respectively. The interparticle, interfacial, and intraparticle mass transfer capacities are evaluated with the permeability factor and interparticle transfer coefficient, Langmuir adsorption kinetics, and the solid diffusion model, respectively. Adsorption processes are performed in two groups of adsorbent media with different porosities and particle sizes. External and internal mass transfer resistances govern the adsorption system. A large porosity leads to an early time for adsorption equilibrium because of the controlling factor of external resistance. External and internal resistances are dominant at small and large particle sizes, respectively. Particle size, under which the total resistance is minimum, ranges from 3 to 7 μm with the preset parameters. Pore-scale simulation clearly explains the effect of both external and internal mass transfer resistances. The present paper provides both theoretical and practical guidance for the design and optimization of adsorption systems.

  15. How do particle physicists learn the programming concepts they need?

    NASA Astrophysics Data System (ADS)

    Kluth, S.; Pia, M. G.; Schoerner-Sadenius, T.; Steinbach, P.

    2015-12-01

    The ability to read, use and develop code efficiently and successfully is a key ingredient in modern particle physics. We report the experience of a training program, identified as “Advanced Programming Concepts”, that introduces software concepts, methods and techniques to work effectively on a daily basis in a HEP experiment or other programming intensive fields. This paper illustrates the principles, motivations and methods that shape the “Advanced Computing Concepts” training program, the knowledge base that it conveys, an analysis of the feedback received so far, and the integration of these concepts in the software development process of the experiments as well as its applicability to a wider audience.

  16. Very high pressure liquid chromatography using fully porous particles: quantitative analysis of fast gradient separations without post-run times.

    PubMed

    Stankovich, Joseph J; Gritti, Fabrice; Stevenson, Paul G; Beaver, Lois Ann; Guiochon, Georges

    2014-01-10

    Using a column packed with fully porous particles, four methods for controlling the flow rates at which gradient elution runs are conducted in very high pressure liquid chromatography (VHPLC) were tested to determine whether reproducible thermal conditions could be achieved, such that subsequent analyses would proceed at nearly the same initial temperature. In VHPLC high flow rates are achieved, producing fast analyses but requiring high inlet pressures. The combination of high flow rates and high inlet pressures generates local heat, leading to temperature changes in the column. Usually in this case a post-run time is input into the analytical method to allow the return of the column temperature to its initial state. An alternative strategy involves operating the column without a post-run equilibration period and maintaining constant temperature variations for subsequent analysis after conducting one or a few separations to bring the column to a reproducible starting temperature. A liquid chromatography instrument equipped with a pressure controller was used to perform constant pressure and constant flow rate VHPLC separations. Six replicate gradient separations of a nine component mixture consisting of acetophenone, propiophenone, butyrophenone, valerophenone, hexanophenone, heptanophenone, octanophenone, benzophenone, and acetanilide dissolved in water/acetonitrile (65:35, v/v) were performed under various experimental conditions: constant flow rate, two sets of constant pressure, and constant pressure operation with a programmed flow rate. The relative standard deviations of the response factors for all the analytes are lower than 5% across the methods. Programming the flow rate to maintain a fairly constant pressure instead of using instrument controlled constant pressure improves the reproducibility of the retention times by a factor of 5, when plotting the chromatograms in time. Copyright © 2013 Elsevier B.V. All rights reserved.
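
    For readers unfamiliar with the quantities involved, the small calculation below shows how response factors and their relative standard deviation are formed; the peak areas and concentration are hypothetical numbers, not data from the study.

```python
# Response factor = peak area / concentration; reproducibility summarized by RSD.
import numpy as np

peak_areas = np.array([1520., 1498., 1533., 1510., 1489., 1525.])  # six replicates
concentration = 0.35                                               # same in each run

response_factors = peak_areas / concentration
rsd = 100.0 * response_factors.std(ddof=1) / response_factors.mean()
print(f"mean response factor = {response_factors.mean():.1f}, RSD = {rsd:.2f} %")
```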

  17. A Wideband Fast Multipole Method for the two-dimensional complex Helmholtz equation

    NASA Astrophysics Data System (ADS)

    Cho, Min Hyung; Cai, Wei

    2010-12-01

    A Wideband Fast Multipole Method (FMM) for the 2D Helmholtz equation is presented. It can evaluate the interactions between N particles governed by the fundamental solution of 2D complex Helmholtz equation in a fast manner for a wide range of complex wave number k, which was not easy with the original FMM due to the instability of the diagonalized conversion operator. This paper includes the description of theoretical backgrounds, the FMM algorithm, software structures, and some test runs. Program summaryProgram title: 2D-WFMM Catalogue identifier: AEHI_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEHI_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 4636 No. of bytes in distributed program, including test data, etc.: 82 582 Distribution format: tar.gz Programming language: C Computer: Any Operating system: Any operating system with gcc version 4.2 or newer Has the code been vectorized or parallelized?: Multi-core processors with shared memory RAM: Depending on the number of particles N and the wave number k Classification: 4.8, 4.12 External routines: OpenMP ( http://openmp.org/wp/) Nature of problem: Evaluate interaction between N particles governed by the fundamental solution of 2D Helmholtz equation with complex k. Solution method: Multilevel Fast Multipole Algorithm in a hierarchical quad-tree structure with cutoff level which combines low frequency method and high frequency method. Running time: Depending on the number of particles N, wave number k, and number of cores in CPU. CPU time increases as N log N.
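
    For reference, the pairwise interaction that the FMM accelerates can be written as a brute-force O(N^2) sum over the 2D Helmholtz fundamental solution G(r) = (i/4) H0^(1)(k r); the sketch below (plain Python/SciPy, not the distributed C code) evaluates it directly for a complex wave number.

```python
# Direct O(N^2) evaluation of 2D Helmholtz interactions (reference, not the FMM).
import numpy as np
from scipy.special import hankel1

def direct_sum(points, charges, k):
    """Field at each point due to all other points, excluding self-interaction."""
    n = len(points)
    field = np.zeros(n, dtype=complex)
    for i in range(n):
        d = points - points[i]
        r = np.hypot(d[:, 0], d[:, 1])
        mask = np.arange(n) != i               # skip the singular self term
        field[i] = np.sum(0.25j * hankel1(0, k * r[mask]) * charges[mask])
    return field

rng = np.random.default_rng(3)
pts = rng.uniform(0, 1, size=(500, 2))
q = rng.normal(size=500) + 1j * rng.normal(size=500)
phi = direct_sum(pts, q, k=20.0 + 0.5j)
print("field at first three points:", np.round(phi[:3], 4))
```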

  18. Euler-Lagrange Simulations of Shock Wave-Particle Cloud Interaction

    NASA Astrophysics Data System (ADS)

    Koneru, Rahul; Rollin, Bertrand; Ouellet, Frederick; Park, Chanyoung; Balachandar, S.

    2017-11-01

    Numerical experiments of a shock interacting with evolving and fixed clouds of particles are performed. In these simulations we use an Eulerian-Lagrangian approach along with state-of-the-art point-particle force and heat transfer models. As validation, we use the Sandia Multiphase Shock Tube experiments and particle-resolved simulations. The particle curtain, upon interaction with the shock wave, is expected to experience Kelvin-Helmholtz (KH) and Richtmyer-Meshkov (RM) instabilities. In the simulations evolving the particle cloud, the initial volume fraction profile matches that of the Sandia Multiphase Shock Tube experiments, and the shock Mach number is limited to M = 1.66. Measurements of particle dispersion are made at different initial volume fractions. A detailed analysis of the influence of initial conditions on the evolution of the particle cloud is presented. The early-time behavior of the models is studied in the fixed-bed simulations at varying volume fractions and shock Mach numbers. The mean gas quantities are measured in the context of 1-way and 2-way coupled simulations. This work was supported by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program, as a Cooperative Agreement under the Predictive Science Academic Alliance Program, Contract No. DE-NA0002378.

  19. Growth behavior of LiMn{sub 2}O{sub 4} particles formed by solid-state reactions in air and water vapor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kozawa, Takahiro, E-mail: t-kozawa@jwri.osaka-u.ac.jp; Yanagisawa, Kazumichi; Murakami, Takeshi

    Morphology control of particles formed during conventional solid-state reactions without any additives is a challenging task. Here, we propose a new strategy to control the morphology of LiMn{sub 2}O{sub 4} particles based on water vapor-induced growth of particles during solid-state reactions. We have investigated the synthesis and microstructural evolution of LiMn{sub 2}O{sub 4} particles in air and water vapor atmospheres as model reactions; LiMn{sub 2}O{sub 4} is used as a low-cost cathode material for lithium-ion batteries. By using a spherical MnCO{sub 3} precursor impregnated with LiOH, LiMn{sub 2}O{sub 4} spheres with a hollow structure were obtained in air, while angulated particles with micrometer sizes were formed in water vapor. The pore structure of the particles synthesized in water vapor was found to be affected at temperatures below 700 °C. We also show that the solid-state reaction in water vapor is a simple and valuable method for the large-scale production of particles, where the shape, size, and microstructure can be controlled. - Graphical abstract: This study has demonstrated a new strategy towards achieving morphology control without the use of additives during conventional solid-state reactions by exploiting water vapor-induced particle growth. - Highlights: • A new strategy to control the morphology of LiMn{sub 2}O{sub 4} particles is proposed. • Water vapor-induced particle growth is exploited in solid-state reactions. • The microstructural evolution of LiMn{sub 2}O{sub 4} particles is investigated. • The shape, size and microstructure can be controlled by solid-state reactions.

  20. Particulate matter emissions from biochar-amended soils as a potential tradeoff to the negative emission potential

    NASA Astrophysics Data System (ADS)

    Ravi, Sujith; Sharratt, Brenton S.; Li, Junran; Olshevski, Stuart; Meng, Zhongju; Zhang, Jianguo

    2016-10-01

    Novel carbon sequestration strategies such as large-scale land application of biochar may provide sustainable pathways to increase the terrestrial storage of carbon. Biochar has a long residence time in the soil and hence comprehensive studies are urgently needed to quantify the environmental impacts of large-scale biochar application. In particular, black carbon emissions from soils amended with biochar may counteract the negative emission potential due to the impacts on air quality, climate, and biogeochemical cycles. We investigated, using wind tunnel experiments, the particulate matter emission potential of a sand and two agriculturally important soils amended with different concentrations of biochar, in comparison to control soils. Our results indicate that biochar application considerably increases particulate emissions possibly by two mechanisms-the accelerated emission of fine biochar particles and the generation and emission of fine biochar particles resulting from abrasion of large biochar particles by sand grains. Our study highlights the importance of considering the background soil properties (e.g., texture) and geomorphological processes (e.g., aeolian transport) for biochar-based carbon sequestration programs.

  1. Conductivity for soot sensing: possibilities and limitations.

    PubMed

    Grob, Benedikt; Schmid, Johannes; Ivleva, Natalia P; Niessner, Reinhard

    2012-04-17

    In this study we summarize the possibilities and limitations of a conductometric measurement principle for soot sensing. The electrical conductivity of different carbon blacks (FW 200, lamp black 101, Printex 30, Printex U, Printex XE2, special black 4, and special black 6), spark discharge soot (GfG), and graphite powder was measured by a van der Pauw arrangement. Additionally, the influence of inorganic admixtures on the conductivity of carbonaceous materials was shown to follow percolation theory. Structural and oxidation characteristics obtained with Raman microspectroscopy and temperature-programmed oxidation, respectively, were correlated with the electrical conductivity data. Moreover, a thermophoretic precipitator has been applied to deposit soot particles from the exhaust stream between interdigital electrodes. This combines a controlled and size-independent particle collection method with the conductivity measurement principle. A test vehicle was equipped with the AVL Micro Soot Sensor (photoacoustic soot sensor) to validate the conductometric sensor principle against an independent and reliable technique. Our results demonstrate the promising potential of the conductometric sensor for on-board particle diagnostics. Furthermore, this sensor can be applied as a simple, rapid, and cheap analytical tool for characterizing soot structure.
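
    As background for the measurement arrangement mentioned above, the van der Pauw relation exp(-pi R_A/R_s) + exp(-pi R_B/R_s) = 1 can be solved numerically for the sheet resistance R_s; the resistances below are made-up values for illustration.

```python
# Solve the van der Pauw equation for sheet resistance from two four-terminal
# resistances R_A and R_B (hypothetical values).
import numpy as np
from scipy.optimize import brentq

def sheet_resistance(r_a, r_b):
    f = lambda rs: np.exp(-np.pi * r_a / rs) + np.exp(-np.pi * r_b / rs) - 1.0
    hi = 10.0 * np.pi * max(r_a, r_b) / np.log(2.0)     # generous upper bracket
    return brentq(f, 1e-9, hi)

r_a, r_b = 1.2e3, 1.5e3                                  # ohms, hypothetical
rs = sheet_resistance(r_a, r_b)
print(f"sheet resistance ~ {rs:.0f} ohm/sq")
```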

  2. THERMINATOR: THERMal heavy-IoN generATOR

    NASA Astrophysics Data System (ADS)

    Kisiel, Adam; Tałuć, Tomasz; Broniowski, Wojciech; Florkowski, Wojciech

    2006-04-01

    THERMINATOR is a Monte Carlo event generator designed for studying particle production in relativistic heavy-ion collisions performed at experimental facilities such as the SPS, RHIC, or LHC. The program implements thermal models of particle production with single freeze-out. It performs the following tasks: (1) generation of stable particles and unstable resonances at the chosen freeze-out hypersurface with the local phase-space density of particles given by the statistical distribution factors, (2) subsequent space-time evolution and decays of hadronic resonances in cascades, (3) calculation of the transverse-momentum spectra and numerous other observables related to the space-time evolution. The geometry of the freeze-out hypersurface and the collective velocity of expansion may be chosen from two successful models, the Cracow single-freeze-out model and the Blast-Wave model. All particles from the Particle Data Tables are used. The code is written in the object-oriented C++ language and complies with the standards of the ROOT environment. Program summary Program title: THERMINATOR Catalogue identifier: ADXL_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXL_v1_0 Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland RAM required to execute with typical data: 50 Mbytes Number of processors used: 1 Computer(s) for which the program has been designed: PC, Pentium III, IV, or Athlon, 512 MB RAM; not hardware dependent (any computer with a C++ compiler and the ROOT environment [R. Brun, F. Rademakers, Nucl. Instrum. Methods A 389 (1997) 81, http://root.cern.ch]) Operating system(s) for which the program has been designed: Linux: Mandrake 9.0, Debian 3.0, SuSE 9.0, Red Hat FEDORA 3, etc., Windows XP with Cygwin ver. 1.5.13-1 and gcc ver. 3.3.3 (cygwin special); not system dependent External routines/libraries used: ROOT ver. 4.02.00 Programming language: C++ Size of the package: 324 KB directory (40 KB compressed distribution archive), without the ROOT libraries (see http://root.cern.ch for details on the ROOT requirements). The output files created by the code need 1.1 GB for each 500 events. Distribution format: tar gzip file Number of lines in distributed program, including test data, etc.: 6534 Number of bytes in distributed program, including test data, etc.: 41 828 Nature of the physical problem: Statistical models have proved to be very useful in the description of soft physics in relativistic heavy-ion collisions [P. Braun-Munzinger, K. Redlich, J. Stachel, 2003, nucl-th/0304013].

  3. A TEOM (tm) particulate monitor for comet dust, near Earth space, and planetary atmospheres

    NASA Astrophysics Data System (ADS)

    1988-04-01

    Scientific missions to comets, near-Earth space, and planetary atmospheres require particulate and mass accumulation instrumentation for both scientific and navigation purposes. The Rupprecht & Patashnick tapered element oscillating microbalance can accurately measure both the mass flux and mass distribution of particulates over a wide range of particle sizes and loadings. Individual particles of milligram size down to a few picograms can be resolved and counted, and the accumulation of smaller particles or molecular deposition can be accurately measured using the sensors perfected and toughened under this contract. No other sensor has the dynamic range or sensitivity attained by these picogram direct mass measurement sensors. The purpose of this contract was to develop and implement reliable and repeatable manufacturing methods; build and test prototype sensors; and outline a quality control program. A dust 'thrower' was to be designed and built, and used to verify performance. Characterization and improvement of the optical motion detection system and drive feedback circuitry were to be undertaken, with emphasis on reliability, low noise, and low power consumption. All the goals of the contract were met or exceeded. An automated glass puller was built and used to make repeatable tapered elements. Materials and assembly methods were standardized, and controllers and calibrated fixtures were developed and used in all phases of preparing, coating and assembling the sensors. Quality control and reliability resulted from the use of calibrated manufacturing equipment with measurable working parameters. Thermal and vibration testing of completed prototypes showed low temperature sensitivity and high vibration tolerance. An electrostatic dust thrower was used in vacuum to throw particles from 2 × 10⁻⁶ g to 7 × 10⁻¹² g in size. Using long averaging times, particles as small as 0.7 to 4 × 10⁻¹¹ g were weighed to resolutions in the 5 to 9 × 10⁻¹³ g range. The drive circuit and optics systems were developed beyond what was anticipated in the contract, and are now virtually flight prototypes. There is already commercial interest in the developed capability of measuring picogram mass losses and gains. One area is contamination and outgassing research, both measuring picogram losses from samples and collecting products of outgassing.
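
    The mass reading of a tapered-element oscillating microbalance follows from its resonant-frequency shift, delta_m = K0 (1/f1^2 - 1/f0^2), with K0 a calibration constant; the sketch below uses hypothetical numbers purely to show the scale of the picogram sensitivity discussed above.

```python
# TEOM-style mass from frequency shift (hypothetical calibration constant).
K0 = 1.0e-5          # calibration constant in kg*Hz^2 (made-up value)

def collected_mass(f0_hz, f1_hz, k0=K0):
    """Mass accumulated on the element when its frequency drops from f0 to f1."""
    return k0 * (1.0 / f1_hz ** 2 - 1.0 / f0_hz ** 2)

f0, f1 = 250.000, 249.995                     # Hz, before and after deposition
dm = collected_mass(f0, f1)
print(f"collected mass ~ {dm:.3e} kg ({dm * 1e15:.1f} pg)")
```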

  4. Control of friction at the nanoscale

    DOEpatents

    Barhen, Jacob; Braiman, Yehuda Y.; Protopopescu, Vladimir

    2010-04-06

    Methods and apparatus are described for control of friction at the nanoscale. A method of controlling the frictional dynamics of a plurality of particles using non-Lipschitzian control includes determining an attribute of the plurality of particles; calculating an attribute deviation by subtracting the attribute of the plurality of particles from a target attribute; calculating a non-Lipschitzian feedback control term by raising the attribute deviation to a fractional power ξ = (2m+1)/(2n+1), where n = 1, 2, 3, . . . and m = 0, 1, 2, 3, . . . , with m strictly less than n, and then multiplying by a control amplitude; and imposing the non-Lipschitzian feedback control term globally on each of the plurality of particles; this imposition causes the subsequent magnitude of the attribute deviation to be reduced.
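
    A toy numerical sketch of the feedback law quoted above is given below: the deviation of a collective attribute (here a mean velocity) from its target is raised to the odd-ratio fractional power ξ = (2m+1)/(2n+1) and applied globally to every particle. The damped dynamics, amplitude, and noise level are generic assumptions for illustration, not the patented apparatus.

```python
# Non-Lipschitzian global feedback on a toy particle ensemble.
import numpy as np

def odd_power(x, xi):
    """Sign-preserving fractional power x^xi for xi = (2m+1)/(2n+1)."""
    return np.sign(x) * np.abs(x) ** xi

rng = np.random.default_rng(7)
v = rng.normal(0.0, 1.0, size=50)      # particle velocities (arbitrary units)
target, amplitude = 0.8, 4.0
m, n = 0, 1                            # xi = 1/3, non-Lipschitzian at zero deviation
xi = (2 * m + 1) / (2 * n + 1)
dt, gamma = 0.01, 0.5                  # time step and damping

for step in range(2000):
    deviation = target - v.mean()                   # attribute deviation
    u = amplitude * odd_power(deviation, xi)        # global feedback term
    v += dt * (-gamma * v + u + 0.2 * rng.normal(size=v.size))
print(f"mean velocity after control: {v.mean():.3f} (target {target})")
```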

  5. NANO-PARTICLE TRANSPORT AND DEPOSITION IN BIFURCATING TUBES WITH DIFFERENT INLET CONDITIONS

    EPA Science Inventory

    Transport and deposition of ultrafine particles in straight, bent and bifurcating tubes are considered for different inlet Reynolds numbers, velocity profiles, and particle sizes in the range 1 nm to 150 nm. A commercial finite-volume code with user-supplied programs was validated with a...

  6. A Twist on the Richtmyer-Meshkov Instability

    NASA Astrophysics Data System (ADS)

    Rollin, Bertrand; Koneru, Rahul; Ouellet, Frederick

    2017-11-01

    The Richtmyer-Meshkov instability is caused by the interaction of a shock wave with a perturbed interface between two fluids of different densities. Typical contexts in which it plays a key role include inertial confinement fusion, supernovae or scramjets. However, little is known of the phenomenology of this instability if one of the interacting media is a dense solid-particle phase. In the context of an explosive dispersal of particles, this gas-particle variant of the Richtmyer-Meshkov instability may play a role in the late time formation of aerodynamically stable particle jets. Thus, this numerical experiment aims at shedding some light on this phenomenon with the help of high fidelity numerical simulations. Using a Eulerian-Lagrangian approach, we track trajectories of computational particles composing an initially corrugated solid particle curtain, in a two-dimensional planar geometry. This study explores the effects of the initial shape (designed using single mode and multimode perturbations) and volume fraction of the particle curtain on its subsequent evolution. Complexities associated with compaction of the curtain of particles to the random close packing limit are avoided by constraining simulations to modest initial volume fraction of particles. This work was supported by the U.S. DoE, NNSA, Advanced Simulation and Computing Program, as a Cooperative Agreement under the Predictive Science Academic Alliance Program, under Contract No. DE-NA0002378.

  7. In Vitro Capture of Small Ferrous Particles with a Magnetic Filtration Device Designed for Intravascular Use with Intraarterial Chemotherapy: Proof-of-Concept Study.

    PubMed

    Mabray, Marc C; Lillaney, Prasheel; Sze, Chia-Hung; Losey, Aaron D; Yang, Jeffrey; Kondapavulur, Sravani; Liu, Derek; Saeed, Maythem; Patel, Anand; Cooke, Daniel; Jun, Young-Wook; El-Sayed, Ivan; Wilson, Mark; Hetts, Steven W

    2016-03-01

    To establish that a magnetic device designed for intravascular use can bind small iron particles in physiologic flow models. Uncoated iron oxide particles 50-100 nm and 1-5 µm in size were tested in a water flow chamber over a period of 10 minutes without a magnet (ie, control) and with large and small prototype magnets. These same particles and 1-µm carboxylic acid-coated iron oxide beads were likewise tested in a serum flow chamber model without a magnet (ie, control) and with the small prototype magnet. Particles were successfully captured from solution. Particle concentrations in solution decreased in all experiments (P < .05 vs matched control runs). At 10 minutes, concentrations were 98% (50-100-nm particles in water with a large magnet), 97% (50-100-nm particles in water with a small magnet), 99% (1-5-µm particles in water with a large magnet), 99% (1-5-µm particles in water with a small magnet), 95% (50-100-nm particles in serum with a small magnet), 92% (1-5-µm particles in serum with a small magnet), and 75% (1-µm coated beads in serum with a small magnet) lower compared with matched control runs. This study demonstrates the concept of magnetic capture of small iron oxide particles in physiologic flow models by using a small wire-mounted magnetic filter designed for intravascular use. Copyright © 2016 SIR. Published by Elsevier Inc. All rights reserved.

  8. Trajectory and Relative Dispersion Case Studies and Statistics from the Green River Mesoscale Deformation, Dispersion, and Dissipation Program

    NASA Astrophysics Data System (ADS)

    Niemann, Brand Lee

    A major field program to study beta-mesoscale transport and dispersion over complex mountainous terrain was conducted during 1969 with the cooperation of three government agencies at the White Sands Missile Range in central Utah. The purpose of the program was to measure simultaneously on a large number of days the synoptic and mesoscale wind fields, the relative dispersion between pairs of particle trajectories and the rate of small scale turbulence dissipation. The field program included measurements during more than 60 days in the months of March, June, and November. The large quantity of data generated from this program has been processed and analyzed to provide case studies and statistics to evaluate and refine Lagrangian variable trajectory models. The case studies selected to illustrate the complexities of mesoscale transport and dispersion over complex terrain include those with terrain blocking, lee waves, and stagnation, as well as those with large vertical wind shears and horizontal wind field deformation. The statistics of relative particle dispersion were computed and compared to the classical theories of Richardson and Batchelor and the more recent theories of Lin and Kao among others. The relative particle dispersion was generally found to increase with travel time in the alongwind and crosswind directions, but in a more oscillatory than sustained or even accelerated manner as predicted by most theories, unless substantial wind shears or finite vertical separations between particles were present. The relative particle dispersion in the vertical was generally found to be small and bounded even when substantial vertical motions due to lee waves were present because of the limiting effect of stable temperature stratification. The data show that velocity shears have a more significant effect than turbulence on relative particle dispersion and that sufficient turbulence may not always be present above the planetary boundary layer for "wind direction shear induced dispersion" to become effective horizontal dispersion by vertical mixing over the shear layer. The statistics of relative particle dispersion in the three component directions have been summarized and stratified by flow parameters for use in practical prediction problems.

  9. Enhanced hydrophobicity and volatility of submicron aerosols under severe emission control conditions in Beijing

    NASA Astrophysics Data System (ADS)

    Wang, Yuying; Zhang, Fang; Li, Zhanqing; Tan, Haobo; Xu, Hanbing; Ren, Jingye; Zhao, Jian; Du, Wei; Sun, Yele

    2017-04-01

    A series of strict emission control measures was implemented in Beijing and the surrounding seven provinces to ensure good air quality during the 2015 China Victory Day parade, rendering a unique opportunity to investigate the anthropogenic impact on aerosol properties. Submicron aerosol hygroscopicity and volatility were measured during and after the control period using a hygroscopic and volatile tandem differential mobility analyzer (H/V-TDMA) system. Three periods, namely the control clean period (Clean1), the non-control clean period (Clean2), and the non-control pollution period (Pollution), were selected to study the effect of the emission control measures on aerosol hygroscopicity and volatility. Aerosol particles became more hydrophobic and volatile due to the emission control measures. The hygroscopicity parameter (κ) of 40-200 nm particles decreased by 32.0-8.5 % during the Clean1 period relative to the Clean2 period, while the volatile shrink factor (SF) of 40-300 nm particles decreased by 7.5-10.5 %. The emission controls also changed the diurnal variation patterns of both the probability density function of κ (κ-PDF) and the probability density function of SF (SF-PDF). During Clean1 the κ-PDF showed one nearly hydrophobic (NH) mode for particles in the nucleation mode, which was likely due to the dramatic reduction in industrial emissions of inorganic trace gases. Compared to the Pollution period, particles observed during the Clean1 and Clean2 periods exhibited a more significant nonvolatile (NV) mode throughout the day, suggesting a more externally mixed state, particularly for the 150 nm particles. Aerosol hygroscopicities increased as particle sizes increased, with the greatest increases seen during the Pollution period. Accordingly, the aerosol volatility became weaker (i.e., SF increased) as particle sizes increased during the Clean1 and Clean2 periods, but no apparent trend was observed during the Pollution period. Based on a correlation analysis of the number fractions of NH and NV particles, we found a higher number fraction of hydrophobic and volatile particles during the emission control period.

  10. Fabrication of Controllable Pore and Particle Size of Mesoporous Silica Nanoparticles via a Liquid-phase Synthesis Method and Its Absorption Characteristics

    NASA Astrophysics Data System (ADS)

    Nandiyanto, Asep Bayu Dani; Iskandar, Ferry; Okuyama, Kikuo

    2011-12-01

    Monodisperse spherical mesoporous silica nanoparticles were successfully synthesized using a liquid-phase synthesis method. The resulting particles had controllable pore sizes ranging from several to tens of nanometers, with outer diameters of several tens of nanometers. Pore size and outer diameter were controlled by adjusting the precursor solution ratios. In addition, we investigated the adsorption ability of the prepared particles. Large organic molecules were readily adsorbed onto the prepared porous silica particles, a result that was not obtained when using commercial dense silica particles and/or hollow silica particles. Given this result, the prepared mesoporous silica particles may be used efficiently in various applications, such as sensors, pharmaceuticals, and environmentally sensitive pursuits.

  11. Tracking control of colloidal particles through non-homogeneous stationary flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Híjar, Humberto, E-mail: humberto.hijar@lasallistas.org.mx

    2013-12-21

    We consider the problem of controlling the trajectory of a single colloidal particle in a fluid with steady non-homogeneous flow. We use a Langevin equation to describe the dynamics of this particle, where the friction term is assumed to be given by the Faxén's Theorem for the force on a sphere immersed in a stationary flow. We use this description to propose an explicit control force field to be applied on the particle such that it will follow asymptotically any given desired trajectory, starting from an arbitrary initial condition. We show that the dynamics of the controlled particle can be mapped into a set of stochastic harmonic oscillators and that the velocity gradient of the solvent induces an asymmetric coupling between them. We study the particular case of a Brownian particle controlled through a plane Couette flow and show explicitly that the velocity gradient of the solvent renders the dynamics non-stationary and non-reversible in time. We quantify this effect in terms of the correlation functions for the position of the controlled particle, which turn out to exhibit contributions depending exclusively on the non-equilibrium character of the state of the solvent. In order to test the validity of our model, we perform simulations of the controlled particle moving in a simple shear flow, using a hybrid method combining molecular dynamics and multi-particle collision dynamics. We confirm numerically that the proposed guiding force allows for controlling the trajectory of the micro-sized particle by obligating it to follow diverse specific trajectories in fluids with homogeneous shear rates of different strengths. In addition, we find that the non-equilibrium correlation functions in simulations exhibit the same qualitative behavior predicted by the model, thus revealing the presence of the asymmetric non-equilibrium coupling mechanism induced by the velocity gradient.
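
    The control idea described above can be caricatured as an overdamped Langevin particle in a steady shear flow that is steered toward a desired trajectory by an added force. The sketch below is not the paper's control law: it uses a simple proportional restoring force, and all parameter values and names are illustrative assumptions.

      import numpy as np

      # Overdamped Langevin particle in a plane Couette flow u = (gamma_dot*y, 0),
      # steered toward a desired trajectory by a proportional restoring force.
      rng = np.random.default_rng(1)
      gamma = 1.0          # friction coefficient (assumed)
      gamma_dot = 0.5      # shear rate of the solvent (assumed)
      k_c = 5.0            # control gain (assumed)
      kT = 0.1             # thermal energy scale (assumed)
      dt, n_steps = 1e-3, 20000

      def desired(t):
          # target trajectory: a circle traced at unit angular frequency
          return np.array([np.cos(t), np.sin(t)])

      x = np.array([2.0, 0.0])                         # arbitrary initial condition
      noise_sigma = np.sqrt(2.0 * kT * dt / gamma)     # Euler-Maruyama noise amplitude
      for i in range(n_steps):
          t = i * dt
          u_flow = np.array([gamma_dot * x[1], 0.0])   # local solvent velocity
          f_control = -k_c * (x - desired(t))          # proportional steering force
          x = x + dt * (u_flow + f_control / gamma) + noise_sigma * rng.normal(size=2)
      # After a transient, x oscillates close to the desired circular trajectory.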

  12. Method for producing ceramic particles and agglomerates

    DOEpatents

    Phillips, Jonathan; Gleiman, Seth S.; Chen, Chun-Ku

    2001-01-01

    A method for generating spherical and irregularly shaped dense particles of ceramic oxides having a controlled particle size and particle size distribution. An aerosol containing precursor particles of oxide ceramics is directed into a plasma. As the particles flow through the hot zone of the plasma, they melt, collide, and join to form larger particles. If these larger particles remain in the hot zone, they continue melting and acquire a spherical shape that is retained after they exit the hot zone, cool down, and solidify. If they exit the hot zone before melting completely, their irregular shape persists and agglomerates are produced. The size and size distribution of the dense product particles can be controlled by adjusting several parameters; in the case of powder precursors, the most important appears to be the density of powder in the aerosol stream that enters the plasma hot zone. This suggests that the particle collision rate is responsible for determining the ultimate size of the resulting sphere or agglomerate. Other parameters, particularly the gas flow rates and the microwave power, are also adjusted to control the particle size distribution.

  13. FabricS: A user-friendly, complete and robust software for particle shape-fabric analysis

    NASA Astrophysics Data System (ADS)

    Moreno Chávez, G.; Castillo Rivera, F.; Sarocchi, D.; Borselli, L.; Rodríguez-Sedano, L. A.

    2018-06-01

    Shape-fabric is a textural parameter related to the spatial arrangement of elongated particles in geological samples. Its usefulness spans a range from sedimentary petrology to igneous and metamorphic petrology. Independently of the process being studied, when a material flows, the elongated particles are oriented with the major axis in the direction of flow. In sedimentary petrology this information has been used for studies of paleo-flow direction of turbidites, the origin of quartz sediments, and locating ignimbrite vents, among other applications. In addition to flow direction and its polarity, the method enables flow rheology to be inferred. The use of shape-fabric has been limited due to the difficulties of automatically measuring particles and analyzing them with reliable circular statistics programs. This dampened interest in the method for a long time. Shape-fabric measurement has increased in popularity since the 1980s thanks to the development of new image analysis techniques and circular statistics software. However, the programs currently available are unreliable, outdated, incompatible with newer operating systems, or require programming skills. The goal of our work is to develop a user-friendly MATLAB program with a graphical user interface that can process images, includes editing functions and thresholds (elongation and size) for selecting a particle population, and analyzes that population with reliable circular statistics algorithms. Moreover, the program must also produce rose diagrams, orientation vectors, and a complete set of statistical parameters. All these requirements are met by our new software. In this paper, we briefly explain the methodology from collection of oriented samples in the field to the minimum number of particles needed to obtain reliable fabric data. We obtained the data using specific statistical tests and taking into account the degree of iso-orientation of the samples and the required degree of reliability. The program has been verified by means of several simulations performed using appropriately designed features and by analyzing real samples.
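
    The core computation in any shape-fabric program of this kind is circular statistics on axial orientation data, i.e. particle long-axis angles defined only modulo 180 degrees. The sketch below (Python rather than MATLAB, purely as an illustration; function and variable names are hypothetical) shows the standard doubled-angle estimate of the mean orientation and resultant length.

      import numpy as np

      def axial_mean_orientation(angles_deg):
          """Mean orientation and resultant length for axial (0-180 degree) data.

          Uses the standard doubled-angle method: orientations are doubled,
          averaged as unit vectors, and the mean angle is halved back.
          Returns (mean orientation in degrees, mean resultant length R).
          """
          theta = np.deg2rad(np.asarray(angles_deg)) * 2.0
          c, s = np.mean(np.cos(theta)), np.mean(np.sin(theta))
          r = np.hypot(c, s)
          mean = (np.rad2deg(np.arctan2(s, c)) / 2.0) % 180.0
          return mean, r

      # particles clustered around ~30 degrees give R close to 1
      print(axial_mean_orientation([25, 28, 30, 33, 35]))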

  14. Gravitational Agglomeration of Post-HCDA LMFBR Nonspherical Aerosols.

    DTIC Science & Technology

    1980-12-01

    Dynamic equations for two-particle motions are developed. A computer program, NGCEFF, is constructed; the Navier-Stokes equation is solved by the finite difference ... spatial inhomogeneities for the aerosol. Thus, following an HCDA, an aerosol mixture of sodium compounds, fuel and core structural materials will ...

  15. Laboratory directed research and development. FY 1995 progress report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vigil, J.; Prono, J.

    1996-03-01

    This document presents an overview of Laboratory Directed Research and Development Programs at Los Alamos. The nine technical disciplines in which research is described are materials; engineering and base technologies; plasma, fluids, and particle beams; chemistry; mathematics and computational science; atomic and molecular physics; geoscience, space science, and astrophysics; nuclear and particle physics; and biosciences. Brief descriptions are provided for each of the above programs.

  16. Tulane/Xavier Vaccine Peptide Program

    DTIC Science & Technology

    2013-07-01

    Formulations for nasal and pulmonary delivery include a dry powder formulation, microemulsions, nonspherical liposomes, ceramic shell vesicles, and nanometer-sized silk particles.

  17. The Ultimate Structure of Matter: The High Energy Physics Program from the 1950s through the 1980s

    DOE R&D Accomplishments Database

    1990-02-01

    This report discusses the following topics in high energy physics: The Particle Zoo; The Strong and the Weak; The Particle Explosion; Deep Inside the Nucleon; The Search for Unity; Physics in Collision; The Standard Model; Particles and the Cosmos; and Practical Benefits.

  18. Comparative Laboratory and Numerical Simulations of Shearing Granular Fault Gouge: Micromechanical Processes

    NASA Astrophysics Data System (ADS)

    Morgan, J. K.; Marone, C. J.; Guo, Y.; Anthony, J. L.; Knuth, M. W.

    2004-12-01

    Laboratory studies of granular shear zones have provided significant insight into fault zone processes and the mechanics of earthquakes. The micromechanisms of granular deformation are more difficult to ascertain, but have been hypothesized based on known variations in boundary conditions, particle properties and geometries, and mechanical behavior. Numerical simulations using particle dynamics methods (PDM) can offer unique views into deforming granular shear zones, revealing the precise details of granular microstructures, particle interactions, and packings, which can be correlated with macroscopic mechanical behavior. Here, we describe a collaborative program of comparative laboratory and numerical experiments of granular shear using idealized materials, i.e., glass beads, glass rods or pasta, and angular sand. Both sets of experiments are carried out under similar initial and boundary conditions in a non-fracturing stress regime. Phenomenologically, the results of the two sets of experiments are very similar. Peak friction values vary as a function of particle dimensionality (1-D vs. 2-D vs. 3-D), particle angularity, particle size and size distributions, boundary roughness, and shear zone thickness. Fluctuations in shear strength during an experiment, i.e., stick-slip events, can be correlated with distinct changes in the nature, geometries, and durability of grain bridges that support the shear zone walls. Inclined grain bridges are observed to form, and to support increasing loads, during gradual increases in assemblage strength. Collapse of an individual grain bridge leads to distinct localization of strain, generating a rapidly propagating shear surface that cuts across multiple grain bridges, accounting for the sudden drop in strength. The distribution of particle sizes within an assemblage, along with boundary roughness and its periodicity, influence the rate of formation and dissipation of grain bridges, thereby controlling friction variations during shear.
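
    As a generic illustration of the particle dynamics methods (PDM) referred to above, the sketch below implements a linear spring-dashpot normal contact force between two grains, one common DEM-style contact law. The stiffness and damping values are illustrative assumptions and this is not the specific code or contact model used in the study.

      import numpy as np

      def normal_contact_force(x_i, x_j, v_i, v_j, radius, k_n=1.0e4, c_n=5.0):
          """Linear spring-dashpot normal force on grain i from grain j."""
          rij = x_i - x_j
          dist = np.linalg.norm(rij)
          overlap = 2.0 * radius - dist
          if overlap <= 0.0:
              return np.zeros_like(rij)          # grains not in contact
          n_hat = rij / dist
          v_n = np.dot(v_i - v_j, n_hat)         # normal relative velocity
          return (k_n * overlap - c_n * v_n) * n_hat

      # two slightly overlapping grains approaching each other
      f = normal_contact_force(np.array([0.0, 0.0]), np.array([0.9, 0.0]),
                               np.array([0.0, 0.0]), np.array([-0.1, 0.0]),
                               radius=0.5)
      print(f)   # repulsive force pushing grain i away from grain j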

  19. Modeling the complex shape evolution of sedimenting particle swarms in fractures

    NASA Astrophysics Data System (ADS)

    Mitchell, C. A.; Nitsche, L.; Pyrak-Nolte, L. J.

    2016-12-01

    The flow of micro- and nano-particles through subsurface systems can occur in several environments, such as hydraulic fracturing or enhanced oil recovery. Computer simulations were performed to advance our understanding of the complexity of subsurface particle swarm transport in fractures. Previous experiments observed that particle swarms in fractures with uniform apertures exhibit enhanced transport speeds and suppressed bifurcations for an optimal range of apertures. Numerical simulations were performed for conditions of low Reynolds number, no interfacial tension, and uniform viscosity, with particulate swarms represented by point particles that mutually interact through their (regularized) Stokeslet fields. A P3M technique accelerates the summations for swarms exceeding 10^5 particles. Fracture wall effects were incorporated using a least-squares variant of the method of fundamental solutions, with grid mapping of the surface force and source elements within the fast-summation scheme. The numerical study was executed on the basis of dimensionless variables and parameters, in the interest of examining the fundamental behavior and relationships of particle swarms in the presence of uniform apertures. Model parameters were representative of the particle swarm experiments to enable direct comparison of the results with the experimental observations. The simulations confirmed that the principal phenomena observed in the experiments can be explained within the realm of Stokes flow. The numerical investigation effectively replicated swarm evolution in a uniform fracture and captured the coalescence, torus and tail formation, and ultimate breakup of the particle swarm as it fell under gravity in a quiescent fluid. The rate of swarm evolution depended on the number of particles in a swarm. When an ideal number of particles was used, swarm transport was characterized by an enhanced velocity regime as observed in the laboratory data. Understanding the physics of particle swarms in fractured media will improve the ability to perform controlled micro-particulate transport through rock. Acknowledgment: This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Geosciences Research Program under Award Number (DE-FG02-09ER16022).
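
    A sketch of the basic interaction kernel named above: each point particle induces a regularized Stokeslet flow that advects every other particle. The Cortez-type blob form is assumed here, the O(n^2) double loop is for illustration only (the study accelerates these sums with a P3M scheme), and all parameter values are illustrative.

      import numpy as np

      def regularized_stokeslet_velocities(positions, forces, mu=1.0, eps=0.05):
          """Velocities of point particles interacting via regularized Stokeslets.

          positions, forces: arrays of shape (n, 3).
          """
          n = len(positions)
          vel = np.zeros_like(positions)
          for i in range(n):
              dx = positions[i] - positions          # vectors from each j to i
              r2 = np.sum(dx**2, axis=1) + eps**2    # regularized squared distance
              h1 = (r2 + eps**2) / r2**1.5           # coefficient of f_j
              h2 = 1.0 / r2**1.5                     # coefficient of (f_j . dx) dx
              fdotx = np.sum(forces * dx, axis=1)
              vel[i] = np.sum(h1[:, None] * forces
                              + h2[:, None] * fdotx[:, None] * dx,
                              axis=0) / (8.0 * np.pi * mu)
          return vel

      # a small swarm of identical particles settling under gravity (along -z)
      rng = np.random.default_rng(2)
      pos = rng.normal(scale=0.2, size=(200, 3))
      frc = np.tile([0.0, 0.0, -1.0], (200, 1))
      print(regularized_stokeslet_velocities(pos, frc).mean(axis=0))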

  20. Programmable Multiple-Ramped-Voltage Power Supply

    NASA Technical Reports Server (NTRS)

    Ajello, Joseph M.; Howell, S. K.

    1993-01-01

    Ramp waveforms range up to 2,000 V. Laboratory high-voltage power-supply system puts out variety of stable voltages programmed to remain fixed with respect to ground or float with respect to ramp waveform. Measures voltages it produces with high resolution; automatically calibrates, zeroes, and configures itself; and produces variety of input/output signals for use with other instruments. Developed for use with ultraviolet spectrometer. Also applicable to control of electron guns in general and to operation of such diverse equipment used in measuring scattering cross sections of subatomic particles and in industrial electron-beam welders.

  1. Self-assembled three-dimensional chiral colloidal architecture

    NASA Astrophysics Data System (ADS)

    Ben Zion, Matan Yah; He, Xiaojin; Maass, Corinna C.; Sha, Ruojie; Seeman, Nadrian C.; Chaikin, Paul M.

    2017-11-01

    Although stereochemistry has been a central focus of the molecular sciences since Pasteur, its province has previously been restricted to the nanometric scale. We have programmed the self-assembly of micron-sized colloidal clusters with structural information stemming from a nanometric arrangement. This was done by combining DNA nanotechnology with colloidal science. Using the functional flexibility of DNA origami in conjunction with the structural rigidity of colloidal particles, we demonstrate the parallel self-assembly of three-dimensional microconstructs, evincing highly specific geometry that includes control over position, dihedral angles, and cluster chirality.

  2. Monitoring of magnetic nano-particles in EOR by using the CSEM modeling and inversion.

    NASA Astrophysics Data System (ADS)

    Heo, J. Y.; KIM, S.; Jeong, G.; Hwang, J.; Min, D. J.

    2016-12-01

    Enhanced oil recovery (EOR), in which water, CO2, or other chemical components are injected into reservoirs to increase the production rate of oil and gas, has been widely used. To promote the efficiency of EOR, it is important to monitor the distribution of injected materials in reservoirs. Using nano-particles in EOR has the advantages that the particles are smaller than the pores and can be characterized by various physical properties. Specifically, if we use magnetic nano-particles, we can effectively monitor them by using electromagnetic surveys. CSEM, in which the frequency range of the source can be controlled, is well suited to monitoring magnetic nano-particles under various reservoir circumstances. In this study, we first perform numerical simulation of 3D CSEM for a reservoir under production. In general, two wells are used for EOR: one for injection and the other for extraction. We assume that sources are applied inside the injection well and receivers are deployed inside the extraction well. To simulate the CSEM survey, we decompose the total fields into primary and secondary fields in Maxwell's equations. For the primary fields, we calculate the analytic solutions of the layered earth. With the calculated primary fields, we compute the secondary fields due to anomalies using the edge-based finite-element method. Finally, we perform electromagnetic inversion for both conductivity and permeability to trace the distribution of magnetic nano-particles. Since these two parameters react differently according to the frequency range of the sources, we can effectively describe the distribution of magnetic nano-particles by considering the two parameters at the same time. Acknowledgements: This work was supported by the Korea Institute of Energy Technology Evaluation and Planning (KETEP) and the Ministry of Trade, Industry & Energy (MOTIE) of the Republic of Korea (No. 20168510030830), by the International Cooperation program (No. 2012-8510030010) of KETEP, and by the Dual Use Technology Program, with financial resources granted by the MOTIE.

  3. In situ measurement of mesopelagic particle sinking rates and the control of carbon transfer to the ocean interior during the Vertical Flux in the Global Ocean (VERTIGO) voyages in the North Pacific

    NASA Astrophysics Data System (ADS)

    Trull, T. W.; Bray, S. G.; Buesseler, K. O.; Lamborg, C. H.; Manganini, S.; Moy, C.; Valdes, J.

    2008-07-01

    Among the parameters affecting carbon transfer to the ocean interior, particle sinking rates vary over three orders of magnitude and thus more than primary production, f-ratios, or particle carbon contents [e.g., Boyd, P.W., Trull, T.W., 2006. Understanding the export of marine biogenic particles: is there consensus? Progress in Oceanography 4, 276-312, doi:10.1016/j.pocean.2006.10.007]. Very few data have been obtained from the mesopelagic zone where the majority of carbon remineralization occurs and the attenuation of the sinking flux is determined. Here, we report sinking rates from ~300 m depth for the subtropical (station ALOHA, June 2004) and subarctic (station K2, July 2005) North Pacific Ocean, obtained from short (6.5 day) deployments of an indented rotating sphere (IRS) sediment trap operating as an in situ settling column [Peterson, M.L., Wakeham, S.G., Lee, C., Askea, M.A., Miquel, J.C., 2005. Novel techniques for collection of sinking particles in the ocean and determining their settling rates. Limnology and Oceanography Methods 3, 520-532] to separate the flux into 11 sinking-rate fractions ranging from >820 to >2 m d^-1 that are collected by a carousel for further analysis. Functioning of the IRS trap was tested using a novel programming sequence to check that all particles have cleared the settling column prior to the next delivery of particles by the 6-hourly rotation cycle of the IRS. There was some evidence (from the flux distribution among the cups and photomicroscopy of the collected particles) that very slow-sinking particles may have been under-collected because they were unable to penetrate the brine-filled collection cups, but good evidence for appropriate collection of fast-settling fractions. Approximately 50% of the particulate organic carbon (POC) flux was sinking at greater than 100 m d^-1 at both stations. At ALOHA, more than 15% of the POC flux sank at >820 m d^-1, but low fluxes make this uncertain, and precluded resolution of particles sinking slower than 137 m d^-1. At K2, less than 1% of the POC flux sank at >820 m d^-1, but a large fraction (~15-45%) of the flux was contributed by other fast-sinking classes (410 and 205 m d^-1). PIC and BSi minerals were not present in higher proportions in the faster sinking fractions, but the observations were too limited to rule out a ballasting contribution to the control of sinking rates. Photographic evidence for a wide range of particle types within individual sinking-rate fractions suggests that biological processes that set the porosity and shape of particles are also important and may mask the role of minerals. Comparing the spectrum of sinking rates observed at K2 with the power-law profile of flux attenuation with depth obtained from other VERTIGO sediment traps deployed at multiple depths [Buesseler, K.O., Lamborg, C.H., Boyd, P.W., Lam, P.J., Trull, T.W., Bidigare, R.R., Bishop, J.K.B., Casciotti, K.L., Dehairs, F., Elskens, M., Honda, M., Karl, D.M., Siegel, D., Silver, M., Steinberg, D., Valdes, J., Van Mooy, B., Wilson, S.E., 2007b. Revisiting carbon flux through the Ocean's twilight zone. Science 316(5824), 567-570, doi: 10.1126/science.1137959] emphasizes the importance of particle transformations within the mesopelagic zone in the control of carbon transport to the ocean interior.
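
    Once the trap has partitioned the flux into sinking-rate fractions, a statement such as "approximately 50% of the POC flux was sinking at greater than 100 m d^-1" amounts to a partial sum over the rate bins. A minimal sketch follows; the bin edges and flux values below are hypothetical placeholders, not the VERTIGO data.

      # POC flux collected in each sinking-rate bin (hypothetical numbers)
      rate_bins = [820, 410, 205, 137, 68, 34, 16, 8, 4, 2]   # m d^-1, bin lower edges
      poc_flux  = [3.0, 4.5, 2.5, 1.0, 0.8, 0.5, 0.4, 0.3, 0.2, 0.1]

      def fraction_faster_than(threshold):
          total = sum(poc_flux)
          fast = sum(f for r, f in zip(rate_bins, poc_flux) if r >= threshold)
          return fast / total

      print(fraction_faster_than(100))   # fraction of POC flux sinking faster than ~100 m d^-1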

  4. Engineering ellipsoidal cap-like hydrogel particles as building blocks or sacrificial templates for three-dimensional cell culture.

    PubMed

    Zhang, Weiwei; Huang, Guoyou; Ng, Kelvin; Ji, Yuan; Gao, Bin; Huang, Liqing; Zhou, Jinxiong; Lu, Tian Jian; Xu, Feng

    2018-03-26

    Hydrogel particles that can be engineered to compartmentally culture cells in a three-dimensional (3D) and high-throughput manner have attracted increasing interest in the biomedical area. However, the ability to generate hydrogel particles with specially designed structures and their potential biomedical applications need to be further explored. This work introduces a method for fabricating hydrogel particles in an ellipsoidal cap-like shape (i.e., ellipsoidal cap-like hydrogel particles) by employing an open-pore anodic aluminum oxide membrane. Hydrogel particles of different sizes are fabricated. The ability to produce ellipsoidal cap-like magnetic hydrogel particles with controlled distribution of magnetic nanoparticles is demonstrated. Encapsulated cells show high viability, indicating the potential for using these hydrogel particles as structure- and remote-controllable building blocks for tissue engineering application. Moreover, the hydrogel particles are also used as sacrificial templates for fabricating ellipsoidal cap-like concave wells, which are further applied for producing size controllable cell aggregates. The results are beneficial for the development of hydrogel particles and their applications in 3D cell culture.

  5. A stochastic bioburden model for spacecraft sterilization.

    NASA Technical Reports Server (NTRS)

    Roark, A. L.

    1972-01-01

    Development of a stochastic model of the probability distribution for the random variable representing the number of microorganisms on a surface as a function of time. The first basic principle associated with bioburden estimation is that viable particles are removed from surfaces. The second notion important to the analysis is that microorganisms in environments and on surfaces occur in clumps. The last basic principle relating to bioburden modeling is that viable particles are deposited on a surface. The bioburden on a spacecraft is determined by the amount and kind of control exercised on the spacecraft assembly location, the shedding characteristics of the individuals in the vicinity of the spacecraft, its orientation, the geographical location in which the assembly takes place, and the steps in the assembly procedure. The model presented has many of the features which are desirable for its use in the spacecraft sterilization programs currently being planned by NASA.
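
    The three principles listed above (random removal of viable particles, deposition in clumps, and deposition onto the surface over time) can be turned into a simple Monte Carlo model of the surface bioburden. The sketch below is one such toy model, not the paper's formulation; the rates and the geometric clump-size distribution are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(3)
      dep_rate = 2.0        # clump deposition events per hour (assumed)
      clump_mean = 3.0      # mean viable particles per clump (assumed)
      removal_prob = 0.01   # per-particle removal probability per hour (assumed)

      def simulate_bioburden(hours, n_runs=1000):
          """Distribution of viable-particle counts on a surface after `hours`."""
          counts = np.zeros(n_runs, dtype=int)
          for _ in range(hours):
              clumps = rng.poisson(dep_rate, size=n_runs)
              # clump sizes drawn from a geometric distribution, summed per run
              deposited = np.array([rng.geometric(1.0 / clump_mean, size=c).sum()
                                    if c > 0 else 0 for c in clumps])
              removed = rng.binomial(counts, removal_prob)
              counts = counts + deposited - removed
          return counts

      final = simulate_bioburden(hours=100)
      print(final.mean(), final.std())   # bioburden distribution after 100 h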

  6. In Vitro Capture of Small Ferrous Particles with a Magnetic Filtration Device Designed for Intravascular Use with Intraarterial Chemotherapy: Proof-of-Concept Study

    PubMed Central

    Mabray, Marc C.; Lillaney, Prasheel; Sze, Chia-Hung; Losey, Aaron D.; Yang, Jeffrey; Kondapavulur, Sravani; Liu, Derek; Saeed, Maythem; Patel, Anand; Cooke, Daniel; Jun, Young-Wook; El-Sayed, Ivan; Wilson, Mark; Hetts, Steven W.

    2015-01-01

    Purpose To establish that a magnetic device designed for intravascular use can bind small iron particles in physiologic flow models. Materials and Methods Uncoated iron oxide particles 50–100 nm and 1–5 μm in size were tested in a water flow chamber over a period of 10 minutes without a magnet (ie, control) and with large and small prototype magnets. These same particles and 1-μm carboxylic acid–coated iron oxide beads were likewise tested in a serum flow chamber model without a magnet (ie, control) and with the small prototype magnet. Results Particles were successfully captured from solution. Particle concentrations in solution decreased in all experiments (P < .05 vs matched control runs). At 10 minutes, concentrations were 98% (50–100-nm particles in water with a large magnet), 97% (50–100-nm particles in water with a small magnet), 99% (1–5-μm particles in water with a large magnet), 99% (1–5-μm particles in water with a small magnet), 95% (50–100-nm particles in serum with a small magnet), 92% (1–5-μm particles in serum with a small magnet), and 75% (1-μm coated beads in serum with a small magnet) lower compared with matched control runs. Conclusions This study demonstrates the concept of magnetic capture of small iron oxide particles in physiologic flow models by using a small wire-mounted magnetic filter designed for intravascular use. PMID:26706187

  7. General solution for diffusion-controlled dissolution of spherical particles. 1. Theory.

    PubMed

    Wang, J; Flanagan, D R

    1999-07-01

    Three classical particle dissolution rate expressions are commonly used to interpret particle dissolution rate phenomena. Our analysis shows that an assumption used in the derivation of the traditional cube-root law may not be accurate under all conditions for diffusion-controlled particle dissolution. Mathematical analysis shows that the three classical particle dissolution rate expressions are approximate solutions to a general diffusion layer model. The cube-root law is most appropriate when the particle size is much larger than the diffusion layer thickness, whereas the two-thirds-root expression applies when the particle size is much smaller than the diffusion layer thickness. The square-root expression is intermediate between these two models. A general solution to the diffusion layer model for the dissolution of monodispersed spherical particles was derived for sink and nonsink conditions. A constant diffusion layer thickness was assumed in the derivation. Simulated dissolution data showed that the ratio between particle size and diffusion layer thickness (a0/h) is an important factor in controlling the shape of particle dissolution profiles. A new semiempirical general particle dissolution equation is also discussed which encompasses the three classical particle dissolution expressions. The success of the general equation in explaining limitations of traditional particle dissolution expressions demonstrates the usefulness of the general diffusion layer model.
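
    One common statement of the diffusion-layer model for a dissolving sphere under sink conditions is dr/dt = -(D*Cs/rho)*(1/r + 1/h), which reduces to the cube-root law when r >> h and to the two-thirds-root expression when r << h. The sketch below integrates this form numerically; the parameter values are illustrative, and the equation form is assumed from standard treatments rather than taken from the paper.

      import numpy as np

      D, Cs, rho = 7.0e-6, 1.0e-3, 1.3        # illustrative values (cgs-like units)
      h = 10.0e-4                              # diffusion layer thickness (cm), assumed

      def dissolve(r0, dt=0.1, t_max=5000.0):
          """Radius versus time for a sphere dissolving under sink conditions."""
          r, t, out = r0, 0.0, []
          while r > 0.0 and t < t_max:
              out.append((t, r))
              r += -dt * (D * Cs / rho) * (1.0 / r + 1.0 / h)
              t += dt
          return np.array(out)

      profile = dissolve(r0=25.0e-4)           # 25 micron particle, a0/h = 2.5
      mass_frac = (profile[:, 1] / profile[0, 1])**3
      # mass_frac**(1/3) versus time is nearly linear (cube-root behavior) only
      # when a0/h is large; for small a0/h, mass_frac**(2/3) is closer to linear.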

  8. Surface Waves as Major Controls on Particle Backscattering in Southern California Coastal Waters

    NASA Astrophysics Data System (ADS)

    Henderikx Freitas, F.; Fields, E.; Maritorena, S.; Siegel, D.

    2016-02-01

    Variability in satellite-observed particle loads and optical backscattering coefficients (bbp) in the Southern California Bight (SCB) has been thought to be driven by episodic inputs from storm runoff. Here we show, however, that surface waves play a larger role in controlling remotely sensed bbp values than previously considered. More than 14 years of 2-km resolution SeaWiFS, MODIS and MERIS satellite imagery spectrally merged with the Garver-Siegel-Maritorena bio-optical model were used to assess the relative importance of terrestrial runoff and surface wave forcings in determining changes in particle load in the SCB. The space-time distributions of particle backscattering at 443 nm and chlorophyll concentration estimates from the model were analyzed using Empirical Orthogonal Function analysis, and patterns were compared with several environmental variables. While offshore values of bbp are tightly related to chlorophyll concentrations, as expected for productive Case-1 waters, values of bbp in a 10 km band near the coast are primarily modulated by surface waves. The relationship with waves holds throughout all seasons and is most apparent around the 40 m isobath, but extends offshore to about 100 m depth. Riverine inputs are associated with elevated bbp near the coast mostly during the larger El Niño events of 1997/1998 and 2005. These findings are consistent with bio-optical glider and field observations from the Santa Barbara Channel taken as part of the Santa Barbara Coastal Long-Term Ecological Research and Plumes and Blooms programs. The finding that surface waves control bbp variability beyond the surf zone has large consequences for the life cycle of many marine organisms, as well as for the interpretation of remote sensing signals near the coast.
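
    A minimal sketch of the Empirical Orthogonal Function decomposition mentioned above, implemented with a singular value decomposition of the anomaly matrix (time x space). The synthetic data stand in for the merged satellite bbp imagery, and the function and variable names are hypothetical.

      import numpy as np

      def eof_analysis(data, n_modes=3):
          """EOF decomposition of a (n_times, n_pixels) data matrix."""
          anomalies = data - data.mean(axis=0)
          u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
          variance_frac = s**2 / np.sum(s**2)
          patterns = vt[:n_modes]                 # spatial EOF patterns
          pcs = u[:, :n_modes] * s[:n_modes]      # principal-component time series
          return patterns, pcs, variance_frac[:n_modes]

      # synthetic example standing in for merged satellite bbp(443) maps
      rng = np.random.default_rng(4)
      demo = rng.normal(size=(500, 1000))
      patterns, pcs, var = eof_analysis(demo)
      print(var)    # fraction of variance explained by the leading modes
      # The PC time series of each mode can then be correlated against wave
      # height, river discharge, and other environmental variables.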

  9. Analysis of PM10, PM2.5, and PM2.5-10 concentrations in Santiago, Chile, from 1989 to 2001.

    PubMed

    Koutrakis, Petros; Sax, Sonja N; Sarnat, Jeremy A; Coull, Brent; Demokritou, Phil; Oyola, Pedro; Garcia, Javier; Gramsch, Ernesto

    2005-03-01

    Daily particle samples were collected in Santiago, Chile, at four urban locations from January 1, 1989, through December 31, 2001. Both fine PM with da < 2.5 microm (PM2.5) and coarse PM with 2.5 < da < 10 microm (PM2.5-10) were collected using dichotomous samplers. The inhalable particle fraction, PM10, was determined as the sum of the fine and coarse concentrations. Wind speed, temperature, and relative humidity (RH) were also measured continuously. Average concentrations of PM2.5 for the 1989-2001 period ranged from 38.5 microg/m3 to 53 microg/m3 across the four sites; PM2.5-10 levels ranged from 35.8 to 48.2 microg/m3, and PM10 levels ranged from 74.4 to 101.2 microg/m3. Both annual and daily PM2.5 and PM10 concentration levels exceeded the U.S. National Ambient Air Quality Standards and the European Union concentration limits. Mean PM2.5 levels during the cold season (April through September) were more than twice as high as those observed in the warm season (October through March), whereas coarse particle levels were similar in both seasons. PM concentration trends were investigated using regression models, controlling for site, weekday, month, wind speed, temperature, and RH. Results showed that PM2.5 concentrations decreased substantially, by 52%, over the 12-year period (1989-2000), whereas PM2.5-10 concentrations increased by approximately 50% in the first 5 years and then decreased by a similar percentage over the following 7 years. These decreases were evident even after controlling for significant climatic effects. These results suggest that the pollution reduction programs developed and implemented by the Comisión Nacional del Medio Ambiente (CONAMA) have been effective in reducing particle levels in the Santiago Metropolitan region. However, particle levels remain high, and it is thus imperative that efforts to improve air quality continue.
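
    As an illustration of the kind of trend regression described (a long-term trend in log-transformed PM estimated while controlling for site, weekday, month, and meteorology), a sketch using synthetic data follows. The column names, parameter values, and the use of statsmodels are assumptions for illustration, not a reconstruction of the authors' model.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(5)
      n = 2000
      df = pd.DataFrame({
          "pm25": rng.lognormal(np.log(45), 0.4, n),   # synthetic daily PM2.5
          "years": rng.uniform(0, 12, n),              # years since 1989
          "site": rng.integers(0, 4, n),
          "weekday": rng.integers(0, 7, n),
          "month": rng.integers(1, 13, n),
          "wind": rng.uniform(0.5, 5, n),
          "temp": rng.uniform(0, 30, n),
          "rh": rng.uniform(20, 90, n),
      })
      # log PM2.5 regressed on a linear time trend plus categorical and
      # meteorological covariates
      model = smf.ols("np.log(pm25) ~ years + C(site) + C(weekday) + C(month)"
                      " + wind + temp + rh", data=df).fit()
      trend_per_year = np.expm1(model.params["years"])  # fractional change per year
      print(trend_per_year)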

  10. Maximization of DRAM yield by control of surface charge and particle addition during high dose implantation

    NASA Astrophysics Data System (ADS)

    Horvath, J.; Moffatt, S.

    1991-04-01

    Ion implantation processing exposes semiconductor devices to an energetic ion beam in order to deposit dopant ions in shallow layers. In addition to this primary process, foreign materials are deposited as particles and surface films. The deposition of particles is a major cause of IC yield loss and becomes even more significant as device dimensions are decreased. Control of particle addition in a high-volume production environment requires procedures to limit beamline and endstation sources, control of particle transport, cleaning procedures and a well grounded preventative maintenance philosophy. Control of surface charge by optimization of the ion beam and electron shower conditions and measurement with a real-time charge sensor has been effective in improving the yield of NMOS and CMOS DRAMs. Control of surface voltages to a range between 0 and -20 V was correlated with good implant yield with PI9200 implanters for p + and n + source-drain implants.

  11. Controlled Expansion of Supercritical Solution: A Robust Method to Produce Pure Drug Nanoparticles With Narrow Size-Distribution.

    PubMed

    Pessi, Jenni; Lassila, Ilkka; Meriläinen, Antti; Räikkönen, Heikki; Hæggström, Edward; Yliruusi, Jouko

    2016-08-01

    We introduce a robust, stable, and reproducible method to produce nanoparticles based on expansion of supercritical solutions using carbon dioxide as a solvent. The method, controlled expansion of supercritical solution (CESS), uses controlled mass transfer, flow, pressure reduction, and particle collection in dry ice. CESS offers control over the crystallization process as the pressure in the system is reduced according to a specific profile. Particle formation takes place before the exit nozzle, and condensation is the main mechanism for postnucleation particle growth. A 2-step gradient pressure reduction is used to prevent Mach disk formation and particle growth by coagulation. Controlled particle growth keeps the production process stable. With CESS, we produced piroxicam nanoparticles, 60 mg/h, featuring narrow size distribution (176 ± 53 nm). Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  12. Optimal Jet Finder (v1.0 C++)

    NASA Astrophysics Data System (ADS)

    Chumakov, S.; Jankowski, E.; Tkachov, F. V.

    2006-10-01

    We describe a C++ implementation of the Optimal Jet Definition for identification of jets in hadronic final states of particle collisions. We explain the interface subroutines and provide a usage example. The source code is available from http://www.inr.ac.ru/~ftkachov/projects/jets/. Program summary: Title of program: Optimal Jet Finder (v1.0 C++) Catalogue identifier: ADSB_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADSB_v2_0 Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Computer: any computer with a standard C++ compiler Tested with: GNU gcc 3.4.2, Linux Fedora Core 3, Intel i686; Forte Developer 7 C++ 5.4, SunOS 5.9, UltraSPARC III+; Microsoft Visual C++ Toolkit 2003 (compiler 13.10.3077, linker 7.10.30777, option /EHsc), Windows XP, Intel i686. Programming language used: C++ Memory required: ~1 MB (or more, depending on the settings) No. of lines in distributed program, including test data, etc.: 3047 No. of bytes in distributed program, including test data, etc.: 17 884 Distribution format: tar.gz Nature of physical problem: Analysis of hadronic final states in high energy particle collision experiments often involves identification of hadronic jets. A large number of hadrons detected in the calorimeter is reduced to a few jets by means of a jet finding algorithm. The jets are used in further analysis which would be difficult or impossible when applied directly to the hadrons. Grigoriev et al. [D.Yu. Grigoriev, E. Jankowski, F.V. Tkachov, Phys. Rev. Lett. 91 (2003) 061801] provide a brief introduction to the subject of jet finding algorithms, and a general review of the physics of jets can be found in [R. Barlow, Rep. Prog. Phys. 36 (1993) 1067]. Method of solution: The software we provide is an implementation of the so-called Optimal Jet Definition (OJD). The theory of OJD was developed in [F.V. Tkachov, Phys. Rev. Lett. 73 (1994) 2405; Erratum, Phys. Rev. Lett. 74 (1995) 2618; F.V. Tkachov, Int. J. Modern Phys. A 12 (1997) 5411; F.V. Tkachov, Int. J. Modern Phys. A 17 (2002) 2783]. The desired jet configuration is obtained as the one that minimizes Ω, a certain function of the input particles and jet configuration. A FORTRAN 77 implementation of OJD is described in [D.Yu. Grigoriev, E. Jankowski, F.V. Tkachov, Comput. Phys. Comm. 155 (2003) 42]. Restrictions on the complexity of the program: Memory required by the program is proportional to the number of particles in the input × the number of jets in the output. For example, for 650 particles and 20 jets, ~300 KB of memory is required. Typical running time: The running time (in the running mode with a fixed number of jets) is proportional to the number of particles in the input × the number of jets in the output × the number of different random initial configurations tried (ntries). For example, for 65 particles in the input and 4 jets in the output, the running time is ~4·10 s per try (Pentium 4, 2.8 GHz).

  13. Laser based synthesis of nanofunctionalized particulates for pulmonary based controlled drug delivery applications

    NASA Astrophysics Data System (ADS)

    Singh, R. K.; Kim, W.-S.; Ollinger, M.; Craciun, V.; Coowantwong, I.; Hochhaus, G.; Koshizaki, N.

    2002-09-01

    There is an urgent need to develop controlled drug release systems for the delivery of drugs via the pulmonary route. A key issue in pulmonary dry delivery systems is to reduce the amount of biodegradable polymer that is added to control the drug release. We have synthesized nanofunctionalized drug particles (e.g., budesonide) using pulsed laser deposition on particles (PLDP) in an effort to control the architecture and thickness of a nanoscale polymer coating on the drug particles. In vitro studies indicated that the dry release half-life for budesonide can be extended from 1.2 to over 60 min by a nanoscale coating on the drug particle. Extensive studies have been conducted to characterize the bonding and composition of the polymer film deposited on the drug particles.

  14. Multiple electrokinetic actuators for feedback control of colloidal crystal size.

    PubMed

    Juárez, Jaime J; Mathai, Pramod P; Liddle, J Alexander; Bevan, Michael A

    2012-10-21

    We report a feedback control method to precisely target the number of colloidal particles in quasi-2D ensembles and their subsequent assembly into crystals in a quadrupole electrode. Our approach relies on tracking the number of particles within a quadrupole electrode, which is used in a real-time feedback control algorithm to dynamically actuate competing electrokinetic transport mechanisms. Particles are removed from the quadrupole using DC-field mediated electrophoretic-electroosmotic transport, while high-frequency AC-field mediated dielectrophoretic transport is used to concentrate and assemble colloidal crystals. Our results show successful control of the size of crystals containing 20 to 250 colloidal particles with less than 10% error. Assembled crystals are characterized by their radius of gyration, crystallinity, and number of edge particles, and demonstrate the expected size-dependent properties. Our findings demonstrate successful ensemble feedback control of the assembly of different sized colloidal crystals using multiple actuators, which has broad implications for control over nano- and micro- scale assembly processes involving colloidal components.
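
    The feedback loop described above can be summarized as: count the particles inside the quadrupole, compare the count with the target crystal size, and actuate either the DC (removal) or the high-frequency AC (concentration) transport mode accordingly. The sketch below is a toy version of that loop with a made-up stochastic "plant"; the rates and the 5 % tolerance are illustrative assumptions, not the experimental transport dynamics or control gains.

      import numpy as np

      rng = np.random.default_rng(6)

      def control_crystal_size(target, n0=400, tol=0.05, max_steps=2000):
          """Drive the particle count toward `target` by switching actuators."""
          n = n0
          for _ in range(max_steps):
              error = (n - target) / target
              if abs(error) <= tol:
                  break
              if error > 0:
                  # DC electrophoretic/electroosmotic transport removes particles
                  n -= rng.poisson(2)
              else:
                  # high-frequency AC dielectrophoresis concentrates particles
                  n += rng.poisson(1)
          return n

      print(control_crystal_size(target=100))   # lands within ~5 % of 100 particles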

  15. Novel algorithm and MATLAB-based program for automated power law analysis of single particle, time-dependent mean-square displacement

    NASA Astrophysics Data System (ADS)

    Umansky, Moti; Weihs, Daphne

    2012-08-01

    In many physical and biophysical studies, single-particle tracking is utilized to reveal interactions, diffusion coefficients, active modes of driving motion, dynamic local structure, micromechanics, and microrheology. The basic analysis applied to those data is to determine the time-dependent mean-square displacement (MSD) of particle trajectories and perform time- and ensemble-averaging of similar motions. The motion of particles typically exhibits time-dependent power-law scaling, and only trajectories with qualitatively and quantitatively comparable MSD should be ensembled. Ensemble averaging trajectories that arise from different mechanisms, e.g., actively driven and diffusive, is incorrect and can result in inaccurate correlations between structure, mechanics, and activity. We have developed an algorithm to automatically and accurately determine power-law scaling of experimentally measured single-particle MSD. Trajectories can then be categorized and grouped according to user-defined cutoffs of time, amplitude, scaling exponent values, or combinations thereof. Power-law fits are then provided for each trajectory alongside categorized groups of trajectories, histograms of power laws, and the ensemble-averaged MSD of each group. The codes are designed to be easily incorporated into existing user codes. We expect that this algorithm and program will be invaluable to anyone performing single-particle tracking, be it in physical or biophysical systems. Catalogue identifier: AEMD_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMD_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 25 892 No. of bytes in distributed program, including test data, etc.: 5 572 780 Distribution format: tar.gz Programming language: MATLAB (MathWorks Inc.) version 7.11 (2010b) or higher; the program should also be backwards compatible. The Symbolic Math Toolbox (5.5) is required. The Curve Fitting Toolbox (3.0) is recommended. Computer: Tested on Windows only, yet should work on any computer running MATLAB. In Windows 7, the program should be run as administrator; if the user is not the administrator, the program may not be able to save outputs and temporary outputs to all locations. Operating system: Any supporting MATLAB (MathWorks Inc.) v7.11 / 2010b or higher. Supplementary material: Sample output files (approx. 30 MBytes) are available. Classification: 12 External routines: Several MATLAB subfunctions (m-files), freely available on the web, were used as part of, and included in, this code: count, NaN suite, parseArgs, roundsd, subaxis, wcov, wmean, and the executable pdfTK.exe. Nature of problem: In many physical and biophysical areas employing single-particle tracking, having the time-dependent power laws governing the time-averaged mean-square displacement (MSD) of a single particle is crucial. Those power laws determine the mode of motion and hint at the underlying mechanisms driving motion. Accurate determination of the power laws that describe each trajectory will allow categorization into groups for further analysis of single trajectories or ensemble analysis, e.g. ensemble and time-averaged MSD. Solution method: The algorithm in the provided program automatically analyzes and fits time-dependent power laws to single particle trajectories, then groups particles according to user-defined cutoffs.
It accepts time-dependent trajectories of several particles; each trajectory is run through the program, its time-averaged MSD is calculated, and power laws are determined in regions where the MSD is linear on a log-log scale. Our algorithm searches for high-curvature points in experimental data, here the time-dependent MSD. Those serve as anchor points for determining the ranges of the power-law fits. Power-law scaling is then accurately determined, and error estimates of the parameters and quality of fit are provided. After all single-trajectory time-averaged MSDs are fit, we obtain cutoffs from the user to categorize and segment the power laws into groups; cutoffs are either in the exponents of the power laws, the time of appearance of the fits, or both together. The trajectories are sorted according to the cutoffs, and the time- and ensemble-averaged MSD of each group is provided, with histograms of the distributions of the exponents in each group. The program then allows the user to generate new trajectory files with trajectories segmented according to the determined groups, for any further required analysis. Additional comments: A README file giving the names and a brief description of all the files that make up the package, together with clear instructions on the installation and execution of the program, is included in the distribution package. Running time: On an i5 Windows 7 machine with 4 GB RAM the automated parts of the run (excluding data loading and user input) take less than 45 minutes to analyze and save all stages for an 844-trajectory file, including optional PDF save. Trajectory length did not affect run time (tested up to 3600 frames/trajectory), which was on average 3.2±0.4 seconds per trajectory.
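
    A stripped-down sketch of the two central steps, computing a time-averaged MSD and fitting a power law on a log-log scale, is given below in Python (the distributed program itself is MATLAB). The published algorithm fits piecewise between automatically detected high-curvature anchor points; this sketch fits a single power law over the whole lag range, and all names are hypothetical.

      import numpy as np

      def time_averaged_msd(track, max_lag=None):
          """Time-averaged MSD of one trajectory; track has shape (n_frames, 2)."""
          n = len(track)
          max_lag = max_lag or n // 4
          lags = np.arange(1, max_lag)
          msd = np.array([np.mean(np.sum((track[lag:] - track[:-lag])**2, axis=1))
                          for lag in lags])
          return lags, msd

      def power_law_exponent(lags, msd, dt=1.0):
          """Fit MSD ~ A * t**alpha on a log-log scale; returns (alpha, A)."""
          alpha, log_a = np.polyfit(np.log(lags * dt), np.log(msd), 1)
          return alpha, np.exp(log_a)

      # purely diffusive test trajectory: expected exponent close to 1
      rng = np.random.default_rng(7)
      track = np.cumsum(rng.normal(size=(4000, 2)), axis=0)
      lags, msd = time_averaged_msd(track)
      print(power_law_exponent(lags, msd))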

  16. Morphology control of anisotropic BaTiO3 and BaTiOF4 using organic-inorganic interaction

    NASA Astrophysics Data System (ADS)

    Masuda, Yoshitake; Tanaka, Yuki; Gao, Yanfeng; Koumoto, Kunihito

    2009-01-01

    We proposed a novel concept for morphology control of a barium titanate precursor to fabricate platy particles. Organic molecules play an essential role in the crystallization of BaTiOF4, directing the synthesis of multi-needle particles, polyhedral particles, or platy particles in an aqueous solution. The precursors were successfully transformed to single-phase barium titanate by annealing. Platy barium titanate precursor particles are expected to find use in future multilayer ceramic capacitors.

  17. Enhanced hydrophobicity and volatility of submicron aerosols under severe emission control conditions in Beijing

    NASA Astrophysics Data System (ADS)

    Wang, Yuying; Zhang, Fang; Li, Zhanqing

    2017-04-01

    A series of strict emission control measures was implemented in Beijing and the surrounding seven provinces to ensure good air quality during the 2015 China Victory Day parade, providing a unique opportunity to investigate the anthropogenic impact on aerosol properties. Submicron aerosol hygroscopicity and volatility were measured during and after the control period using a hygroscopic and volatile tandem differential mobility analyzer (H/V-TDMA) system. Three periods, namely, the control clean period (Clean1), the non-control clean period (Clean2), and the non-control pollution period (Pollution), were selected to study the effect of the emission control measures on aerosol hygroscopicity and volatility. Aerosol particles became more hydrophobic and volatile due to the emission control measures. The hygroscopicity parameter (κ) of 40-200 nm particles decreased by 32.0%-8.5% during the Clean1 period relative to the Clean2 period, while the volatile shrink factor (SF) of 40-300 nm particles decreased by 7.5%-10.5%. The emission controls also changed the diurnal variation patterns of both the probability density function of κ (κ-PDF) and the probability density function of SF (SF-PDF). During Clean1 the κ-PDF showed one nearly-hydrophobic (NH) mode for particles in the nucleation mode, which was likely due to the dramatic reduction in industrial emissions of inorganic trace gases. Compared to the Pollution period, particles observed during the Clean1 and Clean2 periods exhibited a more significant non-volatile (NV) mode throughout the day, suggesting a more externally-mixed state particularly for the 150 nm particles. Aerosol hygroscopicities increased as particle sizes increased, with the greatest increases seen during the Pollution period. Accordingly, the aerosol volatility became weaker (i.e., SF increased) during the Clean1 and Clean2 periods, but no apparent trend was observed during the Pollution period. Based on a correlation analysis of the number fractions of NH and NV particles, we found a higher number fraction of hydrophobic and volatile particles during the emission control period.

  18. Compact toroid generation, lifetime, and stability studies in linear reversed-field theta pinch geometries, (TRX-1): Second annual and final report, June 1981-March 1983

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, A.L.; Slough, J.T.

    1983-09-01

    Four major areas have been investigated in the triggered reconnection experiment (TRX) program. These areas are flux trapping; formation (reconnection and axial dynamics); stability; and lifetime. This report describes the progress in each of these areas. Flux trapping at relatively slow field reversal rates, due to the formation of a wall sheath, has been accomplished; techniques have been developed for both triggered and programmed reconnection; and the formation process has been optimized for maximum flux retention. The rotational n=2 instability has been controlled through the use of octopole barrier fields, and long particle lifetimes have been achieved through optimization of the formation process. 46 refs., 63 figs., 4 tabs. (FI)

  19. On-chip particle trapping and manipulation

    NASA Astrophysics Data System (ADS)

    Leake, Kaelyn Danielle

    The ability to control and manipulate the world around us is human nature. Humans and our ancestors have used tools for millions of years, yet only in recent years have we been able to control objects at such small scales. In order to understand the world around us, it is frequently necessary to interact with the biological world. Optical trapping and manipulation offer a non-invasive way to move, sort, and interact with particles and cells to see how they react to the world around them. Optical tweezers are ideal in their abilities, but they require large, non-portable, and expensive setups, limiting how and where we can use them. A cheap, portable platform is required for optical manipulation to reach its full potential. On-chip technology offers a great solution to this challenge. We focused on the Liquid-Core Anti-Resonant Reflecting Optical Waveguide (liquid-core ARROW) for our work. The ARROW is an ideal platform: its anti-resonant layers allow light to be guided in liquids, so particles can easily be manipulated. It is manufactured using standard silicon fabrication techniques, making it easy to produce, and its planar design makes it easy to integrate with other technologies. Initially I worked to improve the ARROW chip by reducing the intersection losses and by reducing the fluorescence and background on the ARROW chip. The ARROW chip has already been used to trap and push particles along its channel, but here I introduce several new methods of particle trapping and manipulation on the ARROW chip. Traditional two-beam traps use two counter-propagating beams. A trapping scheme is introduced that uses two orthogonal beams which, counter to first intuition, allow for trapping at their intersection. This scheme is thoroughly predicted and analyzed using realistic conditions. Simulations of this method were done using a program that models both the fluidics and the optical sources to treat complex situations. These simulations were also used to model and predict a sorting method which combines fluid flow with a single optical source to automatically sort dielectric particles by size in waveguide networks. These simulations were shown to be accurate when repeated on-chip. Lastly, I introduce a particle trapping technique that uses Multimode Interference (MMI) patterns in order to trap multiple particles at once. The locations of the traps, as well as the number of trapping locations, can be adjusted by changing the input wavelength. By changing the wavelength back and forth between two values, this MMI pattern can be used to pass a particle down the channel like a conveyor belt.

  20. Production of morphology-controllable porous hyaluronic acid particles using a spray-drying method.

    PubMed

    Iskandar, Ferry; Nandiyanto, Asep Bayu Dani; Widiyastuti, W; Young, Lee Sin; Okuyama, Kikuo; Gradon, Leon

    2009-05-01

    Hyaluronic acid (HA) porous particles with controllable porosity and pore size, ranging from 100 to 300 nm, were successfully prepared using a colloidal templating and spray-drying method. HA powder and polystyrene latex (PSL) particles, which were used as the precursor and templating agent, respectively, were mixed in aqueous solution and spray-dried using a two-fluid nozzle system to produce HA and PSL composite particles. Water was evaporated during spray-drying using heated air with a temperature of 120 degrees C. This simple process was completed within several seconds. The prepared particles were collected and washed with an organic solvent to dissolve the PSL templating agent. The porosity and pore size of the resulting particles were easily controlled by changing the initial mass ratio of precursor to templating agent, i.e., HA to PSL, and by altering the size of the PSL template particles.

  1. Fuzzy logic particle tracking velocimetry

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.

    1993-01-01

    Fuzzy logic has proven to be a simple and robust method for process control. Instead of requiring a complex model of the system, a user-defined rule base is used to control the process. In this paper the principles of fuzzy logic control are applied to Particle Tracking Velocimetry (PTV). Two frames of digitally recorded, single-exposure particle imagery are used as input. The fuzzy processor uses the local particle displacement information to determine the correct particle tracks. Fuzzy PTV is an improvement over traditional PTV techniques, which typically require a sequence (more than two) of image frames to accurately track particles. The fuzzy processor executes in software on a PC without the use of specialized array or fuzzy logic processors. A pair of sample input images, with roughly 300 particle images each, results in more than 200 velocity vectors in under 8 seconds of processing time.
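
    To illustrate how fuzzy membership functions can grade candidate particle matches from two frames, the sketch below scores a candidate displacement by a triangular membership of its disagreement with neighbouring candidate displacements. The membership shape, width, and scoring rule are illustrative assumptions, not the rule base used in the paper.

      import numpy as np

      def triangular_membership(x, center, width):
          """Triangular fuzzy membership, 1 at `center`, falling to 0 at +/- width."""
          return np.clip(1.0 - np.abs(x - center) / width, 0.0, 1.0)

      def match_score(candidate_disp, neighbour_disps, width=3.0):
          """Average agreement (0..1) of a candidate displacement with its neighbours."""
          neighbour_disps = np.atleast_2d(neighbour_disps)
          dists = np.linalg.norm(neighbour_disps - candidate_disp, axis=1)
          return float(np.mean(triangular_membership(dists, 0.0, width)))

      # a candidate whose displacement agrees with the local flow scores close to 1
      print(match_score(np.array([4.8, 0.2]),
                        [[5.0, 0.0], [4.9, 0.3], [5.2, -0.1]]))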

  2. Portable air pollution control equipment for the control of toxic particulate emissions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chaurushia, A.; Odabashian, S.; Busch, E.

    1997-12-31

    Chromium VI (Cr VI) has been identified by the environmental regulatory agencies as a potent carcinogen among eleven heavy metals. A threshold level of 0.0001 lb/year for Cr VI emissions has been established by the California Air Resources Board for reporting under Assembly Bill 2588. A need for an innovative control technology to reduce fugitive emissions of Cr VI was identified during the Air Toxic Emissions Reduction Program at Northrop Grumman Military Aircraft Systems Division (NGMASD). NGMASD operates an aircraft assembly facility in El Segundo, CA. Nearly all of the aircraft components are coated with a protective coating (primer) prior to assembly. The primer has Cr VI as a component for its excellent corrosion resistance property. The complex assembly process requires fasteners which also need primer coating. Therefore, NGMASD utilizes High Volume Low Pressure (HVLP) guns for the touch-up spray coating operations. During the touch-up spray coating operations, Cr VI particles are atomized and transferred to the aircraft surface. The South Coast Air Quality Management District (SCAQMD) has determined that the HVLP gun transfers 65% of the paint particles onto the substrate and the remaining 35% are emitted as an overspray if air pollution controls are not applied. NGMASD has developed the Portable Air Pollution Control Equipment (PAPCE) to capture and control the overspray in order to reduce fugitive Cr VI emissions from the touch-up spray coating operations. A source test was performed per SCAQMD guidelines and the final report has been approved by the SCAQMD.

  3. Methylsilane derived silicon carbide particle coatings produced by fluid-bed chemical vapor deposition

    NASA Astrophysics Data System (ADS)

    Miller, James Henry

    This report describes the research effort that was undertaken to develop and understand processing techniques for the deposition of both low and high density SiC coatings from a non-halide precursor, in support of the Generation IV Gas-Cooled Fast Reactor (GFR) fuel development program. The research was conducted in two phases. In the first phase, the feasibility of producing both porous SiC coatings and dense SiC coatings on surrogate fuel particles by fluidized bed chemical vapor deposition (FBCVD) using gas mixtures of methylsilane and argon was demonstrated. In the second phase, a combined experimental and modeling effort was carried out in order to gain an understanding of the deposition mechanisms that result in either porous or dense SiC coatings, depending on the coating conditions. For this second phase effort, a simplified (compared to the fluid bed) single-substrate chemical vapor deposition (CVD) system was employed. Based on the experimental and modeling results, the deposition of SiC from methylsilane is controlled by the extent of gas-phase reaction, and is therefore highly sensitive to temperature. The results show that all SiC coatings are due to the surface adsorption of species that result from gas-phase reactions. The model terms these gas-borne species embryos, and while the model does not include a prediction of coating morphology, a comparison of the model and experimental results indicates that the morphology of the coatings is controlled by the nucleation and growth of the embryos. The coating that results from small embryos (embryos with only two Si-C pairs) appears relatively dense and continuous, while the coating that results from larger embryos becomes less continuous and more nodular as embryo size increases. At some point in the growth of embryos they cease to behave as molecular species and instead behave as particles that grow by either agglomeration or by incorporation of molecular species on their surface. As these particles adhere to the substrate surface and become fixed in place by surface deposition in the interstices between adjacent particles, a low density coating consisting of these particles results.

  4. Size-controlled synthesis, surface functionalization, and biological applications of thiol-organosilica particles.

    PubMed

    Nakamura, Michihiro; Ozaki, Shuji; Abe, Masahiro; Doi, Hiroyuki; Matsumoto, Toshio; Ishimura, Kazunori

    2010-08-01

    Thiol-organosilica particles of a narrow size distribution, made from 3-mercaptopropyltrimethoxysilane (MPMS), were prepared by means of a one-pot synthesis. We examined three synthetic conditions at high temperature (100 degrees C), including the Stöber synthesis and two entirely aqueous syntheses. Under all conditions, the sizes of MPMS particles were well controlled, and the average coefficient of variation for the size distribution was less than 20%. The incubation times required for formation of MPMS particles were shorter at high temperature than at low temperature. MPMS particles internally functionalized with fluorescent dye were also prepared by means of the same one-pot synthesis. On flow cytometry analysis these MPMS particles showed distinct scattering peaks, due to their well-controlled sizes, as well as distinct fluorescence signals. Interaction between fluorescent MPMS particles and cultured cells could be observed in real time under fluorescence microscopy with bright light. The surface of the as-prepared MPMS particles contained exposed mercaptopropyl residues, and their ability to adsorb proteins was at least 6 times higher than that of gold nanoparticles. In addition, fluorescein-labeled proteins adsorbed to the surface of the particles were quantitatively detected at the pg/ml level by flow cytometry. MPMS particles surface-functionalized with anti-CD20 antibody by adsorption could bind specifically to lymphoma cells expressing CD20. In this paper, we demonstrated the potential of size-controlled thiol-organosilica particles for a wide range of biological applications. Crown Copyright 2010. Published by Elsevier B.V. All rights reserved.

  5. Quarked! - Adventures in Particle Physics Education

    NASA Astrophysics Data System (ADS)

    MacDonald, Teresa; Bean, Alice

    2009-01-01

    Particle physics is a subject that can send shivers down the spines of students and educators alike-with visions of long mathematical equations and inscrutable ideas. This perception, along with a full curriculum, often leaves this topic the road less traveled until the latter years of school. Particle physics, including quarks, is typically not introduced until high school or university.1,2 Many of these concepts can be made accessible to younger students when presented in a fun and engaging way. Informal science institutions are in an ideal position to communicate new and challenging science topics in engaging and innovative ways and offer a variety of educational enrichment experiences for students that support and enhance science learning.3 Quarked!™ Adventures in the Subatomic Universe, a National Science Foundation EPSCoR-funded particle physics education program, provides classroom programs and online educational resources.

  6. Dense Suspension Splash

    NASA Astrophysics Data System (ADS)

    Zhang, Wendy; Dodge, Kevin M.; Peters, Ivo R.; Ellowitz, Jake; Klein Schaarsberg, Martin H.; Jaeger, Heinrich M.

    2014-03-01

    Upon impact onto a solid surface at several meters per second, a dense suspension plug splashes by ejecting liquid-coated particles. We study the mechanism for splash formation using experiments and a numerical model. In the model, the dense suspension is idealized as a collection of cohesionless, rigid grains with finite surface roughness. The grains also experience lubrication drag as they approach, collide inelastically, and rebound away from each other. Simulations using this model reproduce the measured momentum distribution of ejected particles. They also provide direct evidence supporting the conclusion from earlier experiments that inelastic collisions, rather than viscous drag, dominate when the suspension contains macroscopic particles immersed in a low-viscosity solvent such as water. Finally, the simulations reveal two distinct routes for splash formation: a particle can be ejected by a single high momentum-change collision or, more surprisingly, by the accumulation of a succession of small momentum-change collisions. Supported by NSF through its MRSEC program (DMR-0820054) and fluid dynamics program (CBET-1336489).
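
    The grain-level interaction rule can be made concrete with the following toy one-dimensional sketch (not the authors' simulation): two equal rigid grains approach along their line of centers, are decelerated by a lubrication drag that diverges as the gap closes, and rebound inelastically once the gap reaches an assumed surface-roughness scale; all parameter values (grain radius, density, solvent viscosity, restitution coefficient) are illustrative.

      # Toy 1D sketch of lubrication drag followed by an inelastic collision.
      # Not the authors' model; every number below is illustrative.
      import math

      R, rho_p, mu, e, h_rough = 50e-6, 2500.0, 1e-3, 0.3, 0.5e-6   # water-like solvent
      m = rho_p * (4.0 / 3.0) * math.pi * R**3
      m_red = m / 2.0                                # reduced mass of the equal pair
      R_star = R / 2.0                               # reduced radius of the equal pair

      h, v = 10e-6, -0.5                             # gap [m], relative velocity [m/s]
      dt = 1e-9
      while h > h_rough:
          F_lub = -6.0 * math.pi * mu * R_star**2 / h * v   # opposes the approach
          v += (F_lub / m_red) * dt
          h += v * dt
      v = -e * v                                     # inelastic rebound at contact
      print(f"rebound speed {abs(v):.3f} m/s from an initial approach speed of 0.5 m/s")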

  7. Theoretic model and computer simulation of separating mixture metal particles from waste printed circuit board by electrostatic separator.

    PubMed

    Li, Jia; Xu, Zhenming; Zhou, Yaohe

    2008-05-30

    Traditionally, the mixed metals from waste printed circuit board (PCB) were sent to a smelter to refine pure copper. Some valuable metals (aluminum, zinc and tin) present at low content in PCB were lost during smelting. A new method which uses a roll-type electrostatic separator (RES) to recover low-content metals in waste PCB is presented in this study. A theoretical model, established by computing the electric field and analyzing the forces on the particles, was implemented as a program in the MATLAB language. The program was designed to simulate the process of separating mixed metal particles. Electrical, material and mechanical factors were analyzed to optimize the operating parameters of the separator. The experimental results of separating copper and aluminum particles by RES agreed well with the computer simulation results. The model could be used to simulate the separation of other metal (tin, zinc, etc.) particles during the process of recycling waste PCBs by RES.
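
    The following minimal Python sketch (an illustration, not the authors' MATLAB program) shows the kind of force-balance trajectory calculation the abstract describes, for a charged metal particle after it leaves the rotating roll: the particle is advanced under gravity and an assumed uniform electric field between roll and electrode, with drag neglected; the charge, mass, field strength, and release velocity are illustrative placeholders.

      # Minimal sketch: trajectory of a charged metal particle leaving the roll of an
      # electrostatic separator. Assumptions: uniform field, spherical particle, no drag.
      import numpy as np

      def trajectory(q, m, v0, E=(2.0e5, 0.0), g=9.81, dt=1e-4, t_max=0.2):
          """Integrate m*a = q*E + m*g with explicit Euler steps; returns x, y arrays."""
          E = np.asarray(E, dtype=float)
          pos = np.zeros(2)                      # particle released at the origin
          vel = np.asarray(v0, dtype=float)      # velocity at the moment of release
          xs, ys = [pos[0]], [pos[1]]
          for _ in range(int(t_max / dt)):
              acc = (q / m) * E + np.array([0.0, -g])
              vel = vel + acc * dt
              pos = pos + vel * dt
              xs.append(pos[0])
              ys.append(pos[1])
          return np.array(xs), np.array(ys)

      # Example: a 0.5 mm copper sphere (~5.8e-7 kg) carrying 1e-10 C of induced
      # charge, leaving the roll at 1.5 m/s horizontally.
      x, y = trajectory(q=1e-10, m=5.8e-7, v0=(1.5, 0.0))
      print(f"landing offset: x = {x[-1]:.3f} m, y = {y[-1]:.3f} m")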

  8. Assessing consumption of bioactive micro-particles by filter-feeding Asian carp

    USGS Publications Warehouse

    Jensen, Nathan R.; Amberg, Jon J.; Luoma, James A.; Walleser, Liza R.; Gaikowski, Mark P.

    2012-01-01

    Silver carp Hypophthalmichthys molitrix (SVC) and bighead carp H. nobilis (BHC) have impacted waters in the US since their escape. Current chemical controls for aquatic nuisance species are non-selective. Development of a bioactive micro-particle that exploits the filter-feeding habits of SVC or BHC could result in a new control tool. It is not fully understood whether SVC or BHC will consume bioactive micro-particles. Two discrete trials were performed to: 1) evaluate whether SVC and BHC consume the candidate micro-particle formulation; 2) determine what size they consume; and 3) establish methods to evaluate consumption by filter-feeders for future experiments. Both SVC and BHC were exposed to small (50-100 μm) and large (150-200 μm) micro-particles in two 24-h trials. Particles in water were counted electronically and manually (microscopy). Particles on gill rakers were counted manually, and intestinal tracts were inspected for the presence of micro-particles. In Trial 1, both manual and electronic count data confirmed reductions in particles of both sizes; SVC appeared to remove more small particles than large; more BHC consumed particles; and SVC had fewer overall particles in their gill rakers than BHC. In Trial 2, electronic counts confirmed reductions in particles of both sizes; both SVC and BHC consumed particles, yet more SVC consumed micro-particles compared to BHC. Of the fish that ate micro-particles, SVC consumed more than BHC. It is recommended to use multiple metrics to assess consumption of candidate micro-particles by filter-feeders when attempting to distinguish differential particle consumption. This study has implications for developing micro-particles for species-specific delivery of bioactive controls to help fisheries, provides methods for further experiments with bioactive micro-particles, and may also have applications in aquaculture.

  9. Performance comparison of genetic algorithms and particle swarm optimization for model integer programming bus timetabling problem

    NASA Astrophysics Data System (ADS)

    Wihartiko, F. D.; Wijayanti, H.; Virgantari, F.

    2018-03-01

    Genetic Algorithm (GA) is a common algorithm used to solve optimization problems with an artificial intelligence approach, as is the Particle Swarm Optimization (PSO) algorithm. The two algorithms have different advantages and disadvantages when applied to the Model Integer Programming for Bus Timetabling Problem (MIPBTP), in which the optimal number of trips must be found subject to various constraints. The comparison results show that the PSO algorithm is superior in terms of complexity, accuracy, iteration count, and program simplicity in finding the optimal solution.
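
    As an illustration of the PSO side of the comparison, the sketch below implements a generic particle swarm optimizer whose positions are rounded to integers at every step, which is one common way of adapting PSO to integer programming; the objective function and bounds are toy placeholders, not the MIPBTP model itself.

      # Hedged sketch of an integer-valued particle swarm optimizer; toy objective.
      import numpy as np

      rng = np.random.default_rng(0)

      def pso_integer(objective, lo, hi, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5):
          dim = len(lo)
          x = rng.integers(lo, hi + 1, size=(n_particles, dim)).astype(float)
          v = np.zeros_like(x)
          pbest = x.copy()
          pbest_val = np.array([objective(p) for p in x])
          gbest = pbest[pbest_val.argmin()].copy()
          for _ in range(n_iter):
              r1, r2 = rng.random(x.shape), rng.random(x.shape)
              v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
              x = np.clip(np.rint(x + v), lo, hi)        # round to keep variables integer
              vals = np.array([objective(p) for p in x])
              improved = vals < pbest_val
              pbest[improved], pbest_val[improved] = x[improved], vals[improved]
              gbest = pbest[pbest_val.argmin()].copy()
          return gbest, pbest_val.min()

      # Toy example: minimize the total number of trips subject to a demand penalty
      # (a hypothetical stand-in for the timetabling constraints).
      obj = lambda trips: trips.sum() + 100 * max(0.0, 60 - 5 * trips.sum())
      best, val = pso_integer(obj, lo=np.array([0] * 4), hi=np.array([20] * 4))
      print(best, val)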

  10. Phase I: energy conservation potential of Portland Cement particle size distribution control. Progress report, November 1978-January 1979

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helmuth, R.A.

    1979-03-01

    Progress is reported on the energy conservation potential of Portland cement particle size distribution control. Results of the preliminary concrete tests, Series IIIa and Series IIIb, on the effects of particle size ranges on strength and drying shrinkage, are presented. The Series IV tests, on the effects of mixing and curing temperature, compare the properties of several good particle-size-controlled cements with normally ground cements at low and high temperatures. Work on the effects of high-alkali and high-sulfate clinker cements (Series V) has begun.

  11. Blind Quantum Signature with Controlled Four-Particle Cluster States

    NASA Astrophysics Data System (ADS)

    Li, Wei; Shi, Jinjing; Shi, Ronghua; Guo, Ying

    2017-08-01

    A novel blind quantum signature scheme based on cluster states is introduced. Cluster states are a type of multi-qubit entangled state that is more immune to decoherence than other entangled states. The controlled four-particle cluster states are created by applying a controlled-Z gate to particles of four-particle cluster states. The presented scheme utilizes the above entangled states and simplifies the measurement basis to generate and verify the signature. Security analysis demonstrates that the scheme is unconditionally secure. It can be employed in E-commerce systems in the quantum scenario.

  12. Electrodynamic Dust Shield Technology for Thermal Radiators Used in Lunar Exploration

    NASA Technical Reports Server (NTRS)

    Calle, Carlos I.; Hogue, Michael D.; Snyder, Sarah J.; Clements, Sidney J.; Johansen, Michael R.; Chen, Albert

    2011-01-01

    Two general types of thermal radiators are being considered for lunar missions: coated metallic surfaces and Second Surface Mirrors. Metallic surfaces are coated with a specially formulated white paint that withstands the space environment and adheres well to aluminium, the most common metal used in space hardware. AZ-93 White Thermal Control Paint, developed for the space program, is an electrically conductive inorganic coating that offers thermal control for spacecraft. It is currently in use on satellite surfaces. This paint withstands exposure to atomic oxygen, charged particle radiation, and vacuum ultraviolet radiation from 118 nm to 170 nm while reflecting 84 to 85% of the incident solar radiation and emitting 89-93% of the heat generated inside the spacecraft.

  13. The Ferrofluids Story

    NASA Technical Reports Server (NTRS)

    1993-01-01

    A new Ferrofluidics exclusion seal promises improvement in controlling "fugitive emissions" -vapors that escape into the atmosphere from petroleum refining and chemical processing facilities. These are primarily volatile organic compounds, and their emissions are highly regulated by the EPA. The ferrofluid system consists of a primary mechanical seal working in tandem with a secondary seal. Ferrofluids are magnetic liquids - fluids in which microscopic metal particles have been suspended, allowing the liquid to be controlled by a magnetic force. The concept was developed in the early years of the Space program, but never used. Two Avco scientists, however, saw commercial potential in ferrofluids and formed a company. Among exclusion seal commercial applications are rotary feedthrough seals, hydrodynamic bearings and fluids for home and automotive loudspeakers. Ferrofluidics has subsidiaries throughout the world.

  14. Effect of Transitioning from Standard Reference Material 2806a to Standard Reference Material 2806b for Light Obscuration Particle Counting

    DTIC Science & Technology

    2016-04-01

    Report documentation (Standard Form 298) fields: Effect of Transitioning from Standard Reference Material 2806a to Standard Reference Material 2806b for Light Obscuration Particle Counting; Joel Schmitigal, Force Projection; April 2016; UNCLASSIFIED.

  15. Reduced atomic shadowing in HiPIMS: Role of the thermalized metal ions

    NASA Astrophysics Data System (ADS)

    Oliveira, João Carlos; Ferreira, Fábio; Anders, André; Cavaleiro, Albano

    2018-03-01

    In magnetron sputtering, the ability to tailor film properties depends primarily on control of the flux of particles impinging on the growing film. Among deposition mechanisms, the shadowing effect leads to the formation of a rough surface and a porous, columnar microstructure. Re-sputtered species may be re-deposited in the valleys of the film's surface and thereby contribute to a reduction of roughness and to filling of the underdense regions. Both effects are non-local, and they directly compete to shape the final properties of the deposited films. Additional control of the bombarding flux can be obtained by ionizing the sputtered flux, because ions can be controlled with respect to their energy and impinging direction, as in High-Power Impulse Magnetron Sputtering (HiPIMS). In this work, the relation between ionization of the sputtered species and thin film properties is investigated in order to identify the mechanisms which effectively influence the shadowing effect in Deep Oscillation Magnetron Sputtering (DOMS), a variant of HiPIMS. The properties of two Cr films deposited using the same average target power by d.c. magnetron sputtering and DOMS were compared. Additionally, the angular distribution of the Cr species impinging on the substrate was simulated using Monte Carlo-based programs, while the energy distribution of the energetic particles bombarding the substrate was evaluated by energy-resolved mass analysis. It was found that the acceleration of the thermalized chromium ions at the substrate sheath in DOMS significantly reduces the high-angle component of their impinging angle distribution and thus efficiently reduces atomic shadowing. Therefore, a high degree of ionization in HiPIMS results in film deposition that is almost free of shadowing effects and allows dense, compact films to be deposited without the need for high-energy particle bombardment during growth.

  16. Reduction of angular divergence of laser-driven ion beams during their acceleration and transport

    NASA Astrophysics Data System (ADS)

    Zakova, M.; Pšikal, Jan; Margarone, Daniele; Maggiore, Mario; Korn, G.

    2015-05-01

    Laser plasma physics is a field of great interest because of its implications for basic science, fast ignition, medicine (i.e. hadrontherapy), astrophysics, material science, particle acceleration, etc. 100-MeV class protons accelerated from the interaction of a short laser pulse with a thin target have been demonstrated. With the continuing development of laser technology, greater and greater energies are expected; therefore projects focusing on various applications are being formed, e.g. ELIMAIA (ELI Multidisciplinary Applications of laser-Ion Acceleration). One of the main characteristics, and a crucial disadvantage, of ion beams accelerated by ultra-short intense laser pulses is their large divergence, which is not suitable for most applications. In this paper two ways to decrease the beam divergence are proposed. Firstly, the impact of different target designs on beam divergence is studied using 2D particle-in-cell (PIC) simulations. The target types include flat foils, a curved foil, and foils with diverse microstructures. The obtained results show that well-designed microstructures, i.e. a hole in the center of the target, can produce the proton beam with the lowest divergence. Moreover, the particle beam accelerated from a curved foil has lower divergence compared to the beam from a flat foil. Secondly, another proposed method for divergence reduction is the use of a magnetic solenoid. The trajectories of the laser-accelerated particles passing through the solenoid are modeled in a simple Matlab program; results from the PIC simulations are used as input. The divergence is controlled by optimizing the magnetic field inside the solenoid and installing an aperture in front of the device.
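
    The paraxial solenoid model behind this kind of divergence control can be sketched as follows (a hedged illustration, not the paper's Matlab code): in the rotating Larmor frame each proton's transverse coordinate obeys x'' + k^2 x = 0 with k = qB/(2p), so a solenoid of length L acts as the 2x2 focusing matrix below; the field strength, solenoid length, drift distance, and beam parameters are illustrative.

      # Hedged sketch of paraxial solenoid focusing of laser-accelerated protons.
      import numpy as np

      q, m_p, c = 1.602e-19, 1.673e-27, 2.998e8

      def solenoid_matrix(B, L, p):
          k = q * B / (2.0 * p)                      # focusing strength [1/m]
          return np.array([[np.cos(k * L), np.sin(k * L) / k],
                           [-k * np.sin(k * L), np.cos(k * L)]])

      def momentum(E_MeV):
          """Relativistic momentum of a proton with kinetic energy E_MeV, in kg*m/s."""
          E_J = E_MeV * 1.602e-13
          return np.sqrt(E_J**2 + 2.0 * E_J * m_p * c**2) / c

      rng = np.random.default_rng(4)
      n = 10000
      x0 = rng.normal(0.0, 5e-6, n)                  # source size of a few micrometers
      xp0 = rng.normal(0.0, 0.2, n)                  # initial divergence ~200 mrad rms

      # Drift 5 cm to the solenoid, then a 10 T, 15 cm solenoid, for 10 MeV protons
      drift = np.array([[1.0, 0.05], [0.0, 1.0]])
      M = solenoid_matrix(B=10.0, L=0.15, p=momentum(10.0)) @ drift
      x1, xp1 = M @ np.vstack([x0, xp0])
      print(f"rms divergence: {xp0.std()*1e3:.0f} mrad -> {xp1.std()*1e3:.0f} mrad")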

  17. Modeling Lost-Particle Backgrounds in PEP-II Using LPTURTLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fieguth, T.; /SLAC; Barlow, R.

    2005-05-17

    Background studies during the design, construction, commissioning, operation and improvement of BaBar and PEP-II have been greatly influenced by results from a program referred to as LPTURTLE (Lost Particle TURTLE) which was originally conceived for the purpose of studying gas background for SLC. This venerable program is still in use today. We describe its use, capabilities and improvements and refer to current results now being applied to BaBar.

  18. Modeling and Numerical Challenges in Eulerian-Lagrangian Computations of Shock-driven Multiphase Flows

    NASA Astrophysics Data System (ADS)

    Diggs, Angela; Balachandar, Sivaramakrishnan

    2015-06-01

    The present work addresses the numerical methods required for particle-gas and particle-particle interactions in Eulerian-Lagrangian simulations of multiphase flow. Local volume fraction as seen by each particle is the quantity of foremost importance in modeling and evaluating such interactions. We consider a general multiphase flow with a distribution of particles inside a fluid flow discretized on an Eulerian grid. Particle volume fraction is needed both as a Lagrangian quantity associated with each particle and also as an Eulerian quantity associated with the flow. In Eulerian Projection (EP) methods, the volume fraction is first obtained within each cell as an Eulerian quantity and then interpolated to each particle. In Lagrangian Projection (LP) methods, the particle volume fraction is obtained at each particle and then projected onto the Eulerian grid. Traditionally, EP methods are used in multiphase flow, but sub-grid resolution can be obtained through use of LP methods. By evaluating the total error and its components we compare the performance of EP and LP methods. The standard von Neumann error analysis technique has been adapted for rigorous evaluation of rate of convergence. The methods presented can be extended to obtain accurate field representations of other Lagrangian quantities. Most importantly, we will show that such careful attention to numerical methodologies is needed in order to capture complex shock interaction with a bed of particles. Supported by U.S. Department of Defense SMART Program and the U.S. Department of Energy PSAAP-II program under Contract No. DE-NA0002378.
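
    A minimal one-dimensional illustration of the two projection strategies (not the authors' solver) is sketched below: the Eulerian Projection bins particle volumes into cells and interpolates the resulting cell volume fraction back to the particles, while the Lagrangian Projection spreads each particle's volume onto the grid with a hat kernel; the grid size, particle count, and particle volumes are arbitrary illustrative values.

      # Illustrative 1D comparison of Eulerian and Lagrangian volume-fraction projection.
      import numpy as np

      L, n_cells = 1.0, 20
      dx = L / n_cells
      centers = (np.arange(n_cells) + 0.5) * dx

      rng = np.random.default_rng(1)
      xp = rng.random(500)                     # particle positions in [0, 1)
      vp = np.full(500, 1.0e-4)                # particle volumes (uniform, illustrative)

      # Eulerian Projection: cell-wise volume fraction, then interpolation to particles
      phi_ep = np.zeros(n_cells)
      np.add.at(phi_ep, (xp / dx).astype(int).clip(0, n_cells - 1), vp / dx)
      phi_at_particles = np.interp(xp, centers, phi_ep)

      # Lagrangian Projection: spread each particle with a triangular (hat) kernel
      phi_lp = np.zeros(n_cells)
      for x, v in zip(xp, vp):
          w = np.clip(1.0 - np.abs(centers - x) / dx, 0.0, None)   # hat of width 2*dx
          phi_lp += (w / w.sum()) * v / dx                         # conserve particle volume
      print(phi_ep.sum() * dx, phi_lp.sum() * dx)   # both recover the total particle volume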

  19. Fusion programs in applied plasma physics

    NASA Astrophysics Data System (ADS)

    1992-07-01

    The Applied Plasma Physics (APP) program at General Atomics (GA) described here includes four major elements: (1) Applied Plasma Physics Theory Program, (2) Alpha Particle Diagnostic, (3) Edge and Current Density Diagnostic, and (4) Fusion User Service Center (USC). The objective of the APP theoretical plasma physics research at GA is to support the DIII-D and other tokamak experiments and to significantly advance our ability to design a commercially-attractive fusion reactor. We categorize our efforts in three areas: magnetohydrodynamic (MHD) equilibria and stability; plasma transport with emphasis on H-mode, divertor, and boundary physics; and radio frequency (RF). The objective of the APP alpha particle diagnostic is to develop diagnostics of fast confined alpha particles using the interactions with the ablation cloud surrounding injected pellets and to develop diagnostic systems for reacting and ignited plasmas. The objective of the APP edge and current density diagnostic is to first develop a lithium beam diagnostic system for edge fluctuation studies on the Texas Experimental Tokamak (TEXT). The objective of the Fusion USC is to continue to provide maintenance and programming support to computer users in the GA fusion community. The detailed progress of each separate program covered in this report period is described.

  20. A program for performing exact quantum dynamics calculations using cylindrical polar coordinates: A nanotube application

    NASA Astrophysics Data System (ADS)

    Skouteris, Dimitris; Gervasi, Osvaldo; Laganà, Antonio

    2009-03-01

    A program that uses the time-dependent wavepacket method to study the motion of structureless particles in a force field of quasi-cylindrical symmetry is presented here. The program utilises cylindrical polar coordinates to express the wavepacket, which is subsequently propagated using a Chebyshev expansion of the Schrödinger propagator. Time-dependent exit flux as well as energy-dependent S matrix elements can be obtained for all states of the particle (describing its angular momentum component along the nanotube axis and the excitation of the radial degree of freedom in the cylinder). The program has been used to study the motion of an H atom across a carbon nanotube. Program summary - Program title: CYLWAVE. Catalogue identifier: AECL_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AECL_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 3673. No. of bytes in distributed program, including test data, etc.: 35 237. Distribution format: tar.gz. Programming language: Fortran 77. Computer: RISC workstations. Operating system: UNIX. RAM: 120 MBytes. Classification: 16.7, 16.10. External routines: SUNSOFT performance library (not essential), TFFT2D.F (Temperton Fast Fourier Transform), BESSJ.F (from Numerical Recipes, for the calculation of Bessel functions) (included in the distribution file). Nature of problem: Time evolution of the state of a structureless particle in a quasicylindrical potential. Solution method: Time-dependent wavepacket propagation. Running time: 50000 secs. The test run supplied with the distribution takes about 10 minutes to complete.
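
    The Chebyshev propagation step that such codes rely on can be sketched in a few lines (a generic one-dimensional illustration, not CYLWAVE itself): the Hamiltonian is rescaled to the interval [-1, 1] and exp(-iHt) is expanded in Chebyshev polynomials of the rescaled operator with Bessel-function coefficients; the grid, potential, and initial wavepacket below are illustrative, with hbar = m = 1.

      # Minimal 1D sketch of the Chebyshev expansion of the Schrödinger propagator.
      import numpy as np
      from scipy.special import jv

      n, L = 256, 40.0
      x = np.linspace(-L / 2, L / 2, n)
      dx = x[1] - x[0]

      # Finite-difference Hamiltonian H = -0.5 d2/dx2 + V(x), with a harmonic well
      V = 0.5 * x**2
      H = (np.diag(1.0 / dx**2 + V)
           - np.diag(np.full(n - 1, 0.5 / dx**2), 1)
           - np.diag(np.full(n - 1, 0.5 / dx**2), -1))

      def chebyshev_propagate(psi, H, t, extra_terms=60):
          e_min, e_max = np.linalg.eigvalsh(H)[[0, -1]]   # spectral bounds (cheap at this size)
          half_span, e_mid = 0.5 * (e_max - e_min), 0.5 * (e_max + e_min)
          h_norm = (H - e_mid * np.eye(len(psi))) / half_span
          alpha = half_span * t                           # expansion parameter
          n_terms = int(alpha) + extra_terms              # coefficients die out past alpha
          phi_prev, phi = psi, h_norm @ psi               # T_0 psi and T_1 psi
          out = jv(0, alpha) * phi_prev + 2 * (-1j) * jv(1, alpha) * phi
          for k in range(2, n_terms):
              phi_prev, phi = phi, 2 * (h_norm @ phi) - phi_prev
              out += 2 * (-1j) ** k * jv(k, alpha) * phi
          return np.exp(-1j * e_mid * t) * out

      psi0 = np.exp(-(x - 2.0) ** 2).astype(complex)      # displaced Gaussian packet
      psi0 /= np.linalg.norm(psi0)
      psi_t = chebyshev_propagate(psi0, H, t=1.0)
      print("norm after propagation:", np.linalg.norm(psi_t))   # unitarity check, ~1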

  1. Surgical smoke control with local exhaust ventilation: Experimental study.

    PubMed

    Lee, Taekhee; Soo, Jhy-Charm; LeBouf, Ryan F; Burns, Dru; Schwegler-Berry, Diane; Kashon, Michael; Bowers, Jay; Harper, Martin

    2018-04-01

    This experimental study aimed to evaluate airborne particulates and volatile organic compounds (VOCs) from surgical smoke when a local exhaust ventilation (LEV) system is in place. Surgical smoke was generated from human tissue in an unoccupied operating room using an electrocautery surgical device for 15 min with 3 different test settings: (1) without LEV control; (2) control with a wall irrigation suction unit with an in-line ultra-low penetration air filter; and (3) control with a smoke evacuation system. The flow rate of the LEVs was approximately 35 L/min and suction was maintained within 5 cm of the electrocautery interaction site. A total of 6 experiments were conducted. Particle number and mass concentrations were measured using direct-reading instruments including a condensation particle counter (CPC), a light-scattering laser photometer (DustTrak DRX), a scanning mobility particle sizer (SMPS), an aerodynamic particle sizer (APS), and a viable particle counter. Selected VOCs were collected in evacuated canisters using grab, personal, and area sampling techniques. The largest average particle and VOC concentrations were found in the absence of LEV control, followed by the LEV controls. Average ratios of LEV controls to no LEV control ranged from 0.24-0.33 (CPC), 0.28-0.39 (SMPS), 0.14-0.31 (DustTrak DRX), and 0.26-0.55 (APS). Ethanol and isopropyl alcohol were dominant in the canister samples. Acetaldehyde, acetone, acetonitrile, benzene, hexane, styrene, and toluene were detected, but at lower concentrations (<500 μg/m3) that were much less than the National Institute for Occupational Safety and Health recommended exposure limit values. Utilization of the LEVs for surgical smoke control can significantly reduce but not completely eliminate airborne particles and VOCs.

  2. In-situ heating TEM observation of microscopic structural changes of size-controlled metallic copper/gelatin composite.

    PubMed

    Narushima, Takashi; Hyono, Atsushi; Nishida, Naoki; Yonezawa, Tetsu

    2012-10-01

    Copper/gelatin composite particles with controlled sizes were prepared at room temperature from cupric sulfate pentahydrate in the presence of gelatin as a protective reagent, using hydrazine monohydrate as a reducing agent. The particles formed, with sizes between 190 and 940 nm, were secondary aggregates composed of smaller nanosized particles ("particle-in-particle"), the presence of which was established by XRD patterns and a cross-sectional TEM image. The sintering behavior of these copper/gelatin composite particles was demonstrated by in-situ heating TEM under a high vacuum (approximately 10(-5) Pa) and, separately, with the oxygen partial pressure controlled at the 10(-4) Pa level. It was established that the particles began to sinter at about 330 degrees C in the presence of oxygen and that they sublimate above 450 degrees C under both the vacuum and the oxygen conditions. This result shows that the introduction of an adequate amount of oxygen was effective in removing the gelatin surrounding the particles. It can also be concluded that sintering of the copper/gelatin composite particles occurred even in the absence of a reducing agent such as hydrogen gas.

  3. Classroom Materials for Teaching "The Particle Nature of Matter." Practical Paper No. 173.

    ERIC Educational Resources Information Center

    Pella, Milton O.; And Others

    This document presents the lesson plans and tests used in the research study reported in Technical Report 173 (ED 070 658), together with descriptions of models and films developed for the teaching program. Thirty-one lessons are included, covering the topics of matter and energy; making inferences; particles; a model for matter; particles and…

  4. Operational radiological support for the US manned space program

    NASA Technical Reports Server (NTRS)

    Golightly, Michael J.; Hardy, Alva C.; Atwell, William; Weyland, Mark D.; Kern, John; Cash, Bernard L.

    1993-01-01

    Radiological support for the manned space program is provided by the Space Radiation Analysis Group at NASA/JSC. This support ensures crew safety through mission design analysis, real-time space environment monitoring, and crew exposure measurements. Preflight crew exposure calculations using mission design information are used to ensure that crew exposures will remain within established limits. During missions, space environment conditions are continuously monitored from within the Mission Control Center. In the event of a radiation environment enhancement, the impact to crew exposure is assessed and recommendations are provided to flight management. Radiation dosimeters are placed throughout the spacecraft and provided to each crewmember. During a radiation contingency, the crew could be requested to provide dosimeter readings. This information would be used for projecting crew dose enhancements. New instrumentation and computer technology are being developed to improve the support. Improved instruments include tissue equivalent proportional counter (TEPC)-based dosimeters and charged particle telescopes. Data from these instruments will be telemetered and will provide flight controllers with unprecedented information regarding the radiation environment in and around the spacecraft. New software is being acquired and developed to provide 'smart' space environmental data displays for use by flight controllers.

  5. Aircraft measurements of trace gases and particles near the tropopause

    NASA Technical Reports Server (NTRS)

    Falconer, P.; Pratt, R.; Detwiler, A.; Chen, C. S.; Hogan, A.; Bernard, S.; Krebschull, K.; Winters, W.

    1983-01-01

    Research activities which were performed using atmospheric constituent data obtained by the NASA Global Atmospheric Sampling Program are described. The characteristics of the particle size spectrum in various meteorological settings from a special collection of GASP data are surveyed. The relationship between humidity and cloud particles is analyzed. Climatological and case studies of tropical ozone distributions measured on a large number of flights are reported. Particle counter calibrations are discussed as well as the comparison of GASP particle data in the upper troposphere with other measurements at lower altitudes over the Pacific Ocean.

  6. Asthma-Related Outcomes in Patients Initiating Extrafine Ciclesonide or Fine-Particle Inhaled Corticosteroids

    PubMed Central

    Postma, Dirkje S.; Dekhuijzen, Richard; van der Molen, Thys; Martin, Richard J.; van Aalderen, Wim; Roche, Nicolas; Guilbert, Theresa W.; Israel, Elliot; van Eickels, Daniela; Khalid, Javaria Mona; Herings, Ron M.C.; Overbeek, Jetty A.; Miglio, Cristiana; Thomas, Victoria; Hutton, Catherine; Hillyer, Elizabeth V.

    2017-01-01

    Purpose Extrafine-particle inhaled corticosteroids (ICS) have greater small airway deposition than standard fine-particle ICS. We sought to compare asthma-related outcomes after patients initiated extrafine-particle ciclesonide or fine-particle ICS (fluticasone propionate or non-extrafine beclomethasone). Methods This historical, matched cohort study included patients aged 12-60 years prescribed their first ICS as ciclesonide or fine-particle ICS. The 2 cohorts were matched 1:1 for key demographic and clinical characteristics over the baseline year. Co-primary endpoints were 1-year severe exacerbation rates, risk-domain asthma control, and overall asthma control; secondary endpoints included therapy change. Results Each cohort included 1,244 patients (median age 45 years; 65% women). Patients in the ciclesonide cohort were comparable to those in the fine-particle ICS cohort apart from higher baseline prevalence of hospitalization, gastroesophageal reflux disease, and rhinitis. Median (interquartile range) prescribed doses of ciclesonide and fine-particle ICS were 160 (160-160) µg/day and 500 (250-500) µg/day, respectively (P<0.001). During the outcome year, patients prescribed ciclesonide experienced lower severe exacerbation rates (adjusted rate ratio [95% CI], 0.69 [0.53-0.89]), and higher odds of risk-domain asthma control (adjusted odds ratio [95% CI], 1.62 [1.27-2.06]) and of overall asthma control (2.08 [1.68-2.57]) than those prescribed fine-particle ICS. The odds of therapy change were 0.70 (0.59-0.83) with ciclesonide. Conclusions In this matched cohort analysis, we observed that initiation of ICS with ciclesonide was associated with better 1-year asthma outcomes and fewer changes to therapy, despite data suggesting more difficult-to-control asthma. The median prescribed dose of ciclesonide was one-third that of fine-particle ICS. PMID:28102056

  7. On the Green's function of the partially diffusion-controlled reversible ABCD reaction for radiation chemistry codes

    NASA Astrophysics Data System (ADS)

    Plante, Ianik; Devroye, Luc

    2015-09-01

    Several computer codes simulating chemical reactions in particle systems are based on the Green's functions of the diffusion equation (GFDE). Indeed, many types of chemical systems have been simulated using the exact GFDE, which has also become the gold standard for validating other theoretical models. In this work, a simulation algorithm is presented to sample the interparticle distance for the partially diffusion-controlled reversible ABCD reaction. This algorithm is considered exact for two-particle systems, is faster than conventional look-up tables and uses only a few kilobytes of memory. The simulation results obtained with this method are compared with those obtained with the independent reaction times (IRT) method. This work is part of our effort to develop models to understand the role of chemical reactions in the radiation effects on cells and tissues and may eventually be included in event-based models of space radiation risks. Moreover, as many reactions of this type occur in biological systems, this algorithm might play a pivotal role in future simulation programs not only in radiation chemistry, but also in the simulation of biochemical networks in time and space.

  8. Organic speciation of size-segregated atmospheric particulate matter

    NASA Astrophysics Data System (ADS)

    Tremblay, Raphael

    Particle size and composition are key factors controlling the impacts of particulate matter (PM) on human health and the environment. A comprehensive method to characterize size-segregated PM organic content was developed, and evaluated during two field campaigns. Size-segregated particles were collected using a cascade impactor (Micro-Orifice Uniform Deposit Impactor) and a PM2.5 large volume sampler. A series of alkanes and polycyclic aromatic hydrocarbons (PAHs) were solvent extracted and quantified using a gas chromatograph coupled with a mass spectrometer (GC/MS). Large volume injections were performed using a programmable temperature vaporization (PTV) inlet to lower detection limits. The developed analysis method was evaluated during the 2001 and 2002 Intercomparison Exercise Program on Organic Contaminants in PM2.5 Air Particulate Matter led by the US National Institute of Standards and Technology (NIST). Ambient samples were collected in May 2002 as part of the Tampa Bay Regional Atmospheric Chemistry Experiment (BRACE) in Florida, USA and in July and August 2004 as part of the New England Air Quality Study - Intercontinental Transport and Chemical Transformation (NEAQS - ITCT) in New Hampshire, USA. Morphology of the collected particles was studied using scanning electron microscopy (SEM). Smaller particles (one micrometer or less) appeared to consist of solid cores surrounded by a liquid layer which is consistent with combustion particles and also possibly with particles formed and/or coated by secondary material like sulfate, nitrate and secondary organic aerosols. Source apportionment studies demonstrated the importance of stationary sources on the organic particulate matter observed at these two rural sites. Coal burning and biomass burning were found to be responsible for a large part of the observed PAHs during the field campaigns. Most of the measured PAHs were concentrated in particles smaller than one micrometer and linked to combustion sources. The presence of known carcinogenic PAHs in the respirable particles has strong importance for human health. Recommendations for method improvements and further studies are included.

  9. Magnetic Testing, and Modeling, Simulation and Analysis for Space Applications

    NASA Technical Reports Server (NTRS)

    Boghosian, Mary; Narvaez, Pablo; Herman, Ray

    2012-01-01

    The Aerospace Corporation (Aerospace) and Lockheed Martin Space Systems (LMSS) participated with the Jet Propulsion Laboratory (JPL) in the implementation of a magnetic cleanliness program for the NASA/JPL JUNO mission. The magnetic cleanliness program was applied from early flight system development up through system-level environmental testing. The JUNO magnetic cleanliness program required setting up a specialized magnetic test facility at Lockheed Martin Space Systems for testing the flight system, as well as a test program and facility at JPL for testing system parts and subsystems. The magnetic modeling, simulation and analysis capability was set up and run by Aerospace to provide qualitative and quantitative magnetic assessments of the magnetic parts, components, and subsystems prior to, or in lieu of, magnetic tests. Because of the sensitive nature of the fields and particles scientific measurements being conducted by the JUNO space mission to Jupiter, the imposition of stringent magnetic control specifications required a magnetic control program to ensure that the spacecraft's science magnetometers and plasma wave search coil were not magnetically contaminated by flight system magnetic interferences. With Aerospace's magnetic modeling, simulation and analysis, JPL's system modeling and testing approach, and LMSS's test support, the project achieved a cost-effective path to a magnetically clean spacecraft. This paper presents lessons learned from the JUNO magnetic testing approach and Aerospace's modeling, simulation and analysis activities used to solve problems such as remnant magnetization and the performance of hard and soft magnetic materials within the targeted space system in applied external magnetic fields.

  10. Rock sampling. [method for controlling particle size distribution

    NASA Technical Reports Server (NTRS)

    Blum, P. (Inventor)

    1971-01-01

    A method for sampling rock and other brittle materials and for controlling resultant particle sizes is described. The method involves cutting grooves in the rock surface to provide a grouping of parallel ridges and subsequently machining the ridges to provide a powder specimen. The machining step may comprise milling, drilling, lathe cutting or the like; but a planing step is advantageous. Control of the particle size distribution is effected primarily by changing the height and width of these ridges. This control exceeds that obtainable by conventional grinding.

  11. Control of manganese dioxide particles resulting from in situ chemical oxidation using permanganate.

    PubMed

    Crimi, Michelle; Ko, Saebom

    2009-02-01

    In situ chemical oxidation using permanganate is an approach to organic contaminant site remediation. Manganese dioxide particles are products of permanganate reactions. These particles have the potential to deposit in the subsurface and impact the flow-regime in/around permanganate injection, including the well screen, filter pack, and the surrounding subsurface formation. Control of these particles can allow for improved oxidant injection and transport and contact between the oxidant and contaminants of concern. The goals of this research were to determine if MnO(2) can be stabilized/controlled in an aqueous phase, and to determine the dependence of particle stabilization on groundwater characteristics. Bench-scale experiments were conducted to study the ability of four stabilization aids (sodium hexametaphosphate (HMP), Dowfax 8390, xanthan gum, and gum arabic) in maintaining particles suspended in solution under varied reaction conditions and time. Variations included particle and stabilization aid concentrations, ionic content, and pH. HMP demonstrated the most promising results, as compared to xanthan gum, gum arabic, and Dowfax 8390 based on results of spectrophotometric studies of particle behavior, particle filtration, and optical measurements of particle size and zeta potential. HMP inhibited particle settling, provided for greater particle stability, and resulted in particles of a smaller average size over the range of experimental conditions evaluated compared to results for systems that did not include HMP. Additionally, HMP did not react unfavorably with permanganate. These results indicate that the inclusion of HMP in a permanganate oxidation system improves conditions that may facilitate particle transport.

  12. Simulation of ultra-high energy photon propagation in the geomagnetic field

    NASA Astrophysics Data System (ADS)

    Homola, P.; Góra, D.; Heck, D.; Klages, H.; Pękala, J.; Risse, M.; Wilczyńska, B.; Wilczyński, H.

    2005-12-01

    The identification of primary photons or specifying stringent limits on the photon flux is of major importance for understanding the origin of ultra-high energy (UHE) cosmic rays. UHE photons can initiate particle cascades in the geomagnetic field, which leads to significant changes in the subsequent atmospheric shower development. We present a Monte Carlo program allowing detailed studies of conversion and cascading of UHE photons in the geomagnetic field. The program, named PRESHOWER, can be used either as an independent tool or together with a shower simulation code. With the stand-alone version of the code it is possible to investigate various properties of the particle cascade induced by UHE photons interacting in the Earth's magnetic field before entering the Earth's atmosphere. Combining this program with an extensive air shower simulation code such as CORSIKA offers the possibility of investigating signatures of photon-initiated showers. In particular, features can be studied that help to discern such showers from the ones induced by hadrons. As an illustration, calculations for the conditions of the southern part of the Pierre Auger Observatory are presented. Catalogue identifier: ADWG. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADWG. Program obtainable: CPC Program Library, Queen's University of Belfast, N. Ireland. Computer on which the program has been thoroughly tested: Intel-Pentium based PC. Operating system: Linux, DEC-Unix. Programming language used: C, FORTRAN 77. Memory required to execute with typical data: <100 kB. No. of bits in a word: 32. Has the code been vectorized?: no. Number of lines in distributed program, including test data, etc.: 2567. Number of bytes in distributed program, including test data, etc.: 25 690. Distribution format: tar.gz. Other procedures used in PRESHOWER: IGRF [N.A. Tsyganenko, National Space Science Data Center, NASA GSFC, Greenbelt, MD 20771, USA, http://nssdc.gsfc.nasa.gov/space/model/magnetos/data-based/geopack.html], bessik, ran2 [Numerical Recipes, http://www.nr.com]. Nature of the physical problem: Simulation of a cascade of particles initiated by a UHE photon passing through the geomagnetic field above the Earth's atmosphere. Method of solution: The primary photon is tracked until its conversion into an e+e- pair or until it reaches the upper atmosphere. If conversion occurs, each individual particle in the resultant preshower is checked for either bremsstrahlung radiation (electrons) or secondary gamma conversion (photons). The procedure ends at the top of the atmosphere and the shower particle data are saved. Restrictions on the complexity of the problem: Gamma conversion into particles other than electron-positron pairs has not been taken into account. Typical running time: 100 preshower events with primary energy 10 eV require about 50 min of CPU time on an 800 MHz machine; with 10 eV, the simulation time for 100 events grows up to 500 min.
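
    The tracking logic summarized above under "Method of solution" can be sketched with the schematic loop below; the per-step conversion and bremsstrahlung probabilities are placeholders (the real code evaluates magnetic pair-production and bremsstrahlung rates from the local geomagnetic field), so only the bookkeeping structure, not the physics, is represented, and all numbers are illustrative.

      # Schematic preshower tracking loop; probability functions are placeholders.
      import random

      STEP_KM = 10.0                    # tracking step along the primary trajectory
      TOP_OF_ATMOSPHERE_KM = 112.0

      def p_convert(energy_eV, b_perp_T):
          """Placeholder for the gamma -> e+e- conversion probability per step."""
          return min(1.0, 3e-17 * energy_eV * b_perp_T)

      def p_brems(energy_eV, b_perp_T):
          """Placeholder for the probability that an electron radiates in one step."""
          return min(1.0, 2e-17 * energy_eV * b_perp_T)

      def preshower(e_primary_eV, start_alt_km=3000.0, b_perp_T=3e-5):
          particles = [("gamma", e_primary_eV)]
          alt = start_alt_km
          while alt > TOP_OF_ATMOSPHERE_KM:
              next_gen = []
              for kind, e in particles:
                  if kind == "gamma" and random.random() < p_convert(e, b_perp_T):
                      next_gen += [("e-", 0.5 * e), ("e+", 0.5 * e)]    # schematic equal split
                  elif kind != "gamma" and random.random() < p_brems(e, b_perp_T):
                      next_gen += [(kind, 0.7 * e), ("gamma", 0.3 * e)]  # schematic energy split
                  else:
                      next_gen.append((kind, e))
              particles = next_gen
              alt -= STEP_KM
          return particles   # handed to the air-shower code at the top of the atmosphere

      print("preshower particles at the top of the atmosphere:", len(preshower(1e20)))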

  13. Aerosol mass spectrometry systems and methods

    DOEpatents

    Fergenson, David P.; Gard, Eric E.

    2013-08-20

    A system according to one embodiment includes a particle accelerator that directs a succession of polydisperse aerosol particles along a predetermined particle path; multiple tracking lasers for generating beams of light across the particle path; an optical detector positioned adjacent the particle path for detecting impingement of the beams of light on individual particles; a desorption laser for generating a beam of desorbing light across the particle path about coaxial with a beam of light produced by one of the tracking lasers; and a controller, responsive to detection of a signal produced by the optical detector, that controls the desorption laser to generate the beam of desorbing light. Additional systems and methods are also disclosed.

  14. Simulation of Planetary Formation using Python

    NASA Astrophysics Data System (ADS)

    Bufkin, James; Bixler, David

    2015-03-01

    A program to simulate planetary formation was developed in the Python programming language. The program consists of randomly placed and randomly massed bodies surrounding a central massive object, approximating a protoplanetary disk. The orbits of these bodies are time-stepped, with accelerations, velocities and new positions calculated in each step. Bodies are allowed to merge if their disks intersect. Numerous parameters (orbital distance, masses, number of particles, etc.) were varied in order to optimize the program. The program uses an iterative difference-equation approach to solve the equations of motion using a kinematic model. Conservation of energy and angular momentum is not specifically enforced, but conservation of momentum is enforced during the merging of bodies. The initial program was created in Visual Python (VPython), but the current intention is to allow for higher particle counts and faster processing by utilizing PyOpenCL and PyOpenGL. Current results and progress will be reported.
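
    A condensed sketch of this kind of scheme, under simplifying assumptions (gravity from the central body only, symplectic Euler time steps, momentum-conserving merges when disks overlap, illustrative values for G, masses, and radii), is given below; it is not the original VPython program.

      # Condensed sketch: orbiting bodies that merge when their disks overlap.
      import numpy as np

      G, M_central, dt, n_steps = 1.0, 1000.0, 1e-3, 3000
      rng = np.random.default_rng(2)

      n = 80
      pos = rng.uniform(-10, 10, (n, 2))
      r0 = np.linalg.norm(pos, axis=1, keepdims=True)
      vel = np.c_[pos[:, 1], -pos[:, 0]] * np.sqrt(G * M_central) / r0**1.5   # circular start
      mass = rng.uniform(0.5, 2.0, n)
      radius = 0.05 * mass ** (1.0 / 3.0)

      for step in range(n_steps):
          r = np.linalg.norm(pos, axis=1, keepdims=True)
          vel += -G * M_central * pos / r**3 * dt       # acceleration toward the center
          pos += vel * dt
          i = 0
          while i < len(pos):                           # merge overlapping pairs
              d = np.linalg.norm(pos - pos[i], axis=1)
              hits = np.where((d > 0) & (d < radius + radius[i]))[0]
              if hits.size:
                  j = hits[0]                           # merge one partner per step
                  m_new = mass[i] + mass[j]
                  vel[i] = (mass[i] * vel[i] + mass[j] * vel[j]) / m_new   # momentum conserved
                  pos[i] = (mass[i] * pos[i] + mass[j] * pos[j]) / m_new
                  mass[i], radius[i] = m_new, 0.05 * m_new ** (1.0 / 3.0)
                  pos = np.delete(pos, j, 0)
                  vel = np.delete(vel, j, 0)
                  mass = np.delete(mass, j)
                  radius = np.delete(radius, j)
              i += 1
      print("bodies remaining after", n_steps, "steps:", len(pos))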

  15. A New Digital Holographic Instrument for Measuring Microphysical Properties of Contrails in the SASS (Subsonic Assessment) Program

    NASA Technical Reports Server (NTRS)

    Lawson, R. Paul

    2000-01-01

    SPEC Incorporated designed, built and operated a new instrument, called a pi-Nephelometer, on the NASA DC-8 for the SUCCESS field project. The pi-Nephelometer casts an image of a particle on a 400,000-pixel solid-state camera by freezing the motion of the particle using a 25 ns pulsed, high-power (60 W) laser diode. Unique optical imaging and particle detection systems precisely detect particles and define the depth of field so that at least one particle in the image is almost always in focus. A powerful image processing engine processes frames from the solid-state camera and identifies and records regions of interest (i.e., particle images) in real time. Images of ice crystals are displayed and recorded with 5 micron pixel resolution. In addition, a scattered-light system simultaneously measures the scattering phase function of the imaged particle. The system consists of twenty-eight 1-mm optical fibers connected to microlenses bonded on the surface of avalanche photodiodes (APDs). Data collected with the pi-Nephelometer during the SUCCESS field project were reported in a special issue of Geophysical Research Letters. The pi-Nephelometer provided the basis for development of a commercial imaging probe, called the cloud particle imager (CPI), which has been installed on several research aircraft and used in more than a dozen field programs.

  16. User guide for MODPATH version 6 - A particle-tracking model for MODFLOW

    USGS Publications Warehouse

    Pollock, David W.

    2012-01-01

    MODPATH is a particle-tracking post-processing model that computes three-dimensional flow paths using output from groundwater flow simulations based on MODFLOW, the U.S. Geological Survey (USGS) finite-difference groundwater flow model. This report documents MODPATH version 6. Previous versions were documented in USGS Open-File Reports 89-381 and 94-464. The program uses a semianalytical particle-tracking scheme that allows an analytical expression of a particle's flow path to be obtained within each finite-difference grid cell. A particle's path is computed by tracking the particle from one cell to the next until it reaches a boundary, an internal sink/source, or satisfies another termination criterion. Data input to MODPATH consists of a combination of MODFLOW input data files, MODFLOW head and flow output files, and other input files specific to MODPATH. Output from MODPATH consists of several output files, including a number of particle coordinate output files intended to serve as input data for other programs that process, analyze, and display the results in various ways. MODPATH is written in FORTRAN and can be compiled by any FORTRAN compiler that fully supports FORTRAN-2003 or by most commercially available FORTRAN-95 compilers that support the major FORTRAN-2003 language extensions.
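
    The semianalytical scheme can be illustrated with the short sketch below (a simplified two-dimensional rendering of Pollock's method, not MODPATH itself): face velocities are interpolated linearly within the cell, which yields a closed-form exponential for the particle path and an analytic time to reach each face; the cell geometry and face velocities are hypothetical.

      # Sketch of Pollock's semianalytical particle tracking within one cell.
      import math

      def exit_time(v_p, v1, v2, x, x1, x2):
          """Travel time to the face the particle is heading toward along one axis."""
          A = (v2 - v1) / (x2 - x1)                  # linear velocity gradient in the cell
          if v_p == 0.0:
              return math.inf                        # no motion along this axis
          v_face, x_face = (v2, x2) if v_p > 0 else (v1, x1)
          if abs(A) < 1e-12:                         # uniform velocity: straight-line travel
              return (x_face - x) / v_p
          if v_face * v_p <= 0:                      # velocity reverses before the face
              return math.inf
          return math.log(v_face / v_p) / A

      def track_through_cell(x, y, f):
          """f holds x1, x2, y1, y2 and the face velocities vx1, vx2, vy1, vy2."""
          Ax = (f["vx2"] - f["vx1"]) / (f["x2"] - f["x1"])
          Ay = (f["vy2"] - f["vy1"]) / (f["y2"] - f["y1"])
          vx = f["vx1"] + Ax * (x - f["x1"])         # particle velocity components
          vy = f["vy1"] + Ay * (y - f["y1"])
          t = min(exit_time(vx, f["vx1"], f["vx2"], x, f["x1"], f["x2"]),
                  exit_time(vy, f["vy1"], f["vy2"], y, f["y1"], f["y2"]))
          x_new = f["x1"] + (vx * math.exp(Ax * t) - f["vx1"]) / Ax if abs(Ax) > 1e-12 else x + vx * t
          y_new = f["y1"] + (vy * math.exp(Ay * t) - f["vy1"]) / Ay if abs(Ay) > 1e-12 else y + vy * t
          return t, x_new, y_new

      cell = dict(x1=0.0, x2=10.0, y1=0.0, y2=10.0, vx1=1.0, vx2=2.0, vy1=0.5, vy2=0.25)
      print(track_through_cell(2.0, 3.0, cell))      # transit time and exit point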

  17. Flash nano-precipitation of polymer blends: a role for fluid flow?

    NASA Astrophysics Data System (ADS)

    Grundy, Lorena; Mason, Lachlan; Chergui, Jalel; Juric, Damir; Craster, Richard V.; Lee, Victoria; Prudhomme, Robert; Priestley, Rodney; Matar, Omar K.

    2017-11-01

    Porous structures can be formed by the controlled precipitation of polymer blends, ranging from porous matrices, with applications in membrane filtration, to porous nano-particles, with applications in catalysis, targeted drug delivery and emulsion stabilisation. Under a diffusive exchange of solvent for non-solvent, prevailing conditions favour the decomposition of polymer blends into multiple phases. Interestingly, dynamic structures can be 'trapped' via vitrification prior to reaching thermodynamic equilibrium. A promising mechanism for large-scale polymer processing is flash nano-precipitation (FNP). FNP particle formation has recently been modelled using spinodal decomposition theory; however, the influence of fluid flow on structure formation is yet to be clarified. In this study, we couple a Navier-Stokes equation to a Cahn-Hilliard model of spinodal decomposition. The framework is implemented using Code BLUE, a massively scalable fluid dynamics solver, and applied to flows within confined impinging jet mixers. The present method is valid for a wide range of mixing timescales spanning FNP and conventional immersion precipitation processes. Results aid in the fabrication of nano-scale polymer particles with tuneable internal porosities. EPSRC, UK, MEMPHIS program Grant (EP/K003976/1), RAEng Research Chair (OKM), PETRONAS.
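
    The spinodal-decomposition building block of such a coupled model can be illustrated with a bare-bones one-dimensional Cahn-Hilliard update (this is not Code BLUE; the mobility, gradient-energy coefficient, grid, and time step are illustrative values chosen for stability).

      # Bare-bones 1D Cahn-Hilliard spinodal decomposition on a periodic grid:
      # phi_t = M * laplacian( phi**3 - phi - kappa * laplacian(phi) )
      import numpy as np

      n, dx, dt, M, kappa = 256, 1.0, 0.01, 1.0, 1.0
      rng = np.random.default_rng(3)
      phi = 0.1 * (rng.random(n) - 0.5)               # near-critical blend plus noise

      def lap(f):
          return (np.roll(f, 1) + np.roll(f, -1) - 2 * f) / dx**2

      for step in range(20000):
          mu = phi**3 - phi - kappa * lap(phi)        # chemical potential
          phi += dt * M * lap(mu)                     # conservative explicit update

      print("phase fractions:", (phi > 0).mean(), (phi < 0).mean())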

  18. Bermuda Bio Optics Project. Chapter 14

    NASA Technical Reports Server (NTRS)

    Nelson, Norm

    2003-01-01

    The Bermuda BioOptics Project (BBOP) is a collaborative effort between the Institute for Computational Earth System Science (ICESS) at the University of California at Santa Barbara (UCSB) and the Bermuda Biological Station for Research (BBSR). This research program is designed to characterize light availability and utilization in the Sargasso Sea, and to provide an optical link by which biogeochemical observations may be used to evaluate bio-optical models for pigment concentration, primary production, and sinking particle fluxes from satellite-based ocean color sensors. The BBOP time-series was initiated in 1992, and is carried out in conjunction with the U.S. JGOFS Bermuda Atlantic Time-series Study (BATS) at the Bermuda Biological Station for Research. The BATS program itself has been observing biogeochemical processes (primary productivity, particle flux and elemental cycles) in the mesotrophic waters of the Sargasso Sea since 1988. Closely affiliated with BBOP and BATS is a separate NASA-funded study of the spatial variability of biogeochemical processes in the Sargasso Sea using high-resolution AVHRR and SeaWiFS data collected at Bermuda (N. Nelson, P.I.). The collaboration between BATS and BBOP measurements has resulted in a unique data set that addresses not only the SIMBIOS goals but also the broader issues of important factors controlling the carbon cycle.

  19. The Bermuda BioOptics Project (BBOP) Years 9-11

    NASA Technical Reports Server (NTRS)

    Maritorena, S.; Siegel, D. A.; Nelson, Norm B.

    2004-01-01

    The Bermuda BioOptics Project (BBOP) is a collaborative effort between the Institute for Computational Earth System Science (ICESS) at the University of California at Santa Barbara (UCSB) and the Bermuda Biological Station for Research (BBSR). This research program is designed to characterize light availability and utilization in the Sargasso Sea, and to provide an optical link by which biogeochemical observations may be used to evaluate bio-optical models for pigment concentration, primary production, and sinking particle fluxes from satellite-based ocean color sensors. The BBOP time-series was initiated in 1992, and is carried out in conjunction with the U.S. JGOFS Bermuda Atlantic Time-series Study (BATS) at the Bermuda Biological Station for Research. The BATS program itself has been observing biogeochemical processes (primary productivity, particle flux and elemental cycles) in the mesotrophic waters of the Sargasso Sea since 1988. Closely affiliated with BBOP and BATS is a separate NASA-funded study of the spatial variability of biogeochemical processes in the Sargasso Sea using high-resolution AVHRR and SeaWiFS data collected at Bermuda. The collaboration between BATS and BBOP measurements has resulted in a unique data set that addresses not only the SIMBIOS goals but also the broader issues of important factors controlling the carbon cycle.

  20. Condensation-nuclei (Aitken Particle) measurement system used in NASA global atmospheric sampling program

    NASA Technical Reports Server (NTRS)

    Nyland, T. W.

    1979-01-01

    The condensation-nuclei (Aitken particle) measuring system used in the NASA Global Atmospheric Sampling Program is described. Included in the paper is a description of the condensation-nuclei monitor sensor, the pressurization system, and the Pollack-counter calibration system used to support the CN measurement. The monitor has a measurement range up to 1000 CN/cm3 and a noise level equivalent to 5 CN/cm3 at flight altitudes between 6 and 13 km.

  1. 1999 LDRD Laboratory Directed Research and Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rita Spencer; Kyle Wheeler

    This is the FY 1999 Progress Report for the Laboratory Directed Research and Development (LDRD) Program at Los Alamos National Laboratory. It gives an overview of the LDRD Program, summarizes work done on individual research projects, relates the projects to major Laboratory program sponsors, and provides an index to the principal investigators. Project summaries are grouped by their LDRD component: Competency Development, Program Development, and Individual Projects. Within each component, they are further grouped into nine technical categories: (1) materials science, (2) chemistry, (3) mathematics and computational science, (4) atomic, molecular, optical, and plasma physics, fluids, and particle beams, (5) engineering science, (6) instrumentation and diagnostics, (7) geoscience, space science, and astrophysics, (8) nuclear and particle physics, and (9) bioscience.

  2. Laboratory Directed Research and Development FY 1998 Progress Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John Vigil; Kyle Wheeler

    This is the FY 1998 Progress Report for the Laboratory Directed Research and Development (LDRD) Program at Los Alamos National Laboratory. It gives an overview of the LDRD Program, summarizes work done on individual research projects, relates the projects to major Laboratory program sponsors, and provides an index to the principal investigators. Project summaries are grouped by their LDRD component: Competency Development, Program Development, and Individual Projects. Within each component, they are further grouped into nine technical categories: (1) materials science, (2) chemistry, (3) mathematics and computational science, (4) atomic, molecular, optical, and plasma physics, fluids, and particle beams, (5) engineering science, (6) instrumentation and diagnostics, (7) geoscience, space science, and astrophysics, (8) nuclear and particle physics, and (9) bioscience.

  3. Laboratory directed research and development: FY 1997 progress report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vigil, J.; Prono, J.

    1998-05-01

    This is the FY 1997 Progress Report for the Laboratory Directed Research and Development (LDRD) program at Los Alamos National Laboratory. It gives an overview of the LDRD program, summarizes work done on individual research projects, relates the projects to major Laboratory program sponsors, and provides an index to the principal investigators. Project summaries are grouped by their LDRD component: Competency Development, Program Development, and Individual Projects. Within each component, they are further grouped into nine technical categories: (1) materials science, (2) chemistry, (3) mathematics and computational science, (4) atomic and molecular physics and plasmas, fluids, and particle beams, (5) engineering science, (6) instrumentation and diagnostics, (7) geoscience, space science, and astrophysics, (8) nuclear and particle physics, and (9) bioscience.

  4. 40 CFR 52.131 - Control Strategy and regulations: Fine Particle Matter.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    40 CFR 52.131 (Title 40, Protection of Environment; Environmental Protection Agency): Control Strategy and regulations: Fine Particle Matter. (a) Determination of Attainment: Effective February 6, 2013...

  5. 40 CFR 52.247 - Control Strategy and regulations: Fine Particle Matter.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    40 CFR 52.247 (Title 40, Protection of Environment; Environmental Protection Agency): Control Strategy and regulations: Fine Particle Matter. (a) Determination of Attainment: Effective February 8, 2013...

  6. 40 CFR 52.247 - Control Strategy and regulations: Fine Particle Matter.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    40 CFR 52.247 (Title 40, Protection of Environment; Environmental Protection Agency): Control Strategy and regulations: Fine Particle Matter. (a) Determination of Attainment: Effective February 8, 2013...

  7. 40 CFR 52.131 - Control Strategy and regulations: Fine Particle Matter.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    40 CFR 52.131 (Title 40, Protection of Environment; Environmental Protection Agency): Control Strategy and regulations: Fine Particle Matter. (a) Determination of Attainment: Effective February 6, 2013...

  8. Analytical solutions for coagulation and condensation kinetics of composite particles

    NASA Astrophysics Data System (ADS)

    Piskunov, Vladimir N.

    2013-04-01

    The formation of composite particles consisting of a mixture of different materials is essential to many practical problems: analysis of the consequences of accidental releases into the atmosphere, simulation of precipitation formation in clouds, and description of multi-phase processes in chemical reactors and industrial facilities. Computer codes developed for numerical simulation of these processes require optimization of computational methods and verification of numerical programs. Kinetic equations of composite-particle formation are given in this work in a concise (impurity-integrated) form. Coagulation, condensation, and external sources associated with nucleation are taken into account. Analytical solutions were obtained in a number of model cases, and general laws for the redistribution of impurity fractions were derived. The results can be applied to develop numerical algorithms that considerably reduce the simulation effort, as well as to verify numerical programs that calculate the formation kinetics of composite particles in problems of practical importance.
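    For reference, the single-component population balance that such kinetic treatments generalize can be written as below. This is the standard Smoluchowski coagulation equation with condensation-growth and source terms, quoted here only for orientation; it is not the paper's impurity-integrated composite-particle formulation, and the notation (n for the size distribution, K for the coagulation kernel, I for the condensation growth rate, J for the external source) is assumed for this sketch.

```latex
% n(v,t) dv = number concentration of particles with volume in [v, v+dv]
\begin{equation}
\frac{\partial n(v,t)}{\partial t}
= \frac{1}{2}\int_{0}^{v} K(v',\,v-v')\, n(v',t)\, n(v-v',t)\, dv'
- n(v,t)\int_{0}^{\infty} K(v,v')\, n(v',t)\, dv'
- \frac{\partial}{\partial v}\bigl[I(v)\, n(v,t)\bigr]
+ J(v,t)
\end{equation}
```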

  9. "Smart pebble" design for environmental monitoring applications

    NASA Astrophysics Data System (ADS)

    Valyrakis, Manousos; Pavlovskis, Edgars

    2014-05-01

    Sediment transport, due primarily to the action of water, wind, and ice, is one of the most significant geomorphic processes responsible for shaping Earth's surface. It involves entrainment of sediment grains in rivers and estuaries due to the violently fluctuating hydrodynamic forces near the bed. Here an instrumented particle, namely a "smart pebble", is developed to investigate the exact flow conditions under which individual grains may be entrained from the surface of a gravel bed. This could lead to a better understanding of the processes involved, with a focus on the response of the particle during a variety of flow entrainment events. The "smart pebble" is a particle instrumented with MEMS sensors appropriate for capturing the hydrodynamic forces a coarse particle might experience during its entrainment from the river bed. A 3-axial gyroscope and accelerometer registers data to a memory card via a microcontroller, embedded in a 3D-printed waterproof hollow spherical particle. The instrumented board is appropriately fitted and centred in the shell of the pebble, so as to achieve a nearly uniform distribution of mass which could otherwise bias its motion. The "smart pebble" is powered by an independent power supply to ensure autonomy and sufficiently long periods of operation appropriate for deployment in the field. Post-processing and analysis of the acquired data are currently performed offline, using scientific programming software. The performance of the instrumented particle is validated by conducting a series of calibration experiments under well-controlled laboratory conditions. The "smart pebble" design allows a wider range of environmental sensors (e.g. for environmental/pollutant monitoring) to be incorporated so as to extend the range of its applications, enabling the accurate environmental monitoring required to ensure infrastructure resilience and preservation of ecological health.
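    Since the abstract notes that post-processing of the logged IMU data is performed offline in scientific programming software, a minimal sketch of one such step is given below. The array layout, units, function name, and threshold are assumptions introduced for illustration only, not the authors' actual pipeline.

```python
import numpy as np

def detect_entrainment_events(accel_xyz, gyro_xyz, dt, accel_thresh=2.0):
    """Minimal offline post-processing sketch for a logged IMU record.

    accel_xyz, gyro_xyz : (N, 3) arrays of accelerometer [m/s^2] and
                          gyroscope [rad/s] samples (hypothetical layout).
    dt                  : sample interval in seconds.
    accel_thresh        : resultant-acceleration threshold marking a
                          candidate entrainment event (illustrative value).
    """
    # Resultant (magnitude) of the acceleration and angular-rate vectors
    a_mag = np.linalg.norm(accel_xyz, axis=1)
    w_mag = np.linalg.norm(gyro_xyz, axis=1)

    # Flag samples whose resultant acceleration exceeds the threshold
    flagged = a_mag > accel_thresh
    event_times = np.flatnonzero(flagged) * dt
    return event_times, a_mag, w_mag

# Example with synthetic data standing in for a memory-card log
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    accel = rng.normal(0.0, 0.3, size=(1000, 3))
    accel[400:410] += 5.0            # synthetic "entrainment" burst
    gyro = rng.normal(0.0, 0.05, size=(1000, 3))
    times, a_mag, _ = detect_entrainment_events(accel, gyro, dt=0.01)
    print(f"{times.size} samples exceeded the threshold")
```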

  10. Impact of Hormonal Contraception and Weight Loss on HDL-C efflux and Lipoprotein Particles in Women with Polycystic Ovary Syndrome

    PubMed Central

    Dokras, Anuja; Playford, Martin; Kris-Etherton, Penny M.; Kunselman, Allen R.; Stetter, Christy M.; Williams, Nancy I.; Gnatuk, Carol L.; Estes, Stephanie J.; Sarwer, David B; Allison, Kelly C; Coutifaris, Christos; Mehta, Nehal; Legro, Richard S

    2017-01-01

    Objective: To study the effects of oral contraceptive pills (OCP), the first-line treatment for PCOS, on HDL-C function (reverse cholesterol efflux capacity) and lipoprotein particles measured by NMR spectroscopy. Design: Secondary analysis of a randomized controlled trial (OWL-PCOS) of OCP, Lifestyle (intensive lifestyle modification), or Combined (OCP+Lifestyle) treatment for 16 weeks. Patients: 87 overweight/obese women with PCOS at two academic centers. Measurements: Change in HDL-C efflux capacity and lipoprotein particles. Results: HDL-C efflux capacity increased significantly at 16 weeks in the OCP group (0.11; 95% CI 0.03, 0.18; p=0.008) but not in the Lifestyle (p=0.39) or Combined group (p=0.18). After adjusting for HDL-C and TG levels, there was a significant mean change in efflux in the Combined group (0.09; 95% CI 0.01, 0.15; p=0.01). Change in HDL-C efflux correlated inversely with change in serum testosterone (rs = −0.21; p=0.05). In contrast, OCP use induced an atherogenic LDL-C profile with increases in small (p=0.006) and large LDL particles (p=0.002). Change in small LDL particles correlated with change in serum testosterone (rs = −0.31, p=0.009) and insulin sensitivity index (rs = −0.31, p=0.02). Neither the Lifestyle nor the Combined group showed significant changes in the atherogenic LDL particles. Conclusions: OCP use is associated with improved HDL-C function and a concomitant atherogenic LDL-C profile. Combining a Lifestyle program with OCP use improved HDL-C function and mitigated the adverse effects of OCP on lipoproteins. Our study provides evidence for the use of OCP in overweight/obese women with PCOS when combined with Lifestyle changes. PMID:28199736

  11. Research on mining truck vibration control based on particle damping

    NASA Astrophysics Data System (ADS)

    Liming, Song; Wangqiang, Xiao; Zeguang, Li; Haiquan, Guo; Zhe, Yang

    2018-03-01

    Mining truck ride comfort has attracted increasing attention. As the terminal of the vibration transfer path, the cab is one of the most important targets for mining truck vibration control. In this paper, based on particle damping technology and its application characteristics, discrete element modeling, coupled DEM and FEM simulation and analysis, laboratory test verification, and in-truck testing were used to apply particle damping to the driver's seat base of a mining truck. Cab vibration was reduced markedly, and an applied-research approach and method for particle damping technology in mining truck vibration control were provided.

  12. Facile fabrication of core-in-shell particles by the slow removal of the core and its use in the encapsulation of metal nanoparticles.

    PubMed

    Choi, Won San; Koo, Hye Young; Kim, Dong-Yu

    2008-05-06

    Core-in-shell particles with controllable core size have been fabricated from core-shell particles by means of a controlled core-dissolution method. These cores within inorganic shells were employed as scaffolds for the synthesis of metal nanoparticles. After dissolution of the cores, the metal nanoparticles embedded in them were encapsulated within the interior of the shell without any damage or change. This article describes a very simple method for deriving core-in-shell particles with controllable core size and for encapsulating nanoparticles within the shell interior.

  13. Development of the fine-particle agglomerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feldman, P.; Balasic, P.

    1999-07-01

    This paper presents the current status of the commercial development of a new technology to more efficiently control fine particulate emissions. The technology is based on an invention by Environmental Elements Corporation (EEC) which utilizes laminar flow to promote contact of fine submicron particles with larger particles to form agglomerates prior to their removal in a conventional particulate control device, such as an ESP. As agglomerates, the particles are easily captured in the control device, whereas a substantial amount would pass through if allowed to remain as fine particles. EEC has developed the laminar-flow agglomerator technology through the laboratory proof-of-concept stage, which was funded by a DOE SBIR grant, to pilot-scale and full-scale demonstrations.

  14. Nanomodified heat-accumulating materials controlled by a magnetic field

    NASA Astrophysics Data System (ADS)

    Shchegolkov, Alexander; Shchegolkov, Alexey; Dyachkova, Tatyana; Bodin, Nikolay; Semenov, Alexander

    2017-11-01

    The paper presents studies of nanomodified heat-accumulating materials controlled by a magnetic field. To obtain controllable heat-accumulating materials, synthetic motor oil CASTROL 0W30, ferromagnetic particles, CNTs, and paraffin were used. Mechanically activated carbon nanotubes with ferromagnetic particles were used for the nanomodification of paraffin. Mechanoactivation ensured the production of ferromagnetic particles with an average particle size of 5 µm. Using an extrusion plant, a mixture of CNTs and ferromagnetic particles was introduced into the paraffin. The nanomodified paraffin, in granular form, was then introduced into the synthetic oil. A contactless method for measuring temperature was used for the experimental studies. Thermal contact control with the obtained nanomodified material is possible at a magnetic induction of 1250 mT, while a heat flux of about 74 kW/m² is provided.

  15. Controlling placement of nonspherical (boomerang) colloids in nematic cells with photopatterned director

    NASA Astrophysics Data System (ADS)

    Peng, Chenhui; Turiv, Taras; Zhang, Rui; Guo, Yubing; Shiyanovskii, Sergij V.; Wei, Qi-Huo; de Pablo, Juan; Lavrentovich, Oleg D.

    2017-01-01

    Placing colloidal particles in predesigned sites represents a major challenge of current state-of-the-art colloidal science. Nematic liquid crystals with spatially varying director patterns represent a promising approach to achieving well-controlled placement of colloidal particles thanks to the elastic forces between the particles and the surrounding landscape of molecular orientation. Here we demonstrate how the spatially varying director field can be used to control the placement of non-spherical particles of boomerang shape. The boomerang colloids create director distortions of a dipolar symmetry. When a boomerang particle is placed in a periodic splay-bend director pattern, it migrates towards the region of maximum bend. The behavior is contrasted with that of spherical particles with normal surface anchoring, which also produce dipolar director distortions but prefer to compartmentalize into the regions with maximum splay. The splay-bend periodic landscape thus allows one to spatially separate these two types of particles. By exploring the overdamped dynamics of the colloids, we determine the elastic driving forces responsible for the preferential placement. Control of colloidal locations through patterned molecular orientation can be explored for future applications in microfluidic, lab-on-a-chip, sensing, and sorting devices.

  16. Controlling placement of nonspherical (boomerang) colloids in nematic cells with photopatterned director.

    PubMed

    Peng, Chenhui; Turiv, Taras; Zhang, Rui; Guo, Yubing; Shiyanovskii, Sergij V; Wei, Qi-Huo; de Pablo, Juan; Lavrentovich, Oleg D

    2017-01-11

    Placing colloidal particles in predesigned sites represents a major challenge of current state-of-the-art colloidal science. Nematic liquid crystals with spatially varying director patterns represent a promising approach to achieving well-controlled placement of colloidal particles thanks to the elastic forces between the particles and the surrounding landscape of molecular orientation. Here we demonstrate how the spatially varying director field can be used to control the placement of non-spherical particles of boomerang shape. The boomerang colloids create director distortions of a dipolar symmetry. When a boomerang particle is placed in a periodic splay-bend director pattern, it migrates towards the region of maximum bend. The behavior is contrasted with that of spherical particles with normal surface anchoring, which also produce dipolar director distortions but prefer to compartmentalize into the regions with maximum splay. The splay-bend periodic landscape thus allows one to spatially separate these two types of particles. By exploring the overdamped dynamics of the colloids, we determine the elastic driving forces responsible for the preferential placement. Control of colloidal locations through patterned molecular orientation can be explored for future applications in microfluidic, lab-on-a-chip, sensing, and sorting devices.

  17. Full-Color Biomimetic Photonic Materials with Iridescent and Non-Iridescent Structural Colors.

    PubMed

    Kawamura, Ayaka; Kohri, Michinari; Morimoto, Gen; Nannichi, Yuri; Taniguchi, Tatsuo; Kishikawa, Keiki

    2016-09-23

    The beautiful structural colors in bird feathers are some of the brightest colors in nature, and some of these colors are created by arrays of melanin granules that act as both structural colors and scattering absorbers. Inspired by the color of bird feathers, high-visibility structural colors have been created by altering four variables: size, blackness, refractive index, and arrangement of the nano-elements. To control these four variables, we developed a facile method for the preparation of biomimetic core-shell particles with melanin-like polydopamine (PDA) shell layers. The size of the core-shell particles was controlled by adjusting the core polystyrene (PSt) particles' diameter and the PDA shell thicknesses. The blackness and refractive index of the colloidal particles could be adjusted by controlling the thickness of the PDA shell. The arrangement of the particles was controlled by adjusting the surface roughness of the core-shell particles. This method enabled the production of both iridescent and non-iridescent structural colors from only one component. This simple and novel process of using core-shell particles containing PDA shell layers can be used in basic research on structural colors in nature and their practical applications.

  18. Two decades of Mexican particle physics at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roy Rubinstein

    2002-12-03

    This report is a view from Fermilab of Mexican particle physics at the Laboratory since about 1980; it is not intended to be a history of Mexican particle physics: that topic is outside the expertise of the writer. The period 1980 to the present coincides with the growth of Mexican experimental particle physics from essentially no activity to its current state where Mexican groups take part in experiments at several of the world's major laboratories. Soon after becoming Fermilab director in 1979, Leon Lederman initiated a program to encourage experimental physics, especially experimental particle physics, in Latin America. At the time, Mexico had significant theoretical particle physics activity, but none in experiment. Following a visit by Lederman to UNAM in 1981, a conference ''Panamerican Symposium on Particle Physics and Technology'' was held in January 1982 at Cocoyoc, Mexico, with about 50 attendees from Europe, North America, and Latin America; these included Lederman, M. Moshinsky, J. Flores, S. Glashow, J. Bjorken, and G. Charpak. Among the conference outcomes were four subsequent similar symposia over the next decade, and a formal Fermilab program to aid Latin American physics (particularly particle physics); it also influenced a decision by Mexican physicist Clicerio Avilez to switch from theoretical to experimental particle physics. The first physics collaboration between Fermilab and Mexico was in particle theory. Post-docs Rodrigo Huerta and Jose Luis Lucio spent 1-2 years at Fermilab starting in 1981, and other theorists (including Augusto Garcia, Arnulfo Zepeda, Matias Moreno and Miguel Angel Perez) also spent time at the Laboratory in the 1980s.

  19. An Evaluation of the Particle Physics Masterclass as a Context for Student Learning about the Nature of Science

    ERIC Educational Resources Information Center

    Wadness, Michael J.

    2010-01-01

    This dissertation addresses the research question: To what extent do secondary school science students attending the U.S. Particle Physics Masterclass change their view of the nature of science (NOS)? The U.S. Particle Physics Masterclass is a physics outreach program run by QuarkNet, a national organization of secondary school physics teachers…

  20. Investigation of the Profile Control Mechanisms of Dispersed Particle Gel

    PubMed Central

    Zhao, Guang; Dai, Caili; Zhao, Mingwei

    2014-01-01

    Dispersed particle gel (DPG) particles of nano- to micron- to millimeter size have been prepared successfully and will be used for profile control treatment in mature oilfields. The profile control and enhanced oil recovery mechanisms of DPG particles have been investigated using core flow tests and visual simulation experiments. Core flow test results show that DPG particles can easily be injected into deep formations and can effectively plug the high-permeability zones. The high profile-improvement rate mitigates reservoir heterogeneity and diverts fluid into the low-permeability zone. Both water and oil permeability were reduced when DPG particles were injected, but the disproportionate permeability reduction effect was significant: water permeability decreases more than oil permeability, ensuring that oil flows in its own pathways and can easily be driven out. Visual simulation experiments demonstrate that DPG particles can pass directly, or by deformation, through porous media and enter deep formations. By retention, adsorption, trapping and bridging, DPG particles can effectively reduce the permeability of porous media in high-permeability zones and divert fluid into low-permeability zones, thus improving formation profiles and enhancing oil recovery. PMID:24950174

  1. Fortran interface layer of the framework for developing particle simulator FDPS

    NASA Astrophysics Data System (ADS)

    Namekata, Daisuke; Iwasawa, Masaki; Nitadori, Keigo; Tanikawa, Ataru; Muranushi, Takayuki; Wang, Long; Hosono, Natsuki; Nomura, Kentaro; Makino, Junichiro

    2018-06-01

    Numerical simulations based on particle methods have been widely used in various fields including astrophysics. To date, various versions of simulation software have been developed by individual researchers or research groups in each field, through a huge amount of time and effort, even though the numerical algorithms used are very similar. To improve the situation, we have developed a framework, called FDPS (Framework for Developing Particle Simulators), which enables researchers to develop massively parallel particle simulation codes for arbitrary particle methods easily. Until version 3.0, FDPS provided an API (application programming interface) for the C++ programming language only. This limitation comes from the fact that FDPS is developed using the template feature in C++, which is essential to support arbitrary data types of particle. However, there are many researchers who use Fortran to develop their codes. Thus, the previous versions of FDPS require such people to invest much time to learn C++. This is inefficient. To cope with this problem, we developed a Fortran interface layer in FDPS, which provides API for Fortran. In order to support arbitrary data types of particle in Fortran, we design the Fortran interface layer as follows. Based on a given derived data type in Fortran representing particle, a PYTHON script provided by us automatically generates a library that manipulates the C++ core part of FDPS. This library is seen as a Fortran module providing an API of FDPS from the Fortran side and uses C programs internally to interoperate Fortran with C++. In this way, we have overcome several technical issues when emulating a `template' in Fortran. Using the Fortran interface, users can develop all parts of their codes in Fortran. We show that the overhead of the Fortran interface part is sufficiently small and a code written in Fortran shows a performance practically identical to the one written in C++.

  2. Dielectrophoretic columnar focusing device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James, Conrad D; Galambos, Paul C; Derzon, Mark S

    2010-05-11

    A dielectrophoretic columnar focusing device uses interdigitated microelectrodes to provide a spatially non-uniform electric field in a fluid that generates a dipole within particles in the fluid. The electric field causes the particles to either be attracted to or repelled from regions where the electric field gradient is large, depending on whether the particles are more or less polarizable than the fluid. The particles can thereby be forced into well defined stable paths along the interdigitated microelectrodes. The device can be used for flow cytometry, particle control, and other process applications, including cell counting or other types of particle counting, and for separations in material control.

  3. Reduction of particle deposition on substrates using temperature gradient control

    DOEpatents

    Rader, Daniel J.; Dykhuizen, Ronald C.; Geller, Anthony S.

    2000-01-01

    A method of reducing particle deposition during the fabrication of microelectronic circuitry is presented. Reduction of particle deposition is accomplished by controlling the relative temperatures of various parts of the deposition system so that a large temperature gradient exists near the surface on which fabrication is taking place. This temperature gradient acts to repel particles from that surface, thereby producing cleaner surfaces and thus obtaining higher yields from a given microelectronic fabrication process.

  4. Chamber for the optical manipulation of microscopic particles

    DOEpatents

    Buican, Tudor N.; Upham, Bryan D.

    1992-01-01

    A particle control chamber enables experiments to be carried out on biological cells and the like using a laser system to trap and manipulate the particles. A manipulation chamber provides a plurality of inlet and outlet ports for the particles and for fluids used to control or to contact the particles. A central manipulation area is optically accessible by the laser and includes first enlarged volumes for containing a selected number of particles for experimentation. A number of first enlarged volumes are connected by flow channels through second enlarged volumes. The second enlarged volumes act as bubble valves for controlling the interconnections between the first enlarged volumes. Electrode surfaces may be applied above the first enlarged volumes to enable experimentation using the application of electric fields within the first enlarged volumes. A variety of chemical and environmental conditions may be established within individual first enlarged volumes to enable experimental conditions for small scale cellular interactions.

  5. A TEOM (tm) particulate monitor for comet dust, near Earth space, and planetary atmospheres

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Scientific missions to comets, near-Earth space, and planetary atmospheres require particulate and mass-accumulation instrumentation for both scientific and navigation purposes. The Rupprecht & Patashnick tapered element oscillating microbalance can accurately measure both mass flux and mass distribution of particulates over a wide range of particle sizes and loadings. Individual particles of milligram size down to a few picograms can be resolved and counted, and the accumulation of smaller particles or molecular deposition can be accurately measured using the sensors perfected and toughened under this contract. No other sensor has the dynamic range or sensitivity attained by these picogram direct mass measurement sensors. The purpose of this contract was to develop and implement reliable and repeatable manufacturing methods; build and test prototype sensors; and outline a quality control program. A dust 'thrower' was to be designed and built, and used to verify performance. Characterization and improvement of the optical motion detection system and drive feedback circuitry was to be undertaken, with emphasis on reliability, low noise, and low power consumption. All the goals of the contract were met or exceeded. An automated glass puller was built and used to make repeatable tapered elements. Materials and assembly methods were standardized, and controllers and calibrated fixtures were developed and used in all phases of preparing, coating and assembling the sensors. Quality control and reliability resulted from the use of calibrated manufacturing equipment with measurable working parameters. Thermal and vibration testing of completed prototypes showed low temperature sensitivity and high vibration tolerance. An electrostatic dust thrower was used in vacuum to throw particles from 2 x 10(exp -6) g to 7 x 10(exp -12) g in size. Using long averaging times, particles as small as 0.7 to 4 x 10(exp -11) g were weighed to resolutions in the 5 to 9 x 10(exp -13) g range. The drive circuit and optics systems were developed beyond what was anticipated in the contract, and are now virtually flight prototypes. There is already commercial interest in the developed capability of measuring picogram mass losses and gains. One area is contamination and outgassing research, both measuring picogram losses from samples and collecting products of outgassing.
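    The mass reading of such a microbalance follows from the shift in the tapered element's resonant frequency; a commonly quoted TEOM relation is dm = K0(1/f1² − 1/f0²), with K0 the element's calibrated spring constant. The snippet below is a hedged illustration of that relation with invented numbers; it is not the instrument's firmware or software from this contract.

```python
def teom_mass_change(f0_hz, f1_hz, k0):
    """Mass accumulated on a tapered-element oscillating microbalance.

    Uses the commonly quoted TEOM relation dm = K0 * (1/f1^2 - 1/f0^2),
    where K0 is the element's calibrated spring constant (instrument
    specific; the value used below is purely illustrative).
    """
    return k0 * (1.0 / f1_hz**2 - 1.0 / f0_hz**2)

# Illustrative numbers only: a small frequency drop maps to a tiny mass gain
print(teom_mass_change(f0_hz=250.0, f1_hz=249.999, k0=1.0e4))
```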

  6. Spacecraft environmental interactions: A joint Air Force and NASA research and technology program

    NASA Technical Reports Server (NTRS)

    Pike, C. P.; Purvis, C. K.; Hudson, W. R.

    1985-01-01

    A joint Air Force/NASA comprehensive research and technology program on spacecraft environmental interactions to develop technology to control interactions between large spacecraft systems and the charged-particle environment of space is described. This technology will support NASA/Department of Defense operations of the shuttle/IUS, shuttle/Centaur, and the force application and surveillance and detection missions, planning for transatmospheric vehicles and the NASA space station, and the AFSC military space system technology model. The program consists of combined contractual and in-house efforts aimed at understanding spacecraft environmental interaction phenomena and relating results of ground-based tests to space conditions. A concerted effort is being made to identify project-related environmental interactions of concern. The basic properties of materials are being investigated to develop or modify the materials as needed. A ground simulation investigation is evaluating basic plasma interaction phenomena to provide inputs to the analytical modeling investigation. Systems performance is being evaluated by both ground-based tests and analysis.

  7. Status of the tokamak program

    NASA Astrophysics Data System (ADS)

    Sheffield, J.

    1981-08-01

    For a specific configuration of magnetic field and plasma to be economically attractive as a commercial source of energy, it must contain a high-pressure plasma in a stable fashion while thermally isolating the plasma from the walls of the containment vessel. The tokamak magnetic configuration is presently the most successful in terms of reaching the considered goals. Tokamaks were developed in the USSR in a program initiated in the mid-1950s. By the early 1970s tokamaks were operating not only in the USSR but also in the U.S., Australia, Europe, and Japan. The advanced state of the tokamak program is indicated by the fact that it is used as a testbed for generic fusion development - for auxiliary heating, diagnostics, materials - as well as for specific tokamak advancement. This has occurred because it is the most economic source of a large, reproducible, hot, dense plasma. The basic tokamak is considered along with tokamak improvements, impurity control, additional heating, particle and power balance in a tokamak, aspects of microscopic transport, and macroscopic stability.

  8. Resonance controlled transport in phase space

    NASA Astrophysics Data System (ADS)

    Leoncini, Xavier; Vasiliev, Alexei; Artemyev, Anton

    2018-02-01

    We consider the mechanism of controlling particle transport in phase space by means of resonances in an adiabatic setting. Using a model problem describing nonlinear wave-particle interaction, we show that captures into resonances can be used to control transport in momentum space as well as in physical space. We design the model system to provide creation of a narrow peak in the distribution function, thus producing effective cooling of a sub-ensemble of the particles.

  9. Developmental assessment of the Fort St. Vrain version of the Composite HTGR Analysis Program (CHAP-2)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stroh, K.R.

    1980-01-01

    The Composite HTGR Analysis Program (CHAP) consists of a model-independent systems analysis mainframe named LASAN and model-dependent linked code modules, each representing a component, subsystem, or phenomenon of an HTGR plant. The Fort St. Vrain (FSV) version (CHAP-2) includes 21 coded modules that model the neutron kinetics and thermal response of the core; the thermal-hydraulics of the reactor primary coolant system, secondary steam supply system, and balance-of-plant; the actions of the control system and plant protection system; the response of the reactor building; and the relative hazard resulting from fuel particle failure. FSV steady-state and transient plant data are being used to partially verify the component modeling and dynamic simulation techniques used to predict plant response to postulated accident sequences.

  10. EMC Aspects of Turbulence Heating ObserveR (THOR) Spacecraft

    NASA Astrophysics Data System (ADS)

    Soucek, J.; Ahlen, L.; Bale, S.; Bonnell, J.; Boudin, N.; Brienza, D.; Carr, C.; Cipriani, F.; Escoubet, C. P.; Fazakerley, A.; Gehler, M.; Genot, V.; Hilgers, A.; Hanock, B.; Jannet, G.; Junge, A.; Khotyaintsev, Y.; De Keyser, J.; Kucharek, H.; Lan, R.; Lavraud, B.; Leblanc, F.; Magnes, W.; Mansour, M.; Marcucci, M. F.; Nakamura, R.; Nemecek, Z.; Owen, C.; Phal, Y.; Retino, A.; Rodgers, D.; Safrankova, J.; Sahraoui, F.; Vainio, R.; Wimmer-Schweingruber, R.; Steinhagen, J.; Vaivads, A.; Wielders, A.; Zaslavsky, A.

    2016-05-01

    Turbulence Heating ObserveR (THOR) is a spacecraft mission dedicated to the study of plasma turbulence in near-Earth space. The mission is currently under study for implementation as a part of the ESA Cosmic Vision program. THOR will involve a single spinning spacecraft equipped with state-of-the-art instruments capable of sensitive measurements of electromagnetic fields and plasma particles. The sensitive electric and magnetic field measurements require that spacecraft-generated emissions are restricted and strictly controlled; therefore a comprehensive EMC program has been put in place already during the study phase. The THOR study team and a dedicated EMC working group are formulating the mission EMC requirements already in the earliest phase of the project to avoid later delays and cost increases related to EMC. This article introduces the THOR mission and reviews the current state of its EMC requirements.

  11. The nuclear battery

    NASA Astrophysics Data System (ADS)

    Kozier, K. S.; Rosinger, H. E.

    The evolution and present status of an Atomic Energy of Canada Limited program to develop a small, solid-state, passively cooled reactor power supply known as the Nuclear Battery is reviewed. Key technical features of the Nuclear Battery reactor core include a heat-pipe primary heat transport system, graphite neutron moderator, low-enriched uranium TRISO coated-particle fuel and the use of burnable poisons for long-term reactivity control. An external secondary heat transport system extracts useful heat energy, which may be converted into electricity in an organic Rankine cycle engine or used to produce high-pressure steam. The present reference design is capable of producing about 2400 kW(t) (about 600 kW(e) net) for 15 full-power years. Technical and safety features are described along with recent progress in component hardware development programs and market assessment work.

  12. Code C# for chaos analysis of relativistic many-body systems with reactions

    NASA Astrophysics Data System (ADS)

    Grossu, I. V.; Besliu, C.; Jipa, Al.; Stan, E.; Esanu, T.; Felea, D.; Bordeianu, C. C.

    2012-04-01

    In this work we present a reaction module for "Chaos Many-Body Engine" (Grossu et al., 2010 [1]). Following our goal of creating a customizable, object-oriented code library, the list of all possible reactions, including the corresponding properties (particle types, probability, cross section, particle lifetime, etc.), can be supplied as a parameter using a specific XML input file. Inspired by the Poincaré section, we also propose the "Clusterization Map" as a new, intuitive analysis method for many-body systems. For exemplification, we implemented a numerical toy model for nuclear relativistic collisions at 4.5 A GeV/c (the SKM200 Collaboration). An encouraging agreement with experimental data was obtained for momentum, energy, rapidity, and angular π distributions. Catalogue identifier: AEGH_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGH_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 184 628. No. of bytes in distributed program, including test data, etc.: 7 905 425. Distribution format: tar.gz. Programming language: Visual C#.NET 2005. Computer: PC. Operating system: .NET Framework 2.0 running on MS Windows. Has the code been vectorized or parallelized?: Each many-body system is simulated on a separate execution thread; one processor is used for each many-body system. RAM: 128 Megabytes. Classification: 6.2, 6.5. Catalogue identifier of previous version: AEGH_v1_0. Journal reference of previous version: Comput. Phys. Comm. 181 (2010) 1464. External routines: .NET Framework 2.0 Library. Does the new version supersede the previous version?: Yes. Nature of problem: Chaos analysis of three-dimensional, relativistic many-body systems with reactions. Solution method: Second-order Runge-Kutta algorithm for simulating relativistic many-body systems with reactions. The object-oriented solution is easy to reuse, extend, and customize in any development environment that accepts .NET assemblies or COM components. It provides treatment of two-particle reactions and decays; for each particle, calculation of the time measured in the particle's own reference frame, according to the instantaneous velocity; the possibility to dynamically add particle properties (spin, isospin, etc.) and reactions/decays using a specific XML input file; basic support for Monte Carlo simulations; and implementation of the Lyapunov exponent, "fragmentation level", "average system radius", "virial coefficient", "clusterization map", and an energy-conservation precision test. As an example of use, we implemented a toy model for nuclear relativistic collisions at 4.5 A GeV/c. Reasons for new version: Following our goal of applying chaos theory to nuclear relativistic collisions at 4.5 A GeV/c, we developed a reaction module integrated with the Chaos Many-Body Engine. In the previous version, inheriting the Particle class was the only way to implement additional particle properties (spin, isospin, and so on); in the new version, particle properties can be dynamically added using a dictionary object. The application was improved in order to calculate the time measured in each particle's own reference frame. The supported processes include two-particle reactions (a+b→c+d), decays (a→c+d), stimulated decays, and more complicated schemes implemented as various combinations of the previous reactions.
Following our goal of creating a flexible application, the reaction list, including the corresponding properties (cross sections, particle lifetimes, etc.), can be supplied as a parameter using a specific XML configuration file. The simulation output files were modified for systems with reactions, while assuring backward compatibility. We propose the "Clusterization Map" as a new investigation method for many-body systems. The multi-dimensional Lyapunov exponent was adapted so that it can be used for systems with variable structure. Basic support for Monte Carlo simulations was also added. Additional comments: Windows Forms application for testing the engine; easy copy/paste-based deployment method. Running time: quadratic complexity.
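    The solution method named in the summary is a second-order Runge-Kutta integrator. The sketch below shows a generic, non-relativistic midpoint-RK2 step for a particle system under pairwise forces; it illustrates the scheme only and is not the authors' C# engine, and the softened inverse-square force is an assumed example.

```python
import numpy as np

def rk2_step(x, v, accel, dt):
    """One second-order Runge-Kutta (midpoint) step for a particle system.

    x, v  : (N, 3) position and velocity arrays.
    accel : callable returning (N, 3) accelerations for given positions.
    """
    a1 = accel(x)
    x_mid = x + 0.5 * dt * v          # half-step positions
    v_mid = v + 0.5 * dt * a1         # half-step velocities
    a2 = accel(x_mid)
    return x + dt * v_mid, v + dt * a2

# Example: two particles under a softened inverse-square attraction
def pairwise_accel(x, eps=1e-2):
    a = np.zeros_like(x)
    for i in range(x.shape[0]):
        for j in range(x.shape[0]):
            if i != j:
                r = x[j] - x[i]
                a[i] += r / (np.linalg.norm(r)**2 + eps)**1.5
    return a

x = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
v = np.array([[0.0, 0.1, 0.0], [0.0, -0.1, 0.0]])
for _ in range(100):
    x, v = rk2_step(x, v, pairwise_accel, dt=0.01)
```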

  13. Use of a ground-water flow model with particle tracking to evaluate ground-water vulnerability, Clark County, Washington

    USGS Publications Warehouse

    Snyder, D.T.; Wilkinson, J.M.; Orzol, L.L.

    1996-01-01

    A ground-water flow model was used in conjunction with particle tracking to evaluate ground-water vulnerability in Clark County, Washington. Using the particle-tracking program, particles were placed in every cell of the flow model (about 60,000 particles) and tracked backwards in time and space upgradient along flow paths to their recharge points. A new computer program was developed that interfaces the results from a particle-tracking program with a geographic information system (GIS). The GIS was used to display and analyze the particle-tracking results. Ground-water vulnerability was evaluated by selecting parts of the ground-water flow system and combining the results with ancillary information stored in the GIS to determine recharge areas, characteristics of recharge areas, downgradient impact of land use at recharge areas, and age of ground water. Maps of the recharge areas for each hydrogeologic unit illustrate the presence of local, intermediate, or regional ground-water flow systems and emphasize the three-dimensional nature of the ground-water flow system in Clark County. Maps of the recharge points for each hydrogeologic unit were overlaid with maps depicting aquifer sensitivity as determined by DRASTIC (a measure of the pollution potential of ground water, based on the intrinsic characteristics of the near-surface unsaturated and saturated zones) and recharge from on-site waste-disposal systems. A large number of recharge areas were identified, particularly in southern Clark County, that have a high aquifer sensitivity, coincide with areas of recharge from on-site waste-disposal systems, or both. Using the GIS, the characteristics of the recharge areas were related to the downgradient parts of the ground-water system that will eventually receive flow that has recharged through these areas. The aquifer sensitivity, as indicated by DRASTIC, of the recharge areas for downgradient parts of the flow system was mapped for each hydrogeologic unit. A number of public-supply wells in Clark County may be receiving a component of water that recharged in areas that are more conducive to contaminant entry. The aquifer sensitivity maps illustrate a critical deficiency in the DRASTIC methodology: the failure to account for the dynamics of the ground-water flow system. DRASTIC indices calculated for a particular location thus do not necessarily reflect the conditions of the ground-water resources at the recharge areas to that particular location. Each hydrogeologic unit was also mapped to highlight those areas that will eventually receive flow from recharge areas with on-site waste-disposal systems. Most public-supply wells in southern Clark County may eventually receive a component of water that was recharged from on-site waste-disposal systems.Traveltimes from particle tracking were used to estimate the minimum and maximum age of ground water within each model-grid cell. Chlorofluorocarbon (CFC)-age dating of ground water from 51 wells was used to calibrate effective porosity values used for the particle- tracking program by comparison of ground-water ages determined through the use of the CFC-age dating with those calculated by the particle- tracking program. There was a 76 percent agreement in predicting the presence of modern water in the 51 wells as determined using CFCs and calculated by the particle-tracking program. Maps showing the age of ground water were prepared for all the hydrogeologic units. 
Areas with the youngest ground-water ages are expected to be at greatest risk for contamination from anthropogenic sources. Comparison of these maps with maps of public-supply wells in Clark County indicates that most of these wells may withdraw ground water that is, in part, less than 100 years old, and in many instances less than 10 years old. Results of the analysis showed that a single particle-tracking analysis simulating advective transport can be used to evaluate ground-water vulnerability for any part of a ground-wate
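    Conceptually, backward particle tracking integrates the reversed velocity field from a cell of interest upgradient to its recharge point. The sketch below illustrates that idea on an arbitrary velocity function; it is a conceptual illustration only, not the USGS particle-tracking code or flow model used in the study.

```python
import numpy as np

def track_backward(start_xy, velocity, dt=1.0, n_steps=1000):
    """Trace a particle backward along an advective flow path.

    start_xy : (2,) starting location (e.g. a model-cell center).
    velocity : callable (x, y) -> (vx, vy), the seepage-velocity field.
    """
    path = [np.asarray(start_xy, dtype=float)]
    for _ in range(n_steps):
        vx, vy = velocity(*path[-1])
        # step *against* the flow to move upgradient toward the recharge point
        path.append(path[-1] - dt * np.array([vx, vy]))
    return np.array(path)

# Example: uniform flow toward +x, so backward tracking walks toward -x
path = track_backward([100.0, 50.0], lambda x, y: (0.5, 0.0), dt=10.0, n_steps=50)
print(path[0], path[-1])
```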

  14. Program Package for 3d PIC Model of Plasma Fiber

    NASA Astrophysics Data System (ADS)

    Kulhánek, Petr; Břeň, David

    2007-08-01

    A fully three-dimensional Particle-in-Cell (PIC) model of the plasma fiber has been developed. The code is written in Fortran 95, using the CVF (Compaq Visual Fortran) implementation under the Microsoft Visual Studio user interface. Five particle solvers and two field solvers are included in the model. The solvers have relativistic and non-relativistic variants. The model can deal with both periodic and non-periodic boundary conditions. The mechanism of surface turbulence generation in the plasma fiber was successfully simulated with the PIC program package.
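    For readers unfamiliar with the method, the particle-in-cell cycle alternates charge deposition on a grid, a field solve, field gathering at particle positions, and a particle push. The sketch below is a minimal one-dimensional electrostatic version in normalized units, purely illustrative; the package described above is a fully three-dimensional Fortran 95 code with several solver variants.

```python
import numpy as np

# Minimal 1D electrostatic PIC cycle in normalized units (illustrative only)
ng, L, n_part, dt, steps = 64, 2 * np.pi, 20000, 0.1, 200
dx = L / ng
rng = np.random.default_rng(1)
x = rng.uniform(0.0, L, n_part)          # electron macro-particle positions
v = 0.05 * rng.standard_normal(n_part)   # thermal velocities
k = 2 * np.pi * np.fft.fftfreq(ng, d=dx) # angular wavenumbers
k[0] = 1.0                               # placeholder; k=0 mode zeroed below

for _ in range(steps):
    # 1) deposit electron density on the grid (cloud-in-cell weighting)
    g = x / dx
    i0 = np.floor(g).astype(int) % ng
    w = g - np.floor(g)
    ne = np.zeros(ng)
    np.add.at(ne, i0, 1.0 - w)
    np.add.at(ne, (i0 + 1) % ng, w)
    ne *= ng / n_part                    # normalize to mean density 1

    # 2) field solve: d2(phi)/dx2 = -(rho_ion - rho_e), with rho_ion = 1
    rho = 1.0 - ne
    phi_k = np.fft.fft(rho) / k**2
    phi_k[0] = 0.0
    E = np.real(np.fft.ifft(-1j * k * phi_k))   # E = -d(phi)/dx

    # 3) gather field at particle positions and push (charge/mass = -1)
    Ep = E[i0] * (1.0 - w) + E[(i0 + 1) % ng] * w
    v -= Ep * dt
    x = (x + v * dt) % L
```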

  15. IMPETUS - Interactive MultiPhysics Environment for Unified Simulations.

    PubMed

    Ha, Vi Q; Lykotrafitis, George

    2016-12-08

    We introduce IMPETUS - Interactive MultiPhysics Environment for Unified Simulations, an object oriented, easy-to-use, high performance, C++ program for three-dimensional simulations of complex physical systems that can benefit a large variety of research areas, especially in cell mechanics. The program implements cross-communication between locally interacting particles and continuum models residing in the same physical space while a network facilitates long-range particle interactions. Message Passing Interface is used for inter-processor communication for all simulations.

  16. Physics Accomplishments and Future Prospects of the BES Experiments at the Beijing Electron-Positron Collider

    NASA Astrophysics Data System (ADS)

    Briere, Roy A.; Harris, Frederick A.; Mitchell, Ryan E.

    2016-10-01

    The cornerstone of the Chinese experimental particle physics program is a series of experiments performed in the τ-charm energy region. China began building e+e- colliders at the Institute for High Energy Physics in Beijing more than three decades ago. Beijing Electron Spectrometer (BES) is the common root name for the particle physics detectors operated at these machines. We summarize the development of the BES program and highlight the physics results across several topical areas.

  17. Optimizing phase to enhance optical trap stiffness.

    PubMed

    Taylor, Michael A

    2017-04-03

    Phase optimization offers promising capabilities in optical tweezers, allowing huge increases in the applied forces, trap stiffness, or measurement sensitivity. One key obstacle to potential applications is the lack of an efficient algorithm to compute an optimized phase profile, with enhanced trapping experiments relying on slow programs that would take up to a week to converge. Here we introduce an algorithm that reduces the wait from days to minutes. We characterize the achievable increase in trap stiffness and its dependence on particle size, refractive index, and optical polarization. We further show that phase-only control can achieve almost all of the enhancement possible with full wavefront shaping; for instance, phase control allows 62 times higher trap stiffness for 10 μm silica spheres in water, while amplitude control and non-trivial polarization further increase this by factors of 1.26 and 1.01, respectively. This algorithm will facilitate future applications in optical trapping, and more generally in wavefront optimization.

  18. PCaPAC 2006 Proceedings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pavel Chevtsov; Matthew Bickley

    2007-03-30

    The 6th international PCaPAC (Personal Computers and Particle Accelerator Controls) workshop was held at Jefferson Lab, Newport News, Virginia, from October 24-27, 2006. The main objectives of the conference were to discuss the most important issues of the use of PCs and modern IT technologies for controls of accelerators and to give scientists, engineers, and technicians a forum to exchange ideas on control problems and their solutions. The workshop consisted of plenary sessions and poster sessions; no parallel sessions were held. In total, more than seventy oral and poster presentations as well as tutorials were made during the conference, on the basis of which about fifty papers were submitted by the authors and included in this publication. This printed version of the PCaPAC 2006 Proceedings is published at Jefferson Lab according to the decision of the PCaPAC International Program Committee of October 26, 2006.

  19. A Mechanics-Based Framework Leading to Improved Diagnosis and Treatment of Hydrocephalus

    NASA Astrophysics Data System (ADS)

    Cohen, Benjamin; Soren, Vedels; Wagshul, Mark; Egnor, Michael; Voorhees, Abram; Wei, Timothy

    2007-11-01

    Hydrocephalus is defined as an accumulation of cerebrospinal fluid (CSF) in the cranium, at the expense of brain tissue. The result is a disruption of the normal pressure and/or flow dynamics of the intracranial blood and CSF. We seek to introduce integral control volume analysis to the study of hydrocephalus. The goal is to provide a first principles framework to integrate a broad spectrum of sometimes disparate investigations into a highly complex, multidisciplinary problem. The general technique for the implementation of control volumes to hydrocephalus will be presented. This includes factors faced in choosing control volumes and making the required measurements to evaluate mass and momentum conservation. In addition, the use of our Digital Particle Image Velocimetry (DPIV) processing program has been extended to measure the displacement of the ventricles' walls from Magnetic Resonance (MR) images. This is done to determine the volume change of the intracranial fluid spaces.
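    As a point of reference, the integral mass balance that a control-volume analysis of this kind enforces over the intracranial space can be sketched as below. This is the generic Reynolds-transport form with assumed notation (CV and CS for the control volume and its surface, rho for fluid density, u for velocity, n for the outward normal), not the authors' specific formulation.

```latex
\begin{equation}
\frac{d}{dt}\int_{CV} \rho \, dV \;+\; \int_{CS} \rho\,(\mathbf{u}\cdot\mathbf{n})\, dA \;=\; 0
\end{equation}
```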

  20. Stabilization of Co{sup 2+} in layered double hydroxides (LDHs) by microwave-assisted ageing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herrero, M.; Benito, P.; Labajos, F.M.

    2007-03-15

    Co-containing layered double hydroxides have been prepared at different pH and aged following different routes. The solids prepared have been characterized by elemental chemical analysis, powder X-ray diffraction, thermogravimetric and differential thermal analyses (both in nitrogen and in oxygen), FT-IR and Vis-UV spectroscopies, temperature-programmed reduction, and surface area assessment by nitrogen adsorption at -196 deg. C. The best conditions found to preserve the cobalt species in the divalent oxidation state are preparing the samples at controlled pH and then submitting them to ageing under microwave irradiation. Graphical abstract: The use of microwave-hydrothermal treatment, controlling both temperature and ageing time, permits the synthesis of well-crystallized nanomaterials with controlled surface properties. An enhancement in the degree of crystallinity and an increase in the particle size are observed when the irradiation time is prolonged.

  1. Non-additive simple potentials for pre-programmed self-assembly

    NASA Astrophysics Data System (ADS)

    Mendoza, Carlos

    2015-03-01

    A major goal in nanoscience and nanotechnology is the self-assembly of any desired complex structure with a system of particles interacting through simple potentials. To achieve this objective, intense experimental and theoretical efforts are currently concentrated in the development of the so called ``patchy'' particles. Here we follow a completely different approach and introduce a very accessible model to produce a large variety of pre-programmed two-dimensional (2D) complex structures. Our model consists of a binary mixture of particles that interact through isotropic interactions that is able to self-assemble into targeted lattices by the appropriate choice of a small number of geometrical parameters and interaction strengths. We study the system using Monte Carlo computer simulations and, despite its simplicity, we are able to self assemble potentially useful structures such as chains, stripes, Kagomé, twisted Kagomé, honeycomb, square, Archimedean and quasicrystalline tilings. Our model is designed such that it may be implemented using discotic particles or, alternatively, using exclusively spherical particles interacting isotropically. Thus, it represents a promising strategy for bottom-up nano-fabrication. Partial Financial Support: DGAPA IN-110613.
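    As a rough illustration of the kind of simulation described above, the sketch below shows one Metropolis Monte Carlo sweep for a two-dimensional binary mixture with isotropic pair interactions. It is a generic sketch under assumed parameters (a Lennard-Jones-type placeholder potential and species-pair matrices eps and sigma), not the authors' published potential or code.

```python
import numpy as np

def lj_like(r, eps, sigma):
    """Placeholder isotropic pair potential (Lennard-Jones form)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * eps * (sr6**2 - sr6)

def mc_sweep(pos, species, box, eps, sigma, beta=1.0, step=0.1, rng=None):
    """One Metropolis sweep for a 2D binary mixture in a periodic box.
    `eps` and `sigma` are 2x2 arrays of species-pair parameters."""
    rng = rng if rng is not None else np.random.default_rng()
    n = len(pos)

    def energy_of(i, trial):
        d = pos - trial
        d -= box * np.round(d / box)              # periodic minimum image
        r = np.hypot(d[:, 0], d[:, 1])
        mask = np.arange(n) != i                  # exclude self-interaction
        return lj_like(r[mask], eps[species[i], species[mask]],
                       sigma[species[i], species[mask]]).sum()

    for i in rng.permutation(n):
        e_old = energy_of(i, pos[i].copy())
        trial = (pos[i] + rng.uniform(-step, step, 2)) % box
        e_new = energy_of(i, trial)
        if rng.random() < np.exp(-beta * (e_new - e_old)):  # Metropolis rule
            pos[i] = trial
    return pos
```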

  2. Controlling silk fibroin particle features for drug delivery

    PubMed Central

    Lammel, Andreas; Hu, Xiao; Park, Sang-Hyug; Kaplan, David L.; Scheibel, Thomas

    2010-01-01

    Silk proteins are a promising material for drug delivery due to their aqueous processability, biocompatibility, and biodegradability. A simple aqueous preparation method for silk fibroin particles with controllable size, secondary structure and zeta potential is reported. The particles were produced by salting out a silk fibroin solution with potassium phosphate. The effect of ionic strength and pH of potassium phosphate solution on the yield and morphology of the particles was determined. Secondary structure and zeta potential of the silk particles could be controlled by pH. Particles produced by salting out with 1.25 M potassium phosphate pH 6 showed a dominating silk II (crystalline) structure whereas particles produced at pH 9 were mainly composed of silk I (less crystalline). The results show that silk I rich particles possess chemical and physical stability and secondary structure which remained unchanged during post treatments even upon exposure to 100% ethanol or methanol. A model is presented to explain the process of particle formation based on intra- and intermolecular interactions of the silk domains, influenced by pH and kosmotrope salts. The reported silk fibroin particles can be loaded with small molecule model drugs, such as alcian blue, rhodamine B, and crystal violet, by simple absorption based on electrostatic interactions. In vitro release of these compounds from the silk particles depends on charge – charge interactions between the compounds and the silk. With crystal violet we demonstrated that the release kinetics are dependent on the secondary structure of the particles. PMID:20219241

  3. A computer controlled signal preprocessor for laser fringe anemometer applications

    NASA Technical Reports Server (NTRS)

    Oberle, Lawrence G.

    1987-01-01

    The operation of most commercially available laser fringe anemometer (LFA) counter-processors assumes that adjustments are made to the signal processing independent of the computer used for reducing the data acquired. Not only does the researcher desire a record of these parameters attached to the data acquired, but changes in flow conditions generally require that these settings be changed to improve data quality. Because of this limitation, on-line modification of the data acquisition parameters can be difficult and time consuming. A computer-controlled signal preprocessor has been developed which makes possible this optimization of the photomultiplier signal as a normal part of the data acquisition process. It allows computer control of the filter selection, signal gain, and photomultiplier voltage. The raw signal from the photomultiplier tube is input to the preprocessor which, under the control of a digital computer, filters the signal and amplifies it to an acceptable level. The counter-processor used at Lewis Research Center generates the particle interarrival times, as well as the time-of-flight of the particle through the probe volume. The signal preprocessor allows computer control of the acquisition of these data. Through the preprocessor, the computer also can control the handshaking signals for the interface between itself and the counter-processor. Finally, the signal preprocessor splits the pedestal from the signal before filtering, monitors the photomultiplier dc current, sends a signal proportional to this current to the computer through an analog-to-digital converter, and provides an alarm if the current exceeds a predefined maximum. Complete drawings and explanations are provided in the text, as well as a sample interface program for use with the data acquisition software.

  4. Targeted Therapies for Myeloma and Metastatic Bone Cancers

    DTIC Science & Technology

    2006-02-01

    The principal investigator was invited to present results from this program in a talk at the Particles 2006 - Medical/Biochemical Diagnostic, Pharmaceutical, and Drug Delivery Applications of Particle Technology Forum, scheduled for May 13-16 in Orlando, FL, and was also invited to give a guest lecture on nanoparticle drug delivery technology to the...

  5. Skylab

    NASA Image and Video Library

    1970-01-01

    This chart describes Skylab's Particle Collection device, a scientific experiment designed to study micro-meteoroid particles in near-Earth space and determine their abundance, mass distribution, composition, and erosive effects. The Marshall Space Flight Center had program management responsibility for the development of Skylab hardware and experiments.

  6. Skylab

    NASA Image and Video Library

    1970-01-01

    This photograph shows Skylab's Particle Collection device, a scientific experiment designed to study micro-meteoroid particles in near-Earth space and determine their abundance, mass distribution, composition, and erosive effects. The Marshall Space Flight Center had program management responsibility for the development of Skylab hardware and experiments.

  7. Under What Conditions Can Equilibrium Gas-Particle Partitioning Be Expected to Hold in the Atmosphere?

    PubMed

    Mai, Huajun; Shiraiwa, Manabu; Flagan, Richard C; Seinfeld, John H

    2015-10-06

    The prevailing treatment of secondary organic aerosol formation in atmospheric models is based on the assumption of instantaneous gas-particle equilibrium for the condensing species, yet compelling experimental evidence indicates that organic aerosols can exhibit the properties of highly viscous, semisolid particles, for which gas-particle equilibrium may be achieved slowly. The approach to gas-particle equilibrium partitioning is controlled by gas-phase diffusion, interfacial transport, and particle-phase diffusion. Here we evaluate the controlling processes and the time scale to achieve gas-particle equilibrium as a function of the volatility of the condensing species, its surface accommodation coefficient, and its particle-phase diffusivity. For particles in the size range of typical atmospheric organic aerosols (∼50-500 nm), the time scale to establish gas-particle equilibrium is generally governed either by interfacial accommodation or particle-phase diffusion. The rate of approach to equilibrium varies, depending on whether the bulk vapor concentration is constant, typical of an open system, or decreasing as a result of condensation into the particles, typical of a closed system.
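    For orientation, a commonly quoted estimate (not taken from this abstract) of the characteristic time for a semisolid particle to homogenize by bulk diffusion is

        \[
          \tau_{\mathrm{mix}} \approx \frac{d_p^{2}}{4\pi^{2} D_b},
        \]

    where d_p is the particle diameter and D_b the particle-phase (bulk) diffusivity. Under this estimate, a 100 nm particle with D_b of roughly 10^-15 cm^2 s^-1 would need on the order of an hour to mix internally, consistent with the slow approach to equilibrium discussed above.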

  8. Final safety analysis report for the Ground Test Accelerator (GTA), Phase 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1994-10-01

    This document is the third volume of a 3-volume safety analysis report on the Ground Test Accelerator (GTA). The GTA program at the Los Alamos National Laboratory (LANL) is the major element of the national Neutral Particle Beam (NPB) program, which is supported by the Strategic Defense Initiative Office (SDIO). A principal goal of the national NPB program is to assess the feasibility of using hydrogen and deuterium neutral particle beams outside the Earth's atmosphere. The main effort of the NPB program at Los Alamos concentrates on developing the GTA. The GTA is classified as a low-hazard facility, except for the cryogenic-cooling system, which is classified as a moderate-hazard facility. This volume consists of appendices C through U of the report.

  9. Method for ion implantation induced embedded particle formation via reduction

    DOEpatents

    Hampikian, Janet M; Hunt, Eden M

    2001-01-01

    A method for ion implantation induced embedded particle formation via reduction with the steps of ion implantation with an ion/element that will chemically reduce the chosen substrate material, implantation of the ion/element to a sufficient concentration and at a sufficient energy for particle formation, and control of the temperature of the substrate during implantation. A preferred embodiment includes the formation of particles which are nano-dimensional (<100 nm in size). The phase of the particles may be affected by control of the substrate temperature during and/or after the ion implantation process.

  10. Effect of soil texture and chemical properties on laboratory-generated dust emissions from SW North America

    NASA Astrophysics Data System (ADS)

    Mockford, T.; Zobeck, T. M.; Lee, J. A.; Gill, T. E.; Dominguez, M. A.; Peinado, P.

    2012-12-01

    Understanding the controls of mineral dust emissions and their particle size distributions during wind-erosion events is critical, as dust particles play a significant role in shaping the Earth's climate. It has been suggested that emission rates and particle size distributions are independent of soil chemistry and soil texture. In this study, 45 samples of wind-erodible surface soils from the Southern High Plains and Chihuahuan Desert regions of Texas, New Mexico, Colorado and Chihuahua were analyzed by the Lubbock Dust Generation, Analysis and Sampling System (LDGASS) and a Beckman-Coulter particle multisizer. The LDGASS created dust emissions in a controlled laboratory setting using a rotating arm which allows particle collisions. The emitted dust was transferred to a chamber where particulate matter concentration was recorded using a DataRam and a MiniVol filter, and dust particle size distribution was recorded using a GRIMM particle analyzer. Particle size distributions were also determined from samples deposited on the MiniVol filters using a Beckman-Coulter particle multisizer. Soil textures of source samples ranged from sands and sandy loams to clays and silts. Initial results suggest that total dust emissions increased with increasing soil clay and silt content and decreased with increasing sand content. Particle size distribution analysis showed a similar relationship: soils with high silt content produced the widest range of dust particle sizes and the smallest dust particles. Sand grains seem to produce the largest dust particles. Chemical control of dust emissions by calcium carbonate content will also be discussed.

  11. Reactor for producing large particles of materials from gases

    NASA Technical Reports Server (NTRS)

    Flagan, Richard C. (Inventor); Alam, Mohammed K. (Inventor)

    1987-01-01

    A method and apparatus is disclosed for producing large particles of material from gas, or gases, containing the material (e.g., silicon from silane) in a free-space reactor comprised of a tube (20) and controlled furnace (25). A hot gas is introduced in the center of the reactant gas through a nozzle (23) to heat a quantity of the reactant gas, or gases, to produce a controlled concentration of seed particles (24) which are entrained in the flow of reactant gas, or gases. The temperature profile (FIG. 4) of the furnace is controlled for such a slow, controlled rate of reaction that virtually all of the material released condenses on seed particles and new particles are not nucleated in the furnace. A separate reactor comprised of a tube (33) and furnace (30) may be used to form a seed aerosol which, after passing through a cooling section (34) is introduced in the main reactor tube (34) which includes a mixer (36) to mix the seed aerosol in a controlled concentration with the reactant gas or gases.

  12. Laboratory Experiments and Instrument Intercomparison Studies of Carbonaceous Aerosol Particles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidovits, Paul

    Aerosols containing black carbon (and some specific types of organic particulate matter) directly absorb incoming light, heating the atmosphere. In addition, all aerosol particles backscatter solar light, leading to a net-cooling effect. Indirect effects involve hydrophilic aerosols, which serve as cloud condensation nuclei (CCN) that affect cloud cover and cloud stability, impacting both atmospheric radiation balance and precipitation patterns. At night, all clouds produce local warming, but overall clouds exert a net-cooling effect on the Earth. The effect of aerosol radiative forcing on climate may be as large as that of the greenhouse gases, but predominantly opposite in sign and much more uncertain. The uncertainties in the representation of aerosol interactions in climate models make it problematic to use model projections to guide energy policy. The objective of our program is to reduce the uncertainties in the aerosol radiative forcing in the two areas highlighted in the ASR Science and Program Plan. That is, (1) addressing the direct effect by correlating particle chemistry and morphology with particle optical properties (i.e. absorption, scattering, extinction), and (2) addressing the indirect effect by correlating particle hygroscopicity and CCN activity with particle size, chemistry, and morphology. In this connection we are systematically studying particle formation, oxidation, and the effects of particle coating. The work is specifically focused on carbonaceous particles, where the uncertainties in the climate-relevant properties are the highest. The ongoing work consists of laboratory experiments and related instrument inter-comparison studies, both coordinated with field and modeling studies, with the aim of providing reliable data to represent aerosol processes in climate models. The work is performed in the aerosol laboratory at Boston College. At the center of our laboratory setup are two main sources for the production of aerosol particles: (a) two well-characterized sources of soot particles and (b) a flow reactor for controlled OH and/or O3 oxidation of relevant gas phase species to produce well-characterized SOA particles. After formation, the aerosol particles are subjected to physical and chemical processes that simulate aerosol growth and aging. A suite of instruments in our laboratory is used to characterize the physical and chemical properties of aerosol particles before and after processing. The Time of Flight Aerosol Mass Spectrometer (ToF-AMS) together with a Scanning Mobility Particle Sizer (SMPS) measures particle mass, volume, density, composition (including black carbon content), dynamic shape factor, and fractal dimension. The ToF-AMS was developed at ARI with Boston College participation. About 120 AMS instruments are now in service (including 5 built for DOE laboratories) performing field and laboratory studies worldwide. Other major instruments include a thermal denuder, two Differential Mobility Analyzers (DMA), a Cloud Condensation Nuclei Counter (CCN), a Thermal Desorption Aerosol GC/MS (TAG) and the new Soot Particle Aerosol Mass Spectrometer (SP-AMS). Optical instrumentation required for the studies has been brought to our laboratory as part of ongoing and planned collaborative projects with colleagues from DOE, NOAA and university laboratories.
Optical instruments that will be utilized include a Photoacoustic Spectrometer (PAS), a Cavity Ring Down Aerosol Extinction Spectrometer (CRD-AES), a Photo Thermal Interferometer (PTI), a new 7-wavelength Aethalometer and a Cavity Attenuated Phase Shift Extinction Monitor (CAPS). These instruments are providing aerosol absorption, extinction and scattering coefficients at a range of atmospherically relevant wavelengths. During the past two years our work has continued along the lines of our original proposal. We report on 12 completed and/or continuing projects conducted during the period 08/14 to 08/14/2015. These projects are described in 17 manuscripts published in refereed journals.

  13. Far-Field Lorenz-Mie Scattering in an Absorbing Host Medium: Theoretical Formalism and FORTRAN Program

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael I.; Yang, Ping

    2018-01-01

    In this paper we make practical use of the recently developed first-principles approach to electromagnetic scattering by particles immersed in an unbounded absorbing host medium. Specifically, we introduce an actual computational tool for the calculation of pertinent far-field optical observables in the context of the classical Lorenz-Mie theory. The paper summarizes the relevant theoretical formalism, explains various aspects of the corresponding numerical algorithm, specifies the input and output parameters of a FORTRAN program available at https://www.giss.nasa.gov/staff/mmishchenko/Lorenz-Mie.html, and tabulates benchmark results useful for testing purposes. This public-domain FORTRAN program enables one to solve the following two important problems: (i) simulate theoretically the reading of a remote well-collimated radiometer measuring electromagnetic scattering by an individual spherical particle or a small random group of spherical particles; and (ii) compute the single-scattering parameters that enter the vector radiative transfer equation derived directly from the Maxwell equations.
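    For reference, in the familiar case of a non-absorbing host the Lorenz-Mie extinction and scattering efficiencies take the textbook form

        \[
          Q_{\mathrm{ext}} = \frac{2}{x^{2}}\sum_{n=1}^{\infty}(2n+1)\,\mathrm{Re}(a_n+b_n),
          \qquad
          Q_{\mathrm{sca}} = \frac{2}{x^{2}}\sum_{n=1}^{\infty}(2n+1)\left(|a_n|^{2}+|b_n|^{2}\right),
        \]

    with size parameter x = 2πr/λ and Lorenz-Mie coefficients a_n, b_n. These standard expressions are quoted here only as a baseline; when the host medium is absorbing, as in the program described above, the definitions of the far-field observables must be modified along the lines developed by the authors.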

  14. Application of Photoshop and Scion Image analysis to quantification of signals in histochemistry, immunocytochemistry and hybridocytochemistry.

    PubMed

    Tolivia, Jorge; Navarro, Ana; del Valle, Eva; Perez, Cristina; Ordoñez, Cristina; Martínez, Eva

    2006-02-01

    To describe a simple method to achieve the differential selection and subsequent quantification of the signal strength using only one section. Several methods for performing quantitative histochemistry, immunocytochemistry or hybridocytochemistry without the use of specific commercial image analysis systems rely on pixel-counting algorithms, which do not provide information on the amount of chromogen present in the section. Other techniques use complex algorithms to calculate the cumulative signal strength using two consecutive sections. To separate the chromogen signal we used the "Color range" option of the Adobe Photoshop program, which provides a specific file for a particular chromogen selection that can be applied to similar sections. The measurement of the chromogen signal strength of the specific staining is achieved with the Scion Image software program. The method described in this paper can also be applied to the simultaneous detection of different signals on the same section or to different parameters (area of particles, number of particles, etc.) when the "Analyze particles" tool of the Scion program is used.
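    A rough open-source analogue of this workflow is sketched below using scikit-image: a hue/saturation "color range" selection of the chromogen, an integrated signal-strength measurement, and a connected-component step similar to "Analyze particles". This is only an illustrative sketch; the threshold values are assumptions, and the authors' actual Photoshop/Scion Image procedure is not reproduced here.

        # Illustrative scikit-image analogue of color-range selection and particle analysis.
        # Hue/saturation thresholds are placeholders; an RGB input image is assumed.
        import numpy as np
        from skimage import io, color, measure

        def quantify_chromogen(image_path, hue_range=(0.05, 0.15), min_saturation=0.3):
            rgb = io.imread(image_path)[..., :3] / 255.0
            hsv = color.rgb2hsv(rgb)
            # "Color range"-style selection of chromogen-colored pixels
            mask = ((hsv[..., 0] >= hue_range[0]) & (hsv[..., 0] <= hue_range[1])
                    & (hsv[..., 1] >= min_saturation))
            # integrated signal strength: darker selected pixels contribute more
            signal = float((1.0 - hsv[..., 2])[mask].sum())
            # "Analyze particles"-style measurement of the selected regions
            regions = measure.regionprops(measure.label(mask))
            return {"signal": signal, "n_particles": len(regions),
                    "areas": [r.area for r in regions]}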

  15. A dynamic programming-based particle swarm optimization algorithm for an inventory management problem under uncertainty

    NASA Astrophysics Data System (ADS)

    Xu, Jiuping; Zeng, Ziqiang; Han, Bernard; Lei, Xiao

    2013-07-01

    This article presents a dynamic programming-based particle swarm optimization (DP-based PSO) algorithm for solving an inventory management problem for large-scale construction projects under a fuzzy random environment. By taking into account the purchasing behaviour and strategy under rules of international bidding, a multi-objective fuzzy random dynamic programming model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform fuzzy random parameters into fuzzy variables that are subsequently defuzzified by using an expected value operator with optimistic-pessimistic index. The iterative nature of the authors' model motivates them to develop a DP-based PSO algorithm. More specifically, their approach treats the state variables as hidden parameters. This in turn eliminates many redundant feasibility checks during initialization and particle updates at each iteration. Results and sensitivity analysis are presented to highlight the performance of the authors' optimization method, which is very effective as compared to the standard PSO algorithm.
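    For readers unfamiliar with the underlying optimizer, a minimal standard particle swarm optimizer is sketched below. It shows only the generic velocity/position update; the dynamic-programming state handling and the fuzzy random defuzzification that distinguish the authors' DP-based PSO are not reproduced here.

        # Minimal standard PSO sketch (not the authors' DP-based variant).
        import numpy as np

        def pso(objective, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0),
                w=0.7, c1=1.5, c2=1.5, seed=0):
            rng = np.random.default_rng(seed)
            lo, hi = bounds
            x = rng.uniform(lo, hi, (n_particles, dim))          # positions
            v = np.zeros_like(x)                                 # velocities
            pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
            gbest = pbest[pbest_val.argmin()].copy()

            for _ in range(iters):
                r1, r2 = rng.random(x.shape), rng.random(x.shape)
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
                x = np.clip(x + v, lo, hi)
                vals = np.array([objective(p) for p in x])
                better = vals < pbest_val
                pbest[better], pbest_val[better] = x[better], vals[better]
                gbest = pbest[pbest_val.argmin()].copy()
            return gbest, float(pbest_val.min())

        # usage: minimize a simple sphere function in 5 dimensions
        best_x, best_f = pso(lambda p: float(np.sum(p ** 2)), dim=5)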

  16. Far-field Lorenz-Mie scattering in an absorbing host medium: Theoretical formalism and FORTRAN program

    NASA Astrophysics Data System (ADS)

    Mishchenko, Michael I.; Yang, Ping

    2018-01-01

    In this paper we make practical use of the recently developed first-principles approach to electromagnetic scattering by particles immersed in an unbounded absorbing host medium. Specifically, we introduce an actual computational tool for the calculation of pertinent far-field optical observables in the context of the classical Lorenz-Mie theory. The paper summarizes the relevant theoretical formalism, explains various aspects of the corresponding numerical algorithm, specifies the input and output parameters of a FORTRAN program available at https://www.giss.nasa.gov/staff/mmishchenko/Lorenz-Mie.html, and tabulates benchmark results useful for testing purposes. This public-domain FORTRAN program enables one to solve the following two important problems: (i) simulate theoretically the reading of a remote well-collimated radiometer measuring electromagnetic scattering by an individual spherical particle or a small random group of spherical particles; and (ii) compute the single-scattering parameters that enter the vector radiative transfer equation derived directly from the Maxwell equations.

  17. PYTHIA 6.4 Physics and Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sjostrand, Torbjorn; /Lund U., Dept. Theor. Phys.; Mrenna, Stephen

    2006-03-01

    The Pythia program can be used to generate high-energy-physics "events", i.e. sets of outgoing particles produced in the interactions between two incoming particles. The objective is to provide as accurate as possible a representation of event properties in a wide range of reactions, within and beyond the Standard Model, with emphasis on those where strong interactions play a role, directly or indirectly, and therefore multihadronic final states are produced. The physics is then not understood well enough to give an exact description; instead the program has to be based on a combination of analytical results and various QCD-based models. This physics input is summarized here, for areas such as hard subprocesses, initial- and final-state parton showers, underlying events and beam remnants, fragmentation and decays, and much more. Furthermore, extensive information is provided on all program elements: subroutines and functions, switches and parameters, and particle and process data. This should allow the user to tailor the generation task to the topics of interest.

  18. A review of electron bombardment thruster systems/spacecraft field and particle interfaces

    NASA Technical Reports Server (NTRS)

    Byers, D. C.

    1978-01-01

    Information on the field and particle interfaces of electron bombardment ion thruster systems was summarized. Major areas discussed were the nonpropellant particles, neutral propellant, ion beam, low energy plasma, and fields. Spacecraft functions and subsystems reviewed were solar arrays, thermal control systems, optical sensors, communications, science, structures and materials, and potential control.

  19. Influence of Poly (Ethylene Glycol) and Oleylamine on the Formation of Nano to Micron Size Spherical SiO2 Particles

    EPA Science Inventory

    We report an eco-friendly synthesis of well–controlled, nano-to-micron-size, spherical SiO2 particles using non-hazardous solvent and a byproducts-producing system. It was found that the morphology and size of spherical SiO2 particles are controlled by adjusting the concentration...

  20. Controlled and tunable polymer particles' production using a single microfluidic device

    NASA Astrophysics Data System (ADS)

    Amoyav, Benzion; Benny, Ofra

    2018-04-01

    Microfluidics technology offers a new platform to control liquids under flow in small volumes. The advantage of using small-scale reactions for droplet generation, along with the capacity to control the preparation parameters, makes microfluidic chips an attractive technology for optimizing encapsulation formulations. However, one of the drawbacks of this methodology is the difficulty of obtaining a wide range of droplet sizes, from sub-micron to microns, with a single chip design. In fact, droplet chips are typically used for micron-dimension particles, while nanoparticle synthesis requires complex chip designs (i.e., microreactors and staggered herringbone micromixers). Here, we introduce the development of a highly tunable and controlled encapsulation technique, using two polymer compositions, for generating particles ranging from micron to nano size with the same simple, single microfluidic chip design. Poly(lactic-co-glycolic acid) (PLGA 50:50) or PLGA/polyethylene glycol polymeric particles were prepared with a flow-focusing chip, yielding monodisperse particle batches. We show that by varying the flow rate, solvent, surfactant and polymer composition, we were able to optimize particle size and decrease the polydispersity index, using simple chip designs with no further related adjustments or costs. This platform, which offers tight tuning of particle properties, could be an important tool for formulation development and can potentially pave the way towards better precision nanomedicine.

  1. Final safety analysis report for the Ground Test Accelerator (GTA), Phase 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1994-10-01

    This document is the second volume of a 3-volume safety analysis report on the Ground Test Accelerator (GTA). The GTA program at the Los Alamos National Laboratory (LANL) is the major element of the national Neutral Particle Beam (NPB) program, which is supported by the Strategic Defense Initiative Office (SDIO). A principal goal of the national NPB program is to assess the feasibility of using hydrogen and deuterium neutral particle beams outside the Earth's atmosphere. The main effort of the NPB program at Los Alamos concentrates on developing the GTA. The GTA is classified as a low-hazard facility, except for the cryogenic-cooling system, which is classified as a moderate-hazard facility. This volume consists of failure modes and effects analysis; accident analysis; operational safety requirements; quality assurance program; ES&H management program; environmental, safety, and health systems critical to safety; summary of waste-management program; environmental monitoring program; facility expansion, decontamination, and decommissioning; summary of emergency response plan; summary plan for employee training; summary plan for operating procedures; glossary; and appendices A and B.

  2. Characteristics of individual particles in Beijing before, during and after the 2014 APEC meeting

    NASA Astrophysics Data System (ADS)

    Xu, Zhongjun; Shan, Wei; Qi, Tao; Gao, Jian

    2018-05-01

    To understand the characteristics of individual aerosol particles, as well as the effects of emission control measures on air quality in Beijing before, during and after the 2014 APEC meeting, aerosol samples collected in Beijing from Oct. 8 to Nov. 24 were investigated by scanning electron microscopy (SEM) coupled with energy-dispersive X-ray spectroscopy (EDX). Individual particles were classified into fly ash, ammonium sulfate, carbonaceous particles, tar balls, soot aggregates, Fe/Ti oxides, Ca/Mg carbonates, calcium sulfate and aluminosilicates/quartz. The results showed that PM0.5-1.0 was the most abundant size fraction of the aerosol particles, while PM2.5-10 was the least abundant. Soot aggregates and carbonaceous particles were mainly located in the size range of 0.5-2.5 μm, and mineral particles were dominant in the size range of 2.5-10 μm. The tough emission control measures taken by the local government greatly improved the air quality. Reducing vehicles on the roads substantially decreased the amount of soot aggregates, and restricting coal combustion decreased the amount of tar balls during the APEC meeting. The concentrations of carbonaceous and mineral particles also decreased during the APEC meeting, probably owing to the control of VOC emissions and to water spraying and the suspension of demolition work, respectively.

  3. Modulation of Spatiotemporal Particle Patterning in Evaporating Droplets: Applications to Diagnostics and Materials Science.

    PubMed

    Guha, Rajarshi; Mohajerani, Farzad; Mukhopadhyay, Ahana; Collins, Matthew D; Sen, Ayusman; Velegol, Darrell

    2017-12-13

    Spatiotemporal particle patterning in evaporating droplets lacks a common design framework. Here, we demonstrate autonomous control of particle distribution in evaporating droplets through the imposition of a salt-induced self-generated electric field as a generalized patterning strategy. Through modeling, a new dimensionless number, termed "capillary-phoresis" (CP) number, arises, which determines the relative contributions of electrokinetic and convective transport to pattern formation, enabling one to accurately predict the mode of particle assembly by controlling the spontaneous electric field and surface potentials. Modulation of the CP number allows the particles to be focused in a specific region in space or distributed evenly. Moreover, starting with a mixture of two different particle types, their relative placement in the ensuing pattern can be controlled, allowing coassemblies of multiple, distinct particle populations. By this approach, hypermethylated DNA, prevalent in cancerous cells, can be qualitatively distinguished from normal DNA of comparable molecular weights. In other examples, we show uniform dispersion of several particle types (polymeric colloids, multiwalled carbon nanotubes, and molecular dyes) on different substrates (metallic Cu, metal oxide, and flexible polymer), as dictated by the CP number. Depending on the particle, the highly uniform distribution leads to surfaces with a lower sheet resistance, as well as superior dye-printed displays.

  4. Control of both particle and pore size in nanoporous palladium alloy powders

    DOE PAGES

    Jones, Christopher G.; Cappillino, Patrick J.; Stavila, Vitalie; ...

    2014-07-15

    Energy storage materials often involve chemical reactions with bulk solids. Porosity within the solids can enhance reaction rates. The porosity can be either within or between individual particles of the material. Greater control of the size and uniformity of both types of pore should lead to enhancements of charging and discharging rates in energy storage systems. To control both particle and pore size in nanoporous palladium (Pd)-based hydrogen storage materials, we first created uniformly sized copper particles of about 1 μm diameter by the reduction of copper sulfate with ascorbic acid. In turn, these were used as reducing agents for tetrachloropalladate in the presence of a block copolymer surfactant. The copper reductant particles are geometrically self-limiting, so the resulting Pd particles are of similar size. The surfactant induces formation of 10 nm-scale pores within the particles. Some residual copper is alloyed with the Pd, reducing hydrogen storage capacity; use of a more reactive Pd salt can mitigate this. The reaction is conveniently performed in gram-scale batches.

  5. Contact Electrification of Individual Dielectric Microparticles Measured by Optical Tweezers in Air.

    PubMed

    Park, Haesung; LeBrun, Thomas W

    2016-12-21

    We measure charging of single dielectric microparticles after interaction with a glass substrate using optical tweezers to control the particle, measure its charge with a sensitivity of a few electrons, and precisely contact the particle with the substrate. Polystyrene (PS) microparticles adhered to the substrate can be selected based on size, shape, or optical properties and repeatedly loaded into the optical trap using a piezoelectric (PZT) transducer. Separation from the substrate leads to charge transfer through contact electrification. The charge on the trapped microparticles is measured from the response of the particle motion to a step excitation of a uniform electric field. The particle is then placed onto a target location of the substrate in a controlled manner. Thus, the triboelectric charging profile of the selected PS microparticle can be measured and controlled through repeated cycles of trap loading followed by charge measurement. Reversible optical trap loading and manipulation of the selected particle leads to new capabilities to study and control successive and small changes in surface interactions.
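    Schematically, if the optical trap is treated as a harmonic potential of stiffness k, a step field E displaces the trapped particle to a new equilibrium offset Δx, and the charge follows from the force balance

        \[
          q\,E = k\,\Delta x \quad\Longrightarrow\quad q = \frac{k\,\Delta x}{E}.
        \]

    This is only a simplified static picture offered for orientation; the authors extract the charge from the full dynamic response of the particle motion to the step excitation, which is not reproduced here.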

  6. Hardware and software systems for the determination of charged particle parameters in low pressure plasmas using impedance-tuned Langmuir probes

    NASA Astrophysics Data System (ADS)

    Ye, Yuancai; Marcus, R. Kenneth

    1997-12-01

    A computer-controlled, impedance-tuned Langmuir probe data acquisition system and processing software package have been designed for the diagnostic study of low pressure plasmas. The combination of impedance tuning and a wide range of applied potentials (±100 V) provides a versatile system, applicable to a variety of analytical plasmas without significant modification. The automated probe system can be used to produce complete and undistorted current-voltage (i-V) curves with extremely low noise over the wide potential range. Based on these hardware and software systems, it is possible to determine all of the important charged particle parameters in a plasma: electron number density (ne), ion number density (ni), electron temperature (Te), electron energy distribution function (EEDF), and average electron energy (<ɛ>). The complete data acquisition system and evaluation software are described in detail. A LabView (National Instruments Corporation, Austin, TX) application program has been developed for the Apple Macintosh line of microcomputers to control all of the operational aspects of the Langmuir probe experiments. The description here focuses mainly on design aspects of the acquisition system aimed at achieving extremely low noise and at reducing the influence of measurement noise in the calculation procedures. This is particularly important in the case of electron energy distribution functions, where multiple derivatives are calculated from the obtained i-V curves. A separate C-language data processing program has been developed and is included here to allow the reader to evaluate data obtained with the described hardware, or any i-V data imported in tab-separated-value format. Both of the software systems are included on a Macintosh-formatted disk for use in other laboratories desiring these capabilities.
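    For context, the EEDF is typically obtained from the second derivative of the electron current via the standard Druyvesteyn relation (quoted here only for reference; the paper's noise-reduction procedures are what make this derivative practical):

        \[
          g_e(\varepsilon) = \frac{2 m_e}{e^{2} A_p}\,
          \sqrt{\frac{2\varepsilon}{m_e}}\;
          \left.\frac{d^{2} I_e}{dV^{2}}\right|_{\varepsilon = e(V_p - V)},
        \]

    where A_p is the probe area, V the probe potential, and V_p the plasma potential; ne and <ɛ> then follow as the zeroth and normalized first energy moments of g_e.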

  7. The MAVEN Magnetic Field Investigation

    NASA Technical Reports Server (NTRS)

    Connerney, J. E. P.; Espley, J.; Lawton, P.; Murphy, S.; Odom, J.; Oliversen, R.; Sheppard, D.

    2014-01-01

    The MAVEN magnetic field investigation is part of a comprehensive particles and fields subsystem that will measure the magnetic and electric fields and plasma environment of Mars and its interaction with the solar wind. The magnetic field instrumentation consists of two independent tri-axial fluxgate magnetometer sensors, remotely mounted at the outer extremity of the two solar arrays on small extensions ("boomlets"). The sensors are controlled by independent and functionally identical electronics assemblies that are integrated within the particles and fields subsystem and draw their power from redundant power supplies within that system. Each magnetometer measures the ambient vector magnetic field over a wide dynamic range (to 65,536 nT per axis) with a quantization uncertainty of 0.008 nT in the most sensitive dynamic range and an accuracy of better than 0.05%. Both magnetometers sample the ambient magnetic field at an intrinsic sample rate of 32 vector samples per second. Telemetry is transferred from each magnetometer to the particles and fields package once per second and subsequently passed to the spacecraft after some reformatting. The magnetic field data volume may be reduced by averaging and decimation, when necessary to meet telemetry allocations, and application of data compression, utilizing a lossless 8-bit differencing scheme. The MAVEN magnetic field experiment may be reconfigured in flight to meet unanticipated needs and is fully hardware redundant. A spacecraft magnetic control program was implemented to provide a magnetically clean environment for the magnetic sensors and the MAVEN mission plan provides for occasional spacecraft maneuvers - multiple rotations about the spacecraft x and z axes - to characterize spacecraft fields and/or instrument offsets in flight.
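    The lossless 8-bit differencing mentioned above can be illustrated with a simple delta-encoding sketch; the escape mechanism and packet layout here are assumptions and do not represent the actual MAVEN flight software.

        # Illustrative lossless differencing: store 8-bit deltas when they fit,
        # otherwise fall back to the full 16-bit sample (details are assumptions).
        import numpy as np

        def diff_encode(samples):
            diffs = np.diff(samples.astype(np.int32), prepend=0)
            return [("d8", int(d)) if -127 <= d <= 127 else ("raw16", int(s))
                    for d, s in zip(diffs, samples)]

        def diff_decode(tokens):
            out, prev = [], 0
            for kind, val in tokens:
                prev = prev + val if kind == "d8" else val
                out.append(prev)
            return np.array(out, dtype=np.int16)

        # round-trip check on synthetic field samples
        b = np.array([1200, 1201, 1199, 1350, 1351], dtype=np.int16)
        assert np.array_equal(diff_decode(diff_encode(b)), b)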

  8. The MAVEN Magnetic Field Investigation

    NASA Astrophysics Data System (ADS)

    Connerney, J. E. P.; Espley, J.; Lawton, P.; Murphy, S.; Odom, J.; Oliversen, R.; Sheppard, D.

    2015-12-01

    The MAVEN magnetic field investigation is part of a comprehensive particles and fields subsystem that will measure the magnetic and electric fields and plasma environment of Mars and its interaction with the solar wind. The magnetic field instrumentation consists of two independent tri-axial fluxgate magnetometer sensors, remotely mounted at the outer extremity of the two solar arrays on small extensions ("boomlets"). The sensors are controlled by independent and functionally identical electronics assemblies that are integrated within the particles and fields subsystem and draw their power from redundant power supplies within that system. Each magnetometer measures the ambient vector magnetic field over a wide dynamic range (to 65,536 nT per axis) with a resolution of 0.008 nT in the most sensitive dynamic range and an accuracy of better than 0.05 %. Both magnetometers sample the ambient magnetic field at an intrinsic sample rate of 32 vector samples per second. Telemetry is transferred from each magnetometer to the particles and fields package once per second and subsequently passed to the spacecraft after some reformatting. The magnetic field data volume may be reduced by averaging and decimation, when necessary to meet telemetry allocations, and application of data compression, utilizing a lossless 8-bit differencing scheme. The MAVEN magnetic field experiment may be reconfigured in flight to meet unanticipated needs and is fully hardware redundant. A spacecraft magnetic control program was implemented to provide a magnetically clean environment for the magnetic sensors and the MAVEN mission plan provides for occasional spacecraft maneuvers—multiple rotations about the spacecraft x and z axes—to characterize spacecraft fields and/or instrument offsets in flight.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bardeen, Marjorie G.; /Fermilab; Johansson, K.Erik

    This review summarizes exemplary secondary education and outreach programs of the particle physics community. We examine programs from the following areas: research experiences, high-energy physics data for students, informal learning for students, instructional resources, and professional development. We report findings about these programs' impact on students and teachers and provide suggestions for practices that create effective programs from those findings. We also include some methods for assessing programs.

  10. Cosmic Radiation Detection and Observations

    NASA Astrophysics Data System (ADS)

    Ramirez Chavez, Juan; Troncoso, Maria

    Cosmic rays consist of high-energy particles, accelerated in remote supernova remnant explosions, that travel vast distances throughout the universe. Upon arriving at Earth, the majority of these particles ionize gases in the upper atmosphere, while others interact with gas molecules in the troposphere, producing secondary cosmic rays, which are the main focus of this research. To observe these secondary cosmic rays, a detector telescope was designed and equipped with two silicon photomultipliers (SiPMs). Each SiPM is coupled to a bundle of 4 wavelength-shifting optical fibers that are embedded inside a plastic scintillator sheet. The SiPM signals were amplified using a fast preamplifier, with coincidence between detectors established using a binary logic gate. The coincidence events were recorded with two devices: a digital counter and an Arduino microcontroller. For detailed analysis of the SiPM waveforms, a DRS4 digitizer captured the waveforms for offline analysis with the CERN software package Physics Analysis Workstation in a Linux environment. Results from our experiments will be presented. Hartnell College STEM Internship Program.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Under contract with the US Department of Energy (DE-AC22-92PCO0367), Pittsburgh Energy Technology Center, Radian Corporation has conducted a test program to collect and analyze size-fractionated stack gas particulate samples for selected inorganic hazardous air pollutants (HAPS). Specific goals of the program are (1) the collection of one-gram quantities of size-fractionated stack gas particulate matter for bulk (total) and surface chemical characterization, and (2) the determination of the relationship between particle size, bulk and surface (leachable) composition, and unit load. The information obtained from this program identifies the effects of unit load, particle size, and wet FGD system operation on the relative toxicological effects of exposure to particulate emissions.

  12. TURTLE with MAD input (Trace Unlimited Rays Through Lumped Elements) -- A computer program for simulating charged particle beam transport systems and DECAY TURTLE including decay calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carey, D.C.

    1999-12-09

    TURTLE is a computer program useful for determining many characteristics of a particle beam once an initial design has been achieved. Charged particle beams are usually designed by adjusting various beam line parameters to obtain desired values of certain elements of a transfer or beam matrix. Such beam line parameters may describe certain magnetic fields and their gradients, lengths and shapes of magnets, spacings between magnetic elements, or the initial beam accepted into the system. For such purposes one typically employs a matrix multiplication and fitting program such as TRANSPORT. TURTLE is designed to be used after TRANSPORT. For convenience of the user, the input formats of the two programs have been made compatible. The use of TURTLE should be restricted to beams with small phase space. The lumped element approximation, described below, precludes the inclusion of the effect of conventional local geometric aberrations (due to large phase space) of fourth and higher order. A reading of the discussion below will indicate clearly the exact uses and limitations of the approach taken in TURTLE.

  13. Aerobiology and Its Role in the Transmission of Infectious Diseases

    PubMed Central

    Fernstrom, Aaron; Goldblatt, Michael

    2013-01-01

    Aerobiology plays a fundamental role in the transmission of infectious diseases. As infectious disease and infection control practitioners continue employing contemporary techniques (e.g., computational fluid dynamics to study particle flow, polymerase chain reaction methodologies to quantify particle concentrations in various settings, and epidemiology to track the spread of disease), the central variables affecting the airborne transmission of pathogens are becoming better known. This paper reviews many of these aerobiological variables (e.g., particle size, particle type, the duration that particles can remain airborne, the distance that particles can travel, and meteorological and environmental factors), as well as the common origins of these infectious particles. We then review several real-world settings with known difficulties controlling the airborne transmission of infectious particles (e.g., office buildings, healthcare facilities, and commercial airplanes), while detailing the respective measures each of these industries is undertaking in its effort to ameliorate the transmission of airborne infectious diseases. PMID:23365758

  14. Oxidation property of SiO2-supported small nickel particle prepared by the sol-gel method

    NASA Astrophysics Data System (ADS)

    Yamamoto, Y.; Yamashita, S.; Afiza, N.; Katayama, M.; Inada, Y.

    2016-05-01

    The oxidation property of small SiO2-supported Ni particles has been studied by means of the in-situ XAFS method. Ni particles with an average diameter of 4 nm supported on SiO2 were prepared by the sol-gel method. The XANES spectrum of the small metallic Ni particles was clearly different from that of bulk Ni. Exposure to diluted O2 gas at room temperature promoted surface oxidation of the Ni(0) particles. During the temperature-programmed oxidation process, the supported Ni(0) particles were quantitatively oxidized to NiO, and the oxidation temperature was lower by ca. 200 °C than that of SiO2-supported Ni particles with a larger particle radius of 17 nm prepared by the impregnation method.

  15. Magnetic Control of Lateral Migration of Ellipsoidal Microparticles in Microscale Flows

    NASA Astrophysics Data System (ADS)

    Zhou, Ran; Sobecki, Christopher A.; Zhang, Jie; Zhang, Yanzhi; Wang, Cheng

    2017-08-01

    Precise manipulations of nonspherical microparticles by shape have diverse applications in biology and biomedical engineering. Here, we study lateral migration of ellipsoidal paramagnetic microparticles in low-Reynolds-number flows under uniform magnetic fields. We show that magnetically induced torque alters the rotation dynamics of the particle and results in shape-dependent lateral migration. By adjusting the direction of the magnetic field, we demonstrate versatile control of the symmetric and asymmetric rotation of the particles, thereby controlling the direction of the particle's lateral migration. The particle rotations are experimentally measured, and their symmetry or asymmetry characteristics agree well with the prediction from a simple theory. The lateral migration mechanism is found to be valid for nonmagnetic particles suspended in a ferrofluid. Finally, we demonstrate shape-based sorting of microparticles by exploiting the proposed migration mechanism.

  16. Direct fabrication of gas diffusion cathode by pulse electrodeposition for proton exchange membrane water electrolysis

    NASA Astrophysics Data System (ADS)

    Park, Hyanjoo; Choe, Seunghoe; Kim, Hoyoung; Kim, Dong-Kwon; Cho, GeonHee; Park, YoonSu; Jang, Jong Hyun; Ha, Don-Hyung; Ahn, Sang Hyun; Kim, Soo-Kil

    2018-06-01

    Pt catalysts for water electrolysis were prepared on carbon paper by using both direct-current and pulse electrodeposition. Controlling the mass transfer of the Pt precursor in the electrolyte by varying the deposition potential enables the formation of various Pt particle shapes, such as flower-like and polyhedral particles. Further control of the deposition parameters for pulse electrodeposition resulted in changes to the particle size and density. In particular, the upper potential of the pulse was found to be the critical parameter controlling the morphology of the particles and their catalytic activity. In addition to the typical electrochemical measurements, Pt samples deposited on carbon paper were used as cathodes for a proton exchange membrane water electrolyser. This single-cell test revealed that our Pt particle samples have exceptional mass activity while being cost-effective.

  17. Towards reproducible experimental studies for non-convex polyhedral shaped particles

    NASA Astrophysics Data System (ADS)

    Wilke, Daniel N.; Pizette, Patrick; Govender, Nicolin; Abriak, Nor-Edine

    2017-06-01

    The packing density and flat-bottomed hopper discharge of non-convex polyhedral particles are investigated in a systematic experimental study. The motivation for this study is two-fold. Firstly, to establish an approach to deliver quality experimental particle packing data for non-convex polyhedral particles that can be used for characterization and validation purposes of discrete element codes. Secondly, to make the reproducibility of experimental setups as convenient and readily available as possible using affordable and accessible technology. The primary technology for this study is fused deposition modeling, used to 3D print polylactic acid (PLA) particles with readily available 3D printer technology. A total of 8000 biodegradable particles were printed: 1000 white particles and 1000 black particles for each of the four particle types considered in this study. Reproducibility is one benefit of using fused deposition modeling to print particles, but an extremely important additional benefit is that specific particle properties can be explicitly controlled. As an example, in this study the volume fraction of each particle can be controlled, i.e. the effective particle density can be adjusted. In this study the particle volume decreases drastically as the non-convexity is increased; however, all printed white particles have the same mass to within 2% of each other.

  18. The effects of particle loading on turbulence structure and modelling

    NASA Technical Reports Server (NTRS)

    Squires, Kyle D.; Eaton, J. K.

    1989-01-01

    The objective of the present research was to extend the Direct Numerical Simulation (DNS) approach to particle-laden turbulent flows using a simple model of particle/flow interaction. The program addressed the simplest type of flow, homogeneous, isotropic turbulence, and examined interactions between the particles and the gas-phase turbulence. The specific range of problems examined includes those in which the particle is much smaller than the smallest length scales of the turbulence yet heavy enough to slip relative to the flow. The particle mass loading is large enough to have a significant impact on the turbulence, while the volume loading was small enough that particle-particle interactions could be neglected. Therefore, these simulations are relevant to practical problems involving small, dense particles conveyed by turbulent gas flows at moderate loadings. A sample of the results illustrating modifications of the particle concentration field caused by the turbulence structure is presented, and attenuation of turbulence by the particle cloud is also illustrated.
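    The "small, heavy particle" regime described here is the one in which the Stokes-drag point-particle model is commonly used (quoted for context; the cited simulations may include additional force terms):

        \[
          \frac{d\mathbf{v}_p}{dt} = \frac{\mathbf{u}(\mathbf{x}_p,t) - \mathbf{v}_p}{\tau_p},
          \qquad
          \tau_p = \frac{\rho_p d_p^{2}}{18\,\mu},
        \]

    where u is the fluid velocity at the particle position, v_p the particle velocity, and τ_p the particle response time; the ratio of τ_p to a turbulence time scale (the Stokes number) controls how strongly the particles slip relative to the flow and how they concentrate preferentially within the turbulence structure.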

  19. Transport of Particle Swarms Through Variable Aperture Fractures

    NASA Astrophysics Data System (ADS)

    Boomsma, E.; Pyrak-Nolte, L. J.

    2012-12-01

    Particle transport through fractured rock is a key concern with the increased use of micro- and nano-size particles in consumer products as well as from other activities in the sub- and near-surface (e.g. mining, industrial waste, hydraulic fracturing, etc.). While particle transport is often studied as the transport of emulsions or dispersions, particles may also enter the subsurface from leaks or seepage that lead to particle swarms. Swarms are drop-like collections of millions of colloidal-sized particles that exhibit a number of unique characteristics when compared to dispersions and emulsions. Any contaminant or engineered particle that forms a swarm can be transported farther, faster, and more cohesively in fractures than would be expected from a traditional dispersion model. In this study, the effects of several variable aperture fractures on colloidal swarm cohesiveness and evolution were studied as a swarm fell under gravity and interacted with the fracture walls. Transparent acrylic was used to fabricate synthetic fracture samples with (1) a uniform aperture, (2) a converging region followed by a uniform region (funnel shaped), (3) a uniform region followed by a diverging region (inverted funnel), and (4) a cast of an induced fracture from a carbonate rock. All of the samples consisted of two blocks that measured 100 x 100 x 50 mm. The minimum separation between these blocks determined the nominal aperture (0.5 mm to 20 mm). During experiments a fracture was fully submerged in water and swarms were released into it. The swarms consisted of a dilute suspension of 3-micron polystyrene fluorescent beads (1% by mass) with an initial volume of 5 μL. The swarms were illuminated with a green (525 nm) LED array and imaged optically with a CCD camera. The variation in fracture aperture controlled swarm behavior. Diverging apertures caused a sudden loss of confinement that resulted in a rapid change in the swarm's shape as well as a sharp increase in its velocity. Converging apertures caused swarms to decelerate rapidly and become trapped at the transition point between the converging and parallel regions for apertures less than 2.5 mm. In uniform aperture fractures, an optimal aperture range (5 mm to 15 mm) exists where swarm velocity was higher and the swarm maintained cohesion over a longer distance. For apertures below this range the swarms were strongly slowed due to drag from the wall, while for larger apertures the swarm velocity approached an asymptote due to the loss of the walls' influence. The transport of particle swarms in fractures is strongly controlled by aperture distribution. While drag from the fracture does slow swarms, especially at small apertures, much of the interesting behavior (shape changes in diverging fractures, an optimal aperture in parallel fractures) is best explained by fracture-induced preferential confinement that controls the evolution of the swarm. When this confinement is suddenly changed, the swarm responds quickly and dramatically to its new environment. This has important implications for the understanding of contaminant dispersal in subsurface fracture networks, because the type of aperture variation can exert a strong influence on particle swarm transport. Acknowledgment: The authors wish to acknowledge support of this work by the Geosciences Research Program, Office of Basic Energy Sciences, US Department of Energy (DE-FG02-09ER16022).

  20. A Comprehensive Program for Measurements of Military Aircraft Emissions

    DTIC Science & Technology

    2009-11-30

    gaseous measurement, but the same techniques could not be extended directly to ultrafine particles found in all engine exhausts. The results validated...emission measurement. Furthermore, ultrafine particles (defined as those with a diameter less than or equal to 100 nm, or 0.1 µm) are the dominant...instruments that are capable of real-time or continuous measurement of various properties of ultrafine particles in laboratory and field conditions. Some of

  1. Department of Defense Enhanced Particulate Matter Surveillance Program (EPMSP)

    DTIC Science & Technology

    2008-02-01

    on Teflon® membrane, 23,807 on quartz fiber, and several million single particle analyses on Nuclepore® filters. Analytical results were...Nuclepore® filters, the sampling period was two hours, so as to provide lightly loaded filters with dispersed single particles, as required for CCSEM...membrane, 23,807 on quartz fiber, and several million single particle analyses on Nuclepore®. All results, together with summary tables and more than

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durham, M.D.

    Several tasks have been completed in a program to evaluate additives to improve fine particle collection in electrostatic precipitators. Screening tests and laboratory evaluations of additives are summarized in this report. Over 20 additives were evaluated; four were found to improve flyash precipitation rates. The Insitec particle analyzer was also evaluated; test results show that the analyzer will provide accurate sizing and counting information for particles in the size range of ≤ 10 µm dia.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durham, M.D.

    Several tasks have been completed in a program to evaluate additives to improve fine particle collection in electrostatic precipitators. Screening tests and laboratory evaluations of additives are summarized in this report. Over 20 additives were evaluated; four were found to improve flyash precipitation rates. The Insitec particle analyzer was also evaluated; test results show that the analyzer will provide accurate sizing and counting information for particles in the size range of ≤ 10 µm dia.

  4. Interstellar Gas Experiment (IGE): Testing interstellar gas particles to provide information on the processes of nucleosynthesis in the big bang, stars, and supernovae

    NASA Technical Reports Server (NTRS)

    Lind, Don

    1985-01-01

    The Interstellar Gas Experiment (IGE) is designed to collect particles of the interstellar gas - a wind of interstellar-medium particles moving in the vicinity of the solar system. These particles will be returned to Earth, where the isotopic ratios of the noble gases among these particles will be measured. IGE was designed and programmed to expose 7 sets of six copper-beryllium metallic collecting foils to the flux of neutral interstellar gas particles which penetrate the heliosphere to the vicinity of the Earth's orbit. These particles are trapped in the collecting foils and will be returned to Earth for mass-spectrographic analysis when the Long Duration Exposure Facility (LDEF), on which IGE was launched, is recovered.

  5. Full-Color Biomimetic Photonic Materials with Iridescent and Non-Iridescent Structural Colors

    PubMed Central

    Kawamura, Ayaka; Kohri, Michinari; Morimoto, Gen; Nannichi, Yuri; Taniguchi, Tatsuo; Kishikawa, Keiki

    2016-01-01

    The beautiful structural colors in bird feathers are some of the brightest colors in nature, and some of these colors are created by arrays of melanin granules that act as both structural colors and scattering absorbers. Inspired by the color of bird feathers, high-visibility structural colors have been created by altering four variables: size, blackness, refractive index, and arrangement of the nano-elements. To control these four variables, we developed a facile method for the preparation of biomimetic core-shell particles with melanin-like polydopamine (PDA) shell layers. The size of the core-shell particles was controlled by adjusting the core polystyrene (PSt) particles’ diameter and the PDA shell thicknesses. The blackness and refractive index of the colloidal particles could be adjusted by controlling the thickness of the PDA shell. The arrangement of the particles was controlled by adjusting the surface roughness of the core-shell particles. This method enabled the production of both iridescent and non-iridescent structural colors from only one component. This simple and novel process of using core-shell particles containing PDA shell layers can be used in basic research on structural colors in nature and their practical applications. PMID:27658446

  6. Probabilistic Teleportation of an Arbitrary Three-Level Two-Particle State and Classical Communication Cost

    NASA Astrophysics Data System (ADS)

    Dai, Hong-Yi; Kuang, Le-Man; Li, Cheng-Zu

    2005-07-01

    We propose a scheme to probabilistically teleport an unknown arbitrary three-level two-particle state by using two partially entangled three-level two-particle states as the quantum channel. The classical communication cost required in the ideal probabilistic teleportation process is also calculated. This scheme can be directly generalized to teleport an unknown arbitrary three-level K-particle state by using K partially entangled three-level two-particle states as the quantum channel. The project was supported by the National Fundamental Research Program of China under Grant No. 2001CB309310 and the National Natural Science Foundation of China under Grant Nos. 10404039 and 10325523.

  7. Thixotropic particle suspensions and method for their formation

    DOEpatents

    Garino, T.J.

    1997-06-17

    Thixotropic particle suspensions are prepared by controlling the quantity of dispersant composition used for particle coating to an amount which is less than that quantity that would provide a full coating of dispersant on all particles suspended. 5 figs.

  8. 40 CFR 53.40 - General provisions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 50 percent cutpoint of a test sampler shall be determined in a wind tunnel using 10 particle sizes... particle sampling effectiveness of a test sampler shall be determined in a wind tunnel using 25 µm... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) AMBIENT AIR...

  9. 40 CFR 53.40 - General provisions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 50 percent cutpoint of a test sampler shall be determined in a wind tunnel using 10 particle sizes... particle sampling effectiveness of a test sampler shall be determined in a wind tunnel using 25 µm... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) AMBIENT AIR...

  10. 40 CFR 53.40 - General provisions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 50 percent cutpoint of a test sampler shall be determined in a wind tunnel using 10 particle sizes... particle sampling effectiveness of a test sampler shall be determined in a wind tunnel using 25 µm... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) AMBIENT AIR...

  11. A New Cluster Analysis-Marker-Controlled Watershed Method for Separating Particles of Granular Soils.

    PubMed

    Alam, Md Ferdous; Haque, Asadul

    2017-10-18

    An accurate determination of the particle-level fabric of granular soils from tomography data requires the maximum possible correct separation of particles. The popular marker-controlled watershed separation method is widely used to separate particles. However, the watershed method alone is not capable of producing the maximum separation of particles when the particles have been subjected to boundary stresses that lead to crushing. In this paper, a new separation method, named the Monash Particle Separation Method (MPSM), has been introduced. The new method automatically determines the optimal contrast coefficient, based on a cluster evaluation framework, to produce the most accurate separation outcomes. Finally, the particles which could not be separated with the optimal contrast coefficient were separated by integrating cuboid markers generated from clustering by Gaussian mixture models into the routine watershed method. The MPSM was validated on a uniformly graded sand volume subjected to one-dimensional compression loading up to 32 MPa. It was demonstrated that the MPSM is capable of producing the best possible separation of particles required for fabric analysis.
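    The routine marker-controlled watershed step that the MPSM builds on can be sketched as follows with scikit-image; this is only the baseline method, not the MPSM itself, and the marker-spacing parameter is an assumption.

        # Baseline marker-controlled watershed on a binary particle volume
        # (illustrative only; not the Monash Particle Separation Method).
        import numpy as np
        from scipy import ndimage as ndi
        from skimage.feature import peak_local_max
        from skimage.segmentation import watershed

        def separate_particles(binary_volume, min_distance=5):
            """Label touching particles via distance-transform markers and watershed."""
            distance = ndi.distance_transform_edt(binary_volume)
            peaks = peak_local_max(distance, min_distance=min_distance,
                                   labels=binary_volume.astype(np.int32))
            markers = np.zeros(binary_volume.shape, dtype=np.int32)
            markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
            return watershed(-distance, markers, mask=binary_volume)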

  12. ALPHACAL: A new user-friendly tool for the calibration of alpha-particle sources.

    PubMed

    Timón, A Fernández; Vargas, M Jurado; Gallardo, P Álvarez; Sánchez-Oro, J; Peralta, L

    2018-05-01

    In this work, we present and describe the program ALPHACAL, specifically developed for the calibration of alpha-particle sources. It is therefore more user-friendly and less time-consuming than multipurpose codes developed for a wide range of applications. The program is based on the recently developed code AlfaMC, which simulates specifically the transport of alpha particles. Both cylindrical and point sources mounted on the surface of polished backings can be simulated, as is the convention in experimental measurements of alpha-particle sources. In addition to the efficiency calculation and the determination of the backscattering coefficient, some additional tools are available to the user, such as visualization of the energy spectrum and the use of energy cut-offs or low-energy tail corrections. ALPHACAL has been implemented in C++ using the Qt library, so it is available for Windows, macOS and Linux platforms. It is free and can be provided upon request to the authors. Copyright © 2018 Elsevier Ltd. All rights reserved.
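    ALPHACAL and AlfaMC are not reproduced here; as a minimal illustration of the kind of efficiency calculation involved, the sketch below runs a toy Monte Carlo for the geometric counting efficiency of an on-axis point source facing a circular detector window, ignoring backscattering and energy loss. The geometry values are arbitrary assumptions, and the analytic solid-angle formula is included only as a cross-check.

```python
import numpy as np

rng = np.random.default_rng(0)

def geometric_efficiency(n_samples=1_000_000,
                         gap_mm=5.0, detector_radius_mm=10.0):
    """Toy Monte Carlo (not ALPHACAL/AlfaMC): fraction of alphas emitted
    isotropically by an on-axis point source that reach a coaxial circular
    detector window.  Backscattering from the backing and energy loss in the
    source-detector gap are ignored."""
    # Emission into the upper hemisphere only (the backing blocks the rest);
    # for an isotropic source, cos(theta) is sampled uniformly.
    cos_t = rng.uniform(0.0, 1.0, n_samples)
    sin_t = np.sqrt(1.0 - cos_t ** 2)
    # The track crosses the detector plane inside the window when
    # gap * tan(theta) <= R, written here without a division:
    hits = gap_mm * sin_t <= detector_radius_mm * cos_t
    # Express the efficiency relative to full 4*pi emission.
    return 0.5 * hits.mean()

mc = geometric_efficiency()
# Analytic solid-angle fraction for a point source on the axis of a disc.
analytic = 0.5 * (1.0 - 5.0 / np.sqrt(5.0 ** 2 + 10.0 ** 2))
print(f"Monte Carlo: {mc:.4f}   analytic: {analytic:.4f}")
```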

  13. GPU acceleration of particle-in-cell methods

    NASA Astrophysics Data System (ADS)

    Cowan, Benjamin; Cary, John; Meiser, Dominic

    2015-11-01

    Graphics processing units (GPUs) have become key components in many supercomputing systems, as they can provide more computations relative to their cost and power consumption than conventional processors. However, to take full advantage of this capability, they require a strict programming model which involves single-instruction multiple-data execution as well as significant constraints on memory accesses. To bring the full power of GPUs to bear on plasma physics problems, we must adapt the computational methods to this new programming model. We have developed a GPU implementation of the particle-in-cell (PIC) method, one of the mainstays of plasma physics simulation. This framework is highly general and enables advanced PIC features such as high order particles and absorbing boundary conditions. The main elements of the PIC loop, including field interpolation and particle deposition, are designed to optimize memory access. We describe the performance of these algorithms and discuss some of the methods used. Work supported by DARPA contract W31P4Q-15-C-0061 (SBIR).
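    A minimal CPU-side sketch of the two memory-bound PIC steps mentioned above (charge deposition and field gather), written with NumPy for clarity rather than in the GPU programming model discussed in the abstract; the 1-D periodic electrostatic setup and all parameters are illustrative assumptions.

```python
import numpy as np

# Toy 1-D electrostatic PIC fragment (CPU/NumPy, not the GPU implementation
# discussed above): linear "cloud-in-cell" charge deposition and field gather,
# the two memory-bound steps highlighted in the abstract.

n_cells, length = 64, 1.0
dx = length / n_cells
rng = np.random.default_rng(1)
x = rng.uniform(0.0, length, 10_000)          # particle positions
q = np.full(x.size, 1.0 / x.size)             # particle charges

# Deposition: scatter each particle's charge to its two nearest grid nodes.
cell = np.floor(x / dx).astype(int)
frac = x / dx - cell
rho = np.zeros(n_cells)
np.add.at(rho, cell % n_cells, q * (1.0 - frac))      # periodic boundaries
np.add.at(rho, (cell + 1) % n_cells, q * frac)
rho /= dx

# Field solve on the grid (periodic Poisson via FFT), included for completeness.
k = 2.0 * np.pi * np.fft.fftfreq(n_cells, d=dx)
rho_k = np.fft.fft(rho - rho.mean())
E_k = np.zeros_like(rho_k)
nonzero = k != 0
E_k[nonzero] = -1j * rho_k[nonzero] / k[nonzero]      # E_k = -i * rho_k / k
E = np.fft.ifft(E_k).real

# Gather: interpolate the grid field back to the particle positions.
E_p = E[cell % n_cells] * (1.0 - frac) + E[(cell + 1) % n_cells] * frac
print(rho.sum() * dx, E_p[:3])
```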

  14. PHoToNs–A parallel heterogeneous and threads oriented code for cosmological N-body simulation

    NASA Astrophysics Data System (ADS)

    Wang, Qiao; Cao, Zong-Yan; Gao, Liang; Chi, Xue-Bin; Meng, Chen; Wang, Jie; Wang, Long

    2018-06-01

    We introduce a new code for cosmological simulations, PHoToNs, which incorporates features for performing massive cosmological simulations on heterogeneous high-performance computing (HPC) systems and threads-oriented programming. PHoToNs adopts a hybrid scheme to compute the gravitational force: the conventional Particle-Mesh (PM) algorithm computes the long-range force, the Tree algorithm computes the short-range force, and the direct-summation Particle-Particle (PP) algorithm computes gravity from very close particles. A self-similar space-filling Peano-Hilbert curve is used to decompose the computing domain. Threads programming is used to advantage to more flexibly manage domain communication, PM calculation and synchronization, as well as Dual Tree Traversal on the CPU+MIC platform. PHoToNs scales well, and the efficiency of the PP kernel achieves 68.6% of peak performance on MIC and 74.4% on CPU platforms. We also tested the accuracy of the code against the widely used Gadget-2 code and found excellent agreement.
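    The optimized CPU/MIC kernels are not shown in the abstract, but the direct-summation PP part of such a hybrid scheme can be sketched in a few lines. The example below is an illustrative NumPy version with Plummer softening; the softening length, particle count and units are assumptions, and a production code would evaluate this only for near neighbours.

```python
import numpy as np

def pp_accelerations(pos, mass, softening=1e-3, G=1.0):
    """Toy direct-summation particle-particle (PP) kernel with Plummer
    softening; illustrates the close-range force a PM+Tree+PP code evaluates
    directly (not the optimized CPU/MIC kernel described in the paper)."""
    # Pairwise separation vectors r_ij = x_j - x_i, shape (N, N, 3).
    dr = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]
    r2 = (dr ** 2).sum(axis=-1) + softening ** 2
    inv_r3 = r2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                 # no self-force
    # a_i = G * sum_j m_j * r_ij / (r_ij^2 + eps^2)^(3/2)
    return G * (dr * (mass[np.newaxis, :, None] * inv_r3[:, :, None])).sum(axis=1)

rng = np.random.default_rng(42)
pos = rng.uniform(-1.0, 1.0, size=(256, 3))
mass = np.full(256, 1.0 / 256)
acc = pp_accelerations(pos, mass)
print(acc.shape, np.abs(acc).max())
```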

  15. Neutron Monitor Observations and Space Weather, 1. Automatically Search of Great Solar Energetic Particle Event Beginning.

    NASA Astrophysics Data System (ADS)

    Dorman, L. I.; Pustil'Nik, L. A.; Sternlieb, A.; Zukerman, I. G.

    It is well known that during periods of great SEP events the fluxes of energetic particles can become so large that the memory of computers and other electronics in space may be corrupted, and satellites and spacecraft may be lost. According to the NOAA Space Weather Scales, the dangerous Solar Radiation Storms are S5-extreme (flux level of particles with energy > 10 MeV above 10^5), S4-severe (flux above 10^4) and S3-strong (flux above 10^3). In these periods it is necessary to switch off part of the electronics for a few hours to protect computer memories. These periods are also dangerous for astronauts on spaceships, and for passengers and crew in commercial jets (especially during S5 storms). The problem is how to forecast these dangerous phenomena reliably. We show that an accurate forecast can be made by using high-energy particles (a few GeV/nucleon and higher), whose transport from the Sun is characterized by a much larger diffusion coefficient than that of low- and middle-energy particles. High-energy particles therefore arrive from the Sun much earlier (8-20 minutes after acceleration and escape into the solar wind) than the main part of the lower-energy particles that cause the dangerous conditions for electronics (about 30-60 minutes later). We describe here the principles and operating experience of the automatic program "FEP-Search". The onset of an FEP event at the Emilio Segre' Observatory (2025 m above sea level, Rc = 10.8 GV) is now determined automatically from a simultaneous increase of 2.5 standard deviations in two sections of the neutron supermonitor. The program "FEP-Search" then uses the next 1-min data to check whether the observed increase reflects the beginning of a real great FEP event. If it does, the program "FEP-Research" is started automatically on line.
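    A minimal sketch of the trigger idea described above (a simultaneous 2.5 standard-deviation increase in two independent neutron-monitor sections), not the actual FEP-Search code; the background window, synthetic count rates and step size are assumptions for illustration.

```python
import numpy as np

def fep_onset(section_a, section_b, background_minutes=120, n_sigma=2.5):
    """Minimal sketch of the trigger idea (not the actual FEP-Search program):
    return the first minute at which BOTH neutron-supermonitor sections exceed
    their trailing background mean by n_sigma standard deviations."""
    a = np.asarray(section_a, dtype=float)
    b = np.asarray(section_b, dtype=float)
    for t in range(background_minutes, len(a)):
        window = slice(t - background_minutes, t)
        hi_a = a[t] > a[window].mean() + n_sigma * a[window].std()
        hi_b = b[t] > b[window].mean() + n_sigma * b[window].std()
        if hi_a and hi_b:
            return t          # candidate onset; later minutes confirm or reject it
    return None

# Synthetic 1-min count rates: Poisson background plus a step increase at t = 300.
rng = np.random.default_rng(7)
base = 10_000
a = rng.poisson(base, 400).astype(float)
b = rng.poisson(base, 400).astype(float)
a[300:] += 0.05 * base        # a 5% step, roughly 5 sigma for these count rates
b[300:] += 0.05 * base
print("candidate onset at minute:", fep_onset(a, b))
```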

  16. Measurements of Nucleation-Mode Particle Size Distributions in Aircraft Plumes during SULFUR 6

    NASA Technical Reports Server (NTRS)

    Brock, Charles A.; Bradford, Deborah G.

    1999-01-01

    This report summarizes the participation of the University of Denver in an airborne measurement program, SULFUR 6, which was undertaken in late September and early October of 1998 by the Deutsches Zentrum fur Luft und Raumfahrt (DLR). Scientific findings from two papers that have been published or accepted and from one manuscript that is in preparation are presented. The SULFUR 6 experiment was designed to investigate the emissions from subsonic aircraft to constrain calculations of possible atmospheric chemical and climatic effects. The University of Denver effort contributed toward the following SULFUR 6 goals: (1) To investigate the relationship between fuel sulfur content (FSC--mass of sulfur per mass of fuel) and particle number and mass emission index (EI--quantity emitted per kg of fuel burned); (2) To provide upper and lower limits for the mass conversion efficiency (nu) of fuel sulfur to gaseous and particulate sulfuric acid; (3) To constrain models of volatile particle nucleation and growth by measuring the particle size distribution between 3 and 100 nm at aircraft plume ages ranging from 10^-1 to 10^3 s; (4) To determine microphysical and optical properties and bulk chemical composition of soot particles in aircraft exhaust; and (5) To investigate the differences in particle properties between aircraft plumes in contrail and non-contrail situations. The experiment focused on emissions from the ATTAS research aircraft (a well-characterized but older-technology turbojet) and from an in-service Boeing 737-300 aircraft provided by Lufthansa, with modern, high-bypass turbofan engines. Measurements were made from the DLR Dassault Falcon 900 aircraft, a modified business jet. The Atmospheric Effects of Aviation Program (AEAP) provided funding to operate an instrument, the nucleation-mode aerosol size spectrometer (N-MASS), during the SULFUR 6 campaign and to analyze the data. The N-MASS was developed at the University of Denver with the support of NOAA's Office of Global Programs and NASA's AEAP and measures particle size distributions in the 4-100 nm range.

  17. Visualization assisted by parallel processing

    NASA Astrophysics Data System (ADS)

    Lange, B.; Rey, H.; Vasques, X.; Puech, W.; Rodriguez, N.

    2011-01-01

    This paper discusses the experimental results of our visualization model for data extracted from sensors. The objective of this paper is to find a computationally efficient method to produce a real-time rendering visualization for a large amount of data. We developed a visualization method to monitor the temperature variation of a data center. Sensors are placed on three layers and do not cover the whole room. We use a particle paradigm to interpolate the sensor data; particles model the "space" of the room. In this work we partition the particle set using two mathematical methods, Delaunay triangulation and Voronoi cells, as described by Avis and Bhattacharya. Particles provide information on the room temperature at different coordinates over time. To locate and update particle data we define a computational cost function. To solve this function efficiently, we use a client-server paradigm: the server computes the data and the client displays it on different kinds of hardware. This paper is organized as follows. The first part presents related algorithms used to visualize large flows of data. The second part presents the different platforms and methods used, which were evaluated in order to determine the best solution for the proposed task. The benchmark uses the computational cost of our algorithm, which is based on locating particles relative to sensors and on updating particle values. The benchmark was run on a personal computer using the CPU, multi-core programming, GPU programming and a hybrid GPU/CPU approach. GPU programming is a growing method in this research field; it allows real-time rendering instead of precomputed rendering. To improve our results, we also ran the algorithm on a high-performance computing (HPC) system; this benchmark was used to improve the multi-core method. HPC is commonly used in data visualization (astronomy, physics, etc.) to improve rendering and achieve real-time performance.
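    A small sketch of the interpolation idea (sparse sensor readings evaluated on a dense set of room "particles") using SciPy's Delaunay-based interpolator; this is not the authors' GPU/HPC implementation, and the sensor layout and temperature field are made-up assumptions.

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator, NearestNDInterpolator

# Sketch of the interpolation idea (not the authors' GPU/HPC code): scatter a
# few temperature sensors over a 2-D slice of the room, let SciPy triangulate
# them (a Delaunay triangulation is built internally), and evaluate the field
# on a dense grid of "particles".

rng = np.random.default_rng(3)
sensors = rng.uniform(0.0, 10.0, size=(30, 2))            # sensor (x, y) positions in metres
temps = 20.0 + 2.0 * np.sin(sensors[:, 0]) + 0.5 * sensors[:, 1]   # made-up readings

linear = LinearNDInterpolator(sensors, temps)              # linear inside the convex hull
nearest = NearestNDInterpolator(sensors, temps)            # fallback outside the hull

xx, yy = np.meshgrid(np.linspace(0.0, 10.0, 200), np.linspace(0.0, 10.0, 200))
particles = np.column_stack([xx.ravel(), yy.ravel()])

field = linear(particles)
outside = np.isnan(field)                                  # points the triangulation misses
field[outside] = nearest(particles[outside])
print(field.reshape(xx.shape).mean())
```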

  18. An interdisciplinary study of the estuarine and coastal oceanography of Block Island Sound and adjacent New York coastal waters

    NASA Technical Reports Server (NTRS)

    Yost, E.; Hollman, R.; Alexander, J.; Nuzzi, R.

    1974-01-01

    ERTS-1 photographic data products have been analyzed using additive color viewing and electronic image analysis techniques. Satellite data were compared to water sample data collected simultaneously with ERTS-1 coverage of New York Bight. Prediction of the absolute value of total suspended particles can be made using composites of positives of MSS bands 5 and 6 which have been precisely made using the step wedge supplied on the imagery. Predictions of the relative value of the extinction coefficient can be made using bands 4 and 5. Thematic charts of total suspended particles (particles per litre) and extinction coefficient provide scientists conducting state and federal water sampling programs in New York Bight with data which improve the performance of these programs.

  19. Size-dependent microstructures in rapidly solidified uranium-niobium powder particles

    DOE PAGES

    McKeown, Joseph T.; Hsiung, Luke L.; Park, Jong M.; ...

    2016-06-14

    The microstructures of rapidly solidified U-6wt%Nb powder particles synthesized by centrifugal atomization were characterized using scanning electron microscopy and transmission electron microscopy. Observed variations in microstructure are related to particle sizes. All of the powder particles exhibited a two-zone microstructure. The formation of this two-zone microstructure is described by a transition from solidification controlled by internal heat flow and high solidification rate during recalescence (micro-segregation-free or partitionless growth) to solidification controlled by external heat flow with slower solidification rates (dendritic growth with solute redistribution). The extent of partitionless solidification increased with decreasing particle size due to larger undercoolings in smaller particles prior to solidification. The metastable phases that formed are related to variations in Nb concentration across the particles. Lastly, the microstructures of the powders were heavily twinned.

  20. Sonochemical synthesis of silica particles and their size control

    NASA Astrophysics Data System (ADS)

    Kim, Hwa-Min; Lee, Chang-Hyun; Kim, Bonghwan

    2016-09-01

    Using an ultrasound-assisted sol-gel method, we successfully synthesized very uniformly shaped, monodisperse, and size-controlled spherical silica particles from a mixture of ethanol, water, and tetraethyl orthosilicate in the presence of ammonia as catalyst, at room temperature. The diameters of the silica particles were distributed in the range from 40 to 400 nm; their morphology was well characterized by scanning electron microscopy. The silica particle size could be adjusted by choosing suitable concentrations of ammonium hydroxide and water, which in turn determined the nucleation and growth rates of the particles during the reaction. This sonochemical-based silica synthesis offers an alternative way to produce spherical silica particles in a relatively short reaction time. Thus, we suggest that this simple, low-cost, and efficient method of preparing uniform silica particles of various sizes will have practical and wide-ranging industrial applicability.

  1. Stratification, segregation, and mixing of granular materials in quasi-two-dimensional bounded heaps.

    PubMed

    Fan, Yi; Boukerkour, Youcef; Blanc, Thibault; Umbanhowar, Paul B; Ottino, Julio M; Lueptow, Richard M

    2012-11-01

    Segregation and mixing of granular mixtures during heap formation has important consequences in industry and agriculture. This research investigates three different final particle configurations of bidisperse granular mixtures--stratified, segregated and mixed--during filling of quasi-two-dimensional silos. We consider a large number and wide range of control parameters, including particle size ratio, flow rate, system size, and heap rise velocity. The boundary between stratified and unstratified states is primarily controlled by the two-dimensional flow rate, with the critical flow rate for the transition depending weakly on particle size ratio and flowing layer length. In contrast, the transition from segregated to mixed states is controlled by the rise velocity of the heap, a control parameter not previously considered. The critical rise velocity for the transition depends strongly on the particle size ratio.

  2. Performance evaluation of mobile downflow booths for reducing airborne particles in the workplace.

    PubMed

    Lo, Li-Ming; Hocker, Braden; Steltz, Austin E; Kremer, John; Feng, H Amy

    2017-11-01

    Compared to other common control measures, the downflow booth is a costly engineering control used to contain airborne dust or particles. The downflow booth provides unidirectional filtered airflow from the ceiling, entraining released particles away from the workers' breathing zone, and delivers contained airflow to a lower level exhaust for removing particulates by filtering media. In this study, we designed and built a mobile downflow booth that is capable of quick assembly and easy size change to provide greater flexibility and particle control for various manufacturing processes or tasks. An experimental study was conducted to thoroughly evaluate the control performance of downflow booths used for removing airborne particles generated by the transfer of powdered lactose between two containers. Statistical analysis compared particle reduction ratios obtained from various test conditions including booth size (short, regular, or extended), supply air velocity (0.41 and 0.51 m/s or 80 and 100 feet per minute, fpm), powder transfer location (near or far from the booth exhaust), and inclusion or exclusion of curtains at the booth entrance. Our study results show that only short-depth downflow booths failed to protect the worker performing powder transfer far from the booth exhausts. Statistical analysis shows that better control performance can be obtained with supply air velocity of 0.51 m/s (100 fpm) than with 0.41 m/s (80 fpm) and that use of curtains for downflow booths did not improve their control performance.

  3. Physical properties of particulate matter (PM) from late model heavy-duty diesel vehicles operating with advanced PM and NO x emission control technologies

    NASA Astrophysics Data System (ADS)

    Biswas, Subhasis; Hu, Shaohua; Verma, Vishal; Herner, Jorn D.; Robertson, William H.; Ayala, Alberto; Sioutas, Constantinos

    Emission control technologies designed to meet the 2007 and 2010 emission standards for heavy-duty diesel vehicles (HDDV) effectively remove the non-volatile fraction of particles, but are comparatively less efficient at controlling the semi-volatile components. A collaborative study between the California Air Resources Board (CARB) and the University of Southern California was initiated to investigate the physicochemical and toxicological characteristics of the semi-volatile and non-volatile particulate matter (PM) fractions from HDDV emissions. This paper reports the physical properties, including size distribution, volatility (in terms of number and mass), surface diameter, and agglomeration of particles emitted from HDDVs retrofitted with advanced emission control devices. Four vehicles in combination with six after-treatment devices (V-SCRT®, Z-SCRT®, CRT®, DPX, Hybrid-CCRT®, EPF) were tested under three driving cycles: steady state (cruise), transient (urban dynamometer driving schedule, UDDS), and idle. An HDDV without any control device served as the baseline vehicle. Substantial reduction of PM mass emissions (>90%) was accomplished for the HDDVs operating with advanced emission control technologies. This reduction was not observed for particle number concentrations under cruise conditions, with the exceptions of the Hybrid-CCRT® and EPF vehicles, which were efficient in controlling both mass and number emissions. In general, significant nucleation mode particles (<50 nm) were formed during cruise cycles in comparison with the UDDS cycles, which emitted higher PM mass in the accumulation mode. The nucleation mode particles (<50 nm) were mainly internally mixed, and evaporated considerably between 150 and 230 °C. Compared to the baseline vehicle, particles from vehicles with controls (except for the Hybrid-CCRT®) had a higher mass-specific surface area.

  4. Chemistry with spatial control using particles and streams†

    PubMed Central

    Kalinin, Yevgeniy V.; Murali, Adithya

    2012-01-01

    Spatial control of chemical reactions, with micro- and nanometer scale resolution, has important consequences for one pot synthesis, engineering complex reactions, developmental biology, cellular biochemistry and emergent behavior. We review synthetic methods to engineer this spatial control using chemical diffusion from spherical particles, shells and polyhedra. We discuss systems that enable both isotropic and anisotropic chemical release from isolated and arrayed particles to create inhomogeneous and spatially patterned chemical fields. In addition to such finite chemical sources, we also discuss spatial control enabled with laminar flow in 2D and 3D microfluidic networks. Throughout the paper, we highlight applications of spatially controlled chemistry in chemical kinetics, reaction-diffusion systems, chemotaxis and morphogenesis. PMID:23145348

  5. Final safety analysis report for the Ground Test Accelerator (GTA), Phase 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1994-10-01

    This document is the first volume of a 3 volume safety analysis report on the Ground Test Accelerator (GTA). The GTA program at the Los Alamos National Laboratory (LANL) is the major element of the national Neutral Particle Beam (NPB) program, which is supported by the Strategic Defense Initiative Office (SDIO). A principal goal of the national NPB program is to assess the feasibility of using hydrogen and deuterium neutral particle beams outside the Earth's atmosphere. The main effort of the NPB program at Los Alamos concentrates on developing the GTA. The GTA is classified as a low-hazard facility, except for the cryogenic-cooling system, which is classified as a moderate-hazard facility. This volume consists of an introduction, summary/conclusion, site description and assessment, description of facility, and description of operation.

  6. Computer program TRACK_TEST for calculating parameters and plotting profiles for etch pits in nuclear track materials

    NASA Astrophysics Data System (ADS)

    Nikezic, D.; Yu, K. N.

    2006-01-01

    A computer program called TRACK_TEST for calculating parameters (lengths of the major and minor axes) and plotting profiles in nuclear track materials resulting from light-ion irradiation and subsequent chemical etching is described. The programming steps are outlined, including calculations of alpha-particle ranges, determination of the distance along the particle trajectory penetrated by the chemical etchant, calculations of track coordinates, determination of the lengths of the major and minor axes and determination of the contour of the track opening. Descriptions of the program are given, including the built-in V functions for the two commonly employed nuclear track materials commercially known as LR 115 (cellulose nitrate) and CR-39 (poly allyl diglycol carbonate) irradiated by alpha particles. Program summary: Title of the program: TRACK_TEST Catalogue identifier: ADWT Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADWT Computer: Pentium PC Operating systems: Windows 95+ Programming language: Fortran 90 Memory required to execute with typical data: 256 MB No. of lines in distributed program, including test data, etc.: 2739 No. of bytes in distributed program, including test data, etc.: 204 526 Distribution format: tar.gz External subprograms used: The entire code must be linked with the MSFLIB library Nature of problem: Fast heavy charged particles (such as alpha particles and other light ions) create latent tracks in some dielectric materials. After chemical etching in aqueous NaOH or KOH solutions, these tracks become visible under an optical microscope. The growth of a track is based on the simultaneous actions of the etchant on undamaged regions (with the bulk etch rate Vb) and along the particle track (with the track etch rate Vt). Growth of the track is described satisfactorily by these two parameters (Vb and Vt). Several models describing the track development have been presented in the past, one of which is the model of Nikezic and Yu (2003) [D. Nikezic, K.N. Yu, Three-dimensional analytical determination of the track parameters. Over-etched tracks, Radiat. Meas. 37 (2003) 39-45] used in the present program. The present computer program has been written to calculate coordinates of points on the track wall and to determine other relevant track parameters. Solution method: Coordinates of points on the track wall assuming normal incidence were calculated by using the method described by Fromm et al. (1988) [M. Fromm, A. Chambaudet, F. Membrey, Data bank for alpha particle tracks in CR39 with energies ranging from 0.5 to 5 MeV recording for various incident angles, Nucl. Tracks Radiat. Meas. 15 (1988) 115-118]. The track is then rotated through the incident angle in order to obtain the coordinates of the oblique track [D. Nikezic, K.N. Yu, Three-dimensional analytical determination of the track parameters. Over-etched tracks, Radiat. Meas. 37 (2003) 39-45; D. Nikezic, Three dimensional analytical determination of the track parameters, Radiat. Meas. 32 (2000) 277-282]. In this way, the track profile in two dimensions (2D) was obtained. In the next step, points in the track wall profile are rotated around the particle trajectory. In this way, circles that outline the track in three dimensions (3D) are obtained. The intersection between the post-etching surface of the detector and the 3D track is the track opening (or the track contour). 
Coordinates of the track 2D and 3D profiles and the track opening are saved in separate output data files. Restrictions: The program cannot calculate track parameters for the incident angle of exactly 90°. The alpha-particle energy should be smaller than 10 MeV. Furthermore, the program cannot perform calculations for tracks in some extreme cases, such as for very low incident energies or very small incident angles. Additional comments: This program is freeware, but publications arising from using it should cite the present paper and the paper describing the track growth model [D. Nikezic, K.N. Yu, Three-dimensional analytical determination of the track parameters. Over-etched tracks, Radiat. Meas. 37 (2003) 39-45]. Moreover, the references for the V functions used should also be cited. For the CR-39 detector: Function (1): S.A. Durrani, R.K. Bull, Solid State Nuclear Track Detection. Principles, Methods and Applications, Pergamon Press, 1987. Function (2): C. Brun, M. Fromm, M. Jouffroy, P. Meyer, J.E. Groetz, F. Abel, A. Chambaudet, B. Dorschel, D. Hermsdorf, R. Bretschneider, K. Kadner, H. Kuhne, Intercomparative study of the detection characteristics of the CR-39 SSNTD for light ions: Present status of the Besancon-Dresden approaches, Radiat. Meas. 31 (1999) 89-98. Function (3): K.N. Yu, F.M.F. Ng, D. Nikezic, Measuring depths of sub-micron tracks in a CR-39 detector from replicas using atomic force microscopy, Radiat. Meas. 40 (2005) 380-383. For the LR 115 detector: Function (1): S.A. Durrani, P.F. Green, The effect of etching conditions on the response of LR 115, Nucl. Tracks 8 (1984) 21-24. Function (2): C.W.Y. Yip, D. Nikezic, J.P.Y. Ho, K.N. Yu, Chemical etching characteristics for cellulose nitrate, Mat. Chem. Phys. 95 (2005) 307-312. Running time: Of the order of several minutes, dependent on input parameters and the resolution requested by the user.
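    TRACK_TEST itself handles oblique incidence and general V functions; as a much simpler, hedged illustration of the underlying geometry, the snippet below evaluates the classical constant-V result for the opening diameter of a normally incident track during the conical growth phase. The etch depth and V values are arbitrary assumptions.

```python
import numpy as np

def track_diameter_constant_v(v_ratio, bulk_etch_um, particle_range_um=np.inf):
    """Simplified special case, not TRACK_TEST itself: opening diameter of an
    etched track for NORMAL incidence and a constant etch-rate ratio
    V = Vt/Vb.  'bulk_etch_um' is the removed layer thickness h = Vb * t.
    Valid only while the etchant tip is still within the particle range."""
    v = np.asarray(v_ratio, dtype=float)
    h = float(bulk_etch_um)
    if v.min() <= 1.0:
        raise ValueError("tracks only develop for V = Vt/Vb > 1")
    if h * v.max() > particle_range_um:
        raise ValueError("etchant has passed the particle range (over-etched track)")
    # Classical SSNTD result for the conical phase of track growth.
    return 2.0 * h * np.sqrt((v - 1.0) / (v + 1.0))

# Example: 4 um of bulk etch and a few representative V values.
for v in (1.2, 2.0, 5.0):
    print(f"V = {v:.1f}  ->  opening diameter = {track_diameter_constant_v(v, 4.0):.2f} um")
```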

  7. Controlled human exposures to ambient pollutant particles in susceptible populations

    EPA Science Inventory

    Epidemiologic studies have established an association between exposures to air pollution particles and human mortality and morbidity at concentrations of particles currently found in major metropolitan areas. The adverse effects of pollution particles are most prominent in suscep...

  8. Design, construction, and characterization of a novel robotic welding fume generator and inhalation exposure system for laboratory animals.

    PubMed

    Antonini, James M; Afshari, Aliakbar A; Stone, Sam; Chen, Bean; Schwegler-Berry, Diane; Fletcher, W Gary; Goldsmith, W Travis; Vandestouwe, Kurt H; McKinney, Walter; Castranova, Vincent; Frazer, David G

    2006-04-01

    Respiratory effects observed in welders have included lung function changes, metal fume fever, bronchitis, and a possible increase in the incidence of lung cancer. Many questions remain unanswered regarding the causality and possible underlying mechanisms associated with the potential toxic effects of welding fume inhalation. The objective of the present study was to construct a completely automated, computer-controlled welding fume generation and inhalation exposure system to simulate real workplace exposures. The system comprised a programmable six-axis robotic welding arm, a water-cooled arc welding torch, and a wire feeder that supplied the wire to the torch at a programmed rate. For the initial studies, gas metal arc welding was performed using a stainless steel electrode. A flexible trunk was attached to the robotic arm of the welder and was used to collect and transport fume from the vicinity of the arc to the animal exposure chamber. Undiluted fume concentrations consistently ranged from 90-150 mg/m(3) in the animal chamber during welding. Temperature and humidity remained constant in the chamber during the welding operation. The welding particles were composed of (from highest to lowest concentration) iron, chromium, manganese, and nickel as measured by inductively coupled plasma atomic emission spectroscopy. Size distribution analysis indicated the mass median aerodynamic diameter of the generated particles to be approximately 0.24 microm with a geometric standard deviation (sigma(g)) of 1.39. As determined by transmission and scanning electron microscopy, the generated aerosols were mostly arranged as chain-like agglomerates of primary particles. Characterization of the laboratory-generated welding aerosol has indicated that particle morphology, size, and chemical composition are comparable to stainless steel welding fume generated in other studies. With the development of this novel system, it will be possible to establish an animal model using controlled welding exposures from automated gas metal arc and flux-cored arc welding processes to investigate how welding fumes affect health.

  9. Multi-Item Multiperiodic Inventory Control Problem with Variable Demand and Discounts: A Particle Swarm Optimization Algorithm

    PubMed Central

    Mousavi, Seyed Mohsen; Niaki, S. T. A.; Bahreininejad, Ardeshir; Musa, Siti Nurmaya

    2014-01-01

    A multi-item multiperiod inventory control model is developed for known, deterministic, variable demands under a limited available budget. Assuming the order quantity is more than the shortage quantity in each period, the shortage is considered as a combination of backorder and lost sale. The orders are placed in batch sizes and the decision variables are assumed to be integer. Moreover, all-units discounts for a number of products and incremental quantity discounts for some other items are considered. While the objectives are to minimize both the total inventory cost and the required storage space, the model is formulated in a fuzzy multicriteria decision making (FMCDM) framework and is shown to be of the mixed integer nonlinear programming type. In order to solve the model, a multiobjective particle swarm optimization (MOPSO) approach is applied. A set of compromise solutions, including optimum and near-optimum ones, was derived via MOPSO for a numerical illustration, where the results are compared with those obtained using a weighting approach. To assess the efficiency of the proposed MOPSO, the model is also solved using a multi-objective genetic algorithm (MOGA). A large number of numerical examples are generated at the end, where graphical and statistical comparisons show the greater efficiency of MOPSO compared with MOGA. PMID:25093195
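    The authors' MOPSO is not reproduced here; the sketch below is a plain single-objective particle swarm optimizer showing the velocity/position update that such methods build on, applied to a toy objective. Swarm size, inertia and acceleration coefficients are conventional values chosen for illustration.

```python
import numpy as np

def pso(objective, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain single-objective particle swarm optimizer (a sketch of the update
    rule the paper's MOPSO builds on, not the multi-objective algorithm itself)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T        # bounds: list of (lo, hi) per dimension
    dim = lo.size
    x = rng.uniform(lo, hi, size=(n_particles, dim))  # positions
    v = np.zeros_like(x)                              # velocities
    pbest, pbest_val = x.copy(), np.apply_along_axis(objective, 1, x)
    g = pbest[pbest_val.argmin()].copy()              # global best position

    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.apply_along_axis(objective, 1, x)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Toy stand-in for the inventory cost function (sphere function).
best_x, best_f = pso(lambda z: float(np.sum(z ** 2)), bounds=[(-5, 5)] * 4)
print(best_x.round(3), best_f)
```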

  10. The effect of particle size distribution on the design of urban stormwater control measures

    USGS Publications Warehouse

    Selbig, William R.; Fienen, Michael N.; Horwatich, Judy A.; Bannerman, Roger T.

    2016-01-01

    An urban pollutant loading model was used to demonstrate how incorrect assumptions on the particle size distribution (PSD) in urban runoff can alter the design characteristics of stormwater control measures (SCMs) used to remove solids in stormwater. Field-measured PSD, although highly variable, is generally coarser than the widely-accepted PSD characterized by the Nationwide Urban Runoff Program (NURP). PSDs can be predicted based on environmental surrogate data. There were no appreciable differences in predicted PSD when grouped by season. Model simulations of a wet detention pond and catch basin showed a much smaller surface area is needed to achieve the same level of solids removal using the median value of field-measured PSD as compared to NURP PSD. Therefore, SCMs that used the NURP PSD in the design process could be unnecessarily oversized. The median of measured PSDs, although more site-specific than NURP PSDs, could still misrepresent the efficiency of an SCM because it may not adequately capture the variability of individual runoff events. Future pollutant loading models may account for this variability through regression with environmental surrogates, but until then, without proper site characterization, the adoption of a single PSD to represent all runoff conditions may result in SCMs that are under- or over-sized, rendering them ineffective or unnecessarily costly.
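    To make the sensitivity to PSD concrete, the hedged sketch below combines Stokes settling velocities with the ideal-settling sizing rule A = Q / v_s(d_cut) for a detention pond; the two particle size distributions, the design flow and the 80% capture target are invented for illustration and are not the study's data or model.

```python
import numpy as np

G = 9.81          # m/s^2
RHO_P = 2650.0    # assumed particle density, kg/m^3 (quartz-like sediment)
RHO_W = 998.0     # water density, kg/m^3
MU = 1.0e-3       # dynamic viscosity of water, Pa*s

def stokes_velocity(d_um):
    """Stokes settling velocity (m/s) for a diameter given in micrometres.
    Strictly valid only for fine particles at low Reynolds number."""
    d = np.asarray(d_um, dtype=float) * 1e-6
    return G * (RHO_P - RHO_W) * d ** 2 / (18.0 * MU)

def pond_area_for_capture(d_um, mass_frac, q_m3s, target_capture=0.8):
    """Ideal-settling estimate (a sketch, not the study's model): smallest pond
    surface area A = Q / v_s(d_cut) such that the coarsest size classes holding
    the target mass fraction settle out before the outlet."""
    d_um = np.asarray(d_um, dtype=float)
    order = np.argsort(d_um)[::-1]                        # coarsest class first
    cum = np.cumsum(np.asarray(mass_frac, dtype=float)[order])
    d_cut = d_um[order][np.searchsorted(cum, target_capture)]
    return q_m3s / stokes_velocity(d_cut)

# Hypothetical PSDs (illustrative numbers only): a coarse, field-measured-like
# distribution versus a finer, NURP-like one, both at a 0.1 m^3/s design flow.
d_bins = [500.0, 250.0, 100.0, 50.0, 10.0]                # um
coarse = [0.30, 0.30, 0.20, 0.15, 0.05]
fine = [0.05, 0.15, 0.20, 0.30, 0.30]
for name, psd in (("coarse PSD", coarse), ("fine PSD", fine)):
    print(name, f"{pond_area_for_capture(d_bins, psd, 0.1):.0f} m^2")
```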

  11. The Development and Assessment of Particle Physics Summer Program for High School Students

    NASA Astrophysics Data System (ADS)

    Prefontaine, Brean; Kurahashi Neilson, Naoko, , Dr.; Love, Christina, , Dr.

    2017-01-01

    A four-week immersive summer program for high school students was developed and implemented to promote awareness of university-level research. The program was directed entirely by an undergraduate physics major and included a hands-on, student-led capstone project for the high school students. The goal was to create an adaptive and shareable curriculum in order to influence high school students' views of university-level research and what it means to be a scientist. The program was assessed through various methods including a survey developed for this program, a scientific attitudes survey, weekly blog posts, and an oral exit interview. The curriculum included visits to local laboratories, an introduction to particle physics and the IceCube collaboration, an introduction to electronics and computer programming, and the capstone project: planning and building a scale model of the IceCube detector. At the conclusion of the program, the students participated in an informal outreach event for the general public and gave an oral presentation to the Department of Physics at Drexel University. Assessment results and details concerning the curriculum and its development will be discussed.

  12. Armored DNA in recombinant Baculoviruses as controls in molecular genetic assays.

    PubMed

    Freystetter, Andrea; Paar, Christian; Stekel, Herbert; Berg, Jörg

    2017-10-01

    The widespread use of molecular PCR-based assays in analytical and clinical laboratories brings about the need for test-specific, stable, and reliable external controls (EC), as well as standards and internal amplification controls (IC), in order to arrive at consistent test results. In addition, there is also a growing need to produce and provide stable, well-characterized molecular controls for quality assurance programs. In this study, we describe a novel approach to generate armored double-stranded DNA controls, which are encapsulated in baculovirus (BV) particles of the species Autographa californica multiple nucleopolyhedrovirus. We used the well-known BacPAK™ Baculovirus Expression System (Takara-Clontech), removed the polyhedrin promoter used for protein expression, and generated recombinant BV-armored DNAs. The obtained BV-armored DNAs were readily extracted by standard clinical DNA extraction methods, showed favorable linearity and performance in our clinical PCR assays, were resistant to DNase I digestion, and exhibited marked stability in human plasma and serum. BV-armored DNA can be used as ECs, quantification standards, and ICs in molecular assays, with the latter application allowing for complete monitoring of clinical molecular assays for sample adequacy. BV-armored DNA may also be used to produce double-stranded DNA reference materials for, e.g., quality assurance programs. The ease of producing BV-armored DNA should make this approach feasible for a broad spectrum of molecular applications. Finally, as BV-armored DNAs are non-infectious to mammals, they may be shipped even more conveniently than clinical specimens.

  13. Special issue containing papers presented at the 12th IAEA Technical Meeting on Energetic Particles in Magnetic Confinement Systems (7-11 September 2011)

    NASA Astrophysics Data System (ADS)

    Berk, H. L.

    2012-09-01

    The topic of the behaviour of energetic alpha particles in magnetically confined fusion plasmas is perhaps the ultimate frontier plasma physics issue that needs to be understood in the quest to achieve controlled power from the fusion reaction in magnetically confined plasmas. The partial pressure of alpha particles in a burning plasma will be ~5-10% of the total pressure, and under these conditions the alpha particles may be prone to develop instability through Alfvénic interaction. This may lead, even with moderate alpha particle loss, to a burn quench or severe wall damage. Alternatively, benign Alfvénic signals may provide the vital information needed to control a fusion burn. The significance of this issue has led to extensive international investigations and a biennial meeting that began in Kyiv in 1989, followed by subsequent meetings in Aspenäs (1991), Trieste (1993), Princeton (1995), JET/Abingdon (1997), Naka (1999), Gothenburg (2001), San Diego (2003), Takayama (2005), Kloster Seeon (2007) and Kyiv (2009). The meeting was initially entitled 'Alpha Particles in Fusion Research' and was changed during the 1997 meeting to 'Energetic Particles in Magnetic Confinement Systems' in appreciation of the need to study the significance of electron runaway, which can lead to the production of energetic electrons with energies that can even exceed the energy produced by fusion products. This special issue presents some of the mature, interesting work that was reported at the 12th IAEA Technical Meeting on Energetic Particles in Magnetic Confinement Systems, which was held in Austin, Texas, USA (7-11 September 2011). This meeting immediately followed a related meeting, the 5th IAEA Technical Meeting on Theory of Plasma Wave Instabilities (5-7 September 2011). The meetings shared one day (7 September 2011) with presentations relevant to both groups. The presentations from most of the participants, as well as some preliminary versions of papers, are available at the websites [1, 2]. To view a presentation or paper, go to the link 'program', view the list of speakers and poster presenters and press 'talk' or 'paper' under the appropriate name. Summaries of the Energetic Particle Conference presentations were given by Kazuo Toi and Boris Breizman, who respectively discussed the experimental and theoretical progress presented at the meeting. Their presentations can be viewed on the 'iaeaep' website [1] by pressing 'Summary-I (or II)' by each of their names. Highlights of this meeting include the tremendous progress that has been achieved in the development of diagnostics that enable the 'viewing' of internal fluctuations and allow comparison with theoretical predictions, as demonstrated, for example, in the talks of P. Lauber and M. Osakabe. The need for and development of hardened diagnostics in severe radiation environments, such as those that will exist in ITER, were discussed in the talks of V. Kiptiley and V.A. Kazakhov. In theoretical studies, much of the effort is focused on nonlinear phenomena. For example, detailed comparisons of theory and experiment on the n = 0 geodesic mode in DIII-D were reported in separate papers by R. Nazikian and G. Fu. A large number of theoretical papers were presented on wave chirping, including a paper by B.N. Breizman, which notes that wave chirping from a single frequency may emanate continually once marginal stability conditions have been established. 
Another area of wide interest was the detailed study of alpha orbits in a burning plasma, where losses can come from perturbations to perfect toroidal symmetry arising from the finite coil number, magnetic field imperfections introduced by diagnostic or test modules, and instability. An important area of development, covered by M.A. Hole and D.A. Spong, is concerned with the self-consistent treatment of the induced fields, which accounts for responses beyond vacuum field perturbations or a pure toroidally symmetric MHD response. In addition, a significant number of studies focused on understanding nonlinear behaviour by means of computer simulation of energetic-particle-driven instability. An under-represented area of investigation was the study of electron runaway formation during major tokamak disruptions. It was noted in an overview by S. Putvinski that electron energies in the 10-20 MeV range are to be expected during projected major disruptions in ITER and that reliable methods for mitigation of the runaway process need to be developed. Significant recent work in the field of disruption-induced electron runaway, which was reported by J. Riemann, does not appear in this special issue of Nuclear Fusion as the work had been previously submitted to Physics of Plasmas [3]. Overall it is clear that reliable mitigation of electron runaway is an extremely important topic that is in need of better understanding and solutions. It has been my pleasure to serve as the organizer of the 12th meeting and to serve as a Guest Editor of this issue of Nuclear Fusion. I am sure that the contents of this issue will serve as a valuable research guide to the field of energetic particle behaviour in a burning plasma for many years to come. The site of the next meeting will be Beijing, China, in the fall of 2013; it will be organized by Zinghong Lin. References [1] Program 2011 12th IAEA Technical Meeting on Energetic Particles in Magnetic Confinement Systems (Austin, Texas, USA, 7-11 September 2011) http://w3fusion.ph.utexas.edu/ifs/iaeaep/program.html [2] Program 2011 5th IAEA Technical Meeting on Theory of Plasma Wave Instabilities (Austin, Texas, USA, 5-7 September 2011) http://w3fusion.ph.utexas.edu/ifs/iaeapi/program.html [3] Riemann J., Smith H.M. and Helander P. 2012 Phys. Plasmas 19 012507

  14. Lessons Learned with Metallized Gelled Propellants

    NASA Technical Reports Server (NTRS)

    1996-01-01

    During testing of metallized gelled propellants in a rocket engine, many changes had to be made to the normal test program for traditional liquid propellants. The lessons learned during the testing and the solutions for many of the new operational conditions posed with gelled fuels will help future programs run more smoothly. The major factors that influenced the success of the testing were propellant settling, piston-cylinder tank operation, control of self pressurization, capture of metal oxide particles, and a gelled-fuel protective layer. In these ongoing rocket combustion experiments at the NASA Lewis Research Center, metallized, gelled liquid propellants are used in a small modular engine that produces 30 to 40 lb of thrust. Traditional liquid RP-1 and gelled RP-1 with 0-, 5-, and 55-wt% loadings of aluminum are used with gaseous oxygen as the oxidizer. The figure compares the thrust chamber efficiencies of different engines.

  15. ASRM propellant and igniter propellant development and process scale-up

    NASA Technical Reports Server (NTRS)

    Landers, L. C.; Booth, D. W.; Stanley, C. B.; Ricks, D. W.

    1993-01-01

    A program of formulation and process development for ANB-3652 motor propellant was conducted to validate design concepts and screen critical propellant composition and process parameters. Design experiments resulted in the selection of a less active grade of ferric oxide to provide better burning-rate control, the establishment of AP fluidization conditions that minimized the adverse effects of particle attrition, and the selection of a higher mix temperature to improve mechanical properties. It is shown that the propellant can be formulated with AP and aluminum powder from various producers. An extended-duration pilot plant run demonstrated stable equipment operation and excellent reproducibility of propellant properties. A similar program of formulation and process optimization, culminating in large-batch scaleup, was conducted for ANB-3672 igniter propellant. The results for both ANB-3652 and ANB-3672 confirmed that their processing characteristics are compatible with full-scale production.

  16. FMM-Yukawa: An adaptive fast multipole method for screened Coulomb interactions

    NASA Astrophysics Data System (ADS)

    Huang, Jingfang; Jia, Jun; Zhang, Bo

    2009-11-01

    A Fortran program package is introduced for the rapid evaluation of the screened Coulomb interactions of N particles in three dimensions. The method utilizes an adaptive oct-tree structure, and is based on the new version of fast multipole method in which the exponential expansions are used to diagonalize the multipole-to-local translations. The program and its full description, as well as several closely related packages are also available at http://www.fastmultipole.org/. This paper is a brief review of the program and its performance. Catalogue identifier: AEEQ_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEEQ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GPL 2.0 No. of lines in distributed program, including test data, etc.: 12 385 No. of bytes in distributed program, including test data, etc.: 79 222 Distribution format: tar.gz Programming language: Fortran77 and Fortran90 Computer: Any Operating system: Any RAM: Depends on the number of particles, their distribution, and the adaptive tree structure Classification: 4.8, 4.12 Nature of problem: To evaluate the screened Coulomb potential and force field of N charged particles, and to evaluate a convolution type integral where the Green's function is the fundamental solution of the modified Helmholtz equation. Solution method: An adaptive oct-tree is generated, and a new version of fast multipole method is applied in which the "multipole-to-local" translation operator is diagonalized. Restrictions: Only three and six significant digits accuracy options are provided in this version. Unusual features: Most of the codes are written in Fortran77. Functions for memory allocation from Fortran90 and above are used in one subroutine. Additional comments: For supplementary information see http://www.fastmultipole.org/ Running time: The running time varies depending on the number of particles (denoted by N) in the system and their distribution. The running time scales linearly as a function of N for nearly uniform particle distributions. For three digits accuracy, the solver breaks even with direct summation method at about N = 750. References: [1] L. Greengard, J. Huang, A new version of the fast multipole method for screened Coulomb interactions in three dimensions, J. Comput. Phys. 180 (2002) 642-658.
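    For context, the quantity the package evaluates can be written down directly; the sketch below is the brute-force O(N^2) screened Coulomb (Yukawa) summation that FMM-Yukawa accelerates, useful only as a small-N reference (the summary above quotes break-even near N = 750). Particle positions, charges and the screening parameter beta are arbitrary assumptions.

```python
import numpy as np

def yukawa_potentials(pos, charge, beta):
    """Direct O(N^2) evaluation of screened Coulomb (Yukawa) potentials,
    phi_i = sum_{j != i} q_j * exp(-beta * r_ij) / r_ij.
    This is the brute-force reference that an FMM accelerates toward O(N);
    it is practical only for small particle counts."""
    dr = pos[:, np.newaxis, :] - pos[np.newaxis, :, :]
    r = np.sqrt((dr ** 2).sum(axis=-1))
    np.fill_diagonal(r, np.inf)                 # exclude self-interaction
    return (charge[np.newaxis, :] * np.exp(-beta * r) / r).sum(axis=1)

rng = np.random.default_rng(11)
pos = rng.random((500, 3))
q = rng.choice([-1.0, 1.0], 500)
phi = yukawa_potentials(pos, q, beta=2.0)
print(phi.shape, phi[:3])
```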

  17. Size-controlled fabrication of zein nano/microparticles by modified anti-solvent precipitation with/without sodium caseinate

    PubMed Central

    Li, Feng; Chen, Yan; Liu, Shubo; Qi, Jian; Wang, Weiying; Wang, Chenhua; Zhong, Ruiyue; Chen, Zhijun; Li, Xiaoming; Guan, Yuanzhou; Kong, Wei; Zhang, Yong

    2017-01-01

    Zein-based nano/microparticles have been demonstrated to be promising carrier systems for both the food industry and biomedical applications. However, the fabrication of size-controlled zein particles has been a challenging issue. In this study, a modified anti-solvent precipitation method was developed, and the effects of various factors, such as mixing method, solvent/anti-solvent ratio, temperature, zein concentrations and the presence of sodium caseinate (SC) on properties of zein particles were investigated. Evidence is presented that, among the previously mentioned factors, the mixing method, especially mixing rate, could be used as an effective parameter to control the size of zein particles without changing other parameters. Moreover, through fine-tuning the mixing rate together with zein concentration, particles with sizes ranging from nanometers to micrometers and low polydispersity index values could be easily obtained. Based on the size-controlled fabrication method, SC-coated zein nanoparticles could also be obtained in a size-controlled manner by incubation of the coating material with the already-formed zein particles. The resultant nanoparticles showed better performance in both drug loading and controlled release, compared with zein/SC hybrid nanoparticles fabricated by adding aqueous ethanol solution to SC solution. The possible mechanisms of the nanoprecipitation process and self-assembly formation of these nanoparticles are discussed. PMID:29184408

  18. Size-controlled fabrication of zein nano/microparticles by modified anti-solvent precipitation with/without sodium caseinate.

    PubMed

    Li, Feng; Chen, Yan; Liu, Shubo; Qi, Jian; Wang, Weiying; Wang, Chenhua; Zhong, Ruiyue; Chen, Zhijun; Li, Xiaoming; Guan, Yuanzhou; Kong, Wei; Zhang, Yong

    2017-01-01

    Zein-based nano/microparticles have been demonstrated to be promising carrier systems for both the food industry and biomedical applications. However, the fabrication of size-controlled zein particles has been a challenging issue. In this study, a modified anti-solvent precipitation method was developed, and the effects of various factors, such as mixing method, solvent/anti-solvent ratio, temperature, zein concentrations and the presence of sodium caseinate (SC) on properties of zein particles were investigated. Evidence is presented that, among the previously mentioned factors, the mixing method, especially mixing rate, could be used as an effective parameter to control the size of zein particles without changing other parameters. Moreover, through fine-tuning the mixing rate together with zein concentration, particles with sizes ranging from nanometers to micrometers and low polydispersity index values could be easily obtained. Based on the size-controlled fabrication method, SC-coated zein nanoparticles could also be obtained in a size-controlled manner by incubation of the coating material with the already-formed zein particles. The resultant nanoparticles showed better performance in both drug loading and controlled release, compared with zein/SC hybrid nanoparticles fabricated by adding aqueous ethanol solution to SC solution. The possible mechanisms of the nanoprecipitation process and self-assembly formation of these nanoparticles are discussed.

  19. Revision of FMM-Yukawa: An adaptive fast multipole method for screened Coulomb interactions

    NASA Astrophysics Data System (ADS)

    Zhang, Bo; Huang, Jingfang; Pitsianis, Nikos P.; Sun, Xiaobai

    2010-12-01

    FMM-YUKAWA is a mathematical software package primarily for rapid evaluation of the screened Coulomb interactions of N particles in three dimensional space. Since its release, we have revised and re-organized the data structure, software architecture, and user interface, for the purpose of enabling more flexible, broader and easier use of the package. The package and its documentation are available at http://www.fastmultipole.org/, along with a few other closely related mathematical software packages. New version program summaryProgram title: FMM-Yukawa Catalogue identifier: AEEQ_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEEQ_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU GPL 2.0 No. of lines in distributed program, including test data, etc.: 78 704 No. of bytes in distributed program, including test data, etc.: 854 265 Distribution format: tar.gz Programming language: FORTRAN 77, FORTRAN 90, and C. Requires gcc and gfortran version 4.4.3 or later Computer: All Operating system: Any Classification: 4.8, 4.12 Catalogue identifier of previous version: AEEQ_v1_0 Journal reference of previous version: Comput. Phys. Comm. 180 (2009) 2331 Does the new version supersede the previous version?: Yes Nature of problem: To evaluate the screened Coulomb potential and force field of N charged particles, and to evaluate a convolution type integral where the Green's function is the fundamental solution of the modified Helmholtz equation. Solution method: The new version of fast multipole method (FMM) that diagonalizes the multipole-to-local translation operator is applied with the tree structure adaptive to sample particle locations. Reasons for new version: To handle much larger particle ensembles, to enable the iterative use of the subroutines in a solver, and to remove potential contention in assignments for parallelization. Summary of revisions: The software package FMM-Yukawa has been revised and re-organized in data structure, software architecture, programming methods, and user interface. The revision enables more flexible use of the package and economic use of memory resources. It consists of five stages. The initial stage (stage 1) determines, based on the accuracy requirement and FMM theory, the length of multipole expansions and the number of quadrature points for diagonalization, and loads the quadrature nodes and weights that are computed off line. Stage 2 constructs the oct-tree and interaction lists, with adaptation to the sparsity or density of particles and employing a dynamic memory allocation scheme at every tree level. Stage 3 executes the core FMM subroutine for numerical calculation of the particle interactions. The subroutine can now be used iteratively as in a solver, while the particle locations remain the same. Stage 4 releases the memory allocated in Stage 2 for the adaptive tree and interaction lists. The user can modify the iterative routine easily. When the particle locations are changed such as in a molecular dynamics simulation, stage 2 to 4 can also be used together repeatedly. The final stage releases the memory space used for the quadrature and other remaining FMM parameters. Programs at the stage level and at the user interface are re-written in the C programming language, while most of the translation and interaction operations remain in FORTRAN. 
As a result of the change in data structures and memory allocation, the revised package can accommodate much larger particle ensembles while maintaining the same accuracy-efficiency performance. The new version is also developed as an important precursor to its parallel counterpart on multi-core or many-core processors in a shared memory programming environment. In particular, in order to ensure mutual exclusion in concurrent updates without incurring extra latency, we have replaced all the assignment statements at a source box that push its data to multiple target boxes with assignments at every target box that gather data from source boxes. This amounts to replacing the column version of matrix-vector multiplication with the row version. The matrix here, however, is in compressive representation. Sufficient care is taken in the revision not to alter the algorithmic complexity or numerical behavior, as concurrent writing potentially takes place in the upward calculation of the multipole expansion coefficients, interactions at every level of the FMM tree, and downward calculation of the local expansion coefficients. The software modules and their compositions are also organized according to the stages in which they are used. Demonstration files and makefiles for merging the user routines and the library routines are provided. Restrictions: Accuracy requirement is described in terms of three or six digits. Higher multiples of three digits will be allowed in a later version. Finer decimation in digits for accuracy specification may or may not be necessary. Unusual features: Ready and friendly for customized use and instrumental in expression of concurrency and dependency for efficient parallelization. Running time: The running time depends linearly on the number N of particles and varies with the characteristics of the particle distribution. It also depends on the accuracy requirement; a higher accuracy requirement takes relatively longer. The code outperforms the direct summation method when N⩾750.

  20. Aerosol sampling system for collection of Capstone depleted uranium particles in a high-energy environment.

    PubMed

    Holmes, Thomas D; Guilmette, Raymond A; Cheng, Yung Sung; Parkhurst, Mary Ann; Hoover, Mark D

    2009-03-01

    The Capstone Depleted Uranium (DU) Aerosol Study was undertaken to obtain aerosol samples resulting from a large-caliber DU penetrator striking an Abrams or Bradley test vehicle. The sampling strategy was designed to (1) optimize the performance of the samplers and maintain their integrity in the extreme environment created during perforation of an armored vehicle by a DU penetrator, (2) collect aerosols as a function of time post perforation, and (3) obtain size-classified samples for analysis of chemical composition, particle morphology, and solubility in lung fluid. This paper describes the experimental setup and sampling methodologies used to achieve these objectives. Custom-designed arrays of sampling heads were secured to the inside of the target in locations approximating the breathing zones of the crew locations in the test vehicles. Each array was designed to support nine filter cassettes and nine cascade impactors mounted with quick-disconnect fittings. Shielding and sampler placement strategies were used to minimize sampler loss caused by the penetrator impact and the resulting fragments of eroded penetrator and perforated armor. A cyclone train was used to collect larger quantities of DU aerosol for measurement of chemical composition and solubility. A moving filter sample was used to obtain semicontinuous samples for DU concentration determination. Control for the air samplers was provided by five remotely located valve control and pressure monitoring units located inside and around the test vehicle. These units were connected to a computer interface chassis and controlled using a customized LabVIEW engineering computer control program. The aerosol sampling arrays and control systems for the Capstone study provided the needed aerosol samples for physicochemical analysis, and the resultant data were used for risk assessment of exposure to DU aerosol.

  1. Aerosol Sampling System for Collection of Capstone Depleted Uranium Particles in a High-Energy Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holmes, Thomas D.; Guilmette, Raymond A.; Cheng, Yung-Sung

    2009-03-01

    The Capstone Depleted Uranium Aerosol Study was undertaken to obtain aerosol samples resulting from a kinetic-energy cartridge with a large-caliber depleted uranium (DU) penetrator striking an Abrams or Bradley test vehicle. The sampling strategy was designed to (1) optimize the performance of the samplers and maintain their integrity in the extreme environment created during perforation of an armored vehicle by a DU penetrator, (2) collect aerosols as a function of time post-impact, and (3) obtain size-classified samples for analysis of chemical composition, particle morphology, and solubility in lung fluid. This paper describes the experimental setup and sampling methodologies used to achieve these objectives. Custom-designed arrays of sampling heads were secured to the inside of the target in locations approximating the breathing zones of the vehicle commander, loader, gunner, and driver. Each array was designed to support nine filter cassettes and nine cascade impactors mounted with quick-disconnect fittings. Shielding and sampler placement strategies were used to minimize sampler loss caused by the penetrator impact and the resulting fragments of eroded penetrator and perforated armor. A cyclone train was used to collect larger quantities of DU aerosol for chemical composition and solubility. A moving filter sampler was used to obtain semicontinuous samples for depleted uranium concentration determination. Control for the air samplers was provided by five remotely located valve control and pressure monitoring units located inside and around the test vehicle. These units were connected to a computer interface chassis and controlled using a customized LabVIEW engineering computer control program. The aerosol sampling arrays and control systems for the Capstone study provided the needed aerosol samples for physicochemical analysis, and the resultant data were used for risk assessment of exposure to DU aerosol.

  2. Magnetophoretic circuits for digital control of single particles and cells

    NASA Astrophysics Data System (ADS)

    Lim, Byeonghwa; Reddy, Venu; Hu, Xinghao; Kim, Kunwoo; Jadhav, Mital; Abedini-Nassab, Roozbeh; Noh, Young-Woock; Lim, Yong Taik; Yellen, Benjamin B.; Kim, Cheolgi

    2014-05-01

    The ability to manipulate small fluid droplets, colloidal particles and single cells with the precision and parallelization of modern-day computer hardware has profound applications for biochemical detection, gene sequencing, chemical synthesis and highly parallel analysis of single cells. Drawing inspiration from general circuit theory and magnetic bubble technology, here we demonstrate a class of integrated circuits for executing sequential and parallel, timed operations on an ensemble of single particles and cells. The integrated circuits are constructed from lithographically defined, overlaid patterns of magnetic film and current lines. The magnetic patterns passively control particles similar to electrical conductors, diodes and capacitors. The current lines actively switch particles between different tracks similar to gated electrical transistors. When combined into arrays and driven by a rotating magnetic field clock, these integrated circuits have general multiplexing properties and enable the precise control of magnetizable objects.

  3. Multiscale spectral nanoscopy

    DOEpatents

    Yang, Haw; Welsher, Kevin

    2016-11-15

    A system and method for non-invasively tracking a particle in a sample is disclosed. The system includes a 2-photon or confocal laser scanning microscope (LSM) and a particle-holding device coupled to a stage with X-Y and Z position control. The system also includes a tracking module having a tracking excitation laser and X-Y and Z radiation-gathering components configured to detect deviations of the particle in the X-Y and Z directions. The system also includes a processor coupled to the X-Y and Z radiation-gathering components and configured to generate control signals that drive the stage X-Y and Z position controls to track the movement of the particle. The system may also include a synchronization module configured to generate LSM pixels stamped with stage position and a processing module configured to generate a 3D image showing the 3D trajectory of a particle using the LSM pixels stamped with stage position.

  4. Model Experiment of Two-Dimensional Brownian Motion by Microcomputer.

    ERIC Educational Resources Information Center

    Mishima, Nobuhiko; And Others

    1980-01-01

    Describes the use of a microcomputer in studying a model experiment (Brownian particles colliding with thermal particles). A flow chart and program for the experiment are provided. Suggests that this experiment may foster a deepened understanding through mutual dialog between the student and computer. (SK)
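
    The original microcomputer program is not reproduced in this record; purely as an illustration of the same idea, the hypothetical C++ sketch below lumps the many collisions with thermal particles into Gaussian random kicks and prints a two-dimensional Brownian trajectory. The time step, diffusion coefficient, and random seed are arbitrary example values.

        // Hypothetical 2D Brownian-motion sketch: collisions with thermal particles
        // are approximated by Gaussian random displacements each time step.
        #include <cmath>
        #include <iostream>
        #include <random>

        int main() {
            std::mt19937 rng(12345);                      // fixed seed for reproducibility
            std::normal_distribution<double> kick(0.0, 1.0);
            const double dt = 0.01, D = 1.0;              // time step and diffusion coefficient
            const double step = std::sqrt(2.0 * D * dt);  // RMS displacement per axis per step
            double x = 0.0, y = 0.0;
            for (int n = 0; n < 1000; ++n) {
                x += step * kick(rng);
                y += step * kick(rng);
                std::cout << n * dt << ' ' << x << ' ' << y << '\n';
            }
        }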

  5. SoAx: A generic C++ Structure of Arrays for handling particles in HPC codes

    NASA Astrophysics Data System (ADS)

    Homann, Holger; Laenen, Francois

    2018-03-01

    The numerical study of physical problems often requires integrating the dynamics of a large number of particles evolving according to a given set of equations. Particles are characterized by the information they carry, such as an identity, a position, and other properties. There are, generally speaking, two different possibilities for handling particles in high performance computing (HPC) codes. The concept of an Array of Structures (AoS) is in the spirit of the object-oriented programming (OOP) paradigm in that the particle information is implemented as a structure. Here, an object (realization of the structure) represents one particle and a set of many particles is stored in an array. In contrast, using the concept of a Structure of Arrays (SoA), a single structure holds several arrays, each representing one property (such as the identity) of the whole set of particles. The AoS approach is often implemented in HPC codes due to its handiness and flexibility. For a class of problems, however, it is known that the performance of SoA is much better than that of AoS. We confirm this observation for our particle problem. Using a benchmark we show that on modern Intel Xeon processors the SoA implementation is typically several times faster than the AoS one. On Intel's MIC co-processors the performance gap even attains a factor of ten. The same is true for GPU computing, using both computational and multi-purpose GPUs. Combining performance and handiness, we present the library SoAx that has optimal performance (on CPUs, MICs, and GPUs) while providing the same handiness as AoS. For this, SoAx uses modern C++ design techniques such as template metaprogramming, which allows it to automatically generate code for user-defined heterogeneous data structures.
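
    The difference between the two layouts is easiest to see in a small sketch. The code below is illustrative only and does not reproduce the SoAx interface: it defines an AoS and an SoA layout for particles carrying an identity, a position, and a velocity, plus an update loop that streams each property contiguously in the SoA form, which is what favors vectorization.

        // Illustrative AoS vs. SoA particle layouts (not the SoAx API).
        #include <cstddef>
        #include <cstdint>
        #include <vector>

        // Array of Structures: one object per particle, stored in a single array.
        struct ParticleAoS {
            std::int64_t id;
            double x, y, z;
            double vx, vy, vz;
        };
        using EnsembleAoS = std::vector<ParticleAoS>;

        // Structure of Arrays: one contiguous array per property for the whole set.
        struct EnsembleSoA {
            std::vector<std::int64_t> id;
            std::vector<double> x, y, z;
            std::vector<double> vx, vy, vz;
        };

        // Position update: with SoA each property is traversed contiguously,
        // which maps well onto SIMD units on CPUs, MICs, and GPUs.
        void advance(EnsembleSoA& p, double dt) {
            for (std::size_t i = 0; i < p.x.size(); ++i) {
                p.x[i] += p.vx[i] * dt;
                p.y[i] += p.vy[i] * dt;
                p.z[i] += p.vz[i] * dt;
            }
        }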

  6. Biodistribution of doxorubicin and nanostructured ferrocarbon carrier particles in organism during magnetically controlled drug delivery

    NASA Astrophysics Data System (ADS)

    Kuznetsov, Anatoly A.; Filippov, Victor I.; Nikolskaya, Tatiana A.; Budko, Andrei P.; Kovarskii, Alexander L.; Zontov, Sergei V.; Kogan, Boris Ya.; Kuznetsov, Oleg A.

    2009-05-01

    The biodistribution of doxorubicin and ferrocarbon carrier particles in the organism during and after magnetically controlled anti-tumor drug delivery and deposition was studied. Animal tests show a high concentration of the cytostatic drug in the target zone, while its concentration is three orders of magnitude lower in the bloodstream and other organs. A significant depot of the drug remains on the deposited particles days after the procedure. Macrophages actively phagocytose the ferrocarbon (FeC) particles and remain viable long enough to carry them to the lymph nodes.

  7. An Artificial Particle Precipitation Technique Using HAARP-Generated VLF Waves

    DTIC Science & Technology

    2006-11-02

    AFRL-VS-HA-TR-2007-1021. An Artificial Particle Precipitation Technique Using HAARP-Generated VLF Waves. M. J. Kosch, T. Pedersen, J. ... Program Element Number 62101F. ... The frequency-time modulated VLF wave patterns have been successfully implemented at the HAARP ionospheric modification facility in Alaska.

  8. New Energetic Particle Data and Products from the GOES Program

    NASA Astrophysics Data System (ADS)

    Onsager, Terrance; Rodriguez, Juan

    The NOAA Geostationary Operational Environmental Satellite (GOES) program has provided continuous, real-time measurements of the near-Earth space environment for decades. In addition to their scientific value, the GOES energetic particle measurements are the basis for a variety of space weather products and services, including the forecasting of elevated energetic particle levels, real-time knowledge of the satellite environment at geostationary orbit, and data to allow post-event analyses when satellite anomalies occur. The GOES satellites have traditionally provided measurements of high-energy electrons, protons, and alpha particles (100s of keV to 100s of MeV). Beginning with the launch of GOES-13 in 2006, the measurement capabilities were expanded to include medium-energy electrons and protons (10s to 100s of keV) with pitch angle resolution. The next generation of GOES satellites, starting with GOES-R in 2016, will include low-energy electrons and ions (10s of eV to 10s of keV) as well as energetic heavy ions. In this presentation, we will give an overview of the GOES particle measurements available now and in the future and describe the space weather services and scientific investigations that these data support.

  9. WAVECALC: an Excel-VBA spreadsheet to model the characteristics of fully developed waves and their influence on bottom sediments in different water depths

    NASA Astrophysics Data System (ADS)

    Le Roux, Jacobus P.; Demirbilek, Zeki; Brodalka, Marysia; Flemming, Burghard W.

    2010-10-01

    The generation and growth of waves in deep water is controlled by winds blowing over the sea surface. In fully developed sea states, where winds and waves are in equilibrium, wave parameters may be calculated directly from the wind velocity. We provide an Excel spreadsheet to compute the wave period, length, height and celerity, as well as horizontal and vertical particle velocities for any water depth, bottom slope, and distance below the reference water level. The wave profile and propagation can also be visualized for any water depth, modeling the sea surface change from sinusoidal to trochoidal and finally cnoidal profiles into shallow water. Bedload entrainment is estimated under both the wave crest and the trough, using the horizontal water particle velocity at the top of the boundary layer. The calculations are programmed in an Excel file called WAVECALC, which is available online to authorized users. Although many of the recently published formulas are based on theoretical arguments, the values agree well with several existing theories and limited field and laboratory observations. WAVECALC is a user-friendly program intended for sedimentologists, coastal engineers and oceanographers, as well as marine ecologists and biologists. It provides a rapid means to calculate many wave characteristics required in coastal and shallow marine studies, and can also serve as an educational tool.
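
    WAVECALC itself is an Excel-VBA spreadsheet whose formulas are not reproduced in this record. As a rough indication of the kind of calculation involved, the sketch below evaluates only the textbook deep-water linear-wave relations for wavelength and celerity from the wave period, L0 = gT²/(2π) and c0 = gT/(2π); these are standard results, not necessarily the expressions implemented in WAVECALC, and the example period is arbitrary.

        // Deep-water linear wave theory relations (illustrative; not WAVECALC's code).
        #include <cmath>
        #include <iostream>

        int main() {
            const double g = 9.81;                     // gravitational acceleration, m/s^2
            const double T = 8.0;                      // wave period, s (example value)
            const double pi = std::acos(-1.0);
            const double L0 = g * T * T / (2.0 * pi);  // deep-water wavelength, m (about 100 m here)
            const double c0 = g * T / (2.0 * pi);      // deep-water celerity, m/s (about 12.5 m/s here)
            std::cout << "T = " << T << " s, L0 = " << L0 << " m, c0 = " << c0 << " m/s\n";
        }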

  10. Fabrication of microscale materials with programmable composition gradients.

    PubMed

    Laval, Cédric; Bouchaudy, Anne; Salmon, Jean-Baptiste

    2016-04-07

    We present an original microfluidic technique coupling pervaporation and the use of Quake valves to fabricate microscale materials (∼10 × 100 μm² × 1 cm) with composition gradients along their longest dimension. Our device exploits pervaporation of water through a thin poly(dimethylsiloxane) (PDMS) membrane to continuously pump solutions (or dispersions) contained in different reservoirs connected to a microfluidic channel. This pervaporation-induced flow concentrates solutes (or particles) at the tip of the channel up to the formation of a dense material. The latter invades the channel as it is constantly enriched by an incoming flux of solutes/particles. Upstream Quake valves are used to select which reservoir is connected to the pervaporation channel and thus which solution (or dispersion) enriches the material during its growth. The microfluidic configuration of the pervaporation process is used to impose controlled growth along the channel thus enabling one to program spatial composition gradients using appropriate actuations of the valves. We demonstrate the possibilities offered by our technique through the fabrication of dense assemblies of nanoparticles and polymer composites with programmed gradients of fluorescent dyes. We also address the key issue of the spatial resolution of our gradients and we show that well-defined spatial modulations down to ≈50 μm can be obtained within colloidal materials, whereas gradients within polymer materials are resolved on length scales down to ≈1 mm due to molecular diffusion.

  11. ELMO Bumpy Square proposal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dory, R.A.; Uckan, N.A.; Ard, W.B.

    The ELMO Bumpy Square (EBS) concept consists of four straight magnetic mirror arrays linked by four high-field corner coils. Extensive calculations show that this configuration offers major improvements over the ELMO Bumpy Torus (EBT) in particle confinement, heating, transport, ring production, and stability. The components of the EBT device at Oak Ridge National Laboratory can be reconfigured into a square arrangement having straight sides composed of EBT coils, with new microwave cavities and high-field corners designed and built for this application. The elimination of neoclassical convection, identified as the dominant mechanism for the limited confinement in EBT, will give the EBS device substantially improved confinement and the flexibility to explore the concepts that produce this improvement. The primary goals of the EBS program are twofold: first, to improve the physics of confinement in toroidal systems by developing the concepts of plasma stabilization using the effects of energetic electrons and confinement optimization using magnetic field shaping and electrostatic potential control to limit particle drift, and second, to develop bumpy toroid devices as attractive candidates for fusion reactors. This report presents a brief review of the physics analyses that support the EBS concept, discussions of the design and expected performance of the EBS device, a description of the EBS experimental program, and a review of the reactor potential of bumpy toroid configurations. Detailed information is presented in the appendices.

  12. The Bermuda BioOptics Project (BBOP) Years 9-11

    NASA Technical Reports Server (NTRS)

    Nelson, Norm

    2003-01-01

    The Bermuda BioOptics Project (BBOP) is a collaborative effort between the Institute for Computational Earth System Science (ICESS) at the University of California at Santa Barbara (UCSB) and the Bermuda Biological Station for Research (BBSR). This research program is designed to characterize light availability and utilization in the Sargasso Sea, and to provide an optical link by which biogeochemical observations may be used to evaluate bio-optical models for pigment concentration, primary production, and sinking particle fluxes from satellite-based ocean color sensors. The BBOP time-series was initiated in 1992, and is carried out in conjunction with the U.S. JGOFS Bermuda Atlantic Time-series Study (BATS) at the Bermuda Biological Station for Research. The BATS program itself has been observing biogeochemical processes (primary productivity, particle flux and elemental cycles) in the mesotrophic waters of the Sargasso Sea since 1988. Closely affiliated with BBOP and BATS is a separate NASA-funded study of the spatial variability of biogeochemical processes in the Sargasso Sea using high-resolution AVHRR and SeaWiFS data collected at Bermuda. The collaboration between BATS and BBOP measurements has resulted in a unique data set that addresses not only the SIMBIOS goals but also the broader issues of important factors controlling the carbon cycle. This final report addresses specific research activities, research results, and lists of presentations and papers submitted for publication.

  13. Acoustic radiation force on a multilayered sphere in a Gaussian standing field

    NASA Astrophysics Data System (ADS)

    Wang, Haibin; Liu, Xiaozhou; Gao, Sha; Cui, Jun; Liu, Jiehui; He, Aijun; Zhang, Gutian

    2018-03-01

    We develop a model for calculating the radiation force on spherically symmetric multilayered particles based on the acoustic scattering approach. An expression is derived for the radiation force on a multilayered sphere centered on the axis of a Gaussian standing wave propagating in an ideal fluid. The effects of the sound absorption of the materials and sound wave on acoustic radiation force of a multilayered sphere immersed in water are analyzed, with particular emphasis on the shell thickness of every layer, and the width of the Gaussian beam. The results reveal that the existence of particle trapping behavior depends on the choice of the non-dimensional frequency ka, as well as the shell thickness of each layer. This study provides a theoretical basis for the development of acoustical tweezers in a Gaussian standing wave, which may benefit the improvement and development of acoustic control technology, such as trapping, sorting, and assembling a cell, and drug delivery applications. Project supported by National Key R&D Program (Grant No. 2016YFF0203000), the National Natural Science Foundation of China (Grant Nos. 11774167 and 61571222), the Fundamental Research Funds for the Central Universities of China (Grant No. 020414380001), the Key Laboratory of Underwater Acoustic Environment, Institute of Acoustics, Chinese Academy of Sciences (Grant No. SSHJ-KFKT-1701), and the AQSIQ Technology R&D Program of China (Grant No. 2017QK125).

  14. Control and formation mechanism of extended nanochannel geometry in colloidal mesoporous silica particles.

    PubMed

    Sokolov, I; Kalaparthi, V; Volkov, D O; Palantavida, S; Mordvinova, N E; Lebedev, O I; Owens, J

    2017-01-04

    A large class of colloidal multi-micron mesoporous silica particles have well-defined cylindrical nanopores, nanochannels which self-assembled in the templated sol-gel process. These particles are of broad interest in photonics, for timed drug release, enzyme stabilization, separation and filtration technologies, catalysis, etc. Although the pore geometry and the mechanism of pore formation of such particles have been widely investigated at the nanoscale, their pore geometry and its formation mechanism at a larger (extended) scale are still under debate. The extended geometry of nanochannels is paramount for all the aforementioned applications because it defines the accessibility of the nanochannels and, subsequently, the kinetics of interaction of the nanochannel content with the particle surroundings. Here we present both an experimental and a theoretical investigation of the extended geometry and its formation mechanism in colloidal multi-micron mesoporous silica particles. We demonstrate that the disordered (and consequently well accessible) nanochannels in the initially formed colloidal particles gradually align and form extended self-sealed channels. This knowledge allows control of the percentage of disordered versus self-sealed nanochannels, which defines the accessibility of nanochannels in such particles. We further show that the observed aligning of the channels is in agreement with theory; it is thermodynamically favored as it decreases the Gibbs free energy of the particles. Besides the practical use of the obtained results, developing a fundamental understanding of the mechanisms of morphogenesis of the complex geometry of nanopores will open doors to efficient and controllable synthesis that will, in turn, further fuel the practical utilization of these particles.

  15. Particle simulations on transport control in divertors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kashiwagi, Mieko; Ido, Shunji

    1995-04-01

    Particle orbit simulations are carried out to study the reflection of He ions recycled from a tokamak divertor by RF electric fields with a frequency close to the ion cyclotron resonance frequency (ICRF). The performance of particle reflection and the required intensity of the RF fields are studied. Control of He recycling by ICRF fields is found to be feasible. 4 refs., 4 figs.
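
    The simulation code used in the study is not given in the record. As a generic illustration of one particle-orbit time step in prescribed electric and magnetic fields, the sketch below implements the standard Boris rotation, a common integrator for this kind of calculation; the field values, charge-to-mass ratio, and time step would be supplied by the caller, and none of this is the authors' code.

        // Generic Boris pusher: one time step of a charged particle in fields E and B.
        #include <array>

        using Vec3 = std::array<double, 3>;

        static Vec3 cross(const Vec3& a, const Vec3& b) {
            return { a[1] * b[2] - a[2] * b[1],
                     a[2] * b[0] - a[0] * b[2],
                     a[0] * b[1] - a[1] * b[0] };
        }

        // Advance position x and velocity v by dt for charge-to-mass ratio qm.
        void borisStep(Vec3& x, Vec3& v, const Vec3& E, const Vec3& B, double qm, double dt) {
            const double h = 0.5 * qm * dt;
            Vec3 vMinus, t, s;
            double t2 = 0.0;
            for (int i = 0; i < 3; ++i) {            // half electric kick and rotation vector
                vMinus[i] = v[i] + h * E[i];
                t[i] = h * B[i];
                t2 += t[i] * t[i];
            }
            for (int i = 0; i < 3; ++i) s[i] = 2.0 * t[i] / (1.0 + t2);
            Vec3 vPrime, vPlus;
            const Vec3 c1 = cross(vMinus, t);
            for (int i = 0; i < 3; ++i) vPrime[i] = vMinus[i] + c1[i];
            const Vec3 c2 = cross(vPrime, s);
            for (int i = 0; i < 3; ++i) vPlus[i] = vMinus[i] + c2[i];
            for (int i = 0; i < 3; ++i) {            // second half electric kick, then drift
                v[i] = vPlus[i] + h * E[i];
                x[i] += v[i] * dt;
            }
        }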

  16. Apparatus and method for controlling heat transfer between a fluidized bed and tubes immersed therein

    DOEpatents

    Hodges, James L.; Cerkanowicz, Anthony E.

    1983-01-01

    In a fluidized bed of solid particles having one or more heat exchange tubes immersed therein, the rate of heat transfer between the fluidized particles and a fluid flowing through the immersed heat exchange tubes is controlled by rotating an arcuate shield apparatus about each tube to selectively expose various portions of the tube to the fluidized particles.

  17. Apparatus and method for controlling heat transfer between a fluidized bed and tubes immersed therein

    DOEpatents

    Hodges, James L.; Cerkanowicz, Anthony E.

    1982-01-01

    In a fluidized bed of solid particles having one or more heat exchange tubes immersed therein, the rate of heat transfer between the fluidized particles and a fluid flowing through the immersed heat exchange tubes is controlled by rotating an arcuate shield apparatus about each tube to selectively expose various portions of the tube to the fluidized particles.

  18. Particle size effect of redox reactions for Co species supported on silica

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chotiwan, Siwaruk; Tomiga, Hiroki; Katagiri, Masaki

    Conversions of chemical states during redox reactions of two silica-supported Co catalysts, which were prepared by the impregnation method, were evaluated by using an in situ XAFS technique. The addition of citric acid into the precursor solution led to the formation on silica of more homogeneous and smaller Co particles, with an average diameter of 4 nm. The supported Co₃O₄ species were reduced to metallic Co via the divalent CoO species during a temperature-programmed reduction process. The reduced Co species were quantitatively oxidized with a temperature-programmed oxidation process. The higher observed reduction temperature of the smaller CoO particles and the lower observed oxidation temperature of the smaller metallic Co particles were induced by the higher dispersion of the Co oxide species, which apparently led to a stronger interaction with the supporting silica. The redox temperature between CoO and Co₃O₄ was found to be independent of the particle size. - Graphical abstract: Chemical state conversions of SiO₂-supported Co species and the particle size effect have been analyzed by means of an in situ XAFS technique. The small CoO particles are resistant to reduction and exist over a wide temperature range. - Highlights: • The conversions of the chemical state of supported Co species during redox reactions are evaluated. • An in operando XAFS technique was applied to measure the redox properties of small Co particles. • A small particle size affects the redox temperatures of cobalt catalysts.

  19. Electrostatic effects on dust particles in space

    NASA Astrophysics Data System (ADS)

    Leung, Philip; Wuerker, Ralph

    1992-02-01

    The star scanner of the Magellan spacecraft experienced operational anomalies continuously during Magellan's journey to Venus. These anomalies were attributed to the presence of dust particles in the vicinity of the spacecraft. The dust particles, which originated from the surface of the thermal blankets, were liberated when the electrostatic force acting on them was of sufficient magnitude. In order to verify this hypothesis, an experimental program was initiated to study the mechanisms responsible for the release of dust particles from a spacecraft surface. In the experiments, dust particles were immersed in a plasma and/or subjected to ultraviolet irradiation. Results showed that the charging state of a dust particle was strongly dependent on the environment, and the charge on a dust particle was approximately 10³ elementary charges. Consequently, in the space environment, electrostatic force could be the most dominant force acting on a dust particle.

  20. Controlling chitosan-based encapsulation for protein and vaccine delivery

    PubMed Central

    Koppolu, Bhanu prasanth; Smith, Sean G.; Ravindranathan, Sruthi; Jayanthi, Srinivas; Kumar, Thallapuranam K.S.; Zaharoff, David A.

    2014-01-01

    Chitosan-based nano/microencapsulation is under increasing investigation for the delivery of drugs, biologics and vaccines. Despite widespread interest, the literature lacks a defined methodology to control chitosan particle size and drug/protein release kinetics. In this study, the effects of precipitation-coacervation formulation parameters on chitosan particle size, protein encapsulation efficiency and protein release were investigated. Chitosan particle sizes, which ranged from 300 nm to 3 μm, were influenced by chitosan concentration, chitosan molecular weight and addition rate of precipitant salt. The composition of precipitant salt played a significant role in particle formation, with upper Hofmeister series salts containing strongly hydrated anions yielding particles with a low polydispersity index (PDI), while more weakly hydrated anions resulted in aggregated particles with high PDIs. Sonication power had minimal effect on mean particle size; however, it significantly reduced polydispersity. Protein loading efficiencies in chitosan nano/microparticles, which ranged from 14.3% to 99.2%, were inversely related to the hydration strength of the precipitant salts and the protein molecular weight, and directly related to the concentration and molecular weight of chitosan. Protein release rates increased with particle size and were generally inversely related to protein molecular weight. This study demonstrates that chitosan nano/microparticles with high protein loading efficiencies can be engineered with well-defined sizes and controllable release kinetics through manipulation of specific formulation parameters. PMID:24560459

  1. 3D laser tracking of a particle in 3DFM

    NASA Astrophysics Data System (ADS)

    Desai, Kalpit; Welch, Gregory; Bishop, Gary; Taylor, Russell; Superfine, Richard

    2003-11-01

    The principal goal of 3D tracking in our home-built 3D Magnetic Force Microscope is to monitor movement of the particle with respect to the laser beam waist and keep the particle at the center of the laser beam. The sensory element is a Quadrant Photo Diode (QPD), which captures the scattering of light caused by particle motion with a bandwidth of up to 40 kHz. An XYZ translation stage is the driver element, which moves the particle back to the center of the laser with an accuracy of a couple of nanometers and a bandwidth of up to 300 Hz. Since our particles vary in size, composition and shape, instead of using an a priori model we use standard system identification techniques to obtain an optimal approximation of the relationship between particle motion and QPD response. We have developed position feedback control system software that is capable of 3-dimensional tracking of beads that are attached to cilia on living cells beating at up to 15 Hz. We have also modeled the control system of the instrument to simulate the performance of 3D particle tracking for different experimental conditions. Given the operational level of nanometers, noise poses a great challenge for the tracking system. We propose to use stochastic control theory approaches to increase the robustness of tracking.
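
    The feedback software described above is not included in the record. A minimal, hypothetical sketch of the underlying idea is given below: a proportional controller that converts the measured particle offset from the beam center (e.g. derived from the QPD signal) into a stage correction each control cycle. The structure names, units, and gain are placeholders, not the 3DFM implementation.

        // Hypothetical proportional feedback step: command the stage to cancel a
        // fraction of the measured particle offset each cycle (placeholder values).
        struct Offset3 { double x, y, z; };   // particle offset from beam waist, metres
        struct Stage3  { double x, y, z; };   // commanded stage position, metres

        Stage3 feedbackStep(const Stage3& stage, const Offset3& err, double gain) {
            return { stage.x - gain * err.x,
                     stage.y - gain * err.y,
                     stage.z - gain * err.z };
        }

    In practice the usable gain would be limited by the 300 Hz stage bandwidth and by measurement noise, which is where the stochastic control approaches mentioned above would come in.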

  2. Synthesis, characterization, and evaluation of a superficially porous particle with unique, elongated pore channels normal to the surface.

    PubMed

    Wei, Ta-Chen; Mack, Anne; Chen, Wu; Liu, Jia; Dittmann, Monika; Wang, Xiaoli; Barber, William E

    2016-04-01

    In recent years, superficially porous particles (SPPs) have drawn great interest because of their special particle characteristics and improvement in separation efficiency. Superficially porous particles are currently manufactured by adding silica nanoparticles onto solid cores using either a multistep multilayer process or one-step coacervation process. The pore size is mainly controlled by the size of the silica nanoparticles and the tortuous pore channel geometry is determined by how those nanoparticles randomly aggregate. Such tortuous pore structure is also similar to that of all totally porous particles used in HPLC today. In this article, we report on the development of a next generation superficially porous particle with a unique pore structure that includes a thinner shell thickness and ordered pore channels oriented normal to the particle surface. The method of making the new superficially porous particles is a process called pseudomorphic transformation (PMT), which is a form of micelle templating. Porosity is no longer controlled by randomly aggregated nanoparticles but rather by micelles that have an ordered liquid crystal structure. The new particle possesses many advantages such as a narrower particle size distribution, thinner porous layer with high surface area and, most importantly, highly ordered, non-tortuous pore channels oriented normal to the particle surface. This PMT process has been applied to make 1.8-5.1 μm SPPs with pore size controlled around 75 Å and surface area around 100 m²/g. All particles with different sizes show the same unique pore structure with tunable pore size and shell thickness. The impact of the novel pore structure on the performance of these particles is characterized by measuring van Deemter curves and constructing kinetic plots. Reduced plate heights as low as 1.0 have been achieved on conventional LC instruments. This indicates higher efficiency of such particles compared to conventional totally porous and superficially porous particles. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. A highly scalable particle tracking algorithm using partitioned global address space (PGAS) programming for extreme-scale turbulence simulations

    NASA Astrophysics Data System (ADS)

    Buaria, D.; Yeung, P. K.

    2017-12-01

    A new parallel algorithm utilizing a partitioned global address space (PGAS) programming model to achieve high scalability is reported for particle tracking in direct numerical simulations of turbulent fluid flow. The work is motivated by the desire to obtain Lagrangian information necessary for the study of turbulent dispersion at the largest problem sizes feasible on current and next-generation multi-petaflop supercomputers. A large population of fluid particles is distributed among parallel processes dynamically, based on instantaneous particle positions such that all of the interpolation information needed for each particle is available either locally on its host process or neighboring processes holding adjacent sub-domains of the velocity field. With cubic splines as the preferred interpolation method, the new algorithm is designed to minimize the need for communication, by transferring between adjacent processes only those spline coefficients determined to be necessary for specific particles. This transfer is implemented very efficiently as a one-sided communication, using Co-Array Fortran (CAF) features which facilitate small data movements between different local partitions of a large global array. The cost of monitoring transfer of particle properties between adjacent processes for particles migrating across sub-domain boundaries is found to be small. Detailed benchmarks are obtained on the Cray petascale supercomputer Blue Waters at the University of Illinois, Urbana-Champaign. For operations on the particles in an 8192³ simulation (0.55 trillion grid points) on 262,144 Cray XE6 cores, the new algorithm is found to be orders of magnitude faster relative to a prior algorithm in which each particle is tracked by the same parallel process at all times. This large speedup reduces the additional cost of tracking of order 300 million particles to just over 50% of the cost of computing the Eulerian velocity field at this scale. Improving support for PGAS models in major compilers suggests that this algorithm will be widely applicable on upcoming supercomputers.
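
    The Co-Array Fortran implementation itself is not shown in the record. The sketch below illustrates only the first ingredient of such a scheme, assigning each particle to the parallel process whose sub-domain contains its current position, assuming a uniform Cartesian decomposition of a periodic box; the decomposition layout and rank ordering are assumptions for illustration, not the algorithm's actual communication machinery.

        // Illustrative owner-rank lookup for a particle in a periodic box [0, L)^3
        // decomposed uniformly into Px x Py x Pz sub-domains (assumed layout).
        #include <cmath>

        struct Decomp { int Px, Py, Pz; double L; };

        int ownerRank(const Decomp& d, double x, double y, double z) {
            auto cell = [&](double c, int P) {
                double u = std::fmod(c, d.L);
                if (u < 0.0) u += d.L;                     // wrap into [0, L)
                int i = static_cast<int>(u / d.L * P);
                return i < P ? i : P - 1;                  // guard against edge rounding
            };
            const int ix = cell(x, d.Px), iy = cell(y, d.Py), iz = cell(z, d.Pz);
            return (iz * d.Py + iy) * d.Px + ix;           // row-major rank ordering
        }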

  4. Characterization of inertial confinement fusion (ICF) targets using PIXE, RBS, and STIM analysis.

    PubMed

    Li, Yongqiang; Liu, Xue; Li, Xinyi; Liu, Yiyang; Zheng, Yi; Wang, Min; Shen, Hao

    2013-08-01

    Quality control of the inertial confinement fusion (ICF) target in the laser fusion program is vital to ensure that energy deposition from the lasers results in uniform compression and minimization of Rayleigh-Taylor instabilities. The technique of nuclear microscopy with ion beam analysis is a powerful method to provide characterization of ICF targets. Distribution of elements, depth profile, and density image of ICF targets can be identified by particle-induced X-ray emission, Rutherford backscattering spectrometry, and scanning transmission ion microscopy. We present examples of ICF target characterization by nuclear microscopy at Fudan University in order to demonstrate their potential impact in assessing target fabrication processes.

  5. Control of Coptotermes havilandi (Isoptera: Rhinotermitidae) with hexaflumuron baits and a sensor incorporated into a monitoring and baiting program.

    PubMed

    Su, N Y; Ban, P M; Scheffrahn, R H

    2000-04-01

    A sensor consisting of a wooden monitor painted with a conductive circuit of silver particle emulsion was placed in a monitoring station to detect feeding activity of the subterranean termite Coptotermes havilandi Holmgren. Sensor accuracy was 100% 1 mo after installation, but 9 mo after sensor placement, the rate declined to 73%. After the detection of C. havilandi in the stations, baits containing the chitin synthesis inhibitor hexaflumuron were applied in five colonies, and four colonies were eliminated within 3-5 mo. Baiting could not be completed for the one remaining colony because the site became inaccessible.

  6. Multiparty-controlled teleportation of an arbitrary GHZ-class state by using a d-dimensional (N+2)-particle nonmaximally entangled state as the quantum channel

    NASA Astrophysics Data System (ADS)

    Long, LiuRong; Li, HongWei; Zhou, Ping; Fan, Chao; Yin, CaiLiu

    2011-03-01

    We present a scheme for multiparty-controlled teleportation of an arbitrary high-dimensional GHZ-class state with a d-dimensional (N+2)-particle GHZ state, following some ideas from the teleportation scheme in (Chinese Physics B, 2007, 16: 2867). This scheme has the advantage of transmitting far fewer particles for controlled teleportation of an arbitrary multiparticle GHZ-class state. Moreover, we discuss the application of this scheme by using a nonmaximally entangled state as its quantum channel.

  7. Multicomponent inorganic Janus particles with controlled compositions, morphologies, and dimensions.

    PubMed

    Lyubarskaya, Yekaterina L; Shestopalov, Alexander A

    2013-08-14

    We report a new protocol for the preparation of shape-controlled multicomponent particles comprising metallic (Au and Ti), magnetic (Ni), and oxide (SiO2, TiO2) layers. Our method allows for a precise control over the composition, shape, and size and permits fabrication of nonsymmetrical particles, whose opposite sides can be orthogonally functionalized using well-established organosilanes and thiol chemistries. Because of their unique geometries and surface chemistries, these colloids represent ideal materials with which to study nonsymmetrical self-assembly at the meso- and microscales.

  8. Resonant circuit which provides dual frequency excitation for rapid cycling of an electromagnet

    DOEpatents

    Praeg, Walter F.

    1984-01-01

    Disclosed is a ring magnet control circuit that permits synchrotron repetition rates much higher than the frequency of the cosinusoidal guide field of the ring magnet during particle acceleration. The control circuit generates cosinusoidal excitation currents of different frequencies in the half waves. During radio frequency acceleration of the particles in the synchrotron, the control circuit operates with a lower frequency cosine wave and thereafter the electromagnets are reset with a higher frequency half cosine wave. Flat-bottom and flat-top wave shaping circuits maintain the magnetic guide field in a relatively time-invariant mode during times when the particles are being injected into the ring magnets and when the particles are being ejected from the ring magnets.

  9. Organic matter diagenesis within the water column and surface sediments of the northern Sargasso Sea revealed by lipid biomarkers

    NASA Astrophysics Data System (ADS)

    Conte, M. H.; Pedrosa Pàmies, R.; Weber, J.

    2017-12-01

    The intensity of particle cycling processes within the mesopelagic and bathypelagic ocean controls the length scale of organic material (OM) remineralization and diagenetic transformations of OM composition through the water column and into the sediments. To elucidate the OM cycling in the oligotrophic North Atlantic gyre, we analyzed lipid biomarkers in the suspended particles (30-4400 m depth, 100 mab), the particle flux (500 m, 1500 m and 3200 m depth), and in the underlying surficial sediments (0-0.5 cm, 4500-4600 m depth) collected at the Oceanic Flux Program (OFP) time series site located 75km SE of Bermuda. Changes in lipid biomarker concentration and composition with depth highlight the rapid remineralization of OM within the upper mesopelagic layer and continuing diagenetic transformations of OM throughout the water column and within surficial sediments. Despite observed similarities in biomarker composition in suspended and sinking particles, results show there are also consistent differences in relative contributions of phytoplankton-, bacterial- and zooplankton-derived sources that are maintained throughout the water column. For example, sinking particles are more depleted in labile biomarkers (e.g. polyunsaturated fatty acids (PUFA)) and more enriched in bacteria-derived biomarkers (e.g. hopanoids and odd/branched fatty acids) and indicators of fecal-derived OM (e.g. saturated fatty acids, FA 18:1w9 and cholesterol) than in the suspended pool. Strong seasonality in deep (3200 m) fluxes of phytoplankton-derived biomarkers reflect the seasonal input of bloom-derived material to underlying sediments. The rapid diagenetic alteration of this bloom-derived input is evidenced by depletion of PUFAs and enrichment of microbial biomarkers (e.g. odd/branched fatty acids) in surficial sediments over a two month period.

  10. Nucleation from seawater emissions during mesocosm experiments

    NASA Astrophysics Data System (ADS)

    Rose, Clémence; Culot, Anais; Pey, Jorge; Schwier, Allison; Mas, Sébastien; Charriere, Bruno; Sempéré, Richard; Marchand, Nicolas; D'Anna, Barbara; Sellegri, Karine

    2015-04-01

    Nucleation and new particle formation in the marine atmosphere is usually associated with the presence of macroalgae exposed at low tide in coastal areas, while these processes have very rarely been detected away from coastlines. In the present study, we show evidence for the formation of new particles, starting from 1 nm in size, above the seawater surface in the absence of any macroalgae population. Within the SAM project (Sources of marine Aerosol in the Mediterranean), seawater mesocosm experiments were deployed in May 2013 at STARESO in western Corsica, with the goal of investigating the relationship between marine aerosol emissions and the seawater biogeochemical properties. Three mesocosms each enclosed 3.3 m³ of seawater, and their emerged part was flushed with aerosol-filtered natural air. One of these mesocosms was left unchanged as a control and the two others were enriched by addition of nitrates and phosphates respecting the Redfield ratio (N:P = 16) in order to create different levels of phytoplanktonic activity. We followed both the water and air characteristics of the three mesocosms during a period of three weeks by using online water and atmospheric probes as well as daily seawater samples for chemical and biological analysis. Secondary new particle formation was followed online in the emerged parts of the mesocosms, using an SMPS for the size distribution above 6 nm and a Particle Size Magnifier (PSM) for the number of cluster particles between 1 and 6 nm. We will present how the cluster formation rates and early growth rates relate to the gas-phase emissions from the seawater and to its biogeochemical properties. Acknowledgements: The authors want to acknowledge the financial support of the ANR "Source of marine Aerosol in the Mediterranean" (SAM), and the support of the MISTRAL CHARMEX and MERMEX programs.

  11. Magnetic particles guided by ellipsoidal AC magnetic fields in a shallow viscous fluid: Controlling trajectories and chain lengths

    NASA Astrophysics Data System (ADS)

    Jorge, Guillermo A.; Llera, María; Bekeris, Victoria

    2017-12-01

    We study the propulsion of superparamagnetic particles dispersed in a viscous fluid upon the application of an elliptically polarized rotating magnetic field. When the fluid surface tension is reduced, the particles sediment due to the density mismatch and rotate close to the lower confining plate of the container. We study the net translational motion arising from the hydrodynamic coupling with the plate and find that, above a crossover magnetic field, magnetically assembled doublets move faster than single particles. In turn, particles are driven along complex, highly controlled trajectories by rotating the plane containing the magnetic field vector. The effect of the field rotation on long self-assembled chains is discussed and the alternating breakup and reformation of the particle chains is described.

  12. Optimal wide-area monitoring and nonlinear adaptive coordinating neurocontrol of a power system with wind power integration and multiple FACTS devices.

    PubMed

    Qiao, Wei; Venayagamoorthy, Ganesh K; Harley, Ronald G

    2008-01-01

    Wide-area coordinating control is becoming an important issue and a challenging problem in the power industry. This paper proposes a novel optimal wide-area coordinating neurocontrol (WACNC), based on wide-area measurements, for a power system with power system stabilizers, a large wind farm and multiple flexible ac transmission system (FACTS) devices. An optimal wide-area monitor (OWAM), which is a radial basis function neural network (RBFNN), is designed to identify the input-output dynamics of the nonlinear power system. Its parameters are optimized through particle swarm optimization (PSO). Based on the OWAM, the WACNC is then designed by using the dual heuristic programming (DHP) method and RBFNNs, while considering the effect of signal transmission delays. The WACNC operates at a global level to coordinate the actions of local power system controllers. Each local controller communicates with the WACNC, receives remote control signals from the WACNC to enhance its dynamic performance and therefore helps improve system-wide dynamic and transient performance. The proposed control is verified by simulation studies on a multimachine power system.
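
    As a reminder of what the PSO step mentioned above looks like, here is a minimal, generic velocity-and-position update for a single particle of the swarm. It is not the authors' implementation; the inertia and acceleration coefficients are common textbook defaults, and the bookkeeping of personal and global bests is left to the caller.

        // Generic particle swarm optimization update for one particle (illustrative).
        #include <cstddef>
        #include <random>
        #include <vector>

        struct SwarmParticle {
            std::vector<double> x, v, bestX;   // position, velocity, personal best position
        };

        void psoUpdate(SwarmParticle& p, const std::vector<double>& globalBestX,
                       std::mt19937& rng, double w = 0.7, double c1 = 1.5, double c2 = 1.5) {
            std::uniform_real_distribution<double> u(0.0, 1.0);
            for (std::size_t i = 0; i < p.x.size(); ++i) {
                p.v[i] = w * p.v[i]
                       + c1 * u(rng) * (p.bestX[i] - p.x[i])       // pull toward personal best
                       + c2 * u(rng) * (globalBestX[i] - p.x[i]);  // pull toward global best
                p.x[i] += p.v[i];
            }
        }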

  13. Strategies for the synthesis of supported gold palladium nanoparticles with controlled morphology and composition.

    PubMed

    Hutchings, Graham J; Kiely, Christopher J

    2013-08-20

    The discovery that supported gold nanoparticles are exceptionally effective catalysts for redox reactions has led to an explosion of interest in gold nanoparticles. In addition, incorporating a second metal as an alloy with gold can enhance the catalyst performance even more. The addition of small amounts of gold to palladium, in particular, and vice versa significantly enhances the activity of supported gold-palladium nanoparticles as redox catalysts through what researchers believe is an electronic effect. In this Account, we describe and discuss methodologies for the synthesis of supported gold-palladium nanoparticles and their use as heterogeneous catalysts. In general, three key challenges need to be addressed in the synthesis of bimetallic nanoparticles: (i) control of the particle morphology, (ii) control of the particle size distribution, and (iii) control of the nanoparticle composition. We describe three methodologies to address these challenges. First, we discuss the relatively simple method of coimpregnation. Impregnation allows control of particle morphology during alloy formation but does not control the particle compositions or the particle size distribution. Even so, we contend that this method is the best preparation method in the catalyst discovery phase of any project, since it permits the investigation of many different catalyst structures in one experiment, which may aid the identification of new catalysts. A second approach, sol-immobilization, allows enhanced control of the particle size distribution and the particle morphology, but control of the composition of individual nanoparticles is not possible. Finally, a modified impregnation method can allow the control of all three of these crucial parameters. We discuss the effect of the different methodologies on three redox reactions: benzyl alcohol oxidation, toluene oxidation, and the direct synthesis of hydrogen peroxide. We show that the coimpregnation method provides the best reaction selectivity for benzyl alcohol oxidation and the direct synthesis of hydrogen peroxide. However, because of the reaction mechanism, the sol-immobilization method gives very active and selective catalysts for toluene oxidation. We discuss the possible nature of the preferred active structures of the supported nanoparticles for these reactions. This paper is based on the IACS Heinz Heinemann Award Lecture entitled "Catalysis using gold nanoparticles" which was given in Munich in July 2012.

  14. Engineered polymeric nanoparticles for soil remediation.

    PubMed

    Tungittiplakorn, Warapong; Lion, Leonard W; Cohen, Claude; Kim, Ju-Young

    2004-03-01

    Hydrophobic organic groundwater contaminants, such as polynuclear aromatic hydrocarbons (PAHs), sorb strongly to soils and are difficult to remove. We report here on the synthesis of amphiphilic polyurethane (APU) nanoparticles for use in remediation of soil contaminated with PAHs. The particles are made of polyurethane acrylate anionomer (UAA) or poly(ethylene glycol)-modified urethane acrylate (PMUA) precursor chains that can be emulsified and cross-linked in water. The resulting particles are of colloidal size (17-97 nm as measured by dynamic light scattering). APU particles have the ability to enhance PAH desorption and transport in a manner comparable to that of surfactant micelles, but unlike the surface-active components of micelles, the individual cross-linked precursor chains in APU particles are not free to sorb to the soil surface. Thus, the APU particles are stable independent of their concentration in the aqueous phase. In this paper we show that APU particles can be engineered to achieve desired properties. Our experimental results show that the APU particles can be designed to have hydrophobic interior regions that confer a high affinity for phenanthrene (PHEN) and hydrophilic surfaces that promote particle mobility in soil. The affinity of APU particles for contaminants such as PHEN can be controlled by changing the size of the hydrophobic segment used in the chain synthesis. The mobility of colloidal APU suspensions in soil is controlled by the charge density or the size of the pendent water-soluble chains that reside on the particle surface. Exemplary results are provided illustrating the influence of alternative APU particle formulations with respect to their efficacy for contaminant removal. The ability to control particle properties offers the potential to produce different nanoparticles optimized for varying contaminant types and soil conditions.

  15. Controlled Endolysosomal Release of Agents by pH-responsive Polymer Blend Particles.

    PubMed

    Zhan, Xi; Tran, Kenny K; Wang, Liguo; Shen, Hong

    2015-07-01

    A key step of delivering extracellular agents to their intracellular target is to escape from endosomal/lysosomal compartments, while minimizing the release of digestive enzymes that may compromise cellular functions. In this study, we examined the intracellular distribution of both fluorescent cargoes and enzymes by a particle delivery platform made from the controlled blending of poly(lactic-co-glycolic acid) (PLGA) and a random pH-sensitive copolymer. We utilized both microscopic and biochemical methods to semi-quantitatively assess how the composition of blend particles affects the level of endosomal escape of cargos of various sizes and enzymes into the cytosolic space. We demonstrated that these polymeric particles enabled the controlled delivery of cargos into the cytosolic space that was more dependent on the cargo size and less on the composition of blend particles. Blend particles did not induce the rupture of endosomal/lysosomal compartments and released less than 20% of endosomal/lysosomal enzymes. This study provides insight into understanding the efficacy and safety of a delivery system for intracellular delivery of biologics and drugs. Blend particles offer a potential platform to target intracellular compartments while potentially minimizing cellular toxicity.

  16. Controlled endolysosomal release of agents by pH-responsive polymer blend particles

    PubMed Central

    Zhan, Xi; Tran, Kenny K.; Wang, Liguo; Shen, Hong

    2015-01-01

    Purpose A key step of delivering extracellular agents to their intracellular target is to escape from endosomal/lysosomal compartments, while minimizing the release of digestive enzymes that may compromise cellular functions. In this study, we examined the intracellular distribution of both fluorescent cargoes and enzymes by a particle delivery platform made from the controlled blending of poly(lactic-co-glycolic acid) (PLGA) and a random pH-sensitive copolymer. Methods We utilized both microscopic and biochemical methods to semi-quantitatively assess how the composition of blend particles affects the level of endosomal escape of cargos of various sizes and enzymes into the cytosolic space. Results We demonstrated that these polymeric particles enabled the controlled delivery of cargos into the cytosolic space that was more dependent on the cargo size and less on the composition of blend particles. Blend particles did not induce the rupture of endosomal/lysosomal compartments and released less than 20% of endosomal/lysosomal enzymes. Conclusions This study provides insight into understanding the efficacy and safety of a delivery system for intracellular delivery of biologics and drugs. Blend particles offer a potential platform to target intracellular compartments while potentially minimizing cellular toxicity. PMID:25592550

  17. Method of phase space beam dilution utilizing bounded chaos generated by rf phase modulation

    DOE PAGES

    Pham, Alfonse N.; Lee, S. Y.; Ng, K. Y.

    2015-12-10

    This paper explores the physics of chaos in a localized phase-space region produced by rf phase modulation applied to a double rf system. The study can be exploited to produce rapid particle bunch broadening exhibiting longitudinal particle distribution uniformity. Hamiltonian models and particle-tracking simulations are introduced to understand the mechanism and applicability of controlled particle diffusion. When phase modulation is applied to the double rf system, regions of localized chaos are produced through the disruption and overlapping of parametric resonant islands and configured to be bounded by well-behaved invariant tori to prevent particle loss. The condition of chaoticity and the degree of particle dilution can be controlled by the rf parameters. As a result, the method has applications in alleviating adverse space-charge effects in high-intensity beams, particle bunch distribution uniformization, and industrial radiation-effects experiments.

  18. Shape effects in the turbulent tumbling of large particles

    NASA Astrophysics Data System (ADS)

    Variano, Evan; Oehmke, Theresa; Pujara, Nimish

    2017-11-01

    We present laboratory results on rotation of finite-sized, neutrally buoyant, anisotropic particles in isotropic turbulence. The isotropic turbulent flow is generated using a randomly-actuated synthetic jet array that minimizes tank scale circulation and measurements are made with stereoscopic particle image velocimetry. By using particles of different shapes, we explore the effects that symmetries have on particle rotation. We add to previous data collected for spheres, cylinders, and ellipsoids by performing new measurements on cubes, cuboids, and cones. The measurement technique and results on mean-square particle rotation will be presented. Preliminary results, at the time of writing this abstract, indicate that symmetry breaking increases the rate of particle rotation. More complete quantitative results will be presented. This work was partially supported by the NSF award ENG-1604026 and by the Army Research Office Biomathematics Program.

  19. Modeling of the charge-state separation at ITEP experimental facility for material science based on a Bernas ion source.

    PubMed

    Barminova, H Y; Saratovskyh, M S

    2016-02-01

    An experiment automation system is to be developed for the experimental facility for material science at ITEP, based on a Bernas ion source. The program CAMFT is planned to be incorporated into the experiment automation software. CAMFT was developed to simulate the motion of intense charged-particle bunches in external magnetic fields of arbitrary geometry by means of an accurate solution of the particle equations of motion. The program allows bunch intensities of up to 10¹⁰ ppb to be considered. Preliminary calculations are performed on the ITEP supercomputer. The results of the simulation of the beam pre-acceleration and the following turn in the magnetic field are presented for different initial conditions.

  20. Modeling of the charge-state separation at ITEP experimental facility for material science based on a Bernas ion source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barminova, H. Y., E-mail: barminova@bk.ru; Saratovskyh, M. S.

    2016-02-15

    An experiment automation system is to be developed for the experimental facility for material science at ITEP, based on a Bernas ion source. The program CAMFT is planned to be incorporated into the experiment automation software. CAMFT was developed to simulate the motion of intense charged-particle bunches in external magnetic fields of arbitrary geometry by means of an accurate solution of the particle equations of motion. The program allows bunch intensities of up to 10¹⁰ ppb to be considered. Preliminary calculations are performed on the ITEP supercomputer. The results of the simulation of the beam pre-acceleration and the following turn in the magnetic field are presented for different initial conditions.

  1. Predicting the soiling of modern glass in urban environments: A new physically-based model

    NASA Astrophysics Data System (ADS)

    Alfaro, S. C.; Chabas, A.; Lombardo, T.; Verney-Carron, A.; Ausset, P.

    2012-12-01

    This study revisits the measurements of the MULTI-ASSESS and Long Term Soiling programs in order to understand physically, and to model, the processes controlling the soiling of modern glass in polluted conditions. The results show a strong correlation between the size distribution of particles and the evolution of the mass deposited at the surface of the glass. Over observation periods covering more than 2 years, the mass deposition on glass panels sheltered from the rain is observed to accelerate regularly with time at the sites closest to the sources of particulate matter (Roadside sites). At these sites the deposit is also richer in coarse (supermicron) mineral particles than at more distant (Urban Background and Suburban) sites, where the contribution of submicron particles (among which a significant fraction of particulate organic matter) is larger. This size and compositional segregation probably explains why the mass accumulation tends to slow down with time and finally saturate after an estimated duration of more than 10 years at the Suburban sites. The analysis of the correlation between the measured accumulated mass and haze shows that the haze-creating mass efficiency of the deposit decreases progressively as the density of particles increases on the glass panels. This is interpreted as being a consequence of the increasing influence of multiple scattering. A steady-state is eventually obtained when layers of closely packed particles are formed, which occurs for surface masses of the order of a few tens of μg cm⁻². After this stage is reached, the haze increases linearly with further mass deposition at a pace conditioned by the size-distribution of the deposit. The parameterization of the evolution of the deposited mass with time, and of the correlation linking this mass to the haze, makes it possible to propose a new physically based model able to predict the development of the haze on sheltered glass. Finally, a comparison of the model predictions with the independent measurements performed at the experimental sites of the AERO program shows that the model is able to simulate correctly the development of the haze at a variety of urban sites ranging from the Suburban to Roadside categories. This predictive tool should help in developing conservation strategies adapted to the real environmental conditions of historical and modern buildings.

  2. Micro-valve using induced-charge electrokinetic motion of Janus particle.

    PubMed

    Daghighi, Yasaman; Li, Dongqing

    2011-09-07

    A new micro-valve using the electrokinetic motion of a Janus particle is introduced in this paper. A Janus particle with a conducting hemisphere and a non-conducting hemisphere is placed at a junction of several microchannels. Under an applied electric field, the induced-charge electrokinetic flow around the conducting side of the Janus particle forms vortices. The vortices push the particle forward to block the entrance of a microchannel. By switching the direction of the applied electric field, the motion of the Janus particle can be changed to block different microchannels. This paper develops a theoretical model and conducts numerical simulations of the three-dimensional transient motion of the Janus particle. The results show that this Janus particle-based micro-valve is feasible for switching and controlling the flow rate in a microfluidic chip. This method is simple in comparison with other types of micro-valve methods: it is easy to fabricate and to control, and it has a fast response time. To better understand the micro-valve functions, comparisons with a non-conducting particle and a fully conducting particle were made. The results showed that only a Janus particle can fulfill the requirements of such a micro-valve.

  3. Preparation of nano-hydroxyapatite particles with different morphology and their response to highly malignant melanoma cells in vitro

    NASA Astrophysics Data System (ADS)

    Li, Bo; Guo, Bo; Fan, Hongsong; Zhang, Xingdong

    2008-11-01

    To investigate the effects of nano-hydroxyapatite (HA) particles with different morphology on highly malignant melanoma cells, three kinds of HA particles with different morphology were synthesized and co-cultured with highly malignant melanoma cells, with phosphate-buffered saline (PBS) as the control. A precipitation method with or without citric acid addition as surfactant was used to produce rod-like hydroxyapatite (HA) particles of nano- and micron size, respectively, and a novel oil-in-water emulsion method was employed to prepare ellipse-like nano-HA particles. The particle morphology and size distribution of the as-prepared HA powders were characterized by transmission electron microscopy (TEM) and the dynamic light scattering technique. The nano- and micron HA particles with different morphology were co-cultured with highly malignant melanoma cells. Immunofluorescence analysis and the MTT assay were employed to evaluate morphological changes of the nucleolus and the proliferation of tumour cells, respectively. To compare the effects of HA particles on cell response, PBS without HA particles was used as the control. The experimental results indicated that the nanoscale size of the HA particles, rather than their morphology, was the more effective factor in inhibiting the proliferation of highly malignant melanoma cells.

  4. CASTNet Air Toxics Monitoring Program (CATMP): VOC and carbonyl data for July, 1993 through March, 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harlos, D.P.; Edgerton, E.S.

    1994-12-31

    The US EPA has, under the auspices of the CASTNet program (Clean Air Status and Trends Network), initiated the CASTNet Air Toxics Monitoring Program (CATMP). Volatile organic compounds (VOCs), carbonyls, and metals are sampled for 24-hour periods on a 12-day schedule using TO-14 samplers (SUMMA canisters), dinitrophenylhydrazine-coated (DNPH) sorbent cartridges, and high-volume particle samplers. Sampling began at most sites in July of 1993. The sites are operated by state and local air pollution control programs and all analysis is performed by Environmental Science and Engineering (ESE) in Gainesville, Florida. The network currently supports 15 VOC sites, of which 7 also sample carbonyls. Three sites, in Pinellas County, Florida, sample metals only. The limits of detection of 0.05 ppb for VOCs allow routine tracking of a wide range of pollutants including several greenhouse gases, transportation pollutants and photochemically-derived compounds. The sites range from major urban areas (Chicago, St. Louis) to a rural village (Waterbury, Vermont). Results of the first three quarters of VOC and carbonyl data collection are summarized in this presentation.

  5. Monitoring and forecasting of great radiation hazards for spacecraft and aircrafts by online cosmic ray data

    NASA Astrophysics Data System (ADS)

    Dorman, L. I.

    2005-11-01

    We show that an exact forecast of great radiation hazards in space, in the magnetosphere, in the atmosphere and on the ground can be made by using high-energy particles (a few GeV/nucleon and higher), whose transport from the Sun is characterized by a much larger diffusion coefficient than that of small- and middle-energy particles. Therefore, high-energy particles arrive from the Sun much earlier (8-20 min after acceleration and escape into the solar wind) than the main part of lower-energy particles (more than 30-60 min later) that cause the radiation hazard for electronics and personal health, as well as for spacecraft and aircraft. We describe here the principles of an automatic set of programs that begins with "FEP-Search", used to determine the onset of a large FEP event. After a positive signal from "FEP-Search", the following programs start working: "FEP-Research/Spectrum", and then "FEP-Research/Time of Ejection", "FEP-Research/Source" and "FEP-Research/Diffusion", which determine online the properties of FEP generation and propagation. On the basis of the obtained information, the next set of programs immediately starts to work: "FEP-Forecasting/Spacecrafts", "FEP-Forecasting/Aircrafts", "FEP-Forecasting/Ground", which determine the expected differential and integral fluxes and total fluence for spacecraft in different orbits, aircraft on different airlines, and on the ground, depending on altitude and cutoff rigidity. If the level of radiation hazard is expected to be dangerous for high-level technology and/or personal health, the following programs will be used: "FEP-Alert/Spacecrafts", "FEP-Alert/Aircrafts", "FEP-Alert/Ground".

  6. Controlling Particle Morphologies at Fluid Interfaces: Macro- and Micro- approaches

    NASA Astrophysics Data System (ADS)

    Beesabathuni, Shilpa Naidu

    The controlled generation of varying shaped particles is important for many applications: consumer goods, biomedical diagnostics, food processing, adsorbents and pharmaceuticals which can benefit from the availability of geometrically complex and chemically inhomogeneous particles. This thesis presents two approaches to spherical and non-spherical particle synthesis using macro and microfluidics. In the first approach, a droplet microfluidic technique is explored to fabricate spherical conducting polymer, polyaniline, particles with precise control over morphology and functionality. Microfluidics has recently emerged as an important alternate to the synthesis of complex particles. The conducting polymer, polyaniline, is widely used and known for its stability, high conductivity, and favorable redox properties. In this approach, monodisperse micron-sized polyaniline spherical particles were synthesized using two-phase droplet microfluidics from Aniline and Ammonium persulfate oxidative polymerization in an oil-based continuous phase. The morphology of the polymerized particles is porous in nature which can be used for encapsulation as well as controlled release applications. Encapsulation of an enzyme, glucose oxidase, was also performed using the technique to synthesize microspheres for glucose sensing. The polymer microspheres were characterized using SEM, UV-Vis and EDX to understand the relationship between their microstructure and stability. In the second approach, molten drop impact in a cooling aqueous medium to generate non-spherical particles was explored. Viscoelastic wax based materials are widely used in many applications and their performance and application depends on the particle morphology and size. The deformation of millimeter size molten wax drops as they impacted an immiscible liquid interface was investigated. Spherical molten wax drops impinged on a cooling water bath, then deformed and as a result of solidification were arrested into various shapes such as ellipsoids, mushrooms, spherulites and discs. The final morphology of the wax particles is governed by the interfacial, inertial, viscous and thermal effects, which can be studied over a range of Weber, Capillary, Reynolds and Stefan numbers. A simplified Stefan problem for a spherical drop was solved. The time required to initiate a phase transition at the interface of the molten wax and water after impact was estimated and correlated with the drop deformation history and final wax particle shape to develop a capability to predict the shape. While the microfluidic synthesis approach offers precise control over morphology and functionality, large particle throughput is a limitation. The drop impact in a liquid medium emulsion approach is limited to crosslinking or heat sensitive materials but can be extended to large scale production for industrial applications. Both approaches are simple, robust and cost effective making them viable and attractive solutions for complex particle synthesis. The choice of the approach is dependent on considerations such as particle material, size, shape, throughput and end application.

  7. Characterization of Aircraft Produced Soot and Contrails Near the Tropopause

    NASA Technical Reports Server (NTRS)

    Hallett, John; Hudson, James G.

    1997-01-01

    Participation in the SUCCESS project primarily involved development and deployment of specific instruments for characterizing jet aircraft exhaust emissions as particulates and their subsequent evolution as contrail particles, either liquid or solid, as cirrus. Observations can be conveniently considered in two categories: close to or distant from the aircraft. Thus, close to the aircraft, the exhaust is mixing through the engine turbulence with a much drier and colder environment and developing water/ice supersaturation along the trail depending on circumstances (near field), whereas distant from the aircraft (far field) the exhaust has cooled essentially to ambient temperature, the turbulence has decayed, and any particle growth or evaporation is controlled by the prevailing ambient conditions. Intermediate between these two regions the main aircraft vortices form (one on each side of the aircraft), which tend to inhibit mixing under some conditions, a region extending from a few aircraft lengths to sometimes a hundred times this distance. Our approach to the problem lay in experience gained in characterizing the smoke from hydrocarbon combustion in terms of its cloud forming properties and its potential influence on the radiation properties of the smoke and subsequent cloud from the viewpoint of reduction (absorption and scattering) of solar radiation flux leading to significant global cooling (Hudson et al 1991; Hallett and Hudson 1991). Engine exhaust contains a much smaller proportion of the fuel carbon than is sometimes present in ordinary combustion (less than 0.01% compared with 10%) and influences condensation in quite different ways, to be characterized by the cloud condensation nucleus (CCN) supersaturation spectrum. The transition to ice is to be related to the dilution of solution droplets that freeze by homogeneous nucleation at temperatures somewhat below -40 °C (Pueschel et al 1998). The subsequent growth of ice particles depends critically on temperature, supersaturation and to some extent pressure, as is demonstrated in an NSF-funded project being carried out in parallel with the work reported here. As will be discussed below, nucleation processes themselves and exhaust impurities also influence the growth of ice particles and may control some aspects of growth of ice in contrails. Instrumentation was designed to give insight into these questions and to be flown on the NASA DC-8 as a platform. In addition, a modest program was undertaken to investigate the properties of laboratory smoke produced under controlled conditions from the viewpoint of forming both CCN and CN. The composition of the smoke could be inferred from a thermal characterization technique; larger particles were captured by a Formvar replicator for detailed analysis; ice particles were captured and evaporated in flight on a new instrument, the cloudscope, to give their mass, density and impurity content.

  8. PaDe - The particle detection program

    NASA Astrophysics Data System (ADS)

    Ott, T.; Drolshagen, E.; Koschny, D.; Poppe, B.

    2016-01-01

    This paper introduces the Particle Detection program PaDe. Its aim is to analyze dust particles in the coma of the Jupiter-family comet 67P/Churyumov-Gerasimenko that were recorded by the two OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) cameras onboard the ESA spacecraft Rosetta; see e.g. Keller et al. (2007). In addition to working with the Rosetta data, the code was modified to work with images of meteors. It was tested with data recorded by the ICCs (Intensified CCD Cameras) of the CILBO-System (Canary Island Long-Baseline Observatory) on the Canary Islands; compare Koschny et al. (2013). This paper presents a new method for the position determination of the observed meteors. The PaDe program was written in Python 3.4. Its original intent is to find the trails of dust particles in space in the OSIRIS images. For that, it determines the positions where the trail starts and ends. These are found by fitting the so-called error function (Andrews, 1998) to the two edges of the intensity profiles. The positions where the intensities fall to half maximum are taken as the beginning and end of the particle trail. In the case of meteors, this method can be applied to find the leading edge of the meteor. The proposed method has the potential to increase the accuracy of the position determination of meteors dramatically. Unlike the standard method of finding the photometric center, our method is not influenced by any trails or wakes behind the meteor. This paper presents first results of this ongoing work.
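
    As a rough illustration of the edge-finding idea described above — an error-function fit whose half-maximum position marks the edge of the trail — the sketch below fits such a profile to a synthetic one-dimensional intensity cut. It is not PaDe code; the model function, the synthetic data, and the initial guesses are assumptions made only for demonstration.

        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.special import erf

        def edge_model(x, amplitude, x0, width, offset):
            """Error-function step rising from `offset` to `offset + amplitude` around x0."""
            return offset + 0.5 * amplitude * (1.0 + erf((x - x0) / (width * np.sqrt(2.0))))

        # Synthetic 1-D intensity cut across the leading edge of a trail (illustrative data).
        x = np.arange(100, dtype=float)
        rng = np.random.default_rng(0)
        profile = edge_model(x, amplitude=200.0, x0=30.0, width=2.0, offset=10.0)
        profile += rng.normal(0.0, 3.0, x.size)

        # Fit the edge; the fitted x0 is the position where the intensity reaches half maximum.
        p0 = [profile.max() - profile.min(), x[np.argmax(np.gradient(profile))], 1.0, profile.min()]
        popt, _ = curve_fit(edge_model, x, profile, p0=p0)
        print(f"leading edge (half-maximum position) at x = {popt[1]:.2f} px")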

  9. Dusty-Plasma Particle Accelerator

    NASA Technical Reports Server (NTRS)

    Foster, John E.

    2005-01-01

    A dusty-plasma apparatus is being investigated as a means of accelerating nanometer- and micrometer-sized particles. Applications for dusty-plasma particle accelerators fall into two classes: (1) simulation of a variety of rapidly moving dust particles and micrometeoroids in outer-space environments that include micrometeoroid streams, comet tails, planetary rings, and nebulae; and (2) deposition or implantation of nanoparticles on substrates for diverse industrial purposes that could include hardening, increasing thermal insulation, altering optical properties, and/or increasing permittivities of substrate materials. Relative to prior apparatuses used for similar applications, dusty-plasma particle accelerators offer such potential advantages as smaller size, lower cost, less complexity, and increased particle flux densities. A dusty-plasma particle accelerator exploits the fact that an isolated particle immersed in plasma acquires a net electric charge that depends on the relative mobilities of electrons and ions. Typically, immersion in a low-temperature, partially ionized gas, wherein the average kinetic energy of electrons exceeds that of ions, causes the particle to become negatively charged. The particle can then be accelerated by applying an appropriate electric field. A dusty-plasma particle accelerator (see figure) includes a plasma source such as a radio-frequency induction discharge apparatus containing (1) a shallow cup with a biasable electrode to hold the particles to be accelerated and (2) a holder for the substrate on which the particles are to impinge. Depending on the specific design, a pair of electrostatic-acceleration grids between the substrate and discharge plasma can be used to both collimate and further accelerate particles exiting the particle holder. Once exposed to the discharge plasma, the particles in the cup quickly acquire a negative charge. Application of a negative voltage pulse to the biasable electrode results in the initiation of a low-current, high-voltage cathode spot. Plasma pressure associated with the cathode spot, as well as the large voltage drop at the cathode spot, accelerates the charged particles toward the substrate. The ultimate kinetic energy attained by particles exiting the particle holder depends in part on the magnitude of the cathode-spot sheath potential difference, which is proportional to the magnitude of the voltage pulse, and in part on the electric charge on the dust. The magnitude of the voltage pulse can be controlled directly, whereas the particle's electric charge can be controlled indirectly by controlling the operating parameters of the plasma apparatus.
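
    The stated dependence of the exit energy on the sheath potential and on the dust charge can be illustrated with a back-of-the-envelope estimate: the grain charge is approximated by the spherical-capacitor relation Q ≈ 4πε0·a·φ, and the exit speed follows from energy conservation, v = sqrt(2QV/m). This is a simplified sketch, not the apparatus model; the grain radius, floating potential, pulse voltage, and material density below are assumed values.

        import numpy as np

        EPS0 = 8.854e-12  # vacuum permittivity, F/m

        def dust_exit_speed(radius_m, floating_potential_V, accel_voltage_V, density_kg_m3):
            """Estimate the exit speed of a charged dust grain accelerated through accel_voltage_V."""
            charge = 4.0 * np.pi * EPS0 * radius_m * abs(floating_potential_V)  # spherical-capacitor estimate
            mass = density_kg_m3 * (4.0 / 3.0) * np.pi * radius_m ** 3
            return np.sqrt(2.0 * charge * abs(accel_voltage_V) / mass)

        # Illustrative numbers: a 100 nm silica-like grain charged to -5 V by the plasma, 10 kV pulse.
        v = dust_exit_speed(radius_m=100e-9, floating_potential_V=-5.0,
                            accel_voltage_V=10e3, density_kg_m3=2200.0)
        print(f"estimated exit speed: {v:.0f} m/s")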

  10. HBTprogs Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, D; Danielewicz, P

    2002-03-15

    This is the manual for a collection of programs that can be used to invert angle-averaged (i.e., one-dimensional) two-particle correlation functions. The package consists of several programs that generate kernel matrices (basically the relative wavefunction of the pair, squared), programs that generate test correlation functions from test sources of various types, and the program that actually inverts the data using the kernel matrix.
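
    The sketch below illustrates the generic structure of such an inversion rather than the package's actual algorithm: the correlation signal is written as a kernel matrix acting on a discretized source function, and the source is recovered by Tikhonov-regularized least squares. The kernel, the grids, the test source, and the regularization strength are toy assumptions.

        import numpy as np

        # Discretize relative momentum q and pair separation r (toy grids and units).
        q = np.linspace(5.0, 100.0, 40)   # MeV/c
        r = np.linspace(0.5, 30.0, 60)    # fm
        dr = r[1] - r[0]

        # Toy kernel K(q, r): a smooth oscillatory weight standing in for the squared
        # relative wavefunction of the pair used by the real kernel-generating programs.
        K = np.sinc(np.outer(q, r) / 200.0) ** 2 * dr

        # Build a test correlation signal from a known Gaussian source, then add noise.
        true_source = np.exp(-((r - 6.0) ** 2) / (2.0 * 2.0 ** 2))
        signal = K @ true_source + np.random.default_rng(1).normal(0.0, 1e-3, q.size)

        # Tikhonov-regularized inversion: minimize |K s - signal|^2 + lam |s|^2.
        lam = 1e-3
        s_rec = np.linalg.solve(K.T @ K + lam * np.eye(r.size), K.T @ signal)
        print("recovered source peaks near r =", r[np.argmax(s_rec)], "fm")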

  11. A geometric approach to identify cavities in particle systems

    NASA Astrophysics Data System (ADS)

    Voyiatzis, Evangelos; Böhm, Michael C.; Müller-Plathe, Florian

    2015-11-01

    The implementation of a geometric algorithm to identify cavities in particle systems in an open-source python program is presented. The algorithm makes use of the Delaunay space tessellation. The present python software is based on platform-independent tools, leading to a portable program. Its successful execution provides information concerning the accessible volume fraction of the system, the size and shape of the cavities and the group of atoms forming each of them. The program can be easily incorporated into the LAMMPS software. An advantage of the present algorithm is that no a priori assumption on the cavity shape has to be made. As an example, the cavity size and shape distributions in a polyethylene melt system are presented for three spherical probe particles. This paper serves also as an introductory manual to the script. It summarizes the algorithm, its implementation, the required user-defined parameters as well as the format of the input and output files. Additionally, we demonstrate possible applications of our approach and compare its capability with the ones of well documented cavity size estimators.
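
    A minimal sketch of the underlying geometric idea (not the published script) is given below: tessellate the particle centres with scipy's Delaunay routine and keep the tetrahedra whose circumsphere, reduced by the particle radius, can still host a spherical probe. The random particle configuration, the radii, and the circumsphere criterion are simplifying assumptions.

        import numpy as np
        from scipy.spatial import Delaunay

        def cavity_simplices(points, particle_radius, probe_radius):
            """Return circumcentres and free radii of Delaunay tetrahedra that can host the probe."""
            tri = Delaunay(points)
            centres, free_radii = [], []
            for simplex in tri.simplices:
                p = points[simplex]
                A = 2.0 * (p[1:] - p[0])                        # circumcentre from |c - p_i|^2 = const
                b = np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2)
                c = np.linalg.solve(A, b)
                free = np.linalg.norm(p[0] - c) - particle_radius
                if free >= probe_radius:                        # shrunken circumsphere still fits the probe
                    centres.append(c)
                    free_radii.append(free)
            return np.array(centres), np.array(free_radii)

        # Illustrative use on a random particle configuration.
        pts = np.random.default_rng(2).uniform(0.0, 10.0, size=(500, 3))
        centres, radii = cavity_simplices(pts, particle_radius=0.4, probe_radius=0.3)
        print(f"{len(radii)} tetrahedra can host the probe; largest free radius = {radii.max():.2f}")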

  12. Investigating the adiabatic beam grouping at the NICA accelerator complex

    NASA Astrophysics Data System (ADS)

    Brovko, O. I.; Butenko, A. V.; Grebentsov, A. Yu.; Eliseev, A. V.; Meshkov, I. N.; Svetov, A. L.; Sidorin, A. O.; Slepnev, V. M.

    2016-12-01

    The NICA complex comprises the Booster and Nuclotron synchrotrons for accelerating particle beams to the required energy and the Collider machine, in which particle collisions are investigated. The experimental heavy-ion program deals with ions up to Au+79. The light-ion program deals with polarized deuterons and protons. Grouping of a beam coasting in an ion chamber is required in many parts of the complex. Beam grouping may effectively increase the longitudinal emittance and particle losses. To avoid these negative effects, various regimes of adiabatic grouping have been simulated and dedicated experiments with a deuteron beam have been conducted at the Nuclotron machine. As a result, we are able to construct and optimize the beam-grouping equipment, which provides a capture efficiency near 100% either retaining or varying the harmonic multiplicity of the HF system.

  13. Visual Basic VPython Interface: Charged Particle in a Magnetic Field

    NASA Astrophysics Data System (ADS)

    Prayaga, Chandra

    2006-12-01

    A simple Visual Basic (VB) to VPython interface is described and illustrated with the example of a charged particle in a magnetic field. This interface allows data to be passed to Python through a text file read by Python. The first component of the interface is a user-friendly data entry screen designed in VB, in which the user can input values of the charge, mass, initial position and initial velocity of the particle, and the magnetic field. Next, a command button is coded to write these values to a text file. Another command button starts the VPython program, which reads the data from the text file, numerically solves the equation of motion, and provides the 3d graphics animation. Students can use the interface to run the program several times with different data and observe changes in the motion.
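
    A minimal Python-side sketch of this file-based hand-off is given below. It assumes a hypothetical params.txt containing key=value lines written by the front end; the key names, the sample values, and the simple explicit integrator are illustrative and are not taken from the article.

        import numpy as np

        # Stand-in for the file the GUI would write (key names are assumptions).
        with open("params.txt", "w") as fh:
            fh.write("charge=1.6e-19\nmass=1.67e-27\nposition=0,0,0\nvelocity=1e5,0,1e4\nfield=0,0,0.1\n")

        def read_params(path="params.txt"):
            """Parse the key=value text file into numpy arrays."""
            params = {}
            with open(path) as fh:
                for line in fh:
                    key, _, value = line.partition("=")
                    if value.strip():
                        params[key.strip()] = np.array([float(v) for v in value.split(",")])
            return params

        def run(params, dt=1e-9, steps=2000):
            """Integrate m dv/dt = q v x B with a simple explicit update (adequate for a short demo)."""
            q, m = params["charge"][0], params["mass"][0]
            r, v, B = params["position"], params["velocity"], params["field"]
            trajectory = []
            for _ in range(steps):
                v = v + dt * (q / m) * np.cross(v, B)
                r = r + dt * v
                trajectory.append(r.copy())
            return np.array(trajectory)

        traj = run(read_params())
        print("x-extent of the computed orbit [m]:", traj[:, 0].max() - traj[:, 0].min())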

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephen Seong Lee

    Fuel flow to individual burners is complicated and difficult to determine on coal-fired boilers, since coal solids are transported in a gas suspension that is governed by the complex physics of two-phase flow. The objective of the project was the measurement of suspended coal solids-flows under simulated test conditions. Various extractive methods were performed manually and can give only a snapshot result of fuel distribution. In order to measure particle diameter and velocity, a laser-based phase-Doppler particle analyzer (PDPA) and particle image velocimetry (PIV) were carefully applied. Statistical methods were used to analyze particle characteristics to see which factors have a significant effect. A transparent duct model was carefully designed and fabricated for the laser-based instrumentation of solids-flow monitoring (LISM). The experiments were conducted with two different kinds of particles with four different particle diameters. The particle types were organic particles and sawdust particles with diameter ranges of 75-150 micron, 150-250 micron, 250-355 micron and 355-425 micron. The densities of the particles were measured to see how the densities affected the test results. The experiment was also conducted with humid particles and fog particles. To generate humid particles, a humidifier was used; a pipe was connected to the humidifier to lead the particle flow to the intersection of the laser beams. The test results for particle diameter indicated that the mean diameter of the humid particles was between 6.1703 microns and 6.6947 microns when the humid particle flow was low. When the humid particle flow was high, the mean diameter was between 6.6728 microns and 7.1872 microns. The test results for particle mean velocity indicated that the mean velocity was between 1.3394 m/sec and 1.4556 m/sec at low humid particle flow. When the humid particle flow was high, the mean velocity was between 1.5694 m/sec and 1.7856 m/sec. The Air Flow Module TQ AF 17 and Shell Ondina oil were used to generate fog particles. After the oil was heated inside the fog generator, a blower was used to generate the fog, which flowed along the pipe to the intersection of the laser beams. The mean diameter of the fog particles was 5.765 microns; compared with the humid particle diameter, the mean diameter of the fog particles was smaller. The mean velocity of the fog particles was about 3.76 m/sec, greater than the mean velocity of the humid particles. Further experiments were conducted with four different kinds of particles with five different particle diameters. The particle types were organic particles, coal particles, potato particles and wheat particles with diameter ranges of 63-75 micron, less than 150 micron, 150-250 micron, 250-355 micron and 355-425 micron. To control the flow rate, the control gate of the particle-dispensing hopper was adjusted to a 1/16 open rate, 1/8 open rate and 1/4 open rate. The captured image range was 0 cm to 5 cm from the control gate, 5 cm to 10 cm from the control gate and 10 cm to 15 cm from the control gate. Some of these experiments were conducted under both open environment conditions and closed environment conditions. Thus these experiments had a total of five parameters: type of particles, diameter of particles, flow rate, observation range, and environment conditions.
For the coal particles (diameter between 63 and 75 microns) tested under the closed environment condition, three factors were considered as affecting factors: open rate, observation range, and environment conditions. In this experiment, the interaction of open rate and observation range had a significant effect on the lower limit. On the upper limit, the open rate and the environment conditions had a significant effect; in addition, the interaction of open rate and environment conditions had a significant effect. For the coal particles (diameter between 63 and 75 microns) tested under the open environment condition, two factors were considered as affecting factors: the open rate and the observation range. In this experiment, there was no significant effect on the lower limit. On the upper limit, the observation range had a significant effect; in addition, the interaction of open rate and observation range had a significant effect as a source of variation at the 95% confidence level, based on analysis of variance (ANOVA) results.
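
    The factorial analysis summarized above can be reproduced in outline with a standard two-way ANOVA. The sketch below uses statsmodels on synthetic stand-in data, since the project's measurements are not reproduced here; the column names, factor levels, and effect sizes are assumptions.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        # Synthetic stand-in for the measured velocities: two factors, three levels each, five repeats.
        rng = np.random.default_rng(3)
        open_rates = ["1/16", "1/8", "1/4"]
        obs_ranges = ["0-5 cm", "5-10 cm", "10-15 cm"]
        rows = []
        for i, orate in enumerate(open_rates):
            for j, obs in enumerate(obs_ranges):
                for _ in range(5):
                    rows.append({"open_rate": orate, "obs_range": obs,
                                 "velocity": 1.4 + 0.2 * i + 0.1 * j + rng.normal(0.0, 0.05)})
        df = pd.DataFrame(rows)

        # Two-way ANOVA with interaction, judged at the 95% confidence level as in the report.
        model = smf.ols("velocity ~ C(open_rate) * C(obs_range)", data=df).fit()
        print(sm.stats.anova_lm(model, typ=2))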

  15. The role of sorption and bacteria in mercury partitioning and bioavailability in artificial sediments.

    PubMed

    Zhong, Huan; Wang, Wen-Xiong

    2009-03-01

    This study compared the relative importance of three types of sorption (organic matter-particle, mercury-organic matter and mercury-particle) in controlling the overall mercury partitioning and bioavailability in sediments. We found that all three types of sorption were important for both inorganic mercury (Hg) and methylated mercury (MeHg). Mercury-particle sorption was more important than mercury-fulvic acid (FA) sorption in increasing the mercury concentrations with increasing aging. Bioavailability (quantified by gut juice extraction from sipunculans) was mainly controlled by mercury-particle sorption, while FA-particle and mercury-FA sorption were not as important, especially for MeHg. Bacterial activity also increased the partitioning of Hg or MeHg in the sediments and was further facilitated by the presence of organic matter. The bioavailability of Hg or MeHg from sediments was only slightly influenced by bacterial activity. This study highlights the importance of sorption from various sources (especially mercury-particle sorption) as well as bacteria in controlling the partitioning and bioavailability of Hg or MeHg in sediments.

  16. Blue nano titania made in diffusion flames.

    PubMed

    Teleki, Alexandra; Pratsinis, Sotiris E

    2009-05-21

    Blue titanium suboxide nanoparticles (including Magneli phases) were formed directly without any post-processing or addition of dopants by combustion of titanium-tetra-isopropoxide (TTIP) vapor at atmospheric pressure. Particle size, phase composition, rutile and anatase crystal sizes as well as the blue coloration were controlled by rapid quenching of the flame with a critical flow nozzle placed at various heights above the burner. The particles showed a broad absorption in the near-infrared region and retained their blue color upon storage in ambient atmosphere. A high concentration of paramagnetic Ti3+ centres was found in the substoichiometric particles by electron paramagnetic resonance (EPR) spectroscopy. Furthermore particles with controlled band gap energy from 3.2 to 3.6 eV were made by controlling the burner-nozzle-distance from 10 to 1 cm, respectively. The color robustness and extent of suboxidation could be further enhanced by co-oxidation of TTIP with hexamethyldisiloxane in the flame resulting in SiO2-coated titanium suboxide particles. The process is cost-effective and green while the particles produced can replace traditional blue colored, cobalt-containing pigments.

  17. Electrohydrodynamic controlled assembly and fracturing of thin colloidal particle films confined at drop interfaces

    NASA Astrophysics Data System (ADS)

    Rozynek, Z.; Dommersnes, P.; Mikkelsen, A.; Michels, L.; Fossum, J. O.

    2014-09-01

    Particles can adsorb strongly at liquid interfaces due to capillary forces, which in practice can confine the particles to the interface. Here we investigate the electrohydrodynamic flow-driven packing and deformation of colloidal particle layers confined at the surface of liquid drops. The electrohydrodynamic flow has a stagnation point at the drop equator, leading to assembly of particles in a ribbon-shaped film. The flow is entirely controlled by the electric field, and we demonstrate that AC fields can be used to induce hydrodynamic "shaking" of the colloidal particle film. We find that the mechanical properties of the film are highly dependent on the particles: monodisperse polystyrene beads form packed granular monolayers which "liquefy" upon shaking, whereas clay mineral particles form cohesive films that fracture upon shaking. The results are expected to be relevant for understanding the mechanics and rheology of particle-stabilized emulsions. Supplementary material in the form of a PDF file is available from the Journal web page at http://dx.doi.org/10.1140/epjst/e2014-02231-x

  18. A New Cluster Analysis-Marker-Controlled Watershed Method for Separating Particles of Granular Soils

    PubMed Central

    Alam, Md Ferdous

    2017-01-01

    An accurate determination of particle-level fabric of granular soils from tomography data requires a maximum correct separation of particles. The popular marker-controlled watershed separation method is widely used to separate particles. However, the watershed method alone is not capable of producing the maximum separation of particles when subjected to boundary stresses leading to crushing of particles. In this paper, a new separation method, named as Monash Particle Separation Method (MPSM), has been introduced. The new method automatically determines the optimal contrast coefficient based on cluster evaluation framework to produce the maximum accurate separation outcomes. Finally, the particles which could not be separated by the optimal contrast coefficient were separated by integrating cuboid markers generated from the clustering by Gaussian mixture models into the routine watershed method. The MPSM was validated on a uniformly graded sand volume subjected to one-dimensional compression loading up to 32 MPa. It was demonstrated that the MPSM is capable of producing the best possible separation of particles required for the fabric analysis. PMID:29057823
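
    For orientation, the sketch below shows the standard marker-controlled watershed workflow that the MPSM builds on — distance transform, marker seeding, watershed — using scikit-image and scipy on a toy volume of two overlapping spheres. The threshold parameter merely stands in for the optimal contrast coefficient selected by the cluster-evaluation step, and the Gaussian-mixture marker refinement is not reproduced here.

        import numpy as np
        from scipy import ndimage as ndi
        from skimage.feature import peak_local_max
        from skimage.segmentation import watershed

        def separate_particles(binary_volume, contrast_coefficient=0.5):
            """Marker-controlled watershed on a binary particle volume.

            contrast_coefficient scales the distance-transform threshold used to seed markers,
            standing in for the optimal contrast coefficient chosen by cluster evaluation in MPSM.
            """
            distance = ndi.distance_transform_edt(binary_volume)
            coords = peak_local_max(distance, min_distance=5,
                                    threshold_abs=contrast_coefficient * distance.max(),
                                    labels=ndi.label(binary_volume)[0])
            markers = np.zeros(binary_volume.shape, dtype=int)
            markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
            return watershed(-distance, markers, mask=binary_volume)

        # Toy example: two overlapping spheres that a plain connected-component labelling cannot separate.
        zz, yy, xx = np.mgrid[0:40, 0:40, 0:40]
        blob = (((xx - 15) ** 2 + (yy - 20) ** 2 + (zz - 20) ** 2) < 81) | \
               (((xx - 27) ** 2 + (yy - 20) ** 2 + (zz - 20) ** 2) < 81)
        labels = separate_particles(blob)
        print("separated particles:", labels.max())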

  19. Determination of time zero from a charged particle detector

    DOEpatents

    Green, Jesse Andrew [Los Alamos, NM

    2011-03-15

    A method, system and computer program are used to determine a linear track that best fits the most likely or expected path of a charged particle passing through a charged particle detector having a plurality of drift cells. Hit signals from the charged particle detector are associated with a particular charged particle track. An initial estimate of time zero is made from these hit signals, and linear tracks are then fit to the drift radii for each particular time-zero estimate. The linear track having the best fit is then selected, and the errors in the fit and the tracking parameters are computed. By adopting this method and system, the large and expensive fast detectors otherwise needed to determine time zero in charged particle detectors can be avoided.
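
    A simplified sketch of this scan is given below: for each candidate time zero, the drift times are converted to drift radii, a straight track is fitted so that its distance to each wire matches those radii, and the candidate with the smallest residual is kept. The drift velocity, wire layout, and toy event are assumptions and do not reproduce the patented implementation.

        import numpy as np
        from scipy.optimize import minimize

        V_DRIFT = 0.05  # assumed drift velocity, mm/ns

        def track_residual(params, wires, radii):
            """Sum of squared differences between wire-to-line distances and drift radii."""
            theta, d = params
            dist = np.abs(wires[:, 0] * np.cos(theta) + wires[:, 1] * np.sin(theta) - d)
            return np.sum((dist - radii) ** 2)

        def best_time_zero(wires, drift_times, t0_candidates):
            """Scan candidate time zeros; keep the one whose fitted linear track matches the radii best."""
            best = (np.inf, None, None)
            for t0 in t0_candidates:
                radii = V_DRIFT * np.clip(drift_times - t0, 0.0, None)
                fit = minimize(track_residual, x0=[0.1, 0.0], args=(wires, radii), method="Nelder-Mead")
                if fit.fun < best[0]:
                    best = (fit.fun, t0, fit.x)
            return best  # (residual, time zero, track parameters)

        # Toy event: hits generated from a known track and a true time zero of 40 ns, plus timing noise.
        rng = np.random.default_rng(4)
        wires = rng.uniform(0.0, 100.0, size=(12, 2))             # drift-cell wire positions, mm
        true_radii = np.abs(wires[:, 0] * np.cos(0.3) + wires[:, 1] * np.sin(0.3) - 45.0)
        drift_times = 40.0 + true_radii / V_DRIFT + rng.normal(0.0, 0.5, len(wires))

        residual, t0_est, track = best_time_zero(wires, drift_times, np.arange(0.0, 80.0, 1.0))
        print(f"estimated time zero: {t0_est:.0f} ns (true value 40 ns)")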

  20. Carbon Explorer Assessment of Carbon Biomass Variability and Carbon Flux Systematics in the Upper Ocean During SOFEX

    NASA Astrophysics Data System (ADS)

    Bishop, J. K.; Wood, T. J.; Sherman, J. T.

    2002-12-01

    Three autonomous Carbon Explorers built on SIO's Orbcomm/GPS enhanced Sounding Oceanographic Lagrangian Observer were launched near 55S 172W in the "North" SOFEX experiment area in early January 2002. All Explorers at 55S were programmed to perform profiles from 1000, 300, and 300 m with surfacings, GPS position, and telemetry of profile data initiating at local 0600, 1200, and 1800 hours. The floats were programmed to 'sleep' at 100 m depth between profiles to maximize tracking of the surface layer. Each Explorer carried SeaBird T and S sensors and was additionally fitted with a WETLabs transmissometer-based "POC" sensor and a Seapoint scattering meter to assess particulate matter variability. A carbon flux "index" obtained during the 100 m sleep periods was also derived from the POC sensor readings. Explorer 1177 was deployed as a control outside of Fe-treated waters on Jan 11 2002 (UTC) and drifted initially to the northeast at 10 cm/sec. Explorer 2104, deployed on Jan 19 2002 after the 3rd Fe infusion, advected with the patch to the NE on a course that closely paralleled that of the "control". By Feb 8 2002, the two floats had drifted with the circumpolar current nearly 200 km; Explorer 2104 had recorded a 4-fold build-up of particles in the upper 60 m whereas records from the nearby control Explorer 1177 showed little change. Ship survey data (Revelle) indicated that Explorer 2104 was near but "in" the trailing edge of the patch. Beginning Feb 14 (several days after the 4th infusion of Fe) and ending on Feb 24 2002, Explorer 2104 data showed isolines of POC concentration beginning to deepen in waters below 60 m and a coincident loss of POC from above; the POC flux index also began to show clearly different and enhanced 'spikes' compared to that recorded by the control. The spikes either reflected temporal variability of particle export from the patch or the intermingled sampling of the "in patch" settling plume of particles and "out-of-patch" background flux. Preliminary analysis of the POC flux index integrated over time since the initial Fe amendment indicated a >2-fold enhancement of export from the iron-treated waters. If the Explorer was indeed sampling the plume of sinking material intermittently, then the true export enhancement from the patch would be considerably greater. The last 'trace' of Fe-treated waters was seen in early March 2002. Explorers 1177 and 2104 continue operations in the howling 50's of the Southern Ocean 8+ months after their deployment.

  1. Synthesis and self-assembly of Janus and patchy colloidal particles

    NASA Astrophysics Data System (ADS)

    Jiang, Shan

    Colloidal particles are considered classically as spherical particles with homogeneous surface chemistry. When this is so, the interactions between particles are isotropic and governed only by their separations. One can take advantage of this to simulate atoms, visualizing them one-by-one in a microscope, albeit at a larger length scale and longer time scale than for true atoms. However, if the particles are not homogeneous, but Janus or patchy instead, with different surface chemistry on different hemispheres or otherwise different surface sites that are addressably controlled, the interactions between these particles depend not only on their separation, but also on their orientation. Research on Janus and patchy colloidal particles has opened a new chapter in the colloid research field, allowing us to mimic the behavior of these colloidal analogues of molecules, and in this way to ask new and exciting questions of condensed matter physics. In this dissertation, I investigated the synthesis and self-assembly of Janus and patchy colloidal particles with emphasis on Janus amphiphilic particles, which are the colloidal counterpart of surfactant molecules. Improving the scale-up capability, and also the capacity to control the geometry of Janus particles, I developed a simple and versatile method to synthesize Janus particles using an approach based on Pickering emulsions with particles adsorbed at the liquid-liquid interface. I showed that this method can be scaled up to synthesize Janus particles in large quantity. Also, the Janus balance can be predictably controlled by adding surfactant molecules during emulsification. In addition, going beyond the Janus geometry, I developed another synthetic method to fabricate trivalent patchy colloidal particles using micro-contact printing. With these synthetic methods in hand, I explored the self-assembly of Janus amphiphilic particles in aqueous solutions, while controlling systematically the salt concentration, the particle concentration, and the Janus balance. Various cluster and chain structures were observed. Using in situ optical microscopy, I found these structures to be dynamic in structure, in this respect analogous to the micelles formed by small surfactant molecules. A qualitative explanation about the possible underlying mechanism was proposed, based on considering the tradeoff between enthalpy gain from hydrophobic contacts, and entropy involving rotational orientation between neighboring particles. Monolayer crystals of Janus amphiphilic particles were investigated in a system of silica-based particles. Regarding positional order, these particles adopted a conventional hexagonal packing, but their orientations formed strikingly ordered linear clusters that extended the length of tens of particles. Study of their rotational dynamics using single particle tracking showed rotation to be strongly coupled between adjacent particles, with a correlation length extending to several particle diameters. This is a beautiful example of a unique physical phenomenon that simply does not exist when dealing with classical particles whose surface chemical makeup is homogeneous. At the oil-water interface, Janus amphiphilic particles adsorb strongly. With simple calculations, I showed that the adsorption energy depends not only on surface tension but also on the Janus balance. I developed a rigorous mathematical definition of "Janus balance" that may find application in emulsions stabilized by Janus particles.
On the experimental side, I performed experiments to quantify the efficacy of Janus particles to stabilize emulsions for extended times.

  2. Sound controlled rotation of a cluster of small particles on an ultrasonically vibrating metal strip

    NASA Astrophysics Data System (ADS)

    Zhang, Xueyi; Zheng, Yun; Hu, Junhui

    2008-01-01

    We show that a vibrating metal strip, mechanically driven by an ultrasonic transducer, can rotate a cluster of small particles around a fixed point, and the diameter of the cluster of small particles can reach a stable value (steady diameter) for a given driving condition. The rotation is very stable when the vibration of the metal strip is appropriate. The revolution speed, its direction, and steady diameter of the particle cluster can be controlled by the operating frequency of the ultrasonic transducer. For shrimp eggs, a revolution speed up to 360rpm can be obtained.

  3. Simultaneous Control of Multispecies Particle Transport and Segregation in Driven Lattices

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Aritra K.; Liebchen, Benno; Schmelcher, Peter

    2018-05-01

    We provide a generic scheme to separate the particles of a mixture by their physical properties like mass, friction, or size. The scheme employs a periodically shaken two-dimensional dissipative lattice and hinges on a simultaneous transport of particles in species-specific directions. This selective transport is achieved by controlling the late-time nonlinear particle dynamics, via the attractors embedded in the phase space and their bifurcations. To illustrate the spectrum of possible applications of the scheme, we exemplarily demonstrate the separation of polydisperse colloids and mixtures of cold thermal alkali atoms in optical lattices.

  4. Pair and triple correlations in the A+B-->B diffusion-controlled reaction

    NASA Astrophysics Data System (ADS)

    Kuzovkov, Vladimir; Kotomin, Eugene

    1994-03-01

    An exact solution for the one-dimensional kinetics of the diffusion-controlled reaction A+B-->B is obtained by means of three-particle correlation functions. Because of lattice discreteness, each site can be occupied by a single particle only, which leads to the so-called "bus effect": recombination of any particle A is determined solely by the spatial configuration of the two nearest particles B surrounding A on its left and right. This results in the unusual algebraic decay law n(t) ~ t^-1, which asymptotically (as t --> ∞) does not depend on the trap (B) concentration.

  5. MO-F-CAMPUS-T-05: Design of An Innovative Beam Monitor for Particle Therapy for the Simultaneous Measurement of Beam Fluence and Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sacchi, R; Guarachi, L Fanola; Monaco, V

    2015-06-15

    Purpose: Monitoring the prescribed dose in particle therapy is typically carried out by using parallel-plate ionization chambers working in transmission mode. The use of gas detectors has several drawbacks: they need to be calibrated daily against standard dosimeters, and their dependence on beam quality factors needs to be fully characterized and controlled with high accuracy. A detector capable of single-particle counting is proposed which would overcome all these limitations. Combined with a gas ionization chamber, it will allow determining the average particle stopping power, thus providing an effective method for the online verification of the selected particle energy and range. Methods: Low-Gain Avalanche Detectors (LGADs) are innovative n-in-p silicon sensors with moderate internal charge multiplication occurring in the strong field generated by an additional p+ doping layer implanted at a depth of a few µm in the bulk of the sensor. The increased signal-to-noise ratio allows the design of very thin (a few tens of microns) segmented LGADs, called Ultra Fast Silicon Detectors (UFSD), optimized for very fast signals, which would be suitable for charged particle counting at high rates. A prototype UFSD is being designed for this purpose. Results: Different LGAD diodes have been characterized both in laboratory and beam tests, and the results compared both with those obtained with similar diodes without the gain layer and with a program simulating the signal in the sensors. The signal is found to be enhanced in LGADs, while the leakage current and the noise are not affected by the gain. Possible alternative designs and implementations are also presented and discussed. Conclusion: Thanks to their excellent counting capabilities, UFSD detectors are a promising technology for future beam monitor devices in hadron-therapy applications. Studies are ongoing to better understand their properties and optimize the design in view of this application.

  6. A review of tephra transport and dispersal models: Evolution, current status, and future perspectives

    NASA Astrophysics Data System (ADS)

    Folch, A.

    2012-08-01

    Tephra transport models try to predict atmospheric dispersion and sedimentation of tephra depending on meteorology, particle properties, and eruption characteristics, defined by eruption column height, mass eruption rate, and vertical distribution of mass. Models are used for different purposes, from operational forecast of volcanic ash clouds to hazard assessment of tephra dispersion and fallout. The size of the erupted particles, a key parameter controlling the dynamics of particle sedimentation in the atmosphere, varies within a wide range. The largest, centimetric to millimetric, particles fall out at proximal to medial distances from the volcano and sediment by gravitational settling. At the other extreme, the smallest, micrometric to sub-micrometric, particles can be transported at continental or even at global scales and are affected by other deposition and aggregation mechanisms. Different scientific communities have traditionally modeled the dispersion of these two end members. Volcanologists developed families of models suitable for lapilli and coarse ash, aimed at computing fallout deposits and at hazard assessment. In contrast, meteorologists and atmospheric scientists have traditionally used other atmospheric transport models, dealing with finer particles, for tracking motion of volcanic ash clouds and, eventually, for computing airborne ash concentrations. During the last decade, the increasing demand for model accuracy and forecast reliability has pushed on two fronts. First, the original gap between these different families of models has been filled with the emergence of multi-scale and multi-purpose models. Second, new modeling strategies, including, for example, ensemble and probabilistic forecasting or model data assimilation, are being investigated for future implementation in models and/or modeling strategies. This paper reviews the evolution of tephra transport and dispersal models during the last two decades, presents the status and limitations of the current modeling strategies, and discusses some emergent perspectives expected to be implemented at operational level during the next few years. Improvements in both real-time forecasting and long-term hazard assessment are necessary for loss prevention programs at the local, regional, national and international levels.

  7. Interplanetary magnetic field control of mantle precipitation and associated field-aligned currents

    NASA Technical Reports Server (NTRS)

    Xu, Dingan; Kivelson, Margaret G.; Walker, Ray J.; Newell, Patrick T.; Meng, C.-I.

    1995-01-01

    Dayside reconnection, which is particularly effective for a southward interplanetary magnetic field (IMF), allows magnetosheath particles to enter the magnetosphere where they form the plasma mantle. The motions of the reconnected flux tube produce convective flows in the ionosphere. It is known that the convection patterns in the polar cap are skewed to the dawnside for a positive IMF B(sub y) (or duskside for a negative IMF B(sub y)) in the northern polar cap. Correspondingly, one would expect to find asymmetric distributions of mantle particle precipitation, but previous results have been unclear. In this paper the correlation between B(sub y) and the distribution of mantle particle precipitation is studied for steady IMF conditions with southward IMF. Ion and electron data from the Defense Meteorological Satellite Program (DMSP) F6 and F7 satellites are used to identify the mantle region and IMP 8 is used as a solar wind monitor to characterize the IMF. We study the local time extension of mantle precipitation in the prenoon and postnoon regions. We find that, in accordance with theoretical expectations for a positive (negative) IMF B(sub y), mantle particle precipitation mainly appears in the prenoon region of the northern (southern) hemisphere. The mantle particle precipitation can extend to as early as 0600 magnetic local time (MLT) in the prenoon region but extends over a smaller local time region in the postnoon sector (we did not find mantle plasma beyond 1600 MLT in our data set although coverage is scant in this area). Magnetometer data from F7 are used to determine whether part of the region 1 current flows on open field lines. We find that at times part of the region 1 sense current extends into the region of mantle particle precipitation, and is therefore on open field lines. In other cases, region 1 currents are absent on open field lines. Most of the observed features can be readily interpreted in terms of the open magnetosphere model.

  8. MC-TESTER v. 1.23: A universal tool for comparisons of Monte Carlo predictions for particle decays in high energy physics

    NASA Astrophysics Data System (ADS)

    Davidson, N.; Golonka, P.; Przedziński, T.; Wąs, Z.

    2011-03-01

    Theoretical predictions in high energy physics are routinely provided in the form of Monte Carlo generators. Comparisons of predictions from different programs and/or different initialization set-ups are often necessary. MC-TESTER can be used for such tests of decays of intermediate states (particles or resonances) in a semi-automated way. Since 2002, new functionalities have been introduced into the package. In particular, it now works with the HepMC event record, the standard for C++ programs. The complete set-up for benchmarking the interfaces, such as the interface between τ-lepton production and decay, including QED bremsstrahlung effects, is shown. The example is chosen to illustrate the new options introduced into the program. From the technical perspective, our paper documents software updates and supplements previous documentation. As in the past, our test consists of two steps. Distinct Monte Carlo programs are run separately; events with decays of a chosen particle are searched for, and information is stored by MC-TESTER. Then, at the analysis step, information from a pair of runs may be compared and represented in the form of tables and plots. Updates introduced in the program up to version 1.24.4 are also documented. In particular, new configuration scripts and a script to combine results from a multitude of runs into a single information file to be used in the analysis step are explained. Program summary: Program title: MC-TESTER, version 1.23 and version 1.24.4. Catalog identifier: ADSM_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADSM_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 250 548. No. of bytes in distributed program, including test data, etc.: 4 290 610. Distribution format: tar.gz. Programming language: C++, FORTRAN77. Tested and compiled with: gcc 3.4.6, 4.2.4 and 4.3.2 with g77/gfortran. Computer: Tested on various platforms. Operating system: Tested on Linux SLC 4.6 and SLC 5, Fedora 8, Ubuntu 8.2, etc. Classification: 11.9. External routines: HepMC (https://savannah.cern.ch/projects/hepmc/), PYTHIA8 (http://home.thep.lu.se/~torbjorn/Pythia.html), LaTeX (http://www.latex-project.org/). Catalog identifier of previous version: ADSM_v1_0. Journal reference of previous version: Comput. Phys. Comm. 157 (2004) 39. Does the new version supersede the previous version?: Yes. Nature of problem: The decays of individual particles are well defined modules of a typical Monte Carlo program chain in high energy physics. A fast, semi-automatic way of comparing results from different programs is often desirable for the development of new programs, in order to check correctness of the installations or for discussion of uncertainties. Solution method: A typical HEP Monte Carlo program stores the generated events in event records such as HepMC, HEPEVT or PYJETS. MC-TESTER scans, event by event, the contents of the record and searches for the decays of the particle under study. The list of the found decay modes is successively incremented, and histograms of all invariant masses which can be calculated from the momenta of the particle decay products are defined and filled. The outputs from the two runs of distinct programs can be later compared.
A booklet of comparisons is created: for every decay channel, all histograms present in the two outputs are plotted and a parameter quantifying the shape difference is calculated. Its maximum over every decay channel is printed in the summary table. Reasons for new version: An interface for the HepMC event record is introduced. The set-up for benchmarking the interfaces, such as τ-lepton production and decay, including QED bremsstrahlung effects, is introduced as well. This required significant changes in the algorithm. As a consequence, a new version of the code was introduced. Restrictions: Only the first 200 decay channels that were found will initialize histograms, and if the multiplicity of decay products in a given channel was larger than 7, histograms will not be created for that channel. Additional comments: New features: HepMC interface, use of lists in definition of histograms and decay channels, filters for decay products or secondary decays to be omitted, bug fixing, extended flexibility in representation of program output, installation configuration scripts, merging multiple output files from separate generations. Running time: Varies substantially with the analyzed decay particle, but generally the speed estimate for the old version remains valid. On a PC/Linux with 2.0 GHz processors MC-TESTER increases the run time of the τ-lepton Monte Carlo program TAUOLA by 4.0 seconds for every 100 000 analyzed events (generation itself takes 26 seconds). The analysis step takes 13 seconds; LaTeX processing takes an additional 10 seconds. Generation step runs may be executed simultaneously on multiprocessor machines.
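
    The comparison step can be sketched generically as follows: build histograms of every invariant mass that can be formed from a decay's products in two samples, then compute a per-histogram shape-difference number. The binwise metric, the toy four-momenta, and the function names below are illustrative assumptions and do not reproduce MC-TESTER's actual shape-difference parameter.

        import itertools
        import numpy as np

        def invariant_mass(p4s):
            """Invariant mass of a set of four-momenta given as (E, px, py, pz)."""
            total = np.sum(p4s, axis=0)
            return np.sqrt(max(total[0] ** 2 - np.dot(total[1:], total[1:]), 0.0))

        def mass_histograms(events, bins):
            """Histogram every invariant mass that can be built from each event's decay products."""
            masses = {}
            for products in events:  # products: list of four-momentum arrays
                for n in range(2, len(products) + 1):
                    for combo in itertools.combinations(range(len(products)), n):
                        masses.setdefault(combo, []).append(invariant_mass([products[i] for i in combo]))
            return {c: np.histogram(m, bins=bins, density=True)[0] for c, m in masses.items()}

        def shape_difference(h1, h2):
            """Simple binwise shape-difference metric in [0, 1] (illustrative, not MC-TESTER's SDP)."""
            return 0.5 * np.sum(np.abs(h1 / h1.sum() - h2 / h2.sum()))

        # Compare two toy samples of three-body decays (the four-momenta are random stand-ins).
        rng = np.random.default_rng(5)
        def toy_sample(scale):
            return [[np.array([rng.uniform(1.0, 2.0) * scale, *rng.normal(0.0, 0.3, 3)]) for _ in range(3)]
                    for _ in range(2000)]

        bins = np.linspace(0.0, 8.0, 40)
        h_a, h_b = mass_histograms(toy_sample(1.0), bins), mass_histograms(toy_sample(1.1), bins)
        for combo in h_a:
            print(combo, f"shape difference = {shape_difference(h_a[combo], h_b[combo]):.3f}")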

  9. The role of jet and film drops in controlling the mixing state of submicron sea spray aerosol particles

    NASA Astrophysics Data System (ADS)

    Wang, Xiaofei; Deane, Grant B.; Moore, Kathryn A.; Ryder, Olivia S.; Stokes, M. Dale; Beall, Charlotte M.; Collins, Douglas B.; Santander, Mitchell V.; Burrows, Susannah M.; Sultana, Camille M.; Prather, Kimberly A.

    2017-07-01

    The oceans represent a significant global source of atmospheric aerosols. Sea spray aerosol (SSA) particles comprise sea salts and organic species in varying proportions. In addition to size, the overall composition of SSA particles determines how effectively they can form cloud droplets and ice crystals. Thus, understanding the factors controlling SSA composition is critical to predicting aerosol impacts on clouds and climate. It is often assumed that submicrometer SSAs are mainly formed by film drops produced from bursting bubble-cap films, which become enriched with hydrophobic organic species contained within the sea surface microlayer. In contrast, jet drops formed from the base of bursting bubbles are postulated to mainly produce larger supermicrometer particles from bulk seawater, which comprises largely salts and water-soluble organic species. However, here we demonstrate that jet drops produce up to 43% of total submicrometer SSA number concentrations, and that the fraction of SSA produced by jet drops can be modulated by marine biological activity. We show that the chemical composition, organic volume fraction, and ice nucleating ability of submicrometer particles from jet drops differ from those formed from film drops. Thus, the chemical composition of a substantial fraction of submicrometer particles will not be controlled by the composition of the sea surface microlayer, a major assumption in previous studies. This finding has significant ramifications for understanding the factors controlling the mixing state of submicrometer SSA particles and must be taken into consideration when predicting SSA impacts on clouds and climate.

  10. CICART Center For Integrated Computation And Analysis Of Reconnection And Turbulence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhattacharjee, Amitava

    CICART is a partnership between the University of New Hampshire (UNH) and Dartmouth College. CICART addresses two important science needs of the DOE: the basic understanding of magnetic reconnection and turbulence that strongly impacts the performance of fusion plasmas, and the development of new mathematical and computational tools that enable the modeling and control of these phenomena. The principal participants of CICART constitute an interdisciplinary group, drawn from the communities of applied mathematics, astrophysics, computational physics, fluid dynamics, and fusion physics. It is a main premise of CICART that fundamental aspects of magnetic reconnection and turbulence in fusion devices, smaller-scale laboratory experiments, and space and astrophysical plasmas can be viewed from a common perspective, and that progress in understanding in any of these interconnected fields is likely to lead to progress in others. The establishment of CICART has strongly impacted the education and research mission of a new Program in Integrated Applied Mathematics in the College of Engineering and Applied Sciences at UNH by enabling the recruitment of a tenure-track faculty member, supported equally by UNH and CICART, and the establishment of an IBM-UNH Computing Alliance. The proposed areas of research in magnetic reconnection and turbulence in astrophysical, space, and laboratory plasmas include the following topics: (A) Reconnection and secondary instabilities in large high-Lundquist-number plasmas, (B) Particle acceleration in the presence of multiple magnetic islands, (C) Gyrokinetic reconnection: comparison with fluid and particle-in-cell models, (D) Imbalanced turbulence, (E) Ion heating, and (F) Turbulence in laboratory (including fusion-relevant) experiments. These theoretical studies make active use of three high-performance computer simulation codes: (1) The Magnetic Reconnection Code, based on extended two-fluid (or Hall MHD) equations, in an Adaptive Mesh Refinement (AMR) framework, (2) the Particle Simulation Code, a fully electromagnetic 3D Particle-In-Cell (PIC) code that includes a collision operator, and (3) GS2, an Eulerian, electromagnetic, kinetic code that is widely used in the fusion program, and simulates the nonlinear gyrokinetic equations, together with a self-consistent set of Maxwell's equations.

  11. Acquisition of a High Voltage/High resolution Transmission Electron Microscope.

    DTIC Science & Technology

    1988-08-21

    microstructural design starts at the nanometer level. One such method is colloidal processing of materials with ultrafine particles in which particle...applications in the colloidal processing of ceramics with ultrafine particles. Afterwards, nanometer-sized particles will be synthesized and...STRUCTURAL CONTROL WITH ULTRAFINE PARTICLES Jun Liu, Mehmet Sarikaya, and I. A. Aksay, Department of Materials Science and Engineering. Advanced

  12. Simulating supersymmetry at the SSC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnett, R.M.; Haber, H.E.

    1984-08-01

    Careful study of supersymmetric signatures at the SSC is required in order to distinguish them from Standard Model physics backgrounds. To this end, we have created an efficient, accurate computer program which simulates supersymmetric particle production and decay (or other new particles). We have incorporated the full matrix elements, keeping track of the polarizations of all intermediate states. (At this time hadronization of final-state partons is ignored). Using Monte Carlo techniques this program can generate any desired final-state distribution or individual events for Lego plots. Examples of the results of our study of supersymmetry at SSC are provided.
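
    The record above describes Monte Carlo generation of final-state distributions. As a purely illustrative aid (not the SSC simulation program itself), the following Python sketch shows the acceptance-rejection sampling technique that such event generators commonly rely on; the toy (1 + cos^2 theta) angular distribution and all numbers are assumptions.

      # Minimal acceptance-rejection sketch of Monte Carlo event generation.
      # Illustrative toy only, not the SSC simulation program described above:
      # the (1 + cos^2 theta) angular distribution and flat azimuth are assumptions.
      import math
      import random

      def sample_costheta(rng):
          """Sample cos(theta) from a toy (1 + cos^2 theta) distribution by rejection."""
          while True:
              c = rng.uniform(-1.0, 1.0)                   # trial cos(theta)
              if rng.uniform(0.0, 2.0) <= 1.0 + c * c:     # envelope height = 2 (the maximum)
                  return c

      def generate_events(n, seed=0):
          """Generate n toy two-body events as (cos_theta, phi) pairs."""
          rng = random.Random(seed)
          return [(sample_costheta(rng), rng.uniform(0.0, 2.0 * math.pi)) for _ in range(n)]

      if __name__ == "__main__":
          events = generate_events(10000)
          mean_c2 = sum(c * c for c, _ in events) / len(events)
          print(f"<cos^2 theta> = {mean_c2:.3f}")          # expect ~0.4 for 1 + cos^2 theta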

  13. Exposure Assessment of a High-energy Tensile Test With Large Carbon Fiber Reinforced Polymer Cables.

    PubMed

    Schlagenhauf, Lukas; Kuo, Yu-Ying; Michel, Silvain; Terrasi, Giovanni; Wang, Jing

    2015-01-01

    This study investigated the particle and fiber release from two carbon fiber reinforced polymer cables that underwent high-energy tensile tests until rupture. The failure event was the source of a large amount of dust, part of which was suspected to contain respirable fibers that could cause adverse health effects. The released fibers were suspected to migrate through small openings to the experiment control room and also to an adjacent machine hall where workers were active. To investigate the fiber release and exposure risk of the affected workers, the generated particles were measured with aerosol devices to obtain the particle size and particle concentrations. Furthermore, particles were collected on filter samples to investigate the particle shape and the fiber concentration. Three situations were monitored for the control room and the machine hall: the background concentrations, the impact of the cable failure, and the venting of the exposed rooms afterward. The results showed four important findings: The cable failure caused the release of respirable fibers with diameters below 3 μm and an average length of 13.9 μm; the released particles did migrate to the control room and to the machine hall; the measured peak fiber concentration of 0.76 fibers/cm³ and the overall fiber concentration of 0.07 fibers/cm³ in the control room were below the Permissible Exposure Limit (PEL) for fibers without indication of carcinogenicity; and the venting of the rooms was fast and effective. Even though respirable fibers were released, the low fiber concentration and effective venting indicated that the suspected health risks from the experiment on the affected workers were low. However, the effect of long-term exposure is not known; therefore, additional control measures are recommended.

  14. Iron speciation of airborne subway particles by the combined use of energy dispersive electron probe X-ray microanalysis and Raman microspectrometry.

    PubMed

    Eom, Hyo-Jin; Jung, Hae-Jin; Sobanska, Sophie; Chung, Sang-Gwi; Son, Youn-Suk; Kim, Jo-Chun; Sunwoo, Young; Ro, Chul-Un

    2013-11-05

    Quantitative energy-dispersive electron probe X-ray microanalysis (ED-EPMA), known as low-Z particle EPMA, and Raman microspectrometry (RMS) were applied in combination for an analysis of the iron species in airborne PM10 particles collected in underground subway tunnels. Iron species have been reported to be a major chemical species in underground subway particles generated mainly from mechanical wear and friction processes. In particular, iron-containing particles in subway tunnels are expected to be generated with minimal outdoor influence on the particle composition. Because iron-containing particles have different toxicity and magnetic properties depending on their oxidation states, it is important to determine the iron species of underground subway particles in the context of both indoor public health and control measures. A recently developed analytical methodology, i.e., the combined use of low-Z particle EPMA and RMS, was used to identify the chemical species of the same individual subway particles on a single particle basis, and the bulk iron compositions of airborne subway particles were also analyzed by X-ray diffraction. The majority of airborne subway particles collected in the underground tunnels were found to be magnetite, hematite, and iron metal. All the particles collected in the tunnels of underground subway stations were attracted to permanent magnets due mainly to the almost ubiquitous ferrimagnetic magnetite, indicating that airborne subway particles can be removed using magnets as a control measure.

  15. A Biopharmaceutical Industry Perspective on the Control of Visible Particles in Biotechnology-Derived Injectable Drug Products.

    PubMed

    Mathonet, Serge; Mahler, Hanns-Christian; Esswein, Stefan T; Mazaheri, Maryam; Cash, Patricia W; Wuchner, Klaus; Kallmeyer, Georg; Das, Tapan K; Finkler, Christof; Lennard, Andrew

    2016-01-01

    Regulatory monographs in Europe and the United States require drug products for parenteral administration to be "practically free" or "essentially free" of visible particles, respectively. Both terms have been used interchangeably and acknowledge the probabilistic nature of visual particle inspection. The probability of seeing a particle in a drug product container varies according to the size and nature of the particles as well as container and inspection conditions. Therefore, the term "without visible particles" can be highly misleading in the context of what is practically achievable. This may lead to differences in understanding between industry practitioners and regulatory agencies. Is this term intended to mean "zero particles", or is there any intention to distinguish between particle type such as "zero extraneous visible particles" or "zero proteinaceous particles"? Furthermore, how can "zero" particles as a criterion for release testing be reconciled with "practically free from particles" as stated in the definition and a low, justified level of proteinaceous particles after production? The purpose of this position paper is to review best practices in the industry in terms of visual inspection process and associated operator training, quality control sampling, testing, and setting acceptance criteria corresponding to "practically free of visible particles" and providing considerations when visible proteinaceous particles are deemed unavoidable. It also provides a brief overview of visible particle characterization and gives perspectives on patient safety. This position paper applies to biotechnology-derived drug products including monoclonal antibodies in late-phase development to licensed products. In the 2011 monoclonal antibody monograph revision, European Pharmacopoeia experts acknowledged that protein products may also contain proteinaceous particles at release or that protein particles may form during storage. Indeed, industry experience has demonstrated that therapeutic proteins such as monoclonal antibodies can exhibit a propensity for self-association leading to the formation of aggregates that range in size from nanometres (oligomers) to microns (subvisible and visible particles). As a result, the requirement for drug product appearance for monoclonal antibodies was changed from "without visible particles" to "without visible particles unless otherwise authorised or justified". In our view, "practically free from particles" should be considered a suitable acceptance criterion for injectable biotechnology and small-molecule products, as long as appropriately defined. Furthermore, we argue that visual inspection is a suitable quality control release test and that "practically free from particles" is a suitable specification when adequately described. © PDA, Inc. 2016.

  16. Size control in the synthesis of 1-6 nm gold nanoparticles via solvent-controlled nucleation.

    PubMed

    Song, Jieun; Kim, Dukhan; Lee, Dongil

    2011-11-15

    We report a facile synthetic route for size-controlled preparation of gold nanoparticles. Nearly monodisperse gold nanoparticles with core diameters of 1-6 nm were obtained by reducing AuPPh3Cl with tert-butylamine borane in the presence of dodecanethiol in a solvent mixture of benzene and CHCl3. Mechanism studies have shown that the size control is achieved by solvent-controlled nucleation, in which the nuclei concentration increases with an increasing fraction of CHCl3, leading to smaller particles. It was also found that, following the solvent-controlled nucleation, particle growth occurs via ligand replacement of PPh3 on the nuclei by Au(I) thiolate generated by the digestive etching of small particles. This synthetic strategy was successfully demonstrated with other alkanethiols of different chain lengths, with which size-controlled, monodisperse gold nanoparticles were prepared in remarkable yield without requiring any postsynthesis treatments.

  17. Accelerator-based validation of shielding codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeitlin, Cary; Heilbronn, Lawrence; Miller, Jack

    2002-08-12

    The space radiation environment poses risks to astronaut health from a diverse set of sources, ranging from low-energy protons and electrons to highly-charged, high-energy atomic nuclei and their associated fragmentation products, including neutrons. The low-energy protons and electrons are the source of most of the radiation dose to Shuttle and ISS crews, while the more energetic particles that comprise the Galactic Cosmic Radiation (protons, He, and heavier nuclei up to Fe) will be the dominant source for crews on long-duration missions outside the earth's magnetic field. Because of this diversity of sources, a broad ground-based experimental effort is required to validate the transport and shielding calculations used to predict doses and dose-equivalents under various mission scenarios. The experimental program of the LBNL group, described here, focuses principally on measurements of charged particle and neutron production in high-energy heavy-ion fragmentation. Other aspects of the program include measurements of the shielding provided by candidate spacesuit materials against low-energy protons (particularly relevant to extra-vehicular activities in low-earth orbit), and the depth-dose relations in tissue for higher-energy protons. The heavy-ion experiments are performed at the Brookhaven National Laboratory's Alternating Gradient Synchrotron and the Heavy-Ion Medical Accelerator in Chiba in Japan. Proton experiments are performed at the Lawrence Berkeley National Laboratory's 88'' Cyclotron with a 55 MeV beam, and at the Loma Linda University Proton Facility with 100 to 250 MeV beam energies. The experimental results are an important component of the overall shielding program, as they allow for simple, well-controlled tests of the models developed to handle the more complex radiation environment in space.

  18. Effect of Nozzle Geometry on the Microstructure and Properties of HVAF-Sprayed WC-10Co4Cr and Cr3C2-25NiCr Coatings

    NASA Astrophysics Data System (ADS)

    Matikainen, V.; Koivuluoto, H.; Vuoristo, P.; Schubert, J.; Houdková, Š.

    2018-04-01

    Thermally sprayed hard metal coatings are the industrial standard solution for numerous demanding applications to improve wear resistance. With the aim of improving coating quality by utilising finer particle size distributions, several approaches have been studied to control the spray temperature. The most viable solution is to use the modern high velocity air-fuel (HVAF) spray process, which has already proven to produce high-quality coatings with dense structures. In the HVAF spray process, the particle heating and acceleration can be efficiently controlled by changing the nozzle geometry. In this study, fine WC-10Co4Cr and Cr3C2-25NiCr powders were sprayed with three nozzle geometries to investigate their effect on the particle temperature, velocity and coating microstructure. The study demonstrates that the particle melting and resulting carbide dissolution can be efficiently controlled by changing the nozzle geometry from cylindrical to convergent-divergent. Moreover, the average particle velocity was increased from 780 to over 900 m/s. The increase in particle velocity significantly improved the coating structure and density. Further evaluation was carried out to resolve the effect of particle in-flight parameters on coating structure and cavitation erosion resistance, which was significantly improved in the case of WC-10Co4Cr coatings with the increasing average particle velocity.

  19. Particle Capture Devices and Methods of Use Thereof

    NASA Technical Reports Server (NTRS)

    Voldman, Joel (Inventor); Skelley, Alison M. (Inventor); Kirak, Oktay (Inventor); Jaenisch, Rudolf (Inventor)

    2015-01-01

    The present invention provides a device and methods of use thereof in microscale particle capturing and particle pairing. This invention provides a particle patterning device, which mechanically traps individual particles within the first chambers of capture units, transfers the particles to the second chambers of opposing capture units, and traps a second type of particle in the same second chamber. The device and methods allow for high-yield assaying of trapped cells, high-yield fusion of trapped, paired cells, controlled binding of particles to cells, and specific chemical reactions between particle interfaces and particle contents. The device and method provide a means of identifying the particle population and a facile route to particle collection.

  20. Overview of C-2W Field-Reversed Configuration Experimental Program

    NASA Astrophysics Data System (ADS)

    Gota, H.; Binderbauer, M. W.; Tajima, T.; Putvinski, S.; Tuszewski, M.; Dettrick, S.; Korepanov, S.; Romero, J.; Smirnov, A.; Song, Y.; Thompson, M. C.; van Drie, A.; Yang, X.; Ivanov, A. A.; TAE Team

    2017-10-01

    Tri Alpha Energy's research has been devoted to producing a high-temperature, stable, long-lived field-reversed configuration (FRC) plasma state by neutral-beam injection (NBI) and edge biasing/control. C-2U experiments have demonstrated drastic improvements in particle and energy confinement properties of FRCs, and the plasma performance obtained via 10 MW NBI has achieved plasma sustainment of up to 5 ms and plasma (diamagnetism) lifetimes of 10+ ms. The emerging confinement scaling, whereby electron energy confinement time is proportional to a positive power of the electron temperature, is very attractive for higher energy plasma confinement; accordingly, verification of the observed Te scaling law will be a key future research objective. The new experimental device, C-2W (now also called "Norman"), has the following key subsystem upgrades from C-2U: (i) higher injected power, optimum energies, and extended pulse duration of the NBI system; (ii) installation of inner divertors with upgraded edge-biasing systems; (iii) fast external equilibrium/mirror-coil current ramp-up capability; and (iv) installation of trim/saddle coils for active feedback control of the FRC plasma. This paper will review highlights of the C-2W program.

  1. A global view of atmospheric ice particle complexity

    NASA Astrophysics Data System (ADS)

    Schmitt, Carl G.; Heymsfield, Andrew J.; Connolly, Paul; Järvinen, Emma; Schnaiter, Martin

    2016-11-01

    Atmospheric ice particles exist in a variety of shapes and sizes. Single hexagonal crystals like common hexagonal plates and columns are possible, but more frequently, atmospheric ice particles are much more complex. Ice particle shapes have a substantial impact on many atmospheric processes, from fall speed, which affects cloud lifetime, to radiative properties, which affect energy balance, to name a few. This publication builds on earlier work where a technique was demonstrated to separate single crystals and aggregates of crystals using particle imagery data from aircraft field campaigns. Here, data from 10 field programs have been analyzed and ice particle complexity has been parameterized by cloud temperature for arctic, midlatitude (summer and frontal), and tropical cloud systems. Results show that the transition from simple to complex particles can be as small as 80 µm or as large as 400 µm depending on conditions. All regimes show trends of decreasing transition size with decreasing temperature.

  2. Optically controlled electrophoresis with a photoconductive substrate

    NASA Astrophysics Data System (ADS)

    Inami, Wataru; Nagashima, Taiki; Kawata, Yoshimasa

    2018-05-01

    A photoconductive substrate is used to perform electrophoresis. Light-induced micro-particle flow manipulation is demonstrated without using a fabricated flow channel. The path along which the particles were moved was formed by an illuminated light pattern on the substrate. Because the substrate conductivity and electric field distribution can be modified by light illumination, the forces acting on the particles can be controlled. This technique has potential applications as a high functionality analytical device.

  3. Mesoscopic monodisperse ferromagnetic colloids enable magnetically controlled photonic crystals.

    PubMed

    Xu, Xiangling; Majetich, Sara A; Asher, Sanford A

    2002-11-20

    We report here the first synthesis of mesoscopic, monodisperse particles which contain nanoscopic inclusions of ferromagnetic cobalt ferrites. These monodisperse ferromagnetic composite particles readily self-assemble into magnetically responsive photonic crystals that efficiently Bragg diffract incident light. Magnetic fields can be used to control the photonic crystal orientation and, thus, the diffracted wavelength. We demonstrate the use of these ferromagnetic particles to fabricate magneto-optical diffracting fluids and magnetically switchable diffracting mirrors.

  4. The Role of Traps in the Microstructural Control of Hydrogen Embrittlement of Steels.

    DTIC Science & Technology

    1984-04-01

    which hydrogen interacts with precipitate or other particles located on or near different structural features can in many cases directly control the...growth, can be and have been used to reduce the extent of hydrogen embrittlement in a number of ferrous alloys, ranging from low strength...sulfide-induced crack at the extremity of an elongated MnS particle. Hence, round-shaped second-phase particles are desirable, which are achievable by

  5. An efficient and portable SIMD algorithm for charge/current deposition in Particle-In-Cell codes

    DOE PAGES

    Vincenti, H.; Lobet, M.; Lehe, R.; ...

    2016-09-19

    In current computer architectures, data movement (from die to network) is by far the most energy-consuming part of an algorithm (≈20 pJ/word on-die to ≈10,000 pJ/word on the network). To increase memory locality at the hardware level and reduce energy consumption related to data movement, future exascale computers tend to use many-core processors on each compute node that will have a reduced clock speed to allow for efficient cooling. To compensate for frequency decrease, machine vendors are making use of long SIMD instruction registers that are able to process multiple data with one arithmetic operator in one clock cycle. SIMD register length is expected to double every four years. As a consequence, Particle-In-Cell (PIC) codes will have to achieve good vectorization to fully take advantage of these upcoming architectures. In this paper, we present a new algorithm that allows for efficient and portable SIMD vectorization of current/charge deposition routines that are, along with the field gathering routines, among the most time-consuming parts of the PIC algorithm. Our new algorithm uses a particular data structure that takes into account memory alignment constraints and avoids gather/scatter instructions that can significantly affect vectorization performance on current CPUs. The new algorithm was successfully implemented in the 3D skeleton PIC code PICSAR and tested on Haswell Xeon processors (AVX2, 256-bit-wide data registers). Results show a ×2 to ×2.5 speed-up in double precision for particle shape factors of orders 1–3. The new algorithm can be applied as is on future KNL (Knights Landing) architectures that will include AVX-512 instruction sets with 512-bit register lengths (8 doubles/16 singles). Program summary Program Title: vec_deposition Program Files doi:http://dx.doi.org/10.17632/nh77fv9k8c.1 Licensing provisions: BSD 3-Clause Programming language: Fortran 90 External routines/libraries: OpenMP > 4.0 Nature of problem: Exascale architectures will have many-core processors per node with long vector data registers capable of performing one single instruction on multiple data during one clock cycle. Data register lengths are expected to double every four years and this pushes for new portable solutions for efficiently vectorizing Particle-In-Cell codes on these future many-core architectures. One of the main hotspot routines of the PIC algorithm is the current/charge deposition for which there is no efficient and portable vector algorithm. Solution method: Here we provide an efficient and portable vector algorithm of current/charge deposition routines that uses a new data structure, which significantly reduces gather/scatter operations. Vectorization is controlled using OpenMP 4.0 compiler directives, which ensures portability across different architectures. Restrictions: Here we do not provide the full PIC algorithm with an executable but only vector routines for current/charge deposition. These scalar/vector routines can be used as library routines in your 3D Particle-In-Cell code. However, to get the best performance out of the vector routines you have to satisfy the following two requirements: (1) Your code should implement particle tiling (as explained in the manuscript) to allow for maximized cache reuse and reduce memory accesses that can hinder vector performance. The routines can be used directly on each particle tile. 
(2) You should compile your code with a Fortran 90 compiler (e.g. Intel, GNU, or Cray) and provide proper alignment flags and compiler alignment directives (more details in the README file).
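
    The two requirements above (particle tiling and a deposition layout that limits gather/scatter traffic) can be conveyed with a small sketch. The Python/NumPy code below shows tiled nearest-grid-point charge deposition on a 1D grid; the grid size, tile width, and charges are illustrative assumptions, and it does not reproduce the vectorized Fortran 90 routines distributed with PICSAR.

      # Sketch of tiled nearest-grid-point (order-0) charge deposition on a 1D grid.
      # Illustrates the cache-locality idea behind particle tiling; it is not the
      # vectorized PICSAR algorithm (Fortran 90, 3D, shape factors of orders 1-3).
      import numpy as np

      def deposit_charge_tiled(x, q, nx, dx, tile_width=64):
          """Deposit charges q at positions x onto a grid of nx cells of size dx."""
          rho = np.zeros(nx)
          cells = np.clip((x / dx).astype(int), 0, nx - 1)
          order = np.argsort(cells // tile_width, kind="stable")   # group particles by tile
          for tile_start in range(0, nx, tile_width):
              # Local buffer: accumulation stays within one small, cache-friendly tile.
              local = np.zeros(min(tile_width, nx - tile_start))
              in_tile = order[(cells[order] >= tile_start) &
                              (cells[order] < tile_start + len(local))]
              np.add.at(local, cells[in_tile] - tile_start, q[in_tile])
              rho[tile_start:tile_start + len(local)] += local
          return rho / dx

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          x = rng.uniform(0.0, 1.0, 100000)
          q = np.full(x.size, 1.0e-3)
          rho = deposit_charge_tiled(x, q, nx=256, dx=1.0 / 256)
          print(f"total deposited charge = {rho.sum() / 256:.3f}")  # ~100.0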

  6. An efficient and portable SIMD algorithm for charge/current deposition in Particle-In-Cell codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vincenti, H.; Lobet, M.; Lehe, R.

    In current computer architectures, data movement (from die to network) is by far the most energy-consuming part of an algorithm (≈20 pJ/word on-die to ≈10,000 pJ/word on the network). To increase memory locality at the hardware level and reduce energy consumption related to data movement, future exascale computers tend to use many-core processors on each compute node that will have a reduced clock speed to allow for efficient cooling. To compensate for frequency decrease, machine vendors are making use of long SIMD instruction registers that are able to process multiple data with one arithmetic operator in one clock cycle. SIMD register length is expected to double every four years. As a consequence, Particle-In-Cell (PIC) codes will have to achieve good vectorization to fully take advantage of these upcoming architectures. In this paper, we present a new algorithm that allows for efficient and portable SIMD vectorization of current/charge deposition routines that are, along with the field gathering routines, among the most time-consuming parts of the PIC algorithm. Our new algorithm uses a particular data structure that takes into account memory alignment constraints and avoids gather/scatter instructions that can significantly affect vectorization performance on current CPUs. The new algorithm was successfully implemented in the 3D skeleton PIC code PICSAR and tested on Haswell Xeon processors (AVX2, 256-bit-wide data registers). Results show a ×2 to ×2.5 speed-up in double precision for particle shape factors of orders 1–3. The new algorithm can be applied as is on future KNL (Knights Landing) architectures that will include AVX-512 instruction sets with 512-bit register lengths (8 doubles/16 singles). Program summary Program Title: vec_deposition Program Files doi:http://dx.doi.org/10.17632/nh77fv9k8c.1 Licensing provisions: BSD 3-Clause Programming language: Fortran 90 External routines/libraries: OpenMP > 4.0 Nature of problem: Exascale architectures will have many-core processors per node with long vector data registers capable of performing one single instruction on multiple data during one clock cycle. Data register lengths are expected to double every four years and this pushes for new portable solutions for efficiently vectorizing Particle-In-Cell codes on these future many-core architectures. One of the main hotspot routines of the PIC algorithm is the current/charge deposition for which there is no efficient and portable vector algorithm. Solution method: Here we provide an efficient and portable vector algorithm of current/charge deposition routines that uses a new data structure, which significantly reduces gather/scatter operations. Vectorization is controlled using OpenMP 4.0 compiler directives, which ensures portability across different architectures. Restrictions: Here we do not provide the full PIC algorithm with an executable but only vector routines for current/charge deposition. These scalar/vector routines can be used as library routines in your 3D Particle-In-Cell code. However, to get the best performance out of the vector routines you have to satisfy the following two requirements: (1) Your code should implement particle tiling (as explained in the manuscript) to allow for maximized cache reuse and reduce memory accesses that can hinder vector performance. The routines can be used directly on each particle tile. 
(2) You should compile your code with a Fortran 90 compiler (e.g. Intel, GNU, or Cray) and provide proper alignment flags and compiler alignment directives (more details in the README file).

  7. Exploring Focal and Aberration Properties of Electrostatic Lenses through Computer Simulation

    ERIC Educational Resources Information Center

    Sise, Omer; Manura, David J.; Dogan, Mevlut

    2008-01-01

    The interactive nature of computer simulation allows students to develop a deeper understanding of the laws of charged particle optics. Here, the use of commercially available optical design programs is described as a tool to aid in solving charged particle optics problems. We describe simple and practical demonstrations of basic electrostatic…

  8. Final Report May 1, 2012 to May 31, 2015: "Theoretical Studies in Elementary Particle Physics"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, John C.; Roiban, Radu

    2015-08-19

    This final report summarizes work at Penn State University from May 1, 2012 to May 31, 2015. The work was in theoretical elementary particle physics. Many new results in perturbative QCD, in string theory, and in related areas were obtained, with a substantial impact on the experimental program.

  9. Biking with Particles: Junior Triathletes' Learning about Drafting through Exploring Agent-Based Models and Inventing New Tactics

    ERIC Educational Resources Information Center

    Hirsh, Alon; Levy, Sharona T.

    2013-01-01

    The present research addresses a curious finding: how learning physical principles enhanced athletes' biking performance but not their conceptual understanding. The study involves a model-based triathlon training program, Biking with Particles, concerning aerodynamics of biking in groups (drafting). A conceptual framework highlights several…

  10. 78 FR 20868 - Approval and Promulgation of Implementation Plans; Designation of Areas for Air Quality Planning...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-08

    ... chemical reactions from precursor gases (e.g., secondary particles). Secondary particles, such as sulfates, nitrates, and complex carbon compounds, are formed from reactions with oxides of sulfur (SOx), oxides of... nonattainment new source review (nonattainment NSR) permit programs; provisions for air pollution modeling; and...

  11. The High Momentum Particle IDentification (HMPID) detector PID performance and its contribution to the ALICE physics program

    NASA Astrophysics Data System (ADS)

    Volpe, Giacomo; ALICE Collaboration

    2017-12-01

    The ALICE apparatus is dedicated to studying the properties of strongly interacting matter under extremely high temperature and energy density conditions. For this, enhanced particle identification (PID) capabilities are required. Among the PID ALICE detectors, the ALICE-HMPID (High Momentum Particle IDentification) detector is devoted to the identification of charged hadrons, exploiting the Cherenkov effect. It consists of seven identical RICH modules, with liquid C6F14 as Cherenkov radiator (n ≈ 1.298 at λ = 175 nm). Photon and charged particle detection is performed by an MWPC coupled with a pad-segmented, CsI-coated photocathode. The total CsI active area is 10.3 m². The HMPID provides 3σ separation for pions and kaons up to pT = 3 GeV/c and for kaons and (anti-)protons up to pT = 5 GeV/c. A review of the HMPID PID performance, in particular in the challenging central Pb-Pb collisions, and its contribution to the ALICE physics program, using the LHC RUN1 (2010-2013) and RUN2 (2015) data, is presented.
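
    Since the record quotes the radiator refractive index (n ≈ 1.298), a short worked calculation makes the quoted separation ranges plausible: a particle only radiates Cherenkov light above the threshold momentum p_th = m/sqrt(n^2 - 1). The Python sketch below uses standard particle masses; it is an illustration, not part of the HMPID analysis.

      # Cherenkov threshold momentum p_th = m / sqrt(n^2 - 1) for a radiator with
      # the index quoted in the record (n ~ 1.298). Masses are standard PDG values.
      import math

      N_RADIATOR = 1.298
      MASSES_GEV = {"pion": 0.13957, "kaon": 0.49368, "proton": 0.93827}

      def threshold_momentum(mass_gev, n=N_RADIATOR):
          """Momentum (GeV/c) above which a particle of this mass emits Cherenkov light."""
          return mass_gev / math.sqrt(n * n - 1.0)

      def cherenkov_angle(p_gev, mass_gev, n=N_RADIATOR):
          """Cherenkov angle (rad) at momentum p, or None below threshold."""
          beta = p_gev / math.sqrt(p_gev ** 2 + mass_gev ** 2)
          return math.acos(1.0 / (n * beta)) if n * beta > 1.0 else None

      if __name__ == "__main__":
          for name, m in MASSES_GEV.items():
              print(f"{name}: threshold ~ {threshold_momentum(m):.2f} GeV/c, "
                    f"angle at 3 GeV/c ~ {cherenkov_angle(3.0, m):.3f} rad")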

  12. Investigating Astromaterials Curation Applications for Dexterous Robotic Arms

    NASA Technical Reports Server (NTRS)

    Snead, C. J.; Jang, J. H.; Cowden, T. R.; McCubbin, F. M.

    2018-01-01

    The Astromaterials Acquisition and Curation office at NASA Johnson Space Center is currently investigating tools and methods that will enable the curation of future astromaterials collections. Size and temperature constraints for astromaterials to be collected by current and future proposed missions will require the development of new robotic sample and tool handling capabilities. NASA Curation has investigated the application of robot arms in the past, and robotic 3-axis micromanipulators are currently in use for small particle curation in the Stardust and Cosmic Dust laboratories. While 3-axis micromanipulators have been extremely successful for activities involving the transfer of isolated particles in the 5-20 micron range (e.g., from a microscope slide to an epoxy bullet tip or a beryllium SEM disk), their limited ranges of motion and lack of yaw, pitch, and roll degrees of freedom restrict their utility in other applications. For instance, curators removing particles from cosmic dust collectors by hand often employ scooping and rotating motions to successfully free trapped particles from the silicone oil coatings. Similar scooping and rotating motions are also employed when isolating a specific particle of interest from an aliquot of crushed meteorite. While cosmic dust curators have been remarkably successful with these kinds of particle manipulations using handheld tools, operator fatigue limits the number of particles that can be removed during a given extraction session. The challenges for curation of small particles will be exacerbated by mission requirements that samples be processed in N2 sample cabinets (i.e. gloveboxes). We have been investigating the use of compact robot arms to facilitate sample handling within gloveboxes. Six-axis robot arms potentially have applications beyond small particle manipulation. For instance, future sample return missions may involve biologically sensitive astromaterials that can be easily compromised by physical interaction with a curator; other potential future returned samples may require cryogenic curation. Robot arms may be combined with high resolution cameras within a sample cabinet and controlled remotely by a curator. Sophisticated robot arm and hand combination systems can be programmed to mimic the movements of a curator wearing a data glove; successful implementation of such a system may ultimately allow a curator to virtually operate in a nitrogen, cryogenic, or biologically sensitive environment with dexterity comparable to that of a curator physically handling samples in a glove box.

  13. Mathematical modeling of velocity and number density profiles of particles across the flame propagation through a micro-iron dust cloud.

    PubMed

    Bidabadi, Mehdi; Haghiri, Ali; Rahbari, Alireza

    2010-04-15

    In this study, an attempt has been made to analytically investigate the concentration and velocity profiles of particles across flame propagation through a micro-iron dust cloud. In the first step, the Lagrangian particle equation of motion during upward flame propagation in a vertical duct is employed and then forces acting upon the particle, such as the thermophoretic force (resulting from the temperature gradient), gravitation, and buoyancy are introduced; consequently, the velocity profile as a function of the distance from the leading edge of the combustion zone is extracted. Subsequently, a control volume above the leading edge of the combustion zone is considered and the change in the particle number density in this control volume is obtained via the balance of particle mass fluxes passing through it. This study explains that the particle concentration at the leading edge of the combustion zone is greater than that at a distance far from the flame front. This increase in the particle aggregation above the combustion zone has a remarkable effect on the lower flammability limits of the combustible particle cloud. It is worth noting that the velocity and particle concentration profiles show reasonable agreement with the experimental data. 2009 Elsevier B.V. All rights reserved.
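
    The force balance sketched in this abstract (thermophoresis, gravity, and buoyancy acting on a single particle) can be illustrated with a few lines of explicit-Euler integration. In the Python sketch below, every numerical value (particle size, densities, thermophoretic coefficient, temperature gradient) is a placeholder assumption, not data from the paper.

      # Explicit-Euler integration of a single particle's equation of motion,
      # m dv/dt = F_thermophoretic + F_gravity + F_buoyancy, as outlined above.
      # All numbers are illustrative placeholders, not values from the paper.
      import math

      RHO_P = 7870.0      # iron particle density, kg/m^3
      RHO_G = 0.5         # hot-gas density, kg/m^3 (placeholder)
      D_P = 5e-6          # particle diameter, m (placeholder)
      G = 9.81            # m/s^2
      K_TH = 3e-17        # lumped thermophoretic coefficient, N per (K/m) (placeholder)
      GRAD_T = -2.0e5     # temperature gradient near the flame front, K/m (placeholder)

      def simulate(v0=0.0, dt=1e-5, steps=2000):
          """Return the particle velocity history under the three modeled forces."""
          vol = math.pi * D_P ** 3 / 6.0
          mass = RHO_P * vol
          v, history = v0, []
          for _ in range(steps):
              f_grav = -mass * G             # downward
              f_buoy = RHO_G * vol * G       # upward
              f_thermo = -K_TH * GRAD_T      # from hot gas toward colder gas (assumed sign)
              v += dt * (f_grav + f_buoy + f_thermo) / mass
              history.append(v)
          return history

      if __name__ == "__main__":
          vs = simulate()
          print(f"velocity after {len(vs)} steps: {vs[-1]:.4f} m/s")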

  14. Novel Online Diagnostic Analysis for In-Flight Particle Properties in Cold Spraying

    NASA Astrophysics Data System (ADS)

    Koivuluoto, Heli; Matikainen, Ville; Larjo, Jussi; Vuoristo, Petri

    2018-02-01

    In cold spraying, powder particles are accelerated by preheated supersonic gas stream to high velocities and sprayed on a substrate. The particle velocities depend on the equipment design and process parameters, e.g., on the type of the process gas and its pressure and temperature. These, in turn, affect the coating structure and the properties. The particle velocities in cold spraying are high, and the particle temperatures are low, which can, therefore, be a challenge for the diagnostic methods. A novel optical online diagnostic system, HiWatch HR, will open new possibilities for measuring particle in-flight properties in cold spray processes. The system employs an imaging measurement technique called S-PTV (sizing-particle tracking velocimetry), first introduced in this research. This technique enables an accurate particle size measurement also for small diameter particles with a large powder volume. The aim of this study was to evaluate the velocities of metallic particles sprayed with HPCS and LPCS systems and with varying process parameters. The measured in-flight particle properties were further linked to the resulting coating properties. Furthermore, the camera was able to provide information about variations during the spraying, e.g., fluctuating powder feeding, which is important from the process control and quality control point of view.

  15. Flow and Jamming of Granular Materials in a Two-dimensional Hopper

    NASA Astrophysics Data System (ADS)

    Tang, Junyao

    Flow in a hopper is both a fertile testing ground for understanding fundamental granular flow rheology and industrially highly relevant. Despite increasing research efforts in this area, a comprehensive physical theory is still lacking for both jamming and flow of granular materials in a hopper. In this work, I have designed a two-dimensional (2D) hopper experiment using photoelastic particles (particle shapes: disks or ellipses), with the goal of building a bridge between the macroscopic phenomenon of hopper flow and microscopic particle-scale dynamics. Through synchronized data of particle tracking and stress distributions in particles, I have shown differences between my measured time-averaged velocity/stress profiles of 2D hopper flow and previous theoretical predictions. I have also demonstrated the importance of a mechanically stable arch near the opening in controlling hopper flow rheology and suggested a heuristic phase diagram for the hopper flow/jamming transition. Another part of this thesis work is focused on studying the impact of particle shape on hopper flow. By comparing particle-tracking and photoelastic data for ellipses and disks at the appropriate length scale, I have demonstrated an important role for the rotational freedom of elliptical particles in controlling flow rheology. This work has been supported by the International Fine Particle Research Institute (IFPRI).

  16. Control of particle size by feed composition in the nanolatexes produced via monomer-starved semicontinuous emulsion copolymerization.

    PubMed

    Sajjadi, Shahriar

    2015-05-01

    Conventional batch and semicontinuous emulsion copolymerizations often produce large particles whose size cannot be easily correlated with the comonomer feed compositions, and are to some degree susceptible to composition drift. In contrast, we found that copolymer nanolatexes made via semicontinuous monomer-starved emulsion copolymerizations feature an average nanoparticle size controlled by the feed composition, high conversion, and a high degree of particle composition uniformity. This was achieved because the rate of particle growth during nucleation was controlled by the rate of comonomer addition, while the copolymer composition, surfactant parking area on the particles, and nucleation efficiency were determined by the comonomer feed composition. Two model systems, methyl methacrylate/styrene and vinyl acetate/butyl acrylate, with significant differences in water solubility were studied. Monomers were added to the aqueous solution of sodium dodecylsulfate and potassium persulfate at a low rate to achieve high instantaneous conversions. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. To alloy or not to alloy? Cr modified Pt/C cathode catalysts for PEM fuel cells.

    PubMed

    Wells, Peter P; Qian, Yangdong; King, Colin R; Wiltshire, Richard J K; Crabb, Eleanor M; Smart, Lesley E; Thompsett, David; Russell, Andrea E

    2008-01-01

    The cathode electrocatalysts for proton exchange membrane (PEM) fuel cells are commonly platinum and platinum based alloy nanoparticles dispersed on a carbon support. Control over the particle size and composition has, historically, been attained empirically, making systematic studies of the effects of various structural parameters difficult. The controlled surface modification methodology used in this work has enabled the controlled modification of carbon supported Pt nanoparticles by Cr so as to yield nanoalloy particles with defined compositions. Subsequent heat treatment in 5% H2 in N2 resulted in the formation of a distinct Pt3Cr alloy phase which was either restricted to the surface of the particles or present throughout the bulk of the particle structure. Measurement of the oxygen reduction activity of the catalysts was accomplished using the rotating thin film electrode method and the activities obtained were related to the structure of the nanoalloy catalyst particles, largely determined using Cr K edge and Pt L3 edge XAS.

  18. Ion-beam apparatus and method for analyzing and controlling integrated circuits

    DOEpatents

    Campbell, A.N.; Soden, J.M.

    1998-12-01

    An ion-beam apparatus and method for analyzing and controlling integrated circuits are disclosed. The ion-beam apparatus comprises a stage for holding one or more integrated circuits (ICs); a source means for producing a focused ion beam; and a beam-directing means for directing the focused ion beam to irradiate a predetermined portion of the IC for sufficient time to provide an ion-beam-generated electrical input signal to a predetermined element of the IC. The apparatus and method have applications to failure analysis and developmental analysis of ICs and permit an alteration, control, or programming of logic states or device parameters within the IC either separate from or in combination with applied electrical stimulus to the IC for analysis thereof. Preferred embodiments of the present invention including a secondary particle detector and an electron floodgun further permit imaging of the IC by secondary ions or electrons, and allow at least a partial removal or erasure of the ion-beam-generated electrical input signal. 4 figs.

  19. Ion-beam apparatus and method for analyzing and controlling integrated circuits

    DOEpatents

    Campbell, Ann N.; Soden, Jerry M.

    1998-01-01

    An ion-beam apparatus and method for analyzing and controlling integrated circuits. The ion-beam apparatus comprises a stage for holding one or more integrated circuits (ICs); a source means for producing a focused ion beam; and a beam-directing means for directing the focused ion beam to irradiate a predetermined portion of the IC for sufficient time to provide an ion-beam-generated electrical input signal to a predetermined element of the IC. The apparatus and method have applications to failure analysis and developmental analysis of ICs and permit an alteration, control, or programming of logic states or device parameters within the IC either separate from or in combination with applied electrical stimulus to the IC for analysis thereof. Preferred embodiments of the present invention including a secondary particle detector and an electron floodgun further permit imaging of the IC by secondary ions or electrons, and allow at least a partial removal or erasure of the ion-beam-generated electrical input signal.

  20. Study of Solid Particle Behavior in High Temperature Gas Flows

    NASA Astrophysics Data System (ADS)

    Majid, A.; Bauder, U.; Stindl, T.; Fertig, M.; Herdrich, G.; Röser, H.-P.

    2009-01-01

    The Euler-Lagrangian approach is used for the simulation of solid particles in hypersonic entry flows. For flow field simulation, the program SINA (Sequential Iterative Non-equilibrium Algorithm) developed at the Institut für Raumfahrtsysteme is used. The model for the effect of the carrier gas on a particle includes drag force and particle heating only. Other effects, such as the Magnus lift force or damping torque, are not yet taken into account. The reverse effect of the particle phase on the gaseous phase is currently neglected. A parametric analysis is performed regarding the impact of variations in the physical input conditions, such as the position, velocity, size, and material of the particle. Convective heat fluxes onto the surface of the particle and its radiative cooling are discussed. The variation of particle temperature under different conditions is presented. The influence of various input conditions on the trajectory is explained. A semi-empirical model for the particle-wall interaction is also discussed and the influence of the wall on the particle trajectory with different particle conditions is presented. The heat fluxes onto the wall due to impingement of particles are also computed and compared with the heat fluxes from the gas.

  1. Compilation, design tests: Energetic particles Satellite S-3 including design tests for S-3A, S-3B and S-3C

    NASA Technical Reports Server (NTRS)

    Ledoux, F. N.

    1973-01-01

    A compilation of engineering design tests which were conducted in support of the Energetic Particle Satellite S-3, S-3A, and S-3B programs. The purpose of conducting the tests was to determine the adequacy and reliability of the Energetic Particles series of satellite designs. The various tests consisted of (1) moments of inertia, (2) functional reliability, (3) component and structural integrity, (4) initiators and explosives tests, and (5) acceptance tests.

  2. Instituting a filtration/pressurization system to reduce dust concentrations in a control room at a mineral processing plant

    PubMed Central

    Noll, J.; Cecala, A.; Hummer, J.

    2016-01-01

    The National Institute for Occupational Safety and Health has observed that many control rooms and operator compartments in the U.S. mining industry do not have filtration systems capable of maintaining low dust concentrations in these areas. In this study at a mineral processing plant, to reduce respirable dust concentrations in a control room that had no cleaning system for intake air, a filtration and pressurization system originally designed for enclosed cabs was modified and installed. This system was composed of two filtering units: one to filter outside air and one to filter and recirculate the air inside the control room. The system reduced submicrometer particle concentrations by 87 percent under static conditions. This implies that more than 87 percent of respirable dust particles should be removed, because respirable dust particles are larger than submicrometer particles and filtration systems are usually more efficient at capturing larger particles. A positive pressure near 0.02 inches of water gauge was produced, which is an important component of an effective system and minimizes the entry of particles, such as dust, into the room. The intake airflow was around 118 cfm, greater than the airflow suggested by the American Society of Heating, Refrigerating, and Air-Conditioning Engineers (ASHRAE) for acceptable indoor air quality. After one year, the loading of the filter caused the airflow to decrease to 80 cfm, which still produces acceptable indoor air quality. Due to the loading of the filters, the reduction efficiency for submicrometer particles under static conditions increased to 94 percent from 87 percent. PMID:26834293
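
    The reported efficiencies follow from a simple before/after comparison, efficiency = 1 - (inside concentration / outside concentration). The one-line Python sketch below uses made-up counts chosen only to reproduce the quoted 87 percent.

      # Reduction efficiency as used for the control-room filtration system.
      # The particle counts are illustrative, chosen to match the quoted 87 percent.
      def reduction_efficiency(outside_count, inside_count):
          """Fractional reduction of particle concentration inside versus outside."""
          return 1.0 - inside_count / outside_count

      if __name__ == "__main__":
          print(f"{reduction_efficiency(10000, 1300):.0%}")  # -> 87%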

  3. Atomic Bose-Hubbard Systems with Single-Particle Control

    NASA Astrophysics Data System (ADS)

    Preiss, Philipp Moritz

    Experiments with ultracold atoms in optical lattices provide outstanding opportunities to realize exotic quantum states due to a high degree of tunability and control. In this thesis, I present experiments that extend this control from global parameters to the level of individual particles. Using a quantum gas microscope for 87Rb, we have developed a single-site addressing scheme based on digital amplitude holograms. The system self-corrects for aberrations in the imaging setup and creates arbitrary beam profiles. We are thus able to shape optical potentials on the scale of single lattice sites and control the dynamics of individual atoms. We study the role of quantum statistics and interactions in the Bose-Hubbard model on the fundamental level of two particles. Bosonic quantum statistics are apparent in the Hong-Ou-Mandel interference of massive particles, which we observe in tailored double-well potentials. These underlying statistics, in combination with tunable repulsive interactions, dominate the dynamics in single- and two-particle quantum walks. We observe highly coherent position-space Bloch oscillations, bosonic bunching in Hanbury Brown-Twiss interference and the fermionization of strongly interacting bosons. Many-body states of indistinguishable quantum particles are characterized by large-scale spatial entanglement, which is difficult to detect in itinerant systems. Here, we extend the concept of Hong-Ou-Mandel interference from individual particles to many-body states to directly quantify entanglement entropy. We perform collective measurements on two copies of a quantum state and detect entanglement entropy through many-body interference. We measure the second order Renyi entropy in small Bose-Hubbard systems and detect the buildup of spatial entanglement across the superfluid-insulator transition. Our experiments open new opportunities for the single-particle-resolved preparation and characterization of many-body quantum states.

  4. Memoryless control of boundary concentrations of diffusing particles.

    PubMed

    Singer, A; Schuss, Z; Nadler, B; Eisenberg, R S

    2004-12-01

    Flux between regions of different concentration occurs in nearly every device involving diffusion, whether an electrochemical cell, a bipolar transistor, or a protein channel in a biological membrane. Diffusion theory has calculated that flux since the time of Fick (1855), and the flux has been known to arise from the stochastic behavior of Brownian trajectories since the time of Einstein (1905), yet the mathematical description of the behavior of trajectories corresponding to different types of boundaries is not complete. We consider the trajectories of noninteracting particles diffusing in a finite region connecting two baths of fixed concentrations. Inside the region, the trajectories of diffusing particles are governed by the Langevin equation. To maintain average concentrations at the boundaries of the region at their values in the baths, a control mechanism is needed to set the boundary dynamics of the trajectories. Different control mechanisms are used in Langevin and Brownian simulations of such systems. We analyze models of controllers and derive equations for the time evolution and spatial distribution of particles inside the domain. Our analysis shows a distinct difference between the time evolution and the steady state concentrations. While the time evolution of the density is governed by an integral operator, the spatial distribution is governed by the familiar Fokker-Planck operator. The boundary conditions for the time-dependent density depend on the model of the controller; however, this dependence disappears in the steady state, if the controller is of a renewal type. Renewal-type controllers, however, produce spurious boundary layers that can be catastrophic in simulations of charged particles, because even a tiny net charge can have global effects. The design of a nonrenewal controller that maintains concentrations of noninteracting particles without creating spurious boundary layers at the interface requires the solution of the time-dependent Fokker-Planck equation with absorption of outgoing trajectories and a source of ingoing trajectories on the boundary (the so-called albedo problem).
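
    A minimal simulation makes the boundary-control issue concrete: particles diffuse inside the domain, are absorbed when they exit, and a controller injects fresh particles at the boundaries to mimic the baths. The Python sketch below implements one crude renewal-type controller with arbitrary illustration parameters; it does not reproduce the paper's Fokker-Planck analysis or its nonrenewal design.

      # Overdamped-Langevin sketch of diffusion between two baths with a simple
      # injection/removal boundary controller. Parameters are arbitrary illustration
      # values; this does not reproduce the paper's analysis.
      import random

      L = 1.0             # domain length
      D = 0.01            # diffusion coefficient
      DT = 1e-3           # time step
      INJECT_LEFT = 5     # particles injected per step at x=0 (high-concentration bath)
      INJECT_RIGHT = 1    # particles injected per step at x=L (low-concentration bath)

      def step(positions, rng):
          """Advance all particles one Brownian step, absorb exiters, inject at boundaries."""
          sigma = (2.0 * D * DT) ** 0.5
          moved = [x + rng.gauss(0.0, sigma) for x in positions]
          survivors = [x for x in moved if 0.0 < x < L]              # absorb at both ends
          survivors += [rng.uniform(0.0, sigma) for _ in range(INJECT_LEFT)]
          survivors += [L - rng.uniform(0.0, sigma) for _ in range(INJECT_RIGHT)]
          return survivors

      if __name__ == "__main__":
          rng = random.Random(0)
          particles = []
          for _ in range(5000):
              particles = step(particles, rng)
          left = sum(1 for x in particles if x < L / 2)
          print(f"{len(particles)} particles in the domain, {left} in the left half")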

  5. User's guide for MODTOOLS: Computer programs for translating data of MODFLOW and MODPATH into geographic information system files

    USGS Publications Warehouse

    Orzol, Leonard L.

    1997-01-01

    MODTOOLS uses the particle data calculated by MODPATH to construct several types of GIS output. MODTOOLS uses particle information recorded by MODPATH such as the row, column, or layer of the model grid, to generate a set of characteristics associated with each particle. The user can choose from the set of characteristics associated with each particle and use the capabilities of the GIS to selectively trace the movement of water discharging from specific cells in the model grid. MODTOOLS allows the hydrogeologist to utilize the capabilities of the GIS to graphically combine the results of the particle-tracking analysis, which facilitates the analysis and understanding of complex ground-water flow systems.
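
    As a generic illustration of the kind of translation MODTOOLS performs, the Python sketch below converts a plain table of particle-tracking records into GeoJSON point features that a GIS can filter by attribute. The column names and file layout here are hypothetical placeholders; they are not the actual MODPATH or MODTOOLS file formats.

      # Generic sketch: turn a table of particle records into GeoJSON point features
      # so a GIS can select and trace particles by attribute. The column names and
      # input layout are hypothetical, NOT the actual MODPATH/MODTOOLS formats.
      import csv
      import json

      def particles_to_geojson(csv_path, out_path):
          """Write one GeoJSON point feature per particle record in the CSV table."""
          features = []
          with open(csv_path, newline="") as f:
              for row in csv.DictReader(f):   # assumed columns: x, y, layer, row, column, time
                  features.append({
                      "type": "Feature",
                      "geometry": {"type": "Point",
                                   "coordinates": [float(row["x"]), float(row["y"])]},
                      "properties": {k: row[k] for k in ("layer", "row", "column", "time")},
                  })
          with open(out_path, "w") as f:
              json.dump({"type": "FeatureCollection", "features": features}, f, indent=2)

      # Usage (hypothetical file names):
      #   particles_to_geojson("modpath_endpoints.csv", "particles.geojson")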

  6. Repulsive Effect for Unbound High Energy Particles Along Rotation Axis in Kerr-Taub-NUT Spacetime

    NASA Astrophysics Data System (ADS)

    Zhang, Lu; Chen, Song-Bai

    2018-04-01

    We have investigated the acceleration of unbound high-energy particles moving along the rotation axis in the Kerr-Taub-NUT spacetime, and then studied the dependence of the repulsive effects on the NUT charge for the particles in the spacetime. Whether the repulsive effects with the NUT charge become stronger depends on the Carter constant and on the position and velocity of the particles themselves. We also present numerically the changes of the observable velocity and acceleration with the NUT charge for the unbound particles in the Kerr-Taub-NUT spacetime. Supported by the Scientific Research Fund of Hunan Provincial Education Department under Grant No. 17A124, and the Construct Program of Key Disciplines in Hunan Province.

  7. Locating Stardust-like Particles in Aerogel Using X-Ray Techniques

    NASA Technical Reports Server (NTRS)

    Jurewicz, A. J. G.; Jones, S. M.; Tsapin, A.; Mih, D. T.; Connolly, H. C., Jr.; Graham, G. A.

    2003-01-01

    Silica aerogel is the material that the spacecraft STARDUST is using to collect interstellar and cometary silicates. Anticipating the return of the samples to Earth in January of 2006, many individual investigators and, especially, the investigators in NASA's SRLIDAP program are studying means of both in situ analysis of particles and particle extraction. To help individual PIs with extraction of particles from aerogel in their own laboratories, we are exploring the use of standard laboratory x-ray equipment and commercial techniques for precisely locating specific particles in aerogel. We approached the evaluation of commercial x-ray techniques as follows. First, we determined the most appropriate detector for use with aerogel and particulates. Then, we compared and contrasted techniques useful for university laboratories.

  8. Effect of Cobalt Particle Size on Acetone Steam Reforming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Junming; Zhang, He; Yu, Ning

    2015-06-11

    Carbon-supported cobalt nanoparticles with different particle sizes were synthesized and characterized by complementary techniques such as X-ray diffraction, N2 sorption, acetone temperature-programmed desorption, transmission electron microscopy, and CO chemisorption. Using the acetone steam reforming reaction as a probe reaction, we revealed a volcano-shaped curve of the intrinsic activity (turnover frequency of acetone) and the CO2 selectivity as a function of the cobalt particle size, with the highest activity and selectivity observed at a particle size of approximately 12.8 nm. Our results indicate that the overall performance of acetone steam reforming is related to a combination of particle-size-dependent acetone decomposition, water dissociation, and the oxidation state of the cobalt nanoparticles.
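
    In studies of this kind, the turnover frequency is typically the acetone conversion rate normalized by the number of surface cobalt sites counted by CO chemisorption. The short Python sketch below shows that arithmetic with placeholder numbers; they are not data from the paper.

      # Turnover frequency (TOF) sketch: acetone converted per surface Co site per second,
      # with surface sites taken from CO chemisorption uptake (assumed 1:1 CO:surface Co).
      # All numbers are illustrative placeholders, not data from the paper.
      ACETONE_RATE = 2.0e-6   # mol acetone converted per second per gram of catalyst
      CO_UPTAKE = 5.0e-5      # mol CO adsorbed per gram of catalyst (surface Co sites)

      def turnover_frequency(rate_mol_per_s_g, surface_sites_mol_per_g):
          """TOF in s^-1: molecules converted per surface site per second."""
          return rate_mol_per_s_g / surface_sites_mol_per_g

      if __name__ == "__main__":
          print(f"TOF ~ {turnover_frequency(ACETONE_RATE, CO_UPTAKE):.3f} s^-1")  # ~0.040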

  9. Robust statistical reconstruction for charged particle tomography

    DOEpatents

    Schultz, Larry Joe; Klimenko, Alexei Vasilievich; Fraser, Andrew Mcleod; Morris, Christopher; Orum, John Christopher; Borozdin, Konstantin N; Sossong, Michael James; Hengartner, Nicolas W

    2013-10-08

    Systems and methods for charged particle detection including statistical reconstruction of object volume scattering density profiles from charged particle tomographic data to determine the probability distribution of charged particle scattering using a statistical multiple scattering model and determine a substantially maximum likelihood estimate of object volume scattering density using an expectation maximization (ML/EM) algorithm to reconstruct the object volume scattering density. The presence of and/or type of object occupying the volume of interest can be identified from the reconstructed volume scattering density profile. The charged particle tomographic data can be cosmic ray muon tomographic data from a muon tracker for scanning packages, containers, vehicles, or cargo. The method can be implemented using a computer program which is executable on a computer.
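
    To convey the flavor of expectation-maximization reconstruction, the Python sketch below applies the generic ML/EM update for a linear Poisson model to random toy data. It is not the patented muon scattering-density algorithm, and the system matrix and measurements are synthetic assumptions.

      # Generic ML/EM iteration for a linear Poisson model:
      #   lambda <- lambda / (A^T 1) * A^T ( y / (A lambda) )
      # Illustrative only; NOT the patented muon scattering-density reconstruction.
      import numpy as np

      def mlem(A, y, n_iter=50):
          """Estimate nonnegative lambda such that the Poisson mean A @ lambda fits y."""
          lam = np.ones(A.shape[1])
          sens = A.sum(axis=0)                    # sensitivity image: column sums of A
          for _ in range(n_iter):
              expected = A @ lam
              lam *= (A.T @ (y / np.maximum(expected, 1e-12))) / np.maximum(sens, 1e-12)
          return lam

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          A = rng.uniform(0.0, 1.0, size=(200, 20))        # toy system matrix
          true_lam = rng.uniform(0.5, 2.0, size=20)
          y = rng.poisson(A @ true_lam).astype(float)      # simulated counts
          est = mlem(A, y)
          print(f"mean relative error: {np.mean(np.abs(est - true_lam) / true_lam):.2f}")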

  10. A Hybrid Physics-Based Data-Driven Approach for Point-Particle Force Modeling

    NASA Astrophysics Data System (ADS)

    Moore, Chandler; Akiki, Georges; Balachandar, S.

    2017-11-01

    This study improves upon the physics-based pairwise interaction extended point-particle (PIEP) model. The PIEP model leverages a physical framework to predict fluid-mediated interactions between solid particles. While the PIEP model is a powerful tool, its pairwise assumption leads to increased error in flows with high particle volume fractions. To reduce this error, a regression algorithm is used to model the differences between the current PIEP model's predictions and the results of direct numerical simulations (DNS) for an array of monodisperse solid particles subjected to various flow conditions. The resulting statistical model and the physical PIEP model are superimposed to construct a hybrid, physics-based data-driven PIEP model. It must be noted that the performance of a pure data-driven approach without the model form provided by the physical PIEP model is substantially inferior. The hybrid model's predictive capabilities are analyzed using more DNS. In every case tested, the hybrid PIEP model's predictions are more accurate than those of the physical PIEP model. This material is based upon work supported by the National Science Foundation Graduate Research Fellowship Program under Grant No. DGE-1315138 and the U.S. DOE, NNSA, ASC Program, as a Cooperative Agreement under Contract No. DE-NA0002378.
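
    The hybrid idea in this record (fit a regression to the difference between a physics model and reference data, then add the fitted correction back) can be sketched in a few lines. In the Python example below, the "physics" model, the features, and the "DNS" data are synthetic placeholders, not the actual PIEP model or DNS results.

      # Residual-learning sketch: correct a physics-based force model with a regression
      # fitted to (reference - physics) differences. The "physics" model, features, and
      # "DNS" data are synthetic placeholders, not the actual PIEP model or DNS.
      import numpy as np

      def physics_model(volume_fraction, reynolds):
          """Toy stand-in for a physics-based point-particle force prediction."""
          return 1.0 + 0.15 * reynolds ** 0.687 - 2.0 * volume_fraction

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          phi = rng.uniform(0.05, 0.4, 500)          # particle volume fraction
          re = rng.uniform(1.0, 100.0, 500)          # particle Reynolds number
          dns = physics_model(phi, re) + 3.0 * phi ** 2 + rng.normal(0.0, 0.02, 500)

          # Fit a linear model in simple features to the residual (dns - physics).
          X = np.column_stack([np.ones_like(phi), phi, phi ** 2, re])
          coef, *_ = np.linalg.lstsq(X, dns - physics_model(phi, re), rcond=None)

          def hybrid_model(volume_fraction, reynolds):
              feats = np.array([1.0, volume_fraction, volume_fraction ** 2, reynolds])
              return physics_model(volume_fraction, reynolds) + feats @ coef

          correction = hybrid_model(0.3, 50.0) - physics_model(0.3, 50.0)
          print(f"learned correction at phi=0.3, Re=50: {correction:.3f}")   # ~0.27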

  11. Microscopy of the interacting Harper-Hofstadter model in the few-body limit

    NASA Astrophysics Data System (ADS)

    Tai, M. Eric; Lukin, Alexander; Rispoli, Matthew; Schittko, Robert; Menke, Tim; Borgnia, Dan; Preiss, Philipp; Grusdt, Fabian; Kaufman, Adam; Greiner, Markus

    2017-04-01

    The interplay of magnetic fields and interacting particles can lead to exotic phases of matter exhibiting topological order and high degrees of spatial entanglement. While these phases were discovered in a solid-state setting, recent techniques have enabled the realization of gauge fields in systems of ultracold neutral atoms, offering a new experimental paradigm for studying these novel states of matter. This complementary platform holds promise for exploring exotic physics in fractional quantum Hall systems due to the microscopic manipulation and precision possible in cold atom systems. However, these experiments have thus far mostly explored the regime of weak interactions. Here, we show how strong interactions can modify the propagation of particles in a 2 × N real-space ladder governed by the Harper-Hofstadter model. We observe that inter-particle interactions affect the population of chiral bands, giving rise to chiral dynamics whose multi-particle correlations indicate both bound and free-particle character. The novel form of interaction-induced chirality observed in these experiments demonstrates the essential ingredients for future investigations of highly entangled topological phases of many-body systems. We are supported by grants from the National Science Foundation, the Gordon and Betty Moore Foundation's EPiQS Initiative, an Air Force Office of Scientific Research MURI program, an Army Research Office MURI program, and the NSF GRFP (MNR).

  12. Emission of particulate matter from a desktop three-dimensional (3D) printer

    PubMed Central

    Yi, Jinghai; LeBouf, Ryan F.; Duling, Matthew G.; Nurkiewicz, Timothy; Chen, Bean T.; Schwegler-Berry, Diane; Virji, M. Abbas; Stefaniak, Aleksandr B.

    2016-01-01

    Desktop three-dimensional (3D) printers are becoming commonplace in business offices, public libraries, university labs and classrooms, and even private homes; however, these settings are generally not designed for exposure control. Prior experience with a variety of office equipment devices such as laser printers that emit ultrafine particles (UFP) suggests the need to characterize 3D printer emissions to enable reliable risk assessment. The aim of this study was to examine factors that influence particulate emissions from 3D printers and characterize their physical properties to inform risk assessment. Emissions were evaluated in a 0.5-m3 chamber and in a small room (32.7 m3) using real-time instrumentation to measure particle number, size distribution, mass, and surface area. Factors evaluated included filament composition and color, as well as the manufacturer-provided printer emissions control technologies while printing an object. Filament type significantly influenced emissions, with acrylonitrile butadiene styrene (ABS) emitting larger particles than polylactic acid (PLA), which may have been the result of agglomeration. Geometric mean particle sizes and total particle (TP) number and mass emissions differed significantly among colors of a given filament type. Use of a cover on the printer reduced TP emissions by a factor of 2. Lung deposition calculations indicated a threefold higher PLA particle deposition in alveoli compared to ABS. Desktop 3D printers emit high levels of UFP, which are released into indoor environments where adequate ventilation may not be present to control emissions. Emissions in nonindustrial settings need to be reduced through the use of a hierarchy of controls, beginning with device design, followed by engineering controls (ventilation) and administrative controls such as choice of filament composition and color. PMID:27196745
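
    For context on how such emission rates are commonly estimated, the sketch below applies a well-mixed chamber mass balance: a first-order loss rate is fitted from the post-print concentration decay, and the emission rate follows from the steady-state concentration. The concentration values are invented and the procedure is a generic approximation, not this study's exact protocol.

        # Well-mixed-chamber estimate of a particle emission rate: during printing
        # dC/dt = S/V - a_loss*C, so at steady state S = V*a_loss*C_ss, and a_loss
        # (air exchange + deposition) is fitted from the post-print decay.
        import numpy as np

        V = 0.5                                    # chamber volume, m^3 (as above)
        t_decay = np.arange(0, 60, 5)              # minutes after the print ends
        C_decay = 4.0e5 * np.exp(-0.05 * t_decay)  # hypothetical UFP decay, particles/cm^3
        C_ss = 4.0e5                               # hypothetical steady-state concentration

        # Fit ln(C) = ln(C0) - a_loss*t to get the total first-order loss rate.
        a_loss = -np.polyfit(t_decay, np.log(C_decay), 1)[0]   # 1/min

        S = V * 1e6 * a_loss * C_ss                # particles/min (V converted to cm^3)
        print(f"loss rate = {a_loss:.3f} 1/min, emission = {S:.2e} particles/min")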

  13. Ten-gram-scale preparation of PTMS-based monodisperse ORMOSIL nano- and microparticles and conversion to silica particles

    NASA Astrophysics Data System (ADS)

    Kim, Jung Soo; Jung, Gyu Il; Kim, Soo Jung; Koo, Sang Man

    2018-03-01

    Monodisperse organically modified silica (ORMOSIL) particles, with average diameters ranging from 550 nm to 4.2 μm, were prepared at low temperature at a scale of about 10 g/batch by a simple one-step self-emulsion process. The reaction mixture was composed only of water, phenyltrimethoxysilane (PTMS), and a base catalyst, without any surfactants. Size control and monodispersity of the resultant particles were achieved through the controlled supply of hydrolyzed PTMS monomer molecules, enabled by manipulating reaction parameters such as monomer concentration, type and amount of base catalyst, stirring rate, and reaction temperature. PTMS-based ORMOSIL particles were converted into silica particles by either a wet chemical reaction with an oleum-sulfuric acid mixture or thermal treatment above 650 °C. Complete removal of organic groups from the ORMOSIL particles was achieved by the thermal treatment, while the chemical process achieved 74% removal.

  14. Modeling of Fine-Particle Formation in Turbulent Flames

    NASA Astrophysics Data System (ADS)

    Raman, Venkat; Fox, Rodney O.

    2016-01-01

    The generation of nanostructured particles in high-temperature flames is important both for the control of emissions from combustion devices and for the synthesis of high-value chemicals for a variety of applications. The physiochemical processes that lead to the production of fine particles in turbulent flames are highly sensitive to the flow physics and, in particular, the history of thermochemical compositions and turbulent features they encounter. Consequently, it is possible to change the characteristic size, structure, composition, and yield of the fine particles by altering the flow configuration. This review describes the complex multiscale interactions among turbulent fluid flow, gas-phase chemical reactions, and solid-phase particle evolution. The focus is on modeling the generation of soot particles, an unwanted pollutant from automobile and aircraft engines, as well as metal oxides, a class of high-value chemicals sought for specialized applications, including emissions control. Issues arising due to the numerical methods used to approximate the particle number density function, the modeling of turbulence-chemistry interactions, and model validation are also discussed.
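
    As a minimal example of the particle-description problem discussed here, the sketch below integrates a two-moment (number density and volume fraction) population balance with nucleation, surface-growth, and coagulation source terms; the rate constants are arbitrary placeholders rather than values from the models reviewed.

        # Two-moment population-balance sketch of fine-particle formation: track
        # number density N and volume fraction fv.  Rate constants are illustrative.
        import numpy as np
        from scipy.integrate import solve_ivp

        J    = 1.0e12    # nucleation rate, particles/(m^3 s)
        v0   = 1.0e-27   # volume of an incipient particle, m^3
        G    = 1.0e-22   # volumetric surface-growth rate per particle, m^3/s
        beta = 1.0e-15   # coagulation kernel, m^3/s

        def rhs(t, y):
            N, fv = y
            dN  = J - 0.5 * beta * N**2   # nucleation minus coagulation
            dfv = J * v0 + G * N          # nucleation plus surface growth
            return [dN, dfv]

        sol = solve_ivp(rhs, (0.0, 0.05), [0.0, 0.0])
        N_end, fv_end = sol.y[:, -1]
        d_mean = (6.0 * fv_end / (np.pi * N_end))**(1.0 / 3.0)   # mean particle diameter
        print(f"N = {N_end:.2e} 1/m^3, fv = {fv_end:.2e}, d = {d_mean*1e9:.1f} nm")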

  15. Novel flower-shaped albumin particles as controlled-release carriers for drugs to penetrate the round-window membrane.

    PubMed

    Yu, Zhan; Yu, Min; Zhou, Zhimin; Zhang, Zhibao; Du, Bo; Xiong, Qingqing

    2014-01-01

    Controlled-release carriers for local drug delivery have recently attracted increasing attention for inner-ear treatment. In this paper, flower-shaped bovine serum albumin (FBSA) particles were prepared by a modified desolvation method followed by glutaraldehyde or heat denaturation. The size of the FBSA particles varied from 10 μm to 100 μm, with most between 50 and 80 μm. Heat-denatured FBSA particles showed good cytocompatibility, prolonging the survival time of L929 cells. The FBSA particles were used as carriers to investigate the release behavior of the model drug rhodamine B, which showed a sustained-release effect and penetrated the round-window membrane of guinea pigs. We also confirmed the attachment of FBSA particles onto the round-window membrane by microscopy. The FBSA particles, with good biocompatibility, drug-loading capacity, adhesive capability, and biodegradability, may have potential applications in local drug delivery for the treatment of inner-ear disease.
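
    Sustained-release data of this kind are often summarized with a simple empirical model; the sketch below fits the Korsmeyer-Peppas equation Mt/Minf = k*t^n to an invented release profile, purely to illustrate the analysis, not to reproduce this study's measurements.

        # Fit a hypothetical cumulative-release curve with the Korsmeyer-Peppas model.
        import numpy as np
        from scipy.optimize import curve_fit

        t_h     = np.array([1, 2, 4, 8, 12, 24, 48])                     # hours
        release = np.array([0.12, 0.18, 0.27, 0.38, 0.46, 0.63, 0.85])   # fraction released (invented)

        def korsmeyer_peppas(t, k, n):
            return k * t**n

        (k, n), _ = curve_fit(korsmeyer_peppas, t_h, release, p0=(0.1, 0.5))
        print(f"k = {k:.3f}, n = {n:.3f}")   # for spheres, n <= 0.43 indicates Fickian diffusion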

  16. Wind Tunnel Seeding Systems for Laser Velocimeters

    NASA Technical Reports Server (NTRS)

    Hunter, W. W., Jr. (Compiler); Nichols, C. E., Jr. (Compiler)

    1985-01-01

    The principal motivating factor for convening the Workshop on the Development and Application of Wind Tunnel Seeding Systems for Laser Velocimeters is the necessity to achieve efficient operation and, most importantly, to ensure accurate measurements with velocimeter techniques. The ultimate accuracy of particle-scattering-based laser velocimeter measurements of wind tunnel flow fields depends on the ability of the scattering particle to faithfully track the local flow field in which it is embedded. A complex relationship exists between the particle motion and the local flow field. This relationship is dependent on particle size, size distribution, shape, and density. To quantify the accuracy of the velocimeter measurements of the flow field, the researcher has to know the scattering particle characteristics. In order to obtain optimum velocimeter measurements, the researcher is striving to achieve control of the particle characteristics and to verify those characteristics at the measurement point. Additionally, the researcher is attempting to achieve maximum measurement efficiency through control of particle concentration and location in the flow field.
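
    A quick way to quantify this tracking fidelity is the particle relaxation time and Stokes number; the sketch below evaluates tau_p = rho_p*d_p^2/(18*mu) for a few seed-particle diameters against an assumed flow time scale, with illustrative property values (a small Stokes number indicates faithful tracking).

        # Back-of-the-envelope seeding-particle tracking check via the Stokes number.
        MU_AIR = 1.8e-5          # dynamic viscosity of air, Pa*s (typical value)

        def relaxation_time(d_p, rho_p, mu=MU_AIR):
            """Stokes-drag response time of a small spherical particle."""
            return rho_p * d_p**2 / (18.0 * mu)

        def stokes_number(d_p, rho_p, tau_flow, mu=MU_AIR):
            return relaxation_time(d_p, rho_p, mu) / tau_flow

        # Example: oil droplets (rho ~ 912 kg/m^3) in a flow feature with a 0.1 ms time scale.
        for d in (0.5e-6, 1.0e-6, 5.0e-6):
            st = stokes_number(d, 912.0, 1.0e-4)
            print(f"d = {d*1e6:.1f} um: tau_p = {relaxation_time(d, 912.0):.2e} s, St = {st:.3f}")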

  17. Mind the gap: a flow instability controlled by particle-surface distance

    NASA Astrophysics Data System (ADS)

    Driscoll, Michelle; Delmotte, Blaise; Youssef, Mena; Sacanna, Stefano; Donev, Aleksandar; Chaikin, Paul

    2016-11-01

    Does a rotating particle always spin in place? Not if that particle is near a surface: rolling leads to translational motion, as well as very strong flows around the particle, even quite far away. These large advective flows strongly couple the motion of neighboring particles, giving rise to strong collective effects in groups of rolling particles. Using a model experimental system, weakly magnetic colloids driven by a rotating magnetic field, we observe that driving a compact group of microrollers leads to a new kind of flow instability. First, an initially uniformly-distributed strip of particles evolves into a shock structure, and then it becomes unstable, emitting fingers with a well-defined wavelength. Using 3D large-scale simulations in tandem with our experiments, we find that the instability wavelength is controlled not by the driving torque or the fluid viscosity, but by a geometric parameter: the microroller's distance above the container floor. Furthermore, we find that the instability dynamics can be reproduced using only one ingredient: hydrodynamic interactions near a no-slip boundary.

  18. Abstract ID: 103 GAMOS: Implementation of a graphical user interface for dosimetry calculation in radiotherapy.

    PubMed

    Abdalaoui Slimani, Faical Alaoui; Bentourkia, M'hamed

    2018-01-01

    There are several computer programs, or combinations of programs, for tracking radiation and related quantities in tissues by means of Monte Carlo simulation [1]. Among these is GEANT4 [2], provided as classes that can be incorporated into C++ code to accomplish different tasks in simulating radiation interactions with matter. GEANT4 makes the physics easier but often requires a long learning curve, implying a good knowledge of C++ and of the Geant4 architecture. GAMOS [3], the Geant4-based Architecture for Medicine-Oriented Simulations, facilitates the use of Geant4 by providing a script language that covers almost all the needs of a radiotherapy simulation, but it remains out of reach of many biological researchers. The aim of the present work was to report the design and development of a Graphical User Interface (GUI) for absorbed dose calculation and for particle tracking in humans, small animals, and phantoms. The GUI is based on the open-source GEANT4 toolkit for the physics of particle interactions and on the Qt cross-platform framework for combining programming commands and for display. The calculation of the absorbed dose can be performed based on 3D CT images in DICOM format, on images of phantoms, or on solid volumes made from any pure or composite material specified by its molecular formula. The GUI has several menus relating to the emitting source, which can have different shapes, positions, and energies (mono-energetic or poly-energetic, such as X-ray spectra); the types of particles and particle interactions; energy deposition and absorbed dose; and the output of results as histograms. In conclusion, the GUI we developed can be easily used by any researcher without the need to be familiar with computer programming, and it will be freely distributed as open source.
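
    To illustrate the kind of bookkeeping such a tool automates, the toy sketch below scores energy deposition from photons in a voxelized 1D water slab using a bare-bones Monte Carlo (local energy deposition, no scattering or secondaries). It does not use the Geant4 or GAMOS APIs, and the attenuation coefficient is only approximate.

        # Toy Monte Carlo dose scoring in a voxelized 1D water slab: photons travel
        # exponentially distributed distances to their first interaction and deposit
        # their energy locally.  This only illustrates dose bookkeeping, not Geant4/GAMOS.
        import numpy as np

        rng = np.random.default_rng(42)

        MU_WATER = 17.1       # linear attenuation coefficient of water at ~100 keV, 1/m (approx.)
        E_PHOTON = 0.1        # photon energy, MeV
        N_PHOTONS = 100_000
        SLAB_LEN, N_VOX = 0.3, 30                 # 30 cm slab, 1 cm voxels
        VOXEL_MASS = 1000.0 * (SLAB_LEN / N_VOX)  # kg of water per voxel for a 1 m^2 broad beam

        edep = np.zeros(N_VOX)
        depth = rng.exponential(1.0 / MU_WATER, N_PHOTONS)   # first-interaction depth, m
        inside = depth < SLAB_LEN
        vox = (depth[inside] / (SLAB_LEN / N_VOX)).astype(int)
        np.add.at(edep, vox, E_PHOTON)            # score energy in the interaction voxel

        dose = edep * 1.602e-13 / VOXEL_MASS      # MeV -> J, then Gy for the assumed beam
        print("relative depth-dose:", np.round(dose / dose.max(), 2))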

  19. A View on Future Building System Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wetter, Michael

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  20. Trapped particle absorption by the Ring of Jupiter

    NASA Technical Reports Server (NTRS)

    Fillius, W.

    1983-01-01

    The interaction of trapped radiation with the ring of Jupiter is investigated. Because it poses an identical problem, the interaction with the rings of Saturn and Uranus is also examined. Data from the Pioneer 11 encounter are used to deduce some of the properties of the rings of Jupiter and Saturn. Over a dozen Jupiter magnetic field models are available in a program that integrates the adiabatic invariants to compute B and L. This program is used to label our UCSD Pioneer 11 encounter data with the most satisfactory of these models. The expected effects of absorbing material on the trapped radiation are studied to obtain the loss rate as a function of ring properties. Analysis of the particle diffusion problem rounds out the theoretical end of the ring absorption problem. Other projects include identification of decay products from energetic particle albedo off the rings and moons of Saturn and a search for flux transfer events at the Jovian magnetopause.
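
    As a simplified stand-in for the adiabatic-invariant calculation mentioned here, the sketch below computes B and L for a point near Jupiter's main ring using a centered-dipole field; the dipole moment is approximate, and the real analysis relies on the more detailed field models noted above.

        # Centered-dipole sketch of computing B and L at Jupiter: along a dipole
        # field line, L = r/(R_J*cos^2(lambda_m)) and
        # B = (B0/L^3) * sqrt(1 + 3*sin^2(lambda_m)) / cos^6(lambda_m).
        import numpy as np

        R_J = 7.1492e7      # Jupiter equatorial radius, m
        B0  = 4.2e-4        # approximate equatorial surface dipole field, T

        def dipole_B_L(r_m, mag_lat_rad):
            """Field magnitude and L-shell for a point at radius r and magnetic latitude."""
            L = (r_m / R_J) / np.cos(mag_lat_rad)**2
            B = (B0 / L**3) * np.sqrt(1.0 + 3.0 * np.sin(mag_lat_rad)**2) / np.cos(mag_lat_rad)**6
            return B, L

        # The main Jovian ring lies near 1.8 R_J in the magnetic equatorial plane.
        B, L = dipole_B_L(1.8 * R_J, 0.0)
        print(f"L = {L:.2f}, B = {B*1e4:.2f} G")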
