Nuclear and Particle Physics Simulations: The Consortium of Upper-Level Physics Software
NASA Astrophysics Data System (ADS)
Bigelow, Roberta; Moloney, Michael J.; Philpott, John; Rothberg, Joseph
1995-06-01
The Consortium for Upper Level Physics Software (CUPS) has developed a comprehensive series of nine book/software packages that Wiley will publish in FY '95 and '96. CUPS is an international group of 27 physicists, all with extensive backgrounds in the research, teaching, and development of instructional software. The project is being supported by the National Science Foundation (PHY-9014548), and it has received other support from the IBM Corp., Apple Computer Corp., and George Mason University. The simulations being developed are: Astrophysics, Classical Mechanics, Electricity & Magnetism, Modern Physics, Nuclear and Particle Physics, Quantum Mechanics, Solid State, Thermal and Statistical, and Waves and Optics.
Bridging the Particle Physics and Big Data Worlds
NASA Astrophysics Data System (ADS)
Pivarski, James
2017-09-01
For decades, particle physicists have developed custom software because the scale and complexity of our problems were unique. In recent years, however, the "big data" industry has begun to tackle similar problems, and has developed some novel solutions. Incorporating scientific Python libraries, Spark, TensorFlow, and machine learning tools into the physics software stack can improve abstraction, reliability, and in some cases performance. Perhaps more importantly, it can free physicists to concentrate on domain-specific problems. Building bridges isn't always easy, however. Physics software and open-source software from industry differ in many incidental ways and a few fundamental ways. I will show work from the DIANA-HEP project to streamline data flow from ROOT to Numpy and Spark, to incorporate ideas of functional programming into histogram aggregation, and to develop real-time, query-style manipulations of particle data.
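As one concrete illustration of the functional-programming idea mentioned in the abstract, the sketch below treats a histogram as a value that can be filled chunk by chunk and combined with an associative fold, so the same code can serve as a per-partition combiner and merge step in a system like Spark. The binning, the synthetic data, and the function names are illustrative assumptions, not DIANA-HEP code.

```python
# A minimal sketch (not DIANA-HEP code): a histogram as a fold over chunks of events.
from functools import reduce
import numpy as np

BINS = np.linspace(0.0, 200.0, 51)          # illustrative binning, e.g. a pT spectrum in GeV

def fill(hist, chunk):
    """Return a new histogram with this chunk of values added."""
    counts, _ = np.histogram(chunk, bins=BINS)
    return hist + counts

def make_chunks(n_chunks=10, size=10000, seed=0):
    rng = np.random.default_rng(seed)
    return [rng.exponential(scale=35.0, size=size) for _ in range(n_chunks)]

# The aggregation is an associative fold, so it parallelizes naturally: `fill` can act
# as a per-partition combiner and `+` as the merge step across workers.
chunks = make_chunks()
histogram = reduce(fill, chunks, np.zeros(len(BINS) - 1, dtype=np.int64))
print(histogram.sum(), "entries aggregated")
```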
Motion of a Charged Particle in a Constant and Uniform Electromagnetic Field
ERIC Educational Resources Information Center
Ladino, L. A.; Rondón, S. H.; Orduz, P.
2015-01-01
This paper focuses on the use of software developed by the authors that allows the visualization of the motion of a charged particle under the influence of magnetic and electric fields in 3D, at a level suitable for introductory physics courses. The software offers the possibility of studying a great number of physical situations that can…
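The motion such software visualizes is governed by the Lorentz force law, m dv/dt = q(E + v × B). A minimal numerical sketch of this (not the authors' program; the charge, mass, and field values are arbitrary illustrative choices) can be written as:

```python
# Illustrative integration of m dv/dt = q (E + v x B) for uniform fields (SI units).
import numpy as np
from scipy.integrate import solve_ivp

q, m = 1.602e-19, 9.109e-31                 # illustrative charge and mass
E = np.array([0.0, 1.0e3, 0.0])             # V/m, uniform electric field
B = np.array([0.0, 0.0, 1.0e-3])            # T,   uniform magnetic field

def lorentz(t, y):
    r, v = y[:3], y[3:]
    a = (q / m) * (E + np.cross(v, B))
    return np.concatenate([v, a])

y0 = np.concatenate([[0.0, 0.0, 0.0], [1.0e5, 0.0, 0.0]])   # start at origin, vx = 1e5 m/s
sol = solve_ivp(lorentz, (0.0, 5.0e-7), y0, max_step=1.0e-10)
print(sol.y[:3, -1])     # final position: cyclotron gyration plus the E x B drift
```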
Impact of detector simulation in particle physics collider experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elvira, V. Daniel
Through the last three decades, precise simulation of the interactions of particles with matter and modeling of detector geometries has proven to be of critical importance to the success of the international high-energy physics experimental programs. For example, the detailed detector modeling and accurate physics of the Geant4-based simulation software of the CMS and ATLAS particle physics experiments at the European Center of Nuclear Research (CERN) Large Hadron Collider (LHC) was a determinant factor for these collaborations to deliver physics results of outstanding quality faster than any hadron collider experiment ever before. This review article highlights the impact of detector simulation on particle physics collider experiments. It presents numerous examples of the use of simulation, from detector design and optimization, through software and computing development and testing, to cases where the use of simulation samples made a difference in the accuracy of the physics results and publication turnaround, from data-taking to submission. It also presents the economic impact and cost of simulation in the CMS experiment. Future experiments will collect orders of magnitude more data, taxing heavily the performance of simulation and reconstruction software for increasingly complex detectors. Consequently, it becomes urgent to find solutions to speed up simulation software in order to cope with the increased demand in a time of flat budgets. The study ends with a short discussion on the potential solutions that are being explored, by leveraging core count growth in multicore machines, using new generation coprocessors, and re-engineering of HEP code for concurrency and parallel computing.
Impact of detector simulation in particle physics collider experiments
Elvira, V. Daniel
2017-06-01
Through the last three decades, precise simulation of the interactions of particles with matter and modeling of detector geometries has proven to be of critical importance to the success of the international high-energy physics experimental programs. For example, the detailed detector modeling and accurate physics of the Geant4-based simulation software of the CMS and ATLAS particle physics experiments at the European Center of Nuclear Research (CERN) Large Hadron Collider (LHC) was a determinant factor for these collaborations to deliver physics results of outstanding quality faster than any hadron collider experiment ever before. This review article highlights the impact of detector simulation on particle physics collider experiments. It presents numerous examples of the use of simulation, from detector design and optimization, through software and computing development and testing, to cases where the use of simulation samples made a difference in the accuracy of the physics results and publication turnaround, from data-taking to submission. It also presents the economic impact and cost of simulation in the CMS experiment. Future experiments will collect orders of magnitude more data, taxing heavily the performance of simulation and reconstruction software for increasingly complex detectors. Consequently, it becomes urgent to find solutions to speed up simulation software in order to cope with the increased demand in a time of flat budgets. The study ends with a short discussion on the potential solutions that are being explored, by leveraging core count growth in multicore machines, using new generation coprocessors, and re-engineering of HEP code for concurrency and parallel computing.
Impact of detector simulation in particle physics collider experiments
NASA Astrophysics Data System (ADS)
Daniel Elvira, V.
2017-06-01
Through the last three decades, accurate simulation of the interactions of particles with matter and modeling of detector geometries has proven to be of critical importance to the success of the international high-energy physics (HEP) experimental programs. For example, the detailed detector modeling and accurate physics of the Geant4-based simulation software of the CMS and ATLAS particle physics experiments at the European Center of Nuclear Research (CERN) Large Hadron Collider (LHC) was a determinant factor for these collaborations to deliver physics results of outstanding quality faster than any hadron collider experiment ever before. This review article highlights the impact of detector simulation on particle physics collider experiments. It presents numerous examples of the use of simulation, from detector design and optimization, through software and computing development and testing, to cases where the use of simulation samples made a difference in the precision of the physics results and publication turnaround, from data-taking to submission. It also presents estimates of the cost and economic impact of simulation in the CMS experiment. Future experiments will collect orders of magnitude more data with increasingly complex detectors, taxing heavily the performance of simulation and reconstruction software. Consequently, exploring solutions to speed up simulation and reconstruction software to satisfy the growing demand of computing resources in a time of flat budgets is a matter that deserves immediate attention. The article ends with a short discussion on the potential solutions that are being considered, based on leveraging core count growth in multicore machines, using new generation coprocessors, and re-engineering HEP code for concurrency and parallel computing.
NASA Astrophysics Data System (ADS)
Vikhlyantsev, O. P.; Generalov, L. N.; Kuryakin, A. V.; Karpov, I. A.; Gurin, N. E.; Tumkin, A. D.; Fil'chagin, S. V.
2017-12-01
A hardware-software complex for measuring the energy and angular distributions of charged particles formed in nuclear reactions is presented. The hardware and software structures of the complex are described, including the basic set of modular nuclear-physics instrumentation for a multichannel detection system based on ΔE-E silicon detector telescopes and the hardware for experimental data collection, storage, and processing.
Software-aided discussion about classical picture of Mach-Zehnder interferometer
NASA Astrophysics Data System (ADS)
Cavalcanti, C. J. H.; Ostermann, F.; Lima, N. W.; Netto, J. S.
2017-11-01
The Mach-Zehnder interferometer has played an important role both in quantum and classical physics research over the years. In physics education, it has been used as a didactic tool for quantum physics teaching, allowing fundamental concepts, such as particle-wave duality, to be addressed from the very beginning. For a student to understand the novelties of the quantum scenario, it is first worth introducing the classical picture. In this paper, we introduce a new version of the software developed by our research group to deepen the discussion on the classical picture of the Mach-Zehnder interferometer. We present its equivalence with the double-slit experiment and we derive the mathematical expressions describing the interference pattern. We also explore the concept of visibility (which is very important for understanding wave-particle complementarity in quantum physics) to help students become familiar with this experiment and to enhance their knowledge of its counterintuitive aspects. The software is used in articulation with the mathematical formalism and the phenomenological features of the experiment. We also present excerpts of the discursive interactions of students using the software in didactic situations.
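For reference, the classical output intensities of a balanced Mach-Zehnder interferometer can be computed directly from 2 × 2 beam-splitter matrices; the sketch below is an independent illustration, not the authors' software, and it also extracts the visibility when one arm is attenuated.

```python
# Classical Mach-Zehnder: two 50/50 beam splitters, a phase in arm 1, attenuation t in arm 2.
import numpy as np

BS = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)       # lossless 50/50 beam splitter

def output_intensities(phi, t=1.0):
    """Output intensities for arm phase phi (radians) and amplitude transmission t."""
    arms = np.array([[np.exp(1j * phi), 0], [0, t]])
    out = BS @ arms @ BS @ np.array([1.0, 0.0])       # unit-amplitude field into port 1
    return np.abs(out) ** 2

phases = np.linspace(0.0, 2.0 * np.pi, 200)
I1 = np.array([output_intensities(p, t=0.6)[0] for p in phases])
visibility = (I1.max() - I1.min()) / (I1.max() + I1.min())
print(round(visibility, 3))    # approaches 2t/(1 + t**2) = 0.882 for t = 0.6
```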
NASA Astrophysics Data System (ADS)
Brandt, Douglas; Hiller, John R.; Moloney, Michael J.
1995-10-01
The Consortium for Upper Level Physics Software (CUPS) has developed a comprehensive series of nine book/software packages that Wiley will publish in FY '95 and '96. CUPS is an international group of 27 physicists, all with extensive backgrounds in the research, teaching, and development of instructional software. The project is being supported by the National Science Foundation (PHY-9014548), and it has received other support from the IBM Corp., Apple Computer Corp., and George Mason University. The simulations being developed are: Astrophysics, Classical Mechanics, Electricity & Magnetism, Modern Physics, Nuclear and Particle Physics, Quantum Mechanics, Solid State, Thermal and Statistical, and Waves and Optics.
HEP Software Foundation Community White Paper Working Group - Detector Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Apostolakis, J.
A working group on detector simulation was formed as part of the high-energy physics (HEP) Software Foundation's initiative to prepare a Community White Paper that describes the main software challenges and opportunities to be faced in the HEP field over the next decade. The working group met over a period of several months in order to review the current status of the Full and Fast simulation applications of HEP experiments and the improvements that will need to be made in order to meet the goals of future HEP experimental programmes. The scope of the topics covered includes the main components of a HEP simulation application, such as MC truth handling, geometry modeling, particle propagation in materials and fields, physics modeling of the interactions of particles with matter, the treatment of pileup and other backgrounds, as well as signal processing and digitisation. The resulting work programme described in this document focuses on the need to improve both the software performance and the physics of detector simulation. The goals are to increase the accuracy of the physics models and expand their applicability to future physics programmes, while achieving large factors in computing performance gains consistent with projections on available computing resources.
The Particle Physics Data Grid. Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Livny, Miron
2002-08-16
The main objective of the Particle Physics Data Grid (PPDG) project has been to implement and evaluate distributed (Grid-enabled) data access and management technology for current and future particle and nuclear physics experiments. The specific goals of PPDG have been to design, implement, and deploy a Grid-based software infrastructure capable of supporting the data generation, processing and analysis needs common to the physics experiments represented by the participants, and to adapt experiment-specific software to operate in the Grid environment and to exploit this infrastructure. To accomplish these goals, the PPDG focused on the implementation and deployment of several critical services: reliable and efficient file replication service, high-speed data transfer services, multisite file caching and staging service, and reliable and recoverable job management services. The focus of the activity was the job management services and the interplay between these services and distributed data access in a Grid environment. Software was developed to study the interaction between HENP applications and distributed data storage fabric. One key conclusion was the need for a reliable and recoverable tool for managing large collections of interdependent jobs. An attached document provides an overview of the current status of the Directed Acyclic Graph Manager (DAGMan) with its main features and capabilities.
Software-type Wave-Particle Interaction Analyzer (SWPIA) by RPWI for JUICE
NASA Astrophysics Data System (ADS)
Katoh, Y.; Kojima, H.; Asamura, K.; Kasaba, Y.; Tsuchiya, F.; Kasahara, Y.; Ishisaka, S.; Kimura, T.; Miyoshi, Y.; Santolik, O.; Bergman, J.; Puccio, W.; Gill, R.; Wieser, M.; Schmidt, W.; Barabash, S.; Wahlund, J.-E.
2017-09-01
The Software-type Wave-Particle Interaction Analyzer (SWPIA) will be realized as a software function of the Low-Frequency receiver (LF) running on the DPU of RPWI (Radio and Plasma Waves Investigation) for the ESA JUICE mission. SWPIA conducts onboard computations of physical quantities indicating the energy exchange between plasma waves and energetic ions. Onboard inter-instrument communication is necessary to realize SWPIA, and will be implemented through the joint efforts of RPWI, PEP (Particle Environment Package) and J-MAG (JUICE Magnetometer). By providing direct evidence of ion energization processes by plasma waves around the Jovian satellites, SWPIA contributes to the scientific output of JUICE as much as possible while keeping its impact on the telemetry data size to a minimum.
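The kind of quantity such an analyzer accumulates onboard is the energy transfer rate between the wave field and the detected particles, essentially the sum of q E · v over particle detections. The snippet below only illustrates that formula with synthetic inputs; it is not the SWPIA flight software, and the array shapes and values are assumptions.

```python
# Illustrative accumulation of q E(t_i) . v_i over detected particles (result in watts).
import numpy as np

def energy_exchange(charge, E_wave, velocities):
    """Sum of q E(t_i) . v_i, with E_wave and velocities as (N, 3) arrays."""
    return charge * np.sum(np.einsum("ij,ij->i", E_wave, velocities))

rng = np.random.default_rng(1)
E_wave = 1e-3 * rng.standard_normal((1000, 3))       # V/m, wave field at each detection time
velocities = 1e5 * rng.standard_normal((1000, 3))    # m/s, velocity of each detected ion
print(energy_exchange(1.602e-19, E_wave, velocities))
```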
Cross-platform validation and analysis environment for particle physics
NASA Astrophysics Data System (ADS)
Chekanov, S. V.; Pogrebnyak, I.; Wilbern, D.
2017-11-01
A multi-platform validation and analysis framework for public Monte Carlo simulation for high-energy particle collisions is discussed. The front-end of this framework uses the Python programming language, while the back-end is written in Java, which provides a multi-platform environment that can be run from a web browser and can easily be deployed at the grid sites. The analysis package includes all major software tools used in high-energy physics, such as Lorentz vectors, jet algorithms, histogram packages, graphic canvases, and tools for providing data access. This multi-platform software suite, designed to minimize OS-specific maintenance and deployment time, is used for online validation of Monte Carlo event samples through a web interface.
The Particle Physics Playground website: tutorials and activities using real experimental data
NASA Astrophysics Data System (ADS)
Bellis, Matthew; CMS Collaboration
2016-03-01
The CERN Open Data Portal provides access to data from the LHC experiments to anyone with the time and inclination to learn the analysis procedures. The CMS experiment has made a significant amount of data available in basically the same format the collaboration itself uses, along with software tools and a virtual environment in which to run those tools. These same data have also been mined for educational exercises that range from very simple .csv files that can be analyzed in a spreadsheet to more sophisticated formats that use ROOT, a dominant software package in experimental particle physics but not used as much in the general computing community. This talk will present the Particle Physics Playground website (http://particle-physics-playground.github.io/), a project that uses data from the CMS experiment, as well as the older CLEO experiment, in tutorials and exercises aimed at high school and undergraduate students and other science enthusiasts. The data are stored as text files and the users are provided with starter Python/Jupyter notebook programs and accessor functions which can be modified to perform fairly high-level analyses. The status of the project, success stories, and future plans for the website will be presented. This work was supported in part by NSF Grant PHY-1307562.
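A typical starter exercise of this kind is reconstructing an invariant mass from energies and momenta stored in a text file. The sketch below is a generic example in that spirit; the column names, file name, and demo numbers are hypothetical, not the website's actual accessor functions.

```python
# Dimuon invariant mass m^2 = (E1 + E2)^2 - |p1 + p2|^2, energies and momenta in GeV.
import numpy as np
import pandas as pd

def invariant_mass(df):
    E = df["E1"] + df["E2"]
    px = df["px1"] + df["px2"]
    py = df["py1"] + df["py2"]
    pz = df["pz1"] + df["pz2"]
    return np.sqrt(np.maximum(E**2 - px**2 - py**2 - pz**2, 0.0))

# events = pd.read_csv("dimuons.csv")     # hypothetical file downloaded from the website
demo = pd.DataFrame([{"E1": 44.45, "px1": 20.0, "py1": 30.0, "pz1": 26.0,
                      "E2": 43.86, "px2": -18.0, "py2": -32.0, "pz2": -24.0}])
print(invariant_mass(demo).iloc[0])        # about 88 GeV for this made-up candidate
```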
Kassiopeia: a modern, extensible C++ particle tracking package
DOE Office of Scientific and Technical Information (OSTI.GOV)
Furse, Daniel; Groh, Stefan; Trost, Nikolaus
The Kassiopeia particle tracking framework is an object-oriented software package using modern C++ techniques, written originally to meet the needs of the KATRIN collaboration. Kassiopeia features a new algorithmic paradigm for particle tracking simulations which targets experiments containing complex geometries and electromagnetic fields, with high priority put on calculation efficiency, customizability, extensibility, and ease-of-use for novice programmers. To solve Kassiopeia's target physics problem the software is capable of simulating particle trajectories governed by arbitrarily complex differential equations of motion, continuous physics processes that may in part be modeled as terms perturbing that equation of motion, stochastic processes that occur in flight such as bulk scattering and decay, and stochastic surface processes occurring at interfaces, including transmission and reflection effects. This entire set of computations takes place against the backdrop of a rich geometry package which serves a variety of roles, including initialization of electromagnetic field simulations and the support of state-dependent algorithm-swapping and behavioral changes as a particle's state evolves. Thanks to the very general approach taken by Kassiopeia it can be used by other experiments facing similar challenges when calculating particle trajectories in electromagnetic fields. It is publicly available at https://github.com/KATRIN-Experiment/Kassiopeia.
Kassiopeia: a modern, extensible C++ particle tracking package
Furse, Daniel; Groh, Stefan; Trost, Nikolaus; ...
2017-05-16
The Kassiopeia particle tracking framework is an object-oriented software package using modern C++ techniques, written originally to meet the needs of the KATRIN collaboration. Kassiopeia features a new algorithmic paradigm for particle tracking simulations which targets experiments containing complex geometries and electromagnetic fields, with high priority put on calculation efficiency, customizability, extensibility, and ease-of-use for novice programmers. To solve Kassiopeia's target physics problem the software is capable of simulating particle trajectories governed by arbitrarily complex differential equations of motion, continuous physics processes that may in part be modeled as terms perturbing that equation of motion, stochastic processes that occur in flight such as bulk scattering and decay, and stochastic surface processes occurring at interfaces, including transmission and reflection effects. This entire set of computations takes place against the backdrop of a rich geometry package which serves a variety of roles, including initialization of electromagnetic field simulations and the support of state-dependent algorithm-swapping and behavioral changes as a particle's state evolves. Thanks to the very general approach taken by Kassiopeia it can be used by other experiments facing similar challenges when calculating particle trajectories in electromagnetic fields. It is publicly available at https://github.com/KATRIN-Experiment/Kassiopeia.
iQIST v0.7: An open source continuous-time quantum Monte Carlo impurity solver toolkit
NASA Astrophysics Data System (ADS)
Huang, Li
2017-12-01
In this paper, we present a new version of the iQIST software package, which is capable of solving various quantum impurity models by using the hybridization expansion (or strong coupling expansion) continuous-time quantum Monte Carlo algorithm. In the revised version, the software architecture is completely redesigned. A new basis (intermediate representation or singular value decomposition representation) for the single-particle and two-particle Green's functions is introduced. Many useful physical observables are added, such as the charge susceptibility, fidelity susceptibility, Binder cumulant, and autocorrelation time. In particular, we optimize the measurement of the two-particle Green's functions. Both the particle-hole and particle-particle channels are supported. In addition, the block structure of the two-particle Green's functions is exploited to accelerate the calculation. Finally, we fix some known bugs and limitations. The computational efficiency of the code is greatly enhanced.
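Two of the observables named above have simple, standard estimators; the following generic sketch (not iQIST source code) shows how they are commonly computed from a series of Monte Carlo measurements.

```python
# Generic estimators for the Binder cumulant and the integrated autocorrelation time.
import numpy as np

def binder_cumulant(m):
    """U4 = 1 - <m^4> / (3 <m^2>^2) for a series of measurements m."""
    m = np.asarray(m, dtype=float)
    return 1.0 - np.mean(m**4) / (3.0 * np.mean(m**2) ** 2)

def integrated_autocorrelation_time(x, window=100):
    """tau_int = 1/2 + sum_t rho(t), with the sum truncated at a fixed window."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    var = np.dot(x, x) / len(x)
    rho = [np.dot(x[:-t], x[t:]) / ((len(x) - t) * var) for t in range(1, window)]
    return 0.5 + np.sum(rho)

samples = np.random.default_rng(2).normal(size=20000)   # uncorrelated toy data
print(binder_cumulant(samples))                          # ~0 for Gaussian samples
print(integrated_autocorrelation_time(samples))          # ~0.5 for uncorrelated samples
```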
Cross-platform validation and analysis environment for particle physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chekanov, S. V.; Pogrebnyak, I.; Wilbern, D.
A multi-platform validation and analysis framework for public Monte Carlo simulation for high-energy particle collisions is discussed. The front-end of this framework uses the Python programming language, while the back-end is written in Java, which provides a multi-platform environment that can be run from a web browser and can easily be deployed at the grid sites. The analysis package includes all major software tools used in high-energy physics, such as Lorentz vectors, jet algorithms, histogram packages, graphic canvases, and tools for providing data access. This multi-platform software suite, designed to minimize OS-specific maintenance and deployment time, is used for online validation of Monte Carlo event samples through a web interface.
Kassiopeia: a modern, extensible C++ particle tracking package
NASA Astrophysics Data System (ADS)
Furse, Daniel; Groh, Stefan; Trost, Nikolaus; Babutzka, Martin; Barrett, John P.; Behrens, Jan; Buzinsky, Nicholas; Corona, Thomas; Enomoto, Sanshiro; Erhard, Moritz; Formaggio, Joseph A.; Glück, Ferenc; Harms, Fabian; Heizmann, Florian; Hilk, Daniel; Käfer, Wolfgang; Kleesiek, Marco; Leiber, Benjamin; Mertens, Susanne; Oblath, Noah S.; Renschler, Pascal; Schwarz, Johannes; Slocum, Penny L.; Wandkowsky, Nancy; Wierman, Kevin; Zacher, Michael
2017-05-01
The Kassiopeia particle tracking framework is an object-oriented software package using modern C++ techniques, written originally to meet the needs of the KATRIN collaboration. Kassiopeia features a new algorithmic paradigm for particle tracking simulations which targets experiments containing complex geometries and electromagnetic fields, with high priority put on calculation efficiency, customizability, extensibility, and ease-of-use for novice programmers. To solve Kassiopeia's target physics problem the software is capable of simulating particle trajectories governed by arbitrarily complex differential equations of motion, continuous physics processes that may in part be modeled as terms perturbing that equation of motion, stochastic processes that occur in flight such as bulk scattering and decay, and stochastic surface processes occurring at interfaces, including transmission and reflection effects. This entire set of computations takes place against the backdrop of a rich geometry package which serves a variety of roles, including initialization of electromagnetic field simulations and the support of state-dependent algorithm-swapping and behavioral changes as a particle’s state evolves. Thanks to the very general approach taken by Kassiopeia it can be used by other experiments facing similar challenges when calculating particle trajectories in electromagnetic fields. It is publicly available at https://github.com/KATRIN-Experiment/Kassiopeia.
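The idea of an equation of motion assembled from a base term plus perturbing continuous processes can be sketched in a few lines; the toy below is written in Python purely for illustration (Kassiopeia itself is C++), with made-up field, drag, and initial values.

```python
# Toy composition of ODE terms: Lorentz gyration plus a small continuous drag term.
import numpy as np
from scipy.integrate import solve_ivp

Q, M = -1.602e-19, 9.109e-31                  # illustrative charge and mass
B = np.array([0.0, 0.0, 1.0e-4])              # T, static guiding field

def lorentz_term(t, r, v):
    return (Q / M) * np.cross(v, B)

def drag_term(t, r, v, k=5.0e5):              # toy stand-in for a continuous energy-loss process
    return -k * v

def equation_of_motion(t, y, terms):
    r, v = y[:3], y[3:]
    a = sum(term(t, r, v) for term in terms)  # perturbing terms simply add to the base term
    return np.concatenate([v, a])

y0 = np.concatenate([[0.0, 0.0, 0.0], [2.0e5, 0.0, 1.0e4]])
sol = solve_ivp(equation_of_motion, (0.0, 2.0e-6), y0,
                args=([lorentz_term, drag_term],), max_step=1.0e-9)
print(np.linalg.norm(sol.y[3:, -1]))          # the speed decays while the particle gyrates
```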
NASA Astrophysics Data System (ADS)
Swanson, Molly E. C.
2008-08-01
Particles have tremendous potential as astronomical messengers, and conversely, studying the universe as a whole also teaches us about particle physics. This thesis encompasses both of these research directions. Many models predict a diffuse flux of high energy neutrinos from active galactic nuclei and other astrophysical sources. The "Astrophysics Underground" portion of this thesis describes a search for this neutrino flux performed by looking for very high energy upward-going muons using the Super-K detector. In addition to using particles to do astronomy, we can also use the universe itself as a particle physics lab. The "Particle Physics in the Sky" portion of this thesis focuses on extracting cosmological information from galaxy surveys. To overcome technical challenges faced by the latest galaxy surveys, we produced a comprehensive upgrade to mangle, a software package that processes the angular masks defining the survey area on the sky. We added dramatically faster algorithms and new useful features that are necessary for managing complex masks of current and next-generation galaxy surveys. With this software in hand, we utilized SDSS data to investigate the relation between galaxies and dark matter by studying relative bias, i.e., the relation between different types of galaxies. Separating galaxies by their luminosities and colors reveals a complicated picture: red galaxies are clustered more strongly than blue galaxies, with both the brightest and the faintest red galaxies showing the strongest clustering. Furthermore, red and blue galaxies tend to occupy different regions of space. In order to make precise measurements from the next generation of galaxy surveys, it will be essential to account for this complexity.
Determining Trajectory of Triboelectrically Charged Particles, Using Discrete Element Modeling
NASA Technical Reports Server (NTRS)
2008-01-01
The Kennedy Space Center (KSC) Electrostatics and Surface Physics Laboratory is participating in an Innovative Partnership Program (IPP) project with an industry partner to modify a commercial off-the-shelf simulation software product to treat the electrodynamics of particulate systems. Discrete element modeling (DEM) is a numerical technique that can track the dynamics of particle systems. This technique, which was introduced in 1979 for analysis of rock mechanics, was recently refined to include the contact force interaction of particles with arbitrary surfaces and moving machinery. In our work, we endeavor to incorporate electrostatic forces into the DEM calculations to enhance the fidelity of the software and its applicability to (1) particle processes, such as electrophotography, that are greatly affected by electrostatic forces, (2) grain and dust transport, and (3) the study of lunar and Martian regoliths.
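A bare-bones version of the idea, adding a pairwise Coulomb force to a linear contact spring in an explicit DEM-style time step, might look like the sketch below; it is a conceptual illustration only, not the commercial software mentioned, and the stiffness, charges, masses, and time step are invented values.

```python
# Two charged spheres: electrostatic repulsion plus a contact spring when they overlap.
import numpy as np

K_E = 8.988e9            # Coulomb constant, N m^2 / C^2
K_CONTACT = 1.0e4        # N/m, illustrative linear contact stiffness

def forces(pos, radii, charges):
    f = np.zeros_like(pos)
    for i in range(len(pos)):
        for j in range(i + 1, len(pos)):
            rij = pos[j] - pos[i]
            d = np.linalg.norm(rij)
            unit = rij / d
            fe = -K_E * charges[i] * charges[j] / d**2 * unit      # repulsive for like charges
            overlap = radii[i] + radii[j] - d
            fc = -K_CONTACT * max(overlap, 0.0) * unit             # acts only on contact
            f[i] += fe + fc
            f[j] -= fe + fc
    return f

pos = np.array([[0.0, 0.0, 0.0], [2.5e-3, 0.0, 0.0]])              # m
vel = np.zeros_like(pos)
radii, charges, mass, dt = np.array([1e-3, 1e-3]), np.array([1e-11, 1e-11]), 1e-6, 1e-5
for _ in range(1000):                                               # explicit time stepping
    vel += forces(pos, radii, charges) / mass * dt
    pos += vel * dt
print(np.linalg.norm(pos[1] - pos[0]))                              # the grains drift apart
```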
Building an infrastructure at PICKSC for the educational use of kinetic software tools
NASA Astrophysics Data System (ADS)
Mori, W. B.; Decyk, V. K.; Tableman, A.; Fonseca, R. A.; Tsung, F. S.; Hu, Q.; Winjum, B. J.; Amorim, L. D.; An, W.; Dalichaouch, T. N.; Davidson, A.; Joglekar, A.; Li, F.; May, J.; Touati, M.; Xu, X. L.; Yu, P.
2016-10-01
One aim of the Particle-In-Cell and Kinetic Simulation Center (PICKSC) at UCLA is to coordinate a community development of educational software for undergraduate and graduate courses in plasma physics and computer science. The rich array of physical behaviors exhibited by plasmas can be difficult to grasp by students. If they are given the ability to quickly and easily explore plasma physics through kinetic simulations, and to make illustrative visualizations of plasma waves, particle motion in electromagnetic fields, instabilities, or other phenomena, then they can be equipped with first-hand experiences that inform and contextualize conventional texts and lectures. We are developing an infrastructure for any interested persons to take our kinetic codes, run them without any prerequisite knowledge, and explore desired scenarios. Furthermore, we are actively interested in any ideas or input from other plasma physicists. This poster aims to illustrate what we have developed and gather a community of interested users and developers. Supported by NSF under Grant ACI-1339893.
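As a flavor of the kind of hands-on exploration described here, a stripped-down one-dimensional electrostatic particle-in-cell loop fits in a few dozen lines: deposit charge on a periodic grid, solve Poisson's equation with an FFT, gather the field, and push the particles. This is a toy written for this summary, not a PICKSC code, and all numbers are in normalized units with the plasma frequency set to one.

```python
# Toy 1-D electrostatic PIC: a perturbed electron population over a fixed ion background.
import numpy as np

ng, L, n_part, dt, steps = 64, 2 * np.pi, 10000, 0.05, 400
dx = L / ng
rng = np.random.default_rng(3)
x = rng.uniform(0.0, L, n_part)
x = (x + 0.01 * np.sin(x)) % L                 # small sinusoidal density perturbation
v = np.zeros(n_part)
q_over_m = -1.0                                # electrons, normalized units (omega_p = 1)

k = 2 * np.pi * np.fft.rfftfreq(ng, d=dx)      # grid wavenumbers, k[0] = 0 is the mean

for _ in range(steps):
    idx = (x / dx).astype(int) % ng
    # nearest-grid-point charge deposition; a uniform ion background cancels the mean
    delta_n = np.bincount(idx, minlength=ng) / (n_part / ng) - 1.0
    # Poisson's equation in Fourier space gives the electric field from the density
    dn_k = np.fft.rfft(delta_n)
    E_k = np.zeros_like(dn_k)
    E_k[1:] = 1j * dn_k[1:] / k[1:]
    E = np.fft.irfft(E_k, n=ng)
    # gather the field at the particle cells and push
    v += q_over_m * E[idx] * dt
    x = (x + v * dt) % L

print("field energy:", 0.5 * np.sum(E**2) * dx)   # snapshot of the oscillating field energy
```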
HEPData: a repository for high energy physics data
NASA Astrophysics Data System (ADS)
Maguire, Eamonn; Heinrich, Lukas; Watt, Graeme
2017-10-01
The Durham High Energy Physics Database (HEPData) has been built up over the past four decades as a unique open-access repository for scattering data from experimental particle physics papers. It comprises data points underlying several thousand publications. Over the last two years, the HEPData software has been completely rewritten using modern computing technologies as an overlay on the Invenio v3 digital library framework. The software is open source with the new site available at https://hepdata.net now replacing the previous site at http://hepdata.cedar.ac.uk. In this write-up, we describe the development of the new site and explain some of the advantages it offers over the previous platform.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Genser, Krzysztof; Hatcher, Robert; Kelsey, Michael
The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with physically motivated parameters that are tuned to cover many application domains. To study what uncertainties are associated with the Geant4 physics models we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. a flexible run-time configurable workflow, comprehensive bookkeeping, and an easy-to-expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.
NASA Technical Reports Server (NTRS)
Pordes, Ruth (Editor)
1989-01-01
Papers on real-time computer applications in nuclear, particle, and plasma physics are presented, covering topics such as expert systems tactics in testing FASTBUS segment interconnect modules, trigger control in a high energy physics experiment, the FASTBUS read-out system for the Aleph time projection chamber, multiprocessor data acquisition systems, DAQ software architecture for Aleph, a VME multiprocessor system for plasma control at the JT-60 upgrade, and a multitasking, multisinked, multiprocessor data acquisition front end. Other topics include real-time data reduction using a microVAX processor, a transputer-based coprocessor for VEDAS, simulation of a macropipelined multi-CPU event processor for use in FASTBUS, a distributed VME control system for the LISA superconducting Linac, and a distributed system for laboratory process automation. Additional topics include a structure macro assembler for the event handler, a data acquisition and control system for Thomson scattering on ATF, remote procedure execution software for distributed systems, and a PC-based graphic display of real-time particle beam uniformity.
Towards a high performance geometry library for particle-detector simulations
Apostolakis, J.; Bandieramonte, M.; Bitzes, G.; ...
2015-05-22
Thread-parallelization and single-instruction multiple data (SIMD) "vectorisation" of software components in HEP computing has become a necessity to fully benefit from current and future computing hardware. In this context, the Geant-Vector/GPU simulation project aims to re-engineer current software for the simulation of the passage of particles through detectors in order to increase the overall event throughput. As one of the core modules in this area, the geometry library plays a central role and vectorising its algorithms will be one of the cornerstones towards achieving good CPU performance. Here, we report on the progress made in vectorising the shape primitives, as well as in applying new C++ template based optimizations of existing code available in the Geant4, ROOT or USolids geometry libraries. We will focus on a presentation of our software development approach that aims to provide optimized code for all use cases of the library (e.g., single particle and many-particle APIs) and to support different architectures (CPU and GPU) while keeping the code base small, manageable and maintainable. We report on a generic and templated C++ geometry library as a continuation of the AIDA USolids project. As a result, the experience gained with these developments will be beneficial to other parts of the simulation software, such as for the optimization of the physics library, and possibly to other parts of the experiment software stack, such as reconstruction and analysis.
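The many-particle API idea can be pictured with a single geometry query written once per particle and once for a whole batch; the NumPy version below is only an analogy for the SIMD-friendly C++ interfaces described, and the sphere shape and the restriction to rays starting outside it are illustrative simplifications.

```python
# Distance along a unit direction to a sphere of radius R, scalar vs. batched form.
import numpy as np

def dist_to_sphere_one(p, d, R=1.0):
    """Entry distance from a point p outside the sphere along unit direction d (inf if missed)."""
    b = np.dot(p, d)
    disc = b * b - (np.dot(p, p) - R * R)
    if disc < 0.0:
        return np.inf
    t = -b - np.sqrt(disc)
    return t if t > 0.0 else np.inf

def dist_to_sphere_batch(P, D, R=1.0):
    """The same query for N particles at once; P and D are (N, 3) arrays."""
    b = np.einsum("ij,ij->i", P, D)
    disc = b * b - (np.einsum("ij,ij->i", P, P) - R * R)
    t = -b - np.sqrt(np.where(disc < 0.0, np.nan, disc))
    return np.where((disc >= 0.0) & (t > 0.0), t, np.inf)

P = np.array([[0.0, 0.0, -3.0], [0.0, 2.0, -3.0]])
D = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
print(dist_to_sphere_one(P[0], D[0]), dist_to_sphere_batch(P, D))   # 2.0 and [2.0, inf]
```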
Towards a high performance geometry library for particle-detector simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Apostolakis, J.; Bandieramonte, M.; Bitzes, G.
Thread-parallelization and single-instruction multiple data (SIMD) "vectorisation" of software components in HEP computing has become a necessity to fully benefit from current and future computing hardware. In this context, the Geant-Vector/GPU simulation project aims to re-engineer current software for the simulation of the passage of particles through detectors in order to increase the overall event throughput. As one of the core modules in this area, the geometry library plays a central role and vectorising its algorithms will be one of the cornerstones towards achieving good CPU performance. Here, we report on the progress made in vectorising the shape primitives, as well as in applying new C++ template based optimizations of existing code available in the Geant4, ROOT or USolids geometry libraries. We will focus on a presentation of our software development approach that aims to provide optimized code for all use cases of the library (e.g., single particle and many-particle APIs) and to support different architectures (CPU and GPU) while keeping the code base small, manageable and maintainable. We report on a generic and templated C++ geometry library as a continuation of the AIDA USolids project. As a result, the experience gained with these developments will be beneficial to other parts of the simulation software, such as for the optimization of the physics library, and possibly to other parts of the experiment software stack, such as reconstruction and analysis.
NASA Astrophysics Data System (ADS)
Plebe, Alice; Grasso, Giorgio
2016-12-01
This paper describes a system developed for the simulation of flames inside an open-source 3D computer graphics package, Blender, with the aim of analyzing hazard scenarios in large-scale industrial plants in virtual reality. The advantages of Blender are its ability to render the very complex structure of large industrial plants at high resolution and its embedded physics engine based on smoothed particle hydrodynamics. This particle system is used to evolve a simulated fire. The interaction of this fire with the components of the plant is computed using polyhedron separation distance, adopting a Voronoi-based strategy that optimizes the number of feature distance computations. Results for a real oil and gas refinery are presented.
How do particle physicists learn the programming concepts they need?
NASA Astrophysics Data System (ADS)
Kluth, S.; Pia, M. G.; Schoerner-Sadenius, T.; Steinbach, P.
2015-12-01
The ability to read, use and develop code efficiently and successfully is a key ingredient in modern particle physics. We report the experience of a training program, identified as “Advanced Programming Concepts”, that introduces software concepts, methods and techniques needed to work effectively on a daily basis in a HEP experiment or other programming-intensive fields. This paper illustrates the principles, motivations and methods that shape the “Advanced Programming Concepts” training program, the knowledge base that it conveys, an analysis of the feedback received so far, and the integration of these concepts in the software development process of the experiments as well as its applicability to a wider audience.
Nuclear Data Online Services at Peking University
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fan, T.S.; Guo, Z.Y.; Ye, W.G.
2005-05-24
The Institute of Heavy Ion Physics at Peking University has developed a new nuclear data online services software package. Through the web site (http://ndos.nst.pku.edu.cn), it offers online access to main relational nuclear databases: five evaluated neutron libraries (BROND, CENDL, ENDF, JEF, JENDL), the ENSDF library, the EXFOR library, the IAEA photonuclear library and the charged particle data of the FENDL library. This software allows the comparison and graphic representations of the different data sets. The computer programs of this package are based on the Linux implementation of PHP and the MySQL software.
Nuclear Data Online Services at Peking University
NASA Astrophysics Data System (ADS)
Fan, T. S.; Guo, Z. Y.; Ye, W. G.; Liu, W. L.; Liu, T. J.; Liu, C. X.; Chen, J. X.; Tang, G. Y.; Shi, Z. M.; Huang, X. L.; Chen, J. E.
2005-05-01
The Institute of Heavy Ion Physics at Peking University has developed a new nuclear data online services software package. Through the web site (http://ndos.nst.pku.edu.cn), it offers online access to main relational nuclear databases: five evaluated neutron libraries (BROND, CENDL, ENDF, JEF, JENDL), the ENSDF library, the EXFOR library, the IAEA photonuclear library and the charged particle data of the FENDL library. This software allows the comparison and graphic representations of the different data sets. The computer programs of this package are based on the Linux implementation of PHP and the MySQL software.
Beyond detection: nuclear physics with a webcam in an educational setting
NASA Astrophysics Data System (ADS)
Pallone, Arthur
2015-03-01
Nuclear physics affects our daily lives in fields as diverse as medicine and art. I believe three obstacles - limited time, lack of subject familiarity and thus comfort on the part of educators, and equipment expense - must be overcome to produce a nuclear-educated populace. Educators regularly use webcams to actively engage students in scientific discovery, as evidenced by a literature search for the term webcam paired with topics such as astronomy, biology, and physics. Inspired by YouTube videos that demonstrate alpha particle detection by modified webcams, I searched for examples that go beyond simple detection and found only one education-oriented result - the determination of the in-air range of alphas using a modified CCD camera. Custom-built, radiation-hardened CMOS detectors exist in high energy physics and for soft x-ray detection. Commercial CMOS cameras are used for direct imaging in electron microscopy. I demonstrate charged-particle spectrometry with a slightly modified CMOS-based webcam. When used with inexpensive sources of radiation and free software, the webcam charged-particle spectrometer presents educators with a simple, low-cost technique to include nuclear physics in science education.
A Roadmap for HEP Software and Computing R&D for the 2020s
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alves, Antonio Augusto, Jr; et al.
Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments, or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer amounts of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.
Physics in "Real Life": Accelerator-based Research with Undergraduates
NASA Astrophysics Data System (ADS)
Klay, J. L.
All undergraduates in physics and astronomy should have access to significant research experiences. When given the opportunity to tackle challenging open-ended problems outside the classroom, students build their problem-solving skills in ways that better prepare them for the workplace or future research in graduate school. Accelerator-based research on fundamental nuclear and particle physics can provide a myriad of opportunities for undergraduate involvement in hardware and software development as well as "big data" analysis. The collaborative nature of large experiments exposes students to scientists of every culture and helps them begin to build their professional network even before they graduate. This paper presents an overview of my experiences - the good, the bad, and the ugly - engaging undergraduates in particle and nuclear physics research at the CERN Large Hadron Collider and the Los Alamos Neutron Science Center.
WorldWide Web: Hypertext from CERN.
ERIC Educational Resources Information Center
Nickerson, Gord
1992-01-01
Discussion of software tools for accessing information on the Internet focuses on the WorldWideWeb (WWW) system, which was developed at the European Particle Physics Laboratory (CERN) in Switzerland to build a worldwide network of hypertext links using available networking technology. Its potential for use with multimedia documents is also…
Giving pandas ROOT to chew on: experiences with the XENON1T Dark Matter experiment
NASA Astrophysics Data System (ADS)
Remenska, D.; Tunnell, C.; Aalbers, J.; Verhoeven, S.; Maassen, J.; Templon, J.
2017-10-01
In preparation for the XENON1T Dark Matter data acquisition, we have prototyped and implemented a new computing model. The XENON signal and data processing software is developed fully in Python 3, and makes extensive use of generic scientific data analysis libraries, such as the SciPy stack. A certain tension between modern “Big Data” solutions and existing HEP frameworks is typically experienced in smaller particle physics experiments. ROOT is still the “standard” data format in our field, defined by large experiments (ATLAS, CMS). To ease the transition, our computing model caters to both analysis paradigms, leaving the choice of using ROOT-specific C++ libraries, or alternatively, Python and its data analytics tools, as a front-end choice of developing physics algorithms. We present our path on harmonizing these two ecosystems, which allowed us to use off-the-shelf software libraries (e.g., NumPy, SciPy, scikit-learn, matplotlib) and lower the cost of development and maintenance. To analyse the data, our software allows researchers to easily create “mini-trees”: small, tabular ROOT structures for Python analysis, which can be read directly into pandas DataFrame structures. One of our goals was making ROOT available as a cross-platform binary for an easy installation from the Anaconda Cloud (without going through the “dependency hell”). In addition to helping us discover dark matter interactions, lowering this barrier helps shift particle physics toward non-domain-specific code.
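For the ROOT-to-pandas step itself, one generic way to read such a mini-tree today is shown below; this uses the uproot library as an example rather than necessarily the XENON1T tooling, and the file, tree, and branch names are made up.

```python
# Read a few branches of a ROOT "mini-tree" directly into a pandas DataFrame.
import uproot

with uproot.open("minitree.root") as f:                      # hypothetical mini-tree file
    events = f["tree"].arrays(["cs1", "cs2", "drift_time"],  # hypothetical branch names
                              library="pd")

# From here the analysis is plain pandas: cuts, derived columns, histograms, plots.
selected = events.query("cs1 > 0 and cs2 > 0")
print(selected.describe())
```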
DOE Office of Scientific and Technical Information (OSTI.GOV)
Genser, Krzysztof; Hatcher, Robert; Perdue, Gabriel
2016-11-10
The Geant4 toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models are tuned to cover a large variety of possible applications. This raises the critical question of what uncertainties are associated with the Geant4 physics model, or group of models, involved in a simulation project. To address the challenge, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Key functionalities of the toolkit are presented in this paper and are illustrated with selected results.
IViPP: A Tool for Visualization in Particle Physics
NASA Astrophysics Data System (ADS)
Tran, Hieu; Skiba, Elizabeth; Baldwin, Doug
2011-10-01
Experiments and simulations in physics generate a lot of data; visualization is helpful to prepare that data for analysis. IViPP (Interactive Visualizations in Particle Physics) is an interactive computer program that visualizes results of particle physics simulations or experiments. IViPP can handle data from different simulators, such as SRIM or MCNP. It can display relevant geometry and measured scalar data; it can do simple selection from the visualized data. In order to be an effective visualization tool, IViPP must have a software architecture that can flexibly adapt to new data sources and display styles. It must be able to display complicated geometry and measured data with a high dynamic range. We therefore organize it in a highly modular structure, we develop libraries to describe geometry algorithmically, use rendering algorithms running on the powerful GPU to display 3-D geometry at interactive rates, and we represent scalar values in a visual form of scientific notation that shows both mantissa and exponent. This work was supported in part by the US Department of Energy through the Laboratory for Laser Energetics (LLE), with special thanks to Craig Sangster at LLE.
DEM Solutions Develops Answers to Modeling Lunar Dust and Regolith
NASA Technical Reports Server (NTRS)
Dunn, Carol Anne; Calle, Carlos; LaRoche, Richard D.
2010-01-01
With the proposed return to the Moon, scientists like NASA-KSC's Dr. Calle are concerned for a number of reasons. We will be staying longer on the planet's surface, future missions may include dust-raising activities, such as excavation and handling of lunar soil and rock, and we will be sending robotic instruments to do much of the work for us. Understanding more about the chemical and physical properties of lunar dust, how dust particles interact with each other and with equipment surfaces, and the role of static electricity build-up on dust particles in the low-humidity lunar environment is imperative to the development of technologies for removing and preventing dust accumulation and for successfully handling lunar regolith. Dr. Calle is currently working on the problems of the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces, particularly to those of Mars and the Moon, and is heavily involved in developing instrumentation for future planetary missions. With this end in view, the NASA Kennedy Space Center's Innovative Partnerships Program Office partnered with DEM Solutions, Inc. DEM Solutions is a global leader in particle dynamics simulation software, providing custom solutions for use in tackling tough design and process problems related to bulk solids handling. Customers in industries such as pharmaceutical, chemical, mineral, and materials processing as well as oil and gas production, agricultural and construction, and geo-technical engineering use DEM Solutions' EDEM™ software to improve the design and operation of their equipment while reducing development costs, time-to-market and operational risk. EDEM is the world's first general-purpose computer-aided engineering (CAE) tool to use state-of-the-art discrete element modeling technology for the simulation and analysis of particle handling and manufacturing operations. With EDEM you can quickly and easily create a parameterized model of your granular solids system. Computer-aided design (CAD) models of real particles can be imported to obtain an accurate representation of their shape. EDEM uses particle-scale behavior models to simulate bulk solids behavior. In addition to particle size and shape, the models can account for physical properties of particles along with interaction between particles and with equipment surfaces and surrounding media, as needed to define the physics of a particular process.
NASA Astrophysics Data System (ADS)
Lv, Hongkui; He, Huihai; Sheng, Xiangdong; Liu, Jia; Chen, Songzhan; Liu, Ye; Hou, Chao; Zhao, Jing; Zhang, Zhongquan; Wu, Sha; Wang, Yaping; Lhaaso Collaboration
2018-07-01
In the Large High Altitude Air Shower Observatory (LHAASO), the one square kilometer array (KM2A), with 5242 electromagnetic particle detectors (EDs) and 1171 muon detectors (MDs), is designed to study ultra-high-energy gamma-ray astronomy and cosmic ray physics. The remote site and the large number of detectors demand a robust and automatic calibration procedure. In this paper, a self-calibration method which relies on the measurement of charged particles within extensive air showers is proposed. The method is fully validated by Monte Carlo simulation and successfully applied in a KM2A prototype array experiment. Experimental results show that the self-calibration method can be used to determine the detector time-offset constants at the sub-nanosecond level and the number density of particles collected by each ED with an accuracy of a few percent, which is adequate to meet the physics requirements of the LHAASO experiment. This software calibration also offers an ideal method for real-time monitoring of detector performance in next-generation ground-based EAS experiments covering areas above the square-kilometer scale.
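The core of such a self-calibration can be pictured as iterating between fitting a plane shower front to the corrected hit times and absorbing each detector's mean residual into its time-offset constant. The toy below illustrates only that idea with simulated plane fronts; it is not the KM2A procedure, and the detector layout, timing jitter, and shower directions are invented.

```python
# Toy time-offset self-calibration from plane shower fronts (times in ns, positions in m).
import numpy as np

rng = np.random.default_rng(4)
n_det, n_showers, c = 100, 2000, 0.2998                 # c in m/ns
xy = rng.uniform(-500.0, 500.0, size=(n_det, 2))        # detector positions
true_offsets = rng.normal(0.0, 2.0, n_det)              # constants we want to recover

# simulated hit times: plane front + per-detector offset + 1 ns jitter
dirs = rng.normal(size=(n_showers, 2))
dirs *= 0.3 / np.linalg.norm(dirs, axis=1)[:, None]     # direction cosines of each shower
times = xy @ dirs.T / c + true_offsets[:, None] + rng.normal(0.0, 1.0, (n_det, n_showers))

A = np.hstack([xy, np.ones((n_det, 1))])                # plane-front model: a*x + b*y + t0
offsets = np.zeros(n_det)
for _ in range(5):
    corrected = times - offsets[:, None]
    coef, *_ = np.linalg.lstsq(A, corrected, rcond=None)
    residuals = corrected - A @ coef
    offsets += residuals.mean(axis=1)                   # absorb the mean residual per detector

spread = np.std((offsets - offsets.mean()) - (true_offsets - true_offsets.mean()))
print(spread)   # recovered constants agree with the injected ones (up to an overall plane)
```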
NASA Astrophysics Data System (ADS)
Da Silva, A.; Sánchez Prieto, S.; Polo, O.; Parra Espada, P.
2013-05-01
Because of the tough robustness requirements in space software development, it is imperative to carry out verification tasks at a very early development stage to ensure that the implemented exception mechanisms work properly. All this should be done long before the real hardware is available. But even if real hardware is available, the verification of software fault-tolerance mechanisms can be difficult, since real faulty situations must be systematically and artificially brought about, which can be impossible on real hardware. To solve this problem the Alcala Space Research Group (SRG) has developed a LEON2 virtual platform (Leon2ViP) with fault injection capabilities. This way it is possible to run the exact same target binary software as runs on the physical system in a more controlled and deterministic environment, allowing stricter requirements verification. Leon2ViP enables unmanned and tightly focused fault injection campaigns, not possible otherwise, in order to expose and diagnose flaws in the software implementation early. Furthermore, the use of a virtual hardware-in-the-loop approach makes it possible to carry out preliminary integration tests with the spacecraft emulator or the sensors. The use of Leon2ViP has meant a significant improvement, in both time and cost, in the development and verification processes of the Instrument Control Unit boot software on board Solar Orbiter's Energetic Particle Detector.
AF-GEOSpace Version 2.1 Release
NASA Astrophysics Data System (ADS)
Hilmer, R. V.; Ginet, G. P.; Hall, T.; Holeman, E.; Madden, D.; Perry, K. L.; Tautz, M.; Roth, C.
2006-05-01
AF-GEOSpace Version 2.1 is a graphics-intensive software program with space environment models and applications developed recently by the Space Weather Center of Excellence at AFRL. A review of new and planned AF-GEOSpace capabilities will be given. The software covers a wide range of physical domains and addresses such topics as solar disturbance propagation, geomagnetic field and radiation belt configurations, auroral particle precipitation, and ionospheric scintillation. Building on the success of previous releases, AF-GEOSpace has become a platform for the rapid prototyping of automated operational and simulation space weather visualization products and helps with a variety of tasks, including: orbit specification for radiation hazard avoidance; satellite design assessment and post-event anomaly analysis; solar disturbance effects forecasting; determination of link outage regions for active ionospheric conditions; satellite magnetic conjugate studies; scientific model validation and comparison; physics research; and education. Previously, Version 2.0 provided a simplified graphical user interface, improved science and application modules, significantly enhanced graphical performance, common input data archive sets, and 1-D, 2-D, and 3-D visualization tools for all models. Dynamic capabilities permit multiple environments to be generated at user-specified time intervals while animation tools enable the display of satellite orbits and environment data together as a function of time. Building on the Version 2.0 software architecture, AF-GEOSpace Version 2.1 includes a host of new modules providing, for example, plasma sheet charged particle fluxes, neutral atmosphere densities, 3-D cosmic ray cutoff maps, low-altitude trapped proton belt flux specification, DMSP particle data displays, satellite magnetic field footprint mapping determination, and meteor sky maps and shower/storm fluxes with spacecraft impact probabilities. AF-GEOSpace Version 2.1 was developed for Windows XP and Linux systems. To receive a copy of the AF-GEOSpace 2.1 software, please submit requests via e-mail to the first author.
Simulation tools for particle-based reaction-diffusion dynamics in continuous space
2014-01-01
Particle-based reaction-diffusion algorithms facilitate the modeling of the diffusional motion of individual molecules and the reactions between them in cellular environments. A physically realistic model, depending on the system at hand and the questions asked, would require different levels of modeling detail such as particle diffusion, geometrical confinement, particle volume exclusion or particle-particle interaction potentials. Higher levels of detail usually correspond to an increased number of parameters and higher computational cost. Certain systems, however, require these investments to be modeled adequately. Here we present a review of the current field of particle-based reaction-diffusion software packages operating on continuous space. Four nested levels of modeling detail are identified, each capturing an increasing amount of detail. Their applicability to different biological questions is discussed, ranging from straight diffusion simulations to sophisticated and expensive models that bridge towards coarse-grained molecular dynamics. PMID:25737778
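The first level of detail the review describes, free Brownian diffusion of point particles with a bimolecular reaction on encounter, can be written down in a handful of lines; the sketch below is an illustration of that level only (no volume exclusion or interaction potentials) and does not correspond to any specific package.

```python
# Brownian dynamics with an A + B -> C reaction whenever an A-B pair comes within a radius.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(5)
D, dt, radius, steps = 1.0, 1e-3, 0.05, 500       # diffusion constant, time step, reaction radius
A = rng.uniform(0.0, 1.0, size=(200, 3))          # positions of A molecules in a unit box
B = rng.uniform(0.0, 1.0, size=(200, 3))
n_C = 0

for _ in range(steps):
    A += np.sqrt(2 * D * dt) * rng.standard_normal(A.shape)   # diffusive displacement
    B += np.sqrt(2 * D * dt) * rng.standard_normal(B.shape)
    A %= 1.0
    B %= 1.0                                                    # periodic box
    pairs = cKDTree(A).query_ball_tree(cKDTree(B), r=radius)
    used_b, used_a = set(), []
    for ia, candidates in enumerate(pairs):                     # each molecule reacts at most once
        for ib in candidates:
            if ib not in used_b:
                used_b.add(ib)
                used_a.append(ia)
                n_C += 1
                break
    A = np.delete(A, used_a, axis=0)
    B = np.delete(B, list(used_b), axis=0)

print("C molecules produced:", n_C)
```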
Elementary Particle Physics Experiment at the University of Massachusetts, Amherst
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brau, Benjamin; Dallapiccola, Carlo; Willocq, Stephane
2013-07-30
In this progress report we summarize the activities of the University of Massachusetts-Amherst group for the three years of this research project. We are fully engaged in research at the energy frontier with the ATLAS experiment at the CERN Large Hadron Collider. We have made leading contributions in software development and performance studies for the ATLAS Muon Spectrometer, as well as on physics analysis with an emphasis on Standard Model measurements and searches for physics beyond the Standard Model. In addition, we have increased our contributions to the Muon Spectrometer New Small Wheel upgrade project.
A Browser-Based Multi-User Working Environment for Physicists
NASA Astrophysics Data System (ADS)
Erdmann, M.; Fischer, R.; Glaser, C.; Klingebiel, D.; Komm, M.; Müller, G.; Rieger, M.; Steggemann, J.; Urban, M.; Winchen, T.
2014-06-01
Many programs in experimental particle physics do not yet have a graphical interface, or impose demanding platform and software requirements. With the most recent development of the VISPA project, we provide graphical interfaces to existing software programs and access to multiple computing clusters through standard web browsers. The scalable client-server system allows analyses to be performed in sizable teams, and relieves the individual physicist of installing and maintaining a software environment. The VISPA graphical interfaces are implemented in HTML and JavaScript, with extensions to the Python webserver. The webserver uses SSH and RPC to access user data, code and processes on remote sites. As example applications we present graphical interfaces for steering the reconstruction framework OFFLINE of the Pierre Auger experiment, and the analysis development toolkit PXL. The browser-based VISPA system was field-tested in biweekly homework of a third-year physics course by more than 100 students. We discuss the system deployment and the evaluation by the students.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amadio, G.; et al.
An intensive R&D and programming effort is required to accomplish the new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting the latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting particles in parallel through complex geometries, exploiting instruction-level microparallelism (SIMD and SIMT), task-level parallelism (multithreading), and high-level parallelism (MPI), leveraging both multi-core and many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics models effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests that verify the vectorized models by checking their consistency with the corresponding Geant4 models and validate them against experimental data.
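The automated consistency checks mentioned above amount to histogram comparison tests between a vectorized model and its Geant4 reference. A minimal, hypothetical version of such a check (not the GeantV validation suite itself; the binning, toy samples and choice of statistic are assumptions) could look like:

```python
import numpy as np
from scipy.stats import chi2

def two_sample_chi2(counts_a, counts_b):
    """Two-sample chi-square test for binned spectra with equal exposure.

    counts_a, counts_b: histogram contents in identical binning.
    Returns the chi-square statistic and p-value (empty bins are skipped).
    """
    counts_a = np.asarray(counts_a, dtype=float)
    counts_b = np.asarray(counts_b, dtype=float)
    mask = (counts_a + counts_b) > 0
    a, b = counts_a[mask], counts_b[mask]
    stat = np.sum((a - b) ** 2 / (a + b))
    ndof = mask.sum() - 1
    return stat, chi2.sf(stat, ndof)

# Illustrative toy: compare a "vectorized" and a "reference" sample
rng = np.random.default_rng(1)
edges = np.linspace(0.0, 10.0, 21)           # energy bins (arbitrary units)
ref = np.histogram(rng.exponential(2.0, 100000), edges)[0]
vec = np.histogram(rng.exponential(2.0, 100000), edges)[0]
print(two_sample_chi2(ref, vec))
```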
Nuclear spectroscopy with Geant4. The superheavy challenge
NASA Astrophysics Data System (ADS)
Sarmiento, Luis G.
2016-12-01
The simulation toolkit Geant4 was originally developed at CERN for high-energy physics. Over the years it has been established as a Swiss army knife not only in particle physics; it has seen an accelerated expansion towards nuclear physics and, more recently, medical imaging and γ- and ion-therapy, to mention but a handful of new applications. The range of validity of Geant4 is vast, spanning many particles, ions, materials, and physical processes, typically with several different models to choose from. Unfortunately, atomic nuclei with atomic number Z > 100 are not properly supported. This is likely due to the relative novelty of the field, its comparatively small user base, and scarce evaluated experimental data. To circumvent this situation, different workarounds have been used over the years. In this work the simulation toolkit Geant4 will be introduced with its different components, and the effort to bring the software to the heavy and superheavy region will be described.
Design Considerations of a Virtual Laboratory for Advanced X-ray Sources
NASA Astrophysics Data System (ADS)
Luginsland, J. W.; Frese, M. H.; Frese, S. D.; Watrous, J. J.; Heileman, G. L.
2004-11-01
The field of scientific computation has greatly advanced in the last few years, resulting in the ability to perform complex computer simulations that can predict the performance of real-world experiments in a number of fields of study. Among the forces driving this new computational capability is the advent of parallel algorithms, allowing calculations in three-dimensional space over realistic time scales. Electromagnetic radiation sources driven by high-voltage, high-current electron beams offer an area to further push the state of the art in high-fidelity, first-principles simulation tools. The physics of these x-ray sources combines kinetic plasma physics (electron beams) with dense fluid-like plasma physics (anode plasmas) and x-ray generation (bremsstrahlung). There are a number of mature techniques and software packages for dealing with the individual aspects of these sources, such as Particle-In-Cell (PIC), Magneto-Hydrodynamics (MHD), and radiation transport codes. The current effort is focused on developing an object-oriented software environment, using the Rational Unified Process and the Unified Modeling Language (UML), to provide a framework where multiple 3D parallel physics packages, such as a PIC code (ICEPIC), an MHD code (MACH), and an x-ray transport code (ITS), can co-exist in a system-of-systems approach to modeling advanced x-ray sources. Initial software design and assessments of the fidelity of the various physics algorithms will be presented.
2016 Research Outreach Program report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Hye Young; Kim, Yangkyu
2016-10-13
This paper is the research activity report for four weeks at LANL. Under the guidance of Dr. Lee, who performs nuclear physics research at LANSCE, LANL, I studied the Low Energy NZ (LENZ) setup and how to use it. First, I studied the LENZ chamber and Si detectors and worked on detector calibrations using the computer software ROOT (a CERN-developed data analysis tool) and Excel (Microsoft Office software). I also performed calibration experiments measuring alpha particles emitted from a Th-229 source with an S1-type Si detector. Together with Dr. Lee, we checked the results.
MULTINEST: an efficient and robust Bayesian inference tool for cosmology and particle physics
NASA Astrophysics Data System (ADS)
Feroz, F.; Hobson, M. P.; Bridges, M.
2009-10-01
We present further development and the first public release of our multimodal nested sampling algorithm, called MULTINEST. This Bayesian inference tool calculates the evidence, with an associated error estimate, and produces posterior samples from distributions that may contain multiple modes and pronounced (curving) degeneracies in high dimensions. The developments presented here lead to further substantial improvements in sampling efficiency and robustness, as compared to the original algorithm presented in Feroz & Hobson, which itself significantly outperformed existing Markov chain Monte Carlo techniques in a wide range of astrophysical inference problems. The accuracy and economy of the MULTINEST algorithm are demonstrated by application to two toy problems and to a cosmological inference problem focusing on the extension of the vanilla Λ cold dark matter model to include spatial curvature and a varying equation of state for dark energy. The MULTINEST software, which is fully parallelized using MPI and includes an interface to COSMOMC, is available at http://www.mrao.cam.ac.uk/software/multinest/. It will also be released as part of the SUPERBAYES package, for the analysis of supersymmetric theories of particle physics, at http://www.superbayes.org.
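As a rough illustration of the nested sampling idea underlying MULTINEST, the log-evidence of a one-dimensional toy problem can be accumulated by repeatedly replacing the worst live point. The sketch below uses simple rejection sampling from the prior instead of MULTINEST's multimodal ellipsoidal sampling, and all numbers (prior range, live-point count, iteration count) are illustrative assumptions.

```python
import numpy as np
from scipy.special import logsumexp

# Toy nested sampling: standard-normal likelihood, uniform prior on [-5, 5].
rng = np.random.default_rng(42)

def log_like(x):
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

n_live, n_iter = 500, 3000
live = rng.uniform(-5, 5, n_live)
live_logL = log_like(live)
logZ = -np.inf                                     # log-evidence accumulator
log_width = np.log(1.0 - np.exp(-1.0 / n_live))    # width of the first prior-volume shell

for i in range(n_iter):
    worst = np.argmin(live_logL)
    logZ = np.logaddexp(logZ, log_width + live_logL[worst])
    # Replace the worst live point by a prior draw with higher likelihood
    # (plain rejection sampling; MULTINEST uses ellipsoidal sampling here)
    while True:
        x_new = rng.uniform(-5, 5)
        if log_like(x_new) > live_logL[worst]:
            break
    live[worst], live_logL[worst] = x_new, log_like(x_new)
    log_width -= 1.0 / n_live                      # prior volume shrinks by e^(-1/N)

# Contribution of the remaining live points
logZ = np.logaddexp(logZ, -n_iter / n_live + logsumexp(live_logL) - np.log(n_live))
print("log-evidence:", logZ, " (analytic ~", np.log(0.1), ")")
```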
Charged particle tracking through electrostatic wire meshes using the finite element method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Devlin, L. J.; Karamyshev, O.; Welsch, C. P., E-mail: carsten.welsch@cockcroft.ac.uk
Wire meshes are used across many disciplines to accelerate and focus charged particles; however, analytical solutions are non-exact and few codes exist which simulate the exact fields around a mesh of physical size. A tracking code based in MATLAB/Simulink, using field maps generated with finite element software, has been developed which tracks electrons or ions through electrostatic wire meshes. The fields around such a geometry can be presented as an analytical expression using several basic assumptions; however, it is apparent that computational calculations are required to obtain realistic values of electric potential and fields, particularly when multiple wire meshes are deployed. The tracking code is flexible in that any quantitatively describable particle distribution can be used for both electrons and ions, and it offers other benefits such as ease of export to other programs for analysis. The code is made freely available and physical examples are highlighted where it could be beneficial for different applications.
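The two essential ingredients of such a tracker, interpolation of a precomputed field map and time-stepping of the equation of motion, can be sketched independently of the published MATLAB/Simulink code. The Python fragment below is a hypothetical, much simplified analogue: the field map is a synthetic placeholder rather than a finite-element export, the particle is a non-relativistic electron, and a plain explicit integrator stands in for whatever scheme the actual code uses.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical 2-D electrostatic field map on a uniform grid (x, y in mm,
# E in V/m); in practice this would be exported from the FE solver.
x = np.linspace(-5, 5, 201)
y = np.linspace(-5, 5, 201)
X, Y = np.meshgrid(x, y, indexing="ij")
Ex = 1e4 * np.sin(np.pi * X / 5.0)        # placeholder field components
Ey = 1e4 * np.cos(np.pi * Y / 5.0)
interp_Ex = RegularGridInterpolator((x, y), Ex)
interp_Ey = RegularGridInterpolator((x, y), Ey)

q_over_m = -1.758820e11        # electron charge-to-mass ratio (C/kg)
dt = 1e-12                     # time step (s)
r = np.array([0.0, -4.0])      # position (mm)
v = np.array([0.0, 1e6])       # velocity (m/s)

trajectory = [r.copy()]
for _ in range(5000):
    E = np.array([interp_Ex([r])[0], interp_Ey([r])[0]])   # field at position
    v = v + q_over_m * E * dt                              # velocity kick
    r = r + v * dt * 1e3                                   # drift (m -> mm)
    if np.any(np.abs(r) > 5.0):                            # left the field map
        break
    trajectory.append(r.copy())

print(f"Tracked {len(trajectory)} steps, final position {trajectory[-1]} mm")
```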
NASA Astrophysics Data System (ADS)
Hawkins, Donovan Lee
In this thesis I present a software framework for use on the ATLAS muon CSC readout driver. This C++ framework uses plug-in Decoders incorporating hand-optimized assembly language routines to perform sparsification and data formatting. The software is designed with both flexibility and performance in mind, and runs on a custom 9U VME board using Texas Instruments TMS320C6203 digital signal processors. I describe the requirements of the software, the methods used in its design, and the results of testing the software with simulated data. I also present modifications to a chi-squared analysis of the Standard Model and Four Down Quark Model (FDQM) originally done by Dr. Dennis Silverman. The addition of four new experiments to the analysis has little effect on the Standard Model but provides important new restrictions on the FDQM. The method used to incorporate these new experiments is presented, and the consequences of their addition are reviewed.
Reconstruction software of the silicon tracker of DAMPE mission
NASA Astrophysics Data System (ADS)
Tykhonov, A.; Gallo, V.; Wu, X.; Zimmer, S.
2017-10-01
DAMPE is a satellite-borne experiment aimed at probing astroparticle physics in the GeV-TeV energy range. The Silicon tracker (STK) is one of the key components of DAMPE, allowing the reconstruction of the trajectories (tracks) of detected particles. The non-negligible amount of material in the tracker poses a challenge to track reconstruction and alignment. In this paper we describe methods to address this challenge. We present the track reconstruction algorithm and give insight into the alignment algorithm. We also present our CAD-to-GDML converter, an in-house tool for implementing the detector geometry in the software from the CAD drawings of the detector.
NASA Astrophysics Data System (ADS)
McMahon, Allison; Sauncy, Toni
2008-10-01
Light manipulation is a very powerful tool in physics, biology, and chemistry. Several physical principles underlie the apparatus known as ``optical tweezers,'' the term given to using focused light to manipulate and control small objects. By carefully controlling the orientation and position of a focused laser beam, dielectric particles can be effectively trapped and manipulated. We have designed a cost-efficient and effective undergraduate optical tweezers apparatus using standard ``off the shelf'' components and starting with a standard undergraduate laboratory microscope. Images are recorded using a small CCD camera interfaced to a computer and controlled by LabVIEW™ software. By using wave plates to produce circularly polarized light, rotational motion can be induced in small particles of birefringent materials such as calcite and mica.
Allison, J.; Amako, K.; Apostolakis, J.; ...
2016-07-01
Geant4 is a software toolkit for the simulation of the passage of particles through matter. It is used by a large number of experiments and projects in a variety of application domains, including high energy physics, astrophysics and space science, medical physics and radiation protection. Over the past several years, major changes have been made to the toolkit in order to accommodate the needs of these user communities, and to efficiently exploit the growth of computing power made available by advances in technology. The adaptation of Geant4 to multithreading, advances in physics, detector modeling and visualization, extensions to the toolkit including biasing and reverse Monte Carlo, and tools for physics and release validation are discussed here.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fields, Laura; Genser, Krzysztof; Hatcher, Robert
Geant4 is the leading detector simulation toolkit used in high energy physics to design detectors and to optimize calibration and reconstruction software. It employs a set of carefully validated physics models to simulate interactions of particles with matter across a wide range of interaction energies. These models, especially the hadronic ones, rely largely on directly measured cross-sections and phenomenological predictions with physically motivated parameters estimated by theoretical calculation or measurement. Because these models are tuned to cover a very wide range of possible simulation tasks, they may not always be optimized for a given process or a given material. This raises several critical questions, e.g. how sensitive Geant4 predictions are to variations of the model parameters, what uncertainties are associated with a particular tune of a Geant4 physics model or group of models, or how to consistently derive guidance for Geant4 model development and improvement from the wide range of available experimental data. We have designed and implemented a comprehensive, modular, user-friendly software toolkit to study and address such questions. It allows one to easily modify parameters of one or several Geant4 physics models involved in the simulation, and to perform a collective analysis of multiple variants of the resulting physics observables of interest, with comparison against a variety of corresponding experimental data. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. a flexible run-time configurable workflow, comprehensive bookkeeping, and an easily expandable collection of analytical components. The design, implementation technology, and key functionalities of the toolkit are presented and illustrated with results obtained with key Geant4 hadronic models.
Developments in the ATLAS Tracking Software ahead of LHC Run 2
NASA Astrophysics Data System (ADS)
Styles, Nicholas; Bellomo, Massimiliano; Salzburger, Andreas; ATLAS Collaboration
2015-05-01
After a hugely successful first run, the Large Hadron Collider (LHC) is currently in a shut-down period, during which essential maintenance and upgrades are being performed on the accelerator. The ATLAS experiment, one of the four large LHC experiments, has also used this period for consolidation and further development of the detector and of its software framework, ahead of the new challenges that will be brought by the increased centre-of-mass energy and instantaneous luminosity in the next run period. This is of particular relevance for the ATLAS Tracking software, responsible for reconstructing the trajectories of charged particles through the detector, which faces a steep increase in CPU consumption due to the additional combinatorics of the high-multiplicity environment. The steps taken to mitigate this increase and stay within the available computing resources, while maintaining the excellent performance of the tracking software in terms of the information provided to physics analyses, will be presented. Particular focus will be given to changes to the Event Data Model, the replacement of the maths library, and the adoption of a new persistent output format. The resulting CPU profiling results will be discussed, as well as the performance of the algorithms for physics processes under the expected conditions for the next LHC run.
Tri-track: free software for large-scale particle tracking.
Vallotton, Pascal; Olivier, Sandra
2013-04-01
The ability to correctly track objects in time-lapse sequences is important in many applications of microscopy. Individual object motions typically display a level of dynamic regularity reflecting the existence of an underlying physics or biology. Best results are obtained when this local information is exploited. Additionally, if the particle number is known to be approximately constant, a large number of tracking scenarios may be rejected on the basis that they are not compatible with a known maximum particle velocity. This represents information of a global nature, which should ideally be exploited too. Some time ago, we devised an efficient algorithm that exploited both types of information. The tracking task was reduced to a max-flow min-cost problem instance through a novel graph structure that comprised vertices representing objects from three consecutive image frames. The algorithm is explained here for the first time. A user-friendly implementation is provided, and the specific relaxation mechanism responsible for the method's effectiveness is uncovered. The software is particularly competitive for complex dynamics such as dense antiparallel flows, or in situations where object displacements are considerable. As an application, we characterize a remarkable vortex structure formed by bacteria engaged in interstitial motility.
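To make the linking problem concrete, the sketch below shows a deliberately simplified two-frame assignment solved with the Hungarian algorithm and a hard maximum-displacement cut; it is a stand-in illustration, not the three-frame max-flow min-cost graph formulation used by Tri-track, and all numbers are made up.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def link_two_frames(pts_prev, pts_next, max_disp):
    """Match detections in two consecutive frames.

    pts_prev, pts_next: (N, 2) arrays of object centroids.
    max_disp: maximum plausible displacement per frame (global constraint).
    Returns a list of (i, j) index pairs; implausible pairings are dropped.
    """
    cost = np.linalg.norm(pts_prev[:, None, :] - pts_next[None, :, :], axis=-1)
    big = 1e6
    cost = np.where(cost <= max_disp, cost**2, big)   # forbid long jumps
    rows, cols = linear_sum_assignment(cost)
    return [(i, j) for i, j in zip(rows, cols) if cost[i, j] < big]

# Toy example: ten particles drifting to the right by ~1 pixel per frame
rng = np.random.default_rng(3)
prev = rng.uniform(0, 50, size=(10, 2))
nxt = prev + np.array([1.0, 0.0]) + rng.normal(0, 0.1, size=(10, 2))
links = link_two_frames(prev, rng.permutation(nxt), max_disp=3.0)
print(len(links), "objects linked")
```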
ACTS: from ATLAS software towards a common track reconstruction software
NASA Astrophysics Data System (ADS)
Gumpert, C.; Salzburger, A.; Kiehn, M.; Hrdinka, J.; Calace, N.; ATLAS Collaboration
2017-10-01
Reconstruction of charged particles’ trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic, framework- and experiment-independent software package. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The software is developed with special emphasis on thread-safety to support parallel execution of the code and data structures are optimised for vectorisation to speed up linear algebra operations. The implementation is agnostic to the details of the detection technologies and magnetic field configuration which makes it applicable to many different experiments.
The ATLAS Simulation Infrastructure
Aad, G.; Abbott, B.; Abdallah, J.; ...
2010-09-25
The simulation software for the ATLAS Experiment at the Large Hadron Collider is being used for large-scale production of events on the LHC Computing Grid. This simulation requires many components, from the generators that simulate particle collisions, through packages simulating the response of the various detectors and triggers. All of these components come together under the ATLAS simulation infrastructure. In this paper, that infrastructure is discussed, including the parts supporting the detector description, interfacing the event generation, and combining the GEANT4 simulation of the response of the individual detectors. Also described are the tools allowing software validation, performance testing, and the validation of the simulated output against known physics processes.
NASA Astrophysics Data System (ADS)
Kouznetsov, A.; Cully, C. M.; Knudsen, D. J.
2016-12-01
Changes in D-region ionization caused by energetic particle precipitation are monitored by the Array for Broadband Observations of VLF/ELF Emissions (ABOVE), a network of receivers deployed across Western Canada. The observed amplitudes and phases of subionospherically propagating VLF signals from distant artificial transmitters depend sensitively on the free electron population created by the precipitation of energetic charged particles. These include both primary (electrons, protons and heavier ions) and secondary (cascades of ionized particles and electromagnetic radiation) components. We have designed and implemented a full-scale model to predict the received VLF signals based on first-principles charged particle transport calculations coupled to the Long Wavelength Propagation Capability (LWPC) software. Calculations of ionization rates and free electron densities are based on MCNP-6 (a general-purpose Monte Carlo N-Particle code), taking advantage of its capability for coupled neutron/photon/electron transport and its novel library of cross-sections for low-energy electron and photon interactions with matter. Cosmic ray calculations of background ionization are based on source spectra obtained both from PAMELA direct cosmic-ray spectrum measurements and from the recently implemented MCNP 6 galactic cosmic-ray source, scaled using our (Calgary) neutron monitor measurement results. Conversion from calculated fluxes (MCNP F4 tallies) to ionization rates for low-energy electrons is based on the total ionization cross-sections for oxygen and nitrogen molecules from the National Institute of Standards and Technology. We use our model to explore the complexity of the physical processes affecting VLF propagation.
Development of Automated Image Analysis Software for Suspended Marine Particle Classification
2002-09-30
Development of Automated Image Analysis Software for Suspended Marine Particle Classification. Scott Samson, Center for Ocean Technology. The project's objective is to develop automated image analysis software to reduce the effort and time required for manual identification of plankton images.
A simple optical tweezers for trapping polystyrene particles
NASA Astrophysics Data System (ADS)
Shiddiq, Minarni; Nasir, Zulfa; Yogasari, Dwiyana
2013-09-01
Optical tweezers are an optical trap. For decades, they have served as an optical tool that can trap and manipulate particles ranging from very small objects such as DNA to larger ones such as bacteria. The trapping force comes from the radiation pressure of laser light focused onto a group of particles. Optical tweezers have been used in many research areas such as atomic physics, medical physics, biophysics, and chemistry. Here, a simple optical tweezers setup has been constructed using a modified Leybold laboratory optical microscope. The ocular lens of the microscope has been removed for laser light and digital camera access. Light from a Coherent diode laser with wavelength λ = 830 nm and power 50 mW is sent through an oil-immersion objective lens with magnification 100× and NA 1.25 to a cell made from microscope slides containing polystyrene particles. Polystyrene particles with sizes of 3 μm and 10 μm are used. A CMOS Thorlabs camera type DCC1545M with USB interface and a Thorlabs 35 mm camera lens are connected to a desktop computer and used to monitor the trapping and measure the stiffness of the trap. The camera is accompanied by software that enables the user to capture and save images. The images are analyzed using ImageJ and a Scion macro. The polystyrene particles have been trapped successfully. The stiffness of the trap depends on the size of the particles and the power of the laser: it increases linearly with power and decreases as the particle size increases.
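One standard way to turn tracked bead positions into a trap stiffness is the equipartition relation k = k_B T / ⟨x²⟩. The sketch below assumes a hypothetical array of tracked positions in metres; it is a generic illustration rather than the ImageJ/Scion analysis used in this work.

```python
import numpy as np

def trap_stiffness(x_positions_m, temperature_K=298.0):
    """Estimate optical-trap stiffness along one axis via equipartition.

    x_positions_m: 1-D array of tracked bead positions (metres); the mean
    is subtracted before computing the variance.  Returns stiffness in N/m.
    """
    k_B = 1.380649e-23                       # Boltzmann constant (J/K)
    x = np.asarray(x_positions_m, dtype=float)
    var = np.var(x - x.mean())
    return k_B * temperature_K / var

# Toy data: a bead jittering with ~20 nm rms displacement in the trap
rng = np.random.default_rng(7)
x = rng.normal(0.0, 20e-9, size=5000)
print(f"stiffness ~ {trap_stiffness(x) * 1e6:.2f} pN/um")   # 1 N/m = 1e6 pN/um
```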
NASA Astrophysics Data System (ADS)
Hilmer, R. V.; Ginet, G. P.; Hall, T.; Holeman, E.; Madden, D.; Tautz, M.; Roth, C.
2004-05-01
AF-GEOSpace is a graphics-intensive software program with space environment models and applications developed and distributed by the Space Weather Center of Excellence at AFRL. A review of current (Version 2.0) and planned (Version 2.1) AF-GEOSpace capabilities will be given. A wide range of physical domains is represented enabling the software to address such things as solar disturbance propagation, radiation belt configuration, and ionospheric auroral particle precipitation and scintillation. The software is currently being used to aid with the design, operation, and simulation of a wide variety of communications, navigation, and surveillance systems. Building on the success of previous releases, AF-GEOSpace has become a platform for the rapid prototyping of automated operational and simulation space weather visualization products and helps with a variety of tasks, including: orbit specification for radiation hazard avoidance; satellite design assessment and post-event anomaly analysis; solar disturbance effects forecasting; frequency and antenna management for radar and HF communications; determination of link outage regions for active ionospheric conditions; scientific model validation and comparison, physics research, and education. Version 2.0 provided a simplified graphical user interface, improved science and application modules, and significantly enhanced graphical performance. Common input data archive sets, application modules, and 1-D, 2-D, and 3-D visualization tools are provided to all models. Dynamic capabilities permit multiple environments to be generated at user-specified time intervals while animation tools enable displays such as satellite orbits and environment data together as a function of time. Building on the existing Version 2.0 software architecture, AF-GEOSpace Version 2.1 is currently under development and will include a host of new modules to provide, for example, geosynchronous charged particle fluxes, neutral atmosphere densities, cosmic ray cutoff maps, low-altitude trapped proton belt specification, and meteor shower/storm fluxes with spacecraft impact probabilities. AF-GEOSpace Version 2.1 is being developed for Windows NT/2000/XP and Linux systems.
The Australian Computational Earth Systems Simulator
NASA Astrophysics Data System (ADS)
Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.
2001-12-01
Numerical simulation of the physics and dynamics of the entire earth system offers an outstanding opportunity for advancing earth system science and technology, but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government, together with a consortium of universities and research institutions, has funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator, or computational virtual earth, will provide the Australian earth systems science community with the research infrastructure required for simulations of dynamical earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five-year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete element/lattice solid models, the particle-in-cell large-deformation finite-element method, stress reconstruction models, multi-scale continuum models, etc.) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies. When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global-scale dynamics and mineralisation processes, crustal-scale processes including plate tectonics, mountain building and interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic behaviour of earth systems. ACcESS represents a part of Australia's contribution to the APEC Cooperation for Earthquake Simulation (ACES) international initiative. Together with other national earth systems science initiatives, including the Japanese Earth Simulator and the US General Earthquake Model projects, ACcESS aims to provide a driver for scientific advancement and technological breakthroughs, including: quantum leaps in understanding of earth evolution at global, crustal, regional and microscopic scales; new knowledge of the physics of crustal fault systems required to underpin the grand challenge of earthquake prediction; and new understanding and predictive capabilities of geological processes such as tectonics and mineralisation.
Continuous Energy Photon Transport Implementation in MCATK
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Terry R.; Trahan, Travis John; Sweezy, Jeremy Ed
2016-10-31
The Monte Carlo Application ToolKit (MCATK) code development team has implemented Monte Carlo photon transport into the MCATK software suite. The current particle transport capabilities in MCATK, which process the tracking and collision physics, have been extended to enable tracking of photons using the same continuous energy approximation. We describe the four photoatomic processes implemented, which are coherent scattering, incoherent scattering, pair-production, and photoelectric absorption. The accompanying background, implementation, and verification of these processes will be presented.
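At each photon collision, a continuous-energy Monte Carlo code of this kind selects one of the competing photoatomic processes in proportion to its cross-section at the current energy. The fragment below is a generic illustration of that selection step; the cross-section values are placeholders, not MCATK data, and the function is not part of the MCATK API.

```python
import numpy as np

def sample_photon_interaction(cross_sections, rng):
    """Pick one photoatomic process in proportion to its cross-section.

    cross_sections: dict mapping process name -> macroscopic cross-section
    at the current photon energy (placeholder values, arbitrary units).
    """
    names = list(cross_sections)
    sigma = np.array([cross_sections[n] for n in names])
    cdf = np.cumsum(sigma) / sigma.sum()          # discrete inverse CDF
    return names[np.searchsorted(cdf, rng.random())]

rng = np.random.default_rng(0)
xs = {                     # illustrative numbers only
    "coherent": 0.05,
    "incoherent": 0.60,
    "photoelectric": 0.30,
    "pair_production": 0.05,
}
counts = {name: 0 for name in xs}
for _ in range(10000):
    counts[sample_photon_interaction(xs, rng)] += 1
print(counts)   # frequencies should roughly reproduce the cross-section ratios
```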
Development of Automated Image Analysis Software for Suspended Marine Particle Classification
2003-09-30
Development of Automated Image Analysis Software for Suspended Marine Particle Classification. Scott Samson, Center for Ocean Technology. The project's objective is to develop automated image analysis software to reduce the effort and time required for manual identification of plankton images.
Automated Diatom Analysis Applied to Traditional Light Microscopy: A Proof-of-Concept Study
NASA Astrophysics Data System (ADS)
Little, Z. H. L.; Bishop, I.; Spaulding, S. A.; Nelson, H.; Mahoney, C.
2017-12-01
Diatom identification and enumeration by high-resolution light microscopy is required for many areas of research and water quality assessment. Such analyses, however, are both expertise- and labor-intensive. These challenges motivate the need for an automated process to efficiently and accurately identify and enumerate diatoms. Improvements in particle analysis software have increased the likelihood that diatom enumeration can be automated. VisualSpreadsheet software provides a possible solution for automated particle analysis of high-resolution light microscope diatom images. We applied the software, independent of its complementary FlowCam hardware, to automated analysis of light microscope images containing diatoms. Through numerous trials, we arrived at threshold settings that correctly segmented 67% of the total possible diatom valves and fragments from broad fields of view (183 light microscope images were examined containing 255 diatom particles; of these, 216 diatom valves and fragments of valves were processed, with 170 properly analyzed and focused upon by the software). Manual analysis of the images yielded 255 particles in 400 seconds, whereas the software yielded a total of 216 particles in 68 seconds, highlighting that the software has an approximately five-fold advantage in particle analysis time. As in past efforts, incomplete or incorrect recognition was found for images with multiple valves in contact or valves with little contrast. The software has the potential to be an effective tool in assisting taxonomists with diatom enumeration by completing a large portion of the analyses. Benefits and limitations of the approach are presented to allow for development of future work in image analysis and automated enumeration of traditional light microscope images containing diatoms.
Data Acquisition and Environmental Monitoring of the MAJORANA DEMONSTRATOR
NASA Astrophysics Data System (ADS)
Meijer, Samuel; Majorana Collaboration
2015-04-01
Low-background non-accelerator experiments have unique requirements for their data acquisition and environmental monitoring. Background signals can easily overwhelm the signals of interest, so events which could contribute to the background must be identified. There is a need to correlate events between detectors and environmental conditions, and data integrity must be maintained. Here, we describe several of the software and hardware techniques achieved by the MAJORANA Collaboration for the MAJORANA DEMONSTRATOR, such as using the Object-oriented Realtime Control and Acquisition (ORCA) software package. This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Nuclear Physics, the Particle Astrophysics Program of the National Science Foundation, and the Sanford Underground Research Facility.
Lidar stand-alone retrieval of atmospheric aerosol microphysical properties during SLOPE
NASA Astrophysics Data System (ADS)
Ortiz-Amezcua, Pablo; Samaras, Stefanos; Böckmann, Christine; Antonio Benavent-Oltra, Jose; Luis Guerrero-Rascado, Juan; Román, Roberto; Alados-Arboledas, Lucas
2018-04-01
Two cases from the SLOPE campaign in Granada are analyzed in terms of particle microphysical properties using novel software developed at Potsdam University. Multiwavelength Raman lidar measurements of particle extinction and backscatter coefficients, as well as linear particle depolarization ratios, are used as input for the software. The result of the retrieval is a 2-dimensional particle volume distribution as a function of radius and aspect ratio, from which the particle microphysical properties are obtained.
Statistical analysis of magnetically soft particles in magnetorheological elastomers
NASA Astrophysics Data System (ADS)
Gundermann, T.; Cremer, P.; Löwen, H.; Menzel, A. M.; Odenbach, S.
2017-04-01
The physical properties of magnetorheological elastomers (MRE) are a complex issue and can be influenced and controlled in many ways, e.g. by applying a magnetic field, by external mechanical stimuli, or by an electric potential. In general, the response of MRE materials to these stimuli crucially depends on the distribution of the magnetic particles inside the elastomer. Specific knowledge of the interactions between particles or particle clusters is of high relevance for understanding the macroscopic rheological properties and provides an important input for theoretical calculations. In order to gain better insight into the correlation between the macroscopic effects and the microstructure, and to generate a database for theoretical analysis, x-ray micro-computed tomography (X-μCT) investigations were carried out as the basis for a statistical analysis of the particle configurations. Different MREs with quantities of 2-15 wt% (0.27-2.3 vol%) of iron powder and different distributions of the particles inside the matrix were prepared. The X-μCT results were processed with image analysis software to extract the geometrical properties of the particles, with and without the influence of an external magnetic field. Pair correlation functions for the positions of the particles inside the elastomer were calculated to statistically characterize the distributions of the particles in the samples.
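The pair correlation function g(r) used in this kind of analysis measures how the probability of finding a second particle at distance r deviates from that of an ideal gas. A minimal numpy sketch is given below; it assumes periodic boundaries to avoid edge corrections and uses made-up coordinates, so it illustrates the estimator rather than the published X-μCT analysis.

```python
import numpy as np

def pair_correlation(positions, box, r_max, n_bins=100):
    """Radial pair correlation g(r) for particles in a periodic cubic box.

    positions: (N, 3) particle centre coordinates; box: edge length.
    The periodic-boundary assumption is a simplification for this sketch.
    """
    N = len(positions)
    rho = N / box**3
    d = positions[:, None, :] - positions[None, :, :]
    d -= box * np.round(d / box)                    # minimum-image convention
    dist = np.linalg.norm(d, axis=-1)[np.triu_indices(N, k=1)]
    hist, edges = np.histogram(dist, bins=n_bins, range=(0, r_max))
    r = 0.5 * (edges[1:] + edges[:-1])
    shell_vol = 4.0 * np.pi * r**2 * np.diff(edges)
    # Each of the N*(N-1)/2 unordered pairs was counted once
    g = hist / (shell_vol * rho * N / 2.0)
    return r, g

rng = np.random.default_rng(11)
pts = rng.uniform(0, 100.0, size=(500, 3))          # toy "particle" centres
r, g = pair_correlation(pts, box=100.0, r_max=30.0)
print(np.round(g[:5], 2))                            # ~1 everywhere for an ideal gas
```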
ME(SSY)**2: Monte Carlo Code for Star Cluster Simulations
NASA Astrophysics Data System (ADS)
Freitag, Marc Dewi
2013-02-01
ME(SSY)**2 stands for “Monte-carlo Experiments with Spherically SYmmetric Stellar SYstems.” This code simulates the long-term evolution of spherical clusters of stars; it was devised specifically to treat dense galactic nuclei. It is based on the pioneering Monte Carlo scheme proposed by Hénon in the 1970s and includes all relevant physical ingredients (two-body relaxation, stellar mass spectrum, collisions, tidal disruption, etc.). It is basically a Monte Carlo resolution of the Fokker-Planck equation. It can cope with any stellar mass spectrum or velocity distribution. Being a particle-based method, it also allows one to take stellar collisions into account in a very realistic way. This unique code, featuring most of the important physical processes, allows million-particle simulations spanning a Hubble time in a few CPU days on standard personal computers, and provides a wealth of data rivaled only by N-body simulations. The current version of the software requires the use of routines from the "Numerical Recipes in Fortran 77" (http://www.nrbook.com/a/bookfpdf.php).
NASA Astrophysics Data System (ADS)
Torres, Hilario; Iaccarino, Gianluca
2017-11-01
Soleil-X is a multi-physics solver being developed at Stanford University as a part of the Predictive Science Academic Alliance Program II. Our goal is to conduct high-fidelity simulations of particle-laden turbulent flows in a radiation environment for solar energy receiver applications, as well as to demonstrate our readiness to effectively utilize next-generation exascale machines. The novel aspect of Soleil-X is that it is built upon the Legion runtime system to enable easy portability to different parallel distributed heterogeneous architectures, while also being written entirely in high-level/high-productivity languages (Ebb and Regent). An overview of the Soleil-X software architecture will be given. Results from coupled fluid flow, Lagrangian point-particle tracking, and thermal radiation simulations will be presented. Performance diagnostic tools and metrics corresponding to the same cases will also be discussed. US Department of Energy, National Nuclear Security Administration.
The CosmicWatch Desktop Muon Detector: a self-contained, pocket sized particle detector
NASA Astrophysics Data System (ADS)
Axani, S. N.; Frankiewicz, K.; Conrad, J. M.
2018-03-01
The CosmicWatch Desktop Muon Detector is a self-contained, hand-held cosmic ray muon detector that is valuable for astro/particle physics research applications and outreach. The material cost of each detector is under $100, and it takes a novice student approximately four hours to build their first detector. The detectors are powered via a USB connection, and the data can be recorded either directly to a computer or to a microSD card. Arduino- and Python-based software is provided to operate the detector, together with an online application to plot the data in real time. In this paper, we describe the various design features, evaluate the performance, and illustrate the detector's capabilities by providing several example measurements.
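A data-logging client along these lines can be little more than a serial read loop. The snippet below is a hypothetical illustration using pyserial; the port name, baud rate, logging duration and line format are assumptions, and the official CosmicWatch software should be consulted for the actual protocol.

```python
import time
import serial          # pyserial

# Hypothetical settings; the actual port and baud rate depend on the setup.
PORT = "/dev/ttyUSB0"
BAUD = 9600

with serial.Serial(PORT, BAUD, timeout=2) as det, open("muons.txt", "w") as out:
    start = time.time()
    while time.time() - start < 60:               # log for one minute
        line = det.readline().decode("ascii", errors="replace").strip()
        if line:                                  # one text line per trigger (assumed)
            out.write(f"{time.time():.3f} {line}\n")
            print(line)
```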
Software manual for operating particle displacement tracking data acquisition and reduction system
NASA Technical Reports Server (NTRS)
Wernet, Mark P.
1991-01-01
This software manual describes the necessary steps required to record, analyze, and reduce Particle Image Velocimetry (PIV) data using the Particle Displacement Tracking (PDT) technique. The new PDT system is an all-electronic technique employing a CCD video camera and a large-memory-buffer frame-grabber board to record low-velocity (less than or equal to 20 cm/s) flows. Using a simple encoding scheme, a time sequence of single-exposure images is time-coded into a single image and then processed to track particle displacements and determine 2-D velocity vectors. All of the PDT data acquisition, analysis, and data reduction software is written to run on an 80386 PC.
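The core of a displacement-tracking step, pairing each particle image with its nearest neighbour in the following exposure and converting the shift into a velocity vector, can be sketched as follows. This is a generic modern illustration (not the original 80386 PC code), and the positions, time separation and rejection threshold are invented for the example.

```python
import numpy as np
from scipy.spatial import cKDTree

def displacement_vectors(frame_a, frame_b, dt, max_disp):
    """Nearest-neighbour particle displacement tracking.

    frame_a, frame_b: (N, 2) particle centroid positions (e.g. in mm)
    from two successive exposures separated by dt seconds.
    max_disp: reject pairings farther apart than this (same units).
    Returns the matched positions and the 2-D velocity vectors.
    """
    tree = cKDTree(frame_b)
    dist, idx = tree.query(frame_a, distance_upper_bound=max_disp)
    ok = np.isfinite(dist)                     # unmatched points get dist = inf
    vel = (frame_b[idx[ok]] - frame_a[ok]) / dt
    return frame_a[ok], vel

# Toy example: a slow flow of ~10 cm/s to the right, 1 ms between exposures
rng = np.random.default_rng(5)
a = rng.uniform(0, 50, size=(200, 2))                  # positions in mm
b = a + np.array([0.1, 0.0]) + rng.normal(0, 0.01, a.shape)
pos, vel = displacement_vectors(a, b, dt=1e-3, max_disp=0.5)
print(vel.mean(axis=0), "mm/s")                        # ~[100, 0] mm/s = 10 cm/s
```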
NASA Astrophysics Data System (ADS)
Buchari, M. A.; Mardiyanto, S.; Hendradjaya, B.
2018-03-01
Finding software defects as early as possible is the purpose of research on software defect prediction. Software defect prediction is required not only to state the existence of defects, but also to provide a list of priorities indicating which modules require more intensive testing, so that test resources can be allocated efficiently. Learning to rank is one approach that can provide defect module ranking data for the purposes of software testing. In this study, we propose a meta-heuristic chaotic Gaussian particle swarm optimization to improve the accuracy of the learning-to-rank software defect prediction approach. We used 11 public benchmark data sets as experimental data. Our overall results demonstrate that the prediction models constructed using chaotic Gaussian particle swarm optimization achieve better accuracy on 5 data sets, tie on 5 data sets, and do worse on 1 data set. Thus, we conclude that the application of chaotic Gaussian particle swarm optimization in the learning-to-rank approach can improve the accuracy of the defect module ranking in data sets that have high-dimensional features.
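To make the optimizer concrete, a minimal particle swarm update with a logistic-map chaotic inertia weight and Gaussian-perturbed attraction terms is sketched below on a toy objective. This is a generic illustration of the family of chaotic Gaussian PSO variants, not the exact algorithm or the learning-to-rank objective used in the study; all coefficients are assumptions.

```python
import numpy as np

def chaotic_gaussian_pso(objective, dim, n_particles=30, n_iter=200, seed=0):
    """Minimal chaotic Gaussian PSO sketch (generic, illustrative variant).

    A logistic map drives the inertia weight chaotically, and the cognitive/
    social pulls are drawn from |N(0,1)| instead of U(0,1).
    """
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.apply_along_axis(objective, 1, x)
    gbest = pbest[np.argmin(pbest_f)].copy()
    z = 0.7                                    # logistic-map state
    for _ in range(n_iter):
        z = 4.0 * z * (1.0 - z)                # chaotic sequence in (0, 1)
        w = 0.4 + 0.5 * z                      # chaotic inertia weight
        r1 = np.abs(rng.normal(size=x.shape))  # Gaussian-perturbed pulls
        r2 = np.abs(rng.normal(size=x.shape))
        v = w * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = x + v
        f = np.apply_along_axis(objective, 1, x)
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, pbest_f.min()

sphere = lambda p: float(np.sum(p**2))         # toy objective to minimize
print(chaotic_gaussian_pso(sphere, dim=5))
```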
Automated data collection in single particle electron microscopy
Tan, Yong Zi; Cheng, Anchi; Potter, Clinton S.; Carragher, Bridget
2016-01-01
Automated data collection is an integral part of modern workflows in single particle electron microscopy (EM) research. This review surveys the software packages available for automated single particle EM data collection. The degree of automation at each stage of data collection is evaluated, and the capabilities of the software packages are described. Finally, future trends in automation are discussed. PMID:26671944
Löck, Steffen; Roth, Klaus; Skripcak, Tomas; Worbs, Mario; Helmbrecht, Stephan; Jakobi, Annika; Just, Uwe; Krause, Mechthild; Baumann, Michael; Enghardt, Wolfgang; Lühr, Armin
2015-09-01
To guarantee equal access to optimal radiotherapy, a concept of patient assignment to photon or particle radiotherapy using remote treatment plan exchange and comparison - ReCompare - was proposed. We demonstrate the implementation of this concept and present its clinical applicability. The ReCompare concept was implemented using a client-server based software solution. A clinical workflow for the remote treatment plan exchange and comparison was defined. The steps required by the user and performed by the software for a complete plan transfer were described and an additional module for dose-response modeling was added. The ReCompare software was successfully tested in cooperation with three external partner clinics and worked meeting all required specifications. It was compatible with several standard treatment planning systems, ensured patient data protection, and integrated in the clinical workflow. The ReCompare software can be applied to support non-particle radiotherapy institutions with the patient-specific treatment decision on the optimal irradiation modality by remote treatment plan exchange and comparison. Copyright © 2015. Published by Elsevier GmbH.
Simulating Coupling Complexity in Space Plasmas: First Results from a new code
NASA Astrophysics Data System (ADS)
Kryukov, I.; Zank, G. P.; Pogorelov, N. V.; Raeder, J.; Ciardo, G.; Florinski, V. A.; Heerikhuisen, J.; Li, G.; Petrini, F.; Shematovich, V. I.; Winske, D.; Shaikh, D.; Webb, G. M.; Yee, H. M.
2005-12-01
The development of codes that embrace 'coupling complexity' via the self-consistent incorporation of multiple physical scales and multiple physical processes in models has been identified by the NRC Decadal Survey in Solar and Space Physics as a crucial and necessary development in simulation/modeling technology for the coming decade. The National Science Foundation, through its Information Technology Research (ITR) Program, is supporting our efforts to develop a new class of computational code for plasmas and neutral gases that integrates multiple scales and multiple physical processes and descriptions. We are developing a highly modular, parallelized, scalable code that incorporates multiple scales by synthesizing 3 simulation technologies: 1) computational fluid dynamics (hydrodynamics or magneto-hydrodynamics, MHD) for the large-scale plasma; 2) direct Monte Carlo simulation of atoms/neutral gas; and 3) transport code solvers to model highly energetic particle distributions. We are constructing the code so that a fourth simulation technology, hybrid simulations for microscale structures and particle distributions, can be incorporated in future work, but for the present this aspect will be addressed at a test-particle level. This synthesis will provide a computational tool that will enormously advance our understanding of the physics of neutral and charged gases. Besides making major advances in basic plasma physics and neutral gas problems, this project will address 3 Grand Challenge space physics problems that reflect our research interests: 1) to develop a temporal global heliospheric model which includes the interaction of solar and interstellar plasma with neutral populations (hydrogen, helium, etc., and dust), test-particle kinetic pickup ion acceleration at the termination shock, anomalous cosmic ray production, and interaction with galactic cosmic rays, while incorporating the time variability of the solar wind and the solar cycle; 2) to develop a coronal mass ejection and interplanetary shock propagation model for the inner and outer heliosphere, including, at a test-particle level, wave-particle interactions and particle acceleration at traveling shock waves and compression regions; 3) to develop an advanced Geospace General Circulation Model (GGCM) capable of realistically modeling space weather events, in particular the interaction with CMEs and geomagnetic storms. Furthermore, by implementing scalable run-time supports and sophisticated off- and on-line prediction algorithms, we anticipate important advances in the development of automatic and intelligent system software to optimize a wide variety of 'embedded' computations on parallel computers. Finally, public domain MHD and hydrodynamic codes had a transforming effect on space physics and astrophysics. We expect that our new-generation, open source, public domain multi-scale code will have a similar transformational effect in a variety of disciplines, opening up new classes of problems to physicists and engineers alike.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1993-07-01
The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system-level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the "construction" of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is a joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc.
NASA Astrophysics Data System (ADS)
1999-09-01
Physics teacher Andrew Morrison from High Pavement College in Nottingham has recently been appointed as Schools' officer for particle physics by the Particle Physics and Astronomy Research Council, as part of the Council's Public Understanding of Science programme. As well as his role as an experienced physics teacher, Andrew has acted as marketing manager for his college and chair of the Nottinghamshire section of the Association for Science Education. He will now be working two days each week in his new role with PPARC, acting as a link between the science education and research communities, helping researchers develop ideas for promoting particle physics and leading some specific new projects for the production of schools materials. Andrew can be contacted at High Pavement Sixth Form College, Gainsford Crescent, Nottingham NG5 5HT (tel: 0115 916 6165 or e-mail: morrison@innotts.co.uk). On the other side of the Atlantic, an 18 year-old student at Atlee High School in Mechanicsville, Virginia, USA was the recipient of the `1999 Young Scientist of the Year' award. Jakob Harmon submitted a project on magnetic levitation (maglev) in this extracurricular competition organized by PhysLINK.com, a leading Internet authority on physics and engineering education. The prize was a summer placement at Virginia Polytechnic Institute, Blacksburg, where Jakob continued his education in one of the most active maglev research and development groups in the USA. He also received science books and software as part of the award. The PhysLINK.com award was established to recognize, encourage and foster talented high school students in physics and engineering, with the prize being designed to fit the specific needs and aspirations of each individual winner. Details of next year's competition, along with Jakob's project and more about magnetic levitation can be viewed at www.physlink.com or by contacting Anton Skorucak of PhysLINK.com at 11271 Ventura Blvd #299, Studio City, CA 91606, USA (fax: (1) 818 985 2466, e-mail: info@physlink.com).
Reconstruction of Cyber and Physical Software Using Novel Spread Method
NASA Astrophysics Data System (ADS)
Ma, Wubin; Deng, Su; Huang, Hongbin
2018-03-01
Cyber-physical software has attracted attention for many years, since 2010. Many researchers would question the deployment of the traditional spread method for the reconstruction of cyber-physical software, which embodies the key principles of cyber-physical system reconstruction. NSM (novel spread method), our new methodology for the reconstruction of cyber-physical software, is proposed as the solution to these challenges.
A Web-Based Development Environment for Collaborative Data Analysis
NASA Astrophysics Data System (ADS)
Erdmann, M.; Fischer, R.; Glaser, C.; Klingebiel, D.; Komm, M.; Müller, G.; Rieger, M.; Steggemann, J.; Urban, M.; Winchen, T.
2014-06-01
Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported, since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Workspaces thereby enable the management of data, computing resources (e.g. remote clusters or computing grids), and additional software, either centrally or individually. We further report on the results of an application with more than 100 third-year students using VISPA for their regular particle physics exercises during the winter term 2012/13. Besides the ambition to support and simplify the development cycle of physics analyses, new use cases such as fast, location-independent status queries, the validation of results, and the ability to share analyses within worldwide collaborations with a single click become conceivable.
NASA Astrophysics Data System (ADS)
Ryżak, Magdalena; Beczek, Michał; Mazur, Rafał; Sochan, Agata; Bieganowski, Andrzej
2017-04-01
The phenomenon of splash, which is one of the factors causing erosion of the soil surface, is the subject of research by various scientific teams. One efficient method for observing and analyzing this phenomenon is the use of high-speed cameras recording particles at 2000 frames per second or higher. Analysis of the splash phenomenon with high-speed cameras and specialized software can reveal, among other things, the number of ejected particles, their speeds, trajectories, and the distances over which they were transferred. The paper presents an attempt to evaluate the efficiency of detection of splashed particles with a set of 3 cameras (Vision Research MIRO 310) and the Dantec Dynamics Studio software, using a 3D module (Volumetric PTV). In order to assess the effectiveness of estimating the number of particles, the experiment was performed on glass beads with a diameter of 0.5 mm (corresponding to the sand fraction). Water droplets with a diameter of 4.2 mm fell on a sample from a height of 1.5 m. Two types of splashed particles were observed: particles with a short range (up to 18 mm) splashed at larger angles, and particles with a long range (up to 118 mm) splashed at smaller angles. The detection efficiency, i.e. the fraction of splashed particles identified by the software, was 45-65% for particles with a long range. The effectiveness of the detection was calculated by comparison with the number of beads that fell on the adhesive surface around the sample. This work was partly financed by the National Science Centre, Poland; project no. 2014/14/E/ST10/00851.
NASA Astrophysics Data System (ADS)
Coquelin, L.; Le Brusquet, L.; Fischer, N.; Gensdarmes, F.; Motzkus, C.; Mace, T.; Fleury, G.
2018-05-01
A scanning mobility particle sizer (SMPS) is a high-resolution nanoparticle sizing system that is widely used as the standard method to measure airborne particle size distributions (PSD) in the size range 1 nm–1 μm. This paper addresses the problem of assessing the uncertainty associated with the PSD when a differential mobility analyzer (DMA) operates in scanning mode. The sources of uncertainty are described and then modeled, either through experiments or through knowledge extracted from the literature. Special care is taken to model the physics and to account for competing theories. Indeed, it appears that the modeling errors resulting from approximations of the physics can largely affect the final estimate of this indirect measurement, especially for quantities that are not measured during day-to-day experiments. The Monte Carlo method is used to compute the uncertainty associated with the PSD. The method is tested against real data sets of monosize polystyrene latex spheres (PSL) with nominal diameters of 100 nm, 200 nm and 450 nm. The median diameters and associated standard uncertainties of the aerosol particles are estimated as 101.22 nm ± 0.18 nm, 204.39 nm ± 1.71 nm and 443.87 nm ± 1.52 nm with the new approach. Other statistical parameters, such as the mean diameter, the mode and the geometric mean, together with their associated standard uncertainties, are also computed. These results are then compared with the results obtained by the SMPS embedded software.
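The Monte Carlo approach to uncertainty evaluation amounts to repeatedly drawing the input quantities from their assigned distributions and propagating each draw through the measurement model. The fragment below is a stripped-down, hypothetical illustration with a scalar placeholder model and made-up input uncertainties; it is not the SMPS data inversion itself.

```python
import numpy as np

# Generic GUM-style Monte Carlo propagation: y = f(x1, x2, ...).
# The measurement model below is a placeholder, not the SMPS inversion.
rng = np.random.default_rng(2018)
n_trials = 100_000

def model(flow_rate, voltage, charge_fraction):
    # Hypothetical scalar model standing in for the mobility-to-diameter
    # inversion; any smooth combination will do for the illustration.
    return 100.0 * (voltage / 3000.0) ** 0.5 * (1.5 / flow_rate) * charge_fraction

# Input quantities with assumed (made-up) standard uncertainties
flow = rng.normal(1.5, 0.03, n_trials)          # sheath flow, L/min
volt = rng.normal(3000.0, 15.0, n_trials)       # DMA voltage, V
frac = rng.uniform(0.95, 1.05, n_trials)        # charge-correction factor

y = model(flow, volt, frac)
print(f"y = {np.median(y):.2f} +/- {y.std(ddof=1):.2f} (k=1)")
print("95 % coverage interval:", np.percentile(y, [2.5, 97.5]).round(2))
```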
The Particle-in-Cell and Kinetic Simulation Software Center
NASA Astrophysics Data System (ADS)
Mori, W. B.; Decyk, V. K.; Tableman, A.; Fonseca, R. A.; Tsung, F. S.; Hu, Q.; Winjum, B. J.; An, W.; Dalichaouch, T. N.; Davidson, A.; Hildebrand, L.; Joglekar, A.; May, J.; Miller, K.; Touati, M.; Xu, X. L.
2017-10-01
The UCLA Particle-in-Cell and Kinetic Simulation Software Center (PICKSC) aims to support an international community of PIC and plasma kinetic software developers, users, and educators; to increase the use of this software for accelerating the rate of scientific discovery; and to be a repository of knowledge and history for PIC. We discuss progress towards making available and documenting illustrative open-source software programs and distinct production programs; developing and comparing different PIC algorithms; coordinating the development of resources for the educational use of kinetic software; and the outcomes of our first sponsored OSIRIS users workshop. We also welcome input and discussion from anyone interested in using or developing kinetic software, in obtaining access to our codes, in collaborating, in sharing their own software, or in commenting on how PICKSC can better serve the DPP community. Supported by NSF under Grant ACI-1339893 and by the UCLA Institute for Digital Research and Education.
NASA Astrophysics Data System (ADS)
Prathap Reddy, K.
2016-11-01
An ‘electrostatic bathtub potential’ is defined and analytical expressions for the time period and amplitude of charged particles in this potential are obtained and compared with simulations. These kinds of potentials are encountered in linear electrostatic ion traps, where the potential along the axis appears like a bathtub. Ion traps are used in basic physics research and mass spectrometry to store ions; these stored ions make oscillatory motion within the confined volume of the trap. Usually these traps are designed and studied using ion optical software, but in this work the bathtub potential is reproduced by making two simple modifications to the harmonic oscillator potential. The addition of a linear ‘k_1|x|’ potential makes the simple harmonic potential curve steeper with a sharper turn at the origin, while the introduction of a finite-length zero potential region at the centre reproduces the flat region of the bathtub curve. This whole exercise of modelling a practical experimental situation in terms of a well-known simple physics problem may generate interest among readers.
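A minimal numerical check of the construction described above can be made by integrating the motion of a particle in a 'bathtub' potential built from a harmonic term, a linear k_1|x| term, and a flat central region. The parameter values below are arbitrary illustrations, not those of any real ion trap or of the paper's analytical results.

```python
import numpy as np

# Assumed parameters (illustrative only)
k, k1, a = 1.0, 0.5, 2.0      # harmonic constant, linear constant, flat half-width
m, E = 1.0, 4.0               # particle mass and total energy

def force(x):
    """Force in the bathtub potential: zero in the flat region |x| <= a,
    harmonic-plus-linear restoring force outside it."""
    if abs(x) <= a:
        return 0.0
    s = np.sign(x)
    return -s * (k * (abs(x) - a) + k1)

# Leapfrog integration starting from the centre with velocity set by the energy
dt, x, v, t = 1e-4, 0.0, np.sqrt(2 * E / m), 0.0
crossings, prev_x = [], 0.0
for _ in range(2_000_000):
    v += 0.5 * dt * force(x) / m
    x += dt * v
    v += 0.5 * dt * force(x) / m
    t += dt
    if prev_x < 0.0 <= x:          # record upward crossings of the origin
        crossings.append(t)
    prev_x = x
    if len(crossings) >= 3:
        break

period = crossings[-1] - crossings[-2]
print(f"numerical oscillation period: {period:.4f}")
```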
A Reconfigurable Instrument System for Nuclear and Particle Physics Experiments
NASA Astrophysics Data System (ADS)
Sang, Ziru; Li, Feng; Jiang, Xiao; Jin, Ge
2014-04-01
We developed a reconfigurable nuclear instrument system (RNIS) that can satisfy the requirements of diverse nuclear and particle physics experiments and of inertial confinement fusion diagnostics. Benefiting from its reconfigurable hardware structure and digital pulse processing technology, RNIS shakes off the restrictions of cumbersome crates and miscellaneous modules. It retains all the advantages of conventional nuclear instruments and is more flexible and portable. RNIS is primarily composed of a field-programmable hardware board and the relevant PC software. Separate analog channels are designed to provide different functions, such as amplifiers, ADCs, fast discriminators and Schmitt discriminators, for diverse experimental purposes. The high-performance field programmable gate array can perform high-precision time-interval measurement, histogram accumulation, counting, and coincidence/anticoincidence measurement. To illustrate the prospects of RNIS, a series of applications to experiments are described in this paper. The first, for which RNIS was originally developed, involves nuclear energy spectrum measurement with a scintillation detector and photomultiplier. The second experiment applies RNIS to a G-M tube counting experiment, and in the third, it is applied to a quantum communication experiment through reconfiguration.
Fourier-Bessel Particle-In-Cell (FBPIC) v0.1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lehe, Remi; Kirchen, Manuel; Jalas, Soeren
The Fourier-Bessel Particle-In-Cell code is a scientific simulation software for relativistic plasma physics. It is a Particle-In-Cell code whose distinctive feature is to use a spectral decomposition in cylindrical geometry. This decomposition makes it possible to combine the advantages of spectral 3D Cartesian PIC codes (high accuracy and stability) and those of finite-difference cylindrical PIC codes with azimuthal decomposition (orders-of-magnitude speedup when compared to 3D simulations). The code is built on Python and can run both on CPU and GPU (the GPU runs being typically 1 or 2 orders of magnitude faster than the corresponding CPU runs). The code has the exact same output format as the open-source PIC codes Warp and PIConGPU (openPMD format: openpmd.org) and has a very similar input format to Warp (a Python script with many similarities). There is therefore tight interoperability between Warp and FBPIC, and this interoperability will increase even more in the future.
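The radial spectral decomposition that gives FBPIC its name can be illustrated, in a strongly simplified form, by expanding an axisymmetric profile in a Fourier-Bessel series. The sketch below is generic NumPy/SciPy code with an invented Gaussian profile; it does not reproduce FBPIC's actual algorithm or API.

```python
import numpy as np
from scipy.special import jv, jn_zeros

# Radial grid and an example axisymmetric field profile f(r)
R, Nr, Nmodes = 1.0, 512, 32
r = np.linspace(0.0, R, Nr)
f = np.exp(-(r / 0.2) ** 2)              # assumed Gaussian profile

# Zeros of J0 define the radial basis functions J0(alpha_n r / R)
alpha = jn_zeros(0, Nmodes)

# Fourier-Bessel coefficients:
# c_n = 2 / (R^2 J1(alpha_n)^2) * integral of f(r) J0(alpha_n r / R) r dr
dr = r[1] - r[0]
coeffs = np.array([
    2.0 / (R**2 * jv(1, a_n) ** 2) * np.sum(f * jv(0, a_n * r / R) * r) * dr
    for a_n in alpha
])

# Reconstruct the profile from the truncated series and check the error
f_rec = sum(c * jv(0, a_n * r / R) for c, a_n in zip(coeffs, alpha))
print("max reconstruction error:", np.abs(f - f_rec).max())
```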
PET Imaging - from Physics to Clinical Molecular Imaging
NASA Astrophysics Data System (ADS)
Majewski, Stan
2008-03-01
From its beginnings many years ago in a few physics laboratories and its first applications as a research brain-function imager, PET has lately become a leading molecular imaging modality used in the diagnosis, staging and therapy monitoring of cancer, and is increasingly used in the assessment of brain function (early diagnosis of Alzheimer's disease, etc.) and of cardiac function. To provide an anatomic structure map and absorption correction, CT is often used with PET in a dual system. Growing interest in the last 5-10 years in dedicated organ-specific PET imagers (breast, prostate, brain, etc.) again presents an opportunity for the particle physics instrumentation community to contribute to the important field of medical imaging. In addition to the bulky standard ring structures, compact, economical and high-performance mobile imagers are being proposed and built. The latest development in standard PET imaging is the introduction of the well-known time-of-flight (TOF) concept, enabling clearer tomographic pictures of the patient's organs. The development and availability of novel photodetectors such as silicon photomultipliers, which are immune to magnetic fields, offer an exciting opportunity to use PET in conjunction with MRI and fMRI. As before with avalanche photodiodes, the particle physics community plays a leading role in developing these devices. The presentation will mostly focus on present and future opportunities for better PET designs based on new technologies and methods: new scintillators, photodetectors, readout, and software.
Design and Implementation of Embedded Computer Vision Systems Based on Particle Filters
2010-01-01
… methodology for hardware/software implementation of multi-dimensional particle filter applications; we explore this in the third application, which is a 3D … and hence multiprocessor implementation of particle filters is an important option to examine. A significant body of work exists on optimizing generic …
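For context, the particle filter referred to in this report can be written in a few lines in its basic (bootstrap) form. The generic sketch below uses an invented 1D constant-velocity model and assumed noise levels; it is purely illustrative and unrelated to the hardware/software co-design discussed in the report.

```python
import numpy as np

rng = np.random.default_rng(1)
n_particles, n_steps = 1000, 50

# Ground-truth 1D trajectory and noisy position measurements
truth = np.cumsum(rng.normal(1.0, 0.1, n_steps))
meas = truth + rng.normal(0.0, 0.5, n_steps)

particles = rng.normal(0.0, 1.0, n_particles)   # initial state hypotheses
weights = np.full(n_particles, 1.0 / n_particles)

for z in meas:
    # Predict: propagate each particle through the motion model plus noise
    particles += rng.normal(1.0, 0.2, n_particles)
    # Update: weight particles by the measurement likelihood
    weights = np.exp(-0.5 * ((z - particles) / 0.5) ** 2)
    weights /= weights.sum()
    # Resample: draw a new particle set proportionally to the weights
    idx = rng.choice(n_particles, size=n_particles, p=weights)
    particles = particles[idx]
    weights = np.full(n_particles, 1.0 / n_particles)

print("final estimate:", particles.mean(), "truth:", truth[-1])
```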
NASA Astrophysics Data System (ADS)
De Bonis, Giulia; Bozza, Cristiano
2017-03-01
In the framework of Horizon 2020, the European Commission approved the ASTERICS initiative (ASTronomy ESFRI and Research Infrastructure CluSter) to collect knowledge and experiences from astronomy, astrophysics and particle physics and foster synergies among existing research infrastructures and scientific communities, hence paving the way for future ones. ASTERICS aims at producing a common set of tools and strategies to be applied in Astronomy ESFRI facilities. In particular, it will target the so-called multi-messenger approach to combine information from optical and radio telescopes, photon counters and neutrino telescopes. pLISA is a software tool under development in ASTERICS to help and promote machine learning as a unified approach to multivariate analysis of astrophysical data and signals. The library will offer a collection of classification parameters, estimators, classes and methods to be linked and used in reconstruction programs (and possibly also extended), to characterize events in terms of particle identification and energy. The pLISA library aims at offering the software infrastructure for applications developed inside different experiments and has been designed with an effort to extrapolate general, physics-related estimators from the specific features of the data model related to each particular experiment. pLISA is oriented towards parallel computing architectures, with awareness of the opportunity of using GPUs as accelerators demanding specifically optimized algorithms and to reduce the costs of processing hardware requested for the reconstruction tasks. Indeed, a fast (ideally, real-time) reconstruction can open the way for the development or improvement of alert systems, typically required by multi-messenger search programmes among the different experimental facilities involved in ASTERICS.
Evaluating Word Prediction Software for Students with Physical Disabilities
ERIC Educational Resources Information Center
Mezei, Peter; Heller, Kathryn Wolff
2005-01-01
Although word prediction software was originally developed for individuals with physical disabilities, little research has been conducted featuring participants with physical disabilities. Using the Co:Writer 4000 word prediction software, three participants with physical disabilities improved typing rate and spelling accuracy, and two of these…
The Pandora multi-algorithm approach to automated pattern recognition in LAr TPC detectors
NASA Astrophysics Data System (ADS)
Marshall, J. S.; Blake, A. S. T.; Thomson, M. A.; Escudero, L.; de Vries, J.; Weston, J.;
2017-09-01
The development and operation of Liquid Argon Time Projection Chambers (LAr TPCs) for neutrino physics has created a need for new approaches to pattern recognition, in order to fully exploit the superb imaging capabilities offered by this technology. The Pandora Software Development Kit provides functionality to aid the process of designing, implementing and running pattern recognition algorithms. It promotes the use of a multi-algorithm approach to pattern recognition: individual algorithms each address a specific task in a particular topology; a series of many tens of algorithms then carefully builds up a picture of the event. The input to the Pandora pattern recognition is a list of 2D Hits. The output from the chain of over 70 algorithms is a hierarchy of reconstructed 3D Particles, each with an identified particle type, vertex and direction.
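The multi-algorithm idea, in which many small algorithms each refine a shared event description, can be sketched generically as below. This is not the Pandora SDK's API; the event fields and the three hypothetical algorithms are invented purely to illustrate the pattern of chaining specialised steps over one evolving event record.

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    """Minimal stand-in for an event record passed along the algorithm chain."""
    hits_2d: list
    clusters: list = field(default_factory=list)
    particles_3d: list = field(default_factory=list)

def cluster_hits(event):
    # Hypothetical step: group 2D hits into clusters (here, by crude striding)
    event.clusters = [sorted(event.hits_2d)[i::3] for i in range(3)]

def match_views(event):
    # Hypothetical step: build 3D particle candidates from the 2D clusters
    event.particles_3d = [{"n_hits": len(c), "type": "track"} for c in event.clusters]

def tag_vertices(event):
    # Hypothetical step: attach a vertex estimate to each particle
    for p in event.particles_3d:
        p["vertex"] = (0.0, 0.0, 0.0)

algorithm_chain = [cluster_hits, match_views, tag_vertices]

event = Event(hits_2d=[5.0, 1.2, 3.3, 0.7, 4.1, 2.8])
for algorithm in algorithm_chain:
    algorithm(event)      # each algorithm addresses one task and updates the event

print(event.particles_3d)
```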
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyer, W.B.
1979-09-01
This report describes both the hardware and software components of an automatic calibration and signal system (Autocal) for the data acquisition system for the Sandia particle beam fusion research accelerators Hydra, Proto I, and Proto II. The Autocal hardware consists of off-the-shelf commercial equipment. The various hardware components, special modifications and overall system configuration are described. Special software has been developed to support the Autocal hardware. Software operation and maintenance are described.
A computer controlled television detector for light, X-rays and particles
NASA Technical Reports Server (NTRS)
Kalata, K.
1981-01-01
A versatile, high resolution, software configurable, two-dimensional intensified vidicon quantum detector system has been developed for multiple research applications. A thin phosphor convertor allows the detection of X-rays below 20 keV and non-relativistic particles in addition to visible light, and a thicker scintillator can be used to detect X-rays up to 100 keV and relativistic particles. Faceplates may be changed to allow any active area from 1 to 40 mm square, and active areas up to 200 mm square are possible. The image is integrated in a digital memory on any software specified array size up to 4000 x 4000. The array size is selected to match the spatial resolution, which ranges from 10 to 100 microns depending on the operating mode, the active area, and the photon or particle energy. All scan and data acquisition parameters are under software control to allow optimal data collection for each application.
Icing Branch Current Research Activities in Icing Physics
NASA Technical Reports Server (NTRS)
Vargas, Mario
2009-01-01
Current development: A grid block transformation scheme, which allows the input of grids in arbitrary reference frames, the use of mirror planes, and grids with relative velocities, has been developed. A simple ice crystal and sand particle bouncing scheme has been included. An SLD splashing model based on that developed by William Wright for the LEWICE 3.2.2 software has been added. A new area-based collection efficiency algorithm will be incorporated, which calculates trajectories from inflow block boundaries to outflow block boundaries. This method will be used for calculating and passing collection efficiency data between blade rows for turbo-machinery calculations.
A fast - Monte Carlo toolkit on GPU for treatment plan dose recalculation in proton therapy
NASA Astrophysics Data System (ADS)
Senzacqua, M.; Schiavi, A.; Patera, V.; Pioli, S.; Battistoni, G.; Ciocca, M.; Mairani, A.; Magro, G.; Molinelli, S.
2017-10-01
In the context of particle therapy, a crucial role is played by Treatment Planning Systems (TPSs), tools aimed at computing and optimizing the treatment plan. Nowadays one of the major issues related to TPSs in particle therapy is the large CPU time needed. We developed a software toolkit (FRED) for reducing dose recalculation time by exploiting Graphics Processing Unit (GPU) hardware. Thanks to their high parallelization capability, GPUs significantly reduce the computation time, by up to a factor of 100 with respect to standard software running on a CPU. The transport of proton beams in the patient is accurately described through Monte Carlo methods. The physical processes reproduced are: Multiple Coulomb Scattering, energy straggling and nuclear interactions of protons with the main nuclei composing biological tissues. The FRED toolkit does not rely on the water-equivalent translation of tissues, but exploits the Computed Tomography anatomical information by reconstructing and simulating the atomic composition of each crossed tissue. FRED can be used as an efficient tool for dose recalculation on the day of the treatment. In fact, it can provide, in about one minute on standard hardware, the dose map obtained by combining the treatment plan, earlier computed by the TPS, with the current patient anatomic arrangement.
2006-12-01
Guidance and Navigation Software Architecture Design for the Autonomous Multi-Agent Physically Interacting Spacecraft (AMPHIS) Test Bed, by Blake D. Eikenberry (Engineer Degree thesis). Approved for public release; distribution is unlimited.
Sahraneshin Samani, Fazel; Moore, Jodene K; Khosravani, Pardis; Ebrahimi, Marzieh
2014-08-01
Flow cytometers designed to analyze large particles are enabling new applications in biology. Data analysis is a critical component of the flow cytometry (FCM) process. In this article we compare the features of four free software packages: WinMDI, Cyflogic, Flowing Software, and Cytobank.
NASA Astrophysics Data System (ADS)
Chávez, G. Moreno; Sarocchi, D.; Santana, E. Arce; Borselli, L.
2015-12-01
The study of grain size distribution is fundamental for understanding sedimentological environments. Through these analyses, clast erosion, transport and deposition processes can be interpreted and modeled. However, grain size distribution analysis can be difficult in some outcrops due to the number and complexity of the arrangement of clasts and matrix and their physical size. Despite various technological advances, it is almost impossible to obtain the full grain size distribution (from blocks to sand grain size) with a single method or instrument of analysis. For this reason, development in this area continues to be fundamental. In recent years, various methods of particle size analysis by automatic image processing have been developed, owing to their potential advantages with respect to classical ones: speed and the detailed content of the final information (virtually for each analyzed particle). In this framework, we have developed a novel algorithm and software for grain size distribution analysis, based on color image segmentation using an entropy-controlled quadratic Markov measure field algorithm and the Rosiwal method for counting intersections between clasts and linear transects in the images. We tested the novel algorithm on different sedimentary deposit types from 14 varieties of sedimentological environments. The results of the new algorithm were compared with grain counts performed manually by experts using the same Rosiwal method. The new algorithm has the same accuracy as a classical manual count process, but the application of this innovative methodology is much easier and dramatically less time-consuming. The final productivity of the new software for the analysis of clast deposits after recording field outcrop images can be increased significantly.
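The Rosiwal intercept idea used in the comparison can be illustrated in a few lines of NumPy: given a binary segmentation of an outcrop image (clast pixels versus matrix), chord lengths are measured along horizontal transects and summarised. This is a generic sketch with a random toy mask and an assumed pixel scale, not the authors' entropy-controlled segmentation code.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed binary segmentation: True = clast pixel, False = matrix (toy random mask)
mask = rng.random((200, 200)) < 0.3

def chord_lengths(mask, step=10):
    """Measure run lengths of clast pixels along every `step`-th row (Rosiwal transects)."""
    lengths = []
    for row in mask[::step]:
        run = 0
        for pixel in row:
            if pixel:
                run += 1
            elif run:
                lengths.append(run)
                run = 0
        if run:
            lengths.append(run)
    return np.array(lengths)

pixel_size_mm = 0.1                      # assumed image scale
chords_mm = chord_lengths(mask) * pixel_size_mm
print("intercepts:", len(chords_mm))
print("mean chord length: %.2f mm" % chords_mm.mean())
print("areal fraction of clasts along transects: %.2f" % mask[::10].mean())
```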
Time-based Reconstruction of Free-streaming Data in CBM
NASA Astrophysics Data System (ADS)
Akishina, Valentina; Kisel, Ivan; Vassiliev, Iouri; Zyzak, Maksym
2018-02-01
Traditional latency-limited trigger architectures typical of conventional experiments are inapplicable for the CBM experiment. Instead, CBM will ship and collect time-stamped data into a readout buffer in the form of a time-slice of a certain length and deliver it to a large computer farm, where online event reconstruction and selection will be performed. Grouping measurements into physical collisions must be performed in software and requires reconstruction not only in space, but also in time, the so-called 4-dimensional track reconstruction and event building. The tracks, reconstructed with the 4D Cellular Automaton track finder, are combined into event-corresponding clusters according to their estimated time at the target position and its error, obtained with the Kalman filter method. The reconstructed events are given as inputs to the KF Particle Finder package for short-lived particle reconstruction. The results of time-based reconstruction of simulated collisions in CBM are presented and discussed in detail.
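The event-building step described above, grouping tracks by their estimated time at the target, can be sketched as a simple gap-based 1D clustering. The track times, uncertainties and the 5-sigma threshold below are invented for illustration and do not reflect the actual CBM 4D reconstruction code.

```python
import numpy as np

# Assumed reconstructed tracks: estimated time at the target (ns) and its uncertainty
track_times = np.array([10.1, 10.4, 9.8, 55.2, 55.9, 120.3, 120.1, 119.7])
track_sigmas = np.array([0.3, 0.4, 0.5, 0.3, 0.6, 0.4, 0.3, 0.5])

def build_events(times, sigmas, n_sigma=5.0):
    """Sort tracks by time and start a new event whenever the gap to the previous
    track exceeds n_sigma times the combined time uncertainty."""
    order = np.argsort(times)
    events, current = [], [order[0]]
    for prev, curr in zip(order[:-1], order[1:]):
        gap_limit = n_sigma * np.hypot(sigmas[prev], sigmas[curr])
        if times[curr] - times[prev] > gap_limit:
            events.append(current)
            current = []
        current.append(curr)
    events.append(current)
    return events

for i, ev in enumerate(build_events(track_times, track_sigmas)):
    print(f"event {i}: tracks {sorted(ev)} at ~{track_times[ev].mean():.1f} ns")
```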
PowderSim: Lagrangian Discrete and Mesh-Free Continuum Simulation Code for Cohesive Soils
NASA Technical Reports Server (NTRS)
Johnson, Scott; Walton, Otis; Settgast, Randolph
2013-01-01
PowderSim is a calculation tool that combines a discrete-element method (DEM) module, including calibrated interparticle-interaction relationships, with a mesh-free, continuum, SPH (smoothed-particle hydrodynamics) based module that utilizes enhanced, calibrated, constitutive models capable of mimicking both large deformations and the flow behavior of regolith simulants and lunar regolith under conditions anticipated during in situ resource utilization (ISRU) operations. The major innovation introduced in PowderSim is to use a mesh-free method (SPH-based) with a calibrated and slightly modified critical-state soil mechanics constitutive model to extend the ability of the simulation tool to also address full-scale engineering systems in the continuum sense. The PowderSim software maintains the ability to address particle-scale problems, like size segregation, in selected regions with a traditional DEM module, which has improved contact physics and electrostatic interaction models.
Strong scaling of general-purpose molecular dynamics simulations on GPUs
NASA Astrophysics Data System (ADS)
Glaser, Jens; Nguyen, Trung Dac; Anderson, Joshua A.; Lui, Pak; Spiga, Filippo; Millan, Jaime A.; Morse, David C.; Glotzer, Sharon C.
2015-07-01
We describe a highly optimized implementation of MPI domain decomposition in a GPU-enabled, general-purpose molecular dynamics code, HOOMD-blue (Anderson and Glotzer, 2013). Our approach is inspired by a traditional CPU-based code, LAMMPS (Plimpton, 1995), but is implemented within a code that was designed for execution on GPUs from the start (Anderson et al., 2008). The software supports short-ranged pair force and bond force fields and achieves optimal GPU performance using an autotuning algorithm. We are able to demonstrate equivalent or superior scaling on up to 3375 GPUs in Lennard-Jones and dissipative particle dynamics (DPD) simulations of up to 108 million particles. GPUDirect RDMA capabilities in recent GPU generations provide better performance in full double precision calculations. For a representative polymer physics application, HOOMD-blue 1.0 provides an effective GPU vs. CPU node speed-up of 12.5 ×.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spentzouris, P.; /Fermilab; Cary, J.
The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization for software development and applications accounts for the natural domain areas (beam dynamics, electromagnetics, and advanced acceleration), and all areas depend on the enabling technologies activities, such as solvers and component technology, to deliver the desired performance and integrated simulation environment. The ComPASS applications focus on computationally challenging problems important for design or performance optimization to all major HEP, NP, and BES accelerator facilities. With the cost and complexity of particle accelerators rising, the use of computation to optimize their designs and find improved operating regimes becomes essential, potentially leading to significant cost savings with modest investment.
Expected Performance of the ATLAS Experiment - Detector, Trigger and Physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aad, G.; Abat, E.; Abbott, B.
2011-11-28
The Large Hadron Collider (LHC) at CERN promises a major step forward in the understanding of the fundamental nature of matter. The ATLAS experiment is a general-purpose detector for the LHC, whose design was guided by the need to accommodate the wide spectrum of possible physics signatures. The major remit of the ATLAS experiment is the exploration of the TeV mass scale, where groundbreaking discoveries are expected. The focus is on the investigation of electroweak symmetry breaking and, linked to this, the search for the Higgs boson, as well as the search for physics beyond the Standard Model. In this report a detailed examination of the expected performance of the ATLAS detector is provided, with a major aim being to investigate the experimental sensitivity to a wide range of measurements and potential observations of new physical processes. An earlier summary of the expected capabilities of ATLAS was compiled in 1999 [1]. A survey of physics capabilities of the CMS detector was published in [2]. The design of the ATLAS detector has now been finalised, and its construction and installation have been completed [3]. An extensive test-beam programme was undertaken. Furthermore, the simulation and reconstruction software code and frameworks have been completely rewritten. Revisions incorporated reflect improved detector modelling as well as major technical changes to the software technology. Greatly improved understanding of calibration and alignment techniques, and their practical impact on performance, is now in place. The studies reported here are based on full simulations of the ATLAS detector response. A variety of event generators were employed. The simulation and reconstruction of these large event samples thus provided an important operational test of the new ATLAS software system. In addition, the processing was distributed world-wide over the ATLAS Grid facilities and hence provided an important test of the ATLAS computing system - this is the origin of the expression 'CSC studies' ('computing system commissioning'), which is occasionally referred to in these volumes. The work reported generally assumes that the detector is fully operational, and in this sense represents an idealised detector: establishing the best performance of the ATLAS detector with LHC proton-proton collisions is a challenging task for the future. The results summarised here therefore represent the best estimate of ATLAS capabilities before real operational experience of the full detector with beam. Unless otherwise stated, simulations also do not include the effect of additional interactions in the same or other bunch-crossings, and the effect of neutron background is neglected. Thus simulations correspond to the low-luminosity performance of the ATLAS detector. This report is broadly divided into two parts: firstly the performance for identification of physics objects is examined in detail, followed by a detailed assessment of the performance of the trigger system. This part is subdivided into chapters surveying the capabilities for charged particle tracking, each of electron/photon, muon and tau identification, jet and missing transverse energy reconstruction, b-tagging algorithms and performance, and finally the trigger system performance. In each chapter of the report, there is a further subdivision into shorter notes describing different aspects studied. The second major subdivision of the report addresses physics measurement capabilities, and new physics search sensitivities. Individual chapters in this part discuss ATLAS physics capabilities in Standard Model QCD and electroweak processes, in the top quark sector, in b-physics, in searches for Higgs bosons, supersymmetry searches, and finally searches for other new particles predicted in more exotic models.
Have More Fun Teaching Physics: Simulating, Stimulating Software.
ERIC Educational Resources Information Center
Jenkins, Doug
1996-01-01
High school physics offers opportunities to use problem solving and lab practices as well as cement skills in research, technical writing, and software applications. Describes and evaluates computer software enhancing the high school physics curriculum including spreadsheets for laboratory data, all-in-one simulators, projectile motion simulators,…
NASA Astrophysics Data System (ADS)
2013-09-01
WE RECOMMEND
Marie Curie and Her Daughters: An insightful study of a resilient and ingenious family and their achievements
Cumulus: Simple to install and operate and with obvious teaching applications, this weather station 'donationware' is as easy to recommend as it is to use
Alpha Particle Scattering Apparatus: Good design and construction make for good results
National Grid Transmission Model: Despite its expense, this resource offers excellent value
Einstein's Physics: A vivid, accurate, compelling and rigorous treatment, but requiring an investment of time and thought
WORTH A LOOK
3D Magnetic Tube: Magnetic fields in three dimensions at a low cost
Barton's Pendulums: A neat, well-made and handy variant, but not a replacement for the more traditional version
Weather Station: Though not as robust or substantial as hoped for, this can be put to good use with the right software
WEB WATCH
An online experiment and worksheet are useful for teaching motor efficiency, a glance at CERN, and NASA's interesting information on the alpha-magnetic spectrometer and climate change
cisTEM, user-friendly software for single-particle image processing.
Grant, Timothy; Rohou, Alexis; Grigorieff, Nikolaus
2018-03-07
We have developed new open-source software called cisTEM (computational imaging system for transmission electron microscopy) for the processing of data for high-resolution electron cryo-microscopy and single-particle averaging. cisTEM features a graphical user interface that is used to submit jobs, monitor their progress, and display results. It implements a full processing pipeline including movie processing, image defocus determination, automatic particle picking, 2D classification, ab-initio 3D map generation from random parameters, 3D classification, and high-resolution refinement and reconstruction. Some of these steps implement newly-developed algorithms; others were adapted from previously published algorithms. The software is optimized to enable processing of typical datasets (2000 micrographs, 200k-300k particles) on a high-end, CPU-based workstation in half a day or less, comparable to GPU-accelerated processing. Jobs can also be scheduled on large computer clusters using flexible run profiles that can be adapted for most computing environments. cisTEM is available for download from cistem.org. © 2018, Grant et al.
Development and characterization of an aircraft aerosol time-of-flight mass spectrometer.
Pratt, Kerri A; Mayer, Joseph E; Holecek, John C; Moffet, Ryan C; Sanchez, Rene O; Rebotier, Thomas P; Furutani, Hiroshi; Gonin, Marc; Fuhrer, Katrin; Su, Yongxuan; Guazzotti, Sergio; Prather, Kimberly A
2009-03-01
Vertical and horizontal profiles of atmospheric aerosols are necessary for understanding the impact of air pollution on regional and global climate. To gain further insight into the size-resolved chemistry of individual atmospheric particles, a smaller aerosol time-of-flight mass spectrometer (ATOFMS) with increased data acquisition capabilities was developed for aircraft-based studies. Compared to previous ATOFMS systems, the new instrument has a faster data acquisition rate with improved ion transmission and mass resolution, as well as reduced physical size and power consumption, all required advances for use in aircraft studies. In addition, real-time source apportionment software allows the immediate identification and classification of individual particles to guide sampling decisions while in the field. The aircraft (A)-ATOFMS was field-tested on the ground during the Study of Organic Aerosols in Riverside, CA (SOAR) and aboard an aircraft during the Ice in Clouds Experiment-Layer Clouds (ICE-L). Initial results from ICE-L represent the first reported aircraft-based single-particle dual-polarity mass spectrometry measurements and provide an increased understanding of particle mixing state as a function of altitude. Improved ion transmission allows for the first single-particle detection of species out to approximately m/z 2000, an important mass range for the detection of biological aerosols and oligomeric species. In addition, high time resolution measurements of single-particle mixing state are demonstrated and shown to be important for airborne studies where particle concentrations and chemistry vary rapidly.
PIV/HPIV Film Analysis Software Package
NASA Technical Reports Server (NTRS)
Blackshire, James L.
1997-01-01
A PIV/HPIV film analysis software system was developed that calculates the 2-dimensional spatial autocorrelations of subregions of Particle Image Velocimetry (PIV) or Holographic Particle Image Velocimetry (HPIV) film recordings. The software controls three hardware subsystems including (1) a Kodak Megaplus 1.4 camera and EPIX 4MEG framegrabber subsystem, (2) an IEEE/Unidex 11 precision motion control subsystem, and (3) an Alacron I860 array processor subsystem. The software runs on an IBM PC/AT host computer running either the Microsoft Windows 3.1 or Windows 95 operating system. It is capable of processing five PIV or HPIV displacement vectors per second, and is completely automated with the exception of user input to a configuration file prior to analysis execution for update of various system parameters.
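The spatial-autocorrelation step at the heart of such PIV analysis can be illustrated with generic NumPy code: a synthetic double-exposure interrogation window is autocorrelated via FFT, and the displacement shows up as a secondary peak. The window size, particle count and shift below are invented; this is not the described film-analysis system, which ran on dedicated array-processor hardware.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic double-exposure interrogation window: random particles plus a shifted copy
size, shift = 64, (5, 3)
first = np.zeros((size, size))
ys, xs = rng.integers(8, size - 8, (2, 40))
first[ys, xs] = 1.0
window = first + np.roll(first, shift, axis=(0, 1))   # second exposure, displaced

# 2D spatial autocorrelation via FFT (Wiener-Khinchin theorem)
spectrum = np.fft.fft2(window - window.mean())
autocorr = np.fft.fftshift(np.fft.ifft2(np.abs(spectrum) ** 2).real)

# The displacement appears as a secondary peak offset from the central peak
centre = np.array(autocorr.shape) // 2
autocorr[centre[0], centre[1]] = 0.0        # suppress the zero-lag self-correlation peak
peak = np.unravel_index(np.argmax(autocorr), autocorr.shape)
# Note: autocorrelation PIV has an inherent sign ambiguity (+shift or -shift)
print("estimated displacement (rows, cols):", tuple(np.array(peak) - centre))
```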
High-throughput landslide modelling using computational grids
NASA Astrophysics Data System (ADS)
Wallace, M.; Metson, S.; Holcombe, L.; Anderson, M.; Newbold, D.; Brook, N.
2012-04-01
Landslides are an increasing problem in developing countries. Multiple landslides can be triggered by heavy rainfall resulting in loss of life, homes and critical infrastructure. Through computer simulation of individual slopes it is possible to predict the causes, timing and magnitude of landslides and estimate the potential physical impact. Geographical scientists at the University of Bristol have developed software that integrates a physically-based slope hydrology and stability model (CHASM) with an econometric model (QUESTA) in order to predict landslide risk over time. These models allow multiple scenarios to be evaluated for each slope, accounting for data uncertainties, different engineering interventions, risk management approaches and rainfall patterns. Individual scenarios can be computationally intensive, however each scenario is independent and so multiple scenarios can be executed in parallel. As more simulations are carried out the overhead involved in managing input and output data becomes significant. This is a greater problem if multiple slopes are considered concurrently, as is required both for landslide research and for effective disaster planning at national levels. There are two critical factors in this context: generated data volumes can be in the order of tens of terabytes, and greater numbers of simulations result in long total runtimes. Users of such models, in both the research community and in developing countries, need to develop a means for handling the generation and submission of landslide modelling experiments, and the storage and analysis of the resulting datasets. Additionally, governments in developing countries typically lack the necessary computing resources and infrastructure. Consequently, knowledge that could be gained by aggregating simulation results from many different scenarios across many different slopes remains hidden within the data. To address these data and workload management issues, University of Bristol particle physicists and geographical scientists are collaborating to develop methods for providing simple and effective access to landslide models and associated simulation data. Particle physicists have valuable experience in dealing with data complexity and management due to the scale of data generated by particle accelerators such as the Large Hadron Collider (LHC). The LHC generates tens of petabytes of data every year which is stored and analysed using the Worldwide LHC Computing Grid (WLCG). Tools and concepts from the WLCG are being used to drive the development of a Software-as-a-Service (SaaS) platform to provide access to hosted landslide simulation software and data. It contains advanced data management features and allows landslide simulations to be run on the WLCG, dramatically reducing simulation runtimes by parallel execution. The simulations are accessed using a web page through which users can enter and browse input data, submit jobs and visualise results. Replication of the data ensures a local copy can be accessed should a connection to the platform be unavailable. The platform does not know the details of the simulation software it runs, so it is therefore possible to use it to run alternative models at similar scales. This creates the opportunity for activities such as model sensitivity analysis and performance comparison at scales that are impractical using standalone software.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lehe, Remi
Many simulation software packages produce data in the form of a set of field values or of a set of particle positions (one such example is that of particle-in-cell codes, which produce data on the electromagnetic fields that they simulate). However, each particular software package uses its own particular format and layout for the output data. This makes it difficult to compare the results of different simulation software, or to have a common visualization tool for these results. However, a standardized layout for fields and particles has recently been developed: the openPMD format (www.openpmd.org). This format is open-source, and specifies a standard way in which field data and particle data should be written. The openPMD format is already implemented in the particle-in-cell code Warp (developed at LBL) and in PIConGPU (developed at HZDR, Germany). In this context, the proposed software (openPMD-viewer) is a Python package which allows one to access and visualize any data which has been formatted according to the openPMD standard. This package contains two main components: (a) a Python API, which allows one to read and extract the data from an openPMD file, so as to be able to work with it within the Python environment (e.g. plot the data and reprocess it with particular Python functions); and (b) a graphical interface, which works with the IPython notebook and allows one to quickly visualize the data and browse through a set of openPMD files. The proposed software will typically be used when analyzing the results of numerical simulations. It will be useful to quickly extract scientific meaning from a set of numerical data.
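A typical analysis session with the package might look like the sketch below. It assumes an interface along the lines of the project's documented OpenPMDTimeSeries class; the import path, method names and the example directory and species names are assumptions and may differ between versions, so this is an illustration rather than a definitive usage guide.

```python
# Sketch of typical openPMD-viewer usage (assumed interface; check the package docs)
from openpmd_viewer import OpenPMDTimeSeries

ts = OpenPMDTimeSeries('./diags/hdf5/')      # directory containing openPMD output files

# Extract a field: returns the data array and a metadata object (axes, extent, ...)
rho, info = ts.get_field(field='rho', iteration=ts.iterations[-1])
print(rho.shape, info.axes)

# Extract particle quantities for a given species at the same iteration
x, uz = ts.get_particle(var_list=['x', 'uz'], species='electrons',
                        iteration=ts.iterations[-1])
print(x.mean(), uz.max())
```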
A 2 MV Van de Graaff accelerator as a tool for planetary and impact physics research
NASA Astrophysics Data System (ADS)
Mocker, Anna; Bugiel, Sebastian; Auer, Siegfried; Baust, Günter; Colette, Andrew; Drake, Keith; Fiege, Katherina; Grün, Eberhard; Heckmann, Frieder; Helfert, Stefan; Hillier, Jonathan; Kempf, Sascha; Matt, Günter; Mellert, Tobias; Munsat, Tobin; Otto, Katharina; Postberg, Frank; Röser, Hans-Peter; Shu, Anthony; Sternovsky, Zoltán; Srama, Ralf
2011-09-01
Investigating the dynamical and physical properties of cosmic dust can reveal a great deal of information about both the dust and its many sources. Over recent years, several spacecraft (e.g., Cassini, Stardust, Galileo, and Ulysses) have successfully characterised interstellar, interplanetary, and circumplanetary dust using a variety of techniques, including in situ analyses and sample return. Charge, mass, and velocity measurements of the dust are performed either directly (induced charge signals) or indirectly (mass and velocity from impact ionisation signals or crater morphology) and constrain the dynamical parameters of the dust grains. Dust compositional information may be obtained via either time-of-flight mass spectrometry of the impact plasma or direct sample return. The accurate and reliable interpretation of collected spacecraft data requires a comprehensive programme of terrestrial instrument calibration. This process involves accelerating suitable solar system analogue dust particles to hypervelocity speeds in the laboratory, an activity performed at the Max Planck Institut für Kernphysik in Heidelberg, Germany. Here, a 2 MV Van de Graaff accelerator electrostatically accelerates charged micron and submicron-sized dust particles to speeds up to 80 km s-1. Recent advances in dust production and processing have allowed solar system analogue dust particles (silicates and other minerals) to be coated with a thin conductive shell, enabling them to be charged and accelerated. Refinements and upgrades to the beam line instrumentation and electronics now allow for the reliable selection of particles at velocities of 1-80 km s-1 and with diameters of between 0.05 μm and 5 μm. This ability to select particles for subsequent impact studies based on their charges, masses, or velocities is provided by a particle selection unit (PSU). The PSU contains a field programmable gate array, capable of monitoring in real time the particles' speeds and charges, and is controlled remotely by a custom, platform independent, software package. The new control instrumentation and electronics, together with the wide range of accelerable particle types, allow the controlled investigation of hypervelocity impact phenomena across a hitherto unobtainable range of impact parameters.
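The quantities the particle selection unit works with follow from elementary relations: the grain charge from the induced signal on a pickup tube, the speed from the transit time between two detectors, and the mass from the acceleration potential. The sketch below works through these relations; all detector and grain parameters are invented for illustration and are not the facility's calibration values.

```python
import numpy as np

# Assumed detector and accelerator parameters (illustrative only)
C_det = 1e-13          # F, effective capacitance of the induction pickup + amplifier
U_signal = 5e-3        # V, peak induced voltage from a passing grain
d = 0.20               # m, separation of the two pickup tubes
dt = 5.0e-6            # s, measured transit time between the tubes
U_acc = 2.0e6          # V, Van de Graaff acceleration voltage

q = C_det * U_signal               # induced-charge measurement: q = C * U
v = d / dt                         # time-of-flight velocity
m = 2.0 * q * U_acc / v**2         # energy balance: q * U_acc = m * v^2 / 2

# Equivalent sphere diameter, assuming an iron grain (density 7870 kg/m^3)
rho = 7870.0
diameter = (6.0 * m / (np.pi * rho)) ** (1.0 / 3.0)

print(f"charge {q:.2e} C, speed {v/1e3:.1f} km/s, mass {m:.2e} kg, "
      f"diameter {diameter*1e6:.2f} um")
```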
Inner space/outer space - The interface between cosmology and particle physics
NASA Astrophysics Data System (ADS)
Kolb, Edward W.; Turner, Michael S.; Lindley, David; Olive, Keith; Seckel, David
A collection of papers covering the synthesis between particle physics and cosmology is presented. The general topics addressed include: standard models of particle physics and cosmology; microwave background radiation; origin and evolution of large-scale structure; inflation; massive magnetic monopoles; supersymmetry, supergravity, and quantum gravity; cosmological constraints on particle physics; Kaluza-Klein cosmology; and future directions and connections in particle physics and cosmology.
Fast Photon Monte Carlo for Water Cherenkov Detectors
NASA Astrophysics Data System (ADS)
Latorre, Anthony; Seibert, Stanley
2012-03-01
We present Chroma, a high performance optical photon simulation for large particle physics detectors, such as the water Cerenkov far detector option for LBNE. This software takes advantage of the CUDA parallel computing platform to propagate photons using modern graphics processing units. In a computer model of a 200 kiloton water Cerenkov detector with 29,000 photomultiplier tubes, Chroma can propagate 2.5 million photons per second, around 200 times faster than the same simulation with Geant4. Chroma uses a surface based approach to modeling geometry which offers many benefits over a solid based modelling approach which is used in other simulations like Geant4.
Introduction to the scientific application system of DAMPE (On behalf of DAMPE collaboration)
NASA Astrophysics Data System (ADS)
Zang, Jingjing
2016-07-01
The Dark Matter Particle Explorer (DAMPE) is a high energy particle physics experiment satellite, launched on 17 Dec 2015. The science data processing and payload operation maintenance for DAMPE will be provided by the DAMPE Scientific Application System (SAS) at the Purple Mountain Observatory (PMO) of the Chinese Academy of Sciences. SAS consists of three subsystems: the scientific operation subsystem, the science data and user management subsystem, and the science data processing subsystem. In cooperation with the Ground Support System (Beijing), the scientific operation subsystem is responsible for proposing observation plans, monitoring the health of the satellite, generating payload control commands and participating in all activities related to payload operation. Several databases developed by the science data and user management subsystem of DAMPE methodically manage all collected and reconstructed science data, downlinked housekeeping data, and payload configuration and calibration data. Under the leadership of the DAMPE Scientific Committee, this subsystem is also responsible for the publication of high-level science data and for supporting all science activities of the DAMPE collaboration. The science data processing subsystem of DAMPE has already developed a series of physics analysis software packages to reconstruct basic information about detected cosmic-ray particles. This subsystem also maintains the high performance computing system of SAS to process all downlinked science data and automatically monitors the quality of all produced data. In this talk, we will describe the functionality of the whole DAMPE SAS system and present the main performance of its data processing capabilities.
Analyzing Virtual Physics Simulations with Tracker
NASA Astrophysics Data System (ADS)
Claessens, Tom
2017-12-01
In the physics teaching community, Tracker is well known as a user-friendly open source video analysis software, authored by Douglas Brown. With this tool, the user can trace markers indicated on a video or on stroboscopic photos and perform kinematic analyses. Tracker also includes a data modeling tool that allows one to fit some theoretical equations of motion onto experimentally obtained data. In the field of particle mechanics, Tracker has been effectively used for learning and teaching about projectile motion, "toss up" and free-fall vertical motion, and to explain the principle of mechanical energy conservation. Also, Tracker has been successfully used in rigid body mechanics to interpret the results of experiments with rolling/slipping cylinders and moving rods. In this work, I propose an original method in which Tracker is used to analyze virtual computer simulations created with a physics-based motion solver, instead of analyzing video recording or stroboscopic photos. This could be an interesting approach to study kinematics and dynamics problems in physics education, in particular when there is no or limited access to physical labs. I demonstrate the working method with a typical (but quite challenging) problem in classical mechanics: a slipping/rolling cylinder on a rough surface.
In-vivo x-ray micro-imaging and micro-CT with the Medipix2 semiconductor detector at UniAndes
NASA Astrophysics Data System (ADS)
Caicedo, I.; Avila, C.; Gomez, B.; Bula, C.; Roa, C.; Sanabria, J.
2012-02-01
This poster presents the procedure for obtaining micro-CTs and imaging moving samples using the Medipix2 detector, together with the corresponding results. The high granularity of the detector makes it suitable for micro-CT. We used commercial software (Octopus) to do the 3D reconstruction of the samples in the first place, and we worked on modifying free reconstruction software afterwards. Medipix has a very fast response (~hundreds of nanoseconds) and high sensitivity. These features allow obtaining nearly in-vivo high resolution (55 μm × 55 μm) images. We used an exposure time of 0.1 s for each frame, and the resulting images were animated. The High Energy Physics Group at UniAndes is a member of the Medipix3 collaboration. Its research activities are focused on developing set-ups for biomedical applications and particle tracking using the Medipix2 and Timepix detectors, and assessing the feasibility of the Medipix3 detector for future applications.
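The tomographic reconstruction step itself (independent of the commercial Octopus package mentioned above) can be illustrated with the inverse Radon transform available in scikit-image. This generic sketch uses a synthetic phantom rather than Medipix2 data, and is only a minimal filtered-backprojection example, not the group's modified reconstruction software.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

# Synthetic phantom standing in for a tomographed sample slice
image = rescale(shepp_logan_phantom(), 0.5)

# Simulate projections over 180 degrees, one per rotation step
angles = np.linspace(0.0, 180.0, 180, endpoint=False)
sinogram = radon(image, theta=angles, circle=True)

# Filtered backprojection reconstruction of the slice
reconstruction = iradon(sinogram, theta=angles, circle=True)
print("reconstruction error (RMS):",
      np.sqrt(np.mean((reconstruction - image) ** 2)))
```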
Consistent Evolution of Software Artifacts and Non-Functional Models
2014-11-14
… induce bad software performance)? Subject terms: EOARD, nanoparticles, photo-acoustic sensors, Model-Driven Engineering (MDE), Software Performance Engineering (SPE), change propagation, performance antipatterns. Performing organization: Università degli Studi dell'Aquila, Via Vetoio, 67100 L'Aquila, Italy (email: vittorio.cortellessa@univaq.it, web: http://www.di.univaq.it/cortelle/). For sake of readability of the …
Monitoring of computing resource use of active software releases at ATLAS
NASA Astrophysics Data System (ADS)
Limosani, Antonio; ATLAS Collaboration
2017-10-01
The LHC is the world's most powerful particle accelerator, colliding protons at a centre of mass energy of 13 TeV. As the energy and frequency of collisions has grown in the search for new physics, so too has demand for the computing resources needed for event reconstruction. We will report on the evolution of resource usage in terms of CPU and RAM in key ATLAS offline reconstruction workflows at the Tier-0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, which is the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded to include all workflows beginning at Monte Carlo generation through to end-user physics analysis, beyond that of event reconstruction. Moreover, the move to a multiprocessor mode in production jobs has facilitated the use of tools, such as "MemoryMonitor", to measure the memory shared across processors in jobs. Resource consumption is broken down into software domains and displayed in plots generated using Python visualization libraries and collected into pre-formatted auto-generated Web pages, which allow the ATLAS developer community to track the performance of their algorithms. This information is however preferentially filtered to domain leaders and developers through the use of JIRA and via reports given at ATLAS software meetings. Finally, we take a glimpse of the future by reporting on the expected CPU and RAM usage in benchmark workflows associated with the High Luminosity LHC and anticipate the ways performance monitoring will evolve to understand and benchmark future workflows.
gpuSPHASE-A shared memory caching implementation for 2D SPH using CUDA
NASA Astrophysics Data System (ADS)
Winkler, Daniel; Meister, Michael; Rezavand, Massoud; Rauch, Wolfgang
2017-04-01
Smoothed particle hydrodynamics (SPH) is a meshless Lagrangian method that has been successfully applied to computational fluid dynamics (CFD), solid mechanics and many other multi-physics problems. Using the method to solve transport phenomena in process engineering requires the simulation of several days to weeks of physical time. Given the high computational demand of CFD, such simulations in 3D would need computation times of years, so a reduction to a 2D domain is inevitable. In this paper gpuSPHASE, a new open-source 2D SPH solver implementation for graphics devices, is developed. It is optimized for simulations that must be executed with thousands of frames per second to be computed in reasonable time. A novel caching algorithm for Compute Unified Device Architecture (CUDA) shared memory is proposed and implemented. The software is validated and the performance is evaluated for the well established dambreak test case.
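To make the SPH terminology concrete, the sketch below evaluates the standard summation density with a cubic-spline kernel for a small 2D particle set. It is plain NumPy, not gpuSPHASE's CUDA implementation, and the particle spacing, smoothing length and target density are arbitrary illustrative values.

```python
import numpy as np

def cubic_spline_2d(r, h):
    """Standard 2D cubic-spline SPH kernel W(r, h) with support radius 2h."""
    sigma = 10.0 / (7.0 * np.pi * h**2)
    q = r / h
    w = np.zeros_like(q)
    inner = q <= 1.0
    outer = (q > 1.0) & (q <= 2.0)
    w[inner] = 1.0 - 1.5 * q[inner]**2 + 0.75 * q[inner]**3
    w[outer] = 0.25 * (2.0 - q[outer])**3
    return sigma * w

# Assumed particle set: regular grid of fluid particles with equal masses
dx, h = 0.02, 0.03
xs, ys = np.meshgrid(np.arange(0, 1, dx), np.arange(0, 0.5, dx))
pos = np.column_stack([xs.ravel(), ys.ravel()])
mass = np.full(len(pos), 1000.0 * dx * dx)          # target density 1000 kg/m^3

# Summation density: rho_i = sum_j m_j W(|r_i - r_j|, h)
dists = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
rho = (mass[None, :] * cubic_spline_2d(dists, h)).sum(axis=1)

print("density in the bulk: %.1f kg/m^3" % np.median(rho))
```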
Software package for modeling spin-orbit motion in storage rings
NASA Astrophysics Data System (ADS)
Zyuzin, D. V.
2015-12-01
A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of the obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10^6-10^9 particles in a beam during 10^9 turns in an accelerator (about 10^12-10^15 integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin-orbit dynamics.
Characterization of third-body media particles and their effect on in vitro composite wear
Lawson, Nathaniel C.; Cakir, Deniz; Beck, Preston; Litaker, Mark S.; Burgess, John O.
2012-01-01
Objectives: The purpose of this study was to compare four medium particles currently used for in vitro composite wear testing (glass and PMMA beads and millet and poppy seeds). Methods: Particles were prepared as described in previous wear studies. Hardness of the medium particles was measured with a nano-indenter, particle size was measured with a particle size analyzer, and particle form was determined with light microscopy and image analysis software. Composite wear was measured using each type of medium and water in the Alabama wear testing device. Four dental composites were compared: a hybrid (Z100), flowable microhybrid (Estelite Flow Quick), micromatrix (Esthet-X), and nano-filled (Filtek Supreme Plus). The test ran for 100,000 cycles at 1.2 Hz with 70 N force applied by a steel antagonist. Volumetric wear was measured by non-contact profilometry. A two-way analysis of variance (ANOVA) and Tukey's test were used to compare both materials and media. Results: Hardness values (GPa) of the particles are (glass, millet, PMMA, poppy, respectively): 1.310(0.150), 0.279(0.170), 0.279(0.095), and 0.226(0.146). Average particle sizes (μm) are (glass, millet, PMMA, poppy, respectively): 88.35(8.24), 8.07(4.05), 28.95(8.74), and 14.08(7.20). Glass and PMMA beads were considerably more round than the seeds. During composite wear testing, glass was the only medium that produced more wear than the use of water alone. The rank ordering of the materials varied with each medium; however, the glass and PMMA bead media allowed better discrimination between materials. Significance: PMMA beads are a practical and relevant choice for composite wear testing because they demonstrate similar physical properties to seeds but reduce the variability of wear measurements. PMID:22578990
NASA Astrophysics Data System (ADS)
Bliven, L. F.; Kucera, P. A.; Rodriguez, P.
2010-12-01
NASA Snowflake Video Imagers (SVIs) enable snowflake visualization at diverse field sites. The natural variability of frozen precipitation is a complicating factor for remote sensing retrievals in high-latitude regions. Particle classification is important for understanding snow/ice physics, remote sensing polarimetry, bulk radiative properties, surface emissivity, and ultimately, precipitation rates and accumulations. Yet intermittent storms, low temperatures, high winds, remote locations and complex terrain can impede us from observing falling snow in situ. SVI hardware and software have some special features. The standard camera and optics yield 8-bit gray-scale images with a resolution of 0.05 x 0.1 mm, at 60 frames per second. Gray-scale images are highly desirable because they display contrast that aids particle classification. Black and white (1-bit) systems display no contrast, so there is less information to recognize particle types, which is particularly burdensome for aggregates. Data are analyzed at one-minute intervals using NASA's Precipitation Link Software, which produces (a) Particle Catalogs and (b) Particle Size Distributions (PSDs). SVIs can operate nearly continuously for long periods (e.g., an entire winter season), so natural variability can be documented. We summarize results from field studies this past winter and review some recent SVI enhancements. During the winter of 2009-2010, SVIs were deployed at two sites. One SVI supported weather observations during the 2010 Winter Olympics and Paralympics. It was located close to the summit (Roundhouse) of Whistler Mountain, near the town of Whistler, British Columbia, Canada. In addition, two SVIs were located at the King City Weather Radar Station (WKR) near Toronto, Ontario, Canada. Access to the SVI on Whistler Mountain was prohibited during the Olympics due to security concerns, so to meet the schedule for daily data products we operated the SVI by remote control. We also upgraded the Precipitation Link Software to allow operator selection of the image sub-sampling interval during data processing. Thus quick-look data products were delivered on schedule, even for intense storms that generated large data files. Approximately 11 million snowflakes were recorded, and we present highlights from the Particle Catalog and the PSDs obtained during the 2010 Winter Olympics and Paralympics. The SVIs at King Radar, Ontario, had a standard-resolution camera and a higher-resolution camera (0.1 x 0.05 mm and 0.05 x 0.05 mm, respectively). The upgraded camera operated well. Using observations from the King Radar site, we will discuss camera durability and data products from the upgraded SVI. During the 2010-11 winter, a standard SVI is deployed in Finland as part of the Light Precipitation Validation Experiment. Two higher-resolution SVIs are also deployed in Canada at a field site ~30 km from WKR, which will provide data for validation of radar polarization signatures and satellite observations.
SLAC Library - Online Particle Physics Information
Revised April 2017. This annotated list provides a highly selective set of online resources that are useful to the particle physics community, covering background knowledge, particle physics lessons and activities, and astronomy and astrophysics lessons.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yeh, Hsu-Chi; Phalen, R.F.; Chang, I.
1995-12-01
The National Council on Radiation Protection and Measurements (NCRP) in the United States and the International Commission on Radiological Protection (ICRP) have been independently reviewing and revising respiratory tract dosimetry models for inhaled radioactive aerosols. The newly proposed NCRP respiratory tract dosimetry model represents a significant change in philosophy from the old ICRP Task Group model. The proposed NCRP model describes respiratory tract deposition, clearance, and dosimetry for radioactive substances inhaled by workers and the general public and is expected to be published soon. In support of the NCRP proposed model, ITRI staff members have been developing computer software. Although this software is still incomplete, the deposition portion has been completed and can be used to calculate inhaled particle deposition within the respiratory tract for particle sizes ranging from radon and radon progeny (~1 nm) to particles larger than 100 μm. Recently, ICRP published their new dosimetric model for the respiratory tract, ICRP66. Based on ICRP66, the National Radiological Protection Board of the UK developed PC-based software, LUDEP, for calculating particle deposition and internal doses. The purpose of this report is to compare the calculated respiratory tract deposition of particles using the NCRP/ITRI model and the ICRP66 model, under the same particle size distribution and breathing conditions. In summary, the general trends of the deposition curves for the two models were similar.
Two decades of Mexican particle physics at Fermilab
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roy Rubinstein
2002-12-03
This report is a view from Fermilab of Mexican particle physics at the Laboratory since about 1980; it is not intended to be a history of Mexican particle physics: that topic is outside the expertise of the writer. The period 1980 to the present coincides with the growth of Mexican experimental particle physics from essentially no activity to its current state where Mexican groups take part in experiments at several of the world's major laboratories. Soon after becoming Fermilab director in 1979, Leon Lederman initiated a program to encourage experimental physics, especially experimental particle physics, in Latin America. At the time, Mexico had significant theoretical particle physics activity, but none in experiment. Following a visit by Lederman to UNAM in 1981, a conference ''Panamerican Symposium on Particle Physics and Technology'' was held in January 1982 at Cocoyoc, Mexico, with about 50 attendees from Europe, North America, and Latin America; these included Lederman, M. Moshinsky, J. Flores, S. Glashow, J. Bjorken, and G. Charpak. Among the conference outcomes were four subsequent similar symposia over the next decade, and a formal Fermilab program to aid Latin American physics (particularly particle physics); it also influenced a decision by Mexican physicist Clicerio Avilez to switch from theoretical to experimental particle physics. The first physics collaboration between Fermilab and Mexico was in particle theory. Post-docs Rodrigo Huerta and Jose Luis Lucio spent 1-2 years at Fermilab starting in 1981, and other theorists (including Augusto Garcia, Arnulfo Zepeda, Matias Moreno and Miguel Angel Perez) also spent time at the Laboratory in the 1980s.
Addition of Electrostatic Forces to EDEM with Applications to Triboelectrically Charged Particles
NASA Technical Reports Server (NTRS)
Hogue, Michael D.; Calle, Carlos; Curry, David
2008-01-01
Tribocharging of particles is common in many processes including fine powder handling and mixing, printer toner transport and dust extraction. In a lunar environment, with its high vacuum and lack of water, electrostatic forces are an important factor to consider when designing and operating equipment. Dust mitigation and management is critical to safe and predictable performance of people and equipment. The extreme nature of lunar conditions makes it difficult and costly to carry out the experiments on Earth that are necessary to better understand how particles gather and transfer charge between each other and with equipment surfaces. DEM (Discrete Element Modeling) provides an excellent virtual laboratory for studying tribocharging of particles as well as for design of devices for dust mitigation and for other purposes related to handling and processing of lunar regolith. Theoretical and experimental work has been performed pursuant to incorporating screened Coulombic electrostatic forces into EDEM™, a commercial DEM software package. The DEM software is used to model the trajectories of large numbers of particles for industrial particulate handling and processing applications and can be coupled with other solvers and numerical models to calculate particle interaction with surrounding media and force fields. In this paper we will present an overview of the theoretical calculations and experimental data and their comparison to the results of the DEM simulations. We will also discuss current plans to revise the DEM software with advanced electrodynamic and mechanical algorithms.
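A minimal sketch of a screened Coulombic pair force of the kind described above, assuming a Yukawa-type potential; this is not the EDEM implementation, and the charges, screening length and positions are invented for the example:

    import numpy as np

    EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m

    def screened_coulomb_force(r_i, r_j, q_i, q_j, lam):
        """Force on particle i from particle j for a screened Coulomb
        (Yukawa) potential U = q_i q_j exp(-r/lam) / (4 pi eps0 r).
        Differentiating U gives F_i = U(r) * (1/r + 1/lam) along r_i - r_j."""
        d = r_i - r_j
        r = np.linalg.norm(d)
        u = q_i * q_j * np.exp(-r / lam) / (4.0 * np.pi * EPS0 * r)
        return u * (1.0 / r + 1.0 / lam) * (d / r)

    # two tribocharged dust grains, 1 mm apart, screening length 5 mm (illustrative)
    f = screened_coulomb_force(np.array([0.0, 0.0, 0.0]),
                               np.array([1e-3, 0.0, 0.0]),
                               q_i=+1e-12, q_j=-1e-12, lam=5e-3)
    print(f)  # attractive force vector in newtons

In a DEM time step, a force of this form would simply be added to the contact and gravity forces acting on each particle pair within the screening range.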
Computer Models Simulate Fine Particle Dispersion
NASA Technical Reports Server (NTRS)
2010-01-01
Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.
Modeling Physical Systems Using Vensim PLE Systems Dynamics Software
NASA Astrophysics Data System (ADS)
Widmark, Stephen
2012-02-01
Many physical systems are described by time-dependent differential equations or systems of such equations. This makes it difficult for students in an introductory physics class to solve many real-world problems since these students typically have little or no experience with this kind of mathematics. In my high school physics classes, I address this problem by having my students use a variety of software solutions to model physical systems described by differential equations. These include spreadsheets, applets, software my students themselves create, and systems dynamics software. For the latter, cost is often the main issue in choosing a solution for use in a public school and so I researched no-cost software. I found Sphinx SD, OptiSim, Systems Dynamics, Simile (Trial Edition), and Vensim PLE. In evaluating each of these solutions, I looked for the fewest restrictions in the license for educational use, ease of use by students, power, and versatility. In my opinion, Vensim PLE best fulfills these criteria.
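As a sketch of what such systems dynamics tools automate, the stock-and-flow update of a damped mass-spring system can be written as a few lines of explicit Euler integration (all parameter values are illustrative):

    # Euler integration of m*x'' = -k*x - c*x', the kind of time-dependent
    # model a systems dynamics tool would build from stocks and flows.
    m, k, c = 1.0, 4.0, 0.2      # mass, spring constant, damping (illustrative)
    x, v = 1.0, 0.0              # initial displacement and velocity
    dt, t_end = 0.01, 10.0

    t = 0.0
    history = []
    while t < t_end:
        a = (-k * x - c * v) / m # acceleration from the force law
        x += v * dt              # update the "stock" x from its flow v
        v += a * dt              # update the "stock" v from its flow a
        t += dt
        history.append((t, x))

    print(history[-1])           # state at the end of the run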
SU-E-P-05: Electronic Brachytherapy: A Physics Perspective On Field Implementation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pai, S; Ayyalasomayajula, S; Lee, S
2015-06-15
Purpose: We want to summarize our experience implementing a successful program of electronic brachytherapy at several dermatology clinics with the help of cloud-based software, to help us define the key program parameters and capture physics QA aspects. Optimally developed software helps the physicist in peer review and in qualifying the physical parameters. Methods: Using the XOFT™ Axxent™ electronic brachytherapy system in conjunction with cloud-based software, a process was set up to capture and record treatments. It was implemented initially at about 10 sites in California. For dosimetric purposes, the software facilitated storage of the physics parameters of surface applicators used in treatment and other source calibration parameters. In addition, the patient prescription, pathology and other setup considerations were input by the radiation oncologist and the therapist. This facilitated physics planning of the treatment parameters and also an independent check of the dwell time. From 2013–2014, nearly 1500 such calculations were completed by a group of physicists. A total of 800 patients with multiple lesions have been treated successfully during this period. The treatment log files have been uploaded and documented in the software, which facilitated physics peer review of treatments per the standards in place by AAPM and ACR. Results: The program model was implemented successfully at multiple sites. The cloud-based software allowed for proper peer review and compliance of the program at 10 clinical sites. Dosimetry was done on 800 patients and executed in a timely fashion to suit the clinical needs. Accumulated physics data in the software from the clinics allows for robust analysis and future development. Conclusion: Electronic brachytherapy implementation experience from a quality assurance perspective was greatly enhanced by using cloud-based software. The comprehensive database will pave the way for future developments to yield superior physics outcomes.
Visualization of instationary flows by particle traces
NASA Astrophysics Data System (ADS)
Raasch, S.
A study is presented in which the output of atmospheric flow models is visualized by computer movies. The structure and evolution of the flow is visualized by starting weightless particles at the locations of the model grid points at distinct, equally spaced times. These particles are then only advected by the flow. In order to avoid useless accumulation of particles, they can be provided with a limited lifetime. Scalar quantities can additionally be shown as color-shaded contours in the background. A movie with several examples of atmospheric flows, for example convection in the atmospheric boundary layer, slope winds, land-sea breeze and Kelvin-Helmholtz waves, is presented. The simulations are performed by two-dimensional and three-dimensional nonhydrostatic finite difference models. Graphics are produced using the UNIRAS software and the graphic output is in the form of CGM metafiles. The single frames are stored on an ABEKAS real-time video disc and then transferred to a BETACAM-SP tape recorder. The graphics software is only suitable for producing two-dimensional pictures, so for three-dimensional simulations only cross sections can be shown. To produce a movie of typically 90 seconds duration, the graphics software and the particle model need about 10 hours of CPU time on a CDC CYBER 990 and the CGM metafile has a size of about 1.4 GByte.
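A minimal sketch of the visualization technique described above: weightless tracer particles are advected by a (here analytic, placeholder) 2D velocity field and recycled after a limited lifetime to avoid useless accumulation:

    import numpy as np

    def velocity(p, t):
        """Placeholder 2D flow field (a simple cellular convection pattern)."""
        x, y = p[:, 0], p[:, 1]
        u = np.sin(np.pi * x) * np.cos(np.pi * y)
        w = -np.cos(np.pi * x) * np.sin(np.pi * y)
        return np.column_stack([u, w])

    rng = np.random.default_rng(0)
    dt, max_age = 0.05, 2.0
    particles = rng.uniform(0.0, 1.0, size=(500, 2))   # start near "grid points"
    age = np.zeros(len(particles))

    for step in range(200):
        t = step * dt
        particles += velocity(particles, t) * dt        # advect by the flow only
        age += dt
        expired = age > max_age                          # limited lifetime avoids
        particles[expired] = rng.uniform(0.0, 1.0,       # useless accumulation
                                         size=(expired.sum(), 2))
        age[expired] = 0.0

    print(particles[:3])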
Computing in high-energy physics
Mount, Richard P.
2016-05-31
I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Lastly, I describe recent developments aimed at improving the overall coherence of high-energy physics software.
A Vision on the Status and Evolution of HEP Physics Software Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Canal, P.; Elvira, D.; Hatcher, R.
2013-07-28
This paper represents the vision of the members of the Fermilab Scientific Computing Division's Computational Physics Department (SCD-CPD) on the status and the evolution of various HEP software tools such as the Geant4 detector simulation toolkit, the Pythia and GENIE physics generators, and the ROOT data analysis framework. The goal of this paper is to contribute ideas to the Snowmass 2013 process toward the composition of a unified document on the current status and potential evolution of the physics software tools which are essential to HEP.
NASA Astrophysics Data System (ADS)
Parra, Pablo; da Silva, Antonio; Polo, Óscar R.; Sánchez, Sebastián
2018-02-01
In this day and age, successful embedded critical software needs agile and continuous development and testing procedures. This paper presents the overall testing and code coverage metrics obtained during the unit testing procedure carried out to verify the correctness of the boot software that will run in the Instrument Control Unit (ICU) of the Energetic Particle Detector (EPD) on-board Solar Orbiter. The ICU boot software is a critical part of the project, so its verification should be addressed at an early development stage, since any test case missed in this process may affect the quality of the overall on-board software. According to the European Cooperation for Space Standardization (ECSS) standards, testing this kind of critical software must cover 100% of the source code statements and decision paths. This leads to the complete testing of the fault tolerance and recovery mechanisms that have to resolve every possible memory corruption or communication error brought about by the space environment. The introduced procedure enables fault injection from the beginning of the development process and makes it possible to fulfil the exacting code coverage demands on the boot software.
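A hedged sketch of the fault-injection idea in unit testing; the routine and test names below are invented and merely illustrate how forcing the error path drives statement and decision coverage toward 100%:

    # toy boot-loader routine plus a fault-injection unit test (pytest style)
    import pytest

    def load_image(read_word, n_words, expected_crc):
        """Read n_words via read_word, verify a simple checksum, and raise on
        mismatch so the caller can fall back to a redundant image copy."""
        words = [read_word(i) for i in range(n_words)]
        crc = sum(words) & 0xFFFFFFFF
        if crc != expected_crc:
            raise RuntimeError("image corrupted")
        return words

    def test_detects_injected_memory_corruption():
        good = [0x11, 0x22, 0x33]
        crc = sum(good) & 0xFFFFFFFF
        # fault injection: flip a bit in the second word the loader will read
        corrupted = lambda i: good[i] ^ (0x01 if i == 1 else 0x00)
        with pytest.raises(RuntimeError):
            load_image(corrupted, len(good), crc)

    def test_accepts_clean_image():
        good = [0x11, 0x22, 0x33]
        assert load_image(lambda i: good[i], len(good), sum(good) & 0xFFFFFFFF) == good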
Cosmic Radiation Detection and Observations
NASA Astrophysics Data System (ADS)
Ramirez Chavez, Juan; Troncoso, Maria
Cosmic rays consist of high-energy particles accelerated in remote supernova remnant explosions that travel vast distances throughout the universe. Upon arriving at Earth, the majority of these particles ionize gases in the upper atmosphere, while others interact with gas molecules in the troposphere, producing secondary cosmic rays, which are the main focus of this research. To observe these secondary cosmic rays, a detector telescope was designed and equipped with two silicon photomultipliers (SiPMs). Each SiPM is coupled to a bundle of 4 wavelength-shifting optical fibers that are embedded inside a plastic scintillator sheet. The SiPM signals were amplified using a fast preamplifier, with coincidence between detectors established using a binary logic gate. The coincidence events were recorded with two devices: a digital counter and an Arduino microcontroller. For detailed analysis of the SiPM waveforms, a DRS4 waveform digitizer captured the waveforms for offline analysis with the CERN software package Physics Analysis Workstation in a Linux environment. Results from our experiments will be presented. Hartnell College STEM Internship Program.
Giménez-Alventosa, V; Ballester, F; Vijande, J
2016-12-01
The design and construction of geometries for Monte Carlo calculations is an error-prone, time-consuming, and complex step in simulations describing particle interactions and transport in the field of medical physics. The software VoxelMages has been developed to help the user in this task. It allows the user to design complex geometries and to process DICOM image files for simulations with the general-purpose Monte Carlo code PENELOPE in an easy and straightforward way. VoxelMages also allows the import of DICOM-RT structure contour information as delivered by a treatment planning system. Its main characteristics, usage and performance benchmarking are described in detail. Copyright © 2016 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Science Teacher, 1988
1988-01-01
Reviews two computer software packages for use in physical science, physics, and chemistry classes. Includes "Physics of Model Rocketry" for Apple II, and "Black Box" for Apple II and IBM compatible computers. "Black Box" is designed to help students understand the concept of indirect evidence. (CW)
A Review of Discrete Element Method (DEM) Particle Shapes and Size Distributions for Lunar Soil
NASA Technical Reports Server (NTRS)
Lane, John E.; Metzger, Philip T.; Wilkinson, R. Allen
2010-01-01
As part of ongoing efforts to develop models of lunar soil mechanics, this report reviews two topics that are important to discrete element method (DEM) modeling of the behavior of soils (such as lunar soils): (1) methods of modeling particle shapes and (2) analytical representations of particle size distribution. The choice of particle shape complexity is driven primarily by opposing tradeoffs with total number of particles, computer memory, and total simulation computer processing time. The choice is also dependent on available DEM software capabilities. For example, PFC2D/PFC3D and EDEM support clustering of spheres; MIMES incorporates superquadric particle shapes; and BLOKS3D provides polyhedra shapes. Most commercial and custom DEM software supports some type of complex particle shape beyond the standard sphere. Convex polyhedra, clusters of spheres and single parametric particle shapes such as the ellipsoid, polyellipsoid, and superquadric are all motivated by the desire to introduce asymmetry into the particle shape, as well as edges and corners, in order to better simulate actual granular particle shapes and behavior. An empirical particle size distribution (PSD) formula is shown to fit desert sand data from Bagnold. Particle size data of JSC-1a obtained from a fine particle analyzer at the NASA Kennedy Space Center is also fitted to a similar empirical PSD function.
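The report's empirical PSD formula is not reproduced here; as a hedged stand-in, the sketch below fits a cumulative log-normal size distribution to invented sieve data, which is the same kind of curve-fitting operation:

    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.special import erf

    def lognormal_cdf(d, d50, sigma):
        """Cumulative mass fraction finer than diameter d for a log-normal PSD."""
        return 0.5 * (1.0 + erf(np.log(d / d50) / (np.sqrt(2.0) * sigma)))

    # illustrative sieve data: diameters in microns, mass fraction passing
    d = np.array([10., 20., 45., 75., 150., 300., 600.])
    frac = np.array([0.05, 0.14, 0.33, 0.52, 0.75, 0.92, 0.99])

    (p_d50, p_sigma), _ = curve_fit(lognormal_cdf, d, frac, p0=[70.0, 1.0])
    print(f"median diameter ~ {p_d50:.1f} um, geometric spread ~ {p_sigma:.2f}")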
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, Eric M.
2004-05-20
The YAP software library computes (1) electromagnetic modes, (2) electrostatic fields, (3) magnetostatic fields and (4) particle trajectories in 2d and 3d models. The code employs finite element methods on unstructured grids of tetrahedral, hexahedral, prism and pyramid elements, with linear through cubic element shapes and basis functions to provide high accuracy. The novel particle tracker is robust, accurate and efficient, even on unstructured grids with discontinuous fields. This software library is a component of the MICHELLE 3d finite element gun code.
Comet Gas and Dust Dynamics Modeling
NASA Technical Reports Server (NTRS)
Von Allmen, Paul A.; Lee, Seungwon
2010-01-01
This software models the gas and dust dynamics of a comet coma (the head region of a comet) in order to support the Microwave Instrument for Rosetta Orbiter (MIRO) project. MIRO will study the evolution of the coma of comet 67P/Churyumov-Gerasimenko. The instrument will measure surface temperature, gas production rates and relative abundances, and velocity and excitation temperatures of each species, along with their spatial and temporal variability. This software will use these measurements to improve the understanding of coma dynamics. The modeling tool solves the equation of motion of a dust particle, the energy balance equation of the dust particle, the continuity equation for the dust and gas flow, and the dust and gas mixture energy equation. By solving these equations numerically, the software calculates the temperature and velocity of gas and dust as a function of time for a given initial gas and dust production rate, and a dust characteristic parameter that measures the ability of a dust particle to adjust its velocity to the local gas velocity. The software is written in a modular manner, thereby allowing the addition of more dynamics equations as needed. All of the numerical algorithms were developed in-house and no third-party libraries are used.
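A hedged, highly simplified sketch of the dust equation of motion described above: the grain velocity relaxes toward the local gas velocity at a rate set by a drag (coupling) parameter. The gas velocity profile and the coupling constant below are placeholders, not the MIRO model:

    import numpy as np

    def gas_velocity(r):
        """Placeholder radial gas outflow speed (m/s) at distance r from the nucleus."""
        return 500.0 * (1.0 - np.exp(-r / 5.0e3))

    # dust grain state: radial position (m) and speed (m/s)
    r, v = 2.0e3, 1.0
    beta = 5.0e-3        # illustrative coupling parameter (1/s): how quickly the
                         # grain adjusts its velocity to the local gas velocity
    dt = 1.0

    for _ in range(20000):
        v += beta * (gas_velocity(r) - v) * dt   # gas drag accelerates the grain
        r += v * dt

    print(r, v, gas_velocity(r))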
Physics of Alfvén waves and energetic particles in burning plasmas
NASA Astrophysics Data System (ADS)
Chen, Liu; Zonca, Fulvio
2016-01-01
Dynamics of shear Alfvén waves and energetic particles are crucial to the performance of burning fusion plasmas. This article reviews linear as well as nonlinear physics of shear Alfvén waves and their self-consistent interaction with energetic particles in tokamak fusion devices. More specifically, the review on the linear physics deals with wave spectral properties and collective excitations by energetic particles via wave-particle resonances. The nonlinear physics deals with nonlinear wave-wave interactions as well as nonlinear wave-energetic particle interactions. Both linear as well as nonlinear physics demonstrate the qualitatively important roles played by realistic equilibrium nonuniformities, magnetic field geometries, and the specific radial mode structures in determining the instability evolution, saturation, and, ultimately, energetic-particle transport. These topics are presented within a single unified theoretical framework, where experimental observations and numerical simulation results are referred to elucidate concepts and physics processes.
NASA Astrophysics Data System (ADS)
Maćkowiak-Pawłowska, Maja; Przybyła, Piotr
2018-05-01
The incomplete particle identification limits the experimentally-available phase space region for identified particle analysis. This problem affects ongoing fluctuation and correlation studies including the search for the critical point of strongly interacting matter performed on SPS and RHIC accelerators. In this paper we provide a procedure to obtain nth order moments of the multiplicity distribution using the identity method, generalising previously published solutions for n=2 and n=3. Moreover, we present an open source software implementation of this computation, called Idhim, that allows one to obtain the true moments of identified particle multiplicity distributions from the measured ones provided the response function of the detector is known.
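The identity method itself folds in the detector response; as a simpler hedged illustration, the raw and central moments that such a procedure ultimately targets can be computed from a (toy) multiplicity distribution as follows:

    import numpy as np

    def raw_moment(counts, n):
        """n-th raw moment <N^n> of a multiplicity distribution given as
        counts[N] = number of events with multiplicity N."""
        mult = np.arange(len(counts))
        p = counts / counts.sum()
        return np.sum(p * mult**n)

    def central_moment(counts, n):
        mu = raw_moment(counts, 1)
        mult = np.arange(len(counts))
        p = counts / counts.sum()
        return np.sum(p * (mult - mu)**n)

    # toy event sample: Poissonian multiplicities standing in for identified particles
    rng = np.random.default_rng(1)
    sample = rng.poisson(4.2, size=100000)
    counts = np.bincount(sample)

    print(raw_moment(counts, 2), central_moment(counts, 2))  # variance ~ mean for Poisson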
UFMulti: A new parallel processing software system for HEP
NASA Astrophysics Data System (ADS)
Avery, Paul; White, Andrew
1989-12-01
UFMulti is a multiprocessing software package designed for general purpose high energy physics applications, including physics and detector simulation, data reduction and DST physics analysis. The system is particularly well suited for installations where several workstations or computers are connected through a local area network (LAN). The initial configuration of the software is currently running on VAX/VMS machines with a planned extension to ULTRIX, using the new RISC CPUs from Digital, in the near future.
Using Python as a first programming environment for computational physics in developing countries
NASA Astrophysics Data System (ADS)
Akpojotor, Godfrey; Ehwerhemuepha, Louis; Echenim, Myron; Akpojotor, Famous
2011-03-01
Python's unique features, such as its interpreted, multiplatform and object-oriented nature, as well as being free and open source software, create the possibility that any user connected to the internet can download the entire package onto any platform, install it and immediately begin to use it. Thus Python is gaining a reputation as a preferred environment for introducing students and new beginners to programming. In Africa, therefore, the Python African Tour project has been launched and we are coordinating its use in computational science. We examine here the challenges and prospects of using Python for computational physics (CP) education in developing countries (DC). Then we present our project on using Python to simulate and aid the learning of laboratory experiments, illustrated here by modeling of the simple pendulum, and also to visualize phenomena in physics, illustrated here by demonstrating the wave motion of a particle in a varying potential. This project, which is to train both the teachers and our students in CP using Python, can easily be adopted in other DC.
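A minimal version of the pendulum exercise mentioned above, assuming only numpy; the step size, length and initial angle are illustrative:

    import numpy as np

    g, L = 9.81, 1.0                 # gravity (m/s^2) and pendulum length (m)
    theta, omega = np.radians(20.0), 0.0
    dt, t_end = 0.001, 10.0

    angles = []
    t = 0.0
    while t < t_end:
        # semi-implicit Euler step for theta'' = -(g/L) sin(theta)
        omega += -(g / L) * np.sin(theta) * dt
        theta += omega * dt
        t += dt
        angles.append(theta)

    period_small_angle = 2.0 * np.pi * np.sqrt(L / g)
    print(f"small-angle period: {period_small_angle:.3f} s, "
          f"final angle: {np.degrees(angles[-1]):.2f} deg")

Comparing the simulated oscillation with the small-angle period formula is a natural first classroom exercise with this kind of script.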
Sensor-Web Operations Explorer
NASA Technical Reports Server (NTRS)
Meemong, Lee; Miller, Charles; Bowman, Kevin; Weidner, Richard
2008-01-01
Understanding the atmospheric state and its impact on air quality requires observations of trace gases, aerosols, clouds, and physical parameters across temporal and spatial scales that range from minutes to days and from meters to more than 10,000 kilometers. Observations include continuous local monitoring for particle formation; field campaigns for emissions, local transport, and chemistry; and periodic global measurements for continental transport and chemistry. Understanding includes a global data assimilation framework capable of hierarchical coupling, dynamic integration of chemical data and atmospheric models, and feedback loops between models and observations. To observe trace gases, aerosols, clouds, and physical parameters, an integrated observation infrastructure composed of space-borne, airborne, and in-situ sensors will be simulated based on their measurement physics properties. To optimally plan for multiple heterogeneous sensors, sampling strategies will be explored and their science impact will be analyzed based on comprehensive modeling of atmospheric phenomena including convection, transport, and chemical processes. Topics include system architecture, software architecture, hardware architecture, process flow, technology infusion, challenges, and future direction.
Big Data in HEP: A comprehensive use case study
NASA Astrophysics Data System (ADS)
Gutsche, Oliver; Cremonesi, Matteo; Elmer, Peter; Jayatilaka, Bo; Kowalkowski, Jim; Pivarski, Jim; Sehrish, Saba; Mantilla Suárez, Cristina; Svyatkovskiy, Alexey; Tran, Nhan
2017-10-01
Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems, collectively called Big Data technologies, have emerged to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and promise a fresh look at the analysis of very large datasets, and could potentially reduce the time-to-physics with increased interactivity. In this talk, we present an active LHC Run 2 analysis, searching for dark matter with the CMS detector, as a testbed for Big Data technologies. We directly compare the traditional NTuple-based analysis with an equivalent analysis using Apache Spark on the Hadoop ecosystem and beyond. In both cases, we start the analysis with the official experiment data formats and produce publication physics plots. We will discuss advantages and disadvantages of each approach and give an outlook on further studies needed.
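A hedged sketch of the Spark side of such a comparison (the column names, cut values and tiny in-memory dataset are invented; a real analysis would read a columnar copy of the NTuples instead):

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("toy-darkmatter-skim").getOrCreate()

    # toy stand-in for an experiment dataset; in practice this would come from
    # a columnar conversion of the NTuples, e.g. spark.read.parquet(...)
    events = spark.createDataFrame(
        [(273158, 210.5, 3), (273158, 95.2, 1), (273302, 250.1, 2), (273302, 180.0, 4)],
        ["run", "met_pt", "n_jets"])

    # filter-and-transform step analogous to an NTuple event loop: keep events
    # with large missing transverse momentum and at least two jets, then count
    # the selected events per data-taking run
    selected = (events
                .filter(F.col("met_pt") > 200.0)
                .filter(F.col("n_jets") >= 2)
                .groupBy("run")
                .count())

    selected.show()
    spark.stop()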
An efficient, modular and simple tape archiving solution for LHC Run-3
NASA Astrophysics Data System (ADS)
Murray, S.; Bahyl, V.; Cancio, G.; Cano, E.; Kotlyar, V.; Kruse, D. F.; Leduc, J.
2017-10-01
The IT Storage group at CERN develops the software responsible for archiving to tape the custodial copy of the physics data generated by the LHC experiments. Physics run 3 will start in 2021 and will introduce two major challenges for which the tape archive software must be evolved. Firstly the software will need to make more efficient use of tape drives in order to sustain the predicted data rate of 150 petabytes per year as opposed to the current 50 petabytes per year. Secondly the software will need to be seamlessly integrated with EOS, which has become the de facto disk storage system provided by the IT Storage group for physics data. The tape storage software for LHC physics run 3 is code named CTA (the CERN Tape Archive). This paper describes how CTA will introduce a pre-emptive drive scheduler to use tape drives more efficiently, will encapsulate all tape software into a single module that will sit behind one or more EOS systems, and will be simpler by dropping support for obsolete backwards compatibility.
ReaDDy - A Software for Particle-Based Reaction-Diffusion Dynamics in Crowded Cellular Environments
Schöneberg, Johannes; Noé, Frank
2013-01-01
We introduce the software package ReaDDy for simulation of detailed spatiotemporal mechanisms of dynamical processes in the cell, based on reaction-diffusion dynamics with particle resolution. In contrast to other particle-based reaction kinetics programs, ReaDDy supports particle interaction potentials. This permits effects such as space exclusion, molecular crowding and aggregation to be modeled. The biomolecules simulated can be represented as a sphere, or as a more complex geometry such as a domain structure or polymer chain. ReaDDy bridges the gap between small-scale but highly detailed molecular dynamics or Brownian dynamics simulations and large-scale but little-detailed reaction kinetics simulations. ReaDDy has a modular design that enables the exchange of the computing core by efficient platform-specific implementations or dynamical models that are different from Brownian dynamics. PMID:24040218
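A hedged, minimal Brownian dynamics step with a pair interaction potential, illustrating the kind of particle-resolved, interacting dynamics ReaDDy performs; this is not ReaDDy code, and the harmonic space-exclusion potential and all parameters are placeholders:

    import numpy as np

    kT, gamma, dt = 1.0, 1.0, 1e-4          # thermal energy, friction, time step
    radius, k_rep = 0.5, 100.0              # particle radius and repulsion stiffness
    rng = np.random.default_rng(2)
    pos = rng.uniform(0.0, 10.0, size=(50, 3))

    def repulsion_forces(pos):
        """Harmonic space-exclusion force between overlapping spheres."""
        d = pos[:, None, :] - pos[None, :, :]
        r = np.linalg.norm(d, axis=-1)
        np.fill_diagonal(r, np.inf)
        overlap = np.clip(2.0 * radius - r, 0.0, None)   # >0 only when overlapping
        f = (k_rep * overlap / r)[:, :, None] * d
        return f.sum(axis=1)

    for _ in range(1000):
        # overdamped Langevin (Brownian) update: drift from forces plus thermal noise
        noise = np.sqrt(2.0 * kT * dt / gamma) * rng.standard_normal(pos.shape)
        pos += repulsion_forces(pos) * dt / gamma + noise

    print(pos[:3])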
A Low-cost Beam Profiler Based On Cerium-doped Silica Fibers
NASA Astrophysics Data System (ADS)
Potkins, David Edward; Braccini, Saverio; Nesteruk, Konrad Pawel; Carzaniga, Tommaso Stefano; Vedda, Anna; Chiodini, Norberto; Timmermans, Jacob; Melanson, Stephane; Dehnel, Morgan Patrick
A beam profiler called the Universal Beam Monitor (UniBEaM) has been developed by D-Pace Inc. (Canada) and the Albert Einstein Center for Fundamental Physics, Laboratory for High Energy Physics, University of Bern (Switzerland). The device is based on passing 100 to 600 micron cerium-doped optical fibers through a particle beam. Visible scintillation light from the sensor fibers is transmitted over distances of tens of meters to the light sensors with minimal signal loss and no susceptibility to electromagnetic fields. The probe has an insertion length of only 70 mm. The software plots the beam intensity distribution in the horizontal and vertical planes, and calculates the beam location and integrated profile area, which correlates well with total beam current. UniBEaM has a large dynamic range, operating with beam currents of ∼pA to mA, and a large range of particle kinetic energies of ∼keV to GeV, depending on the absorbed power density. Test data are presented for H- beams at 25 keV for 500 μA, and H+ beams at 18 MeV for 50 pA to 10 μA. The maximum absorbed power density of the optical fiber before thermal damage is discussed in relation to dE/dx energy deposition as a function of particle type and kinetic energy. UniBEaM is well suited for a wide variety of beamlines including discovery science applications, radio-pharmaceutical production, hadron therapy, industrial ion beam applications including ion implantation, industrial electron beams, and ion source testing.
Polycrystalline CVD diamond device level modeling for particle detection applications
NASA Astrophysics Data System (ADS)
Morozzi, A.; Passeri, D.; Kanxheri, K.; Servoli, L.; Lagomarsino, S.; Sciortino, S.
2016-12-01
Diamond is a promising material whose excellent physical properties foster its use for radiation detection applications, in particular in those hostile operating environments where the behavior of silicon-based detectors is limited due to the high radiation fluence. Within this framework, the application of Technology Computer Aided Design (TCAD) simulation tools is highly envisaged for the study, the optimization and the predictive analysis of sensing devices. Owing to the novelty of using diamond in electronics, this material is not included in the library of commercial, state-of-the-art TCAD software tools. In this work, we propose the development, application and validation of numerical models to simulate the electrical behavior of polycrystalline (pc)CVD diamond conceived for diamond sensors for particle detection. The model focuses on the characterization of a physically-based pcCVD diamond bandgap taking into account deep-level defects acting as recombination centers and/or trap states. While a definite picture of the polycrystalline diamond band-gap is still debated, the effect of the main parameters (e.g. trap densities, capture cross-sections, etc.) can be deeply investigated thanks to the simulation approach. The charge collection efficiency due to β-particle irradiation of diamond materials provided by different vendors and with different electrode configurations has been selected as the figure of merit for the model validation. The good agreement between measurements and simulation findings, keeping the trap density as the only fitting parameter, supports the suitability of the TCAD modeling approach as a predictive tool for the design and optimization of diamond-based radiation detectors.
NASA Astrophysics Data System (ADS)
Malandraki, Olga; Klein, Karl-Ludwig; Vainio, Rami; Agueda, Neus; Nunez, Marlon; Heber, Bernd; Buetikofer, Rolf; Sarlanis, Christos; Crosby, Norma
2017-04-01
High-energy solar energetic particles (SEPs) emitted from the Sun are a major space weather hazard motivating the development of predictive capabilities. In this work, the current state of knowledge on the origin and forecasting of SEP events will be reviewed. Subsequently, we will present the EU HORIZON2020 HESPERIA (High Energy Solar Particle Events foRecastIng and Analysis) project, its structure, its main scientific objectives and forecasting operational tools, as well as the added value to SEP research both from the observational as well as the SEP modelling perspective. The project addresses through multi-frequency observations and simulations the chain of processes from particle acceleration in the corona, particle transport in the magnetically complex corona and interplanetary space to the detection near 1 AU. Furthermore, publicly available software to invert neutron monitor observations of relativistic SEPs to physical parameters that can be compared with space-borne measurements at lower energies is provided for the first time by HESPERIA. In order to achieve these goals, HESPERIA is exploiting already available large datasets stored in databases such as the neutron monitor database (NMDB) and SEPServer that were developed under EU FP7 projects from 2008 to 2013. Forecasting results of the two novel SEP operational forecasting tools published via the consortium server of 'HESPERIA' will be presented, as well as some scientific key results on the acceleration, transport and impact on Earth of high-energy particles. Acknowledgement: This project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 637324.
NASA Technical Reports Server (NTRS)
Sadoulet, Bernard; Cronin, James; Aprile, Elena; Barish, Barry C.; Beier, Eugene W.; Brandenberger, Robert; Cabrera, Blas; Caldwell, David; Cassiday, George; Cline, David B.
1991-01-01
The following scientific areas are reviewed: (1) cosmology and particle physics (particle physics and the early universe, dark matter, and other relics); (2) stellar physics and particles (solar neutrinos, supernovae, and unconventional particle physics); (3) high energy gamma ray and neutrino astronomy; (4) cosmic rays (space and ground observations). Highest scientific priorities for the next decade include implementation of the current program, new initiatives, and longer-term programs. Essential technological developments, such as cryogenic detectors of particles, new solar neutrino techniques, and new extensive air shower detectors, are discussed. Also a certain number of institutional issues (the funding of particle astrophysics, recommended funding mechanisms, recommended facilities, international collaborations, and education and technology) which will become critical in the coming decade are presented.
NASA Astrophysics Data System (ADS)
Di Stefano, Omar; Stassi, Roberto; Garziano, Luigi; Frisk Kockum, Anton; Savasta, Salvatore; Nori, Franco
2017-05-01
In quantum field theory, bare particles are dressed by a cloud of virtual particles to form physical particles. The virtual particles affect properties such as the mass and charge of the physical particles, and it is only these modified properties that can be measured in experiments, not the properties of the bare particles. The influence of virtual particles is prominent in the ultrastrong-coupling regime of cavity quantum electrodynamics (QED), which has recently been realised in several condensed-matter systems. In some of these systems, the effective interaction between atom-like transitions and the cavity photons can be switched on or off by external control pulses. This offers unprecedented possibilities for exploring quantum vacuum fluctuations and the relation between physical and bare particles. We consider a single three-level quantum system coupled to an optical resonator. Here we show that, by applying external electromagnetic pulses of suitable amplitude and frequency, each virtual photon dressing a physical excitation in cavity-QED systems can be converted into a physical observable photon, and back again. In this way, the hidden relationship between the bare and the physical excitations can be unravelled and becomes experimentally testable. The conversion between virtual and physical photons can be clearly pictured using Feynman diagrams with cut loops.
Ueno, Yutaka; Ito, Shuntaro; Konagaya, Akihiko
2014-12-01
To better understand the behaviors and structural dynamics of proteins within a cell, novel software tools are being developed that can create molecular animations based on the findings of structural biology. This study describes our method, developed from earlier prototypes, to detect collisions and examine the soft-body dynamics of molecular models. The code was implemented with a software development toolkit for rigid-body dynamics simulation and a three-dimensional graphics library. The essential functions of the target software system included the basic molecular modeling environment, collision detection in the molecular models, and physical simulations of the movement of the model. Taking advantage of recent software technologies such as physics simulation modules and an interpreted scripting language, the functions required for accurate and meaningful molecular animation were implemented efficiently.
Performance studies of the P̄ANDA planar GEM-tracking detector in physics simulations
NASA Astrophysics Data System (ADS)
Divani Veis, Nazila; Firoozabadi, Mohammad M.; Karabowicz, Radoslaw; Maas, Frank; Saito, Takehiko R.; Voss, Bernd; P̄ANDA GEM-Tracker Subgroup
2018-03-01
The P̄ANDA experiment will be installed at the future Facility for Antiproton and Ion Research (FAIR) in Darmstadt, Germany, to study events from the annihilation of protons and antiprotons. The P̄ANDA detectors can cover a wide physics program on baryon spectroscopy and nucleon structure as well as the study of hadrons and hypernuclear physics, including the study of excited hyperon states. One very specific feature of most hyperon ground states is the long decay length of several centimeters in the forward direction. The central tracking detectors of the P̄ANDA setup are not sufficiently optimized for these long decay lengths. Therefore, using a set of planar GEM-tracking detectors in the forward region of interest can improve the results in the hyperon physics-benchmark channel. The currently designed P̄ANDA GEM-tracking stations contribute to the measurement of particles emitted at polar angles between about 2 and 22 degrees. For this detector design, performance and acceptance studies have been performed in physics simulations using one of the important hyperonic decay channels, p̄p → Λ̄Λ → p̄pπ⁺π⁻. The simulations were carried out using the PandaRoot software packages based on the FairRoot framework.
ZENO: N-body and SPH Simulation Codes
NASA Astrophysics Data System (ADS)
Barnes, Joshua E.
2011-02-01
The ZENO software package integrates N-body and SPH simulation codes with a large array of programs to generate initial conditions and analyze numerical simulations. Written in C, the ZENO system is portable between Mac, Linux, and Unix platforms. It is in active use at the Institute for Astronomy (IfA), at NRAO, and possibly elsewhere. Zeno programs can perform a wide range of simulation and analysis tasks. While many of these programs were first created for specific projects, they embody algorithms of general applicability and embrace a modular design strategy, so existing code is easily applied to new tasks. Major elements of the system include: Structured data file utilities facilitate basic operations on binary data, including import/export of ZENO data to other systems. Snapshot generation routines create particle distributions with various properties. Systems with user-specified density profiles can be realized in collisionless or gaseous form; multiple spherical and disk components may be set up in mutual equilibrium. Snapshot manipulation routines permit the user to sift, sort, and combine particle arrays, translate and rotate particle configurations, and assign new values to data fields associated with each particle. Simulation codes include both pure N-body and combined N-body/SPH programs: pure N-body codes are available in both uniprocessor and parallel versions, while SPH codes offer a wide range of options for gas physics, including isothermal, adiabatic, and radiating models. Snapshot analysis programs calculate temporal averages, evaluate particle statistics, measure shapes and density profiles, compute kinematic properties, and identify and track objects in particle distributions. Visualization programs generate interactive displays and produce still images and videos of particle distributions; the user may specify arbitrary color schemes and viewing transformations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choi, Yong Joon; Yoo, Jun Soo; Smith, Curtis Lee
2015-09-01
This INL plan comprehensively describes the Requirements Traceability Matrix (RTM) for the main physics and numerical methods of RELAP-7. The plan also describes the testing-based software verification and validation (SV&V) process: a set of specially designed software models used to test RELAP-7.
Applications of Nuclear and Particle Physics Technology: Particles & Detection — A Brief Overview
NASA Astrophysics Data System (ADS)
Weisenberger, Andrew G.
A brief overview of the technology applications with significant societal benefit that have their origins in nuclear and particle physics research is presented. It is shown through representative examples that applications of nuclear physics can be classified into two basic areas: 1) applying the results of experimental nuclear physics and 2) applying the tools of experimental nuclear physics. Examples of the application of the tools of experimental nuclear and particle physics research are provided in the fields of accelerator and detector based technologies namely synchrotron light sources, nuclear medicine, ion implantation and radiation therapy.
Particle Accelerators Test Cosmological Theory.
ERIC Educational Resources Information Center
Schramm, David N.; Steigman, Gary
1988-01-01
Discusses the symbiotic relationship of cosmology and elementary-particle physics. Presents a brief overview of particle physics. Explains how cosmological considerations set limits on the number of types of elementary particles. (RT)
ERIC Educational Resources Information Center
Wadness, Michael J.
2010-01-01
This dissertation addresses the research question: To what extent do secondary school science students attending the U.S. Particle Physics Masterclass change their view of the nature of science (NOS)? The U.S. Particle Physics Masterclass is a physics outreach program run by QuarkNet, a national organization of secondary school physics teachers…
ERIC Educational Resources Information Center
Science and Children, 1990
1990-01-01
Reviewed are seven computer software packages for IBM and/or Apple Computers. Included are "Windows on Science: Volume 1--Physical Science"; "Science Probe--Physical Science"; "Wildlife Adventures--Grizzly Bears"; "Science Skills--Development Programs"; "The Clean Machine"; "Rock Doctor";…
Basic guidelines to introduce electric circuit simulation software in a general physics course
NASA Astrophysics Data System (ADS)
Moya, A. A.
2018-05-01
The introduction of electric circuit simulation software for undergraduate students in a general physics course is proposed in order to contribute to the constructive learning of electric circuit theory. This work focuses on the lab exercises based on dc, transient and ac analysis of electric circuits found in introductory physics courses, and shows how students can use the simulation software to do simple activities associated with the lab exercise itself and with related topics. By introducing electric circuit simulation programs in a general physics course as a brief activity complementing a lab exercise, students develop basic skills in using simulation software, improve their knowledge of the topology of electric circuits and perceive that the technology contributes to their learning, all without reducing the time spent on the actual content of the course.
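As an example of the kind of brief complementary activity described above, a charging RC circuit can be stepped numerically and checked against the analytic solution (component values are illustrative):

    import numpy as np

    # charging of a capacitor through a resistor from a DC source:
    # dVc/dt = (Vs - Vc) / (R * C)
    Vs, R, C = 5.0, 10e3, 100e-6      # 5 V source, 10 kOhm, 100 uF (illustrative)
    dt, t_end = 1e-3, 5 * R * C

    t_vals = np.arange(0.0, t_end, dt)
    Vc = np.zeros_like(t_vals)
    for i in range(1, len(t_vals)):
        Vc[i] = Vc[i - 1] + (Vs - Vc[i - 1]) / (R * C) * dt

    analytic = Vs * (1.0 - np.exp(-t_vals / (R * C)))
    print(f"max deviation from analytic solution: {np.max(np.abs(Vc - analytic)):.4f} V")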
APM_GUI: analyzing particle movement on the cell membrane and determining confinement.
Menchón, Silvia A; Martín, Mauricio G; Dotti, Carlos G
2012-02-20
Single-particle tracking is a powerful tool for tracking individual particles with high precision. It provides useful information that allows the study of diffusion properties as well as the dynamics of movement. Changes in particle movement behavior, such as transitions between Brownian motion and temporary confinement, can reveal interesting biophysical interactions. Although useful applications exist to determine the paths of individual particles, only a few software implementations are available to analyze these data, and these implementations are generally not user-friendly and do not have a graphical interface. Here, we present APM_GUI (Analyzing Particle Movement), which is a MatLab-implemented application with a Graphical User Interface. This user-friendly application detects confined movement, treating confinement as non-random when a particle remains in a region longer than a Brownian diffusant would remain. In addition, APM_GUI exports the results, which allows users to analyze this information using software that they are familiar with. APM_GUI provides an open-source tool that quantifies diffusion coefficients and determines whether trajectories have non-random confinements. It also offers a simple and user-friendly tool that can be used by individuals without programming skills.
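A hedged sketch of the basic quantity behind such an analysis: the time-averaged mean squared displacement (MSD) of a 2D trajectory and the diffusion coefficient estimated from its initial slope (MSD = 4Dt in two dimensions). The trajectory is simulated, and this is not the APM_GUI confinement test itself:

    import numpy as np

    def mean_squared_displacement(track, max_lag):
        """Time-averaged MSD of a single (T, 2) trajectory for lags 1..max_lag."""
        msd = np.empty(max_lag)
        for lag in range(1, max_lag + 1):
            disp = track[lag:] - track[:-lag]
            msd[lag - 1] = np.mean(np.sum(disp**2, axis=1))
        return msd

    # simulated free Brownian track: D = 0.1 um^2/s, frame interval 0.02 s
    D_true, dt, n_frames = 0.1, 0.02, 2000
    rng = np.random.default_rng(3)
    steps = rng.normal(0.0, np.sqrt(2.0 * D_true * dt), size=(n_frames, 2))
    track = np.cumsum(steps, axis=0)

    max_lag = 20
    msd = mean_squared_displacement(track, max_lag)
    lags = np.arange(1, max_lag + 1) * dt
    D_est = np.polyfit(lags, msd, 1)[0] / 4.0   # MSD = 4 D t in two dimensions
    print(f"estimated D = {D_est:.3f} um^2/s (true {D_true})")

A confinement test would then compare the observed dwell time in a region with what free diffusion at the estimated D would predict.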
Software-type Wave-Particle Interaction Analyzer on board the ARASE satellite
NASA Astrophysics Data System (ADS)
Katoh, Y.; Kojima, H.; Hikishima, M.; Takashima, T.; Asamura, K.; Miyoshi, Y.; Kasahara, Y.; Kasahara, S.; Mitani, T.; Higashio, N.; Matsuoka, A.; Ozaki, M.; Yagitani, S.; Yokota, S.; Matsuda, S.; Kitahara, M.; Shinohara, I.
2017-12-01
Wave-Particle Interaction Analyzer (WPIA) is a new type of instrumentation recently proposed by Fukuhara et al. (2009) for direct and quantitative measurements of wave-particle interactions. WPIA computes an inner product W(t_i) = qE(t_i)·v_i, where t_i is the detection time of the i-th particle, E(t_i) is the wave electric field vector at t_i, and q and v_i are the charge and velocity vector of the i-th particle, respectively. Since W(t_i) is the gain or loss of the kinetic energy of the i-th particle, by accumulating W over detected particles we obtain the net amount of energy exchanged in the region of interest. Software-type WPIA (S-WPIA) is installed on the ARASE satellite as a software function running on the mission data processor. S-WPIA on board the ARASE satellite uses the electromagnetic field waveform measured by the Waveform Capture (WFC) of the Plasma Wave Experiment (PWE) and velocity vectors detected by the Medium-Energy Particle Experiments - Electron Analyzer (MEP-e), the High-Energy Electron Experiments (HEP), and the Extremely High-Energy Electron Experiment (XEP). The prime target of S-WPIA is the measurement of the energy exchange between whistler-mode chorus emissions and energetic electrons in the inner magnetosphere. It is essential for S-WPIA to synchronize the instruments with a time resolution better than the time scale of the wave-particle interactions. Since the typical frequency of chorus emissions is a few kHz in the inner magnetosphere, a time resolution better than 10 microseconds should be realized so as to measure the relative phase angle between the wave and velocity vectors with sufficient accuracy to detect the sign of W correctly. On the ARASE satellite, a dedicated system has been developed in order to realize the required time resolution for inter-instrument communications. In this presentation, we show the principle of the WPIA and its significance, as well as the implementation of S-WPIA on the ARASE satellite.
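A hedged numerical sketch of the quantity defined above, accumulating W(t_i) = qE(t_i)·v_i over detected particles; the wave field and electron velocities here are synthetic stand-ins for the WFC and particle-instrument data:

    import numpy as np

    Q_E = -1.602176634e-19          # electron charge (C)

    def wave_E(t):
        """Synthetic circularly polarized wave electric field (V/m) at time t (s),
        standing in for the WFC waveform; 2 kHz is a typical chorus frequency."""
        f = 2.0e3
        return 1e-3 * np.array([np.cos(2 * np.pi * f * t),
                                np.sin(2 * np.pi * f * t),
                                0.0])

    rng = np.random.default_rng(4)
    n_particles = 10000
    t_det = np.sort(rng.uniform(0.0, 1.0, n_particles))          # detection times (s)
    v = rng.normal(0.0, 3.0e7, size=(n_particles, 3))             # velocity vectors (m/s)

    # accumulate W_i = q E(t_i) . v_i; the sum is the net energy exchange
    W = np.array([Q_E * np.dot(wave_E(t), vi) for t, vi in zip(t_det, v)])
    print(f"accumulated W = {W.sum():.3e} J over {n_particles} particles")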
The influence of human physical activity and contaminated clothing type on particle resuspension.
McDonagh, A; Byrne, M A
2014-01-01
A study was conducted to experimentally quantify the influence of three variables on the level of resuspension of hazardous aerosol particles from clothing. The variables investigated were physical activity level (two levels, low and high), surface type (four different clothing material types), and time (i.e. the rate at which particles resuspend). A mixture of three monodisperse tracer-labelled powders, with median diameters of 3, 5, and 10 microns, was used to "contaminate" the samples, and the resuspended particles were analysed in real time using an Aerodynamic Particle Sizer (APS) and also by Neutron Activation Analysis (NAA). The overall finding was that physical activity resulted in up to 67% of the contamination deposited on clothing being resuspended back into the air. A detailed examination of the influence of physical activity level on resuspension, based on the NAA data, revealed that the average resuspended fraction (RF) of particles at low physical activity was 28 ± 8% and at high physical activity was 30 ± 7%, while the APS data revealed a tenfold increase in the cumulative mass of airborne particles during high physical activity compared with low physical activity. The results also suggest that it is not the contaminated clothing's fibre type that influences particle resuspension, but the material's weave pattern (and hence the material's surface texture). Investigation of the time variation in resuspended particle concentrations indicated that the data were separable into two distinct regimes: the first (occurring within the first 1.5 min) having a high, positive rate of change of airborne particle concentration relative to the second regime, which showed a slower rate of change of particle concentration that remained relatively unchanged for the remainder of each resuspension event. Copyright © 2013 Elsevier Ltd. All rights reserved.
Uncertainty in the use of MAMA software to measure particle morphological parameters from SEM images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwartz, Daniel S.; Tandon, Lav
The MAMA software package developed at LANL is designed to make morphological measurements on a wide variety of digital images of objects. At LANL, we have focused on using MAMA to measure scanning electron microscope (SEM) images of particles, as this is a critical part of our forensic analysis of interdicted radiologic materials. In order to successfully use MAMA to make such measurements, we must understand the level of uncertainty involved in the process, so that we can rigorously support our quantitative conclusions.
[Principles of the EOS™ X-ray machine and its use in daily orthopedic practice].
Illés, Tamás; Somoskeöy, Szabolcs
2012-02-26
The EOS™ X-ray machine, based on a Nobel Prize-winning invention in physics in the field of particle detection, is capable of simultaneously capturing biplanar X-ray images by slot-scanning the whole body in an upright, physiological load-bearing position, using ultra-low radiation doses. The simultaneous capture of spatially calibrated anteroposterior and lateral images allows a three-dimensional (3D) surface reconstruction of the skeletal system with dedicated software. Parts of the skeletal system in the X-ray images and in the 3D-reconstructed models appear at a true 1:1 scale for size and volume, so spinal and vertebral parameters, lower-limb axis lengths and angles, and any other clinically relevant parameters in orthopedic practice can be measured and calculated very precisely. Visualization of the 3D-reconstructed models in various views with the sterEOS 3D software enables the presentation of top-view images, through which the rotational conditions of the lower limbs and joints, as well as spine deformities, can be analyzed in the horizontal plane; this provides revolutionary new possibilities in orthopedic surgery, especially spine surgery.
NASA Astrophysics Data System (ADS)
Athron, Peter; Balázs, Csaba; Dal, Lars A.; Edsjö, Joakim; Farmer, Ben; Gonzalo, Tomás E.; Kvellestad, Anders; McKay, James; Putze, Antje; Rogan, Chris; Scott, Pat; Weniger, Christoph; White, Martin
2018-01-01
We present the GAMBIT modules SpecBit, DecayBit and PrecisionBit. Together they provide a new framework for linking publicly available spectrum generators, decay codes and other precision observable calculations in a physically and statistically consistent manner. This allows users to automatically run various combinations of existing codes as if they were a single package. The modular design allows software packages fulfilling the same role to be exchanged freely at runtime, with the results presented in a common format that can easily be passed to downstream dark matter, collider and flavour codes. These modules constitute an essential part of the broader GAMBIT framework, a major new software package for performing global fits. In this paper we present the observable calculations, data, and likelihood functions implemented in the three modules, as well as the conventions and assumptions used in interfacing them with external codes. We also present 3-BIT-HIT, a command-line utility for computing mass spectra, couplings, decays and precision observables in the MSSM, which shows how the three modules can easily be used independently of GAMBIT.
Open Source Software in Teaching Physics: A Case Study on Vector Algebra and Visual Representations
ERIC Educational Resources Information Center
Cataloglu, Erdat
2006-01-01
This study aims to report the effort on teaching vector algebra using free open source software (FOSS). Recent studies showed that students have difficulties in learning basic physics concepts. Constructivist learning theories suggest the use of visual and hands-on activities in learning. We will report on the software used for this purpose. The…
Monte Carlo Methods in Materials Science Based on FLUKA and ROOT
NASA Technical Reports Server (NTRS)
Pinsky, Lawrence; Wilson, Thomas; Empl, Anton; Andersen, Victor
2003-01-01
A comprehensive understanding of mitigation measures for space radiation protection necessarily involves the relevant fields of nuclear physics and particle transport modeling. One method of modeling the interaction of radiation traversing matter is Monte Carlo analysis, a subject that has been evolving since the very advent of nuclear reactors and particle accelerators in experimental physics. Countermeasures for radiation protection from neutrons near nuclear reactors, for example, were an early application, and Monte Carlo methods were quickly adapted to this general field of investigation. The project discussed here is concerned with taking the latest tools and technology in Monte Carlo analysis and adapting them to space applications such as radiation shielding design for spacecraft, as well as investigating how next-generation Monte Carlos can complement the existing analytical methods currently used by NASA. We have chosen to employ the Monte Carlo program known as FLUKA (a legacy acronym based on the German for FLUctuating KAscade), used to simulate all of the particle transport, and the CERN-developed graphical-interface object-oriented analysis software called ROOT. One aspect of space radiation analysis for which Monte Carlos are particularly suited is the study of secondary radiation produced as albedos in the vicinity of the structural geometry involved. This broad goal of simulating space radiation transport through the relevant materials employing the FLUKA code necessarily requires the addition of the capability to simulate all heavy-ion interactions from 10 MeV/A up to the highest conceivable energies. For all energies above 3 GeV/A the Dual Parton Model (DPM) is currently used, although the possible improvement of the DPMJET event generator for energies of 3-30 GeV/A is being considered. One of the major tasks still facing us is the provision for heavy-ion interactions below 3 GeV/A. The ROOT interface is being developed in conjunction with the CERN ALICE (A Large Ion Collider Experiment) software team through an adaptation of their existing AliROOT (ALICE Using ROOT) architecture. In order to check our progress against actual data, we have chosen to simulate the ATIC (Advanced Thin Ionization Calorimeter) cosmic-ray astrophysics balloon payload as well as neutron fluences in the Mir spacecraft. This paper contains a summary of the status of this project and a roadmap to its successful completion.
pyCTQW: A continuous-time quantum walk simulator on distributed memory computers
NASA Astrophysics Data System (ADS)
Izaac, Josh A.; Wang, Jingbo B.
2015-01-01
In the general field of quantum information and computation, quantum walks are playing an increasingly important role in constructing physical models and quantum algorithms. We have recently developed a distributed memory software package pyCTQW, with an object-oriented Python interface, that allows efficient simulation of large multi-particle CTQW (continuous-time quantum walk)-based systems. In this paper, we present an introduction to the Python and Fortran interfaces of pyCTQW, discuss various numerical methods of calculating the matrix exponential, and demonstrate the performance behavior of pyCTQW on a distributed memory cluster. In particular, the Chebyshev and Krylov-subspace methods for calculating the quantum walk propagation are provided, as well as methods for visualization and data analysis.
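A continuous-time quantum walk on a graph evolves the state as |ψ(t)⟩ = exp(-iHt)|ψ(0)⟩, with H typically the graph adjacency or Laplacian matrix. The snippet below is a small dense-matrix illustration of that propagation using SciPy's matrix exponential; pyCTQW itself uses distributed Chebyshev and Krylov-subspace methods, so this is only a conceptual sketch, and the 4-node graph is an arbitrary assumption.

```python
import numpy as np
from scipy.linalg import expm

# Adjacency matrix of a 4-node cycle graph (arbitrary example)
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

psi0 = np.zeros(4, dtype=complex)
psi0[0] = 1.0                      # walker starts on node 0

t = 1.0
U = expm(-1j * A * t)              # time-evolution operator exp(-iHt), with H = A
psi_t = U @ psi0

print("node occupation probabilities:", np.abs(psi_t)**2)
```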
Space Physics Data Facility Web Services
NASA Technical Reports Server (NTRS)
Candey, Robert M.; Harris, Bernard T.; Chimiak, Reine A.
2005-01-01
The Space Physics Data Facility (SPDF) Web services provide a distributed programming interface to a portion of the SPDF software. (A general description of Web services is available at http://www.w3.org/ and in many current software-engineering texts and articles focused on distributed programming.) The SPDF Web services distributed programming interface enables additional collaboration and integration of the SPDF software system with other software systems, in furtherance of the SPDF mission to lead collaborative efforts in the collection and utilization of space physics data and mathematical models. This programming interface conforms to all applicable Web services specifications of the World Wide Web Consortium. The interface is specified by a Web Services Description Language (WSDL) file. The SPDF Web services software consists of the following components: 1) a server program for implementation of the Web services; and 2) a software developer's kit that consists of a WSDL file, a less formal description of the interface, a Java class library (which further eases development of Java-based client software), and Java source code for an example client program that illustrates the use of the interface.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graf, Norman A.; /SLAC
Maximizing the physics performance of detectors being designed for the International Linear Collider, while remaining sensitive to cost constraints, requires a powerful, efficient, and flexible simulation, reconstruction and analysis environment to study the capabilities of a large number of different detector designs. The preparation of Letters of Intent for the International Linear Collider involved the detailed study of dozens of detector options, layouts and readout technologies; the final physics benchmarking studies required the reconstruction and analysis of hundreds of millions of events. We describe the Java-based software toolkit (org.lcsim) which was used for full event reconstruction and analysis. The components are fully modular and are available for tasks from digitization of tracking detector signals through to cluster finding, pattern recognition, track fitting, calorimeter clustering, individual particle reconstruction, jet finding, and analysis. The detector is defined by the same XML input files used for the detector response simulation, ensuring the simulation and reconstruction geometries are always commensurate by construction. We discuss the architecture as well as the performance.
RF Models for Plasma-Surface Interactions in VSim
NASA Astrophysics Data System (ADS)
Jenkins, Thomas G.; Smithe, D. N.; Pankin, A. Y.; Roark, C. M.; Zhou, C. D.; Stoltz, P. H.; Kruger, S. E.
2014-10-01
An overview of ongoing enhancements to the Plasma Discharge (PD) module of Tech-X's VSim software tool is presented. A sub-grid kinetic sheath model, developed for the accurate computation of sheath potentials near metal and dielectric-coated walls, enables the physical effects of DC and RF sheath physics to be included in macroscopic-scale plasma simulations that need not explicitly resolve sheath scale lengths. Sheath potential evolution, together with particle behavior near the sheath, can thus be simulated in complex geometries. Generalizations of the model to include sputtering, secondary electron emission, and effects from multiple ion species and background magnetic fields are summarized; related numerical results are also presented. In addition, improved tools for plasma chemistry and IEDF/EEDF visualization and modeling are discussed, as well as our initial efforts toward the development of hybrid fluid/kinetic transition capabilities within VSim. Ultimately, we aim to establish VSimPD as a robust, efficient computational tool for modeling industrial plasma processes. Supported by US DoE SBIR-I/II Award DE-SC0009501.
PIV Data Validation Software Package
NASA Technical Reports Server (NTRS)
Blackshire, James L.
1997-01-01
A PIV data validation and post-processing software package was developed to provide semi-automated data validation and data reduction capabilities for Particle Image Velocimetry data sets. The software provides three primary capabilities: (1) removal of spurious vector data, (2) filtering, smoothing, and interpolation of PIV data, and (3) calculation of out-of-plane vorticity, ensemble statistics, and turbulence statistics. The software runs on an IBM PC/AT host computer under either the Microsoft Windows 3.1 or Windows 95 operating system.
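The out-of-plane vorticity mentioned in capability (3) is ωz = ∂v/∂x - ∂u/∂y on the PIV grid. The following sketch, which is not the original package, shows one common way to compute it with finite differences, together with a simple median-test style rejection of spurious vectors; the grid spacing, threshold values, and synthetic test field are assumptions.

```python
import numpy as np
from scipy.ndimage import median_filter

def vorticity_z(u, v, dx, dy):
    """Out-of-plane vorticity w_z = dv/dx - du/dy on a regular PIV grid."""
    dvdx = np.gradient(v, dx, axis=1)
    dudy = np.gradient(u, dy, axis=0)
    return dvdx - dudy

def reject_spurious(field, threshold=3.0):
    """Flag vectors deviating from the local 3x3 median by more than threshold * local MAD."""
    med = median_filter(field, size=3)
    mad = median_filter(np.abs(field - med), size=3) + 1e-12
    return np.abs(field - med) / mad > threshold   # boolean mask of suspect vectors

# Example on a synthetic solid-body-rotation field (vorticity should be ~2*omega)
ny, nx, dx, dy, omega = 64, 64, 1e-3, 1e-3, 5.0
y, x = np.mgrid[0:ny, 0:nx]
u = -omega * (y - ny / 2) * dy
v = omega * (x - nx / 2) * dx
print("mean vorticity:", vorticity_z(u, v, dx, dy).mean())
```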
What's Next for Particle Physics?
NASA Astrophysics Data System (ADS)
White, Martin
2017-10-01
Following the discovery of the Higgs boson in 2012, particle physics has entered its most exciting and crucial period for over 50 years. In this book, I first summarise our current understanding of particle physics, and why this knowledge is almost certainly incomplete. We will then see that the Large Hadron Collider provides the means to search for the next theory of particle physics by performing precise measurements of the Higgs boson, and by looking directly for particles that can solve current cosmic mysteries such as the nature of dark matter. Finally, I will anticipate the next decade of particle physics by placing the Large Hadron Collider within the wider context of other experiments. The results expected over the next ten years promise to transform our understanding of what the Universe is made of and how it came to be.
Particle Physics: From School to University.
ERIC Educational Resources Information Center
Barlow, Roger
1992-01-01
Discusses the teaching of particle physics as part of the A-level physics course in British secondary schools. Utilizes the quark model of hadrons and the conceptual kinematics of particle collisions, as examples, to demonstrate practical instructional possibilities in relation to student expectations. (JJK)
From Particle Physics to Medical Applications
NASA Astrophysics Data System (ADS)
Dosanjh, Manjit
2017-06-01
CERN is the world's largest particle physics research laboratory. Since it was established in 1954, it has made an outstanding contribution to our understanding of the fundamental particles and their interactions, and also to the technologies needed to analyse their properties and behaviour. The experimental challenges have pushed the performance of particle accelerators and detectors to the limits of our technical capabilities, and these groundbreaking technologies can also have a significant impact in applications beyond particle physics. In particular, the detectors developed for particle physics have led to improved techniques for medical imaging, while accelerator technologies lie at the heart of the irradiation methods that are widely used for treating cancer. Indeed, many important diagnostic and therapeutic techniques used by healthcare professionals are based either on basic physics principles or the technologies developed to carry out physics research. Ever since the discovery of x-rays by Roentgen in 1895, physics has been instrumental in the development of technologies in the biomedical domain, including the use of ionizing radiation for medical imaging and therapy. Some key examples that are explored in detail in this book include scanners based on positron emission tomography, as well as radiation therapy for cancer treatment. Even the collaborative model of particle physics is proving to be effective in catalysing multidisciplinary research for medical applications, ensuring that pioneering physics research is exploited for the benefit of all.
Comparison of particle tracking algorithms in commercial CFD packages: sedimentation and diffusion.
Robinson, Risa J; Snyder, Pam; Oldham, Michael J
2007-05-01
Computational fluid dynamics (CFD) modeling software has enabled microdosimetry patterns of inhaled toxins and toxicants to be predicted and visualized, and is being used in inhalation toxicology and risk assessment. These predicted microdosimetry patterns in airway structures are derived from predicted airflow patterns within these airways and from the particle tracking algorithms used in CFD software packages. Although these commercial CFD codes have been tested for accuracy under various conditions, they have not been well tested for respiratory flows in general, nor has the accuracy of their particle tracking algorithms been well studied. In this study, three software packages, Fluent Discrete Phase Model (DPM), Fluent Fine Particle Model (FPM), and ANSYS CFX, were evaluated. Sedimentation and diffusion were each isolated in a straight-tube geometry and tested for accuracy. A range of flow rates corresponding to adult low activity (minute ventilation = 10 L/min) and to heavy exertion (minute ventilation = 60 L/min) was tested by varying the range of dimensionless diffusion and sedimentation parameters found using the Weibel symmetric 23-generation lung morphology. Numerical results for fully developed parabolic and uniform (slip) profiles were compared, respectively, to the analytical sedimentation solutions of Pich (1972) and Yu (1977). The sedimentation equations of Schum and Yeh (1980) were also compared. Numerical results for diffusional deposition were compared to the analytical solutions of Ingham (1975) for parabolic and uniform profiles. Significant differences were found among the various CFD software packages and between numerical and analytical solutions. It is therefore prudent to validate CFD predictions against analytical solutions in idealized geometry before tackling the complex geometries of the respiratory tract.
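The physics being compared in these benchmarks is simple in isolation: gravitational settling adds a constant drift to each particle, and Brownian diffusion adds a random displacement each time step. The sketch below is a generic Lagrangian tracking step under those two effects, not the algorithm of any of the three CFD packages; the fluid velocity field, particle properties, and time step are placeholder assumptions.

```python
import numpy as np

def track_particle(x0, u_fluid, v_settle, D, dt, n_steps, rng):
    """Euler integration of a particle advected by u_fluid(x) [m/s], settling at
    v_settle [m/s, downward], with Brownian diffusivity D [m^2/s]."""
    x = np.array(x0, dtype=float)
    path = [x.copy()]
    for _ in range(n_steps):
        drift = u_fluid(x) + np.array([0.0, -v_settle, 0.0])
        x = x + drift * dt + rng.normal(scale=np.sqrt(2 * D * dt), size=3)
        path.append(x.copy())
    return np.array(path)

# Example: uniform axial flow in a tube, with illustrative 5-micron-like particle values
rng = np.random.default_rng(2)
u_fluid = lambda x: np.array([0.1, 0.0, 0.0])        # 0.1 m/s plug flow
path = track_particle([0.0, 1e-3, 0.0], u_fluid, v_settle=7.5e-4, D=2.4e-11,
                      dt=1e-3, n_steps=1000, rng=rng)
print("final position [m]:", path[-1])
```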
Let’s have a coffee with the Standard Model of particle physics!
NASA Astrophysics Data System (ADS)
Woithe, Julia; Wiener, Gerfried J.; Van der Veken, Frederik F.
2017-05-01
The Standard Model of particle physics is one of the most successful theories in physics and describes the fundamental interactions between elementary particles. It is encoded in a compact description, the so-called ‘Lagrangian’, which even fits on t-shirts and coffee mugs. This mathematical formulation, however, is complex and only rarely makes it into the physics classroom. Therefore, to support high school teachers in their challenging endeavour of introducing particle physics in the classroom, we provide a qualitative explanation of the terms of the Lagrangian and discuss their interpretation based on associated Feynman diagrams.
Lühr, Armin; Löck, Steffen; Roth, Klaus; Helmbrecht, Stephan; Jakobi, Annika; Petersen, Jørgen B; Just, Uwe; Krause, Mechthild; Enghardt, Wolfgang; Baumann, Michael
2014-02-18
Identifying those patients who have a higher chance of being cured with fewer side effects by particle beam therapy than by state-of-the-art photon therapy is essential to guarantee fair and sufficient access to specialized radiotherapy. This individualized identification requires initiatives by particle as well as non-particle radiotherapy centers to form networks, to establish procedures for the decision process, and to implement means for the remote exchange of relevant patient information. In this work, we want to contribute a practical concept that addresses these requirements. We proposed a concept for individualized patient allocation to photon or particle beam therapy at a non-particle radiotherapy institution that is based on remote treatment plan comparison. We translated this concept into the web-based software tool ReCompare (REmote COMparison of PARticlE and photon treatment plans). We substantiated the feasibility of the proposed concept by demonstrating the remote exchange of treatment plans between radiotherapy institutions and the direct comparison of photon and particle treatment plans in photon treatment planning systems. ReCompare worked with several tested standard treatment planning systems, ensured patient data protection, and integrated into the clinical workflow. Our concept supports non-particle radiotherapy institutions in making the patient-specific treatment decision on the optimal irradiation modality by providing expertise from a particle therapy center. The software tool ReCompare may help to improve and standardize this personalized treatment decision. It will be available from our website when proton therapy is operational at our facility.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Calhoon, E.C.; Starring, P.W. eds.
1959-08-01
Lectures given at the Ernest O. Lawrence Radiation Laboratory on physics, biophysics, and chemistry for high school science teachers are presented. Topics covered include a mathematics review, atomic physics, nuclear physics, solid-state physics, elementary particles, antiparticles, design of experiments, high-energy particle accelerators, a survey of particle detectors, emulsion as a particle detector, counters used in high-energy physics, bubble chambers, computer programming, chromatography, the transuranium elements, health physics, photosynthesis, the chemistry and physics of viruses, the biology of viruses, lipoproteins and heart disease, the origin and evolution of the solar system, the role of space satellites in gathering astronomical data, and radiation and life in space. (M.C.G.)
NASA Technical Reports Server (NTRS)
Bidwell, Colin, S.
2012-01-01
Ice particle trajectory calculations with phase change were made for the Energy Efficient Engine (E(sup 3)) using the LEWICE3D Version 3.2 software. The particle trajectory computations were performed using the new Glenn Ice Particle Phase Change Model, which has been incorporated into the LEWICE3D Version 3.2 software. The E(sup 3) was developed by NASA and GE in the early 1980s as a technology demonstrator and is representative of a modern high-bypass turbofan engine. The E(sup 3) flow field was calculated using the NASA Glenn ADPAC turbomachinery flow solver. Computations were performed for the low-pressure compressor of the E(sup 3) for a Mach 0.8 cruise condition at 11,887 m, assuming a standard warm day, for ice particle sizes of 5, 20, and 100 microns and a free-stream particle concentration of 0.3 g/cu m. The impingement efficiency results showed that as particle size increased, average impingement efficiencies and scoop factors increased for the various components. The particle analysis also showed that the amount of mass entering the inner core decreased with increased particle size because the larger particles were less able to negotiate the turn into the inner core due to particle inertia. The particle phase change analysis results showed that the larger particles warmed less as they were transported through the low-pressure compressor. Only the smallest, 5 micron, particles were warmed enough to produce melting, and the amount of melting was relatively small, with a maximum average melting fraction of 0.836. The results also showed an appreciable amount of particle sublimation and evaporation for the 5 micron particles entering the engine core (22 percent).
Basic Guidelines to Introduce Electric Circuit Simulation Software in a General Physics Course
ERIC Educational Resources Information Center
Moya, A. A.
2018-01-01
The introduction of electric circuit simulation software for undergraduate students in a general physics course is proposed in order to contribute to the constructive learning of electric circuit theory. This work focuses on the lab exercises based on dc, transient and ac analysis in electric circuits found in introductory physics courses, and…
DynamO: a free O(N) general event-driven molecular dynamics simulator.
Bannerman, M N; Sargant, R; Lue, L
2011-11-30
Molecular dynamics algorithms for systems of particles interacting through discrete or "hard" potentials are fundamentally different from the methods for continuous or "soft" potential systems. Although many software packages have been developed for continuous potential systems, software for discrete potential systems based on event-driven algorithms is relatively scarce and specialized. We present DynamO, a general event-driven simulation package, which displays the optimal O(N) asymptotic scaling of the computational cost with the number of particles N, rather than the O(N log N) scaling found in most standard algorithms. DynamO provides reference implementations of the best available event-driven algorithms. These techniques allow the rapid simulation of both complex and large (>10^6 particles) systems for long times. The performance of the program is benchmarked for elastic hard-sphere systems, homogeneous cooling and sheared inelastic hard spheres, and equilibrium Lennard-Jones fluids. This software and its documentation are distributed under the GNU General Public License and can be freely downloaded from http://marcusbannerman.co.uk/dynamo. Copyright © 2011 Wiley Periodicals, Inc.
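The central kernel of event-driven hard-sphere dynamics is predicting the exact time at which two spheres will next touch, which reduces to solving a quadratic in the relative positions and velocities. The sketch below illustrates only that calculation; it is not DynamO code, and the neighbour bookkeeping that gives DynamO its O(N) scaling is omitted.

```python
import numpy as np

def collision_time(r1, r2, v1, v2, sigma):
    """Time until two hard spheres with contact distance sigma collide, or None.

    Solves |r12 + v12*t| = sigma for the smallest positive t.
    """
    r12 = np.asarray(r1, dtype=float) - np.asarray(r2, dtype=float)
    v12 = np.asarray(v1, dtype=float) - np.asarray(v2, dtype=float)
    b = np.dot(r12, v12)
    if b >= 0:                      # spheres are moving apart
        return None
    v2n = np.dot(v12, v12)
    disc = b * b - v2n * (np.dot(r12, r12) - sigma**2)
    if disc < 0:                    # they miss each other
        return None
    return (-b - np.sqrt(disc)) / v2n

# Two unit-diameter spheres approaching head-on: contact expected at t = 1.0
print(collision_time([0, 0, 0], [3, 0, 0], [1, 0, 0], [-1, 0, 0], sigma=1.0))
```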
Particle and nuclear physics instrumentation and its broad connections
Demarteau, Marcel; Lipton, Ron; Nicholson, Howard; ...
2016-12-20
Subatomic physics shares with other basic sciences the need to innovate, invent, and develop tools, techniques, and technologies to carry out its mission to explore the nature of matter, energy, space, and time. In some cases, entire detectors or technologies developed specifically for particle physics research have been adopted by other fields of research or in commercial applications. In most cases, however, the development of new devices and technologies by particle physics for its own research has added value to other fields of research or to applications beneficial to society by integrating them in the existing technologies. Thus, detector research and development has not only advanced the current state of technology for particle physics, but has often advanced research in other fields of science and has underpinned progress in numerous applications in medicine and national security. At the same time particle physics has profited immensely from developments in industry and applied them to great benefit for the use of particle physics detectors. Finally, this symbiotic relationship has seen strong mutual benefits with sometimes unexpected far reach.
Particle and nuclear physics instrumentation and its broad connections
NASA Astrophysics Data System (ADS)
Demarteau, M.; Lipton, R.; Nicholson, H.; Shipsey, I.
2016-10-01
Subatomic physics shares with other basic sciences the need to innovate, invent, and develop tools, techniques, and technologies to carry out its mission to explore the nature of matter, energy, space, and time. In some cases, entire detectors or technologies developed specifically for particle physics research have been adopted by other fields of research or in commercial applications. In most cases, however, the development of new devices and technologies by particle physics for its own research has added value to other fields of research or to applications beneficial to society by integrating them in the existing technologies. Thus, detector research and development has not only advanced the current state of technology for particle physics, but has often advanced research in other fields of science and has underpinned progress in numerous applications in medicine and national security. At the same time particle physics has profited immensely from developments in industry and applied them to great benefit for the use of particle physics detectors. This symbiotic relationship has seen strong mutual benefits with sometimes unexpected far reach.
Components for Atomistic-to-Continuum Multiscale Modeling of Flow in Micro- and Nanofluidic Systems
Adalsteinsson, Helgi; Debusschere, Bert J.; Long, Kevin R.; ...
2008-01-01
Micro- and nanofluidics pose a series of significant challenges for science-based modeling. Key among those are the wide separation of length- and timescales between interface phenomena and bulk flow and the spatially heterogeneous solution properties near solid-liquid interfaces. It is not uncommon for characteristic scales in these systems to span nine orders of magnitude from the atomic motions in particle dynamics up to evolution of mass transport at the macroscale level, making explicit particle models intractable for all but the simplest systems. Recently, atomistic-to-continuum (A2C) multiscale simulations have gained a lot of interest as an approach to rigorously handle particle-level dynamics while also tracking evolution of large-scale macroscale behavior. While these methods are clearly not applicable to all classes of simulations, they are finding traction in systems in which tight-binding, and physically important, dynamics at system interfaces have complex effects on the slower-evolving large-scale evolution of the surrounding medium. These conditions allow decomposition of the simulation into discrete domains, either spatially or temporally. In this paper, we describe how features of domain-decomposed simulation systems can be harnessed to yield flexible and efficient software for multiscale simulations of electric field-driven micro- and nanofluidics.
Tunable particles alter macrophage uptake based on combinatorial effects of physical properties
Garapaty, Anusha
2017-01-01
The ability to tune phagocytosis of particle-based therapeutics by macrophages can enhance their delivery to macrophages or reduce their phagocytic susceptibility for delivery to non-phagocytic cells. Since phagocytosis is affected by the physical and chemical properties of particles, it is crucial to identify any interplay between physical properties of particles in altering phagocytic interactions. The combinatorial effect of the physical properties size, shape and stiffness on Fc receptor mediated macrophage interactions was investigated by fabrication of layer-by-layer tunable particles of constant surface chemistry. Our results highlight how changing particle stiffness affects phagocytic interaction intricately when combined with varying size or shape. An increase in size plays a dominant role over a reduction in stiffness in reducing internalization by macrophages for spherical particles. Internalization of rod-shaped, but not spherical, particles was highly dependent on stiffness. These particles demonstrate the interplay between size, shape and stiffness in the interactions of Fc-functionalized particles with macrophages during phagocytosis. PMID:29313025
In Vitro Toxicity of Silver Nanoparticles in Human Lung Epithelial Cells
2009-03-01
… software from the particle distributions measured, and the polydispersity index (PdI) given is a measure of the size ranges present in the solution … Transmission Electron Microscopy: Figure 22 shows the TEM primary particle size and distribution determined from measurement of over 100 particles … nm uncoated. (B) Ag 80 nm uncoated. (C) Ag 10 nm coated. (D) Ag 80 nm coated. Table 4 shows the TEM primary particle size and distribution.
RELAP-7 Software Verification and Validation Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling
This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on INL's modern scientific software development framework, MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's capability and extends the analysis capability for all reactor system simulation scenarios.
High-energy physics software parallelization using database techniques
NASA Astrophysics Data System (ADS)
Argante, E.; van der Stok, P. D. V.; Willers, I.
1997-02-01
A programming model for software parallelization, called CoCa, is introduced that copes with problems caused by typical features of high-energy physics software. By basing CoCa on the database transaction paradigm, the complexity induced by the parallelization is largely transparent to the programmer, resulting in a higher level of abstraction than the native message-passing software. CoCa is implemented on a Meiko CS-2 and on a SUN SPARCcenter 2000 parallel computer. On the CS-2, the performance is comparable with that of native PVM and MPI.
DarkBit: a GAMBIT module for computing dark matter observables and likelihoods
NASA Astrophysics Data System (ADS)
Bringmann, Torsten; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Edsjö, Joakim; Farmer, Ben; Kahlhoefer, Felix; Kvellestad, Anders; Putze, Antje; Savage, Christopher; Scott, Pat; Weniger, Christoph; White, Martin; Wild, Sebastian
2017-12-01
We introduce DarkBit, an advanced software code for computing dark matter constraints on various extensions to the Standard Model of particle physics, comprising both new native code and interfaces to external packages. This release includes a dedicated signal yield calculator for gamma-ray observations, which significantly extends current tools by implementing a cascade-decay Monte Carlo, as well as a dedicated likelihood calculator for current and future experiments (gamLike). This provides a general solution for studying complex particle physics models that predict dark matter annihilation to a multitude of final states. We also supply a direct detection package that models a large range of direct detection experiments (DDCalc), and that provides the corresponding likelihoods for arbitrary combinations of spin-independent and spin-dependent scattering processes. Finally, we provide custom relic density routines along with interfaces to DarkSUSY, micrOMEGAs, and the neutrino telescope likelihood package nulike. DarkBit is written in the framework of the Global And Modular Beyond the Standard Model Inference Tool (GAMBIT), providing seamless integration into a comprehensive statistical fitting framework that allows users to explore new models with both particle and astrophysics constraints, and a consistent treatment of systematic uncertainties. In this paper we describe its main functionality, provide a guide to getting started quickly, and show illustrative examples for results obtained with DarkBit (both as a stand-alone tool and as a GAMBIT module). This includes a quantitative comparison between two of the main dark matter codes (DarkSUSY and micrOMEGAs), and application of DarkBit's advanced direct and indirect detection routines to a simple effective dark matter model.
TANGRA-Setup for the Investigation of Nuclear Fission Induced by 14.1 MeV Neutrons
NASA Astrophysics Data System (ADS)
Ruskov, I. N.; Kopatch, Yu. N.; Bystritsky, V. M.; Skoy, V. R.; Shvetsov, V. N.; Hambsch, F.-J.; Oberstedt, S.; Noy, R. Capote; Sedyshev, P. V.; Grozdanov, D. N.; Ivanov, I. Zh.; Aleksakhin, V. Yu.; Bogolubov, E. P.; Barmakov, Yu. N.; Khabarov, S. V.; Krasnoperov, A. V.; Krylov, A. R.; Obhođaš, J.; Pikelner, L. B.; Rapatskiy, V. L.; Rogachev, A. V.; Rogov, Yu. N.; Ryzhkov, V. I.; Sadovsky, A. B.; Salmin, R. A.; Sapozhnikov, M. G.; Slepnev, V. M.; Sudac, D.; Tarasov, O. G.; Valković, V.; Yurkov, D. I.; Zamyatin, N. I.; Zeynalov, Sh. S.; Zontikov, A. O.; Zubarev, E. V.
The new experimental setup TANGRA (Tagged Neutrons & Gamma Rays), for the investigation of neutron-induced nuclear reactions, e.g. (n,xn'), (n,xn'γ), (n,γ), (n,f), on a number of isotopes important for nuclear science and engineering (235,238U, 237Np, 239Pu, 244,245,248Cm), is under construction and being tested at the Frank Laboratory of Neutron Physics (FLNP) of the Joint Institute for Nuclear Research (JINR) in Dubna. The TANGRA setup consists of: a portable neutron generator ING-27, with a 64-pixel Si charge-particle detector incorporated into its vacuum chamber for registering the α-particles formed in the T(d,n)4He reaction, serving as a source of steady-state 14.1 MeV neutron radiation with an intensity of ~5x10^7 n/s; a combined iron (Fe), borated polyethylene (BPE) and lead (Pb) compact shielding-collimator; a reconfigurable multi-detector system (neutron plus gamma-ray detectors); a fast computer with 2 (x16 channels) PCI-E 100 MHz ADC cards for data acquisition and hard disk storage; and Linux ROOT data acquisition, visualization and analysis software. The signals from the α-particle detector are used to 'tag' the neutrons via the coincident α-particles. Counting the coincidences between the α-particle and reaction-product detectors within a 20 ns time interval improves the effect-to-background ratio by a factor of ~200, as well as the accuracy of the neutron flux determination, which noticeably decreases the overall experimental uncertainty.
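The 'tagging' described above amounts to keeping only those detector events whose time stamps fall within a fixed coincidence window of an α-particle hit. A minimal sketch of that selection is shown below; the 20 ns window value is taken from the text, but the time-stamp arrays and the matching strategy are illustrative assumptions, not TANGRA software.

```python
import numpy as np

def tag_coincidences(t_alpha, t_det, window=20e-9):
    """Return indices of detector events within `window` seconds of any alpha hit.

    Both time-stamp arrays must be sorted in ascending order (seconds).
    """
    idx = np.searchsorted(t_alpha, t_det)            # nearest alpha candidates
    lo = np.clip(idx - 1, 0, len(t_alpha) - 1)
    hi = np.clip(idx, 0, len(t_alpha) - 1)
    dt = np.minimum(np.abs(t_det - t_alpha[lo]), np.abs(t_det - t_alpha[hi]))
    return np.where(dt <= window)[0]

# Synthetic example: 3 alpha tags, 5 detector events, only 2 fall inside the window
t_alpha = np.array([1.000e-3, 2.000e-3, 3.000e-3])
t_det = np.array([1.000010e-3, 1.500e-3, 2.000005e-3, 2.500e-3, 3.100e-3])
print(tag_coincidences(t_alpha, t_det))   # expected: [0 2]
```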
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sayyar-Rodsari, Bijan; Schweiger, Carl; /SLAC /Pavilion Technologies, Inc., Austin, TX
2010-08-25
Timely estimation of deviations from optimal performance in complex systems and the ability to identify corrective measures in response to the estimated parameter deviations has been the subject of extensive research over the past four decades. The implications in terms of lost revenue from costly industrial processes, the operation of large-scale public works projects, and the volume of the published literature on this topic clearly indicate the significance of the problem. Applications range from manufacturing industries (integrated circuits, automotive, etc.), to large-scale chemical plants, pharmaceutical production, power distribution grids, and avionics. In this project we investigated a new framework for building parsimonious models that are suited for diagnosis and fault estimation of complex technical systems. We used Support Vector Machines (SVMs) to model potentially time-varying parameters of a First-Principles (FP) description of the process. The combined SVM & FP model was built (i.e. model parameters were trained) using constrained optimization techniques. We used the trained models to estimate faults affecting simulated beam lifetime. In the case where a large number of process inputs are required for model-based fault estimation, the proposed framework performs an optimal nonlinear principal component analysis of the large-scale input space, and creates a lower-dimension feature space in which fault estimation results can be effectively presented to the operations personnel. To fulfill the main technical objectives of the Phase I research, our Phase I efforts focused on: (1) SVM Training in a Combined Model Structure - We developed the software for the constrained training of the SVMs in a combined model structure, and successfully modeled the parameters of a first-principles model for beam lifetime with support vectors. (2) Higher-order Fidelity of the Combined Model - We used constrained training to ensure that the outputs of the SVM (i.e. the parameters of the beam lifetime model) are physically meaningful. (3) Numerical Efficiency of the Training - We investigated the numerical efficiency of the SVM training. More specifically, for the primal formulation of the training, we developed a problem formulation that avoids the linear increase in the number of constraints as a function of the number of data points. (4) Flexibility of Software Architecture - The software framework for the training of the support vector machines was designed to enable experimentation with different solvers. We experimented with two commonly used nonlinear solvers in our simulations. The primary application of interest for this project has been the sustained optimal operation of particle accelerators at the Stanford Linear Accelerator Center (SLAC). Particle storage rings are used for a variety of applications ranging from 'colliding beam' systems for high-energy physics research to highly collimated x-ray generators for synchrotron radiation science. Linear accelerators are also used for collider research, such as the International Linear Collider (ILC), as well as for free electron lasers, such as the Linear Coherent Light Source (LCLS) at SLAC. One common theme in the operation of storage rings and linear accelerators is the need to precisely control the particle beams over long periods of time with minimum beam loss and stable, yet challenging, beam parameters.
We strongly believe that beyond applications in particle accelerators, the high fidelity and cost benefits of a combined model-based fault estimation/correction system will attract customers from a wide variety of commercial and scientific industries. Even though the acquisition of Pavilion Technologies, Inc. by Rockwell Automation Inc. in 2007 has altered the small-business status of Pavilion and it no longer qualifies for Phase II funding, our findings in the course of the Phase I research have convinced us that further research will render a workable model-based fault estimation and correction system for particle accelerators and industrial plants feasible.
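The combined model structure described above wraps a data-driven estimator around a first-principles relation, with the estimator predicting a physical parameter rather than the output directly. The sketch below illustrates that idea in a very reduced form using scikit-learn's support vector regression; the exponential "lifetime" toy model, the feature choice, and all numerical values are assumptions, and training the SVR directly on the known parameter is a simplification of the constrained-training scheme described in the abstract. It is unrelated to the actual SLAC beam-lifetime model or Pavilion's software.

```python
import numpy as np
from sklearn.svm import SVR

# Toy first-principles relation: a measured signal decays as exp(-t / tau),
# where the lifetime tau drifts with an operating condition x.
def first_principles(t, tau):
    return np.exp(-t / tau)

rng = np.random.default_rng(3)
x = rng.uniform(0.0, 1.0, 200)                 # operating condition
tau_true = 5.0 + 2.0 * np.sin(2 * np.pi * x)   # hidden parameter drift
t_obs = 1.0
y_obs = first_principles(t_obs, tau_true) + rng.normal(scale=0.01, size=x.size)

# Train an SVR to predict the *parameter* tau from x (here, for simplicity, the
# known tau_true is the target; in the combined-model scheme the target would be
# obtained by constrained optimization against the observed outputs y_obs).
svr = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(x.reshape(-1, 1), tau_true)

# The combined model: the SVR supplies tau, the first-principles relation the output
x_new = np.array([[0.25]])
tau_hat = svr.predict(x_new)[0]
print("predicted tau:", tau_hat, "predicted signal:", first_principles(t_obs, tau_hat))
```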
Automated Counting of Particles To Quantify Cleanliness
NASA Technical Reports Server (NTRS)
Rhode, James
2005-01-01
A machine vision system, similar to systems used in microbiological laboratories to count cultured microbes, has been proposed for quantifying the cleanliness of nominally precisely cleaned hardware by counting residual contaminant particles. The system would include a microscope equipped with an electronic camera and circuitry to digitize the camera output, a personal computer programmed with machine-vision and interface software, and digital storage media. A filter pad, through which solvent from rinsing the hardware in question had been aspirated, would be placed on the microscope stage. A high-resolution image of the filter pad would be recorded. The computer would analyze the image and present a histogram of the sizes of particles on the filter. On the basis of the histogram and a measure of the desired level of cleanliness, the hardware would be accepted or rejected. If the hardware were accepted, the image would be saved, along with other information, as a quality record. If the hardware were rejected, the histogram and ancillary information would be recorded for analysis of trends. The software would flag particles that are too large or too numerous to meet a specified particle-distribution profile. Anomalous particles or fibrous material would be flagged for inspection.
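The core image-analysis step described above, detecting particles on a filter image and building a size histogram, can be prototyped in a few lines with standard scientific-Python tools. The sketch below is such a prototype, not the proposed NASA system; the threshold value, bin edges, and synthetic image are assumptions.

```python
import numpy as np
from scipy import ndimage

def count_particles(image, threshold):
    """Label connected bright regions above `threshold` and return their pixel areas."""
    mask = image > threshold
    labels, n = ndimage.label(mask)
    areas = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    return n, areas

# Synthetic "filter pad" image with a few bright blobs on a dark background
rng = np.random.default_rng(4)
img = rng.normal(10, 2, (256, 256))
for cy, cx, r in [(50, 60, 4), (120, 200, 7), (200, 100, 3)]:
    y, x = np.ogrid[:256, :256]
    img[(y - cy)**2 + (x - cx)**2 <= r**2] += 100

n, areas = count_particles(img, threshold=50)
print(f"{n} particles, areas (pixels): {areas}")
hist, edges = np.histogram(areas, bins=[0, 10, 50, 200, 1000])
print("size histogram:", hist)
```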
Tracking Debris Shed by a Space-Shuttle Launch Vehicle
NASA Technical Reports Server (NTRS)
Stuart, Phillip C.; Rogers, Stuart E.
2009-01-01
The DEBRIS software predicts the trajectories of debris particles shed by a space-shuttle launch vehicle during ascent, to aid in assessing potential harm to the space-shuttle orbiter and crew. The user specifies the location of release and other initial conditions for a debris particle. DEBRIS tracks the particle within an overset grid system by means of a computational fluid dynamics (CFD) simulation of the local flow field and a ballistic simulation that takes account of the mass of the particle and its aerodynamic properties in the flow field. The computed particle trajectory is stored in a file to be post-processed by other software for viewing and analyzing the trajectory. DEBRIS supplants a prior debris-tracking code that took approximately 15 minutes to calculate a single particle trajectory: DEBRIS can calculate 1,000 trajectories in approximately 20 seconds on a desktop computer. Other improvements over the prior code include adaptive time-stepping to ensure accuracy, forcing at least one step per grid cell to ensure resolution of all CFD-resolved flow features, the ability to simulate rebound of debris from surfaces, extensive error checking, a built-in suite of test cases, and dynamic allocation of memory.
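The ballistic part of such a calculation is a drag-driven equation of motion integrated through the local flow field. The sketch below shows one heavily simplified step of that integration (fixed time step, quadratic drag); it is not the DEBRIS code, and the lumped drag coefficient, particle properties, and flow-field function are placeholder assumptions.

```python
import numpy as np

def step_debris(x, v, flow, dt, cd_area_over_m=0.05, rho_air=1.2, g=9.81):
    """One explicit Euler step for a debris particle under quadratic drag and gravity.

    cd_area_over_m lumps Cd * A / m, the particle's aerodynamic properties [m^2/kg].
    """
    u_rel = flow(x) - v                                  # air velocity relative to particle
    a_drag = 0.5 * rho_air * cd_area_over_m * np.linalg.norm(u_rel) * u_rel
    a = a_drag + np.array([0.0, 0.0, -g])
    return x + v * dt, v + a * dt

# Example: particle released from rest into a uniform 100 m/s axial flow
flow = lambda x: np.array([100.0, 0.0, 0.0])
x, v = np.zeros(3), np.zeros(3)
for _ in range(1000):
    x, v = step_debris(x, v, flow, dt=1e-3)
print("position after 1 s:", x, "velocity:", v)
```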
NASA Technical Reports Server (NTRS)
Zhang, Ming
2005-01-01
The primary goal of this project was to perform theoretical calculations of the propagation of cosmic rays and energetic particles in 3-dimensional heliospheric magnetic fields. We used Markov stochastic process simulation to achieve this goal. We developed computational software that can be used to study particle propagation in heliospheric magnetic fields that must be treated in three dimensions; as two examples, we considered a heliospheric magnetic field suggested by Fisk (1996) and a global heliosphere including the region beyond the termination shock. The results from our model calculations were compared with particle measurements from Ulysses, near-Earth spacecraft such as IMP-8, WIND and ACE, and the Voyagers and Pioneers in the outer heliosphere to test the magnetic field models. We particularly looked for features of particle variations that would allow us to clearly distinguish the Fisk magnetic field from the conventional Parker spiral field. The computer code will eventually lead to a new generation of integrated software for solving complicated problems of particle acceleration, propagation and modulation in a realistic 3-dimensional heliosphere with realistic magnetic fields and solar wind within a single computational approach.
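Markov stochastic process simulation of particle transport integrates a stochastic differential equation whose drift and diffusion terms mirror the advection and diffusion of the underlying transport equation. The sketch below is a generic one-dimensional Euler-Maruyama integration of such a process for an ensemble of pseudo-particles; the drift, diffusion coefficient, and numerical values are illustrative assumptions, not the heliospheric model used in the project.

```python
import numpy as np

def euler_maruyama(x0, drift, diffusion, dt, n_steps, rng):
    """Integrate dX = drift(X) dt + sqrt(2*diffusion(X)) dW for an ensemble of pseudo-particles."""
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        dw = rng.normal(scale=np.sqrt(dt), size=x.shape)
        x = x + drift(x) * dt + np.sqrt(2.0 * diffusion(x)) * dw
    return x

# Example: 10^4 pseudo-particles advected outward with constant diffusion (arbitrary units)
rng = np.random.default_rng(5)
x0 = np.ones(10_000)                      # all particles start at r = 1
x_final = euler_maruyama(x0,
                         drift=lambda r: 400.0 * np.ones_like(r),
                         diffusion=lambda r: 50.0 * np.ones_like(r),
                         dt=1e-3, n_steps=1000, rng=rng)
print("mean radius:", x_final.mean(), " std:", x_final.std())
```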
Theory of post-block 2 VLBI observable extraction
NASA Technical Reports Server (NTRS)
Lowe, Stephen T.
1992-01-01
The algorithms used in the post-Block II fringe-fitting software called 'Fit' are described. The steps needed to derive the very long baseline interferometry (VLBI) charged-particle corrected group delay, phase delay rate, and phase delay (the latter without resolving cycle ambiguities) are presented beginning with the set of complex fringe phasors as a function of observation frequency and time. The set of complex phasors is obtained from the JPL/CIT Block II correlator. The output of Fit is the set of charged-particle corrected observables (along with ancillary information) in a form amenable to the software program 'Modest.'
Schneider, Florian R; Mann, Alexander B; Konorov, Igor; Delso, Gaspar; Paul, Stephan; Ziegler, Sibylle I
2012-06-01
A one-day laboratory course on positron emission tomography (PET) for the education of physics students and PhD students in medical physics has been set up. In the course, the physical background and the principles of a PET scanner are introduced. Course attendees set the system in operation, calibrate it using a 22Na point source, and reconstruct different source geometries filled with 18F. The PET scanner features individual channel read-out of 96 lutetium oxyorthosilicate (LSO) scintillator crystals coupled to avalanche photodiodes (APDs). The analog data of each APD are digitized by fast sampling analog-to-digital converters (SADCs) and processed within field-programmable gate arrays (FPGAs) to extract amplitudes and time stamps. All SADCs sample continuously at a precise rate of 80 MHz, synchronous across the whole system. The data are transmitted via USB to a Linux PC, where further processing and the image reconstruction are performed. The course attendees get an insight into detector techniques, modern read-out electronics, data acquisition and PET image reconstruction. In addition, a short introduction to some common software applications used in particle and high-energy physics is part of the course. Copyright © 2011. Published by Elsevier GmbH.
Teaching Elementary Particle Physics: Part I
ERIC Educational Resources Information Center
Hobson, Art
2011-01-01
I'll outline suggestions for teaching elementary particle physics, often called "high energy physics," in high school or introductory college courses for non-scientists or scientists. Some presentations of this topic simply list the various particles along with their properties, with little overarching structure. Such a laundry list approach is a…
The Ultimate Structure of Matter: The High Energy Physics Program from the 1950s through the 1980s
DOE R&D Accomplishments Database
1990-02-01
This discusses the following topics in High Energy Physics: The Particle Zoo; The Strong and the Weak; The Particle Explosion; Deep Inside the Nucleon; The Search for Unity; Physics in Collision; The Standard Model; Particles and the Cosmos; and Practical Benefits.
Computer Simulations for Lab Experiences in Secondary Physics
ERIC Educational Resources Information Center
Murphy, David Shannon
2012-01-01
Physical science instruction often involves modeling natural systems, such as electricity that possess particles which are invisible to the unaided eye. The effect of these particles' motion is observable, but the particles are not directly observable to humans. Simulations have been developed in physics, chemistry and biology that, under certain…
[Meta-analyses of quarks, baryons and mesons--a "Cochrane Collaboration" in particle physics].
Sauerland, Stefan; Sauerland, Thankmar; Antes, Gerd; Barnett, R Michael
2002-02-01
Within the last 20 years, meta-analysis has become an important research technique in medicine for integrating the results of independent studies. Meta-analytical techniques, however, are much older. In particle physics, the properties of huge numbers of particles have been assessed in meta-analyses for 50 years now. The Cochrane Collaboration's counterpart in physics is the Particle Data Group. This article compares methodological and organisational aspects of meta-analyses in medicine and physics. Several interesting parallels exist, especially with regard to methodology.
ALICE Masterclass on strangeness
NASA Astrophysics Data System (ADS)
Foka, Panagiota; Janik, Małgorzata
2014-04-01
An educational activity, the International Particle Physics Masterclasses, was developed by the International Particle Physics Outreach Group with the aim of bringing the excitement of cutting-edge particle-physics research into the classroom. Every year since 2005, thousands of pupils in many countries all over the world have been hosted in research centers or universities close to their schools, becoming "scientists for a day" as they are introduced to the mysteries of particle physics. The program of a typical day includes lectures that give insight into the topics and methods of fundamental research, followed by a "hands-on" session in which the high-school students themselves perform measurements on real data from particle-physics experiments. For the last three years, data from the ALICE experiment at the LHC have been used. The measurement performed, "strangeness enhancement", and the methodology employed are presented.
Stability of cosmetic emulsion containing different amount of hemp oil.
Kowalska, M; Ziomek, M; Żbikowska, A
2015-08-01
The aim of the study was to determine the optimal conditions, that is, the content of hemp oil and the homogenization time, to obtain stable dispersion systems. For this purpose, six emulsions were prepared, their stability was examined empirically, and the most correctly formulated emulsion composition was determined using a computer simulation. The variable parameters (oil content and homogenization time) were indicated by optimization software based on Kleeman's method. Physical properties of the synthesized emulsions were studied by numerous techniques involving particle size analysis, optical microscopy, the Turbiscan test and emulsion viscosity. The emulsion containing 50 g of oil and homogenized for 6 min had the highest stability. The empirically determined parameters proved to be consistent with the results obtained using the computer software. The computer simulation showed that the most stable emulsion should contain from 30 to 50 g of oil and should be homogenized for 2.5-6 min. The computer software based on Kleeman's method proved to be useful for quick optimization of the composition and production parameters of stable emulsion systems. Moreover, obtaining an emulsion system with proper stability justifies further research extended with sensory analysis, which will allow the application of such systems (containing hemp oil, beneficial for skin) in the cosmetic industry. © 2015 Society of Cosmetic Scientists and the Société Française de Cosmétologie.
NASA Astrophysics Data System (ADS)
Noh, S. J.; Tachikawa, Y.; Shiiba, M.; Yorozu, K.; Kim, S.
2012-04-01
Data assimilation methods have received increased attention as means to accomplish uncertainty assessment and to enhance forecasting capability in various areas. Despite their potential, software frameworks applicable to probabilistic approaches and data assimilation are still limited, because most hydrologic modeling software is based on a deterministic approach. In this study, we developed a hydrological modeling framework for sequential data assimilation, called MPI-OHyMoS. MPI-OHyMoS allows users to develop their own element models and to easily build a total simulation system model for hydrological simulations. Unlike process-based modeling frameworks, this software framework benefits from its object-oriented design to flexibly represent hydrological processes without any change to the main library. Sequential data assimilation based on particle filters is available for any hydrologic model built on MPI-OHyMoS, considering various sources of uncertainty originating from input forcing, parameters and observations. The particle filters are a Bayesian learning process in which the propagation of all uncertainties is carried out by a suitable selection of randomly generated particles without any assumptions about the nature of the distributions. In MPI-OHyMoS, ensemble simulations are parallelized, which can take advantage of high-performance computing (HPC) systems. We applied this software framework to short-term streamflow forecasting for several catchments in Japan using a distributed hydrologic model. Uncertainty in model parameters and in remotely sensed rainfall data, such as X-band or C-band radar, is estimated and mitigated through the sequential data assimilation.
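Sequential data assimilation with particle filters follows a predict-weight-resample cycle: propagate each particle through the model, weight it by the likelihood of the new observation, and resample to concentrate particles in high-probability regions. The sketch below is a generic bootstrap particle filter for a scalar state, not MPI-OHyMoS code; the toy random-walk model, noise levels, and multinomial resampling scheme are assumptions.

```python
import numpy as np

def bootstrap_filter(observations, n_particles, model, obs_std, rng):
    """Generic bootstrap particle filter for a scalar state."""
    particles = rng.normal(0.0, 1.0, n_particles)      # initial ensemble
    estimates = []
    for y in observations:
        particles = model(particles, rng)              # predict step
        weights = np.exp(-0.5 * ((y - particles) / obs_std) ** 2)
        weights /= weights.sum()                       # normalized likelihood weights
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]                     # multinomial resampling
        estimates.append(particles.mean())
    return np.array(estimates)

# Toy example: a random-walk state observed with noise
rng = np.random.default_rng(6)
truth = np.cumsum(rng.normal(0, 0.1, 50))
obs = truth + rng.normal(0, 0.3, 50)
model = lambda x, rng: x + rng.normal(0, 0.1, x.shape)
est = bootstrap_filter(obs, n_particles=500, model=model, obs_std=0.3, rng=rng)
print("RMSE:", np.sqrt(np.mean((est - truth) ** 2)))
```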
Big Data in HEP: A comprehensive use case study
Gutsche, Oliver; Cremonesi, Matteo; Elmer, Peter; ...
2017-11-23
Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems collectively called Big Data technologies have emerged to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and promise a fresh look at analysis of very large datasets and could potentially reduce the time-to-physics with increased interactivity. In this talk, we present an active LHC Run 2 analysis, searching for dark matter with the CMS detector, as a testbed for Big Data technologies. We directly compare the traditional NTuple-based analysis with an equivalent analysis using Apache Spark on the Hadoop ecosystem and beyond. In both cases, we start the analysis with the official experiment data formats and produce publication physics plots. Lastly, we will discuss advantages and disadvantages of each approach and give an outlook on further studies needed.
ATLAS jet trigger update for the LHC run II
DOE Office of Scientific and Technical Information (OSTI.GOV)
Delgado, A. T.
The CERN Large Hadron Collider is the biggest and most powerful particle collider ever built. It produces up to 40 million proton-proton collisions per second at unprecedented energies to explore the fundamental laws and properties of Nature. The ATLAS experiment is one of the detectors that analyses and records these collisions. It generates dozens of GB/s of data that have to be reduced before they can be permanently stored; the event selection is made by the ATLAS trigger system, which reduces the data volume by a factor of 10^5. The trigger system has to be highly configurable in order to adapt to changing running conditions and maximize the physics output whilst keeping the output rate under control. A particularly interesting pattern generated during collisions consists of a collimated spray of particles, known as a hadronic jet. To retain the interesting jets and efficiently reject the overwhelming background, optimal jet energy resolution is needed. Therefore the jet trigger software requires CPU-intensive reconstruction algorithms. In order to reduce the resources needed for the reconstruction step, a partial detector readout scheme was developed, which effectively suppresses the low-activity regions of the calorimeter. In this paper we describe the overall ATLAS trigger software, and the jet trigger in particular, along with the improvements made to the system. We then focus on detailed studies of the algorithm timing and the performance impact of the full and partial calorimeter readout schemes. We conclude with an outlook on the jet trigger plans for the next LHC data-taking period. (authors)
Optimization of an interactive distributive computer network
NASA Technical Reports Server (NTRS)
Frederick, V.
1985-01-01
The activities under a cooperative agreement for the development of a computer network are briefly summarized. Research activities covered are: computer operating systems optimization and integration; software development and implementation of the IRIS (Infrared Imaging of Shuttle) Experiment; and software design, development, and implementation of the APS (Aerosol Particle System) Experiment.
Three-Dimensional Visualization of Particle Tracks.
ERIC Educational Resources Information Center
Julian, Glenn M.
1993-01-01
Suggests ways to bring home to the introductory physics student some of the excitement of recent discoveries in particle physics. Describes particle detectors and encourages the use of the Standard Model along with real images of particle tracks to determine three-dimensional views of tracks. (MVL)
Plato's TIMAIOΣ (TIMAEUS) and Modern Particle Physics
NASA Astrophysics Data System (ADS)
Machleidt, Ruprecht
2005-04-01
It is generally known that the question, ``What are the smallest particles (elementary particles) that all matter is made from?'', was already posed in antiquity. The Greek natural philosophers Leucippus and Democritus were the first to suggest that all matter was made from atoms. Therefore, most people perceive them as the ancient fathers of elementary particle physics. It will be the purpose of my contribution to point out that this perception is wrong. Modern particle physics is not just a primitive atomism. More important than the materialistic particles are the underlying symmetries (e. g., SU(3) and SU(6)). A similar idea was first advanced by Plato in his dialog TIMAIOΣ (Latin translation: TIMAEUS): Geometric symmetries generate the materialistic particles from a few even more elementary items. Plato's vision is amazingly close to the ideas of modern particle physics. This fact, which is unfortunately little known, has been pointed out repeatedly by Heisenberg (see, e. g., Werner Heisenberg, Across the Frontiers, Harper & Row, New York, 1974).
The Belle II software—From detector signals to physics results
NASA Astrophysics Data System (ADS)
Kuhr, T.
2017-07-01
The construction of the Belle II detector is being completed, and the focus shifts towards the reconstruction of higher-level objects from the detector signals with the aim of searching for new physics effects in huge data samples. The software provides the connection between detector hardware and physics analyses. This article describes the development infrastructure and the main components of the Belle II software, which are essential for the success of the Belle II physics program.
Quarked!--Adventures in Particle Physics Education
ERIC Educational Resources Information Center
MacDonald, Teresa; Bean, Alice
2009-01-01
Particle physics is a subject that can send shivers down the spines of students and educators alike--with visions of long mathematical equations and inscrutable ideas. This perception, along with a full curriculum, often leaves this topic the road less traveled until the latter years of school. Particle physics, including quarks, is typically not…
Let's Have a Coffee with the Standard Model of Particle Physics!
ERIC Educational Resources Information Center
Woithe, Julia; Wiener, Gerfried J.; Van der Veken, Frederik F.
2017-01-01
The Standard Model of particle physics is one of the most successful theories in physics and describes the fundamental interactions between elementary particles. It is encoded in a compact description, the so-called "Lagrangian," which even fits on t-shirts and coffee mugs. This mathematical formulation, however, is complex and only…
2014-01-01
Background: Identifying those patients who have a higher chance to be cured with fewer side effects by particle beam therapy than by state-of-the-art photon therapy is essential to guarantee a fair and sufficient access to specialized radiotherapy. The individualized identification requires initiatives by particle as well as non-particle radiotherapy centers to form networks, to establish procedures for the decision process, and to implement means for the remote exchange of relevant patient information. In this work, we want to contribute a practical concept that addresses these requirements.
Methods: We proposed a concept for individualized patient allocation to photon or particle beam therapy at a non-particle radiotherapy institution that is based on remote treatment plan comparison. We translated this concept into the web-based software tool ReCompare (REmote COMparison of PARticlE and photon treatment plans).
Results: We substantiated the feasibility of the proposed concept by demonstrating remote exchange of treatment plans between radiotherapy institutions and the direct comparison of photon and particle treatment plans in photon treatment planning systems. ReCompare worked with several tested standard treatment planning systems, ensured patient data protection, and integrated into the clinical workflow.
Conclusions: Our concept supports non-particle radiotherapy institutions with the patient-specific treatment decision on the optimal irradiation modality by providing expertise from a particle therapy center. The software tool ReCompare may help to improve and standardize this personalized treatment decision. It will be available from our website when proton therapy is operational at our facility. PMID:24548333
WalkThrough Example Procedures for MAMA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruggiero, Christy E.; Gaschen, Brian Keith; Bloch, Jeffrey Joseph
This documentation is a growing set of walk through examples of analyses using the MAMA V2.0 software. It does not cover all the features or possibilities with the MAMA software, but will address using many of the basic analysis tools to quantify particle size and shape in an image. This document will continue to evolve as additional procedures and examples are added. The starting assumption is that the MAMA software has been successfully installed.
On some physical and dynamical properties of microplastic particles in marine environment.
Chubarenko, I; Bagaev, A; Zobkov, M; Esiukova, E
2016-07-15
Simplified physical models and geometrical considerations reveal general physical and dynamical properties of microplastic particles (0.5-5mm) of different density, shape and size in marine environment. Windage of extremely light foamed particles, surface area and fouling rate of slightly positively buoyant microplastic spheres, films and fibres and settling velocities of negatively buoyant particles are analysed. For the Baltic Sea dimensions and under the considered idealised external conditions, (i) only one day is required for a foamed polystyrene particle to cross the sea (ca. 250km); (ii) polyethylene fibres should spend about 6-8months in the euphotic zone before sinking due to bio-fouling, whilst spherical particles can be retained on the surface up to 10-15years; (iii) for heavy microplastic particles, the time of settling through the water column in the central Gotland basin (ca. 250m) is less than 18h. Proper physical setting of the problem of microplastics transport and developing of physically-based parameterisations are seen as applications. Copyright © 2016 Elsevier Ltd. All rights reserved.
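As generic background to the settling-velocity estimates mentioned above, the sketch below evaluates the textbook Stokes settling velocity for a small negatively buoyant sphere. The density, viscosity and size values are assumptions for illustration, and the Reynolds-number check makes clear that Stokes drag is only marginally valid for millimetre-scale microplastics, so this is not the paper's own parameterisation.

    # Illustrative Stokes-regime settling velocity; valid only at low particle
    # Reynolds number, hence the explicit check below.
    def stokes_settling_velocity(diameter_m, rho_particle, rho_water=1007.0, mu=1.4e-3):
        g = 9.81
        return g * (rho_particle - rho_water) * diameter_m ** 2 / (18.0 * mu)

    d = 0.5e-3                                   # 0.5 mm particle, lower end of the studied range
    w = stokes_settling_velocity(d, 1380.0)      # PET-like density in kg/m^3 (assumed)
    reynolds = 1007.0 * w * d / 1.4e-3
    print(f"w = {w:.3f} m/s, Re = {reynolds:.1f} (Stokes law questionable if Re >> 1)")
    print(f"time to settle 250 m: {250.0 / w / 3600.0:.2f} h (lower bound; real drag is higher)")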
Studies of the Impact of Magnetic Field Uncertainties on Physics Parameters of the Mu2e Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradascio, Federica
The Mu2e experiment at Fermilab will search for a signature of charged lepton flavor violation, an effect far too small to be observed within the Standard Model of particle physics; its observation would therefore be a signal of new physics. The signature that Mu2e will search for is the ratio of the rate of neutrinoless coherent conversion of muons into electrons in the field of a nucleus, relative to the muon capture rate by the nucleus. The conversion process is an example of charged lepton flavor violation. This experiment aims at a sensitivity four orders of magnitude higher than previous related experiments. The desired sensitivity implies highly demanding requirements of accuracy in the design and conduct of the experiment. It is therefore important to investigate the tolerance of the experiment to instrumental uncertainties and provide specifications that the design and construction must meet. This is the core of the work reported in this thesis. The design of the experiment is based on three superconducting solenoid magnets. The most important uncertainties in the magnetic field of the solenoids can arise from misalignments of the Transport Solenoid, which transfers the beam from the muon production area to the detector area and eliminates beam-originating backgrounds. In this thesis, the field uncertainties induced by possible misalignments and their impact on the physics parameters of the experiment are examined. The physics parameters include the muon and pion stopping rates and the scattering of beam electrons off the capture target, which determine the signal, intrinsic background and late-arriving background yields, respectively. Additionally, a possible test of the Transport Solenoid alignment with low momentum electrons is examined, as an alternative option to measuring its field with conventional probes, which is technically difficult due to mechanical interference. Misalignments of the Transport Solenoid were simulated using standard magnetic field calculation tools. Particle transport was simulated using the Mu2e Offline software, which includes realistic models of particle interactions with materials in the full Mu2e geometry. The physics parameters were found tolerant within the precision requirements of the experiment for rigid-body type misalignments, which are the most dangerous, up to a maximum coil displacement of nearly 10 mm. With the appropriate choice of low momentum electron detector, the proposed Transport Solenoid test is found to be sensitive to such misalignments.
NASA Astrophysics Data System (ADS)
Morse, P. E.; Reading, A. M.; Lueg, C.
2014-12-01
Pattern-recognition in scientific data is not only a computational problem but a human-observer problem as well. Human observation of - and interaction with - data visualization software can augment, select, interrupt and modify computational routines and facilitate processes of pattern and significant feature recognition for subsequent human analysis, machine learning, expert and artificial intelligence systems. 'Tagger' is a Mac OS X interactive data visualisation tool that facilitates Human-Computer interaction for the recognition of patterns and significant structures. It is a graphical application developed using the Quartz Composer framework. 'Tagger' follows a Model-View-Controller (MVC) software architecture: the application problem domain (the model) is to facilitate novel ways of abstractly representing data to a human interlocutor, presenting these via different viewer modalities (e.g. chart representations, particle systems, parametric geometry) to the user (View) and enabling interaction with the data (Controller) via a variety of Human Interface Devices (HID). The software enables the user to create an arbitrary array of tags that may be appended to the visualised data, which are then saved into output files as forms of semantic metadata. Three fundamental problems that are not strongly supported by conventional scientific visualisation software are addressed: 1] How to visually animate data over time, 2] How to rapidly deploy unconventional parametrically driven data visualisations, 3] How to construct and explore novel interaction models that capture the activity of the end-user as semantic metadata that can be used to computationally enhance subsequent interrogation. Saved tagged data files may be loaded into Tagger, so that tags may be tagged, if desired. Recursion opens up the possibility of refining or overlapping different types of tags, tagging a variety of different POIs or types of events, and of capturing different types of specialist observations of important or noticeable events. Other visualisations and modes of interaction will also be demonstrated, with the aim of discovering knowledge in large datasets in the natural, physical sciences. [Figure 1 caption: Wave height data from an oceanographic Wave Rider Buoy; colors/radii are driven by wave height data.]
Development of Matlab GUI educational software to assist a laboratory of physical optics
NASA Astrophysics Data System (ADS)
Fernández, Elena; Fuentes, Rosa; García, Celia; Pascual, Inmaculada
2014-07-01
Physical optics is one of the subjects in the Degree in Optics and Optometry at Spanish universities. The students who enter this degree often have difficulties understanding subjects related to physics. For this reason, the aim of this work is to develop optics simulation software that provides a virtual laboratory for studying different physical optics phenomena. This software lets optics undergraduates simulate many optical systems for a better understanding of the practical competences associated with the theoretical concepts studied in class. This interactive environment unifies the information provided in the laboratory manual, provides visualization of the physical phenomena, and allows users to vary the values of the parameters involved to check their effect. This virtual tool is therefore the perfect complement to the practical work carried out in the laboratory. The software is developed using the tools that Matlab provides to generate Graphical User Interfaces (GUIs). A set of knobs, buttons and handles is included in the GUIs in order to control the parameters of the different physical phenomena. Graphics can also be inserted in the GUIs to show the behavior of such phenomena. Specifically, by using this software, the student is able to analyze the behavior of the transmittance and reflectance of the TE and TM modes, of polarized light through Malus's Law, and of the degree of polarization.
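As a minimal illustration of one of the phenomena mentioned above, Malus's Law (transmitted intensity I = I0 cos² θ for linearly polarized light through an analyser) can be tabulated in a few lines. This is a generic sketch and is not part of the Matlab GUI described in the abstract.

    import numpy as np

    # Malus's Law: I = I0 * cos^2(theta)
    I0 = 1.0
    for theta_deg in np.arange(0, 91, 15):
        I = I0 * np.cos(np.radians(theta_deg)) ** 2
        print(f"{theta_deg:3.0f} deg -> I/I0 = {I:.3f}")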
NASA Astrophysics Data System (ADS)
Khadilkar, Aditi B.
The utility of fluidized bed reactors for combustion and gasification can be enhanced if operational issues such as agglomeration are mitigated. The monetary and efficiency losses could be avoided through a mechanistic understanding of the agglomeration process and prediction of operational conditions that promote agglomeration. Pilot-scale experimentation prior to operation for each specific condition can be cumbersome and expensive. So the development of a mathematical model would aid predictions. With this motivation, the study comprised of the following model development stages- 1) development of an agglomeration modeling methodology based on binary particle collisions, 2) study of heterogeneities in ash chemical composition and gaseous atmosphere, 3) computation of a distribution of particle collision frequencies based on granular physics for a poly-disperse particle size distribution, 4) combining the ash chemistry and granular physics inputs to obtain agglomerate growth probabilities and 5) validation of the modeling methodology. The modeling methodology comprised of testing every binary particle collision in the system for sticking, based on the extent of dissipation of the particles' kinetic energy through viscous dissipation by slag-liquid (molten ash) covering the particles. In the modeling methodology developed in this study, thermodynamic equilibrium calculations are used to estimate the amount of slag-liquid in the system, and the changes in particle collision frequencies are accounted for by continuously tracking the number density of the various particle sizes. In this study, the heterogeneities in chemical composition of fuel ash were studied by separating the bulk fuel into particle classes that are rich in specific minerals. FactSage simulations were performed on two bituminous coals and an anthracite to understand the effect of particle-level heterogeneities on agglomeration. The mineral matter behavior of these constituent classes was studied. Each particle class undergoes distinct transformations of mineral matter at fluidized bed operating temperatures, as determined by using high temperature X-ray diffraction, thermo-mechanical analysis and scanning electron microscopy with energy dispersive X-ray spectroscopy (SEM-EDX). For the incorporation of a particle size distribution, bottom ash from an operating plant was divided into four size intervals and the system granular temperatures and dynamic bed height were computed using MFIX, a CFD simulation software. The kinetic theory of granular flow was used to obtain a distribution of binary collision frequencies for the entire particle size distribution. With this distribution of collision frequencies, which is computed based on hydrodynamics and granular physics of the poly-disperse system, as the particles grow, defluidize and decrease in number, the collision frequency also decreases. Under the conditions studied, the growth rate in the latter half of the run decreased to almost 1/5th the initial rate, with this decrease in collision frequency. This interdependent effect of chemistry and physics-based parameters, at the particle-level, was used to predict the agglomerate growth probabilities of Pittsburgh No. 8, Illinois No. 6 and Skidmore anthracite coals in this study, to illustrate the utility of the modeling methodology. The study also showed that agglomerate growth probability significantly increased above 15 to 20 wt. % slag. It was limited by ash chemistry at levels below this amount. 
Ash agglomerates were generated in a laboratory-scale fluidized bed combustor at Penn State to support the proposed agglomerate growth mechanism. This study also attempted to gain a mechanistic understanding of agglomerate growth with particle-level initiation occurring at the relatively low operating temperatures of about 950 °C, found in some fluidized beds. The results of this study indicated that, for the materials examined, agglomerate growth in fluidized bed combustors and gasifiers is initiated at the particle-level by low-melting components rich in iron- and calcium-based minerals. Although the bulk ash chemical composition does not indicate potential for agglomeration, study of particle-level heterogeneities revealed that agglomeration can begin at lower temperatures than the fluidized bed operating temperatures of 850 °C. After initiation at the particle-level, more slag is observed to form from alumino-silicate components at about 50 to 100 °C higher temperatures caused by changes in the system, and agglomerate growth propagates in the bed. A post-mortem study of ash agglomerates using SEM-EDX helped to identify stages of agglomerate growth. Additionally, the modeling methodology developed was used to simulate agglomerate growth in a laboratory-scale fluidized bed combustor firing palm shells (biomass), reported in the literature. A comparison of the defluidization time obtained by simulations to the experimental values reported in the case-study was made for the different operating conditions studied. This indicated that although the simulation results were comparable to those reported in the case study, modifications such as inclusion of heat transfer calculations to determine particle temperature resulting from carbon conversion would improve the predictive capabilities. (Abstract shortened by ProQuest.).
A parameterization of nuclear track profiles in CR-39 detector
NASA Astrophysics Data System (ADS)
Azooz, A. A.; Al-Nia'emi, S. H.; Al-Jubbori, M. A.
2012-11-01
In this work, the empirical parameterization describing the alpha particles’ track depth in CR-39 detectors is extended to describe longitudinal track profiles against etching time for protons and alpha particles. MATLAB based software is developed for this purpose. The software calculates and plots the depth, diameter, range, residual range, saturation time, and etch rate versus etching time. The software predictions are compared with other experimental data and with results of calculations using the original software, TRACK_TEST, developed for alpha track calculations. The software related to this work is freely downloadable and performs calculations for protons in addition to alpha particles.
Program summary
Program title: CR39
Catalog identifier: AENA_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AENA_v1_0.html
Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
Licensing provisions: Copyright (c) 2011, Aasim Azooz. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: • Redistributions of source code must retain the above copyright, this list of conditions and the following disclaimer. • Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. This software is provided by the copyright holders and contributors “as is” and any express or implied warranties, including, but not limited to, the implied warranties of merchantability and fitness for a particular purpose are disclaimed. In no event shall the copyright owner or contributors be liable for any direct, indirect, incidental, special, exemplary, or consequential damages (including, but not limited to, procurement of substitute goods or services; loss of use, data, or profits; or business interruption) however caused and on any theory of liability, whether in contract, strict liability, or tort (including negligence or otherwise) arising in any way out of the use of this software, even if advised of the possibility of such damage.
No. of lines in distributed program, including test data, etc.: 15598
No. of bytes in distributed program, including test data, etc.: 3933244
Distribution format: tar.gz
Programming language: MATLAB.
Computer: Any Desktop or Laptop.
Operating system: Windows 1998 or above (with MATLAB R13 or above installed).
RAM: 512 Megabytes or higher
Classification: 17.5.
Nature of problem: A new semispherical parameterization of charged particle tracks in CR-39 SSNTD is carried out in a previous paper. This parameterization is developed here into a MATLAB based software to calculate the track length and track profile for any proton or alpha particle energy or etching time. This software is intended to compete with the TRACK_TEST [1] and TRACK_VISION [2] software currently in use by all people working in the field of SSNTD.
Solution method: Based on fitting of experimental results of protons and alpha particles track lengths for various energies and etching times to a new semispherical formula with four free fitting parameters, the best set of energy independent parameters were found.
These parameters are introduced into the software, and the software is programmed to solve the set of equations to calculate the track depth; the track etching rate as a function of both time and residual range for particles of normal and oblique incidence; the track longitudinal profile at both normal and oblique incidence; and the three-dimensional track profile at normal incidence.
Running time: 1-8 s on a Pentium 4 2 GHz CPU with 3 GB of RAM, depending on the etching time value.
References:
[1] ADWT_v1_0 TRACK_TEST: Computer program TRACK_TEST for calculating parameters and plotting profiles for etch pits in nuclear track materials. D. Nikezic, K.N. Yu, Comput. Phys. Commun. 174 (2006) 160.
[2] AEAF_v1_0 TRACK_VISION: Computer program TRACK_VISION for simulating optical appearance of etched tracks in CR-39 nuclear track detectors. D. Nikezic, K.N. Yu, Comput. Phys. Commun. 178 (2008) 591.
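For readers unfamiliar with etched-track geometry, the sketch below uses the standard constant-etch-rate conical track model (bulk etch rate V_b, track etch rate V_t) to estimate track depth and surface opening diameter versus etching time. The constant-rate assumption and the numerical values are illustrative only; they are not the four-parameter semispherical fit described in the abstract above.

    import math

    def track_geometry(t_hours, v_b_um_per_h=1.2, v_ratio=1.8):
        # Constant-rate conical track model with illustrative CR-39-like values:
        # v_b is the bulk etch rate, v_t = v_ratio * v_b is the etch rate along the particle path.
        v_t = v_ratio * v_b_um_per_h
        h = v_b_um_per_h * t_hours               # bulk layer removed from the surface (um)
        depth = (v_t - v_b_um_per_h) * t_hours   # track depth below the etched surface (um)
        diameter = 2.0 * h * math.sqrt((v_ratio - 1.0) / (v_ratio + 1.0))  # opening diameter (um)
        return depth, diameter

    for t in (2, 4, 6, 8):
        d, dia = track_geometry(t)
        print(f"{t} h etch: depth ~ {d:.1f} um, opening diameter ~ {dia:.1f} um")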
PEOPLE IN PHYSICS: Interview with Scott Durow, Software Engineer, Oxford
NASA Astrophysics Data System (ADS)
Burton, Conducted by Paul
1998-05-01
Scott Durow was educated at Bootham School, York. He studied Physics, Mathematics and Chemistry to A-level and went on to Nottingham University to read Medical Physics. After graduating from Nottingham he embarked on his present career as a Software Engineer based in Oxford. In his spare time he is a musician, playing in a band and playing the French horn.
Rudolph, Sabrina; Göring, Arne; Padrok, Dennis
2018-01-03
Sports and physical activity interventions are attracting considerable attention in the context of workplace health promotion. Due to increasing digitalization, software-based interventions that promote physical activity in particular are gaining acceptance in practice. Empirical evidence concerning the efficiency of software-based interventions in the context of workplace health promotion is so far rather limited. This paper examines in what way software-based interventions are more efficient than person-based interventions in terms of increasing the level of physical activity. A systematic review according to the specifications of the Cochrane Collaboration was conducted. Inclusion criteria and should-have criteria were defined, and by means of the should-have criteria a quality score for each study was calculated. The software-based and person-based interventions are presented in 2 tables with the categories author, year, country, sample group, aim of the intervention, methods, outcome and study quality. A total of 25 studies are included in the evaluation (12 person-based and 13 software-based interventions). The quality scores of the studies are heterogeneous and range from 3 to 9 points. 5 person-based and 5 software-based studies achieved an increase in physical activity. Other positive effects on health were reported in the studies, for example a reduction in blood pressure or body-mass index. A few studies did not show any improvement in health-related parameters. This paper demonstrates that positive effects can be achieved with both intervention types. Software-based interventions show advantages due to the use of new technologies: desktop or mobile applications facilitate organization, communication and data acquisition with fewer resources needed. A trained instructor, on the other hand, is able to respond to the specific and varying needs of employees, an aspect that should be considered very significant. © Georg Thieme Verlag KG Stuttgart · New York.
Analysis of particle size to erosion wear of sliding sleeve ball seat based on fluent software
NASA Astrophysics Data System (ADS)
Ding, Kun; Yin, Hongcheng; Wan, Bingqian; Cheng, Hao; Xiang, Lu; Li, Jianmin
2017-04-01
Hydraulic fracturing has become the principal stimulation treatment for low-permeability reservoirs. However, as the pumping displacement and sand volume used in extended-reach horizontal wells have increased continuously, erosion wear of the ball seat of the ball-drop sliding sleeve has become increasingly serious, which may lead to failure to open the sliding sleeve. In the existing literature there has been much research on the erosion wear of liquid-solid two-phase flow in sudden-expansion pipes, but the influence of solid particles with mixed particle sizes on erosion wear has not been considered. This paper studies the erosion wear of the ball seat for mixed proppants with different particle sizes and carries out numerical simulations with the Fluent software using the Eulerian two-fluid theory. The results show that the erosion wear rate of the ball seat is inversely proportional to the particle size of the proppant; the erosion wear rate of the ball seat differs as the volume fractions of proppant with different particle sizes are changed; and for a mixed proppant with particle sizes of 0.3 mm and 0.8 mm, the erosion wear rate of the ball seat is minimum when the volume fraction of the 0.3 mm proppant is about 20%. The simulation results contribute to further study of the erosion wear behavior of solid particles and provide a reference for the selection of staged-fracturing materials for horizontal wells.
Earth-moon system: Dynamics and parameter estimation
NASA Technical Reports Server (NTRS)
Breedlove, W. J., Jr.
1975-01-01
A theoretical development of the equations of motion governing the earth-moon system is presented. The earth and moon were treated as finite rigid bodies and a mutual potential was utilized. The sun and remaining planets were treated as particles. Relativistic, non-rigid, and dissipative effects were not included. The translational and rotational motion of the earth and moon were derived in a fully coupled set of equations. Euler parameters were used to model the rotational motions. The mathematical model is intended for use with data analysis software to estimate physical parameters of the earth-moon system using primarily LURE type data. Two program listings are included. Program ANEAMO computes the translational/rotational motion of the earth and moon from analytical solutions. Program RIGEM numerically integrates the fully coupled motions as described above.
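As background to the Euler-parameter (quaternion) description of rotational motion mentioned above, the sketch below integrates the standard kinematic differential equation q_dot = ½ Ω(ω) q for a constant body spin. It is a generic textbook relation, not code from ANEAMO or RIGEM, and the spin rate and step size are illustrative assumptions.

    import numpy as np

    def euler_parameter_rates(q, omega):
        # Kinematic differential equation q_dot = 0.5 * Omega(omega) * q for the
        # Euler parameters q = (q0, q1, q2, q3); omega is the body angular rate (rad/s).
        wx, wy, wz = omega
        Omega = np.array([[0.0, -wx, -wy, -wz],
                          [wx,  0.0,  wz, -wy],
                          [wy, -wz,  0.0,  wx],
                          [wz,  wy, -wx,  0.0]])
        return 0.5 * Omega @ q

    # Simple fixed-step integration of a constant spin about the body z-axis.
    q = np.array([1.0, 0.0, 0.0, 0.0])                    # identity attitude
    omega = np.array([0.0, 0.0, 2.0 * np.pi / 86400.0])   # roughly one revolution per day
    dt = 60.0                                             # step size in seconds
    for _ in range(1440):                                 # one day of propagation
        q = q + dt * euler_parameter_rates(q, omega)
        q /= np.linalg.norm(q)                            # re-normalize to unit length
    print(q)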
NASA Astrophysics Data System (ADS)
Dyomin, V. V.; Polovtsev, I. G.; Davydova, A. Yu.
2018-03-01
The physical principles of a method for determining the geometrical characteristics of particles and for particle recognition are reported; the method is based on digital holography, followed by processing of the particle images reconstructed from the digital hologram using a morphological parameter. An example of the application of this method to fast plankton-particle recognition is given.
NASA Astrophysics Data System (ADS)
Turner, Michael Lloyd
Research on student difficulties in learning physics concepts has been coincident with a general reform movement in science education with the aim of increasing the level of inquiry in the teaching and learning of science. Coincident with these efforts has been a dramatic increase in the offering of online courses. Generally, this movement toward online course offerings has taken place without the inclusion of laboratory science offerings. The Learn Anytime Anywhere Physics (LAAPhysics) program for asynchronous online introductory physics learning is a notable exception. LAAPhysics software attempts to implement the principles of reformed science teaching and learning in an online environment. The purpose of this study was to measure how student cognition of physics concepts in kinematics was affected by use of the LAAPhysics online kinematics tutorials. The normalized gains between pre-instruction and post-instruction scores on the Test of Understanding Graphs in Kinematics (TUG-K) for a treatment group of LAAPhysics testers were calculated. These normalized gains were compared to normalized gains typically found for students taking face-to-face physics courses. The normalized gain scores for LAAPhysics testers were also tested for correlation against time-on-task variables as measured by connectivity to the online software. Finally, a content analysis of student responses recorded in the LAAPhysics software was conducted. Normalized gain scores for LAAPhysics testers were not found to be greater than gain scores typically found in face-to-face courses. The number of student connections to the software and their total time working in the software were found to be significantly related to normalized gain on the TUG-K. The content analysis of student responses in the LAAPhysics software revealed variation in initial understanding of physics concepts in kinematics as well as variation in change in understanding across students.
Final Technical Report for "High Energy Physics at The University of Iowa"
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mallik, Usha; Meurice, Yannick; Nachtman, Jane
2013-07-31
Particle Physics explores the very fundamental building blocks of our universe: the nature of forces, of space and time. By exploring very energetic collisions of sub-nuclear particles with sophisticated detectors at colliding-beam accelerators (as well as others), experimental particle physicists have established the current theory known as the Standard Model (SM), one of several theoretical postulates to explain our everyday world. It explains all phenomena known up to a very small fraction of a second after the Big Bang to high precision; the Higgs boson, discovered recently, was the last of the particles predicted by the SM. However, many other phenomena, like the existence of dark energy and dark matter, the absence of antimatter, the parameters in the SM, neutrino masses, etc., are not explained by the SM. So, in order to find out what lies beyond the SM, i.e., what conditions at the earliest fractions of the first second of the universe gave rise to the SM, we constructed the Large Hadron Collider (LHC) at CERN after the Tevatron collider at Fermi National Accelerator Laboratory. Each of these projects helped us push the boundary further with new insights as we explore a yet higher energy regime. The experiments are extremely complex, and as we push the boundaries of our existing knowledge, it also requires pushing the boundaries of our technical knowhow. So, not only do we pursue humankind’s most basic intellectual pursuit of knowledge, we help develop technology that benefits today’s highly technical society. Our trained Ph.D. students become experts at fast computing, manipulation of large data volumes and databases, developing cloud computing, fast electronics, advanced detector developments, and complex interfaces in several of these areas. Many particle physics Ph.D.s build their careers at various technology and computing facilities; even financial institutions make use of their simulation and statistical skills. Additionally, last but not least, today’s discoveries make for tomorrow’s practical improvements to our way of life; cases in point are internet technology, fiber optics, and many such things. At The University of Iowa we are involved in the LHC experiments, ATLAS and CMS, building equipment, performing calibration and maintenance, supporting the infrastructure in hardware, software and analysis, as well as participating in various aspects of data analyses. Our theory group works on fundamentals of field theories and on exploration of non-accelerator high-energy neutrinos and possible dark matter searches.
Developing a Virtual Physics World
ERIC Educational Resources Information Center
Wegener, Margaret; McIntyre, Timothy J.; McGrath, Dominic; Savage, Craig M.; Williamson, Michael
2012-01-01
In this article, the successful implementation of a development cycle for a physics teaching package based on game-like virtual reality software is reported. The cycle involved several iterations of evaluating students' use of the package followed by instructional and software development. The evaluation used a variety of techniques, including…
HEP Community White Paper on Software Trigger and Event Reconstruction: Executive Summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Albrecht, Johannes; et al.
Realizing the physics programs of the planned and upgraded high-energy physics (HEP) experiments over the next 10 years will require the HEP community to address a number of challenges in the area of software and computing. For this reason, the HEP software community has engaged in a planning process over the past two years, with the objective of identifying and prioritizing the research and development required to enable the next generation of HEP detectors to fulfill their full physics potential. The aim is to produce a Community White Paper which will describe the community strategy and a roadmap for software and computing research and development in HEP for the 2020s. The topics of event reconstruction and software triggers were considered by a joint working group and are summarized together in this document.
HEP Community White Paper on Software Trigger and Event Reconstruction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Albrecht, Johannes; et al.
Realizing the physics programs of the planned and upgraded high-energy physics (HEP) experiments over the next 10 years will require the HEP community to address a number of challenges in the area of software and computing. For this reason, the HEP software community has engaged in a planning process over the past two years, with the objective of identifying and prioritizing the research and development required to enable the next generation of HEP detectors to fulfill their full physics potential. The aim is to produce a Community White Paper which will describe the community strategy and a roadmap for software and computing research and development in HEP for the 2020s. The topics of event reconstruction and software triggers were considered by a joint working group and are summarized together in this document.
Teaching Radiology Physics Interactively with Scientific Notebook Software.
Richardson, Michael L; Amini, Behrang
2018-06-01
The goal of this study is to demonstrate how the teaching of radiology physics can be enhanced with the use of interactive scientific notebook software. We used the scientific notebook software known as Project Jupyter, which is free, open-source, and available for the Macintosh, Windows, and Linux operating systems. We have created a scientific notebook that demonstrates multiple interactive teaching modules we have written for our residents using the Jupyter notebook system. Scientific notebook software allows educators to create teaching modules in a form that combines text, graphics, images, data, interactive calculations, and image analysis within a single document. These notebooks can be used to build interactive teaching modules, which can help explain complex topics in imaging physics to residents. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
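As an example of the kind of interactive module described above (the specific content of the authors' notebooks is not reproduced here), a Jupyter cell can couple a simple imaging-physics calculation to a slider. The attenuation example, coefficient range, and widget choice below are illustrative assumptions; it presumes the ipywidgets and matplotlib packages are installed alongside Jupyter.

    # Run inside a Jupyter notebook cell.
    import numpy as np
    import matplotlib.pyplot as plt
    from ipywidgets import interact

    def beam_attenuation(mu_per_cm=0.2):
        # Plot exponential attenuation I/I0 = exp(-mu * x); the slider lets
        # students explore the effect of the linear attenuation coefficient.
        x = np.linspace(0.0, 20.0, 200)  # absorber thickness in cm
        plt.plot(x, np.exp(-mu_per_cm * x))
        plt.xlabel("absorber thickness (cm)")
        plt.ylabel("relative transmitted intensity I/I0")
        plt.ylim(0, 1)
        plt.show()

    interact(beam_attenuation, mu_per_cm=(0.05, 1.0, 0.05))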
Free software for performing physical analysis of systems for digital radiography and mammography.
Donini, Bruno; Rivetti, Stefano; Lanconelli, Nico; Bertolini, Marco
2014-05-01
In this paper, the authors present free software for assisting users in the physical characterization of x-ray digital systems and in image quality checks. The program was developed as a plugin for the well-known public-domain suite ImageJ. The software can assist users in calculating various physical parameters such as the response curve (also termed signal transfer property), modulation transfer function (MTF), noise power spectra (NPS), and detective quantum efficiency (DQE). It also includes the computation of some image quality checks: defective pixel analysis, uniformity, dark analysis, and lag. The software was made available in 2009 and has been used during the last couple of years by many users who gave us valuable feedback for improving its usability. It was tested in the physical characterization of several clinical systems for digital radiography and mammography. Various published papers made use of the outcomes of the plugin. This software is potentially beneficial to a variety of users: physicists working in hospitals, and staff working in radiological departments, such as medical physicists, physicians, and engineers. The plugin, together with a brief user manual, is freely available and can be found online (www.medphys.it/downloads.htm). With our plugin, users can estimate all three of the most important parameters used for physical characterization (MTF, NPS, and DQE). The plugin can run on any operating system equipped with the ImageJ suite. The authors validated the software by comparing MTF and NPS curves on a common set of images with those obtained with other dedicated programs, achieving very good agreement.
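For readers unfamiliar with the quantities mentioned, the sketch below shows one common, simplified way to estimate a two-dimensional noise power spectrum from half-overlapping ROIs of a flat-field image (average of |FFT|² of mean-subtracted blocks, normalised by pixel area over ROI size). The plugin's exact algorithm, detrending, and normalisation may differ; ROI size and pixel pitch are illustrative assumptions.

    import numpy as np

    def nps_2d(flat_image, roi=128, pixel_mm=0.1):
        # Average |FFT|^2 of mean-subtracted ROIs from a flat-field image,
        # normalised by pixel area / ROI size (a simplified NPS estimate).
        rois = []
        ny, nx = flat_image.shape
        for y in range(0, ny - roi + 1, roi // 2):       # half-overlapping ROIs
            for x in range(0, nx - roi + 1, roi // 2):
                block = flat_image[y:y + roi, x:x + roi].astype(float)
                rois.append(block - block.mean())        # local mean removal only (no 2D detrending)
        spectra = [np.abs(np.fft.fft2(b)) ** 2 for b in rois]
        return (pixel_mm * pixel_mm / (roi * roi)) * np.mean(spectra, axis=0)

    # Example with synthetic white noise: the NPS of uncorrelated noise is flat.
    rng = np.random.default_rng(1)
    img = rng.normal(1000.0, 10.0, size=(512, 512))
    nps = nps_2d(img)
    print(nps.shape, nps.mean())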
The GENIE Neutrino Monte Carlo Generator: Physics and User Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andreopoulos, Costas; Barry, Christopher; Dytman, Steve
2015-10-20
GENIE is a suite of products for the experimental neutrino physics community. This suite includes i) a modern software framework for implementing neutrino event generators, a state-of-the-art comprehensive physics model and tools to support neutrino interaction simulation for realistic experimental setups (the Generator product), ii) extensive archives of neutrino, charged-lepton and hadron scattering data and software to produce a comprehensive set of data/MC comparisons (the Comparisons product), and iii) a generator tuning framework and fitting applications (the Tuning product). This book provides the definitive guide for the GENIE Generator: it presents the software architecture and a detailed description of its physics model and official tunes. In addition, it provides a rich set of data/MC comparisons that characterise the physics performance of GENIE. Detailed step-by-step instructions on how to install and configure the Generator, run its applications and analyze its outputs are also included.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rahman, Nur Aira Abd, E-mail: nur-aira@nuclearmalaysia.gov.my; Yussup, Nolida; Ibrahim, Maslina Bt. Mohd
2015-04-29
DAQ (data acquisition) software called RPTv2.0 has been developed for the Radioactive Particle Tracking System at the Malaysian Nuclear Agency. RPTv2.0 features a scanning-control GUI, data acquisition from a 12-channel counter via an RS-232 interface, and a multichannel analyzer (MCA). The software is fully developed on the National Instruments LabVIEW 8.6 platform. A Ludlum Model 4612 counter is used to count the signals from the scintillation detectors, while a host computer is used to send control parameters, acquire and display data, and compute results. Each detector channel has independent high-voltage control, threshold or sensitivity value, and window settings. The counter is configured with a host board and twelve slave boards. The host board collects the counts from each slave board and communicates with the computer via the RS-232 data interface.
Data processing in Software-type Wave-Particle Interaction Analyzer onboard the Arase satellite
NASA Astrophysics Data System (ADS)
Hikishima, Mitsuru; Kojima, Hirotsugu; Katoh, Yuto; Kasahara, Yoshiya; Kasahara, Satoshi; Mitani, Takefumi; Higashio, Nana; Matsuoka, Ayako; Miyoshi, Yoshizumi; Asamura, Kazushi; Takashima, Takeshi; Yokota, Shoichiro; Kitahara, Masahiro; Matsuda, Shoya
2018-05-01
The software-type wave-particle interaction analyzer (S-WPIA) is an instrument package onboard the Arase satellite, which studies the magnetosphere. The S-WPIA represents a new method for directly observing wave-particle interactions onboard a spacecraft in a space plasma environment. The main objective of the S-WPIA is to quantitatively detect wave-particle interactions associated with whistler-mode chorus emissions and electrons over a wide energy range (from several keV to several MeV). The quantity of energy exchanged between waves and particles can be represented as the inner product of the wave electric-field vector and the particle velocity vector. The S-WPIA requires accurate measurement of the phase difference between wave and particle gyration. The leading edge of the S-WPIA system allows us to collect comprehensive information, including the detection time, energy, and incoming direction of individual particles and instantaneous-wave electric and magnetic fields, at a high sampling rate. All the collected particle and waveform data are stored in the onboard large-volume data storage. The S-WPIA executes calculations asynchronously using the collected electric and magnetic wave data, data acquired from multiple particle instruments, and ambient magnetic-field data. The S-WPIA has the role of handling large amounts of raw data that are dedicated to calculations of the S-WPIA. Then, the results are transferred to the ground station. This paper describes the design of the S-WPIA and its calculations in detail, as implemented onboard Arase.
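A highly simplified sketch of the central S-WPIA quantity, the accumulated inner product of the wave electric field and the particle velocity over particle detections, might look like the following. The array names, amplitudes, and the non-relativistic treatment are simplifications for illustration and do not reproduce the onboard implementation.

    import numpy as np

    Q_ELECTRON = -1.602176634e-19  # electron charge, C

    def accumulated_energy_exchange(E_wave, v_particles):
        # E_wave: (N, 3) wave electric-field vectors (V/m) interpolated to each
        # particle detection time; v_particles: (N, 3) particle velocities (m/s).
        # Returns the summed q * E . v over all detections; its sign indicates the
        # direction of net energy transfer between the particles and the wave.
        dots = np.einsum("ij,ij->i", E_wave, v_particles)
        return Q_ELECTRON * dots.sum()

    # Tiny synthetic example: 5 detections with random phases.
    rng = np.random.default_rng(2)
    E = rng.normal(0.0, 1e-3, size=(5, 3))   # ~mV/m chorus-like amplitudes (assumed)
    v = rng.normal(0.0, 1e7, size=(5, 3))    # ~keV-range electron speeds (assumed)
    print(accumulated_energy_exchange(E, v))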
NASA Astrophysics Data System (ADS)
2002-03-01
UK Awards: Teacher of Physics Awards
Institute Matters: Institute of Physics Education Conference
UK Awards: Top SHAP students win prizes
Competition: International creative essay competition
UK Awards: Kelvin Medal
Particle Physics Resources: New poster from PPARC
Australia: Physics Students' Day at Adventure World
UK Awards: Bragg Medal winners in a FLAP
ASE Annual Meeting: Particle Physics at ASE 2002
UK Grants: PPARC Awards
AAPT Winter Meeting: Physics First - but do you need maths?
UK In-Service Training: The Particle Physics Institutes for A-level teachers
Physics on Stage 2: Not too entertaining this time, please!
Scotland: A reasoned approach wins reasonable funding
Institute Matters: New education manager
Germany: Physics gets real: curriculum change for better teaching
Research Frontiers: Let there be light - if you hang on a minute
On the modeling of the 2010 Gulf of Mexico Oil Spill
NASA Astrophysics Data System (ADS)
Mariano, A. J.; Kourafalou, V. H.; Srinivasan, A.; Kang, H.; Halliwell, G. R.; Ryan, E. H.; Roffer, M.
2011-09-01
Two oil particle trajectory forecasting systems were developed and applied to the 2010 Deepwater Horizon Oil Spill in the Gulf of Mexico. Both systems use ocean current fields from high-resolution numerical ocean circulation model simulations, Lagrangian stochastic models to represent unresolved sub-grid scale variability in advecting oil particles, and Monte Carlo-based schemes for representing uncertain biochemical and physical processes. The first system assumes two-dimensional particle motion at the ocean surface, treats the oil as being in one state, and models particle removal as a Monte Carlo process parameterized by a single removal rate. Oil particles are seeded using both initial conditions based on observations and particles released at the location of the Macondo well. The initial conditions (ICs) of oil particle location for the two-dimensional surface oil trajectory forecasts are based on a fusion of all available information, including satellite-based analyses. The resulting oil map is digitized into a shape file within which polygon-filling software generates longitudes and latitudes with variable particle density depending on the amount of oil present in the observations for the IC. The more complex system assumes three states for the oil (light, medium, heavy), each with a different removal rate in the Monte Carlo process, three-dimensional particle motion, and a particle-size-dependent oil mixing model. Simulations from the two-dimensional forecast system produced results that qualitatively agreed with the uncertain "truth" fields. These simulations validated the use of our Monte Carlo scheme for representing oil removal by evaporation and other weathering processes. Eulerian velocity fields for predicting particle motion from data-assimilative models produced better particle trajectory distributions than a free-running model with no data assimilation. Monte Carlo simulations of the three-dimensional oil particle trajectories were performed, with ensembles generated by perturbing the size of the oil particles and the fraction in a given size range released at depth, the two largest unknowns in this problem. 36 realizations of the model were run with only subsurface oil releases. An average of these results indicates that after three months about 25% of the oil remains in the water column and that most of the oil is below 800 m.
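The particle step of the two-dimensional system described above can be caricatured as advection by a model velocity plus a random-walk term, with weathering handled as a Monte Carlo removal draw. The velocity, diffusivity, removal rate, seeding location and particle count in the sketch below are placeholders, not the values used in the study.

    import numpy as np

    rng = np.random.default_rng(3)

    def step_particles(lon, lat, active, u, v, dt=3600.0, K=10.0, removal_rate=0.01 / 86400.0):
        # Advect surface oil particles by the model currents (u, v in m/s), add a
        # random-walk term with diffusivity K (m^2/s), and remove particles by a
        # Monte Carlo draw against a bulk weathering/removal rate (1/s).
        meters_per_deg = 111e3
        sigma = np.sqrt(2.0 * K * dt)
        lon = lon + (u * dt + rng.normal(0.0, sigma, lon.shape)) / (meters_per_deg * np.cos(np.radians(lat)))
        lat = lat + (v * dt + rng.normal(0.0, sigma, lat.shape)) / meters_per_deg
        active = active & (rng.random(lon.shape) > removal_rate * dt)
        return lon, lat, active

    # Seed a small patch near the approximate well location and run for ten days.
    n = 5000
    lon = rng.normal(-88.37, 0.05, n)
    lat = rng.normal(28.74, 0.05, n)
    active = np.ones(n, dtype=bool)
    for _ in range(240):  # 240 hourly steps
        lon, lat, active = step_particles(lon, lat, active, u=0.1, v=0.05)
    print(active.mean(), "fraction of particles still at the surface")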
Talking Back: Weapons, Warfare, and Feedback
2010-04-01
The International Committee for Future Accelerators (ICFA): 1976 to the present
Rubinstein, Roy
2016-12-14
The International Committee for Future Accelerators (ICFA) has been in existence now for four decades. It plays an important role in allowing discussions by the world particle physics community on the status and future of very large particle accelerators and the particle physics and related fields associated with them. Here, this paper gives some indication of what ICFA is and does, and also describes its involvement in some of the more important developments in the particle physics field since its founding.
Single Aerosol Particle Studies Using Optical Trapping Raman And Cavity Ringdown Spectroscopy
NASA Astrophysics Data System (ADS)
Gong, Z.; Wang, C.; Pan, Y. L.; Videen, G.
2017-12-01
Due to the physical and chemical complexity of aerosol particles and the interdisciplinary nature of aerosol science that involves physics, chemistry, and biology, our knowledge of aerosol particles is rather incomplete; our current understanding of aerosol particles is limited by averaged (over size, composition, shape, and orientation) and/or ensemble (over time, size, and multi-particles) measurements. Physically, single aerosol particles are the fundamental units of any large aerosol ensembles. Chemically, single aerosol particles carry individual chemical components (properties and constituents) in particle ensemble processes. Therefore, the study of single aerosol particles can bridge the gap between aerosol ensembles and bulk/surface properties and provide a hierarchical progression from a simple benchmark single-component system to a mixed-phase multicomponent system. A single aerosol particle can be an effective reactor to study heterogeneous surface chemistry in multiple phases. Latest technological advances provide exciting new opportunities to study single aerosol particles and to further develop single aerosol particle instrumentation. We present updates on our recent studies of single aerosol particles optically trapped in air using the optical-trapping Raman and cavity ringdown spectroscopy.
Georges Charpak, Particle Detectors, and Multiwire Chambers
Probing the frontiers of particle physics with tabletop-scale experiments.
DeMille, David; Doyle, John M; Sushkov, Alexander O
2017-09-08
The field of particle physics is in a peculiar state. The standard model of particle theory successfully describes every fundamental particle and force observed in laboratories, yet fails to explain properties of the universe such as the existence of dark matter, the amount of dark energy, and the preponderance of matter over antimatter. Huge experiments, of increasing scale and cost, continue to search for new particles and forces that might explain these phenomena. However, these frontiers also are explored in certain smaller, laboratory-scale "tabletop" experiments. This approach uses precision measurement techniques and devices from atomic, quantum, and condensed-matter physics to detect tiny signals due to new particles or forces. Discoveries in fundamental physics may well come first from small-scale experiments of this type. Copyright © 2017 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.
NASA Astrophysics Data System (ADS)
Uzunoglu, B.; Hussaini, Y.
2017-12-01
The implicit particle filter is a sequential Monte Carlo method for data assimilation that guides the particles to the high-probability region of the posterior by an implicit step. It optimizes a nonlinear cost function that can be inherited from legacy assimilation routines. Dynamic state estimation for near-real-time applications in power systems is becoming increasingly important with the integration of variable wind and solar power generation. New state estimation tools that will replace the older generation should provide a general mathematical framework that can accommodate legacy software, address nonlinearity and non-normal behaviour, and still allow the power industry the cautious, evolutionary change it prefers over a complete revolutionary one. This work implements the implicit particle filter as a state estimation tool for a power system and presents the first application of the implicit particle filter to power system state estimation. The filter is introduced into power systems and simulations are presented for a three-node benchmark power system. The performance of the filter on this problem is analyzed and the results are presented.
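To make the filtering step concrete, the sketch below implements a generic bootstrap particle filter on a toy one-dimensional state, i.e. the predict/weight/resample structure that the implicit particle filter builds on. It is not the implicit filter of the paper, whose key step replaces blind sampling with a per-particle minimization of a nonlinear cost function, and the dynamics, measurement model, and noise levels are invented placeholders.

```python
import numpy as np

# Minimal bootstrap particle filter sketch for a toy 1-state system.
# NOT the implicit particle filter from the paper: the implicit variant
# replaces the blind prediction step with a per-particle minimization of
# a nonlinear cost function. This only shows the generic structure.

rng = np.random.default_rng(0)

def dynamics(x):
    # Hypothetical nonlinear state transition (stand-in for power-system dynamics).
    return 0.9 * x + 0.2 * np.sin(x)

def observe(x):
    # Hypothetical nonlinear measurement function.
    return x**2 / 5.0

n_particles, n_steps = 500, 50
q_std, r_std = 0.1, 0.2          # process / measurement noise standard deviations

# Simulate a "true" trajectory and noisy measurements.
x_true, y_meas = np.zeros(n_steps), np.zeros(n_steps)
x = 1.0
for k in range(n_steps):
    x = dynamics(x) + rng.normal(0.0, q_std)
    x_true[k] = x
    y_meas[k] = observe(x) + rng.normal(0.0, r_std)

# Filter loop: predict, weight, estimate, resample.
particles = rng.normal(1.0, 0.5, n_particles)
estimates = np.zeros(n_steps)
for k in range(n_steps):
    particles = dynamics(particles) + rng.normal(0.0, q_std, n_particles)
    w = np.exp(-0.5 * ((y_meas[k] - observe(particles)) / r_std) ** 2)
    w /= w.sum()
    estimates[k] = np.sum(w * particles)
    idx = rng.choice(n_particles, n_particles, p=w)   # resample toward high probability
    particles = particles[idx]

print("RMS estimation error:", np.sqrt(np.mean((estimates - x_true) ** 2)))
```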
Discrete Element Modeling of Triboelectrically Charged Particles
NASA Technical Reports Server (NTRS)
Hogue, Michael D.; Calle, Carlos I.; Weitzman, Peter S.; Curry, David R.
2008-01-01
Tribocharging of particles is common in many processes including fine powder handling and mixing, printer toner transport and dust extraction. In a lunar environment with its high vacuum and lack of water, electrostatic forces are an important factor to consider when designing and operating equipment. Dust mitigation and management is critical to safe and predictable performance of people and equipment. The extreme nature of lunar conditions makes it difficult and costly to carry out experiments on earth which are necessary to better understand how particles gather and transfer charge between each other and with equipment surfaces. DEM (Discrete Element Modeling) provides an excellent virtual laboratory for studying tribocharging of particles as well as for design of devices for dust mitigation and for other purposes related to handling and processing of lunar regolith. Theoretical and experimental work has been performed pursuant to incorporating screened Coulombic electrostatic forces into EDEM, a commercial DEM software package. The DEM software is used to model the trajectories of large numbers of particles for industrial particulate handling and processing applications and can be coupled with other solvers and numerical models to calculate particle interaction with surrounding media and force fields. While simple Coulombic force between two particles is well understood, its operation in an ensemble of particles is more complex. When the tribocharging of particles and surfaces due to frictional contact is also considered, it is necessary to consider longer range of interaction of particles in response to electrostatic charging. The standard DEM algorithm accounts for particle mechanical properties and inertia as a function of particle shape and mass. If fluid drag is neglected, then particle dynamics are governed by contact between particles, between particles and equipment surfaces and gravity forces. Consideration of particle charge and any tribocharging and electric field effects requires calculation of the forces due to these effects.
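As a rough illustration of how a screened Coulombic interaction can be added to a DEM-style force loop, the sketch below applies a Yukawa-type screened force between a few tribocharged grains under lunar gravity. The charges, masses, screening length, and explicit integrator are illustrative assumptions only and do not represent the EDEM implementation.

```python
import numpy as np

# Toy sketch: screened Coulomb (Yukawa-type) forces between charged grains
# added to a simple explicit time-stepping loop. Illustrative only.

K_E = 8.9875517873681764e9   # Coulomb constant [N m^2 C^-2]
LAMBDA = 0.01                # assumed screening length [m]

def screened_coulomb_forces(pos, charges):
    """Pairwise force from the Yukawa potential U = kE q_i q_j exp(-r/lambda) / r."""
    n = len(charges)
    forces = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            rij = pos[i] - pos[j]
            r = np.linalg.norm(rij)
            # Magnitude = -dU/dr = kE q_i q_j exp(-r/lambda) (1/r^2 + 1/(r*lambda))
            mag = K_E * charges[i] * charges[j] * np.exp(-r / LAMBDA) \
                  * (1.0 / r**2 + 1.0 / (r * LAMBDA))
            f = mag * rij / r
            forces[i] += f
            forces[j] -= f
    return forces

# Three tribocharged dust grains (assumed charges/masses) under lunar gravity.
pos = np.array([[0.0, 0.0, 0.10], [0.002, 0.0, 0.10], [0.001, 0.002, 0.10]])
vel = np.zeros_like(pos)
charges = np.array([1e-12, -1e-12, 5e-13])   # coulombs (illustrative)
mass = 1e-9                                  # kg per grain (illustrative)
g = np.array([0.0, 0.0, -1.62])              # lunar gravity [m/s^2]

dt = 1e-4
for step in range(1000):
    f = screened_coulomb_forces(pos, charges) + mass * g
    vel += f / mass * dt
    pos += vel * dt

print(pos)
```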
Particle-based simulations of self-motile suspensions
NASA Astrophysics Data System (ADS)
Hinz, Denis F.; Panchenko, Alexander; Kim, Tae-Yeon; Fried, Eliot
2015-11-01
A simple model for simulating flows of active suspensions is investigated. The approach is based on dissipative particle dynamics. While the model is potentially applicable to a wide range of self-propelled particle systems, the specific class of self-motile bacterial suspensions is considered as a modeling scenario. To mimic the rod-like geometry of a bacterium, two dissipative particle dynamics particles are connected by a stiff harmonic spring to form an aggregate dissipative particle dynamics molecule. Bacterial motility is modeled through a constant self-propulsion force applied along the axis of each such aggregate molecule. The model accounts for hydrodynamic interactions between self-propelled agents through the pairwise dissipative interactions conventional to dissipative particle dynamics. Numerical simulations are performed using a customized version of the open-source software package LAMMPS (Large-scale Atomic/Molecular Massively Parallel Simulator). Detailed studies of the influence of agent concentration, pairwise dissipative interactions, and Stokes friction on the statistics of the system are provided. The simulations are used to explore the influence of hydrodynamic interactions in active suspensions. For high agent concentrations in combination with dominating pairwise dissipative forces, strongly correlated motion patterns and fluid-like spectral distributions of kinetic energy are found. In contrast, systems dominated by Stokes friction exhibit weaker spatial correlations of the velocity field. These results indicate that hydrodynamic interactions may play an important role in the formation of spatially extended structures in active suspensions.
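A stripped-down version of the dimer agent described above (two particles joined by a stiff harmonic spring, with a constant self-propulsion force applied along the instantaneous axis) can be sketched as follows. The paper's DPD pairwise dissipative and random forces are omitted, and the spring constant, propulsion force, friction, and time step are placeholder values rather than those used in the LAMMPS simulations.

```python
import numpy as np

# Minimal sketch of one self-propelled "dimer": two point particles joined by
# a stiff harmonic spring, a constant propulsion force along the dimer axis,
# and a Stokes-like drag. Parameters are illustrative placeholders.

k_spring = 100.0   # stiff harmonic spring constant
r0 = 1.0           # equilibrium bond length
f_active = 2.0     # constant self-propulsion force magnitude
gamma = 1.0        # Stokes friction coefficient
dt = 1e-3

pos = np.array([[0.0, 0.0], [r0, 0.0]])   # head and tail particles (2D)
vel = np.zeros_like(pos)

for step in range(5000):
    bond = pos[0] - pos[1]
    r = np.linalg.norm(bond)
    axis = bond / r
    # Harmonic spring keeps the rod-like shape.
    f_spring = -k_spring * (r - r0) * axis
    # Self-propulsion acts along the instantaneous dimer axis on both particles.
    f_act = f_active * axis
    force = np.array([f_spring + f_act,
                      -f_spring + f_act])
    force -= gamma * vel          # Stokes friction
    vel += force * dt             # unit masses assumed
    pos += vel * dt

print("centre-of-mass displacement:", pos.mean(axis=0))
```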
The Birth of Elementary-Particle Physics.
ERIC Educational Resources Information Center
Brown, Laurie M.; Hoddeson, Lillian
1982-01-01
Traces the origin and development of particle physics, concentrating on the roles of cosmic rays and theory. Includes charts highlighting significant events in the development of cosmic-ray physics and quantum field theory. (SK)
EPICS-based control and data acquisition for the APS slope profiler (Conference Presentation)
NASA Astrophysics Data System (ADS)
Sullivan, Joseph; Assoufid, Lahsen; Qian, Jun; Jemian, Peter R.; Mooney, Tim; Rivers, Mark L.; Goetze, Kurt; Sluiter, Ronald L.; Lang, Keenan
2016-09-01
The motion control, data acquisition and analysis system for APS Slope Measuring Profiler was implemented using the Experimental Physics and Industrial Control System (EPICS). EPICS was designed as a framework with software tools and applications that provide a software infrastructure used in building distributed control systems to operate devices such as particle accelerators, large experiments and major telescopes. EPICS was chosen to implement the APS Slope Measuring Profiler because it is also applicable to single purpose systems. The control and data handling capability available in the EPICS framework provides the basic functionality needed for high precision X-ray mirror measurement. Those built in capabilities include hardware integration of high-performance motion control systems (3-axis gantry and tip-tilt stages), mirror measurement devices (autocollimator, laser spot camera) and temperature sensors. Scanning the mirror and taking measurements was accomplished with an EPICS feature (the sscan record) which synchronizes motor positioning with measurement triggers and data storage. Various mirror scanning modes were automatically configured using EPICS built-in scripting. EPICS tools also provide low-level image processing (areaDetector). Operation screens were created using EPICS-aware GUI screen development tools.
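The move-trigger-read pattern that the sscan record automates can be conveyed by a hedged host-side sketch using pyepics; all process-variable names below are hypothetical placeholders, not the actual APS slope-profiler PVs, and in the real system this sequencing runs inside the EPICS IOC rather than in a Python script.

```python
import time
import numpy as np
from epics import caput, caget   # pyepics channel-access bindings

# Hedged sketch of a move-trigger-read scan loop in the spirit of what the
# EPICS sscan record automates. All PV names are hypothetical placeholders.

MOTOR_PV   = "PROFILER:GANTRY:X"          # hypothetical gantry axis
TRIGGER_PV = "PROFILER:AUTOCOLL:ACQUIRE"  # hypothetical acquisition trigger
ANGLE_PV   = "PROFILER:AUTOCOLL:ANGLE"    # hypothetical slope readback

positions = np.linspace(0.0, 100.0, 51)   # mm along the mirror (illustrative)
slopes = []

for x in positions:
    caput(MOTOR_PV, x, wait=True)         # move the stage and wait for completion
    caput(TRIGGER_PV, 1, wait=True)       # trigger one autocollimator reading
    time.sleep(0.1)                       # settle time (illustrative)
    slopes.append(caget(ANGLE_PV))        # store the measured slope angle

np.savetxt("slope_scan.txt", np.column_stack([positions, slopes]))
```

In the actual profiler the sscan record performs this synchronization, data storage, and the various scan modes on the IOC itself, so no host-side loop is needed; the sketch only conveys the pattern.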
Performance of the Magnetospheric Multiscale central instrument data handling
NASA Astrophysics Data System (ADS)
Klar, Robert A.; Miller, Scott A.; Brysch, Michael L.; Bertrand, Allison R.
In order to study the fundamental physical processes of magnetic reconnection, particle acceleration and turbulence, the Magnetospheric Multiscale (MMS) mission employs a constellation of four identically configured observatories, each with a suite of complementary science instruments. Southwest Research Institute® (SwRI® ) developed the Central Instrument Data Processor (CIDP) to handle the large data volume associated with these instruments. The CIDP is an integrated access point between the instruments and the spacecraft. It provides synchronization pulses, relays telecommands, and gathers instrument housekeeping telemetry. It collects science data from the instruments and stores it to a mass memory for later playback to a ground station. This paper retrospectively examines the data handling performance realized by the CIDP implementation. It elaborates on some of the constraints on the hardware and software designs and the resulting effects on performance. For the hardware, it discusses the limitations of the front-end electronics input/output (I/O) architecture and associated mass memory buffering. For the software, it discusses the limitations of the Consultative Committee for Space Data Systems (CCSDS) File Delivery Protocol (CFDP) implementation and the data structure choices for file management. It also describes design changes that improve data handling performance in newer designs.
ERIC Educational Resources Information Center
Wiener, Gerfried J.; Schmeling, Sascha M.; Hopf, Martin
2015-01-01
This study introduces a teaching concept based on the Standard Model of particle physics. It comprises two consecutive chapters--elementary particles and fundamental interactions. The rationale of this concept is that the fundamental principles of particle physics can run as the golden thread through the whole physics curriculum. The design…
Kanojia, Gaurav; Willems, Geert-Jan; Frijlink, Henderik W; Kersten, Gideon F A; Soema, Peter C; Amorij, Jean-Pierre
2016-09-25
Spray dried vaccine formulations might be an alternative to traditional lyophilized vaccines. Compared to lyophilization, spray drying is a fast and cheap process extensively used for drying biologicals. The current study provides an approach that utilizes Design of Experiments (DoE) for the spray drying process to stabilize whole inactivated influenza virus (WIV) vaccine. The approach included systematically screening and optimizing the spray drying process variables, determining the desired process parameters and predicting product quality parameters. The process parameters inlet air temperature, nozzle gas flow rate and feed flow rate and their effects on WIV vaccine powder characteristics such as particle size, residual moisture content (RMC) and powder yield were investigated. Vaccine powders with a broad range of physical characteristics (RMC 1.2-4.9%, particle size 2.4-8.5 μm and powder yield 42-82%) were obtained. WIV showed no significant loss in antigenicity as revealed by the hemagglutination test. Furthermore, descriptive models generated by the DoE software could be used to determine and select (set) spray drying process parameters. This was used to generate a dried WIV powder with predefined (predicted) characteristics. Moreover, the spray dried vaccine powders retained their antigenic stability even after storage for 3 months at 60°C. The approach used here enabled the generation of a thermostable, antigenic WIV vaccine powder with desired physical characteristics that could potentially be used for pulmonary administration. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ballouz, Ronald-Louis; Richardson, Derek C.; Morishima, Ryuji
We study the B ring's complex optical depth structure. The source of this structure may be the complex dynamics of the Keplerian shear and the self-gravity of the ring particles. The outcome of these dynamic effects depends sensitively on the collisional and physical properties of the particles. Two mechanisms can emerge that dominate the macroscopic physical structure of the ring: self-gravity wakes and viscous overstability. Here we study the interplay between these two mechanisms by using our recently developed particle collision method that allows us to better model the inter-particle contact physics. We find that for a constant ring surface density and particle internal density, particles with rough surfaces tend to produce axisymmetric ring features associated with the viscous overstability, while particles with smoother surfaces produce self-gravity wakes.
Computational Fluid Dynamics Modeling of the John Day Dam Tailrace
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rakowski, Cynthia L.; Perkins, William A.; Richmond, Marshall C.
The US Army Corps of Engineers - Portland District required that two-dimensional (2D) depth-averaged and three-dimensional (3D) free-surface numerical models be developed and validated for the John Day tailrace. These models were used to assess the potential impact of a select group of structural and operational alternatives to tailrace flows aimed at improving fish survival at John Day Dam. The 2D model was used for the initial assessment of the alternatives in conjunction with a reduced-scale physical model of the John Day Project. A finer-resolution 3D model was used to more accurately model the details of flow in the stilling basin and near-project tailrace hydraulics. Three-dimensional model results were used as input to the Pacific Northwest National Laboratory particle tracking software, and particle paths and times to pass a downstream cross section were used to assess the relative differences in travel times resulting from project operations and structural scenarios for multiple total river flows. Streamlines and neutrally buoyant particles were seeded in all turbine and spill bays with flows. For a total river flow of 250 kcfs running with the Fish Passage Plan spill pattern and a spillwall, the mean residence times for all particles were little changed; however, the tails of the distribution were truncated for both spillway and powerhouse release points, and, for the powerhouse releases, the residence time for 75% of the particles to pass a downstream cross section was reduced from 45.5 minutes to 41.3 minutes. For a total river flow of 125 kcfs configured with the operations from the Fish Passage Plan for the temporary spillway weirs and for a proposed spillwall, the neutrally buoyant particle tracking data showed that the river with a spillwall in place had an increased overall mean residence time; however, the residence time for 75% of the powerhouse-released particles to pass a downstream cross section was reduced from 102.4 minutes to 89 minutes.
Usage of "Powergraph" software at laboratory lessons of "general physics" department of MEPhI
NASA Astrophysics Data System (ADS)
Klyachin, N. A.; Matronchik, A. Yu.; Khangulyan, E. V.
2017-01-01
This paper considers the use of the "PowerGraph" software in the laboratory exercise "Study of the sodium spectrum" within the physics laboratory course. Together with the design of the experimental setup, the sodium spectra digitized with a computer audio chip are discussed. The use of "PowerGraph" in the "Study of the sodium spectrum" experiment allows efficient visualization of the sodium spectrum and analysis of its fine structure; in particular, it allows quantitative measurement of the wavelengths and relative line intensities.
Software For Design Of Life-Support Systems
NASA Technical Reports Server (NTRS)
Rudokas, Mary R.; Cantwell, Elizabeth R.; Robinson, Peter I.; Shenk, Timothy W.
1991-01-01
Design Assistant Workstation (DAWN) computer program is prototype of expert software system for analysis and design of regenerative, physical/chemical life-support systems that revitalize air, reclaim water, produce food, and treat waste. Incorporates both conventional software for quantitative mathematical modeling of physical, chemical, and biological processes and expert system offering user stored knowledge about materials and processes. Constructs task tree as it leads user through simulated process, offers alternatives, and indicates where alternative not feasible. Also enables user to jump from one design level to another.
2016 KIVA-hpFE Development: A Robust and Accurate Engine Modeling Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carrington, David Bradley; Waters, Jiajia
Los Alamos National Laboratory and its collaborators are facilitating engine modeling by improving the accuracy and robustness of the physical models and the robustness of the software. We continue to improve the physical modeling methods, developing and implementing new mathematical algorithms that represent the physics within an engine. We provide software that others may use directly or alter with various models, e.g., sophisticated chemical kinetics, different turbulence closure methods, or other fuel injection and spray systems.
NASA Technical Reports Server (NTRS)
1989-01-01
Loredan Biomedical, Inc.'s LIDO, a computerized physical therapy system, was purchased by NASA in 1985 for evaluation as a Space Station Freedom exercise program. In 1986, while involved in an ARC muscle conditioning project, Malcom Bond, Loredan's chairman, designed an advanced software package for NASA which became the basis for LIDOSOFT software used in the commercially available system. The system employs a "proprioceptive" software program which perceives internal body conditions, induces perturbations to muscular effort and evaluates the response. Biofeedback on a screen allows a patient to observe his own performance.
Rotation of a 1-GeV particle beam by a fan system of thin crystals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Britvich, G. I.; Maisheev, V. A.; Chesnokov, Yu. A., E-mail: Yury.Chesnokov@ihep.ru
2016-10-15
The deflection of a 1-GeV charged particle beam by a system formed by fan-oriented thin silicon wafers has been studied theoretically and experimentally. Software has been developed for numerical simulation of a particle beam transmission through a fan crystal system. In the U-70 experiment on a proton beam, the particles were deflected by such a system through an angle exceeding 1 mrad. Thus, a new method has been demonstrated for rotating a particle beam, which can be used for creating accelerator beams for medical purposes.
Automatic Parameter Tuning for the Morpheus Vehicle Using Particle Swarm Optimization
NASA Technical Reports Server (NTRS)
Birge, B.
2013-01-01
A high fidelity simulation using a PC-based Trick framework has been developed for Johnson Space Center's Morpheus test bed flight vehicle. There is an iterative development loop of refining and testing the hardware, refining the software, comparing the software simulation to hardware performance, and adjusting either or both the hardware and the simulation to extract the best performance from the hardware as well as the most realistic representation of the hardware from the software. A Particle Swarm Optimization (PSO) based technique has been developed that increases the speed and accuracy of the iterative development cycle. Parameters in software can be automatically tuned to make the simulation match real-world subsystem data from test flights. Special considerations for scale, linearity, and discontinuities can be all but ignored with this technique, allowing fast turnaround both for simulation tune-up to match hardware changes and during the test and validation phase to help identify hardware issues. Software models with insufficient control authority to match hardware test data can be immediately identified, and using this technique requires little to no specialized knowledge of optimization, freeing model developers to concentrate on spacecraft engineering. Integration of the PSO into the Morpheus development cycle will be discussed, as well as a case study highlighting the tool's effectiveness.
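For orientation, here is a minimal particle swarm optimization sketch that tunes two parameters of a toy model to match noisy reference data; the model, the data, and the swarm coefficients are generic textbook placeholders, not the Morpheus simulation, its subsystems, or the tuned values used by the project.

```python
import numpy as np

# Minimal PSO sketch for tuning simulation parameters against reference data.
# The "simulation" and target data are toy stand-ins; inertia/cognitive/social
# coefficients are typical textbook values.

rng = np.random.default_rng(1)

def simulate(params, t):
    # Hypothetical two-parameter model (e.g., gain and damping of a subsystem).
    gain, damping = params
    return gain * np.exp(-damping * t)

t = np.linspace(0.0, 5.0, 100)
target = simulate(np.array([2.0, 0.7]), t) + rng.normal(0.0, 0.01, t.size)  # "flight data"

def cost(params):
    return np.mean((simulate(params, t) - target) ** 2)

n_particles, n_iters = 30, 200
lo, hi = np.array([0.0, 0.0]), np.array([5.0, 3.0])
pos = rng.uniform(lo, hi, (n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_cost = np.array([cost(p) for p in pos])
gbest = pbest[np.argmin(pbest_cost)].copy()

w, c1, c2 = 0.7, 1.5, 1.5    # inertia, cognitive, social coefficients
for _ in range(n_iters):
    r1, r2 = rng.random((n_particles, 2)), rng.random((n_particles, 2))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    costs = np.array([cost(p) for p in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[np.argmin(pbest_cost)].copy()

print("tuned parameters:", gbest)
```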
Software Aids Visualization of Computed Unsteady Flow
NASA Technical Reports Server (NTRS)
Kao, David; Kenwright, David
2003-01-01
Unsteady Flow Analysis Toolkit (UFAT) is a computer program that synthesizes motions of time-dependent flows represented by very large sets of data generated in computational fluid dynamics simulations. Prior to the development of UFAT, it was necessary to rely on static, single-snapshot depictions of time-dependent flows generated by flow-visualization software designed for steady flows. Whereas it typically takes weeks to analyze the results of a large-scale unsteady-flow simulation by use of steady-flow visualization software, the analysis time is reduced to hours when UFAT is used. UFAT can be used to generate graphical objects of flow visualization results using multi-block curvilinear grids in the format of a previously developed NASA data-visualization program, PLOT3D. These graphical objects can be rendered using FAST, another popular flow-visualization program developed at NASA. Flow-visualization techniques that can be exploited by use of UFAT include time-dependent tracking of particles, detection of vortex cores, extraction of stream ribbons and surfaces, and tetrahedral decomposition for optimal particle tracking. Unique computational features of UFAT include capabilities for automatic (batch) processing, restart, memory mapping, and parallel processing. These capabilities significantly reduce analysis time and storage requirements, relative to those of prior flow-visualization software. UFAT can be executed on a variety of supercomputers.
NASA Astrophysics Data System (ADS)
Beringer, J.; Arguin, J.-F.; Barnett, R. M.; Copic, K.; Dahl, O.; Groom, D. E.; Lin, C.-J.; Lys, J.; Murayama, H.; Wohl, C. G.; Yao, W.-M.; Zyla, P. A.; Amsler, C.; Antonelli, M.; Asner, D. M.; Baer, H.; Band, H. R.; Basaglia, T.; Bauer, C. W.; Beatty, J. J.; Belousov, V. I.; Bergren, E.; Bernardi, G.; Bertl, W.; Bethke, S.; Bichsel, H.; Biebel, O.; Blucher, E.; Blusk, S.; Brooijmans, G.; Buchmueller, O.; Cahn, R. N.; Carena, M.; Ceccucci, A.; Chakraborty, D.; Chen, M.-C.; Chivukula, R. S.; Cowan, G.; D'Ambrosio, G.; Damour, T.; de Florian, D.; de Gouvêa, A.; DeGrand, T.; de Jong, P.; Dissertori, G.; Dobrescu, B.; Doser, M.; Drees, M.; Edwards, D. A.; Eidelman, S.; Erler, J.; Ezhela, V. V.; Fetscher, W.; Fields, B. D.; Foster, B.; Gaisser, T. K.; Garren, L.; Gerber, H.-J.; Gerbier, G.; Gherghetta, T.; Golwala, S.; Goodman, M.; Grab, C.; Gritsan, A. V.; Grivaz, J.-F.; Grünewald, M.; Gurtu, A.; Gutsche, T.; Haber, H. E.; Hagiwara, K.; Hagmann, C.; Hanhart, C.; Hashimoto, S.; Hayes, K. G.; Heffner, M.; Heltsley, B.; Hernández-Rey, J. J.; Hikasa, K.; Höcker, A.; Holder, J.; Holtkamp, A.; Huston, J.; Jackson, J. D.; Johnson, K. F.; Junk, T.; Karlen, D.; Kirkby, D.; Klein, S. R.; Klempt, E.; Kowalewski, R. V.; Krauss, F.; Kreps, M.; Krusche, B.; Kuyanov, Yu. V.; Kwon, Y.; Lahav, O.; Laiho, J.; Langacker, P.; Liddle, A.; Ligeti, Z.; Liss, T. M.; Littenberg, L.; Lugovsky, K. S.; Lugovsky, S. B.; Mannel, T.; Manohar, A. V.; Marciano, W. J.; Martin, A. D.; Masoni, A.; Matthews, J.; Milstead, D.; Miquel, R.; Mönig, K.; Moortgat, F.; Nakamura, K.; Narain, M.; Nason, P.; Navas, S.; Neubert, M.; Nevski, P.; Nir, Y.; Olive, K. A.; Pape, L.; Parsons, J.; Patrignani, C.; Peacock, J. A.; Petcov, S. T.; Piepke, A.; Pomarol, A.; Punzi, G.; Quadt, A.; Raby, S.; Raffelt, G.; Ratcliff, B. N.; Richardson, P.; Roesler, S.; Rolli, S.; Romaniouk, A.; Rosenberg, L. J.; Rosner, J. L.; Sachrajda, C. T.; Sakai, Y.; Salam, G. P.; Sarkar, S.; Sauli, F.; Schneider, O.; Scholberg, K.; Scott, D.; Seligman, W. G.; Shaevitz, M. H.; Sharpe, S. R.; Silari, M.; Sjöstrand, T.; Skands, P.; Smith, J. G.; Smoot, G. F.; Spanier, S.; Spieler, H.; Stahl, A.; Stanev, T.; Stone, S. L.; Sumiyoshi, T.; Syphers, M. J.; Takahashi, F.; Tanabashi, M.; Terning, J.; Titov, M.; Tkachenko, N. P.; Törnqvist, N. A.; Tovey, D.; Valencia, G.; van Bibber, K.; Venanzoni, G.; Vincter, M. G.; Vogel, P.; Vogt, A.; Walkowiak, W.; Walter, C. W.; Ward, D. R.; Watari, T.; Weiglein, G.; Weinberg, E. J.; Wiencke, L. R.; Wolfenstein, L.; Womersley, J.; Woody, C. L.; Workman, R. L.; Yamamoto, A.; Zeller, G. P.; Zenin, O. V.; Zhang, J.; Zhu, R.-Y.; Harper, G.; Lugovsky, V. S.; Schaffner, P.
2012-07-01
This biennial Review summarizes much of particle physics. Using data from previous editions, plus 2658 new measurements from 644 papers, we list, evaluate, and average measured properties of gauge bosons, leptons, quarks, mesons, and baryons. We summarize searches for hypothetical particles such as Higgs bosons, heavy neutrinos, and supersymmetric particles. All the particle properties and search limits are listed in Summary Tables. We also give numerous tables, figures, formulae, and reviews of topics such as the Standard Model, particle detectors, probability, and statistics. Among the 112 reviews are many that are new or heavily revised including those on Heavy-Quark and Soft-Collinear Effective Theory, Neutrino Cross Section Measurements, Monte Carlo Event Generators, Lattice QCD, Heavy Quarkonium Spectroscopy, Top Quark, Dark Matter, Vcb & Vub, Quantum Chromodynamics, High-Energy Collider Parameters, Astrophysical Constants, Cosmological Parameters, and Dark Matter. A booklet is available containing the Summary Tables and abbreviated versions of some of the other sections of this full Review. All tables, listings, and reviews (and errata) are also available on the Particle Data Group website: http://pdg.lbl.gov/. The 2012 edition of Review of Particle Physics is published for the Particle Data Group as article 010001 in volume 86 of Physical Review D. This edition should be cited as: J. Beringer et al. (Particle Data Group), Phys. Rev. D 86, 010001 (2012).
Radio detection of high-energy cosmic rays with the Auger Engineering Radio Array
NASA Astrophysics Data System (ADS)
Schröder, Frank G.; Pierre Auger Collaboration
2016-07-01
The Auger Engineering Radio Array (AERA) is an enhancement of the Pierre Auger Observatory in Argentina. Covering about 17 km2, AERA is the world's largest antenna array for cosmic-ray observation. It consists of more than 150 antenna stations detecting the radio signal emitted by air showers, i.e., cascades of secondary particles caused by primary cosmic rays hitting the atmosphere. At the beginning, technical goals were the focus: first of all, the successful demonstration that a large-scale antenna array consisting of autonomous stations is feasible. Moreover, techniques for calibration of the antennas and time calibration of the array have been developed, as well as special software for the data analysis. Meanwhile, physics goals have come into focus. At the Pierre Auger Observatory air showers are simultaneously detected by several detector systems, in particular water-Cherenkov detectors at the surface, underground muon detectors, and fluorescence telescopes, which enables cross-calibration of different detection techniques. For the direction and energy of air showers, the precision achieved by AERA is already competitive; for the type of primary particle, several methods are being tested and optimized. By combining AERA with the particle detectors we aim for a better understanding of cosmic rays in the energy range from approximately 0.3 to 10 EeV, i.e., significantly higher energies than preceding radio arrays.
Olive, K. A.
2016-10-01
The Review summarizes much of particle physics and cosmology. Using data from previous editions, plus 3,062 new measurements from 721 papers, we list, evaluate, and average measured properties of gauge bosons and the recently discovered Higgs boson, leptons, quarks, mesons, and baryons. We summarize searches for hypothetical particles such as supersymmetric particles, heavy bosons, axions, dark photons, etc. All the particle properties and search limits are listed in Summary Tables. We also give numerous tables, figures, formulae, and reviews of topics such as Higgs Boson Physics, Supersymmetry, Grand Unified Theories, Neutrino Mixing, Dark Energy, Dark Matter, Cosmology, Particle Detectors, Colliders, Probability and Statistics. Among the 117 reviews are many that are new or heavily revised, including those on Pentaquarks and Inflation.
Particle astronomy and particle physics from the moon - The particle observatory
NASA Technical Reports Server (NTRS)
Wilson, Thomas L.
1990-01-01
Promising experiments from the moon using particle detectors are discussed, noting the advantage of the large flux collecting power Pc offered by the remote, stable environment of a lunar base. An observatory class of particle experiments is presented, based upon proposals at NASA's recent Stanford workshop. They vary from neutrino astronomy, particle astrophysics, and cosmic ray experiments to space physics and fundamental physics experiments such as proton decay and 'table-top' arrays. This research is background-limited on earth, and it is awkward and unrealistic in earth orbit, but is particularly suited for the moon where Pc can be quite large and the instrumentation is not subject to atmospheric erosion as it is (for large t) in low earth orbit.
Standard Model of Particle Physics--a health physics perspective.
Bevelacqua, J J
2010-11-01
The Standard Model of Particle Physics is reviewed with an emphasis on its relationship to the physics supporting the health physics profession. Concepts important to health physics are emphasized and specific applications are presented. The capability of the Standard Model to provide health physics relevant information is illustrated with application of conservation laws to neutron and muon decay and in the calculation of the neutron mean lifetime.
Zelenyuk, Alla; Imre, Dan; Wilson, Jacqueline; Zhang, Zhiyuan; Wang, Jun; Mueller, Klaus
2015-02-01
Understanding the effect of aerosols on climate requires knowledge of the size and chemical composition of individual aerosol particles-two fundamental properties that determine an aerosol's optical properties and ability to serve as cloud condensation or ice nuclei. Here we present our aircraft-compatible single particle mass spectrometers, SPLAT II and its new, miniaturized version, miniSPLAT that measure in-situ and in real-time the size and chemical composition of individual aerosol particles with extremely high sensitivity, temporal resolution, and sizing precision on the order of a monolayer. Although miniSPLAT's size, weight, and power consumption are significantly smaller, its performance is on par with SPLAT II. Both instruments operate in dual data acquisition mode to measure, in addition to single particle size and composition, particle number concentrations, size distributions, density, and asphericity with high temporal resolution. We also present ND-Scope, our newly developed interactive visual analytics software package. ND-Scope is designed to explore and visualize the vast amount of complex, multidimensional data acquired by our single particle mass spectrometers, along with other aerosol and cloud characterization instruments on-board aircraft. We demonstrate that ND-Scope makes it possible to visualize the relationships between different observables and to view the data in a geo-spatial context, using the interactive and fully coupled Google Earth and Parallel Coordinates displays. Here we illustrate the utility of ND-Scope to visualize the spatial distribution of atmospheric particles of different compositions, and explore the relationship between individual particle compositions and their activity as cloud condensation nuclei.
Computer Software for Life Cycle Cost.
1987-04-01
"...obsolete), physical life (utility before physically wearing out), or application life (utility in a given function)." (7:5) The costs are usually
TRIMS: Validating T2 Molecular Effects for Neutrino Mass Experiments
NASA Astrophysics Data System (ADS)
Lin, Ying-Ting; Trims Collaboration
2017-09-01
The Tritium Recoil-Ion Mass Spectrometer (TRIMS) experiment examines the branching ratio of the molecular tritium (T2) beta decay to the bound state (3HeT+). Measuring this branching ratio helps to validate the current molecular final-state theory applied in neutrino mass experiments such as KATRIN and Project 8. TRIMS consists of a magnet-guided time-of-flight mass spectrometer with a detector located on each end. By measuring the kinetic energy and time-of-flight difference of the ions and beta particles reaching the detectors, we will be able to distinguish molecular ions from atomic ones and hence derive the ratio in question. We will give an update on the apparatus, simulation software, and analysis tools, including efforts to improve the resolution of our detectors and to characterize the stability and uniformity of our field sources. We will also share our commissioning results and prospects for physics data. The TRIMS experiment is supported by U.S. Department of Energy Office of Science, Office of Nuclear Physics, Award Number DE-FG02-97ER41020.
Advanced Level Physics Students' Conceptions of Quantum Physics.
ERIC Educational Resources Information Center
Mashhadi, Azam
This study addresses questions about particle physics that focus on the nature of electrons. Speculations as to whether they are more like particles or waves or like neither illustrate the difficulties with which students are confronted when trying to incorporate the concepts of quantum physics into their overall conceptual framework. Such…
Apparatus and method for tracking a molecule or particle in three dimensions
Werner, James H [Los Alamos, NM; Goodwin, Peter M [Los Alamos, NM; Lessard, Guillaume [Santa Fe, NM
2009-03-03
An apparatus and method were used to track the movement of fluorescent particles in three dimensions. Control software was used with the apparatus to implement a tracking algorithm for tracking the motion of the individual particles in glycerol/water mixtures. Monte Carlo simulations suggest that the tracking algorithms in combination with the apparatus may be used for tracking the motion of single fluorescent or fluorescently labeled biomolecules in three dimensions.
Dynamics of low velocity collisions of ice particle, coated with frost
NASA Technical Reports Server (NTRS)
Bridges, F.; Lin, D.; Boone, L.; Darknell, D.
1991-01-01
We continued our investigations of low velocity collisions of ice particles for velocities in the range 10^-3 - 2 cm/s. The work focused on two effects: (1) the sticking forces for ice particles coated with CO2 frost, and (2) the completion of a 2-D pendulum system for glancing collisions. New computer software was also developed to control and monitor the position of the 2-D pendulum.
Software tool for physics chart checks.
Li, H Harold; Wu, Yu; Yang, Deshan; Mutic, Sasa
2014-01-01
Physics chart check has long been a central quality assurance (QA) measure in radiation oncology. The purpose of this work is to describe a software tool that aims to accomplish simplification, standardization, automation, and forced functions in the process. Nationally recognized guidelines, including American College of Radiology and American Society for Radiation Oncology guidelines and technical standards, and the American Association of Physicists in Medicine Task Group reports were identified, studied, and summarized. Meanwhile, the reported events related to the physics chart check service were analyzed using an event reporting and learning system. A number of shortfalls in the chart check process were identified. To address these problems, a software tool was designed and developed under Microsoft .NET in C# to hardwire as many components as possible at each stage of the process. The software consists of the following 4 independent modules: (1) chart check management; (2) pretreatment and during-treatment chart check assistant; (3) posttreatment chart check assistant; and (4) quarterly peer-review management. The users were a large group of physicists in the authors' radiation oncology clinic. During over 1 year of use the tool has proven very helpful in chart check management, communication, documentation, and maintaining consistency. The software tool presented in this work aims to assist physicists at each stage of the physics chart check process. The software tool is potentially useful for any radiation oncology clinic that is either in the process of pursuing or maintaining American College of Radiology accreditation.
Inerton fields: very new ideas on fundamental physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krasnoholovets, Volodymyr
2010-12-22
Modern theories of everything, or theories of the grand unification of all physical interactions, try to describe the whole world starting from the first principles of quantum theory. However, the first principles operate with undetermined notions, such as the wave {psi}-function, particle, lepton and quark, de Broglie and Compton wavelengths, mass, electric charge, spin, electromagnetic field, photon, gravitation, physical vacuum, space, etc. From a logical point of view this means that such a modern approach to the theory of everything is condemned to failure... Thus, what should we suggest to improve the situation? It seems quite reasonable to develop initially a theory of something, which will be able to clarify the major fundamental notions (listed above) that physics operates with every day. What would be a starting point in such an approach? Of course a theory of space as such, because particles and all physical fields emerge just from space. After that, when a particle and fields (and hence the fields' carriers) are well defined and introduced in the well defined physical space, different kinds of interactions can be proposed and investigated. Moreover, we must also allow for a possible interaction of a created particle with the space that generated the appearance of the particle. The mathematical studies of Michel Bounias and the author have shown what the real physical space is, how the space is constituted, how it is arranged and what its elements are. Having constructed the real physical space we can then derive whatever we wish, in particular, such basic notions as mass, particle and charge. How are the mechanics of such objects (a massive particle, a charged massive particle) organised? The appropriate theory of motion has been called a submicroscopic mechanics of particles, which is developed in the real physical space, not an abstract phase space, as conventional quantum mechanics does. A series of questions arise: can these two mechanics (submicroscopic and conventional quantum mechanics) be unified? what new insight can such unification bring? can such submicroscopic mechanics be a starting point for the derivation of the phenomenon of gravity? can this new theory be a unified physical theory? does the theory allow experimental verification? These major points have been clarified in detail. And, perhaps, the most intriguing aspect of the theory is the derivation of a new physical field associated with the notion of mass (or rather the inertia of a particle), which has been called the inerton field and which represents the real sense of the particle's wave {psi}-function. This field emerges by analogy with the electromagnetic field associated with the notion of the electric charge. The postulated inerton field has been tested in a series of different experiments. Even more, the inerton field might have a number of practical applications...
Development of students' interest in particle physics as effect of participating in a Masterclass
NASA Astrophysics Data System (ADS)
Gedigk, Kerstin; Pospiech, Gesche
2016-05-01
The International Hands On Particle Physics Masterclasses enjoy increasing popularity worldwide every year. In Germany, a national programme was brought to life in 2010, which offers these appreciated events to whole classes or courses of high school students throughout the year. These events were evaluated with respect to the students' interest in particle physics and their perception of the events. How several interest variables interact with each other and with the perception of the events is answered by structural equation modelling (sect. 5.2). The results give information about the events' effects on the students' development of interest in particle physics, show which event features are important (e.g. the authenticity), and give information about practical approaches to improve the effects of the Masterclasses. Section 5.3 deals with a group of participants who have a high interest in particle physics 6-8 weeks after participation. The number of these students is remarkably large, at 26% of all participants. The investigation of this group shows that the Masterclass participation has the same positive effect on both sexes and on all levels of physics education.
NASA Technical Reports Server (NTRS)
Potter, A. E. (Editor); Wilson, T. L. (Editor)
1990-01-01
The present conference on physics and astrophysics from a lunar base encompasses space physics, cosmic ray physics, neutrino physics, experiments in gravitation and general relativity, gravitational radiation physics, cosmic background radiation, particle astrophysics, surface physics, and the physics of gamma rays and X-rays. Specific issues addressed include space-plasma physics research at a lunar base, prospects for neutral particle imaging, the atmosphere as particle detector, medium- and high-energy neutrino physics from a lunar base, muons on the moon, a search for relic supernovae antineutrinos, and the use of clocks in satellites orbiting the moon to test general relativity. Also addressed are large X-ray-detector arrays for physics experiments on the moon, and the measurement of proton decay, arcsec-source locations, halo dark matter and elemental abundances above 10 exp 15 eV at a lunar base.
Event Reconstruction in the PandaRoot framework
NASA Astrophysics Data System (ADS)
Spataro, Stefano
2012-12-01
The PANDA experiment will study the collisions of beams of anti-protons, with momenta ranging from 2-15 GeV/c, with fixed proton and nuclear targets in the charm energy range, and will be built at the FAIR facility. In preparation for the experiment, the PandaRoot software framework is under development for detector simulation, reconstruction and data analysis, running on an Alien2-based grid. The basic features are handled by the FairRoot framework, based on ROOT and Virtual Monte Carlo, while the PANDA detector specifics and reconstruction code are implemented inside PandaRoot. The realization of Technical Design Reports for the tracking detectors has pushed the finalization of the tracking reconstruction code, which is complete for the Target Spectrometer, and of the analysis tools. Particle Identification algorithms are currently implemented using Bayesian approach and compared to Multivariate Analysis methods. Moreover, the PANDA data acquisition foresees a triggerless operation in which events are not defined by a hardware 1st level trigger decision, but all the signals are stored with time stamps requiring a deconvolution by the software. This has led to a redesign of the software from an event basis to a time-ordered structure. In this contribution, the reconstruction capabilities of the Panda spectrometer will be reported, focusing on the performances of the tracking system and the results for the analysis of physics benchmark channels, as well as the new (and challenging) concept of time-based simulation and its implementation.
NASA Astrophysics Data System (ADS)
Lippert, Ross A.; Predescu, Cristian; Ierardi, Douglas J.; Mackenzie, Kenneth M.; Eastwood, Michael P.; Dror, Ron O.; Shaw, David E.
2013-10-01
In molecular dynamics simulations, control over temperature and pressure is typically achieved by augmenting the original system with additional dynamical variables to create a thermostat and a barostat, respectively. These variables generally evolve on timescales much longer than those of particle motion, but typical integrator implementations update the additional variables along with the particle positions and momenta at each time step. We present a framework that replaces the traditional integration procedure with separate barostat, thermostat, and Newtonian particle motion updates, allowing thermostat and barostat updates to be applied infrequently. Such infrequent updates provide a particularly substantial performance advantage for simulations parallelized across many computer processors, because thermostat and barostat updates typically require communication among all processors. Infrequent updates can also improve accuracy by alleviating certain sources of error associated with limited-precision arithmetic. In addition, separating the barostat, thermostat, and particle motion update steps reduces certain truncation errors, bringing the time-average pressure closer to its target value. Finally, this framework, which we have implemented on both general-purpose and special-purpose hardware, reduces software complexity and improves software modularity.
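The update structure described above can be sketched schematically as a main loop that performs Newtonian particle updates every step while applying thermostat and barostat updates only every few tens or hundreds of steps; the placeholder force field, the simple velocity rescaling, and the volume rescaling below are deliberately naive stand-ins, not the schemes or parameters of the authors' framework.

```python
import numpy as np

# Schematic sketch: Newtonian updates every step, thermostat/barostat updates
# applied infrequently. Force model and control schemes are naive placeholders.

rng = np.random.default_rng(2)
n_particles, dim = 64, 3
dt, n_steps = 0.005, 2000
n_thermo, n_baro = 50, 200          # apply thermostat/barostat infrequently
target_T = 1.0                      # reduced units

pos = rng.random((n_particles, dim))
vel = rng.normal(0.0, np.sqrt(target_T), (n_particles, dim))
box = 10.0

def forces(pos):
    # Placeholder force field (harmonic tether to the box centre).
    return -(pos - box / 2.0)

for step in range(n_steps):
    # --- Newtonian particle-motion update (velocity Verlet, unit masses) ---
    f = forces(pos)
    vel += 0.5 * dt * f
    pos += dt * vel
    vel += 0.5 * dt * forces(pos)

    # --- Infrequent thermostat update (simple velocity rescaling) ---
    if step % n_thermo == 0:
        T_inst = np.mean(vel**2)          # kB = m = 1
        vel *= np.sqrt(target_T / T_inst)

    # --- Infrequent barostat update (schematic volume/coordinate rescaling) ---
    if step % n_baro == 0:
        scale = 1.0 + rng.normal(0.0, 1e-4)   # stand-in for pressure feedback
        box *= scale
        pos *= scale

print("final box length:", box)
```

Applying the thermostat and barostat only every n_thermo and n_baro steps is what reduces the global communication cost in a parallel setting, since the particle-motion update between those points needs only local information.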
PARTICLE PHYSICS: CERN Collider Glimpses Supersymmetry--Maybe.
Seife, C
2000-07-14
Last week, particle physicists at the CERN laboratory in Switzerland announced that by smashing together matter and antimatter in four experiments, they detected an unexpected effect in the sprays of particles that ensued. The anomaly is subtle, and physicists caution that it might still be a statistical fluke. If confirmed, however, it could mark the long-sought discovery of a whole zoo of new particles--and the end of a long-standing model of particle physics.
ERIC Educational Resources Information Center
Davies, Denise M.
1985-01-01
Discusses design, development, and use of a database to provide organization and access to a computer software collection at the University of Hawaii School of Library Studies. Field specifications, samples of report forms, and a description of the physical organization of the software collection are included. (MBR)
PC based graphic display real-time particle beam uniformity
NASA Technical Reports Server (NTRS)
Huebner, M. A.; Malone, C. J.; Smith, L. S.; Soli, G. A.
1989-01-01
A technique has been developed to support the study of the effects of cosmic rays on integrated circuits. The system is designed to determine the particle distribution across the surface of an integrated circuit accurately while the circuit is bombarded by a particle beam. The system uses photomultiplier tubes, an octal discriminator, a computer-controlled NIM quad counter, and an IBM PC. It provides real-time operator feedback for fast beam tuning and monitors momentary fluctuations in the particle beam. The hardware, software, and system performance are described.
Voutilainen, Arto; Kaipio, Jari P; Pekkanen, Juha; Timonen, Kirsi L; Ruuskanen, Juhani
2004-01-01
A theoretical comparison of modeled particle depositions in the human respiratory tract was performed by taking into account different particle number and mass size distributions and physical activity in an urban environment. Urban-air data on particulate concentrations in the size range 10 nm-10 microm were used to estimate the hourly average particle number and mass size distribution functions. The functions were then combined with the deposition probability functions obtained from a computerized ICRP 66 deposition model of the International Commission on Radiological Protection to calculate the numbers and masses of particles deposited in five regions of the respiratory tract of a male adult. The man's physical activity and minute ventilation during the day were taken into account in the calculations. Two different mass and number size distributions of aerosol particles with equal (computed) <10 microm particle mass concentrations gave clearly different deposition patterns in the central and peripheral regions of the human respiratory tract. The deposited particle numbers and masses were much higher during the day (0700-1900) than during the night (1900-0700) because an increase in physical activity and ventilation were temporally associated with highly increased traffic-derived particles in urban outdoor air. In future analyses of the short-term associations between particulate air pollution and health, it would not only be important to take into account the outdoor-to-indoor penetration of different particle sizes and human time-activity patterns, but also actual lung deposition patterns and physical activity in significant microenvironments.
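The bookkeeping behind such a calculation (combining a number size distribution with per-region deposition fractions and minute ventilation) can be sketched as below; the lognormal mode parameters, the deposition-fraction curves, and the ventilation rate are made-up placeholders, not the urban data or ICRP 66 model values used in the study.

```python
import numpy as np

# Toy sketch: deposited particle numbers per region per hour, from a number
# size distribution, per-region deposition fractions, and minute ventilation.
# All numerical values are placeholders, NOT the ICRP 66 model.

d = np.logspace(-2, 1, 200)                      # particle diameter [um], 10 nm - 10 um
dlogd = np.diff(np.log10(d), prepend=np.log10(d[0]) - 0.015)

def lognormal_mode(N, cmd, gsd):
    """dN/dlogD for one lognormal mode (N in particles/cm^3)."""
    return N / (np.sqrt(2 * np.pi) * np.log10(gsd)) * \
           np.exp(-0.5 * (np.log10(d / cmd) / np.log10(gsd)) ** 2)

# Urban-like bimodal number distribution (placeholder values).
dN_dlogd = lognormal_mode(8000, 0.03, 1.8) + lognormal_mode(1500, 0.15, 2.0)

# Placeholder regional deposition fractions versus diameter (crude shapes only).
regions = {
    "extrathoracic": 0.2 + 0.5 * (d > 3.0),
    "bronchial":     0.1 * np.exp(-((np.log10(d) + 1.0) ** 2)),
    "alveolar":      0.4 * np.exp(-((np.log10(d) + 0.5) ** 2)),
}

minute_ventilation_l = 20.0       # L/min during moderate activity (placeholder)
air_cm3_per_hour = minute_ventilation_l * 1000.0 * 60.0

for region, frac in regions.items():
    n_per_cm3 = np.sum(dN_dlogd * frac * dlogd)   # deposited particles per cm^3 inhaled
    print(f"{region:14s}: {n_per_cm3 * air_cm3_per_hour:.2e} particles deposited per hour")
```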
Simulation of the MoEDAL experiment
NASA Astrophysics Data System (ADS)
King, Matthew; MoEDAL Collaboration
2016-04-01
The MoEDAL experiment (Monopole and Exotics Detector at the LHC) is designed to directly search for magnetic monopoles and other highly ionising stable or meta-stable particles at the LHC. The MoEDAL detector comprises an array of plastic track detectors and aluminium trapping volumes around the P8 intersection region, opposite from the LHCb detector. TimePix devices are also installed for monitoring of the experiment. As MoEDAL mostly employs passive detectors the software development focusses on particle simulation, rather than digitisation or reconstruction. Here, we present the current status of the MoEDAL simulation software. Specifically, the development of a material description of the detector and simulations of monopole production and propagation at MoEDAL.
Design and fabrication of complete dentures using CAD/CAM technology
Han, Weili; Li, Yanfeng; Zhang, Yue; lv, Yuan; Zhang, Ying; Hu, Ping; Liu, Huanyue; Ma, Zheng; Shen, Yi
2017-01-01
Abstract The aim of the study was to test the feasibility of using commercially available computer-aided design and computer-aided manufacturing (CAD/CAM) technology including 3Shape Dental System 2013 trial version, WIELAND V2.0.049 and WIELAND ZENOTEC T1 milling machine to design and fabricate complete dentures. The modeling process of full denture available in the trial version of 3Shape Dental System 2013 was used to design virtual complete dentures on the basis of 3-dimensional (3D) digital edentulous models generated from the physical models. The virtual complete dentures designed were exported to CAM software of WIELAND V2.0.049. A WIELAND ZENOTEC T1 milling machine controlled by the CAM software was used to fabricate physical dentitions and baseplates by milling acrylic resin composite plates. The physical dentitions were bonded to the corresponding baseplates to form the maxillary and mandibular complete dentures. Virtual complete dentures were successfully designed using the software through several steps including generation of 3D digital edentulous models, model analysis, arrangement of artificial teeth, trimming relief area, and occlusal adjustment. Physical dentitions and baseplates were successfully fabricated according to the designed virtual complete dentures using milling machine controlled by a CAM software. Bonding physical dentitions to the corresponding baseplates generated the final physical complete dentures. Our study demonstrated that complete dentures could be successfully designed and fabricated by using CAD/CAM. PMID:28072686
U.C. Davis high energy particle physics research: Technical progress report -- 1990
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Summaries of progress made for this period is given for each of the following areas: (1) Task A--Experiment, H1 detector at DESY; (2) Task C--Experiment, AMY detector at KEK; (3) Task D--Experiment, fixed target detectors at Fermilab; (4) Task F--Experiment, PEP detector at SLAC and pixel detector; (5) Task B--Theory, particle physics; and (6) Task E--Theory, particle physics.
A facility to search for hidden particles at the CERN SPS: the SHiP physics case.
Alekhin, Sergey; Altmannshofer, Wolfgang; Asaka, Takehiko; Batell, Brian; Bezrukov, Fedor; Bondarenko, Kyrylo; Boyarsky, Alexey; Choi, Ki-Young; Corral, Cristóbal; Craig, Nathaniel; Curtin, David; Davidson, Sacha; de Gouvêa, André; Dell'Oro, Stefano; deNiverville, Patrick; Bhupal Dev, P S; Dreiner, Herbi; Drewes, Marco; Eijima, Shintaro; Essig, Rouven; Fradette, Anthony; Garbrecht, Björn; Gavela, Belen; Giudice, Gian F; Goodsell, Mark D; Gorbunov, Dmitry; Gori, Stefania; Grojean, Christophe; Guffanti, Alberto; Hambye, Thomas; Hansen, Steen H; Helo, Juan Carlos; Hernandez, Pilar; Ibarra, Alejandro; Ivashko, Artem; Izaguirre, Eder; Jaeckel, Joerg; Jeong, Yu Seon; Kahlhoefer, Felix; Kahn, Yonatan; Katz, Andrey; Kim, Choong Sun; Kovalenko, Sergey; Krnjaic, Gordan; Lyubovitskij, Valery E; Marcocci, Simone; Mccullough, Matthew; McKeen, David; Mitselmakher, Guenakh; Moch, Sven-Olaf; Mohapatra, Rabindra N; Morrissey, David E; Ovchynnikov, Maksym; Paschos, Emmanuel; Pilaftsis, Apostolos; Pospelov, Maxim; Reno, Mary Hall; Ringwald, Andreas; Ritz, Adam; Roszkowski, Leszek; Rubakov, Valery; Ruchayskiy, Oleg; Schienbein, Ingo; Schmeier, Daniel; Schmidt-Hoberg, Kai; Schwaller, Pedro; Senjanovic, Goran; Seto, Osamu; Shaposhnikov, Mikhail; Shchutska, Lesya; Shelton, Jessie; Shrock, Robert; Shuve, Brian; Spannowsky, Michael; Spray, Andy; Staub, Florian; Stolarski, Daniel; Strassler, Matt; Tello, Vladimir; Tramontano, Francesco; Tripathi, Anurag; Tulin, Sean; Vissani, Francesco; Winkler, Martin W; Zurek, Kathryn M
2016-12-01
This paper describes the physics case for a new fixed target facility at CERN SPS. The SHiP (search for hidden particles) experiment is intended to hunt for new physics in the largely unexplored domain of very weakly interacting particles with masses below the Fermi scale, inaccessible to the LHC experiments, and to study tau neutrino physics. The same proton beam setup can be used later to look for decays of tau-leptons with lepton flavour number non-conservation, [Formula: see text] and to search for weakly-interacting sub-GeV dark matter candidates. We discuss the evidence for physics beyond the standard model and describe interactions between new particles and four different portals: scalars, vectors, fermions or axion-like particles. We discuss motivations for different models, manifesting themselves via these interactions, and how they can be probed with the SHiP experiment and present several case studies. The prospects to search for relatively light SUSY and composite particles at SHiP are also discussed. We demonstrate that the SHiP experiment has a unique potential to discover new physics and can directly probe a number of solutions of beyond the standard model puzzles, such as neutrino masses, baryon asymmetry of the Universe, dark matter, and inflation.
Evaluation of a breast software model for 2D and 3D X-ray imaging studies of the breast.
Baneva, Yanka; Bliznakova, Kristina; Cockmartin, Lesley; Marinov, Stoyko; Buliev, Ivan; Mettivier, Giovanni; Bosmans, Hilde; Russo, Paolo; Marshall, Nicholas; Bliznakov, Zhivko
2017-09-01
In X-ray imaging, test objects reproducing breast anatomy characteristics are used to optimize aspects such as image processing and reconstruction, lesion detection performance, image quality and radiation-induced detriment. Recently, a physical phantom with a structured background has been introduced for both 2D mammography and breast tomosynthesis. A software version of this phantom and a few related versions are now available, and a comparison between these 3D software phantoms and the physical phantom is presented. The software breast phantom simulates a semi-cylindrical container filled with spherical beads of different diameters. Four computational breast phantoms were generated with a dedicated software application; for two of these, physical phantoms are also available and were used for the side-by-side comparison. Planar projections in mammography and tomosynthesis were simulated under identical incident air kerma conditions. Tomosynthesis slices were reconstructed with in-house developed reconstruction software. In addition to a visual comparison, parameters such as fractal dimension, power-law exponent β and second-order statistics (skewness, kurtosis) of planar projections and tomosynthesis reconstructed images were compared. Visually, an excellent agreement between simulated and real planar and tomosynthesis images is observed. The comparison also shows an overall very good agreement between parameters evaluated from simulated and experimental images. The computational breast phantoms showed a close match with their physical versions. The detailed mathematical analysis of the images confirms the agreement between real and simulated 2D mammography and tomosynthesis images. The software phantom is ready for optimization purposes and for extrapolation to other breast imaging techniques.
Liquid argon TPC signal formation, signal processing and reconstruction techniques
NASA Astrophysics Data System (ADS)
Baller, B.
2017-07-01
This document describes a reconstruction chain that was developed for the ArgoNeuT and MicroBooNE experiments at Fermilab. These experiments study accelerator neutrino interactions that occur in a Liquid Argon Time Projection Chamber. Reconstructing the properties of particles produced in these interactions benefits from knowledge of the micro-physics processes that affect the creation and transport of ionization electrons to the readout system. A wire signal deconvolution technique was developed to convert wire signals to a standard form for hit reconstruction, to remove artifacts in the electronics chain and to remove coherent noise. A unique clustering algorithm reconstructs line-like trajectories and vertices in two dimensions, which are then matched to create 3D objects. These techniques and algorithms are available to all experiments that use the LArSoft suite of software.
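The wire-response deconvolution step described above can be illustrated with a small frequency-domain sketch. This is a generic Wiener-style deconvolution in numpy, not the actual LArSoft implementation; the response shape, noise level and function name are assumptions chosen for illustration.

```python
import numpy as np

def deconvolve_wire_signal(raw, response, noise_power=1e-3):
    """Recover an approximate ionization-charge waveform from a raw wire
    signal by dividing out the field/electronics response in the frequency
    domain, with a Wiener-style term to keep noise from blowing up."""
    n = len(raw)
    R = np.fft.rfft(response, n)                 # response in frequency space
    S = np.fft.rfft(raw, n)                      # raw signal in frequency space
    filt = np.conj(R) / (np.abs(R) ** 2 + noise_power)
    return np.fft.irfft(S * filt, n)

# Toy example: two ionization pulses smeared by an exponential response plus noise.
t = np.arange(512)
true_charge = np.zeros(512)
true_charge[[100, 130]] = [1.0, 0.6]
response = np.exp(-t / 20.0)
response /= response.sum()
raw = np.convolve(true_charge, response)[:512] + 0.01 * np.random.randn(512)
recovered = deconvolve_wire_signal(raw, response)
```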
iPadPix—A novel educational tool to visualise radioactivity measured by a hybrid pixel detector
NASA Astrophysics Data System (ADS)
Keller, O.; Schmeling, S.; Müller, A.; Benoit, M.
2016-11-01
With the ability to attribute signatures of ionising radiation to certain particle types, pixel detectors offer a unique advantage over the traditional use of Geiger-Müller tubes, including in educational settings. We demonstrate in this work how a Timepix readout chip combined with a standard 300 μm pixelated silicon sensor can be used to visualise radioactivity in real time and by means of augmented reality. The chip family is the result of technology transfer from High Energy Physics at CERN, facilitated by the Medipix Collaboration. This article summarises the development of a prototype based on an iPad mini and open source software detailed in ref. [1]. Appropriate experimental activities that explore natural radioactivity and everyday objects are given to demonstrate the use of this new tool in educational settings.
Acciarri, R.; Adamowski, M.; Artrip, D.; ...
2015-07-28
The second workshop to discuss the development of liquid argon time projection chambers (LArTPCs) in the United States was held at Fermilab on July 8-9, 2014. The workshop was organized under the auspices of the Coordinating Panel for Advanced Detectors, a body that was initiated by the American Physical Society Division of Particles and Fields. All presentations at the workshop were made in six topical plenary sessions: i) Argon Purity and Cryogenics, ii) TPC and High Voltage, iii) Electronics, Data Acquisition and Triggering, iv) Scintillation Light Detection, v) Calibration and Test Beams, and vi) Software. This document summarizes the current efforts in each of these areas. It primarily focuses on the work in the US, but also highlights work done elsewhere in the world.
Machine learning phases of matter
NASA Astrophysics Data System (ADS)
Carrasquilla, Juan; Melko, Roger G.
2017-02-01
Condensed-matter physics is the study of the collective behaviour of infinitely complex assemblies of electrons, nuclei, magnetic moments, atoms or qubits. This complexity is reflected in the size of the state space, which grows exponentially with the number of particles, reminiscent of the `curse of dimensionality' commonly encountered in machine learning. Despite this curse, the machine learning community has developed techniques with remarkable abilities to recognize, classify, and characterize complex sets of data. Here, we show that modern machine learning architectures, such as fully connected and convolutional neural networks, can identify phases and phase transitions in a variety of condensed-matter Hamiltonians. Readily programmable through modern software libraries, neural networks can be trained to detect multiple types of order parameter, as well as highly non-trivial states with no conventional order, directly from raw state configurations sampled with Monte Carlo.
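As a toy illustration of classifying phases directly from raw configurations, the following numpy-only sketch trains a minimal logistic-regression classifier on synthetic ordered versus disordered spin configurations. It is a deliberately simplified stand-in: the paper uses fully connected and convolutional networks trained on Monte Carlo samples of actual Hamiltonians, none of which is reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 8  # linear size of the toy lattice

def sample_config(ordered):
    """Crude stand-in for Monte Carlo sampling: ordered configurations are
    mostly spin-up with a little noise; disordered ones are random +/-1.
    Biasing the ordered class toward +1 breaks the up/down symmetry so a
    single linear unit suffices; the real Ising problem needs hidden units."""
    if ordered:
        spins = np.ones(L * L)
        spins[rng.random(L * L) < 0.1] = -1.0
    else:
        spins = rng.choice([-1.0, 1.0], size=L * L)
    return spins

X = np.array([sample_config(o) for o in [True] * 500 + [False] * 500])
y = np.array([1.0] * 500 + [0.0] * 500)

# Minimal logistic-regression "network" trained by gradient descent on raw spins.
w, b = np.zeros(L * L), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))       # predicted P(ordered)
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

accuracy = np.mean(((1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5) == y)
print(f"training accuracy: {accuracy:.2f}")
```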
Theoretical Studies of Alfven Waves and Energetic Particle Physics in Fusion Plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Liu
This report summarizes major theoretical findings in the linear as well as nonlinear physics of Alfvén waves and energetic particles in magnetically confined fusion plasmas. On the linear physics, a variational formulation, based on the separation of singular and regular spatial scales, for drift-Alfvén instabilities excited by energetic particles is established. This variational formulation is then applied to derive the general fishbone-like dispersion relations corresponding to the various Alfvén eigenmodes and energetic-particle modes. It is further employed to explore in depth the low-frequency Alfvén eigenmodes and demonstrate the non-perturbative nature of the energetic particles. On the nonlinear physics, novel findings are obtained on both the nonlinear wave-wave interactions and nonlinear wave-energetic particle interactions. It is demonstrated that both the energetic particles and the fine radial mode structures could qualitatively affect the nonlinear evolution of Alfvén eigenmodes. Meanwhile, a theoretical approach based on the Dyson equation is developed to treat self-consistently the nonlinear interactions between Alfvén waves and energetic particles, and is then applied to explain simulation results of energetic-particle modes. A list of relevant journal publications on the above findings is also included.
FabricS: A user-friendly, complete and robust software for particle shape-fabric analysis
NASA Astrophysics Data System (ADS)
Moreno Chávez, G.; Castillo Rivera, F.; Sarocchi, D.; Borselli, L.; Rodríguez-Sedano, L. A.
2018-06-01
Shape-fabric is a textural parameter related to the spatial arrangement of elongated particles in geological samples. Its usefulness spans a range from sedimentary petrology to igneous and metamorphic petrology. Independently of the process being studied, when a material flows, the elongated particles are oriented with their major axes in the direction of flow. In sedimentary petrology this information has been used for studies of the paleo-flow direction of turbidites, the origin of quartz sediments, and locating ignimbrite vents, among others. In addition to flow direction and its polarity, the method enables flow rheology to be inferred. The use of shape-fabric has been limited by the difficulties of automatically measuring particles and analyzing them with reliable circular statistics programs, which dampened interest in the method for a long time. Shape-fabric measurement has increased in popularity since the 1980s thanks to the development of new image analysis techniques and circular statistics software. However, the programs currently available are unreliable or old, are incompatible with newer operating systems, or require programming skills. The goal of our work is to develop a user-friendly program in the MATLAB environment, with a graphical user interface, that can process images, includes editing functions and thresholds (elongation and size) for selecting a particle population, and analyzes it with reliable circular statistics algorithms. Moreover, the program also has to produce rose diagrams, orientation vectors, and a complete series of statistical parameters. All these requirements are met by our new software. In this paper, we briefly explain the methodology, from collection of oriented samples in the field to the minimum number of particles needed to obtain reliable fabric data. We obtained the data using specific statistical tests, taking into account the degree of iso-orientation of the samples and the required degree of reliability. The program has been verified by means of several simulations performed using appropriately designed features and by analyzing real samples.
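A minimal sketch of the kind of circular (axial) statistics such a shape-fabric program relies on is shown below; doubling the orientation angles before averaging is the standard treatment for undirected fabric data. The function name and example angles are illustrative and are not taken from FabricS itself.

```python
import numpy as np

def axial_statistics(angles_deg):
    """Mean orientation and resultant length for axial (undirected) data.
    Particle major-axis orientations are defined modulo 180 degrees, so the
    angles are doubled before averaging, as is standard for shape-fabric."""
    theta = 2.0 * np.deg2rad(np.asarray(angles_deg, dtype=float))
    C, S = np.mean(np.cos(theta)), np.mean(np.sin(theta))
    R = np.hypot(C, S)                       # 0 = uniform, 1 = perfectly aligned
    mean_orientation = np.rad2deg(0.5 * np.arctan2(S, C)) % 180.0
    return mean_orientation, R

# Example: elongated particles roughly aligned around 30 degrees, with one outlier.
angles = [25, 32, 28, 35, 150, 31, 27, 33]
mean_dir, concentration = axial_statistics(angles)
print(f"mean fabric orientation: {mean_dir:.1f} deg, resultant length: {concentration:.2f}")
```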
Particle physics for primary schools—enthusing future physicists
NASA Astrophysics Data System (ADS)
Pavlidou, M.; Lazzeroni, C.
2016-09-01
In recent years, it has become widely understood that children make decisions and choices about the subjects they like while still in primary school. For this reason academic establishments focus some of their public engagement activities on younger ages. Taking advantage of Professor Lazzeroni's long-standing experience in particle physics research, during the last academic year we designed and trialled a particle physics workshop for primary schools. The workshop allows young children (ages 8-11) to learn about the world of fundamental particles and to use creative design to make particle models. The workshop has already been trialled in many primary schools, receiving very positive evaluations. The initial resources were reviewed and improved, based on the feedback received from school teachers and communicators.
Martinus Veltman, the Electroweak Theory, and Elementary Particle Physics
Martinus J.G. Veltman, the John D. MacArthur Professor Emeritus of Physics at the University of Michigan, was awarded the 1999 Nobel Prize in Physics "for elucidating the quantum structure of electroweak interactions in physics."
NASA Astrophysics Data System (ADS)
2010-09-01
WE RECOMMEND Enjoyable Physics Mechanics book makes learning more fun SEP Colorimeter Box A useful and inexpensive colorimeter for the classroom Pursuing Power and Light Account of the development of science in the 19th century SEP Bottle Rocket Launcher An excellent resource for teaching about projectiles GLE Datalogger GPS software is combined with a datalogger EDU Logger Remote datalogger has greater sensing abilities Logotron Insight iLog Studio Software enables datalogging, data analysis and modelling iPhone Apps Mobile phone games aid study of gravity WORTH A LOOK Physics of Sailing Book journeys through the importance of physics in sailing The Lightness of Being Study of what the world is made from LECTURE The 2010 IOP Schools and Colleges Lecture presents the physics of fusion WEB WATCH Planet Scicast pushes boundaries of pupil creativity
Lagrangian particles with mixing. I. Simulating scalar transport
NASA Astrophysics Data System (ADS)
Klimenko, A. Y.
2009-06-01
The physical similarity and mathematical equivalence of continuous diffusion and particle random walk forms one of the cornerstones of modern physics and the theory of stochastic processes. The randomly walking particles do not need to possess any properties other than location in physical space. However, particles used in many models dealing with simulating turbulent transport and turbulent combustion do possess a set of scalar properties, and mixing between particle properties is performed to reflect the dissipative nature of the diffusion processes. We show that continuous scalar transport and diffusion can be accurately specified by means of localized mixing between randomly walking Lagrangian particles with scalar properties, and assess errors associated with this scheme. Particles with scalar properties and localized mixing represent an alternative formulation for the process that is selected to represent the continuous diffusion. Simulating diffusion by Lagrangian particles with mixing involves three main competing requirements: minimizing stochastic uncertainty, minimizing bias introduced by numerical diffusion, and preserving independence of particles. These requirements are analyzed for two limiting cases: mixing between two particles and mixing between a large number of particles. The problem of possible dependences between particles is the most complicated. This problem is analyzed using a coupled chain of equations that has similarities with the Bogoliubov-Born-Green-Kirkwood-Yvon chain in statistical physics. Dependences between particles can be significant in close proximity of the particles, resulting in a reduced rate of mixing. This work develops further ideas introduced in the previously published letter [Phys. Fluids 19, 031702 (2007)]. Paper I of this work is followed by Paper II [Phys. Fluids 19, 065102 (2009)], where modeling of turbulent reacting flows by Lagrangian particles with localized mixing is specifically considered.
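The scheme described above can be caricatured in a few lines: particles random-walk to represent diffusion, and nearby particle pairs relax their scalar values toward a common mean to represent localized mixing. The sketch below is a simplified 1D illustration with assumed parameters, not the authors' formulation or an assessment of its errors.

```python
import numpy as np

rng = np.random.default_rng(1)
N, D, dt, steps = 2000, 0.1, 1e-3, 200
mix_radius, mix_extent = 0.02, 0.5      # locality and degree of pairwise mixing

x = rng.uniform(0.0, 1.0, N)            # particle positions in a unit domain
phi = (x < 0.5).astype(float)           # scalar property: initial step profile

for _ in range(steps):
    # 1) random walk representing the diffusive transport of the particles
    x = np.clip(x + np.sqrt(2.0 * D * dt) * rng.standard_normal(N), 0.0, 1.0)
    # 2) localized mixing: nearby particle pairs relax toward their mean scalar
    order = np.argsort(x)
    for i, j in zip(order[0::2], order[1::2]):
        if abs(x[i] - x[j]) < mix_radius:
            mean = 0.5 * (phi[i] + phi[j])
            phi[i] += mix_extent * (mean - phi[i])
            phi[j] += mix_extent * (mean - phi[j])

# The ensemble-averaged phi(x) relaxes toward the analytic diffusion profile.
```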
MO-DE-BRA-02: SIMAC: A Simulation Tool for Teaching Linear Accelerator Physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carlone, M; Harnett, N; Department of Radiation Oncology, University of Toronto, Toronto, Ontario
Purpose: The first goal of this work is to develop software that can simulate the physics of linear accelerators (linacs). The second goal is to show that this simulation tool is effective in teaching linac physics to medical physicists and linac service engineers. Methods: Linacs were modeled using analytical expressions that can correctly describe the physical response of a linac to parameter changes in real time. These expressions were programmed with a graphical user interface in order to produce an environment similar to that of linac service mode. The software, "SIMAC", has been used as a learning aid in a professional development course 3 times (2014-2016) as well as in a physics graduate program. Exercises were developed to supplement the didactic components of the courses, consisting of activities designed to reinforce the concepts of beam loading, the effect of steering coil currents on beam symmetry, and the relationship between beam energy and flatness. Results: SIMAC was used to teach 35 professionals (medical physicists, regulators and service engineers; 1-week course) as well as 20 graduate students (1-month project). In the student evaluations, 85% of the students rated the effectiveness of SIMAC as very good or outstanding, and 70% rated the software as the most effective part of the courses. Exercise results were collected showing that 100% of the students were able to use the software correctly. In exercises involving gross changes to linac operating points (i.e. energy changes), the majority of students were able to correctly perform these beam adjustments. Conclusion: Software simulation (SIMAC) can be used to effectively teach linac physics. In short courses, students were able to correctly make gross parameter adjustments that typically require much longer training times using conventional training methods.
Multi-scale sensitivity analysis of pile installation using DEM
NASA Astrophysics Data System (ADS)
Esposito, Ricardo Gurevitz; Velloso, Raquel Quadros; Vargas, Eurípedes do Amaral, Jr.; Danziger, Bernadete Ragoni
2017-12-01
The disturbances experienced by the soil due to pile installation and dynamic soil-structure interaction still present major challenges to foundation engineers. These phenomena exhibit complex behaviors that are difficult to measure in physical tests and to reproduce in numerical models. Owing to the simplified approach used by the discrete element method (DEM) to simulate large deformations and the nonlinear stress-dilatancy behavior of granular soils, the DEM is an excellent tool to investigate these processes. This study presents a sensitivity analysis of the effects of introducing a single pile using the PFC2D software developed by Itasca Co. The different scales investigated in these simulations include point and shaft resistance, alterations in porosity and stress fields, and particle displacements. Several simulations were conducted in order to investigate the effects of different numerical approaches, showing indications that the method of installation and particle rotation could greatly influence the conditions around the numerical pile. Minor effects were also noted due to changes in penetration velocity and pile-soil friction. The difference in behavior between a moving and a stationary pile shows good qualitative agreement with previous experimental results, indicating the need to perform a force-equilibrium step prior to simulating any load test.
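For readers unfamiliar with DEM, the soft-contact idea underlying codes such as PFC2D can be illustrated with a one-dimensional linear spring-dashpot contact between a driven "pile tip" and a single grain. The parameter values and setup below are purely illustrative and are unrelated to the calibration used in the study.

```python
import numpy as np

# Linear spring-dashpot ("soft contact") parameters; illustrative values only,
# not calibrated to any soil or to the study's PFC2D model.
k_n, c_n = 1.0e6, 50.0                  # normal stiffness [N/m], damping [N s/m]
r_pile, r_grain, m_grain = 0.05, 0.01, 0.02
v_pile = 0.1                            # imposed downward pile velocity [m/s]

dt = 1.0e-6
z_pile, z_grain, v_grain = 0.065, 0.0, 0.0
tip_force = []
for _ in range(100_000):
    z_pile -= v_pile * dt                          # driven pile tip advances
    overlap = (r_pile + r_grain) - (z_pile - z_grain)
    if overlap > 0.0:
        d_overlap_dt = v_pile + v_grain            # loading rate of the contact
        f_n = max(k_n * overlap + c_n * d_overlap_dt, 0.0)
    else:
        f_n = 0.0
    tip_force.append(f_n)                          # reaction felt by the pile tip
    v_grain += (-f_n / m_grain) * dt               # grain is pushed downward
    z_grain += v_grain * dt
```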
Cloud-based design of high average power traveling wave linacs
NASA Astrophysics Data System (ADS)
Kutsaev, S. V.; Eidelman, Y.; Bruhwiler, D. L.; Moeller, P.; Nagler, R.; Barbe Welzel, J.
2017-12-01
The design of industrial high average power traveling wave linacs must accurately consider some specific effects. For example, acceleration of high current beam reduces power flow in the accelerating waveguide. Space charge may influence the stability of longitudinal or transverse beam dynamics. Accurate treatment of beam loading is central to the design of high-power TW accelerators, and it is especially difficult to model in the meter-scale region where the electrons are nonrelativistic. Currently, there are two types of available codes: tracking codes (e.g. PARMELA or ASTRA) that cannot solve self-consistent problems, and particle-in-cell codes (e.g. Magic 3D or CST Particle Studio) that can model the physics correctly but are very time-consuming and resource-demanding. Hellweg is a special tool for quick and accurate electron dynamics simulation in traveling wave accelerating structures. The underlying theory of this software is based on the differential equations of motion. The effects considered in this code include beam loading, space charge forces, and external magnetic fields. We present the current capabilities of the code, provide benchmarking results, and discuss future plans. We also describe the browser-based GUI for executing Hellweg in the cloud.
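The kind of longitudinal equations of motion such a tool integrates can be sketched as follows for a single electron in a traveling wave, neglecting beam loading, space charge and transverse dynamics (the very effects Hellweg is designed to include). The gradient, frequency and phase velocity below are assumed values, and the simple second-order integrator is an illustration rather than the code's actual algorithm.

```python
import numpy as np

# Simplified longitudinal dynamics of a single electron in a traveling wave,
# ignoring beam loading, space charge and transverse motion.
c = 2.998e8                        # speed of light [m/s]
mc2 = 0.511e6                      # electron rest energy [eV]
E0 = 10.0e6                        # accelerating gradient [V/m]   (assumed)
freq = 2.856e9                     # RF frequency [Hz]             (assumed)
beta_ph = 0.999                    # wave phase velocity / c       (assumed)
omega = 2.0 * np.pi * freq

def derivs(gamma, phi):
    beta = np.sqrt(1.0 - 1.0 / gamma**2)
    dgamma_dz = (E0 / mc2) * np.cos(phi)                   # energy gain
    dphi_dz = (omega / c) * (1.0 / beta - 1.0 / beta_ph)   # phase slippage
    return dgamma_dz, dphi_dz

# Integrate a 1 MeV electron over 1 m with a simple Heun (RK2) scheme.
gamma, phi, z, dz = 1.0 + 1.0e6 / mc2, 0.0, 0.0, 1.0e-4
while z < 1.0:
    k1g, k1p = derivs(gamma, phi)
    k2g, k2p = derivs(gamma + dz * k1g, phi + dz * k1p)
    gamma += 0.5 * dz * (k1g + k2g)
    phi += 0.5 * dz * (k1p + k2p)
    z += dz

print(f"exit kinetic energy: {(gamma - 1.0) * mc2 / 1e6:.2f} MeV")
```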
The bead on a rotating hoop revisited: an unexpected resonance
NASA Astrophysics Data System (ADS)
Raviola, Lisandro A.; Véliz, Maximiliano E.; Salomone, Horacio D.; Olivieri, Néstor A.; Rodríguez, Eduardo E.
2017-01-01
The bead on a rotating hoop is a typical problem in mechanics, frequently posed to junior science and engineering students in basic physics courses. Although this system has rich dynamics, it is usually not analysed beyond the point-particle approximation in undergraduate textbooks, nor empirically investigated. Advanced textbooks show the existence of bifurcations owing to the system's nonlinear nature, and some papers demonstrate, from a theoretical standpoint, its points of contact with phase transition phenomena. However, scarce experimental research has been conducted to better understand its behaviour. We show in this paper that a minor modification to the problem leads to appealing consequences that can be studied both theoretically and empirically with the basic conceptual tools and experimental skills available to junior students. In particular, we go beyond the point-particle approximation by treating the bead as a rigid spherical body, and explore the effect of a slightly non-vertical rotation axis of the hoop, which gives rise to a resonant behaviour not considered in previous works. This study can be accomplished by means of digital video and open source software. The experience can motivate an engaging laboratory project by integrating standard curriculum topics, data analysis and experimental exploration.
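The point-particle baseline that the paper goes beyond is easy to reproduce numerically. The sketch below integrates the standard bead-on-a-rotating-hoop equation of motion for a vertical rotation axis; it does not include the rigid-sphere correction or the slightly tilted axis responsible for the resonance reported by the authors, and the parameter values are assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Point-particle bead on a hoop of radius R spinning about a vertical axis at
# angular velocity omega; theta is measured from the bottom of the hoop.
g, R, omega = 9.81, 0.10, 12.0          # omega above the critical sqrt(g/R)

def rhs(t, y):
    theta, theta_dot = y
    theta_ddot = np.sin(theta) * (omega**2 * np.cos(theta) - g / R)
    return [theta_dot, theta_ddot]

sol = solve_ivp(rhs, (0.0, 5.0), [0.1, 0.0], t_eval=np.linspace(0.0, 5.0, 2000))

# Above the bifurcation the bead oscillates about a tilted equilibrium angle.
theta_eq = np.arccos(g / (R * omega**2))
print(f"equilibrium angle: {np.degrees(theta_eq):.1f} deg, "
      f"time-averaged angle: {np.degrees(sol.y[0].mean()):.1f} deg")
```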
EDITORIAL: Metrological Aspects of Accelerator Technology and High Energy Physics Experiments
NASA Astrophysics Data System (ADS)
Romaniuk, Ryszard S.; Pozniak, Krzysztof T.
2007-08-01
The subject of this special feature in Measurement Science and Technology concerns measurement methods, devices and subsystems, both hardware and software aspects, applied in large experiments of high energy physics (HEP) and superconducting RF accelerator technology (SRF). These experiments concern mainly the physics of elementary particles or the building of new machines and detectors. The papers present practical examples of applied solutions in large, contemporary, international research projects such as HERA, LHC, FLASH, XFEL, ILC and others. These machines are unique in their global scale and consist of extremely dedicated apparatus. The apparatus is characterized by very large dimensions, a considerable use of resources and a high level of overall technical complexity. They possess a large number of measurement channels (ranging from thousands to over 100 million), are characterized by fast processing of measured data and high measurement accuracies, and work in quite adverse environments. The measurement channels cooperate with a large number of different sensors of momenta, energies, trajectories of elementary particles, electron, proton and photon beam profiles, accelerating fields in resonant cavities, and many others. The provision of high quality measurement systems requires the designers to use only the most up-to-date technical solutions, measurement technologies, components and devices. Research work in these demanding fields is a natural birthplace of new measurement methods, new data processing and acquisition algorithms, and complex, networked measurement system diagnostics and monitoring. These developments are taking place in both hardware and software layers. The chief intention of this special feature is that the papers represent equally some of the most current metrology research problems in HEP and SRF. The accepted papers have been divided into four topical groups: superconducting cavities (4 papers), low level RF systems (8 papers), ionizing radiation (5 papers) and HEP experiments (8 papers). The editors would like to thank cordially all the authors who accepted our invitation to present their very recent results. A number of authors of the papers in this issue are active in the 6th European Framework Research Program CARE (Coordinated Accelerators Research in Europe) and in ELAN, the European Linear Accelerator Network. Some authors are active in research programs of a global extent such as the LHC, ILC and GDE, the Global Design Effort for the International Linear Collider. We also would like to thank personally, as well as on behalf of all the authors, the Editorial Board of Measurement Science and Technology for accepting this very exciting field of contemporary metrology. This field really does seem to be a birthplace of a host of new metrological technologies, where the driving force is the incredibly high technical requirements that must soon be fulfilled if we dream of building new accelerators for elementary particles, and of advancing new biological materials and medicine alike. Special thanks are due to Professor R S Jachowicz of Warsaw University of Technology for initiating this issue and for continuous support and advice during our work.
Using Word Prediction Software to Increase Typing Fluency with Students with Physical Disabilities
ERIC Educational Resources Information Center
Tumlin, Jennifer; Heller, Kathryn Wolff
2004-01-01
The purpose of this study was to examine the use of word prediction software to increase typing speed and decrease spelling errors for students who have physical disabilities that affect hand use. Student perceptions regarding the effectiveness of word prediction were examined, as well as their typing rates and spelling accuracy. Four students with…
ERIC Educational Resources Information Center
Peterlin, Primoz
2010-01-01
Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared by analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared. (Contains 7…
Development and Evaluation of the Effectiveness of Computer-Assisted Physics Instruction
ERIC Educational Resources Information Center
Rahman, Mohd. Jasmy Abd; Ismail, Mohd. Arif. Hj.; Nasir, Muhammad
2014-01-01
This study aims to design and develop interactive software for teaching and learning physics about motion and vector analysis. This study also assesses its effectiveness in the classroom and assesses the learning motivation of SMA Pekanbaru's students. The software is developed using the ADDIE Model design and the Life Cycle Model and built using the…
Arduino-Based Data Acquisition into Excel, LabVIEW, and MATLAB
ERIC Educational Resources Information Center
Nichols, Daniel
2017-01-01
Data acquisition equipment for physics can be quite expensive. As an alternative, data can be acquired using a low-cost Arduino microcontroller. The Arduino has been used in physics labs where the data are acquired using the Arduino software. The Arduino software, however, does not contain a suite of tools for data fitting and analysis. The data…
Data management, archiving, visualization and analysis of space physics data
NASA Technical Reports Server (NTRS)
Russell, C. T.
1995-01-01
A series of programs for the visualization and analysis of space physics data has been developed at UCLA. In the course of those developments, a number of lessons have been learned regarding data management and data archiving, as well as data analysis. The issues now facing those wishing to develop such software, as well as the lessons learned, are reviewed. Modern media have eased many of the earlier problems of the physical volume required to store data, the speed of access, and the permanence of the records. However, the ultimate longevity of these media is still a question of debate. Finally, while software development has become easier, cost is still a limiting factor in developing visualization and analysis software.
NASA Astrophysics Data System (ADS)
Brown, Laurie Mark; Dresden, Max; Hoddeson, Lillian
2009-01-01
Part I. Introduction; 1. Pions to quarks: particle physics in the 1950s Laurie M Brown, Max Dresden and Lillian Hoddeson; 2. Particle physics in the early 1950s Chen Ning Yang; 3. An historian's interest in particle physics J. L. Heilbron; Part II. Particle discoveries in cosmic rays; 4. Cosmic-ray cloud-chamber contributions to the discovery of the strange particles in the decade 1947-1957 George D. Rochester; 5. Cosmic-ray work with emulsions in the 1940s and 1950s Donald H. Perkins; Part III. High-energy nuclear physics; Learning about nucleon resonances with pion photoproduction Robert L. Walker; 7. A personal view of nucleon structure as revealed by electron scattering Robert Hofstadter; 8. Comments on electromagnetic form factors of the nucleon Robert G. Sachs and Kameshwar C. Wali; Part IV. The new laboratory; 9. The making of an accelerator physicist Matthew Sands; 10. Accelerator design and construction in the 1950s John P. Blewett; 11. Early history of the Cosmotron and AGS Ernest D. Courant; 12. Panel on accelerators and detectors in the 1950s Lawrence W. Jones, Luis W. Alvarez, Ugo Amaldi, Robert Hofstadter, Donald W. Kerst, Robert R. Wilson; 13. Accelerators and the Midwestern Universities Research Association in the 1950s Donald W. Kerst; 14. Bubbles, sparks and the postwar laboratory Peter Galison; 15. Development of the discharge (spark) chamber in Japan in the 1950s Shuji Fukui; 16. Early work at the Bevatron: a personal account Gerson Goldhaber; 17. The discovery of the antiproton Owen Chamberlain; 18. On the antiproton discovery Oreste Piccioni; Part V. The Strange Particles; 19. The hydrogen bubble chamber and the strange resonances Luis W. Alvarez; 20. A particular view of particle physics in the fifties Jack Steinberger; 21. Strange particles William Chinowsky; 22. Strange particles: production by Cosmotron beams as observed in diffusion cloud chambers William B. Fowler; 23. From the 1940s into the 1950s Abraham Pais; Part VI. Detection of the neutrino Frederick Reines; 25. Recollections on the establishment of the weak-interaction notion Bruno M. Pontecorvo; 26. Symmetry and conservation laws in particle physics in the fifties Louis Michel; 27. A connection between the strong and weak interactions Sam B. Treiman; Part VII. Weak interactions and parity nonconservation; 29. The nondiscovery of parity nonconservation Allan Franklin; 30. K-meson decays and parity violation Richard H. Dalitz; 31. An Experimentalist's Perspective Val L. Fitch; 32. The early experiments leading to the V - A interaction Valentine L. Telegdi; 33. Midcentury adventures in particles physics E. C. G. Sudarshan; Part VIII. The particle physics community; 34. The postwar political economy of high-energy physics Robert Seidel; 35. The history of CERN during the early 1950s Edoardo Amaldi; 36. Arguments pro and contra the European laboratory in the participating countries Armin Hermann; 37. Physics and excellences of the life it brings Abdus Salam; 38. Social aspects of Japanese particle physics in the 1950s Michiji Konuma; Part IX. Theories of hadrons; 39. The early S-matrix theory and its propagation (1942-1952) Helmut Rechenberg; 40. From field theory to phenomenology: the history of dispersion relations Andy Pickering; 41. Particles as S-matrix poles: hadron democracy Geoffrey F. Chew; 42. The general theory of quantised fields in the 1950s Arthur S. Wrightman; 43. The classification and structure of hadrons Yuval Ne'eman; 44. 
Gauge principle, vector-meson dominance and spontaneous symmetry breaking Yoichiro Nambu; Part X. Personal overviews; 45. Scientific impact of the first decade of the Rochester conferences (1950-1960) Robert E. Marshak; 46. Some reflections on the history of particle physics in the 1950s Silvan S. Schweber; 47. Progress in elementary particle theory 1950-1964 Murray Gell-Mann.
Plato's Ideas and the Theories of Modern Particle Physics: Amazing Parallels
NASA Astrophysics Data System (ADS)
Machleidt, Ruprecht
2006-05-01
It is generally known that the question, "What are the most elementary particles that all matter is made from?", was already posed in antiquity. The Greek natural philosophers Leucippus and Democritus were the first to suggest that all matter was made from atoms. Therefore, most people perceive them as the ancient fathers of elementary particle physics. However, this perception is wrong. Modern particle physics is not just a simple atomism. The characteristic point of modern particle theory is that it is concerned with the symmetries underlying the particles we discover in experiment. More than 2000 years ago, a similar idea was already advanced by the Greek philosopher Plato in his dialogue Timaeus: geometric symmetries generate the atoms from just a few even more elementary items. Plato's vision is amazingly close to the ideas of modern particle theory. This fact, which is unfortunately little known, has been pointed out repeatedly by Werner Heisenberg.
Manipulation of particles by weak forces
NASA Technical Reports Server (NTRS)
Adler, M. S.; Savkar, S. D.; Summerhayes, H. R.
1972-01-01
Quantitative relations between various force fields and their effects on the motion of particles of various sizes and physical characteristics were studied. The forces considered were those derived from light, heat, microwaves, electric interactions, magnetic interactions, particulate interactions, and sound. A physical understanding is given of the forces considered, as well as formulae which express how the size of the force depends on the physical and electrical properties of the particle. The drift velocity in a viscous fluid is evaluated as a function of initial acceleration, and the effects of thermal random motion are considered. A means was developed of selectively sorting or moving particles by choosing a force system and/or environment such that the particle of interest reacts uniquely. The forces considered are tabulated, together with a demonstration of how the initial acceleration, drift velocity, and ultimate particle density distribution are affected by particle, input, and environmental parameters.
Quarks, Leptons, and Bosons: A Particle Physics Primer.
ERIC Educational Resources Information Center
Wagoner, Robert; Goldsmith, Donald
1983-01-01
Presented is a non-technical introduction to particle physics. The material is adapted from chapter 3 of "Cosmic Horizons," (by Robert Wagoner and Don Goldsmith), a lay-person's introduction to cosmology. Among the topics considered are elementary particles, forces and motion, and higher level structures. (JN)
Exploring physics concepts among novice teachers through CMAP tools
NASA Astrophysics Data System (ADS)
Suprapto, N.; Suliyanah; Prahani, B. K.; Jauhariyah, M. N. R.; Admoko, S.
2018-03-01
Concept maps are graphical tools for organising, elaborating and representing knowledge. Through the Cmap tools software, the understanding and hierarchical structuring of physics concepts among novice teachers can be explored. The software helps physics teachers indicate a physics context, focus questions, parking lots, cross-links, branching, hierarchy, and propositions. Using an exploratory quantitative study, a total of 13 concept maps on different physics topics created by novice physics teachers were analysed. The main differences between the lecturer's scoring and the peer-teachers' scoring were also illustrated. The study offers some implications, especially for physics educators, on how to determine the hierarchical structure of physics concepts, construct a physics focus question, and see how a concept in one domain of knowledge represented on the map is related to a concept in another domain shown on the map.
Higgs Particle: The Origin of Mass
NASA Astrophysics Data System (ADS)
Okada, Yasuhiro
2007-11-01
The Higgs particle is a new elementary particle predicted by the Standard Model of elementary particle physics. It plays a special role in the theory of mass generation for quarks, leptons, and gauge bosons. In this article, theoretical issues concerning the Higgs mechanism are first discussed, and then experimental prospects for Higgs particle studies at the future collider experiments, the LHC and ILC, are reviewed. The Higgs coupling determination is an essential step to establish the mass generation mechanism, which could lead to a deeper understanding of particle physics.
Final Report: High Energy Physics at the Energy Frontier at Louisiana Tech
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sawyer, Lee; Wobisch, Markus; Greenwood, Zeno D.
The Louisiana Tech University High Energy Physics group has developed a research program aimed at experimentally testing the Standard Model of particle physics and searching for new phenomena through a focused set of analyses in collaboration with the ATLAS experiment at the Large Hadron Collider (LHC) at the CERN laboratory in Geneva. This research program includes involvement in the current operation and maintenance of the ATLAS experiment and full involvement in Phase 1 and Phase 2 upgrades in preparation for future high luminosity (HL-LHC) operation of the LHC. Our focus is solely on the ATLAS experiment at the LHC, with some related detector development and software efforts. We have established important service roles on ATLAS in five major areas: Triggers, especially jet triggers; Data Quality monitoring; grid computing; GPU applications for upgrades; and radiation testing for upgrades. Our physics research is focused on multijet measurements and top quark physics in final states containing tau leptons, which we propose to extend into related searches for new phenomena. Focusing on closely related topics in the jet and top analyses and coordinating these analyses in our group has led to high efficiency and increased visibility inside the ATLAS collaboration and beyond. Based on our work in the DØ experiment in Run II of the Fermilab Tevatron Collider, Louisiana Tech has developed a reputation as one of the leading institutions pursuing jet physics studies. Currently we are applying this expertise to the ATLAS experiment, with several multijet analyses in progress.
Review of heavy charged particle transport in MCNP6.2
NASA Astrophysics Data System (ADS)
Zieb, K.; Hughes, H. G.; James, M. R.; Xu, X. G.
2018-04-01
The release of version 6.2 of the MCNP6 radiation transport code is imminent. To complement the newest release, a summary of the heavy charged particle physics models used in the 1 MeV to 1 GeV energy regime is presented. Several changes have been introduced into the charged particle physics models since the merger of the MCNP5 and MCNPX codes into MCNP6. This paper discusses the default models used in MCNP6 for continuous energy loss, energy straggling, and angular scattering of heavy charged particles. Explanations of the physics models' theories are included as well.
Review of Heavy Charged Particle Transport in MCNP6.2
Zieb, Kristofer James Ekhart; Hughes, Henry Grady III; Xu, X. George; ...
2018-01-05
The release of version 6.2 of the MCNP6 radiation transport code is imminent. To complement the newest release, a summary of the heavy charged particle physics models used in the 1 MeV to 1 GeV energy regime is presented. Several changes have been introduced into the charged particle physics models since the merger of the MCNP5 and MCNPX codes into MCNP6. Here, this article discusses the default models used in MCNP6 for continuous energy loss, energy straggling, and angular scattering of heavy charged particles. Explanations of the physics models’ theories are included as well.
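As a rough illustration of the continuous-energy-loss physics referred to in these summaries, the following sketch evaluates a simplified Bethe mass stopping power for a heavy charged particle, with the maximum energy transfer approximated as 2 m_e c^2 beta^2 gamma^2 and no shell or density-effect corrections. It is not the MCNP6 implementation; the material constants are illustrative.

```python
import numpy as np

# Simplified Bethe mass stopping power for a heavy charged particle, with the
# maximum energy transfer approximated as 2*me*c^2*beta^2*gamma^2 and no shell
# or density-effect corrections.
K = 0.307075            # MeV cm^2 / mol
me_c2 = 0.510999        # electron rest energy [MeV]

def bethe_stopping_power(T, M, z, Z_over_A, I_eV):
    """-dE/dx in MeV cm^2/g for kinetic energy T [MeV] and particle mass M [MeV]."""
    gamma = 1.0 + T / M
    beta2 = 1.0 - 1.0 / gamma**2
    I = I_eV * 1.0e-6                                # mean excitation energy [MeV]
    log_arg = 2.0 * me_c2 * beta2 * gamma**2 / I
    return K * z**2 * Z_over_A / beta2 * (np.log(log_arg) - beta2)

# Example: 100 MeV proton in water (Z/A ~ 0.555, I ~ 75 eV); roughly 7.3 MeV cm^2/g.
print(f"{bethe_stopping_power(100.0, 938.272, 1, 0.555, 75.0):.2f} MeV cm^2/g")
```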
Explaining Melting and Evaporation below Boiling Point. Can Software Help with Particle Ideas?
ERIC Educational Resources Information Center
Papageorgiou, George; Johnson, Philip; Fotiades, Fotis
2008-01-01
This paper reports the findings of a study exploring the use of a software package to help pupils understand particulate explanations for melting and evaporation below boiling point. Two matched classes in a primary school in Greece (ages 11-12, n = 16 and 19) were involved in a short intervention of six one-hour lessons. Covering the same…
Yunker, Peter J; Chen, Ke; Gratale, Matthew D; Lohr, Matthew A; Still, Tim; Yodh, A G
2014-05-01
This review collects and describes experiments that employ colloidal suspensions to probe physics in ordered and disordered solids and related complex fluids. The unifying feature of this body of work is its clever usage of poly(N-isopropylacrylamide) (PNIPAM) microgel particles. These temperature-sensitive colloidal particles provide experimenters with a 'knob' for in situ control of particle size, particle interaction and particle packing fraction that, in turn, influence the structural and dynamical behavior of the complex fluids and solids. A brief summary of PNIPAM particle synthesis and properties is given, followed by a synopsis of current activity in the field. The latter discussion describes a variety of soft matter investigations including those that explore formation and melting of crystals and clusters, and those that probe structure, rearrangement and rheology of disordered (jammed/glassy) and partially ordered matter. The review, therefore, provides a snapshot of a broad range of physics phenomenology which benefits from the unique properties of responsive microgel particles.
Special Report: Part One. New Tools for Professionals.
ERIC Educational Resources Information Center
Liskin, Miriam; And Others
1984-01-01
This collection of articles includes an examination of word-processing software; project management software; new expert systems that turn microcomputers into logical, well-informed consultants; simulated negotiation software; telephone management systems; and the physical design of an efficient microcomputer work space. (MBR)
DOE R&D Accomplishments Database
Dahms, A. S.; Boyer, P. D.
This document discusses the following topics in high energy physics: The Particle Zoo; The Strong and the Weak; The Particle Explosion; Deep Inside the Nucleon; The Search for Unity; Physics in Collision; The Standard Model; Particles and the Cosmos; and Practical Benefits.
A Hybrid Physics-Based Data-Driven Approach for Point-Particle Force Modeling
NASA Astrophysics Data System (ADS)
Moore, Chandler; Akiki, Georges; Balachandar, S.
2017-11-01
This study improves upon the physics-based pairwise interaction extended point-particle (PIEP) model. The PIEP model leverages a physical framework to predict fluid-mediated interactions between solid particles. While the PIEP model is a powerful tool, its pairwise assumption leads to increased error in flows with high particle volume fractions. To reduce this error, a regression algorithm is used to model the differences between the current PIEP model's predictions and the results of direct numerical simulations (DNS) for an array of monodisperse solid particles subjected to various flow conditions. The resulting statistical model and the physical PIEP model are superimposed to construct a hybrid, physics-based data-driven PIEP model. It must be noted that the performance of a pure data-driven approach without the model form provided by the physical PIEP model is substantially inferior. The hybrid model's predictive capabilities are analyzed using additional DNS. In every case tested, the hybrid PIEP model's predictions are more accurate than those of the physical PIEP model. This material is based upon work supported by the National Science Foundation Graduate Research Fellowship Program under Grant No. DGE-1315138 and the U.S. DOE, NNSA, ASC Program, as a Cooperative Agreement under Contract No. DE-NA0002378.
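The hybrid idea (a physics model plus a data-driven correction fitted to residuals against DNS) can be sketched generically as follows. The drag correlation, the synthetic "DNS" data and the polynomial regression are all stand-ins chosen for illustration; they are not the PIEP model or the regression algorithm used by the authors.

```python
import numpy as np

rng = np.random.default_rng(2)

def physics_drag(Re):
    """Stand-in for a physics-based force model: the Schiller-Naumann
    correction to Stokes drag for an isolated particle."""
    return 1.0 + 0.15 * Re**0.687

# Synthetic "DNS" data: the true force also depends on the particle volume
# fraction, which the single-particle physics model does not capture.
Re = rng.uniform(1.0, 100.0, 400)
phi = rng.uniform(0.05, 0.4, 400)
F_dns = physics_drag(Re) * (1.0 + 4.0 * phi) + 0.05 * rng.standard_normal(400)

# Fit a simple polynomial regression to the residual F_dns - F_physics.
residual = F_dns - physics_drag(Re)
features = np.column_stack([np.ones_like(Re), Re, phi, Re * phi, phi**2])
coeffs, *_ = np.linalg.lstsq(features, residual, rcond=None)

def hybrid_force(Re, phi):
    """Hybrid prediction: physics model plus data-driven residual correction."""
    feats = np.column_stack([np.ones_like(Re), Re, phi, Re * phi, phi**2])
    return physics_drag(Re) + feats @ coeffs

rms_physics = np.sqrt(np.mean((F_dns - physics_drag(Re))**2))
rms_hybrid = np.sqrt(np.mean((F_dns - hybrid_force(Re, phi))**2))
print(f"RMS error: physics only {rms_physics:.3f}, hybrid {rms_hybrid:.3f}")
```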
Particle Dark Matter constraints: the effect of Galactic uncertainties
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benito, Maria; Bernal, Nicolás; Iocco, Fabio
2017-02-01
Collider, space, and Earth based experiments are now able to probe several extensions of the Standard Model of particle physics which provide viable dark matter candidates. Direct and indirect dark matter searches rely on inputs of astrophysical nature, such as the local dark matter density or the shape of the dark matter density profile in the target object. The determination of these quantities is highly affected by astrophysical uncertainties. The latter, especially those for our own Galaxy, are poorly known, and often not fully accounted for when analyzing the phenomenology of particle physics models. In this paper we present a systematic, quantitative estimate of how astrophysical uncertainties on Galactic quantities (such as the local galactocentric distance, circular velocity, or the morphology of the stellar disk and bulge) propagate to the determination of the phenomenology of particle physics models, thus eventually affecting the determination of new physics parameters. We present results in the context of two specific extensions of the Standard Model (the Singlet Scalar and the Inert Doublet) that we adopt as case studies for their simplicity in illustrating the magnitude and impact of such uncertainties on the parameter space of the particle physics model itself. Our findings point toward very relevant effects of current Galactic uncertainties on the determination of particle physics parameters, and urge a systematic estimate of such uncertainties in more complex scenarios, in order to achieve constraints on the determination of new physics that realistically include all known uncertainties.
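A minimal example of how such astrophysical uncertainties propagate into particle physics quantities: a direct-detection event rate scales linearly with the local dark matter density, so an uncertain density maps directly onto an uncertain cross-section bound. The numbers below are assumed for illustration and do not come from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# A direct-detection event rate scales linearly with the local dark matter
# density rho_0, so an experimental upper limit on the rate translates into a
# cross-section bound proportional to 1/rho_0.
sigma_limit_ref = 1.0e-46        # cm^2, hypothetical bound quoted at rho_ref
rho_ref = 0.3                    # GeV/cm^3, reference density assumed by the experiment

# Treat the local density as uncertain, e.g. rho_0 = 0.4 +/- 0.1 GeV/cm^3.
rho_samples = rng.normal(0.4, 0.1, 100_000)
rho_samples = rho_samples[rho_samples > 0.1]          # drop unphysical draws

sigma_samples = sigma_limit_ref * rho_ref / rho_samples
lo, med, hi = np.percentile(sigma_samples, [16, 50, 84])
print(f"rescaled limit: {med:.2e} cm^2 (+{hi - med:.1e} / -{med - lo:.1e})")
```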
Measurement and Modeling of Electromagnetic Scattering by Particles and Particle Groups. Chapter 3
NASA Technical Reports Server (NTRS)
Mishchenko, Michael I.
2015-01-01
Small particles forming clouds of interstellar and circumstellar dust, regolith surfaces of many solar system bodies, and cometary atmospheres have a strong and often controlling effect on many ambient physical and chemical processes. Similarly, aerosol and cloud particles exert a strong influence on the regional and global climates of the Earth, other planets of the solar system, and exoplanets. Therefore, detailed and accurate knowledge of physical and chemical characteristics of such particles has the utmost scientific importance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zelenyuk, Alla; Imre, D.; Wilson, Jacqueline M.
2015-02-01
Understanding the effect of aerosols on climate requires knowledge of the size and chemical composition of individual aerosol particles - two fundamental properties that determine an aerosol's optical properties and ability to serve as cloud condensation or ice nuclei. Here we present miniSPLAT, our new aircraft-compatible single particle mass spectrometer, which measures in situ and in real time the size and chemical composition of individual aerosol particles with extremely high sensitivity, temporal resolution, and sizing precision on the order of a monolayer. miniSPLAT operates in dual data acquisition mode to measure, in addition to single particle size and composition, particle number concentrations, size distributions, density, and asphericity with high temporal resolution. When compared to our previous instrument, SPLAT II, miniSPLAT has been significantly reduced in size, weight, and power consumption without loss in performance. We also present ND-Scope, our newly developed interactive visual analytics software package. ND-Scope is designed to explore and visualize the vast amount of complex, multidimensional data acquired by our single particle mass spectrometers, along with other aerosol and cloud characterization instruments on board aircraft. We demonstrate that ND-Scope makes it possible to visualize the relationships between different observables and to view the data in a geo-spatial context, using the interactive and fully coupled Google Earth and Parallel Coordinates displays. Here we illustrate the utility of ND-Scope to visualize the spatial distribution of atmospheric particles of different compositions, and explore the relationship between individual particle composition and their activity as cloud condensation nuclei.
Challenges and Demands on Automated Software Revision
NASA Technical Reports Server (NTRS)
Bonakdarpour, Borzoo; Kulkarni, Sandeep S.
2008-01-01
In the past three decades, automated program verification has undoubtedly been one of the most successful contributions of formal methods to software development. However, when verification of a program against a logical specification discovers bugs in the program, manual manipulation of the program is needed in order to repair it. Thus, in the face of the existence of numerous unverified and uncertified legacy software systems in virtually any organization, tools that enable engineers to automatically verify and subsequently fix existing programs are highly desirable. In addition, since the requirements of software systems often evolve during the software life cycle, the issue of incomplete specification has become a common reality in many design and development teams. Thus, automated techniques that revise existing programs according to new specifications are of great assistance to designers, developers, and maintenance engineers. As a result, incorporating program synthesis techniques, where an algorithm generates a program that is correct by construction, seems to be a necessity. The notion of manual program repair described above turns out to be even more complex when programs are integrated with large collections of sensors and actuators in hostile physical environments, in the so-called cyber-physical systems. When such systems are safety- or mission-critical (e.g., in avionics systems), it is essential that the system reacts to physical events such as faults, delays, signals, attacks, etc., so that the system specification is not violated. In fact, since it is impossible to anticipate all possible such physical events at design time, it is highly desirable to have automated techniques that revise programs with respect to newly identified physical events according to the system specification.
Quantum Optics, Diffraction Theory, and Elementary Particle Physics
Glauber, Roy
2018-05-22
Physical optics has expanded greatly in recent years. Though it remains part of the ancestry of elementary particle physics, there are once again lessons to be learned from it. I shall discuss several of these, including some that have emerged at CERN and Brookhaven.
Software and languages for microprocessors
NASA Astrophysics Data System (ADS)
Williams, David O.
1986-08-01
This paper forms the basis for lectures given at the 6th Summer School on Computing Techniques in Physics, organised by the Computational Physics group of the European Physics Society, and held at the Hotel Ski, Nové Město na Moravě, Czechoslovakia, on 17-26 September 1985. Various types of microprocessor applications are discussed and the main emphasis of the paper is devoted to 'embedded' systems, where the software development is not carried out on the target microprocessor. Some information is provided on the general characteristics of microprocessor hardware. Various types of microprocessor operating system are compared and contrasted. The selection of appropriate languages and software environments for use with microprocessors is discussed. Mechanisms for interworking between different languages, including reasonable error handling, are treated. The CERN developed cross-software suite for the Motorola 68000 family is described. Some remarks are made concerning program tools applicable to microprocessors. PILS, a Portable Interactive Language System, which can be interpreted or compiled for a range of microprocessors, is described in some detail, and the implementation techniques are discussed.
ERIC Educational Resources Information Center
Yilmaz, Gül Kaleli
2015-01-01
This study aims to investigate the effects of using Dynamic Geometry Software (DGS) Cabri II Plus and physical manipulatives on the transformational geometry achievement of candidate teachers. In this study, the semi-experimental method was used, consisting of two experimental and one control groups. The samples of this study were 117 students. A…
Fall 2014 SEI Research Review High Confidence Cyber Physical Systems
2014-10-28
Presentation by de Niz at the Fall 2014 SEI Research Review, Software Engineering Institute, Carnegie Mellon University, 28 October 2014.
Reliability Validation and Improvement Framework
2012-11-01
systems. Steps in that direction include the use of the Architecture Tradeoff Analysis Method® (ATAM®) developed at the Carnegie Mellon... embedded software; cyber-physical systems (CPSs), to indicate that the embedded software interacts with, manages, and controls a physical system [Lee... the use of formal static analysis methods to increase our confidence in system operation beyond testing. However, analysis results
Trusted Silicon Stratus (TSS) Workshop
2011-02-01
business case for a proposed Infrastructure-as-a-Service (IaaS)/Software-as-a-Service (SaaS) cloud architecture. User desires for innovative pricing and... Final technical report by Nimbis Services Incorporated, Rome, NY, United States Air Force, February 2011.
Flowing Valued Information and Cyber-Physical Situational Awareness
2012-01-01
file type” constraints. The basic software supporting encryption and signing uses the OPENSSL software suite (the November 2009 version is...authorities for each organization can use OPENSSL software to generate their public and private keys. The MBTC does need to know the public or private
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habib, Salman; Roser, Robert
Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habib, Salman; Roser, Robert; LeCompte, Tom
2015-10-29
Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.
NASA Technical Reports Server (NTRS)
Perkins, D. H.
1986-01-01
Elementary particle physics is discussed. Status of the Standard Model of electroweak and strong interactions; phenomena beyond the Standard Model; new accelerator projects; and possible contributions from non-accelerator experiments are examined.
A proposed physical analog for a quantum probability amplitude
NASA Astrophysics Data System (ADS)
Boyd, Jeffrey
What is the physical analog of a probability amplitude? All quantum mathematics, including quantum information, is built on amplitudes. Every other science uses probabilities; QM alone uses their square root. Why? This question has been asked for a century, but no one previously has proposed an answer. We will present cylindrical helices moving toward a particle source, which particles follow backwards. Consider Feynman's book QED. He speaks of amplitudes moving through space like the hand of a spinning clock. His hand is a complex vector. It traces a cylindrical helix in Cartesian space. The Theory of Elementary Waves changes direction so Feynman's clock faces move toward the particle source. Particles follow amplitudes (quantum waves) backwards. This contradicts wave particle duality. We will present empirical evidence that wave particle duality is wrong about the direction of particles versus waves. This involves a paradigm shift, and paradigm shifts are always controversial. We believe that our model is the ONLY proposal ever made for the physical foundations of probability amplitudes. We will show that our ``probability amplitudes'' in physical nature form a Hilbert vector space with adjoints, an inner product and support both linear algebra and Dirac notation.
HIGH ENERGY PHYSICS: CERN Link Breathes Life Into Russian Physics.
Stone, R
2000-10-13
Without fanfare, 600 Russian scientists here at CERN, the European particle physics laboratory, are playing key roles in building the Large Hadron Collider (LHC), a machine that will explore fundamental questions such as why particles have mass, as well as search for exotic new particles whose existence would confirm supersymmetry, a popular theory that aims to unify the four forces of nature. In fact, even though Russia is not one of CERN's 20 member states, most top high-energy physicists in Russia are working on the LHC. Some say their work could prove the salvation of high-energy physics back home.
NASA Astrophysics Data System (ADS)
2010-05-01
Teaching: The epiSTEMe project: KS3 maths and science improvement; Field trip: Pupils learn physics in a stately home; Conference: ShowPhysics welcomes fun in Europe; Student numbers: Physics numbers increase in UK; Tournament: Physics tournament travels to Singapore; Particle physics: Hadron Collider sets new record; Astronomy: Take your classroom into space; Forthcoming Events
Performance profiling for brachytherapy applications
NASA Astrophysics Data System (ADS)
Choi, Wonqook; Cho, Kihyeon; Yeo, Insung
2018-05-01
In many physics applications, a significant amount of software (e.g. R, ROOT and Geant4) is developed on novel computing architectures, and much effort is expended to ensure the software is efficient in terms of central processing unit (CPU) time and memory usage. Profiling tools are used to evaluate this efficiency; however, few such tools are able to accommodate low-energy physics regions. To address this limitation, we developed a low-energy physics profiling system in Geant4 to profile the CPU time and memory of software applications in brachytherapy applications. This paper describes and evaluates specific models that are applied to brachytherapy applications in Geant4, such as QGSP_BIC_LIV, QGSP_BIC_EMZ, and QGSP_BIC_EMY. The physics range in this tool allows it to be used to generate low-energy profiles in brachytherapy applications. This was a limitation in previous studies, which caused us to develop a new profiling tool that supports profiling in the MeV range, in contrast to the TeV range that is supported by existing high-energy profiling tools. In order to easily compare the profiling results between low-energy and high-energy modes, we employed the same software architecture as that in the SimpliCarlo tool developed at the Fermi National Accelerator Laboratory (FNAL) for the Large Hadron Collider (LHC). The results show that the newly developed profiling system for low-energy physics (less than MeV) complements the current profiling system used for high-energy physics (greater than TeV) applications.
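As a rough illustration of the two quantities such a profiler records per job, CPU time and peak memory, the following minimal Python sketch measures them with standard-library tools only; it is a stand-in, not the Geant4-based tooling described above, and the profiled workload is an arbitrary placeholder.

import resource  # Unix-only standard-library module
import time

def profile_run(func, *args):
    """Run func(*args) and report (result, CPU seconds, peak resident memory in kB)."""
    start = time.process_time()
    result = func(*args)
    cpu_seconds = time.process_time() - start
    peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss  # reported in kB on Linux
    return result, cpu_seconds, peak_kb

_, cpu_s, mem_kb = profile_run(sum, range(10_000_000))  # placeholder workload
print(f"CPU time: {cpu_s:.2f} s, peak RSS: {mem_kb} kB")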
Design Considerations for High Energy Electron -- Positron Storage Rings
DOE R&D Accomplishments Database
Richter, B.
1966-11-01
High energy electron-positron storage rings give a way of making a new attack on the most important problems of elementary particle physics. All of us who have worked in the storage ring field designing, building, or using storage rings know this. The importance of that part of storage ring work concerning tests of quantum electrodynamics and mu meson physics is also generally appreciated by the larger physics community. However, I do not think that most of the physicists working in the elementary particle physics field realize the importance of the contribution that storage ring experiments can make to our understanding of the strongly interacting particles. I would therefore like to spend the next few minutes discussing the sort of things that one can do with storage rings in the strongly interacting particle field.
NASA Astrophysics Data System (ADS)
Grupen, Claus; Shwartz, Boris
2011-09-01
Preface to the first edition; Preface to the second edition; Introduction; 1. Interactions of particles and radiation with matter; 2. Characteristic properties of detectors; 3. Units of radiation measurements and radiation sources; 4. Accelerators; 5. Main physical phenomena used for particle detection and basic counter types; 6. Historical track detectors; 7. Track detectors; 8. Calorimetry; 9. Particle identification; 10. Neutrino detectors; 11. Momentum measurement and muon detection; 12. Ageing and radiation effects; 13. Example of a general-purpose detector: Belle; 14. Electronics; 15. Data analysis; 16. Applications of particle detectors outside particle physics; 17. Glossary; 18. Solutions; 19. Resumé; Appendixes; Index.
Determination of the number of J/ψ events with inclusive J/ψ decays
NASA Astrophysics Data System (ADS)
Ablikim, M.; Achasov, M. N.; Ai, X. C.; Albayrak, O.; Albrecht, M.; Ambrose, D. J.; Amoroso, A.; An, F. F.; An, Q.; Bai, J. Z.; Baldini Ferroli, R.; Ban, Y.; Bennett, D. W.; Bennett, J. V.; Bertani, M.; Bettoni, D.; Bian, J. M.; Bianchi, F.; Boger, E.; Boyko, I.; Briere, R. A.; Cai, H.; Cai, X.; Cakir, O.; Calcaterra, A.; Cao, G. F.; Cetin, S. A.; Chang, J. F.; Chelkov, G.; Chen, G.; Chen, H. S.; Chen, H. Y.; Chen, J. C.; Chen, M. L.; Chen, S. J.; Chen, X.; Chen, X. R.; Chen, Y. B.; Cheng, H. P.; Chu, X. K.; Cibinetto, G.; Dai, H. L.; Dai, J. P.; Dbeyssi, A.; Dedovich, D.; Deng, Z. Y.; Denig, A.; Denysenko, I.; Destefanis, M.; De Mori, F.; Ding, Y.; Dong, C.; Dong, J.; Dong, L. Y.; Dong, M. Y.; Dou, Z. L.; Du, S. X.; Duan, P. F.; Fan, J. Z.; Fang, J.; Fang, S. S.; Fang, X.; Fang, Y.; Farinelli, R.; Fava, L.; Fedorov, O.; Feldbauer, F.; Felici, G.; Feng, C. Q.; Fioravanti, E.; Fritsch, M.; Fu, C. D.; Gao, Q.; Gao, X. L.; Gao, X. Y.; Gao, Y.; Gao, Z.; Garzia, I.; Goetzen, K.; Gong, L.; Gong, W. X.; Gradl, W.; Greco, M.; Gu, M. H.; Gu, Y. T.; Guan, Y. H.; Guo, A. Q.; Guo, L. B.; Guo, Y.; Guo, Y. P.; Haddadi, Z.; Hafner, A.; Han, S.; Hao, X. Q.; Harris, F. A.; He, K. L.; Held, T.; Heng, Y. K.; Hou, Z. L.; Hu, C.; Hu, H. M.; Hu, J. F.; Hu, T.; Hu, Y.; Huang, G. S.; Huang, J. S.; Huang, X. T.; Huang, Y.; Hussain, T.; Ji, Q.; Ji, Q. P.; Ji, X. B.; Ji, X. L.; Jiang, L. W.; Jiang, X. S.; Jiang, X. Y.; Jiao, J. B.; Jiao, Z.; Jin, D. P.; Jin, S.; Johansson, T.; Julin, A.; Kalantar-Nayestanaki, N.; Kang, X. L.; Kang, X. S.; Kavatsyuk, M.; Ke, B. C.; Kiese, P.; Kliemt, R.; Kloss, B.; Kolcu, O. B.; Kopf, B.; Kornicer, M.; Kupsc, A.; Kühn, W.; Lange, J. S.; Lara, M.; Larin, P.; Leng, C.; Li, C.; Li, Cheng; Li, D. M.; Li, F.; Li, F. Y.; Li, G.; Li, H. B.; Li, J. C.; Li, Jin; Li, K.; Li, K.; Li, Lei; Li, P. R.; Li, Q. Y.; Li, T.; Li, W. D.; Li, W. G.; Li, X. L.; Li, X. N.; Li, X. Q.; Li, Z. B.; Liang, H.; Liang, Y. F.; Liang, Y. T.; Liao, G. R.; Lin, D. X.; Liu, B. J.; Liu, C. X.; Liu, D.; Liu, F. H.; Liu, Fang; Liu, Feng; Liu, H. B.; Liu, H. H.; Liu, H. H.; Liu, H. M.; Liu, J.; Liu, J. B.; Liu, J. P.; Liu, J. Y.; Liu, K.; Liu, K. Y.; Liu, L. D.; Liu, P. L.; Liu, Q.; Liu, S. B.; Liu, X.; Liu, Y. B.; Liu, Z. A.; Liu, Zhiqing; Loehner, H.; Lou, X. C.; Lu, H. J.; Lu, J. G.; Lu, Y.; Lu, Y. P.; Luo, C. L.; Luo, M. X.; Luo, T.; Luo, X. L.; Lyu, X. R.; Ma, F. C.; Ma, H. L.; Ma, L. L.; Ma, Q. M.; Ma, T.; Ma, X. N.; Ma, X. Y.; Ma, Y. M.; Maas, F. E.; Maggiora, M.; Mao, Y. J.; Mao, Z. P.; Marcello, S.; Messchendorp, J. G.; Min, J.; Min, T. J.; Mitchell, R. E.; Mo, X. H.; Mo, Y. J.; Morales Morales, C.; Muchnoi, N. Yu.; Muramatsu, H.; Nefedov, Y.; Nerling, F.; Nikolaev, I. B.; Ning, Z.; Nisar, S.; Niu, S. L.; Niu, X. Y.; Olsen, S. L.; Ouyang, Q.; Pacetti, S.; Pan, Y.; Patteri, P.; Pelizaeus, M.; Peng, H. P.; Peters, K.; Pettersson, J.; Ping, J. L.; Ping, R. G.; Poling, R.; Prasad, V.; Qi, H. R.; Qi, M.; Qian, S.; Qiao, C. F.; Qin, L. Q.; Qin, N.; Qin, X. S.; Qin, Z. H.; Qiu, J. F.; Rashid, K. H.; Redmer, C. F.; Ripka, M.; Rong, G.; Rosner, Ch.; Ruan, X. D.; Santoro, V.; Sarantsev, A.; Savrié, M.; Schoenning, K.; Schumann, S.; Shan, W.; Shao, M.; Shen, C. P.; Shen, P. X.; Shen, X. Y.; Sheng, H. Y.; Song, W. M.; Song, X. Y.; Sosio, S.; Spataro, S.; Sun, G. X.; Sun, J. F.; Sun, S. S.; Sun, Y. J.; Sun, Y. Z.; Sun, Z. J.; Sun, Z. T.; Tang, C. J.; Tang, X.; Tapan, I.; Thorndike, E. H.; Tiemens, M.; Ullrich, M.; Uman, I.; Varner, G. S.; Wang, B.; Wang, B. L.; Wang, D.; Wang, D. Y.; Wang, K.; Wang, L. L.; Wang, L. 
S.; Wang, M.; Wang, P.; Wang, P. L.; Wang, W.; Wang, W. P.; Wang, X. F.; Wang, Y. D.; Wang, Y. F.; Wang, Y. Q.; Wang, Z.; Wang, Z. G.; Wang, Z. H.; Wang, Z. Y.; Weber, T.; Wei, D. H.; Weidenkaff, P.; Wen, S. P.; Wiedner, U.; Wolke, M.; Wu, L. H.; Wu, Z.; Xia, L.; Xia, L. G.; Xia, Y.; Xiao, D.; Xiao, H.; Xiao, Z. J.; Xie, Y. G.; Xiu, Q. L.; Xu, G. F.; Xu, L.; Xu, Q. J.; Xu, Q. N.; Xu, X. P.; Yan, L.; Yan, W. B.; Yan, W. C.; Yan, Y. H.; Yang, H. J.; Yang, H. X.; Yang, L.; Yang, Y. X.; Ye, M.; Ye, M. H.; Yin, J. H.; Yu, B. X.; Yu, C. X.; Yu, J. S.; Yuan, C. Z.; Yuan, W. L.; Yuan, Y.; Yuncu, A.; Zafar, A. A.; Zallo, A.; Zeng, Y.; Zeng, Z.; Zhang, B. X.; Zhang, B. Y.; Zhang, C.; Zhang, C. C.; Zhang, D. H.; Zhang, H. H.; Zhang, H. Y.; Zhang, J. J.; Zhang, J. L.; Zhang, J. Q.; Zhang, J. W.; Zhang, J. Y.; Zhang, J. Z.; Zhang, K.; Zhang, L.; Zhang, X. Y.; Zhang, Y.; Zhang, Y. H.; Zhang, Y. N.; Zhang, Y. T.; Zhang, Yu; Zhang, Z. H.; Zhang, Z. P.; Zhang, Z. Y.; Zhao, G.; Zhao, J. W.; Zhao, J. Y.; Zhao, J. Z.; Zhao, Lei; Zhao, Ling; Zhao, M. G.; Zhao, Q.; Zhao, Q. W.; Zhao, S. J.; Zhao, T. C.; Zhao, Y. B.; Zhao, Z. G.; Zhemchugov, A.; Zheng, B.; Zheng, J. P.; Zheng, W. J.; Zheng, Y. H.; Zhong, B.; Zhou, L.; Zhou, X.; Zhou, X. K.; Zhou, X. R.; Zhou, X. Y.; Zhu, K.; Zhu, K. J.; Zhu, S.; Zhu, S. H.; Zhu, X. L.; Zhu, Y. C.; Zhu, Y. S.; Zhu, Z. A.; Zhuang, J.; Zotti, L.; Zou, B. S.; Zou, J. H.; BESIII Collaboration
2017-01-01
A measurement of the number of J/ψ events collected with the BESIII detector in 2009 and 2012 is performed using inclusive decays of the J/ψ. The number of J/ψ events taken in 2009 is recalculated to be (223.7 ± 1.4) × 10^6, which is in good agreement with the previous measurement, but with significantly improved precision due to improvements in the BESIII software. The number of J/ψ events taken in 2012 is determined to be (1086.9 ± 6.0) × 10^6. In total, the number of J/ψ events collected with the BESIII detector is measured to be (1310.6 ± 7.0) × 10^6, where the uncertainty is dominated by systematic effects and the statistical uncertainty is negligible. Supported by National Key Basic Research Program of China (2015CB856700), National Natural Science Foundation of China (NSFC) (10805053, 11125525, 11175188, 11235011, 11322544, 11335008, 11425524), Chinese Academy of Sciences (CAS) Large-Scale Scientific Facility Program, the CAS Center for Excellence in Particle Physics (CCEPP), Collaborative Innovation Center for Particles and Interactions (CICPI), Joint Large-Scale Scientific Facility Funds of NSFC and CAS (11179007, U1232201, U1232107, U1332201), CAS (KJCX2-YW-N29, KJCX2-YW-N45), 100 Talents Program of CAS, INPAC and Shanghai Key Laboratory for Particle Physics and Cosmology, German Research Foundation DFG (Collaborative Research Center CRC-1044), Istituto Nazionale di Fisica Nucleare, Italy; Ministry of Development of Turkey (DPT2006K-120470), Russian Foundation for Basic Research (14-07-91152), U. S. Department of Energy (DE-FG02-04ER41291, DE-FG02-05ER41374, DE-FG02-94ER40823, DESC0010118), U.S. National Science Foundation, University of Groningen (RuG) and the Helmholtzzentrum fuer Schwerionenforschung GmbH (GSI), Darmstadt; WCU Program of National Research Foundation of Korea (R32-2008-000-10155-0)
A Validation Framework for the Long Term Preservation of High Energy Physics Data
NASA Astrophysics Data System (ADS)
Ozerov, Dmitri; South, David M.
2014-06-01
The study group on data preservation in high energy physics, DPHEP, is moving to a new collaboration structure, which will focus on the implementation of preservation projects, such as those described in the group's large-scale report published in 2012. One such project is the development of a validation framework, which checks the compatibility of evolving computing environments and technologies with the experiments' software for as long as possible, with the aim of substantially extending the lifetime of the analysis software, and hence of the usability of the data. The framework is designed to automatically test and validate the software and data of an experiment against changes and upgrades to the computing environment, as well as changes to the experiment software itself. Technically, this is realised using a framework capable of hosting a number of virtual machine images, built with different configurations of operating systems and the relevant software, including any necessary external dependencies.
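A minimal sketch of the validation idea, assuming hypothetical image and test names and a placeholder boot_and_run hook; it is not the DPHEP framework's actual interface, only an illustration of looping a fixed test suite over several virtual machine configurations and flagging the ones where the software no longer validates.

IMAGES = ["slc5-exp-sw-2010", "slc6-exp-sw-2013", "centos7-exp-sw-2018"]  # assumed image names
TESTS = ["build_software", "run_reference_analysis", "compare_with_benchmark"]  # assumed tests

def boot_and_run(image: str, test: str) -> bool:
    """Hypothetical hook: boot `image`, execute `test` inside it, return success.
    A real implementation would call the virtualisation backend; here the run
    is only simulated so the sketch stays executable."""
    print(f"[{image}] running {test}")
    return True

def validate() -> dict:
    report = {}
    for image in IMAGES:
        results = {test: boot_and_run(image, test) for test in TESTS}
        report[image] = "OK" if all(results.values()) else "FAILED"
    return report

if __name__ == "__main__":
    for image, status in validate().items():
        print(image, status)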
Belle2VR: A Virtual-Reality Visualization of Subatomic Particle Physics in the Belle II Experiment.
Duer, Zach; Piilonen, Leo; Glasson, George
2018-05-01
Belle2VR is an interactive virtual-reality visualization of subatomic particle physics, designed by an interdisciplinary team as an educational tool for learning about and exploring subatomic particle collisions. This article describes the tool, discusses visualization design decisions, and outlines our process for collaborative development.
Lithium Gadolinium Borate in Plastic Scintillator as an Antineutrino Detection Material
2010-06-01
advancement of fundamental particle physics, development of the standard model of particle physics and our understanding of many cosmological processes ... MeVee), where the light produced by a 1 MeV electron is 1 MeVee by definition, but a heavy charged particle would have a kinetic energy of several
Teaching Particle Physics in the Open University's Science Foundation Course.
ERIC Educational Resources Information Center
Farmelo, Graham
1992-01-01
Discusses four topics presented in the science foundation course of the Open University that exemplify current developments in particle physics, in particular, and that describe important issues about the nature of science, in general. Topics include the omega minus particle, the diversity of quarks, the heavy lepton, and the discovery of the W…
Future particle-physics projects in the United States
DOE Office of Scientific and Technical Information (OSTI.GOV)
Denisov, D. S., E-mail: denisovd@fnal.gov
2015-07-15
Basic proposals of experiments aimed at precision measurements of Standard Model parameters and at searches for new particles, including dark-matter particles, are described along with future experimental projects considered by American Physical Society at the meeting in the summer of 2013 and intended for implementation within the next ten to twenty years.
Comprehensive model for predicting elemental composition of coal pyrolysis products
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richards, Andrew P.; Shutt, Tim; Fletcher, Thomas H.
Large-scale coal combustion simulations depend highly on the accuracy and utility of the physical submodels used to describe the various physical behaviors of the system. Coal combustion simulations depend on the particle physics to predict product compositions, temperatures, energy outputs, and other useful information. The focus of this paper is to improve the accuracy of devolatilization submodels, to be used in conjunction with other particle physics models. Many large simulations today rely on inaccurate assumptions about particle compositions, including that the volatiles that are released during pyrolysis are of the same elemental composition as the char particle. Another common assumption is that the char particle can be approximated by pure carbon. These assumptions will lead to inaccuracies in the overall simulation. There are many factors that influence pyrolysis product composition, including parent coal composition, pyrolysis conditions (including particle temperature history and heating rate), and others. All of these factors are incorporated into the correlations to predict the elemental composition of the major pyrolysis products, including coal tar, char, and light gases.
NASA Astrophysics Data System (ADS)
Zhang, Jun; Li, Ri Yi
2018-06-01
Building energy simulation is an important supporting tool for green building design and building energy consumption assessment. At present, building energy simulation software cannot meet the needs of energy consumption analysis and cabinet-level micro-environment control design for prefabricated buildings. A thermal physical model of prefabricated buildings is proposed in this paper. Based on this physical model, the energy consumption calculation software for prefabricated cabin buildings (PCES) is developed. With PCES, we can perform building parameter setting, energy consumption simulation, and analysis of the building thermal process and energy consumption.
Learning about a Level Physics Students' Understandings of Particle Physics Using Concept Mapping
ERIC Educational Resources Information Center
Gourlay, H.
2017-01-01
This paper describes a small-scale piece of research using concept mapping to elicit A level students' understandings of particle physics. Fifty-nine year 12 (16- and 17-year-old) students from two London schools participated. The exercise took place during school physics lessons. Students were instructed how to make a concept map and were…
Particle transport and deposition: basic physics of particle kinetics
Tsuda, Akira; Henry, Frank S.; Butler, James P.
2015-01-01
The human body interacts with the environment in many different ways. The lungs interact with the external environment through breathing. The enormously large surface area of the lung with its extremely thin air-blood barrier is exposed to particles suspended in the inhaled air. Whereas the particle-lung interaction may cause deleterious effects on health if the inhaled pollutant aerosols are toxic, this interaction can be beneficial for disease treatment if the inhaled particles are therapeutic aerosolized drug. In either case, an accurate estimation of dose and sites of deposition in the respiratory tract is fundamental to understanding subsequent biological response, and the basic physics of particle motion and engineering knowledge needed to understand these subjects is the topic of this chapter. A large portion of this chapter deals with three fundamental areas necessary to the understanding of particle transport and deposition in the respiratory tract. These are: 1) the physical characteristics of particles, 2) particle behavior in gas flow, and 3) gas flow patterns in the respiratory tract. Other areas, such as particle transport in the developing lung and in the diseased lung are also considered. The chapter concludes with a summary and a brief discussion of areas of future research. PMID:24265235
Design study of Software-Implemented Fault-Tolerance (SIFT) computer
NASA Technical Reports Server (NTRS)
Wensley, J. H.; Goldberg, J.; Green, M. W.; Kutz, W. H.; Levitt, K. N.; Mills, M. E.; Shostak, R. E.; Whiting-Okeefe, P. M.; Zeidler, H. M.
1982-01-01
Software-implemented fault tolerant (SIFT) computer design for commercial aviation is reported. A SIFT design concept is addressed. Alternate strategies for physical implementation are considered. Hardware and software design correctness is addressed. System modeling and effectiveness evaluation are considered from a fault-tolerant point of view.
NASA Technical Reports Server (NTRS)
Hughes, David; Dazzo, Tony
2007-01-01
This viewgraph presentation reviews the use of particle analysis to assist in preparing for the 4th Hubble Space Telescope (HST) Servicing mission. During this mission the Space Telescope Imaging Spectrograph (STIS) will be repaired. The particle analysis consisted of finite element mesh creation; black-body viewfactors generated using I-DEAS TMG Thermal Analysis; grey-body viewfactors calculated using the Markov method; particle distribution modeled using an iterative (and time-consuming) Monte Carlo process in the in-house software MASTRAM; differential analysis performed in Excel; and visualization provided by Tecplot and I-DEAS. Several tests were performed and are reviewed: Conformal Coat Particle Study, Card Extraction Study, Cover Fastener Removal Particle Generation Study, and E-Graf Vibration Particulate Study. The lessons learned during this analysis are also reviewed.
Automated Proton Track Identification in MicroBooNE Using Gradient Boosted Decision Trees
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woodruff, Katherine
MicroBooNE is a liquid argon time projection chamber (LArTPC) neutrino experiment that is currently running in the Booster Neutrino Beam at Fermilab. LArTPC technology allows for high-resolution, three-dimensional representations of neutrino interactions. A wide variety of software tools for automated reconstruction and selection of particle tracks in LArTPCs are actively being developed. Short, isolated proton tracks, the signal for low-momentum-transfer neutral current (NC) elastic events, are easily hidden in a large cosmic background. Detecting these low-energy tracks will allow us to probe interesting regions of the proton's spin structure. An effective method for selecting NC elastic events is to combine a highly efficient track reconstruction algorithm to find all candidate tracks with highly accurate particle identification using a machine learning algorithm. We present our work on particle track classification using gradient tree boosting software (XGBoost) and the performance on simulated neutrino data.
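A minimal sketch of the classification step using the XGBoost Python API. The synthetic features below (track length, mean dE/dx, hit count) are illustrative assumptions, not MicroBooNE's actual feature set or selection.

import numpy as np
import xgboost as xgb
from sklearn.model_selection import train_test_split

# Synthetic stand-in for reconstructed-track features; real features would
# come from the LArTPC reconstruction chain.
rng = np.random.default_rng(0)
n = 5000
protons = np.column_stack([rng.normal(12, 3, n), rng.normal(8, 2, n), rng.poisson(40, n)])
others = np.column_stack([rng.normal(30, 8, n), rng.normal(3, 1, n), rng.poisson(90, n)])
X = np.vstack([protons, others])
y = np.concatenate([np.ones(n), np.zeros(n)])  # 1 = proton, 0 = other particle

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = xgb.XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1)
clf.fit(X_train, y_train)
print("proton-vs-other accuracy:", (clf.predict(X_test) == y_test).mean())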
Simulation of concentration distribution of urban particles under wind
NASA Astrophysics Data System (ADS)
Chen, Yanghou; Yang, Hangsheng
2018-02-01
The concentration of particulate matter in the air is often too high, which seriously affects people's health. The concentration of particles in densely populated towns is also high. Understanding the distribution of particles in the air helps in removing them passively. The concentration distribution of particles in urban streets is simulated using the FLUENT software. The simulation analysis is based on the Discrete Phase Model (DPM) in FLUENT. Simulation results show that the distribution of the particles is shaped by the layout of the buildings. The windward area of a building and the leeward sides of high-rise buildings are identified as areas with high particle concentrations. Understanding the particle concentration in different areas also helps people avoid high-concentration areas and reduce the concentration of particles there.
PyPWA: A partial-wave/amplitude analysis software framework
NASA Astrophysics Data System (ADS)
Salgado, Carlos
2016-05-01
The PyPWA project aims to develop a software framework for Partial Wave and Amplitude Analysis of data, providing the user with software tools to identify resonances from multi-particle final states in photoproduction. Most of the code is written in Python. The software is divided into two main branches: one general shell where amplitude parameters (or those of any parametric model) are estimated from the data. This branch also includes software to produce simulated data-sets using the fitted amplitudes. A second branch contains a specific realization of the isobar model (with room to include Deck-type and other isobar model extensions) to perform PWA with an interface into the computer resources at Jefferson Lab. We are currently implementing parallelism and vectorization using Intel's Xeon Phi family of coprocessors.
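A hedged sketch of the kind of fit such a framework automates, an unbinned negative-log-likelihood estimation of parameters of a parametric intensity model from event-level data; the toy Breit-Wigner model and synthetic events below are illustrative assumptions and do not use PyPWA's API.

import numpy as np
from scipy.optimize import minimize

# Synthetic "events": invariant masses drawn from a Breit-Wigner (Cauchy) peak
# at 1.30 GeV with width 0.10 GeV, restricted to a 1.0-1.6 GeV window.
rng = np.random.default_rng(1)
m = 1.30 + 0.05 * rng.standard_cauchy(20000)
m = m[(m > 1.0) & (m < 1.6)]

def intensity(x, m0, gamma):
    return 1.0 / ((x - m0) ** 2 + (gamma / 2.0) ** 2)

def nll(params):
    m0, gamma = params
    grid = np.linspace(1.0, 1.6, 2000)
    norm = np.trapz(intensity(grid, m0, gamma), grid)  # normalise the PDF over the window
    return -np.sum(np.log(intensity(m, m0, gamma) / norm))

fit = minimize(nll, x0=[1.25, 0.05], method="Nelder-Mead")
print("fitted mass and width:", fit.x)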
Scientific program and abstracts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerich, C.
1983-01-01
The Fifth International Conference on High-Power Particle Beams is organized jointly by the Lawrence Livermore National Laboratory and Physics International Company. As in the previous conferences in this series, the program includes the following topics: high-power, electron- and ion-beam acceleration and transport; diode physics; high-power particle beam interaction with plasmas and dense targets; particle beam fusion (inertial confinement); collective ion acceleration; particle beam heating of magnetically confined plasmas; and generation of microwave/free-electron lasers.
NASA Astrophysics Data System (ADS)
Karim, S.; Saepuzaman, D.; Sriyansyah, S. P.
2016-08-01
This study was initiated by the low achievement of prospective teachers in understanding concepts in an introductory physics course. A problem was identified: students could not develop the thinking skills required for building physics concepts. Therefore, this study reconstructs a learning process that emphasizes physics concept building. The outcome is a set of physics lesson plans for the concepts of particle systems and linear momentum conservation. A descriptive analysis method is used to investigate the process of learning reconstruction carried out by students. In this process, the students' conceptual understanding is evaluated using essay tests on the concepts of particle systems and linear momentum conservation. The result shows that the learning reconstruction has successfully supported the students' understanding of physics concepts.
Statistical Physics Experiments Using Dusty Plasmas
NASA Astrophysics Data System (ADS)
Goree, John
2016-10-01
Compared to other areas of physics research, Statistical Physics is heavily dominated by theory, with comparatively little experiment. One reason for the lack of experiments is the impracticality of tracking of individual atoms and molecules within a substance. Thus, there is a need for a different kind of experimental system, one where individual particles not only move stochastically as they collide with one another, but also are large enough to allow tracking. A dusty plasma can meet this need. A dusty plasma is a partially ionized gas containing small particles of solid matter. These micron-size particles gain thousands of electronic charges by collecting more electrons than ions. Their motions are dominated by Coulomb collisions with neighboring particles. In this so-called strongly coupled plasma, the dust particles self-organize in much the same way as atoms in a liquid or solid. Unlike atoms, however, these particles are large and slow, so that they can be tracked easily by video microscopy. Advantages of dusty plasma for experimental statistical physics research include particle tracking, lack of frictional contact with solid surfaces, and avoidance of overdamped motion. Moreover, the motion of a collection of dust particles can mimic an equilibrium system with a Maxwellian velocity distribution, even though the dust particles themselves are not truly in thermal equilibrium. Nonequilibrium statistical physics can be studied by applying gradients, for example by imposing a shear flow. In this talk I will review some of our recent experiments with shear flow. First, we performed the first experimental test to verify the Fluctuation Theorem for a shear flow, showing that brief violations of the Second Law of Thermodynamics occur with the predicted probabilities, for a small system. Second, we discovered a skewness of a shear-stress distribution in a shear flow. This skewness is a phenomenon that likely has wide applicability in nonequilibrium steady states. Third, we performed the first experimental test of a statistical physics theory (the Green-Kubo model) that is widely used by physical chemists to compute viscosity coefficients, and we found that it fails. Work supported by the U.S. Department of Energy, NSF, and NASA.
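For context on the Green-Kubo estimate mentioned above: the shear viscosity follows from the time integral of the shear-stress autocorrelation function, eta = (V / k_B T) * integral_0^inf <P_xy(0) P_xy(t)> dt. The sketch below evaluates that integral on a synthetic stress time series in reduced units; the signal, constants, and sampling interval are illustrative assumptions, not data from the dusty plasma experiments.

import numpy as np

# Synthetic AR(1) stand-in for the off-diagonal (shear) stress time series.
rng = np.random.default_rng(0)
pxy = np.empty(20000)
pxy[0] = 0.0
for i in range(1, pxy.size):
    pxy[i] = 0.95 * pxy[i - 1] + rng.normal(scale=0.1)

dt = 1.0e-3          # sampling interval in simulation time units (assumed)
V, kB_T = 1.0, 1.0   # volume and thermal energy in reduced units (assumed)

n = pxy.size
acf = np.array([np.mean(pxy[: n - lag] * pxy[lag:]) for lag in range(n // 2)])
eta = (V / kB_T) * np.trapz(acf, dx=dt)   # Green-Kubo integral
print(f"Green-Kubo shear viscosity estimate: {eta:.4g}")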
PEOPLE IN PHYSICS: Interview with Peter Higgs
NASA Astrophysics Data System (ADS)
Fancey, Conducted by Norman
1998-01-01
Peter Higgs, FRSE, FRS held until recently a personal chair in theoretical physics at the University of Edinburgh and is now an emeritus professor. Peter is well known for predicting the existence of a new particle, the Higgs boson - as yet unconfirmed. He has been awarded a number of prizes in recognition of his work, most recently the Paul Dirac Medal and Prize for outstanding contributions to theoretical physics from the Institute of Physics and the 1997 High Energy and Particle Physics Prize by the European Physical Society.
NASA Astrophysics Data System (ADS)
Da Silva, Antonio; Sánchez Prieto, Sebastián; Rodriguez Polo, Oscar; Parra Espada, Pablo
Computer memories are not supposed to forget, but they do. Because of the proximity of the Sun, from the Solar Orbiter boot software perspective, it is mandatory to look out for permanent memory errors resulting from single-event latch-up (SEL) failures in application binaries stored in EEPROM and its SDRAM deployment areas. In this situation, the last line of defense established by FDIR mechanisms is the capability of the boot software to provide an accurate report of the memory damage and to perform an application software update that avoids the damaged locations by flashing the EEPROM with a new binary. This paper describes the OTA EEPROM firmware update procedure verification of the boot software that will run in the Instrument Control Unit (ICU) of the Energetic Particle Detector (EPD) on-board Solar Orbiter. Since the maximum number of rewrites on real EEPROM is limited and permanent memory faults cannot be conveniently emulated in real hardware, the verification has been accomplished by the use of a LEON2 Virtual Platform (Leon2ViP) with fault injection capabilities and real SpaceWire interfaces developed by the Space Research Group (SRG) of the University of Alcalá. This way it is possible to run the exact same target binary software as if it were run on the real ICU platform. Furthermore, the use of this virtual hardware-in-the-loop (VHIL) approach makes it possible to communicate with Electrical Ground Support Equipment (EGSE) through real SpaceWire interfaces in an agile, controlled and deterministic environment.
al Mahbub, Asheque; Haque, Asadul
2016-01-01
This paper presents the results of X-ray CT imaging of the microstructure of sand particles subjected to high pressure one-dimensional compression leading to particle crushing. A high resolution X-ray CT machine capable of in situ imaging was employed to capture images of the whole volume of a sand sample subjected to compressive stresses up to 79.3 MPa. Images of the whole sample obtained at different load stages were analysed using a commercial image processing software (Avizo) to reveal various microstructural properties, such as pore and particle volume distributions, spatial distribution of void ratios, relative breakage, and anisotropy of particles. PMID:28774011
NASA Tech Briefs, December 1997. Volume 21, No. 12
NASA Technical Reports Server (NTRS)
1997-01-01
Topics: Design and Analysis Software; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Software; Mechanics; Manufacturing/Fabrication; Mathematics and Information Sciences; Books and Reports.
ERIC Educational Resources Information Center
Wulfson, Stephen, Ed.
1988-01-01
Reviews seven instructional software packages covering a variety of topics. Includes: "Science Square-Off"; "The Desert"; "Science Courseware: Physical Science"; "Odell Lake"; "Safety First"; "An Experience in Artificial Intelligence"; and "Master Mapper." (TW)
NASA Astrophysics Data System (ADS)
Justus, Christopher
2005-04-01
In this study, we simulated top-antitop (tt-bar) quark events at the Compact Muon Solenoid (CMS), an experiment presently being constructed at the Large Hadron Collider in Geneva, Switzerland. The tt-bar process is an important background for Higgs events. We used a chain of software to simulate and reconstruct processes that will occur inside the detector. CMKIN was used to generate and store Monte Carlo Events. OSCAR, a GEANT4 based CMS detector simulator, was used to simulate the CMS detector and how particles would interact with the detector. Next, we used ORCA to simulate the response of the readout electronics at CMS. Last, we used the Jet/MET Root maker to create root files of jets and missing energy. We are now using this software analysis chain to complete a systematic study of initial state radiation at hadron colliders. This study is essential because tt-bar is the main background for the Higgs boson and these processes are extremely sensitive to initial state radiation. Results of our initial state radiation study will be presented. We started this study at the new LHC Physics Center (LPC) located at Fermi National Accelerator Laboratory, and we are now completing the study at the University of Rochester.
Software-Realized Scaffolding to Facilitate Programming for Science Learning.
ERIC Educational Resources Information Center
Guzdial, Mark
1994-01-01
Discussion of the use of programming as a learning activity focuses on software-realized scaffolding. Emile, software that facilitates programming for modeling and simulation in physics, is described, and results of an evaluation of the use of Emile with high school students are reported. (Contains 95 references.) (LRW)
An Integrated Higgs Force Theory
NASA Astrophysics Data System (ADS)
Colella, Antonio
2016-03-01
An Integrated Higgs force theory (IHFT) was based on 2 key requirement amplifications: a matter particle/Higgs force was one and inseparable; a matter particle/Higgs force bidirectionally condensed/evaporated from/to super force. These were the basis of 5 theories: particle creation, baryogenesis, superpartner/quark decays, spontaneous symmetry breaking, and stellar black holes. Our universe's 129 matter/force particles contained 64 supersymmetric Higgs particles; 9 transient matter particles/Higgs forces decayed to 8 permanent matter particles/Higgs forces; mass was given to a matter particle by its Higgs force and gravitons; and the sum of the 8 Higgs force energies of the 8 permanent matter particles was dark energy. An IHFT's essence is the intimate physical relationships between 8 theories. These theories are independent because physicists in one theory worked independently of physicists in the other seven. An IHFT's premise is that, without sacrificing their integrities, 8 independent existing theories are replaced by 8 interrelated amplified theories. Requirement amplifications provide interfaces between the 8 theories. Intimate relationships between the 8 theories, including the above 5 and string, Higgs forces, and Super Universe, are described. The sorting category selected was F. PARTICLES AND FIELDS (e.g., F1 Higgs Physics, F10 Alternative Beyond the Standard Model Physics, F11 Dark Sector Theories and Searches, and F12 Particle Cosmology).
Validation of Tendril TrueHome Using Software-to-Software Comparison
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maguire, Jeffrey B; Horowitz, Scott G; Moore, Nathan
This study performed comparative evaluation of EnergyPlus version 8.6 and Tendril TrueHome, two physics-based home energy simulation models, to identify differences in energy consumption predictions between the two programs and resolve discrepancies between them. EnergyPlus is considered a benchmark, best-in-class software tool for building energy simulation. This exercise sought to improve both software tools through additional evaluation/scrutiny.
NASA Astrophysics Data System (ADS)
Kestens, Vikram; Bozatzidis, Vassili; De Temmerman, Pieter-Jan; Ramaye, Yannic; Roebben, Gert
2017-08-01
Particle tracking analysis (PTA) is an emerging technique suitable for size analysis of particles with external dimensions in the nano- and sub-micrometre scale range. Only limited attempts have so far been made to investigate and quantify the performance of the PTA method for particle size analysis. This article presents the results of a validation study during which selected colloidal silica and polystyrene latex reference materials with particle sizes in the range of 20 nm to 200 nm were analysed with NS500 and LM10-HSBF NanoSight instruments and video analysis software NTA 2.3 and NTA 3.0. Key performance characteristics such as working range, linearity, limit of detection, limit of quantification, sensitivity, robustness, precision and trueness were examined according to recommendations proposed by EURACHEM. A model for measurement uncertainty estimation following the principles described in ISO/IEC Guide 98-3 was used for quantifying random and systematic variations. For the nominal 50 nm and 100 nm polystyrene and the nominal 80 nm silica reference materials, the relative expanded measurement uncertainties for the three measurands of interest, being the mode, median and arithmetic mean of the number-weighted particle size distribution, varied from about 10% to 12%. For the nominal 50 nm polystyrene material, the relative expanded uncertainty of the arithmetic mean of the particle size distribution increased up to 18%, which was due to the presence of agglomerates. Data analysis was performed with software NTA 2.3 and NTA 3.0. The latter proved to be superior in terms of sensitivity and resolution.
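A hedged sketch of the GUM-style (ISO/IEC Guide 98-3) combination implied above: random and systematic standard uncertainties are combined in quadrature and multiplied by a coverage factor k = 2. The numerical inputs are illustrative assumptions, not values taken from the study.

import math

u_random = 0.040       # relative standard uncertainty from repeatability (assumed)
u_systematic = 0.035   # relative standard uncertainty from systematic effects (assumed)
k = 2.0                # coverage factor for roughly 95 % confidence

u_combined = math.sqrt(u_random ** 2 + u_systematic ** 2)
U_expanded = k * u_combined
print(f"relative expanded uncertainty: {100 * U_expanded:.1f} %")  # ~10.6 % with these inputs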
Model-independent particle accelerator tuning
Scheinker, Alexander; Pang, Xiaoying; Rybarcyk, Larry
2013-10-21
We present a new model-independent dynamic feedback technique, rotation rate tuning, for automatically and simultaneously tuning coupled components of uncertain, complex systems. The main advantages of the method are: 1) It has the ability to handle unknown, time-varying systems, 2) It gives known bounds on parameter update rates, 3) We give an analytic proof of its convergence and its stability, and 4) It has a simple digital implementation through a control system such as the Experimental Physics and Industrial Control System (EPICS). Because this technique is model independent it may be useful as a real-time, in-hardware, feedback-based optimization scheme for uncertain and time-varying systems. In particular, it is robust enough to handle uncertainty due to coupling, thermal cycling, misalignments, and manufacturing imperfections. As a result, it may be used as a fine-tuning supplement for existing accelerator tuning/control schemes. We present multi-particle simulation results demonstrating the scheme's ability to simultaneously adaptively adjust the set points of twenty two quadrupole magnets and two RF buncher cavities in the Los Alamos Neutron Science Center Linear Accelerator's transport region, while the beam properties and RF phase shift are continuously varying. The tuning is based only on beam current readings, without knowledge of particle dynamics. We also present an outline of how to implement this general scheme in software for optimization, and in hardware for feedback-based control/tuning, for a wide range of systems.
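To make the idea of model-independent, measurement-only tuning concrete, here is a generic dithering (extremum-seeking style) loop: each set point is perturbed at its own frequency and nudged against the demodulated cost. This is a sketch under assumed settings, not the authors' rotation rate tuning law, and the cost function is a placeholder for a beam-current-derived figure of merit.

import numpy as np

def read_cost(settings):
    """Stand-in for a scalar figure of merit read back from the machine (assumed)."""
    target = np.linspace(0.0, 1.0, settings.size)
    return float(np.sum((settings - target) ** 2))

n_params = 24                                            # e.g. 22 quadrupoles + 2 buncher cavities
x = np.zeros(n_params)                                   # current set points
omega = 2 * np.pi * (1 + np.arange(n_params)) / 100.0    # distinct dither frequencies
amp, gain, dt = 0.1, 0.5, 1.0
baseline = read_cost(x)                                  # running estimate of the un-dithered cost

for step in range(5000):
    dither = amp * np.cos(omega * step * dt)
    cost = read_cost(x + dither)
    baseline += 0.01 * (cost - baseline)                 # crude high-pass of the measured cost
    x -= gain * dt * dither * (cost - baseline)          # demodulation: step against the gradient

print("initial cost:", read_cost(np.zeros(n_params)), "final cost:", read_cost(x))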
Development of a numerical model for the electric current in burner-stabilised methane-air flames
NASA Astrophysics Data System (ADS)
Speelman, N.; de Goey, L. P. H.; van Oijen, J. A.
2015-03-01
This study presents a new model to simulate the electric behaviour of one-dimensional ionised flames and to predict the electric currents in these flames. The model utilises Poisson's equation to compute the electric potential. A multi-component diffusion model, including the influence of an electric field, is used to model the diffusion of neutral and charged species. The model is incorporated into the existing CHEM1D flame simulation software. A comparison between the computed electric currents and experimental values from the literature shows good qualitative agreement for the voltage-current characteristic. Physical phenomena, such as saturation and the diodic effect, are captured by the model. The dependence of the saturation current on the equivalence ratio is also captured well for equivalence ratios between 0.6 and 1.2. Simulations show a clear relation between the saturation current and the total number of charged particles created. The model shows that the potential at which the electric field saturates is strongly dependent on the recombination rate and the diffusivity of the charged particles. The onset of saturation occurs because most created charged particles are withdrawn from the flame and because the electric field effects start dominating over mass based diffusion. It is shown that this knowledge can be used to optimise ionisation chemistry mechanisms. It is shown numerically that the so-called diodic effect is caused primarily by the distance the heavier cations have to travel to the cathode.
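For reference, a hedged restatement of the governing electrostatic relation in standard textbook notation (the symbols and sign conventions below are generic assumptions, not reproduced from the paper): the electric potential \phi in the ionised flame obeys Poisson's equation,

\nabla^2 \phi = -\frac{e\,(n_{+} - n_{-})}{\varepsilon_{0}}, \qquad \mathbf{E} = -\nabla \phi,

where n_{+} and n_{-} are the number densities of positive and negative charge carriers, e is the elementary charge, and \varepsilon_{0} is the vacuum permittivity; the drift of charged species in \mathbf{E} then enters the multi-component diffusion model as an additional flux term alongside ordinary mass diffusion.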
Meta-Analysis inside and outside Particle Physics: Two Traditions That Should Converge?
ERIC Educational Resources Information Center
Baker, Rose D.; Jackson, Dan
2013-01-01
The use of meta-analysis in medicine and epidemiology really took off in the 1970s. However, in high-energy physics, the Particle Data Group has been carrying out meta-analyses of measurements of particle masses and other properties since 1957. Curiously, there has been virtually no interaction between those working inside and outside particle…
Nuclear physics in particle therapy: a review
NASA Astrophysics Data System (ADS)
Durante, Marco; Paganetti, Harald
2016-09-01
Charged particle therapy has been largely driven and influenced by nuclear physics. The increase in energy deposition density along the ion path in the body allows reducing the dose to normal tissues during radiotherapy compared to photons. Clinical results of particle therapy support the physical rationale for this treatment, but the method remains controversial because of the high cost and of the lack of comparative clinical trials proving the benefit compared to x-rays. Research in applied nuclear physics, including nuclear interactions, dosimetry, image guidance, range verification, novel accelerators and beam delivery technologies, can significantly improve the clinical outcome in particle therapy. Measurements of fragmentation cross-sections, including those for the production of positron-emitting fragments, and attenuation curves are needed for tuning Monte Carlo codes, whose use in clinical environments is rapidly increasing thanks to fast calculation methods. Existing cross sections and codes are indeed not very accurate in the energy and target regions of interest for particle therapy. These measurements are especially urgent for new ions to be used in therapy, such as helium. Furthermore, nuclear physics hardware developments are frequently finding applications in ion therapy due to similar requirements concerning sensors and real-time data processing. In this review we will briefly describe the physics bases, and concentrate on the open issues.
NASA Astrophysics Data System (ADS)
Henriksen, Ellen Karoline; Angell, Carl; Vistnes, Arnt Inge; Bungum, Berit
2018-03-01
Quantum physics describes light as having both particle and wave properties; however, there is no consensus about how to interpret this duality on an ontological level. This article explores how pre-university physics students, while working with learning material focusing on historical-philosophical aspects of quantum physics, interpreted the wave-particle duality of light and which views they expressed on the nature of physics. A thematic analysis was performed on 133 written responses about the nature of light, given in the beginning of the teaching sequence, and 55 audio-recorded small-group discussions addressing the wave-particle duality, given later in the sequence. Most students initially expressed a wave and particle view of light, but some of these gave an "uncritical duality description", accepting without question the two ontologically different descriptions of light. In the small-group discussions, students expressed more nuanced views. Many tried to reconcile the two descriptions using semi-classical reasoning; others entered into philosophical discussions about the status of the current scientific description of light and expected science to come up with a better model. Some found the wave description of light particularly challenging and lacked a conception of "what is waving". Many seemed to implicitly take a realist view on the description of physical phenomena, contrary with the Copenhagen interpretation which is prevalent in textbooks. Results are discussed in light of different interpretations of quantum physics, and we conclude by arguing for a historical-philosophical perspective as an entry point for upper secondary physics students to explore the development and interpretation of quantum physical concepts.
Online Tracking Algorithms on GPUs for the P̅ANDA Experiment at FAIR
NASA Astrophysics Data System (ADS)
Bianchi, L.; Herten, A.; Ritman, J.; Stockmanns, T.; Adinetz,
2015-12-01
P̅ANDA is a future hadron and nuclear physics experiment at the FAIR facility under construction in Darmstadt, Germany. In contrast to the majority of current experiments, PANDA's strategy for data acquisition is based on event reconstruction from free-streaming data, performed in real time entirely by software algorithms using global detector information. This paper reports the status of the development of algorithms for the reconstruction of charged particle tracks, optimized for online data processing applications using General-Purpose Graphics Processing Units (GPUs). Two algorithms for trackfinding, the Triplet Finder and the Circle Hough, are described, and details of their GPU implementations are highlighted. Average track reconstruction times of less than 100 ns are obtained running the Triplet Finder on state-of-the-art GPU cards. In addition, a proof-of-concept system for the dispatch of data to tracking algorithms using Message Queues is presented.
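To illustrate the geometric idea behind a circle Hough transform for tracks that originate on the beam line (not the PANDA implementation, and with illustrative binning and synthetic hits): a circle through the origin with centre (R cos(phi), R sin(phi)) satisfies x^2 + y^2 = 2 R (x cos(phi) + y sin(phi)), so each hit votes for one R at every candidate phi, and real tracks show up as peaks in the accumulator.

import numpy as np

# Synthetic hits on a circle of radius 2 that passes through the origin.
R_true, phi_true = 2.0, 0.7
theta = np.linspace(0.1, 2.5, 40)
cx, cy = R_true * np.cos(phi_true), R_true * np.sin(phi_true)
hits = np.column_stack([cx + R_true * np.cos(theta), cy + R_true * np.sin(theta)])
hits += 0.02 * np.random.default_rng(1).normal(size=hits.shape)   # detector smearing

phis = np.linspace(0.0, np.pi, 180, endpoint=False)
r_edges = np.linspace(0.1, 5.0, 101)
acc = np.zeros((len(phis), len(r_edges) - 1), dtype=int)           # Hough accumulator

for x, y in hits:
    denom = 2.0 * (x * np.cos(phis) + y * np.sin(phis))
    with np.errstate(divide="ignore", invalid="ignore"):
        r = np.where(np.abs(denom) > 1e-9, (x * x + y * y) / denom, np.nan)
    idx = np.digitize(r, r_edges) - 1
    valid = (idx >= 0) & (idx < acc.shape[1])
    acc[np.arange(len(phis))[valid], idx[valid]] += 1               # one vote per (phi, R) bin

best_phi, best_r = np.unravel_index(acc.argmax(), acc.shape)
print("most-voted circle:", phis[best_phi], r_edges[best_r])        # should be near (0.7, 2.0)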
2D modeling of direct laser metal deposition process using a finite particle method
NASA Astrophysics Data System (ADS)
Anedaf, T.; Abbès, B.; Abbès, F.; Li, Y. M.
2018-05-01
Direct laser metal deposition is one of the material additive manufacturing processes used to produce complex metallic parts. A thorough understanding of the underlying physical phenomena is required to obtain high-quality parts. In this work, a mathematical model is presented to simulate the coaxial laser direct deposition process, taking into account mass addition, heat transfer, and fluid flow with free surface and melting. The fluid flow in the melt pool together with mass and energy balances are solved using the Computational Fluid Dynamics (CFD) software NOGRID-points, based on the meshless Finite Pointset Method (FPM). The basis of the computations is a point cloud, which represents the continuum fluid domain. Each finite point carries all fluid information (density, velocity, pressure and temperature). The dynamic shape of the molten zone is explicitly described by the point cloud. The proposed model is used to simulate a single layer cladding.
Design and analysis of magneto rheological fluid brake for an all terrain vehicle
NASA Astrophysics Data System (ADS)
George, Luckachan K.; Tamilarasan, N.; Thirumalini, S.
2018-02-01
This work presents an optimised design for a magneto rheological fluid brake for all terrain vehicles. The actuator consists of a disk which is immersed in the magneto rheological fluid surrounded by an electromagnet. The braking torque is controlled by varying the DC current applied to the electromagnet. In the presence of a magnetic field, the magneto rheological fluid particles align in a chain-like structure, thus increasing the viscosity. The shear stress generated causes friction on the surfaces of the rotating disk. Electromagnetic analysis of the proposed system is carried out using the finite element based COMSOL Multiphysics software, and the amount of magnetic field generated is calculated with the help of COMSOL. The geometry is optimised and the performance of the system in terms of braking torque is evaluated. The proposed design shows better braking-torque performance than existing designs reported in the literature.
NASA Astrophysics Data System (ADS)
Kampert, Karl-Heinz; Kulbartz, Jörg; Maccione, Luca; Nierstenhoefer, Nils; Schiffer, Peter; Sigl, Günter; van Vliet, Arjen René
2013-02-01
Version 2.0 of CRPropa [CRPropa is published under the 3rd version of the GNU General Public License (GPLv3). It is available, together with a detailed documentation of the code, at https://crpropa.desy.de.] is public software to model the extra-galactic propagation of ultra-high energy nuclei of atomic number Z⩽26 through structured magnetic fields and ambient photon backgrounds taking into account all relevant particle interactions. CRPropa covers the energy range 7 × 10^16
CERN launches high-school internship programme
NASA Astrophysics Data System (ADS)
Johnston, Hamish
2017-07-01
The CERN particle-physics lab has hosted 22 high-school students from Hungary in a pilot programme designed to show teenagers how science, technology, engineering and mathematics is used at the particle-physics lab.
Quarked! - Adventures in Particle Physics Education
NASA Astrophysics Data System (ADS)
MacDonald, Teresa; Bean, Alice
2009-01-01
Particle physics is a subject that can send shivers down the spines of students and educators alike-with visions of long mathematical equations and inscrutable ideas. This perception, along with a full curriculum, often leaves this topic the road less traveled until the latter years of school. Particle physics, including quarks, is typically not introduced until high school or university.1,2 Many of these concepts can be made accessible to younger students when presented in a fun and engaging way. Informal science institutions are in an ideal position to communicate new and challenging science topics in engaging and innovative ways and offer a variety of educational enrichment experiences for students that support and enhance science learning.3 Quarked!™ Adventures in the Subatomic Universe, a National Science Foundation EPSCoR-funded particle physics education program, provides classroom programs and online educational resources.
Detectors for Particle Radiation
NASA Astrophysics Data System (ADS)
Kleinknecht, Konrad
1999-01-01
This textbook provides a clear, concise and comprehensive review of the physical principles behind the devices used to detect charged particles and gamma rays, and the construction and performance of these many different types of detectors. Detectors for high-energy particles and radiation are used in many areas of science, especially particle physics and nuclear physics experiments, nuclear medicine, cosmic ray measurements, space sciences and geological exploration. This second edition includes all the latest developments in detector technology, including several new chapters covering micro-strip gas chambers, silicon strip detectors and CCDs, scintillating fibers, shower detectors using noble liquid gases, and compensating calorimeters for hadronic showers. This well-illustrated textbook contains examples from the many areas in science in which these detectors are used. It provides both a coursebook for students in physics, and a useful introduction for researchers in other fields.
Nuclear and particle physics, astrophysics and cosmology (NPAC) capability review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Redondo, Antonio
2010-01-01
The present document represents a summary self-assessment of the status of the Nuclear and Particle Physics, Astrophysics and Cosmology (NPAC) capability across Los Alamos National Laboratory (LANL). For the purpose of this review, we have divided the capability into four theme areas: Nuclear Physics, Particle Physics, Astrophysics and Cosmology, and Applied Physics. For each theme area we have given a general but brief description of the activities under the area, a list of the Laboratory divisions involved in the work, connections to the goals and mission of the Laboratory, a brief description of progress over the last three years, our opinion of the overall status of the theme area, and challenges and issues.
The Virtual Environment for Reactor Applications (VERA): Design and architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, John A., E-mail: turnerja@ornl.gov; Clarno, Kevin; Sieger, Matt
VERA, the Virtual Environment for Reactor Applications, is the system of physics capabilities being developed and deployed by the Consortium for Advanced Simulation of Light Water Reactors (CASL). CASL was established for the modeling and simulation of commercial nuclear reactors. VERA consists of integrating and interfacing software together with a suite of physics components adapted and/or refactored to simulate relevant physical phenomena in a coupled manner. VERA also includes the software development environment and computational infrastructure needed for these components to be effectively used. We describe the architecture of VERA from both software and numerical perspectives, along with the goals and constraints that drove major design decisions, and their implications. We explain why VERA is an environment rather than a framework or toolkit, why these distinctions are relevant (particularly for coupled physics applications), and provide an overview of results that demonstrate the use of VERA tools for a variety of challenging applications within the nuclear industry.
A Theoretical Analysis: Physical Unclonable Functions and The Software Protection Problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nithyanand, Rishab; Solis, John H.
2011-09-01
Physical Unclonable Functions (PUFs) or Physical One Way Functions (P-OWFs) are physical systems whose responses to input stimuli (i.e., challenges) are easy to measure (within reasonable error bounds) but hard to clone. This property of unclonability is due to the accepted hardness of replicating the multitude of uncontrollable manufacturing characteristics and makes PUFs useful in solving problems such as device authentication, software protection, licensing, and certified execution. In this paper, we focus on the effectiveness of PUFs for software protection and show that traditional non-computational (black-box) PUFs cannot solve the problem against real world adversaries in offline settings. Our contributions are the following: We provide two real world adversary models (weak and strong variants) and present definitions for security against the adversaries. We continue by proposing schemes secure against the weak adversary and show that no scheme is secure against a strong adversary without the use of trusted hardware. Finally, we present a protection scheme secure against strong adversaries based on trusted hardware.
Particle Physics at the Cosmic, Intensity, and Energy Frontiers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Essig, Rouven
Major efforts at the Intensity, Cosmic, and Energy frontiers of particle physics are rapidly furthering our understanding of the fundamental constituents of Nature and their interactions. The overall objectives of this research project are (1) to interpret and develop the theoretical implications of the data collected at these frontiers and (2) to provide the theoretical motivation, basis, and ideas for new experiments and for new analyses of experimental data. Within the Intensity Frontier, an experimental search for a new force mediated by a GeV-scale gauge boson will be carried out with the $A'$ Experiment (APEX) and the Heavy Photon Search (HPS), both at Jefferson Laboratory. Within the Cosmic Frontier, contributions are planned to the search for dark matter particles with the Fermi Gamma-ray Space Telescope and other instruments. A detailed exploration will also be performed of new direct detection strategies for dark matter particles with sub-GeV masses to facilitate the development of new experiments. In addition, the theoretical implications of existing and future dark matter-related anomalies will be examined. Within the Energy Frontier, the implications of the data from the Large Hadron Collider will be investigated. Novel search strategies will be developed to aid the search for new phenomena not described by the Standard Model of particle physics. By combining insights from all three particle physics frontiers, this research aims to increase our understanding of fundamental particle physics.
NASA Astrophysics Data System (ADS)
2011-01-01
Particle Physics: ATLAS unveils mural at CERN Prize: Corti Trust invites essay entries Astrophysics: CERN holds cosmic-ray conference Researchers in Residence: Lord Winston returns to school Music: ATLAS scientists record physics music Conference: Champagne flows at Reims event Competition: Students triumph at physics olympiad Teaching: Physics proves popular in Japanese schools Forthcoming Events
Simultaneous operation and control of about 100 telescopes for the Cherenkov Telescope Array
NASA Astrophysics Data System (ADS)
Wegner, P.; Colomé, J.; Hoffmann, D.; Houles, J.; Köppel, H.; Lamanna, G.; Le Flour, T.; Lopatin, A.; Lyard, E.; Melkumyan, D.; Oya, I.; Panazol, L.-I.; Punch, M.; Schlenstedt, S.; Schmidt, T.; Stegmann, C.; Schwanke, U.; Walter, R.; Consortium, CTA
2012-12-01
The Cherenkov Telescope Array (CTA) project is an initiative to build the next generation ground-based very high energy (VHE) gamma-ray instrument. Compared to current imaging atmospheric Cherenkov telescope experiments, CTA will extend the energy range and improve the angular resolution while increasing the sensitivity by up to a factor of 10. With about 100 separate telescopes it will be operated as an observatory open to a wide astrophysics and particle physics community, providing a deep insight into the non-thermal high-energy universe. The CTA Array Control system (ACTL) is responsible for several essential control tasks supporting the evaluation and selection of proposals, as well as the preparation, scheduling, and finally the execution of observations with the array. A possible basic distributed software framework for ACTL being considered is the ALMA Common Software (ACS). The ACS framework follows a container-component model and contains a high-level abstraction layer to integrate different types of devices. To consolidate the connection of low-level control hardware, OPC UA (Open Platform Communications Unified Architecture) client functionality is integrated directly into ACS, allowing interaction with other OPC UA-capable hardware. The CTA Data Acquisition System comprises the data readout of all cameras and the transfer of the data to a camera server farm, using standard hardware and software technologies. CTA array control also covers concepts for a possible array trigger system and the corresponding clock distribution. The design of the CTA observation scheduler introduces new algorithmic techniques to achieve the required flexibility.
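To illustrate the kind of low-level device access described above, here is a minimal, hypothetical sketch of reading one value from an OPC UA server using the python-opcua package. It is not ACS/ACTL code; the endpoint URL and node identifier are placeholders invented for illustration.

```python
# Hypothetical sketch only: reading one value from an OPC UA device server with
# the python-opcua package. Not ACS/ACTL code; endpoint and node id are placeholders.
from opcua import Client

client = Client("opc.tcp://localhost:4840/cta/device")  # placeholder endpoint
client.connect()
try:
    node = client.get_node("ns=2;s=Drive.Azimuth")      # placeholder node identifier
    print("current value:", node.get_value())
finally:
    client.disconnect()
```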
Physical characterization of aerosol particles during the Chinese New Year’s firework events
NASA Astrophysics Data System (ADS)
Zhang, Min; Wang, Xuemei; Chen, Jianmin; Cheng, Tiantao; Wang, Tao; Yang, Xin; Gong, Youguo; Geng, Fuhai; Chen, Changhong
2010-12-01
Measurements of particles from 10 nm to 10 μm were taken using a Wide-range Particle Spectrometer during the Chinese New Year (CNY) celebrations in 2009 in Shanghai, China. These celebrations provided an opportunity to study the number concentration and size distribution of particles in a special atmospheric pollution situation caused by firework displays. The firework activities made a clear contribution to the number concentration of small accumulation-mode particles (100-500 nm) and to the PM1 mass concentration, with a maximum total number concentration of 3.8 × 10^4 cm^-3. A clear shift of particles from nucleation and Aitken mode to small accumulation mode was observed at the peak of the CNY firework event, which can be explained by reduced atmospheric lifetimes of smaller particles via the concept of the coagulation sink. A high particle density (2.7 g cm^-3) was identified as being particularly characteristic of the firework aerosols. Recalculated fine-particle (PM1) mass concentrations averaged above 150 μg m^-3 for more than 12 hours, which posed a health risk to susceptible individuals. Integral physical parameters of the firework aerosols were calculated to aid understanding of their physical properties and further model simulation.
Jaikuna, Tanwiwat; Khadsiri, Phatchareewan; Chawapun, Nisa; Saekho, Suwit; Tharavichitkul, Ekkasit
2017-02-01
The aim was to develop an in-house software program able to calculate and generate the biological dose distribution and biological dose-volume histogram by converting physical dose using the linear-quadratic-linear (LQL) model. The Isobio software was developed using MATLAB version 2014b to calculate and generate the biological dose distribution and biological dose-volume histograms. The physical dose from each voxel in the treatment plan was extracted through the Computational Environment for Radiotherapy Research (CERR), and the accuracy was verified by the difference between the dose-volume histogram from CERR and that from the treatment planning system. The equivalent dose in 2 Gy fractions (EQD2) was calculated from the biological effective dose (BED) based on the LQL model. The software calculation and a manual calculation were compared to verify the EQD2, using paired t-test statistical analysis in IBM SPSS Statistics version 22 (64-bit). Two- and three-dimensional biological dose distributions and biological dose-volume histograms were displayed correctly by the Isobio software. Differences in physical dose were found between CERR and the treatment planning system (TPS): in Oncentra, 3.33% in the high-risk clinical target volume (HR-CTV) determined by D90%, 0.56% in the bladder and 1.74% in the rectum determined by D2cc, and less than 1% in Pinnacle. The difference in EQD2 between the software calculation and the manual calculation was not significant (0.00%), with p-values of 0.820, 0.095, and 0.593 for external beam radiation therapy (EBRT) and 0.240, 0.320, and 0.849 for brachytherapy (BT) in the HR-CTV, bladder, and rectum, respectively. The Isobio software is a feasible tool for generating the biological dose distribution and biological dose-volume histogram for treatment plan evaluation in both EBRT and BT.
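As an illustration of the dose conversion described above, a minimal sketch of the standard linear-quadratic EQD2 calculation is given below. The α/β value and fractionation are illustrative assumptions, and the LQL modification used by Isobio (linear behaviour above a transition dose per fraction) is not reproduced here.

```python
# Minimal sketch of EQD2 from the standard linear-quadratic (LQ) model.
# The LQL extension used by Isobio is not included; alpha/beta and the
# example fractionation are illustrative assumptions.

def eqd2_lq(total_dose_gy, dose_per_fraction_gy, alpha_beta_gy):
    """BED = D * (1 + d/(a/b)); EQD2 = BED / (1 + 2/(a/b))."""
    bed = total_dose_gy * (1.0 + dose_per_fraction_gy / alpha_beta_gy)
    return bed / (1.0 + 2.0 / alpha_beta_gy)

# Example: 28 Gy delivered in 4 fractions of 7 Gy to tissue with alpha/beta = 10 Gy
print(eqd2_lq(28.0, 7.0, 10.0))  # ~39.7 Gy
```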
Embracing Open Software Development in Solar Physics
NASA Astrophysics Data System (ADS)
Hughitt, V. K.; Ireland, J.; Christe, S.; Mueller, D.
2012-12-01
We discuss two ongoing software projects in solar physics that have adopted best practices of the open source software community. The first, the Helioviewer Project, is a powerful data visualization tool which includes online and Java interfaces inspired by Google Maps (tm). This effort allows users to find solar features and events of interest, and download the corresponding data. Having found data of interest, the user now has to analyze it. The dominant solar data analysis platform is an open-source library called SolarSoft (SSW). Although SSW itself is open-source, the programming language used is IDL, a proprietary language with licensing costs that are prohibitive for many institutions and individuals. SSW is composed of a collection of related scripts written by missions and individuals for solar data processing and analysis, without any consistent data structures or common interfaces. Further, at the time when SSW was initially developed, many of the best software development processes of today (mirrored and distributed version control, unit testing, continuous integration, etc.) were not standard, and have not since been adopted. The challenges inherent in developing SolarSoft led to a second software project known as SunPy. SunPy is an open-source Python-based library which seeks to create a unified solar data analysis environment including a number of core datatypes such as Maps, Lightcurves, and Spectra which have consistent interfaces and behaviors. By taking advantage of the large and sophisticated body of scientific software already available in Python (e.g. SciPy, NumPy, Matplotlib), and by adopting many of the best practices refined in open-source software development, SunPy has been able to develop at a very rapid pace while still ensuring a high level of reliability. The Helioviewer Project and SunPy represent two pioneering technologies in solar physics - simple yet flexible data visualization and a powerful, new data analysis environment. We discuss the development of both these efforts and how they are beginning to influence the solar physics community.
NASA Astrophysics Data System (ADS)
Gong, Z.; Wang, C.; Pan, Y. L.; Videen, G.
2017-12-01
Heterogeneous reactions of solid particles in a gaseous environment are of increasing interest; however, most heterogeneous chemistry studies of airborne solids have been conducted on particle ensembles. A close examination of the heterogeneous chemistry between single particles and gaseous species is key to elucidating the fundamental mechanisms of hygroscopic growth, cloud condensation nuclei formation, secondary aerosol formation, etc., and to reducing the uncertainty of models of radiative forcing, climate change, and atmospheric chemistry. We demonstrate an optical trapping-Raman spectroscopy (OT-RS) system to study the heterogeneous chemistry of solid particles in air at the single-particle level. Compared to other single-particle techniques, optical trapping offers a non-invasive, flexible, and stable method to isolate single solid particles from substrates. Benefiting from two counter-propagating hollow beams, the optical trapping configuration can trap a variety of particles of different materials, from inorganic substances (carbon nanotubes, silica, etc.) to organic, dye-doped polymers and bioaerosols (spores, pollen, etc.), with optical properties ranging from transparent to strongly absorbing, with sizes from sub-micrometers to tens of microns, and with morphologies from loosely packed nanotubes to microspheres and irregular pollen grains. The particles in the optical trap may stay unchanged, undergo surface degradation, or be optically fragmented depending on the laser intensity, and their physical and chemical properties are characterized simultaneously by the Raman spectra and the imaging system. The Raman spectra distinguish the chemical compositions of different particles, while the synchronized imaging system resolves their physical properties (sizes, shapes, morphologies, etc.). The temporal behavior of the trapped particles can also be monitored by the OT-RS system for an indefinite time with a resolution from 10 ms to 5 min, which can be further applied to monitor the dynamics of heterogeneous reactions. The OT-RS system provides a flexible method to characterize and monitor the physical properties and heterogeneous chemistry of optically trapped solid particles in a gaseous environment at the single-particle level.
NASA Astrophysics Data System (ADS)
Martin, B. R.; Shaw, G.
1998-01-01
Particle Physics, Second Edition is a concise and lucid account of the fundamental constituents of matter. The standard model of particle physics is developed carefully and systematically, without heavy mathematical formalism, to make this stimulating subject accessible to undergraduate students. Throughout, the emphasis is on the interpretation of experimental data in terms of the basic properties of quarks and leptons, and extensive use is made of symmetry principles and Feynman diagrams, which are introduced early in the book. The Second Edition brings the book fully up to date, including the discovery of the top quark and the search for the Higgs boson. A final short chapter is devoted to the continuing search for new physics beyond the standard model. Particle Physics, Second Edition features: * A carefully structured and written text to help students understand this exciting and demanding subject. * Many worked examples and problems to aid student learning. Hints for solving the problems are given in an Appendix. * Optional "starred" sections and appendices, containing more specialised and advanced material for the more ambitious reader.
Decoupling the Role of Inertia and Gravity on Particle Dispersion
NASA Technical Reports Server (NTRS)
Rogers, Chris; Squires, Kyle
1996-01-01
Turbulent gas flows laden with small, dense particles are encountered in a wide range of important applications, in both industrial settings and aerodynamics. Particle interactions with the underlying turbulent flow are exceedingly complex and, consequently, difficult to model accurately. The difficulty arises primarily because the response of a particle to the local environment is dictated by turbulence properties in the reference frame moving with the particle (particle-Lagrangian). The particle-Lagrangian reference frame is in turn dependent upon the particle relaxation time (time constant) as well as gravitational drift. The combination of inertial and gravitational effects in this frame complicates our ability to accurately predict particle-laden flows, since measurements in the particle-Lagrangian reference frame are difficult to obtain. Therefore, in this work we will examine separately the effects of inertia and gravitational drift on particle dispersion through a combination of physical and numerical experiments. In this study, particle-Lagrangian measurements will be obtained in physical experiments using stereo image velocimetry. Gravitational drift will be varied in the variable-g environments of the NASA DC-9 and in the zero-g environment of the drop tower at NASA-Lewis. Direct numerical simulations will be used to corroborate the measurements from the variable-g experiments. We expect that this work will generate new insight into the underlying physics of particle dispersion and will, in turn, lead to more accurate models of particle transport in turbulent flows.
Perchlorate Detection at Nanomolar Concentrations by Surface-Enhanced Raman Scattering
2009-01-01
[Fragmentary record; only disconnected excerpts of the abstract and methods survive:] Perchlorate (ClO4-) has emerged as a widespread environmental contaminant and has been detected in various food … spectra were collected through a …-grooves/mm grating light path controlled by Renishaw WiRE software and analyzed by Galactic GRAMS software … by means of dynamic light scattering using a ZetaPlus particle size analyzer (Brookhaven Instruments, Holtsville, NY). Data were collected for every …
Visualization of fluid dynamics at NASA Ames
NASA Technical Reports Server (NTRS)
Watson, Val
1989-01-01
The hardware and software currently used for visualization of fluid dynamics at NASA Ames is described. The software includes programs to create scenes (for example particle traces representing the flow over an aircraft), programs to interactively view the scenes, and programs to control the creation of video tapes and 16mm movies. The hardware includes high performance graphics workstations, a high speed network, digital video equipment, and film recorders.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quigg, C.
The author sketches some pressing questions in several active areas of particle physics and outlines the challenges they present for the design and operation of detectors. His assignment at the 1999 ICFA Instrumentation School is to survey some current developments in particle physics, to describe the kinds of experiments physicists would like to do in the near future, and to illustrate the demands their desires place on detectors and data analysis. Like any active science, particle physics is in a state of continual renewal. Many of the subjects that seem most fascinating and most promising today simply did not exist as recently as twenty-five years ago. Other topics that have preoccupied physicists for many years have been reshaped by recent discoveries and insights, and transformed by new techniques in accelerator science and detector technology. To provide some context for the courses and laboratories at this school, he has chosen three topics that are of high scientific interest, and that place very different demands on instrumental techniques. He hopes that attendees will begin to see the breadth of opportunities in particle physics, and will also look beyond the domain of particle physics for opportunities to apply the lessons they learn in Istanbul.
Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sussman, Alan
2014-10-21
This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.
Management of an affiliated Physics Residency Program using a commercial software tool.
Zacarias, Albert S; Mills, Michael D
2010-06-01
A review of commercially available allied health educational management software tools was performed to evaluate their capacity to manage program data associated with a CAMPEP-accredited Therapy Physics Residency Program. Features of these software tools include: a) didactic course reporting and organization, b) competency reporting by topic, category and didactic course, c) student time management and accounting, and d) student patient case reporting by topic, category and course. The software package includes features for recording school administrative information; setting up lists of courses, faculty, clinical sites, categories, competencies, and time logs; and the inclusion of standardized external documents. There are provisions for developing evaluation and survey instruments. The mentors and program may be evaluated by residents, and residents may be evaluated by faculty members using this feature. Competency documentation includes the time spent on the problem or with the patient, time spent with the mentor, date of the competency, and approval by the mentor and program director. Course documentation includes course and lecture title, lecturer, topic information, date of lecture and approval by the Program Director. These software tools have the facility to include multiple clinical sites, with local subadministrators having the ability to approve competencies and attendance at clinical conferences. In total, these software tools have the capability of managing all components of a CAMPEP-accredited residency program. The application database lends the software to the support of multiple affiliated clinical sites within a single residency program. Such tools are a critical and necessary component if the medical physics profession is to meet the projected needs for qualified medical physicists in future years.
Future of medical physics: Real-time MRI-guided proton therapy.
Oborn, Bradley M; Dowdell, Stephen; Metcalfe, Peter E; Crozier, Stuart; Mohan, Radhe; Keall, Paul J
2017-08-01
With the recent clinical implementation of real-time MRI-guided x-ray beam therapy (MRXT), attention is turning to the concept of combining real-time MRI guidance with proton beam therapy: MRI-guided proton beam therapy (MRPT). MRI guidance for proton beam therapy is expected to offer a compelling improvement to the current treatment workflow, arguably even more so than for x-ray beam therapy. This argument stems from the fact that proton therapy toxicity outcomes are similar to those of the most advanced IMRT treatments, despite the proton being a fundamentally superior particle for cancer treatment. In this Future of Medical Physics article, we describe the various software and hardware aspects of potential MRPT systems and the corresponding treatment workflow. Significant software developments, particularly focused on adaptive MRI-based planning, will be required. The magnetic interaction between the MRI and the proton beamline components will be a key area of focus: for example, the modeling and potential redesign of a magnetically compatible gantry to allow beam delivery from multiple angles towards a patient located within the bore of an MRI scanner. Further, the accuracy of pencil beam scanning and beam monitoring in the presence of an MRI fringe field will require modeling, testing, and potentially further development to ensure that highly targeted radiotherapy is maintained. Looking forward, we envisage a clear and accelerated path for hardware development, leveraging lessons learnt from MRXT development. Within a few years, simple prototype systems will likely exist, and within a decade we could envisage coupled systems with integrated gantries. Such milestones will be key in the development of a more efficient, more accurate, and more successful form of proton beam therapy for many common cancer sites. © 2017 American Association of Physicists in Medicine.
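As a rough illustration of why the magnetic interaction matters, the sketch below estimates the bending radius of a therapeutic proton in a uniform field. The beam energy and field strength are illustrative assumptions, not MRPT design parameters.

```python
import math

# Illustrative estimate of a proton's gyroradius r = p/(qB); the energy and
# field values are assumptions, not MRPT system parameters.
M_P_MEV = 938.272        # proton rest energy [MeV]
Q_E = 1.602176634e-19    # elementary charge [C]
C_LIGHT = 299_792_458.0  # speed of light [m/s]

def proton_gyroradius(kinetic_energy_mev, b_tesla):
    e_total = kinetic_energy_mev + M_P_MEV
    p_mev_c = math.sqrt(e_total**2 - M_P_MEV**2)  # relativistic momentum [MeV/c]
    p_si = p_mev_c * 1e6 * Q_E / C_LIGHT          # momentum in [kg m/s]
    return p_si / (Q_E * b_tesla)                 # radius [m]

print(proton_gyroradius(250.0, 1.5))  # ~1.6 m for a 250 MeV proton in a 1.5 T field
```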
ERIC Educational Resources Information Center
Baki, Adnan; Kosa, Temel; Guven, Bulent
2011-01-01
The study compared the effects of dynamic geometry software and physical manipulatives on the spatial visualisation skills of first-year pre-service mathematics teachers. A pre- and post-test quasi-experimental design was used. The Purdue Spatial Visualisation Test (PSVT) was used for the pre- and post-test. There were three treatment groups. The…
Software-type Wave-Particle Interaction Analyzer on board the Arase satellite
NASA Astrophysics Data System (ADS)
Katoh, Yuto; Kojima, Hirotsugu; Hikishima, Mitsuru; Takashima, Takeshi; Asamura, Kazushi; Miyoshi, Yoshizumi; Kasahara, Yoshiya; Kasahara, Satoshi; Mitani, Takefumi; Higashio, Nana; Matsuoka, Ayako; Ozaki, Mitsunori; Yagitani, Satoshi; Yokota, Shoichiro; Matsuda, Shoya; Kitahara, Masahiro; Shinohara, Iku
2018-01-01
We describe the principles of the Wave-Particle Interaction Analyzer (WPIA) and the implementation of the Software-type WPIA (S-WPIA) on the Arase satellite. The WPIA is a new type of instrument for the direct and quantitative measurement of wave-particle interactions. The S-WPIA is installed on the Arase satellite as a software function running on the mission data processor. The S-WPIA on board the Arase satellite uses the electromagnetic field waveform measured by the waveform capture receiver of the plasma wave experiment (PWE), and the velocity vectors of electrons detected by the medium-energy particle experiment-electron analyzer (MEP-e), the high-energy electron experiment (HEP), and the extremely high-energy electron experiment (XEP). The prime objective of the S-WPIA is to measure the energy exchange between whistler-mode chorus emissions and energetic electrons in the inner magnetosphere. It is essential for the S-WPIA to synchronize the instruments to a relative time accuracy better than the period of the plasma wave oscillations. Since the typical frequency of chorus emissions in the inner magnetosphere is a few kHz, a relative time accuracy of better than 10 μs is required to measure the relative phase angle between the wave and velocity vectors. In the Arase satellite, a dedicated system has been developed to realize the required time resolution for inter-instrument communication: both the time index distributed to all instruments through the satellite system and an S-WPIA clock signal, distributed from the PWE to the MEP-e, HEP, and XEP through a direct line, are used to synchronize the instruments to a relative time accuracy of a few μs. We also estimate the number of particles required to obtain statistically significant results with the S-WPIA, and the expected accumulation time, by referring to the specifications of the MEP-e and assuming a count rate for each detector.
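The core WPIA quantity is the inner product of the wave electric field and the electron velocity, accumulated over detected particles. The sketch below shows that accumulation on synthetic data only; it assumes nothing about the Arase data formats, and the array shapes and magnitudes are illustrative.

```python
import numpy as np

# Illustrative accumulation of the WPIA quantity q * E . v over detected electrons,
# using synthetic data (no Arase data formats are assumed).
Q_ELECTRON = -1.602176634e-19  # electron charge [C]

def wpia_energy_exchange(e_field, velocities):
    """Sum over particles of q * E(t_i) . v_i.

    e_field    : (N, 3) wave electric field at each detection time [V/m]
    velocities : (N, 3) electron velocity vectors [m/s]
    The sign of the result indicates the direction of energy flow between
    the wave and the electrons.
    """
    return Q_ELECTRON * np.sum(np.einsum("ij,ij->i", e_field, velocities))

rng = np.random.default_rng(0)
print(wpia_energy_exchange(rng.normal(size=(1000, 3)) * 1e-3,
                           rng.normal(size=(1000, 3)) * 1e7))
```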
Physically Based Rendering in the Nightshade NG Visualization Platform
NASA Astrophysics Data System (ADS)
Berglund, Karrie; Larey-Williams, Trystan; Spearman, Rob; Bogard, Arthur
2015-01-01
This poster describes our work on creating a physically based rendering model in Nightshade NG planetarium simulation and visualization software (project website: NightshadeSoftware.org). We discuss techniques used for rendering realistic scenes in the universe and dealing with astronomical distances in real time on consumer hardware. We also discuss some of the challenges of rewriting the software from scratch, a project which began in 2011. Nightshade NG can be a powerful tool for sharing data and visualizations. The desktop version of the software is free for anyone to download, use, and modify; it runs on Windows and Linux (and eventually Mac). If you are looking to disseminate your data or models, please stop by to discuss how we can work together. Nightshade software is used in literally hundreds of digital planetarium systems worldwide. Countless teachers and astronomy education groups run the software on flat screens. This wide use makes Nightshade an effective tool for dissemination to educators and the public. Nightshade NG is an especially powerful visualization tool when projected on a dome. We invite everyone to enter our inflatable dome in the exhibit hall to see this software in a 3D environment.
Research and technology: Fiscal year 1984 report
NASA Technical Reports Server (NTRS)
1985-01-01
Topics covered include extraterrestrial physics, high energy astrophysics, astronomy, solar physics, atmospheres, oceans, terrestrial physics, space technology, sensors, techniques, user space data systems, space communications and navigation, and system and software engineering.
Particle transport and deposition: basic physics of particle kinetics.
Tsuda, Akira; Henry, Frank S; Butler, James P
2013-10-01
The human body interacts with the environment in many different ways. The lungs interact with the external environment through breathing. The enormously large surface area of the lung with its extremely thin air-blood barrier is exposed to particles suspended in the inhaled air. The particle-lung interaction may cause deleterious effects on health if the inhaled pollutant aerosols are toxic. Conversely, this interaction can be beneficial for disease treatment if the inhaled particles are therapeutic aerosolized drugs. In either case, an accurate estimation of dose and sites of deposition in the respiratory tract is fundamental to understanding the subsequent biological response; the basic physics of particle motion and the engineering knowledge needed to understand these subjects are the topic of this article. A large portion of this article deals with three fundamental areas necessary to the understanding of particle transport and deposition in the respiratory tract. These are: (i) the physical characteristics of particles, (ii) particle behavior in gas flow, and (iii) gas-flow patterns in the respiratory tract. Other areas, such as particle transport in the developing lung and in the diseased lung, are also considered. The article concludes with a summary and a brief discussion of areas of future research. © 2013 American Physiological Society. Compr Physiol 3:1437-1471, 2013.
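Two of the basic quantities behind areas (i) and (ii) above are the particle relaxation time and the gravitational settling velocity. The sketch below computes both under Stokes drag with the slip-correction factor taken as one; the particle size and density are illustrative values.

```python
# Illustrative Stokes-regime estimates (slip correction taken as 1):
# relaxation time tau_p = rho_p * d^2 / (18 * mu) and settling speed v_s = tau_p * g.
MU_AIR = 1.81e-5   # dynamic viscosity of air [Pa s]
G = 9.81           # gravitational acceleration [m/s^2]

def relaxation_time(diameter_m, density_kg_m3):
    return density_kg_m3 * diameter_m**2 / (18.0 * MU_AIR)

def settling_velocity(diameter_m, density_kg_m3):
    return relaxation_time(diameter_m, density_kg_m3) * G

# A 1 micrometre unit-density sphere settles at roughly 30 micrometres per second
print(settling_velocity(1e-6, 1000.0))
```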
A pedagogical derivation of the matrix element method in particle physics data analysis
NASA Astrophysics Data System (ADS)
Sumowidagdo, Suharyo
2018-03-01
The matrix element method provides a direct connection between the underlying theory of particle physics processes and detector-level physical observables. I present a pedagogically oriented derivation of the matrix element method, drawing from elementary concepts in probability theory, statistics, and the process of experimental measurement. The level of treatment should be suitable for a beginning research student in phenomenology and experimental high energy physics.
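For orientation, such a derivation arrives at a per-event likelihood of the schematic form below, written here in common notation with a parton-level phase-space integral, parton densities f, and a detector transfer function W; the paper's own conventions may differ.

```latex
% Schematic per-event likelihood of the matrix element method (common notation):
P(x \mid \alpha) \;=\; \frac{1}{\sigma(\alpha)}
  \int \mathrm{d}\Phi(y)\, f(q_1)\, f(q_2)\,
  \bigl|\mathcal{M}(y;\alpha)\bigr|^{2}\, W(x \mid y)
```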
Instrumentation for Applied Physics and Industrial Applications
NASA Astrophysics Data System (ADS)
Hillemanns, H.; Le Goff, J.-M.
This document is part of Part 2 'Principles and Methods' of Subvolume B 'Detectors for Particles and Radiation' of Volume 21 'Elementary Particles' of Landolt-Börnstein - Group I 'Elementary Particles, Nuclei and Atoms'. It contains the Section '7.3 Instrumentation for Applied Physics and Industrial Applications' of Chapter '7 Applications of Detectors in Technology; Medicine and Other Fields' with the content:
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zieb, Kristofer James Ekhart; Hughes, Henry Grady III; Xu, X. George
The release of version 6.2 of the MCNP6 radiation transport code is imminent. To complement the newest release, a summary of the heavy charged particle physics models used in the 1 MeV to 1 GeV energy regime is presented. Several changes have been introduced into the charged particle physics models since the merger of the MCNP5 and MCNPX codes into MCNP6. This article discusses the default models used in MCNP6 for continuous energy loss, energy straggling, and angular scattering of heavy charged particles. Explanations of the physics models' underlying theories are included as well.
NASA Technical Reports Server (NTRS)
Moore, W. W., Jr.; Kurtz, R. L.; Lemons, J. F.
1976-01-01
The paper describes a holographic/photographic camera to be used with the zero-g or low-g Atmospheric Cloud Physics Laboratory. The flight prototype holocamera is intended to record particles from 0.01 to 5 microns for an optimum two-dimensional plane only in the microscopic photography mode, particles on a volume basis in the in-line holography mode from 5 microns up, and all particle sizes possible on a volume basis in the acute sideband holography mode.
Anti-gravity: The key to 21st century physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noyes, H.P.
1993-01-01
The masses, coupling constants, and cosmological parameters obtained using our discrete and combinatorial physics based on discrimination between bit-strings indicate that we can achieve the unification of quantum mechanics with relativity which had become the goal of twentieth century physics. To broaden our case we show that limitations on measurement of the position and velocity of an individual massive particle observed in a colliding beam scattering experiment imply real, rational commutation relations between position and velocity. Prior to this limit being pushed down to quantum effects, the lower bound is set by the available technology, but is otherwise scale invariant. Replacing force by force per unit mass and force per unit charge allows us to take over the Feynman-Dyson proof of the Maxwell Equations and extend it to weak gravity. The crossing symmetry of the individual scattering processes when one or more particles are replaced by anti-particles predicts both Coulomb attraction (for charged particles) and a Newtonian repulsion between any particle and its anti-particle. Previous quantum results remain intact, and predict the expected relativistic fine structure and spin dependencies. Experimental confirmation of this anti-gravity prediction would inaugurate the physics of the twenty-first century.
In situ real-time measurement of physical characteristics of airborne bacterial particles
NASA Astrophysics Data System (ADS)
Jung, Jae Hee; Lee, Jung Eun
2013-12-01
Bioaerosols, including aerosolized bacteria, viruses, and fungi, are associated with public health and environmental problems. One promising control method to reduce the harmful effects of bioaerosols is thermal inactivation via a continuous-flow high-temperature short-time (HTST) system. However, variations in bioaerosol physical characteristics - for example, particle size and shape - during the continuous-flow inactivation process can change the transport properties in air, which can affect particle deposition in the human respiratory system or the filtration efficiency of ventilation systems. Real-time particle monitoring techniques are a desirable alternative to the time-consuming microscopic analysis conventionally used for sampling and particle characterization. Here, we report in situ real-time optical scattering measurements of the physical characteristics of airborne bacterial particles following an HTST process in a continuous-flow system. Our results demonstrate that the aerodynamic diameter of bacterial aerosols decreases when exposed to a high-temperature environment, and that the shape of the bacterial cells is significantly altered. These variations in physical characteristics, measured by optical scattering, were found to agree with the results of scanning electron microscopy analysis.
Building Energy Simulation Test for Existing Homes (BESTEST-EX) (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Judkoff, R.; Neymark, J.; Polly, B.
2011-12-01
This presentation discusses the goals of NREL Analysis Accuracy R&D; BESTEST-EX goals; what BESTEST-EX is; how it works; 'Building Physics' cases; 'Building Physics' reference results; 'utility bill calibration' cases; limitations and potential future work. Goals of NREL Analysis Accuracy R&D are: (1) Provide industry with the tools and technical information needed to improve the accuracy and consistency of analysis methods; (2) Reduce the risks associated with purchasing, financing, and selling energy efficiency upgrades; and (3) Enhance software and input collection methods considering impacts on accuracy, cost, and time of energy assessments. BESTEST-EX Goals are: (1) Test software predictions of retrofit energy savings in existing homes; (2) Ensure building physics calculations and utility bill calibration procedures perform up to a minimum standard; and (3) Quantify impact of uncertainties in input audit data and occupant behavior. BESTEST-EX is a repeatable procedure that tests how well audit software predictions compare to the current state of the art in building energy simulation. There is no direct truth standard. However, reference software have been subjected to validation testing, including comparisons with empirical data.
Biomimetic Molecular Signaling using DNA Walkers on Microparticles.
Damase, Tulsi Ram; Spencer, Adam; Samuel, Bamidele; Allen, Peter B
2017-06-22
We report the release of catalytic DNA walkers from hydrogel microparticles and the detection of those walkers by substrate-coated microparticles. This might be considered a synthetic biology analog of molecular signal release and reception. One type of particle was coated with the components of a DNA one-step strand displacement (OSD) reaction to release the walker. A second type of particle was coated with substrate (or "track") for the molecular walker. We distinguish these particle types using fluorescence barcoding: we synthesized and distinguished multiple particle types with multicolor fluorescence microscopy and automated image analysis software. This represents a step toward amplified, multiplex, and microscopically localized detection based on DNA nanotechnology.
Software for Middle School Physical Science.
ERIC Educational Resources Information Center
Podany, Zita
This final report in the MicroSIFT series reviews 10 software packages that deal mainly with the areas of electricity, magnetism, and heat energy. Software titles appearing in this report were selected because they were judged to be exemplary according to various criteria in the MicroSIFT Evaluator's Guide, with some additions to address science…
CIP's Eighth Annual Educational Software Contest: The Winners.
ERIC Educational Resources Information Center
Donnelly, Denis
1997-01-01
Announces the winners of an annual software contest for innovative software in physics education. Winning entries include an application to help students visualize the origin of energy bands in a solid, a package on the radioastronomy of pulsars, and a school-level science simulation program. Also includes student winners, honorable mentions,…
Andiamo, a Graphical User Interface for Ohio University's Hauser-Feshbach Implementation
NASA Astrophysics Data System (ADS)
Brooks, Matthew
2017-09-01
First and foremost, I am not a physicist. I am an undergraduate computer science major/Japanese minor at Ohio University. However, I am working for Zach Meisel, in Ohio University's physics department. This is the first software development project I've ever done. My charge is/was to create a graphical program that can be used to more easily set up Hauser-Feshbach equation input files. The input files are of the format expected by the Hauser-Feshbach 2002 code developed by a handful of people at the university. I regularly attend group meetings with Zach and his other subordinates, but these are mostly used as a way for us to discuss our progress and any troubles or roadblocks we may have encountered. I was encouraged to try to come with his group to this event because it could help expose me to the scientific culture of astrophysics research. While I know very little about particles and epic space events, my poster would be an informative and (hopefully) inspiring one that could help get other undergraduates interested in doing object-oriented programming. This could be more exposure for them, as I believe a lot of physics majors only learn scripting languages.
Modeling RF-induced Plasma-Surface Interactions with VSim
NASA Astrophysics Data System (ADS)
Jenkins, Thomas G.; Smithe, David N.; Pankin, Alexei Y.; Roark, Christine M.; Stoltz, Peter H.; Zhou, Sean C.-D.; Kruger, Scott E.
2014-10-01
An overview of ongoing enhancements to the Plasma Discharge (PD) module of Tech-X's VSim software tool is presented. A sub-grid kinetic sheath model, developed for the accurate computation of sheath potentials near metal and dielectric-coated walls, enables the physical effects of DC and RF sheath dynamics to be included in macroscopic-scale plasma simulations that need not explicitly resolve sheath scale lengths. Sheath potential evolution, together with particle behavior near the sheath (e.g. sputtering), can thus be simulated in complex, experimentally relevant geometries. Simulations of RF sheath-enhanced impurity production near surfaces of the C-Mod field-aligned ICRF antenna are presented to illustrate the model; impurity mitigation techniques are also explored. Model extensions to capture the physics of secondary electron emission and of multispecies plasmas are summarized, together with a discussion of improved tools for plasma chemistry and IEDF/EEDF visualization and modeling. The latter tools are also highly relevant for commercial plasma processing applications. Ultimately, we aim to establish VSimPD as a robust, efficient computational tool for modeling fusion and industrial plasma processes. Supported by U.S. DoE SBIR Phase I/II Award DE-SC0009501.
NASA Astrophysics Data System (ADS)
Briere, Roy A.; Harris, Frederick A.; Mitchell, Ryan E.
2016-10-01
The cornerstone of the Chinese experimental particle physics program is a series of experiments performed in the τ-charm energy region. China began building e+e- colliders at the Institute for High Energy Physics in Beijing more than three decades ago. Beijing Electron Spectrometer (BES) is the common root name for the particle physics detectors operated at these machines. We summarize the development of the BES program and highlight the physics results across several topical areas.
Particle Sorting and Motility Out of Equilibrium
NASA Astrophysics Data System (ADS)
Sandford, Cato
The theory of equilibrium statistical physics, formulated over a century ago, provides an excellent description of physical systems which have reached a static, relaxed state. Such systems can be loosely thought of as maximally disordered, in keeping with the Second Law of Thermodynamics which states that a thermal system in equilibrium has reached a state of highest entropy. However, many entities in the world around us maintain themselves in a remarkably ordered and dynamic state, and must pay for this by producing entropy in their surroundings. Organisms, for example, convert chemical energy (food) into heat, which is then dumped into the environment, raising its entropy. Systems which produce entropy through any mechanism must be described by theories of non-equilibrium statistical physics, for which there currently exists no unified framework or ontology. Here we examine two specific cases of non-equilibrium phenomena from a theoretical perspective. First, we explore the behaviour of microscopic particles which continually dissipate energy to propel themselves through their environment. Second, we consider how devices which distinguish between different types of particles can exploit non-equilibrium processes to enhance their performance. For the case of self-propelled particles, we consider a theoretical model where the particle's propulsion force has "memory"--it is a random process whose instantaneous value depends on its past evolution. This introduces a persistence in the particle's motion, and requires the dissipation of energy into its surroundings. These particles are found to exhibit a variety of behaviours forbidden in equilibrium systems: for instance they may cluster around barriers, exert unbalanced forces, and sustain steady flows through space. We develop the understanding of these particles' dynamics through a combination of explicit calculations, approximations and numerical simulation which characterise and quantify their non-equilibrium behaviour. The second situation investigated concerns the physics of particle-sorting, which is fundamental to biological systems. We introduce a number of model devices designed to distinguish between and segregate two species of particles, and analyse how the quality and speed of their operation may be influenced by providing them with an energy source which pushes them out of equilibrium. We identify different physical regimes, where our devices may consume energy to deliver better results or deliver them faster or both; and we furthermore connect the broader theory of particle sorting to the fundamental theoretical framework of statistical physics.
Galactic Cosmic Ray Event-Based Risk Model (GERM) Code
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.; Plante, Ianik; Ponomarev, Artem L.; Kim, Myung-Hee Y.
2013-01-01
This software describes the transport and energy deposition of galactic cosmic rays passing through astronaut tissues during space travel, or of heavy ion beams in patients in cancer therapy. Space radiation risk is a probability distribution, and time-dependent biological events must be accounted for in the physical description of space radiation transport in tissues and cells. A stochastic model can calculate the probability density directly without unverified assumptions about the shape of the probability density function. Prior-art transport codes calculate the average flux and dose of particles behind spacecraft and tissue shielding. Because of the signaling times for activation and relaxation in the cell and tissue, a transport code must describe the temporal and microspatial density functions needed to correlate DNA and oxidative damage with non-targeted effects (signaling, bystander effects, etc.). These are ignored or impossible in the prior art. The GERM code provides scientists with data interpretation of experiments; modeling of beam lines, shielding of target samples, and sample holders; and estimation of basic physical and biological outputs of their experiments. For mono-energetic ion beams, basic physical and biological properties are calculated for a selected ion type, such as kinetic energy, mass, charge number, absorbed dose, or fluence. Evaluated quantities are linear energy transfer (LET), range (R), absorption and fragmentation cross-sections, and the probability of nuclear interactions after 1 or 5 cm of water-equivalent material. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution for a specified cellular area, cell survival curves, and DNA damage yields per cell. The GERM code also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle in a selected material. The GERM code makes numerical estimates of basic physical and biophysical quantities of the high-energy protons and heavy ions that have been studied at the NASA Space Radiation Laboratory (NSRL) for the purpose of simulating space radiation biological effects. In the first option, properties of monoenergetic beams are treated. In the second option, the transport of beams in different materials is treated. Similar biophysical properties as in the first option are evaluated for the primary ion and its secondary particles. Additional properties related to the nuclear fragmentation of the beam are evaluated. The GERM code is a computationally efficient Monte-Carlo heavy-ion-beam model. It includes accurate models of LET, range, residual energy, and straggling, and the quantum multiple scattering fragmentation (QMSGRG) nuclear database.
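One of the biophysical outputs mentioned above, the Poisson distribution of particle traversals for a specified cellular area, can be sketched as follows; the fluence and nuclear area are illustrative values, not GERM defaults.

```python
import math

# Illustrative Poisson traversal probabilities per cell nucleus; the fluence and
# nuclear area are example values, not GERM defaults.
def hit_probabilities(fluence_per_cm2, area_um2, max_hits=5):
    mean_hits = fluence_per_cm2 * area_um2 * 1e-8   # 1 um^2 = 1e-8 cm^2
    return [math.exp(-mean_hits) * mean_hits**k / math.factorial(k)
            for k in range(max_hits + 1)]

# 1e6 ions/cm^2 on a 100 um^2 nucleus gives a mean of one traversal per nucleus
print(hit_probabilities(1e6, 100.0))
```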
Theoretical physics: Quarks fuse to release energy
NASA Astrophysics Data System (ADS)
Miller, Gerald A.
2017-11-01
In nuclear fusion, energy is produced by the rearrangement of protons and neutrons. The discovery of an analogue of this process involving particles called quarks has implications for both nuclear and particle physics. See Letter p.89
Phase-factor-dependent symmetries and quantum phases in a three-level cavity QED system.
Fan, Jingtao; Yu, Lixian; Chen, Gang; Jia, Suotang
2016-05-03
Unlike conventional two-level particles, three-level particles may support unitary-invariant phase factors when they interact coherently with a single-mode quantized light field. To gain a better understanding of light-matter interaction, it is thus necessary to explore the phase-factor-dependent physics in such a system. In this report, we consider the collective interaction between degenerate V-type three-level particles and a single-mode quantized light field, whose different components are labeled by different phase factors. We establish an important relation between the phase factors and the symmetry or symmetry-breaking physics. Specifically, we find that the phase factors dramatically affect the system symmetry. When these symmetries are broken separately, rich quantum phases emerge. Finally, we propose a possible scheme to experimentally probe the predicted physics of our model. Our work provides a way to explore phase-factor-induced nontrivial physics by introducing additional particle levels.
A Bubble Chamber Simulator: A New Tool for the Physics Classroom
ERIC Educational Resources Information Center
Gagnon, Michel
2011-01-01
Mainly used in the 1960s, bubble chambers played a major role in particle physics. Although they have now been replaced by modern electronic detectors, we believe they remain an important didactic tool for introducing particle physics, as they provide visual, appealing and insightful pictures. Sadly, this rare type of detector is mostly accessible through open-door events…
Donald Glaser, the Bubble Chamber, and Elementary Particles
Selected publications (list fragment): Effects of Ionizing Radiation on the Formation of Bubbles in Liquids, Physical Review, Vol. 87, Issue 4, 665, August 15, 1952; Characteristics of Bubble Chambers, Physical Review, Vol. 97, Issue 2, 474-479; … Chambers, Physical Review, Vol. 102, Issue 6, 1653-1658, June 15, 1956; Methods of Particle Detection for …
Hands on CERN: A Well-Used Physics Education Project
ERIC Educational Resources Information Center
Johansson, K. E.
2006-01-01
The "Hands on CERN" education project makes it possible for students and teachers to get close to the forefront of scientific research. The project confronts the students with contemporary physics at its most fundamental level with the help of particle collisions from the DELPHI particle physics experiment at CERN. It now exists in 14 languages…
Fast Inference of Deep Neural Networks in FPGAs for Particle Physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duarte, Javier; Han, Song; Harris, Philip
Recent results at the Large Hadron Collider (LHC) have pointed to enhanced physics capabilities through the improvement of the real-time event processing techniques. Machine learning methods are ubiquitous and have proven to be very powerful in LHC physics, and particle physics as a whole. However, exploration of the use of such techniques in low-latency, low-power FPGA hardware has only just begun. FPGA-based trigger and data acquisition (DAQ) systems have extremely low, sub-microsecond latency requirements that are unique to particle physics. We present a case study for neural network inference in FPGAs focusing on a classifier for jet substructure which would enable, among many other physics scenarios, searches for new dark sector particles and novel measurements of the Higgs boson. While we focus on a specific example, the lessons are far-reaching. We develop a package based on High-Level Synthesis (HLS) called hls4ml to build machine learning models in FPGAs. The use of HLS increases accessibility across a broad user community and allows for a drastic decrease in firmware development time. We map out FPGA resource usage and latency versus neural network hyperparameters to identify the problems in particle physics that would benefit from performing neural network inference with FPGAs. For our example jet substructure model, we fit well within the available resources of modern FPGAs with a latency on the scale of 100 ns.
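For context, the sketch below defines a small dense classifier of the kind discussed above in Keras; a tool such as hls4ml can then translate such a model into FPGA firmware through HLS. The 16 input features and 5 jet classes are illustrative assumptions, not the exact benchmark model.

```python
# Illustrative small dense jet-substructure classifier in Keras; the input
# dimension and class count are assumptions, not the paper's exact benchmark.
import tensorflow as tf

def build_jet_tagger(n_features=16, n_classes=5):
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(n_features,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_jet_tagger()
model.summary()  # a few thousand parameters, small enough for sub-microsecond latency targets
```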
The Multiverse and Particle Physics
NASA Astrophysics Data System (ADS)
Donoghue, John F.
2016-10-01
The possibility of fundamental theories with very many ground states, each with different physical parameters, changes the way that we approach the major questions of particle physics. Most importantly, it raises the possibility that these different parameters could be realized in different domains in the larger universe. In this review, I survey the motivations for the multiverse and the impact of the idea of the multiverse on the search for new physics beyond the Standard Model.
Design and development of a Gadolinium-doped water Cherenkov detector
NASA Astrophysics Data System (ADS)
Poudyal, Nabin
This thesis describes a research and development project for neutron capture and detection in gadolinium-doped water. The Sanford Underground Research Facility (SURF) hosts rare-event physics experiments, such as neutrinoless double beta decay (MAJORANA project) and dark-matter detection (LUX experiment). The success of these experiments requires a careful study and understanding of background radiation, including its flux and energy spectrum. Background radiation from surface contamination, from radioactive decays of U-238, Th-232, and Rn-222 in the surrounding rock, and from muon-induced neutrons has a large impact on the success of rare-event physics. The main objective of this R&D project is to measure the neutron flux contributing to ongoing experiments at SURF and to suppress it by identification and capture. For this purpose, we first modeled and designed a detector with the Geant4 software and determined its approximate dimensions. The neutron capture percentage of the detector was estimated using Monte Carlo, and the energy response of the detector was simulated. We then constructed the experimental detector, an acrylic rectangular tank (60 cm × 30 cm × 30 cm) filled with gadolinium-doped deionized water. The tank is coated with a highly efficient reflector and wrapped with black electrical tape to make it opaque. The voltage dividers attached to the PMTs are covered with mu-metal. Two 5-inch Hamamatsu photomultiplier tubes were attached to opposite sides, facing the tank, to collect the Cherenkov light produced in the water. The detector exploits the emission of Cherenkov light by a charged particle moving through water faster than the speed of light in the water, and hence has an inherent energy threshold for Cherenkov photon production. This property suppresses lower-energy backgrounds. Event data are obtained using data acquisition hardware (a flash analog-to-digital converter) together with the Multi Instance Data Acquisition software. Post-experimental analysis was performed using the ROOT software. Position calibration shows that the detector response is position independent. We have designed and constructed a Gd-doped neutron detector which successfully detects neutrons at low cost and with high efficiency.
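In the spirit of the Geant4 estimate described above, here is a toy Monte Carlo for the fraction of neutrons captured in a slab of Gd-doped water, assuming a single effective capture mean free path; the geometry and mean free path are illustrative assumptions, not values from the thesis.

```python
import random

# Toy Monte Carlo: fraction of neutrons captured within a slab, assuming an
# exponential capture length. The thickness and mean free path are assumptions.
def capture_fraction(thickness_cm, mean_free_path_cm, n_neutrons=100_000, seed=1):
    random.seed(seed)
    captured = sum(
        1 for _ in range(n_neutrons)
        if random.expovariate(1.0 / mean_free_path_cm) < thickness_cm
    )
    return captured / n_neutrons

# 30 cm of doped water with an assumed 5 cm capture mean free path
print(capture_fraction(30.0, 5.0))  # close to 1 - exp(-6) ~ 0.998
```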
Scaling effects in direct shear tests
Orlando, A.D.; Hanes, D.M.; Shen, H.H.
2009-01-01
Laboratory experiments of the direct shear test were performed on spherical particles of different materials and diameters. Results of the bulk friction vs. non-dimensional shear displacement are presented as a function of the non-dimensional particle diameter. Simulations of the direct shear test were performed using the Discrete Element Method (DEM). The simulation results show considerable differences from the physical experiments. Particle-level material properties, such as the coefficients of static friction, restitution and rolling friction, need to be known a priori in order to guarantee that the simulation results are an accurate representation of the physical phenomenon. Furthermore, the laboratory results show a clear size dependence, with smaller particles exhibiting a higher bulk friction than larger ones. © 2009 American Institute of Physics.
Application of econometric and ecology analysis methods in physics software
NASA Astrophysics Data System (ADS)
Han, Min Cheol; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Sung Hun; Grazia Pia, Maria; Ronchieri, Elisabetta; Saracco, Paolo
2017-10-01
Some data analysis methods typically used in econometric studies and in ecology have been evaluated and applied in physics software environments. They concern the evolution of observables through objective identification of change points and trends, and measurements of inequality, diversity and evenness across a data set. Within each analysis area, various statistical tests and measures have been examined. This conference paper gives a brief overview of some of these methods.
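For readers unfamiliar with these econometric and ecological measures, the short sketch below (not the authors' code) applies two of the quantities the abstract mentions, the Gini inequality coefficient and the Shannon diversity index with Pielou evenness, to a hypothetical distribution of a per-module software observable. The input values are invented for illustration only.

# Illustrative sketch (not the paper's implementation): Gini inequality,
# Shannon diversity and Pielou evenness for a hypothetical data set.
import numpy as np

def gini(x):
    """Gini coefficient of non-negative values (0 = perfect equality)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    cum = np.cumsum(x)
    return (n + 1 - 2.0 * np.sum(cum) / cum[-1]) / n

def shannon_diversity(counts):
    """Shannon index H and Pielou evenness J = H / ln(S) for category counts."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    h = -np.sum(p * np.log(p))
    return h, (h / np.log(p.size) if p.size > 1 else 1.0)

sizes = [120, 80, 75, 40, 15, 5, 3]   # hypothetical observable per module
print("Gini:", round(gini(sizes), 3))
print("Shannon H, Pielou J:", shannon_diversity(sizes))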
Ji, S.; Hanes, D.M.; Shen, H.H.
2009-01-01
In this study, we report a direct comparison between a physical test and a computer simulation of rapidly sheared granular materials. An annular shear cell experiment was conducted. All parameters were kept the same between the physical and the computational systems to the extent possible. Artificially softened particles were used in the simulation to reduce the computational time to a manageable level. A sensitivity study on the particle stiffness ensured that this artificial modification was acceptable. In the experiment, a range of normal stress was applied to a given amount of particles sheared in an annular trough at a range of controlled shear speeds. Two types of particles, glass and Delrin, were used in the experiment. Qualitatively, the simulated torque required to shear the materials at different rotational speeds compared well with that measured in the physical experiments for both the glass and the Delrin particles. However, the quantitative discrepancies between the measured and simulated shear stresses were nearly a factor of two. Boundary conditions, particle size distribution, particle damping, and friction, including the sliding and rolling contact force model, were examined to determine their effects on the computational results. It was found that, of the above, the rolling friction between particles had the most significant effect on the macroscopic stress level. This study shows that discrete element simulation is a viable method for engineering design of granular material systems. Particle-level information is needed to properly conduct these simulations. However, not all particle-level information is equally important in the studied regime. Rolling friction, which is not commonly considered in many discrete element models, appears to play an important role. © 2009 Elsevier Ltd.
NASA Technical Reports Server (NTRS)
Bishop, J. L.; Murchie, S.; Pieters, C.; Zent, A.
1999-01-01
This model is one of many possible scenarios to explain the generation of the current surface material on Mars using chemical, magnetic and spectroscopic data from Mars and geologic analogs from terrestrial sites. One basic premise is that there are physical and chemical interactions of the atmospheric dust particles and that these two processes create distinctly different results. Physical processes distribute dust particles on rocks, forming physical rock coatings, and on the surface between rocks, forming soil units; these are reversible processes. Chemical reactions of the dust/soil particles create alteration rinds on rock surfaces or duricrust surface units, both of which are relatively permanent materials. According to this model, the mineral components of the dust/soil particles are derived from a combination of "typical" palagonitic weathering of volcanic ash and hydrothermally altered components, primarily from steam vents or fumaroles. Both of these altered materials are composed of tiny particles, about 1 micron or smaller, that are aggregates of silicates and iron oxide/oxyhydroxide/sulfate phases. Additional information is contained in the original extended abstract.
Novel high power impulse magnetron sputtering enhanced by an auxiliary electrical field
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Chunwei, E-mail: lcwnefu@126.com; State Key Laboratory of Advanced Welding and Joining, Harbin Institute of Technology, Harbin 150001; Tian, Xiubo, E-mail: xiubotian@163.com
2016-08-15
The high power impulse magnetron sputtering (HIPIMS) technique is a novel highly ionized physical vapor deposition method with a high application potential. However, the electron utilization efficiency during sputtering is rather low and the metal particle ionization rate needs to be considerably improved to allow for a large-scale industrial application. Therefore, we enhanced the HIPIMS technique by simultaneously applying an electric field (EF-HIPIMS). The effect of the electric field on the discharge process was studied using a current sensor and an optical emission spectrometer. Furthermore, the spatial distribution of the electric potential and electric field during the EF-HIPIMS process was simulated using the ANSYS software. The results indicate that a higher electron utilization efficiency and a higher particle ionization rate could be achieved. The auxiliary anode obviously changed the distribution of the electric potential and the electric field in the discharge region, which increased the plasma density and enhanced the degree of ionization of the vanadium and argon gas. Vanadium films were deposited to further compare both techniques, and the morphology of the prepared films was investigated by scanning electron microscopy. The films showed a smaller crystal grain size and a denser growth structure when the electric field was applied during the discharge process.
Evaluation of Spacecraft Shielding Effectiveness for Radiation Protection
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.; Wilson, John W.
1999-01-01
The potential for serious health risks from solar particle events (SPE) and galactic cosmic rays (GCR) is a critical issue in the NASA strategic plan for the Human Exploration and Development of Space (HEDS). The excess cost to protect against the GCR and SPE due to current uncertainties in radiation transmission properties and cancer biology could be exceedingly large based on the excess launch costs to shield against uncertainties. The development of advanced shielding concepts is an important risk mitigation area with the potential to significantly reduce risk below conventional mission designs. A key issue in spacecraft material selection is the understanding of nuclear reactions on the transmission properties of materials. High-energy nuclear particles undergo nuclear reactions in passing through materials and tissue altering their composition and producing new radiation types. Spacecraft and planetary habitat designers can utilize radiation transport codes to identify optimal materials for lowering exposures and to optimize spacecraft design to reduce astronaut exposures. To reach these objectives will require providing design engineers with accurate data bases and computationally efficient software for describing the transmission properties of space radiation in materials. Our program will reduce the uncertainty in the transmission properties of space radiation by improving the theoretical description of nuclear reactions and radiation transport, and provide accurate physical descriptions of the track structure of microscopic energy deposition.
A Model for Determining Strength for Embedded Elliptical Crack in Ultra-high-temperature Ceramics
Wang, Ruzhuan; Li, Weiguo
2015-01-01
A fracture strength model applied at room temperature for embedded elliptical crack in brittle solid was obtained. With further research on the effects of various physical mechanisms on material strength, a thermo-damage strength model for ultra-high-temperature ceramics was applied to each temperature phase. Fracture strength of TiC and the changing trends with elliptical crack shape variations under different temperatures were studied. The study showed that under low temperature, the strength is sensitive to the crack shape variation; as the temperature increases, the sensitivities become smaller. The size of ellipse’s minor axes has great effect on the material strength when the ratio of ellipse’s minor and major axes is lower than 0.5, even under relatively high temperatures. The effect of the minor axes of added particle on material properties thus should be considered under this condition. As the crack area is set, the fracture strength decreases firstly and then increases with the increase of ratio of ellipse’s minor and major axes, and the turning point is 0.5. It suggests that for the added particles the ratio of ellipse’s minor and major axes should not be 0.5. All conclusions significantly coincided with the results obtained by using the finite element software ABAQUS. PMID:28793488
NASA Astrophysics Data System (ADS)
Almansa, Julio; Salvat-Pujol, Francesc; Díaz-Londoño, Gloria; Carnicer, Artur; Lallena, Antonio M.; Salvat, Francesc
2016-02-01
The Fortran subroutine package PENGEOM provides a complete set of tools to handle quadric geometries in Monte Carlo simulations of radiation transport. The material structure where radiation propagates is assumed to consist of homogeneous bodies limited by quadric surfaces. The PENGEOM subroutines (a subset of the PENELOPE code) track particles through the material structure, independently of the details of the physics models adopted to describe the interactions. Although these subroutines are designed for detailed simulations of photon and electron transport, where all individual interactions are simulated sequentially, they can also be used in mixed (class II) schemes for simulating the transport of high-energy charged particles, where the effect of soft interactions is described by the random-hinge method. The definition of the geometry and the details of the tracking algorithm are tailored to optimize simulation speed. The use of fuzzy quadric surfaces minimizes the impact of round-off errors. The provided software includes a Java graphical user interface for editing and debugging the geometry definition file and for visualizing the material structure. Images of the structure are generated by using the tracking subroutines and, hence, they describe the geometry actually passed to the simulation code.
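To make the quadric-geometry idea concrete, the toy sketch below (not the actual PENGEOM/PENELOPE interface) represents each surface as an implicit quadric F(r) = 0 and defines a body by the required sign of F for each limiting surface, which is the essence of how a tracking routine decides which body a particle is in. All names and the example body are invented for illustration.

# Toy illustration of quadric-bounded bodies (not the PENGEOM API): a body is
# the set of points lying on a prescribed side of each of its quadric surfaces.
import numpy as np

class Quadric:
    """F(r) = r^T A r + b . r + c; side(r) is the sign of F(r)."""
    def __init__(self, A, b, c):
        self.A = np.asarray(A, float)
        self.b = np.asarray(b, float)
        self.c = float(c)
    def value(self, r):
        r = np.asarray(r, float)
        return r @ self.A @ r + self.b @ r + self.c
    def side(self, r):
        return 1 if self.value(r) > 0.0 else -1

# Example body: inside the unit sphere (x^2+y^2+z^2-1 < 0) and above the
# plane z = 0 (-z < 0), i.e. the upper half of the unit ball.
sphere = Quadric(np.eye(3), np.zeros(3), -1.0)
plane_z = Quadric(np.zeros((3, 3)), [0.0, 0.0, -1.0], 0.0)
body = [(sphere, -1), (plane_z, -1)]          # (surface, required side)

def inside(body, r):
    return all(surf.side(r) == s for surf, s in body)

print(inside(body, [0.0, 0.0, 0.5]))   # True: inside the upper half-sphere
print(inside(body, [0.0, 0.0, -0.5]))  # False: below the plane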
Fast Particle Methods for Multiscale Phenomena Simulations
NASA Technical Reports Server (NTRS)
Koumoutsakos, P.; Wray, A.; Shariff, K.; Pohorille, Andrew
2000-01-01
We are developing particle methods aimed at improving computational modeling capabilities for multiscale physical phenomena in: (i) high Reynolds number unsteady vortical flows, (ii) particle-laden and interfacial flows, and (iii) molecular dynamics studies of nanoscale droplets and studies of the structure, functions, and evolution of the earliest living cell. The unifying computational approach involves particle methods implemented on parallel computer architectures. The inherent adaptivity, robustness and efficiency of particle methods make them a multidisciplinary computational tool capable of bridging the gap between micro-scale and continuum flow simulations. Using efficient tree data structures, multipole expansion algorithms, and improved particle-grid interpolation, particle methods allow for simulations using millions of computational elements, making possible the resolution of a wide range of length and time scales of these important physical phenomena. The current challenges in these simulations are: (i) the proper formulation of particle methods at the molecular and continuum levels for the discretization of the governing equations, (ii) the resolution of the wide range of time and length scales governing the phenomena under investigation, (iii) the minimization of numerical artifacts that may interfere with the physics of the systems under consideration, and (iv) the parallelization of processes such as tree traversal and grid-particle interpolation. We are conducting simulations using vortex methods, molecular dynamics and smoothed particle hydrodynamics, exploiting their unifying concepts such as the solution of the N-body problem on parallel computers, highly accurate particle-particle and grid-particle interpolations, parallel FFTs, and the formulation of processes such as diffusion in the context of particle methods. This approach enables us to move between seemingly unrelated areas of research.
Basics of particle therapy I: physics
Park, Seo Hyun
2011-01-01
With the advance of modern radiation therapy techniques, radiation dose conformation and dose distribution have improved dramatically. However, this progress does not completely fulfill the goals of cancer treatment, such as improved local control or survival. The discordance with clinical results stems from the biophysical nature of the photon, the main radiation source in the current field, with its lower linear energy transfer to the target. As part of a natural progression, there has recently been a resurgence of interest in particle therapy, specifically using heavy charged particles, because these kinds of radiation offer theoretical advantages in both biological and physical aspects. The Korean government plans to set up a heavy charged particle facility at the Korea Institute of Radiological & Medical Sciences. This review introduces some of the elementary physics of the various particles for the benefit of Korean radiation oncologists. PMID:22984664
Fuzzy logic particle tracking velocimetry
NASA Technical Reports Server (NTRS)
Wernet, Mark P.
1993-01-01
Fuzzy logic has proven to be a simple and robust method for process control. Instead of requiring a complex model of the system, a user-defined rule base is used to control the process. In this paper the principles of fuzzy logic control are applied to Particle Tracking Velocimetry (PTV). Two frames of digitally recorded, single-exposure particle imagery are used as input. The fuzzy processor uses the local particle displacement information to determine the correct particle tracks. Fuzzy PTV is an improvement over traditional PTV techniques, which typically require a sequence (greater than 2) of image frames to track particles accurately. The fuzzy processor executes in software on a PC without the use of specialized array or fuzzy logic processors. A pair of sample input images with roughly 300 particle images each results in more than 200 velocity vectors in under 8 seconds of processing time.
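To make the two-frame PTV bookkeeping concrete, the sketch below pairs particle centroids between two frames with a much-simplified nearest-neighbour match within a search radius; it does not reproduce the fuzzy rule base described in the abstract, and the coordinates are hypothetical.

# Much-simplified two-frame matcher (nearest neighbour within a search radius),
# shown only as a baseline illustration of PTV track pairing, not the fuzzy method.
import numpy as np

def match_particles(frame_a, frame_b, max_disp):
    """Pair each centroid in frame_a with its closest unused centroid in frame_b
    within max_disp pixels; returns (index_a, index_b, displacement) tuples."""
    a = np.asarray(frame_a, float)
    b = np.asarray(frame_b, float)
    tracks, used = [], set()
    for i, p in enumerate(a):
        d = np.linalg.norm(b - p, axis=1)
        if used:                        # do not reuse already-paired particles
            d[list(used)] = np.inf
        j = int(np.argmin(d))
        if d[j] <= max_disp:
            used.add(j)
            tracks.append((i, j, b[j] - p))
    return tracks

frame1 = [(10.2, 5.1), (40.0, 22.3), (75.4, 60.0)]   # hypothetical centroids, frame 1
frame2 = [(12.1, 5.9), (41.8, 23.0), (77.0, 61.2)]   # hypothetical centroids, frame 2
for i, j, disp in match_particles(frame1, frame2, max_disp=5.0):
    print(f"particle {i} -> {j}, displacement = {disp}")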
ERIC Educational Resources Information Center
Mackenzie, Norma N.; And Others
1988-01-01
Reviews four computer software packages including: "The Physical Science Series: Sound" which demonstrates making waves, speed of sound, doppler effect, and human hearing; "Andromeda" depicting celestial motions in any direction; "Biology Quiz: Humans" covering chemistry, cells, viruses, and human biology; and…
Protocol for Direct Counterfactual Quantum Communication
NASA Astrophysics Data System (ADS)
Salih, Hatim; Li, Zheng-Hong; Al-Amri, M.; Zubairy, M. Suhail
2013-04-01
It has long been assumed in physics that for information to travel between two parties in empty space, “Alice” and “Bob,” physical particles have to travel between them. Here, using the “chained” quantum Zeno effect, we show how, in the ideal asymptotic limit, information can be transferred between Alice and Bob without any physical particles traveling between them.
Particle-fluid interactions for flow measurements
NASA Technical Reports Server (NTRS)
Berman, N. S.
1973-01-01
A study has been made of the motion of a single particle and of a group of particles, emphasizing solid particles in a gaseous fluid. The velocities of the fluid and the particle are compared for several conditions of physical interest. The mean velocity and velocity fluctuations are calculated for a single particle, and some consideration is given to multiparticle systems.
Matter and Interactions: A Particle Physics Perspective
ERIC Educational Resources Information Center
Organtini, Giovanni
2011-01-01
In classical mechanics, matter and fields are completely separated; matter interacts with fields. For particle physicists this is not the case; both matter and fields are represented by particles. Fundamental interactions are mediated by particles exchanged between matter particles. In this article we explain why particle physicists believe in…
PDT - PARTICLE DISPLACEMENT TRACKING SOFTWARE
NASA Technical Reports Server (NTRS)
Wernet, M. P.
1994-01-01
Particle Imaging Velocimetry (PIV) is a quantitative velocity measurement technique for measuring instantaneous planar cross sections of a flow field. The technique offers very high precision (1%) directionally resolved velocity vector estimates, but its use has been limited by high equipment costs and complexity of operation. Particle Displacement Tracking (PDT) is an all-electronic PIV data acquisition and reduction procedure which is simple, fast, and easily implemented. The procedure uses a low power, continuous wave laser and a Charge-Coupled Device (CCD) camera to electronically record the particle images. A frame grabber board in a PC is used for data acquisition and reduction processing. PDT eliminates the need for photographic processing, system costs are moderately low, and reduced data are available within seconds of acquisition. The technique results in velocity estimate accuracies on the order of 5%. The software is fully menu-driven from the acquisition to the reduction and analysis of the data. Options are available to acquire a single image or a 5- or 25-field series of images separated in time by multiples of 1/60 second. The user may process each image, specifying its boundaries to remove unwanted glare from the periphery and adjusting its background level to clearly resolve the particle images. Data reduction routines determine the particle image centroids and create time history files. PDT then identifies the velocity vectors which describe the particle movement in the flow field. Graphical data analysis routines are included which allow the user to graph the time history files and display the velocity vector maps, interpolated velocity vector grids, iso-velocity vector contours, and flow streamlines. The PDT data processing software is written in FORTRAN 77 and the data acquisition routine is written in C-Language for 80386-based IBM PC compatibles running MS-DOS v3.0 or higher. Machine requirements include 4 MB RAM (3 MB Extended), a single or multiple frequency RGB monitor (EGA or better), a math co-processor, and a pointing device. The printers supported by the graphical analysis routines are the HP Laserjet+, Series II, and Series III with at least 1.5 MB memory. The data acquisition routines require the EPIX 4-MEG video board and optional 12.5MHz oscillator, and associated EPIX software. Data can be acquired from any CCD or RS-170 compatible video camera with pixel resolution of 600h x 400v or better. PDT is distributed on one 5.25 inch 360K MS-DOS format diskette. Due to the use of required proprietary software, executable code is not provided on the distribution media. Compiling the source code requires the Microsoft C v5.1 compiler, Microsoft QuickC v2.0, the Microsoft Mouse Library, EPIX Image Processing Libraries, the Microway NDP-Fortran-386 v2.1 compiler, and the Media Cybernetics HALO Professional Graphics Kernel System. Due to the complexities of the machine requirements, COSMIC strongly recommends the purchase and review of the documentation prior to the purchase of the program. The source code, and sample input and output files are provided in PKZIP format; the PKUNZIP utility is included. PDT was developed in 1990. All trade names used are the property of their respective corporate owners.
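The centroid-finding step the abstract describes can be illustrated with a modern-Python analogue (it is not the original FORTRAN/C PDT code): threshold a single-exposure frame, label the connected particle images, and record their centroids for the time-history files. The synthetic frame and threshold are placeholders.

# Modern-Python analogue of the PDT centroid step (illustrative only).
import numpy as np
from scipy import ndimage

def particle_centroids(frame, background_level):
    """Return (row, col) centroids of bright blobs above background_level."""
    mask = frame > background_level
    labels, n = ndimage.label(mask)                    # connected particle images
    return ndimage.center_of_mass(frame, labels, range(1, n + 1))

# Synthetic 64x64 frame with two Gaussian-like particle images.
rng = np.random.default_rng(0)
frame = rng.normal(10.0, 1.0, (64, 64))
yy, xx = np.mgrid[0:64, 0:64]
for cy, cx in [(20.0, 15.0), (45.0, 50.0)]:
    frame += 80.0 * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / 4.0)

print(particle_centroids(frame, background_level=30.0))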
Monitoring of Hadrontherapy Treatments by Means of Charged Particle Detection.
Muraro, Silvia; Battistoni, Giuseppe; Collamati, Francesco; De Lucia, Erika; Faccini, Riccardo; Ferroni, Fernando; Fiore, Salvatore; Frallicciardi, Paola; Marafini, Michela; Mattei, Ilaria; Morganti, Silvio; Paramatti, Riccardo; Piersanti, Luca; Pinci, Davide; Rucinski, Antoni; Russomando, Andrea; Sarti, Alessio; Sciubba, Adalberto; Solfaroli-Camillocci, Elena; Toppi, Marco; Traini, Giacomo; Voena, Cecilia; Patera, Vincenzo
2016-01-01
The interaction of the incoming beam radiation with the patient body in hadrontherapy treatments produces secondary charged and neutral particles, whose detection can be used for monitoring purposes and to perform an on-line check of the beam particle range. In the context of ion therapy with active scanning, charged particles are potentially attractive since they can be easily tracked with high efficiency in the presence of a relatively low background contamination. In order to verify the possibility of exploiting this approach for in-beam monitoring in ion therapy, and to guide the design of specific detectors, both simulations and experimental tests are being performed with ion beams impinging on simple homogeneous tissue-like targets (PMMA). From these studies, a resolution of the order of a few millimeters on the single track has been proven to be sufficient to exploit charged particle tracking for monitoring purposes, preserving the precision achievable on the longitudinal shape. The results obtained so far show that the measurement of charged particles can be successfully implemented in a technology capable of monitoring both the dose profile and the position of the Bragg peak inside the target, and can finally lead to the design of a novel profile detector. Crucial aspects to be considered are the detector positioning, to be optimized in order to maximize the available statistics, and the capability of accounting for the multiple scattering interactions undergone by the charged fragments along their exit path from the patient body. The experimental results collected up to now are also valuable for the validation of Monte Carlo simulation software tools and their implementation in Treatment Planning Software packages. PMID:27536555
Long Pulse Operation on Tore-Supra: Towards Steady State
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreau, P.; Bucalossi, J.; Brosset, C.
The experimental programme of Tore Supra is devoted to the study of technology and physics issues associated with long-duration, high-performance discharges. This new domain of operation requires, simultaneously and in steady state: heat removal capability, particle exhaust, fully non-inductive current drive, advanced technology integration and real-time plasma control. Long discharges allow new time-scale physics to be addressed, such as wall particle retention and erosion. Moreover, the physics of fully non-inductive discharges is full of novelty, namely the MHD stability, the slow spontaneous oscillation of the central electron temperature and the outstanding inward particle pinch.
Exploring the SCOAP3 Research Contributions of the United States
NASA Astrophysics Data System (ADS)
Marsteller, Matthew
2016-03-01
The Sponsoring Consortium for Open Access Publishing in Particle Physics (SCOAP3) is a successful global partnership of libraries, funding agencies and research centers. This presentation will inform the audience about SCOAP3 and also delve into descriptive statistics of the United States' intellectual contribution to particle physics via these open access journals. Exploration of the SCOAP3 particle physics literature using a variety of metrics tools such as Web of Science™, InCites™, Scopus® and SciVal will be shared. ORA or Sci2 will be used to visualize author collaboration networks.
The design of a small flow optical sensor of particle counter
NASA Astrophysics Data System (ADS)
Zhan, Yongbo; zhang, Jianwei; Zeng, Jianxiong; Li, Bin; Chen, Lu
2018-01-01
Based on the principle of Mie scattering, we design a small-flow optical sensor for a particle counter. First, the laser illumination system was simulated and designed with the ZEMAX optical design software, and a uniform light intensity over the photosensitive area was obtained. The gas circuit structure was also designed according to the relevant theory of fluid mechanics. Then, a method combining the MIST scattering calculation software with geometric modeling was used for the first time to design the spherical reflection system, on the basis of the object-image distance formula. Finally, the optical sensor was tested in a self-designed pre-amplification and high-speed processing circuit. The test results show that the counting efficiency for the 0.3 μm channel is above 70%, the 0.5 μm and 1.0 μm channels both reach more than 90%, and the dispersion coefficients of the channels are very nearly the same, compared with the Kanomax 3886 reference instrument under particle spraying flows of 2.5 SCFH, 3.0 SCFH and 3.5 SCFH.
A geometric approach to identify cavities in particle systems
NASA Astrophysics Data System (ADS)
Voyiatzis, Evangelos; Böhm, Michael C.; Müller-Plathe, Florian
2015-11-01
The implementation of a geometric algorithm to identify cavities in particle systems in an open-source python program is presented. The algorithm makes use of the Delaunay space tessellation. The present python software is based on platform-independent tools, leading to a portable program. Its successful execution provides information concerning the accessible volume fraction of the system, the size and shape of the cavities and the group of atoms forming each of them. The program can be easily incorporated into the LAMMPS software. An advantage of the present algorithm is that no a priori assumption on the cavity shape has to be made. As an example, the cavity size and shape distributions in a polyethylene melt system are presented for three spherical probe particles. This paper serves also as an introductory manual to the script. It summarizes the algorithm, its implementation, the required user-defined parameters as well as the format of the input and output files. Additionally, we demonstrate possible applications of our approach and compare its capability with the ones of well documented cavity size estimators.
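The core idea of the abstract, using the Delaunay tessellation of the particle positions to locate space that can host a probe particle, can be sketched with scipy as follows. This is a hedged illustration rather than the published program: a tetrahedron is flagged as (part of) a cavity if its empty circumsphere can accommodate a probe of radius r_probe around particles of radius r_particle, and the coordinates and radii below are synthetic.

# Hedged sketch of cavity detection via Delaunay tessellation (not the paper's code).
import numpy as np
from scipy.spatial import Delaunay

def circumsphere(p):
    """Circumcenter and circumradius of a tetrahedron (4x3 array of vertices)."""
    A = 2.0 * (p[1:] - p[0])
    rhs = np.sum(p[1:] ** 2 - p[0] ** 2, axis=1)
    center = np.linalg.solve(A, rhs)
    return center, np.linalg.norm(center - p[0])

def cavity_tetrahedra(points, r_particle, r_probe):
    tess = Delaunay(points)
    flagged = []
    for simplex in tess.simplices:             # each simplex is a tetrahedron
        center, radius = circumsphere(points[simplex])
        if radius > r_particle + r_probe:      # probe fits without overlapping particles
            flagged.append((simplex, center, radius))
    return flagged

rng = np.random.default_rng(1)
coords = rng.uniform(0.0, 10.0, size=(200, 3))   # synthetic particle positions
holes = cavity_tetrahedra(coords, r_particle=0.5, r_probe=0.4)
print(f"{len(holes)} tetrahedra can host the probe")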
Space physics educational outreach
NASA Technical Reports Server (NTRS)
Copeland, Richard A.
1995-01-01
The goal of this Space Physics Educational Outreach project was to develop a laboratory experiment and classroom lecture on Earth's aurora for use in lower division college physics courses, with the particular aim of implementing the experiment and lecture at Saint Mary's College of California. The strategy is to teach physics in the context of an interesting natural phenomenon by investigating the physical principles that are important in Earth's aurora, including motion of charged particles in electric and magnetic fields, particle collisions and chemical reactions, and atomic and molecular spectroscopy. As a by-product, the undergraduate students would develop an appreciation for naturally occurring space physics phenomena.
NASA Astrophysics Data System (ADS)
Oliveira, N. P.; Maciel, L.; Catarino, A. P.; Rocha, A. M.
2017-10-01
This work proposes the creation of surface models using parametric computer modelling software to obtain three-dimensional structures in weft-knitted fabrics produced on single-needle-system machines. Digital prototyping, another feature of digital modelling software, was also explored through three-dimensional drawings generated with the Rhinoceros software. With this approach, different 3D structures were developed and produced. Physical characterization tests were then performed on the resulting 3D weft-knitted structures to assess their ability to promote comfort. The obtained results indicate that the developed structures have potential for application in different market segments, such as clothing and interior textiles.
Proposed software system for atomic-structure calculation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fischer, C.F.
1981-07-01
Atomic structure calculations are understood well enough that, at a routine level, an atomic structure software package can be developed. At the Atomic Physics Conference in Riga in 1978, L.V. Chernysheva and M.Y. Amusia of Leningrad University presented a paper on software for atomic calculations. Their system, called ATOM, is based on the Hartree-Fock approximation, and correlation is included within the framework of RPAE. Energy level calculations, transition probabilities, photo-ionization cross-sections and electron scattering cross-sections are some of the physical properties that can be evaluated by their system. The MCHF method, together with CI techniques and the Breit-Pauli approximation, also provides a sound theoretical basis for atomic structure calculations.
Semi-Empirical Modeling of SLD Physics
NASA Technical Reports Server (NTRS)
Wright, William B.; Potapczuk, Mark G.
2004-01-01
The effects of supercooled large droplets (SLD) in icing have been an area of much interest in recent years. As part of this effort, the assumptions used for ice accretion software have been reviewed. A literature search was performed to determine advances from other areas of research that could be readily incorporated. Experimental data in the SLD regime was also analyzed. A semi-empirical computational model is presented which incorporates first order physical effects of large droplet phenomena into icing software. This model has been added to the LEWICE software. Comparisons are then made to SLD experimental data that has been collected to date. Results will be presented for the comparison of water collection efficiency, ice shape and ice mass.
[Nursing physical examination of the full-term neonate: self-instructional software].
Fernandes, Maria das Graças de Oliveira; Barbosa, Vera Lucia; Naganuma, Masuco
2006-01-01
The purpose of this research was to develop software on the physical examination of full-term newborns (TNB) for teaching neonatal nursing at the undergraduate level. The software was developed through the phases of planning, content development and evaluation. The construction of the modules was based on Gagné's modern learning theory and structured on the Keller Plan, in line with the systemic approach. The objectives were to develop and evaluate the contents of the self-instructional modules, to be used as a teaching strategy in the undergraduate course. After being structured, the material was reviewed and analyzed by 11 neonatal nursing experts, who rated the 42 presented items as good or excellent.
MaMiCo: Transient multi-instance molecular-continuum flow simulation on supercomputers
NASA Astrophysics Data System (ADS)
Neumann, Philipp; Bian, Xin
2017-11-01
We present extensions of the macro-micro-coupling tool MaMiCo, which was designed to couple continuum fluid dynamics solvers with discrete particle dynamics. To enable local extraction of smooth flow field quantities especially on rather short time scales, sampling over an ensemble of molecular dynamics simulations is introduced. We provide details on these extensions including the transient coupling algorithm, open boundary forcing, and multi-instance sampling. Furthermore, we validate the coupling in Couette flow using different particle simulation software packages and particle models, i.e. molecular dynamics and dissipative particle dynamics. Finally, we demonstrate the parallel scalability of the molecular-continuum simulations by using up to 65 536 compute cores of the supercomputer Shaheen II located at KAUST. Program Files doi:http://dx.doi.org/10.17632/w7rgdrhb85.1 Licensing provisions: BSD 3-clause Programming language: C, C++ External routines/libraries: For compiling: SCons, MPI (optional) Subprograms used: ESPResSo, LAMMPS, ls1 mardyn, waLBerla For installation procedures of the MaMiCo interfaces, see the README files in the respective code directories located in coupling/interface/impl. Journal reference of previous version: P. Neumann, H. Flohr, R. Arora, P. Jarmatz, N. Tchipev, H.-J. Bungartz. MaMiCo: Software design for parallel molecular-continuum flow simulations, Computer Physics Communications 200: 324-335, 2016 Does the new version supersede the previous version?: Yes. The functionality of the previous version is completely retained in the new version. Nature of problem: Coupled molecular-continuum simulation for multi-resolution fluid dynamics: parts of the domain are resolved by molecular dynamics or another particle-based solver whereas large parts are covered by a mesh-based CFD solver, e.g. a lattice Boltzmann automaton. Solution method: We couple existing MD and CFD solvers via MaMiCo (macro-micro coupling tool). Data exchange and coupling algorithmics are abstracted and incorporated in MaMiCo. Once an algorithm is set up in MaMiCo, it can be used and extended, even if other solvers are used (as soon as the respective interfaces are implemented/available). Reasons for the new version: We have incorporated a new algorithm to simulate transient molecular-continuum systems and to automatically sample data over multiple MD runs that can be executed simultaneously (on, e.g., a compute cluster). MaMiCo has further been extended by an interface to incorporate boundary forcing to account for open molecular dynamics boundaries. Besides support for coupling with various MD and CFD frameworks, the new version contains a test case that allows to run molecular-continuum Couette flow simulations out-of-the-box. No external tools or simulation codes are required anymore. However, the user is free to switch from the included MD simulation package to LAMMPS. For details on how to run the transient Couette problem, see the file README in the folder coupling/tests, Remark on MaMiCo V1.1. Summary of revisions: Open boundary forcing; Multi-instance MD sampling; support for transient molecular-continuum systems Restrictions: Currently, only single-centered systems are supported. For access to the LAMMPS-based implementation of DPD boundary forcing, please contact Xin Bian, xin.bian@tum.de. Additional comments: Please see file license_mamico.txt for further details regarding distribution and advertising of this software.
NASA Astrophysics Data System (ADS)
2012-01-01
WE RECOMMEND: Air swimmers - Helium balloon swims like a fish; Their Arrows will Darken the Sun: The Evolution and Science of Ballistics - Ballistics book hits the spot; Physics Experiments for your Bag - Handy experiments for your lessons; Quantum Physics for Poets - Book shows the economic importance of physics; SEP colour wheel kit - Wheels investigate colour theory; SEP colour mixing kit - Cheap colour mixing kit uses red, green and blue LEDs; iHandy Level - iPhone app superbly measures angles; Photonics Explorer kit - Free optics kit given to schools. WORTH A LOOK: DrDAQ - DrDAQ software gets an upgrade. WEB WATCH: Websites show range of physics.
Cosmology of Universe Particles and Beyond
NASA Astrophysics Data System (ADS)
Xu, Wei
2016-06-01
For the first time in history, all properties of cosmology particles are uncovered and described concisely and systematically, known as the elementary particles in contemporary physics. Aligning with the synthesis of the virtual and physical worlds in a hierarchical taxonomy of the universe, this theory refines the topology framework of cosmology, and presents a new perspective of the Yin Yang natural laws that, through the processes of creation and reproduction, the fundamental elements generate an infinite series of circular objects and a Yin Yang duality of dynamic fields that are sequenced and transformed states of matter between the virtual and physical worlds. Once virtual objects are transformed, they embody various enclaves of energy states, known as dark energy, quarks, leptons, bosons, protons, and neutrons, characterized by their incentive oscillations of timestate variables in a duality of virtual realities: energy and time, spin and charge, mass and space, symmetry and antisymmetry. As a consequence, it derives the fully-scaled quantum properties of physical particles in accordance with numerous historical experiments, and has overcome the limitations of uncertainty principle and the Standard Model, towards concisely exploring physical nature and beyond...
Precision Crystal Calorimeters in High Energy Physics
Ren-Yuan Zhu
2017-12-09
Precision crystal calorimeters have traditionally played an important role in high energy physics experiments. In the last two decades, they have faced a challenge to maintain their precision in a hostile radiation environment. This paper reviews the performance of crystal calorimeters constructed for high energy physics experiments and the progress achieved in understanding crystals' radiation damage as well as in developing high quality scintillating crystals for particle physics. Potential applications of a new generation of scintillating crystals of high density and high light yield, such as LSO and LYSO, in particle physics experiments are also discussed.
The Virtual Environment for Reactor Applications (VERA): Design and architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, John A.; Clarno, Kevin; Sieger, Matt
2016-09-08
VERA, the Virtual Environment for Reactor Applications, is the system of physics capabilities being developed and deployed by the Consortium for Advanced Simulation of Light Water Reactors (CASL), the first DOE Hub, which was established in July 2010 for the modeling and simulation of commercial nuclear reactors. VERA consists of integrating and interfacing software together with a suite of physics components adapted and/or refactored to simulate relevant physical phenomena in a coupled manner. VERA also includes the software development environment and computational infrastructure needed for these components to be effectively used. We describe the architecture of VERA from both a software and a numerical perspective, along with the goals and constraints that drove the major design decisions and their implications. As a result, we explain why VERA is an environment rather than a framework or toolkit, why these distinctions are relevant (particularly for coupled physics applications), and provide an overview of results that demonstrate the application of VERA tools for a variety of challenging problems within the nuclear industry.
A PICKSC Science Gateway for enabling the common plasma physicist to run kinetic software
NASA Astrophysics Data System (ADS)
Hu, Q.; Winjum, B. J.; Zonca, A.; Youn, C.; Tsung, F. S.; Mori, W. B.
2017-10-01
Computer simulations offer tremendous opportunities for studying plasmas, ranging from simulations for students that illuminate fundamental educational concepts to research-level simulations that advance scientific knowledge. Nevertheless, there is a significant hurdle to using simulation tools. Users must navigate codes and software libraries, determine how to wrangle output into meaningful plots, and oftentimes confront a significant cyberinfrastructure with powerful computational resources. Science gateways offer a Web-based environment to run simulations without needing to learn or manage the underlying software and computing cyberinfrastructure. We discuss our progress on creating a Science Gateway for the Particle-in-Cell and Kinetic Simulation Software Center that enables users to easily run and analyze kinetic simulations with our software. We envision that this technology could benefit a wide range of plasma physicists, both in the use of our simulation tools as well as in its adaptation for running other plasma simulation software. Supported by NSF under Grant ACI-1339893 and by the UCLA Institute for Digital Research and Education.
2017 Topical Workshop on Electronics for Particle Physics
NASA Astrophysics Data System (ADS)
2017-09-01
The workshop will cover all aspects of electronics for particle physics experiments, and accelerator instrumentation of general interest to users. LHC experiments (and their operational experience) will remain a focus of the meeting but a strong emphasis on R&D for future experimentation will be maintained, such as SLHC, CLIC, ILC, neutrino facilities as well as other particle and astroparticle physics experiments. The purpose of the workshop is: To present results and original concepts for electronic research and development relevant to experiments as well as accelerator and beam instrumentation at future facilities; To review the status of electronics for the LHC experiments; To identify and encourage common efforts for the development of electronics; To promote information exchange and collaboration in the relevant engineering and physics communities.
Discrete Element Modeling (DEM) of Triboelectrically Charged Particles: Revised Experiments
NASA Technical Reports Server (NTRS)
Hogue, Michael D.; Calle, Carlos I.; Curry, D. R.; Weitzman, P. S.
2008-01-01
In a previous work, the addition of basic screened Coulombic electrostatic forces to existing commercial discrete element modeling (DEM) software was reported. Triboelectric experiments were performed in which glass spheres were charged by rolling on inclined planes of various materials. Charge generation constants and the Q/m ratios for the test materials were calculated from the experimental data and compared to the simulation output of the DEM software. In this paper, we discuss new values of the charge generation constants calculated from improved experimental procedures and data. Planned work to incorporate dielectrophoretic forces, van der Waals forces, and advanced mechanical forces into the software is also discussed.
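For readers unfamiliar with the screened Coulombic interaction mentioned above, the sketch below evaluates a Yukawa-type screened Coulomb pair force between two charged particles; it is an illustration of the force law, not the commercial DEM software's actual interface, and the charges, separations and screening length are placeholders.

# Hedged illustration of a screened Coulomb (Yukawa-type) pair force,
# U(r) = q1*q2*exp(-r/L) / (4*pi*eps0*r), F = -dU/dr along the line of centers.
import numpy as np

EPS0 = 8.8541878128e-12            # vacuum permittivity, F/m

def screened_coulomb_force(r1, r2, q1, q2, screening_length):
    """Force (N) on particle 1 due to particle 2; positions in m, charges in C."""
    d = np.asarray(r1, float) - np.asarray(r2, float)
    r = np.linalg.norm(d)
    mag = (q1 * q2 / (4.0 * np.pi * EPS0 * r**2)
           * np.exp(-r / screening_length) * (1.0 + r / screening_length))
    return mag * d / r             # repulsive for like charges (points from 2 to 1)

# Two small spheres carrying +1 pC and -1 pC, 5 mm apart, 10 mm screening length.
f = screened_coulomb_force([0.0, 0.0, 0.0], [0.005, 0.0, 0.0],
                           q1=1e-12, q2=-1e-12, screening_length=0.01)
print(f, "N")                      # attractive: points from particle 1 toward particle 2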
A new practice-driven approach to develop software in a cyber-physical system environment
NASA Astrophysics Data System (ADS)
Jiang, Yiping; Chen, C. L. Philip; Duan, Junwei
2016-02-01
Cyber-physical systems (CPS) are an emerging area that cannot work efficiently without proper software handling of the data and business logic. Software and middleware are the soul of the CPS. The software development of CPS is a critical issue because of its complexity in a large-scale realistic system. Furthermore, the object-oriented approach (OOA) is often used to develop CPS software, and it needs some improvements according to the characteristics of CPS. To develop software in a CPS environment, a new systematic approach is proposed in this paper. It comes from practice and has evolved within software companies. It consists of (A) requirement analysis in an event-oriented way, (B) architecture design in a data-oriented way, (C) detailed design and coding in an object-oriented way and (D) testing in an event-oriented way. It is a new approach based on OOA; the difference compared with OOA is that the proposed approach has different emphases and measures in every stage. It accords better with the characteristics of event-driven CPS. In CPS software development, one should focus on the events more than the functions or objects. A case study of a smart home system is designed to demonstrate the effectiveness of the approach. The approach is also easy to apply in practice owing to some simplifications. The running result illustrates the validity of this approach.
Unraveling the Mysteries of the Atom.
ERIC Educational Resources Information Center
Lederman, Leon
1982-01-01
The development, role, and current research in particle physics at the Fermi National Accelerator Laboratory are reviewed, including discussions of its mission to understand the structure of matter, a brief history of particle physics, and the nature and applications of superconductivity, among other topics. (JN)
Toxicity Estimation Software Tool (TEST)
The Toxicity Estimation Software Tool (TEST) was developed to allow users to easily estimate the toxicity of chemicals using Quantitative Structure Activity Relationships (QSARs) methodologies. QSARs are mathematical models used to predict measures of toxicity from the physical c...
ERIC Educational Resources Information Center
Miller, Michael J.
1984-01-01
Description of the Macintosh personal, educational, and business computer produced by Apple covers cost; physical characteristics including display devices, circuit boards, and built-in features; company-produced software; third-party produced software; memory and storage capacity; word-processing features; and graphics capabilities. (MBR)
Method and apparatus for separating material
Oder, Robin R.; Jamison, Russell E.
2006-10-24
An apparatus for sorting particles composed of a mixture of particles with differing physical and chemical characteristics. The apparatus includes a comminutor, a mechanism for removing particles from the inside of the comminutor which are intermediate in size between the feed to the comminutor and the product of comminution, a mechanism for either discharging particles taken from the comminutor to a reject stream or providing them to a size classification apparatus such as screening, a mechanism for returning the oversize particles to the comminutor or for discharging them to the reject stream, an electric mechanism for separating particles with an electrical force disposed adjacent to a magnet mechanism, a mechanism for providing the particles to the magnet mechanism and the electric mechanism and for providing triboelectric and capacitive charges to the particles, and a mechanism for returning one of the products of electric and magnetic separation to the comminutor while discharging the other to the reject stream. A method for sorting particles composed of a mixture of particles with differing physical and chemical characteristics.
Heat transfer analysis of a lab scale solar receiver using the discrete ordinates model
NASA Astrophysics Data System (ADS)
Dordevich, Milorad C. W.
This thesis documents the development, implementation and simulation outcomes of the Discrete Ordinates Radiation Model in ANSYS FLUENT simulating the radiative heat transfer occurring in the San Diego State University lab-scale Small Particle Heat Exchange Receiver. In tandem, it also serves to document how well the Discrete Ordinates Radiation Model results compare with those from the in-house developed Monte Carlo Ray Trace Method in a number of simplified geometries. The secondary goal of this study was the inclusion of new physics, specifically buoyancy. Implementation of an additional Monte Carlo Ray Trace Method software package known as VEGAS, which was specifically developed to model lab-scale solar simulators and provide directional, flux and beam spread information for the aperture boundary condition, was also a goal of this study. Upon establishment of the model, test cases were run to understand its predictive capabilities. It was shown that agreement within 15% was obtained against laboratory measurements made in the San Diego State University Combustion and Solar Energy Laboratory, with the metrics of comparison being the thermal efficiency and the outlet, wall and aperture quartz temperatures. Parametric testing additionally showed that the thermal efficiency of the system was very dependent on the mass flow rate and particle loading. It was also shown that the orientation of the small particle heat exchange receiver was important in attaining optimal efficiency because buoyancy-induced effects could not be neglected. The analyses presented in this work were all performed on the lab-scale small particle heat exchange receiver, which is 0.38 m in diameter by 0.51 m tall and operated with an input irradiation flux of 3 kWth, a nominal mass flow rate of 2 g/s and a suspended particle mass loading of 2 g/m3. Finally, based on insight gained during the implementation and development of the model, a new and improved design was simulated to predict how the efficiency of the small particle heat exchange receiver could be improved through a few simple internal geometry design modifications. It was shown that the theoretically calculated efficiency of the small particle heat exchange receiver could be improved from 64% to 87% with adjustments to the internal geometry, mass flow rate, and mass loading.
NASA Technical Reports Server (NTRS)
Lee, David; Ge, Yi; Cha, Soyoung Stephen; Ramachandran, Narayanan; Rose, M. Franklin (Technical Monitor)
2001-01-01
Measurement of three-dimensional (3-D), three-component velocity fields is of great importance in both ground and space experiments for understanding materials processing and fluid physics. The experiments in these fields typically preclude the application of conventional planar probes for observing 3-D phenomena. Here, we present the investigation results of stereoscopic tracking velocimetry (STV) for measuring 3-D velocity fields, which include diagnostic technology development, experimental velocity measurement, and comparison with analytical and numerical computation. STV is advantageous in its system simplicity for building compact hardware and in its software efficiency for continual near-real-time monitoring. It has great freedom in illuminating and observing volumetric fields from arbitrary directions. STV is based on stereoscopic observation by CCD sensors of particles seeded in a flow. In this approach, some of the individual particle images that provide data points are likely to be lost or to cause errors when the images overlap and crisscross each other, especially at high particle density. In order to maximize the valid recovery of data points, neural networks are implemented for these two important processes. For the step of particle overlap decomposition, the back propagation neural network is utilized because of its ability in pattern recognition with pertinent particle image feature parameters. For the step of particle tracking, the Hopfield neural network is employed to find appropriate particle tracks based on global optimization. Our investigation indicates that the neural networks are very efficient and useful for stereoscopically tracking particles. As an initial assessment of the diagnostic technology's performance, laminar water jets with and without pulsation are measured. The jet tip velocity profiles are in good agreement with analytical predictions. Finally, for testing in materials processing applications, a simple directional solidification apparatus is built for experimenting with the metal analog succinonitrile. Its 3-D velocity field in the liquid phase is then measured and compared with results from numerical computation. Our theoretical, numerical, and experimental investigations have proven STV to be a viable candidate for reliably measuring 3-D flow velocities. With current activities focused on further improving the processing efficiency, overall accuracy, and automation, the eventual efforts of broad experimental application and concurrent numerical modeling validation will be vital to many areas in fluid flow and materials processing.