Sample records for source physics tool

  1. Using Tracker as a Pedagogical Tool for Understanding Projectile Motion

    ERIC Educational Resources Information Center

    Wee, Loo Kang; Chew, Charles; Goh, Giam Hwee; Tan, Samuel; Lee, Tat Leong

    2012-01-01

    This article reports on the use of Tracker as a pedagogical tool in the effective learning and teaching of projectile motion in physics. When a computer model building learning process is supported and driven by video analysis data, this free Open Source Physics tool can provide opportunities for students to engage in active enquiry-based…

  2. Physics of cosmological cascades and observable properties

    NASA Astrophysics Data System (ADS)

    Fitoussi, T.; Belmont, R.; Malzac, J.; Marcowith, A.; Cohen-Tanugi, J.; Jean, P.

    2017-04-01

    TeV photons from extragalactic sources are absorbed in the intergalactic medium and initiate electromagnetic cascades. These cascades offer a unique tool to probe the properties of the universe at cosmological scales. We present a new Monte Carlo code dedicated to the physics of such cascades. This code has been tested against both published results and analytical approximations, and is made publicly available. Using this numerical tool, we investigate the main cascade properties (spectrum, halo extension and time delays), and study in detail their dependence on the physical parameters (extragalactic magnetic field, extragalactic background light, source redshift, source spectrum and beaming emission). The limitations of analytical solutions are emphasized. In particular, analytical approximations account only for the first generation of photons and higher branches of the cascade tree are neglected.
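
    As a toy illustration of the generation bookkeeping described above (this is not the paper's Monte Carlo code; the threshold and mean energies are order-of-magnitude placeholders), the sketch below follows one injected photon through repeated pair production and inverse-Compton re-emission and tallies escaping photons per cascade generation. A first-generation-only analytical treatment would miss every entry beyond generation 1:

    ```python
    import collections

    ME_C2 = 0.511e6   # electron rest energy [eV]
    E_CMB = 6.3e-4    # mean CMB photon energy [eV]
    E_ABS = 1.0e12    # toy pair-production threshold on the EBL [eV]

    def cascade(e_gamma, gen, tally):
        """Follow one photon; if absorbed, the e+e- pair re-emits its
        energy as inverse-Compton photons of the next generation."""
        if e_gamma < E_ABS or gen >= 6:
            tally[gen] += 1                                # photon escapes
            return
        for e_lep in (e_gamma / 2.0, e_gamma / 2.0):       # pair production
            lorentz = e_lep / ME_C2
            e_ic = (4.0 / 3.0) * lorentz**2 * E_CMB        # mean IC photon energy
            for _ in range(max(1, int(e_lep / e_ic))):     # energy conservation
                cascade(e_ic, gen + 1, tally)

    tally = collections.defaultdict(int)
    cascade(100e12, 0, tally)                              # inject a 100 TeV photon
    for gen in sorted(tally):
        print(f"generation {gen}: {tally[gen]} escaping photons")
    ```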

  3. Analyzing Virtual Physics Simulations with Tracker

    ERIC Educational Resources Information Center

    Claessens, Tom

    2017-01-01

    In the physics teaching community, Tracker is well known as a user-friendly, open-source video analysis program, authored by Douglas Brown. With this tool, the user can trace markers indicated on a video or on stroboscopic photos and perform kinematic analyses. Tracker also includes a data modeling tool that allows one to fit some theoretical…

  4. SESAME: a software tool for the numerical dosimetric reconstruction of radiological accidents involving external sources and its application to the accident in Chile in December 2005.

    PubMed

    Huet, C; Lemosquet, A; Clairand, I; Rioual, J B; Franck, D; de Carlan, L; Aubineau-Lanièce, I; Bottollier-Depois, J F

    2009-01-01

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. This dose distribution can be assessed by physical dosimetric reconstruction methods. Physical dosimetric reconstruction can be achieved using experimental or numerical techniques. This article presents SESAME (Simulation of External Source Accident with MEdical images), a laboratory-developed tool specific to the dosimetric reconstruction of radiological accidents through numerical simulations that combine voxel geometry with the MCNP(X) Monte Carlo radiation-matter interaction code. The experimental validation of the tool using a photon field and its application to a radiological accident in Chile in December 2005 are also described.

  5. The AAPT/ComPADRE Digital Library: Supporting Physics Education at All Levels

    NASA Astrophysics Data System (ADS)

    Mason, Bruce

    For more than a decade, the AAPT/ComPADRE Digital Library has been providing online resources, tools, and services that support broad communities of physics faculty and physics education researchers. This online library provides vetted resources for teachers and students, an environment for authors and developers to share their work, and collaboration tools for a diverse set of users. This talk will focus on the recent collaborations and developments being hosted on or developed with ComPADRE. Examples include PhysPort, which makes the tools and resources developed by physics education researchers more accessible; the Open Source Physics project, which expands the use of numerical modeling at all levels of physics education; and PICUP, a community for those promoting computation in the physics curriculum. NSF-0435336, 0532798, 0840768, 0937836.

  6. Applications of Nuclear and Particle Physics Technology: Particles & Detection — A Brief Overview

    NASA Astrophysics Data System (ADS)

    Weisenberger, Andrew G.

    A brief overview of the technology applications with significant societal benefit that have their origins in nuclear and particle physics research is presented. It is shown through representative examples that applications of nuclear physics can be classified into two basic areas: 1) applying the results of experimental nuclear physics and 2) applying the tools of experimental nuclear physics. Examples of the application of the tools of experimental nuclear and particle physics research are provided in the fields of accelerator- and detector-based technologies, namely synchrotron light sources, nuclear medicine, ion implantation, and radiation therapy.

  7. The Developmental Inventory of Sources of Stress (DISS).

    ERIC Educational Resources Information Center

    Higbee, Jeanne L.; Dwinell, Patricia L.

    1992-01-01

    Describes the Developmental Inventory of Sources of Stress, an instructional tool to assist counselors, advisors, and faculty working with high-risk first-year students. The DISS helps students understand the sources of stress they can control. Describes the DISS's Time Management, Physical Lifestyle, Chemical Stressors, Academic and Interactive…

  8. PlasmaPy: beginning a community developed Python package for plasma physics

    NASA Astrophysics Data System (ADS)

    Murphy, Nicholas A.; Huang, Yi-Min; PlasmaPy Collaboration

    2016-10-01

    In recent years, researchers in several disciplines have collaborated on community-developed open source Python packages such as Astropy, SunPy, and SpacePy. These packages provide core functionality, common frameworks for data analysis and visualization, and educational tools. We propose that our community begin the development of PlasmaPy: a new open source core Python package for plasma physics. PlasmaPy could include commonly used functions in plasma physics, easy-to-use plasma simulation codes, Grad-Shafranov solvers, eigenmode solvers, and tools to analyze both simulations and experiments. The development will include modern programming practices such as version control, embedding documentation in the code, unit tests, and avoiding premature optimization. We will describe early code development on PlasmaPy, and discuss plans moving forward. The success of PlasmaPy depends on active community involvement and a welcoming and inclusive environment, so anyone interested in joining this collaboration should contact the authors.
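
    The abstract does not fix PlasmaPy's eventual API, so the sketch below computes two of the "commonly used functions in plasma physics" it mentions (electron plasma frequency and Debye length) directly from SI constants; the function names and example parameters are illustrative only:

    ```python
    import numpy as np
    from scipy import constants as c

    def plasma_frequency(n_e):
        """Electron plasma frequency [rad/s] for electron density n_e [m^-3]."""
        return np.sqrt(n_e * c.e**2 / (c.epsilon_0 * c.m_e))

    def debye_length(t_e_eV, n_e):
        """Electron Debye length [m] for temperature t_e_eV [eV], density n_e [m^-3]."""
        return np.sqrt(c.epsilon_0 * t_e_eV * c.e / (n_e * c.e**2))

    # Illustrative plasma parameters: n_e = 1e19 m^-3, T_e = 10 eV
    print(f"w_pe     = {plasma_frequency(1e19):.3e} rad/s")
    print(f"lambda_D = {debye_length(10, 1e19):.3e} m")
    ```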

  9. Teaching Waves with Google Earth

    ERIC Educational Resources Information Center

    Logiurato, Fabrizio

    2012-01-01

    Google Earth is a huge source of interesting illustrations of various natural phenomena. It can represent a valuable tool for science education, not only for teaching geography and geology, but also physics. Here we suggest that Google Earth can be used for introducing in an attractive way the physics of waves. (Contains 9 figures.)

  10. Optimizing laser beam profiles using micro-lens arrays for efficient material processing: applications to solar cells

    NASA Astrophysics Data System (ADS)

    Hauschild, Dirk; Homburg, Oliver; Mitra, Thomas; Ivanenko, Mikhail; Jarczynski, Manfred; Meinschien, Jens; Bayer, Andreas; Lissotschenko, Vitalij

    2009-02-01

    High power laser sources are used in various production tools for microelectronic products and solar cells, including annealing, lithography, edge isolation, dicing, and patterning. Besides the right choice of the laser source, suitable high-performance optics for generating the appropriate beam profile and intensity distribution are of great importance for the processing speed, quality, and yield. Equally important for industrial applications is an adequate understanding of the physics of the light-matter interaction behind the process. Simulating tool performance in advance can minimize technical and financial risk as well as lead times for prototyping and introduction into series production. LIMO has developed its own software, founded on the Maxwell equations, that takes into account all important physical aspects of the laser-based process: the light source, the beam-shaping optical system, and the light-matter interaction. Based on this knowledge, together with a unique free-form micro-lens array production technology and patented micro-optics beam-shaping designs, a number of novel solar cell production tool sub-systems have been built. The basic functionalities, design principles, and performance results are presented with a special emphasis on resilience, cost reduction, and process reliability.

  11. Premier Tools of Energy Research Also Probe Secrets of Viral Disease

    DOE R&D Accomplishments Database

    Chui, Glennda

    2011-03-28

    Advanced light sources peer into matter at the atomic and molecular scales, with applications ranging from physics, chemistry, materials science, and advanced energy research, to biology and medicine.

  12. Experiments Using Cell Phones in Physics Classroom Education: The Computer-Aided "g" Determination

    ERIC Educational Resources Information Center

    Vogt, Patrik; Kuhn, Jochen; Muller, Sebastian

    2011-01-01

    This paper continues the collection of experiments that describe the use of cell phones as experimental tools in physics classroom education. We describe a computer-aided determination of the free-fall acceleration "g" using the acoustical Doppler effect. The Doppler shift is a function of the speed of the source. Since a free-falling object's…
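
    A minimal sketch of the analysis the abstract describes, using synthetic data in place of the cell-phone recording (the buzzer frequency and noise level are made up): for a source receding at v = g t, the observed frequency is f = f0 c/(c + g t), so inverting for v(t) and fitting a straight line gives g:

    ```python
    import numpy as np

    C_SOUND = 343.0   # speed of sound [m/s]
    F0 = 4000.0       # emitted frequency of the falling buzzer [Hz]
    G_TRUE = 9.81

    # Synthetic measurements: source receding from the microphone at v = g*t
    t = np.linspace(0.05, 0.6, 12)
    f_obs = F0 * C_SOUND / (C_SOUND + G_TRUE * t)
    f_obs += np.random.default_rng(1).normal(0.0, 1.0, t.size)  # ~1 Hz noise

    v = C_SOUND * (F0 / f_obs - 1.0)   # invert the Doppler formula for source speed
    g_fit = np.polyfit(t, v, 1)[0]     # slope of v(t) is the free-fall acceleration
    print(f"estimated g = {g_fit:.2f} m/s^2")
    ```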

  13. Detailed Modeling of Physical Processes in Electron Sources for Accelerator Applications

    NASA Astrophysics Data System (ADS)

    Chubenko, Oksana; Afanasev, Andrei

    2017-01-01

    At present, electron sources are essential in a wide range of applications - from common technical use to exploring the nature of matter. Depending on the application requirements, different methods and materials are used to generate electrons. State-of-the-art accelerator applications set a number of often-conflicting requirements for electron sources (e.g., quantum efficiency vs. polarization, current density vs. lifetime, etc.). Development of advanced electron sources includes modeling and design of cathodes, material growth, fabrication of cathodes, and cathode testing. The detailed simulation and modeling of physical processes is required in order to shed light on the exact mechanisms of electron emission and to develop new-generation electron sources with optimized efficiency. The purpose of the present work is to study physical processes in advanced electron sources and develop scientific tools, which could be used to predict electron emission from novel nano-structured materials. In particular, the area of interest includes bulk/superlattice gallium arsenide (bulk/SL GaAs) photo-emitters and nitrogen-incorporated ultrananocrystalline diamond ((N)UNCD) photo/field-emitters. Work supported by The George Washington University and Euclid TechLabs LLC.

  14. Acoustic holography as a metrological tool for characterizing medical ultrasound sources and fields

    PubMed Central

    Sapozhnikov, Oleg A.; Tsysar, Sergey A.; Khokhlova, Vera A.; Kreider, Wayne

    2015-01-01

    Acoustic holography is a powerful technique for characterizing ultrasound sources and the fields they radiate, with the ability to quantify source vibrations and reduce the number of required measurements. These capabilities are increasingly appealing for meeting measurement standards in medical ultrasound; however, associated uncertainties have not been investigated systematically. Here errors associated with holographic representations of a linear, continuous-wave ultrasound field are studied. To facilitate the analysis, error metrics are defined explicitly, and a detailed description of a holography formulation based on the Rayleigh integral is provided. Errors are evaluated both for simulations of a typical therapeutic ultrasound source and for physical experiments with three different ultrasound sources. Simulated experiments explore sampling errors introduced by the use of a finite number of measurements, geometric uncertainties in the actual positions of acquired measurements, and uncertainties in the properties of the propagation medium. Results demonstrate the theoretical feasibility of keeping errors less than about 1%. Typical errors in physical experiments were somewhat larger, on the order of a few percent; comparison with simulations provides specific guidelines for improving the experimental implementation to reduce these errors. Overall, results suggest that holography can be implemented successfully as a metrological tool with small, quantifiable errors. PMID:26428789
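
    The holography formulation in this record is built on the Rayleigh integral. As a self-contained sketch (not the authors' code; the transducer parameters are illustrative), the snippet below evaluates a discretized Rayleigh integral for a baffled circular piston and checks it against the known on-axis analytic solution, echoing the paper's theme of quantifying numerical errors:

    ```python
    import numpy as np

    rho, c, f = 1000.0, 1500.0, 1.0e6   # water density, sound speed, 1 MHz drive
    k = 2 * np.pi * f / c
    a, v0 = 5e-3, 1.0                   # piston radius [m], surface velocity [m/s]

    # Discretize the piston face into small square elements
    n = 200
    xs = np.linspace(-a, a, n)
    dx = xs[1] - xs[0]
    X, Y = np.meshgrid(xs, xs)
    mask = X**2 + Y**2 <= a**2

    def p_rayleigh(z):
        """Numerical Rayleigh integral for the on-axis pressure at distance z."""
        R = np.sqrt(X[mask]**2 + Y[mask]**2 + z**2)
        return 1j * rho * c * k / (2 * np.pi) * v0 * np.sum(np.exp(1j * k * R) / R) * dx * dx

    def p_exact(z):
        """Magnitude of the exact on-axis solution for a baffled piston."""
        return 2 * rho * c * v0 * abs(np.sin(0.5 * k * (np.sqrt(z**2 + a**2) - z)))

    for z in (0.02, 0.05, 0.1):
        print(f"z={z} m: numeric {abs(p_rayleigh(z)):.4e} Pa, exact {p_exact(z):.4e} Pa")
    ```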

  15. Moose: An Open-Source Framework to Enable Rapid Development of Collaborative, Multi-Scale, Multi-Physics Simulation Tools

    NASA Astrophysics Data System (ADS)

    Slaughter, A. E.; Permann, C.; Peterson, J. W.; Gaston, D.; Andrs, D.; Miller, J.

    2014-12-01

    The Idaho National Laboratory (INL)-developed Multiphysics Object Oriented Simulation Environment (MOOSE; www.mooseframework.org), is an open-source, parallel computational framework for enabling the solution of complex, fully implicit multiphysics systems. MOOSE provides a set of computational tools that scientists and engineers can use to create sophisticated multiphysics simulations. Applications built using MOOSE have computed solutions for chemical reaction and transport equations, computational fluid dynamics, solid mechanics, heat conduction, mesoscale materials modeling, geomechanics, and others. To facilitate the coupling of diverse and highly-coupled physical systems, MOOSE employs the Jacobian-free Newton-Krylov (JFNK) method when solving the coupled nonlinear systems of equations arising in multiphysics applications. The MOOSE framework is written in C++, and leverages other high-quality, open-source scientific software packages such as LibMesh, Hypre, and PETSc. MOOSE uses a "hybrid parallel" model which combines both shared memory (thread-based) and distributed memory (MPI-based) parallelism to ensure efficient resource utilization on a wide range of computational hardware. MOOSE-based applications are inherently modular, which allows for simulation expansion (via coupling of additional physics modules) and the creation of multi-scale simulations. Any application developed with MOOSE supports running (in parallel) any other MOOSE-based application. Each application can be developed independently, yet easily communicate with other applications (e.g., conductivity in a slope-scale model could be a constant input, or a complete phase-field micro-structure simulation) without additional code being written. This method of development has proven effective at INL and expedites the development of sophisticated, sustainable, and collaborative simulation tools.
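
    MOOSE itself is a C++ framework, but the JFNK approach it relies on can be sketched compactly with SciPy's Jacobian-free Newton-Krylov solver. The toy pair of coupled nonlinear equations and the discretization below are invented for illustration, not taken from MOOSE:

    ```python
    import numpy as np
    from scipy.optimize import newton_krylov

    N = 50
    h = 1.0 / (N - 1)

    def residual(x):
        """Residual of two coupled nonlinear diffusion-reaction equations:
           -u'' + u*v = 1 and -v'' + u = 0, with zero Dirichlet boundaries."""
        u, v = x[:N], x[N:]
        r_u, r_v = np.zeros(N), np.zeros(N)
        r_u[1:-1] = -(u[2:] - 2*u[1:-1] + u[:-2]) / h**2 + u[1:-1]*v[1:-1] - 1.0
        r_v[1:-1] = -(v[2:] - 2*v[1:-1] + v[:-2]) / h**2 + u[1:-1]
        r_u[0], r_u[-1] = u[0], u[-1]   # boundary conditions
        r_v[0], r_v[-1] = v[0], v[-1]
        return np.concatenate([r_u, r_v])

    # Solve the fully coupled system without ever forming the Jacobian
    sol = newton_krylov(residual, np.zeros(2 * N), verbose=False)
    print("max |residual| =", np.abs(residual(sol)).max())
    ```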

  16. Computation in Classical Mechanics with Easy Java Simulations (EJS)

    NASA Astrophysics Data System (ADS)

    Cox, Anne J.

    2006-12-01

    Let your students enjoy creating animations and incorporating some computational physics into your Classical Mechanics course. This talk will demonstrate the use of an Open Source Physics package, Easy Java Simulations (EJS), in an already existing sophomore/junior level Classical Mechanics course. EJS allows for incremental introduction of computational physics into existing courses because it is easy to use (for instructors and students alike) and it is open source. Students can use this tool for numerical solutions to problems (as they can with commercial systems: Mathcad and Mathematica), but they can also generate their own animations. For example, students in Classical Mechanics use Lagrangian mechanics to solve a problem, and then use EJS not only to numerically solve the differential equations, but to show the associated motion (and check their answers). EJS, developed by Francisco Esquembre (http://fem.um.es/Ejs/), is built on the Open Source Physics framework (http://www.opensourcephysics.org/) supported through NSF DUE0442581.
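
    EJS generates Java, but the workflow the talk describes (derive the equation of motion from the Lagrangian, integrate it numerically, and check the answer) can be sketched in Python; the pendulum example and tolerances below are illustrative:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    g, L = 9.81, 1.0

    def rhs(t, y):
        """Pendulum equation from the Euler-Lagrange equation:
           theta'' = -(g/L) * sin(theta)."""
        theta, omega = y
        return [omega, -(g / L) * np.sin(theta)]

    sol = solve_ivp(rhs, (0, 10), [np.radians(60), 0.0], rtol=1e-9, atol=1e-9)

    # Check the answer: E = (1/2) L^2 w^2 - g L cos(theta) should stay constant
    theta, omega = sol.y
    E = 0.5 * L**2 * omega**2 - g * L * np.cos(theta)
    print(f"relative energy drift: {abs(E[-1] - E[0]) / abs(E[0]):.2e}")
    ```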

  17. MP3C - the Minor Planet Physical Properties Catalogue: a New VO Service For Multi-database Query

    NASA Astrophysics Data System (ADS)

    Tanga, Paolo; Delbo, M.; Gerakis, J.

    2013-10-01

    In the last few years we have witnessed a large growth in the number of asteroids for which physical properties are available. However, these data are dispersed across a multiplicity of catalogs. Extracting data and combining them for further analysis requires custom tools, a situation further complicated by the variety of data sources, some of them standardized (Planetary Data System), others not. With these problems in mind, we created a new Virtual Observatory service named “Minor Planet Physical Properties Catalogue” (abbreviated as MP3C - http://mp3c.oca.eu/). MP3C is not a new database, but rather a portal allowing the user to access selected properties of objects by easy SQL query, even from different sources. At present, such diverse data as orbital parameters, photometric and light-curve parameters, sizes and albedos derived by IRAS, AKARI and WISE, SDSS colors, SMASS taxonomy, family membership, satellite data, and stellar occultation results are included. Other data sources will be added in the near future. The physical-properties output of MP3C can be tuned by the user through query criteria based upon ranges of values of the ingested quantities. The resulting list of objects can be used for interactive plots through standard VO tools such as TOPCAT. Also, their ephemerides and visibilities from given sites can be computed. We are targeting full VO compliance to provide a new standardized service to the community.
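
    A hedged sketch of the kind of query such a VO service supports, via a generic TAP client; the endpoint URL, table, and column names below are hypothetical placeholders, not the actual MP3C schema, so consult http://mp3c.oca.eu/ for the real service description:

    ```python
    import pyvo

    # Hypothetical TAP endpoint and schema, for illustration only
    service = pyvo.dal.TAPService("http://mp3c.oca.eu/tap")
    results = service.search("""
        SELECT TOP 100 name, diameter, albedo
        FROM mp3c.physical_properties
        WHERE albedo BETWEEN 0.05 AND 0.10 AND diameter > 50
    """)
    print(results.to_table())
    ```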

  18. X-ray astronomical spectroscopy

    NASA Technical Reports Server (NTRS)

    Holt, S. S.

    1980-01-01

    The current status of the X-ray spectroscopy of celestial X-ray sources, ranging from nearby stars to distant quasars, is reviewed. Particular emphasis is placed on the role of such spectroscopy as a useful and unique tool in the elucidation of the physical parameters of the sources. The spectroscopic analysis of degenerate and nondegenerate stellar systems, galactic clusters and active galactic nuclei, and supernova remnants is discussed.

  19. Doing accelerator physics using SDDS, UNIX, and EPICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borland, M.; Emery, L.; Sereno, N.

    1995-12-31

    The use of the SDDS (Self-Describing Data Sets) file protocol, together with the UNIX operating system and EPICS (Experimental Physics and Industrial Controls System), has proved powerful during the commissioning of the APS (Advanced Photon Source) accelerator complex. The SDDS file protocol has permitted a tool-oriented approach to developing applications, wherein generic programs are written that function as part of multiple applications. While EPICS-specific tools were written for data collection, automated experiment execution, closed-loop control, and so forth, data processing and display are done with the SDDS Toolkit. Experiments and data reduction are implemented as UNIX shell scripts that coordinate the execution of EPICS-specific tools and SDDS tools. Because of the power and generic nature of the individual tools and of the UNIX shell environment, automated experiments can be prepared and executed rapidly in response to unanticipated needs or new ideas. Examples are given of application of this methodology to beam motion characterization, beam-position-monitor offset measurements, and klystron characterization.
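
    The record describes a tool-oriented pattern: small generic programs glued together by scripts. A rough Python analogue using the pyepics client is sketched below; the process-variable names are hypothetical, and a plain CSV file stands in for the SDDS format:

    ```python
    import csv
    import time
    from epics import caget   # pyepics client; PV names below are hypothetical

    PVS = ["S:BPM1:X", "S:BPM1:Y", "S:BPM2:X", "S:BPM2:Y"]

    # Collect a short time series of beam-position readings and write them to
    # a self-describing-style CSV (standing in for SDDS in this sketch).
    with open("bpm_scan.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time"] + PVS)
        for _ in range(10):
            writer.writerow([time.time()] + [caget(pv) for pv in PVS])
            time.sleep(1.0)
    ```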

  20. Environmental Research At The Advanced Photon Source

    EPA Science Inventory

    Because of the importance of probing molecular-scale chemical and physical structure of environmental samples in their natural and often hydrated state, synchrotron radiation has been a powerful tool for environmental scientists for decades. Thus, the crucial role that a highly ...

  21. GUDM: Automatic Generation of Unified Datasets for Learning and Reasoning in Healthcare.

    PubMed

    Ali, Rahman; Siddiqi, Muhammad Hameed; Idris, Muhammad; Ali, Taqdir; Hussain, Shujaat; Huh, Eui-Nam; Kang, Byeong Ho; Lee, Sungyoung

    2015-07-02

    A wide array of biomedical data are generated and made available to healthcare experts. However, due to the diverse nature of the data, it is difficult to predict outcomes from them. It is therefore necessary to combine these diverse data sources into a single unified dataset. This paper proposes a global unified data model (GUDM) to provide a global unified data structure for all data sources and generate a unified dataset by a "data modeler" tool. The proposed tool implements a user-centric, priority-based approach which can easily resolve the problems of unified data modeling and overlapping attributes across multiple datasets. The tool is illustrated using sample diabetes mellitus data. The diverse data sources used to generate the unified dataset for diabetes mellitus include clinical trial information, a social media interaction dataset, and physical activity data collected using different sensors. To realize the significance of the unified dataset, we adopted a well-known rough set theory based rules creation process to create rules from the unified dataset. The evaluation of the tool on six different sets of locally created diverse datasets shows that the tool, on average, reduces the time effort of the experts and knowledge engineer by 94.1% when creating unified datasets.

  22. GUDM: Automatic Generation of Unified Datasets for Learning and Reasoning in Healthcare

    PubMed Central

    Ali, Rahman; Siddiqi, Muhammad Hameed; Idris, Muhammad; Ali, Taqdir; Hussain, Shujaat; Huh, Eui-Nam; Kang, Byeong Ho; Lee, Sungyoung

    2015-01-01

    A wide array of biomedical data are generated and made available to healthcare experts. However, due to the diverse nature of the data, it is difficult to predict outcomes from them. It is therefore necessary to combine these diverse data sources into a single unified dataset. This paper proposes a global unified data model (GUDM) to provide a global unified data structure for all data sources and generate a unified dataset by a “data modeler” tool. The proposed tool implements a user-centric, priority-based approach which can easily resolve the problems of unified data modeling and overlapping attributes across multiple datasets. The tool is illustrated using sample diabetes mellitus data. The diverse data sources used to generate the unified dataset for diabetes mellitus include clinical trial information, a social media interaction dataset, and physical activity data collected using different sensors. To realize the significance of the unified dataset, we adopted a well-known rough set theory based rules creation process to create rules from the unified dataset. The evaluation of the tool on six different sets of locally created diverse datasets shows that the tool, on average, reduces the time effort of the experts and knowledge engineer by 94.1% when creating unified datasets. PMID:26147731
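
    The priority-based resolution of overlapping attributes described in these two records can be sketched with pandas (toy data and column names, not the GUDM schema): higher-priority sources keep their values, and lower-priority sources only fill the gaps:

    ```python
    import pandas as pd

    # Toy records from three sources sharing a patient key
    clinical = pd.DataFrame({"pid": [1, 2], "glucose": [180, 150], "weight": [82, None]})
    social   = pd.DataFrame({"pid": [1, 2], "weight": [80, 77], "mood": ["low", "ok"]})
    sensor   = pd.DataFrame({"pid": [1, 2], "steps": [3500, 9100], "weight": [81, 76]})

    # User-defined priority for overlapping attributes: clinical > sensor > social
    ordered = [clinical.set_index("pid"), sensor.set_index("pid"), social.set_index("pid")]
    unified = ordered[0]
    for df in ordered[1:]:
        unified = unified.combine_first(df)   # lower-priority values fill gaps only

    print(unified.reset_index())
    ```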

  23. Exploring Cryogenic Focused Ion Beam Milling as a Group III-V Device Fabrication Tool

    DTIC Science & Technology

    2013-09-01

    [Only extraction fragments are available for this record. They indicate that the report concerns focused ion beam (FIB) milling with a gallium (Ga) liquid metal ion source (LMIS), the most widely used ion source in FIB instruments; that EDS spectra were captured at different points across the patterned region of a room-temperature-milled sample; and that the report cites the CRC Handbook of Chemistry and Physics, 92nd ed. (CRC Press, 2011-2012).]

  24. Introducing GHOST: The Geospace/Heliosphere Observation & Simulation Tool-kit

    NASA Astrophysics Data System (ADS)

    Murphy, J. J.; Elkington, S. R.; Schmitt, P.; Wiltberger, M. J.; Baker, D. N.

    2013-12-01

    Simulation models of the heliospheric and geospace environments can provide key insights into the geoeffective potential of solar disturbances such as Coronal Mass Ejections and High Speed Solar Wind Streams. Advanced post-processing of the results of these simulations greatly enhances the utility of these models for scientists and other researchers. Currently, no supported centralized tool exists for performing these processing tasks. With GHOST, we introduce a toolkit for the ParaView visualization environment that provides a centralized suite of tools for space physics post-processing. Building on the work of the Center for Integrated Space Weather Modeling (CISM) Knowledge Transfer group, GHOST is an open-source tool suite for ParaView. The toolkit plugin currently provides readers for LFM and Enlil data sets, and automated tools for data comparison with NASA's CDAWeb database. As work progresses, many additional tools will be added; through open-source collaboration, we hope to add readers for additional model types, as well as any additional tools deemed necessary by the scientific public. The ultimate goal of this work is to provide a complete Sun-to-Earth model analysis toolset.

  25. Infrastructure for Multiphysics Software Integration in High Performance Computing-Aided Science and Engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, Michael T.; Safdari, Masoud; Kress, Jessica E.

    The project described in this report constructed and exercised an innovative multiphysics coupling toolkit called the Illinois Rocstar MultiPhysics Application Coupling Toolkit (IMPACT). IMPACT is an open source, flexible, natively parallel infrastructure for coupling multiple uniphysics simulation codes into multiphysics computational systems. IMPACT works with codes written in several high-performance-computing (HPC) programming languages, and is designed from the beginning for HPC multiphysics code development. It is designed to be minimally invasive to the individual physics codes being integrated, and has few requirements on those physics codes for integration. The goal of IMPACT is to provide the support needed to enable coupling existing tools together in unique and innovative ways to produce powerful new multiphysics technologies without extensive modification and rewrite of the physics packages being integrated. There are three major outcomes from this project: 1) construction, testing, application, and open-source release of the IMPACT infrastructure, 2) production of example open-source multiphysics tools using IMPACT, and 3) identification and engagement of interested organizations in the tools and applications resulting from the project. This last outcome represents the incipient development of a user community and application ecosystem being built using IMPACT. Multiphysics coupling standardization can only come from organizations working together to define needs and processes that span the space of necessary multiphysics outcomes, which Illinois Rocstar plans to continue driving toward. The IMPACT system, including source code, documentation, and test problems, is now available through the public GitHub system to anyone interested in multiphysics code coupling. Many of the basic documents explaining the use and architecture of IMPACT are also attached as appendices to this document, and online HTML documentation is available through the GitHub site. There are over 100 unit tests provided that run through the Illinois Rocstar Application Development (IRAD) lightweight testing infrastructure that is also supplied along with IMPACT. The package as a whole provides an excellent base for developing high-quality multiphysics applications using modern software development practices. To facilitate understanding of how to utilize IMPACT effectively, two multiphysics systems have been developed and are available open source through GitHub. The simpler of the two systems, named ElmerFoamFSI in the repository, is a multiphysics, fluid-structure-interaction (FSI) coupling of the solid mechanics package Elmer with a fluid dynamics module from OpenFOAM. This coupling illustrates how to combine software packages that are unrelated by either author or architecture into a robust, parallel multiphysics system. A more complex multiphysics tool is the Illinois Rocstar Rocstar Multiphysics code that was rebuilt during the project around IMPACT. Rocstar Multiphysics was already an HPC multiphysics tool, but now that it has been rearchitected around IMPACT, it can be readily expanded to capture new and different physics in the future. In fact, during this project, the Elmer and OpenFOAM tools were also coupled into Rocstar Multiphysics and demonstrated. The full Rocstar Multiphysics codebase is also available on GitHub, and licensed for any organization to use as they wish. Finally, the new IMPACT product is already being used in several multiphysics code coupling projects for the Air Force, NASA and the Missile Defense Agency, and initial work on expansion of the IMPACT-enabled Rocstar Multiphysics has begun in support of a commercial company. These initiatives promise to expand the interest and reach of IMPACT and Rocstar Multiphysics, ultimately leading to the envisioned standardization and consortium of users that was one of the goals of this project.

  26. Design Considerations of a Virtual Laboratory for Advanced X-ray Sources

    NASA Astrophysics Data System (ADS)

    Luginsland, J. W.; Frese, M. H.; Frese, S. D.; Watrous, J. J.; Heileman, G. L.

    2004-11-01

    The field of scientific computation has greatly advanced in the last few years, resulting in the ability to perform complex computer simulations that can predict the performance of real-world experiments in a number of fields of study. Among the forces driving this new computational capability is the advent of parallel algorithms, allowing calculations in three-dimensional space with realistic time scales. Electromagnetic radiation sources driven by high-voltage, high-current electron beams offer an area in which to further push the state-of-the-art in high-fidelity, first-principles simulation tools. The physics of these x-ray sources combines kinetic plasma physics (electron beams) with dense fluid-like plasma physics (anode plasmas) and x-ray generation (bremsstrahlung). There are a number of mature techniques and software packages for dealing with the individual aspects of these sources, such as Particle-In-Cell (PIC), Magneto-Hydrodynamics (MHD), and radiation transport codes. The current effort is focused on developing an object-oriented software environment using the Rational Unified Process and the Unified Modeling Language (UML) to provide a framework where multiple 3D parallel physics packages, such as a PIC code (ICEPIC), an MHD code (MACH), and an x-ray transport code (ITS), can co-exist in a system-of-systems approach to modeling advanced x-ray sources. Initial software design and assessments of the fidelity of the various physics algorithms will be presented.

  27. Optics simulations: a Python workshop

    NASA Astrophysics Data System (ADS)

    Ghalila, H.; Ammar, A.; Varadharajan, S.; Majdi, Y.; Zghal, M.; Lahmar, S.; Lakshminarayanan, V.

    2017-08-01

    Numerical simulations allow teachers and students to indirectly perform sophisticated experiments that are not otherwise realizable due to cost and other constraints. During the past few decades there has been an explosion in the development of numerical tools, concurrent with the rise of open-source environments such as Python. This availability of open-source software offers an incredible opportunity for advancing teaching methodologies as well as research. More specifically, it is possible to correlate theoretical knowledge with experimental measurements using "virtual" experiments. We have been working on the development of numerical simulation tools using the Python programming environment, concentrating on geometric and physical optics simulations. The advantage of doing hands-on numerical experiments is that the student learner becomes an active participant in the pedagogical/learning process rather than playing a passive role as in the traditional lecture format. Even in laboratory classes, because of constraints of space, lack of equipment, and often-large numbers of students, many students play a passive role since they work in groups of 3 or more. Furthermore, these new tools help students get a handle on numerical methods as well as simulations, and impart a "feel" for the physics under investigation.
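
    As an example of the kind of physical-optics exercise such a workshop might include (the slit geometry and wavelength are arbitrary choices, not taken from the paper), the following snippet computes a two-slit interference pattern modulated by the single-slit diffraction envelope:

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    lam = 633e-9           # He-Ne laser wavelength [m]
    d, b = 250e-6, 50e-6   # slit separation and slit width [m]
    L = 1.0                # screen distance [m]

    x = np.linspace(-0.02, 0.02, 2000)          # position on the screen [m]
    theta = x / L                               # small-angle approximation
    beta = np.pi * b * np.sin(theta) / lam      # single-slit diffraction term
    delta = np.pi * d * np.sin(theta) / lam     # two-slit interference term
    I = np.sinc(beta / np.pi)**2 * np.cos(delta)**2   # normalized intensity

    plt.plot(x * 1e3, I)
    plt.xlabel("screen position [mm]")
    plt.ylabel("relative intensity")
    plt.title("Two-slit pattern with single-slit envelope")
    plt.show()
    ```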

  28. PHLUX: Photographic Flux Tools for Solar Glare and Flux

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2010-12-02

    A web-based tool to a) analytically and empirically quantify glare from reflected light and determine the potential impact (e.g., temporary flash blindness, retinal burn), and b) produce flux maps for central receivers. The tool accepts RAW digital photographs of the glare source (for hazard assessment) or the receiver (for flux mapping), as well as a photograph of the sun for intensity and size scaling. For glare hazard assessment, the tool determines the retinal irradiance (W/cm2) and subtended source angle for an observer and plots the glare source on a hazard spectrum (i.e., low potential for flash blindness impact, potential for flash blindness impact, retinal burn). For flux mapping, the tool provides a colored map of the receiver scaled by incident solar flux (W/m2) and unwraps the physical dimensions of the receiver while accounting for the perspective of the photographer (e.g., for a flux map of a cylindrical receiver, the horizontal axis denotes receiver angle in degrees and the vertical axis denotes vertical position in meters; for a flat panel receiver, the horizontal axis denotes horizontal position in meters and the vertical axis denotes vertical position in meters). The flux mapping capability also allows the user to specify transects along which the program plots incident solar flux on the receiver.
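
    The retinal-irradiance and subtended-angle quantities mentioned above can be estimated from textbook ocular parameters. The sketch below is a simplified stand-in, not the PHLUX implementation, and all constants are typical illustrative values:

    ```python
    # Simplified ocular-hazard estimate; constants are textbook-style values,
    # not PHLUX internals.
    F_EYE = 0.017      # effective focal length of the eye [m]
    D_PUPIL = 0.002    # daylight-adapted pupil diameter [m]
    TAU = 0.5          # assumed transmittance of the ocular media

    def retinal_irradiance(e_cornea, d_source, r):
        """Retinal irradiance [W/cm^2] from corneal irradiance e_cornea [W/cm^2],
        glare-source diameter d_source [m], and observer distance r [m]."""
        omega = d_source / r                  # subtended source angle [rad]
        d_image = F_EYE * omega               # retinal image diameter [m]
        area_ratio = (D_PUPIL / d_image)**2   # pupil area over image area
        return e_cornea * area_ratio * TAU, omega

    e_r, omega = retinal_irradiance(e_cornea=1e-3, d_source=2.0, r=500.0)
    print(f"subtended angle = {omega*1e3:.1f} mrad, "
          f"retinal irradiance = {e_r:.3e} W/cm^2")
    ```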

  29. Enabling consistency in pluripotent stem cell-derived products for research and development and clinical applications through material standards.

    PubMed

    French, Anna; Bravery, Christopher; Smith, James; Chandra, Amit; Archibald, Peter; Gold, Joseph D; Artzi, Natalie; Kim, Hae-Won; Barker, Richard W; Meissner, Alexander; Wu, Joseph C; Knowles, Jonathan C; Williams, David; García-Cardeña, Guillermo; Sipp, Doug; Oh, Steve; Loring, Jeanne F; Rao, Mahendra S; Reeve, Brock; Wall, Ivan; Carr, Andrew J; Bure, Kim; Stacey, Glyn; Karp, Jeffrey M; Snyder, Evan Y; Brindley, David A

    2015-03-01

    There is a need for physical standards (reference materials) to ensure both reproducibility and consistency in the production of somatic cell types from human pluripotent stem cell (hPSC) sources. We have outlined the need for reference materials (RMs) in relation to the unique properties and concerns surrounding hPSC-derived products and suggest in-house approaches to RM generation relevant to basic research, drug screening, and therapeutic applications. hPSCs have an unparalleled potential as a source of somatic cells for drug screening, disease modeling, and therapeutic application. Undefined variation and product variability after differentiation to the lineage or cell type of interest impede efficient translation and can obscure the evaluation of clinical safety and efficacy. Moreover, in the absence of a consistent population, data generated from in vitro studies could be unreliable and irreproducible. Efforts to devise approaches and tools that facilitate improved consistency of hPSC-derived products, both as development tools and therapeutic products, will aid translation. Standards exist in both written and physical form; however, because many unknown factors persist in the field, premature written standards could inhibit rather than promote innovation and translation. We focused on the derivation of physical standard RMs. We outline the need for RMs and assess the approaches to in-house RM generation for hPSC-derived products, a critical tool for the analysis and control of product variation that can be applied by researchers and developers. We then explore potential routes for the generation of RMs, including both cellular and noncellular materials and novel methods that might provide valuable tools to measure and account for variation. Multiparametric techniques to identify "signatures" for therapeutically relevant cell types, such as neurons and cardiomyocytes that can be derived from hPSCs, would be of significant utility, although physical RMs will be required for clinical purposes. ©AlphaMed Press.

  30. Enabling Consistency in Pluripotent Stem Cell-Derived Products for Research and Development and Clinical Applications Through Material Standards

    PubMed Central

    French, Anna; Bravery, Christopher; Smith, James; Chandra, Amit; Archibald, Peter; Gold, Joseph D.; Artzi, Natalie; Kim, Hae-Won; Barker, Richard W.; Meissner, Alexander; Wu, Joseph C.; Knowles, Jonathan C.; Williams, David; García-Cardeña, Guillermo; Sipp, Doug; Oh, Steve; Loring, Jeanne F.; Rao, Mahendra S.; Reeve, Brock; Wall, Ivan; Carr, Andrew J.; Bure, Kim; Stacey, Glyn; Karp, Jeffrey M.; Snyder, Evan Y.; Brindley, David A.

    2015-01-01

    Summary There is a need for physical standards (reference materials) to ensure both reproducibility and consistency in the production of somatic cell types from human pluripotent stem cell (hPSC) sources. We have outlined the need for reference materials (RMs) in relation to the unique properties and concerns surrounding hPSC-derived products and suggest in-house approaches to RM generation relevant to basic research, drug screening, and therapeutic applications. hPSCs have an unparalleled potential as a source of somatic cells for drug screening, disease modeling, and therapeutic application. Undefined variation and product variability after differentiation to the lineage or cell type of interest impede efficient translation and can obscure the evaluation of clinical safety and efficacy. Moreover, in the absence of a consistent population, data generated from in vitro studies could be unreliable and irreproducible. Efforts to devise approaches and tools that facilitate improved consistency of hPSC-derived products, both as development tools and therapeutic products, will aid translation. Standards exist in both written and physical form; however, because many unknown factors persist in the field, premature written standards could inhibit rather than promote innovation and translation. We focused on the derivation of physical standard RMs. We outline the need for RMs and assess the approaches to in-house RM generation for hPSC-derived products, a critical tool for the analysis and control of product variation that can be applied by researchers and developers. We then explore potential routes for the generation of RMs, including both cellular and noncellular materials and novel methods that might provide valuable tools to measure and account for variation. Multiparametric techniques to identify “signatures” for therapeutically relevant cell types, such as neurons and cardiomyocytes that can be derived from hPSCs, would be of significant utility, although physical RMs will be required for clinical purposes. PMID:25650438

  31. Power and promise of narrative for advancing physical therapist education and practice.

    PubMed

    Greenfield, Bruce H; Jensen, Gail M; Delany, Clare M; Mostrom, Elizabeth; Knab, Mary; Jampel, Ann

    2015-06-01

    This perspective article provides a justification for and an overview of the use of narrative as a pedagogical tool for educators to help physical therapist students, residents, and clinicians develop skills of reflection and reflexivity in clinical practice. The use of narratives is a pedagogical approach that provides a reflective and interpretive framework for analyzing and making sense of texts, stories, and other experiences within learning environments. This article describes reflection as a well-established method to support critical analysis of clinical experiences; to assist in uncovering different perspectives of patients, families, and health care professionals involved in patient care; and to broaden the epistemological basis (ie, sources of knowledge) for clinical practice. The article begins by examining how phronetic (ie, practical and contextual) knowledge and ethical knowledge are used in physical therapy to contribute to evidence-based practice. Narrative is explored as a source of phronetic and ethical knowledge that is complementary but irreducible to traditional objective and empirical knowledge-the type of clinical knowledge that forms the basis of scientific training. The central premise is that writing narratives is a cognitive skill that should be learned and practiced to develop critical reflection for expert practice. The article weaves theory with practical application and strategies to foster narrative in education and practice. The final section of the article describes the authors' experiences with examples of integrating the tools of narrative into an educational program, into physical therapist residency programs, and into a clinical practice. © 2015 American Physical Therapy Association.

  32. Scale models: A proven cost-effective tool for outage planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, R.; Segroves, R.

    1995-03-01

    As generation costs for operating nuclear stations have risen, more nuclear utilities have initiated efforts to improve cost effectiveness. Nuclear plant owners are also being challenged with lower radiation exposure limits and newly revised radiation-protection regulations (10 CFR 20), which place further stress on their budgets. As source term reduction activities continue to lower radiation fields, reducing the amount of time spent in radiation fields becomes one of the most cost-effective ways of reducing radiation exposure. An effective approach for minimizing time spent in radiation areas is to use a physical scale model for worker orientation, planning, and monitoring of maintenance, modifications, and outage activities. To meet the challenge of continued reduction in annual cumulative radiation exposures, new cost-effective tools are required. One field-tested and proven tool is the physical scale model.

  33. Visualization tool for three-dimensional plasma velocity distributions (ISEE_3D) as a plug-in for SPEDAS

    NASA Astrophysics Data System (ADS)

    Keika, Kunihiro; Miyoshi, Yoshizumi; Machida, Shinobu; Ieda, Akimasa; Seki, Kanako; Hori, Tomoaki; Miyashita, Yukinaga; Shoji, Masafumi; Shinohara, Iku; Angelopoulos, Vassilis; Lewis, Jim W.; Flores, Aaron

    2017-12-01

    This paper introduces ISEE_3D, an interactive visualization tool for three-dimensional plasma velocity distribution functions, developed by the Institute for Space-Earth Environmental Research, Nagoya University, Japan. The tool provides a variety of methods to visualize the distribution function of space plasma: scatter, volume, and isosurface modes. The tool also has a wide range of functions, such as displaying magnetic field vectors and two-dimensional slices of distributions, to facilitate extensive analysis. A coordinate transformation to magnetic-field-aligned coordinates is also implemented in the tool. The source code of the tool is written in Interactive Data Language (IDL), a data analysis language widely used in the fields of space physics and solar physics. The current version of the tool can be used for data files of the plasma distribution function from the Geotail satellite mission, which are publicly accessible through the Data Archives and Transmission System of the Institute of Space and Astronautical Science (ISAS)/Japan Aerospace Exploration Agency (JAXA). The tool is also available in the Space Physics Environment Data Analysis Software to visualize plasma data from the Magnetospheric Multiscale and the Time History of Events and Macroscale Interactions during Substorms missions. The tool is planned to be applied to data from other missions, such as Arase (ERG) and Van Allen Probes, after replacing or adding data-loading plug-ins. This visualization tool helps scientists better understand the dynamics of space plasma, particularly in regions where the magnetohydrodynamic approximation is not valid, for example, the Earth's inner magnetosphere, magnetopause, bow shock, and plasma sheet.
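
    ISEE_3D is written in IDL, but the magnetic-field-aligned coordinate transformation it implements can be sketched in Python: build an orthonormal basis from the field vector and a reference vector, then rotate measured vectors into it (the numbers below are illustrative, not mission data):

    ```python
    import numpy as np

    def field_aligned_basis(b_vec, v_ref):
        """Orthonormal basis with e_para along B, e_perp1 perpendicular to both
        B and a reference vector (e.g., bulk velocity), e_perp2 completing it."""
        e_para = b_vec / np.linalg.norm(b_vec)
        e_perp1 = np.cross(b_vec, v_ref)
        e_perp1 /= np.linalg.norm(e_perp1)
        e_perp2 = np.cross(e_para, e_perp1)
        return np.vstack([e_perp1, e_perp2, e_para])   # rows: new basis vectors

    B = np.array([10.0, 5.0, -3.0])      # magnetic field [nT], illustrative
    V = np.array([-400.0, 30.0, 10.0])   # bulk velocity [km/s], illustrative
    R = field_aligned_basis(B, V)

    velocity_sample = np.array([-350.0, 60.0, 20.0])
    print("velocity in field-aligned coordinates:", R @ velocity_sample)
    ```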

  34. PROPOSED WATER QUALITY SURVEILLANCE NETWORK USING PHYSICAL, CHEMICAL AND BIOLOGICAL EARLY WARNING SYSTEMS (CBEWS)

    EPA Science Inventory

    The Homeland Protection Act of 2002 specifically calls for the investigation and use of Early Warning Systems (EWS) for water security reasons. The EWS is a screening tool for detecting changes in source water and distribution system water quality. A suite of time-relevant biol...

  35. PROPOSED WATER QUALITY SURVEILLANCE NETWORK USING PHYSICAL, CHEMICAL AND BIOLOGICAL EARLY WARNING SYSTEMS (BEWS)

    EPA Science Inventory

    The Homeland Protection Act of 2002 specifically calls for the investigation and use of Early Warning Systems (EWS) for water security reasons. The EWS is a screening tool for detecting changes in source water and distribution system water quality. A suite of time-relevant biol...

  36. MO-DE-BRA-03: TOPAS-edu: A Window Into the Stochastic World Through the TOPAS Tool for Particle Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perl, J; Villagomez-Bernabe, B; Currell, F

    2015-06-15

    Purpose: The stochastic nature of the subatomic world presents a challenge for physics education. Even experienced physicists can be amazed at the varied behavior of the electrons, x-rays, protons, neutrons, ions and the many short-lived particles that make up the overall behavior of our accelerators, brachytherapy sources and medical imaging systems. The all-particle Monte Carlo particle transport tool, TOPAS Tool for Particle Simulation, originally developed for proton therapy research, has been repurposed into a physics teaching tool, TOPAS-edu. Methods: TOPAS-edu students set up simulated particle sources, collimators, scatterers, imagers and scoring setups by writing simple ASCII files (in the TOPAS Parameter Control System format). Students visualize geometry setups and particle trajectories in a variety of modes, from OpenGL graphics to VRML 3D viewers to gif and PostScript image files. Results written to simple comma-separated-values files are imported by the student into their preferred data analysis tool. Students can vary random seeds or adjust parameters of physics processes to better understand the stochastic nature of subatomic physics. Results: TOPAS-edu has been successfully deployed as the centerpiece of a physics course for master's students at Queen's University Belfast. Tutorials developed there take students through a step-by-step course on the basics of particle transport and interaction, scattering, bremsstrahlung, etc. At each step in the course, students build simulated experimental setups and then analyze the simulated results. Lessons build one upon another so that a student might end up with a full simulation of a medical accelerator, a water phantom or an imager. Conclusion: TOPAS-edu was well received by students. A second application of TOPAS-edu is currently in development at Zurich University of Applied Sciences, Switzerland. It is our eventual goal to make TOPAS-edu available free of charge to any non-profit organization, along with associated tutorial materials developed by the TOPAS-edu community. Work supported in part by the U.S. Department of Energy under contract number DE-AC02-76SF00515. B. Villagomez-Bernabe is supported by CONACyT (Mexican Council for Science and Technology) project 231844.

  37. Optimal Combination of Non-Invasive Tools for the Early Detection of Potentially Life-Threatening Emergencies in Gynecology

    PubMed Central

    Varas, Catalina; Ravit, Marion; Mimoun, Camille; Panel, Pierre; Huchon, Cyrille; Fauconnier, Arnaud

    2016-01-01

    Objectives Potentially life-threatening gynecological emergencies (G-PLEs) are acute pelvic conditions that may spontaneously evolve into a life-threatening situation, or those for which there is a risk of sequelae or death in the absence of prompt diagnosis and treatment. The objective of this study was to identify the best combination of non-invasive diagnostic tools to ensure an accurate diagnosis and timely response when faced with G-PLEs for patients arriving with acute pelvic pain at the Gynecological Emergency Department (ED). Methods The data on non-invasive diagnostic tools were sourced from the records of patients presenting at the ED of two hospitals in the Parisian suburbs (France) with acute pelvic pain between September 2006 and April 2008. The medical history of the patients was obtained through a standardized questionnaire completed for a prospective observational study, and missing information was completed with data sourced from the medical forms. Diagnostic tool categories were predefined as a collection of signs or symptoms. We analyzed the association of each sign/symptom with G-PLEs using Pearson's chi-square or Fisher's exact tests. Symptoms and signs associated with G-PLEs (p-value < 0.20) were subjected to logistic regression to evaluate the diagnostic value of each of the predefined diagnostic tools and in various combinations. Results The data of 365 patients with acute pelvic pain were analyzed, of whom 103 were confirmed to have a PLE. We analyzed five diagnostic tools by logistic regression: Triage Process, History-Taking, Physical Examination, Ultrasonography, and Biological Exams. The combination of History-Taking and Ultrasonography had a C-index of 0.83, the highest for a model combining two tools. Conclusions The use of a standardized self-assessment questionnaire for history-taking and focal ultrasound examination were found to be the most successful tool combination for the diagnosis of gynecological emergencies in a Gynecological ED. Additional tools, such as physical examination, do not add substantial diagnostic value. PMID:27583697
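
    The C-index reported above is the area under the ROC curve of the logistic-regression model. A minimal sketch with synthetic stand-ins for the History-Taking and Ultrasonography predictors (the coefficients and data are invented, not the study's):

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 365

    # Synthetic binary findings; y = 1 marks a potentially life-threatening emergency
    X = rng.integers(0, 2, size=(n, 2)).astype(float)
    logit = -2.0 + 1.5 * X[:, 0] + 1.8 * X[:, 1]
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    model = LogisticRegression().fit(X, y)
    c_index = roc_auc_score(y, model.predict_proba(X)[:, 1])
    print(f"C-index (area under the ROC curve): {c_index:.2f}")
    ```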

  38. Simulation tools for analyzer-based x-ray phase contrast imaging system with a conventional x-ray source

    NASA Astrophysics Data System (ADS)

    Caudevilla, Oriol; Zhou, Wei; Stoupin, Stanislav; Verman, Boris; Brankov, J. G.

    2016-09-01

    Analyzer-based X-ray phase contrast imaging (ABI) belongs to a broader family of phase-contrast (PC) X-ray imaging modalities. Unlike conventional X-ray radiography, which measures only X-ray absorption, PC imaging also measures the X-ray deflection induced by the object's refractive properties. It has been shown that refraction imaging provides better contrast when imaging soft tissue, which is of great interest in medical imaging applications. In this paper, we introduce a simulation tool specifically designed to model an analyzer-based X-ray phase contrast imaging system with a conventional polychromatic X-ray source. By utilizing ray tracing and basic physical principles of diffraction theory, our simulation tool can predict the X-ray beam profile shape, the energy content, and the total throughput (photon count) at the detector. In addition, we can evaluate the imaging system's point-spread function for various system configurations.

  39. Open-Source Assisted Laboratory Automation through Graphical User Interfaces and 3D Printers: Application to Equipment Hyphenation for Higher-Order Data Generation.

    PubMed

    Siano, Gabriel G; Montemurro, Milagros; Alcaráz, Mirta R; Goicoechea, Héctor C

    2017-10-17

    Higher-order data generation implies some automation challenges, which are mainly related to the hidden programming languages and electronic details of the equipment. When techniques and/or equipment hyphenation are the key to obtaining higher-order data, the required simultaneous control of them demands funds for new hardware, software, and licenses, in addition to very skilled operators. In this work, we present Design of Inputs-Outputs with Sikuli (DIOS), a free and open-source program that provides a general framework for the design of automated experimental procedures without prior knowledge of programming or electronics. Basically, instruments and devices are considered as nodes in a network, and every node is associated with both physical and virtual inputs and outputs. Virtual components, such as graphical user interfaces (GUIs) of equipment, are handled by means of the image recognition tools provided by the Sikuli scripting language, while their physical counterparts are handled using an adapted open-source three-dimensional (3D) printer. Two previously reported experiments of our research group, related to fluorescence matrices derived from kinetics and high-performance liquid chromatography, were adapted to be carried out in a more automated fashion. Satisfactory results were obtained in terms of analytical performance. The advantages of open-source tool assistance were likewise apparent, mainly in terms of reduced operator intervention and cost savings.
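
    A minimal Sikuli-style script (Sikuli scripts are written in Jython) showing the image-recognition pattern such a framework builds on; the screenshot file names, timeouts, and output file name are placeholders, not part of DIOS:

    ```python
    # Sikuli script sketch: drive an instrument's GUI by image matching.
    # The .png files are screenshots the user captures of the GUI elements.
    start_button = "acquire_button.png"     # hypothetical Acquire button image
    save_dialog = "save_dialog_title.png"   # hypothetical save-dialog title bar

    wait(start_button, 30)    # block until the instrument GUI is ready
    click(start_button)       # trigger an acquisition
    if exists(save_dialog, 120):
        type("run_042.csv" + Key.ENTER)   # name the output file (placeholder)
    else:
        print("acquisition did not finish in time")
    ```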

  40. Processing UAV and Lidar Point Clouds in GRASS GIS

    NASA Astrophysics Data System (ADS)

    Petras, V.; Petrasova, A.; Jeziorska, J.; Mitasova, H.

    2016-06-01

    Today's methods of acquiring Earth surface data, namely lidar and unmanned aerial vehicle (UAV) imagery, non-selectively collect or generate large numbers of points. Point clouds from different sources vary in their properties such as number of returns, density, or quality. We present a set of tools with applications for different types of point clouds obtained by a lidar scanner, the structure-from-motion technique (SfM), and a low-cost 3D scanner. To take advantage of the vertical structure of multiple-return lidar point clouds, we demonstrate tools to process them using 3D raster techniques which allow, for example, the development of custom vegetation classification methods. Dense point clouds obtained from UAV imagery, often containing redundant points, can be decimated using various techniques before further processing. We implemented and compared several decimation techniques in regard to their performance and the final digital surface model (DSM). Finally, we describe the processing of a point cloud from a low-cost 3D scanner, namely Microsoft Kinect, and its application for interaction with physical models. All the presented tools are open source and integrated in GRASS GIS, a multi-purpose open source GIS with remote sensing capabilities. The tools integrate with other open source projects, specifically the Point Data Abstraction Library (PDAL), the Point Cloud Library (PCL), and the OpenKinect libfreenect2 library, to benefit from the open source point cloud ecosystem. The implementation in GRASS GIS ensures long-term maintenance and reproducibility by the scientific community as well as by the original authors themselves.
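
    One of the simplest decimation strategies for a dense SfM cloud, keeping a per-cell centroid on a horizontal grid, can be sketched with NumPy (an illustration of the idea, not the GRASS GIS implementation):

    ```python
    import numpy as np

    def grid_decimate(points, cell=1.0):
        """Keep one representative point (the centroid) per square grid cell,
        a simple count-reducing decimation for dense point clouds."""
        keys = np.floor(points[:, :2] / cell).astype(np.int64)
        _, inverse = np.unique(keys, axis=0, return_inverse=True)
        counts = np.bincount(inverse).astype(float)
        out = np.zeros((inverse.max() + 1, points.shape[1]))
        for dim in range(points.shape[1]):
            out[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
        return out

    pts = np.random.default_rng(2).uniform(0, 100, size=(200000, 3))  # synthetic
    print(len(pts), "->", len(grid_decimate(pts, cell=2.0)), "points")
    ```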

  41. Survey of Ambient Air Pollution Health Risk Assessment Tools.

    PubMed

    Anenberg, Susan C; Belova, Anna; Brandt, Jørgen; Fann, Neal; Greco, Sue; Guttikunda, Sarath; Heroux, Marie-Eve; Hurley, Fintan; Krzyzanowski, Michal; Medina, Sylvia; Miller, Brian; Pandey, Kiran; Roos, Joachim; Van Dingenen, Rita

    2016-09-01

    Designing air quality policies that improve public health can benefit from information about air pollution health risks and impacts, which include respiratory and cardiovascular diseases and premature death. Several computer-based tools help automate air pollution health impact assessments and are being used in a variety of contexts. Expanding information gathered for a May 2014 World Health Organization expert meeting, we survey 12 multinational air pollution health impact assessment tools, categorize them according to key technical and operational characteristics, and identify limitations and challenges. Key characteristics include spatial resolution, pollutants and health effect outcomes evaluated, and method for characterizing population exposure, as well as tool format, accessibility, complexity, and degree of peer review and application in policy contexts. While many of the tools use common data sources for concentration-response associations, population, and baseline mortality rates, they vary in the exposure information source, format, and degree of technical complexity. We find that there is an important tradeoff between technical refinement and accessibility for a broad range of applications. Analysts should apply tools that provide the appropriate geographic scope, resolution, and maximum degree of technical rigor for the intended assessment, within resource constraints. A systematic intercomparison of the tools' inputs, assumptions, calculations, and results would be helpful to determine the appropriateness of each for different types of assessment. Future work would benefit from accounting for multiple uncertainty sources and integrating ambient air pollution health impact assessment tools with those addressing other related health risks (e.g., smoking, indoor pollution, climate change, vehicle accidents, physical activity). © 2016 Society for Risk Analysis.
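
    Tools of this kind typically evaluate a log-linear concentration-response function; a minimal sketch of that calculation follows (the relative risk, baseline rate, and population are illustrative numbers, not taken from any surveyed tool):

    ```python
    import numpy as np

    def excess_deaths(delta_c, beta, y0, pop):
        """Attributable premature deaths from a log-linear concentration-response
        function: delta_c = pollutant change [ug/m^3], beta = CRF coefficient
        per ug/m^3, y0 = baseline mortality rate, pop = exposed population."""
        return (1.0 - np.exp(-beta * delta_c)) * y0 * pop

    # Illustrative inputs: 10 ug/m^3 PM2.5 reduction, RR = 1.06 per 10 ug/m^3
    beta = np.log(1.06) / 10.0
    print(f"avoided deaths: {excess_deaths(10.0, beta, y0=0.008, pop=1e6):.0f}")
    ```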

  2. Pika: A snow science simulation tool built using the open-source framework MOOSE

    NASA Astrophysics Data System (ADS)

    Slaughter, A.; Johnson, M.

    2017-12-01

    The Department of Energy (DOE) currently invests millions of dollars annually in modeling and simulation tools for all aspects of nuclear energy. An important part of this effort is developing applications based on the open-source Multiphysics Object Oriented Simulation Environment (MOOSE; mooseframework.org) from Idaho National Laboratory (INL). Thanks to the efforts of the DOE and outside collaborators, MOOSE currently contains a large set of physics modules, including phase-field, level set, heat conduction, tensor mechanics, Navier-Stokes, fracture and crack propagation (via the extended finite-element method), flow in porous media, and others. The heat conduction, tensor mechanics, and phase-field modules, in particular, are well suited to snow science problems. Pika, an open-source MOOSE-based application, is capable of simulating both 3D coupled nonlinear continuum heat-transfer and large-deformation mechanics applications (such as settlement) and phase-field-based microstructure applications. Additionally, these types of problems may be coupled tightly in a single solve, or across length and time scales using a loosely coupled Picard iteration approach. Beyond this wide range of physics capabilities, MOOSE-based applications also inherit an extensible testing framework, a graphical user interface, and a documentation system: tools that allow MOOSE and other applications to adhere to nuclear software quality standards. The snow science community can learn from the nuclear industry and harness the existing effort to build simulation tools that are open, modular, and share a common framework. In particular, MOOSE-based multiphysics solvers are inherently parallel, dimension agnostic, adaptive in time and space, fully coupled, and capable of interacting with other applications. The snow science community should build on existing tools to enable collaboration between researchers and practitioners throughout the world, and advance the state of the art in line with other scientific research efforts.
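
    The loosely coupled Picard approach mentioned above can be illustrated independently of MOOSE: two single-physics solves are repeated, each consuming the other's latest field, until a fixed point is reached. The toy "solvers" below are stand-ins invented for the sketch, not Pika code.

        import numpy as np

        def solve_heat(T, phi):
            # Stand-in for a heat conduction solve coupled to phase field phi.
            return 0.5 * T + 0.1 * phi

        def solve_phase(phi, T):
            # Stand-in for a phase-field solve coupled to temperature T.
            return 0.8 * phi + 0.05 * T

        def picard(T, phi, tol=1e-10, max_iter=100):
            """Loosely coupled fixed-point (Picard) iteration over two physics."""
            for it in range(max_iter):
                T_new = solve_heat(T, phi)
                phi_new = solve_phase(phi, T_new)  # uses freshest temperature
                change = max(np.abs(T_new - T).max(),
                             np.abs(phi_new - phi).max())
                T, phi = T_new, phi_new
                if change < tol:
                    return T, phi, it + 1
            raise RuntimeError("Picard iteration did not converge")

        T, phi, iters = picard(np.ones(10), np.zeros(10))
        print("converged in", iters, "iterations")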

  3. Development and Validation of a Multidisciplinary Tool for Accurate and Efficient Rotorcraft Noise Prediction (MUTE)

    NASA Technical Reports Server (NTRS)

    Liu, Yi; Anusonti-Inthra, Phuriwat; Diskin, Boris

    2011-01-01

    A physics-based, systematically coupled, multidisciplinary prediction tool (MUTE) for rotorcraft noise was developed and validated over a wide range of flight configurations and conditions. MUTE is an aggregation of multidisciplinary computational tools that accurately and efficiently model the physics of the source of rotorcraft noise and predict the noise at far-field observer locations. It uses systematic coupling approaches among multiple disciplines, including Computational Fluid Dynamics (CFD), Computational Structural Dynamics (CSD), and high-fidelity acoustics. Within MUTE, advanced high-order CFD tools are used around the rotor blade to predict the transonic flow (shock wave) effects, which generate the high-speed impulsive noise. Predictions of the blade-vortex interaction noise in low-speed flight are also improved by using the Particle Vortex Transport Method (PVTM), which preserves the wake flow details required for blade/wake and fuselage/wake interactions. The accuracy of the source noise prediction is further improved by utilizing a coupling approach between CFD and CSD, so that the effects of key structural dynamics, elastic blade deformations, and trim solutions are correctly represented in the analysis. The blade loading information and/or the flow field parameters around the rotor blade predicted by the CFD/CSD coupling approach are used to predict the acoustic signatures at far-field observer locations with a high-fidelity noise propagation code (WOPWOP3). The predicted results from the MUTE tool for rotor blade aerodynamic loading and far-field acoustic signatures are compared and validated against a variety of experimental data sets, such as UH-60A data, DNW test data, and HART II test data.

  4. X-ray system simulation software tools for radiology and radiography education.

    PubMed

    Kengyelics, Stephen M; Treadgold, Laura A; Davies, Andrew G

    2018-02-01

    To develop x-ray simulation software tools to support the delivery of radiological science education for a range of learning environments and audiences, including individual study, lectures, and tutorials. Two software tools were developed. One simulated x-ray production for a simple two-dimensional radiographic system geometry comprising an x-ray source, beam filter, test object, and detector. The other simulated the acquisition and display of two-dimensional radiographic images of complex three-dimensional objects, using a ray-casting algorithm through three-dimensional mesh objects. Both tools were intended to be simple to use, to produce results accurate enough to be useful for educational purposes, and to have an acceptable simulation time on modest computer hardware. The radiographic factors and acquisition geometry could be altered in both tools via their graphical user interfaces. Radiographic contrast measurements from the simulators were compared with those from a real system, and the contrast output of the simulators showed excellent agreement with the measured results. The software simulators were deployed to 120 computers on campus. The tools developed are easy to use, clearly demonstrate important x-ray physics and imaging principles, are accessible within a standard university setting, and could be used to enhance the teaching of x-ray physics to undergraduate students. Current approaches to teaching x-ray physics in radiological science lack immediacy when linking theory with practice; this method of delivery allows students to engage with the subject in an experiential learning environment. Copyright © 2017. Published by Elsevier Ltd.
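
    The core physics such a simulator exercises is Beer-Lambert attenuation along each ray. A minimal sketch of that step follows; the attenuation coefficients and geometry are illustrative, not values from the published tools.

        import math

        def transmitted_intensity(i0, segments):
            """Beer-Lambert attenuation along one ray.

            segments: list of (mu, thickness) pairs, where mu is the linear
            attenuation coefficient (1/cm) and thickness is path length (cm).
            """
            total = sum(mu * t for mu, t in segments)
            return i0 * math.exp(-total)

        # Two rays through a 10 cm water block; one also crosses 1 cm of bone.
        # mu values are illustrative, roughly right for a diagnostic beam.
        mu_water, mu_bone = 0.2, 0.5
        i_background = transmitted_intensity(1.0, [(mu_water, 10.0)])
        i_object = transmitted_intensity(1.0, [(mu_water, 9.0), (mu_bone, 1.0)])

        # Radiographic contrast between the two detector readings.
        contrast = (i_background - i_object) / i_background
        print(f"contrast = {contrast:.3f}")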

  5. Physical processes and diagnostics of gamma-ray burst emission

    NASA Technical Reports Server (NTRS)

    Harding, Alice K.

    1992-01-01

    With improved data from BATSE and other instruments, it is important to develop a range of diagnostic tools to link gamma-ray burst observations with theory. I will review some of the physical processes which may take place to form the spectrum of gamma-ray burst sources, assuming that the bursts originate on strongly magnetized neutron stars. The important diagnostics that these processes provide to probe the emission region and how they might be used to interpret observed spectra will also be discussed.

  6. [The physical problems in medicine].

    PubMed

    Bao, Shang-lian; Wang, Wei-dong; Fan, Tie-shuan

    2007-05-01

    According to the World Health Organization (WHO), the basic sciences to support the human health are chemistry, physics and informatics. Chemistry is the base of pharmacy. Physics is the base of medical instruments and equipments (MIE). The diagnosis and therapy of diseases are relying on informatics. Therefore, as the fusion results of physics and medicine, medical physics is the creative source science of MIE. Among all diagnosis tools, medical imaging devices are the fastest-developed and the most-complicated MIE since Roentgen discovered X-ray which was quickly used in medical diagnosis in 1895. Among all treatment tools, the radiotherapeutical devices are the most-widely used and the most effective MIE for tumor treatments since Mrs. Courier found the nature radiation isotope Radium at the end of 19th century and began to use it in tumor therapy. Although the research and development (R&D) of so-complicated MIE need many subjects of science and engineering, the kernel science is medical physics. With the results of more than 50 years' development in developed countries, medical physics has defined its own field, which is the medical imaging physics and the radiotherapeutical physics. But, the definition has been expanded to be wider and wider. Therefore, we should pay more attention to the establishment of Medical Physics in China. In order to develop medical physics in china, the bases of R&D and clinical practice should be also built.

  7. Completeness and overlap in open access systems: Search engines, aggregate institutional repositories and physics-related open sources.

    PubMed

    Tsay, Ming-Yueh; Wu, Tai-Luan; Tseng, Ling-Li

    2017-01-01

    This study examines the completeness and overlap of coverage in physics of six open access scholarly communication systems, including two search engines (Google Scholar and Microsoft Academic), two aggregate institutional repositories (OAIster and OpenDOAR), and two physics-related open sources (arXiv.org and Astrophysics Data System). The 2001-2013 Nobel Laureates in Physics served as the sample. Bibliographic records of their publications were retrieved and downloaded from each system, and a computer program was developed to perform the analytical tasks of sorting, comparison, elimination, aggregation and statistical calculations. Quantitative analyses and cross-referencing were performed to determine the completeness and overlap of the system coverage of the six open access systems. The results may enable scholars to select an appropriate open access system as an efficient scholarly communication channel, and academic institutions may build institutional repositories or independently create citation index systems in the future. Suggestions on indicators and tools for academic assessment are presented based on the comprehensiveness assessment of each system.
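
    The sorting, comparison, and elimination steps described above reduce to set arithmetic over normalized bibliographic keys. A hedged sketch of that idea (the normalization rule and sample titles are invented for illustration):

        def normalize(title):
            """Crude normalization key: lowercase alphanumerics only."""
            return "".join(ch for ch in title.lower() if ch.isalnum())

        def coverage_overlap(system_a, system_b):
            """Return (unique to A, unique to B, shared) record counts."""
            a = {normalize(t) for t in system_a}
            b = {normalize(t) for t in system_b}
            return len(a - b), len(b - a), len(a & b)

        arxiv = ["Topological Insulators", "Graphene Superlattices"]
        ads = ["Graphene superlattices", "Cosmic Microwave Background"]
        print(coverage_overlap(arxiv, ads))   # -> (1, 1, 1)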

  8. Integrative Analysis of the Physical Transport Network into Australia.

    PubMed

    Cope, Robert C; Ross, Joshua V; Wittmann, Talia A; Prowse, Thomas A A; Cassey, Phillip

    2016-01-01

    Effective biosecurity is necessary to protect nations and their citizens from a variety of threats, including emerging infectious diseases, agricultural or environmental pests and pathogens, and illegal wildlife trade. The physical pathways by which these threats are transported internationally, predominantly shipping and air traffic, have undergone significant growth and changes in spatial distributions in recent decades. An understanding of the specific pathways and donor-traffic hotspots created by this integrated physical transport network is vital for the development of effective biosecurity strategies into the future. In this study, we analysed the physical transport network into Australia over the period 1999-2012. Seaborne and air traffic were weighted to calculate a "weighted cumulative impact" score for each source region worldwide, each year. High risk source regions, and those source regions that underwent substantial changes in risk over the study period, were determined. An overall risk ranking was calculated by integrating across all possible weighting combinations. The source regions having greatest overall physical connectedness with Australia were Singapore, which is a global transport hub, and the North Island of New Zealand, a close regional trading partner with Australia. Both those regions with large amounts of traffic across multiple vectors (e.g., Hong Kong), and those with high levels of traffic of only one type (e.g., Bali, Indonesia with respect to passenger flights), were represented among high risk source regions. These data provide a baseline model for the transport of individuals and commodities against which the effectiveness of biosecurity controls may be assessed, and are a valuable tool in the development of future biosecurity policy.
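
    The "weighted cumulative impact" scoring and the integration over weighting combinations can be sketched generically as below; the traffic counts and the weight grid are invented for illustration and do not reproduce the study's data.

        import numpy as np

        # Invented yearly traffic counts per source region: (sea, air) arrivals.
        traffic = {"Singapore": (5200, 8100),
                   "NZ North Island": (3900, 6400),
                   "Hong Kong": (4100, 5200)}

        def weighted_impact(sea, air, w_sea):
            """Weighted cumulative impact for one weighting of the vectors."""
            return w_sea * sea + (1.0 - w_sea) * air

        # Integrate (average) the score over a grid of weightings.
        weights = np.linspace(0.0, 1.0, 11)
        scores = {region: np.mean([weighted_impact(sea, air, w) for w in weights])
                  for region, (sea, air) in traffic.items()}

        for region, score in sorted(scores.items(), key=lambda kv: -kv[1]):
            print(f"{region:16s} {score:8.1f}")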

  9. Integrative Analysis of the Physical Transport Network into Australia

    PubMed Central

    Cope, Robert C.; Ross, Joshua V.; Wittmann, Talia A.; Prowse, Thomas A. A.; Cassey, Phillip

    2016-01-01

    Effective biosecurity is necessary to protect nations and their citizens from a variety of threats, including emerging infectious diseases, agricultural or environmental pests and pathogens, and illegal wildlife trade. The physical pathways by which these threats are transported internationally, predominantly shipping and air traffic, have undergone significant growth and changes in spatial distributions in recent decades. An understanding of the specific pathways and donor-traffic hotspots created by this integrated physical transport network is vital for the development of effective biosecurity strategies into the future. In this study, we analysed the physical transport network into Australia over the period 1999–2012. Seaborne and air traffic were weighted to calculate a “weighted cumulative impact” score for each source region worldwide, each year. High risk source regions, and those source regions that underwent substantial changes in risk over the study period, were determined. An overall risk ranking was calculated by integrating across all possible weighting combinations. The source regions having greatest overall physical connectedness with Australia were Singapore, which is a global transport hub, and the North Island of New Zealand, a close regional trading partner with Australia. Both those regions with large amounts of traffic across multiple vectors (e.g., Hong Kong), and those with high levels of traffic of only one type (e.g., Bali, Indonesia with respect to passenger flights), were represented among high risk source regions. These data provide a baseline model for the transport of individuals and commodities against which the effectiveness of biosecurity controls may be assessed, and are a valuable tool in the development of future biosecurity policy. PMID:26881782

  10. Automatic 3D virtual scenes modeling for multisensors simulation

    NASA Astrophysics Data System (ADS)

    Latger, Jean; Le Goff, Alain; Cathala, Thierry; Larive, Mathieu

    2006-05-01

    SEDRIS, which stands for Synthetic Environment Data Representation and Interchange Specification, is a DoD/DMSO initiative to federate and make interoperable 3D mock-ups in the frame of virtual reality and simulation. This paper shows an original application of the SEDRIS concept to physics-based multi-sensor simulation for research, whereas SEDRIS is more classically known for training simulation. CHORALE (simulated Optronic Acoustic Radar battlefield) is used by the French DGA/DCE (Directorate for Test and Evaluation of the French Ministry of Defense) to perform multi-sensor simulations. CHORALE enables the user to create virtual and realistic multispectral 3D scenes and to generate the physical signal received by a sensor, typically an IR sensor. In the scope of this CHORALE workshop, the French DGA has decided to introduce a SEDRIS-based 3D terrain modeling tool that creates 3D databases automatically, directly usable by the physical sensor simulation renderers of CHORALE. This AGETIM tool turns geographical source data (including GIS facilities) into meshed geometry enhanced with the sensors' physical extensions, fitted to the ray-tracing rendering of CHORALE for the infrared, electromagnetic, and acoustic spectra. The basic idea is to enhance the 2D source level directly with the physical data, rather than enhancing the 3D meshed level, which is both more efficient (rapid database generation) and more reliable (the database can be regenerated many times, changing only some parameters). The paper concludes with the latest evolution of AGETIM in the scope of mission rehearsal for urban warfare using sensors, including indoor modeling for automatic generation of the inner parts of buildings.

  11. The Impact and Promise of Open-Source Computational Material for Physics Teaching

    NASA Astrophysics Data System (ADS)

    Christian, Wolfgang

    2017-01-01

    A computer-based modeling approach to teaching must be flexible because students and teachers have different skills and varying levels of preparation. Learning how to run the "software du jour" is not the objective for integrating computational physics material into the curriculum. Learning computational thinking, how to use computation and computer-based visualization to communicate ideas, how to design and build models, and how to use ready-to-run models to foster critical thinking is the objective. Our computational modeling approach to teaching is a research-proven pedagogy that predates computers. It attempts to enhance student achievement through the Modeling Cycle. This approach was pioneered by Robert Karplus and the SCIS Project in the 1960s and 70s and later extended by the Modeling Instruction Program led by Jane Jackson and David Hestenes at Arizona State University. This talk describes a no-cost open-source computational approach aligned with a Modeling Cycle pedagogy. Our tools, curricular material, and ready-to-run examples are freely available from the Open Source Physics Collection hosted on the AAPT-ComPADRE digital library. Examples will be presented.

  12. Visualization tool for human-machine interface designers

    NASA Astrophysics Data System (ADS)

    Prevost, Michael P.; Banda, Carolyn P.

    1991-06-01

    As modern human-machine systems continue to grow in capabilities and complexity, system operators are faced with integrating and managing increased quantities of information. Since many information components are highly related to each other, optimizing the spatial and temporal aspects of presenting information to the operator has become a formidable task for the human-machine interface (HMI) designer. The authors describe a tool in an early stage of development, the Information Source Layout Editor (ISLE). This tool is to be used for information presentation design and analysis; it uses human factors guidelines to assist the HMI designer in the spatial layout of the information required by machine operators to perform their tasks effectively. These human factors guidelines address such areas as the functional and physical relatedness of information sources. By representing these relationships with metaphors such as spring tension, attractors, and repellers, the tool can help designers visualize the complex constraint space and interacting effects of moving displays to various alternate locations. The tool contains techniques for visualizing the relative 'goodness' of a configuration, as well as mechanisms such as optimization vectors to provide guidance toward a more optimal design. Also available is a rule-based design checker to determine compliance with selected human factors guidelines.
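
    The spring-tension/attractor/repeller metaphor corresponds to what is now commonly called force-directed layout. A hedged sketch of the relaxation idea (the relatedness matrix, force constants, and step count are invented):

        import numpy as np

        def relax(pos, related, k_spring=0.01, k_repel=500.0, steps=200):
            """Force-directed placement of display elements.

            pos:     (N, 2) array of element positions on the panel.
            related: (N, N) matrix; related[i, j] > 0 pulls i and j together.
            """
            for _ in range(steps):
                delta = pos[:, None, :] - pos[None, :, :]   # pairwise offsets
                dist = np.linalg.norm(delta, axis=-1) + 1e-9
                np.fill_diagonal(dist, np.inf)              # no self-forces
                # Springs pull related elements together; all pairs repel.
                pull = -k_spring * related[..., None] * delta
                push = k_repel * delta / dist[..., None] ** 3
                pos = pos + (pull + push).sum(axis=1)
            return pos

        rng = np.random.default_rng(1)
        positions = rng.uniform(0, 100, size=(5, 2))
        relatedness = (rng.uniform(size=(5, 5)) > 0.5).astype(float)
        relatedness = (relatedness + relatedness.T) / 2     # make symmetric
        print(relax(positions, relatedness).round(1))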

  13. IViPP: A Tool for Visualization in Particle Physics

    NASA Astrophysics Data System (ADS)

    Tran, Hieu; Skiba, Elizabeth; Baldwin, Doug

    2011-10-01

    Experiments and simulations in physics generate a lot of data, and visualization helps prepare those data for analysis. IViPP (Interactive Visualizations in Particle Physics) is an interactive computer program that visualizes results of particle physics simulations or experiments. IViPP can handle data from different simulators, such as SRIM or MCNP. It can display relevant geometry and measured scalar data, and it can do simple selection from the visualized data. To be an effective visualization tool, IViPP must have a software architecture that can flexibly adapt to new data sources and display styles, and it must be able to display complicated geometry and measured data with a high dynamic range. We therefore organize it in a highly modular structure, develop libraries to describe geometry algorithmically, use rendering algorithms running on the powerful GPU to display 3-D geometry at interactive rates, and represent scalar values in a visual form of scientific notation that shows both mantissa and exponent. This work was supported in part by the US Department of Energy through the Laboratory for Laser Energetics (LLE), with special thanks to Craig Sangster at LLE.
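
    The "visual scientific notation" idea, splitting a scalar into a base-10 mantissa and exponent so each can drive its own visual channel, reduces to a few lines. The decomposition below is generic, not IViPP source code:

        import math

        def sci_split(value):
            """Split a positive scalar into (mantissa, exponent), base 10."""
            exponent = math.floor(math.log10(value))
            mantissa = value / 10.0 ** exponent
            return mantissa, exponent      # mantissa in [1, 10)

        # e.g. map mantissa to marker size and exponent to a color bin.
        for v in (3.2e-7, 0.042, 9.81, 6.02e23):
            m, e = sci_split(v)
            print(f"{v:>10.3g} -> mantissa {m:.2f}, exponent {e:+d}")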

  14. Playing Funny: An Introduction to "Commedia dell' Arte."

    ERIC Educational Resources Information Center

    Grantham, Barry

    2001-01-01

    Discusses the use of "Commedia," a way of performing inspired by the historical "Commedia dell' Arte." Notes that it has proved a fertile source of inspiration for all types of physical and stylized theatre and a useful training tool for performers in many fields. Presents a series of exercises designed to introduce the student to Commedia…

  15. Analyzing Virtual Physics Simulations with Tracker

    NASA Astrophysics Data System (ADS)

    Claessens, Tom

    2017-12-01

    In the physics teaching community, Tracker is well known as a user-friendly open source video analysis software, authored by Douglas Brown. With this tool, the user can trace markers indicated on a video or on stroboscopic photos and perform kinematic analyses. Tracker also includes a data modeling tool that allows one to fit theoretical equations of motion onto experimentally obtained data. In the field of particle mechanics, Tracker has been effectively used for learning and teaching about projectile motion, "toss-up" and free-fall vertical motion, and the principle of mechanical energy conservation. Tracker has also been successfully used in rigid body mechanics to interpret the results of experiments with rolling/slipping cylinders and moving rods. In this work, I propose an original method in which Tracker is used to analyze virtual computer simulations created with a physics-based motion solver, instead of video recordings or stroboscopic photos. This can be an interesting approach to studying kinematics and dynamics problems in physics education, in particular when there is no or limited access to physical labs. I demonstrate the working method with a typical (but quite challenging) problem in classical mechanics: a slipping/rolling cylinder on a rough surface.
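
    Tracker's data-modeling step (fitting a theoretical equation of motion to traced positions) can be reproduced outside Tracker with SciPy. A sketch fitting free fall to synthetic "tracked" data, with all numbers invented:

        import numpy as np
        from scipy.optimize import curve_fit

        def free_fall(t, y0, v0, g):
            """Vertical position under constant gravitational acceleration."""
            return y0 + v0 * t - 0.5 * g * t**2

        # Synthetic marker positions, as if traced frame by frame in Tracker.
        t = np.linspace(0.0, 1.0, 30)
        rng = np.random.default_rng(5)
        y = free_fall(t, 1.2, 3.0, 9.81) + rng.normal(0.0, 0.005, t.size)

        popt, _ = curve_fit(free_fall, t, y, p0=[1.0, 1.0, 10.0])
        print("y0, v0, g =", popt.round(2))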

  16. CFD Multiphysics Tool

    NASA Technical Reports Server (NTRS)

    Perrell, Eric R.

    2005-01-01

    The recent bold initiatives to expand the human presence in space require innovative approaches to the design of propulsion systems whose underlying technology is not yet mature. The space propulsion community has identified a number of candidate concepts. A short list includes solar sails, high-energy-density chemical propellants, electric and electromagnetic accelerators, solar-thermal and nuclear-thermal expanders. For each of these, the underlying physics are relatively well understood. One could easily cite authoritative texts, addressing both the governing equations, and practical solution methods for, e.g. electromagnetic fields, heat transfer, radiation, thermophysics, structural dynamics, particulate kinematics, nuclear energy, power conversion, and fluid dynamics. One could also easily cite scholarly works in which complete equation sets for any one of these physical processes have been accurately solved relative to complex engineered systems. The Advanced Concepts and Analysis Office (ACAO), Space Transportation Directorate, NASA Marshall Space Flight Center, has recently released the first alpha version of a set of computer utilities for performing the applicable physical analyses relative to candidate deep-space propulsion systems such as those listed above. PARSEC, Preliminary Analysis of Revolutionary in-Space Engineering Concepts, enables rapid iterative calculations using several physics tools developed in-house. A complete cycle of the entire tool set takes about twenty minutes. PARSEC is a level-zero/level-one design tool. For PARSEC's proof-of-concept, and preliminary design decision-making, assumptions that significantly simplify the governing equation sets are necessary. To proceed to level-two, one wishes to retain modeling of the underlying physics as close as practical to known applicable first principles. This report describes results of collaboration between ACAO, and Embry-Riddle Aeronautical University (ERAU), to begin building a set of level-two design tools for PARSEC. The "CFD Multiphysics Tool" will be the propulsive element of the tool set. The name acknowledges that space propulsion performance assessment is primarily a fluid mechanics problem. At the core of the CFD Multiphysics Tool is an open-source CFD code, HYP, under development at ERAU. ERAU is renowned for its undergraduate degree program in Aerospace Engineering, the largest in the nation. The strength of the program is its applications-oriented curriculum, which culminates in one of three two-course Engineering Design sequences: Aerospace Propulsion, Spacecraft, or Aircraft. This same philosophy applies to the HYP Project, albeit with fluid physics modeling commensurate with graduate research. HYP's purpose, like the Multiphysics Tool's, is to enable calculations of real (three-dimensional; geometrically complex; intended for hardware development) applications of high speed and propulsive fluid flows.

  17. Review of current progress in nanometrology with the helium ion microscope

    NASA Astrophysics Data System (ADS)

    Postek, Michael T.; Vladár, András; Archie, Charles; Ming, Bin

    2011-02-01

    Scanning electron microscopy has been employed as an imaging and measurement tool for more than 50 years, and it continues as a primary tool in many research and manufacturing facilities across the world. A new challenger to this work is the helium ion microscope (HIM), a new imaging and metrology technology. Essentially, substituting a helium ion source for the electron source yields a tool visually similar in function to the scanning electron microscope, but very different in the fundamental imaging and measurement process. The imaged and measured signal originates differently than in the scanning electron microscope; that fact, together with the single-atom source, may push the obtainable resolution lower, provide greater depth of field, and ultimately improve the metrology. Successful imaging and metrology with this instrument entails understanding and modeling new ion beam/specimen interaction physics. As a new methodology, HIM is beginning to show promise, and the abundance of potentially advantageous applications for nanometrology has yet to be fully exploited. This paper discusses some of the progress made at NIST, in collaboration with IBM, to understand the science behind this new technology.

  18. CMS Analysis and Data Reduction with Apache Spark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutsche, Oliver; Canali, Luca; Cremer, Illia

    Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was among the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems for distributed data processing, collectively called "Big Data" technologies, have emerged from industry and open source projects to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and tools, promising a fresh look at the analysis of very large datasets that could potentially reduce the time-to-physics with increased interactivity. Moreover, these new tools are typically actively developed by large communities, often profiting from industry resources, and released under open source licensing. These factors boost the adoption and maturity of the tools and of the communities supporting them, while helping to reduce the cost of ownership for end users. In this talk, we present studies of using Apache Spark for end-user data analysis. We study the HEP analysis workflow separated into two thrusts: the reduction of centrally produced experiment datasets, and the end analysis up to the publication plot. For the first thrust, CMS is working together with CERN openlab and Intel on the CMS Big Data Reduction Facility; the goal is to reduce 1 PB of official CMS data to 1 TB of ntuple output for analysis. We present the progress of this two-year project with first results of scaling up Spark-based HEP analysis. For the second thrust, we present studies on using Apache Spark for a CMS dark matter physics search, comparing Spark's feasibility, usability, and performance to the ROOT-based analysis.
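
    The first thrust (reducing centrally produced datasets to analysis ntuples) maps naturally onto Spark's DataFrame filter/select idiom. A minimal hedged sketch; the column names, cut values, and paths are invented, and only the generic pyspark.sql API is assumed:

        from pyspark.sql import SparkSession

        spark = (SparkSession.builder
                 .appName("cms-data-reduction-sketch")
                 .getOrCreate())

        # Hypothetical input: events already converted to a columnar format.
        events = spark.read.parquet("hdfs:///cms/aod/part-*.parquet")

        # Event selection for an invented dark-matter-like signature:
        # large missing transverse energy and one energetic jet.
        reduced = (events
                   .filter(events.missing_et > 200.0)
                   .filter(events.leading_jet_pt > 100.0)
                   .select("run", "lumi", "event",
                           "missing_et", "leading_jet_pt"))

        reduced.write.mode("overwrite").parquet("hdfs:///user/me/ntuple")
        spark.stop()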

  19. MAGIC: A Tool for Combining, Interpolating, and Processing Magnetograms

    NASA Technical Reports Server (NTRS)

    Allred, Joel

    2012-01-01

    Transients in the solar coronal magnetic field are ultimately the source of space weather. Models that seek to track the evolution of the coronal field require magnetogram images as boundary conditions. These magnetograms are obtained by numerous instruments with different cadences and resolutions. A tool is required that allows modelers to find all available data and use them to craft accurate and physically consistent boundary conditions for their models. We have developed a software tool, MAGIC (MAGnetogram Interpolation and Composition), to perform exactly this function. MAGIC can manage the acquisition of magnetogram data, cast it into a source-independent format, and then perform the necessary spatial and temporal interpolation to provide magnetic field values as requested onto model-defined grids. MAGIC has the ability to patch magnetograms from different sources together, providing a more complete picture of the Sun's field than is possible from single magnetograms. In doing this, care must be taken so as not to introduce nonphysical current densities along the seam between magnetograms. We have designed a method that minimizes these spurious current densities. MAGIC also includes a number of post-processing tools that can provide additional information to models. For example, MAGIC includes an interface to the DAVE4VM tool, which derives surface flow velocities from the time evolution of the surface magnetic field. MAGIC has been developed as an application of the KAMELEON data formatting toolkit developed by the CCMC.
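
    The regridding and temporal interpolation MAGIC automates can be illustrated generically with NumPy/SciPy: blend two magnetogram frames in time, then interpolate onto a finer model grid. The arrays below are synthetic; this is not MAGIC's algorithm, just the underlying operation.

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        # Two synthetic 4 x 4 magnetograms (G) at times t0 = 0 h and t1 = 6 h.
        b_t0 = np.random.default_rng(2).normal(0.0, 50.0, size=(4, 4))
        b_t1 = b_t0 + 10.0   # pretend the field strengthened uniformly

        # Linear interpolation in time to t = 2 h.
        w = 2.0 / 6.0
        b_t = (1.0 - w) * b_t0 + w * b_t1

        # Spatial interpolation onto a finer model grid.
        lat = lon = np.linspace(0.0, 1.0, 4)
        interp = RegularGridInterpolator((lat, lon), b_t)
        fine = np.stack(np.meshgrid(np.linspace(0, 1, 8),
                                    np.linspace(0, 1, 8),
                                    indexing="ij"), axis=-1)
        b_fine = interp(fine.reshape(-1, 2)).reshape(8, 8)
        print(b_fine.shape)   # -> (8, 8)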

  20. A journey into medical physics as viewed by a physicist

    NASA Astrophysics Data System (ADS)

    Gueye, Paul

    2007-03-01

    The world of physics is usually linked to a large variety of subjects spanning astrophysics, nuclear/high-energy physics, materials and optical sciences, plasma physics, etc. Less is known about the exciting world of medical physics, which includes radiation therapy physics, medical diagnostic and imaging physics, nuclear medicine physics, and medical radiation safety. These physicists are typically based in hospital departments of radiation oncology or radiology and provide technical support for patient diagnosis and treatment in a clinical environment. This talk will focus on providing a bridge between selected areas of physics and their medical applications. The journey will first start from our understanding of high-energy beam production and transport beamlines for external-beam treatment of diseases (e.g., electron, gamma, X-ray and proton machines) as they relate to accelerator physics. We will then embrace the world of nuclear/high-energy physics, where detector development provides a unique tool for understanding the low-energy beam distribution emitted from radioactive sources used in the brachytherapy treatment modality. Because the ultimate goal of radiation-based therapy is its killing power on tumor cells, the next topic will be microdosimetry, where the responses of biological systems can be studied via electromagnetic systems. Finally, the impact on the imaging world will be discussed using tools heavily used in plasma physics, fluid mechanics, and Monte Carlo simulations. These various scientific areas provide unique opportunities for faculty and students at universities, as well as for staff from research centers and laboratories, to contribute to this field. We will conclude with the educational training related to medical physics programs.

  1. NOTE: Development of modified voxel phantoms for the numerical dosimetric reconstruction of radiological accidents involving external sources: implementation in SESAME tool

    NASA Astrophysics Data System (ADS)

    Courageot, Estelle; Sayah, Rima; Huet, Christelle

    2010-05-01

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. When the dose distribution is evaluated with a numerical anthropomorphic model, the posture and morphology of the victim have to be reproduced as realistically as possible. Several years ago, IRSN developed a specific software application, called the simulation of external source accident with medical images (SESAME), for the dosimetric reconstruction of radiological accidents by numerical simulation. This tool combines voxel geometry and the MCNP(X) Monte Carlo computer code for radiation-material interaction. This note presents a new functionality in this software that enables the modelling of a victim's posture and morphology based on non-uniform rational B-spline (NURBS) surfaces. The procedure for constructing the modified voxel phantoms is described, along with a numerical validation of this new functionality using a voxel phantom of the RANDO tissue-equivalent physical model.

  2. Development of modified voxel phantoms for the numerical dosimetric reconstruction of radiological accidents involving external sources: implementation in SESAME tool.

    PubMed

    Courageot, Estelle; Sayah, Rima; Huet, Christelle

    2010-05-07

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. When the dose distribution is evaluated with a numerical anthropomorphic model, the posture and morphology of the victim have to be reproduced as realistically as possible. Several years ago, IRSN developed a specific software application, called the simulation of external source accident with medical images (SESAME), for the dosimetric reconstruction of radiological accidents by numerical simulation. This tool combines voxel geometry and the MCNP(X) Monte Carlo computer code for radiation-material interaction. This note presents a new functionality in this software that enables the modelling of a victim's posture and morphology based on non-uniform rational B-spline (NURBS) surfaces. The procedure for constructing the modified voxel phantoms is described, along with a numerical validation of this new functionality using a voxel phantom of the RANDO tissue-equivalent physical model.

  3. Completeness and overlap in open access systems: Search engines, aggregate institutional repositories and physics-related open sources

    PubMed Central

    Wu, Tai-luan; Tseng, Ling-li

    2017-01-01

    This study examines the completeness and overlap of coverage in physics of six open access scholarly communication systems, including two search engines (Google Scholar and Microsoft Academic), two aggregate institutional repositories (OAIster and OpenDOAR), and two physics-related open sources (arXiv.org and Astrophysics Data System). The 2001–2013 Nobel Laureates in Physics served as the sample. Bibliographic records of their publications were retrieved and downloaded from each system, and a computer program was developed to perform the analytical tasks of sorting, comparison, elimination, aggregation and statistical calculations. Quantitative analyses and cross-referencing were performed to determine the completeness and overlap of the system coverage of the six open access systems. The results may enable scholars to select an appropriate open access system as an efficient scholarly communication channel, and academic institutions may build institutional repositories or independently create citation index systems in the future. Suggestions on indicators and tools for academic assessment are presented based on the comprehensiveness assessment of each system. PMID:29267327

  4. Statistical Analysis of a Class: Monte Carlo and Multiple Imputation Spreadsheet Methods for Estimation and Extrapolation

    ERIC Educational Resources Information Center

    Fish, Laurel J.; Halcoussis, Dennis; Phillips, G. Michael

    2017-01-01

    The Monte Carlo method and related multiple imputation methods are traditionally used in math, physics and science to estimate and analyze data and are now becoming standard tools in analyzing business and financial problems. However, few sources explain the application of the Monte Carlo method for individuals and business professionals who are…
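
    The core Monte Carlo recipe such articles teach (resample, recompute, and read the answer off the resulting distribution) fits in a few lines outside a spreadsheet as well. A sketch with invented class scores:

        import numpy as np

        rng = np.random.default_rng(3)
        observed = np.array([72, 85, 90, 64, 78, 88, 95, 70])  # invented

        # Bootstrap: resample the class with replacement, recompute the mean.
        means = np.array([rng.choice(observed, size=observed.size).mean()
                          for _ in range(10_000)])

        lo, hi = np.percentile(means, [2.5, 97.5])
        print(f"mean = {observed.mean():.1f}, "
              f"95% interval = ({lo:.1f}, {hi:.1f})")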

  5. Investigating the Magnetic Interaction with Geomag and Tracker Video Analysis: Static Equilibrium and Anharmonic Dynamics

    ERIC Educational Resources Information Center

    Onorato, P.; Mascheretti, P.; DeAmbrosis, A.

    2012-01-01

    In this paper, we describe how simple experiments, realizable using easily found and low-cost materials, allow students to explore the magnetic interaction quantitatively with the help of an Open Source Physics tool, the Tracker Video Analysis software. The static equilibrium of a "column" of permanent magnets is carefully investigated by…

  6. CHiCP: a web-based tool for the integrative and interactive visualization of promoter capture Hi-C datasets.

    PubMed

    Schofield, E C; Carver, T; Achuthan, P; Freire-Pritchett, P; Spivakov, M; Todd, J A; Burren, O S

    2016-08-15

    Promoter capture Hi-C (PCHi-C) allows the genome-wide interrogation of physical interactions between distal DNA regulatory elements and gene promoters in multiple tissue contexts. Visual integration of the resultant chromosome interaction maps with other sources of genomic annotations can provide insight into underlying regulatory mechanisms. We have developed Capture HiC Plotter (CHiCP), a web-based tool that allows interactive exploration of PCHi-C interaction maps and integration with both public and user-defined genomic datasets. CHiCP is freely accessible from www.chicp.org and supports most major HTML5-compliant web browsers. Full source code and installation instructions are available from http://github.com/D-I-L/django-chicp. Contact: ob219@cam.ac.uk. © The Author 2016. Published by Oxford University Press. All rights reserved.

  7. Physics and applications of positron beams in an integrated PET/MR.

    PubMed

    Watson, Charles C; Eriksson, Lars; Kolb, Armin

    2013-02-07

    In PET/MR systems having the PET component within the uniform magnetic field interior to the MR, positron beams can be injected into the PET field of view (FOV) from unshielded emission sources external to it, as a consequence of the action of the Lorentz force on the transverse components of the positron's velocity. Such beams may be as small as a few millimeters in diameter, but extend 50 cm or more axially without appreciable divergence. Larger beams form 'phantoms' of annihilations in air that can be easily imaged, and that are essentially free of γ-ray attenuation and scatter effects, providing a unique tool for characterizing PET systems and reconstruction algorithms. Thin targets intersecting these beams can produce intense annihilation sources having the thickness of a sheet of paper, which are very useful for high resolution measurements, and difficult to achieve with conventional sources. Targeted beams can provide other point, line and surface sources for various applications, all without the need to have radioactivity within the FOV. In this paper we discuss the physical characteristics of positron beams in air and present examples of their applications.
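
    The beam confinement follows from the Lorentz force: the transverse motion winds into a circle of gyroradius r = p_perp/(eB), which is why millimetre-scale beams persist over long axial distances. A quick order-of-magnitude check (the kinetic energy and field strength are illustrative):

        import math

        E0 = 510.999e3        # electron/positron rest energy, eV
        C = 299_792_458.0     # speed of light, m/s

        def gyroradius_mm(kinetic_ev, b_tesla):
            """Relativistic gyroradius of a positron moving perpendicular to B."""
            # pc in eV from the relativistic energy-momentum relation.
            pc = math.sqrt(kinetic_ev**2 + 2.0 * kinetic_ev * E0)
            # r = p / (e B); with momentum expressed as pc in eV, r = pc / (c B).
            return pc / (C * b_tesla) * 1000.0

        # A few-hundred-keV positron, fully transverse, in a 3 T field:
        # the orbit radius comes out below a millimetre.
        print(f"r = {gyroradius_mm(250e3, 3.0):.2f} mm")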

  8. HELIOGate, a Portal for the Heliophysics Community

    NASA Astrophysics Data System (ADS)

    Pierantoni, Gabriele; Carley, Eoin

    2014-10-01

    Heliophysics is the branch of physics that investigates the interactions between the Sun and the other bodies of the solar system. Heliophysicists rely on data collected from numerous sources scattered across the Solar System. The data collected from these sources are processed to extract metadata, and the metadata extracted in this fashion are then used to build indexes of features and events called catalogues. Heliophysicists also develop conceptual and mathematical models of the phenomena and the environment of the Solar System. More specifically, they investigate the physical characteristics of the phenomena and simulate how they propagate throughout the Solar System with mathematical and physical abstractions called propagation models. HELIOGate aims to address the need to combine and orchestrate existing web services in a flexible and easily configurable fashion to tackle different scientific questions. HELIOGate also offers a tool capable of connecting to sizeable computation and storage infrastructures to execute the data processing codes needed to calibrate raw data and extract metadata.

  9. FAST Modularization Framework for Wind Turbine Simulation: Full-System Linearization

    DOE PAGES

    Jonkman, Jason M.; Jonkman, Bonnie J.

    2016-10-03

    The wind engineering community relies on multiphysics engineering software to run nonlinear time-domain simulations, e.g., for design-standards-based loads analysis. Although most physics involved in wind energy are nonlinear, linearization of the underlying nonlinear system equations is often advantageous for understanding the system response and exploiting well-established methods and tools for analyzing linear systems. This paper presents the development and verification of the new linearization functionality of the open-source engineering tool FAST v8 for land-based wind turbines, as well as the concepts and mathematical background needed to understand and apply it correctly.
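
    Conceptually, linearization extracts the state-space matrices of the nonlinear system x' = f(x, u) about an operating point. The generic central-difference sketch below conveys the idea; it is not FAST's implementation, which the paper derives analytically module by module.

        import numpy as np

        def linearize(f, x0, u0, eps=1e-6):
            """State matrices A = df/dx, B = df/du at operating point (x0, u0)."""
            n, m = x0.size, u0.size
            A = np.empty((n, n))
            B = np.empty((n, m))
            for j in range(n):
                dx = np.zeros(n)
                dx[j] = eps
                A[:, j] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
            for j in range(m):
                du = np.zeros(m)
                du[j] = eps
                B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
            return A, B

        # Toy nonlinear "turbine": pendulum-like state with a control torque.
        f = lambda x, u: np.array([x[1],
                                   -9.81 * np.sin(x[0]) - 0.1 * x[1] + u[0]])
        A, B = linearize(f, x0=np.zeros(2), u0=np.zeros(1))
        print(A.round(3))   # expect [[0, 1], [-9.81, -0.1]]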

  10. FAST modularization framework for wind turbine simulation: full-system linearization

    NASA Astrophysics Data System (ADS)

    Jonkman, J. M.; Jonkman, B. J.

    2016-09-01

    The wind engineering community relies on multiphysics engineering software to run nonlinear time-domain simulations, e.g., for design-standards-based loads analysis. Although most physics involved in wind energy are nonlinear, linearization of the underlying nonlinear system equations is often advantageous for understanding the system response and exploiting well-established methods and tools for analyzing linear systems. This paper presents the development and verification of the new linearization functionality of the open-source engineering tool FAST v8 for land-based wind turbines, as well as the concepts and mathematical background needed to understand and apply it correctly.

  11. Deforming black hole and cosmological solutions by quasiperiodic and/or pattern forming structures in modified and Einstein gravity

    NASA Astrophysics Data System (ADS)

    Bubuianu, Laurenţiu; Vacaru, Sergiu I.

    2018-05-01

    We elaborate on the anholonomic frame deformation method (AFDM) for constructing exact solutions with quasiperiodic structure in modified gravity theories (MGTs) and general relativity (GR). Such solutions are described by generic off-diagonal metrics, nonlinear and linear connections, and (effective) matter sources with coefficients depending on all spacetime coordinates via corresponding classes of generation and integration functions. We study effective free-energy functionals and nonlinear evolution equations for generating off-diagonal quasiperiodic deformations of black hole and/or homogeneous cosmological metrics. The physical data for such functionals are set by different values of constants and prescribed symmetries defining quasiperiodic structures at cosmological scales, or astrophysical objects in nontrivial gravitational backgrounds, in forms similar to those found in condensed matter physics. It is shown how quasiperiodic structures determined by general nonlinear, or additive, functionals for generating functions and (effective) sources may transform black-hole-like configurations into cosmological metrics and inversely. We speculate on possible implications of quasiperiodic solutions in dark energy and dark matter physics. Finally, it is concluded that geometric methods for constructing exact solutions constitute an important alternative to numerical relativity for investigating nonlinear effects in astrophysics and cosmology.

  12. Development of students' conceptual thinking by means of video analysis and interactive simulations at technical universities

    NASA Astrophysics Data System (ADS)

    Hockicko, Peter; Krišťák, Ľuboš; Němec, Miroslav

    2015-03-01

    Video analysis using the program Tracker (Open Source Physics) introduces a new creative method of teaching physics into the educational process and makes the natural sciences more interesting for students. This way of exploring the laws of nature can amaze students, because this illustrative and interactive educational software inspires them to think creatively, improves their performance, and helps them in studying physics. This paper deals with increasing key competencies in engineering by analysing videos of real-life situations (physical problems) by means of video analysis and the modelling tools of the program Tracker, together with simulations of physical phenomena from the Physics Education Technology (PhET™) Project (the VAS method of problem tasks). Statistical testing using the t-test confirmed the significance of the differences in knowledge between the experimental and control groups that resulted from applying the interactive method.

  13. A conceptual physics class where students found meaning in calculations

    NASA Astrophysics Data System (ADS)

    Hull, Michael M.; Elby, Andrew

    2013-01-01

    Prior to taking a translated version of the Maryland Open Source Tutorials (OSTs) as a stand-alone course, most students at Tokyo Gakugei University in Japan had experienced physics as memorizing laws and equations to use as computational tools. We might expect this reformed physics class, which emphasizes common sense and conceptual reasoning and rarely invokes equations, to produce students who see a disconnect between equation use and intuitive/conceptual reasoning. Many students at Gakugei, however, somehow learned to integrate mathematics into their "constructivist" epistemologies of physics, even though OSTs do not emphasize this integration. Tadao, for example, came to see that although a common-sense solution to a problem is preferable for explaining to someone who doesn't know physics, solving the problem with a quantitative calculation (that connects to physical meaning) can bring clarity and concreteness to communication between experts. How this integration occurred remains an open question for future research.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shenoy, G. K.; Rohlsberger, R.; X-Ray Science Division

    Since its discovery, the Moessbauer effect has been one of the most powerful tools with broad applications in diverse areas of science and technology. With the advent of synchrotron radiation sources such as the Advanced Photon Source (APS), the European Synchrotron Radiation Facility (ESRF) and the Super Photon Ring-8 (SPring-8), the tool has enlarged its scope and delivered new capabilities. The techniques most generally used in the fields of materials physics, chemical physics, geoscience, and biology are hyperfine spectroscopy via elastic nuclear forward scattering (NFS), vibrational spectroscopy via nuclear inelastic scattering (NRIXS) and, to a lesser extent, diffusional dynamics from quasielastic nuclear forward scattering (QNFS). Looking ahead, new storage rings with enhanced brilliance, such as PETRA-III under construction at DESY, Hamburg, and PEP-III in its early design stage at SLAC, Stanford, will provide new and unique science opportunities. In the next two decades, x-ray free-electron lasers (XFELs), based both on self-amplified spontaneous emission (SASE-XFELs) and on a seed (SXFELs), with unique time structure, coherence, and a five to six orders of magnitude higher average brilliance, will truly revolutionize nuclear resonance applications. This overview briefly addresses the unique radiation characteristics of new sources on the horizon and provides a glimpse of scientific prospects and dreams in the nuclear resonance field from the new radiation sources. We anticipate expanded nuclear resonance research activity with applications such as spin and phonon mapping of single nanostructures and their assemblies, interfaces, and surfaces; spin dynamics; nonequilibrium dynamics; photochemical reactions; excited-state spectroscopy; and nonlinear phenomena.

  15. Putting tools in the toolbox: Development of a free, open-source toolbox for quantitative image analysis of porous media.

    NASA Astrophysics Data System (ADS)

    Iltis, G.; Caswell, T. A.; Dill, E.; Wilkins, S.; Lee, W. K.

    2014-12-01

    X-ray tomographic imaging of porous media has proven to be a valuable tool for investigating and characterizing the physical structure and state of both natural and synthetic porous materials, including glass bead packs, ceramics, soil, and rock. Given that most synchrotron facilities have user programs that grant academic researchers access to facilities and x-ray imaging equipment free of charge, a key hindrance for small research groups interested in conducting x-ray imaging experiments is the financial cost associated with post-experiment data analysis. While the cost of high-performance computing hardware continues to decrease, the expense of licensing commercial software packages for quantitative image analysis continues to increase, with current prices as high as $24,000 USD for a single-user license. As construction of the nation's newest synchrotron accelerator nears completion, a significant effort is being made here at the National Synchrotron Light Source II (NSLS-II), Brookhaven National Laboratory (BNL), to provide an open-source, experiment-to-publication toolbox that reduces the financial and technical 'activation energy' required for performing sophisticated quantitative analysis of multidimensional porous media data sets collected using cutting-edge x-ray imaging techniques. Implementation focuses on leveraging existing open-source projects and developing additional tools for quantitative analysis. We will present an overview of the software suite in development at BNL, including major design decisions, a demonstration of several test cases illustrating currently available quantitative tools for the analysis and characterization of multidimensional porous media image data sets, and plans for future development.
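
    A representative quantitative task for such a toolbox is computing porosity from a segmented tomographic volume: threshold, then count pore voxels. A hedged NumPy sketch on synthetic data (the threshold and image statistics are invented, and this is not the BNL code):

        import numpy as np

        rng = np.random.default_rng(4)
        # Synthetic reconstruction: bright grains, dark pore space.
        volume = rng.normal(120.0, 30.0, size=(64, 64, 64))

        # Segment with a global threshold (real pipelines would add Otsu
        # thresholding, filtering, etc.); voxels below it count as pores.
        threshold = 100.0
        pores = volume < threshold

        porosity = pores.mean()   # void fraction of the whole volume
        print(f"porosity = {porosity:.3f}")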

  16. Surface modification of ferritic steels using MEVVA and duoplasmatron ion sources

    NASA Astrophysics Data System (ADS)

    Kulevoy, Timur V.; Chalyhk, Boris B.; Fedin, Petr A.; Sitnikov, Alexey L.; Kozlov, Alexander V.; Kuibeda, Rostislav P.; Andrianov, Stanislav L.; Orlov, Nikolay N.; Kravchuk, Konstantin S.; Rogozhkin, Sergey V.; Useinov, Alexey S.; Oks, Efim M.; Bogachev, Alexey A.; Nikitin, Alexander A.; Iskandarov, Nasib A.; Golubev, Alexander A.

    2016-02-01

    The Metal Vapor Vacuum Arc (MEVVA) ion source (IS) is a unique tool for the production of high-intensity metal ion beams that can be used for material surface modification. On the other hand, the duoplasmatron ion source provides high-intensity gas ion beams. The MEVVA and duoplasmatron ion sources developed at the Institute for Theoretical and Experimental Physics were used for reactor steel surface modification experiments. The response of ferritic-martensitic steel specimens to titanium and nitrogen ion implantation and subsequent vacuum annealing was investigated. An increase in the microhardness of the near-surface region of irradiated specimens was observed. Local chemical analysis shows atom mixing and redistribution in the implanted layer, followed by the formation of ultrafine precipitates after annealing.

  17. A Roadmap to Fundamental Physics from LISA EMRI Observations

    NASA Astrophysics Data System (ADS)

    Sopuerta, Carlos F.

    2010-09-01

    The Laser Interferometer Space Antenna (LISA) is a future space-based gravitational-wave observatory (a joint mission between the European Space Agency and the US National Aeronautics and Space Administration) expected to be launched during the next decade. It will operate in the low-frequency gravitational-wave band, probably the richest part of the gravitational-wave spectrum in terms of science potential, where we find: massive black hole mergers as the outcome of galaxy collisions; many galactic compact binaries; the capture and subsequent inspiral of stellar compact objects into massive black holes; and gravitational-wave signatures from early-universe physical processes connected to high-energy physics and physics not yet fully understood. In this article we focus on the third type of source, the so-called extreme-mass-ratio inspirals, a high-precision tool for gravitational-wave astronomy that can be used, among other things, to advance our understanding of fundamental physics questions such as the nature and structure of black holes and the details of the gravitational interaction in regimes not yet probed by other experiments or observatories. Here, we give an account of some of the progress made in developing tools to exploit the future LISA EMRI observations, we discuss what scientific questions we can try to answer from this information, and, finally, we discuss the main theoretical challenges we face in developing all the tools necessary to maximize the scientific outcome, along with some avenues that can be followed to make progress in the near future.

  18. Guide to NavyFOAM V1.0

    DTIC Science & Technology

    2011-04-01

    NavyFOAM has been developed using an open-source CFD software toolkit (OpenFOAM) that draws heavily upon object-oriented programming. The ... numerical methods and the physical models in the original version of OpenFOAM have been upgraded in an effort to improve accuracy and robustness of ... Subject terms: computational fluid dynamics (CFD), OpenFOAM, Object Oriented Programming (OOP), NavyFOAM.

  19. The Use of Interactive Environments to Promote Self-Regulation in Online Learning: A Literature Review

    ERIC Educational Resources Information Center

    Delen, Erhan; Liew, Jeffrey

    2016-01-01

    Distance education in the 21st century often relies on educational technology as the primary delivery of teaching to learners. In distance education, the source of the information and the learner do not share the same physical setting; therefore, the information is delivered by a variety of methods. The new emerging tools that are used in online…

  20. Trident: A Universal Tool for Generating Synthetic Absorption Spectra from Astrophysical Simulations

    NASA Astrophysics Data System (ADS)

    Hummels, Cameron B.; Smith, Britton D.; Silvia, Devin W.

    2017-09-01

    Hydrodynamical simulations are increasingly able to accurately model physical systems on stellar, galactic, and cosmological scales; however, the utility of these simulations is often limited by our ability to directly compare them with the data sets produced by observers: spectra, photometry, etc. To address this problem, we have created trident, a Python-based open-source tool for post-processing hydrodynamical simulations to produce synthetic absorption spectra and related data. trident can (i) create absorption-line spectra for any trajectory through a simulated data set, mimicking both background-quasar and down-the-barrel configurations; (ii) reproduce the spectral characteristics of common instruments like the Cosmic Origins Spectrograph; (iii) operate across the ultraviolet, optical, and infrared using customizable absorption-line lists; (iv) trace simulated physical structures directly to spectral features; (v) approximate the presence of ion species absent from the simulation outputs; (vi) generate column density maps for any ion; and (vii) provide support for all major astrophysical hydrodynamical codes. trident was originally developed to aid in the interpretation of observations of the circumgalactic medium and intergalactic medium, but it remains a general tool applicable in other contexts.
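
    The essential spectral step, turning conditions along a ray into an absorption feature, can be illustrated generically: build an optical-depth profile and take flux = exp(-tau). The sketch below uses a single Gaussian-broadened line with invented parameters; it is a conceptual illustration, not trident's implementation.

        import numpy as np

        def absorption_spectrum(wave, line_center, tau0, sigma):
            """Normalized flux for one Gaussian absorption line.

            wave:        wavelength grid (Angstrom)
            line_center: line center shifted to the absorber redshift
            tau0:        optical depth at line center (set by column density)
            sigma:       Gaussian width from thermal/turbulent broadening
            """
            tau = tau0 * np.exp(-0.5 * ((wave - line_center) / sigma) ** 2)
            return np.exp(-tau)

        wave = np.linspace(1210.0, 1222.0, 600)
        # Invented absorber: Lyman-alpha at z = 0, tau0 = 2, 0.3 A width.
        flux = absorption_spectrum(wave, 1215.67, 2.0, 0.3)
        print(f"minimum flux = {flux.min():.3f}")   # exp(-2) ~ 0.135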

  1. Quantitative investigation of ligament strains during physical tests for sacroiliac joint pain using finite element analysis.

    PubMed

    Kim, Yoon Hyuk; Yao, Zhidong; Kim, Kyungsoo; Park, Won Man

    2014-06-01

    It may be assumed that joint stability is affected when some ligaments are injured or loosened, and that this instability causes sacroiliac joint pain. Several physical examinations have been used to diagnose sacroiliac pain and to isolate the source of the pain. However, more quantitative and objective information may be necessary to identify unstable or injured ligaments during these tests, given the limited understanding of the quantitative relationship between the physical tests and the biomechanical parameters that may be related to pain in the sacroiliac joint and the surrounding ligaments. In this study, a three-dimensional finite element model of the sacroiliac joint was developed, and the biomechanical conditions of six typical physical tests (the compression test, distraction test, sacral apex pressure test, thigh thrust test, Patrick's test, and Gaenslen's test) were modelled. The sacroiliac joint contact pressure and ligament strain were investigated for each test. The values of contact pressure and the combination of most highly strained ligaments differed markedly among the tests. Therefore, these findings in combination with the physical tests would be helpful to identify the pain source and to understand the pain mechanism. Moreover, the technology provided in this study might be a useful tool to evaluate the physical tests, to improve the present test protocols, or to develop a new physical test protocol.

  2. RADIOLOGICAL SEALED SOURCE LIBRARY: A NUCLEAR FORENSICS TOOL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Canaday, Jodi; Chamberlain, David; Finck, Martha

    If a terrorist were to obtain and possibly detonate a device that contained radiological material, radiological forensic analysis of the material and source capsule could provide law enforcement with valuable clues about the origin of the radiological material; this information could then provide further leads on where the material and sealed source were obtained and where control of them was lost. This information could potentially be utilized for attribution and prosecution. Analyses of nuclear forensic signatures for radiological materials are generally understood to include isotopic ratios, trace element concentrations, the time since irradiation or purification, and morphology. Radiological forensic signatures for sealed sources provide additional information that leverages the physical design and chemical composition of the source capsule and containers, as well as physical markings indicative of an owner or manufacturer. Argonne National Laboratory (Argonne), in collaboration with Idaho National Laboratory (INL), has been working since 2003 to understand signatures that could be used to identify specific source manufacturers. These signatures include the materials from which the capsule is constructed, dimensions, weld details, elemental composition, and isotopic abundances of the radioactive material. These signatures have been compiled in a library known as the Argonne/INL Radiological Sealed Source Library. Data collected for the library have included open-source information from vendor catalogs and web pages; discussions with source manufacturers and tours of production facilities (both protected through non-disclosure agreements); technical publications; and government registries such as the U.S. Nuclear Regulatory Commission’s Sealed Source and Device Registry.

  3. The magic words: Using computers to uncover mental associations for use in magic trick design.

    PubMed

    Williams, Howard; McOwan, Peter W

    2017-01-01

    The use of computational systems to aid in the design of magic tricks has been previously explored. Here further steps are taken in this direction, introducing the use of computer technology as a natural language data sourcing and processing tool for magic trick design purposes. Crowd sourcing of psychological concepts is investigated; further, the role of human associative memory and its exploitation in magical effects is explored. A new trick is developed and evaluated: a physical card trick partially designed by a computational system configured to search for and explore conceptual spaces readily understood by spectators.

  4. X-ray observations of Galactic H.E.S.S. sources: an update

    NASA Astrophysics Data System (ADS)

    Puehlhofer, G.; Eger, P.; Sasaki, M.; Gottschall, D.; Capasso, M.; H. E. S. S. Collaboration

    2016-06-01

    X-ray diagnostics of TeV sources continue to be an important tool for identifying the nature of newly detected sources and for pinpointing the physical processes at work in these highly energetic objects. This contribution reviews recent studies that we have performed on TeV sources with H.E.S.S. and XMM-Newton, as well as other X-ray facilities. We mainly focus on Galactic objects such as gamma-ray binaries, pulsar wind nebulae, and supernova remnants (SNRs). Particular emphasis is given to SNR studies, including recently identified SNRs such as HESS J1731-347 and HESS J1534-571, as well as a revisit of RX J1713.7-3946.

  5. Nonuniform Liouville transformers for quasi-homogeneous optical fields. Final technical report, September 25, 1989--January 22, 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jannson, T.

    1993-03-01

    During the last two decades, there have been dramatic improvements in the development of optical sources. Examples of this development range from semiconductor laser diodes to free electron beam lasers and synchrotron radiation. Before these developments, standards for the measurement of basic optical parameters (quantities) were less demanding. Now, however, there is a fundamental need for new, reliable methods for providing fast quantitative results for a very broad variety of optical systems and sources. This is particularly true for partially coherent optical beams, since all optical sources are either fully or partially spatially coherent (including Lambertian sources). Until now, there has been no satisfactory solution to this problem. During the last two decades, however, the foundations of physical radiometry have been developed by Walther, Wolf and co-workers. By integrating physical optics, statistical optics and conventional radiometry, this body of work provides the necessary tools for the evaluation of radiometric quantities for partially coherent optical beams propagating through optical systems. In this program, Physical Optics Corporation (POC) demonstrated the viability of such a radiometric approach for the specific case of generalized energy concentrators called Liouville transformers. We believe that this radiometric approach is necessary to fully characterize any type of optical system since it takes into account the partial coherence of radiation. 90 refs., 57 figs., 4 tabs.

  6. Science data visualization in planetary and heliospheric contexts with 3DView

    NASA Astrophysics Data System (ADS)

    Génot, V.; Beigbeder, L.; Popescu, D.; Dufourg, N.; Gangloff, M.; Bouchemit, M.; Caussarieu, S.; Toniutti, J.-P.; Durand, J.; Modolo, R.; André, N.; Cecconi, B.; Jacquey, C.; Pitout, F.; Rouillard, A.; Pinto, R.; Erard, S.; Jourdane, N.; Leclercq, L.; Hess, S.; Khodachenko, M.; Al-Ubaidi, T.; Scherf, M.; Budnik, E.

    2018-01-01

    We present a 3D orbit viewer application capable of displaying science data. 3DView, a web tool designed by the French Plasma Physics Data Center (CDPP) for the planetology and heliophysics community, has extended functionalities to render space physics data (observations and models alike) in their original 3D context. Time series, vectors, dynamic spectra, celestial body maps, magnetic field or flow lines, 2D cuts in simulation cubes, etc., are among the variety of data representations enabled by 3DView. The direct connection to several large databases, the use of VO standards and the possibility to upload user data make 3DView a versatile tool able to cover a wide range of space physics contexts. The code is open source and the software is regularly used at Master's degree level and in summer schools for pedagogical purposes. The present paper describes the general architecture and all major functionalities, and offers several science cases (simulation rendering, mission preparation, etc.) which can be easily replayed by interested readers.

  7. Reviews

    NASA Astrophysics Data System (ADS)

    2006-01-01

    WE RECOMMEND GLX Xplorer Datalogger This hand-held device offers great portability and robustness. Theoretical Concepts in Physics A first-rate reference tool for physics teachers. Do Your Ears Pop in Space? This little gem gives a personal insight into space travel. Full Moon A collection of high-quality photographs from the Apollo missions. The Genius of Science A collection of memories from leading 20th-century physicists. The Simple Science of Flight An excellent source of facts and figures about flight. SUREHigherPhysics This simulation-based software complies with Higher physics. Interactive Physics A programme that makes building simulations quick and easy. WORTH A LOOK Astronomical Enigmas This guide to enigmas could be a little shorter. HANDLE WITH CARE Standing-wave machine This is basically a standing-wave generator with a built-in strobe. WEB WATCH Sounds Amazing is a fantastic site, aimed at Key Stage 4 pupils, for learning about sound and waves.

  8. GAMBIT: the global and modular beyond-the-standard-model inference tool

    NASA Astrophysics Data System (ADS)

    Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Buckley, Andy; Chrząszcz, Marcin; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Dickinson, Hugh; Edsjö, Joakim; Farmer, Ben; Gonzalo, Tomás E.; Jackson, Paul; Krislock, Abram; Kvellestad, Anders; Lundberg, Johan; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Raklev, Are; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Savage, Christopher; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; White, Martin; Wild, Sebastian

    2017-11-01

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily-extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org.

  9. GIS-Based Noise Simulation Open Source Software: N-GNOIS

    NASA Astrophysics Data System (ADS)

    Vijay, Ritesh; Sharma, A.; Kumar, M.; Shende, V.; Chakrabarti, T.; Gupta, Rajesh

    2015-12-01

    Geographical information system (GIS)-based noise simulation software (N-GNOIS) has been developed to simulate noise scenarios due to point and mobile sources, considering the impact of geographical features and meteorological parameters. These are addressed in the software through attenuation modules for the atmosphere, vegetation and barriers. N-GNOIS is a user-friendly, platform-independent and Open Geospatial Consortium (OGC)-compliant software package. It has been developed using open source technology (QGIS) and an open source language (Python). N-GNOIS has unique features such as the cumulative impact of point and mobile sources, building structures, and honking due to traffic. Honking is a very common phenomenon in developing countries and is frequently observed on all types of roads. N-GNOIS also helps in designing physical barriers and vegetation cover to check the propagation of noise, and acts as a decision-making tool for planning and management of the noise component in environmental impact assessment (EIA) studies.
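
    The point-source attenuation logic described above can be sketched with textbook terms (spherical spreading plus linear atmospheric absorption and fixed insertion losses); the formula and coefficients below are generic illustrations, not N-GNOIS's actual modules.

      import numpy as np

      def point_source_level(lw_db, distance_m, alpha_db_per_m=0.005,
                             a_vegetation_db=0.0, a_barrier_db=0.0):
          """Received sound pressure level from a point source.

          Geometric spreading (20*log10(d) + 11 for spherical divergence)
          plus simple linear atmospheric absorption and fixed vegetation and
          barrier insertion losses, in the spirit of the modules above.
          """
          spreading = 20.0 * np.log10(distance_m) + 11.0
          atmosphere = alpha_db_per_m * distance_m
          return lw_db - spreading - atmosphere - a_vegetation_db - a_barrier_db

      # Example: a 100 dB source heard 200 m away behind a 5 dB barrier
      print(point_source_level(100.0, 200.0, a_barrier_db=5.0))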

  10. Development of an interpretive simulation tool for the proton radiography technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levy, M. C., E-mail: levymc@stanford.edu; Lawrence Livermore National Laboratory, Livermore, California 94551; Ryutov, D. D.

    2015-03-15

    Proton radiography is a useful diagnostic of high energy density (HED) plasmas under active theoretical and experimental development. In this paper, we describe a new simulation tool that interacts realistic laser-driven point-like proton sources with three-dimensional electromagnetic fields of arbitrary strength and structure and synthesizes the associated high resolution proton radiograph. The present tool's numerical approach captures all relevant physics effects, including those related to the formation of caustics. Electromagnetic fields can be imported from particle-in-cell or hydrodynamic codes in a streamlined fashion, and a library of electromagnetic field “primitives” is also provided. This latter capability allows users to add a primitive, modify the field strength, rotate a primitive, and so on, while quickly generating a high resolution radiograph at each step. In this way, our tool enables the user to deconstruct features in a radiograph and interpret them in connection to specific underlying electromagnetic field elements. We show an example application of the tool in connection to experimental observations of the Weibel instability in counterstreaming plasmas, using ∼10⁸ particles generated from a realistic laser-driven point-like proton source, imaging fields which cover volumes of ∼10 mm³. Insights derived from this application show that the tool can support understanding of HED plasmas.
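
    A toy version of this synthesis illustrates the idea: protons from a point-like source receive a small transverse kick from a prescribed field "primitive" in a thin region, then stream ballistically to a detector where their fluence is histogrammed. The geometry and field values below are invented for illustration and are not the tool's defaults.

      import numpy as np

      q, m = 1.602e-19, 1.673e-27                 # proton charge (C), mass (kg)
      E_k = 14.7e6 * 1.602e-19                    # 14.7 MeV kinetic energy (J)
      v = np.sqrt(2.0 * E_k / m)                  # non-relativistic speed (m/s)

      rng = np.random.default_rng(0)
      n = 200_000
      theta = 0.05 * np.sqrt(rng.uniform(0, 1, n))   # source cone half-angle (rad)
      phi = rng.uniform(0.0, 2.0 * np.pi, n)
      L_obj, L_det, dz = 1e-2, 1e-1, 1e-3         # source-object, object-detector, field depth (m)

      x = L_obj * theta * np.cos(phi)             # position at the object plane
      y = L_obj * theta * np.sin(phi)
      r = np.hypot(x, y) + 1e-12

      # Azimuthal magnetic-field "primitive" around the propagation axis
      B0, a = 20.0, 1e-3                          # field amplitude (T), radial scale (m)
      B_phi = B0 * (r / a) * np.exp(-(r / a) ** 2)
      dv_r = q * B_phi * dz / m                   # radial |v x B| impulse over dz

      # Ballistic projection to the detector: magnification plus deflection
      mag = 1.0 + L_det / L_obj
      x_det = x * mag + (x / r) * (dv_r / v) * L_det
      y_det = y * mag + (y / r) * (dv_r / v) * L_det
      image, _, _ = np.histogram2d(x_det, y_det, bins=256)  # synthetic radiograph
      print(image.max(), image.sum())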

  11. Sustaining an Online, Shared Community Resource for Models, Robust Open source Software Tools and Data for Volcanology - the Vhub Experience

    NASA Astrophysics Data System (ADS)

    Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.

    2015-12-01

    Over the last 5 years we have created a community collaboratory Vhub.org [Palma et al, J. App. Volc. 3:2 doi:10.1186/2191-5040-3-2] as a place to find volcanology-related resources, and a venue for users to disseminate tools, teaching resources, data, and an online platform to support collaborative efforts. As the community (current active users > 6000 from an estimated community of comparable size) embeds the tools in the collaboratory into educational and research workflows it became imperative to: a) redesign tools into robust, open source reusable software for online and offline usage/enhancement; b) share large datasets with remote collaborators and other users seamlessly with security; c) support complex workflows for uncertainty analysis, validation and verification and data assimilation with large data. The focus on tool development/redevelopment has been twofold - firstly to use best practices in software engineering and new hardware like multi-core and graphic processing units. Secondly we wish to enhance capabilities to support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, validation. Among software engineering practices we practice are open source facilitating community contributions, modularity and reusability. Our initial targets are four popular tools on Vhub - TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation driven data sets e.g. digital elevation models of topography, satellite imagery, field observations on deposits etc. These data are often maintained in private repositories that are privately shared by "sneaker-net". As a partial solution to this we tested mechanisms using irods software for online sharing of private data with public metadata and access limits. Finally, we adapted use of workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for usage like uncertainty quantification for hazard analysis using physical models.

  12. PlasmaPy: initial development of a Python package for plasma physics

    NASA Astrophysics Data System (ADS)

    Murphy, Nicholas; Leonard, Andrew J.; Stańczak, Dominik; Haggerty, Colby C.; Parashar, Tulasi N.; Huang, Yu-Min; PlasmaPy Community

    2017-10-01

    We report on initial development of PlasmaPy: an open source community-driven Python package for plasma physics. PlasmaPy seeks to provide core functionality that is needed for the formation of a fully open source Python ecosystem for plasma physics. PlasmaPy prioritizes code readability, consistency, and maintainability while using best practices for scientific computing such as version control, continuous integration testing, embedding documentation in code, and code review. We discuss our current and planned capabilities, including features presently under development. The development roadmap includes features such as fluid and particle simulation capabilities, a Grad-Shafranov solver, a dispersion relation solver, atomic data retrieval methods, and tools to analyze simulations and experiments. We describe several ways to contribute to PlasmaPy. PlasmaPy has a code of conduct and is being developed under a BSD license, with a version 0.1 release planned for 2018. The success of PlasmaPy depends on active community involvement, so anyone interested in contributing to this project should contact the authors. This work was partially supported by the U.S. Department of Energy.
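
    Here is a small taste of the kind of core functionality described above, written against the formulary interface of later PlasmaPy releases (this abstract predates the v0.1 release, so the exact import paths are an assumption).

      import astropy.units as u
      from plasmapy.formulary import Alfven_speed, gyrofrequency

      B = 0.01 * u.T               # magnetic field strength
      n_i = 1e18 * u.m ** -3       # ion number density

      # Alfvén speed for a proton plasma, and the proton gyrofrequency
      print(Alfven_speed(B, n_i, ion="p+"))
      print(gyrofrequency(B, particle="p+"))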

  13. Deep, Broadband Spectral Line Surveys of Molecule-rich Interstellar Clouds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widicus Weaver, Susanna L.; Laas, Jacob C.; Zou, Luyao

    2017-09-01

    Spectral line surveys are an indispensable tool for exploring the physical and chemical evolution of astrophysical environments due to the vast amount of data that can be obtained in a relatively short amount of time. We present deep, broadband spectral line surveys of 30 interstellar clouds using two broadband λ = 1.3 mm receivers at the Caltech Submillimeter Observatory. This information can be used to probe the influence of physical environment on molecular complexity. We observed a wide variety of sources to examine the relative abundances of organic molecules as they relate to the physical properties of the source (i.e., temperature, density, dynamics, etc.). The spectra are highly sensitive, with noise levels ≤25 mK at a velocity resolution of ∼0.35 km s⁻¹. In the initial analysis presented here, column densities and rotational temperatures have been determined for the molecular species that contribute significantly to the spectral line density in this wavelength regime. We present these results and discuss their implications for complex molecule formation in the interstellar medium.

  14. Flux tubes in the QCD vacuum

    NASA Astrophysics Data System (ADS)

    Cea, Paolo; Cosmai, Leonardo; Cuteri, Francesca; Papa, Alessandro

    2017-06-01

    The hypothesis that the QCD vacuum can be modeled as a dual superconductor is a powerful tool to describe the distribution of the color field generated by a quark-antiquark static pair and, as such, can provide useful clues for the understanding of confinement. In this work we investigate, by lattice Monte Carlo simulations of the SU(3) pure gauge theory and of (2+1)-flavor QCD with physical mass settings, some properties of the chromoelectric flux tube at zero temperature and their dependence on the physical distance between the static sources. We draw some conclusions about the validity domain of the dual superconductor picture.

  15. Physics Mining of Multi-Source Data Sets

    NASA Technical Reports Server (NTRS)

    Helly, John; Karimabadi, Homa; Sipes, Tamara

    2012-01-01

    Powerful new parallel data mining algorithms can produce diagnostic and prognostic numerical models and analyses from observational data. These techniques yield higher-resolution measures than ever before of environmental parameters by fusing synoptic imagery and time-series measurements. These techniques are general, relevant to observational data including raster, vector, and scalar, and can be applied in all Earth- and environmental-science domains. Because they can be highly automated and are parallel, they scale to large spatial domains and are well suited to change and gap detection. This makes it possible to analyze spatial and temporal gaps in information, and facilitates within-mission replanning to optimize the allocation of observational resources. The basis of the innovation is the extension of a recently developed set of algorithms, packaged into MineTool, to multi-variate time-series data. MineTool is unique in that it automates the various steps of the data mining process, thus making it amenable to autonomous analysis of large data sets. Unlike techniques such as artificial neural nets, which yield a black-box solution, MineTool's outcome is always an analytical model in parametric form that expresses the output in terms of the input variables. This has the advantage that the derived equation can then be used to gain insight into the physical relevance and relative importance of the parameters and coefficients in the model. This is referred to as physics-mining of data. The capabilities of MineTool are extended to include both supervised and unsupervised algorithms, to handle multi-type data sets, and to run in parallel.
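
    The "analytical model in parametric form" idea can be illustrated (without MineTool itself) by fitting coefficients over a chosen basis, so the result is an inspectable equation rather than a black box; the data and basis below are invented for the sketch.

      import numpy as np

      rng = np.random.default_rng(1)
      t = np.linspace(0.0, 10.0, 200)
      y = 2.5 * np.sin(t) + 0.8 * t + rng.normal(0.0, 0.1, t.size)  # synthetic data

      # Linear least squares over the basis {sin t, t, 1}
      A = np.column_stack([np.sin(t), t, np.ones_like(t)])
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)
      a, b, c = coef

      # The output is an explicit, physically interpretable expression
      print(f"y(t) ≈ {a:.3f}·sin(t) + {b:.3f}·t + {c:.3f}")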

  16. The magic words: Using computers to uncover mental associations for use in magic trick design

    PubMed Central

    2017-01-01

    The use of computational systems to aid in the design of magic tricks has been previously explored. Here further steps are taken in this direction, introducing the use of computer technology as a natural language data sourcing and processing tool for magic trick design purposes. Crowd sourcing of psychological concepts is investigated; further, the role of human associative memory and its exploitation in magical effects is explored. A new trick is developed and evaluated: a physical card trick partially designed by a computational system configured to search for and explore conceptual spaces readily understood by spectators. PMID:28792941

  17. Mapping the literature of physical therapy.

    PubMed Central

    Wakiji, E M

    1997-01-01

    Physical therapy is a fast growing profession because of the aging population, medical advances, and the public's interest in health promotion. This study is part of the Medical Library Association (MLA) Nursing and Allied Health Resources Section's project to map the allied health literature. It identifies the core journals in physical therapy by analyzing the cited references of articles in two established physical therapy journals, Physical Therapy and Archives of Physical Medicine and Rehabilitation, during the period 1991 through 1993. This bibliometric analysis also determines the extent to which these journals are covered by the primary indexing sources, Allied and Alternative Medicine (AMED), the Cumulative Index to Nursing and Allied Health Literature, EMBASE, and MEDLINE. In this study, fourteen journals were found to supply one-third of all references studied. Ninety-five journals provided an additional third of the references. MEDLINE rated the highest as the indexing tool of choice for these 109 journals. The study results can assist in collection development decisions, advise physical therapists as to the best access to their core literature, and influence database producers to increase their coverage of the literature important to physical therapy. PMID:9285129

  18. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720, AoI 13, as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semi-empirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering the utility of each method or code as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interacting with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources, including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and a computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor-noise correlation model was developed from engine acoustic test results. This work provided several insights on potential approaches to reducing aircraft engine noise. Code development is described in this report, and those insights are discussed.

  19. Intermediate energy heavy ions: An emerging multi-disciplinary research tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alonso, J.R.

    1988-10-01

    In the ten years that beams of intermediate energy (≈50 MeV/amu ≤ E ≤ ≈2 GeV/amu) heavy ions (Z ≤ 92) have been available, an increasing number of new research areas have been opened up. Pioneering work at the Bevalac at the Lawrence Berkeley Laboratory, still the world's only source of the heaviest beams in this energy range, has led to the establishment of active programs in nuclear physics, atomic physics, cosmic ray physics, as well as biology and medicine, and industrial applications. The great promise for growth of these research areas has led to serious planning for new facilities capable of delivering such beams; several such facilities are now in construction around the world. 20 refs., 5 figs., 1 tab.

  20. The Community WRF-Hydro Modeling System Version 4 Updates: Merging Toward Capabilities of the National Water Model

    NASA Astrophysics Data System (ADS)

    McAllister, M.; Gochis, D.; Dugger, A. L.; Karsten, L. R.; McCreight, J. L.; Pan, L.; Rafieeinasab, A.; Read, L. K.; Sampson, K. M.; Yu, W.

    2017-12-01

    The community WRF-Hydro modeling system is publicly available and provides researchers and operational forecasters a flexible and extensible capability for multi-scale, multi-physics hydrologic modeling that can be run independently of, or fully interactively with, the WRF atmospheric model. The core WRF-Hydro physics model contains very high-resolution representations of terrestrial hydrologic processes such as land-atmosphere exchanges of energy and moisture, snowpack evolution, infiltration, terrain routing, channel routing, basic reservoir representation and hydrologic data assimilation. Complementing the core physics components of WRF-Hydro is an ecosystem of pre- and post-processing tools that facilitate the preparation of terrain and meteorological input data, an open-source hydrologic model evaluation toolset (Rwrfhydro), hydrologic data assimilation capabilities with DART, and advanced model visualization capabilities. The National Center for Atmospheric Research (NCAR), through collaborative support from the National Science Foundation and other funding partners, provides community support for the entire WRF-Hydro system through a variety of mechanisms. This presentation summarizes the enhanced user support capabilities that are being developed for the community WRF-Hydro modeling system. These products and services include a new website, open-source code repositories, documentation and user guides, test cases, online training materials, live hands-on training sessions, an email listserv, and individual user support via email through a new help-desk ticketing system. The WRF-Hydro modeling system and supporting tools, which now include re-gridding scripts and model calibration, have recently been updated to Version 4 and are merging toward the capabilities of the National Water Model.

  1. Bruno Touschek: From Betatrons to Electron-Positron Colliders

    NASA Astrophysics Data System (ADS)

    Bernardini, Carlo; Pancheri, Giulia; Pellegrini, Claudio

    Bruno Touschek's life as a physicist spanned the period from World War II to the 1970s. He was a key figure in the developments of electron-positron colliders and storage rings, and made important contributions to theoretical high energy physics. Storage rings, initially developed for high energy physics, are being widely used in many countries as synchrotron radiation sources and are a tool for research in physics, chemistry, biology, environmental sciences and cultural heritage studies. We describe Touschek's life in Austria, where he was born, in Germany, where he participated in the construction of a betatron during WWII, and in Italy, where he proposed and led to completion the first electron-positron storage ring in 1960, in Frascati. We highlight how his central European culture influenced his lifestyle and work, and his main contributions to physics, such as the discovery of the Touschek effect and beam instabilities in the larger storage ring ADONE.

  2. Bruno Touschek: From Betatrons to Electron-Positron Colliders

    NASA Astrophysics Data System (ADS)

    Bernardini, Carlo; Pancheri, Giulia; Pellegrini, Claudio

    Bruno Touschek’s life as a physicist spanned the period from World War II to the 1970s. He was a key figure in the developments of electron-positron colliders and storage rings, and made important contributions to theoretical high energy physics. Storage rings, initially developed for high energy physics, are being widely used in many countries as synchrotron radiation sources and are a tool for research in physics, chemistry, biology, environmental sciences and cultural heritage studies. We describe Touschek’s life in Austria, where he was born, in Germany, where he participated in the construction of a betatron during WWII, and in Italy, where he proposed and led to completion the first electron-positron storage ring in 1960, in Frascati. We highlight how his central European culture influenced his lifestyle and work, and his main contributions to physics, such as the discovery of the Touschek effect and beam instabilities in the larger storage ring ADONE.

  3. Trident: A Universal Tool for Generating Synthetic Absorption Spectra from Astrophysical Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hummels, Cameron B.; Smith, Britton D.; Silvia, Devin W.

    Hydrodynamical simulations are increasingly able to accurately model physical systems on stellar, galactic, and cosmological scales; however, the utility of these simulations is often limited by our ability to directly compare them with the data sets produced by observers: spectra, photometry, etc. To address this problem, we have created trident, a Python-based open-source tool for post-processing hydrodynamical simulations to produce synthetic absorption spectra and related data. trident can (i) create absorption-line spectra for any trajectory through a simulated data set, mimicking both background quasar and down-the-barrel configurations; (ii) reproduce the spectral characteristics of common instruments like the Cosmic Origins Spectrograph; (iii) operate across the ultraviolet, optical, and infrared using customizable absorption-line lists; (iv) trace simulated physical structures directly to spectral features; (v) approximate the presence of ion species absent from the simulation outputs; (vi) generate column density maps for any ion; and (vii) provide support for all major astrophysical hydrodynamical codes. trident was originally developed to aid in the interpretation of observations of the circumgalactic medium and intergalactic medium, but it remains a general tool applicable in other contexts.

  4. ‘Gamma Anna’: a classroom demonstration for teaching the concepts of gamma imaging

    NASA Astrophysics Data System (ADS)

    Wolff, Nicola; Griffiths, Jennifer; Yerworth, Rebecca

    2017-01-01

    Gamma imaging is at the interface of medicine and physics, and thus its teaching is important in both fields. Pedagogic literature highlights the benefits of interactive demonstrations in teaching: an increase in enjoyment and interest, as well as improvement in academic achievement. However, gamma imaging uses radioactive sources, which are potentially dangerous, and thus their use is tightly controlled. We have developed a demonstration which uses a localised exothermic reaction within a rag doll as an analogue of radioactivity. This can be safely used in classrooms to demonstrate the principles of gamma imaging. The tool is easy to make, cheap, robust and portable. The supplementary material in this paper gives teacher notes and a description of how to make the rag doll demonstrator. We have tested the tool using six participants, acting as ‘teachers’, who carried out the demonstration and described the doll as easy to use, and the ‘tumour’ as clearly identifiable. The teaching tool was separately demonstrated to a group of 12 GCSE physics students and a group of 12 medical students. Feedback showed increased student engagement, enjoyment and understanding of gamma imaging. Previous research has shown that these benefits have an impact on learning and academic outcomes.

  5. Investigating the effects of point source and nonpoint source pollution on the water quality of the East River (Dongjiang) in South China

    USGS Publications Warehouse

    Wu, Yiping; Chen, Ji

    2013-01-01

    Understanding the physical processes of point source (PS) and nonpoint source (NPS) pollution is critical to evaluate river water quality and identify major pollutant sources in a watershed. In this study, we used the physically based hydrological/water quality model, Soil and Water Assessment Tool, to investigate the influence of PS and NPS pollution on the water quality of the East River (Dongjiang in Chinese) in southern China. Our results indicate that NPS pollution was the dominant contribution (>94%) to nutrient loads except for mineral phosphorus (50%). A comprehensive Water Quality Index (WQI) computed using eight key water quality variables demonstrates that water quality is better upstream than downstream, despite the higher level of ammonium nitrogen found in upstream waters. Also, the temporal (seasonal) and spatial distributions of nutrient loads clearly indicate the critical time period (from late dry season to early wet season) and pollution source areas within the basin (middle and downstream agricultural lands), which resource managers can use to accomplish substantial reductions of NPS pollutant loadings. Overall, this study improves our understanding of the relationship between human activities and pollutant loads and further contributes to decision support for local watershed managers to protect water quality in this region. In particular, the methods presented, such as integrating the WQI with watershed modeling and identifying the critical time period and pollution source areas, can be valuable for other researchers worldwide.
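
    A WQI of the weighted-average kind referenced above can be sketched as follows; the variables, weights, and sub-index scores are placeholders, not the study's actual parameterization.

      def wqi(scores, weights):
          """Composite index from per-variable sub-index scores (0-100)."""
          total_w = sum(weights.values())
          return sum(scores[k] * weights[k] for k in scores) / total_w

      scores = {  # hypothetical sub-index scores for eight variables
          "DO": 85, "NH4-N": 60, "NO3-N": 70, "TP": 65,
          "BOD": 75, "pH": 90, "TSS": 80, "EC": 88,
      }
      weights = {k: 1.0 for k in scores}  # equal weights for the sketch
      print(f"WQI = {wqi(scores, weights):.1f}")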

  6. Three options for citation tracking: Google Scholar, Scopus and Web of Science.

    PubMed

    Bakkalbasi, Nisa; Bauer, Kathleen; Glover, Janis; Wang, Lei

    2006-06-29

    Researchers turn to citation tracking to find the most influential articles for a particular topic and to see how often their own published papers are cited. For years researchers looking for this type of information had only one resource to consult: the Web of Science from Thomson Scientific. In 2004 two competitors emerged: Scopus from Elsevier and Google Scholar from Google. The research reported here uses citation analysis in an observational study examining these three databases, comparing citation counts for articles from two disciplines (oncology and condensed matter physics) and two years (1993 and 2003) to test the hypothesis that the different scholarly publication coverage provided by the three search tools will lead to different citation counts from each. Eleven journal titles with varying impact factors were selected from each discipline (oncology and condensed matter physics) using the Journal Citation Reports (JCR). All articles published in the selected titles were retrieved for the years 1993 and 2003, and a stratified random sample of articles was chosen, resulting in four sets of articles. During the week of November 7-12, 2005, the citation counts for each research article were extracted from the three sources. The actual citing references for a subset of the articles published in 2003 were also gathered from each of the three sources. For oncology 1993 Web of Science returned the highest average number of citations, 45.3. Scopus returned the highest average number of citations (8.9) for oncology 2003. Web of Science returned the highest number of citations for condensed matter physics 1993 and 2003 (22.5 and 3.9 respectively). The data showed a significant difference in the mean citation rates between all pairs of resources except between Google Scholar and Scopus for condensed matter physics 2003. For articles published in 2003 Google Scholar returned the largest amount of unique citing material for oncology and Web of Science returned the most for condensed matter physics. This study did not identify any one of these three resources as the answer to all citation tracking needs. Scopus showed strength in providing citing literature for current (2003) oncology articles, while Web of Science produced more citing material for 2003 and 1993 condensed matter physics, and 1993 oncology articles. All three tools returned some unique material. Our data indicate that the question of which tool provides the most complete set of citing literature may depend on the subject and publication year of a given article.
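
    The comparison described above boils down to a paired test of mean citation counts for the same article sample across two sources; here is a minimal sketch with made-up counts, not the study's data.

      import numpy as np
      from scipy import stats

      wos    = np.array([12, 45, 3, 8, 22, 17, 5, 30, 9, 14])   # Web of Science
      scopus = np.array([15, 40, 4, 10, 25, 16, 7, 33, 11, 13]) # Scopus

      # Paired test, since both counts describe the same articles
      t, p = stats.ttest_rel(wos, scopus)
      print(f"mean WoS={wos.mean():.1f}, Scopus={scopus.mean():.1f}, p={p:.3f}")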

  7. Three options for citation tracking: Google Scholar, Scopus and Web of Science

    PubMed Central

    Bakkalbasi, Nisa; Bauer, Kathleen; Glover, Janis; Wang, Lei

    2006-01-01

    Background Researchers turn to citation tracking to find the most influential articles for a particular topic and to see how often their own published papers are cited. For years researchers looking for this type of information had only one resource to consult: the Web of Science from Thomson Scientific. In 2004 two competitors emerged – Scopus from Elsevier and Google Scholar from Google. The research reported here uses citation analysis in an observational study examining these three databases; comparing citation counts for articles from two disciplines (oncology and condensed matter physics) and two years (1993 and 2003) to test the hypothesis that the different scholarly publication coverage provided by the three search tools will lead to different citation counts from each. Methods Eleven journal titles with varying impact factors were selected from each discipline (oncology and condensed matter physics) using the Journal Citation Reports (JCR). All articles published in the selected titles were retrieved for the years 1993 and 2003, and a stratified random sample of articles was chosen, resulting in four sets of articles. During the week of November 7–12, 2005, the citation counts for each research article were extracted from the three sources. The actual citing references for a subset of the articles published in 2003 were also gathered from each of the three sources. Results For oncology 1993 Web of Science returned the highest average number of citations, 45.3. Scopus returned the highest average number of citations (8.9) for oncology 2003. Web of Science returned the highest number of citations for condensed matter physics 1993 and 2003 (22.5 and 3.9 respectively). The data showed a significant difference in the mean citation rates between all pairs of resources except between Google Scholar and Scopus for condensed matter physics 2003. For articles published in 2003 Google Scholar returned the largest amount of unique citing material for oncology and Web of Science returned the most for condensed matter physics. Conclusion This study did not identify any one of these three resources as the answer to all citation tracking needs. Scopus showed strength in providing citing literature for current (2003) oncology articles, while Web of Science produced more citing material for 2003 and 1993 condensed matter physics, and 1993 oncology articles. All three tools returned some unique material. Our data indicate that the question of which tool provides the most complete set of citing literature may depend on the subject and publication year of a given article. PMID:16805916

  8. The Human Exposure Model (HEM): A Tool to Support Rapid ...

    EPA Pesticide Factsheets

    The US EPA is developing an open and publicly available software program called the Human Exposure Model (HEM) to provide near-field exposure information for Life Cycle Impact Assessments (LCIAs). Historically, LCIAs have often omitted impacts from near-field sources of exposure. The use of consumer products often results in near-field exposures (exposures that occur directly from the use of a product) that are larger than environmentally mediated exposures (i.e. far-field sources) [1,2]. Failure to consider near-field exposures could result in biases in LCIA-based determinations of the relative sustainability of consumer products. HEM is designed to provide this information. Characterizing near-field sources of chemical exposures presents a challenge to LCIA practitioners. Unlike far-field sources, where multimedia mass balance models have been used to determine human exposure, near-field sources require product-specific models of human exposure and considerable information on product use and product composition. Such information is difficult and time-consuming to gather and curate. The HEM software will characterize the distribution of doses and product intake fractions [2] across populations of product users and bystanders, allowing for differentiation by various demographic characteristics. The tool incorporates a newly developed database of the composition of more than 17,000 products, data on physical and chemical properties for more than 2,000 chemicals, and mo
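
    Conceptually, the product intake fraction mentioned above is the mass of chemical taken in by all exposed individuals per unit mass of chemical in the product as used; a toy calculation with invented numbers follows.

      def intake_fraction(intake_per_person_mg, n_people, mass_in_product_mg):
          """Total intake across the exposed population per unit mass used."""
          return (intake_per_person_mg * n_people) / mass_in_product_mg

      # e.g. 2 users each taking in 0.5 mg from a product containing 1000 mg
      print(intake_fraction(0.5, 2, 1000.0))  # -> 0.001 (dimensionless)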

  9. Python Open source Waveform ExtractoR (POWER): an open source, Python package to monitor and post-process numerical relativity simulations

    NASA Astrophysics Data System (ADS)

    Johnson, Daniel; Huerta, E. A.; Haas, Roland

    2018-01-01

    Numerical simulations of Einstein’s field equations provide unique insights into the physics of compact objects moving at relativistic speeds and driven by strong gravitational interactions. Numerical relativity has played a key role in firmly establishing gravitational wave astrophysics as a new field of research, and it is now paving the way to establishing whether gravitational wave radiation emitted from compact binary mergers is accompanied by electromagnetic and astro-particle counterparts. As numerical relativity continues to blend in with routine gravitational wave data analyses to validate the discovery of gravitational wave events, it is essential to develop open source tools to streamline these studies. Motivated by our own experience as users and developers of the open source community software the Einstein Toolkit, we present an open source Python package that is ideally suited to monitor and post-process the data products of numerical relativity simulations, and to compute the gravitational wave strain at future null infinity in high performance environments. We showcase the application of this new package to post-process a large numerical relativity catalog and extract higher-order waveform modes from numerical relativity simulations of eccentric binary black hole mergers and neutron star mergers. This new software fills a critical void in the arsenal of tools provided by the Einstein Toolkit consortium to the numerical relativity community.
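
    The central post-processing step such a package performs, recovering the strain h from the Weyl scalar ψ4, is commonly done by fixed-frequency integration (FFI): since ψ4 is the second time derivative of h, one divides by -(2πf)² in the frequency domain, with |f| floored at a cutoff f0 to suppress low-frequency drift. The sketch below implements this generic technique, not necessarily POWER's exact algorithm.

      import numpy as np

      def ffi_strain(psi4, dt, f0):
          """Recover h(t) from psi4(t) via fixed-frequency integration."""
          freq = np.fft.fftfreq(psi4.size, d=dt)
          # Floor |f| at f0 to avoid amplifying near-zero frequencies
          f_clipped = np.where(np.abs(freq) < f0, np.copysign(f0, freq), freq)
          h_tilde = -np.fft.fft(psi4) / (2.0 * np.pi * f_clipped) ** 2
          return np.fft.ifft(h_tilde)

      # Self-check on an analytic signal: psi4 = d^2/dt^2 exp(i w t)
      dt = 0.1
      t = np.arange(0, 400, dt)
      w = 2 * np.pi * 32 / 400            # integer number of cycles in the window
      psi4 = -(w ** 2) * np.exp(1j * w * t)
      h = ffi_strain(psi4, dt, f0=0.02)
      print(np.allclose(h, np.exp(1j * w * t)))  # True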

  10. Collision of Physics and Software in the Monte Carlo Application Toolkit (MCATK)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sweezy, Jeremy Ed

    2016-01-21

    The topic is presented in a series of slides organized as follows: MCATK overview, development strategy, available algorithms, problem modeling (sources, geometry, data, tallies), parallelism, miscellaneous tools/features, an example MCATK application, recent areas of research, and summary and future work. MCATK is a C++ component-based Monte Carlo neutron-gamma transport software library with continuous energy neutron and photon transport. Designed to build specialized applications and to provide new functionality in existing general-purpose Monte Carlo codes like MCNP, it reads ACE-formatted nuclear data generated by NJOY. The motivation behind MCATK was to reduce costs. MCATK physics involves continuous energy neutron and gamma transport with multi-temperature treatment, static eigenvalue (k_eff and α) algorithms, a time-dependent algorithm, and fission chain algorithms. MCATK geometry includes mesh geometries and solid body geometries. MCATK provides verified, unit-tested Monte Carlo components, flexibility in Monte Carlo application development, and numerous tools such as geometry and cross section plotters.

  11. Bullying: Effects on School-Aged Children, Screening Tools, and Referral Sources.

    PubMed

    Fisher, Katie; Cassidy, Brenda; Mitchell, Ann M

    2017-01-01

    Bullying is not a new concept or behavior, and is now gaining national attention as a growing public health concern. Bullying leads to short- and long-term physical and psychological damage to both the victims and the bullies. The serious implications of bullying drive a clinical mandate for teachers and school nurses to be educated and adequately trained to identify and address bullying within schools. This review of the literature describes screening tools that can be utilized to identify both victims and bullies. In addition, referral services utilizing collaborative intervention measures are discussed. This literature review will help school nurses and teachers to identify and expand their role in school-wide bullying prevention and intervention measures.

  12. Open source Matrix Product States: Opening ways to simulate entangled many-body quantum systems in one dimension

    NASA Astrophysics Data System (ADS)

    Jaschke, Daniel; Wall, Michael L.; Carr, Lincoln D.

    2018-04-01

    Numerical simulations are a powerful tool for studying quantum systems beyond the exactly solvable ones, for which no analytic expression is available. For one-dimensional entangled quantum systems, tensor network methods, amongst them Matrix Product States (MPSs), have attracted interest from different fields of quantum physics ranging from solid state systems to quantum simulators and quantum computing. Our open source MPS code provides the community with a toolset to analyze the statics and dynamics of one-dimensional quantum systems. Here, we present our open source library, Open Source Matrix Product States (OSMPS), of MPS methods implemented in Python and Fortran2003. The library includes tools for ground state calculation and excited states via the variational ansatz. We also support ground states for infinite systems with translational invariance. Dynamics are simulated with different algorithms, including three algorithms with support for long-range interactions. Convenient features include built-in support for fermionic systems and number conservation with rotational U(1) and discrete Z2 symmetries for finite systems, as well as data parallelism with MPI. We explain the principles and techniques used in this library along with examples of how to efficiently use the general interfaces to analyze the Ising and Bose-Hubbard models. This description includes the preparation of simulations as well as dispatching and post-processing of them.
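
    To make the MPS data structure concrete, here is a generic numpy illustration (deliberately not the OSMPS API): a random open-boundary MPS for L sites of local dimension d and bond dimension D, with its norm ⟨ψ|ψ⟩ computed by contracting transfer matrices site by site.

      import numpy as np

      L, d, D = 8, 2, 4
      rng = np.random.default_rng(2)
      dims = [1] + [D] * (L - 1) + [1]        # open boundary bond dimensions
      mps = [rng.normal(size=(dims[i], d, dims[i + 1])) for i in range(L)]

      env = np.ones((1, 1))                   # left environment
      for A in mps:
          # env[a,b] * A[a,s,c] * conj(A)[b,s,e] -> new env[c,e]
          env = np.einsum("ab,asc,bse->ce", env, A, A.conj())
      print("norm^2 =", env.item())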

  13. The efficiency of geophysical adjoint codes generated by automatic differentiation tools

    NASA Astrophysics Data System (ADS)

    Vlasenko, A. V.; Köhl, A.; Stammer, D.

    2016-02-01

    The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, efficiency of memory usage, and the capability of each AD tool to handle modern Fortran 90-95 elements such as structures and pointers, which combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator-overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, the application of source transformation tools appears to be the most efficient choice, allowing even large geophysical data assimilation problems to be handled. However, they can only be applied to numerical models written in earlier generations of programming languages. Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the continued use of AD tools for solving geophysical problems on modern computer architectures.
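
    The operator-overloading approach compared above can be illustrated in a few lines with dual numbers, which carry a value together with its derivative through ordinary arithmetic; this is a language-agnostic sketch of the idea, not any of the four tools.

      import math

      class Dual:
          """Forward-mode AD value: carries f(x) and f'(x) together."""
          def __init__(self, val, dot=0.0):
              self.val, self.dot = val, dot
          def __add__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val + o.val, self.dot + o.dot)
          __radd__ = __add__
          def __mul__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val * o.val,
                          self.dot * o.val + self.val * o.dot)  # product rule
          __rmul__ = __mul__

      def sin(x):
          return Dual(math.sin(x.val), math.cos(x.val) * x.dot)  # chain rule

      # d/dx [x*sin(x) + 2x] at x = 1.3: seed the derivative with 1.0
      x = Dual(1.3, 1.0)
      y = x * sin(x) + 2 * x
      print(y.val, y.dot)  # value and exact derivative sin(x) + x*cos(x) + 2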

  14. Warfighter Integrated Physical Ergonomics Tool Development: Needs Analysis and State of the Art Review

    DTIC Science & Technology

    2011-03-01

    Griffon seat design assessments include questions of vibration … the suitability of alternative designs … performance measures … configurations to assess design and acquisition decisions, and more.

  15. ROOT.NET: Using ROOT from .NET languages like C# and F#

    NASA Astrophysics Data System (ADS)

    Watts, G.

    2012-12-01

    ROOT.NET provides an interface between Microsoft's Common Language Runtime (CLR) and .NET technology and the ubiquitous particle physics analysis tool, ROOT. ROOT.NET automatically generates a series of efficient wrappers around the ROOT API. Unlike pyROOT, these wrappers are statically typed and so are highly efficient as compared to the Python wrappers. The connection to .NET means that one gains access to the full series of languages developed for the CLR including functional languages like F# (based on OCaml). Many features that make ROOT objects work well in the .NET world are added (properties, IEnumerable interface, LINQ compatibility, etc.). Dynamic languages based on the CLR can be used as well, of course (Python, for example). Additionally it is now possible to access ROOT objects that are unknown to the translation tool. This poster will describe the techniques used to effect this translation, along with performance comparisons, and examples. All described source code is posted on the open source site CodePlex.

  16. A low-cost computer-controlled Arduino-based educational laboratory system for teaching the fundamentals of photovoltaic cells

    NASA Astrophysics Data System (ADS)

    Zachariadou, K.; Yiasemides, K.; Trougkakos, N.

    2012-11-01

    We present a low-cost, fully computer-controlled, Arduino-based educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental concepts of semiconductor physics by exploring the process of an experimental physics inquiry. The system runs under the Windows operating system and is composed of a data acquisition/control board, power supply and processing boards, sensing elements, a graphical user interface and data analysis software. The data acquisition/control board is based on the Arduino open source electronics prototyping platform. The graphical user interface and the communication with the Arduino are developed in the C# and C++ programming languages, respectively, using the Microsoft Visual Studio 2010 Professional IDE, which is freely available to students. Finally, the data analysis is performed using the open source, object-oriented framework ROOT. Currently the system supports five teaching activities, each one corresponding to an independent tab in the user interface. SolarInsight has been partially developed in the context of a diploma thesis conducted within the Technological Educational Institute of Piraeus under the co-supervision of the Physics and Electronic Computer Systems departments’ academic staff.
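
    As a hypothetical sketch of the host-side acquisition such a system performs, suppose the Arduino streams comma-separated voltage/current pairs over USB serial; the port name, baud rate, and line format below are assumptions, not SolarInsight's actual protocol (which uses a C#/C++ stack).

      import serial  # pyserial

      with serial.Serial("/dev/ttyACM0", 9600, timeout=2) as ser:  # assumed port/baud
          iv_points = []
          for _ in range(50):                      # collect 50 I-V samples
              line = ser.readline().decode().strip()
              if not line:
                  continue
              v, i = (float(x) for x in line.split(","))  # assumed "V,I" format
              iv_points.append((v, i))

      # Maximum power point of the photovoltaic cell from the sampled curve
      v_mpp, i_mpp = max(iv_points, key=lambda p: p[0] * p[1])
      print(f"P_max ≈ {v_mpp * i_mpp:.3f} W at {v_mpp:.2f} V")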

  17. Modulating the Neutron Flux from a Mirror Neutron Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryutov, D D

    2011-09-01

    A 14-MeV neutron source based on a Gas-Dynamic Trap will provide a high flux of 14 MeV neutrons for fusion materials and sub-component testing. In addition to its main goal, the source has potential applications in condensed matter physics and biophysics. In this report, the author considers adding one more capability to the GDT-based neutron source: modulation of the neutron flux at a desired frequency. The modulation may be an enabling tool for assessing the role of non-steady-state effects in fusion devices, as well as for high-precision, low-signal basic science experiments favoring the use of the synchronous detection technique. A conclusion is drawn that a modulation frequency of up to 1 kHz and a modulation amplitude of a few percent are achievable. Limitations on the amplitude of modulations at higher frequencies are discussed.
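
    The synchronous (lock-in) detection that such a modulated source enables can be demonstrated in a few lines: mix the noisy signal with in-phase and quadrature references at the modulation frequency and average. The numbers below are illustrative only.

      import numpy as np

      fs, f_mod, T = 100_000.0, 1_000.0, 1.0     # sample rate, modulation, duration
      t = np.arange(0, T, 1 / fs)
      rng = np.random.default_rng(3)

      flux = 1.0 + 0.03 * np.sin(2 * np.pi * f_mod * t)   # 3% modulation depth
      signal = flux + rng.normal(0.0, 0.5, t.size)        # buried in noise

      # Mix with in-phase and quadrature references, low-pass via the mean
      ref_i = np.sin(2 * np.pi * f_mod * t)
      ref_q = np.cos(2 * np.pi * f_mod * t)
      amp = 2.0 * np.hypot(np.mean(signal * ref_i), np.mean(signal * ref_q))
      print(f"recovered modulation depth ≈ {amp:.3f}")    # ≈ 0.03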

  18. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Discovering, Exploring, and Mapping Spatiotemporal Patterns Across Heterogenous Space-Time Data

    NASA Astrophysics Data System (ADS)

    Morton, A.; Stewart, R.; Held, E.; Piburn, J.; Allen, M. R.; McManamay, R.; Sanyal, J.; Sorokine, A.; Bhaduri, B. L.

    2017-12-01

    Spatiotemporal (ST) analytics applied to major data sources such as USGS, NOAA, the World Bank and the World Health Organization have tremendous value in shedding light on the evolution of physical, cultural, and geopolitical landscapes at local and global levels. Especially powerful is the integration of these physical and cultural datasets across multiple and disparate formats, facilitating new interdisciplinary analytics and insights. Realizing this potential first requires an ST data model that addresses the challenges of properly merging data from multiple authors, with evolving ontological perspectives, semantic differences, changing attributes, and content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at the Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) the WSTAMP database, a significant advance in ST data modeling that integrates 16,000+ attributes covering 200+ countries for over 50 years from over 30 major sources, and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We report on these advances, provide an illustrative case study, and explain how others may freely access the tool.

  19. Embracing Open Software Development in Solar Physics

    NASA Astrophysics Data System (ADS)

    Hughitt, V. K.; Ireland, J.; Christe, S.; Mueller, D.

    2012-12-01

    We discuss two ongoing software projects in solar physics that have adopted best practices of the open source software community. The first, the Helioviewer Project, is a powerful data visualization tool which includes online and Java interfaces inspired by Google Maps™. This effort allows users to find solar features and events of interest, and to download the corresponding data. Having found data of interest, the user now has to analyze it. The dominant solar data analysis platform is an open-source library called SolarSoft (SSW). Although SSW itself is open-source, the programming language used is IDL, a proprietary language with licensing costs that are prohibitive for many institutions and individuals. SSW is composed of a collection of related scripts written by missions and individuals for solar data processing and analysis, without any consistent data structures or common interfaces. Further, at the time when SSW was initially developed, many of the best software development practices of today (mirrored and distributed version control, unit testing, continuous integration, etc.) were not standard, and have not since been adopted. The challenges inherent in developing SolarSoft led to a second software project known as SunPy. SunPy is an open-source Python-based library which seeks to create a unified solar data analysis environment, including a number of core datatypes such as Maps, Lightcurves, and Spectra which have consistent interfaces and behaviors. By taking advantage of the large and sophisticated body of scientific software already available in Python (e.g. SciPy, NumPy, Matplotlib), and by adopting many of the best practices refined in open-source software development, SunPy has been able to develop at a very rapid pace while still ensuring a high level of reliability. The Helioviewer Project and SunPy represent two pioneering technologies in solar physics - simple yet flexible data visualization and a powerful, new data analysis environment. We discuss the development of both these efforts and how they are beginning to influence the solar physics community.
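
    A minimal sketch of the unified datatypes the abstract describes, using SunPy's Map class with one of the library's bundled sample images (requires sunpy installed with its sample-data extras; the sample file downloads on first use):

```python
# Build a SunPy Map from a bundled SDO/AIA sample image and use its
# consistent interface for metadata access and quick-look plotting.
import sunpy.map
import sunpy.data.sample

aia = sunpy.map.Map(sunpy.data.sample.AIA_171_IMAGE)   # AIA 171 A sample image
print(aia.date, aia.wavelength, aia.dimensions)        # uniform Map metadata
aia.peek()                                             # quick-look plot
```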

  20. The enigma of the magnetic pulsar SXP1062: a new look with XMM-Newton

    NASA Astrophysics Data System (ADS)

    Oskinova, Lidia

    2012-10-01

    SXP 1062 is an exceptional case of a young neutron star with known age in a wind-fed HMXB. A unique combination of measured spin period, its derivative, luminosity and young age makes this source a key probe for the physics of accretion and neutron star evolution. All current accretion scenarios encounter major difficulties explaining the spin-down rate of this accretion-powered pulsar. This study will allow us to construct a spin period-luminosity relation as a powerful tool for distinguishing between different accretion and evolution scenarios. The XMM-Newton observations of SXP 1062 will thus shed new light on the physics of accreting neutron stars.

  1. Observational properties of pulsars.

    PubMed

    Manchester, R N

    2004-04-23

    Pulsars are remarkable clocklike celestial sources that are believed to be rotating neutron stars formed in supernova explosions. They are valuable tools for investigations into topics such as neutron star interiors, globular cluster dynamics, the structure of the interstellar medium, and gravitational physics. Searches at radio and x-ray wavelengths over the past 5 years have resulted in a large increase in the number of known pulsars and the discovery of new populations of pulsars, posing challenges to theories of binary and stellar evolution. Recent images at radio, optical, and x-ray wavelengths have revealed structures resulting from the interaction of pulsar winds with the surrounding interstellar medium, giving new insights into the physics of pulsars.

  2. Assessment of rockfall susceptibility by integrating statistical and physically-based approaches

    NASA Astrophysics Data System (ADS)

    Frattini, Paolo; Crosta, Giovanni; Carrara, Alberto; Agliardi, Federico

    In Val di Fassa (Dolomites, Eastern Italian Alps) rockfalls constitute the most significant gravity-induced natural disaster that threatens both the inhabitants of the valley, who are few, and the thousands of tourists who populate the area in summer and winter. To assess rockfall susceptibility, we developed an integrated statistical and physically-based approach that aimed to predict both the susceptibility to onset and the probability that rockfalls will attain specific reaches. Through field checks and multi-temporal aerial photo-interpretation, we prepared a detailed inventory of both rockfall source areas and associated scree-slope deposits. Using an innovative technique based on GIS tools and a 3D rockfall simulation code, grid cells pertaining to the rockfall source-area polygons were classified as active or inactive, based on the state of activity of the associated scree-slope deposits. The simulation code allows one to link each source grid cell with scree deposit polygons by calculating the trajectory of each simulated launch of blocks. By means of discriminant analysis, we then identified the mix of environmental variables that best identifies grid cells with low or high susceptibility to rockfalls. Among these variables, structural setting, land use, and morphology were the most important factors leading to the initiation of rockfalls. We developed 3D simulation models of the runout distance, intensity and frequency of rockfalls, whose source grid cells corresponded either to the geomorphologically-defined source polygons (geomorphological scenario) or to study-area grid cells with slope angle greater than an empirically-defined value of 37° (empirical scenario). For each scenario, we assigned to the source grid cells either a fixed or a variable onset susceptibility; the latter was derived from the discriminant model group (active/inactive) membership probabilities. Comparison of these four models indicates that the geomorphological scenario with variable onset susceptibility appears to be the most realistic model. Nevertheless, political and legal issues seem to guide local administrators, who tend to select the more conservative empirically-based scenario as a land-planning tool.

  3. Bridging the Particle Physics and Big Data Worlds

    NASA Astrophysics Data System (ADS)

    Pivarski, James

    2017-09-01

    For decades, particle physicists have developed custom software because the scale and complexity of our problems were unique. In recent years, however, the ``big data'' industry has begun to tackle similar problems, and has developed some novel solutions. Incorporating scientific Python libraries, Spark, TensorFlow, and machine learning tools into the physics software stack can improve abstraction, reliability, and in some cases performance. Perhaps more importantly, it can free physicists to concentrate on domain-specific problems. Building bridges isn't always easy, however. Physics software and open-source software from industry differ in many incidental ways and a few fundamental ways. I will show work from the DIANA-HEP project to streamline data flow from ROOT to Numpy and Spark, to incorporate ideas of functional programming into histogram aggregation, and to develop real-time, query-style manipulations of particle data.
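
    A hedged sketch of the ROOT-to-NumPy data flow described above, using the uproot library (one of the DIANA-HEP tools); the file, tree, and branch names here are hypothetical:

```python
# Read ROOT branches directly into NumPy arrays and apply a
# vectorised selection, with no per-event loop.
import uproot

with uproot.open("events.root") as f:          # hypothetical file
    tree = f["Events"]                         # hypothetical TTree name
    arrays = tree.arrays(["pt", "eta"], library="np")

mask = arrays["pt"] > 25.0                     # vectorised cut
print("selected:", mask.sum(), "of", mask.size)
```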

  4. Current Challenges in Geothermal Reservoir Simulation

    NASA Astrophysics Data System (ADS)

    Driesner, T.

    2016-12-01

    Geothermal reservoir simulation has long been established as a valuable tool for geothermal reservoir management and research. Yet the current generation of simulation tools faces a number of severe challenges, in particular when applied to novel types of geothermal resources such as supercritical reservoirs or hydraulic stimulation. This contribution reviews three key problems. (1) Representing the magmatic heat source of high-enthalpy resources. Current practice is to represent the deeper parts of a high-enthalpy reservoir by a heat-flux or temperature boundary condition. While this is sufficient for many reservoir management purposes, it precludes exploring the potential of very high-enthalpy resources in the deepest parts of such systems, as well as the development of reliable conceptual models. Recent 2D simulations with the CSMP++ simulation platform demonstrate the potential of explicitly including the heat source, notably for understanding supercritical resources. (2) Geometrically realistic incorporation of discrete fracture networks. A growing number of simulation tools can, in principle, handle flow and heat transport in discrete fracture networks; however, solving the governing equations and representing the physical properties is often biased by strongly simplifying assumptions, and including proper fracture mechanics in complex fracture-network simulations remains an open challenge. (3) Improving the simulation of chemical fluid-rock interaction in geothermal reservoirs. Major improvements have been made towards more stable and faster numerical solvers for multicomponent chemical fluid-rock interaction, but the underlying thermodynamic models and databases are unable to correctly address a number of important regions in temperature-pressure-composition parameter space; notably, there is currently no thermodynamic formalism to describe relevant chemical reactions in supercritical reservoirs. Overcoming this unsatisfactory situation requires fundamental research in high-temperature physical chemistry rather than further numerical development.

  5. Measurement and Instrumentation Challenges at X-ray Free Electron Lasers

    NASA Astrophysics Data System (ADS)

    Feng, Yiping

    2015-03-01

    X-ray Free Electron Laser sources based on the Self Amplified Spontaneous Emission process are intrinsically chaotic, giving rise to pulse-to-pulse fluctuations in all physical properties, including intensity, position and pointing, spatial and temporal profiles, spectral content, timing, and coherence. These fluctuations represent special challenges to users whose experiments are designed to reveal small changes in the underlying physical quantities, changes that would otherwise be completely washed out without the proper diagnostic tools. Because of the X-ray FEL's unique characteristics, such as its unprecedented peak power and nearly full spatial coherence, there are many technical challenges in conceiving and implementing diagnostic devices that are highly transmissive, provide a sufficient signal-to-noise ratio, and, most importantly, work in single-shot mode. Portions of this research were carried out at the Linac Coherent Light Source (LCLS) at the SLAC National Accelerator Laboratory. LCLS is an Office of Science User Facility operated for the U.S. Department of Energy Office of Science by Stanford Univ.

  6. The hobbyist phenomenon in physical security.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michaud, E. C.

    Pro-Ams (professional amateurs) are groups of people who work on a problem as amateurs or unpaid persons in a given field at professional levels of competence. Astronomy is a good example of Pro-Am activity. At Galaxy Zoo, Pro-Ams evaluate data generated by professional observatories and report their findings at professional levels, for fun. To allow the archiving of the millions of galaxies that have been observed but not classified, the website has been engineered so that the public can view and classify galaxies even if they are not professional astronomers. In this endeavor, it has been found that amateurs can easily outperform automated vision systems. Today in the world of physical security, Pro-Ams are playing an ever-increasing role. Traditionally, locksmiths, corporations, and government organizations have been largely responsible for developing standards, uncovering vulnerabilities, and devising best security practices. Increasingly, however, nonprofit sporting organizations and clubs are doing this. They can be found all over the world, from Europe to the US and now South East Asia. Examples include TOOOL (The Open Organization of Lockpickers), the Longhorn Lockpicking Club, and Sportsfreunde der Sperrtechnik - Deutschland e.V., though there are many others. Members of these groups get together weekly to discuss many elements of security, with some groups specializing in specific areas. When members are asked why they participate in these hobbyist groups, they usually reply (with gusto) that they do it for fun, and that they view defeating locks and other security devices as an interesting and entertaining puzzle. A lot of what happens at these clubs would not be possible if it weren't for 'Super Abundance', the ability to easily acquire (at little or no cost) the products, security tools, technologies, and intellectual resources traditionally limited to corporations, government organizations, or wealthy individuals. With this new access come new discoveries. For example, hobbyist sport lockpicking groups discovered - and publicized - a number of new vulnerabilities between 2004 and 2009 that resulted in the majority of high-security lock manufacturers having to make changes and improvements to their products. A decade ago, amateur physical security discoveries were rare, at least those discussed publicly. In the interim, Internet sites such as lockpicking.org, lockpicking101.com and others have provided an online meeting place for people to trade tips, find friends with similar interests, and develop tools. The open, public discussion of software vulnerabilities, in contrast, has been going on for a long time. These two industries, physical security and software, have very different upgrade mechanisms. With software, a patch can typically be deployed quickly to fix a serious vulnerability, whereas a hardware fix for a physical security device or system can take months or more to implement in the field, especially if (as is often the case) hardware integrators are involved. Even when responding to publicly announced security vulnerabilities, manufacturers of physical security devices such as locks, intrusion detectors, or access control devices rarely view hobbyists as a positive resource. This is most unfortunate. In the field of software, it is common to speak of Open Source versus Closed Source.
An Open Source software company may choose to distribute its software under a particular license and give it away openly, with full details and all of the source code made available; Linux is a very popular example of this. A Closed Source company, in contrast, chooses not to reveal its source code and licenses its software products in a restrictive manner. Slowly, the idea of Open Source is now coming to the world of physical security. In the case of locks, it provides an alternative to the traditional Closed Source world of locksmiths. Locks are physical objects and can therefore be disassembled; as such, they have always been Open Source in a limited sense. Secrecy, in fact, is very difficult to maintain for a lock that is widely distributed. Having direct access to the lock design provides the hobbyist with a very open environment for finding security flaws, even if the lock manufacturer attempts to follow a Closed Source model. It is clear that the field of physical security is going the digital route, with companies such as Medeco, Mul-T-Lock, and Abloy manufacturing electromechanical locks. Various companies have already begun to add microcontrollers, cryptographic chip sets, solid-state sensors, and a number of other high-tech improvements to their product lineup in an effort to thwart people from defeating their security products.

  7. Goal setting as an outcome measure: A systematic review.

    PubMed

    Hurn, Jane; Kneebone, Ian; Cropley, Mark

    2006-09-01

    Goal achievement has been considered an important measure of outcome by clinicians working with patients in physical and neurological rehabilitation settings. This systematic review was undertaken to examine the reliability, validity and sensitivity of goal setting and goal attainment scaling approaches when used with working age and older people, by examining the research literature covering the 36 years since goal-setting theory was proposed. Data sources included a computer-aided literature search of published studies examining the reliability, validity and sensitivity of goal setting/goal attainment scaling, with further references sourced from articles obtained through this process. There is strong evidence for the reliability, validity and sensitivity of goal attainment scaling. Empirical support was found for the validity of goal setting, but research demonstrating its reliability and sensitivity is limited. Goal attainment scaling appears to be a sound measure for use in physical rehabilitation settings with working age and older people. Further work needs to be carried out with goal setting to establish its reliability and sensitivity as a measurement tool.

  8. Tools for the Future of Nuclear Physics

    NASA Astrophysics Data System (ADS)

    Geesaman, Donald

    2014-03-01

    The challenges of Nuclear Physics, especially in understanding strongly interacting matter in all its forms in the history of the universe, place ever higher demands on the tools of the field, including its workhorse, the accelerator. These demands are not just for higher energy and higher luminosity. To recreate the matter that was fleetingly formed in the origin of the heavy elements, we need higher-power heavy-ion accelerators and creative techniques to harvest the isotopes. We also need high-current low-energy accelerators deep underground to detect the very slow reactions of stellar burning. To explore the three-dimensional distributions of high-momentum quarks in hadrons and to search for gluonic excitations, we need high-current CW electron accelerators. To understand the gluonic structure of nuclei and the three-dimensional distributions of partons at lower x, we need high-luminosity electron-ion colliders that also have the capability to prepare, preserve and manipulate the polarization of both beams. A search for the critical point in the QCD phase diagram demands high-luminosity beams over a broad range of species and energies. With advances in cavity design and construction, beam manipulation and cooling, and ion sources and targets, the Nuclear Physics community, in the U.S. and internationally, has a coordinated vision to deliver this exciting science. This work is supported by DOE, Office of Nuclear Physics, under contract DE-AC02-06CH11357.

  9. PENTrack - a versatile Monte Carlo tool for ultracold neutron sources and experiments

    NASA Astrophysics Data System (ADS)

    Picker, Ruediger; Chahal, Sanmeet; Christopher, Nicolas; Losekamm, Martin; Marcellin, James; Paul, Stephan; Schreyer, Wolfgang; Yapa, Pramodh

    2016-09-01

    Ultracold neutrons have energies in the hundred-neV range and can be stored in traps for hundreds of seconds, which makes them an ideal tool for studying the neutron itself. Measurements of neutron decay correlations, the neutron lifetime, and the electric dipole moment are ideally suited to ultracold neutrons, as are experiments probing the neutron's gravitational levels in the Earth's field. We have developed a Monte Carlo simulation tool that can serve to design and optimize these experiments, and possibly to correct their results: PENTrack is a C++ based simulation code that tracks neutrons, protons and electrons or atoms, as well as their spins, in gravitational and electromagnetic fields. In addition, wall interactions of neutrons due to the strong interaction are modeled with a Fermi-potential formalism that takes surface roughness into account. The presentation will introduce the physics behind the simulation and provide examples of its application.
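
    An illustrative sketch (not PENTrack itself) of the kind of tracking the code performs: a 100 neV ultracold neutron bouncing ballistically in the Earth's gravitational field, with idealized specular wall reflection; step size and trap geometry are arbitrary assumptions:

```python
# Ballistic UCN trajectory with specular bounces off the trap floor.
import numpy as np

m = 1.675e-27            # neutron mass [kg]
g = 9.81                 # gravitational acceleration [m/s^2]
E = 100e-9 * 1.602e-19   # 100 neV of kinetic energy [J]

r = np.array([0.0, 0.0])                  # start on the trap floor
v = np.array([0.0, np.sqrt(2 * E / m)])   # launched straight up
dt, h_max = 1e-3, 0.0

for _ in range(20_000):                   # 20 s of flight
    v[1] -= g * dt                        # gravity
    r += v * dt
    if r[1] < 0.0:                        # floor: ideal specular bounce
        r[1], v[1] = -r[1], -v[1]
    h_max = max(h_max, r[1])

# a 100 neV neutron can rise only to about E/(m*g) ~ 1 m
print(f"max rise simulated {h_max:.3f} m, expected {E / (m * g):.3f} m")
```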

  10. Semi-supervised Machine Learning for Analysis of Hydrogeochemical Data and Models

    NASA Astrophysics Data System (ADS)

    Vesselinov, Velimir; O'Malley, Daniel; Alexandrov, Boian; Moore, Bryan

    2017-04-01

    Data- and model-based analyses such as uncertainty quantification, sensitivity analysis, and decision support using complex physics models with numerous model parameters typically require a huge number of model evaluations (on the order of 10^6). Furthermore, model simulations of complex physics may require substantial computational time. For example, accounting for simultaneously occurring physical processes such as fluid flow and biogeochemical reactions in a heterogeneous porous medium may require several hours of wall-clock computational time. To address these issues, we have developed a novel methodology for semi-supervised machine learning based on Non-negative Matrix Factorization (NMF) coupled with customized k-means clustering. The algorithm allows for automated, robust Blind Source Separation (BSS) of groundwater types (contamination sources) based on model-free analyses of observed hydrogeochemical data. We have also developed reduced-order modeling tools, which couple support vector regression (SVR), genetic algorithms (GA) and artificial and convolutional neural networks (ANN/CNN). SVR is applied to predict the model behavior within the prior uncertainty ranges associated with the model parameters. ANN and CNN procedures are applied to upscale heterogeneity of the porous medium. In the upscaling process, fine-scale high-resolution models of heterogeneity are applied to inform coarse-resolution models, which have improved computational efficiency while capturing the impact of fine-scale effects at the coarse scale of interest. These techniques are tested independently on a series of synthetic problems. We also present a decision analysis related to contaminant remediation, where the developed reduced-order models are applied to reproduce groundwater flow and contaminant transport in a synthetic heterogeneous aquifer. The tools are coded in Julia and are a part of the MADS high-performance computational framework (https://github.com/madsjulia/Mads.jl).
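
    A hedged sketch of the NMF-plus-clustering idea for blind source separation of groundwater mixtures, using scikit-learn stand-ins rather than the authors' Julia/MADS implementation; the data below are synthetic:

```python
# Factor observed well chemistry into non-negative source signatures
# and mixing weights, then cluster wells by their source contributions.
import numpy as np
from sklearn.decomposition import NMF
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
sources = rng.uniform(0.0, 1.0, size=(3, 12))     # 3 end-member chemistries, 12 analytes
mixing = rng.dirichlet(np.ones(3), size=40)       # 40 wells, non-negative mixing ratios
observed = np.clip(mixing @ sources + rng.normal(0.0, 0.01, (40, 12)), 0.0, None)

nmf = NMF(n_components=3, init="nndsvda", max_iter=2000, random_state=0)
W = nmf.fit_transform(observed)                   # per-well source contributions
H = nmf.components_                               # reconstructed source signatures

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(W)
print("reconstruction error:", round(nmf.reconstruction_err_, 3))
print("well groupings:", labels)
```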

  11. Real-Time Joint Streaming Data Processing from Social and Physical Sensors

    NASA Astrophysics Data System (ADS)

    Kropivnitskaya, Y. Y.; Qin, J.; Tiampo, K. F.; Bauer, M.

    2014-12-01

    The technological breakthroughs in computing that have taken place over the last few decades make it possible to achieve emergency management objectives that focus on saving human lives and decreasing economic losses. In particular, the integration of a wide variety of information sources, including observations from spatially-referenced physical sensors and new social media sources, enables better real-time seismic hazard analysis through distributed computing networks. The main goal of this work is to utilize innovative computational algorithms for better real-time seismic risk analysis by integrating different data sources and processing tools into streaming and cloud computing applications. The Geological Survey of Canada operates the Canadian National Seismograph Network (CNSN) with over 100 high-gain instruments and 60 low-gain or strong motion seismographs. Processing the continuous data streams from each station of the CNSN provides the opportunity to detect possible earthquakes in near real time, and the information from these physical sources is combined to calculate a location and magnitude for an earthquake. The automatically calculated results, however, are not always sufficiently precise and prompt to significantly reduce the response time to a felt or damaging earthquake. Social sensors, here represented by Twitter users, can provide information earlier to the general public and more rapidly to the emergency planning and disaster relief agencies. We introduce joint streaming data processing from social and physical sensors in real time, based on the idea that social media observations serve as proxies for physical sensors. By using streams of data in the form of Twitter messages, each of which has an associated time and location, we can extract information related to a target event and perform enhanced analysis by combining it with physical sensor data. Results of this work suggest that the use of data from social media, combined with sensor data and innovative computing algorithms, can provide a new paradigm for real-time earthquake detection in order to facilitate rapid and inexpensive natural risk reduction.

  12. A study of the mechanical vibrations of a table-top extreme ultraviolet interference nanolithography tool.

    PubMed

    Prezioso, S; De Marco, P; Zuppella, P; Santucci, S; Ottaviano, L

    2010-04-01

    A prototype low-cost table-top extreme ultraviolet (EUV) laser source (1.5 ns pulse duration, λ = 46.9 nm) was successfully employed as a laboratory-scale interference nanolithography (INL) tool. Interference patterns were obtained with a simple Lloyd's mirror setup. Periodic structures on Polymethylmethacrylate/Si substrates were produced over large areas (8 mm²) with resolutions from 400 down to 22.5 nm half pitch (the smallest resolution achieved so far with table-top EUV laser sources). The mechanical vibrations affecting both the laser source and the Lloyd's setup were studied to determine if and how they affect the lateral resolution of the lithographic system. The vibration dynamics was described by a statistical model based on the assumption that the instantaneous position of the vibrating mechanical parts follows a normal distribution. An algorithm was developed to simulate the process of sample irradiation under different vibrations. The comparison between simulations and experiments made it possible to estimate the characteristic amplitude of the vibrations, which was deduced to be lower than 50 nm. The same algorithm was used to reproduce the expected pattern profiles at the λ/4 half-pitch physical resolution limit. In that limit, a nonzero pattern modulation amplitude was obtained from the simulations, comparable to the peak-to-valley height (2-3 nm) measured for the 45 nm spaced fringes, indicating that the mechanical vibrations affecting the INL tool do not represent a limit in scaling down the resolution.
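
    A sketch of the statistical model described above: the fringe position jitters with a normal distribution during exposure, and the accumulated dose is averaged over many pulses. The 45 nm period follows the abstract; the RMS vibration amplitude and pulse count are illustrative assumptions:

```python
# Monte Carlo estimate of fringe contrast under Gaussian position jitter.
import numpy as np

period = 45.0                        # fringe period [nm]
x = np.linspace(0.0, 3 * period, 600)
sigma = 10.0                         # assumed RMS vibration amplitude [nm]

rng = np.random.default_rng(1)
dose = np.zeros_like(x)
for _ in range(1000):                # 1000 pulses, each with a random offset
    shift = rng.normal(0.0, sigma)
    dose += 0.5 * (1.0 + np.cos(2 * np.pi * (x - shift) / period))
dose /= 1000.0

contrast = (dose.max() - dose.min()) / (dose.max() + dose.min())
# analytic expectation for Gaussian jitter: exp(-2 * (pi*sigma/period)**2)
print(f"simulated contrast {contrast:.3f}, "
      f"analytic {np.exp(-2 * (np.pi * sigma / period) ** 2):.3f}")
```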

  13. In Pursuit of Improving Airburst and Ground Damage Predictions: Recent Advances in Multi-Body Aerodynamic Testing and Computational Tools Validation

    NASA Technical Reports Server (NTRS)

    Venkatapathy, Ethiraj; Gulhan, Ali; Aftosmis, Michael; Brock, Joseph; Mathias, Donovan; Need, Dominic; Rodriguez, David; Seltner, Patrick; Stern, Eric; Wiles, Sebastian

    2017-01-01

    An airburst from a large asteroid during entry can cause significant ground damage. The damage depends on the energy and the altitude of airburst. Breakup of asteroids into fragments and their lateral spread have been observed. Modeling the underlying physics of fragmented bodies interacting at hypersonic speeds and the spread of fragments is needed for a true predictive capability. Current models use heuristic arguments and assumptions such as pancaking or point source explosive energy release at pre-determined altitude or an assumed fragmentation spread rate to predict airburst damage. A multi-year collaboration between German Aerospace Center (DLR) and NASA has been established to develop validated computational tools to address the above challenge.

  14. 15N isotopic analyses: a powerful tool to establish links between seized 3,4-methylenedioxymethamphetamine (MDMA) tablets.

    PubMed

    Palhol, Fabien; Lamoureux, Catherine; Naulet, Norbert

    2003-06-01

    In this study the ¹⁵N/¹⁴N isotopic ratios of 43 samples of 3,4-methylenedioxymethamphetamine (MDMA) were measured using gas chromatography-combustion-isotope ratio mass spectrometry (GC-C-IRMS). The results show a large discrimination between samples, with δ¹⁵N values ranging from -16 to +19‰. Comparison between δ¹⁵N values and other physical and chemical parameters shows a strong relationship between δ¹⁵N and brand logo or composition. Thus, it can be assumed that tablets from different seizures probably originated from the same clandestine manufacturing source. Hence, ¹⁵N isotopic parameters provide an important additional tool to establish common origins between seizures of clandestine synthetic drugs.

  15. Integrated tokamak modeling: when physics informs engineering and research planning

    NASA Astrophysics Data System (ADS)

    Poli, Francesca

    2017-10-01

    Simulations that integrate virtually all the relevant engineering and physics aspects of a real tokamak experiment are a powerful tool for experimental interpretation, model validation and planning for both present and future devices. This tutorial will guide the audience through the building blocks of an "integrated" tokamak simulation, such as magnetic flux diffusion; thermal, momentum and particle transport; external heating and current drive sources; and wall particle sources and sinks. Emphasis is given to the connection and interplay between external actuators and plasma response, between the slow time scales of current diffusion and the fast time scales of transport, and to how reduced and high-fidelity models can contribute to simulating a whole device. To illustrate the potential and limitations of integrated tokamak modeling for discharge prediction, a helium plasma scenario for the ITER pre-nuclear phase is taken as an example. This scenario presents challenges because it requires core-edge integration and advanced models for the interaction between waves and fast ions, which are subject to a limited experimental database for validation and guidance. Starting from a scenario obtained by re-scaling parameters from the demonstration inductive "ITER baseline", it is shown how self-consistent simulations that encompass both core and edge plasma regions, as well as high-fidelity heating and current drive source models, are needed to set constraints on the density, magnetic field and heating scheme. This tutorial aims to demonstrate how integrated modeling, when used with an adequate level of criticism, can not only support the design of operational scenarios but also help to assess the limitations and gaps in the available models, thus indicating where improved modeling tools are required and how present experiments can help validate them and inform research planning. Work supported by DOE under DE-AC02-09CH1146.

  16. Federal Technology Catalog 1982: Summaries of practical technology

    NASA Astrophysics Data System (ADS)

    The catalog presents summaries of practical technology selected for commercial potential and/or promising applications to the fields of computer technology, electrotechnology, energy, engineering, life sciences, machinery and tools, manufacturing, materials, physical sciences, and testing and instrumentation. Each summary not only describes a technology, but gives a source for further information. This publication describes some 1,100 new processes, inventions, equipment, software, and techniques developed by and for dozens of Federal agencies during 1982. Included is coverage of NASA Tech Briefs, DOE Energygrams, and Army Manufacturing Notes.

  17. Directional acoustic measurements by laser Doppler velocimeters. [for jet aircraft noise

    NASA Technical Reports Server (NTRS)

    Mazumder, M. K.; Overbey, R. L.; Testerman, M. K.

    1976-01-01

    Laser Doppler velocimeters (LDVs) were used as velocity microphones to measure sound pressure level in the range of 90-130 db, spectral components, and two-point cross correlation functions for acoustic noise source identification. Close agreement between LDV and microphone data is observed. It was concluded that directional sensitivity and the ability to measure remotely make LDVs useful tools for acoustic measurement where placement of any physical probe is difficult or undesirable, as in the diagnosis of jet aircraft noise.

  18. Modelling the role of forests on water provision services: a hydro-economic valuation approach

    NASA Astrophysics Data System (ADS)

    Beguería, S.; Campos, P.

    2015-12-01

    Hydro-economic models that allow integrating the ecological, hydrological, infrastructure, economic and social aspects into a coherent, scientifically-informed framework are preferred tools for supporting decision making in the context of integrated water resources management. We present a case study of the water regulation and provision services of forests in the Andalusia region of Spain. Our model computes the physical water flows and conducts an economic environmental income and asset valuation of forest surface and underground water yield. Based on available hydrologic and economic data, we develop a comprehensive water account for all the forest lands at the regional scale. This forest water environmental valuation is integrated within a much larger project aiming to provide a robust and easily replicable accounting tool for evaluating yearly the total income and capital of forests, encompassing all measurable sources of private and public income (timber and cork production, auto-consumption, recreational activities, biodiversity conservation, carbon sequestration, water production, etc.). We also force our simulations with future socio-economic scenarios to quantify the physical and economic effects of expected trends, or of simulated public and private policies, on future water resources. Only a comprehensive integrated tool can serve as a basis for the development of integrated policies, such as those internationally agreed and recommended for the management of water resources.

  19. A gender study investigating physics self-efficacy

    NASA Astrophysics Data System (ADS)

    Sawtelle, Vashti

    The underrepresentation of women in physics has been well documented and is a source of concern for both policy makers and educators. My dissertation focuses on understanding the role self-efficacy plays in retaining students, particularly women, in introductory physics. I use an explanatory mixed-methods approach to first investigate quantitatively the influence of self-efficacy in predicting success and then to qualitatively explore the development of self-efficacy. In the initial quantitative studies, I explore the utility of self-efficacy in predicting the success of introductory physics students, both women and men. Results indicate that self-efficacy is a significant predictor of success for all students. I then disaggregate the data to examine how self-efficacy develops differently for women and men in the introductory physics course. Results show that women rely on different sources of self-efficacy than men do, and that a particular instructional environment, Modeling Instruction, has a positive impact on these sources of self-efficacy. In the qualitative phase of the project, the dissertation focuses on the development of self-efficacy. Using the qualitative tool of microanalysis, I introduce a methodology for understanding how self-efficacy develops moment by moment through the lens of self-efficacy opportunities. I then use the characterizations of self-efficacy opportunities to focus on a particular course environment and to identify and describe a mechanism by which Modeling Instruction impacts student self-efficacy. Results indicate that emphasizing the development and deployment of models affords opportunities to impact self-efficacy. The findings of this dissertation indicate that introducing key elements into the classroom, such as cooperative group work, model development and deployment, and interaction with the instructor, creates a mechanism by which instructors can impact the self-efficacy of their students. Results from this study indicate that efforts to improve the retention rates of women in physics should include attending to self-efficacy and designing classroom activities that create self-efficacy opportunities.

  20. Development and Validation of a Standardized Tool for Prioritization of Information Sources.

    PubMed

    Akwar, Holy; Kloeze, Harold; Mukhi, Shamir

    2016-01-01

    To validate the utility and effectiveness of a standardized tool for prioritization of information sources for early detection of diseases. The tool was developed with input from diverse public health experts garnered through a survey. Ten raters used the tool to evaluate ten information sources, and reliability among raters was computed. The Proc Mixed procedure with a random effect statement and SAS macros were used to compute the multiple raters' Fleiss kappa agreement and Kendall's coefficient of concordance. The ten disparate information sources evaluated obtained the following composite scores: ProMed 91%; WAHID 90%; Eurosurv 87%; MediSys 85%; SciDaily 84%; EurekAl 83%; CSHB 78%; GermTrax 75%; Google 74%; and CBC 70%. A Fleiss kappa agreement of 50.7% was obtained for the ten information sources and 72.5% for a subset of five sources, the latter representing substantial agreement and validating the utility and effectiveness of the tool. This study validated the utility and effectiveness of a standardized criteria tool developed to prioritize information sources. The new tool was used to identify five information sources suited for use by the KIWI system in the CEZD-IIR project to improve surveillance of infectious diseases. The tool can be generalized to situations where prioritization of numerous information sources is necessary.
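
    A minimal sketch of computing multi-rater agreement of the kind reported above, using the Fleiss kappa implementation in statsmodels rather than the authors' SAS procedures; the ratings below are made up (10 raters scoring 10 sources into 3 categories), so the resulting value is illustrative only:

```python
# Fleiss' kappa over a subjects-by-raters table of categorical ratings.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

rng = np.random.default_rng(7)
ratings = rng.integers(0, 3, size=(10, 10))   # rows = sources, cols = raters

table, _ = aggregate_raters(ratings)          # counts per (source, category)
print("Fleiss' kappa:", round(fleiss_kappa(table), 3))
```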

  1. REDBACK: an Open-Source Highly Scalable Simulation Tool for Rock Mechanics with Dissipative Feedbacks

    NASA Astrophysics Data System (ADS)

    Poulet, T.; Veveakis, M.; Paesold, M.; Regenauer-Lieb, K.

    2014-12-01

    Multiphysics modelling has become an indispensable tool for geoscientists to simulate the complex behaviours observed in their various fields of study where multiple processes are involved, including thermal, hydraulic, mechanical and chemical (THMC) laws. This modelling activity involves simulations that are computationally expensive, and its soaring uptake is tightly linked to the increasing availability of supercomputing power and easy access to powerful nonlinear solvers such as PETSc (http://www.mcs.anl.gov/petsc/). The Multiphysics Object-Oriented Simulation Environment (MOOSE) is a finite-element, multiphysics framework (http://mooseframework.org) that can harness such computational power and allow scientists to easily develop tightly-coupled, fully implicit multiphysics simulations that run automatically in parallel on large clusters. This open-source framework provides a powerful tool for collaborating on numerical modelling activities, and we are contributing to its development with REDBACK (https://github.com/pou036/redback), a module for Rock mEchanics with Dissipative feedBACKs. REDBACK builds on the tensor-mechanics finite-strain implementation available in MOOSE to provide a THMC simulator in which the energetic formulation highlights the importance of all dissipative terms in the coupled system of equations. We show first applications of fully coupled dehydration reactions triggering episodic fluid transfer through shear zones (Alevizos et al., 2014). The dimensionless approach used allows focusing on the critical underlying variables driving the observed behaviours, and the tool is specifically designed to study the material instabilities underpinning geological features like faulting, folding, boudinage, shearing and fracturing. REDBACK provides a collaborative and educational tool which captures the physical and mathematical understanding of such material instabilities and provides an easy way to apply this knowledge to realistic scenarios, where the size and complexity of the geometries considered, along with the material parameter distributions, add as many sources of different instabilities. References: Alevizos, S., T. Poulet, and E. Veveakis (2014), J. Geophys. Res., 119, 4558-4582, doi:10.1002/2013JB010070.

  2. A geodata warehouse: Using denormalisation techniques as a tool for delivering spatially enabled integrated geological information to geologists

    NASA Astrophysics Data System (ADS)

    Kingdon, Andrew; Nayembil, Martin L.; Richardson, Anne E.; Smith, A. Graham

    2016-11-01

    New requirements to understand geological properties in three dimensions have led to the development of PropBase, a data structure and delivery tools to deliver this. At the BGS, relational database management systems (RDBMS) have facilitated effective data management using normalised, subject-based database designs with business rules in a centralised, vocabulary-controlled architecture. These have delivered effective data storage in a secure environment. However, isolated subject-oriented designs prevented efficient cross-domain querying of datasets. Additionally, the tools provided often did not enable effective data discovery, as they struggled to resolve the complex underlying normalised structures, resulting in poor data access speeds. Users developed bespoke access tools to structures they did not fully understand, sometimes obtaining incorrect results. Therefore, BGS has developed PropBase, a generic denormalised data structure within an RDBMS for storing property data, to facilitate rapid and standardised data discovery and access, incorporating 2D and 3D physical and chemical property data with associated metadata. This includes scripts to populate and synchronise the layer with its data sources through structured input and transcription standards. A core component of the architecture is an optimised query object, to deliver geoscience information from a structure equivalent to a data warehouse. This enables optimised query performance and delivery of data in multiple standardised formats using a web discovery tool. Semantic interoperability is enforced through vocabularies combined from all data sources, facilitating searching of related terms. PropBase holds 28.1 million spatially enabled property data points from 10 source databases, incorporating over 50 property data types with a vocabulary set that includes 557 property terms. By enabling property data searches across multiple databases, PropBase has facilitated new scientific research previously considered impractical. PropBase is easily extended to incorporate 4D (time series) data and is providing a baseline for new "big data" monitoring projects.
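
    A hypothetical sketch of the denormalisation idea: many subject databases flattened into one generic property table that can be queried with a single, uniform query shape. The table and column names are invented for illustration, not BGS's schema:

```python
# One wide "property" table replaces per-subject schemas, so a single
# spatial query serves every source database and property type.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE propbase (
    source_db TEXT, x REAL, y REAL, depth REAL,
    property TEXT, value REAL, units TEXT)""")
con.executemany(
    "INSERT INTO propbase VALUES (?, ?, ?, ?, ?, ?, ?)",
    [("geochem", 451200.0, 289300.0, 12.5, "arsenic", 8.1, "mg/kg"),
     ("geotech", 451150.0, 289410.0, 30.0, "porosity", 0.21, "fraction")])

for row in con.execute(
        "SELECT source_db, property, value, units FROM propbase "
        "WHERE x BETWEEN ? AND ? AND y BETWEEN ? AND ?",
        (451000.0, 452000.0, 289000.0, 290000.0)):
    print(row)
```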

  3. School environments and physical activity: the development and testing of an audit tool

    PubMed Central

    Jones, Natalia R; Jones, Andy; van Sluijs, Esther MF; Panter, Jenna; Harrison, Flo; Griffin, Simon J

    2013-01-01

    The aim of this study was to develop, test, and employ an audit tool to objectively assess the opportunities for physical activity within school environments. A 44-item tool was developed and tested at 92 primary schools in the county of Norfolk, England, during the summer term of 2007. Scores from the tool, covering 6 domains of facility provision, were examined against objectively measured hourly moderate-to-vigorous physical activity levels in 1868 9-10-year-old pupils attending the schools. The tool was found to have acceptable reliability and good construct validity, differentiating the physical activity levels of children attending the highest and lowest scoring schools. The characteristics of school grounds may influence pupils' physical activity levels. PMID:20435506

  4. Tools for beach health data management, data processing, and predictive model implementation

    USGS Publications Warehouse

    ,

    2013-01-01

    This fact sheet describes utilities created for the management of recreational waters to provide efficient data management, data aggregation, and predictive modeling, as well as a prototype geographic information system (GIS)-based tool for data visualization and summary. All of these utilities were developed to assist beach managers in making decisions to protect public health. The Environmental Data Discovery and Transformation (EnDDaT) Web service identifies, compiles, and sorts environmental data from a variety of sources that help to define climatic, hydrologic, and hydrodynamic characteristics, including multiple data sources within the U.S. Geological Survey and the National Oceanic and Atmospheric Administration. The Great Lakes Beach Health Database (GLBH-DB) and Web application was designed to provide a flexible input, export, and storage platform for beach water quality and sanitary survey monitoring data to complement beach monitoring programs within the Great Lakes. A real-time predictive modeling strategy was implemented by combining the capabilities of EnDDaT and the GLBH-DB for timely, automated prediction of beach water quality. The GIS-based tool was developed to map beaches based on their physical and biological characteristics, and was shared with multiple partners to provide concepts and information for future Web-accessible beach data outlets.

  5. Kinematic and Dynamic Source Rupture Scenario for Potential Megathrust Event along the Southernmost Ryukyu Trench

    NASA Astrophysics Data System (ADS)

    Lin, T. C.; Hu, F.; Chen, X.; Lee, S. J.; Hung, S. H.

    2017-12-01

    Kinematic source models are widely used for the simulation of earthquakes because of their simplicity and ease of application. Dynamic source models, on the other hand, are more complex but important tools that can help us understand the physics of earthquake initiation, propagation, and healing. In this study, we focus on the southernmost Ryukyu Trench, which is extremely close to northern Taiwan. Interseismic GPS data in northeast Taiwan show a pattern of strain accumulation, which suggests that the maximum magnitude of a potential future earthquake in this area is probably about 8.7. We develop dynamic rupture models for the hazard estimation of the potential megathrust event based on kinematic rupture scenarios inverted from the interseismic GPS data. In addition, several kinematic source rupture scenarios with different characterized slip patterns are considered to better constrain the dynamic rupture process. The initial stresses and friction properties are tested using a trial-and-error method, together with the plate coupling and tectonic features. An analysis of the dynamic stress field associated with the slip prescribed in the kinematic models can indicate possible inconsistencies with the physics of faulting. Furthermore, the dynamic and kinematic rupture models are used to simulate the ground shaking based on a 3-D spectral-element method. We analyze ShakeMap and ShakeMovie outputs from the simulation results to evaluate the differing influence of the source models across the island. A dispersive tsunami-propagation simulation is also carried out to evaluate the maximum tsunami wave height along the coastal areas of Taiwan due to the coseismic seafloor deformation of the different source models. The results of this numerical simulation study can provide physically-based information on megathrust earthquake scenarios for emergency response agencies to take appropriate action before the really big one happens.

  6. RenderToolbox3: MATLAB tools that facilitate physically based stimulus rendering for vision research.

    PubMed

    Heasly, Benjamin S; Cottaris, Nicolas P; Lichtman, Daniel P; Xiao, Bei; Brainard, David H

    2014-02-07

    RenderToolbox3 provides MATLAB utilities and prescribes a workflow that should be useful to researchers who want to employ graphics in the study of vision and perhaps in other endeavors as well. In particular, RenderToolbox3 facilitates rendering scene families in which various scene attributes and renderer behaviors are manipulated parametrically, enables spectral specification of object reflectance and illuminant spectra, enables the use of physically based material specifications, helps validate renderer output, and converts renderer output to physical units of radiance. This paper describes the design and functionality of the toolbox and discusses several examples that demonstrate its use. We have designed RenderToolbox3 to be portable across computer hardware and operating systems and to be free and open source (except for MATLAB itself). RenderToolbox3 is available at https://github.com/DavidBrainard/RenderToolbox3.

  7. High brightness electrodeless Z-Pinch EUV source for mask inspection tools

    NASA Astrophysics Data System (ADS)

    Horne, Stephen F.; Partlow, Matthew J.; Gustafson, Deborah S.; Besen, Matthew M.; Smith, Donald K.; Blackborow, Paul A.

    2012-03-01

    Energetiq Technology has been shipping the EQ-10 Electrodeless Z-Pinch™ light source since 1995. The source is currently being used for metrology, mask inspection, and resist development. Energetiq's higher brightness source has been selected as the source for pre-production actinic mask inspection tools. This improved source enables the mask inspection tool suppliers to build prototype tools with capabilities of defect detection and review down to 16 nm design rules. In this presentation we will present new source technology being developed at Energetiq to address the critical source brightness issue. The new technology will be shown to be capable of delivering brightness levels sufficient to meet the HVM requirements of AIMS and ABI, and potentially API, tools. The basis of the source technology is to use the stable pinch of the electrodeless light source, with a brightness of up to 100 W/mm²·sr. We will explain the source design concepts, discuss the expected performance and present the modeling results for the new design.

  8. Making it Easy to Construct Accurate Hydrological Models that Exploit High Performance Computers (Invited)

    NASA Astrophysics Data System (ADS)

    Kees, C. E.; Farthing, M. W.; Terrel, A.; Certik, O.; Seljebotn, D.

    2013-12-01

    This presentation will focus on two barriers to progress in the hydrological modeling community, and on research and development conducted to lessen or eliminate them. The first is a barrier to sharing hydrological models among specialized scientists that is caused by intertwining the implementation of numerical methods with the implementation of abstract numerical modeling information. In the Proteus toolkit for computational methods and simulation, we have decoupled these two important parts of a computational model through separate "physics" and "numerics" interfaces. More recently we have begun developing the Strong Form Language for easy and direct representation of the mathematical model formulation in a domain-specific language embedded in Python. The second major barrier is sharing ANY scientific software tools that have complex library or module dependencies, as most parallel, multi-physics hydrological models must have. In this setting, users and developers are dependent on an entire distribution, possibly depending on multiple compilers and special instructions for the environment of the target machine. To solve these problems we have developed hashdist, a stateless package management tool, and a resulting portable, open source scientific software distribution.
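
    A schematic sketch of the physics/numerics separation described above, with invented class names rather than the Proteus API: the physics object only declares the model coefficients, while the numerics object only knows how to step a generic problem forward:

```python
# Decoupling illustration: any stepper can integrate any physics that
# exposes rhs(u), so methods and models evolve independently.
class Physics:
    """du/dt = -k * u  -- coefficients only, no solver logic."""
    def __init__(self, k):
        self.k = k
    def rhs(self, u):
        return -self.k * u

class Numerics:
    """Forward-Euler stepper for any object exposing rhs(u)."""
    def __init__(self, dt):
        self.dt = dt
    def step(self, physics, u):
        return u + self.dt * physics.rhs(u)

decay, stepper, u = Physics(k=0.5), Numerics(dt=0.01), 1.0
for _ in range(100):           # integrate to t = 1
    u = stepper.step(decay, u)
print(u)                       # ~ exp(-0.5) = 0.607
```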

  9. Active video games as an exercise tool for children with cystic fibrosis.

    PubMed

    O'Donovan, Cuisle; Greally, Peter; Canny, Gerard; McNally, Paul; Hussey, Juliette

    2014-05-01

    Active video games are used in many hospitals as exercise tools for children with cystic fibrosis; however, the exercise intensity associated with playing these games has not been examined in this population. Children with cystic fibrosis [n=30, aged 12.3 (2.6) years, 17 boys, BMI 17.7 (2.8) kg/m²] were recruited from outpatient clinics in Dublin hospitals. Age- and gender-matched control children were recruited from local schools. Oxygen consumption, metabolic equivalents (METs) calculated from resting V˙O2, and heart rate were measured while playing Nintendo Wii™ (Nintendo Co. Ltd., Tokyo, Japan) Sports Boxing and Nintendo Wii Fit Free Jogging using a portable indirect calorimeter (Oxycon Mobile). Playing Wii Boxing resulted in light-intensity activity (2.46 METs), while playing Wii Fit Free Jogging resulted in moderate-intensity physical activity (4.44 METs). No significant difference was seen between groups in the energy cost of playing active video games. Active video games are a useful source of light- to moderate-intensity physical activity in children with cystic fibrosis.

  10. [Dignity therapy in oncology].

    PubMed

    Ripamonti, Carla Ida

    2016-04-01

    In oncology, little is known about dignity, dignity-related distress and the issues that influence a patient's sense of dignity. Dignity is personal and subject to change depending on one's experience and path through life. In oncology, some patients feel that their dignity is directly related to the disease, to physical and emotional symptoms, to the highest attainable level of physical and cognitive autonomy, and to the continuity of the self. The Patient Dignity Inventory (PDI) is a validated tool designed to measure various sources of dignity-related distress among patients nearing the end of life and to serve as a screening tool for the broad range of issues that influence the sense of dignity. Dignity therapy is a novel focused psychotherapy consisting of a brief semi-structured interview, audio-recorded and transcribed in order to obtain the "generativity document". Patients are invited to tell their life history and to leave words of guidance and instructions to pass along to their sons, daughters, husbands, wives, parents and others. The generativity document is the result of a process of emotional and existential care for the patient, and a gift for everybody who will receive it.

  11. Climate Model Diagnostic Analyzer

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon; Pan, Lei; Zhai, Chengxing; Tang, Benyang; Kubar, Terry; Zhang, Zia; Wang, Wei

    2015-01-01

    The comprehensive and innovative evaluation of climate models with newly available global observations is critically needed for the improvement of climate models' current-state representation and future-state predictability. A climate model diagnostic evaluation process requires physics-based multi-variable analyses that typically involve large-volume and heterogeneous datasets, making them both computation- and data-intensive. Given the exploratory nature of climate data analyses and the explosive growth of datasets and service tools, scientists are struggling to keep track of their datasets, tools, and execution/study history, let alone share them with others. In response, we have developed a cloud-enabled, provenance-supported, web-service system called Climate Model Diagnostic Analyzer (CMDA). CMDA enables physics-based, multivariable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. At the same time, CMDA provides a crowd-sourcing space where scientists can organize their work efficiently and share it with others. CMDA is empowered by many current state-of-the-art software packages in web service, provenance, and semantic search.

  12. UMEL: a new regression tool to identify measurement peaks in LIDAR/DIAL systems for environmental physics applications.

    PubMed

    Gelfusa, M; Gaudio, P; Malizia, A; Murari, A; Vega, J; Richetta, M; Gonzalez, S

    2014-06-01

    Recently, surveying large areas automatically for early detection of both harmful chemical agents and forest fires has become a strategic objective of defence and public health organisations. The Lidar and Dial techniques are widely recognized as a cost-effective alternative for monitoring large portions of the atmosphere. To maximize the effectiveness of the measurements and to guarantee reliable monitoring of large areas, new data analysis techniques are required. In this paper, an original tool, the Universal Multi Event Locator (UMEL), is applied to the problem of automatically identifying the time location of peaks in Lidar and Dial measurements for environmental physics applications. This analysis technique improves various aspects of the measurements, ranging from resilience to drift in the laser sources to an increase in system sensitivity. The method is also fully general and purely software-based, and can therefore be applied to a large variety of problems without any additional cost. The potential of the proposed technique is exemplified with data from various instruments acquired during several experimental campaigns in the field.
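
    The abstract does not spell out UMEL's regression internals, so the sketch below is only a generic illustration of the task being automated, locating return peaks in a noisy Lidar-style trace with off-the-shelf peak finding rather than UMEL itself:

    ```python
    # Generic peak location in a synthetic backscatter trace (NOT the UMEL
    # algorithm): two Gaussian returns plus detector noise.
    import numpy as np
    from scipy.signal import find_peaks

    t = np.linspace(0.0, 10.0, 2000)                  # time axis (arbitrary units)
    trace = (np.exp(-(t - 3.0) ** 2 / 0.02)
             + 0.6 * np.exp(-(t - 7.2) ** 2 / 0.05))
    trace += 0.02 * np.random.default_rng(0).normal(size=t.size)

    peaks, _ = find_peaks(trace, height=0.2, prominence=0.1)
    print("peak times:", t[peaks])                    # approximately [3.0, 7.2]
    ```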

  13. A Coupled Multiphysics Approach for Simulating Induced Seismicity, Ground Acceleration and Structural Damage

    NASA Astrophysics Data System (ADS)

    Podgorney, Robert; Coleman, Justin; Wilkins, Andrew; Huang, Hai; Veeraraghavan, Swetha; Xia, Yidong; Permann, Cody

    2017-04-01

    Numerical modeling has played an important role in understanding the behavior of coupled subsurface thermal-hydro-mechanical (THM) processes associated with a number of energy and environmental applications since as early as the 1970s. While the ability to rigorously describe all key tightly coupled controlling physics still remains a challenge, there have been significant advances in recent decades. These advances are related primarily to the exponential growth of computational power, the development of more accurate equations of state, improvements in the ability to represent heterogeneity and reservoir geometry, and more robust nonlinear solution schemes. The work described in this paper documents the development and linkage of several fully coupled and fully implicit modeling tools. These tools simulate: (1) the dynamics of fluid flow, heat transport, and quasi-static rock mechanics; (2) seismic wave propagation from the sources of energy release through heterogeneous material; and (3) the soil-structural damage resulting from ground acceleration. These tools are developed in Idaho National Laboratory's parallel Multiphysics Object Oriented Simulation Environment, and are integrated using a global implicit approach. The governing equations are presented, the numerical approach for simultaneously solving and coupling the three physics tools is discussed, and the data input and output methodology is outlined. An example is presented to demonstrate the capabilities of the coupled multiphysics approach. The example involves simulating a system conceptually similar to the geothermal development in Basel, Switzerland; the resultant induced seismicity, ground motion and structural damage are predicted.

  14. Fecal indicator organism modeling and microbial source tracking in environmental waters: Chapter 3.4.6

    USGS Publications Warehouse

    Nevers, Meredith; Byappanahalli, Muruleedhara; Phanikumar, Mantha S.; Whitman, Richard L.

    2016-01-01

    Mathematical models have been widely applied to surface waters to estimate rates of settling, resuspension, flow, dispersion, and advection in order to calculate the movement of particles that influence water quality. Of particular interest are the movement, survival, and persistence of microbial pathogens or their surrogates, which may contaminate recreational water, drinking water, or shellfish. Most models devoted to microbial water quality have focused on fecal indicator organisms (FIO), which act as surrogates for pathogens and viruses. Process-based modeling and statistical modeling have been used to track contamination events to their sources and to predict future events. The two types of models require different levels of expertise and input: process-based models rely on theoretical physical constructs to explain present conditions and biological distribution, while data-based statistical models use extant paired data to do the same. The selection of the appropriate model and interpretation of results is critical to the proper use of these tools in microbial source tracking. Integration of the two modeling approaches could provide insight for tracking and predicting contamination events in real time. A review of modeling efforts reveals that process-based modeling holds great promise for microbial source tracking; further, combining the understanding of physical processes influencing FIO contamination developed with process-based models and molecular characterization of the population by gene-based (i.e., biological) or chemical markers may be an effective approach for locating sources and remediating contamination in order to better protect human health.
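
    To make the process-based side of this comparison concrete, here is a deliberately minimal sketch: a 1-D upwind advection step plus first-order die-off for an FIO concentration, with all rates invented for illustration:

    ```python
    # Minimal process-based FIO transport: explicit upwind advection plus
    # first-order decay (settling and inactivation lumped into one rate).
    import numpy as np

    nx, dx, dt = 200, 10.0, 5.0      # cells, cell size (m), time step (s)
    u, k = 0.5, 1e-5                 # velocity (m/s), decay rate (1/s); illustrative
    c = np.zeros(nx)
    c[:5] = 1000.0                   # upstream contamination pulse (CFU/100 mL)

    for _ in range(300):
        c[1:] -= u * dt / dx * (c[1:] - c[:-1])   # upwind advection (u > 0)
        c *= np.exp(-k * dt)                      # die-off
    print("pulse peak has advected to cell", int(np.argmax(c)))
    ```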

  15. HEPS Inventory Tool: An Inventory Tool Including Quality Assessment of School Interventions on Healthy Eating and Physical Activity

    ERIC Educational Resources Information Center

    Dadaczynski, Kevin; Paulus, Peter; de Vries, Nanne; de Ruiter, Silvia; Buijs, Goof

    2010-01-01

    The HEPS Inventory Tool aims to support stakeholders working in school health promotion to promote high quality interventions on healthy eating and physical activity. As a tool it provides a step-by-step approach on how to develop a national or regional inventory of existing school based interventions on healthy eating and physical activity. It…

  16. Using Spectral Losses to Map a Damage Zone for the Source Physics Experiments (SPE)

    NASA Astrophysics Data System (ADS)

    Knox, H. A.; Abbott, R. E.; Bonal, N.; Preston, L. A.

    2013-12-01

    We performed a series of cross-borehole seismic experiments in support of the Source Physics Experiments (SPE). These surveys, which were conducted in a granitic body using a sparker source and hydrophone string, were designed to image the damage zone from two underground explosions (SPE2 and SPE3). We present results here from a total of six boreholes (the explosive shot emplacement hole and 5 satellite holes, 20-35 meters away) where we found a marked loss of high frequency energy in ray paths traversing the region near the SPE explosions. Specifically, the frequencies above ~400 Hz were lost in a region centered around 45 meters depth, coincident with SPE2 and SPE3 shots. We further quantified these spectral losses, developed a map of where they occur, and evaluated the attenuation effects of raypath length (i.e. source-receiver offset). We attribute this severe attenuation to the inelastic damage (i.e. cracking and pulverizing) caused by the large chemical explosions and propose that frequency attenuation of this magnitude provides yet another tool for detecting the damage due to large underground explosions. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  17. Active Learning Strategies for Introductory Light and Optics

    NASA Astrophysics Data System (ADS)

    Sokoloff, David R.

    2016-01-01

    There is considerable evidence that traditional approaches are ineffective in teaching physics concepts, including light and optics concepts. A major focus of the work of the Activity Based Physics Group has been on the development of active learning curricula like RealTime Physics (RTP) labs and Interactive Lecture Demonstrations (ILDs). Among the characteristics of these curricula are: (1) use of a learning cycle in which students are challenged to compare predictions—discussed with their peers in small groups—to observations of the physical world, (2) use of guided hands-on work to construct basic concepts from observations, and (3) use of computer-based tools. It has been possible to change the lecture and laboratory learning environments at a large number of universities, colleges, and high schools without changing the structure of the introductory course. For example, in the United States, nearly 200 physics departments have adopted RTP, and many others use pre-publication, open-source versions or have adopted the RTP approach to develop their own labs. Examples from RTP and ILDs (including optics magic tricks) are described in this paper.

  18. An epistemic framing analysis of upper level physics students' use of mathematics

    NASA Astrophysics Data System (ADS)

    Bing, Thomas Joseph

    Mathematics is central to a professional physicist's work and, by extension, to a physics student's studies. It provides a language for abstraction, definition, computation, and connection to physical reality. This power of mathematics in physics is also the source of many of the difficulties it presents students. Simply put, many different activities could all be described as "using math in physics". Expertise entails a complicated coordination of these various activities. This work examines the many different kinds of thinking that are all facets of the use of mathematics in physics. It uses an epistemological lens, one that looks at the type of explanation a student presently sees as appropriate, to analyze the mathematical thinking of upper level physics undergraduates. Sometimes a student will turn to a detailed calculation to produce or justify an answer. Other times a physical argument is explicitly connected to the mathematics at hand. Still other times quoting a definition is seen as sufficient, and so on. Local coherencies evolve in students' thought around these various types of mathematical justifications. We use the cognitive process of framing to model students' navigation of these various facets of math use in physics. We first demonstrate several common framings observed in our students' mathematical thought and give several examples of each. Armed with this analysis tool, we then give several examples of how this framing analysis can be used to address a research question. We consider what effects, if any, a powerful symbolic calculator has on students' thinking. We also consider how to characterize growing expertise among physics students. Framing offers a lens for analysis that is a natural fit for these sample research questions. To active physics education researchers, the framing analysis presented in this dissertation can provide a useful tool for addressing other research questions. To physics teachers, we present this analysis so that it may make them more explicitly aware of the various types of reasoning, and the dynamics among them, that students employ in our physics classes. This awareness will help us better hear students' arguments and respond appropriately.

  19. Evaluation of air quality in a megacity using statistics tools

    NASA Astrophysics Data System (ADS)

    Ventura, Luciana Maria Baptista; de Oliveira Pinto, Fellipe; Soares, Laiza Molezon; Luna, Aderval Severino; Gioda, Adriana

    2018-06-01

    Local physical characteristics (e.g., meteorology and topography) associated with particle concentrations are important for evaluating air quality in a region. Meteorology and topography affect air pollutant dispersion. This study used statistical tools (PCA, HCA, the Kruskal-Wallis and Mann-Whitney tests, and others) to gain a better understanding of the relationship between fine particulate matter (PM2.5) levels and seasons, meteorological conditions and air basins. To our knowledge, it is one of the few studies performed in Latin America involving all of these parameters together. PM2.5 samples were collected at six sampling sites with different emission sources (industrial, vehicular, soil dust) in Rio de Janeiro, Brazil. The PM2.5 daily concentrations ranged from 1 to 61 µg m⁻³, with averages higher than the annual limit (15 µg m⁻³) for some of the sites. The statistical evaluation showed that PM2.5 concentrations were not influenced by seasonality. Furthermore, the air basins defined previously were not confirmed, because some sites presented similar emission sources. Therefore, the air basins need to be redefined, since they are important to air quality management.
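
    As a sketch of how a few of the named tests might be applied to such site-by-site data (synthetic values, not the study's measurements):

    ```python
    # Kruskal-Wallis / Mann-Whitney / PCA on synthetic daily PM2.5 data from
    # three sites; values are made up for illustration.
    import numpy as np
    from scipy.stats import kruskal, mannwhitneyu
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    sites = {name: rng.gamma(shape=4.0, scale=5.0, size=60)   # ug/m3, daily
             for name in ("industrial", "vehicular", "soil_dust")}

    h, p = kruskal(*sites.values())                # do site distributions differ?
    print(f"Kruskal-Wallis H={h:.2f}, p={p:.3f}")

    u, p2 = mannwhitneyu(sites["industrial"][:30], sites["industrial"][30:])
    print(f"Mann-Whitney U={u:.1f}, p={p2:.3f}")   # e.g. one season vs another

    scores = PCA(n_components=2).fit_transform(
        np.column_stack(list(sites.values())))     # shared variability across sites
    print("PCA scores shape:", scores.shape)
    ```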

  1. Helioviewer: A Web 2.0 Tool for Visualizing Heterogeneous Heliophysics Data

    NASA Astrophysics Data System (ADS)

    Hughitt, V. K.; Ireland, J.; Lynch, M. J.; Schmeidel, P.; Dimitoglou, G.; Müeller, D.; Fleck, B.

    2008-12-01

    Solar physics datasets are becoming larger, richer, more numerous and more distributed. Feature/event catalogs (describing objects of interest in the original data) are becoming important tools in navigating these data. In the wake of this increasing influx of data and catalogs there has been a growing need for highly sophisticated tools for accessing and visualizing this wealth of information. Helioviewer is a novel tool for integrating and visualizing disparate sources of solar and Heliophysics data. Taking advantage of the newly available power of modern web application frameworks, Helioviewer merges image and feature catalog data, and provides for Heliophysics data a familiar interface not unlike Google Maps or MapQuest. In addition to streamlining the process of combining heterogeneous Heliophysics datatypes such as full-disk images and coronagraphs, the inclusion of visual representations of automated and human-annotated features provides the user with an integrated and intuitive view of how different factors may be interacting on the Sun. Currently, Helioviewer offers images from The Extreme ultraviolet Imaging Telescope (EIT), The Large Angle and Spectrometric COronagraph experiment (LASCO) and the Michelson Doppler Imager (MDI) instruments onboard The Solar and Heliospheric Observatory (SOHO), as well as The Transition Region and Coronal Explorer (TRACE). Helioviewer also incorporates feature/event information from the LASCO CME List, NOAA Active Regions, CACTus CME and Type II Radio Bursts feature/event catalogs. The project is undergoing continuous development with many more data sources and additional functionality planned for the near future.

  2. Petascale computation of multi-physics seismic simulations

    NASA Astrophysics Data System (ADS)

    Gabriel, Alice-Agnes; Madden, Elizabeth H.; Ulrich, Thomas; Wollherr, Stephanie; Duru, Kenneth C.

    2017-04-01

    Capturing the observed complexity of earthquake sources in concurrence with seismic wave propagation simulations is an inherently multi-scale, multi-physics problem. In this presentation, we show simulations of earthquake scenarios resolving high-detail dynamic rupture evolution and high-frequency ground motion. The simulations combine a multitude of representations of model complexity, such as non-linear fault friction, thermal and fluid effects, heterogeneous fault stress and fault strength initial conditions, fault curvature and roughness, and on- and off-fault non-elastic failure to capture dynamic rupture behavior at the source; and seismic wave attenuation, 3D subsurface structure and bathymetry impacting seismic wave propagation. Performing such scenarios at the necessary spatio-temporal resolution requires highly optimized and massively parallel simulation tools which can efficiently exploit HPC facilities. Our up to multi-PetaFLOP simulations are performed with SeisSol (www.seissol.org), an open-source software package based on an ADER-Discontinuous Galerkin (DG) scheme solving the seismic wave equations in velocity-stress formulation in elastic, viscoelastic, and viscoplastic media with high-order accuracy in time and space. Our flux-based implementation of frictional failure remains free of spurious oscillations. Tetrahedral unstructured meshes allow for complicated model geometry. SeisSol has been optimized on all software levels, including: assembler-level DG kernels which obtain 50% peak performance on some of the largest supercomputers worldwide; an overlapping MPI-OpenMP parallelization shadowing the multiphysics computations; usage of local time stepping; and parallel input and output schemes with direct interfaces to community-standard data formats. All these factors aim to minimise the time-to-solution. The results presented highlight the fact that modern numerical methods and hardware-aware optimization for modern supercomputers are essential to further our understanding of earthquake source physics and to complement both physics-based ground motion research and empirical approaches in seismic hazard analysis. Lastly, we conclude with an outlook on future exascale ADER-DG solvers for seismological applications.

  3. Evaluating Air-Quality Models: Review and Outlook.

    NASA Astrophysics Data System (ADS)

    Weil, J. C.; Sykes, R. I.; Venkatram, A.

    1992-10-01

    Over the past decade, much attention has been devoted to the evaluation of air-quality models, with emphasis on model performance in predicting the high concentrations that are important in air-quality regulations. This paper stems from our belief that this practice needs to be expanded to 1) evaluate model physics and 2) deal with the large natural or stochastic variability in concentration. The variability is represented by the root-mean-square fluctuating concentration (σc) about the mean concentration (C) over an ensemble, i.e., a given set of meteorological, source, and other conditions. Most air-quality models used in applications predict C, whereas observations are individual realizations drawn from an ensemble. When σc is large relative to C, large residuals exist between predicted and observed concentrations, which confuse model evaluations. This paper addresses ways of evaluating model physics in light of the large σc; the focus is on elevated point-source models. Evaluation of model physics requires the separation of the mean model error, the difference between the predicted and observed C, from the natural variability. A residual analysis is shown to be an effective way of doing this. Several examples demonstrate the usefulness of residuals as well as correlation analyses and laboratory data in judging model physics. In general, σc models and predictions of the probability distribution of the fluctuating concentration (c) are in the developmental stage, with laboratory data playing an important role. Laboratory data from point-source plumes in a convection tank show that the distribution of c approximates a self-similar distribution along the plume center plane, a useful result in a residual analysis. At present, there is one model, ARAP, that predicts C, σc, and the distribution of c for point-source plumes. This model is more computationally demanding than other dispersion models (for C only) and must be demonstrated as a practical tool. However, it predicts an important quantity for applications: the uncertainty in the very high and infrequent concentrations. The uncertainty is large and is needed in evaluating operational performance and in predicting the attainment of air-quality standards.
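
    The residual analysis advocated here can be sketched in a few lines: estimate the mean model error from the average residual, then compare the residual scatter with the expected natural variability σc (synthetic numbers, purely illustrative):

    ```python
    # Separate mean model error from stochastic variability: residuals whose
    # mean is ~0 and whose RMS is ~sigma_c indicate scatter, not bad physics.
    import numpy as np

    rng = np.random.default_rng(2)
    C_pred = np.full(500, 10.0)                 # ensemble-mean predictions
    sigma_c = 6.0                               # assumed RMS fluctuation
    c_obs = C_pred + rng.normal(0.0, sigma_c, size=500)   # single realizations

    residual = c_obs - C_pred
    print(f"mean residual (model error estimate): {residual.mean():.2f}")
    print(f"residual RMS {residual.std(ddof=1):.2f} vs sigma_c {sigma_c:.2f}")
    ```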

  4. Astronomy and Cancer Research: X-Rays and Nanotechnology from Black Holes to Cancer Therapy

    NASA Astrophysics Data System (ADS)

    Pradhan, Anil K.; Nahar, Sultana N.

    It seems highly unlikely that any connection is to be found between astronomy and medicine. But then it also appears to be obvious: X-rays. However, that is quite superficial because the nature of X-rays in the two disciplines is quite different. Nevertheless, we describe recent research on exactly that kind of link. Furthermore, the linkage lies in atomic physics, and via spectroscopy which is a vital tool in astronomy and may also be equally valuable in biomedical research. This review begins with the physics of black hole environments as viewed through X-ray spectroscopy. It is then shown that similar physics can be applied to spectroscopic imaging and therapeutics using heavy-element (high-Z) moieties designed to target cancerous tumors. X-ray irradiation of high-Z nanomaterials as radiosensitizing agents should be extremely efficient for therapy and diagnostics (theranostics). However, broadband radiation from conventional X-ray sources (such as CT scanners) results in vast and unnecessary radiation exposure. Monochromatic X-ray sources are expected to be considerably more efficient. We have developed a new and comprehensive methodology—Resonant Nano-Plasma Theranostics (RNPT)—that encompasses the use of monochromatic X-ray sources and high-Z nanoparticles. Ongoing research entails theoretical computations, numerical simulations, and in vitro and in vivo biomedical experiments. Stemming from basic theoretical studies of Kα resonant photoabsorption and fluorescence in all elements of the Periodic Table, we have established a comprehensive multi-disciplinary program involving researchers from physics, chemistry, astronomy, pathology, radiation oncology and radiology. Large-scale calculations necessary for theory and modeling are done at a variety of computational platforms at the Ohio Supercomputer Center. The final goal is the implementation of RNPT for clinical applications.

  5. Comparison of gross anatomy test scores using traditional specimens vs. QuickTime Virtual Reality animated specimens

    NASA Astrophysics Data System (ADS)

    Maza, Paul Sadiri

    In recent years, technological advances such as computers have been employed in teaching gross anatomy at all levels of education, even in professional schools such as medical and veterinary medical colleges. Benefits of computer-based instructional tools for gross anatomy include the convenience of not having to physically view or dissect a cadaver. Anatomy educators debate the advantages and disadvantages of computer-based resources for gross anatomy instruction. Many studies, case reports, and editorials argue for the increased use of computer-based anatomy educational tools, while others discuss the necessity of dissection for various reasons important in learning anatomy, such as a three-dimensional physical view of the specimen, physical handling of tissues, interactions with fellow students during dissection, and differences between specific specimens. While many articles deal with gross anatomy education using computers, there seems to be a lack of studies investigating the use of computer-based resources as an assessment tool for gross anatomy, specifically using the Apple application QuickTime Virtual Reality (QTVR). This study investigated whether computer-based QTVR movie module assessments were equal in quality to examinations using actual physical specimens. A gross anatomy course in the College of Veterinary Medicine at Cornell University was used as a source of anatomy students and gross anatomy examinations. Two groups were compared: one group took gross anatomy examinations in a traditional manner, viewing actual physical specimens and answering questions based on those specimens; the other group took the same examinations using the same specimens, but the specimens were viewed as simulated three-dimensional objects in a QTVR movie module. Sample group means for the assessments were compared. A survey was also administered asking students' perceptions of the quality and user-friendliness of the QTVR movie modules. The comparison of the two sample group means shows that there was no difference in results between using QTVR movie modules to test gross anatomy knowledge and using physical specimens. The results of this study are discussed to explain the benefits of using such computer-based anatomy resources in gross anatomy assessments.

  6. Matter under extreme conditions experiments at the Linac Coherent Light Source

    DOE PAGES

    Glenzer, S. H.; Fletcher, L. B.; Galtier, E.; ...

    2015-12-10

    The Matter in Extreme Conditions end station at the Linac Coherent Light Source (LCLS) is a new tool enabling accurate pump-probe measurements for studying the physical properties of matter in the high-energy density physics regime. This instrument combines the world's brightest x-ray source, the LCLS x-ray beam, with high-power lasers consisting of two nanosecond Nd:glass laser beams and one short-pulse Ti:sapphire laser. These lasers produce short-lived states of matter with high pressures, high temperatures or high densities with properties that are important for applications in nuclear fusion research, laboratory astrophysics and the development of intense radiation sources. In the first experiments, we applied highly accurate x-ray diffraction and x-ray Thomson scattering techniques to shock-compressed matter, resolving the transition from compressed solid matter to a co-existence regime and into the warm dense matter state. Furthermore, these complex charged-particle systems are dominated by strong correlations and quantum effects. They exist in planetary interiors and laboratory experiments, e.g., during high-power laser interactions with solids or the compression phase of inertial confinement fusion implosions. Applying record peak-brightness X rays resolves the ionic interactions at atomic (Ångstrom) scale lengths and measures the static structure factor, which is a key quantity for determining equation of state data and important transport coefficients. Simultaneously, spectrally resolved measurements of plasmon features provide dynamic structure factor information that yields temperature and density with unprecedented precision at micron-scale resolution in dynamic compression experiments. This set of studies demonstrates our ability to measure fundamental thermodynamic properties that determine the state of matter in the high-energy density physics regime.

  7. The Exercise: An Exercise Generator Tool for the SOURCe Project

    ERIC Educational Resources Information Center

    Kakoyianni-Doa, Fryni; Tziafa, Eleni; Naskos, Athanasios

    2016-01-01

    The Exercise, an Exercise generator in the SOURCe project, is a tool that complements the properties and functionalities of the SOURCe project, which includes the search engine for the Searchable Online French-Greek parallel corpus for the UniveRsity of Cyprus (SOURCe) (Kakoyianni-Doa & Tziafa, 2013), the PENCIL (an alignment tool)…

  8. SolarTherm: A flexible Modelica-based simulator for CSP systems

    NASA Astrophysics Data System (ADS)

    Scott, Paul; Alonso, Alberto de la Calle; Hinkley, James T.; Pye, John

    2017-06-01

    Annual performance simulations provide a valuable tool for analysing the viability and overall impact of different concentrating solar power (CSP) component and system designs. However, existing tools work best with conventional systems and are difficult or impossible to adapt when novel components, configurations and operating strategies are of interest. SolarTherm is a new open source simulation tool that fulfils this need for the solar community. It includes a simulation framework and a library of flexible CSP components and control strategies that can be adapted or replaced with new designs to meet the special needs of end users. This paper provides an introduction to SolarTherm and a comparison of models for an energy-based trough system and a physical tower system to those in the well-established and widely-used simulator SAM. Differences were found in some components where the inner workings of SAM are undocumented or not well understood, while the other parts show strong agreement. These results help to validate the fundamentals of SolarTherm and demonstrate that, while at an early stage of development, it is already a useful tool for performing annual simulations.

  9. Infrared Thermal Imaging as a Tool in University Physics Education

    ERIC Educational Resources Information Center

    Mollmann, Klaus-Peter; Vollmer, Michael

    2007-01-01

    Infrared thermal imaging is a valuable tool in physics education at the university level. It can help to visualize and thereby enhance understanding of physical phenomena from mechanics, thermal physics, electromagnetism, optics and radiation physics, qualitatively as well as quantitatively. We report on its use as lecture demonstrations, student…

  10. RF Wave Simulation Using the MFEM Open Source FEM Package

    NASA Astrophysics Data System (ADS)

    Stillerman, J.; Shiraiwa, S.; Bonoli, P. T.; Wright, J. C.; Green, D. L.; Kolev, T.

    2016-10-01

    A new plasma wave simulation environment based on the finite element method is presented. MFEM, a scalable open-source FEM library, is used as the basis for this capability. MFEM allows for assembling an FEM matrix of arbitrarily high order in a parallel computing environment. A 3D frequency domain RF physics layer was implemented using a python wrapper for MFEM and a cold collisional plasma model was ported. This physics layer allows for defining the plasma RF wave simulation model without user knowledge of the FEM weak-form formulation. A graphical user interface is built on πScope, a python-based scientific workbench, such that a user can build a model definition file interactively. Benchmark cases have been ported to this new environment, with results being consistent with those obtained using COMSOL multiphysics, GENRAY, and TORIC/TORLH spectral solvers. This work is a first step in bringing to bear the sophisticated computational tool suite that MFEM provides (e.g., adaptive mesh refinement, solver suite, element types) to the linear plasma-wave interaction problem, and within more complicated integrated workflows, such as coupling with core spectral solver, or incorporating additional physics such as an RF sheath potential model or kinetic effects. USDoE Awards DE-FC02-99ER54512, DE-FC02-01ER54648.
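
    The cold collisional plasma model mentioned above amounts to a complex dielectric response supplied to the FEM solver. The sketch below computes the standard Stix parameters for a single electron species with a simple Krook-type collision term; it is an illustration of the physics, not the project's actual code:

    ```python
    # Stix cold-plasma parameters S, D, P for electrons, with collisions
    # folded in via omega -> omega + i*nu (a common Krook approximation).
    import numpy as np

    EPS0, QE, ME = 8.854e-12, 1.602e-19, 9.109e-31

    def stix_sdp(ne, B, omega, nu=0.0):
        wp2 = ne * QE**2 / (EPS0 * ME)      # plasma frequency squared
        wc = QE * B / ME                    # electron cyclotron frequency
        w = omega + 1j * nu                 # collisional frequency shift
        S = 1 - wp2 / (w**2 - wc**2)
        D = (wc / omega) * wp2 / (w**2 - wc**2)
        P = 1 - wp2 / (omega * w)
        return S, D, P

    # Illustrative tokamak-edge-like numbers only
    print(stix_sdp(ne=1e19, B=5.0, omega=2 * np.pi * 4.6e9, nu=1e7))
    ```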

  11. Recommendations for a culturally relevant Internet-based tool to promote physical activity among overweight young African American women, Alabama, 2010-2011.

    PubMed

    Durant, Nefertiti H; Joseph, Rodney P; Cherrington, Andrea; Cuffee, Yendelela; Knight, BernNadette; Lewis, Dwight; Allison, Jeroan J

    2014-01-16

    Innovative approaches are needed to promote physical activity among young adult overweight and obese African American women. We sought to describe key elements that African American women desire in a culturally relevant Internet-based tool to promote physical activity among overweight and obese young adult African American women. A mixed-method approach combining nominal group technique and traditional focus groups was used to elicit recommendations for the development of an Internet-based physical activity promotion tool. Participants, ages 19 to 30 years, were enrolled in a major university. Nominal group technique sessions were conducted to identify themes viewed as key features for inclusion in a culturally relevant Internet-based tool. Confirmatory focus groups were conducted to verify and elicit more in-depth information on the themes. Twenty-nine women participated in nominal group (n = 13) and traditional focus group sessions (n = 16). Features that emerged to be included in a culturally relevant Internet-based physical activity promotion tool were personalized website pages, diverse body images on websites and in videos, motivational stories about physical activity and women similar to themselves in size and body shape, tips on hair care maintenance during physical activity, and online social support through social media (eg, Facebook, Twitter). Incorporating existing social media tools and motivational stories from young adult African American women in Internet-based tools may increase the feasibility, acceptability, and success of Internet-based physical activity programs in this high-risk, understudied population.

  12. Earthquake Source Mechanics

    NASA Astrophysics Data System (ADS)

    The past 2 decades have seen substantial progress in our understanding of the nature of the earthquake faulting process, but increasingly, the subject has become an interdisciplinary one. Thus, although the observation of radiated seismic waves remains the primary tool for studying earthquakes (and has been increasingly focused on extracting the physical processes occurring in the “source”), geological studies have also begun to play a more important role in understanding the faulting process. Additionally, defining the physical underpinning for these phenomena has come to be an important subject in experimental and theoretical rock mechanics.In recognition of this, a Maurice Ewing Symposium was held at Arden House, Harriman, N.Y. (the former home of the great American statesman Averell Harriman), May 20-23, 1985. The purpose of the meeting was to bring together the international community of experimentalists, theoreticians, and observationalists who are engaged in the study of various aspects of earthquake source mechanics. The conference was attended by more than 60 scientists from nine countries (France, Italy, Japan, Poland, China, the United Kingdom, United States, Soviet Union, and the Federal Republic of Germany).

  13. ExEP yield modeling tool and validation test results

    NASA Astrophysics Data System (ADS)

    Morgan, Rhonda; Turmon, Michael; Delacroix, Christian; Savransky, Dmitry; Garrett, Daniel; Lowrance, Patrick; Liu, Xiang Cate; Nunez, Paul

    2017-09-01

    EXOSIMS is an open-source simulation tool for parametric modeling of the detection yield and characterization of exoplanets. EXOSIMS has been adopted by the Exoplanet Exploration Program's Standards Definition and Evaluation Team (ExSDET) as a common mechanism for comparison of exoplanet mission concept studies. To ensure trustworthiness of the tool, we developed a validation test plan that leverages the Python-language unit-test framework, utilizes integration tests for selected module interactions, and performs end-to-end cross-validation with other yield tools. This paper presents the test methods and results, with the physics-based tests such as photometry and integration-time calculation treated in detail and the functional tests treated summarily. The test case utilized a 4 m unobscured telescope with an idealized coronagraph and an exoplanet population from the IPAC radial velocity (RV) exoplanet catalog. The known RV planets were set at quadrature to allow deterministic validation of the calculation of physical parameters, such as working angle, photon counts and integration time. The observing keepout region was tested by generating plots and movies of the targets and the keepout zone over a year. Although the keepout integration test required interpretation by a user, the test revealed problems in the L2 halo orbit and the parameterization of keepout applied to some solar system bodies, which the development team was able to address. The validation testing of EXOSIMS was performed iteratively with the developers of EXOSIMS and resulted in a more robust, stable, and trustworthy tool that the exoplanet community can use to simulate exoplanet direct-detection missions from probe class, to WFIRST, up to large mission concepts such as HabEx and LUVOIR.
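
    The abstract notes that validation leans on Python's unit-test framework with planets fixed at quadrature, where the geometry becomes deterministic. A sketch of that style of test follows; the function name is a hypothetical stand-in, not the EXOSIMS API:

    ```python
    # At quadrature the projected separation equals the semi-major axis, so
    # the working angle in arcsec is simply a[AU] / d[pc] (small-angle limit).
    import unittest

    def working_angle_arcsec(a_au: float, d_pc: float) -> float:
        return a_au / d_pc          # hypothetical stand-in for the real calculation

    class TestQuadratureGeometry(unittest.TestCase):
        def test_working_angle(self):
            # a 1 AU orbit seen from 10 pc -> 0.1 arcsec at quadrature
            self.assertAlmostEqual(working_angle_arcsec(1.0, 10.0), 0.1, places=12)

    if __name__ == "__main__":
        unittest.main()
    ```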

  14. X-ray Laser Animated Fly-Through

    ScienceCinema

    None

    2018-01-16

    Take a tour with an electron's-eye view through SLAC's revolutionary new X-ray laser facility with this 5 1/2 minute animation. See how the X-ray pulses are generated using the world's longest linear accelerator along with unique arrays of machinery specially designed for this one-of-a-kind tool. For more than 40 years, SLAC's two-mile-long linear accelerator (or linac) has produced high-energy electrons for cutting-edge physics experiments. Now, SLAC's linac has entered a new phase of its career with the creation of the Linac Coherent Light Source (LCLS).

  15. Developing the School Physical Activity and Nutrition Environment Tool to Measure Qualities of the Obesogenic Context

    ERIC Educational Resources Information Center

    John, Deborah H.; Gunter, Katherine; Jackson, Jennifer A.; Manore, Melinda

    2016-01-01

    Background: Practical tools are needed that reliably measure the complex physical activity (PA) and nutrition environments of elementary schools that influence children's health and learning behaviors for obesity prevention. The School Physical Activity and Nutrition-Environment Tool (SPAN-ET) was developed and beta tested in 6 rural Oregon…

  16. State-of-the-Art for Hygrothermal Simulation Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boudreaux, Philip R.; New, Joshua Ryan; Shrestha, Som S.

    2017-02-01

    The hygrothermal (heat and moisture) performance of buildings can be assessed using simulation tools. A number of hygrothermal calculation tools are currently available, varying in their degree of sophistication and runtime requirements. This report investigates three of the most commonly used models (WUFI, HAMT, and EMPD) to assess their limitations and potential to generate physically realistic results, in order to prioritize improvements for EnergyPlus (which uses HAMT and EMPD). The outcome of the study shows that, of these three tools, WUFI has the greatest hygrothermal capabilities. Limitations of these tools were also assessed, including: WUFI's inability to properly account for air leakage and transfer at surface boundaries; HAMT's inability to handle air leakage, precipitation-related moisture problems, or condensation problems from high relative humidity; and multiple limitations for EMPD as a simplified method to estimate indoor temperature and humidity levels, which is generally not used to estimate the hygrothermal performance of building envelope materials. In conclusion, of the three investigated simulation tools, HAMT has the greatest modeling potential, is open source, and we have prioritized specific features that can enable EnergyPlus to model all relevant heat and moisture transfer mechanisms that impact the performance of building envelope components.

  17. Experiments Using Cell Phones in Physics Classroom Education: The Computer-Aided g Determination

    NASA Astrophysics Data System (ADS)

    Vogt, Patrik; Kuhn, Jochen; Müller, Sebastian

    2011-09-01

    This paper continues the collection of experiments that describe the use of cell phones as experimental tools in physics classroom education.1-4 We describe a computer-aided determination of the free-fall acceleration g using the acoustical Doppler effect. The Doppler shift is a function of the speed of the source. Since a free-falling object's speed changes linearly with time, the Doppler shift also changes with time. It is possible to measure this shift using software that is both easy to use and readily available. Students use the time dependence of the Doppler shift to experimentally determine the acceleration due to gravity, with a cell phone serving as a freely falling object emitting a sound of constant frequency.
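
    A worked sketch of the idea, in a simplified form that neglects the changing sound travel time: the received frequency from the receding source is f(t) = f0/(1 + gt/c) ≈ f0(1 - gt/c) for gt ≪ c, so a linear fit of f against t yields g = -slope * c / f0.

    ```python
    # Fit g from the time-dependent Doppler shift of a falling tone emitter.
    # Synthetic data standing in for the cell-phone measurement.
    import numpy as np

    c, f0, g_true = 343.0, 4000.0, 9.81          # m/s, Hz, m/s^2
    t = np.linspace(0.05, 0.60, 30)              # fall times (s)
    f_obs = f0 / (1 + g_true * t / c)            # receding source
    f_obs += np.random.default_rng(3).normal(0.0, 0.5, t.size)  # measurement noise

    slope, _ = np.polyfit(t, f_obs, 1)           # f ~ f0 - (f0 * g / c) * t
    print(f"estimated g = {-slope * c / f0:.2f} m/s^2")   # close to 9.81
    ```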

  18. Orangutans (Pongo pygmaeus and Pongo abelii) understand connectivity in the skewered grape tool task.

    PubMed

    Mulcahy, Nicholas J; Schubiger, Michèle N; Suddendorf, T

    2013-02-01

    Great apes appear to have limited knowledge of tool functionality when they are presented with tasks that involve a physical connection between a tool and a reward. For instance, they fail to understand that pulling a rope with a reward tied to its end is more beneficial than pulling a rope that only touches a reward. Apes show more success when both ropes have rewards tied to their ends but one rope is nonfunctional because it is clearly separated into aligned sections. It is unclear, however, whether this success is based on perceptual features unrelated to connectivity, such as perceiving the tool's separate sections as independent tools rather than one discontinuous tool. Surprisingly, there appears to be no study that has tested any type of connectivity problem using natural tools made from branches with which wild and captive apes often have extensive experience. It is possible that such ecologically valid tools may better help subjects understand connectivity that involves physical attachment. In this study, we tested orangutans with natural tools and a range of connectivity problems that involved the physical attachment of a reward on continuous and broken tools. We found that the orangutans understood tool connectivity involving physical attachment that apes from other studies failed when tested with similar tasks using artificial as opposed to natural tools. We found no evidence that the orangutans' success in broken tool conditions was based on perceptual features unrelated to connectivity. Our results suggest that artificial tools may limit apes' knowledge of connectivity involving physical attachment, whereas ecologically valid tools may have the opposite effect.

  19. Sonification Prototype for Space Physics

    NASA Astrophysics Data System (ADS)

    Candey, R. M.; Schertenleib, A. M.; Diaz Merced, W. L.

    2005-12-01

    As an alternative and adjunct to visual displays, auditory exploration of data via sonification (data-controlled sound) and audification (audible playback of data samples) is promising for complex or rapidly/temporally changing visualizations, for data exploration of large datasets (particularly multi-dimensional datasets), and for exploring datasets in frequency rather than spatial dimensions (see also the International Conferences on Auditory Display). Besides improving data exploration and analysis for most researchers, the use of sound is especially valuable as an assistive technology for visually-impaired people and can make science and math more exciting for high school and college students. Only recently have the hardware and software come together to make a cross-platform open-source sonification tool feasible. We have developed a prototype sonification data analysis tool using the JavaSound API and NASA GSFC's ViSBARD software. Wanda Diaz Merced, a blind astrophysicist from Puerto Rico, is instrumental in advising on and testing the tool.
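
    The prototype builds on the JavaSound API; the sketch below only illustrates the core sonification idea, mapping a data series to tone pitch and writing the result to a WAV file, rendered here in Python for brevity:

    ```python
    # Map a data series to tone frequencies (300-900 Hz) and write a WAV.
    import wave
    import numpy as np

    rate = 44100
    data = np.sin(np.linspace(0, 6 * np.pi, 40))            # stand-in data series
    freqs = 300 + 600 * (data - data.min()) / np.ptp(data)  # pitch mapping

    tones = []
    for f in freqs:                                         # one short tone per point
        t = np.arange(int(0.08 * rate)) / rate
        tones.append(0.3 * np.sin(2 * np.pi * f * t))
    audio = np.concatenate(tones)

    with wave.open("sonified.wav", "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)                                   # 16-bit PCM
        w.setframerate(rate)
        w.writeframes((audio * 32767).astype(np.int16).tobytes())
    ```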

  20. Thermal protection system (TPS) monitoring using acoustic emission

    NASA Astrophysics Data System (ADS)

    Hurley, D. A.; Huston, D. R.; Fletcher, D. G.; Owens, W. P.

    2011-04-01

    This project investigates acoustic emission (AE) as a tool for monitoring the degradation of thermal protection systems (TPS). The AE sensors are part of an array of instrumentation on an inductively coupled plasma (ICP) torch designed for testing advanced thermal protection aerospace materials used for hypervelocity vehicles. AE are generated by stresses within the material, propagate as elastic stress waves, and can be detected with sensitive instrumentation. Graphite (POCO DFP-2) is used to study gas-surface interaction during degradation of thermal protection materials. The plasma is produced by an RF magnetic field driven by a 30 kW power supply at 3.5 MHz, which creates a noisy environment with large spikes when the supply is powered on or off. AE are waveguided from source to sensor by a liquid-cooled copper probe used to position the graphite sample in the plasma stream. Preliminary testing was used to set filters and thresholds on the AE detection system (Physical Acoustics PCI-2) to minimize the impact of considerable operating noise. Testing results show good correlation between the AE data and the testing environment, which dictates the physics and chemistry of the thermal breakdown of the sample. Current efforts are expanding the dataset and developing statistical analysis tools. This study shows the potential of AE as a powerful tool for analysis of the thermal degradation of thermal protection materials, with the unique capability of real-time, in-situ monitoring.

  1. Using a weight of evidence approach for assessing ...

    EPA Pesticide Factsheets

    The Ottawa River lies in extreme northwest Ohio, flowing into Lake Erie’s western basin at the City of Toledo. The Ottawa River is a component of the Maumee River AOC as defined by the International Commission. The Ottawa River is approximately 45 miles long; however, the 2009-2010 remediation project took place in the lower 8.8 miles of the river where urban and industrial activities have had a detrimental impact on the river as a beneficial resource. The primary COCs at the site are PCBs, PAHs, inorganics (principally lead), and oil and grease. Approximately 260,000 yd³ of contaminated sediments were removed from the study reach. Removal was accomplished through dredging in targeted areas within 3 reaches of the river where COCs exceeded a target level. The overall objectives of this research effort are twofold: 1) Develop chemical, physical, and biological tools and approaches to evaluate the quantity and sources of post-dredge residuals; and 2) Develop an approach to quantify remedial effectiveness using chemical, physical, and biological tools and approaches. This presentation will focus on 2 of the biological tools: assessing response of various trophic levels to changes in tissue concentrations of PCBs and PAHs and DNA damage in Brown Bullheads. From 2009-2013, pre- and post-remedy sampling of fishes representative of different trophic levels was conducted via electroshocking and fyke net sampling. Fishes collected were largemouth bass, brown bullhead,

  2. Physical Education Curriculum Analysis Tool (PECAT)

    ERIC Educational Resources Information Center

    Lee, Sarah M.; Wechsler, Howell

    2006-01-01

    The Physical Education Curriculum Analysis Tool (PECAT) will help school districts conduct a clear, complete, and consistent analysis of written physical education curricula, based upon national physical education standards. The PECAT is customizable to include local standards. The results from the analysis can help school districts enhance…

  3. Transparent Global Seismic Hazard and Risk Assessment

    NASA Astrophysics Data System (ADS)

    Smolka, Anselm; Schneider, John; Pinho, Rui; Crowley, Helen

    2013-04-01

    Vulnerability to earthquakes is increasing, yet advanced reliable risk assessment tools and data are inaccessible to most, despite being a critical basis for managing risk. Also, there are few, if any, global standards that allow us to compare risk between various locations. The Global Earthquake Model (GEM) is a unique collaborative effort that aims to provide organizations and individuals with tools and resources for transparent assessment of earthquake risk anywhere in the world. By pooling data, knowledge and people, GEM acts as an international forum for collaboration and exchange, and leverages the knowledge of leading experts for the benefit of society. Sharing of data and risk information, best practices, and approaches across the globe is key to assessing risk more effectively. Through global projects, open-source IT development and collaborations with more than 10 regions, leading experts are collaboratively developing unique global datasets, best practice, open tools and models for seismic hazard and risk assessment. Guided by the needs and experiences of governments, companies and citizens at large, they work in continuous interaction with the wider community. A continuously expanding public-private partnership constitutes the GEM Foundation, which drives the collaborative GEM effort. An integrated and holistic approach to risk is key to GEM's risk assessment platform, OpenQuake, which integrates all of the above-mentioned contributions and will become available towards the end of 2014. Stakeholders worldwide will be able to calculate, visualise and investigate earthquake risk, capture new data and share their findings for joint learning. Homogenized information on hazard can be combined with data on exposure (buildings, population) and data on their vulnerability for loss assessment around the globe. Furthermore, for a truly integrated view of seismic risk, users can add social vulnerability and resilience indices to maps and estimate the costs and benefits of different risk management measures. The following global data, models and methodologies will be available in the platform; some will be released to the public earlier, such as the ISC-GEM global instrumental catalogue (released January 2013).
    Datasets:
    • Global Earthquake History Catalogue [1000-1903]
    • Global Instrumental Catalogue [1900-2009]
    • Global Geodetic Strain Rate Model
    • Global Active Fault Database
    • Tectonic Regionalisation
    • Buildings and Population Database
    • Earthquake Consequences Database
    • Physical Vulnerability Database
    • Socio-Economic Vulnerability and Resilience Indicators
    Models:
    • Seismic Source Models
    • Ground Motion (Attenuation) Models
    • Physical Exposure Models
    • Physical Vulnerability Models
    • Composite Index Models (social vulnerability, resilience, indirect loss)
    The aforementioned models developed under the GEM framework will be combined to produce estimates of hazard and risk at a global scale. Furthermore, building on many ongoing efforts and the knowledge of scientists worldwide, GEM will integrate state-of-the-art data, models, results and open-source tools into a single platform that is to serve as a "clearinghouse" on seismic risk. The platform will continue to increase in value, in particular for use in local contexts, through contributions and collaborations with scientists and organisations worldwide.

  4. Modeling tools for the assessment of microbiological risks during floods: a review

    NASA Astrophysics Data System (ADS)

    Collender, Philip; Yang, Wen; Stieglitz, Marc; Remais, Justin

    2015-04-01

    Floods are a major, recurring source of harm to global economies and public health. Projected increases in the frequency and intensity of heavy precipitation events under future climate change, coupled with continued urbanization in areas with high risk of floods, may exacerbate future impacts of flooding. Improved flood risk management is essential to support global development, poverty reduction and public health, and is likely to be a crucial aspect of climate change adaptation. Importantly, floods can facilitate the transmission of waterborne pathogens by changing social conditions (overcrowding among displaced populations, interruption of public health services), imposing physical challenges to infrastructure (sewerage overflow, reduced capacity to treat drinking water), and altering fate and transport of pathogens (transport into waterways from overland flow, resuspension of settled contaminants) during and after flood conditions. Hydrological and hydrodynamic models are capable of generating quantitative characterizations of microbiological risks associated with flooding, while accounting for these diverse and at times competing physical and biological processes. Despite a few applications of such models to the quantification of microbiological risks associated with floods, there exists limited guidance as to the relative capabilities, and limitations, of existing modeling platforms when used for this purpose. Here, we review 17 commonly used flood and water quality modeling tools that have demonstrated or implicit capabilities for mechanistically representing and quantifying microbial risk during flood conditions. We compare models with respect to their capabilities of generating outputs that describe physical and microbial conditions during floods, such as concentration or load of non-cohesive sediments or pathogens, and the dynamics of high flow conditions. Recommendations are presented for the application of specific modeling tools for assessing particular flood-related microbial risks, and model improvements are suggested that may better characterize key microbial risks during flood events. The state of current tools is assessed in the context of a changing climate, where the frequency, intensity and duration of flooding are shifting in some areas.

  5. Systematic Clinical Reasoning in Physical Therapy (SCRIPT): Tool for the Purposeful Practice of Clinical Reasoning in Orthopedic Manual Physical Therapy.

    PubMed

    Baker, Sarah E; Painter, Elizabeth E; Morgan, Brandon C; Kaus, Anna L; Petersen, Evan J; Allen, Christopher S; Deyle, Gail D; Jensen, Gail M

    2017-01-01

    Clinical reasoning is essential to physical therapist practice. Solid clinical reasoning processes may lead to greater understanding of the patient condition, early diagnostic hypothesis development, and well-tolerated examination and intervention strategies, as well as mitigate the risk of diagnostic error. However, the complex and often subconscious nature of clinical reasoning can impede the development of this skill. Protracted tools have been published to help guide self-reflection on clinical reasoning but might not be feasible in typical clinical settings. This case illustrates how the Systematic Clinical Reasoning in Physical Therapy (SCRIPT) tool can be used to guide the clinical reasoning process and prompt a physical therapist to search the literature to answer a clinical question and facilitate formal mentorship sessions in postprofessional physical therapist training programs. The SCRIPT tool enabled the mentee to generate appropriate hypotheses, plan the examination, query the literature to answer a clinical question, establish a physical therapist diagnosis, and design an effective treatment plan. The SCRIPT tool also facilitated the mentee's clinical reasoning and provided the mentor insight into the mentee's clinical reasoning. The reliability and validity of the SCRIPT tool have not been formally studied. Clinical mentorship is a cornerstone of postprofessional training programs and intended to develop advanced clinical reasoning skills. However, clinical reasoning is often subconscious and, therefore, a challenging skill to develop. The use of a tool such as the SCRIPT may facilitate developing clinical reasoning skills by providing a systematic approach to data gathering and making clinical judgments to bring clinical reasoning to the conscious level, facilitate self-reflection, and make a mentored physical therapist's thought processes explicit to his or her clinical mentor.

  6. Drekar v.2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seefeldt, Ben; Sondak, David; Hensinger, David M.

    Drekar is an application code that solves partial differential equations for fluids that can be optionally coupled to electromagnetics. Drekar solves low-Mach compressible and incompressible computational fluid dynamics (CFD), compressible and incompressible resistive magnetohydrodynamics (MHD), and multiple-species plasmas interacting with electromagnetic fields. Drekar discretization technology includes continuous and discontinuous finite element formulations, stabilized finite element formulations, mixed-integration finite element bases (nodal, edge, face, volume) and an initial arbitrary Lagrangian-Eulerian (ALE) capability. Drekar contains the implementation of the discretized physics and leverages the open source Trilinos project for both parallel solver capabilities and general finite element discretization tools. The code will be released open source under a BSD license. The code is used for fundamental research for simulation of fluids and plasmas in high performance computing environments.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, Zhiming; Abdelaziz, Omar; Qu, Ming

    This paper introduces a first-order physics-based model that accounts for the fundamental heat and mass transfer between a humid-air vapor stream on the feed side and another flow stream on the permeate side. The model comprises a few optional submodels for membrane mass transport, and it adopts a segment-by-segment method for discretizing the heat and mass transfer governing equations for flow streams on the feed and permeate sides. The model is able to simulate both dehumidifiers and energy recovery ventilators in parallel-flow, cross-flow, and counter-flow configurations. The predicted results compare reasonably well with the measurements. The open-source codes are written in C++. The model and open-source codes are expected to become a fundamental tool for the analysis of membrane-based dehumidification in the future.
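
    To make the segment-by-segment idea concrete, the following sketch marches a parallel-flow membrane dehumidifier through discrete segments, transferring water vapor in proportion to the local humidity-ratio difference. It is a minimal Python illustration under assumed values (permeance, membrane area, flow rates), not the paper's C++ code.

        # Hypothetical parallel-flow membrane dehumidifier, segment by segment.
        N = 50                         # number of segments
        area = 0.5 / N                 # membrane area per segment, m^2 (0.5 m^2 total, assumed)
        perm = 2.0e-3                  # lumped vapor permeance, kg/(m^2 s) per unit
                                       # humidity-ratio difference (assumed)
        m_feed = m_perm = 0.02         # dry-air mass flow rates, kg/s (assumed)
        w_feed, w_perm = 0.012, 0.004  # inlet humidity ratios, kg water / kg dry air

        for _ in range(N):
            flux = perm * area * (w_feed - w_perm)  # vapor crossing the membrane, kg/s
            w_feed -= flux / m_feed                 # feed stream dries out
            w_perm += flux / m_perm                 # permeate stream picks up moisture

        print(f"outlet humidity ratios: feed {w_feed:.5f}, permeate {w_perm:.5f}")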

  8. Recommendations for a Culturally Relevant Internet-Based Tool to Promote Physical Activity Among Overweight Young African American Women, Alabama, 2010–2011

    PubMed Central

    Joseph, Rodney P.; Cherrington, Andrea; Cuffee, Yendelela; Knight, BernNadette; Lewis, Dwight; Allison, Jeroan J.

    2014-01-01

    Introduction Innovative approaches are needed to promote physical activity among young adult overweight and obese African American women. We sought to describe key elements that African American women desire in a culturally relevant Internet-based tool to promote physical activity among overweight and obese young adult African American women. Methods A mixed-method approach combining nominal group technique and traditional focus groups was used to elicit recommendations for the development of an Internet-based physical activity promotion tool. Participants, ages 19 to 30 years, were enrolled in a major university. Nominal group technique sessions were conducted to identify themes viewed as key features for inclusion in a culturally relevant Internet-based tool. Confirmatory focus groups were conducted to verify and elicit more in-depth information on the themes. Results Twenty-nine women participated in nominal group (n = 13) and traditional focus group sessions (n = 16). Features that emerged to be included in a culturally relevant Internet-based physical activity promotion tool were personalized website pages, diverse body images on websites and in videos, motivational stories about physical activity and women similar to themselves in size and body shape, tips on hair care maintenance during physical activity, and online social support through social media (eg, Facebook, Twitter). Conclusion Incorporating existing social media tools and motivational stories from young adult African American women in Internet-based tools may increase the feasibility, acceptability, and success of Internet-based physical activity programs in this high-risk, understudied population. PMID:24433625

  9. Statistical physics of vaccination

    NASA Astrophysics Data System (ADS)

    Wang, Zhen; Bauch, Chris T.; Bhattacharyya, Samit; d'Onofrio, Alberto; Manfredi, Piero; Perc, Matjaž; Perra, Nicola; Salathé, Marcel; Zhao, Dawei

    2016-12-01

    Historically, infectious diseases caused considerable damage to human societies, and they continue to do so today. To help reduce their impact, mathematical models of disease transmission have been studied to help understand disease dynamics and inform prevention strategies. Vaccination, one of the most important preventive measures of modern times, is of great interest both theoretically and empirically. And in contrast to traditional approaches, recent research increasingly explores the pivotal implications of individual behavior and heterogeneous contact patterns in populations. Our report reviews the developmental arc of theoretical epidemiology with emphasis on vaccination, as it led from classical models assuming homogeneously mixing (mean-field) populations and ignoring human behavior, to recent models that account for behavioral feedback and/or population spatial/social structure. Many of the methods used originated in statistical physics, such as lattice and network models, and their associated analytical frameworks. Similarly, the feedback loop between vaccinating behavior and disease propagation forms a coupled nonlinear system with analogs in physics. We also review the new paradigm of digital epidemiology, wherein sources of digital data such as online social media are mined for high-resolution information on epidemiologically relevant individual behavior. Armed with the tools and concepts of statistical physics, and further assisted by new sources of digital data, models that capture nonlinear interactions between behavior and disease dynamics offer a novel way of modeling real-world phenomena, and can help improve health outcomes. We conclude the review by discussing open problems in the field and promising directions for future research.
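
    As a minimal illustration of the behavior-disease feedback loop described above (a sketch, not a model taken from the review), the following couples a standard SIR system to a vaccination rate that rises with current prevalence; every parameter value is invented.

        # SIR dynamics with prevalence-driven voluntary vaccination (illustrative).
        beta, gamma = 0.3, 0.1   # transmission and recovery rates (assumed)
        kappa = 5.0              # behavioral sensitivity of vaccine uptake (assumed)
        dt, steps = 0.1, 5000

        S, I, R, V = 0.99, 0.01, 0.0, 0.0
        for _ in range(steps):
            v = kappa * I * S    # uptake grows with prevalence (assumed functional form)
            dS = -beta * S * I - v
            dI = beta * S * I - gamma * I
            dR = gamma * I
            S, I, R, V = S + dt * dS, I + dt * dI, R + dt * dR, V + dt * v

        print(f"vaccinated fraction {V:.3f}, epidemic size {R:.3f}")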

  10. Cyber-Physical Security Assessment (CyPSA) Toolset

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, Luis; Patapanchala, Panini; Zonouz, Saman

    CyPSA seeks to organize and gain insight into the diverse sets of data that a critical infrastructure provider must manage. Specifically, CyPSA inventories, manages, and analyzes assets and relations among those assets. A variety of interfaces are provided. CyPSA inventories assets (both cyber and physical). This may include the cataloging of assets through a common interface. Data sources used to generate a catalogue of assets include PowerWorld, NPView, NMap scans, and device configurations. Depending upon the role of the person using the tool, the types of assets accessed as well as the data sources through which asset information is accessed may vary. CyPSA allows practitioners to catalogue relations among assets, and these may either be manually or programmatically generated. For example, some common relations among assets include the following: Topological Network Data: Which devices and assets are connected and how? Data sources for this kind of information include NMap scans and NPView topologies (via firewall rule analysis). Security Metrics Outputs: The output of various security metrics such as overall exposure. Configure Assets: CyPSA may eventually include the ability to configure assets including relays and switches. For example, a system administrator would be able to configure and alter the state of a relay via the CyPSA interface. Annotate Assets: CyPSA also allows practitioners to manually and programmatically annotate assets. Sources of information with which to annotate assets include provenance metadata regarding the data source from which the asset was loaded, vulnerability information from vulnerability databases, configuration information, and the output of an analysis in general.

  11. International Data on Radiological Sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martha Finck; Margaret Goldberg

    2010-07-01

    The mission of radiological dispersal device (RDD) nuclear forensics is to identify the provenance of nuclear and radiological materials used in RDDs and to aid law enforcement in tracking nuclear materials and routes. The application of databases to radiological forensics is to match RDD source material to a source model in the database, provide guidance regarding a possible second device, and aid the FBI by providing a short list of manufacturers and distributors, and ultimately the last legal owner of the source. The Argonne/Idaho National Laboratory RDD attribution database is a powerful technical tool in radiological forensics. The database (1267 unique vendors) includes all sealed sources and devices registered in the U.S., is complemented by data from the IAEA Catalogue, and is supported by rigorous in-lab characterization of selected sealed sources regarding physical form, radiochemical composition, and age-dating profiles. Close working relationships with global partners in the commercial sealed sources industry provide invaluable technical information and expertise in the development of signature profiles. These profiles are critical to the down-selection of potential candidates in either pre- or post-event RDD attribution. The down-selection process includes a match between an interdicted (or detonated) source and a model in the database linked to one or more manufacturers and distributors.

  12. Earth-Observation based mapping and monitoring of exposure change in the megacity of Istanbul: open-source tools from the MARSITE project

    NASA Astrophysics Data System (ADS)

    De Vecchi, Daniele; Dell'Acqua, Fabio

    2016-04-01

    The EU FP7 MARSITE project aims at assessing the "state of the art" of seismic risk evaluation and management at the European level, as a starting point to move a "step forward" towards new concepts of risk mitigation and management through long-term monitoring activities carried out both on land and at sea. Spaceborne Earth Observation (EO) is one of the means through which MARSITE is accomplishing this commitment, and its importance is growing as a consequence of the operational unfolding of the Copernicus initiative. Sentinel-2 data, with its open-data policy, represents an unprecedented opportunity to access global spaceborne multispectral data for various purposes including risk monitoring. In the framework of the EU FP7 projects MARSITE, RASOR and SENSUM, our group has developed a suite of geospatial software tools to automatically extract risk-related features from EO data, especially on the exposure and vulnerability side of the "risk equation" [1]. These are, for example, the extension of a built-up area or the distribution of building density. These tools are available open-source as QGIS plug-ins [2] and their source code can be freely downloaded from GitHub [3]. A test case on the risk-prone megacity of Istanbul has been set up, and preliminary results will be presented in this paper. The output of the algorithms can be incorporated into a risk modeling process, whose output is very useful to stakeholders and decision makers who intend to assess and mitigate the risk level across the giant urban agglomerate. Keywords - Remote Sensing, Copernicus, Istanbul megacity, seismic risk, multi-risk, exposure, open-source References [1] Harb, M.M.; De Vecchi, D.; Dell'Acqua, F., "Physical Vulnerability Proxies from Remote Sensing: Reviewing, Implementing and Disseminating Selected Techniques," Geoscience and Remote Sensing Magazine, IEEE, vol. 3, no. 1, pp. 20-33, March 2015. doi: 10.1109/MGRS.2015.2398672 [2] SENSUM QGIS plugin, 2016, available online at: https://plugins.qgis.org/plugins/sensum_eo_tools/ [3] SENSUM QGIS code repository, 2016, available online at: https://github.com/SENSUM-project/sensum_rs_qgis

  13. Estimation of contribution ratios of pollutant sources to a specific section based on an enhanced water quality model.

    PubMed

    Cao, Bibo; Li, Chuan; Liu, Yan; Zhao, Yue; Sha, Jian; Wang, Yuqiu

    2015-05-01

    Because water quality monitoring sections or sites could reflect the water quality status of rivers, surface water quality management based on water quality monitoring sections or sites would be effective. For the purpose of improving water quality of rivers, quantifying the contribution ratios of pollutant sources to a specific section is necessary. Because physical and chemical processes of nutrient pollutants are complex in water bodies, it is difficult to quantitatively compute the contribution ratios. However, water quality models have proved to be effective tools to estimate surface water quality. In this project, an enhanced QUAL2Kw model with an added module was applied to the Xin'anjiang Watershed, to obtain water quality information along the river and to assess the contribution ratios of each pollutant source to a certain section (the Jiekou state-controlled section). Model validation indicated that the results were reliable. Then, contribution ratios were analyzed through the added module. Results show that among the pollutant sources, the Lianjiang tributary contributes the largest part of total nitrogen (50.43%), total phosphorus (45.60%), ammonia nitrogen (32.90%), nitrate (nitrite + nitrate) nitrogen (47.73%), and organic nitrogen (37.87%). Furthermore, contribution ratios in different reaches varied along the river. Compared with pollutant load ratios of different sources in the watershed, an analysis of contribution ratios of pollutant sources for each specific section, which takes the localized chemical and physical processes into consideration, was more suitable for local-regional water quality management. In summary, this method of analyzing the contribution ratios of pollutant sources to a specific section based on the QUAL2Kw model was found to support the improvement of the local environment.
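
    The notion of a contribution ratio can be illustrated without the full model: if each source load decays roughly first-order along its travel distance to the monitored section, the ratios are taken over the surviving loads. The sketch below uses invented loads, distances and a single decay rate, standing in for the far more detailed in-stream kinetics that QUAL2Kw actually solves.

        import math

        k = 0.02  # net first-order loss rate per km (assumed)
        sources = {  # name: (TN load entering the river kg/d, distance to section km) - invented
            "Lianjiang tributary": (1200.0, 15.0),
            "WWTP outfall": (400.0, 40.0),
            "Diffuse runoff": (650.0, 25.0),
        }

        delivered = {n: load * math.exp(-k * d) for n, (load, d) in sources.items()}
        total = sum(delivered.values())
        for name, load in delivered.items():
            print(f"{name:20s} contributes {100 * load / total:5.1f} % at the section")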

  14. TU-AB-BRC-07: Efficiency of An IAEA Phase-Space Source for a Low Energy X-Ray Tube Using Egs++

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watson, PGF; Renaud, MA; Seuntjens, J

    Purpose: To extend the capability of the EGSnrc C++ class library (egs++) to write and read IAEA phase-space files as a particle source, and to assess the relative efficiency gain in dose calculation using an IAEA phase-space source for modelling a miniature low energy x-ray source. Methods: We created a new ausgab object to score particles exiting a user-defined geometry and write them to an IAEA phase-space file. A new particle source was created to read from IAEA phase-space data. With these tools, a phase-space file was generated for particles exiting a miniature 50 kVp x-ray tube (The INTRABEAM System, Carl Zeiss). The phase-space source was validated by comparing calculated PDDs with a full electron source simulation of the INTRABEAM. The dose calculation efficiency gain of the phase-space source was determined relative to the full simulation. The efficiency gain as a function of i) depth in water, and ii) job parallelization was investigated. Results: The phase-space and electron source PDDs were found to agree to 0.5% RMS, comparable to statistical uncertainties. The use of a phase-space source for the INTRABEAM led to a relative efficiency gain of greater than 20 over the full electron source simulation, with an increase of up to a factor of 196. The efficiency gain was found to decrease with depth in water, due to the influence of scattering. Job parallelization (across 2 to 256 cores) was not found to have any detrimental effect on efficiency gain. Conclusion: A set of tools has been developed for writing and reading IAEA phase-space files, which can be used with any egs++ user code. For simulation of a low energy x-ray tube, the use of a phase-space source was found to increase the relative dose calculation efficiency by a factor of up to 196. The authors acknowledge partial support by the CREATE Medical Physics Research Training Network grant of the Natural Sciences and Engineering Research Council (Grant No. 432290).
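
    For context, the relative efficiency gain quoted above is conventionally the ratio of Monte Carlo efficiencies, epsilon = 1/(s^2 T), where s is the statistical uncertainty of the scored dose and T the CPU time. The numbers in this sketch are invented; only the formula reflects standard practice.

        def efficiency(rel_uncertainty, cpu_time_s):
            # Standard Monte Carlo efficiency: eps = 1 / (s^2 * T)
            return 1.0 / (rel_uncertainty**2 * cpu_time_s)

        eps_full = efficiency(0.01, 4.0e4)  # full electron-source run (invented numbers)
        eps_phsp = efficiency(0.01, 2.0e2)  # phase-space source run (invented numbers)
        print(f"relative efficiency gain: {eps_phsp / eps_full:.0f}x")  # -> 200x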

  15. Hydrologic analysis for selection and placement of conservation practices at the watershed scale

    NASA Astrophysics Data System (ADS)

    Wilson, C.; Brooks, E. S.; Boll, J.

    2012-12-01

    When a water body exceeds water quality standards and a Total Maximum Daily Load has been established, conservation practices in the watershed are able to reduce point and non-point source pollution. Hydrological analysis is needed to place conservation practices in the most hydrologically sensitive areas. The selection and placement of conservation practices, however, are challenging in ungauged watersheds with little or no data for the hydrological analysis. The objective of this research is to perform a hydrological analysis for mitigation of erosion and total phosphorus in a mixed land use watershed, and to select and place the conservation practices in the most sensitive areas. The study area is the Hangman Creek watershed in Idaho and Washington State, upstream of Long Lake (WA) reservoir, east of Spokane, WA. While the pollutant of concern is total phosphorus (TP), reductions in TP were translated to total suspended solids or reductions in nonpoint source erosion and sediment delivery to streams. Hydrological characterization was done with a simple web-based tool, which runs the Water Erosion Prediction Project (WEPP) model for representative land types in the watersheds, where a land type is defined as a unique combination of soil type, slope configuration, land use and management, and climate. The web-based tool used site-specific spatial and temporal data on land use, soil physical parameters, slope, and climate derived from readily available data sources and provided information on potential pollutant pathways (i.e. erosion, runoff, lateral flow, and percolation). Representative land types in the watershed were ranked from most effective to least effective and displayed spatially using GIS. The methodology for the Hangman Creek watershed was validated in the nearby Paradise Creek watershed, which has long-term stream discharge monitoring as well as land use data. Output from the web-based tool shows the potential reductions for different tillage practices, buffer strips, streamside management, and conversion to the conservation reserve program in the watershed. The output also includes the relationship between land area where conservation practices are placed and the potential reduction in pollution, showing the diminished returns on investment as less sensitive areas are being treated. This application of a simple web-based tool and the use of a physically-based erosion model (i.e. WEPP) illustrates that quantitative, spatial and temporal analysis of changes in pollutant loading and site-specific recommendations of conservation practices can be made in ungauged watersheds.

  16. Case identification of depression in patients with chronic physical health problems: a diagnostic accuracy meta-analysis of 113 studies

    PubMed Central

    Meader, Nicholas; Mitchell, Alex J; Chew-Graham, Carolyn; Goldberg, David; Rizzo, Maria; Bird, Victoria; Kessler, David; Packham, Jon; Haddad, Mark; Pilling, Stephen

    2011-01-01

    Background Depression is more likely in patients with chronic physical illness, and is associated with increased rates of disability and mortality. Effective treatment of depression may reduce morbidity and mortality. The use of two stem questions for case finding in diabetes and coronary heart disease is advocated in the Quality and Outcomes Framework, and has become normalised into primary care. Aim To define the most effective tool for use in consultations to detect depression in people with chronic physical illness. Design Meta-analysis. Method The following data sources were searched: CENTRAL, CINAHL, Embase, HMIC, MEDLINE, PsycINFO, Web of Knowledge, from inception to July 2009. Three authors selected studies that examined identification tools and used an interview-based ICD (International Classification of Diseases) or DSM (Diagnostic and statistical Manual of Mental Disorders) diagnosis of depression as reference standard. At least two authors independently extracted study characteristics and outcome data and assessed methodological quality. Results A total of 113 studies met the eligibility criteria, providing data on 20 826 participants. It was found that two stem questions, PHQ-9 (Patient Health Questionnaire), the Zung, and GHQ-28 (General Health Questionnaire) were the optimal measures for case identification, but no method was sufficiently accurate to recommend as a definitive case-finding tool. Limitations were the moderate-to-high heterogeneity for most scales and the facts that few studies used ICD diagnoses as the reference standard, and that a variety of methods were used to determine DSM diagnoses. Conclusion Assessing both validity and ease of use, the two stem questions are the preferred method. However, clinicians should not rely on the two-questions approach alone, but should be confident to engage in a more detailed clinical assessment of patients who score positively. PMID:22137418

  17. Exploring physics concepts among novice teachers through CMAP tools

    NASA Astrophysics Data System (ADS)

    Suprapto, N.; Suliyanah; Prahani, B. K.; Jauhariyah, M. N. R.; Admoko, S.

    2018-03-01

    Concept maps are graphical tools for organising, elaborating and representing knowledge. Through the CmapTools software, the understanding and hierarchical structuring of physics concepts among novice teachers can be explored. The software helps physics teachers indicate a physics context, focus questions, parking lots, cross-links, branching, hierarchy, and propositions. Using an exploratory quantitative study, a total of 13 concept maps on different physics topics created by novice physics teachers were analysed. The main differences between the lecturer's scoring and peer-teachers' scoring were also illustrated. The study offered some implications, especially for physics educators, on determining the hierarchical structure of physics concepts, constructing a physics focus question, and seeing how a concept in one domain of knowledge represented on the map is related to a concept in another domain shown on the map.

  18. VISAGE Visualization for Integrated Satellite, Airborne and Ground-Based Data Exploration

    NASA Technical Reports Server (NTRS)

    Conover, Helen; Berendes, Todd; Naeger, Aaron; Maskey, Manil; Gatlin, Patrick; Wingo, Stephanie; Kulkarni, Ajinkya; Gupta, Shivangi; Nagaraj, Sriraksha; Wolff, David

    2017-01-01

    The primary goal of the VISAGE project is to facilitate more efficient Earth Science investigations via a tool that can provide visualization and analytic capabilities for diverse coincident datasets. This proof-of-concept project will be centered around the GPM Ground Validation program, which provides a valuable source of intensive, coincident observations of atmospheric phenomena. The data are from a wide variety of ground-based, airborne and satellite instruments, with a wide diversity in spatial and temporal scales, variables, and formats, which makes these data difficult to use together. VISAGE will focus on "golden cases" where most ground instruments were in operation and multiple research aircraft sampled a significant weather event, ideally while the GPM Core Observatory passed overhead. The resulting tools will support physical process studies as well as satellite and model validation.

  19. MyPMFs: a simple tool for creating statistical potentials to assess protein structural models.

    PubMed

    Postic, Guillaume; Hamelryck, Thomas; Chomilier, Jacques; Stratmann, Dirk

    2018-05-29

    Evaluating the model quality of protein structures that evolve in environments with particular physicochemical properties requires scoring functions that are adapted to their specific residue compositions and/or structural characteristics. Thus, computational methods developed for structures from the cytosol cannot work properly on membrane or secreted proteins. Here, we present MyPMFs, an easy-to-use tool that allows users to train statistical potentials of mean force (PMFs) on the protein structures of their choice, with all parameters being adjustable. We demonstrate its use by creating an accurate statistical potential for transmembrane protein domains. We also show its usefulness to study the influence of the physical environment on residue interactions within protein structures. Our open-source software is freely available for download at https://github.com/bibip-impmc/mypmfs. Copyright © 2018. Published by Elsevier B.V.
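
    The inverse-Boltzmann idea behind such knowledge-based potentials can be stated in a few lines: E(r) = -kT ln(p_obs(r)/p_ref(r)) over binned pairwise distances. The sketch below uses invented counts and is not MyPMFs itself, which also handles the choice of reference state and the training-set bookkeeping.

        import math

        kT = 0.593  # kcal/mol at ~298 K
        obs_counts = [12, 85, 240, 310, 280]   # observed pair counts per distance bin (invented)
        ref_counts = [30, 110, 230, 290, 300]  # reference-state counts (invented)

        n_obs, n_ref = sum(obs_counts), sum(ref_counts)
        for i, (o, r) in enumerate(zip(obs_counts, ref_counts)):
            energy = -kT * math.log((o / n_obs) / (r / n_ref))
            print(f"bin {i}: E = {energy:+.3f} kcal/mol")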

  20. iPadPix—A novel educational tool to visualise radioactivity measured by a hybrid pixel detector

    NASA Astrophysics Data System (ADS)

    Keller, O.; Schmeling, S.; Müller, A.; Benoit, M.

    2016-11-01

    With the ability to attribute signatures of ionising radiation to certain particle types, pixel detectors offer a unique advantage over the traditional use of Geiger-Müller tubes also in educational settings. We demonstrate in this work how a Timepix readout chip combined with a standard 300 μm pixelated silicon sensor can be used to visualise radioactivity in real-time and by means of augmented reality. The chip family is the result of technology transfer from High Energy Physics at CERN and facilitated by the Medipix Collaboration. This article summarises the development of a prototype based on an iPad mini and open source software detailed in ref. [1]. Appropriate experimental activities that explore natural radioactivity and everyday objects are given to demonstrate the use of this new tool in educational settings.
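
    The attribution of radiation signatures to particle types typically rests on cluster-shape heuristics: alphas leave large round blobs, electrons thin curly tracks, and photons small dots. The toy classifier below illustrates the principle only; the thresholds are invented and do not reproduce iPadPix's actual algorithm.

        def classify(cluster):
            """cluster: list of (x, y) hit-pixel coordinates from one frame."""
            n = len(cluster)
            if n <= 2:
                return "photon (small dot)"
            xs = [p[0] for p in cluster]
            ys = [p[1] for p in cluster]
            width = max(xs) - min(xs) + 1
            height = max(ys) - min(ys) + 1
            density = n / (width * height)  # fill fraction of the bounding box
            if n > 20 and density > 0.5:
                return "alpha (heavy round blob)"
            return "electron (thin curly track)"

        print(classify([(10, 10), (11, 10)]))                          # photon
        print(classify([(x, y) for x in range(6) for y in range(6)]))  # alpha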

  1. SIGNUM: A Matlab, TIN-based landscape evolution model

    NASA Astrophysics Data System (ADS)

    Refice, A.; Giachetta, E.; Capolongo, D.

    2012-08-01

    Several numerical landscape evolution models (LEMs) have been developed to date, and many are available as open source codes. Most are written in efficient programming languages such as Fortran or C, but often require additional code efforts to plug in to more user-friendly data analysis and/or visualization tools to ease interpretation and scientific insight. In this paper, we present an effort to port a common core of accepted physical principles governing landscape evolution directly into a high-level language and data analysis environment such as Matlab. SIGNUM (acronym for Simple Integrated Geomorphological Numerical Model) is an independent and self-contained Matlab, TIN-based landscape evolution model, built to simulate topography development at various space and time scales. SIGNUM is presently capable of simulating hillslope processes such as linear and nonlinear diffusion, fluvial incision into bedrock, spatially varying surface uplift which can be used to simulate changes in base level, thrust and faulting, as well as effects of climate changes. Although based on accepted and well-known processes and algorithms in its present version, it is built with a modular structure that allows the simulated physical processes to be easily modified and upgraded to suit virtually any user's needs. The code is conceived as an open-source project, and is thus an ideal tool for both research and didactic purposes, thanks to the high-level nature of the Matlab environment and its popularity among the scientific community. In this paper the simulation code is presented together with some simple examples of surface evolution, and guidelines for development of new modules and algorithms are proposed.
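
    The process laws SIGNUM combines can be sketched in one dimension (on a regular grid rather than a TIN): uplift plus linear hillslope diffusion plus stream-power incision, dz/dt = U + kappa d2z/dx2 - K A^m S^n. The coefficients and the drainage-area proxy below are invented, and the sketch is in Python rather than SIGNUM's Matlab.

        import numpy as np

        nx, dx, dt = 100, 100.0, 100.0                   # nodes, spacing (m), timestep (yr)
        U, kappa, K, m, n = 1e-3, 0.01, 1e-5, 0.5, 1.0   # invented coefficients
        z = np.linspace(0.0, 50.0, nx)                   # initial ramp topography
        area = np.arange(nx, 0, -1) * dx**2              # crude proxy: area grows toward x = 0

        for _ in range(500):
            slope = np.abs(np.gradient(z, dx))
            diffusion = kappa * np.gradient(np.gradient(z, dx), dx)
            incision = K * area**m * slope**n
            z += dt * (U + diffusion - incision)
            z[0] = 0.0                                   # fixed base level at the outlet

        print(f"mean elevation after run: {z.mean():.1f} m")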

  2. A Guided Tour of Mathematical Methods - 2nd Edition

    NASA Astrophysics Data System (ADS)

    Snieder, Roel

    2004-09-01

    Mathematical methods are essential tools for all physical scientists. This second edition provides a comprehensive tour of the mathematical knowledge and techniques that are needed by students in this area. In contrast to more traditional textbooks, all the material is presented in the form of problems. Within these problems the basic mathematical theory and its physical applications are well integrated. The mathematical insights that the student acquires are therefore driven by their physical insight. Topics that are covered include vector calculus, linear algebra, Fourier analysis, scale analysis, complex integration, Green's functions, normal modes, tensor calculus, and perturbation theory. The second edition contains new chapters on dimensional analysis, variational calculus, and the asymptotic evaluation of integrals. This book can be used by undergraduates and lower-level graduate students in the physical sciences. It can serve as a stand-alone text, or as a source of problems and examples to complement other textbooks. All the material is presented in the form of problems. Mathematical insights are gained by getting the reader to develop answers themselves. Many applications of the mathematics are given.

  3. Matlab Geochemistry: An open source geochemistry solver based on MRST

    NASA Astrophysics Data System (ADS)

    McNeece, C. J.; Raynaud, X.; Nilsen, H.; Hesse, M. A.

    2017-12-01

    The study of geological systems often requires the solution of complex geochemical relations. To address this need we present an open source geochemical solver based on the Matlab Reservoir Simulation Toolbox (MRST) developed by SINTEF. The implementation supports non-isothermal multicomponent aqueous complexation, surface complexation, ion exchange, and dissolution/precipitation reactions. The suite of tools available in MRST allows for rapid model development, in particular the incorporation of geochemical calculations into transport simulations of multiple phases, complex domain geometry and geomechanics. Different numerical schemes and additional physics can be easily incorporated into the existing tools through the object-oriented framework employed by MRST. The solver leverages the automatic differentiation tools available in MRST to solve arbitrarily complex geochemical systems with any choice of species or element concentration as input. Four mathematical approaches enable the solver to be quite robust: 1) the choice of chemical elements as the basis components makes all entries in the composition matrix positive, thus preserving convexity, 2) a log variable transformation is used which transfers the nonlinearity to the convex composition matrix, 3) a priori bounds on variables are calculated from the structure of the problem, constraining Newton's path, and 4) an initial guess is calculated implicitly by sequentially adding model complexity. As a benchmark we compare the model to experimental and semi-analytic solutions of the coupled salinity-acidity transport system. Together with the reservoir simulation capabilities of MRST the solver offers a promising tool for geochemical simulations in reservoir domains for applications in a diversity of fields from enhanced oil recovery to radionuclide storage.
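
    The log-variable transformation mentioned in point 2 is easy to demonstrate on a toy speciation problem: solving the charge balance of a monoprotic acid for x = ln[H+] keeps the concentration positive by construction. The constants are illustrative and the derivative is taken numerically for brevity; the actual solver uses MRST's automatic differentiation.

        import math

        Ka, Kw, CT = 1.8e-5, 1.0e-14, 1.0e-3  # acid constant, water product, total acid (assumed)

        def residual(x):
            h = math.exp(x)            # [H+] stays positive for any real x
            a = Ka * CT / (Ka + h)     # [A-] from mass action plus mole balance
            return h - Kw / h - a      # charge balance: [H+] - [OH-] - [A-] = 0

        x = math.log(1.0e-7)           # initial guess: pH 7
        for _ in range(60):
            f = residual(x)
            dfdx = (residual(x + 1e-6) - f) / 1e-6  # numerical derivative
            step = f / dfdx
            x -= max(-1.0, min(1.0, step))  # bounded step, cf. point 3's constrained Newton path

        print(f"equilibrium pH = {-math.log10(math.exp(x)):.2f}")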

  4. FREEWAT: an HORIZON 2020 project to build open source tools for water management.

    NASA Astrophysics Data System (ADS)

    Foglia, L.; Rossetto, R.; Borsi, I.; Mehl, S.; Velasco Mansilla, V.

    2015-12-01

    FREEWAT is a HORIZON 2020 EU project. FREEWAT's main result will be an open source and public domain, GIS-integrated modelling environment for the simulation of water quantity and quality in surface water and groundwater, with an integrated water management and planning module. FREEWAT aims at promoting water resource management by simplifying the application of the Water Framework Directive and related Directives. Specific objectives of the project are: to coordinate previous EU- and nationally-funded research in order to integrate existing software modules for water management into the single GIS-based FREEWAT environment, and to support the FREEWAT application in an innovative participatory approach gathering technical staff and relevant stakeholders (policy and decision makers) in designing scenarios for the application of water policies. The open source characteristics of the platform allow this to be considered an initiative "ad includendum", as further institutions or developers may contribute to the development. Core of the platform is the SID&GRID framework (a GIS-integrated, physically-based, distributed numerical hydrological model based on a modified version of MODFLOW 2005; Rossetto et al. 2013) in its version ported to QGIS desktop. Activities are carried out on two lines: (i) integration of modules to fulfill the end-users' requirements, including tools for producing feasibility and management plans; (ii) a set of activities to fix bugs and to provide a well-integrated interface for the different tools implemented. Further capabilities to be integrated are: - module for water management and planning; - calibration, uncertainty and sensitivity analysis; - module for solute transport in the unsaturated zone; - module for crop growth and water requirements in agriculture; - tools for groundwater quality issues and for the analysis, interpretation and visualization of hydrogeological data. Through creating a common environment among water researchers/professionals, policy makers and implementers, FREEWAT's main impact will be on enhancing a science-based, participatory and evidence-based approach to decision making in water resource management, hence producing relevant and appropriate outcomes for policy implementation. Broad stakeholder involvement is thought to guarantee results dissemination and exploitation.

  5. MASTODON: A geosciences simulation tool built using the open-source framework MOOSE

    NASA Astrophysics Data System (ADS)

    Slaughter, A.

    2017-12-01

    The Department of Energy (DOE) is currently investing millions of dollars annually into various modeling and simulation tools for all aspects of nuclear energy. An important part of this effort includes developing applications based on the open-source Multiphysics Object Oriented Simulation Environment (MOOSE; mooseframework.org) from Idaho National Laboratory (INL). Thanks to the efforts of the DOE and outside collaborators, MOOSE currently contains a large set of physics modules, including phase field, level set, heat conduction, tensor mechanics, Navier-Stokes, fracture (extended finite-element method), and porous media, among others. The tensor mechanics and contact modules, in particular, are well suited for nonlinear geosciences problems. Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON; https://seismic-research.inl.gov/SitePages/Mastodon.aspx), a MOOSE-based application, is capable of analyzing the response of 3D soil-structure systems to external hazards, with current development focused on earthquakes. It is capable of simulating seismic events and can perform extensive "source-to-site" simulations including earthquake fault rupture, nonlinear wave propagation, and nonlinear soil-structure interaction analysis. MASTODON also includes a dynamic probabilistic risk assessment capability that enables analysts to not only perform deterministic analyses, but also easily perform probabilistic or stochastic simulations for the purpose of risk assessment. Although MASTODON has been developed for the nuclear industry, it can be used to assess the risk for any structure subjected to earthquakes. The geosciences community can learn from the nuclear industry and harness the enormous effort underway to build simulation tools that are open, modular, and share a common framework. In particular, MOOSE-based multiphysics solvers are inherently parallel, dimension agnostic, adaptive in time and space, fully coupled, and capable of interacting with other applications. The geosciences community could benefit from existing tools by enabling collaboration between researchers and practitioners throughout the world and advance the state-of-the-art in line with other scientific research efforts.

  6. Managing Digital Archives Using Open Source Software Tools

    NASA Astrophysics Data System (ADS)

    Barve, S.; Dongare, S.

    2007-10-01

    This paper describes the use of open source software tools such as MySQL and PHP for creating database-backed websites. Such websites offer many advantages over ones built from static HTML pages. This paper discusses how OSS tools are used and their benefits, and how, after the successful implementation of these tools, the library took the initiative to implement an institutional repository using the DSpace open source software.

  7. Herschel and the Molecular Universe

    NASA Technical Reports Server (NTRS)

    Tielens, A. G. G. M.; Helmich, F. P.

    2006-01-01

    Over the next decade, space-based missions will open up the universe to high spatial and spectral resolution studies at infrared and submillimeter wavelengths. This will allow us to study, in much greater detail, the composition and the origin and evolution of molecules in space. Moreover, molecular transitions in these spectral ranges provide a sensitive probe of the dynamics and the physical and chemical conditions in a wide range of objects at scales ranging from budding planetary systems to galactic and extragalactic sizes. Hence, these missions provide us with the tools to study key astrophysical and astrochemical processes involved in the formation and evolution of planets, stars, and galaxies. These new missions can be expected to lead to the detection of many thousands of new spectral features. Identification, analysis and interpretation of these features in terms of the physical and chemical characteristics of the astronomical sources will require detailed astronomical modeling tools supported by laboratory measurements and theoretical studies of chemical reactions and collisional excitation rates on species of astrophysical relevance. These data will have to be made easily accessible to the scientific community through web-based data archives. In this paper, we will review the Herschel mission and its expected impact on our understanding of the molecular universe.

  8. Dual excitation acoustic paramagnetic logging tool

    DOEpatents

    Vail, III, William B.

    1989-01-01

    New methods and apparatus are disclosed which allow measurement of the presence of oil and water in geological formations using a new physical effect called the Acoustic Paramagnetic Logging Effect (APLE). The presence of petroleum in formation causes a slight increase in the earth's magnetic field in the vicinity of the reservoir. This is the phenomenon of paramagnetism. Application of an acoustic source to a geological formation at the Larmor frequency of the nucleons present causes the paramagnetism of the formation to disappear. This results in a decrease in the earth's magnetic field in the vicinity of the oil bearing formation. Repetitively frequency sweeping the acoustic source through the Larmor frequency of the nucleons present (approx. 2 kHz) causes an amplitude modulation of the earth's magnetic field which is a consequence of the APLE. The amplitude modulation of the earth's magnetic field is measured with an induction coil gradiometer and provides a direct measure of the amount of oil and water in the excitation zone of the formation. The phase of the signal is used to infer the longitudinal relaxation times of the fluids present, which results in the ability in general to separate oil and water and to measure the viscosity of the oil present. Such measurements may be performed in open boreholes and in cased well bores. The Dual Excitation Acoustic Paramagnetic Logging Tool employing two acoustic sources is also described.
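
    The "approx. 2 kHz" figure is easy to verify: it is the proton Larmor frequency in the Earth's magnetic field, f = gamma B / (2 pi). The field value below (50 microtesla) is a typical mid-latitude magnitude.

        gamma_over_2pi = 42.577e6  # proton gyromagnetic ratio / 2*pi, Hz per tesla
        B_earth = 50e-6            # Earth's field, tesla (typical; varies ~25-65 uT)

        f_larmor = gamma_over_2pi * B_earth
        print(f"proton Larmor frequency: {f_larmor:.0f} Hz")  # about 2.1 kHz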

  9. Dual excitation acoustic paramagnetic logging tool

    DOEpatents

    Vail, W.B. III.

    1989-02-14

    New methods and apparatus are disclosed which allow measurement of the presence of oil and water in geological formations using a new physical effect called the Acoustic Paramagnetic Logging Effect (APLE). The presence of petroleum in formation causes a slight increase in the earth's magnetic field in the vicinity of the reservoir. This is the phenomenon of paramagnetism. Application of an acoustic source to a geological formation at the Larmor frequency of the nucleons present causes the paramagnetism of the formation to disappear. This results in a decrease in the earth's magnetic field in the vicinity of the oil bearing formation. Repetitively frequency sweeping the acoustic source through the Larmor frequency of the nucleons present (approx. 2 kHz) causes an amplitude modulation of the earth's magnetic field which is a consequence of the APLE. The amplitude modulation of the earth's magnetic field is measured with an induction coil gradiometer and provides a direct measure of the amount of oil and water in the excitation zone of the formation. The phase of the signal is used to infer the longitudinal relaxation times of the fluids present, which results in the ability in general to separate oil and water and to measure the viscosity of the oil present. Such measurements may be performed in open boreholes and in cased well bores. The Dual Excitation Acoustic Paramagnetic Logging Tool employing two acoustic sources is also described. 6 figs.

  10. Use of Spatial Sampling and Microbial Source-Tracking Tools for Understanding Fecal Contamination at Two Lake Erie Beaches

    USGS Publications Warehouse

    Francy, Donna S.; Bertke, Erin E.; Finnegan, Dennis P.; Kephart, Christopher M.; Sheets, Rodney A.; Rhoades, John; Stumpe, Lester

    2006-01-01

    Source-tracking tools were used to identify potential sources of fecal contamination at two Lake Erie bathing beaches: an urban beach (Edgewater in Cleveland, Ohio) and a beach in a small city (Lakeshore in Ashtabula, Ohio). These tools included identifying spatial patterns of Escherichia coli (E. coli) concentrations in each area, determining weather patterns that caused elevated E. coli, and applying microbial source tracking (MST) techniques to specific sites. Three MST methods were used during this study: multiple antibiotic resistance (MAR) indexing of E. coli isolates and the presence of human-specific genetic markers within two types of bacteria, the genus Bacteroides and the species Enterococcus faecium. At Edgewater, sampling for E. coli was done during 2003-05 at bathing-area sites, at nearshore lake sites, and in shallow ground water in foreshore and backshore areas. Spatial sampling at nearshore lake sites showed that fecal contamination was most likely of local origin; E. coli concentrations near the mouths of rivers and outfalls remote to the beach were elevated (greater than 235 colony-forming units per 100 milliliters (CFU/100 mL)) but decreased along transport pathways to the beach. In addition, E. coli concentrations were generally highest in bathing-area samples collected at 1- and 2-foot water depths, midrange at 3-foot depths, and lowest in nearshore lake samples typically collected 150 feet from the shoreline. Elevated E. coli concentrations at bathing-area sites were generally associated with increased wave heights and rainfall, but not always. E. coli concentrations were often elevated in shallow ground-water samples, especially in samples collected less than 10 feet from the edge of water (near foreshore area). The interaction of shallow ground water and waves may be a mechanism of E. coli storage and accumulation in foreshore sands. Infiltration of bird feces through sand with surface water from rainfall and high waves may be concentrating E. coli in shallow ground water in foreshore and backshore sands. At Lakeshore, sampling for E. coli was done at bathing-area, nearshore lake, and parking-lot sites during 2004-05. Low concentrations of E. coli at nearshore lake sites furthest from the shoreline indicated that fecal contamination was most likely of local origin. High concentrations of E. coli in water and bed sediments at several nearshore lake sites showed that contamination was emanating from several points along the shoreline during wet and dry weather, including the boat ramp, an area near the pond drainage, and parking-lot sediments. Physical evidence confirmed that runoff from the parking lot leads to degradation of water quality at the beach. MST samples were collected to help interpret spatial findings and determine whether sources of fecal contamination were from wastewater or bird feces and if a human-specific marker was present. MAR indices were useful in distinguishing between bird feces and wastewater sources because they were about 10 times higher in the latter. The results from MAR indices agreed with results from the two human-specific markers in some but not all of the samples tested. Bacteroides and enterococci human-specific markers were found on one day at Edgewater and two days at Lakeshore. On three days at Edgewater and two days at Lakeshore, the MAR index indicated a mixed source, but neither marker was found in bathing-water samples; this may be because bacterial indicator concentrations were too low to detect a marker. 
    Multiple tools are needed to help identify sources of fecal contamination at coastal beaches. Spatial sampling identified patterns in E. coli concentrations and yielded information on the physical pathways of contamination. MST methods provided information only on whether the source was likely of human or nonhuman origin; MST did not provide information on the pathways of contamination.
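
    The MAR index referred to above has a simple definition for a single isolate: a/b, the number of antibiotics the isolate resists over the number tested (Krumperman's formulation), which is why wastewater isolates score roughly an order of magnitude higher than bird-feces isolates. The panel and resistance profiles below are invented.

        panel = ["ampicillin", "tetracycline", "streptomycin", "cephalothin",
                 "chlortetracycline", "erythromycin", "oxytetracycline", "neomycin"]

        # Resistance profiles: 1 = resistant, 0 = susceptible (hypothetical isolates).
        isolates = {
            "gull-feces isolate": [0, 0, 0, 0, 0, 1, 0, 0],
            "wastewater isolate": [1, 1, 1, 1, 0, 1, 1, 1],
        }

        for name, profile in isolates.items():
            mar = sum(profile) / len(panel)
            print(f"{name}: MAR index = {mar:.2f}")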

  11. Undergraduate Research in Physics as an Educational Tool

    NASA Astrophysics Data System (ADS)

    Hakim, Toufic M.; Garg, Shila

    2001-03-01

    The National Science Foundation's 1996 report "Shaping the Future: New Expectations for Undergraduate Education in Science, Mathematics, Engineering and Technology" urged that in order to improve SME&T education, decisive action must be taken so that "all students have access to excellent undergraduate education in science .... and all students learn these subjects by direct experience with the methods and processes of inquiry." Research-related educational activities that integrate education and research have been shown to be valuable in improving the quality of education and enhancing the number of majors in physics departments. Student researchers develop a motivation to continue in science and engineering through an appreciation of how science is done and the excitement of doing frontier research. We will address some of the challenges of integrating research into the physics undergraduate curriculum effectively. The departmental and institutional policies and infrastructure required to help prepare students for this endeavor will be discussed as well as sources of support and the establishment of appropriate evaluation procedures.

  12. Atomic and molecular far-infrared lines from high redshift galaxies

    NASA Astrophysics Data System (ADS)

    Vallini, L.

    2015-03-01

    The advent of the Atacama Large Millimeter/submillimeter Array (ALMA), with its unprecedented sensitivity, makes possible the detection of far-infrared (FIR) metal cooling and molecular lines from the first galaxies that formed after the Big Bang. These lines represent a powerful tool to shed light on the physical properties of the interstellar medium (ISM) in high-redshift sources. In what follows we show the potential of a physically motivated theoretical approach that we developed to predict the ISM properties of high redshift galaxies. The model allows one to infer, as a function of the metallicity, the luminosities of various FIR lines observable with ALMA. It is based on high resolution cosmological simulations of star-forming galaxies at the end of the Epoch of Reionization (z ≃ 6), further implemented with sub-grid physics describing the cooling and the heating processes that take place in the neutral diffuse ISM. Finally we show how a different approach based on semi-analytical calculations can be used to predict the CO flux function at z > 6.

  13. Open Source for Knowledge and Learning Management: Strategies beyond Tools

    ERIC Educational Resources Information Center

    Lytras, Miltiadis, Ed.; Naeve, Ambjorn, Ed.

    2007-01-01

    In the last years, knowledge and learning management have made a significant impact on the IT research community. "Open Source for Knowledge and Learning Management: Strategies Beyond Tools" presents learning and knowledge management from a point of view where the basic tools and applications are provided by open source technologies.…

  14. Enhancing participatory approach in water resources management: development of a survey to evaluate stakeholders needs and priorities related to software capabilities

    NASA Astrophysics Data System (ADS)

    Foglia, L.; Rossetto, R.; Borsi, I.; Josef, S.; Boukalova, Z.; Triana, F.; Ghetta, M.; Sabbatini, T.; Bonari, E.; Cannata, M.; De Filippis, G.

    2016-12-01

    The EU H2020 FREEWAT project (FREE and open source software tools for WATer resource management) aims at simplifying the application of EU water-related Directives, by developing an open source and public domain, GIS-integrated platform for planning and management of ground- and surface-water resources. The FREEWAT platform is conceived as a canvas, where several distributed and physically-based simulation codes are virtually integrated. The choice of such codes was supported by the result of a survey performed by means of questionnaires distributed to 14 case study FREEWAT project partners and several stakeholders. This was performed in the first phase of the project within WP 6 (Enhanced science and participatory approach evidence-based decision making), Task 6.1 (Definition of a "needs/tools" evaluation grid). About 30% of all the invited entities and institutions from several EU and non-EU countries expressed their interest in contributing to the survey. Most of them were research institutions, government and geoenvironmental companies, and river basin authorities. The result of the questionnaire provided a spectrum of needs and priorities of partners/stakeholders, which were addressed during the development phase of the FREEWAT platform. The main needs identified were related to ground- and surface-water quality, sustainable water management, interaction between groundwater/surface-water bodies, and design and management of Managed Aquifer Recharge schemes. Needs and priorities were then connected to the specific EU Directives and Regulations to be addressed. One of the main goals of the questionnaires was to collect information and suggestions regarding the use of existing commercial/open-source software tools to address needs and priorities, and regarding the needs to address specific water-related processes/problems.

  15. Development of a flattening filter free multiple source model for use as an independent, Monte Carlo, dose calculation, quality assurance tool for clinical trials.

    PubMed

    Faught, Austin M; Davidson, Scott E; Popple, Richard; Kry, Stephen F; Etzel, Carol; Ibbott, Geoffrey S; Followill, David S

    2017-09-01

    The Imaging and Radiation Oncology Core-Houston (IROC-H) Quality Assurance Center (formerly the Radiological Physics Center) has reported varying levels of compliance from their anthropomorphic phantom auditing program. IROC-H studies have suggested that one source of disagreement between institution submitted calculated doses and measurement is the accuracy of the institution's treatment planning system dose calculations and heterogeneity corrections used. In order to audit this step of the radiation therapy treatment process, an independent dose calculation tool is needed. Monte Carlo multiple source models for Varian flattening filter free (FFF) 6 MV and FFF 10 MV therapeutic x-ray beams were commissioned based on central axis depth dose data from a 10 × 10 cm 2 field size and dose profiles for a 40 × 40 cm 2 field size. The models were validated against open-field measurements in a water tank for field sizes ranging from 3 × 3 cm 2 to 40 × 40 cm 2 . The models were then benchmarked against IROC-H's anthropomorphic head and neck phantom and lung phantom measurements. Validation results, assessed with a ±2%/2 mm gamma criterion, showed average agreement of 99.9% and 99.0% for central axis depth dose data for FFF 6 MV and FFF 10 MV models, respectively. Dose profile agreement using the same evaluation technique averaged 97.8% and 97.9% for the respective models. Phantom benchmarking comparisons were evaluated with a ±3%/2 mm gamma criterion, and agreement averaged 90.1% and 90.8% for the respective models. Multiple source models for Varian FFF 6 MV and FFF 10 MV beams have been developed, validated, and benchmarked for inclusion in an independent dose calculation quality assurance tool for use in clinical trial audits. © 2017 American Association of Physicists in Medicine.
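
    The gamma criterion used throughout these comparisons combines a dose-difference tolerance with a distance-to-agreement (DTA) search. The simplified 1-D sketch below, on synthetic profiles, shows the standard computation; it is illustrative only and not the IROC-H analysis code.

        import numpy as np

        dd, dta = 0.03, 2.0                              # 3% dose difference, 2 mm DTA
        x = np.arange(0.0, 100.0, 1.0)                   # positions, mm
        ref = np.exp(-((x - 50.0) / 20.0) ** 2)          # synthetic reference profile
        calc = 1.02 * np.exp(-((x - 50.5) / 20.0) ** 2)  # scaled and shifted copy

        def gamma_point(xi, di):
            # Minimum generalized distance from (xi, di) to the reference curve.
            g2 = ((x - xi) / dta) ** 2 + ((ref - di) / (dd * ref.max())) ** 2
            return np.sqrt(g2.min())

        gammas = np.array([gamma_point(xi, di) for xi, di in zip(x, calc)])
        print(f"gamma passing rate: {100 * np.mean(gammas <= 1.0):.1f} %")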

  16. Development of a Monte Carlo multiple source model for inclusion in a dose calculation auditing tool.

    PubMed

    Faught, Austin M; Davidson, Scott E; Fontenot, Jonas; Kry, Stephen F; Etzel, Carol; Ibbott, Geoffrey S; Followill, David S

    2017-09-01

    The Imaging and Radiation Oncology Core Houston (IROC-H) (formerly the Radiological Physics Center) has reported varying levels of agreement in their anthropomorphic phantom audits. There is reason to believe one source of error in this observed disagreement is the accuracy of the dose calculation algorithms and heterogeneity corrections used. To audit this component of the radiotherapy treatment process, an independent dose calculation tool is needed. Monte Carlo multiple source models for Elekta 6 MV and 10 MV therapeutic x-ray beams were commissioned based on measurement of central axis depth dose data for a 10 × 10 cm 2 field size and dose profiles for a 40 × 40 cm 2 field size. The models were validated against open field measurements consisting of depth dose data and dose profiles for field sizes ranging from 3 × 3 cm 2 to 30 × 30 cm 2 . The models were then benchmarked against measurements in IROC-H's anthropomorphic head and neck and lung phantoms. Validation results showed 97.9% and 96.8% of depth dose data passed a ±2% Van Dyk criterion for 6 MV and 10 MV models respectively. Dose profile comparisons showed an average agreement using a ±2%/2 mm criterion of 98.0% and 99.0% for 6 MV and 10 MV models respectively. Phantom plan comparisons were evaluated using ±3%/2 mm gamma criterion, and averaged passing rates between Monte Carlo and measurements were 87.4% and 89.9% for 6 MV and 10 MV models respectively. Accurate multiple source models for Elekta 6 MV and 10 MV x-ray beams have been developed for inclusion in an independent dose calculation tool for use in clinical trial audits. © 2017 American Association of Physicists in Medicine.

  17. Source insights into the 11-h daytime and nighttime fine ambient particulate matter in China as well as the synthetic studies using the new Multilinear Engine 2-species ratios (ME2-SR) method.

    PubMed

    Shi, Guoliang; Chen, Gang; Liu, Guirong; Wang, Haiting; Tian, Yingze; Feng, Yinchang

    2016-10-01

    Modeled results are very important for environmental management. An unreasonable modeled result can lead to a wrong strategy for air pollution management. In this work, an improved physically constrained source apportionment (PCSA) technology known as Multilinear Engine 2-species ratios (ME2-SR) was developed and applied to 11-h daytime and nighttime fine ambient particulate matter in an urban area. Firstly, synthetic studies were carried out to explore the effectiveness of ME2-SR. The estimated source contributions were compared with the true values. The results suggest that, compared with the positive matrix factorization (PMF) model, the ME2-SR method could obtain more physically reliable outcomes, indicating that ME2-SR was effective, especially when apportioning datasets with no unknown source. Additionally, 11-h daytime and nighttime PM2.5 samples were collected from Tianjin in China. The sources of the 11-h daytime and nighttime fine ambient particulate matter were identified using the new method and the PMF model. The calculated source contributions for ME2-SR for daytime PM2.5 samples are resuspended dust (38.91 μg m(-3), 26.60%), sulfate and nitrate (38.60 μg m(-3), 26.39%), vehicle exhaust and road dust (38.26 μg m(-3), 26.16%) and coal combustion (20.14 μg m(-3), 13.77%), and those for nighttime PM2.5 samples are resuspended dust (18.78 μg m(-3), 12.91%), sulfate and nitrate (41.57 μg m(-3), 28.58%), vehicle exhaust and road dust (38.39 μg m(-3), 26.39%), and coal combustion (36.76 μg m(-3), 25.27%). The comparisons of the constrained versus unconstrained outcomes clearly suggest that the physical meaning of the ME2-SR results is interpretable and reliable, not only for the specified species values but also for source contributions. The findings indicate that the ME2-SR method can be a useful tool in source apportionment studies for air pollution management. Copyright © 2016 Elsevier Ltd. All rights reserved.
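
    Receptor models of this family share one algebraic core: ambient species concentrations x are explained as a source-profile matrix F times nonnegative source contributions g. The sketch below solves that core with nonnegative least squares on invented data; it does not reproduce ME2-SR's species-ratio constraints or PMF's uncertainty weighting.

        import numpy as np
        from scipy.optimize import nnls

        # Rows: species (OC, EC, SO4, NO3, Si); columns: dust, coal, vehicles (invented profiles).
        F = np.array([
            [0.05, 0.15, 0.30],
            [0.01, 0.05, 0.20],
            [0.02, 0.30, 0.05],
            [0.01, 0.10, 0.08],
            [0.20, 0.02, 0.01],
        ])
        x = np.array([12.0, 6.5, 14.0, 6.0, 7.5])  # ambient concentrations, ug/m3 (invented)

        g, _ = nnls(F, x)  # nonnegative source contributions
        for name, contrib in zip(["resuspended dust", "coal combustion", "vehicles"], g):
            print(f"{name}: {contrib:.1f} ug/m3")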

  18. Fishing tool retrieves MWD nuclear source from deep well

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    A new wire line tool has successfully retrieved the nuclear sources and formation data from a measurement-while-drilling (MWD) tool stuck in a deep, highly deviated well in the Gulf of Mexico. On Nov. 8, 1993, Schlumberger Wireline and Testing and Anadrill ran a logging-while-drilling inductive coupling (LINC) tool on conventional wire line to fish the gamma ray and neutron sources from a compensated density neutron (CDN) tool stuck in a well at 19,855 ft with an inclination greater than 80°. The paper briefly describes the operation and equipment.

  19. A Vision on the Status and Evolution of HEP Physics Software Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Canal, P.; Elvira, D.; Hatcher, R.

    2013-07-28

    This paper represents the vision of the members of the Fermilab Scientific Computing Division's Computational Physics Department (SCD-CPD) on the status and the evolution of various HEP software tools such as the Geant4 detector simulation toolkit, the Pythia and GENIE physics generators, and the ROOT data analysis framework. The goal of this paper is to contribute ideas to the Snowmass 2013 process toward the composition of a unified document on the current status and potential evolution of the physics software tools which are essential to HEP.

  20. Open source Modeling and optimization tools for Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peles, S.

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger-scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  1. Design of a high pulse repetition frequency carbon dioxide laser for processing high damage threshold materials

    NASA Astrophysics Data System (ADS)

    Chatwin, Christopher R.; McDonald, Donald W.; Scott, Brian F.

    1989-07-01

    The absence of an applications-led design philosophy has compromised both the development of laser source technology and its effective implementation in manufacturing technology. For example, CO2 lasers are still incapable of processing certain classes of refractory and non-ferrous metals. Whilst the scope of this paper is restricted to high power CO2 lasers, the design methodology reported herein is applicable to source technology in general, which, when exploited, will effect an expansion of applications. The CO2 laser operational envelope should not only be expanded to incorporate high damage threshold materials but should also offer a greater degree of controllability. By a combination of modelling and experimentation, the requisite beam characteristics at the workpiece were determined and then utilised to design the Laser Manufacturing System. The design of sub-system elements was achieved by a combination of experimentation and simulation which benefited from a comprehensive set of software tools. By linking these tools, the physical processes in the laser (electron processes in the plasma, the history of photons in the resonator, etc.) can be related, in a detailed model, to the heating mechanisms in the workpiece.

  2. The swiss army knife of job submission tools: grid-control

    NASA Astrophysics Data System (ADS)

    Stober, F.; Fischer, M.; Schleper, P.; Stadie, H.; Garbers, C.; Lange, J.; Kovalchuk, N.

    2017-10-01

    grid-control is a lightweight and highly portable open source submission tool that supports all common workflows in high energy physics (HEP). It has been used by a sizeable number of HEP analyses to process tasks that sometimes consist of up to 100k jobs. grid-control is built around a powerful plugin and configuration system, that allows users to easily specify all aspects of the desired workflow. Job submission to a wide range of local or remote batch systems or grid middleware is supported. Tasks can be conveniently specified through the parameter space that will be processed, which can consist of any number of variables and data sources with complex dependencies on each other. Dataset information is processed through a configurable pipeline of dataset filters, partition plugins and partition filters. The partition plugins can take the number of files, size of the work units, metadata or combinations thereof into account. All changes to the input datasets or variables are propagated through the processing pipeline and can transparently trigger adjustments to the parameter space and the job submission. While the core functionality is completely experiment independent, full integration with the CMS computing environment is provided by a small set of plugins.
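
    The parameter-space idea can be illustrated with a minimal plain-Python sketch (this is illustrative only and not grid-control's configuration syntax or API): a task is the cross product of its variables, with dependent variables resolved per combination.

      import itertools

      # Hypothetical task variables: a cross product of datasets and seeds,
      # with a beam energy that depends on the dataset.
      datasets = ["ttbar", "qcd"]
      seeds = range(3)
      energy = {"ttbar": "13TeV", "qcd": "8TeV"}

      jobs = [
          {"dataset": d, "seed": s, "energy": energy[d]}
          for d, s in itertools.product(datasets, seeds)
      ]
      for i, job in enumerate(jobs):
          print(f"job {i}: {job}")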

  3. Explicit B-spline regularization in diffeomorphic image registration

    PubMed Central

    Tustison, Nicholas J.; Avants, Brian B.

    2013-01-01

    Diffeomorphic mappings are central to image registration due largely to their topological properties and success in providing biologically plausible solutions to deformation and morphological estimation problems. Popular diffeomorphic image registration algorithms include those characterized by time-varying and constant velocity fields, and symmetrical considerations. Prior information in the form of regularization is used to enforce transform plausibility taking the form of physics-based constraints or through some approximation thereof, e.g., Gaussian smoothing of the vector fields [a la Thirion's Demons (Thirion, 1998)]. In the context of the original Demons' framework, the so-called directly manipulated free-form deformation (DMFFD) (Tustison et al., 2009) can be viewed as a smoothing alternative in which explicit regularization is achieved through fast B-spline approximation. This characterization can be used to provide B-spline “flavored” diffeomorphic image registration solutions with several advantages. Implementation is open source and available through the Insight Toolkit and our Advanced Normalization Tools (ANTs) repository. A thorough comparative evaluation with the well-known SyN algorithm (Avants et al., 2008), implemented within the same framework, and its B-spline analog is performed using open labeled brain data and open source evaluation tools. PMID:24409140

  4. Source modeling and inversion with near real-time GPS: a GITEWS perspective for Indonesia

    NASA Astrophysics Data System (ADS)

    Babeyko, A. Y.; Hoechner, A.; Sobolev, S. V.

    2010-07-01

    We present the GITEWS approach to source modeling for tsunami early warning in Indonesia. Near-field tsunami warning imposes special requirements on both warning time and the detail of source characterization. To meet these requirements, we employ geophysical and geological information to predefine as many rupture parameters as possible. We discretize the tsunamigenic Sunda plate interface into an ordered grid of patches (150×25) and employ the concept of Green's functions for forward and inverse rupture modeling. Rupture Generator, a forward modeling tool, additionally employs different scaling laws and slip shape functions to construct physically reasonable source models using only basic seismic information (magnitude and epicenter location). GITEWS runs a library of semi- and fully-synthetic scenarios that are extensively employed for system testing as well as for teaching and training warning center personnel. Near real-time GPS observations are a very valuable complement to the local tsunami warning system. Their inversion provides a quick (within a few minutes of an event) estimation of the earthquake magnitude, rupture position and, in case of sufficient station coverage, details of the slip distribution.
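
    A minimal synthetic sketch of a Green's-function slip inversion of the kind described, assuming a linear forward model d = G m solved by damped least squares (all matrices and numbers are hypothetical):

      import numpy as np

      rng = np.random.default_rng(1)

      # Hypothetical linear forward model: d = G @ m, where m holds slip on
      # n_patch subfaults and d holds displacements at n_obs GPS components.
      n_obs, n_patch = 60, 20
      G = rng.normal(size=(n_obs, n_patch))        # synthetic Green's functions
      m_true = np.zeros(n_patch)
      m_true[8:12] = [0.5, 2.0, 2.5, 1.0]          # compact slip patch (meters)
      d = G @ m_true + rng.normal(0, 0.05, n_obs)  # noisy observations

      # Damped least squares: minimize |G m - d|^2 + alpha^2 |m|^2.
      alpha = 0.5
      A = np.vstack([G, alpha * np.eye(n_patch)])
      b = np.concatenate([d, np.zeros(n_patch)])
      m_est, *_ = np.linalg.lstsq(A, b, rcond=None)
      print("peak slip (true, estimated):", m_true.max(), round(m_est.max(), 2))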

  5. Real Time Tracking of Magmatic Intrusions by means of Ground Deformation Modeling during Volcanic Crises.

    PubMed

    Cannavò, Flavio; Camacho, Antonio G; González, Pablo J; Mattia, Mario; Puglisi, Giuseppe; Fernández, José

    2015-06-09

    Volcano observatories provide near real-time information and, ultimately, forecasts about volcano activity. For this reason, multiple physical and chemical parameters are continuously monitored. Here, we present a new method to efficiently estimate the location and evolution of magmatic sources based on a stream of real-time surface deformation data, such as High-Rate GPS, and a free-geometry magmatic source model. The tool allows tracking inflation and deflation sources in time, providing estimates of where a volcano might erupt, which is important in understanding an on-going crisis. We show a successful simulated application to the pre-eruptive period of May 2008, at Mount Etna (Italy). The proposed methodology is able to track the fast dynamics of the magma migration by inverting the real-time data within seconds. This general method is suitable for integration in any volcano observatory. The method provides first order unsupervised and realistic estimates of the locations of magmatic sources and of potential eruption sites, information that is especially important for civil protection purposes.

  6. Measurements of the cesium flow from a surface-plasma H⁻ ion source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, H.V.; Allison, P.W.

    1979-01-01

    A surface ionization gauge (SIG) was constructed and used to measure the Cs⁰ flow rate through the emission slit of a surface-plasma source (SPS) of H⁻ ions with Penning geometry. The equivalent cesium density in the SPS discharge is deduced from these flow measurements. For dc operation the optimum H⁻ current occurs at an equivalent cesium density of approx. 7 × 10¹² cm⁻³ (corresponding to an average cesium consumption rate of 0.5 mg/h). For pulsed operation the optimum H⁻ current occurs at an equivalent cesium density of approx. 2 × 10¹³ cm⁻³ (1-mg/h average cesium consumption rate). Cesium trapping by the SPS discharge was observed for both dc and pulsed operation. A cesium energy of approx. 0.1 eV is deduced from the observed time of flight to the SIG. In addition to providing information on the physics of the source, the SIG is a useful diagnostic tool for source startup and operation.

  7. Real Time Tracking of Magmatic Intrusions by means of Ground Deformation Modeling during Volcanic Crises

    PubMed Central

    Cannavò, Flavio; Camacho, Antonio G.; González, Pablo J.; Mattia, Mario; Puglisi, Giuseppe; Fernández, José

    2015-01-01

    Volcano observatories provide near real-time information and, ultimately, forecasts about volcano activity. For this reason, multiple physical and chemical parameters are continuously monitored. Here, we present a new method to efficiently estimate the location and evolution of magmatic sources based on a stream of real-time surface deformation data, such as High-Rate GPS, and a free-geometry magmatic source model. The tool allows tracking inflation and deflation sources in time, providing estimates of where a volcano might erupt, which is important in understanding an on-going crisis. We show a successful simulated application to the pre-eruptive period of May 2008, at Mount Etna (Italy). The proposed methodology is able to track the fast dynamics of the magma migration by inverting the real-time data within seconds. This general method is suitable for integration in any volcano observatory. The method provides first order unsupervised and realistic estimates of the locations of magmatic sources and of potential eruption sites, information that is especially important for civil protection purposes. PMID:26055494

  8. Interactive basic mathematics web using Wordpress

    NASA Astrophysics Data System (ADS)

    Septia, Tika; Husna; Cesaria, Anna

    2017-12-01

    Wordpress is a popular open source tool that can be used for developing learning media. Basic mathematics is a difficult subject for physics students, who need interactive learning to improve their knowledge. The aims of this study were to develop interactive media using Wordpress and to assess the effectiveness of the web as a learning medium for improving students' ICT literacy. This study used the ADDIE model. The effectiveness of the interactive web is described in terms of how well it equips students with ICT literacy. The population is physics students. The findings show that the interactive web is valid with respect to content, presentation, linguistic, and graphic aspects. The results conclude that the basic mathematics interactive web is effective in equipping learners with ICT literacy across the high, medium, and low categories, with observations and questionnaires meeting very good criteria.

  9. An open source digital servo for atomic, molecular, and optical physics experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leibrandt, D. R., E-mail: david.leibrandt@nist.gov; Heidecker, J.

    2015-12-15

    We describe a general purpose digital servo optimized for feedback control of lasers in atomic, molecular, and optical physics experiments. The servo is capable of feedback bandwidths up to roughly 1 MHz (limited by the 320 ns total latency); loop filter shapes up to fifth order; multiple-input, multiple-output control; and automatic lock acquisition. The configuration of the servo is controlled via a graphical user interface, which also provides a rudimentary software oscilloscope and tools for measurement of system transfer functions. We illustrate the functionality of the digital servo by describing its use in two example scenarios: frequency control of the laser used to probe the narrow clock transition of 27Al+ in an optical atomic clock, and length control of a cavity used for resonant frequency doubling of a laser.
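
    The loop filters in such a servo generalize the familiar proportional-integral stage; a minimal sketch of that stage, assuming a toy integrator plant (gains and plant are hypothetical and are not the instrument's firmware):

      class PIFilter:
          """Minimal digital proportional-integral loop filter: a toy
          stand-in for one stage of a multi-stage servo filter."""
          def __init__(self, kp, ki, dt):
              self.kp, self.ki, self.dt = kp, ki, dt
              self.integral = 0.0

          def step(self, error):
              self.integral += self.ki * error * self.dt
              return self.kp * error + self.integral

      # Hypothetical lock on a constant setpoint; the plant here is a toy
      # integrator, not a laser. dt echoes the quoted 320 ns latency.
      servo = PIFilter(kp=0.5, ki=2.0e4, dt=320e-9)
      setpoint, process = 1.0, 0.0
      for _ in range(5000):
          process += 0.01 * servo.step(setpoint - process)
      print(f"settled value: {process:.4f}")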

  10. An open source digital servo for atomic, molecular, and optical physics experiments.

    PubMed

    Leibrandt, D R; Heidecker, J

    2015-12-01

    We describe a general purpose digital servo optimized for feedback control of lasers in atomic, molecular, and optical physics experiments. The servo is capable of feedback bandwidths up to roughly 1 MHz (limited by the 320 ns total latency); loop filter shapes up to fifth order; multiple-input, multiple-output control; and automatic lock acquisition. The configuration of the servo is controlled via a graphical user interface, which also provides a rudimentary software oscilloscope and tools for measurement of system transfer functions. We illustrate the functionality of the digital servo by describing its use in two example scenarios: frequency control of the laser used to probe the narrow clock transition of (27)Al(+) in an optical atomic clock, and length control of a cavity used for resonant frequency doubling of a laser.

  11. Using the Xbox Kinect sensor for positional data acquisition

    NASA Astrophysics Data System (ADS)

    Ballester, Jorge; Pheatt, Chuck

    2013-01-01

    The Kinect sensor was introduced in November 2010 by Microsoft for the Xbox 360 video game system. It is designed to be positioned above or below a video display to track player body and hand movements in three dimensions (3D). The sensor contains a red, green, and blue (RGB) camera, a depth sensor, an infrared (IR) light source, a three-axis accelerometer, and a multi-array microphone, as well as hardware required to transmit sensor information to an external receiver. In this article, we evaluate the capabilities of the Kinect sensor as a 3D data-acquisition platform for use in physics experiments. Data obtained for a simple pendulum, a spherical pendulum, projectile motion, and a bouncing basketball are presented. Overall, the Kinect sensor is found to be a useful data-acquisition tool for motion studies in the physics laboratory.
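
    A minimal sketch of extracting a pendulum period from Kinect-style 30 fps position samples; the track below is synthetic:

      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic Kinect-style track: x-position of a pendulum bob at 30 fps.
      fps = 30.0
      t = np.arange(0, 20, 1.0 / fps)
      x = 0.3 * np.cos(2 * np.pi * 0.55 * t) + rng.normal(0, 0.005, t.size)

      # Estimate the oscillation frequency from the peak of the spectrum.
      spectrum = np.abs(np.fft.rfft(x - x.mean()))
      freqs = np.fft.rfftfreq(x.size, d=1.0 / fps)
      f0 = freqs[spectrum.argmax()]
      print(f"period ~ {1.0 / f0:.2f} s (true value 1.82 s)")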

  12. An open source digital servo for atomic, molecular, and optical physics experiments

    NASA Astrophysics Data System (ADS)

    Leibrandt, D. R.; Heidecker, J.

    2015-12-01

    We describe a general purpose digital servo optimized for feedback control of lasers in atomic, molecular, and optical physics experiments. The servo is capable of feedback bandwidths up to roughly 1 MHz (limited by the 320 ns total latency); loop filter shapes up to fifth order; multiple-input, multiple-output control; and automatic lock acquisition. The configuration of the servo is controlled via a graphical user interface, which also provides a rudimentary software oscilloscope and tools for measurement of system transfer functions. We illustrate the functionality of the digital servo by describing its use in two example scenarios: frequency control of the laser used to probe the narrow clock transition of 27Al+ in an optical atomic clock, and length control of a cavity used for resonant frequency doubling of a laser.

  13. Call for papers for special issue of Journal of Molecular Spectroscopy focusing on "Frequency-comb spectroscopy"

    NASA Astrophysics Data System (ADS)

    Foltynowicz, Aleksandra; Picqué, Nathalie; Ye, Jun

    2018-05-01

    Frequency combs are becoming enabling tools for many applications in science and technology, beyond their original purpose of frequency metrology of simple atoms. The precisely evenly spaced narrow lines of a laser frequency comb inspire intriguing approaches to molecular spectroscopy, designed and implemented by a growing community of scientists. Frequency-comb spectroscopy advances the frontiers of molecular physics across the entire electromagnetic spectrum. Used as frequency rulers, frequency combs enable absolute frequency measurements and precise line shape studies of molecular transitions, e.g. for tests of fundamental physics and improved determination of fundamental constants. As light sources interrogating the molecular samples, they dramatically improve the resolution, precision, sensitivity and acquisition time of broad-spectral-bandwidth spectroscopy and open up new opportunities and applications at the leading edge of molecular spectroscopy and sensing.

  14. An open source digital servo for atomic, molecular, and optical physics experiments

    PubMed Central

    Leibrandt, D. R.; Heidecker, J.

    2016-01-01

    We describe a general purpose digital servo optimized for feedback control of lasers in atomic, molecular, and optical physics experiments. The servo is capable of feedback bandwidths up to roughly 1 MHz (limited by the 320 ns total latency); loop filter shapes up to fifth order; multiple-input, multiple-output control; and automatic lock acquisition. The configuration of the servo is controlled via a graphical user interface, which also provides a rudimentary software oscilloscope and tools for measurement of system transfer functions. We illustrate the functionality of the digital servo by describing its use in two example scenarios: frequency control of the laser used to probe the narrow clock transition of 27Al+ in an optical atomic clock, and length control of a cavity used for resonant frequency doubling of a laser. PMID:26724014

  15. Report on the American Association of Medical Physics Undergraduate Fellowship Programs

    PubMed Central

    Avery, Stephen; Gueye, Paul; Sandison, George A.

    2013-01-01

    The American Association of Physicists in Medicine (AAPM) sponsors two summer undergraduate research programs to attract top performing undergraduate students into graduate studies in medical physics: the Summer Undergraduate Fellowship Program (SUFP) and the Minority Undergraduate Summer Experience (MUSE). Undergraduate research experience (URE) is an effective tool to encourage students to pursue graduate degrees. The SUFP and MUSE are the only medical physics URE programs. From 2001 to 2012, 148 fellowships have been awarded and a total of $608,000 has been dispersed to fellows. This paper reports on the history, participation, and status of the programs. A review of surveys of past fellows is presented. Overall, the fellows and mentors are very satisfied with the program. The efficacy of the programs is assessed by four metrics: entry into a medical physics graduate program, board certification, publications, and AAPM involvement. Sixty‐five percent of past fellow respondents decided to pursue a graduate degree in medical physics as a result of their participation in the program. Seventy percent of respondents are currently involved in some educational or professional aspect of medical physics. Suggestions for future enhancements to better track and maintain contact with past fellows, expand funding sources, and potentially combine the programs are presented. PACS number: 01.10.Hx PMID:23318397

  16. Evaluating agricultural nonpoint-source pollution using integrated geographic information systems and hydrologic/water quality model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tim, U.S.; Jolly, R.

    1994-01-01

    Considerable progress has been made in developing physically based, distributed-parameter hydrologic/water quality (H/WQ) models for planning and control of nonpoint-source pollution. The widespread use of these models is often constrained by excessive and time-consuming input data demands and the lack of computing efficiencies necessary for iterative simulation of alternative management strategies. Recent developments in geographic information systems (GIS) provide techniques for handling large amounts of spatial data for modeling nonpoint-source pollution problems. Because a GIS can be used to combine information from several sources to form an array of model input data and to examine any combination of spatial input/output data, it represents a highly effective tool for H/WQ modeling. This paper describes the integration of a distributed-parameter model (AGNPS) with a GIS (ARC/INFO) to examine nonpoint sources of pollution in an agricultural watershed. The ARC/INFO GIS provided the tools to generate and spatially organize the disparate data to support modeling, while the AGNPS model was used to predict several water quality variables, including soil erosion and sedimentation, within a watershed. The integrated system was used to evaluate the effectiveness of several alternative management strategies in reducing sediment pollution in a 417-ha watershed located in southern Iowa. The implementation of vegetative filter strips and contour buffer (grass) strips resulted in 41% and 47% reductions in sediment yield at the watershed outlet, respectively; the combination of the two strategies resulted in a 71% reduction. In general, the study demonstrated the utility of integrating a simulation model with GIS for nonpoint-source pollution control and planning. Such techniques can help characterize diffuse sources of pollution at the landscape level. 52 refs., 6 figs., 1 tab.

  17. ICFA Beam Dynamics Newsletter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pikin, A.

    2017-11-21

    Electron beam ion source technology has made significant progress since 1968, when this method of producing highly charged ions in a potential trap within an electron beam was proposed by E. Donets. Better understanding of the physical processes in an EBIS, technological advances, and better simulation tools have driven significant progress in key EBIS parameters: electron beam current and current density, ion trap capacity, and attainable charge states. The scope of EBIS and EBIT applications has greatly increased. An attempt is made to compile some of the EBIS engineering problems and solutions and to demonstrate the present state of understanding of the processes and approaches to building a better EBIS.

  18. 2016 Research Outreach Program report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Hye Young; Kim, Yangkyu

    2016-10-13

    This paper reports on four weeks of research activity at LANL. Under the guidance of Dr. Lee, who performs nuclear physics research at LANSCE, LANL, I studied the Low Energy NZ (LENZ) setup and how to use it. First, I studied the LENZ chamber and Si detectors and worked on detector calibrations using the software ROOT (the CERN-developed data analysis tool) and Excel (Microsoft Office software). I also performed calibration experiments measuring alpha particles emitted from a Th-229 source using an S1-type Si detector, and checked the results with Dr. Lee.
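
    A minimal sketch of the linear energy calibration step in such work: fitting fitted peak channels against reference alpha energies. The channel and energy values below are placeholders, not Th-229 data:

      import numpy as np

      # Hypothetical calibration points: ADC channel of each fitted alpha
      # peak versus its reference energy in keV (placeholder values).
      channels = np.array([1210.0, 1265.0, 1330.0])
      energies = np.array([4845.0, 5050.0, 5300.0])

      # Linear energy calibration E = gain * channel + offset.
      gain, offset = np.polyfit(channels, energies, 1)
      print(f"E(ch) = {gain:.3f} * ch + {offset:.1f} keV")
      print("check:", gain * channels + offset)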

  19. CFD Analysis of Turbo Expander for Cryogenic Refrigeration and Liquefaction Cycles

    NASA Astrophysics Data System (ADS)

    Verma, Rahul; Sam, Ashish Alex; Ghosh, Parthasarathi

    Computational fluid dynamics (CFD) analysis has emerged as a necessary tool for the design of turbomachinery. It helps in understanding the various sources of inefficiency through investigation of the flow physics of the turbine. In this paper, a 3D turbulent flow analysis of a cryogenic turboexpander for small-scale air separation was performed using Ansys CFX®. The turboexpander was designed following assumptions based on the meanline blade generation procedure provided in the open literature and good engineering judgement. Through analysis of the flow field, modifications and further analyses required to evolve a more robust design procedure have been suggested.

  20. The effect of baryons in the cosmological lensing PDFs

    NASA Astrophysics Data System (ADS)

    Castro, Tiago; Quartin, Miguel; Giocoli, Carlo; Borgani, Stefano; Dolag, Klaus

    2018-07-01

    Observational cosmology is passing through a unique moment of grandeur with the amount of quality data growing fast. However, in order to better take advantage of this moment, data analysis tools have to keep up the pace. Understanding the effect of baryonic matter on the large-scale structure is one of the challenges to be faced in cosmology. In this work, we have thoroughly studied the effect of baryonic physics on different lensing statistics. Making use of the Magneticum Pathfinder suite of simulations, we show that the influence of luminous matter on the 1-point lensing statistics of point sources is significant, enhancing the probability of magnified objects with μ > 3 by a factor of 2 and the occurrence of multiple images by a factor of 5-500, depending on the source redshift and size. We also discuss the dependence of the lensing statistics on the angular resolution of sources. Our results and methodology were carefully tested to guarantee that our uncertainties are much smaller than the effects here presented.

  1. Stable Extraction of Threshold Voltage Using Transconductance Change Method for CMOS Modeling, Simulation and Characterization

    NASA Astrophysics Data System (ADS)

    Choi, Woo Young; Woo, Dong-Soo; Choi, Byung Yong; Lee, Jong Duk; Park, Byung-Gook

    2004-04-01

    We propose a stable extraction algorithm for threshold voltage using the transconductance change method, achieved by optimizing the node interval. With this algorithm, noise-free gm2 (= dgm/dVGS) profiles can be extracted within one percent error, which leads to a more physically meaningful threshold voltage calculation by the transconductance change method. The extracted threshold voltage predicts the gate-to-source voltage at which the surface potential is within kT/q of φs = 2φf + VSB. Our algorithm makes the transconductance change method more practical by overcoming the noise problem. This threshold voltage extraction algorithm yields the threshold roll-off behavior of nanoscale metal-oxide-semiconductor field-effect transistors (MOSFETs) accurately and makes it possible to calculate the surface potential φs at any other point on the drain-to-source current (IDS) versus gate-to-source voltage (VGS) curve. It will provide a useful analysis tool in the field of device modeling, simulation and characterization.
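
    A minimal sketch of the transconductance change method on synthetic I-V data, showing how the finite-difference node interval trades noise against resolution (the softplus current model and all numbers are illustrative, not the paper's algorithm):

      import numpy as np

      rng = np.random.default_rng(5)

      # Hypothetical I-V data: a softplus turn-on mimics the transition
      # from subthreshold (exponential) to on-state (linear) drain current.
      vgs = np.linspace(0.0, 1.2, 241)                      # 5 mV nodes
      vth_true, ss = 0.45, 0.08
      ids = 1e-4 * np.log(1 + np.exp((vgs - vth_true) / ss))
      ids += rng.normal(0, 2e-8, vgs.size)                  # measurement noise

      # Vth is taken at the maximum of gm2 = d(gm)/dVGS, gm = d(IDS)/dVGS.
      # A wider node interval k suppresses the noise amplified by the
      # double differentiation, which is the point of optimizing it.
      def deriv(y, x, k):                                   # central difference
          return (y[2 * k:] - y[:-2 * k]) / (x[2 * k:] - x[:-2 * k])

      k = 4
      gm = deriv(ids, vgs, k)                               # aligned to vgs[k:-k]
      gm2 = deriv(gm, vgs[k:-k], k)                         # aligned to vgs[2k:-2k]
      vth = vgs[2 * k:-2 * k][gm2.argmax()]
      print(f"extracted Vth ~ {vth:.3f} V (true {vth_true} V)")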

  2. Modeling Vortex Generators in the Wind-US Code

    NASA Technical Reports Server (NTRS)

    Dudek, Julianne C.

    2010-01-01

    A source term model which simulates the effects of vortex generators was implemented into the Wind-US Navier Stokes code. The source term added to the Navier-Stokes equations simulates the lift force which would result from a vane-type vortex generator in the flowfield. The implementation is user-friendly, requiring the user to specify only three quantities for each desired vortex generator: the range of grid points over which the force is to be applied and the planform area and angle of incidence of the physical vane. The model behavior was evaluated for subsonic flow in a rectangular duct with a single vane vortex generator, supersonic flow in a rectangular duct with a counterrotating vortex generator pair, and subsonic flow in an S-duct with 22 co-rotating vortex generators. The validation results indicate that the source term vortex generator model provides a useful tool for screening vortex generator configurations and gives comparable results to solutions computed using a gridded vane.
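
    For a sense of scale, the lift such a vane imposes can be estimated from thin-airfoil theory; this is a generic aerodynamic estimate under stated assumptions, not the Wind-US source-term formulation, and all numbers are hypothetical:

      import numpy as np

      def vane_lift_force(rho, u, planform_area, incidence_deg):
          """Thin-airfoil estimate of the lift from one vane-type vortex
          generator: L = 0.5 * rho * u^2 * S * cl, with cl ~ 2*pi*alpha.
          A generic estimate, not the Wind-US model."""
          alpha = np.radians(incidence_deg)
          cl = 2.0 * np.pi * alpha
          return 0.5 * rho * u**2 * planform_area * cl

      # Hypothetical vane: 1 cm^2 planform at 16 deg in a 100 m/s airstream.
      force = vane_lift_force(rho=1.2, u=100.0,
                              planform_area=1e-4, incidence_deg=16.0)
      print(f"lift ~ {force:.2f} N, distributed over the user-specified cells")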

  3. The effect of baryons in the cosmological lensing PDFs

    NASA Astrophysics Data System (ADS)

    Castro, Tiago; Quartin, Miguel; Giocoli, Carlo; Borgani, Stefano; Dolag, Klaus

    2018-05-01

    Observational cosmology is passing through a unique moment of grandeur with the amount of quality data growing fast. However, in order to better take advantage of this moment, data analysis tools have to keep up the pace. Understanding the effect of baryonic matter on the large-scale structure is one of the challenges to be faced in cosmology. In this work, we have thoroughly studied the effect of baryonic physics on different lensing statistics. Making use of the Magneticum Pathfinder suite of simulations we show that the influence of luminous matter on the 1-point lensing statistics of point sources is significant, enhancing the probability of magnified objects with μ > 3 by a factor of 2 and the occurrence of multiple-images by a factor 5 - 500 depending on the source redshift and size. We also discuss the dependence of the lensing statistics on the angular resolution of sources. Our results and methodology were carefully tested in order to guarantee that our uncertainties are much smaller than the effects here presented.

  4. Enhancing Leadership Quality. TQ Source Tips & Tools: Emerging Strategies to Enhance Educator Quality

    ERIC Educational Resources Information Center

    National Comprehensive Center for Teacher Quality, 2008

    2008-01-01

    Teaching Quality (TQ) Source Tips & Tools: Emerging Strategies to Enhance Educator Quality is an online resource developed by the TQ Center. It is designed to help education practitioners tap into strategies and resources they can use to enhance educator quality. This publication is based on the TQ Source Tips & Tools topic area "Enhancing…

  5. Weather forecasting with open source software

    NASA Astrophysics Data System (ADS)

    Rautenhaus, Marc; Dörnbrack, Andreas

    2013-04-01

    To forecast the weather situation during aircraft-based atmospheric field campaigns, we employ a tool chain of existing and self-developed open source software tools and open standards. Of particular value are the Python programming language with its extension libraries NumPy, SciPy, PyQt4, Matplotlib and the basemap toolkit, the NetCDF standard with the Climate and Forecast (CF) Metadata conventions, and the Open Geospatial Consortium Web Map Service standard. These open source libraries and open standards helped to implement the "Mission Support System", a Web Map Service based tool to support weather forecasting and flight planning during field campaigns. The tool has been implemented in Python and has also been released as open source (Rautenhaus et al., Geosci. Model Dev., 5, 55-71, 2012). In this presentation we discuss the usage of free and open source software for weather forecasting in the context of research flight planning, and highlight how the field campaign work benefits from using open source tools and open standards.
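
    A minimal sketch of one link in such a tool chain: reading a CF-compliant NetCDF forecast field and plotting it with Matplotlib. The file name "forecast.nc" and the variable names are placeholders, assumed to follow the CF conventions mentioned above:

      import netCDF4
      import matplotlib.pyplot as plt

      # Placeholder file and variable names; assumes 1-D lat/lon coordinates
      # matching the shape of the 2-D field.
      with netCDF4.Dataset("forecast.nc") as nc:
          temp = nc.variables["air_temperature"]
          lats = nc.variables["latitude"][:]
          lons = nc.variables["longitude"][:]
          field = temp[0, :, :]          # first time step
          units = temp.units

      plt.contourf(lons, lats, field)
      plt.colorbar(label=f"air temperature [{units}]")
      plt.xlabel("longitude"); plt.ylabel("latitude")
      plt.savefig("forecast_map.png", dpi=150)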

  6. An Overview of Promising Grades of Tool Materials Based on the Analysis of their Physical-Mechanical Characteristics

    NASA Astrophysics Data System (ADS)

    Kudryashov, E. A.; Smirnov, I. M.; Grishin, D. V.; Khizhnyak, N. A.

    2018-06-01

    The work is aimed at selecting a promising grade of a tool material, whose physical-mechanical characteristics would allow using it for processing the surfaces of discontinuous parts in the presence of shock loads. An analysis of the physical-mechanical characteristics of most common tool materials is performed and the data on a possible provision of the metal-working processes with promising composite grades are presented.

  7. Next generation of Z* modelling tool for high intensity EUV and soft x-ray plasma sources simulations

    NASA Astrophysics Data System (ADS)

    Zakharov, S. V.; Zakharov, V. S.; Choi, P.; Krukovskiy, A. Y.; Novikov, V. G.; Solomyannaya, A. D.; Berezin, A. V.; Vorontsov, A. S.; Markov, M. B.; Parot'kin, S. V.

    2011-04-01

    In the specifications for EUV sources, high EUV power at IF for lithography HVM and very high brightness for actinic mask and in-situ inspections are required. In practice, the non-equilibrium plasma dynamics and self-absorption of radiation limit the in-band radiance of the plasma and the usable radiation power of a conventional single unit EUV source. A new generation of the computational code Z* is currently developed under international collaboration in the frames of FP7 IAPP project FIRE for modelling of multi-physics phenomena in radiation plasma sources, particularly for EUVL. The radiation plasma dynamics, the spectral effects of self-absorption in LPP and DPP and resulting Conversion Efficiencies are considered. The generation of fast electrons, ions and neutrals is discussed. Conditions for the enhanced radiance of highly ionized plasma in the presence of fast electrons are evaluated. The modelling results are guiding a new generation of EUV sources being developed at Nano-UV, based on spatial/temporal multiplexing of individual high brightness units, to deliver the requisite brightness and power for both lithography HVM and actinic metrology applications.

  8. Investigation of neutron interactions with Ge detectors

    NASA Astrophysics Data System (ADS)

    Baginova, Miloslava; Vojtyla, Pavol; Povinec, Pavel P.

    2018-07-01

    Interactions of neutrons with a high-purity germanium detector were studied experimentally and by simulations using the GEANT4 tool. Elastic and inelastic scattering of fast neutrons as well as neutron capture on Ge nuclei were observed. Peaks induced by inelastic scattering of neutrons on 70Ge, 72Ge, 73Ge, 74Ge and 76Ge were well visible in the γ-ray spectra. In addition, peaks due to inelastic scattering of neutrons on copper and lead nuclei, including the well-known peak of 208Pb at 2614.51 keV, were detected. The GEANT4 simulations showed that the simulated spectrum was in a good agreement with the experimental one. Differences between the simulated and the measured spectra were due to the high γ-ray intensity of the used neutron source, physics implemented in GEANT4 and contamination of the neutron source.

  9. Practices to enable the geophysical research spectrum: from fundamentals to applications

    NASA Astrophysics Data System (ADS)

    Kang, S.; Cockett, R.; Heagy, L. J.; Oldenburg, D.

    2016-12-01

    In a geophysical survey, a source injects energy into the earth and a response is measured. These physical systems are governed by partial differential equations and their numerical solutions are obtained by discretizing the earth. Geophysical simulations and inversions are tools for understanding physical responses and constructing models of the subsurface given a finite amount of data. SimPEG (http://simpeg.xyz) is our effort to synthesize geophysical forward and inverse methodologies into a consistent framework. The primary focus of our initial development has been on the electromagnetics (EM) package, with recent extensions to magnetotelluric, direct current (DC), and induced polarization. Across these methods, and applied geophysics in general, we require tools to explore and build an understanding of the physics (behaviour of fields, fluxes), and work with data to produce models through reproducible inversions. If we consider DC or EM experiments, with the aim of understanding responses from subsurface conductors, we require resources that provide multiple "entry points" into the geophysical problem. To understand the physical responses and measured data, we must simulate the physical system and visualize electric fields, currents, and charges. Performing an inversion requires that many moving pieces be brought together: simulation, physics, linear algebra, data processing, optimization, etc. Each component must be trusted, accessible to interrogation and manipulation, and readily combined in order to enable investigation into inversion methodologies. To support such research, we not only require "entry points" into the software, but also extensibility to new situations. In our development of SimPEG, we have sought to use leading practices in software development with the aim of supporting and promoting collaborations across a spectrum of geophysical research: from fundamentals to applications. Designing software to enable this spectrum puts unique constraints on both the architecture of the codebase as well as the development practices that are employed. In this presentation, we will share some lessons learned and, in particular, how our prioritization of testing, documentation, and refactoring has impacted our own research and fostered collaborations.
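
    To make the "moving pieces" of an inversion concrete, here is a generic damped least-squares (Tikhonov) inversion in plain NumPy; it illustrates the concept of combining simulation, misfit, regularization and optimization, and is not SimPEG's interface:

      import numpy as np

      rng = np.random.default_rng(2)

      # Generic linear inverse problem: d = F m + noise.
      n_data, n_model = 40, 80
      F = np.exp(-np.abs(np.subtract.outer(np.linspace(0, 1, n_data),
                                           np.linspace(0, 1, n_model))) / 0.1)
      m_true = np.sin(2 * np.pi * np.linspace(0, 1, n_model))
      d_obs = F @ m_true + rng.normal(0, 0.05, n_data)

      # Tikhonov solution: minimize |F m - d|^2 + beta |m|^2.
      beta = 0.1
      m_rec = np.linalg.solve(F.T @ F + beta * np.eye(n_model), F.T @ d_obs)
      print("data misfit:", np.linalg.norm(F @ m_rec - d_obs))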

  10. Some applications of mathematics in theoretical physics - A review

    NASA Astrophysics Data System (ADS)

    Bora, Kalpana

    2016-06-01

    Mathematics is a very beautiful subject and an indispensable tool for physics, more so for theoretical physics (by which we mean here mainly field theory and high energy physics). These branches of physics are based on quantum mechanics and the special theory of relativity, and many mathematical concepts are used in them. In this work, we shall elucidate only some of them, such as differential geometry, infinite series, Mellin transforms, Fourier and integral transforms, special functions, calculus, complex algebra, topology, group theory, Riemannian geometry, functional analysis, linear algebra, and operator algebra. We shall also present some physics issues where these mathematical tools are used. It is not wrong to say that mathematics is such a powerful tool that, without it, there could be no physics theory! A brief review of our research work is also presented.
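
    Two of the transforms named above, in standard notation:

      \mathcal{M}[f](s) = \int_{0}^{\infty} x^{s-1} f(x)\, dx ,
      \qquad
      \mathcal{F}[f](k) = \int_{-\infty}^{\infty} e^{-ikx} f(x)\, dx .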

  11. Layman friendly spectroscopy

    NASA Astrophysics Data System (ADS)

    Sentic, Stipo; Sessions, Sharon

    Affordable consumer-grade spectroscopes (e.g. SCiO, Qualcomm Tricorder XPRIZE) are becoming more available to the general public. We introduce the concepts of spectroscopy to the public and K-12 students and motivate them to delve deeper into spectroscopy through a dramatic participatory presentation and play. We use diffraction gratings, lasers, and light sources of different spectral properties to provide a direct experience of spectroscopy techniques. Finally, we invite the audience to build their own spectroscope, utilizing the APS SpectraSnapp cell phone application, and to study the light sources surrounding them in everyday life. We address the stigma that science is hard (e.g. "Math, Science Popular Until Students Realize They're Hard," The Wall Street Journal) by presenting the material in a way that demonstrates the scientific method and aims to make failure an impersonal scientific tool, rather than a measure of one's ability, which is often a reason for shying away from science. We will present lessons we have learned in doing our outreach to audiences of different ages. This work is funded by the APS Outreach Grant "Captain, we have matter matters!" We thank the New Mexico Tech Physics Department and Physics Club for help and technical equipment.
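
    The diffraction gratings in such a demonstration obey the grating equation, relating groove spacing d, diffraction order m, diffraction angle θ_m and wavelength λ:

      d \sin\theta_m = m\lambda , \qquad m = 0, \pm 1, \pm 2, \ldots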

  12. Comet: an open-source MS/MS sequence database search tool.

    PubMed

    Eng, Jimmy K; Jahan, Tahmina A; Hoopmann, Michael R

    2013-01-01

    Proteomics research routinely involves identifying peptides and proteins via MS/MS sequence database search. Thus the database search engine is an integral tool in many proteomics research groups. Here, we introduce the Comet search engine to the existing landscape of commercial and open-source database search tools. Comet is open source, freely available, and based on one of the original sequence database search tools that has been widely used for many years. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Your Personal Analysis Toolkit - An Open Source Solution

    NASA Astrophysics Data System (ADS)

    Mitchell, T.

    2009-12-01

    Open source software is commonly known for its web browsers, word processors and programming languages. However, there is a vast array of open source software focused on geographic information management and geospatial application building in general. As geo-professionals, having easy access to tools for our jobs is crucial. Open source software provides the opportunity to add a tool to your tool belt and carry it with you for your entire career - with no license fees, a supportive community and the opportunity to test, adopt and upgrade at your own pace. OSGeo is a US registered non-profit representing more than a dozen mature geospatial data management applications and programming resources. Tools cover areas such as desktop GIS, web-based mapping frameworks, metadata cataloging, spatial database analysis, image processing and more. Learn about some of these tools as they apply to AGU members, as well as how you can join OSGeo and its members in getting the job done with powerful open source tools. If you haven't heard of OSSIM, MapServer, OpenLayers, PostGIS, GRASS GIS or the many other projects under our umbrella - then you need to hear this talk. Invest in yourself - use open source!

  14. Data format standard for sharing light source measurements

    NASA Astrophysics Data System (ADS)

    Gregory, G. Groot; Ashdown, Ian; Brandenburg, Willi; Chabaud, Dominique; Dross, Oliver; Gangadhara, Sanjay; Garcia, Kevin; Gauvin, Michael; Hansen, Dirk; Haraguchi, Kei; Hasna, Günther; Jiao, Jianzhong; Kelley, Ryan; Koshel, John; Muschaweck, Julius

    2013-09-01

    Optical design requires accurate characterization of light sources for computer aided design (CAD) software. Various methods have been used to model sources, from accurate physical models to measurement of light output. It has become common practice for designers to include measured source data for design simulations. Typically, a measured source will contain rays which sample the output distribution of the source. The ray data must then be exported to various formats suitable for import into optical analysis or design software. Source manufacturers are also making measurements of their products and supplying CAD models along with ray data sets for designers. The increasing availability of data has been beneficial to the design community but has caused a large expansion in storage needs for the source manufacturers since each software program uses a unique format to describe the source distribution. In 2012, the Illuminating Engineering Society (IES) formed a working group to understand the data requirements for ray data and recommend a standard file format. The working group included representatives from software companies supplying the analysis and design tools, source measurement companies providing metrology, source manufacturers creating the data and users from the design community. Within one year the working group proposed a file format which was recently approved by the IES for publication as TM-25. This paper will discuss the process used to define the proposed format, highlight some of the significant decisions leading to the format and list the data to be included in the first version of the standard.

  15. A Python-based interface to examine motions in time series of solar images

    NASA Astrophysics Data System (ADS)

    Campos-Rozo, J. I.; Vargas Domínguez, S.

    2017-10-01

    Python is considered a mature programming language and is widely accepted as an engaging option for scientific analysis in multiple areas, as presented in this work for the particular case of solar physics research. SunPy is an open-source library based on Python that has recently been developed to furnish software tools for solar data analysis and visualization. In this work we present a graphical user interface (GUI), based on Python and Qt, to efficiently compute proper motions for the analysis of time series of solar data. This user-friendly computing interface, which is intended to be incorporated into the SunPy library, uses a local correlation tracking technique and some extra tools that allow the selection of different parameters to calculate, visualize and analyze vector velocity fields of solar data, i.e. time series of solar filtergrams and magnetograms.
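
    A minimal sketch of the local correlation tracking idea on synthetic patches (not the interface's implementation): the displacement between two windows is read off the peak of their cross-correlation.

      import numpy as np
      from scipy.signal import correlate2d

      def local_shift(win1, win2):
          """Integer-pixel displacement of win2 relative to win1, taken
          from the peak of the cross-correlation (the core of LCT)."""
          corr = correlate2d(win2 - win2.mean(), win1 - win1.mean(),
                             mode="same")
          iy, ix = np.unravel_index(corr.argmax(), corr.shape)
          return iy - corr.shape[0] // 2, ix - corr.shape[1] // 2

      # Synthetic 17x17 patches cut from one random scene; the second
      # window samples the scene one row lower, so the pattern appears
      # shifted by one pixel between the windows.
      rng = np.random.default_rng(3)
      scene = rng.normal(size=(24, 24))
      win1 = scene[3:20, 3:20]
      win2 = scene[4:21, 3:20]
      print("recovered (dy, dx):", local_shift(win1, win2))  # expect (-1, 0)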

  16. XIMPOL: a new x-ray polarimetry observation-simulation and analysis framework

    NASA Astrophysics Data System (ADS)

    Omodei, Nicola; Baldini, Luca; Pesce-Rollins, Melissa; di Lalla, Niccolò

    2017-08-01

    We present a new simulation framework, XIMPOL, based on the Python programming language and the SciPy stack, specifically developed for X-ray polarimetric applications. XIMPOL is not tied to any specific mission or instrument design and is meant to produce fast yet realistic observation simulations, given as basic inputs: (i) an arbitrary source model including morphological, temporal, spectral and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function and the modulation factor. The format of the response files is OGIP compliant, and the framework is capable of producing output files that can be directly fed into the standard visualization and analysis tools used by the X-ray community, including XSPEC, which makes it a useful tool not only for simulating physical systems but also for developing and testing end-to-end analysis chains.
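
    A minimal sketch of how the modulation factor enters the analysis: fitting the standard X-ray polarimetry modulation curve to synthetic azimuthal counts (all numbers illustrative; this is not XIMPOL's code):

      import numpy as np
      from scipy.optimize import curve_fit

      # Standard modulation curve for a partially polarized beam.
      def modulation(phi, n0, mu_p, phi0):
          return n0 * (1.0 + mu_p * np.cos(2.0 * (phi - phi0)))

      rng = np.random.default_rng(4)
      phi = np.linspace(0, np.pi, 36, endpoint=False)
      counts = modulation(phi, 1000.0, 0.3, 0.5) + rng.normal(0, 10, phi.size)

      (n0, mu_p, phi0), _ = curve_fit(modulation, phi, counts,
                                      p0=[counts.mean(), 0.1, 0.0])
      # mu_p is the product of modulation factor and polarization degree;
      # dividing by the instrument's modulation factor gives the polarization.
      print(f"fitted mu*P = {mu_p:.3f}, angle = {phi0:.3f} rad")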

  17. Laws of reflection and Snell's law revisited by video modeling

    NASA Astrophysics Data System (ADS)

    Rodrigues, M.; Simeão Carvalho, P.

    2014-07-01

    Video modelling is nowadays used as a tool for teaching and learning several topics in physics, most of them related to kinematics. In this work we show how video modelling can be used for demonstrations and experimental teaching in optics, namely the laws of reflection and the well-known Snell's law of refraction. Videos were recorded with a photo camera at 30 frames/s and analysed with the open source software Tracker. Data collected from several frames were treated with the Data Tool module, and graphs were built to obtain relations between the incident, reflected and refracted angles, as well as to determine the refractive index of Perspex. These videos can be freely distributed on the web and explored with students within the classroom, or as a homework assignment to improve students' understanding of specific contents. They present a large didactic potential for teaching basic optics in high school with an interactive methodology.
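
    A minimal sketch of the final fitting step, assuming angle pairs read off the video frames (the values below are synthetic, generated near the Perspex value n ≈ 1.49):

      import numpy as np

      # Hypothetical angle pairs (degrees): incidence in air versus
      # refraction inside the Perspex block.
      theta_i = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])
      theta_r = np.array([6.7, 13.3, 19.7, 25.5, 30.7, 35.3])

      # Snell's law: sin(theta_i) = n * sin(theta_r), so the slope of
      # sin(theta_i) versus sin(theta_r) is the refractive index n.
      slope, intercept = np.polyfit(np.sin(np.radians(theta_r)),
                                    np.sin(np.radians(theta_i)), 1)
      print(f"n(Perspex) ~ {slope:.2f} (literature value is about 1.49)")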

  18. Neutron Source Facility Training Simulator Based on EPICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Young Soo; Wei, Thomas Y.; Vilim, Richard B.

    A plant operator training simulator was developed for training plant operators as well as for design verification of the plant control system (PCS) and plant protection system (PPS) for the Kharkov Institute of Physics and Technology Neutron Source Facility. The simulator provides the operator interface for the whole plant, including the sub-critical assembly coolant loop, target coolant loop, secondary coolant loop, and other facility systems. The operator interface is implemented on the Experimental Physics and Industrial Control System (EPICS), a comprehensive software development platform for distributed control systems. Since its development at Argonne National Laboratory, EPICS has been widely adopted in the experimental physics community, e.g. for the control of accelerator facilities; this work is the first implementation for a nuclear facility. The main parts of the operator interface are the plant control panel and the plant protection panel. The development involved implementation of the process variable database, sequence logic, and graphical user interface (GUI) for the PCS and PPS utilizing EPICS and related software tools, e.g. the sequencer for sequence logic and Control System Studio (CSS-BOY) for the graphical user interface. For functional verification of the PCS and PPS, a plant model is interfaced, which is a physics-based model of the facility coolant loops implemented as a numerical computer code. The training simulator was tested and demonstrated its effectiveness in various plant operation sequences, e.g. start-up, shut-down, maintenance, and refueling. It was also tested for verification of the plant protection system under various trip conditions.

  19. Getting a handle on virtual tools: An examination of the neuronal activity associated with virtual tool use.

    PubMed

    Rallis, Austin; Fercho, Kelene A; Bosch, Taylor J; Baugh, Lee A

    2018-01-31

    Tool use is associated with three visual streams (dorso-dorsal, ventro-dorsal, and ventral), which are involved in processing online motor planning, action semantics, and tool semantics, respectively. Little is known about the way in which the brain represents virtual tools. To directly assess this question, a virtual tool paradigm was created that provided the ability to manipulate tool components in isolation from one another. During functional magnetic resonance imaging (fMRI), adult participants performed a series of virtual tool manipulation tasks in which vision and the movement kinematics of the tool were manipulated. Reaction time and hand movement direction were monitored while the tasks were performed. Functional imaging revealed activity within all three visual streams, in a pattern similar to what would be expected with physical tool use. However, a previously unreported network of right-hemisphere activity was found, including the right inferior parietal lobule, middle and superior temporal gyri, and supramarginal gyrus, regions well known to be associated with tool processing within the left hemisphere. These results provide evidence that both virtual and physical tools are processed within the same brain regions, though virtual tools recruit bilateral tool-processing regions to a greater extent than physical tools. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Sources of self-efficacy for physical activity.

    PubMed

    Warner, Lisa M; Schüz, Benjamin; Wolff, Julia K; Parschau, Linda; Wurm, Susanne; Schwarzer, Ralf

    2014-11-01

    The effects of self-efficacy beliefs on physical activity are well documented, but much less is known about the origins of self-efficacy beliefs. This article aims to propose scales to assess the sources of self-efficacy for physical activity and to comparatively test their predictive power for physical activity, via self-efficacy, over time, in order to detect the principal sources of self-efficacy beliefs for physical activity. A study of 1,406 German adults aged 16-90 years was conducted to construct scales to assess the sources of self-efficacy for physical activity (Study 1). In Study 2, the scales' predictive validity for self-efficacy and physical activity was tested in a sample of 310 older German adults. Short, reliable and valid instruments to measure six sources of self-efficacy for physical activity were developed, enabling researchers to comparatively test the predictive value of the sources of self-efficacy. The results suggest that mastery experience, self-persuasion, and reduction of negative affective states are the most important predictors of self-efficacy for physical activity in community-dwelling older adults. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  1. The Interactions of Relationships, Interest, and Self-Efficacy in Undergraduate Physics

    NASA Astrophysics Data System (ADS)

    Dou, Remy

    This collected papers dissertation explores students' academic interactions in an active learning, introductory physics setting as they relate to the development of physics self-efficacy and interest. The motivation for this work extends from the national call to increase the participation of students in the pursuit of science, technology, engineering, and mathematics (STEM) careers. Self-efficacy and interest are factors that play prominent roles in popular, evidence-based career theories, including social cognitive career theory (SCCT) and the identity framework. Understanding how these constructs develop in light of the most pervasive characteristic of the active learning introductory physics classroom (i.e., peer-to-peer interactions) has implications for how students learn in a variety of introductory STEM classrooms and settings structured after constructivist and sociocultural learning theories. I collected data related to students' in-class interactions using the tools of social network analysis (SNA). Social network analysis has recently been shown to be an effective and useful way to examine the structure of student relationships that develop in and out of STEM classrooms. This set of studies furthers the implementation of SNA as a tool to examine self-efficacy and interest formation in the active learning physics classroom. Here I present a variety of statistical applications of SNA, including bootstrapped linear regression (Chapter 2), structural equation modeling (Chapter 3), and hierarchical linear modeling for longitudinal analyses (Chapter 4). Self-efficacy data were collected using the Sources of Self-Efficacy for Science Courses - Physics survey (SOSESC-P), and interest data were collected using the physics identity survey. Data for these studies came from the Modeling Instruction sections of Introductory Physics with Calculus offered at Florida International University in the fall of 2014 and 2015. Analyses support the idea that students' perceptions of one another impact the development of their social network centrality, which in turn affects their self-efficacy building experiences and their overall self-efficacy. It was shown that, unlike in career theories that emphasize causal relationships running from the development of self-efficacy to the subsequent growth of student interest, in this context student interest develops before student self-efficacy. This outcome has various implications for career theories.
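
    A minimal sketch of the kind of centrality computation SNA supplies as input to such regressions, on a hypothetical interaction network (node labels and edges are invented):

      import networkx as nx

      # Hypothetical reported-interaction network: edges connect students
      # who worked together in class during one week.
      edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"),
               ("D", "E"), ("E", "F"), ("C", "E")]
      G = nx.Graph(edges)

      # Centrality measures of the kind regressed against self-efficacy
      # and interest outcomes.
      for name, cent in [("degree", nx.degree_centrality(G)),
                         ("betweenness", nx.betweenness_centrality(G))]:
          top = max(cent, key=cent.get)
          print(f"most central by {name}: {top} ({cent[top]:.2f})")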

  2. Computer Forensics Education - the Open Source Approach

    NASA Astrophysics Data System (ADS)

    Huebner, Ewa; Bem, Derek; Cheung, Hon

    In this chapter we discuss the application of the open source software tools in computer forensics education at tertiary level. We argue that open source tools are more suitable than commercial tools, as they provide the opportunity for students to gain in-depth understanding and appreciation of the computer forensic process as opposed to familiarity with one software product, however complex and multi-functional. With the access to all source programs the students become more than just the consumers of the tools as future forensic investigators. They can also examine the code, understand the relationship between the binary images and relevant data structures, and in the process gain necessary background to become the future creators of new and improved forensic software tools. As a case study we present an advanced subject, Computer Forensics Workshop, which we designed for the Bachelor's degree in computer science at the University of Western Sydney. We based all laboratory work and the main take-home project in this subject on open source software tools. We found that without exception more than one suitable tool can be found to cover each topic in the curriculum adequately. We argue that this approach prepares students better for forensic field work, as they gain confidence to use a variety of tools, not just a single product they are familiar with.

  3. Open source tools for ATR development and performance evaluation

    NASA Astrophysics Data System (ADS)

    Baumann, James M.; Dilsavor, Ronald L.; Stubbles, James; Mossing, John C.

    2002-07-01

    Early in almost every engineering project, a decision must be made about tools: should I buy off-the-shelf tools, or should I develop my own? Either choice can involve significant cost and risk. Off-the-shelf tools may be readily available, but they can be expensive to purchase and license, and may not be flexible enough to satisfy all project requirements. On the other hand, developing new tools permits great flexibility, but it can be time- (and budget-) consuming, and the end product still may not work as intended. Open source software has the advantages of both approaches without many of the pitfalls. This paper examines the concept of open source software, including its history, unique culture, and informal yet closely followed conventions. These characteristics influence the quality and quantity of software available, and ultimately its suitability for serious ATR development work. We give an example in which Python, an open source scripting language, and OpenEV, a viewing and analysis tool for geospatial data, have been incorporated into ATR performance evaluation projects. While this case highlights the successful use of open source tools, we also offer important insight into the risks associated with this approach.

  4. M2Lite: An Open-source, Light-weight, Pluggable and Fast Proteome Discoverer MSF to mzIdentML Tool.

    PubMed

    Aiyetan, Paul; Zhang, Bai; Chen, Lily; Zhang, Zhen; Zhang, Hui

    2014-04-28

    Proteome Discoverer is one of many tools used for protein database search and peptide-to-spectrum assignment in mass spectrometry-based proteomics. However, the inadequacy of conversion tools makes it challenging to compare and integrate its results with those of other analytical tools. Here we present M2Lite, an open-source, light-weight, easily pluggable and fast conversion tool. M2Lite converts Proteome Discoverer-derived MSF files to the proteomics community's defined standard - the mzIdentML file format. M2Lite's source code is available as open-source at https://bitbucket.org/paiyetan/m2lite/src and its compiled binaries and documentation can be freely downloaded at https://bitbucket.org/paiyetan/m2lite/downloads.

  5. Advanced Computing Tools and Models for Accelerator Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  6. Gaining insight into the physics of dynamic atomic force microscopy in complex environments using the VEDA simulator

    NASA Astrophysics Data System (ADS)

    Kiracofe, Daniel; Melcher, John; Raman, Arvind

    2012-01-01

    Dynamic atomic force microscopy (dAFM) continues to grow in popularity among scientists in many different fields, and research on new methods and operating modes continues to expand the resolution, capabilities, and types of samples that can be studied. But many promising increases in capability are accompanied by increases in complexity. Indeed, interpreting modern dAFM data can be challenging, especially on complicated material systems, or in liquid environments where the behavior is often contrary to what is known in air or vacuum environments. Mathematical simulations have proven to be an effective tool in providing physical insight into these non-intuitive systems. In this article we describe recent developments in the VEDA (virtual environment for dynamic AFM) simulator, which is a suite of freely available, open-source simulation tools that are delivered through the cloud computing cyber-infrastructure of nanoHUB (www.nanohub.org). Here we describe three major developments. First, simulations in liquid environments are improved by enhancements in the modeling of cantilever dynamics, excitation methods, and solvation shell forces. Second, VEDA is now able to simulate many new advanced modes of operation (bimodal, phase-modulation, frequency-modulation, etc.). Finally, nineteen different tip-sample models are available to simulate the surface physics of a wide variety of material systems including capillary, specific adhesion, van der Waals, electrostatic, viscoelasticity, and hydration forces.  These features are demonstrated through example simulations and validated against experimental data, in order to provide insight into practical problems in dynamic AFM.
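
    As a concrete illustration of the kind of tip-sample interaction model listed above, the sketch below implements the widely used DMT (Derjaguin-Muller-Toporov) contact model with a van der Waals attractive branch. This is a generic textbook form written from memory, not VEDA's actual source; the parameter defaults (tip radius, Hamaker constant, intermolecular distance, effective modulus) are illustrative assumptions.

    ```python
    import numpy as np

    def dmt_force(d_m, R_m=10e-9, H_J=1e-19, a0_m=0.3e-9, E_star_Pa=1e9):
        """Tip-sample force vs. gap d in the DMT model: sphere-plane van der
        Waals attraction F = -H*R/(6*d^2) down to the intermolecular distance
        a0, then Hertzian repulsion on top of the adhesion floor for d < a0.
        All parameter defaults are illustrative, not VEDA's."""
        d = np.asarray(d_m, dtype=float)
        attractive = -H_J * R_m / (6.0 * np.maximum(d, a0_m) ** 2)
        repulsive = (4.0 / 3.0) * E_star_Pa * np.sqrt(R_m) * np.clip(a0_m - d, 0.0, None) ** 1.5
        return attractive + repulsive

    # Example: at a 1 nm gap the net force is ~ -0.17 nN (attraction)
    print(dmt_force(1e-9))
    ```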

  7. Analytic Scattering and Refraction Models for Exoplanet Transit Spectra

    NASA Astrophysics Data System (ADS)

    Robinson, Tyler D.; Fortney, Jonathan J.; Hubbard, William B.

    2017-12-01

    Observations of exoplanet transit spectra are essential to understanding the physics and chemistry of distant worlds. The effects of opacity sources and many physical processes combine to set the shape of a transit spectrum. Two such key processes—refraction and cloud and/or haze forward-scattering—have seen substantial recent study. However, models of these processes are typically complex, which prevents their incorporation into observational analyses and standard transit spectrum tools. In this work, we develop analytic expressions that allow for the efficient parameterization of forward-scattering and refraction effects in transit spectra. We derive an effective slant optical depth that includes a correction for forward-scattered light, and present an analytic form of this correction. We validate our correction against a full-physics transit spectrum model that includes scattering, and we explore the extent to which the omission of forward-scattering effects may bias models. Also, we verify a common analytic expression for the location of a refractive boundary, which we express in terms of the maximum pressure probed in a transit spectrum. This expression is designed to be easily incorporated into existing tools, and we discuss how the detection of a refractive boundary could help indicate the background atmospheric composition by constraining the bulk refractivity of the atmosphere. Finally, we show that opacity from Rayleigh scattering and collision-induced absorption will outweigh the effects of refraction for Jupiter-like atmospheres whose equilibrium temperatures are above 400-500 K.

  8. Gaining insight into the physics of dynamic atomic force microscopy in complex environments using the VEDA simulator.

    PubMed

    Kiracofe, Daniel; Melcher, John; Raman, Arvind

    2012-01-01

    Dynamic atomic force microscopy (dAFM) continues to grow in popularity among scientists in many different fields, and research on new methods and operating modes continues to expand the resolution, capabilities, and types of samples that can be studied. But many promising increases in capability are accompanied by increases in complexity. Indeed, interpreting modern dAFM data can be challenging, especially on complicated material systems, or in liquid environments where the behavior is often contrary to what is known in air or vacuum environments. Mathematical simulations have proven to be an effective tool in providing physical insight into these non-intuitive systems. In this article we describe recent developments in the VEDA (virtual environment for dynamic AFM) simulator, which is a suite of freely available, open-source simulation tools that are delivered through the cloud computing cyber-infrastructure of nanoHUB (www.nanohub.org). Here we describe three major developments. First, simulations in liquid environments are improved by enhancements in the modeling of cantilever dynamics, excitation methods, and solvation shell forces. Second, VEDA is now able to simulate many new advanced modes of operation (bimodal, phase-modulation, frequency-modulation, etc.). Finally, nineteen different tip-sample models are available to simulate the surface physics of a wide variety of material systems including capillary, specific adhesion, van der Waals, electrostatic, viscoelasticity, and hydration forces. These features are demonstrated through example simulations and validated against experimental data, in order to provide insight into practical problems in dynamic AFM.

  9. Development and validation of an open source quantification tool for DSC-MRI studies.

    PubMed

    Gordaliza, P M; Mateos-Pérez, J M; Montesinos, P; Guzmán-de-Villoria, J A; Desco, M; Vaquero, J J

    2015-03-01

    This work presents the development of an open source tool for the quantification of dynamic susceptibility-weighted contrast-enhanced (DSC) perfusion studies. The development of this tool is motivated by the lack of open source tools implemented on open platforms that allow external developers to implement their own quantification methods easily and without the need to pay for a development license. This quantification tool was developed as a plugin for the ImageJ image analysis platform using the Java programming language. A modular approach was used in the implementation of the components, in such a way that new methods can be added without breaking any of the existing functionality. For the validation process, images from seven patients with brain tumors were acquired and quantified with the presented tool and with a widely used clinical software package. The resulting perfusion parameters were then compared. Perfusion parameters and the corresponding parametric images were obtained. When no gamma-fitting is used, an excellent agreement with the gold-standard tool is obtained (R² > 0.8, and values are within 95% CI limits in Bland-Altman plots). An open source tool that performs quantification of perfusion studies using magnetic resonance imaging has been developed and validated against a clinical software package. It works as an ImageJ plugin and the source code has been published under an open source license.

  10. WebCN: A web-based computation tool for in situ-produced cosmogenic nuclides

    NASA Astrophysics Data System (ADS)

    Ma, Xiuzeng; Li, Yingkui; Bourgeois, Mike; Caffee, Marc; Elmore, David; Granger, Darryl; Muzikar, Paul; Smith, Preston

    2007-06-01

    Cosmogenic nuclide techniques are increasingly being utilized in geoscience research. For this it is critical to establish an effective, easily accessible and well-defined tool for cosmogenic nuclide computations. We have been developing a web-based tool (WebCN) to calculate surface exposure ages and erosion rates based on nuclide concentrations measured by accelerator mass spectrometry. WebCN for 10Be and 26Al has been completed and published at http://www.physics.purdue.edu/primelab/for_users/rockage.html. WebCN for 36Cl is under construction. WebCN is designed as a three-tier client/server model and uses the open source PostgreSQL for database management and PHP for the interface design and calculations. On the client side, an internet browser and Microsoft Access are used as application interfaces to access the system. Open Database Connectivity is used to link PostgreSQL and Microsoft Access. WebCN accounts for both spatial and temporal distributions of the cosmic ray flux to calculate the production rates of in situ-produced cosmogenic nuclides at the Earth's surface.
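
    The core computation behind such a tool can be illustrated with the standard zero-erosion exposure-age equation, N(t) = (P/λ)(1 - exp(-λt)), solved for t. The sketch below is a minimal stand-alone version; it ignores erosion, burial, and the spatial and temporal production-rate scaling that WebCN itself performs, and the sample values are purely illustrative.

    ```python
    import math

    def exposure_age(N, P, half_life_yr):
        """Surface exposure age (yr) from nuclide concentration N (atoms/g)
        and surface production rate P (atoms/g/yr), assuming no erosion or
        burial: N(t) = (P/lam) * (1 - exp(-lam*t)), solved for t."""
        lam = math.log(2) / half_life_yr
        return -math.log(1.0 - lam * N / P) / lam

    # Illustrative values: 10Be half-life ~1.39 Myr and an assumed local
    # production rate of 4 atoms/g/yr give an age of roughly 12.5 kyr.
    print(exposure_age(N=5.0e4, P=4.0, half_life_yr=1.39e6))
    ```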

  11. Spatial Modelling Tools to Integrate Public Health and Environmental Science, Illustrated with Infectious Cryptosporidiosis

    PubMed Central

    Lal, Aparna

    2016-01-01

    Contemporary spatial modelling tools can help examine how environmental exposures such as climate and land use together with socio-economic factors sustain infectious disease transmission in humans. Spatial methods can account for interactions across global and local scales, geographic clustering and continuity of the exposure surface, key characteristics of many environmental influences. Using cryptosporidiosis as an example, this review illustrates how, in resource-rich settings, spatial tools have been used to inform targeted intervention strategies and forecast future disease risk with scenarios of environmental change. When used in conjunction with molecular studies, they have helped determine location-specific infection sources and environmental transmission pathways. There is considerable scope for such methods to be used to identify data/infrastructure gaps and establish a baseline of disease burden in resource-limited settings. Spatial methods can help integrate public health and environmental science by identifying the linkages between the physical and socio-economic environment and health outcomes. Understanding the environmental and social context for disease spread is important for assessing the public health implications of projected environmental change. PMID:26848669

  12. Spatial Modelling Tools to Integrate Public Health and Environmental Science, Illustrated with Infectious Cryptosporidiosis.

    PubMed

    Lal, Aparna

    2016-02-02

    Contemporary spatial modelling tools can help examine how environmental exposures such as climate and land use together with socio-economic factors sustain infectious disease transmission in humans. Spatial methods can account for interactions across global and local scales, geographic clustering and continuity of the exposure surface, key characteristics of many environmental influences. Using cryptosporidiosis as an example, this review illustrates how, in resource-rich settings, spatial tools have been used to inform targeted intervention strategies and forecast future disease risk with scenarios of environmental change. When used in conjunction with molecular studies, they have helped determine location-specific infection sources and environmental transmission pathways. There is considerable scope for such methods to be used to identify data/infrastructure gaps and establish a baseline of disease burden in resource-limited settings. Spatial methods can help integrate public health and environmental science by identifying the linkages between the physical and socio-economic environment and health outcomes. Understanding the environmental and social context for disease spread is important for assessing the public health implications of projected environmental change.

  13. Scientists and Public: Is the Information Flow Direction Starting to Change?

    NASA Astrophysics Data System (ADS)

    Diaz-Doce, D.; Bee, E. J.; Bell, P. D.; Marchant, A. P.; Reay, S.; Richardson, S. L.; Shelley, W. A.

    2014-12-01

    Over half of the population of the UK own a smartphone, and about the same number of people use social media such as Twitter. For the British Geological Survey (BGS) this means millions of potential reporters of real-time events and in-the-field data capturers, creating a new source of scientific information that could help to better understand and predict natural processes. BGS first started collecting citizen data, using crowd-sourcing, through websites and smartphone apps focused on gathering geology-related information (e.g. mySoil and myVolcano). These tools ask volunteers to follow a guided form where they can upload data related to geology and geological events, including location, description, measurements, photos, videos, or even instructions on sending physical samples. This information is used to augment existing data collections. Social media provides a different channel for gathering useful scientific information from the public. BGS is starting to explore this route with the release of GeoSocial-Aurora, a web mapping tool that searches for tweets related to aurora sightings and locates them as markers on a map. Users are actively encouraged to contribute by sending tweets about aurora sightings in a specific format, which contains the #BGSaurora hashtag, the location of the sighting, and any comments or pictures. The tool harvests these tweets through the Twitter REST API and places them on the map, enabling the user to generate clusters and heatmaps. GeoSocial-Aurora provides scientists with a potential tool for gathering useful data for scientific analysis. It collects actual aurora sighting locations, enabling users to check where the aurora is taking place in real time. This may, in time, help scientists to improve future predictions of when and where auroras are visible.
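
    The exact tweet format GeoSocial-Aurora expects is not given here, so the sketch below assumes a hypothetical "#BGSaurora lat,lon comment" layout purely to illustrate the parsing step that sits between the Twitter REST API harvest and the map markers. The hashtag is real; the format, names and example are assumptions.

    ```python
    import re

    # Hypothetical sighting format (the actual BGS format is not specified
    # in this abstract): "#BGSaurora <lat>,<lon> <optional comment>"
    TWEET_RE = re.compile(
        r"#BGSaurora\s+(?P<lat>-?\d+(?:\.\d+)?)\s*,\s*(?P<lon>-?\d+(?:\.\d+)?)\s*(?P<comment>.*)",
        re.IGNORECASE,
    )

    def parse_sighting(text):
        """Return (lat, lon, comment) from a sighting tweet, or None if the
        tweet does not match the assumed format."""
        m = TWEET_RE.search(text)
        if m is None:
            return None
        return float(m.group("lat")), float(m.group("lon")), m.group("comment").strip()

    print(parse_sighting("#BGSaurora 57.48,-4.22 faint green arc over Inverness"))
    ```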

  14. NASA's Aeroacoustic Tools and Methods for Analysis of Aircraft Noise

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.

    2015-01-01

    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The ability to quantify aircraft noise at the source and ultimately at observers is required to develop low-noise aircraft designs and flight procedures. Predicting noise at the source, accounting for scattering and propagation through the atmosphere to the observer, and assessing the perception and impact on a community requires physics-based aeroacoustics tools. Along with the analyses for aero-performance, weights and fuel burn, these tools can provide the acoustic component for aircraft MDAO (Multidisciplinary Design Analysis and Optimization). Over the last decade significant progress has been made in advancing the aeroacoustic tools such that acoustic analyses can now be performed during the design process. One major and enabling advance has been the development of the system noise framework known as Aircraft NOise Prediction Program 2 (ANOPP2). ANOPP2 is NASA's aeroacoustic toolset and is designed to facilitate the combination of acoustic approaches of varying fidelity for the analysis of noise from conventional and unconventional aircraft. The toolset includes a framework that integrates noise prediction and propagation methods into a unified system for use within general aircraft analysis software. This includes acoustic analyses, signal processing, and interfaces that allow for the assessment of the perception of noise in a community. ANOPP2's capability to incorporate medium-fidelity shielding predictions and wind tunnel experiments into a design environment is presented. An assessment of noise from a conventional and a Hybrid Wing Body (HWB) aircraft, using medium-fidelity scattering methods combined with noise measurements from a model-scale HWB recently placed in NASA's 14x22 wind tunnel, is presented. The results are in the form of community noise metrics and auralizations.

  15. SpacePy - a Python-based library of tools for the space sciences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morley, Steven K; Welling, Daniel T; Koller, Josef

    Space science deals with the bodies within the solar system and the interplanetary medium; the primary focus is on atmospheres and above - at Earth, the short-timescale variation in the geomagnetic field, the Van Allen radiation belts and the deposition of energy into the upper atmosphere are key areas of investigation. SpacePy is a package for Python, targeted at the space sciences, that aims to make basic data analysis, modeling and visualization easier. It builds on the capabilities of the well-known NumPy and MatPlotLib packages. Publication-quality output direct from analyses is emphasized. The SpacePy project seeks to promote accurate and open research standards by providing an open environment for code development. In the space physics community there has long been a significant reliance on proprietary languages that restrict free transfer of data and reproducibility of results. By providing a comprehensive, open-source library of widely used analysis and visualization tools in a free, modern and intuitive language, we hope that this reliance will be diminished. SpacePy includes implementations of widely used empirical models, statistical techniques used frequently in space science (e.g. superposed epoch analysis), and interfaces to advanced tools such as electron drift shell calculations for radiation belt studies. SpacePy also provides analysis and visualization tools for components of the Space Weather Modeling Framework - currently this only includes the BATS-R-US 3-D magnetohydrodynamic model and the RAM ring current model - including streamline tracing in vector fields. Further development is currently underway. External libraries, which include well-known magnetic field models, high-precision time conversions and coordinate transformations, are wrapped for access from Python using SWIG and f2py. The rest of the tools have been implemented directly in Python. The provision of open-source tools to perform common tasks will provide openness in the analysis methods employed in scientific studies and will give access to advanced tools to all space scientists regardless of affiliation or circumstance.
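
    Superposed epoch analysis, one of the techniques mentioned above, is simple enough to sketch without the library: windows of a time series centred on a list of event epochs are stacked and averaged, so features locked to the events survive while uncorrelated noise cancels. The snippet below is a library-agnostic NumPy illustration, not SpacePy's own implementation, and the toy data are invented.

    ```python
    import numpy as np

    def superposed_epoch(series, epoch_indices, window):
        """Stack equal-length windows of `series` centred on each epoch
        index and return the pointwise mean and standard deviation."""
        segments = [series[i - window:i + window + 1]
                    for i in epoch_indices
                    if i - window >= 0 and i + window < len(series)]
        stack = np.vstack(segments)
        return stack.mean(axis=0), stack.std(axis=0)

    # Toy usage: a noisy signal with a bump starting at each epoch
    signal = np.random.randn(5000)
    epochs = np.arange(200, 4800, 400)
    for e in epochs:
        signal[e:e + 20] += 2.0
    mean, spread = superposed_epoch(signal, epochs, window=50)
    ```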

  16. QuakeSim 2.0

    NASA Technical Reports Server (NTRS)

    Donnellan, Andrea; Parker, Jay W.; Lyzenga, Gregory A.; Granat, Robert A.; Norton, Charles D.; Rundle, John B.; Pierce, Marlon E.; Fox, Geoffrey C.; McLeod, Dennis; Ludwig, Lisa Grant

    2012-01-01

    QuakeSim 2.0 improves understanding of earthquake processes by providing modeling tools and integrating model applications and various heterogeneous data sources within a Web services environment. QuakeSim is a multisource, synergistic, data-intensive environment for modeling the behavior of earthquake faults individually, and as part of complex interacting systems. Remotely sensed geodetic data products may be explored, compared with faults and landscape features, mined by pattern analysis applications, and integrated with models and pattern analysis applications in a rich Web-based and visualization environment. Integration of heterogeneous data products with pattern informatics tools enables efficient development of models. Federated database components and visualization tools allow rapid exploration of large datasets, while pattern informatics enables identification of subtle, but important, features in large data sets. QuakeSim is valuable for earthquake investigations and modeling in its current state, and also serves as a prototype and nucleus for broader systems under development. The framework provides access to physics-based simulation tools that model the earthquake cycle and related crustal deformation. Spaceborne GPS and Interferometric Synthetic Aperture Radar (InSAR) data provide information on near-term crustal deformation, while paleoseismic geologic data provide longer-term information on earthquake fault processes. These data sources are integrated into QuakeSim's QuakeTables database system, and are accessible by users or various model applications. UAVSAR repeat-pass interferometry data products are added to the QuakeTables database, and are available through a browseable map interface or Representational State Transfer (REST) interfaces. Model applications can retrieve data from QuakeTables, or from third-party GPS velocity data services; alternatively, users can manually input parameters into the models. Pattern analysis of GPS and seismicity data has proved useful for mid-term forecasting of earthquakes, and for detecting subtle changes in crustal deformation. The GPS time series analysis has also proved useful as a data-quality tool, enabling the discovery of station anomalies and data processing and distribution errors. Improved visualization tools enable more efficient data exploration and understanding. Tools provide flexibility to science users for exploring data in new ways through download links, but also facilitate standard, intuitive, and routine uses for science users and end users such as emergency responders.

  17. Physical Exam Risk Factors for Lower Extremity Injury in High School Athletes: A Systematic Review

    PubMed Central

    Onate, James A.; Everhart, Joshua S.; Clifton, Daniel R.; Best, Thomas M.; Borchers, James R.; Chaudhari, Ajit M.W.

    2016-01-01

    Objective A stated goal of the preparticipation physical evaluation (PPE) is to reduce musculoskeletal injury, yet the musculoskeletal portion of the PPE is reportedly of questionable use in assessing lower extremity injury risk in high school-aged athletes. The objectives of this study are to: (1) identify clinical assessment tools demonstrated to effectively determine lower extremity injury risk in a prospective setting, and (2) critically assess the methodological quality of prospective lower extremity risk assessment studies that use these tools. Data Sources A systematic search was performed in PubMed, CINAHL, UpToDate, Google Scholar, Cochrane Reviews, and SportDiscus. Inclusion criteria were prospective injury risk assessment studies involving athletes primarily ages 13 to 19 that used screening methods not requiring highly specialized equipment. Methodological quality was evaluated with a modified Physiotherapy Evidence Database (PEDro) scale. Main Results Nine studies were included. The mean modified PEDro score was 6.0/10 (SD, 1.5). Multidirectional balance (odds ratio [OR], 3.0; CI, 1.5–6.1; P < 0.05) and physical maturation status (P < 0.05) were predictive of overall injury risk; knee hyperextension was predictive of anterior cruciate ligament injury (OR, 5.0; CI, 1.2–18.4; P < 0.05), hip external:internal rotator strength ratio of patellofemoral pain syndrome (P = 0.02), and foot posture index of ankle sprain (r = −0.339, P = 0.008). Conclusions Minimal prospective evidence supports or refutes the use of the functional musculoskeletal exam portion of the current PPE to assess lower extremity injury risk in high school athletes. Limited evidence does support inclusion of multidirectional balance assessment and physical maturation status in a musculoskeletal exam, as both are generalizable risk factors for lower extremity injury. PMID:26978166

  18. Smart Aquarium as Physics Learning Media for Renewable Energy

    NASA Astrophysics Data System (ADS)

    Desnita, D.; Raihanati, R.; Susanti, D.

    2018-04-01

    A smart aquarium has been developed as a learning medium to visualize a Micro Hydro Power Generator (MHPG). It uses an aquarium water-circulation system and a Wind Power Generator (WPG), driven through a wheel, as the source. It is also used to teach about energy changes, circular motion and wheel connections, electromagnetic effects, and AC power circuits. The output power and system efficiency are adjusted through the water level and wind speed. The specific targets of this research were: (i) to develop green aquarium technology that is suitable for use as a medium of physics learning, and (ii) to improve the quality of the learning process and learning outcomes of senior high school students. The research method was development research following Borg and Gall, which includes preliminary studies, design, product development, expert validation, product feasibility testing, and finalisation. Expert validation states that the apparatus is feasible to use. The limited trials conducted show that this tool can improve students' science process skills.

  19. The Tacoma Narrows Bridge Collapse on Film and Video

    NASA Astrophysics Data System (ADS)

    Olson, Don; Hook, Joseph; Doescher, Russell; Wolf, Steven

    2015-11-01

    This month marks the 75th anniversary of the Tacoma Narrows Bridge collapse. During a gale on Nov. 7, 1940, the bridge exhibited remarkable oscillations before collapsing spectacularly (Figs. 1-5). Physicists over the years have spent a great deal of time and energy studying this event. By using open-source analysis tools and digitized footage of the disaster, physics students in both high school and college can continue in this tradition. Students can watch footage of "Galloping Gertie," ask scientific questions about the bridge's collapse, analyze data, and draw conclusions from that analysis. Students should be encouraged to pursue their own investigations, but the question that drove our inquiry was this: "When physics classes watch modern video showing the oscillations and the free fall of the bridge fragments, are these scenes sped up, slowed down, or at the correct speed compared to what was observed by the eyewitnesses on Nov. 7, 1940?"
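
    One way to answer that question quantitatively is to time the free fall of a bridge fragment in the footage and compare it with the duration physics predicts, t = sqrt(2h/g). The sketch below assumes an illustrative drop height of about 60 m to the water and a hypothetical frame count; both numbers are placeholders for values a class would measure or look up, not figures from the original study.

    ```python
    import math

    def playback_speed_ratio(frames_observed, frame_rate_hz, drop_height_m, g=9.81):
        """Compare the fall duration seen in the footage with the free-fall
        time physics predicts; a ratio > 1 means the video plays back
        slower than real time."""
        t_video = frames_observed / frame_rate_hz          # fall as shown on screen
        t_physical = math.sqrt(2.0 * drop_height_m / g)    # true free-fall time
        return t_video / t_physical

    # Assumed numbers for illustration: a ~60 m drop spanning 100 frames
    # of 24 fps footage gives a ratio of about 1.2 (slightly slowed down).
    print(playback_speed_ratio(frames_observed=100, frame_rate_hz=24, drop_height_m=60.0))
    ```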

  20. SED Modeling of 20 Massive Young Stellar Objects

    NASA Astrophysics Data System (ADS)

    Tanti, Kamal Kumar

    In this paper, we present spectral energy distribution (SED) modeling of twenty massive young stellar objects (MYSOs) and subsequently estimate different physical and structural/geometrical parameters for each of the twenty central YSO outflow candidates, along with their associated circumstellar disks and infalling envelopes. The SEDs for each of the MYSOs have been reconstructed by using 2MASS, MSX, IRAS, IRAC & MIPS, SCUBA, WISE, SPIRE and IRAM data, with the help of an SED fitting tool that uses a grid of 2D radiative transfer models. Using the detailed analysis of SEDs and the subsequent estimation of physical and geometrical parameters for the central YSO sources along with their circumstellar disks and envelopes, the cumulative distributions of the stellar, disk and envelope parameters can be analyzed. This leads to a better understanding of massive star formation processes in the respective star-forming regions of different molecular clouds.
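
    At its core, grid-based SED fitting is a chi-square search: each precomputed model SED is compared with the observed photometry and the best-scoring models are kept. The sketch below shows that step in its most stripped-down form; real fitters also handle distance and extinction scaling, filter convolution and upper limits, all omitted here, and the toy data are random.

    ```python
    import numpy as np

    def best_fit_model(obs_flux, obs_err, model_grid):
        """Pick the model SED from a precomputed grid that minimizes
        chi-square against the observed fluxes (all arrays share the same
        band sampling)."""
        chi2 = np.sum(((model_grid - obs_flux) / obs_err) ** 2, axis=1)
        best = int(np.argmin(chi2))
        return best, chi2[best]

    # Toy grid: 1000 candidate model SEDs sampled at 12 photometric bands;
    # the "observations" are model 42 plus noise, so 42 should win.
    grid = np.random.rand(1000, 12)
    obs = grid[42] + 0.01 * np.random.randn(12)
    err = np.full(12, 0.01)
    print(best_fit_model(obs, err, grid))
    ```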

  1. Parallel programming with Easy Java Simulations

    NASA Astrophysics Data System (ADS)

    Esquembre, F.; Christian, W.; Belloni, M.

    2018-01-01

    Nearly all of today's processors are multicore, and ideally programming and algorithm development utilizing the entire processor should be introduced early in the computational physics curriculum. Parallel programming is often not introduced because it requires a new programming environment and uses constructs that are unfamiliar to many teachers. We describe how we decrease the barrier to parallel programming by using a Java-based programming environment to treat problems in the usual undergraduate curriculum. We use the Easy Java Simulations programming and authoring tool to create the program's graphical user interface together with objects based on those developed by Kaminsky [Building Parallel Programs (Course Technology, Boston, 2010)] to handle common parallel programming tasks. Shared-memory parallel implementations of physics problems, such as time evolution of the Schrödinger equation, are available as source code and as ready-to-run programs from the AAPT-ComPADRE digital library.
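
    The curricular examples above are Java thread programs; for the flavour of the same domain-decomposition idea in another language, here is a small Python sketch that splits a spatial grid across worker processes (process-based rather than shared-memory) to evaluate the analytic density of a free Gaussian wave packet in parallel. The physics formula and the parallel pattern are standard; the specific parameters are illustrative.

    ```python
    from multiprocessing import Pool
    import numpy as np

    def chunk_density(args):
        """|psi|^2 for a spreading free Gaussian wave packet on one chunk
        of the spatial grid at time t. The analytic solution makes chunks
        independent, hence trivially parallel."""
        x, t, sigma0, hbar, m = args
        sigma_t = sigma0 * np.sqrt(1.0 + (hbar * t / (2 * m * sigma0**2)) ** 2)
        return np.exp(-x**2 / (2 * sigma_t**2)) / (np.sqrt(2 * np.pi) * sigma_t)

    if __name__ == "__main__":
        grid = np.linspace(-10, 10, 1_000_000)
        chunks = [(c, 1.0, 0.5, 1.0, 1.0) for c in np.array_split(grid, 8)]
        with Pool(processes=8) as pool:
            density = np.concatenate(pool.map(chunk_density, chunks))
        print(density.sum() * (grid[1] - grid[0]))  # ~1, normalization check
    ```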

  2. Understanding Introductory Students' Application of Integrals in Physics from Multiple Perspectives

    ERIC Educational Resources Information Center

    Hu, Dehui

    2013-01-01

    Calculus is used across many physics topics from introductory to upper-division level college courses. The concepts of differentiation and integration are important tools for solving real world problems. Using calculus or any mathematical tool in physics is much more complex than the straightforward application of the equations and algorithms that…

  3. Acoustic waveform logging--Advances in theory and application

    USGS Publications Warehouse

    Paillet, F.L.; Cheng, C.H.; Pennington , W.D.

    1992-01-01

    Full-waveform acoustic logging has made significant advances in both theory and application in recent years, and these advances have greatly increased the capability of log analysts to measure the physical properties of formations. Advances in theory provide the analytical tools required to understand the properties of measured seismic waves, and to relate those properties to such quantities as shear and compressional velocity and attenuation, and primary and fracture porosity and permeability of potential reservoir rocks. The theory demonstrates that all parts of recorded waveforms are related to various modes of propagation, even in the case of dipole and quadrupole source logging. However, the theory also indicates that these mode properties can be used to design velocity and attenuation picking schemes, and shows how source frequency spectra can be selected to optimize results in specific applications. Synthetic microseismogram computations are an effective tool in waveform interpretation theory; they demonstrate how shear arrival picks and mode attenuation can be used to compute shear velocity and intrinsic attenuation, and formation permeability for monopole, dipole and quadrupole sources. Array processing of multi-receiver data offers the opportunity to apply even more sophisticated analysis techniques. Synthetic microseismogram data is used to illustrate the application of the maximum-likelihood method, semblance cross-correlation, and Prony's method analysis techniques to determine seismic velocities and attenuations. The interpretation of acoustic waveform logs is illustrated by reviews of various practical applications, including synthetic seismogram generation, lithology determination, estimation of geomechanical properties in situ, permeability estimation, and design of hydraulic fracture operations.
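
    Of the array-processing techniques named above, semblance is the simplest to show concretely: for each trial slowness, the traces are time-shifted by the implied moveout across the receiver array, and the ratio of stacked energy to total energy (1.0 for perfect coherence) peaks at the slowness of a propagating mode. The sketch below is a generic textbook version with integer-sample shifts, not any particular logging vendor's implementation; the slowness range in the usage comment is illustrative.

    ```python
    import numpy as np

    def semblance(waveforms, offsets_m, slowness_s_per_m, dt_s):
        """Semblance coefficient for one trial slowness. `waveforms` is
        (n_receivers, n_samples); `offsets_m` are receiver offsets relative
        to the first receiver. Traces are aligned by the moveout implied by
        the slowness, then stacked energy is compared with total energy."""
        n_rx, n_t = waveforms.shape
        shifts = np.rint(slowness_s_per_m * offsets_m / dt_s).astype(int)
        usable = n_t - int(shifts.max())
        aligned = np.vstack([waveforms[i, shifts[i]:shifts[i] + usable]
                             for i in range(n_rx)])
        num = np.sum(aligned.sum(axis=0) ** 2)
        den = n_rx * np.sum(aligned ** 2)
        return num / den  # 1.0 for perfectly coherent arrivals

    # Usage: scan a slowness grid and pick the peak, e.g. 1-6 km/s:
    # s_grid = np.linspace(1/6000, 1/1000, 200)
    # values = [semblance(data, offsets, s, dt) for s in s_grid]
    ```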

  4. Remote sensing and airborne geophysics in the assessment of natural aggregate resources

    USGS Publications Warehouse

    Knepper, D.H.; Langer, W.H.; Miller, S.H.

    1994-01-01

    Natural aggregate made from crushed stone and deposits of sand and gravel is a vital element of the construction industry in the United States. Although natural aggregate is a high volume/low value commodity that is relatively abundant, new sources of aggregate are becoming increasingly difficult to find and develop because of rigid industry specifications, political considerations, development and transportation costs, and environmental concerns, especially in urban growth centers where much of the aggregate is used. As the demand for natural aggregate increases in response to urban growth and the repair and expansion of the national infrastructure, new sources of natural aggregate will be required. The USGS has recognized the necessity of developing the capability to assess the potential for natural aggregate sources on Federal lands; at present, no methodology exists for systematically describing and evaluating potential sources of natural aggregate. Because remote sensing and airborne geophysics can detect surface and near-surface phenomena, these tools may be useful for detecting and mapping potential sources of natural aggregate; however, before a methodology for applying these tools can be developed, it is necessary to understand the type, distribution, physical properties, and characteristics of natural aggregate deposits, as well as the problems that will be encountered in assessing their potential value. There are two primary sources of natural aggregate: (1) exposed or near-surface igneous, metamorphic, and sedimentary bedrock that can be crushed, and (2) deposits of sand and gravel that may be used directly or crushed and sized to meet specifications. In any particular area, the availability of bedrock suitable for crushing is a function of the geologic history of the area - the processes that formed, deformed, eroded and exposed the bedrock. Deposits of sand and gravel are primarily surficial deposits formed by the erosion, transportation by water and ice, and deposition of bedrock fragments. Consequently, most sand and gravel deposits are Tertiary or Quaternary in age and are most common in glaciated areas, alluvial basins, and along rivers and streams. The distribution of potential sources of natural aggregate in the United States is closely tied to physiography and the type of bedrock that occurs in an area. Using these criteria, the United States can be divided into 12 regions: western mountain ranges, alluvial basins, Columbia Plateau, Colorado Plateau and Wyoming basin, High Plains, nonglaciated central region, glaciated central region, Piedmont-Blue Ridge region, glaciated northeastern and Superior uplands, Atlantic and Gulf coastal plain, Hawaiian Islands, and Alaska. Each region has similar types of natural aggregate sources within its boundary, although there may be wide variations in specific physical and chemical characteristics of the aggregates within a region. Conventional exploration for natural aggregate deposits has been largely a ground-based operation (field mapping, sampling, trenching and augering, resistivity), although aerial photos and topographic maps have been extensively used to target possible deposits for sampling and testing. Today, the exploration process also considers other factors such as the availability of the land, space and water supply for processing purposes, political and environmental factors, and distance from the market; exploration and planning cannot be separated.
There are many physical properties and characteristics by which aggregate material is judged to be acceptable or unacceptable for specific applications; most of these properties and characteristics pertain only to individual aggregate particles and not to the bulk deposit. For example, properties of crushed stone aggregate particles such as thermal volume change, solubility, oxidation and hydration reactivity, and particle strength, among many others, are important consi

  5. Muon Physics at the Paul Scherrer Institut (PSI) and at TRIUMF

    NASA Astrophysics Data System (ADS)

    Walter, Hans-Kristian

    Muons can be produced abundantly at so-called pion factories. Fundamental information about today's Standard Model of particle physics is obtained by studying their decays. New experiments have been proposed at PSI and TRIUMF to measure the muon's lifetime, the Michel parameters describing its main decay μ+ → e+ + νe + ν̄μ, as well as the decay positrons' polarizations. Muon- and electron-number-violating decays like μ+ → e+ + γ and neutrinoless muon-electron conversion in nuclei, μ- N → e- N, are especially sensitive to new physics beyond the Standard Model. The muon, when bound in a muonic atom or to an electron to form muonium, can also serve as a tool to investigate properties of its binding partner and the electroweak binding forces. Muonic and pionic hydrogen isotopes and helium are mostly being studied. Finally, muons can be applied to address problems in solid state and surface physics. Here cold and ultracold muons are of special interest because of their very small phase space. Muon catalyzed fusion, in addition to offering a rich field for atomic and molecular physics, could be used in technological applications like energy production (in connection with conventional breeders) or to construct a strong source of 14 MeV neutrons.

  6. Performance profiling for brachytherapy applications

    NASA Astrophysics Data System (ADS)

    Choi, Wonqook; Cho, Kihyeon; Yeo, Insung

    2018-05-01

    In many physics applications, a significant amount of software (e.g. R, ROOT and Geant4) is developed on novel computing architectures, and much effort is expended to ensure the software is efficient in terms of central processing unit (CPU) time and memory usage. Profiling tools are used to evaluate this efficiency; however, few such tools are able to accommodate the low-energy physics regime. To address this limitation, we developed a low-energy physics profiling system in Geant4 to profile the CPU time and memory usage of software in brachytherapy applications. This paper describes and evaluates specific models that are applied to brachytherapy applications in Geant4, such as QGSP_BIC_LIV, QGSP_BIC_EMZ, and QGSP_BIC_EMY. The physics range of this tool allows it to generate low-energy profiles for brachytherapy applications. This was a limitation of previous studies, which led us to develop a new profiling tool that supports profiling in the MeV range, in contrast to the TeV range supported by existing high-energy profiling tools. To make it easy to compare profiling results between low-energy and high-energy modes, we employed the same software architecture as the SimpliCarlo tool developed at the Fermi National Accelerator Laboratory (FNAL) for the Large Hadron Collider (LHC). The results show that the newly developed profiling system for low-energy physics (less than MeV) complements the current profiling system used for high-energy physics (greater than TeV) applications.
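
    The two quantities being profiled, CPU time and memory, can be illustrated in miniature with Python's standard library. The paper's tool itself instruments Geant4/C++ applications; this is only an analogous demonstration of the two measurements, with a stand-in workload.

    ```python
    import cProfile
    import pstats
    import tracemalloc

    def simulate(n):
        """Stand-in for a physics kernel whose cost we want to measure."""
        return sum(i ** 0.5 for i in range(n))

    # CPU time: run under cProfile and print the hottest functions
    profiler = cProfile.Profile()
    profiler.enable()
    simulate(1_000_000)
    profiler.disable()
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)

    # Memory: track allocations with tracemalloc
    tracemalloc.start()
    simulate(1_000_000)
    current, peak = tracemalloc.get_traced_memory()
    print(f"current={current / 1e6:.1f} MB, peak={peak / 1e6:.1f} MB")
    tracemalloc.stop()
    ```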

  7. New tools for investigating student learning in upper-division electrostatics

    NASA Astrophysics Data System (ADS)

    Wilcox, Bethany R.

    Student learning in upper-division physics courses is a growing area of research in the field of Physics Education. Developing effective new curricular materials and pedagogical techniques to improve student learning in upper-division courses requires knowledge of both what material students struggle with and what curricular approaches help to overcome these struggles. To facilitate the course transformation process for one specific content area --- upper-division electrostatics --- this thesis presents two new methodological tools: (1) an analytical framework designed to investigate students' struggles with the advanced physics content and mathematically sophisticated tools/techniques required at the junior and senior level, and (2) a new multiple-response conceptual assessment designed to measure student learning and assess the effectiveness of different curricular approaches. We first describe the development and theoretical grounding of a new analytical framework designed to characterize how students use mathematical tools and techniques during physics problem solving. We apply this framework to investigate student difficulties with three specific mathematical tools used in upper-division electrostatics: multivariable integration in the context of Coulomb's law, the Dirac delta function in the context of expressing volume charge densities, and separation of variables as a technique to solve Laplace's equation. We find a number of common themes in students' difficulties around these mathematical tools including: recognizing when a particular mathematical tool is appropriate for a given physics problem, mapping between the specific physical context and the formal mathematical structures, and reflecting spontaneously on the solution to a physics problem to gain physical insight or ensure consistency with expected results. We then describe the development of a novel, multiple-response version of an existing conceptual assessment in upper-division electrostatics courses. The goal of this new version is to provide an easily-graded electrostatics assessment that can potentially be implemented to investigate student learning on a large scale. We show that student performance on the new multiple-response version exhibits a significant degree of consistency with performance on the free-response version, and that it continues to provide significant insight into student reasoning and student difficulties. Moreover, we demonstrate that the new assessment is both valid and reliable using data from upper-division physics students at multiple institutions. Overall, the work described in this thesis represents a significant contribution to the methodological tools available to researchers and instructors interested in improving student learning at the upper-division level.

  8. Open source tools and toolkits for bioinformatics: significance, and where are we?

    PubMed

    Stajich, Jason E; Lapp, Hilmar

    2006-09-01

    This review summarizes important work in open-source bioinformatics software that has occurred over the past couple of years. The survey is intended to illustrate how programs and toolkits whose source code has been developed or released under an Open Source license have changed informatics-heavy areas of life science research. Rather than creating a comprehensive list of all tools developed over the last 2-3 years, we use a few selected projects encompassing toolkit libraries, analysis tools, data analysis environments and interoperability standards to show how freely available and modifiable open-source software can serve as the foundation for building important applications, analysis workflows and resources.

  9. An acoustic glottal source for vocal tract physical models

    NASA Astrophysics Data System (ADS)

    Hannukainen, Antti; Kuortti, Juha; Malinen, Jarmo; Ojalammi, Antti

    2017-11-01

    A sound source is proposed for the acoustic measurement of physical models of the human vocal tract. The physical models are produced by fast prototyping, based on magnetic resonance imaging during prolonged vowel production. The sound source, accompanied by custom signal processing algorithms, is used for two kinds of measurements from physical models of the vocal tract: (i) amplitude frequency response and resonant frequency measurements, and (ii) signal reconstructions at the source output according to a target pressure waveform with measurements at the mouth position. The proposed source and the software are validated by computational acoustics experiments and measurements on a physical model of the vocal tract corresponding to the vowels [] of a male speaker.

  10. Long Distance Reactor Antineutrino Flux Monitoring

    NASA Astrophysics Data System (ADS)

    Dazeley, Steven; Bergevin, Marc; Bernstein, Adam

    2015-10-01

    The feasibility of antineutrino detection as an unambiguous and unshieldable way to detect the presence of distant nuclear reactors has been studied. While KamLAND provided a proof of concept for long-distance antineutrino detection, the feasibility of detecting single reactors at distances greater than 100 km has not yet been established. Even larger detectors than KamLAND would be required for such a project. Considerations such as light attenuation, environmental impact and cost, which favor water as a detection medium, become more important as detectors get larger. We have studied both the sensitivity of water-based detection media as a monitoring tool and the scientific impact such detectors might provide. A next-generation water-based detector may be able to contribute to important questions in neutrino physics, such as supernova neutrinos, sterile neutrino oscillations, and non-standard electroweak interactions (using a nearby compact accelerator source), while also providing a highly sensitive and inherently unshieldable reactor monitoring tool to the nonproliferation community. In this talk I will present the predicted performance of an experimental nonproliferation and high-energy physics program. Lawrence Livermore National Laboratory is operated by Lawrence Livermore National Security, LLC, for the U.S. Department of Energy, National Nuclear Security Administration under Contract DE-AC52-07NA27344. Release number LLNL-ABS-674192.

  11. Numerical framework for the modeling of electrokinetic flows

    NASA Astrophysics Data System (ADS)

    Deshpande, Manish; Ghaddar, Chahid; Gilbert, John R.; St. John, Pamela M.; Woudenberg, Timothy M.; Connell, Charles R.; Molho, Joshua; Herr, Amy; Mungal, Godfrey; Kenny, Thomas W.

    1998-09-01

    This paper presents a numerical framework for design-based analyses of electrokinetic flow in interconnects. Electrokinetic effects, which can be broadly divided into electrophoresis and electroosmosis, are important in providing a transport mechanism in microfluidic devices for both pumping and separation. Models for the electrokinetic effects can be derived and coupled to the fluid-dynamic equations through appropriate source terms. In the design of practical microdevices, however, accurate coupling of the electrokinetic effects requires knowledge of several material and physical parameters, such as the diffusivity and the mobility of the solute in the solvent. Additionally, wall-based effects such as chemical binding sites might exist that affect the flow patterns. In this paper, we address some of these issues by describing a synergistic numerical/experimental process to extract the required parameters. Experiments were conducted to provide the numerical simulations with a mechanism to extract these parameters based on quantitative comparisons with each other. These parameters were then applied in predicting further experiments to validate the process. As part of this research, we have created NetFlow, a tool for microfluidic analyses. The tool can be validated and applied in existing technologies by first creating test structures to extract representations of the physical phenomena in the device, and then applying them in the design analyses to predict correct behavior.
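
    A standard example of the kind of coupling term involved is the Helmholtz-Smoluchowski slip velocity, u = -ε ζ E / μ, often imposed as a wall boundary condition to fold electroosmosis into a flow solver. The sketch below evaluates it for typical aqueous values; it is a generic textbook relation, not the paper's NetFlow implementation, and the numbers are illustrative.

    ```python
    def electroosmotic_velocity(zeta_V, E_V_per_m, eps_rel=78.5,
                                viscosity_Pa_s=1.0e-3):
        """Helmholtz-Smoluchowski slip velocity u = -eps * zeta * E / mu,
        with eps the absolute permittivity of the solvent and zeta the
        wall zeta potential."""
        eps0 = 8.854e-12  # vacuum permittivity, F/m
        return -eps_rel * eps0 * zeta_V * E_V_per_m / viscosity_Pa_s

    # E.g. zeta = -50 mV and E = 30 kV/m in an aqueous buffer give
    # a slip velocity of about 1 mm/s.
    print(electroosmotic_velocity(-0.05, 3.0e4))
    ```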

  12. The Azimuth Project: an Open-Access Educational Resource

    NASA Astrophysics Data System (ADS)

    Baez, J. C.

    2012-12-01

    The Azimuth Project is an online collaboration of scientists, engineers and programmers who are volunteering their time to do something about a wide range of environmental problems. The project has several aspects: 1) a wiki designed to make reliable, sourced information easy to find and accessible to technically literate nonexperts, 2) a blog featuring expository articles and news items, 3) a project to write programs that explain basic concepts of climate physics and illustrate principles of good open-source software design, and 4) a project to develop mathematical tools for studying complex networked systems. We discuss the progress so far and some preliminary lessons. For example, enlisting the help of experts outside academia highlights the problems with pay-walled journals and the benefits of open access, as well as differences between how software development is done commercially, in the free software community, and in academia.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flechsig, U.; Follath, R.; Reiche, S.

    PHASE is a software tool for physical optics simulation based on the stationary-phase approximation method. The code has been under continuous development for about 20 years and has been used, for instance, for fundamental studies and ray tracing of various beamlines at the Swiss Light Source. Along with the planning for SwissFEL, a new hard X-ray free-electron laser under construction, new features have been added to permit practical performance predictions, including the diffraction effects which emerge with the fully coherent source. We present the application of the package on the example of the ARAMIS 1 beamline at SwissFEL. The X-ray pulse, calculated with GENESIS and given as an electric field distribution, has been propagated through the beamline to the sample position. We demonstrate the new features of PHASE, such as the treatment of measured figure errors, apertures and coatings of the mirrors, and the application of Fourier optics propagators for free-space propagation.

  14. GRI: The Gamma-Ray Imager mission

    NASA Astrophysics Data System (ADS)

    Knödlseder, Jürgen; GRI Consortium

    With the INTEGRAL observatory, ESA has provided a unique tool to the astronomical community revealing hundreds of sources, new classes of objects, extraordinary views of antimatter annihilation in our Galaxy, and fingerprints of recent nucleosynthesis processes. While INTEGRAL provides the global overview over the soft gamma-ray sky, there is a growing need to perform deeper, more focused investigations of gamma-ray sources. In soft X-rays a comparable step was taken going from the Einstein and the EXOSAT satellites to the Chandra and XMM/Newton observatories. Technological advances in the past years in the domain of gamma-ray focusing using Laue diffraction have paved the way towards a new gamma-ray mission, providing major improvements regarding sensitivity and angular resolution. Such a future Gamma-Ray Imager will allow studies of particle acceleration processes and explosion physics in unprecedented detail, providing essential clues on the innermost nature of the most violent and most energetic processes in the Universe.

  15. GRI: The Gamma-Ray Imager mission

    NASA Astrophysics Data System (ADS)

    Knödlseder, Jürgen; GRI Consortium

    2006-06-01

    With the INTEGRAL observatory, ESA has provided a unique tool to the astronomical community revealing hundreds of sources, new classes of objects, extraordinary views of antimatter annihilation in our Galaxy, and fingerprints of recent nucleosynthesis processes. While INTEGRAL provides the global overview over the soft gamma-ray sky, there is a growing need to perform deeper, more focused investigations of gamma-ray sources. In soft X-rays a comparable step was taken going from the Einstein and the EXOSAT satellites to the Chandra and XMM/Newton observatories. Technological advances in the past years in the domain of gamma-ray focusing using Laue diffraction have paved the way towards a new gamma-ray mission, providing major improvements regarding sensitivity and angular resolution. Such a future Gamma-Ray Imager will allow the study of particle acceleration processes and explosion physics in unprecedented detail, providing essential clues on the innermost nature of the most violent and most energetic processes in the Universe.

  16. NASA's Quiet Aircraft Technology Project

    NASA Technical Reports Server (NTRS)

    Whitfield, Charlotte E.

    2004-01-01

    NASA's Quiet Aircraft Technology Project is developing physics-based understanding, models and concepts to discover and realize technology that will, when implemented, achieve the goals of a reduction of one-half in perceived community noise (relative to 1997) by 2007 and a further one-half in the far term. Noise sources generated by both the engine and the airframe are considered, and the effects of engine/airframe integration are accounted for through the propulsion airframe aeroacoustics element. Assessments of the contribution of individual source noise reductions to the reduction in community noise are developed to guide the work and the development of new tools for evaluation of unconventional aircraft is underway. Life in the real world is taken into account with the development of more accurate airport noise models and flight guidance methodology, and in addition, technology is being developed that will further reduce interior noise at current weight levels or enable the use of lighter-weight structures at current noise levels.

  17. Long distance measurement with a femtosecond laser based frequency comb

    NASA Astrophysics Data System (ADS)

    Bhattacharya, N.; Cui, M.; Zeitouny, M. G.; Urbach, H. P.; van den Berg, S. A.

    2017-11-01

    Recent advances in the field of ultra-short pulse lasers have led to the development of reliable sources of carrier-envelope-phase-stabilized femtosecond pulses. The pulse train generated by such a source has a frequency spectrum that consists of discrete, regularly spaced lines known as a frequency comb. In this case both the repetition frequency and the carrier-envelope offset frequency are referenced to a frequency standard, such as an atomic clock. As a result the accuracy of the frequency standard is transferred to the optical domain, with the frequency comb as transfer oscillator. These unique properties allow the frequency comb to be applied as a versatile tool, not only for time and frequency metrology, but also in fundamental physics, high-precision spectroscopy, and laser noise characterization. The pulse-to-pulse phase relationship of the light emitted by the frequency comb has opened up new directions for long-range, highly accurate distance measurement.
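
    The referencing described above rests on the comb equation f_n = f_ceo + n·f_rep: every optical line is fixed by two radio frequencies plus an integer, so clock-level accuracy carries into the optical domain. A minimal sketch, with illustrative RF values:

    ```python
    def comb_line_hz(n, f_rep_hz, f_ceo_hz):
        """Optical frequency of comb mode n: f_n = f_ceo + n * f_rep.
        f_rep and f_ceo are radio frequencies locked to the clock, so
        f_n inherits the clock's fractional accuracy."""
        return f_ceo_hz + n * f_rep_hz

    # Illustrative numbers: 250 MHz repetition rate, 20 MHz offset.
    # Mode n = 772,000 then lies near 193 THz, i.e. around 1550 nm.
    print(comb_line_hz(772_000, 250e6, 20e6) / 1e12, "THz")
    ```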

  18. Versatile fusion source integrator AFSI for fast ion and neutron studies in fusion devices

    NASA Astrophysics Data System (ADS)

    Sirén, Paula; Varje, Jari; Äkäslompolo, Simppa; Asunta, Otto; Giroud, Carine; Kurki-Suonio, Taina; Weisen, Henri; JET Contributors, The

    2018-01-01

    ASCOT Fusion Source Integrator AFSI, an efficient tool for calculating fusion reaction rates and characterizing the fusion products, based on arbitrary reactant distributions, has been developed and is reported in this paper. Calculation of reactor-relevant D-D, D-T and D-3He fusion reactions has been implemented based on the Bosch-Hale fusion cross sections. The reactions can be calculated between arbitrary particle populations, including Maxwellian thermal particles and minority energetic particles. Reaction rate profiles, energy spectra and full 4D phase space distributions can be calculated for the non-isotropic reaction products. The code is especially suitable for integrated modelling in self-consistent plasma physics simulations as well as in the Serpent neutronics calculation chain. Validation of the model has been performed for neutron measurements at the JET tokamak and the code has been applied to predictive simulations in ITER.
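
    The quantity at the heart of such a code is the volumetric reaction rate R = n1 · n2 · <σv> (with a factor 1/2 when the reactants are the same species, as in D-D). AFSI evaluates <σv> from the Bosch-Hale cross sections for arbitrary distributions; the sketch below just applies the formula with an assumed order-of-magnitude Maxwellian D-T reactivity near 10 keV, meant only as an illustration.

    ```python
    def fusion_rate_m3_s(n1_m3, n2_m3, sigma_v_m3_s, identical=False):
        """Volumetric fusion reaction rate R = n1 * n2 * <sigma v>,
        halved when both reactants are the same species (e.g. D-D)."""
        rate = n1_m3 * n2_m3 * sigma_v_m3_s
        return 0.5 * rate if identical else rate

    # 50:50 D-T plasma with n_e = 1e20 m^-3 and an assumed Maxwellian
    # reactivity of ~1.1e-22 m^3/s near T_i = 10 keV (order of magnitude):
    print(f"{fusion_rate_m3_s(5e19, 5e19, 1.1e-22):.2e} reactions/m^3/s")
    ```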

  19. GIS-MODFLOW: Ein kleines OpenSource-Werkzeug zur Anbindung von GIS-Daten an MODFLOW

    NASA Astrophysics Data System (ADS)

    Gossel, Wolfgang

    2013-06-01

    The numerical model MODFLOW (Harbaugh 2005) is an efficient and up-to-date tool for groundwater flow modelling. On the other hand, Geo-Information-Systems (GIS) provide useful tools for data preparation and visualization that can also be incorporated in numerical groundwater modelling. An interface between both would therefore be useful for many hydrogeological investigations. To date, several integrated stand-alone tools have been developed that rely on MODFLOW, MODPATH and transport modelling tools. Simultaneously, several open-source GIS codes were developed to improve functionality and ease of use. These GIS tools can be used as pre- and post-processors of the numerical model MODFLOW via a suitable interface. Here we present GIS-MODFLOW as an open-source tool that provides a new universal interface by using the ESRI ASCII GRID data format, which can be converted into MODFLOW input data. This tool can also treat MODFLOW results. Such a combination of MODFLOW and open-source GIS opens new possibilities for making groundwater flow modelling and simulation results available to a wider circle of hydrogeologists.
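
    The paper does not reproduce its code; as an illustration of the kind of conversion involved, the sketch below parses the ESRI ASCII GRID format mentioned above (assuming the common six-line header variant) and dumps the grid as a plain array file of the sort a MODFLOW input could reference. File names are hypothetical:

```python
import numpy as np

def read_esri_ascii_grid(path):
    """Parse an ESRI ASCII GRID file into (header dict, 2-D numpy array).

    Assumes the six-line header variant: ncols, nrows, xllcorner/xllcenter,
    yllcorner/yllcenter, cellsize, NODATA_value.
    """
    header = {}
    with open(path) as fh:
        for _ in range(6):
            key, value = fh.readline().split()
            header[key.lower()] = float(value)
        data = np.loadtxt(fh)                    # remaining lines are the grid
    nodata = header.get("nodata_value")
    if nodata is not None:
        data = np.where(data == nodata, np.nan, data)
    return header, data

def write_modflow_array(path, data, fmt="%12.4e"):
    """Dump the grid as a free-format real array, one model row per line."""
    np.savetxt(path, data, fmt=fmt)

# Hypothetical usage: convert a GIS raster of the aquifer top elevation
# into an array file referenced from a MODFLOW discretization input.
header, top = read_esri_ascii_grid("aquifer_top.asc")
write_modflow_array("top_layer1.txt", top)
print(int(header["ncols"]), int(header["nrows"]))
```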

  20. Development status of EUV sources for use in beta-tools and high-volume chip manufacturing tools

    NASA Astrophysics Data System (ADS)

    Stamm, U.; Kleinschmidt, J.; Bolshukhin, D.; Brudermann, J.; Hergenhan, G.; Korobotchko, V.; Nikolaus, B.; Schürmann, M. C.; Schriever, G.; Ziener, C.; Borisov, V. M.

    2006-03-01

    In this paper we give an update on the development status of gas discharge produced plasma (GDPP) EUV sources at XTREME technologies. Already in 2003, first commercial prototypes of xenon GDPP sources of the type XTS 13-35, based on the Z-pinch with 35 W power in 2π sr, were delivered and integrated into micro-exposure tools from Exitech, UK. The micro-exposure tools with these sources were installed in industry in 2004. The first tool has made more than 100 million pulses without visible degradation of the source collector optics. For the next generation of full-field exposure tools (which we call beta-tools) we are developing GDPP sources with a power of > 10 W in the intermediate focus. These sources also use xenon as fuel, which has the advantage of not introducing additional contamination. Here we describe the basic performance of these sources as well as aspects of collector integration, debris mitigation and optics lifetime. To achieve the source performance required for high-volume chip manufacturing we consider tin as the source fuel because of its higher conversion efficiency compared to xenon. While we had earlier reported an output power of 400 W in 2π sr from a tin source, we have meanwhile reached 800 W in 2π sr in burst operation. Provided a high-power collector is available with a realistic collector module efficiency of between 9% and 15%, these data would support 70-120 W power in the intermediate focus. However, we do not expect that the required duty cycle and electrode lifetimes can be met with this standing-electrode Z-pinch design. To overcome lifetime and duty-cycle limitations we have investigated GDPP sources with tin fuel and rotating disk electrodes. Currently we can generate more than 200 W in 2π sr with these sources at 4 kHz repetition rate. To achieve 180 W power in the intermediate focus, which is the recent requirement of some exposure tool manufacturers, this type of source would need to operate at 21-28 kHz repetition rate, which may not be possible for various reasons. In order to make operation at reasonable repetition rates with sufficient power possible, we have investigated various new excitation concepts for the rotating disk electrode configurations. With one of these concepts, pulse energies above 170 mJ in 2π sr could be demonstrated. This approach promises to support 180 W intermediate-focus power at repetition rates in the range between 7 and 10 kHz. It will be developed to the next power level in the following phase of XTREME technologies' high-volume manufacturing source development program.
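
    The power bookkeeping in this abstract can be reproduced with a quick back-of-envelope check; the sketch below uses only the numbers quoted in the text (800 W in 2π sr, 9-15% collector efficiency, 170 mJ pulses) and is purely illustrative:

```python
# Back-of-envelope check of the source power figures quoted above.
p_2pi_W = 800.0                       # tin source, burst operation, in 2*pi sr
for eff in (0.09, 0.15):              # quoted collector module efficiencies
    print(f"eff={eff:.0%}: intermediate-focus power ~ {p_2pi_W * eff:.0f} W")
# -> ~72 W and 120 W, matching the 70-120 W range in the text.

e_pulse_2pi_J = 0.170                 # 170 mJ per pulse in 2*pi sr
for eff in (0.09, 0.15):
    rep_rate_Hz = 180.0 / (e_pulse_2pi_J * eff)
    print(f"eff={eff:.0%}: 180 W at IF needs ~ {rep_rate_Hz / 1e3:.1f} kHz")
# -> roughly 7-12 kHz, bracketing the 7-10 kHz range claimed in the text.
```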

  1. Enhancing interdisciplinary collaboration and decisionmaking with J-Earth: an open source data sharing, visualization and GIS analysis platform

    NASA Astrophysics Data System (ADS)

    Prashad, L. C.; Christensen, P. R.; Fink, J. H.; Anwar, S.; Dickenshied, S.; Engle, E.; Noss, D.

    2010-12-01

    Our society currently faces a number of major environmental challenges, most notably the threat of climate change. A multifaceted, interdisciplinary approach involving physical and social scientists, engineers and decisionmakers is critical to adequately address these complex issues. To best facilitate this interdisciplinary approach, data and models at various scales - from local to global - must be quickly and easily shared between disciplines to effectively understand environmental phenomena and human-environmental interactions. When data are acquired and studied on different scales and within different disciplines, researchers and practitioners may not be able to easily learn from each other's results. For example, climate change models are often developed at a global scale, while strategies that address human vulnerability to climate change and mitigation/adaptation strategies are often assessed at a local level. Linkages between urban heat island phenomena and global climate change may be better understood with increased data flow among researchers and those making policy decisions. In these cases it would be useful to have a single platform to share, visualize, and analyze numerical model and satellite/airborne remote sensing data together with social, environmental, and economic data between researchers and practitioners. The Arizona State University 100 Cities Project and Mars Space Flight Facility are developing the open-source application J-Earth, with the goal of providing this single platform that facilitates data sharing, visualization, and analysis between researchers and applied practitioners around environmental and other sustainability challenges. This application is being designed for user communities including physical and social scientists, NASA researchers, non-governmental organizations, and decisionmakers to share and analyze data at multiple scales. We are initially focusing on urban heat island and urban ecology studies, with data and users from local to global levels. J-Earth is a Geographic Information System (GIS) that provides analytical tools for visualizing high-resolution and hyperspectral remote sensing imagery along with numeric and vector data. J-Earth is part of the JMARS (Java Mission-planning and Analysis for Remote Sensing) suite of tools, which were first created to target NASA instruments on Mars and lunar missions. Data can currently be incorporated in J-Earth at a scale of over 32,000 pixels per degree. Among other GIS functions, users can analyze trends along a transect line, or across vector regions, over multiple stacked numerical data layers and export their results. Open-source tools like J-Earth are not only generally free or low-cost to users but also give users the opportunity to contribute direction, functionality, and data standards to these projects. The flexible nature of open-source projects often facilitates the incorporation of unique and emerging data sources, such as mobile phone data, sensor networks, crowdsourced inputs, and social networking. The J-Earth team plans to incorporate data sources such as these with the feedback and participation of the user community.

  2. Interactive graphic editing tools in bioluminescent imaging simulation

    NASA Astrophysics Data System (ADS)

    Li, Hui; Tian, Jie; Luo, Jie; Wang, Ge; Cong, Wenxiang

    2005-04-01

    It is a challenging task to accurately describe complicated biological tissues and bioluminescent sources in bioluminescent imaging simulation. Several graphic editing tools have been developed to efficiently model each part of the bioluminescent simulation environment and to interactively correct or improve the initial models of anatomical structures or bioluminescent sources. There are two major types of graphic editing tools: non-interactive tools and interactive tools. Geometric building blocks (i.e., regular geometric shapes and superquadrics) are applied as non-interactive tools. To a certain extent, complicated anatomical structures and bioluminescent sources can be approximately modeled by combining a sufficiently large number of geometric building blocks with Boolean operators. However, those models are too simple to describe the local features and fine changes in 2D/3D irregular contours. Therefore, interactive graphic editing tools have been developed to facilitate local modifications of any initial surface model. With initial models composed of geometric building blocks, an interactive spline mode is applied to conveniently perform dragging and compressing operations on local 2D/3D surfaces of biological tissues and bioluminescent sources inside the region/volume of interest. Several applications of the interactive graphic editing tools will be presented in this article.
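
    The "building blocks plus Boolean operators" idea lends itself to a compact sketch; the following toy constructive-geometry code is illustrative only and is not the authors' tool:

```python
import numpy as np

# Each building block is a predicate: does point p lie inside the shape?
def sphere(center, r):
    return lambda p: np.linalg.norm(np.asarray(p) - center) <= r

def box(lo, hi):
    return lambda p: bool(np.all((lo <= np.asarray(p)) & (np.asarray(p) <= hi)))

# Boolean operators combine blocks into more complex models.
union        = lambda a, b: (lambda p: a(p) or b(p))
intersection = lambda a, b: (lambda p: a(p) and b(p))
difference   = lambda a, b: (lambda p: a(p) and not b(p))

# Illustrative "tissue" model: a spherical body with a boxy cavity removed
# and a small spherical source region added.
body   = sphere(np.array([0.0, 0.0, 0.0]), 10.0)
cavity = box(np.array([2.0, -1.0, -1.0]), np.array([5.0, 1.0, 1.0]))
source = sphere(np.array([-4.0, 0.0, 0.0]), 1.0)
model  = union(difference(body, cavity), source)

print(model([3.0, 0.0, 0.0]))    # False: inside the removed cavity
print(model([-4.5, 0.0, 0.0]))   # True: inside the source region
```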

  3. Physical and non-physical energy in scattered wave source-receiver interferometry.

    PubMed

    Meles, Giovanni Angelo; Curtis, Andrew

    2013-06-01

    Source-receiver interferometry allows Green's functions between sources and receivers to be estimated by means of convolution and cross-correlation of other wavefields. Source-receiver interferometry has been observed to work surprisingly well in practical applications when theoretical requirements (e.g., complete enclosing boundaries of other sources and receivers) are contravened: this paper contributes to explaining why this may be true. Commonly used inter-receiver interferometry requires wavefields to be generated around specific stationary points in space, which are controlled purely by medium heterogeneity and receiver locations. By contrast, application of source-receiver interferometry constructs at least kinematic information about physically scattered waves between a source and a receiver by cross-convolution of scattered waves propagating from and to any points on the boundary. This reduces the ambiguity in interpreting wavefields generated using source-receiver interferometry with only partial boundaries (as is standard in practical applications), as it allows spurious or non-physical energy in the constructed Green's function to be identified and ignored. Further, source-receiver interferometry (which includes a step of inter-receiver interferometry) turns all types of non-physical or spurious energy deriving from inter-receiver interferometry into what appears to be physical energy. This explains in part why source-receiver interferometry may perform relatively well compared to inter-receiver interferometry when constructing scattered wavefields.
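
    As a toy illustration of the correlation step that underlies such constructions, the following sketch retrieves the travel-time difference between two receivers recording the same source; geometry and waveform are invented:

```python
import numpy as np

fs = 1000.0                                # sampling rate [Hz]
t = np.arange(0, 2.0, 1 / fs)
wavelet = lambda tau: np.exp(-((t - tau) * 40) ** 2)   # toy arrival at time tau

# One boundary source recorded at two receivers; the direct-wave travel
# times differ by 0.3 s (invented numbers).
u_A = wavelet(0.50)
u_B = wavelet(0.80)

# The cross-correlation peaks at the inter-receiver traveltime difference,
# which is the kinematic content that interferometry reconstructs.
xc = np.correlate(u_B, u_A, mode="full")
lags = (np.arange(xc.size) - (t.size - 1)) / fs
print(f"retrieved lag: {lags[np.argmax(xc)]:.3f} s")   # ~0.300 s
```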

  4. Coupled Physics Environment (CouPE) library - Design, Implementation, and Release

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahadevan, Vijay S.

    Over several years, high-fidelity, validated mono-physics solvers with proven scalability on peta-scale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a unified mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. In this report, we present details on the design decisions and developments in CouPE, an acronym for Coupled Physics Environment, which orchestrates a coupled physics solver through the interfaces exposed by the MOAB array-based unstructured mesh; both are part of the SIGMA (Scalable Interfaces for Geometry and Mesh-Based Applications) toolkit. The SIGMA toolkit contains libraries that enable scalable geometry and unstructured mesh creation and handling in a memory- and computationally efficient implementation. The CouPE version being prepared for a full open-source release, along with updated documentation, will contain several useful examples that will enable users to start developing their applications natively using the native MOAB mesh and to couple their models to existing physics applications to analyze and solve real-world problems of interest. An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is also being investigated as part of the NEAMS RPL, to tightly couple neutron transport, thermal-hydraulics and structural mechanics physics under the SHARP framework. This report summarizes the efforts that have been invested in CouPE to bring together several existing physics applications, namely PROTEUS (neutron transport code), Nek5000 (computational fluid dynamics code) and Diablo (structural mechanics code). The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. The design of CouPE, along with the motivations that led to implementation choices, is also discussed. The first release of the library will differ from the current version of the code that integrates the components in SHARP, and an explanation of the need for forking the source base will also be provided. Enhancements in functionality and improved user guides will be available as part of the release. CouPE v0.1 is scheduled for an open-source release in December 2014 along with SIGMA v1.1 components that provide support for language-agnostic mesh loading, traversal and query interfaces along with scalable solution transfer of fields between different physics codes. The coupling methodology and software interfaces of the library are presented, along with verification studies on two representative sodium-cooled fast reactor demonstration problems to prove the usability of the CouPE library.

  5. Insightful problem solving and creative tool modification by captive nontool-using rooks.

    PubMed

    Bird, Christopher D; Emery, Nathan J

    2009-06-23

    The ability to use tools has been suggested to indicate advanced physical cognition in animals. Here we show that rooks, members of the corvid family that do not appear to use tools in the wild, are capable of insightful problem solving related to sophisticated tool use, including spontaneously modifying and using a variety of tools, shaping hooks out of wire, and using a series of tools in a sequence to gain a reward. It is remarkable that a species that does not use tools in the wild appears to possess an understanding of tools rivaling that of habitual tool users such as New Caledonian crows and chimpanzees. Our findings suggest that the ability to represent tools may be a domain-general cognitive capacity rather than an adaptive specialization, and they call into question the relationship between physical intelligence and wild tool use.

  6. The test of Tensile Properties and Water Resistance of a Novel Cross-linked Starch Prepared by Adding Oil-Flax

    NASA Astrophysics Data System (ADS)

    Shi, Dawei; Wang, Rui

    2017-12-01

    In this study, to address the poor water resistance and low mechanical properties of starch, a mixed-starch composite matrix containing glycerol, sorbitol, and urea was prepared via single-screw extrusion; oil-flax was then added to improve its physical and mechanical properties for use as a source of biodegradable plastic material. The composite matrix was systematically characterized using various analytic tools including XRD, SEM and TG. The composite showed a maximum tensile strength of 18.11 MPa and a moisture absorption of 17.67%, while the original starch matrix showed only 12.51 MPa and 24.98%, respectively.

  7. MODEST: A Tool for Geodesy and Astronomy

    NASA Technical Reports Server (NTRS)

    Sovers, Ojars J.; Jacobs, Christopher S.; Lanyi, Gabor E.

    2004-01-01

    Features of the JPL VLBI modeling and estimation software "MODEST" are reviewed. Its main advantages include thoroughly documented model physics, portability, and detailed error modeling. Two unique models are included: modeling of source structure and modeling of both spatial and temporal correlations in tropospheric delay noise. History of the code parallels the development of the astrometric and geodetic VLBI technique and the software retains many of the models implemented during its advancement. The code has been traceably maintained since the early 1980s, and will continue to be updated with recent IERS standards. Scripts are being developed to facilitate user-friendly data processing in the era of e-VLBI.

  8. IMAGINE: Interstellar MAGnetic field INference Engine

    NASA Astrophysics Data System (ADS)

    Steininger, Theo

    2018-03-01

    IMAGINE (Interstellar MAGnetic field INference Engine) performs inference on generic parametric models of the Galaxy. The modular open source framework uses highly optimized tools and technology such as the MultiNest sampler (ascl:1109.006) and the information field theory framework NIFTy (ascl:1302.013) to create an instance of the Milky Way based on a set of parameters for physical observables, using Bayesian statistics to judge the mismatch between measured data and model prediction. The flexibility of the IMAGINE framework allows for simple refitting for newly available data sets and makes state-of-the-art Bayesian methods easily accessible particularly for random components of the Galactic magnetic field.
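
    At its simplest, "judging the mismatch" is a likelihood evaluation over parametrized models; the following schematic Gaussian log-likelihood is illustrative and is not IMAGINE's actual API:

```python
import numpy as np

def log_likelihood(params, data, noise_sigma, simulate):
    """Gaussian log-likelihood comparing measured data with a model prediction.

    simulate(params) plays the role of the Galaxy simulator: it maps model
    parameters to mock observables (all names here are illustrative).
    """
    resid = (data - simulate(params)) / noise_sigma
    return -0.5 * np.sum(resid ** 2)

# Toy "observable": a sinusoid whose amplitude is the one free parameter;
# a sampler such as MultiNest would explore this parameter space.
x = np.linspace(0, 2 * np.pi, 50)
simulate = lambda p: p["amp"] * np.sin(x)
rng = np.random.default_rng(42)
data = simulate({"amp": 2.0}) + rng.normal(0, 0.1, x.size)

for amp in (1.0, 2.0, 3.0):
    print(amp, log_likelihood({"amp": amp}, data, 0.1, simulate))
# The likelihood peaks near amp = 2, the value used to generate the data.
```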

  9. A Bubble Chamber Simulator: A New Tool for the Physics Classroom

    ERIC Educational Resources Information Center

    Gagnon, Michel

    2011-01-01

    Mainly used in the 1960s, bubble chambers played a major role in particle physics. Now replaced with modern electronic detectors, we believe they remain an important didactic tool to introduce particle physics as they provide visual, appealing and insightful pictures. Sadly, this rare type of detector is mostly accessible through open-door events…

  10. Locating the Center of Gravity: The Dance of Normal and Frictional Forces

    ERIC Educational Resources Information Center

    Balta, Nuri

    2012-01-01

    Teaching physics concepts with the basic materials that are around us is one of the beauties of physics. Without expensive lab materials and long experiments, many physics concepts can be taught to students using simple tools. Demonstrations with these tools can be presented as discrepant events that surprise, amaze, or puzzle students. Greenslade…

  11. Using Plickers as an Assessment Tool in Health and Physical Education Settings

    ERIC Educational Resources Information Center

    Chng, Lena; Gurvitch, Rachel

    2018-01-01

    Written tests are one of the most common assessment tools classroom teachers use today. Despite its popularity, administering written tests or surveys, especially in health and physical education settings, is time consuming. In addition to the time taken to type and print out the tests or surveys, health and physical education teachers must grade…

  12. Adolescent Girls' Reactions to Nutrition and Physical Activity Assessment Tools and Insight into Lifestyle Habits

    ERIC Educational Resources Information Center

    Metos, Julie; Gren, Lisa; Brusseau, Timothy; Moric, Endi; O'Toole, Karen; Mokhtari, Tahereh; Buys, Saundra; Frost, Caren

    2018-01-01

    Objective: The objective of this study was to understand adolescent girls' experiences using practical diet and physical activity measurement tools and to explore the food and physical activity settings that influence their lifestyle habits. Design: Mixed methods study using quantitative and qualitative methods. Setting: Large city in the western…

  13. Factors Influencing Physical Therapists' Use of Standardized Measures of Walking Capacity Poststroke Across the Care Continuum

    PubMed Central

    Pattison, Kira M.; Brooks, Dina; Cameron, Jill I.

    2015-01-01

    Background The use of standardized assessment tools is an element of evidence-informed rehabilitation, but physical therapists report administering these tools inconsistently poststroke. An in-depth understanding of physical therapists' approaches to walking assessment is needed to develop strategies to advance assessment practice. Objectives The objective of this study was to explore the methods physical therapists use to evaluate walking poststroke, reasons for selecting these methods, and the use of assessment results in clinical practice. Design A qualitative descriptive study involving semistructured telephone interviews was conducted. Methods Registered physical therapists assessing a minimum of 10 people with stroke per year in Ontario, Canada, were purposively recruited from acute care, rehabilitation, and outpatient settings. Interviews were audiotaped and transcribed verbatim. Transcripts were coded line by line by the interviewer. Credibility was optimized through triangulation of analysts, audit trail, and collection of field notes. Results Study participants worked in acute care (n=8), rehabilitation (n=11), or outpatient (n=9) settings and reported using movement observation and standardized assessment tools to evaluate walking. When selecting methods to evaluate walking, physical therapists described being influenced by a hierarchy of factors. Factors included characteristics of the assessment tool, the therapist, the workplace, and patients, as well as influential individuals or organizations. Familiarity exerted the primary influence on adoption of a tool into a therapist's assessment repertoire, whereas patient factors commonly determined daily use. Participants reported using the results from walking assessments to communicate progress to the patient and health care professionals. Conclusions Multilevel factors influence physical therapists' adoption and daily administration of standardized tools to assess walking. Findings will inform knowledge translation efforts aimed at increasing the standardized assessment of walking poststroke. PMID:25929532

  14. Digital holography with electron wave: measuring into the nanoworld

    NASA Astrophysics Data System (ADS)

    Mendoza Santoyo, Fernando; Voelkl, Edgar

    2016-04-01

    Dennis Gabor invented holography in 1948. His main concern at the time was centered on spherical aberration correction in the recently created electron microscopes, especially after O. Scherzer had shown mathematically that round electron-optical lenses always have a positive spherical aberration coefficient, and the mechanical requirements for minimizing the spherical aberration were too demanding to allow atomic resolution. At the time, the lack of coherent electron sources meant that in-line holography was developed using quasi-coherent light sources. As such, holography did not produce good enough scientific results to be considered a must-use tool. In 1956, G. Moellenstedt invented a device called the wire biprism, which allowed the object and reference beams to be combined in an off-axis configuration. The invention of the laser at the end of the 1950s gave holography a great leap forward, since this light source was highly coherent, and hence led to the invention of holographic interferometry during the first lustrum of the 1960s. This new discipline in the optics field has successfully evolved to become a trusted tool in a wide variety of areas. Coherent electron sources became available only by the late 1970s, a fact that gave an outstanding impulse to electron holography, so that today nanomaterials and structures belonging to a wide variety of subjects can be characterized with regard to their physical and mechanical parameters. This invited paper presents and discusses state-of-the-art applications of electron holography to study the shape of nanoparticles and bacteria, and the qualitative and quantitative study of magnetic and electric fields produced by novel nanostructures.

  15. Capstone: A Geometry-Centric Platform to Enable Physics-Based Simulation and Design of Systems

    DTIC Science & Technology

    2015-10-05

    foundation for the air-vehicle early design tool DaVinci being developed by the CREATE™-AV project to enable development of associative models of air... CREATE™-AV solvers Kestrel [11] and Helios [16,17]. Furthermore, it is the foundation for the CREATE™-AV DaVinci [9] tool that provides a... Tools and Environments (CREATE™) program [6] aimed at developing a suite of high-performance physics-based computational tools addressing the needs

  16. Barriers and motivators for owners walking their dog: results from qualitative research.

    PubMed

    Cutt, Hayley E; Giles-Corti, Billie; Wood, Lisa J; Knuiman, Matthew W; Burke, Valerie

    2008-08-01

    This qualitative research explored the relationship between dog ownership and dog-related, social environmental and physical environmental factors associated with walking with a dog. Seven focus groups with dog owners (n=51) were conducted. A pre-determined discussion guide was used and transcripts were analysed as group data, using content analysis to identify common themes. Many of the physical environmental barriers and facilitators that influenced dog owners to walk were similar to those found in the literature for general walking. However, a number of key motivators for walking, specific to dog owners, were identified. Dog owners reported that their dog was a strong source of motivation, companionship and social support that encouraged them to walk with their dog. The availability and accessibility of public open space (POS) for dogs and the provision of dog-related infrastructure within POS were also important environmental factors that affected whether owners walked with their dog. Results from this qualitative study were used to develop the Dogs and Physical Activity (DAPA) tool which is now being used to measure the walking behaviour of dog owners.

  17. Delta: a new web-based 3D genome visualization and analysis platform.

    PubMed

    Tang, Bixia; Li, Feifei; Li, Jing; Zhao, Wenming; Zhang, Zhihua

    2018-04-15

    Delta is an integrative visualization and analysis platform designed to facilitate visually annotating and exploring the 3D physical architecture of genomes. Delta takes a Hi-C or ChIA-PET contact matrix as input and predicts the topologically associating domains and chromatin loops in the genome. It then generates a physical 3D model that represents the plausible consensus 3D structure of the genome. Delta features a highly interactive visualization tool which enhances the integration of genome topology/physical structure with extensive genome annotation by juxtaposing the 3D model with diverse genomic assay outputs. Finally, by visually comparing the 3D model of the β-globin gene locus with its annotation, we speculated on a plausible transitory interaction pattern in the locus. A literature survey found experimental evidence to support this speculation. This served as an example of intuitive hypothesis testing with the help of Delta. Delta is freely accessible from http://delta.big.ac.cn, and the source code is available at https://github.com/zhangzhwlab/delta. Contact: zhangzhihua@big.ac.cn. Supplementary data are available at Bioinformatics online.
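
    The abstract does not specify Delta's TAD-calling algorithm; one common approach to boundary detection from a contact matrix is the insulation score, sketched here on a synthetic matrix purely for illustration:

```python
import numpy as np

def insulation_score(contact, window=5):
    """Mean contact frequency in a square sliding across the diagonal.

    Local minima of this score are candidate TAD boundaries; this is a
    generic method, not necessarily the one Delta implements.
    """
    n = contact.shape[0]
    score = np.full(n, np.nan)
    for i in range(window, n - window):
        score[i] = contact[i - window:i, i + 1:i + window + 1].mean()
    return score

# Synthetic matrix with two blocky "domains" separated at bin 25.
n = 50
contact = np.full((n, n), 0.1)
contact[:25, :25] += 1.0
contact[25:, 25:] += 1.0

score = insulation_score(contact)
print("candidate boundary bin:", np.nanargmin(score))   # -> 24, at the domain edge
```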

  18. Strategies for Validating and Directions for Employing SMOS Data, in the Cal-Val Project SWEX (3275)

    NASA Astrophysics Data System (ADS)

    Marczewski, Wojciech; Usowicz, Boguslaw; Usowicz, Jerzy; Romanov, Sergey; Maryskevych, Oksana; Nastula, Jolanta; Slominski, Jan; Zawadzki, Jaroslaw

    2009-11-01

    The Earth's land surface, as an observational target, is naturally diversified in its physical and bio-physical properties. SMOS observations of SM (Soil Moisture) depend strongly on proper external physical and environmental data, because SM is retrieved from the directly observable BT (Brightness Temperature) on the basis of these data. In that way, SMOS performs a real data fusion in NRT (Near Real Time), and thus needs validating. The global range of SMOS observations means that the diversity of the target must be handled in a complex way, engaging technical, modelling and organizational means; this represents a new quality of EO (Earth Observation) in the matter of managing the diversity of the target. The paper presents several examples of employing external data by means of the SMOS software tools, at the L1c and L2 data levels. The authors carry out validation at a few selected sites in Poland and describe their strategy for employing external data from ASAR, MERIS, and other auxiliary sources. Finally, the conclusions address the understanding of how SMOS data can be used, and seek ways of referencing SM at large scales to the known results of the gravitational mission GRACE.

  19. Observation planning tools for the ESO VLT interferometer

    NASA Astrophysics Data System (ADS)

    McKay, Derek J.; Ballester, Pascal; Vinther, Jakob

    2004-09-01

    Now that the Very Large Telescope Interferometer (VLTI) is producing regular scientific observations, the field of optical interferometry has moved from being a specialist niche area into mainstream astronomy. Making such instruments available to the general community involves difficult challenges in modelling, presentation and automation. The planning of each interferometric observation requires calibrator source selection, visibility prediction, signal-to-noise estimation and exposure time calculation. These planning tools require detailed physical models simulating the complete telescope system - including the observed source, atmosphere, array configuration, optics, detector and data processing. Only then can these software utilities provide accurate predictions about instrument performance, robust noise estimation and reliable metrics indicating the anticipated success of an observation. The information must be presented in a clear, intelligible manner, sufficiently abstract to hide the details of telescope technicalities, but still giving the user a degree of control over the system. The Data Flow System group has addressed the needs of the VLTI and, in doing so, has gained some new insights into the planning of observations, and the modelling and simulation of interferometer performance. This paper reports these new techniques, as well as the successes of the Data Flow System group in this area and a summary of what is now offered as standard to VLTI observers.
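
    Central to such planning tools is the visibility prediction mentioned above; for a uniform stellar disk the standard van Cittert-Zernike result is V = |2 J1(x)/x| with x = πBθ/λ, which a few lines of code can evaluate (numbers invented):

```python
import numpy as np
from scipy.special import j1

def uniform_disk_visibility(baseline_m, diameter_rad, wavelength_m):
    """Fringe visibility of a uniform stellar disk (standard textbook result)."""
    x = np.pi * baseline_m * diameter_rad / wavelength_m
    return np.abs(2 * j1(x) / x)

mas = np.pi / 180 / 3600 / 1000        # one milliarcsecond in radians
# Invented example: a 2 mas star on a 100 m baseline in the K band (2.2 um).
print(uniform_disk_visibility(100.0, 2 * mas, 2.2e-6))   # ~0.78
```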

  20. Line-source excitation of realistic conformal metasurface cloaks

    NASA Astrophysics Data System (ADS)

    Padooru, Yashwanth R.; Yakovlev, Alexander B.; Chen, Pai-Yen; Alù, Andrea

    2012-11-01

    Following our recently introduced analytical tools to model and design conformal mantle cloaks based on metasurfaces [Padooru et al., J. Appl. Phys. 112, 034907 (2012)], we investigate their performance and physical properties when excited by an electric line source placed in their close proximity. We consider metasurfaces formed by 2-D arrays of slotted (meshes and Jerusalem cross slots) and printed (patches and Jerusalem crosses) sub-wavelength elements. The electromagnetic scattering analysis is carried out using a rigorous analytical model, which utilizes the two-sided impedance boundary conditions at the interface of the sub-wavelength elements. It is shown that the homogenized grid-impedance expressions, originally derived for planar arrays of sub-wavelength elements and plane-wave excitation, may be successfully used to model and tailor the surface reactance of cylindrical conformal mantle cloaks illuminated by near-field sources. Our closed-form analytical results are in good agreement with full-wave numerical simulations, up to sub-wavelength distances from the metasurface, confirming that mantle cloaks may be very effective to suppress the scattering of moderately sized objects, independent of the type of excitation and point of observation. We also discuss the dual functionality of these metasurfaces to boost radiation efficiency and directivity from confined near-field sources.

  1. Improved Dust Forecast Products for Southwest Asia Forecasters through Dust Source Database Advancements

    NASA Astrophysics Data System (ADS)

    Brooks, G. R.

    2011-12-01

    Dust storm forecasting is a critical part of military theater operations in Afghanistan and Iraq as well as other strategic areas of the globe. The Air Force Weather Agency (AFWA) has been using the Dust Transport Application (DTA) as a forecasting tool since 2001. Initially developed by The Johns Hopkins University Applied Physics Laboratory (JHUAPL), its output products include dust concentration and reduction of visibility due to dust. The performance of the products depends on several factors including the underlying dust source database, treatment of soil moisture, parameterization of dust processes, and validity of the input atmospheric model data. Over many years of analysis, seasonal dust forecast biases of the DTA have been observed and documented. As these products are unique and indispensable for U.S. and NATO forces, amendments were required to provide the best forecasts possible. One of the quickest ways to scientifically address the dust concentration biases noted over time was to analyze the weaknesses in, and adjust, the dust source database. Dust source database strengths and weaknesses, the satellite analysis and adjustment process, and the tests that confirmed the resulting improvements in the final dust concentration and visibility products will be shown.

  2. Electrical safety device

    DOEpatents

    White, David B.

    1991-01-01

    An electrical safety device for use in power tools that is designed to automatically discontinue operation of the power tool upon physical contact of the tool with a concealed conductive material. A step down transformer is used to supply the operating power for a disconnect relay and a reset relay. When physical contact is made between the power tool and the conductive material, an electrical circuit through the disconnect relay is completed and the operation of the power tool is automatically interrupted. Once the contact between the tool and conductive material is broken, the power tool can be quickly and easily reactivated by a reset push button activating the reset relay. A remote reset is provided for convenience and efficiency of operation.

  3. Multisymplectic unified formalism for Einstein-Hilbert gravity

    NASA Astrophysics Data System (ADS)

    Gaset, Jordi; Román-Roy, Narciso

    2018-03-01

    We present a covariant multisymplectic formulation for the Einstein-Hilbert model of general relativity. As it is described by a second-order singular Lagrangian, this is a gauge field theory with constraints. The use of the unified Lagrangian-Hamiltonian formalism is particularly interesting when it is applied to these kinds of theories, since it simplifies the treatment of them, in particular, the implementation of the constraint algorithm, the retrieval of the Lagrangian description, and the construction of the covariant Hamiltonian formalism. In order to apply this algorithm to the covariant field equations, they must be written in a suitable geometrical way, which consists of using integrable distributions, represented by multivector fields of a certain type. We apply all these tools to the Einstein-Hilbert model without and with energy-matter sources. We obtain and explain the geometrical and physical meaning of the Lagrangian constraints and we construct the multimomentum (covariant) Hamiltonian formalisms in both cases. As a consequence of the gauge freedom and the constraint algorithm, we see how this model is equivalent to a first-order regular theory, without gauge freedom. In the case of the presence of energy-matter sources, we show how some relevant geometrical and physical characteristics of the theory depend on the type of source. In all the cases, we obtain explicitly multivector fields which are solutions to the gravitational field equations. Finally, a brief study of symmetries and conservation laws is done in this context.
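
    For orientation, the second-order singular Lagrangian referred to here is the Einstein-Hilbert one; the following standard expression is quoted for context (textbook material, not a result of this paper):

```latex
% Einstein-Hilbert Lagrangian density (c = 1); it is second order in the
% metric because the scalar curvature R contains second derivatives of
% g_{\mu\nu}, which is the source of the gauge freedom and constraints:
\mathcal{L}_{\mathrm{EH}} = \frac{1}{2\kappa}\,\sqrt{-g}\,R
  \;+\; \mathcal{L}_{\mathrm{matter}},
\qquad \kappa = 8\pi G .
```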

  4. Innovative approach towards understanding optics

    NASA Astrophysics Data System (ADS)

    Garg, Amit; Bharadwaj, Sadashiv Raj; Kumar, Raj; Shudhanshu, Avinash Kumar; Verma, Deepak Kumar

    2016-01-01

    Over the last few years, there has been a decline in students' interest in science and optics. The use of technology in the form of various types of sensors and data acquisition systems has come as a saviour. To date, manual routine tools and techniques are used to perform various experimental procedures in most of the science/optics laboratories in our country. The manual tools are cumbersome, whereas the automated ones are costly, and this does not enthuse young researchers in the science laboratories. There is a need to develop applications that can be easily integrated and tailored to school and undergraduate laboratories while remaining economical. Equipment with advanced technologies is available, but it is uneconomical and has a complicated working principle with a black-box approach. The present work describes the development of portable, user-friendly tools and applications. This is being implemented using an open-source physical computing platform based on a simple low-cost microcontroller board and a development environment for writing software. The present paper reports the development of an automated spectrometer, an instrument used in almost all optics experiments at the undergraduate level, and students' response to this innovation. These tools will inspire young researchers towards science and facilitate the development of advanced low-cost equipment, making life easier for India as well as other developing nations.
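
    The authors' firmware and host software are not included in the abstract; as a hedged sketch of the host side of such a microcontroller-based instrument, the following reads hypothetical angle-intensity pairs over a serial port (port name and line protocol are invented):

```python
import serial  # pyserial

# Hypothetical line protocol from the microcontroller: "<angle_deg>,<intensity>\n"
with serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=2) as port:
    readings = []
    for _ in range(100):                 # collect 100 samples
        line = port.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue                     # skip timeouts / empty lines
        angle_str, intensity_str = line.split(",")
        readings.append((float(angle_str), float(intensity_str)))

for angle, intensity in readings[:5]:
    print(f"{angle:7.2f} deg -> {intensity:8.1f}")
```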

  5. Project management web tools at the MICE experiment

    NASA Astrophysics Data System (ADS)

    Coney, L. R.; Tunnell, C. D.

    2012-12-01

    Project management tools like Trac are commonly used within the open-source community to coordinate projects. The Muon Ionization Cooling Experiment (MICE) uses the project management web application Redmine to host mice.rl.ac.uk. Many groups within the experiment have a Redmine project: analysis, computing and software (including offline, online, controls and monitoring, and database subgroups), executive board, and operations. All of these groups use the website to communicate, track effort, develop schedules, and maintain documentation. The issue tracker is a rich tool that is used to identify tasks and monitor progress within groups on timescales ranging from immediate and unexpected problems to milestones that cover the life of the experiment. It allows the prioritization of tasks according to time-sensitivity, while providing a searchable record of work that has been done. This record of work can be used to measure both individual and overall group activity, identify areas lacking sufficient personnel or effort, and as a measure of progress against the schedule. Given that MICE, like many particle physics experiments, is an international community, such a system is required to allow easy communication within a global collaboration. Unlike systems that are purely wiki-based, the structure of a project management tool like Redmine allows information to be maintained in a more structured and logical fashion.

  6. Exoplanet Yield Estimation for Decadal Study Concepts using EXOSIMS

    NASA Astrophysics Data System (ADS)

    Morgan, Rhonda; Lowrance, Patrick; Savransky, Dmitry; Garrett, Daniel

    2016-01-01

    The anticipated upcoming large mission study concepts for the direct imaging of exo-earths present an exciting opportunity for exoplanet discovery and characterization. While these telescope concepts would also be capable of conducting a broad range of astrophysical investigations, the most difficult technology challenges are driven by the requirements for imaging exo-earths. The exoplanet science yield for these mission concepts will drive design trades and mission concept comparisons.To assist in these trade studies, the Exoplanet Exploration Program Office (ExEP) is developing a yield estimation tool that emphasizes transparency and consistent comparison of various design concepts. The tool will provide a parametric estimate of science yield of various mission concepts using contrast curves from physics-based model codes and Monte Carlo simulations of design reference missions using realistic constraints, such as solar avoidance angles, the observatory orbit, propulsion limitations of star shades, the accessibility of candidate targets, local and background zodiacal light levels, and background confusion by stars and galaxies. The python tool utilizes Dmitry Savransky's EXOSIMS (Exoplanet Open-Source Imaging Mission Simulator) design reference mission simulator that is being developed for the WFIRST Preliminary Science program. ExEP is extending and validating the tool for future mission concepts under consideration for the upcoming 2020 decadal review. We present a validation plan and preliminary yield results for a point design.
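
    EXOSIMS itself models instruments, orbits and observing constraints in far more detail; the core parametric idea (draw a planet population, apply contrast and inner-working-angle cuts, count detections over many realizations) can be caricatured as follows, with all population parameters and thresholds invented:

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_mission(n_stars=200, contrast_limit=1e-10, iwa_mas=60.0):
    """One Monte Carlo mission realization; returns the number of detections."""
    # Invented population: angular separation in mas, planet/star flux ratio.
    sep = rng.uniform(20.0, 300.0, n_stars)
    flux_ratio = 10 ** rng.uniform(-11, -9, n_stars)
    detectable = (sep > iwa_mas) & (flux_ratio > contrast_limit)
    # Invented 30% chance each detectable planet is actually observed.
    observed = rng.random(n_stars) < 0.3
    return int(np.sum(detectable & observed))

yields = [simulate_mission() for _ in range(1000)]
print(f"mean yield {np.mean(yields):.1f}, 5-95% range "
      f"{np.percentile(yields, 5):.0f}-{np.percentile(yields, 95):.0f}")
```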

  7. Hydro-economic modeling of the role of forests on water resources production in Andalusia, Spain

    NASA Astrophysics Data System (ADS)

    Beguería, Santiago; Serrano-Notivoli, Roberto; Álvarez-Palomino, Alejandro; Campos, Pablo

    2014-05-01

    The development of more refined information tools is a prerequisite for supporting decision making in the context of integrated water resources management. Among these tools, hydro-economic models are favoured because they allow integrating the ecological, hydrological, infrastructure and economic aspects into a coherent, scientifically informed framework. We present a case study that physically assesses the water resources of the forest lands of the Andalusia region in Spain and conducts an economic environmental income and asset valuation of the forest surface water yield. We show how, based on available hydrologic and economic data, we can develop a comprehensive water account for all the forest lands at the regional scale. This forest water environmental valuation is part of the larger RECAMAN project, which aims at providing a robust and easily replicable accounting tool to evaluate yearly the total income and capital generated by the forest land, encompassing all measurable sources of private and public income (timber and cork production, auto-consumption, recreational activities, biodiversity conservation, carbon sequestration, water production, etc.). Only a comprehensive integrated tool such as the one built within the RECAMAN project may serve as a basis for the development of integrated policies such as those internationally agreed and recommended for the management of water resources.

  8. iTesla Power Systems Library (iPSL): A Modelica library for phasor time-domain simulations

    NASA Astrophysics Data System (ADS)

    Vanfretti, L.; Rabuzin, T.; Baudette, M.; Murad, M.

    The iTesla Power Systems Library (iPSL) is a Modelica package providing a set of power system components for phasor time-domain modeling and simulation. The Modelica language provides a systematic approach to developing models using a formal mathematical description that uniquely specifies the physical behavior of a component or of the entire system. Furthermore, the standardized specification of the Modelica language (Modelica Association [1]) enables unambiguous model exchange by allowing any Modelica-compliant tool to utilize the models for simulation and analysis without the need for a specific model transformation tool. As the Modelica language is developed with open specifications, any tool that implements these requirements can be utilized; this gives users the freedom to choose an Integrated Development Environment (IDE) of their preference. Furthermore, any integration solver can be implemented within a Modelica tool to simulate Modelica models. Additionally, Modelica is an object-oriented language, enabling code factorization and model re-use, and improving the readability of a library by structuring it with an object-oriented hierarchy. The developed library is released under an open-source license to enable wider distribution and to let users customize it to their specific needs. This paper describes the iPSL and provides illustrative application examples.

  9. Transient pressure analysis of fractured well in bi-zonal gas reservoirs

    NASA Astrophysics Data System (ADS)

    Zhao, Yu-Long; Zhang, Lie-Hui; Liu, Yong-hui; Hu, Shu-Yong; Liu, Qi-Guo

    2015-05-01

    For a hydraulically fractured well, evaluating the properties of the fracture and the formation is a difficult task, and the conventional methods are complex to apply, especially for partially penetrating fractured wells. Although the source function is a very powerful tool for analyzing the transient pressure of wells with complex structures, the corresponding reports on gas reservoirs are rare. In this paper, the continuous point-source functions in anisotropic reservoirs are derived on the basis of source function theory, the Laplace transform method and the Duhamel principle. By applying a construction method, the continuous point-source functions in a bi-zonal gas reservoir with closed upper and lower boundaries are obtained. Subsequently, the physical models and transient pressure solutions are developed for fully and partially penetrating fractured vertical wells in this reservoir. Type curves of dimensionless pseudo-pressure and its derivative as functions of dimensionless time are plotted using a numerical inversion algorithm, and the flow periods and sensitive factors are analyzed. The source functions and the fractured-well solutions have both theoretical and practical application in well test interpretation for such gas reservoirs, especially for wells with a stimulated reservoir volume created by massive hydraulic fracturing in unconventional gas reservoirs, which can often be described with the composite model.
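
    For background on the source-function approach, the classic continuous line-source (Theis-type) solution for an infinite homogeneous reservoir is worth recalling; this is a standard well-testing result, not the paper's bi-zonal composite solution:

```latex
% Pressure drop at radius r and time t due to a well produced at constant
% rate q in an infinite, homogeneous reservoir (exponential-integral solution);
% mu = viscosity, k = permeability, h = thickness, phi = porosity,
% c_t = total compressibility:
\Delta p(r,t) = -\frac{q\,\mu}{4\pi k h}\,
  \operatorname{Ei}\!\left(-\frac{\phi\,\mu\,c_t\, r^2}{4 k t}\right)
```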

  10. Measurement properties of self-report physical activity assessment tools in stroke: a protocol for a systematic review

    PubMed Central

    Martins, Júlia Caetano; Aguiar, Larissa Tavares; Nadeau, Sylvie; Scianni, Aline Alvim; Teixeira-Salmela, Luci Fuscaldi; Faria, Christina Danielli Coelho de Morais

    2017-01-01

    Introduction Self-report physical activity assessment tools are commonly used for the evaluation of physical activity levels in individuals with stroke. A great variety of these tools have been developed and widely used in recent years, which justifies the need to examine their measurement properties and clinical utility. Therefore, the main objectives of this systematic review are to examine the measurement properties and clinical utility of self-report measures of physical activity and to discuss the strengths and limitations of the identified tools. Methods and analysis A systematic review of studies that investigated the measurement properties and/or clinical utility of self-report physical activity assessment tools in stroke will be conducted. Electronic searches will be performed in five databases: Medical Literature Analysis and Retrieval System Online (MEDLINE) (PubMed), Excerpta Medica Database (EMBASE), Physiotherapy Evidence Database (PEDro), Literatura Latino-Americana e do Caribe em Ciências da Saúde (LILACS) and Scientific Electronic Library Online (SciELO), followed by hand searches of the reference lists of the included studies. Two independent reviewers will screen all retrieved titles, abstracts, and full texts according to the inclusion criteria and will also extract the data. A third reviewer will be consulted to resolve any disagreements. A descriptive summary of the included studies will contain the design, the participants, and the characteristics, measurement properties, and clinical utility of the self-report tools. The methodological quality of the studies will be evaluated using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist, and the clinical utility of the identified tools will be assessed using predefined criteria. This systematic review will follow the Preferred Reporting Items for Systematic Review and Meta-Analyses (PRISMA) statement. Discussion This systematic review will provide an extensive review of the measurement properties and clinical utility of self-report physical activity assessment tools used in individuals with stroke, which will benefit clinicians and researchers. Trial registration number PROSPERO CRD42016037146. PMID:28193848

  11. Challenge of engaging all students via self-paced interactive electronic learning tutorials for introductory physics

    NASA Astrophysics Data System (ADS)

    DeVore, Seth; Marshman, Emily; Singh, Chandralekha

    2017-06-01

    As research-based, self-paced electronic learning tools become increasingly available, a critical issue educators encounter is implementing strategies to ensure that all students engage with them as intended. Here, we first discuss the effectiveness of electronic learning tutorials as self-paced learning tools in large enrollment brick and mortar introductory physics courses and then propose a framework for helping students engage effectively with the learning tools. The tutorials were developed via research in physics education and were found to be effective for a diverse group of introductory physics students in one-on-one implementation. Instructors encouraged the use of these tools in a self-paced learning environment by telling students that they would be helpful for solving the assigned homework problems and that the underlying physics principles in the tutorial problems would be similar to those in the in-class quizzes (which we call paired problems). We find that many students in the courses in which these interactive electronic learning tutorials were assigned as a self-study tool performed poorly on the paired problems. In contrast, a majority of student volunteers in one-on-one implementation greatly benefited from the tutorials and performed well on the paired problems. The significantly lower overall performance on paired problems administered as an in-class quiz compared to the performance of student volunteers who used the research-based tutorials in one-on-one implementation suggests that many students enrolled in introductory physics courses did not effectively engage with the tutorials outside of class and may have only used them superficially. The findings suggest that many students in need of out-of-class remediation via self-paced learning tools may have difficulty motivating themselves and may lack the self-regulation and time-management skills to engage effectively with tools specially designed to help them learn at their own pace. We conclude by proposing a theoretical framework to help students with diverse prior preparations engage effectively with self-paced learning tools.

  12. It's LiFe! Mobile and Web-Based Monitoring and Feedback Tool Embedded in Primary Care Increases Physical Activity: A Cluster Randomized Controlled Trial.

    PubMed

    van der Weegen, Sanne; Verwey, Renée; Spreeuwenberg, Marieke; Tange, Huibert; van der Weijden, Trudy; de Witte, Luc

    2015-07-24

    Physical inactivity is a major public health problem. The It's LiFe! monitoring and feedback tool embedded in the Self-Management Support Program (SSP) is an attempt to stimulate physical activity in people with chronic obstructive pulmonary disease or type 2 diabetes treated in primary care. Our aim was to evaluate whether the SSP combined with the use of the monitoring and feedback tool leads to more physical activity compared to usual care, and to evaluate the additional effect of using this tool on top of the SSP. This was a three-armed cluster randomised controlled trial. Twenty-four family practices were randomly assigned to one of three groups in which participants received the tool + SSP (group 1), the SSP (group 2), or care as usual (group 3). The primary outcome measure was minutes of physical activity per day. The secondary outcomes were general and exercise self-efficacy and quality of life. Outcomes were measured at baseline, after the intervention (4-6 months), and 3 months thereafter. The group that received the entire intervention (tool + SSP) showed more physical activity directly after the intervention than Group 3 (mean difference 11.73, 95% CI 6.21-17.25; P<.001), and Group 2 (mean difference 7.86, 95% CI 2.18-13.54; P=.003). Three months after the intervention, this effect was still present and significant (compared to Group 3: mean difference 10.59, 95% CI 4.94-16.25; P<.001; compared to Group 2: mean difference 9.41, 95% CI 3.70-15.11; P<.001). There was no significant difference in effect between Groups 2 and 3 at either time point. There was no interaction effect for disease type. The combination of counseling with the tool proved an effective way to stimulate physical activity; counseling without the tool was not effective. Future research on cost-effectiveness and on application under more tailored conditions and in other target groups is recommended. ClinicalTrials.gov: NCT01867970, https://clinicaltrials.gov/ct2/show/NCT01867970 (archived by WebCite at http://www.webcitation.org/6a2qR5BSr).

  13. On use of ZPR research reactors and associated instrumentation and measurement methods for reactor physics studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chauvin, J.P.; Blaise, P.; Lyoussi, A.

    2015-07-01

    The French Alternative Energies and Atomic Energy Commission (CEA) is strongly involved in research and development programs concerning the use of nuclear energy as a clean and reliable source of energy, and is consequently working on present and future generations of reactors on various topics such as ageing plant management, optimization of the plutonium stockpile, waste management and the exploration of innovative systems. Core physics studies are an essential part of this comprehensive R and D effort. In particular, the Zero Power Reactors (ZPR) of CEA - EOLE, MINERVE and MASURCA - play an important role in the validation of neutron (as well as photon) physics calculation tools (codes and nuclear data). The experimental programs defined in the CEA's ZPR facilities aim at improving the calculation routes by reducing the uncertainties of the experimental databases. They also provide accurate data on innovative systems in terms of new materials (moderating and decoupling materials) and new concepts (ADS, ABWR, new MTRs (e.g. JHR), GEN-IV) involving new fuels, absorbers and coolant materials. Conducting such experimental R and D programs relies on determining and measuring the main parameters of the phenomena of interest in order to qualify calculation tools and nuclear data libraries. Determining these parameters in turn relies on numerous different experimental techniques using specific and appropriate instrumentation and detection tools. The main ZPR experimental programs at CEA, their objectives and challenges will be presented and discussed. Future developments and perspectives regarding the ZPR reactors and associated programs will also be presented. (authors)

  14. SU-E-T-254: Development of a HDR-BT QA Tool for Verification of Source Position with Oncentra Applicator Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumazaki, Y; Miyaura, K; Hirai, R

    2015-06-15

    Purpose: To develop a High Dose Rate Brachytherapy (HDR-BT) quality assurance (QA) tool for verification of source position with Oncentra applicator modeling, and to report the results of radiation source positions obtained with this tool. Methods: We developed a HDR-BT QA phantom and automated analysis software for verification of source position with Oncentra applicator modeling for the Fletcher applicator used in the MicroSelectron HDR system. This tool is intended for end-to-end tests that mimic the clinical 3D image-guided brachytherapy (3D-IGBT) workflow. The phantom is a 30x30x3 cm cuboid phantom with radiopaque markers, which are inserted into the phantom to evaluate applicator tips and reference source positions; the positions are laterally shifted 10 mm from the applicator axis. The markers are lead-based and scatter radiation to expose the films. Gafchromic RTQA2 films are placed on the applicators. The phantom includes spaces to embed the applicators. The source position is determined as the distance between the exposed source position and the center position of two pairs of the first radiopaque markers. We generated a 3D-IGBT plan with applicator modeling. The first source position was 6 mm from the applicator tips, and the second source position was 10 mm from the first source position. Results: All source positions were consistent with the exposed positions within 1 mm for all Fletcher applicators using the in-house software. Moreover, the distance between source positions was in good agreement with the reference distance. The applicator offset, determined as the distance from the applicator tips to the first source position in the treatment planning system, was accurate. Conclusion: The source position accuracy of the applicator modeling used in 3D-IGBT was acceptable. This phantom and software will be useful as a HDR-BT QA tool for verification of source position with Oncentra applicator modeling.
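
    The pass/fail logic described above (6 mm tip offset, 10 mm dwell spacing, 1 mm tolerance) is straightforward to express; the following schematic check uses the numbers from the abstract with invented measured values, and is not the authors' in-house software:

```python
# Expected source positions along the applicator, from the abstract:
# first dwell 6 mm from the tip, second dwell 10 mm further on.
expected_mm = [6.0, 16.0]
tolerance_mm = 1.0

# Hypothetical measured positions extracted from the exposed film.
measured_mm = [6.4, 15.7]

for exp, meas in zip(expected_mm, measured_mm):
    status = "PASS" if abs(meas - exp) <= tolerance_mm else "FAIL"
    print(f"expected {exp:5.1f} mm, measured {meas:5.1f} mm -> {status}")
```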

  15. Impact of physical pre-treatment of source-sorted organic fraction of municipal solid waste on greenhouse-gas emissions and the economy in a Swedish anaerobic digestion system.

    PubMed

    Carlsson, My; Holmström, David; Bohn, Irene; Bisaillon, Mattias; Morgan-Sagastume, Fernando; Lagerkvist, Anders

    2015-04-01

    Several methods for physical pre-treatment of the source-sorted organic fraction of municipal solid waste (SSOFMSW) before anaerobic digestion (AD) are available, with the common feature that they generate a homogeneous slurry for AD and a dry refuse fraction for incineration. The selection of efficient methods relies on improved understanding of how the pre-treatment affects the separation and the subsequent AD of the slurry. The aim of this study was to evaluate the impact of the performance of physical pre-treatment of SSOFMSW on greenhouse-gas (GHG) emissions and on the economy of an AD system comprising a biogas plant with supplementary systems for heat and power production in Sweden. Based on the performance of selected Swedish facilities, as well as chemical analyses and BMP tests of slurry and refuse, the computer-based evaluation tool ORWARE was improved to accurately describe mass flows through the physical pre-treatment and anaerobic degradation. The environmental and economic performance of the evaluated system was influenced by the TS concentration in the slurry, as well as by the distribution of incoming solids between slurry and refuse. Efforts to improve the efficiency of these systems should primarily be directed towards minimising water addition in the pre-treatment, provided that the slurry can still be efficiently digested. Second, the amount of refuse should be minimised while keeping a good slurry quality. Electricity use/generation has a high impact on GHG emissions, and the results of the study are sensitive to assumptions about marginal electricity and about electricity use in the pre-treatment. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Coastal On-line Assessment and Synthesis Tool 2.0

    NASA Technical Reports Server (NTRS)

    Brown, Richard; Navard, Andrew; Nguyen, Beth

    2011-01-01

    COAST (Coastal On-line Assessment and Synthesis Tool) is a 3D, open-source Earth data browser developed by leveraging and enhancing previous NASA open-source tools. These tools use satellite imagery and elevation data in a way that allows any user to zoom from orbit view down into any place on Earth, and enables the user to experience Earth terrain in a visually rich 3D view. The benefits of building on an open-source geo-browser are that it is free, extensible, and backed by a worldwide developer community available to provide further development and improvement. What makes COAST unique is that it simplifies the process of locating and accessing data sources, and allows a user to combine them into multi-layered and/or multi-temporal views for visual analysis of possible data interrelationships and co-effectors in coastal environment phenomenology. COAST thus provides users with new visual analytic capabilities. COAST has been upgraded to maximize use of open-source data access, viewing, and data manipulation software tools. The COAST 2.0 toolset has been developed to increase access to a larger realm of the data formats most commonly used by the coastal science community. New and enhanced functionalities that upgrade COAST to COAST 2.0 include the development of the Temporal Visualization Tool (TVT) plug-in, the Recursive Online Remote Data-Data Mapper (RECORD-DM) utility, the Import Data Tool (IDT), and the Add Points Tool (APT). With these improvements, users can integrate their own data with other data sources, visualize the resulting layers of different data types (such as spatial and spectral) for simultaneous analysis, and visualize temporal changes in areas of interest.

  17. Validity of a self-report survey tool measuring the nutrition and physical activity environment of primary schools.

    PubMed

    Nathan, Nicole; Wolfenden, Luke; Morgan, Philip J; Bell, Andrew C; Barker, Daniel; Wiggers, John

    2013-06-13

    Valid tools measuring characteristics of the school environment associated with the physical activity and dietary behaviours of children are needed to accurately evaluate the impact of initiatives to improve school environments. The aim of this study was to assess the validity of Principal self-report of primary school healthy eating and physical activity environments. Primary school Principals (n = 42) in New South Wales, Australia were invited to complete a telephone survey of the school environment, the School Environment Assessment Tool (SEAT). Equivalent observational data were collected by pre-service teachers located within the school. The SEAT involved 65 items that assessed food availability via canteens, vending machines and fundraisers, and the presence of physical activity facilities, equipment and organised physical activities. Kappa statistics were used to assess agreement between the two measures. Almost 70% of the survey demonstrated moderate to almost perfect agreement. Substantial agreement was found for 10 of 13 items assessing foods sold for fundraising, 3 of 6 items assessing physical activity facilities of the school, and both items assessing organised physical activities at recess and lunch and school sport. Limited agreement was found for items assessing foods sold through canteens and access to small screen recreation. The SEAT provides researchers and policy makers with a valid tool for assessing aspects of the school food and physical activity environment.
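
    For readers unfamiliar with the agreement statistic used above, a short sketch of Cohen's kappa on paired binary item responses; the responses below are invented for illustration.

    ```python
    # Hedged sketch: inter-method agreement via Cohen's kappa, as used in the
    # SEAT validation above. Item responses are invented placeholders.
    from sklearn.metrics import cohen_kappa_score

    principal_report = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]  # 1 = item present per Principal survey
    observation      = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]  # 1 = item present per observer

    kappa = cohen_kappa_score(principal_report, observation)
    print(f"kappa = {kappa:.2f}")  # ~0.58, 'moderate' on the Landis & Koch scale
    ```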

  18. Slow Speed--Fast Motion: Time-Lapse Recordings in Physics Education

    ERIC Educational Resources Information Center

    Vollmer, Michael; Möllmann, Klaus-Peter

    2018-01-01

    Video analysis with a 30 Hz frame rate is the standard tool in physics education. The development of affordable high-speed cameras has extended the capabilities of the tool to much smaller time scales, down to the 1 ms range, using frame rates of typically up to 1000 frames s⁻¹, allowing us to study transient physics phenomena happening…

  19. HEPS Tool for Schools: A Guide for School Policy Development on Healthy Eating and Physical Activity

    ERIC Educational Resources Information Center

    Simovska, Venka; Dadaczynski, Kevin; Viig, Nina Grieg; Bowker, Sue; Woynarowska, Barbara; de Ruiter, Silvia; Buijs, Goof

    2010-01-01

    The HEPS Tool for Schools provides ideas, guidelines and suggested techniques to help schools in their development of school policy on healthy eating and physical activity. There is growing evidence that a comprehensive whole school policy on healthy eating and physical activity can lead to better academic outcomes of pupils as well as promoting…

  20. Meta II: Multi-Model Language Suite for Cyber Physical Systems

    DTIC Science & Technology

    2013-03-01

    AVM META) projects have developed tools for designing cyber physical (or Mechatronic) systems. These systems are increasingly complex, take much... projects have developed tools for designing cyber physical (CPS) (or Mechatronic) systems. Exemplified by modern amphibious and ground military... and parametric interface of Simulink models and defines associations with CyPhy components and component interfaces. 2. Embedded Systems Modeling

  1. National policy on physical activity: the development of a policy audit tool.

    PubMed

    Bull, Fiona C; Milton, Karen; Kahlmeier, Sonja

    2014-02-01

    Physical inactivity is a leading risk factor for noncommunicable disease worldwide. Increasing physical activity requires large-scale actions and relevant, supportive national policy across multiple sectors. The policy audit tool (PAT) was developed to provide a standardized instrument to assess national policy approaches to physical activity. A draft tool, based on earlier work, was developed and pilot-tested in 7 countries. After several rounds of revisions, the final PAT comprises 27 items and collects information on 1) government structure, 2) development and content of identified key policies across multiple sectors, 3) the experience of policy implementation at both the national and local level, and 4) a summary of the PAT completion process. PAT provides a standardized instrument for assessing progress of national policy on physical activity. Engaging a diverse international group of countries in the development helped ensure that PAT has applicability across a wide range of countries and contexts. Experience from the development of the PAT suggests that undertaking an audit of health-enhancing physical activity (HEPA) policy can stimulate greater awareness of current policy opportunities and gaps, promote critical debate across sectors, and provide a catalyst for collaboration on policy-level actions. The final tool is available online.

  2. Insightful problem solving and creative tool modification by captive nontool-using rooks

    PubMed Central

    Bird, Christopher D.; Emery, Nathan J.

    2009-01-01

    The ability to use tools has been suggested to indicate advanced physical cognition in animals. Here we show that rooks, members of the corvid family that do not appear to use tools in the wild, are capable of insightful problem solving related to sophisticated tool use, including spontaneously modifying and using a variety of tools, shaping hooks out of wire, and using a series of tools in a sequence to gain a reward. It is remarkable that a species that does not use tools in the wild appears to possess an understanding of tools rivaling that of habitual tool users such as New Caledonian crows and chimpanzees. Our findings suggest that the ability to represent tools may be a domain-general cognitive capacity rather than an adaptive specialization, and they call into question the relationship between physical intelligence and wild tool use. PMID:19478068

  3. Modeling Vortex Generators in a Navier-Stokes Code

    NASA Technical Reports Server (NTRS)

    Dudek, Julianne C.

    2011-01-01

    A source-term model that simulates the effects of vortex generators was implemented into the Wind-US Navier-Stokes code. The source term added to the Navier-Stokes equations simulates the lift force that would result from a vane-type vortex generator in the flowfield. The implementation is user-friendly, requiring the user to specify only three quantities for each desired vortex generator: the range of grid points over which the force is to be applied and the planform area and angle of incidence of the physical vane. The model behavior was evaluated for subsonic flow in a rectangular duct with a single vane vortex generator, subsonic flow in an S-duct with 22 corotating vortex generators, and supersonic flow in a rectangular duct with a counter-rotating vortex-generator pair. The model was also used to successfully simulate microramps in supersonic flow by treating each microramp as a pair of vanes with opposite angles of incidence. The validation results indicate that the source-term vortex-generator model provides a useful tool for screening vortex-generator configurations and gives comparable results to solutions computed using gridded vanes.
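
    The abstract does not give the exact source-term formula, but a hedged sketch of the general idea (a vane lift force computed from planform area and angle of incidence, then distributed over a block of grid cells) might look like the following; the thin-airfoil lift slope and all numbers are textbook assumptions, not the Wind-US model.

    ```python
    # Hedged sketch of a vane-type vortex-generator lift source term. The
    # C_L ~ 2*pi*alpha closure is a thin-airfoil assumption for illustration,
    # not the paper's formula.
    import numpy as np

    def vg_lift_force(rho, u_local, planform_area, incidence_deg):
        """Vane lift: L = 0.5*rho*U^2*S*C_L, with C_L ~ 2*pi*alpha."""
        alpha = np.deg2rad(incidence_deg)
        c_l = 2.0 * np.pi * alpha            # thin-airfoil approximation
        return 0.5 * rho * u_local**2 * planform_area * c_l

    # Distribute the force evenly over the user-specified range of grid cells.
    force = vg_lift_force(rho=1.2, u_local=60.0, planform_area=2.5e-4, incidence_deg=16.0)
    cells = 12                                # number of cells covering the vane
    per_cell_source = force / cells           # momentum source added to each cell
    print(f"total lift {force:.3f} N, per-cell source {per_cell_source:.4f} N")
    ```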

  4. The Most Compact Bright Radio-loud AGNs. II. VLBA Observations of 10 Sources at 43 and 86 GHz

    NASA Astrophysics Data System (ADS)

    Cheng, X.-P.; An, T.; Hong, X.-Y.; Yang, J.; Mohan, P.; Kellermann, K. I.; Lister, M. L.; Frey, S.; Zhao, W.; Zhang, Z.-L.; Wu, X.-C.; Li, X.-F.; Zhang, Y.-K.

    2018-01-01

    Radio-loud active galactic nuclei (AGNs), hosting powerful relativistic jet outflows, provide an excellent laboratory for studying jet physics. Very long baseline interferometry (VLBI) enables high-resolution imaging on milli-arcsecond (mas) and sub-mas scales, making it a powerful tool to explore the inner jet structure and shedding light on the formation, acceleration, and collimation of AGN jets. In this paper, we present Very Long Baseline Array observations of 10 radio-loud AGNs at 43 and 86 GHz that were selected from the Planck catalog of compact sources and are among the brightest in published VLBI images at and below 15 GHz. The image noise levels in our observations are typically 0.3 and 1.5 mJy beam⁻¹ at 43 and 86 GHz, respectively. Compared with VLBI data observed at lower frequencies from the literature, our observations with higher resolutions (up to 0.07 mas at 86 GHz and 0.18 mas at 43 GHz) and at higher frequencies detected new jet components at sub-parsec scales, offering valuable data for studies of the physical properties of the innermost jets. These include the compactness factor of the radio structure (the ratio of core flux density to total flux density) and the core brightness temperature (T_b). In all these sources, the compact core accounts for a significant fraction (>60%) of the total flux density. Their correlated flux density on the longest baselines is higher than 0.16 Jy. The compactness of these sources makes them good phase calibrators for millimeter-wavelength ground-based and space VLBI.
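
    A short sketch of the two derived quantities mentioned above. The brightness-temperature expression is the standard VLBI formula for a Gaussian component; the flux densities, component sizes and redshift are invented.

    ```python
    # Hedged sketch: compactness factor and core brightness temperature.
    # T_b ~ 1.22e12 * S * (1+z) / (nu^2 * theta_maj * theta_min) K,
    # with S in Jy, nu in GHz, angular sizes in mas (standard VLBI form).
    def compactness(core_flux_jy, total_flux_jy):
        return core_flux_jy / total_flux_jy

    def brightness_temperature(s_jy, nu_ghz, theta_maj_mas, theta_min_mas, z=0.0):
        return 1.22e12 * s_jy * (1.0 + z) / (nu_ghz**2 * theta_maj_mas * theta_min_mas)

    print(compactness(0.9, 1.2))                          # -> 0.75, i.e. >60% core-dominated
    print(f"{brightness_temperature(0.8, 43.0, 0.18, 0.07, z=1.0):.2e} K")
    ```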

  5. The evolution of the storm-time ring current in response to different characteristics of the plasma source

    NASA Astrophysics Data System (ADS)

    Lemon, C.; Chen, M.; O'Brien, T. P.; Toffoletto, F.; Sazykin, S.; Wolf, R.; Kumar, V.

    2006-12-01

    We present simulation results from the Rice Convection Model-Equilibrium (RCM-E) that test and compare the effect on the storm-time ring current of varying the plasma sheet source population characteristics at 6.6 Re during magnetic storms. Previous work has shown that direct injection of ionospheric plasma into the ring current is not a significant source of ring current plasma, suggesting that the plasma sheet is the only source. However, storm-time processes in the plasma sheet and inner magnetosphere are very complex, due in large part to the feedback interactions between the plasma distribution, the magnetic field, and the electric field. We are particularly interested in understanding the role of the plasma sheet entropy parameter (PV^{5/3}, where V = ∫ ds/B) in determining the strength and distribution of the ring current in both the main and recovery phases of a storm. Plasma temperature and density can be measured from geosynchronous orbiting satellites, and these are often used to provide boundary conditions for ring current simulations. However, magnetic field measurements in this region are less commonly available, and there is a relatively poor understanding of the interplay between the plasma and the magnetic field during magnetic storms. The entropy parameter is a quantity that incorporates both the plasma and the magnetic field, and understanding its role in ring current injection and recovery is essential to describing the processes that occur during magnetic storms. The RCM-E includes the physics of feedback between the plasma and both the electric and magnetic fields, and is therefore a valuable tool for understanding these complex storm-time processes. By contrasting the effects of different plasma boundary conditions at geosynchronous orbit, we shed light on the physical processes involved in ring current injection and recovery.
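
    A minimal numerical sketch of the flux-tube entropy parameter defined above, PV^{5/3} with V = ∫ ds/B; the field-line samples and pressure are synthetic placeholders, not RCM-E output.

    ```python
    # Hedged sketch: flux-tube entropy parameter PV^(5/3), with the flux-tube
    # volume per unit flux V = integral of ds/B along a field line.
    import numpy as np

    s = np.linspace(0.0, 1.0e8, 200)            # arc length along field line [m]
    B = 40e-9 * (1.0 + 3.0 * (s / s[-1])**2)    # synthetic field strength [T]
    P = 0.5e-9                                  # plasma pressure [Pa]

    V = np.trapz(1.0 / B, s)                    # flux-tube volume per unit flux [m/T]
    entropy = P * V**(5.0 / 3.0)
    print(f"V = {V:.3e} m/T, PV^(5/3) = {entropy:.3e}")
    ```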

  6. A high-energy, high-flux source of gamma-rays from all-optical non-linear Thomson scattering

    NASA Astrophysics Data System (ADS)

    Corvan, D. J.; Zepf, M.; Sarri, G.

    2016-09-01

    γ-Ray sources are among the most fundamental experimental tools currently available to modern physics. Beyond the obvious benefits to fundamental research, an ultra-bright source of γ-rays could form the foundation of scanning shipping containers for special nuclear materials and provide the basis for new types of cancer therapy. However, for these applications to prove viable, γ-ray sources must become compact and relatively cheap to manufacture. In recent years, advances in laser technology have formed the cornerstone of optical sources of high-energy electrons, which have already been used to generate synchrotron radiation on a compact scale. Exploiting the scattering induced by a second laser, one can further enhance the energy and number of photons produced, provided the problems of synchronisation and compact γ-ray detection are solved. Here, we report on the work that has been done in developing an all-optical and hence compact non-linear Thomson scattering source, including new methods of synchronisation and compact γ-ray detection. We present evidence of the generation of multi-MeV (maximum 16-18 MeV) and ultra-high brilliance (exceeding 10²⁰ photons s⁻¹ mm⁻² mrad⁻² 0.1% BW at 15 MeV) γ-ray beams. These characteristics are appealing for the practical applications mentioned above.
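
    As a rough orientation for the photon energies quoted above, a sketch of the leading-order backscattered photon energy in the linear, low-recoil limit (E_γ ≈ 4γ²E_L for a head-on collision); the electron beam energy and laser photon energy are assumptions, and the paper's full non-linear treatment differs from this textbook scaling.

    ```python
    # Hedged sketch: leading-order Thomson/Compton backscatter energy,
    # E_gamma ~ 4 * gamma^2 * E_laser (head-on, linear, low-recoil limit).
    def backscattered_energy_mev(electron_energy_mev, laser_photon_ev=1.55):
        gamma = electron_energy_mev / 0.511            # Lorentz factor
        return 4.0 * gamma**2 * laser_photon_ev / 1e6  # eV -> MeV

    # A ~550 MeV wakefield electron beam and an 800 nm (1.55 eV) scattering pulse
    print(f"{backscattered_energy_mev(550.0):.1f} MeV")  # ~7 MeV, same order as the multi-MeV beams reported
    ```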

  7. Probing the Active Galactic Nuclei using optical spectroscopy

    NASA Astrophysics Data System (ADS)

    Vivek, M.

    Variability studies offer one of the best tools for understanding the physical conditions in regions close to the central engine of an AGN. We probed various properties of AGN through time-variability studies of spectral lines at optical wavelengths using the 2 m telescope at the IUCAA Girawali Observatory. The absorption-line variability studies are mainly concentrated on understanding the nature of outflows in quasars. Quasar outflows have a huge impact on the evolution of central supermassive black holes, their host galaxies and the surrounding intergalactic medium. Studying the variability in these Broad Absorption Lines (BALs) can help us understand the structure, evolution, and basic physical properties of these outflows. We conducted a repeated monitoring program of 27 LoBALs (low-ionization BALs) at z ≈ 0.3-2.1, covering timescales from 3.22 to 7.69 years in the quasar rest frame. We see a variety of phenomena, including some BALs that either appeared or disappeared completely and some BALs that do not vary over the observation period. In one case, the excited fine-structure lines changed dramatically. One source shows signatures of radiative acceleration. Here, we present the results from this program. Emission-line studies are concentrated on understanding the peculiar characteristics of the dual-AGN source SDSS J092712.64+294344.0.

  8. Status and Plans for the TRANSP Interpretive and Predictive Simulation Code

    NASA Astrophysics Data System (ADS)

    Kaye, Stanley; Andre, Robert; Gorelenkova, Marina; Yuan, Xingqui; Hawryluk, Richard; Jardin, Steven; Poli, Francesca

    2015-11-01

    TRANSP is an integrated interpretive and predictive transport analysis tool that incorporates state-of-the-art heating/current-drive sources and transport models. The treatments and transport solvers are becoming increasingly sophisticated and comprehensive. For instance, the ISOLVER component provides a free-boundary equilibrium solution, while the PT_SOLVER transport solver is especially suited for stiff transport models such as TGLF. TRANSP also incorporates such source models as NUBEAM for neutral beam injection, and GENRAY, TORAY, TORBEAM, TORIC and CQL3D for ICRH, LHCD, ECH and HHFW. The implementation of selected components makes efficient use of MPI for speed-up of code calculations. TRANSP has a wide international user base, and it is run on the FusionGrid to allow for timely support and quick turnaround by the PPPL Computational Plasma Physics Group. It is being used as a basis for both analysis and development of control algorithms and discharge operational scenarios, including simulation of ITER plasmas. This poster will describe present uses of the code worldwide, as well as plans for upgrading the physics modules and code framework. Progress on implementing TRANSP as a component in the ITER IMAS will also be described. This research was supported by the U.S. Department of Energy under contract DE-AC02-09CH11466.

  9. Tools don't-and won't-make the man: A cognitive look at the future.

    PubMed

    Osiurak, François; Navarro, Jordan; Reynaud, Emanuelle; Thomas, Gauthier

    2018-05-01

    The question of whether tools erase cognitive and physical interindividual differences has been surprisingly overlooked in the literature. Yet if technology is profusely available in a near or far future, will we be equal in our capacity to use it? We sought to address this unexplored, fundamental issue, asking 200 participants to perform 3 physical (e.g., fine manipulation) and 3 cognitive tasks (e.g., calculation) in both non-tool-use and tool-use conditions. Here we show that tools do not erase but rather extend our intrinsic physical and cognitive skills. Moreover, this phenomenon of extension is task specific because we found no evidence for superusers, benefitting from the use of a tool irrespective of the task concerned. These results challenge the possibility that technical solutions could always be found to make people equal. Rather, technical innovation might be systematically limited by the user's initial degree of knowledge or skills for a given task. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  10. Physical intelligence does matter to cumulative technological culture.

    PubMed

    Osiurak, François; De Oliveira, Emmanuel; Navarro, Jordan; Lesourd, Mathieu; Claidière, Nicolas; Reynaud, Emanuelle

    2016-08-01

    Tool-based culture is not unique to humans, but cumulative technological culture is. The social intelligence hypothesis suggests that this phenomenon is fundamentally based on uniquely human sociocognitive skills (e.g., shared intentionality). An alternative hypothesis is that cumulative technological culture also crucially depends on physical intelligence, which may reflect fluid and crystallized aspects of intelligence and enables people to understand and improve the tools made by predecessors. By using a tool-making-based microsociety paradigm, we demonstrate that physical intelligence is a stronger predictor of cumulative technological performance than social intelligence. Moreover, learners' physical intelligence is critical not only in observational learning but also when learners interact verbally with teachers. Finally, we show that cumulative performance is only slightly influenced by teachers' physical and social intelligence. In sum, human technological culture needs "great engineers" to evolve regardless of the proportion of "great pedagogues." Social intelligence might play a more limited role than commonly assumed, perhaps in tool-use/making situations in which teachers and learners have to share symbolic representations. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  11. Perspective: Reaches of chemical physics in biology.

    PubMed

    Gruebele, Martin; Thirumalai, D

    2013-09-28

    Chemical physics as a discipline contributes many experimental tools, algorithms, and fundamental theoretical models that can be applied to biological problems. This is especially true now as the molecular level and the systems level descriptions begin to connect, and multi-scale approaches are being developed to solve cutting edge problems in biology. In some cases, the concepts and tools got their start in non-biological fields, and migrated over, such as the idea of glassy landscapes, fluorescence spectroscopy, or master equation approaches. In other cases, the tools were specifically developed with biological physics applications in mind, such as modeling of single molecule trajectories or super-resolution laser techniques. In this introduction to the special topic section on chemical physics of biological systems, we consider a wide range of contributions, all the way from the molecular level, to molecular assemblies, chemical physics of the cell, and finally systems-level approaches, based on the contributions to this special issue. Chemical physicists can look forward to an exciting future where computational tools, analytical models, and new instrumentation will push the boundaries of biological inquiry.

  12. Perspective: Reaches of chemical physics in biology

    PubMed Central

    Gruebele, Martin; Thirumalai, D.

    2013-01-01

    Chemical physics as a discipline contributes many experimental tools, algorithms, and fundamental theoretical models that can be applied to biological problems. This is especially true now as the molecular level and the systems level descriptions begin to connect, and multi-scale approaches are being developed to solve cutting edge problems in biology. In some cases, the concepts and tools got their start in non-biological fields, and migrated over, such as the idea of glassy landscapes, fluorescence spectroscopy, or master equation approaches. In other cases, the tools were specifically developed with biological physics applications in mind, such as modeling of single molecule trajectories or super-resolution laser techniques. In this introduction to the special topic section on chemical physics of biological systems, we consider a wide range of contributions, all the way from the molecular level, to molecular assemblies, chemical physics of the cell, and finally systems-level approaches, based on the contributions to this special issue. Chemical physicists can look forward to an exciting future where computational tools, analytical models, and new instrumentation will push the boundaries of biological inquiry. PMID:24089712

  13. Applying open source data visualization tools to standard based medical data.

    PubMed

    Kopanitsa, Georgy; Taranik, Maxim

    2014-01-01

    Presentation of medical data in personal health records (PHRs) requires flexible, platform-independent tools to ensure easy access to the information. The different backgrounds of patients, especially elderly people, require simple graphical presentation of the data. Data in PHRs can be collected from heterogeneous sources. The use of standards-based medical data allows the development of generic visualization methods. Focusing on the deployment of open source tools, in this paper we applied JavaScript libraries to create data presentations for standards-based medical data.

  14. Evaluation of an open source tool for indexing and searching enterprise radiology and pathology reports

    NASA Astrophysics Data System (ADS)

    Kim, Woojin; Boonn, William

    2010-03-01

    Data mining of existing radiology and pathology reports within an enterprise health system can be used for clinical decision support, research, education, as well as operational analyses. In our health system, the database of radiology and pathology reports exceeds 13 million entries combined. We are building a web-based tool to allow search and data analysis of these combined databases using freely available and open source tools. This presentation will compare performance of an open source full-text indexing tool to MySQL's full-text indexing and searching and describe implementation procedures to incorporate these capabilities into a radiology-pathology search engine.
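
    As an illustration of the kind of indexing being compared above, a hedged stand-in using SQLite's FTS5 full-text engine from Python (the study itself compared MySQL full-text search with a separate open-source indexer, and assumes your sqlite3 build includes FTS5; accession numbers and report texts are invented).

    ```python
    # Hedged sketch: full-text indexing and searching of report text with
    # SQLite FTS5 as a lightweight stand-in for the engines compared above.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE VIRTUAL TABLE reports USING fts5(accession, body)")
    conn.executemany(
        "INSERT INTO reports VALUES (?, ?)",
        [("RAD-001", "No acute intracranial hemorrhage."),
         ("RAD-002", "Findings consistent with pulmonary embolism."),
         ("PATH-001", "Invasive ductal carcinoma, grade 2.")],
    )
    for row in conn.execute(
            "SELECT accession FROM reports WHERE reports MATCH ?", ("carcinoma",)):
        print(row[0])  # -> PATH-001
    ```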

  15. School environment assessment tools to address behavioural risk factors of non-communicable diseases: A scoping review.

    PubMed

    Saluja, Kiran; Rawal, Tina; Bassi, Shalini; Bhaumik, Soumyadeep; Singh, Ankur; Park, Min Hae; Kinra, Sanjay; Arora, Monika

    2018-06-01

    We aimed to identify, describe and analyse school environment assessment (SEA) tools that address behavioural risk factors (unhealthy diet, physical inactivity, tobacco and alcohol consumption) for non-communicable diseases (NCD). We searched MEDLINE and Web of Science, hand-searched reference lists and contacted experts. Basic characteristics, measures assessed and measurement properties (validity, reliability, usability) of the identified tools were extracted. We narratively synthesized the data and used content analysis to develop a list of measures used in the SEA tools. Twenty-four SEA tools were identified, mostly from developed countries. Of these, 15 were questionnaire based, 8 were checklist or observation based, and one combined a checklist/observation approach with a telephone questionnaire. Only 1 SEA tool addressed all four NCD risk factors, 2 assessed three risk factors (diet/nutrition, physical activity, tobacco), 10 assessed two risk factors (diet/nutrition and physical activity) and 11 assessed only one risk factor. Several measures were used in the tools to assess the four NCD risk factors, but tobacco and alcohol were sparingly included. Measurement properties were reported for 14 tools. The review provides a comprehensive list of measures used in SEA tools, which could be a valuable resource to guide future development of such tools. A valid and reliable SEA tool that can simultaneously evaluate all four NCD risk factors and has been tested in different settings with varying resource availability is needed.

  16. Evaluating the integration of cultural competence skills into health and physical assessment tools: a survey of Canadian schools of nursing.

    PubMed

    Chircop, Andrea; Edgecombe, Nancy; Hayward, Kathryn; Ducey-Gilbert, Cherie; Sheppard-Lemoine, Debbie

    2013-04-01

    Currently used audiovisual (AV) teaching tools to teach health and physical assessment reflect a Eurocentric bias using the biomedical model. The purpose of our study was to (a) identify commonly used AV teaching tools of Canadian schools of nursing and (b) evaluate the identified tools. A two-part descriptive quantitative method design was used. First, we surveyed schools of nursing across Canada. Second, the identified AV teaching tools were evaluated for content and modeling of cultural competence. The majority of the schools (67%) used publisher-produced videos associated with a physical assessment textbook. Major findings included minimal demonstration of negotiation with a client around cultural aspects of the interview including the need for an interpreter, modesty, and inclusion of support persons. Identification of culturally specific examples given during the videos was superficial and did not provide students with a comprehensive understanding of necessary culturally competent skills.

  17. Moment-Tensor Spectra of Source Physics Experiments (SPE) Explosions in Granite

    NASA Astrophysics Data System (ADS)

    Yang, X.; Cleveland, M.

    2016-12-01

    We perform frequency-domain moment tensor inversions of Source Physics Experiments (SPE) explosions conducted in granite during Phase I of the experiment. We test the sensitivity of the source moment-tensor spectra to factors such as the velocity model, the selected dataset, and the smoothing and damping parameters used in the inversion, in order to constrain the error bounds of the inverted source spectra. Using source moments and corner frequencies measured from the inverted source spectra of these explosions, we develop a new explosion P-wave source model that describes the observed source spectra of these small, over-buried chemical explosions detonated in granite better than classical explosion source models derived mainly from nuclear-explosion data. In addition to source moment and corner frequency, we analyze other features in the source spectra to investigate their physical causes.
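
    A sketch of the kind of spectral fitting implied above: measuring moment and corner frequency by fitting a classical Brune-type spectrum, Ω(f) = M0/(1 + (f/fc)²), to an inverted source spectrum. The "observed" spectrum below is synthetic, and the paper's new P-wave model modifies this classical shape.

    ```python
    # Hedged sketch: fit a classical omega-squared (Brune-type) source spectrum
    # to recover seismic moment M0 and corner frequency fc. Data are synthetic.
    import numpy as np
    from scipy.optimize import curve_fit

    def brune(f, m0, fc):
        return m0 / (1.0 + (f / fc) ** 2)

    rng = np.random.default_rng(1)
    f = np.logspace(-1, 2, 60)                               # frequency [Hz]
    observed = brune(f, 1.0e13, 8.0) * rng.lognormal(0.0, 0.1, f.size)

    (m0_fit, fc_fit), _ = curve_fit(brune, f, observed, p0=(1e12, 1.0))
    print(f"M0 = {m0_fit:.2e} N*m, fc = {fc_fit:.2f} Hz")
    ```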

  18. Tool use in left-brain-damaged patients: Difficulties in reasoning but not in estimating the physical properties of objects.

    PubMed

    Faye, Alexandrine; Jacquin-Courtois, Sophie; Osiurak, François

    2018-03-01

    The purpose of this study was to deepen our understanding of the cognitive bases of human tool use based on the technical reasoning hypothesis (i.e., the reasoning-based approach). This approach assumes that tool use is supported by the ability to reason about an object's physical properties (e.g., length, weight, strength, etc.) to perform mechanical actions (e.g., lever). In this framework, an important issue is to understand whether left-brain-damaged (LBD) individuals with tool-use deficits are still able to estimate the physical object's properties necessary to use the tool. Eleven LBD patients and 12 control participants performed 3 original experimental tasks: Use-Length (visual evaluation of the length of a stick to bring down a target), Visual-Length (to visually compare objects of different lengths) and Addition-Length (to visually compare added lengths). Participants were also tested on conventional tasks: Familiar Tool Use and Mechanical Problem-Solving (novel tools). LBD patients had more difficulties than controls on both conventional tasks. No significant differences were observed for the 3 experimental tasks. These results extend the reasoning-based approach, stressing that it might not be the representation of length that is impaired in LBD patients, but rather the ability to generate mechanical actions based on physical object properties. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  19. The Relationship between the Physical Therapist Clinical Performance Instrument Scores and Doctor of Physical Therapy Student Learning Styles

    ERIC Educational Resources Information Center

    Courtright, Joachim

    2017-01-01

    INTRODUCTION. The learning style of a student is an important factor in their ability to gain knowledge. This is especially important in challenging curriculums such as the Doctor of Physical Therapy (DPT) program. A common tool to assess one's learning style is The Kolb Learning Styles Inventory (LSI). A common tool used to measure the…

  20. Nonverbal communication in doctor-elderly patient transactions (NDEPT): development of a tool.

    PubMed

    Gorawara-Bhat, Rita; Cook, Mary Ann; Sachs, Greg A

    2007-05-01

    There are several measurement tools to assess verbal dimensions in clinical encounters; in contrast, there is no established tool to evaluate physical nonverbal dimensions in geriatric encounters. The present paper describes the development of a tool to assess the physical context of exam rooms in doctor-older patient visits. Salient features of the tool were derived from the medical literature and systematic observations of videotapes, and refined during the current research. The tool covers two main dimensions of exam rooms: (1) physical dimensions comprising static and dynamic attributes that become operational through the spatial configuration and can influence the manifestation of (2) kinesic attributes. Details of the coding form and inter-rater reliability are presented. The usefulness of the tool is demonstrated through an analysis of 50 National Institute on Aging videotapes. Physicians in exam rooms with no desk in the interaction, no height difference and optimal interaction distance were observed to have greater eye contact and touch than physicians in exam rooms with a desk, a height difference and a suboptimal interaction distance. The tool can enable physicians to assess the spatial configuration of exam rooms (through Parts A and B) and thus facilitate the structuring of kinesic attributes (Part C).

  1. Validity and applicability of a video-based animated tool to assess mobility in elderly Latin American populations

    PubMed Central

    Guerra, Ricardo Oliveira; Oliveira, Bruna Silva; Alvarado, Beatriz Eugenia; Curcio, Carmen Lucia; Rejeski, W Jack; Marsh, Anthony P; Ip, Edward H; Barnard, Ryan T; Guralnik, Jack M; Zunzunegui, Maria Victoria

    2016-01-01

    Aim To assess the reliability and the validity of Portuguese- and Spanish-translated versions of the video-based short-form Mobility Assessment Tool in assessing self-reported mobility, and to provide evidence for the applicability of these videos in elderly Latin American populations as a complement to physical performance measures. Methods The sample consisted of 300 elderly participants (150 from Brazil, 150 from Colombia) recruited at neighborhood social centers. Mobility was assessed with the Mobility Assessment Tool, and compared with the Short Physical Performance Battery score and self-reported functional limitations. Reliability was calculated using intraclass correlation coefficients. Multiple linear regression analyses were used to assess associations among mobility assessment tools and health, and sociodemographic variables. Results A significant gradient of increasing Mobility Assessment Tool score with better physical function was observed for both self-reported and objective measures, and in each city. Associations between self-reported mobility and health were strong, and significant. Mobility Assessment Tool scores were lower in women at both sites. Intraclass correlation coefficients of the Mobility Assessment Tool were 0.94 (95% confidence interval 0.90–0.97) in Brazil and 0.81 (95% confidence interval 0.66–0.91) in Colombia. Mobility Assessment Tool scores were lower in Manizales than in Natal after adjustment by Short Physical Performance Battery, self-rated health and sex. Conclusions These results provide evidence for high reliability and good validity of the Mobility Assessment Tool in its Spanish and Portuguese versions used in Latin American populations. In addition, the Mobility Assessment Tool can detect mobility differences related to environmental features that cannot be captured by objective performance measures. PMID:24666718
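
    For orientation, a sketch of the reliability statistic reported above (the intraclass correlation coefficient) on invented paired test-retest scores, using the pingouin library as one of several possible implementations.

    ```python
    # Hedged sketch: test-retest reliability via ICC, the statistic reported
    # above (ICC 0.94 in Brazil, 0.81 in Colombia). Scores are invented.
    import pandas as pd
    import pingouin as pg

    df = pd.DataFrame({
        "subject": list(range(8)) * 2,
        "occasion": ["t1"] * 8 + ["t2"] * 8,
        "mat_score": [55, 62, 47, 70, 58, 64, 51, 66,
                      57, 60, 49, 71, 56, 65, 53, 64],
    })
    icc = pg.intraclass_corr(data=df, targets="subject", raters="occasion",
                             ratings="mat_score")
    print(icc[["Type", "ICC", "CI95%"]])
    ```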

  2. Validity and applicability of a video-based animated tool to assess mobility in elderly Latin American populations.

    PubMed

    Guerra, Ricardo Oliveira; Oliveira, Bruna Silva; Alvarado, Beatriz Eugenia; Curcio, Carmen Lucia; Rejeski, W Jack; Marsh, Anthony P; Ip, Edward H; Barnard, Ryan T; Guralnik, Jack M; Zunzunegui, Maria Victoria

    2014-10-01

    To assess the reliability and the validity of Portuguese- and Spanish-translated versions of the video-based short-form Mobility Assessment Tool in assessing self-reported mobility, and to provide evidence for the applicability of these videos in elderly Latin American populations as a complement to physical performance measures. The sample consisted of 300 elderly participants (150 from Brazil, 150 from Colombia) recruited at neighborhood social centers. Mobility was assessed with the Mobility Assessment Tool, and compared with the Short Physical Performance Battery score and self-reported functional limitations. Reliability was calculated using intraclass correlation coefficients. Multiple linear regression analyses were used to assess associations among mobility assessment tools and health, and sociodemographic variables. A significant gradient of increasing Mobility Assessment Tool score with better physical function was observed for both self-reported and objective measures, and in each city. Associations between self-reported mobility and health were strong, and significant. Mobility Assessment Tool scores were lower in women at both sites. Intraclass correlation coefficients of the Mobility Assessment Tool were 0.94 (95% confidence interval 0.90-0.97) in Brazil and 0.81 (95% confidence interval 0.66-0.91) in Colombia. Mobility Assessment Tool scores were lower in Manizales than in Natal after adjustment by Short Physical Performance Battery, self-rated health and sex. These results provide evidence for high reliability and good validity of the Mobility Assessment Tool in its Spanish and Portuguese versions used in Latin American populations. In addition, the Mobility Assessment Tool can detect mobility differences related to environmental features that cannot be captured by objective performance measures. © 2013 Japan Geriatrics Society.

  3. Evaluation of ground-penetrating radar to detect free-phase hydrocarbons in fractured rocks - Results of numerical modeling and physical experiments

    USGS Publications Warehouse

    Lane, J.W.; Buursink, M.L.; Haeni, F.P.; Versteeg, R.J.

    2000-01-01

    The suitability of common-offset ground-penetrating radar (GPR) to detect free-phase hydrocarbons in bedrock fractures was evaluated using numerical modeling and physical experiments. The results of one- and two-dimensional numerical modeling at 100 megahertz indicate that GPR reflection amplitudes are relatively insensitive to fracture apertures ranging from 1 to 4 mm. The numerical modeling and physical experiments indicate that differences in the fluids that fill fractures significantly affect the amplitude and the polarity of electromagnetic waves reflected by subhorizontal fractures. Air-filled and hydrocarbon-filled fractures generate low-amplitude reflections that are in-phase with the transmitted pulse. Water-filled fractures create reflections with greater amplitude and opposite polarity than those reflections created by air-filled or hydrocarbon-filled fractures. The results from the numerical modeling and physical experiments demonstrate it is possible to distinguish water-filled fracture reflections from air- or hydrocarbon-filled fracture reflections; nevertheless, subsurface heterogeneity, antenna coupling changes, and other sources of noise will likely make it difficult to observe these changes in GPR field data. This indicates that the routine application of common-offset GPR reflection methods for detection of hydrocarbon-filled fractures will be problematic. Ideal cases will require appropriately processed, high-quality GPR data, ground-truth information, and detailed knowledge of subsurface physical properties. Conversely, the sensitivity of GPR methods to changes in subsurface physical properties as demonstrated by the numerical and experimental results suggests the potential of using GPR methods as a monitoring tool. GPR methods may be suited for monitoring pumping and tracer tests, changes in site hydrologic conditions, and remediation activities.
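
    The polarity and amplitude contrasts described above follow from the normal-incidence reflection coefficient between materials of different relative permittivity, R = (√ε₁ − √ε₂)/(√ε₁ + √ε₂). A sketch with rough textbook permittivities (not the paper's values):

    ```python
    # Hedged sketch: why fracture fill changes GPR reflection amplitude and
    # polarity. Permittivities are rough textbook values for illustration.
    import math

    def reflection_coefficient(eps1, eps2):
        return (math.sqrt(eps1) - math.sqrt(eps2)) / (math.sqrt(eps1) + math.sqrt(eps2))

    rock = 6.0
    for fill, eps in [("air", 1.0), ("hydrocarbon", 2.2), ("water", 81.0)]:
        r = reflection_coefficient(rock, eps)
        print(f"rock -> {fill:12s} R = {r:+.2f}")
    # Water-filled fractures give a large negative R (opposite polarity), matching
    # the modeling result above; air- and hydrocarbon-filled give smaller positive R.
    ```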

  4. Development and implementation of a remote audit tool for high dose rate (HDR) Ir-192 brachytherapy using optically stimulated luminescence dosimetry

    PubMed Central

    Casey, Kevin E.; Alvarez, Paola; Kry, Stephen F.; Howell, Rebecca M.; Lawyer, Ann; Followill, David

    2013-01-01

    Purpose: The aim of this work was to create a mailable phantom with measurement accuracy suitable for Radiological Physics Center (RPC) audits of high dose-rate (HDR) brachytherapy sources at institutions participating in National Cancer Institute-funded cooperative clinical trials. Optically stimulated luminescence dosimeters (OSLDs) were chosen as the dosimeter to be used with the phantom. Methods: The authors designed and built an 8 × 8 × 10 cm³ prototype phantom that had two slots capable of holding Al2O3:C OSLDs (nanoDots; Landauer, Glenwood, IL) and a single channel capable of accepting all 192Ir HDR brachytherapy sources in current clinical use in the United States. The authors irradiated the phantom with Nucletron and Varian 192Ir HDR sources in order to determine correction factors for linearity with dose and the combined effects of irradiation energy and phantom characteristics. The phantom was then sent to eight institutions which volunteered to perform trial remote audits. Results: The linearity correction factor was k_L = (−9.43 × 10⁻⁵ × dose) + 1.009, where dose is in cGy, which differed from that determined by the RPC for the same batch of dosimeters using 60Co irradiation. Separate block correction factors were determined for current versions of both Nucletron and Varian 192Ir HDR sources, and these vendor-specific correction factors differed by almost 2.6%. For the Nucletron source, the correction factor was 1.026 [95% confidence interval (CI) = 1.023–1.028], and for the Varian source, it was 1.000 (95% CI = 0.995–1.005). Variations in lateral source positioning up to 0.8 mm and distal/proximal source positioning up to 10 mm had minimal effect on dose measurement accuracy. The overall dose measurement uncertainty of the system was estimated to be 2.4% and 2.5% for the Nucletron and Varian sources, respectively (95% CI). This uncertainty was sufficient to establish a ±5% acceptance criterion for source strength audits under a formal RPC audit program. Trial audits of four Nucletron sources and four Varian sources revealed an average RPC-to-institution dose ratio of 1.000 (standard deviation = 0.011). Conclusions: The authors have created an OSLD-based 192Ir HDR brachytherapy source remote audit tool which offers sufficient dose measurement accuracy to allow the RPC to establish a remote audit program with a ±5% acceptance criterion. The feasibility of the system has been demonstrated with eight trial audits to date. PMID:24320455
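
    A sketch of how the reported correction factors might be applied to a raw OSLD reading; the multiplicative chaining and the raw reading are assumptions for illustration, while the factor values are taken from the abstract.

    ```python
    # Hedged sketch: applying the linearity (k_L) and vendor-specific block
    # corrections reported above to a raw OSLD dose reading (invented value).
    def corrected_dose(raw_dose_cgy, vendor):
        k_linearity = -9.43e-5 * raw_dose_cgy + 1.009     # k_L from the abstract
        k_block = {"nucletron": 1.026, "varian": 1.000}[vendor]
        return raw_dose_cgy * k_linearity * k_block

    reading = 85.0  # cGy, hypothetical raw OSLD reading
    for vendor in ("nucletron", "varian"):
        print(f"{vendor}: {corrected_dose(reading, vendor):.1f} cGy")
    ```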

  5. Open source GIS based tools to improve hydrochemical water resources management in EU H2020 FREEWAT platform

    NASA Astrophysics Data System (ADS)

    Criollo, Rotman; Velasco, Violeta; Vázquez-Suñé, Enric; Nardi, Albert; Marazuela, Miguel A.; Rossetto, Rudy; Borsi, Iacopo; Foglia, Laura; Cannata, Massimiliano; De Filippis, Giovanna

    2017-04-01

    Due to the general increase in water scarcity (Steduto et al., 2012), water quantity and quality must be well known to ensure proper access to water resources in compliance with local and regional directives. This can be supported by tools that facilitate the process of data management and analysis. Such analyses have to provide researchers, professionals, policy makers and users with the ability to improve the management of water resources under standard regulatory guidelines. Compliance with the established standard regulatory guidelines (with a special focus on requirements deriving from the GWD) requires effective monitoring, evaluation, and interpretation of a large number of physical and chemical parameters. These datasets have to be assessed and interpreted: (i) integrating data from different sources, gathered with different data access techniques and formats; (ii) managing data with varying temporal and spatial extent; and (iii) integrating groundwater quality information with other relevant information such as further hydrogeological data (Velasco et al., 2014), and pre-processing these data, generally for the realization of groundwater models. In this context, the Hydrochemical Analysis Tools (akvaGIS Tools) have been implemented within the H2020 FREEWAT project, which aims to support water resource management through modelling in an open-source GIS platform (QGIS desktop). The main goal of the akvaGIS Tools is to improve water quality analysis through capabilities that improve the case-study conceptual model, managing all related data in a geospatial database (implemented in SpatiaLite), together with a set of tools for improving the harmonization, integration, standardization, visualization and interpretation of hydrochemical data. To achieve this, different commands cover a wide range of methodologies for querying, interpreting, and comparing groundwater quality data, and facilitate the pre-processing needed for groundwater modelling. They include ionic balance calculations, chemical time-series analysis, correlation of chemical parameters, and the calculation of various common hydrochemical diagrams (salinity, Schöeller-Berkaloff, Piper, and Stiff), among others. Furthermore, the tools allow the generation of maps of the spatial distributions of parameters and diagrams, and thematic maps of the parameters measured and classified in the queried area. References: Rossetto, R., Borsi, I., Schifani, C., Bonari, E., Mogorovich, P., Primicerio, M. (2013). SID&GRID: Integrating hydrological modeling in GIS environment. Rendiconti Online Societa Geologica Italiana, Vol. 24, 282-283. Steduto, P., Faurès, J.M., Hoogeveen, J., Winpenny, J.T., Burke, J.J. (2012). Coping with water scarcity: an action framework for agriculture and food security. ISSN 1020-1203; 38. Velasco, V., Tubau, I., Vázquez-Suñé, E., Gogu, R., Gaitanaru, D., Alcaraz, M., Sanchez-Vila, X. (2014). GIS-based hydrogeochemical analysis tools (QUIMET). Computers & Geosciences, 70, 164-180.
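
    Among the capabilities listed above is the ionic balance calculation; a self-contained sketch of the standard charge-balance-error check on an invented water sample follows (equivalent weights are molar mass divided by charge).

    ```python
    # Hedged sketch: ionic (charge) balance check. Concentrations in mg/L are
    # converted to meq/L; |CBE| < 5% is a common acceptability threshold.
    EQ_WEIGHT = {  # g/eq = molar mass / charge
        "Ca": 20.04, "Mg": 12.15, "Na": 22.99, "K": 39.10,
        "HCO3": 61.02, "Cl": 35.45, "SO4": 48.03, "NO3": 62.00,
    }
    CATIONS = ("Ca", "Mg", "Na", "K")
    ANIONS = ("HCO3", "Cl", "SO4", "NO3")

    def charge_balance_error(sample_mg_l):
        meq = {ion: mg / EQ_WEIGHT[ion] for ion, mg in sample_mg_l.items()}
        cations = sum(meq[i] for i in CATIONS)
        anions = sum(meq[i] for i in ANIONS)
        return 100.0 * (cations - anions) / (cations + anions)

    sample = {"Ca": 80, "Mg": 24, "Na": 46, "K": 4,
              "HCO3": 244, "Cl": 71, "SO4": 96, "NO3": 12}  # invented sample
    print(f"charge balance error = {charge_balance_error(sample):+.1f}%")
    ```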

  6. Photutils: Photometry tools

    NASA Astrophysics Data System (ADS)

    Bradley, Larry; Sipocz, Brigitta; Robitaille, Thomas; Tollerud, Erik; Deil, Christoph; Vinícius, Zè; Barbary, Kyle; Günther, Hans Moritz; Bostroem, Azalee; Droettboom, Michael; Bray, Erik; Bratholm, Lars Andersen; Pickering, T. E.; Craig, Matt; Pascual, Sergio; Greco, Johnny; Donath, Axel; Kerzendorf, Wolfgang; Littlefair, Stuart; Barentsen, Geert; D'Eugenio, Francesco; Weaver, Benjamin Alan

    2016-09-01

    Photutils provides tools for detecting and performing photometry of astronomical sources. It can estimate the background and background rms in astronomical images, detect sources in astronomical images, estimate morphological parameters of those sources (e.g., centroid and shape parameters), and perform aperture and PSF photometry. Written in Python, it is an affiliated package of Astropy (ascl:1304.002).
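
    A short usage sketch of the capabilities listed above (background estimation, source detection, aperture photometry) on one of the package's bundled synthetic images; exact module paths can vary between photutils versions.

    ```python
    # Hedged sketch: detect sources and measure aperture photometry with
    # photutils on a bundled synthetic test image.
    import numpy as np
    from photutils.datasets import make_4gaussians_image
    from photutils.detection import DAOStarFinder
    from photutils.aperture import CircularAperture, aperture_photometry

    image = make_4gaussians_image()              # synthetic image with 4 sources
    bkg_estimate = np.median(image)              # crude scalar background estimate
    finder = DAOStarFinder(fwhm=3.0, threshold=5.0 * image.std())
    sources = finder(image - bkg_estimate)       # astropy Table of detections

    positions = np.transpose((sources["xcentroid"], sources["ycentroid"]))
    apertures = CircularAperture(positions, r=5.0)
    print(aperture_photometry(image - bkg_estimate, apertures))
    ```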

  7. Microbial source tracking: a tool for identifying sources of microbial contamination in the food chain.

    PubMed

    Fu, Ling-Lin; Li, Jian-Rong

    2014-01-01

    The ability to trace fecal indicators and food-borne pathogens to their point of origin has major ramifications for the food industry, food regulatory agencies, and public health. Such information would enable food producers and processors to better understand sources of contamination and thereby take corrective actions to prevent transmission. Microbial source tracking (MST), which is currently largely focused on determining sources of fecal contamination in waterways, also provides the scientific community with tools for tracking both fecal bacteria and food-borne pathogen contamination in the food chain. Approaches to MST are commonly classified as library-dependent methods (LDMs) or library-independent methods (LIMs). These tools will have widespread applications, including use for regulatory compliance, pollution remediation, and risk assessment, and will reduce the incidence of illness associated with food and water. Our aim in this review is to highlight the use of molecular MST methods for understanding the source and transmission of food-borne pathogens. The future directions of MST research are also discussed.

  8. Community Coordinated Modeling Center Support of Science Needs for Integrated Data Environment

    NASA Technical Reports Server (NTRS)

    Kuznetsova, M. M.; Hesse, M.; Rastatter, L.; Maddox, M.

    2007-01-01

    Space science models are an essential component of an integrated data environment. They are indispensable tools for facilitating effective use of a wide variety of distributed scientific data sources and for placing multi-point local measurements into global context. The Community Coordinated Modeling Center (CCMC) hosts a set of state-of-the-art space science models ranging from the solar atmosphere to the Earth's upper atmosphere. The majority of models residing at the CCMC are comprehensive, computationally intensive, physics-based models. To allow the models to be driven by data relevant to particular events, the CCMC developed an online data-file generation tool that automatically downloads data from data providers and transforms them into the required format. The CCMC provides a tailored web-based visualization interface for the model output, as well as the capability to download simulation output in a portable standard format with comprehensive metadata, together with a user-friendly library of model-output analysis routines that can be called from any language supporting C. The CCMC is also developing data interpolation tools that present model output in the same format as observations. The CCMC invites community comments and suggestions to better address science needs for the integrated data environment.

  9. Motion Artefacts in MRI: a Complex Problem with Many Partial Solutions

    PubMed Central

    Zaitsev, Maxim; Maclaren, Julian; Herbst, Michael

    2015-01-01

    Subject motion during magnetic resonance imaging (MRI) has been problematic since its introduction as a clinical imaging modality. While sensitivity to particle motion or blood flow can be used to provide useful image contrast, bulk motion presents a considerable problem in the majority of clinical applications. It is one of the most frequent sources of artefacts. Over 30 years of research have produced numerous methods to mitigate or correct for motion artefacts, but no single method can be applied in all imaging situations. Instead, a ‘toolbox’ of methods exists, where each tool is suitable for some tasks, but not for others. This article reviews the origins of motion artefacts and presents current mitigation and correction methods. In some imaging situations, the currently available motion correction tools are highly effective; in other cases, appropriate tools still need to be developed. It seems likely that this multifaceted approach will be what eventually solves the motion sensitivity problem in MRI, rather than a single solution that is effective in all situations. This review places a strong emphasis on explaining the physics behind the occurrence of such artefacts, with the aim of aiding artefact detection and mitigation in particular clinical situations. PMID:25630632

  10. Motion artifacts in MRI: A complex problem with many partial solutions.

    PubMed

    Zaitsev, Maxim; Maclaren, Julian; Herbst, Michael

    2015-10-01

    Subject motion during magnetic resonance imaging (MRI) has been problematic since its introduction as a clinical imaging modality. While sensitivity to particle motion or blood flow can be used to provide useful image contrast, bulk motion presents a considerable problem in the majority of clinical applications. It is one of the most frequent sources of artifacts. Over 30 years of research have produced numerous methods to mitigate or correct for motion artifacts, but no single method can be applied in all imaging situations. Instead, a "toolbox" of methods exists, where each tool is suitable for some tasks, but not for others. This article reviews the origins of motion artifacts and presents current mitigation and correction methods. In some imaging situations, the currently available motion correction tools are highly effective; in other cases, appropriate tools still need to be developed. It seems likely that this multifaceted approach will be what eventually solves the motion sensitivity problem in MRI, rather than a single solution that is effective in all situations. This review places a strong emphasis on explaining the physics behind the occurrence of such artifacts, with the aim of aiding artifact detection and mitigation in particular clinical situations. © 2015 Wiley Periodicals, Inc.

  11. Mapping hotspots of malaria transmission from pre-existing hydrology, geology and geomorphology data in the pre-elimination context of Zanzibar, United Republic of Tanzania.

    PubMed

    Hardy, Andrew; Mageni, Zawadi; Dongus, Stefan; Killeen, Gerry; Macklin, Mark G; Majambare, Silas; Ali, Abdullah; Msellem, Mwinyi; Al-Mafazy, Abdul-Wahiyd; Smith, Mark; Thomas, Chris

    2015-01-22

    Larval source management strategies can play an important role in malaria elimination programmes, especially for tackling outdoor biting species and for eliminating parasite and vector populations when they are most vulnerable during the dry season. Effective larval source management requires tools for identifying geographic foci of vector proliferation and malaria transmission where these efforts may be concentrated. Previous studies have relied on surface topographic wetness to indicate hydrological potential for vector breeding sites, but this is unsuitable for karst (limestone) landscapes such as Zanzibar where water flow, especially in the dry season, is subterranean and not controlled by surface topography. We examine the relationship between dry and wet season spatial patterns of diagnostic positivity rates of malaria infection amongst patients reporting to health facilities on Unguja, Zanzibar, and the physical geography of the island, including land cover, elevation, slope angle, hydrology, geology and geomorphology, in order to identify transmission hotspots using Boosted Regression Trees (BRT) analysis. The distribution of both wet and dry season malaria infection rates can be predicted using freely available static data, such as elevation and geology. Specifically, high infection rates in the central and southeast regions of the island coincide with outcrops of hard dense limestone, which cause locally elevated water tables, and with the location of dolines (shallow depressions plugged with fine-grained material promoting the persistence of shallow water bodies). This analysis provides a tractable tool for the identification of malaria hotspots which incorporates subterranean hydrology, and which can be used to target larval source management strategies.
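
    The Boosted Regression Trees analysis referred to above can be approximated with any gradient-boosting implementation. The sketch below uses scikit-learn as a stand-in, with invented covariates (elevation, slope, limestone outcrop, distance to doline) and synthetic positivity rates; it illustrates the technique only and uses none of the study's data.

        # BRT-style analysis sketch with invented covariates and data.
        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor

        rng = np.random.default_rng(1)
        n = 400
        X = np.column_stack([rng.uniform(0, 120, n),   # elevation [m]
                             rng.uniform(0, 15, n),    # slope [deg]
                             rng.integers(0, 2, n),    # limestone outcrop (0/1)
                             rng.uniform(0, 5, n)])    # distance to doline [km]
        # Toy positivity rate: higher on limestone, lower far from dolines.
        y = 0.05 + 0.10 * X[:, 2] - 0.02 * X[:, 3] + rng.normal(0, 0.01, n)

        brt = GradientBoostingRegressor(n_estimators=200, learning_rate=0.05,
                                        max_depth=3).fit(X, y)
        # Relative influence of each covariate, as in a BRT summary.
        print(dict(zip(["elev", "slope", "limestone", "doline_dist"],
                       brt.feature_importances_.round(2))))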

  12. Methodology and Software for Gross Defect Detection of Spent Nuclear Fuel at the Atucha-I Reactor [Novel Methodology and Software for Spent Fuel Gross Defect Detection at the Atucha-I Reactor]

    DOE PAGES

    Sitaraman, Shivakumar; Ham, Young S.; Gharibyan, Narek; ...

    2017-03-27

    Here, fuel assemblies in the spent fuel pool are stored by suspending them in two vertically stacked layers at the Atucha Unit 1 nuclear power plant (Atucha-I). This introduces the unique problem of verifying the presence of fuel in either layer without physically moving the fuel assemblies. Given that the facility uses both natural uranium and slightly enriched uranium at 0.85 wt% 235U and has been in operation since 1974, a wide range of burnups and cooling times can exist in any given pool. A gross defect detection tool, the spent fuel neutron counter (SFNC), has been used at the site to verify the presence of fuel up to burnups of 8000 MWd/t. At higher discharge burnups, the existing signal processing software of the tool was found to fail due to nonlinearity of the source term with burnup.

  13. Heterogeneous Sensor Data Exploration and Sustainable Declarative Monitoring Architecture: Application to Smart Building

    NASA Astrophysics Data System (ADS)

    Servigne, S.; Gripay, Y.; Pinarer, O.; Samuel, J.; Ozgovde, A.; Jay, J.

    2016-09-01

    Concerning energy consumption and monitoring architectures, our goal is to develop a sustainable declarative monitoring architecture that lowers energy consumption while taking into account the monitoring system itself. Our second goal is to develop theoretical and practical tools to model, explore and exploit heterogeneous data from various sources in order to understand a phenomenon such as the energy consumption of a smart building versus its inhabitants' social behaviours. We focus on a generic model for data acquisition campaigns based on the concept of a generic sensor. The concept of a generic sensor is centered on acquired data and on their inherent multi-dimensional structure, to support complex domain-specific or field-oriented analysis processes. We consider that a methodological breakthrough may pave the way to a deep understanding of voluminous and heterogeneous scientific data sets. Our use case concerns the energy efficiency of buildings, with the aim of understanding the relationship between physical phenomena and user behaviours. The aim of this paper is to present our methodology and our results concerning the architecture and the user-centric tools.

  14. Methodology and Software for Gross Defect Detection of Spent Nuclear Fuel at the Atucha-I Reactor [Novel Methodology and Software for Spent Fuel Gross Defect Detection at the Atucha-I Reactor]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sitaraman, Shivakumar; Ham, Young S.; Gharibyan, Narek

    Here, fuel assemblies in the spent fuel pool are stored by suspending them in two vertically stacked layers at the Atucha Unit 1 nuclear power plant (Atucha-I). This introduces the unique problem of verifying the presence of fuel in either layer without physically moving the fuel assemblies. Given that the facility uses both natural uranium and slightly enriched uranium at 0.85 wt% 235U and has been in operation since 1974, a wide range of burnups and cooling times can exist in any given pool. A gross defect detection tool, the spent fuel neutron counter (SFNC), has been used at the site to verify the presence of fuel up to burnups of 8000 MWd/t. At higher discharge burnups, the existing signal processing software of the tool was found to fail due to nonlinearity of the source term with burnup.

  15. Meteorological stations as a tool to teach on climate system sciences

    NASA Astrophysics Data System (ADS)

    Cerdà, Artemi; Bodí, Merche B.; Damián Ruiz-Sinoga, José

    2010-05-01

    Higher education has focused on teaching climate system theory, and meteorology and climatology students have rarely visited a meteorological station. Yet meteorological stations are the source of information for climate system studies, and they supply the key information for modelling. This paper shows how a meteorological station can be a key tool for introducing students to the study of climate and meteorology. The research stations of Montesa and El Teularet-Sierra de Enguera have been used for seven years to supply data to the students of Climatology in the first year of the Degree in Geography at the University of Valencia. The results show that the students who worked with the raw data set were proud to use original data. Those students obtained higher marks, and they also chose courses on climatology or physical geography in the following year. We conclude that the use of meteorological stations makes a positive contribution to improving students' knowledge and their commitment to science and the environment.

  16. Process evaluation of physical activity counselling with and without the use of mobile technology: A mixed methods study.

    PubMed

    Verwey, R; van der Weegen, S; Spreeuwenberg, M; Tange, H; van der Weijden, T; de Witte, L

    2016-01-01

    A monitoring-and-feedback tool was developed to stimulate physical activity by giving feedback on physical activity performance to patients and practice nurses. The tool consists of an activity monitor (accelerometer), wirelessly connected to a Smartphone and a web application. Use of this tool is combined with a behaviour change counselling protocol (the Self-management Support Programme) based on the Five A's model (Assess-Advise-Agree-Assist-Arrange). To examine the reach, implementation and satisfaction with the counselling protocol and the tool. A process evaluation was conducted in two intervention groups of a three-armed cluster randomised controlled trial, in which the counselling protocol was evaluated with (group 1, n=65) and without (group 2, n=66) the use of the tool using a mixed methods design. Sixteen family practices in the South of the Netherlands. Practice nurses (n=20) and their associated physically inactive patients (n=131), diagnosed with Chronic Obstructive Pulmonary Disease or Type 2 Diabetes, aged between 40 and 70 years old, and having access to a computer with an Internet connection. Semi-structured interviews about the receipt of the intervention were conducted with the nurses and log files were kept regarding the consultations. After the intervention, questionnaires were presented to patients and nurses regarding compliance to and satisfaction with the interventions. Functioning and use of the tool were also evaluated by system and helpdesk logging. Eighty-six percent of patients (group 1: n=57 and group 2: n=56) and 90% of nurses (group 1: n=10 and group 2: n=9) responded to the questionnaires. The execution of the Self-management Support Programme was adequate; in 83% (group 1: n=52, group 2: n=57) of the patients, the number and planning of the consultations were carried out as intended. Eighty-eight percent (n=50) of the patients in group 1 used the tool until the end of the intervention period. Technical problems occurred in 58% (n=33). Participants from group 1 were significantly more positive: patients: χ²(2, N=113) = 11.17, p=0.004, and nurses: χ²(2, N=19) = 6.37, p=0.040. Use of the tool led to greater awareness of the importance of physical activity, more discipline in carrying it out and more enjoyment. The interventions were adequately executed and received as planned. Patients from both groups appreciated the focus on physical activity and personal attention given by the nurse. The most appreciated aspect of the combined intervention was the tool, although technical problems frequently occurred. Patients with the tool estimated greater improvement in physical activity than patients without the tool. Copyright © 2015 Elsevier Ltd. All rights reserved.
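
    The group comparisons quoted above are standard chi-square tests on contingency tables. The sketch below shows the computation shape with invented counts (sized to give N = 113 like the paper's patient comparison, but not the study's data).

        # Chi-square test of independence on an invented 2x3 table of
        # satisfaction responses (rows: group 1 / group 2).
        import numpy as np
        from scipy.stats import chi2_contingency

        table = np.array([[5, 12, 40],     # hypothetical counts
                          [15, 18, 23]])
        chi2, p, dof, expected = chi2_contingency(table)
        print(f"chi2({dof}, N={table.sum()}) = {chi2:.2f}, p = {p:.3f}")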

  17. Tool use as distributed cognition: how tools help, hinder and define manual skill.

    PubMed

    Baber, Chris; Parekh, Manish; Cengiz, Tulin G

    2014-01-01

    Our thesis in this paper is that, in order to appreciate the interplay between cognitive (goal-directed) and physical performance in tool use, it is necessary to determine the role that representations play in the use of tools. We argue that rather than being solely a matter of internal (mental) representation, tool use makes use of the external representations that define the human-environment-tool-object system. This requires the notion of Distributed Cognition to encompass not simply the manner in which artifacts represent concepts but also how they represent praxis. Our argument is that this can be extended to include how artifacts-in-context afford use and how this response to affordances constitutes a particular form of skilled performance. By artifacts-in-context, we do not mean solely the affordances offered by the physical dimensions of a tool but also the interaction between the tool and the object that it is being used on. From this, "affordance" does not simply relate to the physical appearance of the tool but anticipates subsequent actions by the user directed towards the goal of changing the state of the object, and this is best understood in terms of the "complementarity" in the system. This assertion raises two challenges which are explored in this paper. The first is to distinguish "affordance" from the adaptation that one might expect to see in descriptions of motor control; when we speak of "affordance" as a form of anticipation, don't we just mean the ability to adjust movements in response to physical demands? The second is to distinguish "affordance" from a schema of the tool; when we talk about anticipation, don't we just mean the ability to call on a schema representing a "recipe" for using that tool for that task? This question of representation, specifically what knowledge needs to be represented in tool use, is central to this paper.

  18. Measurement methods to build up the digital optical twin

    NASA Astrophysics Data System (ADS)

    Prochnau, Marcel; Holzbrink, Michael; Wang, Wenxin; Holters, Martin; Stollenwerk, Jochen; Loosen, Peter

    2018-02-01

    The realization of the Digital Optical Twin (DOT), which is in short the digital representation of the physical state of an optical system, is particularly useful in the context of an automated assembly process for optical systems. During the assembly process, the physical status of the optical system is continuously measured and compared with the digital model. In the case of deviations between the physical state and the digital model, the latter is adapted to match the physical state. To reach this goal, in a first step, measurement and characterization technologies have to be identified and evaluated with respect to their suitability for generating a precise digital twin of an existing optical system. This paper gives an overview of possible characterization methods and shows first results of the evaluated and compared methods (e.g. spot radius, MTF, Zernike polynomials) used to create a DOT. The focus initially lies on the uniqueness of the optimization results as well as on the computational time required for the optimization to reach the characterized system state. Possible sources of error are the measurement accuracy (to characterize the system), the execution time of the measurement, the time needed to map the digital to the physical world (the optimization step), as well as the interface possibilities for integrating the measurement tool into an assembly cell. Moreover, it is discussed whether the measurement methods used are suitable for a 'seamless' integration into an assembly cell.
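
    As a concrete illustration of matching a digital model to a measurement, the sketch below fits a few low-order Zernike terms to a synthetic measured wavefront by linear least squares. The basis subset and the data are invented; this is not the authors' tool chain.

        # Fit piston, x-tilt and defocus terms to a synthetic wavefront.
        import numpy as np

        rng = np.random.default_rng(0)
        rho = rng.random(500)                  # normalized pupil radius
        theta = 2 * np.pi * rng.random(500)
        w_meas = (0.30 * (2 * rho**2 - 1)      # defocus
                  + 0.10 * rho * np.cos(theta) # x-tilt
                  + 0.01 * rng.standard_normal(500))  # measurement noise

        A = np.column_stack([np.ones_like(rho),     # piston
                             rho * np.cos(theta),   # x-tilt
                             2 * rho**2 - 1])       # defocus
        coeffs, *_ = np.linalg.lstsq(A, w_meas, rcond=None)
        print("piston, tilt, defocus:", coeffs.round(3))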

  19. The Feasibility of Standardised Geriatric Assessment Tools and Physical Exercises in Frail Older Adults.

    PubMed

    Jadczak, A D; Mahajan, N; Visvanathan, R

    2017-01-01

    Geriatric assessment tools are applicable to the general geriatric population; however, their feasibility in frail older adults is yet to be determined. The study aimed to determine the feasibility of standardised geriatric assessment tools and physical exercises in hospitalised frail older adults. Various assessment tools including the FRAIL Screen, the Charlson Comorbidity Index, the SF-36, the Trail Making Test (TMT), the Rapid Cognitive Screen, the Self Mini Nutritional Assessment (MNA-SF) and the Lawton iADL as well as standard physical exercises were assessed using observational protocols. The FRAIL Screen, MNA-SF, Rapid Cognitive Screen, Lawton iADL and the physical exercises were deemed to be feasible with only minor comprehension, execution and safety issues. The TMT was not considered to be feasible and the SF-36 should be replaced by its shorter form, the SF-12. In order to ensure the validity of these findings a study with a larger sample size should be undertaken.

  20. Open Source GIS based integrated watershed management

    NASA Astrophysics Data System (ADS)

    Byrne, J. M.; Lindsay, J.; Berg, A. A.

    2013-12-01

    Optimal land and water management to address future and current resource stresses and allocation challenges requires the development of state-of-the-art geomatics and hydrological modelling tools. Future hydrological modelling tools should be high-resolution and process-based, with real-time capability to assess changing resource issues critical to short-, medium- and long-term environmental management. The objective here is to merge two renowned, well-published resource modelling programs to create an open-source toolbox for integrated land and water management applications. This work will facilitate a much increased efficiency in land and water resource security, management and planning. Following an 'open-source' philosophy, the tools will be computer platform independent with source code freely available, maximizing knowledge transfer and the global value of the proposed research. The envisioned set of water resource management tools will be housed within 'Whitebox Geospatial Analysis Tools'. Whitebox is an open-source geographical information system (GIS) developed by Dr. John Lindsay at the University of Guelph. The emphasis of the Whitebox project has been to develop a user-friendly interface for advanced spatial analysis in environmental applications. The plugin architecture of the software is ideal for the tight integration of spatially distributed models and spatial analysis algorithms such as those contained within the GENESYS suite. Open-source development extends knowledge and technology transfer to a broad range of end-users and builds Canadian capability to address complex resource management problems with better tools and expertise for managers in Canada and around the world. GENESYS (Generate Earth Systems Science input) is an innovative, efficient, high-resolution hydro- and agro-meteorological model for complex terrain watersheds developed under the direction of Dr. James Byrne. GENESYS is an outstanding research and applications tool to address challenging resource management issues in industry, government and non-governmental agencies. Current research and analysis tools were developed to manage meteorological, climatological, and land and water resource data efficiently at high resolution in space and time. The deliverable for this work is a Whitebox-GENESYS open-source resource management capacity with routines for GIS-based watershed management, including water in agriculture and food production. We are adding urban water management routines through GENESYS in 2013-15 with an engineering PhD candidate. Both Whitebox-GAT and GENESYS are already well-established tools. The proposed research will combine these products to create an open-source geomatics-based water resource management tool that is revolutionary in both capacity and availability to a wide array of Canadian and global users.

  1. Identifying factors of comfort in using hand tools.

    PubMed

    Kuijt-Evers, L F M; Groenesteijn, L; de Looze, M P; Vink, P

    2004-09-01

    To design comfortable hand tools, knowledge about comfort/discomfort in using hand tools is required. We investigated which factors determine comfort/discomfort in using hand tools according to users. Therefore, descriptors of comfort/discomfort in using hand tools were collected from literature and interviews. After that, the relatedness of a selection of the descriptors to comfort in using hand tools was investigated. Six comfort factors could be distinguished (functionality, posture and muscles, irritation and pain of hand and fingers, irritation of hand surface, handle characteristics, aesthetics). These six factors can be classified into three meaningful groups: functionality, physical interaction and appearance. The main conclusions were that (1) the same descriptors were related to comfort and discomfort in using hand tools, (2) descriptors of functionality are most related to comfort in using hand tools followed by descriptors of physical interaction and (3) descriptors of appearance become secondary in comfort in using hand tools.

  2. Systematic review of behaviour change techniques to promote participation in physical activity among people with dementia.

    PubMed

    Nyman, Samuel R; Adamczewska, Natalia; Howlett, Neil

    2018-02-01

    The objective of this study was to systematically review the evidence for the potential promise of behaviour change techniques (BCTs) to increase physical activity among people with dementia (PWD). PsychINFO, MEDLINE, CINAHL, and the Cochrane Central Register of Controlled Trials databases were searched 01/01/2000-01/12/2016. Randomized controlled/quasi-randomized trials were included if they recruited people diagnosed/suspected to have dementia, used at least one BCT in the intervention arm, and had at least one follow-up measure of physical activity/adherence. Studies were appraised using the Cochrane Collaboration Risk of Bias Tool, and BCTs were coded using the taxonomy of Michie et al. (2013, Annals of Behavioral Medicine, 46, 81). Intervention findings were narratively synthesized as either 'very promising', 'quite promising', or 'non-promising', and BCTs were judged as having potential promise if they featured in at least twice as many very/quite promising than non-promising interventions (as per Gardner et al., 2016, Health Psychology Review, 10, 89). Nineteen articles from nine trials reported physical activity findings on behavioural outcomes (two very promising, one quite promising, and two non-promising) or intervention adherence (one quite promising and four non-promising). Thirteen BCTs were used across the interventions. While no BCT had potential promise to increase intervention adherence, three BCTs had potential promise for improving physical activity behaviour outcomes: goal setting (behaviour), social support (unspecified), and using a credible source. Three BCTs have potential promise for use in future interventions to increase physical activity among PWD. Statement of contribution What is already known on this subject? While physical activity is a key lifestyle factor to enhance and maintain health and wellbeing amongst the general population, adults rarely participate in sufficient levels to obtain these benefits. Systematic reviews suggest that specific behaviour change techniques can increase physical activity, although one review suggested that self-regulatory techniques may be counterproductive when promoting physical activity among older people. Until now, no systematic review has been conducted to assess which behaviour change techniques may be associated with greater participation in physical activity among people with dementia. What does this study add? Interventions showed mixed promise for increasing physical activity and little effect on participant adherence. Goal setting (behaviour), social support (unspecified), and using a credible source are promising approaches. No technique showed promise for increasing adherence to physical activity interventions among people with dementia. © 2017 The British Psychological Society.

  3. Smartphones as Experimental Tools: Different Methods to Determine the Gravitational Acceleration in Classroom Physics by Using Everyday Devices

    ERIC Educational Resources Information Center

    Kuhn, Jochen; Vogt, Patrik

    2013-01-01

    New media technology becomes more and more important for our daily life as well as for teaching physics. Within the scope of our N.E.T. research project we develop experiments using New Media Experimental Tools (N.E.T.) in physics education and study their influence on students' learning abilities. We want to present the possibilities e.g. of…
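
    One such everyday-device method is to time the free fall of an object dropped from a known height and recover g from h = g t²/2. The sketch below is a generic illustration with invented readings; it is not necessarily one of the article's specific methods.

        # Estimate g from drop height h and fall time t: g = 2 h / t^2.
        heights = [0.5, 1.0, 1.5, 2.0]    # m (hypothetical)
        times = [0.32, 0.45, 0.55, 0.64]  # s (hypothetical)

        g_values = [2 * h / t**2 for h, t in zip(heights, times)]
        print(f"g = {sum(g_values) / len(g_values):.2f} m/s^2")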

  4. Strong-field physics with mid-infrared lasers

    NASA Astrophysics Data System (ADS)

    Pogorelsky, I. V.

    2002-04-01

    Mid-infrared gas laser technology promises to become a unique tool for research in strong-field relativistic physics. The degree to which the physics is relativistic is determined by the ponderomotive potential. At a given intensity, a 10 μm wavelength CO2 laser reaches a 100 times higher ponderomotive potential than 1 μm wavelength solid state lasers. Thus, we can expect a proportional increase in the throughput of such processes as laser acceleration, x-ray production, etc. These arguments have been confirmed in proof-of-principle Thomson scattering and laser acceleration experiments conducted at BNL and UCLA, where the first terawatt-class CO2 lasers are in operation. Furthermore, proposals for 100 TW, 100 fs CO2 lasers based on frequency-chirped pulse amplification have been conceived. Such lasers can produce physical effects equivalent to those of a hypothetical multi-petawatt solid state laser. Ultra-fast mid-infrared lasers will open new routes to the next generation of electron and ion accelerators and ultra-bright monochromatic femtosecond x-ray and gamma sources, allow attempts to study Hawking-Unruh radiation, and explore relativistic aspects of laser-matter interactions. We review the present status and experiments with terawatt-class CO2 lasers, sub-petawatt projects, and prospective applications in strong-field science.
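
    The factor of 100 quoted above follows from the scaling of the ponderomotive potential; the relation below is a textbook result, not taken from the abstract itself:

        U_p = e²E²/(4 m_e ω²) ∝ I λ²,  so at fixed intensity  U_p(10 μm) / U_p(1 μm) = (10/1)² = 100.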

  5. Web-based rehabilitation interventions for people with rheumatoid arthritis: A systematic review.

    PubMed

    Srikesavan, Cynthia; Bryer, Catherine; Ali, Usama; Williamson, Esther

    2018-01-01

    Background Rehabilitation approaches for people with rheumatoid arthritis include joint protection, exercises and self-management strategies. Health interventions delivered via the web have the potential to improve access to health services overcoming time constraints, physical limitations, and socioeconomic and geographic barriers. The objective of this review is to determine the effects of web-based rehabilitation interventions in adults with rheumatoid arthritis. Methods Randomised controlled trials that compared web-based rehabilitation interventions with usual care, waiting list, no treatment or another web-based intervention in adults with rheumatoid arthritis were included. The outcomes were pain, function, quality of life, self-efficacy, rheumatoid arthritis knowledge, physical activity and adverse effects. Methodological quality was assessed using the Cochrane Risk of Bias tool and quality of evidence with the Grading of Recommendations Assessment, Development and Evaluation approach. Results Six source documents from four trials ( n = 567) focusing on self-management, health information or physical activity were identified. The effects of web-based rehabilitation interventions on pain, function, quality of life, self-efficacy, rheumatoid arthritis knowledge and physical activity are uncertain because of the very low quality of evidence mostly from small single trials. Adverse effects were not reported. Conclusion Large, well-designed trials are needed to evaluate the clinical and cost-effectiveness of web-based rehabilitation interventions in rheumatoid arthritis.

  6. STRONG FIELD PHYSICS WITH MID INFRARED LASERS.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    POGORELSKY,I.V.

    2001-08-27

    Mid-infrared gas laser technology promises to become a unique tool for research in strong-field relativistic physics. The degree to which the physics is relativistic is determined by the ponderomotive potential. At a given intensity, a 10 {micro}m wavelength CO{sub 2} laser reaches a 100 times higher ponderomotive potential than the 1 {micro}m wavelength solid state lasers. Thus, we can expect a proportional increase in the throughput of such processes as laser acceleration, x-ray production, etc. These arguments have been confirmed in proof-of-principle Thomson scattering and laser acceleration experiments conducted at BNL and UCLA, where the first terawatt-class CO{sub 2} lasers are in operation. Furthermore, proposals for the 100 TW, 100 fs CO{sub 2} lasers based on frequency-chirped pulse amplification have been conceived. Such lasers can produce physical effects equivalent to a hypothetical multi-petawatt solid state laser. Ultra-fast mid-infrared lasers will open new routes to the next generation electron and ion accelerators, ultra-bright monochromatic femtosecond x-ray and gamma sources, allow attempts to study Hawking-Unruh radiation, and explore relativistic aspects of laser-matter interactions. We review the present status and experiments with terawatt-class CO{sub 2} lasers, sub-petawatt projects, and prospective applications in strong-field science.

  7. Verification of the Icarus Material Response Tool

    NASA Technical Reports Server (NTRS)

    Schroeder, Olivia; Palmer, Grant; Stern, Eric; Schulz, Joseph; Muppidi, Suman; Martin, Alexandre

    2017-01-01

    Due to the complex physics encountered during reentry, material response solvers are used for two main purposes: to improve the understanding of the physical phenomena, and to design and size thermal protection systems (TPS). Icarus is a three-dimensional, unstructured material response tool that is intended to be used for design while maintaining the flexibility to easily implement physical models as needed. Because TPS selection and sizing are critical, it is of the utmost importance that design tools be extensively verified and validated before their use. Verification tests aim at ensuring that the numerical schemes and equations are implemented correctly, by comparison to analytical solutions and through grid convergence tests.
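
    A typical grid-convergence check of the kind mentioned above computes an observed order of accuracy from errors on successively refined grids. The sketch below uses invented error values and is not taken from the Icarus test suite.

        # Observed order of accuracy from errors on grids refined by r = 2.
        import math

        h = [0.04, 0.02, 0.01]           # grid spacings (hypothetical)
        err = [3.2e-3, 8.1e-4, 2.05e-4]  # errors vs. analytical solution

        r = h[0] / h[1]
        p = math.log(err[0] / err[1]) / math.log(r)
        print(f"observed order p = {p:.2f}")  # near 2 for a 2nd-order scheme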

  8. Coherent tools for physics-based simulation and characterization of noise in semiconductor devices oriented to nonlinear microwave circuit CAD

    NASA Astrophysics Data System (ADS)

    Riah, Zoheir; Sommet, Raphael; Nallatamby, Jean C.; Prigent, Michel; Obregon, Juan

    2004-05-01

    We present in this paper a set of coherent tools for noise characterization and physics-based analysis of noise in semiconductor devices. This noise toolbox relies on a low frequency noise measurement setup with special high current capabilities thanks to an accurate and original calibration. It relies also on a simulation tool based on the drift diffusion equations and the linear perturbation theory, associated with the Green's function technique. This physics-based noise simulator has been implemented successfully in the Scilab environment and is specifically dedicated to HBTs. Some results are given and compared to those existing in the literature.

  9. Code Sharing and Collaboration: Experiences from the Scientist's Expert Assistant Project and their Relevance to the Virtual Observatory

    NASA Technical Reports Server (NTRS)

    Jones, Jeremy; Grosvenor, Sandy; Wolf, Karl; Li, Connie; Koratkar, Anuradha; Powers, Edward I. (Technical Monitor)

    2001-01-01

    In the Virtual Observatory (VO), software tools will perform the functions that have traditionally been performed by physical observatories and their instruments. These tools will not be adjuncts to VO functionality but will make up the very core of the VO. Consequently, the tradition of observatory and system independent tools serving a small user base is not valid for the VO. For the VO to succeed, we must improve software collaboration and code sharing between projects and groups. A significant goal of the Scientist's Expert Assistant (SEA) project has been promoting effective collaboration and code sharing between groups. During the past three years, the SEA project has been developing prototypes for new observation planning software tools and strategies. Initially funded by the Next Generation Space Telescope, parts of the SEA code have since been adopted by the Space Telescope Science Institute. SEA has also supplied code for SOFIA, the SIRTF planning tools, and the JSky Open Source Java library. The potential benefits of sharing code are clear. The recipient gains functionality for considerably less cost. The provider gains additional developers working with their code. If enough user groups adopt a set of common code and tools, de facto standards can emerge (as demonstrated by the success of the FITS standard). Code sharing also raises a number of challenges related to the management of the code. In this talk, we will review our experiences with SEA - both successes and failures - and offer some lessons learned that may promote further successes in collaboration and re-use.

  10. Code Sharing and Collaboration: Experiences From the Scientist's Expert Assistant Project and Their Relevance to the Virtual Observatory

    NASA Technical Reports Server (NTRS)

    Korathkar, Anuradha; Grosvenor, Sandy; Jones, Jeremy; Li, Connie; Mackey, Jennifer; Neher, Ken; Obenschain, Arthur F. (Technical Monitor)

    2001-01-01

    In the Virtual Observatory (VO), software tools will perform the functions that have traditionally been performed by physical observatories and their instruments. These tools will not be adjuncts to VO functionality but will make up the very core of the VO. Consequently, the tradition of observatory and system independent tools serving a small user base is not valid for the VO. For the VO to succeed, we must improve software collaboration and code sharing between projects and groups. A significant goal of the Scientist's Expert Assistant (SEA) project has been promoting effective collaboration and code sharing among groups. During the past three years, the SEA project has been developing prototypes for new observation planning software tools and strategies. Initially funded by the Next Generation Space Telescope, parts of the SEA code have since been adopted by the Space Telescope Science Institute. SEA has also supplied code for the SIRTF (Space Infrared Telescope Facility) planning tools, and the JSky Open Source Java library. The potential benefits of sharing code are clear. The recipient gains functionality for considerably less cost. The provider gains additional developers working with their code. If enough user groups adopt a set of common code and tools, de facto standards can emerge (as demonstrated by the success of the FITS standard). Code sharing also raises a number of challenges related to the management of the code. In this talk, we will review our experiences with SEA--both successes and failures--and offer some lessons learned that might promote further successes in collaboration and re-use.

  11. SLUG - stochastically lighting up galaxies - III. A suite of tools for simulated photometry, spectroscopy, and Bayesian inference with stochastic stellar populations

    NASA Astrophysics Data System (ADS)

    Krumholz, Mark R.; Fumagalli, Michele; da Silva, Robert L.; Rendahl, Theodore; Parra, Jonathan

    2015-09-01

    Stellar population synthesis techniques for predicting the observable light emitted by a stellar population have extensive applications in numerous areas of astronomy. However, accurate predictions for small populations of young stars, such as those found in individual star clusters, star-forming dwarf galaxies, and small segments of spiral galaxies, require that the population be treated stochastically. Conversely, accurate deductions of the properties of such objects also require consideration of stochasticity. Here we describe a comprehensive suite of modular, open-source software tools for tackling these related problems. These include the following: a greatly enhanced version of the SLUG code introduced by da Silva et al., which computes spectra and photometry for stochastically or deterministically sampled stellar populations with nearly arbitrary star formation histories, clustering properties, and initial mass functions; CLOUDY_SLUG, a tool that automatically couples SLUG-computed spectra with the CLOUDY radiative transfer code in order to predict stochastic nebular emission; BAYESPHOT, a general-purpose tool for performing Bayesian inference on the physical properties of stellar systems based on unresolved photometry; and CLUSTER_SLUG and SFR_SLUG, a pair of tools that use BAYESPHOT on a library of SLUG models to compute the mass, age, and extinction of mono-age star clusters, and the star formation rate of galaxies, respectively. The latter two tools make use of an extensive library of pre-computed stellar population models, which are included in the software. The complete package is available at http://www.slugsps.com.
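
    The Bayesian step can be pictured as kernel-weighted inference over a library of simulated models. The sketch below is a toy version in that spirit, with an invented one-band library; it is not the BAYESPHOT interface itself.

        # Toy library-based inference: weight library models by a Gaussian
        # kernel around the observed magnitude, then average the property.
        import numpy as np

        rng = np.random.default_rng(0)
        log_mass = rng.uniform(2.0, 5.0, 5000)                   # library property
        mag = -2.5 * log_mass + 5.0 + rng.normal(0, 0.3, 5000)   # toy photometry

        obs_mag, obs_err = -4.0, 0.1                             # one observation
        w = np.exp(-0.5 * ((mag - obs_mag) / obs_err) ** 2)
        print(f"posterior mean log M = {np.sum(w * log_mass) / np.sum(w):.2f}")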

  12. Adding tools to the open source toolbox: The Internet

    NASA Technical Reports Server (NTRS)

    Porth, Tricia

    1994-01-01

    The Internet offers researchers additional sources of information not easily available from traditional sources such as print volumes or commercial data bases. Internet tools such as e-mail and file transfer protocol (ftp) speed up the way researchers communicate and transmit data. Mosaic, one of the newest additions to the Internet toolbox, allows users to combine tools such as ftp, gopher, wide area information server, and the world wide web with multimedia capabilities. Mosaic has quickly become a popular means of making information available on the Internet because it is versatile and easily customizable.

  13. Job title of recent bachelor's degree recipients

    NASA Astrophysics Data System (ADS)

    White, Susan C.

    2015-05-01

    Physics bachelor's degree recipients work in all kinds of professions—science writing, medicine, law, history of science, acting, music, healthcare and more. Since very few of these employees have the word "physics" in their job titles, it can be hard for new graduates to know where to look for jobs and how to find other recent physics graduates in the workforce. The American Institute of Physics and the Society of Physics Students joined forces on an NSF-funded grant to create career tools for undergraduate physics students.1 One of the tools available to students in the Careers Toolbox is a listing of common job titles of physics bachelor's degree recipients working in various fields; some of the job titles are listed below.

  14. Students’ epistemic understanding of mathematical derivations in physics

    NASA Astrophysics Data System (ADS)

    Sirnoorkar, Amogh; Mazumdar, Anwesh; Kumar, Arvind

    2017-01-01

    We propose an epistemic measure of physics in terms of the ability to discriminate between the purely mathematical, physical (i.e. dependent on empirical inputs) and nominal (i.e. empty of mathematical or physical content) propositions appearing in a typical derivation in physics. The measure can be relevant in understanding the maths-physics link hurdles among college students. To illustrate the idea, we construct a tool for a familiar derivation (involving specific heats of an ideal gas), and use it for a sample of students from three different institutes. The reliability of the tool is examined. The results indicate, as intuitively expected, that epistemic clarity correlates with content clarity. Data yield several significant trends on the extent and kinds of epistemic pitfalls prevalent among physics undergraduates.

  15. TESSIM: a simulator for the Athena-X-IFU

    NASA Astrophysics Data System (ADS)

    Wilms, J.; Smith, S. J.; Peille, P.; Ceballos, M. T.; Cobo, B.; Dauser, T.; Brand, T.; den Hartog, R. H.; Bandler, S. R.; de Plaa, J.; den Herder, J.-W. A.

    2016-07-01

    We present the design of tessim, a simulator for the physics of transition edge sensors developed in the framework of the Athena end-to-end simulation effort. Designed to represent the general behavior of transition edge sensors and to provide input for engineering and science studies for Athena, tessim implements a numerical solution of the linearized equations describing these devices. The simulation includes a model for the relevant noise sources and several implementations of possible trigger algorithms. Input and output of the software are standard FITS files which can be visualized and processed using standard X-ray astronomical tool packages. Tessim is freely available as part of the SIXTE package (http://www.sternwarte.uni-erlangen.de/research/sixte/).
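
    The linearized equations mentioned above couple a small electrical perturbation δI and a thermal perturbation δT around the bias point. The sketch below integrates one common linearization with an explicit Euler step; all parameter values are invented, and this is not tessim's implementation.

        # Linearized TES small-signal response to a temperature impulse.
        import numpy as np

        L, R0, RL = 1e-7, 1e-3, 0.3e-3   # H, Ohm, Ohm (hypothetical)
        C, G = 1e-12, 2e-10              # J/K, W/K
        alpha, beta = 100.0, 1.0         # logarithmic sensitivities
        I0, T0 = 1e-5, 0.1               # bias current [A], temperature [K]

        dt, n = 1e-8, 2000
        dI, dT = np.zeros(n), np.zeros(n)
        dT[0] = 1e-6                     # small "photon" impulse [K]

        for k in range(n - 1):
            dR = R0 * (alpha * dT[k] / T0 + beta * dI[k] / I0)  # resistance shift
            dI[k+1] = dI[k] + dt * (-(RL + R0) * dI[k] - I0 * dR) / L
            dT[k+1] = dT[k] + dt * (2*I0*R0*dI[k] + I0**2 * dR - G*dT[k]) / C

        print("peak current excursion [A]:", dI.min())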

  16. TESSIM: A Simulator for the Athena-X-IFU

    NASA Technical Reports Server (NTRS)

    Wilms, J.; Smith, S. J.; Peille, P.; Ceballos, M. T.; Cobo, B.; Dauser, T.; Brand, T.; Den Hartog, R. H.; Bandler, S. R.; De Plaa, J.; hide

    2016-01-01

    We present the design of tessim, a simulator for the physics of transition edge sensors developed in the framework of the Athena end-to-end simulation effort. Designed to represent the general behavior of transition edge sensors and to provide input for engineering and science studies for Athena, tessim implements a numerical solution of the linearized equations describing these devices. The simulation includes a model for the relevant noise sources and several implementations of possible trigger algorithms. Input and output of the software are standard FITS files which can be visualized and processed using standard X-ray astronomical tool packages. Tessim is freely available as part of the SIXTE package (http://www.sternwarte.uni-erlangen.de/research/sixte/).

  17. Atomic force microscopy of atomic-scale ledges and etch pits formed during dissolution of quartz

    NASA Technical Reports Server (NTRS)

    Gratz, A. J.; Manne, S.; Hansma, P. K.

    1991-01-01

    The processes involved in the dissolution and growth of crystals are closely related. Atomic force microscopy (AFM) of faceted pits (called negative crystals) formed during quartz dissolution reveals subtle details of these underlying physical mechanisms for silicates. In imaging these surfaces, the AFM detected ledges less than 1 nm high that were spaced 10 to 90 nm apart. A dislocation pit, invisible to optical and scanning electron microscopy measurements and serving as a ledge source, was also imaged. These observations confirm the applicability of ledge-motion models to dissolution and growth of silicates; coupled with measurements of dissolution rate on facets, these methods provide a powerful tool for probing mineral surface kinetics.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nalu is a Sierra ToolKit (STK) based application module, and it has provided a set of "lessons learned" for the STK transition effort through its early adoption of STK. It makes use of the open-sourced Trilinos/Tpetra library. Through the investment of LDRD and ASCR projects, the Nalu code module has been extended beyond prototype status. Physics capability includes low-Mach, variable-density turbulent flow. The ongoing objective for Nalu is to facilitate partnerships with external organizations in order to extend code capability and knowledge; however, it is not intended to support routine CFD analysis. The targeted usage of this module is for non-NW applications that support work-for-others in the multiphysics energy sector.

  19. NARMER-1: a photon point-kernel code with build-up factors

    NASA Astrophysics Data System (ADS)

    Visonneau, Thierry; Pangault, Laurence; Malouch, Fadhel; Malvagi, Fausto; Dolci, Florence

    2017-09-01

    This paper presents an overview of NARMER-1, the new generation of photon point-kernel code developed by the Reactor Studies and Applied Mathematics Unit (SERMA) at the CEA Saclay Center. After a short introduction giving some historical background and the current development context of the code, the paper presents the principles implemented in the calculation and the physical quantities computed, and surveys the generic features: programming language, computer platforms, geometry package, source description, etc. Moreover, specific and recent features are also detailed: exclusion sphere, tetrahedral meshes, parallel operations. Some points about verification and validation are then presented. Finally, we present some tools that help the user with operations such as visualization and pre-processing.
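
    The point-kernel principle implemented by such codes multiplies the uncollided flux from a point source by a build-up factor B(μr) that accounts for scattered photons. The sketch below uses a Taylor-form build-up factor with invented coefficients; it illustrates the general method only and is not NARMER-1's data or code.

        # Photon point-kernel with a Taylor-form build-up factor.
        import math

        def point_kernel_flux(S, mu, r, A=2.0, a1=-0.08, a2=0.05):
            """Flux [1/cm^2/s] at r [cm] from an isotropic point source.
            S: photons/s; mu: linear attenuation coefficient [1/cm]."""
            mur = mu * r
            B = A * math.exp(-a1 * mur) + (1.0 - A) * math.exp(-a2 * mur)
            return B * S * math.exp(-mur) / (4.0 * math.pi * r**2)

        # 1 MeV source behind 10 cm of water-like shielding (mu ~ 0.07/cm).
        print(point_kernel_flux(S=1e9, mu=0.07, r=10.0))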

  20. Towards predictive simulations of soot formation: from surrogate to turbulence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blanquart, Guillaume

    The combustion of transportation fuels leads to the formation of several kinds of pollutants, among which are soot particles. These particles, also formed during coal combustion and in fires, are the source of several health problems and environmental issues. Unfortunately, our current understanding of the chemical and physical phenomena leading to the formation of soot particles remains incomplete, and as a result, the predictive capability of our numerical tools is lacking. The objective of the work was to reduce the gap in the present understanding and modeling of soot formation both in laminar and turbulent flames. The effort spanned several length scales from the molecular level to large scale turbulent transport.

  1. Small-Size High-Current Generators for X-Ray Backlighting

    NASA Astrophysics Data System (ADS)

    Chaikovsky, S. A.; Artyomov, A. P.; Zharova, N. V.; Zhigalin, A. S.; Lavrinovich, I. V.; Oreshkin, V. I.; Ratakhin, N. A.; Rousskikh, A. G.; Fedunin, A. V.; Fedushchak, V. F.; Erfort, A. A.

    2017-12-01

    The paper deals with soft X-ray backlighting based on the X-pinch, a powerful tool for physical studies of fast processes. We propose unique small-size pulsed power generators operating as low-inductance capacitor banks. These pulse generators provide an X-pinch-based soft X-ray source (hν = 1-10 keV) of micron size with a 2-3 ns pulse duration. The small size and weight of the pulse generators allow them to be transported to any laboratory for X-ray backlighting of test objects with micron spatial resolution and nanosecond exposure time. These generators also allow the creation of synchronized multi-frame radiographic complexes with frame delay variation over a broad range.

  2. System Architecture Modeling for Technology Portfolio Management using ATLAS

    NASA Technical Reports Server (NTRS)

    Thompson, Robert W.; O'Neil, Daniel A.

    2006-01-01

    Strategic planners and technology portfolio managers have traditionally relied on consensus-based tools, such as Analytical Hierarchy Process (AHP) and Quality Function Deployment (QFD), in planning the funding of technology development. While useful to a certain extent, these tools are limited in their ability to fully quantify the impact of a technology choice on system mass, system reliability, project schedule, and lifecycle cost. The Advanced Technology Lifecycle Analysis System (ATLAS) aims to provide strategic planners a decision support tool for analyzing technology selections within a Space Exploration Architecture (SEA). Using ATLAS, strategic planners can select physics-based system models from a library, configure the systems with technologies and performance parameters, and plan the deployment of a SEA. Key parameters for current and future technologies have been collected from subject-matter experts and other documented sources in the Technology Tool Box (TTB). ATLAS can be used to compare the technical feasibility and economic viability of a set of technology choices for one SEA, and compare it against another set of technology choices or another SEA. System architecture modeling in ATLAS is a multi-step process. First, the modeler defines the system level requirements. Second, the modeler identifies technologies of interest whose impact on the SEA is to be assessed. Third, the system modeling team creates models of architecture elements (e.g. launch vehicles, in-space transfer vehicles, crew vehicles) if they are not already in the model library. Finally, the architecture modeler develops a script for the ATLAS tool to run, and the results for comparison are generated.
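
    The kind of trade such a tool supports can be pictured with a toy rollup showing how a technology choice propagates to system mass. Everything in the sketch below (names, numbers, structure) is invented for illustration and is not ATLAS's model library.

        # Toy technology trade: power-technology choice vs. system mass.
        from dataclasses import dataclass

        @dataclass
        class Technology:
            name: str
            specific_mass: float  # kg per kW required (hypothetical)

        def system_mass(dry_kg: float, power_kw: float, tech: Technology) -> float:
            # Dry mass plus the power-subsystem mass implied by the technology.
            return dry_kg + power_kw * tech.specific_mass

        for tech in (Technology("solar array", 15.0),
                     Technology("fission surface power", 8.0)):
            print(tech.name, system_mass(4000.0, 50.0, tech), "kg")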

  3. Identifying fecal sources in a selected catchment reach using multiple source-tracking tools

    USGS Publications Warehouse

    Vogel, J.R.; Stoeckel, D.M.; Lamendella, R.; Zelt, R.B.; Santo Domingo, J.W.; Walker, S.R.; Oerther, D.B.

    2007-01-01

    Given known limitations of current microbial source-tracking (MST) tools, emphasis on small, simple study areas may enhance interpretations of fecal contamination sources in streams. In this study, three MST tools - Escherichia coli repetitive element polymerase chain reaction (rep-PCR), coliphage typing, and Bacteroidales 16S rDNA host-associated markers - were evaluated in a selected reach of Plum Creek in south-central Nebraska. Water-quality samples were collected from six sites. One reach was selected for MST evaluation based on observed patterns of E. coli contamination. Despite high E. coli concentrations, coliphages were detected only once among water samples, precluding their use as a MST tool in this setting. Rep-PCR classification of E. coli isolates from both water and sediment samples supported the hypothesis that cattle and wildlife were dominant sources of fecal contamination, with minor contributions by horses and humans. Conversely, neither ruminant nor human sources were detected by Bacteroidales markers in most water samples. In bed sediment, ruminant- and human-associated Bacteroidales markers were detected throughout the interval from 0 to 0.3 m, with detections independent of E. coli concentrations in the sediment. Although results by E. coli-based and Bacteroidales-based MST methods led to similar interpretations, detection of Bacteroidales markers in sediment more commonly than in water indicates that different tools to track fecal contamination (in this case, tools based on Bacteroidales DNA and E. coli isolates) may have varying relevance to the more specific goal of tracking the sources of E. coli in watersheds. This is the first report of a simultaneous, toolbox-approach application of library-based and marker-based MST analyses to flowing surface water. © ASA, CSSA, SSSA.

  4. An Evaluation of Open Source Learning Management Systems According to Administration Tools and Curriculum Design

    ERIC Educational Resources Information Center

    Ozdamli, Fezile

    2007-01-01

    Distance education is becoming more important in universities and schools. The aim of this research is to evaluate currently existing Open Source Learning Management Systems according to their administration tools and curriculum design. For this, seventy-two Open Source Learning Management Systems have been subjected to a general evaluation. After…

  5. Measurement properties of self-report physical activity assessment tools in stroke: a protocol for a systematic review.

    PubMed

    Martins, Júlia Caetano; Aguiar, Larissa Tavares; Nadeau, Sylvie; Scianni, Aline Alvim; Teixeira-Salmela, Luci Fuscaldi; Faria, Christina Danielli Coelho de Morais

    2017-02-13

    Self-report physical activity assessment tools are commonly used for the evaluation of physical activity levels in individuals with stroke. A great variety of these tools have been developed and widely used in recent years, which justifies the need to examine their measurement properties and clinical utility. Therefore, the main objectives of this systematic review are to examine the measurement properties and clinical utility of self-report measures of physical activity and discuss the strengths and limitations of the identified tools. A systematic review of studies that investigated the measurement properties and/or clinical utility of self-report physical activity assessment tools in stroke will be conducted. Electronic searches will be performed in five databases: Medical Literature Analysis and Retrieval System Online (MEDLINE) (PubMed), Excerpta Medica Database (EMBASE), Physiotherapy Evidence Database (PEDro), Literatura Latino-Americana e do Caribe em Ciências da Saúde (LILACS) and Scientific Electronic Library Online (SciELO), followed by hand searches of the reference lists of the included studies. Two independent reviewers will screen all retrieved titles, abstracts, and full texts according to the inclusion criteria and will also extract the data. A third reviewer will be consulted to resolve any disagreements. A descriptive summary of the included studies will contain the design, participants, as well as the characteristics, measurement properties, and clinical utility of the self-report tools. The methodological quality of the studies will be evaluated using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist and the clinical utility of the identified tools will be assessed considering predefined criteria. This systematic review will follow the Preferred Reporting Items for Systematic Review and Meta-Analyses (PRISMA) statement. This systematic review will provide an extensive review of the measurement properties and clinical utility of self-report physical activity assessment tools used in individuals with stroke, which would benefit clinicians and researchers. PROSPERO CRD42016037146. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  6. Physics-Based Hazard Assessment for Critical Structures Near Large Earthquake Sources

    NASA Astrophysics Data System (ADS)

    Hutchings, L.; Mert, A.; Fahjan, Y.; Novikova, T.; Golara, A.; Miah, M.; Fergany, E.; Foxall, W.

    2017-09-01

    We argue that for critical structures near large earthquake sources: (1) the ergodic assumption, recent history, and simplified descriptions of the hazard are not appropriate to rely on for earthquake ground motion prediction and can lead to a mis-estimation of the hazard and risk to structures; (2) a physics-based approach can address these issues; (3) a physics-based source model must be provided to generate realistic phasing effects from finite rupture and model near-source ground motion correctly; (4) wave propagation and site response should be site-specific; (5) a much wider search of possible sources of ground motion can be achieved computationally with a physics-based approach; (6) unless one utilizes a physics-based approach, the hazard and risk to structures have unknown uncertainties; (7) uncertainties can be reduced with a physics-based approach, but not with an ergodic approach; (8) computational power and computer codes have advanced to the point that risk to structures can be calculated directly from source and site-specific ground motions. Spanning the variability of potential ground motion in a predictive situation is especially difficult for near-source areas, but that is the distance at which the hazard is the greatest. The basis of a "physics-based" approach is ground-motion syntheses derived from physics and an understanding of the earthquake process. This is an overview paper and results from previous studies are used to make the case for these conclusions. Our premise is that 50 years of strong motion records is insufficient to capture all possible ranges of site and propagation path conditions, rupture processes, and spatial geometric relationships between source and site. Predicting future earthquake scenarios is necessary; models that have little or no physical basis but have been tested and adjusted to fit available observations can only "predict" what happened in the past, which should be considered description as opposed to prediction. We have developed a methodology for synthesizing physics-based broadband ground motion that incorporates the effects of realistic earthquake rupture along specific faults and the actual geology between the source and site.

  7. The Macro Dynamics of Weapon System Acquisition: Shaping Early Decisions to Get Better Outcomes

    DTIC Science & Technology

    2012-05-17

    [Briefing-chart fragments; only phrases are recoverable: defects and rework • design tools and processes • lack of feedback to key design and SE processes • lack of quantified risk and uncertainty at key… • tools for rapid exploration of the physical design space • coupling operability, interoperability, and physical feasibility analyses – a game changer • interoperability • training • quantified margins and uncertainties at each critical decision point • M&S • RDT&E • a continuum of tools underpinned with…]

  8. Status and future of MUSE

    NASA Astrophysics Data System (ADS)

    Harfst, S.; Portegies Zwart, S.; McMillan, S.

    2008-12-01

    We present MUSE, a software framework for combining existing computational tools from different astrophysical domains into a single multi-physics, multi-scale application. MUSE facilitates the coupling of existing codes written in different languages by providing inter-language tools and by specifying an interface between each module and the framework that represents a balance between generality and computational efficiency. This approach allows scientists to use combinations of codes to solve highly-coupled problems without the need to write new codes for other domains or significantly alter their existing codes. MUSE currently incorporates the domains of stellar dynamics, stellar evolution and stellar hydrodynamics for studying generalized stellar systems. We have now reached a "Noah's Ark" milestone, with (at least) two available numerical solvers for each domain. MUSE can treat multi-scale and multi-physics systems in which the time- and size-scales are well separated, like simulating the evolution of planetary systems, small stellar associations, dense stellar clusters, galaxies and galactic nuclei. In this paper we describe two examples calculated using MUSE: the merger of two galaxies and an N-body simulation with live stellar evolution. In addition, we demonstrate an implementation of MUSE on a distributed computer which may also include special-purpose hardware, such as GRAPEs or GPUs, to accelerate computations. The current MUSE code base is publicly available as open source at http://muse.li.
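
    The balance between generality and efficiency comes from requiring every module in a domain to expose the same narrow interface. The sketch below shows the flavor of such an interface; the class and method names are invented, not MUSE's actual API.

        # A uniform per-domain interface lets the framework drive any solver.
        from abc import ABC, abstractmethod

        class DynamicsModule(ABC):
            @abstractmethod
            def add_particle(self, mass, pos, vel) -> int: ...
            @abstractmethod
            def evolve(self, t_end: float) -> None: ...

        class ToyNBody(DynamicsModule):
            def __init__(self):
                self.particles = []
            def add_particle(self, mass, pos, vel):
                self.particles.append((mass, tuple(pos), tuple(vel)))
                return len(self.particles) - 1
            def evolve(self, t_end):
                pass  # a real module would integrate gravity up to t_end

        code = ToyNBody()
        code.add_particle(1.0, (0, 0, 0), (0, 0, 0))
        code.evolve(t_end=1.0)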

  9. Effects of different sources of physically effective fiber on rumen microbial populations.

    PubMed

    Shaw, C N; Kim, M; Eastridge, M L; Yu, Z

    2016-03-01

    Physically effective fiber is needed by dairy cattle to prevent ruminal acidosis. This study aimed to examine the effects of different sources of physically effective fiber on the populations of fibrolytic bacteria and methanogens. Five ruminally cannulated Holstein cows were each fed five diets differing in physically effective fiber sources over 15 weeks (21 days/period) in a Latin Square design: (1) 44.1% corn silage, (2) 34.0% corn silage plus 11.5% alfalfa hay, (3) 34.0% corn silage plus 5.1% wheat straw, (4) 36.1% corn silage plus 10.1% wheat straw, and (5) 34.0% corn silage plus 5.5% corn stover. The impact of the physically effective fiber sources on total bacteria and archaea was examined using denaturing gradient gel electrophoresis. Specific real-time PCR assays were used to quantify total bacteria, total archaea, the genus Butyrivibrio, Fibrobacter succinogenes, Ruminococcus albus, Ruminococcus flavefaciens and three uncultured rumen bacteria that had been identified from adhering ruminal fractions in a previous study. No significant differences were observed among the different sources of physically effective fiber with respect to the microbial populations quantified. Any of the physically effective fiber sources may be fed to dairy cattle without negative impact on the ruminal microbial community.

  10. Open source bioimage informatics for cell biology.

    PubMed

    Swedlow, Jason R; Eliceiri, Kevin W

    2009-11-01

    Significant technical advances in imaging, molecular biology and genomics have fueled a revolution in cell biology, in that the molecular and structural processes of the cell are now visualized and measured routinely. Driving much of this recent development has been the advent of computational tools for the acquisition, visualization, analysis and dissemination of these datasets. These tools collectively make up a new subfield of computational biology called bioimage informatics, which is facilitated by open source approaches. We discuss why open source tools for image informatics in cell biology are needed, some of the key general attributes of what makes an open source imaging application successful, and point to opportunities for further interoperability that should greatly accelerate future cell biology discovery.

  11. Development of a new generation of high-resolution anatomical models for medical device evaluation: the Virtual Population 3.0

    NASA Astrophysics Data System (ADS)

    Gosselin, Marie-Christine; Neufeld, Esra; Moser, Heidi; Huber, Eveline; Farcito, Silvia; Gerber, Livia; Jedensjö, Maria; Hilber, Isabel; Di Gennaro, Fabienne; Lloyd, Bryn; Cherubini, Emilio; Szczerba, Dominik; Kainz, Wolfgang; Kuster, Niels

    2014-09-01

    The Virtual Family computational whole-body anatomical human models were originally developed for electromagnetic (EM) exposure evaluations, in particular to study how absorption of radiofrequency radiation from external sources depends on anatomy. However, the models immediately garnered much broader interest and are now applied by over 300 research groups, many from medical applications research fields. In a first step, the Virtual Family was expanded to the Virtual Population to provide considerably broader population coverage with the inclusion of models of both sexes ranging in age from 5 to 84 years old. Although these models have proven to be invaluable for EM dosimetry, it became evident that significantly enhanced models are needed for reliable effectiveness and safety evaluations of diagnostic and therapeutic applications, including medical implant safety. This paper describes the research and development performed to obtain anatomical models that meet the requirements necessary for medical implant safety assessment applications. These include implementation of quality control procedures, re-segmentation at higher resolution, more-consistent tissue assignments, enhanced surface processing and numerous anatomical refinements. Several tools were developed to enhance the functionality of the models, including discretization tools, posing tools to expand the posture space covered, and multiple morphing tools, e.g., to develop pathological models or variations of existing ones. A comprehensive tissue properties database was compiled to complement the library of models. The results are a set of anatomically independent, accurate, and detailed models with smooth, yet feature-rich and topologically conforming surfaces. The models are therefore suited for the creation of unstructured meshes, and the possible applications of the models are extended to a wider range of solvers and physics. The impact of these improvements is shown for the MRI exposure of an adult woman with an orthopedic spinal implant. Future developments include the functionalization of the models for specific physical and physiological modeling tasks.

  12. HydroUnits: A Python-based Physical Units Management Tool in Hydrologic Computing Systems

    NASA Astrophysics Data System (ADS)

    Celicourt, P.; Piasecki, M.

    2015-12-01

    While one objective of data management systems is to provide the units when annotating collected data, another is that the units must be manipulated correctly during conversion steps. This is not a trivial task, however, and the time spent on unit conversions, and the errors they introduce, can be quite expensive for large datasets. To date, more than a dozen Python modules have been developed to deal with units attached to quantities. However, they fall short in many ways and also suffer from not integrating with a units controlled vocabulary. Moreover, none of them permits the encoding of some complex units defined in the Consortium of Universities for the Advancement of Hydrologic Sciences, Inc.'s Observations Data Model (CUAHSI ODM) as a vectorial representation for storage demand reduction, nor do they include provisions to accommodate unforeseen standards-based units. We developed HydroUnits, a Python-based units management tool, for three specific purposes: encoding of physical units in the Transducer Electronic Data Sheet (TEDS) as defined in the IEEE 1451.0 standard; performing dimensional analysis; and on-the-fly conversion of time series, allowing users to retrieve data from a data source in a desired equivalent unit while accommodating unforeseen and user-defined units. HydroUnits differentiates itself from existing tools by a number of factors, including its implementation approach, its adoption of standards-based unit naming conventions and, more importantly, its emphasis on units controlled vocabularies, which are a critical aspect of units treatment. Additionally, HydroUnits supports unit conversion for quantities with an additive scaling factor, natively supports time-series conversion, and takes leap years into consideration for units involving the time dimension (e.g., month, minute). Due to its overall implementation approach, HydroUnits exhibits a level of versatility that no other tool we are aware of has achieved.
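
    A minimal sketch of two of the ideas described above (dimension vectors for dimensional analysis, and conversions with an additive scaling factor) might look like the following; the names are illustrative assumptions, not HydroUnits' actual interface.

    ```python
    # Units as dimension vectors plus scale/offset; a sketch, not HydroUnits.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Unit:
        dims: tuple          # (length, mass, time, temperature), e.g. m/s = (1,0,-1,0)
        scale: float         # multiplicative factor to the SI reference unit
        offset: float = 0.0  # additive factor (degC -> K needs +273.15)

    METRE_PER_SECOND = Unit((1, 0, -1, 0), 1.0)
    FOOT_PER_SECOND  = Unit((1, 0, -1, 0), 0.3048)
    KELVIN           = Unit((0, 0, 0, 1), 1.0)
    CELSIUS          = Unit((0, 0, 0, 1), 1.0, offset=273.15)

    def convert(value, src, dst):
        if src.dims != dst.dims:                  # dimensional analysis check
            raise ValueError("incompatible dimensions")
        return (value * src.scale + src.offset - dst.offset) / dst.scale

    print(convert(10.0, FOOT_PER_SECOND, METRE_PER_SECOND))  # 3.048
    print(convert(25.0, CELSIUS, KELVIN))                    # 298.15
    ```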

  13. Middle-aged women's decisions about body weight management: needs assessment and testing of a knowledge translation tool.

    PubMed

    Stacey, Dawn; Jull, Janet; Beach, Sarah; Dumas, Alex; Strychar, Irene; Adamo, Kristi; Brochu, Martin; Prud'homme, Denis

    2015-04-01

    This study aims to assess middle-aged women's needs when making body weight management decisions and to evaluate a knowledge translation tool for addressing their needs. A mixed-methods study used an interview-guided theory-based survey of professional women aged 40 to 65 years. The tool summarized evidence to address their needs and enabled women to monitor actions taken. Acceptability and usability were reported descriptively. Sixty female participants had a mean body mass index of 28.0 kg/m(2) (range, 17.0-44.9 kg/m(2)), and half were premenopausal. Common options for losing (82%) or maintaining (18%) weight included increasing physical activity (60%), eating healthier (57%), and getting support (40%). Decision-making involved getting information on options (52%), soliciting others' decisions/advice (20%), and being self-motivated (20%). Preferred information sources included written information (97%), counseling (90%), and social networking websites (43%). Five professionals (dietitian, personal trainer, occupational therapist, and two physicians) had similar responses. Of 53 women sent the tool, 27 provided acceptability feedback. They rated it as good to excellent for information on menopause (96%), body weight changes (85%), and managing body weight (85%). Most would tell others about it (81%). After 4 weeks of use, 25 women reported that the wording made sense (96%) and that the tool had clear instructions (92%) and was easy to use across time (88%). The amount of information was rated as just right (64%), but the tool had limited space for responding (72%). When making decisions about body weight management, women's needs were "getting information" and "getting support." The knowledge translation tool was acceptable and usable, but further evaluation is required.

  15. The effect of introducing computers into an introductory physics problem-solving laboratory

    NASA Astrophysics Data System (ADS)

    McCullough, Laura Ellen

    2000-10-01

    Computers are appearing in every type of classroom across the country. Yet they often appear without benefit of studying their effects. The research that is available on computer use in classrooms has found mixed results, and often ignores the theoretical and instructional contexts of the computer in the classroom. The University of Minnesota's physics department employs a cooperative-group problem solving pedagogy, based on a cognitive apprenticeship instructional model, in its calculus-based introductory physics course. This study was designed to determine possible negative effects of introducing a computerized data-acquisition and analysis tool into this pedagogy as a problem-solving tool for students to use in the laboratory. To determine the effects of the computer tool, two quasi-experimental treatment groups were selected. The computer-tool group (N = 170) used a tool designed for this study (VideoTool) to collect and analyze motion data in the laboratory. The control group (N = 170) used traditional non-computer equipment (spark tapes and Polaroid(TM) film). The curriculum was kept as similar as possible for the two groups. During the ten-week academic quarter, groups were examined for effects on performance on conceptual tests and grades, attitudes towards the laboratory and the laboratory tools, and behaviors within cooperative groups. Possible interactions with gender were also examined. Few differences were found between the control and computer-tool groups. The control group received slightly higher scores on one conceptual test, but this difference was not educationally significant. The computer-tool group had slightly more positive attitudes towards using the computer tool than their counterparts had towards the traditional tools. The computer-tool group also perceived that they spoke more frequently about physics misunderstandings, while the control group felt that they discussed equipment difficulties more often. This perceptual difference interacted with gender, with the men in the control group more likely to discuss equipment difficulties than any other group. Overall, the differences between the control and quasi-experimental groups were minimal. It was concluded that carefully replacing traditional data collection and analysis tools with a computer tool had no negative effects on achievement, attitude, or group behavior, and did not interact with gender.

  16. Through-silicon via plating void metrology using focused ion beam mill

    NASA Astrophysics Data System (ADS)

    Rudack, A. C.; Nadeau, J.; Routh, R.; Young, R. J.

    2012-03-01

    3D IC integration continues to increase in complexity, employing advanced interconnect technologies such as through-silicon vias (TSVs), wafer-to-wafer (W2W) bonding, and multi-chip stacking. As always, the challenge with developing new processes is to get fast, effective feedback to the integration engineer. Ideally this data is provided by nondestructive in-line metrology, but this is not always possible. For example, some form of physical cross-sectioning is still the most practical way to detect and characterize TSV copper plating voids. This can be achieved by cleaving, followed by scanning electron microscope (SEM) inspection. A more effective physical cross-sectioning method has been developed using an automated dual-beam focused ion beam (FIB)-SEM system, in which multiple locations can be sectioned and imaged while leaving the wafer intact. This method has been used routinely to assess copper plating voids over the last 24 months at SEMATECH. FIB-SEM feedback has been used to evaluate new plating chemistries, plating recipes, and process tool requalification after downtime. The dual-beam FIB-SEM used for these studies employs a gallium-based liquid metal ion source (LMIS). The overall throughput when relatively large volumes are milled is limited to 3-4 hours per section due to the maximum available beam current of 20 nA. Despite the larger volumetric removal rates of other techniques (e.g., mechanical polishing, broad-ion milling, and laser ablation), the value of localized, site-specific, and artifact-free FIB milling is well appreciated. The challenge, therefore, has been to reap the desired FIB benefits, but at faster volume removal rates. This has led to several system and technology developments for improving the throughput of the FIB technique, the most recent being the introduction of FIBs based on an inductively coupled plasma (ICP) ion source. The ICP source offers much better performance than the LMIS at very high beam currents, enabling more than 1 μA of ion beam current for fast material removal. At lower currents, the LMIS outperforms the ICP source, but imaging resolution below 30 nm has been demonstrated with ICP-based systems. In addition, the ICP source allows a wide range of possible ion species, with Xe currently the milling species of choice due to its high mass and favorable ion source performance parameters. Using a 1 μA Xe beam will give an overall milling rate for silicon some 20X higher than a Ga beam operating at 65 nA. This paper will compare the benefits already seen using the Ga-based FIB-SEM approach to TSV metrology with the improvements in throughput and time-to-data obtained by using the faster material removal capabilities of a FIB based on an ICP ion source. Plasma FIB (PFIB) is demonstrated to be a feasible tool for TSV plating void metrology.
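
    As a rough plausibility check on the quoted 20X figure, one can assume the removal rate scales with beam current times sputter yield; the Xe/Ga yield ratio used below is an assumption chosen for illustration, not a measured value.

    ```python
    # Back-of-envelope check of the ~20X milling-rate claim.
    i_xe, i_ga = 1e-6, 65e-9           # beam currents (A)
    yield_ratio = 1.3                  # assumed Xe/Ga sputter-yield ratio for Si
    print(i_xe / i_ga * yield_ratio)   # ~ 20
    ```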

  17. Tool use as distributed cognition: how tools help, hinder and define manual skill

    PubMed Central

    Baber, Chris; Parekh, Manish; Cengiz, Tulin G.

    2014-01-01

    Our thesis in this paper is that, in order to appreciate the interplay between cognitive (goal-directed) and physical performance in tool use, it is necessary to determine the role that representations play in the use of tools. We argue that rather than being solely a matter of internal (mental) representation, tool use makes use of the external representations that define the human–environment–tool–object system. This requires the notion of Distributed Cognition to encompass not simply the manner in which artifacts represent concepts but also how they represent praxis. Our argument is that this can be extended to include how artifacts-in-context afford use and how this response to affordances constitutes a particular form of skilled performance. By artifacts-in-context, we do not mean solely the affordances offered by the physical dimensions of a tool but also the interaction between the tool and the object that it is being used on. From this, “affordance” does not simply relate to the physical appearance of the tool but anticipates subsequent actions by the user directed towards the goal of changing the state of the object, and this is best understood in terms of the “complementarity” in the system. This assertion raises two challenges which are explored in this paper. The first is to distinguish “affordance” from the adaptation that one might expect to see in descriptions of motor control; when we speak of “affordance” as a form of anticipation, don’t we just mean the ability to adjust movements in response to physical demands? The second is to distinguish “affordance” from a schema of the tool; when we talk about anticipation, don’t we just mean the ability to call on a schema representing a “recipe” for using that tool for that task? This question of representation, specifically what knowledge needs to be represented in tool use, is central to this paper. PMID:24605103

  18. Ultrafast electron diffraction and electron microscopy: present status and future prospects

    NASA Astrophysics Data System (ADS)

    Ishchenko, A. A.; Aseyev, S. A.; Bagratashvili, V. N.; Panchenko, V. Ya; Ryabov, E. A.

    2014-07-01

    Acting as complementary research tools, highly time-resolved spectroscopy and diffractometry techniques proceeding from various physical principles open up new possibilities for studying matter, with the necessary integration of the 'structure-dynamics-function' triad in physics, chemistry, biology and materials science. Since the 1980s, a new field of research has emerged at leading research laboratories, aimed at developing means of filming the coherent dynamics of nuclei in molecules and fast processes in biological objects ('atomic and molecular movies'). The utilization of ultrashort laser pulse sources has significantly modified traditional electron beam approaches and provided high space-time resolution for the study of materials. Diffraction methods using frame-by-frame filming and the development of the main principles of the study of the coherent dynamics of atoms have paved the way to observing wave packet dynamics, the intermediate states of reaction centers, and the dynamics of electrons in molecules, thus allowing a transition from the kinetics to the dynamics of the phase trajectories of molecules in the investigation of chemical reactions.

  19. The brain as a dynamic physical system.

    PubMed

    McKenna, T M; McMullen, T A; Shlesinger, M F

    1994-06-01

    The brain is a dynamic system that is non-linear at multiple levels of analysis. Characterization of its non-linear dynamics is fundamental to our understanding of brain function. Identifying families of attractors in phase space analysis, an approach which has proven valuable in describing non-linear mechanical and electrical systems, can prove valuable in describing a range of behaviors and associated neural activity including sensory and motor repertoires. Additionally, transitions between attractors may serve as useful descriptors for analysing state changes in neurons and neural ensembles. Recent observations of synchronous neural activity, and the emerging capability to record the spatiotemporal dynamics of neural activity by voltage-sensitive dyes and electrode arrays, provide opportunities for observing the population dynamics of neural ensembles within a dynamic systems context. New developments in the experimental physics of complex systems, such as the control of chaotic systems, selection of attractors, attractor switching and transient states, can be a source of powerful new analytical tools and insights into the dynamics of neural systems.

  20. Coupling physically based and data-driven models for assessing freshwater inflow into the Small Aral Sea

    NASA Astrophysics Data System (ADS)

    Ayzel, Georgy; Izhitskiy, Alexander

    2018-06-01

    The Aral Sea desiccation and related changes in hydroclimatic conditions at the regional level have been a hot topic for the past decades. The key problem of scientific research projects devoted to investigating the modern Aral Sea basin hydrological regime is its discontinuous nature - only a limited number of papers take the complex runoff formation system into account in its entirety. Addressing this challenge, we have developed a continuous prediction system for assessing freshwater inflow into the Small Aral Sea, based on coupling a stack of hydrological and data-driven models. Results show good prediction skill and confirm the possibility of developing a valuable water assessment tool which exploits the power of both classical physically based and modern machine learning models for territories with a complex water management system and strong water-related data scarcity. The source code and data of the proposed system are available on a GitHub page (https://github.com/SMASHIproject/IWRM2018).
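
    The coupling pattern itself is compact: a conceptual physically based model produces a first-guess discharge, and a data-driven model learns the residual. The sketch below illustrates this with synthetic data and a toy bucket model; it is not the project's actual model stack.

    ```python
    # Physics-plus-residual-learning coupling on synthetic data; a sketch only.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(1)
    precip = rng.gamma(2.0, 2.0, 1000)                 # synthetic forcing
    temp = 10 + 10 * np.sin(np.linspace(0, 60, 1000))

    def bucket_model(p, t, k=0.3):
        """Toy conceptual model: storage with linear outflow and a melt term."""
        s, q = 0.0, np.empty_like(p)
        for i in range(len(p)):
            s += p[i] + max(t[i], 0) * 0.05
            q[i] = k * s
            s -= q[i]
        return q

    q_phys = bucket_model(precip, temp)
    q_obs = q_phys * 1.2 + rng.normal(0, 0.3, 1000)    # pretend observations

    X = np.column_stack([precip, temp, q_phys])
    ml = GradientBoostingRegressor().fit(X[:800], (q_obs - q_phys)[:800])
    q_coupled = q_phys[800:] + ml.predict(X[800:])     # physics + learned residual
    ```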

  1. Predictive Rotation Profile Control for the DIII-D Tokamak

    NASA Astrophysics Data System (ADS)

    Wehner, W. P.; Schuster, E.; Boyer, M. D.; Walker, M. L.; Humphreys, D. A.

    2017-10-01

    Control-oriented modeling and model-based control of the rotation profile are employed to build a suitable control capability for aiding rotation-related physics studies at DIII-D. To obtain a control-oriented model, a simplified version of the momentum balance equation is combined with empirical representations of the momentum sources. The control approach is rooted in a Model Predictive Control (MPC) framework to regulate the rotation profile while satisfying constraints associated with the desired plasma stored energy and/or βN limit. Simple modifications allow for alternative control objectives, such as maximizing the plasma rotation while maintaining a specified input torque. Because the MPC approach can explicitly incorporate various types of constraints, this approach is well suited to a variety of control objectives, and therefore serves as a valuable tool for experimental physics studies. Closed-loop TRANSP simulations are presented to demonstrate the effectiveness of the control approach. Supported by the US DOE under DE-SC0010661 and DE-FC02-04ER54698.
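
    A minimal sketch of the MPC formulation described above, on a toy linear discretized momentum-balance model, might look as follows; the matrices, limits and dimensions are invented for illustration and are not the identified DIII-D models.

    ```python
    # Toy MPC for a rotation profile: x_{k+1} = A x_k + B u_k, with a summed
    # actuation bound standing in for a stored-energy/torque constraint.
    import numpy as np
    import cvxpy as cp

    n, m, T = 4, 2, 10                       # profile points, actuators, horizon
    A = 0.9 * np.eye(n)                      # toy momentum damping/diffusion
    B = np.random.default_rng(0).uniform(0, 0.2, (n, m))
    x0 = np.zeros(n)
    x_ref = np.array([1.0, 0.8, 0.6, 0.4])   # desired rotation profile

    x = cp.Variable((n, T + 1))
    u = cp.Variable((m, T))
    cost, cons = 0, [x[:, 0] == x0]
    for k in range(T):
        cost += cp.sum_squares(x[:, k + 1] - x_ref) + 0.1 * cp.sum_squares(u[:, k])
        cons += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                 cp.sum(u[:, k]) <= 5.0,     # proxy for a torque/energy limit
                 u[:, k] >= 0]
    cp.Problem(cp.Minimize(cost), cons).solve()
    print(u.value[:, 0])                     # apply first move, then re-solve
    ```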

  2. Synergism of Nanomaterials with Physical Stimuli for Biology and Medicine.

    PubMed

    Shin, Tae-Hyun; Cheon, Jinwoo

    2017-03-21

    Developing innovative tools that facilitate the understanding of sophisticated biological systems has been one of the Holy Grails in the physical and biological sciences. In this Commentary, we discuss recent advances, opportunities, and challenges in the use of nanomaterials as a precision tool for biology and medicine.

  3. NASA Operational Simulator for Small Satellites: Tools for Software Based Validation and Verification of Small Satellites

    NASA Technical Reports Server (NTRS)

    Grubb, Matt

    2016-01-01

    The NASA Operational Simulator for Small Satellites (NOS3) is a suite of tools to aid in areas such as software development, integration test (IT), mission operations training, verification and validation (VV), and software systems check-out. NOS3 provides a software development environment, a multi-target build system, an operator interface-ground station, dynamics and environment simulations, and software-based hardware models. NOS3 enables the development of flight software (FSW) early in the project life cycle, when access to hardware is typically not available. For small satellites there are extensive lead times on many of the commercial-off-the-shelf (COTS) components as well as limited funding for engineering test units (ETU). Considering the difficulty of providing a hardware test-bed to each developer or tester, hardware components are modeled based upon characteristic data or manufacturers' data sheets for each individual component. The fidelity of each hardware model is such that FSW executes unaware that physical hardware is not present. This allows binaries to be compiled for both the simulation environment and the flight computer without changing the FSW source code. For hardware models that provide data dependent on the environment, such as a GPS receiver or magnetometer, an open-source tool from NASA GSFC (42 Spacecraft Simulation) is used to provide the necessary data. The underlying infrastructure used to transfer messages between FSW and the hardware models can also be used to monitor, intercept, and inject messages, which has proven to be beneficial for VV of larger missions such as the James Webb Space Telescope (JWST). As hardware is procured, drivers can be added to the environment to enable hardware-in-the-loop (HWIL) testing. When strict time synchronization is not vital, any number of combinations of hardware components and software-based models can be tested. The open-source operator interface used in NOS3 is COSMOS from Ball Aerospace. For testing, plug-ins are implemented in COSMOS to control the NOS3 simulations, while the command and telemetry tools available in COSMOS are used to communicate with FSW. NOS3 is actively being used for FSW development and component testing of the Simulation-to-Flight 1 (STF-1) CubeSat. As NOS3 matures, hardware models have been added for common CubeSat components such as Novatel GPS receivers, ClydeSpace electrical power systems and batteries, ISISpace antenna systems, etc. In the future, NASA IVV plans to distribute NOS3 to other CubeSat developers and release the suite to the open-source community.
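
    The hardware-model idea can be sketched in a few lines: flight software talks to a model through the same message path it would use for real hardware, and the bus exposes hooks to monitor, intercept, or inject messages. All class and message names below are illustrative assumptions, not NOS3's actual interfaces.

    ```python
    # Software-based hardware model behind a tappable message bus; a sketch.
    class MessageBus:
        def __init__(self): self.taps = []
        def tap(self, fn): self.taps.append(fn)      # monitor/intercept hook
        def request(self, device, msg):
            for fn in self.taps:
                msg = fn(msg)                        # a tap may rewrite messages
            return device.handle(msg)

    class GpsModel:
        """Responds from environment data as a data sheet says the part would."""
        def __init__(self, environment): self.env = environment
        def handle(self, msg):
            return self.env() if msg == "GET_POS" else "NAK"

    bus = MessageBus()
    bus.tap(lambda m: m)                             # pass-through monitor
    gps = GpsModel(environment=lambda: (6771.0, 0.0, 0.0))  # stand-in orbit state (km)
    print(bus.request(gps, "GET_POS"))
    ```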

  4. Recent H- diagnostics, plasma simulations, and 2X scaled Penning ion source developments at the Rutherford Appleton Laboratory

    NASA Astrophysics Data System (ADS)

    Lawrie, S. R.; Faircloth, D. C.; Smith, J. D.; Sarmento, T. M.; Whitehead, M. O.; Wood, T.; Perkins, M.; Macgregor, J.; Abel, R.

    2018-05-01

    A vessel for extraction and source plasma analyses is being used for Penning H- ion source development at the Rutherford Appleton Laboratory. A new set of optical elements including an einzel lens has been installed, which transports over 80 mA of H- beam successfully. Simultaneously, a 2X scaled Penning source has been developed to reduce cathode power density. The 2X source is now delivering a 65 mA H- ion beam at 10% duty factor, meeting its design criteria. The long-term viability of the einzel lens and 2X source is now being evaluated, so new diagnostic devices have been installed. A pair of electrostatic deflector plates is used to correct beam misalignment and perform fast chopping, with a voltage rise time of 24 ns. A suite of four quartz crystal microbalances has shown that the cesium flux in the vacuum vessel is only increased by a factor of two, despite the absence of a dedicated cold trap. Finally, an infrared camera has demonstrated good agreement with thermal simulations but has indicated unexpected heating due to beam loss on the downstream electrode. These types of diagnostics are suitable for monitoring all operational ion sources. In addition to experimental campaigns and new diagnostic tools, the high-performance VSim and COMSOL software packages are being used for plasma simulations of two novel ion thrusters for space propulsion applications. In parallel, a VSim framework has been established to include arbitrary temperature and cesium fields to allow the modeling of surface physics in H- ion sources.

  5. ThinkerTools. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2012

    2012-01-01

    "ThinkerTools" is a computer-based program that aims to develop students' understanding of physics and scientific modeling. The program is composed of two curricula for middle school students, "ThinkerTools Inquiry" and "Model-Enhanced ThinkerTools". "ThinkerTools Inquiry" allows students to explore the…

  6. Efficient Delivery and Visualization of Long Time-Series Datasets Using Das2 Tools

    NASA Astrophysics Data System (ADS)

    Piker, C.; Granroth, L.; Faden, J.; Kurth, W. S.

    2017-12-01

    For over 14 years the University of Iowa Radio and Plasma Wave Group has utilized a network-transparent data streaming and visualization system for most daily data review and collaboration activities. This system, called Das2, was originally designed in support of the Cassini Radio and Plasma Wave Science (RPWS) investigation, but is now relied on for daily review and analysis of Voyager, Polar, Cluster, Mars Express, Juno and other mission results. In light of current efforts to promote automatic data distribution in space physics, it seems prudent to provide an overview of our open source Das2 programs and interface definitions to the wider community and to recount lessons learned. This submission will provide an overview of the interfaces that define the system, describe the relationship between the Das2 effort and Autoplot, and examine the handling of Cassini RPWS Wideband waveforms and dynamic spectra as examples of dealing with long time-series data sets. In addition, the advantages and limitations of the current Das2 tool set will be discussed, as well as lessons learned that are applicable to other data sharing initiatives. Finally, plans for future developments are outlined, including improved catalogs to support 'no-software' data sources and redundant multi-server failover, as well as new adapters for CSV (Comma Separated Values) and JSON (JavaScript Object Notation) output to support Cassini closeout and the HAPI (Heliophysics Application Programming Interface) initiative.
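
    The adapter idea mentioned at the end is straightforward: stream records are re-emitted as CSV or JSON for HAPI-style clients. The sketch below is a generic illustration and does not reflect the actual Das2 packet format or reader API.

    ```python
    # Generic time-series adapter: records out as CSV or JSON; a sketch only.
    import csv, io, json

    records = [("2017-01-01T00:00:00", 1.2), ("2017-01-01T00:00:01", 1.4)]

    def to_csv(recs):
        buf = io.StringIO()
        w = csv.writer(buf)
        w.writerow(["time", "amplitude"])
        w.writerows(recs)
        return buf.getvalue()

    def to_json(recs):
        return json.dumps([{"time": t, "amplitude": v} for t, v in recs])

    print(to_csv(records))
    print(to_json(records))
    ```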

  7. Helioviewer.org: An Open-source Tool for Visualizing Solar Data

    NASA Astrophysics Data System (ADS)

    Hughitt, V. Keith; Ireland, J.; Schmiedel, P.; Dimitoglou, G.; Mueller, D.; Fleck, B.

    2009-05-01

    As the amount of solar data available to scientists continues to increase at ever-faster rates, it is important that simple tools exist for navigating these data quickly with a minimal amount of effort. By combining heterogeneous solar physics datatypes such as full-disk images and coronagraphs, along with feature and event information, Helioviewer offers a simple and intuitive way to browse multiple datasets simultaneously. Images are stored in a repository using the JPEG 2000 format and tiled dynamically upon a client's request. By tiling images and serving only the portions of the image requested, it is possible for the client to work with very large images without having to fetch all of the data at once. Currently, Helioviewer enables users to browse the entire SOHO data archive, updated hourly, as well as feature/event information from eight different catalogs, including active region, flare, coronal mass ejection, and type II radio burst data. In addition to a focus on intercommunication with other virtual observatories and browsers (VSO, HEK, etc.), Helioviewer will offer a number of externally available application programming interfaces (APIs) to enable easy third-party use, adoption and extension. Future functionality will include: support for additional data sources including TRACE, SDO and STEREO; dynamic movie generation; a navigable timeline of recorded solar events; social annotation; and basic client-side image processing.
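
    The tiling idea can be illustrated in a few lines: the client computes which fixed-size tiles intersect its viewport and requests only those. The endpoint and parameter names below are invented for illustration and are not Helioviewer's documented API.

    ```python
    # Fetch only the tiles a viewport needs; endpoint names are hypothetical.
    import math

    def tiles_for_viewport(x0, y0, x1, y1, tile=512):
        """Tile indices covering a pixel-space viewport."""
        return [(i, j)
                for i in range(math.floor(x0 / tile), math.ceil(x1 / tile))
                for j in range(math.floor(y0 / tile), math.ceil(y1 / tile))]

    BASE = "https://example-helioviewer/tile"          # hypothetical endpoint
    for i, j in tiles_for_viewport(900, 900, 2000, 1500):
        url = f"{BASE}?image=soho_eit_195&zoom=2&x={i}&y={j}"
        # fetch(url) would return one 512x512 JPEG 2000-derived tile
        print(url)
    ```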

  8. Hierarchy Software Development Framework (h-dp-fwk) project

    NASA Astrophysics Data System (ADS)

    Zaytsev, A.

    2010-04-01

    The Hierarchy Software Development Framework provides a lightweight tool for building portable modular applications for performing automated data analysis tasks in batch mode. Design and development activities on the project began in March 2005; from the very beginning it targeted the case of building experimental data processing applications for the CMD-3 experiment, which is being commissioned at the Budker Institute of Nuclear Physics (BINP, Novosibirsk, Russia). Its design addresses the generic case of a modular data processing application operating within a well-defined distributed computing environment. The main features of the framework are modularity; built-in message and data exchange mechanisms; XInclude and XML schema enabled XML configuration management tools; dedicated log management tools; internal debugging tools; support for both dynamic and static module chains; internal DSO version and consistency checking; and a well-defined API for developing specialized frameworks. It is supported on Scientific Linux 4 and 5 and is planned to be ported to other platforms as well. The project is provided with a comprehensive set of technical documentation and users' guides. The licensing scheme for the source code, binaries and documentation implies that the product is free for non-commercial use. Although the development phase is not over and many features are yet to be implemented, the project is considered ready for public use and for creating applications in various fields, including development of event reconstruction software for small and moderate scale HEP experiments.

  9. Real simulation tools in introductory courses: packaging and repurposing our research code.

    NASA Astrophysics Data System (ADS)

    Heagy, L. J.; Cockett, R.; Kang, S.; Oldenburg, D.

    2015-12-01

    Numerical simulations are an important tool for scientific research and for applications in industry. They provide a means to experiment with physics in a tangible, visual way, often providing insights into the problem. Over the last two years, we have been developing course and laboratory materials for an undergraduate geophysics course primarily taken by non-geophysics majors, including engineers and geologists. Our aim is to provide the students with resources to build intuition about geophysical techniques, promote curiosity-driven exploration, and help them develop the skills necessary to communicate across disciplines. Using open-source resources and our existing research code, we have built modules around simulations, with supporting content that gives students interactive tools for exploring the impacts of input parameters and for visualizing the resulting fields, fluxes and data for a variety of problems in applied geophysics, including magnetics, seismic, electromagnetics, and direct current resistivity. The content provides context for the problems, along with exercises aimed at getting students to experiment and ask 'what if...?' questions. In this presentation, we will discuss our approach to designing the structure of the simulation-based modules, the resources we have used, the challenges we have encountered, general feedback from students and instructors, and our goals and roadmap for future improvement. We hope that our experiences and approach will be beneficial to other instructors who aim to put simulation tools in the hands of students.
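
    The module pattern described above amounts to wrapping a small forward simulation in a function of its physical parameters so students can vary them and replot. The buried-dipole example below is a generic stand-in, not the course's actual code.

    ```python
    # Parameter-exploration wrapper around a tiny forward simulation; a sketch.
    import numpy as np
    import matplotlib.pyplot as plt

    def dipole_anomaly(x, depth=5.0, moment=1e3):
        """Vertical field of a buried vertical magnetic dipole along a profile."""
        r2 = x**2 + depth**2
        return moment * (2 * depth**2 - x**2) / r2**2.5

    x = np.linspace(-30, 30, 400)
    for d in (3, 5, 10):                       # "what if the body is deeper?"
        plt.plot(x, dipole_anomaly(x, depth=d), label=f"depth = {d} m")
    plt.xlabel("profile position (m)"); plt.ylabel("anomaly (arb.)")
    plt.legend(); plt.show()
    ```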

  10. Unfortunate Outcomes of a "Funny" Physics Problem: Some Eye-Opening YouTube Comments

    ERIC Educational Resources Information Center

    Slisko, Josip; Dykstra, Dewey, Jr.

    2011-01-01

    The impressions we make as instructors of physics can affect student learning and public perception of physics teachers, physics as an academic subject, and physics as a profession. There are many sources from which we can collect evidence of these impressions. Among these sources are online public forums such as those at the Internet site known…

  11. Learning motion concepts using real-time microcomputer-based laboratory tools

    NASA Astrophysics Data System (ADS)

    Thornton, Ronald K.; Sokoloff, David R.

    1990-09-01

    Microcomputer-based laboratory (MBL) tools have been developed which interface to Apple II and Macintosh computers. Students use these tools to collect physical data that are graphed in real time and then can be manipulated and analyzed. The MBL tools have made possible discovery-based laboratory curricula that embody results from educational research. These curricula allow students to take an active role in their learning and encourage them to construct physical knowledge from observation of the physical world. The curricula encourage collaborative learning by taking advantage of the fact that MBL tools present data in an immediately understandable graphical form. This article describes one of the tools—the motion detector (hardware and software)—and the kinematics curriculum. The effectiveness of this curriculum compared to traditional college and university methods for helping students learn basic kinematics concepts has been evaluated by pre- and post-testing and by observation. There is strong evidence for significantly improved learning and retention by students who used the MBL materials, compared to those taught in lecture.

  12. A better way of fitting clips? A comparative study with respect to physical workload.

    PubMed

    Gaudez, Clarisse; Wild, Pascal; Aublet-Cuvelier, Agnès

    2015-11-01

    The clip-fitting task is a frequently encountered assembly operation in the car industry. It can cause upper limb pain. In laboratory simulations of the task, upper limb muscular activity and external force were compared for four clip-fitting methods: with the bare hand, with an unpowered tool commonly used at a company, and with unpowered and powered prototype tools. None of the four fitting methods studied induced a lower overall workload than the other three. Muscle activity was lower at the dominant limb when using the unpowered tools, and at the non-dominant limb with the bare hand or with the powered tool. Fitting clips with the bare hand required a higher external force than fitting with the three tools. The evaluation of physical workload differed depending on whether external force or muscle activity results were considered. Measuring external force only, as recommended in several standards, is insufficient for evaluating physical workload.

  13. An investigation of current and future satellite and in-situ data for the remote sensing of the land surface energy balance

    NASA Technical Reports Server (NTRS)

    Diak, George R.

    1994-01-01

    This final report from the University of Wisconsin-Madison Cooperative Institute for Meteorological Satellite Studies (CIMSS) summarizes a research program designed to improve our knowledge of the water and energy balance of the land surface through the application of remote sensing and in-situ data sources. The remote sensing data source investigations to be detailed involve surface radiometric ('skin') temperatures and also high-spectral-resolution infrared radiance data from atmospheric sounding instruments projected to be available at the end of the decade, which have shown promising results for evaluating the land-surface water and energy budget. The in-situ data types to be discussed are measurements of the temporal changes of the height of the planetary boundary layer and measurements of air temperature within the planetary boundary layer. Physical models of the land surface, planetary boundary layer and free atmosphere have been used as important tools to interpret the in-situ and remote sensing signals of the surface energy balance. A prototype 'optimal' system for combining multiple data sources into a three-dimensional estimate of the surface energy balance was developed and first results from this system will be detailed. Potential new sources of data for this system and suggested continuation research will also be discussed.

  14. Finite elements numerical codes as primary tool to improve beam optics in NIO1

    NASA Astrophysics Data System (ADS)

    Baltador, C.; Cavenago, M.; Veltri, P.; Serianni, G.

    2017-08-01

    The RF negative ion source NIO1, built at Consorzio RFX in Padua (Italy), is aimed at investigating general issues in ion source physics in view of the full-size ITER injector MITICA, as well as DEMO-relevant solutions, like energy recovery and alternative neutralization systems, crucial for neutral beam injectors in future fusion experiments. NIO1 has been designed to produce 9 H- beamlets (in a 3x3 pattern) of 15 mA each at 60 keV, using a three-electrode system downstream of the plasma source. At the moment the source is in its early operational stage and only operation at low power and low beam energy is possible. In particular, NIO1 has a too-strong set of SmCo co-extraction electron suppression magnets (CESM) in the extraction grid (EG); these will be replaced by a weaker set of ferrite magnets. A completely new set of magnets will also be designed and mounted on the new EG that will be installed next year, replacing the present one. In this paper, the finite element code OPERA 3D is used to investigate the effects of the three sets of magnets on beamlet optics. A comparison of numerical results with measurements will be provided where possible.

  15. Hierarchical Testing with Automated Document Generation for Amanzi, ASCEM's Subsurface Flow and Reactive Transport Simulator

    NASA Astrophysics Data System (ADS)

    Moulton, J. D.; Steefel, C. I.; Yabusaki, S.; Castleton, K.; Scheibe, T. D.; Keating, E. H.; Freedman, V. L.

    2013-12-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments use a graded and iterative approach, beginning with simplified, highly abstracted models and adding geometric and geologic complexity as understanding is gained. To build confidence in this assessment capability, extensive testing of the underlying tools is needed. Since the tools themselves, such as the subsurface flow and reactive-transport simulator Amanzi, are under active development, testing must be both hierarchical and highly automated. In this presentation we show how we have met these requirements by leveraging the Python-based open-source documentation system Sphinx together with several other open-source tools. Sphinx builds on the reStructuredText tool docutils, with important extensions that include high-quality formatting of equations and integrated plotting through matplotlib. This allows the documentation, as well as the input files for tests, benchmark and tutorial problems, to be maintained with the source code under a version control system. In addition, it enables developers to build documentation in several different formats (e.g., html and pdf) from a single source. We will highlight these features and discuss important benefits of this approach for Amanzi. In addition, we'll show that some of ASCEM's other tools, such as the sampling provided by the Uncertainty Quantification toolset, are naturally leveraged to enable more comprehensive testing. Finally, we will highlight the integration of this hierarchical testing and documentation framework with our build system and tools (CMake, CTest, and CDash).
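
    A minimal sketch of the pattern described above: a benchmark script produces a figure and a reStructuredText stub that the Sphinx build pulls in, so the documentation and the test stay in sync under version control. The file names and the benchmark itself are invented for illustration, not Amanzi's actual suite.

    ```python
    # Benchmark script emitting a plot plus a reST stub for the doc build.
    import numpy as np
    import matplotlib.pyplot as plt

    x = np.linspace(0, 1, 50)
    analytic = np.exp(-2 * x)                      # known benchmark solution
    simulated = analytic + 0.01 * np.sin(20 * x)   # stand-in for solver output

    plt.plot(x, analytic, label="analytic")
    plt.plot(x, simulated, "o", ms=3, label="simulated")
    plt.legend()
    plt.savefig("benchmark_decay.png")

    with open("benchmark_decay.rst", "w") as f:
        f.write(".. image:: benchmark_decay.png\n\n"
                f"Max error: {np.max(np.abs(simulated - analytic)):.3e}\n")
    ```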

  16. Rocket Engine Oscillation Diagnostics

    NASA Technical Reports Server (NTRS)

    Nesman, Tom; Turner, James E. (Technical Monitor)

    2002-01-01

    Rocket engine oscillating data can reveal many physical phenomena ranging from unsteady flow and acoustics to rotordynamics and structural dynamics. Because of this, engine diagnostics based on oscillation data should employ both signal analysis and physical modeling. This paper describes an approach to rocket engine oscillation diagnostics, types of problems encountered, and example problems solved. Determination of design guidelines and environments (or loads) from oscillating phenomena is required during initial stages of rocket engine design, while the additional tasks of health monitoring, incipient failure detection, and anomaly diagnostics occur during engine development and operation. Oscillations in rocket engines are typically related to flow driven acoustics, flow excited structures, or rotational forces. Additional sources of oscillatory energy are combustion and cavitation. Included in the example problems is a sampling of signal analysis tools employed in diagnostics. The rocket engine hardware includes combustion devices, valves, turbopumps, and ducts. Simple models of an oscillating fluid system or structure can be constructed to estimate pertinent dynamic parameters governing the unsteady behavior of engine systems or components. In the example problems it is shown that simple physical modeling when combined with signal analysis can be successfully employed to diagnose complex rocket engine oscillatory phenomena.
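
    In this spirit, even a crude combination of a physical stand-in signal and spectral analysis shows how a resonance is located in oscillation data; the sketch below uses made-up frequencies and a simple periodogram.

    ```python
    # Locate a hidden resonance in noisy data with a crude periodogram.
    import numpy as np

    fs, f_res = 5000.0, 820.0                 # sample rate, hidden resonance (Hz)
    t = np.arange(0, 2.0, 1 / fs)
    signal = (np.sin(2 * np.pi * f_res * t)
              + np.random.default_rng(2).normal(0, 2, t.size))

    spec = np.abs(np.fft.rfft(signal)) ** 2   # periodogram
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    print(f"dominant component near {freqs[spec[1:].argmax() + 1]:.0f} Hz")
    ```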

  17. S&MPO - An information system for ozone spectroscopy on the WEB

    NASA Astrophysics Data System (ADS)

    Babikov, Yurii L.; Mikhailenko, Semen N.; Barbe, Alain; Tyuterev, Vladimir G.

    2014-09-01

    Spectroscopy and Molecular Properties of Ozone ("S&MPO") is an Internet-accessible information system devoted to high resolution spectroscopy of the ozone molecule, related properties and data sources. S&MPO contains information on original spectroscopic data (line positions, line intensities, energies, transition moments, spectroscopic parameters) recovered from comprehensive analyses and modeling of experimental spectra, as well as associated software for data representation written in PHP, JavaScript, C++ and Fortran. The line-by-line list of vibration-rotation transitions and other information is organized as a relational database under the control of MySQL database tools. The main goal of S&MPO is to provide access to all available information on vibration-rotation molecular states and transitions under extended conditions, based on extrapolations of laboratory measurements using validated theoretical models. Applications of S&MPO may include: education/training in molecular physics, radiative processes and laser physics; spectroscopic applications (analysis, Fourier transform spectroscopy, atmospheric optics, optical standards, spectroscopic atlases); applications to environmental studies and atmospheric physics (remote sensing); data supply for specific databases; and photochemistry (laser excitation, multiphoton processes). The system is accessible via the Internet on two sites: http://smpo.iao.ru and http://smpo.univ-reims.fr.
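
    A relational line-by-line list of this kind can be pictured with a tiny example: transitions in a table, queried by spectral window and intensity cutoff. The schema and values below are invented for illustration (using SQLite for self-containment) and are not S&MPO's actual MySQL schema.

    ```python
    # Toy relational line list with a window/intensity query; a sketch only.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE transitions (
        position_cm1 REAL, intensity REAL, v1 INT, v2 INT, v3 INT, J INT)""")
    db.executemany("INSERT INTO transitions VALUES (?,?,?,?,?,?)", [
        (1007.64, 3.1e-21, 0, 0, 1, 5),
        (1015.32, 1.4e-22, 0, 0, 1, 12),
        (1042.08, 8.9e-21, 0, 0, 1, 8),
    ])
    rows = db.execute("""SELECT position_cm1, intensity FROM transitions
                         WHERE position_cm1 BETWEEN 1000 AND 1030
                           AND intensity > 1e-22
                         ORDER BY position_cm1""").fetchall()
    print(rows)
    ```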

  18. Overview of MST Research

    NASA Astrophysics Data System (ADS)

    Sarff, J. S.; MST Team

    2011-10-01

    MST progress in advancing the RFP for (1) fusion plasma confinement with minimal external magnetization, (2) toroidal confinement physics, and (3) basic plasma physics is summarized. New tools and diagnostics are accessing physics barely studied in the RFP. Several diagnostic advances are important for ITER/burning plasma. A 1 MW neutral beam injector operates routinely for fast ion, heating, and transport investigations. Energetic ions are also created spontaneously by tearing mode reconnection, reminiscent of astrophysical plasmas. Classical confinement of impurity ions is measured in reduced-tearing plasmas. Fast ion slowing-down is also classical. Alfven-eigenmode-like activity occurs with NBI, but apparently not TAE. Stellarator-like helical structure appears in the core of high current plasmas, with improved confinement characteristics. FIR interferometry, Thomson scattering, and HIBP diagnostics are beginning to explore microturbulence scales, an opportunity to exploit the RFP's high beta and strong magnetic shear parameter space. A programmable power supply for the toroidal field flexibly explores scenarios from advanced inductive profile control to low current tokamak operation. A 1 MW 5.5 GHz source for electron Bernstein wave injection is nearly complete to investigate heating and current drive in over-dense plasmas. Supported by DOE and NSF.

  19. The TensorMol-0.1 model chemistry: a neural network augmented with long-range physics.

    PubMed

    Yao, Kun; Herr, John E; Toth, David W; Mckintyre, Ryker; Parkhill, John

    2018-02-28

    Traditional force fields cannot model chemical reactivity, and suffer from low generality without re-fitting. Neural network potentials promise to address these problems, offering energies and forces with near ab initio accuracy at low cost. However, a data-driven approach is naturally inefficient for long-range interatomic forces that have simple physical formulas. In this manuscript we construct a hybrid model chemistry consisting of a nearsighted neural network potential with screened long-range electrostatic and van der Waals physics. This trained potential, simply dubbed "TensorMol-0.1", is offered in an open-source Python package capable of many of the simulation types commonly used to study chemistry: geometry optimizations, harmonic spectra, open or periodic molecular dynamics, Monte Carlo, and nudged elastic band calculations. We describe the robustness and speed of the package, demonstrating its millihartree accuracy and scalability to tens of thousands of atoms on ordinary laptops. We demonstrate the performance of the model by reproducing vibrational spectra and simulating the molecular dynamics of a protein. Our comparisons with electronic structure theory and experimental data demonstrate that neural network molecular dynamics is poised to become an important tool for molecular simulation, lowering the resource barrier to simulating chemistry.
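
    The hybrid construction can be sketched generically: a short-ranged learned term is smoothly switched over to screened long-range pair physics. The functional forms and constants below are generic illustrative choices, not the exact TensorMol-0.1 forms.

    ```python
    # Short-range "network" term plus switched long-range pair physics; a sketch.
    import numpy as np

    def switch(r, r_on=4.0, r_off=6.0):
        """Smoothly turn on the long-range terms between r_on and r_off (Angstrom)."""
        x = np.clip((r - r_on) / (r_off - r_on), 0.0, 1.0)
        return x * x * (3 - 2 * x)

    def long_range(r, qi, qj, c6=10.0):
        coulomb = qi * qj / r                  # atomic-ish units for brevity
        vdw = -c6 / r**6
        return switch(r) * (coulomb + vdw)

    def short_range_nn(r):
        return np.exp(-r)                      # stand-in for the trained network

    r = np.linspace(1.0, 10.0, 5)
    energy = short_range_nn(r) * (1 - switch(r)) + long_range(r, 0.4, -0.4)
    print(energy)
    ```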

  20. Comparison of Measured to Predicted Estimations of Nonpoint Source Contaminants Using Conservation Practices in an Agriculturally-Dominated Watershed in Northeast Arkansas, USA.

    PubMed

    Frasher, Sarah K; Woodruff, Tracy M; Bouldin, Jennifer L

    2016-06-01

    In efforts to reduce nonpoint source runoff and improve water quality, Best Management Practices (BMPs) were implemented in the Outlet Larkin Creek Watershed. Farmers need to make scientifically informed decisions concerning BMPs addressing contaminants from agricultural fields. The BMP Tool was developed from previous studies to estimate BMP effectiveness at reducing nonpoint source contaminants. The purpose of this study was to compare the measured percent reduction of dissolved phosphorus (DP) and total suspended solids to the reported percent reductions from the BMP Tool for validation. Similarities were measured between the BMP Tool and the measured water quality parameters. Construction of a sedimentation pond resulted in 74 %-76 % reduction in DP as compared to 80 % as predicted with the BMP Tool. However, further research is needed to validate the tool for additional water quality parameters. The BMP Tool is recommended for future BMP implementation as a useful predictor for farmers.

  1. A consistent framework to predict mass fluxes and depletion times for DNAPL contaminations in heterogeneous aquifers under uncertainty

    NASA Astrophysics Data System (ADS)

    Koch, Jonas; Nowak, Wolfgang

    2013-04-01

    At many hazardous waste sites and accidental spills, dense non-aqueous phase liquids (DNAPLs) such as TCE, PCE, or TCA have been released into the subsurface. Once a DNAPL is released into the subsurface, it serves as a persistent source of dissolved-phase contamination. In chronological order, the DNAPL migrates through the porous medium and penetrates the aquifer, it forms a complex pattern of immobile DNAPL saturation, it dissolves into the groundwater and forms a contaminant plume, and it slowly depletes and bio-degrades in the long term. In industrial countries the number of such contaminated sites is tremendously high, to the point that a ranking from most risky to least risky is advisable. Such a ranking helps to decide whether a site needs to be remediated or may be left to natural attenuation. Both the ranking and the design of proper remediation or monitoring strategies require a good understanding of the relevant physical processes and their inherent uncertainty. To this end, we conceptualize a probabilistic simulation framework that estimates probability density functions of mass discharge, source depletion time, and critical concentration values at crucial target locations. Furthermore, it supports the inference of contaminant source architectures from arbitrary site data. As an essential novelty, the mutual dependencies of the key parameters and interacting physical processes are taken into account throughout the whole simulation. In an uncertain and heterogeneous subsurface setting, we identify three key parameter fields: the local velocities, the hydraulic permeabilities and the DNAPL phase saturations. Obviously, these parameters depend on each other during DNAPL infiltration, dissolution and depletion. In order to highlight the importance of these mutual dependencies and interactions, we present results of several model setups in which we vary the physical and stochastic dependencies of the input parameters and simulated processes. Under these changes, the probability density functions show strong shifts in their expected values and in their uncertainty. Considering the uncertainties of all key parameters but neglecting their interactions overestimates the output uncertainty. However, consistently using all available physical knowledge when assigning input parameters and simulating all relevant interactions of the involved processes reduces the output uncertainty significantly, back down to useful and plausible ranges. When using our framework in an inverse setting, omitting a parameter dependency within a crucial physical process would lead to physically meaningless identified parameters. Thus, we conclude that the additional complexity we propose is both necessary and adequate. Overall, our framework provides a tool for reliable and plausible prediction, risk assessment, and model-based decision support for DNAPL-contaminated sites.
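
    The probabilistic logic can be illustrated with a toy Monte Carlo: sample mutually dependent parameters, push each realization through a (here trivial) dissolution model, and read off distributions of the predicted quantities. Every number and correlation below is an invented stand-in, not the paper's simulator.

    ```python
    # Toy Monte Carlo over dependent parameters -> depletion-time distribution.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 5000
    ln_k = rng.normal(-11, 1.0, n)                 # log permeability
    # DNAPL saturation correlated with permeability (infiltration prefers high K)
    sat = np.clip(0.1 + 0.05 * (ln_k + 11) + rng.normal(0, 0.02, n), 0.01, 0.4)
    velocity = np.exp(ln_k + 11) * 1e-6            # toy Darcy velocity (m/s)

    mass0 = sat * 1e3                              # toy initial source mass (kg)
    discharge = velocity * 8.64e4 * 50.0           # toy dissolution rate (kg/day)
    depletion_days = mass0 / discharge

    print("median depletion (days):", np.median(depletion_days))
    print("90% interval:", np.percentile(depletion_days, [5, 95]))
    ```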

  2. Research on OpenStack of open source cloud computing in colleges and universities’ computer room

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Zhang, Dandan

    2017-06-01

    In recent years, cloud computing technology has developed rapidly, especially open source cloud computing. Open source cloud computing has attracted a large number of user groups through the advantages of open source code and low cost, and has now reached large-scale promotion and application. In this paper, we first briefly introduce the main functions and architecture of the open source cloud computing tool OpenStack, and then discuss in depth the core problems of computer labs in colleges and universities. Building on this research, we describe the specific application and deployment of OpenStack in university computer rooms. The experimental results show that the OpenStack tool can efficiently and conveniently deploy a cloud for a university computer room, with stable performance and good functional value.
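
    As one concrete illustration of such a deployment, lab machines can be provisioned programmatically with the openstacksdk client; the cloud name and the image, flavor and network IDs below are placeholders that would come from the specific university cloud.

    ```python
    # Provisioning identical lab VMs with openstacksdk; IDs are placeholders.
    import openstack

    conn = openstack.connect(cloud="university-lab")   # reads clouds.yaml

    for i in range(3):                                 # three identical lab VMs
        server = conn.compute.create_server(
            name=f"lab-desktop-{i}",
            image_id="IMAGE_UUID",                     # placeholder
            flavor_id="FLAVOR_UUID",                   # placeholder
            networks=[{"uuid": "NETWORK_UUID"}],       # placeholder
        )
        conn.compute.wait_for_server(server)
        print(server.name, "is up")
    ```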

  3. Kantowski-Sachs Einstein-æther perfect fluid models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Latta, Joey; Leon, Genly; Paliathanasis, Andronikos, E-mail: lattaj@mathstat.dal.ca, E-mail: genly.leon@pucv.cl, E-mail: anpaliat@phys.uoa.gr

We investigate Kantowski-Sachs models in Einstein-æther theory with a perfect fluid source, using singularity analysis to prove the integrability of the field equations and dynamical systems tools to study the evolution. We find an inflationary source at early times and an inflationary sink at late times for a wide region in the parameter space. The results by A.A. Coley, G. Leon, P. Sandin and J. Latta (JCAP 12 (2015) 010) are then re-obtained as particular cases. Additionally, we select other values for the non-GR parameters which are consistent with current constraints, obtaining a very rich phenomenology. In particular, we find solutions with infinite shear, zero curvature, and infinite matter energy density in comparison with the Hubble scalar. We also have stiff-like future attractors, anisotropic late-time attractors, or both, in some special cases. Such results are developed analytically and then verified numerically. Finally, the physical interpretation of the new critical points is discussed.

  4. Investigative techniques used to locate the liquid hydrogen leakage on the Space Shuttle Main Propulsion System

    NASA Technical Reports Server (NTRS)

    Hammock, William R., Jr.; Cota, Phillip E., Jr.; Rosenbaum, Bernard J.; Barrett, Michael J.

    1991-01-01

    Standard leak detection methods at ambient temperature have been developed in order to prevent excessive leakage from the Space Shuttle liquid oxygen and liquid hydrogen Main Propulsion System. Unacceptable hydrogen leakage was encountered on the Columbia and Atlantis flight vehicles in the summer of 1990 after the standard leak check requirements had been satisfied. The leakage was only detectable when the fuel system was exposed to subcooled liquid hydrogen during External Tank loading operations. Special instrumentation and analytical tools were utilized during a series of propellant tanking tests in order to identify the sources of the hydrogen leakage. After the leaks were located and corrected, the physical characteristics of the leak sources were analyzed in an effort to understand how the discrepancies were introduced and why the leakage had evaded the standard leak detection methods. As a result of the post-leak analysis, corrective actions and leak detection improvements have been implemented in order to preclude a similar incident.

  5. MCNP capabilities for nuclear well logging calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forster, R.A.; Little, R.C.; Briesmeister, J.F.

The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. This paper discusses how the general-purpose continuous-energy Monte Carlo code MCNP (Monte Carlo neutron photon), part of the LARTCS, provides a computational predictive capability for many applications of interest to the nuclear well logging community. The generalized three-dimensional geometry of MCNP is well suited for borehole-tool models. SABRINA, another component of the LARTCS, is a graphics code that can be used to interactively create a complex MCNP geometry. Users can define many source and tally characteristics with standard MCNP features. The time-dependent capability of the code is essential when modeling pulsed sources. Problems with neutrons, photons, and electrons as either single particle or coupled particles can be calculated with MCNP. The physics of neutron and photon transport and interactions is modeled in detail using the latest available cross-section data.

  6. Cross-sectional imaging of extracted jawbone of a pig by optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Tachikawa, Noriko; Yoshimura, Reiko; Ohbayashi, Kohji

    2011-03-01

Dental implantation has become popular in dental treatment. Although careful planning is done before surgery to identify vital structures such as the inferior alveolar nerve or the sinus, as well as the dimensions of the bone, dental implantation is not fully free from risk. If a diagnostic tool were available to objectively measure bone features before surgery and bone dimensions during surgery, a considerable fraction of the risks could be avoided. Optical coherence tomography (OCT), which enables cross-sectional imaging of bone, is a candidate for this purpose. In this work, we performed in vitro cross-sectional imaging of an extracted pig jawbone with swept-source OCT, using a superstructure-grating distributed Bragg reflector (SSG-DBR) laser as the source. The laser's relatively long wavelength of 1600 nm is suitable for deeper bone imaging. We confirmed an image penetration depth of about 3 mm in physical length, which satisfies one of the criteria for applying OCT to in vivo diagnosis of bone during surgery.
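
    For orientation, "physical length" here reflects a standard OCT conversion rather than a result specific to this paper: the measured optical path depth must be divided by the tissue's group refractive index, so with an assumed group index of about 1.5 for bone,

        \[
        z_{\mathrm{phys}} = \frac{z_{\mathrm{opt}}}{n_g},
        \qquad
        z_{\mathrm{opt}} \approx 1.5 \times 3\ \mathrm{mm} = 4.5\ \mathrm{mm}
        \]

    of optical depth corresponds to the quoted 3 mm of physical penetration.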

  7. The MCUCN simulation code for ultracold neutron physics

    NASA Astrophysics Data System (ADS)

    Zsigmond, G.

    2018-02-01

Ultracold neutrons (UCN) have very low kinetic energies of 0-300 neV and can therefore be stored in material or magnetic confinements for many hundreds of seconds. This makes them a very useful tool for probing fundamental symmetries of nature (for instance, charge-parity violation, via neutron electric dipole moment experiments) and for contributing important parameters to Big Bang nucleosynthesis (neutron lifetime measurements). Improved precision experiments are under construction at new and planned UCN sources around the world. MC simulations play an important role in the optimization of such systems with a large number of parameters, but also in the estimation of systematic effects, in the benchmarking of analysis codes, and as part of the analysis itself. The MCUCN code written at PSI has been used extensively for the optimization of the UCN source optics and in the optimization and analysis of (test) experiments within the nEDM project based at PSI. In this paper we present the main features of MCUCN and interesting benchmark and application examples.
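
    To make the quoted energy scale concrete (a textbook conversion, not taken from the paper): a neutron at the 300 neV upper bound moves at only

        \[
        v = \sqrt{\frac{2E}{m_n}}
          = \sqrt{\frac{2 \times 300 \times 10^{-9} \times 1.602 \times 10^{-19}\,\mathrm{J}}
                       {1.675 \times 10^{-27}\,\mathrm{kg}}}
          \approx 7.6\ \mathrm{m\,s^{-1}},
        \]

    slow enough that material bottles, magnetic fields, and even gravity (roughly 102 neV per metre of height) can confine it, which is what makes the storage experiments simulated by MCUCN possible.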

  8. Teaching School Physics. A UNESCO Source Book.

    ERIC Educational Resources Information Center

    Lewis, John L., Ed.

    This UNESCO source book on teaching physics in schools provides a synthesis of views and policies prevalent throughout the world with respect to physics education. The book's contents are contributed by educators from several nations who have been able to give an international outlook in the discussion of various aspects of physics education. The…

  9. Weathering the Storm: Developing a Spatial Data Infrastructure and Online Research Platform for Oil Spill Preparedness

    NASA Astrophysics Data System (ADS)

    Bauer, J. R.; Rose, K.; Romeo, L.; Barkhurst, A.; Nelson, J.; Duran-Sesin, R.; Vielma, J.

    2016-12-01

Efforts to prepare for and reduce the risk of hazards, from both natural and anthropogenic sources, that threaten our oceans and coasts require an understanding of the dynamics of, and interactions between, the physical, ecological, and socio-economic systems. Understanding these coupled dynamics is essential as offshore oil & gas exploration and production continue to push into harsher, more extreme environments where risks and uncertainty increase. However, working with these large, complex data from various sources and scales to assess risks and potential impacts associated with offshore energy exploration and production poses several challenges to research. To address these challenges, an integrated assessment model (IAM) was developed at the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) that combines a spatial data infrastructure and an online research platform to manage, process, analyze, and share the large, multidimensional datasets, research products, and tools and models used to evaluate risk and reduce uncertainty for the entire offshore system, from the subsurface, through the water column, to coastal ecosystems and communities. Here, we will discuss the spatial data infrastructure and online research platform, NETL's Energy Data eXchange (EDX), that underpin the offshore IAM, providing information on how the framework combines multidimensional spatial data and spatio-temporal tools to evaluate risks across the complex matrix of potential environmental, social, and economic impacts stemming from modeled offshore hazard scenarios, such as oil spills or hurricanes. In addition, we will discuss the online analytics, tools, and visualization methods integrated into this framework that support availability of and access to data, and that allow for rapid analysis and effective communication of analytical results to aid a range of decision-making needs.

  10. Contribution of Patient Interviews as Part of a Comprehensive Approach to the Identification of Drug-Related Problems on Geriatric Wards.

    PubMed

    Stämpfli, Dominik; Boeni, Fabienne; Gerber, Andy; Bättig, Victor A D; Hersberger, Kurt E; Lampert, Markus L

    2018-06-19

Inappropriate prescribing is linked to increased risks of adverse drug reactions and hospitalisation. Combining explicit and implicit criteria of inappropriate prescribing with the information obtained in patient interviews appears beneficial for the identification of drug-related problems (DRPs) in hospitalised patients. We aimed to investigate the inclusion of pharmacist interviews as part of medication reviews (including the use of explicit and implicit criteria of inappropriate prescribing) to identify DRPs in older inpatients. Clinical medication reviews were performed on geriatric and associated physical and neurological rehabilitation wards in a regional secondary care hospital. Data from electronic medical records, laboratory data, and current treatment regimens were complemented with a novel structured patient interview performed by a clinical pharmacist. The structured interview questioned patients on administration issues, prescribed medication, self-medication, and allergies. The reviews included the use of current treatment guidelines, the Medication Appropriateness Index, the Screening Tool of Older People's Prescriptions (STOPP, v2), and the Screening Tool to Alert to Right Treatment (START, v2). The potential relevance of the DRPs was estimated using the German version of the CLEO tool. In 110 patients, 595 DRPs were identified, averaging 5.4 per patient (range 0-17). The structured interviews identified 249 DRPs (41.8%), of which 227 were not identified by any other source of information. The majority of the DRPs identified by patient interview (213/249, i.e., 85.5%) were estimated to be of minor clinical relevance (i.e., limited adherence, knowledge, quality of life, or satisfaction). We demonstrated that structured patient interviews identified additional DRPs that other sources did not. Embedded within a comprehensive approach, the structured patient interviews were needed as the data source for over one-third of all DRPs.

  11. Creative computing with Landlab: an open-source toolkit for building, coupling, and exploring two-dimensional numerical models of Earth-surface dynamics

    NASA Astrophysics Data System (ADS)

    Hobley, Daniel E. J.; Adams, Jordan M.; Nudurupati, Sai Siddhartha; Hutton, Eric W. H.; Gasparini, Nicole M.; Istanbulluoglu, Erkan; Tucker, Gregory E.

    2017-01-01

    The ability to model surface processes and to couple them to both subsurface and atmospheric regimes has proven invaluable to research in the Earth and planetary sciences. However, creating a new model typically demands a very large investment of time, and modifying an existing model to address a new problem typically means the new work is constrained to its detriment by model adaptations for a different problem. Landlab is an open-source software framework explicitly designed to accelerate the development of new process models by providing (1) a set of tools and existing grid structures - including both regular and irregular grids - to make it faster and easier to develop new process components, or numerical implementations of physical processes; (2) a suite of stable, modular, and interoperable process components that can be combined to create an integrated model; and (3) a set of tools for data input, output, manipulation, and visualization. A set of example models built with these components is also provided. Landlab's structure makes it ideal not only for fully developed modelling applications but also for model prototyping and classroom use. Because of its modular nature, it can also act as a platform for model intercomparison and epistemic uncertainty and sensitivity analyses. Landlab exposes a standardized model interoperability interface, and is able to couple to third-party models and software. Landlab also offers tools to allow the creation of cellular automata, and allows native coupling of such models to more traditional continuous differential equation-based modules. We illustrate the principles of component coupling in Landlab using a model of landform evolution, a cellular ecohydrologic model, and a flood-wave routing model.
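
    The component-coupling pattern described above looks roughly like this in practice; a minimal sketch against the current Landlab 2.x API, with grid size, diffusivity, and time step chosen arbitrarily:

        import numpy as np
        from landlab import RasterModelGrid
        from landlab.components import LinearDiffuser

        # Build a regular raster grid: 50 x 80 nodes, 10 m spacing.
        grid = RasterModelGrid((50, 80), xy_spacing=10.0)

        # Attach an elevation field at the grid nodes and seed it with
        # random roughness.
        z = grid.add_zeros("topographic__elevation", at="node")
        z += np.random.default_rng(0).random(grid.number_of_nodes)

        # One interoperable process component: linear hillslope diffusion.
        diffuser = LinearDiffuser(grid, linear_diffusivity=0.01)  # m^2/yr

        for _ in range(1000):            # evolve for 1000 steps of 100 yr
            diffuser.run_one_step(100.0)

    Swapping in or chaining further components (flow routing, stream-power erosion, and so on) follows the same instantiate-then-run_one_step pattern, which is what makes model coupling and intercomparison cheap.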

  12. Using Functional Languages and Declarative Programming to analyze ROOT data: LINQtoROOT

    NASA Astrophysics Data System (ADS)

    Watts, Gordon

    2015-05-01

Modern high energy physics analysis is complex. It typically requires multiple passes over different datasets and is often held together with a series of scripts and programs. For example, one has to first reweight the jet energy spectrum in Monte Carlo to match data before plots of any other jet-related variable can be made. This requires a pass over the Monte Carlo and the data to derive the reweighting, and then another pass over the Monte Carlo to plot the variables the analyser is really interested in. With most modern ROOT-based tools this requires separate analysis loops for each pass, plus script files to glue the results of the two analysis loops together. A framework has been developed that uses the functional and declarative features of the C# language and its Language Integrated Query (LINQ) extensions to declare the analysis. The framework uses language tools to convert the analysis into C++ and runs ROOT or PROOF as a backend to get the results. This gives the analyser the full power of an object-oriented programming language to put together the analysis, and at the same time the speed of C++ for the analysis loop. The tool allows one to incorporate C++ algorithms written for ROOT by others. A by-product of the design is the ability to cache results between runs, dramatically reducing the cost of adding one more plot, and also to keep a complete record associated with each plot for data preservation reasons. The code is mature enough to have been used in ATLAS analyses. The package is open source and available on the open source site CodePlex.
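
    The two-pass reweighting pattern described above, reduced to a language-neutral illustration: LINQtoROOT itself expresses this declaratively in C# and compiles it to C++ over ROOT trees, whereas this sketch uses Python with synthetic stand-in arrays:

        import numpy as np

        rng = np.random.default_rng(1)
        data_jet_pt = rng.exponential(60.0, 50_000)  # stand-in for data ntuple
        mc_jet_pt = rng.exponential(50.0, 50_000)    # stand-in for Monte Carlo

        # Pass 1: derive per-bin weights mapping the MC jet-pT spectrum
        # onto data.
        bins = np.linspace(0.0, 300.0, 31)
        h_data, _ = np.histogram(data_jet_pt, bins=bins)
        h_mc, _ = np.histogram(mc_jet_pt, bins=bins)
        w_bin = np.divide(h_data, h_mc, out=np.zeros(len(bins) - 1),
                          where=h_mc > 0)

        # Pass 2: re-read the MC and fill any further histogram with the
        # derived weights.
        idx = np.clip(np.digitize(mc_jet_pt, bins) - 1, 0, len(bins) - 2)
        h_reweighted, _ = np.histogram(mc_jet_pt, bins=bins, weights=w_bin[idx])

    The caching the abstract mentions amounts to keeping the pass-1 weights keyed by their inputs, so that adding one more pass-2 plot does not re-trigger pass 1.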

  13. SOURCE APPORTIONMENT RESULTS, UNCERTAINTIES, AND MODELING TOOLS

    EPA Science Inventory

    Advanced multivariate receptor modeling tools are available from the U.S. Environmental Protection Agency (EPA) that use only speciated sample data to identify and quantify sources of air pollution. EPA has developed both EPA Unmix and EPA Positive Matrix Factorization (PMF) and ...

  14. iPhone forensics based on Macintosh open source and freeware tools

    NASA Astrophysics Data System (ADS)

    Höne, Thomas; Creutzburg, Reiner

    2011-02-01

The aim of this article is to show the usefulness of Mac OS X based open source tools for the forensic investigation of modern iPhones. It demonstrates how important data stored in the iPhone are investigated. Two different investigation scenarios are presented that are well suited for forensics lab work at a university. This work shows how to analyze an Apple iPhone using open source and freeware tools. Important data used in a forensic investigation that may be stored on a mobile device are presented. The internal structure and functions of the iPhone are also explained.

  15. Selection and application of microbial source tracking tools for water-quality investigations

    USGS Publications Warehouse

    Stoeckel, Donald M.

    2005-01-01

    Microbial source tracking (MST) is a complex process that includes many decision-making steps. Once a contamination problem has been defined, the potential user of MST tools must thoroughly consider study objectives before deciding upon a source identifier, a detection method, and an analytical approach to apply to the problem. Regardless of which MST protocol is chosen, underlying assumptions can affect the results and interpretation. It is crucial to incorporate tests of those assumptions in the study quality-control plan to help validate results and facilitate interpretation. Detailed descriptions of MST objectives, protocols, and assumptions are provided in this report to assist in selection and application of MST tools for water-quality investigations. Several case studies illustrate real-world applications of MST protocols over a range of settings, spatial scales, and types of contamination. Technical details of many available source identifiers and detection methods are included as appendixes. By use of this information, researchers should be able to formulate realistic expectations for the information that MST tools can provide and, where possible, successfully execute investigations to characterize sources of fecal contamination to resource waters.

  16. John Wheeler, 1952 - 1976: Black Holes and Geometrodynamics

    NASA Astrophysics Data System (ADS)

    Thorne, Kip S.

    2009-05-01

In 1952 John Wheeler turned his attention from nuclear physics and national defense to a backwater of physics: general relativity. Over the next 25 years, with students and postdocs he led a ``revolution'' that made relativity a major subfield of fundamental physics and a tool for astrophysics. Wheeler viewed curved spacetime as a nonlinear dynamical entity, to be studied via tools of geometrodynamics (by analogy with electrodynamics) -- including numerical relativity, for which his students laid the earliest foundations. With Joseph Weber (his postdoc), he did theoretical work on gravitational waves that helped launch Weber on a career of laying foundations for modern gravitational-wave detectors. Wheeler and his students showed compellingly that massive stars must form black holes; and he gave black holes their name, formulated the theory of their pulsations and stability (with Tullio Regge), and mentored several generations of students in seminal black-hole research (including Jacob Bekenstein's black-hole entropy). Before the discovery of pulsars, Wheeler identified magnetized, spinning neutron stars as the likely power sources for supernova remnants including the Crab nebula. He identified the Planck length and time as the characteristic scales for the laws of quantum gravity, and formulated the concept of quantum fluctuations of spacetime geometry and quantum foam. With Bryce DeWitt, he defined a quantum wave function on the space of 3-geometries and derived the Wheeler-DeWitt equation that governs it, and its sum-over-histories action principle. Wheeler was a great inspiration to his colleagues and students, pointing the directions toward fruitful research problems and making intuitive-leap speculations about what lies beyond the frontiers of knowledge. Many of his ideas that sounded crazy at the time were ``just crazy enough to be right''.

  17. Analyzing petabytes of data with Hadoop

    ScienceCinema

    Hammerbacher, Jeff

    2018-05-14

The open source Apache Hadoop project provides a powerful suite of tools for storing and analyzing petabytes of data using commodity hardware. After several years of production use inside of web companies like Yahoo! and Facebook and nearly a year of commercial support and development by Cloudera, the technology is spreading rapidly through other disciplines, from financial services and government to life sciences and high energy physics. The talk will motivate the design of Hadoop and discuss some key implementation details in depth. It will also cover the major subprojects in the Hadoop ecosystem, go over some example applications, highlight best practices for deploying Hadoop in your environment, discuss plans for the future of the technology, and provide pointers to the many resources available for learning more. In addition to providing more information about the Hadoop platform, a major goal of this talk is to begin a dialogue with the ATLAS research team on how the tools commonly used in their environment compare to Hadoop, and how Hadoop could improve to better serve the high energy physics community. Short Biography: Jeff Hammerbacher is Vice President of Products and Chief Scientist at Cloudera. Jeff was an Entrepreneur in Residence at Accel Partners immediately prior to founding Cloudera. Before Accel, he conceived, built, and led the Data team at Facebook. The Data team was responsible for driving many of the applications of statistics and machine learning at Facebook, as well as building out the infrastructure to support these tasks for massive data sets. The team produced two open source projects: Hive, a system for offline analysis built above Hadoop, and Cassandra, a structured storage system on a P2P network. Before joining Facebook, Jeff was a quantitative analyst on Wall Street. Jeff earned his Bachelor's Degree in Mathematics from Harvard University and recently served as contributing editor to the book "Beautiful Data", published by O'Reilly in July 2009.

  19. Open source cardiology electronic health record development for DIGICARDIAC implementation

    NASA Astrophysics Data System (ADS)

    Dugarte, Nelson; Medina, Rubén.; Huiracocha, Lourdes; Rojas, Rubén.

    2015-12-01

This article presents the development of a Cardiology Electronic Health Record (CEHR) system. The software consists of a structured algorithm designed under Health Level-7 (HL7) international standards. The novelty of the system is the integration of high-resolution ECG (HRECG) signal acquisition and processing tools, patient information management tools, and telecardiology tools. The acquisition tools manage and control the DIGICARDIAC electrocardiograph functions. The processing tools support HRECG signal analysis in search of patterns indicative of cardiovascular pathologies. The incorporation of telecardiology tools allows the system to communicate with other health care centers, decreasing access time to patient information. The CEHR system was developed entirely with open source software. Preliminary validation results showed the system's efficiency.

  20. Open source bioimage informatics for cell biology

    PubMed Central

    Swedlow, Jason R.; Eliceiri, Kevin W.

    2009-01-01

Significant technical advances in imaging, molecular biology and genomics have fueled a revolution in cell biology, in that the molecular and structural processes of the cell are now visualized and measured routinely. Driving much of this recent development has been the advent of computational tools for the acquisition, visualization, analysis and dissemination of these datasets. These tools collectively make up a new subfield of computational biology called bioimage informatics, which is facilitated by open source approaches. We discuss why open source tools for image informatics in cell biology are needed, describe some of the key general attributes that make an open source imaging application successful, and point to opportunities for further interoperability that should greatly accelerate future cell biology discovery. PMID:19833518

  1. Open-source image registration for MRI-TRUS fusion-guided prostate interventions.

    PubMed

    Fedorov, Andriy; Khallaghi, Siavash; Sánchez, C Antonio; Lasso, Andras; Fels, Sidney; Tuncali, Kemal; Sugar, Emily Neubauer; Kapur, Tina; Zhang, Chenxi; Wells, William; Nguyen, Paul L; Abolmaesumi, Purang; Tempany, Clare

    2015-06-01

We propose two software tools for non-rigid registration of MRI and transrectal ultrasound (TRUS) images of the prostate. Our ultimate goal is to develop an open-source solution to support MRI-TRUS fusion image guidance of prostate interventions, such as targeted biopsy for prostate cancer detection and focal therapy. It is widely hypothesized that image registration is an essential component of such systems. The two non-rigid registration methods are: (1) a deformable registration of the prostate segmentation distance maps with B-spline regularization, and (2) a finite element-based deformable registration of the segmentation surfaces in the presence of partial data. We evaluate the methods retrospectively using clinical patient image data collected during standard clinical procedures. Computation time and the Target Registration Error (TRE) calculated at expert-identified anatomical landmarks were used as quantitative measures for the evaluation. The presented image registration tools were capable of completing a deformable registration computation within 5 min. Average TRE was approximately 3 mm for both methods, which is comparable with the slice thickness of our MRI data. Both tools are available under a nonrestrictive open-source license. We release open-source tools that may be used for registration during MRI-TRUS-guided prostate interventions. Our tools implement novel registration approaches and produce acceptable registration results. We believe these tools will lower the barriers to the development and deployment of interventional research solutions and facilitate comparison with similar tools.
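
    The released tools are integrated in 3D Slicer; as an independent sketch of the first method's idea, deformable B-spline registration of two segmentation distance maps can be written with SimpleITK (an unrelated library), where the file names and control-point mesh size are hypothetical placeholders:

        import SimpleITK as sitk

        # Signed distance maps of the prostate segmentations (placeholder files).
        fixed = sitk.ReadImage("mri_distance_map.nrrd", sitk.sitkFloat32)
        moving = sitk.ReadImage("trus_distance_map.nrrd", sitk.sitkFloat32)

        # A B-spline transform with a coarse control-point mesh regularizes
        # the deformation.
        tx = sitk.BSplineTransformInitializer(fixed, [8, 8, 8])

        reg = sitk.ImageRegistrationMethod()
        reg.SetMetricAsMeanSquares()  # distance maps make intensities comparable
        reg.SetInterpolator(sitk.sitkLinear)
        reg.SetOptimizerAsLBFGSB(gradientConvergenceTolerance=1e-5,
                                 numberOfIterations=200)
        reg.SetInitialTransform(tx, inPlace=True)

        out_tx = reg.Execute(fixed, moving)
        sitk.WriteTransform(out_tx, "mri_to_trus_bspline.tfm")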

  2. Pipe dream? Envisioning a grassroots Python ecosystem of open, common software tools and data access in support of river and coastal biogeochemical research (Invited)

    NASA Astrophysics Data System (ADS)

    Mayorga, E.

    2013-12-01

    Practical, problem oriented software developed by scientists and graduate students in domains lacking a strong software development tradition is often balkanized into the scripting environments provided by dominant, typically proprietary tools. In environmental fields, these tools include ArcGIS, Matlab, SAS, Excel and others, and are often constrained to specific operating systems. While this situation is the outcome of rational choices, it limits the dissemination of useful tools and their integration into loosely coupled frameworks that can meet wider needs and be developed organically by groups addressing their own needs. Open-source dynamic languages offer the advantages of an accessible programming syntax, a wealth of pre-existing libraries, multi-platform access, linkage to community libraries developed in lower level languages such as C or FORTRAN, and access to web service infrastructure. Python in particular has seen a large and increasing uptake in scientific communities, as evidenced by the continued growth of the annual SciPy conference. Ecosystems with distinctive physical structures and organization, and mechanistic processes that are well characterized, are both factors that have often led to the grass-roots development of useful code meeting the needs of a range of communities. In aquatic applications, examples include river and watershed analysis tools (River Tools, Taudem, etc), and geochemical modules such as CO2SYS, PHREEQ and LOADEST. I will review the state of affairs and explore the potential offered by a Python tool ecosystem in supporting aquatic biogeochemistry and water quality research. This potential is multi-faceted and broadly involves accessibility to lone grad students, access to a wide community of programmers and problem solvers via online resources such as StackExchange, and opportunities to leverage broader cyberinfrastructure efforts and tools, including those from widely different domains. Collaborative development of such tools can provide the additional advantage of enhancing cohesion and communication across specific research areas, and reducing research obstacles in a range of disciplines.

  3. Belle2VR: A Virtual-Reality Visualization of Subatomic Particle Physics in the Belle II Experiment.

    PubMed

    Duer, Zach; Piilonen, Leo; Glasson, George

    2018-05-01

    Belle2VR is an interactive virtual-reality visualization of subatomic particle physics, designed by an interdisciplinary team as an educational tool for learning about and exploring subatomic particle collisions. This article describes the tool, discusses visualization design decisions, and outlines our process for collaborative development.

  4. Statistical Physics in the Era of Big Data

    ERIC Educational Resources Information Center

    Wang, Dashun

    2013-01-01

With the wealth of data provided by a wide range of high-throughput measurement tools and technologies, statistical physics of complex systems is entering a new phase, impacting in a meaningful fashion a wide range of fields, from cell biology to computer science to economics. In this dissertation, by applying tools and techniques developed in…

  5. TMDlib and TMDplotter: library and plotting tools for transverse-momentum-dependent parton distributions

    NASA Astrophysics Data System (ADS)

    Hautmann, F.; Jung, H.; Krämer, M.; Mulders, P. J.; Nocera, E. R.; Rogers, T. C.; Signori, A.

    2014-12-01

Transverse-momentum-dependent distributions (TMDs) are extensions of collinear parton distributions and are important in high-energy physics from both theoretical and phenomenological points of view. In this manual we introduce the library TMDlib, a tool to collect transverse-momentum-dependent parton distribution functions (TMD PDFs) and fragmentation functions (TMD FFs), together with an online plotting tool, TMDplotter. We provide a description of the program components and of the different physical frameworks the user can access via the available parameterisations.

  7. Tools, information sources, and methods used in deciding on drug availability in HMOs.

    PubMed

    Barner, J C; Thomas, J

    1998-01-01

    The use and importance of specific decision-making tools, information sources, and drug-use management methods in determining drug availability and use in HMOs were studied. A questionnaire was sent to 303 randomly selected HMOs. Respondents were asked to rate their use of each of four formal decision-making tools and its relative importance, as well as the use and importance of eight information sources and 11 methods for managing drug availability and use, on a 5-point scale. The survey response rate was 28%. Approximately half of the respondents reported that their HMOs used decision analysis or multiattribute analysis in deciding on drug availability. If used, these tools were rated as very important. There were significant differences in levels of use by HMO type, membership size, and age. Journal articles and reference books were reported most often as information sources. Retrospective drug-use review was used very often and perceived to be very important in managing drug use. Other management methods were used only occasionally, but the importance placed on these tools when used ranged from moderately to very important. Older organizations used most of the management methods more often than did other HMOs. Decision analysis and multiattribute analysis were the most commonly used tools for deciding on which drugs to make available to HMO members, and reference books and journal articles were the most commonly used information sources. Retrospective and prospective drug-use reviews were the most commonly applied methods for managing HMO members' access to drugs.

  8. Online Databases in Physics.

    ERIC Educational Resources Information Center

    Sievert, MaryEllen C.; Verbeck, Alison F.

    1984-01-01

This overview of 47 online sources for physics information available in the United States--including sub-field databases, transdisciplinary databases, and multidisciplinary databases--notes content, print source, language, time coverage, and databank. Two discipline-specific databases (SPIN and PHYSICS BRIEFS) are also discussed. (EJS)

  9. Using Open Source Software in Visual Simulation Development

    DTIC Science & Technology

    2005-09-01

    increased the use of the technology in training activities. Using open source/free software tools in the process can expand these possibilities...resulting in even greater cost reduction and allowing the flexibility needed in a training environment. This thesis presents a configuration and architecture...to be used when developing training visual simulations using both personal computers and open source tools. Aspects of the requirements needed in a

  10. Physics of vascular brachytherapy.

    PubMed

    Jani, S K

    1999-08-01

Basic physics plays an important role in understanding the clinical utility of radioisotopes in brachytherapy. Vascular brachytherapy is a unique application of localized radiation in that dose levels very close to the source are employed to treat tissues within the arterial wall. This article covers the basic physics of radioactivity and differentiates between beta and gamma radiation. Physical parameters such as activity, half-life, exposure, and absorbed dose are explained. Finally, the dose distribution around a point source and a linear source is described. The principles of basic physics are likely to play an important role in shaping the emerging technology and its application in vascular brachytherapy.
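
    The geometric part of that dose distribution can be stated compactly (textbook behaviour, neglecting attenuation and scatter): for an idealized point source the dose rate falls as the inverse square of the distance, while close to a linear source of active length L it falls roughly as the inverse first power,

        \[
        \dot{D}_{\mathrm{point}}(r) \propto \frac{1}{r^{2}},
        \qquad
        \dot{D}_{\mathrm{line}}(r) \propto \frac{1}{r} \quad (r \ll L),
        \]

    which is why dose gradients are so steep at the millimetre distances relevant to the arterial wall.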

  11. Clio's Workshop: Resources for Historical Studies in American Librarianship.

    ERIC Educational Resources Information Center

    Tucker, John Mark

    2000-01-01

    Identifies tools most useful in the library historian's workshop, particularly those of a bibliographic, reference, or documentary nature. Highlights include bibliographic tools and the formative years of librarianship; sources of current scholarship; reference sources, including minority voices; documentary materials; and historians in…

  12. Measurement of obesity prevention in childcare settings: A systematic review of current instruments.

    PubMed

    Stanhope, Kaitlyn K; Kay, Christi; Stevenson, Beth; Gazmararian, Julie A

    The incidence of childhood obesity is highest among children entering kindergarten. Overweight and obesity in early childhood track through adulthood. Programs increasingly target children in early life for obesity prevention. However, the published literature lacks a review on tools available for measuring behaviour and environmental level change in child care. The objective is to describe measurement tools currently in use in evaluating obesity-prevention in preschool-aged children. Literature searches were conducted in PubMed using the keywords "early childhood obesity," "early childhood measurement," "early childhood nutrition" and "early childhood physical activity." Inclusion criteria included a discussion of: (1) obesity prevention, risk assessment or treatment in children ages 1-5 years; and (2) measurement of nutrition or physical activity. One hundred thirty-four publications were selected for analysis. Data on measurement tools, population and outcomes were abstracted into tables. Tables are divided by individual and environmental level measures and further divided into physical activity, diet and physical health outcomes. Recommendations are made for weighing advantages and disadvantages of tools. Despite rising numbers of interventions targeting obesity-prevention and treatment in preschool-aged children, there is no consensus for which tools represent a gold standard or threshold of accuracy. Copyright © 2016 Asia Oceania Association for the Study of Obesity. Published by Elsevier Ltd. All rights reserved.

  13. Tools for open geospatial science

    NASA Astrophysics Data System (ADS)

    Petras, V.; Petrasova, A.; Mitasova, H.

    2017-12-01

    Open science uses open source to deal with reproducibility challenges in data and computational sciences. However, just using open source software or making the code public does not make the research reproducible. Moreover, the scientists face the challenge of learning new unfamiliar tools and workflows. In this contribution, we will look at a graduate-level course syllabus covering several software tools which make validation and reuse by a wider professional community possible. For the novices in the open science arena, we will look at how scripting languages such as Python and Bash help us reproduce research (starting with our own work). Jupyter Notebook will be introduced as a code editor, data exploration tool, and a lab notebook. We will see how Git helps us not to get lost in revisions and how Docker is used to wrap all the parts together using a single text file so that figures for a scientific paper or a technical report can be generated with a single command. We will look at examples of software and publications in the geospatial domain which use these tools and principles. Scientific contributions to GRASS GIS, a powerful open source desktop GIS and geoprocessing backend, will serve as an example of why and how to publish new algorithms and tools as part of a bigger open source project.

  14. New tools for sculpting cranial implants in a shared haptic augmented reality environment.

    PubMed

    Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary

    2006-01-01

    New volumetric tools were developed for the design and fabrication of high quality cranial implants from patient CT data. These virtual tools replace time consuming physical sculpting, mold making and casting steps. The implant is designed by medical professionals in tele-immersive collaboration. Virtual clay is added in the virtual defect area on the CT data using the adding tool. With force feedback the modeler can feel the edge of the defect and fill only the space where no bone is present. A carving tool and a smoothing tool are then used to sculpt and refine the implant. To make a physical evaluation, the skull with simulated defect and the implant are fabricated via stereolithography to allow neurosurgeons to evaluate the quality of the implant. Initial tests demonstrate a very high quality fit. These new haptic volumetric sculpting tools are a critical component of a comprehensive tele-immersive system.

  15. Experiments of multichannel least-square methods for sound field reproduction inside aircraft mock-up: Objective evaluations

    NASA Astrophysics Data System (ADS)

    Gauthier, P.-A.; Camier, C.; Lebel, F.-A.; Pasco, Y.; Berry, A.; Langlois, J.; Verron, C.; Guastavino, C.

    2016-08-01

Sound environment reproduction of various flight conditions in aircraft mock-ups is a valuable tool for the study, prediction, demonstration, and jury testing of interior aircraft sound quality and annoyance. To provide a faithfully reproduced sound environment, time, frequency, and spatial characteristics should be preserved. Physical sound field reproduction methods for spatial sound reproduction are mandatory to immerse the listener's body in the proper sound fields so that localization cues are recreated at the listener's ears. Vehicle mock-ups pose specific problems for sound field reproduction: confined spaces, the need for invisible sound sources, and a very specific acoustical environment make open-loop sound field reproduction technologies such as wave field synthesis (based on free-field models of monopole sources) not ideal. In this paper, experiments in an aircraft mock-up with multichannel least-square methods and equalization are reported. The novelty is the actual implementation of sound field reproduction with 3180 transfer paths and trim-panel reproduction sources in laboratory conditions with a synthetic target sound field. The paper presents objective evaluations of reproduced sound fields using various metrics, as well as sound field extrapolation and sound field characterization.
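
    In the usual frequency-domain formulation of multichannel least-square reproduction (a generic statement of the method; the paper's exact cost function may include additional equalization terms), the source driving signals q minimize the regularized error between the reproduced pressures Hq and the target pressures at the control microphones:

        \[
        \mathbf{q}(\omega) = \arg\min_{\mathbf{q}}
        \left\| \mathbf{H}(\omega)\,\mathbf{q} - \mathbf{p}_{\mathrm{target}}(\omega) \right\|_2^2
        + \lambda \left\| \mathbf{q} \right\|_2^2
        \;\;\Rightarrow\;\;
        \mathbf{q} = \left( \mathbf{H}^{\mathsf{H}}\mathbf{H} + \lambda\mathbf{I} \right)^{-1}
        \mathbf{H}^{\mathsf{H}} \mathbf{p}_{\mathrm{target}},
        \]

    where H collects the measured transfer paths (3180 of them here) and the regularization weight λ trades reproduction accuracy against driving effort and robustness.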

  16. The Approximate Bayesian Computation methods in the localization of the atmospheric contamination source

    NASA Astrophysics Data System (ADS)

    Kopka, P.; Wawrzynczak, A.; Borysiewicz, M.

    2015-09-01

In many areas of application, a central problem is the solution of an inverse problem, in particular the precise estimation of unknown parameters of a model of the underlying dynamics of a physical system. In this situation, Bayesian inference is a powerful tool for combining observed data with prior knowledge to obtain the probability distribution of the searched-for parameters. We have applied the modern methodology named Sequential Approximate Bayesian Computation (S-ABC) to the problem of tracing an atmospheric contaminant source. ABC is a technique commonly used in the Bayesian analysis of complex models and dynamical systems. Sequential methods can significantly increase the efficiency of the ABC. In the presented algorithm, the input data are the concentrations of the released substance arriving online, registered by the distributed sensor network of the OVER-LAND ATMOSPHERIC DISPERSION (OLAD) experiment. The algorithm outputs are the probability distributions of the contamination source parameters, i.e., its location, release rate, speed and direction of movement, start time, and duration. The stochastic approach presented in this paper is completely general and can be used in other fields where the parameters of a model best fitted to observable data must be found.
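
    The acceptance logic at the core of ABC is simple enough to sketch. The toy Python version below uses plain rejection sampling with a smooth stand-in for the dispersion model; the sensor layout, forward model, priors, and tolerance are all invented for illustration, and S-ABC would additionally shrink the tolerance over successive particle populations:

        import numpy as np

        rng = np.random.default_rng(42)
        SENSORS = rng.uniform(0.0, 100.0, size=(20, 2))  # sensor grid (arbitrary)

        def simulate(theta):
            """Stand-in forward model: a source at (x0, y0) with release rate q
            produces a smooth concentration footprint at the sensors. A real
            application would call an atmospheric dispersion code here."""
            x0, y0, q = theta
            d2 = (SENSORS[:, 0] - x0) ** 2 + (SENSORS[:, 1] - y0) ** 2
            return q * np.exp(-d2 / 2000.0)

        observed = simulate((60.0, 40.0, 5.0))  # synthetic "measurements"

        # Rejection ABC: keep prior draws whose simulated sensor readings land
        # within tolerance eps of the observations.
        eps, accepted = 2.0, []
        while len(accepted) < 200:
            theta = (rng.uniform(0, 100), rng.uniform(0, 100), rng.uniform(0, 10))
            if np.linalg.norm(simulate(theta) - observed) < eps:
                accepted.append(theta)

        posterior = np.array(accepted)  # empirical posterior over (x0, y0, q)
        print("posterior mean (x0, y0, q):", posterior.mean(axis=0))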

  17. Algodoo: A Tool for Encouraging Creativity in Physics Teaching and Learning

    NASA Astrophysics Data System (ADS)

    Gregorcic, Bor; Bodin, Madelen

    2017-01-01

    Algodoo (http://www.algodoo.com) is a digital sandbox for physics 2D simulations. It allows students and teachers to easily create simulated "scenes" and explore physics through a user-friendly and visually attractive interface. In this paper, we present different ways in which students and teachers can use Algodoo to visualize and solve physics problems, investigate phenomena and processes, and engage in out-of-school activities and projects. Algodoo, with its approachable interface, inhabits a middle ground between computer games and "serious" computer modeling. It is suitable as an entry-level modeling tool for students of all ages and can facilitate discussions about the role of computer modeling in physics.

  18. Using Kepler for Tool Integration in Microarray Analysis Workflows.

    PubMed

    Gan, Zhuohui; Stowe, Jennifer C; Altintas, Ilkay; McCulloch, Andrew D; Zambon, Alexander C

Increasing numbers of genomic technologies are producing massive amounts of genomic data, all of which require complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments, which makes integration of these diverse bioinformatics tools difficult. Kepler provides an open source environment to integrate these disparate packages. Using Kepler, we integrated several external tools, including Bioconductor packages, AltAnalyze (a Python-based open source tool), and an R-based comparison tool, to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves the efficiency and accuracy of complex data analyses.

  19. Mediated learning in the workplace: student perspectives on knowledge resources.

    PubMed

    Shanahan, Madeleine

    2015-01-01

    In contemporary clinical practice, student radiographers can use many types of knowledge resources to support their learning. These include workplace experts, digital and nondigital information sources (eg, journals, textbooks, and the Internet), and electronic communication tools such as e-mail and social media. Despite the range of knowledge tools available, there is little available data about radiography students' use of these resources during clinical placement. A 68-item questionnaire was distributed to 62 students enrolled in an Australian university undergraduate radiography program after they completed a clinical placement. Researchers used descriptive statistics to analyze student access to workplace experts and their use of digital and nondigital information sources and electronic communication tools. A 5-point Likert scale (1 = very important; 5 = not important) was used to assess the present importance and perceived future value of knowledge tools for workplace learning. Of the 53 students who completed and returned the questionnaire anonymously, most rely on the knowledge of practicing technologists and on print and electronic information sources to support their learning; some students also use electronic communication tools. Students perceive that these knowledge resources also will be important tools for their future learning as qualified health professionals. The findings from this study present baseline data regarding the value students attribute to multiple knowledge tools and regarding student access to and use of these tools during clinical placement. In addition, most students have access to multiple knowledge tools in the workplace and incorporate these tools simultaneously into their overall learning practice during clinical placement. Although a range of knowledge tools is used in the workplace to support learning among student radiographers, the quality of each tool should be critically analyzed before it is adopted in practice. Integrating practice-based learning with learning mediated by information sources provides a more complete paradigm of learning during clinical placement.

  20. Open source tools for management and archiving of digital microscopy data to allow integration with patient pathology and treatment information.

    PubMed

    Khushi, Matloob; Edwards, Georgina; de Marcos, Diego Alonso; Carpenter, Jane E; Graham, J Dinny; Clarke, Christine L

    2013-02-12

Virtual microscopy includes digitisation of histology slides and the use of computer technologies for complex investigation of diseases such as cancer. However, automated image analysis, or website publishing of such digital images, is hampered by their large file sizes. We have developed two Java-based open source tools: Snapshot Creator and NDPI-Splitter. Snapshot Creator converts a portion of a large digital slide into a JPEG image of the desired quality. The image is linked to the patient's clinical and treatment information in a customised open source cancer data management software (Caisis) in use at the Australian Breast Cancer Tissue Bank (ABCTB) and then published on the ABCTB website (http://www.abctb.org.au) using the open source Deep Zoom technology. Using the ABCTB online search engine, digital images can be searched by defining various criteria such as cancer type or expressed biomarkers. NDPI-Splitter splits a large image file into smaller sections of TIFF images so that they can be easily analysed by image analysis software such as Metamorph or Matlab. NDPI-Splitter also has the capacity to filter out empty images. Snapshot Creator and NDPI-Splitter are novel open source Java tools. They convert digital slides into files of smaller size for further processing. In conjunction with other open source tools such as Deep Zoom and Caisis, this suite of tools is used for the management and archiving of digital microscopy images, enabling digitised images to be explored and zoomed online. Our online image repository also has the capacity to be used as a teaching resource. These tools also enable large files to be sectioned for image analysis. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5330903258483934.
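
    NDPI-Splitter itself is a Java tool, and real NDPI files need a dedicated reader such as OpenSlide, but the splitting-plus-empty-tile-filtering idea reduces to a few lines. A Python/Pillow sketch over a generic large TIFF, with hypothetical file names and an arbitrary variance threshold for "empty":

        import numpy as np
        from PIL import Image

        Image.MAX_IMAGE_PIXELS = None   # permit very large scans (trusted input)

        def split_scan(path, tile=2048, min_std=5.0):
            """Cut a large scan into tiles, skipping near-empty ones."""
            img = Image.open(path)
            w, h = img.size
            for x in range(0, w, tile):
                for y in range(0, h, tile):
                    box = img.crop((x, y, min(x + tile, w), min(y + tile, h)))
                    if np.asarray(box.convert("L")).std() < min_std:
                        continue        # low variance: blank background tile
                    box.save(f"tile_{x}_{y}.tif")

        split_scan("slide.tif")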

1. Numerical modeling of the SNS H- ion source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veitzer, Seth A.; Beckwith, Kristian R. C.; Kundrapu, Madhusudhan

Ion source rf antennas that produce H- ions can fail when plasma heating causes ablation of the insulating coating at small structural defects such as cracks. Reducing the antenna failures that degrade the operating capabilities of the Spallation Neutron Source (SNS) accelerator is one of the top priorities of the SNS H- Source Program at ORNL. Numerical modeling of ion sources can provide techniques for optimizing design in order to reduce antenna failures. There are a number of difficulties in developing accurate models of rf inductive plasmas. First, a large range of spatial and temporal scales must be resolved in order to accurately capture the physics of plasma motion, including the Debye length, rf frequencies on the order of tens of MHz, simulation time scales of many hundreds of rf periods, device sizes of tens of cm, and ion motions that are thousands of times slower than those of electrons. This results in large simulation domains with many computational cells for solving plasma and electromagnetic equations, short time steps, and long-duration simulations. To reduce the computational requirements, one can develop implicit models for both fields and particle motions (e.g., divergence-preserving ADI methods), various electrostatic models, or magnetohydrodynamic models. We have performed simulations using all three of these methods and have found that fluid models have the greatest potential for giving accurate solutions while still being fast enough to perform long-timescale simulations in a reasonable amount of time. We have implemented a number of fluid models with electromagnetics using the simulation tool USim and applied them to modeling the SNS H- ion source. We found that a reduced, single-fluid MHD model, with an imposed magnetic field due to the rf antenna current and the confining multi-cusp field, generated increased bulk plasma velocities of > 200 m/s in the region of the antenna where ablation is often observed in the SNS source. We report here on comparisons of simulated plasma parameters and code performance using more accurate physical models, such as two-temperature extended MHD models, both for a related benchmark system describing an inductively coupled plasma reactor and for the SNS ion source. We also present results from scaling studies for mesh generation and solvers in the USim simulation code.
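
    For reference, the single-fluid resistive MHD system that such a reduced model refers to reads as follows (a textbook statement; USim's discretization and the two-temperature extension add further equations and closures):

        \begin{align}
        &\frac{\partial \rho}{\partial t} + \nabla\cdot(\rho\mathbf{u}) = 0,\\
        &\rho\left(\frac{\partial \mathbf{u}}{\partial t}
            + \mathbf{u}\cdot\nabla\mathbf{u}\right)
            = -\nabla p + \mathbf{J}\times\mathbf{B},\\
        &\frac{\partial \mathbf{B}}{\partial t}
            = \nabla\times(\mathbf{u}\times\mathbf{B})
            + \frac{\eta}{\mu_0}\nabla^{2}\mathbf{B},
        \qquad
        \mathbf{J} = \frac{1}{\mu_0}\nabla\times\mathbf{B}.
        \end{align}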

  2. A latest developed all permanent magnet ECRIS for atomic physics research at IMP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, L.T.; Zhao, H.W.; Zhang, Z.M.

    2006-03-15

Electron cyclotron resonance (ECR) ion sources have long been used for atomic physics research. With the development of atomic physics research at the Institute of Modern Physics (IMP), additional high performance experimental facilities are required. A 300 kV high voltage (HV) platform has been under construction since 2003, and an all permanent magnet ECR ion source will be installed on the platform. Lanzhou all permanent magnet ECR ion source No. 2 (LAPECR2) is the most recently developed all permanent magnet ECRIS. The source weighs 900 kg, with an outer dimension (magnetic body) of ∅650 mm x 562 mm. The injection magnetic field of the source is 1.28 T and the extraction magnetic field is 1.07 T. The source is designed to run at 14.5 GHz; the high magnetic field inside the plasma chamber enables good performance at this frequency. LAPECR2 is now under commissioning at IMP. In this article, the typical parameters of LAPECR2 are listed, and typical results of the preliminary commissioning are presented.

  3. APT: Aperture Photometry Tool

    NASA Astrophysics Data System (ADS)

    Laher, Russ

    2012-08-01

Aperture Photometry Tool (APT) is software for astronomers and students interested in manually exploring the photometric qualities of astronomical images. It has a graphical user interface (GUI) which allows the image data associated with aperture photometry calculations for point and extended sources to be visualized and, therefore, more effectively analyzed. Mouse-clicking on a source in the displayed image draws a circular or elliptical aperture and sky annulus around the source and computes the source intensity and its uncertainty, along with several commonly used measures of the local sky background and its variability. The results are displayed and can optionally be saved to an aperture-photometry-table file and plotted on graphs in various ways using functions available in the software. APT is geared toward processing sources in a small number of images and is not suitable for bulk processing of a large number of images, unlike other aperture photometry packages (e.g., SExtractor). However, APT does have a convenient source-list tool that enables calculations for a large number of detections in a given image. The source-list tool can be run either in automatic mode, to generate an aperture photometry table quickly, or in manual mode, to permit inspection and adjustment of the calculation for each individual detection. APT displays a variety of useful graphs, including the image histogram, aperture slices, source scatter plot, sky scatter plot, sky histogram, radial profile, curve of growth, and aperture-photometry-table scatter plots and histograms. APT has functions for customizing calculations, including outlier rejection, pixel “picking” and “zapping,” and a selection of source and sky models. The radial-profile-interpolation source model, accessed via the radial-profile-plot panel, allows recovery of source intensity from pixels with missing data and can be especially beneficial in crowded fields.
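
    APT is interactive, but the single-click measurement it performs has a scriptable analogue. With the separate photutils package (not part of APT) the same aperture-minus-annulus calculation looks like this, on a synthetic image with arbitrary radii:

        import numpy as np
        from photutils.aperture import (ApertureStats, CircularAnnulus,
                                        CircularAperture, aperture_photometry)

        image = np.random.default_rng(0).poisson(100, (200, 200)).astype(float)
        positions = [(95.0, 110.0)]      # the "clicked" source position (x, y)

        aperture = CircularAperture(positions, r=5.0)               # source
        annulus = CircularAnnulus(positions, r_in=8.0, r_out=12.0)  # sky

        sky_median = ApertureStats(image, annulus).median  # robust local sky
        phot = aperture_photometry(image, aperture)
        phot["net_flux"] = phot["aperture_sum"] - sky_median * aperture.area
        print(phot["net_flux"])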

  4. Physics Model-Based Scatter Correction in Multi-Source Interior Computed Tomography.

    PubMed

    Gong, Hao; Li, Bin; Jia, Xun; Cao, Guohua

    2018-02-01

    Multi-source interior computed tomography (CT) has great potential to provide ultra-fast and organ-oriented imaging at low radiation dose. However, X-ray cross scattering from multiple simultaneously activated X-ray imaging chains compromises imaging quality. Previously, we published two hardware-based scatter correction methods for multi-source interior CT. Here, we propose a software-based scatter correction method, with the benefit of requiring no hardware modifications. The new method is based on a physics model and an iterative framework. The physics model was derived analytically and was used to calculate X-ray scattering signals in both the forward direction and the cross directions in multi-source interior CT. The physics model was integrated into an iterative scatter correction framework to reduce scatter artifacts. The method was applied to phantom data from both Monte Carlo simulations and physical experiments designed to emulate image acquisition in a multi-source interior CT architecture recently proposed by our team. The proposed scatter correction method reduced scatter artifacts significantly, even with only one iteration. Within a few iterations, the reconstructed images converged quickly toward the "scatter-free" reference images. After applying the scatter correction method, the maximum CT number error at the regions of interest (ROIs) was reduced to 46 HU in the numerical phantom dataset and 48 HU in the physical phantom dataset, respectively, and the contrast-to-noise ratio at those ROIs increased by up to 44.3% and up to 19.7%, respectively. The proposed physics model-based iterative scatter correction method could be useful for scatter correction in dual-source or multi-source CT.
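
    The iterative framework can be sketched as follows; the Gaussian-blur "scatter model" is only a stand-in for the paper's analytic forward- and cross-scatter physics model, and all parameters are illustrative.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def scatter_model(primary, fraction=0.15, width=20.0):
            # placeholder physics model: broad, low-frequency haze
            # proportional to the current primary-signal estimate
            return fraction * gaussian_filter(primary, sigma=width)

        def iterative_scatter_correction(raw, n_iter=3):
            primary = raw.copy()          # initial guess: all signal is primary
            for _ in range(n_iter):
                scatter = scatter_model(primary)          # estimate scatter
                primary = np.clip(raw - scatter, 0.0, None)  # subtract, stay non-negative
            return primary

        raw = np.random.default_rng(0).uniform(50, 100, size=(256, 256))
        corrected = iterative_scatter_correction(raw)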

  5. The Shale Hills Sensorium for Embedded Sensors, Simulation, & Visualization: A Prototype for Land-Vegetation-Atmosphere Interactions

    NASA Astrophysics Data System (ADS)

    Duffy, C.

    2008-12-01

    The future of environmental observing systems will utilize embedded sensor networks with continuous real-time measurement of hydrologic, atmospheric, biogeochemical, and ecological variables across diverse terrestrial environments. Embedded environmental sensors, benefitting from advances in information sciences, networking technology, materials science, computing capacity, and data synthesis methods, are undergoing revolutionary change. It is now possible to field spatially distributed, multi-node sensor networks that provide density and spatial coverage previously accessible only via numerical simulation. At the same time, computational tools are advancing rapidly to the point where it is now possible to simulate the physical processes controlling individual parcels of water and solutes through the complete terrestrial water cycle. Our goal for the Penn State Critical Zone Observatory is to apply environmental sensor arrays, integrated hydrologic models, and state-of-the-art visualization deployed and coordinated at a testbed within the Penn State Experimental Forest. The Shale Hills Hydro_Sensorium prototype proposed here is designed to observe land-atmosphere interactions in four dimensions (space and time). The term Hydro_Sensorium refers to the totality of physical sensors, models, and visualization tools that allow us to perceive the detailed space and time complexities of the water and energy cycle for a watershed or river basin for all physical states and fluxes (groundwater, soil moisture, temperature, streamflow, latent heat, snowmelt, chemistry, isotopes, etc.). This research will ultimately catalyze the study of complex interactions between the land surface, subsurface, biological, and atmospheric systems over a broad range of scales. The sensor array would be real-time and fully controllable by remote users for "computational steering" and data fusion. Fully coupled physical models are currently being developed that link the atmosphere, land, vegetation, and subsurface into a single distributed system. During the last 5 years the Penn State Integrated Hydrologic Modeling System (PIHM) has been under development as an open-source community modeling project funded by NSF EAR/GEO and NSF CBET/ENG. PIHM represents a strategy for the formulation and solution of fully coupled process equations at the watershed and river basin scales, and includes a tightly coupled GIS tool for data handling, domain decomposition, optimal unstructured grid generation, and model parameterization. The sensor and simulation system has the following elements: 1) extensive, spatially distributed, non-invasive, smart sensor networks to gather massive geologic, hydrologic, and geochemical data; 2) stochastic information fusion methods; 3) spatially explicit multiphysics models/solutions of the land-vegetation-atmosphere system; 4) asynchronous, parallel/distributed, adaptive algorithms for rapidly simulating the states of a basin at high resolution; 5) signal processing tools for data mining and parameter estimation; and 6) visualization tools. The sensor array and simulation system proposed here will offer a coherent new approach to environmental prediction with a fully integrated observing system design. We expect that the Shale Hills Hydro_Sensorium may provide the synthesis of information and conceptualization necessary to advance predictive understanding in complex hydrologic systems.

  6. Assessing participants' perceptions on group-based principles for action in community-based health enhancing physical activity programmes: The APEF tool.

    PubMed

    Herens, Marion; Wagemakers, Annemarie

    2017-12-01

    In community-based health enhancing physical activity (CBHEPA) programmes, group-based principles for action such as active participation, enjoyment, and fostering group processes are widely advocated. However, not much is known about participants' perceptions of these principles as there are no assessment tools available. Therefore, this article describes the development of the APEF (Active Participation, Enjoyment, and Fostering group processes) tool and reports on its implementation in a Dutch CBHEPA programme. Indicators for the principles were identified from literature research, interviews with professionals, and secondary analysis of three group interviews with 11 practitioners. To address the identified indicators, the APEF tool was developed, pretested, and used in 10 focus groups with 76 participants. The APEF tool consists of eight statements about group-based principles for action, on which CBHEPA participants vote, followed by in-depth discussion. The voting procedure engages participants. Spider diagrams visualise participants' perceptions of group-based principles. The APEF tool addresses the challenge of relating group-level outcomes to individual outcomes such as physical activity behaviour. The tool facilitates as well as evaluates group-based principles for action; it stimulates dialogue and is culturally sensitive, but it requires strong facilitation skills to manage group dynamics.
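
    A spider (radar) diagram of the kind the APEF tool produces can be drawn in a few lines; the eight statement labels and mean vote scores below are invented for illustration.

        import numpy as np
        import matplotlib.pyplot as plt

        labels = ["active participation", "enjoyment", "group cohesion",
                  "shared goals", "peer support", "trainer style",
                  "safety", "ownership"]                      # assumed statements
        scores = [4.2, 4.6, 3.8, 3.5, 4.0, 4.4, 4.1, 3.2]     # assumed mean votes, 1-5 scale

        angles = np.linspace(0, 2 * np.pi, len(labels), endpoint=False).tolist()
        angles += angles[:1]               # repeat first point to close the polygon
        scores_closed = scores + scores[:1]

        ax = plt.subplot(polar=True)
        ax.plot(angles, scores_closed, "o-")
        ax.fill(angles, scores_closed, alpha=0.25)
        ax.set_xticks(angles[:-1])
        ax.set_xticklabels(labels, fontsize=8)
        ax.set_ylim(0, 5)
        plt.show()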

  7. Facilities Stewardship: Measuring the Return on Physical Assets.

    ERIC Educational Resources Information Center

    Kadamus, David A.

    2001-01-01

    Asserts that colleges and universities should apply the same analytical rigor to physical assets as they do financial assets. Presents a management tool, the Return on Physical Assets model, to help guide physical asset allocation decisions. (EV)

  8. Diagnosing alternative conceptions of Fermi energy among undergraduate students

    NASA Astrophysics Data System (ADS)

    Sharma, Sapna; Ahluwalia, Pardeep Kumar

    2012-07-01

    Physics education researchers have scientifically established the fact that the understanding of new concepts and the interpretation of incoming information are strongly influenced by the preexisting knowledge and beliefs of students, called epistemological beliefs. This can lead to a gap between what students actually learn and what the teacher expects them to learn. As a teacher in a classroom, it is desirable to try to bridge this gap, at least on the key concepts of the field being taught. One such key concept, which crops up in statistical physics and solid-state physics courses and around which the behaviour of materials is described, is the Fermi energy (εF). In this paper, we present the misconceptions about Fermi energy that emerged in the process of administering a diagnostic tool called the Statistical Physics Concept Survey, developed by the authors. It deals with eight themes of basic importance in learning undergraduate solid-state physics and statistical physics. The question items of the tool were put through well-established sequential processes: definition of themes, a Delphi study, interviews with students, drafting of questions, administration, and validity and reliability testing of the tool. The tool was administered to a group of undergraduate and postgraduate students in a pre-test and post-test design. In this paper, we take one theme of the diagnostic tool, Fermi energy, for our analysis and discussion. Students' responses and the reasoning comments given during interviews were analysed. This analysis helped us to identify prevailing misconceptions and learning gaps among students on this topic. We also present how spreadsheets can be used effectively to remove the identified misconceptions and to help students appreciate the finer nuances of the behaviour of the system around the Fermi energy, which are normally sidestepped by both teachers and learners.
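
    The visualization the authors advocate (there built in spreadsheets) is easy to reproduce in Python: plot the Fermi-Dirac occupation f(E) = 1/(exp((E − εF)/kT) + 1) around εF and watch the sharp step at T = 0 smear out over a window of order kT. The εF value below is an assumption for illustration.

        import numpy as np
        import matplotlib.pyplot as plt

        k_B = 8.617e-5                  # Boltzmann constant, eV/K
        E_F = 5.0                       # Fermi energy, eV (illustrative)
        E = np.linspace(4.0, 6.0, 500)  # energy grid around E_F

        for T in (50, 300, 1000, 3000):
            x = np.clip((E - E_F) / (k_B * T), -50, 50)  # clip to avoid exp overflow
            f = 1.0 / (np.exp(x) + 1.0)                  # Fermi-Dirac occupation
            plt.plot(E, f, label=f"T = {T} K")

        plt.axvline(E_F, ls="--", lw=0.8)
        plt.xlabel("E (eV)")
        plt.ylabel("occupation f(E)")
        plt.legend()
        plt.show()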

  9. RealTime Physics: Active learning laboratory

    NASA Astrophysics Data System (ADS)

    Thornton, Ronald K.; Sokoloff, David R.

    1997-03-01

    Our research shows that student learning of physics concepts in introductory physics courses is enhanced by the use of special guided discovery laboratory curricula which embody the results of educational research and which are supported by the use of the Tools for Scientific Thinking microcomputer-based laboratory (MBL) tools. In this paper we first describe the general characteristics of the research-based RealTime Physics laboratory curricula developed for use in introductory physics classes in colleges, universities and high schools. We then describe RealTime Physics Mechanics in detail. Finally we examine student learning of dynamics in traditional physics courses and in courses using RealTime Physics Mechanics, primarily by the use of correlated questions on the Force and Motion Conceptual Evaluation. We present considerable evidence that students who use the new laboratory curricula demonstrate significantly improved learning and retention of dynamics concepts compared to students taught by traditional methods.

  10. Design of an instrument to measure the quality of care in Physical Therapy.

    PubMed

    Cavalheiro, Leny Vieira; Eid, Raquel Afonso Caserta; Talerman, Claudia; Prado, Cristiane do; Gobbi, Fátima Cristina Martorano; Andreoli, Paola Bruno de Araujo

    2015-01-01

    To design an instrument composed of domains that would demonstrate physical therapy activities and generate a consistent index representing the quality of care in physical therapy. The Lean Six Sigma methodology was used to design the tool. The discussion involved staff from seven different management groups. By means of brainstorming and a Cause & Effect Matrix, we set up the process map. After application of the Cause & Effect Matrix, five requirements composed the quality-of-care index in physical therapy: physical therapist performance, a care outcome indicator, adherence to physical therapy protocols, a measure of whether the prognosis and treatment outcome were achieved, and infrastructure. The proposed design allowed evaluation of several items related to the physical therapy service, enabling customization, reproducibility, and benchmarking with other organizations. For management, this index provides the opportunity to identify areas for improvement and the strengths of the team and of the physical therapy care process.
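
    One plausible way to roll the five requirements into a single index is a weighted average of normalized sub-scores, as sketched below; the weights and scores are invented, since the paper does not publish its weighting.

        requirements = {
            # name: (score on 0-100, weight) -- all values assumed for illustration
            "physical therapist performance": (88, 0.25),
            "care outcome indicator":         (92, 0.25),
            "protocol adherence":             (75, 0.20),
            "prognosis/outcome achieved":     (81, 0.20),
            "infrastructure":                 (70, 0.10),
        }

        index = sum(score * w for score, w in requirements.values())
        total_w = sum(w for _, w in requirements.values())
        print(f"quality-of-care index: {index / total_w:.1f} / 100")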

  11. Development of materials for the rapid manufacture of die cast tooling

    NASA Astrophysics Data System (ADS)

    Hardro, Peter Jason

    The focus of this research is to develop a material composition that can be processed by rapid prototyping (RP) in order to produce tooling for the die casting process, where these rapidly produced tools would be superior to traditional tooling production methods by offering one or more of the following advantages: reduced tooling cost, shortened tooling creation time, reduced man-hours for tool creation, increased tool life, and shortened die casting cycle time. By utilizing RP's additive build process and vast material selection, there was a prospect that die cast tooling might be produced more quickly and with superior material properties. To this end, the material properties that influence die life and cycle time were determined, and a list of materials that fulfill these "optimal" properties was drawn up. Physical testing was conducted in order to grade the processability of each material system and to optimize the manufacturing process for the downselected material system. Sample specimens were produced, and microscopy techniques were utilized to determine a number of physical properties of the material system. Additionally, a benchmark geometry was selected, and die casting dies were produced from traditional tool materials (H13 steel) and techniques (machining) and from the newly developed materials and RP techniques (selective laser sintering (SLS) and laser engineered net shaping (LENS)). Once the tools were created, a die cast alloy was selected and a preset number of parts were shot into each tool. During tool creation, the manufacturing time and cost were closely monitored, and an economic model was developed to compare traditional tooling to RP tooling. This model allows one to determine, in the early design stages, when it is advantageous to implement RP tooling and when traditional tooling would be best. The results of the physical testing and economic analysis have shown that RP tooling is able to achieve a number of the research objectives, namely, to reduce tooling cost, shorten tooling creation time, and reduce the man-hours needed for tool creation. Identifying the appropriate time to use RP tooling appears to be the most important aspect of achieving successful implementation.
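
    In its simplest form, such an economic comparison reduces to a fixed-plus-variable cost model solved for the break-even production volume; a toy sketch with invented numbers, not the dissertation's actual model:

        # total cost of a tool over its production run: C(n) = tool_cost + per_part * n
        h13_tool_cost, h13_per_part = 60_000.0, 1.20   # machined H13 tool (assumed)
        rp_tool_cost,  rp_per_part  = 25_000.0, 1.45   # RP tool, shorter life (assumed)

        # break-even volume: where the two cost lines cross
        n_star = (h13_tool_cost - rp_tool_cost) / (rp_per_part - h13_per_part)
        print(f"RP tooling is cheaper below ~{n_star:,.0f} parts")

    A lower fixed cost but higher per-part cost makes RP tooling attractive for short runs, which is exactly the early-design-stage decision the model above is meant to inform.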

  12. A moist Boussinesq shallow water equations set for testing atmospheric models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zerroukat, M., E-mail: mohamed.zerroukat@metoffice.gov.uk; Allen, T.

    The shallow water equations have long been used as an initial test for numerical methods applied to atmospheric models, with the test suite of Williamson et al. being used extensively for validating new schemes and assessing their accuracy. However, the lack of physics forcing within this simplified framework often requires numerical techniques to be reworked when applied to fully three-dimensional models. In this paper a novel two-dimensional shallow water equations system that retains moist processes is derived. This system is derived from a three-dimensional Boussinesq approximation of the hydrostatic Euler equations where, unlike the classical shallow water set, we allow the density to vary slightly with temperature. This results in extra (buoyancy) terms in the momentum equations, through which a two-way moist-physics dynamics feedback is achieved. The temperature and moisture variables are advected as separate tracers with sources that interact with the mean flow through a simplified yet realistic bulk moist-thermodynamic phase-change model. This moist shallow water system provides a unique tool to assess the usually complex and highly non-linear dynamics–physics interactions in atmospheric models in a simple yet realistic way. The full non-linear shallow water equations are solved numerically on several case studies and the results suggest quite realistic interaction between the dynamics and physics, in particular the generation of cloud and rain. Highlights:
    • Novel shallow water equations which retain moist processes are derived from the three-dimensional hydrostatic Boussinesq equations.
    • The new shallow water set can be seen as a more general one, in which the classical equations are a special case.
    • The moist shallow water system naturally allows a feedback mechanism from the moist-physics increments to the momentum via buoyancy.
    • As in full models, temperature and moisture are advected as tracers that interact through a simplified yet realistic phase-change model.
    • The model is a unique tool to test numerical methods for atmospheric models, and physics–dynamics coupling, in a very realistic and simple way.
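
    For orientation, a minimal one-dimensional solver for the classical (dry) shallow water set that this moist system generalizes is sketched below, using a Lax-Friedrichs flux on a periodic domain; the moist buoyancy terms and tracer sources of the paper are not included, and all parameters are illustrative.

        import numpy as np

        g, N, dx, dt = 9.81, 200, 1.0, 0.01

        # initial condition: a small Gaussian bump on a unit-depth layer at rest
        h = 1.0 + 0.1 * np.exp(-((np.arange(N) * dx - 100) ** 2) / 50.0)
        hu = np.zeros(N)

        def flux(h, hu):
            u = hu / h
            return np.array([hu, hu * u + 0.5 * g * h * h])  # conservative fluxes

        U = np.array([h, hu])
        for _ in range(2000):
            F = flux(U[0], U[1])
            Up, Um = np.roll(U, -1, axis=1), np.roll(U, 1, axis=1)  # periodic neighbours
            Fp, Fm = np.roll(F, -1, axis=1), np.roll(F, 1, axis=1)
            # Lax-Friedrichs update: neighbour average plus centred flux difference
            U = 0.5 * (Up + Um) - 0.5 * dt / dx * (Fp - Fm)

        print("max depth after 2000 steps:", U[0].max())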

  13. Some applications of mathematics in theoretical physics - A review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bora, Kalpana

    2016-06-21

    Mathematics is a very beautiful subject, and very much an indispensable tool for Physics, more so for Theoretical Physics (by which we mean here mainly Field Theory and High Energy Physics). These branches of Physics are based on Quantum Mechanics and the Special Theory of Relativity, and many mathematical concepts are used in them. In this work, we shall elucidate only some of them, such as differential geometry, infinite series, Mellin transforms, Fourier and integral transforms, special functions, calculus, complex algebra, topology, group theory, Riemannian geometry, functional analysis, linear algebra, and operator algebra. We shall also present some physics issues where these mathematical tools are used. It is not wrong to say that Mathematics is such a powerful tool that without it there cannot be any theory of Physics! A brief review of our research work is also presented.

  14. Cloudy 94 and Applications to Quasar Emission Line Regions

    NASA Technical Reports Server (NTRS)

    Ferland, Gary J.

    2000-01-01

    This review discusses the most recent developments of the plasma simulation code Cloudy and its application to the emission-line regions of quasars. The long-term goal is to develop the tools needed to determine the chemical composition of the emitting gas and the luminosity of the central engine for any emission-line source. Emission lines and the underlying thermal continuum are formed in plasmas that are far from thermodynamic equilibrium. Their thermal and ionization states are the result of a balance among a vast set of microphysical processes. Once produced, radiation must propagate out of the (usually) optically thick source. No analytic solutions are possible, and recourse to numerical simulations is necessary. I am developing the large-scale plasma simulation code Cloudy as an investigative tool for this work, much as an observer might build a spectrometer. This review describes the current version of Cloudy, version 94, and the improvements made since the release of the previous version, C90. The major recent application has been the development of the "Locally Optimally-Emitting Cloud" (LOC) model of AGN emission-line regions. Powerful selection effects, introduced by the atomic physics and the line formation process, permit individual lines to form most efficiently only near certain selected parameters. These selection effects, together with the presence of gas with a wide range of conditions, are enough to reproduce the spectrum of a typical quasar with little dependence on details. The spectrum actually carries little information as to the identity of the emitters. I view this as a major step forward, since it provides a method to handle accidental details at the source, so that we can concentrate on essential information such as the luminosity or chemical composition of the quasar.
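
    The LOC idea can be caricatured numerically: each cloud emits a given line efficiently only near preferred conditions, and integrating over broad power-law distributions of cloud properties washes out the details. In the sketch below the Gaussian "efficiency" stands in for a real Cloudy emission grid, and the grid ranges and power-law indices are illustrative assumptions.

        import numpy as np

        log_n   = np.linspace(8, 14, 120)   # cloud density grid, log10 cm^-3 (assumed)
        log_phi = np.linspace(17, 23, 120)  # ionizing flux grid, log10 cm^-2 s^-1 (assumed)
        N, PHI = np.meshgrid(log_n, log_phi)

        # stand-in for a Cloudy equivalent-width grid: one line with a peaked response,
        # i.e. the "selection effect" that picks out preferred conditions
        efficiency = np.exp(-((N - 10) ** 2 + (PHI - 20) ** 2) / 2.0)

        # power-law mix of clouds over density and flux (indices assumed)
        weight = 10.0 ** (-1.0 * N) * 10.0 ** (-1.2 * PHI)

        L_line = np.sum(efficiency * weight)
        print(f"integrated line strength (arbitrary units): {L_line:.3e}")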

  15. The development and validity of the Salford Gait Tool: an observation-based clinical gait assessment tool.

    PubMed

    Toro, Brigitte; Nester, Christopher J; Farren, Pauline C

    2007-03-01

    To develop the construct, content, and criterion validity of the Salford Gait Tool (SF-GT) and to evaluate agreement between gait observations using the SF-GT and kinematic gait data. Tool development and comparative evaluation. University in the United Kingdom. For designing construct and content validity, convenience samples of 10 children with hemiplegic, diplegic, and quadriplegic cerebral palsy (CP), 152 physical therapy students, and 4 physical therapists were recruited. For developing criterion validity, kinematic gait data of 13 gait clusters containing 56 children with hemiplegic, diplegic, and quadriplegic CP and 11 neurologically intact children were used. For clinical evaluation, a convenience sample of 23 pediatric physical therapists participated. We developed a sagittal-plane observational gait assessment tool through a series of design, test, and redesign iterations. The tool's grading system was calibrated using kinematic gait data of the 13 gait clusters and was evaluated by comparing the agreement of gait observations using the SF-GT with kinematic gait data. The criterion standard was kinematic gait data. There was 58% mean agreement based on grading categories and 80% mean agreement based on degree estimations evaluated with the least-significant-difference method. The new SF-GT has good concurrent criterion validity.

  16. Development and implementation of a remote audit tool for high dose rate (HDR) Ir-192 brachytherapy using optically stimulated luminescence dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casey, Kevin E.; Kry, Stephen F.; Howell, Rebecca M.

    Purpose: The aim of this work was to create a mailable phantom with measurement accuracy suitable for Radiological Physics Center (RPC) audits of high dose-rate (HDR) brachytherapy sources at institutions participating in National Cancer Institute-funded cooperative clinical trials. Optically stimulated luminescence dosimeters (OSLDs) were chosen as the dosimeter to be used with the phantom. Methods: The authors designed and built an 8 × 8 × 10 cm³ prototype phantom that had two slots capable of holding Al₂O₃:C OSLDs (nanoDots; Landauer, Glenwood, IL) and a single channel capable of accepting all ¹⁹²Ir HDR brachytherapy sources in current clinical use in the United States. The authors irradiated the phantom with Nucletron and Varian ¹⁹²Ir HDR sources in order to determine correction factors for linearity with dose and the combined effects of irradiation energy and phantom characteristics. The phantom was then sent to eight institutions which volunteered to perform trial remote audits. Results: The linearity correction factor was k_L = (−9.43 × 10⁻⁵ × dose) + 1.009, where dose is in cGy, which differed from that determined by the RPC for the same batch of dosimeters using ⁶⁰Co irradiation. Separate block correction factors were determined for current versions of both Nucletron and Varian ¹⁹²Ir HDR sources, and these vendor-specific correction factors differed by almost 2.6%. For the Nucletron source, the correction factor was 1.026 [95% confidence interval (CI) = 1.023–1.028], and for the Varian source, it was 1.000 (95% CI = 0.995–1.005). Variations in lateral source positioning up to 0.8 mm and distal/proximal source positioning up to 10 mm had minimal effect on dose measurement accuracy. The overall dose measurement uncertainty of the system was estimated to be 2.4% and 2.5% for the Nucletron and Varian sources, respectively (95% CI). This uncertainty was sufficient to establish a ±5% acceptance criterion for source strength audits under a formal RPC audit program. Trial audits of four Nucletron sources and four Varian sources revealed an average RPC-to-institution dose ratio of 1.000 (standard deviation = 0.011). Conclusions: The authors have created an OSLD-based ¹⁹²Ir HDR brachytherapy source remote audit tool which offers sufficient dose measurement accuracy to allow the RPC to establish a remote audit program with a ±5% acceptance criterion. The feasibility of the system has been demonstrated with eight trial audits to date.
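
    Because k_L depends on the (unknown) true dose, applying the reported factors is implicit; a short fixed-point iteration resolves it, as sketched below. How the RPC actually chains raw nanoDot readings to dose involves further system calibration factors not shown here, so this is only an illustration of the two published factors.

        def k_L(dose_cGy):
            # linearity correction factor reported in the study (dose in cGy)
            return -9.43e-5 * dose_cGy + 1.009

        # vendor-specific block correction factors from the study
        K_BLOCK = {"nucletron": 1.026, "varian": 1.000}

        def corrected_dose(raw_dose_cGy, vendor):
            dose = raw_dose_cGy                 # initial guess
            for _ in range(20):                 # fixed-point iteration on k_L(dose)
                dose = raw_dose_cGy * k_L(dose) * K_BLOCK[vendor]
            return dose

        print(f"{corrected_dose(500.0, 'nucletron'):.1f} cGy")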

  17. Interpolity exchange of basalt tools facilitated via elite control in Hawaiian archaic states

    PubMed Central

    Kirch, Patrick V.; Mills, Peter R.; Lundblad, Steven P.; Sinton, John; Kahn, Jennifer G.

    2012-01-01

    Ethnohistoric accounts of late precontact Hawaiian archaic states emphasize the independence of chiefly controlled territories (ahupua‘a) based on an agricultural staple economy. However, elite control of unevenly distributed resources, such as high-quality volcanic rock for adze production, may have provided an alternative source of economic power. To test this hypothesis we used nondestructive energy-dispersive X-ray fluorescence (ED-XRF) analysis of 328 lithic artifacts from 36 archaeological features in the Kahikinui district, Maui Island, to geochemically characterize the source groups. This process was followed by a limited sampling using destructive wavelength-dispersive X-ray fluorescence (WD-XRF) analysis to more precisely characterize certain nonlocal source groups. Seventeen geochemical groups were defined, eight of which represent extra-Maui Island sources. Although the majority of stone tools were derived from Maui Island sources (71%), a significant quantity (27%) of tools derived from extraisland sources, including the large Mauna Kea quarry on Hawai‘i Island as well as quarries on O‘ahu, Moloka‘i, and Lāna‘i islands. Importantly, tools quarried from extralocal sources are found at the highest frequency in elite residential features and in ritual contexts. These results suggest a significant role for a wealth economy based on the control and distribution of nonagricultural goods and resources during the rise of the Hawaiian archaic states. PMID:22203984
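
    The source-assignment step amounts to clustering artifacts in element-concentration space and matching clusters to known quarry signatures; a toy sketch with invented trace-element values (the study itself defined 17 groups from 328 artifacts):

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(1)
        # columns: Sr, Zr, Rb concentrations (ppm) for three hypothetical "quarries"
        quarries = np.array([[350, 180, 12], [520, 260, 30], [410, 140, 8]])
        # simulate 20 artifacts per quarry with measurement scatter
        artifacts = np.vstack([q + rng.normal(0, 5, size=(20, 3)) for q in quarries])

        groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(artifacts)
        for g in range(3):
            print(f"group {g}: {np.sum(groups == g)} artifacts, "
                  f"centroid {artifacts[groups == g].mean(axis=0).round(1)}")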

  18. The Impact of Space Experiments on our Knowledge of the Physics of the Universe

    NASA Astrophysics Data System (ADS)

    Giovannelli, Franco; Sabau-Graziati, Lola

    2004-05-01

    With the advent of space experiments it was demonstrated that cosmic sources emit energy practically across the whole electromagnetic spectrum via different physical processes. Several observable physical quantities bear witness to these processes, which are usually not stationary; those observables are therefore generally variable. Simultaneous multifrequency observations are thus strictly necessary in order to understand the actual behaviour of cosmic sources. Space experiments have opened practically all the electromagnetic windows on the Universe. A discussion of the most important results coming from multifrequency photonic astrophysics experiments will provide new inputs for advancing the knowledge of physics, very often in its more extreme conditions. A multitude of high-quality data across practically the whole electromagnetic spectrum became available to the scientific community a few years after the beginning of the Space Era. With these data we are attempting to explain the physics governing the Universe and, moreover, its origin, which has been and still is a matter of the greatest curiosity for humanity. In this paper we will try to describe the latest steps of the investigation born with the advent of space experiments, to comment upon the most important results and the open problems that remain, and to discuss the perspectives we can reasonably expect. Once the idea of this paper was accepted, we faced the problem of how to plan the exposition. Indeed, the results can be presented in different ways, following several points of view, according to:
    - a division into diffuse and discrete sources;
    - different classes of cosmic sources;
    - different spectral ranges, which implies in turn a sub-classification in accordance with different techniques of observation;
    - different physical emission mechanisms of electromagnetic radiation;
    - different vehicles used for launching the experiments (aircraft, balloons, rockets, satellites, observatories).
    In order to exhaustively present The Impact of Space Experiments on our Knowledge of the Physics of the Universe it would then have been necessary to write a kind of Encyclopaedia of Astronomical Space Research, which is not our intention. On the contrary, since our goal is to provide a useful tool for readers not specialized in space astrophysics and for students, we decided to write this paper in the form of a review, the length of which can still be considered reasonable, taking into account the complexity of the arguments discussed. Because of the impossibility of realizing a complete picture of the physics governing the Universe, we were obliged to choose how to proceed: which subjects to discuss at greater or lesser length, and which to leave out. Because this work was born in the Ph.D. thesis of one of us (LSG) (Sabau-Graziati, 1990), we decided to follow the `astronomical tradition' used there, namely, the spectral energy ranges. Although such energy ranges do not determine physical objects (even if in many cases they are used to define the sources as radio, infrared, optical, ultraviolet, X-ray, or γ-ray emitters), they do determine the methods of study, and from the technical point of view they define the technology employed in the respective experiments.
    However, we have since decided to avoid deep descriptions of the experiments, satellites, and observatories, preferring the physical results to the technologies, however fundamental the latter were for obtaining those results. The exposition, after an introduction (Section 1) and some crucial results from space astronomy (Section 2), is organized into three parts: the physics of diffuse cosmic sources deduced from space experiments (Section 3), the physics of cosmic rays from ground- and space-based experiments (Section 4), and the physics of discrete cosmic sources deduced from space experiments (Section 5). In the first part of the paper we have used the logic of describing the main results obtained in different energy ranges, which in turn characterize the experiments on board space vehicles. Within each energy range we have discussed the contributions of different experiments to the knowledge of various kinds of cosmic sources. This part is mainly derived from the bulk of the introductory part of LSG's Ph.D. thesis. In the second part of the paper, starting from Section 6, we have preferred to discuss several classes of cosmic sources independently of the energy ranges, mainly presenting the results from a multifrequency point of view, with a preference for the knowledge of the physics governing each class as a whole. This was decided also because of the multitude of new space experiments launched in the last fifteen years, which would have rendered a discussion of the results divided into energy ranges almost impossible without weakening the construction of the entire puzzle. We do not pretend to cover every aspect of every subject considered under the heading of the physics of the Universe. Instead a cross section of essays on historical, modern, and philosophical topics is offered and combined with personal views into tricks of the space astrophysics trade. The reader is invited to accept this paper even though it obviously lacks completeness and the arguments discussed are certainly biased by a selection effect owing essentially to our own knowledge and to the need to keep the text to a reasonable length. Some parts of it could seem, in a certain sense, to belong to an older paper in which the latest news is not reported. But this is our own choice, in full accord with the goals of the text: we want to present those results which have, in our opinion, been really important in the development of the science. These impacting results do not necessarily constitute the latest news. This text was formally closed on the day of the launch of the INTEGRAL satellite: October 17, 2002. After that date only finishing touches have been added.

  19. Software tool for physics chart checks.

    PubMed

    Li, H Harold; Wu, Yu; Yang, Deshan; Mutic, Sasa

    2014-01-01

    Physics chart check has long been a central quality assurance (QA) measure in radiation oncology. The purpose of this work is to describe a software tool that aims to accomplish simplification, standardization, automation, and forced functions in the process. Nationally recognized guidelines, including American College of Radiology and American Society for Radiation Oncology guidelines and technical standards, and the American Association of Physicists in Medicine Task Group reports, were identified, studied, and summarized. Meanwhile, the reported events related to the physics chart check service were analyzed using an event reporting and learning system. A number of shortfalls in the chart check process were identified. To address these problems, a software tool was designed and developed under Microsoft .NET in C# to hardwire as many components as possible at each stage of the process. The software consists of the following 4 independent modules: (1) chart check management; (2) pretreatment and during-treatment chart check assistant; (3) posttreatment chart check assistant; and (4) quarterly peer-review management. The users were a large group of physicists in the authors' radiation oncology clinic. Over more than 1 year of use, the tool has proven very helpful in chart check management, communication, documentation, and maintaining consistency. The software tool presented in this work aims to assist physicists at each stage of the physics chart check process, and it is potentially useful for any radiation oncology clinic that is either pursuing or maintaining American College of Radiology accreditation.
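
    The "forced functions" idea can be illustrated with a checklist object that refuses sign-off until every required item has an explicit result; the item names below are invented for illustration, not the tool's actual checklists.

        class ChartCheck:
            REQUIRED = ["prescription matches plan", "MU/secondary calc verified",
                        "imaging approvals present", "previous fractions summed"]

            def __init__(self, patient_id):
                self.patient_id = patient_id
                self.results = {}                  # item -> bool (pass/fail)

            def record(self, item, passed):
                if item not in self.REQUIRED:
                    raise KeyError(f"unknown checklist item: {item}")
                self.results[item] = passed

            def sign_off(self, physicist):
                missing = [i for i in self.REQUIRED if i not in self.results]
                if missing:                        # forced function: block sign-off
                    raise RuntimeError(f"cannot sign off, unchecked: {missing}")
                return (f"{self.patient_id} checked by {physicist}: "
                        f"{'PASS' if all(self.results.values()) else 'REVIEW'}")

        check = ChartCheck("PT-0001")
        for item in ChartCheck.REQUIRED:
            check.record(item, True)
        print(check.sign_off("H.H. Li"))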

  20. Implementation of an Intelligent Tutorial System for Socioenvironmental Management Projects

    ERIC Educational Resources Information Center

    Vera, Gil; Daniel, Víctor; Awad, Gabriel

    2015-01-01

    The agents responsible for the execution of physical infrastructure projects of the Government of Antioquia must know the theories and concepts related to the socio-environmental management of physical infrastructure projects. Given the absence of tools and the scarcity of information on the subject, it is necessary to build an m-learning tool to facilitate to…
