Sample records for physical computation level

  1. When does a physical system compute?

    PubMed

    Horsman, Clare; Stepney, Susan; Wagner, Rob C; Kendon, Viv

    2014-09-08

    Computing is a high-level process of a physical system. Recent interest in non-standard computing systems, including quantum and biological computers, has brought this physical basis of computing to the forefront. There has been, however, no consensus on how to tell if a given physical system is acting as a computer or not, leading to confusion over novel computational devices, and even claims that every physical event is a computation. In this paper, we introduce a formal framework that can be used to determine whether a physical system is performing a computation. We demonstrate how the abstract computational level interacts with the physical device level, in comparison with the use of mathematical models in experimental science. This powerful formulation allows a precise description of experiments, technology, computation and simulation, giving our central conclusion: physical computing is the use of a physical system to predict the outcome of an abstract evolution. We give conditions for computing, illustrated using a range of non-standard computing scenarios. The framework also covers broader computing contexts, where there is no obvious human computer user. We introduce the notion of a 'computational entity', and its critical role in defining when computing is taking place in physical systems.
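    The paper's central condition can be pictured as a commuting diagram: a physical system computes when representing its evolved state agrees with abstractly evolving its representation. Below is a minimal illustrative sketch under that reading; the function names and the toy voltage-halving device are my own assumptions, not the paper's notation.

```python
def commutes(p, evolve_physical, evolve_abstract, represent, tol=1e-9):
    """Check represent(H(p)) == C(represent(p)) for one evolution step.

    All names here are illustrative assumptions, not the paper's notation.
    """
    via_abstract = evolve_abstract(represent(p))   # predict via the abstract model
    via_physical = represent(evolve_physical(p))   # observe the physical device
    return abs(via_abstract - via_physical) <= tol

# Toy "computer": an idealized device whose voltage halves in one step,
# intended to represent the abstract operation x -> x / 2.
halve_abstract = lambda x: x / 2.0
halve_physical = lambda v: v * 0.5   # idealized device dynamics (assumption)
represent = lambda v: v              # voltage directly encodes the number

print(commutes(3.0, halve_physical, halve_abstract, represent))  # True
```

A faulty device (say, one that scales by 0.4 instead of 0.5) fails the check, which is the sense in which the diagram's commuting distinguishes computing from mere physical evolution.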

  2. When does a physical system compute?

    PubMed Central

    Horsman, Clare; Stepney, Susan; Wagner, Rob C.; Kendon, Viv

    2014-01-01

    Computing is a high-level process of a physical system. Recent interest in non-standard computing systems, including quantum and biological computers, has brought this physical basis of computing to the forefront. There has been, however, no consensus on how to tell if a given physical system is acting as a computer or not, leading to confusion over novel computational devices, and even claims that every physical event is a computation. In this paper, we introduce a formal framework that can be used to determine whether a physical system is performing a computation. We demonstrate how the abstract computational level interacts with the physical device level, in comparison with the use of mathematical models in experimental science. This powerful formulation allows a precise description of experiments, technology, computation and simulation, giving our central conclusion: physical computing is the use of a physical system to predict the outcome of an abstract evolution. We give conditions for computing, illustrated using a range of non-standard computing scenarios. The framework also covers broader computing contexts, where there is no obvious human computer user. We introduce the notion of a ‘computational entity’, and its critical role in defining when computing is taking place in physical systems. PMID:25197245

  3. State-Transition Structures in Physics and in Computation

    NASA Astrophysics Data System (ADS)

    Petri, C. A.

    1982-12-01

    In order to establish close connections between physical and computational processes, it is assumed that the concepts of “state” and of “transition” are acceptable both to physicists and to computer scientists, at least in an informal way. The aim of this paper is to propose formal definitions of state and transition elements on the basis of very low level physical concepts in such a way that (1) all physically possible computations can be described as embedded in physical processes; (2) the computational aspects of physical processes can be described on a well-defined level of abstraction; (3) the gulf between the continuous models of physics and the discrete models of computer science can be bridged by simple mathematical constructs which may be given a physical interpretation; (4) a combinatorial, nonstatistical definition of “information” can be given on low levels of abstraction which may serve as a basis to derive higher-level concepts of information, e.g., by a statistical or probabilistic approach. Conceivable practical consequences are discussed.

  4. Computer Self-Efficacy, Computer Anxiety, Performance and Personal Outcomes of Turkish Physical Education Teachers

    ERIC Educational Resources Information Center

    Aktag, Isil

    2015-01-01

    The purpose of this study is to determine the computer self-efficacy, performance outcome, personal outcome, and affect and anxiety level of physical education teachers. Influence of teaching experience, computer usage and participation of seminars or in-service programs on computer self-efficacy level were determined. The subjects of this study…

  5. The challenges of developing computational physics: the case of South Africa

    NASA Astrophysics Data System (ADS)

    Salagaram, T.; Chetty, N.

    2013-08-01

    Most modern scientific research problems are complex and interdisciplinary in nature. It is impossible to study such problems in detail without the use of computation in addition to theory and experiment. Although it is widely agreed that students should be introduced to computational methods at the undergraduate level, it remains a challenge to do this in a full traditional undergraduate curriculum. In this paper, we report on a survey that we conducted of undergraduate physics curricula in South Africa to determine the content and the approach taken in the teaching of computational physics. We also considered the pedagogy of computational physics at the postgraduate and research levels at various South African universities, research facilities and institutions. We conclude that the state of computational physics training in South Africa, especially at the undergraduate teaching level, is generally weak and needs to be given more attention at all universities. Failure to do so will impact negatively on the country's capacity to grow its endeavours in the field of the computational sciences, with negative impacts on research, commerce and industry.

  6. New modalities for scientific engagement in Africa - the case for computational physics

    NASA Astrophysics Data System (ADS)

    Chetty, N.

    2011-09-01

    Computational physics as a mode of studying the mathematical and physical sciences has grown world-wide over the past two decades, but this trend is yet to fully develop in Africa. The essential ingredients are there for this to happen: increasing internet connectivity, cheaper computing resources and the widespread availability of open source and freeware. The missing ingredients centre on intellectual isolation and low levels of quality international collaboration. A low level of research funding from local governments remains a critical issue. This paper motivates the importance of developing computational physics at the undergraduate, graduate and research levels, and gives suggestions on how this may be achieved within the African context. It is argued that students develop a more intuitive feel for the mathematical and physical sciences, that they learn useful, transferable skills that make graduates highly sought after in industrial and commercial environments, and that such graduates are better prepared to tackle research problems at the master's and doctoral levels. At the research level, the case of the African School Series on Electronic Structure Methods and Applications (ASESMA) is presented as a new multi-national modality for engaging with African scientists. There are many novel aspects to this School series, which are discussed.

  7. Microcomputers in a Beginning Tertiary Physics Course.

    ERIC Educational Resources Information Center

    Pearce, J. M.; O'Brien, R.

    1986-01-01

    Describes a college-level physics course which focuses on both physics knowledge/skills and use of microcomputers. Types of experiments done with the computers and how students use the computers to treat data are considered. (JN)

  8. White-collar workers' self-reported physical symptoms associated with using computers.

    PubMed

    Korpinen, Leena; Pääkkönen, Rauno; Gobba, Fabriziomaria

    2012-01-01

    The aim of our work was to study the physical symptoms of upper- and lower-level white-collar workers using a questionnaire. The study was cross-sectional with a questionnaire posted to 15 000 working-age persons. The responses (6121) included 970 upper- and 1150 lower-level white-collar workers. In the upper- and lower-level white-collar worker groups, 45.7 and 56.0%, respectively, had experienced pain, numbness and aches in the neck either pretty often or more frequently. When comparing daily computer users and nonusers, there were significant differences in pain, numbness and aches in the neck or in the shoulders. In addition, age and gender influenced some physical symptoms. In the future, it is essential to take into account that working with computers can be especially associated with physical symptoms in the neck and in the shoulders when workers use computers daily.

  9. Computing the Free Energy Barriers for Less by Sampling with a Coarse Reference Potential while Retaining Accuracy of the Target Fine Model.

    PubMed

    Plotnikov, Nikolay V

    2014-08-12

    Proposed in this contribution is a protocol for calculating fine-physics (e.g., ab initio QM/MM) free-energy surfaces at a high level of accuracy locally (e.g., only at reactants and at the transition state for computing the activation barrier) from targeted fine-physics sampling and extensive exploratory coarse-physics sampling. The full free-energy surface is still computed, but at a lower level of accuracy, from coarse-physics sampling. The method is analytically derived in terms of the umbrella sampling and free-energy perturbation methods, which are combined with the thermodynamic cycle and the targeted sampling strategy of the paradynamics approach. The algorithm starts by computing low-accuracy fine-physics free-energy surfaces from the coarse-physics sampling in order to identify the reaction path and to select regions for targeted sampling. Thus, the algorithm does not rely on the coarse-physics minimum free-energy reaction path. Next, segments of high-accuracy free-energy surface are computed locally at selected regions from the targeted fine-physics sampling and are positioned relative to the coarse-physics free-energy shifts. The positioning is done by averaging the free-energy perturbations computed with the multistep linear response approximation method. This method is analytically shown to provide the results of the thermodynamic integration and free-energy interpolation methods, while being extremely simple in implementation. Incorporating metadynamics sampling into the algorithm is also briefly outlined. The application is demonstrated by calculating the B3LYP//6-31G*/MM free-energy barrier for an enzymatic reaction using a semiempirical PM6/MM reference potential. These modifications allow computing the activation free energies at a significantly reduced computational cost but at the same level of accuracy compared to computing the full potential of mean force.
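    The two estimators this abstract builds on can be illustrated in miniature. The sketch below is my own hedged illustration, not the authors' implementation: it applies the Zwanzig free-energy perturbation (FEP) formula and a single-step linear response approximation (LRA) to synthetic Gaussian energy-gap samples (the temperature, means and widths are invented for the example).

```python
import math
import random

kT = 0.596  # kcal/mol at roughly 300 K

def fep(dU_samples):
    """Zwanzig free-energy perturbation: dA = -kT * ln< exp(-dU/kT) >_ref."""
    avg = sum(math.exp(-du / kT) for du in dU_samples) / len(dU_samples)
    return -kT * math.log(avg)

def lra(dU_ref, dU_target):
    """Linear response approximation: dA ~ (<dU>_ref + <dU>_target) / 2."""
    mean = lambda xs: sum(xs) / len(xs)
    return 0.5 * (mean(dU_ref) + mean(dU_target))

random.seed(0)
# Synthetic Gaussian energy gaps (illustrative numbers only, not real QM/MM data)
dU_ref = [random.gauss(2.0, 0.5) for _ in range(20000)]
dU_tgt = [random.gauss(1.6, 0.5) for _ in range(20000)]
print(round(fep(dU_ref), 2), round(lra(dU_ref, dU_tgt), 2))  # both land near 1.8
```

For Gaussian gaps the FEP answer is analytically mu - sigma^2/(2 kT), so the two estimators agreeing here is a property of the toy data; the paper's multistep LRA averages such perturbations over several intermediate potentials.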

  10. Computing the Free Energy Barriers for Less by Sampling with a Coarse Reference Potential while Retaining Accuracy of the Target Fine Model

    PubMed Central

    2015-01-01

    Proposed in this contribution is a protocol for calculating fine-physics (e.g., ab initio QM/MM) free-energy surfaces at a high level of accuracy locally (e.g., only at reactants and at the transition state for computing the activation barrier) from targeted fine-physics sampling and extensive exploratory coarse-physics sampling. The full free-energy surface is still computed, but at a lower level of accuracy, from coarse-physics sampling. The method is analytically derived in terms of the umbrella sampling and free-energy perturbation methods, which are combined with the thermodynamic cycle and the targeted sampling strategy of the paradynamics approach. The algorithm starts by computing low-accuracy fine-physics free-energy surfaces from the coarse-physics sampling in order to identify the reaction path and to select regions for targeted sampling. Thus, the algorithm does not rely on the coarse-physics minimum free-energy reaction path. Next, segments of high-accuracy free-energy surface are computed locally at selected regions from the targeted fine-physics sampling and are positioned relative to the coarse-physics free-energy shifts. The positioning is done by averaging the free-energy perturbations computed with the multistep linear response approximation method. This method is analytically shown to provide the results of the thermodynamic integration and free-energy interpolation methods, while being extremely simple in implementation. Incorporating metadynamics sampling into the algorithm is also briefly outlined. The application is demonstrated by calculating the B3LYP//6-31G*/MM free-energy barrier for an enzymatic reaction using a semiempirical PM6/MM reference potential. These modifications allow computing the activation free energies at a significantly reduced computational cost but at the same level of accuracy compared to computing the full potential of mean force. PMID:25136268

  11. Effect of Physical Education Teachers' Computer Literacy on Technology Use in Physical Education

    ERIC Educational Resources Information Center

    Kretschmann, Rolf

    2015-01-01

    Teachers' computer literacy has been identified as a factor that determines their technology use in class. The aim of this study was to investigate the relationship between physical education (PE) teachers' computer literacy and their technology use in PE. The study group consisted of 57 high school level in-service PE teachers. A survey was used…

  12. An Adaptive Mesh Algorithm: Mesh Structure and Generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scannapieco, Anthony J.

    2016-06-21

    The purpose of Adaptive Mesh Refinement is to minimize spatial errors over the computational space, not to minimize the number of computational elements. An additional result of the technique is that it may reduce the number of computational elements needed to retain a given level of spatial accuracy. Adaptive mesh refinement is a computational technique used to dynamically select, over a region of space, a set of computational elements designed to minimize spatial error in the computational model of a physical process. The fundamental idea is to increase the mesh resolution in regions where the physical variables are represented by a broad spectrum of modes in k-space, hence increasing the effective global spectral coverage of those physical variables. In addition, the selection of the spatially distributed elements is done dynamically by cyclically adjusting the mesh to follow the spectral evolution of the system. Over the years three types of AMR schemes have evolved: block, patch and locally refined AMR. In block and patch AMR, logical blocks of various grid sizes are overlaid to span the physical space of interest, whereas in locally refined AMR no logical blocks are employed but locally nested mesh levels are used to span the physical space. The distinction between block and patch AMR is that in block AMR the original blocks refine and coarsen entirely in time, whereas in patch AMR the patches change location and zone size with time. The type of AMR described herein is a locally refined AMR. In the algorithm described, at any point in physical space only one zone exists at whatever level of mesh is appropriate for that physical location. The dynamic creation of a locally refined computational mesh is made practical by a judicious selection of mesh rules. With these rules the mesh is evolved via a mesh potential designed to concentrate the finest mesh in regions where the physics is modally dense, and to coarsen zones in regions where the physics is modally sparse.
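    The idea of locally refined AMR can be sketched in one dimension. The code below is my own minimal illustration (not the OSTI algorithm, which evolves the mesh via a mesh potential): cells are split recursively wherever the local variation of the solution exceeds a tolerance, so the finest mesh clusters where the field changes sharply.

```python
def refine(cells, f, tol, max_level=5):
    """Locally refine 1-D cells (x_left, x_right, level) where f varies sharply.

    Illustrative sketch only: refinement criterion and names are assumptions.
    """
    out = []
    for (l, r, lev) in cells:
        if abs(f(r) - f(l)) > tol and lev < max_level:
            m = 0.5 * (l + r)   # split the cell and recurse on both halves
            out += refine([(l, m, lev + 1), (m, r, lev + 1)], f, tol, max_level)
        else:
            out.append((l, r, lev))
    return out

step = lambda x: 0.0 if x < 0.5 else 1.0   # "modally dense" only near x = 0.5
mesh = refine([(0.0, 1.0, 0)], step, tol=0.5)
print(len(mesh))   # small mesh; the deepest cells hug x = 0.5
```

As in the record's locally refined scheme, only one zone exists at any point in space, at whatever level is appropriate there: the smooth region keeps a single coarse cell while the step is resolved down to `max_level`.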

  13. Promoting Physical Activity through Hand-Held Computer Technology

    PubMed Central

    King, Abby C.; Ahn, David K.; Oliveira, Brian M.; Atienza, Audie A.; Castro, Cynthia M.; Gardner, Christopher D.

    2009-01-01

    Background Efforts to achieve population-wide increases in walking and similar moderate-intensity physical activities potentially can be enhanced through relevant applications of state-of-the-art interactive communication technologies. Yet few systematic efforts to evaluate the efficacy of hand-held computers and similar devices for enhancing physical activity levels have occurred. The purpose of this first-generation study was to evaluate the efficacy of a hand-held computer (i.e., personal digital assistant [PDA]) for increasing moderate intensity or more vigorous (MOD+) physical activity levels over 8 weeks in mid-life and older adults relative to a standard information control arm. Design Randomized, controlled 8-week experiment. Data were collected in 2005 and analyzed in 2006-2007. Setting/Participants Community-based study of 37 healthy, initially underactive adults aged 50 years and older who were randomized and completed the 8-week study (intervention=19, control=18). Intervention Participants received an instructional session and a PDA programmed to monitor their physical activity levels twice per day and provide daily and weekly individualized feedback, goal setting, and support. Controls received standard, age-appropriate written physical activity educational materials. Main Outcome Measure Physical activity was assessed via the Community Healthy Activities Model Program for Seniors (CHAMPS) questionnaire at baseline and 8 weeks. Results Relative to controls, intervention participants reported significantly greater 8-week mean estimated caloric expenditure levels and minutes per week in MOD+ activity (p<0.04). Satisfaction with the PDA was reasonably high in this largely PDA-naive sample. Conclusions Results from this first-generation study indicate that hand-held computers may be effective tools for increasing initial physical activity levels among underactive adults. PMID:18201644

  14. Computer vision uncovers predictors of physical urban change.

    PubMed

    Naik, Nikhil; Kominers, Scott Duke; Raskar, Ramesh; Glaeser, Edward L; Hidalgo, César A

    2017-07-18

    Which neighborhoods experience physical improvements? In this paper, we introduce a computer vision method to measure changes in the physical appearances of neighborhoods from time-series street-level imagery. We connect changes in the physical appearance of five US cities with economic and demographic data and find three factors that predict neighborhood improvement. First, neighborhoods that are densely populated by college-educated adults are more likely to experience physical improvements-an observation that is compatible with the economic literature linking human capital and local success. Second, neighborhoods with better initial appearances experience, on average, larger positive improvements-an observation that is consistent with "tipping" theories of urban change. Third, neighborhood improvement correlates positively with physical proximity to the central business district and to other physically attractive neighborhoods-an observation that is consistent with the "invasion" theories of urban sociology. Together, our results provide support for three classical theories of urban change and illustrate the value of using computer vision methods and street-level imagery to understand the physical dynamics of cities.

  15. Computer vision uncovers predictors of physical urban change

    PubMed Central

    Naik, Nikhil; Kominers, Scott Duke; Raskar, Ramesh; Glaeser, Edward L.; Hidalgo, César A.

    2017-01-01

    Which neighborhoods experience physical improvements? In this paper, we introduce a computer vision method to measure changes in the physical appearances of neighborhoods from time-series street-level imagery. We connect changes in the physical appearance of five US cities with economic and demographic data and find three factors that predict neighborhood improvement. First, neighborhoods that are densely populated by college-educated adults are more likely to experience physical improvements—an observation that is compatible with the economic literature linking human capital and local success. Second, neighborhoods with better initial appearances experience, on average, larger positive improvements—an observation that is consistent with “tipping” theories of urban change. Third, neighborhood improvement correlates positively with physical proximity to the central business district and to other physically attractive neighborhoods—an observation that is consistent with the “invasion” theories of urban sociology. Together, our results provide support for three classical theories of urban change and illustrate the value of using computer vision methods and street-level imagery to understand the physical dynamics of cities. PMID:28684401

  16. Effects of Computer-Assisted Jigsaw II Cooperative Learning Strategy on Physics Achievement and Retention

    ERIC Educational Resources Information Center

    Gambari, Isiaka Amosa; Yusuf, Mudasiru Olalere

    2016-01-01

    This study investigated the effects of computer-assisted Jigsaw II cooperative strategy on physics achievement and retention. The study also determined how moderating variables of achievement levels as it affects students' performance in physics when Jigsaw II cooperative learning is used as an instructional strategy. Purposive sampling technique…

  17. FPGA-Based High-Performance Embedded Systems for Adaptive Edge Computing in Cyber-Physical Systems: The ARTICo³ Framework.

    PubMed

    Rodríguez, Alfonso; Valverde, Juan; Portilla, Jorge; Otero, Andrés; Riesgo, Teresa; de la Torre, Eduardo

    2018-06-08

    Cyber-Physical Systems are experiencing a paradigm shift in which processing has been relocated to the distributed sensing layer and is no longer performed in a centralized manner. This approach, usually referred to as Edge Computing, demands the use of hardware platforms that are able to manage the steadily increasing requirements in computing performance, while keeping energy efficiency and the adaptability imposed by the interaction with the physical world. In this context, SRAM-based FPGAs and their inherent run-time reconfigurability, when coupled with smart power management strategies, are a suitable solution. However, they usually fail in user accessibility and ease of development. In this paper, an integrated framework to develop FPGA-based high-performance embedded systems for Edge Computing in Cyber-Physical Systems is presented. This framework provides a hardware-based processing architecture, an automated toolchain, and a runtime to transparently generate and manage reconfigurable systems from high-level system descriptions without additional user intervention. Moreover, it provides users with support for dynamically adapting the available computing resources to switch the working point of the architecture in a solution space defined by computing performance, energy consumption and fault tolerance. Results show that it is indeed possible to explore this solution space at run time and prove that the proposed framework is a competitive alternative to software-based edge computing platforms, being able to provide not only faster solutions, but also higher energy efficiency for computing-intensive algorithms with significant levels of data-level parallelism.

  18. Assessing the physical loading of wearable computers.

    PubMed

    Knight, James F; Baber, Chris

    2007-03-01

    Wearable computers enable workers to interact with computer equipment in situations where previously they could not. Attaching a computer to the body, though, has unknown physical effects. This paper reports a methodology for addressing this by assessing postural effects and the effect of added weight. Using the example of arm-mounted computers (AMCs), the paper shows that adopting a posture to interact with an AMC generates fatiguing levels of stress, and that a load of 0.54 kg results in an increased level of stress and an increased rate of fatigue. The paper shows that, due to the poor postures adopted when wearing and interacting with computers and the weight of the device attached to the body, one possible outcome of prolonged exposure is the development of musculoskeletal disorders.

  19. Computational electromagnetics: the physics of smooth versus oscillatory fields.

    PubMed

    Chew, W C

    2004-03-15

    This paper starts by discussing the difference in the physics between solutions to Laplace's equation (static) and Maxwell's equations for dynamic problems (Helmholtz equation). Their differing physical characters are illustrated by how the two fields convey information away from their source point. The paper elucidates the fact that their differing physical characters affect the use of Laplacian field and Helmholtz field in imaging. They also affect the design of fast computational algorithms for electromagnetic scattering problems. Specifically, a comparison is made between fast algorithms developed using wavelets, the simple fast multipole method, and the multi-level fast multipole algorithm for electrodynamics. The impact of the physical characters of the dynamic field on the parallelization of the multi-level fast multipole algorithm is also discussed. The relationship of diagonalization of translators to group theory is presented. Finally, future areas of research for computational electromagnetics are described.
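    The contrast this abstract draws between smooth static fields and oscillatory dynamic fields shows up directly in the free-space Green's functions of the Laplace and Helmholtz equations. A small sketch of my own (not the paper's code) makes the point numerically:

```python
import cmath
import math

def g_laplace(r):
    """Static (Laplace) free-space Green's function: monotone 1/r decay."""
    return 1.0 / (4.0 * math.pi * r)

def g_helmholtz(r, k):
    """Dynamic (Helmholtz) kernel: oscillates at wavenumber k while decaying."""
    return cmath.exp(1j * k * r) / (4.0 * math.pi * r)

rs = [0.5 + 0.25 * i for i in range(20)]
static = [g_laplace(r) for r in rs]
dynamic = [g_helmholtz(r, 10.0).real for r in rs]

print(all(a > b for a, b in zip(static, static[1:])))        # monotone decay: True
print(any(a * b < 0 for a, b in zip(dynamic, dynamic[1:])))  # sign changes: True
```

The oscillation is what lets the dynamic field carry wave information away from its source, and it is also why compressing the Helmholtz kernel (as the multi-level fast multipole algorithm does) is harder than compressing the smooth static one.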

  20. PSI for Low-Enrollment Junior-Senior Physics Courses

    ERIC Educational Resources Information Center

    Frahm, Charles P.; Young, Robert D.

    1976-01-01

    The administration of a Personalized System of Instruction (PSI) for junior-senior level courses in mechanics, electricity and magnetism, atomic physics, mathematical physics, physics and computers, astrophysics, and relativity is described. (CP)

  1. Greek Undergraduate Physical Education Students' Basic Computer Skills

    ERIC Educational Resources Information Center

    Adamakis, Manolis; Zounhia, Katerina

    2013-01-01

    The purposes of this study were to determine how undergraduate physical education (PE) students feel about their level of competence concerning basic computer skills and to examine possible differences between groups (gender, specialization, high school graduation type, and high school direction). Although many students and educators believe…

  2. Computers in Undergraduate Science Education. Conference Proceedings.

    ERIC Educational Resources Information Center

    Blum, Ronald, Ed.

    Six areas of computer use in undergraduate education, particularly in the fields of mathematics and physics, are discussed in these proceedings. The areas included are: the computational mode; computer graphics; the simulation mode; analog computing; computer-assisted instruction; and the current politics and management of college level computer…

  3. Open-System Quantum Annealing in Mean-Field Models with Exponential Degeneracy

    DTIC Science & Technology

    2016-08-25

    ...life quantum computers are inevitably affected by intrinsic noise resulting in dissipative nonunitary dynamics realized by these devices. We consider an... quantum computer. DOI: 10.1103/PhysRevX.6.021028. Subject Areas: Condensed Matter Physics, Quantum Physics, Quantum Information. Quantum computing hardware is affected by a substantial level of intrinsic noise and therefore naturally realizes dissipative quantum dynamics [1,2]...

  4. Physics Problem Workbook, Instructor Manual.

    ERIC Educational Resources Information Center

    Jones, John L.

    This publication of Computer Oriented Materials Production for Undergraduate Teaching (COMPUTe), is intended to aid in the development of an autotutorial program for college-level undergraduate physics. Particularly in the area of mechanics, the author feels there is a need for a tutorial program which enables students to use a variety of…

  5. Computer-Based Physics: An Anthology.

    ERIC Educational Resources Information Center

    Blum, Ronald, Ed.

    Designed to serve as a guide for integrating interactive problem-solving or simulating computers into a college-level physics course, this anthology contains nine articles each of which includes an introduction, a student manual, and a teacher's guide. Among areas covered in the articles are the computerized reduction of data to a Gaussian…

  6. Algodoo: A Tool for Encouraging Creativity in Physics Teaching and Learning

    NASA Astrophysics Data System (ADS)

    Gregorcic, Bor; Bodin, Madelen

    2017-01-01

    Algodoo (http://www.algodoo.com) is a digital sandbox for physics 2D simulations. It allows students and teachers to easily create simulated "scenes" and explore physics through a user-friendly and visually attractive interface. In this paper, we present different ways in which students and teachers can use Algodoo to visualize and solve physics problems, investigate phenomena and processes, and engage in out-of-school activities and projects. Algodoo, with its approachable interface, inhabits a middle ground between computer games and "serious" computer modeling. It is suitable as an entry-level modeling tool for students of all ages and can facilitate discussions about the role of computer modeling in physics.

  7. The Impact of Internet Virtual Physics Laboratory Instruction on the Achievement in Physics, Science Process Skills and Computer Attitudes of 10th-Grade Students

    NASA Astrophysics Data System (ADS)

    Yang, Kun-Yuan; Heh, Jia-Sheng

    2007-10-01

    The purpose of this study was to investigate and compare the impact of Internet Virtual Physics Laboratory (IVPL) instruction with traditional laboratory instruction on the physics academic achievement, performance of science process skills, and computer attitudes of tenth-grade students. One hundred and fifty students from four classes at one private senior high school in Taoyuan County, Taiwan, R.O.C. were sampled. The 150 students were divided equally into an experimental group and a control group of 75 students each. The pre-test results indicated that the students' entry-level physics academic achievement, science process skills, and computer attitudes were equal for both groups. On the post-test, the experimental group achieved significantly higher mean scores in physics academic achievement and science process skills. There was no significant difference in computer attitudes between the groups. We concluded that the IVPL had potential to help tenth graders improve their physics academic achievement and science process skills.

  8. Computing by physical interaction in neurons.

    PubMed

    Aur, Dorian; Jog, Mandar; Poznanski, Roman R

    2011-12-01

    The electrodynamics of action potentials represents the fundamental level where information is integrated and processed in neurons. The Hodgkin-Huxley model cannot explain the non-stereotyped spatial charge density dynamics that occur during action potential propagation. Revealed in experiments as spike directivity, the non-uniform charge density dynamics within neurons carry meaningful information and suggest that fragments of information regarding our memories are endogenously stored in structural patterns at a molecular level and are revealed only during spiking activity. The main conceptual idea is that under the influence of electric fields, efficient computation by interaction occurs between charge densities embedded within molecular structures and the transient developed flow of electrical charges. This process of computation underlying electrical interactions and molecular mechanisms at the subcellular level is dissimilar from spiking neuron models that are completely devoid of physical interactions. Computation by interaction describes a more powerful continuous model of computation than the one that consists of discrete steps as represented in Turing machines.

  9. Undergraduate computational physics projects on quantum computing

    NASA Astrophysics Data System (ADS)

    Candela, D.

    2015-08-01

    Computational projects on quantum computing suitable for students in a junior-level quantum mechanics course are described. In these projects students write their own programs to simulate quantum computers. Knowledge is assumed of introductory quantum mechanics through the properties of spin 1/2. Initial, more easily programmed projects treat the basics of quantum computation, quantum gates, and Grover's quantum search algorithm. These are followed by more advanced projects to increase the number of qubits and implement Shor's quantum factoring algorithm. The projects can be run on a typical laptop or desktop computer, using most programming languages. Supplementing resources available elsewhere, the projects are presented here in a self-contained format especially suitable for a short computational module for physics students.
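    The kind of program this record describes, a state-vector simulation of quantum gates and Grover's search, can be sketched in a few lines of NumPy. This is an illustrative sketch of what a student's simulator might look like, not code from the article; all function names are hypothetical:

    ```python
    import numpy as np

    def hadamard_all(n):
        """n-qubit Hadamard transform as a 2^n x 2^n matrix."""
        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
        M = np.array([[1.0]])
        for _ in range(n):
            M = np.kron(M, H)
        return M

    def grover_search(n, marked):
        """Run Grover's algorithm on n qubits to find index `marked`."""
        N = 2 ** n
        # Start in the uniform superposition H^{(x)n} |0...0>.
        state = hadamard_all(n) @ np.eye(N)[:, 0]
        # Oracle flips the sign of the marked amplitude.
        oracle = np.eye(N)
        oracle[marked, marked] = -1
        # Diffusion operator: inversion about the mean amplitude.
        diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)
        for _ in range(int(np.floor(np.pi / 4 * np.sqrt(N)))):
            state = diffusion @ (oracle @ state)
        return int(np.argmax(np.abs(state) ** 2))

    print(grover_search(3, marked=5))  # → 5
    ```

    Building the full 2^n-dimensional operators, as here, is the simplest approach and explains why such projects run comfortably on a laptop only up to a modest number of qubits.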

  10. Transient Three-Dimensional Side Load Analysis of a Film Cooled Nozzle

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Guidos, Mike

    2008-01-01

    Transient three-dimensional numerical investigations of the side load physics for an engine encompassing a film-cooled nozzle extension and a regeneratively cooled thrust chamber were performed. The objectives of this study were to identify the three-dimensional side load physics and to compute the associated aerodynamic side load using an anchored computational methodology. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics formulation and a transient inlet history based on an engine system simulation. Ultimately, the computational results will be provided to the nozzle designers for estimating the effect of the peak side load on the nozzle structure. Computations simulating engine startup at ambient pressures corresponding to sea level and three high altitudes were performed. In addition, computations for both engine startup and shutdown transients were performed for a stub nozzle operating at sea level. For the engine with the full nozzle extension starting up at sea level, the computational results show that the peak side load occurs when the lambda shock steps into the turbine exhaust flow, while the side load caused by the transition from free-shock separation to restricted-shock separation comes second; the side loads decrease rapidly and progressively as the ambient pressure decreases. For the stub nozzle operating at sea level, the computed side loads during both startup and shutdown become very small due to the much reduced flow area.

  11. Assessing the Integration of Computational Modeling and ASU Modeling Instruction in the High School Physics Classroom

    NASA Astrophysics Data System (ADS)

    Aiken, John; Schatz, Michael; Burk, John; Caballero, Marcos; Thoms, Brian

    2012-03-01

    We describe the assessment of computational modeling in a ninth grade classroom in the context of the Arizona Modeling Instruction physics curriculum. Using a high-level programming environment (VPython), students develop computational models to predict the motion of objects under a variety of physical situations (e.g., constant net force), to simulate real-world phenomena (e.g., a car crash), and to visualize abstract quantities (e.g., acceleration). The impact of teaching computation is evaluated through a proctored assignment that asks the students to complete a provided program to represent the correct motion. Using questions isomorphic to the Force Concept Inventory, we gauge students' understanding of force in relation to the simulation. The students are given an open-ended essay question that asks them to explain the steps they would use to model a physical situation. We also investigate the attitudes and prior experiences of each student using the Computational Modeling in Physics Attitudinal Student Survey (COMPASS) developed at Georgia Tech as well as a prior computational experiences survey.
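    The constant-net-force models mentioned above are typically written as a short Euler-Cromer update loop. A minimal plain-Python sketch of that kind of model follows (the function name and parameters are illustrative, not taken from the curriculum; VPython versions differ only in using vector objects and visualization):

    ```python
    def simulate_constant_force(m, F, v0, x0, t_end, dt=0.001):
        """1-D motion of a particle of mass m under constant net force F."""
        x, v, t = x0, v0, 0.0
        while t < t_end:
            a = F / m          # Newton's second law
            v = v + a * dt     # update velocity first (Euler-Cromer)
            x = x + v * dt     # then position, using the new velocity
            t = t + dt
        return x, v

    # A 2 kg cart pushed with 4 N from rest: analytically x = at^2/2 = 1 m,
    # v = at = 2 m/s at t = 1 s; the numerical result agrees to ~0.1%.
    x, v = simulate_constant_force(m=2.0, F=4.0, v0=0.0, x0=0.0, t_end=1.0)
    ```

    Comparing the loop's output against the closed-form kinematics, as in the comment, is exactly the kind of check such assignments ask students to perform.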

  12. PCs and Personal Health.

    ERIC Educational Resources Information Center

    Lombardi, Don

    1991-01-01

    Studies suggest that computer work stations may induce high levels of physical and psychological stress. Advises school districts to take a proactive stance on ergonomics. Cites laws and pending litigation regulating computer use in the workspace and offers guidelines for computer users. (MLF)

  13. A Methodological Study of a Computer-Managed Instructional Program in High School Physics.

    ERIC Educational Resources Information Center

    Denton, Jon James

    The purpose of this study was to develop and evaluate an instructional model which utilized the computer to produce individually prescribed instructional guides in physics at the secondary school level. The sample consisted of three classes. Of these, two were randomly selected to serve as the treatment groups, e.g., individualized instruction and…

  14. Tensor Arithmetic, Geometric and Mathematic Principles of Fluid Mechanics in Implementation of Direct Computational Experiments

    NASA Astrophysics Data System (ADS)

    Bogdanov, Alexander; Khramushin, Vasily

    2016-02-01

    The architecture of a digital computing system determines the technical foundation of a unified mathematical language for exact arithmetic-logical description of phenomena and laws of continuum mechanics for applications in fluid mechanics and theoretical physics. The deep parallelization of the computing processes results in functional programming at a new technological level, providing traceability of the computing processes with automatic application of multiscale hybrid circuits and adaptive mathematical models for the true reproduction of the fundamental laws of physics and continuum mechanics.

  15. High-precision arithmetic in mathematical physics

    DOE PAGES

    Bailey, David H.; Borwein, Jonathan M.

    2015-05-12

    For many scientific calculations, particularly those involving empirical data, IEEE 32-bit floating-point arithmetic produces results of sufficient accuracy, while for other applications IEEE 64-bit floating-point arithmetic is more appropriate. But for some very demanding applications, even higher levels of precision are often required. This article discusses the challenge of high-precision computation in the context of mathematical physics and highlights what facilities are required to support future computation, in light of emerging developments in computer architecture.
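    The gap between 64-bit floats and extended precision can be seen with Python's standard decimal module. This is only an illustrative sketch; the article itself concerns dedicated high-precision packages and their performance:

    ```python
    from decimal import Decimal, getcontext

    # In IEEE 64-bit floats, a term below the ~1e-16 machine epsilon
    # added to 1.0 is lost entirely to rounding:
    print(1.0 + 1e-20 == 1.0)        # → True: the small term vanishes

    # With 50 digits of working precision the sum is represented exactly.
    getcontext().prec = 50
    a = Decimal(1) + Decimal("1e-20")
    print(a)                          # → 1.00000000000000000001
    ```

    Ill-conditioned sums and cancellations of this kind are the typical reason mathematical-physics calculations resort to software-based high precision despite its cost.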

  16. Computational Modeling of the Optical Rotation of Amino Acids: An "in Silico" Experiment for Physical Chemistry

    ERIC Educational Resources Information Center

    Simpson, Scott; Autschbach, Jochen; Zurek, Eva

    2013-01-01

    A computational experiment that investigates the optical activity of the amino acid valine has been developed for an upper-level undergraduate physical chemistry laboratory course. Hybrid density functional theory calculations were carried out for valine to confirm the rule that adding a strong acid to a solution of an amino acid in the l…

  17. A Methodological Study Evaluating a Pretutorial Computer-Compiled Instructional Program in High School Physics Instruction Initiated from Student-Teacher Selected Instructional Objectives. Final Report.

    ERIC Educational Resources Information Center

    Leonard, B. Charles; Denton, Jon J.

    A study sought to develop and evaluate an instructional model which utilized the computer to produce individually prescribed instructional guides to account for the idiosyncratic variations among students in physics classes at the secondary school level. The students in the treatment groups were oriented toward the practices of selecting…

  18. The Use of Computer Competencies of Students in the Departments of Physical Education and Sport Teaching, and School Teaching

    ERIC Educational Resources Information Center

    Okan, Ilyas

    2016-01-01

    This study aims to reveal the levels of use of the computer, nowadays one of the most important technologies, among teacher candidates studying in the departments of Physical Education and Sport Teaching, and School Teaching; it also aims to investigate whether there are differences according to various criteria. In the research, data were…

  19. Verification of Electromagnetic Physics Models for Parallel Computing Architectures in the GeantV Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amadio, G.; et al.

    An intensive R&D and programming effort is required to accomplish new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting in parallel particles in complex geometries exploiting instruction level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both the multi-core and the many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics model effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.

  20. BigData and computing challenges in high energy and nuclear physics

    NASA Astrophysics Data System (ADS)

    Klimentov, A.; Grigorieva, M.; Kiryanov, A.; Zarochentsev, A.

    2017-06-01

    In this contribution we discuss the various aspects of the computing resource needs of experiments in High Energy and Nuclear Physics, in particular at the Large Hadron Collider. These needs will evolve in the future when moving from the LHC to the HL-LHC in ten years from now, when the already exascale levels of data we are processing could increase by a further order of magnitude. The distributed computing environment has been a great success, and the inclusion of new supercomputing facilities, cloud computing and volunteer computing for the future is a big challenge, which we are successfully mastering with a considerable contribution from many supercomputing centres around the world, academic and commercial cloud providers. We also discuss R&D computing projects started recently at the National Research Center "Kurchatov Institute".

  1. A Computational Experiment on Single-Walled Carbon Nanotubes

    ERIC Educational Resources Information Center

    Simpson, Scott; Lonie, David C.; Chen, Jiechen; Zurek, Eva

    2013-01-01

    A computational experiment that investigates single-walled carbon nanotubes (SWNTs) has been developed and employed in an upper-level undergraduate physical chemistry laboratory course. Computations were carried out to determine the electronic structure, radial breathing modes, and the influence of the nanotube's diameter on the…

  2. The Impact and Promise of Open-Source Computational Material for Physics Teaching

    NASA Astrophysics Data System (ADS)

    Christian, Wolfgang

    2017-01-01

    A computer-based modeling approach to teaching must be flexible because students and teachers have different skills and varying levels of preparation. Learning how to run the ``software du jour'' is not the objective for integrating computational physics material into the curriculum. Learning computational thinking, how to use computation and computer-based visualization to communicate ideas, how to design and build models, and how to use ready-to-run models to foster critical thinking is the objective. Our computational modeling approach to teaching is a research-proven pedagogy that predates computers. It attempts to enhance student achievement through the Modeling Cycle. This approach was pioneered by Robert Karplus and the SCIS Project in the 1960s and 70s and later extended by the Modeling Instruction Program led by Jane Jackson and David Hestenes at Arizona State University. This talk describes a no-cost open-source computational approach aligned with a Modeling Cycle pedagogy. Our tools, curricular material, and ready-to-run examples are freely available from the Open Source Physics Collection hosted on the AAPT-ComPADRE digital library. Examples will be presented.

  3. Step-by-step magic state encoding for efficient fault-tolerant quantum computation

    PubMed Central

    Goto, Hayato

    2014-01-01

    Quantum error correction allows one to make quantum computers fault-tolerant against unavoidable errors due to decoherence and imperfect physical gate operations. However, the fault-tolerant quantum computation requires impractically large computational resources for useful applications. This is a current major obstacle to the realization of a quantum computer. In particular, magic state distillation, which is a standard approach to universality, consumes the most resources in fault-tolerant quantum computation. For the resource problem, here we propose step-by-step magic state encoding for concatenated quantum codes, where magic states are encoded step by step from the physical level to the logical one. To manage errors during the encoding, we carefully use error detection. Since the sizes of intermediate codes are small, it is expected that the resource overheads will become lower than previous approaches based on the distillation at the logical level. Our simulation results suggest that the resource requirements for a logical magic state will become comparable to those for a single logical controlled-NOT gate. Thus, the present method opens a new possibility for efficient fault-tolerant quantum computation. PMID:25511387

  4. Step-by-step magic state encoding for efficient fault-tolerant quantum computation.

    PubMed

    Goto, Hayato

    2014-12-16

    Quantum error correction allows one to make quantum computers fault-tolerant against unavoidable errors due to decoherence and imperfect physical gate operations. However, the fault-tolerant quantum computation requires impractically large computational resources for useful applications. This is a current major obstacle to the realization of a quantum computer. In particular, magic state distillation, which is a standard approach to universality, consumes the most resources in fault-tolerant quantum computation. For the resource problem, here we propose step-by-step magic state encoding for concatenated quantum codes, where magic states are encoded step by step from the physical level to the logical one. To manage errors during the encoding, we carefully use error detection. Since the sizes of intermediate codes are small, it is expected that the resource overheads will become lower than previous approaches based on the distillation at the logical level. Our simulation results suggest that the resource requirements for a logical magic state will become comparable to those for a single logical controlled-NOT gate. Thus, the present method opens a new possibility for efficient fault-tolerant quantum computation.

  5. Programming languages and compiler design for realistic quantum hardware.

    PubMed

    Chong, Frederic T; Franklin, Diana; Martonosi, Margaret

    2017-09-13

    Quantum computing sits at an important inflection point. For years, high-level algorithms for quantum computers have shown considerable promise, and recent advances in quantum device fabrication offer hope of utility. A gap still exists, however, between the hardware size and reliability requirements of quantum computing algorithms and the physical machines foreseen within the next ten years. To bridge this gap, quantum computers require appropriate software to translate and optimize applications (toolflows) and abstraction layers. Given the stringent resource constraints in quantum computing, information passed between layers of software and implementations will differ markedly from in classical computing. Quantum toolflows must expose more physical details between layers, so the challenge is to find abstractions that expose key details while hiding enough complexity.

  6. Programming languages and compiler design for realistic quantum hardware

    NASA Astrophysics Data System (ADS)

    Chong, Frederic T.; Franklin, Diana; Martonosi, Margaret

    2017-09-01

    Quantum computing sits at an important inflection point. For years, high-level algorithms for quantum computers have shown considerable promise, and recent advances in quantum device fabrication offer hope of utility. A gap still exists, however, between the hardware size and reliability requirements of quantum computing algorithms and the physical machines foreseen within the next ten years. To bridge this gap, quantum computers require appropriate software to translate and optimize applications (toolflows) and abstraction layers. Given the stringent resource constraints in quantum computing, information passed between layers of software and implementations will differ markedly from in classical computing. Quantum toolflows must expose more physical details between layers, so the challenge is to find abstractions that expose key details while hiding enough complexity.

  7. Transverse and Quantum Effects in Light Control by Light; (A) Parallel Beams: Pump Dynamics for Three Level Superfluorescence; and (B) Counterflow Beams: An Algorithm for Transverse, Full Transient Effects in Optical Bi-Stability in a Fabryperot Cavity.

    DTIC Science & Technology

    1983-01-01

    Accuracy and computational economy are achieved simultaneously by redistributing the computational grid points according to the physical requirements of the nonlinear problem; the resolution of the computational grid is thereby defined according to the actual requirements of the problem. The adaptive redistribution of the computational Eulerian grid points is implemented using a two-dimensional, time-dependent finite-difference scheme.

  8. Physics Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1984

    1984-01-01

    Discusses: (1) Brewster's angle in the elementary laboratory; (2) color mixing by computer; (3) computer iteration at A-level; (4) a simple probe for pressure measurement; (5) the measurement of distance using a laser; and (6) an activity on Archimede's principle. (JN)

  9. Modeling the dynamics of multipartite quantum systems created departing from two-level systems using general local and non-local interactions

    NASA Astrophysics Data System (ADS)

    Delgado, Francisco

    2017-12-01

    Quantum information is an emergent area merging physics, mathematics, computer science and engineering. To reach its technological goals, it requires adequate approaches to understand how to combine physical restrictions, computational approaches and technological requirements to achieve functional universal quantum information processing. This work presents the modeling and analysis of a certain general type of Hamiltonian representing several physical systems used in quantum information, establishing a dynamics reduction in a natural grammar for bipartite processing based on entangled states.

  10. Selection of a computer code for Hanford low-level waste engineered-system performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGrail, B.P.; Mahoney, L.A.

    Planned performance assessments for the proposed disposal of low-level waste (LLW) glass produced from remediation of wastes stored in underground tanks at Hanford, Washington will require calculations of radionuclide release rates from the subsurface disposal facility. These calculations will be done with the aid of computer codes. Currently available computer codes were ranked in terms of the feature sets implemented in the code that match a set of physical, chemical, numerical, and functional capabilities needed to assess release rates from the engineered system. The needed capabilities were identified from an analysis of the important physical and chemical processes expected to affect LLW glass corrosion and the mobility of radionuclides. The highest ranked computer code was found to be the ARES-CT code developed at PNL for the US Department of Energy for evaluation of land disposal sites.

  11. Comparisons of physical experiment and discrete element simulations of sheared granular materials in an annular shear cell

    USGS Publications Warehouse

    Ji, S.; Hanes, D.M.; Shen, H.H.

    2009-01-01

    In this study, we report a direct comparison between a physical test and a computer simulation of rapidly sheared granular materials. An annular shear cell experiment was conducted. All parameters were kept the same between the physical and the computational systems to the extent possible. Artificially softened particles were used in the simulation to reduce the computational time to a manageable level; a sensitivity study on the particle stiffness ensured that this artificial modification was acceptable. In the experiment, a range of normal stress was applied to a given amount of particles sheared in an annular trough at a range of controlled shear speeds. Two types of particles, glass and Delrin, were used in the experiment. Qualitatively, the simulated torque required to shear the materials under different rotational speeds compared well with that in the physical experiments for both the glass and the Delrin particles. However, the quantitative discrepancies between the measured and simulated shear stresses were nearly a factor of two. Boundary conditions, particle size distribution, particle damping and friction, including a sliding and rolling contact force model, were examined to determine their effects on the computational results. It was found that, of the above, the rolling friction between particles had the most significant effect on the macro stress level. This study shows that discrete element simulation is a viable method for engineering design for granular material systems. Particle-level information is needed to properly conduct these simulations. However, not all particle-level information is equally important in the study regime. Rolling friction, which is not commonly considered in many discrete element models, appears to play an important role. © 2009 Elsevier Ltd.
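    Discrete element simulations of the kind compared here are built from particle-level contact force models. A minimal one-dimensional linear spring-dashpot sketch is below; the stiffness k and damping c values are illustrative placeholders, not parameters from the study, and the study's actual model also includes sliding and rolling friction:

    ```python
    import math

    def normal_contact_force(x1, x2, r1, r2, v1, v2, k=1.0e4, c=5.0):
        """Normal force on particle 1 from particle 2 (1-D centers),
        using a linear spring (repulsion ~ overlap) plus a dashpot
        (damping ~ normal relative velocity)."""
        overlap = (r1 + r2) - abs(x2 - x1)
        if overlap <= 0:
            return 0.0                        # particles not in contact
        n = math.copysign(1.0, x1 - x2)       # unit normal toward particle 1
        rel_v = (v1 - v2) * n                 # approach speed along the normal
        # Clamp at zero so the contact never pulls the particles together:
        return max(k * overlap - c * rel_v, 0.0) * n
    ```

    In a full simulation this force is summed over all contacts each time step and fed into a Newton's-second-law update, which is why artificially softening the particles (lowering k) lengthens the stable time step and cuts the run time.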

  12. Student Perceptions in the Design of a Computer Card Game for Learning Computer Literacy Issues: A Case Study

    ERIC Educational Resources Information Center

    Kordaki, Maria; Papastergiou, Marina; Psomos, Panagiotis

    2016-01-01

    The aim of this work was twofold. First, an empirical study was designed aimed at investigating the perceptions that entry-level non-computing majors--namely Physical Education and Sport Science (PESS) undergraduate students--hold about basic Computer Literacy (CL) issues. The participants were 90 first-year PESS students, and their perceptions…

  13. Computer Use and Computer Anxiety in Older Korean Americans.

    PubMed

    Yoon, Hyunwoo; Jang, Yuri; Xie, Bo

    2016-09-01

    Responding to the limited literature on computer use in ethnic minority older populations, the present study examined predictors of computer use and computer anxiety in older Korean Americans. Separate regression models were estimated for computer use and computer anxiety with the common sets of predictors: (a) demographic variables (age, gender, marital status, and education), (b) physical health indicators (chronic conditions, functional disability, and self-rated health), and (c) sociocultural factors (acculturation and attitudes toward aging). Approximately 60% of the participants were computer-users, and they had significantly lower levels of computer anxiety than non-users. A higher likelihood of computer use and lower levels of computer anxiety were commonly observed among individuals with younger age, male gender, advanced education, more positive ratings of health, and higher levels of acculturation. In addition, positive attitudes toward aging were found to reduce computer anxiety. Findings provide implications for developing computer training and education programs for the target population. © The Author(s) 2015.

  14. Computation of Chemical Shifts for Paramagnetic Molecules: A Laboratory Experiment for the Undergraduate Curriculum

    ERIC Educational Resources Information Center

    Pritchard, Benjamin P.; Simpson, Scott; Zurek, Eva; Autschbach, Jochen

    2014-01-01

    A computational experiment investigating the [superscript 1]H and [superscript 13]C nuclear magnetic resonance (NMR) chemical shifts of molecules with unpaired electrons has been developed and implemented. This experiment is appropriate for an upper-level undergraduate laboratory course in computational, physical, or inorganic chemistry. The…

  15. Success in introductory college physics: The role of gender, high school preparation, and student learning perceptions

    NASA Astrophysics Data System (ADS)

    Chen, Jean Chi-Jen

    Physics is fundamental for science, engineering, medicine, and for understanding many phenomena encountered in people's daily lives. The purpose of this study was to investigate the relationships between student success in college-level introductory physics courses and various educational and background characteristics. The primary variables of this study were gender, high school mathematics and science preparation, preference and perceptions of learning physics, and performance in introductory physics courses. Demographic characteristics considered were age, student grade level, parents' occupation and level of education, high school senior grade point average, and educational goals. A Survey of Learning Preference and Perceptions was developed to collect the information for this study. A total of 267 subjects enrolled in six introductory physics courses, four algebra-based and two calculus-based, participated in the study conducted during Spring Semester 2002. The findings from the algebra-based physics courses indicated that participant's educational goal, high school senior GPA, father's educational level, mother's educational level, and mother's occupation in the area of science, engineering, or computer technology were positively related to performance while participant age was negatively related. Biology preparation, mathematics preparation, and additional mathematics and science preparation in high school were also positively related to performance. The relationships between the primary variables and performance in calculus-based physics courses were limited to high school senior year GPA and high school physics preparation. 
Findings from all six courses indicated that participant's educational goal, high school senior GPA, father's educational level, and mother's occupation in the area of science, engineering, or computer technology, high school preparation in mathematics, biology, and the completion of additional mathematics and science courses were positively related to performance. No significant performance differences were found between male and female students. However, there were significant gender differences in physics learning perceptions. Female participants tended to try to understand physics materials and relate the physics problems to real world situations while their male counterparts tended to rely on rote learning and equation application. This study found that participants performed better by trying to understand the physics material and relate physics problems to real world situations. Participants who relied on rote learning did not perform well.

  16. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    NASA Astrophysics Data System (ADS)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    ACAT2011 This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facility Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. 
We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the success of the workshop. Further information on ACAT 2011 can be found at http://acat2011.cern.ch Dr Liliana Teodorescu Brunel University ACATgroup The PDF also contains details of the workshop's committees and sponsors.

  17. Performance Modeling of Experimental Laser Lightcrafts

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Chen, Yen-Sen; Liu, Jiwen; Myrabo, Leik N.; Mead, Franklin B., Jr.; Turner, Jim (Technical Monitor)

    2001-01-01

    A computational plasma aerodynamics model is developed to study the performance of a laser-propelled Lightcraft. The computational methodology is based on a time-accurate, three-dimensional, finite-difference, chemically reacting, unstructured-grid, pressure-based formulation. The underlying physics are added and tested systematically using a building-block approach. The physics modeled include non-equilibrium thermodynamics, non-equilibrium air-plasma finite-rate kinetics, specular ray tracing, laser beam energy absorption and refraction by plasma, non-equilibrium plasma radiation, and plasma resonance. A series of transient computations are performed at several laser pulse energy levels, and the simulated physics are discussed and compared with those of tests and the literature. The predicted coupling coefficients for the Lightcraft compared reasonably well with those of tests conducted on a pendulum apparatus.

  18. Adolescent Sedentary Behaviors: Correlates Differ for Television Viewing and Computer Use

    PubMed Central

    Babey, Susan H.; Hastert, Theresa A.; Wolstein, Joelle

    2013-01-01

    Purpose: Sedentary behavior is associated with obesity in youth. Understanding correlates of specific sedentary behaviors can inform the development of interventions to reduce sedentary time. The current research examines correlates of leisure computer use and television viewing among California adolescents. Methods: Using data from the 2005 California Health Interview Survey (CHIS), we examined individual, family and environmental correlates of two sedentary behaviors among 4,029 adolescents: leisure computer use and television watching. Results: Linear regression analyses adjusting for a range of factors indicated several differences in the correlates of television watching and computer use. Correlates of additional time spent watching television included male sex, American Indian and African American race, lower household income, lower levels of physical activity, lower parent educational attainment, and additional hours worked by parents. Correlates of a greater amount of time spent using the computer for fun included older age, Asian race, higher household income, lower levels of physical activity, less parental knowledge of free time activities, and living in neighborhoods with higher proportions of non-white residents and higher proportions of low-income residents. Only physical activity was associated similarly with both watching television and computer use. Conclusions: These results suggest that correlates of time spent on television watching and leisure computer use are different. Reducing screen time is a potentially successful strategy in combating childhood obesity, and understanding differences in the correlates of different screen time behaviors can inform the development of more effective interventions to reduce sedentary time. PMID:23260837

  19. Informatics and physics intersubject communications in the 7th and 8th grades of the basics level by means of computer modeling

    NASA Astrophysics Data System (ADS)

    Vasina, A. V.

    2017-01-01

    The author of the article shares pedagogical experience in realizing intersubject connections between the school courses of informatics, technology and physics through students' research activity, using specialized programs for developing and studying computer models of physical processes. The technique is based on the principles of students' independent scholarly activity and on intersubject connections between the disciplines of technology, physics and informatics; it helps develop students' research activity and gives education a professional and practical orientation. As an example, a lesson on modeling flotation using the "1C Physical simulator" environment is considered.

  20. Exploring the Integration of Computational Modeling in the ASU Modeling Curriculum

    NASA Astrophysics Data System (ADS)

    Schatz, Michael; Aiken, John; Burk, John; Caballero, Marcos; Douglas, Scott; Thoms, Brian

    2012-03-01

    We describe the implementation of computational modeling in a ninth grade classroom in the context of the Arizona Modeling Instruction physics curriculum. Using a high-level programming environment (VPython), students develop computational models to predict the motion of objects under a variety of physical situations (e.g., constant net force), to simulate real-world phenomena (e.g., a car crash), and to visualize abstract quantities (e.g., acceleration). We discuss how VPython allows students to utilize all four structures that describe a model as given by the ASU Modeling Instruction curriculum. Implications for future work will also be discussed.
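The "constant net force" model mentioned above reduces to a short update loop; here is a plain-Python sketch of the Euler-Cromer scheme students typically write in VPython (the function name and step size are illustrative, not from the curriculum):

```python
def simulate_constant_force(mass, force, v0, x0, dt, steps):
    """Euler-Cromer update loop: the core of a VPython-style motion model."""
    x, v = x0, v0
    for _ in range(steps):
        a = force / mass      # Newton's second law
        v += a * dt           # update velocity first (Euler-Cromer)
        x += v * dt           # then position with the new velocity
    return x, v

# 1 kg object, 2 N net force, starting from rest: after 1 s, v = 2 m/s
# and x is close to the analytic 1 m.
x, v = simulate_constant_force(mass=1.0, force=2.0, v0=0.0, x0=0.0,
                               dt=0.001, steps=1000)
```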

  1. Quantum Computing Architectural Design

    NASA Astrophysics Data System (ADS)

    West, Jacob; Simms, Geoffrey; Gyure, Mark

    2006-03-01

    Large scale quantum computers will invariably require scalable architectures in addition to high fidelity gate operations. Quantum computing architectural design (QCAD) addresses the problems of actually implementing fault-tolerant algorithms given physical and architectural constraints beyond those of basic gate-level fidelity. Here we introduce a unified framework for QCAD that enables the scientist to study the impact of varying error correction schemes, architectural parameters including layout and scheduling, and physical operations native to a given architecture. Our software package, aptly named QCAD, provides compilation, manipulation/transformation, multi-paradigm simulation, and visualization tools. We demonstrate various features of the QCAD software package through several examples.

  2. Lock It Up! Computer Security.

    ERIC Educational Resources Information Center

    Wodarz, Nan

    1997-01-01

    The data contained on desktop computer systems and networks pose security issues for virtually every district. Sensitive information can be protected by educating users, altering the physical layout, using password protection, designating access levels, backing up data, reformatting floppy disks, using antivirus software, and installing encryption…

  3. Computational physics in RISC environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhoades, C.E. Jr.

    The new high-performance Reduced Instruction Set Computers (RISC) promise near Cray-level performance at near personal-computer prices. This paper explores the performance, conversion and compatibility issues associated with developing, testing and using our traditional, large-scale simulation models in the RISC environments exemplified by the IBM RS6000 and MIPS R3000 machines. The questions of operating systems (CTSS versus UNIX), compilers (Fortran, C, pointers) and data are addressed in detail. Overall, it is concluded that the RISC environments are practical for a very wide range of computational physics activities. Indeed, all but the very largest two- and three-dimensional codes will work quite well, particularly in a single-user environment. Easily projected hardware-performance increases will revolutionize the field of computational physics. The way we do research will change profoundly in the next few years. There is, however, nothing more difficult to plan, nor more dangerous to manage, than the creation of this new world.

  4. Computational physics in RISC environments. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhoades, C.E. Jr.

    The new high-performance Reduced Instruction Set Computers (RISC) promise near Cray-level performance at near personal-computer prices. This paper explores the performance, conversion and compatibility issues associated with developing, testing and using our traditional, large-scale simulation models in the RISC environments exemplified by the IBM RS6000 and MIPS R3000 machines. The questions of operating systems (CTSS versus UNIX), compilers (Fortran, C, pointers) and data are addressed in detail. Overall, it is concluded that the RISC environments are practical for a very wide range of computational physics activities. Indeed, all but the very largest two- and three-dimensional codes will work quite well, particularly in a single-user environment. Easily projected hardware-performance increases will revolutionize the field of computational physics. The way we do research will change profoundly in the next few years. There is, however, nothing more difficult to plan, nor more dangerous to manage, than the creation of this new world.

  5. Nuclear and Particle Physics Simulations: The Consortium of Upper-Level Physics Software

    NASA Astrophysics Data System (ADS)

    Bigelow, Roberta; Moloney, Michael J.; Philpott, John; Rothberg, Joseph

    1995-06-01

    The Consortium for Upper Level Physics Software (CUPS) has developed a comprehensive series of nine book/software packages that Wiley will publish in FY '95 and '96. CUPS is an international group of 27 physicists, all with extensive backgrounds in the research, teaching, and development of instructional software. The project is being supported by the National Science Foundation (PHY-9014548), and it has received other support from the IBM Corp., Apple Computer Corp., and George Mason University. The simulations being developed are: Astrophysics, Classical Mechanics, Electricity & Magnetism, Modern Physics, Nuclear and Particle Physics, Quantum Mechanics, Solid State, Thermal and Statistical, and Waves and Optics.

  6. P3: a practice focused learning environment

    NASA Astrophysics Data System (ADS)

    Irving, Paul W.; Obsniuk, Michael J.; Caballero, Marcos D.

    2017-09-01

    There has been an increased focus on the integration of practices into physics curricula, with a particular emphasis on integrating computation into the undergraduate curriculum of scientists and engineers. In this paper, we present a university-level, introductory physics course for science and engineering majors at Michigan State University called P3 (projects and practices in physics) that is centred around providing introductory physics students with the opportunity to appropriate various science and engineering practices. The P3 design integrates computation with analytical problem solving and is built upon a curriculum foundation of problem-based learning, the principles of constructive alignment and the theoretical framework of community of practice. The design includes an innovative approach to computational physics instruction, instructional scaffolds, and a unique approach to assessment that enables instructors to guide students in the development of the practices of a physicist. We present the very positive student-related outcomes of the design, gathered via attitudinal and conceptual inventories and research interviews of students reflecting on their experiences in the P3 classroom.

  7. CAL-laborate: A Collaborative Publication on the Use of Computer Aided Learning for Tertiary Level Physical Sciences and Geosciences.

    ERIC Educational Resources Information Center

    Fernandez, Anne, Ed.; Sproats, Lee, Ed.; Sorensen, Stacey, Ed.

    2000-01-01

    The science community has been trying to use computers in teaching for many years. There has been little conformity in how this was to be achieved, and the wheel has been re-invented again and again as enthusiast after enthusiast has "done their bit" towards getting computers accepted. Computers are now used by science undergraduates (as well as…

  8. Performance Modeling of an Experimental Laser Propelled Lightcraft

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Chen, Yen-Sen; Liu, Jiwen; Myrabo, Leik N.; Mead, Franklin B., Jr.

    2000-01-01

    A computational plasma aerodynamics model is developed to study the performance of an experimental laser propelled lightcraft. The computational methodology is based on a time-accurate, three-dimensional, finite-difference, chemically reacting, unstructured grid, pressure-based formulation. The underlying physics are added and tested systematically using a building-block approach. The physics modeled include non-equilibrium thermodynamics, non-equilibrium air-plasma finite-rate kinetics, specular ray tracing, laser beam energy absorption and refraction by plasma, non-equilibrium plasma radiation, and plasma resonance. A series of transient computations are performed at several laser pulse energy levels, and the simulated physics are discussed and compared with those of tests and the literature. The predicted coupling coefficients for the lightcraft compared reasonably well with those of tests conducted on a pendulum apparatus.

  9. PREFACE: 1st International Workshop on Theoretical and Computational Physics: Condensed Matter, Soft Matter and Materials Physics & 38th National Conference on Theoretical Physics

    NASA Astrophysics Data System (ADS)

    2014-09-01

    This volume contains selected papers presented at the 38th National Conference on Theoretical Physics (NCTP-38) and the 1st International Workshop on Theoretical and Computational Physics: Condensed Matter, Soft Matter and Materials Physics (IWTCP-1). Both the conference and the workshop were held from 29 July to 1 August 2013 at the Pullman hotel, Da Nang, Vietnam. The IWTCP-1 was a new activity of the Vietnamese Theoretical Physics Society (VTPS) organized in association with the 38th National Conference on Theoretical Physics (NCTP-38), the most well-known annual scientific forum dedicated to the dissemination of the latest developments in the field of theoretical physics within the country. The IWTCP-1 was also an External Activity of the Asia Pacific Center for Theoretical Physics (APCTP). The overriding goal of the IWTCP is to provide an international forum for scientists and engineers from academia to share ideas, problems and solutions relating to the recent advances in theoretical physics as well as in computational physics. The main IWTCP motivation is to foster scientific exchanges between the Vietnamese theoretical and computational physics community and world-wide scientists, as well as to promote a high standard of research and education activities for young physicists in the country. About 110 participants coming from 10 countries participated in the conference and the workshop. Four invited talks, 18 oral contributions and 46 posters were presented at the conference. In the workshop we had one keynote lecture and 9 invited talks presented by international experts in the fields of theoretical and computational physics, together with 14 oral and 33 poster contributions. The proceedings were edited by Nguyen Tri Lan, Trinh Xuan Hoang, and Nguyen Ai Viet. We would like to thank all invited speakers, participants and sponsors for making the conference and the workshop successful. Nguyen Ai Viet Chair of NCTP-38 and IWTCP-1

  10. Chaos: A Topic for Interdisciplinary Education in Physics

    ERIC Educational Resources Information Center

    Bae, Saebyok

    2009-01-01

    Since society and science need interdisciplinary works, the interesting topic of chaos is chosen for interdisciplinary education in physics. The educational programme contains various university-level activities such as computer simulations, chaos experiment and team projects besides ordinary teaching. According to the participants, the programme…

  11. Using Laboratory Homework to Facilitate Skill Integration and Assess Understanding in Intermediate Physics Courses

    NASA Astrophysics Data System (ADS)

    Johnston, Marty; Jalkio, Jeffrey

    2013-04-01

    By the time students reach intermediate-level physics courses, they have been exposed to a broad set of analytical, experimental, and computational skills. However, their ability to independently integrate these skills into the study of a physical system is often weak. To address this weakness, and to assess their understanding of the underlying physical concepts, we have introduced laboratory homework into lecture-based, junior-level theoretical mechanics and electromagnetics courses. A laboratory homework set replaces a traditional one and emphasizes the analysis of a single system. In an exercise, students use analytical and computational tools to predict the behavior of a system and design a simple measurement to test their model. The laboratory portion of the exercises is straightforward, and the emphasis is on concept integration and application. The short student reports we collect have revealed misconceptions that were not apparent in reviewing traditional homework and test problems. Work continues on refining the current problems and expanding the problem sets.

  12. Coal conversion systems design and process modeling. Volume 1: Application of MPPR and Aspen computer models

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The development of a coal gasification system design and mass and energy balance simulation program for the TVA and other similar facilities is described. The materials-process-product model (MPPM) and the advanced system for process engineering (ASPEN) computer program were selected from available steady state and dynamic models. The MPPM was selected to serve as the basis for development of the system level design model structure because it provided the capability for process block material and energy balance and high-level systems sizing and costing. The ASPEN simulation serves as the basis for assessing detailed component models for the system design modeling program. The ASPEN components were analyzed to identify particular process blocks and data packages (physical properties) which could be extracted and used in the system design modeling program. While ASPEN physical properties calculation routines are capable of generating physical properties required for process simulation, not all required physical property data are available, and missing data must be user-entered.

  13. The role of individual differences on perceptions of wearable fitness device trust, usability, and motivational impact.

    PubMed

    Rupp, Michael A; Michaelis, Jessica R; McConnell, Daniel S; Smither, Janan A

    2018-07-01

    Lack of physical activity is a severe health concern in the United States, with fewer than half of all Americans meeting the recommended weekly physical activity guidelines. Although wearable fitness devices can be effective in motivating people to be active, consumers are abandoning this technology soon after purchase. We examined the impact of several user characteristics (i.e. personality, age, computer self-efficacy, physical activity level) and device characteristics (trust, usability, and motivational affordances) on behavioral intentions to use a wearable fitness device. Novice users completed a brief interaction with a fitness device, similar to a first purchase experience, before completing questionnaires about their interaction. We found that computer self-efficacy, physical activity level, and personality traits indirectly increased the desire to use a fitness device and influenced the saliency of perceived motivational affordances. Additionally, trust, usability, and perceived motivational affordances were associated with increased intentions to use fitness devices. Copyright © 2018 Elsevier Ltd. All rights reserved.

  14. VizieR Online Data Catalog: Transition probabilities in TeII + TeIII spectra (Zhang+, 2013)

    NASA Astrophysics Data System (ADS)

    Zhang, W.; Palmeri, P.; Quinet, P.; Biemont, E.

    2013-02-01

    Computed weighted oscillator strengths (log gf) and transition probabilities (gA) for Te II (Table 8) and Te III (Table 9). Only transitions with wavelengths <1 μm, log gf > -1 and CF > 0.05 are quoted. Air wavelengths are given above 200 nm. In Table 8 the levels are taken from Kramida et al. (Kramida, A., Ralchenko, Yu., Reader, J., and NIST ASD Team (2012). NIST Atomic Spectra Database (ver. 5.0), [Online]. Available: http://physics.nist.gov/asd [2012, September 20]. National Institute of Standards and Technology, Gaithersburg, MD.). In Table 9 the levels are those given in Tauheed & Naz (Tauheed, A., Naz, A. 2011, Journal of the Korean Physical Society 59, 2910), with the exception of the 5p6p levels, which were taken from Kramida et al. The wavelengths were computed from the experimental levels of Kramida et al. and Tauheed & Naz. (2 data files).

  15. Middle School Students' Body Mass Index and Physical Activity Levels in Physical Education

    ERIC Educational Resources Information Center

    Gao, Zan; Oh, Hyunju; Sheng, Huiping

    2011-01-01

    One of the most critical public concerns in the United States is the rapid increase in childhood obesity, partly due to the social and environmental changes (e.g., excessive TV and computer use, pressures of standardized testing, etc.) in the past few decades, which has resulted in less physical activity in school children's daily routines.…

  16. PLATO Based Computer Assisted Instruction: An Exploration.

    ERIC Educational Resources Information Center

    Wise, Richard L.

    This study focuses on student response to computer-assisted instruction (CAI) after it was introduced into a college level physical geography course, "Introduction to Weather and Climate." PLATO, a University of Illinois mainframe network developed in the 1960s, was selected for its user friendliness, its large supply of courseware, its…

  17. Teaching Computer Science Courses in Distance Learning

    ERIC Educational Resources Information Center

    Huan, Xiaoli; Shehane, Ronald; Ali, Adel

    2011-01-01

    As the success of distance learning (DL) has driven universities to increase the courses offered online, certain challenges arise when teaching computer science (CS) courses to students who are not physically co-located and have individual learning schedules. Teaching CS courses involves high-level demonstrations and interactivity between the…

  18. The AAPT/ComPADRE Digital Library: Supporting Physics Education at All Levels

    NASA Astrophysics Data System (ADS)

    Mason, Bruce

    For more than a decade, the AAPT/ComPADRE Digital Library has been providing online resources, tools, and services that support broad communities of physics faculty and physics education researchers. This online library provides vetted resources for teachers and students, an environment for authors and developers to share their work, and the collaboration tools for a diverse set of users. This talk will focus on the recent collaborations and developments being hosted on or developed with ComPADRE. Examples include PhysPort, making the tools and resources developed by physics education researchers more accessible, the Open Source Physics project, expanding the use of numerical modeling at all levels of physics education, and PICUP, a community for those promoting computation in the physics curriculum. NSF-0435336, 0532798, 0840768, 0937836.

  19. Experiments with Helium-Filled Balloons

    ERIC Educational Resources Information Center

    Zable, Anthony C.

    2010-01-01

    The concepts of Newtonian mechanics, fluids, and ideal gas law physics are often treated as separate and isolated topics in the typical introductory college-level physics course, especially in the laboratory setting. To bridge these subjects, a simple experiment was developed that utilizes computer-based data acquisition sensors and a digital gram…

  20. Estimated activity patterns in British 45 year olds: cross-sectional findings from the 1958 British birth cohort.

    PubMed

    Parsons, T J; Thomas, C; Power, C

    2009-08-01

    To investigate patterns of, and associations between, physical activity at work and in leisure time, television viewing and computer use. 4531 men and 4594 women with complete plausible data, age 44-45 years, participating in the 1958 British birth cohort study. Physical activity, television viewing and computer use (hours/week) were estimated using a self-complete questionnaire, and intensity (MET hours/week) was derived for physical activity. Relationships were investigated using linear regression and chi-squared tests. From a target sample of 11,971, 9223 provided information on physical activity, of whom 75% and 47% provided complete and plausible activity data on work and leisure time activity respectively. Men and women spent a median of 40.2 and 34.2 h/week, respectively, in work activity, and 8.3 and 5.8 h/week in leisure activity. Half of all participants watched television for ≥2 h/day, and half used a computer for <1 h/day. Longer work hours were not associated with a shorter duration of leisure activity, but were associated with a shorter duration of computer use (men only). In men, higher work MET hours were associated with higher leisure-time MET hours, and shorter durations of television viewing and computer use. Watching more television was related to fewer hours or MET hours of leisure activity, as was longer computer use in men. Longer computer use was related to more hours (or MET hours) in leisure activities in women. Physical activity levels at work and in leisure time in mid-adulthood are low. Television viewing (and computer use in men) may compete with leisure activity for time, whereas longer duration of work hours is less influential. To change active and sedentary behaviours, better understanding of barriers and motivators is needed.
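MET hours/week, the intensity measure used above, weights each activity's weekly duration by its MET (metabolic equivalent) value; a minimal sketch with assumed MET values (the activities and values are illustrative, not taken from the study):

```python
def met_hours_per_week(activities):
    """Sum of (hours/week * MET value) over all reported activities."""
    return sum(hours * met for hours, met in activities)

# Illustrative example: 5 h/week brisk walking at an assumed 4.0 MET
# plus 2 h/week cycling at an assumed 8.0 MET.
total = met_hours_per_week([(5.0, 4.0), (2.0, 8.0)])
print(total)  # 36.0 MET hours/week
```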

  1. An investigation of the use of microcomputer-based laboratory simulations in promoting conceptual understanding in secondary physics instruction

    NASA Astrophysics Data System (ADS)

    Tomshaw, Stephen G.

    Physics education research has shown that students bring alternate conceptions to the classroom which can be quite resistant to traditional instruction methods (Clement, 1982; Halloun & Hestenes, 1985; McDermott, 1991). Microcomputer-based laboratory (MBL) experiments that employ an active-engagement strategy have been shown to improve student conceptual understanding in high school and introductory university physics courses (Thornton & Sokoloff, 1998). These MBL experiments require a specialized computer interface, type-specific sensors (e.g. motion detectors, force probes, accelerometers), and specialized software in addition to the standard physics experimental apparatus. Tao and Gunstone (1997) have shown that computer simulations used in an active-engagement environment can also lead to conceptual change. This study investigated 69 secondary physics students' use of computer simulations of MBL activities in place of the hands-on MBL laboratory activities. The average normalized gain in students' conceptual understanding was measured using the Force and Motion Conceptual Evaluation (FMCE). Student attitudes towards physics and computers were probed using the Views About Science Survey (VASS) and the Computer Attitude Scale (CAS). While it may be possible to obtain an equivalent level of conceptual understanding using computer simulations in combination with an active-engagement environment, this study found no significant gains in students' conceptual understanding (⟨g⟩ = -0.02) after they completed a series of nine simulated experiments from the Tools for Scientific Thinking curriculum (Thornton & Sokoloff, 1990). The absence of gains in conceptual understanding may indicate that either the simulations were ineffective in promoting conceptual change or problems with the implementation of the treatment inhibited its effectiveness.
There was a positive shift in students' attitudes towards physics in the VASS dimensions of structure and reflective thinking, while there was a negative shift in students' attitudes towards computers in the CAS subscales of anxiety and usefulness. The negative shift in attitudes towards computers may be due to the additional time and work required by the students to perform the simulation experiments with no apparent reward in terms of their physics grade. Suggestions for future research include a qualitative element to observe student interactions and alternate formats for the simulations themselves.
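The average normalized gain reported with the FMCE is conventionally Hake's g, the achieved fraction of the possible pre-to-post improvement; a small sketch (the class scores below are invented for illustration, not from the study):

```python
def normalized_gain(pre_pct, post_pct):
    """Hake's normalized gain g = (post - pre) / (100 - pre), for percent scores."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# A small negative class-average gain like the reported -0.02 means the
# post-test average fell slightly below the pre-test average.
g = normalized_gain(pre_pct=40.0, post_pct=38.8)
```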

  2. Universe creation on a computer

    NASA Astrophysics Data System (ADS)

    McCabe, Gordon

    The purpose of this paper is to provide an account of the epistemology and metaphysics of universe creation on a computer. The paper begins with F.J. Tipler's argument that our experience is indistinguishable from the experience of someone embedded in a perfect computer simulation of our own universe, hence we cannot know whether or not we are part of such a computer program ourselves. Tipler's argument is treated as a special case of epistemological scepticism, in a similar vein to 'brain-in-a-vat' arguments. It is argued that Tipler's hypothesis that our universe is a program running on a digital computer in another universe, generates empirical predictions, and is therefore a falsifiable hypothesis. The computer program hypothesis is also treated as a hypothesis about what exists beyond the physical world, and is compared with Kant's metaphysics of noumena. It is argued that if our universe is a program running on a digital computer, then our universe must have compact spatial topology, and the possibilities of observationally testing this prediction are considered. The possibility of testing the computer program hypothesis with the value of the density parameter Ω0 is also analysed. The informational requirements for a computer to represent a universe exactly and completely are considered. Consequent doubt is thrown upon Tipler's claim that if a hierarchy of computer universes exists, we would not be able to know which 'level of implementation' our universe exists at. It is then argued that a digital computer simulation of a universe, or any other physical system, does not provide a realisation of that universe or system. It is argued that a digital computer simulation of a physical system is not objectively related to that physical system, and therefore cannot exist as anything else other than a physical process occurring upon the components of the computer. 
It is concluded that Tipler's sceptical hypothesis, and a related hypothesis from Bostrom, cannot be true: it is impossible that our own experience is indistinguishable from the experience of somebody embedded in a digital computer simulation because it is impossible for anybody to be embedded in a digital computer simulation.

  3. The change in critical technologies for computational physics

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1990-01-01

    It is noted that the types of technology required for computational physics are changing as the field matures. Emphasis has shifted from computer technology to algorithm technology and, finally, to visual analysis technology as areas of critical research for this field. High-performance graphical workstations tied to a supercomputer via high-speed communications, along with the development of specially tailored visualization software, have enabled analysis of highly complex fluid-dynamics simulations. Particular reference is made here to the development of visual analysis tools at NASA's Numerical Aerodynamic Simulation Facility. The next technology which this field requires is one that would eliminate visual clutter by extracting key features of simulations of physics and technology in order to create displays that clearly portray these key features. Research in the tuning of visual displays to human cognitive abilities is proposed. The immediate transfer of technology to all levels of computers, specifically the inclusion of visualization primitives in basic software developments for all workstations and PCs, is recommended.

  4. Nature as a network of morphological infocomputational processes for cognitive agents

    NASA Astrophysics Data System (ADS)

    Dodig-Crnkovic, Gordana

    2017-01-01

    This paper presents a view of nature as a network of infocomputational agents organized in a dynamical hierarchy of levels. It provides a framework for unification of currently disparate understandings of natural, formal, technical, behavioral and social phenomena, based on information as structure (differences in one system that cause differences in another system) and computation as its dynamics, i.e. the physical process of morphological change in the informational structure. We address some of the frequent misunderstandings regarding the natural/morphological computational models and their relationships to physical systems, especially cognitive systems such as living beings. Natural morphological infocomputation as a conceptual framework necessitates generalization of models of computation beyond the traditional Turing machine model of symbol manipulation, and requires agent-based, concurrent, resource-sensitive models of computation in order to cover the whole range of phenomena from physics to cognition. The central role of agency, particularly material vs. cognitive agency, is highlighted.

  5. BESIII Physical Analysis on Hadoop Platform

    NASA Astrophysics Data System (ADS)

    Huo, Jing; Zang, Dongsong; Lei, Xiaofeng; Li, Qiang; Sun, Gongxing

    2014-06-01

    Over the past 20 years, computing clusters have been widely used for High Energy Physics data processing. Jobs running on a traditional cluster with a data-to-computing structure have to read large volumes of data over the network to the computing nodes for analysis, so I/O latency becomes a bottleneck for the whole system. The new distributed computing technology based on the MapReduce programming model has many advantages, such as high concurrency, high scalability and high fault tolerance, and it can help in dealing with Big Data. This paper applies the MapReduce model to BESIII physical analysis and presents a new data analysis system structure based on the Hadoop platform, which not only greatly improves the efficiency of data analysis but also reduces the cost of building the system. Moreover, this paper establishes an event pre-selection system based on the event-level metadata (TAGs) database to optimize the data analysis procedure.
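The TAG-based pre-selection described above maps naturally onto the MapReduce pattern. The following is a toy Python sketch, not the BESIII code: the TAG field layout, cut values, and function names are invented for illustration. The map phase emits only event ids whose metadata pass the cuts; the reduce phase groups them for the full analysis pass.

```python
from collections import defaultdict

# Hypothetical TAG records: per-event metadata used for pre-selection
# (event id, number of charged tracks, total energy in GeV).
tags = [
    (1, 2, 3.1), (2, 4, 3.0), (3, 2, 2.9), (4, 6, 3.2), (5, 2, 3.05),
]

def map_phase(record):
    """Emit (selection_key, event_id) only for events passing the TAG cuts."""
    event_id, n_tracks, energy = record
    if n_tracks == 2 and 2.95 <= energy <= 3.15:  # toy two-track selection
        yield ("selected", event_id)

def reduce_phase(pairs):
    """Group event ids by key, mimicking the MapReduce shuffle/reduce step."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return dict(grouped)

intermediate = [kv for record in tags for kv in map_phase(record)]
result = reduce_phase(intermediate)
print(result)  # only these events are read fully in the analysis stage
```

In the real system the map tasks run on the Hadoop nodes holding the data, so only the small selected subset crosses the network.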

  6. Optimization of Multi-Fidelity Computer Experiments via the EQIE Criterion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    He, Xu; Tuo, Rui; Jeff Wu, C. F.

    Computer experiments based on mathematical models are powerful tools for understanding physical processes. This article addresses the problem of kriging-based optimization for deterministic computer experiments with tunable accuracy. Our approach is to use multi-fidelity computer experiments with increasing accuracy levels and a nonstationary Gaussian process model. We propose an optimization scheme that sequentially adds new computer runs by following two criteria. The first criterion, called EQI, scores candidate inputs at a given level of accuracy, and the second criterion, called EQIE, scores candidate combinations of inputs and accuracy. In simulation results and a real example using finite element analysis, our method outperforms the expected improvement (EI) criterion, which works for single-accuracy experiments.

  7. Optimization of Multi-Fidelity Computer Experiments via the EQIE Criterion

    DOE PAGES

    He, Xu; Tuo, Rui; Jeff Wu, C. F.

    2017-01-31

    Computer experiments based on mathematical models are powerful tools for understanding physical processes. This article addresses the problem of kriging-based optimization for deterministic computer experiments with tunable accuracy. Our approach is to use multi-fidelity computer experiments with increasing accuracy levels and a nonstationary Gaussian process model. We propose an optimization scheme that sequentially adds new computer runs by following two criteria. The first criterion, called EQI, scores candidate inputs at a given level of accuracy, and the second criterion, called EQIE, scores candidate combinations of inputs and accuracy. In simulation results and a real example using finite element analysis, our method outperforms the expected improvement (EI) criterion, which works for single-accuracy experiments.
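For readers unfamiliar with the baseline the authors compare against, the classical expected improvement (EI) criterion for single-accuracy kriging can be sketched in a few lines of Python. This is a minimal illustration with invented posterior values, not the authors' EQI/EQIE implementation.

```python
import math

def normal_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_improvement(mu, sigma, best):
    """EI at a candidate input whose GP posterior has mean mu and sd sigma,
    for minimization with current best observed value `best`."""
    if sigma <= 0.0:
        return max(best - mu, 0.0)
    z = (best - mu) / sigma
    return (best - mu) * normal_cdf(z) + sigma * normal_pdf(z)

# Toy posterior (mean, sd) at three candidate inputs; pick the largest EI.
candidates = [(0.8, 0.05), (1.0, 0.40), (1.2, 0.90)]
best_so_far = 1.0
scores = [expected_improvement(m, s, best_so_far) for m, s in candidates]
print(max(range(3), key=scores.__getitem__))
```

Note how the third candidate wins despite its worse mean: EI trades off predicted value against posterior uncertainty, which is exactly the trade-off the EQIE criterion extends to a choice of accuracy level as well.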

  8. Computation in Classical Mechanics with Easy Java Simulations (EJS)

    NASA Astrophysics Data System (ADS)

    Cox, Anne J.

    2006-12-01

    Let your students enjoy creating animations and incorporating some computational physics into your Classical Mechanics course. This talk demonstrates the use of an Open Source Physics package, Easy Java Simulations (EJS), in an existing sophomore/junior-level Classical Mechanics course. EJS allows for the incremental introduction of computational physics into existing courses because it is easy to use (for instructors and students alike) and it is open source. Students can use this tool for numerical solutions to problems (as they can with commercial packages such as Mathcad and Mathematica), but they can also generate their own animations. For example, students in Classical Mechanics use Lagrangian mechanics to solve a problem, and then use EJS not only to numerically solve the differential equations but also to show the associated motion (and check their answers). EJS, developed by Francisco Esquembre (http://fem.um.es/Ejs/), is built on the Open Source Physics framework (http://www.opensourcephysics.org/) supported through NSF DUE0442581.
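EJS itself is Java-based, but the numerical core of such a student exercise, integrating the equation of motion obtained from the Lagrangian, can be sketched in a few lines. The Python below is a generic fourth-order Runge-Kutta integration of a pendulum, not EJS code.

```python
import math

def rk4_step(f, t, y, h):
    """One classical Runge-Kutta step for y' = f(t, y), with y a tuple."""
    k1 = f(t, y)
    k2 = f(t + h / 2, tuple(yi + h / 2 * ki for yi, ki in zip(y, k1)))
    k3 = f(t + h / 2, tuple(yi + h / 2 * ki for yi, ki in zip(y, k2)))
    k4 = f(t + h, tuple(yi + h * ki for yi, ki in zip(y, k3)))
    return tuple(yi + h / 6 * (a + 2 * b + 2 * c + d)
                 for yi, a, b, c, d in zip(y, k1, k2, k3, k4))

# Pendulum equation from the Lagrangian: theta'' = -(g/L) sin(theta)
g, L = 9.81, 1.0
def pendulum(t, y):
    theta, omega = y
    return (omega, -(g / L) * math.sin(theta))

t, h, state = 0.0, 0.001, (0.5, 0.0)  # released at rest from 0.5 rad
for _ in range(10000):                 # integrate to t = 10 s
    state = rk4_step(pendulum, t, state, h)
    t += h
print(round(state[0], 3))
```

In an EJS activity, each integration step would also update the drawing of the pendulum, giving the animation that lets students check their analytic solution against the motion.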

  9. WE-DE-202-00: Connecting Radiation Physics with Computational Biology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Radiation therapy for the treatment of cancer has been established as a highly precise and effective way to eradicate a localized region of diseased tissue. To achieve further significant gains in the therapeutic ratio, we need to move towards biologically optimized treatment planning. To achieve this goal, we need to understand how the radiation-type dependent patterns of induced energy depositions within the cell (physics) connect via molecular, cellular and tissue reactions to treatment outcomes such as tumor control and undesirable effects on normal tissue. Several computational biology approaches have been developed connecting physics to biology. Monte Carlo simulations are the most accurate method to calculate physical dose distributions at the nanometer scale; however, simulations at the DNA scale are slow, and repair processes are generally not simulated. Alternative models that rely on the random formation of individual DNA lesions within one or two turns of the DNA have been shown to reproduce the clusters of DNA lesions, including single-strand breaks (SSBs) and double-strand breaks (DSBs), without the need for detailed track structure simulations. Efficient computational simulations of initial DNA damage induction facilitate computational modeling of DNA repair and other molecular and cellular processes. Mechanistic, multiscale models provide a useful conceptual framework to test biological hypotheses and help connect fundamental information about track structure and dosimetry at the sub-cellular level to dose-response effects on larger scales. In this symposium we will learn about the current state of the art of computational approaches for estimating radiation damage at the cellular and sub-cellular scale. How can understanding the physics interactions at the DNA level be used to predict biological outcome? We will discuss whether and how such calculations are relevant to advancing our understanding of radiation damage and its repair, or whether the underlying biological processes are too complex for a mechanistic approach. Can computer simulations be used to guide future biological research? We will debate the feasibility of explaining biology from a physicist's perspective. Learning Objectives: Understand the potential applications and limitations of computational methods for dose-response modeling at the molecular, cellular and tissue levels. Learn about the mechanisms of action underlying the induction, repair and biological processing of damage to DNA and other constituents. Understand how effects and processes at one biological scale impact biological processes and outcomes at other scales. Funding: J. Schuemann, NCI/NIH grants; S. McMahon, European Commission FP7 (grant EC FP7 MC-IOF-623630)

  10. Probing Majorana modes in the tunneling spectra of a resonant level.

    PubMed

    Korytár, R; Schmitteckert, P

    2013-11-27

    Unambiguous identification of Majorana physics presents an outstanding problem whose solution could render topological quantum computing feasible. We develop a numerical approach to treat finite-size superconducting chains supporting Majorana modes, which is based on iterative application of a two-site Bogoliubov transformation. We demonstrate the applicability of the method by studying a resonant level attached to the superconductor subject to external perturbations. In the topological phase, we show that the spectrum of a single resonant level allows us to distinguish peaks coming from Majorana physics from the Kondo resonance.
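As a hedged illustration of the underlying physics, not the authors' iterative two-site Bogoliubov method, the textbook Kitaev-chain Bogoliubov-de Gennes (BdG) matrix can be diagonalized numerically to expose the near-zero Majorana pair in the topological phase. The sketch below uses a plain Jacobi eigensolver so it needs only the Python standard library; chain length and parameters are illustrative.

```python
import math

def jacobi_eigenvalues(a, sweeps=50):
    """Eigenvalues of a real symmetric matrix by cyclic Jacobi rotations."""
    n = len(a)
    a = [row[:] for row in a]
    for _ in range(sweeps):
        off = math.sqrt(sum(a[i][j] ** 2
                            for i in range(n) for j in range(n) if i != j))
        if off < 1e-12:
            break
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(a[p][q]) < 1e-15:
                    continue
                phi = 0.5 * math.atan2(2.0 * a[p][q], a[q][q] - a[p][p])
                c, s = math.cos(phi), math.sin(phi)
                for k in range(n):  # rotate columns p and q
                    akp, akq = a[k][p], a[k][q]
                    a[k][p] = c * akp - s * akq
                    a[k][q] = s * akp + c * akq
                for k in range(n):  # rotate rows p and q
                    apk, aqk = a[p][k], a[q][k]
                    a[p][k] = c * apk - s * aqk
                    a[q][k] = s * apk + c * aqk
    return sorted(a[i][i] for i in range(n))

def kitaev_bdg(n, mu, t, delta):
    """Real BdG matrix [[h, d], [-d, -h]] for an open Kitaev chain of n sites."""
    h = [[0.0] * n for _ in range(n)]
    d = [[0.0] * n for _ in range(n)]
    for i in range(n):
        h[i][i] = -mu
    for i in range(n - 1):
        h[i][i + 1] = h[i + 1][i] = -t
        d[i][i + 1], d[i + 1][i] = delta, -delta
    top = [h[i] + d[i] for i in range(n)]
    bot = [[-d[i][j] for j in range(n)] + [-h[i][j] for j in range(n)]
           for i in range(n)]
    return top + bot

# Topological phase (|mu| < 2t): a Majorana pair sits at (near-)zero energy.
spectrum = jacobi_eigenvalues(kitaev_bdg(12, mu=0.0, t=1.0, delta=1.0))
print(min(abs(e) for e in spectrum))
```

Attaching a resonant level and probing its tunneling spectrum, as in the paper, is precisely a way to detect whether such a zero mode is present.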

  11. Exascale computing and what it means for shock physics

    NASA Astrophysics Data System (ADS)

    Germann, Timothy

    2015-06-01

    The U.S. Department of Energy is preparing to launch an Exascale Computing Initiative, to address the myriad challenges required to deploy and effectively utilize an exascale-class supercomputer (i.e., one capable of performing 10^18 operations per second) in the 2023 timeframe. Since physical (power dissipation) requirements limit clock rates to at most a few GHz, this will necessitate the coordination of on the order of a billion concurrent operations, requiring sophisticated system and application software, and underlying mathematical algorithms, that may differ radically from traditional approaches. Even at the smaller workstation or cluster level of computation, the massive concurrency and heterogeneity within each processor will impact computational scientists. Through the multi-institutional, multi-disciplinary Exascale Co-design Center for Materials in Extreme Environments (ExMatEx), we have initiated an early and deep collaboration between domain (computational materials) scientists, applied mathematicians, computer scientists, and hardware architects, in order to establish the relationships between algorithms, software stacks, and architectures needed to enable exascale-ready materials science application codes within the next decade. In my talk, I will discuss these challenges, and what it will mean for exascale-era electronic structure, molecular dynamics, and engineering-scale simulations of shock-compressed condensed matter. In particular, we anticipate that the emerging hierarchical, heterogeneous architectures can be exploited to achieve higher physical fidelity simulations using adaptive physics refinement. This work is supported by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research.

  12. Levels and loops: the future of artificial intelligence and neuroscience.

    PubMed Central

    Bell, A J

    1999-01-01

    In discussing artificial intelligence and neuroscience, I will focus on two themes. The first is the universality of cycles (or loops): sets of variables that affect each other in such a way that any feed-forward account of causality and control, while informative, is misleading. The second theme is based around the observation that a computer is an intrinsically dualistic entity, with its physical set-up designed so as not to interfere with its logical set-up, which executes the computation. The brain is different. When analysed empirically at several different levels (cellular, molecular), it appears that there is no satisfactory way to separate a physical brain model (or algorithm, or representation), from a physical implementational substrate. When program and implementation are inseparable and thus interfere with each other, a dualistic point-of-view is impossible. Forced by empiricism into a monistic perspective, the brain-mind appears as neither embodied by or embedded in physical reality, but rather as identical to physical reality. This perspective has implications for the future of science and society. I will approach these from a negative point-of-view, by critiquing some of our millennial culture's popular projected futures. PMID:10670021

  13. An Assessment of Research-Doctorate Programs in the United States: Mathematical & Physical Sciences.

    ERIC Educational Resources Information Center

    Jones, Lyle V., Ed.; And Others

    The quality of doctoral-level chemistry (N=145), computer science (N=58), geoscience (N=91), mathematics (N=115), physics (N=123), and statistics/biostatistics (N=64) programs at United States universities was assessed, using 16 measures. These measures focused on variables related to: program size; characteristics of graduates; reputational…

  14. Application verification research of cloud computing technology in the field of real time aerospace experiment

    NASA Astrophysics Data System (ADS)

    Wan, Junwei; Chen, Hongyan; Zhao, Jing

    2017-08-01

    To meet the real-time, reliability and safety requirements of aerospace experiments, a single-center cloud computing technology application verification platform was constructed. At the IaaS level, the feasibility of applying cloud computing technology to the field of aerospace experiments is tested and verified. Based on an analysis of the test results, a preliminary conclusion is drawn: a cloud computing platform can be applied to compute-intensive aerospace experiment workloads, while for I/O-intensive workloads the traditional physical machine is recommended.

  15. Peculiarities of organization of project and research activity of students in computer science, physics and technology

    NASA Astrophysics Data System (ADS)

    Stolyarov, I. V.

    2017-01-01

    The author of this article supervises project and research activity of students in computer science, physics, engineering and biology, drawing on experience acquired in these fields. Students regularly win competitions and conferences at different levels; for example, three were finalists of Intel ISEF in 2013 in Phoenix (Arizona, USA) and in 2014 in Los Angeles (California, USA), and in 2013 A. Makarychev received the "Small Nobel Prize" in the Computer Science section along with a special award from the sponsor, the company CAST. Scientific themes and methods suggested by the author and developed in joint publications with students from Russia, Germany and Austria have led to patents for inventions and certificates of registration with ROSPATENT. The article presents the results of implementing specific software and hardware systems in physics, engineering and medicine.

  16. Health-Related Physical Fitness, BMI, physical activity and time spent at a computer screen in 6 and 7-year-old children from rural areas in Poland.

    PubMed

    Cieśla, Elżbieta; Mleczko, Edward; Bergier, Józef; Markowska, Małgorzata; Nowak-Starz, Grażyna

    2014-01-01

    The objective of the study was to determine the effect of various forms of physical activity, BMI, and time devoted to computer games on the level of Health-Related Physical Fitness (H-RF) in 6-7-year-old children from Polish rural areas. The study covered 25,816 children aged 6-7: 12,693 girls and 13,123 boys. The evaluations included body height and weight and 4 H-RF components (trunk strength, explosive leg power, arm strength and flexibility). BMI was calculated for each child. A questionnaire directed to parents was designed to collect information concerning the time children devoted to computer games and their spontaneous and additional physical activity. The strength of the relationships between dependent and independent variables was determined using Spearman's rank correlation (RSp), and the form of the relationships by regression analysis. BMI negatively affected the level of all the H-RF components analysed (p=0.000). The negative effect of computer games revealed itself only with respect to flexibility (p=0.000), explosive leg power (p=0.000) and trunk muscle strength (p=0.000). A positive effect of spontaneous activity was observed for flexibility (p=0.047), explosive leg power (p=0.000) and arm strength (p=0.000). Additional activity showed a positive relationship with trunk muscle strength (p=0.000) and explosive leg power (p=0.000). The results suggest that attention to the prevention of diseases related to the risk of obesity and overweight among Polish rural children is necessary as early as pre-school age. There is also a need, during education, to shape in these children an awareness of concern for their own bodies and of the need for active participation in various forms of physical activity.
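The Spearman rank correlation (RSp) used in the analysis is simply the Pearson correlation of the ranks, with tied values assigned their average rank. A self-contained Python sketch with invented toy numbers, not the study's data:

```python
def ranks(xs):
    """1-based ranks with ties averaged, as used by Spearman's coefficient."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rank correlation = Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Toy data: screen-time category vs. a flexibility score (hypothetical numbers).
screen_time = [0, 1, 1, 2, 3, 3, 4]
flexibility = [21, 20, 19, 18, 17, 18, 15]
print(round(spearman(screen_time, flexibility), 3))
```

A strongly negative coefficient here would mirror the study's finding that more screen time goes with lower flexibility scores.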

  17. Aeroelastic Modeling of a Nozzle Startup Transient

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Zhao, Xiang; Zhang, Sijun; Chen, Yen-Sen

    2014-01-01

    Lateral nozzle forces are known to cause severe structural damage to any new rocket engine in development during test. While three-dimensional, transient, turbulent, chemically reacting computational fluid dynamics methodology has been demonstrated to capture major side load physics with rigid nozzles, hot-fire tests often show nozzle structure deformation during major side load events, leading to structural damage if strengthening measures are not taken. The modeling picture is incomplete without the capability to address the two-way responses between the structure and the fluid. The objective of this study is to develop a tightly coupled aeroelastic modeling algorithm by implementing the necessary structural dynamics component into an anchored computational fluid dynamics methodology. The computational fluid dynamics component is based on an unstructured-grid, pressure-based computational fluid dynamics formulation, while the computational structural dynamics component is developed under the framework of modal analysis. Transient aeroelastic nozzle startup analyses at sea level were performed, and the computed transient nozzle fluid-structure interaction physics are presented.

  18. A Computational Framework for Bioimaging Simulation.

    PubMed

    Watabe, Masaki; Arjunan, Satya N V; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher-level functions emerge from the combined action of biomolecules. However, formidable challenges remain in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.
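One of the systematic effects such a framework must model is photon-counting (shot) noise. The Python below is a minimal sketch, assuming a simple Poisson photon-count model and an invented 3x3 expected-intensity image; it is not the authors' simulator.

```python
import math
import random

def add_shot_noise(expected_photons, seed=0):
    """Apply photon-counting (Poisson) shot noise to an expected-intensity image."""
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's multiplication method; adequate for the small means used here.
        threshold, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= threshold:
                return k
            k += 1

    return [[poisson(px) for px in row] for row in expected_photons]

# Toy 3x3 "expected photon count" image with a bright centre pixel (invented).
image = [[5.0, 10.0, 5.0],
         [10.0, 50.0, 10.0],
         [5.0, 10.0, 5.0]]
noisy = add_shot_noise(image)
print(noisy)
```

A simulator of the kind described would compose many such effects (optics, detector response, background) before comparing the synthetic image with real bioimages photon-count for photon-count.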

  19. Comparison of Gross Anatomy Test Scores Using Traditional Specimens vs. Quicktime Virtual Reality Animated Specimens

    ERIC Educational Resources Information Center

    Maza, Paul Sadiri

    2010-01-01

    In recent years, technological advances such as computers have been employed in teaching gross anatomy at all levels of education, even in professional schools such as medical and veterinary medical colleges. Benefits of computer based instructional tools for gross anatomy include the convenience of not having to physically view or dissect a…

  20. Studying the Effects of Nuclear Weapons Using a Slide-Rule Computer

    ERIC Educational Resources Information Center

    Shastri, Ananda

    2007-01-01

    This paper describes the construction of a slide-rule computer that allows one to quickly determine magnitudes of several effects that result from the detonation of a nuclear device. Suggestions for exercises are also included that allow high school and college-level physics students to explore scenarios involving these effects. It is hoped that…
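Slide-rule nuclear-effects computers typically encode cube-root (yield) scaling laws: the range at which a given blast overpressure occurs scales as the cube root of the yield. A minimal Python sketch of that scaling, with hypothetical reference values not taken from the paper:

```python
def scaled_range(r1, y1_kt, y2_kt):
    """Cube-root (Hopkinson) scaling: the range at which a given blast
    overpressure occurs scales as the cube root of the yield."""
    return r1 * (y2_kt / y1_kt) ** (1.0 / 3.0)

# Hypothetical reference: a given overpressure at 1.0 km from a 20 kt burst.
r_20kt = 1.0
print(round(scaled_range(r_20kt, 20.0, 1000.0), 2))  # range for the same effect at 1 Mt
```

A slide rule implements exactly this kind of power law as a fixed offset on logarithmic scales, which is why one quick alignment answers the question for any yield.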

  1. Exploration of factors that affect the comparative effectiveness of physical and virtual manipulatives in an undergraduate laboratory

    NASA Astrophysics Data System (ADS)

    Chini, Jacquelyn J.; Madsen, Adrian; Gire, Elizabeth; Rebello, N. Sanjay; Puntambekar, Sadhana

    2012-06-01

    Recent research results have failed to support the conventionally held belief that students learn physics best from hands-on experiences with physical equipment. Rather, studies have found that students who perform similar experiments with computer simulations perform as well or better on measures of conceptual understanding than their peers who used physical equipment. In this study, we explored how university-level nonscience majors’ understanding of the physics concepts related to pulleys was supported by experimentation with real pulleys and a computer simulation of pulleys. We report that when students use one type of manipulative (physical or virtual), the comparison is influenced both by the concept studied and the timing of the post-test. Students performed similarly on questions related to force and mechanical advantage regardless of the type of equipment used. On the other hand, students who used the computer simulation performed better on questions related to work immediately after completing the activities; however, the two groups performed similarly on the work questions on a test given one week later. Additionally, both sequences of experimentation (physical-virtual and virtual-physical) equally supported students’ understanding of all of the concepts. These results suggest that both the concept learned and the stability of learning gains should continue to be explored to improve educators’ ability to select the best learning experience for a given topic.

  2. PREFACE: 2nd International Workshop on Theoretical and Computational Physics (IWTCP-2): Modern Methods and Latest Results in Particle Physics, Nuclear Physics and Astrophysics and the 39th National Conference on Theoretical Physics (NCTP-39)

    NASA Astrophysics Data System (ADS)

    Hoang, Trinh Xuan; Ky, Nguyen Anh; Lan, Nguyen Tri; Viet, Nguyen Ai

    2015-06-01

    This volume contains selected papers presented at the 2nd International Workshop on Theoretical and Computational Physics (IWTCP-2): Modern Methods and Latest Results in Particle Physics, Nuclear Physics and Astrophysics and the 39th National Conference on Theoretical Physics (NCTP-39). Both the workshop and the conference were held from 28th - 31st July 2014 in Dakruco Hotel, Buon Ma Thuot, Dak Lak, Vietnam. The NCTP-39 and the IWTCP-2 were organized with the support of the Vietnamese Theoretical Physics Society, with the motivation of fostering scientific exchanges between theoretical and computational physicists in Vietnam and worldwide, and of promoting a high standard of research and education activities for young physicists in the country. The IWTCP-2 was also an External Activity of the Asia Pacific Center for Theoretical Physics (APCTP). About 100 participants from nine countries took part in the workshop and the conference. At the IWTCP-2 workshop, 16 invited talks were presented by international experts, together with eight oral and ten poster contributions. At the NCTP-39, three invited talks, 15 oral contributions and 39 posters were presented. We would like to thank all invited speakers, participants and sponsors for making the workshop and the conference successful. Trinh Xuan Hoang, Nguyen Anh Ky, Nguyen Tri Lan and Nguyen Ai Viet

  3. Computation of NLO processes involving heavy quarks using Loop-Tree Duality

    NASA Astrophysics Data System (ADS)

    Driencourt-Mangin, Félix

    2017-03-01

    We present a new method to compute higher-order corrections to physical cross-sections, at Next-to-Leading Order and beyond. This method, based on the Loop Tree Duality, leads to locally integrable expressions in four dimensions. By introducing a physically motivated momentum mapping between the momenta involved in the real and the virtual contributions, infrared singularities naturally cancel at integrand level, without the need to introduce subtraction counter-terms. Ultraviolet singularities are dealt with by using dual representations of suitable counter-terms, with some subtleties regarding the self-energy contributions. As an example, we apply this method to compute the 1 → 2 decay rate in the context of a scalar toy model with massive particles.

  4. Assessing first-order emulator inference for physical parameters in nonlinear mechanistic models

    USGS Publications Warehouse

    Hooten, Mevin B.; Leeds, William B.; Fiechter, Jerome; Wikle, Christopher K.

    2011-01-01

    We present an approach for estimating physical parameters in nonlinear models that relies on an approximation to the mechanistic model itself for computational efficiency. The proposed methodology is validated and applied in two different modeling scenarios: (a) a simulation study and (b) a lower trophic level ocean ecosystem model. The approach we develop relies on the ability to predict right singular vectors (resulting from a decomposition of computer model experimental output) based on the computer model input and an experimental set of parameters. Critically, we model the right singular vectors in terms of the model parameters via a nonlinear statistical model. Specifically, we focus our attention on first-order models of these right singular vectors rather than the second-order (covariance) structure.
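The right singular vectors the authors model come from a decomposition of the computer-model output matrix. As a hedged, standard-library-only illustration (not the authors' code, and with an invented toy matrix), the leading right singular vector can be obtained by power iteration on A^T A:

```python
import math
import random

def leading_right_singular_vector(a, iters=200, seed=0):
    """Power iteration on A^T A yields the leading right singular vector of A."""
    rows, cols = len(a), len(a[0])
    rng = random.Random(seed)
    v = [rng.random() for _ in range(cols)]
    for _ in range(iters):
        # w = A v, then v = A^T w, renormalized each pass
        w = [sum(a[i][j] * v[j] for j in range(cols)) for i in range(rows)]
        v = [sum(a[i][j] * w[i] for i in range(rows)) for j in range(cols)]
        norm = math.sqrt(sum(x * x for x in v))
        v = [x / norm for x in v]
    return v

# Toy "computer model output" matrix: runs x outputs (hypothetical numbers).
a = [[2.0, 0.0, 0.1],
     [1.9, 0.1, 0.0],
     [0.1, 1.0, 0.0]]
v1 = leading_right_singular_vector(a)
print([round(x, 3) for x in v1])
```

In the emulator setting, a statistical model then predicts how such vectors change as the physical parameters of the mechanistic model change, avoiding repeated runs of the full model.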

  5. An Eight-Parameter Function for Simulating Model Rocket Engine Thrust Curves

    ERIC Educational Resources Information Center

    Dooling, Thomas A.

    2007-01-01

    The toy model rocket is used extensively as an example of a realistic physical system. Teachers from grade school to the university level use them. Many teachers and students write computer programs to investigate rocket physics since the problem involves nonlinear functions related to air resistance and mass loss. This paper describes a nonlinear…

  6. Overview of Threats and Failure Models for Safety-Relevant Computer-Based Systems

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2015-01-01

    This document presents a high-level overview of the threats to safety-relevant computer-based systems, including (1) a description of the introduction and activation of physical and logical faults; (2) the propagation of their effects; and (3) function-level and component-level error and failure mode models. These models can be used in the definition of fault hypotheses (i.e., assumptions) for threat-risk mitigation strategies. This document is a contribution to a guide currently under development that is intended to provide a general technical foundation for designers and evaluators of safety-relevant systems.

  7. Graphics Processing Units for HEP trigger systems

    NASA Astrophysics Data System (ADS)

    Ammendola, R.; Bauce, M.; Biagioni, A.; Chiozzi, S.; Cotta Ramusino, A.; Fantechi, R.; Fiorini, M.; Giagu, S.; Gianoli, A.; Lamanna, G.; Lonardo, A.; Messina, A.; Neri, I.; Paolucci, P. S.; Piandani, R.; Pontisso, L.; Rescigno, M.; Simula, F.; Sozzi, M.; Vicini, P.

    2016-07-01

    General-purpose computing on GPUs (Graphics Processing Units) is emerging as a new paradigm in several fields of science, although so far applications have been tailored to the specific strengths of such devices as accelerators in offline computation. With the steady reduction of GPU latencies and the increase in link and memory throughput, the use of such devices for real-time applications in high-energy physics data acquisition and trigger systems is becoming ripe. We discuss the use of online parallel computing on GPUs for a synchronous low-level trigger, focusing on the CERN NA62 experiment trigger system. The use of GPUs in higher-level trigger systems is also briefly considered.

  8. Physics of epi-thermal boron neutron capture therapy (epi-thermal BNCT).

    PubMed

    Seki, Ryoichi; Wakisaka, Yushi; Morimoto, Nami; Takashina, Masaaki; Koizumi, Masahiko; Toki, Hiroshi; Fukuda, Mitsuhiro

    2017-12-01

    The physics of epi-thermal neutrons in the human body is discussed in the effort to clarify the nature of the unique radiologic properties of boron neutron capture therapy (BNCT). This discussion leads to the computational method of Monte Carlo simulation in BNCT. The method is discussed through two examples based on model phantoms. The physics is kept at an introductory level in the discussion in this tutorial review.
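At the introductory level of this review, the Monte Carlo idea can be illustrated by a toy pencil-beam attenuation calculation (invented numbers, far simpler than a BNCT-grade phantom simulation): sample exponential free paths and compare the transmitted fraction with the analytic value exp(-depth/mfp).

```python
import math
import random

def transmitted_fraction(mfp_cm, depth_cm, n=200_000, seed=1):
    """Toy pencil-beam Monte Carlo: sample exponential free paths and count
    neutrons that cross a slab of the given depth without interacting."""
    rng = random.Random(seed)
    passed = sum(1 for _ in range(n)
                 if rng.expovariate(1.0 / mfp_cm) > depth_cm)
    return passed / n

# Analytic check: uncollided transmission through a slab is exp(-depth / mfp).
mfp, depth = 2.0, 4.0
print(round(transmitted_fraction(mfp, depth), 3),
      round(math.exp(-depth / mfp), 3))
```

A real BNCT simulation tracks energy-dependent cross sections, scattering angles, and the thermalization of epi-thermal neutrons in tissue, but the sampling principle is the same.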

  9. WE-DE-202-01: Connecting Nanoscale Physics to Initial DNA Damage Through Track Structure Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schuemann, J.

    Radiation therapy for the treatment of cancer has been established as a highly precise and effective way to eradicate a localized region of diseased tissue. To achieve further significant gains in the therapeutic ratio, we need to move towards biologically optimized treatment planning. To achieve this goal, we need to understand how the radiation-type dependent patterns of induced energy depositions within the cell (physics) connect via molecular, cellular and tissue reactions to treatment outcomes such as tumor control and undesirable effects on normal tissue. Several computational biology approaches have been developed connecting physics to biology. Monte Carlo simulations are the most accurate method to calculate physical dose distributions at the nanometer scale; however, simulations at the DNA scale are slow, and repair processes are generally not simulated. Alternative models that rely on the random formation of individual DNA lesions within one or two turns of the DNA have been shown to reproduce the clusters of DNA lesions, including single-strand breaks (SSBs) and double-strand breaks (DSBs), without the need for detailed track structure simulations. Efficient computational simulations of initial DNA damage induction facilitate computational modeling of DNA repair and other molecular and cellular processes. Mechanistic, multiscale models provide a useful conceptual framework to test biological hypotheses and help connect fundamental information about track structure and dosimetry at the sub-cellular level to dose-response effects on larger scales. In this symposium we will learn about the current state of the art of computational approaches for estimating radiation damage at the cellular and sub-cellular scale. How can understanding the physics interactions at the DNA level be used to predict biological outcome? We will discuss whether and how such calculations are relevant to advancing our understanding of radiation damage and its repair, or whether the underlying biological processes are too complex for a mechanistic approach. Can computer simulations be used to guide future biological research? We will debate the feasibility of explaining biology from a physicist's perspective. Learning Objectives: Understand the potential applications and limitations of computational methods for dose-response modeling at the molecular, cellular and tissue levels. Learn about the mechanisms of action underlying the induction, repair and biological processing of damage to DNA and other constituents. Understand how effects and processes at one biological scale impact biological processes and outcomes at other scales. Funding: J. Schuemann, NCI/NIH grants; S. McMahon, European Commission FP7 (grant EC FP7 MC-IOF-623630)

  10. Television, computer, and video viewing; physical activity; and upper limb fracture risk in children: a population-based case control study.

    PubMed

    Ma, Deqiong; Jones, Graeme

    2003-11-01

    The effect of physical activity on upper limb fractures was examined in this population-based case control study with 321 age- and gender-matched pairs. Sports participation increased fracture risk in boys and decreased risk in girls. Television viewing had a deleterious dose-response association with wrist and forearm fractures, while light physical activity was protective. The aim of this population-based case control study was to examine the association between television, computer, and video viewing; types and levels of physical activity; and upper limb fractures in children 9-16 years of age. A total of 321 fracture cases and 321 randomly selected, individually matched controls were studied. Television, computer, and video viewing and types and levels of physical activity were determined by an interview-administered questionnaire. Bone strength was assessed by DXA and metacarpal morphometry. In general, sports participation increased total upper limb fracture risk in boys and decreased risk in girls. Gender-specific risk estimates were significantly different for total, contact, noncontact, and high-risk sports participation, as well as for four individual sports (soccer, cricket, surfing, and swimming). In multivariate analysis, time spent on television, computer, and video viewing in both sexes was positively associated with wrist and forearm fracture risk (OR 1.6/category, 95% CI: 1.1-2.2), whereas days involved in light physical activity decreased fracture risk (OR 0.8/category, 95% CI: 0.7-1.0). Sports participation increased hand (OR 1.5/sport, 95% CI: 1.1-2.0) and upper arm (OR 29.8/sport, 95% CI: 1.7-535) fracture risk in boys only, and decreased wrist and forearm fracture risk in girls only (OR 0.5/sport, 95% CI: 0.3-0.9). Adjustment for bone density and metacarpal morphometry did not alter these associations. There is gender discordance with regard to sports participation and fracture risk in children, which may reflect different approaches to sport. Importantly, television, computer, and video viewing has a dose-dependent association with wrist and forearm fractures, whereas light physical activity is protective. The mechanism is unclear but may involve bone-independent factors or, less likely, changes in bone quality not detected by DXA or metacarpal morphometry.

  11. Toward a Proof of Concept Cloud Framework for Physics Applications on Blue Gene Supercomputers

    NASA Astrophysics Data System (ADS)

    Dreher, Patrick; Scullin, William; Vouk, Mladen

    2015-09-01

    Traditional high performance supercomputers are capable of delivering large sustained state-of-the-art computational resources to physics applications over extended periods of time using batch processing mode operating environments. However, today there is an increasing demand for more complex workflows that involve large fluctuations in the levels of HPC physics computational requirements during the simulations. Some of the workflow components may also require a richer set of operating system features and schedulers than normally found in a batch oriented HPC environment. This paper reports on progress toward a proof of concept design that implements a cloud framework onto BG/P and BG/Q platforms at the Argonne Leadership Computing Facility. The BG/P implementation utilizes the Kittyhawk utility and the BG/Q platform uses an experimental heterogeneous FusedOS operating system environment. Both platforms use the Virtual Computing Laboratory as the cloud computing system embedded within the supercomputer. This proof of concept design allows a cloud to be configured so that it can capitalize on the specialized infrastructure capabilities of a supercomputer and the flexible cloud configurations without resorting to virtualization. Initial testing of the proof of concept system is done using the lattice QCD MILC code. These types of user reconfigurable environments have the potential to deliver experimental schedulers and operating systems within a working HPC environment for physics computations that may be different from the native OS and schedulers on production HPC supercomputers.

  12. Instream Flows Incremental Methodology: Kootenai River, Montana: Final Report 1990-2000.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, Greg; Skaar, Don; Dalbey, Steve

    2002-11-01

    Regulated rivers such as the Kootenai River below Libby Dam often exhibit hydrographs and water fluctuation levels that are atypical when compared to non-regulated rivers. These flow regimes often differ from the conditions under which native fish species evolved and can be important limiting factors in some systems. Fluctuating discharge levels can change the quantity and quality of aquatic habitat for fish. The instream flow incremental methodology (IFIM) is a tool that can help water managers evaluate different discharges in terms of their effects on available habitat for a particular fish species. The U.S. Fish and Wildlife Service developed the IFIM (Bovee 1982) to quantify changes in aquatic habitat with changes in instream flow (Waite and Barnhart 1992; Baldridge and Amos 1981; Gore and Judy 1981; Irvine et al. 1987). IFIM modeling uses hydraulic computer models to relate changes in discharge to changes in physical parameters, such as water depth, current velocity and substrate particle size, within the aquatic environment. Habitat utilization curves are developed to describe the physical habitat most needed, preferred or tolerated by a selected species at various life stages (Bovee and Cochnauer 1977; Raleigh et al. 1984). Through the use of physical habitat simulation computer models, hydraulic and physical variables are simulated for differing flows, and the amount of usable habitat is predicted for the selected species and life stages. The Kootenai River IFIM project was initiated in 1990, with the collection of habitat utilization and physical hydraulic data through 1996. The physical habitat simulation computer modeling was completed from 1996 through 2000 with assistance from Thomas Payne and Associates. This report summarizes the results of these efforts.
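    The habitat simulation step described above can be illustrated with a minimal weighted-usable-area (WUA) calculation: each cell's suitability for depth and velocity is interpolated from a curve, multiplied, and weighted by cell area. The suitability curves and cell values below are hypothetical stand-ins, not the Kootenai River data or the study's actual model.

```python
# Minimal sketch of the weighted-usable-area idea behind PHABSIM-style
# habitat modelling. Curves and cells below are hypothetical examples.

def interp(x, xs, ys):
    """Piecewise-linear interpolation of a suitability curve."""
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# Hypothetical suitability curves (0..1) for one species/life stage
depth_curve = ([0.0, 0.5, 1.5, 3.0], [0.0, 0.8, 1.0, 0.2])
vel_curve = ([0.0, 0.3, 1.0, 2.0], [0.2, 1.0, 0.6, 0.0])

def weighted_usable_area(cells):
    """cells: list of (area_m2, depth_m, velocity_ms) per grid cell."""
    wua = 0.0
    for area, depth, vel in cells:
        suitability = interp(depth, *depth_curve) * interp(vel, *vel_curve)
        wua += area * suitability
    return wua

# Two cells simulated at one discharge
cells = [(10.0, 0.6, 0.4), (10.0, 2.0, 1.5)]
print(round(weighted_usable_area(cells), 2))
```

    Running the hydraulic model at several discharges, each producing different cell depths and velocities, yields the habitat-versus-flow relationship that IFIM evaluates.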

  13. WE-DE-202-02: Are Track Structure Simulations Truly Needed for Radiobiology at the Cellular and Tissue Levels?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, R.

    Radiation therapy for the treatment of cancer has been established as a highly precise and effective way to eradicate a localized region of diseased tissue. To achieve further significant gains in the therapeutic ratio, we need to move towards biologically optimized treatment planning. To achieve this goal, we need to understand how the radiation-type-dependent patterns of induced energy depositions within the cell (physics) connect via molecular, cellular and tissue reactions to treatment outcomes such as tumor control and undesirable effects on normal tissue. Several computational biology approaches have been developed connecting physics to biology. Monte Carlo simulations are the most accurate method to calculate physical dose distributions at the nanometer scale; however, simulations at the DNA scale are slow, and repair processes are generally not simulated. Alternative models that rely on the random formation of individual DNA lesions within one or two turns of the DNA have been shown to reproduce the clusters of DNA lesions, including single strand breaks (SSBs) and double strand breaks (DSBs), without the need for detailed track structure simulations. Efficient computational simulations of initial DNA damage induction facilitate computational modeling of DNA repair and other molecular and cellular processes. Mechanistic, multiscale models provide a useful conceptual framework to test biological hypotheses and help connect fundamental information about track structure and dosimetry at the sub-cellular level to dose-response effects on larger scales. In this symposium we will learn about the current state of the art of computational approaches estimating radiation damage at the cellular and sub-cellular scale. How can understanding the physics interactions at the DNA level be used to predict biological outcome?
We will discuss if and how such calculations are relevant to advance our understanding of radiation damage and its repair, or if the underlying biological processes are too complex for a mechanistic approach. Can computer simulations be used to guide future biological research? We will debate the feasibility of explaining biology from a physicist's perspective. Learning Objectives: Understand the potential applications and limitations of computational methods for dose-response modeling at the molecular, cellular and tissue levels. Learn about the mechanisms of action underlying the induction, repair and biological processing of damage to DNA and other constituents. Understand how effects and processes at one biological scale impact biological processes and outcomes at other scales. Disclosures: J. Schuemann: NCI/NIH grants. S. McMahon: funding from the European Commission FP7 (grant EC FP7 MC-IOF-623630).

  14. Leisure time computer use and adolescent bone health--findings from the Tromsø Study, Fit Futures: a cross-sectional study.

    PubMed

    Winther, Anne; Ahmed, Luai Awad; Furberg, Anne-Sofie; Grimnes, Guri; Jorde, Rolf; Nilsen, Ole Andreas; Dennison, Elaine; Emaus, Nina

    2015-04-22

    Low levels of physical activity may have considerable negative effects on bone health in adolescence, and increasing screen time in place of sporting activity during growth is worrying. This study explored the associations between self-reported screen time at weekends and bone mineral density (BMD). In 2010/2011, 1038 (93%) of the region's first-year upper-secondary school students (15-18 years) attended the Tromsø Study, Fit Futures 1 (FF1). A follow-up survey (FF2) took place in 2012/2013. BMD at the total hip, femoral neck and total body was measured in g/cm² by dual X-ray absorptiometry (GE Lunar Prodigy). Lifestyle variables were self-reported, including questions on hours per day spent in front of the television/computer during weekends and hours spent on leisure time physical activities. Complete data sets for 388/312 girls and 359/231 boys at FF1/FF2, respectively, were used in analyses. Sex-stratified multiple regression analyses were performed. Many adolescents balanced 2-4 h of screen time with moderate or high physical activity levels. Screen time was positively related to body mass index (BMI) in boys (p=0.002), who spent more time in front of the computer than girls did (p<0.001). In boys, screen time was adversely associated with BMD at FF1 at all sites, and these associations remained robust to adjustments for age, puberty, height, BMI, physical activity, vitamin D levels, smoking, alcohol, calcium and carbonated drink consumption (p<0.05). Screen time was also negatively associated with total hip BMD at FF2 (p=0.031). In contrast, girls who spent 4-6 h in front of the computer had higher BMD than the reference (<2 h). In Norwegian boys, time spent on screen-based sedentary activity was negatively associated with BMD levels; this relationship persisted 2 years later. Such negative associations were not present among girls. Whether this surprising result is explained by biological differences remains unclear. Published by the BMJ Publishing Group Limited.

  15. Increasing exercise capacity and quality of life of patients with heart failure through Wii gaming: the rationale, design and methodology of the HF-Wii study; a multicentre randomized controlled trial.

    PubMed

    Jaarsma, Tiny; Klompstra, Leonie; Ben Gal, Tuvia; Boyne, Josiane; Vellone, Ercole; Bäck, Maria; Dickstein, Kenneth; Fridlund, Bengt; Hoes, Arno; Piepoli, Massimo F; Chialà, Oronzo; Mårtensson, Jan; Strömberg, Anna

    2015-07-01

    Exercise is known to be beneficial for patients with heart failure (HF), and these patients should therefore be routinely advised to exercise and to be or to become physically active. Despite the beneficial effects of exercise such as improved functional capacity and favourable clinical outcomes, the level of daily physical activity in most patients with HF is low. Exergaming may be a promising new approach to increase the physical activity of patients with HF at home. The aim of this study is to determine the effectiveness of the structured introduction and access to a Wii game computer in patients with HF to improve exercise capacity and level of daily physical activity, to decrease healthcare resource use, and to improve self-care and health-related quality of life. A multicentre randomized controlled study with two treatment groups will include 600 patients with HF. In each centre, patients will be randomized to either motivational support only (control) or structured access to a Wii game computer (Wii). Patients in the control group will receive advice on physical activity and will be contacted by four telephone calls. Patients in the Wii group also will receive advice on physical activity along with a Wii game computer, with instructions and training. The primary endpoint will be exercise capacity at 3 months as measured by the 6 min walk test. Secondary endpoints include exercise capacity at 6 and 12 months, level of daily physical activity, muscle function, health-related quality of life, and hospitalization or death during the 12 months follow-up. The HF-Wii study is a randomized study that will evaluate the effect of exergaming in patients with HF. The findings can be useful to healthcare professionals and improve our understanding of the potential role of exergaming in the treatment and management of patients with HF. NCT01785121. © 2015 The Authors. European Journal of Heart Failure © 2015 European Society of Cardiology.

  16. Parallel computing of a climate model on the dawn 1000 by domain decomposition method

    NASA Astrophysics Data System (ADS)

    Bi, Xunqiang

    1997-12-01

    In this paper the parallel computing of a grid-point nine-level atmospheric general circulation model on the Dawn 1000 is introduced. The model was developed by the Institute of Atmospheric Physics (IAP), Chinese Academy of Sciences (CAS). The Dawn 1000 is a MIMD massively parallel computer made by the National Research Center for Intelligent Computer (NCIC), CAS. A two-dimensional domain decomposition method is adopted to perform the parallel computing. Potential ways to increase the speed-up ratio and to exploit the resources of future massively parallel supercomputers are also discussed.
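    The two-dimensional domain decomposition idea can be sketched as follows: the model's latitude-longitude grid is split into nearly equal rectangular subdomains, one per process. The grid and process-mesh sizes below are illustrative assumptions, not those of the IAP model.

```python
# Sketch of 2D domain decomposition: split an nlat x nlon grid among
# a px x py mesh of processes. Illustrative only; the IAP model's real
# decomposition details are not given in the abstract.

def decompose(n, parts):
    """Split n points into `parts` nearly equal contiguous chunks."""
    base, rem = divmod(n, parts)
    bounds, start = [], 0
    for p in range(parts):
        size = base + (1 if p < rem else 0)
        bounds.append((start, start + size))
        start += size
    return bounds

def subdomains(nlat, nlon, px, py):
    """Return the (lat-range, lon-range) owned by each process."""
    return [(la, lo)
            for la in decompose(nlat, px)
            for lo in decompose(nlon, py)]

# Example: a 64 x 128 grid on a 4 x 8 process mesh
doms = subdomains(64, 128, 4, 8)
print(len(doms))    # 32 subdomains, one per process
print(doms[0])      # ((0, 16), (0, 16))
```

    Each process then advances the model equations on its own rectangle and exchanges halo rows/columns with its four neighbours each time step, which is what makes the speed-up ratio sensitive to the subdomain surface-to-volume ratio.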

  17. Using ecological momentary assessment to examine antecedents and correlates of physical activity bouts in adults age 50+ years: a pilot study.

    PubMed

    Dunton, Genevieve Fridlund; Atienza, Audie A; Castro, Cynthia M; King, Abby C

    2009-12-01

    National recommendations supporting the promotion of multiple short (10+ minute) physical activity bouts each day to increase overall physical activity levels in middle-aged and older adults underscore the need to identify antecedents and correlates of such daily physical activity episodes. This pilot study used Ecological Momentary Assessment to examine the time-lagged and concurrent effects of empirically supported social, cognitive, affective, and physiological factors on physical activity among adults age 50+ years. Participants (N = 23) responded to diary prompts on a handheld computer four times per day across a 2-week period. Moderate-to-vigorous physical activity (MVPA), self-efficacy, positive and negative affect, control, demand, fatigue, energy, social interactions, and stressful events were assessed during each sequence. Multivariate results showed that greater self-efficacy and control predicted greater MVPA at each subsequent assessment throughout the day (p < 0.05). Also, having a positive social interaction was concurrently related to higher levels of MVPA (p = 0.052). Time-varying multidimensional individual processes predict within-day physical activity levels.

  18. Comparison of gross anatomy test scores using traditional specimens vs. QuickTime Virtual Reality animated specimens

    NASA Astrophysics Data System (ADS)

    Maza, Paul Sadiri

    In recent years, technological advances such as computers have been employed in teaching gross anatomy at all levels of education, even in professional schools such as medical and veterinary medical colleges. Benefits of computer-based instructional tools for gross anatomy include the convenience of not having to physically view or dissect a cadaver. Anatomy educators debate the advantages and disadvantages of computer-based resources for gross anatomy instruction. Many studies, case reports, and editorials argue for the increased use of computer-based anatomy educational tools, while others discuss the necessity of dissection for various reasons important in learning anatomy, such as a three-dimensional physical view of the specimen, physical handling of tissues, interactions with fellow students during dissection, and differences between specific specimens. While many articles deal with gross anatomy education using computers, there seems to be a lack of studies investigating the use of computer-based resources as an assessment tool for gross anatomy, specifically using the Apple application QuickTime Virtual Reality (QTVR). This study investigated whether computer-based QTVR movie module assessments were equal in quality to examinations using actual physical specimens. A gross anatomy course in the College of Veterinary Medicine at Cornell University was used as a source of anatomy students and gross anatomy examinations. Two groups were compared: one group took gross anatomy examinations in the traditional manner, viewing actual physical specimens and answering questions based on those specimens; the other group took the same examinations using the same specimens, but the specimens were viewed as simulated three-dimensional objects in a QTVR movie module. Sample group means for the assessments were compared.
A survey was also administered asking for students' perceptions of the quality and user-friendliness of the QTVR movie modules. The comparison of the two sample group means shows that there was no difference in results between using QTVR movie modules to test gross anatomy knowledge and using physical specimens. The results of this study are discussed to explain the benefits of using such computer-based anatomy resources in gross anatomy assessments.

  19. Innovation and Persistence: The Evaluation of the C.U.P.L.E. Studio Physics Course.

    ERIC Educational Resources Information Center

    Cooper, Marie A.; O'Donnell, Angela M.

    The last decade has seen the development of a number of computer-based interactive physics programs at the university level. Set in a cognitive apprenticeship framework, such programs view the instructor as a mentor and the essential learning as constructed in a collaborative process. It is expected that such programs, grounded as they are in…

  20. Baseline Intraocular Pressure Is Associated with Subjective Sensitivity to Physical Exertion in Young Males

    ERIC Educational Resources Information Center

    Vera, Jesús; Jiménez, Raimundo; García, José Antonio; Perales, José Cesar; Cárdenas, David

    2018-01-01

    Purpose: The purposes of this study were to (a) investigate the effect of physical effort (cycling for 60 min at 60 ± 5% of individually computed reserve heart-rate capacity), combined with 2 different levels of cognitive demand (2-back, oddball), on intraocular pressure (IOP) and subjective judgments of perceived exertion (ratings of perceived…

  1. Decreasing excessive media usage while increasing physical activity: a single-subject research study.

    PubMed

    Larwin, Karen H; Larwin, David A

    2008-11-01

    The Kaiser Family Foundation released a report entitled Kids and Media Use in the United States that concluded that children's use of media--including television, computers, Internet, video games, and phones--may be one of the primary contributors to the poor fitness and obesity of many of today's adolescents. The present study examines the potential of increasing physical activity and decreasing media usage in a 14-year-old adolescent female by making time spent on the Internet and/or cell phone contingent on physical activity. Results of this investigation indicate that requiring the participant to earn her media-usage time did correspond with an increase in physical activity and a decrease in media-usage time relative to baseline measures. Five weeks after cessation of the intervention, the participant's new level of physical activity was still being maintained. One year after the study, the participant's level of physical activity continued to increase.

  2. Spatial ability in secondary school students: intra-sex differences based on self-selection for physical education.

    PubMed

    Tlauka, Michael; Williams, Jennifer; Williamson, Paul

    2008-08-01

    Past research has demonstrated consistent sex differences with men typically outperforming women on tests of spatial ability. However, less is known about intra-sex effects. In the present study, two groups of female students (physical education and non-physical education secondary students) and two corresponding groups of male students explored a large-scale virtual shopping centre. In a battery of tasks, spatial knowledge of the shopping centre as well as mental rotation ability were tested. Additional variables considered were circulating testosterone levels, the ratio of 2D:4D digit length, and computer experience. The results revealed both sex and intra-sex differences in spatial ability. Variables related to virtual navigation and computer ability and experience were found to be the most powerful predictors of group membership. Our results suggest that in female and male secondary students, participation in physical education and spatial skill are related.

  3. Physics education through computational tools: the case of geometrical and physical optics

    NASA Astrophysics Data System (ADS)

    Rodríguez, Y.; Santana, A.; Mendoza, L. M.

    2013-09-01

    Recently, with the development of more powerful and accurate computational tools, the inclusion of new didactic materials in the classroom has increased. However, the form in which these materials can be used to enhance the learning process is still under debate. Many different methodologies have been suggested for constructing new relevant curricular material and, among them, just-in-time teaching (JiTT) has arisen as an effective and successful way to improve the content of classes. In this paper, we will show the pedagogic strategies implemented for the courses of geometrical and physical optics for students of optometry. The use of the GeoGebra software for the geometrical optics class and of new in-house software for the physical optics class, written in the high-level programming language Python, is shown together with the corresponding activities developed for each of these applets.
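    As an illustration of the kind of exercise such in-house Python software might support, the following computes thin-lens image formation, a standard geometrical-optics calculation; this is a hypothetical sketch, not the authors' actual applet.

```python
# Standard thin-lens calculation of the kind a geometrical-optics
# applet might expose; illustrative, not the course's actual software.

def thin_lens_image(f, d_o):
    """Return image distance and lateral magnification for a thin lens.

    Uses 1/f = 1/d_o + 1/d_i with the real-is-positive sign
    convention; f and d_o must be in the same length unit.
    """
    d_i = 1.0 / (1.0 / f - 1.0 / d_o)
    magnification = -d_i / d_o
    return d_i, magnification

# Object 30 cm in front of a 10 cm lens: real, inverted, reduced image
d_i, m = thin_lens_image(10.0, 30.0)
print(round(d_i, 2), round(m, 2))   # 15.0 -0.5
```

    A GeoGebra version of the same exercise would let students drag the object and watch the image move, while the Python version makes the underlying equation explicit.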

  4. Effectiveness of a Video-Versus Text-Based Computer-Tailored Intervention for Obesity Prevention after One Year: A Randomized Controlled Trial

    PubMed Central

    Cheung, Kei Long; Schwabe, Inga; Walthouwer, Michel J. L.; Oenema, Anke; de Vries, Hein

    2017-01-01

    Computer-tailored programs may help to prevent overweight and obesity, which are worldwide public health problems. This study investigated (1) the 12-month effectiveness of a video- and text-based computer-tailored intervention on energy intake, physical activity, and body mass index (BMI), and (2) the role of educational level in intervention effects. A randomized controlled trial in The Netherlands was conducted, in which adults were allocated to a video-based condition, text-based condition, or control condition, with baseline, 6 months, and 12 months follow-up. Outcome variables were self-reported BMI, physical activity, and energy intake. Mixed-effects modelling was used to investigate intervention effects and potential interaction effects. Compared to the control group, the video intervention group was effective regarding energy intake after 6 months (least squares means (LSM) difference = −205.40, p = 0.00) and 12 months (LSM difference = −128.14, p = 0.03). Only the video intervention resulted in lower average daily energy intake after one year (d = 0.12). Educational level and BMI did not seem to interact with this effect. No intervention effects on BMI and physical activity were found. The video computer-tailored intervention was effective on energy intake after one year. This effect was not dependent on educational levels or BMI categories, suggesting that video tailoring can be effective for a broad range of risk groups and may be preferred over text tailoring. PMID:29065545

  5. Associations between physical activity of primary school first-graders during leisure time and family socioeconomic status.

    PubMed

    Dregval, Liudmila; Petrauskiene, Ausra

    2009-01-01

    In 2008, an international survey on obesity among first-graders and its risk factors was performed in Lithuania. The objective of this study was to assess physical activity of first-graders during leisure time according to family socioeconomic status. The study was performed in Siauliai region schools selected randomly in 2008. Anonymous questionnaires were distributed among 630 first-graders and filled out by 515 parents (response rate, 81.8%). The results showed that physical activity of first-graders during leisure time is insufficient. More than half of them (60.4%) did not attend sports or dancing clubs; children spent much time passively watching TV or playing on a computer. Most children watched TV for 2 hours on workdays (45.1%) and for 3 hours or more on weekends (41.4%). Most children spent about an hour per day playing on a computer: one-third of first-graders did so on workdays; during weekends, the percentage of children spending about an hour per day playing on a computer was lower (28.5%). One-third of first-graders (36.9%) spent their leisure time outside for 3 or more hours on workdays and 87.1% on weekends, independently of parents' educational level, income, and place of residence. Associations between family socioeconomic status and physical activity of children were observed. The lowest percentage of children attending sports or dancing clubs and playing computer games was seen in low-income families and families where parents had a low educational level. These children spent more time outside (on workdays) compared with children whose parents had university education and high income. Fewer first-graders from families living in villages than those living in cities attended sports or dancing clubs and played on a computer, but more of them spent leisure time outside.

  6. Comparing the cognitive differences resulting from modeling instruction: Using computer microworld and physical object instruction to model real world problems

    NASA Astrophysics Data System (ADS)

    Oursland, Mark David

    This study compared the modeling achievement of students receiving mathematical modeling instruction using the computer microworld Interactive Physics with that of students receiving instruction using physical objects. Modeling instruction included activities where students applied (a) the linear model to a variety of situations, (b) the linear model to two-rate situations with a constant rate, and (c) the quadratic model to familiar geometric figures. Both quantitative and qualitative methods were used to analyze achievement differences between students (a) receiving different methods of modeling instruction, (b) with different levels of beginning modeling ability, or (c) with different levels of computer literacy. Student achievement was analyzed quantitatively through a three-factor analysis of variance where modeling instruction, beginning modeling ability, and computer literacy were used as the three independent factors. The SOLO (Structure of the Observed Learning Outcome) assessment framework was used to design written modeling assessment instruments to measure the students' modeling achievement. The same three independent factors were used to collect and analyze the interviews and observations of student behaviors. Both methods of modeling instruction used the data analysis approach to mathematical modeling. The instructional lessons presented problem situations where students were asked to collect data, analyze the data, write a symbolic mathematical equation, and use the equation to solve the problem. The researcher recommends the following practice for modeling instruction based on the conclusions of this study. A variety of activities with a common structure are needed to make explicit the modeling process of applying a standard mathematical model. The modeling process is influenced strongly by prior knowledge of the problem context and previous modeling experiences.
The conclusions of this study imply that knowledge of the properties of squares improved the students' ability to model a geometric problem more than instruction in data analysis modeling did. The use of computer microworlds such as Interactive Physics in conjunction with cooperative groups is a viable method of modeling instruction.
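    The data analysis modeling cycle the study describes (collect data, analyze it, write a symbolic equation, use the equation to solve the problem) can be sketched as a least-squares linear fit. The spring-stretch measurements below are made up for illustration, not data from the study.

```python
# Sketch of the data-analysis modeling cycle: fit y = a*x + b to
# collected data by ordinary least squares, then use the equation.
# The measurements below are hypothetical.

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b

# Hypothetical measurements: hanging mass (g) vs. spring stretch (cm)
mass = [100, 200, 300, 400]
stretch = [2.1, 3.9, 6.1, 7.9]
a, b = fit_line(mass, stretch)
print(round(a, 4), round(b, 2))

# Use the model: predicted stretch for a 250 g mass
print(round(a * 250 + b, 2))
```

    Writing the fitted equation symbolically and then querying it at a new input mirrors the "write a symbolic mathematical equation, and use the equation" step of the lessons.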

  7. Perspective: Reaches of chemical physics in biology.

    PubMed

    Gruebele, Martin; Thirumalai, D

    2013-09-28

    Chemical physics as a discipline contributes many experimental tools, algorithms, and fundamental theoretical models that can be applied to biological problems. This is especially true now as the molecular level and the systems level descriptions begin to connect, and multi-scale approaches are being developed to solve cutting edge problems in biology. In some cases, the concepts and tools got their start in non-biological fields, and migrated over, such as the idea of glassy landscapes, fluorescence spectroscopy, or master equation approaches. In other cases, the tools were specifically developed with biological physics applications in mind, such as modeling of single molecule trajectories or super-resolution laser techniques. In this introduction to the special topic section on chemical physics of biological systems, we consider a wide range of contributions, all the way from the molecular level, to molecular assemblies, chemical physics of the cell, and finally systems-level approaches, based on the contributions to this special issue. Chemical physicists can look forward to an exciting future where computational tools, analytical models, and new instrumentation will push the boundaries of biological inquiry.

  8. Perspective: Reaches of chemical physics in biology

    PubMed Central

    Gruebele, Martin; Thirumalai, D.

    2013-01-01

    Chemical physics as a discipline contributes many experimental tools, algorithms, and fundamental theoretical models that can be applied to biological problems. This is especially true now as the molecular level and the systems level descriptions begin to connect, and multi-scale approaches are being developed to solve cutting edge problems in biology. In some cases, the concepts and tools got their start in non-biological fields, and migrated over, such as the idea of glassy landscapes, fluorescence spectroscopy, or master equation approaches. In other cases, the tools were specifically developed with biological physics applications in mind, such as modeling of single molecule trajectories or super-resolution laser techniques. In this introduction to the special topic section on chemical physics of biological systems, we consider a wide range of contributions, all the way from the molecular level, to molecular assemblies, chemical physics of the cell, and finally systems-level approaches, based on the contributions to this special issue. Chemical physicists can look forward to an exciting future where computational tools, analytical models, and new instrumentation will push the boundaries of biological inquiry. PMID:24089712

  9. Comparison of adult physical activity levels in three Swiss alpine communities with varying access to motorized transportation.

    PubMed

    Dombois, Oliver Thommen; Braun-Fahrländer, Charlotte; Martin-Diener, Eva

    2007-09-01

    To compare physical activity levels of residents of three Swiss alpine communities with varying access to motorized transport, and to investigate whether socio-demographic factors, the settlement structure or means of transport affect these levels. Between January and February 2004, a computer-assisted telephone interview was conducted with 901 randomly selected adults aged 18 years or older living in three Swiss alpine communities. In particular, information on moderate- and vigorous-intensity physical activities and on transport behaviour was collected. Respondents were categorized as 'sufficiently active' or 'insufficiently active' according to self-reported physical activity. People living in community 1, without access to motorized traffic, were significantly more likely to be sufficiently active (sex- and age-adjusted prevalence of sufficient total physical activity: 43.9%, 95% CI: 38.3%-49.8%) compared to individuals living in the other two communities (community 2: 35.9%, 95% CI: 30.6%-41.6%; community 3: 32.7%, 95% CI: 27.5%-38.3%). The differences were due to higher levels of moderate physical activities. Vigorous physical activity levels did not differ between the communities. Community differences were explained by passive means of transport to work and for leisure time activities. Although the environment encountered in the three alpine communities is generally conducive to physical activity, the majority of the participants did not achieve recommended activity levels. A passive mode of transport to work and during leisure time was strongly associated with insufficient total physical activity. Walking and cycling for transportation is thus a promising approach to promote health-enhancing physical activity.

  10. Intention to be Physically Active is Influenced by Physical Activity and Fitness, Sedentary Behaviours, and Life Satisfaction in Adolescents.

    PubMed

    Grao-Cruces, Alberto; Fernández-Martínez, Antonio; Nuviala, Alberto; Pérez-Turpin, José A

    2015-09-01

    The aim of this study was to determine the association of levels of physical activity (PA), physical fitness (PF), sedentary lifestyle and life satisfaction with the intention to be physically active after secondary school graduation, in teenagers of both genders. A total of 1986 Spanish adolescents (12-16 years) participated in this cross-sectional study. PA, sedentary lifestyle, life satisfaction and intention to be physically active were assessed through validated questionnaires, and PF was evaluated objectively with the ALPHA battery tests. In both genders, adolescents with low levels of PA, cardiorespiratory fitness and lower-body muscular fitness, and those who spent more sedentary time in front of the computer, had significantly higher odds ratios (OR) of showing low intention to be physically active. Girls who spent a lot of time watching TV and boys with low life satisfaction also showed higher OR of having low intention to be physically active.

  11. A simple, low-cost, data logging pendulum built from a computer mouse

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gintautas, Vadas; Hubler, Alfred

    Lessons and homework problems involving a pendulum are often a big part of introductory physics classes and laboratory courses from high school to undergraduate levels. Although laboratory equipment for pendulum experiments is commercially available, it is often expensive and may not be affordable for teachers on fixed budgets, particularly in developing countries. We present a low-cost, easy-to-build rotary sensor pendulum using the existing hardware in a ball-type computer mouse. We demonstrate how this apparatus may be used to measure both the frequency and coefficient of damping of a simple physical pendulum. This easily constructed laboratory equipment makes it possible for all students to have hands-on experience with one of the most important simple physical systems.
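The two quantities the mouse-based apparatus measures can be extracted from logged angle samples with a few lines of code. The following sketch is not from the paper; the sampling rate, damping model and synthetic data are assumptions. It estimates the period from downward zero crossings and the damping coefficient from the logarithmic decrement of successive peaks:

```python
import math

def period_and_damping(samples, dt):
    """Estimate oscillation period and damping coefficient from angle
    samples of the form theta(t) = A * exp(-g*t) * cos(2*pi*t/T)."""
    # Period: average spacing of downward zero crossings.
    crossings = [i * dt for i in range(1, len(samples))
                 if samples[i - 1] > 0 >= samples[i]]
    T = (crossings[-1] - crossings[0]) / (len(crossings) - 1)
    # Damping: logarithmic decrement between first and last positive peaks.
    peaks = [(i * dt, samples[i]) for i in range(1, len(samples) - 1)
             if samples[i - 1] < samples[i] > samples[i + 1] and samples[i] > 0]
    (t0, a0), (t1, a1) = peaks[0], peaks[-1]
    g = math.log(a0 / a1) / (t1 - t0)
    return T, g

# Synthetic sensor log: T = 2.0 s, damping 0.05 1/s, 100 Hz sampling.
dt = 0.01
data = [math.exp(-0.05 * t) * math.cos(2 * math.pi * t / 2.0)
        for t in (i * dt for i in range(2000))]
T_est, g_est = period_and_damping(data, dt)
print(round(T_est, 2), round(g_est, 3))
```

With real hardware, `data` would be replaced by the angles logged from the mouse's rotary encoder.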

  12. Computer programs of information processing of nuclear physical methods as a demonstration material in studying nuclear physics and numerical methods

    NASA Astrophysics Data System (ADS)

    Bateev, A. B.; Filippov, V. P.

    2017-01-01

    The article shows that the computer program Univem MS for Mössbauer spectrum fitting can, in principle, serve as demonstration material when students study disciplines such as atomic and nuclear physics and numerical methods. The program works with nuclear-physical parameters such as the isomer (or chemical) shift of a nuclear energy level, the interaction of the nuclear quadrupole moment with an electric field, and that of the magnetic moment with the surrounding magnetic field. The basic processing algorithm in such programs is the least-squares method: the deviation of the experimental points of a spectrum from the theoretical dependence, characterized in numerical methods as the mean square deviation, is determined on concrete examples. The shape of the theoretical lines in the program is defined by Gaussian and Lorentzian distributions. The visualization of material on atomic and nuclear physics can be improved by similar programs for Mössbauer spectroscopy, X-ray fluorescence analysis or X-ray diffraction analysis.
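As an illustration of the least-squares criterion described in the record (a generic sketch, not code from Univem MS; the line parameters and the synthetic spectrum are hypothetical), the mean square deviation between a spectrum and a single Lorentzian line can be minimized over the isomer shift:

```python
import math

def lorentzian(v, v0, gamma, depth):
    """Absorption line: isomer shift v0, full width gamma, line depth."""
    return depth * (gamma / 2) ** 2 / ((v - v0) ** 2 + (gamma / 2) ** 2)

def mean_square_deviation(velocities, counts, v0, gamma, depth):
    """The quantity the least-squares method minimizes."""
    return sum((c - lorentzian(v, v0, gamma, depth)) ** 2
               for v, c in zip(velocities, counts)) / len(counts)

# Synthetic single-line spectrum with isomer shift 0.3 mm/s.
vs = [i * 0.02 - 2.0 for i in range(201)]
ys = [lorentzian(v, 0.3, 0.25, 1.0) for v in vs]

# Grid search over the isomer shift: the smallest MSD recovers v0 = 0.3.
best = min((mean_square_deviation(vs, ys, v0, 0.25, 1.0), v0)
           for v0 in (k * 0.01 - 1.0 for k in range(201)))
print(best)
```

A real fitting program would vary all line parameters simultaneously (e.g. by Gauss-Newton iteration), but the minimized quantity is the same mean square deviation.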

  13. Tablet Computer Literacy Levels of the Physical Education and Sports Department Students

    ERIC Educational Resources Information Center

    Hergüner, Gülten

    2016-01-01

    Education systems are being affected in parallel by newly emerging hardware and new developments occurring in technology daily. Tablet usage especially is becoming ubiquitous in the teaching-learning processes in recent years. Therefore, using the tablets effectively, managing them and having a high level of tablet literacy play an important role…

  14. Computational fluid dynamics and frequency-dependent finite-difference time-domain method coupling for the interaction between microwaves and plasma in rocket plumes

    NASA Astrophysics Data System (ADS)

    Kinefuchi, K.; Funaki, I.; Shimada, T.; Abe, T.

    2012-10-01

    Under certain conditions during rocket flights, ionized exhaust plumes from solid rocket motors may interfere with radio frequency transmissions. To understand the relevant physical processes involved in this phenomenon and establish a prediction process for in-flight attenuation levels, we attempted to measure microwave attenuation caused by rocket exhaust plumes in a sea-level static firing test of a full-scale solid propellant rocket motor. The microwave attenuation level was calculated by a coupled simulation of the inviscid-frozen-flow computational fluid dynamics of the exhaust plume and a detailed analysis of microwave transmission, applying a frequency-dependent finite-difference time-domain method with the Drude dispersion model. The calculated microwave attenuation level agreed well with the experimental results, except in the case of interference downstream of the Mach disk in the exhaust plume. It was concluded that the coupled estimation method based on the physics of the frozen plasma flow with Drude dispersion would be suitable for actual flight conditions, although the mixing and afterburning in the plume should be considered depending on the flow condition.
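The Drude dispersion model named in the record relates plume plasma properties to microwave attenuation. The following is a back-of-the-envelope sketch only: the electron density, collision rate and path length are invented for illustration, and a uniform plasma slab is a drastic simplification of the coupled CFD/FDTD analysis.

```python
import cmath
import math

def drude_attenuation(f_ghz, n_e, nu, length_m):
    """One-way attenuation (dB) of a plane wave crossing a uniform plasma
    slab, from the Drude permittivity eps = 1 - wp^2 / (w^2 - 1j*nu*w)."""
    q, m_e, eps0, c = 1.602e-19, 9.109e-31, 8.854e-12, 2.998e8
    w = 2 * math.pi * f_ghz * 1e9
    wp2 = n_e * q ** 2 / (m_e * eps0)        # plasma frequency squared
    eps = 1 - wp2 / (w ** 2 - 1j * nu * w)   # relative permittivity
    n = cmath.sqrt(eps)                      # complex refractive index
    alpha = w * abs(n.imag) / c              # field attenuation constant, 1/m
    return 8.686 * alpha * length_m          # 8.686 dB per neper

# Hypothetical plume: n_e = 5e16 m^-3, collision rate 1e9 s^-1, 1 m path, S-band.
print(round(drude_attenuation(2.3, 5e16, 1e9, 1.0), 1))
```

Raising the carrier frequency well above the plasma frequency makes the plume nearly transparent, which is why attenuation depends so strongly on the transmission band.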

  15. Air, Ocean and Climate Monitoring Enhancing Undergraduate Training in the Physical, Environmental and Computer Sciences

    NASA Technical Reports Server (NTRS)

    Hope, W. W.; Johnson, L. P.; Obl, W.; Stewart, A.; Harris, W. C.; Craig, R. D.

    2000-01-01

    Faculty in the Department of Physical, Environmental and Computer Sciences strongly believe in the concept that undergraduate research and research-related activities must be integrated into the fabric of our undergraduate Science and Technology curricula. High-level skills, such as problem solving, reasoning, collaboration and the ability to engage in research, are essential for advanced study in graduate school and for competing for well-paying positions in the scientific community. One goal of our academic programs is to have a pipeline of research activities from high school, to four-year college, to graduate school, based on the GISS Institute on Climate and Planets model.

  16. Computational Thermomechanical Modelling of Early-Age Silicate Composites

    NASA Astrophysics Data System (ADS)

    Vala, J.; Št'astník, S.; Kozák, V.

    2009-09-01

    Strains and stresses in early-age silicate composites, widely used in civil engineering, especially in fresh concrete mixtures, in addition to those caused by exterior mechanical loads, are results of complicated non-deterministic physical and chemical processes. Their numerical prediction at the macro-scale level requires the non-trivial physical analysis based on the thermodynamic principles, making use of micro-structural information from both theoretical and experimental research. The paper introduces a computational model, based on a nonlinear system of macroscopic equations of evolution, supplied with certain effective material characteristics, coming from the micro-scale analysis, and sketches the algorithm for its numerical analysis.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCaskey, Alexander J.

    There is a lack of state-of-the-art HPC simulation tools for simulating general quantum computing. Furthermore, there are no real software tools that integrate current quantum computers into existing classical HPC workflows. This product, the Quantum Virtual Machine (QVM), solves this problem by providing an extensible framework for pluggable virtual, or physical, quantum processing units (QPUs). It enables the execution of low level quantum assembly codes and returns the results of such executions.
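The record does not show the QVM's actual interface, but the core idea of a virtual QPU executing low-level quantum instructions can be sketched generically as a statevector simulation. Everything below (the gate table and the `run` function) is hypothetical illustration, not the QVM API:

```python
import math

# Minimal statevector "virtual QPU": a one-qubit register and two gates.
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]
X = [[0, 1], [1, 0]]
GATES = {"h": H, "x": X}

def run(program, state=None):
    """Apply a list of one-qubit instructions, e.g. ["h", "x"], to |0>
    and return the final statevector."""
    state = state or [1.0, 0.0]
    for op in program:
        g = GATES[op]
        state = [g[0][0] * state[0] + g[0][1] * state[1],
                 g[1][0] * state[0] + g[1][1] * state[1]]
    return state

probs = [a * a for a in run(["h"])]   # measurement probabilities after H|0>
print(probs)
```

A real framework would support multi-qubit registers and pluggable backends (simulated or physical QPUs), but "execute assembly, return results" reduces to this loop at its simplest.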

  18. Superpersistent Currents in Dirac Fermion Systems

    DTIC Science & Technology

    2017-03-06

    development of quantum mechanics, but also to quantum information processing and computing. Exploiting various physical systems to realize two-level... Here, using the QSD method, we calculated the dynamical trajectories of the system in the quantum regime. Our computations extending to the long time... currents in 2D Dirac material systems and pertinent phenomena in the emerging field of relativistic quantum nonlinear dynamics and chaos. Systematic

  19. High-Productivity Computing in Computational Physics Education

    NASA Astrophysics Data System (ADS)

    Tel-Zur, Guy

    2011-03-01

    We describe the development of a new course in Computational Physics at Ben-Gurion University. This elective course for 3rd-year undergraduates and MSc students is taught during one semester. Computational Physics is by now well accepted as the Third Pillar of Science. This paper's claim is that modern Computational Physics education should also deal with High-Productivity Computing. The traditional approach to teaching Computational Physics emphasizes ``Correctness'' and then ``Accuracy,'' and we add ``Performance.'' Along with topics in mathematical methods and case studies in physics, the course devotes a significant amount of time to ``Mini-Courses'' on topics such as: High-Throughput Computing - Condor, Parallel Programming - MPI and OpenMP, How to Build a Beowulf, Visualization, and Grid and Cloud Computing. The course does not intend to teach new physics or new mathematics; it is focused on an integrated approach to solving problems, starting from the physics problem, through the corresponding mathematical solution and the numerical scheme, to writing an efficient computer code and finally analysis and visualization.

  20. Computational Physics for Space Flight Applications

    NASA Technical Reports Server (NTRS)

    Reed, Robert A.

    2004-01-01

    This paper presents viewgraphs on computational physics for space flight applications. The topics include: 1) Introduction to space radiation effects in microelectronics; 2) Using applied physics to help NASA meet mission objectives; 3) Example of applied computational physics; and 4) Future directions in applied computational physics.

  1. Adolescents' physical activity: competition between perceived neighborhood sport facilities and home media resources.

    PubMed

    Wong, Bonny Yee-Man; Cerin, Ester; Ho, Sai-Yin; Mak, Kwok-Kei; Lo, Wing-Sze; Lam, Tai-Hing

    2010-04-01

    To examine the independent, competing, and interactive effects of perceived availability of specific types of media in the home and neighborhood sport facilities on adolescents' leisure-time physical activity (PA). Survey data from 34 369 students in 42 Hong Kong secondary schools were collected (2006-07). Respondents reported moderate-to-vigorous leisure-time PA, presence of sport facilities in the neighborhood and of media equipment in the home. Being sufficiently physically active was defined as engaging in at least 30 minutes of non-school leisure-time PA on a daily basis. Logistic regression and post-estimation linear combinations of regression coefficients were used to examine the independent and competing effects of sport facilities and media equipment on leisure-time PA. Perceived availability of sport facilities was positively (OR(boys) = 1.17; OR(girls) = 1.26), and that of computer/Internet negatively (OR(boys) = 0.48; OR(girls) = 0.41), associated with being sufficiently active. A significant positive association between video game console and being sufficiently active was found in girls (OR(girls) = 1.19) but not in boys. Compared with adolescents without sport facilities and media equipment, those who reported sport facilities only were more likely to be physically active (OR(boys) = 1.26; OR(girls) = 1.34), while those who additionally reported computer/Internet were less likely to be physically active (OR(boys) = 0.60; OR(girls) = 0.54). Perceived availability of sport facilities in the neighborhood may positively impact on adolescents' level of physical activity. However, having computer/Internet may cancel out the effects of active opportunities in the neighborhood. This suggests that physical activity programs for adolescents need to consider limiting the access to computer-mediated communication as an important intervention component.
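The odds ratios reported in this record come from logistic regression on a large survey. As a simplified illustration of how an OR and its confidence interval relate to counts, here is a 2x2-table sketch with the Woolf (log) method; the counts are invented and this is not the study's adjusted model:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI (Woolf/log method) for a 2x2 table:
    a, b = sufficiently/insufficiently active with the exposure;
    c, d = the same without it."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of ln(OR)
    lo, hi = (or_ * math.exp(s * z * se) for s in (-1, 1))
    return or_, lo, hi

# Hypothetical counts: exposure = sport facilities in the neighborhood.
or_, lo, hi = odds_ratio_ci(420, 980, 310, 890)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

A confidence interval that excludes 1 (as here) corresponds to a statistically significant association; regression adjustment, as used in the study, additionally controls for covariates.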

  2. Muons in the CMS High Level Trigger System

    NASA Astrophysics Data System (ADS)

    Verwilligen, Piet; CMS Collaboration

    2016-04-01

    The trigger systems of LHC detectors play a fundamental role in defining the physics capabilities of the experiments. A reduction of several orders of magnitude in the rate of collected events, with respect to the proton-proton bunch crossing rate generated by the LHC, is mandatory to cope with the limits imposed by the readout and storage system. An accurate and efficient online selection mechanism is thus required to fulfill this task while keeping the acceptance of physics signals maximal. The CMS experiment operates a two-level trigger system. First, a Level-1 Trigger (L1T) system, implemented using custom-designed electronics, reduces the event rate to a limit compatible with the CMS Data Acquisition (DAQ) capabilities. A High Level Trigger (HLT) system follows, aimed at further reducing the rate of collected events finally stored for analysis purposes. The latter consists of a streamlined version of the CMS offline reconstruction software and operates on a computer farm. It runs algorithms optimized to trade off computational complexity, rate reduction and high selection efficiency. With the computing power available in 2012, the maximum reconstruction time at the HLT was about 200 ms per event, at the nominal L1T rate of 100 kHz. An efficient selection of muons at the HLT, as well as an accurate measurement of their properties, such as transverse momentum and isolation, is fundamental to the CMS physics programme. The performance of the muon HLT for single and double muon triggers achieved in Run I will be presented. Results from new developments, aimed at improving the performance of the algorithms for the harsher pile-up and luminosity scenarios expected for Run II, will also be discussed.

  3. Desktop publishing and validation of custom near visual acuity charts.

    PubMed

    Marran, Lynn; Liu, Lei; Lau, George

    2008-11-01

    Customized visual acuity (VA) assessment is an important part of basic and clinical vision research. Desktop computer based distance VA measurements have been utilized, and shown to be accurate and reliable, but computer based near VA measurements have not been attempted, mainly due to the limited spatial resolution of computer monitors. In this paper, we demonstrate how to use desktop publishing to create printed custom near VA charts. We created a set of six near VA charts in a logarithmic progression, 20/20 through 20/63, with multiple lines of the same acuity level, different letter arrangements in each line and a random noise background. This design allowed repeated measures of subjective accommodative amplitude without the potential artifact of familiarity of the optotypes. The background maintained a constant and spatial frequency rich peripheral stimulus for accommodation across the six different acuity levels. The paper describes in detail how pixel-wise accurate black and white bitmaps of Sloan optotypes were used to create the printed custom VA charts. At all acuity levels, the physical sizes of the printed custom optotypes deviated no more than 0.034 log units from that of the standard, satisfying the 0.05 log unit ISO criterion we used to demonstrate physical equivalence. Also, at all acuity levels, log unit differences in the mean target distance for which reliable recognition of letters first occurred for the printed custom optotypes compared to the standard were found to be below 0.05, satisfying the 0.05 log unit ISO criterion we used to demonstrate functional equivalence. It is possible to use desktop publishing to create custom near VA charts that are physically and functionally equivalent to standard VA charts produced by a commercial printing process.
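The 0.05 log-unit equivalence criterion used in this record is easy to state in code. A minimal sketch, with hypothetical letter heights standing in for the measured optotype sizes (the chart's true dimensions are not given in the abstract):

```python
import math

def log_unit_deviation(measured_mm, standard_mm):
    """Absolute difference in log10 units between printed and standard size."""
    return abs(math.log10(measured_mm / standard_mm))

# Hypothetical letter heights (mm) for a logarithmic 20/20-20/63 progression.
standard = {"20/20": 0.58, "20/25": 0.73, "20/32": 0.93, "20/40": 1.16,
            "20/50": 1.45, "20/63": 1.83}
printed  = {"20/20": 0.60, "20/25": 0.75, "20/32": 0.91, "20/40": 1.18,
            "20/50": 1.42, "20/63": 1.86}
worst = max(log_unit_deviation(printed[k], standard[k]) for k in standard)
print(worst < 0.05)   # the 0.05 log-unit equivalence criterion
```

The study's reported worst-case deviation of 0.034 log units would pass the same check.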

  4. A Computational Framework for Bioimaging Simulation

    PubMed Central

    Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units. PMID:26147508

  5. Usual Physical Activity and Hip Fracture in Older Men: An Application of Semiparametric Methods to Observational Data

    PubMed Central

    Mackey, Dawn C.; Hubbard, Alan E.; Cawthon, Peggy M.; Cauley, Jane A.; Cummings, Steven R.; Tager, Ira B.

    2011-01-01

    Few studies have examined the relation between usual physical activity level and rate of hip fracture in older men or applied semiparametric methods from the causal inference literature that estimate associations without assuming a particular parametric model. Using the Physical Activity Scale for the Elderly, the authors measured usual physical activity level at baseline (2000–2002) in 5,682 US men ≥65 years of age who were enrolled in the Osteoporotic Fractures in Men Study. Physical activity levels were classified as low (bottom quartile of Physical Activity Scale for the Elderly score), moderate (middle quartiles), or high (top quartile). Hip fractures were confirmed by central review. Marginal associations between physical activity and hip fracture were estimated with 3 estimation methods: inverse probability-of-treatment weighting, G-computation, and doubly robust targeted maximum likelihood estimation. During 6.5 years of follow-up, 95 men (1.7%) experienced a hip fracture. The unadjusted risk of hip fracture was lower in men with a high physical activity level versus those with a low physical activity level (relative risk = 0.51, 95% confidence interval: 0.28, 0.92). In semiparametric analyses that controlled confounding, hip fracture risk was not lower with moderate (e.g., targeted maximum likelihood estimation relative risk = 0.92, 95% confidence interval: 0.62, 1.44) or high (e.g., targeted maximum likelihood estimation relative risk = 0.88, 95% confidence interval: 0.53, 2.03) physical activity relative to low. This study does not support a protective effect of usual physical activity on hip fracture in older men. PMID:21303805
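Of the three estimation methods named in this record, inverse probability-of-treatment weighting is the simplest to sketch. Below is a toy example with a binary exposure and one binary confounder; the data are invented (not the study's), and a real analysis would estimate the propensity score from a fitted model rather than stratum proportions:

```python
def iptw_risk_ratio(records):
    """records: (exposed, confounder, outcome) triples with binary fields.
    Weights each subject by 1 / P(exposure received | confounder), then
    compares weighted outcome risks between exposure groups."""
    # Propensity P(A=1 | W=w) estimated by stratum proportions.
    p = {}
    for w in (0, 1):
        stratum = [a for a, cw, _ in records if cw == w]
        p[w] = sum(stratum) / len(stratum)
    num = {0: 0.0, 1: 0.0}
    den = {0: 0.0, 1: 0.0}
    for a, w, y in records:
        weight = 1 / (p[w] if a == 1 else 1 - p[w])
        num[a] += weight * y
        den[a] += weight
    return (num[1] / den[1]) / (num[0] / den[0])

# Toy cohort: the confounder raises both exposure and outcome probability,
# but within each stratum the exposure has no effect.
data = ([(1, 1, 1)] * 30 + [(1, 1, 0)] * 30 + [(0, 1, 1)] * 10 + [(0, 1, 0)] * 10
      + [(1, 0, 1)] * 5 + [(1, 0, 0)] * 15 + [(0, 0, 1)] * 20 + [(0, 0, 0)] * 60)
print(round(iptw_risk_ratio(data), 3))
```

In this toy cohort the crude risk ratio is about 1.46, while the weighted (confounding-controlled) ratio is 1.0 — the same qualitative pattern as the study, where the unadjusted protective association disappeared after adjustment.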

  6. Using spatial principles to optimize distributed computing for enabling the physical science discoveries

    PubMed Central

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-01-01

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779

  7. Using spatial principles to optimize distributed computing for enabling the physical science discoveries.

    PubMed

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-04-05

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century.

  8. Investigating the Usability and Efficacy of Customizable Computer Coaches for Introductory Physics Problem Solving

    NASA Astrophysics Data System (ADS)

    Aryal, Bijaya

    2016-03-01

    We have studied the impacts of web-based Computer Coaches on educational outputs and outcomes. This presentation will describe the technical and conceptual framework behind the Coaches and discuss undergraduate students' favorability toward them. Moreover, their impacts on students' physics problem-solving performance and on their conceptual understanding of physics will be reported. We used a qualitative research technique to collect and analyze interview data from 19 undergraduate students who used the Coaches in the interview setting. The empirical results show that the favorability and efficacy of the Computer Coaches differ considerably across students of different educational backgrounds, preparation levels, attitudes and epistemologies about physics learning. The interview data show that female students tend to view the use of the Coaches more favorably. Likewise, our assessment suggests that female students seem to benefit more from the Coaches in their problem-solving performance and in conceptual learning of physics. Finally, the analysis finds evidence that the Coaches have potential for increasing efficiency in usage and for improving students' educational outputs and outcomes under customized usage. This work was partially supported by the Center for Educational Innovation, Office of the Senior Vice President for Academic Affairs and Provost, University of Minnesota.

  9. Integrating numerical computation into the undergraduate education physics curriculum using spreadsheet excel

    NASA Astrophysics Data System (ADS)

    Fauzi, Ahmad

    2017-11-01

    Numerical computation has many pedagogical advantages: it develops analytical and problem-solving skills, helps students learn through visualization, and enhances physics education. Unfortunately, numerical computation is not taught to undergraduate physics education students in Indonesia. Incorporating numerical computation into the undergraduate physics education curriculum presents many challenges; the main ones are a dense curriculum that makes it difficult to add a new numerical computation course, and the fact that most students have no programming experience. In this research, we used a case study to review how to integrate numerical computation into the undergraduate physics education curriculum. The participants were 54 fourth-semester students of the physics education department. We concluded that numerical computation could be integrated into the undergraduate physics education curriculum using Excel spreadsheets combined with another course. The results of this research complement studies on how to integrate numerical computation into physics learning using Excel spreadsheets.
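The kind of numerical computation students carry out in a spreadsheet is a row-by-row recurrence. A minimal sketch in Python of Euler's method for free fall with linear drag, a typical introductory example (the parameters and the example itself are illustrative assumptions, not taken from the study):

```python
def euler_fall(v0=0.0, g=9.8, b=0.5, dt=0.01, t_end=10.0):
    """Euler's method for dv/dt = g - b*v. Each loop iteration corresponds
    to one spreadsheet row: v_next = v + (g - b*v)*dt."""
    rows = [(0.0, v0)]
    t, v = 0.0, v0
    while t < t_end - 1e-12:
        v += (g - b * v) * dt
        t += dt
        rows.append((t, v))
    return rows

rows = euler_fall()
t_final, v_final = rows[-1]
print(round(v_final, 2))   # approaches the terminal velocity g/b = 19.6 m/s
```

In Excel the same computation is two columns, `t` and `v`, with the update formula filled down; plotting the columns gives the visualization the abstract emphasizes.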

  10. High-Resiliency and Auto-Scaling of Large-Scale Cloud Computing for OCO-2 L2 Full Physics Processing

    NASA Astrophysics Data System (ADS)

    Hua, H.; Manipon, G.; Starch, M.; Dang, L. B.; Southam, P.; Wilson, B. D.; Avis, C.; Chang, A.; Cheng, C.; Smyth, M.; McDuffie, J. L.; Ramirez, P.

    2015-12-01

    Next generation science data systems are needed to address the incoming flood of data from new missions such as SWOT and NISAR, where data volumes and data throughput rates are orders of magnitude larger than for present day missions. Additionally, traditional means of procuring hardware on-premise are already limited by facilities capacity constraints for these new missions. Existing missions, such as OCO-2, may also require rapid turn-around for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the high processing needs. We present our experiences deploying a hybrid-cloud computing science data system (HySDS) for the OCO-2 Science Computing Facility to support large-scale processing of their Level-2 full physics data products. We will explore optimization approaches to getting the best performance out of hybrid-cloud computing as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer ~10X cost savings but with an unpredictable computing environment driven by market forces. We will present how we enabled high-tolerance computing in order to achieve large-scale computing as well as operational cost savings.

  11. Reviews

    NASA Astrophysics Data System (ADS)

    2005-01-01

    WE RECOMMEND Advancing Physics CD Quick Tour This software makes the Advancing Physics CD easier to use. From Silicon to Computer This CD on computer technology operates like an electronic textbook. Powers of Ten This documentary film gives pupils a feel for the scale of our universe. Multimedia Waves The material on this CD demonstrates various wave phenomena. Infrared thermometer This instant response, remote sensor has numerous lab applications. Magic Universe, The Oxford Guide to Modern Science A collection of short essays, this book is aimed at A-level students. Fermi Remembered A joy to read, this piece of non-fiction leaves you eager for more. Big Bang (lecture and book) Both the book and the lecture are engaging and hugely entertaining. WORTH A LOOK The Way Things Go Lasting just 30 minutes, this film will liven up any mechanics lesson. The Video Encyclopaedia of Physics Demonstrations It may blow your budget, but this DVD is a superb physics resource. Go!Link and Go!Temp Go!Link is a useful, cheap datalogger. Go!Temp seems superfluous. Cracker snaps Cheap and cheerful, cracker snaps can be used to demonstrate force. VPython This 3D animation freeware can be adapted to fit your needs. HANDLE WITH CARE Physics A-Level Presentations It might be better to generate slides yourself rather than modify these. London Planetarium and Madame Tussaud's A day out here is definitely not a worthwhile science excursion.

  12. Physical activity and screen time: trends in U.S. children aged 9-13 years, 2002-2006.

    PubMed

    Huhman, Marian; Lowry, Richard; Lee, Sarah M; Fulton, Janet E; Carlson, Susan A; Patnode, Carrie D

    2012-05-01

    We examined trends of physical activity and screen time among nationally representative samples of children aged 9-13 years to explore whether children overall are becoming less physically active and less likely to comply with screen time recommendations. We analyzed Youth Media Campaign Longitudinal Survey data for trends and demographic patterns of free-time and organized physical activity, and hours and minutes of watching television and playing video or computer games. Child-parent dyads for 2002 (N = 3114), 2004 (N = 5177), and 2006 (N = 1200) were analyzed. Children reported a significant increase from 2002 to 2006 in physical activity on the day before the interview and in free-time physical activity in the past week. Screen time levels were stable overall; 76.4% of children met the recommendation of 2 hours or less of daily screen time. Levels of physical activity among U.S. children aged 9-13 years were stable or slightly improved from 2002-2006. Except for some subgroup differences, trends in compliance with screen time recommendations were also stable from 2002-2006 for U.S. children aged 9-13 years.

  13. Modern Physics Simulations

    NASA Astrophysics Data System (ADS)

    Brandt, Douglas; Hiller, John R.; Moloney, Michael J.

    1995-10-01

    The Consortium for Upper Level Physics Software (CUPS) has developed a comprehensive series of nine book/software packages that Wiley will publish in FY '95 and '96. CUPS is an international group of 27 physicists, all with extensive backgrounds in the research, teaching, and development of instructional software. The project is being supported by the National Science Foundation (PHY-9014548), and it has received other support from the IBM Corp., Apple Computer Corp., and George Mason University. The simulations being developed are: Astrophysics, Classical Mechanics, Electricity & Magnetism, Modern Physics, Nuclear and Particle Physics, Quantum Mechanics, Solid State, Thermal and Statistical, and Wave and Optics.

  14. HEPMath 1.4: A mathematica package for semi-automatic computations in high energy physics

    NASA Astrophysics Data System (ADS)

    Wiebusch, Martin

    2015-10-01

    This article introduces the Mathematica package HEPMath which provides a number of utilities and algorithms for High Energy Physics computations in Mathematica. Its functionality is similar to packages like FormCalc or FeynCalc, but it takes a more complete and extensible approach to implementing common High Energy Physics notations in the Mathematica language, in particular those related to tensors and index contractions. It also provides a more flexible method for the generation of numerical code which is based on new features for C code generation in Mathematica. In particular it can automatically generate Python extension modules which make the compiled functions callable from Python, thus eliminating the need to write any code in a low-level language like C or Fortran. It also contains seamless interfaces to LHAPDF, FeynArts, and LoopTools.

  15. Federal Plan for High-End Computing. Report of the High-End Computing Revitalization Task Force (HECRTF)

    DTIC Science & Technology

    2004-07-01

steadily for the past fifteen years, while memory latency and bandwidth have improved much more slowly. For example, Intel processor clock rates have... processor and memory performance) all greatly restrict the ability to achieve high levels of performance for science, engineering, and national... sub-nuclear distances. Guide experiments to identify the transition from quantum chromodynamics to quark-gluon plasma. Accelerator Physics: Accurate

  16. Li-ion synaptic transistor for low power analog computing

    DOE PAGES

    Fuller, Elliot J.; Gabaly, Farid El; Leonard, Francois; ...

    2016-11-22

    Nonvolatile redox transistors (NVRTs) based upon Li-ion battery materials are demonstrated as memory elements for neuromorphic computer architectures with multi-level analog states, “write” linearity, low-voltage switching, and low power dissipation. Simulations of back propagation using the device properties reach ideal classification accuracy. Finally, physics-based simulations predict energy costs per “write” operation of <10 aJ when scaled to 200 nm × 200 nm.

  17. Microcomputer Simulation of Real Gases--Part 1.

    ERIC Educational Resources Information Center

    Sperandeo-Mineo, R. M.; Tripi, G.

    1987-01-01

    Describes some simple computer programs designed to simulate the molecular dynamics of two-dimensional systems with a Lennard-Jones interaction potential. Discusses the use of the software in introductory physics courses at the high school and college level. (TW)

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMahon, S.

Radiation therapy for the treatment of cancer has been established as a highly precise and effective way to eradicate a localized region of diseased tissue. To achieve further significant gains in the therapeutic ratio, we need to move towards biologically optimized treatment planning. To achieve this goal, we need to understand how the radiation-type dependent patterns of induced energy depositions within the cell (physics) connect via molecular, cellular and tissue reactions to treatment outcomes such as tumor control and undesirable effects on normal tissue. Several computational biology approaches have been developed connecting physics to biology. Monte Carlo simulations are the most accurate method to calculate physical dose distributions at the nanometer scale; however, simulations at the DNA scale are slow, and repair processes are generally not simulated. Alternative models that rely on the random formation of individual DNA lesions within one or two turns of the DNA have been shown to reproduce the clusters of DNA lesions, including single strand breaks (SSBs) and double strand breaks (DSBs), without the need for detailed track structure simulations. Efficient computational simulations of initial DNA damage induction facilitate computational modeling of DNA repair and other molecular and cellular processes. Mechanistic, multiscale models provide a useful conceptual framework to test biological hypotheses and help connect fundamental information about track structure and dosimetry at the sub-cellular level to dose-response effects on larger scales. In this symposium we will learn about the current state of the art of computational approaches estimating radiation damage at the cellular and sub-cellular scale. How can understanding the physics interactions at the DNA level be used to predict biological outcome?
We will discuss if and how such calculations are relevant to advance our understanding of radiation damage and its repair, or if the underlying biological processes are too complex for a mechanistic approach. Can computer simulations be used to guide future biological research? We will debate the feasibility of explaining biology from a physicist's perspective. Learning Objectives: (1) Understand the potential applications and limitations of computational methods for dose-response modeling at the molecular, cellular and tissue levels. (2) Learn about the mechanisms of action underlying the induction, repair and biological processing of damage to DNA and other constituents. (3) Understand how effects and processes at one biological scale impact biological processes and outcomes at other scales. Funding: J. Schuemann: NCI/NIH grants. S. McMahon: European Commission FP7 (grant EC FP7 MC-IOF-623630).

  19. On the Use of a Standard Spreadsheet to Model Physical Systems in School Teaching

    ERIC Educational Resources Information Center

    Quale, Andreas

    2012-01-01

    In the teaching of physics at upper secondary school level (K10-K12), the students are generally taught to solve problems analytically, i.e. using the dynamics describing a system (typically in the form of differential equations) to compute its evolution in time, e.g. the motion of a body along a straight line or in a plane. This reduces the scope…

  20. Human Pacman: A Mobile Augmented Reality Entertainment System Based on Physical, Social, and Ubiquitous Computing

    NASA Astrophysics Data System (ADS)

    Cheok, Adrian David

This chapter details the Human Pacman system to illuminate entertainment computing, which ventures to embed the natural physical world seamlessly within a fantasy virtual playground by capitalizing on infrastructure provided by mobile computing, wireless LAN, and ubiquitous computing. With Human Pacman, we have a physical role-playing computer fantasy combined with real human social and mobile gaming that emphasizes collaboration and competition between players in a wide outdoor physical area, allowing natural wide-area human physical movements. Pacmen and Ghosts are real human players in the real world experiencing a mixed computer-graphics fantasy-reality provided by the wearable computers they carry. Virtual cookies and actual tangible physical objects are incorporated into the game play to provide novel experiences of seamless transitions between the real and virtual worlds. This is an example of a new form of gaming that anchors on physicality, mobility, social interaction, and ubiquitous computing.

  1. Application of computational methods to analyse and investigate physical and chemical processes of high-temperature mineralizing of condensed substances in gas stream

    NASA Astrophysics Data System (ADS)

    Markelov, A. Y.; Shiryaevskii, V. L.; Kudrinskiy, A. A.; Anpilov, S. V.; Bobrakov, A. N.

    2017-11-01

A computational method for the analysis of physical and chemical processes of high-temperature mineralizing of low-level radioactive waste in a gas stream during plasma treatment of radioactive waste in shaft furnaces is introduced. It is shown that the thermodynamic simulation method describes fairly adequately the changes in the composition of the pyrogas withdrawn from the shaft furnace under different waste treatment regimes. This offers a possibility of developing environmentally and economically viable technologies and small-sized low-cost facilities for plasma treatment of radioactive waste to be applied at currently operating nuclear power plants.

  2. A study of the use of abstract types for the representation of engineering units in integration and test applications

    NASA Technical Reports Server (NTRS)

    Johnson, Charles S.

    1986-01-01

    Physical quantities using various units of measurement can be well represented in Ada by the use of abstract types. Computation involving these quantities (electric potential, mass, volume) can also automatically invoke the computation and checking of some of the implicitly associable attributes of measurements. Quantities can be held internally in SI units, transparently to the user, with automatic conversion. Through dimensional analysis, the type of the derived quantity resulting from a computation is known, thereby allowing dynamic checks of the equations used. The impact of the possible implementation of these techniques in integration and test applications is discussed. The overhead of computing and transporting measurement attributes is weighed against the advantages gained by their use. The construction of a run time interpreter using physical quantities in equations can be aided by the dynamic equation checks provided by dimensional analysis. The effects of high levels of abstraction on the generation and maintenance of software used in integration and test applications are also discussed.
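The dimensional-analysis idea described above can be sketched in a few lines. The following is an illustrative Python sketch under stated assumptions, not the paper's Ada implementation: quantities carry SI dimension exponents, addition checks them dynamically, and multiplication and division derive them, so the type of a derived quantity is known and equations can be checked at run time.

```python
class Quantity:
    """A value with SI dimension exponents (metre, kilogram, second, ampere)."""

    def __init__(self, value, dims):
        self.value = value
        self.dims = tuple(dims)

    def __add__(self, other):
        # dynamic dimension check: only like quantities may be added
        if self.dims != other.dims:
            raise TypeError("dimension mismatch in addition")
        return Quantity(self.value + other.value, self.dims)

    def __mul__(self, other):
        # multiplication adds dimension exponents (dimensional analysis)
        dims = tuple(a + b for a, b in zip(self.dims, other.dims))
        return Quantity(self.value * other.value, dims)

    def __truediv__(self, other):
        # division subtracts dimension exponents
        dims = tuple(a - b for a, b in zip(self.dims, other.dims))
        return Quantity(self.value / other.value, dims)


METRE = (1, 0, 0, 0)
SECOND = (0, 0, 1, 0)

distance = Quantity(6.0, METRE)
time = Quantity(2.0, SECOND)
speed = distance / time  # derived quantity with dimensions m s^-1
```

Attempting `distance + time` raises a TypeError, which is the kind of dynamic equation check the abstract describes; a real implementation would also handle unit conversion to SI on input.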

  3. Applied Physics Education: PER focused on Physics-Intensive Careers

    NASA Astrophysics Data System (ADS)

    Zwickl, Benjamin

    2017-01-01

Physics education research is moving beyond classroom learning to study the application of physics education within STEM jobs and PhD-level research. Workforce-related PER is vital to supporting physics departments as they educate students for a diverse range of careers. Results from an ongoing study involving interviews with entry-level employees, academic researchers, and supervisors in STEM jobs describe the ways that mathematics, physics, and communication are needed for workplace success. Math and physics are often used for solving ill-structured problems that involve data analysis, computational modeling, or hands-on work. Communication and collaboration are utilized in leadership, sales, and as a way to transfer information capital throughout the organization through documentation, emails, memos, and face-to-face discussions. While managers and advisors think a physics degree typically establishes technical competency, communication skills are vetted through interviews and developed on the job. Significant learning continues after graduation, showing the importance of cultivating self-directed learning habits and the critical role of employers as educators of specialized technical abilities through on-the-job training. Supported by NSF DGE-1432578.

  4. [Systematic review on the physical activity level and nutritional status of Brazilian children].

    PubMed

    Graziele Bento, Gisele; Cascaes da Silva, Franciele; Gonçalves, Elizandra; Domingos Dos Santos, Patrícia; da Silva, Rudney

    2016-08-01

Objective: To systematically review the literature on the prevalence of, and the factors associated with, physical activity level and nutritional status in Brazilian children. Methods: The electronic databases MEDLINE (via PubMed), SciELO, SCOPUS and Web of Science were selected. The search strategy included the descriptors proposed in the Medical Subject Headings (MeSH): "Motor Activity", "Activities", "Nutritional Status", "Overweight", "Obesity", "Body Mass Index", "Child", "Brazil". Results: The search identified 141 articles, of which 16 studies were considered potentially relevant and were included in the review. Conclusions: Studies on the nutritional status and physical activity levels of Brazilian children are still scarce, but work in this area has increased in recent years, especially studies that use cross-sectional designs and questionnaires to measure physical activity; BMI is still widely used for nutritional status. Furthermore, studies that analyzed the number of hours devoted to sedentary behaviors such as watching TV, playing video games and using the computer found that these activities took more than two hours every day.

  5. Teachers' self-assessed levels of preparation

    NASA Astrophysics Data System (ADS)

    White, Susan C.

    2016-02-01

    Every four years we survey a nationally representative sample of high school physics teachers. We define anyone who teaches at least one physics class to be a "physics teacher." About 40% of these teachers teach a majority of their classes in subjects other than physics. We also ask teachers to rate how well prepared they felt in various aspects of teaching. The response choices are "not adequately prepared," "adequately prepared," and "very well prepared." The accompanying figure shows the proportion of teachers who reported feeling adequately or very well prepared in the following aspects of teaching: • Basic physics knowledge, • Other science knowledge, • Application of physics to everyday experience, • Use of demonstrations, • Instructional laboratory design, • Use of computers in physics instruction and labs, and • Recent developments in physics.

  6. A reinterpretation of transparency perception in terms of gamut relativity.

    PubMed

    Vladusich, Tony

    2013-03-01

    Classical approaches to transparency perception assume that transparency constitutes a perceptual dimension corresponding to the physical dimension of transmittance. Here I present an alternative theory, termed gamut relativity, that naturally explains key aspects of transparency perception. Rather than being computed as values along a perceptual dimension corresponding to transmittance, gamut relativity postulates that transparency is built directly into the fabric of the visual system's representation of surface color. The theory, originally developed to explain properties of brightness and lightness perception, proposes how the relativity of the achromatic color gamut in a perceptual blackness-whiteness space underlies the representation of foreground and background surface layers. Whereas brightness and lightness perception were previously reanalyzed in terms of the relativity of the achromatic color gamut with respect to illumination level, transparency perception is here reinterpreted in terms of relativity with respect to physical transmittance. The relativity of the achromatic color gamut thus emerges as a fundamental computational principle underlying surface perception. A duality theorem relates the definition of transparency provided in gamut relativity with the classical definition underlying the physical blending models of computer graphics.
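The "physical blending models of computer graphics" referenced above reduce to alpha compositing: a foreground layer with opacity alpha is mixed over a background, with transmittance corresponding to 1 - alpha. A minimal sketch (the function name and per-channel RGB-tuple convention are illustrative, not taken from the paper):

```python
def blend_over(foreground, background, alpha):
    """Composite a foreground layer over a background, per channel.

    alpha = 1.0 is fully opaque; alpha = 0.0 is fully transmissive
    (physical transmittance = 1 - alpha)."""
    return tuple(alpha * f + (1 - alpha) * b
                 for f, b in zip(foreground, background))


# A half-opaque red filter over a blue background yields an even mix:
mixed = blend_over((1.0, 0.0, 0.0), (0.0, 0.0, 1.0), 0.5)
```

Gamut relativity's duality theorem relates the perceptual representation of transparency to this classical linear-blending definition.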

  7. The influence of leg-to-body ratio (LBR) on judgments of female physical attractiveness: assessments of computer-generated images varying in LBR.

    PubMed

    Frederick, David A; Hadji-Michael, Maria; Furnham, Adrian; Swami, Viren

    2010-01-01

    The leg-to-body ratio (LBR), which is reliably associated with developmental stability and health outcomes, is an understudied component of human physical attractiveness. Several studies examining the effects of LBR on aesthetic judgments have been limited by the reliance on stimuli composed of hand-drawn silhouettes. In the present study, we developed a new set of female computer-generated images portraying eight levels of LBR that fell within the typical range of human variation. A community sample of 207 Britons in London and students from two samples drawn from a US university (Ns=940, 114) rated the physical attractiveness of the images. We found that mid-ranging female LBRs were perceived as maximally attractive. The present research overcomes some of the problems associated with past work on LBR and aesthetic preferences through use of computer-generated images rather than hand-drawn images and provides an instrument that may be useful in future investigations of LBR preferences. Copyright 2009 Elsevier Ltd. All rights reserved.

  8. Summer Institute for Physical Science Teachers

    NASA Astrophysics Data System (ADS)

    Maheswaranathan, Ponn; Calloway, Cliff

    2007-04-01

    A summer institute for physical science teachers was conducted at Winthrop University, June 19-29, 2006. Ninth grade physical science teachers at schools within a 50-mile radius from Winthrop were targeted. We developed a graduate level physics professional development course covering selected topics from both the physics and chemistry content areas of the South Carolina Science Standards. Delivery of the material included traditional lectures and the following new approaches in science teaching: hands-on experiments, group activities, computer based data collection, computer modeling, with group discussions & presentations. Two experienced master teachers assisted us during the delivery of the course. The institute was funded by the South Carolina Department of Education. The requested funds were used for the following: faculty salaries, the University contract course fee, some of the participants' room and board, startup equipment for each teacher, and indirect costs to Winthrop University. Startup equipment included a Pasco stand-alone, portable Xplorer GLX interface with sensors (temperature, voltage, pH, pressure, motion, and sound), and modeling software (Wavefunction's Spartan Student and Odyssey). What we learned and ideas for future K-12 teacher preparation initiatives will be presented.

  9. Comparison of Different Methods of Grading a Level Turn Task on a Flight Simulator

    NASA Technical Reports Server (NTRS)

Heath, Bruce E.; Crier, Tomyka

    2003-01-01

With the advancements in the computing power of personal computers, PC-based flight simulators and trainers have opened new avenues in the training of airplane pilots. It may be desirable to have the flight simulator make a quantitative evaluation of a pilot's training progress, reducing the demand on the flight instructor, who must otherwise watch every flight. In an experiment, university students conducted six different flights, each consisting of two level turns; each flight was three minutes in duration. By evaluating videotapes, two certified flight instructors provided separate letter grades for each turn. The level turns were also evaluated using two computer-based grading methods. One method assigned automated grades based on prescribed tolerances in bank angle, airspeed and altitude. The other method used deviations in altitude and bank angle to compute a performance index and performance grades.
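The first automated method described above (grades from prescribed tolerances) can be sketched as follows. This is a hedged illustration: the tolerance values, the time-in-tolerance scoring rule, and the letter-grade cutoffs are hypothetical, not those used in the study.

```python
def grade_turn(samples, bank_tol=5.0, speed_tol=10.0, alt_tol=100.0):
    """Grade a level turn from (bank_error, airspeed_error, altitude_error)
    samples: the fraction of samples with all three errors within tolerance
    is mapped to a letter grade. Tolerances and cutoffs are illustrative."""
    in_tol = sum(
        1 for bank, speed, alt in samples
        if abs(bank) <= bank_tol and abs(speed) <= speed_tol and abs(alt) <= alt_tol
    )
    frac = in_tol / len(samples)
    for cutoff, letter in ((0.9, "A"), (0.8, "B"), (0.7, "C"), (0.6, "D")):
        if frac >= cutoff:
            return letter
    return "F"
```

For example, a turn whose state stays within all three tolerances for 85% of the sampled time would earn a "B" under these assumed cutoffs.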

  10. First results from a combined analysis of CERN computing infrastructure metrics

    NASA Astrophysics Data System (ADS)

    Duellmann, Dirk; Nieke, Christian

    2017-10-01

The IT Analysis Working Group (AWG) has been formed at CERN across individual computing units and the experiments to attempt a cross-cutting analysis of computing infrastructure and application metrics. In this presentation we will describe the first results obtained using medium- and long-term data (1 month to 1 year) correlating box-level metrics, job-level metrics from LSF and HTCondor, IO metrics from the physics analysis disk pools (EOS), and networking and application-level metrics from the experiment dashboards. We will cover in particular the measurement of hardware performance and the prediction of job duration, the latency sensitivity of different job types, and a search for bottlenecks with the production job mix in the current infrastructure. The presentation will conclude with the proposal of a small set of metrics to simplify drawing conclusions, also in the more constrained environment of public cloud deployments.

  11. Physics-based Entry, Descent and Landing Risk Model

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Huynh, Loc C.; Manning, Ted

    2014-01-01

A physics-based risk model was developed to assess the risk associated with thermal protection system (TPS) failures during the entry, descent and landing phase of a manned spacecraft mission. In the model, entry trajectories were computed using a three-degree-of-freedom trajectory tool, the aerothermodynamic heating environment was computed using an engineering-level computational tool, and the thermal response of the TPS material was modeled using a one-dimensional thermal response tool. The model was capable of modeling the effect of micrometeoroid and orbital debris (MMOD) impact damage on the TPS thermal response. A Monte Carlo analysis was used to determine the effects of uncertainties in the vehicle state at Entry Interface, aerothermodynamic heating and material properties on the performance of the TPS design. The failure criterion was set as a temperature limit at the bondline between the TPS and the underlying structure. Both direct computation and response surface approaches were used to compute the risk. The model was applied to a generic manned space capsule design. The effects of material property uncertainty and MMOD damage on the risk of failure were analyzed, and a comparison of the direct computation and response surface approaches was undertaken.
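The direct-computation Monte Carlo approach described above can be illustrated with a toy model: sample the uncertain inputs, evaluate a stand-in thermal response, and count the samples whose bondline temperature exceeds the limit. The heating model, distributions, and all numbers below are invented for illustration; they are not the study's tools or values.

```python
import random


def bondline_temperature(heat_load, conductivity):
    # Toy stand-in for the one-dimensional thermal response tool:
    # bondline temperature rises with heat load, falls with conductivity.
    return 300.0 + heat_load / conductivity


def failure_probability(n_samples=100_000, temp_limit=650.0, seed=42):
    """Estimate P(bondline temperature > limit) by direct Monte Carlo
    over uncertain heating and material-property inputs (all invented)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        heat_load = rng.gauss(30_000.0, 5_000.0)   # uncertain aeroheating
        conductivity = rng.gauss(100.0, 10.0)      # uncertain material property
        if bondline_temperature(heat_load, conductivity) > temp_limit:
            failures += 1
    return failures / n_samples
```

A response-surface variant would replace the inner model call with a cheap fitted surrogate, trading a small approximation error for far fewer expensive thermal-response evaluations.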

  12. Computational fluid dynamics and frequency-dependent finite-difference time-domain method coupling for the interaction between microwaves and plasma in rocket plumes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kinefuchi, K.; Funaki, I.; Shimada, T.

Under certain conditions during rocket flights, ionized exhaust plumes from solid rocket motors may interfere with radio frequency transmissions. To understand the relevant physical processes involved in this phenomenon and establish a prediction process for in-flight attenuation levels, we attempted to measure microwave attenuation caused by rocket exhaust plumes in a sea-level static firing test for a full-scale solid propellant rocket motor. The microwave attenuation level was calculated by a coupled simulation of the inviscid-frozen-flow computational fluid dynamics of an exhaust plume and detailed analysis of microwave transmissions by applying a frequency-dependent finite-difference time-domain method with the Drude dispersion model. The calculated microwave attenuation level agreed well with the experimental results, except in the case of interference downstream of the Mach disk in the exhaust plume. It was concluded that the coupled estimation method based on the physics of the frozen plasma flow with Drude dispersion would be suitable for actual flight conditions, although the mixing and afterburning in the plume should be considered depending on the flow condition.
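The Drude dispersion model named above gives the complex permittivity of the plume plasma, from which an attenuation level follows. The sketch below evaluates it for a uniform slab under stated assumptions: the plasma parameters are illustrative, and the paper's coupling to CFD plume solutions and full FDTD propagation is not reproduced.

```python
import cmath
import math

C = 2.998e8       # speed of light, m/s
E = 1.602e-19     # elementary charge, C
ME = 9.109e-31    # electron mass, kg
EPS0 = 8.854e-12  # vacuum permittivity, F/m


def drude_attenuation_db(freq_hz, n_e, nu_c, path_m):
    """Power attenuation (dB) over path_m metres of uniform plasma with
    electron density n_e (m^-3) and collision frequency nu_c (s^-1)."""
    omega = 2 * math.pi * freq_hz
    omega_p2 = n_e * E**2 / (ME * EPS0)                   # plasma frequency squared
    eps_r = 1 - omega_p2 / (omega * (omega + 1j * nu_c))  # Drude permittivity
    n = cmath.sqrt(eps_r)                                 # complex refractive index
    alpha = 2 * (omega / C) * n.imag                      # power attenuation, Np/m
    return 10 * math.log10(math.e) * alpha * path_m       # nepers to dB
```

An overdense, collisional plume region attenuates strongly, while an underdense, collisionless region is essentially transparent, which is the qualitative behavior the coupled CFD-FDTD simulation resolves point by point along the transmission path.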

  13. Individual and family environmental correlates of television and computer time in 10- to 12-year-old European children: the ENERGY-project.

    PubMed

    Verloigne, Maïté; Van Lippevelde, Wendy; Bere, Elling; Manios, Yannis; Kovács, Éva; Grillenberger, Monika; Maes, Lea; Brug, Johannes; De Bourdeaudhuij, Ilse

    2015-09-18

The aim was to investigate which individual and family environmental factors are related to television and computer time separately in 10- to 12-year-old children within and across five European countries (Belgium, Germany, Greece, Hungary, Norway). Data were used from the ENERGY-project. Children and one of their parents completed a questionnaire, including questions on screen time behaviours and related individual and family environmental factors. Family environmental factors included social, political, economic and physical environmental factors. Complete data were obtained from 2022 child-parent dyads (53.8% girls; mean child age 11.2 ± 0.8 years; mean parental age 40.5 ± 5.1 years). To examine the association between individual and family environmental factors (i.e. independent variables) and television/computer time (i.e. dependent variables) in each country, multilevel regression analyses were performed using MLwiN 2.22, adjusting for children's sex and age. In all countries, children reported more television and/or computer time if they and their parents thought that the maximum recommended level for watching television and/or using the computer was higher, and if children had a higher preference for television watching and/or computer use and a lower self-efficacy to control television watching and/or computer use. Most physical and economic environmental variables were not significantly associated with television or computer time. Slightly more individual factors were related to children's computer time, and more parental social environmental factors to children's television time. We also found different correlates across countries: parental co-participation in television watching was significantly positively associated with children's television time in all countries except Greece. A higher level of parental television and computer time was associated with a higher level of children's television and computer time only in Hungary.
Having rules regarding children's television time was related to less television time in all countries except Belgium and Norway. Most evidence was found for an association between screen time and individual and parental social environmental factors, which means that future interventions aiming to reduce screen time should focus on children's individual beliefs and habits as well as on parental social factors. As we identified somewhat different correlates for television and computer time and across countries, cross-European interventions could make small adaptations for each specific screen time activity and place different emphases in different countries.

  14. Gamut relativity: a new computational approach to brightness and lightness perception.

    PubMed

    Vladusich, Tony

    2013-01-09

    This article deconstructs the conventional theory that "brightness" and "lightness" constitute perceptual dimensions corresponding to the physical dimensions of luminance and reflectance, and builds in its place the theory that brightness and lightness correspond to computationally defined "modes," rather than dimensions, of perception. According to the theory, called gamut relativity, "blackness" and "whiteness" constitute the perceptual dimensions (forming a two-dimensional "blackness-whiteness" space) underlying achromatic color perception (black, white, and gray shades). These perceptual dimensions are postulated to be related to the neural activity levels in the ON and OFF channels of vision. The theory unifies and generalizes a number of extant concepts in the brightness and lightness literature, such as simultaneous contrast, anchoring, and scission, and quantitatively simulates several challenging perceptual phenomena, including the staircase Gelb effect and the effects of task instructions on achromatic color-matching behavior, all with a single free parameter. The theory also provides a new conception of achromatic color constancy in terms of the relative distances between points in blackness-whiteness space. The theory suggests a host of striking conclusions, the most important of which is that the perceptual dimensions of vision should be generically specified according to the computational properties of the brain, rather than in terms of "reified" physical dimensions. This new approach replaces the computational goal of estimating absolute physical quantities ("inverse optics") with the goal of computing object properties relatively.

  15. XXV IUPAP Conference on Computational Physics (CCP2013): Preface

    NASA Astrophysics Data System (ADS)

    2014-05-01

    XXV IUPAP Conference on Computational Physics (CCP2013) was held from 20-24 August 2013 at the Russian Academy of Sciences in Moscow, Russia. The annual Conferences on Computational Physics (CCP) present an overview of the most recent developments and opportunities in computational physics across a broad range of topical areas. The CCP series aims to draw computational scientists from around the world and to stimulate interdisciplinary discussion and collaboration by putting together researchers interested in various fields of computational science. It is organized under the auspices of the International Union of Pure and Applied Physics and has been in existence since 1989. The CCP series alternates between Europe, America and Asia-Pacific. The conferences are traditionally supported by European Physical Society and American Physical Society. This year the Conference host was Landau Institute for Theoretical Physics. The Conference contained 142 presentations, and, in particular, 11 plenary talks with comprehensive reviews from airbursts to many-electron systems. We would like to take this opportunity to thank our sponsors: International Union of Pure and Applied Physics (IUPAP), European Physical Society (EPS), Division of Computational Physics of American Physical Society (DCOMP/APS), Russian Foundation for Basic Research, Department of Physical Sciences of Russian Academy of Sciences, RSC Group company. Further conference information and images from the conference are available in the pdf.

  16. Longitudinal effects of college type and selectivity on degrees conferred upon undergraduate females in physical science, life science, math and computer science, and social science

    NASA Astrophysics Data System (ADS)

    Stevens, Stacy Mckimm

There has been much research to suggest that a single-sex college experience can increase female undergraduate students' self-confidence and leadership ability during the college years and beyond. The results of previous studies also suggest that these students achieve in the workforce and enter graduate school at higher rates than their female peers graduating from coeducational institutions. However, some researchers have questioned these findings, suggesting that it is the selectivity level of the colleges, rather than the gender composition of the student body, that causes these differences. The purpose of this study was to justify the continuation of single-sex educational opportunities for females at the post-secondary level by examining the effects of college selectivity, college type, and time on the rate of undergraduate females pursuing majors in non-traditional fields. The study examined the percentage of physical science, life science, math and computer science, and social science degrees conferred upon females graduating from women's colleges from 1985-2001, as compared to those at comparable coeducational colleges. The sample for this study consisted of 42 liberal arts women's (n = 21) and coeducational (n = 21) colleges. Variables included the type of college, the selectivity level of the college, and the effect of time on the percentage of female graduates. Doubly multivariate repeated measures analysis of variance testing revealed significant main effects of college selectivity on social science graduates, and of time on both life science and math and computer science graduates. Significant interactions were also found between college type and time for social science graduates, as well as among college type, selectivity level, and time for math and computer science graduates. Implications of the results and suggestions for further research are discussed.

  17. Motor performance of tongue with a computer-integrated system under different levels of background physical exertion

    PubMed Central

    Huo, Xueliang; Johnson-Long, Ashley N.; Ghovanloo, Maysam; Shinohara, Minoru

    2015-01-01

The purpose of this study was to compare the motor performance of the tongue, using the Tongue Drive System, to hand operation for relatively complex tasks under different levels of background physical exertion. Thirteen young able-bodied adults performed tasks that tested the accuracy and variability of tracking a sinusoidal waveform, and performance in playing two video games that require accurate and rapid movements with cognitive processing, using tongue and hand under two levels of background physical exertion. Results show that additional background physical exertion did not influence rapid and accurate displacement motor performance, but compromised slow waveform tracking and shooting performance for both hand and tongue. Slow waveform tracking performance by the tongue was compromised by an additional motor or cognitive task, but by an additional motor task only for the hand. Practitioner Summary: We investigated the influence of task complexity and background physical exertion on the motor performance of tongue and hand. Results indicate that task performance degrades with an additional concurrent task or physical exertion, due to the limited attentional resources available for handling both the motor task and the background exertion. PMID:24003900

  18. The QUANTGRID Project (RO)—Quantum Security in GRID Computing Applications

    NASA Astrophysics Data System (ADS)

    Dima, M.; Dulea, M.; Petre, M.; Petre, C.; Mitrica, B.; Stoica, M.; Udrea, M.; Sterian, R.; Sterian, P.

    2010-01-01

    The QUANTGRID Project, financed through the National Center for Programme Management (CNMP-Romania), is the first attempt at using Quantum Crypted Communications (QCC) in large-scale operations, such as GRID Computing, and conceivably in the years ahead in the banking sector and other security-critical communications. In collaboration with the GRID activities of the Center for Computing & Communications (Nat.'l Inst. Nucl. Phys.—IFIN-HH), the Quantum Optics Lab. (Nat.'l Inst. Plasma and Lasers—INFLPR) and the Physics Dept. (University Polytechnica—UPB), the project will build a demonstrator infrastructure for this technology. The status of the project in its incipient phase is reported, featuring tests for communications in classical security mode: socket-level communications under AES (Advanced Encryption Std.), both implemented as proprietary C++ code. An outline of the planned undertaking of the project is communicated, highlighting its impact in quantum physics, coherent optics and information technology.

  19. Use of the computer for research on student thinking in physics

    NASA Astrophysics Data System (ADS)

    Grayson, Diane J.; McDermott, Lillian C.

    1996-05-01

    This paper describes the use of the computer-based interview as a research technique for investigating how students think about physics. Two computer programs provide the context: one intended for instruction, the other for research. The one designed for use as an instructional aid displays the motion of a ball rolling along a track that has level and inclined segments. The associated motion graphs are also shown. The other program, which was expressly designed for use in research, is based on the simulated motion of a modified Atwood's machine. The programs require students to predict the effect of the initial conditions and system parameters on the motion or on a graph of the motion. The motion that would actually occur is then displayed. The investigation focuses on the reasoning used by the students as they try to resolve discrepancies between their predictions and observations.
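    The prediction task posed by the research program has a simple closed-form answer for an ideal modified Atwood's machine: a cart on a level, frictionless track pulled by a hanging mass over an ideal pulley. The sketch below (our illustration, not the paper's software; all parameter names and defaults are assumptions) computes the predicted motion from the initial conditions and system parameters:

    ```python
    def atwood_motion(m_cart, m_hang, v0=0.0, x0=0.0, g=9.8, t_end=2.0, dt=0.01):
        """Ideal modified Atwood's machine: a cart of mass m_cart on a level,
        frictionless track, pulled by a hanging mass m_hang over an ideal
        pulley. Returns the acceleration and sampled (t, x, v) triples."""
        a = m_hang * g / (m_cart + m_hang)  # Newton's 2nd law for the system
        n_steps = int(round(t_end / dt))
        samples = [(k * dt,
                    x0 + v0 * (k * dt) + 0.5 * a * (k * dt) ** 2,
                    v0 + a * (k * dt))
                   for k in range(n_steps + 1)]
        return a, samples

    # 0.9 kg cart pulled by a 0.1 kg hanging mass, released from rest.
    a, samples = atwood_motion(0.9, 0.1)
    ```

    Changing the hanging mass or the cart mass changes the predicted acceleration, which is exactly the kind of parameter-to-motion link the interview programs ask students to reason about.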

  20. [Clinical and magnetic resonance imaging characteristics of isolated congenital anosmia].

    PubMed

    Liu, Jian-feng; Wang, Jian; You, Hui; Ni, Dao-feng; Yang, Da-zhang

    2010-05-25

    To report a series of patients with isolated congenital anosmia and summarize their clinical and magnetic resonance imaging (MRI) characteristics. Twenty patients with isolated congenital anosmia were reviewed retrospectively. A thorough medical and chemosensory history, physical examination, nasal endoscopy, T&T olfactory testing, olfactory event-related potentials, sinonasal computed tomography scan and magnetic resonance imaging of the olfactory pathway were performed in all patients. Neither ENT physical examination nor nasal endoscopy was remarkable. Subjective olfactory testing indicated anosmia in all patients. No olfactory event-related potentials to maximal stimulus were obtained. Computed tomography scans were normal. MRI revealed the absence of olfactory bulbs and tracts in all cases, and hypoplasia or aplasia of the olfactory sulcus was found in all cases. All patients had normal sex hormone levels. The diagnosis of isolated congenital anosmia is established on chief complaints, physical examination, olfactory testing and olfactory imaging. MRI of the olfactory pathway is indispensable.

  1. A New Interface Specification Methodology and its Application to Transducer Synthesis

    DTIC Science & Technology

    1988-05-01

    The Gajski-Kuhn Y-chart's three axes correspond to three different domains for describing designs: behavioral, structural, and physical. Within each domain, descriptive methods are distinguished by the level of abstraction they emphasize. [Gajski83] D. Gajski, R. Kuhn, Guest Editors' Introduction: New VLSI Tools, IEEE Computer, Vol. 16, No. 12, December 1983.

  2. [Physical activity in a probabilistic sample in the city of Rio de Janeiro].

    PubMed

    Gomes, V B; Siqueira, K S; Sichieri, R

    2001-01-01

    This study evaluated physical activity in a probabilistic sample of 4,331 individuals 12 years of age and older residing in the city of Rio de Janeiro, who participated in a household survey in 1996. Occupation and leisure activity were grouped according to categories of energy expenditure. The study also evaluated the number of hours spent watching TV, using the computer, or playing video games. Only 3.6% of males and 0.3% of females reported heavy occupational work. A full 59.8% of males and 77.8% of females reported never performing recreational physical activity, and this prevalence increased with age, especially for men. Women's leisure activities involved less energy expenditure and had a lower median duration than those of men. Mean daily TV/video/computer time was greater for women than for men. The greater the level of schooling, the higher the frequency of physical activity for both sexes. Analyzed jointly, these data show the low energy expenditure through physical activity by the population of the city of Rio de Janeiro. Women, the middle-aged, the elderly, and low-income individuals were at greatest risk of not performing recreational physical activity.

  3. Arthur L. Schawlow Prize in Laser Science Talk: Trapped Ion Quantum Networks with Light

    NASA Astrophysics Data System (ADS)

    Monroe, Christopher

    2015-05-01

    Laser-cooled atomic ions are standards for quantum information science, acting as qubit memories with unsurpassed levels of quantum coherence while also allowing near-perfect measurement. When qubit state-dependent optical dipole forces are applied to a collection of trapped ions, their Coulomb interaction is modulated in a way that allows the entanglement of the qubits through quantum gates that can form the basis of a quantum computer. Similar optical forces allow the simulation of quantum many-body physics, where recent experiments are approaching a level of complexity that cannot be modelled with conventional computers. Scaling to much larger numbers of qubits can be accomplished by coupling trapped ion qubits through optical photons, where entanglement over remote distances can be used for quantum communication and large-scale distributed quantum computers. Laser sources and quantum optical techniques are the workhorse for such quantum networks, and will continue to lead the way as future quantum hardware is developed. This work is supported by the ARO with funding from the IARPA MQCO program, the DARPA Quiness Program, the ARO MURI on Hybrid Quantum Circuits, the AFOSR MURIs on Quantum Transduction and Quantum Verification, and the NSF Physics Frontier Center at JQI.

  4. Rehabilitation Aids.

    ERIC Educational Resources Information Center

    National Center on Educational Media and Materials for the Handicapped, Columbus, OH.

    Selected from the National Instructional Materials Information System (NIMIS)--a computer based on-line interactive retrieval system on special education materials--the bibliography covers 40 equipment items for rehabilitation and physical therapy programs for all levels of handicapped children. Described are such items as a handygym, a suspension…

  5. 25 CFR 36.3 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... education emphasizing literacy in language arts, mathematics, natural and physical sciences, history, and related social sciences. Bureau means the Bureau of Indian Affairs of the Department of the Interior... specified level of mastery. Computer literacy used here means the general range of skills and understanding...

  6. 25 CFR 36.3 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... education emphasizing literacy in language arts, mathematics, natural and physical sciences, history, and related social sciences. Bureau means the Bureau of Indian Affairs of the Department of the Interior... specified level of mastery. Computer literacy used here means the general range of skills and understanding...

  7. 25 CFR 36.3 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... education emphasizing literacy in language arts, mathematics, natural and physical sciences, history, and related social sciences. Bureau means the Bureau of Indian Affairs of the Department of the Interior... specified level of mastery. Computer literacy used here means the general range of skills and understanding...

  8. Twenty Years of Symbiosis Between Art and Science

    ERIC Educational Resources Information Center

    Reichardt, Jasia

    1974-01-01

    During the past two decades advances in biology, nuclear physics, computer and material sciences, and audiovisual engineering have brought a radically new dimension to most art forms and have stimulated the artist and his innovations to breath-taking levels of achievement. (Editor/JR)

  9. Tracking at High Level Trigger in CMS

    NASA Astrophysics Data System (ADS)

    Tosi, M.

    2016-04-01

    The trigger systems of the LHC detectors play a crucial role in determining the physics capabilities of the experiments. A reduction of several orders of magnitude of the event rate is needed to reach values compatible with detector readout, offline storage and analysis capability. The CMS experiment has been designed with a two-level trigger system: the Level-1 Trigger (L1T), implemented on custom-designed electronics, and the High Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. A software trigger system requires a trade-off between the complexity of the algorithms, the sustainable output rate, and the selection efficiency. With the computing power available during the 2012 data taking, the maximum reconstruction time at the HLT was about 200 ms per event, at the nominal L1T rate of 100 kHz. Track reconstruction algorithms are widely used in the HLT, for the reconstruction of physics objects as well as in the identification of b-jets and lepton isolation. Reconstructed tracks are also used to distinguish the primary vertex, which identifies the hard interaction process, from the pileup ones. This task is particularly important in the LHC environment given the large number of interactions per bunch crossing: on average 25 in 2012, and expected to be around 40 in Run II. We present the performance of the HLT tracking algorithms, discussing their impact on the CMS physics program, as well as new developments made towards the next data taking in 2015.
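    The trade-off quoted above (100 kHz L1 rate, roughly 200 ms mean HLT reconstruction time per event) fixes the minimum amount of concurrent processing the farm must supply. This back-of-envelope arithmetic is our illustration, not a CMS tool:

    ```python
    def hlt_processing_slots(l1_rate_hz, mean_time_s_per_event):
        """Average number of events in flight at any moment (Little's law),
        which the HLT farm must match with an equal number of processing
        slots to keep up on average."""
        return l1_rate_hz * mean_time_s_per_event

    # 100 kHz x 200 ms/event -> on the order of 20,000 concurrent events.
    slots = hlt_processing_slots(100_000, 0.200)
    ```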

  10. A Probabilistic Framework for the Validation and Certification of Computer Simulations

    NASA Technical Reports Server (NTRS)

    Ghanem, Roger; Knio, Omar

    2000-01-01

    The paper presents a methodology for quantifying, propagating, and managing the uncertainty in the data required to initialize computer simulations of complex phenomena. The purpose of the methodology is to permit the quantitative assessment of a certification level to be associated with the predictions from the simulations, as well as the design of a data acquisition strategy to achieve a target level of certification. The value of a methodology that can address the above issues is obvious, especially in light of the trend in the availability of computational resources, as well as the trend in sensor technology. These two trends make it possible to probe physical phenomena both with physical sensors and with complex models, at previously inconceivable levels. With these new abilities arises the need to develop the knowledge to integrate the information from sensors and computer simulations. This is achieved in the present work by tracing both activities back to a level of abstraction that highlights their commonalities, thus allowing them to be manipulated in a mathematically consistent fashion. In particular, the mathematical theory underlying computer simulations has long been associated with partial differential equations and functional analysis concepts such as Hilbert spaces and orthogonal projections. By relying on a probabilistic framework for the modeling of data, a Hilbert space framework emerges that permits the modeling of coefficients in the governing equations as random variables, or equivalently, as elements in a Hilbert space. This permits the development of an approximation theory for probabilistic problems that parallels deterministic approximation theory. According to this formalism, the solution of the problem is identified by its projection on a basis in the Hilbert space of random variables, as opposed to more traditional techniques where the solution is approximated by its first- or second-order statistics.
The present representation, in addition to capturing significantly more information than the traditional approach, facilitates the linkage between different interacting stochastic systems as is typically observed in real-life situations.
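    The projection onto a basis of the Hilbert space of random variables can be illustrated with a one-dimensional polynomial chaos expansion. The sketch below is a generic textbook example, not the authors' code: it projects a function of a standard normal random variable onto probabilists' Hermite polynomials via Gauss-Hermite quadrature.

    ```python
    import math
    import numpy as np
    from numpy.polynomial.hermite_e import hermegauss, hermeval

    def pc_coefficients(u, order, n_quad=40):
        """Coefficients c_k = E[u(xi) He_k(xi)] / k! of the expansion
        u(xi) ~ sum_k c_k He_k(xi), with xi ~ N(0, 1) and He_k the
        probabilists' Hermite polynomials (orthogonal with norm k!)."""
        x, w = hermegauss(n_quad)        # nodes/weights for weight e^{-x^2/2}
        w = w / np.sqrt(2.0 * np.pi)     # normalize to the Gaussian measure
        return np.array([
            np.sum(w * u(x) * hermeval(x, [0.0] * k + [1.0])) / math.factorial(k)
            for k in range(order + 1)
        ])

    # For u(xi) = exp(xi) the exact coefficients are e^{1/2} / k!.
    c = pc_coefficients(np.exp, order=4)
    ```

    Truncating this expansion at a finite order is what turns a random coefficient in a governing equation into a finite set of deterministic unknowns.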

  11. Neuromorphic neural interfaces: from neurophysiological inspiration to biohybrid coupling with nervous systems

    NASA Astrophysics Data System (ADS)

    Broccard, Frédéric D.; Joshi, Siddharth; Wang, Jun; Cauwenberghs, Gert

    2017-08-01

    Objective. Computation in nervous systems operates with different computational primitives, and on different hardware, than traditional digital computation and is thus subjected to different constraints from its digital counterpart regarding the use of physical resources such as time, space and energy. In an effort to better understand neural computation on a physical medium with similar spatiotemporal and energetic constraints, the field of neuromorphic engineering aims to design and implement electronic systems that emulate in very large-scale integration (VLSI) hardware the organization and functions of neural systems at multiple levels of biological organization, from individual neurons up to large circuits and networks. Mixed analog/digital neuromorphic VLSI systems are compact, consume little power and operate in real time independently of the size and complexity of the model. Approach. This article highlights the current efforts to interface neuromorphic systems with neural systems at multiple levels of biological organization, from the synaptic to the system level, and discusses the prospects for future biohybrid systems with neuromorphic circuits of greater complexity. Main results. Single silicon neurons have been interfaced successfully with invertebrate and vertebrate neural networks. This approach allowed the investigation of neural properties that are inaccessible with traditional techniques while providing a realistic biological context not achievable with traditional numerical modeling methods. At the network level, populations of neurons are envisioned to communicate bidirectionally with neuromorphic processors of hundreds or thousands of silicon neurons. Recent work on brain-machine interfaces suggests that this is feasible with current neuromorphic technology. Significance. Biohybrid interfaces between biological neurons and VLSI neuromorphic systems of varying complexity have started to emerge in the literature. 
Primarily intended as a computational tool for investigating fundamental questions related to neural dynamics, the sophistication of current neuromorphic systems now allows direct interfaces with large neuronal networks and circuits, resulting in potentially interesting clinical applications for neuroengineering systems, neuroprosthetics and neurorehabilitation.

  12. Transforming the advanced lab: Part I - Learning goals

    NASA Astrophysics Data System (ADS)

    Zwickl, Benjamin; Finkelstein, Noah; Lewandowski, H. J.

    2012-02-01

    Within the physics education research community relatively little attention has been given to laboratory courses, especially at the upper-division undergraduate level. As part of transforming our senior-level Optics and Modern Physics Lab at the University of Colorado Boulder we are developing learning goals, revising curricula, and creating assessments. In this paper, we report on the establishment of our learning goals and a surrounding framework that have emerged from discussions with a wide variety of faculty, from a review of the literature on labs, and from identifying the goals of existing lab courses. Our goals go beyond those of specific physics content and apparatus, allowing instructors to personalize them to their contexts. We report on four broad themes and associated learning goals: Modeling (math-physics-data connection, statistical error analysis, systematic error, modeling of engineered "black boxes"), Design (of experiments, apparatus, programs, troubleshooting), Communication, and Technical Lab Skills (computer-aided data analysis, LabVIEW, test and measurement equipment).

  13. Software Aspects of IEEE Floating-Point Computations for Numerical Applications in High Energy Physics

    ScienceCinema

    Arnold, Jeffrey

    2018-05-14

    Floating-point computations are at the heart of much of the computing done in high energy physics. The correctness, speed and accuracy of these computations are of paramount importance. The lack of any of these characteristics can mean the difference between new, exciting physics and an embarrassing correction. This talk will examine practical aspects of IEEE 754-2008 floating-point arithmetic as encountered in HEP applications. After describing the basic features of IEEE floating-point arithmetic, the presentation will cover: common hardware implementations (SSE, x87); techniques for improving the accuracy of summation, multiplication and data interchange; compiler options for gcc and icc affecting floating-point operations; and hazards to be avoided. About the speaker: Jeffrey M Arnold is a Senior Software Engineer in the Intel Compiler and Languages group at Intel Corporation. He has been part of the Digital->Compaq->Intel compiler organization for nearly 20 years; part of that time, he worked on both low- and high-level math libraries. Prior to that, he was in the VMS Engineering organization at Digital Equipment Corporation. In the late 1980s, Jeff spent 2½ years at CERN as part of the CERN/Digital Joint Project. In 2008, he returned to CERN to spend 10 weeks working with CERN/openlab. Since that time, he has returned to CERN multiple times to teach at openlab workshops and consult with various LHC experiments. Jeff received his Ph.D. in physics from Case Western Reserve University.
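    A standard summation-accuracy technique of the kind the talk covers is compensated (Kahan) summation. A minimal sketch (our example, not the speaker's code):

    ```python
    def kahan_sum(values):
        """Compensated (Kahan) summation: tracks the low-order bits that are
        rounded away when adding a small term to a large running total."""
        total = 0.0
        c = 0.0                      # running compensation for lost bits
        for v in values:
            y = v - c
            t = total + y
            c = (t - total) - y      # what got rounded away this step
            total = t
        return total

    # 1.0 followed by many tiny terms: in double precision, naive summation
    # loses every tiny term, while Kahan summation recovers their sum.
    data = [1.0] + [1e-16] * 10**5
    naive = sum(data)                # stays at exactly 1.0
    precise = kahan_sum(data)        # close to 1.0 + 1e-11
    ```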

  14. The contribution of walking to work to adult physical activity levels: a cross sectional study.

    PubMed

    Audrey, Suzanne; Procter, Sunita; Cooper, Ashley R

    2014-03-11

    To objectively examine the contribution of walking to work to adult physical activity levels. Employees (n = 103; 36.3 ± 11.7 years) at 17 workplaces in south-west England, who lived within 2 miles (3.2 km) of their workplace, wore Actigraph accelerometers for seven days during waking hours and carried GPS receivers during the commute to and from work. Physical activity volume (accelerometer counts per minute (cpm)) and intensity (minutes of moderate to vigorous physical activity (MVPA)) were computed overall and during the walk to work. Total weekday physical activity was 45% higher in participants who walked to work compared to those travelling by car (524.6 ± 170.4 vs 364.6 ± 138.4 cpm) and MVPA almost 60% higher (78.1 ± 24.9 vs 49.8 ± 25.2 minutes per day). No differences were seen in weekend physical activity, and sedentary time did not differ between the groups. Combined accelerometer and GPS data showed that walking to work contributed 47.3% of total weekday MVPA. Walking to work was associated with overall higher levels of physical activity in young and middle-aged adults. These data provide preliminary evidence to underpin the need for interventions to increase active commuting, specifically walking, in adults.
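    The volume (cpm) and intensity (MVPA minutes) metrics above can be computed directly from per-minute accelerometer counts. A sketch assuming the widely used Freedson cut-point of 1952 cpm for MVPA (the abstract does not state the study's exact threshold); the data are invented:

    ```python
    def summarize_activity(counts_per_min, mvpa_cutpoint=1952):
        """Mean counts per minute (cpm) and minutes of moderate-to-vigorous
        physical activity (MVPA) from per-minute accelerometer counts.
        The default cut-point is the Freedson threshold, assumed here for
        illustration; studies may use other calibrations."""
        cpm = sum(counts_per_min) / len(counts_per_min)
        mvpa_minutes = sum(1 for c in counts_per_min if c >= mvpa_cutpoint)
        return cpm, mvpa_minutes

    # A mostly sedentary 11-hour day with a single 60-minute brisk walk.
    day = [300] * 600 + [2500] * 60
    cpm, mvpa = summarize_activity(day)
    ```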

  15. The relationship between qualified personnel and self-reported implementation of recommended physical education practices and programs in U.S. schools.

    PubMed

    Davis, Kristen S; Burgeson, Charlene R; Brener, Nancy D; McManus, Tim; Wechsler, Howell

    2005-06-01

    The authors analyzed data from the School Health Policies and Programs Study 2000 to assess the associations between the presence of a district physical education coordinator and district-level physical education policies and practices recommended by federal government agencies and national organizations. The authors also examined the relationship between teacher qualifications and staff development related to physical education and self-reported implementation of recommended teaching practices. District-level data were collected by self-administered mail questionnaires from a nationally representative sample of school districts. Classroom-level data were collected by computer-assisted personal interviews with teachers of randomly selected classes in elementary schools and randomly selected required physical education courses in middle/junior high and senior high schools. Nearly two thirds (62.2%) of districts had a physical education coordinator, and those were generally more likely than other districts to report having policies and practices that corresponded with national recommendations for high-quality physical education programs. More than two thirds of teachers (66.9%) met the criteria for teacher qualifications based on their education and certification. These teachers were more likely than others to report use of certain recommended physical education teaching practices. Teachers who participated in staff development also were more likely to use recommended teaching practices in their classrooms. Using a district physical education coordinator and teachers with appropriate qualifications as well as offering staff development opportunities on physical education may enhance school physical education programs.

  16. Simulating Coupling Complexity in Space Plasmas: First Results from a new code

    NASA Astrophysics Data System (ADS)

    Kryukov, I.; Zank, G. P.; Pogorelov, N. V.; Raeder, J.; Ciardo, G.; Florinski, V. A.; Heerikhuisen, J.; Li, G.; Petrini, F.; Shematovich, V. I.; Winske, D.; Shaikh, D.; Webb, G. M.; Yee, H. M.

    2005-12-01

    The development of codes that embrace 'coupling complexity' via the self-consistent incorporation of multiple physical scales and multiple physical processes in models has been identified by the NRC Decadal Survey in Solar and Space Physics as a crucial necessary development in simulation/modeling technology for the coming decade. The National Science Foundation, through its Information Technology Research (ITR) Program, is supporting our efforts to develop a new class of computational code for plasmas and neutral gases that integrates multiple scales and multiple physical processes and descriptions. We are developing a highly modular, parallelized, scalable code that incorporates multiple scales by synthesizing 3 simulation technologies: 1) computational fluid dynamics (hydrodynamics or magnetohydrodynamics, MHD) for the large-scale plasma; 2) direct Monte Carlo simulation of atoms/neutral gas; and 3) transport code solvers to model highly energetic particle distributions. We are constructing the code so that a fourth simulation technology, hybrid simulations for microscale structures and particle distributions, can be incorporated in future work, but for the present, this aspect will be addressed at a test-particle level. This synthesis will provide a computational tool that will enormously advance our understanding of the physics of neutral and charged gases. Besides making major advances in basic plasma physics and neutral gas problems, this project will address 3 Grand Challenge space physics problems that reflect our research interests: 1) To develop a temporal global heliospheric model which includes the interaction of solar and interstellar plasma with neutral populations (hydrogen, helium, etc., and dust), test-particle kinetic pickup ion acceleration at the termination shock, anomalous cosmic ray production, interaction with galactic cosmic rays, while incorporating the time variability of the solar wind and the solar cycle. 
2) To develop a coronal mass ejection and interplanetary shock propagation model for the inner and outer heliosphere, including, at a test-particle level, wave-particle interactions and particle acceleration at traveling shock waves and compression regions. 3) To develop an advanced Geospace General Circulation Model (GGCM) capable of realistically modeling space weather events, in particular the interaction with CMEs and geomagnetic storms. Furthermore, by implementing scalable run-time supports and sophisticated off- and on-line prediction algorithms, we anticipate important advances in the development of automatic and intelligent system software to optimize a wide variety of 'embedded' computations on parallel computers. Finally, public domain MHD and hydrodynamic codes had a transforming effect on space and astrophysics. We expect that our new generation, open source, public domain multi-scale code will have a similar transformational effect in a variety of disciplines, opening up new classes of problems to physicists and engineers alike.

  17. On the Computational Capabilities of Physical Systems. Part 1; The Impossibility of Infallible Computation

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Koga, Dennis (Technical Monitor)

    2000-01-01

    In this first of two papers, strong limits on the accuracy of physical computation are established. First it is proven that there cannot be a physical computer C to which one can pose any and all computational tasks concerning the physical universe. Next it is proven that no physical computer C can correctly carry out any computational task in the subset of such tasks that can be posed to C. This result holds whether the computational tasks concern a system that is physically isolated from C, or instead concern a system that is coupled to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly 'processing information faster than the universe does'. The results also mean that there cannot exist an infallible, general-purpose observation apparatus, and that there cannot be an infallible, general-purpose control apparatus. These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - a definition of 'physical computation' - is needed to address the issues considered in these papers. While this definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. 
The second in this pair of papers presents a preliminary exploration of some of this mathematical structure, including in particular that of prediction complexity, which is a 'physical computation analogue' of algorithmic information complexity. It is proven in that second paper that either the Hamiltonian of our universe proscribes a certain type of computation, or prediction complexity is unique (unlike algorithmic information complexity), in that there is one and only one version of it that can be applicable throughout our universe.

  18. Physics-based subsurface visualization of human tissue.

    PubMed

    Sharp, Richard; Adams, Jacob; Machiraju, Raghu; Lee, Robert; Crane, Robert

    2007-01-01

    In this paper, we present a framework for simulating light transport in three-dimensional tissue with inhomogeneous scattering properties. Our approach employs a computational model to simulate light scattering in tissue through the finite element solution of the diffusion equation. Although our model handles both visible and nonvisible wavelengths, we especially focus on the interaction of near infrared (NIR) light with tissue. Since most human tissue is permeable to NIR light, tools to noninvasively image tumors, blood vasculature, and monitor blood oxygenation levels are being constructed. We apply this model to a numerical phantom to visually reproduce the images generated by these real-world tools. Therefore, in addition to enabling inverse design of detector instruments, our computational tools produce physically-accurate visualizations of subsurface structures.
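    The diffusion approximation that the framework solves by finite elements can be illustrated, in heavily simplified form, with a finite-difference solve of the steady-state diffusion equation in a 1-D slab. This sketch is not the authors' solver; the optical coefficients are typical NIR soft-tissue values, assumed for illustration:

    ```python
    import numpy as np

    def fluence_1d(n=101, length_cm=1.0, mu_a=0.1, mu_s_prime=10.0, src_w=1.0):
        """Steady-state diffusion approximation in a 1-D slab:
            -D * phi'' + mu_a * phi = S,   phi = 0 at both boundaries.
        Uniform finite-difference grid; a narrow source sits at mid-depth.
        Coefficients (1/cm) are illustrative soft-tissue NIR values."""
        D = 1.0 / (3.0 * (mu_a + mu_s_prime))   # diffusion coefficient
        h = length_cm / (n - 1)
        center = (n - 1) // 2
        A = np.zeros((n, n))
        S = np.zeros(n)
        for i in range(1, n - 1):
            A[i, i - 1] = A[i, i + 1] = -D / h**2
            A[i, i] = 2.0 * D / h**2 + mu_a
            S[i] = src_w if abs(i - center) <= 1 else 0.0
        A[0, 0] = A[-1, -1] = 1.0               # Dirichlet: phi = 0 at edges
        return np.linalg.solve(A, S)

    phi = fluence_1d()   # fluence peaks at the source and decays outward
    ```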

  19. Assessment of physical activity of the human body considering the thermodynamic system.

    PubMed

    Hochstein, Stefan; Rauschenberger, Philipp; Weigand, Bernhard; Siebert, Tobias; Schmitt, Syn; Schlicht, Wolfgang; Převorovská, Světlana; Maršík, František

    2016-01-01

    Correctly dosed physical activity is the basis of a vital and healthy life, but the measurement of physical activity remains rather empirical, resulting in limited individual and customized activity recommendations. Very accurate three-dimensional models of the cardiovascular system do exist; however, they require the numerical solution of the Navier-Stokes equations for the flow in blood vessels. These models are suitable for research on cardiac diseases, but are computationally very expensive. Direct measurements are expensive and often not applicable outside laboratories. This paper offers a new approach to assessing physical activity using thermodynamic systems and their leading quantity, entropy production, offering a compromise between computation time and precise prediction of pressure, volume, and flow variables in blood vessels. Based on a simplified (one-dimensional) model of the cardiovascular system of the human body, we develop and evaluate a setup calculating the entropy production of the heart to determine the intensity of human physical activity more precisely than previous parameters, e.g. the frequently used energy considerations. The knowledge resulting from precise real-time physical activity assessment provides the basis for an intelligent human-technology interaction, allowing the degree of physical activity to be steadily adjusted according to the actual individual performance level and thus improving training and activity recommendations.

  20. Microstructure Applications for Battery Design | Transportation Research |

    Science.gov Websites

    NREL's Computer-Aided Engineering for Electric Drive Vehicle Batteries (CAEBAT) work includes simulating physics at the electrode microstructure level and creating a virtual design tool for battery design.

  1. An Expert System Shell to Teach Problem Solving.

    ERIC Educational Resources Information Center

    Lippert, Renate C.

    1988-01-01

    Discusses the use of expert systems to teach problem-solving skills to students from grade 6 to college level. The role of computer technology in the future of education is considered, and the construction of knowledge bases is described, including an example for physics. (LRW)

  2. 2016 Final Reports from the Los Alamos National Laboratory Computational Physics Student Summer Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Runnels, Scott Robert; Bachrach, Harrison Ian; Carlson, Nils

    The two primary purposes of LANL’s Computational Physics Student Summer Workshop are (1) to educate graduate and exceptional undergraduate students in the challenges and applications of computational physics of interest to LANL, and (2) to entice their interest toward those challenges. Computational physics is emerging as a discipline in its own right, combining expertise in mathematics, physics, and computer science. The mathematical aspects focus on numerical methods for solving equations on the computer as well as developing test problems with analytical solutions. The physics aspects are very broad, ranging from low-temperature material modeling to extremely high temperature plasma physics, radiation transport and neutron transport. The computer science issues are concerned with matching numerical algorithms to emerging architectures and maintaining the quality of extremely large codes built to perform multi-physics calculations. Although graduate programs associated with computational physics are emerging, it is apparent that the pool of U.S. citizens in this multi-disciplinary field is relatively small and is typically not focused on the aspects that are of primary interest to LANL. Furthermore, more structured foundations for LANL interaction with universities in computational physics are needed; historically, interactions rely heavily on individuals’ personalities and personal contacts. Thus a tertiary purpose of the Summer Workshop is to build an educational network of LANL researchers, university professors, and emerging students to advance the field and LANL’s involvement in it.

  3. Gross anatomy of network security

    NASA Technical Reports Server (NTRS)

    Siu, Thomas J.

    2002-01-01

    Information security involves many branches of effort, including information assurance, host level security, physical security, and network security. Computer network security methods and implementations are given a top-down description to permit a medically focused audience to anchor this information to their daily practice. The depth of detail of network functionality and security measures, like that of the study of human anatomy, can be highly involved. Presented at the level of major gross anatomical systems, this paper will focus on network backbone implementation and perimeter defenses, then diagnostic tools, and finally the user practices (the human element). Physical security measures, though significant, have been defined as beyond the scope of this presentation.

  4. FY08 DRMRP Clinical Trial: Strengthening Pathways to PTSD Recovery Using Systems-Level Intervention

    DTIC Science & Technology

    2015-09-01

    telephone cognitive-behavioral therapy, continuous RN nurse care management, and computer-automated care management support. Both arms can refer patients... physically occurring at the study sites. These closure reports were approved by the local DDEAMC and lead WRNMMC IRBs in May 2015 and by HRPO in June... physical symptom burden (as measured by the PHQ-15), improved mental health functioning (as measured by the SF-12 mental component), but no changes for

  5. A Randomized Effectiveness Trial of a Systems-Level Approach to Stepped Care for War-Related PTSD

    DTIC Science & Technology

    2015-09-01

    behavioral therapy, continuous RN nurse care management, and computer-automated care management support. Both arms can refer patients for mental health... physically occurring at the study sites. These closure reports were approved by the local DDEAMC and lead WRNMMC IRBs in May 2015 and by HRPO in June... significantly associated with decreased physical symptom burden (as measured by the PHQ-15), improved mental health functioning (as measured by the

  6. Comparing levels of school performance to science teachers' reports on knowledge/skills, instructional use and student use of computers

    NASA Astrophysics Data System (ADS)

    Kerr, Rebecca

    The purpose of this descriptive quantitative and basic qualitative study was to examine fifth and eighth grade science teachers' responses, perceptions of the role of technology in the classroom, and how they felt that computer applications, tools, and the Internet influence student understanding. The purposeful sample included survey and interview responses from fifth grade and eighth grade general and physical science teachers. Even though they may not be generalizable to other teachers or classrooms due to a low response rate, findings from this study indicated teachers with fewer years of teaching science had a higher level of computer use but less computer access, especially for students, in the classroom. Furthermore, teachers' choice of professional development moderated the relationship between the level of school performance and teachers' knowledge/skills, with the most positive relationship being with workshops that occurred outside of the school. Eighteen interviews revealed that teachers perceived the role of technology in classroom instruction mainly as teacher-centered and supplemental, rather than student-centered activities.

  7. Physical Computing and Its Scope--Towards a Constructionist Computer Science Curriculum with Physical Computing

    ERIC Educational Resources Information Center

    Przybylla, Mareen; Romeike, Ralf

    2014-01-01

    Physical computing covers the design and realization of interactive objects and installations and allows students to develop concrete, tangible products of the real world, which arise from the learners' imagination. This can be used in computer science education to provide students with interesting and motivating access to the different topic…

  8. Resolving the Antarctic contribution to sea-level rise: a hierarchical modelling framework.

    PubMed

    Zammit-Mangion, Andrew; Rougier, Jonathan; Bamber, Jonathan; Schön, Nana

    2014-06-01

    Determining the Antarctic contribution to sea-level rise from observational data is a complex problem. The number of physical processes involved (such as ice dynamics and surface climate) exceeds the number of observables, some of which have very poor spatial definition. This has led, in general, to solutions that utilise strong prior assumptions or physically based deterministic models to simplify the problem. Here, we present a new approach for estimating the Antarctic contribution, which only incorporates descriptive aspects of the physically based models in the analysis and in a statistical manner. By combining physical insights with modern spatial statistical modelling techniques, we are able to provide probability distributions on all processes deemed to play a role in both the observed data and the contribution to sea-level rise. Specifically, we use stochastic partial differential equations and their relation to geostatistical fields to capture our physical understanding and employ a Gaussian Markov random field approach for efficient computation. The method, an instantiation of Bayesian hierarchical modelling, naturally incorporates uncertainty in order to reveal credible intervals on all estimated quantities. The estimated sea-level rise contribution using this approach corroborates those found using a statistically independent method. © 2013 The Authors. Environmetrics Published by John Wiley & Sons, Ltd.

  9. Resolving the Antarctic contribution to sea-level rise: a hierarchical modelling framework†

    PubMed Central

    Zammit-Mangion, Andrew; Rougier, Jonathan; Bamber, Jonathan; Schön, Nana

    2014-01-01

    Determining the Antarctic contribution to sea-level rise from observational data is a complex problem. The number of physical processes involved (such as ice dynamics and surface climate) exceeds the number of observables, some of which have very poor spatial definition. This has led, in general, to solutions that utilise strong prior assumptions or physically based deterministic models to simplify the problem. Here, we present a new approach for estimating the Antarctic contribution, which only incorporates descriptive aspects of the physically based models in the analysis and in a statistical manner. By combining physical insights with modern spatial statistical modelling techniques, we are able to provide probability distributions on all processes deemed to play a role in both the observed data and the contribution to sea-level rise. Specifically, we use stochastic partial differential equations and their relation to geostatistical fields to capture our physical understanding and employ a Gaussian Markov random field approach for efficient computation. The method, an instantiation of Bayesian hierarchical modelling, naturally incorporates uncertainty in order to reveal credible intervals on all estimated quantities. The estimated sea-level rise contribution using this approach corroborates those found using a statistically independent method. © 2013 The Authors. Environmetrics Published by John Wiley & Sons, Ltd. PMID:25505370

  10. Fast Particle Methods for Multiscale Phenomena Simulations

    NASA Technical Reports Server (NTRS)

    Koumoutsakos, P.; Wray, A.; Shariff, K.; Pohorille, Andrew

    2000-01-01

    We are developing particle methods aimed at improving computational modeling capabilities for multiscale physical phenomena in: (i) high Reynolds number unsteady vortical flows, (ii) particle-laden and interfacial flows, and (iii) molecular dynamics studies of nanoscale droplets and of the structure, functions, and evolution of the earliest living cell. The unifying computational approach involves particle methods implemented on parallel computer architectures. The inherent adaptivity, robustness, and efficiency of particle methods make them a multidisciplinary computational tool capable of bridging the gap between micro-scale and continuum flow simulations. Using efficient tree data structures, multipole expansion algorithms, and improved particle-grid interpolation, particle methods allow for simulations using millions of computational elements, making possible the resolution of a wide range of length and time scales of these important physical phenomena. The current challenges in these simulations are: [i] the proper formulation of particle methods at the molecular and continuum levels for the discretization of the governing equations; [ii] the resolution of the wide range of time and length scales governing the phenomena under investigation; [iii] the minimization of numerical artifacts that may interfere with the physics of the systems under consideration; and [iv] the parallelization of processes such as tree traversal and grid-particle interpolations. We are conducting simulations using vortex methods, molecular dynamics, and smoothed particle hydrodynamics, exploiting their unifying concepts, such as the solution of the N-body problem on parallel computers, highly accurate particle-particle and grid-particle interpolations, parallel FFTs, and the formulation of processes such as diffusion in the context of particle methods. This approach enables us to move across seemingly unrelated areas of research.
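    The N-body problem mentioned above is the canonical target of such methods; a hypothetical minimal sketch of the direct O(N^2) pairwise summation (the baseline that tree codes and multipole expansions accelerate to roughly O(N log N)) might look like this, here in one dimension with an assumed softening parameter:

```python
def pairwise_accelerations(positions, masses, g=1.0, eps=1e-3):
    """Direct-sum gravitational accelerations in 1D with a small
    softening eps to avoid the singularity at zero separation.
    Cost is O(N^2) in the number of particles."""
    n = len(positions)
    acc = [0.0] * n
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = positions[j] - positions[i]
            acc[i] += g * masses[j] * dx / (abs(dx) ** 3 + eps)
    return acc

# Two equal masses attract each other symmetrically.
acc = pairwise_accelerations([-1.0, 1.0], [1.0, 1.0])
```

    The symmetry of the result (equal and opposite accelerations) is exactly the pairwise structure that tree traversal exploits by grouping distant particles.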

  11. Fermilab | Science at Fermilab | Experiments & Projects | Intensity

    Science.gov Websites


  12. In Their Own Words: Dealing with Dyslexia | NIH MedlinePlus the Magazine

    MedlinePlus

    ... occurs in people of all backgrounds and intellectual levels. People with dyslexia can be very bright. They are often capable or even gifted in areas such as art, computer science, design, drama, electronics, math, mechanics, music, physics, sales, and sports. Some of ...

  13. Translation, Validation, and Reliability of the Dutch Late-Life Function and Disability Instrument Computer Adaptive Test.

    PubMed

    Arensman, Remco M; Pisters, Martijn F; de Man-van Ginkel, Janneke M; Schuurmans, Marieke J; Jette, Alan M; de Bie, Rob A

    2016-09-01

    Adequate and user-friendly instruments for assessing physical function and disability in older adults are vital for estimating and predicting health care needs in clinical practice. The Late-Life Function and Disability Instrument Computer Adaptive Test (LLFDI-CAT) is a promising instrument for assessing physical function and disability in gerontology research and clinical practice. The aims of this study were: (1) to translate the LLFDI-CAT to the Dutch language and (2) to investigate its validity and reliability in a sample of older adults who spoke Dutch and dwelled in the community. For the assessment of validity of the LLFDI-CAT, a cross-sectional design was used. To assess reliability, measurement of the LLFDI-CAT was repeated in the same sample. The item bank of the LLFDI-CAT was translated with a forward-backward procedure. A sample of 54 older adults completed the LLFDI-CAT, World Health Organization Disability Assessment Schedule 2.0, RAND 36-Item Short-Form Health Survey physical functioning scale (10 items), and 10-Meter Walk Test. The LLFDI-CAT was repeated in 2 to 8 days (mean=4.5 days). Pearson's r and the intraclass correlation coefficient (ICC) (2,1) were calculated to assess validity, group-level reliability, and participant-level reliability. A correlation of .74 for the LLFDI-CAT function scale and the RAND 36-Item Short-Form Health Survey physical functioning scale (10 items) was found. The correlations of the LLFDI-CAT disability scale with the World Health Organization Disability Assessment Schedule 2.0 and the 10-Meter Walk Test were -.57 and -.53, respectively. The ICC (2,1) of the LLFDI-CAT function scale was .84, with a group-level reliability score of .85. The ICC (2,1) of the LLFDI-CAT disability scale was .76, with a group-level reliability score of .81. The high percentage of women in the study and the exclusion of older adults with recent joint replacement or hospitalization limit the generalizability of the results. 
The Dutch LLFDI-CAT showed strong validity and high reliability when used to assess physical function and disability in older adults dwelling in the community. © 2016 American Physical Therapy Association.
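    The validity figures above rest on standard correlation statistics; as a sketch of the simplest of them, Pearson's r for two paired samples can be computed directly (the sample values in the test are invented for illustration, not study data):

```python
def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length
    paired samples; returns a value in [-1, 1]."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)
```

    The ICC(2,1) reported for reliability is computed analogously from a two-way ANOVA decomposition and, unlike Pearson's r, also penalizes systematic differences between the two measurement occasions.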

  14. Development of a SaaS application probe to the physical properties of the Earth's interior: An attempt at moving HPC to the cloud

    NASA Astrophysics Data System (ADS)

    Huang, Qian

    2014-09-01

    Scientific computing often requires the availability of a massive number of computers for performing large-scale simulations, and computing in mineral physics is no exception. In order to investigate the physical properties of minerals at extreme conditions in computational mineral physics, parallel computing technology is used to speed up performance by utilizing multiple computer resources to process a computational task simultaneously, thereby greatly reducing computation time. Traditionally, parallel computing has been addressed with High Performance Computing (HPC) solutions and installed facilities such as clusters and supercomputers. Today, there is tremendous growth in cloud computing. Infrastructure as a Service (IaaS), the on-demand and pay-as-you-go model, creates a flexible and cost-effective means to access computing resources. In this paper, a feasibility report of HPC on a cloud infrastructure is presented. It is found that current cloud services in the IaaS layer still need to improve performance to be useful to research projects. On the other hand, Software as a Service (SaaS), another type of cloud computing, is introduced into an HPC system for computing in mineral physics, and such an application is developed. In this paper, an overall description of this SaaS application is presented. This contribution can promote cloud application development in computational mineral physics and cross-disciplinary studies.

  15. On The Computational Capabilities of Physical Systems. Part 2; Relationship With Conventional Computer Science

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Koga, Dennis (Technical Monitor)

    2000-01-01

    In the first of this pair of papers, it was proven that there cannot be a physical computer to which one can properly pose any and all computational tasks concerning the physical universe. It was then further proven that no physical computer C can correctly carry out all computational tasks that can be posed to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly "processing information faster than the universe does". These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - "physical computation" - is needed to address the issues considered in these papers, which concern real physical computers. While this novel definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. This second paper of the pair presents a preliminary exploration of some of this mathematical structure. Analogues of Chomskian results concerning universal Turing Machines and the Halting theorem are derived, as are results concerning the (im)possibility of certain kinds of error-correcting codes. In addition, an analogue of algorithmic information complexity, "prediction complexity", is elaborated. 
A task-independent bound is derived on how much the prediction complexity of a computational task can differ for two different reference universal physical computers used to solve that task, a bound similar to the "encoding" bound governing how much the algorithmic information complexity of a Turing machine calculation can differ for two reference universal Turing machines. Finally, it is proven that either the Hamiltonian of our universe proscribes a certain type of computation, or prediction complexity is unique (unlike algorithmic information complexity), in that there is one and only one version of it that can be applicable throughout our universe.

  16. [Physical activity, screen time, and use of medicines among adolescents: the 1993 Pelotas (Brazil) birth cohort study].

    PubMed

    Bergmann, Gabriel Gustavo; Bertoldi, Andréa Dâmaso; Mielke, Grégore Iven; Camargo, Aline Lins; Matijasevich, Alicia; Hallal, Pedro Curi

    2016-01-01

    This study aimed to evaluate cross-sectional and longitudinal associations between physical activity, screen time, and use of medicines among adolescents from the 1993 Pelotas (Brazil) birth cohort study, followed at 11 (N = 4,452), 15 (N = 4,325), and 18 years of age (N = 4,106). The study recorded the use of medicines in the previous 15 days, continuous use of some medication, level of physical activity (by questionnaire and accelerometry), and screen time (TV, computer, and videogame). One-third of adolescents had used at least one medicine in the previous 15 days and approximately 10% were on some continuous medication. In the adjusted analysis, the results showed that higher levels of physical activity at 18 years and less screen time at 15 years in boys were associated with lower overall use of medicines (p < 0.05). For boys, physical activity at 11 and 18 years was inversely related to continuous medication (p < 0.05). More physically active boys and those with less screen time in adolescence showed lower use of medicines at 18 years of age.

  17. EFTofPNG: a package for high precision computation with the effective field theory of post-Newtonian gravity

    NASA Astrophysics Data System (ADS)

    Levi, Michele; Steinhoff, Jan

    2017-12-01

    We present a novel public package ‘EFTofPNG’ for high-precision computation in the effective field theory of post-Newtonian (PN) gravity, including spins. We created this package in view of the timely need to publicly share automated computation tools, which integrate the various types of physics manifested in the expected increasing influx of gravitational wave (GW) data. Hence, we created a free and open-source package, which is self-contained, modular, all-inclusive, and accessible to the classical gravity community. The ‘EFTofPNG’ Mathematica package also uses the power of the ‘xTensor’ package, suited for complicated tensor computation, where our coding also strategically approaches the generic generation of Feynman contractions, which is universal to all perturbation theories in physics, by efficiently treating n-point functions as tensors of rank n. The package currently contains four independent units, which serve as subsidiaries to the main one. Its final unit serves as a pipeline chain for obtaining the final GW templates, and provides the full computation of derivatives and physical observables of interest. The upcoming ‘EFTofPNG’ package version 1.0 should cover the point mass sector, and all the spin sectors, up to the fourth PN order, and the two-loop level. We expect and strongly encourage public development of the package to improve its efficiency, and to extend it to further PN sectors, and observables useful for the waveform modelling.

  18. DIRAC in Large Particle Physics Experiments

    NASA Astrophysics Data System (ADS)

    Stagni, F.; Tsaregorodtsev, A.; Arrabito, L.; Sailer, A.; Hara, T.; Zhang, X.; Consortium, DIRAC

    2017-10-01

    The DIRAC project is developing interware to build and operate distributed computing systems. It provides a development framework and a rich set of services for both Workload and Data Management tasks of large scientific communities. A number of High Energy Physics and Astrophysics collaborations have adopted DIRAC as the base for their computing models. DIRAC was initially developed for the LHCb experiment at LHC, CERN. Later, the Belle II, BES III and CTA experiments as well as the linear collider detector collaborations started using DIRAC for their computing systems. Some of the experiments built their DIRAC-based systems from scratch, others migrated from previous solutions, ad-hoc or based on different middlewares. Adaptation of DIRAC for a particular experiment was enabled through the creation of extensions to meet their specific requirements. Each experiment has a heterogeneous set of computing and storage resources at their disposal that were aggregated through DIRAC into a coherent pool. Users from different experiments can interact with the system in different ways depending on their specific tasks, expertise level and previous experience using command line tools, python APIs or Web Portals. In this contribution we will summarize the experience of using DIRAC in particle physics collaborations. The problems of migration to DIRAC from previous systems and their solutions will be presented. An overview of specific DIRAC extensions will be given. We hope that this review will be useful for experiments considering an update, or for those designing their computing models.

  19. A physics-motivated Centroidal Voronoi Particle domain decomposition method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, Lin, E-mail: lin.fu@tum.de; Hu, Xiangyu Y., E-mail: xiangyu.hu@tum.de; Adams, Nikolaus A., E-mail: nikolaus.adams@tum.de

    2017-04-15

    In this paper, we propose a novel domain decomposition method for large-scale simulations in continuum mechanics by merging the concepts of Centroidal Voronoi Tessellation (CVT) and Voronoi Particle dynamics (VP). The CVT is introduced to achieve a high-level compactness of the partitioning subdomains by the Lloyd algorithm which monotonically decreases the CVT energy. The number of computational elements between neighboring partitioning subdomains, which scales the communication effort for parallel simulations, is optimized implicitly as the generated partitioning subdomains are convex and simply connected with small aspect-ratios. Moreover, Voronoi Particle dynamics employing physical analogy with a tailored equation of state is developed, which relaxes the particle system towards the target partition with good load balance. Since the equilibrium is computed by an iterative approach, the partitioning subdomains exhibit locality and the incremental property. Numerical experiments reveal that the proposed Centroidal Voronoi Particle (CVP) based algorithm produces high-quality partitioning with high efficiency, independently of computational-element types. Thus it can be used for a wide range of applications in computational science and engineering.

  20. A physics-motivated Centroidal Voronoi Particle domain decomposition method

    NASA Astrophysics Data System (ADS)

    Fu, Lin; Hu, Xiangyu Y.; Adams, Nikolaus A.

    2017-04-01

    In this paper, we propose a novel domain decomposition method for large-scale simulations in continuum mechanics by merging the concepts of Centroidal Voronoi Tessellation (CVT) and Voronoi Particle dynamics (VP). The CVT is introduced to achieve a high-level compactness of the partitioning subdomains by the Lloyd algorithm which monotonically decreases the CVT energy. The number of computational elements between neighboring partitioning subdomains, which scales the communication effort for parallel simulations, is optimized implicitly as the generated partitioning subdomains are convex and simply connected with small aspect-ratios. Moreover, Voronoi Particle dynamics employing physical analogy with a tailored equation of state is developed, which relaxes the particle system towards the target partition with good load balance. Since the equilibrium is computed by an iterative approach, the partitioning subdomains exhibit locality and the incremental property. Numerical experiments reveal that the proposed Centroidal Voronoi Particle (CVP) based algorithm produces high-quality partitioning with high efficiency, independently of computational-element types. Thus it can be used for a wide range of applications in computational science and engineering.
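    The Lloyd algorithm at the heart of the CVT step above is easy to sketch; a hypothetical one-dimensional illustration (not the authors' implementation, which operates on partitioning subdomains in parallel) is:

```python
def lloyd_step(generators, samples):
    """One Lloyd iteration: assign each sample to its nearest generator,
    then move each generator to the centroid (mean) of its cluster."""
    clusters = [[] for _ in generators]
    for s in samples:
        i = min(range(len(generators)), key=lambda j: abs(s - generators[j]))
        clusters[i].append(s)
    return [sum(c) / len(c) if c else g for g, c in zip(generators, clusters)]

def cvt_energy(generators, samples):
    """CVT energy: total squared distance to the nearest generator."""
    return sum(min((s - g) ** 2 for g in generators) for s in samples)

samples = [i / 99.0 for i in range(100)]   # uniform points on [0, 1]
gens = [0.10, 0.15]                        # poor initial generators
for _ in range(20):
    gens = lloyd_step(gens, samples)       # energy never increases
```

    Each iteration is guaranteed not to increase the energy, which is the monotone-decrease property the abstract relies on; the generators spread out toward an even partition of the samples.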

  1. 2015 Final Reports from the Los Alamos National Laboratory Computational Physics Student Summer Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Runnels, Scott Robert; Caldwell, Wendy; Brown, Barton Jed

    The two primary purposes of LANL’s Computational Physics Student Summer Workshop are (1) to educate graduate and exceptional undergraduate students in the challenges and applications of computational physics of interest to LANL, and (2) to entice their interest toward those challenges. Computational physics is emerging as a discipline in its own right, combining expertise in mathematics, physics, and computer science. The mathematical aspects focus on numerical methods for solving equations on the computer as well as developing test problems with analytical solutions. The physics aspects are very broad, ranging from low-temperature material modeling to extremely high temperature plasma physics, radiation transport and neutron transport. The computer science issues are concerned with matching numerical algorithms to emerging architectures and maintaining the quality of extremely large codes built to perform multi-physics calculations. Although graduate programs associated with computational physics are emerging, it is apparent that the pool of U.S. citizens in this multi-disciplinary field is relatively small and is typically not focused on the aspects that are of primary interest to LANL. Furthermore, more structured foundations for LANL interaction with universities in computational physics are needed; historically, interactions rely heavily on individuals’ personalities and personal contacts. Thus a tertiary purpose of the Summer Workshop is to build an educational network of LANL researchers, university professors, and emerging students to advance the field and LANL’s involvement in it. This report includes both the background for the program and the reports from the students.

  2. Opportunities for Computational Discovery in Basic Energy Sciences

    NASA Astrophysics Data System (ADS)

    Pederson, Mark

    2011-03-01

    An overview of the broad-ranging support of computational physics and computational science within the Department of Energy Office of Science will be provided. Computation as the third branch of physics is supported by all six offices (Advanced Scientific Computing, Basic Energy, Biological and Environmental, Fusion Energy, High-Energy Physics, and Nuclear Physics). Support focuses on hardware, software, and applications. Most opportunities within the fields of condensed-matter physics, chemical physics, and materials sciences are supported by the Office of Basic Energy Sciences (BES) or through partnerships between BES and the Office for Advanced Scientific Computing. Activities include radiation sciences, catalysis, combustion, materials in extreme environments, energy-storage materials, light harvesting and photovoltaics, solid-state lighting, and superconductivity. A summary of two recent reports by the computational materials and chemical communities on the role of computation during the next decade will be provided. In addition to materials and chemistry challenges specific to energy sciences, identified issues include a focus on the role of the domain scientist in integrating, expanding, and sustaining applications-oriented capabilities on evolving high-performance computing platforms, and on the role of computation in accelerating the development of innovative technologies.

  3. Quantum Accelerators for High-performance Computing Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S.; Britt, Keith A.; Mohiyaddin, Fahd A.

    We define some of the programming and system-level challenges facing the application of quantum processing to high-performance computing. Alongside barriers to physical integration, prominent differences in the execution of quantum and conventional programs challenge the intersection of these computational models. Following a brief overview of the state of the art, we discuss recent advances in programming and execution models for hybrid quantum-classical computing. We discuss a novel quantum-accelerator framework that uses specialized kernels to offload select workloads while integrating with existing computing infrastructure. We elaborate on the role of the host operating system in managing these unique accelerator resources, the prospects for deploying quantum modules, and the requirements placed on the language hierarchy connecting these different system components. We draw on recent advances in the modeling and simulation of quantum computing systems with the development of architectures for hybrid high-performance computing systems and the realization of software stacks for controlling quantum devices. Finally, we present simulation results that describe the expected system-level behavior of high-performance computing systems composed of compute nodes with quantum processing units. We describe performance for these hybrid systems in terms of time-to-solution, accuracy, and energy consumption, and we use simple application examples to estimate the performance advantage of quantum acceleration.

  4. Toward using games to teach fundamental computer science concepts

    NASA Astrophysics Data System (ADS)

    Edgington, Jeffrey Michael

    Video and computer games have become an important area of study in the field of education. Games have been designed to teach mathematics, physics, history and geography, to raise social awareness, and to train soldiers in the military. Recent work has created computer games for teaching computer programming and understanding basic algorithms. We present an investigation where computer games are used to teach two fundamental computer science concepts: boolean expressions and recursion. The games are intended to teach the concepts, not how to implement them in a programming language. For this investigation, two computer games were created: one designed to teach basic boolean expressions and operators, and the other to teach fundamental concepts of recursion. We describe the design and implementation of both games. We evaluate the effectiveness of these games using before-and-after surveys, designed to ascertain basic understanding, attitudes and beliefs regarding the concepts. The boolean game was evaluated with local high school students and students in a college-level introductory computer science course. The recursion game was evaluated with students in a college-level introductory computer science course. We present the analysis of the collected survey information for both games. This analysis shows a significant positive change in student attitude towards recursion and modest gains in student learning outcomes for both topics.
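    The two concepts the games target combine naturally in a few lines: a recursive evaluator of nested boolean expressions. This is a generic textbook sketch, not code from the games themselves.

```python
def evaluate(expr):
    """Recursively evaluate a nested boolean expression such as
    ("and", True, ("or", False, True))."""
    if isinstance(expr, bool):    # base case: a bare literal
        return expr
    op, left, right = expr        # recursive case: an operator node
    if op == "and":
        return evaluate(left) and evaluate(right)
    if op == "or":
        return evaluate(left) or evaluate(right)
    raise ValueError(f"unknown operator: {op}")
```

    The base case / recursive case split is exactly the distinction the recursion game tries to convey, and the operator branch exercises the boolean operators the other game targets.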

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arimura, Hidetaka, E-mail: arimurah@med.kyushu-u.ac.jp; Kamezawa, Hidemi; Jin, Ze

    Computational image analysis, established on the basis of applied mathematics, physics, and engineering, has developed a close relationship with radiological physics that has increased the accuracy of medical diagnostic imaging and radiation therapy. This review paper introduces how computational image analysis is useful in radiation therapy with respect to radiological physics.

  6. The role of computational physics in the liberal arts curriculum

    NASA Astrophysics Data System (ADS)

    Dominguez, Rachele; Huff, Benjamin

    2015-09-01

    The role of computational physics education varies dramatically from department to department. We will discuss a new computational physics course at Randolph-Macon College and our attempt to identify where it fits (or should fit) into the larger liberal arts curriculum and why. In doing so, we will describe the goals of the course, and how the liberal arts curriculum conditions the exploration of computational physics.

  7. PREFACE: 3rd International Conference on Mathematical Modeling in Physical Sciences (IC-MSQUARE 2014)

    NASA Astrophysics Data System (ADS)

    2015-01-01

    The third International Conference on Mathematical Modeling in Physical Sciences (IC-MSQUARE) took place in Madrid, Spain, from Thursday 28 to Sunday 31 August 2014. The Conference was attended by more than 200 participants and hosted about 350 oral, poster, and virtual presentations; more than 600 pre-registered authors were also counted. The third IC-MSQUARE consisted of diverse workshops and thus covered various research fields where Mathematical Modeling is used, such as Theoretical/Mathematical Physics, Neutrino Physics, Non-Integrable Systems, Dynamical Systems, Computational Nanoscience, Biological Physics, Computational Biomechanics, Complex Networks, Stochastic Modeling, Fractional Statistics, DNA Dynamics, and Macroeconomics. The scientific program was rather full: after the Keynote and Invited Talks in the morning, three parallel oral sessions and one poster session ran every day. According to the attendees, however, the program was excellent, with a high level of talks, and the scientific environment was fruitful, so all attendees had a creative time. We would like to thank the Keynote Speaker and the Invited Speakers for their significant contributions to IC-MSQUARE. We would also like to thank the Members of the International Advisory and Scientific Committees as well as the Members of the Organizing Committee.

  8. Body image dissatisfaction, physical activity and screen-time in Spanish adolescents.

    PubMed

    Añez, Elizabeth; Fornieles-Deu, Albert; Fauquet-Ars, Jordi; López-Guimerà, Gemma; Puntí-Vidal, Joaquim; Sánchez-Carracedo, David

    2018-01-01

    This cross-sectional study contributes to the literature on whether body dissatisfaction is a barrier or facilitator to engaging in physical activity, and investigates the impact of mass-media messages, via computer time, on body dissatisfaction. High-school students (N = 1501) reported their physical activity, computer time (homework/leisure) and body dissatisfaction. Researchers measured students' weight and height. Analyses revealed that body dissatisfaction was negatively associated with physical activity for both genders, whereas computer time was associated only with girls' body dissatisfaction. Specifically, as computer homework time increased, body dissatisfaction decreased; as computer leisure time increased, body dissatisfaction increased. Weight-related interventions should improve body image and physical activity simultaneously, while interventions on critical consumption of mass media should include a computer component.

  9. Introduction to Computational Physics for Undergraduates

    NASA Astrophysics Data System (ADS)

    Zubairi, Omair; Weber, Fridolin

    2018-03-01

    This is an introductory textbook on computational methods and techniques intended for undergraduates at the sophomore or junior level in the fields of science, mathematics, and engineering. It provides an introduction to programming languages such as FORTRAN 90/95/2000 and covers numerical techniques such as differentiation, integration, root finding, and data fitting. The textbook also covers the use of the Linux/Unix operating system and other relevant software such as plotting programs, text editors, and markup languages such as LaTeX. It includes multiple homework assignments.
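    Two of the numerical techniques the textbook lists, root finding and integration, look like this in outline. This is a minimal Python sketch of the standard methods, not the textbook's FORTRAN material.

```python
def bisect(f, a, b, tol=1e-10):
    """Bisection root finding: f(a) and f(b) must bracket a sign change."""
    fa = f(a)
    while b - a > tol:
        m = 0.5 * (a + b)
        if fa * f(m) <= 0:          # root lies in the left half
            b = m
        else:                       # root lies in the right half
            a, fa = m, f(m)
    return 0.5 * (a + b)

def trapezoid(f, a, b, n=1000):
    """Composite trapezoidal rule on n uniform subintervals."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return s * h

root = bisect(lambda x: x * x - 2.0, 0.0, 2.0)   # approximates sqrt(2)
area = trapezoid(lambda x: x * x, 0.0, 1.0)      # approximates 1/3
```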

  10. Introducing Seismic Tomography with Computational Modeling

    NASA Astrophysics Data System (ADS)

    Neves, R.; Neves, M. L.; Teodoro, V.

    2011-12-01

    Learning the principles and techniques of seismic tomography involves advanced physical and computational knowledge. In-depth learning of such computational skills is a difficult cognitive process that requires a strong background in physics, mathematics and computer programming. The corresponding learning environments and pedagogic methodologies should therefore involve sets of computational modelling activities with computer software systems which allow students to improve their mathematical or programming knowledge and simultaneously focus on learning seismic wave propagation and inverse theory. To reduce the level of cognitive opacity associated with mathematical or programming knowledge, several computer modelling systems have already been developed (Neves & Teodoro, 2010). Among such systems, Modellus is particularly well suited to achieve this goal because it is a domain-general environment for explorative and expressive modelling with the following main advantages: 1) easy and intuitive creation of mathematical models using just standard mathematical notation; 2) simultaneous exploration of images, tables, graphs and object animations; 3) attribution of mathematical properties expressed in the models to animated objects; and finally 4) computation and display of mathematical quantities obtained from the analysis of images and graphs. Here we describe virtual simulations and educational exercises which give students an easy grasp of the fundamentals of seismic tomography. The simulations make the lecture more interactive and allow students to overcome their lack of advanced mathematical or programming knowledge and focus on learning seismological concepts and processes, taking advantage of basic scientific computation methods and tools.
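    The inverse-theory core of travel-time tomography reduces to a linear system t = A s, where A holds ray path lengths through each cell and s the cell slownesses. A deliberately tiny two-cell sketch (geometry and values invented for illustration):

```python
import numpy as np

# Each row is one ray; each column is the path length through one cell.
A = np.array([[1.0, 1.0],    # ray crossing both cells
              [1.0, 0.0],    # ray through cell 0 only
              [0.0, 1.0]])   # ray through cell 1 only
true_slowness = np.array([0.5, 0.25])   # slowness (s/km) per cell
t = A @ true_slowness                   # synthetic observed travel times

# Inversion: recover the slowness model from the travel times.
s_est, *_ = np.linalg.lstsq(A, t, rcond=None)
```

    Real tomography adds noise, regularization, and far larger (and rank-deficient) systems, but the forward-model/least-squares structure is the same.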

  11. Recent advances in QM/MM free energy calculations using reference potentials.

    PubMed

    Duarte, Fernanda; Amrein, Beat A; Blaha-Nelson, David; Kamerlin, Shina C L

    2015-05-01

    Recent years have seen enormous progress in the development of methods for modeling (bio)molecular systems. This has allowed for the simulation of ever larger and more complex systems. However, as such complexity increases, the requirements needed for these models to be accurate and physically meaningful become more and more difficult to fulfill. The use of simplified models to describe complex biological systems has long been shown to be an effective way to overcome, in a rational way, some of the limitations associated with this computational cost. Hybrid QM/MM approaches have rapidly become one of the most popular computational tools for studying chemical reactivity in biomolecular systems. However, the high cost involved in performing high-level QM calculations has limited the applicability of these approaches when calculating free energies of chemical processes. In this review, we present some of the advances in using reference potentials and mean field approximations to accelerate high-level QM/MM calculations. We present illustrative applications of these approaches and discuss challenges and future perspectives for the field. The use of physically based simplifications has been shown to effectively reduce the cost of high-level QM/MM calculations. In particular, lower-level reference potentials enable one to reduce the cost of expensive free energy calculations, thus expanding the scope of problems that can be addressed. As was already demonstrated 40 years ago, the use of simplified models still allows one to obtain cutting-edge results with substantially reduced computational cost. This article is part of a Special Issue entitled Recent developments of molecular dynamics. Copyright © 2014. Published by Elsevier B.V.
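    The reference-potential idea rests on one-sided exponential averaging (the Zwanzig relation): the free-energy correction from a cheap reference potential to the expensive target potential is estimated from energy gaps evaluated on configurations sampled with the reference. A minimal sketch (function name and reduced units are illustrative):

```python
import math

def fep_correction(gaps, beta=1.0):
    """Zwanzig one-sided free-energy perturbation estimate.

    gaps: energy differences dE(x) = E_target(x) - E_ref(x) evaluated on
    configurations x drawn from the reference ensemble.
    Returns dA = -(1/beta) * ln < exp(-beta * dE) >_ref.
    """
    avg = sum(math.exp(-beta * d) for d in gaps) / len(gaps)
    return -math.log(avg) / beta
```

    When the gap is a constant c, the estimate is exactly c; in practice convergence hinges on how well the reference overlaps the target ensemble, which is why good low-level reference potentials matter.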

  12. Software For Design Of Life-Support Systems

    NASA Technical Reports Server (NTRS)

    Rudokas, Mary R.; Cantwell, Elizabeth R.; Robinson, Peter I.; Shenk, Timothy W.

    1991-01-01

    Design Assistant Workstation (DAWN) computer program is prototype of expert software system for analysis and design of regenerative, physical/chemical life-support systems that revitalize air, reclaim water, produce food, and treat waste. Incorporates both conventional software for quantitative mathematical modeling of physical, chemical, and biological processes and expert system offering user stored knowledge about materials and processes. Constructs task tree as it leads user through simulated process, offers alternatives, and indicates where alternative not feasible. Also enables user to jump from one design level to another.

  13. Technical Drafting and Mental Visualization in Interior Architecture Education

    ERIC Educational Resources Information Center

    Arslan, Ali Riza; Dazkir, Sibel Seda

    2017-01-01

    We explored how beginning-level interior architecture students develop skills to create mental visualizations of three-dimensional objects and environments, how they develop their technical drawing skills, and whether or not physical and computer generated models aid this design process. We used interviews and observations to collect data. The…

  14. FPGA-based distributed computing microarchitecture for complex physical dynamics investigation.

    PubMed

    Borgese, Gianluca; Pace, Calogero; Pantano, Pietro; Bilotta, Eleonora

    2013-09-01

    In this paper, we present a distributed computing system, called DCMARK, aimed at solving the partial differential equations at the basis of many investigation fields, such as solid state physics, nuclear physics, and plasma physics. This distributed architecture is based on the cellular neural network (CNN) paradigm, which allows us to divide the solving of the differential equation system into many parallel integration operations to be executed by a custom multiprocessor system. We push the number of processors to the limit of one processor for each equation. In order to test this idea, we implement DCMARK on a single FPGA, designing the single processor so as to minimize its hardware requirements and to obtain a large number of easily interconnected processors. This approach is particularly suited to studying the properties of 1-, 2- and 3-D locally interconnected dynamical systems. In order to test the computing platform, we implement a 200-cell Korteweg-de Vries (KdV) equation solver and perform a comparison between simulations conducted on a high performance PC and on our system. Since our distributed architecture takes a constant computing time to solve the equation system, independently of the number of dynamical elements (cells) of the CNN array, it reduces the computation time more than other similar systems in the literature. To ensure a high level of reconfigurability, we design a compact system on programmable chip managed by a softcore processor, which controls the fast data/control communication between our system and a PC host. An intuitive graphical user interface allows us to change the calculation parameters and plot the results.
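    The KdV benchmark can be reproduced in outline with periodic centered differences for u_t + 6 u u_x + u_xxx = 0. The grid size, time step, and soliton initial condition below are illustrative choices, not the paper's 200-cell FPGA setup; on a periodic grid this discretization conserves total mass exactly, which makes a handy sanity check.

```python
import numpy as np

def kdv_step(u, dx, dt):
    """One explicit Euler step of u_t + 6 u u_x + u_xxx = 0, periodic grid."""
    ux = (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)
    uxxx = (np.roll(u, -2) - 2.0 * np.roll(u, -1)
            + 2.0 * np.roll(u, 1) - np.roll(u, 2)) / (2.0 * dx ** 3)
    return u - dt * (6.0 * u * ux + uxxx)

x = np.linspace(0.0, 20.0, 128, endpoint=False)
dx = x[1] - x[0]
c = 1.0                                   # soliton speed
u = 0.5 * c / np.cosh(0.5 * np.sqrt(c) * (x - 5.0)) ** 2
mass0 = u.sum() * dx                      # conserved quantity
for _ in range(200):
    u = kdv_step(u, dx, dt=1e-4)
```

    A production solver would use a leapfrog (Zabusky-Kruskal) or spectral scheme for accuracy; the point here is the locally coupled update, which is what maps each cell onto one CNN processor.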

  15. PREFACE: New trends in Computer Simulations in Physics and not only in physics

    NASA Astrophysics Data System (ADS)

    Shchur, Lev N.; Krashakov, Serge A.

    2016-02-01

    In this volume we have collected papers based on the presentations given at the International Conference on Computer Simulations in Physics and beyond (CSP2015), held in Moscow, September 6-10, 2015. We hope that this volume will be helpful and scientifically interesting for readers. The Conference was organized for the first time through the common efforts of the Moscow Institute for Electronics and Mathematics (MIEM) of the National Research University Higher School of Economics, the Landau Institute for Theoretical Physics, and the Science Center in Chernogolovka. The name of the Conference emphasizes the multidisciplinary nature of computational physics, whose methods are applied to a broad range of current research in science and society. The choice of venue was motivated by the multidisciplinary character of the MIEM, a former independent university which has recently become part of the National Research University Higher School of Economics. The Conference Computer Simulations in Physics and beyond (CSP) is planned to be organized biennially. This year's Conference featured 99 presentations, including 21 plenary and invited talks ranging from the analysis of Irish myths with recent methods of statistical physics to computing with the novel quantum computers D-Wave and D-Wave2. This volume covers various areas of computational physics and emerging subjects within the computational physics community. Each section was preceded by invited talks presenting the latest algorithms and methods in computational physics, as well as new scientific results. Both parallel and poster sessions paid special attention to numerical methods, applications and results. For all the abstracts presented at the conference please follow the link http://csp2015.ac.ru/files/book5x.pdf

  16. Nonlinear dynamics as an engine of computation.

    PubMed

    Kia, Behnam; Lindner, John F; Ditto, William L

    2017-03-06

    Control of chaos teaches that control theory can tame the complex, random-like behaviour of chaotic systems. This alliance between control methods and physics (cybernetical physics) opens the door to many applications, including dynamics-based computing. In this article, we introduce nonlinear dynamics and its rich, sometimes chaotic behaviour as an engine of computation. We review our work that has demonstrated how to compute using nonlinear dynamics. Furthermore, we investigate the interrelationship between invariant measures of a dynamical system and its computing power to strengthen the bridge between physics and computation. This article is part of the themed issue 'Horizons of cybernetical physics'. © 2017 The Author(s).
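    A flavor of dynamics-based computing: encode logic inputs as perturbations of a nonlinear map's initial condition, iterate, and threshold the output. The operating point below is a hypothetical choice that happens to realize NOR with the logistic map; it is an illustration of the threshold idea, not the paper's construction.

```python
def logistic(x):
    """The logistic map f(x) = 4x(1 - x), chaotic at this parameter."""
    return 4.0 * x * (1.0 - x)

# Hypothetical operating point (illustrative constants, not from the paper):
# base state, per-input perturbation, and output threshold.
X_STAR, DELTA, THRESHOLD = 0.5, 0.25, 0.9

def nor_gate(a, b):
    """Encode inputs a, b in {0, 1} as shifts of the initial condition,
    iterate the map once, and threshold to get the logic output."""
    x0 = X_STAR + (a + b) * DELTA
    return int(logistic(x0) >= THRESHOLD)
```

    With these constants, f(0.5) = 1, f(0.75) = 0.75, and f(1.0) = 0, so only the (0, 0) input clears the 0.9 threshold, which is exactly the NOR truth table. Changing the operating point reconfigures the same dynamics into a different gate, which is the appeal of the approach.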

  17. The effectiveness of interactive computer simulations on college engineering student conceptual understanding and problem-solving ability related to circular motion

    NASA Astrophysics Data System (ADS)

    Chien, Cheng-Chih

    In the past thirty years, the effectiveness of computer-assisted learning has been found to vary across individual studies. Today, after drastic technical improvement, computers are widespread in schools and used in a variety of ways. In this study, a design model involving educational technology, pedagogy, and content domain is proposed for the effective use of computers in learning. Computer simulation, constructivist and Vygotskian perspectives, and circular motion are the three elements of the specific Chain Model for instructional design. The goal of the physics course is to help students discard ideas that are not consistent with the physics community and rebuild new knowledge. To achieve the learning goal, the strategies of using conceptual conflicts and of using language to internalize specific tasks into mental functions were included. Computer simulations and accompanying worksheets were used to help students explore their own ideas and to generate questions for discussion. Using animated images to describe the dynamic processes involved in circular motion may reduce the complexity and possible miscommunication resulting from verbal explanations. The effectiveness of the instructional material on student learning was evaluated. The results of problem-solving activities show that students using computer simulations had significantly higher scores than students not using computer simulations. For conceptual understanding, students in the non-simulation group had a significantly higher pretest score than students in the simulation group; no significant difference was observed between the two groups on the posttest. The relations of gender, prior physics experience, and frequency of computer use outside the course to student achievement were also studied. There were fewer female students than male students and fewer students using computer simulations than students not using them.
These characteristics affect the statistical power for detecting differences. For future research, more simulation interventions may be introduced to explore the potential of computer simulation in helping students learn. A test of conceptual understanding with more problems and an appropriate difficulty level may be needed.

  18. Defense of Cyber Infrastructures Against Cyber-Physical Attacks Using Game-Theoretic Models

    DOE PAGES

    Rao, Nageswara S. V.; Poole, Stephen W.; Ma, Chris Y. T.; ...

    2015-04-06

    The operation of cyber infrastructures relies on both cyber and physical components, which are subject to incidental and intentional degradations of different kinds. Within the context of network and computing infrastructures, we study the strategic interactions between an attacker and a defender using game-theoretic models that take into account both cyber and physical components. The attacker and defender optimize their individual utilities, expressed as sums of cost and system terms. First, we consider a Boolean attack-defense model, wherein the cyber and physical sub-infrastructures may be attacked and reinforced as individual units. Second, we consider a component attack-defense model, wherein their components may be attacked and defended, and the infrastructure requires minimum numbers of both to function. We show that the Nash equilibrium under uniform costs in both cases is computable in polynomial time, and it provides high-level deterministic conditions for infrastructure survival. When probabilities of successful attack and defense, and of incidental failures, are incorporated into the models, the results favor the attacker but otherwise remain qualitatively similar. This approach has been motivated and validated by our experiences with the UltraScience Net infrastructure, which was built to support high-performance network experiments. The analytical results, however, are more general, and we apply them to simplified models of cloud and high-performance computing infrastructures.

  19. Defense of Cyber Infrastructures Against Cyber-Physical Attacks Using Game-Theoretic Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S. V.; Poole, Stephen W.; Ma, Chris Y. T.

    The operation of cyber infrastructures relies on both cyber and physical components, which are subject to incidental and intentional degradations of different kinds. Within the context of network and computing infrastructures, we study the strategic interactions between an attacker and a defender using game-theoretic models that take into account both cyber and physical components. The attacker and defender optimize their individual utilities, expressed as sums of cost and system terms. First, we consider a Boolean attack-defense model, wherein the cyber and physical sub-infrastructures may be attacked and reinforced as individual units. Second, we consider a component attack-defense model, wherein their components may be attacked and defended, and the infrastructure requires minimum numbers of both to function. We show that the Nash equilibrium under uniform costs in both cases is computable in polynomial time, and it provides high-level deterministic conditions for infrastructure survival. When probabilities of successful attack and defense, and of incidental failures, are incorporated into the models, the results favor the attacker but otherwise remain qualitatively similar. This approach has been motivated and validated by our experiences with the UltraScience Net infrastructure, which was built to support high-performance network experiments. The analytical results, however, are more general, and we apply them to simplified models of cloud and high-performance computing infrastructures.

  20. Defense of Cyber Infrastructures Against Cyber-Physical Attacks Using Game-Theoretic Models.

    PubMed

    Rao, Nageswara S V; Poole, Stephen W; Ma, Chris Y T; He, Fei; Zhuang, Jun; Yau, David K Y

    2016-04-01

    The operation of cyber infrastructures relies on both cyber and physical components, which are subject to incidental and intentional degradations of different kinds. Within the context of network and computing infrastructures, we study the strategic interactions between an attacker and a defender using game-theoretic models that take into account both cyber and physical components. The attacker and defender optimize their individual utilities, expressed as sums of cost and system terms. First, we consider a Boolean attack-defense model, wherein the cyber and physical subinfrastructures may be attacked and reinforced as individual units. Second, we consider a component attack-defense model wherein their components may be attacked and defended, and the infrastructure requires minimum numbers of both to function. We show that the Nash equilibrium under uniform costs in both cases is computable in polynomial time, and it provides high-level deterministic conditions for the infrastructure survival. When probabilities of successful attack and defense, and of incidental failures, are incorporated into the models, the results favor the attacker but otherwise remain qualitatively similar. This approach has been motivated and validated by our experiences with UltraScience Net infrastructure, which was built to support high-performance network experiments. The analytical results, however, are more general, and we apply them to simplified models of cloud and high-performance computing infrastructures. © 2015 Society for Risk Analysis.
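    For small strategy spaces, equilibria of such attack-defense games can be found by brute-force best-response enumeration. The 2x2 payoff matrices below are toy values (rows: attack cyber / attack physical; columns: defend cyber / defend physical); the paper's polynomial-time result concerns its structured Boolean and component models, not this generic search.

```python
import itertools

def pure_nash(att_util, def_util):
    """Enumerate strategy pairs; return those where neither player can
    improve by deviating unilaterally (pure-strategy Nash equilibria)."""
    n_att, n_def = len(att_util), len(att_util[0])
    equilibria = []
    for a, d in itertools.product(range(n_att), range(n_def)):
        best_a = all(att_util[a][d] >= att_util[x][d] for x in range(n_att))
        best_d = all(def_util[a][d] >= def_util[a][y] for y in range(n_def))
        if best_a and best_d:
            equilibria.append((a, d))
    return equilibria

# Toy utilities: defending the cyber sub-infrastructure dominates for the
# defender, and the attacker's best response is to attack the physical side.
att_util = [[-2, 3],
            [1, -4]]
def_util = [[2, -3],
            [0, -2]]
eq = pure_nash(att_util, def_util)
```

    Note that many attack-defense payoff structures (e.g. matching-pennies-like ones) have no pure-strategy equilibrium at all, in which case this enumeration returns an empty list and mixed strategies must be considered.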

  1. A Framework for Load Balancing of Tensor Contraction Expressions via Dynamic Task Partitioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lai, Pai-Wei; Stock, Kevin; Rajbhandari, Samyam

    In this paper, we introduce the Dynamic Load-balanced Tensor Contractions (DLTC) framework, a domain-specific library for efficient task-parallel execution of tensor contraction expressions, a class of computation encountered in quantum chemistry and physics. Our framework decomposes each contraction into smaller units of tasks, represented by an abstraction referred to as iterators. We exploit an extra level of parallelism by having tasks across independent contractions executed concurrently through a dynamic load-balancing runtime. We demonstrate the improved performance, scalability, and flexibility of this approach for the computation of tensor contraction expressions on parallel computers, using examples from coupled cluster methods.
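    The decomposition idea, splitting each contraction into independent tiles and letting a dynamic runtime schedule them, can be sketched with a thread pool. Tile size and matrix shapes are arbitrary here, and DLTC's iterators handle general tensor index spaces rather than just matrix row blocks.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def contract_tiles(A, B, tile=2, workers=4):
    """Compute C = A @ B as independent row-block tasks scheduled
    dynamically: each idle worker picks up the next unprocessed tile."""
    C = np.zeros((A.shape[0], B.shape[1]))

    def task(i0):
        # Tiles write disjoint row slices, so no locking is needed.
        C[i0:i0 + tile] = A[i0:i0 + tile] @ B

    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(task, range(0, A.shape[0], tile)))
    return C

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))
B = rng.standard_normal((4, 3))
C = contract_tiles(A, B)
```

    The load-balancing benefit appears when tile costs are uneven or several contractions are in flight at once: fast workers simply pull more tiles instead of idling, which is the behavior the dynamic runtime generalizes.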

  2. The CP-PACS parallel computer

    NASA Astrophysics Data System (ADS)

    Ukawa, Akira

    1998-05-01

    The CP-PACS computer is a massively parallel computer consisting of 2048 processing units and having a peak speed of 614 GFLOPS and 128 GByte of main memory. It was developed over the four years from 1992 to 1996 at the Center for Computational Physics, University of Tsukuba, for large-scale numerical simulations in computational physics, especially those of lattice QCD. The CP-PACS computer has been in full operation for physics computations since October 1996. In this article we describe the chronology of the development, the hardware and software characteristics of the computer, and its performance for lattice QCD simulations.

  3. News | Computing

    Science.gov Websites

    News item referencing Daniel Elvira's paper "Impact of Detector Simulation on Particle Physics Collider Experiments".

  4. Analysis of a hydraulic a scaled asymmetric labyrinth weir with Ansys-Fluent

    NASA Astrophysics Data System (ADS)

    Otálora Carmona, Andrés Humberto; Santos Granados, Germán Ricardo

    2017-04-01

    This document presents the three-dimensional computational modeling of a labyrinth weir, using version 17.0 of the Computational Fluid Dynamics (CFD) software ANSYS FLUENT. The computational characteristics of the model, such as the geometry, the mesh sensitivity, the numerical scheme, and the turbulence modeling parameters, are described. The volume fraction of the water-air mixture, the velocity profile, the jet trajectory, the discharge coefficient and the velocity field are analyzed. To evaluate the hydraulic behavior of the labyrinth weir of the Naveta hydroelectric plant, in Apulo, Cundinamarca, a 1:21 scale model of the original structure was developed and tested in the hydraulics laboratory of the Escuela Colombiana de Ingeniería Julio Garavito. The scale model was initially developed to determine the variability of the discharge coefficient with respect to the flow rate and its influence on the water level. It was built because the original weir (a labyrinth weir with a non-symmetrical rectangular section) did not have the capacity to pass the design flow of 31 m3/s: above 15 m3/s, there were overflows in the adduction channel. This loss of efficiency was due to the thickening of the lateral walls required for structural reasons. During the physical modeling done by Rodríguez, H. and Matamoros, H. (2015) in the test channel, it was found that, with the increase in the width of the side walls, the discharge coefficient is reduced by an average of 34%, raising the water level 0.26 m above the structure. This document aims to develop a methodology linking the physical model of a labyrinth weir with numerical modeling, using concepts of computational fluid dynamics and finite volume theory.
To this end, a detailed analysis was carried out of the variations of the main hydraulic variables involved, such as the components of the velocity and the distribution of pressures. For the numerical development, we worked with the ANSYS FLUENT modeling software, version 17.0. Initially, a digital model of a conventional triangular weir with a vertical angle of 102° was developed in order to find the most appropriate numerical scheme and conditions. The numerical results were compared with conventional theories by evaluating the nappe trajectory and the discharge coefficient. Subsequently, one of the five cycles that compose the labyrinth weir was simulated, evaluating the behavior of the discharge coefficient, the water level, the streamlines and the velocity field, with the purpose of understanding the hydraulic variables related to these geometries. Based on these results, the numerical modeling of the full labyrinth weir was performed, comparing the results with the data from the physical scale model and analyzing the variation of the discharge coefficient, the streamlines, the velocity field, the pressure distribution and the shear stress. Finally, based on the lessons learned from the physical and numerical modeling, a methodological guide was created so that any user with knowledge of computational and hydraulic fluid mechanics can develop good practice in computational and physical modeling.
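    The discharge-coefficient bookkeeping behind the physical-model comparison follows the standard weir relation Q = (2/3) Cd L sqrt(2g) H^1.5. A sketch with made-up numbers (not the Naveta data):

```python
import math

def discharge_coefficient(Q, L, H, g=9.81):
    """Invert the standard weir relation Q = (2/3)*Cd*L*sqrt(2g)*H**1.5
    for the discharge coefficient Cd (Q in m3/s, L and H in m)."""
    return Q / ((2.0 / 3.0) * L * math.sqrt(2.0 * g) * H ** 1.5)

# Hypothetical example: forward-compute Q for Cd = 0.7, then recover Cd.
Cd_true, L, H = 0.7, 2.0, 0.5
Q = (2.0 / 3.0) * Cd_true * L * math.sqrt(2.0 * 9.81) * H ** 1.5
Cd = discharge_coefficient(Q, L, H)
```

    For a fixed Q and crest length, the relation implies H scales as Cd^(-2/3), so a drop in Cd (as measured when the side walls were thickened) forces the head over the crest up, which is the mechanism behind the 0.26 m rise reported above.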

  5. Automatic computation of transfer functions

    DOEpatents

    Atcitty, Stanley; Watson, Luke Dale

    2015-04-14

    Technologies pertaining to the automatic computation of transfer functions for a physical system are described herein. The physical system is one of an electrical system, a mechanical system, an electromechanical system, an electrochemical system, or an electromagnetic system. A netlist in the form of a matrix comprises data that is indicative of elements in the physical system, values for the elements in the physical system, and structure of the physical system. Transfer functions for the physical system are computed based upon the netlist.
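    The netlist-to-transfer-function idea can be sketched with a toy nodal-analysis solver evaluated at one frequency. The netlist encoding, function name, and single-frequency approach below are illustrative assumptions, not the patented method.

```python
import numpy as np

def transfer_function(netlist, n_nodes, in_node, out_node, omega):
    """Tiny nodal-analysis sketch: build the complex admittance matrix from
    a netlist of ('R'|'C', node_a, node_b, value) entries (node 0 = ground),
    drive in_node with an ideal 1 V source, and return V(out_node)."""
    Y = np.zeros((n_nodes, n_nodes), dtype=complex)
    for kind, a, b, val in netlist:
        y = 1.0 / val if kind == "R" else 1j * omega * val
        for node in (a, b):                      # diagonal (self) terms
            if node:
                Y[node - 1, node - 1] += y
        if a and b:                              # off-diagonal coupling
            Y[a - 1, b - 1] -= y
            Y[b - 1, a - 1] -= y
    # Enforce V(in_node) = 1 by replacing its KCL row with an identity row.
    Y[in_node - 1, :] = 0
    Y[in_node - 1, in_node - 1] = 1
    rhs = np.zeros(n_nodes, dtype=complex)
    rhs[in_node - 1] = 1
    v = np.linalg.solve(Y, rhs)
    return v[out_node - 1]

# RC low-pass (R = 1 kOhm in series, C = 1 uF to ground): H = 1/(1 + jwRC).
netlist = [("R", 1, 2, 1000.0), ("C", 2, 0, 1e-6)]
H = transfer_function(netlist, n_nodes=2, in_node=1, out_node=2, omega=1000.0)
```

    At the corner frequency omega = 1/(RC) the magnitude is 1/sqrt(2), the familiar -3 dB point; sweeping omega traces out the full frequency response.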

  6. Computer work and self-reported variables on anthropometrics, computer usage, work ability, productivity, pain, and physical activity.

    PubMed

    Madeleine, Pascal; Vangsgaard, Steffen; Hviid Andersen, Johan; Ge, Hong-You; Arendt-Nielsen, Lars

    2013-08-01

    Computer users often report musculoskeletal complaints and pain in the upper extremities and the neck-shoulder region. However, recent epidemiological studies do not report a relationship between the extent of computer use and work-related musculoskeletal disorders (WMSD). The aim of this study was to conduct an explorative analysis of short- and long-term pain complaints and work-related variables in a cohort of Danish computer users. A structured web-based questionnaire was designed, including questions related to musculoskeletal pain, anthropometrics, work-related variables, work ability, productivity, health-related parameters, lifestyle variables, and physical activity during leisure time. Six hundred and ninety office workers completed the questionnaire, responding to an announcement posted in a union magazine. The questionnaire outcomes, i.e., pain intensity, duration and locations as well as anthropometrics, work-related variables, work ability, productivity, and level of physical activity, were stratified by gender and correlations were obtained. Women reported higher pain intensity, longer pain duration and more locations with pain than men (P < 0.05). In parallel, women scored poorer work ability and ability to fulfil the requirements on productivity than men (P < 0.05). Strong positive correlations were found between pain intensity and pain duration for the forearm, elbow, neck and shoulder (P < 0.001). Moderate negative correlations were seen between pain intensity and work ability/productivity (P < 0.001). The present results provide new key information on pain characteristics in office workers. The differences in pain characteristics, i.e., higher intensity, longer duration and more pain locations, as well as the poorer work ability reported by women workers, relate to their higher risk of contracting WMSD. 
Overall, this investigation confirmed the complex interplay between anthropometrics, work ability, productivity, and pain perception among computer users.

  7. Long-Duration Environmentally-Adaptive Autonomous Rigorous Naval Systems

    DTIC Science & Technology

    2015-09-30

    equations). The accuracy of the DO level-set equations for solving the governing stochastic level-set reachability fronts was first verified in part by...reachable set contours computed by DO and MC. We see that it is less than the spatial resolution used, indicating our DO solutions are accurate. We solved ...the interior of the sensors’ reachable sets, all the physically impossible trajectories are immediately ruled out. However, this approach is myopic

  8. An efficient dynamic load balancing algorithm

    NASA Astrophysics Data System (ADS)

    Lagaros, Nikos D.

    2014-01-01

    In engineering problems, randomness and uncertainties are inherent. Robust design procedures, formulated in the framework of multi-objective optimization, have been proposed in order to take into account sources of randomness and uncertainty. These design procedures require orders of magnitude more computational effort than conventional analysis or optimum design processes, since a very large number of finite element analyses must be performed. It is therefore imperative to exploit the capabilities of computing resources in order to deal with problems of this kind. In particular, parallel computing can be implemented at the level of metaheuristic optimization, by exploiting the physical parallelization feature of the nondominated sorting evolution strategies method, as well as at the level of the repeated structural analyses required for assessing the behavioural constraints and for calculating the objective functions. In this study an efficient dynamic load balancing algorithm for optimum exploitation of available computing resources is proposed and, without loss of generality, is applied to computing the desired Pareto front. In such problems, computing the complete Pareto front with feasible designs only constitutes a very challenging task. The proposed algorithm achieves near-linear speedup, with efficiency values approaching 100% relative to the sequential procedure.

  9. Uncertainty propagation of p-boxes using sparse polynomial chaos expansions

    NASA Astrophysics Data System (ADS)

    Schöbi, Roland; Sudret, Bruno

    2017-06-01

    In modern engineering, physical processes are modelled and analysed using advanced computer simulations, such as finite element models. Furthermore, concepts of reliability analysis and robust design are becoming popular, hence, making efficient quantification and propagation of uncertainties an important aspect. In this context, a typical workflow includes the characterization of the uncertainty in the input variables. In this paper, input variables are modelled by probability-boxes (p-boxes), accounting for both aleatory and epistemic uncertainty. The propagation of p-boxes leads to p-boxes of the output of the computational model. A two-level meta-modelling approach is proposed using non-intrusive sparse polynomial chaos expansions to surrogate the exact computational model and, hence, to facilitate the uncertainty quantification analysis. The capabilities of the proposed approach are illustrated through applications using a benchmark analytical function and two realistic engineering problem settings. They show that the proposed two-level approach allows for an accurate estimation of the statistics of the response quantity of interest using a small number of evaluations of the exact computational model. This is crucial in cases where the computational costs are dominated by the runs of high-fidelity computational models.
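    The non-intrusive idea, fitting a polynomial expansion to a handful of exact model runs and then querying the cheap surrogate, can be sketched in one dimension. The model function, degree, and design size here are illustrative assumptions; the paper's sparse, multi-dimensional expansions with p-box inputs are considerably more involved:

```python
import numpy as np
from numpy.polynomial import legendre

def expensive_model(x):
    # Stand-in for a costly high-fidelity computational model
    return np.exp(0.5 * x) + 0.1 * x ** 3

rng = np.random.default_rng(0)
x_design = rng.uniform(-1.0, 1.0, 30)   # small experimental design
y_design = expensive_model(x_design)

# Non-intrusive fit: least-squares Legendre coefficients, degree 6
coeffs = legendre.legfit(x_design, y_design, 6)

# The surrogate is now cheap to evaluate anywhere on [-1, 1]
x_test = np.linspace(-1.0, 1.0, 200)
max_err = np.max(np.abs(legendre.legval(x_test, coeffs) - expensive_model(x_test)))
```

    Only 30 exact model evaluations were needed to build the surrogate, after which response statistics can be estimated from the expansion at negligible cost.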

  10. Uncertainty propagation of p-boxes using sparse polynomial chaos expansions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schöbi, Roland, E-mail: schoebi@ibk.baug.ethz.ch; Sudret, Bruno, E-mail: sudret@ibk.baug.ethz.ch

    2017-06-15

    In modern engineering, physical processes are modelled and analysed using advanced computer simulations, such as finite element models. Furthermore, concepts of reliability analysis and robust design are becoming popular, hence, making efficient quantification and propagation of uncertainties an important aspect. In this context, a typical workflow includes the characterization of the uncertainty in the input variables. In this paper, input variables are modelled by probability-boxes (p-boxes), accounting for both aleatory and epistemic uncertainty. The propagation of p-boxes leads to p-boxes of the output of the computational model. A two-level meta-modelling approach is proposed using non-intrusive sparse polynomial chaos expansions to surrogate the exact computational model and, hence, to facilitate the uncertainty quantification analysis. The capabilities of the proposed approach are illustrated through applications using a benchmark analytical function and two realistic engineering problem settings. They show that the proposed two-level approach allows for an accurate estimation of the statistics of the response quantity of interest using a small number of evaluations of the exact computational model. This is crucial in cases where the computational costs are dominated by the runs of high-fidelity computational models.

  11. High Energy Physics

    Science.gov Websites

    Argonne High Energy Physics, Cosmic Frontier Theory & Computing homepage. The Cosmic Frontier Theory & Computing Group led the analysis to begin mapping dark matter.

  12. Metacognitive gimmicks and their use by upper level physics students

    NASA Astrophysics Data System (ADS)

    White, Gary; Sikorski, Tiffany-Rose; Landay, Justin

    2017-01-01

    We report on the initial phases of a study of three particular metacognitive gimmicks that upper-level physics students can use as tools in their problem-solving kit, namely: checking units for consistency, discerning whether limiting cases match physical intuition, and computing numerical values for reasonableness. Students in a one-semester Griffiths electromagnetism course at a small private urban university campus are asked to respond to explicit prompts that encourage adopting these three methods for checking answers to physics problems, especially those problems for which an algebraic expression is part of the final answer. We explore how, and to what extent, these students adopt these gimmicks, as well as the time development of their use. While the term ``gimmick'' carries with it some pejorative baggage, we feel it describes the essential nature of the pedagogical idea adequately in that it gets attention, is easy for the students to remember, and represents, albeit perhaps in a surface way, some key ideas about which professional physicists care.

  13. Self-Reported Pediatric Measures of Physical Activity, Sedentary Behavior and Strength Impact for PROMIS®: Conceptual Framework

    PubMed Central

    Tucker, Carole A.; Bevans, Katherine B.; Teneralli, Rachel E.; Smith, Ashley Wilder; Bowles, Heather R; Forrest, Christopher B.

    2014-01-01

    Purpose. Children's physical activity (PA) levels are commonly assessed in pediatric clinical research, but rigorous self-report assessment tools for children are scarce, and computer adaptive test implementations are rare. Our objective was to improve pediatric self-report measures of activity using semi-structured interviews with experts and children for conceptualization of a child-informed framework. Methods. Semi-structured interviews were conducted to conceptualize physical activity, sedentary behaviors, and strengthening activities. We performed systematic literature reviews to identify item-level concepts used to assess these 3 domains. Results. We developed conceptual frameworks for each domain using words and phrases identified by children as relevant. Conclusions. Semi-structured interview methods provide valuable information on children's perspectives and the ways children recall previous activities. Conceptualized domains of physical activity are based on the literature and expert views that also reflect children's experiences and understanding, providing a basis for pediatric self-report instruments. PMID:25251789

  14. Bespoke physics for living technology.

    PubMed

    Ackley, David H

    2013-01-01

    In the physics of the natural world, basic tasks of life, such as homeostasis and reproduction, are extremely complex operations, requiring the coordination of billions of atoms even in simple cases. By contrast, artificial living organisms can be implemented in computers using relatively few bits, and copying a data structure is trivial. Of course, the physical overheads of the computers themselves are huge, but since their programmability allows digital "laws of physics" to be tailored like a custom suit, deploying living technology atop an engineered computational substrate might be as or more effective than building directly on the natural laws of physics, for a substantial range of desirable purposes. This article suggests basic criteria and metrics for bespoke physics computing architectures, describes one such architecture, and offers data and illustrations of custom living technology competing to reproduce while collaborating on an externally useful computation.

  15. Learning optimal quantum models is NP-hard

    NASA Astrophysics Data System (ADS)

    Stark, Cyril J.

    2018-02-01

    Physical modeling translates measured data into a physical model. Physical modeling is a major objective in physics and is generally regarded as a creative process. How good are computers at solving this task? Here, we show that in the absence of physical heuristics, the inference of optimal quantum models cannot be computed efficiently (unless P=NP ). This result illuminates rigorous limits to the extent to which computers can be used to further our understanding of nature.

  16. Intensity Maps Production Using Real-Time Joint Streaming Data Processing From Social and Physical Sensors

    NASA Astrophysics Data System (ADS)

    Kropivnitskaya, Y. Y.; Tiampo, K. F.; Qin, J.; Bauer, M.

    2015-12-01

    Intensity is one of the most useful measures of earthquake hazard, as it quantifies the strength of shaking produced at a given distance from the epicenter. Today, there are several data sources that could be used to determine intensity level, which can be divided into two main categories. The first category is represented by social data sources, in which the intensity values are collected by interviewing people who experienced the earthquake-induced shaking. In this case, specially developed questionnaires can be used in addition to personal observations published on social networks such as Twitter. These observations are assigned to the appropriate intensity level by correlating specific details and descriptions to the Modified Mercalli Scale. The second category of data sources is represented by observations from different physical sensors installed with the specific purpose of obtaining an instrumentally derived intensity level. These are usually based on a regression of recorded peak acceleration and/or velocity amplitudes. This approach relates the recorded ground motions to the expected felt and damage distribution through empirical relationships. The goal of this work is to implement and evaluate streaming data processing, separately and jointly, from both social and physical sensors in order to produce near-real-time intensity maps, and to compare and analyze their quality and evolution through 10-minute time intervals immediately following an earthquake. Results are shown for the case study of the M6.0 South Napa, CA earthquake that occurred on August 24, 2014. The use of innovative streaming and pipelining computing paradigms through the IBM InfoSphere Streams platform made it possible to read input data in real time for low-latency computing of a combined intensity level and production of combined intensity maps in near-real time. The results compare three types of intensity maps created from physical, social and combined data sources.
    Here we correlate the count and density of tweets with intensity level and show the importance of processing combined data sources at the earliest time stages after an earthquake occurs. This method can supplement existing approaches to intensity level detection, especially in regions with a high number of Twitter users and a low density of seismic networks.
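    One simple way to combine an instrumental intensity estimate with a tweet-derived one is inverse-variance weighting. This is a generic fusion sketch with made-up numbers, not the pipeline the authors implemented:

```python
def fuse_intensity(est_social, var_social, est_physical, var_physical):
    """Inverse-variance weighted fusion of two intensity estimates;
    the noisier source (larger variance) receives the smaller weight."""
    w_s, w_p = 1.0 / var_social, 1.0 / var_physical
    return (w_s * est_social + w_p * est_physical) / (w_s + w_p)

# A noisy tweet-based estimate (MMI 6, variance 4) pulled toward a
# precise instrumental estimate (MMI 5, variance 1)
combined = fuse_intensity(6.0, 4.0, 5.0, 1.0)
```

    In regions with dense seismic networks the instrumental term dominates; where only social data are available, the tweet-derived estimate carries the weight.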

  17. MIRO Computational Model

    NASA Technical Reports Server (NTRS)

    Broderick, Daniel

    2010-01-01

    A computational model calculates the excitation of water rotational levels and emission-line spectra in a cometary coma with applications for the Microwave Instrument for Rosetta Orbiter (MIRO). MIRO is a millimeter-submillimeter spectrometer that will be used to study the nature of cometary nuclei, the physical processes of outgassing, and the formation of the head region of a comet (coma). The computational model is a means to interpret the data measured by MIRO. The model is based on the accelerated Monte Carlo method, which performs a random angular, spatial, and frequency sampling of the radiation field to calculate the local average intensity of the field. With the model, the water rotational level populations in the cometary coma and the line profiles for the emission from the water molecules as a function of cometary parameters (such as outgassing rate, gas temperature, and gas and electron density) and observation parameters (such as distance to the comet and beam width) are calculated.
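    The core Monte Carlo idea of estimating a local average intensity by random angular sampling reduces to a toy version. The intensity field below is invented for illustration; MIRO's model also samples spatially and in frequency:

```python
import random

def mean_intensity(intensity, n_samples, seed=0):
    """Estimate the angle-averaged intensity J = (1/4pi) * integral of I dOmega
    by sampling directions uniformly over the sphere (mu = cos(theta) ~ U[-1, 1])."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        mu = rng.uniform(-1.0, 1.0)
        total += intensity(mu)
    return total / n_samples

# Toy anisotropic field I(mu) = 1 + 0.5*mu; its exact angular mean is 1
estimate = mean_intensity(lambda mu: 1.0 + 0.5 * mu, 100_000)
```

    Acceleration schemes like the one the model uses improve on this plain estimator, but the underlying averaging over randomly sampled directions is the same.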

  18. Displaying Computer Simulations Of Physical Phenomena

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1991-01-01

    Paper discusses computer simulation as means of experiencing and learning to understand physical phenomena. Covers both present simulation capabilities and major advances expected in near future. Visual, aural, tactile, and kinesthetic effects used to teach such physical sciences as dynamics of fluids. Recommends classrooms in universities, government, and industry be linked to advanced computing centers so computer simulations integrated into education process.

  19. Physics in Screening Environments

    NASA Astrophysics Data System (ADS)

    Certik, Ondrej

    In the current study, we investigated atoms in screening environments like plasmas. It is common practice to extract physical data, such as temperature and electron densities, from plasma experiments. We present results that address inherent computational difficulties that arise when the screening approach is extended to include the interaction between the atomic electrons. We show that there may arise an ambiguity in the interpretation of physical properties, such as temperature and charge density, from experimental data due to the opposing effects of electron-nucleus screening and electron-electron screening. The focus of the work, however, is on the resolution of inherent computational challenges that appear in the computation of two-particle matrix elements. Those enter already at the Hartree-Fock level. Furthermore, as examples of post Hartree-Fock calculations, we show second-order Green's function results and many body perturbation theory results of second order. A self-contained derivation of all necessary equations has been included. The accuracy of the implementation of the method is established by comparing standard unscreened results for various atoms and molecules against literature for Hartree-Fock as well as Green's function and many body perturbation theory. The main results of the thesis are presented in the chapter called Screened Results, where the behavior of several atomic systems depending on electron-electron and electron-nucleus Debye screening was studied. The computer code that we have developed has been made available for anybody to use. Finally, we present and discuss results obtained for screened interactions. We also examine thoroughly the computational details of the calculations and particular implementations of the method.

  20. The Case for Developing Professional Master's Degrees to Compete in the Business World

    NASA Astrophysics Data System (ADS)

    Bozler, Hans M.

    2002-04-01

    Graduate education in most physics programs is oriented towards preparing students for research careers even though the majority of the students do not actively pursue research after graduation. This research orientation causes physics graduate programs to lose potential students. In addition, science-trained professionals are often underrepresented in corporate decision making. Meanwhile, many physics graduates at all levels supplement their skills by taking courses in professional schools (engineering, law, and business). A survey of our graduates shows that combinations of knowledge and skills from physics and applied disciplines, including business, often form the basis for successful careers. The objective of our new Professional Master's in Physics for Business Applications program is to streamline this education by combining disciplines so that physics graduates can rapidly move into decision making positions within business and industry. We combine a traditional physics curriculum with courses that add to problem solving and computational skills. Students take courses in our Business School and also do an internship. Our physics courses are kept at the same level as those taken by Ph.D. students. The business courses are selected from offerings by the Marshall School of Business to their own MBA students. The progress and problems associated with the development of curriculum, recruiting, and placement will be discussed.

  1. The Inversion Potential of Ammonia: An Intrinsic Reaction Coordinate Calculation for Student Investigation

    ERIC Educational Resources Information Center

    Halpern, Arthur M.; Ramachandran, B. R.; Glendening, Eric D.

    2007-01-01

    A report is presented to describe how students can be empowered to construct the full, double minimum inversion potential for ammonia by performing intrinsic reaction coordinate calculations. This work can be associated with the third year physical chemistry lecture laboratory or an upper level course in computational chemistry.

  2. A Simple, Low-Cost, Data-Logging Pendulum Built from a Computer Mouse

    ERIC Educational Resources Information Center

    Gintautas, Vadas; Hubler, Alfred

    2009-01-01

    Lessons and homework problems involving a pendulum are often a big part of introductory physics classes and laboratory courses from high school to undergraduate levels. Although laboratory equipment for pendulum experiments is commercially available, it is often expensive and may not be affordable for teachers on fixed budgets, particularly in…

  3. The Seven-Segment Data Logger

    ERIC Educational Resources Information Center

    Bates, Alan

    2015-01-01

    Instruments or digital meters with data values visible on a seven-segment display can easily be found in the physics lab. Examples include multimeters, sound level meters, Geiger-Müller counters and electromagnetic field meters, where the display is used to show numerical data. Such instruments, without the ability to connect to computers or data…

  4. The Integrated Plasma Simulator: A Flexible Python Framework for Coupled Multiphysics Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foley, Samantha S; Elwasif, Wael R; Bernholdt, David E

    2011-11-01

    High-fidelity coupled multiphysics simulations are an increasingly important aspect of computational science. In many domains, however, there has been very limited experience with simulations of this sort, therefore research in coupled multiphysics often requires computational frameworks with significant flexibility to respond to the changing directions of the physics and mathematics. This paper presents the Integrated Plasma Simulator (IPS), a framework designed for loosely coupled simulations of fusion plasmas. The IPS provides users with a simple component architecture into which a wide range of existing plasma physics codes can be inserted as components. Simulations can take advantage of multiple levels of parallelism supported in the IPS, and can be controlled by a high-level ``driver'' component, or by other coordination mechanisms, such as an asynchronous event service. We describe the requirements and design of the framework, and how they were implemented in the Python language. We also illustrate the flexibility of the framework by providing examples of different types of simulations that utilize various features of the IPS.

  5. Extension of the quantum-kinetic model to lunar and Mars return physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liechty, D. S.; Lewis, M. J.

    The ability to compute rarefied, ionized hypersonic flows is becoming more important as missions such as Earth reentry, landing high-mass payloads on Mars, and the exploration of the outer planets and their satellites are being considered. A recently introduced molecular-level chemistry model, the quantum-kinetic, or Q-K, model that predicts reaction rates for gases in thermal equilibrium and non-equilibrium using only kinetic theory and fundamental molecular properties, is extended in the current work to include electronic energy level transitions and reactions involving charged particles. Like the Q-K procedures for neutral species chemical reactions, these new models are phenomenological procedures that aim to reproduce the reaction/transition rates but do not necessarily capture the exact physics. These engineering models are necessarily efficient due to the requirement to compute billions of simulated collisions in direct simulation Monte Carlo (DSMC) simulations. The new models are shown to generally agree within the spread of reported transition and reaction rates from the literature for near equilibrium conditions.

  6. Neck/shoulder pain in adolescents is not related to the level or nature of self-reported physical activity or type of sedentary activity in an Australian pregnancy cohort.

    PubMed

    Briggs, Andrew M; Straker, Leon M; Bear, Natasha L; Smith, Anne J

    2009-07-20

    An inconsistent relationship between physical activity and neck/shoulder pain (NSP) in adolescents has been reported in the literature. Earlier studies may be limited by not assessing physical activity in sufficient detail. The aim of this study was to comprehensively examine the association between NSP and the level and nature of physical activity, and type of sedentary activity in adolescents. A cross-sectional analysis using data from 924 adolescents in the Western Australian Pregnancy Cohort (RAINE) study was performed. Complete data were available for 643 adolescents (54.6% female) at the 14-year follow-up. Physical activity was measured using a detailed self-report electronic activity diary requiring participants to input details of all physical activities over the day in segments of 5 minutes for a one-week period. Physical activity levels were categorised as: sedentary, light, moderate, or vigorous based on metabolic energy equivalents. Nature of activity was determined by assigning each activity to categories based on the amount of movement (static/dynamic) and the main posture assumed for the activity (standing/sitting/lying). Type of sedentary activity was characterised by exposure time to watching TV, using a computer, and reading. Logistic regression was used to explore the association between NSP and activity. Females reported a higher prevalence of lifetime, 1-month and chronic NSP than males (50.9 vs 41.7%, 34.1 vs 23.5%, and 9.2 vs 6.2% respectively). No consistent, dose-response relationship was found between NSP and the level, nature, and type of physical activity. Self-reported one month and lifetime NSP prevalence in adolescents is not related to the level or intensity of physical activity or the type of sedentary activity over a one week period.

  7. Scheduling applications for execution on a plurality of compute nodes of a parallel computer to manage temperature of the nodes during execution

    DOEpatents

    Archer, Charles J; Blocksome, Michael A; Peters, Amanda E; Ratterman, Joseph D; Smith, Brian E

    2012-10-16

    Methods, apparatus, and products are disclosed for scheduling applications for execution on a plurality of compute nodes of a parallel computer to manage temperature of the plurality of compute nodes during execution that include: identifying one or more applications for execution on the plurality of compute nodes; creating a plurality of physically discontiguous node partitions in dependence upon temperature characteristics for the compute nodes and a physical topology for the compute nodes, each discontiguous node partition specifying a collection of physically adjacent compute nodes; and assigning, for each application, that application to one or more of the discontiguous node partitions for execution on the compute nodes specified by the assigned discontiguous node partitions.
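    The partitioning idea, selecting physically adjacent compute nodes while accounting for their temperature characteristics, can be sketched for a one-dimensional topology. This is a hypothetical simplification; the claimed method covers general topologies and multiple discontiguous partitions:

```python
def coolest_partition(temps, k):
    """Start index of the contiguous block of k physically adjacent
    nodes with the lowest total temperature (sliding-window scan)."""
    window = sum(temps[:k])
    best_sum, best_start = window, 0
    for i in range(1, len(temps) - k + 1):
        window += temps[i + k - 1] - temps[i - 1]
        if window < best_sum:
            best_sum, best_start = window, i
    return best_start

# Nodes 2-3 are the coolest physically adjacent pair in this rack
start = coolest_partition([70, 65, 60, 62, 80, 75], 2)
```

    A scheduler would repeat such a scan per application, excluding already-assigned nodes, so that hot regions of the machine are avoided without sacrificing the physical adjacency the applications need.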

  8. Efficient and Effective Change Principles in Active Videogames

    PubMed Central

    Fenner, Ashley A.; Howie, Erin K.; Feltz, Deborah L.; Gray, Cindy M.; Lu, Amy Shirong; Mueller, Florian “Floyd”; Simons, Monique; Barnett, Lisa M.

    2015-01-01

    Abstract Active videogames have the potential to enhance population levels of physical activity but have not been successful in achieving this aim to date. This article considers a range of principles that may be important to the design of effective and efficient active videogames from diverse discipline areas, including behavioral sciences (health behavior change, motor learning, and serious games), business production (marketing and sales), and technology engineering and design (human–computer interaction/ergonomics and flow). Both direct and indirect pathways to impact on population levels of habitual physical activity are proposed, along with the concept of a game use lifecycle. Examples of current active and sedentary electronic games are used to understand how such principles may be applied. Furthermore, limitations of the current usage of theoretical principles are discussed. A suggested list of principles for best practice in active videogame design is proposed along with suggested research ideas to inform practice to enhance physical activity. PMID:26181680

  9. Efficient and Effective Change Principles in Active Videogames.

    PubMed

    Straker, Leon M; Fenner, Ashley A; Howie, Erin K; Feltz, Deborah L; Gray, Cindy M; Lu, Amy Shirong; Mueller, Florian Floyd; Simons, Monique; Barnett, Lisa M

    2015-02-01

    Active videogames have the potential to enhance population levels of physical activity but have not been successful in achieving this aim to date. This article considers a range of principles that may be important to the design of effective and efficient active videogames from diverse discipline areas, including behavioral sciences (health behavior change, motor learning, and serious games), business production (marketing and sales), and technology engineering and design (human-computer interaction/ergonomics and flow). Both direct and indirect pathways to impact on population levels of habitual physical activity are proposed, along with the concept of a game use lifecycle. Examples of current active and sedentary electronic games are used to understand how such principles may be applied. Furthermore, limitations of the current usage of theoretical principles are discussed. A suggested list of principles for best practice in active videogame design is proposed along with suggested research ideas to inform practice to enhance physical activity.

  10. The Effects of Pathological Gaming on Aggressive Behavior

    PubMed Central

    Valkenburg, Patti M.; Peter, Jochen

    2010-01-01

    Studies have shown that pathological involvement with computer or video games is related to excessive gaming binges and aggressive behavior. Our aims for this study were to longitudinally examine if pathological gaming leads to increasingly excessive gaming habits, and how pathological gaming may cause an increase in physical aggression. For this purpose, we conducted a two-wave panel study among 851 Dutch adolescents (49% female) of which 540 played games (30% female). Our analyses indicated that higher levels of pathological gaming predicted an increase in time spent playing games 6 months later. Time spent playing violent games specifically, and not just games per se, increased physical aggression. Furthermore, higher levels of pathological gaming, regardless of violent content, predicted an increase in physical aggression among boys. That this effect only applies to boys does not diminish its importance, because adolescent boys are generally the heaviest players of violent games and most susceptible to pathological involvement. PMID:20549320

  11. The effects of pathological gaming on aggressive behavior.

    PubMed

    Lemmens, Jeroen S; Valkenburg, Patti M; Peter, Jochen

    2011-01-01

    Studies have shown that pathological involvement with computer or video games is related to excessive gaming binges and aggressive behavior. Our aims for this study were to longitudinally examine if pathological gaming leads to increasingly excessive gaming habits, and how pathological gaming may cause an increase in physical aggression. For this purpose, we conducted a two-wave panel study among 851 Dutch adolescents (49% female) of which 540 played games (30% female). Our analyses indicated that higher levels of pathological gaming predicted an increase in time spent playing games 6 months later. Time spent playing violent games specifically, and not just games per se, increased physical aggression. Furthermore, higher levels of pathological gaming, regardless of violent content, predicted an increase in physical aggression among boys. That this effect only applies to boys does not diminish its importance, because adolescent boys are generally the heaviest players of violent games and most susceptible to pathological involvement.

  12. Evaluation of tablet computers for visual function assessment.

    PubMed

    Bodduluri, Lakshmi; Boon, Mei Ying; Dain, Stephen J

    2017-04-01

    Recent advances in technology and the increased use of tablet computers for mobile health applications such as vision testing necessitate an understanding of the behavior of the displays of such devices, to facilitate the reproduction of existing or the development of new vision assessment tests. The purpose of this study was to investigate the physical characteristics of one model of tablet computer (iPad mini Retina display) with regard to display consistency across a set of devices (15) and their potential application as clinical vision assessment tools. Once the tablet computer was switched on, it required about 13 min to reach luminance stability, while chromaticity remained constant. The luminance output of the device remained stable until a battery level of 5%. Luminance varied from center to peripheral locations of the display and with viewing angle, whereas the chromaticity did not vary. A minimal (1%) variation in luminance was observed due to temperature, and once again chromaticity remained constant. Also, these devices showed good temporal stability of luminance and chromaticity. All 15 tablet computers showed gamma functions approximating the standard gamma (2.20) and showed similar color gamut sizes, except for the blue primary, which displayed minimal variations. The physical characteristics across the 15 devices were similar and are known, thereby facilitating the use of this model of tablet computer as visual stimulus displays.

  13. The role of physicality in rich programming environments

    NASA Astrophysics Data System (ADS)

    Liu, Allison S.; Schunn, Christian D.; Flot, Jesse; Shoop, Robin

    2013-12-01

    Computer science proficiency continues to grow in importance, while the number of students entering computer science-related fields declines. Many rich programming environments have been created to motivate student interest and expertise in computer science. In the current study, we investigated whether a recently created environment, Robot Virtual Worlds (RVWs), can be used to teach computer science principles within a robotics context by examining its use in high-school classrooms. We also investigated whether the lack of physicality in these environments impacts student learning by comparing classrooms that used either virtual or physical robots for the RVW curriculum. Results suggest that the RVW environment leads to significant gains in computer science knowledge, that virtual robots lead to faster learning, and that physical robots may have some influence on algorithmic thinking. We discuss the implications of physicality in these programming environments for learning computer science.

  14. Physical Function Assessment in a Community-Dwelling Population of U.S. Chinese Older Adults

    PubMed Central

    Chang, E-Shien; Simon, Melissa A.

    2014-01-01

    Background. This report describes the levels of physical function in U.S. Chinese older adults utilizing self-reported and performance-based measures, and examines the association between sociodemographic characteristics and physical function. Methods. The Population Study of Chinese Elderly in Chicago enrolled an epidemiological cohort of 3,159 community-dwelling Chinese older adults aged 60 and older. We collected self-reported physical function using Katz activities of daily living and Lawton instrumental activities of daily living items, the Index of Mobility scale, and the Index of Basic Physical Activities scale. Participants were also asked to perform tasks in chair stand, tandem stand, and timed walk. We computed Pearson and Spearman correlation coefficients to examine the correlation between sociodemographic and physical function variables. Results. A total of 7.8% of study participants experienced activities of daily living impairment, and 50.2% experienced instrumental activities of daily living impairment. With respect to physical performance testing, 11.4% of the participants were not able to complete the chair stand five times, and 8.5% were unable to do chair stands at all. Older age, female gender, lower education level, being unmarried, living with fewer people in the same household, having fewer children, living fewer years in the United States, living fewer years in the community, and worsening health status were significantly correlated with lower levels of physical function. Conclusions. Utilizing self-reported and performance-based measures of physical function in a large population-based study of U.S. Chinese older adults, our findings expand current understanding of minority older adults’ functional status. PMID:25378446
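
    The Pearson coefficients reported above can be computed without any statistics library. A minimal pure-Python sketch, using hypothetical age and function-score data rather than the study's:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical: age vs. a physical-function score (higher = better),
# illustrating the negative age-function correlation the study reports.
ages   = [62, 65, 70, 74, 80, 85]
scores = [28, 27, 24, 22, 18, 15]
print(round(pearson_r(ages, scores), 3))  # strongly negative
```

    Spearman's coefficient is the same formula applied to the ranks of the data, which is why the two are often reported together.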

  15. Preparing Students for Careers in Science and Industry with Computational Physics

    NASA Astrophysics Data System (ADS)

    Florinski, V. A.

    2011-12-01

    Funded by an NSF CAREER grant, the University of Alabama in Huntsville (UAH) has launched a new graduate program in Computational Physics. It is widely accepted that today's physics is done on a computer. The program blurs the boundary between physics and computer science by teaching students modern, practical techniques for solving difficult physics problems on diverse computational platforms. Currently consisting of two courses first offered in the Fall of 2011, the program will eventually include five courses covering methods for fluid dynamics, particle transport via stochastic methods, and hybrid and PIC plasma simulations. UAH's unique location allows courses to be shaped through discussions with faculty, NASA/MSFC researchers, and local R&D business representatives, i.e., potential employers of the program's graduates. The students currently participating in the program have all begun their research careers in space and plasma physics; many are presenting their research at this meeting.

  16. An interactive NASTRAN preprocessor. [graphic display of undeformed structure using CDC 6000 series computer

    NASA Technical Reports Server (NTRS)

    Smith, W. W.

    1973-01-01

    A Langley Research Center version of NASTRAN Level 15.1.0 designed to provide the analyst with an added tool for debugging massive NASTRAN input data is described. The program checks all NASTRAN input data cards and displays on a CRT the graphic representation of the undeformed structure. In addition, the program permits the display and alteration of input data and allows re-execution without physically resubmitting the job. Core requirements on the CDC 6000 computer are approximately 77,000 octal words of central memory.
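
    Since CDC core sizes were customarily quoted in octal, the figure above can be converted to decimal with a one-line check:

```python
# 77,000 octal central-memory words, as quoted for the CDC 6000 series
octal_words = int("77000", 8)
print(octal_words)  # 32256 decimal words
```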

  17. Photoionization and High Density Gas

    NASA Technical Reports Server (NTRS)

    Kallman, T.; Bautista, M.; White, Nicholas E. (Technical Monitor)

    2002-01-01

    We present results of calculations using the XSTAR version 2 computer code. This code is loosely based on the XSTAR v.1 code which has been available for public use for some time. However it represents an improvement and update in several major respects, including atomic data, code structure, user interface, and improved physical description of ionization/excitation. In particular, it now is applicable to high density situations in which significant excited atomic level populations are likely to occur. We describe the computational techniques and assumptions, and present sample runs with particular emphasis on high density situations.

  18. Recent advances in QM/MM free energy calculations using reference potentials

    PubMed Central

    Duarte, Fernanda; Amrein, Beat A.; Blaha-Nelson, David; Kamerlin, Shina C.L.

    2015-01-01

    Background Recent years have seen enormous progress in the development of methods for modeling (bio)molecular systems. This has allowed for the simulation of ever larger and more complex systems. However, as such complexity increases, the requirements needed for these models to be accurate and physically meaningful become more and more difficult to fulfill. The use of simplified models to describe complex biological systems has long been shown to be an effective way to overcome some of the limitations associated with this computational cost in a rational way. Scope of review Hybrid QM/MM approaches have rapidly become one of the most popular computational tools for studying chemical reactivity in biomolecular systems. However, the high cost involved in performing high-level QM calculations has limited the applicability of these approaches when calculating free energies of chemical processes. In this review, we present some of the advances in using reference potentials and mean field approximations to accelerate high-level QM/MM calculations. We present illustrative applications of these approaches and discuss challenges and future perspectives for the field. Major conclusions The use of physically-based simplifications has been shown to effectively reduce the cost of high-level QM/MM calculations. In particular, lower-level reference potentials enable one to reduce the cost of expensive free energy calculations, thus expanding the scope of problems that can be addressed. General significance As was already demonstrated 40 years ago, the use of simplified models still allows one to obtain cutting-edge results with substantially reduced computational cost. This article is part of a Special Issue entitled Recent developments of molecular dynamics. PMID:25038480
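
    The reference-potential idea above is commonly realized through the Zwanzig free-energy perturbation formula: the low-to-high-level correction is estimated from energy gaps sampled on the cheap potential. A minimal sketch (energy gaps are hypothetical; the kT value is an assumed ~300 K figure in kJ/mol):

```python
import math

def fep_correction(energy_gaps, kT=2.494):  # kT ~ 2.494 kJ/mol at ~300 K
    """Zwanzig free-energy perturbation estimate of the correction from a
    cheap reference potential to a high-level potential:
        dA = -kT * ln < exp(-(E_high - E_low)/kT) >_low,
    averaged over configurations sampled with the low-level potential."""
    n = len(energy_gaps)
    avg = sum(math.exp(-g / kT) for g in energy_gaps) / n
    return -kT * math.log(avg)

# Sanity check: a constant offset c between the two surfaces gives dA = c
gaps = [5.0] * 100  # hypothetical E_high - E_low samples, kJ/mol
print(round(fep_correction(gaps), 6))  # 5.0
```

    In practice the exponential average converges poorly when the two potentials overlap badly, which is exactly what the mean-field and reference-potential tricks discussed in the review aim to mitigate.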

  19. CP-odd sector and θ dynamics in holographic QCD

    NASA Astrophysics Data System (ADS)

    Areán, Daniel; Iatrakis, Ioannis; Järvinen, Matti; Kiritsis, Elias

    2017-07-01

    The holographic model of V-QCD is used to analyze the physics of QCD in the Veneziano large-N limit. An unprecedented analysis of the CP-odd physics is performed, going beyond the level of effective field theories. The structure of holographic saddle points at finite θ is determined, as well as its interplay with chiral symmetry breaking. Many observables (vacuum energy and higher-order susceptibilities, singlet and nonsinglet masses and mixings) are computed as functions of θ and the quark mass m. Wherever applicable, the results are compared to those of chiral Lagrangians, finding agreement. In particular, we recover the Witten-Veneziano formula in the small x → 0 limit, we compute the θ dependence of the pion mass, and we derive the hyperscaling relation for the topological susceptibility in the conformal window in terms of the quark mass.
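
    For orientation, the Witten-Veneziano formula recovered above reads, in its standard chiral-limit form (quoted here as background; the paper's exact conventions may differ):

```latex
m_{\eta'}^{2} \;=\; \frac{2 N_f}{f_\pi^{2}}\,\chi_{\mathrm{YM}}
```

    where χ_YM is the topological susceptibility of the pure Yang-Mills theory and f_π the pion decay constant; away from the chiral limit the left-hand side becomes the combination m_η'² + m_η² - 2m_K².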

  20. Overview of sports vision

    NASA Astrophysics Data System (ADS)

    Moore, Linda A.; Ferreira, Jannie T.

    2003-03-01

    Sports vision encompasses the visual assessment and provision of sports-specific visual performance enhancement and ocular protection for athletes of all ages, genders and levels of participation. In recent years, sports vision has been identified as one of the key performance indicators in sport. It is built on four main cornerstones: corrective eyewear, protective eyewear, visual skills enhancement and performance enhancement. Although clinically well established in the US, it is still a relatively new area of optometric specialisation elsewhere in the world and is gaining increasing popularity with eyecare practitioners and researchers. This research is often multi-disciplinary and involves input from a variety of subject disciplines, mainly those of optometry, medicine, physiology, psychology, physics, chemistry, computer science and engineering. Collaborative research projects are currently underway between staff of the Schools of Physics and Computing (DIT) and the Academy of Sports Vision (RAU).

  1. Automating quantum experiment control

    NASA Astrophysics Data System (ADS)

    Stevens, Kelly E.; Amini, Jason M.; Doret, S. Charles; Mohler, Greg; Volin, Curtis; Harter, Alexa W.

    2017-03-01

    The field of quantum information processing is rapidly advancing. As the control of quantum systems approaches the level needed for useful computation, the physical hardware underlying the quantum systems is becoming increasingly complex. It is already becoming impractical to manually code control for the larger hardware implementations. In this chapter, we will employ an approach to the problem of system control that parallels compiler design for a classical computer. We will start with a candidate quantum computing technology, the surface electrode ion trap, and build a system instruction language which can be generated from a simple machine-independent programming language via compilation. We incorporate compile time generation of ion routing that separates the algorithm description from the physical geometry of the hardware. Extending this approach to automatic routing at run time allows for automated initialization of qubit number and placement and additionally allows for automated recovery after catastrophic events such as qubit loss. To show that these systems can handle real hardware, we present a simple demonstration system that routes two ions around a multi-zone ion trap and handles ion loss and ion placement. While we will mainly use examples from transport-based ion trap quantum computing, many of the issues and solutions are applicable to other architectures.
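
    Automated routing of the kind described above can be illustrated as a shortest-path search over the trap's zone-connectivity graph. This toy sketch (zone names and geometry are hypothetical, not the authors' system) uses breadth-first search:

```python
from collections import deque

def route_ion(graph, start, goal):
    """Breadth-first search for a shortest transport path between trap
    zones; `graph` maps each zone to its adjacent zones."""
    prev = {start: None}
    queue = deque([start])
    while queue:
        zone = queue.popleft()
        if zone == goal:
            path = []
            while zone is not None:   # walk predecessors back to start
                path.append(zone)
                zone = prev[zone]
            return path[::-1]
        for nxt in graph[zone]:
            if nxt not in prev:
                prev[nxt] = zone
                queue.append(nxt)
    return None  # goal unreachable

# Hypothetical multi-zone trap: a loading zone, a junction, and two legs
trap = {
    "load": ["junction"],
    "junction": ["load", "legA", "legB"],
    "legA": ["junction"],
    "legB": ["junction"],
}
print(route_ion(trap, "load", "legB"))  # ['load', 'junction', 'legB']
```

    A real compiler must additionally schedule swaps and avoid collisions between co-routed ions, but the graph-search core is the same.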

  2. PREFACE: 2nd International Conference on Mathematical Modeling in Physical Sciences 2013 (IC-MSQUARE 2013)

    NASA Astrophysics Data System (ADS)

    2014-03-01

    The second International Conference on Mathematical Modeling in Physical Sciences (IC-MSQUARE) took place in Prague, Czech Republic, from Sunday 1 September to Thursday 5 September 2013. The Conference was attended by more than 280 participants, hosted about 400 oral, poster, and virtual presentations, and counted more than 600 pre-registered authors. The second IC-MSQUARE comprised diverse workshops and thus covered various research fields where Mathematical Modeling is used, such as Theoretical/Mathematical Physics, Neutrino Physics, Non-Integrable Systems, Dynamical Systems, Computational Nanoscience, Biological Physics, Computational Biomechanics, Complex Networks, Stochastic Modeling, Fractional Statistics, DNA Dynamics, and Macroeconomics. The scientific program was rather dense: after the Keynote and Invited Talks each morning, three parallel sessions ran every day. Nevertheless, according to all attendees, the program was excellent, with a high level of talks and a fruitful scientific environment, and all attendees had a creative time. We would like to thank the Keynote Speaker and the Invited Speakers for their significant contributions to IC-MSQUARE. We also thank the Members of the International Advisory and Scientific Committees as well as the Members of the Organizing Committee. Further information on the editors, speakers and committees is available in the attached pdf.

  3. Advanced Computing Tools and Models for Accelerator Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  4. What Does "Fast" Mean? Understanding the Physical World through Computational Representations

    ERIC Educational Resources Information Center

    Parnafes, Orit

    2007-01-01

    This article concerns the development of conceptual understanding of a physical phenomenon through the use of computational representations. It examines how students make sense of and interpret computational representations, and how their understanding of the represented physical phenomenon develops in this process. Eight studies were conducted,…

  5. Television screen time, but not computer use and reading time, is associated with cardio-metabolic biomarkers in a multiethnic Asian population: a cross-sectional study.

    PubMed

    Nang, Ei Ei Khaing; Salim, Agus; Wu, Yi; Tai, E Shyong; Lee, Jeannette; Van Dam, Rob M

    2013-05-30

    Recent evidence shows that sedentary behaviour may be an independent risk factor for cardiovascular diseases, diabetes, cancers and all-cause mortality. However, results are not consistent and different types of sedentary behaviour might have different effects on health. Thus the aim of this study was to evaluate the association between television screen time, computer/reading time and cardio-metabolic biomarkers in a multiethnic urban Asian population. We also sought to understand the potential mediators of this association. The Singapore Prospective Study Program (2004-2007) was a cross-sectional population-based study in a multiethnic population in Singapore. We studied 3305 Singaporean adults of Chinese, Malay and Indian ethnicity who did not have pre-existing diseases and conditions that could affect their physical activity. Multiple linear regression analysis was used to assess the association of television screen time and computer/reading time with cardio-metabolic biomarkers [blood pressure, lipids, glucose, adiponectin, C reactive protein and homeostasis model assessment of insulin resistance (HOMA-IR)]. Path analysis was used to examine the role of mediators of the observed association. Longer television screen time was significantly associated with higher systolic blood pressure, total cholesterol, triglycerides, C reactive protein, HOMA-IR, and lower adiponectin after adjustment for potential socio-demographic and lifestyle confounders. Dietary factors and body mass index, but not physical activity, were potential mediators that explained most of these associations between television screen time and cardio-metabolic biomarkers. The associations of television screen time with triglycerides and HOMA-IR were only partly explained by dietary factors and body mass index. No association was observed between computer/reading time and worse levels of cardio-metabolic biomarkers.
In this urban Asian population, television screen time was associated with worse levels of various cardio-metabolic risk factors. This may reflect detrimental effects of television screen time on dietary habits rather than replacement of physical activity.

  6. A high-resolution physically-based global flood hazard map

    NASA Astrophysics Data System (ADS)

    Kaheil, Y.; Begnudelli, L.; McCollum, J.

    2016-12-01

    We present the results from a physically-based global flood hazard model. The model uses a physically-based hydrologic model to simulate river discharges, and a 2D hydrodynamic model to simulate inundation. The model is set up such that it allows large-scale flood hazard assessment through efficient use of parallel computing. For hydrology, we use the Hillslope River Routing (HRR) model. HRR accounts for surface hydrology using Green-Ampt parameterization. The model is calibrated against observed discharge data from the Global Runoff Data Centre (GRDC) network, among other publicly-available datasets. The parallel-computing framework takes advantage of the river network structure to minimize cross-processor messages, and thus significantly increases computational efficiency. For inundation, we implemented a computationally-efficient 2D finite-volume model with wetting/drying. The approach consists of simulating flooding along the river network by forcing the hydraulic model with the streamflow hydrographs simulated by HRR, scaled up to certain return levels, e.g. 100 years. The model is distributed such that each available processor takes the next simulation. Given an approximate cost criterion, the simulations are ordered from most-demanding to least-demanding to ensure that all processors finalize almost simultaneously. Upon completing all simulations, the maximum envelope of flood depth is taken to generate the final map. The model is applied globally, with selected results shown from different continents and regions. The maps shown depict flood depth and extent at different return periods. These maps, which are currently available at 3 arc-sec resolution (~90 m), can be made available at higher resolutions where high-resolution DEMs are available.
The maps can be utilized by flood risk managers at the national, regional, and even local levels to further understand their flood risk exposure, exercise certain measures of mitigation, and/or transfer the residual risk financially through flood insurance programs.
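
    Ordering simulations from most- to least-demanding and always handing the next one to the first free processor, as described above, is essentially longest-processing-time-first greedy scheduling. A small sketch with hypothetical per-simulation costs:

```python
import heapq

def schedule_lpt(job_costs, n_workers):
    """Greedy longest-processing-time-first scheduling: sort jobs by
    estimated cost (descending) and always hand the next job to the
    least-loaded worker. Returns the jobs assigned to each worker."""
    loads = [(0.0, w) for w in range(n_workers)]  # (current load, worker id)
    heapq.heapify(loads)
    assignment = {w: [] for w in range(n_workers)}
    for cost in sorted(job_costs, reverse=True):
        load, w = heapq.heappop(loads)       # least-loaded worker
        assignment[w].append(cost)
        heapq.heappush(loads, (load + cost, w))
    return assignment

# Hypothetical per-reach simulation costs (arbitrary units)
costs = [9, 7, 6, 5, 4, 3, 2]
plan = schedule_lpt(costs, 3)
print(sorted(sum(jobs) for jobs in plan.values()))  # [11, 12, 13]
```

    The near-equal per-worker totals are the point: all processors finish at almost the same time, which is the load-balancing behavior the abstract describes.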

  7. The Impact of IBM Cell Technology on the Programming Paradigm in the Context of Computer Systems for Climate and Weather Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Shujia; Duffy, Daniel; Clune, Thomas

    The call for ever-increasing model resolutions and physical processes in climate and weather models demands a continual increase in computing power. The IBM Cell processor's order-of-magnitude peak performance increase over conventional processors makes it very attractive to fulfill this requirement. However, the Cell's characteristics, 256KB local memory per SPE and the new low-level communication mechanism, make it very challenging to port an application. As a trial, we selected the solar radiation component of the NASA GEOS-5 climate model, which: (1) is representative of column physics components (half the total computational time), (2) has an extremely high computational intensity: the ratio of computational load to main memory transfers, and (3) exhibits embarrassingly parallel column computations. In this paper, we converted the baseline code (single-precision Fortran) to C and ported it to an IBM BladeCenter QS20. For performance, we manually SIMDize four independent columns and include several unrolling optimizations. Our results show that when compared with the baseline implementation running on one core of Intel's Xeon Woodcrest, Dempsey, and Itanium2, the Cell is approximately 8.8x, 11.6x, and 12.8x faster, respectively. Our preliminary analysis shows that the Cell can also accelerate the dynamics component (~25% of total computational time). We believe these dramatic performance improvements make the Cell processor very competitive as an accelerator.

  8. High-energy physics software parallelization using database techniques

    NASA Astrophysics Data System (ADS)

    Argante, E.; van der Stok, P. D. V.; Willers, I.

    1997-02-01

    A programming model for software parallelization, called CoCa, is introduced that copes with problems caused by typical features of high-energy physics software. By basing CoCa on the database transaction paradigm, the complexity induced by the parallelization is largely transparent to the programmer, resulting in a higher level of abstraction than the native message passing software. CoCa is implemented on a Meiko CS-2 and on a SUN SPARCcenter 2000 parallel computer. On the CS-2, the performance is comparable with the performance of native PVM and MPI.

  9. Mesoscale Computational Investigation of Shocked Heterogeneous Materials with Application to Large Impact Craters

    NASA Technical Reports Server (NTRS)

    Crawford, D. A.; Barnouin-Jha, O. S.; Cintala, M. J.

    2003-01-01

    The propagation of shock waves through target materials is strongly influenced by the presence of small-scale structure, fractures, physical and chemical heterogeneities. Pre-existing fractures often create craters that appear square in outline (e.g. Meteor Crater). Reverberations behind the shock from the presence of physical heterogeneity have been proposed as a mechanism for transient weakening of target materials. Pre-existing fractures can also affect melt generation. In this study, we are attempting to bridge the gap in numerical modeling between the micro-scale and the continuum, the so-called meso-scale. To accomplish this, we are developing a methodology to be used in the shock physics hydrocode (CTH) using Monte-Carlo-type methods to investigate the shock properties of heterogeneous materials. By comparing the results of numerical experiments at the micro-scale with experimental results and by using statistical techniques to evaluate the performance of simple constitutive models, we hope to embed the effect of physical heterogeneity into the field variables (pressure, stress, density, velocity) allowing us to directly imprint the effects of micro-scale heterogeneity at the continuum level without incurring high computational cost.

  10. Call to Adopt a Nominal Set of Astrophysical Parameters and Constants to Improve the Accuracy of Fundamental Physical Properties of Stars

    NASA Astrophysics Data System (ADS)

    Harmanec, Petr; Prša, Andrej

    2011-08-01

    The increasing precision of astronomical observations of stars and stellar systems is gradually getting to a level where the use of slightly different values of the solar mass, radius, and luminosity, as well as different values of fundamental physical constants, can lead to measurable systematic differences in the determination of basic physical properties. An equivalent issue with an inconsistent value of the speed of light was resolved by adopting a nominal value that is constant and has no error associated with it. Analogously, we suggest that the systematic error in stellar parameters may be eliminated by (1) replacing the solar radius R⊙ and luminosity L⊙ by the nominal values that are by definition exact and expressed in SI units: and ; (2) computing stellar masses in terms of M⊙ by noting that the measurement error of the product GM⊙ is 5 orders of magnitude smaller than the error in G; (3) computing stellar masses and temperatures in SI units by using the derived values and ; and (4) clearly stating the reference for the values of the fundamental physical constants used. We discuss the need and demonstrate the advantages of such a paradigm shift.
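
    Point (2) above can be made concrete: dividing a measured G·M product by the nominal G·M⊙ cancels G entirely, so the mass ratio inherits the tiny uncertainty of the G·M measurements rather than the large uncertainty of G. A sketch using commonly quoted values (both constants below are assumptions of this example, not values taken from the paper):

```python
# Heliocentric gravitational constant (the product G*M_sun) is measured to
# ~10 significant figures, whereas G alone is known to only ~5.
GM_SUN = 1.32712440018e20   # m^3 s^-2 (commonly quoted value; assumed here)
G      = 6.67430e-11        # m^3 kg^-1 s^-2 (CODATA 2018; assumed here)

def mass_in_solar_units(gm_star):
    """Mass ratio M/M_sun from a measured product G*M, avoiding the
    large uncertainty of G entirely."""
    return gm_star / GM_SUN

m_sun_kg = GM_SUN / G   # the SI mass carries the full uncertainty of G
print(mass_in_solar_units(2 * GM_SUN))   # 2.0
print(f"{m_sun_kg:.4e}")                 # ~1.9884e+30 kg
```

    This is exactly why the authors recommend reporting masses in terms of M⊙ via G·M products and converting to SI only at the end.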

  11. Learning physical biology via modeling and simulation: A new course and textbook for science and engineering undergraduates

    NASA Astrophysics Data System (ADS)

    Nelson, Philip

    To a large extent, undergraduate physical-science curricula remain firmly rooted in pencil-and-paper calculation, despite the fact that most research is done with computers. To a large extent, undergraduate life-science curricula remain firmly rooted in descriptive approaches, despite the fact that much current research involves quantitative modeling. Not only does our pedagogy not reflect current reality; it also creates a spurious barrier between the fields, reinforcing the narrow silos that prevent students from connecting them. I'll describe an intermediate-level course on "Physical Models of Living Systems." The prerequisite is first-year university physics and calculus. The course is a response to rapidly growing interest among undergraduates in a broad range of science and engineering majors. Students acquire several research skills that are often not addressed in traditional undergraduate courses: basic modeling skills; probabilistic modeling skills; data analysis methods; computer programming using a general-purpose platform like MATLAB or Python; pulling datasets from the Web for analysis; data visualization; and dynamical systems, particularly feedback control. Partially supported by the NSF under Grants EF-0928048 and DMR-0832802.
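
    As an example of the probabilistic modeling skills listed above, a classic exercise in such a course is a Gillespie simulation of stochastic mRNA production and degradation (all rate parameters here are hypothetical, chosen so the analytic steady-state mean is 10):

```python
import random

def gillespie_birth_death(k_syn, k_deg, t_max, seed=1):
    """Gillespie stochastic simulation of constitutive mRNA production
    (rate k_syn) and first-order degradation (rate k_deg * n). Returns
    the time-averaged copy number, which should approach the analytic
    steady-state mean k_syn / k_deg."""
    rng = random.Random(seed)
    t, n = 0.0, 0
    weighted = 0.0              # integral of n(t) dt
    while t < t_max:
        total = k_syn + k_deg * n          # total reaction propensity
        dt = rng.expovariate(total)        # waiting time to next event
        weighted += n * min(dt, t_max - t)
        t += dt
        if t >= t_max:
            break
        if rng.random() < k_syn / total:   # birth vs. death
            n += 1
        else:
            n -= 1
    return weighted / t_max

avg = gillespie_birth_death(k_syn=10.0, k_deg=1.0, t_max=2000.0)
print(round(avg, 1))  # close to the analytic mean of 10
```

    Comparing the simulated distribution with the Poisson prediction is the kind of model-versus-data exercise the course emphasizes.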

  12. Modeling Materials: Design for Planetary Entry, Electric Aircraft, and Beyond

    NASA Technical Reports Server (NTRS)

    Thompson, Alexander; Lawson, John W.

    2014-01-01

    NASA missions push the limits of what is possible. The development of high-performance materials must keep pace with the agency's demanding, cutting-edge applications. Researchers at NASA's Ames Research Center are performing multiscale computational modeling to accelerate development times and further the design of next-generation aerospace materials. Multiscale modeling combines several computationally intensive techniques ranging from the atomic level to the macroscale, passing output from one level as input to the next level. These methods are applicable to a wide variety of materials systems. For example: (a) Ultra-high-temperature ceramics for hypersonic aircraft: we utilized the full range of multiscale modeling to characterize thermal protection materials for faster, safer air- and spacecraft. (b) Planetary entry heat shields for space vehicles: we computed thermal and mechanical properties of ablative composites by combining several methods, from atomistic simulations to macroscale computations. (c) Advanced batteries for electric aircraft: we performed large-scale molecular dynamics simulations of advanced electrolytes for ultra-high-energy-capacity batteries to enable long-distance electric aircraft service. (d) Shape-memory alloys for high-efficiency aircraft: we used high-fidelity electronic structure calculations to determine phase diagrams in shape-memory transformations. Advances in high-performance computing have been critical to the development of multiscale materials modeling. We used nearly one million processor hours on NASA's Pleiades supercomputer to characterize electrolytes with a fidelity that would be otherwise impossible. For this and other projects, Pleiades enables us to push the physics and accuracy of our calculations to new levels.

  13. A cloud physics investigation utilizing Skylab data

    NASA Technical Reports Server (NTRS)

    Alishouse, J.; Jacobowitz, H.; Wark, D. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. The Lowtran 2 program, the S191 spectral response, and the solar spectrum were used to compute the expected absorption by the 2.0 micron band for a variety of cloud pressure levels and solar zenith angles. Analysis of the three long-wavelength data channels continued; it was found necessary to impose a minimum radiance criterion. It was also found necessary to modify the computer program to permit the computation of mean values and standard deviations for selected subsets of data on a given tape. A technique for computing the integrated absorption in the A band was devised. The technique normalizes the relative maximum at approximately 0.78 micron to the solar irradiance curve and then adjusts the relative maximum at approximately 0.74 micron to fit the solar curve.

  14. Automated Ecological Assessment of Physical Activity: Advancing Direct Observation.

    PubMed

    Carlson, Jordan A; Liu, Bo; Sallis, James F; Kerr, Jacqueline; Hipp, J Aaron; Staggs, Vincent S; Papa, Amy; Dean, Kelsey; Vasconcelos, Nuno M

    2017-12-01

    Technological advances provide opportunities for automating direct observations of physical activity, which allow for continuous monitoring and feedback. This pilot study evaluated the initial validity of computer vision algorithms for ecological assessment of physical activity. The sample comprised 6630 seconds per camera (three cameras in total) of video capturing up to nine participants engaged in sitting, standing, walking, and jogging in an open outdoor space while wearing accelerometers. Computer vision algorithms were developed to assess the number and proportion of people in sedentary, light, moderate, and vigorous activity, and group-based metabolic equivalents of tasks (MET)-minutes. Means and standard deviations (SD) of bias/difference values, and intraclass correlation coefficients (ICC) assessed the criterion validity compared to accelerometry separately for each camera. The number and proportion of participants sedentary and in moderate-to-vigorous physical activity (MVPA) had small biases (within 20% of the criterion mean) and the ICCs were excellent (0.82-0.98). Total MET-minutes were slightly underestimated by 9.3-17.1% and the ICCs were good (0.68-0.79). The standard deviations of the bias estimates were moderate-to-large relative to the means. The computer vision algorithms appeared to have acceptable sample-level validity (i.e., across a sample of time intervals) and are promising for automated ecological assessment of activity in open outdoor settings, but further development and testing is needed before such tools can be used in a diverse range of settings.
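
    The group MET-minutes metric above can be sketched as a simple aggregation of per-second, per-person intensity labels. The MET value assigned to each category below is an illustrative assumption, not the study's calibration:

```python
# Representative MET values per intensity category (assumed for this
# sketch; the study's exact assignment is not specified here).
MET = {"sedentary": 1.3, "light": 2.0, "moderate": 4.0, "vigorous": 8.0}

def group_met_minutes(frames, seconds_per_frame=1.0):
    """Aggregate per-second activity classifications (one list of labels
    per observed frame, one label per person seen) into group MET-minutes."""
    total = 0.0
    for labels in frames:
        for label in labels:
            total += MET[label] * seconds_per_frame / 60.0
    return total

# Hypothetical 3 seconds of video with two people in frame
frames = [
    ["sedentary", "moderate"],
    ["sedentary", "moderate"],
    ["light", "vigorous"],
]
print(round(group_met_minutes(frames), 3))  # 0.343
```

    Summing person-level intensities frame by frame is what makes the metric "group-based": it scales with both how many people are visible and how hard they are moving.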

  15. Quantum and semiclassical spin networks: from atomic and molecular physics to quantum computing and gravity

    NASA Astrophysics Data System (ADS)

    Aquilanti, Vincenzo; Bitencourt, Ana Carla P.; Ferreira, Cristiane da S.; Marzuoli, Annalisa; Ragni, Mirco

    2008-11-01

    The mathematical apparatus of quantum-mechanical angular momentum (re)coupling, developed originally to describe spectroscopic phenomena in atomic, molecular, optical and nuclear physics, is embedded in modern algebraic settings which emphasize the underlying combinatorial aspects. SU(2) recoupling theory, involving Wigner's 3nj symbols, as well as the related problems of their calculations, general properties, asymptotic limits for large entries, nowadays plays a prominent role also in quantum gravity and quantum computing applications. We refer to the ingredients of this theory—and of its extension to other Lie and quantum groups—by using the collective term of 'spin networks'. Recent progress is recorded about the already established connections with the mathematical theory of discrete orthogonal polynomials (the so-called Askey scheme), providing powerful tools based on asymptotic expansions, which correspond on the physical side to various levels of semi-classical limits. These results are useful not only in theoretical molecular physics but also in motivating algorithms for the computationally demanding problems of molecular dynamics and chemical reaction theory, where large angular momenta are typically involved. As for quantum chemistry, applications of these techniques include selection and classification of complete orthogonal basis sets in atomic and molecular problems, either in configuration space (Sturmian orbitals) or in momentum space. In this paper, we list and discuss some aspects of these developments—such as for instance the hyperquantization algorithm—as well as a few applications to quantum gravity and topology, thus providing evidence of a unifying background structure.
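
    As a concrete taste of the recoupling calculations discussed above, the Wigner 3j symbol can be evaluated directly from the Racah single-sum formula. A sketch restricted to integer angular momenta (half-integer cases would need rational bookkeeping):

```python
from math import factorial, sqrt

def wigner_3j(j1, j2, j3, m1, m2, m3):
    """Wigner 3j symbol via the Racah formula (integer arguments only
    in this sketch)."""
    if m1 + m2 + m3 != 0:
        return 0.0
    if not (abs(j1 - j2) <= j3 <= j1 + j2):
        return 0.0  # triangle inequality violated
    if abs(m1) > j1 or abs(m2) > j2 or abs(m3) > j3:
        return 0.0
    f = factorial
    delta = sqrt(f(j1 + j2 - j3) * f(j1 - j2 + j3) * f(-j1 + j2 + j3)
                 / f(j1 + j2 + j3 + 1))
    pref = sqrt(f(j1 + m1) * f(j1 - m1) * f(j2 + m2) * f(j2 - m2)
                * f(j3 + m3) * f(j3 - m3))
    total = 0.0
    k_min = max(0, j2 - j3 - m1, j1 - j3 + m2)
    k_max = min(j1 + j2 - j3, j1 - m1, j2 + m2)
    for k in range(k_min, k_max + 1):
        total += (-1) ** k / (f(k) * f(j1 + j2 - j3 - k) * f(j1 - m1 - k)
                              * f(j2 + m2 - k) * f(j3 - j2 + m1 + k)
                              * f(j3 - j1 - m2 + k))
    return (-1) ** (j1 - j2 - m3) * delta * pref * total

# Closed-form check: (j j 0; m -m 0) = (-1)^(j-m) / sqrt(2j+1)
print(round(wigner_3j(1, 1, 0, 0, 0, 0), 6))  # -0.57735
```

    The factorial sums grow quickly, which is precisely why the asymptotic and semiclassical expansions the abstract mentions matter for large angular momenta.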

  16. Automated Ecological Assessment of Physical Activity: Advancing Direct Observation

    PubMed Central

    Carlson, Jordan A.; Liu, Bo; Sallis, James F.; Kerr, Jacqueline; Papa, Amy; Dean, Kelsey; Vasconcelos, Nuno M.

    2017-01-01

    Technological advances provide opportunities for automating direct observations of physical activity, which allow for continuous monitoring and feedback. This pilot study evaluated the initial validity of computer vision algorithms for ecological assessment of physical activity. The sample comprised 6630 seconds of video per camera (three cameras in total) capturing up to nine participants engaged in sitting, standing, walking, and jogging in an open outdoor space while wearing accelerometers. Computer vision algorithms were developed to assess the number and proportion of people in sedentary, light, moderate, and vigorous activity, and group-based metabolic equivalents of tasks (MET)-minutes. Means and standard deviations (SDs) of bias/difference values, and intraclass correlation coefficients (ICCs), assessed the criterion validity compared to accelerometry, separately for each camera. The number and proportion of participants sedentary and in moderate-to-vigorous physical activity (MVPA) had small biases (within 20% of the criterion mean), and the ICCs were excellent (0.82–0.98). Total MET-minutes were slightly underestimated by 9.3–17.1%, and the ICCs were good (0.68–0.79). The standard deviations of the bias estimates were moderate-to-large relative to the means. The computer vision algorithms appeared to have acceptable sample-level validity (i.e., across a sample of time intervals) and are promising for automated ecological assessment of activity in open outdoor settings, but further development and testing are needed before such tools can be used in a diverse range of settings. PMID:29194358
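
    The agreement statistics reported above can be reproduced with little code. The sketch below implements the two-way random-effects, absolute-agreement, single-measure intraclass correlation (ICC(2,1) in the McGraw and Wong convention) on hypothetical interval-level data; the abstract does not specify which ICC form was used, so this form is an assumption.

```python
def icc_2_1(data):
    """ICC(2,1): two-way random-effects, absolute-agreement, single-measure
    intraclass correlation. `data` is a list of rows, one per observation
    interval, with one column per measurement method (e.g. camera vs.
    accelerometer)."""
    n = len(data)       # number of intervals (subjects)
    k = len(data[0])    # number of methods (raters)
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(row[j] for row in data) / n for j in range(k)]
    # Two-way ANOVA decomposition: rows (intervals), columns (methods), error
    ssr = k * sum((m - grand) ** 2 for m in row_means)
    ssc = n * sum((m - grand) ** 2 for m in col_means)
    sst = sum((x - grand) ** 2 for row in data for x in row)
    sse = sst - ssr - ssc
    msr = ssr / (n - 1)
    msc = ssc / (k - 1)
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Perfect agreement -> ICC = 1; a constant between-method offset lowers it,
# because ICC(2,1) penalizes absolute disagreement, not just inconsistency.
print(icc_2_1([[1, 1], [2, 2], [3, 3], [4, 4]]))  # 1.0
print(icc_2_1([[1, 2], [2, 3], [3, 4], [4, 5]]))  # ~0.769
```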

  17. Architecture Framework for Trapped-Ion Quantum Computer based on Performance Simulation Tool

    NASA Astrophysics Data System (ADS)

    Ahsan, Muhammad

    The challenge of building a scalable quantum computer lies in striking an appropriate balance between designing a reliable system architecture from a large number of faulty computational resources and improving the physical quality of the system components. A detailed investigation of how performance varies with the physics of the components and with the system architecture requires an adequate performance simulation tool. In this thesis we demonstrate a software tool capable of (1) mapping and scheduling a quantum circuit onto a realistic quantum hardware architecture with physical resource constraints, (2) evaluating performance metrics such as the execution time and the success probability of the algorithm execution, and (3) analyzing the constituents of these metrics and visualizing resource utilization to identify the system components that crucially define the overall performance. Using this versatile tool, we explore the vast design space for a modular quantum computer architecture based on trapped ions. We find that while the success probability is uniformly determined by the fidelity of the physical quantum operations, the execution time is a function of the system resources invested at various layers of the design hierarchy. At the physical level, the number of lasers performing quantum gates impacts the latency of fault-tolerant circuit block execution. When these blocks are used to construct meaningful arithmetic circuits such as quantum adders, the number of ancilla qubits for complicated non-Clifford gates and the entanglement resources needed to establish long-distance communication channels become the major performance-limiting factors. Next, in order to factorize large integers, these adders are assembled into the modular exponentiation circuit that comprises the bulk of Shor's algorithm. At this stage, the overall scaling of resource-constrained performance with problem size describes the effectiveness of the chosen design. By matching the resource investment with the pace of advancement in hardware technology, we find optimal designs for different types of quantum adders. Finally, we show that a 2,048-bit Shor's algorithm can be reliably executed within a resource budget of 1.5 million qubits.
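
    The distinction drawn above (success probability set by gate fidelity; execution time set by invested resources such as laser count) can be illustrated with a deliberately simplified toy model. All numbers below, including layer sizes, fidelity, and laser counts, are hypothetical and not taken from the thesis.

```python
import math

def schedule_depth(gates_per_layer, num_lasers):
    """Toy scheduler: gates within a layer could run in parallel, but are
    serialized when there are fewer lasers than gates in the layer."""
    return sum(math.ceil(g / num_lasers) for g in gates_per_layer)

def success_probability(total_gates, gate_fidelity):
    # Independent-error model: success depends on fidelity alone,
    # not on how the gates were scheduled.
    return gate_fidelity ** total_gates

layers = [8, 8, 4, 6]          # hypothetical parallel-gate counts per layer
total = sum(layers)            # 26 physical gates in total
print(success_probability(total, 0.999))   # ~0.974, unchanged by laser count
print(schedule_depth(layers, 2))           # 13 gate times with 2 lasers
print(schedule_depth(layers, 8))           # 4 gate times with 8 lasers
```

    Even in this crude model the thesis's observation is visible: quadrupling the laser budget cuts latency by more than 3x while leaving the success probability untouched.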

  18. Robust fault diagnosis of physical systems in operation. Ph.D. Thesis - Rutgers - The State Univ.

    NASA Technical Reports Server (NTRS)

    Abbott, Kathy Hamilton

    1991-01-01

    Ideas are presented and demonstrated for improved robustness in diagnostic problem solving of complex physical systems in operation, or operative diagnosis. The first idea is that graceful degradation can be viewed as reasoning at higher levels of abstraction whenever the more detailed levels prove to be incomplete or inadequate. A form of abstraction is defined that applies this view to the problem of diagnosis. In this form of abstraction, named status abstraction, two levels are defined. The lower level of abstraction corresponds to the level of detail at which most current knowledge-based diagnosis systems reason. At the higher level, a graph representation is presented that describes the real-world physical system. An incremental, constructive approach to manipulating this graph representation is demonstrated that supports certain characteristics of operative diagnosis. The suitability of this constructive approach is shown for diagnosing fault propagation behavior over time, and for sometimes diagnosing systems with feedback. A way is shown to represent different semantics in the same type of graph representation to characterize different types of fault propagation behavior. An approach is demonstrated that treats these different behaviors as different fault classes, moving to other classes when previous classes fail to generate suitable hypotheses. These ideas are implemented in a computer program named Draphys (Diagnostic Reasoning About Physical Systems) and demonstrated for the domain of in-flight aircraft subsystems, specifically a propulsion system (containing two turbofan systems and a fuel system) and a hydraulic subsystem.

  19. Computational Physics' Greatest Hits

    NASA Astrophysics Data System (ADS)

    Bug, Amy

    2011-03-01

    The digital computer has worked its way so effectively into our profession that now, roughly 65 years after its invention, it is virtually impossible to find a field of experimental or theoretical physics unaided by computational innovation. It is tough to think of another device about which one can make that claim. In the session ``What is computational physics?'' speakers will distinguish computation within the field of computational physics from this ubiquitous importance across all subfields of physics. This talk will recap the invited session ``Great Advances...Past, Present and Future'' in which five dramatic areas of discovery (five of our ``greatest hits'') are chronicled: the physics of many-boson systems via Path Integral Monte Carlo, the thermodynamic behavior of a huge number of diverse systems via Monte Carlo methods, the discovery of new pharmaceutical agents via molecular dynamics, predictive simulations of global climate change via detailed, cross-disciplinary Earth system models, and an understanding of the formation of the first structures in our universe via galaxy formation simulations. The talk will also identify ``greatest hits'' in our field from the teaching and research perspectives of other members of DCOMP, including its Executive Committee.

  20. Launching applications on compute and service processors running under different operating systems in scalable network of processor boards with routers

    DOEpatents

    Tomkins, James L [Albuquerque, NM; Camp, William J [Albuquerque, NM

    2009-03-17

    A multiple processor computing apparatus includes a physical interconnect structure that is flexibly configurable to support selective segregation of classified and unclassified users. The physical interconnect structure also permits easy physical scalability of the computing apparatus. The computing apparatus can include an emulator which permits applications from the same job to be launched on processors that use different operating systems.

  1. Perceived benefits and barriers to exercise among persons with physical disabilities or chronic health conditions within action or maintenance stages of exercise.

    PubMed

    Malone, Laurie A; Barfield, J P; Brasher, Joel D

    2012-10-01

    Information regarding factors that affect the initial step to exercise behavior change among persons with physical disabilities or chronic health conditions is available in the literature but much less is known regarding perceived benefits and barriers to exercise among those who are regularly active. The purpose of this study was to examine the perceived benefits and barriers to exercise among persons with physical disabilities or chronic health conditions within action or maintenance stages of exercise. Participants (n = 152) completed the Exercise Benefits and Barriers Scale (EBBS). For data analyses, disabilities and health conditions were grouped as neuromuscular, orthopedic, cardiovascular/pulmonary, or multiple conditions. Multivariate analysis of variance (MANOVA) was conducted to determine if mean differences on EBBS benefits and barriers scores existed among disability types, between sexes, among age groups, and between physical activity levels. Sum scores were computed to determine the strongest benefit and barrier responses. No significant mean differences in EBBS scores were found between disability types, sexes, age groups, or physical activity levels (p > 0.05). Strongest benefit responses varied by group. Strongest barrier responses were the same for all demographic groups: "Exercise tires me," "Exercise is hard work for me," and "I am fatigued by exercise." EBBS scores were similar across disability/health condition, sex, age, and physical activity level. Primary benefits reported were in the areas of improved physical performance and psychological outlook whereas the primary barriers were in the area of physical exertion. Copyright © 2012 Elsevier Inc. All rights reserved.

  2. Physical Impairment

    NASA Astrophysics Data System (ADS)

    Trewin, Shari

    Many health conditions can lead to physical impairments that impact computer and Web access. Musculoskeletal conditions such as arthritis and cumulative trauma disorders can make movement stiff and painful. Movement disorders such as tremor, Parkinsonism and dystonia affect the ability to control movement, or to prevent unwanted movements. Often, the same underlying health condition also has sensory or cognitive effects. People with dexterity impairments may use a standard keyboard and mouse, or any of a wide range of alternative input mechanisms. Examples are given of the diverse ways that specific dexterity impairments and input mechanisms affect the fundamental actions of Web browsing. As the Web becomes increasingly sophisticated and physically demanding, new access features at the Web browser and page level will be necessary.

  3. RIKEN BNL Research Center

    NASA Astrophysics Data System (ADS)

    Samios, Nicholas

    2014-09-01

    Since its inception in 1997, the RIKEN BNL Research Center (RBRC) has been a major force in the realms of spin physics, relativistic heavy-ion physics, and large-scale computational physics, and in the training of a new generation of extremely talented physicists. This has been accomplished through the recruitment of an outstanding non-permanent staff of Fellows and Research Associates in theory and experiment. RBRC is now a mature organization that has reached a steady level in the size of its scientific and support staff while at the same time retaining its vibrant youth. A brief history of the scientific accomplishments and contributions of the RBRC physicists will be presented, as well as a discussion of the unique RBRC management structure.

  4. Kinetic Theory and Simulation of Single-Channel Water Transport

    NASA Astrophysics Data System (ADS)

    Tajkhorshid, Emad; Zhu, Fangqiang; Schulten, Klaus

    Water translocation between various compartments of a system is a fundamental process in the biology of all living cells and in a wide variety of technological problems. The process is of interest in different fields of physiology, physical chemistry, and physics, and many scientists have tried to describe it through physical models. Owing to advances in the computer simulation of molecular processes at an atomic level, water transport has been studied in a variety of molecular systems, ranging from biological water channels to artificial nanotubes. While simulations have successfully described various kinetic aspects of water transport, offering a simple, unified model of trans-channel water translocation has turned out to be a nontrivial task.

  5. Randomness in quantum mechanics: philosophy, physics and technology.

    PubMed

    Bera, Manabendra Nath; Acín, Antonio; Kuś, Marek; Mitchell, Morgan W; Lewenstein, Maciej

    2017-12-01

    This progress report covers recent developments in the area of quantum randomness, which is an extraordinarily interdisciplinary area that belongs not only to physics, but also to philosophy, mathematics, computer science, and technology. For this reason the article contains three parts that will be essentially devoted to different aspects of quantum randomness, and even directed, although not restricted, to various audiences: a philosophical part, a physical part, and a technological part. For these reasons the article is written on an elementary level, combining simple and non-technical descriptions with a concise review of more advanced results. In this way, readers of various backgrounds will each be able to gain something from reading the article.

  6. Randomness in quantum mechanics: philosophy, physics and technology

    NASA Astrophysics Data System (ADS)

    Nath Bera, Manabendra; Acín, Antonio; Kuś, Marek; Mitchell, Morgan W.; Lewenstein, Maciej

    2017-12-01

    This progress report covers recent developments in the area of quantum randomness, which is an extraordinarily interdisciplinary area that belongs not only to physics, but also to philosophy, mathematics, computer science, and technology. For this reason the article contains three parts that will be essentially devoted to different aspects of quantum randomness, and even directed, although not restricted, to various audiences: a philosophical part, a physical part, and a technological part. For these reasons the article is written on an elementary level, combining simple and non-technical descriptions with a concise review of more advanced results. In this way, readers of various backgrounds will each be able to gain something from reading the article.

  7. Design and Application of Interactive Simulations in Problem-Solving in University-Level Physics Education

    ERIC Educational Resources Information Center

    Ceberio, Mikel; Almudí, José Manuel; Franco, Ángel

    2016-01-01

    In recent years, interactive computer simulations have been progressively integrated in the teaching of the sciences and have contributed significant improvements in the teaching-learning process. Practicing problem-solving is a key factor in science and engineering education. The aim of this study was to design simulation-based problem-solving…

  8. Introducing a "Means-End" Approach to Human-Computer Interaction: Why Users Choose Particular Web Sites Over Others.

    ERIC Educational Resources Information Center

    Subramony, Deepak Prem

    Gutman's means-end theory, widely used in market research, identifies three levels of abstraction: attributes, consequences, and values--associated with the use of products, representing the process by which physical attributes of products gain personal meaning for users. The primary methodological manifestation of means-end theory is the…

  9. Jumpstarting Jill: Strategies to Nurture Talented Girls in Your Science Classroom

    ERIC Educational Resources Information Center

    Heilbronner, Nancy N.

    2008-01-01

    Women are making progress in many areas of science, but a gender gap still remains, especially in physics, computer science, and engineering, and at advanced levels of academic and career achievement. Today's teachers can help narrow this gap by instilling a love for science in their female students and by helping them to understand and develop…

  10. Women of Science, Technology, Engineering, and Mathematics: A Qualitative Exploration into Factors of Success

    ERIC Educational Resources Information Center

    Olund, Jeanine K.

    2012-01-01

    Although the number of women entering science, technology, engineering, and mathematics (STEM) disciplines has increased in recent years, overall there are still more men than women completing four-year degrees in these fields, especially in physics, engineering, and computer science. At higher levels of education and within the workplace, the…

  11. Task Design and Skill Level Perceptions of Middle School Students toward Competition in Dance-Related Active Gaming

    ERIC Educational Resources Information Center

    Bernstein, Eve; Gibbone, Anne; Rukavina, Paul

    2015-01-01

    In this study, we drew upon McCaughtry, Tischler, and Flory's (2008) reconceptualized ecological framework to examine middle school students' perceptions (N = 391) of competition in physical education, specifically after participating in noncompetitive and competitive active gaming (AG) sessions. Chi-square tests of independence were computed on…

  12. Real-Time Assessment of Problem-Solving of Physics Students Using Computer-Based Technology

    ERIC Educational Resources Information Center

    Gok, Tolga

    2012-01-01

    The change in students' problem solving ability in upper-level course through the application of a technological interactive environment--Tablet PC running InkSurvey--was investigated in present study. Tablet PC/InkSurvey interactive technology allowing the instructor to receive real-time formative assessment as the class works through the problem…

  13. Report of the theory panel. [space physics

    NASA Technical Reports Server (NTRS)

    Ashour-Abdalla, Maha; Rosner, Robert; Antiochos, Spiro; Curtis, Steven; Fejer, B.; Goertz, Christoph K.; Goldstein, Melvyn L.; Holzer, Thomas E.; Jokipii, J. R.; Lee, Lou-Chuang

    1991-01-01

    The ultimate goal of this research is to develop an understanding which is sufficiently comprehensive to allow realistic predictions of the behavior of the physical systems. Theory has a central role to play in the quest for this understanding. The level of theoretical description is subject to three constraints: (1) the available computer hardware may limit both the number and the size of the physical processes the model system can describe; (2) some natural systems may only be described in a statistical manner; and (3) some natural systems may be observable only through remote sensing, which is intrinsically limited by spatial resolution and line-of-sight integration. The report then discusses present accomplishments and future goals of theoretical space physics. Finally, the development and use of new supercomputers is examined.

  14. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    NASA Astrophysics Data System (ADS)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in the computer, computing in the Earth sciences, multivariate data analysis, automated computation in quantum field theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round-table discussions on open source, knowledge sharing and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all of the workshop's activities. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF.

  15. Improved inland water levels from SAR altimetry using novel empirical and physical retrackers

    NASA Astrophysics Data System (ADS)

    Villadsen, Heidi; Deng, Xiaoli; Andersen, Ole B.; Stenseng, Lars; Nielsen, Karina; Knudsen, Per

    2016-06-01

    Satellite altimetry has proven a valuable source of information on river and lake levels where in situ data are sparse or non-existent. In this study, several new methods for obtaining stable inland water levels from CryoSat-2 Synthetic Aperture Radar (SAR) altimetry are presented and evaluated. In addition, the possible benefits from combining physical and empirical retrackers are investigated. The retracking methods evaluated in this paper include the physical SAR Altimetry MOde Studies and Applications (SAMOSA3) model, a traditional subwaveform threshold retracker, the proposed Multiple Waveform Persistent Peak (MWaPP) retracker, and a method combining the physical and empirical retrackers. Using a physical SAR waveform retracker over inland water has not been attempted before but shows great promise in this study. The evaluation is performed for two medium-sized lakes (Lake Vänern in Sweden and Lake Okeechobee in Florida), and in the Amazon River in Brazil. Comparing with in situ data shows that using the SAMOSA3 retracker generally provides the lowest root-mean-squared errors (RMSE), closely followed by the MWaPP retracker. For the empirical retrackers, the RMSE values obtained when comparing with in situ data in Lake Vänern and Lake Okeechobee are on the order of 2-5 cm for well-behaved waveforms. Combining the physical and empirical retrackers did not offer significantly improved mean track standard deviations or RMSEs. Based on these studies, it is suggested that future SAR-derived water levels be obtained using the SAMOSA3 retracker whenever information about physical properties other than range is desired. Otherwise we suggest using the empirical MWaPP retracker described in this paper, which is easy to implement and computationally efficient, and which gives a height estimate for even the most contaminated waveforms.
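
    The subwaveform threshold retracker mentioned above follows a standard recipe: estimate the noise floor from the early gates, set a retracking level at a fixed fraction of the peak amplitude above that floor, and locate the leading-edge crossing with sub-gate linear interpolation. The sketch below illustrates the idea on a synthetic waveform; the threshold fraction and noise-gate count are illustrative choices, not the paper's settings.

```python
def threshold_retrack(waveform, threshold=0.5, noise_gates=4):
    """Threshold retracker: return the (fractional) gate index where the
    leading edge first crosses `threshold` of the peak-above-noise power."""
    noise = sum(waveform[:noise_gates]) / noise_gates  # thermal noise estimate
    peak = max(waveform)
    level = noise + threshold * (peak - noise)         # retracking level
    for i in range(1, len(waveform)):
        if waveform[i] >= level > waveform[i - 1]:
            # Linear sub-gate interpolation between gates i-1 and i
            return (i - 1) + (level - waveform[i - 1]) / (waveform[i] - waveform[i - 1])
    return None  # no leading-edge crossing found

# Synthetic waveform: flat noise floor, then a linear leading edge rising
# from gate 9 (power 0) to the peak (power 100) at gate 19.
wf = [0.0] * 10 + [10.0 * g for g in range(1, 11)] + [100.0] * 5
print(threshold_retrack(wf))  # 14.0: the 50% power point of the ramp
```

    In practice the retracked gate is converted to a range correction using the tracker reference gate and the gate width; that bookkeeping is omitted here.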

  16. Modellus: Learning Physics with Mathematical Modelling

    NASA Astrophysics Data System (ADS)

    Teodoro, Vitor

    Computers are now a major tool in research and development in almost all scientific and technological fields. Despite recent developments, this is far from true for learning environments in schools and most undergraduate studies. This thesis proposes a framework for designing curricula where computers, and computer modelling in particular, are a major tool for learning. The framework, based on research on learning science and mathematics and on computer user interfaces, assumes that: 1) learning is an active process of creating meaning from representations; 2) learning takes place in a community of practice where students learn both from their own effort and from external guidance; 3) learning is a process of becoming familiar with concepts, with links between concepts, and with representations; 4) direct manipulation user interfaces allow students to explore concrete-abstract objects such as those of physics and can be used by students with minimal computer knowledge. Physics is the science of constructing models and explanations about the physical world. Mathematical models are an important type of model that many students find difficult. These difficulties can be rooted in the fact that most students do not have an environment where they can explore functions, differential equations and iterations as primary objects that model physical phenomena--as objects-to-think-with, reifying the formal objects of physics. The framework proposes that students should be introduced to modelling at a very early stage of learning physics and mathematics, two scientific areas that must be taught in a closely related way, as they developed together from the time of Galileo and Newton until the beginning of our century, before the rise of overspecialisation in science. At an early stage, functions are the main type of object used to model real phenomena, such as motions. At a later stage, rates of change and equations with rates of change play an important role.
    These equations--differential equations--are the most important mathematical objects used for modelling natural phenomena. In traditional approaches, they are introduced only at an advanced level, because it takes a long time for students to be introduced to the fundamental principles of Calculus. With the newly proposed approach, rates of change can be introduced at early stages of learning if teachers stress semi-quantitative reasoning and use adequate computer tools. This thesis also presents Modellus, a computer tool for modelling and experimentation. This tool has a user interface that allows students to start doing meaningful conceptual and empirical experiments without the need to learn new syntax, as is usual with established tools. The different steps in the process of constructing and exploring models can be carried out with Modellus, from both physical and mathematical points of view. Modellus activities show how mathematics and physics have a unity that is very difficult to see with traditional approaches. Mathematical models are treated as concrete-abstract objects: concrete in the sense that they can be manipulated directly with a computer, and abstract in the sense that they are representations of relations between variables. Data gathered from two case studies, one with secondary school students and another with first-year undergraduate students, support the main ideas of the thesis. Data gathered from teachers (from colleges and secondary schools), mainly through a structured email questionnaire, also show that teachers agree on the potential of modelling in the learning of physics (and mathematics) and on the most important aspects of the proposed framework for integrating modelling as an essential component of the curriculum. Schools, as all institutions, change at a very slow rate. There are a multitude of reasons for this.
    Traditional curricula, where the emphasis is on rote learning of facts, can only be changed if schools have access to new and powerful views of learning and to new tools that support meaningful conceptual learning and are as common and easy to use as pencil and paper.
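
    The kind of rate-of-change model advocated above can be made concrete with a few lines of numerical stepping (Euler's method), which is essentially what a Modellus motion model does under the hood: the student specifies the rates, and the tool integrates them. The parameter values below are illustrative.

```python
# Euler stepping of a falling body: position and velocity evolve from their
# rates of change, the semi-quantitative core of a Modellus-style model.
g = 9.8        # gravitational acceleration (m/s^2)
b = 0.0        # drag coefficient (hypothetical; 0 gives free fall)
dt = 0.001     # time step (s)
x, v = 0.0, 0.0
for _ in range(1000):      # integrate up to t = 1 s
    a = g - b * v          # rate of change of velocity
    x += v * dt            # rate of change of position is v
    v += a * dt
print(x)  # ~4.895 m; the exact free-fall answer is g*t^2/2 = 4.9 m
```

    The small gap between 4.895 and 4.9 is the Euler discretization error, itself a teachable point: halving dt roughly halves the error.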

  17. PREDICTORS OF COMPUTER USE IN COMMUNITY-DWELLING ETHNICALLY DIVERSE OLDER ADULTS

    PubMed Central

    Werner, Julie M.; Carlson, Mike; Jordan-Marsh, Maryalice; Clark, Florence

    2011-01-01

    Objective In this study we analyzed self-reported computer use, demographic variables, psychosocial variables, and health and well-being variables collected from 460 ethnically diverse, community-dwelling elders in order to investigate the relationship computer use has with demographics, well-being and other key psychosocial variables in older adults. Background Although younger elders with more education, those who employ active coping strategies, or those who are low in anxiety levels are thought to use computers at higher rates than others, previous research has produced mixed or inconclusive results regarding ethnic, gender, and psychological factors, or has concentrated on computer-specific psychological factors only (e.g., computer anxiety). Few such studies have employed large sample sizes or have focused on ethnically diverse populations of community-dwelling elders. Method With a large number of overlapping predictors, zero-order analysis alone is poorly equipped to identify variables that are independently associated with computer use. Accordingly, both zero-order and stepwise logistic regression analyses were conducted to determine the correlates of two types of computer use: email and general computer use. Results Results indicate that younger age, greater level of education, non-Hispanic ethnicity, behaviorally active coping style, general physical health, and role-related emotional health each independently predicted computer usage. Conclusion Study findings highlight differences in computer usage, especially in regard to Hispanic ethnicity and specific health and well-being factors. Application Potential applications of this research include future intervention studies, individualized computer-based activity programming, or customizable software and user interface design for older adults responsive to a variety of personal characteristics and capabilities. PMID:22046718
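
    The methodological point about overlapping predictors can be demonstrated on synthetic data. The sketch below uses a plain linear model rather than the study's logistic regressions, purely to keep the arithmetic transparent: a noisy copy of a true predictor shows a strong zero-order correlation with the outcome, yet contributes almost nothing once the true predictor is adjusted for. All data here are simulated.

```python
import random

random.seed(0)

# x2 is just a noisy copy of x1, and only x1 actually drives the outcome y
# (a synthetic stand-in for two overlapping predictors such as age and a
# correlated health measure).
n = 2000
x1 = [random.gauss(0, 1) for _ in range(n)]
x2 = [a + random.gauss(0, 1) for a in x1]
y = [a + random.gauss(0, 0.3) for a in x1]

def corr(u, v):
    mu, mv = sum(u) / n, sum(v) / n
    suv = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    suu = sum((a - mu) ** 2 for a in u)
    svv = sum((b - mv) ** 2 for b in v)
    return suv / (suu * svv) ** 0.5

# Zero-order: both predictors look strongly related to the outcome.
print(corr(x1, y), corr(x2, y))

# Adjusted: two-predictor least squares via the 2x2 normal equations
# (the variables are mean-zero by construction, so no intercept is needed).
s11 = sum(a * a for a in x1); s22 = sum(b * b for b in x2)
s12 = sum(a * b for a, b in zip(x1, x2))
s1y = sum(a * c for a, c in zip(x1, y))
s2y = sum(b * c for b, c in zip(x2, y))
det = s11 * s22 - s12 * s12
b1 = (s22 * s1y - s12 * s2y) / det   # stays near 1
b2 = (s11 * s2y - s12 * s1y) / det   # collapses toward 0
print(b1, b2)
```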

  18. Predictors of computer use in community-dwelling, ethnically diverse older adults.

    PubMed

    Werner, Julie M; Carlson, Mike; Jordan-Marsh, Maryalice; Clark, Florence

    2011-10-01

    In this study, we analyzed self-reported computer use, demographic variables, psychosocial variables, and health and well-being variables collected from 460 ethnically diverse, community-dwelling elders to investigate the relationship computer use has with demographics, well-being, and other key psychosocial variables in older adults. Although younger elders with more education, those who employ active coping strategies, or those who are low in anxiety levels are thought to use computers at higher rates than do others, previous research has produced mixed or inconclusive results regarding ethnic, gender, and psychological factors or has concentrated on computer-specific psychological factors only (e.g., computer anxiety). Few such studies have employed large sample sizes or have focused on ethnically diverse populations of community-dwelling elders. With a large number of overlapping predictors, zero-order analysis alone is poorly equipped to identify variables that are independently associated with computer use. Accordingly, both zero-order and stepwise logistic regression analyses were conducted to determine the correlates of two types of computer use: e-mail and general computer use. Results indicate that younger age, greater level of education, non-Hispanic ethnicity, behaviorally active coping style, general physical health, and role-related emotional health each independently predicted computer usage. Study findings highlight differences in computer usage, especially in regard to Hispanic ethnicity and specific health and well-being factors. Potential applications of this research include future intervention studies, individualized computer-based activity programming, or customizable software and user interface design for older adults responsive to a variety of personal characteristics and capabilities.

  19. Using computers to overcome math-phobia in an introductory course in musical acoustics

    NASA Astrophysics Data System (ADS)

    Piacsek, Andrew A.

    2002-11-01

In recent years, the desktop computer has acquired the signal processing and visualization capabilities once obtained only with expensive specialized equipment. With the appropriate A/D card and software, a PC can behave like an oscilloscope, a real-time signal analyzer, a function generator, and a synthesizer, with both audio and visual outputs. In addition, the computer can be used to visualize specific wave behavior, such as superposition and standing waves, refraction, dispersion, etc. These capabilities make the computer an invaluable tool to teach basic acoustic principles to students with very poor math skills. In this paper I describe my approach to teaching the introductory-level Physics of Musical Sound at Central Washington University, in which very few science students enroll. Emphasis is placed on how visualization with computers can help students appreciate and apply quantitative methods for analyzing sound.
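The "real-time signal analyzer" display described above reduces to a Fourier transform of a sampled waveform. A minimal sketch (the tone and sample rate are invented, not course material):

```python
import numpy as np

# Synthesize one second of a 440 Hz tone plus its third harmonic, then
# compute its spectrum -- the kind of display a software analyzer shows.
fs = 8000                      # sample rate, Hz
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 1320 * t)

# Magnitude spectrum, normalized so a unit-amplitude sine peaks near 1.0.
spectrum = np.abs(np.fft.rfft(signal)) / (fs / 2)
freqs = np.fft.rfftfreq(fs, d=1 / fs)

peak = freqs[np.argmax(spectrum)]
print(f"strongest component: {peak:.0f} Hz")   # the 440 Hz fundamental
```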

  20. Microelectromechanical reprogrammable logic device.

    PubMed

    Hafiz, M A A; Kosuru, L; Younis, M I

    2016-03-29

In modern computing, Boolean logic operations are set by the interconnect schemes between transistors. As component-level miniaturization to enhance computational power rapidly approaches physical limits, alternative computing methods are being vigorously pursued. One desired aspect of future computing approaches is provision for hardware reconfigurability at run time to allow enhanced functionality. Here we demonstrate a reprogrammable logic device based on the electrothermal frequency modulation scheme of a single microelectromechanical resonator, capable of performing all the fundamental 2-bit logic functions as well as n-bit logic operations. Logic functions are performed by actively tuning the linear resonance frequency of the resonator, operated at room temperature and under modest vacuum conditions, and are reprogrammed by the a.c. driving frequency. The device is fabricated using a complementary metal oxide semiconductor compatible mass-fabrication process, is suitable for on-chip integration, and promises an alternative electromechanical computing scheme.
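The reprogramming-by-drive-frequency idea can be caricatured in a few lines: each asserted logic input shifts the resonance, and the output is high when the resonator responds strongly at the chosen drive frequency. The Lorentzian response and all numbers below are illustrative assumptions, not the device's measured behavior.

```python
import math

F0, DF, HW = 120_000.0, 500.0, 80.0   # Hz: base resonance, shift per input, half-width

def resonator_output(a, b, drive_freq, threshold=0.5):
    """Toy model: each asserted input shifts the resonance by DF; the logic
    output is high when the (Lorentzian) response at the drive is strong."""
    f_res = F0 + (a + b) * DF
    response = 1.0 / math.sqrt(1.0 + ((drive_freq - f_res) / HW) ** 2)
    return int(response > threshold)

# Reprogramming = changing only the drive frequency, not the wiring:
truth = {name: [resonator_output(a, b, fd) for a in (0, 1) for b in (0, 1)]
         for name, fd in [("NOR", F0), ("XOR", F0 + DF), ("AND", F0 + 2 * DF)]}
print(truth)
```

Driving at the unshifted resonance implements NOR, at one shift XOR, at two shifts AND, over the input order (0,0), (0,1), (1,0), (1,1).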

  1. Microelectromechanical reprogrammable logic device

    PubMed Central

    Hafiz, M. A. A.; Kosuru, L.; Younis, M. I.

    2016-01-01

In modern computing, Boolean logic operations are set by the interconnect schemes between transistors. As component-level miniaturization to enhance computational power rapidly approaches physical limits, alternative computing methods are being vigorously pursued. One desired aspect of future computing approaches is provision for hardware reconfigurability at run time to allow enhanced functionality. Here we demonstrate a reprogrammable logic device based on the electrothermal frequency modulation scheme of a single microelectromechanical resonator, capable of performing all the fundamental 2-bit logic functions as well as n-bit logic operations. Logic functions are performed by actively tuning the linear resonance frequency of the resonator, operated at room temperature and under modest vacuum conditions, and are reprogrammed by the a.c. driving frequency. The device is fabricated using a complementary metal oxide semiconductor compatible mass-fabrication process, is suitable for on-chip integration, and promises an alternative electromechanical computing scheme. PMID:27021295

  2. Influence of Learning Strategy of Cognitive Conflict on Student Misconception in Computational Physics Course

    NASA Astrophysics Data System (ADS)

    Akmam, A.; Anshari, R.; Amir, H.; Jalinus, N.; Amran, A.

    2018-04-01

Misconception is one of the factors that prevents students from choosing an appropriate method for problem solving. Computational Physics is a major subject in the Department of Physics, FMIPA UNP Padang. A persistent problem in Computational Physics learning is that students have difficulty constructing knowledge, indicated by learning outcomes that do not achieve mastery. The root of the problem is students' weak critical thinking ability, which can be improved using cognitive conflict learning strategies. This research aims to determine the effect of a cognitive conflict learning strategy on student misconceptions in the Computational Physics Course at the Department of Physics, Faculty of Mathematics and Science, Universitas Negeri Padang. The experimental research used a before-after design with a sample of 60 students selected by cluster random sampling. Data were analyzed using repeated-measures ANOVA. The cognitive conflict learning strategy had a significant effect on student misconceptions in the Computational Physics Course.
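With only two repeated measurements per student (before and after), a repeated-measures ANOVA is equivalent to a paired t-test (F = t²). A sketch on simulated scores; the data and effect size are invented, not the study's:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 60  # sample size reported in the study
# Simulated misconception counts before and after the cognitive-conflict
# strategy; an average reduction of 3 is an illustrative assumption.
before = rng.normal(12.0, 3.0, n)
after = before - rng.normal(3.0, 2.0, n)

# Paired comparison of the two repeated measurements per student.
t_stat, p_value = stats.ttest_rel(before, after)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```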

  3. Development of a Computer-Assisted Instrumentation Curriculum for Physics Students: Using LabVIEW and Arduino Platform

    ERIC Educational Resources Information Center

    Kuan, Wen-Hsuan; Tseng, Chi-Hung; Chen, Sufen; Wong, Ching-Chang

    2016-01-01

    We propose an integrated curriculum to establish essential abilities of computer programming for the freshmen of a physics department. The implementation of the graphical-based interfaces from Scratch to LabVIEW then to LabVIEW for Arduino in the curriculum "Computer-Assisted Instrumentation in the Design of Physics Laboratories" brings…

  4. Effects of Computer-Assisted STAD, LTM and ICI Cooperative Learning Strategies on Nigerian Secondary School Students' Achievement, Gender and Motivation in Physics

    ERIC Educational Resources Information Center

    Gambari, Amosa Isiaka; Yusuf, Mudasiru Olalere; Thomas, David Akpa

    2015-01-01

    This study examined the effectiveness of computer-assisted instruction on Student Team Achievement Division (STAD) and Learning Together (LT) cooperative learning strategies on Nigerian secondary students' achievement and motivation in physics. The effectiveness of computer assisted instructional package (CAI) for teaching physics concepts in…

  5. SCB Quantum Computers Using iSWAP and 1-Qubit Rotations

    NASA Technical Reports Server (NTRS)

    Williams, Colin; Echtemach, Pierre

    2005-01-01

Units of superconducting circuitry that exploit the concept of the single-Cooper-pair box (SCB) have been built and are undergoing testing as prototypes of logic gates that could, in principle, constitute building blocks of clocked quantum computers. These units utilize quantized charge states as the quantum information-bearing degrees of freedom. An SCB is an artificial two-level quantum system that comprises a nanoscale superconducting electrode connected to a reservoir of Cooper-pair charges via a Josephson junction. The logical quantum states of the device, |0⟩ and |1⟩, are implemented physically as a pair of charge-number states that differ by 2e (where e is the charge of an electron). Typically, some 10^9 Cooper pairs are involved. Transitions between the logical states are accomplished by tunneling of Cooper pairs through the Josephson junction. Although the two-level system contains a macroscopic number of charges, in the superconducting regime they behave collectively, as a Bose-Einstein condensate, making possible a coherent superposition of the two logical states. This possibility makes the SCB a candidate for the physical implementation of a qubit. A set of quantum logic operations and the gates that implement them is characterized as universal if, in principle, one can form combinations of the operations in the set to implement any desired quantum computation. To be able to design a practical quantum computer, one must first specify how to decompose any valid quantum computation into a sequence of elementary 1- and 2-qubit quantum gates that are universal and that can be realized in hardware that is feasible to fabricate. Traditionally, the set of universal gates has been taken to be the set of all 1-qubit quantum gates in conjunction with the controlled-NOT (CNOT) gate, which is a 2-qubit gate. Also, it has been known for some time that the iSWAP gate, which implements the square root of the simple 2-qubit exchange interaction, is as computationally universal as the CNOT operation.

  6. A Computational Study of the Flow Physics of Acoustic Liners

    NASA Technical Reports Server (NTRS)

    Tam, Christopher

    2006-01-01

The present investigation is a continuation of a previous joint project between Florida State University and the NASA Langley Research Center Liner Physics Team. In the previous project, a study of acoustic liners, in two dimensions, inside a normal incidence impedance tube was carried out. The study consisted of two parts. The NASA team was responsible for the experimental part of the project. This involved performing measurements in an impedance tube with a large aspect ratio slit resonator. The FSU team was responsible for the computational part of the project. This involved performing direct numerical simulation (DNS) of the NASA experiment in two dimensions using CAA methodology. It was agreed that upon completion of the numerical simulation, the computed values of the liner impedance were to be sent to NASA for validation against experimental results. Following this procedure, good agreement was found between numerical results and experimental measurements over a wide range of frequencies and sound pressure levels. Broadband incident sound waves were also simulated numerically and measured experimentally. Overall, good agreement was also found.

  7. Enhancement of DFT-calculations at petascale: Nuclear Magnetic Resonance, Hybrid Density Functional Theory and Car-Parrinello calculations

    NASA Astrophysics Data System (ADS)

    Varini, Nicola; Ceresoli, Davide; Martin-Samos, Layla; Girotto, Ivan; Cavazzoni, Carlo

    2013-08-01

One of the most promising techniques for studying the electronic properties of materials is based on the Density Functional Theory (DFT) approach and its extensions. DFT has been widely applied to traditional solid state physics problems, where periodicity and symmetry play a crucial role in reducing the computational workload. With growing compute capability and the development of improved DFT methods, the range of potential applications now includes other scientific areas such as chemistry and biology. However, cross-disciplinary combinations of traditional solid-state physics, chemistry and biology drastically increase the system complexity while reducing the degree of periodicity and symmetry. Large simulation cells containing hundreds or even thousands of atoms are needed to model these kinds of physical systems, and their treatment remains a computational challenge even with modern supercomputers. In this paper we describe our work to improve the scalability of Quantum ESPRESSO (Giannozzi et al., 2009 [3]) for treating very large cells and huge numbers of electrons. To this end we have introduced an extra level of parallelism, over electronic bands, in three kernels that solve computationally expensive problems: the Sternheimer equation solver (Nuclear Magnetic Resonance, package QE-GIPAW), the Fock operator builder (electronic ground state, package PWscf) and most of the Car-Parrinello routines (Car-Parrinello dynamics, package CP). Final benchmarks show our success in computing the Nuclear Magnetic Resonance (NMR) chemical shift of a large biological assembly, the electronic structure of defected amorphous silica with hybrid exchange-correlation functionals, and the equilibrium atomic structure of eight porphyrins anchored to a carbon nanotube, on many thousands of CPU cores.
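At the bookkeeping level, the band-level parallelism described above is a block distribution of band indices over process groups. A toy sketch of that distribution (not Quantum ESPRESSO's actual code):

```python
# Distribute electronic band indices across band groups so that each
# group owns a contiguous block, balancing any remainder over the
# first few groups. Pure bookkeeping, no physics.
def distribute_bands(n_bands, n_groups):
    """Return the list of band indices owned by each group."""
    base, extra = divmod(n_bands, n_groups)
    blocks, start = [], 0
    for g in range(n_groups):
        size = base + (1 if g < extra else 0)
        blocks.append(list(range(start, start + size)))
        start += size
    return blocks

# 10 bands over 3 band groups: block sizes 4, 3, 3.
print([len(b) for b in distribute_bands(10, 3)])
```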

  8. Is physical activity differentially associated with different types of sedentary pursuits?

    PubMed

    Feldman, Debbie Ehrmann; Barnett, Tracie; Shrier, Ian; Rossignol, Michel; Abenhaim, Lucien

    2003-08-01

To determine whether there is a relationship between the time adolescents spend in physical activity and the time they spend in different sedentary pursuits: watching television, playing video games, working on computers, doing homework, and reading, taking into account the effect of part-time work on students' residual time. Cross-sectional cohort design. Seven hundred forty-three high school students from 2 inner-city public schools and 1 private school. Students completed a self-administered questionnaire that addressed time spent in physical activity, time spent in sedentary pursuits, musculoskeletal pain, and psychosocial issues, and were also measured for height and weight. Main outcome measure: level of physical activity (low, moderate, high). There were more girls than boys in the low and moderate physical activity groups and more boys than girls in the high activity group. Ordinal logistic regression showed that increased time spent in "productive sedentary behavior" (reading or doing homework and working on computers) was associated with increased physical activity (odds ratio, 1.7; 95% confidence interval, 1.2-2.4), as was time spent working (odds ratio, 1.3; 95% confidence interval, 1.2-1.4). Time spent watching television and playing video games was not associated with decreased physical activity. Physical activity was not inversely associated with watching television or playing video games, but was positively associated with productive sedentary behavior and part-time work. Some students appear capable of managing their time better than others. Future studies should explore the ability of students to manage their time and determine what characteristics are conducive to better time management.

  9. Investigating the applicability of activity-based quantum mechanics in a few high school physics classrooms

    NASA Astrophysics Data System (ADS)

    Escalada, Lawrence Todd

Quantum physics is not traditionally introduced in high school physics courses because of the level of abstraction and mathematical formalism associated with the subject. As part of the Visual Quantum Mechanics project, activity-based instructional units have been developed that introduce quantum principles to students who have limited backgrounds in physics and mathematics. This study investigates the applicability of one unit, Solids & Light, that introduces quantum principles within the context of learning about light-emitting diodes. An observation protocol, attitude surveys, and questionnaires were used to examine the implementation of materials and student-teacher interactions in various secondary physics classrooms. Aspects of Solids & Light, including the use of hands-on activities, interactive computer programs, inexpensive materials, and a focus on conceptual understanding, were very applicable in the various physics classrooms observed. Both teachers and students gave these instructional strategies favorable ratings in motivating students to make observations and to learn. These ratings were not significantly affected by gender or by students' attitudes towards physics or computers. Solids & Light was applicable in terms of content and teaching style for some teachers. However, a mismatch of teaching styles between some instructors and the unit posed problems in determining applicability. Observations indicated that some instructors were not able to utilize the exploratory instructional strategy of Solids & Light. Thus, Solids & Light must include the additional support necessary to make the instructor comfortable with the subject matter and pedagogical style. With these revisions, Solids & Light will have all the key components to make its implementation in a high school physics classroom a successful one.

  10. Accelerator-based techniques for the support of senior-level undergraduate physics laboratories

    NASA Astrophysics Data System (ADS)

    Williams, J. R.; Clark, J. C.; Isaacs-Smith, T.

    2001-07-01

    Approximately three years ago, Auburn University replaced its aging Dynamitron accelerator with a new 2MV tandem machine (Pelletron) manufactured by the National Electrostatics Corporation (NEC). This new machine is maintained and operated for the University by Physics Department personnel, and the accelerator supports a wide variety of materials modification/analysis studies. Computer software is available that allows the NEC Pelletron to be operated from a remote location, and an Internet link has been established between the Accelerator Laboratory and the Upper-Level Undergraduate Teaching Laboratory in the Physics Department. Additional software supplied by Canberra Industries has also been used to create a second Internet link that allows live-time data acquisition in the Teaching Laboratory. Our senior-level undergraduates and first-year graduate students perform a number of experiments related to radiation detection and measurement as well as several standard accelerator-based experiments that have been added recently. These laboratory exercises will be described, and the procedures used to establish the Internet links between our Teaching Laboratory and the Accelerator Laboratory will be discussed.

  11. Development of a Toolbox Using Chemical, Physical and Biological Technologies for Decontamination of Sediments to Support Strategic Army Response to Natural Disasters

    DTIC Science & Technology

    2006-11-01

    disinfection) was tested using soil microcosms and respirometry to determine diesel range and total organic compound degradation. These tests were...grease) such as benzo(a)pyrene were detected above chronic (long term-measured in years) screening levels. Levels of diesel and oil range organics... bioremediation , and toxicity of liquid and solid samples. The Comput-OX 4R is a 4 reactor unit with no stirring modules or temperature controlled water bath

  12. GPU computing in medical physics: a review.

    PubMed

    Pratx, Guillem; Xing, Lei

    2011-05-01

    The graphics processing unit (GPU) has emerged as a competitive platform for computing massively parallel problems. Many computing applications in medical physics can be formulated as data-parallel tasks that exploit the capabilities of the GPU for reducing processing times. The authors review the basic principles of GPU computing as well as the main performance optimization techniques, and survey existing applications in three areas of medical physics, namely image reconstruction, dose calculation and treatment plan optimization, and image processing.

  13. Self-reported physical activity behavior of a multi-ethnic adult population within the urban and rural setting in Suriname.

    PubMed

    Baldew, Se-Sergio M; Krishnadath, Ingrid S K; Smits, Christel C F; Toelsie, Jerry R; Vanhees, Luc; Cornelissen, Veronique

    2015-05-12

Physical activity (PA) plays an important role in combating noncommunicable diseases, including cardiovascular diseases. In order to develop appropriate PA intervention programs, there is a need to evaluate PA behavior. So far, there are no published PA data available for Suriname. We therefore aim to describe PA behavior among the multi-ethnic population living in urban and rural areas of Suriname. The World Health Organization (WHO) STEPwise approach to chronic disease risk factor surveillance (STEPS) was conducted in a nationally representative sample (N = 5751; 48.6% men) aged 15-64 years between March and September 2013. Physical activity data were assessed using the Global Physical Activity Questionnaire (GPAQ) and analyzed according to the GPAQ guidelines. The prevalence of meeting the recommended PA level and prevalence ratios (PR) were computed. Only 55.5% of the overall population met the WHO recommended PA levels (urban coastal area: 55.7%, rural coastal area: 57.9%, rural interior area: 49.1%). Women were less likely to meet the recommended PA level (49% vs 62.4%; p < 0.0001), and with increasing age the PR for the recommended level of PA decreased (p < 0.0001). Compared to the Hindustanis, the largest ethnic group, the Javanese reported the lowest percentage of people meeting the recommended PA level (PR = 0.92; p = 0.07). Around half of the population meets the recommended PA level. Future lifestyle interventions aiming to increase PA should focus especially on women and older individuals, as they are less likely to meet the recommended levels of PA.
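A prevalence ratio is simply the ratio of two group prevalences. A sketch using counts invented to echo the reported 49% vs 62.4% (these are not the study's raw counts):

```python
# Prevalence ratio (PR) from survey counts: the prevalence of meeting the
# recommended PA level in one group, divided by the reference group's.
def prevalence(meeting, total):
    return meeting / total

def prevalence_ratio(meeting_a, total_a, meeting_ref, total_ref):
    return prevalence(meeting_a, total_a) / prevalence(meeting_ref, total_ref)

# Hypothetical example: women vs men (reference).
pr = prevalence_ratio(1372, 2800, 1841, 2951)   # 49.0% vs ~62.4%
print(f"PR = {pr:.2f}")   # < 1: women less likely to meet the recommendation
```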

  14. Recursive Techniques for Computing Gluon Scattering in Anti-de-Sitter Space

    NASA Astrophysics Data System (ADS)

    Shyaka, Claude; Kharel, Savan

    2016-03-01

The anti-de Sitter/conformal field theory correspondence is a relationship between two kinds of physical theories. On one side of the duality is a special type of quantum (conformal) field theory known as Yang-Mills theory. These quantum field theories are known to be equivalent to theories of gravity in anti-de Sitter (AdS) space. The physical observables of the theory are correlation functions that live on the boundary of AdS space. In general, correlation functions are computed in configuration space, and the resulting expressions are extremely complicated. Using a momentum basis and recursive techniques developed by Raju, we compute tree-level four- and five-point correlation functions in Yang-Mills theory in anti-de Sitter space. In addition, we show that for certain external helicities the correlation functions have a simple analytic structure. Finally, we discuss how these results can be generalized to n-point functions. Supported by a Hendrix College Odyssey Grant.

  15. Content range and precision of a computer adaptive test of upper extremity function for children with cerebral palsy.

    PubMed

    Montpetit, Kathleen; Haley, Stephen; Bilodeau, Nathalie; Ni, Pengsheng; Tian, Feng; Gorton, George; Mulcahey, M J

    2011-02-01

    This article reports on the content range and measurement precision of an upper extremity (UE) computer adaptive testing (CAT) platform of physical function in children with cerebral palsy. Upper extremity items representing skills of all abilities were administered to 305 parents. These responses were compared with two traditional standardized measures: Pediatric Outcomes Data Collection Instrument and Functional Independence Measure for Children. The UE CAT correlated strongly with the upper extremity component of these measures and had greater precision when describing individual functional ability. The UE item bank has wider range with items populating the lower end of the ability spectrum. This new UE item bank and CAT have the capability to quickly assess children of all ages and abilities with good precision and, most importantly, with items that are meaningful and appropriate for their age and level of physical function.
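A CAT platform of this kind typically selects the next item by maximizing Fisher information at the current ability estimate. A minimal 2-parameter IRT sketch with a hypothetical item bank; this is not the UE CAT's actual algorithm or items:

```python
import math

ITEM_BANK = [  # (discrimination a, difficulty b) -- illustrative values
    (1.2, -2.0), (1.0, -0.5), (1.5, 0.0), (0.8, 1.0), (1.3, 2.2),
]

def p_correct(theta, a, b):
    """2PL item response function: probability of success at ability theta."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def information(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def next_item(theta, used):
    """Pick the unused item that is most informative at the current estimate."""
    candidates = [i for i in range(len(ITEM_BANK)) if i not in used]
    return max(candidates, key=lambda i: information(theta, *ITEM_BANK[i]))

# A low-ability child gets an easy item; a high-ability child a hard one.
print(next_item(-2.0, set()), next_item(2.0, set()))
```

Because the most informative item sits near the child's current ability, the test adapts item difficulty to each respondent, which is what yields the greater precision reported above.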

  16. Physical function assessment in a community-dwelling population of U.S. Chinese older adults.

    PubMed

    Dong, XinQi; Chang, E-Shien; Simon, Melissa A

    2014-11-01

This report describes the levels of physical function in U.S. Chinese older adults utilizing self-reported and performance-based measures, and examines the association between sociodemographic characteristics and physical function. The Population Study of Chinese Elderly in Chicago enrolled an epidemiological cohort of 3,159 community-dwelling Chinese older adults aged 60 and older. We collected self-reported physical function using Katz activities of daily living and Lawton instrumental activities of daily living items, the Index of Mobility scale, and the Index of Basic Physical Activities scale. Participants were also asked to perform tasks in chair stand, tandem stand, and timed walk. We computed Pearson and Spearman correlation coefficients to examine the correlation between sociodemographic and physical function variables. A total of 7.8% of study participants experienced activities of daily living impairment, and 50.2% experienced instrumental activities of daily living impairment. With respect to physical performance testing, 11.4% of the participants were not able to complete five chair stands, and 8.5% were unable to do chair stands at all. Older age, female gender, lower education level, being unmarried, living with fewer people in the same household, having fewer children, living fewer years in the United States, living fewer years in the community, and worsening health status were significantly correlated with lower levels of physical function. Utilizing self-reported and performance-based measures of physical function in a large population-based study of U.S. Chinese older adults, our findings expand current understanding of minority older adults' functional status. © The Author 2014. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
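The Pearson/Spearman pairing used above can be illustrated on synthetic data where the age-function relation is monotone but nonlinear; the variables and shape below are illustrative, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical ages and a physical-function score that declines
# monotonically but nonlinearly (sigmoidally) with age.
age = rng.uniform(60, 95, 200)
score = 100 / (1 + np.exp((age - 80) / 5)) + rng.normal(0, 2, 200)

r_pearson, _ = stats.pearsonr(age, score)     # linear association
r_spearman, _ = stats.spearmanr(age, score)   # monotone (rank) association
print(f"Pearson r = {r_pearson:.2f}, Spearman rho = {r_spearman:.2f}")
```

Reporting both is a common safeguard: Spearman captures a monotone trend even when the relationship is far from linear.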

  17. Relationship of weight status, physical activity and screen time with academic achievement in adolescents.

    PubMed

    García-Hermoso, Antonio; Marina, Raquel

The aim of this study was to examine the relationship of weight status, physical activity and screen time with academic achievement in Chilean adolescents. The present cross-sectional study included 395 adolescents. The International Obesity Task Force cut-off points were used to define weight status. Physical activity was assessed using the Physical Activity Questionnaire for Adolescents, and screen time was assessed using several questions about television, videogame and computer use. Academic achievement was measured as the mean of the grades obtained in mathematics and language subjects. In both genders, adolescents with obesity and excessive screen time earned worse grades than their non-obese peers and peers who complied with screen time recommendations. The logistic regression analysis showed that adolescents with obesity, medium-low physical activity and excessive screen time (≥2 h/day) were less likely to obtain high academic achievement (boys: OR=0.26; girls: OR=0.23) than their non-obese peers with high levels of physical activity who complied with the current screen time recommendations. Similar results were observed for adolescents with obesity and medium-low physical activity (boys: OR=0.46; girls: OR=0.33) or excessive screen time (boys: OR=0.35; girls: OR=0.36) compared to adolescents with high levels of physical activity and those who complied with the screen time recommendations, respectively. This study shows that, in combination, obesity, low-medium levels of physical activity and excessive screen time might be related to poor academic achievement. Copyright © 2015 Asia Oceania Association for the Study of Obesity. Published by Elsevier Ltd. All rights reserved.

  18. A Modeling Framework for Optimal Computational Resource Allocation Estimation: Considering the Trade-offs between Physical Resolutions, Uncertainty and Computational Costs

    NASA Astrophysics Data System (ADS)

    Moslehi, M.; de Barros, F.; Rajagopal, R.

    2014-12-01

Hydrogeological models that represent flow and transport in subsurface domains are usually large-scale, with excessive computational complexity and uncertain characteristics. Uncertainty quantification for predicting flow and transport in heterogeneous formations often entails a numerical Monte Carlo framework, which repeatedly simulates the model according to a random field representing the hydrogeological characteristics of the field. The physical resolution (e.g. grid resolution associated with the physical space) for the simulation is customarily chosen based on recommendations in the literature, independent of the number of Monte Carlo realizations. This practice may lead to either excessive computational burden or inaccurate solutions. We propose an optimization-based methodology that considers the trade-off between the following conflicting objectives: time associated with computational costs, statistical convergence of the model predictions, and physical errors corresponding to numerical grid resolution. In this research, we optimally allocate computational resources by developing a model for the overall error based on a joint statistical and numerical analysis and optimizing the error model subject to a given computational constraint. The derived expression for the overall error explicitly takes into account the joint dependence between the discretization error of the physical space and the statistical error associated with Monte Carlo realizations. The accuracy of the proposed framework is verified by applying it to several computationally intensive examples. Having this framework at hand helps hydrogeologists achieve the optimum physical and statistical resolutions that minimize the error for a given computational budget. Moreover, the influence of the available computational resources and the geometric properties of the contaminant source zone on the optimum resolutions is investigated.
We conclude that the computational cost associated with optimal allocation can be substantially reduced compared with prevalent recommendations in the literature.
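The trade-off can be sketched with a toy error model: a discretization term that shrinks with finer grid spacing h, a sampling term that shrinks with more realizations N, and a budget that couples them because finer grids make each realization costlier. All constants and exponents below are illustrative assumptions, not the paper's derived expression:

```python
import numpy as np

BUDGET = 1e6            # total CPU cost units (assumed)
COST_PER_CELL = 1e-3    # cost of one realization per grid cell (assumed)
A_DISC, P = 5.0, 2.0    # discretization error model: A_DISC * h**P
SIGMA = 10.0            # sampling error model: SIGMA / sqrt(N)

def total_error(h):
    """Overall error at grid spacing h when the whole budget is spent."""
    cells = (1.0 / h) ** 2                     # 2-D domain, spacing h
    n_real = BUDGET / (COST_PER_CELL * cells)  # realizations the budget allows
    return np.hypot(A_DISC * h ** P, SIGMA / np.sqrt(n_real))

spacings = np.linspace(0.005, 0.2, 400)
errors = np.array([total_error(h) for h in spacings])
h_best = spacings[np.argmin(errors)]
print(f"optimal grid spacing ~ {h_best:.3f}, error = {errors.min():.3f}")
```

The interior minimum illustrates the paper's point: both an over-refined grid (too few realizations) and a coarse grid (large discretization error) waste the budget.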

  19. Dietary and activity correlates of sugar-sweetened beverage consumption among adolescents.

    PubMed

    Ranjit, Nalini; Evans, Martin H; Byrd-Williams, Courtney; Evans, Alexandra E; Hoelscher, Deanna M

    2010-10-01

To examine the dietary and activity correlates of sugar-sweetened beverage consumption by children in middle and high school. Data were obtained from a cross-sectional survey of 15,283 children in middle and high schools in Texas. Consumption of sodas and noncarbonated flavored and sports beverages (FSBs) was examined separately for associations with the level of (1) unhealthy food (fried meats, French fries, desserts) consumption, (2) healthy food (vegetables, fruit, and milk) consumption, (3) physical activity, including usual vigorous physical activity and participation in organized physical activity, and (4) sedentary activity, including hours spent watching television, using the computer, and playing video games. For both genders, consumption of soda and FSBs was systematically associated with a number of unhealthy dietary practices and with sedentary behaviors. However, consumption of FSBs showed significant positive graded associations with several healthy dietary practices and level of physical activity, whereas soda consumption showed no such associations with healthy behaviors. Consumption of FSBs coexists with healthy dietary and physical activity behaviors, which suggests a popular misperception of these beverages as being consistent with a healthy lifestyle. Assessment and obesity-prevention efforts that target sugar-sweetened beverages need to distinguish between FSBs and sodas.

  20. Use of clickers and sustainable reform in upper-division physics courses

    NASA Astrophysics Data System (ADS)

    Dubson, Michael

    2008-03-01

At the University of Colorado at Boulder, successful reforms of our freshman- and sophomore-level physics courses are now being extended to upper-division courses, including Mechanics, Math Methods, QM, E&M, and Thermal Physics. Our course reforms include clicker questions (ConcepTests) in lecture, peer instruction, and an added emphasis on conceptual understanding and qualitative reasoning on homework assignments and exams. Student feedback has been strongly positive, and I will argue that such conceptual training improves, rather than dilutes, traditional computationally-intensive problem-solving skills. For these reforms to be sustainable, reform efforts must begin with department-wide consensus and agreed-upon measures of success. I will discuss the design of good clicker questions and their effective incorporation into upper-level courses, including examples from materials science. Condensed matter physics, which by its nature involves intelligent use of approximation, particularly lends itself to conceptual training. I will demonstrate the use of a clicker system (made by iClicker) with audience-participation questions. Come prepared to think and interact, rather than just sit there!

  1. Markov Task Network: A Framework for Service Composition under Uncertainty in Cyber-Physical Systems.

    PubMed

    Mohammed, Abdul-Wahid; Xu, Yang; Hu, Haixiao; Agyemang, Brighter

    2016-09-21

    In novel collaborative systems, cooperative entities compose services to achieve local and global objectives. With the growing pervasiveness of cyber-physical systems, however, such collaboration is hampered by differences in the operations of the cyber and physical objects, and the need for the dynamic formation of collaborative functionality given high-level system goals has become pressing. In this paper, we propose a cross-layer automation and management model for cyber-physical systems. This model captures the dynamic formation of collaborative services pursuing laid-down system goals as an ontology-oriented hierarchical task network. Ontological intelligence provides the semantic technology of this model, and through semantic reasoning, primitive tasks can be dynamically composed from high-level system goals. In dealing with uncertainty, we further propose a novel bridge between hierarchical task networks and Markov logic networks, called the Markov task network. This leverages the efficient inference algorithms of Markov logic networks to reduce both computational and inferential loads in task decomposition. The results of our experiments show that high-precision service composition under uncertainty can be achieved using this approach.
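
    The hierarchical task decomposition at the core of this model can be sketched in a few lines; the task names and methods below are invented for illustration, and the Markov-logic layer that handles uncertainty is omitted.

```python
# Hypothetical sketch of hierarchical task decomposition, the backbone the
# abstract builds on before adding Markov-logic-based uncertainty handling.
# All task names and decomposition methods here are invented.
METHODS = {
    # compound task -> ordered subtasks
    "monitor_room": ["read_sensor", "analyze_reading", "report"],
    "analyze_reading": ["filter_noise", "classify_event"],
}

def decompose(task):
    """Recursively expand a compound task into a plan of primitive tasks."""
    if task not in METHODS:          # primitive task: no decomposition method
        return [task]
    plan = []
    for sub in METHODS[task]:
        plan.extend(decompose(sub))
    return plan

print(decompose("monitor_room"))
# primitive plan: ['read_sensor', 'filter_noise', 'classify_event', 'report']
```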

  2. The Development and Assessment of Particle Physics Summer Program for High School Students

    NASA Astrophysics Data System (ADS)

    Prefontaine, Brean; Kurahashi Neilson, Naoko, Dr.; Love, Christina, Dr.

    2017-01-01

    A four-week immersive summer program for high school students was developed and implemented to promote awareness of university-level research. The program was directed entirely by an undergraduate physics major and included a hands-on, student-led capstone project for the high school students. The goal was to create an adaptive and shareable curriculum in order to influence high school students' views of university-level research and what it means to be a scientist. The program was assessed through various methods, including a survey developed for this program, a scientific attitudes survey, weekly blog posts, and an oral exit interview. The curriculum included visits to local laboratories, an introduction to particle physics and the IceCube collaboration, an introduction to electronics and computer programming, and their capstone project: planning and building a scale model of the IceCube detector. At the conclusion of the program, the students participated in an informal outreach event for the general public and gave an oral presentation to the Department of Physics at Drexel University. Assessment results and details concerning the curriculum and its development will be discussed.

  3. A generic framework for individual-based modelling and physical-biological interaction

    PubMed Central

    2018-01-01

    The increased availability of high-resolution ocean data globally has enabled more detailed analyses of physical-biological interactions and their consequences to the ecosystem. We present IBMlib, which is a versatile, portable and computationally effective framework for conducting Lagrangian simulations in the marine environment. The purpose of the framework is to handle complex individual-level biological models of organisms, combined with a realistic 3D oceanographic model of physics and biogeochemistry describing the environment of the organisms, without assumptions about spatial or temporal scales. The open-source framework features a minimal robust interface to facilitate the coupling between individual-level biological models and oceanographic models, and we provide application examples including forward/backward simulations, habitat connectivity calculations, assessing ocean conditions, comparison of physical circulation models, model ensemble runs and, recently, posterior Eulerian simulations using the IBMlib framework. We present the code design ideas behind the longevity of the code, our implementation experiences, as well as code performance benchmarking. The framework may contribute substantially to progress in representing, understanding, predicting and eventually managing marine ecosystems. PMID:29351280
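
    The Lagrangian particle tracking that IBMlib performs against ocean-model output can be illustrated with a minimal sketch; here the velocity field is an analytic solid-body rotation rather than oceanographic data, and the integrator is a plain midpoint (RK2) step.

```python
import math

# Minimal Lagrangian advection sketch of the kind IBMlib couples to ocean
# model output; the "ocean" here is an analytic solid-body rotation field,
# purely for illustration.
def velocity(x, y):
    return -y, x                         # solid-body rotation about the origin

def rk2_step(x, y, dt):
    """Midpoint (RK2) step of dx/dt = u(x, y)."""
    u1, v1 = velocity(x, y)
    u2, v2 = velocity(x + 0.5 * dt * u1, y + 0.5 * dt * v1)
    return x + dt * u2, y + dt * v2

x, y = 1.0, 0.0
dt = 0.01
for _ in range(int(2 * math.pi / dt)):   # roughly one full revolution
    x, y = rk2_step(x, y, dt)

# the particle returns near its start; radius ~ 1 is preserved by RK2
print(round(math.hypot(x, y), 3))
```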

  4. Opportunities and choice in a new vector era

    NASA Astrophysics Data System (ADS)

    Nowak, A.

    2014-06-01

    This work discusses the significant changes in the computing landscape related to the progression of Moore's Law, and their implications for scientific computing. Particular attention is devoted to the High Energy Physics (HEP) domain, which has always made good use of threading, but where levels of parallelism closer to the hardware were often left underutilized. Findings of the CERN openlab Platform Competence Center are reported in the context of expanding "performance dimensions", and especially the resurgence of vectors. These suggest that data-oriented designs are feasible in HEP and have considerable potential for performance improvements on multiple levels, but will rarely trump algorithmic enhancements. Finally, an analysis of upcoming hardware and software technologies identifies heterogeneity as a major challenge for software, which will require more emphasis on scalable, efficient design.

  5. Interactive Computation for Undergraduates: The Next Generation

    NASA Astrophysics Data System (ADS)

    Kolan, Amy J.

    2017-05-01

    A generation ago (29 years ago), Leo Kadanoff and Michael Vinson created the Computers, Chaos, and Physics course. A major pedagogical thrust of this course was to help students form and test hypotheses via computer simulation of small problems in physics. Recently, this aspect of the 1987 course has been revived for use with first year physics undergraduate students at St. Olaf College.

  6. Analyzing Log Files to Predict Students' Problem Solving Performance in a Computer-Based Physics Tutor

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2015-01-01

    This study investigates whether information saved in the log files of a computer-based tutor can be used to predict the problem-solving performance of students. The log files of a computer-based physics tutoring environment called Andes Physics Tutor were analyzed to build a logistic regression model that predicted success and failure of students'…
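
    A logistic regression of the kind described can be sketched directly; the log-file features (hints requested, minutes on task) and outcomes below are invented, not data from Andes Physics Tutor.

```python
import math

# Toy illustration (invented data): predict problem-solving success from two
# log-file features via logistic regression trained with gradient descent.
data = [  # (hints requested, minutes on task, solved?)
    (0, 5, 1), (1, 6, 1), (1, 4, 1), (2, 9, 0),
    (3, 10, 0), (4, 12, 0), (0, 3, 1), (3, 8, 0),
]

def predict(w, x):
    """Sigmoid of the linear score w0 + w1*hints + w2*minutes."""
    z = w[0] + w[1] * x[0] + w[2] * x[1]
    return 1.0 / (1.0 + math.exp(-z))

w = [0.0, 0.0, 0.0]
for _ in range(5000):                    # gradient descent on the log-loss
    grad = [0.0, 0.0, 0.0]
    for h, m, y in data:
        err = predict(w, (h, m)) - y
        grad[0] += err
        grad[1] += err * h
        grad[2] += err * m
    w = [wi - 0.05 * g / len(data) for wi, g in zip(w, grad)]

# few hints / little time -> high predicted success on this toy data,
# many hints / long time -> low predicted success
print(round(predict(w, (0, 4)), 2), round(predict(w, (4, 11)), 2))
```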

  7. Effect of addiction to computer games on physical and mental health of female and male students of guidance school in city of Isfahan.

    PubMed

    Zamani, Eshrat; Chashmi, Maliheh; Hedayati, Nasim

    2009-01-01

    This study aimed to investigate the effects of addiction to computer games on the physical and mental health of students. The study population includes all students in the second year of public guidance schools in the city of Isfahan in the educational year of 2009-2010. The sample size includes 564 students selected by multiple-step stratified sampling. Dependent variables include general health in the dimensions of physical health, anxiety and sleeplessness, and impaired social functioning. Data were collected using the General Health Questionnaire (GHQ-28) scale and a questionnaire on addiction to computer games. Pearson's correlation coefficient and a structural model were used for data analysis. There was a significant positive correlation between students' computer game addiction and their physical and mental health in the dimensions of physical health, anxiety and sleeplessness. There was a significant negative relationship between addiction to computer games and impaired social functioning. The results of this study are in agreement with the findings of other studies around the world. As the results show, addiction to computer games affects various dimensions of health, increasing physical problems, anxiety and depression while decreasing social functioning.
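
    The Pearson correlation coefficient used in the analysis is straightforward to compute; the scores below are invented for illustration and happen to be perfectly correlated.

```python
import math

# Minimal Pearson correlation coefficient, the statistic the study used to
# relate gaming-addiction scores to GHQ-28 subscale scores. The numbers are
# hypothetical, chosen so that r comes out exactly 1.
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

addiction = [2, 5, 7, 3, 9, 4]      # hypothetical addiction scores
anxiety   = [1, 4, 6, 2, 8, 3]      # hypothetical anxiety subscale scores
print(round(pearson_r(addiction, anxiety), 3))  # → 1.0 (toy data)
```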

  8. Effect of Addiction to Computer Games on Physical and Mental Health of Female and Male Students of Guidance School in City of Isfahan

    PubMed Central

    Zamani, Eshrat; Chashmi, Maliheh; Hedayati, Nasim

    2009-01-01

    Background: This study aimed to investigate the effects of addiction to computer games on the physical and mental health of students. Methods: The study population includes all students in the second year of public guidance schools in the city of Isfahan in the educational year of 2009-2010. The sample size includes 564 students selected by multiple-step stratified sampling. Dependent variables include general health in the dimensions of physical health, anxiety and sleeplessness, and impaired social functioning. Data were collected using the General Health Questionnaire (GHQ-28) scale and a questionnaire on addiction to computer games. Pearson's correlation coefficient and a structural model were used for data analysis. Findings: There was a significant positive correlation between students' computer game addiction and their physical and mental health in the dimensions of physical health, anxiety and sleeplessness. There was a significant negative relationship between addiction to computer games and impaired social functioning. Conclusion: The results of this study are in agreement with the findings of other studies around the world. As the results show, addiction to computer games affects various dimensions of health, increasing physical problems, anxiety and depression while decreasing social functioning. PMID:24494091

  9. Aircraft Conceptual Design and Risk Analysis Using Physics-Based Noise Prediction

    NASA Technical Reports Server (NTRS)

    Olson, Erik D.; Mavris, Dimitri N.

    2006-01-01

    An approach was developed which allows for design studies of commercial aircraft using physics-based noise analysis methods while retaining the ability to perform the rapid trade-off and risk analysis studies needed at the conceptual design stage. A prototype integrated analysis process was created for computing the total aircraft EPNL at the Federal Aviation Regulations Part 36 certification measurement locations using physics-based methods for fan rotor-stator interaction tones and jet mixing noise. The methodology was then used in combination with design of experiments to create response surface equations (RSEs) for the engine and aircraft performance metrics, geometric constraints and take-off and landing noise levels. In addition, Monte Carlo analysis was used to assess the expected variability of the metrics under the influence of uncertainty, and to determine how the variability is affected by the choice of engine cycle. Finally, the RSEs were used to conduct a series of proof-of-concept conceptual-level design studies demonstrating the utility of the approach. The study found that a key advantage to using physics-based analysis during conceptual design lies in the ability to assess the benefits of new technologies as a function of the design to which they are applied. The greatest difficulty in implementing physics-based analysis proved to be the generation of design geometry at a sufficient level of detail for high-fidelity analysis.
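
    The response-surface-plus-Monte-Carlo workflow can be sketched as follows; the design points (a notional engine design variable versus effective perceived noise level) are invented, and a Lagrange quadratic stands in for the study's response surface equations.

```python
import random
import statistics

# Hedged sketch of the response-surface-plus-Monte-Carlo idea: replace an
# expensive physics-based noise analysis with a cheap quadratic surrogate,
# then push input uncertainty through the surrogate. The "design points"
# below are invented (a design variable vs. noise level in EPNdB).
design_x = [4.0, 6.0, 8.0]          # design-variable samples
design_y = [101.0, 96.5, 94.0]      # notional expensive-analysis results

def quadratic_rse(x):
    """Lagrange interpolation through the three design points."""
    x0, x1, x2 = design_x
    y0, y1, y2 = design_y
    return (y0 * (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
          + y1 * (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
          + y2 * (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1)))

random.seed(0)
# Monte Carlo: the design variable is uncertain, ~ N(6.0, 0.3); each sample
# costs one surrogate evaluation instead of one high-fidelity analysis.
samples = [quadratic_rse(random.gauss(6.0, 0.3)) for _ in range(10000)]
print(round(statistics.mean(samples), 2), round(statistics.stdev(samples), 2))
```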

  10. A streamline splitting pore-network approach for computationally inexpensive and accurate simulation of transport in porous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mehmani, Yashar; Oostrom, Martinus; Balhoff, Matthew

    2014-03-20

    Several approaches have been developed in the literature for solving flow and transport at the pore-scale. Some authors use a direct modeling approach where the fundamental flow and transport equations are solved on the actual pore-space geometry. Such direct modeling, while very accurate, comes at a great computational cost. Network models are computationally more efficient because the pore-space morphology is approximated. Typically, a mixed cell method (MCM) is employed for solving the flow and transport system, which assumes pore-level perfect mixing. This assumption is invalid at moderate to high Peclet regimes. In this work, a novel Eulerian perspective on modeling flow and transport at the pore-scale is developed. The new streamline splitting method (SSM) allows for circumventing the pore-level perfect mixing assumption, while maintaining the computational efficiency of pore-network models. SSM was verified with direct simulations and excellent matches were obtained against micromodel experiments across a wide range of pore-structure and fluid-flow parameters. The increase in the computational cost from MCM to SSM is shown to be minimal, while the accuracy of SSM is much higher than that of MCM and comparable to direct modeling approaches. Therefore, SSM can be regarded as an appropriate balance between incorporating detailed physics and controlling computational cost. The truly predictive capability of the model allows for the study of pore-level interactions of fluid flow and transport in different porous materials. In this paper, we apply SSM and MCM to study the effects of pore-level mixing on transverse dispersion in 3D disordered granular media.
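
    The mixed cell method's flow step reduces to mass conservation at each pore; a minimal sketch on an invented five-pore network, solved by Jacobi iteration, is:

```python
# Minimal pore-network flow sketch in the spirit of the mixed cell method:
# at each interior pore i, mass conservation requires sum_j g_ij (p_j - p_i)
# = 0, with inlet/outlet pressures fixed. The network topology and throat
# conductances below are invented for illustration.
# Pores: 0 = inlet (p = 1), 1-3 = interior (unknown), 4 = outlet (p = 0)
fixed = {0: 1.0, 4: 0.0}
throats = {  # (pore_i, pore_j): hydraulic conductance of connecting throat
    (0, 1): 2.0, (1, 2): 1.0, (1, 3): 1.0, (2, 4): 2.0, (3, 4): 1.0,
}

def neighbors(i):
    for (a, b), g in throats.items():
        if a == i:
            yield b, g
        if b == i:
            yield a, g

# Jacobi iteration: each interior pressure becomes the conductance-weighted
# average of its neighbors' pressures.
p = {i: fixed.get(i, 0.5) for i in range(5)}
for _ in range(2000):
    new = dict(p)
    for i in range(5):
        if i in fixed:
            continue
        gsum = sum(g for _, g in neighbors(i))
        new[i] = sum(g * p[j] for j, g in neighbors(i)) / gsum
    p = new

flow_in = throats[(0, 1)] * (p[0] - p[1])
flow_out = (throats[(2, 4)] * (p[2] - p[4])
            + throats[(3, 4)] * (p[3] - p[4]))
# inflow equals outflow: mass is conserved across the network
print({i: round(v, 3) for i, v in p.items()}, round(flow_in, 3))
```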

  11. Richard Feynman and computation

    NASA Astrophysics Data System (ADS)

    Hey, Tony

    1999-04-01

    The enormous contribution of Richard Feynman to modern physics is well known, both to teaching through his famous Feynman Lectures on Physics, and to research with his Feynman diagram approach to quantum field theory and his path integral formulation of quantum mechanics. Less well known perhaps is his long-standing interest in the physics of computation and this is the subject of this paper. Feynman lectured on computation at Caltech for most of the last decade of his life, first with John Hopfield and Carver Mead, and then with Gerry Sussman. The story of how these lectures came to be written up as the Feynman Lectures on Computation is briefly recounted. Feynman also discussed the fundamentals of computation with other legendary figures of the computer science and physics community such as Ed Fredkin, Rolf Landauer, Carver Mead, Marvin Minsky and John Wheeler. He was also instrumental in stimulating developments in both nanotechnology and quantum computing. During the 1980s Feynman re-visited long-standing interests both in parallel computing with Geoffrey Fox and Danny Hillis, and in reversible computation and quantum computing with Charles Bennett, Norman Margolus, Tom Toffoli and Wojciech Zurek. This paper records Feynman's links with the computational community and includes some reminiscences about his involvement with the fundamentals of computing.

  12. A High School Level Course On Robot Design And Construction

    NASA Astrophysics Data System (ADS)

    Sadler, Paul M.; Crandall, Jack L.

    1984-02-01

    The Robotics Design and Construction Class at Sehome High School was developed to offer gifted and/or highly motivated students an in-depth introduction to a modern engineering topic. The course includes instruction in basic electronics, digital and radio electronics, construction skills, robotics literacy, construction of the HERO 1 Heathkit Robot, computer/ robot programming, and voice synthesis. A key element which leads to the success of the course is the involvement of various community assets including manpower and financial assistance. The instructors included a physics/electronics teacher, a computer science teacher, two retired engineers, and an electronics technician.

  13. On the prediction of turbulent secondary flows

    NASA Technical Reports Server (NTRS)

    Speziale, C. G.; So, R. M. C.; Younis, B. A.

    1992-01-01

    The prediction of turbulent secondary flows, with Reynolds stress models, in circular pipes and non-circular ducts is reviewed. Turbulence-driven secondary flows in straight non-circular ducts are considered along with turbulent secondary flows in pipes and ducts that arise from curvature or a system rotation. The physical mechanisms that generate these different kinds of secondary flows are outlined and the level of turbulence closure required to properly compute each type is discussed in detail. Illustrative computations of a variety of different secondary flows obtained from two-equation turbulence models and second-order closures are provided to amplify these points.

  14. Discovery & Interaction in Astro 101 Laboratory Experiments

    NASA Astrophysics Data System (ADS)

    Maloney, Frank Patrick; Maurone, Philip; DeWarf, Laurence E.

    2016-01-01

    The availability of low-cost, high-performance computing hardware and software has transformed the manner by which astronomical concepts can be re-discovered and explored in a laboratory that accompanies an astronomy course for arts students. We report on a strategy, begun in 1992, for allowing each student to understand fundamental scientific principles by interactively confronting astronomical and physical phenomena, through direct observation and by computer simulation. These experiments have evolved as :a) the quality and speed of the hardware has greatly increasedb) the corresponding hardware costs have decreasedc) the students have become computer and Internet literated) the importance of computationally and scientifically literate arts graduates in the workplace has increased.We present the current suite of laboratory experiments, and describe the nature, procedures, and goals in this two-semester laboratory for liberal arts majors at the Astro 101 university level.

  15. How Computer-Assisted Teaching in Physics Can Enhance Student Learning

    ERIC Educational Resources Information Center

    Karamustafaoglu, O.

    2012-01-01

    Simple harmonic motion (SHM) is an important topic for physics or science students and has wide applications all over the world. Computer simulations are applications of special interest in physics teaching because they support powerful modeling environments involving physics concepts. This article is aimed to compare the effect of…

  16. The Design of a Computer Table for the Physically Handicapped Student.

    ERIC Educational Resources Information Center

    Fitterman, L. Jeffrey

    The paper describes the development of a computer table for physically handicapped students including persons with moderate to severe cerebral palsy, muscular dystrophy, uncontrolled epilepsy, and paralysis due to physical trauma. The project first reviewed furniture currently available for the physically handicapped and then conducted ergonomic…

  17. Mapping University Students' Epistemic Framing of Computational Physics Using Network Analysis

    ERIC Educational Resources Information Center

    Bodin, Madelen

    2012-01-01

    Solving physics problem in university physics education using a computational approach requires knowledge and skills in several domains, for example, physics, mathematics, programming, and modeling. These competences are in turn related to students' beliefs about the domains as well as about learning. These knowledge and beliefs components are…

  18. Computer Integrated Manufacturing: Physical Modelling Systems Design. A Personal View.

    ERIC Educational Resources Information Center

    Baker, Richard

    A computer-integrated manufacturing (CIM) Physical Modeling Systems Design project was undertaken in a time of rapid change in the industrial, business, technological, training, and educational areas in Australia. A specification of a manufacturing physical modeling system was drawn up. Physical modeling provides a flexibility and configurability…

  19. Why I think Computational Physics has been the most valuable part of my undergraduate physics education

    NASA Astrophysics Data System (ADS)

    Parsons, Matthew

    2015-04-01

    Computational physics is a rich and vibrant field in its own right, but often not given the attention that it should receive in the typical undergraduate physics curriculum. It appears that the partisan theorist vs. experimentalist view is still pervasive in academia, or at least still portrayed to students, while in fact there is a continuous spectrum of opportunities in between these two extremes. As a case study, I'll give my perspective as a graduating physics student with examples of computational coursework at Drexel University and research opportunities that this experience has led to.

  20. Principles for the wise use of computers by children.

    PubMed

    Straker, L; Pollock, C; Maslen, B

    2009-11-01

    Computer use by children at home and school is now common in many countries. Child computer exposure varies with the type of computer technology available and the child's age, gender and social group. This paper reviews the current exposure data and the evidence for positive and negative effects of computer use by children. Potential positive effects of computer use by children include enhanced cognitive development and school achievement, reduced barriers to social interaction, enhanced fine motor skills and visual processing and effective rehabilitation. Potential negative effects include threats to child safety, inappropriate content, exposure to violence, bullying, Internet 'addiction', displacement of moderate/vigorous physical activity, exposure to junk food advertising, sleep displacement, vision problems and musculoskeletal problems. The case for child specific evidence-based guidelines for wise use of computers is presented based on children using computers differently to adults, being physically, cognitively and socially different to adults, being in a state of change and development and the potential to impact on later adult risk. Progress towards child-specific guidelines is reported. Finally, a set of guideline principles is presented as the basis for more detailed guidelines on the physical, cognitive and social impact of computer use by children. The principles cover computer literacy, technology safety, child safety and privacy and appropriate social, cognitive and physical development. The majority of children in affluent communities now have substantial exposure to computers. This is likely to have significant effects on child physical, cognitive and social development. Ergonomics can provide and promote guidelines for wise use of computers by children and by doing so promote the positive effects and reduce the negative effects of computer-child, and subsequent computer-adult, interaction.

  1. Indiana Wesleyan University SPS Physics Outreach to Rural Middle School and High School Students

    NASA Astrophysics Data System (ADS)

    Ostrander, Joshua; Rose, Heath; Burchell, Robert; Ramos, Roberto

    2013-03-01

    The Society of Physics Students chapter at Indiana Wesleyan University is unusual in that it has no physics majors, only physics minors. Yet while just over a year old, IWU-SPS has been active in performing physics outreach to middle school and high school students, and to the rural community of Grant County. Our year-old SPS chapter consists of majors from Chemistry, Nursing, Biology, Exercise Science, Computer Science, Psychology, Pastoral Studies, and Science Education, who share a common interest in physics and service to the community. IWU currently has a physics minor and is working to build a physics major program. Despite the intrinsic challenges, our multi-disciplinary group has been successful at using physics demonstration equipment, hands-on activities, and their universal appeal to raise interest in physics in Grant County. We report our experience, challenges, and successes with physics outreach. We describe in detail our two-pronged approach: raising the level of physics appreciation among the IWU student community and among pre-college students in a rural community of Indiana. Acknowledgements: We acknowledge the support of the Society of Physics Students through a Marsh White Outreach Award and a Blake Lilly Prize.

  2. ISSM-SESAW v1.0: mesh-based computation of gravitationally consistent sea-level and geodetic signatures caused by cryosphere and climate driven mass change

    NASA Astrophysics Data System (ADS)

    Adhikari, Surendra; Ivins, Erik R.; Larour, Eric

    2016-03-01

    A classical Green's function approach for computing gravitationally consistent sea-level variations associated with mass redistribution on the earth's surface employed in contemporary sea-level models naturally suits the spectral methods for numerical evaluation. The capability of these methods to resolve high wave number features such as small glaciers is limited by the need for large numbers of pixels and high-degree (associated Legendre) series truncation. Incorporating a spectral model into (components of) earth system models that generally operate on a mesh system also requires repetitive forward and inverse transforms. In order to overcome these limitations, we present a method that functions efficiently on an unstructured mesh, thus capturing the physics operating at kilometer scale yet capable of simulating geophysical observables that are inherently of global scale with minimal computational cost. The goal of the current version of this model is to provide high-resolution solid-earth, gravitational, sea-level and rotational responses for earth system models operating in the domain of the earth's outer fluid envelope on timescales less than about 1 century when viscous effects can largely be ignored over most of the globe. The model has numerous important geophysical applications. For example, we present time-varying computations of global geodetic and sea-level signatures associated with recent ice-sheet changes that are derived from space gravimetry observations. We also demonstrate the capability of our model to simultaneously resolve kilometer-scale sources of the earth's time-varying surface mass transport, derived from high-resolution modeling of polar ice sheets, and predict the corresponding local and global geodetic signatures.
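
    One invariant any gravitationally consistent sea-level model preserves is the global-mean (barystatic) term, which follows from mass conservation alone; a back-of-envelope check with round numbers (not model output) is:

```python
# Back-of-envelope companion to the abstract: the globally uniform
# (barystatic) component of sea-level change that a gravitationally
# consistent solver conserves by construction. The spatially varying
# "fingerprint" the model actually computes is omitted; values are round
# numbers, not model output.
OCEAN_AREA = 3.61e14        # m^2, approximate global ocean surface area
RHO_WATER = 1000.0          # kg/m^3, density of the added meltwater

def barystatic_rise_mm(ice_mass_loss_gt):
    """Mean sea-level rise (mm) from an ice mass loss given in gigatonnes."""
    mass_kg = ice_mass_loss_gt * 1e12        # 1 Gt = 1e12 kg
    return mass_kg / (RHO_WATER * OCEAN_AREA) * 1000.0

# ~360 Gt of ice loss raises global-mean sea level by ~1 mm
print(round(barystatic_rise_mm(360.0), 2))  # → 1.0
```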

  3. Rapid Ice-Sheet Changes and Mechanical Coupling to Solid-Earth/Sea-Level and Space Geodetic Observation

    NASA Astrophysics Data System (ADS)

    Adhikari, S.; Ivins, E. R.; Larour, E. Y.

    2015-12-01

    Perturbations in gravitational and rotational potentials caused by climate driven mass redistribution on the earth's surface, such as ice sheet melting and terrestrial water storage, affect the spatiotemporal variability in global and regional sea level. Here we present a numerically accurate, computationally efficient, high-resolution model for sea level. Unlike contemporary models that are based on spherical-harmonic formulation, the model can operate efficiently in a flexible embedded finite-element mesh system, thus capturing the physics operating at km-scale yet capable of simulating geophysical quantities that are inherently of global scale with minimal computational cost. One obvious application is to compute evolution of sea level fingerprints and associated geodetic and astronomical observables (e.g., geoid height, gravity anomaly, solid-earth deformation, polar motion, and geocentric motion) as a companion to a numerical 3-D thermo-mechanical ice sheet simulation, thus capturing global signatures of climate driven mass redistribution. We evaluate some important time-varying signatures of GRACE inferred ice sheet mass balance and continental hydrological budget; for example, we identify dominant sources of ongoing sea-level change at the selected tide gauge stations, and explain the relative contribution of different sources to the observed polar drift. We also report our progress on ice-sheet/solid-earth/sea-level model coupling efforts toward realistic simulation of Pine Island Glacier over the past several hundred years.

  4. Computer simulation of surface and film processes

    NASA Technical Reports Server (NTRS)

    Tiller, W. A.; Halicioglu, M. T.

    1983-01-01

    Adequate computer methods, based on interactions between discrete particles, provide information leading to an atomic level understanding of various physical processes. The success of these simulation methods, however, is related to the accuracy of the potential energy function representing the interactions among the particles. The development of a potential energy function for crystalline SiO2 forms that can be employed in lengthy computer modelling procedures was investigated. In many of the simulation methods which deal with discrete particles, semiempirical two body potentials were employed to analyze energy and structure related properties of the system. Many body interactions are required for a proper representation of the total energy for many systems. Many body interactions for simulations based on discrete particles are discussed.
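
    A semiempirical two-body potential of the kind the abstract discusses can be sketched with the Lennard-Jones form; parameters are in reduced units and purely illustrative, and the many-body terms the authors argue are needed for SiO2 are omitted.

```python
import math

# Generic two-body (Lennard-Jones) potential sketch of the class the
# abstract says is insufficient for SiO2 without many-body corrections.
# Parameters are in reduced units, purely illustrative.
EPS, SIGMA = 1.0, 1.0

def lj(r):
    """Lennard-Jones pair energy at separation r."""
    s6 = (SIGMA / r) ** 6
    return 4.0 * EPS * (s6 * s6 - s6)

def total_energy(positions):
    """Sum the pair potential over all distinct particle pairs."""
    e = 0.0
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            dx = [a - b for a, b in zip(positions[i], positions[j])]
            e += lj(math.sqrt(sum(d * d for d in dx)))
    return e

# the LJ minimum for a single pair sits at r = 2**(1/6) with energy -EPS
rmin = 2.0 ** (1.0 / 6.0)
print(round(lj(rmin), 6), round(total_energy([(0, 0, 0), (rmin, 0, 0)]), 6))
# → -1.0 -1.0
```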

  5. Development of an Aeroelastic Modeling Capability for Transient Nozzle Side Load Analysis

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Zhao, Xiang; Zhang, Sijun; Chen, Yen-Sen

    2013-01-01

    Lateral nozzle forces are known to cause severe structural damage to any new rocket engine in development during test. While three-dimensional, transient, turbulent, chemically reacting computational fluid dynamics methodology has been demonstrated to capture major side load physics with rigid nozzles, hot-fire tests often show nozzle structure deformation during major side load events, leading to structural damages if structural strengthening measures were not taken. The modeling picture is incomplete without the capability to address the two-way responses between the structure and fluid. The objective of this study is to develop a coupled aeroelastic modeling capability by implementing the necessary structural dynamics component into an anchored computational fluid dynamics methodology. The computational fluid dynamics component is based on an unstructured-grid, pressure-based computational fluid dynamics formulation, while the computational structural dynamics component is developed in the framework of modal analysis. Transient aeroelastic nozzle startup analyses of the Block I Space Shuttle Main Engine at sea level were performed. The computed results from the aeroelastic nozzle modeling are presented.
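
    The modal-analysis side of such a coupling can be sketched as a single damped modal oscillator driven by a fluid force; all numbers below are invented, and the two-way fluid feedback is omitted.

```python
# Single-mode sketch of the structural-dynamics component described in the
# abstract: a modal equation q'' + 2*zeta*w*q' + w^2*q = f(t)/m driven by a
# fluid side load, stepped with semi-implicit Euler. Numbers are invented;
# the real coupling feeds the deformation back into the CFD solution.
M, W, ZETA = 1.0, 10.0, 0.05        # modal mass, frequency (rad/s), damping

def step(q, v, f, dt):
    """One semi-implicit Euler step of the modal equation."""
    a = f / M - 2.0 * ZETA * W * v - W * W * q
    v += a * dt
    q += v * dt
    return q, v

q = v = 0.0
dt = 1e-3
for _ in range(20000):               # 20 s under a constant "side load" f = 5
    q, v = step(q, v, 5.0, dt)

# the damped mode settles near its static deflection f / (m * w^2) = 0.05
print(round(q, 3))
```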

  6. TerraFERMA: Harnessing Advanced Computational Libraries in Earth Science

    NASA Astrophysics Data System (ADS)

    Wilson, C. R.; Spiegelman, M.; van Keken, P.

    2012-12-01

    Many important problems in Earth sciences can be described by non-linear coupled systems of partial differential equations. These "multi-physics" problems include thermo-chemical convection in Earth and planetary interiors, interactions of fluids and magmas with the Earth's mantle and crust and coupled flow of water and ice. These problems are of interest to a large community of researchers but are complicated to model and understand. Much of this complexity stems from the nature of multi-physics where small changes in the coupling between variables or constitutive relations can lead to radical changes in behavior, which in turn affect critical computational choices such as discretizations, solvers and preconditioners. To make progress in understanding such coupled systems requires a computational framework where multi-physics problems can be described at a high-level while maintaining the flexibility to easily modify the solution algorithm. Fortunately, recent advances in computational science provide a basis for implementing such a framework. Here we present the Transparent Finite Element Rapid Model Assembler (TerraFERMA), which leverages several advanced open-source libraries for core functionality. FEniCS (fenicsproject.org) provides a high level language for describing the weak forms of coupled systems of equations, and an automatic code generator that produces finite element assembly code. PETSc (www.mcs.anl.gov/petsc) provides a wide range of scalable linear and non-linear solvers that can be composed into effective multi-physics preconditioners. SPuD (amcg.ese.ic.ac.uk/Spud) is an application neutral options system that provides both human and machine-readable interfaces based on a single xml schema. Our software integrates these libraries and provides the user with a framework for exploring multi-physics problems. A single options file fully describes the problem, including all equations, coefficients and solver options. 
Custom compiled applications are generated from this file but share an infrastructure for services common to all models, e.g. diagnostics, checkpointing and global non-linear convergence monitoring. This maximizes code reusability, reliability and longevity, ensuring that scientific results and the methods used to acquire them are transparent and reproducible. TerraFERMA has been tested against many published geodynamic benchmarks, including 2D/3D thermal convection problems, the subduction zone benchmarks and benchmarks for magmatic solitary waves. It is currently being used in the investigation of reactive cracking phenomena with applications to carbon sequestration, but we will principally discuss its use in modeling the migration of fluids in subduction zones. Subduction zones require an understanding of the highly nonlinear interactions of fluids with solids and thus provide an excellent scientific driver for the development of multi-physics software.
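As a concrete, hypothetical illustration of what a "weak form" input to such a framework looks like, consider a single Poisson equation, far simpler than the coupled systems TerraFERMA targets: multiplying the strong form $-\nabla\cdot(k\nabla u) = f$ by a test function $v$ that vanishes on the boundary and integrating by parts gives

```latex
\text{Find } u \in V \text{ such that} \quad
\int_\Omega k\,\nabla u \cdot \nabla v \,\mathrm{d}x
\;=\; \int_\Omega f\,v \,\mathrm{d}x
\qquad \forall\, v \in V .
```

A high-level form language such as the one FEniCS provides lets the user transcribe expressions like this nearly symbol for symbol, with the finite element assembly code generated automatically.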

  7. Factors influencing hand/eye synchronicity in the computer age.

    PubMed

    Grant, A H

    1992-09-01

    In using a computer, the relation of vision to hand/finger actuated keyboard usage in performing fine motor-coordinated functions is influenced by the physical location, size, and collective placement of the keys. Traditional nonprehensile flat/rectangular keyboard applications usually require a high and nearly constant level of visual attention. Biometrically shaped keyboards would allow for prehensile hand-posturing, thus affording better tactile familiarity with the keys, requiring a less intense and less constant level of visual attention to the task, and providing a greater measure of freedom from having to visualize the key(s). Work pace and related physiological changes, aging, onset of monocularization (intermittent lapsing of binocularity for near vision) that accompanies presbyopia, tool colors, and background contrast are factors affecting constancy of visual attention to task performance. Capitas extension, excessive excyclotorsion, and repetitive strain injuries (such as carpal tunnel syndrome) are common and debilitating concomitants to computer usage. These problems can be remedied by improved keyboard design. The salutary role of mnemonics in minimizing visual dependency is discussed.

  8. Development of Boundary Condition Independent Reduced Order Thermal Models using Proper Orthogonal Decomposition

    NASA Astrophysics Data System (ADS)

    Raghupathy, Arun; Ghia, Karman; Ghia, Urmila

    2008-11-01

    Compact Thermal Models (CTMs) representing IC packages have traditionally been developed using the DELPHI-based (DEvelopment of Libraries of PHysical models for an Integrated design) methodology. The drawbacks of this method are presented, and an alternative method is proposed. A reduced-order model that provides the complete thermal information accurately with fewer computational resources can be effectively used in system-level simulations. Proper Orthogonal Decomposition (POD), a statistical method, can be used to reduce the number of degrees of freedom, or variables, in the computation of such a problem. POD along with the Galerkin projection allows us to create reduced-order models that reproduce the characteristics of the system with a considerable reduction in computational resources while maintaining a high level of accuracy. The goal of this work is to show that this method can be applied to obtain a boundary-condition-independent reduced-order thermal model for complex components. The methodology is applied to the 1D transient heat equation.
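The core of POD model reduction is a singular value decomposition of a snapshot matrix, with the leading singular vectors kept as a reduced basis. The following is a minimal, self-contained sketch (synthetic rank-3 "thermal" data, not the paper's IC-package model):

```python
import numpy as np

# Hypothetical snapshot data: a field sampled at 100 spatial points over
# 50 instants, deliberately built from 3 underlying spatial modes so the
# data is genuinely low-rank (a stand-in for solver output).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)
modes = np.stack([np.sin(np.pi * k * x) for k in (1, 2, 3)])  # (3, 100)
coeffs = rng.normal(size=(50, 3))                             # (50, 3)
snapshots = coeffs @ modes                                    # (50, 100)

def pod_basis(X, r):
    """Return the leading r POD modes of snapshot matrix X (snapshots x space)."""
    # Rows of Vt are spatial modes; singular values rank them by energy content.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt[:r], s

basis, s = pod_basis(snapshots, r=3)
reduced = snapshots @ basis.T      # (50, 3) reduced coordinates
reconstructed = reduced @ basis    # projected back to the full space

# For rank-3 data, 3 modes reconstruct the snapshots to machine precision.
err = np.linalg.norm(snapshots - reconstructed) / np.linalg.norm(snapshots)
```

In a Galerkin-projected reduced-order model, the governing equations would then be projected onto this basis, evolving only the handful of reduced coordinates instead of the full field.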

  9. Computer animation challenges for computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Vines, Mauricio; Lee, Won-Sook; Mavriplis, Catherine

    2012-07-01

    Computer animation requirements differ from those of traditional computational fluid dynamics (CFD) investigations in that visual plausibility and rapid frame update rates trump physical accuracy. We present an overview of the main techniques for fluid simulation in computer animation, starting with Eulerian grid approaches, the Lattice Boltzmann method, Fourier transform techniques and Lagrangian particle introduction. Adaptive grid methods, precomputation of results for model reduction, parallelisation and computation on graphical processing units (GPUs) are reviewed in the context of accelerating simulation computations for animation. A survey of current specific approaches for the application of these techniques to the simulation of smoke, fire, water, bubbles, mixing, phase change and solid-fluid coupling is also included. Adding plausibility to results through particle introduction, turbulence detail and concentration on regions of interest by level set techniques has elevated the degree of accuracy and realism of recent animations. Basic approaches are described here. Techniques to control the simulation to produce a desired visual effect are also discussed. Finally, some references to rendering techniques and haptic applications are mentioned to provide the reader with a complete picture of the challenges of simulating fluids in computer animation.
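Among the Eulerian grid techniques surveyed, semi-Lagrangian advection is the step that most directly trades physical accuracy for stability and speed, which is exactly the trade animation favours. A minimal 1D sketch (toy grid, velocity and time step, not any specific production solver):

```python
import numpy as np

# Minimal 1D semi-Lagrangian advection, the workhorse of "Stable Fluids"
# style animation solvers: trace each grid point backward along the
# velocity and interpolate there. The scheme is unconditionally stable,
# permitting the large time steps animation needs, at the cost of
# numerical diffusion from the interpolation.
n, u, dt = 128, 1.0, 0.5
x = np.arange(n, dtype=float)
field = np.exp(-0.5 * ((x - 32.0) / 4.0) ** 2)  # a Gaussian blob of "smoke"
total0 = field.sum()                            # advection should conserve this

def advect(q, u, dt):
    """One semi-Lagrangian step on a periodic, unit-spaced grid."""
    back = (np.arange(len(q)) - u * dt) % len(q)   # backward-traced departure points
    i0 = np.floor(back).astype(int)
    frac = back - i0
    i1 = (i0 + 1) % len(q)
    return (1.0 - frac) * q[i0] + frac * q[i1]     # linear interpolation

for _ in range(64):
    field = advect(field, u, dt)
# The blob centre has now moved u*dt*64 = 32 cells, from x = 32 to x = 64,
# smeared slightly by the diffusion of linear interpolation.
```

The smearing visible after many steps is why the surveyed methods add turbulence detail or higher-order interpolation back in where the eye notices it.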

  10. Burnout among clinical dental students at Jordanian universities.

    PubMed

    Badran, D H; Al-Ali, M H; Duaibis, R B; Amin, W M

    2010-04-01

    Dentistry is a profession demanding physical and mental efforts as well as people contact, which can result in burnout. The level of burnout among 307 clinical dental students in 2 Jordanian universities was evaluated using the Maslach Burnout Inventory survey. Scores for the inventory's 3 subscales were calculated and the mean values for the students' groups were computed separately. Dental students in both universities suffered high levels of emotional exhaustion and depersonalization. The dental students at the University of Jordan demonstrated a significantly higher level of emotional exhaustion than their counterparts at the Jordan University of Science and Technology.

  11. Greater Leisure Time Physical Activity Is Associated with Lower Allostatic Load in White, Black, and Mexican American Midlife Women: Findings from the National Health and Nutrition Examination Survey, 1999 through 2004.

    PubMed

    Upchurch, Dawn M; Rainisch, Bethany Wexler; Chyu, Laura

    2015-01-01

    Allostatic load is a useful construct to understand how social and environmental conditions get under the skin to affect health. To date, few studies have examined health-enhancing lifestyle behaviors and their potential benefits in reducing allostatic load. The purpose of this study was to investigate the contributions of leisure time physical activity on level of allostatic load among White, Black, and Mexican American midlife women. Data were from the National Health and Nutrition Examination Survey, 1999 through 2004 (n = 1,680, women ages 40-59). All analyses were weighted. Negative binomial regression was used to model a summative count measure of allostatic load (M = 2.30). Models were also computed to estimate adjusted predicted allostatic load for given levels of physical activity, and by race/ethnicity for each age category (40-44, 45-49, 50-54, 55-59), controlling for other demographics and medication use. Higher levels of physical activity were associated significantly with lower levels of allostatic load, independent of demographics. Compared with White women ages 40 to 44, all other racial/ethnic-by-age groups had significantly higher allostatic load. Higher socioeconomic status was associated with a lower allostatic load. Adjusted prediction models demonstrated associations between greater levels of physical activity and lower allostatic load for all ages and racial/ethnic groups. Our findings suggest physical activity may ameliorate some of the effects of cumulative physiological dysregulation and subsequent disease burden in midlife women. Programs and policies that encourage and promote healthy aging and provide opportunities for a diversity of women to engage in health-enhancing lifestyle practices such as physical activity are recommended. Copyright © 2015 Jacobs Institute of Women's Health. Published by Elsevier Inc. All rights reserved.

  12. Advanced Machine Learning Emulators of Radiative Transfer Models

    NASA Astrophysics Data System (ADS)

    Camps-Valls, G.; Verrelst, J.; Martino, L.; Vicent, J.

    2017-12-01

    Physically-based model inversion methodologies are based on physical laws and established cause-effect relationships. A plethora of remote sensing applications rely on the physical inversion of a Radiative Transfer Model (RTM), which leads to physically meaningful bio-geo-physical parameter estimates. The process is, however, computationally expensive and needs expert knowledge for the selection of the RTM, its parametrization, the look-up table generation, and its inversion. Mimicking complex codes with statistical nonlinear machine learning algorithms has recently become the natural alternative. Emulators are statistical constructs able to approximate the RTM at a fraction of the computational cost, while providing an estimation of uncertainty and estimations of the gradient or finite integral forms. We review the field and recent advances in the emulation of RTMs with machine learning models. We posit Gaussian processes (GPs) as the proper framework to tackle the problem. Furthermore, we introduce an automatic methodology to construct emulators for costly RTMs. The Automatic Gaussian Process Emulator (AGAPE) methodology combines the interpolation capabilities of GPs with the accurate design of an acquisition function that favours sampling in low-density regions and flatness of the interpolation function. We illustrate the good capabilities of our emulators in toy examples, in the leaf- and canopy-level PROSPECT and PROSAIL RTMs, and in the construction of an optimal look-up table for atmospheric correction based on MODTRAN5.
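The emulation idea can be sketched in a few lines of GP regression: run the expensive model at a small design of points, then let the GP posterior mean stand in for it elsewhere, with the posterior variance as an uncertainty estimate. Everything below (the stand-in "RTM", kernel, lengthscale, design size) is an illustrative toy, not AGAPE's actual configuration:

```python
import numpy as np

def expensive_rtm(x):
    # Cheap stand-in for a costly radiative transfer code.
    return np.sin(3.0 * x) + 0.5 * x

def rbf(a, b, ell=0.3):
    # Squared-exponential kernel with unit variance and lengthscale ell.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

X = np.linspace(0.0, 2.0, 12)        # small training design
y = expensive_rtm(X)                  # 12 expensive model runs
Xs = np.linspace(0.0, 2.0, 200)       # cheap emulator queries

K = rbf(X, X) + 1e-8 * np.eye(len(X))  # jitter for numerical conditioning
Ks = rbf(Xs, X)
alpha = np.linalg.solve(K, y)
mean = Ks @ alpha                      # GP posterior mean = emulator prediction
# Posterior variance: prior variance (1.0) minus the explained part.
var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)

max_err = np.max(np.abs(mean - expensive_rtm(Xs)))
```

An acquisition function in the AGAPE spirit would then propose the next design point where the interpolant is least trustworthy, rather than using the fixed uniform design shown here.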

  13. An Overview of the Computational Physics and Methods Group at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Randal Scott

    CCS Division was formed to strengthen the visibility and impact of computer science and computational physics research on strategic directions for the Laboratory. Both computer science and computational science are now central to scientific discovery and innovation. They have become indispensable tools for all other scientific missions at the Laboratory. CCS Division forms a bridge between external partners and Laboratory programs, bringing new ideas and technologies to bear on today’s important problems and attracting high-quality technical staff members to the Laboratory. The Computational Physics and Methods Group CCS-2 conducts methods research and develops scientific software aimed at the latest and emerging HPC systems.

  14. Effective STEM Programs for Adolescent Girls: Three Approaches and Many Lessons Learned

    ERIC Educational Resources Information Center

    Mosatche, Harriet S.; Matloff-Nieves, Susan; Kekelis, Linda; Lawner, Elizabeth K.

    2013-01-01

    While women's participation in math and physical science continues to lag to some degree behind that of men, the disparity is much greater in engineering and computer science. Though boys may outperform girls at the highest levels on math and science standardized tests, girls tend to get better course grades in math and science than boys do.…

  15. A Simple Experiment for Determining the Elastic Constant of a Fine Wire

    ERIC Educational Resources Information Center

    Freeman, W. Larry; Freda, Ronald F.

    2007-01-01

    Many general physics laboratories involve the use of springs to demonstrate Hooke's law, and much ado is made about how this can be used as a model for describing the elastic characteristics of materials at the molecular or atomic level. In recent years, the proliferation of computers and appropriate sensors has made it possible to demonstrate…

  16. Cumulative trauma disorder risk for children using computer products: results of a pilot investigation with a student convenience sample.

    PubMed

    Burke, Adam; Peper, Erik

    2002-01-01

    Cumulative trauma disorder is a major health problem for adults. Despite a growing understanding of adult cumulative trauma disorder, however, little is known about the risks for younger populations. This investigation examined issues related to child/adolescent computer product use and upper body physical discomfort. A convenience sample of 212 students, grades 1-12, was interviewed at their homes by a college-age sibling or relative. One of the child's parents was also interviewed. A 22-item questionnaire was used for data gathering. Questionnaire items included frequency and duration of use, type of computer products/games and input devices used, presence of physical discomfort, and parental concerns related to the child's computer use. Many students experienced physical discomfort attributed to computer use, such as wrist pain (30%) and back pain (15%). Specific computer activities, such as using a joystick or playing noneducational games, were significantly predictive of physical discomfort in a logistic multiple regression. Many parents reported difficulty getting their children off the computer (46%) and that their children spent less time outdoors (35%). Computer product use within this cohort was associated with self-reported physical discomfort. Results suggest a need for more extensive study, including multiyear longitudinal surveys.

  17. Level set immersed boundary method for gas-liquid-solid interactions with phase-change

    NASA Astrophysics Data System (ADS)

    Dhruv, Akash; Balaras, Elias; Riaz, Amir; Kim, Jungho

    2017-11-01

    We will discuss an approach to simulate the interaction between two-phase flows with phase change and stationary/moving structures. In our formulation, the Navier-Stokes and heat advection-diffusion equations are solved on a block-structured grid using adaptive mesh refinement (AMR), along with sharp jumps in pressure, velocity and temperature across the interface separating the different phases. The jumps are implemented using a modified Ghost Fluid Method (Lee et al., J. Comput. Physics, 344:381-418, 2017), and the interface is tracked with a level set approach. Phase transition is achieved by calculating the mass flux near the interface and extrapolating it to the rest of the domain using a Hamilton-Jacobi equation. Stationary/moving structures are simulated with an immersed boundary formulation based on moving least squares (Vanella & Balaras, J. Comput. Physics, 228:6617-6628, 2009). A variety of canonical problems involving vaporization, film boiling and nucleate boiling is presented to validate the method and demonstrate its formal accuracy. The robustness of the solver in complex problems, which are crucial in the efficient design of heat transfer mechanisms for various applications, will also be demonstrated. Work supported by NASA, Grant NNX16AQ77G.
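The level set representation mentioned above is easy to illustrate in isolation (this is the generic technique, not the authors' solver): the interface is the zero contour of a signed-distance function phi, and geometric quantities follow from derivatives of phi. The grid and the circular "bubble" below are arbitrary toy choices:

```python
import numpy as np

# Signed-distance level set for a circular interface of radius 0.5.
n = 200
x = np.linspace(-1.0, 1.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")
phi = np.sqrt(X**2 + Y**2) - 0.5   # phi < 0 inside, phi > 0 outside

# A signed-distance function satisfies |grad phi| = 1 near the interface,
# the eikonal property that level-set reinitialisation maintains.
gx, gy = np.gradient(phi, x, x)
grad_mag = np.sqrt(gx**2 + gy**2)

# Cells in a narrow band around the zero level set; their distance from
# the origin should recover the interface radius of 0.5.
dx = x[1] - x[0]
band = np.abs(phi) < dx
radii = np.sqrt(X[band] ** 2 + Y[band] ** 2)
```

In a full solver such as the one described, phi would additionally be advected with the flow velocity, and jump conditions at the interface would be applied where phi changes sign.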

  18. Proceedings of the Finnish-Russian Symposium on Information Technology in Modern Physics Classroom (Helsinki, Finland, April 21-24, 1993). Research Report 123.

    ERIC Educational Resources Information Center

    Ahtee, Maija, Ed.; Meisalo, Veijo, Ed.; Lavonen, Jari, Ed.

    The 15 conference papers in this report address a variety of issues such as computer applications in mechanics and optics, three-dimensional representation in physics teaching, computers in the physics laboratory, information technologies, the perceptual approach in physics education, improving students' conceptual understanding in physics, using…

  19. Learning physics in a water park

    NASA Astrophysics Data System (ADS)

    Cabeza, Cecilia; Rubido, Nicolás; Martí, Arturo C.

    2014-03-01

    Entertaining and educational experiments that can be conducted in a water park, illustrating physics concepts, principles and fundamental laws, are described. These experiments are suitable for students ranging from senior secondary school to junior university level. Newton’s laws of motion, Bernoulli’s equation, based on the conservation of energy, buoyancy, linear and non-linear wave propagation, turbulence, thermodynamics, optics and cosmology are among the topics that can be discussed. Commonly available devices like smartphones, digital cameras, laptop computers and tablets, can be used conveniently to enable accurate calculation and a greater degree of engagement on the part of students.
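Of the topics listed, Bernoulli's equation is the one most directly at work in a water slide. For steady, incompressible, inviscid flow it expresses conservation of energy per unit volume along a streamline:

```latex
p + \tfrac{1}{2}\rho v^{2} + \rho g h = \text{const} ,
```

so a drop in height $h$ along the slide trades gravitational potential energy for the kinetic term $\tfrac{1}{2}\rho v^{2}$, which students can check against speeds measured from smartphone video.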

  20. Scholarly literature and the press: scientific impact and social perception of physics computing

    NASA Astrophysics Data System (ADS)

    Pia, M. G.; Basaglia, T.; Bell, Z. W.; Dressendorfer, P. V.

    2014-06-01

    The broad coverage of the search for the Higgs boson in the mainstream media is a relative novelty for high energy physics (HEP) research, whose achievements have traditionally been confined to scholarly literature. This paper illustrates the results of a scientometric analysis of HEP computing in scientific literature, institutional media and the press, and a comparative overview of similar metrics concerning representative particle physics measurements. The picture emerging from these scientometric data documents the relationship between the scientific impact and the social perception of HEP research versus that of HEP computing. The results of this analysis suggest that improved communication of the scientific and social role of HEP computing via press releases from the major HEP laboratories would be beneficial to the high energy physics community.

  1. PREFACE: IUPAP C20 Conference on Computational Physics (CCP 2011)

    NASA Astrophysics Data System (ADS)

    Troparevsky, Claudia; Stocks, George Malcolm

    2012-12-01

    Increasingly, computational physics stands alongside experiment and theory as an integral part of the modern approach to solving the great scientific challenges of the day on all scales - from cosmology and astrophysics, through climate science, to materials physics, and the fundamental structure of matter. Computational physics touches aspects of science and technology with direct relevance to our everyday lives, such as communication technologies and securing a clean and efficient energy future. This volume of Journal of Physics: Conference Series contains the proceedings of the scientific contributions presented at the 23rd Conference on Computational Physics held in Gatlinburg, Tennessee, USA, in November 2011. The annual Conferences on Computational Physics (CCP) are dedicated to presenting an overview of the most recent developments and opportunities in computational physics across a broad range of topical areas and from around the world. The CCP series has been in existence for more than 20 years, serving as a lively forum for computational physicists. The topics covered by this conference were: Materials/Condensed Matter Theory and Nanoscience, Strongly Correlated Systems and Quantum Phase Transitions, Quantum Chemistry and Atomic Physics, Quantum Chromodynamics, Astrophysics, Plasma Physics, Nuclear and High Energy Physics, Complex Systems: Chaos and Statistical Physics, Macroscopic Transport and Mesoscopic Methods, Biological Physics and Soft Materials, Supercomputing and Computational Physics Teaching, Computational Physics and Sustainable Energy. 
We would like to take this opportunity to thank our sponsors: International Union of Pure and Applied Physics (IUPAP), IUPAP Commission on Computational Physics (C20), American Physical Society Division of Computational Physics (APS-DCOMP), Oak Ridge National Laboratory (ORNL), Center for Defect Physics (CDP), the University of Tennessee (UT)/ORNL Joint Institute for Computational Sciences (JICS) and Cray, Inc. We are grateful to the committees that helped put the conference together, especially the local organizing committee. Particular thanks are also due to a number of ORNL staff who spent long hours on the administrative details. We are pleased to express our thanks to the conference administrator Ann Strange (ORNL/CDP) for her responsive and efficient day-to-day handling of this event, Sherry Samples, Assistant Conference Administrator (ORNL), Angie Beach and the ORNL Conference Office, and Shirley Shugart (ORNL) and Fern Stooksbury (ORNL), who created and maintained the conference website.
Editors: G Malcolm Stocks (ORNL) and M Claudia Troparevsky (UT)
http://ccp2011.ornl.gov
Chair: Dr Malcolm Stocks (ORNL)
Vice Chairs: Adriana Moreo (ORNL/UT), James Gubernatis (LANL)
Local Program Committee: Don Batchelor (ORNL), Jack Dongarra (UTK/ORNL), James Hack (ORNL), Robert Harrison (ORNL), Paul Kent (ORNL), Anthony Mezzacappa (ORNL), Adriana Moreo (ORNL), Witold Nazarewicz (UT), Loukas Petridis (ORNL), David Schultz (ORNL), Bill Shelton (ORNL), Claudia Troparevsky (ORNL), Mina Yoon (ORNL)
International Advisory Board Members: Joan Adler (Israel Institute of Technology, Israel), Constantia Alexandrou (University of Cyprus, Cyprus), Claudia Ambrosch-Draxl (University of Leoben, Austria), Amanda Barnard (CSIRO, Australia), Peter Borcherds (University of Birmingham, UK), Klaus Cappelle (UFABC, Brazil), Giovanni Ciccotti (Università degli Studi di Roma 'La Sapienza', Italy), Nithaya Chetty (University of Pretoria, South Africa), Charlotte Froese-Fischer (NIST, US), Giulia A. Galli (University of California, Davis, US), Gillian Gehring (University of Sheffield, UK), Guang-Yu Guo (National Taiwan University, Taiwan), Sharon Hammes-Schiffer (Penn State, US), Alex Hansen (Norwegian UST), Duane D. Johnson (University of Illinois at Urbana-Champaign, US), David Landau (University of Georgia, US), Joaquin Marro (University of Granada, Spain), Richard Martin (UIUC, US), Todd Martinez (Stanford University, US), Bill McCurdy (Lawrence Berkeley National Laboratory, US), Ingrid Mertig (Martin Luther University, Germany), Alejandro Muramatsu (Universitat Stuttgart, Germany), Richard Needs (Cavendish Laboratory, UK), Giuseppina Orlandini (University of Trento, Italy), Martin Savage (University of Washington, US), Thomas Schulthess (ETH, Switzerland), Dzidka Szotek (Daresbury Laboratory, UK), Hideaki Takabe (Osaka University, Japan), William M. Tang (Princeton University, US), James Vary (Iowa State, US), Enge Wang (Chinese Academy of Science, China), Jian-Guo Wang (Institute of Applied Physics and Computational Mathematics, China), Jian-Sheng Wang (National University, Singapore), Dan Wei (Tsinghua University, China), Tony Williams (University of Adelaide, Australia), Rudy Zeller (Jülich, Germany)
Conference Administrator: Ann Strange (ORNL)

  2. Anniversary Paper: Image processing and manipulation through the pages of Medical Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armato, Samuel G. III; Ginneken, Bram van

    The language of radiology has gradually evolved from "the film" (the foundation of radiology since Wilhelm Roentgen's 1895 discovery of x-rays) to "the image," an electronic manifestation of a radiologic examination that exists within the bits and bytes of a computer. Rather than simply storing and displaying radiologic images in a static manner, the computational power of the computer may be used to enhance a radiologist's ability to visually extract information from the image through image processing and image manipulation algorithms. Image processing tools provide a broad spectrum of opportunities for image enhancement. Gray-level manipulations such as histogram equalization, spatial alterations such as geometric distortion correction, preprocessing operations such as edge enhancement, and enhanced radiography techniques such as temporal subtraction provide powerful methods to improve the diagnostic quality of an image or to enhance structures of interest within an image. Furthermore, these image processing algorithms provide the building blocks of more advanced computer vision methods. The prominent role of medical physicists and the AAPM in the advancement of medical image processing methods, and in the establishment of the "image" as the fundamental entity in radiology and radiation oncology, has been captured in 35 volumes of Medical Physics.

  3. Computational modeling of fully-ionized, magnetized plasmas using the fluid approximation

    NASA Astrophysics Data System (ADS)

    Schnack, Dalton

    2005-10-01

    Strongly magnetized plasmas are rich in spatial and temporal scales, making a computational approach useful for studying these systems. The most accurate model of a magnetized plasma is based on a kinetic equation that describes the evolution of the distribution function for each species in six-dimensional phase space. However, the high dimensionality renders this approach impractical for computations over long time scales in relevant geometry. Fluid models, derived by taking velocity moments of the kinetic equation [1] and truncating (closing) the hierarchy at some level, are an approximation to the kinetic model. The reduced dimensionality allows a wider range of spatial and/or temporal scales to be explored. Several approximations have been used [2-5]. Successful computational modeling requires understanding the ordering and closure approximations, the fundamental waves supported by the equations, and the numerical properties of the discretization scheme. We review and discuss several ordering schemes, their normal modes, and several algorithms that can be applied to obtain a numerical solution. The implementation of kinetic parallel closures is also discussed [6]. [1] S. Chapman and T.G. Cowling, "The Mathematical Theory of Non-Uniform Gases", Cambridge University Press, Cambridge, UK (1939). [2] R.D. Hazeltine and J.D. Meiss, "Plasma Confinement", Addison-Wesley Publishing Company, Redwood City, CA (1992). [3] L.E. Sugiyama and W. Park, Physics of Plasmas 7, 4644 (2000). [4] J.J. Ramos, Physics of Plasmas 10, 3601 (2003). [5] P.J. Catto and A.N. Simakov, Physics of Plasmas 11, 90 (2004). [6] E.D. Held et al., Physics of Plasmas 11, 2419 (2004).
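For reference, the first two members of this moment hierarchy are standard: the zeroth and first velocity moments of the kinetic equation give, for each species $s$, the continuity and momentum equations

```latex
\frac{\partial n_s}{\partial t} + \nabla \cdot (n_s \mathbf{u}_s) = 0 ,
\qquad
m_s n_s \left( \frac{\partial \mathbf{u}_s}{\partial t}
  + \mathbf{u}_s \cdot \nabla \mathbf{u}_s \right)
= q_s n_s \left( \mathbf{E} + \mathbf{u}_s \times \mathbf{B} \right)
  - \nabla \cdot \mathsf{P}_s + \mathbf{R}_s ,
```

where $\mathsf{P}_s$ is the pressure tensor and $\mathbf{R}_s$ the friction force. Each moment equation involves the next higher moment (here the pressure tensor; in the energy equation, the heat flux), which is precisely why the hierarchy must be truncated by a closure approximation.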

  4. Engineering physics and mathematics division

    NASA Astrophysics Data System (ADS)

    Sincovec, R. F.

    1995-07-01

    This report provides a record of the research activities of the Engineering Physics and Mathematics Division for the period 1 Jan. 1993 - 31 Dec. 1994. This report is the final archival record of the EPM Division. On 1 Oct. 1994, ORELA was transferred to Physics Division and on 1 Jan. 1995, the Engineering Physics and Mathematics Division and the Computer Applications Division reorganized to form the Computer Science and Mathematics Division and the Computational Physics and Engineering Division. Earlier reports in this series are identified on the previous pages, along with the progress reports describing ORNL's research in the mathematical sciences prior to 1984 when those activities moved into the Engineering Physics and Mathematics Division.

  5. Assessment for Effective Intervention: Enrichment Science Academic Program

    NASA Astrophysics Data System (ADS)

    Sasson, Irit; Cohen, Donita

    2013-10-01

    Israel suffers from a growing problem of socio-economic gaps between those who live in the center of the country and residents of outlying areas. As a result, there is a low level of accessibility to higher education among the peripheral population. The goal of the Sidney Warren Science Education Center for Youth at Tel-Hai College is to strengthen the potential of middle and high school students and encourage them to pursue higher education, with an emphasis on majoring in science and technology. This study investigated the implementation and evaluation of the enrichment science academic program, as an example of an informal learning environment, with an emphasis on physics studies. About 500 students completed a feedback survey after participating in science activities in four domains: biology, chemistry, physics, and computer science. Results indicated a high level of satisfaction among the students. No gender differences were found, except in physics, where boys held more positive attitudes. To gain a deeper understanding of this finding, about 70 additional students completed special questionnaires, both 1 week before the physics enrichment day and at the end of that day. The questionnaires were intended to assess both their attitudes toward physics and their knowledge and conceptions of the physical concept "pressure." We found that the activity moderately improved boys' attitudes toward physics, but that girls displayed decreased interest in and lower self-efficacy toward physics. Research results were used to improve the instructional design of the physics activity, demonstrating an internal evaluation process for effective intervention.

  6. Computer work and self-reported variables on anthropometrics, computer usage, work ability, productivity, pain, and physical activity

    PubMed Central

    2013-01-01

    Background Computer users often report musculoskeletal complaints and pain in the upper extremities and the neck-shoulder region. However, recent epidemiological studies do not report a relationship between the extent of computer use and work-related musculoskeletal disorders (WMSD). The aim of this study was to conduct an explorative analysis of short- and long-term pain complaints and work-related variables in a cohort of Danish computer users. Methods A structured web-based questionnaire including questions related to musculoskeletal pain, anthropometrics, work-related variables, work ability, productivity, health-related parameters, lifestyle variables as well as physical activity during leisure time was designed. Six hundred and ninety office workers completed the questionnaire, responding to an announcement posted in a union magazine. The questionnaire outcomes, i.e., pain intensity, duration and locations as well as anthropometrics, work-related variables, work ability, productivity, and level of physical activity, were stratified by gender and correlations were obtained. Results Women reported higher pain intensity, longer pain duration as well as more locations with pain than men (P < 0.05). In parallel, women scored poorer work ability and ability to fulfil the requirements on productivity than men (P < 0.05). Strong positive correlations were found between pain intensity and pain duration for the forearm, elbow, neck and shoulder (P < 0.001). Moderate negative correlations were seen between pain intensity and work ability/productivity (P < 0.001). Conclusions The present results provide new key information on pain characteristics in office workers. The differences in pain characteristics, i.e., higher intensity, longer duration and more pain locations as well as poorer work ability reported by women workers relate to their higher risk of contracting WMSD. 
Overall, this investigation confirmed the complex interplay between anthropometrics, work ability, productivity, and pain perception among computer users. PMID:23915209

  7. Federated data storage system prototype for LHC experiments and data intensive science

    NASA Astrophysics Data System (ADS)

    Kiryanov, A.; Klimentov, A.; Krasnopevtsev, D.; Ryabinkin, E.; Zarochentsev, A.

    2017-10-01

    Rapid increase of data volume from the experiments running at the Large Hadron Collider (LHC) has prompted the physics computing community to evaluate new data handling and processing solutions. Russian grid sites and university clusters scattered over a large area aim at uniting their resources for future productive work, while giving an opportunity to support large physics collaborations. In our project we address the fundamental problem of designing a computing architecture to integrate distributed storage resources for LHC experiments and other data-intensive science applications and to provide access to data from heterogeneous computing facilities. Studies include development and implementation of a federated data storage prototype for Worldwide LHC Computing Grid (WLCG) centres of different levels and university clusters within one National Cloud. The prototype is based on computing resources located in Moscow, Dubna, Saint Petersburg, Gatchina and Geneva. This project intends to implement a federated distributed storage for all kinds of operations, such as read/write/transfer and access via WAN from Grid centres, university clusters, supercomputers, academic and commercial clouds. The efficiency and performance of the system are demonstrated using synthetic and experiment-specific tests including real data processing and analysis workflows from the ATLAS and ALICE experiments, as well as compute-intensive bioinformatics applications (PALEOMIX) running on supercomputers. We present the topology and architecture of the designed system, report performance and statistics for different access patterns, and show how federated data storage can be used efficiently by physicists and biologists. We also describe how sharing data on a widely distributed storage system can lead to a new computing model and changes in computing style, for instance how a bioinformatics program running on supercomputers can read/write data from the federated storage.

  8. Computing in Secondary Physics at Armdale, W.A.

    ERIC Educational Resources Information Center

    Smith, Clifton L.

    1976-01-01

    An Australian secondary school physics course utilizing an electronic programmable calculator and computer is described. Calculation techniques and functions, programming techniques, and simulation of physical systems are detailed. A summary of student responses to the program is included. (BT)

  9. Bottom-quark forward-backward asymmetry in the standard model and beyond.

    PubMed

    Grinstein, Benjamín; Murphy, Christopher W

    2013-08-09

    We computed the bottom-quark forward-backward asymmetry at the Tevatron in the standard model (SM) and for several new physics scenarios. Near the Z pole, the SM bottom asymmetry is dominated by tree-level exchanges of electroweak gauge bosons, while above the Z pole, next-to-leading-order QCD dominates the SM asymmetry, as was the case with the top-quark forward-backward asymmetry. Light new physics, M(NP)≲150 GeV, can cause significant deviations from the SM prediction for the bottom asymmetry. The bottom asymmetry can be used to distinguish between competing new physics (NP) explanations of the top asymmetry based on how the NP interferes with s-channel gluon and Z exchange.

  10. PREFACE: IC-MSQUARE 2012: International Conference on Mathematical Modelling in Physical Sciences

    NASA Astrophysics Data System (ADS)

    Kosmas, Theocharis; Vagenas, Elias; Vlachos, Dimitrios

    2013-02-01

    The first International Conference on Mathematical Modelling in Physical Sciences (IC-MSQUARE) took place in Budapest, Hungary, from Monday 3 to Friday 7 September 2012. The conference was attended by more than 130 participants and hosted about 290 oral, poster and virtual papers by more than 460 pre-registered authors. The first IC-MSQUARE consisted of diverse workshops and thus covered various research fields in which mathematical modelling is used, such as theoretical/mathematical physics, neutrino physics, non-integrable systems, dynamical systems, computational nanoscience, biological physics, computational biomechanics, complex networks, stochastic modelling, fractional statistics, DNA dynamics, and macroeconomics. The scientific programme was full: after the keynote and invited talks each morning, two parallel sessions ran every day. Nevertheless, according to all attendees the talks were of a high level and the scientific environment was fruitful, so all attendees had a creative time. The question that arises is whether this occurred by accident, or whether IC-MSQUARE fills a genuine need in the field of physical and mathematical modelling. For those of us working in the field, the existing and established conferences in this area suffer from two recognized drawbacks: an increasingly narrow orientation, and the extreme specialization of the meetings. Therefore, a conference which aims to promote the knowledge and development of high-quality research in mathematical fields concerned with applications to other scientific fields, as well as modern technological trends in physics, chemistry, biology, medicine, economics, sociology, environmental sciences, etc., appears to be a necessity. This is the key role that IC-MSQUARE will play. We would like to thank the Keynote Speaker and the Invited Speakers for their significant contributions to IC-MSQUARE. We would also like to thank the members of the International Scientific Committee and the members of the Organizing Committee. Conference Chairmen: Theocharis Kosmas, Department of Physics, University of Ioannina; Elias Vagenas, RCAAM, Academy of Athens; Dimitrios Vlachos, Department of Computer Science and Technology, University of Peloponnese. The PDF also contains a list of members of the International Scientific Committee and details of the Keynote and Invited Speakers.

  11. Physical activity and occupational risk of colon cancer in Shanghai, China.

    PubMed

    Chow, W H; Dosemeci, M; Zheng, W; Vetter, R; McLaughlin, J K; Gao, Y T; Blot, W J

    1993-02-01

    Using occupational data for over 2000 colon cancer cases diagnosed between 1980 and 1984 in Shanghai, and employment information from the 1982 census for the Shanghai population, standardized incidence ratios (SIR) were computed for occupational groups classified by job type and physical activity level. Men employed in occupations with low physical activity levels had modest but significantly elevated risks of colon cancer. The SIR for jobs with low activity based on sitting time was 121 (95% confidence interval, CI: 108-135) and based on energy expenditure was 126 (95% CI: 115-138). Corresponding SIRs for women were 99 (95% CI: 83-118) and 113 (95% CI: 100-127). The data were also used to screen for specific occupations with elevated SIRs to generate leads on occupational colon cancer. Increased incidence was observed for professional and other white collar workers, male chemical processors, and female textile workers. The findings add to the emerging evidence that workplace activity may influence the risk of this common cancer.
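As a worked illustration of the SIR arithmetic quoted above: an SIR is the observed case count divided by the count expected from reference rates, scaled by 100. The counts below are hypothetical (chosen only so the result lands near the reported 121 [108-135]; they are not taken from the study), and the interval uses a standard log-scale Poisson approximation rather than the authors' exact method.

```python
import math

def sir_with_ci(observed, expected, z=1.96):
    """Standardized incidence ratio (scaled by 100) with an approximate
    95% confidence interval, treating the observed count as Poisson and
    working on the log scale (var(log O) ~ 1/O)."""
    sir = 100.0 * observed / expected
    se_log = math.sqrt(1.0 / observed)
    return sir, sir * math.exp(-z * se_log), sir * math.exp(z * se_log)

# Hypothetical counts: 300 observed cases vs. 248 expected
sir, lo, hi = sir_with_ci(300, 248.0)
```

With these invented counts the point estimate is about 121 with an interval near 108-135, matching the form (though not the provenance) of the figures reported in the abstract.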

  12. Automated analysis of short responses in an interactive synthetic tutoring system for introductory physics

    NASA Astrophysics Data System (ADS)

    Nakamura, Christopher M.; Murphy, Sytil K.; Christel, Michael G.; Stevens, Scott M.; Zollman, Dean A.

    2016-06-01

    Computer-automated assessment of students' text responses to short-answer questions represents an important enabling technology for online learning environments. We have investigated the use of machine learning to train computer models capable of automatically classifying short-answer responses and assessed the results. Our investigations are part of a project to develop and test an interactive learning environment designed to help students learn introductory physics concepts. The system is designed around an interactive video tutoring interface. We analyzed nine questions, each with about 150 responses or fewer. For four of the nine, automated assessment achieved interrater agreement of 70% or better with the human rater. This level of agreement may represent a baseline for practical utility in instruction and indicates that the method warrants further investigation for use in this type of application. Our results also suggest strategies that may be useful for writing activities and questions that are more appropriate for automated assessment. These strategies include building activities that have relatively few conceptually distinct ways of perceiving the physical behavior of relatively few physical objects. Further success in this direction may allow us to promote interactivity and provide better feedback in online learning systems. These capabilities could enable our system to function more like a real tutor.
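The classify-then-measure-agreement workflow described above can be sketched in miniature. This is not the paper's model: it is a toy bag-of-words nearest-neighbour classifier with a percent-agreement score, and the example responses are invented.

```python
import math
from collections import Counter

def bow(text):
    # Bag-of-words vector as a word -> count mapping
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def classify(response, labeled):
    # labeled: list of (text, label); the most similar labeled response wins
    return max(labeled, key=lambda tl: cosine(bow(response), bow(tl[0])))[1]

def percent_agreement(machine_labels, human_labels):
    # Simple interrater agreement between the classifier and a human rater
    hits = sum(m == h for m, h in zip(machine_labels, human_labels))
    return hits / len(human_labels)
```

A production system would use a trained model rather than nearest-neighbour lookup, but the 70%-interrater-agreement criterion mentioned in the abstract reduces to exactly this kind of label-by-label comparison against a human rater.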

  13. Physical activity, screen time, and school absenteeism: self-reports from NHANES 2005-2008.

    PubMed

    Hansen, Andrew R; Pritchard, Tony; Melnic, Irina; Zhang, Jian

    2016-01-01

    The purpose of this study was to examine how lifestyle behaviors, in the context of physical activity levels and screen time, are associated with school absenteeism. We analyzed 2005-2008 NHANES data from proxy interviews for 1048 children aged 6-11 years and in-person self-reports from 1117 adolescents aged 12-18 years. Missing 10% of school days during the past school year was defined as severe school absenteeism (SSA). Watching TV ≥2 hours a day was significantly associated with SSA among both children (OR = 3.51 [1.03-12.0]) and adolescents (OR = 3.96 [1.84-8.52]) compared with their peers watching <2 hours a day. A U-shaped association was identified between the level of physical activity and SSA among children: both inactive children (OR = 12.4 [1.43-108]) and highly active children (OR = 14.8 [2.82-77.7]) had higher odds of SSA compared with children with medium levels of physical activity. No associations were observed for either children (OR = 0.57 [0.16-1.99]) or adolescents (OR = 0.94 [0.44-2.03]) using a computer ≥3 hours a day. Limitations: cross-sectional study involving self-reports; transportation to and from school not included in the physical activity assessment; absenteeism not validated with report cards; unable to account for the absence type or the frequency of illness or injury; no psychometric properties provided for the subjective measures of participants' attitudes and characteristic traits towards physical activity, TV viewing, and school attendance. Excessive TV watching among children and adolescents, and inactivity and high activity levels (≥7 times per week) among children, are independently associated with severe school absenteeism.
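An aside on the statistics quoted above: an odds ratio with its bracketed 95% confidence interval can be computed from a 2x2 exposure-by-outcome table. The published ORs come from regression modelling, so the counts below are purely hypothetical; the interval uses the standard Woolf (log-OR) approximation.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table:
        a = exposed cases,    b = exposed non-cases,
        c = unexposed cases,  d = unexposed non-cases.
    The CI uses the Woolf normal approximation on the log scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, or_ * math.exp(-z * se), or_ * math.exp(z * se)
```

For example, a hypothetical table with 20 exposed cases, 80 exposed non-cases, 10 unexposed cases and 160 unexposed non-cases gives an OR of 4.0 with a CI of roughly [1.8-8.9].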

  14. "Let's get physical": advantages of a physical model over 3D computer models and textbooks in learning imaging anatomy.

    PubMed

    Preece, Daniel; Williams, Sarah B; Lam, Richard; Weller, Renate

    2013-01-01

    Three-dimensional (3D) information plays an important part in medical and veterinary education. Appreciating complex 3D spatial relationships requires a strong foundational understanding of anatomy and mental 3D visualization skills. Novel learning resources have been introduced to anatomy training to achieve this. Objective evaluation of their comparative efficacies remains scarce in the literature. This study developed and evaluated the use of a physical model in demonstrating the complex spatial relationships of the equine foot. It was hypothesized that the newly developed physical model would be more effective for students to learn magnetic resonance imaging (MRI) anatomy of the foot than textbooks or computer-based 3D models. Third year veterinary medicine students were randomly assigned to one of three teaching aid groups (physical model; textbooks; 3D computer model). The comparative efficacies of the three teaching aids were assessed through students' abilities to identify anatomical structures on MR images. Overall mean MRI assessment scores were significantly higher in students utilizing the physical model (86.39%) compared with students using textbooks (62.61%) and the 3D computer model (63.68%) (P < 0.001), with no significant difference between the textbook and 3D computer model groups (P = 0.685). Student feedback was also more positive in the physical model group compared with both the textbook and 3D computer model groups. Our results suggest that physical models may hold a significant advantage over alternative learning resources in enhancing visuospatial and 3D understanding of complex anatomical architecture, and that 3D computer models have significant limitations with regards to 3D learning. © 2013 American Association of Anatomists.

  15. Logistics in the Computer Lab.

    ERIC Educational Resources Information Center

    Cowles, Jim

    1989-01-01

    Discusses ways to provide good computer laboratory facilities for elementary and secondary schools. Topics discussed include establishing the computer lab and selecting hardware; types of software; physical layout of the room; printers; networking possibilities; considerations relating to the physical environment; and scheduling methods. (LRW)

  16. Reviews, Software.

    ERIC Educational Resources Information Center

    Science Teacher, 1988

    1988-01-01

    Reviews two computer software packages for use in physical science, physics, and chemistry classes. Includes "Physics of Model Rocketry" for Apple II, and "Black Box" for Apple II and IBM compatible computers. "Black Box" is designed to help students understand the concept of indirect evidence. (CW)

  17. Dynamical Approach Study of Spurious Numerics in Nonlinear Computations

    NASA Technical Reports Server (NTRS)

    Yee, H. C.; Mansour, Nagi (Technical Monitor)

    2002-01-01

    The last two decades have been an era when computation is ahead of analysis and when very large scale practical computations are increasingly used in poorly understood multiscale complex nonlinear physical problems and non-traditional fields. Ensuring a higher level of confidence in the predictability and reliability (PAR) of these numerical simulations could play a major role in furthering the design, understanding, affordability and safety of our next generation air and space transportation systems, and systems for planetary and atmospheric sciences, and in understanding the evolution and origin of life. The need to guarantee PAR becomes acute when computations offer the ONLY way of solving these types of data limited problems. Employing theory from nonlinear dynamical systems, some building blocks to ensure a higher level of confidence in PAR of numerical simulations have been revealed by the author and world expert collaborators in relevant fields. Five building blocks with supporting numerical examples were discussed. The next step is to utilize knowledge gained by including nonlinear dynamics, bifurcation and chaos theories as an integral part of the numerical process. The third step is to design integrated criteria for reliable and accurate algorithms that cater to the different multiscale nonlinear physics. This includes but is not limited to the construction of appropriate adaptive spatial and temporal discretizations that are suitable for the underlying governing equations. In addition, a multiresolution wavelets approach for adaptive numerical dissipation/filter controls for high speed turbulence, acoustics and combustion simulations will be sought. These steps are cornerstones for guarding against spurious numerical solutions that are solutions of the discretized counterparts but are not solutions of the underlying governing equations.

  18. Combined inverse-forward artificial neural networks for fast and accurate estimation of the diffusion coefficients of cartilage based on multi-physics models.

    PubMed

    Arbabi, Vahid; Pouran, Behdad; Weinans, Harrie; Zadpoor, Amir A

    2016-09-06

    Analytical and numerical methods have been used to extract essential engineering parameters such as elastic modulus, Poisson's ratio, permeability and diffusion coefficient from experimental data in various types of biological tissues. The major limitation associated with analytical techniques is that they are often only applicable to problems with simplified assumptions. Numerical multi-physics methods, on the other hand, enable minimizing the simplified assumptions but require substantial computational expertise, which is not always available. In this paper, we propose a novel approach that combines inverse and forward artificial neural networks (ANNs) which enables fast and accurate estimation of the diffusion coefficient of cartilage without any need for computational modeling. In this approach, an inverse ANN is trained using our multi-zone biphasic-solute finite-bath computational model of diffusion in cartilage to estimate the diffusion coefficient of the various zones of cartilage given the concentration-time curves. Robust estimation of the diffusion coefficients, however, requires introducing certain levels of stochastic variations during the training process. Determining the required level of stochastic variation is performed by coupling the inverse ANN with a forward ANN that receives the diffusion coefficient as input and returns the concentration-time curve as output. Combined together, forward-inverse ANNs enable computationally inexperienced users to obtain accurate and fast estimation of the diffusion coefficients of cartilage zones. The diffusion coefficients estimated using the proposed approach are compared with those determined using direct scanning of the parameter space as the optimization approach. It has been shown that both approaches yield comparable results. Copyright © 2016 Elsevier Ltd. All rights reserved.
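The inverse-then-forward consistency check at the heart of this approach can be illustrated without any neural network. In this sketch both models are stand-ins: a crude exponential-saturation curve replaces the paper's biphasic-solute finite-element forward model, and a direct parameter scan (the baseline optimization the abstract compares against) plays the role of the inverse ANN.

```python
import math

def forward(d, times):
    # Stand-in forward model: concentration-time curve for a given
    # diffusion coefficient d (NOT the paper's biphasic-solute model)
    return [1.0 - math.exp(-d * t) for t in times]

def inverse_by_scan(curve, times, candidates):
    # Stand-in inverse model: pick the candidate d whose forward curve
    # best matches the measured concentration-time curve
    def err(d):
        return sum((p - m) ** 2 for p, m in zip(forward(d, times), curve))
    return min(candidates, key=err)

times = [0.5, 1.0, 2.0, 4.0]
measured = forward(0.8, times)                 # synthetic "experiment", true d = 0.8
candidates = [i / 100 for i in range(1, 201)]  # scan 0.01 .. 2.00
d_hat = inverse_by_scan(measured, times, candidates)
# Forward check: the estimate should reproduce the measured curve
residual = sum((p - m) ** 2 for p, m in zip(forward(d_hat, times), measured))
```

The coupled-ANN version replaces both stand-ins with trained networks, but the logic is the same: invert the curve to get a coefficient, then push the coefficient back through the forward model to verify the fit.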

  19. Computing in high-energy physics

    DOE PAGES

    Mount, Richard P.

    2016-05-31

    I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Lastly, I describe recent developments aimed at improving the overall coherence of high-energy physics software.

  20. Computing in high-energy physics

    NASA Astrophysics Data System (ADS)

    Mount, Richard P.

    2016-04-01

    I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Finally, I describe recent developments aimed at improving the overall coherence of high-energy physics software.

  1. Computing in high-energy physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mount, Richard P.

    I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Lastly, I describe recent developments aimed at improving the overall coherence of high-energy physics software.

  2. Agent autonomy approach to probabilistic physics-of-failure modeling of complex dynamic systems with interacting failure mechanisms

    NASA Astrophysics Data System (ADS)

    Gromek, Katherine Emily

    A novel computational and inference framework of the physics-of-failure (PoF) reliability modeling for complex dynamic systems has been established in this research. The PoF-based reliability models are used to perform a real time simulation of system failure processes, so that the system level reliability modeling would constitute inferences from checking the status of component level reliability at any given time. The "agent autonomy" concept is applied as a solution method for the system-level probabilistic PoF-based (i.e. PPoF-based) modeling. This concept originated from artificial intelligence (AI) as a leading intelligent computational inference in modeling of multi-agent systems (MAS). The concept of agent autonomy in the context of reliability modeling was first proposed by M. Azarkhail [1], where a fundamentally new idea of system representation by autonomous intelligent agents for the purpose of reliability modeling was introduced. The contribution of the current work lies in the further development of the agent autonomy concept, particularly the refined agent classification within the scope of the PoF-based system reliability modeling, new approaches to the learning and the autonomy properties of the intelligent agents, and modeling interacting failure mechanisms within the dynamic engineering system. The autonomous property of intelligent agents is defined as an agent's ability to self-activate, deactivate or completely redefine their role in the analysis. This property of agents and the ability to model interacting failure mechanisms of the system elements makes the agent autonomy fundamentally different from all existing methods of probabilistic PoF-based reliability modeling. 1. Azarkhail, M., "Agent Autonomy Approach to Physics-Based Reliability Modeling of Structures and Mechanical Systems", PhD thesis, University of Maryland, College Park, 2007.

  3. FY10 Report on Multi-scale Simulation of Solvent Extraction Processes: Molecular-scale and Continuum-scale Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wardle, Kent E.; Frey, Kurt; Pereira, Candido

    2014-02-02

    This task is aimed at predictive modeling of solvent extraction processes in typical extraction equipment through multiple simulation methods at various scales of resolution. We have conducted detailed continuum fluid dynamics simulation on the process unit level as well as simulations of the molecular-level physical interactions which govern extraction chemistry. Through combination of information gained through simulations at each of these two tiers, along with advanced techniques such as the Lattice Boltzmann Method (LBM) which can bridge these two scales, we can develop the tools to work towards predictive simulation for solvent extraction on the equipment scale (Figure 1). The goal of such a tool, along with enabling optimized design and operation of extraction units, would be to allow prediction of stage extraction efficiency under specified conditions. Simulation efforts on each of the two scales will be described below. As the initial application of FELBM in the work performed during FY10 has been on annular mixing, it will be discussed in the context of the continuum scale. In the future, however, it is anticipated that the real value of FELBM will be in its use as a tool for sub-grid model development through highly refined DNS-like multiphase simulations, facilitating exploration and development of droplet models, including breakup and coalescence, which will be needed for the large-scale simulations where droplet-level physics cannot be resolved. In this area, it can have a significant advantage over traditional CFD methods, as its high computational efficiency allows exploration of significantly greater physical detail, especially as computational resources increase in the future.

  4. Development of a Computer-Assisted Instrumentation Curriculum for Physics Students: Using LabVIEW and Arduino Platform

    NASA Astrophysics Data System (ADS)

    Kuan, Wen-Hsuan; Tseng, Chi-Hung; Chen, Sufen; Wong, Ching-Chang

    2016-06-01

    We propose an integrated curriculum to establish essential abilities of computer programming for the freshmen of a physics department. The implementation of graphical-based interfaces, from Scratch to LabVIEW and then to LabVIEW for Arduino, in the curriculum `Computer-Assisted Instrumentation in the Design of Physics Laboratories' brings rigorous algorithm and syntax protocols together with imagination, communication, scientific applications and experimental innovation. The effectiveness of the curriculum was evaluated via statistical analysis of questionnaires, interview responses, the increase in student numbers majoring in physics, and performance in a competition. The results provide quantitative support that the curriculum removed huge barriers to programming that occur in text-based environments, helped students gain knowledge of programming and instrumentation, and increased the students' confidence and motivation to learn physics and computer languages.

  5. Computer Simulations of Quantum Theory of Hydrogen Atom for Natural Science Education Students in a Virtual Lab

    ERIC Educational Resources Information Center

    Singh, Gurmukh

    2012-01-01

    The present article is primarily targeted for the advanced college/university undergraduate students of chemistry/physics education, computational physics/chemistry, and computer science. The most recent software system such as MS Visual Studio .NET version 2010 is employed to perform computer simulations for modeling Bohr's quantum theory of…

  6. Near real-time traffic routing

    NASA Technical Reports Server (NTRS)

    Yang, Chaowei (Inventor); Xie, Jibo (Inventor); Zhou, Bin (Inventor); Cao, Ying (Inventor)

    2012-01-01

    A near real-time physical transportation network routing system comprising: a traffic simulation computing grid and a dynamic traffic routing service computing grid. The traffic simulator produces traffic network travel time predictions for a physical transportation network using a traffic simulation model and common input data. The physical transportation network is divided into multiple sections. Each section has a primary zone and a buffer zone. The traffic simulation computing grid includes multiple traffic simulation computing nodes. The common input data includes static network characteristics, an origin-destination data table, dynamic traffic information data and historical traffic data. The dynamic traffic routing service computing grid includes multiple dynamic traffic routing computing nodes and generates traffic route(s) using the traffic network travel time predictions.
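The final step described above, generating a route from predicted link travel times, is a classic shortest-path search. A minimal sketch follows (Dijkstra over a toy graph; the node names and times are invented, and the patent's sectioned primary/buffer-zone decomposition is not modelled here):

```python
import heapq

def shortest_route(graph, src, dst):
    # graph: {node: [(neighbor, predicted_travel_time), ...]}
    # Assumes dst is reachable from src.
    dist = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    # Reconstruct the route by walking predecessors back to the source
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[dst]
```

In the system above, the edge weights would be refreshed from the simulation grid's travel-time predictions, so the "shortest" route tracks near real-time conditions rather than static distances.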

  7. The Quantum Measurement Problem and Physical reality: A Computation Theoretic Perspective

    NASA Astrophysics Data System (ADS)

    Srikanth, R.

    2006-11-01

    Is the universe computable? If yes, is it computationally a polynomial place? In standard quantum mechanics, which permits infinite parallelism and the infinitely precise specification of states, a negative answer to both questions is not ruled out. On the other hand, empirical evidence suggests that NP-complete problems are intractable in the physical world. Likewise, computational problems known to be algorithmically uncomputable do not seem to be computable by any physical means. We suggest that this close correspondence between the efficiency and power of abstract algorithms on the one hand, and physical computers on the other, finds a natural explanation if the universe is assumed to be algorithmic; that is, that physical reality is the product of discrete sub-physical information processing equivalent to the actions of a probabilistic Turing machine. This assumption can be reconciled with the observed exponentiality of quantum systems at microscopic scales, and the consequent possibility of implementing Shor's quantum polynomial time algorithm at that scale, provided the degree of superposition is intrinsically, finitely upper-bounded. If this bound is associated with the quantum-classical divide (the Heisenberg cut), a natural resolution to the quantum measurement problem arises. From this viewpoint, macroscopic classicality is an evidence that the universe is in BPP, and both questions raised above receive affirmative answers. A recently proposed computational model of quantum measurement, which relates the Heisenberg cut to the discreteness of Hilbert space, is briefly discussed. A connection to quantum gravity is noted. Our results are compatible with the philosophy that mathematical truths are independent of the laws of physics.

  8. The limits of predictability: Indeterminism and undecidability in classical and quantum physics

    NASA Astrophysics Data System (ADS)

    Korolev, Alexandre V.

    This thesis is a collection of three case studies, investigating various sources of indeterminism and undecidability as they bear upon in principle unpredictability of the behaviour of mechanistic systems in both classical and quantum physics. I begin by examining the sources of indeterminism and acausality in classical physics. Here I discuss the physical significance of an often overlooked and yet important Lipschitz condition, the violation of which underlies the existence of anomalous non-trivial solutions in the Norton-type indeterministic systems. I argue that the singularity arising from the violation of the Lipschitz condition in the systems considered appears to be so fragile as to be easily destroyed by slightly relaxing certain (infinite) idealizations required by these models. In particular, I show that the idealization of an absolutely nondeformable, or infinitely rigid, dome appears to be an essential assumption for anomalous motion to begin; any slightest elastic deformations of the dome due to finite rigidity of the dome destroy the shape of the dome required for indeterminism to obtain. I also consider several modifications of the original Norton's example and show that indeterminism in these cases, too, critically depends on the nature of certain idealizations pertaining to elastic properties of the bodies in these models. As a result, I argue that indeterminism of the Norton-type Lipschitz-indeterministic systems should rather be viewed as an artefact of certain (infinite) idealizations essential for the models, depriving the examples of much of their intended metaphysical import, as, for example, in Norton's antifundamentalist programme. Second, I examine the predictive computational limitations of a classical Laplace's demon. 
I demonstrate that in situations of self-fulfilling prognoses the class of undecidable propositions about certain future events, in general, is not empty; any Laplace's demon having all the information about the world now will be unable to predict all the future. In order to answer certain questions about the future it needs to resort occasionally to, or to consult with, a demon of a higher order in the computational hierarchy whose computational powers are beyond that of any Turing machine. In computer science such power is attributed to a theoretical device called an Oracle---a device capable of looking through an infinite domain in a finite time. I also discuss the distinction between ontological and epistemological views of determinism, and how adopting Wheeler-Landauer view of physical laws can entangle these aspects on a more fundamental level. Thirdly, I examine a recent proposal from the area of quantum computation purporting to utilize peculiarities of quantum reality to perform hypercomputation. While the current view is that quantum algorithms (such as Shor's) lead to re-description of the complexity space for computational problems, recently it has been argued (by Kieu) that certain novel quantum adiabatic algorithms may even require reconsideration of the whole notion of computability, by being able to break the Turing limit and "compute the non-computable". If implemented, such algorithms could serve as a physical realization of an Oracle needed for a Laplacian demon to accomplish its job. I critically review this latter proposal by exposing the weaknesses of Kieu's quantum adiabatic demon, pointing out its failure to deliver the purported hypercomputation. Regardless of whether the class of hypercomputers is non-empty, Kieu's proposed algorithm is not a member of this distinguished club, and a quantum computer powered Laplace's demon can do no more than its ordinary classical counterpart.

  9. Computations in Plasma Physics.

    ERIC Educational Resources Information Center

    Cohen, Bruce I.; Killeen, John

    1983-01-01

    Discusses contributions of computers to research in magnetic and inertial-confinement fusion, charged-particle-beam propogation, and space sciences. Considers use in design/control of laboratory and spacecraft experiments and in data acquisition; and reviews major plasma computational methods and some of the important physics problems they…

  10. Projectile and Circular Motion: A Model Four-Week Unit of Study for a High School Physics Class Using Physics Courseware.

    ERIC Educational Resources Information Center

    Geigel, Joan; And Others

    A self-paced program designed to integrate the use of computers and physics courseware into the regular classroom environment is offered for physics high school teachers in this module on projectile and circular motion. A diversity of instructional strategies including lectures, demonstrations, videotapes, computer simulations, laboratories, and…

  11. Ambient belonging: how stereotypical cues impact gender participation in computer science.

    PubMed

    Cheryan, Sapna; Plaut, Victoria C; Davies, Paul G; Steele, Claude M

    2009-12-01

    People can make decisions to join a group based solely on exposure to that group's physical environment. Four studies demonstrate that the gender difference in interest in computer science is influenced by exposure to environments associated with computer scientists. In Study 1, simply changing the objects in a computer science classroom from those considered stereotypical of computer science (e.g., Star Trek poster, video games) to objects not considered stereotypical of computer science (e.g., nature poster, phone books) was sufficient to boost female undergraduates' interest in computer science to the level of their male peers. Further investigation revealed that the stereotypical objects broadcast a masculine stereotype that discouraged women's sense of ambient belonging and subsequent interest in the environment (Studies 2, 3, and 4) but had no similar effect on men (Studies 3, 4). This masculine stereotype prevented women's interest from developing even in environments entirely populated by other women (Study 2). Objects can thus come to broadcast stereotypes of a group, which in turn can deter people who do not identify with these stereotypes from joining that group.

  12. Research Activity in Computational Physics utilizing High Performance Computing: Co-authorship Network Analysis

    NASA Astrophysics Data System (ADS)

    Ahn, Sul-Ah; Jung, Youngim

    2016-10-01

    The research activities of computational physicists utilizing high-performance computing are analyzed by bibliometric approaches. This study aims at providing computational physicists utilizing high-performance computing and policy planners with useful bibliometric results for an assessment of research activities. In order to achieve this purpose, we carried out a co-authorship network analysis of journal articles to assess the research activities of researchers in high-performance computational physics as a case study. For this study, we used journal articles from the Scopus database from Elsevier covering the time period 2004-2013. We extracted the author rank in the physics field utilizing high-performance computing by the number of papers published during the ten years from 2004. Finally, we drew the co-authorship network for the 45 top authors and their coauthors, and described some features of the co-authorship network in relation to author rank. Suggestions for further studies are discussed.
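
The co-authorship construction the abstract describes (rank authors by paper count, connect every pair of co-authors with an edge weighted by the number of shared papers) can be sketched in a few lines of Python; the records below are hypothetical stand-ins for the Scopus data, not the study's actual corpus.

```python
from collections import Counter
from itertools import combinations

# Toy paper -> author lists standing in for Scopus records (hypothetical data).
papers = [
    ["Kim", "Lee", "Park"],
    ["Kim", "Lee"],
    ["Park", "Choi"],
    ["Kim", "Choi", "Lee"],
]

# Author rank: number of papers per author.
paper_counts = Counter(a for authors in papers for a in authors)

# Co-authorship edges: each unordered author pair on a paper,
# weighted by how many papers the pair shares.
edges = Counter()
for authors in papers:
    for pair in combinations(sorted(authors), 2):
        edges[pair] += 1

print(paper_counts.most_common(2))   # top authors by paper count
print(edges[("Kim", "Lee")])         # papers Kim and Lee co-authored
```

Feeding real records into `papers` and thresholding `paper_counts` would reproduce the top-45-author subgraph described in the abstract.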

  13. Computer use, symptoms, and quality of life.

    PubMed

    Hayes, John R; Sheedy, James E; Stelmack, Joan A; Heaney, Catherine A

    2007-08-01

    To model the effects of computer use on reported visual and physical symptoms and to measure the effects upon quality of life measures. A survey of 1000 university employees (70.5% adjusted response rate) assessed visual and physical symptoms, job, physical and mental demands, ability to control/influence work, amount of work at a computer, computer work environment, relations with others at work, life and job satisfaction, and quality of life. Data were analyzed to determine whether self-reported eye symptoms are associated with perceived quality of life. The study also explored the factors that are associated with eye symptoms. Structural equation modeling and multiple regression analyses were used to assess the hypotheses. Seventy percent of the employees used some form of vision correction during computer use, 2.9% used glasses specifically prescribed for computer use, and 8% had had refractive surgery. Employees spent an average of 6 h per day at the computer. In a multiple regression framework, the latent variable eye symptoms was significantly associated with a composite quality of life variable (p = 0.02) after adjusting for job quality, job satisfaction, supervisor relations, co-worker relations, mental and physical load of the job, and job demand. Age and gender were not significantly associated with symptoms. After adjusting for age, gender, ergonomics, hours at the computer, and exercise, eye symptoms were significantly associated with physical symptoms (p < 0.001) accounting for 48% of the variance. Environmental variability at work was associated with eye symptoms and eye symptoms demonstrated a significant impact on quality of life and physical symptoms.
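
As an illustration of the regression-style associations the study reports (not the authors' actual structural equation model), a minimal ordinary-least-squares fit with made-up symptom scores shows how a slope and the share of variance explained are obtained:

```python
def simple_ols(x, y):
    """Ordinary least squares fit y = a + b*x, returning (a, b, r_squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1 - ss_res / ss_tot

# Hypothetical data: eye-symptom score vs. physical-symptom score.
eye = [1, 2, 3, 4, 5]
phys = [2, 3, 5, 6, 9]
a, b, r2 = simple_ols(eye, phys)
print(b, r2)   # slope 1.7, R^2 ~ 0.963
```

The "48% of the variance" figure in the abstract is exactly such an R² value, from the study's multivariable model rather than this one-predictor toy.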

  14. Using Computational and Mechanical Models to Study Animal Locomotion

    PubMed Central

    Miller, Laura A.; Goldman, Daniel I.; Hedrick, Tyson L.; Tytell, Eric D.; Wang, Z. Jane; Yen, Jeannette; Alben, Silas

    2012-01-01

    Recent advances in computational methods have made realistic large-scale simulations of animal locomotion possible. This has resulted in numerous mathematical and computational studies of animal movement through fluids and over substrates with the purpose of better understanding organisms’ performance and improving the design of vehicles moving through air and water and on land. This work has also motivated the development of improved numerical methods and modeling techniques for animal locomotion that is characterized by the interactions of fluids, substrates, and structures. Despite the large body of recent work in this area, the application of mathematical and numerical methods to improve our understanding of organisms in the context of their environment and physiology has remained relatively unexplored. Nature has evolved a wide variety of fascinating mechanisms of locomotion that exploit the properties of complex materials and fluids, but only recently are the mathematical, computational, and robotic tools available to rigorously compare the relative advantages and disadvantages of different methods of locomotion in variable environments. Similarly, advances in computational physiology have only recently allowed investigators to explore how changes at the molecular, cellular, and tissue levels might lead to changes in performance at the organismal level. In this article, we highlight recent examples of how computational, mathematical, and experimental tools can be combined to ultimately answer the questions posed in one of the grand challenges in organismal biology: “Integrating living and physical systems.” PMID:22988026

  15. The impact of supercomputers on experimentation: A view from a national laboratory

    NASA Technical Reports Server (NTRS)

    Peterson, V. L.; Arnold, J. O.

    1985-01-01

    The relative roles of large scale scientific computers and physical experiments in several science and engineering disciplines are discussed. Increasing dependence on computers is shown to be motivated both by the rapid growth in computer speed and memory, which permits accurate numerical simulation of complex physical phenomena, and by the rapid reduction in the cost of performing a calculation, which makes computation an increasingly attractive complement to experimentation. Computer speed and memory requirements are presented for selected areas of such disciplines as fluid dynamics, aerodynamics, aerothermodynamics, chemistry, atmospheric sciences, astronomy, and astrophysics, together with some examples of the complementary nature of computation and experiment. Finally, the impact of the emerging role of computers in the technical disciplines is discussed in terms of both the requirements for experimentation and the attainment of previously inaccessible information on physical processes.

  16. Computational Approaches to Chemical Hazard Assessment

    PubMed Central

    Luechtefeld, Thomas; Hartung, Thomas

    2018-01-01

    Summary Computational prediction of toxicity has reached new heights as a result of decades of growth in the magnitude and diversity of biological data. Public packages for statistics and machine learning make model creation faster. New theory in machine learning and cheminformatics enables integration of chemical structure, toxicogenomics, simulated and physical data, and other toxicological information in the prediction of chemical health hazards. Our earlier publications have characterized a toxicological dataset of unprecedented scale resulting from the European REACH legislation (Registration, Evaluation, Authorisation and Restriction of Chemicals). These publications dove into potential use cases for regulatory data and some models for exploiting this data. This article analyzes the options for the identification and categorization of chemicals, moves on to the derivation of descriptive features for chemicals, discusses different kinds of targets modeled in computational toxicology, and ends with a high-level perspective of the algorithms used to create computational toxicology models. PMID:29101769

  17. Concept of a Cloud Service for Data Preparation and Computational Control on Custom HPC Systems in Application to Molecular Dynamics

    NASA Astrophysics Data System (ADS)

    Puzyrkov, Dmitry; Polyakov, Sergey; Podryga, Viktoriia; Markizov, Sergey

    2018-02-01

    At the present stage of computer technology development it is possible to study the properties and processes in complex systems at molecular and even atomic levels, for example, by means of molecular dynamics methods. The most interesting are problems related to the study of complex processes under real physical conditions. Solving such problems requires the use of high performance computing systems of various types, for example, GRID systems and HPC clusters. Given how time-consuming these computational tasks are, the need arises for software for automatic and unified monitoring of such computations. A complex computational task can be performed over different HPC systems. It requires output data synchronization between the storage chosen by a scientist and the HPC system used for computations. The design of the computational domain is also quite a problem. It requires complex software tools and algorithms for proper atomistic data generation on HPC systems. The paper describes the prototype of a cloud service, intended for the design of atomistic systems of large volume for further detailed molecular dynamics calculations and for computational management of these calculations, and presents the part of its concept aimed at initial data generation on HPC systems.

  18. The DZERO Level 3 Data Acquisition System

    NASA Astrophysics Data System (ADS)

    Angstadt, R.; Brooijmans, G.; Chapin, D.; Clements, M.; Cutts, D.; Haas, A.; Hauser, R.; Johnson, M.; Kulyavtsev, A.; Mattingly, S. E. K.; Mulders, M.; Padley, P.; Petravick, D.; Rechenmacher, R.; Snyder, S.; Watts, G.

    2004-06-01

    The DZERO experiment began Run II data-taking operation at Fermilab in spring 2001. The physics program of the experiment requires the Level 3 data acquisition (DAQ) system to handle average event sizes of 250 kilobytes at a rate of 1 kHz. The system routes and transfers event fragments of approximately 1-20 kilobytes from 63 VME crate sources to any of approximately 100 processing nodes. It is built upon a Cisco 6509 Ethernet switch, standard PCs, and commodity VME single board computers (SBCs). The system has been in full operation since spring 2002.
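
The quoted figures imply the aggregate bandwidth the Level 3 DAQ must sustain; a quick back-of-the-envelope check (the even per-node split is an idealization, since events are routed whole to individual nodes):

```python
event_size = 250e3      # bytes, average event size
rate = 1e3              # Hz, Level 3 input rate
nodes = 100             # approximate number of processing nodes

aggregate = event_size * rate    # bytes/s through the switch
per_node = aggregate / nodes     # average load per processing node

print(f"aggregate: {aggregate / 1e6:.0f} MB/s")   # 250 MB/s
print(f"per node:  {per_node / 1e6:.1f} MB/s")    # 2.5 MB/s
```

Both numbers sit comfortably within gigabit-class Ethernet and commodity-PC capabilities, which is consistent with the hardware choices listed in the abstract.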

  19. Implementation of Protocols to Enable Doctoral Training in Physical and Computational Chemistry of a Blind Graduate Student

    ERIC Educational Resources Information Center

    Minkara, Mona S.; Weaver, Michael N.; Gorske, Jim; Bowers, Clifford R.; Merz, Kenneth M., Jr.

    2015-01-01

    There exists a sparse representation of blind and low-vision students in science, technology, engineering and mathematics (STEM) fields. This is due in part to these individuals being discouraged from pursuing STEM degrees as well as a lack of appropriate adaptive resources in upper level STEM courses and research. Mona Minkara is a rising fifth…

  20. Implementing Computer Based Laboratories

    NASA Astrophysics Data System (ADS)

    Peterson, David

    2001-11-01

    Physics students at Francis Marion University will complete several required laboratory exercises utilizing computer-based Vernier probes. The simple pendulum, the acceleration due to gravity, simple harmonic motion, radioactive half lives, and radiation inverse square law experiments will be incorporated into calculus-based and algebra-based physics courses. Assessment of student learning and faculty satisfaction will be carried out by surveys and test results. Cost effectiveness and time effectiveness assessments will be presented. Majors in Computational Physics, Health Physics, Engineering, Chemistry, Mathematics and Biology take these courses, and assessments will be categorized by major. To enhance the computer skills of students enrolled in the courses, MAPLE will be used for further analysis of the data acquired during the experiments. Assessment of these enhancement exercises will also be presented.
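
For the acceleration-due-to-gravity exercise mentioned above, the standard small-angle analysis inverts T = 2π√(L/g); a minimal sketch with hypothetical timing data (not actual lab values from the course):

```python
import math

def g_from_pendulum(length_m, period_s):
    """Small-angle estimate of g from T = 2*pi*sqrt(L/g)."""
    return 4 * math.pi**2 * length_m / period_s**2

# Hypothetical lab data: a 1.000 m pendulum, ten swings timed at 20.06 s.
L = 1.000
T = 20.06 / 10          # period of a single swing
print(round(g_from_pendulum(L, T), 2))   # ~9.81 m/s^2
```

Timing many swings at once, as sketched here, is the usual way to reduce the stopwatch error that dominates this measurement; with Vernier probes the period comes from the sensor timestamps instead.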

  1. Developing the Next Generation of Science Data System Engineers

    NASA Technical Reports Server (NTRS)

    Moses, John F.; Behnke, Jeanne; Durachka, Christopher D.

    2016-01-01

    At Goddard, engineers and scientists with a range of experience in science data systems are needed to employ new technologies and develop advances in capabilities for supporting new Earth and Space science research. Engineers with extensive experience in science data, software engineering, and computer-information architectures are needed to lead and perform these activities. The increasing types and complexity of instrument data and emerging computer technologies, coupled with the current shortage of computer engineers with backgrounds in science, have led to the need to develop a career path for science data systems engineers and architects. The current career path for undergraduate students in disciplines such as Computer Engineering or the Physical Sciences generally begins with serving on a development team, where they can work in depth on existing Goddard data systems or serve with a specific NASA science team. There they begin to understand the data, infuse technologies, and learn the architectures of science data systems. From there the typical career involves peer mentoring, on-the-job training, or graduate-level studies in analytics, computational science, and applied science and mathematics. At the most senior level, engineers become subject matter experts and system architects, leading discipline-specific data centers and large software development projects. They are recognized as subject matter experts in a science domain, have project management expertise, lead standards efforts, and lead international projects.
A long career development remains necessary not only because of the breadth of knowledge required across physical sciences and engineering disciplines, but also because of the diversity of instrument data being developed today both by NASA and international partner agencies, and because multidiscipline science and practitioner communities expect to have access to all types of observational data. This paper describes an approach to defining career-path guidance for college-bound high school and undergraduate engineering students, and for junior and senior engineers from various disciplines.

  2. Developing the Next Generation of Science Data System Engineers

    NASA Astrophysics Data System (ADS)

    Moses, J. F.; Durachka, C. D.; Behnke, J.

    2015-12-01

    At Goddard, engineers and scientists with a range of experience in science data systems are needed to employ new technologies and develop advances in capabilities for supporting new Earth and Space science research. Engineers with extensive experience in science data, software engineering and computer-information architectures are needed to lead and perform these activities. The increasing types and complexity of instrument data and emerging computer technologies, coupled with the current shortage of computer engineers with backgrounds in science, have led to the need to develop a career path for science data systems engineers and architects. The current career path for undergraduate students in disciplines such as Computer Engineering or the Physical Sciences generally begins with serving on a development team, where they can work in depth on existing Goddard data systems or serve with a specific NASA science team. There they begin to understand the data, infuse technologies, and learn the architectures of science data systems. From there the typical career involves peer mentoring, on-the-job training or graduate level studies in analytics, computational science and applied science and mathematics. At the most senior level, engineers become subject matter experts and system architects, leading discipline-specific data centers and large software development projects. They are recognized as subject matter experts in a science domain, have project management expertise, lead standards efforts, and lead international projects.
A long career development remains necessary not only because of the breadth of knowledge required across physical sciences and engineering disciplines, but also because of the diversity of instrument data being developed today both by NASA and international partner agencies, and because multi-discipline science and practitioner communities expect to have access to all types of observational data. This paper describes an approach to defining career-path guidance for college-bound high school and undergraduate engineering students, and for junior and senior engineers from various disciplines.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dahlburg, Jill; Corones, James; Batchelor, Donald

    Fusion is potentially an inexhaustible energy source whose exploitation requires a basic understanding of high-temperature plasmas. The development of a science-based predictive capability for fusion-relevant plasmas is a challenge central to fusion energy science, in which numerical modeling has played a vital role for more than four decades. A combination of the very wide range in temporal and spatial scales, extreme anisotropy, the importance of geometric detail, and the requirement of causality, which makes it impossible to parallelize over time, makes this problem one of the most challenging in computational physics. Sophisticated computational models are under development for many individual features of magnetically confined plasmas, and increases in the scope and reliability of feasible simulations have been enabled by increased scientific understanding and improvements in computer technology. However, full predictive modeling of fusion plasmas will require qualitative improvements and innovations to enable cross coupling of a wider variety of physical processes and to allow solution over a larger range of space and time scales. The exponential growth of computer speed, coupled with the high cost of large-scale experimental facilities, makes an integrated fusion simulation initiative a timely and cost-effective opportunity. Worldwide progress in laboratory fusion experiments provides the basis for a recent FESAC recommendation to proceed with a burning plasma experiment (see FESAC Review of Burning Plasma Physics Report, September 2001). Such an experiment, at the frontier of the physics of complex systems, would be a huge step in establishing the potential of magnetic fusion energy to contribute to the world’s energy security. An integrated simulation capability would dramatically enhance the utilization of such a facility and lead to optimization of toroidal fusion plasmas in general. 
This science-based predictive capability, which was cited in the FESAC integrated planning document (IPPA, 2000), represents a significant opportunity for the DOE Office of Science to further the understanding of fusion plasmas to a level unparalleled worldwide.

  4. Top-down models in biology: explanation and control of complex living systems above the molecular level.

    PubMed

    Pezzulo, Giovanni; Levin, Michael

    2016-11-01

    It is widely assumed in developmental biology and bioengineering that optimal understanding and control of complex living systems follows from models of molecular events. The success of reductionism has overshadowed attempts at top-down models and control policies in biological systems. However, other fields, including physics, engineering and neuroscience, have successfully used the explanations and models at higher levels of organization, including least-action principles in physics and control-theoretic models in computational neuroscience. Exploiting the dynamic regulation of pattern formation in embryogenesis and regeneration requires new approaches to understand how cells cooperate towards large-scale anatomical goal states. Here, we argue that top-down models of pattern homeostasis serve as proof of principle for extending the current paradigm beyond emergence and molecule-level rules. We define top-down control in a biological context, discuss the examples of how cognitive neuroscience and physics exploit these strategies, and illustrate areas in which they may offer significant advantages as complements to the mainstream paradigm. By targeting system controls at multiple levels of organization and demystifying goal-directed (cybernetic) processes, top-down strategies represent a roadmap for using the deep insights of other fields for transformative advances in regenerative medicine and systems bioengineering. © 2016 The Author(s).

  5. Top-down models in biology: explanation and control of complex living systems above the molecular level

    PubMed Central

    2016-01-01

    It is widely assumed in developmental biology and bioengineering that optimal understanding and control of complex living systems follows from models of molecular events. The success of reductionism has overshadowed attempts at top-down models and control policies in biological systems. However, other fields, including physics, engineering and neuroscience, have successfully used the explanations and models at higher levels of organization, including least-action principles in physics and control-theoretic models in computational neuroscience. Exploiting the dynamic regulation of pattern formation in embryogenesis and regeneration requires new approaches to understand how cells cooperate towards large-scale anatomical goal states. Here, we argue that top-down models of pattern homeostasis serve as proof of principle for extending the current paradigm beyond emergence and molecule-level rules. We define top-down control in a biological context, discuss the examples of how cognitive neuroscience and physics exploit these strategies, and illustrate areas in which they may offer significant advantages as complements to the mainstream paradigm. By targeting system controls at multiple levels of organization and demystifying goal-directed (cybernetic) processes, top-down strategies represent a roadmap for using the deep insights of other fields for transformative advances in regenerative medicine and systems bioengineering. PMID:27807271

  6. Graphics Processors in HEP Low-Level Trigger Systems

    NASA Astrophysics Data System (ADS)

    Ammendola, Roberto; Biagioni, Andrea; Chiozzi, Stefano; Cotta Ramusino, Angelo; Cretaro, Paolo; Di Lorenzo, Stefano; Fantechi, Riccardo; Fiorini, Massimiliano; Frezza, Ottorino; Lamanna, Gianluca; Lo Cicero, Francesca; Lonardo, Alessandro; Martinelli, Michele; Neri, Ilaria; Paolucci, Pier Stanislao; Pastorelli, Elena; Piandani, Roberto; Pontisso, Luca; Rossetti, Davide; Simula, Francesco; Sozzi, Marco; Vicini, Piero

    2016-11-01

    Usage of Graphics Processing Units (GPUs) in the so called general-purpose computing is emerging as an effective approach in several fields of science, although so far applications have been employing GPUs typically for offline computations. Taking into account the steady performance increase of GPU architectures in terms of computing power and I/O capacity, the real-time applications of these devices can thrive in high-energy physics data acquisition and trigger systems. We will examine the use of online parallel computing on GPUs for the synchronous low-level trigger, focusing on tests performed on the trigger system of the CERN NA62 experiment. To successfully integrate GPUs in such an online environment, latencies of all components need analysing, networking being the most critical. To keep it under control, we envisioned NaNet, an FPGA-based PCIe Network Interface Card (NIC) enabling GPUDirect connection. Furthermore, it is assessed how specific trigger algorithms can be parallelized and thus benefit from a GPU implementation, in terms of increased execution speed. Such improvements are particularly relevant for the foreseen Large Hadron Collider (LHC) luminosity upgrade where highly selective algorithms will be essential to maintain sustainable trigger rates with very high pileup.

  7. HEPLIB `91: International users meeting on the support and environments of high energy physics computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnstad, H.

    The purpose of this meeting is to discuss the current and future HEP computing support and environments from the perspective of new horizons in accelerator, physics, and computing technologies. Topics of interest to the Meeting include (but are not limited to): the forming of the HEPLIB world user group for High Energy Physics computing; mandate, desirables, coordination, organization, funding; user experience, international collaboration; the roles of national labs, universities, and industry; range of software, Monte Carlo, mathematics, physics, interactive analysis, text processors, editors, graphics, data base systems, code management tools; program libraries, frequency of updates, distribution; distributed and interactive computing, data base systems, user interface, UNIX operating systems, networking, compilers, Xlib, X-Graphics; documentation, updates, availability, distribution; code management in large collaborations, keeping track of program versions; and quality assurance, testing, conventions, standards.

  8. HEPLIB 91: International users meeting on the support and environments of high energy physics computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnstad, H.

    The purpose of this meeting is to discuss the current and future HEP computing support and environments from the perspective of new horizons in accelerator, physics, and computing technologies. Topics of interest to the Meeting include (but are not limited to): the forming of the HEPLIB world user group for High Energy Physics computing; mandate, desirables, coordination, organization, funding; user experience, international collaboration; the roles of national labs, universities, and industry; range of software, Monte Carlo, mathematics, physics, interactive analysis, text processors, editors, graphics, data base systems, code management tools; program libraries, frequency of updates, distribution; distributed and interactive computing, data base systems, user interface, UNIX operating systems, networking, compilers, Xlib, X-Graphics; documentation, updates, availability, distribution; code management in large collaborations, keeping track of program versions; and quality assurance, testing, conventions, standards.

  9. The Entangled Histories of Physics and Computation

    NASA Astrophysics Data System (ADS)

    Rodriguez, Cesar

    2007-03-01

    The histories of physics and computation intertwine in a fascinating manner that is relevant to the field of quantum computation. This talk focuses on the interconnections between the two by examining their rhyming philosophies, recurrent characters, and common themes. Leibniz was not only one of the leading figures of calculus, but also left his footprint in physics and invented the concept of a universal computational language. This last idea was further developed by Boole, Russell, Hilbert and Gödel. Physicists such as Boltzmann and Maxwell also established the foundations of the field of information theory, later developed by Shannon. The war efforts of von Neumann and Turing can be juxtaposed with the Manhattan Project. Professional and personal connections of these characters to the development of physics will be emphasized. Recently, new cryptographic developments have led to a reexamination of the fundamentals of quantum mechanics, while quantum computation is discovering a new perspective on the nature of information itself.

  10. Like-charge attraction in a one-dimensional setting: the importance of being odd

    NASA Astrophysics Data System (ADS)

    Trizac, Emmanuel; Téllez, Gabriel

    2018-03-01

    From cement cohesion to DNA condensation, a proper statistical physics treatment of systems with long-range forces is important for a number of applications in physics, chemistry, and biology. We compute here the effective force between fixed charged macromolecules, screened by oppositely charged mobile ions (counterions). We treat the problem in a one-dimensional configuration that allows for interesting discussion and derivation of exact results, remaining at a level of mathematical difficulty compatible with an undergraduate course. Emphasis is put on the counterintuitive but fundamental phenomenon of like-charge attraction, which our treatment brings for the first time to the level of undergraduate teaching. The parity of the number of counterions is shown to play a prominent role, which sheds light on the binding mechanism at work when like-charge macromolecules do attract.
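
The parity effect can be illustrated with a toy calculation. In 1D electrostatics the field of a point charge is constant, so pairwise forces are distance-independent and the net force on a macroion follows from simple charge counting. The condensed ground-state positions assumed below are an illustration, not the paper's derivation:

```python
def sign(x):
    return (x > 0) - (x < 0)

def force_on(i, charges, positions):
    """Net force on charge i in 1D electrostatics, where the field of a
    point charge is constant: F_ij = (q_i * q_j / 2) * sign(x_i - x_j),
    in units with q^2 / (2 * epsilon_0) = 1."""
    return sum(
        0.5 * charges[i] * charges[j] * sign(positions[i] - positions[j])
        for j in range(len(charges))
        if j != i
    )

# Two like-charged macroions (charge -1) at x = -1 and x = +1.
# Odd counterion number: a single counterion (+2) sits between them
# in the assumed strong-coupling ground state.
f_odd = force_on(1, charges=[-1, -1, +2], positions=[-1.0, +1.0, 0.0])

# Even counterion number: two counterions (+1 each) condense onto the
# macroions, neutralizing each of them locally.
f_even = force_on(1, charges=[-1, -1, +1, +1],
                  positions=[-1.0, +1.0, -1.0, +1.0])

print(f_odd, f_even)   # -0.5 0.0 : net attraction only for the odd case
```

The unpaired middle counterion in the odd configuration pulls both macroions inward, which is the binding mechanism the abstract highlights.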

  11. Computers in Physical Education.

    ERIC Educational Resources Information Center

    Sydow, James Armin

    Although computers have potential applications in the elementary and secondary physical education curriculum, current usage is minimal when compared to other disciplines. However, present trends indicate a substantial growth in the use of the computer in a supportive role in assisting the teacher in the management of instructional activities.…

  12. Simultaneous multislice refocusing via time optimal control.

    PubMed

    Rund, Armin; Aigner, Christoph Stefan; Kunisch, Karl; Stollberger, Rudolf

    2018-02-09

    Joint design of minimum-duration RF pulses and slice-selective gradient shapes for MRI via time optimal control with strict physical constraints, and its application to simultaneous multislice imaging. The minimization of the pulse duration is cast as a time optimal control problem with inequality constraints describing the refocusing quality and physical constraints. It is solved with a bilevel method, where the pulse length is minimized in the upper level, and the constraints are satisfied in the lower level. To address the inherent nonconvexity of the optimization problem, the upper level is enhanced with new heuristics for finding a near global optimizer based on a second optimization problem. A large set of optimized examples shows an average temporal reduction of 87.1% for double diffusion and 74% for turbo spin echo pulses compared to power independent of number of slices (PINS) pulses. The optimized results are validated on a 3T scanner with phantom measurements. The presented design method computes minimum-duration RF pulse and slice-selective gradient shapes subject to physical constraints. The shorter pulse duration can be used to decrease the effective echo time in existing echo-planar imaging or echo spacing in turbo spin echo sequences. © 2018 International Society for Magnetic Resonance in Medicine.
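
The bilevel idea (shrink the duration in the upper level while the lower level certifies that the constraints can still be met) can be caricatured with a bisection on a monotone feasibility test. The actual method solves a constrained optimal control problem in the lower level; the feasibility predicate and limit value below are hypothetical stand-ins:

```python
def min_duration(feasible, t_lo, t_hi, tol=1e-6):
    """Bisection for the shortest duration t with feasible(t) True,
    assuming feasibility is monotone in t, feasible(t_hi) holds,
    and feasible(t_lo) does not."""
    assert feasible(t_hi) and not feasible(t_lo)
    while t_hi - t_lo > tol:
        mid = 0.5 * (t_lo + t_hi)
        if feasible(mid):
            t_hi = mid
        else:
            t_lo = mid
    return t_hi

# Stand-in lower level: a pulse is 'feasible' only above some limit
# set by the physical constraints (hypothetical value).
limit = 3.2e-3   # seconds
t = min_duration(lambda t: t >= limit, 0.0, 10e-3)
print(t)         # converges to ~3.2e-3
```

The heuristics mentioned in the abstract address the fact that the real feasibility map is not monotone or convex, so a plain bisection like this can stall in local minima.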

  13. A computer-based physics laboratory apparatus: Signal generator software

    NASA Astrophysics Data System (ADS)

    Thanakittiviroon, Tharest; Liangrocapart, Sompong

    2005-09-01

    This paper describes a computer-based physics laboratory apparatus to replace expensive instruments such as high-precision signal generators. This apparatus uses a sound card in a common personal computer to give sinusoidal signals with an accurate frequency that can be programmed to give different frequency signals repeatedly. An experiment on standing waves on an oscillating string uses this apparatus. In conjunction with interactive lab manuals, which have been developed using personal computers in our university, we achieve a complete set of low-cost, accurate, and easy-to-use equipment for teaching a physics laboratory.
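
    A software signal generator of this kind reduces to synthesizing sine samples at a programmed frequency and handing them to the sound card. The standard-library sketch below (function and parameter names are illustrative, not from the paper) builds such a sample buffer:

```python
import math

def sine_samples(freq_hz, sample_rate=44100, duration_s=1.0, amplitude=1.0):
    """Return one buffer of sinusoidal samples at the programmed frequency."""
    n = int(sample_rate * duration_s)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]

# A 440 Hz test tone, half a second at an 8 kHz sample rate.
buf = sine_samples(440.0, sample_rate=8000, duration_s=0.5)
```

    Writing `buf` to the audio device (e.g. as 16-bit PCM) and re-calling the function with a new `freq_hz` gives the repeatable, programmable frequency sweep the experiment needs.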

  14. PERFORMANCE OF A COMPUTER-BASED ASSESSMENT OF COGNITIVE FUNCTION MEASURES IN TWO COHORTS OF SENIORS

    PubMed Central

    Espeland, Mark A.; Katula, Jeffrey A.; Rushing, Julia; Kramer, Arthur F.; Jennings, Janine M.; Sink, Kaycee M.; Nadkarni, Neelesh K.; Reid, Kieran F.; Castro, Cynthia M.; Church, Timothy; Kerwin, Diana R.; Williamson, Jeff D.; Marottoli, Richard A.; Rushing, Scott; Marsiske, Michael; Rapp, Stephen R.

    2013-01-01

    Background Computer-administered assessment of cognitive function is being increasingly incorporated into clinical trials; however, its performance in these settings has not been systematically evaluated. Design The Seniors Health and Activity Research Program (SHARP) pilot trial (N=73) developed a computer-based tool for assessing memory performance and executive functioning. The Lifestyle Interventions and Independence for Elders (LIFE) investigators incorporated this battery in a full-scale multicenter clinical trial (N=1635). We describe the relationships of these test scores with those from interviewer-administered cognitive function tests and with risk factors for cognitive deficits, and we describe performance measures (completeness, intra-class correlations). Results Computer-based assessments of cognitive function had consistent relationships across the pilot and full-scale trial cohorts with interviewer-administered assessments of cognitive function, age, and a measure of physical function. In the LIFE cohort, their external validity was further demonstrated by associations with other risk factors for cognitive dysfunction: education, hypertension, diabetes, and physical function. Acceptable levels of data completeness (>83%) were achieved on all computer-based measures; however, rates of missing data were higher among older participants (odds ratio=1.06 for each additional year; p<0.001) and among those who reported no current computer use (odds ratio=2.71; p<0.001). Intra-class correlations among clinics were at least as low (ICC≤0.013) as for interviewer-administered measures (ICC≤0.023), reflecting good standardization. All cognitive measures loaded onto the first principal component (global cognitive function), which accounted for 40% of the overall variance. Conclusion Our results support the use of computer-based tools for assessing cognitive function in multicenter clinical trials of older individuals. PMID:23589390
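
    For context, a one-way intraclass correlation of the kind used above to compare clinics can be computed from the between- and within-group mean squares. This is a generic textbook sketch of ICC(1) for equal-sized groups, not the trial's actual analysis code, and the data below are invented:

```python
def icc_oneway(groups):
    """One-way ICC(1) = (MSB - MSW) / (MSB + (m-1)*MSW),
    for k equal-sized groups of m observations each."""
    k = len(groups)                    # number of groups (e.g. clinics)
    m = len(groups[0])                 # observations per group
    grand = sum(sum(g) for g in groups) / (k * m)
    means = [sum(g) / m for g in groups]
    msb = m * sum((mu - grand) ** 2 for mu in means) / (k - 1)
    msw = sum((x - mu) ** 2
              for g, mu in zip(groups, means) for x in g) / (k * (m - 1))
    return (msb - msw) / (msb + (m - 1) * msw)

# Three toy "clinics" with strongly clustered scores -> high ICC.
icc = icc_oneway([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
```

    A near-zero ICC among clinics, as reported in the trial, means almost none of the score variance is attributable to which clinic administered the test, i.e. good standardization.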

  15. Evidence-based guidelines for the wise use of computers by children: physical development guidelines.

    PubMed

    Straker, L; Maslen, B; Burgess-Limerick, R; Johnson, P; Dennerlein, J

    2010-04-01

    Computer use by children is common, and there is concern over the potential impact of this exposure on child physical development. Recently, principles for child-specific, evidence-based guidelines for the wise use of computers were published, including one concerning the facilitation of appropriate physical development. This paper reviews the evidence and presents detailed guidelines for this principle. The guidelines include encouraging a mix of sedentary and whole-body movement tasks; encouraging reasonable postures during computing tasks through workstation, chair, desk, display and input device selection and adjustment; and special issues regarding notebook computer use and carriage, computing skills and responding to discomfort. The limitations of the evidence highlight opportunities for future research. The guidelines themselves can inform parents and teachers, equipment designers and suppliers, and form the basis of content for teaching children the wise use of computers. STATEMENT OF RELEVANCE: Many children use computers, and computer-use habits formed in childhood may track into adulthood. Therefore, child-computer interaction needs to be carefully managed. These guidelines inform those responsible for children to assist in the wise use of computers.

  16. BODY DISSATISFACTION, PHYSICAL ACTIVITY, AND SEDENTARY BEHAVIOR IN FEMALE ADOLESCENTS.

    PubMed

    Miranda, Valter Paulo Neves; Morais, Núbia Sousa de; Faria, Eliane Rodrigues de; Amorim, Paulo Roberto Dos Santos; Marins, João Carlos Bouzas; Franceschini, Sylvia do Carmo Castro; Teixeira, Paula Costa; Priore, Silvia Eloiza

    2018-05-21

    To evaluate the association of body image with physical activity level, body composition, and sedentary behavior (SB) of female adolescents. Exploratory cross-sectional study conducted with 120 female adolescents aged 14-19 years from the city of Viçosa, Minas Gerais, Southeast Brazil. Body image was evaluated with a Body Silhouette Scale (BSS) and the Body Shape Questionnaire (BSQ). Weight, height, and waist circumference values were analyzed, as well as the waist-to-height ratio and body fat percentage. Physical activity level (PAL) was assessed by a 24-hour Physical Activity Recall, and SB by screen time, that is, time spent in front of a TV, playing video games, on the computer and using tablets, with cell phone time considered separately. Mean age was 16.5±1.5 years, and most adolescents were eutrophic (77.6%), sedentary/low PAL (84.2%), with high screen time (85.2%) and high cell phone time (58.7%). Body dissatisfaction was identified in 40.6% of BSQ and 45.8% of BSS evaluations. Body distortion was identified in 52.9% of participants. All body composition measures, along with cell phone time and PAL, were associated with body dissatisfaction, with the more active adolescents presenting higher levels of dissatisfaction. This study concluded that female adolescents with higher cell phone time present higher body dissatisfaction, as do the most physically active ones. All body composition measurements were associated with body dissatisfaction, mainly body mass index, waist circumference, and waist-to-height ratio.

  17. Extended computational kernels in a massively parallel implementation of the Trotter-Suzuki approximation

    NASA Astrophysics Data System (ADS)

    Wittek, Peter; Calderaro, Luca

    2015-12-01

    We extended a parallel and distributed implementation of the Trotter-Suzuki algorithm for simulating quantum systems to study a wider range of physical problems and to make the library easier to use. The new release allows periodic boundary conditions, many-body simulations of non-interacting particles, arbitrary stationary potential functions, and imaginary time evolution to approximate the ground state energy. The new release is more resilient to the computational environment: a wider range of compiler chains and more platforms are supported. To ease development, we provide a more extensive command-line interface, an application programming interface, and wrappers from high-level languages.
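
    The imaginary-time evolution the release supports can be illustrated on a toy two-level Hamiltonian: repeatedly applying (I − Δτ·H) to a trial state and renormalizing projects onto the ground state, whose energy is then the Rayleigh quotient. This standalone sketch shows only the underlying idea, not the library's API; the Hamiltonian is invented.

```python
def imaginary_time_ground_energy(h, steps=400, dtau=0.1):
    """Project a trial vector onto the ground state of a 2x2 Hamiltonian h
    by repeated application of (I - dtau*h), then return <v|H|v>."""
    v = [1.0, 0.0]                          # arbitrary trial state
    for _ in range(steps):
        w = [v[0] - dtau * (h[0][0] * v[0] + h[0][1] * v[1]),
             v[1] - dtau * (h[1][0] * v[0] + h[1][1] * v[1])]
        norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
        v = [w[0] / norm, w[1] / norm]      # renormalize each step
    hv = [h[0][0] * v[0] + h[0][1] * v[1],
          h[1][0] * v[0] + h[1][1] * v[1]]
    return v[0] * hv[0] + v[1] * hv[1]      # Rayleigh quotient, <v|v> = 1

# Toy Hamiltonian with exact ground energy -sqrt(1.25).
e0 = imaginary_time_ground_energy([[1.0, 0.5], [0.5, -1.0]])
```

    Excited components decay faster under e^(−τH) than the ground component, which is why the normalized iterate converges to the ground state.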

  18. Assessment of physical server reliability in multi cloud computing system

    NASA Astrophysics Data System (ADS)

    Kalyani, B. J. D.; Rao, Kolasani Ramchand H.

    2018-04-01

    Business organizations nowadays function with more than one cloud provider. Spreading cloud deployment across multiple service providers creates room for competitive pricing that reduces the burden on an enterprise's spending budget. To assess the software reliability of a multi-cloud application, a layered software reliability assessment paradigm is considered, with three levels of abstraction: the application layer, the virtualization layer, and the server layer. The reliability of each layer is assessed separately, and the results are combined to obtain the reliability of the multi-cloud computing application. In this paper, we focus on how to assess the reliability of the server layer, provide the required algorithms, and explore the steps in the assessment of server reliability.
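
    The layered combination described can be sketched as a series system of layers, with the server layer itself modeled as redundant physical servers in parallel. This is a generic reliability-block sketch under independence assumptions, not the paper's algorithm, and the reliability numbers are illustrative:

```python
def parallel_reliability(servers):
    """Server layer: the layer works if at least one physical server does."""
    fail = 1.0
    for r in servers:
        fail *= (1.0 - r)       # all servers must fail for the layer to fail
    return 1.0 - fail

def multicloud_reliability(app, virt, servers):
    """Series combination of application, virtualization and server layers:
    the application works only if every layer works."""
    return app * virt * parallel_reliability(servers)

# Two redundant 0.9-reliable servers give a 0.99-reliable server layer.
r = multicloud_reliability(app=0.99, virt=0.98, servers=[0.9, 0.9])
```

    Spreading the server layer across providers raises its reliability multiplicatively, which is one quantitative motivation for the multi-cloud deployment the abstract describes.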

  19. A new cognitive rehabilitation programme for patients with multiple sclerosis: the 'MS-line! Project'.

    PubMed

    Gich, Jordi; Freixenet, Jordi; Garcia, Rafael; Vilanova, Joan Carles; Genís, David; Silva, Yolanda; Montalban, Xavier; Ramió-Torrentà, Lluís

    2015-09-01

    Cognitive rehabilitation is often delayed in multiple sclerosis (MS). The aim was to develop a free, MS-specific cognitive rehabilitation programme that can be used from the early stages and does not interfere with activities of daily living. MS-line!, a set of cognitive rehabilitation materials consisting of written, manipulative and computer-based materials with graded difficulty levels, was developed by a multidisciplinary team. Mathematical, problem-solving and word-based exercises were designed. Physical materials included spatial, coordination and reasoning games. Computer-based material included logic and reasoning, working memory and processing speed games. Cognitive rehabilitation exercises that are specific for MS patients have been successfully developed. © The Author(s), 2014.

  20. Can exergaming contribute to improving physical activity levels and health outcomes in children?

    PubMed

    Daley, Amanda J

    2009-08-01

    Physical inactivity among children is a serious public health problem. It has been suggested that high levels of screen time are contributory factors that encourage sedentary lifestyles in young people. As physical inactivity and obesity levels continue to rise in young people, it has been proposed that new-generation active computer- and video-console games (otherwise known as "exergaming") may offer the opportunity to contribute to young people's energy expenditure during their free time. Although studies have produced some encouraging results regarding the energy costs involved in playing active video-console games, the energy costs of playing the authentic versions of activity-based video games are substantially larger, highlighting that active gaming is no substitute for real sports and activities. A small number of exergaming activities engage children in moderate-intensity activity, but most do not. Only 3 very small trials have considered the effects of exergaming on physical activity levels and/or other health outcomes in children. Evidence from these trials has been mixed; positive trends for improvements in some health outcomes in the intervention groups were noted in 2 trials. No adequately powered randomized, controlled trial has been published to date, and no trial has assessed the long-term impact of exergaming on children's health. We now need high-quality randomized, controlled trials to evaluate the effectiveness and sustainability of exergaming, as well as its clinical relevance; until such studies take place, we should remain cautious about its ability to positively affect children's health.

  1. Association between leisure-time physical activity and C-reactive protein levels in adults, in the city of Salvador, Brazil.

    PubMed

    Pitanga, Francisco; Lessa, Ines

    2009-04-01

    Leisure-time physical activity (LTPA), defined as any type of bodily movement performed during leisure time, is associated with a reduction in the risk of many cardiovascular injuries. To investigate the existence of an association between leisure-time physical activity (LTPA) and C-reactive protein (CRP) levels in adults in the city of Salvador, State of Bahia, Brazil. This was a cross-sectional study with a sample of 822 men and women aged > 20 years. Participants were classified as active in leisure time if they self-reported the practice of physical activities during leisure time; high serum CRP was defined as values > 3.0 mg/l. Logistic regression analysis was used to compute the odds ratio (OR) with a 95% confidence interval (CI). Using multivariate analysis to adjust for potential confounders, we found an OR of 0.73 (0.68-0.79) among the men, indicating an association between LTPA and high CRP levels only in male individuals. After stratification by gender, obesity, diabetes and smoking habit, we found an association between LTPA and high CRP in non-obese and non-diabetic male smokers or former smokers, and in obese, non-smoking females. The results of this study may contribute to public health, since they can be used to raise awareness of the importance of LTPA as a prospective strategy for population health improvement.
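
    An odds ratio with a 95% Wald confidence interval of the kind reported can be computed from a 2×2 exposure/outcome table. The counts below are made up for illustration and are not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR = (a*d)/(b*c) with a Wald CI on the log scale.
    a,b = exposed with/without outcome; c,d = unexposed with/without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical table: 20/80 active with/without high CRP, 40/60 inactive.
or_, lo, hi = odds_ratio_ci(20, 80, 40, 60)
```

    An OR below 1 with a CI excluding 1, as in the study's 0.73 (0.68-0.79), indicates that activity is associated with lower odds of high CRP.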

  2. Braiding by Majorana tracking and long-range CNOT gates with color codes

    NASA Astrophysics Data System (ADS)

    Litinski, Daniel; von Oppen, Felix

    2017-11-01

    Color-code quantum computation seamlessly combines Majorana-based hardware with topological error correction. Specifically, as Clifford gates are transversal in two-dimensional color codes, they enable the use of the Majoranas' non-Abelian statistics for gate operations at the code level. Here, we discuss the implementation of color codes in arrays of Majorana nanowires that avoid branched networks such as T junctions, thereby simplifying their realization. We show that, in such implementations, non-Abelian statistics can be exploited without ever performing physical braiding operations. Physical braiding operations are replaced by Majorana tracking, an entirely software-based protocol which appropriately updates the Majoranas involved in the color-code stabilizer measurements. This approach minimizes the required hardware operations for single-qubit Clifford gates. For Clifford completeness, we combine color codes with surface codes, and use color-to-surface-code lattice surgery for long-range multitarget CNOT gates which have a time overhead that grows only logarithmically with the physical distance separating control and target qubits. With the addition of magic state distillation, our architecture describes a fault-tolerant universal quantum computer in systems such as networks of tetrons, hexons, or Majorana box qubits, but can also be applied to nontopological qubit platforms.

  3. Exploring design features for enhancing players' challenge in strategy games.

    PubMed

    Hsu, Shang Hwa; Wen, Ming-Hui; Wu, Muh-Cherng

    2007-06-01

    This paper examines how to make a player feel more challenged in a strategic computer game. It is hypothesized that information availability and resource advantage affect play difficulty, which in turn affects the challenge experienced. The difficulty of play can be defined in terms of the mental workload that players experience and the physical effort that players exert. Forty-five male college and graduate students participated in a 3 x 3 (information availability x resource advantage) between-subjects factorial design experiment. This experiment measured player mental workload, physical effort, and challenge. The results indicate that information availability affects player mental workload, and resource advantage affects levels of player physical effort, respectively. Moreover, the relationship between mental workload and challenge was found to be an inverted U-shaped curve; in other words, too much or too little mental workload may decrease player challenge. The relationship between physical effort and challenge exhibited similar characteristics.

  4. Validating an operational physical method to compute surface radiation from geostationary satellites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sengupta, Manajit; Dhere, Neelkanth G.; Wohlgemuth, John H.

    We developed models to compute global horizontal irradiance (GHI) and direct normal irradiance (DNI) over the last three decades. These models can be classified as empirical or physical based on their approach. Empirical models relate ground-based observations to satellite measurements and use these relations to compute surface radiation. Physical models consider the physics behind the radiation received at the satellite and create retrievals to estimate surface radiation. Furthermore, while empirical methods have traditionally been used for computing surface radiation for the solar energy industry, the advent of faster computing has made operational physical models viable. The Global Solar Insolation Project (GSIP) is a physical model that computes DNI and GHI using the visible and infrared channel measurements from a weather satellite. GSIP uses a two-stage scheme that first retrieves cloud properties and then uses those properties in a radiative transfer model to calculate GHI and DNI. Developed for polar orbiting satellites, GSIP has been adapted to NOAA's Geostationary Operational Environmental Satellite series and can run operationally at high spatial resolutions. Our method holds the possibility of creating high-quality datasets of GHI and DNI for use by the solar energy industry. We present an outline of the methodology and results from running the model, as well as a validation study using ground-based instruments.

  5. A Framework for Understanding Physics Students' Computational Modeling Practices

    ERIC Educational Resources Information Center

    Lunk, Brandon Robert

    2012-01-01

    With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content…

  6. Is thinking computable?

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1990-01-01

    Strong artificial intelligence claims that conscious thought can arise in computers containing the right algorithms even though none of the programs or components of those computers understand what is going on. As proof, it asserts that brains are finite webs of neurons, each with a definite function governed by the laws of physics; this web has a set of equations that can be solved (or simulated) by a sufficiently powerful computer. Strong AI claims the Turing test as a criterion of success. A recent debate in Scientific American concludes that the Turing test is not sufficient, but leaves intact the underlying premise that thought is a computable process. The recent book by Roger Penrose, however, offers a sharp challenge, arguing that the laws of quantum physics may govern mental processes and that these laws may not be computable. In every area of mathematics and physics, Penrose finds evidence of nonalgorithmic human activity and concludes that mental processes are inherently more powerful than computational processes.

  7. An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft

    NASA Technical Reports Server (NTRS)

    Olson, E. D.; Mavris, D. N.

    2000-01-01

    An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.

  8. Calculation of the Hadronic Vacuum Polarization Disconnected Contribution to the Muon Anomalous Magnetic Moment

    NASA Astrophysics Data System (ADS)

    Blum, T.; Boyle, P. A.; Izubuchi, T.; Jin, L.; Jüttner, A.; Lehner, C.; Maltman, K.; Marinkovic, M.; Portelli, A.; Spraggs, M.; Rbc; Ukqcd Collaborations

    2016-06-01

    We report the first lattice QCD calculation of the hadronic vacuum polarization (HVP) disconnected contribution to the muon anomalous magnetic moment at physical pion mass. The calculation uses a refined noise-reduction technique that enables the control of statistical uncertainties at the desired level with modest computational effort. Measurements were performed on the 48³×96 physical-pion-mass lattice generated by the RBC and UKQCD Collaborations. We find the leading-order hadronic vacuum polarization a_μ^{HVP(LO)disc} = -9.6(3.3)(2.3)×10⁻¹⁰, where the first error is statistical and the second systematic.

  9. Calculation of the Hadronic Vacuum Polarization Disconnected Contribution to the Muon Anomalous Magnetic Moment.

    PubMed

    Blum, T; Boyle, P A; Izubuchi, T; Jin, L; Jüttner, A; Lehner, C; Maltman, K; Marinkovic, M; Portelli, A; Spraggs, M

    2016-06-10

    We report the first lattice QCD calculation of the hadronic vacuum polarization (HVP) disconnected contribution to the muon anomalous magnetic moment at physical pion mass. The calculation uses a refined noise-reduction technique that enables the control of statistical uncertainties at the desired level with modest computational effort. Measurements were performed on the 48^{3}×96 physical-pion-mass lattice generated by the RBC and UKQCD Collaborations. We find the leading-order hadronic vacuum polarization a_{μ}^{HVP(LO)disc}=-9.6(3.3)(2.3)×10^{-10}, where the first error is statistical and the second systematic.
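
    The two quoted uncertainties, statistical 3.3 and systematic 2.3 (in units of 10⁻¹⁰), are conventionally combined in quadrature when a single error bar is wanted. This is standard error-propagation practice, not something the abstract itself states:

```python
import math

stat, syst = 3.3, 2.3            # uncertainties in units of 1e-10
total = math.hypot(stat, syst)   # quadrature sum: sqrt(stat**2 + syst**2)
```

    The combined uncertainty is about 4.0×10⁻¹⁰ on the central value of −9.6×10⁻¹⁰.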

  10. Beam and Plasma Physics Research

    DTIC Science & Technology

    1990-06-01

    Research covered high power microwave (HPM) computations and theory and high energy plasma computations and theory. The HPM computations concentrated on... [Excerpt from the report index:] 2.1 Report Index; 2.2 Task Area 2: High-Power RF Emission and Charged-Particle Beam Physics Computation, Modeling and Theory; 2.2.1 Subtask 02-01, ...; 2.2.5 Subtask 02-05, Vulnerability of Space Assets; 2.2.6 Subtask 02-06, Microwave Computer Program Enhancements; 2.2.7 Subtask 02-07, High-Power Microwave Transvertron Design.

  11. Simulating the Physical World

    NASA Astrophysics Data System (ADS)

    Berendsen, Herman J. C.

    2004-06-01

    The simulation of physical systems requires a simplified, hierarchical approach which models each level from the atomistic to the macroscopic scale. From quantum mechanics to fluid dynamics, this book systematically treats the broad scope of computer modeling and simulation, describing the fundamental theory behind each level of approximation. Berendsen evaluates each stage in relation to its applications, giving the reader insight into the possibilities and limitations of the models. Practical guidance for applications and sample programs in Python are provided. With a strong emphasis on molecular models in chemistry and biochemistry, this book will be suitable for advanced undergraduate and graduate courses on molecular modeling and simulation within physics, biophysics, physical chemistry and materials science. It will also be a useful reference for all those working in the field. Additional resources for this title, including solutions for instructors and programs, are available online at www.cambridge.org/9780521835275. It is the first book to cover the wide range of modeling and simulations, from the atomistic to the macroscopic scale, in a systematic fashion; providing a wealth of background material, it does not assume advanced knowledge and is eminently suitable for course use; and it contains practical examples and sample programs in Python.

  12. Developing affordable multi-touch technologies for use in physics

    NASA Astrophysics Data System (ADS)

    Potter, Mark; Ilie, Carolina; Schofield, Damian; Vampola, David

    2012-02-01

    Physics is one of many areas with the ability to benefit from a number of different teaching styles and sophisticated instructional tools, since it has both theoretical and practical applications which can be explored. The purpose of this research is to develop affordable large-scale multi-touch interfaces which can be used within and outside of the classroom as both an instructional technology and a computer-supported collaborative learning tool. This technology can be implemented not only at the university level, but also at the K-12 level of education. Pedagogical research indicates that kinesthetic learning is a fundamental, powerful, and ubiquitous learning style [1]. Through the use of these types of multi-touch tools, and teaching methods which incorporate them, the classroom can be enriched to allow for better comprehension and retention of information. This is due in part to a wider range of learning styles, such as kinesthetic learning, being catered to within the classroom. [1] Wieman, C.E., Perkins, K.K., Adams, W.K., "Oersted Medal Lecture 2007: Interactive simulations for teaching physics: What works, what doesn't, and why," American Journal of Physics 76, 393-99.

  13. Physically-Based Models for the Reflection, Transmission and Subsurface Scattering of Light by Smooth and Rough Surfaces, with Applications to Realistic Image Synthesis

    NASA Astrophysics Data System (ADS)

    He, Xiao Dong

    This thesis studies light scattering processes off rough surfaces. Analytic models for the reflection, transmission and subsurface scattering of light are developed. The results are applicable to realistic image generation in computer graphics. The investigation focuses on the basic issue of how light is scattered locally by general surfaces which are neither diffuse nor specular; physical optics is employed to account for diffraction and interference, which play a crucial role in the scattering of light for most surfaces. The thesis presents: (1) a new reflectance model; (2) a new transmittance model; (3) a new subsurface scattering model. All of these models are physically based, depend only on physical parameters, apply to a wide range of materials and surface finishes and, more importantly, provide a smooth transition from diffuse-like to specular reflection as the wavelength and incidence angle are increased or the surface roughness is decreased. The reflectance and transmittance models are based on the Kirchhoff theory, and the subsurface scattering model is based on energy transport theory. They are valid only for surfaces with shallow slopes. The thesis shows that predicted reflectance distributions given by the reflectance model compare favorably with experiment. The thesis also investigates and implements fast ways of computing the reflectance and transmittance models. Furthermore, the thesis demonstrates that a high level of realistic image generation can be achieved due to the physically correct treatment of the scattering processes by the reflectance model.

  14. The impact of working technique on physical loads - an exposure profile among newspaper editors.

    PubMed

    Lindegård, A; Wahlström, J; Hagberg, M; Hansson, G-A; Jonsson, P; Wigaeus Tornqvist, E

    2003-05-15

    The aim of this study was to investigate the possible associations between working technique, sex, symptoms and level of physical load in VDU work. A study group of 32 employees in the editing department of a daily newspaper answered a questionnaire about physical working conditions and symptoms from the neck and the upper extremities. Muscular load, wrist positions and computer mouse forces were measured. Working technique was assessed from an observation protocol for computer work. In addition, ratings of perceived exertion and overall comfort were collected. The results showed that subjects classified as having a good working technique worked with less muscular load in the forearm (extensor carpi ulnaris, p=0.03) and in the trapezius muscle on the mouse-operating side (p=0.02) compared to subjects classified as having a poor working technique. Moreover, there were no differences in gap frequency (number of episodes when muscle activity is below 2.5% of a reference contraction) or muscular rest (total duration of gaps) between the two working technique groups. Women in this study used more force (mean force p=0.006, peak force p=0.02), expressed as % MVC, than the men when operating the computer mouse. No major differences were shown in muscular load, wrist postures, perceived exertion or perceived comfort between men and women or between cases and symptom-free subjects. In conclusion, a good working technique was associated with reduced muscular load in the forearm muscles and in the trapezius muscle on the mouse-operating side. Moreover, women used more force (mean and peak) than men when operating the click button (left button) of the computer mouse.

  15. Department of Defense Laboratory Civilian Science and Engineering Workforce - 2011

    DTIC Science & Technology

    2011-05-01

    [Excerpt from an occupational-series table: 130 Foreign Affairs; 131 International Relations; 633 Physical Therapist; 644 Medical Technologist; 1222 Patent Attorney; 1301 General Physical Science ...] ... physical movement of people. Governments in many industrialized countries increasingly view the immigration of skilled S&E workers as an important ... series and their associated increases are individuals in computer science (+77/2.6%), physics (+67/4.6%), computer engineering (+58/2.7%), general

  16. Association of Markers of Inflammation with Sleep and Physical Activity Among People Living with HIV or AIDS.

    PubMed

    Wirth, Michael D; Jaggers, Jason R; Dudgeon, Wesley D; Hébert, James R; Youngstedt, Shawn D; Blair, Steven N; Hand, Gregory A

    2015-06-01

    This study examined associations of sleep and minutes spent in moderate-to-vigorous physical activity (MVPA) with C-reactive protein (CRP) and interleukin (IL)-6 among persons living with HIV. Cross-sectional analyses (n = 45) focused on associations of inflammatory outcomes (i.e., CRP and IL-6) with actigraph-derived sleep duration, latency, and efficiency; sleep onset; wake time; and wake-after-sleep-onset; as well as MVPA. Least-squares means for CRP and IL-6 by levels of sleep and MVPA were computed from general linear models. Individuals below the median of sleep duration, above the median for sleep onset, and below the median of MVPA minutes had higher CRP or IL-6 levels. Generally, individuals with both low MVPA and poor sleep characteristics had higher inflammation levels than those with either risk factor alone. Understanding the combined impact of multiple lifestyle/behavioral factors on inflammation could inform intervention strategies to reduce inflammation and, therefore, chronic disease risk.

  17. Computer-based assistive technology device for use by children with physical disabilities: a cross-sectional study.

    PubMed

    Lidström, Helene; Almqvist, Lena; Hemmingsson, Helena

    2012-07-01

    To investigate the prevalence of children with physical disabilities who used a computer-based assistive technology device (ATD), and to examine differences in the characteristics of children and youths who do or do not use computer-based ATDs, as well as differences that might influence the satisfaction of those two groups when computers are used for in-school and outside-school activities. A cross-sectional survey about computer-based activities in and outside school (n = 287) and group comparisons. The prevalence of using computer-based ATDs was about 44% (n = 127) of the children in this sample. These children were less satisfied with their computer use in education and in outside-school activities than the children who did not use an ATD. Improved coordination of the use of computer-based ATDs in school and at home, including service and support, could increase the opportunities for children with physical disabilities who use computer-based ATDs to perform the computer activities they want, need and are expected to do in and outside school.

  18. High fidelity simulation and analysis of liquid jet atomization in a gaseous crossflow at intermediate Weber numbers

    NASA Astrophysics Data System (ADS)

    Li, Xiaoyi; Soteriou, Marios C.

    2016-08-01

    Recent advances in numerical methods coupled with the substantial enhancements in computing power and the advent of high performance computing have presented first principle, high fidelity simulation as a viable tool in the prediction and analysis of spray atomization processes. The credibility and potential impact of such simulations, however, has been hampered by the relative absence of detailed validation against experimental evidence. The numerical stability and accuracy challenges arising from the need to simulate the high liquid-gas density ratio across the sharp interfaces encountered in these flows are key reasons for this. In this work we challenge this status quo by presenting a numerical model able to deal with these challenges, employing it in simulations of liquid jet in crossflow atomization and performing extensive validation of its results against a carefully executed experiment with detailed measurements in the atomization region. We then proceed to the detailed analysis of the flow physics. The computational model employs the coupled level set and volume of fluid approach to directly capture the spatiotemporal evolution of the liquid-gas interface and the sharp-interface ghost fluid method to stably handle high liquid-air density ratio. Adaptive mesh refinement and Lagrangian droplet models are shown to be viable options for computational cost reduction. Moreover, high performance computing is leveraged to manage the computational cost. The experiment selected for validation eliminates the impact of inlet liquid and gas turbulence and focuses on the impact of the crossflow aerodynamic forces on the atomization physics. Validation is demonstrated by comparing column surface wavelengths, deformation, breakup locations, column trajectories and droplet sizes, velocities, and mass rates for a range of intermediate Weber numbers. 
Analysis of the physics is performed in terms of the instability and breakup characteristics and the features of downstream flow recirculation, and vortex shedding. Formation of "Λ" shape windward column waves is observed and explained by the combined upward and lateral surface motion. The existence of Rayleigh-Taylor instability as the primary mechanism for the windward column waves is verified for this case by comparing wavelengths from the simulations to those predicted by linear stability analyses. Physical arguments are employed to postulate that the type of instability manifested may be related to conditions such as the gas Weber number and the inlet turbulence level. The decreased column wavelength with increasing Weber number is found to cause enhanced surface stripping and early depletion of liquid core at higher Weber number. A peculiar "three-streak-two-membrane" liquid structure is identified at the lowest Weber number and explained as the consequence of the symmetric recirculation zones behind the jet column. It is found that the vortical flow downstream of the liquid column resembles a von Karman vortex street and that the coupling between the gas flow and droplet transport is weak for the conditions explored.

  19. High fidelity simulation and analysis of liquid jet atomization in a gaseous crossflow at intermediate Weber numbers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Xiaoyi, E-mail: lixy2@utrc.utc.com; Soteriou, Marios C.

    Recent advances in numerical methods coupled with the substantial enhancements in computing power and the advent of high performance computing have presented first principle, high fidelity simulation as a viable tool in the prediction and analysis of spray atomization processes. The credibility and potential impact of such simulations, however, has been hampered by the relative absence of detailed validation against experimental evidence. The numerical stability and accuracy challenges arising from the need to simulate the high liquid-gas density ratio across the sharp interfaces encountered in these flows are key reasons for this. In this work we challenge this status quo by presenting a numerical model able to deal with these challenges, employing it in simulations of liquid jet in crossflow atomization and performing extensive validation of its results against a carefully executed experiment with detailed measurements in the atomization region. We then proceed to the detailed analysis of the flow physics. The computational model employs the coupled level set and volume of fluid approach to directly capture the spatiotemporal evolution of the liquid-gas interface and the sharp-interface ghost fluid method to stably handle high liquid-air density ratio. Adaptive mesh refinement and Lagrangian droplet models are shown to be viable options for computational cost reduction. Moreover, high performance computing is leveraged to manage the computational cost. The experiment selected for validation eliminates the impact of inlet liquid and gas turbulence and focuses on the impact of the crossflow aerodynamic forces on the atomization physics. Validation is demonstrated by comparing column surface wavelengths, deformation, breakup locations, column trajectories and droplet sizes, velocities, and mass rates for a range of intermediate Weber numbers.
Analysis of the physics is performed in terms of the instability and breakup characteristics and the features of downstream flow recirculation, and vortex shedding. Formation of “Λ” shape windward column waves is observed and explained by the combined upward and lateral surface motion. The existence of Rayleigh-Taylor instability as the primary mechanism for the windward column waves is verified for this case by comparing wavelengths from the simulations to those predicted by linear stability analyses. Physical arguments are employed to postulate that the type of instability manifested may be related to conditions such as the gas Weber number and the inlet turbulence level. The decreased column wavelength with increasing Weber number is found to cause enhanced surface stripping and early depletion of liquid core at higher Weber number. A peculiar “three-streak-two-membrane” liquid structure is identified at the lowest Weber number and explained as the consequence of the symmetric recirculation zones behind the jet column. It is found that the vortical flow downstream of the liquid column resembles a von Karman vortex street and that the coupling between the gas flow and droplet transport is weak for the conditions explored.

  20. Operational health physics.

    PubMed

    Miller, Kenneth L

    2005-06-01

    A review of the operational health physics papers published in Health Physics and Operational Radiation Safety over the past fifteen years indicated seventeen general categories or areas into which the topics could be readily separated. These areas include academic research programs, use of computers in operational health physics, decontamination and decommissioning, dosimetry, emergency response, environmental health physics, industrial operations, medical health physics, new procedure development, non-ionizing radiation, radiation measurements, radioactive waste disposal, radon measurement and control, risk communication, shielding evaluation and specification, staffing levels for health physics programs, and unwanted or orphan sources. That is not to say that there are no operational papers dealing with specific areas of health physics, such as power reactor health physics, accelerator health physics, or governmental health physics. On the contrary, there have been a number of excellent operational papers from individuals in these specialty areas and they are included in the broader topics listed above. A listing and review of all the operational papers that have been published is beyond the scope of this discussion. However, a sampling of the excellent operational papers that have appeared in Health Physics and Operational Radiation Safety is presented to give the reader the flavor of the wide variety of concerns to the operational health physicist and the current areas of interest where procedures are being refined and solutions to problems are being developed.

  1. Operational health physics.

    PubMed

    Miller, Kenneth L

    2005-01-01

    A review of the operational health physics papers published in Health Physics and Operational Radiation Safety over the past fifteen years indicated seventeen general categories or areas into which the topics could be readily separated. These areas include academic research programs, use of computers in operational health physics, decontamination and decommissioning, dosimetry, emergency response, environmental health physics, industrial operations, medical health physics, new procedure development, non-ionizing radiation, radiation measurements, radioactive waste disposal, radon measurement and control, risk communication, shielding evaluation and specification, staffing levels for health physics programs, and unwanted or orphan sources. That is not to say that there are no operational papers dealing with specific areas of health physics, such as power reactor health physics, accelerator health physics, or governmental health physics. On the contrary, there have been a number of excellent operational papers from individuals in these specialty areas and they are included in the broader topics listed above. A listing and review of all the operational papers that have been published is beyond the scope of this discussion. However, a sampling of the excellent operational papers that have appeared in Health Physics and Operational Radiation Safety is presented to give the reader the flavor of the wide variety of concerns to the operational health physicist and the current areas of interest where procedures are being refined and solutions to problems are being developed.

  2. Assessing the Effects of Data Compression in Simulations Using Physically Motivated Metrics

    DOE PAGES

    Laney, Daniel; Langer, Steven; Weber, Christopher; ...

    2014-01-01

    This paper examines whether lossy compression can be used effectively in physics simulations as a possible strategy to combat the expected data-movement bottleneck in future high performance computing architectures. We show that, for the codes and simulations we tested, compression levels of 3–5X can be applied without causing significant changes to important physical quantities. Rather than applying signal processing error metrics, we utilize physics-based metrics appropriate for each code to assess the impact of compression. We evaluate three different simulation codes: a Lagrangian shock-hydrodynamics code, an Eulerian higher-order hydrodynamics turbulence modeling code, and an Eulerian coupled laser-plasma interaction code. We compress relevant quantities after each time-step to approximate the effects of tightly coupled compression and study the compression rates to estimate memory and disk-bandwidth reduction. We find that the error characteristics of compression algorithms must be carefully considered in the context of the underlying physics being modeled.
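A toy illustration of judging lossy compression by a physics-based metric rather than a signal-processing error norm. Uniform quantization stands in for a real compressor, and the conservation check below is an assumed example, not one of the paper's actual metrics:

```python
import math

def lossy_compress(xs, bits=8):
    """Uniform quantization as a crude stand-in for a real lossy compressor."""
    lo, hi = min(xs), max(xs)
    levels = 2 ** bits - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    return [lo + round((x - lo) / scale) * scale for x in xs]

# Physics-based acceptance metric: conservation of the field's integral
# ("mass"), regardless of how large the pointwise errors are.
density = [1.0 + 0.5 * math.sin(i / 10) for i in range(1000)]
recon = lossy_compress(density, bits=8)
mass_err = abs(sum(recon) - sum(density)) / sum(density)
```

A field can pass a pointwise error bound yet drift in its integral, or vice versa, which is why per-code physical metrics matter.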

  3. Klein–Gordon equation in curved space-time

    NASA Astrophysics Data System (ADS)

    Lehn, R. D.; Chabysheva, S. S.; Hiller, J. R.

    2018-07-01

    We report the methods and results of a computational physics project on the solution of the relativistic Klein–Gordon equation for a light particle gravitationally bound to a heavy central mass. The gravitational interaction is prescribed by the metric of a spherically symmetric space-time. Metrics are considered for an impenetrable sphere, a soft sphere of uniform density, and a soft sphere with a linear transition from constant to zero density; in each case the radius of the central mass is chosen to be sufficient to avoid any event horizon. The solutions are obtained numerically and compared with nonrelativistic Coulomb-type solutions, both directly and in perturbation theory, to study the general-relativistic corrections to the quantum solutions for a 1/r potential. The density profile with a linear transition is chosen to avoid singularities in the wave equation that can be caused by a discontinuous derivative of the density. This project should be of interest to instructors and students of computational physics at the graduate and advanced undergraduate levels.

  4. Assessment of physical activity with the Computer Science and Applications, Inc., accelerometer: laboratory versus field validation.

    PubMed

    Nichols, J F; Morgan, C G; Chabot, L E; Sallis, J F; Calfas, K J

    2000-03-01

    Our purpose was to compare the validity of the Computer Science and Applications, Inc. (CSA) accelerometer in laboratory and field settings and establish CSA count ranges for light, moderate, and vigorous physical activity. Validity was determined in 60 adults during treadmill exercise, using oxygen consumption (VO2) as the criterion measure, while 30 adults walked and jogged outdoors on a 400-m track. The relationship between CSA counts and VO2 was linear (R2 = .89, SEE = 3.72 ml.kg-1.min-1), as was the relationship between velocity and counts in the field (R2 = .89, SEE = 0.89 mi.hr-1). However, significant differences were found (p < .05) between laboratory and field measures of CSA counts for light and vigorous intensity. We conclude that the CSA can be used to quantify walking and jogging outdoors on level ground; however, laboratory equations may not be appropriate for use in field settings, particularly for light and vigorous activity.
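The count-to-intensity calibration described in this record can be sketched as a least-squares fit of VO2 on counts, with MET-based cut-points; all numbers below are hypothetical, not the study's regression coefficients:

```python
# Illustrative linear calibration of accelerometer counts against measured
# VO2 (ml·kg-1·min-1), in the spirit of the treadmill regression reported
# (R2 = .89). The calibration data below are invented for demonstration.
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

counts = [500, 2000, 4000, 6000, 8000]   # hypothetical counts/min
vo2    = [7.0, 12.5, 20.0, 27.5, 35.0]   # hypothetical measured VO2

slope, intercept = fit_line(counts, vo2)

def intensity(c):
    """Map counts to an intensity class via METs (1 MET = 3.5 ml·kg-1·min-1);
    3-6 METs is the conventional 'moderate' band."""
    mets = (slope * c + intercept) / 3.5
    if mets < 3:
        return "light"
    return "moderate" if mets < 6 else "vigorous"
```

The study's caution applies here too: a regression fitted on treadmill data may misclassify light and vigorous activity in the field.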

  5. GPUs in a computational physics course

    NASA Astrophysics Data System (ADS)

    Adler, Joan; Nissim, Gal; Kiswani, Ahmad

    2017-10-01

    In an introductory computational physics class of the type that many of us give, time constraints lead to hard choices on topics. Everyone likes to include their own research in such a class but an overview of many areas is paramount. Parallel programming algorithms using MPI is one important topic. Both the principle and the need to break the “fear barrier” of using a large machine with a queuing system via ssh must be successfully passed on. Due to the plateau in chip development and to power considerations, future HPC hardware choices will include heavy use of GPUs. Thus the need to introduce these at the level of an introductory course has arisen. Just as for parallel coding, explanation of the benefits and simple examples to guide the hesitant first time user should be selected. Several student projects using GPUs that include how-to pages were proposed at the Technion. Two of the more successful ones were lattice Boltzmann and a finite element code, and we present these in detail.

  6. Light element opacities of astrophysical interest from ATOMIC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colgan, J.; Kilcrease, D. P.; Magee, N. H. Jr.

    We present new calculations of local-thermodynamic-equilibrium (LTE) light element opacities from the Los Alamos ATOMIC code for systems of astrophysical interest. ATOMIC is a multi-purpose code that can generate LTE or non-LTE quantities of interest at various levels of approximation. Our calculations, which include fine-structure detail, represent a systematic improvement over previous Los Alamos opacity calculations using the LEDCOP legacy code. The ATOMIC code uses ab-initio atomic structure data computed from the CATS code, which is based on Cowan's atomic structure codes, and photoionization cross section data computed from the Los Alamos ionization code GIPPER. ATOMIC also incorporates a new equation-of-state (EOS) model based on the chemical picture. ATOMIC incorporates some physics packages from LEDCOP and also includes additional physical processes, such as improved free-free cross sections and additional scattering mechanisms. Our new calculations are made for elements of astrophysical interest and for a wide range of temperatures and densities.

  7. Virtualization and cloud computing in dentistry.

    PubMed

    Chow, Frank; Muftu, Ali; Shorter, Richard

    2014-01-01

    The use of virtualization and cloud computing has changed the way we use computers. Virtualization is a method of placing software called a hypervisor on the hardware of a computer or a host operating system. It allows a guest operating system to run on top of the physical computer with a virtual machine (i.e., virtual computer). Virtualization allows multiple virtual computers to run on top of one physical computer and to share its hardware resources, such as printers, scanners, and modems. This increases the efficient use of the computer by decreasing costs (e.g., hardware, electricity, administration, and management) since only one physical computer is needed and running. This virtualization platform is the basis for cloud computing. It has expanded into areas of server and storage virtualization. One of the commonly used dental storage systems is cloud storage. Patient information is encrypted as required by the Health Insurance Portability and Accountability Act (HIPAA) and stored on off-site private cloud services for a monthly service fee. As computer costs continue to increase, so too will the need for more storage and processing power. Virtual and cloud computing will be a method for dentists to minimize costs and maximize computer efficiency in the near future. This article will provide some useful information on current uses of cloud computing.

  8. A dental public health approach based on computational mathematics: Monte Carlo simulation of childhood dental decay.

    PubMed

    Tennant, Marc; Kruger, Estie

    2013-02-01

    This study developed a Monte Carlo simulation approach to examining the prevalence and incidence of dental decay using Australian children as a test environment. Monte Carlo simulation has been used for half a century in particle physics (and elsewhere); put simply, it uses probabilities of various population-level outcomes, seeded randomly, to drive the production of individual-level data. A total of five runs of the simulation model for all 275,000 12-year-olds in Australia were completed based on 2005-2006 data. Measured on average decayed/missing/filled teeth (DMFT) and the DMFT of the highest 10% of the sample (Sic10), the runs did not differ from each other by more than 2% and the outcome was within 5% of the reported sampled population data. The simulations rested on the population probabilities that are known to be strongly linked to dental decay, namely, socio-economic status and Indigenous heritage. Testing the simulated population found DMFT of all cases where DMFT > 0 was 2.3 (n = 128,609) and DMFT for Indigenous cases only was 1.9 (n = 13,749). In the simulation population the Sic25 was 3.3 (n = 68,750). Monte Carlo simulations were created in particle physics as a computational mathematical approach to unknown individual-level effects by resting a simulation on known population-level probabilities. In this study a Monte Carlo simulation approach to childhood dental decay was built, tested and validated. © 2013 FDI World Dental Federation.
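The population-probability-driven simulation described in this record can be sketched as follows; the strata and probabilities are invented for illustration and are not the study's parameters:

```python
import random

# Hypothetical population-level decay probabilities by risk stratum
# (illustrative values only, not the study's actual inputs).
P_DECAY = {"low_ses": 0.55, "high_ses": 0.35, "indigenous": 0.65}

def simulate_child(stratum, rng):
    """Seed one child's DMFT count from the stratum's decay probability."""
    if rng.random() >= P_DECAY[stratum]:
        return 0  # no decay
    return rng.randint(1, 6)  # among affected children, draw a small DMFT

def simulate_population(n, stratum, seed=0):
    """Generate individual-level DMFT data from population-level probabilities."""
    rng = random.Random(seed)
    return [simulate_child(stratum, rng) for _ in range(n)]

dmft = simulate_population(10_000, "low_ses")
affected = [d for d in dmft if d > 0]
prevalence = len(affected) / len(dmft)
mean_dmft_affected = sum(affected) / len(affected)
```

Repeating runs with different seeds and comparing summary statistics (as the study did across five runs) checks the stability of the simulated population.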

  9. The relation between children's accuracy estimates of their physical competence and achievement-related characteristics.

    PubMed

    Weiss, M R; Horn, T S

    1990-09-01

    The relationship between perceptions of competence and control, achievement, and motivated behavior in youth sport has been a topic of considerable interest. The purpose of this study was to examine whether children who are under-, accurate, or overestimators of their physical competence differ in their achievement characteristics. Children (N = 133), 8 to 13 years of age, who were attending a summer sport program, completed a series of questionnaires designed to assess perceptions of competence and control, motivational orientation, and competitive trait anxiety. Measures of physical competence were obtained by teachers' ratings that paralleled the children's measure of perceived competence. Perceived competence and teachers' ratings were standardized by grade level, and an accuracy score was computed from the difference between these scores. Children were then categorized as underestimators, accurate raters, or overestimators according to upper and lower quartiles of this distribution. A 2 x 2 x 3 (age level by gender by accuracy) MANCOVA revealed a significant gender by accuracy interaction. Underestimating girls were lower in challenge motivation, higher in trait anxiety, and more external in their control perceptions than accurate or overestimators. Underestimating boys were higher in perceived unknown control than accurate and overestimating boys. It was concluded that children who seriously underestimate their perceived competence may be likely candidates for discontinuation of sport activities or low levels of physical achievement.
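A sketch of the accuracy-score construction described above: standardized self-rating minus standardized teacher rating, with upper and lower quartiles defining over- and underestimators. Note the study standardized within grade level, which this toy version omits:

```python
import statistics

def zscores(xs):
    """Standardize a list of ratings to mean 0, SD 1."""
    mu, sd = statistics.mean(xs), statistics.pstdev(xs)
    return [(x - mu) / sd for x in xs]

def categorize(perceived, teacher_rated):
    """Accuracy score = standardized self-rating minus standardized teacher
    rating; lower/upper quartiles mark under- and overestimators."""
    acc = [p - t for p, t in zip(zscores(perceived), zscores(teacher_rated))]
    srt = sorted(acc)
    q1 = srt[len(srt) // 4]
    q3 = srt[(3 * len(srt)) // 4]
    return ["under" if a <= q1 else "over" if a >= q3 else "accurate"
            for a in acc]

# Hypothetical ratings for eight children
perceived = [1, 2, 3, 4, 5, 6, 7, 8]
teacher = [8, 7, 6, 5, 4, 3, 2, 1]
labels = categorize(perceived, teacher)
```

Children labeled "under" rate themselves well below their teacher's assessment, the group the study flags as at risk for dropout from sport.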

  10. Teaching computer interfacing with virtual instruments in an object-oriented language.

    PubMed Central

    Gulotta, M

    1995-01-01

    LabVIEW is a graphic object-oriented computer language developed to facilitate hardware/software communication. LabVIEW is a complete computer language that can be used like Basic, FORTRAN, or C. In LabVIEW one creates virtual instruments that aesthetically look like real instruments but are controlled by sophisticated computer programs. There are several levels of data acquisition VIs that make it easy to control data flow, and many signal processing and analysis algorithms come with the software as premade VIs. In the classroom, the similarity between virtual and real instruments helps students understand how information is passed between the computer and attached instruments. The software may be used in the absence of hardware so that students can work at home as well as in the classroom. This article demonstrates how LabVIEW can be used to control data flow between computers and instruments, points out important features for signal processing and analysis, and shows how virtual instruments may be used in place of physical instrumentation. Applications of LabVIEW to the teaching laboratory are also discussed, and a plausible course outline is given. PMID:8580361

  11. Teaching computer interfacing with virtual instruments in an object-oriented language.

    PubMed

    Gulotta, M

    1995-11-01

    LabVIEW is a graphic object-oriented computer language developed to facilitate hardware/software communication. LabVIEW is a complete computer language that can be used like Basic, FORTRAN, or C. In LabVIEW one creates virtual instruments that aesthetically look like real instruments but are controlled by sophisticated computer programs. There are several levels of data acquisition VIs that make it easy to control data flow, and many signal processing and analysis algorithms come with the software as premade VIs. In the classroom, the similarity between virtual and real instruments helps students understand how information is passed between the computer and attached instruments. The software may be used in the absence of hardware so that students can work at home as well as in the classroom. This article demonstrates how LabVIEW can be used to control data flow between computers and instruments, points out important features for signal processing and analysis, and shows how virtual instruments may be used in place of physical instrumentation. Applications of LabVIEW to the teaching laboratory are also discussed, and a plausible course outline is given.

  12. Physical biology of human brain development.

    PubMed

    Budday, Silvia; Steinmann, Paul; Kuhl, Ellen

    2015-01-01

    Neurodevelopment is a complex, dynamic process that involves a precisely orchestrated sequence of genetic, environmental, biochemical, and physical events. Developmental biology and genetics have shaped our understanding of the molecular and cellular mechanisms during neurodevelopment. Recent studies suggest that physical forces play a central role in translating these cellular mechanisms into the complex surface morphology of the human brain. However, the precise impact of neuronal differentiation, migration, and connection on the physical forces during cortical folding remains unknown. Here we review the cellular mechanisms of neurodevelopment with a view toward surface morphogenesis, pattern selection, and evolution of shape. We revisit cortical folding as the instability problem of constrained differential growth in a multi-layered system. To identify the contributing factors of differential growth, we map out the timeline of neurodevelopment in humans and highlight the cellular events associated with extreme radial and tangential expansion. We demonstrate how computational modeling of differential growth can bridge the scales, from phenomena on the cellular level toward form and function on the organ level, to make quantitative, personalized predictions. Physics-based models can quantify cortical stresses, identify critical folding conditions, rationalize pattern selection, and predict gyral wavelengths and gyrification indices. We illustrate that physical forces can explain cortical malformations as emergent properties of developmental disorders. Combining biology and physics holds promise to advance our understanding of human brain development and enable early diagnostics of cortical malformations with the ultimate goal to improve treatment of neurodevelopmental disorders including epilepsy, autism spectrum disorders, and schizophrenia.

  13. Toward Computational Design of High-Efficiency Photovoltaics from First-Principles

    DTIC Science & Technology

    2016-08-15

    dependence of exciton diffusion in conjugated small molecules, Applied Physics Letters, (04 2014): 0. doi: 10.1063/1.4871303 Guangfen Wu, Zi Li, Xu... principle approach based on the time-dependent density functional theory (TDDFT) to describe exciton states, including energy levels and many-body wave... depends more sensitively on the dimension and crystallinity of the acceptor parallel to the interface than normal to the interface. Reorganization

  14. Mother Earth, Father Sky Symposium

    NASA Technical Reports Server (NTRS)

    Bowman, B.

    1977-01-01

    A conference was held in which minority aerospace scientists and engineers interacted with the minority community, particularly at the junior high, high school, and college levels. There were two presentations in the biological sciences, two in the physical and environmental sciences, seven in engineering and computer sciences, and nine in aerospace science and engineering. Aerospace technology careers and aerospace activities were discussed as to how they are relevant to minorities and women.

  15. USAF Summer Faculty Research Program. 1981 Research Reports. Volume I.

    DTIC Science & Technology

    1981-10-01

    Kent, OH 44242 (216) 672-2816 Dr. Martin D. Altschuler Degree: PhD, Physics and Astronomy, 1964 Associate Professor Specialty: Robot Vision, Surface...line inspection and control, computer-aided manufacturing, robot vision, mapping of machine parts and castings, etc. The technique we developed...posture, reduced healing time and bacteria level, and improved capacity for work endurance and efficiency.1,2 Federal agencies, such as the FDA and

  16. The Self-Assembly of Particles with Multipolar Interactions

    DTIC Science & Technology

    2004-01-01

    the LaTeX template in which this thesis has been written. I also thank Kevin Van Workum and Jack Douglas for contributing simulation work and some...of the computational expense of simulating such complex self-assembly systems at the molecular level and a desire to understand the self-assembly at...Dissertation directed by: Professor Wolfgang Losert Department of Physics In this thesis, we describe results from investigations of the self-assembly of

  17. Parallel Computing: Some Activities in High Energy Physics

    NASA Astrophysics Data System (ADS)

    Willers, Ian

    This paper examines some activities in High Energy Physics that utilise parallel computing. The topic includes all computing from the proposed SIMD front end detectors, the farming applications, high-powered RISC processors and the large machines in the computer centers. We start by looking at the motivation behind using parallelism for general purpose computing. The developments around farming are then described from its simplest form to the more complex system in Fermilab. Finally, there is a list of some developments that are happening close to the experiments.

  18. A proposed protocol for acceptance and constancy control of computed tomography systems: a Nordic Association for Clinical Physics (NACP) work group report.

    PubMed

    Kuttner, Samuel; Bujila, Robert; Kortesniemi, Mika; Andersson, Henrik; Kull, Love; Østerås, Bjørn Helge; Thygesen, Jesper; Tarp, Ivanka Sojat

    2013-03-01

    Quality assurance (QA) of computed tomography (CT) systems is one of the routine tasks for medical physicists in the Nordic countries. However, standardized QA protocols do not yet exist and the QA methods, as well as the applied tolerance levels, vary in scope and extent at different hospitals. To propose a standardized protocol for acceptance and constancy testing of CT scanners in the Nordic Region. Following a Nordic Association for Clinical Physics (NACP) initiative, a group of medical physicists, with representatives from four Nordic countries, was formed. Based on international literature and practical experience within the group, a comprehensive standardized test protocol was developed. The proposed protocol includes tests related to the mechanical functionality, X-ray tube, detector, and image quality for CT scanners. For each test, recommendations regarding the purpose, equipment needed, an outline of the test method, the measured parameter, tolerance levels, and the testing frequency are stated. In addition, a number of optional tests are briefly discussed that may provide further information about the CT system. Based on international references and medical physicists' practical experiences, a comprehensive QA protocol for CT systems is proposed, including both acceptance and constancy tests. The protocol may serve as a reference for medical physicists in the Nordic countries.

  19. Defense strategies for cloud computing multi-site server infrastructures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S.; Ma, Chris Y. T.; He, Fei

    We consider cloud computing server infrastructures for big data applications, which consist of multiple server sites connected over a wide-area network. The sites house a number of servers, network elements and local-area connections, and the wide-area network plays a critical, asymmetric role of providing vital connectivity between them. We model this infrastructure as a system of systems, wherein the sites and wide-area network are represented by their cyber and physical components. These components can be disabled by cyber and physical attacks, and also can be protected against them using component reinforcements. The effects of attacks propagate within the systems, and also beyond them via the wide-area network. We characterize these effects using correlations at two levels: (a) an aggregate failure correlation function that specifies the infrastructure failure probability given the failure of an individual site or network, and (b) first-order differential conditions on system survival probabilities that characterize the component-level correlations within individual systems. We formulate a game between an attacker and a provider using utility functions composed of survival probability and cost terms. At Nash Equilibrium, we derive expressions for the expected capacity of the infrastructure given by the number of operational servers connected to the network for sum-form, product-form and composite utility functions.

  20. Integrating computation into the undergraduate curriculum: A vision and guidelines for future developments

    NASA Astrophysics Data System (ADS)

    Chonacky, Norman; Winch, David

    2008-04-01

    There is substantial evidence of a need to make computation an integral part of the undergraduate physics curriculum. This need is consistent with data from surveys in both the academy and the workplace, and has been reinforced by two years of exploratory efforts by a group of physics faculty for whom computation is a special interest. We have examined past and current efforts at reform and a variety of strategic, organizational, and institutional issues involved in any attempt to broadly transform existing practice. We propose a set of guidelines for development based on this past work and discuss our vision of computationally integrated physics.

  1. On-line computer system for use with low-energy nuclear physics experiments is reported

    NASA Technical Reports Server (NTRS)

    Gemmell, D. S.

    1969-01-01

    Computer program handles data from low-energy nuclear physics experiments which utilize the ND-160 pulse-height analyzer and the PHYLIS computing system. The program allows experimenters to choose from about 50 different basic data-handling functions and to prescribe the order in which these functions will be performed.

  2. Lattice QCD Calculations in Nuclear Physics towards the Exascale

    NASA Astrophysics Data System (ADS)

    Joo, Balint

    2017-01-01

    The combination of algorithmic advances and new highly parallel computing architectures is enabling lattice QCD calculations to tackle ever more complex problems in nuclear physics. In this talk I will review some computational challenges that are encountered in large-scale cold nuclear physics campaigns, such as those in hadron spectroscopy calculations. I will discuss progress in addressing these with algorithmic improvements such as multi-grid solvers, and with software for recent hardware architectures such as GPUs and Intel Xeon Phi (Knights Landing). Finally, I will highlight some current topics for research and development as we head towards the Exascale era. This material is funded by the U.S. Department of Energy, Office of Science, Offices of Nuclear Physics, High Energy Physics and Advanced Scientific Computing Research, as well as the Office of Nuclear Physics under contract DE-AC05-06OR23177.

  3. Multicore Architecture-aware Scientific Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Srinivasa, Avinash

    Modern high performance systems are becoming increasingly complex and powerful due to advancements in processor and memory architecture. In order to keep up with this increasing complexity, applications have to be augmented with certain capabilities to fully exploit such systems. These may be at the application level, such as static or dynamic adaptations, or at the system level, like having strategies in place to override some of the default operating system policies, the main objective being to improve computational performance of the application. The current work proposes two such capabilities with respect to multi-threaded scientific applications, in particular a large-scale physics application computing ab-initio nuclear structure. The first involves using a middleware tool to invoke dynamic adaptations in the application, so as to be able to adjust to the changing computational resource availability at run-time. The second involves a strategy for effective placement of data in main memory, to optimize memory access latencies and bandwidth. These capabilities, when included, were found to have a significant impact on the application performance, resulting in average speedups of as much as two to four times.

  4. Cosmic Reionization On Computers: Numerical and Physical Convergence

    DOE PAGES

    Gnedin, Nickolay Y.

    2016-04-01

    In this paper I show that simulations of reionization performed under the Cosmic Reionization On Computers (CROC) project do converge in space and mass, albeit rather slowly. A fully converged solution (for a given star formation and feedback model) can be determined at a level of precision of about 20%, but such a solution is useless in practice, since achieving it in production-grade simulations would require a large set of runs at various mass and spatial resolutions, and computational resources for such an undertaking are not yet readily available. In order to make progress in the interim, I introduce a weak convergence correction factor in the star formation recipe, which allows one to approximate the fully converged solution with finite resolution simulations. The accuracy of weakly converged simulations approaches a comparable, ~20% level of precision for star formation histories of individual galactic halos and other galactic properties that are directly related to star formation rates, like stellar masses and metallicities. Yet other properties of model galaxies, for example, their HI masses, are recovered in the weakly converged runs only within a factor of two.

  6. Comparison of Muscle Mass Indices Using Computed Tomography or Dual X-Ray Absorptiometry for Predicting Physical Performance in Hemodialysis Patients.

    PubMed

    Kang, Seok Hui; Lee, Hyun Seok; Lee, Sukyung; Cho, Ji-Hyung; Kim, Jun Chul

    2017-01-01

    Our study aims to evaluate the association between thigh muscle cross-sectional area (TMA) using computed tomography (CT), or appendicular skeletal muscle mass (ASM) using dual energy X-ray absorptiometry (DEXA), and physical performance levels in hemodialysis (HD) patients. Patients were included if they were on HD for ≥6 months (n = 84). ASM and TMA were adjusted to body weight (BW, kg) or height squared (Ht², m²). Each participant performed a short physical performance battery test (SPPB), a sit-to-stand for 30 second test (STS30), a 6-minute walk test (6-MWT), a timed up and go test (TUG), and hand grip strength (HGS) test. Correlation coefficients for SPPB, GS, 5STS, STS30, 6-MWT, and TUG were highest in TMA/BW. Results from partial correlation or linear regression analyses displayed similar trends to those derived from Pearson's correlation analyses. An increase in TMA/BW or TMA/Ht² was associated with a decreased odds ratio of low SPPB, GS, or HGS in multivariate analyses. Indices using DEXA were associated with a decreased odds ratio of a low HGS only in multivariate analysis. TMA indices using CT may be more valuable in predicting physical performance or strength in HD patients. © 2017 The Author(s). Published by S. Karger AG, Basel.

  7. America COMPETES Act and the FY2010 Budget

    DTIC Science & Technology

    2009-06-29

    Outstanding Junior Investigator, Fusion Energy Sciences Plasma Physics Junior Faculty Development; Advanced Scientific Computing Research Early Career...the Fusion Energy Sciences Graduate Fellowships. If members of Congress agree with this contention, these America COMPETES Act programs were...

  8. Computational methodology to predict satellite system-level effects from impacts of untrackable space debris

    NASA Astrophysics Data System (ADS)

    Welty, N.; Rudolph, M.; Schäfer, F.; Apeldoorn, J.; Janovsky, R.

    2013-07-01

    This paper presents a computational methodology to predict the satellite system-level effects resulting from impacts of untrackable space debris particles. This approach seeks to improve on traditional risk assessment practices by looking beyond the structural penetration of the satellite and predicting the physical damage to internal components and the associated functional impairment caused by untrackable debris impacts. The proposed method combines a debris flux model with the Schäfer-Ryan-Lambert ballistic limit equation (BLE), which accounts for the inherent shielding of components positioned behind the spacecraft structure wall. Individual debris particle impact trajectories and component shadowing effects are considered and the failure probabilities of individual satellite components as a function of mission time are calculated. These results are correlated to expected functional impairment using a Boolean logic model of the system functional architecture considering the functional dependencies and redundancies within the system.
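
    The last step described above, mapping component failure probabilities to functional impairment through a Boolean model of the system architecture, can be sketched in a few lines. The component names, probabilities, and the independence assumption are illustrative, not taken from the paper.

```python
# Minimal sketch of propagating component failure probabilities through a
# Boolean model of a satellite's functional architecture. Names, numbers,
# and the independence assumption are hypothetical.

def p_and(*p_fail):
    """A function with series dependencies fails if ANY required
    component fails."""
    p_ok = 1.0
    for p in p_fail:
        p_ok *= (1.0 - p)
    return 1.0 - p_ok

def p_or(*p_fail):
    """A function with redundancy fails only if ALL redundant
    components fail."""
    p = 1.0
    for q in p_fail:
        p *= q
    return p

# Hypothetical example: attitude control needs the onboard computer AND
# at least one of two redundant reaction wheels.
p_obc, p_wheel = 0.02, 0.10
p_attitude_loss = p_and(p_obc, p_or(p_wheel, p_wheel))
print(p_attitude_loss)
```

    In the paper's method, the per-component probabilities would come from the debris flux model and the ballistic limit equation, evaluated as a function of mission time, rather than being fixed constants as here.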

  9. Advanced CO2 removal process control and monitor instrumentation development

    NASA Technical Reports Server (NTRS)

    Heppner, D. B.; Dalhausen, M. J.; Klimes, R.

    1982-01-01

    A program to evaluate, design and demonstrate major advances in control and monitor instrumentation was undertaken. A carbon dioxide removal process, one whose maturity level makes it a prime candidate for early flight demonstration, was investigated. The instrumentation design incorporates features which are compatible with anticipated flight requirements. Current electronics technology and projected advances are included. In addition, the program established commonality of components for all advanced life support subsystems. It was concluded from the studies and design activities conducted under this program that the next generation of instrumentation will be much smaller than the prior one. Physical size, as well as weight, power and heat rejection requirements, was reduced in the range of 80 to 85% from the former level of research and development instrumentation. Using a microprocessor-based computer, a standard computer bus structure, nonvolatile memory, improved fabrication techniques and aerospace packaging, this instrumentation will greatly enhance overall reliability and total system availability.

  10. Emulating weak localization using a solid-state quantum circuit.

    PubMed

    Chen, Yu; Roushan, P; Sank, D; Neill, C; Lucero, Erik; Mariantoni, Matteo; Barends, R; Chiaro, B; Kelly, J; Megrant, A; Mutus, J Y; O'Malley, P J J; Vainsencher, A; Wenner, J; White, T C; Yin, Yi; Cleland, A N; Martinis, John M

    2014-10-14

    Quantum interference is one of the most fundamental physical effects found in nature. Recent advances in quantum computing now employ interference as a fundamental resource for computation and control. Quantum interference also lies at the heart of sophisticated condensed matter phenomena such as Anderson localization, phenomena that are difficult to reproduce in numerical simulations. Here, employing a multiple-element superconducting quantum circuit, with which we manipulate a single microwave photon, we demonstrate that we can emulate the basic effects of weak localization. By engineering the control sequence, we are able to reproduce the well-known negative magnetoresistance of weak localization as well as its temperature dependence. Furthermore, we can use our circuit to continuously tune the level of disorder, a parameter that is not readily accessible in mesoscopic systems. Demonstrating a high level of control, our experiment shows the potential for employing superconducting quantum circuits as emulators for complex quantum phenomena.

  11. User-Defined Data Distributions in High-Level Programming Languages

    NASA Technical Reports Server (NTRS)

    Diaconescu, Roxana E.; Zima, Hans P.

    2006-01-01

    One of the characteristic features of today's high performance computing systems is a physically distributed memory. Efficient management of locality is essential for meeting key performance requirements for these architectures. The standard technique for dealing with this issue has involved the extension of traditional sequential programming languages with explicit message passing, in the context of a processor-centric view of parallel computation. This has resulted in complex and error-prone assembly-style codes in which algorithms and communication are inextricably interwoven. This paper presents a high-level approach to the design and implementation of data distributions. Our work is motivated by the need to improve the current parallel programming methodology by introducing a paradigm supporting the development of efficient and reusable parallel code. This approach is currently being implemented in the context of a new programming language called Chapel, which is designed in the HPCS project Cascade.
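
    As a language-neutral illustration of the data-distribution idea (this is not Chapel syntax), a user-defined distribution is essentially a mapping from a global array index to an owning locale plus a local index. The block distribution below is a standard textbook example of such a mapping.

```python
# Illustrative data distribution (not Chapel code): a distribution maps
# each global index of an array to (owner locale, local index). Shown
# here for a simple block distribution over p locales.

def block_map(i, n, p):
    """Map global index i of an n-element array, block-distributed over
    p locales, to (owner, local index). Assumes p divides n for brevity."""
    block = n // p
    return i // block, i % block

# 8-element array over 4 locales: indices 0-1 on locale 0, 2-3 on 1, ...
print([block_map(i, 8, 4) for i in range(8)])
```

    A cyclic or block-cyclic distribution would change only this mapping function; the point of the high-level approach is that the rest of the parallel code need not change with it.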

  12. Interfacing HTCondor-CE with OpenStack

    NASA Astrophysics Data System (ADS)

    Bockelman, B.; Caballero Bejar, J.; Hover, J.

    2017-10-01

    Over the past few years, Grid Computing technologies have reached a high level of maturity. One key aspect of this success has been the development and adoption of newer Compute Elements to interface the external Grid users with local batch systems. These new Compute Elements allow for better handling of jobs requirements and a more precise management of diverse local resources. However, despite this level of maturity, the Grid Computing world is lacking diversity in local execution platforms. As Grid Computing technologies have historically been driven by the needs of the High Energy Physics community, most resource providers run the platform (operating system version and architecture) that best suits the needs of their particular users. In parallel, the development of virtualization and cloud technologies has accelerated recently, making available a variety of solutions, both commercial and academic, proprietary and open source. Virtualization facilitates performing computational tasks on platforms not available at most computing sites. This work attempts to join the technologies, allowing users to interact with computing sites through one of the standard Computing Elements, HTCondor-CE, but running their jobs within VMs on a local cloud platform, OpenStack, when needed. The system will re-route, in a transparent way, end user jobs into dynamically-launched VM worker nodes when they have requirements that cannot be satisfied by the static local batch system nodes. Also, once the automated mechanisms are in place, it becomes straightforward to allow an end user to invoke a custom Virtual Machine at the site. This will allow cloud resources to be used without requiring the user to establish a separate account. Both scenarios are described in this work.

  13. The Dynamic Brain: From Spiking Neurons to Neural Masses and Cortical Fields

    PubMed Central

    Deco, Gustavo; Jirsa, Viktor K.; Robinson, Peter A.; Breakspear, Michael; Friston, Karl

    2008-01-01

    The cortex is a complex system, characterized by its dynamics and architecture, which underlie many functions such as action, perception, learning, language, and cognition. Its structural architecture has been studied for more than a hundred years; however, its dynamics have been addressed much less thoroughly. In this paper, we review and integrate, in a unifying framework, a variety of computational approaches that have been used to characterize the dynamics of the cortex, as evidenced at different levels of measurement. Computational models at different space–time scales help us understand the fundamental mechanisms that underpin neural processes and relate these processes to neuroscience data. Modeling at the single neuron level is necessary because this is the level at which information is exchanged between the computing elements of the brain; the neurons. Mesoscopic models tell us how neural elements interact to yield emergent behavior at the level of microcolumns and cortical columns. Macroscopic models can inform us about whole brain dynamics and interactions between large-scale neural systems such as cortical regions, the thalamus, and brain stem. Each level of description relates uniquely to neuroscience data, from single-unit recordings, through local field potentials to functional magnetic resonance imaging (fMRI), electroencephalogram (EEG), and magnetoencephalogram (MEG). Models of the cortex can establish which types of large-scale neuronal networks can perform computations and characterize their emergent properties. Mean-field and related formulations of dynamics also play an essential and complementary role as forward models that can be inverted given empirical data. This makes dynamic models critical in integrating theory and experiments. We argue that elaborating principled and informed models is a prerequisite for grounding empirical neuroscience in a cogent theoretical framework, commensurate with the achievements in the physical sciences. 
PMID:18769680

  14. Utilizing lung sounds analysis for the evaluation of acute asthma in small children.

    PubMed

    Tinkelman, D G; Lutz, C; Conner, B

    1991-09-01

    One of the most difficult aspects of management of acute asthma in the small child is the clinician's inability to quantitate the response or lack of response to bronchodilator agents, because of the inability of a child this age to perform objective lung measurements in the acute state. The present study was designed to evaluate bronchodilator responsiveness in children between 2 and 6 years of age with wheezing by means of a computerized lung sound analysis, computer digitized airway phonopneumography (CDAP). Children between ages 2 and 6 who were experiencing acute exacerbations of asthma were included in this study population. The 43 children were evaluated by physical examination, pulmonary function testing if possible (by use of spirometry or peak flow meter), and transmission of lung sounds to a computer using an electronic stethoscope to obtain a phonopneumograph with sound intensity level determinations during tidal breathing. A control group of 20 known asthmatic patients between the ages of 8 and 52 years who also presented to the office with acute asthma were evaluated similarly. In each of these individuals, a physical examination was followed by complete spirometry as well as computer digitized airway phonopneumography recordings. Following initial measurements, all patients were treated with nebulized albuterol (0.25 mL in 2 mL of saline). Five minutes after completion of the nebulization, all patients were reexamined and repeat pulmonary function tests were performed, followed by CDAP recordings. In the study group of children, the mean pretreatment sound intensity level was 1,694 (range 557 to 4,950, SD +/- 745).(ABSTRACT TRUNCATED AT 250 WORDS)

  15. Experimental analysis of computer system dependability

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERRARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
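
    The importance-sampling idea mentioned above can be sketched in a few lines: to estimate a rare failure probability, sample from a shifted distribution under which the event is common, and reweight each sample by the likelihood ratio. The Gaussian target, proposal, and threshold below are illustrative choices, not tied to any specific dependability model from the paper.

```python
import math
import random

# Importance-sampling sketch: estimate the rare probability P(X > 4) for
# X ~ N(0, 1) by sampling from the shifted proposal N(4, 1) and weighting
# each sample by the likelihood ratio N(0,1)/N(4,1) = exp(t^2/2 - t*x)
# with t = 4. Direct Monte Carlo would need ~30,000x more samples to see
# this event at all.

def rare_prob_is(threshold=4.0, n=200_000, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(threshold, 1.0)  # sample from the proposal N(t, 1)
        if x > threshold:
            # likelihood ratio evaluated at x
            total += math.exp(threshold * threshold / 2 - threshold * x)
    return total / n

exact = 0.5 * math.erfc(4.0 / math.sqrt(2.0))  # true tail probability
print(rare_prob_is(), exact)
```

    With 200,000 weighted samples the estimate lands within a few percent of the true tail probability (about 3.2e-5), which is the acceleration the survey attributes to the technique.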

  16. Effective scheme for partitioning covalent bonds in density-functional embedding theory: From molecules to extended covalent systems.

    PubMed

    Huang, Chen; Muñoz-García, Ana Belén; Pavone, Michele

    2016-12-28

    Density-functional embedding theory provides a general way to perform multi-physics quantum mechanics simulations of large-scale materials by dividing the total system's electron density into a cluster's density and its environment's density. It is then possible to compute the accurate local electronic structures and energetics of the embedded cluster with high-level methods, meanwhile retaining a low-level description of the environment. The prerequisite step in the density-functional embedding theory is the cluster definition. In covalent systems, cutting across the covalent bonds that connect the cluster and its environment leads to dangling bonds (unpaired electrons). These represent a major obstacle for the application of density-functional embedding theory to study extended covalent systems. In this work, we developed a simple scheme to define the cluster in covalent systems. Instead of cutting covalent bonds, we directly split the boundary atoms for maintaining the valency of the cluster. With this new covalent embedding scheme, we compute the dehydrogenation energies of several different molecules, as well as the binding energy of a cobalt atom on graphene. Well localized cluster densities are observed, which can facilitate the use of localized basis sets in high-level calculations. The results are found to converge faster with the embedding method than the other multi-physics approach ONIOM. This work paves the way to perform the density-functional embedding simulations of heterogeneous systems in which different types of chemical bonds are present.

  17. Software Reviews.

    ERIC Educational Resources Information Center

    Science and Children, 1990

    1990-01-01

    Reviewed are seven computer software packages for IBM and/or Apple Computers. Included are "Windows on Science: Volume 1--Physical Science"; "Science Probe--Physical Science"; "Wildlife Adventures--Grizzly Bears"; "Science Skills--Development Programs"; "The Clean Machine"; "Rock Doctor";…

  18. Comparing a Video and Text Version of a Web-Based Computer-Tailored Intervention for Obesity Prevention: A Randomized Controlled Trial.

    PubMed

    Walthouwer, Michel Jean Louis; Oenema, Anke; Lechner, Lilian; de Vries, Hein

    2015-10-19

    Web-based computer-tailored interventions often suffer from small effect sizes and high drop-out rates, particularly among people with a low level of education. Using videos as a delivery format can possibly improve the effects and attractiveness of these interventions. The main aim of this study was to examine the effects of a video and text version of a Web-based computer-tailored obesity prevention intervention on dietary intake, physical activity, and body mass index (BMI) among Dutch adults. A second study aim was to examine differences in appreciation between the video and text version. The final study aim was to examine possible differences in intervention effects and appreciation per educational level. A three-armed randomized controlled trial was conducted with a baseline and 6 months follow-up measurement. The intervention consisted of six sessions, lasting about 15 minutes each. In the video version, the core tailored information was provided by means of videos. In the text version, the same tailored information was provided in text format. Outcome variables were self-reported and included BMI, physical activity, energy intake, and appreciation of the intervention. Multiple imputation was used to replace missing values. The effect analyses were carried out with multiple linear regression analyses and adjusted for confounders. The process evaluation data were analyzed with independent samples t tests. The baseline questionnaire was completed by 1419 participants and the 6 months follow-up measurement by 1015 participants (71.53%). No significant interaction effects of educational level were found on any of the outcome variables. Compared to the control condition, the video version resulted in lower BMI (B=-0.25, P=.049) and lower average daily energy intake from energy-dense food products (B=-175.58, P<.001), while the text version had an effect only on energy intake (B=-163.05, P=.001). No effects on physical activity were found.
Moreover, the video version was rated significantly better than the text version on feelings of relatedness (P=.041), usefulness (P=.047), and grade given to the intervention (P=.018). The video version of the Web-based computer-tailored obesity prevention intervention was the most effective intervention and most appreciated. Future research needs to examine if the effects are maintained in the long term and how the intervention can be optimized. Netherlands Trial Register: NTR3501; http://www.trialregister.nl/trialreg/admin/rctview.asp?TC=3501 (Archived by WebCite at http://www.webcitation.org/6cBKIMaW1).

  19. A first-principles study of carbon-related energy levels in GaN. I. Complexes formed by substitutional/interstitial carbons and gallium/nitrogen vacancies

    NASA Astrophysics Data System (ADS)

    Matsubara, Masahiko; Bellotti, Enrico

    2017-05-01

    Various forms of carbon based complexes in GaN are studied with first-principles calculations employing Heyd-Scuseria-Ernzerhof hybrid functionals within the framework of the density functional theory. We consider carbon complexes made of the combinations of single impurities, i.e., CN-CGa, CI-CN, and CI-CGa, where CN, CGa, and CI denote C substituting nitrogen, C substituting gallium, and interstitial C, respectively, and of neighboring gallium/nitrogen vacancies (VGa/VN), i.e., CN-VGa and CGa-VN. Formation energies are computed for all these configurations with different charge states after full geometry optimizations. From our calculated formation energies, thermodynamic transition levels are evaluated, which are related to the thermal activation energies observed in experimental techniques such as deep level transient spectroscopy. Furthermore, the lattice relaxation energies (Franck-Condon shift) are computed to obtain optical activation energies, which are observed in experimental techniques such as deep level optical spectroscopy. We compare our calculated values of activation energies with the energies of experimentally observed C-related trap levels and identify the physical origins of these traps, which were unknown before.
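
    The thermodynamic transition level extracted from such formation energies is simply the Fermi-level position at which two charge states q and q′ of a defect have equal formation energy, ε(q/q′) = [Ef(q; EF=0) − Ef(q′; EF=0)] / (q′ − q). A minimal sketch, with hypothetical formation energies rather than the paper's computed values:

```python
# Thermodynamic transition level epsilon(q/q'): the Fermi-level position
# at which charge states q and q' of a defect have equal formation energy,
#   epsilon(q/q') = (Ef(q; EF=0) - Ef(q'; EF=0)) / (q' - q).
# The formation energies below are hypothetical placeholders.

def transition_level(ef_q, q, ef_qp, qp):
    """Return epsilon(q/q') in eV, measured from the valence-band maximum,
    given formation energies evaluated at Fermi level EF = 0."""
    return (ef_q - ef_qp) / (qp - q)

# Hypothetical acceptor-like defect: neutral state Ef = 4.0 eV and (-1)
# state Ef = 4.9 eV at EF = 0 give a (0/-) level 0.9 eV above the VBM.
print(transition_level(4.0, 0, 4.9, -1))
```

    Levels computed this way correspond to the thermal activation energies seen in deep level transient spectroscopy; subtracting the Franck-Condon shift relates them to the optical activation energies instead.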

  20. Physical activity pattern of prepubescent Filipino school children during school days.

    PubMed

    Gonzalez-Suarez, Consuelo B; Grimmer-Somers, Karen

    2009-07-01

    Little is known about pre-pubescent Filipino children's involvement in moderate-to-vigorous physical activity (MVPA). There are international guidelines regarding required levels of MVPA for healthy children. This study describes participation in MVPA and sports during a school day of 11- to 12-year-olds in randomly selected public and private schools in San Juan, Metro Manila. The Filipino-modified Physical Activity Questionnaire for Older Children (F_PAQ_C) was administered in English and Filipino. Additional data was collected on sex, age, type of school, and amount of time spent using television and computers. Children's self-assessment of physical activities (1 question in the F_PAQ_C) was correlated with their cumulative F_PAQ_C score. Three hundred eighty subjects (167 boys, 213 girls) participated. Participation in MVPA varied between sex and age groups, from 56.1% to 65.0%. Fewer than 10% of participants were very active. The children were more active during physical education classes than at recess or lunch, after class, or in the evening. Walking for exercise, jumping, jogging and running, free play, and dance were most common. Boys, younger children, and private school students most commonly engaged in MVPA. Self-assessed physical activity had modest correlation (r² = 0.21) with cumulative F_PAQ_C score, after adjusting for sex, age, and school type. Most children were not physically active during the school day, except in physical education classes. To reduce the gap between recommended and current activity levels, more opportunities should be provided for preteen Filipino children to engage in MVPA during and after school.
