Physical Education Curriculum Analysis Tool (PECAT)
ERIC Educational Resources Information Center
Lee, Sarah M.; Wechsler, Howell
2006-01-01
The Physical Education Curriculum Analysis Tool (PECAT) will help school districts conduct a clear, complete, and consistent analysis of written physical education curricula, based upon national physical education standards. The PECAT is customizable to include local standards. The results from the analysis can help school districts enhance…
2011-03-01
Humansystems® Warfighter Integrated Physical Ergonomics Tool Development. e) Forces: Griffon seat design assessments include questions of vibration... the suitability of alternative designs. e) Performance Measures... configurations to assess... design and acquisition decisions, and more
NASA Astrophysics Data System (ADS)
Kudryashov, E. A.; Smirnov, I. M.; Grishin, D. V.; Khizhnyak, N. A.
2018-06-01
The work is aimed at selecting a promising grade of tool material whose physical-mechanical characteristics would allow it to be used for machining the surfaces of discontinuous parts under shock loads. An analysis of the physical-mechanical characteristics of the most common tool materials is performed, and data on the possible use of promising composite grades in metal-working processes are presented.
HEPDOOP: High-Energy Physics Analysis using Hadoop
NASA Astrophysics Data System (ADS)
Bhimji, W.; Bristow, T.; Washbrook, A.
2014-06-01
We perform an LHC data analysis workflow using tools and data formats that are commonly used in the "Big Data" community outside High Energy Physics (HEP). These include Apache Avro for serialisation to binary files, Pig and Hadoop for mass data processing, and Python Scikit-Learn for multivariate analysis. A comparison is made with the same analysis performed with current HEP tools in ROOT.
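The multivariate step of such a workflow can be sketched without any of the Big Data stack; below is a minimal illustration using a Fisher linear discriminant in plain NumPy (the two event features and their distributions are invented, and in the paper's workflow Scikit-Learn classifiers take this role):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical event samples: two discriminating variables per event
signal = rng.normal([1.0, 1.0], 0.5, size=(500, 2))
background = rng.normal([0.0, 0.0], 0.5, size=(500, 2))

def fisher_direction(a, b):
    """Fisher linear discriminant: w = Sw^-1 (mu_a - mu_b)."""
    sw = np.cov(a.T) + np.cov(b.T)  # within-class scatter
    return np.linalg.solve(sw, a.mean(axis=0) - b.mean(axis=0))

w = fisher_direction(signal, background)
# Project events onto w; a cut on this score separates the classes
score_s = signal @ w
score_b = background @ w
cut = 0.5 * (score_s.mean() + score_b.mean())
efficiency = (score_s > cut).mean()   # signal kept above the cut
rejection = (score_b < cut).mean()    # background removed below it
```

With well-separated toy samples like these, both efficiency and rejection come out above 90%; a real analysis would of course tune the classifier and cut on data-driven control samples.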
A Vision on the Status and Evolution of HEP Physics Software Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Canal, P.; Elvira, D.; Hatcher, R.
2013-07-28
This paper represents the vision of the members of the Fermilab Scientific Computing Division's Computational Physics Department (SCD-CPD) on the status and the evolution of various HEP software tools such as the Geant4 detector simulation toolkit, the Pythia and GENIE physics generators, and the ROOT data analysis framework. The goal of this paper is to contribute ideas to the Snowmass 2013 process toward the composition of a unified document on the current status and potential evolution of the physics software tools which are essential to HEP.
Using Tracker as a Pedagogical Tool for Understanding Projectile Motion
ERIC Educational Resources Information Center
Wee, Loo Kang; Chew, Charles; Goh, Giam Hwee; Tan, Samuel; Lee, Tat Leong
2012-01-01
This article reports on the use of Tracker as a pedagogical tool in the effective learning and teaching of projectile motion in physics. When a computer model building learning process is supported and driven by video analysis data, this free Open Source Physics tool can provide opportunities for students to engage in active enquiry-based…
Slow Speed--Fast Motion: Time-Lapse Recordings in Physics Education
ERIC Educational Resources Information Center
Vollmer, Michael; Möllmann, Klaus-Peter
2018-01-01
Video analysis with a 30 Hz frame rate is the standard tool in physics education. The development of affordable high-speed-cameras has extended the capabilities of the tool for much smaller time scales to the 1 ms range, using frame rates of typically up to 1000 frames s⁻¹, allowing us to study transient physics phenomena happening…
NASA Astrophysics Data System (ADS)
Génot, V.; André, N.; Cecconi, B.; Bouchemit, M.; Budnik, E.; Bourrel, N.; Gangloff, M.; Dufourg, N.; Hess, S.; Modolo, R.; Renard, B.; Lormant, N.; Beigbeder, L.; Popescu, D.; Toniutti, J.-P.
2014-11-01
The interest in data communication between analysis tools in planetary sciences and space physics is illustrated in this paper via several examples of the use of SAMP. The Simple Application Messaging Protocol, developed in the frame of the IVOA from an earlier protocol called PLASTIC, enables easy communication and interoperability between astronomy software, both stand-alone and web-based; it is now increasingly adopted by the planetary sciences and space physics community. Its attractiveness rests, on the one hand, on the use of common file formats for exchange and, on the other hand, on established messaging models. Examples of use at the CDPP and elsewhere are presented. The CDPP (Centre de Données de la Physique des Plasmas, http://cdpp.eu/), the French data center for plasma physics, has been engaged for more than a decade in the archiving and dissemination of data products from space missions and ground observatories. Besides these activities, the CDPP has developed services like AMDA (Automated Multi Dataset Analysis, http://amda.cdpp.eu/), which enables in-depth analysis of large amounts of data through dedicated functionalities such as visualization, conditional search, and cataloging. Besides AMDA, the 3DView (http://3dview.cdpp.eu/) tool provides immersive visualizations and is being further developed to include simulation and observational data. These tools and their interactions with each other, notably via SAMP, are presented via science cases of interest to the planetary sciences and space physics communities.
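The messaging model behind SAMP can be illustrated with a toy dispatcher. The mtype `table.load.votable` is a real SAMP message type, but the URL, table name, and dispatch code below are invented for illustration; a real client would use a library such as astropy.samp and a running hub:

```python
# A SAMP message pairs an "mtype" (what to do) with parameters.
# Sketch of the message an application would broadcast to ask peers
# to load a VOTable (the URL and ids are hypothetical).
message = {
    "samp.mtype": "table.load.votable",
    "samp.params": {
        "url": "http://example.org/amda_output.xml",  # hypothetical
        "table-id": "amda-1",
        "name": "AMDA time series",
    },
}

# A receiving tool dispatches on the mtype it has subscribed to:
handlers = {}

def register(mtype):
    def deco(fn):
        handlers[mtype] = fn
        return fn
    return deco

@register("table.load.votable")
def load_votable(params):
    # A real tool would fetch and parse the VOTable here
    return f"loading {params['name']} from {params['url']}"

result = handlers[message["samp.mtype"]](message["samp.params"])
```

This mtype-plus-params structure is what lets AMDA, 3DView, and astronomy tools interoperate without knowing each other's internals.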
Some applications of mathematics in theoretical physics - A review
NASA Astrophysics Data System (ADS)
Bora, Kalpana
2016-06-01
Mathematics is a very beautiful subject and very much an indispensable tool for Physics, more so for Theoretical Physics (by which we mean here mainly Field Theory and High Energy Physics). These branches of Physics are based on Quantum Mechanics and the Special Theory of Relativity, and many mathematical concepts are used in them. In this work, we shall elucidate only some of them, such as differential geometry, infinite series, Mellin transforms, Fourier and integral transforms, special functions, calculus, complex algebra, topology, group theory, Riemannian geometry, functional analysis, linear algebra, and operator algebra. We shall also present some physics issues where these mathematical tools are used. It is not wrong to say that Mathematics is such a powerful tool that without it there could be no theory of Physics. A brief review of our research work is also presented.
Analyzing Virtual Physics Simulations with Tracker
ERIC Educational Resources Information Center
Claessens, Tom
2017-01-01
In the physics teaching community, Tracker is well known as user-friendly open-source video analysis software, authored by Douglas Brown. With this tool, the user can trace markers indicated on a video or on stroboscopic photos and perform kinematic analyses. Tracker also includes a data modeling tool that allows one to fit some theoretical…
Dimensional Analysis in Physics and the Buckingham Theorem
ERIC Educational Resources Information Center
Misic, Tatjana; Najdanovic-Lukic, Marina; Nesic, Ljubisa
2010-01-01
Dimensional analysis is a simple, clear and intuitive method for determining the functional dependence of physical quantities that are of importance to a certain process. However, in physics textbooks, very little space is usually given to this approach and it is often presented only as a diagnostic tool used to determine the validity of…
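The method the abstract describes reduces to linear algebra on exponents: each candidate variable contributes a column of dimension exponents, and the Buckingham procedure solves for the powers that reproduce the target dimension. A sketch for the classic pendulum-period example, assuming the period depends only on mass m, length l, and gravity g:

```python
import numpy as np

# Seek t = C * m^a * l^b * g^c with t of dimension T^1.
# Rows: exponents of M, L, T; columns: variables m, l, g.
dims = np.array([
    [1, 0,  0],   # M: m is M^1, l and g carry no mass
    [0, 1,  1],   # L: l is L^1, g is L^1 T^-2
    [0, 0, -2],   # T: only g carries time, as T^-2
])
target = np.array([0, 0, 1])  # the period has dimension T^1

# Solve dims @ [a, b, c] = target for the exponents
a, b, c = np.linalg.solve(dims, target)
# a = 0, b = 1/2, c = -1/2, i.e. t is proportional to sqrt(l / g):
# the mass drops out, exactly as dimensional analysis predicts.
```

The same matrix formulation generalizes: with more variables than independent dimensions, the null space of `dims` yields the dimensionless pi groups of the Buckingham theorem.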
Martins, Júlia Caetano; Aguiar, Larissa Tavares; Nadeau, Sylvie; Scianni, Aline Alvim; Teixeira-Salmela, Luci Fuscaldi; Faria, Christina Danielli Coelho de Morais
2017-01-01
Introduction Self-report physical activity assessment tools are commonly used for the evaluation of physical activity levels in individuals with stroke. A great variety of these tools have been developed and widely used in recent years, which justifies the need to examine their measurement properties and clinical utility. Therefore, the main objectives of this systematic review are to examine the measurement properties and clinical utility of self-report measures of physical activity and to discuss the strengths and limitations of the identified tools. Methods and analysis A systematic review of studies that investigated the measurement properties and/or clinical utility of self-report physical activity assessment tools in stroke will be conducted. Electronic searches will be performed in five databases: Medical Literature Analysis and Retrieval System Online (MEDLINE) (PubMed), Excerpta Medica Database (EMBASE), Physiotherapy Evidence Database (PEDro), Literatura Latino-Americana e do Caribe em Ciências da Saúde (LILACS), and Scientific Electronic Library Online (SciELO), followed by hand searches of the reference lists of the included studies. Two independent reviewers will screen all retrieved titles, abstracts, and full texts according to the inclusion criteria and will also extract the data. A third reviewer will be consulted to resolve any disagreements. A descriptive summary of the included studies will contain the design and participants, as well as the characteristics, measurement properties, and clinical utility of the self-report tools. The methodological quality of the studies will be evaluated using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist, and the clinical utility of the identified tools will be assessed considering predefined criteria. This systematic review will follow the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement.
Discussion This systematic review will provide an extensive review of the measurement properties and clinical utility of self-report physical activity assessment tools used in individuals with stroke, which would benefit clinicians and researchers. Trial registration number PROSPERO CRD42016037146. PMID:28193848
NASA Astrophysics Data System (ADS)
Keika, Kunihiro; Miyoshi, Yoshizumi; Machida, Shinobu; Ieda, Akimasa; Seki, Kanako; Hori, Tomoaki; Miyashita, Yukinaga; Shoji, Masafumi; Shinohara, Iku; Angelopoulos, Vassilis; Lewis, Jim W.; Flores, Aaron
2017-12-01
This paper introduces ISEE_3D, an interactive visualization tool for three-dimensional plasma velocity distribution functions, developed by the Institute for Space-Earth Environmental Research, Nagoya University, Japan. The tool provides a variety of methods to visualize the distribution function of space plasma: scatter, volume, and isosurface modes. The tool also has a wide range of functions, such as displaying magnetic field vectors and two-dimensional slices of distributions, to facilitate extensive analysis. Coordinate transformation to magnetic field coordinates is also implemented. The source code of the tool is written as scripts in Interactive Data Language (IDL), a data analysis software language widely used in the fields of space physics and solar physics. The current version of the tool can be used with data files of the plasma distribution function from the Geotail satellite mission, which are publicly accessible through the Data Archives and Transmission System of the Institute of Space and Astronautical Science (ISAS)/Japan Aerospace Exploration Agency (JAXA). The tool is also available in the Space Physics Environment Data Analysis Software to visualize plasma data from the Magnetospheric Multiscale and the Time History of Events and Macroscale Interactions during Substorms missions. The tool is planned to be applied to data from other missions, such as Arase (ERG) and the Van Allen Probes, after replacing or adding data-loading plug-ins. This visualization tool helps scientists better understand the dynamics of space plasma, particularly in regions where the magnetohydrodynamic approximation is not valid, for example, the Earth's inner magnetosphere, magnetopause, bow shock, and plasma sheet.
The technical analysis of the stock exchange and physics: Japanese candlesticks for solar activity
NASA Astrophysics Data System (ADS)
Dineva, C.; Atanasov, V.
2013-09-01
In this article, we take the Japanese candlesticks, a method popular in the technical analysis of the stock/Forex markets, and apply it to a variable in physics: the solar activity. This method was invented for, and has been used exclusively in, economic analysis, and its application to a physical problem produced unexpected results. We found that Japanese candlesticks are a convenient tool for analyzing variables in the physics of the Sun. Based on our observations, we identified a new cycle in the solar activity.
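The candlestick construction itself is simple to sketch: each period is summarized by its opening, highest, lowest, and closing values. The toy example below aggregates an invented daily activity series into monthly candles; a real analysis would use observed sunspot data, e.g. from SILSO:

```python
import math

# Synthetic daily "activity index" (invented values, ~4 months of 30 days)
daily = [round(60 + 25 * math.sin(i / 20) + 10 * math.sin(i / 3), 1)
         for i in range(120)]

def candles(series, period=30):
    """One candlestick (open, high, low, close) per period of the series."""
    out = []
    for start in range(0, len(series) - period + 1, period):
        window = series[start:start + period]
        out.append({
            "open": window[0],
            "high": max(window),
            "low": min(window),
            "close": window[-1],
            # a rising period gives a "bullish" body, a falling one "bearish"
            "body": "bullish" if window[-1] >= window[0] else "bearish",
        })
    return out

monthly = candles(daily)
```

Patterns in the sequence of bullish/bearish bodies and wick lengths are what candlestick charting reads; applied to solar indices, the same summaries compress a cycle into a handful of shapes per rotation.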
NASA Astrophysics Data System (ADS)
Manurung, Sondang; Demonta Pangabean, Deo
2017-05-01
The main purpose of this study is to produce a needs analysis, a literature review, and learning tools for the development of interactive multimedia-based physics learning oriented toward problem solving, in order to improve the thinking ability of prospective physics students. The first-year result of the study is a draft based on a needs analysis of the facts on the ground, the conditions of existing learning, and literature studies, followed by the design of devices and instruments as well as the development of the media. The result of the second year is a problem-solving-oriented, interactive multimedia-based physics learning device in the form of textbooks and scientific publications. The learning models were first tested on a limited sample, then evaluated and revised. In addition, the product of the research has economic value on the grounds that: (1) the virtual laboratory offered by this research provides an alternative to purchasing expensive physics laboratory equipment; (2) it addresses the shortage of physics teachers in remote areas, as the learning tool can be accessed offline and online; and (3) it reduces consumable materials, as tutorials can be done online. The first-year target of the research is a storyboard of physics learning, stored on CD (compact disc) and in web form, and interactive multimedia on the concept of the Kinetic Theory of Gases. This draft is based on a needs analysis of the facts on the ground, the existing learning conditions, and literature studies.
Analysis of pre-service physics teacher skills designing simple physics experiments based technology
NASA Astrophysics Data System (ADS)
Susilawati; Huda, C.; Kurniawan, W.; Masturi; Khoiri, N.
2018-03-01
Pre-service physics teachers' skill in designing simple experiment sets is very important for deepening students' conceptual understanding and for practicing scientific skills in the laboratory. This study describes the skills of physics students in designing technology-based simple experiments. The experimental design stages include simple tool design and sensor modification. The research method used is a descriptive method with a sample of 25 students and 5 variations of simple physics experimental design. Based on the results of interviews and observations, the pre-service physics teachers' skill in designing technology-based simple physics experiments is good, while their sensor modification and application skills are still lacking. This suggests that pre-service physics teachers still need a lot of practice in designing physics experiments using sensor modifications. Based on the interview results, it was found that students are sufficiently motivated to perform laboratory activities actively and have high curiosity to become skilled at making simple practicum tools for physics experiments.
New Tools For Understanding Microbial Diversity Using High-throughput Sequence Data
NASA Astrophysics Data System (ADS)
Knight, R.; Hamady, M.; Liu, Z.; Lozupone, C.
2007-12-01
High-throughput sequencing techniques such as 454 are straining the limits of tools traditionally used to build trees, choose OTUs, and perform other essential sequencing tasks. We have developed a workflow for phylogenetic analysis of large-scale sequence data sets that combines existing tools, such as the Arb phylogeny package and the NAST multiple sequence alignment tool, with new methods for choosing and clustering OTUs and for performing phylogenetic community analysis with UniFrac. This talk discusses the cyberinfrastructure we are developing to support the human microbiome project, and the application of these workflows to analyze very large data sets that contrast the gut microbiota with a range of physical environments. These tools will ultimately help to define core and peripheral microbiomes in a range of environments, and will allow us to understand the physical and biotic factors that contribute most to differences in microbial diversity.
NASA Astrophysics Data System (ADS)
Riah, Zoheir; Sommet, Raphael; Nallatamby, Jean C.; Prigent, Michel; Obregon, Juan
2004-05-01
We present in this paper a set of coherent tools for noise characterization and physics-based analysis of noise in semiconductor devices. This noise toolbox relies on a low frequency noise measurement setup with special high current capabilities thanks to an accurate and original calibration. It relies also on a simulation tool based on the drift diffusion equations and the linear perturbation theory, associated with the Green's function technique. This physics-based noise simulator has been implemented successfully in the Scilab environment and is specifically dedicated to HBTs. Some results are given and compared to those existing in the literature.
A survey of social media data analysis for physical activity surveillance.
Liu, Sam; Young, Sean D
2018-07-01
Social media data can provide valuable information regarding people's behaviors and health outcomes. Previous studies have shown that social media data can be extracted to monitor and predict infectious disease outbreaks. These same approaches can be applied to other fields including physical activity research and forensic science. Social media data have the potential to provide real-time monitoring and prediction of physical activity level in a given region. This tool can be valuable to public health organizations as it can overcome the time lag in the reporting of physical activity epidemiology data faced by traditional research methods (e.g. surveys, observational studies). As a result, this tool could help public health organizations better mobilize and target physical activity interventions. The first part of this paper aims to describe current approaches (e.g. topic modeling, sentiment analysis and social network analysis) that could be used to analyze social media data to provide real-time monitoring of physical activity level. The second aim of this paper was to discuss ways to apply social media analysis to other fields such as forensic sciences and provide recommendations to further social media research. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
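Of the approaches the survey lists, sentiment analysis is the easiest to sketch in a toy form. The word lists and posts below are invented; real studies use tools such as VADER or trained classifiers over large labeled corpora:

```python
# Minimal lexicon-based sentiment scorer (toy lexicon, hypothetical posts)
POSITIVE = {"love", "great", "enjoyed", "fun"}
NEGATIVE = {"tired", "hate", "skipped", "boring"}

def sentiment(post):
    """Score a post by counting lexicon hits; sign gives the label."""
    words = post.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

posts = [
    "I love my morning run great weather today",  # hypothetical
    "skipped the gym again so tired",
]
labels = [sentiment(p) for p in posts]
```

Aggregating such labels over geotagged posts per region and per day is the kind of signal that could feed the real-time physical-activity surveillance the paper envisions.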
Development of Advanced Light-Duty Powertrain and Hybrid Analysis Tool (SAE 2013-01-0808)
The Advanced Light-Duty Powertrain and Hybrid Analysis tool was created by the Environmental Protection Agency to evaluate greenhouse gas emissions and fuel efficiency of light-duty vehicles. It is a physics-based, forward-looking, full-vehicle computer simulator, which is cap...
Validating a Lifestyle Physical Activity Measure for People with Serious Mental Illness
ERIC Educational Resources Information Center
Bezyak, Jill L.; Chan, Fong; Chiu, Chung-Yi; Kaya, Cahit; Huck, Garrett
2014-01-01
Purpose: To evaluate the measurement structure of the "Physical Activity Scale for Individuals With Physical Disabilities" (PASIPD) as an assessment tool of lifestyle physical activities for people with severe mental illness. Method: A quantitative descriptive research design using factor analysis was employed. A sample of 72 individuals…
A National Solar Digital Observatory
NASA Astrophysics Data System (ADS)
Hill, F.
2000-05-01
The continuing development of the Internet as a research tool, combined with an improving funding climate, has sparked new interest in the development of Internet-linked astronomical data bases and analysis tools. Here I outline a concept for a National Solar Digital Observatory (NSDO), a set of data archives and analysis tools distributed in physical location at sites which already host such systems. A central web site would be implemented from which a user could search all of the component archives, select and download data, and perform analyses. Example components include NSO's Digital Library containing its synoptic and GONG data, and the forthcoming SOLIS archive. Several other archives, in various stages of development, also exist. Potential analysis tools include content-based searches, visualized programming tools, and graphics routines. The existence of an NSDO would greatly facilitate solar physics research, as a user would no longer need to have detailed knowledge of all solar archive sites. It would also improve public outreach efforts. The National Solar Observatory is operated by AURA, Inc. under a cooperative agreement with the National Science Foundation.
ERIC Educational Resources Information Center
WITMER, DAVID R.
Wisconsin state universities have been using the computer as a management tool to study physical facilities inventories, space utilization, and enrollment and plant projections. Examples are shown graphically and described for different types of analysis, showing the card format, coding systems, and printout. Equations are provided for determining…
ERIC Educational Resources Information Center
Marie, S. Maria Josephine Arokia; Edannur, Sreekala
2015-01-01
This paper focused on the analysis of test items constructed for the Teaching Physical Science paper of the B.Ed. class. It involved the analysis of the difficulty level and discrimination power of each test item. Item analysis allows selecting or omitting items from the test, but more importantly, item analysis is a tool to help the item writer improve…
Inspection of the Math Model Tools for On-Orbit Assessment of Impact Damage Report
NASA Technical Reports Server (NTRS)
Harris, Charles E.; Raju, Ivatury S.; Piascik, Robert S.
2007-01-01
In the spring of 2005, the NASA Engineering and Safety Center (NESC) was engaged by the Space Shuttle Program (SSP) to peer review the suite of analytical tools being developed to support the determination of impact and damage tolerance of the Orbiter Thermal Protection Systems (TPS). The NESC formed an independent review team with the core disciplines of materials, flight sciences, structures, mechanical analysis, and thermal analysis. The Math Model Tools reviewed included damage prediction and stress analysis, aeroheating analysis, and thermal analysis tools. Some tools are physics-based and others are empirically derived. Each tool was created for a specific use and time frame, including certification and real-time pre-launch assessments. In addition, the tools are used together in an integrated strategy for assessing the ramifications of impact damage to tile and RCC. The NESC teams conducted a peer review of the engineering data package for each Math Model Tool. This report contains the summary of the team observations and recommendations from these reviews.
Rasch Model Based Analysis of the Force Concept Inventory
ERIC Educational Resources Information Center
Planinic, Maja; Ivanjek, Lana; Susac, Ana
2010-01-01
The Force Concept Inventory (FCI) is an important diagnostic instrument which is widely used in the field of physics education research. It is therefore very important to evaluate and monitor its functioning using different tools for statistical analysis. One of such tools is the stochastic Rasch model, which enables construction of linear…
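The dichotomous Rasch model underlying such analyses can be sketched in a few lines: the probability of a correct answer depends only on the difference between person ability and item difficulty, both measured in logits:

```python
import math

def rasch_p(theta, b):
    """Dichotomous Rasch model: probability that a person of ability
    theta answers an item of difficulty b correctly (both in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# The item characteristic curve is anchored at p = 0.5 where ability
# equals difficulty; items whose observed curves deviate from this
# shape are flagged as misfitting in a Rasch analysis.
probs = [rasch_p(theta, b=0.0) for theta in (-2, -1, 0, 1, 2)]
```

Fitting the model to FCI response matrices estimates one theta per student and one b per item on a common linear scale, which is what makes the instrument's functioning checkable item by item.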
The Environment-Power System Analysis Tool development program. [for spacecraft power supplies
NASA Technical Reports Server (NTRS)
Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Wilcox, Katherine G.; Stevens, N. John; Putnam, Rand M.; Roche, James C.
1989-01-01
The Environment Power System Analysis Tool (EPSAT) is being developed to provide engineers with the ability to assess the effects of a broad range of environmental interactions on space power systems. A unique user-interface-data-dictionary code architecture oversees a collection of existing and future environmental modeling codes (e.g., neutral density) and physical interaction models (e.g., sheath ionization). The user-interface presents the engineer with tables, graphs, and plots which, under supervision of the data dictionary, are automatically updated in response to parameter change. EPSAT thus provides the engineer with a comprehensive and responsive environmental assessment tool and the scientist with a framework into which new environmental or physical models can be easily incorporated.
Diagnosing alternative conceptions of Fermi energy among undergraduate students
NASA Astrophysics Data System (ADS)
Sharma, Sapna; Ahluwalia, Pardeep Kumar
2012-07-01
Physics education researchers have scientifically established the fact that the understanding of new concepts and interpretation of incoming information are strongly influenced by the preexisting knowledge and beliefs of students, called epistemological beliefs. This can lead to a gap between what students actually learn and what the teacher expects them to learn. In a classroom, as a teacher, it is desirable that one tries to bridge this gap, at least on the key concepts of the particular field being taught. One such key concept, which crops up in statistical physics/solid-state physics courses and around which the behaviour of materials is described, is the Fermi energy (εF). In this paper, we present the results about misconceptions on Fermi energy which emerged in the process of administering a diagnostic tool called the Statistical Physics Concept Survey, developed by the authors. It deals with eight themes of basic importance in learning undergraduate solid-state physics and statistical physics. The question items of the tool were put through well-established sequential processes: definition of themes, a Delphi study, interviews with students, drafting of questions, administration, and validity and reliability testing of the tool. The tool was administered to a group of undergraduate and postgraduate students in a pre-test and post-test design. In this paper, we have taken one of the themes of the diagnostic tool, Fermi energy, for our analysis and discussion. Students' responses and the reasoning comments given during interviews were analysed. This analysis helped us to identify prevailing misconceptions/learning gaps among students on this topic. How spreadsheets can be effectively used to remove the identified misconceptions and to help appreciate the finer nuances in visualizing the behaviour of the system around the Fermi energy, normally sidestepped by both teachers and learners, is also presented in this paper.
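The spreadsheet exercise the authors mention, tabulating the occupation of states around εF, can equally be sketched in a few lines of code; the energies below are illustrative values in eV:

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def fermi_dirac(e, e_f, t):
    """Occupation probability of a state at energy e (eV) for
    Fermi energy e_f (eV) and temperature t (K)."""
    if t == 0:
        # Step function at T = 0: all states below e_f filled
        return 1.0 if e < e_f else (0.5 if e == e_f else 0.0)
    return 1.0 / (1.0 + math.exp((e - e_f) / (K_B * t)))

# At e = e_f the occupation is exactly 1/2 at any finite temperature,
# and the step softens only within a few k_B*T of e_f -- two points
# that diagnostic surveys find students frequently miss.
occupations = [fermi_dirac(e, e_f=5.0, t=300.0)
               for e in (4.8, 4.9, 5.0, 5.1, 5.2)]
```

Plotting such a table for several temperatures makes the narrowness of the thermally smeared region around εF, about 0.025 eV at room temperature, immediately visible.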
The effect of introducing computers into an introductory physics problem-solving laboratory
NASA Astrophysics Data System (ADS)
McCullough, Laura Ellen
2000-10-01
Computers are appearing in every type of classroom across the country. Yet they often appear without benefit of studying their effects. The research that is available on computer use in classrooms has found mixed results, and often ignores the theoretical and instructional contexts of the computer in the classroom. The University of Minnesota's physics department employs a cooperative-group problem solving pedagogy, based on a cognitive apprenticeship instructional model, in its calculus-based introductory physics course. This study was designed to determine possible negative effects of introducing a computerized data-acquisition and analysis tool into this pedagogy as a problem-solving tool for students to use in laboratory. To determine the effects of the computer tool, two quasi-experimental treatment groups were selected. The computer-tool group (N = 170) used a tool, designed for this study (VideoTool), to collect and analyze motion data in the laboratory. The control group (N = 170) used traditional non-computer equipment (spark tapes and Polaroid(TM) film). The curriculum was kept as similar as possible for the two groups. During the ten week academic quarter, groups were examined for effects on performance on conceptual tests and grades, attitudes towards the laboratory and the laboratory tools, and behaviors within cooperative groups. Possible interactions with gender were also examined. Few differences were found between the control and computer-tool groups. The control group received slightly higher scores on one conceptual test, but this difference was not educationally significant. The computer-tool group had slightly more positive attitudes towards using the computer tool than their counterparts had towards the traditional tools. The computer-tool group also perceived that they spoke more frequently about physics misunderstandings, while the control group felt that they discussed equipment difficulties more often. 
This perceptual difference interacted with gender, with the men in the control group more likely to discuss equipment difficulties than any other group. Overall, the differences between the control and quasi-experimental groups were minimal. It was concluded that carefully replacing traditional data collection and analysis tools with a computer tool had no negative effects on achievement, attitude, group behavior, and did not interact with gender.
Slow speed—fast motion: time-lapse recordings in physics education
NASA Astrophysics Data System (ADS)
Vollmer, Michael; Möllmann, Klaus-Peter
2018-05-01
Video analysis with a 30 Hz frame rate is the standard tool in physics education. The development of affordable high-speed-cameras has extended the capabilities of the tool for much smaller time scales to the 1 ms range, using frame rates of typically up to 1000 frames s⁻¹, allowing us to study transient physics phenomena happening too fast for the naked eye. Here we want to extend the range of phenomena which may be studied by video analysis in the opposite direction by focusing on much longer time scales ranging from minutes and hours to many days or even months. We discuss this time-lapse method and the needed equipment, and give a few hints on how to produce respective recordings for two specific experiments.
Inference for the physical sciences
Jones, Nick S.; Maccarone, Thomas J.
2013-01-01
There is a disconnect between developments in modern data analysis and some parts of the physical sciences in which they could find ready use. This introduction, and this issue, provides resources to help experimental researchers access modern data analysis tools and exposure for analysts to extant challenges in physical science. We include a table of resources connecting statistical and physical disciplines and point to appropriate books, journals, videos and articles. We conclude by highlighting the relevance of each of the articles in the associated issue. PMID:23277613
Cross-platform validation and analysis environment for particle physics
NASA Astrophysics Data System (ADS)
Chekanov, S. V.; Pogrebnyak, I.; Wilbern, D.
2017-11-01
A multi-platform validation and analysis framework for public Monte Carlo simulation for high-energy particle collisions is discussed. The front-end of this framework uses the Python programming language, while the back-end is written in Java, which provides a multi-platform environment that can be run from a web browser and can easily be deployed at the grid sites. The analysis package includes all major software tools used in high-energy physics, such as Lorentz vectors, jet algorithms, histogram packages, graphic canvases, and tools for providing data access. This multi-platform software suite, designed to minimize OS-specific maintenance and deployment time, is used for online validation of Monte Carlo event samples through a web interface.
Chapter 13: Tools for analysis
William Elliot; Kevin Hyde; Lee MacDonald; James McKean
2007-01-01
This chapter presents a synthesis of current computer modeling tools that are, or could be, adopted for use in evaluating the cumulative watershed effects of fuel management. The chapter focuses on runoff, soil erosion and slope stability predictive tools. Readers should refer to chapters on soil erosion and stability for more detailed information on the physical...
An Overview of the Role of Systems Analysis in NASA's Hypersonics Project
NASA Technical Reports Server (NTRS)
Robinson, Jeffrey S.; Martin, John G.; Bowles, Jeffrey V.
2006-01-01
NASA's Aeronautics Research Mission Directorate recently restructured its Vehicle Systems Program, refocusing it towards understanding the fundamental physics that govern flight in all speed regimes. Now called the Fundamental Aeronautics Program, it comprises four new projects: Subsonic Fixed Wing, Subsonic Rotary Wing, Supersonics, and Hypersonics. The Aeronautics Research Mission Directorate has charged the Hypersonics Project with developing a basic understanding of all systems that travel at hypersonic speeds within the Earth's and other planets' atmospheres. This includes both powered and unpowered systems, such as re-entry vehicles and vehicles powered by rocket or airbreathing propulsion that cruise in and accelerate through the atmosphere. The primary objective of the Hypersonics Project is to develop physics-based predictive tools that enable the design, analysis and optimization of such systems. The Hypersonics Project charges the systems analysis discipline team with providing the decision-making information it needs to properly guide research and technology development. Credible, rapid, and robust multi-disciplinary system analysis processes and design tools are required in order to generate this information. To this end, the principal challenges for the systems analysis team are the introduction of high-fidelity physics into the analysis process and its integration into a design environment, quantification of design uncertainty through the use of probabilistic methods, reduction in design cycle time, and the development and implementation of robust processes and tools enabling a wide design space and an associated technology assessment capability. This paper discusses the roles and responsibilities of the systems analysis discipline team within the Hypersonics Project as well as the tools, methods, processes, and approach that the team will undertake in order to perform its project-designated functions.
Inspection of the Math Model Tools for On-Orbit Assessment of Impact Damage Report. Version 1.0
NASA Technical Reports Server (NTRS)
Harris, Charles E.; Raju, Ivatury S.; Piascik, Robert S.; Kramer White, Julie; Labbe, Steve G.; Rotter, Hank A.
2005-01-01
In Spring of 2005, the NASA Engineering Safety Center (NESC) was engaged by the Space Shuttle Program (SSP) to peer review the suite of analytical tools being developed to support the determination of impact and damage tolerance of the Orbiter Thermal Protection Systems (TPS). The NESC formed an independent review team with the core disciplines of materials, flight sciences, structures, mechanical analysis and thermal analysis. The Math Model Tools reviewed included damage prediction and stress analysis, aeroheating analysis, and thermal analysis tools. Some tools are physics-based and other tools are empirically-derived. Each tool was created for a specific use and timeframe, including certification, real-time pre-launch assessments, and real-time on-orbit assessments. The tools are used together in an integrated strategy for assessing the ramifications of impact damage to tile and RCC. The NESC teams conducted a peer review of the engineering data package for each Math Model Tool. This report contains the summary of the team observations and recommendations from these reviews.
NASA Astrophysics Data System (ADS)
Kavcar, Nevzat; Korkmaz, Cihan
2017-02-01
The purpose of this work is to determine physics teacher candidates' views on the content and general properties of the Physics 10 textbook written for the 2013 Secondary School Physics Curriculum. Twenty-three teacher candidates in the 2014-2015 school year constituted the sample of the study, in which a survey model based on qualitative research techniques was used, together with document analysis. The data collection tool consisted of forms with 51 open-ended questions on the subject content and nine on the general properties of the textbook. It was concluded that the textbook was sufficient in its life-context-based approach, language, activity-based and student-centered approach, and development of social and inquiry skills, but insufficient in addressing the educational gains of the Curriculum and in providing activities, projects and homework for application. Activities and applications addressing the affective domain, and such assessment and evaluation tools as concept maps, concept networks and semantic analysis tables, could be included in the textbook.
Using High Speed Smartphone Cameras and Video Analysis Techniques to Teach Mechanical Wave Physics
ERIC Educational Resources Information Center
Bonato, Jacopo; Gratton, Luigi M.; Onorato, Pasquale; Oss, Stefano
2017-01-01
We propose the use of smartphone-based slow-motion video analysis techniques as a valuable tool for investigating physics concepts ruling mechanical wave propagation. The simple experimental activities presented here, suitable for both high school and undergraduate students, allow one to measure, in a simple yet rigorous way, the speed of pulses…
Nonverbal communication in doctor-elderly patient transactions (NDEPT): development of a tool.
Gorawara-Bhat, Rita; Cook, Mary Ann; Sachs, Greg A
2007-05-01
There are several measurement tools to assess verbal dimensions in clinical encounters; in contrast, there is no established tool to evaluate physical nonverbal dimensions in geriatric encounters. The present paper describes the development of a tool to assess the physical context of exam rooms in doctor-older patient visits. Salient features of the tool were derived from the medical literature and systematic observations of videotapes and refined during current research. The tool consists of two main dimensions of exam rooms: (1) physical dimensions comprising static and dynamic attributes that become operational through the spatial configuration and can influence the manifestation of (2) kinesic attributes. Details of the coding form and inter-rater reliability are presented. The usefulness of the tool is demonstrated through an analysis of 50 National Institute of Aging videotapes. Physicians in exam rooms with no desk in the interaction, no height difference and an optimal interaction distance were observed to have greater eye contact and touch than physicians in exam rooms with a desk, a height difference and a suboptimal interaction distance. The tool can enable physicians to assess the spatial configuration of exam rooms (through Parts A and B) and thus facilitate the structuring of kinesic attributes (Part C).
DOT National Transportation Integrated Search
2016-10-01
This report details the research undertaken and the software tools that were developed to enable digital images of gusset plates to be converted into orthophotos, establish physical dimensions, collect geometric information from them, and conduct s...
Some applications of mathematics in theoretical physics - A review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bora, Kalpana
2016-06-21
Mathematics is a very beautiful subject, and very much an indispensable tool for Physics, more so for Theoretical Physics (by which we mean here mainly Field Theory and High Energy Physics). These branches of Physics are based on Quantum Mechanics and the Special Theory of Relativity, and many mathematical concepts are used in them. In this work, we shall elucidate only some of them, such as differential geometry, infinite series, Mellin transforms, Fourier and integral transforms, special functions, calculus, complex algebra, topology, group theory, Riemannian geometry, functional analysis, linear algebra, and operator algebra. We shall also present some physics issues where these mathematical tools are used. It is not wrong to say that Mathematics is such a powerful tool that without it there cannot be any Physics theory. A brief review of our research work is also presented.
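Of the tools listed, an infinite series is the easiest to make concrete; as a hedged numerical aside (the Basel series example is ours, not the paper's), the partial sums of Σ 1/k² converge to π²/6:

```python
import math

def basel_partial_sum(n):
    """Partial sum of sum_{k=1}^{n} 1/k^2, which converges to pi^2/6."""
    return sum(1.0 / k ** 2 for k in range(1, n + 1))

approx = basel_partial_sum(100000)   # within about 1/n of pi^2/6
```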
HYPATIA--An Online Tool for ATLAS Event Visualization
ERIC Educational Resources Information Center
Kourkoumelis, C.; Vourakis, S.
2014-01-01
This paper describes an interactive tool for analysis of data from the ATLAS experiment taking place at the world's highest energy particle collider at CERN. The tool, called HYPATIA/applet, enables students of various levels to become acquainted with particle physics and look for discoveries in a similar way to that of real research.
Physics Metacognition Inventory Part II: Confirmatory factor analysis and Rasch analysis
NASA Astrophysics Data System (ADS)
Taasoobshirazi, Gita; Bailey, MarLynn; Farley, John
2015-11-01
The Physics Metacognition Inventory was developed to measure physics students' metacognition for problem solving. In one of our earlier studies, an exploratory factor analysis provided evidence of preliminary construct validity, revealing six components of students' metacognition when solving physics problems including knowledge of cognition, planning, monitoring, evaluation, debugging, and information management. The college students' scores on the inventory were found to be reliable and related to students' physics motivation and physics grade. However, the results of the exploratory factor analysis indicated that the questionnaire could be revised to improve its construct validity. The goal of this study was to revise the questionnaire and establish its construct validity through a confirmatory factor analysis. In addition, a Rasch analysis was applied to the data to better understand the psychometric properties of the inventory and to further evaluate the construct validity. Results indicated that the final, revised inventory is a valid, reliable, and efficient tool for assessing student metacognition for physics problem solving.
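For readers unfamiliar with it, the dichotomous Rasch model scores the probability of a correct response as a logistic function of the gap between person ability θ and item difficulty b; a minimal sketch (illustrative, not the authors' analysis code):

```python
import math

def rasch_probability(theta, b):
    """Dichotomous Rasch model: P(correct) = 1 / (1 + exp(-(theta - b)))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals difficulty, the model predicts a 50% success chance:
p_equal = rasch_probability(0.0, 0.0)
```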
CMS Analysis and Data Reduction with Apache Spark
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gutsche, Oliver; Canali, Luca; Cremer, Illia
Experimental particle physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was among the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems for distributed data processing, collectively called "Big Data" technologies, have emerged from industry and open source projects to support the analysis of Petabyte and Exabyte datasets. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and tools, promising a fresh look at the analysis of very large datasets that could potentially reduce the time-to-physics with increased interactivity. Moreover, these new tools are typically actively developed by large communities, often profiting from industry resources, and released under open source licensing. These factors result in a boost for adoption and maturity of the tools and for the communities supporting them, at the same time helping to reduce the cost of ownership for end users. In this talk, we present studies of using Apache Spark for end-user data analysis. We study the HEP analysis workflow separated into two thrusts: the reduction of centrally produced experiment datasets and the end analysis up to the publication plot. For the first thrust, CMS is working together with CERN openlab and Intel on the CMS Big Data Reduction Facility, whose goal is to reduce 1 PB of official CMS data to 1 TB of ntuple output for analysis. We present the progress of this two-year project with first results of scaling up Spark-based HEP analysis. For the second thrust, we present studies on using Apache Spark for a CMS dark matter physics search, comparing Spark's feasibility, usability and performance to the ROOT-based analysis.
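The first thrust — filtering events and projecting out analysis columns — can be sketched in plain Python; the toy records and cut value below are our assumptions, and a real job would express the same filter and projection on a Spark DataFrame:

```python
# Toy event records; a production job would load these from experiment data.
events = [
    {"run": 1, "met": 250.0, "n_jets": 3},
    {"run": 1, "met": 40.0,  "n_jets": 1},
    {"run": 2, "met": 310.0, "n_jets": 4},
]

# Reduction: keep events passing a missing-transverse-energy cut,
# and keep only the columns the end analysis needs.
ntuple = [{"met": e["met"], "n_jets": e["n_jets"]}
          for e in events if e["met"] > 200.0]
```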
The Use of Computational Human Performance Modeling as a Task Analysis Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacques Hugo; David Gertman
2012-07-01
During a review of the Advanced Test Reactor safety basis at the Idaho National Laboratory, human factors engineers identified ergonomic and human reliability risks involving the inadvertent exposure of a fuel element to the air during manual fuel movement and inspection in the canal. There were clear indications that these risks increased the probability of human error and possible severe physical outcomes to the operator. In response to this concern, a detailed study was conducted to determine the probability of the inadvertent exposure of a fuel element. Due to practical and safety constraints, the task network analysis technique was employed to study the work procedures at the canal. Discrete-event simulation software was used to model the entire procedure as well as the salient physical attributes of the task environment, such as distances walked, the effect of dropped tools, the effect of hazardous body postures, and physical exertion due to strenuous tool handling. The model also allowed analysis of the effect of cognitive processes such as visual perception demands, auditory information and verbal communication. The model made it possible to obtain reliable predictions of operator performance and workload estimates. It was also found that operator workload as well as the probability of human error in the fuel inspection and transfer task were influenced by the concurrent nature of certain phases of the task and the associated demand on cognitive and physical resources. More importantly, it was possible to determine with reasonable accuracy the stages as well as physical locations in the fuel handling task where operators would be most at risk of losing their balance and falling into the canal. The model also provided sufficient information for a human reliability analysis that indicated that the postulated fuel exposure accident was less than credible.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Y. Q.; Shemon, E. R.; Thomas, J. W.
SHARP is an advanced modeling and simulation toolkit for the analysis of nuclear reactors. It is comprised of several components including physical modeling tools, tools to integrate the physics codes for multi-physics analyses, and a set of tools to couple the codes within the MOAB framework. Physics modules currently include the neutronics code PROTEUS, the thermal-hydraulics code Nek5000, and the structural mechanics code Diablo. This manual focuses on performing multi-physics calculations with the SHARP ToolKit. Manuals for the three individual physics modules are available with the SHARP distribution to help the user either carry out the primary multi-physics calculation with basic knowledge or perform further advanced development with in-depth knowledge of these codes. This manual provides step-by-step instructions on employing SHARP, including how to download and install the code, how to build the drivers for a test case, how to perform a calculation and how to visualize the results. Since SHARP has some specific library and environment dependencies, it is highly recommended that the user read this manual prior to installing SHARP. Verification test cases are included to check proper installation of each module. It is suggested that the new user first follow the step-by-step instructions provided for a test problem in this manual to understand the basic procedure of using SHARP before using SHARP for his/her own analysis. Both reference output and scripts are provided along with the test cases in order to verify correct installation and execution of the SHARP package. At the end of this manual, detailed instructions are provided on how to create a new test case so that the user can perform novel multi-physics calculations with SHARP. Frequently asked questions are listed at the end of this manual to help the user troubleshoot issues.
Energy evaluation of protection effectiveness of anti-vibration gloves.
Hermann, Tomasz; Dobry, Marian Witalis
2017-09-01
This article describes an energy method of assessing protection effectiveness of anti-vibration gloves on the human dynamic structure. The study uses dynamic models of the human and the glove specified in Standard No. ISO 10068:2012. The physical models of human-tool systems were developed by combining human physical models with a power tool model. The combined human-tool models were then transformed into mathematical models from which energy models were finally derived. Comparative energy analysis was conducted in the domain of rms powers. The energy models of the human-tool systems were solved using numerical simulation implemented in the MATLAB/Simulink environment. The simulation procedure demonstrated the effectiveness of the anti-vibration glove as a method of protecting human operators of hand-held power tools against vibration. The desirable effect is achieved by lowering the flow of energy in the human-tool system when the anti-vibration glove is employed.
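The rms powers in which the comparison is carried out follow from the standard root-mean-square definition; a minimal numerical sketch (illustrative, not the authors' MATLAB/Simulink model):

```python
import math

def rms(samples):
    """Root mean square of a sampled signal."""
    return math.sqrt(sum(x * x for x in samples) / len(samples))

# One full period of a unit-amplitude sinusoid sampled 1000 times:
n = 1000
signal = [math.sin(2 * math.pi * k / n) for k in range(n)]
sine_rms = rms(signal)   # approaches 1/sqrt(2) for a full period
```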
An epistemic framing analysis of upper level physics students' use of mathematics
NASA Astrophysics Data System (ADS)
Bing, Thomas Joseph
Mathematics is central to a professional physicist's work and, by extension, to a physics student's studies. It provides a language for abstraction, definition, computation, and connection to physical reality. This power of mathematics in physics is also the source of many of the difficulties it presents students. Simply put, many different activities could all be described as "using math in physics". Expertise entails a complicated coordination of these various activities. This work examines the many different kinds of thinking that are all facets of the use of mathematics in physics. It uses an epistemological lens, one that looks at the type of explanation a student presently sees as appropriate, to analyze the mathematical thinking of upper level physics undergraduates. Sometimes a student will turn to a detailed calculation to produce or justify an answer. Other times a physical argument is explicitly connected to the mathematics at hand. Still other times quoting a definition is seen as sufficient, and so on. Local coherencies evolve in students' thought around these various types of mathematical justifications. We use the cognitive process of framing to model students' navigation of these various facets of math use in physics. We first demonstrate several common framings observed in our students' mathematical thought and give several examples of each. Armed with this analysis tool, we then give several examples of how this framing analysis can be used to address a research question. We consider what effects, if any, a powerful symbolic calculator has on students' thinking. We also consider how to characterize growing expertise among physics students. Framing offers a lens for analysis that is a natural fit for these sample research questions. To active physics education researchers, the framing analysis presented in this dissertation can provide a useful tool for addressing other research questions. 
To physics teachers, we present this analysis so that it may make them more explicitly aware of the various types of reasoning, and the dynamics among them, that students employ in our physics classes. This awareness will help us better hear students' arguments and respond appropriately.
NASA Technical Reports Server (NTRS)
Perrell, Eric R.
2005-01-01
The recent bold initiatives to expand the human presence in space require innovative approaches to the design of propulsion systems whose underlying technology is not yet mature. The space propulsion community has identified a number of candidate concepts. A short list includes solar sails, high-energy-density chemical propellants, electric and electromagnetic accelerators, and solar-thermal and nuclear-thermal expanders. For each of these, the underlying physics are relatively well understood. One could easily cite authoritative texts addressing both the governing equations and practical solution methods for, e.g., electromagnetic fields, heat transfer, radiation, thermophysics, structural dynamics, particulate kinematics, nuclear energy, power conversion, and fluid dynamics. One could also easily cite scholarly works in which complete equation sets for any one of these physical processes have been accurately solved relative to complex engineered systems. The Advanced Concepts and Analysis Office (ACAO), Space Transportation Directorate, NASA Marshall Space Flight Center, has recently released the first alpha version of a set of computer utilities for performing the applicable physical analyses relative to candidate deep-space propulsion systems such as those listed above. PARSEC, Preliminary Analysis of Revolutionary in-Space Engineering Concepts, enables rapid iterative calculations using several physics tools developed in-house. A complete cycle of the entire tool set takes about twenty minutes. PARSEC is a level-zero/level-one design tool. For PARSEC's proof-of-concept and preliminary design decision-making, assumptions that significantly simplify the governing equation sets are necessary. To proceed to level two, one wishes to retain modeling of the underlying physics as close as practical to known applicable first principles.
This report describes results of collaboration between ACAO and Embry-Riddle Aeronautical University (ERAU) to begin building a set of level-two design tools for PARSEC. The "CFD Multiphysics Tool" will be the propulsive element of the tool set. The name acknowledges that space propulsion performance assessment is primarily a fluid mechanics problem. At the core of the CFD Multiphysics Tool is an open-source CFD code, HYP, under development at ERAU. ERAU is renowned for its undergraduate degree program in Aerospace Engineering, the largest in the nation. The strength of the program is its applications-oriented curriculum, which culminates in one of three two-course Engineering Design sequences: Aerospace Propulsion, Spacecraft, or Aircraft. This same philosophy applies to the HYP Project, albeit with fluid physics modeling commensurate with graduate research. HYP's purpose, like the Multiphysics Tool's, is to enable calculations of real (three-dimensional; geometrically complex; intended for hardware development) applications of high-speed and propulsive fluid flows.
Saluja, Kiran; Rawal, Tina; Bassi, Shalini; Bhaumik, Soumyadeep; Singh, Ankur; Park, Min Hae; Kinra, Sanjay; Arora, Monika
2018-06-01
We aimed to identify, describe and analyse school environment assessment (SEA) tools that address behavioural risk factors (unhealthy diet, physical inactivity, tobacco and alcohol consumption) for non-communicable diseases (NCD). We searched in MEDLINE and Web of Science, hand-searched reference lists and contacted experts. Basic characteristics, measures assessed and measurement properties (validity, reliability, usability) of identified tools were extracted. We narratively synthesized the data and used content analysis to develop a list of measures used in the SEA tools. Twenty-four SEA tools were identified, mostly from developed countries. Of these, 15 were questionnaire based, eight were checklist or observation based, and one combined a checklist/observation based approach with a telephone questionnaire. Only one SEA tool had components related to all four NCD risk factors, two assessed three risk factors (diet/nutrition, physical activity, tobacco), 10 assessed two (diet/nutrition and physical activity) and 11 assessed only one. Several measures were used in the tools to assess the four NCD risk factors, but tobacco and alcohol were sparingly included. Measurement properties were reported for 14 tools. The review provides a comprehensive list of measures used in SEA tools, which could be a valuable resource to guide future development of such tools. A valid and reliable SEA tool that could simultaneously evaluate all NCD risk factors, tested in different settings with varying resource availability, is needed.
2013-01-01
Background: Assessing the risk of bias of randomized controlled trials (RCTs) is crucial to understand how biases affect treatment effect estimates. A number of tools have been developed to evaluate risk of bias of RCTs; however, it is unknown how these tools compare to each other in the items included. The main objective of this study was to describe which individual items are included in RCT quality tools used in general health and physical therapy (PT) research, and how these items compare to those of the Cochrane Risk of Bias (RoB) tool. Methods: We used comprehensive literature searches and a systematic approach to identify tools that evaluated the methodological quality or risk of bias of RCTs in general health and PT research. We extracted individual items from all quality tools. We calculated the frequency of quality items used across tools and compared them to those in the RoB tool. Comparisons were made between general health and PT quality tools using Chi-squared tests. Results: In addition to the RoB tool, 26 quality tools were identified, with 19 being used in general health and seven in PT research. The total number of quality items included in general health research tools was 130, compared with 48 items across PT tools and seven items in the RoB tool. The most frequently included items in general health research tools (14/19, 74%) were inclusion and exclusion criteria, and appropriate statistical analysis. In contrast, the most frequent items included in PT tools (86%, 6/7) were: baseline comparability, blinding of investigator/assessor, and use of intention-to-treat analysis. Key items of the RoB tool (sequence generation and allocation concealment) were included in 71% (5/7) of PT tools, and 63% (12/19) and 37% (7/19) of general health research tools, respectively. Conclusions: There is extensive item variation across tools that evaluate the risk of bias of RCTs in health research. 
Results call for an in-depth analysis of items that should be used to assess risk of bias of RCTs. Further empirical evidence on the use of individual items and the psychometric properties of risk of bias tools is needed. PMID:24044807
Armijo-Olivo, Susan; Fuentes, Jorge; Ospina, Maria; Saltaji, Humam; Hartling, Lisa
2013-09-17
Assessing the risk of bias of randomized controlled trials (RCTs) is crucial to understand how biases affect treatment effect estimates. A number of tools have been developed to evaluate risk of bias of RCTs; however, it is unknown how these tools compare to each other in the items included. The main objective of this study was to describe which individual items are included in RCT quality tools used in general health and physical therapy (PT) research, and how these items compare to those of the Cochrane Risk of Bias (RoB) tool. We used comprehensive literature searches and a systematic approach to identify tools that evaluated the methodological quality or risk of bias of RCTs in general health and PT research. We extracted individual items from all quality tools. We calculated the frequency of quality items used across tools and compared them to those in the RoB tool. Comparisons were made between general health and PT quality tools using Chi-squared tests. In addition to the RoB tool, 26 quality tools were identified, with 19 being used in general health and seven in PT research. The total number of quality items included in general health research tools was 130, compared with 48 items across PT tools and seven items in the RoB tool. The most frequently included items in general health research tools (14/19, 74%) were inclusion and exclusion criteria, and appropriate statistical analysis. In contrast, the most frequent items included in PT tools (86%, 6/7) were: baseline comparability, blinding of investigator/assessor, and use of intention-to-treat analysis. Key items of the RoB tool (sequence generation and allocation concealment) were included in 71% (5/7) of PT tools, and 63% (12/19) and 37% (7/19) of general health research tools, respectively. There is extensive item variation across tools that evaluate the risk of bias of RCTs in health research. Results call for an in-depth analysis of items that should be used to assess risk of bias of RCTs. 
Further empirical evidence on the use of individual items and the psychometric properties of risk of bias tools is needed.
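The Chi-squared comparisons mentioned above can be reproduced by hand for a 2x2 table; using the abstract's sequence-generation counts (12/19 general health vs 5/7 PT tools) as an illustration:

```python
def chi2_2x2(table):
    """Pearson chi-squared statistic for a 2x2 contingency table."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row_sums = (a + b, c + d)
    col_sums = (a + c, b + d)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_sums[i] * col_sums[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

# Rows: general health vs PT tools; columns: item included vs not included.
stat = chi2_2x2([(12, 7), (5, 2)])   # compare against 3.84 (1 df, alpha = 0.05)
```

The statistic here falls well below the 3.84 critical value, so these particular proportions would not differ significantly.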
Ulitsky, Igor; Shamir, Ron
2007-01-01
The biological interpretation of genetic interactions is a major challenge. Recently, Kelley and Ideker proposed a method to analyze together genetic and physical networks, which explains many of the known genetic interactions as linking different pathways in the physical network. Here, we extend this method and devise novel analytic tools for interpreting genetic interactions in a physical context. Applying these tools on a large-scale Saccharomyces cerevisiae data set, our analysis reveals 140 between-pathway models that explain 3765 genetic interactions, roughly doubling those that were previously explained. Model genes tend to have short mRNA half-lives and many phosphorylation sites, suggesting that their stringent regulation is linked to pathway redundancy. We also identify 'pivot' proteins that have many physical interactions with both pathways in our models, and show that pivots tend to be essential and highly conserved. Our analysis of models and pivots sheds light on the organization of the cellular machinery as well as on the roles of individual proteins. PMID:17437029
Insights into teaching quantum mechanics in secondary and lower undergraduate education
NASA Astrophysics Data System (ADS)
Krijtenburg-Lewerissa, K.; Pol, H. J.; Brinkman, A.; van Joolingen, W. R.
2017-06-01
This study presents a review of the current state of research on teaching quantum mechanics in secondary and lower undergraduate education. A conceptual approach to quantum mechanics is being implemented in more and more introductory physics courses around the world. Because of the differences between the conceptual nature of quantum mechanics and classical physics, research on misconceptions, testing, and teaching strategies for introductory quantum mechanics is needed. For this review, 74 articles were selected and analyzed for the misconceptions, research tools, teaching strategies, and multimedia applications investigated. Outcomes were categorized according to their contribution to the various subtopics of quantum mechanics. Analysis shows that students have difficulty relating quantum physics to physical reality. It also shows that the teaching of complex quantum behavior, such as time dependence, superposition, and the measurement problem, has barely been investigated for the secondary and lower undergraduate level. At the secondary school level, this article shows a need to investigate student difficulties concerning wave functions and potential wells. Investigation of research tools shows the necessity for the development of assessment tools for secondary and lower undergraduate education, which cover all major topics and are suitable for statistical analysis. Furthermore, this article shows the existence of very diverse ideas concerning teaching strategies for quantum mechanics and a lack of research into which strategies promote understanding. This article underlines the need for more empirical research into student difficulties, teaching strategies, activities, and research tools intended for a conceptual approach for quantum mechanics.
NASA Enterprise Visual Analysis
NASA Technical Reports Server (NTRS)
Lopez-Tellado, Maria; DiSanto, Brenda; Humeniuk, Robert; Bard, Richard, Jr.; Little, Mia; Edwards, Robert; Ma, Tien-Chi; Hollifield, Kenneith; White, Chuck
2007-01-01
NASA Enterprise Visual Analysis (NEVA) is a computer program under development as a successor to the Launch Services Analysis Tool (LSAT), formerly known as the Payload Carrier Analysis Tool (PCAT). NEVA facilitates analyses of proposed configurations of payloads and packing fixtures (e.g., pallets) in a space shuttle payload bay for transport to the International Space Station. NEVA reduces the need to use physical models, mockups, and full-scale ground support equipment in performing such analyses. Using NEVA, one can account for such diverse considerations as weight distribution, geometry, collision avoidance, power requirements, thermal loads, and mechanical loads.
Tools for Scientific Thinking: Microcomputer-Based Laboratories for the Naive Science Learner.
ERIC Educational Resources Information Center
Thornton, Ronald K.
A promising new development in science education is the use of microcomputer-based laboratory tools that allow for student-directed data acquisition, display, and analysis. Microcomputer-based laboratories (MBL) make use of inexpensive microcomputer-connected probes to measure such physical quantities as temperature, position, and various…
Alternative Model for Administration and Analysis of Research-Based Assessments
ERIC Educational Resources Information Center
Wilcox, Bethany R.; Zwickl, Benjamin M.; Hobbs, Robert D.; Aiken, John M.; Welch, Nathan M.; Lewandowski, H. J.
2016-01-01
Research-based assessments represent a valuable tool for both instructors and researchers interested in improving undergraduate physics education. However, the historical model for disseminating and propagating conceptual and attitudinal assessments developed by the physics education research (PER) community has not resulted in widespread adoption…
Meteor Entry and Breakup Based on Evolution of NASA's Entry Capsule Design Tools
NASA Technical Reports Server (NTRS)
Prabhu, Dinesh K.; Saunders, D.; Stern, E.; Chen, Y.-K.; Allen, G.; Agrawal, P.; Jaffe, R.; White, S.; Tauber, M.; Bauschlicher, C.
2015-01-01
Physics of atmospheric entry of meteoroids was an active area of research at NASA ARC up to the early 1970s (e.g., the oft-cited work of Baldwin and Sheaffer). However, research in the area seems to have ended with the Apollo program, and any ties with an active international meteor physics community seem to have significantly diminished thereafter. In the decades following the 1970s, the focus of entry physics at NASA ARC has been on improvement of the math models of shock-layer physics (especially in chemical kinetics and radiation) and thermal response of ablative materials used for capsule heatshields. With the overarching objectives of understanding energy deposition into the atmosphere and fragmentation, could these modern analysis tools and processes be applied to the problem of atmospheric entry of meteoroids as well? In the presentation we will explore: (i) the physics of atmospheric entries of meteoroids using our current state-of-the-art tools and processes, (ii) the influence of shape (and shape change) on flow characteristics, and (iii) how multiple bodies interact.
Measurement of obesity prevention in childcare settings: A systematic review of current instruments.
Stanhope, Kaitlyn K; Kay, Christi; Stevenson, Beth; Gazmararian, Julie A
The incidence of childhood obesity is highest among children entering kindergarten. Overweight and obesity in early childhood track through adulthood. Programs increasingly target children in early life for obesity prevention. However, the published literature lacks a review on tools available for measuring behaviour and environmental level change in child care. The objective is to describe measurement tools currently in use in evaluating obesity-prevention in preschool-aged children. Literature searches were conducted in PubMed using the keywords "early childhood obesity," "early childhood measurement," "early childhood nutrition" and "early childhood physical activity." Inclusion criteria included a discussion of: (1) obesity prevention, risk assessment or treatment in children ages 1-5 years; and (2) measurement of nutrition or physical activity. One hundred thirty-four publications were selected for analysis. Data on measurement tools, population and outcomes were abstracted into tables. Tables are divided by individual and environmental level measures and further divided into physical activity, diet and physical health outcomes. Recommendations are made for weighing advantages and disadvantages of tools. Despite rising numbers of interventions targeting obesity-prevention and treatment in preschool-aged children, there is no consensus for which tools represent a gold standard or threshold of accuracy. Copyright © 2016 Asia Oceania Association for the Study of Obesity. Published by Elsevier Ltd. All rights reserved.
Herens, Marion; Wagemakers, Annemarie
2017-12-01
In community-based health enhancing physical activity (CBHEPA) programmes, group-based principles for action such as active participation, enjoyment, and fostering group processes are widely advocated. However, not much is known about participants' perceptions of these principles, as there are no assessment tools available. Therefore, this article describes the development of the APEF (Active Participation, Enjoyment, and Fostering group processes) tool and reports on its implementation in a Dutch CBHEPA programme. Indicators for the principles were identified from literature research, interviews with professionals, and secondary analysis of three group interviews with 11 practitioners. To address the identified indicators, the APEF tool was developed, pretested, and used in 10 focus groups with 76 participants. The APEF tool consists of eight statements about group-based principles for action, on which CBHEPA participants vote, followed by in-depth discussion. The voting procedure engages participants. Spider diagrams visualise participants' perceptions of group-based principles. The APEF tool addresses the challenge of relating group-level outcomes to individual outcomes such as physical activity behaviour. The tool both facilitates and evaluates group-based principles for action; it stimulates dialogue and is culturally sensitive, but it requires strong facilitation skills to manage group dynamics. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Rethinking Technology-Enhanced Physics Teacher Education: From Theory to Practice
ERIC Educational Resources Information Center
Milner-Bolotin, Marina
2016-01-01
This article discusses how modern technology, such as electronic response systems, PeerWise system, data collection and analysis tools, computer simulations, and modeling software can be used in physics methods courses to promote teacher-candidates' professional competencies and their positive attitudes about mathematics and science education. We…
Cognitive Issues in Learning Advanced Physics: An Example from Quantum Mechanics
NASA Astrophysics Data System (ADS)
Singh, Chandralekha; Zhu, Guangtian
2009-11-01
We are investigating cognitive issues in learning quantum mechanics in order to develop effective teaching and learning tools. The analysis of cognitive issues is particularly important for bridging the gap between the quantitative and conceptual aspects of quantum mechanics and for ensuring that the learning tools help students build a robust knowledge structure. We discuss the cognitive aspects of quantum mechanics that are similar or different from those of introductory physics and their implications for developing strategies to help students develop a good grasp of quantum mechanics.
NASA Astrophysics Data System (ADS)
Vardanyan, E. L.; Budilov, V. V.; Ramazanov, K. N.; Khusnimardanov, R. N.; Nagimov, R. Sh
2017-05-01
The operating conditions and wear mechanisms of slotting tools made from high-speed steel were investigated. Methods for increasing tool durability were analyzed. The effect of intermetallic coatings deposited from vacuum-arc discharge plasma on the physical-mechanical properties of high-speed steel EP657MP was established. A pilot batch of slotting tools was produced and production tests were carried out.
Proposing a Mathematical Software Tool in Physics Secondary Education
ERIC Educational Resources Information Center
Baltzis, Konstantinos B.
2009-01-01
MathCad® is a very popular software tool for mathematical and statistical analysis in science and engineering. Its low cost, ease of use, extensive function library, and worksheet-like user interface distinguish it among other commercial packages. Its features are also well suited to educational process. The use of natural mathematical notation…
Abstract: This case study application provides discussion on a selected application of advanced concepts, included in the End of Asset Life Reinvestment decision-making process tool, using Milwaukee Metropolitan Sewer District (MMSD) pump and motor data sets. The tool provides s...
NASA Astrophysics Data System (ADS)
Malik, S.; Shipsey, I.; Cavanaugh, R.; Bloom, K.; Chan, Kai-Feng; D'Hondt, J.; Klima, B.; Narain, M.; Palla, F.; Rolandi, G.; Schörner-Sadenius, T.
2014-06-01
To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data taking, the nature of the training also evolved to include more analysis tools, software tutorials, and physics analysis. This effort, epitomized as CMSDAS, has proven key for new and young physicists to jump-start their contributions to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS schools around the globe, CMS is engaging the collaboration in its discovery potential and maximizing physics output. As a broader goal, CMS is striving to nurture and increase the engagement of its myriad talents in the development of physics, service, upgrades, the education of those new to CMS, and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.
Introductory Physics Laboratories for Life Scientists - Hands on Physics of Complex Systems
NASA Astrophysics Data System (ADS)
Losert, Wolfgang; Moore, Kim
2015-03-01
We have developed a set of laboratories and hands on activities to accompany a new two-semester interdisciplinary physics course that has been successfully implemented as the required physics course for premeds at the University of Maryland. The laboratories include significant content on physics relevant to cellular scales, from chemical interactions to random motion and charge screening in fluids. We also introduce the students to research-grade equipment and modern physics analysis tools in contexts relevant to biology, while maintaining the pedagogically valuable open-ended laboratory structure of reformed laboratories.
Gravitational Wave Detection in the Introductory Lab
NASA Astrophysics Data System (ADS)
Burko, Lior M.
2017-01-01
Great physics breakthroughs are rarely included in the introductory physics course. General relativity and binary black hole coalescence are no different, and can be included in the introductory course only in a very limited sense. However, we can design activities that directly involve the detection of GW150914, the gravitational wave signal detected on September 14, 2015, and thereby engage students in this exciting discovery directly. The activities naturally do not include the construction of a detector or the detection of gravitational waves. Instead, we designed them around analysis of the data from GW150914, which offers some interesting analysis activities for students of the introductory course. The same activities can be assigned either as a laboratory exercise or as a computational project for the same population of students. The analysis tools used here are simple and available to the intended student population. They do not include the sophisticated analysis tools that LIGO used to carefully analyze the detected signal. However, these simple tools are sufficient to allow students to obtain important results. We have successfully assigned this lab project to students of the calculus-based introductory course at Georgia Gwinnett College.
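The kind of simple analysis described above can be sketched in a few lines: band-pass filtering a strain time series to the 35-350 Hz band in which the GW150914 chirp is visible. This is a minimal illustration, not the LIGO pipeline; the synthetic chirp below is a stand-in for the real strain data, which is publicly available from the Gravitational Wave Open Science Center.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 4096  # sample rate (Hz), matching the LIGO open-data release
t = np.arange(0, 1.0, 1.0 / fs)

# Synthetic chirp standing in for the GW150914 strain time series.
strain = np.sin(2 * np.pi * (35 + 100 * t) * t) * 1e-21

# Band-pass 35-350 Hz, the band in which the GW150914 signal stands out.
b, a = butter(4, [35 / (fs / 2), 350 / (fs / 2)], btype="band")
filtered = filtfilt(b, a, strain)  # zero-phase filtering

print(filtered.shape)  # → (4096,)
```

With the real data file in place of the synthetic chirp, the same filter makes the chirp visible in a simple strain-versus-time plot.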
NASA Astrophysics Data System (ADS)
Hockicko, Peter; Krišťák, Ľuboš; Němec, Miroslav
2015-03-01
Video analysis, using the program Tracker (Open Source Physics), in the educational process introduces a new creative method of teaching physics and makes natural sciences more interesting for students. This way of exploring the laws of nature can amaze students because this illustrative and interactive educational software inspires them to think creatively, improves their performance and helps them in studying physics. This paper deals with increasing the key competencies in engineering by analysing real-life situation videos - physical problems - by means of video analysis and the modelling tools using the program Tracker and simulations of physical phenomena from The Physics Education Technology (PhET™) Project (VAS method of problem tasks). The statistical testing using the t-test confirmed the significance of the differences in the knowledge of the experimental and control groups, which were the result of interactive method application.
Physical Activity during Pregnancy: Recommendations and Assessment Tools.
Oliveira, Cibele; Imakawa, Thiago Dos Santos; Moisés, Elaine Christine Dantas
2017-08-01
The literature that supports and recommends the practice of exercise during pregnancy is extensive. However, although more complete research on ways to evaluate the physical activity performed by pregnant women has been performed, there is no gold standard, and the articles in the area are inconclusive. Thus, the objective of the present article is to review relevant aspects, such as the technique and applicability of the different methods for the assessment of physical activity during pregnancy, to provide more reliable and safe information for health professionals to encourage their pregnant patients to engage in the practice of physical activity. This review concluded that all tools for the analysis of physical activity have limitations. Thus, it is necessary to establish the objectives of evaluation in an appropriate manner, as well as to determine their viability and cost-effectiveness for the population under study. Thieme Revinter Publicações Ltda, Rio de Janeiro, Brazil.
De Silva Weliange, Shreenika H; Fernando, Dulitha; Gunatilake, Jagath
2014-05-03
Environmental characteristics are known to be associated with patterns of physical activity (PA). Although several validated tools exist to measure environmental characteristics, these instruments are not necessarily suitable for application in all settings, especially in a developing country. This study was carried out to develop and validate an instrument named the "Physical And Social Environment Scale--PASES" to assess the physical and social environmental factors associated with PA. This will enable identification of the various physical and social environmental factors affecting PA in Sri Lanka, which will help in the development of more tailored intervention strategies for promoting higher PA levels in Sri Lanka. The PASES was developed using a scientific approach of defining the construct, item generation, analysis of the content of items, and item reduction. Both qualitative and quantitative methods were used: key informant interviews, in-depth interviews, and expert rating of the generated items. A cross-sectional survey among 180 adults was carried out to assess the factor structure through principal component analysis. Another cross-sectional survey among a different group of 180 adults was carried out to assess construct validity through confirmatory factor analysis. Reliability was assessed with test-retest reliability and internal consistency, using Spearman's r and Cronbach's alpha respectively. Thirty-six items were selected after the expert ratings and were developed into interviewer-administered questions. Exploration of the factor structure of the 34 factorable items through principal component analysis with Quartimax rotation extracted 8 factors. The 34-item instrument was assessed for construct validity with confirmatory factor analysis, which confirmed an 8-factor model (χ2 = 339.9, GFI = 0.90).
The identified factors were infrastructure for walking, aesthetics and facilities for cycling, vehicular traffic safety, access and connectivity, recreational facilities for PA, safety, social cohesion, and social acceptance of PA, with the two non-factorable items being residential density and land use mix. The PASES also showed good test-retest reliability and a moderate level of internal consistency. The PASES is a valid and reliable tool which could be used to assess the physical and social environment associated with PA in Sri Lanka.
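The factor-extraction step described in this record can be illustrated with a small sketch using scikit-learn. Note the hedges: the study used principal component analysis with Quartimax rotation, and scikit-learn's PCA performs only the extraction step (rotation would need an additional library such as factor_analyzer); the synthetic response matrix below merely mimics the survey's dimensions (180 respondents, 34 items) and is not the PASES data.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Synthetic responses: 180 respondents x 34 items, with two latent
# factors driving the items (a stand-in for the real survey data).
latent = rng.normal(size=(180, 2))
loadings = rng.normal(size=(2, 34))
items = latent @ loadings + 0.5 * rng.normal(size=(180, 34))

# Extract 8 components, matching the number of factors in the PASES analysis.
pca = PCA(n_components=8)
scores = pca.fit_transform(items)

print(scores.shape)  # → (180, 8)
```

In practice one would inspect `pca.explained_variance_ratio_` (and a scree plot) to decide how many components to retain before rotating and interpreting them.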
Abstract:This case study application provides discussion on a selected application of advanced concepts, included in the End of Asset Life Reinvestment decision-making process tool, using a utility practitioner’s data set. The tool provides step-by-step process guidance to the as...
CMS Configuration Editor: GUI based application for user analysis job
NASA Astrophysics Data System (ADS)
de Cosa, A.
2011-12-01
We present the user interface and the software architecture of the Configuration Editor for the CMS experiment. The analysis workflow is organized in a modular way, integrated within the CMS framework, which flexibly organizes user analysis code. The Python scripting language is adopted to define the job configuration that drives the analysis workflow. It can be a challenging task for users, especially newcomers, to develop analysis jobs managing the configuration of many required modules. For this reason, a graphical tool has been conceived to edit and inspect configuration files. A set of common analysis tools defined in the CMS Physics Analysis Toolkit (PAT) can be steered and configured using the Config Editor. A user-defined analysis workflow can be produced starting from a standard configuration file, applying and configuring PAT tools according to the specific user requirements. CMS users can adopt this tool, the Config Editor, to create their analyses, visualizing in real time the effects of their actions. They can visualize the structure of their configuration, look at the modules included in the workflow, inspect the dependencies existing among the modules, and check the data flow. They can see which values parameters are set to and change them according to what is required by their analysis task. Integrating common tools into the GUI required adopting an object-oriented structure in the Python definition of the PAT tools and defining a layer of abstraction from which all PAT tools inherit.
Framing the structural role of mathematics in physics lectures: A case study on electromagnetism
NASA Astrophysics Data System (ADS)
Karam, Ricardo
2014-06-01
Physics education research has shown that students tend to struggle when trying to use mathematics in a meaningful way in physics (e.g., mathematizing a physical situation or making sense of equations). Concerning the possible reasons for these difficulties, little attention has been paid to the way mathematics is treated in physics instruction. Starting from an overall distinction between a technical approach, which involves an instrumental (tool-like) use of mathematics, and a structural one, focused on reasoning about the physical world mathematically, the goal of this study is to characterize the development of the latter in didactic contexts. For this purpose, a case study was conducted on the electromagnetism course given by a distinguished physics professor. The analysis of selected teaching episodes with the software Videograph led to the identification of a set of categories that describe different strategies used by the professor to emphasize the structural role of mathematics in his lectures. As a consequence of this research, an analytic tool to enable future comparative studies between didactic approaches regarding the way mathematics is treated in physics teaching is provided.
Using high speed smartphone cameras and video analysis techniques to teach mechanical wave physics
NASA Astrophysics Data System (ADS)
Bonato, Jacopo; Gratton, Luigi M.; Onorato, Pasquale; Oss, Stefano
2017-07-01
We propose the use of smartphone-based slow-motion video analysis techniques as a valuable tool for investigating the physics concepts ruling mechanical wave propagation. The simple experimental activities presented here, suitable for both high school and undergraduate students, allow one to measure, in a simple yet rigorous way, the speed of pulses along a spring and the period of transverse standing waves generated in the same spring. These experiments can be helpful in addressing several relevant concepts about the physics of mechanical waves and in overcoming some typical student misconceptions in this field.
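The core measurement in such an activity reduces to fitting the position-time data read off the video frames. A minimal sketch, with illustrative (not measured) numbers: the frame rate and pulse positions below are assumptions standing in for what students would extract frame by frame.

```python
import numpy as np

# Frame-by-frame positions of a pulse along a spring, as one might read
# off a slow-motion video (illustrative values, not measured data).
fps = 240.0                # typical smartphone slow-motion frame rate
frames = np.arange(10)
t = frames / fps           # time of each frame (s)
x = 0.02 + 3.5 * t         # pulse position (m); true speed is 3.5 m/s here

# Pulse speed = slope of the position-time graph (least-squares fit).
speed, intercept = np.polyfit(t, x, 1)
print(round(speed, 2))  # → 3.5
```

The same fit applied to real Tracker exports gives students a direct, quantitative handle on wave speed and its uncertainty.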
Martins, Júlia Caetano; Aguiar, Larissa Tavares; Nadeau, Sylvie; Scianni, Aline Alvim; Teixeira-Salmela, Luci Fuscaldi; Faria, Christina Danielli Coelho de Morais
2017-02-13
Self-report physical activity assessment tools are commonly used for the evaluation of physical activity levels in individuals with stroke. A great variety of these tools have been developed and widely used in recent years, which justifies the need to examine their measurement properties and clinical utility. Therefore, the main objectives of this systematic review are to examine the measurement properties and clinical utility of self-report measures of physical activity and to discuss the strengths and limitations of the identified tools. A systematic review of studies that investigated the measurement properties and/or clinical utility of self-report physical activity assessment tools in stroke will be conducted. Electronic searches will be performed in five databases: Medical Literature Analysis and Retrieval System Online (MEDLINE) (PubMed), Excerpta Medica Database (EMBASE), Physiotherapy Evidence Database (PEDro), Literatura Latino-Americana e do Caribe em Ciências da Saúde (LILACS), and Scientific Electronic Library Online (SciELO), followed by hand searches of the reference lists of the included studies. Two independent reviewers will screen all retrieved titles, abstracts, and full texts according to the inclusion criteria and will also extract the data. A third reviewer will be consulted to resolve any disagreements. A descriptive summary of the included studies will contain the design and participants, as well as the characteristics, measurement properties, and clinical utility of the self-report tools. The methodological quality of the studies will be evaluated using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist, and the clinical utility of the identified tools will be assessed considering predefined criteria. This systematic review will follow the Preferred Reporting Items for Systematic Review and Meta-Analyses (PRISMA) statement.
This systematic review will provide an extensive review of the measurement properties and clinical utility of self-report physical activity assessment tools used in individuals with stroke, which would benefit clinicians and researchers. PROSPERO CRD42016037146. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Goulding, F S; Stone, Y
1970-10-16
The past decade has seen the rapid development and exploitation of one of the most significant tools of nuclear physics, the semiconductor radiation detector. Applications of the device to the analysis of materials promise to be one of the major contributions of nuclear research to technology, and may even assist in some aspects of our environmental problems. In parallel with the development of these applications, further developments in detectors for nuclear research are taking place: the use of very thin detectors for heavy-ion identification, position-sensitive detectors for nuclear-reaction studies, and very pure germanium for making more satisfactory detectors for many applications suggest major future contributions to physics.
Investigation of priorities in water quality management based on correlations and variations.
Boyacıoğlu, Hülya; Gündogdu, Vildan; Boyacıoğlu, Hayal
2013-04-15
The development of water quality assessment strategies investigating spatial and temporal changes caused by natural and anthropogenic phenomena is an important tool in management practices. This paper used cluster analysis, the water quality index method, sensitivity analysis, and canonical correlation analysis to investigate priorities in pollution control activities. Data sets representing 22 surface water quality parameters were subject to analysis. Results revealed that organic pollution was a serious threat to overall water quality in the region. In addition, oil and grease, lead, and mercury were the critical variables violating the standard. In contrast to inorganic variables, organic and physical-inorganic chemical parameters were influenced by variations in physical conditions (discharge, temperature). This study showed that information produced from the variations and correlations in water quality data sets can be helpful for investigating priorities in water management activities. Moreover, statistical techniques and index methods are useful tools in the data-to-information transformation process. Copyright © 2013 Elsevier Ltd. All rights reserved.
Professional tools and a personal touch - experiences of physical therapy of persons with migraine.
Rutberg, Stina; Kostenius, Catrine; Öhrling, Kerstin
2013-09-01
The aim was to explore the lived experience of physical therapy of persons with migraine. Data were collected by conducting narrative interviews with 11 persons with migraine. Inspired by van Manen, a hermeneutic phenomenological method was used to analyse the experiences of physical therapy which these persons had. Physical therapy for persons with migraine meant making an effort in terms of time and energy to improve their health by meeting a person who was utilising his or her knowledge and skill to help. Being respected and treated as an individual and having confidence in the physical therapist were highlighted aspects. The analysis revealed a main theme, "meeting a physical therapist with professional tools and a personal touch". The main theme included four sub-themes, "investing time and energy to feel better", "relying on the competence of the physical therapist", "wanting to be treated and to become involved as an individual" and "being respected in a trustful relationship". The therapeutic relationship with the physical therapist is important and the findings of this study can increase awareness about relational aspects of physical therapy and encourage thoughtfulness among physical therapists and other healthcare professionals interacting with persons with migraine. Physical therapists use both professional tools and a personal touch in their interaction with persons with migraine and this article can increase physical therapists' awareness and encourage thoughtfulness in their professional practice. Being respected and treated as an individual and having confidence in the physical therapist are important aspects of the therapeutic relationship and indicate a need for patient-centred care. By making the effort of spending the time and energy required, physical therapy could be a complement or an alternative to medication to ease the consequences of migraine.
Guidelines for the analysis of free energy calculations
Klimovich, Pavel V.; Shirts, Michael R.; Mobley, David L.
2015-01-01
Free energy calculations based on molecular dynamics (MD) simulations show considerable promise for applications ranging from drug discovery to prediction of physical properties and structure-function studies. But these calculations are still difficult and tedious to analyze, and best practices for analysis are not well defined or propagated. Essentially, each group analyzing these calculations needs to decide how to conduct the analysis and, usually, develop its own analysis tools. Here, we review and recommend best practices for analysis yielding reliable free energies from molecular simulations. Additionally, we provide a Python tool, alchemical-analysis.py, freely available on GitHub at https://github.com/choderalab/pymbar-examples, that implements the analysis practices reviewed here for several reference simulation packages, and which can be adapted to handle data from other packages. Both this review and the tool cover analysis of alchemical calculations generally, including free energy estimates via both thermodynamic integration and free energy perturbation-based estimators. Our Python tool also handles output from multiple types of free energy calculations, including expanded ensemble and Hamiltonian replica exchange, as well as standard fixed ensemble calculations. We also survey a range of statistical and graphical ways of assessing the quality of the data and free energy estimates, and provide prototypes of these in our tool. We hope these tools and discussion will serve as a foundation for more standardization of and agreement on best practices for analysis of free energy calculations. PMID:25808134
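As a minimal illustration of one estimator this review covers, thermodynamic integration reduces to quadrature over per-window averages of dU/dλ. The window values below are hypothetical stand-ins for the averages an MD package would report at each lambda window; this is a sketch of the quadrature step only, not of the full workflow the paper's tool implements.

```python
import numpy as np

# Thermodynamic integration: ΔF = ∫₀¹ ⟨dU/dλ⟩ dλ, estimated by the
# trapezoid rule over a set of lambda windows.
lam = np.linspace(0.0, 1.0, 11)
dudl = 10.0 * (1.0 - lam)   # hypothetical per-window averages (kJ/mol)

# Trapezoid-rule estimate of the free energy difference.
delta_F = np.sum((dudl[1:] + dudl[:-1]) / 2.0 * np.diff(lam))

print(round(delta_F, 6))  # → 5.0
```

Real analyses also need uncertainty estimates and checks of window overlap, which is exactly the kind of diagnostic the review surveys.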
Computational mechanics and physics at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
South, Jerry C., Jr.
1987-01-01
An overview is given of computational mechanics and physics at NASA Langley Research Center. Computational analysis is a major component and tool in many of Langley's diverse research disciplines, as well as in the interdisciplinary research. Examples are given for algorithm development and advanced applications in aerodynamics, transition to turbulence and turbulence simulation, hypersonics, structures, and interdisciplinary optimization.
Arduino-Based Data Acquisition into Excel, LabVIEW, and MATLAB
ERIC Educational Resources Information Center
Nichols, Daniel
2017-01-01
Data acquisition equipment for physics can be quite expensive. As an alternative, data can be acquired using a low-cost Arduino microcontroller. The Arduino has been used in physics labs where the data are acquired using the Arduino software. The Arduino software, however, does not contain a suite of tools for data fitting and analysis. The data…
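A sketch of the alternative route this record points toward: reading the Arduino's serial stream from Python, where fitting and analysis tools are plentiful. The two-column CSV line format and the port name are assumptions (adapt them to whatever the sketch's Serial.println() actually emits), and a StringIO stands in for the serial port so the example runs without hardware.

```python
import io

def parse_reading(line: str):
    """Parse one 'time_ms,voltage' line streamed by a typical Arduino sketch.

    The two-column CSV format is an assumption; adjust to the sketch's
    actual Serial.println() output.
    """
    t_ms, volts = line.strip().split(",")
    return int(t_ms), float(volts)

# In a real setup the stream would come from pyserial, e.g.:
#   import serial
#   port = serial.Serial("/dev/ttyACM0", 9600)   # hypothetical port name
# Here a StringIO stands in so the example runs without hardware.
stream = io.StringIO("0,0.00\n10,1.25\n20,2.48\n")
readings = [parse_reading(line) for line in stream]

print(readings)  # → [(0, 0.0), (10, 1.25), (20, 2.48)]
```

Once the readings are in Python, NumPy or SciPy can supply the curve-fitting and analysis suite the Arduino software lacks.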
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacDonald, Douglas G.; Clements, Samuel L.; Patrick, Scott W.
Securing high value and critical assets is one of the biggest challenges facing this nation and others around the world. In modern integrated systems, there are four potential modes of attack available to an adversary: • physical only attack, • cyber only attack, • physical-enabled cyber attack, • cyber-enabled physical attack. Blended attacks involve an adversary working in one domain to reduce system effectiveness in another domain. This enables the attacker to penetrate further into the overall layered defenses. Existing vulnerability assessment (VA) processes and software tools which predict facility vulnerabilities typically evaluate the physical and cyber domains separately. Vulnerabilities which result from the integration of cyber-physical control systems are not well characterized and are often overlooked by existing assessment approaches. In this paper, we modified the timely detection methodology, used for decades in physical security VAs, to include cyber components. The Physical and Cyber Risk Analysis Tool (PACRAT) prototype illustrates an integrated vulnerability assessment that includes cyber-physical interdependencies. Information about facility layout, network topology, and emplaced safeguards is used to evaluate how well suited a facility is to detect, delay, and respond to attacks, to identify the pathways most vulnerable to attack, and to evaluate how often safeguards are compromised for a given threat or adversary type. We have tested the PACRAT prototype on critical infrastructure facilities and the results are promising. Future work includes extending the model to prescribe recommended security improvements via an automated cost-benefit analysis.
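The timely-detection idea underlying such assessments can be sketched numerically: detection at a layer only leads to interruption if enough adversary delay remains for responders to arrive. The function below is an illustrative simplification, not the PACRAT algorithm; the layer probabilities and delays are invented:

```python
def probability_of_interruption(layers, response_time):
    """layers: ordered list of (p_detect, delay_after_layer) along the
    adversary's path. Detection at a layer counts as timely only if the
    delay remaining from that layer onward meets the response time."""
    # remaining delay from each layer to the end of the path
    remaining, total = [], 0.0
    for _, d in reversed(layers):
        total += d
        remaining.append(total)
    remaining.reverse()

    p_none_so_far = 1.0   # probability of no detection at earlier layers
    p_interrupt = 0.0
    for (p, _), rem in zip(layers, remaining):
        if rem >= response_time:          # timely detection point
            p_interrupt += p_none_so_far * p
        p_none_so_far *= (1 - p)
    return p_interrupt

# two layers: 90% detection with 60 s of delay left, then 50% with 30 s
print(probability_of_interruption([(0.9, 60), (0.5, 30)], 50))  # 0.9
```

With a faster response force (e.g. 20 s), the second layer also becomes timely and the probability rises accordingly.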
Using Image Modelling to Teach Newton's Laws with the Ollie Trick
ERIC Educational Resources Information Center
Dias, Marco Adriano; Carvalho, Paulo Simeão; Vianna, Deise Miranda
2016-01-01
Image modelling is a video-based teaching tool that combines strobe images and video analysis. This tool enables a qualitative and a quantitative approach to the teaching of physics, in a much more engaging and appealing way than traditional expository practice. In a specific scenario shown in this paper, the Ollie trick, we…
Improving Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-02-01
New test procedure evaluates quality and accuracy of energy analysis tools for the residential building retrofit market. Reducing the energy use of existing homes in the United States offers significant energy-saving opportunities, which can be identified through building simulation software tools that calculate optimal packages of efficiency measures. To improve the accuracy of energy analysis for residential buildings, the National Renewable Energy Laboratory's (NREL) Buildings Research team developed the Building Energy Simulation Test for Existing Homes (BESTEST-EX), a method for diagnosing and correcting errors in building energy audit software and calibration procedures. BESTEST-EX consists of building physics and utility bill calibration test cases, which software developers can use to compare their tools' simulation findings to reference results generated with state-of-the-art simulation tools. Overall, the BESTEST-EX methodology: (1) tests software predictions of retrofit energy savings in existing homes; (2) ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX is helping software developers identify and correct bugs in their software, as well as develop and test utility bill calibration procedures.
RESPIZZI, STEFANO; COVELLI, ELISABETTA
2015-01-01
The emotional coaching model uses quantitative and qualitative elements to demonstrate some assumptions relevant to new methods of treatment in physical rehabilitation, considering emotional, cognitive and behavioral aspects in patients, whether or not they are sportsmen. Through quantitative tools (Tampa Kinesiophobia Scale, Emotional Interview Test, Previous Re-Injury Test, and reports on test scores) and qualitative tools (training contracts and relationships of emotional alliance or “contagion”), we investigate initial assumptions regarding: the presence of a cognitive and emotional mental state of impasse in patients at the beginning of the rehabilitation pathway; the curative value of the emotional alliance or “emotional contagion” relationship between healthcare provider and patient; the link between the patient’s pathology and type of contact with his own body and emotions; analysis of the psychosocial variables for the prediction of possible cases of re-injury for patients who have undergone or are afraid to undergo reconstruction of the anterior cruciate ligament (ACL). Although this approach is still in the experimental stage, the scores of the administered tests show the possibility of integrating quantitative and qualitative tools to investigate and develop a patient’s physical, mental and emotional resources during the course of his rehabilitation. Furthermore, it seems possible to identify many elements characterizing patients likely to undergo episodes of re-injury or to withdraw totally from sporting activity. In particular, such patients are competitive athletes, who fear or have previously undergone ACL reconstruction. 
The theories referred to (the transactional analysis theory, self-determination theory) and the tools used demonstrate the usefulness of continuing this research in order to build a shared coaching model treatment aimed at all patients, sportspeople or otherwise, which is not only physical but also emotional, cognitive and behavioral. PMID:26904525
Comparing Educational Tools Using Activity Theory: Clickers and Flashcards
NASA Astrophysics Data System (ADS)
Price, Edward; De Leone, Charles; Lasry, Nathaniel
2010-10-01
Physics educators and researchers have recently begun to distinguish between pedagogical approaches and the educational technologies that are used to implement them. For instance, peer instruction has been shown to be equally effective, in terms of student learning outcomes, when implemented with clickers or flashcards. Therefore, technological tools (clickers and flashcards) can be viewed as means to mediate pedagogical techniques (peer instruction or traditional instruction). In this paper, we use activity theory to examine peer instruction, with particular attention to the role of tools. This perspective helps clarify clickers' and flashcards' differences, similarities, impacts in the classroom, and utility to education researchers. Our analysis can suggest improvements and new uses. Finally, we propose activity theory as a useful approach in understanding and improving the use of technology in the physics classroom.
Nakamura, Shinichiro; Kondo, Yasushi; Matsubae, Kazuyo; Nakajima, Kenichi; Nagasaka, Tetsuya
2011-02-01
Identification of the flow of materials and substances associated with a product system provides useful information for Life Cycle Analysis (LCA), and contributes to extending the scope of complementarity between LCA and Materials Flow Analysis/Substances Flow Analysis (MFA/SFA), the two major tools of industrial ecology. This paper proposes a new methodology based on input-output analysis for identifying the physical input-output flow of individual materials that is associated with the production of a unit of given product, the unit physical input-output by materials (UPIOM). While the Sankey diagram has been a standard tool for the visualization of MFA/SFA, with an increase in the complexity of the flows under consideration, which will be the case when economy-wide intersectoral flows of materials are involved, the Sankey diagram may become too complex for effective visualization. An alternative way to visually represent material flows is proposed which makes use of triangulation of the flow matrix based on degrees of fabrication. The proposed methodology is applied to the flow of pig iron and iron and steel scrap that are associated with the production of a passenger car in Japan. Its usefulness to identify a specific MFA pattern from the original IO table is demonstrated.
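The input-output core of such a method rests on the Leontief inverse, which converts final demand into the total (direct plus indirect) output required from each sector. A toy two-sector sketch follows; the coefficients are invented for illustration and are not from the paper:

```python
# Toy two-sector example (hypothetical numbers): compute total output
# required per unit of final demand via the Leontief inverse (I - A)^-1,
# the standard building block behind physical input-output flow analysis.
def leontief_inverse_2x2(A):
    """Closed-form (I - A)^-1 for a 2x2 technical coefficient matrix."""
    a, b = A[0]
    c, d = A[1]
    m = [[1 - a, -b], [-c, 1 - d]]                     # I - A
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return [[ m[1][1] / det, -m[0][1] / det],
            [-m[1][0] / det,  m[0][0] / det]]

A = [[0.1, 0.2],   # column j = inputs required per unit of output j
     [0.3, 0.0]]
L = leontief_inverse_2x2(A)
# total output of each sector needed for one unit of final demand of good 0
x = [L[0][0], L[1][0]]
```

A real UPIOM computation works with many sectors and material-specific flow matrices, but the per-unit logic is the same.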
Development of risk assessment tool for foundry workers.
Mohan, G Madhan; Prasad, P S S; Mokkapati, Anil Kumar; Venkataraman, G
2008-01-01
Occupational ill-health and work-related disorders are predominant in manufacturing industries due to the inevitable presence of manual work even after several waves of industrial automation and technological advancement. Ergonomic risk factors and musculoskeletal disorders such as low-back symptoms have been noted amongst foundry workers. The purpose of this study was to formulate and develop a Physical Effort Index to assess risk factors. A questionnaire tool applicable to the foundry environment was designed and validated. The data recorded through surveys across the foundries were subjected to regression analysis to correlate the proposed Physical Effort Index with the standard Borg Ratings of Perceived Exertion (RPE) scale. The physical efforts of sixty-seven workers on various foundry shop floors were assessed subjectively. 'Job factors' and 'work environment' were the two major parameters considered in assessing worker discomfort level at the workplace. A relation between the Borg RPE scale and the above two parameters was arrived at through regression analysis. The study demonstrates the prevalence of risk factors amongst foundry workers and the effectiveness of the proposed index in estimating risk factor levels. RELEVANCE TO THE INDUSTRY: The proposed tool will assist foundry supervisors and managers in assessing risk factors and help them better understand the workplace to avoid work-related disorders, ensuring better output.
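The regression step that links a proposed index to Borg RPE ratings can be sketched with ordinary least squares. The data below are invented for illustration and are not the study's measurements:

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept, the kind of fit used
    to correlate an effort index with Borg RPE ratings."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return slope, my - slope * mx

index = [2.0, 3.0, 4.0, 5.0]     # hypothetical effort-index scores
rpe   = [9.0, 11.0, 13.0, 15.0]  # hypothetical Borg RPE ratings
slope, intercept = fit_line(index, rpe)
print(slope, intercept)  # 2.0 5.0
```

In practice the fit would also report a correlation coefficient to judge how well the index tracks perceived exertion.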
Mogol, Burçe Ataç; Gökmen, Vural
2014-05-01
Computer vision-based image analysis has been widely used in food industry to monitor food quality. It allows low-cost and non-contact measurements of colour to be performed. In this paper, two computer vision-based image analysis approaches are discussed to extract mean colour or featured colour information from the digital images of foods. These types of information may be of particular importance as colour indicates certain chemical changes or physical properties in foods. As exemplified here, the mean CIE a* value or browning ratio determined by means of computer vision-based image analysis algorithms can be correlated with acrylamide content of potato chips or cookies. Or, porosity index as an important physical property of breadcrumb can be calculated easily. In this respect, computer vision-based image analysis provides a useful tool for automatic inspection of food products in a manufacturing line, and it can be actively involved in the decision-making process where rapid quality/safety evaluation is needed. © 2013 Society of Chemical Industry.
ERIC Educational Resources Information Center
Shin, Shin-Shing
2016-01-01
Students attending object-oriented analysis and design (OOAD) courses typically encounter difficulties transitioning from requirements analysis to logical design and then to physical design. Concept maps have been widely used in studies of user learning. The study reported here, based on the relationship of concept maps to learning theory and…
Communication Analysis of Information Complexes.
ERIC Educational Resources Information Center
Malik, M. F.
Communication analysis is a tool for perceptual assessment of existing or projected information complexes, i.e., an established reality perceived by one or many humans. An information complex could be of a physical nature, such as a building, landscape, city street; or of a pure informational nature, such as a film, television program,…
Development of materials for the rapid manufacture of die cast tooling
NASA Astrophysics Data System (ADS)
Hardro, Peter Jason
The focus of this research is to develop a material composition that can be processed by rapid prototyping (RP) in order to produce tooling for the die casting process. These rapidly produced tools would be superior to tools made by traditional production methods by offering one or more of the following advantages: reduced tooling cost, shortened tooling creation time, reduced man-hours for tool creation, increased tool life, and shortened die casting cycle time. By utilizing RP's additive build process and vast material selection, there was a prospect that die cast tooling might be produced more quickly and with superior material properties. To this end, the material properties that influence die life and cycle time were determined, and a list of materials that fulfill these "optimal" properties was highlighted. Physical testing was conducted in order to grade the processability of each of the material systems and to optimize the manufacturing process for the downselected material system. Sample specimens were produced, and microscopy techniques were utilized to determine a number of physical properties of the material system. Additionally, a benchmark geometry was selected and die casting dies were produced from traditional tool materials (H13 steel) and techniques (machining) and from the newly developed materials and RP techniques (selective laser sintering (SLS) and laser engineered net shaping (LENS)). Once the tools were created, a die cast alloy was selected and a preset number of parts were shot into each tool. During tool creation, the manufacturing time and cost were closely monitored, and an economic model was developed to compare traditional tooling to RP tooling. This model allows one to determine, in the early design stages, when it is advantageous to implement RP tooling and when traditional tooling would be best.
The results of the physical testing and economic analysis have shown that RP tooling is able to achieve a number of the research objectives, namely reduced tooling cost, shortened tooling creation time, and reduced man-hours for tool creation. However, identifying the appropriate time to use RP tooling appears to be the most important aspect of achieving successful implementation.
Unified Approach to Modeling and Simulation of Space Communication Networks and Systems
NASA Technical Reports Server (NTRS)
Barritt, Brian; Bhasin, Kul; Eddy, Wesley; Matthews, Seth
2010-01-01
Network simulator software tools are often used to model the behaviors and interactions of applications, protocols, packets, and data links in terrestrial communication networks. Other software tools that model the physics, orbital dynamics, and RF characteristics of space systems have matured to allow for rapid, detailed analysis of space communication links. However, the absence of a unified toolset that integrates the two modeling approaches has encumbered the systems engineers tasked with the design, architecture, and analysis of complex space communication networks and systems. This paper presents the unified approach and describes the motivation, challenges, and our solution: the customization of the network simulator to integrate with astronautical analysis software tools for high-fidelity end-to-end simulation. Keywords: space; communication; systems; networking; simulation; modeling; QualNet; STK; integration; space networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruebel, Oliver
2009-11-20
Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the increasing number of data dimensions and data objects is presenting tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data, and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable for the first time measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The analysis framework has been integrated with MATLAB and with the visualization, making advanced analysis tools accessible to biologists and enabling bioinformatics researchers to integrate their analyses directly with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics.
To gain insight into the complex physical processes of particle acceleration, physicists model LWFAs computationally. The datasets produced by LWFA simulations are (i) extremely large, (ii) of varying spatial and temporal resolution, (iii) heterogeneous, and (iv) high-dimensional, making analysis and knowledge discovery from complex LWFA simulation data a challenging task. To address these challenges this thesis describes the integration of the visualization system VisIt and the state-of-the-art index/query system FastBit, enabling interactive visual exploration of extremely large three-dimensional particle datasets. Researchers are especially interested in beams of high-energy particles formed during the course of a simulation. This thesis describes novel methods for automatic detection and analysis of particle beams enabling a more accurate and efficient data analysis process. By integrating these automated analysis methods with visualization, this research enables more accurate, efficient, and effective analysis of LWFA simulation data than previously possible.
Increasing physical activity with mobile devices: a meta-analysis.
Fanning, Jason; Mullen, Sean P; McAuley, Edward
2012-11-21
Regular physical activity has established physical and mental health benefits; however, merely one quarter of the U.S. adult population meets national physical activity recommendations. In an effort to engage individuals who do not meet these guidelines, researchers have utilized popular emerging technologies, including mobile devices (i.e., personal digital assistants [PDAs], mobile phones). This study is the first to synthesize current research focused on the use of mobile devices for increasing physical activity. Our objective was to conduct a meta-analysis of research utilizing mobile devices to influence physical activity behavior. The aims of this review were to: (1) examine the efficacy of mobile devices in the physical activity setting, (2) explore and discuss implementation of device features across studies, and (3) make recommendations for future intervention development. We searched electronic databases (PubMed, PsycINFO, SCOPUS) and identified publications through reference lists and requests to experts in the field of mobile health. Studies were included that provided original data and aimed to influence physical activity through dissemination or collection of intervention materials with a mobile device. Data were extracted to calculate effect sizes for individual studies, as were study descriptives. A random-effects meta-analysis was conducted using the Comprehensive Meta-Analysis software suite. Study quality was assessed using the quality-of-execution portion of the Guide to Community Preventive Services data extraction form. Four studies were of "good" quality and seven of "fair" quality. In total, 1351 individuals participated in 11 unique studies, from which 18 effects were extracted and synthesized, yielding an overall weighted mean effect size of g = 0.54 (95% CI = 0.17 to 0.91, P = .01). Research utilizing mobile devices is gaining in popularity, and this study suggests that this platform is an effective means of influencing physical activity behavior.
Our focus must be on the best possible use of these tools to measure and understand behavior. Therefore, theoretically grounded behavior change interventions that recognize and act on the potential of smartphone technology could provide investigators with an effective tool for increasing physical activity.
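The pooling step behind a summary estimate like g = 0.54 can be illustrated with a simplified fixed-effect inverse-variance mean (the study itself used a random-effects model; the numbers below are invented):

```python
import math

def inverse_variance_mean(effects, variances):
    """Fixed-effect inverse-variance pooled effect size with a 95% CI --
    the basic pooling step of a meta-analysis. A random-effects model
    would additionally widen the variances by a between-study component."""
    weights = [1 / v for v in variances]
    g = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return g, (g - 1.96 * se, g + 1.96 * se)

g, ci = inverse_variance_mean([0.5, 0.6], [0.04, 0.04])
print(round(g, 2))  # 0.55
```

Precise studies (small variances) receive proportionally larger weights in the pooled estimate.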
Identifying items to assess methodological quality in physical therapy trials: a factor analysis.
Armijo-Olivo, Susan; Cummings, Greta G; Fuentes, Jorge; Saltaji, Humam; Ha, Christine; Chisholm, Annabritt; Pasichnyk, Dion; Rogers, Todd
2014-09-01
Numerous tools and individual items have been proposed to assess the methodological quality of randomized controlled trials (RCTs). The frequency of use of these items varies according to health area, which suggests a lack of agreement regarding their relevance to trial quality or risk of bias. The objectives of this study were: (1) to identify the underlying component structure of items and (2) to determine relevant items to evaluate the quality and risk of bias of trials in physical therapy by using an exploratory factor analysis (EFA). A methodological research design was used, and an EFA was performed. Randomized controlled trials used for this study were randomly selected from searches of the Cochrane Database of Systematic Reviews. Two reviewers used 45 items gathered from 7 different quality tools to assess the methodological quality of the RCTs. An exploratory factor analysis was conducted using the principal axis factoring (PAF) method followed by varimax rotation. Principal axis factoring identified 34 items loaded on 9 common factors: (1) selection bias; (2) performance and detection bias; (3) eligibility, intervention details, and description of outcome measures; (4) psychometric properties of the main outcome; (5) contamination and adherence to treatment; (6) attrition bias; (7) data analysis; (8) sample size; and (9) control and placebo adequacy. Because of the exploratory nature of the results, a confirmatory factor analysis is needed to validate this model. To the authors' knowledge, this is the first factor analysis to explore the underlying component items used to evaluate the methodological quality or risk of bias of RCTs in physical therapy. The items and factors represent a starting point for evaluating the methodological quality and risk of bias in physical therapy trials. Empirical evidence of the association among these items with treatment effects and a confirmatory factor analysis of these results are needed to validate these items. 
© 2014 American Physical Therapy Association.
Guidelines for the analysis of free energy calculations.
Klimovich, Pavel V; Shirts, Michael R; Mobley, David L
2015-05-01
Free energy calculations based on molecular dynamics simulations show considerable promise for applications ranging from drug discovery to prediction of physical properties and structure-function studies. But these calculations are still difficult and tedious to analyze, and best practices for analysis are not well defined or propagated. Essentially, each group analyzing these calculations needs to decide how to conduct the analysis and, usually, develop its own analysis tools. Here, we review and recommend best practices for analysis yielding reliable free energies from molecular simulations. Additionally, we provide a Python tool, alchemical-analysis.py, freely available on GitHub as part of the pymbar package (located at http://github.com/choderalab/pymbar), that implements the analysis practices reviewed here for several reference simulation packages, which can be adapted to handle data from other packages. Both this review and the tool cover analysis of alchemical calculations generally, including free energy estimates via both thermodynamic integration and free energy perturbation-based estimators. Our Python tool also handles output from multiple types of free energy calculations, including expanded ensemble and Hamiltonian replica exchange, as well as standard fixed ensemble calculations. We also survey a range of statistical and graphical ways of assessing the quality of the data and free energy estimates, and provide prototypes of these in our tool. We hope this tool and discussion will serve as a foundation for more standardization of and agreement on best practices for analysis of free energy calculations.
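One of the estimators discussed, thermodynamic integration, reduces to numerical quadrature of the mean dH/dλ over the coupling parameter. A minimal trapezoid-rule sketch follows; the window values are invented and this is not output of alchemical-analysis.py:

```python
def thermodynamic_integration(lambdas, dhdl_means):
    """Trapezoid-rule estimate of a free energy difference from mean
    dH/dlambda values sampled at each lambda window."""
    dG = 0.0
    for i in range(len(lambdas) - 1):
        # area of one trapezoid between adjacent lambda windows
        dG += 0.5 * (dhdl_means[i] + dhdl_means[i + 1]) * (lambdas[i + 1] - lambdas[i])
    return dG

# three hypothetical windows with decreasing mean dH/dlambda
print(thermodynamic_integration([0.0, 0.5, 1.0], [10.0, 6.0, 2.0]))  # 6.0
```

Real analyses also estimate uncertainties from the correlation time of the dH/dλ time series, one of the practices the review emphasizes.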
Development of wavelet analysis tools for turbulence
NASA Technical Reports Server (NTRS)
Bertelrud, A.; Erlebacher, G.; Dussouillez, PH.; Liandrat, M. P.; Liandrat, J.; Bailly, F. Moret; Tchamitchian, PH.
1992-01-01
Presented here is the general framework and the initial results of a joint effort to derive novel research tools and easy to use software to analyze and model turbulence and transition. Given here is a brief review of the issues, a summary of some basic properties of wavelets, and preliminary results. Technical aspects of the implementation, the physical conclusions reached at this time, and current developments are discussed.
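The basic building block such wavelet tools start from can be shown with a single Haar transform level, which splits a signal into a smooth part and a fluctuating part. This is an illustrative sketch only; the tools described in the paper are far more elaborate:

```python
import math

def haar_step(signal):
    """One level of the Haar wavelet transform: pairwise scaled sums
    capture the smooth (approximation) part, pairwise scaled differences
    capture the fluctuations (detail) -- the kind of scale separation
    wavelet-based turbulence analysis relies on. Length must be even."""
    s = 1 / math.sqrt(2)
    approx = [s * (signal[i] + signal[i + 1]) for i in range(0, len(signal), 2)]
    detail = [s * (signal[i] - signal[i + 1]) for i in range(0, len(signal), 2)]
    return approx, detail

a, d = haar_step([4.0, 2.0, 5.0, 5.0])
# the second pair is constant, so its detail coefficient vanishes
```

Recursively applying the step to the approximation coefficients yields the multi-scale decomposition used to localize turbulent structures.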
Spec Tool; an online education and research resource
NASA Astrophysics Data System (ADS)
Maman, S.; Shenfeld, A.; Isaacson, S.; Blumberg, D. G.
2016-06-01
Education and public outreach (EPO) activities related to remote sensing, space, planetary and geo-physics sciences have been developed widely in the Earth and Planetary Image Facility (EPIF) at Ben-Gurion University of the Negev, Israel. These programs aim to motivate the learning of geo-scientific and technological disciplines. For over a decade, the facility has hosted research and outreach activities for researchers, the local community, school pupils, students and educators. As software and data are neither available nor affordable, the EPIF Spec tool was created as a web-based resource to assist researchers and students with initial spectral analysis. The tool is used both in academic courses and in outreach education programs and enables a better understanding of the theory of spectroscopy and imaging spectroscopy in a 'hands-on' activity. This tool is available online and provides spectra visualization tools and basic analysis algorithms, including spectral plotting, spectral angle mapping and linear unmixing. The tool enables visualization of spectral signatures from the USGS spectral library as well as additional spectra collected in the EPIF, such as of dunes in southern Israel and from Turkmenistan. For researchers and educators, the tool allows loading locally collected samples for further analysis.
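Of the algorithms listed, spectral angle mapping is the simplest to sketch. The function below is the generic SAM formula, not the EPIF implementation:

```python
import math

def spectral_angle(ref, sample):
    """Spectral angle (radians) between two spectra -- the core of
    spectral angle mapping (SAM). A small angle means similar spectral
    shape, independent of overall brightness."""
    dot = sum(r * s for r, s in zip(ref, sample))
    nr = math.sqrt(sum(r * r for r in ref))
    ns = math.sqrt(sum(s * s for s in sample))
    # clamp for numerical safety before acos
    return math.acos(max(-1.0, min(1.0, dot / (nr * ns))))

# a spectrum scaled by a constant factor has angle ~0 to the original
print(round(spectral_angle([0.2, 0.5, 0.9], [0.4, 1.0, 1.8]), 6))  # 0.0
```

Classification then assigns each pixel to the library spectrum with the smallest angle.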
Professional tools and a personal touch – experiences of physical therapy of persons with migraine
Kostenius, Catrine; Öhrling, Kerstin
2013-01-01
Purpose: The aim was to explore the lived experience of physical therapy of persons with migraine. Method: Data were collected by conducting narrative interviews with 11 persons with migraine. Inspired by van Manen, a hermeneutic phenomenological method was used to analyse the experiences of physical therapy which these persons had. Results: Physical therapy for persons with migraine meant making an effort in terms of time and energy to improve their health by meeting a person who was utilising his or her knowledge and skill to help. Being respected and treated as an individual and having confidence in the physical therapist were highlighted aspects. The analysis revealed a main theme, “meeting a physical therapist with professional tools and a personal touch”. The main theme included four sub-themes, “investing time and energy to feel better”, “relying on the competence of the physical therapist”, “wanting to be treated and to become involved as an individual” and “being respected in a trustful relationship”. Conclusions: The therapeutic relationship with the physical therapist is important and the findings of this study can increase awareness about relational aspects of physical therapy and encourage thoughtfulness among physical therapists and other healthcare professionals interacting with persons with migraine. PMID:23311671
Exploring JavaScript and ROOT technologies to create Web-based ATLAS analysis and monitoring tools
NASA Astrophysics Data System (ADS)
Sánchez Pineda, A.
2015-12-01
We explore the potential of current web applications to create online interfaces that allow visualization, interaction, and real cut-based physics analysis and monitoring of processes through a web browser. The project consists of the initial development of web-based and cloud computing services to allow students and researchers to perform fast and very useful cut-based analyses in a browser, reading and using real data and official Monte Carlo simulations stored in ATLAS computing facilities. Several tools are considered: ROOT, JavaScript and HTML. Our study case is the current cut-based H → ZZ → llqq analysis of the ATLAS experiment. Preliminary but satisfactory results have been obtained online.
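A cut-based selection of the kind such an interface performs reduces to filtering events by a list of predicates. This is a generic sketch with hypothetical observable names, not the actual H → ZZ → llqq selection:

```python
def apply_cuts(events, cuts):
    """Keep events (dicts of observables) that pass every cut.
    Each cut is a predicate taking an event and returning True/False."""
    return [e for e in events if all(cut(e) for cut in cuts)]

# hypothetical observables: dilepton mass and leading-lepton pT
events = [{"m_ll": 91.0, "pt_lead": 45.0},
          {"m_ll": 60.0, "pt_lead": 50.0}]
cuts = [lambda e: 80.0 < e["m_ll"] < 100.0,  # Z-mass window
        lambda e: e["pt_lead"] > 20.0]        # leading-lepton pT cut
print(len(apply_cuts(events, cuts)))  # 1
```

Expressing cuts as data rather than hard-coded conditions is what makes a browser front end able to let users adjust them interactively.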
Submarine pipeline on-bottom stability. Volume 2: Software and manuals
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-12-01
The state-of-the-art in pipeline stability design has been changing very rapidly in recent years. The physics governing on-bottom stability are much better understood now than they were eight years ago, largely because of research and large-scale model tests sponsored by PRCI. Analysis tools utilizing this new knowledge have been developed. These tools provide the design engineer with a rational approach for weight coating design, which can be used with confidence because the tools have been developed based on full scale and near full scale model tests. These tools represent the state-of-the-art in stability design and model the complex behavior of pipes subjected to both wave and current loads. These include: hydrodynamic forces which account for the effect of the wake (generated by flow over the pipe) washing back and forth over the pipe in oscillatory flow; and the embedment (digging) which occurs as a pipe resting on the seabed is exposed to oscillatory loadings and small oscillatory deflections. This report has been developed as a reference handbook for use in on-bottom pipeline stability analysis. It consists of two volumes. Volume one is devoted to descriptions of the various aspects of the problem: the pipeline design process; ocean physics, wave mechanics, hydrodynamic forces, and meteorological data determination; geotechnical data collection and soil mechanics; and stability design procedures. Volume two describes, lists, and illustrates the analysis software. Diskettes containing the software and examples of the software are also included in Volume two.
Network monitoring in the Tier2 site in Prague
NASA Astrophysics Data System (ADS)
Eliáš, Marek; Fiala, Lukáš; Horký, Jiří; Chudoba, Jiří; Kouba, Tomáš; Kundrát, Jan; Švec, Jan
2011-12-01
Network monitoring provides different types of view of network traffic. Its output enables computing centre staff to make qualified decisions about changes in the organization of the computing centre network and to spot possible problems. In this paper we present the network monitoring framework used at the Tier-2 site in Prague at the Institute of Physics (FZU). The framework consists of standard software and custom tools. We discuss our system for hardware failure detection using syslog logging and Nagios active checks, bandwidth monitoring of physical links, and analysis of NetFlow exports from Cisco routers. We present a tool for automatic detection of network layout based on SNMP. This tool also records topology changes into an SVN repository. An adapted weathermap4rrd is used to visualize recorded data to get a fast overview of current bandwidth usage of links in the network.
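The NetFlow analysis step can be sketched as aggregation of flow records into per-link byte counts, the kind of summary a weathermap visualizes. The record format below is illustrative, not the actual Cisco export format:

```python
from collections import defaultdict

def bytes_per_link(flows):
    """Aggregate simplified flow records (src, dst, byte_count) into
    total bytes per directed link."""
    totals = defaultdict(int)
    for src, dst, nbytes in flows:
        totals[(src, dst)] += nbytes
    return dict(totals)

flows = [("r1", "r2", 1000),
         ("r1", "r2", 500),
         ("r2", "r3", 200)]
print(bytes_per_link(flows))  # {('r1', 'r2'): 1500, ('r2', 'r3'): 200}
```

Dividing the totals by the collection interval turns byte counts into the bandwidth figures displayed on the map.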
ERIC Educational Resources Information Center
Pill, Shane; Harvey, Stephen; Hyndman, Brendon
2017-01-01
This paper examines the use of the microblogging platform Twitter as a tool for research in physical education. The research examined teacher use of game-based approaches (GBAs). A rolling Twitter conversation hosted over the course of 12 hours provided the data for the study. Participants were from 18 countries and they contributed on average…
ERIC Educational Resources Information Center
Erduran, Sibel
Eight physical science textbooks were analyzed for coverage on acids, bases, and neutralization. At the level of the text, clarity and coherence of statements were investigated. The conceptual framework for this topic was represented in a concept map which was used as a coding tool for tracing concepts and links present in textbooks. Cognitive…
ERIC Educational Resources Information Center
Laws, Priscilla W.; Willis, Maxine C.; Sokoloff, David R.
2015-01-01
This article describes the 25-year history of development of the activity-based Workshop Physics (WP) at Dickinson College, its adaptation for use at Gettysburg Area High School, and its synergistic influence on curricular materials developed at the University of Oregon and Tufts University and vice versa. WP and these related curricula: 1) are…
Carling, Christopher; Bloomfield, Jonathan; Nelsen, Lee; Reilly, Thomas
2008-01-01
The optimal physical preparation of elite soccer (association football) players has become an indispensable part of the professional game, especially due to the increased physical demands of match-play. The monitoring of players' work rate profiles during competition is now feasible through computer-aided motion analysis. Traditional methods of motion analysis were extremely labour intensive and were largely restricted to university-based research projects. Recent technological developments have meant that sophisticated systems, capable of quickly recording and processing the data of all players' physical contributions throughout an entire match, are now being used in elite club environments. In recognition of the important role that motion analysis now plays as a tool for measuring the physical performance of soccer players, this review critically appraises various motion analysis methods currently employed in elite soccer and explores research conducted using these methods. This review therefore aims to increase the awareness of both practitioners and researchers of the various motion analysis systems available, and identify practical implications of the established body of knowledge, while highlighting areas that require further exploration.
Numerical simulations for active tectonic processes: increasing interoperability and performance
NASA Technical Reports Server (NTRS)
Donnellan, A.; Fox, G.; Rundle, J.; McLeod, D.; Tullis, T.; Grant, L.
2002-01-01
The objective of this project is to produce a system to fully model earthquake-related data. This task develops simulation and analysis tools to study the physics of earthquakes using state-of-the-art modeling.
Understanding Least Squares through Monte Carlo Calculations
ERIC Educational Resources Information Center
Tellinghuisen, Joel
2005-01-01
The method of least squares (LS) is considered as an important data analysis tool available to physical scientists. The mathematics of linear least squares (LLS) is summarized in a very compact matrix notation that renders it practically "formulaic".
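The pairing of "formulaic" LLS with Monte Carlo calculations can be illustrated with a short sketch (generic, not the article's own code): fit a straight line by the closed-form normal equations, then regenerate the noisy data many times and watch the scatter of the fitted slope approach the analytic LLS standard error.

```python
import random
import statistics

def fit_line(xs, ys):
    """Closed-form linear least squares for y = a + b*x (normal equations)."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

# Monte Carlo: synthetic data with known truth and Gaussian noise.
random.seed(1)
xs = [float(x) for x in range(10)]
true_a, true_b, sigma = 1.0, 2.0, 0.5
slopes = []
for _ in range(2000):
    ys = [true_a + true_b * x + random.gauss(0.0, sigma) for x in xs]
    slopes.append(fit_line(xs, ys)[1])

# The slope estimates centre on the true value, and their spread matches
# the analytic LLS error sigma / sqrt(sum((x - xbar)^2)) ~= 0.055 here.
print(statistics.mean(slopes))
print(statistics.stdev(slopes))
```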
Application of a faith-based integration tool to assess mental and physical health interventions.
Saunders, Donna M; Leak, Jean; Carver, Monique E; Smith, Selina A
2017-01-01
To build on current research involving faith-based interventions (FBIs) for addressing mental and physical health, this study a) reviewed the extent to which relevant publications integrate faith concepts with health and b) initiated analysis of the degree of FBI integration with intervention outcomes. Derived from a systematic search of articles published between 2007 and 2017, 36 studies were assessed with a Faith-Based Integration Assessment Tool (FIAT) to quantify faith-health integration. Basic statistical procedures were employed to determine the association of faith-based integration with intervention outcomes. The assessed studies possessed (on average) moderate, inconsistent integration because of poor use of faith measures, and moderate, inconsistent use of faith practices. Analysis procedures for determining the effect of FBI integration on intervention outcomes were inadequate for formulating practical conclusions. Regardless of integration, interventions were associated with beneficial outcomes. To determine the link between FBI integration and intervention outcomes, additional analyses are needed.
Sandia National Laboratories analysis code data base
NASA Astrophysics Data System (ADS)
Peterson, C. W.
1994-11-01
Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems, and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code 'ownership' and release status, and references describing the physical models and numerical implementation.
Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code
NASA Technical Reports Server (NTRS)
Hendricks, Eric S.
2016-01-01
The Object-Oriented Turbomachinery Analysis Code (OTAC) is a new meanline/streamline turbomachinery modeling tool being developed at NASA GRC. During the development process, a limitation of the code was discovered in relation to the analysis of choked flow in axial turbines. This paper describes the relevant physics for choked flow as well as the changes made to OTAC to enable analysis in this flow regime.
Laplace Transform Based Radiative Transfer Studies
NASA Astrophysics Data System (ADS)
Hu, Y.; Lin, B.; Ng, T.; Yang, P.; Wiscombe, W.; Herath, J.; Duffy, D.
2006-12-01
Multiple scattering is the major uncertainty in data analysis of space-based lidar measurements. Until now, accurate quantitative lidar data analysis has been limited to very thin objects that are dominated by single scattering, where photons from the laser beam scatter only once off particles in the atmosphere before reaching the receiver and a simple linear relationship between physical properties and the lidar signal exists. In reality, multiple scattering is always a factor in space-based lidar measurement, and it dominates space-based lidar returns from clouds, dust aerosols, vegetation canopy and phytoplankton. While multiply scattered photons carry clear signals, the lack of a fast-enough lidar multiple scattering computation tool forces us to treat them as unwanted "noise" and use simple multiple scattering correction schemes to remove them. Such treatments waste the multiple scattering signals and may cause orders-of-magnitude errors in retrieved physical properties. Thus the lack of fast and accurate time-dependent radiative transfer tools significantly limits lidar remote sensing capabilities. Analyzing lidar multiple scattering signals requires fast and accurate time-dependent radiative transfer computations. Currently, multiple scattering is computed with Monte Carlo simulations, which take minutes to hours; these are too slow for interactive satellite data analysis processes and can only be used to help system/algorithm design and error assessment. We present an innovative physics approach to solve the time-dependent radiative transfer problem. The technique utilizes FPGA-based reconfigurable computing hardware. The approach is as follows. 1. Physics solution: Perform Laplace transforms on the time and spatial dimensions and a Fourier transform on the viewing azimuth dimension, converting the solution of the radiative transfer differential equation into a fast matrix inversion problem.
The majority of the radiative transfer computation goes to matrix inversion, FFT, and inverse Laplace transforms. 2. Hardware solution: Perform the well-defined matrix inversions, FFTs, and Laplace transforms on highly parallel, reconfigurable computing hardware. This physics-based computational tool leads to accurate quantitative analysis of space-based lidar signals and improves the data quality of current lidar missions such as CALIPSO. This presentation will introduce the basic idea of the approach, preliminary results based on SRC's FPGA-based Mapstation, and how we may apply it to CALIPSO data analysis.
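The central idea of the physics step, turning a time-dependent linear problem into one small matrix inversion per transform variable, can be shown with a toy two-stream system (this is an illustration of the mathematical structure only, not the authors' solver; the matrix entries are invented). For dY/dt = A·Y with Y(0) = y0, the Laplace transform gives the purely algebraic system (sI - A)·Y(s) = y0.

```python
# Toy illustration: a 2x2 system standing in for coupled intensity "streams".

def solve_2x2(m, rhs):
    """Solve a 2x2 linear system by Cramer's rule."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return ((rhs[0] * d - b * rhs[1]) / det,
            (a * rhs[1] - rhs[0] * c) / det)

# dY/dt = A*Y: two streams exchanging intensity while being attenuated.
A = ((-2.0, 1.0),
     (1.0, -2.0))
y0 = (1.0, 0.0)

# One matrix inversion per Laplace variable s (here a single sample value).
s = 3.0
m = ((s - A[0][0], -A[0][1]),
     (-A[1][0], s - A[1][1]))
Ys = solve_2x2(m, y0)
# Analytic check via eigen-decomposition: Y1(s) = 0.5/(s+1) + 0.5/(s+3),
# so Y1(3) = 0.5/4 + 0.5/6 = 5/24.
print(Ys[0])  # 0.20833...
```

A real radiative transfer solver inverts much larger matrices, one per (s, Fourier mode) pair, which is exactly the regular, parallel workload the abstract maps onto FPGA hardware.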
Examining Chemistry Students' Visual-Perceptual Skills Using the VSCS Tool and Interview Data
NASA Astrophysics Data System (ADS)
Christian, Caroline
The Visual-Spatial Chemistry Specific (VSCS) assessment tool was developed to test students' visual-perceptual skills, which are required to form a mental image of an object. The VSCS was designed around the theoretical framework of Rochford and Archer that provides eight distinct and well-defined visual-perceptual skills with identified problems students might have with each skill set. Factor analysis was used to analyze the results during the validation process of the VSCS. Results showed that the eight factors could not be separated from each other; instead two factors emerged as significant to the data. These two factors have been defined and described as a general visual-perceptual skill (factor 1) and a skill that adds a second level of complexity by involving multiple viewpoints, such as changing frames of reference. The questions included in the factor analysis were bolstered by the addition of an item response theory (IRT) analysis. Interviews were also conducted with twenty novice students to test face validity of the tool, and to document student approaches to solving visualization problems of this type. Students used five main physical resources or processes to solve the questions, but the most successful resource was handling or building a physical representation of an object.
Spark and HPC for High Energy Physics Data Analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sehrish, Saba; Kowalkowski, Jim; Paterno, Marc
A full High Energy Physics (HEP) data analysis is divided into multiple data reduction phases. Processing within these phases is extremely time consuming; therefore intermediate results are stored in files held in mass storage systems and referenced as part of large datasets. This processing model limits what can be done with interactive data analytics. Growth in the size and complexity of experimental datasets, along with emerging big data tools, is beginning to change the traditional ways of doing data analyses. Use of big data tools for HEP analysis looks promising, mainly because extremely large HEP datasets can be represented and held in memory across a system and accessed interactively by encoding an analysis using high-level programming abstractions. The mainstream tools, however, are not designed for scientific computing or for exploiting the available HPC platform features. We use an example from the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) in Geneva, Switzerland. The LHC is the highest energy particle collider in the world. Our use case focuses on searching for new types of elementary particles explaining Dark Matter in the universe. We use HDF5 as our input data format, and Spark to implement the use case. We show the benefits and limitations of using Spark with HDF5 on Edison at NERSC.
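The "high-level programming abstraction" the record refers to can be suggested with a plain-Python stand-in. The event fields and cut values below are invented for illustration; a real analysis would express the same filter/aggregate chain in Spark over HDF5 inputs, letting the framework distribute it across the cluster.

```python
from functools import reduce

# Hypothetical flat event records; real CMS analyses would read millions
# of these from HDF5 files rather than an in-memory list.
events = [
    {"met": 250.0, "n_jets": 3, "lead_jet_pt": 180.0},
    {"met": 40.0,  "n_jets": 1, "lead_jet_pt": 55.0},
    {"met": 420.0, "n_jets": 2, "lead_jet_pt": 300.0},
]

# A cut-based selection written as chained transformations -- the same
# map/filter/reduce shape that Spark evaluates lazily and in parallel.
selected = filter(lambda e: e["met"] > 200.0, events)          # missing-energy cut
selected = filter(lambda e: e["lead_jet_pt"] > 100.0, selected)  # jet-pT cut
total_met = reduce(lambda acc, e: acc + e["met"], selected, 0.0)
print(total_met)  # 670.0: events 1 and 3 survive both cuts
```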
Research in Theoretical High-Energy Physics at Southern Methodist University
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olness, Fredrick; Nadolsky, Pavel
2016-08-05
The SMU Theory group has developed strong expertise in QCD, PDFs, and incisive comparisons between collider data and theory. The group pursues realistic phenomenological calculations for high-energy processes, a research area in high demand, driven by LHC physics. Our field has seen major discoveries in recent years from a variety of experiments, large and small, including a number recognized by Nobel Prizes. There is a wealth of novel QCD data to explore. The SMU theory group develops the most advanced and innovative tools for comprehensive analysis in applications ranging from Higgs physics and new-physics searches to nuclear scattering.
Assessing Student Peer Dialogue in Collaborative Settings: A Window into Student Reasoning
NASA Astrophysics Data System (ADS)
Stone, Antoinette
The use of science classroom discourse analysis as a way to gain a better understanding of various student cognitive outcomes has a rich history in Science Education in general and Physics Education Research (PER) in particular. When students talk to each other in a collaborative peer instruction environment, such as in the CLASP classes (Collaborative Learning and Sense-making in Physics) at UC Davis, they get to practice and enhance their reasoning and sense-making skills, develop collaborative approaches to problem solving, and participate in co-construction of knowledge and shared thinking. To better understand these important cognitive processes, an analysis tool for monitoring, assessing and categorizing the peer talk arising in this environment is needed as a first step in teasing out evidence for these processes inherent in such talk. In order to meaningfully contribute to the extensive body of knowledge that currently exists, deeper, more insightful answers to the question of what happens linguistically when students struggle to "make sense" and how students use language to mediate these important cognitive outcomes is needed. To this end, a new tool for interpreting particularly qualitative linguistic data is needed, and the first part of the dissertation expounds on the development of a discourse analysis tool that has as its underpinnings a framework for coding borrowed extensively from Systemic Functional Linguistics Theory (SFL). The second part of this dissertation illustrates multiple ways in which the tool is used and how it can be utilized to address many current research questions.
General Pressurization Model in Simscape
NASA Technical Reports Server (NTRS)
Servin, Mario; Garcia, Vicky
2010-01-01
System integration is an essential part of the engineering design process. The Ares I Upper Stage (US) is a complex system made up of thousands of components assembled into subsystems, including a J-2X engine, liquid hydrogen (LH2) and liquid oxygen (LO2) tanks, avionics, thrust vector control, motors, etc. System integration is the task of connecting all of the subsystems into one large system. To ensure that all the components will "fit together," as well as to ensure safety and quality, integration analysis is required. Integration analysis verifies that, as an integrated system, the system will behave as designed. Models that represent the actual subsystems are built for more comprehensive analysis. Matlab has been an instrument widely used by engineers to construct mathematical models of systems. Simulink, one of the tools offered by Matlab, provides a multi-domain graphical environment to simulate and design time-varying systems; it is a powerful tool for analyzing the dynamic behavior of systems over time. Furthermore, Simscape, a tool provided by Simulink, allows users to model physical (such as mechanical, thermal, and hydraulic) systems using physical networks. Using Simscape, a model representing an inflow of gas to a pressurized tank was created, in which the temperature and pressure of the tank are measured over time to show the behavior of the gas. By further incorporating Simscape into model building, the full potential of this software can be discovered, and it can hopefully become a more widely utilized tool.
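The physics behind such a tank-pressurization model can be sketched independently of Simscape. The sketch below is a minimal stand-in, not the Ares model: it assumes an isothermal ideal gas (nitrogen) filling a fixed-volume tank at a constant mass flow rate, with all parameter values invented, so pressure follows P = m·R·T/V as mass accumulates.

```python
# Assumed parameters (illustrative only).
R_N2 = 296.8    # J/(kg*K), specific gas constant for nitrogen
V = 0.5         # m^3, tank volume
T = 293.0       # K, held constant (isothermal assumption)
mdot = 0.01     # kg/s, constant inflow rate

m = 0.1         # kg, initial gas mass in the tank
dt = 0.1        # s, integration step
n_steps = 600   # 60 s of filling

# Forward-Euler mass accounting; with constant mdot this is exact.
for _ in range(n_steps):
    m += mdot * dt

pressure = m * R_N2 * T / V  # Pa, ideal gas law at constant T
print(round(pressure))       # final mass 0.7 kg -> roughly 1.2e5 Pa
```

A Simscape model would additionally resolve temperature changes and valve dynamics; this sketch only shows the core pressure/mass relationship such a model integrates over time.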
A model of motor performance during surface penetration: from physics to voluntary control.
Klatzky, Roberta L; Gershon, Pnina; Shivaprabhu, Vikas; Lee, Randy; Wu, Bing; Stetten, George; Swendsen, Robert H
2013-10-01
The act of puncturing a surface with a hand-held tool is a ubiquitous but complex motor behavior that requires precise force control to avoid potentially severe consequences. We present a detailed model of puncture over a time course of approximately 1,000 ms, which is fit to kinematic data from individual punctures, obtained via a simulation with high-fidelity force feedback. The model describes puncture as proceeding from purely physically determined interactions between the surface and tool, through decline of force due to biomechanical viscosity, to cortically mediated voluntary control. When fit to the data, it yields parameters for the inertial mass of the tool/person coupling, time characteristic of force decline, onset of active braking, stopping time and distance, and late oscillatory behavior, all of which the analysis relates to physical variables manipulated in the simulation. While the present data characterize distinct phases of motor performance in a group of healthy young adults, the approach could potentially be extended to quantify the performance of individuals from other populations, e.g., with sensory-motor impairments. Applications to surgical force control devices are also considered.
Development of an Easy-to-Use Tool for the Assessment of Emergency Department Physical Design.
Majidi, Alireza; Tabatabaey, Ali; Motamed, Hassan; Motamedi, Maryam; Forouzanfar, Mohammad Mehdi
2014-01-01
Physical design of the emergency department (ED) has an important effect on its role and function. To date, no guidelines have been introduced to set standards for the construction of EDs in Iran. In this study, we aim to devise an easy-to-use tool, based on the available literature and expert opinion, for the quick and effective assessment of EDs with regard to their physical design. For this purpose, a comprehensive checklist was developed based on the current literature on emergency design. This checklist was then analyzed by a panel consisting of the heads of three major EDs, and contested items were resolved. 178 crude items were derived from the available literature. The items were categorized into three major domains of physical space, equipment, and accessibility. The final checklist approved by the panel consisted of 163 items categorized into six domains. Each item was phrased as a "Yes or No" question for ease of analysis, meaning that the criterion is either met or not.
Angioi, Manuela; Metsios, George S; Twitchett, Emily; Koutedakis, Yiannis; Wyon, Matthew
2009-01-01
The physical demands imposed on contemporary dancers by choreographers and performance schedules make their physical fitness just as important to them as skill development. Nevertheless, it remains to be confirmed which physical fitness components are associated with aesthetic competence. The aim of this study was to: 1. replicate and test a novel aesthetic competence tool for reliability, and 2. investigate the association between selected physical fitness components and aesthetic competence by using this new tool. Seventeen volunteers underwent a series of physical fitness tests (body composition, flexibility, muscular power and endurance, and aerobic capacity) and aesthetic competence assessments (seven individual criteria commonly used by selected dance companies). Inter-rater reliability of the aesthetic competence tool was very high (r = 0.96). There were significant correlations between the aesthetic competence score and jump ability and push-ups (r = 0.55 and r = 0.55, respectively). Stepwise backward multiple regression analysis revealed that the best predictor of aesthetic competence was push-ups (R(2) = 0.30, p = 0.03). Univariate analyses also revealed that the interaction of push-ups and jump ability improved the prediction power of aesthetic competence (R(2) = 0.44, p = 0.004). It is concluded that upper body muscular endurance and jump ability best predict aesthetic competence of the present sample of contemporary dancers. Further research is required to investigate the contribution of other components of aesthetic competence, including upper body strength, lower body muscular endurance, general coordination, and static and dynamic balance.
Nightingale, Tom E; Rouse, Peter C; Thompson, Dylan; Bilzon, James L J
2017-12-01
Accurately measuring physical activity and energy expenditure in persons with chronic physical disabilities who use wheelchairs is a considerable and ongoing challenge. Quantifying various free-living lifestyle behaviours in this group is at present restricted by our understanding of appropriate measurement tools and analytical techniques. This review provides a detailed evaluation of the currently available measurement tools used to predict physical activity and energy expenditure in persons who use wheelchairs. It also outlines numerous considerations specific to this population and suggests suitable future directions for the field. Of the existing three self-report methods utilised in this population, the 3-day Physical Activity Recall Assessment for People with Spinal Cord Injury (PARA-SCI) telephone interview demonstrates the best reliability and validity. However, the complexity of interview administration and potential for recall bias are notable limitations. Objective measurement tools, which overcome such considerations, have been validated using controlled laboratory protocols. These have consistently demonstrated the arm or wrist as the most suitable anatomical location to wear accelerometers. Yet, more complex data analysis methodologies may be necessary to further improve energy expenditure prediction for more intricate movements or behaviours. Multi-sensor devices that incorporate physiological signals and acceleration have recently been adapted for persons who use wheelchairs. Population specific algorithms offer considerable improvements in energy expenditure prediction accuracy. This review highlights the progress in the field and aims to encourage the wider scientific community to develop innovative solutions to accurately quantify physical activity in this population.
Thermophysics Issues Relevant to High-Speed Earth Entry of Large Asteroids
NASA Technical Reports Server (NTRS)
Prabhu, D.; Saunders, D.; Agrawal, P.; Allen, G.; Bauschlicher, C.; Brandis, A.; Chen, Y.-K.; Jaffe, R.; Schulz, J.; Stern, E.;
2016-01-01
Physics of atmospheric entry of meteoroids was an active area of research at NASA ARC up to the early 1970s (e.g., the oft-cited work of Baldwin and Sheaffer). However, research in the area seems to have ended with the Apollo program, and any ties with an active international meteor physics community seem to have significantly diminished thereafter. In the decades following the 1970s, the focus of entry physics at NASA ARC has been on improvement of the math models of shock-layer physics (especially in chemical kinetics and radiation) and thermal response of ablative materials used for capsule heatshields. With the overarching objectives of understanding energy deposition into the atmosphere and fragmentation, could these modern analysis tools and processes be applied to the problem of atmospheric entry of meteoroids as well? In the presentation we will explore: (i) the physics of atmospheric entries of meteoroids using our current state-of-the-art tools and processes, (ii) how multiple bodies interact, and (iii) the influence of wall blowing on flow dynamics.
High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems.
Mahadevan, Vijay S; Merzari, Elia; Tautges, Timothy; Jain, Rajeev; Obabko, Aleksandr; Smith, Michael; Fischer, Paul
2014-08-06
An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. The coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework.
High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems
Mahadevan, Vijay S.; Merzari, Elia; Tautges, Timothy; Jain, Rajeev; Obabko, Aleksandr; Smith, Michael; Fischer, Paul
2014-01-01
An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. The coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework. PMID:24982250
Magnetic Braking: A Video Analysis
NASA Astrophysics Data System (ADS)
Molina-Bolívar, J. A.; Abella-Palacios, A. J.
2012-10-01
This paper presents a laboratory exercise that introduces students to the use of video analysis software through a Lenz's law demonstration. Digital techniques have proved very useful for the understanding of physical concepts. In particular, the availability of affordable digital video offers students the opportunity to actively engage with kinematics in introductory-level physics.1,2 By using digital video's frame-advance features and "marking" the position of a moving object in each frame, students are able to determine the position of an object at much smaller time increments than would be possible with common timing devices. Once the student collects data consisting of positions and times, these values may be manipulated to determine velocity and acceleration. There are a variety of commercial and free applications that can be used for video analysis. Because the relevant technology has become inexpensive, video analysis has become a prevalent tool in introductory physics courses.
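The positions-to-kinematics step described above is a simple finite-difference computation once the frame-by-frame marking is done. The frame positions below are invented for illustration (they are not data from the article's experiment):

```python
fps = 30.0        # assumed camera frame rate
dt = 1.0 / fps    # time between marked frames

# Hypothetical y-positions (m) of a marked object, one sample per frame.
y = [0.0, -0.010, -0.021, -0.033, -0.046, -0.060]

def central_diff(series, dt):
    """Central finite difference: derivative at interior samples only."""
    return [(series[i + 1] - series[i - 1]) / (2 * dt)
            for i in range(1, len(series) - 1)]

v = central_diff(y, dt)   # velocities (m/s) at frames 1..4
a = central_diff(v, dt)   # accelerations (m/s^2) at frames 2..3
print(v)
print(a)
```

Central differences halve the truncation error relative to forward differences, which matters because frame-marking noise is amplified at each differentiation step.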
Irena : tool suite for modeling and analysis of small-angle scattering.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ilavsky, J.; Jemian, P.
2009-04-01
Irena, a tool suite for analysis of both X-ray and neutron small-angle scattering (SAS) data within the commercial Igor Pro application, brings together a comprehensive suite of tools useful for investigations in materials science, physics, chemistry, polymer science and other fields. In addition to Guinier and Porod fits, the suite combines a variety of advanced SAS data evaluation tools for the modeling of size distribution in the dilute limit using maximum entropy and other methods, dilute limit small-angle scattering from multiple non-interacting populations of scatterers, the pair-distance distribution function, a unified fit, the Debye-Bueche model, the reflectivity (X-ray and neutron) using Parratt's formalism, and small-angle diffraction. There are also a number of support tools, such as a data import/export tool supporting a broad sampling of common data formats, a data modification tool, a presentation-quality graphics tool optimized for small-angle scattering data, and a neutron and X-ray scattering contrast calculator. These tools are brought together into one suite with consistent interfaces and functionality. The suite allows robust automated note recording and saving of parameters during export.
Nanostructure symmetry: Relevance for physics and computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dupertuis, Marc-André; Oberli, D. Y.; Karlsson, K. F.
2014-03-31
We review the research done in recent years in our group on the effects of nanostructure symmetry, and outline its relevance both for nanostructure physics and for computations of their electronic and optical properties. The examples of C3v and C2v quantum dots are used. A number of surprises and non-trivial aspects are outlined, and a few symmetry-based tools for computing and analysis are briefly presented.
ERIC Educational Resources Information Center
Baran, Medine
2016-01-01
This study was carried out to determine high school students' perceptions of the courses of Physics and the factors influential on their perceptions with respect to gender. The research sample included 154 high school students (F:78; M:76). In the study, as the data collection tool, a structured interview form was used. The data collected in the…
Integrated Modeling and Analysis of Physical Oceanographic and Acoustic Processes
2015-09-30
goal is to improve ocean physical state and acoustic state predictive capabilities. The goal fitting the scope of this project is the creation of... Project -scale objectives are to complete targeted studies of oceanographic processes in a few regimes, accompanied by studies of acoustic propagation...by the basic research efforts of this project . An additional objective is to develop improved computational tools for acoustics and for the
Cytological Analysis of Meiosis in Caenorhabditis elegans
Phillips, Carolyn M.; McDonald, Kent L.; Dernburg, Abby F.
2011-01-01
The nematode Caenorhabditis elegans has emerged as an informative experimental system for analysis of meiosis, in large part because of the advantageous physical organization of meiotic nuclei as a gradient of stages within the germline. Here we provide tools for detailed observational studies of cells within the worm gonad, including techniques for light and electron microscopy. PMID:19685325
The Design and Analysis of Electrically Large Custom-Shaped Reflector Antennas
2013-06-01
(GEO) satellite data are imported into STK and plotted to visualize the regions of the sky that the spherical reflector must have line of sight for... study for the spherical reflector, Systems Tool Kit (STK) software from Analytical Graphics Inc. (AGI) is used. In completing the cross-shaped
The Global Modeling and Assimilation Office (GMAO) 4d-Var and its Adjoint-based Tools
NASA Technical Reports Server (NTRS)
Todling, Ricardo; Tremolet, Yannick
2008-01-01
The fifth generation of the Goddard Earth Observing System (GEOS-5) Data Assimilation System (DAS) is a 3d-var system that uses the Grid-point Statistical Interpolation (GSI) system developed in collaboration with NCEP, and a general circulation model developed at Goddard, that includes the finite-volume hydrodynamics of GEOS-4 wrapped in the Earth System Modeling Framework and physical packages tuned to provide a reliable hydrological cycle for the integration of the Modern-Era Retrospective analysis for Research and Applications (MERRA). This MERRA system is essentially complete and the next generation GEOS is under intense development. A prototype next generation system is now complete and has been producing preliminary results. This prototype system replaces the GSI-based Incremental Analysis Update procedure with a GSI-based 4d-var which uses the adjoint of the finite-volume hydrodynamics of GEOS-4 together with a vertical diffusion scheme for simplified physics. As part of this development we have kept the GEOS-5 IAU procedure as an option and have added the capability to experiment with a First Guess at the Appropriate Time (FGAT) procedure, thus allowing for at least three modes of running the data assimilation experiments. The prototype system is a large extension of GEOS-5 as it also includes various adjoint-based tools, namely, a forecast sensitivity tool, a singular vector tool, and an observation impact tool that combines the model sensitivity tool with a GSI-based adjoint tool. These features bring the global data assimilation effort at Goddard up to date with technologies used in data assimilation systems at major meteorological centers elsewhere. Various aspects of the next generation GEOS will be discussed during the presentation at the Workshop, and preliminary results will illustrate the discussion.
NREL Improves Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-01-01
This technical highlight describes NREL research to develop Building Energy Simulation Test for Existing Homes (BESTEST-EX) to increase the quality and accuracy of energy analysis tools for the building retrofit market. Researchers at the National Renewable Energy Laboratory (NREL) have developed a new test procedure to increase the quality and accuracy of energy analysis tools for the building retrofit market. The Building Energy Simulation Test for Existing Homes (BESTEST-EX) is a test procedure that enables software developers to evaluate the performance of their audit tools in modeling energy use and savings in existing homes when utility bills are available for model calibration. Similar to NREL's previous energy analysis tests, such as HERS BESTEST and other BESTEST suites included in ANSI/ASHRAE Standard 140, BESTEST-EX compares software simulation findings to reference results generated with state-of-the-art simulation tools such as EnergyPlus, SUNREL, and DOE-2.1E. The BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX includes building physics and utility bill calibration test cases. The diagram illustrates the utility bill calibration test cases. Participants are given input ranges and synthetic utility bills. Software tools use the utility bills to calibrate key model inputs and predict energy savings for the retrofit cases. Participant energy savings predictions using calibrated models are compared to NREL predictions using state-of-the-art building energy simulation programs.
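The utility-bill-calibration step described above can be sketched in a few lines: tune an uncertain model input within a given range so that simulated monthly energy use best matches the synthetic utility bills. The toy degree-day "model", the UA parameter, and all numbers below are invented for illustration; BESTEST-EX itself prescribes full building simulation tools, not this sketch.

```python
def monthly_use(ua, degree_days):
    """Toy heating model: monthly energy use proportional to UA x degree-days."""
    return [ua * dd for dd in degree_days]

def calibrate_ua(bills, degree_days, ua_range, steps=200):
    """Grid-search the UA value that minimizes squared error against the bills."""
    lo, hi = ua_range
    best_ua, best_err = lo, float("inf")
    for i in range(steps + 1):
        ua = lo + (hi - lo) * i / steps
        err = sum((m - b) ** 2
                  for m, b in zip(monthly_use(ua, degree_days), bills))
        if err < best_err:
            best_ua, best_err = ua, err
    return best_ua

degree_days = [600, 500, 400, 200]   # heating degree-days per month (invented)
bills = [121, 99, 81, 40]            # synthetic utility bills
ua = calibrate_ua(bills, degree_days, ua_range=(0.1, 0.4))
print(ua)                            # lands close to the "true" value of 0.2
```

The calibrated model would then be rerun with retrofit inputs to predict savings, which is the quantity the test procedure actually compares across tools.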
PlasmaPy: beginning a community developed Python package for plasma physics
NASA Astrophysics Data System (ADS)
Murphy, Nicholas A.; Huang, Yi-Min; PlasmaPy Collaboration
2016-10-01
In recent years, researchers in several disciplines have collaborated on community-developed open source Python packages such as Astropy, SunPy, and SpacePy. These packages provide core functionality, common frameworks for data analysis and visualization, and educational tools. We propose that our community begins the development of PlasmaPy: a new open source core Python package for plasma physics. PlasmaPy could include commonly used functions in plasma physics, easy-to-use plasma simulation codes, Grad-Shafranov solvers, eigenmode solvers, and tools to analyze both simulations and experiments. The development will include modern programming practices such as version control, embedding documentation in the code, unit tests, and avoiding premature optimization. We will describe early code development on PlasmaPy, and discuss plans moving forward. The success of PlasmaPy depends on active community involvement and a welcoming and inclusive environment, so anyone interested in joining this collaboration should contact the authors.
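As a concrete example of the "commonly used functions in plasma physics" such a package could collect, here is a minimal sketch of an electron plasma frequency calculation. The function name and interface are hypothetical, not the actual PlasmaPy API.

```python
import math

# CODATA values for the physical constants
ELECTRON_CHARGE = 1.602176634e-19   # C
ELECTRON_MASS = 9.1093837015e-31    # kg
EPSILON_0 = 8.8541878128e-12        # F/m

def plasma_frequency(n_e):
    """Angular electron plasma frequency (rad/s) for number density n_e in m^-3."""
    return math.sqrt(n_e * ELECTRON_CHARGE**2 / (EPSILON_0 * ELECTRON_MASS))

# A laboratory-plasma density of 1e19 m^-3 gives roughly 1.8e11 rad/s.
print(f"{plasma_frequency(1e19):.3e}")
```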
Lausberg, Hedda; Kazzer, Philipp; Heekeren, Hauke R; Wartenburger, Isabell
2015-10-01
Neuropsychological lesion studies demonstrate the need to differentiate between various forms of tool-related actions, such as real tool use, tool use demonstration with tool in hand and without physical target object, and pantomime without tool in hand. However, thus far, neuroimaging studies have primarily focused on investigating tool use pantomimes. The present fMRI study investigates pantomime without tool in hand as compared to tool use demonstration with tool in hand in order to explore patterns of cerebral signal modulation associated with acting with imaginary tools in hand. Fifteen participants performed with either hand (i) tool use pantomime with an imaginary tool in hand in response to visual tool presentation and (ii) tool use demonstration with tool in hand in response to visual-tactile tool presentation. In both conditions, no physical target object was present. The conjunction analysis of right- and left-hand executions of tool use pantomime relative to tool use demonstration yielded significant activity in the left middle and superior temporal lobe. In contrast, demonstration relative to pantomime revealed large bihemispherically distributed homologous areas of activity. Thus far, fMRI studies have demonstrated the relevance of the left middle and superior temporal gyri in viewing, naming, and matching tools and related actions and contexts. Since in our study all these factors were equally (ir)relevant in both the tool use pantomime and the tool use demonstration conditions, the present findings enhance the knowledge about the function of these brain regions in tool-related cognitive processes. The two contrasted conditions differ only regarding the fact that the pantomime condition requires the individual to act with an imaginary tool in hand.
Therefore, we suggest that the left middle and superior temporal gyri are specifically involved in integrating the projected mental image of a tool in the execution of a tool-specific movement concept. Copyright © 2015 Elsevier Ltd. All rights reserved.
Development of a Space Radiation Monte-Carlo Computer Simulation Based on the FLUKA and ROOT Codes
NASA Technical Reports Server (NTRS)
Pinsky, L. S.; Wilson, T. L.; Ferrari, A.; Sala, Paola; Carminati, F.; Brun, R.
2001-01-01
The radiation environment in space is a complex problem to model. Trying to extrapolate the projections of that environment into all areas of the internal spacecraft geometry is even more daunting. With the support of our CERN colleagues, our research group in Houston is embarking on a project to develop a radiation transport tool that is tailored to the problem of taking the external radiation flux incident on any particular spacecraft and simulating the evolution of that flux through a geometrically accurate model of the spacecraft material. The output will be a prediction of the detailed nature of the resulting internal radiation environment within the spacecraft as well as its secondary albedo. Beyond doing the physics transport of the incident flux, the software tool we are developing will provide a self-contained stand-alone object-oriented analysis and visualization infrastructure. It will also include a graphical user interface and a set of input tools to facilitate the simulation of space missions in terms of nominal radiation models and mission trajectory profiles. The goal of this project is to produce a code that is considerably more accurate and user-friendly than existing Monte-Carlo-based tools for the evaluation of the space radiation environment. Furthermore, the code will be an essential complement to the currently existing analytic codes in the BRYNTRN/HZETRN family for the evaluation of radiation shielding. The code will be directly applicable to the simulation of environments in low earth orbit, on the lunar surface, on planetary surfaces (including the Earth) and in the interplanetary medium such as on a transit to Mars (and even in the interstellar medium). The software will include modules whose underlying physics base can continue to be enhanced and updated for physics content, as future data become available beyond the timeframe of the initial development now foreseen. 
This future maintenance will be available from the authors of FLUKA as part of their continuing efforts to support the users of the FLUKA code within the particle physics community. In keeping with the spirit of developing an evolving physics code, we are planning, as part of this project, to participate in the efforts to validate the core FLUKA physics in ground-based accelerator test runs. The emphasis of these test runs will be the physics of greatest interest in the simulation of the space radiation environment. Such a tool will be of great value to planners, designers and operators of future space missions, as well as for the design of the vehicles and habitats to be used on such missions. It will also be of aid to future experiments of various kinds that may be affected at some level by the ambient radiation environment, or in the analysis of hybrid experiment designs that have been discussed for space-based astronomy and astrophysics. The tool will be of value to the Life Sciences personnel involved in the prediction and measurement of radiation doses experienced by the crewmembers on such missions. In addition, the tool will be of great use to the planners of experiments to measure and evaluate the space radiation environment itself. It can likewise be useful in the analysis of safe havens, hazard mitigation plans, and NASA's call for new research in composites, and to NASA engineers modeling the radiation exposure of electronic circuits. This code will provide an important complementary check on the predictions of analytic codes such as BRYNTRN/HZETRN that are presently used for many similar applications, and which have shortcomings that are more easily overcome with Monte Carlo type simulations. Finally, it is acknowledged that there are similar efforts based around the use of the GEANT4 Monte-Carlo transport code currently under development at CERN.
It is our intention to make our software modular and sufficiently flexible to allow the parallel use of either FLUKA or GEANT4 as the physics transport engine.
NASA Astrophysics Data System (ADS)
Guisasola, Jenaro; Zuza, Kristina; Almudi, José-Manuel
2013-07-01
Textbooks are a very important tool in the teaching-learning process and influence important aspects of the process. This paper presents an analysis of the chapter on electromagnetic induction and Faraday's law in 19 textbooks on general physics for first-year university courses for scientists and engineers. This analysis was based on criteria formulated from the theoretical framework of electromagnetic induction in classical physics and students' learning difficulties concerning these concepts. The aim of the work presented here is not to compare a textbook against the ideal book, but rather to find a series of explanations, examples, questions, etc. that provide evidence on how the topic is presented in relation to the criteria above. It concludes that despite many aspects being covered properly, there are others that deserve greater attention.
Integral equation and discontinuous Galerkin methods for the analysis of light-matter interaction
NASA Astrophysics Data System (ADS)
Baczewski, Andrew David
Light-matter interaction is among the most enduring interests of the physical sciences. The understanding and control of this physics is of paramount importance to the design of myriad technologies ranging from stained glass, to molecular sensing and characterization techniques, to quantum computers. The development of complex engineered systems that exploit this physics is predicated at least partially upon in silico design and optimization that properly capture the light-matter coupling. In this thesis, the details of computational frameworks that enable this type of analysis, based upon both Integral Equation and Discontinuous Galerkin formulations will be explored. There will be a primary focus on the development of efficient and accurate software, with results corroborating both. The secondary focus will be on the use of these tools in the analysis of a number of exemplary systems.
NASA Astrophysics Data System (ADS)
Zhang, Jun; Li, Ri Yi
2018-06-01
Building energy simulation is an important supporting tool for green building design and building energy consumption assessment. At present, building energy simulation software cannot meet the needs of energy consumption analysis and cabinet-level micro-environment control design for prefabricated buildings. A thermal physical model of prefabricated buildings is proposed in this paper; based on this model, energy consumption calculation software for prefabricated cabin buildings (PCES) was developed. PCES supports building parameter setting, energy consumption simulation, and analysis of building thermal processes and energy consumption.
Aeroelastic stability analysis of a Darrieus wind turbine
NASA Astrophysics Data System (ADS)
Popelka, D.
1982-02-01
An aeroelastic stability analysis was developed for predicting flutter instabilities on vertical axis wind turbines. The analytical model and mathematical formulation of the problem are described as well as the physical mechanism that creates flutter in Darrieus turbines. Theoretical results are compared with measured experimental data from flutter tests of the Sandia 2 Meter turbine. Based on this comparison, the analysis appears to be an adequate design evaluation tool.
Identification and Analysis of National Airspace System Resource Constraints
NASA Technical Reports Server (NTRS)
Smith, Jeremy C.; Marien, Ty V.; Viken, Jeffery K.; Neitzke, Kurt W.; Kwa, Tech-Seng; Dollyhigh, Samuel M.; Fenbert, James W.; Hinze, Nicolas K.
2015-01-01
This analysis is the deliverable for the Airspace Systems Program, Systems Analysis Integration and Evaluation Project Milestone for the Systems and Portfolio Analysis (SPA) focus area SPA.4.06 Identification and Analysis of National Airspace System (NAS) Resource Constraints and Mitigation Strategies. "Identify choke points in the current and future NAS. Choke points refer to any areas in the en route, terminal, oceanic, airport, and surface operations that constrain actual demand in current and projected future operations. Use the Common Scenarios based on Transportation Systems Analysis Model (TSAM) projections of future demand developed under SPA.4.04 Tools, Methods and Scenarios Development. Analyze causes, including operational and physical constraints." The NASA analysis is complementary to a NASA Research Announcement (NRA) "Development of Tools and Analysis to Evaluate Choke Points in the National Airspace System" Contract # NNA3AB95C awarded to Logistics Management Institute, Sept 2013.
Pulse Shape Discrimination in the MAJORANA DEMONSTRATOR
NASA Astrophysics Data System (ADS)
Haufe, Christopher; Majorana Collaboration
2017-09-01
The MAJORANA DEMONSTRATOR is an experiment constructed to search for neutrinoless double-beta decays in germanium-76 and to demonstrate the feasibility of deploying a large-scale experiment in a phased and modular fashion. It consists of two modular arrays of natural and 76Ge-enriched germanium p-type point contact detectors totaling 44.1 kg, located at the 4850' level of the Sanford Underground Research Facility in Lead, South Dakota, USA. A large effort is underway to analyze the data currently being taken by the DEMONSTRATOR. Key components of this effort are analysis tools that allow for pulse shape discrimination: techniques that significantly reduce background levels in the neutrinoless double-beta decay region of interest. These tools are able to identify and reject multi-site events from Compton scattering as well as events from alpha particle interactions. This work serves as an overview of these analysis tools and highlights the unique advantages that the HPGe p-type point contact detector provides to pulse shape discrimination. This material is supported by the U.S. Department of Energy, Office of Science, Office of Nuclear Physics, the Particle Astrophysics and Nuclear Physics Programs of the National Science Foundation, and the Sanford Underground Research Facility.
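One widely used point-contact discrimination idea, rejecting multi-site events via the ratio of maximum current amplitude to energy (A/E), can be sketched as follows. The waveforms below are synthetic sigmoids, not detector data, and the DEMONSTRATOR's actual analysis is far more involved; this only illustrates why multi-site events fall below an A/E cut.

```python
import numpy as np

def a_over_e(waveform):
    """Max current amplitude (derivative of the charge pulse) over total energy."""
    current = np.diff(waveform)
    energy = waveform[-1] - waveform[0]   # net charge collected
    return current.max() / energy

t = np.linspace(0, 1, 200)
# Single-site event: all charge arrives in one sharp step.
single_site = 1.0 / (1.0 + np.exp(-(t - 0.5) * 40))
# Multi-site event: the same total charge arrives in two smaller steps,
# so the peak current for the same energy is lower.
multi_site = (0.5 / (1.0 + np.exp(-(t - 0.3) * 40))
              + 0.5 / (1.0 + np.exp(-(t - 0.7) * 40)))

print(a_over_e(single_site) > a_over_e(multi_site))  # the cut keeps single-site events
```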
Quod erat demonstrandum: Understanding and Explaining Equations in Physics Teacher Education
NASA Astrophysics Data System (ADS)
Karam, Ricardo; Krey, Olaf
2015-07-01
In physics education, equations are commonly seen as calculation tools to solve problems or as concise descriptions of experimental regularities. In physical science, however, equations often play a much more important role associated with the formulation of theories to provide explanations for physical phenomena. In order to overcome this inconsistency, one crucial step is to improve physics teacher education. In this work, we describe the structure of a course that was given to physics teacher students at the end of their master's degree in two European universities. The course had two main goals: (1) To investigate the complex interplay between physics and mathematics from a historical and philosophical perspective and (2) To expand students' repertoire of explanations regarding possible ways to derive certain school-relevant equations. A qualitative analysis on a case study basis was conducted to investigate the learning outcomes of the course. Here, we focus on the comparative analysis of two students who had considerably different views of the math-physics interplay in the beginning of the course. Our general results point to important changes on some of the students' views on the role of mathematics in physics, an increase in the participants' awareness of the difficulties faced by learners to understand physics equations and a broadening in the students' repertoire to answer "Why?" questions formulated to equations. Based on this analysis, further implications for physics teacher education are derived.
Identifying Items to Assess Methodological Quality in Physical Therapy Trials: A Factor Analysis
Cummings, Greta G.; Fuentes, Jorge; Saltaji, Humam; Ha, Christine; Chisholm, Annabritt; Pasichnyk, Dion; Rogers, Todd
2014-01-01
Background: Numerous tools and individual items have been proposed to assess the methodological quality of randomized controlled trials (RCTs). The frequency of use of these items varies according to health area, which suggests a lack of agreement regarding their relevance to trial quality or risk of bias. Objective: The objectives of this study were: (1) to identify the underlying component structure of items and (2) to determine relevant items to evaluate the quality and risk of bias of trials in physical therapy by using an exploratory factor analysis (EFA). Design: A methodological research design was used, and an EFA was performed. Methods: Randomized controlled trials used for this study were randomly selected from searches of the Cochrane Database of Systematic Reviews. Two reviewers used 45 items gathered from 7 different quality tools to assess the methodological quality of the RCTs. An exploratory factor analysis was conducted using the principal axis factoring (PAF) method followed by varimax rotation. Results: Principal axis factoring identified 34 items loaded on 9 common factors: (1) selection bias; (2) performance and detection bias; (3) eligibility, intervention details, and description of outcome measures; (4) psychometric properties of the main outcome; (5) contamination and adherence to treatment; (6) attrition bias; (7) data analysis; (8) sample size; and (9) control and placebo adequacy. Limitation: Because of the exploratory nature of the results, a confirmatory factor analysis is needed to validate this model. Conclusions: To the authors' knowledge, this is the first factor analysis to explore the underlying component items used to evaluate the methodological quality or risk of bias of RCTs in physical therapy. The items and factors represent a starting point for evaluating the methodological quality and risk of bias in physical therapy trials.
Empirical evidence of the association between these items and treatment effects, and a confirmatory factor analysis of these results, are needed to validate these items. PMID:24786942
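The principal-axis idea behind the analysis, that eigen-decomposition of the item correlation matrix groups items sharing a common factor, can be illustrated with a deliberately noiseless toy dataset. Real PAF additionally iterates on communality estimates and applies varimax rotation, both of which this sketch omits; the "factors" and items here are invented.

```python
import numpy as np

# Two uncorrelated latent factors (chosen so their sample correlation is exactly 0).
f1 = np.array([1., 2., 3., 4., 5., 6.])
f2 = np.array([1., -1., 0., 0., -1., 1.])

# Five "quality items": three driven by f1, two by f2 (different scalings).
items = np.column_stack([f1, 2 * f1, 0.5 * f1, f2, 3 * f2])

corr = np.corrcoef(items, rowvar=False)     # 5x5 item correlation matrix
eigvals, eigvecs = np.linalg.eigh(corr)     # eigenvalues in ascending order

top = eigvecs[:, -1]      # leading factor: loads only on the three f1 items
second = eigvecs[:, -2]   # second factor: loads only on the two f2 items
print(np.round(np.abs(top), 2), np.round(np.abs(second), 2))
```

With noise-free data the loadings separate the two item blocks exactly; real trial-quality data would of course give a much murkier loading pattern.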
WE-G-BRC-03: Risk Assessment for Physics Plan Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parker, S.
2016-06-15
Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time consuming and labor intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures including HDR brachytherapy, physics plan review and radiosurgery. A description of common equipment and devices used in these procedures and how to characterize new devices for safe use in patient treatments will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one’s own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: Understand the general concept of failure mode and effects analysis; learn how to characterize new equipment for safety; be able to identify potential failure modes for specific procedures and learn mitigation techniques; be able to customize FMEA examples and templates for use in any clinic.
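The scoring step at the heart of FMEA can be sketched in a few lines: each failure mode receives severity (S), occurrence (O), and detectability (D) scores, and their product, the risk priority number (RPN), ranks which modes to address first. The failure modes and scores below are invented examples for illustration, not TG-100 recommendations.

```python
failure_modes = [
    # (description, S, O, D) on 1-10 scales; higher = more severe,
    # more frequent, or harder to detect.
    ("wrong patient plan loaded", 9, 2, 4),
    ("dose calculation grid too coarse", 5, 4, 3),
    ("stale CT used for planning", 8, 3, 6),
]

# RPN = S x O x D; sort descending so the riskiest mode comes first.
ranked = sorted(((desc, s * o * d) for desc, s, o, d in failure_modes),
                key=lambda x: -x[1])

for desc, rpn in ranked:
    print(f"RPN {rpn:3d}  {desc}")
```

A plan-review FMEA would then target interventions (checklists, automated checks) at the top-ranked modes and re-score to verify the risk reduction.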
Kaznowska, E; Depciuch, J; Łach, K; Kołodziej, M; Koziorowska, A; Vongsvivut, J; Zawlik, I; Cholewa, M; Cebulski, J
2018-08-15
Lung cancer has the highest mortality rate of all malignant tumours. Current outcomes of cancer treatment, as well as its diagnostics, are unsatisfactory. It is therefore very important to introduce modern diagnostic tools that allow rapid classification of lung cancers and their degree of malignancy. For this purpose, the authors propose the use of Fourier Transform InfraRed (FTIR) spectroscopy combined with Principal Component Analysis-Linear Discriminant Analysis (PCA-LDA) and a physics-based computational model. The results obtained for lung cancer tissues, adenocarcinoma and squamous cell carcinoma FTIR spectra, show a shift in wavenumbers compared to control tissue FTIR spectra. Furthermore, in the FTIR spectra of adenocarcinoma there are no peaks corresponding to glutamate or phospholipid functional groups. Moreover, in the case of G2 and G3 malignancy of adenocarcinoma lung cancer, the absence of an OH-group peak was noted. Thus, FTIR spectroscopy appears to be a valuable tool to classify lung cancer and to determine the degree of its malignancy. Copyright © 2018 Elsevier B.V. All rights reserved.
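The PCA-LDA idea can be sketched with numpy alone: project spectra onto their leading principal components, then separate classes in the reduced space (a nearest-class-mean rule stands in for full LDA here). The "spectra" below are synthetic Gaussian bands with an artificial 30 cm^-1 shift between groups; everything here is illustrative, not the authors' actual procedure or data.

```python
import numpy as np

rng = np.random.default_rng(0)
wavenumbers = np.linspace(1000, 1800, 100)

def spectra(center, n):
    """n noisy synthetic absorption bands centered at `center` cm^-1."""
    band = np.exp(-((wavenumbers - center) / 40.0) ** 2)
    return band + rng.normal(0, 0.05, (n, wavenumbers.size))

control = spectra(1400, 20)   # control-tissue band position (invented)
cancer = spectra(1430, 20)    # shifted band, mimicking the reported wavenumber shift

X = np.vstack([control, cancer])
y = np.array([0] * 20 + [1] * 20)

# PCA via SVD of the mean-centered data; keep 2 components.
Xc = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ vt[:2].T

# Nearest-class-mean classification in PC space (LDA stand-in).
means = np.array([scores[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((scores[:, None, :] - means) ** 2).sum(-1), axis=1)
print((pred == y).mean())   # high accuracy on this easy synthetic set
```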
School environments and physical activity: the development and testing of an audit tool
Jones, Natalia R; Jones, Andy; van Sluijs, Esther MF; Panter, Jenna; Harrison, Flo; Griffin, Simon J
2013-01-01
The aim of this study was to develop, test, and employ an audit tool to objectively assess the opportunities for physical activity within school environments. A 44-item tool was developed and tested at 92 primary schools in the county of Norfolk, England, during the summer term of 2007. Scores from the tool covering 6 domains of facility provision were examined against objectively measured hourly moderate-to-vigorous physical activity levels in 1868 9-10 year old pupils attending the schools. The tool was found to have acceptable reliability and good construct validity, differentiating the physical activity levels of children attending the highest and lowest scoring schools. The characteristics of school grounds may influence pupils’ physical activity levels. PMID:20435506
How to Determine the Centre of Mass of Bodies from Image Modelling
ERIC Educational Resources Information Center
Dias, Marco Adriano; Carvalho, Paulo Simeão; Rodrigues, Marcelo
2016-01-01
Image modelling is a recent technique in physics education that includes digital tools for image treatment and analysis, such as digital stroboscopic photography (DSP) and video analysis software. It is commonly used to analyse the motion of objects. In this work we show how to determine the position of the centre of mass (CM) of objects with…
ERIC Educational Resources Information Center
Onorato, P.; Mascheretti, P.; DeAmbrosis, A.
2012-01-01
In this paper, we describe how simple experiments realizable by using easily found and low-cost materials allow students to explore quantitatively the magnetic interaction thanks to the help of an Open Source Physics tool, the Tracker Video Analysis software. The static equilibrium of a "column" of permanents magnets is carefully investigated by…
Supporting Scientific Analysis within Collaborative Problem Solving Environments
NASA Technical Reports Server (NTRS)
Watson, Velvin R.; Kwak, Dochan (Technical Monitor)
2000-01-01
Collaborative problem solving environments for scientists should contain the analysis tools the scientists require in addition to the remote collaboration tools used for general communication. Unfortunately, most scientific analysis tools have been designed for a "stand-alone mode" and cannot be easily modified to work well in a collaborative environment. This paper addresses the questions, "What features are desired in a scientific analysis tool contained within a collaborative environment?", "What are the tool design criteria needed to provide these features?", and "What support is required from the architecture to support these design criteria?" First, the features of scientific analysis tools that are important for effective analysis in collaborative environments are listed. Next, several design criteria for developing analysis tools that will provide these features are presented. Then requirements for the architecture to support these design criteria are listed. Some proposed architectures for collaborative problem solving environments are reviewed and their capabilities to support the specified design criteria are discussed. A deficiency in the most popular architecture for remote application sharing, the ITU T.120 architecture, prevents it from supporting highly interactive, dynamic, high resolution graphics. To illustrate that the specified design criteria can provide a highly effective analysis tool within a collaborative problem solving environment, a scientific analysis tool that contains the specified design criteria has been integrated into a collaborative environment and tested for effectiveness. The tests were conducted in collaborations between remote sites in the US and between remote sites on different continents. The tests showed that the tool (a tool for the visual analysis of computer simulations of physics) was highly effective for both synchronous and asynchronous collaborative analyses.
The important features provided by the tool (and made possible by the specified design criteria) are: 1. The tool provides highly interactive, dynamic, high resolution, 3D graphics. 2. All remote scientists can view the same dynamic, high resolution, 3D scenes of the analysis as the analysis is being conducted. 3. The responsiveness of the tool is nearly identical to the responsiveness of the tool in a stand-alone mode. 4. The scientists can transfer control of the analysis between themselves. 5. Any analysis session or segment of an analysis session, whether done individually or collaboratively, can be recorded and posted on the Web for other scientists or students to download and play in either a collaborative or individual mode. 6. The scientist or student who downloaded the session can, individually or collaboratively, modify or extend the session with his/her own "what if" analysis of the data and post his/her version of the analysis back onto the Web. 7. The peak network bandwidth used in the collaborative sessions is only 1K bit/second even though the scientists at all sites are viewing high resolution (1280 x 1024 pixels), dynamic, 3D scenes of the analysis. The links between the specified design criteria and these performance features are presented.
Top-attack modeling and automatic target detection using synthetic FLIR scenery
NASA Astrophysics Data System (ADS)
Weber, Bruce A.; Penn, Joseph A.
2004-09-01
A series of experiments has been performed to verify the utility of algorithmic tools for the modeling and analysis of cold-target signatures in synthetic, top-attack, FLIR video sequences. The tools include: MuSES/CREATION for the creation of synthetic imagery with targets, an ARL target detection algorithm to detect embedded synthetic targets in scenes, and an ARL scoring algorithm, using Receiver-Operating-Characteristic (ROC) curve analysis, to evaluate detector performance. Cold-target detection variability was examined as a function of target emissivity, surrounding clutter type, and target placement in non-obscuring clutter locations. Detector metrics were also individually scored so as to characterize the effect of signature/clutter variations. Results show that using these tools, a detailed, physically meaningful target detection analysis is possible and that scenario-specific target detectors may be developed by selective choice and/or weighting of detector metrics. However, developing these tools into a reliable predictive capability will require the extension of these results to the modeling and analysis of a large number of data sets configured for a wide range of target and clutter conditions. Finally, these tools should also be useful for the comparison of competing detection algorithms by providing well-defined and controllable target detection scenarios, as well as for the training and testing of expert human observers.
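The ROC-based scoring mentioned above can be sketched directly: sweep a threshold over detector confidence scores, accumulate true- and false-positive rates, and integrate the resulting curve. The labels and scores below are synthetic stand-ins for detector output, not ARL data.

```python
import numpy as np

def roc_curve(labels, scores):
    """Return (fpr, tpr) arrays, thresholding at each score in descending order."""
    order = np.argsort(-scores)                  # most confident detections first
    labels = labels[order]
    tpr = np.cumsum(labels) / labels.sum()       # fraction of targets detected
    fpr = np.cumsum(1 - labels) / (1 - labels).sum()  # false alarms among non-targets
    return fpr, tpr

def auc(fpr, tpr):
    """Area under the ROC curve via the trapezoid rule."""
    return float(np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2))

labels = np.array([1, 1, 1, 0, 1, 0, 0, 0])      # 1 = real target, 0 = clutter
scores = np.array([0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2])
fpr, tpr = roc_curve(labels, scores)
print(auc(fpr, tpr))  # 0.9375: 15 of 16 target/non-target pairs correctly ordered
```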
Karmakar, Sougata; Pal, Madhu Sudan; Majumdar, Deepti; Majumdar, Dhurjati
2012-01-01
Ergonomic evaluation of visual demands becomes crucial for operators/users when rapid decision making is needed under extreme time constraints, as in the navigation task of a jet aircraft. The research reported here comprises an ergonomic evaluation of a pilot's vision in a jet aircraft in a virtual environment, to demonstrate how the vision analysis tools of digital human modeling software can be used effectively for such a study. Three dynamic digital pilot models, representative of the smallest, average, and largest Indian pilots, were generated from an anthropometric database and interfaced with a digital prototype of the cockpit in Jack software for analysis of vision within and outside the cockpit. Vision analysis tools like view cones, eye view windows, blind spot area, obscuration zone, reflection zone, etc. were employed during evaluation of visual fields. The vision analysis tool was also used for studying kinematic changes of the pilot's body joints during a simulated gazing activity. From the present study, it can be concluded that the vision analysis tool of digital human modeling software is very effective in evaluating the position and alignment of different displays and controls in the workstation based upon their priorities within the visual fields and the anthropometry of the targeted users, long before the development of a physical prototype.
Investigating student communities with network analysis of interactions in a physics learning center
NASA Astrophysics Data System (ADS)
Brewe, Eric; Kramer, Laird; Sawtelle, Vashti
2012-06-01
Developing a sense of community among students is one of the three pillars of an overall reform effort to increase participation in physics, and the sciences more broadly, at Florida International University. The emergence of a research and learning community, embedded within a course reform effort, has contributed to increased recruitment and retention of physics majors. We utilize social network analysis to quantify interactions in Florida International University’s Physics Learning Center (PLC) that support the development of academic and social integration. The tools of social network analysis allow us to visualize and quantify student interactions and characterize the roles of students within a social network. After providing a brief introduction to social network analysis, we use sequential multiple regression modeling to evaluate factors that contribute to participation in the learning community. Results of the sequential multiple regression indicate that the PLC learning community is an equitable environment as we find that gender and ethnicity are not significant predictors of participation in the PLC. We find that providing students space for collaboration provides a vital element in the formation of a supportive learning community.
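The social-network-analysis step described above can be illustrated with a toy computation of degree centrality, one of the standard measures used to characterize a student's role in an interaction network. The names and interaction edges below are made up for illustration; the study's own data and measures are not reproduced here.

```python
# Toy sketch of a social-network-analysis step: normalized degree
# centrality from pairwise student interactions (hypothetical data).
from collections import defaultdict

interactions = [("ana", "ben"), ("ana", "cam"), ("ben", "cam"), ("cam", "dee")]

degree = defaultdict(int)
for a, b in interactions:
    degree[a] += 1
    degree[b] += 1

# Normalized degree centrality: degree / (n - 1), where n is the number
# of students in the network. A value of 1.0 means the student interacted
# with everyone else.
n = len(degree)
centrality = {node: d / (n - 1) for node, d in degree.items()}
```

In practice such measures are computed with a dedicated library (e.g. NetworkX) on the full interaction dataset; this sketch only shows the underlying arithmetic.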
CDPP activities: Promoting research and education in space physics
NASA Astrophysics Data System (ADS)
Genot, V. N.; Andre, N.; Cecconi, B.; Gangloff, M.; Bouchemit, M.; Dufourg, N.; Pitout, F.; Budnik, E.; Lavraud, B.; Rouillard, A. P.; Heulet, D.; Bellucci, A.; Durand, J.; Delmas, D.; Alexandrova, O.; Briand, C.; Biegun, A.
2015-12-01
The French Plasma Physics Data Centre (CDPP, http://cdpp.eu/) has addressed, for more than 15 years, all issues pertaining to natural plasma data distribution and valorization. Initially established by CNES and CNRS on the foundation of a solid data archive, CDPP activities diversified with the advent of broader networks and interoperability standards, and through fruitful collaborations (e.g. with NASA/PDS): providing access to remote data and designing and building science-driven analysis tools then moved to the forefront of CDPP developments. For instance, today AMDA helps scientists all over the world access and analyze data from ancient to very recent missions (from Voyager, Galileo, Geotail, ... to Maven, Rosetta, MMS, ...) as well as results from models and numerical simulations. Other tools like the Propagation Tool or 3DView allow users to put their data in context and interconnect with other databases (CDAWeb, MEDOC) and tools (Topcat). This presentation will briefly review this evolution, show technical and science use cases, and finally put CDPP activities in the perspective of ongoing collaborative projects (Europlanet H2020, HELCATS, ...) and future missions (BepiColombo, Solar Orbiter, ...).
ERIC Educational Resources Information Center
Dadaczynski, Kevin; Paulus, Peter; de Vries, Nanne; de Ruiter, Silvia; Buijs, Goof
2010-01-01
The HEPS Inventory Tool aims to support stakeholders working in school health promotion to promote high quality interventions on healthy eating and physical activity. As a tool it provides a step-by-step approach on how to develop a national or regional inventory of existing school based interventions on healthy eating and physical activity. It…
Big Data in HEP: A comprehensive use case study
NASA Astrophysics Data System (ADS)
Gutsche, Oliver; Cremonesi, Matteo; Elmer, Peter; Jayatilaka, Bo; Kowalkowski, Jim; Pivarski, Jim; Sehrish, Saba; Mantilla Suárez, Cristina; Svyatkovskiy, Alexey; Tran, Nhan
2017-10-01
Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems collectively called Big Data technologies have emerged to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and promise a fresh look at analysis of very large datasets and could potentially reduce the time-to-physics with increased interactivity. In this talk, we present an active LHC Run 2 analysis, searching for dark matter with the CMS detector, as a testbed for Big Data technologies. We directly compare the traditional NTuple-based analysis with an equivalent analysis using Apache Spark on the Hadoop ecosystem and beyond. In both cases, we start the analysis with the official experiment data formats and produce publication physics plots. We will discuss advantages and disadvantages of each approach and give an outlook on further studies needed.
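The "filtering and transforming" pattern that both the NTuple-based and the Spark-based workflows implement can be sketched in plain Python: apply a selection cut to events, then fill a histogram of a kinematic quantity. The event fields, cut values, and binning below are hypothetical, chosen only to mimic the shape of a dark-matter-style selection; a real analysis would run the same logic over billions of events in ROOT or Spark.

```python
# Minimal sketch of the filter-and-transform pattern shared by the
# NTuple-based and Spark-based analyses (hypothetical event records).
events = [
    {"met": 250.0, "n_jets": 3, "lead_jet_pt": 180.0},
    {"met": 420.0, "n_jets": 2, "lead_jet_pt": 310.0},
    {"met": 90.0,  "n_jets": 4, "lead_jet_pt": 75.0},
]

# Selection cut: high missing transverse energy plus a jet requirement,
# as in a monojet-style dark-matter search (illustrative thresholds).
selected = [e for e in events if e["met"] > 200.0 and e["n_jets"] >= 2]

# "Fill" a coarse histogram of the leading-jet pT for surviving events.
bin_edges = [0, 100, 200, 300, 400]
counts = [0] * (len(bin_edges) - 1)
for e in selected:
    for i, (lo, hi) in enumerate(zip(bin_edges, bin_edges[1:])):
        if lo <= e["lead_jet_pt"] < hi:
            counts[i] += 1
```

In Spark the list comprehension becomes a distributed `filter`/`map` over a DataFrame partitioned across the cluster, but the analysis logic is the same.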
Human performance cognitive-behavioral modeling: a benefit for occupational safety.
Gore, Brian F
2002-01-01
Human Performance Modeling (HPM) is a computer-aided job analysis software methodology used to generate predictions of complex human-automation integration and system flow patterns with the goal of improving operator and system safety. The use of HPM tools has recently been increasing due to reductions in computational cost, augmentations in the tools' fidelity, and usefulness in the generated output. An examination of an Air Man-machine Integration Design and Analysis System (Air MIDAS) model evaluating complex human-automation integration currently underway at NASA Ames Research Center will highlight the importance to occupational safety of considering both cognitive and physical aspects of performance when researching human error.
Application of a faith-based integration tool to assess mental and physical health interventions
Saunders, Donna M.; Leak, Jean; Carver, Monique E.; Smith, Selina A.
2017-01-01
Background To build on current research involving faith-based interventions (FBIs) for addressing mental and physical health, this study a) reviewed the extent to which relevant publications integrate faith concepts with health and b) initiated analysis of the degree of FBI integration with intervention outcomes. Methods Derived from a systematic search of articles published between 2007 and 2017, 36 studies were assessed with a Faith-Based Integration Assessment Tool (FIAT) to quantify faith-health integration. Basic statistical procedures were employed to determine the association of faith-based integration with intervention outcomes. Results The assessed studies possessed (on average) moderate, inconsistent integration because of poor use of faith measures, and moderate, inconsistent use of faith practices. Analysis procedures for determining the effect of FBI integration on intervention outcomes were inadequate for formulating practical conclusions. Conclusions Regardless of integration, interventions were associated with beneficial outcomes. To determine the link between FBI integration and intervention outcomes, additional analyses are needed. PMID:29354795
Fault management for the Space Station Freedom control center
NASA Technical Reports Server (NTRS)
Clark, Colin; Jowers, Steven; Mcnenny, Robert; Culbert, Chris; Kirby, Sarah; Lauritsen, Janet
1992-01-01
This paper describes model based reasoning fault isolation in complex systems using automated digraph analysis. It discusses the use of the digraph representation as the paradigm for modeling physical systems and a method for executing these failure models to provide real-time failure analysis. It also discusses the generality, ease of development and maintenance, complexity management, and susceptibility to verification and validation of digraph failure models. It specifically describes how a NASA-developed digraph evaluation tool and an automated process working with that tool can identify failures in a monitored system when supplied with one or more fault indications. This approach is well suited to commercial applications of real-time failure analysis in complex systems because it is both powerful and cost effective.
[Application of fluid mechanics and simulation: urinary tract and ureteral catheters].
Gómez-Blanco, J C; Martínez-Reina, J; Cruz, D; Blas Pagador, J; Sánchez-Margallo, F M; Soria, F
2016-10-01
The mechanics of urine during its transport from the renal pelvis to the bladder is of great interest for urologists. Knowledge of the different physical variables and their interrelationship, both in physiologic movements and in pathologies, will support better diagnosis and treatment. The objective of this chapter is to present the physical principles and their most relevant basic relations in urine transport, and to bring them closer to the clinical world. For that, we explain the movement of urine during peristalsis, during ureteral obstruction, and in a ureter with a stent. This explanation is based on two tools used in bioengineering: theoretical analysis through the theory of continuous media and fluid mechanics, and computational simulation, which offers a practical solution for each scenario. Moreover, we review other contributions of bioengineering to the field of urology, such as physical simulation or additive and subtractive manufacturing techniques. Finally, we list the current limitations of these tools and the technological development lines with the most future projection. In this chapter we aim to help urologists understand some important concepts of bioengineering, promoting multidisciplinary cooperation to offer complementary tools that help in the diagnosis and treatment of diseases.
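A first-order example of the fluid-mechanics relations discussed above is the Hagen-Poiseuille law, which gives the volumetric flow rate of a viscous fluid through a rigid tube. Treating a stented ureter as such a tube is a strong simplification (real ureters are compliant and peristaltic), and the dimensions below are illustrative, not clinical data.

```python
import math

# Hagen-Poiseuille estimate of volumetric flow through a tube: a
# first-order sketch of the urine-transport relations discussed above.
# Modeling a stented ureter as a rigid tube is a simplification, and
# the example dimensions are illustrative only.
def poiseuille_flow(radius_m, length_m, dp_Pa, viscosity_Pa_s):
    """Volumetric flow rate (m^3/s) for laminar flow in a rigid tube."""
    return math.pi * radius_m ** 4 * dp_Pa / (8 * viscosity_Pa_s * length_m)
```

Note the fourth-power dependence on radius: halving the lumen radius cuts the flow by a factor of 16, which is why even modest obstructions matter clinically.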
The effectiveness of physical models in teaching anatomy: a meta-analysis of comparative studies.
Yammine, Kaissar; Violato, Claudio
2016-10-01
There are various educational methods used in anatomy teaching. While three dimensional (3D) visualization technologies are gaining ground due to their ever-increasing realism, reports investigating physical models as a low-cost 3D traditional method are still the subject of considerable interest. The aim of this meta-analysis is to quantitatively assess the effectiveness of such models based on comparative studies. Eight studies (7 randomized trials; 1 quasi-experimental) including 16 comparison arms and 820 learners met the inclusion criteria. Primary outcomes were defined as factual, spatial and overall percentage scores. The meta-analytical results are: educational methods using physical models yielded significantly better results when compared to all other educational methods for the overall knowledge outcome (p < 0.001) and for spatial knowledge acquisition (p < 0.001). Significantly better results were also found with regard to the long-retention knowledge outcome (p < 0.01). No significance was found for the factual knowledge acquisition outcome. The evidence in the present systematic review was found to have high internal validity and at least an acceptable strength. In conclusion, physical anatomical models offer a promising tool for teaching gross anatomy in 3D representation due to their easy accessibility and educational effectiveness. Such models could be a practical tool to bring up the learners' level of gross anatomy knowledge at low cost.
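The pooling step at the heart of a meta-analysis like the one summarized above can be sketched with inverse-variance (fixed-effect) weighting of per-study effect sizes. The effect sizes and standard errors in the test below are made up for illustration and are not the study's data.

```python
# Inverse-variance (fixed-effect) pooling of per-study effect sizes,
# the basic arithmetic behind a meta-analysis. Illustrative only; the
# cited meta-analysis' own data and model are not reproduced here.
def pooled_effect(effects, std_errs):
    """Return (pooled estimate, pooled standard error). Each study is
    weighted by the inverse of its sampling variance."""
    weights = [1 / se ** 2 for se in std_errs]
    est = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = (1 / sum(weights)) ** 0.5
    return est, se
```

A random-effects model (adding a between-study variance component) is usually preferred when studies are heterogeneous, as comparative education studies tend to be.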
Teaching Physics with Basketball
NASA Astrophysics Data System (ADS)
Chanpichai, N.; Wattanakasiwich, P.
2010-07-01
Recently, technologies and computers have taken important roles in learning and teaching, including in physics. Advances in technology can help us better relate the physics taught in the classroom to the real world. In this study, we developed a module on teaching projectile motion through shooting a basketball. Students learned about the physics of projectile motion, and then they took videos of their classmates shooting a basketball using a high-speed camera. They then analyzed the videos using Tracker, a video analysis and modeling tool. While working with Tracker, students learned about the relationships between the three kinematics graphs. Moreover, they learned about real projectile motion (with air resistance) through modeling tools. Students' abilities to interpret kinematics graphs were investigated before and after the instruction by using the Test of Understanding Graphs in Kinematics (TUG-K). The maximum normalized gain or
Analyzing Virtual Physics Simulations with Tracker
NASA Astrophysics Data System (ADS)
Claessens, Tom
2017-12-01
In the physics teaching community, Tracker is well known as a user-friendly open source video analysis software, authored by Douglas Brown. With this tool, the user can trace markers indicated on a video or on stroboscopic photos and perform kinematic analyses. Tracker also includes a data modeling tool that allows one to fit some theoretical equations of motion onto experimentally obtained data. In the field of particle mechanics, Tracker has been effectively used for learning and teaching about projectile motion, "toss up" and free-fall vertical motion, and to explain the principle of mechanical energy conservation. Also, Tracker has been successfully used in rigid body mechanics to interpret the results of experiments with rolling/slipping cylinders and moving rods. In this work, I propose an original method in which Tracker is used to analyze virtual computer simulations created with a physics-based motion solver, instead of analyzing video recording or stroboscopic photos. This could be an interesting approach to study kinematics and dynamics problems in physics education, in particular when there is no or limited access to physical labs. I demonstrate the working method with a typical (but quite challenging) problem in classical mechanics: a slipping/rolling cylinder on a rough surface.
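The kind of kinematic check Tracker's data tool performs can be sketched by recovering a constant acceleration from equally spaced position samples with a second-order finite difference. The sample data below are generated from the ideal drag-free free-fall law, not taken from any real video or simulation.

```python
# Recovering acceleration from equally spaced position samples via the
# central second difference -- the kinematic analysis a tool like Tracker
# automates. Sample data follow x(t) = 0.5 * a * t^2 with a = -9.8 m/s^2
# (ideal free fall, no drag); this is synthetic, illustrative data.
dt = 0.1  # sampling interval in seconds (e.g. one video frame)
xs = [0.5 * -9.8 * (i * dt) ** 2 for i in range(5)]

# Second difference: (x[i] - 2*x[i+1] + x[i+2]) / dt^2 estimates a.
accels = [(xs[i] - 2 * xs[i + 1] + xs[i + 2]) / dt ** 2 for i in range(3)]
```

With real tracked data the same estimate is noisy, which is why Tracker's model-fitting tool (fitting the full equation of motion) is preferable to pointwise differencing.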
PREFACE: International Conference on Computing in High Energy and Nuclear Physics (CHEP'07)
NASA Astrophysics Data System (ADS)
Sobie, Randall; Tafirout, Reda; Thomson, Jana
2007-07-01
The 2007 International Conference on Computing in High Energy and Nuclear Physics (CHEP) was held on 2-7 September 2007 in Victoria, British Columbia, Canada. CHEP is a major series of international conferences for physicists and computing professionals from the High Energy and Nuclear Physics community, Computer Science and Information Technology. The CHEP conference provides an international forum to exchange information on computing experience and needs for the community, and to review recent, ongoing, and future activities. The CHEP'07 conference had close to 500 attendees with a program that included plenary sessions of invited oral presentations, a number of parallel sessions comprising oral and poster presentations, and an industrial exhibition. Conference tracks covered topics in Online Computing, Event Processing, Software Components, Tools and Databases, Software Tools and Information Systems, Computing Facilities, Production Grids and Networking, Grid Middleware and Tools, Distributed Data Analysis and Information Management and Collaborative Tools. The conference included a successful whale-watching excursion involving over 200 participants and a banquet at the Royal British Columbia Museum. The next CHEP conference will be held in Prague in March 2009. We would like to thank the sponsors of the conference and the staff at the TRIUMF Laboratory and the University of Victoria who made CHEP'07 a success. Randall Sobie and Reda Tafirout CHEP'07 Conference Chairs
IViPP: A Tool for Visualization in Particle Physics
NASA Astrophysics Data System (ADS)
Tran, Hieu; Skiba, Elizabeth; Baldwin, Doug
2011-10-01
Experiments and simulations in physics generate a lot of data; visualization is helpful to prepare that data for analysis. IViPP (Interactive Visualizations in Particle Physics) is an interactive computer program that visualizes results of particle physics simulations or experiments. IViPP can handle data from different simulators, such as SRIM or MCNP. It can display relevant geometry and measured scalar data; it can do simple selection from the visualized data. In order to be an effective visualization tool, IViPP must have a software architecture that can flexibly adapt to new data sources and display styles. It must be able to display complicated geometry and measured data with a high dynamic range. We therefore organize it in a highly modular structure, we develop libraries to describe geometry algorithmically, use rendering algorithms running on the powerful GPU to display 3-D geometry at interactive rates, and we represent scalar values in a visual form of scientific notation that shows both mantissa and exponent. This work was supported in part by the US Department of Energy through the Laboratory for Laser Energetics (LLE), with special thanks to Craig Sangster at LLE.
MOD Tool (Microwave Optics Design Tool)
NASA Technical Reports Server (NTRS)
Katz, Daniel S.; Borgioli, Andrea; Cwik, Tom; Fu, Chuigang; Imbriale, William A.; Jamnejad, Vahraz; Springer, Paul L.
1999-01-01
The Jet Propulsion Laboratory (JPL) is currently designing and building a number of instruments that operate in the microwave and millimeter-wave bands. These include MIRO (Microwave Instrument for the Rosetta Orbiter), MLS (Microwave Limb Sounder), and IMAS (Integrated Multispectral Atmospheric Sounder). These instruments must be designed and built to meet key design criteria (e.g., beamwidth, gain, pointing) obtained from the scientific goals for the instrument. These criteria are frequently functions of the operating environment (both thermal and mechanical). To design and build instruments which meet these criteria, it is essential to be able to model the instrument in its environments. Currently, a number of modeling tools exist. Commonly used tools at JPL include: FEMAP (meshing), NASTRAN (structural modeling), TRASYS and SINDA (thermal modeling), MACOS/IMOS (optical modeling), and POPO (physical optics modeling). Each of these tools is used by an analyst, who models the instrument in one discipline. The analyst then provides the results of this modeling to another analyst, who continues the overall modeling in another discipline. There is a large reengineering task in place at JPL to automate and speed-up the structural and thermal modeling disciplines, which does not include MOD Tool. The focus of MOD Tool (and of this paper) is in the fields unique to microwave and millimeter-wave instrument design. These include initial design and analysis of the instrument without thermal or structural loads, the automation of the transfer of this design to a high-end CAD tool, and the analysis of the structurally deformed instrument (due to structural and/or thermal loads). MOD Tool is a distributed tool, with a database of design information residing on a server, physical optics analysis being performed on a variety of supercomputer platforms, and a graphical user interface (GUI) residing on the user's desktop computer. 
The MOD Tool client is being developed using Tcl/Tk, which allows the user to work on a choice of platforms (PC, Mac, or Unix) after downloading the Tcl/Tk binary, which is readily available on the web. The MOD Tool server is written using Expect, and it resides on a Sun workstation. Client/server communications are performed over a socket; upon a connection from a client to the server, the server spawns a child process that is dedicated to communicating with that client. The server communicates with other machines, such as supercomputers, using Expect, with the username and password provided by the user on the client.
Salko, Robert K.; Schmidt, Rodney C.; Avramova, Maria N.
2014-11-23
This study describes major improvements to the computational infrastructure of the CTF subchannel code so that full-core, pincell-resolved (i.e., one computational subchannel per real bundle flow channel) simulations can now be performed in much shorter run-times, either in stand-alone mode or as part of coupled-code multi-physics calculations. These improvements support the goals of the Department Of Energy Consortium for Advanced Simulation of Light Water Reactors (CASL) Energy Innovation Hub to develop high fidelity multi-physics simulation tools for nuclear energy design and analysis.
Inertial focusing of microparticles and its limitations
NASA Astrophysics Data System (ADS)
Cruz, FJ; Hooshmand Zadeh, S.; Wu, ZG; Hjort, K.
2016-10-01
Microfluidic devices are useful tools for healthcare, biological and chemical analysis, and materials synthesis, among other fields that can benefit from the unique physics of these systems. In this paper we studied inertial focusing as a tool for hydrodynamic sorting of particles by size. Theory and experimental results are provided as background for a discussion on how to extend the technology to submicron particles. Different geometries and dimensions of microchannels were designed, and simulation data were compared to the experimental results.
Technical Manual for the SAM Physical Trough Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wagner, M. J.; Gilman, P.
2011-06-01
NREL, in conjunction with Sandia National Lab and the U.S. Department of Energy, developed the System Advisor Model (SAM) analysis tool for renewable energy system performance and economic analysis. This paper documents the technical background and engineering formulation for one of SAM's two parabolic trough system models. The Physical Trough model calculates performance relationships based on physical first principles where possible, allowing the modeler to predict electricity production for a wider range of component geometries than is possible in the Empirical Trough model. This document describes the major parabolic trough plant subsystems in detail, including the solar field, power block, thermal storage, piping, auxiliary heating, and control systems. This model makes use of both existing subsystem performance modeling approaches and new approaches developed specifically for SAM.
High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems
Mahadevan, Vijay S.; Merzari, Elia; Tautges, Timothy; ...
2014-06-30
An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. Finally, the coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework.
NASA Astrophysics Data System (ADS)
van den Berg, J. C.
2004-03-01
A guided tour J. C. van den Berg; 1. Wavelet analysis, a new tool in physics J.-P. Antoine; 2. The 2-D wavelet transform, physical applications J.-P. Antoine; 3. Wavelets and astrophysical applications A. Bijaoui; 4. Turbulence analysis, modelling and computing using wavelets M. Farge, N. K.-R. Kevlahan, V. Perrier and K. Schneider; 5. Wavelets and detection of coherent structures in fluid turbulence L. Hudgins and J. H. Kaspersen; 6. Wavelets, non-linearity and turbulence in fusion plasmas B. Ph. van Milligen; 7. Transfers and fluxes of wind kinetic energy between orthogonal wavelet components during atmospheric blocking A. Fournier; 8. Wavelets in atomic physics and in solid state physics J.-P. Antoine, Ph. Antoine and B. Piraux; 9. The thermodynamics of fractals revisited with wavelets A. Arneodo, E. Bacry and J. F. Muzy; 10. Wavelets in medicine and physiology P. Ch. Ivanov, A. L. Goldberger, S. Havlin, C.-K. Peng, M. G. Rosenblum and H. E. Stanley; 11. Wavelet dimension and time evolution Ch.-A. Guérin and M. Holschneider.
NASA Astrophysics Data System (ADS)
van den Berg, J. C.
1999-08-01
A guided tour J. C. van den Berg; 1. Wavelet analysis, a new tool in physics J.-P. Antoine; 2. The 2-D wavelet transform, physical applications J.-P. Antoine; 3. Wavelets and astrophysical applications A. Bijaoui; 4. Turbulence analysis, modelling and computing using wavelets M. Farge, N. K.-R. Kevlahan, V. Perrier and K. Schneider; 5. Wavelets and detection of coherent structures in fluid turbulence L. Hudgins and J. H. Kaspersen; 6. Wavelets, non-linearity and turbulence in fusion plasmas B. Ph. van Milligen; 7. Transfers and fluxes of wind kinetic energy between orthogonal wavelet components during atmospheric blocking A. Fournier; 8. Wavelets in atomic physics and in solid state physics J.-P. Antoine, Ph. Antoine and B. Piraux; 9. The thermodynamics of fractals revisited with wavelets A. Arneodo, E. Bacry and J. F. Muzy; 10. Wavelets in medicine and physiology P. Ch. Ivanov, A. L. Goldberger, S. Havlin, C.-K. Peng, M. G. Rosenblum and H. E. Stanley; 11. Wavelet dimension and time evolution Ch.-A. Guérin and M. Holschneider.
The development and validation of the Physical Appearance Comparison Scale-Revised (PACS-R).
Schaefer, Lauren M; Thompson, J Kevin
2014-04-01
The Physical Appearance Comparison Scale (PACS; Thompson, Heinberg, & Tantleff, 1991) was revised to assess appearance comparisons relevant to women and men in a wide variety of contexts. The revised scale (Physical Appearance Comparison Scale-Revised, PACS-R) was administered to 1176 college females. In Study 1, exploratory factor analysis and parallel analysis using one half of the sample suggested a single factor structure for the PACS-R. Study 2 utilized the remaining half of the sample to conduct confirmatory factor analysis, item analysis, and to examine the convergent validity of the scale. These analyses resulted in an 11-item measure that demonstrated excellent internal consistency and convergent validity with measures of body satisfaction, eating pathology, sociocultural influences on appearance, and self-esteem. Regression analyses demonstrated the utility of the PACS-R in predicting body satisfaction and eating pathology. Overall, results indicate that the PACS-R is a reliable and valid tool for assessing appearance comparison tendencies in women.
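The internal-consistency check reported for scales like the PACS-R is typically Cronbach's alpha, which can be computed directly from the per-item response matrix. The formula below is standard; the item responses in the test are fabricated for illustration and have nothing to do with the study's data.

```python
# Cronbach's alpha: the standard internal-consistency statistic for a
# multi-item scale. Illustrative implementation; any data passed in the
# examples is made up, not the PACS-R validation sample.
def cronbach_alpha(items):
    """items: one inner list per scale item, aligned across the same
    respondents (items[i][j] = respondent j's score on item i)."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(var(it) for it in items)
    totals = [sum(it[j] for it in items) for j in range(n)]
    return k / (k - 1) * (1 - item_vars / var(totals))
```

Alpha approaches 1 when items covary strongly (as "excellent internal consistency" implies) and falls toward 0 when they do not.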
Laser Powered Launch Vehicle Performance Analyses
NASA Technical Reports Server (NTRS)
Chen, Yen-Sen; Liu, Jiwen; Wang, Ten-See (Technical Monitor)
2001-01-01
The purpose of this study is to establish the technical ground for modeling the physics of the laser-powered pulse detonation phenomenon. Laser-powered propulsion systems involve complex fluid dynamics, thermodynamics, and radiative transfer processes. Successful prediction of the performance of laser-powered launch vehicle concepts depends on sophisticated models that reflect the underlying flow physics, including laser ray tracing and focusing, inverse Bremsstrahlung (IB) effects, finite-rate air chemistry, thermal non-equilibrium, plasma radiation, and detonation wave propagation. The proposed work will extend the baseline numerical model into an efficient design analysis tool. The proposed model is suitable for 3-D analysis using parallel computing methods.
Commercial D-T FRC Power Plant Systems Analysis
NASA Astrophysics Data System (ADS)
Nguyen, Canh; Santarius, John; Emmert, Gilbert; Steinhauer, Loren; Stubna, Michael
1998-11-01
Results of an engineering issues scoping study of a Field-Reversed Configuration (FRC) burning D-T fuel will be presented. The study primarily focuses on engineering issues, such as tritium-breeding blanket design, radiation shielding, neutron damage, activation, safety, and environment. This presentation will concentrate on plasma physics, current drive, economics, and systems integration, which are important for the overall systems analysis. A systems code serves as the key tool in defining a reference point for detailed physics and engineering calculations plus parametric variations, and typical cases will be presented. Advantages of the cylindrical geometry and high beta (plasma pressure/magnetic-field pressure) are evident.
Infrared Thermal Imaging as a Tool in University Physics Education
ERIC Educational Resources Information Center
Mollmann, Klaus-Peter; Vollmer, Michael
2007-01-01
Infrared thermal imaging is a valuable tool in physics education at the university level. It can help to visualize and thereby enhance understanding of physical phenomena from mechanics, thermal physics, electromagnetism, optics and radiation physics, qualitatively as well as quantitatively. We report on its use as lecture demonstrations, student…
Modeling and Controls Development of 48V Mild Hybrid Electric Vehicles
The Advanced Light-Duty Powertrain and Hybrid Analysis tool (ALPHA) was created by EPA to evaluate the Greenhouse Gas (GHG) emissions of Light-Duty (LD) vehicles. It is a physics-based, forward-looking, full vehicle computer simulator capable of analyzing various vehicle types c...
NASA Astrophysics Data System (ADS)
Nagendra, K. N.; Bagnulo, Stefano; Centeno, Rebecca; Jesús Martínez González, María.
2015-08-01
Preface; 1. Solar and stellar surface magnetic fields; 2. Future directions in astrophysical polarimetry; 3. Physical processes; 4. Instrumentation for astronomical polarimetry; 5. Data analysis techniques for polarization observations; 6. Polarization diagnostics of atmospheres and circumstellar environments; 7. Polarimetry as a tool for discovery science; 8. Numerical modeling of polarized emission; Author index.
The need for monetary information within corporate water accounting.
Burritt, Roger L; Christ, Katherine L
2017-10-01
A conceptual discussion is provided about the need to add monetary data to water accounting initiatives and how best to achieve this if companies are to become aware of the water crisis and to take actions to improve water management. Analysis of current water accounting initiatives reveals the monetary business case for companies to improve water management is rarely considered, there being a focus on physical information about water use. Three possibilities emerge for mainstreaming the integration of monetization into water accounting: add-on to existing water accounting frameworks and tools, develop new tools which include physical and monetary information from the start, and develop environmental management accounting (EMA) into a water-specific application and set of tools. The paper appraises these three alternatives and concludes that development of EMA would be the best way forward. Suggestions for further research include the need to examine the use of a transdisciplinary method to address the complexities of water accounting.
Durant, Nefertiti H; Joseph, Rodney P; Cherrington, Andrea; Cuffee, Yendelela; Knight, BernNadette; Lewis, Dwight; Allison, Jeroan J
2014-01-16
Innovative approaches are needed to promote physical activity among young adult overweight and obese African American women. We sought to describe key elements that African American women desire in a culturally relevant Internet-based tool to promote physical activity among overweight and obese young adult African American women. A mixed-method approach combining nominal group technique and traditional focus groups was used to elicit recommendations for the development of an Internet-based physical activity promotion tool. Participants, ages 19 to 30 years, were enrolled in a major university. Nominal group technique sessions were conducted to identify themes viewed as key features for inclusion in a culturally relevant Internet-based tool. Confirmatory focus groups were conducted to verify and elicit more in-depth information on the themes. Twenty-nine women participated in nominal group (n = 13) and traditional focus group sessions (n = 16). Features that emerged to be included in a culturally relevant Internet-based physical activity promotion tool were personalized website pages, diverse body images on websites and in videos, motivational stories about physical activity and women similar to themselves in size and body shape, tips on hair care maintenance during physical activity, and online social support through social media (eg, Facebook, Twitter). Incorporating existing social media tools and motivational stories from young adult African American women in Internet-based tools may increase the feasibility, acceptability, and success of Internet-based physical activity programs in this high-risk, understudied population.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Y.; Parsons, T.; King, R.
This report summarizes the theory, verification, and validation of a new sizing tool for wind turbine drivetrain components, the Drivetrain Systems Engineering (DriveSE) tool. DriveSE calculates the dimensions and mass properties of the hub, main shaft, main bearing(s), gearbox, bedplate, transformer if up-tower, and yaw system. The level of fidelity for each component varies depending on whether semiempirical parametric or physics-based models are used. The physics-based models have internal iteration schemes based on system constraints and design criteria. Every model is validated against available industry data or finite-element analysis. The verification and validation results show that the models reasonably capture primary drivers for the sizing and design of major drivetrain components.
Topography measurements and applications in ballistics and tool mark identifications*
Vorburger, T V; Song, J; Petraco, N
2016-01-01
The application of surface topography measurement methods to the field of firearm and toolmark analysis is fairly new. The field has been boosted by the development of a number of competing optical methods, which has improved the speed and accuracy of surface topography acquisitions. We describe here some of these measurement methods as well as several analytical methods for assessing similarities and differences among pairs of surfaces. We also provide a few examples of research results to identify cartridge cases originating from the same firearm or tool marks produced by the same tool. Physical standards and issues of traceability are also discussed. PMID:27182440
Tools for Detecting Causality in Space Systems
NASA Astrophysics Data System (ADS)
Johnson, J.; Wing, S.
2017-12-01
Complex systems such as the solar and magnetospheric environment often exhibit patterns of behavior that suggest underlying organizing principles. Causality is a key organizing principle that is particularly difficult to establish in strongly coupled nonlinear systems, but essential for understanding and modeling the behavior of systems. While traditional methods of time-series analysis can identify linear correlations, they do not adequately quantify the distinction between causal and coincidental dependence. We discuss tools for detecting causality including Granger causality, transfer entropy, conditional redundancy, and convergent cross mapping. The tools are illustrated by applications to magnetospheric and solar physics including radiation belt, Dst (a magnetospheric state variable), substorm, and solar cycle dynamics.
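Of the measures listed, Granger causality is the simplest to sketch: x is said to Granger-cause y if x's past improves prediction of y beyond what y's own past provides. The following is a minimal one-lag, least-squares illustration, not the authors' method; the function name and the synthetic driver-response pair are invented, and real analyses use proper F-tests and lag selection.

```python
import numpy as np

def granger_gain(x, y, lag=1):
    """Crude one-lag Granger-style check: does x's past reduce the
    one-step prediction error for y beyond y's own past?  Returns the
    ratio of residual sums of squares (restricted / full); values well
    above 1 suggest x helps predict y."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    target = y[lag:]
    # Restricted model: y_t ~ const + y_{t-lag}
    A_r = np.column_stack([np.ones_like(target), y[:-lag]])
    # Full model: y_t ~ const + y_{t-lag} + x_{t-lag}
    A_f = np.column_stack([np.ones_like(target), y[:-lag], x[:-lag]])
    rss_r = np.sum((target - A_r @ np.linalg.lstsq(A_r, target, rcond=None)[0]) ** 2)
    rss_f = np.sum((target - A_f @ np.linalg.lstsq(A_f, target, rcond=None)[0]) ** 2)
    return rss_r / rss_f

# Synthetic pair in which x drives y with a one-step delay.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = 0.8 * np.roll(x, 1) + rng.normal(scale=0.1, size=500)
print(granger_gain(x, y) > granger_gain(y, x))  # driver direction wins
```

The asymmetry of the ratio in the two directions is what distinguishes this from plain correlation, which is symmetric by construction.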
MSL: Facilitating automatic and physical analysis of published scientific literature in PDF format.
Ahmed, Zeeshan; Dandekar, Thomas
2015-01-01
Published scientific literature contains millions of figures, including information about the results obtained from different scientific experiments, e.g. PCR-ELISA data, microarray analysis, gel electrophoresis, mass spectrometry data, DNA/RNA sequencing, diagnostic imaging (CT/MRI and ultrasound scans), and medical imaging like electroencephalography (EEG), magnetoencephalography (MEG), electrocardiography (ECG), and positron-emission tomography (PET) images. The importance of biomedical figures has been widely recognized in scientific and medical communities, as they play a vital role in providing major original data and experimental and computational results in concise form. One major challenge in implementing a system for scientific literature analysis is extracting and analyzing text and figures from published PDF files by physical and logical document analysis. Here we present a product-line-architecture-based bioinformatics tool, 'Mining Scientific Literature (MSL)', which supports the extraction of text and images by interpreting all kinds of published PDF files using advanced data mining and image processing techniques. It provides modules for the marginalization of extracted text based on different coordinates and keywords, visualization of extracted figures, and extraction of embedded text from all kinds of biological and biomedical figures using Optical Character Recognition (OCR). Moreover, for further analysis and usage, it generates the system's output in different formats including text, PDF, XML, and image files. Hence, MSL is an easy-to-install, easy-to-use analysis tool for interpreting published scientific literature in PDF format.
Measuring Physical Activity in Pregnancy Using Questionnaires: A Meta-Analysis
Schuster, Snježana; Šklempe Kokić, Iva; Sindik, Joško
2016-09-01
Physical activity (PA) during normal pregnancy has various positive effects on pregnant women’s health. Determination of the relationship between PA and health outcomes requires accurate measurement of PA in pregnant women. The purpose of this review is to provide a summary of valid and reliable PA questionnaires for pregnant women. During 2013, the PubMed, OvidSP and Web of Science databases were searched for trials on measurement properties of PA questionnaires for the pregnant population. Six studies and four questionnaires met the inclusion criteria: Pregnancy Physical Activity Questionnaire, Modified Kaiser Physical Activity Survey, Short Pregnancy Leisure Time Physical Activity Questionnaire and Third Pregnancy Infection and Nutrition Study Physical Activity Questionnaire. Assessment of validity and reliability was performed using correlations of the scores in these questionnaires with objective measures and subjective measures (self-report) of PA, as well as test-retest reliability coefficients. Sample sizes included in analysis varied from 45 to 177 subjects. The best validity and reliability characteristics (together with effect sizes) were identified for the Modified Kaiser Physical Activity Survey and Pregnancy Physical Activity Questionnaire (French, Vietnamese, standard). In conclusion, assessment of PA during pregnancy remains a challenging and complex task. Questionnaires are a simple and effective, yet limited tool for assessing PA.
Advanced Usage of Vehicle Sketch Pad for CFD-Based Conceptual Design
NASA Technical Reports Server (NTRS)
Ordaz, Irian; Li, Wu
2013-01-01
Conceptual design is the most fluid phase of aircraft design. It is important to be able to perform large scale design space exploration of candidate concepts that can achieve the design intent to avoid more costly configuration changes in later stages of design. This also means that conceptual design is highly dependent on the disciplinary analysis tools to capture the underlying physics accurately. The required level of analysis fidelity can vary greatly depending on the application. Vehicle Sketch Pad (VSP) allows the designer to easily construct aircraft concepts and make changes as the design matures. More recent development efforts have enabled VSP to bridge the gap to high-fidelity analysis disciplines such as computational fluid dynamics and structural modeling for finite element analysis. This paper focuses on the current state-of-the-art geometry modeling for the automated process of analysis and design of low-boom supersonic concepts using VSP and several capability-enhancing design tools.
ERIC Educational Resources Information Center
John, Deborah H.; Gunter, Katherine; Jackson, Jennifer A.; Manore, Melinda
2016-01-01
Background: Practical tools are needed that reliably measure the complex physical activity (PA) and nutrition environments of elementary schools that influence children's health and learning behaviors for obesity prevention. The School Physical Activity and Nutrition-Environment Tool (SPAN-ET) was developed and beta tested in 6 rural Oregon…
Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cetiner, Mustafa Sacit; none,; Flanagan, George F.
2014-07-30
An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster-than-real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C+, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.
Modeling and Validation of Lithium-ion Automotive Battery Packs (SAE 2013-01-1539)
The Advanced Light-Duty Powertrain and Hybrid Analysis (ALPHA) tool was created by EPA to evaluate the Greenhouse Gas (GHG) emissions of Light-Duty (LD) vehicles. It is a physics-based, forward-looking, full vehicle computer simulator capable of analyzing various vehicle types c...
Organizational Constraints and Goal Setting
ERIC Educational Resources Information Center
Putney, Frederick B.; Wotman, Stephen
1978-01-01
Management modeling techniques are applied to setting operational and capital goals using cost analysis techniques in this case study at the Columbia University School of Dental and Oral Surgery. The model was created as a planning tool used in developing a financially feasible operating plan and a 100 percent physical renewal plan. (LBH)
Network analysis: a new tool for resource managers
Ruth H. Allen
1980-01-01
Resource managers manipulate ecosystems for direct or indirect human uses. Examples of relatively well studied resource management issues include familiar biological products such as: forests, ranges, fish and wildlife; or physical products such as air, water and soil. Until very recently, urban environments received much less scholarly attention. However, as Spurr (...
ERIC Educational Resources Information Center
Wilcox, Bethany R.; Pollock, Steven J.
2015-01-01
Standardized conceptual assessment represents a widely used tool for educational researchers interested in student learning within the standard undergraduate physics curriculum. For example, these assessments are often used to measure student learning across educational contexts and instructional strategies. However, to support the large-scale…
ERIC Educational Resources Information Center
Villano, Matt
2008-01-01
Colleges and universities can never be too prepared, whether for physical attacks or data security breaches. A quick data slice of over 7,000 US higher ed institutions, using the Office of Postsecondary Education's Campus Security Data Analysis Cutting Tool Website and cutting across public and private two- and four-year schools, reveals some…
Benchmarking and Modeling of a Conventional Mid-Size Car Using ALPHA (SAE Paper 2015-01-1140)
The Advanced Light-Duty Powertrain and Hybrid Analysis (ALPHA) modeling tool was created by EPA to estimate greenhouse gas (GHG) emissions of light-duty vehicles. ALPHA is a physics-based, forward-looking, full vehicle computer simulation capable of analyzing various vehicle type...
Mulcahy, Nicholas J; Schubiger, Michèle N; Suddendorf, T
2013-02-01
Great apes appear to have limited knowledge of tool functionality when they are presented with tasks that involve a physical connection between a tool and a reward. For instance, they fail to understand that pulling a rope with a reward tied to its end is more beneficial than pulling a rope that only touches a reward. Apes show more success when both ropes have rewards tied to their ends but one rope is nonfunctional because it is clearly separated into aligned sections. It is unclear, however, whether this success is based on perceptual features unrelated to connectivity, such as perceiving the tool's separate sections as independent tools rather than one discontinuous tool. Surprisingly, there appears to be no study that has tested any type of connectivity problem using natural tools made from branches, with which wild and captive apes often have extensive experience. It is possible that such ecologically valid tools may better help subjects understand connectivity that involves physical attachment. In this study, we tested orangutans with natural tools and a range of connectivity problems that involved the physical attachment of a reward on continuous and broken tools. We found that the orangutans understood tool connectivity involving physical attachment in tasks that apes in other studies had failed when tested with artificial rather than natural tools. We found no evidence that the orangutans' success in broken tool conditions was based on perceptual features unrelated to connectivity. Our results suggest that artificial tools may limit apes' knowledge of connectivity involving physical attachment, whereas ecologically valid tools may have the opposite effect. PsycINFO Database Record (c) 2013 APA, all rights reserved
Understanding the Physical Optics Phenomena by Using a Digital Application for Light Propagation
NASA Astrophysics Data System (ADS)
Sierra-Sosa, Daniel-Esteban; Ángel-Toro, Luciano
2011-01-01
Understanding light propagation on the basis of the Huygens-Fresnel principle is fundamental to a deeper comprehension of physical-optics phenomena such as diffraction, self-imaging, image formation, Fourier analysis and spatial filtering. This constitutes the physical approach of Fourier optics, whose principles and applications have been developed since the 1950s. Both for analytical and digital application purposes, light propagation can be formulated in terms of the Fresnel integral transform. In this work, a digital optics application based on the implementation of the Discrete Fresnel Transform (DFT), intended to serve as a tool for applications in didactics of optics, is presented. At a basic and intermediate learning level, this tool allows exercising the identification of basic phenomena and observing changes associated with modifications of physical parameters. This is achieved by using a friendly graphic user interface (GUI). It also assists users in developing their capacity for abstracting and predicting the characteristics of more complicated phenomena. At an upper learning level, the application can be used to favor a deeper comprehension of the underlying physics and models, and to experiment with new models and configurations. To achieve this, two characteristics of the didactic tool were taken into account when designing it. First, all physical operations, ranging from simple diffraction experiments to digital holography and interferometry, were developed on the basis of the more fundamental concept of light propagation. Second, the algorithm was conceived to be easily upgradable due to its modular architecture based on the MATLAB® software environment. Typical results are presented and briefly discussed in connection with didactics of optics.
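The propagation engine such a tool is built on can be sketched in a few lines. Below is a one-dimensional NumPy illustration of FFT-based Fresnel propagation in its transfer-function form, not the MATLAB application described in the record; all parameter values (wavelength, grid, slit width, distance) are arbitrary choices for the sketch, and the constant phase factor exp(ikz) is omitted.

```python
import numpy as np

# 1-D Fresnel propagation of a slit aperture via the FFT-based
# transfer-function method.  Parameters are illustrative only.
wavelength = 633e-9            # He-Ne red, in meters
N, dx = 1024, 10e-6            # number of samples and sample pitch
z = 0.05                       # propagation distance in meters

x = (np.arange(N) - N // 2) * dx
field = (np.abs(x) < 0.25e-3).astype(complex)     # 0.5 mm slit aperture

fx = np.fft.fftfreq(N, d=dx)                      # spatial frequencies
H = np.exp(-1j * np.pi * wavelength * z * fx**2)  # Fresnel transfer function
propagated = np.fft.ifft(np.fft.fft(field) * H)
intensity = np.abs(propagated) ** 2               # shows diffraction fringes
```

Because the transfer function has unit modulus, the propagation is unitary and the total intensity is conserved, which makes a convenient sanity check for this kind of didactic simulation.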
Easy Web Interfaces to IDL Code for NSTX Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
W.M. Davis
Reusing code is a well-known Software Engineering practice to substantially increase the efficiency of code production, as well as to reduce errors and debugging time. A variety of "Web Tools" for the analysis and display of raw and analyzed physics data are in use on NSTX [1], and new ones can be produced quickly from existing IDL [2] code. A Web Tool with only a few inputs, and which calls an IDL routine written in the proper style, can be created in less than an hour; more typical Web Tools with dozens of inputs, and the need for some adaptation of existing IDL code, can be working in a day or so. Efficiency is also increased for users of Web Tools because of the familiar interface of the web browser, and not needing X-windows, accounts, passwords, etc. Web Tools were adapted for use by PPPL physicists accessing EAST data stored in MDSplus with only a few man-weeks of effort; adapting to additional sites should now be even easier. An overview of Web Tools in use on NSTX, and a list of the most useful features, is also presented.
Memory Circuit Fault Simulator
NASA Technical Reports Server (NTRS)
Sheldon, Douglas J.; McClure, Tucker
2013-01-01
Spacecraft are known to experience significant memory part-related failures and problems, both pre- and post-launch. These memory parts include both static and dynamic memories (SRAM and DRAM). These failures manifest themselves in a variety of ways, such as pattern-sensitive failures, timing-sensitive failures, etc. Because of the mission-critical role memory devices play in spacecraft architecture and operation, understanding their failure modes is vital to successful mission operation. To support this need, a generic simulation tool that can model different data patterns in conjunction with variable write and read conditions was developed. This tool is a mathematical and graphical way to embed pattern, electrical, and physical information to perform what-if analysis as part of a root cause failure analysis effort.
Factors Controlling Sediment Load in The Central Anatolia Region of Turkey: Ankara River Basin.
Duru, Umit; Wohl, Ellen; Ahmadi, Mehdi
2017-05-01
Better understanding of the factors controlling sediment load at a catchment scale can facilitate estimation of soil erosion and sediment transport rates. The research summarized here enhances understanding of correlations between potential control variables on suspended sediment loads. The Soil and Water Assessment Tool was used to simulate flow and sediment at the Ankara River basin. Multivariable regression analysis and principal component analysis were then performed between sediment load and controlling variables. The physical variables were either directly derived from a Digital Elevation Model or from field maps or computed using established equations. Mean observed sediment rate is 6697 ton/year and mean sediment yield is 21 ton/y/km² from the gage. The Soil and Water Assessment Tool satisfactorily simulated observed sediment load, with Nash-Sutcliffe efficiency, relative error, and coefficient of determination (R²) values of 0.81, -1.55, and 0.93, respectively, in the catchment. Therefore, parameter values from the physically based model were applied to the multivariable regression analysis as well as principal component analysis. The results indicate that stream flow, drainage area, and channel width explain most of the variability in sediment load among the catchments. The results imply that siltation management practices in the catchment should target stream flow, drainage area, and channel width.
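The Nash-Sutcliffe efficiency reported above has a simple closed form, NSE = 1 − Σ(Oᵢ − Sᵢ)² / Σ(Oᵢ − Ō)², where O and S are observed and simulated series. A small NumPy sketch follows; the function name and the toy series are ours for illustration, not data from the study.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((O - S)^2) / sum((O - mean(O))^2).
    1.0 is a perfect fit; values <= 0 mean the simulation is no
    better than simply predicting the observed mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    residual = np.sum((observed - simulated) ** 2)
    variance = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - residual / variance

# Toy monthly sediment loads (invented numbers, not the study's data).
obs = np.array([10.0, 12.0, 8.0, 15.0, 11.0])
sim = np.array([9.5, 12.5, 8.5, 14.0, 11.5])
print(round(nash_sutcliffe(obs, sim), 3))  # prints 0.925
```

An NSE of 0.81, as reported for the Ankara basin simulation, sits comfortably in the range usually read as a good hydrological fit.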
Baker, Sarah E; Painter, Elizabeth E; Morgan, Brandon C; Kaus, Anna L; Petersen, Evan J; Allen, Christopher S; Deyle, Gail D; Jensen, Gail M
2017-01-01
Clinical reasoning is essential to physical therapist practice. Solid clinical reasoning processes may lead to greater understanding of the patient condition, early diagnostic hypothesis development, and well-tolerated examination and intervention strategies, as well as mitigate the risk of diagnostic error. However, the complex and often subconscious nature of clinical reasoning can impede the development of this skill. Protracted tools have been published to help guide self-reflection on clinical reasoning but might not be feasible in typical clinical settings. This case illustrates how the Systematic Clinical Reasoning in Physical Therapy (SCRIPT) tool can be used to guide the clinical reasoning process and prompt a physical therapist to search the literature to answer a clinical question and facilitate formal mentorship sessions in postprofessional physical therapist training programs. The SCRIPT tool enabled the mentee to generate appropriate hypotheses, plan the examination, query the literature to answer a clinical question, establish a physical therapist diagnosis, and design an effective treatment plan. The SCRIPT tool also facilitated the mentee's clinical reasoning and provided the mentor insight into the mentee's clinical reasoning. The reliability and validity of the SCRIPT tool have not been formally studied. Clinical mentorship is a cornerstone of postprofessional training programs and intended to develop advanced clinical reasoning skills. However, clinical reasoning is often subconscious and, therefore, a challenging skill to develop. The use of a tool such as the SCRIPT may facilitate developing clinical reasoning skills by providing a systematic approach to data gathering and making clinical judgments to bring clinical reasoning to the conscious level, facilitate self-reflection, and make a mentored physical therapist's thought processes explicit to his or her clinical mentor. © 2017 American Physical Therapy Association
A Guided Tour of Mathematical Methods - 2nd Edition
NASA Astrophysics Data System (ADS)
Snieder, Roel
2004-09-01
Mathematical methods are essential tools for all physical scientists. This second edition provides a comprehensive tour of the mathematical knowledge and techniques that are needed by students in this area. In contrast to more traditional textbooks, all the material is presented in the form of problems. Within these problems the basic mathematical theory and its physical applications are well integrated. The mathematical insights that the student acquires are therefore driven by their physical insight. Topics that are covered include vector calculus, linear algebra, Fourier analysis, scale analysis, complex integration, Green's functions, normal modes, tensor calculus, and perturbation theory. The second edition contains new chapters on dimensional analysis, variational calculus, and the asymptotic evaluation of integrals. This book can be used by undergraduates and lower-level graduate students in the physical sciences. It can serve as a stand-alone text, or as a source of problems and examples to complement other textbooks. All the material is presented in the form of problems; mathematical insights are gained by getting the reader to develop answers themselves; many applications of the mathematics are given.
Big Data in HEP: A comprehensive use case study
Gutsche, Oliver; Cremonesi, Matteo; Elmer, Peter; ...
2017-11-23
Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems collectively called Big Data technologies have emerged to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and promise a fresh look at analysis of very large datasets and could potentially reduce the time-to-physics with increased interactivity. In this talk, we present an active LHC Run 2 analysis, searching for dark matter with the CMS detector, as a testbed for Big Data technologies. We directly compare the traditional NTuple-based analysis with an equivalent analysis using Apache Spark on the Hadoop ecosystem and beyond. In both cases, we start the analysis with the official experiment data formats and produce publication physics plots. Lastly, we discuss advantages and disadvantages of each approach and give an outlook on further studies needed.
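The "filtering and transforming" core that the NTuple and Spark workflows share can be caricatured in a few lines of NumPy: select events passing cuts, project out one quantity, and histogram it. Field names, cut values, and the synthetic events below are invented; the real analysis operates on official CMS data formats.

```python
import numpy as np

# Caricature of the shared "filter and transform" analysis core.
rng = np.random.default_rng(42)
n_events = 10_000
events = {
    "met": rng.exponential(scale=50.0, size=n_events),  # missing E_T [GeV]
    "njets": rng.poisson(lam=3, size=n_events),
}

mask = (events["met"] > 200.0) & (events["njets"] >= 2)  # selection cuts
selected_met = events["met"][mask]                       # transform / project
counts, edges = np.histogram(selected_met, bins=20, range=(200.0, 600.0))
```

Whether this pipeline is expressed as ROOT NTuple loops or as Spark dataframe operations, the shape of the computation (predicate, projection, aggregation) is the same, which is what makes the head-to-head comparison in the paper meaningful.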
Toward the automated analysis of plasma physics problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mynick, H.E.
1989-04-01
A program (CALC) is described, which carries out nontrivial plasma physics calculations in a manner intended to emulate the approach of a human theorist. This includes the initial process of gathering the relevant equations from a plasma knowledge base, and then determining how to solve them. Solution of the sets of equations governing physics problems, which in general have a nonuniform, irregular structure not amenable to solution by standardized algorithmic procedures, is facilitated by an analysis of the structure of the equations and the relations among them. This often permits decomposition of the full problem into subproblems, and other simplifications in form, which renders the resultant subsystems soluble by more standardized tools. CALC's operation is illustrated by a detailed description of its treatment of a sample plasma calculation. 5 refs., 3 figs.
Live cell refractometry using Hilbert phase microscopy and confocal reflectance microscopy.
Lue, Niyom; Choi, Wonshik; Popescu, Gabriel; Yaqoob, Zahid; Badizadegan, Kamran; Dasari, Ramachandra R; Feld, Michael S
2009-11-26
Quantitative chemical analysis has served as a useful tool for understanding cellular metabolism in biology. Among the many physical properties used in chemical analysis, the refractive index in particular provides molecular concentration, an important indicator of biological activity. In this report, we present a method of extracting full-field refractive index maps of live cells in their native states. We first record full-field optical thickness maps of living cells by Hilbert phase microscopy and then acquire physical thickness maps of the same cells using a custom-built confocal reflectance microscope. Full-field, axially averaged refractive index maps are obtained from the ratio of optical thickness to physical thickness. The accuracy of the axially averaged index measurement is 0.002. This approach can provide novel biological assays of label-free living cells in situ.
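The final step of the method, taking the pixel-wise ratio of optical thickness to physical thickness, is simple enough to sketch. The 2x2 "maps" and their values below are invented; a cell-like index near 1.38 is used only to make the numbers plausible:

```python
# Hypothetical per-pixel thickness maps (micrometres). In the real method
# these come from Hilbert phase microscopy (optical) and confocal
# reflectance microscopy (physical); here they are toy 2x2 arrays.
optical_thickness = [[2.76, 2.80],
                     [2.72, 2.78]]
physical_thickness = [[2.0, 2.0],
                      [2.0, 2.0]]

# Axially averaged refractive index map: element-wise ratio of the two maps.
index_map = [
    [ot / pt for ot, pt in zip(row_ot, row_pt)]
    for row_ot, row_pt in zip(optical_thickness, physical_thickness)
]

# 2.76 um of optical path over 2.0 um of physical path gives n = 1.38,
# a plausible averaged index for cellular material.
assert abs(index_map[0][0] - 1.38) < 1e-9
```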
1988-10-01
Structured Analysis involves building a logical (non-physical) model of a system, using graphic techniques which enable users, analysts, and designers to... Structured Design uses tools, especially graphic ones, to render systems readily understandable, and offers a set of strategies for... in the overall systems design process, and an overview of the assessment procedures, as well as a guide to the overall assessment.
Exploring physics concepts among novice teachers through CMAP tools
NASA Astrophysics Data System (ADS)
Suprapto, N.; Suliyanah; Prahani, B. K.; Jauhariyah, M. N. R.; Admoko, S.
2018-03-01
Concept maps are graphical tools for organising, elaborating and representing knowledge. Through the Cmap tools software, the understanding and hierarchical structuring of physics concepts among novice teachers can be explored. The software helps physics teachers indicate a physics context, focus questions, parking lots, cross-links, branching, hierarchy, and propositions. Using an exploratory quantitative study, a total of 13 concept maps on different physics topics created by novice physics teachers were analysed. The main differences between the lecturer's scoring and peer-teachers' scoring were also illustrated. The study offered some implications, especially for physics educators, in determining the hierarchical structure of physics concepts, constructing a physics focus question, and seeing how a concept in one domain of knowledge represented on the map is related to a concept in another domain shown on the map.
GDR (Genome Database for Rosaceae): integrated web-database for Rosaceae genomics and genetics data
Jung, Sook; Staton, Margaret; Lee, Taein; Blenda, Anna; Svancara, Randall; Abbott, Albert; Main, Dorrie
2008-01-01
The Genome Database for Rosaceae (GDR) is a central repository of curated and integrated genetics and genomics data of Rosaceae, an economically important family which includes apple, cherry, peach, pear, raspberry, rose and strawberry. GDR contains annotated databases of all publicly available Rosaceae ESTs, the genetically anchored peach physical map, Rosaceae genetic maps and comprehensively annotated markers and traits. The ESTs are assembled to produce unigene sets of each genus and the entire Rosaceae. Other annotations include putative function, microsatellites, open reading frames, single nucleotide polymorphisms, gene ontology terms and anchored map position where applicable. Most of the published Rosaceae genetic maps can be viewed and compared through CMap, the comparative map viewer. The peach physical map can be viewed using WebFPC/WebChrom, and also through our integrated GDR map viewer, which serves as a portal to the combined genetic, transcriptome and physical mapping information. ESTs, BACs, markers and traits can be queried by various categories and the search result sites are linked to the mapping visualization tools. GDR also provides online analysis tools such as a batch BLAST/FASTA server for the GDR datasets, a sequence assembly server and microsatellite and primer detection tools. GDR is available at http://www.rosaceae.org. PMID:17932055
Meiners, Kelly M; Rush, Douglas K
2017-01-01
Prior studies have explored variables that had predictive relationships with National Physical Therapy Examination (NPTE) score or NPTE failure. The purpose of this study was to explore whether certain variables were predictive of test-takers' first-time score on the NPTE. The population consisted of 134 students who graduated from the university's Professional DPT Program in 2012 to 2014. This quantitative study used a retrospective design. Two separate data analyses were conducted. First, hierarchical linear multiple regression (HMR) analysis was performed to determine which variables were predictive of first-time NPTE score. Second, a correlation analysis was performed on all 18 Physical Therapy Clinical Performance Instrument (PT CPI) 2006 category scores obtained during the first long-term clinical rotation, overall PT CPI 2006 score, and NPTE passage. With all variables entered, the HMR model predicted 39% of the variance seen in NPTE scores. The HMR results showed that physical therapy program first-year GPA (1PTGPA) was the strongest predictor and explained 24% of the variance in NPTE scores (b=0.572, p<0.001). The correlational analysis found no statistically significant correlation between the 18 PT CPI 2006 category scores, overall PT CPI 2006 score, and NPTE passage. As 1PTGPA had the most significant contribution to prediction of NPTE scores, programs need to monitor first-year students who display academic difficulty. PT CPI version 2006 scores were significantly correlated with each other, but not with NPTE score or NPTE passage. Both tools measure many of the same professional requirements but use different modes of assessment, and they may be considered complementary tools to gain a full picture of both the student's ability and skills.
A process for the quantification of aircraft noise and emissions interdependencies
NASA Astrophysics Data System (ADS)
de Luis, Jorge
The main purpose of this dissertation is to develop a process to improve current policy-making procedures concerning the environmental effects of aviation. This research work expands current practices with physics-based, publicly available models. The current method relies solely on information provided by industry members, and this information is usually proprietary and not physically intuitive. The process proposed herein provides information about the interdependencies between the environmental effects of aircraft. These interdependencies are also tied to the actual physical parameters of the aircraft and the engine, making it more intuitive for decision-makers to understand the impacts on the vehicle under different policy scenarios. These scenarios involve the use of fleet analysis tools in which the existing aircraft are used to predict the environmental effects of imposing new stringency levels. The aircraft used are reduced to a series of coefficients that represent their performance in terms of flight characteristics, fuel burn, noise, and emissions. These coefficients are then utilized to model flight operations and calculate the environmental impacts of those aircraft. If a particular aircraft does not meet the stringency to be analyzed, a technology response is applied to it in order to meet that stringency. Depending on the level of reduction needed, this technology response can affect the fuel burn characteristics of the aircraft. Another important point about the current stringency analysis process is that it does not consider noise and emissions concurrently, but instead treats them separately, one at a time. This assumes that the interdependencies between the two do not exist, which is not realistic.
The latest stringency process, delineated in 2004, imposed a 2% fuel burn penalty for any required improvements in NOx, regardless of the type of aircraft or engine, assuming that no company had the ability to produce a vehicle with similar characteristics. This left all the performance characteristics of the aircraft except fuel burn untouched, including the noise performance. The proposed alternative is to create a fleet of replacement aircraft for the current fleet that does not meet stringency. These replacement aircraft represent the achievable physical limits for state-of-the-art systems. In this research work, the interdependencies between NOx, noise, and fuel burn are not neglected; it is in fact necessary to take all three into account simultaneously to capture the physical limits that can be attained during a stringency analysis. In addition, the replacement aircraft show the linkage between environmental effects and fundamental aircraft and engine characteristics, something that has been neglected in previous policy-making procedures. Another aspect that has been ignored is the creation of the coefficients used for the fleet analyses: current literature offers no defined process for creating those coefficients, so this research work develops one and demonstrates that the characteristics of the aircraft can be propagated to the coefficients and to the fleet analysis tools. The implementation of the proposed process shows that, first, the environmental metrics can be linked to the physical attributes of the aircraft using non-proprietary, physics-based tools; second, those interdependencies can be propagated to fleet-level tools; and third, this propagation improves the policy-making process by showing what needs to change in an aircraft to meet different stringency levels.
Computational electronics and electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shang, C C
The Computational Electronics and Electromagnetics thrust area serves as the focal point for Engineering R&D activities for developing computer-based design and analysis tools. Representative applications include design of particle accelerator cells and beamline components; design of transmission line components; engineering analysis and design of high-power (optical and microwave) components; photonics and optoelectronics circuit design; electromagnetic susceptibility analysis; and antenna synthesis. The FY-97 effort focuses on development and validation of (1) accelerator design codes; (2) 3-D massively parallel, time-dependent EM codes; (3) material models; (4) coupling and application of engineering tools for analysis and design of high-power components; and (5) development of beam control algorithms coupled to beam transport physics codes. These efforts are in association with technology development in the power conversion, nondestructive evaluation, and microtechnology areas. The efforts complement technology development in Lawrence Livermore National programs.
Representing Operational Modes for Situation Awareness
NASA Astrophysics Data System (ADS)
Kirchhübel, Denis; Lind, Morten; Ravn, Ole
2017-01-01
Operating complex plants is an increasingly demanding task for human operators. Diagnosis of and reaction to on-line events requires the interpretation of real-time data: vast amounts of sensor data, as well as operational knowledge about the state and design of the plant, are necessary to deduce reasonable reactions to abnormal situations. Intelligent computational support tools can make the operator’s task easier, but they require knowledge about the overall system in the form of some model. While fault-tolerant control design tools based on physical principles and relations are valuable for designing robust systems, the models become too complex when considering interactions at a plant-wide level. The alarm systems meant to support human operators in diagnosing the plant-wide situation, on the other hand, fail regularly in situations where these interactions lead to many related alarms, overloading the operator with alarm floods. Functional modelling can provide a middle way, reducing the complexity of plant-wide models by abstracting from physical details to more general functions and behaviours. Based on functional models, the propagation of failures through interconnected systems can be inferred, and alarm floods can potentially be reduced to their root cause. However, the desired behaviour of a complex system changes due to operating procedures that require more than one physical and functional configuration. In this paper, a consistent representation of possible configurations is deduced from the analysis of an exemplary start-up procedure by functional models. The proposed interpretation of the modelling concepts simplifies the functional modelling of distinct modes. The analysis further reveals relevant links between the quantitative sensor data and the qualitative perspective of the diagnostics tool based on functional models.
This will form the basis for the ongoing development of a novel real-time diagnostics system based on the on-line adaptation of the underlying MFM model.
Modeling and Validation of Power-split and P2 Parallel Hybrid Electric Vehicles (SAE 2013-01-1470)
The Advanced Light-Duty Powertrain and Hybrid Analysis tool was created by EPA to evaluate the Greenhouse Gas (GHG) emissions of Light-Duty (LD) vehicles. It is a physics-based, forward-looking, full vehicle computer simulator capable of analyzing various vehicle types combined ...
Toward Improved Collections in Medical Humanities: Fiction in Academic Health Sciences Libraries
ERIC Educational Resources Information Center
Dali, Keren; Dilevko, Juris
2006-01-01
Although fiction plays a prominent role in the interdisciplinary field of medical humanities (MH), it is physically and intellectually isolated from non-fiction in academic health sciences libraries. Using the Literature, Arts, and Medicine Database (LAMD) as a tool for selection and subject analysis, we suggest a method of integrating fiction…
The Advanced Light-Duty Powertrain and Hybrid Analysis (ALPHA) modeling tool was created by EPA to estimate greenhouse gas (GHG) emissions of light-duty vehicles. ALPHA is a physics-based, forward-looking, full vehicle computer simulation capable of analyzing various vehicle type...
Faster than "g", Revisited with High-Speed Imaging
ERIC Educational Resources Information Center
Vollmer, Michael; Mollmann, Klaus-Peter
2012-01-01
The introduction of modern high-speed cameras in physics teaching provides a tool not only for easy visualization, but also for quantitative analysis of many simple though fast occurring phenomena. As an example, we present a very well-known demonstration experiment--sometimes also discussed in the context of falling chimneys--which is commonly…
NASA Technical Reports Server (NTRS)
Roberts, Aaron
2005-01-01
New tools for data access and visualization promise to make the analysis of space plasma data both more efficient and more powerful, especially for answering questions about the global structure and dynamics of the Sun-Earth system. We will show how existing tools (particularly the Virtual Space Physics Observatory (VSPO) and the Visual System for Browsing, Analysis and Retrieval of Data (ViSBARD); look for the acronyms in Google) already provide rapid access to such information as spacecraft orbits, browse plots, and detailed data, as well as visualizations that can quickly unite our view of multispacecraft observations. We will show movies illustrating multispacecraft observations of the solar wind and magnetosphere during a magnetic storm, and of simulated 30-spacecraft observations derived from MHD simulations of the magnetosphere sampled along likely trajectories of the spacecraft for the MagCon mission. An important issue remaining to be solved is how best to integrate simulation data and services into the Virtual Observatory environment, and this talk will hopefully stimulate further discussion along these lines.
Gelfusa, M; Gaudio, P; Malizia, A; Murari, A; Vega, J; Richetta, M; Gonzalez, S
2014-06-01
Surveying large areas automatically, for early detection of both harmful chemical agents and forest fires, has recently become a strategic objective of defence and public health organisations. The Lidar and Dial techniques are widely recognized as a cost-effective alternative for monitoring large portions of the atmosphere. To maximize the effectiveness of the measurements and to guarantee reliable monitoring of large areas, new data analysis techniques are required. In this paper, an original tool, the Universal Multi Event Locator, is applied to the problem of automatically identifying the time location of peaks in Lidar and Dial measurements for environmental physics applications. This analysis technique improves various aspects of the measurements, ranging from resilience to drift in the laser sources to an increase in system sensitivity. The method is also fully general and purely software-based, and can therefore be applied to a large variety of problems without any additional cost. The potential of the proposed technique is exemplified with the help of data from various instruments acquired during several experimental campaigns in the field.
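The peak-location task that the Universal Multi Event Locator addresses can be illustrated with a deliberately naive stand-in: a threshold-plus-local-maximum rule. This is not the UMEL algorithm, and the signal values below are made up:

```python
# Hypothetical 1-D backscatter trace; values are invented for illustration.
signal = [0.1, 0.2, 1.5, 0.3, 0.2, 0.1, 2.1, 0.4, 0.1]

def find_peaks(y, threshold):
    """Return indices that are strict local maxima above `threshold`.

    A simple stand-in for peak location: a sample counts as a peak when it
    exceeds the threshold and both of its immediate neighbours.
    """
    return [i for i in range(1, len(y) - 1)
            if y[i] > threshold and y[i] > y[i - 1] and y[i] > y[i + 1]]

# Two events in the trace, at indices 2 and 6.
assert find_peaks(signal, threshold=1.0) == [2, 6]
```

A production method such as UMEL must additionally cope with noise, drifting baselines, and overlapping events, which is what makes the problem nontrivial.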
Installation and Testing of ITER Integrated Modeling and Analysis Suite (IMAS) on DIII-D
NASA Astrophysics Data System (ADS)
Lao, L.; Kostuk, M.; Meneghini, O.; Smith, S.; Staebler, G.; Kalling, R.; Pinches, S.
2017-10-01
A critical objective of the ITER Integrated Modeling Program is the development of IMAS to support ITER plasma operation and research activities. An IMAS framework has been established based on the earlier work carried out within the EU. It consists of a physics data model and a workflow engine. The data model is capable of representing both simulation and experimental data and is applicable to ITER and other devices. IMAS has been successfully installed on a local DIII-D server using a flexible installer capable of managing the core data access tools (Access Layer and Data Dictionary) and optionally the Kepler workflow engine and coupling tools. A general adaptor for OMFIT (a workflow engine) is being built for adaptation of any analysis code to IMAS using a new IMAS universal access layer (UAL) interface developed from an existing OMFIT EU Integrated Tokamak Modeling UAL. Ongoing work includes development of a general adaptor for EFIT and TGLF based on this new UAL that can be readily extended for other physics codes within OMFIT. Work supported by US DOE under DE-FC02-04ER54698.
Analysis Tools for Next-Generation Hadron Spectroscopy Experiments
NASA Astrophysics Data System (ADS)
Battaglieri, M.; Briscoe, B. J.; Celentano, A.; Chung, S.-U.; D'Angelo, A.; De Vita, R.; Döring, M.; Dudek, J.; Eidelman, S.; Fegan, S.; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, D. I.; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, D. G.; Ketzer, B.; Klein, F. J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, V.; McKinnon, B.; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, A.; Salgado, C.; Santopinto, E.; Sarantsev, A. V.; Sato, T.; Schlüter, T.; da Silva, M. L. L.; Stankovic, I.; Strakovsky, I.; Szczepaniak, A.; Vassallo, A.; Walford, N. K.; Watts, D. P.; Zana, L.
The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.
Visualization of International Solar-Terrestrial Physics Program (ISTP) data
NASA Technical Reports Server (NTRS)
Kessel, Ramona L.; Candey, Robert M.; Hsieh, Syau-Yun W.; Kayser, Susan
1995-01-01
The International Solar-Terrestrial Physics Program (ISTP) is a multispacecraft, multinational program whose objective is to promote further understanding of the Earth's complex plasma environment. Extensive data sharing and data analysis will be needed to ensure the success of the overall ISTP program. For this reason, there has been a special emphasis on data standards throughout ISTP. One of the key tools will be the common data format (CDF), developed, maintained, and evolved at the National Space Science Data Center (NSSDC), with the set of ISTP implementation guidelines specially designed for space physics data sets by the Space Physics Data Facility (associated with the NSSDC). The ISTP guidelines were developed to facilitate searching, plotting, merging, and subsetting of data sets. We focus here on the plotting application. A prototype software package was developed to plot key parameter (KP) data from the ISTP program at the Science Planning and Operations Facility (SPOF). The ISTP Key Parameter Visualization Tool is based on the Interactive Data Language (IDL) and is keyed to the ISTP guidelines, reading data stored in CDF. With the combination of CDF, the ISTP guidelines, and the visualization software, we can look forward to easier and more effective data sharing and use among ISTP scientists.
U-Access: a web-based system for routing pedestrians of differing abilities
NASA Astrophysics Data System (ADS)
Sobek, Adam D.; Miller, Harvey J.
2006-09-01
For most people, traveling through urban and built environments is straightforward. However, for people with physical disabilities, even a short trip can be difficult and perhaps impossible. This paper provides the design and implementation of a web-based system for the routing and prescriptive analysis of pedestrians with different physical abilities within built environments. U-Access, as a routing tool, provides pedestrians with the shortest feasible route with respect to one of three differing ability levels, namely, peripatetic (unaided mobility), aided mobility (mobility with the help of a cane, walker or crutches) and wheelchair users. U-Access is also an analytical tool that can help identify obstacles in built environments that create routing discrepancies among pedestrians with different physical abilities. This paper discusses the system design, including database, algorithm and interface specifications, and technologies for efficiently delivering results through the World Wide Web (WWW). This paper also provides an illustrative example of a routing problem and an analytical evaluation of the existing infrastructure which identifies the obstacles that pose the greatest discrepancies between physical ability levels. U-Access was evaluated by wheelchair users and route experts from the Center for Disability Services at The University of Utah, USA.
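The routing idea behind U-Access can be sketched as a shortest-path search on a sidewalk graph whose edges carry the set of ability levels that can traverse them: filtering edges by the traveller's level before running Dijkstra's algorithm yields the shortest feasible route. The graph, edge lengths, and level names below are invented, not U-Access's actual database schema:

```python
import heapq

# Hypothetical sidewalk network: (node, node, length in metres, allowed levels).
edges = [
    ("A", "B", 100, {"peripatetic", "aided", "wheelchair"}),
    ("B", "D", 100, {"peripatetic"}),  # e.g. a staircase: unaided walkers only
    ("B", "C", 120, {"peripatetic", "aided", "wheelchair"}),
    ("C", "D", 120, {"peripatetic", "aided", "wheelchair"}),
]

def shortest_feasible(src, dst, ability):
    """Dijkstra over only the edges passable at the given ability level."""
    graph = {}
    for u, v, w, levels in edges:
        if ability in levels:
            graph.setdefault(u, []).append((v, w))
            graph.setdefault(v, []).append((u, w))
    dist, heap = {src: 0}, [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return None  # no feasible route

assert shortest_feasible("A", "D", "peripatetic") == 200  # via the staircase
assert shortest_feasible("A", "D", "wheelchair") == 340   # staircase avoided
```

Comparing the two results per obstacle is exactly the kind of discrepancy analysis the abstract describes: the 140 m detour quantifies the cost that one inaccessible edge imposes on wheelchair users.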
The change in critical technologies for computational physics
NASA Technical Reports Server (NTRS)
Watson, Val
1990-01-01
It is noted that the types of technology required for computational physics are changing as the field matures. Emphasis has shifted from computer technology to algorithm technology and, finally, to visual analysis technology as areas of critical research for this field. High-performance graphical workstations tied to a supercomputer by high-speed communications, along with the development of specially tailored visualization software, have enabled analysis of highly complex fluid-dynamics simulations. Particular reference is made here to the development of visual analysis tools at NASA's Numerical Aerodynamics Simulation Facility. The next technology which this field requires is one that would eliminate visual clutter by extracting the key features of physics simulations in order to create displays that clearly portray those features. Research in the tuning of visual displays to human cognitive abilities is proposed. The immediate transfer of this technology to all levels of computers, specifically the inclusion of visualization primitives in basic software developments for all workstations and PCs, is recommended.
Yue Xu, Selene; Nelson, Sandahl; Kerr, Jacqueline; Godbole, Suneeta; Patterson, Ruth; Merchant, Gina; Abramson, Ian; Staudenmayer, John; Natarajan, Loki
2018-04-01
Physical inactivity is a recognized risk factor for many chronic diseases. Accelerometers are increasingly used as an objective means to measure daily physical activity. One challenge in using these devices is missing data due to device nonwear. We used a well-characterized cohort of 333 overweight postmenopausal breast cancer survivors to examine missing data patterns of accelerometer outputs over the day. Based on these observed missingness patterns, we created pseudo-simulated datasets with realistic missing data patterns. We developed statistical methods to design imputation and variance-weighting algorithms that account for missing data effects when fitting regression models. Bias and precision of each method were evaluated and compared. Our results indicated that not accounting for missing data in the analysis yielded unstable estimates in the regression analysis. Incorporating variance weights and/or subject-level imputation improved precision by more than 50% compared to ignoring missing data. We recommend that these simple, easy-to-implement statistical tools be used to improve the analysis of accelerometer data.
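The subject-level imputation idea can be illustrated with a toy example: missing accelerometer intervals are filled with the subject's own observed mean before computing a summary. The counts and the simple mean-imputation rule are stand-ins for the paper's statistical methods, which also involve variance weighting:

```python
# Hypothetical hourly activity counts for one subject; None marks nonwear.
hourly_counts = [120, None, 90, 110, None, 100]

# Subject-level imputation: fill each missing interval with the subject's
# own mean over observed intervals.
observed = [c for c in hourly_counts if c is not None]
subject_mean = sum(observed) / len(observed)  # (120+90+110+100)/4 = 105.0

imputed = [c if c is not None else subject_mean for c in hourly_counts]
daily_total = sum(imputed)

assert subject_mean == 105.0
assert daily_total == 630.0

# Summing only observed hours would understate the day by the two
# missing intervals, biasing any downstream regression.
assert sum(observed) == 420
```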
Combustion and Magnetohydrodynamic Processes in Advanced Pulse Detonation Rocket Engines
2012-10-01
The use of high-order numerical methods can also be a powerful tool in the analysis of such complex flows, but we need to understand the interaction of... ...dimensional effects with complex reaction kinetics, the simple one-dimensional detonation structure provides a rich spectrum of dynamical features which are...
RooStatsCms: A tool for analysis modelling, combination and statistical studies
NASA Astrophysics Data System (ADS)
Piparo, D.; Schott, G.; Quast, G.
2010-04-01
RooStatsCms is an object oriented statistical framework based on the RooFit technology. Its scope is to allow the modelling, statistical analysis and combination of multiple search channels for new phenomena in High Energy Physics. It provides a variety of methods described in literature implemented as classes, whose design is oriented to the execution of multiple CPU intensive jobs on batch systems or on the Grid.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Ronald W.; Collins, Benjamin S.; Godfrey, Andrew T.
2016-12-09
In order to support engineering analysis of Virtual Environment for Reactor Analysis (VERA) model results, the Consortium for Advanced Simulation of Light Water Reactors (CASL) needs a tool that provides visualizations of HDF5 files that adhere to the VERAOUT specification. VERAView provides an interactive graphical interface for the visualization and engineering analyses of output data from VERA. The Python-based software provides instantaneous 2D and 3D images, 1D plots, and alphanumeric data from VERA multi-physics simulations.
NASA Technical Reports Server (NTRS)
Katz, Daniel S.; Cwik, Tom; Fu, Chuigang; Imbriale, William A.; Jamnejad, Vahraz; Springer, Paul L.; Borgioli, Andrea
2000-01-01
The process of designing and analyzing a multiple-reflector system has traditionally been time-intensive, requiring large amounts of both computational and human time. At many frequencies, a discrete approximation of the radiation integral may be used to model the system. The code which implements this physical optics (PO) algorithm was developed at the Jet Propulsion Laboratory. It analyzes systems of antennas in pairs, and for each pair, the analysis can be computationally time-consuming. Additionally, the antennas must be described using a local coordinate system for each antenna, which makes it difficult to integrate the design into a multi-disciplinary framework in which there is traditionally one global coordinate system, even before considering deforming the antenna as prescribed by external structural and/or thermal factors. Finally, setting up the code to correctly analyze all the antenna pairs in the system can take a fair amount of time, and introduces possible human error. The use of parallel computing to reduce the computational time required for the analysis of a given pair of antennas has been previously discussed. This paper focuses on the other problems mentioned above. It will present a methodology and examples of use of an automated tool that performs the analysis of a complete multiple-reflector system in an integrated multi-disciplinary environment (including CAD modeling, and structural and thermal analysis) at the click of a button. This tool, named MOD Tool (Millimeter-wave Optics Design Tool), has been designed and implemented as a distributed tool, with a client that runs almost identically on Unix, Mac, and Windows platforms, and a server that runs primarily on a Unix workstation and can interact with parallel supercomputers with simple instruction from the user interacting with the client.
Integral Full Core Multi-Physics PWR Benchmark with Measured Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forget, Benoit; Smith, Kord; Kumar, Shikhar
In recent years, the importance of modeling and simulation has been highlighted extensively in the DOE research portfolio, with concrete examples in nuclear engineering in the CASL and NEAMS programs. These research efforts and similar efforts worldwide aim at the development of high-fidelity multi-physics analysis tools for the simulation of current and next-generation nuclear power reactors. Like all analysis tools, verification and validation are essential to guarantee proper functioning of the software and methods employed. The current approach relies mainly on the validation of single-physics phenomena (e.g. critical experiments, flow loops, etc.), and there is a lack of relevant multi-physics benchmark measurements that are necessary to validate the high-fidelity methods being developed today. This work introduces a new multi-cycle full-core Pressurized Water Reactor (PWR) depletion benchmark based on two operational cycles of a commercial nuclear power plant that provides a detailed description of fuel assemblies, burnable absorbers, in-core fission detectors, core loading and re-loading patterns. This benchmark enables analysts to develop extremely detailed reactor core models that can be used for testing and validation of coupled neutron transport, thermal-hydraulics, and fuel isotopic depletion. The benchmark also provides measured reactor data for Hot Zero Power (HZP) physics tests, boron letdown curves, and three-dimensional in-core flux maps from 58 instrumented assemblies. The benchmark description is now available online and has been used by many groups. However, much work remains to be done on the quantification of uncertainties and modeling sensitivities. This work aims to address these deficiencies and make this benchmark a true non-proprietary international benchmark for the validation of high-fidelity tools.
This report details the BEAVRS uncertainty quantification for the first two cycles of operation and serves as the final report of the project.
Development of an Integrated Human Factors Toolkit
NASA Technical Reports Server (NTRS)
Resnick, Marc L.
2003-01-01
An effective integration of human abilities and limitations is crucial to the success of all NASA missions. The Integrated Human Factors Toolkit facilitates this integration by assisting system designers and analysts to select the human factors tools that are most appropriate for the needs of each project. The HF Toolkit contains information about a broad variety of human factors tools addressing human requirements in the physical, information processing and human reliability domains. Analysis of each tool includes consideration of the most appropriate design stage, the amount of expertise in human factors that is required, the amount of experience with the tool and the target job tasks that are needed, and other factors that are critical for successful use of the tool. The benefits of the Toolkit include improved safety, reliability and effectiveness of NASA systems throughout the agency. This report outlines the initial stages of development for the Integrated Human Factors Toolkit.
Tagliente, Irene; Solvoll, Terje; Trieste, Leopoldo; De Cecco, Carlo N; Murgia, Fabrizio; Bella, Sergio
2016-09-14
Obesity is one of the biggest drivers of preventable chronic diseases and healthcare costs worldwide, and various prevention activities have been suggested. Monitoring daily energy expenditure (EE) could make it possible to create personalized diets and physical activity programs; in this context, physical inactivity is one of the most important public health problems. Several studies report the efforts of the international community in promoting physical activity. Physical activity can be promoted only by increasing citizens' empowerment to take care of their own health, which depends on improving the information available to individuals. Technology can offer solutions and metrics for monitoring and measuring daily activity by interacting with individuals and sharing information and feedback. In this study we review indicators of total energy expenditure and the weaknesses of available devices in assessing these parameters, using a literature review and technology testing based on the EUnetHTA core model. For the clinical aspects, it is fundamental to take into account all the factors that can influence personal energy expenditure, such as heart rate, blood pressure and thermoregulation (influenced by body temperature). In this study we focused on the importance of tools to encourage physical activity, and we analyzed the factors that can influence the correct assessment of energy expenditure and, at the same time, the energy regime. Close monitoring of the exercise regime could be helpful in telemedicine applications such as telemonitoring. More studies are needed to assess the impact of physical activity trackers in telemonitoring protocols. In the assessment of energy expenditure, critical issues are related to physiological data acquisition. Sensors connected to mobile devices could be important tools for disease prevention and for interventions affecting health behaviors.
New device applications are potentially useful for telemedicine assistance, but data security and the limits of the related communication protocols should be taken into account.
The AAPT/ComPADRE Digital Library: Supporting Physics Education at All Levels
NASA Astrophysics Data System (ADS)
Mason, Bruce
For more than a decade, the AAPT/ComPADRE Digital Library has been providing online resources, tools, and services that support broad communities of physics faculty and physics education researchers. This online library provides vetted resources for teachers and students, an environment for authors and developers to share their work, and the collaboration tools for a diverse set of users. This talk will focus on the recent collaborations and developments being hosted on or developed with ComPADRE. Examples include PhysPort, making the tools and resources developed by physics education researchers more accessible, the Open Source Physics project, expanding the use of numerical modeling at all levels of physics education, and PICUP, a community for those promoting computation in the physics curriculum. NSF-0435336, 0532798, 0840768, 0937836.
MSL: Facilitating automatic and physical analysis of published scientific literature in PDF format
Ahmed, Zeeshan; Dandekar, Thomas
2018-01-01
Published scientific literature contains millions of figures, including information about the results obtained from different scientific experiments, e.g. PCR-ELISA data, microarray analysis, gel electrophoresis, mass spectrometry data, DNA/RNA sequencing, diagnostic imaging (CT/MRI and ultrasound scans), and medical imaging like electroencephalography (EEG), magnetoencephalography (MEG), electrocardiography (ECG), and positron-emission tomography (PET) images. The importance of biomedical figures has been widely recognized in the scientific and medical communities, as they play a vital role in providing major original data and experimental and computational results in concise form. One major challenge in implementing a system for scientific literature analysis is extracting and analyzing text and figures from published PDF files by physical and logical document analysis. Here we present a product line architecture based bioinformatics tool, 'Mining Scientific Literature (MSL)', which supports the extraction of text and images by interpreting all kinds of published PDF files using advanced data mining and image processing techniques. It provides modules for the marginalization of extracted text based on different coordinates and keywords, visualization of extracted figures, and extraction of embedded text from all kinds of biological and biomedical figures using applied Optical Character Recognition (OCR). Moreover, for further analysis and usage, it generates the system's output in different formats including text, PDF, XML and image files. Hence, MSL is an easy-to-install-and-use analysis tool to interpret published scientific literature in PDF format. PMID:29721305
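As a concrete illustration of the "marginalization of extracted text based on different coordinates" that the MSL abstract describes, here is a minimal sketch. It is not MSL's actual algorithm; the page height, band thresholds, and fragment format are illustrative assumptions.

```python
# Hypothetical sketch of coordinate-based "marginalization" of extracted PDF
# text: each fragment carries a bounding-box position, and a simple y-threshold
# rule assigns it to the header, footer, or body region of the page.

PAGE_HEIGHT = 792   # points, US Letter; an assumption for illustration
HEADER_BAND = 0.9   # fragments in the top 10% of the page count as header
FOOTER_BAND = 0.1   # fragments in the bottom 10% count as footer

def marginalize(fragments, page_height=PAGE_HEIGHT):
    """Split (text, x, y) fragments into header, body, and footer lists."""
    regions = {"header": [], "body": [], "footer": []}
    for text, x, y in fragments:
        rel = y / page_height          # 0 = bottom of page, 1 = top
        if rel >= HEADER_BAND:
            regions["header"].append(text)
        elif rel <= FOOTER_BAND:
            regions["footer"].append(text)
        else:
            regions["body"].append(text)
    return regions

fragments = [
    ("Journal of Examples, Vol. 1", 72, 760),    # near the top -> header
    ("Results of the PCR-ELISA assay...", 72, 400),
    ("Page 3", 300, 30),                         # near the bottom -> footer
]
print(marginalize(fragments))
```

A real implementation would obtain the fragments and their coordinates from a PDF parsing library; the classification rule itself is the part sketched here.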
CFD Analysis of Turbo Expander for Cryogenic Refrigeration and Liquefaction Cycles
NASA Astrophysics Data System (ADS)
Verma, Rahul; Sam, Ashish Alex; Ghosh, Parthasarathi
Computational Fluid Dynamics (CFD) analysis has emerged as a necessary tool for the design of turbomachinery. It helps to understand the various sources of inefficiency through investigation of the flow physics of the turbine. In this paper, a 3D turbulent flow analysis of a cryogenic turboexpander for small-scale air separation was performed using Ansys CFX®. The turboexpander was designed following a meanline blade generation procedure provided in the open literature, together with good engineering judgement. Through analysis of the flow field, modifications and further analyses required to evolve a more robust design procedure have been suggested.
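To make the "meanline procedure" mentioned above concrete, a minimal sketch of two textbook meanline relations for a radial turboexpander follows: the spouting (ideal) velocity for a given isentropic enthalpy drop, and the shaft speed implied by a target blade speed. All numerical values (enthalpy drop, velocity ratio, wheel diameter) are illustrative assumptions, not figures from the paper.

```python
import math

def spouting_velocity(dh_s):
    """Ideal (spouting) velocity c0 [m/s] for an isentropic enthalpy drop
    dh_s [J/kg]: c0 = sqrt(2 * dh_s)."""
    return math.sqrt(2.0 * dh_s)

def rotational_speed(u, d):
    """Shaft speed [rev/s] that gives blade tip speed u [m/s] at wheel
    diameter d [m]."""
    return u / (math.pi * d)

dh_s = 30e3      # J/kg, assumed isentropic enthalpy drop across the stage
nu_opt = 0.7     # assumed near-optimal velocity ratio u/c0 for a radial turbine
d_wheel = 0.016  # m, assumed wheel diameter for a small air-separation unit

c0 = spouting_velocity(dh_s)
u = nu_opt * c0
n = rotational_speed(u, d_wheel)
print(f"c0 = {c0:.1f} m/s, tip speed = {u:.1f} m/s, speed = {60 * n:.0f} rpm")
```

In a real design procedure these relations are the starting point; the CFD analysis described in the abstract then checks whether the resulting geometry behaves as the meanline assumptions predict.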
Johansson, Maria; Brunt, David
2012-04-01
The primary aim of the present study was to investigate if methods derived from environmental psychology can be used to study the qualities of the physical environment of supported housing facilities for persons with psychiatric disabilities. Three units of analysis were selected: the private area, the common indoor area, and the outdoor area. Expert assessments of 110 features of the physical environment in these units and semantic environmental description of the visual experience of them consistently showed that purpose-built supported housing facilities had more physical features important for high quality residential environments than the non-purpose-built supported housing facilities. The employed methods were thus seen to be able to describe and discriminate between qualities in the physical environment of supported housing facilities. Suggestions for the development of tools for the assessment of the physical environment in supported housing are made.
Arvidson, Elin; Börjesson, Mats; Ahlborg, Gunnar; Lindegård, Agneta; Jonsdottir, Ingibjörg H
2013-09-17
With increasing age, physical capacity decreases, while the need and time for recovery increase. At the same time, the demands of work usually do not change with age. In the near future, an aging and physically changing workforce risks reduced work ability. Therefore, the impact of different factors, such as physical activity, on work ability is of interest. Thus, the aim of this study was to evaluate the association between physical activity and work ability using both cross-sectional and prospective analyses. This study was based on an extensive questionnaire survey. The number of participants included in the analysis at baseline in 2004 was 2,783, of whom 2,597 were also included in the follow-up in 2006. The primary outcome measure was the Work Ability Index (WAI), and the level of physical activity was measured using a single-item question. In the cross-sectional analysis we calculated the level of physical activity and the prevalence of poor or moderate work ability as reported by the participants. In the prospective analysis we calculated different levels of physical activity and the prevalence of positive changes in WAI category from baseline to follow-up. In both the cross-sectional and the prospective analyses the prevalence ratio was calculated using Generalized Linear Models. The cross-sectional analysis showed that with an increased level of physical activity, the reporting of poor or moderate work ability decreased. In the prospective analysis, participants reporting a higher level of physical activity were more likely to have made an improvement in WAI from 2004 to 2006. The level of physical activity seems to be related to work ability. Assessment of physical activity may also be useful as a predictive tool, potentially making it possible to prevent poor work ability and improve future work ability.
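The prevalence ratio reported in the study above is the ratio of the outcome prevalence in one exposure group to that in another. A minimal sketch of the calculation with a Wald confidence interval follows; the counts are invented for illustration and are not the study's data.

```python
import math

def prevalence_ratio(a, n1, c, n0):
    """Prevalence ratio of an outcome in an exposed group (a cases of n1)
    vs an unexposed group (c cases of n0), with a 95% Wald confidence
    interval computed on the log scale."""
    pr = (a / n1) / (c / n0)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n0)
    lo, hi = (math.exp(math.log(pr) + s * 1.96 * se) for s in (-1, 1))
    return pr, lo, hi

# Illustrative (not the study's) numbers: 120/800 highly active vs
# 240/900 inactive participants reporting poor or moderate work ability.
pr, lo, hi = prevalence_ratio(120, 800, 240, 900)
print(f"PR = {pr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A PR below 1 here would indicate that poor or moderate work ability is less prevalent among the more physically active, matching the direction of the study's finding. In practice such ratios are usually estimated with a log-binomial or Poisson GLM, as the abstract states.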
For employers, the main implications of this study are the importance of promoting and facilitating the employees' engagement in physical activity, and the importance of the employees' maintaining a physically active lifestyle.
NASA Astrophysics Data System (ADS)
Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro
2012-06-01
ACAT2011 This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facilities Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell.
We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the success of the workshop. Further information on ACAT 2011 can be found at http://acat2011.cern.ch Dr Liliana Teodorescu Brunel University ACATgroup The PDF also contains details of the workshop's committees and sponsors.
Silverwood, Richard J.; Nitsch, Dorothea; Pierce, Mary; Kuh, Diana; Mishra, Gita D.
2011-01-01
The authors aimed to describe how longitudinal patterns of physical activity during mid-adulthood (ages 31–53 years) can be characterized using latent class analysis in a population-based birth cohort study, the Medical Research Council’s 1946 National Survey of Health and Development. Three different types of physical activity—walking, cycling, and leisure-time physical activity—were analyzed separately using self-reported data collected from questionnaires between 1977 and 1999; 3,847 study members were included in the analysis for one or more types of activity. Patterns of activity differed by sex, so stratified analyses were conducted. Two walking latent classes were identified representing low (52.8% of males in the cohort, 33.5% of females) and high (47.2%, 66.5%) levels of activity. Similar low (91.4%, 82.1%) and high (8.6%, 17.9%) classes were found for cycling, while 3 classes were identified for leisure-time physical activity: “low activity” (46.2%, 48.2%), “sports and leisure activity” (31.0%, 35.3%), and “gardening and do-it-yourself activities” (22.8%, 16.5%). The classes were reasonably or very well separated, with the exception of walking in females. Latent class analysis was found to be a useful tool for characterizing longitudinal patterns of physical activity, even when the measurement instrument differs slightly across ages, which added value in comparison with observed activity at a single age. PMID:22074812
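The latent class analysis used in the cohort study above assigns each participant a probability of membership in each unobserved class given their item responses. A minimal EM sketch for binary items follows; the three-item "activity" data and the two-class setup are invented for illustration and are far simpler than the study's models.

```python
import math, random

def lca_em(data, n_classes=2, n_iter=50, seed=0):
    """Fit a latent class model to binary item responses with EM.
    data: list of 0/1 vectors. Returns (class weights, per-class item
    probabilities, per-subject class responsibilities)."""
    rng = random.Random(seed)
    n, k = len(data), len(data[0])
    pi = [1.0 / n_classes] * n_classes                   # class weights
    p = [[rng.uniform(0.25, 0.75) for _ in range(k)]     # P(item = 1 | class)
         for _ in range(n_classes)]
    for _ in range(n_iter):
        # E-step: posterior probability of each class for each subject
        resp = []
        for x in data:
            w = [pi[c] * math.prod(p[c][j] if x[j] else 1.0 - p[c][j]
                                   for j in range(k))
                 for c in range(n_classes)]
            s = sum(w)
            resp.append([v / s for v in w])
        # M-step: re-estimate class weights and item probabilities
        for c in range(n_classes):
            nc = sum(r[c] for r in resp)
            pi[c] = nc / n
            p[c] = [sum(r[c] * x[j] for r, x in zip(resp, data)) / nc
                    for j in range(k)]
    return pi, p, resp

# Synthetic "high activity" vs "low activity" response patterns (3 items)
data = ([[1, 1, 1], [1, 1, 0], [1, 1, 1]] * 10
        + [[0, 0, 0], [0, 1, 0], [0, 0, 0]] * 10)
pi, p, resp = lca_em(data)
print([round(w, 2) for w in pi])
```

On this clearly separated toy data the two recovered classes split the sample roughly in half, mirroring how the study's low/high walking classes were identified from repeated self-report items.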
Objective determination of image end-members in spectral mixture analysis of AVIRIS data
NASA Technical Reports Server (NTRS)
Tompkins, Stefanie; Mustard, John F.; Pieters, Carle M.; Forsyth, Donald W.
1993-01-01
Spectral mixture analysis has been shown to be a powerful, multifaceted tool for analysis of multi- and hyper-spectral data. Applications of AVIRIS data have ranged from mapping soils and bedrock to ecosystem studies. During the first phase of the approach, a set of end-members is selected from an image cube (image end-members) that best accounts for its spectral variance within a constrained, linear least squares mixing model. These image end-members are usually selected using a priori knowledge and successive trial-and-error solutions to refine the total number and physical location of the end-members. However, in many situations a more objective method of determining these essential components is desired. We approach the problem of image end-member determination objectively by using the inherent variance of the data. Unlike purely statistical methods such as factor analysis, this approach derives solutions that conform to a physically realistic model.
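The constrained linear mixing model described above treats each pixel spectrum as a fractional combination of end-member spectra. For the two-end-member case with the sum-to-one constraint, the least-squares fraction has a closed form, sketched below; the "soil" and "vegetation" spectra are invented four-band values for illustration.

```python
def unmix_two(pixel, e1, e2):
    """Fraction f of end-member e1 in a pixel modeled as f*e1 + (1-f)*e2,
    found by least squares with the sum-to-one constraint built in:
    f = <pixel - e2, e1 - e2> / ||e1 - e2||^2."""
    d = [a - b for a, b in zip(e1, e2)]
    num = sum((pv - bv) * dv for pv, bv, dv in zip(pixel, e2, d))
    den = sum(dv * dv for dv in d)
    f = num / den
    return min(1.0, max(0.0, f))   # clip to the physically realistic range

e_soil = [0.30, 0.35, 0.40, 0.45]   # illustrative 4-band end-member spectra
e_veg  = [0.05, 0.08, 0.45, 0.50]
pixel  = [0.675 * a + 0.325 * b for a, b in zip(e_soil, e_veg)]
print(unmix_two(pixel, e_soil, e_veg))
```

With more end-members the same idea becomes a constrained linear least-squares problem solved per pixel; the clipping step reflects the physical requirement that fractions lie in [0, 1].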
Inquiry Science for Liberal Arts Students: A Topical Course on Sound
NASA Astrophysics Data System (ADS)
Pine, Jerry; Hinckley, Joy; Mims, Sandra; Smith, Joel
1997-04-01
We have developed a topical general studies physics course for liberal arts students, and particularly for preservice elementary teachers. The course is taught entirely in a lab and is based on a mix of student inquiries and ''sense-making'' in discussion. There are no lectures. A physics professor and a master elementary teacher co-lead. The students begin by conceptualizing the nature of sound by examining everyday phenomena, and then progress through a study of topics such as waves, interference, synthesis of complex sounds from pure tones, analysis of complex sounds into spectra, and independent projects. They use the computer program SoundEdit Pro and the Macintosh interface as a powerful tool for analysis and synthesis. The student response has been extremely enthusiastic, though most have come to the course with very strong physics anxiety. The course has so far been trial-taught at five California campuses, and incorporation into some of the regular curricula seems promising.
Meader, Nicholas; Mitchell, Alex J; Chew-Graham, Carolyn; Goldberg, David; Rizzo, Maria; Bird, Victoria; Kessler, David; Packham, Jon; Haddad, Mark; Pilling, Stephen
2011-01-01
Background Depression is more likely in patients with chronic physical illness, and is associated with increased rates of disability and mortality. Effective treatment of depression may reduce morbidity and mortality. The use of two stem questions for case finding in diabetes and coronary heart disease is advocated in the Quality and Outcomes Framework, and has become normalised into primary care. Aim To define the most effective tool for use in consultations to detect depression in people with chronic physical illness. Design Meta-analysis. Method The following data sources were searched: CENTRAL, CINAHL, Embase, HMIC, MEDLINE, PsycINFO, Web of Knowledge, from inception to July 2009. Three authors selected studies that examined identification tools and used an interview-based ICD (International Classification of Diseases) or DSM (Diagnostic and Statistical Manual of Mental Disorders) diagnosis of depression as reference standard. At least two authors independently extracted study characteristics and outcome data and assessed methodological quality. Results A total of 113 studies met the eligibility criteria, providing data on 20,826 participants. It was found that two stem questions, PHQ-9 (Patient Health Questionnaire), the Zung, and GHQ-28 (General Health Questionnaire) were the optimal measures for case identification, but no method was sufficiently accurate to recommend as a definitive case-finding tool. Limitations were the moderate-to-high heterogeneity for most scales and the facts that few studies used ICD diagnoses as the reference standard, and that a variety of methods were used to determine DSM diagnoses. Conclusion Assessing both validity and ease of use, the two stem questions are the preferred method. However, clinicians should not rely on the two-questions approach alone, but should be confident to engage in a more detailed clinical assessment of patients who score positively. PMID:22137418
A Neural-Network-Based Semi-Automated Geospatial Classification Tool
NASA Astrophysics Data System (ADS)
Hale, R. G.; Herzfeld, U. C.
2014-12-01
North America's largest glacier system, the Bering Bagley Glacier System (BBGS) in Alaska, surged in 2011-2013, as shown by rapid mass transfer, elevation change, and heavy crevassing. Little is known about the physics controlling surge glaciers' semi-cyclic patterns; therefore, it is crucial to collect and analyze as much data as possible so that predictive models can be made. In addition, physical signs frozen in ice in the form of crevasses may help serve as a warning for future surges. The BBGS surge provided an opportunity to develop an automated tool for crevasse classification based on imagery collected from small aircraft. The classification allows one to link image classification to geophysical processes associated with ice deformation. The tool uses an approach that employs geostatistical functions and a feed-forward perceptron with error back-propagation. The connectionist-geostatistical approach uses directional experimental (discrete) variograms to parameterize images into a form that the Neural Network (NN) can recognize. In an application to airborne videographic data from the surge of the BBGS, the NN was able to distinguish 18 different crevasse classes with 95 percent or higher accuracy over more than 3,000 images. Recognizing that each surge wave results in different crevasse types and that environmental conditions affect the appearance in imagery, we designed the tool's semi-automated pre-training algorithm to be adaptable. The tool can be optimized to specific settings and variables of image analysis: airborne and satellite imagery, different camera types, observation altitude, number and types of classes, and resolution.
The generalization of the classification tool brings three important advantages: (1) multiple types of problems in geophysics can be studied, (2) the training process is sufficiently formalized to allow non-experts in neural nets to perform the training process, and (3) the time required to manually pre-sort imagery into classes is greatly reduced.
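The directional experimental variogram that the connectionist-geostatistical approach uses to parameterize imagery can be sketched simply: for a chosen direction and lag, it is the mean half-squared difference between pixel pairs separated by that lag. The striped toy image below is an assumption standing in for crevassed terrain, not data from the BBGS study.

```python
def directional_variogram(img, dx, dy, max_lag):
    """Experimental (discrete) variogram of a 2-D grid along direction
    (dx, dy): gamma(h) = mean of 0.5 * (z(s + h*(dx, dy)) - z(s))^2
    over all pixel pairs at lag h that fit inside the grid."""
    rows, cols = len(img), len(img[0])
    gamma = []
    for h in range(1, max_lag + 1):
        sq, n = 0.0, 0
        for i in range(rows):
            for j in range(cols):
                i2, j2 = i + h * dy, j + h * dx
                if 0 <= i2 < rows and 0 <= j2 < cols:
                    sq += 0.5 * (img[i2][j2] - img[i][j]) ** 2
                    n += 1
        gamma.append(sq / n)
    return gamma

# A toy "crevassed" patch: vertical stripes give high variance across-strike
# and zero variance along-strike, a directional signature a NN can learn from.
img = [[(j % 2) * 255 for j in range(8)] for _ in range(8)]
print(directional_variogram(img, 1, 0, 3))   # across the stripes
print(directional_variogram(img, 0, 1, 3))   # along the stripes
```

The strong contrast between the two directions is exactly the kind of compact, orientation-sensitive signature that makes variograms a useful input encoding for a feed-forward classifier.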
Thermal protection system (TPS) monitoring using acoustic emission
NASA Astrophysics Data System (ADS)
Hurley, D. A.; Huston, D. R.; Fletcher, D. G.; Owens, W. P.
2011-04-01
This project investigates acoustic emission (AE) as a tool for monitoring the degradation of thermal protection systems (TPS). The AE sensors are part of an array of instrumentation on an inductively coupled plasma (ICP) torch designed for testing advanced thermal protection aerospace materials used for hypervelocity vehicles. AE are generated by stresses within the material, propagate as elastic stress waves, and can be detected with sensitive instrumentation. Graphite (POCO DFP-2) is used to study gas-surface interaction during degradation of thermal protection materials. The plasma is produced by an RF magnetic field driven by a 30 kW power supply at 3.5 MHz, which creates a noisy environment with large spikes when powered on or off. AE are waveguided from source to sensor by a liquid-cooled copper probe used to position the graphite sample in the plasma stream. Preliminary testing was used to set filters and thresholds on the AE detection system (Physical Acoustics PCI-2) to minimize the impact of considerable operating noise. Testing results show good correlation between AE data and the testing environment, which dictates the physics and chemistry of the thermal breakdown of the sample. Current efforts for the project are expanding the dataset and developing statistical analysis tools. This study shows the potential of AE as a powerful tool for analysis of thermal degradation of thermal protection materials, with the unique capability of real-time, in-situ monitoring.
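The threshold-and-filter step described above can be illustrated with a minimal AE "hit" detector: a sample counts as a hit when it exceeds a threshold, and subsequent crossings within a dead-time window are ignored to suppress ring-down. The threshold, dead time, and synthetic signal are illustrative assumptions, not settings from the PCI-2 system.

```python
def detect_hits(signal, threshold, dead_time):
    """Return sample indices of AE 'hits': threshold crossings separated by
    at least dead_time samples, a common way to avoid counting the ring-down
    of one burst (or a power-switching spike train) as many events."""
    hits, last = [], None
    for i, v in enumerate(signal):
        if abs(v) >= threshold and (last is None or i - last >= dead_time):
            hits.append(i)
            last = i
    return hits

# Synthetic record: quiet baseline, a three-sample burst, then a lone spike
signal = [0.02] * 50 + [1.5, 1.2, 0.9] + [0.02] * 50 + [2.0] + [0.02] * 50
print(detect_hits(signal, threshold=0.5, dead_time=10))
```

Real AE systems add band-pass filtering and per-hit features (amplitude, duration, energy) on top of this counting logic, which is where the project's statistical analysis tools would take over.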
Impact of design features upon perceived tool usability and safety
NASA Astrophysics Data System (ADS)
Wiker, Steven F.; Seol, Mun-Su
2005-11-01
While injuries from powered hand tools are caused by a number of factors, this study looks specifically at the impact of the tools' design features on perceived tool usability and safety. The tools used in this study are circular saws, power drills and power nailers. Sixty-nine males and thirty-two females completed an anonymous web-based questionnaire that provided orthogonal-view photographs of the various tools. The analysis comprised: 1) a description of the respondents or raters, 2) a description of the raters' responses, and 3) an analysis of the interrelationships among respondent ratings of tool safety and usability, physical metrics of the tools, and rater demographic information. The study found that safety and usability ratings depended materially upon rater history of use and experience, but not upon training in safety and usability or on the quality of the tools' design features (e.g., grip diameters, trigger design, guards, etc.). Thus, positive and negative transfer of prior experience with powered hand tools is far more important than any expectancy driven by prior safety and usability training or by the visual cues provided by the engineering design of the tool.
Systematic review of fall risk screening tools for older patients in acute hospitals.
Matarese, Maria; Ivziku, Dhurata; Bartolozzi, Francesco; Piredda, Michela; De Marinis, Maria Grazia
2015-06-01
To determine the most accurate fall risk screening tools for predicting falls among patients aged 65 years or older admitted to acute care hospitals. Falls represent a serious problem in older inpatients due to the potential physical, social, psychological and economic consequences. Older inpatients present with risk factors associated with age-related physiological and psychological changes as well as multiple morbidities. Thus, fall risk screening tools for older adults should include these specific risk factors. There are no published recommendations addressing what tools are appropriate for older hospitalized adults. Systematic review. MEDLINE, CINAHL and Cochrane electronic databases were searched from January 1981 to April 2013. Only prospective validation studies reporting sensitivity and specificity values were included. Recommendations of the Cochrane Handbook of Diagnostic Test Accuracy Reviews have been followed. Three fall risk assessment tools were evaluated in seven articles. Due to the limited number of studies, meta-analysis was carried out only for the STRATIFY and Hendrich Fall Risk Model II. In the combined analysis, the Hendrich Fall Risk Model II demonstrated higher sensitivity than STRATIFY, while the STRATIFY showed higher specificity. In both tools, the Youden index showed low prognostic accuracy. The identified tools do not demonstrate predictive values as high as needed for identifying older inpatients at risk for falls. For this reason, no tool can be recommended for fall detection. More research is needed to evaluate fall risk screening tools for older inpatients. © 2014 John Wiley & Sons Ltd.
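The Youden index used above to summarize prognostic accuracy is simply sensitivity plus specificity minus one, ranging from 0 (a tool no better than chance) to 1 (a perfect tool). A one-line sketch, with illustrative sensitivity/specificity values (assumptions, not the review's pooled estimates):

```python
def youden_index(sensitivity, specificity):
    """Youden's J = sensitivity + specificity - 1; a single-number summary of
    a screening tool's accuracy, 0 = useless, 1 = perfect discrimination."""
    return sensitivity + specificity - 1.0

# Illustrative values in the range typical for fall-risk screening tools
print(youden_index(0.84, 0.45))   # high sensitivity, low specificity
print(youden_index(0.70, 0.55))   # more balanced but still modest
```

Values around 0.3 or lower, as both examples give, are what the review means by "low prognostic accuracy": neither sensitivity-heavy nor balanced tools discriminate well enough to be recommended.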
NASA Astrophysics Data System (ADS)
Mendoza, A. M. M.; Rastaetter, L.; Kuznetsova, M. M.; Mays, M. L.; Chulaki, A.; Shim, J. S.; MacNeice, P. J.; Taktakishvili, A.; Collado-Vega, Y. M.; Weigand, C.; Zheng, Y.; Mullinix, R.; Patel, K.; Pembroke, A. D.; Pulkkinen, A. A.; Boblitt, J. M.; Bakshi, S. S.; Tsui, T.
2017-12-01
The Community Coordinated Modeling Center (CCMC), with the fundamental goal of aiding the transition of modern space science models into space weather forecasting while supporting space science research, has been serving as an integral hub for over 15 years, providing invaluable resources to both the space weather scientific and operational communities. CCMC has developed and provided innovative web-based point-of-access tools, ranging from the Runs-On-Request System, providing unprecedented global access to the largest collection of state-of-the-art solar and space physics models; Integrated Space Weather Analysis (iSWA), a powerful dissemination system for space weather information; advanced online visualization and analysis tools for more accurate interpretation of model results; and standard data formats for simulation data downloads; to mobile apps that let the scientific community view space weather data anywhere. In addition to supporting research and performing model evaluations, CCMC also supports space science education by hosting summer students through local universities. In this poster, we will showcase CCMC's latest innovative tools and services, including the tools that have revolutionized the way we do research and improved our operational space weather capabilities. CCMC's free tools and resources are all publicly available online (http://ccmc.gsfc.nasa.gov).
Developing a uniformed assessment tool to evaluate care service needs for disabled persons in Japan.
Takei, Teiji; Takahashi, Hiroshi; Nakatani, Hiroki
2008-05-01
Until recently, the care services for disabled persons in Japan had been under rigid public-sector control in terms of provision and funding. A reform introduced in 2003 brought a rapid increase in the utilization of services and a serious shortage of financial resources. Under these circumstances, the "Services and Supports for Persons with Disabilities Act" was enacted in 2005, requiring that the care service provision process be transparent, fair and standardized. The purpose of this study is to develop an objective tool for assessing the need for disability care. In the present study we evaluated 1,423 cases of patients receiving care services in 60 municipalities, covering all three categories of disability (physical, intellectual and mental). Using the data on all 106 items, we conducted factor analysis and regression analysis to develop an assessment tool for people with disabilities. The data revealed that instrumental activities of daily living (IADL) played an essential role in assessing disability levels. We have developed the uniformed assessment tool, which has been utilized to guide the types and quantity of care services throughout Japan.
NASA Astrophysics Data System (ADS)
Wang, Jianxiong
2014-06-01
This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013) which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in the computer, computing in Earth sciences, multivariate data analysis, automated computation in Quantum Field Theory as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open-source, knowledge sharing and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the workshop's activities. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang Institute of High Energy Physics Chinese Academy of Science Details of committees and sponsors are available in the PDF.
Developing the Stroke Exercise Preference Inventory (SEPI)
Bonner, Nicholas S.; O’Halloran, Paul D.; Bernhardt, Julie; Cumming, Toby B.
2016-01-01
Background Physical inactivity is highly prevalent after stroke, increasing the risk of poor health outcomes including recurrent stroke. Tailoring of exercise programs to individual preferences can improve adherence, but no tools exist for this purpose in stroke. Methods We identified potential questionnaire items for establishing exercise preferences via: (i) our preliminary Exercise Preference Questionnaire in stroke, (ii) similar tools used in other conditions, and (iii) expert panel consultations. The resulting 35-item questionnaire (SEPI-35) was administered to stroke survivors, along with measures of disability, depression, anxiety, fatigue and self-reported physical activity. Exploratory factor analysis was used to identify a factor structure in exercise preferences, providing a framework for item reduction. Associations between exercise preferences and personal characteristics were analysed using multivariable regression. Results A group of 134 community-dwelling stroke survivors (mean age 64.0, SD 13.3) participated. Analysis of the SEPI-35 identified 7 exercise preference factors (Supervision-support, Confidence-challenge, Health-wellbeing, Exercise context, Home-alone, Similar others, Music-TV). Item reduction processes yielded a 13-item version (SEPI-13); in analysis of this version, the original factor structure was maintained. Lower scores on Confidence-challenge were significantly associated with disability (p = 0.002), depression (p = 0.001) and fatigue (p = 0.001). Self-reported barriers to exercise were particularly prevalent in those experiencing fatigue and anxiety. Conclusions The SEPI-13 is a brief instrument that allows assessment of exercise preferences and barriers in the stroke population. This new tool can be employed by health professionals to inform the development of individually tailored exercise interventions. PMID:27711242
A5: Automated Analysis of Adversarial Android Applications
2014-06-03
algorithm is fairly intuitive. First, A5 invokes the DED [11] decompiler to create Java classes from the Android application code. Next, A5 uses Soot [30...implemented such as Bluetooth, Wi-Fi, sensors , etc. These hardware features are very common in physical devices and are simply not present in the...such as Androguard [1] and Soot [30]. Deficiencies in these tools may also manifest in A5. The bytecode static analysis is limited to finding only
Research in High Energy Physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, Robert John; Toki, Walter; Harton, John
This report summarizes research performed within the Department of Energy Office of Science's Intensity Frontier and Cosmic Frontier High Energy Physics research subprograms during the period 2014-17. The major research thrusts in the Intensity Frontier involved two currently active neutrino experiments, T2K and NOvA; participation in development for the new Short-Baseline Neutrino (SBN) program at Fermilab, which will begin full operation within the next one to two years; and physics tools, analysis and detector prototyping for the future Deep Underground Neutrino Experiment (DUNE). The major research thrusts in the Cosmic Frontier involved the Pierre Auger Observatory and the Directional Recoil Identification From Tracks (DRIFT) dark matter search experiment.
Papaleo, Elena
2015-01-01
In recent years, we have observed remarkable improvements in the field of protein dynamics. Indeed, we can now study protein dynamics in atomistic detail over several timescales with a rich portfolio of experimental and computational techniques. On one side, this provides us with the possibility to validate simulation methods and physical models against a broad range of experimental observables. On the other side, it also allows a complementary and comprehensive view of protein structure and dynamics. What is needed now is a better understanding of the link between the dynamic properties that we observe and the functional properties of these important cellular machines. To make progress in this direction, we need to improve the physical models used to describe proteins and solvent in molecular dynamics, as well as to strengthen the integration of experiments and simulations to overcome their respective limitations. Moreover, now that we have the means to study protein dynamics in great detail, we need new tools to understand the information embedded in protein ensembles and in their dynamic signatures. With this aim in mind, we should enrich the current tools for analysis of biomolecular simulations with attention to effects that can be propagated over long distances and are often associated with important biological functions. In this context, approaches inspired by network analysis can make an important contribution to the analysis of molecular dynamics simulations.
Lan, Shao-Huan; Lu, Li-Chin; Lan, Shou-Jen; Chen, Jong-Chen; Wu, Wen-Jun; Chang, Shen-Peng; Lin, Long-Yau
2017-08-01
Physical restraint, formerly used as a protective measure for psychiatric patients, is now widely used in long-term care. However, existing studies have shown that physical restraint not only provides inadequate protection but also has negative effects on residents. The aim was to analyze the impact of educational programs on physical restraint use in long-term care facilities, via a systematic review with meta-analysis and meta-regression. Eight databases, including the Cochrane Library, ProQuest, PubMed, EMBASE, EBSCO, Web of Science, Ovid Medline and the Physiotherapy Evidence Database (PEDro), were searched up to January 2017. Eligible studies were classified by intervention and assessed for quality using the Quality Assessment Tool for quantitative studies. Sixteen research articles were eligible in the final review; 10 randomized controlled trials were included in the analysis. The meta-analysis revealed that physical restraint was used significantly less often in the experimental (education) group (OR = 0.55, 95% CI: 0.39 to 0.78, p < 0.001) than in the control group. Meta-regression revealed that a longer period after the education decreased the effect of the restraint educational program (β: 0.08, p = 0.002), whereas a longer education period and more education sessions strengthened the effect of reducing the use of physical restraint (β: -0.07, p < 0.001; β: -0.04, p = 0.056). The educational programs reduced the use of physical restraint. The meta-regression results suggest that long-term care facilities should provide a continuous physical restraint education program for caregivers. Copyright © 2017. Published by Elsevier Taiwan.
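The pooled estimate reported in such a meta-analysis (here OR = 0.55, 95% CI 0.39 to 0.78) comes from inverse-variance weighting of per-study log odds ratios. A minimal fixed-effect sketch, using hypothetical 2x2 counts rather than the review's data:

```python
import math

def pooled_odds_ratio(studies):
    """Fixed-effect inverse-variance pooling of odds ratios.

    `studies` is a list of (a, b, c, d) 2x2 cell counts: events/non-events
    in the intervention group (a, b) and control group (c, d).
    Assumes no zero cells (a continuity correction would be needed otherwise).
    """
    weights, log_ors = [], []
    for a, b, c, d in studies:
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d  # variance of the log OR
        weights.append(1 / var)
        log_ors.append(log_or)
    pooled = sum(w * lo for w, lo in zip(weights, log_ors)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
    return math.exp(pooled), ci
```

A random-effects model (e.g. DerSimonian-Laird) would add a between-study variance term to each weight; the fixed-effect version above is only the simplest case.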
Critical voids in exposure data and models lead risk assessors to rely on conservative assumptions. Risk assessors and managers need improved tools beyond the screening level analysis to address aggregate exposures to pesticides as required by the Food Quality Protection Act o...
ERIC Educational Resources Information Center
Ding, Lin
2014-01-01
Discipline-based science concept assessments are powerful tools to measure learners' disciplinary core ideas. Among many such assessments, the Brief Electricity and Magnetism Assessment (BEMA) has been broadly used to gauge student conceptions of key electricity and magnetism (E&M) topics in college-level introductory physics courses.…
Goal Programming: A New Tool for the Christmas Tree Industry
Bruce G. Hansen
1977-01-01
Goal programming (GP) can be useful for decision making in the natural Christmas tree industry. Its usefulness is demonstrated through an analysis of a hypothetical problem in which two potential growers decide how to use 10 acres in growing Christmas trees. Though the physical settings are identical, distinct differences between their goals significantly influence the...
ERIC Educational Resources Information Center
Famularo, Nicole; Kholod, Yana; Kosenkov, Dmytro
2016-01-01
This project is designed to improve physical chemistry and instrumental analysis laboratory courses for undergraduate students by employing as teaching tools novel technologies in electronics and data integration using the industrial Internet. The project carried out by upper-division undergraduates is described. Students are exposed to a complete…
USDA-ARS?s Scientific Manuscript database
The Consumo Alimentar e Atividade Fisica de Escolares (CAAFE) questionnaire is an online research tool that has been developed to enable the self-report of physical activity and diet by Brazilian schoolchildren aged 7–10 years. Formative research was conducted with nutritionists during the developme...
Videos Determine the Moon's "g"
ERIC Educational Resources Information Center
Persson, J. R.; Hagen, J. E.
2011-01-01
Determining the acceleration of a free-falling object due to gravity is a standard experiment in physics. Different methods to do this have been developed over the years. This article discusses the use of video-analysis tools as another method. If there is a video available and a known scale it is possible to analyse the motion. The use of video…
Toward better physics labs for future biologists
NASA Astrophysics Data System (ADS)
Moore, K.; Giannini, J.; Losert, W.
2014-05-01
We have developed a set of laboratories and hands-on activities to accompany a new two-semester interdisciplinary physics course that has been developed and tested in two small test classes at the University of Maryland, College Park (UMD) in 2012-2013. We have designed the laboratories to accompany a reformed course taken in the student's second year, with calculus, biology, and chemistry as prerequisites. These prerequisites permit the laboratories to include significant content on physics relevant to cellular scales, from chemical interactions to random motion and charge screening in fluids. We also introduce students to research-grade equipment and modern physics analysis tools in contexts relevant to biology while maintaining the pedagogically valuable open-ended laboratory structure of reformed laboratories. Preliminary student response results from these two classes are discussed.
Rallis, Austin; Fercho, Kelene A; Bosch, Taylor J; Baugh, Lee A
2018-01-31
Tool use is associated with three visual streams: the dorso-dorsal, ventro-dorsal, and ventral visual streams. These streams are involved in processing online motor planning, action semantics, and tool semantics features, respectively. Little is known about the way in which the brain represents virtual tools. To directly assess this question, a virtual tool paradigm was created that provided the ability to manipulate tool components in isolation of one another. During functional magnetic resonance imaging (fMRI), adult participants performed a series of virtual tool manipulation tasks in which vision and movement kinematics of the tool were manipulated. Reaction time and hand movement direction were monitored while the tasks were performed. Functional imaging revealed that activity within all three visual streams was present, in a similar pattern to what would be expected with physical tool use. However, a previously unreported network of right-hemisphere activity was found, including the right inferior parietal lobule, middle and superior temporal gyri and supramarginal gyrus, regions well known to be associated with tool processing within the left hemisphere. These results provide evidence that both virtual and physical tools are processed within the same brain regions, though virtual tools recruit bilateral tool processing regions to a greater extent than physical tools. Copyright © 2017 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morozov, Dmitriy; Weber, Gunther H.
2014-03-31
Topological techniques provide robust tools for data analysis. They are used, for example, for feature extraction, for data de-noising, and for comparison of data sets. This chapter concerns contour trees, a topological descriptor that records the connectivity of the isosurfaces of scalar functions. These trees are fundamental to analysis and visualization of physical phenomena modeled by real-valued measurements. We study the parallel analysis of contour trees. After describing a particular representation of a contour tree, called the local-global representation, we illustrate how different problems that rely on contour trees can be solved in parallel with minimal communication.
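The connectivity bookkeeping behind a contour tree can be illustrated with its simpler half, the join tree, computed by sweeping vertices in order of scalar value and merging components with union-find. This is a toy sketch on hypothetical graph data, not the chapter's parallel local-global algorithm:

```python
def join_tree(values, edges):
    """Return merge events (child_root, parent_root, value) recorded while
    sweeping vertices from low to high scalar value; each event marks two
    sublevel-set components joining at that value."""
    parent = list(range(len(values)))

    def find(x):  # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    adj = [[] for _ in values]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    order = sorted(range(len(values)), key=lambda v: values[v])
    seen, events = set(), []
    for v in order:
        seen.add(v)
        for u in adj[v]:
            if u in seen:
                ru, rv = find(u), find(v)
                if ru != rv:  # two components meet at vertex v
                    events.append((ru, rv, values[v]))
                    parent[ru] = rv
    return events
```

Running the same sweep from high to low values gives the split tree; combining the two yields the contour tree.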
Physics Mining of Multi-Source Data Sets
NASA Technical Reports Server (NTRS)
Helly, John; Karimabadi, Homa; Sipes, Tamara
2012-01-01
Powerful new parallel data mining algorithms can produce diagnostic and prognostic numerical models and analyses from observational data. These techniques yield higher-resolution measures of environmental parameters than ever before by fusing synoptic imagery and time-series measurements. These techniques are general and relevant to observational data, including raster, vector, and scalar, and can be applied in all Earth- and environmental science domains. Because they can be highly automated and are parallel, they scale to large spatial domains and are well suited to change and gap detection. This makes it possible to analyze spatial and temporal gaps in information, and facilitates within-mission replanning to optimize the allocation of observational resources. The basis of the innovation is the extension of a recently developed set of algorithms packaged into MineTool to multi-variate time-series data. MineTool is unique in that it automates the various steps of the data mining process, thus making it amenable to autonomous analysis of large data sets. Unlike techniques such as artificial neural nets, which yield a black-box solution, MineTool's outcome is always an analytical model in parametric form that expresses the output in terms of the input variables. This has the advantage that the derived equation can then be used to gain insight into the physical relevance and relative importance of the parameters and coefficients in the model. This is referred to as physics-mining of data. The capabilities of MineTool are extended to include both supervised and unsupervised algorithms, to handle multi-type data sets, and to run in parallel.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jalving, Jordan; Abhyankar, Shrirang; Kim, Kibaek; ...
2017-04-24
Here, we present a computational framework that facilitates the construction, instantiation, and analysis of large-scale optimization and simulation applications of coupled energy networks. The framework integrates the optimization modeling package PLASMO and the simulation package DMNetwork (built around PETSc). These tools use a common graph-based abstraction that enables us to achieve compatibility between data structures and to build applications that use network models of different physical fidelity. We also describe how to embed these tools within complex computational workflows using SWIFT, which is a tool that facilitates parallel execution of multiple simulation runs and management of input and output data. We discuss how to use these capabilities to target coupled natural gas and electricity systems.
Pattern detection in stream networks: Quantifying spatial variability in fish distribution
Torgersen, Christian E.; Gresswell, Robert E.; Bateman, Douglas S.
2004-01-01
Biological and physical properties of rivers and streams are inherently difficult to sample and visualize at the resolution and extent necessary to detect fine-scale distributional patterns over large areas. Satellite imagery and broad-scale fish survey methods are effective for quantifying spatial variability in biological and physical variables over a range of scales in marine environments but are often too coarse in resolution to address conservation needs in inland fisheries management. We present methods for sampling and analyzing multiscale, spatially continuous patterns of stream fishes and physical habitat in small- to medium-size watersheds (500–1000 hectares). Geospatial tools, including geographic information system (GIS) software such as ArcInfo dynamic segmentation and ArcScene 3D analyst modules, were used to display complex biological and physical datasets. These tools also provided spatial referencing information (e.g. Cartesian and route-measure coordinates) necessary for conducting geostatistical analyses of spatial patterns (empirical semivariograms and wavelet analysis) in linear stream networks. Graphical depiction of fish distribution along a one-dimensional longitudinal profile and throughout the stream network (superimposed on a 10-metre digital elevation model) provided the spatial context necessary for describing and interpreting the relationship between landscape pattern and the distribution of coastal cutthroat trout (Oncorhynchus clarki clarki) in western Oregon, U.S.A. The distribution of coastal cutthroat trout was highly autocorrelated and exhibited a spherical semivariogram with a defined nugget, sill, and range. Wavelet analysis of the main-stem longitudinal profile revealed periodicity in trout distribution at three nested spatial scales corresponding ostensibly to landscape disturbances and the spacing of tributary junctions.
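The empirical semivariogram used in such geostatistical analyses averages half the squared differences between pairs of observations binned by separation distance. A brute-force sketch over a one-dimensional route measure (hypothetical data, not the study's code):

```python
import numpy as np

def empirical_semivariogram(positions, values, bin_width, n_bins):
    """Empirical semivariogram gamma(h) = 0.5 * mean[(z_i - z_j)^2] over
    pairs binned by separation distance h along a stream (1-D route measure).
    Bins with no pairs are returned as NaN."""
    positions = np.asarray(positions, dtype=float)
    values = np.asarray(values, dtype=float)
    gammas = np.zeros(n_bins)
    counts = np.zeros(n_bins, dtype=int)
    n = len(values)
    for i in range(n):
        for j in range(i + 1, n):
            h = abs(positions[i] - positions[j])
            b = int(h // bin_width)
            if b < n_bins:
                gammas[b] += 0.5 * (values[i] - values[j]) ** 2
                counts[b] += 1
    return np.where(counts > 0, gammas / np.maximum(counts, 1), np.nan)
```

Fitting a spherical model to the resulting curve is what yields the nugget, sill, and range reported for the trout distribution above.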
Lee, Miyoung; Zhu, Weimo; Ackley-Holbrook, Elizabeth; Brower, Diana G; McMurray, Bryan
2014-07-01
It is critical to employ accurate measures when assessing physical activity (PA) barriers in any subpopulation, yet existing measures are not appropriate for persons with blindness or visual impairment (PBVI) due to a lack of validity or reliability evidence. The aim was to develop and calibrate a PA barriers scale for PBVI. An expert panel (n = 3) and 18 PBVI were recruited to establish content validity for a PA barriers subscale; 160 PBVI (96 females) completed the scale along with the Physical Activity Scale for Individuals with Physical Disabilities for calibration. To establish construct-related validity evidence, confirmatory factor analysis (CFA) and Rasch analysis were applied. To investigate internal consistency and reliability, Cronbach's alpha and the reliability coefficient (R) were employed, respectively. Following CFA and Rasch analyses, five items were eliminated due to misfit; reliability coefficients were unchanged upon deletion of these items. The barriers perceived by PBVI to have the most negative impact on PA included "lack of self-discipline" (logit = 1.40) and "lack of motivation" (logit = 1.27). "Too many stairs in the exercise facility" (logit = -1.49) was perceived to have the least impact. The newly developed scale was found to be a valid and reliable tool for evaluating PA barriers in PBVI. To enhance promotion of health-producing levels of PA in PBVI, practitioners should consider applying this new tool as a precursor to programs aimed at improving PA participation in this group. Copyright © 2014 Elsevier Inc. All rights reserved.
Advanced Computing Tools and Models for Accelerator Physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryne, Robert; Ryne, Robert D.
2008-06-11
This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.
Cossio-Bolaños, Marco; Vasquez, Pablo; Luarte-Rocha, Cristian; Sulla-Torres, José; Gómez Campos, Rossana
2016-08-01
Physical fitness may be assessed among children and adolescents in a quantitative and qualitative manner. At present, in Chile, there are no tools available to assess self-perception of physical fitness. Therefore, the purpose of this study was to develop a valid and reliable instrument that would allow assessment of self-perception of physical fitness among adolescents, and to propose standards for age and sex. A survey was administered among adolescent students from six public schools in the Maule Region, Chile, selected in a probabilistic (stratified) fashion. To measure self-perception of physical fitness, a qualitative instrument was developed: the Self-Perception of Physical Fitness Scale (EAPAF, escala de autopercepcion de la aptitud fisica), which is made up of four dimensions and 18 questions. The LMS method (L: Box-Cox coefficient, M: median curve, and S: variation coefficient) was used to establish percentiles and propose references by dimension, age and sex. A total of 3060 adolescents (1702 boys and 1358 girls) aged 11.0 to 18.9 years old were included. The factor analysis evidenced four factors. Saturation values were above 0.40. The percentage of instrument explanation reached 54.24%. In terms of reliability, the 18 questions reflected that Cronbach's alpha was between 0.82 and 0.85. Percentiles (p15, p50 and p85) were developed to classify self-perception of physical fitness by dimension, age and sex. Boys showed higher scores in the self-perception of physical fitness scale when compared to girls (p < 0.001). The instrument developed in this study was valid and reliable. In addition, the standards proposed may become a useful tool to classify adolescents in relation to their self-perception of physical fitness. Sociedad Argentina de Pediatría.
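The LMS method used to build such references defines each centile curve through the transformation C(z) = M(1 + L*S*z)^(1/L) for L != 0, with a log-normal limit at L = 0 (Cole's formulation). A minimal sketch of the forward and inverse transforms; the parameter values in the example are hypothetical, not the study's:

```python
import math

def lms_zscore(y, L, M, S):
    """Z-score of a measurement y given LMS parameters at the subject's age."""
    if abs(L) < 1e-12:
        return math.log(y / M) / S  # log-normal limit as L -> 0
    return ((y / M) ** L - 1.0) / (L * S)

def lms_centile(z, L, M, S):
    """Measurement value at a given z-score (inverse of lms_zscore)."""
    if abs(L) < 1e-12:
        return M * math.exp(S * z)
    return M * (1.0 + L * S * z) ** (1.0 / L)
```

For example, the p85 cutoff at a given age is `lms_centile(1.036, L, M, S)` with that age's smoothed L, M and S values.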
FAST Modularization Framework for Wind Turbine Simulation: Full-System Linearization
Jonkman, Jason M.; Jonkman, Bonnie J.
2016-10-03
The wind engineering community relies on multiphysics engineering software to run nonlinear time-domain simulations e.g. for design-standards-based loads analysis. Although most physics involved in wind energy are nonlinear, linearization of the underlying nonlinear system equations is often advantageous to understand the system response and exploit well-established methods and tools for analyzing linear systems. Here, this paper presents the development and verification of the new linearization functionality of the open-source engineering tool FAST v8 for land-based wind turbines, as well as the concepts and mathematical background needed to understand and apply it correctly.
FAST modularization framework for wind turbine simulation: full-system linearization
NASA Astrophysics Data System (ADS)
Jonkman, J. M.; Jonkman, B. J.
2016-09-01
The wind engineering community relies on multiphysics engineering software to run nonlinear time-domain simulations e.g. for design-standards-based loads analysis. Although most physics involved in wind energy are nonlinear, linearization of the underlying nonlinear system equations is often advantageous to understand the system response and exploit well-established methods and tools for analyzing linear systems. This paper presents the development and verification of the new linearization functionality of the open-source engineering tool FAST v8 for land-based wind turbines, as well as the concepts and mathematical background needed to understand and apply it correctly.
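Linearization of this kind reduces, at its core, to evaluating Jacobians of the nonlinear state equations about an operating point. A generic finite-difference sketch (an illustration of the concept, not FAST's implementation, which linearizes the module-level system equations analytically where possible):

```python
import numpy as np

def linearize(f, x0, u0, eps=1e-6):
    """Numerically linearize x_dot = f(x, u) about an operating point (x0, u0),
    returning the state matrix A = df/dx and input matrix B = df/du via
    central finite differences."""
    x0, u0 = np.asarray(x0, float), np.asarray(u0, float)
    n, m = len(x0), len(u0)
    A, B = np.zeros((n, n)), np.zeros((n, m))
    for j in range(n):
        dx = np.zeros(n)
        dx[j] = eps
        A[:, j] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
    for j in range(m):
        du = np.zeros(m)
        du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
    return A, B
```

The eigenvalues of A then give the small-perturbation modes about that operating point, which is what makes linear analysis tools applicable.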
NASA Technical Reports Server (NTRS)
Kolecki, Joseph C.
2005-01-01
Tensor analysis is one of the more abstruse, even if one of the more useful, higher math subjects enjoined by students of physics and engineering. It is abstruse because of the intellectual gap that exists between where most physics and engineering mathematics leave off and where tensor analysis traditionally begins. It is useful because of its great generality, computational power, and compact, easy to use notation. This paper bridges the intellectual gap. It is divided into three parts: algebra, calculus, and relativity. Algebra: In tensor analysis, coordinate-independent quantities are sought for applications in physics and engineering. Coordinate independence means that the quantities have such coordinate transformations as to leave them invariant relative to a particular observer's coordinate system. Calculus: Non-zero base vector derivatives contribute terms to dynamical equations that correspond to pseudoaccelerations in accelerated coordinate systems and to curvature or gravity in relativity. These derivatives have a specific general form in tensor analysis. Relativity: Spacetime has an intrinsic geometry. Light is the tool for investigating that geometry. Since the observed geometry of spacetime cannot be made to match the classical geometry of Euclid, Einstein applied another, more general, geometry: differential geometry. The merger of differential geometry and cosmology was accomplished in the theory of relativity. In relativity, gravity is equivalent to curvature.
Physical activity level and fall risk among community-dwelling older adults.
Low, Sok Teng; Balaraman, Thirumalaya
2017-07-01
[Purpose] To find the physical activity level and fall risk among community-dwelling Malaysian older adults and determine the correlation between them. [Subjects and Methods] A cross-sectional study was conducted in which the physical activity level was evaluated using the Rapid Assessment of Physical Activity questionnaire and fall risk with the Fall Risk Assessment Tool. Subjects recruited were 132 community-dwelling Malaysian older adults, using the convenience sampling method. [Results] The majority of the participants were under the category of under-active regular light-activities, and most of them reported low fall risk. The statistical analysis using Fisher's exact test did not show a significant correlation between physical activity level and fall risk. [Conclusion] The majority of community-dwelling Malaysian older adults are performing some form of physical activity and are in the low fall-risk category. However, this study did not find any significant correlation between physical activity level and fall risk among community-dwelling older adults in Malaysia.
NASA Astrophysics Data System (ADS)
Bompard, E.; Ma, Y. C.; Ragazzi, E.
2006-03-01
Competition has been introduced in electricity markets with the goal of reducing prices and improving efficiency. The basic idea behind this choice is that, in competitive markets, a greater quantity of the good is exchanged at a lower price, leading to higher market efficiency. Electricity markets differ markedly from other commodity markets, mainly due to the physical constraints related to the network structure that may impact market performance. The network structure of the system on which the economic transactions need to be undertaken poses strict physical and operational constraints. Strategic interactions among producers that game the market with the objective of maximizing their producer surplus must be taken into account when modeling competitive electricity markets. The physical constraints, specific to electricity markets, provide additional opportunities for gaming by the market players. Game theory provides a tool to model such a context. This paper discusses the application of game theory to physically constrained electricity markets with the goal of providing tools for assessing market performance and pinpointing the critical network constraints that may impact market efficiency. The basic models of game theory specifically designed to represent electricity markets are presented. The IEEE 30-bus test system of the constrained electricity market is discussed to show the network impacts on market performance in the presence of strategic bidding behavior by the producers.
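Absent network constraints, the simplest game-theoretic model of producer interaction is Cournot competition with linear inverse demand, where each producer's Nash-equilibrium quantity satisfies its best response to the others' output. A best-response iteration sketch with illustrative parameters (a toy model, not the paper's IEEE 30-bus formulation):

```python
def cournot_equilibrium(a, b, costs, iters=200):
    """Cournot-Nash quantities for n producers facing inverse demand
    P = a - b * Q (Q = total output) and constant marginal costs c_i,
    found by iterating the best responses
    q_i = (a - c_i - b * sum_{j != i} q_j) / (2 * b)."""
    q = [0.0] * len(costs)
    for _ in range(iters):
        for i, c in enumerate(costs):
            others = sum(q) - q[i]
            q[i] = max(0.0, (a - c - b * others) / (2 * b))
    return q
```

Adding transmission constraints to such a game shifts the equilibrium and creates the extra gaming opportunities discussed above, which is precisely the effect studied on the constrained test system.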
Lange, Toni; Matthijs, Omer; Jain, Nitin B; Schmitt, Jochen; Lützner, Jörg; Kopkow, Christian
2017-03-01
Shoulder pain in the general population is common, and to identify the aetiology of shoulder pain, history taking, motion and muscle testing, and physical examination tests are usually performed. The aim of this systematic review was to summarise and evaluate the intrarater and inter-rater reliability of physical examination tests in the diagnosis of shoulder pathologies. A comprehensive systematic literature search was conducted using MEDLINE, EMBASE, the Allied and Complementary Medicine Database (AMED) and the Physiotherapy Evidence Database (PEDro) through 20 March 2015. Methodological quality was assessed using the Quality Appraisal of Reliability Studies (QAREL) tool by 2 independent reviewers. The search strategy revealed 3259 articles, of which 18 finally met the inclusion criteria. These studies evaluated the reliability of 62 tests and test variations used in the specific physical examination tests for the diagnosis of shoulder pathologies. Methodological quality ranged from 2 to 7 positive criteria of the 11 items of the QAREL tool. This review identified a lack of high-quality studies evaluating inter-rater as well as intrarater reliability of specific physical examination tests for the diagnosis of shoulder pathologies. In addition, reliability measures differed between included studies, hindering proper cross-study comparisons. PROSPERO CRD42014009018. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
A Python-based interface to examine motions in time series of solar images
NASA Astrophysics Data System (ADS)
Campos-Rozo, J. I.; Vargas Domínguez, S.
2017-10-01
Python is considered a mature programming language and is widely accepted as an engaging option for scientific analysis in multiple areas, as will be presented in this work for the particular case of solar physics research. SunPy is an open-source Python library that has been recently developed to furnish software tools for solar data analysis and visualization. In this work we present a graphical user interface (GUI) based on Python and Qt to efficiently compute proper motions for the analysis of time series of solar data. This user-friendly computing interface, which is intended to be incorporated into the SunPy library, uses a local correlation tracking technique and some extra tools that allow the selection of different parameters to calculate, visualize and analyze vector velocity fields of solar data, i.e. time series of solar filtergrams and magnetograms.
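Local correlation tracking, the technique behind this interface, estimates a displacement field by finding, for each subimage, the shift that maximizes the correlation between corresponding patches of consecutive frames. A toy integer-shift sketch of that core step (real implementations add apodizing windows and subpixel interpolation):

```python
import numpy as np

def lct_shift(patch_a, patch_b, max_shift=3):
    """Estimate the (dy, dx) displacement between two image patches by
    maximizing the cross-correlation over integer shifts -- the core idea
    of local correlation tracking, at integer precision only."""
    best, best_shift = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # undo a candidate shift of patch_b and score the overlap
            shifted = np.roll(np.roll(patch_b, -dy, axis=0), -dx, axis=1)
            score = np.sum(patch_a * shifted)
            if score > best:
                best, best_shift = score, (dy, dx)
    return best_shift
```

Applying this over a grid of overlapping subimages of two consecutive filtergrams yields the vector velocity field described above.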
#LancerHealth: Using Twitter and Instagram as a tool in a campus wide health promotion initiative.
Santarossa, Sara; Woodruff, Sarah J
2018-02-05
The present study aimed to explore using popular technology that people already have/use as a health promotion tool, in a campus-wide social media health promotion initiative entitled #LancerHealth. During a two-week period, the university community was asked to share photos on Twitter and Instagram of "What does being healthy on campus look like to you?", while tagging the image with #LancerHealth. All publicly tagged media were collected using the Netlytic software and analysed. Text analysis (N=234 records, Twitter; N=141 records, Instagram) revealed that the majority of the conversation was positive and focused on health and the university. Social network analysis, based on five network properties, showed a small network with little interaction. Lastly, photo coding analysis (N=71 unique images) indicated that the majority of the shared images were of physical activity (52%) and on campus (80%). Further research into this area is warranted.
XIMPOL: a new x-ray polarimetry observation-simulation and analysis framework
NASA Astrophysics Data System (ADS)
Omodei, Nicola; Baldini, Luca; Pesce-Rollins, Melissa; di Lalla, Niccolò
2017-08-01
We present a new simulation framework, XIMPOL, based on the Python programming language and the SciPy stack, specifically developed for X-ray polarimetric applications. XIMPOL is not tied to any specific mission or instrument design and is meant to produce fast and yet realistic observation-simulations, given as basic inputs: (i) an arbitrary source model including morphological, temporal, spectral and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function and the modulation factor. The format of the response files is OGIP compliant, and the framework is capable of producing output files that can be directly fed into the standard visualization and analysis tools used by the X-ray community, including XSPEC, which makes it a useful tool not only for simulating physical systems but also for developing and testing end-to-end analysis chains.
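The basic folding step such a framework performs, multiplying a source spectrum by the effective area to get expected counts per energy bin, can be sketched as follows (all names and numbers are illustrative, not XIMPOL's API; energy dispersion, PSF and modulation factor are omitted):

```python
import numpy as np

def expected_counts(energies, flux, aeff, exposure):
    """Fold a differential source spectrum (photons / cm^2 / s / keV)
    with an effective-area curve (cm^2) to get expected counts per bin."""
    de = np.diff(energies)                    # bin widths (keV)
    mid_flux = 0.5 * (flux[:-1] + flux[1:])   # trapezoidal bin average
    mid_aeff = 0.5 * (aeff[:-1] + aeff[1:])
    return mid_flux * mid_aeff * de * exposure

energies = np.linspace(2.0, 8.0, 61)          # keV grid
flux = 1e-2 * energies**-2.0                  # toy power-law spectrum
aeff = np.full_like(energies, 100.0)          # flat 100 cm^2 effective area
counts = expected_counts(energies, flux, aeff, exposure=10_000.0)
print(counts.sum())                           # ~3750 expected photons
```

A Monte Carlo observation-simulation would then draw Poisson samples from these expectations and assign each photon an arrival time, sky position and modulation angle.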
Preliminary Dynamic Feasibility and Analysis of a Spherical, Wind-Driven (Tumbleweed), Martian Rover
NASA Technical Reports Server (NTRS)
Flick, John J.; Toniolo, Matthew D.
2005-01-01
The process and findings are presented from a preliminary feasibility study examining the dynamic characteristics of a spherical, wind-driven (or Tumbleweed) rover intended for exploration of the Martian surface. The results of an initial feasibility study involving several worst-case mobility situations that a Tumbleweed rover might encounter on the surface of Mars are discussed. Additional topics include the evaluation of several commercially available analysis software packages that were examined as possible platforms for the development of a Monte Carlo Tumbleweed mission simulation tool. This evaluation led to the development of the Mars Tumbleweed Monte Carlo Simulator (or Tumbleweed Simulator) using the Vortex physics software package from CM-Labs, Inc. Discussions regarding the development and evaluation of the Tumbleweed Simulator, as well as the results of a preliminary analysis using the tool, are also presented. Finally, brief conclusions are presented.
Evaluating an holistic assessment tool for palliative care practice.
McIlfatrick, Sonja; Hasson, Felicity
2014-04-01
To evaluate a holistic assessment tool for palliative care practice. This included identifying patients' needs using the holistic tool and exploring the usability, applicability and barriers and facilitators towards implementation in practice. The delivery of effective holistic palliative care requires a careful assessment of the patients' needs and circumstances. Whilst holistic assessment of palliative care needs is advocated, questions exist around the appropriateness of tools to assist this process. Mixed-method research design. Data collection involved an analysis of piloted holistic assessments undertaken using the tool (n = 132) and two focus groups with healthcare professionals (n = 10). The tool enabled health professionals to identify and gain an understanding of the needs of the patients, specifically in relation to the physical healthcare needs. Differences, however, between the analysis of the tool documentation and focus group responses were identified in particular areas. For example, 59 (68·8%) respondents had discussed preferred priorities of care with the patient; however, focus group comments revealed participants had concerns around this. Similarly, whilst over half of responses (n = 50; 57·5%) had considered a prognostic clinical indicator for the patient as an action, focus group results indicated questions around healthcare professionals' knowledge and perceived usefulness of such indicators. Positive aspects of the tool were that it was easy to understand and captured the needs of individuals. Negative aspects of the tool were that it was repetitive and the experience of assessors required consideration. The tool evaluation identified questions regarding holistic assessment in palliative care practice and the importance of communication. 
A holistic assessment tool can support patient assessment and identification of patients' needs in the 'real world' of palliative care practice, but the 'tool' is merely an aid to assist professionals to discuss difficult and sensitive aspects of care. © 2013 John Wiley & Sons Ltd.
Simple Sensitivity Analysis for Orion GNC
NASA Technical Reports Server (NTRS)
Pressburger, Tom; Hoelscher, Brian; Martin, Rodney; Sricharan, Kumar
2013-01-01
The performance of Orion flight software, especially its GNC software, is being analyzed by running Monte Carlo simulations of Orion spacecraft flights. The simulated performance is analyzed for conformance with flight requirements, expressed as performance constraints. Flight requirements include guidance (e.g., touchdown distance from target) and control (e.g., control saturation) as well as performance (e.g., heat load constraints). The Monte Carlo simulations disperse hundreds of simulation input variables, for everything from mass properties to date of launch. We describe in this paper a sensitivity analysis tool (Critical Factors Tool, or CFT) developed to find the input variables, or pairs of variables, which by themselves significantly influence satisfaction of requirements or significantly affect key performance metrics (e.g., touchdown distance from target). Knowing these factors can inform robustness analysis, can inform where engineering resources are most needed, and could even affect operations. The contributions of this paper include the introduction of novel sensitivity measures, such as estimating success probability, and a technique for determining whether pairs of factors are interacting dependently or independently. Input variables such as moments, mass, thrust dispersions, and date of launch were found to be significant factors for the success of various requirements. Examples are shown in this paper, as well as a summary and physics discussion of the EFT-1 driving factors that the tool found.
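A sensitivity measure in the spirit described above can be sketched as follows (a deliberately simple stand-in, not the CFT's actual statistics): compare the success rate of Monte Carlo runs in the upper versus lower half of each dispersed input.

```python
import numpy as np

def half_split_sensitivity(inputs, success):
    """For each dispersed input column, return |P(success | upper half) -
    P(success | lower half)|; large values flag influential factors."""
    scores = {}
    for name, values in inputs.items():
        hi = values >= np.median(values)
        scores[name] = abs(success[hi].mean() - success[~hi].mean())
    return scores

rng = np.random.default_rng(1)
n = 10_000
inputs = {"mass": rng.normal(size=n), "wind": rng.normal(size=n)}
# Toy requirement: the run succeeds unless mass is unusually high.
success = inputs["mass"] < 1.0
scores = half_split_sensitivity(inputs, success)
print(scores["mass"] > scores["wind"])  # → True
```

Pairwise interaction measures extend the same idea by splitting on two inputs jointly and comparing against the product of the single-input effects.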
The Interactions of Relationships, Interest, and Self-Efficacy in Undergraduate Physics
NASA Astrophysics Data System (ADS)
Dou, Remy
This collected papers dissertation explores students' academic interactions in an active learning, introductory physics setting as they relate to the development of physics self-efficacy and interest. The motivation for this work extends from the national call to increase participation of students in the pursuit of science, technology, engineering, and mathematics (STEM) careers. Self-efficacy and interest are factors that play prominent roles in popular, evidence-based career theories, including social cognitive career theory (SCCT) and the identity framework. Understanding how these constructs develop in light of the most pervasive characteristic of the active learning introductory physics classroom (i.e., peer-to-peer interactions) has implications for how students learn in a variety of introductory STEM classrooms and settings structured after constructivist and sociocultural learning theories. I collected data related to students' in-class interactions using the tools of social network analysis (SNA). Social network analysis has recently been shown to be an effective and useful way to examine the structure of student relationships that develop in and out of STEM classrooms. This set of studies furthers the implementation of SNA as a tool to examine self-efficacy and interest formation in the active learning physics classroom. Here I present a variety of statistical applications of SNA, including bootstrapped linear regression (Chapter 2), structural equation modeling (Chapter 3), and hierarchical linear modeling for longitudinal analyses (Chapter 4). Self-efficacy data were collected using the Sources of Self-Efficacy for Science Courses - Physics survey (SOSESC-P), and interest data were collected using the physics identity survey. Data for these studies came from the Modeling Instruction sections of Introductory Physics with Calculus offered at Florida International University in the fall of 2014 and 2015. 
Analyses support the idea that students' perceptions of one another impact the development of their social network centrality, which in turn affects their self-efficacy building experiences and their overall self-efficacy. It was shown that, unlike career theories that emphasize causal relationships between the development of self-efficacy and the subsequent growth of student interest, in this context student interest develops before student self-efficacy. This outcome also has various implications for career theories.
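The bootstrapped regression mentioned for Chapter 2 can be sketched generically as follows (variable names, data and effect sizes are illustrative, not the study's):

```python
import numpy as np

def bootstrap_slope(x, y, n_boot=2000, seed=0):
    """Bootstrap the slope of y ~ x by resampling (x, y) pairs; returns the
    point estimate and a 95% percentile confidence interval."""
    rng = np.random.default_rng(seed)
    n = len(x)
    slopes = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, n)        # resample with replacement
        slopes[i] = np.polyfit(x[idx], y[idx], 1)[0]
    lo, hi = np.percentile(slopes, [2.5, 97.5])
    return np.polyfit(x, y, 1)[0], (lo, hi)

rng = np.random.default_rng(42)
centrality = rng.uniform(0, 1, 120)                      # toy network centrality
self_efficacy = 2.0 * centrality + rng.normal(0, 0.5, 120)
slope, ci = bootstrap_slope(centrality, self_efficacy)
print(slope, ci)
```

Resampling whole observations keeps the bootstrap agnostic about the error distribution, which is one common motivation for bootstrapping regressions on classroom-scale samples.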
Sonification Prototype for Space Physics
NASA Astrophysics Data System (ADS)
Candey, R. M.; Schertenleib, A. M.; Diaz Merced, W. L.
2005-12-01
As an alternative and adjunct to visual displays, auditory exploration of data via sonification (data-controlled sound) and audification (audible playback of data samples) is promising for complex or rapidly/temporally changing visualizations, for data exploration of large datasets (particularly multi-dimensional datasets), and for exploring datasets in frequency rather than spatial dimensions (see also the International Conferences on Auditory Display).
Global Nanotribology Research Output (1996–2010): A Scientometric Analysis
Elango, Bakthavachalam; Rajendran, Periyaswamy; Bornmann, Lutz
2013-01-01
This study aims to assess the nanotribology research output at global level using scientometric tools. The SCOPUS database was used to retrieve records related to the nanotribology research for the period 1996–2010. Publications were counted on a fractional basis. The level of collaboration and its citation impact were examined. The performance of the most productive countries, institutes and most preferred journals is assessed. Various visualization tools such as the Sci2 tool and Ucinet were employed. The USA ranked top in terms of number of publications, citations per paper and h-index, while Switzerland published a higher percentage of international collaborative papers. The most productive institution was Tsinghua University followed by Ohio State University and Lanzhou Institute of Chemical Physics, CAS. The most preferred journals were Tribology Letters, Wear and Journal of Japanese Society of Tribologists. The result of author keywords analysis reveals that Molecular Dynamics, MEMS, Hard Disk and Diamond like Carbon are major research topics. PMID:24339900
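Counting publications on a fractional basis, as done here, can be sketched as follows (an illustrative reading of the method: each paper contributes 1/k to each of the k distinct countries on its author list):

```python
from collections import defaultdict

def fractional_counts(papers):
    """Each paper contributes 1/k to each of the k distinct countries
    appearing in its author affiliations (fractional counting)."""
    totals = defaultdict(float)
    for countries in papers:
        distinct = set(countries)
        for country in distinct:
            totals[country] += 1.0 / len(distinct)
    return dict(totals)

papers = [["USA"], ["USA", "Switzerland"], ["China", "USA", "Switzerland"]]
print(round(fractional_counts(papers)["USA"], 3))  # → 1.833
```

Under whole counting the USA would score 3 here; fractional counting avoids double-crediting internationally collaborative papers.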
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perl, J; Villagomez-Bernabe, B; Currell, F
2015-06-15
Purpose: The stochastic nature of the subatomic world presents a challenge for physics education. Even experienced physicists can be amazed at the varied behavior of electrons, x-rays, protons, neutrons, ions and the many short-lived particles that make up the overall behavior of our accelerators, brachytherapy sources and medical imaging systems. The all-particle Monte Carlo particle transport tool TOPAS (Tool for Particle Simulation), originally developed for proton therapy research, has been repurposed into a physics teaching tool, TOPAS-edu. Methods: TOPAS-edu students set up simulated particle sources, collimators, scatterers, imagers and scoring setups by writing simple ASCII files (in the TOPAS Parameter Control System format). Students visualize geometry setups and particle trajectories in a variety of modes, from OpenGL graphics to VRML 3D viewers to gif and PostScript image files. Results written to simple comma-separated-values files are imported by the student into their preferred data analysis tool. Students can vary random seeds or adjust parameters of physics processes to better understand the stochastic nature of subatomic physics. Results: TOPAS-edu has been successfully deployed as the centerpiece of a physics course for master's students at Queen's University Belfast. Tutorials developed there take students through a step-by-step course on the basics of particle transport and interaction, scattering, Bremsstrahlung, etc. At each step in the course, students build simulated experimental setups and then analyze the simulated results. Lessons build one upon another so that a student might end up with a full simulation of a medical accelerator, a water phantom or an imager. Conclusion: TOPAS-edu was well received by students. A second application of TOPAS-edu is currently in development at Zurich University of Applied Sciences, Switzerland. 
It is our eventual goal to make TOPAS-edu available free of charge to any non-profit organization, along with associated tutorial materials developed by the TOPAS-edu community. Work supported in part by the U.S. Department of Energy under contract number DE-AC02-76SF00515. B. Villagomez-Bernabe is supported by CONACyT (Mexican Council for Science and Technology) project 231844.
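For orientation, a TOPAS parameter file in the ASCII format mentioned above might look roughly like the fragment below; the prefixes (s:/d:/i:) and the Ge/So/Sc naming scheme follow the Parameter Control System conventions, but the specific component names and values are illustrative and should be checked against the TOPAS documentation:

```
# Proton beam into a water phantom (values are illustrative)
s:Ge/World/Material      = "G4_AIR"
s:Ge/Phantom/Type        = "TsBox"
s:Ge/Phantom/Parent      = "World"
s:Ge/Phantom/Material    = "G4_WATER"
d:Ge/Phantom/HLX         = 10. cm
d:Ge/Phantom/HLY         = 10. cm
d:Ge/Phantom/HLZ         = 10. cm
s:So/Demo/Type           = "Beam"
s:So/Demo/BeamParticle   = "proton"
d:So/Demo/BeamEnergy     = 150. MeV
i:So/Demo/NumberOfHistoriesInRun = 1000
s:Sc/DoseScorer/Quantity  = "DoseToMedium"
s:Sc/DoseScorer/Component = "Phantom"
```

Students vary entries such as the beam energy or the random seed and re-run to see the stochastic spread of the scored results.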
Introducing GHOST: The Geospace/Heliosphere Observation & Simulation Tool-kit
NASA Astrophysics Data System (ADS)
Murphy, J. J.; Elkington, S. R.; Schmitt, P.; Wiltberger, M. J.; Baker, D. N.
2013-12-01
Simulation models of the heliospheric and geospace environments can provide key insights into the geoeffective potential of solar disturbances such as Coronal Mass Ejections and High Speed Solar Wind Streams. Advanced post-processing of the results of these simulations greatly enhances the utility of these models for scientists and other researchers. Currently, no supported centralized tool exists for performing these processing tasks. With GHOST, we introduce a toolkit for the ParaView visualization environment that provides a centralized suite of tools for space physics post-processing. Building on the work of the Center for Integrated Space Weather Modeling (CISM) Knowledge Transfer group, GHOST is an open-source tool suite for ParaView. The toolkit plugin currently provides readers for LFM and Enlil data sets, and provides automated tools for data comparison with NASA's CDAWeb database. As work progresses, many additional tools will be added; through open-source collaboration, we hope to add readers for additional model types, as well as any additional tools deemed necessary by the scientific public. The ultimate goal of this work is to provide a complete Sun-to-Earth model analysis toolset.
The LivePhoto Physics videos and video analysis site
NASA Astrophysics Data System (ADS)
Abbott, David
2009-09-01
The LivePhoto site is similar to an archive of short films for video analysis. Some videos have Flash tools for analyzing the video embedded in the movie. Most of the videos address mechanics topics with titles like Rolling Pencil (check this one out for pedagogy and content knowledge—nicely done!), Juggler, Yo-yo, Puck and Bar (this one is an inelastic collision with rotation), but there are a few titles in other areas (E&M, waves, thermo, etc.).
A Lunar Surface Operations Simulator
NASA Technical Reports Server (NTRS)
Nayar, H.; Balaram, J.; Cameron, J.; Jain, A.; Lim, C.; Mukherjee, R.; Peters, S.; Pomerantz, M.; Reder, L.; Shakkottai, P.;
2008-01-01
The Lunar Surface Operations Simulator (LSOS) is being developed to support planning and design of space missions to return astronauts to the moon. Vehicles, habitats, dynamic and physical processes and related environment systems are modeled and simulated in LSOS to assist in the visualization and design optimization of systems for lunar surface operations. A parametric analysis tool and a data browser were also implemented to provide an intuitive interface to run multiple simulations and review their results. The simulator and parametric analysis capability are described in this paper.
The Particle Physics Playground website: tutorials and activities using real experimental data
NASA Astrophysics Data System (ADS)
Bellis, Matthew; CMS Collaboration
2016-03-01
The CERN Open Data Portal provides access to data from the LHC experiments to anyone with the time and inclination to learn the analysis procedures. The CMS experiment has made a significant amount of data available in basically the same format the collaboration itself uses, along with software tools and a virtual environment in which to run those tools. These same data have also been mined for educational exercises that range from very simple .csv files that can be analyzed in a spreadsheet to more sophisticated formats that use ROOT, a dominant software package in experimental particle physics but not used as much in the general computing community. This talk will present the Particle Physics Playground website (http://particle-physics-playground.github.io/), a project that uses data from the CMS experiment, as well as the older CLEO experiment, in tutorials and exercises aimed at high school and undergraduate students and other science enthusiasts. The data are stored as text files, and users are provided with starter Python/Jupyter notebook programs and accessor functions that can be modified to perform fairly high-level analyses. The status of the project, success stories, and future plans for the website will be presented. This work was supported in part by NSF Grant PHY-1307562.
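A typical starter exercise on such text-file data is reconstructing an invariant mass from particle four-vectors; a minimal sketch (the four-vector values are hypothetical, chosen so two muons reconstruct a Z-boson-like mass):

```python
import math

def invariant_mass(p4s):
    """Invariant mass of a set of particles from (E, px, py, pz)
    four-vectors in GeV (natural units, c = 1)."""
    E = sum(p[0] for p in p4s)
    px = sum(p[1] for p in p4s)
    py = sum(p[2] for p in p4s)
    pz = sum(p[3] for p in p4s)
    return math.sqrt(E**2 - px**2 - py**2 - pz**2)

# Two back-to-back 45.594 GeV muons (muon mass neglected).
mu1 = (45.594, 0.0, 0.0, 45.594)
mu2 = (45.594, 0.0, 0.0, -45.594)
print(round(invariant_mass([mu1, mu2]), 3))  # → 91.188
```

Histogramming this quantity over many events is how students "rediscover" particles such as the J/psi or the Z in these exercises.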
Modelling human behaviour in a bumper car ride using molecular dynamics tools: a student project
NASA Astrophysics Data System (ADS)
Buendía, Jorge J.; Lopez, Hector; Sanchis, Guillem; Pardo, Luis Carlos
2017-05-01
Amusement parks are excellent laboratories of physics, not only to check physical laws, but also to investigate if those physical laws might also be applied to human behaviour. A group of Physics Engineering students from Universitat Politècnica de Catalunya has investigated if human behaviour, when driving bumper cars, can be modelled using tools borrowed from the analysis of molecular dynamics simulations, such as the radial and angular distribution functions. After acquiring several clips and obtaining the coordinates of the cars, those magnitudes are computed and analysed. Additionally, an analogous hard disks system is simulated to compare its distribution functions to those obtained from the cars’ coordinates. Despite the clear difference between bumper cars and a hard disk-like particle system, the obtained distribution functions are very similar. This suggests that there is no important effect of the individuals in the collective behaviour of the system in terms of structure. The research, performed by the students, has been undertaken in the frame of a motivational project designed to approach the scientific method for university students named FISIDABO. This project offers both the logistical and technical support to undertake the experiments designed by students at the amusement park of Barcelona TIBIDABO and accompanies them all along the scientific process.
Battista, Alexis
2017-01-01
The dominant frameworks for describing how simulations support learning emphasize increasing access to structured practice and the provision of feedback which are commonly associated with skills-based simulations. By contrast, studies examining student participants' experiences during scenario-based simulations suggest that learning may also occur through participation. However, studies directly examining student participation during scenario-based simulations are limited. This study examined the types of activities student participants engaged in during scenario-based simulations and then analyzed their patterns of activity to consider how participation may support learning. Drawing from Engeström's first-, second-, and third-generation activity systems analysis, an in-depth descriptive analysis was conducted. The study drew from multiple qualitative methods, namely narrative, video, and activity systems analysis, to examine student participants' activities and interaction patterns across four video-recorded simulations depicting common motivations for using scenario-based simulations (e.g., communication, critical patient management). The activity systems analysis revealed that student participants' activities encompassed three clinically relevant categories, including (a) use of physical clinical tools and artifacts, (b) social interactions, and (c) performance of structured interventions. Role assignment influenced participants' activities and the complexity of their engagement. Importantly, participants made sense of the clinical situation presented in the scenario by reflexively linking these three activities together. Specifically, student participants performed structured interventions, relying upon the use of physical tools, clinical artifacts, and social interactions together with interactions between students, standardized patients, and other simulated participants to achieve their goals. 
When multiple student participants were present, such as in a team-based scenario, they distributed the workload to achieve their goals. The findings suggest that student participants learned as they engaged in these scenario-based simulations when they worked to make sense of the patient's clinical presentation. The findings may provide insight into how student participants' meaning-making efforts are mediated by the cultural artifacts (e.g., physical clinical tools) they access, the social interactions they engage in, the structured interventions they perform, and the roles they are assigned. The findings also highlight the complex and emergent properties of scenario-based simulations as well as how activities are nested. Implications for learning, instructional design, and assessment are discussed.
Understanding Introductory Students' Application of Integrals in Physics from Multiple Perspectives
ERIC Educational Resources Information Center
Hu, Dehui
2013-01-01
Calculus is used across many physics topics from introductory to upper-division level college courses. The concepts of differentiation and integration are important tools for solving real world problems. Using calculus or any mathematical tool in physics is much more complex than the straightforward application of the equations and algorithms that…
GPCALMA: A Tool For Mammography With A GRID-Connected Distributed Database
NASA Astrophysics Data System (ADS)
Bottigli, U.; Cerello, P.; Cheran, S.; Delogu, P.; Fantacci, M. E.; Fauci, F.; Golosio, B.; Lauria, A.; Lopez Torres, E.; Magro, R.; Masala, G. L.; Oliva, P.; Palmiero, R.; Raso, G.; Retico, A.; Stumbo, S.; Tangaro, S.
2003-09-01
The GPCALMA (Grid Platform for Computer Assisted Library for MAmmography) collaboration involves several physics departments, INFN (National Institute of Nuclear Physics) sections, and Italian hospitals. The aim of this collaboration is to develop a tool that can help radiologists in the early detection of breast cancer. GPCALMA has built a large distributed database of digitised mammographic images (about 5500 images corresponding to 1650 patients) and developed CAD (Computer Aided Detection) software that is integrated in a station which can also be used to acquire new images, serve as an archive, and perform statistical analysis. The images (18×24 cm², digitised by a CCD linear scanner with an 85 μm pitch and 4096 gray levels) are completely described: pathological ones have a characterization consistent with the radiologist's diagnosis and histological data; non-pathological ones correspond to patients with a follow-up of at least three years. The distributed database is realized through the connection of all the hospitals and research centers using GRID technology. In each hospital, local patients' digital images are stored in the local database. Using the GRID connection, GPCALMA will allow each node to work on distributed database data as well as local database data. Using its database, the GPCALMA tools perform several analyses. A texture analysis, i.e. an automated classification into adipose, dense or glandular texture, can be provided by the system. GPCALMA software also allows classification of pathological features, in particular analysis of massive lesions (both opacities and spiculated lesions) and of microcalcification clusters. The detection of pathological features is performed using neural network software that provides a selection of areas showing a given "suspicion level" of lesion occurrence. The performance of the GPCALMA system will be presented in terms of ROC (Receiver Operating Characteristic) curves. 
The results of GPCALMA system as "second reader" will also be presented.
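An ROC evaluation of a suspicion-level classifier, as reported for GPCALMA, can be sketched as follows (toy scores and labels, not GPCALMA's data):

```python
def roc_points(scores, labels):
    """Sweep the decision threshold over the suspicion scores and return
    (FPR, TPR) pairs, from the most to the least conservative threshold."""
    pairs = sorted(zip(scores, labels), reverse=True)
    pos = sum(labels)
    neg = len(labels) - pos
    tp = fp = 0
    points = [(0.0, 0.0)]
    for score, label in pairs:
        if label:
            tp += 1
        else:
            fp += 1
        points.append((fp / neg, tp / pos))
    return points

def auc(points):
    """Trapezoidal area under the ROC curve."""
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(points, points[1:]))

scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.2]   # CAD suspicion levels
labels = [1, 1, 0, 1, 0, 0]               # 1 = lesion confirmed
print(round(auc(roc_points(scores, labels)), 4))  # → 0.8889
```

An AUC of 1.0 would mean every pathological case is scored above every non-pathological one; 0.5 is chance-level ranking.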
Offroy, Marc; Duponchel, Ludovic
2016-03-03
An important feature of experimental science is that data of various kinds are being produced at an unprecedented rate. This is mainly due to the development of new instrumental concepts and experimental methodologies. It is also clear that the nature of acquired data is significantly different. Indeed, in every area of science, data take the form of ever bigger tables, where all but a few of the columns (i.e. variables) turn out to be irrelevant to the questions of interest, and furthermore we do not necessarily know which coordinates are the interesting ones. Big data in our labs of biology, analytical chemistry or physical chemistry is a future that might be closer than any of us suppose. It is in this sense that new tools have to be developed in order to explore and exploit such data sets. Topological data analysis (TDA) is one of these. It was developed recently by topologists who discovered that topological concepts could be useful for data analysis. The main objective of this paper is to answer the question of why topology is well suited for the analysis of big data sets in many areas, and can even be more efficient than conventional data analysis methods. Raman analysis of single bacteria provides a good opportunity to demonstrate the potential of TDA for the exploration of various spectroscopic data sets under different experimental conditions (high noise level, with/without spectral preprocessing, wavelength shift, different spectral resolutions, missing data). Copyright © 2016 Elsevier B.V. All rights reserved.
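One of the simplest TDA summaries, the 0-dimensional persistence of a point cloud (the distance scales at which connected components merge as the threshold grows), can be sketched with a union-find over sorted pairwise distances; this is a generic illustration of the concept, not the paper's method:

```python
import itertools
import numpy as np

def h0_deaths(points):
    """Death times of 0-dimensional homology classes: the distances at
    which connected components of the Vietoris-Rips filtration merge
    (equivalently, single-linkage merge heights)."""
    n = len(points)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    edges = sorted(
        (np.linalg.norm(points[i] - points[j]), i, j)
        for i, j in itertools.combinations(range(n), 2)
    )
    deaths = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:          # two components merge at scale d
            parent[ri] = rj
            deaths.append(d)
    return deaths

# Two well-separated pairs: two quick merges, then one late merge.
pts = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 0.0], [5.1, 0.0]])
print([round(float(d), 1) for d in h0_deaths(pts)])  # → [0.1, 0.1, 4.9]
```

The long gap before the final death is the topological signature of two clusters, and the same summary is stable under the noise and preprocessing perturbations the abstract lists.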
Cook, Daniel L; Neal, Maxwell L; Bookstein, Fred L; Gennari, John H
2013-12-02
In prior work, we presented the Ontology of Physics for Biology (OPB) as a computational ontology for use in the annotation and representations of biophysical knowledge encoded in repositories of physics-based biosimulation models. We introduced OPB:Physical entity and OPB:Physical property classes that extend available spatiotemporal representations of physical entities and processes to explicitly represent the thermodynamics and dynamics of physiological processes. Our utilitarian, long-term aim is to develop computational tools for creating and querying formalized physiological knowledge for use by multiscale "physiome" projects such as the EU's Virtual Physiological Human (VPH) and NIH's Virtual Physiological Rat (VPR). Here we describe the OPB:Physical dependency taxonomy of classes that represent of the laws of classical physics that are the "rules" by which physical properties of physical entities change during occurrences of physical processes. For example, the fluid analog of Ohm's law (as for electric currents) is used to describe how a blood flow rate depends on a blood pressure gradient. Hooke's law (as in elastic deformations of springs) is used to describe how an increase in vascular volume increases blood pressure. We classify such dependencies according to the flow, transformation, and storage of thermodynamic energy that occurs during processes governed by the dependencies. We have developed the OPB and annotation methods to represent the meaning-the biophysical semantics-of the mathematical statements of physiological analysis and the biophysical content of models and datasets. Here we describe and discuss our approach to an ontological representation of physical laws (as dependencies) and properties as encoded for the mathematical analysis of biophysical processes.
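The two example dependencies named in the record for Cook et al. (the fluid analog of Ohm's law and a Hooke-like compliance relation) can be written directly as code; a trivial sketch with illustrative, not physiological-reference, numbers:

```python
def flow_rate(delta_p, resistance):
    """Fluid analog of Ohm's law: volumetric flow = pressure drop / resistance."""
    return delta_p / resistance

def pressure_from_volume(delta_v, compliance):
    """Hooke-like elastic dependency: pressure rise = volume increase / compliance."""
    return delta_v / compliance

q = flow_rate(delta_p=40.0, resistance=8.0)              # e.g. mL/s
dp = pressure_from_volume(delta_v=12.0, compliance=3.0)  # e.g. mmHg
print(q, dp)  # → 5.0 4.0
```

The OPB's contribution is to classify such dependencies ontologically (by the flow, transformation, and storage of thermodynamic energy) rather than to compute them, but the computed form shows what each dependency relates.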
Performance profiling for brachytherapy applications
NASA Astrophysics Data System (ADS)
Choi, Wonqook; Cho, Kihyeon; Yeo, Insung
2018-05-01
In many physics applications, a significant amount of software (e.g. R, ROOT and Geant4) is developed on novel computing architectures, and much effort is expended to ensure the software is efficient in terms of central processing unit (CPU) time and memory usage. Profiling tools are used during the evaluation process to evaluate the efficiency; however, few such tools are able to accommodate low-energy physics regions. To address this limitation, we developed a low-energy physics profiling system in Geant4 to profile the CPU time and memory of software applications in brachytherapy applications. This paper describes and evaluates specific models that are applied to brachytherapy applications in Geant4, such as QGSP_BIC_LIV, QGSP_BIC_EMZ, and QGSP_BIC_EMY. The physics range in this tool allows it to be used to generate low-energy profiles in brachytherapy applications. This was a limitation in previous studies, which caused us to develop a new profiling tool that supports profiling in the MeV range, in contrast to the TeV range that is supported by existing high-energy profiling tools. In order to easily compare the profiling results between low-energy and high-energy modes, we employed the same software architecture as that in the SimpliCarlo tool developed at the Fermi National Accelerator Laboratory (FNAL) for the Large Hadron Collider (LHC). The results show that the newly developed profiling system for low-energy physics (less than MeV) complements the current profiling system used for high-energy physics (greater than TeV) applications.
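A minimal stand-in for the kind of CPU-time and memory numbers such a profiling system reports, using only the Python standard library (this is an illustration of the measurement idea, not the tool described in the record):

```python
import time
import tracemalloc

def profile(func, *args):
    """Report wall-clock time (s) and peak Python heap use (bytes) for one
    call; a toy analog of what a CPU/memory profiler records per run."""
    tracemalloc.start()
    t0 = time.perf_counter()
    result = func(*args)
    elapsed = time.perf_counter() - t0
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return result, elapsed, peak

def workload(n):
    # Stand-in for a simulation kernel whose cost we want to measure.
    return sum(i * i for i in range(n))

result, elapsed, peak = profile(workload, 100_000)
print(result == 333328333350000, elapsed > 0, peak > 0)  # → True True True
```

A real transport-code profiler additionally attributes these costs to individual physics processes and energy ranges, which is the gap the MeV-range tool fills.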
New tools for investigating student learning in upper-division electrostatics
NASA Astrophysics Data System (ADS)
Wilcox, Bethany R.
Student learning in upper-division physics courses is a growing area of research in the field of Physics Education. Developing effective new curricular materials and pedagogical techniques to improve student learning in upper-division courses requires knowledge of both what material students struggle with and what curricular approaches help to overcome these struggles. To facilitate the course transformation process for one specific content area --- upper-division electrostatics --- this thesis presents two new methodological tools: (1) an analytical framework designed to investigate students' struggles with the advanced physics content and mathematically sophisticated tools/techniques required at the junior and senior level, and (2) a new multiple-response conceptual assessment designed to measure student learning and assess the effectiveness of different curricular approaches. We first describe the development and theoretical grounding of a new analytical framework designed to characterize how students use mathematical tools and techniques during physics problem solving. We apply this framework to investigate student difficulties with three specific mathematical tools used in upper-division electrostatics: multivariable integration in the context of Coulomb's law, the Dirac delta function in the context of expressing volume charge densities, and separation of variables as a technique to solve Laplace's equation. We find a number of common themes in students' difficulties around these mathematical tools including: recognizing when a particular mathematical tool is appropriate for a given physics problem, mapping between the specific physical context and the formal mathematical structures, and reflecting spontaneously on the solution to a physics problem to gain physical insight or ensure consistency with expected results. 
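For example, the Dirac delta difficulty concerns expressions such as the volume charge density of a uniformly charged spherical shell of radius R and total charge Q, which must integrate back to Q (a standard worked example, not taken from the thesis):

```latex
\rho(\vec r) = \frac{Q}{4\pi R^{2}}\,\delta(r - R),
\qquad
\int \rho \, dV
  = \int_{0}^{\infty} \frac{Q}{4\pi R^{2}}\,\delta(r - R)\, 4\pi r^{2}\, dr
  = Q .
```

Recognizing that the surface density Q/(4πR²) must multiply a one-dimensional delta in r, and then checking the result by integration, exercises exactly the "mapping" and "reflecting" skills the framework targets.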
We then describe the development of a novel, multiple-response version of an existing conceptual assessment in upper-division electrostatics courses. The goal of this new version is to provide an easily-graded electrostatics assessment that can potentially be implemented to investigate student learning on a large scale. We show that student performance on the new multiple-response version exhibits a significant degree of consistency with performance on the free-response version, and that it continues to provide significant insight into student reasoning and student difficulties. Moreover, we demonstrate that the new assessment is both valid and reliable using data from upper-division physics students at multiple institutions. Overall, the work described in this thesis represents a significant contribution to the methodological tools available to researchers and instructors interested in improving student learning at the upper-division level.
Cyber / Physical Security Vulnerability Assessment Integration
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacDonald, Douglas G.; Simpkins, Bret E.
Abstract Both the physical protection and cyber security domains offer solutions for the discovery of vulnerabilities through the use of various assessment processes and software tools. Each vulnerability assessment (VA) methodology provides the ability to identify and categorize vulnerabilities, and quantifies the risks within its own area of expertise. Neither approach fully represents the true potential security risk to a site and/or a facility, nor comprehensively assesses the overall security posture. The technical approach to solving this problem was to identify methodologies and processes that blend the physical and cyber security assessments, and to develop tools to accurately quantify the unaccounted-for risk. SMEs from both the physical and the cyber security domains developed the blending methodologies, and cross-trained each other on the various aspects of the physical and cyber security assessment processes. A local critical infrastructure entity volunteered to host a proof-of-concept physical/cyber security assessment, and the lessons learned have been leveraged by this effort. The four potential modes of attack an adversary can use in approaching a target are: Physical Only Attack, Cyber Only Attack, Physical Enabled Cyber Attack, and Cyber Enabled Physical Attack. The Physical Only and the Cyber Only pathway analyses are two of the most widely analyzed attack modes. The pathway from an off-site location to the desired target location is dissected to ensure adversarial activity can be detected and neutralized by the protection strategy, prior to completion of a predefined task. This methodology typically explores a one-way attack from the public space (or common area) inward towards the target. The Physical Enabled Cyber Attack and the Cyber Enabled Physical Attack are much more intricate.
Both scenarios involve beginning in one domain to effect change in the other, then backing outward to take advantage of the reduced system effectiveness, before penetrating further into the defenses. The proper identification and assessment of the overlapping areas (and the interaction between these areas) in the VA process is necessary to accurately assess the true risk.
NASA Astrophysics Data System (ADS)
Iltis, G.; Caswell, T. A.; Dill, E.; Wilkins, S.; Lee, W. K.
2014-12-01
X-ray tomographic imaging of porous media has proven to be a valuable tool for investigating and characterizing the physical structure and state of both natural and synthetic porous materials, including glass bead packs, ceramics, soil, and rock. Given that most synchrotron facilities have user programs which grant academic researchers access to facilities and x-ray imaging equipment free of charge, a key limitation or hindrance for small research groups interested in conducting x-ray imaging experiments is the financial cost associated with post-experiment data analysis. While the cost of high-performance computing hardware continues to decrease, expenses associated with licensing commercial software packages for quantitative image analysis continue to increase, with current prices as high as $24,000 USD for a single-user license. As construction of the Nation's newest synchrotron accelerator nears completion, a significant effort is being made here at the National Synchrotron Light Source II (NSLS-II), Brookhaven National Laboratory (BNL), to provide an open-source, experiment-to-publication toolbox that reduces the financial and technical 'activation energy' required for performing sophisticated quantitative analysis of multidimensional porous media data sets collected using cutting-edge x-ray imaging techniques. Implementation focuses on leveraging existing open-source projects and developing additional tools for quantitative analysis. We will present an overview of the software suite in development here at BNL, including major design decisions; a demonstration of several test cases illustrating currently available quantitative tools for the analysis and characterization of multidimensional porous media image data sets; and plans for their future development.
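As an illustration of the kind of quantitative analysis such an open-source toolbox targets, here is a minimal sketch that computes porosity and counts connected pore regions in a segmented tomography volume using NumPy and SciPy. The synthetic "bead pack" and all function names are hypothetical, not part of the BNL suite:

```python
import numpy as np
from scipy import ndimage

def porosity(seg):
    """Void fraction of a segmented volume (1 = pore, 0 = solid)."""
    return float(seg.mean())

def pore_regions(seg):
    """Count connected pore regions (26-connectivity in 3-D)."""
    labeled, n = ndimage.label(seg, structure=np.ones((3, 3, 3)))
    return n

# Synthetic bead-pack-like volume: solid spheres carved out of a pore-filled box.
rng = np.random.default_rng(0)
vol = np.ones((40, 40, 40), dtype=np.uint8)        # start as all pore
zz, yy, xx = np.indices(vol.shape)
for _ in range(12):
    c = rng.integers(5, 35, size=3)
    r = rng.integers(4, 8)
    vol[(zz - c[0])**2 + (yy - c[1])**2 + (xx - c[2])**2 <= r**2] = 0

phi = porosity(vol)
```

Real tomography workflows add filtering and segmentation steps before metrics like these; the sketch only shows the measurement end of the pipeline.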
NASA Astrophysics Data System (ADS)
Shprits, Y.; Zhelavskaya, I. S.; Kellerman, A. C.; Spasojevic, M.; Kondrashov, D. A.; Ghil, M.; Aseev, N.; Castillo Tibocha, A. M.; Cervantes Villa, J. S.; Kletzing, C.; Kurth, W. S.
2017-12-01
Increasing volumes of satellite measurements require the deployment of new tools that can utilize such vast amounts of data. Satellite measurements are usually limited to a single location in space, which complicates data analysis geared towards reproducing the global state of the space environment. In this study we show how measurements can be combined by means of data assimilation, and how machine learning can help analyze large amounts of data and develop global models that are trained on single-point measurements. Data Assimilation: Manual analysis of satellite measurements is a challenging task, while automated analysis is complicated by the fact that measurements are given at various locations in space, have different instrumental errors, and often vary by orders of magnitude. We show results of the long-term reanalysis of radiation belt measurements along with fully operational real-time predictions using the data-assimilative VERB code. Machine Learning: We present an application of machine learning tools to the analysis of NASA Van Allen Probes upper-hybrid frequency measurements. Using the obtained data set we train a new global predictive neural network. The results for the Van Allen Probes-based neural network are compared with historical IMAGE satellite observations. We also show examples of predictions of geomagnetic indices using neural networks. Combination of machine learning and data assimilation: We discuss how data assimilation tools and machine learning tools can be combined, so that physics-based insight into the dynamics of a particular system is paired with empirical knowledge of its nonlinear behavior.
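The data-assimilation step described above can be illustrated with a single scalar Kalman analysis update, which blends a model forecast with one observation weighted by their respective uncertainties. This is a textbook sketch, not the VERB code's actual (far richer) scheme, and the numbers are invented:

```python
def kalman_update(x_f, P_f, y, R):
    """One scalar Kalman analysis step: blend forecast x_f (variance P_f)
    with observation y (error variance R)."""
    K = P_f / (P_f + R)            # Kalman gain: weight given to the observation
    x_a = x_f + K * (y - x_f)      # analysis state
    P_a = (1.0 - K) * P_f          # analysis variance (always reduced)
    return x_a, P_a

# Model forecasts 10 with variance 4; a satellite observes 12 with variance 1.
x_a, P_a = kalman_update(10.0, 4.0, 12.0, 1.0)   # analysis lands nearer the more certain value
```

Because the observation is more certain than the forecast here, the analysis (11.6) lies closer to the observation, and the analysis variance (0.8) is smaller than either input variance.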
MPI_XSTAR: MPI-based parallelization of XSTAR program
NASA Astrophysics Data System (ADS)
Danehkar, A.
2017-12-01
MPI_XSTAR parallelizes the execution of multiple XSTAR runs using the Message Passing Interface (MPI). XSTAR (ascl:9910.008), part of the HEASARC's HEASoft (ascl:1408.004) package, calculates the physical conditions and emission spectra of ionized gases. MPI_XSTAR invokes XSTINITABLE from HEASoft to generate a job list of XSTAR commands for given physical parameters. The job list is used to make directories in ascending order; an individual XSTAR run is spawned on each processor and its outputs are saved to the corresponding directory. HEASoft's XSTAR2TABLE program is then invoked on the contents of each directory to produce table-model FITS files for spectroscopy analysis tools.
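The job-list partitioning underlying this scheme can be sketched as a round-robin assignment of XSTAR command lines to MPI ranks. This plain-Python illustration mimics the scheduling idea only; the real tool dispatches work through MPI and spawns each XSTAR run as a separate process, and the command strings below are hypothetical:

```python
def assign_jobs(commands, n_ranks):
    """Round-robin assignment of command lines to ranks: command i goes
    to rank i % n_ranks, so the load stays balanced within one job."""
    buckets = [[] for _ in range(n_ranks)]
    for i, cmd in enumerate(commands):
        buckets[i % n_ranks].append(cmd)
    return buckets

# Hypothetical job list of 7 XSTAR invocations split across 3 ranks.
jobs = [f"xstar params_{i}.par" for i in range(7)]
plan = assign_jobs(jobs, 3)   # rank 0 gets 3 jobs, ranks 1 and 2 get 2 each
```

In an actual MPI program each rank would compute only its own bucket (from its rank number) and run those commands, e.g. via `subprocess`, before a final gather step collects the per-directory outputs.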
Teaching physics using Microsoft Excel
NASA Astrophysics Data System (ADS)
Uddin, Zaheer; Ahsanuddin, Muhammad; Khan, Danish Ahmed
2017-09-01
Excel is both ubiquitous and easily understandable. Most people from every walk of life know how to use MS Office and Excel spreadsheets. Students are also familiar with spreadsheets, and most know how to use them for data analysis. Beyond this basic use of Excel, some important aspects of spreadsheets are highlighted in this article. MS Excel can be used to visualize the effects of various parameters in a physical system, and it can serve as a simulation tool; simulation of wind data has been done through spreadsheets in this study. Examples of Lissajous figures and a damped harmonic oscillator are presented in this article.
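The damped harmonic oscillator example amounts to tabulating a closed-form displacement formula over a time column, exactly as one would in a spreadsheet. A Python sketch of the same table (the parameter values are arbitrary, chosen only for illustration):

```python
import math

def damped_oscillator(t, A=1.0, b=0.2, omega=2.0):
    """Displacement of a damped harmonic oscillator; the same formula one
    would type into a spreadsheet cell as =A*EXP(-b*t)*COS(omega*t)."""
    return A * math.exp(-b * t) * math.cos(omega * t)

# Tabulate like a spreadsheet column: t = 0.0, 0.1, ..., 4.9, then chart it.
table = [(round(0.1 * k, 1), damped_oscillator(0.1 * k)) for k in range(50)]
```

Plotting the second column against the first (in Excel, an XY scatter chart) shows the characteristic decaying oscillation; varying `b` or `omega` lets students see each parameter's effect immediately.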
ERIC Educational Resources Information Center
Fish, Laurel J.; Halcoussis, Dennis; Phillips, G. Michael
2017-01-01
The Monte Carlo method and related multiple imputation methods are traditionally used in math, physics and science to estimate and analyze data and are now becoming standard tools in analyzing business and financial problems. However, few sources explain the application of the Monte Carlo method for individuals and business professionals who are…
Projectiles, pendula, and special relativity
NASA Astrophysics Data System (ADS)
Price, Richard H.
2005-05-01
The kind of flat-earth gravity used in introductory physics appears in an accelerated reference system in special relativity. From this viewpoint, we work out the special relativistic description of a ballistic projectile and a simple pendulum, two examples of simple motion driven by earth-surface gravity. The analysis uses only the basic mathematical tools of special relativity typical of a first-year university course.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Ucilia
This report has the following articles: (1) Deconstructing Microbes--metagenomic research on bugs in termites relies on new data analysis tools; (2) Popular Science--a nanomaterial research paper in Nano Letters drew strong interest from the scientific community; (3) Direct Approach--researchers employ an algorithm to solve an energy-reduction issue essential in describing complex physical systems; and (4) SciDAC Special--a science journal features research on petascale enabling technologies.
An Electromagnetic Tool for Damping and Fatigue Analysis
2004-03-01
Serway, Raymond A. Physics For Scientists & Engineers (3rd Edition). Philadelphia: Saunders College Publishing, 1990. 15. Kurtus, Ron...system was initially designed to reduce the time and manpower required to characterize damping treatments. It is based on a digitally controlled...the capability to study fatigue under a free boundary condition. The system consists of a test specimen suspended by a pendulum to closely
USDA-ARS?s Scientific Manuscript database
The Consumo Alimentar e Atividade Fisica de Escolares (CAAFE) questionnaire is an online research tool that has been developed to enable the self-report of physical activity and diet by Brazilian school children aged 7–10 years. Formative research was conducted with nutritionists during the developm...
G.H. Reeves; F.H. Everest; T.E. Nickelson
1989-01-01
Fishery managers are currently spending millions of dollars per year on habitat enhancement for anadromous salmonids but often do not have the tools needed to ensure success. An analysis of factors limiting production of salmonids in streams must be completed before any habitat-enhancement program is begun. This paper outlines the first formal procedure for identifying...
Fire and Smoke Model Evaluation Experiment (FASMEE): Modeling gaps and data needs
Yongqiang Liu; Adam Kochanski; Kirk Baker; Ruddy Mell; Rodman Linn; Ronan Paugam; Jan Mandel; Aime Fournier; Mary Ann Jenkins; Scott Goodrick; Gary Achtemeier; Andrew Hudak; Matthew Dickson; Brian Potter; Craig Clements; Shawn Urbanski; Roger Ottmar; Narasimhan Larkin; Timothy Brown; Nancy French; Susan Prichard; Adam Watts; Derek McNamara
2017-01-01
Fire and smoke models are numerical tools for simulating fire behavior, smoke dynamics, and air quality impacts of wildland fires. Fire models are developed based on the fundamental chemistry and physics of combustion and fire spread, or on statistical analysis of experimental data (Sullivan 2009). They provide information on fire spread and fuel consumption for safe and...
Intentions and actions in molecular self-assembly: perspectives on students' language use
NASA Astrophysics Data System (ADS)
Höst, Gunnar E.; Anward, Jan
2017-04-01
Learning to talk science is an important aspect of learning to do science. Given that scientists' language frequently includes intentions and purposes in explanations of unobservable objects and events, teachers must interpret whether learners' use of such language reflects a scientific understanding or inaccurate anthropomorphism and teleology. In the present study, a framework consisting of three 'stances' (Dennett, 1987) - intentional, design and physical - is presented as a powerful tool for analysing students' language use. The aim was to investigate how the framework can be differentiated and used analytically for interpreting students' talk about a molecular process. Semi-structured group discussions and individual interviews about the molecular self-assembly process were conducted with engineering biology/chemistry (n = 15) and biology/chemistry teacher students (n = 6). Qualitative content analysis of transcripts showed that all three stances were employed by students. The analysis also identified subcategories for each stance, and revealed that intentional language with respect to molecular movement and assumptions about design requirements may be potentially problematic areas. Students' exclusion of physical stance explanations may indicate literal anthropomorphic interpretations. Implications for practice include providing teachers with a tool for scaffolding their use of metaphorical language and for supporting students' metacognitive development as scientific language users.
Building Energy Simulation Test for Existing Homes (BESTEST-EX) (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Judkoff, R.; Neymark, J.; Polly, B.
2011-12-01
This presentation discusses the goals of NREL Analysis Accuracy R&D; BESTEST-EX goals; what BESTEST-EX is; how it works; 'Building Physics' cases; 'Building Physics' reference results; 'utility bill calibration' cases; and limitations and potential future work. The goals of NREL Analysis Accuracy R&D are: (1) Provide industry with the tools and technical information needed to improve the accuracy and consistency of analysis methods; (2) Reduce the risks associated with purchasing, financing, and selling energy efficiency upgrades; and (3) Enhance software and input collection methods considering impacts on accuracy, cost, and time of energy assessments. The BESTEST-EX goals are: (1) Test software predictions of retrofit energy savings in existing homes; (2) Ensure building physics calculations and utility bill calibration procedures perform up to a minimum standard; and (3) Quantify the impact of uncertainties in input audit data and occupant behavior. BESTEST-EX is a repeatable procedure that tests how well audit software predictions compare to the current state of the art in building energy simulation. There is no direct truth standard. However, the reference software has been subjected to validation testing, including comparisons with empirical data.
Design sensitivity analysis and optimization tool (DSO) for sizing design applications
NASA Technical Reports Server (NTRS)
Chang, Kuang-Hua; Choi, Kyung K.; Perng, Jyh-Hwa
1992-01-01
The DSO tool, a structural design software system that provides the designer with a graphics-based menu-driven design environment to perform easy design optimization for general applications, is presented. Three design stages, preprocessing, design sensitivity analysis, and postprocessing, are implemented in the DSO to allow the designer to carry out the design process systematically. A framework, including data base, user interface, foundation class, and remote module, has been designed and implemented to facilitate software development for the DSO. A number of dedicated commercial software/packages have been integrated in the DSO to support the design procedures. Instead of parameterizing an FEM, design parameters are defined on a geometric model associated with physical quantities, and the continuum design sensitivity analysis theory is implemented to compute design sensitivity coefficients using postprocessing data from the analysis codes. A tracked vehicle road wheel is given as a sizing design application to demonstrate the DSO's easy and convenient design optimization process.
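The design sensitivity coefficients at the heart of DSO can be illustrated with a simple finite-difference stand-in (DSO itself uses continuum design sensitivity analysis with postprocessing data from the analysis codes). The cantilever stress function below is a hypothetical surrogate for an FEM code, and all names and values are illustrative:

```python
def max_stress(thickness, load=1000.0, width=0.05, length=1.0):
    """Bending stress at the root of a rectangular-section cantilever;
    stands in for the structural analysis code in this sketch."""
    I = width * thickness**3 / 12.0              # second moment of area
    return load * length * (thickness / 2.0) / I # sigma = M*c / I

def sensitivity(design_var, f, h=1e-6):
    """Central-difference design sensitivity df/db of response f with
    respect to a sizing design variable b."""
    return (f(design_var + h) - f(design_var - h)) / (2.0 * h)

b = 0.02                                         # sizing variable: thickness (m)
dstress_db = sensitivity(b, max_stress)          # negative: thicker section, lower stress
```

An optimizer uses coefficients like `dstress_db` to decide how to change each sizing variable; here the analytic value is -12*P*L/(w*b^3), which the finite difference reproduces closely.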
French, Anna; Bravery, Christopher; Smith, James; Chandra, Amit; Archibald, Peter; Gold, Joseph D; Artzi, Natalie; Kim, Hae-Won; Barker, Richard W; Meissner, Alexander; Wu, Joseph C; Knowles, Jonathan C; Williams, David; García-Cardeña, Guillermo; Sipp, Doug; Oh, Steve; Loring, Jeanne F; Rao, Mahendra S; Reeve, Brock; Wall, Ivan; Carr, Andrew J; Bure, Kim; Stacey, Glyn; Karp, Jeffrey M; Snyder, Evan Y; Brindley, David A
2015-03-01
There is a need for physical standards (reference materials) to ensure both reproducibility and consistency in the production of somatic cell types from human pluripotent stem cell (hPSC) sources. We have outlined the need for reference materials (RMs) in relation to the unique properties and concerns surrounding hPSC-derived products and suggest in-house approaches to RM generation relevant to basic research, drug screening, and therapeutic applications. hPSCs have an unparalleled potential as a source of somatic cells for drug screening, disease modeling, and therapeutic application. Undefined variation and product variability after differentiation to the lineage or cell type of interest impede efficient translation and can obscure the evaluation of clinical safety and efficacy. Moreover, in the absence of a consistent population, data generated from in vitro studies could be unreliable and irreproducible. Efforts to devise approaches and tools that facilitate improved consistency of hPSC-derived products, both as development tools and therapeutic products, will aid translation. Standards exist in both written and physical form; however, because many unknown factors persist in the field, premature written standards could inhibit rather than promote innovation and translation. We focused on the derivation of physical standard RMs. We outline the need for RMs and assess the approaches to in-house RM generation for hPSC-derived products, a critical tool for the analysis and control of product variation that can be applied by researchers and developers. We then explore potential routes for the generation of RMs, including both cellular and noncellular materials and novel methods that might provide valuable tools to measure and account for variation. 
Multiparametric techniques to identify "signatures" for therapeutically relevant cell types, such as neurons and cardiomyocytes that can be derived from hPSCs, would be of significant utility, although physical RMs will be required for clinical purposes. ©AlphaMed Press.
Chatterjee, Robin; Chapman, Tim; Brannan, Mike Gt; Varney, Justin
2017-10-01
Brief advice on physical activity (PA) in health care is effective at getting individuals active. It has been suggested that one in four people would be more active if advised by a GP or nurse, but as many as 72% of GPs do not discuss the benefits of physical activity with patients. To assess the knowledge, use, and confidence in national PA and Chief Medical Officer (CMO) health guidelines and tools among GPs in England. Online questionnaire-based survey of self-selecting GPs in England that took place over a 10-day period in March 2016. The questionnaire consisted of six multiple-choice questions and was available on the Doctors.net.uk (DNUK) homepage. Quotas were used to ensure good regional representation. The final analysis included 1013 responses. Only 20% of responders were broadly or very familiar with the national PA guidelines. In all, 70% of GPs were aware of the General Practice Physical Activity Questionnaire (GPPAQ), but 26% were not familiar with any PA assessment tools, and 55% reported that they had not undertaken any training with respect to encouraging PA. The majority of GPs in England (80%) are unfamiliar with the national PA guidelines. Awareness of the recommended assessment tool, GPPAQ, is higher than its actual use by GPs. This may be because it is used by other clinical staff, for example, as part of the NHS Health Check programme. Although brief advice in isolation by GPs on PA will only be a part of the behaviour change journey, it is an important prompt, especially if repeated as part of routine practice. This study highlights the need for significant improvement in knowledge, skills, and confidence to maximise the potential for PA advice in GP consultations. © British Journal of General Practice 2017.
Choi, Mi-Ri; Jeon, Sang-Wan; Yi, Eun-Surk
2018-04-01
The purpose of this study is to analyze differences among hospitalized cancer patients in their perception of exercise and their physical activity constraints, based on their medical history. The study used a questionnaire survey as the measurement tool for 194 cancer patients (male or female, aged 20 or older) living in the Seoul metropolitan area (Seoul, Gyeonggi, Incheon). The collected data were analyzed using frequency analysis, exploratory factor analysis, reliability analysis, t-test, and one-way ANOVA with the statistical program SPSS 18.0. The following results were obtained. First, there was no statistically significant difference between cancer stage and exercise recognition/physical activity constraint. Second, there was a significant difference between cancer stage and sociocultural constraint/facility constraint/program constraint. Third, there was a significant difference between cancer operation history and physical/socio-cultural/facility/program constraint. Fourth, there was a significant difference between cancer operation history and negative perception/facility/program constraint. Fifth, there was a significant difference between ancillary cancer treatment method and negative perception/facility/program constraint. Sixth, there was a significant difference between hospitalization period and positive perception/negative perception/physical constraint/cognitive constraint. In conclusion, this study will provide information necessary to create a patient-centered healthcare service system by analyzing the exercise recognition of hospitalized cancer patients based on their medical history and by investigating the constraint factors that prevent patients from actually making efforts to exercise.
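Group comparisons of the kind reported here can be reproduced in outline with standard statistical tools. A hedged sketch using SciPy on invented constraint scores (the actual study used SPSS 18.0 and its own survey data; group labels below are hypothetical):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical constraint scores for two patient groups
# (e.g. with vs. without cancer operation history).
group_a = rng.normal(3.2, 0.5, 40)
group_b = rng.normal(3.6, 0.5, 40)

t, p = stats.ttest_ind(group_a, group_b)        # independent-samples t-test
f, p_anova = stats.f_oneway(group_a, group_b)   # one-way ANOVA

# With exactly two groups, ANOVA and the pooled t-test agree: F = t^2,
# and both yield the same p-value.
```

A significant p-value (conventionally p < 0.05) would correspond to the "significant difference" findings listed above; with more than two groups, `f_oneway` takes additional arrays and post-hoc tests identify which groups differ.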
Discover Space Weather and Sun's Superpowers: Using CCMC's innovative tools and applications
NASA Astrophysics Data System (ADS)
Mendoza, A. M. M.; Maddox, M. M.; Kuznetsova, M. M.; Chulaki, A.; Rastaetter, L.; Mullinix, R.; Weigand, C.; Boblitt, J.; Taktakishvili, A.; MacNeice, P. J.; Pulkkinen, A. A.; Pembroke, A. D.; Mays, M. L.; Zheng, Y.; Shim, J. S.
2015-12-01
The Community Coordinated Modeling Center (CCMC) has developed a comprehensive set of tools and applications that are directly applicable to space weather and space science education. These tools, some of which were developed by our student interns, are capable of serving a wide range of student audiences, from middle school to postgraduate research. They include a web-based point of access to sophisticated space physics models and visualizations, and a powerful space weather information dissemination system, available on the web and as a mobile app. In this demonstration, we will use CCMC's innovative tools to engage the audience in real-time space weather analysis and forecasting, and will share some of our interns' hands-on experiences while being trained as junior space weather forecasters. The main portals to CCMC's educational material are ccmc.gsfc.nasa.gov and iswa.gsfc.nasa.gov.
Simultaneous fits in ISIS on the example of GRO J1008-57
NASA Astrophysics Data System (ADS)
Kühnel, Matthias; Müller, Sebastian; Kreykenbohm, Ingo; Schwarm, Fritz-Walter; Grossberger, Christoph; Dauser, Thomas; Pottschmidt, Katja; Ferrigno, Carlo; Rothschild, Richard E.; Klochkov, Dmitry; Staubert, Rüdiger; Wilms, Joern
2015-04-01
Parallel computing and steadily increasing computation speed have led to a new tool for analyzing multiple datasets and datatypes: fitting several datasets simultaneously. With this technique, physically connected parameters of individual datasets can be treated as a single parameter by implementing this connection directly into the fit. We discuss the terminology, implementation, and possible issues of simultaneous fits based on the X-ray data analysis tool Interactive Spectral Interpretation System (ISIS). While all data modeling tools in X-ray astronomy allow, in principle, fitting data from multiple datasets individually, the syntax used in these tools is often not well suited for this task. Applying simultaneous fits to the transient X-ray binary GRO J1008-57, we find that the spectral shape depends only on the X-ray flux. We determine time-independent parameters, such as the folding energy E_fold, with unprecedented precision.
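The core idea of a simultaneous fit, tying a physically shared parameter across datasets, can be sketched outside ISIS with an ordinary least-squares fit in which two synthetic datasets share one slope but keep independent offsets. All data values and parameter names are invented for illustration:

```python
import numpy as np
from scipy.optimize import least_squares

# Two hypothetical datasets that share a physical slope but have
# instrument-specific offsets, mimicking tied parameters across spectra.
x1 = np.linspace(0.0, 1.0, 20)
x2 = np.linspace(0.0, 1.0, 30)
y1 = 2.0 * x1 + 1.0
y2 = 2.0 * x2 - 0.5

def residuals(p):
    slope, off1, off2 = p          # 'slope' is the single tied parameter
    return np.concatenate([slope * x1 + off1 - y1,    # residuals, dataset 1
                           slope * x2 + off2 - y2])   # residuals, dataset 2

fit = least_squares(residuals, x0=[1.0, 0.0, 0.0])
slope, off1, off2 = fit.x
```

Because both residual vectors enter one objective, the shared slope is constrained by all the data at once, which is exactly why tied parameters in a simultaneous fit come out with higher precision than fitting each dataset separately.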
Analysis of physical activity mass media campaign design.
Lankford, Tina; Wallace, Jana; Brown, David; Soares, Jesus; Epping, Jacqueline N; Fridinger, Fred
2014-08-01
Mass media campaigns are a necessary tool for public health practitioners to reach large populations and promote healthy behaviors. Most health scholars have concluded that mass media can significantly influence the health behaviors of populations; however the effects of such campaigns are typically modest and may require significant resources. A recent Community Preventive Services Task Force review on stand-alone mass media campaigns concluded there was insufficient evidence to determine their effectiveness in increasing physical activity, partly due to mixed methods and modest and inconsistent effects on levels of physical activity. A secondary analysis was performed on the campaigns evaluated in the Task Force review to determine use of campaign-building principles, channels, and levels of awareness and their impact on campaign outcomes. Each study was analyzed by 2 reviewers for inclusion of campaign building principles. Campaigns that included 5 or more campaign principles were more likely to be successful in achieving physical activity outcomes. Campaign success is more likely if the campaign building principles (formative research, audience segmentation, message design, channel placement, process evaluation, and theory-based) are used as part of campaign design and planning.
WE-G-BRC-02: Risk Assessment for HDR Brachytherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayadev, J.
2016-06-15
Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time-consuming and labor-intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness, and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures including HDR brachytherapy, physics plan review, and radiosurgery. A description of common equipment and devices used in these procedures and how to characterize new devices for safe use in patient treatments will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one’s own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: (1) Understand the general concept of failure mode and effects analysis; (2) Learn how to characterize new equipment for safety; (3) Be able to identify potential failure modes for specific procedures and learn mitigation techniques; (4) Be able to customize FMEA examples and templates for use in any clinic.
WE-G-BRC-01: Risk Assessment for Radiosurgery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, G.
2016-06-15
Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time-consuming and labor-intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness, and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures including HDR brachytherapy, physics plan review, and radiosurgery. A description of common equipment and devices used in these procedures and how to characterize new devices for safe use in patient treatments will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one’s own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: (1) Understand the general concept of failure mode and effects analysis; (2) Learn how to characterize new equipment for safety; (3) Be able to identify potential failure modes for specific procedures and learn mitigation techniques; (4) Be able to customize FMEA examples and templates for use in any clinic.
Stans, Steffy E A; Dalemans, Ruth J P; de Witte, Luc P; Smeets, Hester W H; Beurskens, Anna J
2017-12-01
The role of the physical environment in communication between health-care professionals and persons with communication problems is a neglected area. This study provides an overview of factors in the physical environment that play a role in communication during conversations between people who are communication vulnerable and health-care professionals. A scoping review was conducted using the methodological framework of Arksey and O'Malley. The PubMed, PsycINFO, CINAHL and Cochrane Library databases were screened, and a descriptive and thematic analysis was completed. Sixteen publications were included. Six factors in the physical environment play a role in conversations between people who are communication vulnerable and health-care professionals: (1) lighting, (2) acoustic environment, (3) humidity and temperature, (4) setting and furniture placement, (5) written information, and (6) availability of augmentative and alternative communication (AAC) tools. These factors indicated barriers and strategies related to the quality of these conversations. Relatively small and simple strategies to adjust the physical environment (such as adequate lighting, quiet environment, providing pen and paper) can support people who are communication vulnerable to be more involved in conversations. It is recommended that health-care professionals have an overall awareness of the potential influence of environmental elements on conversations. Implications for rehabilitation: The physical environment is an important feature in the success or disturbance of communication. Small adjustments to the physical environment in rehabilitation can contribute to a communication-friendly environment for conversations with people who are communication vulnerable.
Professionals should consider adjustments with regard to the following factors in the physical environment during conversations with people who are communication vulnerable: lighting, acoustic environment, humidity and temperature, setting and furniture placement, written information, and availability of AAC (augmentative and alternative communication tools).
Network analysis of physics discussion forums and links to course success
NASA Astrophysics Data System (ADS)
Traxler, Adrienne; Gavrin, Andrew; Lindell, Rebecca
2017-01-01
Large introductory science courses tend to isolate students, with negative consequences for long-term retention in college. Many active learning courses build collaboration and community among students as an explicit goal, and social network analysis has been used to track the development and beneficial effects of these collaborations. Here we supplement such work by conducting network analysis of online course discussion forums in two semesters of an introductory physics class. Online forums provide a tool for engaging students with each other outside of class, and offer new opportunities to commuter or non-traditional students with limited on-campus time. We look for correlations between position in the forum network (centrality) and final course grades. Preliminary investigation has shown weak correlations in the very dense full-semester network, so we will consider reduced "backbone" networks that highlight the most consistent links between students. Future work and implications for instruction will also be discussed.
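To make the "centrality vs. grade" analysis concrete, here is a minimal sketch using degree centrality and a Pearson correlation on a tiny hypothetical forum network. The edge list and grades are invented for illustration; the study's actual networks are far denser and the paper reports only weak correlations.

```python
# Sketch: correlate forum-network centrality with course grade.
# Edges and grades below are hypothetical illustrations.
from collections import defaultdict
from math import sqrt

edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E")]
grades = {"A": 88, "B": 85, "C": 92, "D": 76, "E": 70}

# Degree centrality: fraction of other students a student interacted with
neighbors = defaultdict(set)
for u, v in edges:
    neighbors[u].add(v)
    neighbors[v].add(u)
n = len(grades)
centrality = {s: len(neighbors[s]) / (n - 1) for s in grades}

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

students = sorted(grades)
r = pearson([centrality[s] for s in students], [grades[s] for s in students])
print(f"correlation(centrality, grade) = {r:.2f}")
```

The backbone reduction mentioned in the abstract would filter `edges` down to the most consistent links before computing centrality.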
Modelling of tunnelling processes and rock cutting tool wear with the particle finite element method
NASA Astrophysics Data System (ADS)
Carbonell, Josep Maria; Oñate, Eugenio; Suárez, Benjamín
2013-09-01
Underground construction involves all sorts of challenges in the analysis, design, project, and execution phases. The dimensions of tunnels and their structural requirements are growing, and so are safety and security demands. New engineering tools are needed for safer planning and design. This work presents advances in the particle finite element method (PFEM) for the modelling and analysis of tunnelling processes, including the wear of the cutting tools. The PFEM is founded on the Lagrangian description of the motion of a continuum built from a set of particles with known physical properties. The method uses a remeshing process combined with the alpha-shape technique to detect the contacting surfaces, and a finite element method for the mechanical computations. A contact procedure has been developed for the PFEM which is combined with a constitutive model for predicting the excavation front and the wear of cutting tools. The material parameters govern the coupling of frictional contact and wear between the interacting domains at the excavation front. The PFEM allows predicting several parameters relevant to estimating the performance of a tunnel boring machine, such as wear in the cutting tools, the pressure distribution on the face of the boring machine, and the vibrations produced in the machinery and the adjacent soil/rock. The final aim is to help in the design of the excavating tools and in the planning of the tunnelling operations. The applications presented show that the PFEM is a promising technique for the analysis of tunnelling problems.
Revealing Fundamental Physics from the Daya Bay Neutrino Experiment Using Deep Neural Networks
Racah, Evan; Ko, Seyoon; Sadowski, Peter; ...
2017-02-02
Experiments in particle physics produce enormous quantities of data that must be analyzed and interpreted by teams of physicists. This analysis is often exploratory, where scientists are unable to enumerate the possible types of signal prior to performing the experiment. Thus, tools for summarizing, clustering, visualizing and classifying high-dimensional data are essential. In this work, we show that meaningful physical content can be revealed by transforming the raw data into a learned high-level representation using deep neural networks, with measurements taken at the Daya Bay Neutrino Experiment as a case study. We further show how convolutional deep neural networks can provide an effective classification filter with greater than 97% accuracy across different classes of physics events, significantly better than other machine learning approaches.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L.; Prescott, Steven; Coleman, Justin
This report describes the current progress and status related to Industry Application #2, focusing on External Hazards. For this industry application within the Light Water Reactor Sustainability (LWRS) Program Risk-Informed Safety Margin Characterization (RISMC) R&D Pathway, we will create the Risk-Informed Margin Management (RIMM) approach to represent meaningful (i.e., realistic facility representation) event scenarios and consequences by using an advanced 3D facility representation that will evaluate external hazards such as flooding and earthquakes, in order to identify, model, and analyze the appropriate physics that needs to be included to determine plant vulnerabilities related to external events; manage the communication and interactions between different physics modeling and analysis technologies; and develop the computational infrastructure through tools related to plant representation, scenario depiction, and physics prediction. One of the unique aspects of the RISMC approach is how it couples probabilistic approaches (the scenario) with mechanistic phenomena representation (the physics) through simulation. This simulation-based modeling allows decision makers to focus on a variety of safety, performance, or economic metrics. In this report, we describe the evaluation of various physics toolkits related to flooding representation. Ultimately, we will couple the flooding representation with other events such as earthquakes in order to provide coupled physics analysis for scenarios where interactions exist.
Insightful problem solving and creative tool modification by captive nontool-using rooks.
Bird, Christopher D; Emery, Nathan J
2009-06-23
The ability to use tools has been suggested to indicate advanced physical cognition in animals. Here we show that rooks, members of the corvid family that do not appear to use tools in the wild, are capable of insightful problem solving related to sophisticated tool use, including spontaneously modifying and using a variety of tools, shaping hooks out of wire, and using a series of tools in a sequence to gain a reward. It is remarkable that a species that does not use tools in the wild appears to possess an understanding of tools rivaling that of habitual tool users such as New Caledonian crows and chimpanzees. Our findings suggest that the ability to represent tools may be a domain-general cognitive capacity rather than an adaptive specialization, and they call into question the relationship between physical intelligence and wild tool use.
A Bubble Chamber Simulator: A New Tool for the Physics Classroom
ERIC Educational Resources Information Center
Gagnon, Michel
2011-01-01
Mainly used in the 1960s, bubble chambers played a major role in particle physics. Now replaced with modern electronic detectors, we believe they remain an important didactic tool to introduce particle physics as they provide visual, appealing and insightful pictures. Sadly, this rare type of detector is mostly accessible through open-door events…
Locating the Center of Gravity: The Dance of Normal and Frictional Forces
ERIC Educational Resources Information Center
Balta, Nuri
2012-01-01
Teaching physics concepts with the basic materials that are around us is one of the beauties of physics. Without expensive lab materials and long experiments, many physics concepts can be taught to students using simple tools. Demonstrations with these tools can be presented as discrepant events that surprise, amaze, or puzzle students. Greenslade…
Using Plickers as an Assessment Tool in Health and Physical Education Settings
ERIC Educational Resources Information Center
Chng, Lena; Gurvitch, Rachel
2018-01-01
Written tests are one of the most common assessment tools classroom teachers use today. Despite its popularity, administering written tests or surveys, especially in health and physical education settings, is time consuming. In addition to the time taken to type and print out the tests or surveys, health and physical education teachers must grade…
ERIC Educational Resources Information Center
Metos, Julie; Gren, Lisa; Brusseau, Timothy; Moric, Endi; O'Toole, Karen; Mokhtari, Tahereh; Buys, Saundra; Frost, Caren
2018-01-01
Objective: The objective of this study was to understand adolescent girls' experiences using practical diet and physical activity measurement tools and to explore the food and physical activity settings that influence their lifestyle habits. Design: Mixed methods study using quantitative and qualitative methods. Setting: Large city in the western…
Pattison, Kira M.; Brooks, Dina; Cameron, Jill I.
2015-01-01
Background The use of standardized assessment tools is an element of evidence-informed rehabilitation, but physical therapists report administering these tools inconsistently poststroke. An in-depth understanding of physical therapists' approaches to walking assessment is needed to develop strategies to advance assessment practice. Objectives The objective of this study was to explore the methods physical therapists use to evaluate walking poststroke, reasons for selecting these methods, and the use of assessment results in clinical practice. Design A qualitative descriptive study involving semistructured telephone interviews was conducted. Methods Registered physical therapists assessing a minimum of 10 people with stroke per year in Ontario, Canada, were purposively recruited from acute care, rehabilitation, and outpatient settings. Interviews were audiotaped and transcribed verbatim. Transcripts were coded line by line by the interviewer. Credibility was optimized through triangulation of analysts, audit trail, and collection of field notes. Results Study participants worked in acute care (n=8), rehabilitation (n=11), or outpatient (n=9) settings and reported using movement observation and standardized assessment tools to evaluate walking. When selecting methods to evaluate walking, physical therapists described being influenced by a hierarchy of factors. Factors included characteristics of the assessment tool, the therapist, the workplace, and patients, as well as influential individuals or organizations. Familiarity exerted the primary influence on adoption of a tool into a therapist's assessment repertoire, whereas patient factors commonly determined daily use. Participants reported using the results from walking assessments to communicate progress to the patient and health care professionals. Conclusions Multilevel factors influence physical therapists' adoption and daily administration of standardized tools to assess walking. 
Findings will inform knowledge translation efforts aimed at increasing the standardized assessment of walking poststroke. PMID:25929532
Design and analysis of a magneto-rheological damper for an all terrain vehicle
NASA Astrophysics Data System (ADS)
Krishnan Unni, R.; Tamilarasan, N.
2018-02-01
A shock absorber design intended to replace the existing conventional shock absorber with a controllable system using a magneto-rheological damper is introduced for an All Terrain Vehicle (ATV) designed for Baja SAE competitions. The suspension is a vital part of an All Terrain Vehicle, as it endures varied surfaces and requires utmost attention during design. COMSOL Multiphysics software, used for coupled-physics problems, served as the tool for the design and analysis phase of the magneto-rheological damper for the considered application, and the model was optimized by the Taguchi method using DOE software. The magneto-rheological damper is designed to maximize the damping force within the measured geometric constraints of the All Terrain Vehicle.
Review of computational fluid dynamics (CFD) researches on nano fluid flow through micro channel
NASA Astrophysics Data System (ADS)
Dewangan, Satish Kumar
2018-05-01
Nanofluid is becoming a promising heat transfer fluid due to its improved thermo-physical properties and heat transfer performance. Micro channel heat transfer has potential application in the cooling of high-power-density microchips in CPU systems, micro power systems, and many such miniature thermal systems that need advanced cooling capacity. Use of nanofluids enhances the effectiveness of such systems. Computational Fluid Dynamics (CFD) is a very powerful tool for computational analysis of various physical processes, and its application to flow and heat transfer analysis of nanofluids is catching on very fast. The present paper gives a brief account of the methodology of CFD and summarizes its application to nanofluid flow and heat transfer for microchannel cases.
Papathomas, Anthony; Williams, Toni L.; Smith, Brett
2015-01-01
The aim of this study was to identify the types of physical activity narratives drawn upon by active spinal injured people. More than 50 h of semi-structured life-story interview data, collected as part of a larger interdisciplinary program of disability lifestyle research, were analysed for 30 physically active male and female spinal cord injury (SCI) participants. A structural narrative analysis of the data identified three narrative types on which people with SCI draw: (1) exercise is restitution, (2) exercise is medicine, and (3) exercise is progressive redemption. These insights contribute new knowledge by adding a unique narrative perspective to existing cognitive understanding of physical activity behaviour in the spinal cord injured population. The implications of this narrative typology for developing effective positive behavioural change interventions are critically discussed. It is concluded that the identified narrative types may be constitutive, as well as reflective, of physical activity experiences and may therefore be a useful tool on which to base physical activity promotion initiatives. PMID:26282868
Capstone: A Geometry-Centric Platform to Enable Physics-Based Simulation and Design of Systems
2015-10-05
Capstone is the geometric foundation for the CREATE™-AV solvers Kestrel [11] and Helios [16,17], and for the CREATE™-AV air-vehicle early design tool DaVinci [9], which enables the development of associative models of air vehicles. It is part of the Computational Research and Engineering Acquisition Tools and Environments (CREATE™) program [6], aimed at developing a suite of high-performance, physics-based computational tools addressing the needs of acquisition programs.
RELIABILITY AND VALIDITY OF A BIOMECHANICALLY BASED ANALYSIS METHOD FOR THE TENNIS SERVE
Kibler, W. Ben; Lamborn, Leah; Smith, Belinda J.; English, Tony; Jacobs, Cale; Uhl, Tim L.
2017-01-01
Background: An observational tennis serve analysis (OTSA) tool was developed using previously established body positions from three-dimensional kinematic motion analysis studies. These positions, defined as nodes, have been associated with efficient force production and minimal joint loading. However, the tool has yet to be examined scientifically. Purpose: The primary purpose of this investigation was to determine the inter-observer reliability for each node between the two health care professionals (HCPs) who developed the OTSA, and secondarily to investigate the validity of the OTSA. Methods: Two separate studies were performed to meet these objectives. An inter-observer reliability study preceded the validity study by examining 28 videos of players serving. Two HCPs graded each video and scored the presence or absence of each node. Discriminant validity was determined in 33 tennis players using videotaped records of three first serves. Serve mechanics were graded using the OTSA, and players were categorized into those with good (≥5) and poor (≤4) mechanics. Participants performed a series of field tests to evaluate trunk flexibility, lower extremity and trunk power, and dynamic balance. Results: The group with good mechanics demonstrated greater backward trunk flexibility (p=0.02), greater rotational power (p=0.02), and a higher single-leg countermovement jump (p=0.05). Reliability of the OTSA ranged from K=0.36-1.0, with the majority of the nodes displaying substantial reliability (K>0.61). Conclusion: This study provides HCPs with a valid and reliable field tool for assessing serve mechanics. Physical characteristics of trunk mobility and power appear to discriminate serve mechanics between players. Future intervention studies are needed to determine whether improvement in physical function contributes to improved serve mechanics. Level of Evidence: 3. PMID:28593098
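The inter-observer reliability statistic reported here (K, Cohen's kappa) is straightforward to compute. A minimal sketch follows; the two raters' presence/absence scores are hypothetical, not data from the study, which reported per-node kappas from 0.36 to 1.0.

```python
# Cohen's kappa for two raters scoring presence (1) / absence (0)
# of a serve node across videos. Ratings below are hypothetical.

def cohens_kappa(rater1, rater2):
    """Agreement corrected for chance, from two equal-length score lists."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    labels = set(rater1) | set(rater2)
    # observed agreement
    po = sum(a == b for a, b in zip(rater1, rater2)) / n
    # chance agreement from each rater's marginal frequencies
    pe = sum((rater1.count(l) / n) * (rater2.count(l) / n) for l in labels)
    return (po - pe) / (1 - pe)

r1 = [1, 1, 0, 1, 0, 1, 1, 0]
r2 = [1, 0, 0, 1, 0, 1, 1, 1]
print(f"kappa = {cohens_kappa(r1, r2):.2f}")
```

By the conventional Landis-Koch bands used in the abstract, values above 0.61 count as substantial agreement.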
Flores Mateo, Gemma; Granado-Font, Esther; Ferré-Grau, Carme; Montaña-Carreras, Xavier
2015-11-10
To our knowledge, no meta-analysis to date has assessed the efficacy of mobile phone apps to promote weight loss and increase physical activity. Our objective was to perform a systematic review and meta-analysis of studies comparing the efficacy of mobile phone apps with other approaches to promote weight loss and increase physical activity. We conducted a systematic review and meta-analysis of relevant studies identified by a search of PubMed, the Cumulative Index to Nursing and Allied Health Literature (CINAHL), and Scopus from their inception through to August 2015. Two members of the study team (EG-F, GF-M) independently screened studies for inclusion criteria and extracted data. We included all controlled studies that assessed a mobile phone app intervention with weight-related health measures (ie, body weight, body mass index, or waist circumference) or physical activity outcomes. Net change estimates comparing the intervention group with the control group were pooled across studies using random-effects models. We included 12 articles in this systematic review and meta-analysis. Compared with the control group, use of a mobile phone app was associated with significant changes in body weight (kg) and body mass index (kg/m²) of -1.04 kg (95% CI -1.75 to -0.34; I² = 41%) and -0.43 kg/m² (95% CI -0.74 to -0.13; I² = 50%), respectively. Moreover, a nonsignificant difference in physical activity was observed between the two groups (standardized mean difference 0.40, 95% CI -0.07 to 0.87; I² = 93%). These findings were remarkably robust in the sensitivity analysis. No publication bias was shown. Evidence from this study shows that mobile phone app-based interventions may be useful tools for weight loss.
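The random-effects pooling named in the abstract is commonly done with the DerSimonian-Laird estimator; a minimal sketch is below. The per-study effects and standard errors are hypothetical weight changes (kg), not the review's 12 real studies.

```python
# DerSimonian-Laird random-effects meta-analysis (sketch).
# Study effects/SEs below are hypothetical illustrations.
from math import sqrt

effects = [-1.5, -0.6, -1.2, -0.3]   # per-study mean difference, kg
ses     = [0.5, 0.4, 0.6, 0.3]       # per-study standard errors

def dersimonian_laird(effects, ses):
    w = [1 / se**2 for se in ses]                           # fixed-effect weights
    fe = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fe)**2 for wi, yi in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                           # between-study variance
    w_re = [1 / (se**2 + tau2) for se in ses]               # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se_pooled = sqrt(1 / sum(w_re))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

est, ci = dersimonian_laird(effects, ses)
print(f"pooled change = {est:.2f} kg, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```

When the between-study variance tau² is zero, the result collapses to the fixed-effect estimate; the heterogeneity the abstract reports (I² up to 93%) is exactly what inflates tau² and widens the pooled interval.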
NASA Astrophysics Data System (ADS)
Lamy, L.; Henry, F.; Prangé, R.; Le Sidaner, P.
2015-10-01
The Auroral Planetary Imaging and Spectroscopy (APIS) service (http://obspm.fr/apis/) provides open and interactive access to processed auroral observations of the outer planets and their satellites. Such observations are of interest for a wide community at the interface between planetology, magnetospheric and heliospheric physics. APIS consists of (i) a high-level database, built from planetary auroral observations acquired by the Hubble Space Telescope (HST) since 1997 with its most-used far-ultraviolet spectro-imagers, (ii) a dedicated search interface aimed at browsing this database efficiently through relevant conditional search criteria (Figure 1), and (iii) the ability to interactively work with the data online through plotting tools developed by the Virtual Observatory (VO) community, such as Aladin and Specview. This service is VO compliant and can therefore also be queried by external search tools of the VO community. The diversity of available data and the capability to sort them by relevant physical criteria shall in particular facilitate statistical studies on long-term scales and/or multi-instrumental, multispectral combined analyses [1,2]. We will present the updated capabilities of APIS with several examples. Several tutorials are available online.
Physics-based Entry, Descent and Landing Risk Model
NASA Technical Reports Server (NTRS)
Gee, Ken; Huynh, Loc C.; Manning, Ted
2014-01-01
A physics-based risk model was developed to assess the risk associated with thermal protection system failures during the entry, descent and landing phase of a manned spacecraft mission. In the model, entry trajectories were computed using a three-degree-of-freedom trajectory tool, the aerothermodynamic heating environment was computed using an engineering-level computational tool and the thermal response of the TPS material was modeled using a one-dimensional thermal response tool. The model was capable of modeling the effect of micrometeoroid and orbital debris impact damage on the TPS thermal response. A Monte Carlo analysis was used to determine the effects of uncertainties in the vehicle state at Entry Interface, aerothermodynamic heating and material properties on the performance of the TPS design. The failure criterion was set as a temperature limit at the bondline between the TPS and the underlying structure. Both direct computation and response surface approaches were used to compute the risk. The model was applied to a generic manned space capsule design. The effect of material property uncertainty and MMOD damage on risk of failure were analyzed. A comparison of the direct computation and response surface approach was undertaken.
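The Monte Carlo step described above can be sketched in a few lines: sample the uncertain inputs, propagate them through the thermal model, and count cases where the bondline temperature exceeds its limit. The surrogate model and every number below are hypothetical stand-ins, not the paper's engineering tools or values.

```python
# Monte Carlo sketch of TPS failure-probability estimation.
# The surrogate model and all numbers are hypothetical illustrations.
import random

random.seed(0)                 # reproducible sampling
T_LIMIT = 250.0                # hypothetical bondline temperature limit, deg C

def bondline_temp(heat_load, conductivity):
    """Toy surrogate standing in for the 1-D thermal response tool."""
    return 150.0 + heat_load / conductivity

failures = 0
N = 100_000
for _ in range(N):
    # uncertain inputs: aerothermal heat load and TPS material conductivity
    heat_load = random.gauss(40.0, 8.0)
    conductivity = random.gauss(0.5, 0.05)
    if bondline_temp(heat_load, conductivity) > T_LIMIT:
        failures += 1

p_fail = failures / N
print(f"estimated failure probability: {p_fail:.4f}")
```

The response-surface approach the abstract compares against would replace the surrogate call with a precomputed fit, trading per-sample cost for an approximation error.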
Mapping the literature of physical therapy.
Wakiji, E M
1997-01-01
Physical therapy is a fast growing profession because of the aging population, medical advances, and the public's interest in health promotion. This study is part of the Medical Library Association (MLA) Nursing and Allied Health Resources Section's project to map the allied health literature. It identifies the core journals in physical therapy by analyzing the cited references of articles in two established physical therapy journals, Physical Therapy and Archives of Physical Medicine and Rehabilitation, during the period 1991 through 1993. This bibliometric analysis also determines the extent to which these journals are covered by the primary indexing sources, Allied and Alternative Medicine (AMED), the Cumulative Index to Nursing and Allied Health Literature, EMBASE, and MEDLINE. In this study, fourteen journals were found to supply one-third of all references studied. Ninety-five journals provided an additional third of the references. MEDLINE rated the highest as the indexing tool of choice for these 109 journals. The study results can assist in collection development decisions, advise physical therapists as to the best access to their core literature, and influence database producers to increase their coverage of the literature important to physical therapy. PMID:9285129
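The core-journal identification described above (a small set of journals supplying the first third of all cited references, Bradford-style) reduces to counting and accumulating. The citation list below is hypothetical; the study's actual finding was 14 core journals out of 109.

```python
# Bradford-style core-journal count from a cited-reference list.
# The citation data below is a hypothetical illustration.
from collections import Counter

cited = (["Phys Ther"] * 30 + ["Arch Phys Med Rehabil"] * 25 +
         ["Spine"] * 10 + ["J Biomech"] * 8 + ["Clin Orthop"] * 7 +
         ["Stroke"] * 5 + ["Pain"] * 5 + ["Other A"] * 5 + ["Other B"] * 5)

counts = Counter(cited)
total = sum(counts.values())

# Zone 1: the most-cited journals that together supply the first third
core, running = [], 0
for journal, n in counts.most_common():
    if running >= total / 3:
        break
    core.append(journal)
    running += n

print(f"{len(core)} journal(s) supply the first third of {total} references")
```

Checking index coverage for the resulting core list is then a lookup per journal against each database's title list.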
Physical punishment, culture, and rights: current issues for professionals.
Durrant, Joan E
2008-02-01
Once considered a legitimate parenting tool, physical punishment is increasingly being redefined as a developmental risk factor by health professionals. Three forces that have contributed to this significant social change are the evolution of pediatric psychology, increasing understanding of the dynamics of parental violence, and growing recognition of children as rights bearers. However, despite the consistency of research findings demonstrating the risks of physical punishment, some practitioners still struggle with the question of whether physical punishment is an appropriate practice among some cultural or ethnic groups. This issue is explored through an analysis of studies examining cultural differences and similarities in physical punishment's effects, as well as legal decisions made throughout the world. Despite practitioners' awareness of the prevalence and impact of parental violence, some still struggle with deciding where to "draw the line" in advising parents about spanking. This issue is addressed through an examination of the role that physical punishment plays in child maltreatment. Finally, the human rights perspective on physical punishment is offered as a new lens through which practitioners may view physical punishment to clarify the fuzzy issues of cultural relativity and the punishment-abuse dichotomy.
NASA Astrophysics Data System (ADS)
O'Neill, B. C.; Kauffman, B.; Lawrence, P.
2016-12-01
Integrated analysis of questions regarding land, water, and energy resources often requires integration of models of different types. One type of integration is between human and earth system models, since both societal and physical processes influence these resources. For example, human processes such as changes in population, economic conditions, and policies govern the demand for land, water and energy, while the interactions of these resources with physical systems determine their availability and environmental consequences. We have begun to develop and use a toolkit for linking human and earth system models called the Toolbox for Human-Earth System Integration and Scaling (THESIS). THESIS consists of models and software tools to translate, scale, and synthesize information from and between human system models and earth system models (ESMs), with initial application to linking the NCAR integrated assessment model, iPETS, with the NCAR earth system model, CESM. Initial development is focused on urban areas and agriculture, sectors that are explicitly represented in both CESM and iPETS. Tools are being made available to the community as they are completed (see https://www2.cgd.ucar.edu/sections/tss/iam/THESIS_tools). We discuss four general types of functions that THESIS tools serve (Spatial Distribution, Spatial Properties, Consistency, and Outcome Evaluation). Tools are designed to be modular and can be combined in order to carry out more complex analyses. We illustrate their application to both the exposure of population to climate extremes and the evaluation of climate impacts on the agriculture sector. For example, projecting exposure to climate extremes involves use of THESIS tools for spatial population, spatial urban land cover, the characteristics of both, and a tool to bring urban climate information together with spatial population information. Development of THESIS tools is continuing and open to the research community.
White, David B.
1991-01-01
An electrical safety device for use in power tools that is designed to automatically discontinue operation of the power tool upon physical contact of the tool with a concealed conductive material. A step down transformer is used to supply the operating power for a disconnect relay and a reset relay. When physical contact is made between the power tool and the conductive material, an electrical circuit through the disconnect relay is completed and the operation of the power tool is automatically interrupted. Once the contact between the tool and conductive material is broken, the power tool can be quickly and easily reactivated by a reset push button activating the reset relay. A remote reset is provided for convenience and efficiency of operation.
Applications of Nuclear and Particle Physics Technology: Particles & Detection — A Brief Overview
NASA Astrophysics Data System (ADS)
Weisenberger, Andrew G.
A brief overview of the technology applications with significant societal benefit that have their origins in nuclear and particle physics research is presented. It is shown through representative examples that applications of nuclear physics can be classified into two basic areas: 1) applying the results of experimental nuclear physics and 2) applying the tools of experimental nuclear physics. Examples of the application of the tools of experimental nuclear and particle physics research are provided in the fields of accelerator- and detector-based technologies, namely synchrotron light sources, nuclear medicine, ion implantation, and radiation therapy.
NASA Astrophysics Data System (ADS)
DeVore, Seth; Marshman, Emily; Singh, Chandralekha
2017-06-01
As research-based, self-paced electronic learning tools become increasingly available, a critical issue educators encounter is implementing strategies to ensure that all students engage with them as intended. Here, we first discuss the effectiveness of electronic learning tutorials as self-paced learning tools in large enrollment brick and mortar introductory physics courses and then propose a framework for helping students engage effectively with the learning tools. The tutorials were developed via research in physics education and were found to be effective for a diverse group of introductory physics students in one-on-one implementation. Instructors encouraged the use of these tools in a self-paced learning environment by telling students that they would be helpful for solving the assigned homework problems and that the underlying physics principles in the tutorial problems would be similar to those in the in-class quizzes (which we call paired problems). We find that many students in the courses in which these interactive electronic learning tutorials were assigned as a self-study tool performed poorly on the paired problems. In contrast, a majority of student volunteers in one-on-one implementation greatly benefited from the tutorials and performed well on the paired problems. The significantly lower overall performance on paired problems administered as an in-class quiz compared to the performance of student volunteers who used the research-based tutorials in one-on-one implementation suggests that many students enrolled in introductory physics courses did not effectively engage with the tutorials outside of class and may have only used them superficially. The findings suggest that many students in need of out-of-class remediation via self-paced learning tools may have difficulty motivating themselves and may lack the self-regulation and time-management skills to engage effectively with tools specially designed to help them learn at their own pace. 
We conclude by proposing a theoretical framework to help students with diverse prior preparations engage effectively with self-paced learning tools.
NASA Astrophysics Data System (ADS)
Farina, Simone; Thepsonti, Thanongsak; Ceretti, Elisabetta; Özel, Tugrul
2011-05-01
Titanium alloys offer superb properties in strength, corrosion resistance, and biocompatibility and are commonly utilized in medical devices and implants. Micro-end milling is a direct and rapid fabrication method for manufacturing medical devices and implants in titanium alloys. Process performance and quality depend upon an understanding of the relationship between cutting parameters, forces, and resultant tool deflections to avoid tool breakage. For this purpose, FE simulations of chip formation during micro-end milling of Ti-6Al-4V alloy with an ultra-fine grain solid carbide two-flute micro-end mill are investigated using DEFORM software. At first, specific forces in the tangential and radial directions of cutting during micro-end milling for varying feed advance and rotational speeds have been determined using designed FE simulations of the chip formation process. Later, these forces are applied to the micro-end mill geometry along the axial depth of cut in a 3D analysis in ABAQUS. Consequently, 3D distributions of tool deflections and von Mises stress are determined. These analyses will aid in establishing integrated multi-physics process models for high-performance micro-end milling and represent a leap forward in process improvement.
Report on Automated Semantic Analysis of Scientific and Engineering Codes
NASA Technical Reports Server (NTRS)
Stewart, Mark E. M.; Follen, Greg (Technical Monitor)
2001-01-01
The loss of the Mars Climate Orbiter due to a software error reveals what insiders know: software development is difficult and risky because, in part, current practices do not readily handle the complex details of software. Yet, for scientific software development the MCO mishap represents the tip of the iceberg; few errors are so public, and many errors are avoided with a combination of expertise, care, and testing during development and modification. Further, this effort consumes valuable time and resources even when hardware costs and execution time continually decrease. Software development could use better tools! This lack of tools has motivated the semantic analysis work explained in this report. However, this work has a distinguishing emphasis; the tool focuses on automated recognition of the fundamental mathematical and physical meaning of scientific code. Further, its comprehension is measured by quantitatively evaluating overall recognition with practical codes. This emphasis is necessary if software errors, like the MCO error, are to be quickly and inexpensively avoided in the future. This report evaluates the progress made on this problem. It presents recommendations, describes the approach, the tool's status, the challenges, related research, and a development strategy.
van der Weegen, Sanne; Verwey, Renée; Spreeuwenberg, Marieke; Tange, Huibert; van der Weijden, Trudy; de Witte, Luc
2015-07-24
Physical inactivity is a major public health problem. The It's LiFe! monitoring and feedback tool embedded in the Self-Management Support Program (SSP) is an attempt to stimulate physical activity in people with chronic obstructive pulmonary disease or type 2 diabetes treated in primary care. Our aim was to evaluate whether the SSP combined with the use of the monitoring and feedback tool leads to more physical activity compared to usual care, and to evaluate the additional effect of using this tool on top of the SSP. This was a three-armed cluster randomised controlled trial. Twenty-four family practices were randomly assigned to one of three groups in which participants received the tool + SSP (group 1), the SSP (group 2), or care as usual (group 3). The primary outcome measure was minutes of physical activity per day. The secondary outcomes were general and exercise self-efficacy and quality of life. Outcomes were measured at baseline, after the intervention (4-6 months), and 3 months thereafter. The group that received the entire intervention (tool + SSP) showed more physical activity directly after the intervention than Group 3 (mean difference 11.73, 95% CI 6.21-17.25; P<.001), and Group 2 (mean difference 7.86, 95% CI 2.18-13.54; P=.003). Three months after the intervention, this effect was still present and significant (compared to Group 3: mean difference 10.59, 95% CI 4.94-16.25; P<.001; compared to Group 2: mean difference 9.41, 95% CI 3.70-15.11; P<.001). There was no significant difference in effect between Groups 2 and 3 at either time point. There was no interaction effect for disease type. The combination of counseling with the tool proved an effective way to stimulate physical activity. Counseling without the tool was not effective. Future research on the cost-effectiveness and application under more tailored conditions and in other target groups is recommended.
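The reported effects above are between-group mean differences with 95% confidence intervals. As a hedged illustration (the trial itself used multilevel models to account for clustering; the function name and data below are hypothetical), a plain two-sample difference with a normal-approximation CI can be computed as:

```python
import math

def mean_diff_ci(a, b, z=1.96):
    """Unadjusted between-group mean difference with a normal-approximation
    95% CI. Illustrative only; a cluster RCT analysis would adjust for
    clustering and baseline values."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    se = math.sqrt(va / len(a) + vb / len(b))
    d = ma - mb
    return d, (d - z * se, d + z * se)

# Hypothetical minutes-per-day data for an intervention and a control group
diff, (lo, hi) = mean_diff_ci([35, 42, 38, 50, 44], [30, 33, 29, 36, 31])
```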
ClinicalTrials.gov: NCT01867970, https://clinicaltrials.gov/ct2/show/NCT01867970 (archived by WebCite at http://www.webcitation.org/6a2qR5BSr).
Comparison of physical and semi-empirical hydraulic models for flood inundation mapping
NASA Astrophysics Data System (ADS)
Tavakoly, A. A.; Afshari, S.; Omranian, E.; Feng, D.; Rajib, A.; Snow, A.; Cohen, S.; Merwade, V.; Fekete, B. M.; Sharif, H. O.; Beighley, E.
2016-12-01
Various hydraulic/GIS-based tools can be used for illustrating the spatial extent of flooding for first responders, policy makers, and the general public. The objective of this study is to compare four flood inundation modeling tools: HEC-RAS-2D, Gridded Surface Subsurface Hydrologic Analysis (GSSHA), AutoRoute, and Height Above the Nearest Drainage (HAND). There is a trade-off among accuracy, workability, and computational demand in detailed, physics-based flood inundation models (e.g. HEC-RAS-2D and GSSHA) in contrast with semi-empirical, topography-based, computationally less expensive approaches (e.g. AutoRoute and HAND). The motivation for this study is to evaluate this trade-off and offer guidance for potential large-scale application in an operational prediction system. The models were assessed and contrasted via comparability analysis (e.g. overlapping statistics) using three case studies in the states of Alabama, Texas, and West Virginia. The sensitivity and accuracy of the physical and semi-empirical models in producing inundation extent were evaluated for the following attributes: geophysical characteristics (e.g. high topographic variability vs. flat natural terrain, urbanized vs. rural zones, effect of surface roughness parameter value), influence of hydraulic structures such as dams and levees compared to unobstructed flow conditions, accuracy in large vs. small study domains, and effect of spatial resolution in topographic data (e.g. 10m National Elevation Dataset vs. 0.3m LiDAR). Preliminary results suggest that, in a flat, urbanized area with a controlled/managed river channel, semi-empirical models tend to underestimate the inundation extent by around 40% compared to the physical models, regardless of topographic resolution. However, in places with topographic undulations, semi-empirical models attain a relatively higher level of accuracy than they do in flat non-urbanized terrain.
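The overlapping statistics used in such comparability analyses are commonly variants of a fit index F = (cells wet in both maps) / (cells wet in either map), computed over binary inundation grids. A minimal sketch, with a function name and grids of our own invention:

```python
def fit_index(a, b):
    """Overlap (fit) statistic F = wet-in-both / wet-in-either for two
    binary inundation maps given as same-length flat sequences of 0/1."""
    inter = sum(1 for x, y in zip(a, b) if x and y)
    union = sum(1 for x, y in zip(a, b) if x or y)
    return inter / union if union else 1.0

# Hypothetical 1-D slices of two model outputs (1 = wet, 0 = dry)
model_a = [0, 1, 1, 1, 0, 0, 1, 0]
model_b = [0, 1, 1, 0, 0, 1, 1, 0]
f = fit_index(model_a, model_b)  # 3 cells wet in both, 5 wet in either
```

F = 1 means the two models flood exactly the same cells; values near 0 mean almost no overlap.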
Zhang, Tan; Chen, Ang
2017-01-01
Based on the job demands-resources model, the study developed and validated an instrument that measures physical education teachers' job demands-resources perception. Expert review established content validity with the average item rating of 3.6/5.0. Construct validity and reliability were determined with a teacher sample ( n = 397). Exploratory factor analysis established a five-dimension construct structure matching the theoretical construct deliberated in the literature. The composite reliability scores for the five dimensions range from .68 to .83. Validity coefficients (intraclass correlational coefficients) are .69 for job resources items and .82 for job demands items. Inter-scale correlational coefficients range from -.32 to .47. Confirmatory factor analysis confirmed the construct validity with high dimensional factor loadings (ranging from .47 to .84 for job resources scale and from .50 to .85 for job demands scale) and adequate model fit indexes (root mean square error of approximation = .06). The instrument provides a tool to measure physical education teachers' perception of their working environment.
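Composite reliability scores like those reported can be computed from standardized factor loadings. A minimal sketch (the loadings below are hypothetical, chosen within the range the study reports for its scales):

```python
def composite_reliability(loadings):
    """Composite reliability from standardized factor loadings:
    CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    where each item's error variance is 1 - loading^2."""
    s = sum(loadings) ** 2
    err = sum(1 - l * l for l in loadings)
    return s / (s + err)

# Hypothetical loadings within the reported .50-.85 range for one dimension
cr = composite_reliability([0.50, 0.62, 0.71, 0.85])
```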
Vaiman, Daniel; Miralles, Francisco
2016-01-01
Preeclampsia (PE) is a pregnancy disorder defined by hypertension and proteinuria. This disease remains a major cause of maternal and fetal morbidity and mortality. Defective placentation is generally described as being at the root of the disease. The characterization of the transcriptome signature of the preeclamptic placenta has allowed the identification of differentially expressed genes (DEGs). However, we still lack detailed knowledge of how these DEGs impact the function of the placenta. The tools of network biology offer a methodology to explore complex diseases at a systems level. In this study we performed a cross-platform meta-analysis of seven publicly available gene expression datasets comparing non-pathological and preeclamptic placentas. Using the rank product algorithm we identified a total of 369 DEGs consistently modified in PE. The DEGs were used as seeds to build both an extended physical protein-protein interaction network and a transcription factor regulatory network. Topological and clustering analysis was conducted to analyze the connectivity properties of the networks. Finally, both networks were merged into a composite network which presents an integrated view of the regulatory pathways involved in preeclampsia and the crosstalk between them. This network is a useful tool to explore the relationship between the DEGs and enables hypothesis generation for functional experimentation. PMID:27802351
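The rank product statistic ranks each gene within every dataset and takes the geometric mean of its ranks across datasets, so genes consistently near the top in every study score lowest. A minimal sketch of the idea (gene names and fold changes below are invented, and the published algorithm adds a permutation-based significance estimate not shown here):

```python
def rank_products(datasets):
    """Rank product per gene across k datasets: the geometric mean of the
    gene's rank in each dataset (rank 1 = smallest fold change, i.e. the
    gene listed first after ascending sort)."""
    genes = list(datasets[0].keys())
    k = len(datasets)
    rp = {g: 1.0 for g in genes}
    for ds in datasets:
        ranked = sorted(ds, key=ds.get)          # ascending fold change
        ranks = {g: i + 1 for i, g in enumerate(ranked)}
        for g in genes:
            rp[g] *= ranks[g]
    return {g: rp[g] ** (1.0 / k) for g in genes}

# Hypothetical log fold-changes for three genes in two studies
rp = rank_products([
    {"GENE_A": -2.1, "GENE_B": -1.0, "GENE_C": 0.3},
    {"GENE_A": -1.8, "GENE_B": -0.9, "GENE_C": 0.5},
])
```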
Artificial neural networks in biology and chemistry: the evolution of a new analytical tool.
Cartwright, Hugh M
2008-01-01
Once regarded as an eccentric and unpromising algorithm for the analysis of scientific data, the neural network has been developed in the last decade into a powerful computational tool. Its use now spans all areas of science, from the physical sciences and engineering to the life sciences and allied subjects. Applications range from the assessment of epidemiological data or the deconvolution of spectra to highly practical applications, such as the electronic nose. This introductory chapter considers briefly the growth in the use of neural networks and provides some general background in preparation for the more detailed chapters that follow.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Damiani, Rick
This manual summarizes the theory and preliminary verifications of the JacketSE module, which is an offshore jacket sizing tool that is part of the Wind-Plant Integrated System Design & Engineering Model toolbox. JacketSE is based on a finite-element formulation and on user-prescribed inputs and design standards' criteria (constraints). The physics are highly simplified, with a primary focus on satisfying ultimate limit states and modal performance requirements. Preliminary validation work included comparing industry data and verification against ANSYS, a commercial finite-element analysis package. The results are encouraging, and future improvements to the code are recommended in this manual.
Energy geotechnics: Advances in subsurface energy recovery, storage, exchange, and waste management
McCartney, John S.; Sanchez, Marcelo; Tomac, Ingrid
2016-02-17
Energy geotechnics involves the use of geotechnical principles to understand and engineer the coupled thermo-hydro-chemo-mechanical processes encountered in collecting, exchanging, storing, and protecting energy resources in the subsurface. In addition to research on these fundamental coupled processes and characterization of relevant material properties, applied research is being performed to develop analytical tools for the design and analysis of different geo-energy applications. In conclusion, the aims of this paper are to discuss the fundamental physics and constitutive models that are common to these different applications, and to summarize recent advances in the development of relevant analytical tools.
Courcelles, Mathieu; Coulombe-Huntington, Jasmin; Cossette, Émilie; Gingras, Anne-Claude; Thibault, Pierre; Tyers, Mike
2017-07-07
Protein cross-linking mass spectrometry (CL-MS) enables the sensitive detection of protein interactions and the inference of protein complex topology. The detection of chemical cross-links between protein residues can identify intra- and interprotein contact sites or provide physical constraints for molecular modeling of protein structure. Recent innovations in cross-linker design, sample preparation, mass spectrometry, and software tools have significantly improved CL-MS approaches. Although a number of algorithms now exist for the identification of cross-linked peptides from mass spectral data, a dearth of user-friendly analysis tools represents a practical bottleneck to the broad adoption of the approach. To facilitate the analysis of CL-MS data, we developed CLMSVault, a software suite designed to leverage existing CL-MS algorithms and provide intuitive and flexible tools for cross-platform data interpretation. CLMSVault stores and combines complementary information obtained from different cross-linkers and search algorithms. CLMSVault provides filtering, comparison, and visualization tools to support CL-MS analyses and includes a workflow for label-free quantification of cross-linked peptides. An embedded 3D viewer enables the visualization of quantitative data and the mapping of cross-linked sites onto PDB structural models. We demonstrate the application of CLMSVault for the analysis of a noncovalent Cdc34-ubiquitin protein complex cross-linked under different conditions. CLMSVault is open-source software (available at https://gitlab.com/courcelm/clmsvault.git ), and a live demo is available at http://democlmsvault.tyerslab.com/ .
Massively Parallel Processing for Fast and Accurate Stamping Simulations
NASA Astrophysics Data System (ADS)
Gress, Jeffrey J.; Xu, Siguang; Joshi, Ramesh; Wang, Chuan-tao; Paul, Sabu
2005-08-01
The competitive automotive market drives automotive manufacturers to speed up vehicle development cycles and reduce lead time. Fast tooling development is one of the key areas to support fast and short vehicle development programs (VDP). In the past ten years, stamping simulation has become the most effective validation tool for predicting and resolving all potential formability and quality problems before the dies are physically made. Stamping simulation and formability analysis have become a critical business segment in the GM math-based die engineering process. As simulation becomes one of the major production tools in the engineering factory, simulation speed and accuracy are two of the most important measures for stamping simulation technology. The speed and time-in-system of forming analysis become even more critical to support fast VDP and tooling readiness. Since 1997, General Motors Die Center has been working jointly with our software vendor to develop and implement a parallel version of simulation software for mass-production analysis applications. By 2001, this technology had matured in the form of distributed memory processing (DMP) of draw die simulations in a networked distributed-memory computing environment. In 2004, this technology was refined to massively parallel processing (MPP) and extended to line die forming analysis (draw, trim, flange, and associated spring-back) running on a dedicated computing environment. The evolution of this technology and the insight gained through the implementation of DMP/MPP technology, as well as performance benchmarks, are discussed in this publication.
Semi-supervised Machine Learning for Analysis of Hydrogeochemical Data and Models
NASA Astrophysics Data System (ADS)
Vesselinov, Velimir; O'Malley, Daniel; Alexandrov, Boian; Moore, Bryan
2017-04-01
Data- and model-based analyses such as uncertainty quantification, sensitivity analysis, and decision support using complex physics models with numerous model parameters typically require a huge number of model evaluations (on the order of 10^6). Furthermore, model simulations of complex physics may require substantial computational time. For example, accounting for simultaneously occurring physical processes such as fluid flow and biogeochemical reactions in a heterogeneous porous medium may require several hours of wall-clock computational time. To address these issues, we have developed a novel methodology for semi-supervised machine learning based on Non-negative Matrix Factorization (NMF) coupled with customized k-means clustering. The algorithm allows for automated, robust Blind Source Separation (BSS) of groundwater types (contamination sources) based on model-free analyses of observed hydrogeochemical data. We have also developed reduced order modeling tools, which couple support vector regression (SVR), genetic algorithms (GA), and artificial and convolutional neural networks (ANN/CNN). SVR is applied to predict the model behavior within prior uncertainty ranges associated with the model parameters. ANN and CNN procedures are applied to upscale heterogeneity of the porous medium. In the upscaling process, fine-scale high-resolution models of heterogeneity are applied to inform coarse-resolution models which have improved computational efficiency while capturing the impact of fine-scale effects at the coarse scale of interest. These techniques are tested independently on a series of synthetic problems. We also present a decision analysis related to contaminant remediation where the developed reduced order models are applied to reproduce groundwater flow and contaminant transport in a synthetic heterogeneous aquifer. The tools are coded in Julia and are a part of the MADS high-performance computational framework (https://github.com/madsjulia/Mads.jl).
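The BSS step can be sketched with off-the-shelf components: factor an observation matrix into non-negative mixing weights and source signatures, then cluster the weights to label groundwater types. This is a simplified stand-in using scikit-learn (the authors' actual method is a customized NMF+k-means implementation in Julia; all data here are synthetic):

```python
import numpy as np
from sklearn.decomposition import NMF
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic data: 50 wells x 8 geochemical species, mixed from 3 sources
true_W = rng.random((50, 3))
true_H = rng.random((3, 8))
X = true_W @ true_H

# Factor X ~ W @ H with non-negativity constraints:
# W = per-well mixing ratios, H = per-source geochemical signatures
model = NMF(n_components=3, init="nndsvda", max_iter=1000, random_state=0)
W = model.fit_transform(X)
H = model.components_

# Cluster wells by normalized mixing ratios to label groundwater types
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
    W / W.sum(axis=1, keepdims=True)
)
```

In the real workflow the number of components is itself unknown and is chosen by running the factorization at several ranks and checking cluster robustness.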
Two-Dimensional Neutronic and Fuel Cycle Analysis of the Transatomic Power Molten Salt Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Betzler, Benjamin R.; Powers, Jeffrey J.; Worrall, Andrew
2017-01-15
This status report presents the results from the first phase of the collaboration between Transatomic Power Corporation (TAP) and Oak Ridge National Laboratory (ORNL) to provide neutronic and fuel cycle analysis of the TAP core design through the Department of Energy Gateway for Accelerated Innovation in Nuclear, Nuclear Energy Voucher program. The TAP design is a molten salt reactor using movable moderator rods to shift the neutron spectrum in the core from mostly epithermal at beginning of life to thermal at end of life. Additional developments in the ChemTriton modeling and simulation tool provide the critical moderator-to-fuel ratio searches and time-dependent parameters necessary to simulate the continuously changing physics in this complex system. Results from simulations with these tools show agreement with TAP-calculated performance metrics for core lifetime, discharge burnup, and salt volume fraction, verifying the viability of reducing actinide waste production with this design. Additional analyses of time step sizes, mass feed rates and enrichments, and isotopic removals provide additional information to make informed design decisions. This work further demonstrates capabilities of ORNL modeling and simulation tools for analysis of molten salt reactor designs and strongly positions this effort for the upcoming three-dimensional core analysis.
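A critical moderator-to-fuel ratio search of the kind described amounts to root-finding on reactivity: find the ratio at which k_eff = 1. A hedged sketch using bisection with a toy monotonic surrogate in place of a real neutronics solver (ChemTriton's actual search logic and physics are not reproduced here):

```python
def critical_search(k_eff, lo, hi, tol=1e-6):
    """Bisection search for the moderator-to-fuel ratio where
    k_eff(ratio) = 1, assuming k_eff increases monotonically with
    moderation over [lo, hi]. `k_eff` is a caller-supplied surrogate."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if k_eff(mid) < 1.0:
            lo = mid  # undermoderated: need more moderation
        else:
            hi = mid  # at or above critical: back off
    return 0.5 * (lo + hi)

# Toy surrogate: k_eff rises linearly with moderation, crossing 1.0 near 0.35
ratio = critical_search(lambda r: 0.8 + 0.571428 * r, 0.0, 1.0)
```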
Initial development of a computer-aided diagnosis tool for solitary pulmonary nodules
NASA Astrophysics Data System (ADS)
Catarious, David M., Jr.; Baydush, Alan H.; Floyd, Carey E., Jr.
2001-07-01
This paper describes the development of a computer-aided diagnosis (CAD) tool for solitary pulmonary nodules. This CAD tool is built upon physically meaningful features that were selected because of their relevance to shape and texture. These features included a modified version of the Hotelling statistic (HS), a channelized HS, three measures of fractal properties, two measures of spicularity, and three manually measured shape features. These features were measured from a difficult database consisting of 237 regions of interest (ROIs) extracted from digitized chest radiographs. The center of each 256x256 pixel ROI contained a suspicious lesion which was sent to follow-up by a radiologist and whose nature was later clinically determined. Linear discriminant analysis (LDA) was used to search the feature space via sequential forward search using percentage correct as the performance metric. An optimized feature subset, selected for the highest accuracy, was then fed into a three layer artificial neural network (ANN). The ANN's performance was assessed by receiver operating characteristic (ROC) analysis. A leave-one-out testing/training methodology was employed for the ROC analysis. The performance of this system is competitive with that of three radiologists on the same database.
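The feature-selection step, sequential forward search driven by linear discriminant analysis, can be sketched as a greedy loop. This illustrative version uses synthetic data and cross-validated accuracy rather than the paper's exact percentage-correct metric on its 237-ROI database:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a nodule feature matrix (9 candidate features)
X, y = make_classification(n_samples=120, n_features=9, n_informative=4,
                           random_state=0)

def forward_search(X, y, n_keep=4):
    """Greedy sequential forward search: at each step, add the feature
    that most improves cross-validated LDA accuracy."""
    chosen = []
    while len(chosen) < n_keep:
        best, best_score = None, -1.0
        for j in range(X.shape[1]):
            if j in chosen:
                continue
            cols = chosen + [j]
            score = cross_val_score(LinearDiscriminantAnalysis(),
                                    X[:, cols], y, cv=5).mean()
            if score > best_score:
                best, best_score = j, score
        chosen.append(best)
    return chosen

subset = forward_search(X, y)
```

In the paper, the subset selected this way is then fed to a three-layer neural network and evaluated with leave-one-out ROC analysis.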
Nathan, Nicole; Wolfenden, Luke; Morgan, Philip J; Bell, Andrew C; Barker, Daniel; Wiggers, John
2013-06-13
Valid tools measuring characteristics of the school environment associated with the physical activity and dietary behaviours of children are needed to accurately evaluate the impact of initiatives to improve school environments. The aim of this study was to assess the validity of Principal self-report of primary school healthy eating and physical activity environments. Primary school Principals (n = 42) in New South Wales, Australia, were invited to complete a telephone survey of the school environment: the School Environment Assessment Tool (SEAT). Equivalent observational data were collected by pre-service teachers located within the school. The SEAT involved 65 items that assessed food availability via canteens, vending machines and fundraisers and the presence of physical activity facilities, equipment and organised physical activities. Kappa statistics were used to assess agreement between the two measures. Almost 70% of the survey demonstrated moderate to almost perfect agreement. Substantial agreement was found for 10 of 13 items assessing foods sold for fundraising, 3 of 6 items assessing physical activity facilities of the school, and both items assessing organised physical activities that occurred at recess and lunch and school sport. Limited agreement was found for items assessing foods sold through canteens and access to small screen recreation. The SEAT provides researchers and policy makers with a valid tool for assessing aspects of the school food and physical activity environment.
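The agreement analysis rests on Cohen's kappa, which discounts the observed agreement between two raters by the agreement expected under chance. A minimal sketch with hypothetical per-item responses (the function name and data are our own, not the study's):

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' categorical judgements
    (here: Principal survey vs. in-school observer, one entry per item)."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n      # observed agreement
    cats = set(r1) | set(r2)
    pe = sum((r1.count(c) / n) * (r2.count(c) / n)    # chance agreement
             for c in cats)
    return (po - pe) / (1 - pe)

# Hypothetical yes/no responses for 10 SEAT-style items
survey   = ["y", "y", "n", "y", "n", "y", "y", "n", "y", "n"]
observed = ["y", "y", "n", "y", "y", "y", "y", "n", "n", "n"]
kappa = cohens_kappa(survey, observed)
```

Values of roughly .41-.60 are conventionally read as moderate agreement and .61-.80 as substantial.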
HEPS Tool for Schools: A Guide for School Policy Development on Healthy Eating and Physical Activity
ERIC Educational Resources Information Center
Simovska, Venka; Dadaczynski, Kevin; Viig, Nina Grieg; Bowker, Sue; Woynarowska, Barbara; de Ruiter, Silvia; Buijs, Goof
2010-01-01
The HEPS Tool for Schools provides ideas, guidelines and suggested techniques to help schools in their development of school policy on healthy eating and physical activity. There is growing evidence that a comprehensive whole school policy on healthy eating and physical activity can lead to better academic outcomes of pupils as well as promoting…
Meta II: Multi-Model Language Suite for Cyber Physical Systems
2013-03-01
The AVM (META) projects have developed tools for designing cyber physical systems (CPS), also termed mechatronic systems. These systems, exemplified by modern amphibious and ground military vehicles, are increasingly complex. The tool suite captures the structural and parametric interface of Simulink models, defines associations with CyPhy components and component interfaces, and supports embedded systems modeling.
Solar Tutorial and Annotation Resource (STAR)
NASA Astrophysics Data System (ADS)
Showalter, C.; Rex, R.; Hurlburt, N. E.; Zita, E. J.
2009-12-01
We have written a software suite designed to facilitate solar data analysis by scientists, students, and the public, anticipating enormous datasets from future instruments. Our “STAR” suite includes an interactive learning section explaining 15 classes of solar events. Users learn software tools that exploit humans’ superior ability (over computers) to identify many events. Annotation tools include time slice generation to quantify loop oscillations, the interpolation of event shapes using natural cubic splines (for loops, sigmoids, and filaments) and closed cubic splines (for coronal holes). Learning these tools in an environment where examples are provided prepares new users to comfortably utilize annotation software with new data. Upon completion of our tutorial, users are presented with media of various solar events and asked to identify and annotate the images, to test their mastery of the system. Goals of the project include public input into the data analysis of very large datasets from future solar satellites, and increased public interest and knowledge about the Sun. In 2010, the Solar Dynamics Observatory (SDO) will be launched into orbit. SDO’s advancements in solar telescope technology will generate a terabyte per day of high-quality data, requiring innovation in data management. While major projects develop automated feature recognition software, so that computers can complete much of the initial event tagging and analysis, still, that software cannot annotate features such as sigmoids, coronal magnetic loops, coronal dimming, etc., due to large amounts of data concentrated in relatively small areas. Previously, solar physicists manually annotated these features, but with the imminent influx of data it is unrealistic to expect specialized researchers to examine every image that computers cannot fully process. A new approach is needed to efficiently process these data.
Providing analysis tools and data access to students and the public has proven efficient in similar astrophysical projects (e.g. the “Galaxy Zoo”). For “crowdsourcing” to be effective for solar research, the public needs knowledge and skills to recognize and annotate key events on the Sun. Our tutorial can provide this training, with over 200 images and 18 movies showing examples of active regions, coronal dimmings, coronal holes, coronal jets, coronal waves, emerging flux, sigmoids, coronal magnetic loops, filaments, filament eruption, flares, loop oscillation, plage, surges, and sunspots. Annotation tools are provided for many of these events. Many features of the tutorial, such as mouse-over definitions and interactive annotation examples, are designed to assist people without previous experience in solar physics. After completing the tutorial, the user is presented with an interactive quiz: a series of movies and images to identify and annotate. The tutorial teaches the user, with feedback on correct and incorrect answers, until the user develops appropriate confidence and skill. This prepares users to annotate new data, based on their experience with event recognition and annotation tools. Trained users can contribute significantly to our data analysis tasks, even as our training tool contributes to public science literacy and interest in solar physics.
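The annotation geometry described, natural cubic splines for open features such as loops and closed (periodic) splines for coronal-hole outlines, can be sketched with SciPy; the click points below are invented:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical user-clicked points along a coronal loop (open curve)
pts = np.array([[10, 40], [25, 62], [48, 70], [70, 58], [82, 35]], float)
t = np.arange(len(pts))
loop = CubicSpline(t, pts, bc_type="natural")      # natural cubic spline
smooth_loop = loop(np.linspace(0, len(pts) - 1, 100))

# A coronal-hole outline is closed: repeat the first point and use a
# periodic spline so the curve joins itself smoothly
ring = np.vstack([pts, pts[0]])
hole = CubicSpline(np.arange(len(ring)), ring, bc_type="periodic")
smooth_hole = hole(np.linspace(0, len(ring) - 1, 120))
```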
National policy on physical activity: the development of a policy audit tool.
Bull, Fiona C; Milton, Karen; Kahlmeier, Sonja
2014-02-01
Physical inactivity is a leading risk factor for noncommunicable disease worldwide. Increasing physical activity requires large scale actions and relevant, supportive national policy across multiple sectors. The policy audit tool (PAT) was developed to provide a standardized instrument to assess national policy approaches to physical activity. A draft tool, based on earlier work, was developed and pilot-tested in 7 countries. After several rounds of revisions, the final PAT comprises 27 items and collects information on 1) government structure, 2) development and content of identified key policies across multiple sectors, 3) the experience of policy implementation at both the national and local level, and 4) a summary of the PAT completion process. PAT provides a standardized instrument for assessing progress of national policy on physical activity. Engaging a diverse international group of countries in the development helped ensure PAT has applicability across a wide range of countries and contexts. Experiences from the development of the PAT suggest that undertaking an audit of health enhancing physical activity (HEPA) policy can stimulate greater awareness of current policy opportunities and gaps, promote critical debate across sectors, and provide a catalyst for collaboration on policy level actions. The final tool is available online.
Insightful problem solving and creative tool modification by captive nontool-using rooks
Bird, Christopher D.; Emery, Nathan J.
2009-01-01
The ability to use tools has been suggested to indicate advanced physical cognition in animals. Here we show that rooks, a member of the corvid family that does not appear to use tools in the wild, are capable of insightful problem solving related to sophisticated tool use, including spontaneously modifying and using a variety of tools, shaping hooks out of wire, and using a series of tools in a sequence to gain a reward. It is remarkable that a species that does not use tools in the wild appears to possess an understanding of tools rivaling habitual tool users such as New Caledonian crows and chimpanzees. Our findings suggest that the ability to represent tools may be a domain-general cognitive capacity rather than an adaptive specialization, and question the relationship between physical intelligence and wild tool use. PMID:19478068
NASA Technical Reports Server (NTRS)
Kruse, F. A.; Lefkoff, A. B.; Boardman, J. W.; Heidebrecht, K. B.; Shapiro, A. T.; Barloon, P. J.; Goetz, A. F. H.
1993-01-01
The Center for the Study of Earth from Space (CSES) at the University of Colorado, Boulder, has developed a prototype interactive software system called the Spectral Image Processing System (SIPS) using IDL (the Interactive Data Language) on UNIX-based workstations. SIPS is designed to take advantage of the combination of high spectral resolution and spatial data presentation unique to imaging spectrometers. It streamlines analysis of these data by allowing scientists to rapidly interact with entire datasets. SIPS provides visualization tools for rapid exploratory analysis and numerical tools for quantitative modeling. The user interface is X-Windows-based, user friendly, and provides 'point and click' operation. SIPS is being used for multidisciplinary research concentrating on use of physically based analysis methods to enhance scientific results from imaging spectrometer data. The objective of this continuing effort is to develop operational techniques for quantitative analysis of imaging spectrometer data and to make them available to the scientific community prior to the launch of imaging spectrometer satellite systems such as the Earth Observing System (EOS) High Resolution Imaging Spectrometer (HIRIS).
X-ray astronomical spectroscopy
NASA Technical Reports Server (NTRS)
Holt, S. S.
1980-01-01
The current status of the X-ray spectroscopy of celestial X-ray sources, ranging from nearby stars to distant quasars, is reviewed. Particular emphasis is placed on the role of such spectroscopy as a useful and unique tool in the elucidation of the physical parameters of the sources. The spectroscopic analysis of degenerate and nondegenerate stellar systems, galactic clusters and active galactic nuclei, and supernova remnants is discussed.
Remote Sensing for Inland Water Quality Monitoring: A U.S. Army Corps of Engineers Perspective
2011-10-01
outlined in Water Quality Management Plans, including traditional field sampling (water, sediment, and biological) and measurement of physical...at one time, a more comprehensive historical record or trend analysis, a planning tool for prioritizing field surveying and sampling, and accurate...estimations of optically active constituents used to characterize water quality. Furthermore, when utilized in water quality management planning
NASA Technical Reports Server (NTRS)
Gibson, Jim; Jordan, Joe; Grant, Terry
1990-01-01
Local Area Network Extensible Simulator (LANES) computer program provides method for simulating performance of high-speed local-area-network (LAN) technology. Developed as design and analysis software tool for networking computers on board proposed Space Station. Load, network, link, and physical layers of layered network architecture all modeled. Mathematically models according to different lower-layer protocols: Fiber Distributed Data Interface (FDDI) and Star*Bus. Written in FORTRAN 77.
Techniques for the Detection of Faulty Packet Header Modifications
2014-03-12
layer approaches to check if packets are being altered by middleboxes and were primarily developed as network neutrality analysis tools. Switzerland works...local and metropolitan area networks—Specific requirements Part 11: Wireless LAN medium access control (MAC) and physical layer (PHY) specifications...policy or position of the Department of Defense or the U.S. Government. Understanding, measuring, and debugging IP networks, particularly across
Power and promise of narrative for advancing physical therapist education and practice.
Greenfield, Bruce H; Jensen, Gail M; Delany, Clare M; Mostrom, Elizabeth; Knab, Mary; Jampel, Ann
2015-06-01
This perspective article provides a justification for and an overview of the use of narrative as a pedagogical tool for educators to help physical therapist students, residents, and clinicians develop skills of reflection and reflexivity in clinical practice. The use of narratives is a pedagogical approach that provides a reflective and interpretive framework for analyzing and making sense of texts, stories, and other experiences within learning environments. This article describes reflection as a well-established method to support critical analysis of clinical experiences; to assist in uncovering different perspectives of patients, families, and health care professionals involved in patient care; and to broaden the epistemological basis (ie, sources of knowledge) for clinical practice. The article begins by examining how phronetic (ie, practical and contextual) knowledge and ethical knowledge are used in physical therapy to contribute to evidence-based practice. Narrative is explored as a source of phronetic and ethical knowledge that is complementary but irreducible to traditional objective and empirical knowledge-the type of clinical knowledge that forms the basis of scientific training. The central premise is that writing narratives is a cognitive skill that should be learned and practiced to develop critical reflection for expert practice. The article weaves theory with practical application and strategies to foster narrative in education and practice. The final section of the article describes the authors' experiences with examples of integrating the tools of narrative into an educational program, into physical therapist residency programs, and into a clinical practice. © 2015 American Physical Therapy Association.
Penelo, Eva; Estévez-Guerra, Gabriel J; Fariña-López, Emilio
2018-03-01
To study the internal structure and measurement invariance of the Physical Restraint Use Questionnaire and to compare perceptions, experience and training regarding the use of physical restraint on older people between nursing staff working in hospitals and nursing homes. Physical restraint of patients is still common in many countries, and thus, it is important to study the attitudes of nursing staff. One of the most common tools used to assess perceptions regarding its use is the Physical Restraint Use Questionnaire. However, gaps exist in its internal structure and measurement invariance across different groups of respondents. Cross-sectional multicentre survey. Data were collected from nurses working in eight Spanish hospitals and 19 nursing homes. All registered nurses and nurse assistants (N = 3,838) were contacted, of whom 1,635 agreed to participate. Confirmatory factor analysis was performed to determine the internal structure and measurement invariance of the Physical Restraint Use Questionnaire, after which scale scores and other measures of experience and training were compared between hospital-based (n = 855) and nursing home-based (n = 780) nurses. The Physical Restraint Use Questionnaire showed three invariant factors across type of facility, and also across professional category and sex. Nursing staff working in both types of facility scored similarly; prevention of therapy disruption and prevention of falls were rated as most important. Nurses working in nursing homes reported using restraint "many times" more frequently (52.9% vs. 38.6%), reported a severe lack of training less often (18.2% vs. 58.7%), and perceived their training as more adequate (33.4% vs. 17.7%) than hospital-based nurses. These findings support the Physical Restraint Use Questionnaire as a valid and reliable tool for assessing the importance given to the use of physical restraint on older people by nursing professionals, regardless of the setting being studied.
This information would help to design physical restraint training for nursing staff more specifically and to plan institutional interventions aimed at reducing its use. © 2018 John Wiley & Sons Ltd.
Emotion recognition in fathers and mothers at high-risk for child physical abuse.
Asla, Nagore; de Paúl, Joaquín; Pérez-Albéniz, Alicia
2011-09-01
The present study was designed to determine whether parents at high risk for physical child abuse, in comparison with parents at low risk, show deficits in emotion recognition, as well as to examine the moderator effect of gender and stress on the relationship between risk for physical child abuse and emotion recognition. Based on their scores on the Abuse Scale of the CAP Inventory (Milner, 1986), 64 parents at high risk (24 fathers and 40 mothers) and 80 parents at low risk (40 fathers and 40 mothers) for physical child abuse were selected. The Subtle Expression Training Tool/Micro Expression Training Tool (Ekman, 2004a, 2004b) and the Diagnostic Analysis of Nonverbal Accuracy II (Nowicki & Carton, 1993) were used to assess emotion recognition. As expected, parents at high risk, in contrast to parents at low risk, showed deficits in emotion recognition. However, differences between high- and low-risk participants were observed only for fathers, but not for mothers. Whereas fathers at high risk for physical child abuse made more errors than mothers at high risk, no differences between mothers at low risk and fathers at low risk were found. No interaction between stress, gender, and risk status was observed for errors in emotion recognition. The present findings, if confirmed with physical abusers, could be helpful to further our understanding of deficits in processing information of physically abusive parents and to develop treatment strategies specifically focused on emotion recognition. Moreover, if gender differences can be confirmed, the findings could be helpful to develop specific treatment programs for abusive fathers. Copyright © 2011 Elsevier Ltd. All rights reserved.
2013-01-01
Background In prior work, we presented the Ontology of Physics for Biology (OPB) as a computational ontology for use in the annotation and representation of biophysical knowledge encoded in repositories of physics-based biosimulation models. We introduced OPB:Physical entity and OPB:Physical property classes that extend available spatiotemporal representations of physical entities and processes to explicitly represent the thermodynamics and dynamics of physiological processes. Our utilitarian, long-term aim is to develop computational tools for creating and querying formalized physiological knowledge for use by multiscale “physiome” projects such as the EU’s Virtual Physiological Human (VPH) and NIH’s Virtual Physiological Rat (VPR). Results Here we describe the OPB:Physical dependency taxonomy of classes that represent the laws of classical physics that are the “rules” by which physical properties of physical entities change during occurrences of physical processes. For example, the fluid analog of Ohm’s law (as for electric currents) is used to describe how a blood flow rate depends on a blood pressure gradient. Hooke’s law (as in elastic deformations of springs) is used to describe how an increase in vascular volume increases blood pressure. We classify such dependencies according to the flow, transformation, and storage of thermodynamic energy that occurs during processes governed by the dependencies. Conclusions We have developed the OPB and annotation methods to represent the meaning—the biophysical semantics—of the mathematical statements of physiological analysis and the biophysical content of models and datasets. Here we describe and discuss our approach to an ontological representation of physical laws (as dependencies) and properties as encoded for the mathematical analysis of biophysical processes. PMID:24295137
O'Neill, B; McDonough, S M; Wilson, J J; Bradbury, I; Hayes, K; Kirk, A; Kent, L; Cosgrove, D; Bradley, J M; Tully, M A
2017-01-14
There are challenges for researchers and clinicians to select the most appropriate physical activity tool, and a balance between precision and feasibility is needed. Currently it is unclear which physical activity tool should be used to assess physical activity in Bronchiectasis. The aim of this research is to compare assessment methods (pedometer and IPAQ) to our criterion method (ActiGraph) for the measurement of physical activity dimensions in Bronchiectasis (BE), and to assess their feasibility and acceptability. Patients in this analysis were enrolled in a cross-sectional study. The ActiGraph and pedometer were worn for seven consecutive days and the IPAQ was completed for the same period. Statistical analyses were performed using SPSS 20 (IBM). Descriptive statistics were used; the percentage agreement between ActiGraph and the other measures was calculated using limits of agreement. Feedback about the feasibility of the activity monitors and the IPAQ was obtained. There were 55 (22 male) data sets available. For step count there was no significant difference between the ActiGraph and pedometer; however, total physical activity time (mins) as recorded by the ActiGraph was significantly higher than the pedometer (mean ± SD, 232 (75) vs. 63 (32)). Levels of agreement between the two devices were very good for step count (97% agreement), and variation in the levels of agreement was within accepted limits of ±2 standard deviations from the mean value. IPAQ reported more bouted moderate-to-vigorous physical activity (MVPA) [mean, SD; 167(170) vs 6(9) mins/day], and significantly less sedentary time than ActiGraph [mean, SD; 362(115) vs 634(76) mins/day]. There were low levels of agreement between the two tools (57% sedentary behaviour; 0% MVPA 10+), with IPAQ under-reporting sedentary behaviour and over-reporting MVPA 10+ compared to ActiGraph.
The monitors were found to be feasible and acceptable by participants and researchers; while the IPAQ was acceptable to use, most patients required assistance to complete it. Accurate measurement of physical activity is feasible in BE and will be valuable for future trials of therapeutic interventions. ActiGraph or pedometer could be used to measure simple daily step counts, but ActiGraph was superior as it measured intensity of physical activity and was a more precise measure of time spent walking. The IPAQ does not appear to represent an accurate measure of physical activity in this population. Clinical Trials Registration Number NCT01569009: Physical Activity in Bronchiectasis.
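The limits-of-agreement comparison used above (agreement judged within ±2 standard deviations of the mean between-device difference) can be sketched as a Bland-Altman-style calculation. The step-count data below are simulated for illustration, not the study's measurements:

```python
import numpy as np

def limits_of_agreement(a, b):
    """Bland-Altman-style limits of agreement between two measurement methods.

    Returns the mean difference (bias), the interval bias ± 2 SD, and the
    percentage of paired readings whose difference falls inside that interval.
    """
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    lower, upper = bias - 2 * sd, bias + 2 * sd
    pct_within = np.mean((diff >= lower) & (diff <= upper)) * 100
    return bias, (lower, upper), pct_within

# Simulated daily step counts from two devices worn simultaneously
rng = np.random.default_rng(0)
actigraph = rng.normal(8000, 2500, size=55)
pedometer = actigraph + rng.normal(0, 300, size=55)  # small random disagreement
bias, (lo, hi), pct = limits_of_agreement(actigraph, pedometer)
```

For two devices that disagree only by random noise, roughly 95% of differences are expected to fall within the ±2 SD band, which is the sense in which "97% agreement" indicates very good concordance.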
Tools don't-and won't-make the man: A cognitive look at the future.
Osiurak, François; Navarro, Jordan; Reynaud, Emanuelle; Thomas, Gauthier
2018-05-01
The question of whether tools erase cognitive and physical interindividual differences has been surprisingly overlooked in the literature. Yet if technology is profusely available in a near or far future, will we be equal in our capacity to use it? We sought to address this unexplored, fundamental issue, asking 200 participants to perform 3 physical (e.g., fine manipulation) and 3 cognitive tasks (e.g., calculation) in both non-tool-use and tool-use conditions. Here we show that tools do not erase but rather extend our intrinsic physical and cognitive skills. Moreover, this phenomenon of extension is task specific because we found no evidence for superusers, benefitting from the use of a tool irrespective of the task concerned. These results challenge the possibility that technical solutions could always be found to make people equal. Rather, technical innovation might be systematically limited by the user's initial degree of knowledge or skills for a given task. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Physical intelligence does matter to cumulative technological culture.
Osiurak, François; De Oliveira, Emmanuel; Navarro, Jordan; Lesourd, Mathieu; Claidière, Nicolas; Reynaud, Emanuelle
2016-08-01
Tool-based culture is not unique to humans, but cumulative technological culture is. The social intelligence hypothesis suggests that this phenomenon is fundamentally based on uniquely human sociocognitive skills (e.g., shared intentionality). An alternative hypothesis is that cumulative technological culture also crucially depends on physical intelligence, which may reflect fluid and crystallized aspects of intelligence and enables people to understand and improve the tools made by predecessors. By using a tool-making-based microsociety paradigm, we demonstrate that physical intelligence is a stronger predictor of cumulative technological performance than social intelligence. Moreover, learners' physical intelligence is critical not only in observational learning but also when learners interact verbally with teachers. Finally, we show that cumulative performance is only slightly influenced by teachers' physical and social intelligence. In sum, human technological culture needs "great engineers" to evolve regardless of the proportion of "great pedagogues." Social intelligence might play a more limited role than commonly assumed, perhaps in tool-use/making situations in which teachers and learners have to share symbolic representations. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Perspective: Reaches of chemical physics in biology.
Gruebele, Martin; Thirumalai, D
2013-09-28
Chemical physics as a discipline contributes many experimental tools, algorithms, and fundamental theoretical models that can be applied to biological problems. This is especially true now as the molecular level and the systems level descriptions begin to connect, and multi-scale approaches are being developed to solve cutting edge problems in biology. In some cases, the concepts and tools got their start in non-biological fields, and migrated over, such as the idea of glassy landscapes, fluorescence spectroscopy, or master equation approaches. In other cases, the tools were specifically developed with biological physics applications in mind, such as modeling of single molecule trajectories or super-resolution laser techniques. In this introduction to the special topic section on chemical physics of biological systems, we consider a wide range of contributions, all the way from the molecular level, to molecular assemblies, chemical physics of the cell, and finally systems-level approaches, based on the contributions to this special issue. Chemical physicists can look forward to an exciting future where computational tools, analytical models, and new instrumentation will push the boundaries of biological inquiry.
Simple Sensitivity Analysis for Orion Guidance Navigation and Control
NASA Technical Reports Server (NTRS)
Pressburger, Tom; Hoelscher, Brian; Martin, Rodney; Sricharan, Kumar
2013-01-01
The performance of Orion flight software, especially its GNC software, is being analyzed by running Monte Carlo simulations of Orion spacecraft flights. The simulated performance is analyzed for conformance with flight requirements, expressed as performance constraints. Flight requirements include guidance (e.g., touchdown distance from target) and control (e.g., control saturation) as well as performance (e.g., heat load constraints). The Monte Carlo simulations disperse hundreds of simulation input variables, for everything from mass properties to date of launch. We describe in this paper a sensitivity analysis tool ("Critical Factors Tool" or CFT) developed to find the input variables or pairs of variables which by themselves significantly influence satisfaction of requirements or significantly affect key performance metrics (e.g., touchdown distance from target). Knowing these factors can inform robustness analysis, can inform where engineering resources are most needed, and could even affect operations. The contributions of this paper include the introduction of novel sensitivity measures, such as estimating success probability, and a technique for determining whether pairs of factors are interacting dependently or independently. Input variables such as moments, mass, thrust dispersions, and date of launch were found to be significant factors for the success of various requirements. Examples are shown in this paper as well as a summary and physics discussion of EFT-1 driving factors that the tool found.
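One way to estimate how strongly a dispersed Monte Carlo input influences requirement satisfaction is to compare conditional success probabilities across quantile bins of that input. This is an illustrative sketch of the general idea, not the Critical Factors Tool's actual algorithm, and the inputs and pass/fail flag below are synthetic:

```python
import numpy as np

def factor_sensitivity(x, success, bins=5):
    """Spread of conditional success probability across quantile bins of one
    dispersed input. A larger spread means the factor more strongly drives
    whether the requirement is satisfied."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side='right') - 1, 0, bins - 1)
    p = np.array([success[idx == b].mean() for b in range(bins)])
    return p.max() - p.min(), p

# Synthetic dispersed inputs and a pass/fail requirement flag
rng = np.random.default_rng(1)
n = 20000
mass = rng.normal(0, 1, n)          # influential factor
launch_day = rng.uniform(0, 1, n)   # non-influential factor
success = mass < 1.0                # requirement fails for large mass dispersion

s_mass, _ = factor_sensitivity(mass, success)
s_day, _ = factor_sensitivity(launch_day, success)
```

The influential factor shows a large gap between its best and worst bins, while the irrelevant one shows only sampling noise; extending the same binning to pairs of inputs gives a crude test for whether two factors interact.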
The Auroral Planetary Imaging and Spectroscopy (APIS) service
NASA Astrophysics Data System (ADS)
Lamy, L.; Prangé, R.; Henry, F.; Le Sidaner, P.
2015-06-01
The Auroral Planetary Imaging and Spectroscopy (APIS) service, accessible online, provides an open and interactive access to processed auroral observations of the outer planets and their satellites. Such observations are of interest for a wide community at the interface between planetology, magnetospheric and heliospheric physics. APIS consists of (i) a high level database, built from planetary auroral observations acquired by the Hubble Space Telescope (HST) since 1997 with its most used Far-Ultraviolet spectro-imagers, (ii) a dedicated search interface aimed at browsing this database efficiently through relevant conditional search criteria and (iii) the ability to interactively work with the data online through plotting tools developed by the Virtual Observatory (VO) community, such as Aladin and Specview. This service is VO compliant and can therefore also be queried by external search tools of the VO community. The diversity of available data and the capability to sort them out by relevant physical criteria shall in particular facilitate statistical studies, on long-term scales and/or through combined multi-instrument, multi-spectral analysis.
NASA Astrophysics Data System (ADS)
Chilcote, S.; Maumenee, N.; Lucotch, J.; Whited, D.; Bansack, T.; Kimball, J. S.; Stanford, J.
2009-12-01
The Salmonid Rivers Observatory Network (SaRON) is an intensive field research project that aims to describe salmon production and diversity in relation to environmental drivers and the physical complexity of riverine shifting habitat mosaics. The Riverscape Analysis Project (RAP) is a spatially explicit remote sensing database which quantifies and ranks different combinations of physical landscape metrics around the Pacific Rim, displaying results through a publicly accessible web based decision support framework designed to empower regional management and conservation efforts for wild salmon. The objective of our research is to explicitly describe and relate different habitat types and their potential fish production at a variety of scales and throughout the range of Pacific salmon, leveraging our field research through available satellite remote sensing and geospatial analysis. We find that rivers exhibit a range of physical, chemical, and biotic conditions consistent with the shifting habitat mosaic (SHM) concept. Landscape physical variables derived from global Landsat imagery and SRTM-DEM information explain 93.2% of observed variability in over 1500 watersheds across the Pacific Rim. We expect that it is these coarse scale differences in river typologies which are responsible for the fine scale differences in habitat conditions and juvenile salmon production. Therefore, we ranked rivers using landscape scale physical variables to prioritize them for management actions based on potential productivity. For example, the Kvichak River of Bristol Bay is highly ranked (8th) based on its physical landscape structure as well as current human impacts. Currently, the Bristol Bay fishery is extremely productive.
Habitat structure can be used not only to define reference conditions and management targets for how many fish we would expect a river to produce based on its potential habitat capacity, but it also provides new analytical tools to quantitatively evaluate potential ecosystem impacts from proposed development activities. We found that proposed water extraction of 29 cubic feet per second (cfs) in a tributary of the Kvichak could potentially reduce off-channel habitat capacity by over 512 juvenile fish per hectare of habitat. In this article, we provide examples of how managers can integrate these novel data and tools into their evaluation frameworks in order to make informed, ecologically based decisions about current ecosystem conditions, desired ecological states, and potential tradeoffs in meeting salmon management goals in relation to human impacts.
Chircop, Andrea; Edgecombe, Nancy; Hayward, Kathryn; Ducey-Gilbert, Cherie; Sheppard-Lemoine, Debbie
2013-04-01
Currently used audiovisual (AV) teaching tools to teach health and physical assessment reflect a Eurocentric bias using the biomedical model. The purpose of our study was to (a) identify commonly used AV teaching tools of Canadian schools of nursing and (b) evaluate the identified tools. A two-part descriptive quantitative method design was used. First, we surveyed schools of nursing across Canada. Second, the identified AV teaching tools were evaluated for content and modeling of cultural competence. The majority of the schools (67%) used publisher-produced videos associated with a physical assessment textbook. Major findings included minimal demonstration of negotiation with a client around cultural aspects of the interview including the need for an interpreter, modesty, and inclusion of support persons. Identification of culturally specific examples given during the videos was superficial and did not provide students with a comprehensive understanding of necessary culturally competent skills.
PREFACE: Anti-counterfeit Image Analysis Methods (A Special Session of ICSXII)
NASA Astrophysics Data System (ADS)
Javidi, B.; Fournel, T.
2007-06-01
The International Congress for Stereology is dedicated to theoretical and applied aspects of stochastic tools, image analysis and mathematical morphology. A special emphasis on `anti-counterfeit image analysis methods' has been given this year for the XIIth edition (ICSXII). Facing the economic and social threat of counterfeiting, this devoted session presents recent advances and original solutions in the field. A first group of methods are related to marks located either on the product (physical marks) or on the data (hidden information) to be protected. These methods concern laser fs 3D encoding and source separation for machine-readable identification, moiré and `guilloche' engraving for visual verification and watermarking. Machine-readable travel documents are well-suited examples introducing the second group of methods which are related to cryptography. Used in passports for data authentication and identification (of people), cryptography provides some powerful tools. Opto-digital processing allows some efficient implementations described in the papers and promising applications. We would like to thank the reviewers who have contributed to a session of high quality, and the authors for their fine and hard work. We would like to address some special thanks to the invited lecturers, namely Professor Roger Hersch and Dr Isaac Amidror for their survey of moiré methods, Prof. Serge Vaudenay for his survey of existing protocols concerning machine-readable travel documents, and Dr Elisabet Pérez-Cabré for her presentation on optical encryption for multifactor authentication. We also thank Professor Dominique Jeulin, President of the International Society for Stereology, Professor Michel Jourlin, President of the organizing committee of ICSXII, for their help and advice, and Mr Graham Douglas, the Publisher of Journal of Physics: Conference Series at IOP Publishing, for his efficiency. 
We hope that this collection of papers will be useful as a tool to further develop a very important field. Bahram Javidi University of Connecticut (USA) Thierry Fournel University of Saint-Etienne (France) Chairs of the special session on `Anti-counterfeit image analysis methods', July 2007
Physical and Biological Carbon Isotope Fractionation in Methane During Gas-Push-Pull-Tests
NASA Astrophysics Data System (ADS)
Gonzalez-Gil, G.; Schroth, M. H.; Gomez, K.; Zeyer, J.
2005-12-01
Stable isotope analyses have become a common tool to assess microbially-mediated processes in subsurface environments. We investigated if stable carbon isotope analysis can be used as a tool to complement gas push-pull tests (GPPTs), a novel technique that was recently developed and tested for the in-situ quantification of CH4 oxidation in soils. During a GPPT a gas mixture containing CH4, O2 and nonreactive tracer gases is injected into the soil, where CH4 is oxidized by indigenous microorganisms. Thereafter, a blend of injected gas mixture and soil air is extracted from the same location, and CH4 oxidation is quantified from an analysis of extracted CH4 and tracer gases. To assess the magnitude of physical isotope fractionation due to molecular diffusion during GPPTs, we conducted laboratory experiments in the absence of microbial activity in a 1m-high, 1m-diameter tank filled with dry sand. During the GPPTs' extraction phase, the isotopic composition of methane was analyzed. Results indicated strong carbon isotope fractionation (>20 per mil) during GPPTs. To assess the combined effect of physical and biological isotope fractionation, numerical simulations of GPPTs were conducted in which microbial CH4 isotope fractionation was simulated using first-order rate constants and microbial kinetic isotope fractionation factors previously reported for methane oxidation in landfill environments. Results of these simulations indicated that for small CH4 oxidation rates, overall isotope fractionation in CH4 is dominated by physical fractionation. Conversely, for high CH4 oxidation rates, overall fractionation is dominated by biological fractionation. Thus, CH4 isotope fractionation data alone from a single GPPT cannot be used to assess microbial CH4 oxidation. However, biological fractionation may be quantified if physical fractionation due to diffusion is known. 
This can be achieved by conducting two sequential GPPTs, with microbial activity being inhibited in the second test.
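The biological component described above, first-order oxidation with a kinetic isotope fractionation factor, can be sketched as a Rayleigh-type model. This is a minimal illustration, not the authors' simulation code; the initial δ¹³C and the fractionation factor α below are hypothetical values in the range cited for landfill methanotrophy.

```python
def rayleigh_delta13C(delta0, f_remaining, alpha):
    """Residual-CH4 d13C after first-order oxidation (Rayleigh model).

    delta0      : initial d13C of the CH4 (per mil vs. VPDB)
    f_remaining : fraction of CH4 left unoxidized (0 < f <= 1)
    alpha       : kinetic fractionation factor k12/k13 (> 1 enriches residue)
    """
    return (delta0 + 1000.0) * f_remaining ** (1.0 / alpha - 1.0) - 1000.0

# Hypothetical example: alpha = 1.02, 50% of injected CH4 oxidized.
d = rayleigh_delta13C(-60.0, 0.5, 1.02)
```

With α = 1 there is no enrichment; larger oxidized fractions and larger α leave the residual CH₄ increasingly ¹³C-enriched, which is what a GPPT samples during its extraction phase.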
NASA Astrophysics Data System (ADS)
Offermans, A. G. E.; Haasnoot, M.
2009-04-01
Development of sustainable water management strategies involves analysing current and future vulnerability, identifying adaptation possibilities, analysing effects, and evaluating the strategies under different possible futures. Recent studies on water management have often followed the pressure-effect chain and compared the state of social, economic and ecological functions of the water systems in one or two future situations with the current situation. The future is, however, more complex and dynamic. Water management faces major challenges in coping with future uncertainties in both the water system and the social system. Uncertainties in our water system relate to (changes in) drivers and pressures and their effects on the state, such as the effects of climate change on discharges. Uncertainties in the social world relate to changing perceptions, objectives and demands concerning water (management), which are often related to the aforementioned changes in the physical environment. The methodology presented here comprises the 'Perspectives method', derived from Cultural Theory, a method for analysing and classifying social responses to social and natural states and pressures. The method will be used for scenario analysis and to identify social responses, including changes in perspectives and management strategies. The scenarios and responses will be integrated within a rapid assessment tool. The purpose of the tool is to provide users with insight into the interaction of the social and physical systems and to identify robust water management strategies by analysing their effectiveness under different possible futures on the physical, social and socio-economic system. This method allows for a mutual interaction between the physical and social system. We will present the theoretical background of the perspectives method as well as a historical overview of perspective changes in the Dutch Meuse area to show how social and physical systems interrelate.
We will also show how the integration of both can contribute to the identification of robust water management strategies.
Toward Better Physics Labs for Future Biologists
NASA Astrophysics Data System (ADS)
Giannini, John; Moore, Kim; Losert, Wolfgang
2014-03-01
We have developed a set of laboratories and hands-on activities to accompany a new two-semester interdisciplinary physics course that was successfully developed and tested in two small test classes at the University of Maryland, College Park (UMD) in 2012-2013, and is currently being used on a wider scale. We have designed the laboratories to accompany a reformed course in the student's second year, with calculus, biology, and chemistry as prerequisites. This permits the laboratories to include significant content on physics relevant to cellular scales, from chemical interactions to random motion and charge screening in fluids. One major focus of the laboratories is to introduce students to research-grade equipment and modern physics analysis tools in contexts relevant to biology, while maintaining the pedagogically valuable open-ended structure of reformed laboratories. Lab development procedures, along with some preliminary student results from these two small test classes, are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnstad, H.
The purpose of this meeting is to discuss the current and future HEP computing support and environments from the perspective of new horizons in accelerator, physics, and computing technologies. Topics of interest to the Meeting include (but are not limited to): the forming of the HEPLIB world user group for High Energy Physics computing; mandate, desirables, coordination, organization, funding; user experience, international collaboration; the roles of national labs, universities, and industry; range of software, Monte Carlo, mathematics, physics, interactive analysis, text processors, editors, graphics, data base systems, code management tools; program libraries, frequency of updates, distribution; distributed and interactive computing, data base systems, user interface, UNIX operating systems, networking, compilers, Xlib, X-Graphics; documentation, updates, availability, distribution; code management in large collaborations, keeping track of program versions; and quality assurance, testing, conventions, standards.
Garcia-Pinillos, Felipe; Cozar-Barba, Manuela; Munoz-Jimenez, Marcos; Soto-Hermoso, Victor; Latorre-Roman, Pedro
2016-05-01
With ageing, physical and cognitive functions become impaired. Analyzing and determining the association between the two functions can facilitate the prevention and diagnosis of associated problems. Some previous works have proposed batteries of physical performance tests to determine both physical and cognitive function. However, only a few studies have used the gait speed (GS) test as a tool to evaluate parameters representative of health in the elderly, such as functionality, mobility, independence, autonomy, and comorbidity. Therefore, the aim of this study was to determine the association between physical and cognitive functions in older people (over 65 years old) and to identify the most appropriate physical test for assessing cognitive impairment, functional independence, comorbidity, and perceived health in this population. One hundred six older adults (38 men, 68 women) participated voluntarily in this cross-sectional study. To assess physical function, handgrip strength, GS, and 30-s chair stand tests were performed, along with body composition analysis. To evaluate cognitive function, the Mini-Mental State Examination, Barthel index, and Charlson index were employed. No significant differences (P ≥ 0.05) between sexes were found. Multiple regression analysis of the Mini-Mental State Examination and physical fitness variables, adjusted for age and sex, indicates that GS is a predictor of Mini-Mental State Examination score (R² = 0.138). The results showed that GS is an important predictor of functional capacity (physical and cognitive function) in adults over 65 years old. © 2015 The Authors. Psychogeriatrics © 2015 Japanese Psychogeriatric Society.
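The regression step reported above (MMSE score predicted from gait speed, adjusted for age and sex) can be sketched with ordinary least squares. The data below are synthetic, generated only to illustrate the model form; the coefficients and R² do not correspond to the study's values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration (not study data): MMSE partly predicted by
# gait speed (m/s), adjusted for age and sex, as in the abstract.
n = 106
age  = rng.uniform(65, 90, n)
sex  = rng.integers(0, 2, n).astype(float)
gs   = rng.uniform(0.4, 1.4, n)
mmse = 20 + 5 * gs - 0.05 * (age - 65) + rng.normal(0, 2, n)

X = np.column_stack([np.ones(n), gs, age, sex])   # design matrix
beta, *_ = np.linalg.lstsq(X, mmse, rcond=None)   # OLS fit

pred = X @ beta
r2 = 1 - np.sum((mmse - pred) ** 2) / np.sum((mmse - mmse.mean()) ** 2)
```

`beta[1]` is the adjusted GS coefficient; a modest R², as in the abstract, means GS is a significant but far-from-complete predictor of cognitive score.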
SpacePy - a Python-based library of tools for the space sciences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morley, Steven K; Welling, Daniel T; Koller, Josef
Space science deals with the bodies within the solar system and the interplanetary medium; the primary focus is on atmospheres and above - at Earth the short timescale variation in the geomagnetic field, the Van Allen radiation belts and the deposition of energy into the upper atmosphere are key areas of investigation. SpacePy is a package for Python, targeted at the space sciences, that aims to make basic data analysis, modeling and visualization easier. It builds on the capabilities of the well-known NumPy and MatPlotLib packages. Publication quality output direct from analyses is emphasized. The SpacePy project seeks to promote accurate and open research standards by providing an open environment for code development. In the space physics community there has long been a significant reliance on proprietary languages that restrict free transfer of data and reproducibility of results. By providing a comprehensive, open-source library of widely used analysis and visualization tools in a free, modern and intuitive language, we hope that this reliance will be diminished. SpacePy includes implementations of widely used empirical models, statistical techniques used frequently in space science (e.g. superposed epoch analysis), and interfaces to advanced tools such as electron drift shell calculations for radiation belt studies. SpacePy also provides analysis and visualization tools for components of the Space Weather Modeling Framework - currently this only includes the BATS-R-US 3-D magnetohydrodynamic model and the RAM ring current model - including streamline tracing in vector fields. Further development is currently underway. External libraries, which include well-known magnetic field models, high-precision time conversions and coordinate transformations, are wrapped for access from Python using SWIG and f2py. The rest of the tools have been implemented directly in Python.
The provision of open-source tools to perform common tasks will promote openness in the analysis methods employed in scientific studies and will give access to advanced tools to all space scientists regardless of affiliation or circumstance.
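Superposed epoch analysis, one of the statistical techniques SpacePy implements, is easy to illustrate with plain NumPy (this generic sketch does not use SpacePy's own API): fixed windows of a time series are stacked around a set of key times and averaged, so a repeatable response emerges from noise.

```python
import numpy as np

def superposed_epoch(series, epoch_indices, window):
    """Stack windows of a 1-D series around key times and average them."""
    segments = [series[i - window : i + window + 1]
                for i in epoch_indices
                if i - window >= 0 and i + window < len(series)]
    stack = np.vstack(segments)
    return stack.mean(axis=0), stack.std(axis=0)

# Synthetic demo: a bump of height 1 at each epoch, buried in noise.
rng = np.random.default_rng(1)
x = rng.normal(0, 0.5, 5000)
epochs = np.arange(100, 4900, 250)
x[epochs] += 1.0
mean, std = superposed_epoch(x, epochs, window=10)
```

Averaging over the 20 epochs suppresses the noise by roughly a factor of √20, so the bump at the window center stands well above the background.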
Faye, Alexandrine; Jacquin-Courtois, Sophie; Osiurak, François
2018-03-01
The purpose of this study was to deepen our understanding of the cognitive bases of human tool use based on the technical reasoning hypothesis (i.e., the reasoning-based approach). This approach assumes that tool use is supported by the ability to reason about an object's physical properties (e.g., length, weight, strength, etc.) to perform mechanical actions (e.g., lever). In this framework, an important issue is to understand whether left-brain-damaged (LBD) individuals with tool-use deficits are still able to estimate the physical object's properties necessary to use the tool. Eleven LBD patients and 12 control participants performed 3 original experimental tasks: Use-Length (visual evaluation of the length of a stick to bring down a target), Visual-Length (to visually compare objects of different lengths) and Addition-Length (to visually compare added lengths). Participants were also tested on conventional tasks: Familiar Tool Use and Mechanical Problem-Solving (novel tools). LBD patients had more difficulties than controls on both conventional tasks. No significant differences were observed for the 3 experimental tasks. These results extend the reasoning-based approach, stressing that it might not be the representation of length that is impaired in LBD patients, but rather the ability to generate mechanical actions based on physical object properties. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
ERIC Educational Resources Information Center
Courtright, Joachim
2017-01-01
INTRODUCTION. The learning style of a student is an important factor in their ability to gain knowledge. This is especially important in challenging curriculums such as the Doctor of Physical Therapy (DPT) program. A common tool to assess one's learning style is The Kolb Learning Styles Inventory (LSI). A common tool used to measure the…
Guerra, Ricardo Oliveira; Oliveira, Bruna Silva; Alvarado, Beatriz Eugenia; Curcio, Carmen Lucia; Rejeski, W Jack; Marsh, Anthony P; Ip, Edward H; Barnard, Ryan T; Guralnik, Jack M; Zunzunegui, Maria Victoria
2016-01-01
Aim To assess the reliability and the validity of Portuguese- and Spanish-translated versions of the video-based short-form Mobility Assessment Tool in assessing self-reported mobility, and to provide evidence for the applicability of these videos in elderly Latin American populations as a complement to physical performance measures. Methods The sample consisted of 300 elderly participants (150 from Brazil, 150 from Colombia) recruited at neighborhood social centers. Mobility was assessed with the Mobility Assessment Tool, and compared with the Short Physical Performance Battery score and self-reported functional limitations. Reliability was calculated using intraclass correlation coefficients. Multiple linear regression analyses were used to assess associations among mobility assessment tools, health, and sociodemographic variables. Results A significant gradient of increasing Mobility Assessment Tool score with better physical function was observed for both self-reported and objective measures, and in each city. Associations between self-reported mobility and health were strong and significant. Mobility Assessment Tool scores were lower in women at both sites. Intraclass correlation coefficients of the Mobility Assessment Tool were 0.94 (95% confidence interval 0.90–0.97) in Brazil and 0.81 (95% confidence interval 0.66–0.91) in Colombia. Mobility Assessment Tool scores were lower in Manizales than in Natal after adjustment by Short Physical Performance Battery, self-rated health and sex. Conclusions These results provide evidence for high reliability and good validity of the Mobility Assessment Tool in its Spanish and Portuguese versions used in Latin American populations. In addition, the Mobility Assessment Tool can detect mobility differences related to environmental features that cannot be captured by objective performance measures. PMID:24666718
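The intraclass correlation coefficients reported above can be computed with the Shrout-Fleiss two-way ICC(2,1) formula; a self-contained sketch on synthetic test-retest data (not the study's data) follows.

```python
import numpy as np

def icc_2_1(ratings):
    """Two-way random-effects, absolute-agreement ICC(2,1)
    (Shrout & Fleiss) for an n-subjects x k-occasions matrix."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)
    col_means = ratings.mean(axis=0)
    ss_rows = k * np.sum((row_means - grand) ** 2)   # between subjects
    ss_cols = n * np.sum((col_means - grand) ** 2)   # between occasions
    ss_err = np.sum((ratings - grand) ** 2) - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Synthetic test-retest: second measurement tracks the first closely.
rng = np.random.default_rng(2)
truth = rng.normal(50, 10, 30)
scores = np.column_stack([truth + rng.normal(0, 2, 30),
                          truth + rng.normal(0, 2, 30)])
icc = icc_2_1(scores)
```

When between-subject variance dominates measurement error, as here, the ICC approaches 1, the regime of the 0.94 and 0.81 values reported in the abstract.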
Estimating ankle rotational constraints from anatomic structure
NASA Astrophysics Data System (ADS)
Baker, H. H.; Bruckner, Janice S.; Langdon, John H.
1992-09-01
Three-dimensional biomedical data obtained through tomography provide exceptional views of biological anatomy. While visualization is one of the primary purposes for obtaining these data, other more quantitative and analytic uses are possible. These include modeling of tissue properties and interrelationships, simulation of physical processes, interactive surgical investigation, and analysis of kinematics and dynamics. As an application of our research in modeling tissue structure and function, we have been working to develop interactive and automated tools for studying joint geometry and kinematics. We focus here on discrimination of morphological variations in the foot and determining the implications of these on both hominid bipedal evolution and physical therapy treatment for foot disorders.
Suited crewmember productivity.
Barer, A S; Filipenkov, S N
1994-01-01
Analysis of the extravehicular activity (EVA) sortie experience gained in the former Soviet Union and physiologic hygienic aspect of space suit design and development shows that crewmember productivity is related to the following main factors: -space suit microclimate (gas composition, pressure and temperature); -limitation of motion activity and perception, imposed by the space suit; -good crewmember training in the ground training program; -level of crewmember general physical performance capabilities in connection with mission duration and intervals between sorties; -individual EVA experience (with accumulation) at which workmanship improves, while metabolism, physical and emotional stress decreases; -concrete EVA duration and work rate; -EVA bioengineering, including selection of tools, work station, EVA technology and mechanization.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aly, A.; Avramova, Maria; Ivanov, Kostadin
To correctly describe and predict this hydrogen distribution there is a need for multi-physics coupling to provide accurate three-dimensional azimuthal, radial, and axial temperature distributions in the cladding. Coupled high-fidelity reactor-physics codes with a sub-channel code as well as with a computational fluid dynamics (CFD) tool have been used to calculate detailed temperature distributions. These high-fidelity coupled neutronics/thermal-hydraulics code systems are coupled further with the fuel-performance BISON code with a kernel (module) for hydrogen. Both hydrogen migration and precipitation/dissolution are included in the model. Results from this multi-physics analysis are validated against calculations of hydrogen distribution using models informed by data from hydrogen experiments and PIE data.
Physics of Shock Compression and Release: NEMD Simulations of Tantalum and Silicon
NASA Astrophysics Data System (ADS)
Hahn, Eric; Meyers, Marc; Zhao, Shiteng; Remington, Bruce; Bringa, Eduardo; Germann, Tim; Ravelo, Ramon; Hammerberg, James
2015-06-01
Shock compression and release allow us to evaluate physical deformation and damage mechanisms occurring in extreme environments. The SPaSM and LAMMPS molecular dynamics codes were employed to simulate single and polycrystalline tantalum and silicon at strain rates above 10⁸ s⁻¹. Visualization and analysis were accomplished using OVITO, the Crystal Analysis Tool, and a redesigned orientation imaging function implemented in SPaSM. A comparison between interatomic potentials for both Si and Ta (as pertaining to shock conditions) is conducted and the influence on phase transformation and plastic relaxation is discussed. Partial dislocations, shear-induced disordering, and metastable phase changes are observed in compressed silicon. For tantalum, grain boundary and twin intersections are evaluated for their role in ductile spallation. Finally, the temperature-dependent response of both Ta and Si is investigated.
NASA Astrophysics Data System (ADS)
Kavcar, Nevzat; Özen, Ali Ihsan
2017-02-01
The purpose of this work is to determine physics teacher candidates' views on the content and general properties of the Physics 11 textbook aligned with the 2013 Secondary School Physics Curriculum. The sample consisted of 24 teacher candidates in the 2015-2016 school year; the study used a survey model based on qualitative research techniques, carried out through document analysis. The data collection tools were files containing 51 and 28 open-ended questions on the subject content and general properties of the textbook. It was concluded that the textbook was sufficient in terms of discussion, investigation, daily-life context, visual elements, and traces of permanent learning, but insufficient in its design elements and in offering only one project in the Electricity and Magnetism unit. Affective-domain activities could be added to the textbook, a teacher guide book and a teaching packet for the book could be provided, and the issues highlighted and the overall quality of the textbook could be improved.
Development of a quantitative tool to assess the content of physical therapy for infants.
Blauw-Hospers, Cornill H; Dirks, Tineke; Hulshof, Lily J; Hadders-Algra, Mijna
2010-01-01
The study aim was to describe and quantify physical therapy interventions for infants at high risk for developmental disorders. An observation protocol was developed based on knowledge about infant physical therapy and analysis of directly observable physiotherapeutic (PT) actions. The protocol's psychometric quality was assessed. Videos of 42 infant physical therapy sessions at 4 or 6 months of corrected age were analyzed. The observation protocol classified PT actions into 8 mutually exclusive categories. Virtually all PT actions during treatment could be classified. Inter- and intrarater agreements were satisfactory (intraclass correlations, 0.68-1.00). Approximately 40% of treatment time was spent challenging the infant to produce motor behavior on its own, whereas facilitation techniques were applied for approximately 30% of the time. Tradition-based sessions could be differentiated from function-oriented ones. It is possible to document PT actions during physical therapy treatment of infants at high risk for cerebral palsy in a systematic, standardized, and reliable way.
Staples, William H; Killian, Clyde B
2012-08-01
This study investigated the factor structure of an instrument to measure the attitudes and beliefs of physical therapist (PT) practitioners about working with people with a dementia disorder. A survey was mailed to every skilled nursing facility in Indiana (n = 495) for completion by a PT or physical therapist assistant. The survey was developed to examine whether the severity of Alzheimer's disease (AD) impacts the attitudes of physical therapy practitioners. Of the 12 attitudinal questions, 11 were significant (P < .001) concerning how the severity of a diagnosis of AD (early, middle, and late) impacts the attitudes of people in physical therapy practice. Principal component analysis identified 3 factors with eigenvalues of 3.3 or higher, accounting for 43% of the cumulative variance. These factors include professional competence, resources, and conscientiousness. This brief instrument could serve as an assessment tool to determine whether PT practitioners exhibit therapeutic nihilism when working with people with a dementia disorder.
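Principal component analysis with an eigenvalue criterion, as used above, can be sketched from the correlation matrix of the survey items. The 12-item data here are synthetic, with one built-in factor, purely to illustrate the procedure.

```python
import numpy as np

def principal_components(data, n_keep=None):
    """PCA via eigendecomposition of the correlation matrix, a usual
    choice for Likert-type survey items on differing scales."""
    corr = np.corrcoef(data, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)
    order = np.argsort(eigvals)[::-1]        # largest variance first
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    explained = eigvals / eigvals.sum()      # fraction of variance
    return eigvals[:n_keep], eigvecs[:, :n_keep], explained[:n_keep]

# Synthetic 12-item survey with one strong underlying factor.
rng = np.random.default_rng(3)
factor = rng.normal(size=(200, 1))
items = factor @ rng.uniform(0.5, 1.0, (1, 12)) + rng.normal(0, 0.8, (200, 12))
vals, vecs, frac = principal_components(items, n_keep=3)
```

Components are retained when their eigenvalue exceeds a chosen cutoff, and the cumulative `frac` entries give the "percent of variance accounted for" figure quoted in such studies.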
Information-theoretic metric as a tool to investigate nonclassical correlations
NASA Astrophysics Data System (ADS)
Rudolph, Alexander L.; Lamine, Brahim; Joyce, Michael; Vignolles, Hélène; Consiglio, David
2014-06-01
We report on a project to introduce interactive learning strategies (ILS) to physics classes at the Université Pierre et Marie Curie, one of the leading science universities in France. In Spring 2012, instructors in two large introductory classes, first-year, second-semester mechanics, and second-year introductory electricity and magnetism, enrolling approximately 500 and 250 students, respectively, introduced ILS into some, but not all, of the sections of each class. The specific ILS utilized were think-pair-share questions and Peer Instruction in the main lecture classrooms, and University of Washington Tutorials for Introductory Physics in recitation sections. Pre- and postinstruction assessments [Force Concept Inventory (FCI) and Conceptual Survey of Electricity and Magnetism (CSEM), respectively] were given, along with a series of demographic questions. Since not all lecture or recitation sections in these classes used ILS, we were able to compare the results of the FCI and CSEM between interactive and noninteractive classes taught simultaneously with the same curriculum. We also analyzed final exam results, as well as the results of student and instructor attitude surveys between classes. In our analysis, we argue that multiple linear regression modeling is superior to other common analysis tools, including normalized gain. Our results show that ILS are effective at improving student learning by all measures used: research-validated concept inventories and final exam scores, on both conceptual and traditional problem-solving questions. Multiple linear regression analysis reveals that interactivity in the classroom is a significant predictor of student learning, showing a similar or stronger relationship with student learning than such ascribed characteristics as parents’ education, and achieved characteristics such as grade point average and hours studied per week. 
Analysis of student and instructor attitudes shows that both groups believe that ILS improve student learning in the physics classroom and increase student engagement and motivation. All of the instructors who used ILS in this study plan to continue their use.
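The contrast the authors draw, multiple linear regression versus normalized gain, can be made concrete with a short sketch. Hake's normalized gain collapses pre- and post-test into one number, while the regression keeps the pre-test score and an interactivity indicator as separate predictors. The scores below are synthetic; the coefficient values are not the study's results.

```python
import numpy as np

def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: fraction of possible improvement realized."""
    return (post - pre) / (max_score - pre)

# Synthetic FCI-style scores: post-test modeled as a function of
# pre-test plus an interactivity indicator, rather than collapsing
# both tests into a single gain figure.
rng = np.random.default_rng(4)
n = 400
interactive = rng.integers(0, 2, n).astype(float)
pre = rng.uniform(20, 60, n)
post = 10 + 0.8 * pre + 12 * interactive + rng.normal(0, 6, n)

X = np.column_stack([np.ones(n), pre, interactive])
beta, *_ = np.linalg.lstsq(X, post, rcond=None)   # OLS fit
```

Here `beta[2]` estimates the interactivity effect while controlling for incoming preparation, which is the advantage of the regression approach: further covariates (parents' education, GPA, hours studied) can be added as extra columns of `X`.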
Novel Tool for Simultaneous Carbon and Nitrogen Stable Isotope Analyses in Aqueous Samples
NASA Astrophysics Data System (ADS)
Federherr, E.; Schmidt, T. C.; Cerli, C.; Kalbitz, K.; Kupka, H. J.; Lange, L.; Dunsbach, R.; Panetta, R. J.; Kasson, A.
2014-12-01
Investigation of the transformation and transport processes of carbon and nitrogen in ecosystems plays an important role in understanding and predicting their dynamics and role in biogeochemistry. Consequently, suitable and accurate methods for analyzing the concentration as well as the stable isotopic composition of carbon and nitrogen in waters and aqueous solutions are essential. Traditionally, dissolved carbon and nitrogen stable isotope analysis (SIA) is performed using either offline sample preparation followed by elemental analysis isotope ratio mass spectrometry (EA/IRMS) or a modified wet-chemical-oxidation device coupled to IRMS. Recently we presented a high temperature combustion (HTC) system, which significantly improves upon these methods for dissolved organic carbon (DOC) SIA. The analysis of δ15N of dissolved nitrogen still has large limitations. Its low concentration makes EA/IRMS laborious as well as time- and sample-consuming. Systems based on wet chemical oxidation-IRMS bear the risk of sensitivity loss as well as of fractionation due to incomplete mineralization. In addition, the high solubility of molecular nitrogen in water remains a technical challenge, as it requires additional separation steps to distinguish between physically dissolved nitrogen and bound nitrogen. Further development of our HTC system led to the implementation of δ15N determination, now coupled into a novel total organic carbon (TOC) analyzing system especially designed for SIA of both carbon and nitrogen. An integrated, innovative purge-and-trap technique (peak focusing) for nitrogen, with an aluminosilicate adsorber and a Peltier-element-based cooling system, in combination with a high injection volume (up to 3 mL) and a favorable carrier gas flow, significantly improves sensitivity. Total nitrogen concentrations down to 1 ppm and below can be measured with a precision of ≤ 0.5‰.
To lower the background caused by physically dissolved nitrogen, a new membrane-vacuum-based degasser was designed for online separation of physically dissolved nitrogen. This novel HTC system, the "iso TOC cube", provides an innovative tool with large potential for investigating biogeochemical carbon and nitrogen cycles.
NASA Technical Reports Server (NTRS)
Liu, Yi; Anusonti-Inthra, Phuriwat; Diskin, Boris
2011-01-01
A physics-based, systematically coupled, multidisciplinary prediction tool (MUTE) for rotorcraft noise was developed and validated with a wide range of flight configurations and conditions. MUTE is an aggregation of multidisciplinary computational tools that accurately and efficiently model the physics of the source of rotorcraft noise, and predict the noise at far-field observer locations. It uses systematic coupling approaches among multiple disciplines including Computational Fluid Dynamics (CFD), Computational Structural Dynamics (CSD), and high fidelity acoustics. Within MUTE, advanced high-order CFD tools are used around the rotor blade to predict the transonic flow (shock wave) effects, which generate the high-speed impulsive noise. Predictions of the blade-vortex interaction noise in low speed flight are also improved by using the Particle Vortex Transport Method (PVTM), which preserves the wake flow details required for blade/wake and fuselage/wake interactions. The accuracy of the source noise prediction is further improved by utilizing a coupling approach between CFD and CSD, so that the effects of key structural dynamics, elastic blade deformations, and trim solutions are correctly represented in the analysis. The blade loading information and/or the flow field parameters around the rotor blade predicted by the CFD/CSD coupling approach are used to predict the acoustic signatures at far-field observer locations with a high-fidelity noise propagation code (WOPWOP3). The predicted results from the MUTE tool for rotor blade aerodynamic loading and far-field acoustic signatures are compared and validated with a variety of experimental data sets, such as UH60-A data, DNW test data and HART II test data.
NASA Technical Reports Server (NTRS)
Hoff, Claus; Cady, Eric; Chainyk, Mike; Kissil, Andrew; Levine, Marie; Moore, Greg
2011-01-01
The efficient simulation of multidisciplinary thermo-opto-mechanical effects in precision deployable systems has for years been limited by numerical toolsets that do not necessarily share the same finite element basis, level of mesh discretization, data formats, or compute platforms. Cielo, a general purpose integrated modeling tool funded by the Jet Propulsion Laboratory and the Exoplanet Exploration Program, addresses shortcomings in the current state of the art via features that enable the use of a single, common model for thermal, structural and optical aberration analysis, producing results of greater accuracy, without the need for results interpolation or mapping. This paper will highlight some of these advances, and will demonstrate them within the context of detailed external occulter analyses, focusing on in-plane deformations of the petal edges for both steady-state and transient conditions, with subsequent optical performance metrics including intensity distributions at the pupil and image plane.
Verwey, R; van der Weegen, S; Spreeuwenberg, M; Tange, H; van der Weijden, T; de Witte, L
2016-01-01
A monitoring-and-feedback tool was developed to stimulate physical activity by giving feedback on physical activity performance to patients and practice nurses. The tool consists of an activity monitor (accelerometer), wirelessly connected to a Smartphone and a web application. Use of this tool is combined with a behaviour change counselling protocol (the Self-management Support Programme) based on the Five A's model (Assess-Advise-Agree-Assist-Arrange). To examine the reach, implementation and satisfaction with the counselling protocol and the tool. A process evaluation was conducted in two intervention groups of a three-armed cluster randomised controlled trial, in which the counselling protocol was evaluated with (group 1, n=65) and without (group 2, n=66) the use of the tool using a mixed methods design. Sixteen family practices in the South of the Netherlands. Practice nurses (n=20) and their associated physically inactive patients (n=131), diagnosed with Chronic Obstructive Pulmonary Disease or Type 2 Diabetes, aged between 40 and 70 years old, and having access to a computer with an Internet connection. Semi structured interviews about the receipt of the intervention were conducted with the nurses and log files were kept regarding the consultations. After the intervention, questionnaires were presented to patients and nurses regarding compliance to and satisfaction with the interventions. Functioning and use of the tool were also evaluated by system and helpdesk logging. Eighty-six percent of patients (group 1: n=57 and group 2: n=56) and 90% of nurses (group 1: n=10 and group 2: n=9) responded to the questionnaires. The execution of the Self-management Support Programme was adequate; in 83% (group 1: n=52, group 2: n=57) of the patients, the number and planning of the consultations were carried out as intended. Eighty-eight percent (n=50) of the patients in group 1 used the tool until the end of the intervention period. 
Technical problems occurred in 58% (n=33). Participants from group 1 were significantly more positive: patients: χ²(2, N=113)=11.17, p=0.004, and nurses: χ²(2, N=19)=6.37, p=0.040. Use of the tool led to greater awareness of the importance of physical activity, more discipline in carrying it out and more enjoyment. The interventions were adequately executed and received as planned. Patients from both groups appreciated the focus on physical activity and the personal attention given by the nurse. The most appreciated aspect of the combined intervention was the tool, although technical problems frequently occurred. Patients with the tool estimated greater improvement in physical activity than patients without the tool. Copyright © 2015 Elsevier Ltd. All rights reserved.
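The between-group comparison reported above can be illustrated mechanically. A minimal sketch of a chi-square test of independence with scipy, using an invented 2×3 contingency table (tool group × satisfaction rating) whose counts are purely hypothetical and merely sum to the reported N=113:

```python
# Hypothetical counts -- NOT the study's data. Rows: group 1 (with tool)
# and group 2 (without tool); columns: low / medium / high satisfaction.
import numpy as np
from scipy.stats import chi2_contingency

table = np.array([[5, 15, 37],
                  [15, 25, 16]])   # sums to N = 113

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2({dof}, N={table.sum()}) = {chi2:.2f}, p = {p:.3f}")
```

A 2×3 table gives (2−1)(3−1)=2 degrees of freedom, matching the df reported in the abstract.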
Rasch model based analysis of the Force Concept Inventory
NASA Astrophysics Data System (ADS)
Planinic, Maja; Ivanjek, Lana; Susac, Ana
2010-06-01
The Force Concept Inventory (FCI) is an important diagnostic instrument which is widely used in the field of physics education research. It is therefore very important to evaluate and monitor its functioning using different tools for statistical analysis. One such tool is the stochastic Rasch model, which enables construction of linear measures for persons and items from raw test scores and which can provide important insight into the structure and functioning of the test (how item difficulties are distributed within the test, how well the items fit the model, and how well the items work together to define the underlying construct). The data for the Rasch analysis come from large-scale research conducted in 2006-07, which investigated Croatian high school students' conceptual understanding of mechanics on a representative sample of 1676 students (age 17-18 years). The instrument used in the research was the FCI. The average FCI score for the whole sample was found to be (27.7±0.4)%, indicating that most of the students were still non-Newtonians at the end of high school, despite the fact that physics is a compulsory subject in Croatian schools. The large set of obtained data was analyzed with the Rasch measurement computer software WINSTEPS 3.66. Since the FCI is routinely used as pretest and post-test on two very different types of population (non-Newtonian and predominantly Newtonian), an additional predominantly Newtonian sample (N=141, average FCI score of 64.5%) of first-year students enrolled in an introductory physics course at the University of Zagreb was also analyzed. The Rasch model based analysis suggests that the FCI has succeeded in defining a sufficiently unidimensional construct for each population. The analysis of fit of data to the model found no grossly misfitting items which would degrade measurement. Some items with larger misfit and items with significantly different difficulties in the two samples of students do require further examination. 
The analysis revealed some problems with item distribution in the FCI and suggested that the FCI may function differently in non-Newtonian and predominantly Newtonian populations. Some possible improvements of the test are suggested.
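For readers unfamiliar with the model, the dichotomous Rasch model underlying such analyses has a simple closed form: the probability of a correct response depends only on the difference between person ability θ and item difficulty b, both on a logit scale. A minimal sketch with illustrative values (not estimates from the FCI data):

```python
# Dichotomous Rasch model: P(correct) = exp(theta - b) / (1 + exp(theta - b))
import math

def rasch_p(theta: float, b: float) -> float:
    """Probability that a person of ability theta answers an item of
    difficulty b correctly (both in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# a person whose ability equals the item's difficulty succeeds 50% of the time
print(rasch_p(0.0, 0.0))               # 0.5
# an easier item (b = -1 logit) raises that chance
print(round(rasch_p(0.0, -1.0), 3))    # 0.731
```

Software such as WINSTEPS estimates the θ and b values jointly from the raw response matrix; the point here is only the functional form.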
NASA Technical Reports Server (NTRS)
Smith, Jeffrey
2003-01-01
The Bio-Visualization, Imaging and Simulation (BioVIS) Technology Center at NASA's Ames Research Center is dedicated to developing and applying advanced visualization, computation and simulation technologies to support NASA Space Life Sciences research and the objectives of the Fundamental Biology Program. Research ranges from high resolution 3D cell imaging and structure analysis, virtual environment simulation of fine sensory-motor tasks, computational neuroscience and biophysics to biomedical/clinical applications. Computer simulation research focuses on the development of advanced computational tools for astronaut training and education. Virtual Reality (VR) and Virtual Environment (VE) simulation systems have become important training tools in many fields from flight simulation to, more recently, surgical simulation. The type and quality of training provided by these computer-based tools ranges widely, but the value of real-time VE computer simulation as a method of preparing individuals for real-world tasks is well established. Astronauts routinely use VE systems for various training tasks, including Space Shuttle landings, robot arm manipulations and extravehicular activities (space walks). Currently, there are no VE systems to train astronauts for basic and applied research experiments which are an important part of many missions. The Virtual Glovebox (VGX) is a prototype VE system for real-time physically-based simulation of the Life Sciences Glovebox where astronauts will perform many complex tasks supporting research experiments aboard the International Space Station. The VGX consists of a physical display system utilizing dual LCD projectors and circular polarization to produce a desktop-sized 3D virtual workspace. Physically-based modeling tools (Arachi Inc.) provide real-time collision detection, rigid body dynamics, physical properties and force-based controls for objects. 
The human-computer interface consists of two magnetic tracking devices (Ascension Inc.) attached to instrumented gloves (Immersion Inc.) which co-locate the user's hands with hand/forearm representations in the virtual workspace. Force-feedback is possible in a work volume defined by a Phantom Desktop device (SensAble Inc.). Graphics are written in OpenGL. The system runs on a 2.2 GHz Pentium 4 PC. The prototype VGX provides astronauts and support personnel with a real-time physically-based VE system to simulate basic research tasks both on Earth and in the microgravity of Space. The immersive virtual environment of the VGX also makes it a useful tool for virtual engineering applications including CAD development, procedure design and simulation of human-machine systems in a desktop-sized work volume.
Biewick, Laura
2006-01-01
A geographic information system (GIS) focusing on the Upper Cretaceous Navarro and Taylor Groups in the Gulf Coast region was developed as a visual-analysis tool for the U.S. Geological Survey's 2003 assessment of undiscovered, technically recoverable oil and natural gas resources in the Western Gulf Province. The Central Energy Resources Team of the U.S. Geological Survey has also developed an Internet Map Service to deliver the GIS data to the general public. This mapping tool utilizes information from a database about the oil and natural gas endowment of the United States - including physical locations of geologic and geographic data - and converts the data into visual layers. Portrayal and analysis of geologic features on an interactive map provide an excellent tool for understanding domestic oil and gas resources for strategic planning, formulating economic and energy policies, evaluating lands under the purview of the Federal Government, and developing sound environmental policies. Assessment results can be viewed and analyzed or downloaded from the internet web site.
Marcatto, Francesco; D'Errico, Giuseppe; Di Blas, Lisa; Ferrante, Donatella
2011-01-01
The aim of this paper is to present a preliminary validation of an Italian adaptation of the HSE Management Standards Work-Related Stress Indicator Tool (IT), an instrument for assessing work-related stress at the organizational level, originally developed in Britain by the Health and Safety Executive. A scale that assesses the physical work environment has been added to the original version of the IT. A total of 190 employees of the University of Trieste were enrolled in the study. A confirmatory analysis showed a satisfactory fit of the eight-factor structure of the instrument. Further psychometric analysis showed adequate internal consistency of the IT scales and good criterion validity, as evidenced by the correlations with self-perception of stress, work satisfaction and motivation. In conclusion, the Indicator Tool proved to be a valid and reliable instrument for the assessment of work-related stress at the organizational level, and it is also compatible with the instructions provided by the Ministry of Labour and Social Policy (Circular letter 18/11/2010).
Tool use as distributed cognition: how tools help, hinder and define manual skill.
Baber, Chris; Parekh, Manish; Cengiz, Tulin G
2014-01-01
Our thesis in this paper is that, in order to appreciate the interplay between cognitive (goal-directed) and physical performance in tool use, it is necessary to determine the role that representations play in the use of tools. We argue that rather than being solely a matter of internal (mental) representation, tool use makes use of the external representations that define the human-environment-tool-object system. This requires the notion of Distributed Cognition to encompass not simply the manner in which artifacts represent concepts but also how they represent praxis. Our argument is that this can be extended to include how artifacts-in-context afford use and how this response to affordances constitutes a particular form of skilled performance. By artifacts-in-context, we do not mean solely the affordances offered by the physical dimensions of a tool but also the interaction between the tool and the object that it is being used on. From this, "affordance" does not simply relate to the physical appearance of the tool but anticipates subsequent actions by the user directed towards the goal of changing the state of the object, and this is best understood in terms of the "complementarity" in the system. This assertion raises two challenges which are explored in this paper. The first is to distinguish "affordance" from the adaptation that one might expect to see in descriptions of motor control; when we speak of "affordance" as a form of anticipation, don't we just mean the ability to adjust movements in response to physical demands? The second is to distinguish "affordance" from a schema of the tool; when we talk about anticipation, don't we just mean the ability to call on a schema representing a "recipe" for using that tool for that task? This question of representation, specifically what knowledge needs to be represented in tool use, is central to this paper.
Jadczak, A D; Mahajan, N; Visvanathan, R
2017-01-01
Geriatric assessment tools are applicable to the general geriatric population; however, their feasibility in frail older adults is yet to be determined. The study aimed to determine the feasibility of standardised geriatric assessment tools and physical exercises in hospitalised frail older adults. Various assessment tools including the FRAIL Screen, the Charlson Comorbidity Index, the SF-36, the Trail Making Test (TMT), the Rapid Cognitive Screen, the Self Mini Nutritional Assessment (MNA-SF) and the Lawton iADL as well as standard physical exercises were assessed using observational protocols. The FRAIL Screen, MNA-SF, Rapid Cognitive Screen, Lawton iADL and the physical exercises were deemed to be feasible with only minor comprehension, execution and safety issues. The TMT was not considered to be feasible and the SF-36 should be replaced by its shorter form, the SF-12. In order to ensure the validity of these findings a study with a larger sample size should be undertaken.
Identifying factors of comfort in using hand tools.
Kuijt-Evers, L F M; Groenesteijn, L; de Looze, M P; Vink, P
2004-09-01
To design comfortable hand tools, knowledge about comfort/discomfort in using hand tools is required. We investigated which factors determine comfort/discomfort in using hand tools according to users. To this end, descriptors of comfort/discomfort in using hand tools were collected from the literature and from interviews. Next, the relatedness of a selection of the descriptors to comfort in using hand tools was investigated. Six comfort factors could be distinguished (functionality, posture and muscles, irritation and pain of hand and fingers, irritation of hand surface, handle characteristics, aesthetics). These six factors can be classified into three meaningful groups: functionality, physical interaction and appearance. The main conclusions were that (1) the same descriptors were related to comfort and discomfort in using hand tools, (2) descriptors of functionality are most related to comfort in using hand tools, followed by descriptors of physical interaction, and (3) descriptors of appearance play only a secondary role in comfort in using hand tools.
Adaptive Modeling, Engineering Analysis and Design of Advanced Aerospace Vehicles
NASA Technical Reports Server (NTRS)
Mukhopadhyay, Vivek; Hsu, Su-Yuen; Mason, Brian H.; Hicks, Mike D.; Jones, William T.; Sleight, David W.; Chun, Julio; Spangler, Jan L.; Kamhawi, Hilmi; Dahl, Jorgen L.
2006-01-01
This paper describes initial progress towards the development and enhancement of a set of software tools for rapid adaptive modeling and conceptual design of advanced aerospace vehicle concepts. With demanding structural and aerodynamic performance requirements, these high-fidelity geometry-based modeling tools are essential for rapid and accurate engineering analysis at the early concept development stage. This adaptive modeling tool was used for generating vehicle parametric geometry, outer mold line and detailed internal structural layout of wing, fuselage, skin, spars, ribs, control surfaces, frames, bulkheads, floors, etc., which facilitated rapid finite element analysis, sizing studies and weight optimization. The high-quality outer mold line enabled rapid aerodynamic analysis in order to provide reliable design data at critical flight conditions. Example applications to the structural design of a conventional aircraft and a high-altitude long-endurance vehicle configuration are presented. This work was performed under the Conceptual Design Shop sub-project within the Efficient Aerodynamic Shape and Integration project, under the former Vehicle Systems Program. The project objective was to design and assess unconventional atmospheric vehicle concepts efficiently and confidently. The implementation may also dramatically facilitate physics-based systems analysis for the NASA Fundamental Aeronautics Mission. In addition to providing technology for design and development of unconventional aircraft, the techniques for generation of accurate geometry and internal sub-structure and the automated interface with the high-fidelity analysis codes could also be applied towards the design of vehicles for the NASA Exploration and Space Science Mission projects.
Multi-physics CFD simulations in engineering
NASA Astrophysics Data System (ADS)
Yamamoto, Makoto
2013-08-01
Nowadays Computational Fluid Dynamics (CFD) software is adopted as a design and analysis tool in a great number of engineering fields, and single-physics CFD can be regarded as sufficiently mature from a practical point of view. The main target of existing CFD software is single-phase flows such as water and air. However, many multi-physics problems exist in engineering. Most of them consist of flow coupled with other physics, and the interactions between the different physics are very important. Obviously, multi-physics phenomena are critical in developing machines and processes. A multi-physics phenomenon is typically very complex and difficult to predict simply by adding other physics to a flow simulation. Therefore, multi-physics CFD techniques are still under research and development. This is partly because the processing speed of current computers is not fast enough to conduct a multi-physics simulation, and partly because physical models other than flow physics have not yet been suitably established. In the near future, we therefore have to develop various physical models and efficient CFD techniques in order to carry out successful multi-physics simulations in engineering. In the present paper, I describe the present state of multi-physics CFD simulations, and then show some numerical results, such as ice accretion and the electro-chemical machining process of a three-dimensional compressor blade, which were obtained in my laboratory. Multi-physics CFD simulation is likely to be a key technology in the near future.
ERIC Educational Resources Information Center
Kuhn, Jochen; Vogt, Patrik
2013-01-01
New media technology is becoming more and more important for our daily life as well as for teaching physics. Within the scope of our N.E.T. research project we develop experiments using New Media Experimental Tools (N.E.T.) in physics education and study their influence on students' learning. We want to present the possibilities e.g. of…
ERIC Educational Resources Information Center
de Pereira, Alexsandro Pereira; Lima Junior, Paulo; Rodrigues, Renato Felix
2016-01-01
Explaining is one of the most important everyday practices in science education. In this article, we examine how scientific explanations could serve as cultural tools for members of a group of pre-service physics teachers. Specifically, we focus on their use of explanations about forces of inertia in non-inertial frames of reference. A basic…
1998-06-01
4] By 2010, we should be able to change how we conduct the most intense joint operations. Instead of relying on massed forces and sequential ...not independent, sequential steps. Data probes to support the analysis phase were required to complete the logical models. This generated a need...Networks) Identify Granularity (System Level) - Establish Physical Bounds or Limits to Systems • Determine System Test Configuration and Lineup
Acevedo-Nuevo, M; González-Gil, M T; Solís-Muñoz, M; Láiz-Díez, N; Toraño-Olivera, M J; Carrasco-Rodríguez-Rey, L F; García-González, S; Velasco-Sanz, T R; Martínez-Álvarez, A; Martin-Rivera, B E
2016-01-01
To identify nursing experience of physical restraint management in Critical Care Units. To analyse similarities and differences in nursing experience of physical restraint management according to the clinical context in which nurses are involved. A multicentre phenomenological study was carried out including 14 Critical Care Units in Madrid, classified according to physical restraint use: common/systematic use, lacking/personalised use, and mixed use. Five focus groups (23 participants, selected through purposeful sampling) were convened, continuing until data saturation. Data analysis focused on thematic content analysis following Colaizzi's method. Six main themes emerged: the meaning of physical restraint in Critical Care Units, safety (patients' self-removal of vital devices), contributing factors, feelings, alternatives, and pending issues. Although some themes are common to the 3 Critical Care Unit types, differences in discourse are found with regard to indication, feelings, and the systematic use of pain and sedation measurement tools. In order to achieve a real reduction of physical restraint use in Critical Care Units, it is necessary to have a deep understanding of restraint use in the specific clinical context. As self-removal of vital devices emerges as the central concept, some interventions proposed in other settings may not be effective, requiring alternatives for critical care patients. The discourse variations across the different Critical Care Unit types could highlight key items that determine the use of, and different attitudes towards, physical restraint. Copyright © 2015 Elsevier España, S.L.U. y SEEIUC. All rights reserved.
Use of Ultrasound in Male Infertility: Appropriate Selection of Men for Scrotal Ultrasound.
Armstrong, Joseph M; Keihani, Sorena; Hotaling, James M
2018-05-28
Male factor infertility is a complex and multifaceted problem facing the modern urologist and is identified in 30-40% of infertile couples. This review focuses on the use of ultrasound, as an adjunct screening tool, in the initial evaluation of male infertility. Access to male reproductive urologists for assessment of male infertility is limited, and about a quarter of infertile couples do not complete the male component of their infertility assessment. Ultrasound evaluation of the infertile male is low-cost and non-invasive and helps uncover underlying pathologies that may be missed during the initial assessment. The addition of ultrasound allows the physician to accurately assess testicular anatomy and dimensions, as well as vascular environments, which may help guide treatment decisions. Scrotal ultrasound evaluation, in conjunction with a semen analysis and as an adjunct to physical exam, can be offered in the initial assessment of men who present for infertility consultation given its low cost, non-invasive nature, and ability to detect and discriminate between various etiologies of male infertility. Further, when directed by physical exam and semen analysis findings, it provides a valuable tool to select men for referral to a reproductive urologist, especially for infertile couples who are screened only by reproductive endocrinologists and female infertility specialists.
Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research.
Ercius, Peter; Alaidi, Osama; Rames, Matthew J; Ren, Gang
2015-10-14
Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both physical and biological sciences for many decades, in which 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been separately developed in both physical and biological science, although they share some basic concepts of ET. This review discusses the common basis for 3D characterization, and specifies difficulties and solutions regarding both hard and soft materials research. It is hoped that novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems can be motivated. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
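The projection/back-projection idea at the heart of ET can be shown in a toy 2D sketch. Real reconstructions use filtered or iterative methods; the object, tilt range, and grid size here are arbitrary choices for illustration only:

```python
# Toy tomography: project a 2D "object" at several tilt angles, then
# reconstruct it by unfiltered back-projection.
import numpy as np
from scipy.ndimage import rotate

obj = np.zeros((64, 64))
obj[24:40, 24:40] = 1.0                      # a square "particle"

angles = np.arange(0, 180, 10)               # tilt series, in degrees
sinogram = [rotate(obj, a, reshape=False, order=1).sum(axis=0)
            for a in angles]

recon = np.zeros_like(obj)
for a, proj in zip(angles, sinogram):
    smear = np.tile(proj, (64, 1))           # smear each projection back
    recon += rotate(smear, -a, reshape=False, order=1)
recon /= len(angles)
```

The unfiltered back-projection recovers only a blurred version of the object, which is one reason practical ET pipelines add filtering or iterative refinement on top of this basic scheme.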
NASA Astrophysics Data System (ADS)
Farjoud, Alireza; Taylor, Russell; Schumann, Eric; Schlangen, Timothy
2014-02-01
This paper is focused on the modelling, design, and testing of semi-active magneto-rheological (MR) engine and transmission mounts used in the automotive industry. The purpose is to develop a complete analysis, synthesis, design, and tuning tool that reduces the need for expensive and time-consuming laboratory and field tests. A detailed mathematical model of such devices is developed using multi-physics modelling techniques for physical systems with various energy domains. The model includes all major features of an MR mount including fluid dynamics, fluid track, elastic components, decoupler, rate-dip, gas-charged chamber, MR fluid rheology, magnetic circuit, electronic driver, and control algorithm. Conventional passive hydraulic mounts can also be studied using the same mathematical model. The model is validated using standard experimental procedures. It is used for design and parametric study of mounts; the effects of various geometric and material parameters on the dynamic response of mounts can be studied. Additionally, the model can be used to test various control strategies and to obtain the best vibration isolation performance by tuning control parameters. Another benefit of this work is that nonlinear interactions between sub-components of the mount can be observed and investigated. This is not possible with the simplified linear models currently available.
Rocket Engine Oscillation Diagnostics
NASA Technical Reports Server (NTRS)
Nesman, Tom; Turner, James E. (Technical Monitor)
2002-01-01
Rocket engine oscillating data can reveal many physical phenomena ranging from unsteady flow and acoustics to rotordynamics and structural dynamics. Because of this, engine diagnostics based on oscillation data should employ both signal analysis and physical modeling. This paper describes an approach to rocket engine oscillation diagnostics, types of problems encountered, and example problems solved. Determination of design guidelines and environments (or loads) from oscillating phenomena is required during initial stages of rocket engine design, while the additional tasks of health monitoring, incipient failure detection, and anomaly diagnostics occur during engine development and operation. Oscillations in rocket engines are typically related to flow driven acoustics, flow excited structures, or rotational forces. Additional sources of oscillatory energy are combustion and cavitation. Included in the example problems is a sampling of signal analysis tools employed in diagnostics. The rocket engine hardware includes combustion devices, valves, turbopumps, and ducts. Simple models of an oscillating fluid system or structure can be constructed to estimate pertinent dynamic parameters governing the unsteady behavior of engine systems or components. In the example problems it is shown that simple physical modeling when combined with signal analysis can be successfully employed to diagnose complex rocket engine oscillatory phenomena.
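As one example of the signal-analysis side mentioned above, a dominant oscillation frequency can be extracted from a noisy sensor trace with an FFT. The 180 Hz tone, sample rate, and noise level below are invented purely for illustration:

```python
# Pick out the dominant frequency in a simulated noisy sensor trace.
import numpy as np

fs = 2048.0                                   # sample rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 180.0 * t) + 0.5 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
peak = freqs[np.argmax(spectrum[1:]) + 1]     # skip the DC bin
print(f"dominant oscillation: {peak:.0f} Hz")
```

In practice such a spectral peak would then be compared against known acoustic, rotordynamic, or structural modes of the engine hardware to identify its physical source.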
Lai, Chih-Chin; Tu, Yu-Kang; Wang, Tyng-Guey; Huang, Yi-Ting; Chien, Kuo-Liong
2018-05-01
A variety of different types of exercise are promoted to improve muscle strength and physical performance in older people. We aimed to determine the relative effects of resistance training, endurance training and whole-body vibration on lean body mass, muscle strength and physical performance in older people. A systematic review and network meta-analysis. Adults aged 60 and over. Evidence from randomised controlled trials of resistance training, endurance training and whole-body vibration was combined. The effects of exercise interventions on lean body mass, muscle strength and physical performance were evaluated by conducting a network meta-analysis to compare multiple interventions and usual care. Risk of bias of included studies was assessed using the Cochrane Collaboration's tool. A meta-regression was performed to assess potential effect modifiers. Data were obtained from 30 trials involving 1,405 participants (age range: 60-92 years). No significant differences were found between the effects of exercise or usual care on lean body mass. Resistance training (minimum 6 weeks duration) achieved greater muscle strength improvement than did usual care (12.8 kg; 95% confidence interval [CI]: 8.5-17.0 kg). Resistance training and whole-body vibration were associated with greater physical performance improvement compared with usual care (2.6 times greater [95% CI: 1.3-3.9] and 2.1 times greater [95% CI: 0.5-3.7], respectively). Resistance training is the most effective intervention to improve muscle strength and physical performance in older people. Our findings also suggest that whole-body vibration is beneficial for physical performance. However, none of the three exercise interventions examined had a significant effect on lean body mass.
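A small back-of-envelope step often applied to results like the 12.8 kg strength effect above is recovering the standard error from the reported 95% confidence interval (assuming approximate normality of the pooled estimate), e.g. for re-use in a subsequent meta-analysis:

```python
# SE of a pooled estimate from its 95% CI: SE = (upper - lower) / (2 * z)
from statistics import NormalDist

lo, hi = 8.5, 17.0                      # reported 95% CI, in kg
z = NormalDist().inv_cdf(0.975)         # ~1.96 for a 95% interval
se = (hi - lo) / (2 * z)
print(f"SE ~= {se:.2f} kg")
```

This is a standard manipulation (the same one used by tools like RevMan when only a CI is reported), not a step taken from the abstract itself.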
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mar, M.H.
1995-07-01
Based on the Vulnerability/Lethality (V/L) taxonomy developed by the Ballistic Vulnerability Lethality Division (BVLD) of the Survivability Lethality Analysis Directorate (SLAD), a nuclear electromagnetic pulse (EMP) coupling V/L analysis taxonomy has been developed. A nuclear EMP threat to a military system can be divided into two levels: (1) coupling to a system level through a cable, antenna, or aperture; and (2) the component level. This report focuses on the initial condition, which includes threat definition and target description, as well as the mapping process from the initial condition to the damaged-component state. EMP coupling analysis at a system level is used to accomplish this. This report introduces the nature of the EMP threat, the interaction between the threat and the target, and how the output of EMP coupling analysis at a system level becomes the input to the component-level analysis. Many different tools (EMP coupling codes) are discussed for the mapping process, which corresponds to the physics of the phenomenology. This EMP coupling V/L taxonomy and the models identified in this report will provide the tools necessary to conduct basic V/L analysis of EMP coupling.
Marriages of mathematics and physics: A challenge for biology.
Islami, Arezoo; Longo, Giuseppe
2017-12-01
The human attempts to access, measure and organize physical phenomena have led to a manifold construction of mathematical and physical spaces. We will survey the evolution of geometries from Euclid to the Algebraic Geometry of the 20th century. The role of Persian/Arabic Algebra in this transition and its Western symbolic development is emphasized. In this relation, we will also discuss changes in the ontological attitudes toward mathematics and its applications. Historically, the encounter of geometric and algebraic perspectives enriched the mathematical practices and their foundations. Yet, the collapse of Euclidean certitudes, of over 2300 years, and the crisis in the mathematical analysis of the 19th century, led to the exclusion of "geometric judgments" from the foundations of Mathematics. After the success and the limits of the logico-formal analysis, it is necessary to broaden our foundational tools and re-examine the interactions with natural sciences. In particular, the way the geometric and algebraic approaches organize knowledge is analyzed as a cross-disciplinary and cross-cultural issue and will be examined in Mathematical Physics and Biology. We finally discuss how the current notions of mathematical (phase) "space" should be revisited for the purposes of life sciences. Copyright © 2017. Published by Elsevier Ltd.
Verification of the Icarus Material Response Tool
NASA Technical Reports Server (NTRS)
Schroeder, Olivia; Palmer, Grant; Stern, Eric; Schulz, Joseph; Muppidi, Suman; Martin, Alexandre
2017-01-01
Due to the complex physics encountered during reentry, material response solvers are used for two main purposes: to improve the understanding of the physical phenomena, and to design and size thermal protection systems (TPS). Icarus is a three-dimensional, unstructured material response tool that is intended to be used for design while maintaining the flexibility to easily implement physical models as needed. Because TPS selection and sizing is critical, it is of the utmost importance that the design tools be extensively verified and validated before their use. Verification tests aim to ensure that the numerical schemes and equations are implemented correctly, by comparison to analytical solutions and grid convergence tests.
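The grid-convergence part of such verification is often summarized by an observed order of accuracy, computed Richardson-style from solutions on three systematically refined grids. A minimal sketch with made-up values chosen to mimic a second-order scheme (not Icarus output):

```python
# Observed order of accuracy from three grids refined by a constant ratio r:
#   p = ln((f_medium - f_coarse) / (f_fine - f_medium)) / ln(r)
import math

r = 2.0                                              # grid refinement ratio
f_coarse, f_medium, f_fine = 1.4800, 1.5200, 1.5300  # hypothetical solutions

p = math.log((f_medium - f_coarse) / (f_fine - f_medium)) / math.log(r)
print(f"observed order of accuracy ~= {p:.2f}")
```

If the observed p matches the scheme's formal order (here, 2), the implementation is converging as designed; a mismatch flags a coding or discretization error.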
Bermejo-Cantarero, Alberto; Álvarez-Bueno, Celia; Martinez-Vizcaino, Vicente; García-Hermoso, Antonio; Torres-Costoso, Ana Isabel; Sánchez-López, Mairena
2017-03-01
Health-related quality of life (HRQoL) is a subjective, multidimensional construct that changes over time. When HRQoL is decreased, a child is less likely to be able to develop normally and mature into a healthy adult. Physical inactivity is a priority public health problem. Evidence suggests that even moderate levels of physical activity, or high fitness levels, are associated with health benefits in children and adolescents. The aims of this systematic review are to examine the evidence on the relationship of physical activity, sedentary behavior, and fitness with HRQoL, and to estimate the effects of interventions designed to increase physical activity, improve physical fitness, or reduce sedentary behavior on HRQoL in healthy subjects aged under 18 years. This systematic review and meta-analysis protocol was conducted following the Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols (PRISMA-P) statement. To identify relevant studies, the following electronic databases will be searched: MEDLINE, EMBASE, Cochrane Database, Web of Science, and PEDro. Reference lists of relevant studies will be examined for links to potentially related articles. The methodological quality of the included observational studies will be scored using a quality assessment checklist. For the intervention studies, the risk of bias will be estimated using the Cochrane Collaboration tool for assessing risk of bias. Reviewers will determine whether a meta-analysis is possible once data have been extracted. If it is, subgroup analyses will be carried out by age and socioeconomic status, and by the different dimensions of HRQoL. If it is not possible, a descriptive analysis will be conducted. 
To our knowledge, this systematic review and meta-analysis will be the first that synthesizes the existing results about the relationship between physical activity, sedentary behavior, physical fitness, and HRQoL, and the effect of physical activity interventions on HRQoL, in healthy subjects under 18 years old. This study will clarify this relationship and will provide evidence for decision-making. Limitations may include the quality of the selected studies and their characteristics. Only studies published in English and Spanish will be included. PROSPERO CRD42015025823.
Rashotte, Judy; Varpio, Lara; Day, Kathy; Kuziemsky, Craig; Parush, Avi; Elliott-Miller, Pat; King, James W; Roffey, Tyson
2016-09-01
Members of the healthcare team must access and share patient information to coordinate interprofessional collaborative practice (ICP). Although some evidence suggests that electronic health records (EHRs) contribute to in-team communication breakdowns, EHRs are still widely hailed as tools that support ICP. If EHRs are expected to promote ICP, researchers must be able to longitudinally study the impact of EHRs on ICP across communication types, users, and physical locations. This paper presents a data collection and analysis tool, named the Map of the Clinical Interprofessional Communication Spaces (MCICS), which supports examining how EHRs impact ICP over time, and across communication types, users, and physical locations. The tool's development evolved during a large prospective longitudinal study conducted at a Canadian pediatric academic tertiary-care hospital. This two-phased study [i.e., pre-implementation (phase 1) and post-implementation (phase 2)] of an EHR employed a constructivist grounded theory approach and triangulated data collection strategies (i.e., non-participant observations, interviews, think-alouds, and document analysis). The MCICS was created through a five-step process: (i) preliminary structural development based on the use of the paper-based chart (phase 1); (ii) confirmatory review and modification process (phase 1); (iii) ongoing data collection and analysis facilitated by the map (phase 1); (iv) data collection and modification of map based on impact of EHR (phase 2); and (v) confirmatory review and modification process (phase 2). Creating and using the MCICS enabled our research team to locate, observe, and analyze the impact of the EHR on ICP, (a) across oral, electronic, and paper communications, (b) through a patient's passage across different units in the hospital, (c) across the duration of the patient's stay in hospital, and (d) across multiple healthcare providers. 
By using the MCICS, we captured a comprehensive, detailed picture of the clinical milieu in which the EHR was implemented, and of the intended and unintended consequences of the EHR's deployment. The map supported our observations and analysis of ICP communication spaces, and of the role of the patient chart in these spaces. If EHRs are expected to help resolve ICP challenges, it is important that researchers be able to longitudinally assess the impact of EHRs on ICP across multiple modes of communication, users, and physical locations. Mapping the clinical communication spaces can help EHR designers, clinicians, educators and researchers understand these spaces, appreciate their complexity, and navigate their way towards effective use of EHRs as means for supporting ICP. We propose that the MCICS can be used "as is" in other academic tertiary-care pediatric hospitals, and can be tailored for use in other healthcare institutions. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Job title of recent bachelor's degree recipients
NASA Astrophysics Data System (ADS)
White, Susan C.
2015-05-01
Physics bachelor's degree recipients work in all kinds of professions—science writing, medicine, law, history of science, acting, music, healthcare and more. Since very few of these employees have the word "physics" in their job titles, it can be hard for new graduates to know where to look for jobs and how to find other recent physics graduates in the workforce. The American Institute of Physics and the Society of Physics Students joined forces on an NSF-funded grant to create career tools for undergraduate physics students.1 One of the tools available to students in the Careers Toolbox is a listing of common job titles of physics bachelor's degree recipients working in various fields; some of the job titles are listed below.
Students’ epistemic understanding of mathematical derivations in physics
NASA Astrophysics Data System (ADS)
Sirnoorkar, Amogh; Mazumdar, Anwesh; Kumar, Arvind
2017-01-01
We propose an epistemic measure of physics in terms of the ability to discriminate between the purely mathematical, physical (i.e. dependent on empirical inputs) and nominal (i.e. empty of mathematical or physical content) propositions appearing in a typical derivation in physics. The measure can be relevant in understanding the maths-physics link hurdles among college students. To illustrate the idea, we construct a tool for a familiar derivation (involving specific heats of an ideal gas), and use it for a sample of students from three different institutes. The reliability of the tool is examined. The results indicate, as intuitively expected, that epistemic clarity correlates with content clarity. Data yield several significant trends on the extent and kinds of epistemic pitfalls prevalent among physics undergraduates.
Spectral Analysis of B Stars: An Application of Bayesian Statistics
NASA Astrophysics Data System (ADS)
Mugnes, J.-M.; Robert, C.
2012-12-01
To better understand the processes involved in stellar physics, it is necessary to obtain accurate stellar parameters (effective temperature, surface gravity, abundances…). Spectral analysis is a powerful tool for investigating stars, but it is also vital to reduce uncertainties at a decent computational cost. Here we present a spectral analysis method based on a combination of Bayesian statistics and grids of synthetic spectra obtained with TLUSTY. This method simultaneously constrains the stellar parameters by using all the lines accessible in observed spectra and thus greatly reduces uncertainties and improves the overall spectrum fitting. Preliminary results are shown using spectra from the Observatoire du Mont-Mégantic.
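The grid-based Bayesian idea can be illustrated with a toy sketch (this is not the authors' TLUSTY pipeline; the one-line "spectra", grid ranges, and noise level below are made up for illustration): each grid point (Teff, log g) has a synthetic spectrum, and under a flat prior the posterior over the grid is proportional to exp(-chi2/2), so all observed pixels constrain the parameters simultaneously.

```python
import math
import random

random.seed(42)
wavelengths = [4000.0 + 0.5 * i for i in range(200)]

def model_spectrum(teff, logg):
    """Stand-in synthetic spectrum: one Gaussian absorption line whose
    depth tracks Teff and whose width tracks log g (purely illustrative)."""
    depth = 0.5 * teff / 20000.0
    width = 2.0 + 0.5 * logg
    return [1.0 - depth * math.exp(-0.5 * ((w - 4050.0) / width) ** 2)
            for w in wavelengths]

# "Observed" spectrum: a model with known true parameters plus noise.
true_teff, true_logg, sigma = 18000.0, 3.5, 0.01
observed = [f + random.gauss(0.0, sigma)
            for f in model_spectrum(true_teff, true_logg)]

teff_grid = [10000.0 + 500.0 * i for i in range(41)]
logg_grid = [2.0 + 0.1 * j for j in range(31)]

log_post = {}
for t in teff_grid:
    for g in logg_grid:
        model = model_spectrum(t, g)
        chi2 = sum((o - m) ** 2 for o, m in zip(observed, model)) / sigma ** 2
        log_post[(t, g)] = -0.5 * chi2   # flat prior: posterior ∝ likelihood

# The maximum a posteriori grid point; using every pixel at once is what
# shrinks the uncertainties.
best_teff, best_logg = max(log_post, key=log_post.get)
```

Because every pixel contributes to chi2, the posterior peak lands near the true parameters even with per-pixel noise comparable to the line-depth differences between adjacent grid points.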
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T.; Jessee, Matthew Anderson
The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.
Lagrangian ocean analysis: Fundamentals and practices
van Sebille, Erik; Griffies, Stephen M.; Abernathey, Ryan; ...
2017-11-24
Lagrangian analysis is a powerful way to analyse the output of ocean circulation models and other ocean velocity data such as from altimetry. In the Lagrangian approach, large sets of virtual particles are integrated within the three-dimensional, time-evolving velocity fields. A variety of tools and methods for this purpose have emerged over several decades. Here, we review the state of the art in the field of Lagrangian analysis of ocean velocity data, starting from a fundamental kinematic framework and with a focus on large-scale open ocean applications. Beyond the use of explicit velocity fields, we consider the influence of unresolved physics and dynamics on particle trajectories. We comprehensively list and discuss the tools currently available for tracking virtual particles. We then showcase some of the innovative applications of trajectory data, and conclude with some open questions and an outlook. Our overall goal of this review paper is to reconcile some of the different techniques and methods in Lagrangian ocean analysis, while recognising the rich diversity of codes that have and continue to emerge, and the challenges of the coming age of petascale computing.
Lagrangian ocean analysis: Fundamentals and practices
NASA Astrophysics Data System (ADS)
van Sebille, Erik; Griffies, Stephen M.; Abernathey, Ryan; Adams, Thomas P.; Berloff, Pavel; Biastoch, Arne; Blanke, Bruno; Chassignet, Eric P.; Cheng, Yu; Cotter, Colin J.; Deleersnijder, Eric; Döös, Kristofer; Drake, Henri F.; Drijfhout, Sybren; Gary, Stefan F.; Heemink, Arnold W.; Kjellsson, Joakim; Koszalka, Inga Monika; Lange, Michael; Lique, Camille; MacGilchrist, Graeme A.; Marsh, Robert; Mayorga Adame, C. Gabriela; McAdam, Ronan; Nencioli, Francesco; Paris, Claire B.; Piggott, Matthew D.; Polton, Jeff A.; Rühs, Siren; Shah, Syed H. A. M.; Thomas, Matthew D.; Wang, Jinbo; Wolfram, Phillip J.; Zanna, Laure; Zika, Jan D.
2018-01-01
Lagrangian analysis is a powerful way to analyse the output of ocean circulation models and other ocean velocity data such as from altimetry. In the Lagrangian approach, large sets of virtual particles are integrated within the three-dimensional, time-evolving velocity fields. Over several decades, a variety of tools and methods for this purpose have emerged. Here, we review the state of the art in the field of Lagrangian analysis of ocean velocity data, starting from a fundamental kinematic framework and with a focus on large-scale open ocean applications. Beyond the use of explicit velocity fields, we consider the influence of unresolved physics and dynamics on particle trajectories. We comprehensively list and discuss the tools currently available for tracking virtual particles. We then showcase some of the innovative applications of trajectory data, and conclude with some open questions and an outlook. The overall goal of this review paper is to reconcile some of the different techniques and methods in Lagrangian ocean analysis, while recognising the rich diversity of codes that have and continue to emerge, and the challenges of the coming age of petascale computing.
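The core kinematic step of the Lagrangian approach can be sketched in a few lines (this is a generic illustration, not any specific tool from the review): a virtual particle is advected through a velocity field with fourth-order Runge-Kutta time stepping. Real applications interpolate gridded model or altimetry velocities; here an analytic solid-body rotation (u = -y, v = x) stands in, so the exact trajectory is known.

```python
import math

def velocity(x, y):
    # Steady solid-body rotation about the origin: u = -y, v = x.
    return -y, x

def rk4_step(x, y, dt):
    # Classic fourth-order Runge-Kutta step for dx/dt = u, dy/dt = v.
    k1x, k1y = velocity(x, y)
    k2x, k2y = velocity(x + 0.5 * dt * k1x, y + 0.5 * dt * k1y)
    k3x, k3y = velocity(x + 0.5 * dt * k2x, y + 0.5 * dt * k2y)
    k4x, k4y = velocity(x + dt * k3x, y + dt * k3y)
    return (x + dt * (k1x + 2 * k2x + 2 * k3x + k4x) / 6.0,
            y + dt * (k1y + 2 * k2y + 2 * k3y + k4y) / 6.0)

# The rotation period of this field is 2*pi, so a particle released at
# (1, 0) should return to its starting point after n_steps * dt = 2*pi.
n_steps = 1000
dt = 2.0 * math.pi / n_steps
x, y = 1.0, 0.0
trajectory = [(x, y)]
for _ in range(n_steps):
    x, y = rk4_step(x, y, dt)
    trajectory.append((x, y))
```

Production tools differ mainly in how they interpolate discrete velocity fields in space and time and how they parameterise the unresolved physics discussed above, not in this basic integration loop.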
The Macro Dynamics of Weapon System Acquisition: Shaping Early Decisions to Get Better Outcomes
2012-05-17
[Briefing-slide excerpts; only fragmentary topics are recoverable] Design defects and rework; design tools and processes; lack of feedback to key design and systems engineering (SE) processes; lack of quantified risk and uncertainty at key decision points. Tools for rapid exploration of the physical design space, coupling operability, interoperability, and physical feasibility analyses ("a game changer"). Quantified margins and uncertainties at each critical decision point; a continuum of M&S and RDT&E tools.
Alternative model for administration and analysis of research-based assessments
NASA Astrophysics Data System (ADS)
Wilcox, Bethany R.; Zwickl, Benjamin M.; Hobbs, Robert D.; Aiken, John M.; Welch, Nathan M.; Lewandowski, H. J.
2016-06-01
Research-based assessments represent a valuable tool for both instructors and researchers interested in improving undergraduate physics education. However, the historical model for disseminating and propagating conceptual and attitudinal assessments developed by the physics education research (PER) community has not resulted in widespread adoption of these assessments within the broader community of physics instructors. Within this historical model, assessment developers create high quality, validated assessments, make them available for a wide range of instructors to use, and provide minimal (if any) support to assist with administration or analysis of the results. Here, we present and discuss an alternative model for assessment dissemination, which is characterized by centralized data collection and analysis. This model provides a greater degree of support for both researchers and instructors in order to more explicitly support adoption of research-based assessments. Specifically, we describe our experiences developing a centralized, automated system for an attitudinal assessment we previously created to examine students' epistemologies and expectations about experimental physics. This system provides a proof of concept that we use to discuss the advantages associated with centralized administration and data collection for research-based assessments in PER. We also discuss the challenges that we encountered while developing, maintaining, and automating this system. Ultimately, we argue that centralized administration and data collection for standardized assessments is a viable and potentially advantageous alternative to the default model characterized by decentralized administration and analysis. Moreover, with the help of online administration and automation, this model can support the long-term sustainability of centralized assessment systems.
Zhang, Jun; Shoham, David A.; Tesdahl, Eric
2015-01-01
Objectives. We studied simulated interventions that leveraged social networks to increase physical activity in children. Methods. We studied a real-world social network of 81 children (average age = 7.96 years) who lived in low socioeconomic status neighborhoods, and attended public schools and 1 of 2 structured afterschool programs. The sample was ethnically diverse, and 44% were overweight or obese. We used social network analysis and agent-based modeling simulations to test whether implementing a network intervention would increase children’s physical activity. We tested 3 intervention strategies. Results. The intervention that targeted opinion leaders was effective in increasing the average level of physical activity across the entire network. However, the intervention that targeted the most sedentary children was the best at increasing their physical activity levels. Conclusions. Which network intervention to implement depends on whether the goal is to shift the entire distribution of physical activity or to influence those most adversely affected by low physical activity. Agent-based modeling could be an important complement to traditional project planning tools, analogous to sample size and power analyses, to help researchers design more effective interventions for increasing children’s physical activity. PMID:25689202
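A stripped-down agent-based sketch conveys the study's comparison of targeting strategies (all numbers, the ring network, and the update rule below are illustrative stand-ins, not the study's calibrated model): each child's activity level drifts toward the mean of their network neighbours, and children targeted by the intervention receive an activity boost each step.

```python
import random

random.seed(1)
n = 20
# Hypothetical ring network: each child influences two neighbours.
neighbours = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
activity = {i: random.uniform(0.0, 1.0) for i in range(n)}  # 0 = sedentary

def simulate(start, targeted, boost=0.3, influence=0.2, steps=50):
    """Run the toy model: peer influence every step, plus a boost for
    children in the targeted (intervention) group."""
    act = dict(start)
    for _ in range(steps):
        nxt = {}
        for i in range(n):
            peer_mean = sum(act[j] for j in neighbours[i]) / len(neighbours[i])
            level = act[i] + influence * (peer_mean - act[i])
            if i in targeted:
                level = min(1.0, level + boost * (1.0 - level))
            nxt[i] = level
        act = nxt
    return act

baseline = simulate(activity, targeted=set())
# Strategy: target the three most sedentary children.
most_sedentary = set(sorted(activity, key=activity.get)[:3])
treated = simulate(activity, targeted=most_sedentary)

mean_gain = sum(treated[i] - baseline[i] for i in range(n)) / n
```

Swapping `most_sedentary` for, say, the highest-degree children lets one compare the opinion-leader and most-sedentary strategies the abstract contrasts; on a ring all degrees are equal, so a richer network would be needed for that comparison to be meaningful.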
NASA Astrophysics Data System (ADS)
Hinko, Kathleen
2016-03-01
University educators (UEs) have a long history of teaching physics not only in formal classroom settings but also in informal outreach environments. The pedagogical practices of UEs in informal physics teaching have not been widely studied, and they may provide insight into formal practices and preparation. We investigate the interactions between UEs and children in an afterschool physics program facilitated by university physics students from the University of Colorado Boulder. In this program, physics undergraduates, graduate students and post-doctoral researchers work with K-8 children on hands-on physics activities on a weekly basis over the course of a semester. We use an Activity Theoretic framework as a tool to examine situational aspects of individuals' behavior in the complex structure of the afterschool program. Using this framework, we analyze video of UE-child interactions and identify three main pedagogical modalities that UEs display during activities: Instruction, Consultation and Participation modes. These modes are characterized by certain language, physical location, and objectives that establish differences in UE-child roles and division of labor. Based on this analysis, we discuss implications for promoting pedagogical strategies through purposeful curriculum development and university educator preparation.
CDPP Tools in the IMPEx infrastructure
NASA Astrophysics Data System (ADS)
Gangloff, Michel; Génot, Vincent; Bourrel, Nataliya; Hess, Sébastien; Khodachenko, Maxim; Modolo, Ronan; Kallio, Esa; Alexeev, Igor; Al-Ubaidi, Tarek; Cecconi, Baptiste; André, Nicolas; Budnik, Elena; Bouchemit, Myriam; Dufourg, Nicolas; Beigbeder, Laurent
2014-05-01
The CDPP (Centre de Données de la Physique des Plasmas, http://cdpp.eu/), the French data center for plasma physics, has been engaged for more than a decade in the archiving and dissemination of plasma data products from space missions and ground observatories. Besides these activities, the CDPP developed services like AMDA (http://amda.cdpp.eu/), which enables in-depth analysis of large amounts of data through dedicated functionalities such as visualization, conditional search, and cataloguing, and 3DView (http://3dview.cdpp.eu/), which provides immersive visualisations of planetary environments and is being further developed to include simulation and observational data. Both tools implement the IMPEx protocol (http://impexfp7.oeaw.ac.at/) to give access to outputs of simulation runs and models in planetary sciences from several providers like LATMOS, FMI, and SINP; prototypes have also been built to access some UCLA and CCMC simulations. These tools and their interaction will be presented together with the IMPEx simulation data model (http://impex.latmos.ipsl.fr/tools/DataModel.htm) used for the interface to model databases.
IGMS: An Integrated ISO-to-Appliance Scale Grid Modeling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmintier, Bryan; Hale, Elaine; Hansen, Timothy M.
This paper describes the Integrated Grid Modeling System (IGMS), a novel electric power system modeling platform for integrated transmission-distribution analysis that co-simulates off-the-shelf tools on high performance computing (HPC) platforms to offer unprecedented resolution from ISO markets down to appliances and other end uses. Specifically, the system simultaneously models hundreds or thousands of distribution systems in co-simulation with detailed Independent System Operator (ISO) markets and AGC-level reserve deployment. IGMS uses a new MPI-based hierarchical co-simulation framework to connect existing sub-domain models. Our initial efforts integrate open-source tools for wholesale markets (FESTIV), bulk AC power flow (MATPOWER), and full-featured distribution systems including physics-based end-use and distributed generation models (many instances of GridLAB-D[TM]). The modular IGMS framework enables tool substitution and additions for multi-domain analyses. This paper describes the IGMS tool, characterizes its performance, and demonstrates the impacts of the coupled simulations for analyzing high-penetration solar PV and price responsive load scenarios.
Rigidity controllable polishing tool based on magnetorheological effect
NASA Astrophysics Data System (ADS)
Wang, Jia; Wan, Yongjian; Shi, Chunyan
2012-10-01
A stable and predictable material removal function (MRF) plays a crucial role in computer controlled optical surfacing (CCOS). For physical contact polishing, the stability of the MRF depends on intimate contact between the polishing interface and the workpiece. Rigid laps maintain this function when polishing spherical surfaces, whose curvature does not vary with position on the surface. Such rigid laps provide a smoothing effect for mid-spatial-frequency errors, but cannot be used on aspherical surfaces, as they would destroy the surface figure. Flexible tools such as magnetorheological fluids or air bonnets conform to the surface [1]. They lack rigidity and provide little natural smoothing effect. We present a rigidity-controllable polishing tool that uses a magnetorheological elastomer (MRE) medium [2]. It can both conform to an aspheric surface and maintain a natural smoothing effect. Moreover, its rigidity can be controlled by the magnetic field. This paper presents the design, analysis, and stiffness variation mechanism model of such a polishing tool [3].
Standardized observation of neighbourhood disorder: does it work in Canada?
2010-01-01
Background There is a growing body of evidence that where you live is important to your health. Despite numerous previous studies investigating the relationship between neighbourhood deprivation (and structure) and residents' health, the precise nature of this relationship remains unclear. Relatively few investigations have relied on direct observation of neighbourhoods, and those that have were developed primarily in US settings. Evaluation of the transferability of such tools to other contexts is an important first step before applying such instruments to the investigation of health and well-being. This study evaluated the performance of a systematic social observation (SSO) tool (adapted from previous studies of American and British neighbourhoods) in a Canadian urban context. Methods This was a mixed-methods study. Quantitative SSO ratings and qualitative descriptions of 176 block faces were obtained in six Toronto neighbourhoods (4 low-income and 2 middle/high-income) by trained raters. Exploratory factor analysis was conducted with the quantitative SSO ratings. Content analysis consisted of independent coding of qualitative data by three members of the research team to yield common themes and categories. Results Factor analysis identified three factors (physical decay/disorder, social accessibility, recreational opportunities), but only 'physical decay/disorder' reflected previous findings in the literature. Qualitative results (based on raters' fieldwork experiences) revealed the tool's shortcomings in capturing important features of the neighbourhoods under study, and informed interpretation of the quantitative findings. Conclusions This study tested the performance of an SSO tool in a Canadian context, an important initial step before applying it to the study of health and disease. The tool demonstrated important shortcomings when applied to six diverse Toronto neighbourhoods. The study's analyses challenge previously held assumptions (e.g. social 'disorder') regarding neighbourhood social and built environments. For example, neighbourhood 'order' has traditionally been assumed to be synonymous with a certain degree of homogeneity; however, the neighbourhoods under study were characterized by high degrees of heterogeneity and low levels of disorder. Heterogeneity was seen as an appealing feature of a block face. Employing qualitative techniques with SSO represents a unique contribution, enhancing both our understanding of the quantitative ratings obtained and of neighbourhood characteristics that are not currently captured by such instruments. PMID:20146821
Conceptual Tools for Understanding Nature - Proceedings of the 3rd International Symposium
NASA Astrophysics Data System (ADS)
Costa, G.; Calucci, M.
1997-04-01
The Table of Contents for the full book PDF is as follows: * Foreword * Some Limits of Science and Scientists * Three Limits of Scientific Knowledge * On Features and Meaning of Scientific Knowledge * How Science Approaches the World: Risky Truths versus Misleading Certitudes * On Discovery and Justification * Thought Experiments: A Philosophical Analysis * Causality: Epistemological Questions and Cognitive Answers * Scientific Inquiry via Rational Hypothesis Revision * Probabilistic Epistemology * The Transferable Belief Model for Uncertainty Representation * Chemistry and Complexity * The Difficult Epistemology of Medicine * Epidemiology, Causality and Medical Anthropology * Conceptual Tools for Transdisciplinary Unified Theory * Evolution and Learning in Economic Organizations * The Possible Role of Symmetry in Physics and Cosmology * Observational Cosmology and/or other Imaginable Models of the Universe
The QuakeSim Project: Numerical Simulations for Active Tectonic Processes
NASA Technical Reports Server (NTRS)
Donnellan, Andrea; Parker, Jay; Lyzenga, Greg; Granat, Robert; Fox, Geoffrey; Pierce, Marlon; Rundle, John; McLeod, Dennis; Grant, Lisa; Tullis, Terry
2004-01-01
In order to develop a solid earth science framework for understanding and studying active tectonic and earthquake processes, this task develops simulation and analysis tools to study the physics of earthquakes using state-of-the-art modeling, data manipulation, and pattern recognition technologies. We develop clearly defined, accessible data formats and code protocols as inputs to the simulations. These are adapted to high-performance computers because the solid earth system is extremely complex and nonlinear, resulting in computationally intensive problems with millions of unknowns. With these tools it will be possible to construct the more complex models and simulations necessary to develop hazard assessment systems critical for reducing future losses from major earthquakes.
Transition Matrices: A Tool to Assess Student Learning and Improve Instruction
NASA Astrophysics Data System (ADS)
Morris, Gary A.; Walter, Paul; Skees, Spencer; Schwartz, Samantha
2017-03-01
This paper introduces a new spreadsheet tool for adoption by high school or college-level physics teachers who use common assessments in a pre-instruction/post-instruction mode to diagnose student learning and teaching effectiveness. The spreadsheet creates a simple matrix that identifies the percentage of students who select each possible pre-/post-test answer combination on each question of the diagnostic exam. Leveraging analysis of the quality of the incorrect answer choices, one can order the answer choices from worst to best (i.e., correct), resulting in "transition matrices" that can provide deeper insight into student learning and the success or failure of the pedagogical approach than traditional analyses that employ dichotomous scoring.
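The transition-matrix computation itself is simple enough to sketch (the answer labels and responses below are hypothetical, not data from the paper): for one question, count each (pre-answer, post-answer) pair and express it as a percentage of the class, with the answer choices ordered from worst to best.

```python
from collections import Counter

# Answer choices ordered worst -> best; "A" is the correct answer.
choices = ["D", "C", "B", "A"]

# Hypothetical pre-/post-test responses of 10 students to one question.
pre  = ["D", "C", "C", "B", "D", "A", "C", "B", "D", "C"]
post = ["B", "A", "A", "A", "C", "A", "B", "A", "D", "A"]

counts = Counter(zip(pre, post))
n_students = len(pre)

# matrix[p][q]: percentage of students moving from pre-answer p to
# post-answer q. Entries above/below the diagonal of this ordered
# matrix show improvement vs. regression, not just right/wrong.
matrix = {p: {q: 100.0 * counts[(p, q)] / n_students for q in choices}
          for p in choices}

# e.g. the percentage who moved from any wrong answer to the correct one:
gained = sum(matrix[p]["A"] for p in choices if p != "A")
```

With the illustrative responses above, 30% of the class moved from C to the correct answer A, and 50% moved from some wrong answer to A; a dichotomous right/wrong gain score would hide the distinct C-to-A and B-to-A pathways.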
NASA Astrophysics Data System (ADS)
Knosp, B.; Gangl, M. E.; Hristova-Veleva, S. M.; Kim, R. M.; Lambrigtsen, B.; Li, P.; Niamsuwan, N.; Shen, T. P. J.; Turk, F. J.; Vu, Q. A.
2014-12-01
The JPL Tropical Cyclone Information System (TCIS) brings together satellite, aircraft, and model forecast data from several NASA, NOAA, and other data centers to assist researchers in comparing and analyzing data related to tropical cyclones. The TCIS has been supporting specific science field campaigns, such as the Genesis and Rapid Intensification Processes (GRIP) campaign and the Hurricane and Severe Storm Sentinel (HS3) campaign, by creating near real-time (NRT) data visualization portals. These portals are intended to assist in mission planning, enhance the understanding of current physical processes, and improve model data by comparing it to satellite and aircraft observations. The TCIS NRT portals allow the user to view plots on a Google Earth interface. To complement these visualizations, the team has been working on developing data analysis tools to let the user actively interrogate areas of Level 2 swath and two-dimensional plots they see on their screen. As expected, these observation and model data are quite voluminous, and bottlenecks in the system architecture can occur when the databases try to run geospatial searches for data files that need to be read by the tools. To improve the responsiveness of the data analysis tools, the TCIS team has been conducting studies on how to best store Level 2 swath footprints and run sub-second geospatial searches to discover data. The first objective was to improve the sampling accuracy of the footprints being stored in the TCIS database by comparing the Java-based NASA PO.DAAC Level 2 Swath Generator with a TCIS Python swath generator. The second objective was to compare the performance of four database implementations - MySQL, MySQL+Solr, MongoDB, and PostgreSQL - to see which database management system would yield the best geospatial query and storage performance. 
The final objective was to integrate our chosen technologies with our Joint Probability Density Function (Joint PDF), Wave Number Analysis, and Automated Rotational Center Hurricane Eye Retrieval (ARCHER) tools. In this presentation, we will compare the enabling technologies we tested and discuss which ones we selected for integration into the TCIS' data analysis tool architecture. We will also show how these techniques have been automated to provide access to NRT data through our analysis tools.
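The footprint-search problem above can be sketched without committing to any one database. The snippet below is our own illustration (the `Footprint` record and function names are hypothetical, not TCIS code): it reduces each Level 2 swath to a latitude/longitude bounding box and applies the coarse intersection prefilter that a spatial index would run server-side before any precise polygon test.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Footprint:
    """Bounding box (degrees) summarizing one Level 2 swath granule."""
    granule_id: str
    lat_min: float
    lat_max: float
    lon_min: float
    lon_max: float

def bbox_prefilter(footprints, lat_min, lat_max, lon_min, lon_max):
    """Return IDs of granules whose bounding box intersects the query box.

    Two intervals [a1, a2] and [b1, b2] intersect iff a1 <= b2 and
    a2 >= b1; the test is applied on both axes. Longitude wrap-around
    at the antimeridian is ignored here for simplicity.
    """
    hits = []
    for fp in footprints:
        if (fp.lat_min <= lat_max and fp.lat_max >= lat_min
                and fp.lon_min <= lon_max and fp.lon_max >= lon_min):
            hits.append(fp.granule_id)
    return hits
```

A production system would push this test into the database itself (for example, a PostGIS bounding-box operator or a MongoDB 2dsphere index) so that only candidate granules are ever read from disk.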
Lindemann, Ulrich; Zijlstra, Wiebren; Aminian, Kamiar; Chastin, Sebastien F M; de Bruin, Eling D; Helbostad, Jorunn L; Bussmann, Johannes B J
2014-01-10
Physical activity is an important determinant of health and well-being in older persons and contributes to their social participation and quality of life. Hence, assessment tools are needed to study this physical activity in free-living conditions. Wearable motion sensing technology is used to assess physical activity. However, there is a lack of harmonisation of validation protocols and applied statistics, which makes it hard to compare available and future studies. Therefore, the aim of this paper is to formulate recommendations for assessing the validity of sensor-based activity monitoring in older persons, with a focus on the measurement of body postures and movements. Validation studies of body-worn devices providing parameters on body postures and movements were identified and summarized, and an extensive interactive process among the authors resulted in recommendations about: information on the assessed persons, the technical system, and the analysis of relevant parameters of physical activity, based on a standardized and semi-structured protocol. The recommended protocols can be regarded as a first attempt to standardize validity studies in the area of monitoring physical activity.
Fastame, Maria Chiara; Hitchcott, Paul Kenneth; Penna, Maria Pietronilla
2017-04-01
This study was mainly aimed at exploring the relationship between psychological well-being and lifestyle, religion, perceived physical health and social desirability of Italian elders. Four hundred and six cognitively healthy participants aged 65-99 years were recruited from the Italian island of Sardinia, where a high prevalence of centenarians is registered. Participants were presented with several tools assessing psychological well-being, lifestyle, social desirability, religiosity and subjective physical health. A hierarchical regression analysis revealed that the social desirability measure is the best predictor of general subjective well-being, whereas further predictors are age, perceived physical health and gardening. A significant but moderate relationship was also found between psychological well-being, subjective physical health and religiosity, while controlling for social desirability. Social desirability seems to contaminate the self-rating of psychological well-being in late adulthood. Moreover, from a developmental perspective, age-related factors, lifestyle and perceived physical health are closely related to and therefore influence the perception of life quality in the third and fourth age.
Distributed data analysis in ATLAS
NASA Astrophysics Data System (ADS)
Nilsson, Paul; Atlas Collaboration
2012-12-01
Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware, respectively; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interface to the multitude of execution backends (local, batch, and grid). The PanDA workload management system provides a set of utilities called PanDA Client; with these tools users can easily submit Athena analysis jobs to the PanDA-managed resources. Distributed data is managed by Don Quixote 2, a system developed by ATLAS; DQ2 is used to replicate datasets according to the data distribution policies and maintains a central catalog of file locations. The operation of the grid resources is continually monitored by the Ganga Robot functional testing system, and infrequent site stress tests are performed using the HammerCloud system. In addition, the DAST shift team is a group of power users who take shifts to provide distributed analysis user support; this team has effectively relieved the burden of support from the developers.
NASA Astrophysics Data System (ADS)
Christian, Paul M.
2002-07-01
This paper presents a demonstrated approach to significantly reduce the cost and schedule of non real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and the methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks Matlab/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost reduction goals in development programs. Part I of this paper provided a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program; from quick prototyping of concept developments that leverage built-in point of departure simulations through to detailed design, analysis, and testing. Some of the attributes addressed included its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics that were covered in part I included flight vehicle models and algorithms, and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this series will cover a more in-depth look at the analysis and simulation capability and provide an update on the toolbox enhancements. It will also address how the Toolbox can be used as a design hub for Internet-based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment (IMD).
Applications of Automation Methods for Nonlinear Fracture Test Analysis
NASA Technical Reports Server (NTRS)
Allen, Phillip A.; Wells, Douglas N.
2013-01-01
Using automated and standardized computer tools to calculate the pertinent test result values has several advantages such as: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form, 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards, 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results, and 4. providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10 - Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program, 2. ASTM F2815 - Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program, and 3. ASTM E2807 - Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in test standards. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.
Interactive Visual Analysis within Dynamic Ocean Models
NASA Astrophysics Data System (ADS)
Butkiewicz, T.
2012-12-01
The many observation and simulation based ocean models available today can provide crucial insights for all fields of marine research and can serve as valuable references when planning data collection missions. However, the increasing size and complexity of these models makes leveraging their contents difficult for end users. Through a combination of data visualization techniques, interactive analysis tools, and new hardware technologies, the data within these models can be made more accessible to domain scientists. We present an interactive system that supports exploratory visual analysis within large-scale ocean flow models. The currents and eddies within the models are illustrated using effective, particle-based flow visualization techniques. Stereoscopic displays and rendering methods are employed to ensure that the user can correctly perceive the complex 3D structures of depth-dependent flow patterns. Interactive analysis tools are provided which allow the user to experiment through the introduction of their customizable virtual dye particles into the models to explore regions of interest. A multi-touch interface provides natural, efficient interaction, with custom multi-touch gestures simplifying the otherwise challenging tasks of navigating and positioning tools within a 3D environment. We demonstrate the potential applications of our visual analysis environment with two examples of real-world significance: Firstly, an example of using customized particles with physics-based behaviors to simulate pollutant release scenarios, including predicting the oil plume path for the 2010 Deepwater Horizon oil spill disaster. Secondly, an interactive tool for plotting and revising proposed autonomous underwater vehicle mission pathlines with respect to the surrounding flow patterns predicted by the model; as these survey vessels have extremely limited energy budgets, designing more efficient paths allows for greater survey areas.
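The virtual-dye-particle idea can be illustrated compactly. The sketch below is ours, not the authors' code: it advects particles through a steady 2D velocity field with forward-Euler steps, whereas an interactive tool of this kind would use a higher-order integrator and time-varying model output.

```python
def advect(particles, velocity, dt, steps):
    """Trace virtual dye particles through a steady 2D flow field.

    `particles` is a list of (x, y) seed points; `velocity(x, y)`
    returns the local (u, v). Returns one path (list of points) per
    seed. Forward Euler is the simplest possible stepping scheme.
    """
    paths = [[p] for p in particles]
    for _ in range(steps):
        for path in paths:
            x, y = path[-1]
            u, v = velocity(x, y)
            path.append((x + u * dt, y + v * dt))
    return paths

def eddy(x, y):
    """Solid-body rotation: a toy stand-in for a model eddy."""
    return (-y, x)
```

Seeding a ring of particles upstream of a region of interest and watching the resulting paths is the essence of the customizable-dye interaction described above.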
NASA Technical Reports Server (NTRS)
Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.
2000-01-01
The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.
NASA Technical Reports Server (NTRS)
Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.
2001-01-01
The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.
NASA Technical Reports Server (NTRS)
Acuna, M. H.; Ogilvie, K. W.; Baker, D. N.; Curtis, S. A.; Fairfield, D. H.; Mish, W. H.
1999-01-01
The Global Geospace Science Program (GGS) is designed to improve greatly the understanding of the flow of energy, mass and momentum in the solar-terrestrial environment with particular emphasis on "Geospace". The Global Geospace Science Program is the US contribution to the International Solar-Terrestrial Physics (ISTP) Science Initiative. This CD-ROM issue describes the WIND and POLAR spacecraft, the scientific experiments carried onboard, the Theoretical and Ground Based investigations which constitute the US Global Geospace Science Program and the ISTP Data Systems which support the data acquisition and analysis effort. The International Solar-Terrestrial Physics Program (ISTP) Key Parameter Visualization Tool (KPVT), provided on the CD-ROM, was developed at the ISTP Science Planning and Operations Facility (SPOF). The KPVT is a generic software package for visualizing the key parameter data produced from all ISTP missions, interactively and simultaneously. The tool is designed to facilitate correlative displays of ISTP data from multiple spacecraft and instruments, and thus the selection of candidate events and data quality control. The software, written in IDL, includes a graphical/widget user interface, and runs on many platforms, including various UNIX workstations, Alpha/Open VMS, Macintosh (680x0 and PowerPC), and PC/Windows NT, Windows 3.1, and Windows 95.
Tool use as distributed cognition: how tools help, hinder and define manual skill
Baber, Chris; Parekh, Manish; Cengiz, Tulin G.
2014-01-01
Our thesis in this paper is that, in order to appreciate the interplay between cognitive (goal-directed) and physical performance in tool use, it is necessary to determine the role that representations play in the use of tools. We argue that, rather than being solely a matter of internal (mental) representation, tool use makes use of the external representations that define the human–environment–tool–object system. This requires the notion of Distributed Cognition to encompass not simply the manner in which artifacts represent concepts but also how they represent praxis. Our argument is that this can be extended to include how artifacts-in-context afford use and how this response to affordances constitutes a particular form of skilled performance. By artifacts-in-context, we do not mean solely the affordances offered by the physical dimensions of a tool but also the interaction between the tool and the object that it is being used on. From this, “affordance” does not simply relate to the physical appearance of the tool but anticipates subsequent actions by the user directed towards the goal of changing the state of the object, and this is best understood in terms of the “complementarity” in the system. This assertion raises two challenges which are explored in this paper. The first is to distinguish “affordance” from the adaptation that one might expect to see in descriptions of motor control; when we speak of “affordance” as a form of anticipation, don’t we just mean the ability to adjust movements in response to physical demands? The second is to distinguish “affordance” from a schema of the tool; when we talk about anticipation, don’t we just mean the ability to call on a schema representing a “recipe” for using that tool for that task? This question of representation, specifically what knowledge needs to be represented in tool use, is central to this paper. PMID:24605103
High-throughput determination of structural phase diagram and constituent phases using GRENDEL
NASA Astrophysics Data System (ADS)
Kusne, A. G.; Keller, D.; Anderson, A.; Zaban, A.; Takeuchi, I.
2015-11-01
Advances in high-throughput materials fabrication and characterization techniques have resulted in faster rates of data collection and rapidly growing volumes of experimental data. To convert this mass of information into actionable knowledge of material process-structure-property relationships requires high-throughput data analysis techniques. This work explores the use of the Graph-based endmember extraction and labeling (GRENDEL) algorithm as a high-throughput method for analyzing structural data from combinatorial libraries, specifically, to determine phase diagrams and constituent phases from both x-ray diffraction and Raman spectral data. The GRENDEL algorithm utilizes a set of physical constraints to optimize results and provides a framework by which additional physics-based constraints can be easily incorporated. GRENDEL also permits the integration of database data as shown by the use of critically evaluated data from the Inorganic Crystal Structure Database in the x-ray diffraction data analysis. The Sunburst radial tree map is also demonstrated as a tool to visualize material structure-property relationships found through graph-based analysis.
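To give a flavor of the graph-based idea (a drastic simplification we supply for illustration, not the GRENDEL algorithm itself): diffraction patterns from a combinatorial library can be linked into a similarity graph, whose connected components then approximate regions of a single constituent phase.

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two spectra given as equal-length lists."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def phase_regions(patterns, threshold=0.95):
    """Label samples by connected components of a similarity graph.

    Samples whose diffraction patterns exceed the similarity threshold
    are joined by an edge; each connected component gets one label,
    approximating a single-phase region of the library.
    """
    n = len(patterns)
    adj = {i: [] for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if cosine(patterns[i], patterns[j]) >= threshold:
                adj[i].append(j)
                adj[j].append(i)
    labels, comp = [-1] * n, 0
    for s in range(n):                      # DFS over unlabeled nodes
        if labels[s] != -1:
            continue
        stack = [s]
        while stack:
            i = stack.pop()
            if labels[i] == -1:
                labels[i] = comp
                stack.extend(adj[i])
        comp += 1
    return labels
```

GRENDEL goes much further, extracting endmember patterns under physical constraints; this sketch only shows why a graph representation is a natural fit for the labeling step.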
Synergism of Nanomaterials with Physical Stimuli for Biology and Medicine.
Shin, Tae-Hyun; Cheon, Jinwoo
2017-03-21
Developing innovative tools that facilitate the understanding of sophisticated biological systems has been one of the Holy Grails in the physical and biological sciences. In this Commentary, we discuss recent advances, opportunities, and challenges in the use of nanomaterials as a precision tool for biology and medicine.
Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will
2016-01-01
With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish a workflow for, and demonstrate, a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.
Visual readability analysis: how to make your writings easier to read.
Oelke, Daniela; Spretke, David; Stoffel, Andreas; Keim, Daniel A
2012-05-01
We present a tool that is specifically designed to support a writer in revising a draft version of a document. In addition to showing which paragraphs and sentences are difficult to read and understand, we assist the reader in understanding why this is the case. This requires features that are expressive predictors of readability, and are also semantically understandable. In the first part of the paper, we, therefore, discuss a semiautomatic feature selection approach that is used to choose appropriate measures from a collection of 141 candidate readability features. In the second part, we present the visual analysis tool VisRA, which allows the user to analyze the feature values across the text and within single sentences. Users can choose between different visual representations accounting for differences in the size of the documents and the availability of information about the physical and logical layout of the documents. We put special emphasis on providing as much transparency as possible to ensure that the user can purposefully improve the readability of a sentence. Several case studies are presented that show the wide range of applicability of our tool. Furthermore, an in-depth evaluation assesses the quality of the measure and investigates how well users do in revising a text with the help of the tool.
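Two of the simplest surface features in the readability literature can be computed in a few lines. The sketch below is our illustration only; VisRA selects from a far richer pool of 141 candidate features.

```python
import re

def readability_features(text):
    """Compute two simple surface features of the kind a readability
    tool scores: average sentence length (in words) and average word
    length (in characters). Sentence splitting on terminal punctuation
    is deliberately naive.
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    return {
        "avg_sentence_len": len(words) / len(sentences),
        "avg_word_len": sum(len(w) for w in words) / len(words),
    }
```

Features like these are "semantically understandable" in the paper's sense: a writer shown a high average sentence length knows exactly what to change.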
ThinkerTools. What Works Clearinghouse Intervention Report
ERIC Educational Resources Information Center
What Works Clearinghouse, 2012
2012-01-01
"ThinkerTools" is a computer-based program that aims to develop students' understanding of physics and scientific modeling. The program is composed of two curricula for middle school students, "ThinkerTools Inquiry" and "Model-Enhanced ThinkerTools". "ThinkerTools Inquiry" allows students to explore the…
Analysis of cancer-related fatigue based on smart bracelet devices.
Shen, Hong; Hou, Honglun; Tian, Wei; Wu, MingHui; Chen, Tianzhou; Zhong, Xian
2016-01-01
Fatigue is the most common symptom associated with cancer and its treatment, and profoundly affects all aspects of quality of life for cancer patients. It is therefore important to measure and manage cancer-related fatigue. Usually, cancer-related fatigue scores, which estimate the degree of fatigue, are self-reported by cancer patients using standardized assessment tools. But most of the classical methods used for measurement of fatigue are subjective and inconvenient. In this study, we try to establish a new method to assess cancer-related fatigue objectively and accurately by using a smart bracelet. All patients with metastatic pancreatic cancer wore smart bracelets to record physical activity, including step count and sleep time, before and after chemotherapy. Meanwhile, their psychological state was assessed with questionnaires yielding cancer-related fatigue scores. Step counts recorded by the smart bracelets, reflecting physical performance, decreased dramatically in the initial days of chemotherapy and recovered over the next few days. Statistical analysis showed a strong and significant correlation between self-reported cancer-related fatigue and physical performance (P = 0.000, r = -0.929). Sleep time was also significantly correlated with fatigue (P = 0.000, r = 0.723). Multiple regression analysis showed that physical performance and sleep time are significant predictors of fatigue. Measuring activity with smart bracelet devices may therefore be an appropriate method for quantitative and objective measurement of cancer-related fatigue.
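The reported associations are plain Pearson correlations between bracelet-derived measures and fatigue scores. As a sketch of the computation (our illustration, not the study's code):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples,
    the statistic behind values such as r = -0.929 (fatigue vs. step
    count) and r = 0.723 (fatigue vs. sleep time) in the abstract.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

A negative r, as found for step counts, means higher physical performance coincided with lower self-reported fatigue.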
Localized Overheating Phenomena and Optimization of Spark-Plasma Sintering Tooling Design
Giuntini, Diletta; Olevsky, Eugene A.; Garcia-Cardona, Cristina; Maximenko, Andrey L.; Yurlova, Maria S.; Haines, Christopher D.; Martin, Darold G.; Kapoor, Deepak
2013-01-01
The present paper shows the application of a three-dimensional coupled electrical, thermal, mechanical finite element macro-scale modeling framework of Spark Plasma Sintering (SPS) to an actual problem of SPS tooling overheating, encountered during SPS experimentation. The overheating phenomenon is analyzed by varying the geometry of the tooling that exhibits the problem, namely by modeling various tooling configurations involving sequences of disk-shape spacers with step-wise increasing radii. The analysis is conducted by means of finite element simulations, intended to obtain temperature spatial distributions in the graphite press-forms, including punches, dies, and spacers; to identify the temperature peaks and their respective timing; and to propose a more suitable SPS tooling configuration with the avoidance of overheating as a final aim. Electric current-based Joule heating, heat transfer, mechanical conditions, and densification are embedded in the model, utilizing the finite-element software COMSOL™, which is distinguished by its ability to couple multiple physics. The result is the implementation of a finite element method applicable to a broad range of SPS procedures, together with the more specific optimization of the SPS tooling design when dealing with excessive heating phenomena. PMID:28811398
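The core thermal coupling can be caricatured in one dimension. The toy update below is our sketch only (the study uses a full 3D coupled finite element model in COMSOL): it advances heat conduction with a volumetric Joule-heating source by one explicit finite-difference step.

```python
def heat_step(T, q_joule, alpha, dx, dt):
    """One explicit finite-difference step of 1D heat conduction with a
    volumetric Joule source.

    T: node temperatures; q_joule: heating rate (K/s) per node, standing
    in for I^2 R dissipation; alpha: thermal diffusivity. Boundary nodes
    are held fixed. Stability requires alpha * dt / dx**2 <= 0.5.
    """
    r = alpha * dt / dx ** 2
    new = list(T)
    for i in range(1, len(T) - 1):
        new[i] = T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1]) + q_joule[i] * dt
    return new
```

Localized overheating of the kind analyzed in the paper appears in this caricature wherever the source term outruns conduction to the neighbors.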
Rodrigues, Susana; Silva, Joana; Severo, Milton; Inácio, Cátia; Padrão, Patrícia; Lopes, Carla; Carvalho, Joana; do Carmo, Isabel; Moreira, Pedro
2015-01-01
Dehydration is common among elderly people. The aim of this study was to perform validation analysis of a geriatric dehydration-screening tool (DST) in the assessment of hydration status in elderly people. This tool was based on the DST proposed by Vivanti et al., which is composed of 11 items (four physical signs of dehydration and seven questions about thirst sensation, pain and mobility), with four extra questions about drinking habits. The resulting questionnaire was evaluated in a convenience sample comprising institutionalized (n = 29) and community-dwelling (n = 74) elderly people. Urinary parameters were assessed (24-h urine osmolality and volume) and free water reserve (FWR) was calculated. Exploratory factor analysis was used to evaluate the scale’s dimensionality and Cronbach’s alpha was used to measure the reliability of each subscale. Construct validity was tested using linear regression to estimate the association between scores in each dimension and urinary parameters. Two factors emerged from factor analysis, which were named “Hydration Score” and “Pain Score”, and both subscales showed acceptable reliability. The “Hydration Score” was negatively associated with 24-h urine osmolality in community-dwelling elderly people; and the “Pain Score” was negatively associated with 24-h urine osmolality, and positively associated with 24-h urine volume and FWR in institutionalized elderly people. PMID:25739005
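Cronbach's alpha, used above to check subscale reliability, has a short closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of the summed score), for k items. A minimal sketch with hypothetical data (our illustration, not the study's analysis):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a scale given as one list of respondent
    scores per item (all items scored by the same respondents)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent sums
    item_var = sum(pvariance(col) for col in items)
    return (k / (k - 1)) * (1 - item_var / pvariance(totals))
```

Values near 1 indicate items moving together (high internal consistency); the paper judged both subscales "acceptable" on this kind of measure.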
Learning motion concepts using real-time microcomputer-based laboratory tools
NASA Astrophysics Data System (ADS)
Thornton, Ronald K.; Sokoloff, David R.
1990-09-01
Microcomputer-based laboratory (MBL) tools have been developed which interface to Apple II and Macintosh computers. Students use these tools to collect physical data that are graphed in real time and then can be manipulated and analyzed. The MBL tools have made possible discovery-based laboratory curricula that embody results from educational research. These curricula allow students to take an active role in their learning and encourage them to construct physical knowledge from observation of the physical world. The curricula encourage collaborative learning by taking advantage of the fact that MBL tools present data in an immediately understandable graphical form. This article describes one of the tools—the motion detector (hardware and software)—and the kinematics curriculum. The effectiveness of this curriculum compared to traditional college and university methods for helping students learn basic kinematics concepts has been evaluated by pre- and post-testing and by observation. There is strong evidence for significantly improved learning and retention by students who used the MBL materials, compared to those taught in lecture.
A better way of fitting clips? A comparative study with respect to physical workload.
Gaudez, Clarisse; Wild, Pascal; Aublet-Cuvelier, Agnès
2015-11-01
The clip fitting task is a frequently encountered assembly operation in the car industry. It can cause upper limb pain. During laboratory simulations of the task, upper limb muscular activity and external force were compared for 4 clip fitting methods: with the bare hand, with an unpowered tool commonly used at a company, and with unpowered and powered prototype tools. None of the 4 fitting methods studied induced a lower overall workload than the other three. Muscle activity was lower at the dominant limb when using the unpowered tools and at the non-dominant limb with the bare hand or with the powered tool. Fitting clips with the bare hand required a higher external force than fitting with the three tools. Evaluation of physical workload differed depending on whether external force or muscle activity results were considered. Measuring external force only, as recommended in several standards, is insufficient for evaluating physical workload. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
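Muscular activity in studies of this kind is typically summarized by an amplitude measure of the surface EMG signal. The authors' exact processing is not specified in the abstract, so the root-mean-square sketch below is only a generic illustration of such a summary.

```python
from math import sqrt

def rms_amplitude(signal):
    """Root-mean-square amplitude of a (zero-mean) EMG window, a common
    scalar summary of muscular activity when comparing work methods."""
    return sqrt(sum(x * x for x in signal) / len(signal))
```

Comparing RMS amplitudes across methods, normalized to a reference contraction, is one conventional way to make the per-muscle comparisons the study reports.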
Climate Model Diagnostic Analyzer Web Service System
NASA Astrophysics Data System (ADS)
Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.
2013-12-01
The latest Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with newly available global observations. The traditional approach to climate model evaluation, which compares a single parameter at a time, identifies symptomatic model biases and errors but fails to diagnose the model problems. The model diagnosis process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. To address these challenges, we are developing a parallel, distributed web-service system that enables the physics-based multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation and (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, and (4) the calculation of difference between two variables. 
A web user interface was chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool, but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibilities. CMDA is planned to be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. The requirements of the educational tool were defined in interaction with the school organizers, and CMDA is being customized to meet them; the tool needs to be of production quality for 30+ simultaneous users. The summer school will thus serve as a valuable testbed for the tool's development, preparing CMDA to serve the Earth-science modeling and model-analysis community at the end of the project. This work was funded by the NASA Earth Science Program called Computational Modeling Algorithms and Cyberinfrastructure (CMAC).
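Two of the CMDA analysis capabilities listed above (seasonal means and correlation between two variables) can be sketched in a few lines of plain Python. This is an illustrative sketch only, not CMDA's actual service code; the function names and toy climatology are invented for the example.

```python
# Hypothetical sketch of two CMDA-style analysis capabilities:
# a seasonal mean and a Pearson correlation between two variables.
# Function names and data are illustrative, not CMDA's actual API.
from math import sqrt

def seasonal_mean(monthly_values, season_months):
    """Mean of a 12-entry monthly climatology over the given months (1-12)."""
    vals = [monthly_values[m - 1] for m in season_months]
    return sum(vals) / len(vals)

def pearson_corr(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Example: DJF (winter) mean of a toy monthly temperature climatology.
monthly_t = [2.0, 3.0, 7.0, 12.0, 17.0, 21.0, 24.0, 23.0, 19.0, 13.0, 7.0, 3.0]
djf = seasonal_mean(monthly_t, [12, 1, 2])  # (3.0 + 2.0 + 3.0) / 3
```

In the real system each such computation runs server-side against large observation and model archives; the point here is only the shape of the calculation behind the web service.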
Merrill, Katherine G; Knight, Louise; Glynn, Judith R; Allen, Elizabeth; Naker, Dipak; Devries, Karen M
2017-01-01
Objective To conduct a multilevel analysis of risk factors for physical violence perpetration by school staff against Ugandan students. Design Multilevel logistic regression analysis of cross-sectional survey data from 499 staff and 828 caregivers of students at 38 primary schools, collected in 2012 and 2014 during the Good Schools Study. Setting Luwero District, Uganda. Main outcome measure Past-week use of physical violence by school staff against students was measured using the International Society for the Prevention of Child Abuse and Neglect ‘Child Abuse Screening Tool- Child International’ and the WHO Multi-Country Study on Women’s Health and Domestic Violence against Women. Results Of 499 staff, 215 (43%) reported perpetration of physical violence against students in the past week. Individual risk factors associated with physical violence perpetration included being a teacher versus another type of staff member (p<0.001), approving of physical discipline practices (p<0.001), having children (p<0.01), being age 30–39 years (p<0.05), using physical violence against non-students (p<0.05) and being a victim of intimate partner violence (IPV) (p<0.05). We observed weak evidence (p=0.06) that male staff members who had been a victim of IPV showed higher odds of violence perpetration compared with male staff who had not been a victim of IPV. No evidence was observed for school- or community-level risk factors. Conclusions Physical violence perpetration from school staff is widespread, and interventions are needed to address this issue. Staff who have been victims of violence and who use violence against people other than students may benefit from additional interventions. Researchers should further investigate how school and community contexts influence staff’s physical violence usage, given a lack of associations observed in this study. PMID:28821514
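The risk-factor associations reported above come from multilevel logistic regression; the simplest building block of such an analysis is the unadjusted odds ratio from a 2x2 table, which can be sketched as follows. The counts below are invented for illustration and are not the study's data.

```python
# Illustrative sketch (not the authors' code): an unadjusted odds ratio
# with a 95% confidence interval from a 2x2 table, the kind of single-level
# association that a multilevel logistic regression then adjusts and nests
# within schools.
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a/b = exposed with/without outcome, c/d = unexposed with/without."""
    or_ = (a * d) / (b * c)
    se_log_or = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # Woolf's SE on the log scale
    lo = exp(log(or_) - z * se_log_or)
    hi = exp(log(or_) + z * se_log_or)
    return or_, lo, hi

# Toy numbers (hypothetical, not from the study):
# 40 of 100 teachers vs 20 of 100 other staff reporting violence use.
or_, lo, hi = odds_ratio_ci(40, 60, 20, 80)
```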
Baig, Hasan; Madsen, Jan
2017-01-15
Simulation and behavioral analysis of genetic circuits is a standard approach to functional verification prior to their physical implementation. Many software tools have been developed to perform in silico analysis for this purpose, but none of them allows users to interact with the model during runtime. Runtime interaction gives the user the feeling of being in the lab performing a real-world experiment. In this work, we present a user-friendly software tool named D-VASim (Dynamic Virtual Analyzer and Simulator), which provides a virtual laboratory environment to simulate and analyze the behavior of genetic logic circuit models represented in SBML (Systems Biology Markup Language). Hence, SBML models developed in other software environments can be analyzed and simulated in D-VASim. D-VASim offers deterministic as well as stochastic simulation, and differs from other software tools by being able to extract and validate the Boolean logic from the SBML model. D-VASim is also capable of analyzing the threshold value and propagation delay of a genetic circuit model. D-VASim is available for Windows and Mac OS and can be downloaded from bda.compute.dtu.dk/downloads/. haba@dtu.dk, jama@dtu.dk. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
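The stochastic simulation mentioned above is conventionally done with Gillespie's algorithm. A minimal sketch for a birth-death gene expression model follows; the rate constants are illustrative, and D-VASim itself reads the reactions from an SBML model rather than hard-coding them like this.

```python
# A minimal sketch of the kind of stochastic simulation D-VASim performs:
# a Gillespie simulation of a birth-death gene expression model
# (constant production, first-order degradation). Rates are illustrative.
import random

def gillespie_birth_death(k_prod, k_deg, t_end, seed=0):
    rng = random.Random(seed)
    t, n = 0.0, 0
    trajectory = [(t, n)]
    while t < t_end:
        rates = [k_prod, k_deg * n]
        total = sum(rates)
        if total == 0:
            break
        # Exponentially distributed waiting time to the next reaction.
        t += rng.expovariate(total)
        # Pick which reaction fires, proportional to its rate.
        if rng.random() * total < rates[0]:
            n += 1
        else:
            n -= 1
        trajectory.append((t, n))
    return trajectory

traj = gillespie_birth_death(k_prod=10.0, k_deg=1.0, t_end=50.0)
# The molecule count fluctuates around the steady state k_prod / k_deg.
```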
The PhytoClust tool for metabolic gene clusters discovery in plant genomes
Töpfer, Nadine; Fuchs, Lisa-Maria; Aharoni, Asaph
2017-01-01
The existence of Metabolic Gene Clusters (MGCs) in plant genomes has recently raised increased interest. Thus far, MGCs were commonly identified for pathways of specialized metabolism, mostly those associated with terpene-type products. For efficient identification of novel MGCs, computational approaches are essential. Here, we present PhytoClust, a tool for the detection of candidate MGCs in plant genomes. The algorithm employs a collection of enzyme families related to plant specialized metabolism, translated into hidden Markov models, to mine given genome sequences for physically co-localized metabolic enzymes. Our tool accurately identifies previously characterized plant MGCs. An exhaustive search of 31 plant genomes detected 1232 putative gene cluster types and 5531 candidates. Clustering analysis of putative MGC types by species reflected plant taxonomy. Furthermore, enrichment analysis revealed taxa- and species-specific enrichment of certain enzyme families in MGCs. When operating through our web interface, PhytoClust users can mine a genome either based on a list of known cluster types or by defining new cluster rules. Moreover, for selected plant species, the output can be complemented by co-expression analysis. Altogether, we envisage PhytoClust enhancing novel MGC discovery, which will in turn impact the exploration of plant metabolism. PMID:28486689
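The core idea of mining a genome for physically co-localized metabolic enzymes can be sketched as a windowed scan over gene coordinates. This is a strong simplification of PhytoClust (no HMMs, no cluster rules); gene positions and family labels below are invented for the example.

```python
# A simplified sketch of the PhytoClust idea (not its actual algorithm):
# given gene coordinates and the enzyme family each gene matched (e.g. via
# an HMM scan), report windows holding several distinct metabolic families.
def find_candidate_clusters(genes, max_gap=50_000, min_families=3):
    """genes: list of (start_bp, family) on one chromosome, any order."""
    genes = sorted(genes)
    clusters, current = [], [genes[0]]
    for gene in genes[1:]:
        if gene[0] - current[-1][0] <= max_gap:
            current.append(gene)       # close enough: extend the window
        else:
            clusters.append(current)   # gap too large: start a new window
            current = [gene]
    clusters.append(current)
    # Keep only windows with enough distinct enzyme families.
    return [c for c in clusters if len({fam for _, fam in c}) >= min_families]

genes = [(1_000, "P450"), (20_000, "terpene_synthase"), (35_000, "acyltransferase"),
         (400_000, "P450"), (420_000, "P450")]
candidates = find_candidate_clusters(genes)  # only the first three genes qualify
```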
NASA Astrophysics Data System (ADS)
Kanjanapen, Manorth; Kunsombat, Cherdsak; Chiangga, Surasak
2017-09-01
The functional transformation method (FTM) is a powerful tool for digital sound synthesis by physical modeling: it solves the underlying partial differential equation (PDE) of the instrument directly, yielding the resulting sound, or the vibrational characteristics at discretized points, of the modeled instrument. In this paper, we present Higuchi's method to examine differences between the timbres of tones and to estimate the fractal dimension of musical signals synthesized by the FTM, which carries information about their geometrical structure. With Higuchi's method the whole process is uncomplicated and fast, permits analysis without expertise in physics or musical virtuosity, and offers a simple way for non-specialists to judge how similar the presented sounds are.
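Higuchi's fractal dimension estimate referenced above can be implemented directly from the signal samples. The sketch below follows the standard formulation of Higuchi's method; the choice of kmax and the test tone are illustrative, not the authors' settings.

```python
# A sketch of Higuchi's fractal dimension estimate for a sampled signal.
from math import log, sin, pi

def higuchi_fd(x, kmax=8):
    n = len(x)
    log_k, log_l = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            num = (n - 1 - m) // k  # number of steps of size k from offset m
            if num < 1:
                continue
            dist = sum(abs(x[m + i * k] - x[m + (i - 1) * k])
                       for i in range(1, num + 1))
            # Normalized curve length for this offset at scale k.
            lengths.append(dist * (n - 1) / (num * k * k))
        log_k.append(log(1.0 / k))
        log_l.append(log(sum(lengths) / len(lengths)))
    # Least-squares slope of log L(k) vs log(1/k) is the fractal dimension.
    mk, ml = sum(log_k) / len(log_k), sum(log_l) / len(log_l)
    num = sum((a - mk) * (b - ml) for a, b in zip(log_k, log_l))
    den = sum((a - mk) ** 2 for a in log_k)
    return num / den

# A smooth tone should give a dimension near 1; noisy signals approach 2.
tone = [sin(2 * pi * 5 * t / 1000) for t in range(1000)]
fd = higuchi_fd(tone)
```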
Real time polymer nanocomposites-based physical nanosensors: theory and modeling.
Bellucci, Stefano; Shunin, Yuri; Gopeyenko, Victor; Lobanova-Shunina, Tamara; Burlutskaya, Nataly; Zhukovskii, Yuri
2017-09-01
Functionalized carbon nanotube and graphene nanoribbon nanostructures, serving as the basis for the creation of physical pressure and temperature nanosensors, are considered as tools for ecological monitoring and medical applications. Fragments of nanocarbon inclusions with different morphologies, presenting a disordered system, are regarded as models for nanocomposite materials based on carbon nanocluster suspensions in dielectric polymer environments (e.g., epoxy resins). We have formulated an approach to conductivity calculations for carbon-based polymer nanocomposites using the effective-media cluster approach, disordered systems theory, and conductivity mechanisms analysis, and obtained the calibration dependences. Providing a proper description of electric responses in nanosensing systems, we demonstrate the implementation of advanced simulation models suitable for real-time control nanosystems. We also consider the prospects and prototypes of the proposed physical nanosensor models, providing comparisons with experimental calibration dependences.
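As a hedged illustration of the effective-media family of models invoked above (the paper's cluster approach is more elaborate), the symmetric Bruggeman approximation gives a simple calibration-style curve of composite conductivity versus filler fraction. All conductivity values below are invented for the example.

```python
# Sketch: symmetric Bruggeman effective-medium estimate of composite
# conductivity sigma_e for filler fraction f, solved by bisection.
# Governing equation: f*(sf-se)/(sf+2se) + (1-f)*(sm-se)/(sm+2se) = 0.
def bruggeman_sigma(f, sigma_f, sigma_m, tol=1e-12):
    def g(se):
        return (f * (sigma_f - se) / (sigma_f + 2 * se)
                + (1 - f) * (sigma_m - se) / (sigma_m + 2 * se))
    lo, hi = min(sigma_f, sigma_m), max(sigma_f, sigma_m)
    while hi - lo > tol * hi:
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:   # g decreases in se, so the root lies above mid
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Calibration-style curve: conductivity rises sharply past f = 1/3, the
# percolation threshold of this approximation.
curve = [(f / 10, bruggeman_sigma(f / 10, sigma_f=1e3, sigma_m=1e-9))
         for f in range(11)]
```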
Pilot Wave Model for Impulsive Thrust from RF Test Device Measured in Vacuum
NASA Technical Reports Server (NTRS)
White, Harold; Lawrence, James; Sylvester, Andre; Vera, Jerry; Chap, Andrew; George, Jeff
2017-01-01
A physics model is developed in detail and its place in the taxonomy of ideas about the nature of the quantum vacuum is discussed. The experimental results from the recently completed vacuum test campaign evaluating the impulsive thrust performance of a tapered RF test article excited in the TM212 mode at 1,937 megahertz (MHz) are summarized. The empirical data from this campaign are compared to the predictions from the physics model tools. A discussion is provided to further elaborate on the possible implications of the proposed model if it is physically valid. Based on the correlation of analysis prediction with experimental data collected, it is proposed that the observed anomalous thrust forces are real, not due to experimental error, and are due to a new type of interaction with quantum vacuum fluctuations.
Together and apart: a typology of re-partnering in old age.
Koren, Chaya
2014-08-01
The human need for love, friendship, and physical contact, and the fear of loneliness do not diminish with age. Widowhood and late-life divorce and increased life expectancy are likely to lead to alternative relationships, such as re-partnering. The purpose of this paper is to explore interplays between emotional and physical components of re-partnering in old age. Theoretical sampling of 20 couples included men who re-partnered at the age of 65+ years and women at the age of 60+ years, following termination of lifelong marriages due to death or divorce. Living arrangements included married or unmarried cohabitation under the same roof or in separate homes. Forty semi-structured interviews were tape-recorded and transcribed verbatim. The couple was the unit of analysis. Interplays between physical and emotional dimensions were examined using five abductive parameters derived from data analysis resulting in a fourfold typology of emotional and physical closeness/distance in re-partnering in old age: (1) living together (physically and emotionally); (2) living apart (physically) together (emotionally); (3) living together (physically) apart (emotionally); and (4) living apart (physically and emotionally). Findings revealed types of partner relationships that are different from lifelong marriages. The typology could help professionals working with older persons regarding what to expect in re-partnering in old age and be included in developmental theories as an option in old age. A quantitative tool for research and therapy purposes, entitled The Re-partnering in Old Age Typology Scale (RPOAT Scale), based on abductive parameters, could be established for measuring re-partnering relationship quality and classifying re-partnering couples.
NASA Astrophysics Data System (ADS)
Leka, K. D.; Barnes, Graham; Wagner, Eric
2018-04-01
A classification infrastructure built upon Discriminant Analysis (DA) has been developed at NorthWest Research Associates for examining the statistical differences between samples of two known populations. Originating to examine the physical differences between flare-quiet and flare-imminent solar active regions, we describe herein some details of the infrastructure including: parametrization of large datasets, schemes for handling "null" and "bad" data in multi-parameter analysis, application of non-parametric multi-dimensional DA, an extension through Bayes' theorem to probabilistic classification, and methods invoked for evaluating classifier success. The classifier infrastructure is applicable to a wide range of scientific questions in solar physics. We demonstrate its application to the question of distinguishing flare-imminent from flare-quiet solar active regions, updating results from the original publications that were based on different data and much smaller sample sizes. Finally, as a demonstration of "Research to Operations" efforts in the space-weather forecasting context, we present the Discriminant Analysis Flare Forecasting System (DAFFS), a near-real-time operationally-running solar flare forecasting tool that was developed from the research-directed infrastructure.
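The step from discriminant analysis to probabilistic classification via Bayes' theorem, described above, can be sketched with a one-parameter, two-class Gaussian example. All parameter values here are invented for illustration and are not DAFFS's fitted values.

```python
# A toy sketch of the DA-to-probability step: one-parameter Gaussian
# discriminant analysis turned into a flaring probability via Bayes'
# theorem. Parameter values are invented for illustration.
from math import exp, pi, sqrt

def gaussian_pdf(x, mu, sigma):
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

def flare_probability(x, mu_flare, sigma_flare, mu_quiet, sigma_quiet,
                      prior_flare):
    """Posterior P(flare | x) from class-conditional densities and a prior."""
    pf = gaussian_pdf(x, mu_flare, sigma_flare) * prior_flare
    pq = gaussian_pdf(x, mu_quiet, sigma_quiet) * (1 - prior_flare)
    return pf / (pf + pq)

# A parameter value halfway between equally spread class means is
# uninformative, so the posterior stays at the 10% climatological prior.
p = flare_probability(0.5, mu_flare=1.0, sigma_flare=0.5,
                      mu_quiet=0.0, sigma_quiet=0.5, prior_flare=0.1)
```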
New physics with the lepton flavor violating decay τ →3 μ
NASA Astrophysics Data System (ADS)
Calcuttawala, Zaineb; Kundu, Anirban; Nandi, Soumitra; Patra, Sunando Kumar
2018-05-01
Lepton flavor violating (LFV) processes are a smoking-gun signal of new physics (NP). If the semileptonic B decay anomalies are indeed due to some NP, such operators can potentially lead to LFV decays involving the second and third generation leptons, like τ → 3μ. In this paper, we explore how far the nature of the NP can be unraveled at next-generation B-factories like Belle-II, provided the decay τ → 3μ has been observed. We use four observables with which differentiation among NP operators may be achieved to a high confidence level. The possible presence of multiple NP operators is also analyzed with the optimal observable technique. While the analysis can be improved even further if the final-state muon polarizations are measured, we present this work as a motivational tool for experimentalists, as well as a template for the analysis of similar processes.
Research-based resources on PhysPort
NASA Astrophysics Data System (ADS)
Sayre, Eleanor
2017-01-01
PhysPort (http://physport.org) is a website that supports physics faculty in implementing research-based teaching practices in their classrooms. We provide expert recommendations and practical information about teaching methods and assessment. The PhysPort Data Explorer is an intuitive online tool for physics faculty to analyze their assessment data. Faculty upload their students' responses using our secure interface. The Data Explorer matches their pre/post data, scores it, compares it to national data, and graphs it in an interactive and intuitive manner. The Periscope collection on PhysPort brings together classroom video of students working in groups with professional development materials for faculty, pre-service teachers, and learning assistants. To support PhysPort's development efforts, we conduct research on faculty needs around teaching and assessment, secondary analysis of published PER studies, and primary analysis of assessment data. In this talk, I'll introduce some of PhysPort's research-based resources and the research results that support them.
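The matching and scoring of pre/post assessment data described above commonly feeds into Hake's average normalized gain, which can be sketched as follows. This is an illustrative sketch, not the Data Explorer's actual code, and the student IDs and scores are invented.

```python
# Sketch of one computation behind pre/post assessment analysis:
# match students with both scores, then compute Hake's normalized gain.
def normalized_gain(pre_mean, post_mean, max_score=100.0):
    """<g> = (post - pre) / (max - pre); undefined when pre == max."""
    return (post_mean - pre_mean) / (max_score - pre_mean)

def match_pre_post(pre, post):
    """Keep only students with both a pre and a post score (matched data)."""
    common = sorted(set(pre) & set(post))
    return [(sid, pre[sid], post[sid]) for sid in common]

pre = {"s1": 30.0, "s2": 50.0, "s3": 40.0}
post = {"s1": 65.0, "s2": 75.0}            # s3 missed the post-test
matched = match_pre_post(pre, post)        # s1 and s2 only
g = normalized_gain(sum(p for _, p, _ in matched) / len(matched),
                    sum(q for _, _, q in matched) / len(matched))
```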