Sweeney, Jane K; Heriza, Carolyn B; Blanchard, Yvette
2009-01-01
To describe clinical training models, delineate clinical competencies, and outline a clinical decision-making algorithm for neonatal physical therapy. In these updated practice guidelines, advanced clinical training models, including precepted practicum and residency or fellowship training, are presented to guide practitioners in organizing mentored, competency-based preparation for neonatal care. Clinical competencies in neonatal physical therapy are outlined with advanced clinical proficiencies and knowledge areas specific to each role. An algorithm for decision making on examination, evaluation, intervention, and re-examination processes provides a framework for clinical reasoning. Because of advanced-level competency requirements and the continuous examination, evaluation, and modification of procedures during each patient contact, the intensive care unit is a restricted practice area for physical therapist assistants, physical therapist generalists, and physical therapy students. Accountable, ethical physical therapy for neonates requires advanced, competency-based training with a preceptor in the pediatric subspecialty of neonatology.
Advanced Computing Tools and Models for Accelerator Physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryne, Robert; Ryne, Robert D.
2008-06-11
This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction, I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large-scale computing in accelerator physics.
NASA Astrophysics Data System (ADS)
Clark, Martyn P.; Bierkens, Marc F. P.; Samaniego, Luis; Woods, Ross A.; Uijlenhoet, Remko; Bennett, Katrina E.; Pauwels, Valentijn R. N.; Cai, Xitian; Wood, Andrew W.; Peters-Lidard, Christa D.
2017-07-01
The diversity in hydrologic models has historically led to great controversy on the correct approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this paper, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.
NASA Astrophysics Data System (ADS)
Clark, M. P.; Nijssen, B.; Wood, A.; Mizukami, N.; Newman, A. J.
2017-12-01
The diversity in hydrologic models has historically led to great controversy on the "correct" approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this paper, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.
High Fidelity Modeling of Field Reversed Configuration (FRC) Thrusters
2016-06-01
… studies of the physical characteristics of Field Reversed Configuration (FRC) plasma for advanced space propulsion. This effort consists of numerical model development, physical model development, and systematic studies of the non-linear plasma … FRCs for propulsion application. Two of the most advanced designs are based on the theta-pinch and the RMF formation mechanisms, which …
Source Physics Experiments at the Nevada Test Site
2010-09-01
… seismograms through three-dimensional models of the earth will move monitoring science into a physics-based era. This capability should enable … the advanced ability to model synthetic seismograms in three-dimensional earth models should also lead to advances in the ability to locate and …
Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cetiner, Mustafa Sacit; none,; Flanagan, George F.
2014-07-30
An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C+, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.
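The event tree/fault tree side of such a coupling can be illustrated with a minimal sketch. All names and probabilities below are invented for illustration, not taken from the report, and independence of basic events is assumed:

```python
# Minimal event-tree/fault-tree evaluation assuming independent basic
# events. An AND gate multiplies probabilities; an OR gate uses the
# complement rule 1 - prod(1 - p_i).
def and_gate(probs):
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

# Hypothetical top event: coolant flow is lost if the pump fails OR
# (offsite power fails AND backup power fails).
p_top = or_gate([0.01, and_gate([0.05, 0.1])])
```

Coupling to a multi-physics model, as in the report, then amounts to updating basic-event states from an injected fault and re-evaluating the tree.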
Physically based modeling in catchment hydrology at 50: Survey and outlook
NASA Astrophysics Data System (ADS)
Paniconi, Claudio; Putti, Mario
2015-09-01
Integrated, process-based numerical models in hydrology are rapidly evolving, spurred by novel theories in mathematical physics, advances in computational methods, insights from laboratory and field experiments, and the need to better understand and predict the potential impacts of population, land use, and climate change on our water resources. At the catchment scale, these simulation models are commonly based on conservation principles for surface and subsurface water flow and solute transport (e.g., the Richards, shallow water, and advection-dispersion equations), and they require robust numerical techniques for their resolution. Traditional (and still open) challenges in developing reliable and efficient models are associated with heterogeneity and variability in parameters and state variables; nonlinearities and scale effects in process dynamics; and complex or poorly known boundary conditions and initial system states. As catchment modeling enters a highly interdisciplinary era, new challenges arise from the need to maintain physical and numerical consistency in the description of multiple processes that interact over a range of scales and across different compartments of an overall system. This paper first gives an historical overview (past 50 years) of some of the key developments in physically based hydrological modeling, emphasizing how the interplay between theory, experiments, and modeling has contributed to advancing the state of the art. The second part of the paper examines some outstanding problems in integrated catchment modeling from the perspective of recent developments in mathematical and computational science.
Advanced instrumentation for aeronautical propulsion research
NASA Technical Reports Server (NTRS)
Hartmann, M. J.
1986-01-01
The development and use of advanced instrumentation and measurement systems are key to extending the understanding of the physical phenomena that limit the advancement of aeropropulsion systems. The data collected by using these systems are necessary to verify numerical models and to increase the technologists' intuition into the physical phenomena. The systems must be versatile enough to allow their use with older technology measurement systems, with computer-based data reduction systems, and with existing test facilities. Researchers in all aeropropulsion fields contribute to the development of these systems.
Mastery Learning in Physical Education.
ERIC Educational Resources Information Center
Annarino, Anthony
This paper discusses the design of a physical education curriculum to be used in advanced secondary physical education programs and in university basic instructional programs; the design is based on the premise of mastery learning and employs programed instructional techniques. The effective implementation of a mastery learning model necessitates…
NASA Astrophysics Data System (ADS)
Lei, Li
1999-07-01
In this study the researcher develops and presents a new model, founded on the laws of physics, for analyzing dance technique. Based on a pilot study of four advanced dance techniques, she creates a new model for diagnosing, analyzing and describing basic, intermediate and advanced dance techniques. The name for this model is "PED," which stands for Physics of Expressive Dance. The research design consists of five phases: (1) Conduct a pilot study to analyze several advanced dance techniques chosen from Chinese dance, modern dance, and ballet; (2) Based on learning obtained from the pilot study, create the PED Model for analyzing dance technique; (3) Apply this model to eight categories of dance technique; (4) Select two advanced dance techniques from each category and analyze these sample techniques to demonstrate how the model works; (5) Develop an evaluation framework and use it to evaluate the effectiveness of the model, taking into account both scientific and artistic aspects of dance training. In this study the researcher presents new solutions to three problems highly relevant to dance education: (1) Dancers attempting to learn difficult movements often fail because they are unaware of physics laws; (2) Even those who do master difficult movements can suffer injury due to incorrect training methods; (3) Even the best dancers can waste time learning by trial and error, without scientific instruction. In addition, the researcher discusses how the application of the PED model can benefit dancers, allowing them to avoid inefficient and ineffective movements and freeing them to focus on the artistic expression of dance performance. This study is unique, presenting the first comprehensive system for analyzing dance techniques in terms of physics laws. The results of this study are useful, allowing a new level of awareness about dance techniques that dance professionals can utilize for more effective and efficient teaching and learning. The approach utilized in this study is universal, and can be applied to any dance movement and to any dance style.
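As an illustration of the kind of physics-law analysis a model like PED applies, consider generic vertical-jump mechanics (this example is standard projectile physics, not taken from the dissertation):

```python
# Vertical-jump mechanics from takeoff velocity v0 (m/s), g = 9.81 m/s^2:
# apex height h = v0^2 / (2 g), total hang time t = 2 v0 / g.
G = 9.81

def jump_apex(v0):
    # Peak rise of the centre of mass, in metres.
    return v0 ** 2 / (2 * G)

def hang_time(v0):
    # Total time in the air, in seconds.
    return 2 * v0 / G
```

A dancer leaving the floor at 3 m/s rises about 0.46 m and is airborne about 0.61 s; because both quantities depend only on takeoff velocity, nothing done in the air changes them, which is exactly the kind of constraint a physics-based analysis of technique makes explicit.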
Integrated Formulation of Beacon-Based Exception Analysis for Multimissions
NASA Technical Reports Server (NTRS)
Mackey, Ryan; James, Mark; Park, Han; Zak, Mickail
2003-01-01
Further work on beacon-based exception analysis for multimissions (BEAM), a method of real-time, automated diagnosis of complex electromechanical systems, has greatly expanded its capability and suitability of application. This expanded formulation, which fully integrates physical models and symbolic analysis, is described. The new formulation of BEAM expands upon previous advanced techniques for analysis of signal data, utilizing mathematical modeling of the system physics and expert-system reasoning.
Physics through the 1990s: Gravitation, cosmology and cosmic-ray physics
NASA Technical Reports Server (NTRS)
1986-01-01
The volume contains recommendations for space- and ground-based programs in gravitational physics, cosmology, and cosmic-ray physics. The section on gravitation examines current and planned experimental tests of general relativity; the theory behind, and search for, gravitational waves, including sensitive laser-interferometric tests and other observations; and advances in gravitation theory (for example, incorporating quantum effects). The section on cosmology deals with the big-bang model, the standard model from elementary-particle theory, and the inflationary model of the Universe. Computational needs are presented for both gravitation and cosmology. Finally, cosmic-ray physics theory (nucleosynthesis, acceleration models, high-energy physics) and experiment (ground and spaceborne detectors) are discussed.
Physics-based simulation models for EBSD: advances and challenges
NASA Astrophysics Data System (ADS)
Winkelmann, A.; Nolze, G.; Vos, M.; Salvat-Pujol, F.; Werner, W. S. M.
2016-02-01
EBSD has evolved into an effective tool for microstructure investigations in the scanning electron microscope. The purpose of this contribution is to give an overview of various simulation approaches for EBSD Kikuchi patterns and to discuss some of the underlying physical mechanisms.
Observation-Based Dissipation and Input Terms for Spectral Wave Models, with End-User Testing
2014-09-30
… functions, based on advanced understanding of the physics of air-sea interactions, wave breaking, and swell attenuation, in wave-forecast models. …
Advanced Machine Learning Emulators of Radiative Transfer Models
NASA Astrophysics Data System (ADS)
Camps-Valls, G.; Verrelst, J.; Martino, L.; Vicent, J.
2017-12-01
Physically-based model inversion methodologies are based on physical laws and established cause-effect relationships. A plethora of remote sensing applications rely on the physical inversion of a Radiative Transfer Model (RTM), which leads to physically meaningful bio-geo-physical parameter estimates. The process is however computationally expensive and needs expert knowledge for the selection of the RTM, its parametrization, and the look-up table generation, as well as its inversion. Mimicking complex codes with statistical nonlinear machine learning algorithms has become the natural alternative very recently. Emulators are statistical constructs able to approximate the RTM, although at a fraction of the computational cost, providing an estimation of uncertainty, and estimations of the gradient or finite integral forms. We review the field and recent advances of emulation of RTMs with machine learning models. We posit Gaussian processes (GPs) as the proper framework to tackle the problem. Furthermore, we introduce an automatic methodology to construct emulators for costly RTMs. The Automatic Gaussian Process Emulator (AGAPE) methodology combines the interpolation capabilities of GPs with the accurate design of an acquisition function that favours sampling in low density regions and flatness of the interpolation function. We illustrate the good capabilities of our emulators in toy examples, the leaf- and canopy-level PROSPECT and PROSAIL RTMs, and for the construction of an optimal look-up-table for atmospheric correction based on MODTRAN5.
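A bare-bones GP emulator of a costly model can be sketched as follows. The toy target function stands in for an RTM, the kernel and noise choices are illustrative, and no acquisition function (the AGAPE ingredient) is included:

```python
import numpy as np

def rbf(X1, X2, ell=0.3):
    # Squared-exponential kernel between two sets of 1-D inputs.
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return np.exp(-0.5 * d2 / ell ** 2)

def gp_predict(Xtr, ytr, Xte, noise=1e-6):
    # Standard GP regression: posterior mean and variance at test inputs.
    K = rbf(Xtr, Xtr) + noise * np.eye(len(Xtr))
    Ks = rbf(Xte, Xtr)
    alpha = np.linalg.solve(K, ytr)
    mean = Ks @ alpha
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mean, var

# Stand-in for a "costly RTM", evaluated on a small training design:
f = lambda x: np.sin(3 * x)
Xtr = np.linspace(0.0, 2.0, 15)
Xte = np.linspace(0.0, 2.0, 101)
mean, var = gp_predict(Xtr, f(Xtr), Xte)
```

Once fitted, the emulator replaces the expensive forward runs; the predictive variance `var` is what an acquisition function would use to decide where the next costly evaluation is most informative.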
NASA Astrophysics Data System (ADS)
Nardi, F.; Grimaldi, S.; Petroselli, A.
2012-12-01
Remotely sensed Digital Elevation Models (DEMs), largely available at high resolution, and advanced terrain analysis techniques built in Geographic Information Systems (GIS), provide unique opportunities for DEM-based hydrologic and hydraulic modelling in data-scarce river basins, paving the way for flood mapping at the global scale. This research is based on the implementation of a fully continuous hydrologic-hydraulic modelling approach optimized for ungauged basins with limited river flow measurements. The proposed procedure is characterized by a rainfall generator that feeds a continuous rainfall-runoff model producing flow time series that are routed along the channel using a two-dimensional hydraulic model for the detailed representation of the inundation process. The main advantage of the proposed approach is the characterization of the entire physical process during hydrologic extreme events of channel runoff generation, propagation, and overland flow within the floodplain domain. This physically-based model eliminates the need for synthetic design hyetograph and hydrograph estimation, which constitutes the main source of subjective analysis and uncertainty in standard methods for flood mapping. Selected case studies show results and performances of the proposed procedure with respect to standard event-based approaches.
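The continuous rainfall-runoff component can be caricatured with a single linear reservoir. This is a deliberate simplification for illustration; the authors' model is far richer:

```python
def linear_reservoir(rain, k=0.2, dt=1.0):
    # Storage S obeys dS/dt = P - Q with outflow Q = k * S,
    # stepped explicitly: add rainfall, release a fraction k of storage.
    storage, flow = 0.0, []
    for p in rain:
        storage += p * dt
        q = k * storage
        storage -= q * dt
        flow.append(q)
    return flow

# A 10 mm rainfall pulse followed by dry steps yields a recession curve.
hydrograph = linear_reservoir([10.0, 0.0, 0.0, 0.0])
```

In the procedure described above, flow series like this (driven by a stochastic rainfall generator) would then be routed through the two-dimensional hydraulic model to map inundation.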
NASA Astrophysics Data System (ADS)
Tallapragada, V.
2017-12-01
NOAA's Next Generation Global Prediction System (NGGPS) has provided the unique opportunity to develop and implement a non-hydrostatic global model based on the Geophysical Fluid Dynamics Laboratory (GFDL) Finite Volume Cubed Sphere (FV3) Dynamic Core at the National Centers for Environmental Prediction (NCEP), making a leap-step advancement in seamless prediction capabilities across all spatial and temporal scales. Model development efforts are centralized with unified model development in the NOAA Environmental Modeling System (NEMS) infrastructure based on the Earth System Modeling Framework (ESMF). A more sophisticated coupling among various earth system components is being enabled within NEMS following National Unified Operational Prediction Capability (NUOPC) standards. The eventual goal of unifying global and regional models will enable operational global models to operate at convection-resolving scales. Apart from the advanced non-hydrostatic dynamic core and coupling to various earth system components, advanced physics and data assimilation techniques are essential for improved forecast skill. NGGPS is spearheading ambitious physics and data assimilation strategies, concentrating on creation of a Common Community Physics Package (CCPP) and the Joint Effort for Data Assimilation Integration (JEDI). Both initiatives are expected to be community developed, with emphasis on research transitioning to operations (R2O). The unified modeling system is being built to support the needs of both operations and research. Different layers of community partners are also established with specific roles and responsibilities for researchers, core development partners, trusted super-users, and operations. Stakeholders are engaged at all stages to help drive the direction of development, resource allocation, and prioritization.
This talk presents the current and future plans of unified model development at NCEP for weather, sub-seasonal, and seasonal climate prediction applications with special emphasis on implementation of NCEP FV3 Global Forecast System (GFS) and Global Ensemble Forecast System (GEFS) into operations by 2019.
Radiative Transfer Modeling and Retrievals for Advanced Hyperspectral Sensors
NASA Technical Reports Server (NTRS)
Liu, Xu; Zhou, Daniel K.; Larar, Allen M.; Smith, William L., Sr.; Mango, Stephen A.
2009-01-01
A novel radiative transfer model and a physical inversion algorithm based on principal component analysis will be presented. Instead of dealing with channel radiances, the new approach fits principal component scores of these quantities. Compared to channel-based radiative transfer models, the new approach compresses radiances into a much smaller dimension making both forward modeling and inversion algorithm more efficient.
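The compression idea can be sketched with synthetic low-rank "radiances" (illustrative only; the operational approach trains principal components on realistic spectra):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "channel radiances": 200 spectra x 50 channels of rank 3,
# mimicking the strong inter-channel correlation of hyperspectral data.
spectra = rng.normal(size=(200, 3)) @ rng.normal(size=(3, 50))

mean = spectra.mean(axis=0)
_, _, Vt = np.linalg.svd(spectra - mean, full_matrices=False)

k = 5                                    # retain a few leading components
scores = (spectra - mean) @ Vt[:k].T     # 200 x 5 instead of 200 x 50
recon = scores @ Vt[:k] + mean           # back to channel space
```

Forward modeling and inversion then operate on 5 principal-component scores per spectrum rather than 50 channel radiances, which is the source of the efficiency gain described above.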
State of the art of sonic boom modeling
NASA Astrophysics Data System (ADS)
Plotkin, Kenneth J.
2002-01-01
Based on fundamental theory developed through the 1950s and 1960s, sonic boom modeling has evolved into practical tools. Over the past decade, there have been requirements for design tools for an advanced supersonic transport, and for tools for environmental assessment of various military and aerospace activities. This has resulted in a number of advances in the understanding of the physics of sonic booms, including shock wave rise times, propagation through turbulence, and blending sonic boom theory with modern computational fluid dynamics (CFD) aerodynamic design methods. This article reviews the early fundamental theory, recent advances in theory, and the application of these advances to practical models.
Intuitive Physics: Current Research and Controversies.
Kubricht, James R; Holyoak, Keith J; Lu, Hongjing
2017-10-01
Early research in the field of intuitive physics provided extensive evidence that humans succumb to common misconceptions and biases when predicting, judging, and explaining activity in the physical world. Recent work has demonstrated that, across a diverse range of situations, some biases can be explained by the application of normative physical principles to noisy perceptual inputs. However, it remains unclear how knowledge of physical principles is learned, represented, and applied to novel situations. In this review we discuss theoretical advances from heuristic models to knowledge-based, probabilistic simulation models, as well as recent deep-learning models. We also consider how recent work may be reconciled with earlier findings that favored heuristic models.
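The "normative principles plus noisy perception" account can be made concrete with a toy stability judgment (entirely illustrative; not a model from the reviewed papers):

```python
import random

def p_judged_falling(com_offset, noise=0.5, trials=10000, seed=42):
    # A block is judged to fall when its perceived centre of mass lies
    # beyond the support edge (offset > 0). The physical rule itself is
    # deterministic and correct; Gaussian noise on the perceived offset
    # turns the judgment into a graded, bias-prone probability.
    rng = random.Random(seed)
    falls = sum(1 for _ in range(trials)
                if com_offset + rng.gauss(0.0, noise) > 0.0)
    return falls / trials
```

A block whose centre of mass sits well inside the support is almost never judged to fall, one well beyond the edge almost always is, and a borderline block is judged to fall about half the time, reproducing the graded responses that simulation models explain.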
Advances in Cell and Gene-based Therapies for Cystic Fibrosis Lung Disease
Oakland, Mayumi; Sinn, Patrick L; McCray Jr, Paul B
2012-01-01
Cystic fibrosis (CF) is a disease characterized by airway infection, inflammation, remodeling, and obstruction that gradually destroy the lungs. Direct delivery of the cystic fibrosis transmembrane conductance regulator (CFTR) gene to airway epithelia may offer advantages, as the tissue is accessible for topical delivery of vectors. Yet, physical and host immune barriers in the lung present challenges for successful gene transfer to the respiratory tract. Advances in gene transfer approaches, tissue engineering, and novel animal models are generating excitement within the CF research field. This review discusses current challenges and advancements in viral and nonviral vectors, cell-based therapies, and CF animal models. PMID:22371844
NASA Astrophysics Data System (ADS)
Toepfer, F.; Cortinas, J. V., Jr.; Kuo, W.; Tallapragada, V.; Stajner, I.; Nance, L. B.; Kelleher, K. E.; Firl, G.; Bernardet, L.
2017-12-01
NOAA develops, operates, and maintains an operational global modeling capability for weather, sub-seasonal, and seasonal prediction for the protection of life and property and fostering the US economy. In order to substantially improve the overall performance and accelerate advancements of the operational modeling suite, NOAA is partnering with NCAR to design and build the Global Modeling Test Bed (GMTB). The GMTB has been established to provide a platform and a capability for researchers to contribute to the advancement, primarily through the development of physical parameterizations needed to improve operational NWP. The strategy to achieve this goal relies on effectively leveraging global expertise through a modern collaborative software development framework. This framework consists of a repository of vetted and supported physical parameterizations known as the Common Community Physics Package (CCPP), a common well-documented interface known as the Interoperable Physics Driver (IPD) for combining schemes into suites and for their configuration and connection to dynamic cores, and an open evidence-based governance process for managing the development and evolution of the CCPP. In addition, a physics test harness designed to work within this framework has been established in order to facilitate easier like-to-like comparison of physics advancements. This paper will present an overview of the design of the CCPP and test platform. Additionally, an overview of potential new opportunities for how physics developers can engage in the process, from implementing code for CCPP/IPD compliance to testing their development within an operational-like software environment, will be presented. In addition, insight will be given as to how development gets elevated to CCPP-supported status, the pre-cursor to broad availability and use within operational NWP. An overview of how the GMTB can be expanded to support other global or regional modeling capabilities will also be presented.
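The "common interface plus registry of schemes" idea behind CCPP/IPD can be caricatured in a few lines. Scheme names, state variables, and tendencies below are invented; the real interfaces are Fortran-based and far more involved:

```python
from typing import Callable, Dict, List

# A scheme maps the model state to tendencies for a subset of variables.
Scheme = Callable[[Dict[str, float]], Dict[str, float]]
REGISTRY: Dict[str, Scheme] = {}

def register(name: str):
    # Schemes self-register under a name, so suites are just ordered lists.
    def deco(fn: Scheme) -> Scheme:
        REGISTRY[name] = fn
        return fn
    return deco

@register("toy_radiation")
def toy_radiation(state):
    return {"T": -0.01 * state["T"]}            # illustrative cooling

@register("toy_boundary_layer")
def toy_boundary_layer(state):
    return {"T": 0.005 * (300.0 - state["T"])}  # relax toward 300 K

def run_suite(state: Dict[str, float], suite: List[str], dt: float):
    # The driver knows only the common interface, never scheme internals.
    for name in suite:
        for var, tend in REGISTRY[name](state).items():
            state[var] += dt * tend
    return state

state = run_suite({"T": 290.0}, ["toy_radiation", "toy_boundary_layer"], dt=1.0)
```

Because the driver depends only on the shared interface, swapping or reordering parameterizations is a configuration change rather than a code change, which is the interoperability goal described above.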
A Diagnostic Model for Impending Death in Cancer Patients: Preliminary Report
Hui, David; Hess, Kenneth; dos Santos, Renata; Chisholm, Gary; Bruera, Eduardo
2015-01-01
Background We recently identified several highly specific bedside physical signs associated with impending death within 3 days among patients with advanced cancer. In this study, we developed and assessed a diagnostic model for impending death based on these physical signs. Methods We systematically documented 62 physical signs every 12 hours from admission to death or discharge in 357 patients with advanced cancer admitted to acute palliative care units (APCUs) at two tertiary care cancer centers. We used recursive partitioning analysis (RPA) to develop a prediction model for impending death in 3 days using admission data. We validated the model with 5 iterations of 10-fold cross-validation, and also applied the model to APCU days 2/3/4/5/6. Results Among 322/357 (90%) patients with complete data for all signs, the 3-day mortality was 24% on admission. The final model was based on 2 variables (palliative performance scale [PPS] and drooping of nasolabial fold) and had 4 terminal leaves: PPS≤20% and drooping of nasolabial fold present, PPS≤20% and drooping of nasolabial fold absent, PPS 30–60% and PPS ≥ 70%, with 3-day mortality of 94%, 42%, 16% and 3%, respectively. The diagnostic accuracy was 81% for the original tree, 80% for cross-validation, and 79%–84% for subsequent APCU days. Conclusion(s) We developed a diagnostic model for impending death within 3 days based on 2 objective bedside physical signs. This model was applicable to both APCU admission and subsequent days. Upon further external validation, this model may help clinicians to formulate the diagnosis of impending death. PMID:26218612
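The final two-variable tree reported above is simple enough to transcribe directly. This is a sketch of the published partition for illustration, not a validated clinical tool:

```python
def three_day_mortality(pps: int, nasolabial_drooping: bool) -> float:
    # Terminal leaves and 3-day mortality rates as reported in the abstract:
    # PPS <= 20% with/without drooping of the nasolabial fold, PPS 30-60%,
    # and PPS >= 70%.
    if pps <= 20:
        return 0.94 if nasolabial_drooping else 0.42
    return 0.16 if pps <= 60 else 0.03
```

For example, a patient admitted with PPS 20% and drooping of the nasolabial fold falls in the highest-risk leaf (94% 3-day mortality), while a patient with PPS 70% falls in the lowest (3%).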
Real time polymer nanocomposites-based physical nanosensors: theory and modeling.
Bellucci, Stefano; Shunin, Yuri; Gopeyenko, Victor; Lobanova-Shunina, Tamara; Burlutskaya, Nataly; Zhukovskii, Yuri
2017-09-01
Functionalized carbon nanotube and graphene nanoribbon nanostructures, serving as the basis for the creation of physical pressure and temperature nanosensors, are considered as tools for ecological monitoring and medical applications. Fragments of nanocarbon inclusions with different morphologies, presenting a disordered system, are regarded as models for nanocomposite materials based on carbon nanocluster suspension in dielectric polymer environments (e.g., epoxy resins). We have formulated the approach of conductivity calculations for carbon-based polymer nanocomposites using the effective media cluster approach, disordered systems theory and conductivity mechanisms analysis, and obtained the calibration dependences. Providing a proper description of electric responses in nanosensing systems, we demonstrate the implementation of advanced simulation models suitable for real-time control nanosystems. We also consider the prospects and prototypes of the proposed physical nanosensor models, providing comparisons with experimental calibration dependences.
Mace Firebaugh, Casey; Moyes, Simon; Jatrana, Santosh; Rolleston, Anna; Kerse, Ngaire
2018-01-18
The relationship between physical activity, function, and mortality is not established in advanced age. Physical activity, function, and mortality were followed for six years in a cohort of Māori and non-Māori adults living in advanced age. Generalised linear regression models were used to analyse the association between physical activity and NEADL scores, while Kaplan-Meier survival analysis and Cox proportional-hazards models were used to assess the association between physical activity and mortality. The hazard ratio for mortality for those in the least active physical activity quartile, compared with the most active quartile, was 4.1 for Māori and 1.8 for non-Māori. There was an inverse relationship between physical activity and mortality, with lower hazard ratios for mortality at all higher levels of physical activity. Higher levels of physical activity were associated with lower mortality and higher functional status in adults of advanced age.
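The Kaplan-Meier estimator named in the abstract can be sketched in a few lines of plain Python. The follow-up data below are hypothetical, not the study's cohort; the estimator itself is the standard product-limit formula.

```python
# Minimal Kaplan-Meier survival estimator (illustrative sketch; the study's
# actual analyses used Kaplan-Meier and Cox models on cohort data).
def kaplan_meier(times, events):
    """times: follow-up times; events: 1 = died, 0 = censored.
    Returns [(t, S(t))] at each distinct event (death) time."""
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])
    ts = [times[k] for k in order]
    es = [events[k] for k in order]
    at_risk, surv, curve, i = n, 1.0, [], 0
    while i < n:
        t, deaths, removed = ts[i], 0, 0
        while i < n and ts[i] == t:       # group ties at the same time
            deaths += es[i]
            removed += 1
            i += 1
        if deaths:                        # survival drops only at death times
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= removed                # censored subjects leave the risk set
    return curve

# Hypothetical follow-up times (years) for a six-person cohort
curve = kaplan_meier([1, 2, 2, 3, 4, 6], [1, 1, 0, 1, 0, 1])
```

The censored subject at year 2 reduces the risk set without dropping the survival curve, which is the defining feature of the method.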
An Anticipatory Model of Cavitation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allgood, G.O.; Dress, W.B., Jr.; Hylton, J.O.
1999-04-05
The Anticipatory System (AS) formalism developed by Robert Rosen provides some insight into the problem of embedding intelligent behavior in machines. An AS emulates the anticipatory behavior of biological systems: it bases its behavior on expectations about the near future, and those expectations are modified as the system gains experience. The expectation is based on an internal model that is drawn from an appeal to physical reality. To be adaptive, the model must be able to update itself. To be practical, the model must run faster than real time. The need for a physical model and the requirement that the model execute at extreme speeds have held back the application of AS to practical problems. Two recent advances make it possible to consider the use of AS for practical intelligent sensors. First, advances in transducer technology make it possible to obtain previously unavailable data from which a model can be derived. For example, acoustic emissions (AE) can be fed into a Bayesian system identifier that enables the separation of a weak characterizing signal, such as the signature of pump cavitation precursors, from a strong masking signal, such as a pump vibration feature. The second advance is the development of extremely fast but inexpensive digital signal processing hardware on which it is possible to run an adaptive Bayesian-derived model faster than real time. This paper reports the investigation of an AS using a model of cavitation based on hydrodynamic principles and Bayesian analysis of data from high-performance AE sensors.
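The anticipatory-control pattern the abstract describes, an internal model run faster than real time whose prediction triggers action before the fault occurs, can be sketched as follows. The linear pressure model, threshold, and horizon are all invented for illustration and are not Rosen's formalism or the paper's cavitation model.

```python
# Sketch of anticipatory control: predict ahead with a cheap internal model,
# act if the prediction crosses a cavitation threshold. All numbers invented.
def predict_pressure(p_now, dpdt, horizon, dt=0.1):
    """Faster-than-real-time internal model: linear extrapolation of suction
    pressure over the prediction horizon (seconds)."""
    steps = int(round(horizon / dt))
    return [p_now + dpdt * dt * (k + 1) for k in range(steps)]

def anticipatory_action(p_now, dpdt, cavitation_threshold=20.0, horizon=5.0):
    """Intervene now if the model expects the threshold to be crossed
    anywhere within the horizon, i.e. before the fault actually occurs."""
    return min(predict_pressure(p_now, dpdt, horizon)) < cavitation_threshold

intervene = anticipatory_action(p_now=30.0, dpdt=-3.0)   # pressure falling fast
```

The key design point is that the decision is driven by the model's expectation of the near future, not by the current measurement alone.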
A Collaborative Model for Community-Based Health Care Screening of Homeless Adolescents.
ERIC Educational Resources Information Center
Busen, Nancy H.; Beech, Bettina
1997-01-01
A multidisciplinary team from community organizations serving the homeless and from universities collaborated in screening 150 homeless adolescents for psychosocial and physical risks. The population had a history of physical, sexual, and substance abuse as well as high rates of HIV and hepatitis B. Case management by advanced practice nurses was…
Predicting remaining life by fusing the physics of failure modeling with diagnostics
NASA Astrophysics Data System (ADS)
Kacprzynski, G. J.; Sarlashkar, A.; Roemer, M. J.; Hess, A.; Hardman, B.
2004-03-01
Technology that enables failure prediction of critical machine components (prognostics) has the potential to significantly reduce maintenance costs and increase availability and safety. This article summarizes a research effort funded through the U.S. Defense Advanced Research Projects Agency and Naval Air System Command aimed at enhancing prognostic accuracy through more advanced physics-of-failure modeling and intelligent utilization of relevant diagnostic information. H-60 helicopter gear is used as a case study to introduce both stochastic sub-zone crack initiation and three-dimensional fracture mechanics lifing models along with adaptive model updating techniques for tuning key failure mode variables at a local material/damage site based on fused vibration features. The overall prognostic scheme is aimed at minimizing inherent modeling and operational uncertainties via sensed system measurements that evolve as damage progresses.
A diagnostic model for impending death in cancer patients: Preliminary report.
Hui, David; Hess, Kenneth; dos Santos, Renata; Chisholm, Gary; Bruera, Eduardo
2015-11-01
Several highly specific bedside physical signs associated with impending death within 3 days for patients with advanced cancer were recently identified. A diagnostic model for impending death based on these physical signs was developed and assessed. Sixty-two physical signs were systematically documented every 12 hours from admission to death or discharge for 357 patients with advanced cancer who were admitted to acute palliative care units (APCUs) at 2 tertiary care cancer centers. Recursive partitioning analysis was used to develop a prediction model for impending death within 3 days with admission data. The model was validated with 5 iterations of 10-fold cross-validation, and the model was also applied to APCU days 2 to 6. For the 322 of 357 patients (90%) with complete data for all signs, the 3-day mortality rate was 24% on admission. The final model was based on 2 variables (Palliative Performance Scale [PPS] and drooping of nasolabial folds) and had 4 terminal leaves: PPS score ≤ 20% and drooping of nasolabial folds present, PPS score ≤ 20% and drooping of nasolabial folds absent, PPS score of 30% to 60%, and PPS score ≥ 70%. The 3-day mortality rates were 94%, 42%, 16%, and 3%, respectively. The diagnostic accuracy was 81% for the original tree, 80% for cross-validation, and 79% to 84% for subsequent APCU days. Based on 2 objective bedside physical signs, a diagnostic model was developed for impending death within 3 days. This model was applicable to both APCU admission and subsequent days. Upon further external validation, this model may help clinicians to formulate the diagnosis of impending death. © 2015 American Cancer Society.
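The published tree is simple enough to render directly as code: two variables, four terminal leaves. The mortality rates below are the ones reported in the abstract; the function interface is of course our own framing.

```python
# The diagnostic tree from the abstract as a lookup: PPS score plus
# drooping of nasolabial folds -> reported 3-day mortality rate.
def three_day_mortality(pps, drooping_nasolabial_folds):
    """pps: Palliative Performance Scale score in %, in multiples of 10.
    Returns the 3-day mortality rate reported for the matching leaf."""
    if pps <= 20:
        return 0.94 if drooping_nasolabial_folds else 0.42
    elif pps <= 60:      # PPS 30% to 60%
        return 0.16
    else:                # PPS >= 70%
        return 0.03
```

Reading the leaves off makes the clinical logic explicit: the sign only changes the prognosis for the lowest-performance patients, where the tree splits on it.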
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
This factsheet describes a project that developed and demonstrated a new manufacturing-informed design framework that utilizes advanced multi-scale, physics-based process modeling to dramatically improve manufacturing productivity and quality in machining operations while reducing the cost of machined components.
NSF's Perspective on Space Weather Research for Building Forecasting Capabilities
NASA Astrophysics Data System (ADS)
Bisi, M. M.; Pulkkinen, A. A.; Bisi, M. M.; Pulkkinen, A. A.; Webb, D. F.; Oughton, E. J.; Azeem, S. I.
2017-12-01
Space weather research at the National Science Foundation (NSF) is focused on scientific discovery and on deepening knowledge of the Sun-Geospace system. The process of maturation of knowledge base is a requirement for the development of improved space weather forecast models and for the accurate assessment of potential mitigation strategies. Progress in space weather forecasting requires advancing in-depth understanding of the underlying physical processes, developing better instrumentation and measurement techniques, and capturing the advancements in understanding in large-scale physics based models that span the entire chain of events from the Sun to the Earth. This presentation will provide an overview of current and planned programs pertaining to space weather research at NSF and discuss the recommendations of the Geospace Section portfolio review panel within the context of space weather forecasting capabilities.
Advancing reservoir operation description in physically based hydrological models
NASA Astrophysics Data System (ADS)
Anghileri, Daniela; Giudici, Federico; Castelletti, Andrea; Burlando, Paolo
2016-04-01
Recent decades have seen significant advances in our capacity to characterize and reproduce hydrological processes within physically based models. Yet, when the human component is considered (e.g. reservoirs, water distribution systems), the associated decisions are generally modeled with very simplistic rules, which might underperform in reproducing the actual operators' behaviour on a daily or sub-daily basis. For example, reservoir operations are usually described by a target-level rule curve, which represents the level that the reservoir should track during normal operating conditions. The associated release decision is determined by the current state of the reservoir relative to the rule curve. This modeling approach can reasonably reproduce the seasonal water volume shift due to reservoir operation. Still, it cannot capture more complex decision-making processes in response, e.g., to fluctuations of energy prices and demands, the temporal unavailability of power plants, or the varying amount of snow accumulated in the basin. In this work, we link a physically explicit hydrological model with detailed hydropower behavioural models describing the decision-making process of the dam operator. In particular, we consider two categories of behavioural models: explicit or rule-based behavioural models, where reservoir operating rules are empirically inferred from observational data, and implicit or optimization-based behavioural models, where, following a normative economic approach, the decision maker is represented as a rational agent maximising a utility function. We compare these two alternative modelling approaches on the real-world water system of the Lake Como catchment in the Italian Alps. The water system is characterized by the presence of 18 artificial hydropower reservoirs generating almost 13% of the Italian hydropower production.
Results show to what extent the hydrological regime in the catchment is affected by the different behavioural models and reservoir operating strategies.
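The target-level rule-curve policy the abstract contrasts with behavioural models can be sketched in a few lines: release is set to pull storage back toward the seasonal target. The numbers and the release cap are illustrative, not the Lake Como system's values.

```python
# Sketch of a target-level rule-curve release decision (units arbitrary).
def rule_curve_release(storage, target_level, inflow, max_release=100.0):
    """Release enough to move storage toward the rule-curve target:
    draw down when above it, retain water when below it."""
    surplus = storage - target_level
    if surplus > 0:
        return min(max_release, inflow + surplus)   # draw down toward target
    return max(0.0, inflow + surplus)               # hold back to refill

r = rule_curve_release(storage=120.0, target_level=100.0, inflow=10.0)
```

A behavioural model would replace this single deterministic rule with decisions inferred from operator data or from utility maximisation, which is exactly the substitution the paper studies.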
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anh Bui; Nam Dinh; Brian Williams
In addition to the validation data plan, the development of advanced techniques for calibration and validation of complex multiscale, multiphysics nuclear reactor simulation codes is a main objective of the CASL VUQ plan. Advanced modeling of LWR systems normally involves a range of physico-chemical models describing multiple interacting phenomena, such as thermal hydraulics, reactor physics, coolant chemistry, etc., which occur over a wide range of spatial and temporal scales. To a large extent, the accuracy of (and uncertainty in) overall model predictions is determined by the correctness of various sub-models, which are not conservation-laws based, but empirically derived from measurement data. Such sub-models normally require extensive calibration before the models can be applied to analysis of real reactor problems. This work demonstrates a case study of calibration of a common model of subcooled flow boiling, which is an important multiscale, multiphysics phenomenon in LWR thermal hydraulics. The calibration process is based on a new strategy of model-data integration, in which all sub-models are simultaneously analyzed and calibrated using multiple sets of data of different types. Specifically, both data on large-scale distributions of void fraction and fluid temperature and data on small-scale physics of wall evaporation were simultaneously used in this work’s calibration. In a departure from traditional (or common-sense) practice of tuning/calibrating complex models, a modern calibration technique based on statistical modeling and Bayesian inference was employed, which allowed simultaneous calibration of multiple sub-models (and related parameters) using different datasets. Quality of data (relevancy, scalability, and uncertainty) could be taken into consideration in the calibration process.
This work presents a step forward in the development and realization of the “CIPS Validation Data Plan” at the Consortium for Advanced Simulation of LWRs to enable quantitative assessment of the CASL modeling of the Crud-Induced Power Shift (CIPS) phenomenon, in particular, and the CASL advanced predictive capabilities, in general. This report is prepared for the Department of Energy’s Consortium for Advanced Simulation of LWRs program’s VUQ Focus Area.
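The core idea of simultaneous multi-dataset Bayesian calibration can be shown with a toy: one sub-model parameter is calibrated against two data types at once by multiplying their likelihoods on a grid. The model forms, data, and noise levels below are entirely invented; the actual CASL work used far richer statistical machinery.

```python
# Toy grid-based Bayesian calibration against two datasets simultaneously.
import math

def grid_posterior(param_grid, datasets):
    """datasets: list of (observations, model_fn, sigma) with observations
    as (x, y) pairs. Returns the normalized posterior over param_grid
    under a flat prior and independent Gaussian errors."""
    logp = []
    for theta in param_grid:
        lp = 0.0
        for obs, model_fn, sigma in datasets:     # all data types at once
            for x, y in obs:
                r = y - model_fn(theta, x)
                lp += -0.5 * (r / sigma) ** 2
        logp.append(lp)
    m = max(logp)
    w = [math.exp(p - m) for p in logp]           # stabilized exponentiation
    s = sum(w)
    return [v / s for v in w]

grid = [0.5 + 0.1 * i for i in range(11)]                          # candidate theta
void = ([(1.0, 1.02), (2.0, 1.98)], lambda th, x: th * x, 0.05)    # "void fraction" data
wall = ([(1.0, 0.51)], lambda th, x: 0.5 * th * x, 0.02)           # "wall evaporation" data
posterior = grid_posterior(grid, [void, wall])
```

Because both likelihoods multiply into one posterior, each data type constrains the shared parameter, which is the point of the model-data integration strategy described above.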
Physics-Based Computational Algorithm for the Multi-Fluid Plasma Model
2014-06-30
Figure 6: Blended finite element method applied to the species-separation problem in capsule implosions. Number densities and the electric field are shown after the laser drive has compressed the multi-fluid plasma; a separation clearly develops. The solution is found using an explicit advance (CFL = 1).
Barton, Justin E.; Boyer, Mark D.; Shi, Wenyu; ...
2015-07-30
DIII-D experimental results are reported to demonstrate the potential of physics-model-based safety factor profile control for robust and reproducible sustainment of advanced scenarios. In the absence of feedback control, variability in wall conditions and plasma impurities, as well as drifts due to external disturbances, can limit the reproducibility of discharges with simple pre-programmed scenario trajectories. The control architecture utilized is a feedforward + feedback scheme where the feedforward commands are computed off-line and the feedback commands are computed on-line. In this work, firstly, a first-principles-driven (FPD), physics-based model of the q profile and normalized beta (βN) dynamics is embedded into a numerical optimization algorithm to design feedforward actuator trajectories that steer the plasma through the tokamak operating space to reach a desired stationary target state that is characterized by the achieved q profile and βN. Good agreement between experimental results and simulations demonstrates the accuracy of the models employed for physics-model-based control design. Secondly, a feedback algorithm for q profile control is designed following a FPD approach, and the ability of the controller to achieve and maintain a target q profile evolution is tested in DIII-D high confinement (H-mode) experiments. The controller is shown to be able to effectively control the q profile when βN is relatively close to the target, indicating the need for integrated q profile and βN control to further enhance the ability to achieve robust scenario execution. Furthermore, the ability of an integrated q profile + βN feedback controller to track a desired target is demonstrated through simulation.
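The feedforward + feedback architecture described above reduces, in one dimension, to adding a precomputed command and an on-line correction toward the target. The gains and the first-order plant below are invented for illustration; the real controllers act on the full q-profile dynamics.

```python
# One-dimensional sketch of feedforward (off-line) + feedback (on-line) control.
def control_step(target, measured, feedforward, kp=0.5):
    """Total command = precomputed feedforward + proportional correction."""
    return feedforward + kp * (target - measured)

def run(target, x0, feedforward, steps=50, gain=0.1):
    """Toy first-order plant driven by the combined command."""
    x = x0
    for _ in range(steps):
        x += gain * (control_step(target, x, feedforward) - x)
    return x

final_q = run(2.0, 0.0, 2.0)   # feedforward chosen as the steady-state command
```

With the feedforward set to the steady-state command, the feedback term only has to absorb drifts and disturbances, which mirrors the division of labor in the abstract.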
Benchmarking and Modeling of a Conventional Mid-Size Car Using ALPHA (SAE Paper 2015-01-1140)
The Advanced Light-Duty Powertrain and Hybrid Analysis (ALPHA) modeling tool was created by EPA to estimate greenhouse gas (GHG) emissions of light-duty vehicles. ALPHA is a physics-based, forward-looking, full vehicle computer simulation capable of analyzing various vehicle type...
Intelligent fault management for the Space Station active thermal control system
NASA Technical Reports Server (NTRS)
Hill, Tim; Faltisco, Robert M.
1992-01-01
The Thermal Advanced Automation Project (TAAP) approach and architecture is described for automating the Space Station Freedom (SSF) Active Thermal Control System (ATCS). The baseline functionality and advanced automation techniques for Fault Detection, Isolation, and Recovery (FDIR) will be compared and contrasted. Advanced automation techniques such as rule-based systems and model-based reasoning should be utilized to efficiently control, monitor, and diagnose this extremely complex physical system. TAAP is developing advanced FDIR software for use on the SSF thermal control system. The goal of TAAP is to join Knowledge-Based System (KBS) technology, using a combination of rules and model-based reasoning, with conventional monitoring and control software in order to maximize autonomy of the ATCS. TAAP's predecessor was NASA's Thermal Expert System (TEXSYS) project, which was the first large real-time expert system to use both extensive rules and model-based reasoning to control and perform FDIR on a large, complex physical system. TEXSYS showed that a method is needed for safely and inexpensively testing all possible faults of the ATCS, particularly those potentially damaging to the hardware, in order to develop a fully capable FDIR system. TAAP therefore includes the development of a high-fidelity simulation of the thermal control system. The simulation provides realistic, dynamic ATCS behavior and fault insertion capability for software testing without hardware-related risks or expense. In addition, thermal engineers will gain greater confidence in the KBS FDIR software than was possible prior to this kind of simulation testing. The TAAP KBS will initially be a ground-based extension of the baseline ATCS monitoring and control software and could be migrated on-board as additional computation resources are made available.
Statistical analysis of target acquisition sensor modeling experiments
NASA Astrophysics Data System (ADS)
Deaver, Dawne M.; Moyer, Steve
2015-05-01
The U.S. Army RDECOM CERDEC NVESD Modeling and Simulation Division is charged with the development and advancement of military target acquisition models to estimate expected soldier performance when using all types of imaging sensors. Two elements of sensor modeling are (1) laboratory-based psychophysical experiments used to measure task performance and calibrate the various models and (2) field-based experiments used to verify the model estimates for specific sensors. In both types of experiments, it is common practice to control or measure environmental, sensor, and target physical parameters in order to minimize the uncertainty of the physics-based modeling. Predicting the minimum number of test subjects required to calibrate or validate the model should be, but is not always, done during test planning. The objective of this analysis is to develop guidelines for test planners that recommend the number and types of test samples required to yield a statistically significant result.
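Sample-size planning of the kind the abstract calls for is often done with the standard two-sample normal-approximation formula. This is a textbook calculation, not one taken from the paper, and the effect size and spread below are invented.

```python
# Standard sample-size formula: n per group to detect a mean difference
# `delta` (standard deviation `sigma`) in a two-sided two-sample test.
import math
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.8):
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # approx. 1.96 for alpha = 0.05
    z_b = NormalDist().inv_cdf(power)           # approx. 0.84 for 80% power
    return math.ceil(2 * ((z_a + z_b) * sigma / delta) ** 2)

# Hypothetical planning case: detect a 0.1 gap in mean task-performance
# probability when observer-to-observer spread is 0.2.
subjects = n_per_group(delta=0.1, sigma=0.2)
```

Running the calculation during test planning, as the abstract recommends, turns "how many observers do we need?" into an explicit trade between effect size, variability, and power.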
Model-Based Control using Model and Mechanization Fusion Techniques for Image-Aided Navigation
2009-03-01
[Advances in research on the effects of forest on hydrological processes].
Zhang, Zhiqiang; Yu, Xinxiao; Zhao, Yutao; Qin, Yongsheng
2003-01-01
According to the effects of forest on hydrological processes, forest hydrology can be divided into three related aspects: experimental research on how forest change affects water quantity and water quality in the hydrological process; mechanistic study of how forest change affects the hydrological cycle; and the establishment and exploitation of physically based distributed forest hydrological models for resource management and engineering construction. Orientation experiment research not only supplies first-hand data for forest hydrological models, but also clarifies precipitation-runoff mechanisms. Research on runoff mechanisms is in turn valuable for the development and improvement of physically based hydrological models, and the models can likewise improve experimental and runoff-mechanism research. A review of these three aspects is presented in this paper.
Switching moving boundary models for two-phase flow evaporators and condensers
NASA Astrophysics Data System (ADS)
Bonilla, Javier; Dormido, Sebastián; Cellier, François E.
2015-03-01
The moving boundary method is an appealing approach for the design, testing, and validation of advanced control schemes for evaporators and condensers. When it comes to advanced control strategies, not only accurate but also fast dynamic models are required. Moving boundary models are fast low-order dynamic models, and they can describe the dynamic behavior with high accuracy. This paper presents a mathematical formulation based on physical principles for two-phase flow moving boundary evaporator and condenser models which support dynamic switching between all possible flow configurations. The models were implemented in a library using the equation-based object-oriented Modelica language. Several integrity tests in steady-state and transient predictions, together with stability tests, verified the models. Experimental data from a direct steam generation parabolic-trough solar thermal power plant are used to validate and compare the developed moving boundary models against finite volume models.
Reliability Quantification of Advanced Stirling Convertor (ASC) Components
NASA Technical Reports Server (NTRS)
Shah, Ashwin R.; Korovaichuk, Igor; Zampino, Edward
2010-01-01
The Advanced Stirling Convertor (ASC) is intended to provide power for an unmanned planetary spacecraft and has an operational life requirement of 17 years. Over this 17-year mission, the ASC must provide power with the desired performance and efficiency and require no corrective maintenance. Reliability demonstration testing for the ASC was found to be very limited due to schedule and resource constraints. Reliability demonstration must involve the application of analysis, system- and component-level testing, and simulation models, taken collectively. Therefore, computer simulation with limited test data verification is a viable approach to assess the reliability of ASC components. This approach is based on physics-of-failure mechanisms and involves the relationships among the design variables based on physics, mechanics, material behavior models, and the interaction of different components and their respective disciplines such as structures, materials, fluids, thermal, mechanical, electrical, etc. In addition, these models are based on the available test data, which can be updated, and the analysis refined as more data and information become available. The failure mechanisms and causes of failure are included in the analysis, especially in light of new information, in order to develop guidelines to improve design reliability and better operating controls to reduce the probability of failure. Quantified reliability assessment based on the fundamental physical behavior of components and their relationships with other components has demonstrated itself to be a superior technique to conventional reliability approaches based on utilizing failure rates derived from similar equipment or simply expert judgment.
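Physics-of-failure reliability quantification is commonly realized as a Monte Carlo evaluation of a physical limit state over the mission life. The stress/strength model, distributions, and numbers below are invented for illustration; only the pattern (sample physical variables, evaluate a limit state over 17 years, count failures) follows the abstract.

```python
# Monte Carlo sketch of physics-of-failure reliability over a 17-year mission.
import random

def reliability(n_samples=100_000, seed=42):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        strength = rng.gauss(100.0, 5.0)     # sampled material capacity (arbitrary units)
        stress_rate = rng.gauss(4.0, 0.8)    # sampled damage accumulation per year
        if stress_rate * 17.0 > strength:    # physics-based limit state at end of life
            failures += 1
    return 1.0 - failures / n_samples

r = reliability()
```

Because the inputs are physical variables rather than generic failure rates, the same machinery can absorb updated test data by tightening the sampled distributions, which is the update path the abstract describes.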
Ocean Modeling in an Eddying Regime
NASA Astrophysics Data System (ADS)
Hecht, Matthew W.; Hasumi, Hiroyasu
This monograph is the first to survey progress in realistic simulation in a strongly eddying regime made possible by recent increases in computational capability. Its contributors comprise the leading researchers in this important and constantly evolving field. The volume is divided into three parts: Oceanographic Processes and Regimes: Fundamental Questions; Ocean Dynamics and State: From Regional to Global Scale; and Modeling at the Mesoscale: State of the Art and Future Directions. It details important advances in physical oceanography based on eddy-resolving ocean modeling. It captures the state of the art and discusses issues that ocean modelers must consider in order to effectively contribute to advancing current knowledge, from subtleties of the underlying fluid dynamical equations to meaningful comparison with oceanographic observations and leading-edge model development. It summarizes many of the important results which have emerged from ocean modeling in an eddying regime, for those interested broadly in the physical science. More technical topics are intended to address the concerns of those actively working in the field.
NASA Astrophysics Data System (ADS)
Bonne, François; Alamir, Mazen; Bonnay, Patrick
2014-01-01
In this paper, a physical method to obtain control-oriented dynamical models of large-scale cryogenic refrigerators is proposed, in order to synthesize model-based advanced control schemes. These schemes aim to replace classical approaches designed from user experience, usually based on many independent PI controllers. This is particularly useful in the case where cryoplants are subjected to large pulsed thermal loads, expected to take place in the cryogenic cooling systems of future fusion reactors such as the International Thermonuclear Experimental Reactor (ITER) or the Japan Torus-60 Super Advanced Fusion Experiment (JT-60SA). Advanced control schemes lead to better perturbation immunity and rejection, offering safer utilization of cryoplants. The paper gives details on how basic components used in the field of large-scale helium refrigeration (especially those present on the 400W @1.8K helium test facility at CEA-Grenoble) are modeled and assembled to obtain the complete dynamic description of the controllable subsystems of the refrigerator (namely the Joule-Thomson cycle, the Brayton cycle, the liquid nitrogen precooling unit, and the warm compression station). The complete 400W @1.8K (in the 400W @4.4K configuration) helium test facility model is then validated against experimental data, and optimal control of both the Joule-Thomson valve and the turbine valve is proposed to stabilize the plant under highly variable thermal loads. This work is partially supported through the European Fusion Development Agreement (EFDA) Goal Oriented Training Program, task agreement WP10-GOT-GIRO.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bonne, François; Bonnay, Patrick; Alamir, Mazen
2014-01-29
In this paper, a physical method to obtain control-oriented dynamical models of large-scale cryogenic refrigerators is proposed, in order to synthesize model-based advanced control schemes. These schemes aim to replace classical approaches designed from user experience, usually based on many independent PI controllers. This is particularly useful in the case where cryoplants are subjected to large pulsed thermal loads, expected to take place in the cryogenic cooling systems of future fusion reactors such as the International Thermonuclear Experimental Reactor (ITER) or the Japan Torus-60 Super Advanced Fusion Experiment (JT-60SA). Advanced control schemes lead to better perturbation immunity and rejection, offering safer utilization of cryoplants. The paper gives details on how basic components used in the field of large-scale helium refrigeration (especially those present on the 400W @1.8K helium test facility at CEA-Grenoble) are modeled and assembled to obtain the complete dynamic description of the controllable subsystems of the refrigerator (namely the Joule-Thomson cycle, the Brayton cycle, the liquid nitrogen precooling unit, and the warm compression station). The complete 400W @1.8K (in the 400W @4.4K configuration) helium test facility model is then validated against experimental data, and optimal control of both the Joule-Thomson valve and the turbine valve is proposed to stabilize the plant under highly variable thermal loads. This work is partially supported through the European Fusion Development Agreement (EFDA) Goal Oriented Training Program, task agreement WP10-GOT-GIRO.
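The "many independent PI controllers" of classical practice, which the two abstracts above propose to replace with model-based control, amount to loops like the following. The first-order bath model, gains, and setpoint are invented for illustration.

```python
# Minimal discrete PI loop of the classical kind, on a toy cryogenic bath.
def pi_step(error, integral, kp, ki, dt):
    """One discrete PI update; returns (command, updated integral)."""
    integral += error * dt
    return kp * error + ki * integral, integral

def run_pi(setpoint=4.4, t0=6.0, kp=2.0, ki=0.5, dt=0.1, steps=400):
    """Cool an invented first-order helium bath from t0 toward the setpoint."""
    temp, integral = t0, 0.0
    for _ in range(steps):
        u, integral = pi_step(temp - setpoint, integral, kp, ki, dt)
        temp += dt * (-0.5 * (temp - 6.0) - u)   # toy plant: drifts to 6 K, cooled by u
    return temp

final_temp = run_pi()
```

Each such loop sees only its own error signal; a model-based scheme coordinates the subsystems through a shared dynamic model, which is what makes it better at rejecting large pulsed loads.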
Bryce, Richard A
2011-04-01
The ability to accurately predict the interaction of a ligand with its receptor is a key limitation in computer-aided drug design approaches such as virtual screening and de novo design. In this article, we examine current strategies for a physics-based approach to scoring of protein-ligand affinity, as well as outlining recent developments in force fields and quantum chemical techniques. We also consider advances in the development and application of simulation-based free energy methods to study protein-ligand interactions. Fuelled by recent advances in computational algorithms and hardware, there is the opportunity for increased integration of physics-based scoring approaches at earlier stages in computationally guided drug discovery. Specifically, we envisage increased use of implicit solvent models and simulation-based scoring methods as tools for computing the affinities of large virtual ligand libraries. Approaches based on end point simulations and reference potentials allow the application of more advanced potential energy functions to prediction of protein-ligand binding affinities. Comprehensive evaluation of polarizable force fields and quantum mechanical (QM)/molecular mechanical and QM methods in scoring of protein-ligand interactions is required, particularly in their ability to address challenging targets such as metalloproteins and other proteins that make highly polar interactions. Finally, we anticipate increasingly quantitative free energy perturbation and thermodynamic integration methods that are practical for optimization of hits obtained from screened ligand libraries.
Noise in state of the art clocks and their impact for fundamental physics
NASA Technical Reports Server (NTRS)
Maleki, L.
2001-01-01
In this paper a review of the use of advanced atomic clocks in testing fundamental physical laws will be presented. Noise sources of clocks will be discussed, together with an outline of their characterization based on current models. The paper will conclude with a discussion of recent attempts to reduce the fundamental, as well as technical, noise in atomic clocks.
A Physics-Based Engineering Approach to Predict the Cross Section for Advanced SRAMs
NASA Astrophysics Data System (ADS)
Li, Lei; Zhou, Wanting; Liu, Huihua
2012-12-01
This paper presents a physics-based engineering approach to estimate the heavy-ion-induced upset cross section for 6T SRAM cells from layout and technology parameters. The approach calculates the effects of radiation with a junction photocurrent derived from device physics, and handles the problem using simple SPICE simulations. First, the approach uses a standard SPICE program on a typical PC to predict the SPICE-simulated curve of the collected charge vs. its affected distance from the drain-body junction with the derived junction photocurrent. Then, the SPICE-simulated curve is used to calculate the heavy-ion-induced upset cross section with a simple model, which considers that the SEU cross section of a SRAM cell is more related to a “radius of influence” around a heavy ion strike than to the physical size of a diffusion node in the layout for advanced SRAMs in nano-scale process technologies. The calculated upset cross section based on this method is in good agreement with test results for 6T SRAM cells processed using 90 nm process technology.
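The "radius of influence" picture reduces to simple geometry: the cross section is the area within which a strike deposits at least the critical charge. The exponential charge falloff and all numbers below are hypothetical stand-ins for the paper's SPICE-simulated curve.

```python
# Cross section from a radius of influence: sigma = pi * r_infl^2, where
# r_infl is where the collected charge drops to the critical charge.
# Falloff law and parameters are invented; the paper fits this curve in SPICE.
import math

def upset_cross_section(q0, decay_um, q_crit):
    """sigma (um^2) given peak collected charge q0 (fC), exponential falloff
    length (um), and critical charge q_crit (fC), assuming Q(r) = q0*exp(-r/decay)."""
    if q0 <= q_crit:
        return 0.0                                # strike can never flip the cell
    r_infl = decay_um * math.log(q0 / q_crit)     # solve Q(r_infl) = q_crit
    return math.pi * r_infl ** 2

sigma = upset_cross_section(q0=50.0, decay_um=0.3, q_crit=5.0)
```

This also shows why, in nano-scale nodes, the cross section can exceed the drain diffusion area: r_infl depends on charge collection, not on layout geometry.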
Physics-based distributed snow models in the operational arena: Current and future challenges
NASA Astrophysics Data System (ADS)
Winstral, A. H.; Jonas, T.; Schirmer, M.; Helbig, N.
2017-12-01
The demand for modeling tools robust to climate change and weather extremes, along with coincident increases in computational capability, has led to an increase in the use of physics-based snow models in operational applications. Current operational applications include those of the WSL-SLF across Switzerland, the ASO in California, and the USDA-ARS in Idaho. While physics-based approaches offer many advantages, there remain limitations and modeling challenges. The most evident limitation remains computation times, which often limit forecasters to a single, deterministic model run. Other limitations remain less conspicuous amidst the assumption that these models require little to no calibration because of their foundation on physical principles. Yet all energy balance snow models seemingly contain parameterizations or simplifications of processes where validation data are scarce or present understanding is limited. At the research-basin scale where many of these models were developed, these modeling elements may prove adequate. However, when applied over large areas, spatially invariable parameterizations of snow albedo, roughness lengths, and atmospheric exchange coefficients, all vital to determining the snowcover energy balance, become problematic. Moreover, as we apply models over larger grid cells, the representation of sub-grid variability, such as the snow-covered fraction, adds to the challenges. Here, we will demonstrate some of the major sensitivities of distributed energy balance snow models to particular model constructs, the need for advanced and spatially flexible methods and parameterizations, and prompt the community toward open dialogue and future collaborations to further modeling capabilities.
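Sub-grid snow-covered fraction, flagged above as a large-cell challenge, is often handled with a closed-form depletion curve driven by mean depth and sub-grid variability. The tanh form is one widely used parameterization family; the variability scale here is illustrative, not from any of the cited operational systems.

```python
# Illustrative snow-covered fraction (SCF) parameterization for a grid cell.
import math

def snow_covered_fraction(mean_depth_m, variability_m=0.1):
    """Fraction of the cell that is snow covered (0..1): shallow mean depth
    relative to sub-grid variability implies patchy cover."""
    if mean_depth_m <= 0.0:
        return 0.0
    return math.tanh(mean_depth_m / variability_m)

scf = snow_covered_fraction(0.05)   # shallow, patchy snow: well below full cover
```

Making `variability_m` a mapped, spatially varying field rather than a constant is exactly the kind of "spatially flexible parameterization" the abstract argues for.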
Advanced Ground Systems Maintenance Physics Models For Diagnostics Project
NASA Technical Reports Server (NTRS)
Perotti, Jose M.
2015-01-01
The project will use high-fidelity physics models and simulations to simulate real-time operations of cryogenic and other fluid systems and calculate the status/health of the systems. The project enables the delivery of system health advisories to ground system operators. The capability will also be used to conduct planning and analysis of cryogenic system operations. This project will develop and implement high-fidelity physics-based modeling techniques to simulate the real-time operation of cryogenics and other fluids systems and, when compared to the real-time operation of the actual systems, provide assessment of their state. Physics-model-calculated measurements (called “pseudo-sensors”) will be compared to the system real-time data. Comparison results will be utilized to provide systems operators with enhanced monitoring of systems' health and status, identify off-nominal trends, and diagnose system/component failures. This capability can also be used to conduct planning and analysis of cryogenics and other fluid systems designs. It will be interfaced with the ground operations command and control system as a part of the Advanced Ground Systems Maintenance (AGSM) project to help assure system availability and mission success. The initial capability will be developed for the Liquid Oxygen (LO2) ground loading systems.
Physics through the 1990s: Nuclear physics
NASA Technical Reports Server (NTRS)
1986-01-01
The volume begins with a non-mathematical introduction to nuclear physics. A description of the major advances in the field follows, with chapters on nuclear structure and dynamics, fundamental forces in the nucleus, and nuclei under extreme conditions of temperature, density, and spin. Impacts of nuclear physics on astrophysics and the scientific and societal benefits of nuclear physics are then discussed. Another section deals with scientific frontiers, describing research into the realm of the quark-gluon plasma, the changing description of nuclear matter (specifically the use of the quark model), and the implications of the standard model and grand unified theories of elementary-particle physics; it finishes with recommendations and priorities for nuclear physics research facilities, instrumentation, accelerators, theory, education, and databases. Appended are a list of national accelerator facilities, a list of reviewers, a bibliography, and a glossary.
Model-Based Diagnostics for Propellant Loading Systems
NASA Technical Reports Server (NTRS)
Daigle, Matthew John; Foygel, Michael; Smelyanskiy, Vadim N.
2011-01-01
The loading of spacecraft propellants is a complex, risky operation. Therefore, diagnostic solutions are necessary to quickly identify when a fault occurs, so that recovery actions can be taken or an abort procedure can be initiated. Model-based diagnosis solutions, established using an in-depth analysis and understanding of the underlying physical processes, offer the advanced capability to quickly detect and isolate faults, identify their severity, and predict their effects on system performance. We develop a physics-based model of a cryogenic propellant loading system, which describes the complex dynamics of liquid hydrogen filling from a storage tank to an external vehicle tank, as well as the influence of different faults on this process. The model takes into account the main physical processes such as highly nonequilibrium condensation and evaporation of the hydrogen vapor, pressurization, and also the dynamics of liquid hydrogen and vapor flows inside the system in the presence of helium gas. Since the model incorporates multiple faults in the system, it provides a suitable framework for model-based diagnostics and prognostics algorithms. Using this model, we analyze the effects of faults on the system, derive symbolic fault signatures for the purposes of fault isolation, and perform fault identification using a particle filter approach. We demonstrate the detection, isolation, and identification of a number of faults using simulation-based experiments.
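A particle filter of the kind mentioned for fault identification can be sketched as follows, with a hypothetical one-state tank-draining model standing in for the cryogenic loading system. The model, noise levels, and leak parameter are all illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy tank-draining model: state is liquid level h; the fault parameter k is
# an unknown leak coefficient to be identified from noisy level measurements.
def step(h, k, dt=1.0):
    return h - dt * (0.02 * h + k)

# Simulate "truth" with a leak k = 0.05 and generate noisy measurements.
k_true, h = 0.05, 10.0
meas = []
for _ in range(40):
    h = step(h, k_true)
    meas.append(h + rng.normal(0.0, 0.05))

# Particle filter: each particle carries (h, k); weights come from a Gaussian
# measurement likelihood; particles are resampled at every step.
N = 2000
particles_h = np.full(N, 10.0)
particles_k = rng.uniform(0.0, 0.2, N)  # prior over the fault parameter
for z in meas:
    particles_h = step(particles_h, particles_k) + rng.normal(0, 0.01, N)
    w = np.exp(-0.5 * ((z - particles_h) / 0.05) ** 2)
    w /= w.sum()
    idx = rng.choice(N, size=N, p=w)    # systematic-style resampling
    particles_h, particles_k = particles_h[idx], particles_k[idx]
    particles_k += rng.normal(0, 0.002, N)  # jitter to preserve diversity

print("estimated leak coefficient:", particles_k.mean())
```

In the real system the state vector, fault modes, and likelihood would come from the physics-based model described in the abstract; the structure of the filter is the same.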
NASA Astrophysics Data System (ADS)
Akasofu, S.-I.; Kamide, Y.
1998-07-01
A new approach is needed to advance magnetospheric physics in the future, achieving a much closer integration than in the past among satellite-based researchers, ground-based researchers, and theorists/modelers. Specifically, we must find efficient ways to combine two-dimensional ground-based data and single-point satellite-based data to infer three-dimensional aspects of magnetospheric disturbances. For this particular integration purpose, we propose a new project. It is designed to determine the currents on the magnetospheric equatorial plane from the ionospheric current distribution, which has become available by inverting ground-based magnetic data from an extensive, systematic network of observations, combined with ground-based radar measurements of ionospheric parameters and satellite observations of auroras, electric fields, and currents. The inversion method is based on the KRM/AMIE algorithms. In the first part of the paper, we extensively review the reliability and accuracy of the KRM and AMIE algorithms and conclude that the ionospheric quantities thus obtained are accurate enough for the next step. In the second part, the ionospheric current distribution thus obtained is projected onto the equatorial plane. This process requires close cooperation with modelers in determining an accurate configuration of the magnetospheric field lines. If we succeed in this projection, we should be able to study the changing distribution of the currents in a vast region of the magnetospheric equatorial plane for extended periods with a time resolution of about 5 min. This process requires a model of the magnetosphere for the different phases of the magnetospheric substorm. Satellite-based observations are needed to calibrate the projection results. Agreements and disagreements thus obtained will be crucial for theoretical studies of magnetospheric plasma convection and dynamics, particularly in studying substorms. Nothing is easy in these procedures. However, unless we can overcome the associated difficulties, we may not be able to make distinct progress. We believe that the proposed project is one way to draw the three groups closer together in advancing magnetospheric physics in the future. It is important to note that the proposed project has become possible because ground-based space physics has made a major advance during the last decade.
NASA Astrophysics Data System (ADS)
Du, Xiaosong; Leifsson, Leifur; Grandin, Robert; Meeker, William; Roberts, Ronald; Song, Jiming
2018-04-01
Probability of detection (POD) is widely used for measuring the reliability of nondestructive testing (NDT) systems. Typically, POD is determined experimentally, but it can be enhanced by utilizing physics-based computational models in combination with model-assisted POD (MAPOD) methods. With the development of advanced physics-based methods, such as those for ultrasonic NDT, the amount of empirical information needed for POD estimation can be reduced. However, performing accurate numerical simulations can be prohibitively time-consuming, especially as part of a stochastic analysis. In this work, stochastic surrogate models of computational physics-based measurement simulations are developed to reduce the cost of MAPOD methods while ensuring sufficient accuracy. The stochastic surrogate is used to propagate the random input variables through the physics-based simulation model to obtain the joint probability distribution of the output, and the POD curves are then generated from those results. Here, the stochastic surrogates are constructed using non-intrusive polynomial chaos (NIPC) expansions; in particular, the quadrature, ordinary least-squares (OLS), and least-angle regression sparse (LARS) techniques. The proposed approach is demonstrated on the ultrasonic testing simulation of a flat-bottom-hole flaw in an aluminum block. The results show that the stochastic surrogates converge on the statistics at least two orders of magnitude faster than direct Monte Carlo sampling (MCS). Moreover, evaluating the stochastic surrogate models is over three orders of magnitude faster than evaluating the underlying simulation model for this case, the UTSim2 model.
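A minimal sketch of the OLS variant of NIPC follows, with a cheap analytic toy function standing in for the UTSim2 simulation (the function, input distribution, and expansion degree are assumptions for illustration). Output statistics come directly from the expansion coefficients and are checked against Monte Carlo on the toy model.

```python
from math import factorial

import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(1)

# Toy stand-in for an expensive UT simulation: flaw-signal response to a
# standard-normal input parameter (illustrative only).
def model(x):
    return np.exp(0.3 * x) + 0.1 * x ** 2

# OLS NIPC: fit a degree-4 probabilists' Hermite expansion on a small
# training sample of the "expensive" model.
deg = 4
x_train = rng.standard_normal(50)
V = hermevander(x_train, deg)                 # basis matrix He_0 .. He_4
coef, *_ = np.linalg.lstsq(V, model(x_train), rcond=None)

# For He_n with a standard-normal input: mean = c0, var = sum n! * c_n^2.
pce_mean = coef[0]
pce_var = sum(factorial(n) * coef[n] ** 2 for n in range(1, deg + 1))

# Reference: direct Monte Carlo sampling on the (cheap) toy model.
x_mc = rng.standard_normal(200_000)
y_mc = model(x_mc)
print(pce_mean, y_mc.mean())
print(pce_var, y_mc.var())
```

Once fitted, evaluating the polynomial surrogate costs a handful of arithmetic operations per sample, which is the source of the speedups reported in the abstract.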
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tome, Carlos N; Caro, J A; Lebensohn, R A
2010-01-01
Advancing the performance of Light Water Reactors, Advanced Nuclear Fuel Cycles, and Advanced Reactors, such as the Next Generation Nuclear Power Plants, requires enhancing our fundamental understanding of fuel and materials behavior under irradiation. The capability to accurately model the nuclear fuel systems to develop predictive tools is critical. Not only are fabrication and performance models needed to understand specific aspects of the nuclear fuel, fully coupled fuel simulation codes are required to achieve licensing of specific nuclear fuel designs for operation. The backbone of these codes, models, and simulations is a fundamental understanding and predictive capability for simulating the phase and microstructural behavior of the nuclear fuel system materials and matrices. In this paper we review the current status of the advanced modeling and simulation of nuclear reactor cladding, with emphasis on what is available and what is to be developed in each scale of the project, how we propose to pass information from one scale to the next, and what experimental information is required for benchmarking and advancing the modeling at each scale level.
NASA Astrophysics Data System (ADS)
Pulkkinen, A.
2012-12-01
Empirical modeling has been the workhorse of the past decades in predicting the state of the geospace. For example, numerous empirical studies have shown that global geoeffectiveness indices such as Kp and Dst are generally well predictable from the solar wind input. These successes have been facilitated partly by the strongly externally driven nature of the system. Although characterizing the general state of the system is valuable and empirical modeling will continue to play an important role, refined physics-based quantification of the state of the system has been the obvious next step in moving toward more mature science. Importantly, more refined and localized products are also needed for space weather purposes. Predictions of local physical quantities are necessary to make physics-based links to the impacts on specific systems. As we introduce more localized predictions of the geospace state, one central question is how predictable these local quantities are. This complex question can be addressed by rigorously measuring model performance against observed data. The space sciences community has made great advances on this topic over the past few years, and there are ongoing efforts in SHINE, CEDAR, and GEM to carry out community-wide evaluations of the state-of-the-art solar and heliospheric, ionosphere-thermosphere, and geospace models, respectively. These efforts will help establish benchmarks and thus provide means to measure progress in the field, analogous to the monitoring of improvements in lower-atmospheric weather predictions carried out rigorously since the 1980s. In this paper we will discuss some of the latest advancements in predicting local geospace parameters and give an overview of some of the community efforts to rigorously measure model performance. We will also briefly discuss some of the future opportunities for advancing the geospace modeling capability.
These will include further development in data assimilation and ensemble modeling (e.g. taking into account uncertainty in the inflow boundary conditions).
Tokunaga, Jin; Takamura, Norito; Ogata, Kenji; Setoguchi, Nao; Sato, Keizo
2013-01-01
Bedside training for fourth-year students, as well as seminars in hospital pharmacy (vital sign seminars) for fifth-year students at the Department of Pharmacy of Kyushu University of Health and Welfare have been implemented using patient training models and various patient simulators. The introduction of simulation-based pharmaceutical education, where no patients are present, promotes visually, aurally, and tactilely simulated learning regarding the evaluation of vital signs and implementation of physical assessment when disease symptoms are present or adverse effects occur. A patient simulator also promotes the creation of training programs for emergency and critical care, with which basic as well as advanced life support can be practiced. In addition, an advanced objective structured clinical examination (OSCE) trial has been implemented to evaluate skills regarding vital signs and physical assessments. Pharmacists are required to examine vital signs and conduct physical assessment from a pharmaceutical point of view. The introduction of these pharmacy clinical skills will improve the efficacy of drugs, work for the prevention or early detection of adverse effects, and promote the appropriate use of drugs. It is considered that simulation-based pharmaceutical education is essential to understand physical assessment, and such education will ideally be applied and developed according to on-site practices.
Overview of NASA Heliophysics and the Science of Space Weather
NASA Astrophysics Data System (ADS)
Talaat, E. R.
2017-12-01
In this paper, an overview is presented on the various activities within NASA that address space weather-related observations, model development, and research to operations. Specific to space weather, NASA formulates and implements, through the Heliophysics division, a national research program for understanding the Sun and its interactions with the Earth and the Solar System and how these phenomena impact life and society. NASA researches and prototypes new mission and instrument capabilities in this area, providing new physics-based algorithms to advance the state of solar, space physics, and space weather modeling.
Shock-loading response of advanced materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, G.T. III
1993-08-01
Advanced materials, such as composites (metal-, ceramic-, or polymer-matrix), intermetallics, foams (metallic or polymeric-based), laminated materials, and nanostructured materials are receiving increasing attention because their properties can be custom-tailored to specific applications. The high-rate/impact response of advanced materials is relevant to a broad range of service environments, such as the crashworthiness of civilian/military vehicles, foreign-object damage in aerospace, and light-weight armor. Increased utilization of these material classes under dynamic loading conditions requires an understanding of the relationship between high-rate/shock-wave response and microstructure if we are to develop models to predict material behavior. In this paper the issues relevant to defect generation and storage, and the underlying physical basis needed in predictive models for several advanced materials, will be reviewed.
NASA Astrophysics Data System (ADS)
Ulmer, S.; Mooser, A.; Nagahama, H.; Sellner, S.; Smorra, C.
2018-03-01
The BASE collaboration investigates the fundamental properties of protons and antiprotons, such as charge-to-mass ratios and magnetic moments, using advanced cryogenic Penning trap systems. In recent years, we performed the most precise measurement of the magnetic moments of both the proton and the antiproton, and conducted the most precise comparison of the proton-to-antiproton charge-to-mass ratio. In addition, we have set the most stringent constraint on directly measured antiproton lifetime, based on a unique reservoir trap technique. Our matter/antimatter comparison experiments provide stringent tests of the fundamental charge-parity-time invariance, which is one of the fundamental symmetries of the standard model of particle physics. This article reviews the recent achievements of BASE and gives an outlook to our physics programme in the ELENA era. This article is part of the Theo Murphy meeting issue `Antiproton physics in the ELENA era'.
ERIC Educational Resources Information Center
Annesi, James J.; Faigenbaum, Avery D.; Westcott, Wayne L.
2010-01-01
The transtheoretical model (TTM; Prochaska, DiClemente, & Norcross, 1992) suggests that, at any point, an individual is in one of five stages-of-change related to adopting a behavior. People sequentially advance in stage but may also maintain or even regress, based on personal and environmental factors (Nigg, 2005). A classic study published in…
Closing the loop: integrating human impacts on water resources to advanced land surface models
NASA Astrophysics Data System (ADS)
Zaitchik, B. F.; Nie, W.; Rodell, M.; Kumar, S.; Li, B.
2016-12-01
Advanced Land Surface Models (LSMs), including those used in the North American Land Data Assimilation System (NLDAS), offer a physically consistent and spatially and temporally complete analysis of the distributed water balance. These models are constrained both by physically-based process representation and by observations ingested as meteorological forcing or as data assimilation updates. As such, they have become important tools for hydrological monitoring and long-term climate analysis. The representation of water management, however, is extremely limited in these models. Recent advances have brought prognostic irrigation routines into models used in NLDAS, while assimilation of Gravity Recovery and Climate Experiment (GRACE) derived estimates of terrestrial water storage anomaly has made it possible to nudge models towards observed states in water storage below the root zone. But with few exceptions these LSMs do not account for the source of irrigation water, leading to a disconnect between the simulated water balance and the observed human impact on water resources. This inconsistency is unacceptable for long-term studies of climate change and human impact on water resources in North America. Here we define the modeling challenge, review instances of models that have begun to account for water withdrawals (e.g., CLM), and present ongoing efforts to improve representation of human impacts on water storage across models through integration of irrigation routines, water withdrawal information, and GRACE Data Assimilation in NLDAS LSMs.
2012-01-01
…detection and discrimination at live-UXO sites. Namely, under this project we first developed and implemented advanced, physically complete forward EMI models such as the… …Shubitidze of Sky Research and Dartmouth College conceived, implemented, and tested most of the approaches presented in this report. He developed
Pointer, William David; Baglietto, Emilio
2016-05-01
Here, in the effort to reinvigorate innovation in the way we design, build, and operate the nuclear power generating stations of today and tomorrow, nothing can be taken for granted. Not even the seemingly familiar physics of boiling water. The Consortium for the Advanced Simulation of Light Water Reactors, or CASL, is focused on the deployment of advanced modeling and simulation capabilities to enable the nuclear industry to reduce uncertainties in the prediction of multi-physics phenomena and continue to improve the performance of today’s Light Water Reactors and their fuel. An important part of the CASL mission is the development of a next generation thermal hydraulics simulation capability, integrating the history of engineering models based on experimental experience with the computing technology of the future.
Modeling and Controls Development of 48V Mild Hybrid Electric Vehicles
The Advanced Light-Duty Powertrain and Hybrid Analysis tool (ALPHA) was created by EPA to evaluate the Greenhouse Gas (GHG) emissions of Light-Duty (LD) vehicles. It is a physics-based, forward-looking, full vehicle computer simulator capable of analyzing various vehicle types c...
Probabilistic short-term forecasting of eruption rate at Kīlauea Volcano using a physics-based model
NASA Astrophysics Data System (ADS)
Anderson, K. R.
2016-12-01
Deterministic models of volcanic eruptions yield predictions of future activity conditioned on uncertainty in the current state of the system. Physics-based eruption models are well-suited for deterministic forecasting as they can relate magma physics with a wide range of observations. Yet, physics-based eruption forecasting is strongly limited by an inadequate understanding of volcanic systems, and the need for eruption models to be computationally tractable. At Kīlauea Volcano, Hawaii, episodic depressurization-pressurization cycles of the magma system generate correlated, quasi-exponential variations in ground deformation and surface height of the active summit lava lake. Deflations are associated with reductions in eruption rate, or even brief eruptive pauses, and thus partly control lava flow advance rates and associated hazard. Because of the relatively well-understood nature of Kīlauea's shallow magma plumbing system, and because more than 600 of these events have been recorded to date, they offer a unique opportunity to refine a physics-based effusive eruption forecasting approach and apply it to lava eruption rates over short (hours to days) time periods. A simple physical model of the volcano ascribes observed data to temporary reductions in magma supply to an elastic reservoir filled with compressible magma. This model can be used to predict the evolution of an ongoing event, but because the mechanism that triggers events is unknown, event durations are modeled stochastically from previous observations. A Bayesian approach incorporates diverse data sets and prior information to simultaneously estimate uncertain model parameters and future states of the system. Forecasts take the form of probability distributions for eruption rate or cumulative erupted volume at some future time. 
Results demonstrate the significant uncertainties that still remain even for short-term eruption forecasting at a well-monitored volcano - but also the value of a physics-based, mixed deterministic-probabilistic eruption forecasting approach in reducing and quantifying these uncertainties.
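The mixed deterministic-probabilistic flavor of such a forecast can be sketched as follows: the eruption-rate response to a deflation event is treated deterministically, while the event duration is sampled from past behavior. The rates, horizon, and duration distribution below are invented stand-ins, not Kīlauea values.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative forecast: effusion rate drops from Q0 to Q_EVENT during a
# deflation event whose duration is uncertain. All numbers are invented.
Q0 = 2.0            # m^3/s, background effusion rate
Q_EVENT = 0.5       # m^3/s, reduced rate during the deflation event
HORIZON = 24 * 3600.0  # forecast horizon, s

# Sample the remaining event duration (s) from a lognormal fitted to a
# hypothetical catalog of past deflation-inflation events.
durations = rng.lognormal(mean=np.log(8 * 3600.0), sigma=0.5, size=10_000)

# Cumulative erupted volume over the horizon for each sampled duration:
# reduced rate while the event lasts, background rate afterwards.
t_event = np.minimum(durations, HORIZON)
volume = Q_EVENT * t_event + Q0 * (HORIZON - t_event)

lo, med, hi = np.percentile(volume, [5, 50, 95])
print(f"24-h erupted volume (m^3): 5% {lo:.0f}, median {med:.0f}, 95% {hi:.0f}")
```

The output is a probability distribution for cumulative erupted volume, mirroring the form of forecast described in the abstract; a Bayesian treatment would additionally update the reservoir-model parameters from deformation and lava-lake data.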
NASA Astrophysics Data System (ADS)
Kariniotakis, G.; Anemos Team
2003-04-01
Objectives: Accurate forecasting of the wind energy production up to two days ahead is recognized as a major contribution to reliable large-scale wind power integration. Especially in a liberalized electricity market, prediction tools enhance the position of wind energy compared to other forms of dispatchable generation. ANEMOS is a new 3.5-year R&D project, supported by the European Commission, that assembles research organizations and end-users with substantial experience in the domain. The project aims to develop advanced forecasting models that will substantially outperform current methods. Emphasis is given to situations like complex terrain and extreme weather conditions, as well as to offshore prediction, for which no specific tools currently exist. The prediction models will be implemented in a software platform and installed for online operation at onshore and offshore wind farms by the end-users participating in the project. Approach: The paper presents the methodology of the project. Initially, the prediction requirements are identified according to the profiles of the end-users. The project develops prediction models based on both a physical and an alternative statistical approach. Research on physical models emphasizes techniques for use in complex terrain and the development of prediction tools based on CFD techniques, advanced model output statistics, or high-resolution meteorological information. Statistical models (i.e., based on artificial intelligence) are developed for downscaling, power curve representation, upscaling for prediction at regional or national level, etc. A benchmarking process is set up to evaluate the performance of the developed models and to compare them with existing ones using a number of case studies. The synergy between statistical and physical approaches is examined to identify promising areas for further improvement of forecasting accuracy.
Appropriate physical and statistical prediction models are also developed for offshore wind farms, taking into account advances in marine meteorology (interaction between wind and waves, coastal effects). The benefits of using satellite radar images for modeling local weather patterns are investigated. A next-generation forecasting software platform, ANEMOS, will be developed to integrate the various models. The tool is enhanced by advanced Information and Communication Technology (ICT) functionality and can operate in stand-alone or remote mode, or be interfaced with standard Energy or Distribution Management Systems (EMS/DMS). Contribution: The project provides an advanced technology for wind resource forecasting applicable on a large scale: at a single wind farm, at regional or national level, and for both interconnected and island systems. A major milestone is the on-line operation of the developed software by the participating utilities for onshore and offshore wind farms and the demonstration of the economic benefits. The outcome of the ANEMOS project will help increase wind integration on two levels: at the operational level, through better management of wind farms, and by contributing to an increase in the installed capacity of wind farms, because accurate prediction of the resource reduces the risk to wind farm developers, who are then more willing to undertake new wind farm installations, especially in a liberalized electricity market environment.
Self-Brown, Shannon; Lai, Betty; Patterson, Alexandria; Glasheen, Theresa
2017-08-01
This paper reviews youth outcomes following exposure to natural disaster, with a focus on three relatively understudied outcomes: externalizing behavior problems, physical health, and posttraumatic growth. Recent, high-impact studies focusing on each outcome are summarized. Studies highlighted in this review utilize innovative and comprehensive approaches to improve our current understanding of youth broad-based physical and mental health outcomes beyond PTSD. The review concludes with recommendations to advance the field of youth disaster research by exploring how disasters may impact children across multiple domains, as well as using cutting edge ecobiological approaches and advanced modeling strategies to better understand how youth adjust and thrive following natural disaster.
Space-weather assets developed by the French space-physics community
NASA Astrophysics Data System (ADS)
Rouillard, A. P.; Pinto, R. F.; Brun, A. S.; Briand, C.; Bourdarie, S.; Dudok De Wit, T.; Amari, T.; Blelly, P.-L.; Buchlin, E.; Chambodut, A.; Claret, A.; Corbard, T.; Génot, V.; Guennou, C.; Klein, K. L.; Koechlin, L.; Lavarra, M.; Lavraud, B.; Leblanc, F.; Lemorton, J.; Lilensten, J.; Lopez-Ariste, A.; Marchaudon, A.; Masson, S.; Pariat, E.; Reville, V.; Turc, L.; Vilmer, N.; Zucarello, F. P.
2016-12-01
We present a short review of space-weather tools and services developed and maintained by the French space-physics community. They include unique data from ground-based observatories, advanced numerical models, automated identification and tracking tools, a range of space instrumentation and interconnected virtual observatories. The aim of the article is to highlight some advances achieved in this field of research at the national level over the last decade and how certain assets could be combined to produce better space-weather tools exploitable by space-weather centres and customers worldwide. This review illustrates the wide range of expertise developed nationally but is not a systematic review of all assets developed in France.
Modeling and Validation of Lithium-ion Automotive Battery Packs (SAE 2013-01-1539)
The Advanced Light-Duty Powertrain and Hybrid Analysis (ALPHA) tool was created by EPA to evaluate the Greenhouse Gas (GHG) emissions of Light-Duty (LD) vehicles. It is a physics-based, forward-looking, full vehicle computer simulator capable of analyzing various vehicle types c...
USDA-ARS?s Scientific Manuscript database
Significant advancements in photogrammetric Structure-from-Motion (SfM) software, coupled with improvements in the quality and resolution of smartphone cameras, has made it possible to create ultra-fine resolution three-dimensional models of physical objects using an ordinary smartphone. Here we pre...
Gain selection method and model for coupled propulsion and airframe systems
NASA Technical Reports Server (NTRS)
Murphy, P. C.
1982-01-01
A longitudinal model is formulated for an advanced fighter from three subsystem models: the inlet, the engine, and the airframe. Notable interaction is found in the coupled system. A procedure, based on eigenvalue sensitivities, is presented which indicates the importance of the feedback gains to the optimal solution. This allows ineffectual gains to be eliminated; thus, hardware and expense may be saved in the realization of the physical controller.
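The eigenvalue-sensitivity idea can be sketched with the standard left/right-eigenvector formula d(lambda_i)/dk = w_i (dA/dk) x_i, applied to a hypothetical two-state closed-loop system (the matrices and gain value below are illustrative, not the fighter model from the paper).

```python
import numpy as np

# Hypothetical closed-loop system A_cl(k) = A - k * B @ C; the question is
# whether the scalar feedback gain k meaningfully moves the eigenvalues.
A = np.array([[0.0, 1.0], [-2.0, -0.5]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])

def eig_sensitivities(k):
    """Eigenvalues of A_cl and their sensitivities d(lambda_i)/dk, via
    d(lambda_i)/dk = w_i (dA_cl/dk) x_i, where x_i are the right eigenvectors
    (columns of X) and w_i the left eigenvectors (rows of inv(X), so that
    w_i x_i = 1 and no extra normalization is needed)."""
    A_cl = A - k * (B @ C)
    dA = -(B @ C)                       # dA_cl/dk
    lam, X = np.linalg.eig(A_cl)
    W = np.linalg.inv(X)
    sens = np.array([W[i] @ dA @ X[:, i] for i in range(len(lam))])
    return lam, sens

lam, dlam = eig_sensitivities(k=0.5)

# Finite-difference check (match eigenvalues by proximity, since np.linalg.eig
# does not guarantee a fixed ordering).
eps = 1e-6
lam2, _ = eig_sensitivities(k=0.5 + eps)
fd = (lam2[np.argmin(np.abs(lam2 - lam[0]))] - lam[0]) / eps
print(dlam[0], fd)
```

Gains whose eigenvalue sensitivities are uniformly small barely move the closed-loop poles and are candidates for elimination, which is the spirit of the procedure described in the abstract.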
Understanding Cooperative Chirality at the Nanoscale
NASA Astrophysics Data System (ADS)
Yu, Shangjie; Wang, Pengpeng; Govorov, Alexander; Ouyang, Min
Controlling the chirality of organic and inorganic structures plays a key role in many physical, chemical, and biochemical processes, and may offer new opportunities to create technology applications based on chiroptical effects. In this talk, we will present a theoretical model and simulations demonstrating how to engineer nanoscale chirality in inorganic nanostructures via synergistic control of the electromagnetic response of both lattice and geometry, leading to rich tunability of chirality at the nanoscale. Our model has also been applied, with excellent agreement, to interpret recent advances in materials-based chirality control, and can elucidate the physical origins of circular dichroism features observed in experiment.
NASA Astrophysics Data System (ADS)
Huzil, J. Torin; Sivaloganathan, Siv; Kohandel, Mohammad; Foldvari, Marianna
2011-11-01
The advancement of dermal and transdermal drug delivery requires the development of delivery systems that are suitable for large protein and nucleic acid-based therapeutic agents. However, a complete mechanistic understanding of the physical barrier properties associated with the epidermis, specifically the membrane structures within the stratum corneum, has yet to be developed. Here, we describe the assembly and computational modeling of stratum corneum lipid bilayers constructed from varying ratios of their constituent lipids (ceramide, free fatty acids and cholesterol) to determine if there is a difference in the physical properties of stratum corneum compositions.
Physical Constraints on Seismic Waves from Chemical and Nuclear Explosions
1992-04-22
Air Force Systems Command, Hanscom Air Force Base, Massachusetts 01731-5000. Sponsored by the Defense Advanced Research Projects Agency. ... in good agreement with seismic yield estimates [Sykes and Ekstrom, 1989] ... improve the detection capabilities of new systems ... physical model for spall; determination of energy balance in nuclear seismology. Many questions still remain, particularly those associated with the ...
Future Directions in Medical Physics: Models, Technology, and Translation to Medicine
NASA Astrophysics Data System (ADS)
Siewerdsen, Jeffrey
The application of physics in medicine has been integral to major advances in diagnostic and therapeutic medicine. Two primary areas represent the mainstay of medical physics research in the last century: in radiation therapy, physicists have propelled advances in conformal radiation treatment and high-precision image guidance; and in diagnostic imaging, physicists have advanced an arsenal of multi-modality imaging that includes CT, MRI, ultrasound, and PET as indispensable tools for noninvasive screening, diagnosis, and assessment of treatment response. In addition to their role in building such technologically rich fields of medicine, physicists have also become integral to daily clinical practice in these areas. The future suggests new opportunities for multi-disciplinary research bridging physics, biology, engineering, and computer science, and collaboration in medical physics carries a strong capacity for identification of significant clinical needs, access to clinical data, and translation of technologies to clinical studies. In radiation therapy, for example, the extraction of knowledge from large datasets on treatment delivery, image-based phenotypes, genomic profile, and treatment outcome will require innovation in computational modeling and connection with medical physics for the curation of large datasets. Similarly in imaging physics, the demand for new imaging technology capable of measuring physical and biological processes over orders of magnitude in scale (from molecules to whole organ systems) and exploiting new contrast mechanisms for greater sensitivity to molecular agents and subtle functional / morphological change will benefit from multi-disciplinary collaboration in physics, biology, and engineering.
Also in surgery and interventional radiology, where needs for increased precision and patient safety meet constraints in cost and workflow, development of new technologies for imaging, image registration, and robotic assistance can leverage collaboration in physics, biomedical engineering, and computer science. In each area, there is major opportunity for multi-disciplinary collaboration with medical physics to accelerate the translation of such technologies to clinical use. Research supported by the National Institutes of Health, Siemens Healthcare, and Carestream Health.
Recent Advances on INSAR Temporal Decorrelation: Theory and Observations Using UAVSAR
NASA Technical Reports Server (NTRS)
Lavalle, M.; Hensley, S.; Simard, M.
2011-01-01
We review our recent advances in understanding the role of temporal decorrelation in SAR interferometry and polarimetric SAR interferometry. We developed a physical model of temporal decorrelation based on Gaussian-statistic motion that varies along the vertical direction in forest canopies. Temporal decorrelation depends on structural parameters such as forest height, is sensitive to polarization, and affects both coherence amplitude and phase. A model of temporal-volume decorrelation valid for arbitrary spatial baselines is discussed. We tested the inversion of this model to estimate forest height from model simulations supported by JPL/UAVSAR data and LVIS lidar data. We found generally good agreement between forest heights estimated from radar data and those estimated from lidar data.
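The single-layer limit of Gaussian-motion temporal decorrelation has a simple closed form (the classic Zebker-Villasenor expression); the full model above additionally lets the motion standard deviation vary with height and integrates over the canopy profile. A minimal sketch, with an L-band wavelength comparable to UAVSAR assumed for illustration:

```python
import numpy as np

def temporal_coherence(sigma_r, wavelength):
    """Temporal coherence for Gaussian random scatterer motion with
    line-of-sight displacement standard deviation sigma_r (metres).
    Single-layer form only; the height-dependent model integrates
    this over the vertical structure of the canopy."""
    return np.exp(-0.5 * (4.0 * np.pi / wavelength) ** 2 * sigma_r ** 2)

# L-band (~0.24 m): centimetre-scale canopy motion already costs a
# large fraction of the interferometric coherence.
wl = 0.24
gammas = [float(temporal_coherence(s, wl)) for s in (0.0, 0.01, 0.02)]
print([round(g, 3) for g in gammas])  # [1.0, 0.872, 0.578]
```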
Model-Based Battery Management Systems: From Theory to Practice
NASA Astrophysics Data System (ADS)
Pathak, Manan
Lithium-ion batteries are now used extensively as a primary storage source. Capacity and power fade and slow recharging times are key issues that restrict their use in many applications. Battery management systems are critical to addressing these issues, along with ensuring safety. This dissertation focuses on exploring various control strategies, using detailed physics-based electrochemical models developed previously for lithium-ion batteries, that could be used in advanced battery management systems. Optimal charging profiles for minimizing capacity fade based on SEI-layer formation are derived, and the benefits of using such control strategies are shown by experimentally testing them on a 16 Ah NMC-based pouch cell. This dissertation also explores different time-discretization strategies for non-linear models, which give an improved order of convergence for optimal control problems. Lastly, this dissertation explores a physics-based model for predicting the linear impedance of a battery, and develops a freeware tool that is extremely robust and computationally fast. Such a code could be used for estimating transport, kinetic, and material properties of the battery from the linear impedance spectra.
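For orientation, linear battery impedance is often summarized with an equivalent circuit; the Randles circuit below is a generic illustrative stand-in (with made-up parameter values), not the dissertation's physics-based impedance model.

```python
import numpy as np

def randles_impedance(freq, Rs=0.02, Rct=0.05, Cdl=1.0, sigma_w=0.01):
    """Randles equivalent circuit: series resistance Rs, plus a
    double-layer capacitance Cdl in parallel with charge-transfer
    resistance Rct and a Warburg diffusion element."""
    w = 2.0 * np.pi * np.asarray(freq, dtype=float)
    Zw = sigma_w * (1.0 - 1j) / np.sqrt(w)       # Warburg diffusion
    Zb = Rct + Zw                                 # faradaic branch
    return Rs + Zb / (1.0 + 1j * w * Cdl * Zb)

freqs = np.logspace(-2, 3, 6)                     # 10 mHz .. 1 kHz
Z = randles_impedance(freqs)
print(Z.shape, bool(np.all(Z.real > 0)))
```

A physics-based model replaces these lumped elements with transport and kinetic equations, which is what allows fitted spectra to be interpreted as physical properties.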
Sebire, Simon J; Jago, Russell; Fox, Kenneth R; Edwards, Mark J; Thompson, Janice L
2013-09-26
Understanding children's physical activity motivation, its antecedents and associations with behavior is important and can be advanced by using self-determination theory. However, research among youth is largely restricted to adolescents and studies of motivation within certain contexts (e.g., physical education). There are no measures of self-determination theory constructs (physical activity motivation or psychological need satisfaction) for use among children and no previous studies have tested a self-determination theory-based model of children's physical activity motivation. The purpose of this study was to test the reliability and validity of scores derived from scales adapted to measure self-determination theory constructs among children and test a motivational model predicting accelerometer-derived physical activity. Cross-sectional data from 462 children aged 7 to 11 years from 20 primary schools in Bristol, UK were analysed. Confirmatory factor analysis was used to examine the construct validity of adapted behavioral regulation and psychological need satisfaction scales. Structural equation modelling was used to test cross-sectional associations between psychological need satisfaction, motivation types and physical activity assessed by accelerometer. The construct validity and reliability of the motivation and psychological need satisfaction measures were supported. Structural equation modelling provided evidence for a motivational model in which psychological need satisfaction was positively associated with intrinsic and identified motivation types and intrinsic motivation was positively associated with children's minutes in moderate-to-vigorous physical activity. The study provides evidence for the psychometric properties of measures of motivation aligned with self-determination theory among children. 
Children's motivation that is based on enjoyment and inherent satisfaction of physical activity is associated with their objectively-assessed physical activity and such motivation is positively associated with perceptions of psychological need satisfaction. These psychological factors represent potential malleable targets for interventions to increase children's physical activity.
NASA Astrophysics Data System (ADS)
Kirchner, James W.
2006-03-01
The science of hydrology is on the threshold of major advances, driven by new hydrologic measurements, new methods for analyzing hydrologic data, and new approaches to modeling hydrologic systems. Here I suggest several promising directions forward, including (1) designing new data networks, field observations, and field experiments, with explicit recognition of the spatial and temporal heterogeneity of hydrologic processes, (2) replacing linear, additive "black box" models with "gray box" approaches that better capture the nonlinear and non-additive character of hydrologic systems, (3) developing physically based governing equations for hydrologic behavior at the catchment or hillslope scale, recognizing that they may look different from the equations that describe the small-scale physics, (4) developing models that are minimally parameterized and therefore stand some chance of failing the tests that they are subjected to, and (5) developing ways to test models more comprehensively and incisively. I argue that scientific progress will mostly be achieved through the collision of theory and data, rather than through increasingly elaborate and parameter-rich models that may succeed as mathematical marionettes, dancing to match the calibration data even if their underlying premises are unrealistic. Thus advancing the science of hydrology will require not only developing theories that get the right answers but also testing whether they get the right answers for the right reasons.
Collignon, Bertrand; Séguret, Axel; Halloy, José
2016-01-01
Collective motion is one of the most ubiquitous behaviours displayed by social organisms and has led to the development of numerous models. Recent advances in the understanding of sensory systems and information processing by animals impel one to revise classical assumptions made in decisional algorithms. In this context, we present a model describing the three-dimensional visual sensory system of fish that adjust their trajectory according to their perception field. Furthermore, we introduce a stochastic process based on a probability distribution function to move in targeted directions rather than on a summation of influential vectors as is classically assumed by most models. In parallel, we present experimental results of zebrafish (alone or in groups of 10) swimming in both homogeneous and heterogeneous environments. We use these experimental data to set the parameter values of our model and show that this perception-based approach can simulate the collective motion of species showing cohesive behaviour in heterogeneous environments. Finally, we discuss the advances of this multilayer model and its possible outcomes in biological, physical and robotic sciences. PMID:26909173
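The key modeling choice — sampling a heading from a probability distribution over directions instead of summing influence vectors — can be sketched as follows. The product of von-Mises-like terms and the kappa values are illustrative assumptions, not the paper's fitted perception model.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_heading(current, attractors, kappa_self=4.0, kappa_att=2.0, n=360):
    """Draw the next swimming direction from a PDF over headings:
    one concentration term for persistence (current heading) and one
    per perceived stimulus, multiplied and normalised on a grid."""
    theta = np.linspace(-np.pi, np.pi, n, endpoint=False)
    log_pdf = kappa_self * np.cos(theta - current)
    for a in attractors:
        log_pdf += kappa_att * np.cos(theta - a)
    pdf = np.exp(log_pdf - log_pdf.max())
    pdf /= pdf.sum()
    return rng.choice(theta, p=pdf)

# One stimulus at pi/2: sampled headings concentrate between the
# fish's current direction (0) and the stimulus.
headings = np.array([sample_heading(0.0, [np.pi / 2]) for _ in range(500)])
mean_dir = np.angle(np.exp(1j * headings).mean())
print(0.0 < mean_dir < np.pi / 2)  # True
```

Unlike vector summation, the stochastic draw preserves multimodality: two opposing stimuli yield two direction lobes rather than cancelling to zero.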
Potential Collaborative Research topics with Korea’s Agency for Defense Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farrar, Charles R.; Todd, Michael D.
2012-08-23
This presentation provides a high-level summary of current research activities at the Los Alamos National Laboratory (LANL)-University of California San Diego (UCSD) Jacobs School of Engineering, Engineering Institute, that will be presented at Korea's Agency for Defense Development (ADD). These research activities are at the basic engineering science level, with levels of maturity ranging from initial concepts to field proof-of-concept demonstrations. We believe that all of these activities are appropriate for collaborative research with ADD, subject to approval by each institution. All the activities summarized herein share the common theme that they are multi-disciplinary in nature and typically involve the integration of high-fidelity predictive modeling, advanced sensing technologies and new developments in information technology. These activities include: Wireless Sensor Systems, Swarming Robot sensor systems, Advanced signal processing (compressed sensing) and pattern recognition, Model Verification and Validation, Optimal/robust sensor system design, Haptic systems for large-scale data processing, Cyber-physical security for robots, Multi-source energy harvesting, Reliability-based approaches to damage prognosis, SHMTools software development, and Cyber-physical systems advanced study institute.
NASA Astrophysics Data System (ADS)
Sadi, Toufik; Mehonic, Adnan; Montesi, Luca; Buckwell, Mark; Kenyon, Anthony; Asenov, Asen
2018-02-01
We employ an advanced three-dimensional (3D) electro-thermal simulator to explore the physics and potential of oxide-based resistive random-access memory (RRAM) cells. The physical simulation model has been developed recently, and couples a kinetic Monte Carlo study of electron and ionic transport to the self-heating phenomenon while accounting carefully for the physics of vacancy generation and recombination, and trapping mechanisms. The simulation framework successfully captures resistance switching, including the electroforming, set and reset processes, by modeling the dynamics of conductive filaments in the 3D space. This work focuses on the promising yet less studied RRAM structures based on silicon-rich silica (SiOx). We explain the intrinsic nature of resistance switching of the SiOx layer, analyze the effect of self-heating on device performance, highlight the role of the initial vacancy distributions acting as precursors for switching, and also stress the importance of using 3D physics-based models to capture the switching processes accurately. The simulation work is backed by experimental studies. The simulator is useful for improving our understanding of the little-known physics of SiOx resistive memory devices, as well as other oxide-based RRAM systems (e.g. transition metal oxide RRAMs), offering design and optimization capabilities with regard to the reliability and variability of memory cells.
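The kinetic Monte Carlo core of such a simulator follows a standard pattern: pick an event with probability proportional to its rate, then advance time by an exponential waiting time. A minimal sketch with toy event rates (illustrative only, not the paper's vacancy physics):

```python
import numpy as np

rng = np.random.default_rng(1)

def kmc_step(rates):
    """One kinetic Monte Carlo step (total-rate method): choose an
    event proportionally to its rate, draw an exponential dwell time."""
    total = rates.sum()
    event = rng.choice(len(rates), p=rates / total)
    dt = rng.exponential(1.0 / total)
    return event, dt

# Toy filament bookkeeping (hypothetical rates): 0 = vacancy
# generation, 1 = recombination, 2 = electron trapping.
rates = np.array([5.0, 1.0, 0.5])
n_vac, t = 0, 0.0
for _ in range(1000):
    ev, dt = kmc_step(rates)
    t += dt
    if ev == 0:
        n_vac += 1
    elif ev == 1 and n_vac > 0:
        n_vac -= 1
print(n_vac > 0, t > 0)
```

In the full simulator the rates themselves depend on the local field and temperature, which is what couples the KMC loop to the self-heating solution.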
High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics
Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis
2014-07-28
The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.
NASA Astrophysics Data System (ADS)
Jin, Yongmei
In recent years, theoretical modeling and computational simulation of microstructure evolution and materials properties have been attracting much attention. While significant advances have been made, two major challenges remain. One is the integration of multiple physical phenomena for simulation of complex materials behavior; the other is the bridging over multiple length and time scales in materials modeling and simulation. The research presented in this Thesis is focused mainly on tackling the first major challenge. In this Thesis, a unified Phase Field Microelasticity (PFM) approach is developed. This approach is an advanced version of the phase field method that takes into account the exact elasticity of arbitrarily anisotropic, elastically and structurally inhomogeneous systems. The proposed theory and models are applicable to infinite solids, elastic half-space, and finite bodies with arbitrary-shaped free surfaces, which may undergo various concomitant physical processes. The Phase Field Microelasticity approach is employed to formulate the theories and models of martensitic transformation, dislocation dynamics, and crack evolution in single crystal and polycrystalline solids. It is also used to study strain relaxation in heteroepitaxial thin films through misfit dislocation and surface roughening. Magnetic domain evolution in nanocrystalline thin films is also investigated. Numerous simulation studies are performed. Comparisons with analytical predictions and experimental observations are presented. Their agreement verifies the theory and models as realistic simulation tools for computational materials science and engineering. The same Phase Field Microelasticity formalism of individual models of different physical phenomena makes it easy to integrate multiple physical processes into one unified simulation model, where multiple phenomena are treated as various relaxation modes that together act as one common cooperative phenomenon.
The model does not impose a priori constraints on possible microstructure evolution paths. This gives the model predictive power: the material system itself "chooses" the optimal path for multiple processes. The advances made in this Thesis present a significant step forward in overcoming the first challenge, mesoscale multi-physics modeling and simulation of materials. At the end of this Thesis, the way to tackle the second challenge, bridging over multiple length and time scales in materials modeling and simulation, is discussed based on the connection between mesoscale Phase Field Microelasticity modeling and microscopic atomistic calculation as well as macroscopic continuum theory.
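The relaxation-mode picture underlying phase field methods can be illustrated with the simplest case, 1-D Allen-Cahn dynamics driving an order parameter toward the minima of a double-well free energy. This generic sketch omits the exact elasticity that distinguishes Phase Field Microelasticity; all parameters are illustrative.

```python
import numpy as np

# 1-D Allen-Cahn relaxation: d(phi)/dt = -L * dF/d(phi) with
# F = integral of (phi^2-1)^2/4 + (kappa/2)|grad phi|^2.
N, dx, dt, kappa, L = 128, 1.0, 0.1, 1.0, 1.0
x = np.arange(N) * dx
phi = np.where(x < N * dx / 2, -1.0, 1.0) + 0.01 * np.sin(2 * np.pi * x / (N * dx))

for _ in range(200):
    lap = (np.roll(phi, 1) - 2 * phi + np.roll(phi, -1)) / dx ** 2
    dF = phi ** 3 - phi - kappa * lap      # variational derivative
    phi = phi - dt * L * dF                # explicit Euler step

# The field relaxes toward the wells at +/-1 with diffuse interfaces.
print(phi.shape, bool(np.all(np.abs(phi) < 1.2)))
```

Coupling several such evolution equations through a shared free energy is what turns independent relaxation modes into one cooperative multi-physics simulation.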
Modeling and Validation of Power-split and P2 Parallel Hybrid Electric Vehicles (SAE 2013-01-1470)
The Advanced Light-Duty Powertrain and Hybrid Analysis tool was created by EPA to evaluate the Greenhouse Gas (GHG) emissions of Light-Duty (LD) vehicles. It is a physics-based, forward-looking, full vehicle computer simulator capable of analyzing various vehicle types combined ...
Center for Advanced Power and Energy Research (CAPEC)
2015-01-01
... discharge (DCD). A glow discharge at low ambient density becomes a corona discharge at elevated ambient pressure. The thermal plasma actuator ... Eliasson and Kogelschatz [9] identified that the discharge consists of two distinct modes: positive corona streamers and diffusion. Enloe et al. ...
atlant: Advanced Three Level Approximation for Numerical Treatment of Cosmological Recombination
NASA Astrophysics Data System (ADS)
Kholupenko, E. E.; Ivanchik, A. V.; Balashev, S. A.; Varshalovich, D. A.
2011-10-01
atlant, a public numerical code for fast calculations of the cosmological recombination of primordial hydrogen-helium plasma, is presented. This code is based on the three-level approximation (TLA) model of recombination and allows us to take into account some "fine" physical effects of cosmological recombination simultaneously with using fudge factors.
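For context, the equilibrium (Saha) ionisation fraction is the high-redshift limit from which TLA codes like atlant track the departure as recombination falls out of equilibrium. A minimal pure-hydrogen sketch with illustrative temperature and density values:

```python
import numpy as np

kB = 1.380649e-23       # J/K
me = 9.1093837015e-31   # kg
h = 6.62607015e-34      # J s
E_H = 13.598 * 1.602176634e-19  # hydrogen ionisation energy, J

def saha_xe(T, n_H):
    """Solve the Saha relation x_e^2 / (1 - x_e) = S for the free
    electron fraction, taking the physical root in [0, 1]."""
    S = (2.0 * np.pi * me * kB * T / h ** 2) ** 1.5 \
        * np.exp(-E_H / (kB * T)) / n_H
    # quadratic x^2 + S x - S = 0
    return 0.5 * (-S + np.sqrt(S * S + 4.0 * S))

# Hydrogen stays essentially fully ionised at 10^4 K and low density.
print(saha_xe(1.0e4, 1.0e8) > 0.99)  # True
```

TLA codes replace this equilibrium relation with a rate equation for the effective three-level atom, which is where the "fine" non-equilibrium effects and fudge factors enter.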
The Advanced Light-Duty Powertrain and Hybrid Analysis (ALPHA) modeling tool was created by EPA to estimate greenhouse gas (GHG) emissions of light-duty vehicles. ALPHA is a physics-based, forward-looking, full vehicle computer simulation capable of analyzing various vehicle type...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turinsky, Paul J., E-mail: turinsky@ncsu.edu; Kothe, Douglas B., E-mail: kothe@ornl.gov
The Consortium for the Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, all while assuring nuclear safety. To accomplish this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which equate with long-standing issues of the nuclear power industry that M&S can assist in addressing. To date CASL has developed a multi-physics "core simulator" based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso and macro. CFD R&D has focused on improvement in closure models for subcooled boiling and bubbly flow, and the formulation of robust numerical solution algorithms. For multiphysics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling.
Industry adoption of CASL's evolving M&S capabilities, which is in progress, will assist in addressing long-standing and future operational and safety challenges of the nuclear industry. - Highlights: • Complexity of physics-based modeling of light water reactor cores being addressed. • Capability developed to help address problems that have challenged the nuclear power industry. • Simulation capabilities that take advantage of high performance computing developed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samei, E; Nelson, J; Hangiandreou, N
Medical Physics 2.0 is a bold vision for an existential transition of clinical imaging physics in the face of the new realities of value-based and evidence-based medicine, comparative effectiveness, and meaningful use. It speaks to how clinical imaging physics can expand beyond traditional insular models of inspection and acceptance testing, oriented toward compliance, towards team-based models of operational engagement, prospective definition and assurance of effective use, and retrospective evaluation of clinical performance. Organized into four sessions of the AAPM, this particular session focuses on three specific modalities as outlined below. CT 2.0: CT has been undergoing a dramatic transition in the last few decades. While the changes in the technology merit discussions of their own, an important question is how clinical medical physicists are expected to engage effectively with the new realities of CT technology and practice. Consistent with the upcoming paradigm of Medical Physics 2.0, this CT presentation aims to provide definitions and demonstrations of the components of the new clinical medical physics practice pertaining to CT. The topics covered include physics metrics and analytics that aim to provide higher-order, clinically relevant quantification of system performance as pertains to new (and not so new) technologies. These include the new radiation and dose metrics (SSDE, organ dose, risk indices), image quality metrology (MTF/NPS/d'), task-based phantoms, and the effect of patient size. This is followed by a discussion of the testing implications of new CT hardware (detectors, tubes), acquisition methods (innovative helical geometries, AEC, wide beam CT, dual energy, inverse geometry, application specialties), and image processing and analysis (iterative reconstructions, quantitative CT, advanced renditions).
The presentation will conclude with a discussion of clinical and operational aspects of Medical Physics 2.0 including training and communication, use optimization (dose and technique factors), automated analysis and data management (automated QC methods, protocol tracking, dose monitoring, issue tracking), and meaningful QC considerations. US 2.0: Ultrasound imaging is evolving at a rapid pace, adding new imaging functions and modes that continue to enhance its clinical utility and benefits to patients. The ultrasound talk will look ahead 10–15 years and consider how medical physicists can bring maximal value to the clinical ultrasound practices of the future. The roles of physics in accreditation and regulatory compliance, image quality and exam optimization, clinical innovation, and education of staff and trainees will all be considered. A detailed examination of expected technology evolution and impact on image quality metrics will be presented. Clinical implementation of comprehensive physics services will also be discussed. Nuclear Medicine 2.0: Although the basic science of nuclear imaging has remained relatively unchanged since its inception, advances in instrumentation continue to advance the field into new territories. With a great number of these advances occurring over the past decade, the role and testing strategies of clinical nuclear medicine physicists must evolve in parallel. The Nuclear Medicine 2.0 presentation is designed to highlight some of the recent advances from a clinical medical physicist perspective and provide ideas and motivation for designing better evaluation strategies. Topics include improvement of traditional physics metrics and analytics, testing implications of hybrid imaging and advanced detector technologies, and strategies for effective implementation into the clinic. Learning Objectives: Become familiar with new physics metrics and analytics in nuclear medicine, CT, and ultrasound. 
Become familiar with the major new developments of clinical physics support. Understand the physics testing implications of new technologies, hardware, software, and applications. Identify approaches for implementing comprehensive medical physics services in future imaging practices.
A Bayesian network approach for modeling local failure in lung cancer
NASA Astrophysics Data System (ADS)
Oh, Jung Hun; Craft, Jeffrey; Lozi, Rawan Al; Vaidya, Manushka; Meng, Yifan; Deasy, Joseph O.; Bradley, Jeffrey D.; El Naqa, Issam
2011-03-01
Locally advanced non-small cell lung cancer (NSCLC) patients suffer a high local failure rate following radiotherapy. Despite many efforts to develop new dose-volume models for early detection of tumor local failure, no significant improvement has been reported in their prospective application. Based on recent studies of the role of biomarker proteins in hypoxia and inflammation in predicting tumor response to radiotherapy, we hypothesize that combining physical and biological factors within a suitable framework could improve the overall prediction. To test this hypothesis, we propose a graphical Bayesian network framework for predicting local failure in lung cancer. The proposed approach was tested using two different datasets of locally advanced NSCLC patients treated with radiotherapy. The first dataset was collected retrospectively, and comprises clinical and dosimetric variables only. The second dataset was collected prospectively; in addition to clinical and dosimetric information, blood was drawn from the patients at various time points to extract candidate biomarkers as well. Our preliminary results show that the proposed method can be used as an efficient way to develop predictive models of local failure in these patients and to interpret relationships among the different variables in the models. We also demonstrate the potential use of heterogeneous physical and biological variables to improve the model prediction. With the first dataset, we achieved better performance compared with competing Bayesian-based classifiers. With the second dataset, the combined model had a slightly higher performance compared to individual physical and biological models, with the biological variables making the largest contribution. Our preliminary results highlight the potential of the proposed integrated approach for predicting post-radiotherapy local failure in NSCLC patients.
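The mechanics of a discrete Bayesian network — conditional probability tables queried by enumeration — can be shown in a tiny sketch. The nodes and all probabilities below are hypothetical, chosen only to illustrate how physical (dose) and biological (biomarker) parents combine; they are not the paper's fitted model.

```python
# Hypothetical two-parent network: dose level and a hypoxia biomarker
# jointly parent a binary local-failure node.
P_dose = {0: 0.5, 1: 0.5}      # 0 = low, 1 = high prescribed dose
P_marker = {0: 0.7, 1: 0.3}    # 0 = normal, 1 = elevated biomarker
P_fail = {                      # P(failure = 1 | dose, marker)
    (0, 0): 0.30, (0, 1): 0.55,
    (1, 0): 0.15, (1, 1): 0.40,
}

def p_failure(dose=None, marker=None):
    """P(failure = 1) given optional evidence, by full enumeration."""
    num = den = 0.0
    for d in (0, 1):
        if dose is not None and d != dose:
            continue
        for m in (0, 1):
            if marker is not None and m != marker:
                continue
            w = P_dose[d] * P_marker[m]
            num += w * P_fail[(d, m)]
            den += w
    return num / den

print(round(p_failure(), 4))                  # 0.3  (marginal risk)
print(round(p_failure(dose=1, marker=1), 2))  # 0.4
```

Enumeration is exponential in the number of nodes; real networks of clinical, dosimetric and biomarker variables use learned structures and efficient inference, but the semantics are the same.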
Integrating 3D geological information with a national physically-based hydrological modelling system
NASA Astrophysics Data System (ADS)
Lewis, Elizabeth; Parkin, Geoff; Kessler, Holger; Whiteman, Mark
2016-04-01
Robust numerical models are an essential tool for informing flood and water management and policy around the world. Physically-based hydrological models have traditionally not been used for such applications due to prohibitively large data, time and computational resource requirements. Given recent advances in computing power and data availability, a robust, physically-based hydrological modelling system for Great Britain using the SHETRAN model and national datasets has been created. Such a model has several advantages over less complex systems. Firstly, compared with conceptual models, a national physically-based model is more readily applicable to ungauged catchments, in which hydrological predictions are also required. Secondly, the results of a physically-based system may be more robust under changing conditions such as climate and land cover, as physical processes and relationships are explicitly accounted for. Finally, a fully integrated surface and subsurface model such as SHETRAN offers a wider range of applications compared with simpler schemes, such as assessments of groundwater resources, sediment and nutrient transport and flooding from multiple sources. As such, SHETRAN provides a robust means of simulating numerous terrestrial system processes which will add physical realism when coupled to the JULES land surface model. 306 catchments spanning Great Britain have been modelled using this system. The standard configuration of this system performs satisfactorily (NSE > 0.5) for 72% of catchments and well (NSE > 0.7) for 48%. Many of the remaining 28% of catchments that performed relatively poorly (NSE < 0.5) are located in the chalk in the south east of England. As such, the British Geological Survey 3D geology model for Great Britain (GB3D) has been incorporated, for the first time in any hydrological model, to pave the way for improvements to be made to simulations of catchments with important groundwater regimes. 
This coupling has involved development of software to allow for easy incorporation of geological information into SHETRAN for any model setup. The addition of more realistic subsurface representation following this approach is shown to greatly improve model performance in areas dominated by groundwater processes. The resulting modelling system has great potential to be used as a resource at national, regional and local scales in an array of different applications, including climate change impact assessments, land cover change studies and integrated assessments of groundwater and surface water resources.
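The performance thresholds quoted above refer to the Nash-Sutcliffe efficiency (NSE), where 1 is a perfect fit, values above 0.5 were labelled satisfactory, and values above 0.7 good. A minimal implementation, with synthetic flow values for illustration:

```python
import numpy as np

def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 minus the ratio of the model's
    squared error to the variance of the observations."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) \
               / np.sum((observed - observed.mean()) ** 2)

obs = np.array([1.0, 2.0, 4.0, 3.0, 2.0])   # synthetic observed flows
print(nse(obs, obs))                         # 1.0 (perfect fit)
print(round(nse(obs, obs + 0.5), 3))         # 0.76 (constant bias)
```

NSE can be negative when the model performs worse than simply predicting the observed mean, which is why it is a demanding test for ungauged or groundwater-dominated catchments.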
NASA Technical Reports Server (NTRS)
Gastellu-Etchegorry, Jean-Phil; Lauret, Nicolas; Yin, Tiangang; Landier, Lucas; Kallel, Abdelaziz; Malenovsky, Zbynek; Bitar, Ahmad Al; Aval, Josselin; Benhmida, Sahar; Qi, Jianbo;
2017-01-01
To better understand the life-essential cycles and processes of our planet and to further develop remote sensing (RS) technology, there is an increasing need for models that simulate the radiative budget (RB) and RS acquisitions of urban and natural landscapes using physical approaches and considering the three-dimensional (3-D) architecture of Earth surfaces. Discrete anisotropic radiative transfer (DART) is one of the most comprehensive physically based 3-D models of Earth-atmosphere radiative transfer, covering the spectral domain from ultraviolet to thermal infrared wavelengths. It simulates the optical 3-D RB and optical signals of proximal, aerial, and satellite imaging spectrometers and laser scanners, for any urban and/or natural landscapes and for any experimental and instrumental configurations. It is freely available for research and teaching activities. In this paper, we briefly introduce DART theory and present recent advances in simulated sensors (LiDAR and cameras with finite field of view) and modeling mechanisms (atmosphere, specular reflectance with polarization and chlorophyll fluorescence). A case study demonstrating a novel application of DART to investigate urban landscapes is also presented.
Analytical approximation of the InGaZnO thin-film transistors surface potential
NASA Astrophysics Data System (ADS)
Colalongo, Luigi
2016-10-01
Surface-potential-based mathematical models are among the most accurate and physically based compact models of thin-film transistors, and in turn of indium gallium zinc oxide (IGZO) TFTs, available today. However, the need for iterative computation of the surface potential limits their computational efficiency and their adoption in CAD applications. The existing closed-form approximations of the surface potential are based on regional approximations and empirical smoothing functions that may not be accurate enough, in particular for modeling transconductances and transcapacitances. In this work we present an extremely accurate (in the range of nV) and computationally efficient non-iterative approximation of the surface potential that can serve as the basis for advanced surface-potential-based IGZO TFT models.
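The cost of iterative evaluation that motivates such closed-form approximations can be sketched with a toy implicit equation solved by Newton iteration. The equation, coefficient values, and function name below are illustrative assumptions, not the paper's actual IGZO TFT model:

```python
import math

# Toy implicit surface-potential-style equation (illustrative only):
#   psi + A * exp(psi / phi_t) = V_G
# Each evaluation requires an iterative root solve, which is what a
# non-iterative closed-form approximation avoids.
def solve_psi_newton(v_g, a=0.1, phi_t=0.0259, psi0=0.0, tol=1e-12, max_iter=100):
    psi = psi0
    for _ in range(max_iter):
        f = psi + a * math.exp(psi / phi_t) - v_g       # residual
        fp = 1.0 + (a / phi_t) * math.exp(psi / phi_t)  # derivative
        step = f / fp
        psi -= step
        if abs(step) < tol:
            break
    return psi

psi = solve_psi_newton(1.0)
residual = psi + 0.1 * math.exp(psi / 0.0259) - 1.0  # ~0 at the solution
```

Because the exponential term dominates, each Newton step costs an `exp` evaluation, and many such solves are needed per circuit simulation time step, which is the efficiency penalty the abstract refers to.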
Blanton, Brian; Dresback, Kendra; Colle, Brian; Kolar, Randy; Vergara, Humberto; Hong, Yang; Leonardo, Nicholas; Davidson, Rachel; Nozick, Linda; Wachtendorf, Tricia
2018-04-25
Hurricane track and intensity can change rapidly in unexpected ways, thus making predictions of hurricanes and related hazards uncertain. This inherent uncertainty often translates into suboptimal decision-making outcomes, such as unnecessary evacuation. Representing this uncertainty is thus critical in evacuation planning and related activities. We describe a physics-based hazard modeling approach that (1) dynamically accounts for the physical interactions among hazard components and (2) captures hurricane evolution uncertainty using an ensemble method. This loosely coupled model system provides a framework for probabilistic water inundation and wind speed levels for a new, risk-based approach to evacuation modeling, described in a companion article in this issue. It combines the Weather Research and Forecasting (WRF) meteorological model, the Coupled Routing and Excess STorage (CREST) hydrologic model, and the ADvanced CIRCulation (ADCIRC) storm surge, tide, and wind-wave model to compute inundation levels and wind speeds for an ensemble of hurricane predictions. Perturbations to WRF's initial and boundary conditions and different model physics/parameterizations generate an ensemble of storm solutions, which are then used to drive the coupled hydrologic + hydrodynamic models. Hurricane Isabel (2003) is used as a case study to illustrate the ensemble-based approach. The inundation, river runoff, and wind hazard results are strongly dependent on the accuracy of the mesoscale meteorological simulations, which improves with decreasing lead time to hurricane landfall. The ensemble envelope brackets the observed behavior while providing "best-case" and "worst-case" scenarios for the subsequent risk-based evacuation model. © 2018 Society for Risk Analysis.
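The ensemble-envelope idea, per-location best-case and worst-case bounds taken across storm solutions, can be sketched in a few lines. The member count and inundation values below are made-up placeholders, not Hurricane Isabel output:

```python
import numpy as np

# Hypothetical inundation levels (metres) from an ensemble of storm solutions:
# rows are ensemble members, columns are coastal locations.
rng = np.random.default_rng(0)
n_members, n_locations = 8, 5
inundation = rng.uniform(0.0, 3.0, size=(n_members, n_locations))

best_case = inundation.min(axis=0)   # lowest simulated inundation per location
worst_case = inundation.max(axis=0)  # highest simulated inundation per location
spread = worst_case - best_case      # ensemble uncertainty per location
```

A risk-based evacuation model can then be driven by the bracketing scenarios rather than a single deterministic forecast.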
Simulation modelling for new gas turbine fuel controller creation.
NASA Astrophysics Data System (ADS)
Vendland, L. E.; Pribylov, V. G.; Borisov, Yu A.; Arzamastsev, M. A.; Kosoy, A. A.
2017-11-01
State-of-the-art gas turbine fuel flow control systems are based on the throttle principle. The major disadvantage of such systems is that they require a high-pressure fuel intake. A different approach to fuel flow control is to use a regulating compressor; because the controller and the gas turbine interact, this approach requires a purpose-designed regulating compressor. Difficulties emerge as early as the requirement-definition stage: to define requirements for a new object, its properties must be known. Simulation modelling helps to overcome these difficulties. At the requirement-definition stage the most simplified mathematical model is used; the mathematical models become more complex and detailed as the planned work advances. In the future, the physical model of the regulating compressor is planned to be adjusted to work with a virtual gas turbine and a physical control system.
Solar physics in the space age
NASA Technical Reports Server (NTRS)
1989-01-01
A concise review is given of the domain of solar physics and of how its study has been affected by NASA space programs, which have enabled space-based observations. These observations have greatly increased knowledge of solar physics, confirming some theories and challenging others; many questions remain unanswered. To exploit coming opportunities such as the Space Station, solar physics must continue its advances in instrument development, observational techniques, and basic theory. Even with the Advanced Solar Observatory, other space-based observations will still be required to address the questions that are sure to ensue.
Li, Jian-Yang; Helfenstein, Paul; Buratti, Bonnie J.; Takir, Driss; Clark, Beth Ellen; Michel, Patrick; DeMeo, Francesca E.; Bottke, William F.
2015-01-01
Asteroid photometry has three major applications: providing clues about asteroid surface physical properties and compositions, facilitating photometric corrections, and helping design and plan ground-based and spacecraft observations. The most significant advances in asteroid photometry in the past decade were driven by spacecraft observations that collected spatially resolved imaging and spectroscopy data. In the meantime, laboratory measurements and theoretical developments are revealing controversies regarding the physical interpretations of models and model parameter values. We will review the new developments in asteroid photometry that have occurred over the past decade in the three complementary areas of observations, laboratory work, and theory. Finally we will summarize and discuss the implications of recent findings.
Virtual Observation System for Earth System Model: An Application to ACME Land Model Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Dali; Yuan, Fengming; Hernandez, Benjamin
Investigating and evaluating physical-chemical-biological processes within an Earth system model (ESM) can be very challenging due to the complexity of both model design and software implementation. A virtual observation system (VOS) is presented to enable interactive observation of these processes during system simulation. Based on advanced computing technologies, such as compiler-based software analysis, automatic code instrumentation, and high-performance data transport, the VOS provides run-time observation capability, in-situ data analytics for Earth system model simulation, and model behavior adjustment opportunities through simulation steering. A VOS for a terrestrial land model simulation within the Accelerated Climate Modeling for Energy model is also presented to demonstrate the implementation details and system innovations.
NASA Astrophysics Data System (ADS)
Danner, Travis W.
Developing technology systems requires all manner of investment---engineering talent, prototypes, test facilities, and more. Even for simple design problems the investment can be substantial; for complex technology systems, the development costs can be staggering. The profitability of a corporation in a technology-driven industry is crucially dependent on maximizing the effectiveness of research and development investment. Decision-makers charged with allocation of this investment are forced to choose between the further evolution of existing technologies and the pursuit of revolutionary technologies. At risk on the one hand is excessive investment in an evolutionary technology which has only limited availability for further improvement. On the other hand, the pursuit of a revolutionary technology may mean abandoning momentum and the potential for substantial evolutionary improvement resulting from the years of accumulated knowledge. The informed answer to this question, evolutionary or revolutionary, requires knowledge of the expected rate of improvement and the potential a technology offers for further improvement. This research is dedicated to formulating the assessment and forecasting tools necessary to acquire this knowledge. The same physical laws and principles that enable the development and improvement of specific technologies also limit the ultimate capability of those technologies. Researchers have long used this concept as the foundation for modeling technological advancement through extrapolation by analogy to biological growth models. These models are employed to depict technology development as it asymptotically approaches limits established by the fundamental principles on which the technological approach is based. This has proven an effective and accurate approach to modeling and forecasting simple single-attribute technologies. 
With increased system complexity and the introduction of multiple system objectives, however, the usefulness of this modeling technique begins to diminish. With the introduction of multiple objectives, researchers often abandon technology growth models for scoring models and technology frontiers. While both approaches possess advantages over current growth models for the assessment of multi-objective technologies, each lacks a necessary dimension for comprehensive technology assessment. By collapsing multiple system metrics into a single, non-intuitive technology measure, scoring models provide a succinct framework for multi-objective technology assessment and forecasting. Yet, with no consideration of physical limits, scoring models provide no insight as to the feasibility of a particular combination of system capabilities. They only indicate that a given combination of system capabilities yields a particular score. Conversely, technology frontiers are constructed with the distinct objective of providing insight into the feasibility of system capability combinations. Yet again, upper limits to overall system performance are ignored. Furthermore, the data required to forecast subsequent technology frontiers is often prohibitive. In an attempt to reincorporate the fundamental nature of technology advancement as bound by physical principles, researchers have sought to normalize multi-objective systems whereby the variability of a single system objective is eliminated as a result of changes in the remaining objectives. This drastically limits the applicability of the resulting technology model because it is only applicable for a single setting of all other system attributes. Attempts to maintain the interaction between the growth curves of each technical objective of a complex system have thus far been limited to qualitative and subjective consideration. 
This research proposes the formulation of multidimensional growth models as an approach to simulating the advancement of multi-objective technologies towards their upper limits. Multidimensional growth models were formulated by noticing and exploiting the correlation between technology growth models and technology frontiers. Both are frontiers in actuality. The technology growth curve is a frontier between capability levels of a single attribute and time, while a technology frontier is a frontier between the capability levels of two or more attributes. Multidimensional growth models are formulated by exploiting the mathematical significance of this correlation. The result is a model that can capture both the interaction between multiple system attributes and their expected rates of improvement over time. The fundamental nature of technology development is maintained, and interdependent growth curves are generated for each system metric with minimal data requirements. Being founded on the basic nature of technology advancement, relative to physical limits, the availability for further improvement can be determined for a single metric relative to other system measures of merit. A by-product of this modeling approach is a single n-dimensional technology frontier linking all n system attributes with time. This provides an environment capable of forecasting future system capability in the form of advancing technology frontiers. The ability of a multidimensional growth model to capture the expected improvement of a specific technological approach is dependent on accurately identifying the physical limitations to each pertinent attribute. This research investigates two potential approaches to identifying those physical limits, a physics-based approach and a regression-based approach. The regression-based approach has found limited acceptance among forecasters, although it does show potential for estimating upper limits with a specified degree of uncertainty. 
Forecasters have long favored physics-based approaches for establishing the upper limit to unidimensional growth models. The task of accurately identifying upper limits has become increasingly difficult with the extension of growth models into multiple dimensions. A lone researcher may be able to identify the physical limitation to a single attribute of a simple system; however, as system complexity and the number of attributes increases, the attention of researchers from multiple fields of study is required. Thus, limit identification is itself an area of research and development requiring some level of investment. Whether estimated by physics or regression-based approaches, predicted limits will always have some degree of uncertainty. This research takes the approach of quantifying the impact of that uncertainty on model forecasts rather than heavily endorsing a single technique for limit identification. In addition to formulating the multidimensional growth model, this research provides a systematic procedure for applying that model to specific technology architectures. Researchers and decision-makers are able to investigate the potential for additional improvement within that technology architecture and to estimate the expected cost of each incremental improvement relative to the cost of past improvements. In this manner, multidimensional growth models provide the necessary information to set reasonable program goals for the further evolution of a particular technological approach or to establish the need for revolutionary approaches in light of the constraining limits of conventional approaches.
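The single-attribute growth model underlying this extrapolation-by-analogy approach can be sketched as a logistic S-curve asymptotically approaching a physical upper limit. All parameter values below are illustrative, not drawn from the dissertation:

```python
import numpy as np

# Logistic technology growth curve: capability y(t) rises toward a
# physical upper limit `limit` at rate `rate`, with inflection at `t_mid`.
def logistic_growth(t, limit, rate, t_mid):
    return limit / (1.0 + np.exp(-rate * (t - t_mid)))

t = np.linspace(0, 50, 501)                      # hypothetical time axis
y = logistic_growth(t, limit=100.0, rate=0.3, t_mid=25.0)

# "Availability for further improvement": headroom left below the limit.
remaining_potential = 100.0 - y[-1]
```

The remaining headroom below the limit is exactly the quantity a decision-maker weighs when choosing between evolving an existing technology and pursuing a revolutionary one.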
Advanced solar irradiances applied to satellite and ionospheric operational systems
NASA Astrophysics Data System (ADS)
Tobiska, W. Kent; Schunk, Robert; Eccles, Vince; Bouwer, Dave
Satellite and ionospheric operational systems require solar irradiances in a variety of time scales and spectral formats. We describe the development of a system using operational grade solar irradiances that are applied to empirical thermospheric density models and physics-based ionospheric models used by operational systems that require a space weather characterization. The SOLAR2000 (S2K) and SOLARFLARE (SFLR) models developed by Space Environment Technologies (SET) provide solar irradiances from the soft X-rays (XUV) through the Far Ultraviolet (FUV) spectrum. The irradiances are provided as integrated indices for the JB2006 empirical atmosphere density models and as line/band spectral irradiances for the physics-based Ionosphere Forecast Model (IFM) developed by the Space Environment Corporation (SEC). We describe the integration of these irradiances in historical, current epoch, and forecast modes through the Communication Alert and Prediction System (CAPS). CAPS provides real-time and forecast HF radio availability for global and regional users and global total electron content (TEC) conditions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Syphers, M. J.; Chattopadhyay, S.
An overview is provided of the currently envisaged landscape of charged particle accelerators at the energy and intensity frontiers to explore particle physics beyond the standard model via 1-100 TeV-scale lepton and hadron colliders and multi-megawatt proton accelerators for short- and long-baseline neutrino experiments. The particle beam physics, associated technological challenges, and progress to date for these accelerator facilities (LHC, HL-LHC, future 100 TeV p-p colliders, TeV-scale linear and circular electron-positron colliders, and the high-intensity proton accelerator complex PIP-II for DUNE with a future upgrade to PIP-III) are outlined. Potential and prospects for advanced “nonlinear dynamic techniques” at the multi-MW-level intensity frontier and advanced “plasma-wakefield-based techniques” at the TeV-scale energy frontier are also described.
Advancement of CMOS Doping Technology in an External Development Framework
NASA Astrophysics Data System (ADS)
Jain, Amitabh; Chambers, James J.; Shaw, Judy B.
2011-01-01
The consumer appetite for a rich multimedia experience drives technology development for mobile hand-held devices and the infrastructure to support them. Enhancements in functionality, speed, and user experience are derived from advancements in CMOS technology. The technical challenges in developing each successive CMOS technology node to support these enhancements have become increasingly difficult. These trends have motivated the CMOS business towards a collaborative approach based on strategic partnerships. This paper describes our model and experience of CMOS development, based on multi-dimensional industrial and academic partnerships. We provide to our process equipment, materials, and simulation partners, as well as to our silicon foundry partners, the detailed requirements for future integrated circuit products. This is done very early in the development cycle to ensure that these requirements can be met. In order to determine these fundamental requirements, we rely on a strategy that requires strong interaction between process and device simulation, physical and chemical analytical methods, and research at academic institutions. This learning is shared with each project partner to address integration and manufacturing issues encountered during CMOS technology development from its inception through product ramp. We utilize TI's core strengths in physical analysis, unit processes and integration, yield ramp, reliability, and product engineering to support this technological development. Finally, this paper presents examples of the advancement of CMOS doping technology for the 28 nm node and beyond through this development model.
Advanced Propulsion Physics Lab: Eagleworks Investigations
NASA Technical Reports Server (NTRS)
Scogin, Tyler
2014-01-01
Eagleworks Laboratory is an advanced propulsion physics laboratory with two primary investigations currently underway. The first is a Quantum Vacuum Plasma Thruster (QVPT or Q-thrusters), an advanced electric propulsion technology in the development and demonstration phase. The second investigation is in Warp Field Interferometry (WFI). This is an investigation of Dr. Harold "Sonny" White's theoretical physics models for warp field equations using optical experiments in the Electro Optical laboratory (EOL) at Johnson Space Center. These investigations are pursuing technology necessary to enable human exploration of the solar system and beyond.
Physics-Based Hazard Assessment for Critical Structures Near Large Earthquake Sources
NASA Astrophysics Data System (ADS)
Hutchings, L.; Mert, A.; Fahjan, Y.; Novikova, T.; Golara, A.; Miah, M.; Fergany, E.; Foxall, W.
2017-09-01
We argue that for critical structures near large earthquake sources: (1) the ergodic assumption, recent history, and simplified descriptions of the hazard are not appropriate to rely on for earthquake ground motion prediction and can lead to a mis-estimation of the hazard and risk to structures; (2) a physics-based approach can address these issues; (3) a physics-based source model must be provided to generate realistic phasing effects from finite rupture and model near-source ground motion correctly; (4) wave propagation and site response should be site specific; (5) a much wider search of possible sources of ground motion can be achieved computationally with a physics-based approach; (6) unless one utilizes a physics-based approach, the hazard and risk to structures have unknown uncertainties; (7) uncertainties can be reduced with a physics-based approach, but not with an ergodic approach; (8) computational power and computer codes have advanced to the point that risk to structures can be calculated directly from source and site-specific ground motions. Spanning the variability of potential ground motion in a predictive situation is especially difficult for near-source areas, but that is the distance at which the hazard is the greatest. The basis of a "physics-based" approach is ground-motion synthesis derived from physics and an understanding of the earthquake process. This is an overview paper, and results from previous studies are used to make the case for these conclusions. Our premise is that 50 years of strong motion records is insufficient to capture all possible ranges of site and propagation path conditions, rupture processes, and spatial geometric relationships between source and site. Predicting future earthquake scenarios is necessary; models that have little or no physical basis but have been tested and adjusted to fit available observations can only "predict" what happened in the past, which should be considered description as opposed to prediction. 
We have developed a methodology for synthesizing physics-based broadband ground motion that incorporates the effects of realistic earthquake rupture along specific faults and the actual geology between the source and site.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Philip, Bobby
2012-06-01
The Advanced Multi-Physics (AMP) code, in its present form, will allow a user to build a multi-physics application code for existing mechanics and diffusion operators and extend them with user-defined material models and new physics operators. There are examples that demonstrate mechanics, thermo-mechanics, coupled diffusion, and mechanical contact. The AMP code is designed to leverage a variety of mathematical solvers (PETSc, Trilinos, SUNDIALS, and AMP solvers) and mesh databases (LibMesh and AMP) in a consistent interchangeable approach.
ZERODUR thermo-mechanical modelling and advanced dilatometry for the ELT generation
NASA Astrophysics Data System (ADS)
Jedamzik, Ralf; Kunisch, Clemens; Westerhoff, Thomas
2016-07-01
Large amounts of low thermal expansion material are required for the upcoming ELT projects. The main mirror is designed using several hundred hexagonal 1.4 m sized mirror blanks; the M2 and M3 are monolithic 4 m class mirror blanks. The mirror blank material needs to fulfill tight requirements regarding CTE specification and homogeneity. Additionally, the mirror blanks need to be dimensionally stable for more than 30 years; in particular, stress effects due to changes in the environment shall not entail shape variation of more than 0.5 μm PV within 30 years. In 2010 SCHOTT developed a physically based model to describe the long-term thermal and mechanical behavior of ZERODUR. The model enables simulation of the long-term behavior of ZERODUR mirror blanks under realistic mechanical and thermal constraints. This presentation shows FEM simulation results on the long-term behavior of the ELT M1, M2 and M3 mirror blanks under different loading conditions. Additionally, the model results are compared to a long-term measurement, already running for 15 years, of a ZERODUR sample at the German national metrology institute (PTB). In recent years SCHOTT pushed push rod dilatometer measurement technology to its limit: with the new Advanced Dilatometer, CTE measurement accuracies of ±3 ppb/K and reproducibilities of better than 1 ppb/K have been achieved. The new Advanced Dilatometer also exhibits excellent long-term stability.
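A back-of-envelope check shows why ppb/K-level CTE accuracy matters at this scale; the mirror dimension and temperature excursion below are chosen for illustration, not taken from SCHOTT's specification:

```python
# Length change from a residual CTE of 1 ppb/K on a 4 m class blank
# over a 10 K temperature excursion (illustrative numbers):
cte_residual = 1e-9     # 1 ppb/K, expressed as 1/K
length_m = 4.0          # mirror blank dimension
delta_t = 10.0          # temperature excursion in kelvin

delta_length_m = cte_residual * length_m * delta_t
delta_length_nm = delta_length_m * 1e9
```

Even a 1 ppb/K residual expansion produces tens of nanometres of length change on a 4 m blank, a meaningful fraction of optical surface-figure budgets.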
Predicting Production Costs for Advanced Aerospace Vehicles
NASA Technical Reports Server (NTRS)
Bao, Han P.; Samareh, J. A.; Weston, R. P.
2002-01-01
For early design concepts, the conventional approach to cost estimation is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and are nowhere near the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This paper outlines the development of a process-based cost model in which the physical elements of the vehicle are scored according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed to provide an overall estimate of the total production cost for a design configuration. This capability to directly link any design configuration to a realistic cost estimate is a key requirement for high-payoff MDO problems. Another important consideration in this paper is the handling of part or product complexity. Here the concept of a cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool.
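The summation of modulus-scaled elemental costs can be sketched as follows; the element names, base costs, and modulus values are hypothetical, not drawn from the paper's model:

```python
# Process-based costing sketch: each element's geometry-derived base cost is
# scaled by a "cost modulus" capturing material/precision variability, then
# summed. All numbers here are illustrative placeholders.
elements = [
    # (name, base cost from geometry, cost modulus)
    ("wing skin panel", 1200.0, 1.4),  # modulus > 1: e.g. composite material
    ("fuselage frame", 800.0, 1.0),    # baseline process
    ("bulkhead", 650.0, 1.8),          # e.g. tight fabrication precision
]

total_cost = sum(base * modulus for _, base, modulus in elements)
```

This is the kind of calculation the abstract notes is "easily implemented on any spreadsheet tool": one row per element, one multiplier column, one sum.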
NASA Astrophysics Data System (ADS)
Yan, Wentao; Lin, Stephen; Kafka, Orion L.; Lian, Yanping; Yu, Cheng; Liu, Zeliang; Yan, Jinhui; Wolff, Sarah; Wu, Hao; Ndip-Agbor, Ebot; Mozaffar, Mojtaba; Ehmann, Kornel; Cao, Jian; Wagner, Gregory J.; Liu, Wing Kam
2018-05-01
Additive manufacturing (AM) possesses appealing potential for manipulating material compositions, structures and properties in end-use products with arbitrary shapes without the need for specialized tooling. Since the physical process is difficult to experimentally measure, numerical modeling is a powerful tool to understand the underlying physical mechanisms. This paper presents our latest work in this regard based on comprehensive material modeling of process-structure-property relationships for AM materials. The numerous influencing factors that emerge from the AM process motivate the need for novel rapid design and optimization approaches. For this, we propose data-mining as an effective solution. Such methods—used in the process-structure, structure-properties and the design phase that connects them—would allow for a design loop for AM processing and materials. We hope this article will provide a road map to enable AM fundamental understanding for the monitoring and advanced diagnostics of AM processing.
NASA Astrophysics Data System (ADS)
Olson, John R.
This is a quasi-experimental study of 261 first-year high school students that analyzes gains made through the use of calculator-based rangers (CBRs) attached to calculators. The study has qualitative components but is based on quantitative tests. Beichner's TUG-K test was used for the pretest, posttest, and post-posttest. The population was divided into one group that predicted the results before using the CBRs and another that did not predict first but completed the same activities. The data for the groups were further disaggregated into learning style groups (based on Kolb's Learning Styles Inventory), type of class (advanced vs. general physics), and gender. Four instructors used the labs developed by the author for this study and created significant differences between the groups by instructor, based on interviews, participant observation, and one-way ANOVA. No significant differences were found between learning styles based on MANOVA. No significant differences were found between predict and non-predict groups in the one-way ANOVAs or MANOVA; however, some differences do exist as measured by a survey and participant observation. Significant differences do exist between genders and between types of class (advanced/general) based on one-way ANOVA and MANOVA. The males outscored the females on all tests, and the advanced physics students scored higher than the general physics students on all tests. The advanced physics students scoring higher was expected, but the difference between genders was not.
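A minimal sketch of the one-way ANOVA F statistic used for such group comparisons; the scores below are fabricated for illustration, not the study's TUG-K data:

```python
import numpy as np

# Hypothetical test scores for two class types (advanced vs. general physics).
advanced = np.array([14.0, 15.0, 13.0, 16.0, 15.0])
general = np.array([10.0, 11.0, 9.0, 12.0, 10.0])

groups = [advanced, general]
all_scores = np.concatenate(groups)
grand_mean = all_scores.mean()

# Between-group and within-group sums of squares.
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)

df_between = len(groups) - 1               # k - 1
df_within = len(all_scores) - len(groups)  # N - k
f_stat = (ss_between / df_between) / (ss_within / df_within)
```

A large F (mean-square between groups far exceeding mean-square within) is what underlies statements like "significant differences exist between types of class."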
Quantifying Astronaut Tasks: Robotic Technology and Future Space Suit Design
NASA Technical Reports Server (NTRS)
Newman, Dava
2003-01-01
The primary aim of this research effort was to advance the current understanding of astronauts' capabilities and limitations in space-suited EVA by developing models of the constitutive and compatibility relations of a space suit, based on experimental data gained from human test subjects as well as a 12 degree-of-freedom human-sized robot, and utilizing these fundamental relations to estimate a human factors performance metric for space suited EVA work. The three specific objectives are to: 1) Compile a detailed database of torques required to bend the joints of a space suit, using realistic, multi-joint human motions. 2) Develop a mathematical model of the constitutive relations between space suit joint torques and joint angular positions, based on experimental data, and compare other investigators' physics-based models to experimental data. 3) Estimate the work envelope of a space suited astronaut, using the constitutive and compatibility relations of the space suit. The body of work that makes up this report includes experimentation, empirical and physics-based modeling, and model applications. A detailed space suit joint torque-angle database was compiled with a novel experimental approach that used space-suited human test subjects to generate realistic, multi-joint motions and an instrumented robot to measure the torques required to accomplish these motions in a space suit. Based on the experimental data, a mathematical model is developed to predict joint torque from the joint angle history. Two physics-based models of pressurized fabric cylinder bending are compared to experimental data, yielding design insights. The mathematical model is applied to EVA operations in an inverse kinematic analysis coupled to the space suit model to calculate the volume in which space-suited astronauts can work with their hands, demonstrating that operational human factors metrics can be predicted from fundamental space suit information.
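Fitting an empirical torque-angle constitutive relation from such test data might be sketched as below; the angle/torque pairs and the quadratic form are illustrative assumptions, not the actual space suit database or the report's model:

```python
import numpy as np

# Hypothetical joint torque measurements at several bend angles.
angle_deg = np.array([0.0, 15.0, 30.0, 45.0, 60.0, 75.0, 90.0])
torque_nm = np.array([0.0, 2.1, 4.5, 7.4, 10.8, 14.9, 19.5])

# Least-squares quadratic fit: torque ~ c2*angle^2 + c1*angle + c0,
# capturing the stiffening of the pressurized suit at larger deflections.
coeffs = np.polyfit(angle_deg, torque_nm, deg=2)
predicted = np.polyval(coeffs, angle_deg)
rms_error = np.sqrt(np.mean((predicted - torque_nm) ** 2))
```

Such a fitted relation could then feed an inverse-kinematics analysis of the kind described, mapping candidate hand positions to required joint torques.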
(U) Ristra Next Generation Code Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hungerford, Aimee L.; Daniel, David John
LANL’s Weapons Physics management (ADX) and ASC program office have defined a strategy for exascale-class application codes that follows two supportive and mutually risk-mitigating paths: evolution for established codes (with a strong pedigree within the user community) based upon existing programming paradigms (MPI+X); and Ristra (formerly known as NGC), a high-risk/high-reward push for a next-generation multi-physics, multi-scale simulation toolkit based on emerging advanced programming systems (with an initial focus on data-flow task-based models exemplified by Legion [5]). Development along these paths is supported by the ATDM, IC, and CSSE elements of the ASC program, with the resulting codes forming a common ecosystem, and with algorithm and code exchange between them anticipated. Furthermore, solution of some of the more challenging problems of the future will require a federation of codes working together, using established-pedigree codes in partnership with new capabilities as they come on line. The role of Ristra as the high-risk/high-reward path for LANL’s codes is fully consistent with its role in the Advanced Technology Development and Mitigation (ATDM) sub-program of ASC (see Appendix C), in particular its emphasis on evolving ASC capabilities through novel programming models and data management technologies.
2013-01-01
Background Understanding children’s physical activity motivation, its antecedents and associations with behavior is important and can be advanced by using self-determination theory. However, research among youth is largely restricted to adolescents and studies of motivation within certain contexts (e.g., physical education). There are no measures of self-determination theory constructs (physical activity motivation or psychological need satisfaction) for use among children and no previous studies have tested a self-determination theory-based model of children’s physical activity motivation. The purpose of this study was to test the reliability and validity of scores derived from scales adapted to measure self-determination theory constructs among children and test a motivational model predicting accelerometer-derived physical activity. Methods Cross-sectional data from 462 children aged 7 to 11 years from 20 primary schools in Bristol, UK were analysed. Confirmatory factor analysis was used to examine the construct validity of adapted behavioral regulation and psychological need satisfaction scales. Structural equation modelling was used to test cross-sectional associations between psychological need satisfaction, motivation types and physical activity assessed by accelerometer. Results The construct validity and reliability of the motivation and psychological need satisfaction measures were supported. Structural equation modelling provided evidence for a motivational model in which psychological need satisfaction was positively associated with intrinsic and identified motivation types and intrinsic motivation was positively associated with children’s minutes in moderate-to-vigorous physical activity. Conclusions The study provides evidence for the psychometric properties of measures of motivation aligned with self-determination theory among children. 
Children’s motivation that is based on enjoyment and inherent satisfaction of physical activity is associated with their objectively-assessed physical activity and such motivation is positively associated with perceptions of psychological need satisfaction. These psychological factors represent potential malleable targets for interventions to increase children’s physical activity. PMID:24067078
The Ensemble Space Weather Modeling System (eSWMS): Status, Capabilities and Challenges
NASA Astrophysics Data System (ADS)
Fry, C. D.; Eccles, J. V.; Reich, J. P.
2010-12-01
Marking a milestone in space weather forecasting, the Space Weather Modeling System (SWMS) successfully completed validation testing in advance of operational testing at Air Force Weather Agency’s primary space weather production center. This is the first coupling of stand-alone, physics-based space weather models that are currently in operations at AFWA supporting the warfighter. Significant development effort went into ensuring the component models were portable and scalable while maintaining consistent results across diverse high performance computing platforms. Coupling was accomplished under the Earth System Modeling Framework (ESMF). The coupled space weather models are the Hakamada-Akasofu-Fry version 2 (HAFv2) solar wind model and GAIM1, the ionospheric forecast component of the Global Assimilation of Ionospheric Measurements (GAIM) model. The SWMS was developed by team members from AFWA, Explorations Physics International, Inc. (EXPI) and Space Environment Corporation (SEC). The successful development of the SWMS provides new capabilities beyond enabling extended lead-time, data-driven ionospheric forecasts. These include ingesting diverse data sets at higher resolution, incorporating denser computational grids at finer time steps, and performing probability-based ensemble forecasts. Work of the SWMS development team now focuses on implementing the ensemble-based probability forecast capability by feeding multiple scenarios of 5 days of solar wind forecasts to the GAIM1 model based on the variation of the input fields to the HAFv2 model. The ensemble SWMS (eSWMS) will provide the most-likely space weather scenario with uncertainty estimates for important forecast fields. The eSWMS will allow DoD mission planners to consider the effects of space weather on their systems with more advance warning than is currently possible. 
The payoff is enhanced, tailored support to the warfighter with improved capabilities, such as point-to-point HF propagation forecasts, single-frequency GPS error corrections, and high cadence, high-resolution Space Situational Awareness (SSA) products. We present the current status of eSWMS, its capabilities, limitations and path of transition to operational use.
The Iterative Research Cycle: Process-Based Model Evaluation
NASA Astrophysics Data System (ADS)
Vrugt, J. A.
2014-12-01
The ever-increasing growth in computational power, along with continued advances in measurement technologies and improvements in process understanding, has stimulated the development of increasingly complex physics-based models that simulate a myriad of processes at different spatial and temporal scales. Reconciling these high-order system models with perpetually larger volumes of field data is becoming more and more difficult, particularly because classical likelihood-based fitting methods lack the power to detect and pinpoint deficiencies in the model structure. In this talk I will give an overview of our latest research on process-based model calibration and evaluation. This approach, rooted in Bayesian theory, uses summary metrics of the calibration data rather than the data itself to help detect which component(s) of the model is (are) malfunctioning and in need of improvement. A few case studies involving hydrologic and geophysical models will be used to demonstrate the proposed methodology.
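The summary-metric idea can be sketched in a few lines: compute diagnostic signatures of observed and simulated series and flag the ones that disagree, each pointing at a specific process representation rather than an aggregate misfit. The two hydrologic signatures and the tolerance below are illustrative choices, not the metrics used in the talk.

```python
import numpy as np

def summary_metrics(q, p):
    """Two simple hydrologic signatures of a discharge series q given
    precipitation p: runoff ratio (water balance) and the
    Richards-Baker flashiness index (response timing)."""
    return {
        "runoff_ratio": q.sum() / p.sum(),
        "flashiness": np.abs(np.diff(q)).sum() / q.sum(),
    }

def diagnose(obs_q, sim_q, p, tol=0.10):
    """Flag each signature whose simulated value departs from the observed
    one by more than a relative tolerance `tol`."""
    m_obs = summary_metrics(obs_q, p)
    m_sim = summary_metrics(sim_q, p)
    return {k: abs(m_sim[k] - m_obs[k]) / abs(m_obs[k]) > tol for k in m_obs}

# synthetic example: a simulation with a biased water balance but the
# same dynamical shape as the observations
p = np.full(5, 10.0)
obs_q = np.array([2.0, 4.0, 6.0, 4.0, 2.0])
flags = diagnose(obs_q, 1.5 * obs_q, p)
```

In this example only the runoff ratio is flagged: the uniform bias changes the water balance but cancels out of the flashiness index, isolating the deficient model component.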
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loveday, D.L.; Craggs, C.
Box-Jenkins-based multivariate stochastic modeling is carried out using data recorded from a domestic heating system. The system comprises an air-source heat pump sited in the roof space of a house, solar assistance being provided by the conventional tile roof acting as a radiation absorber. Multivariate models are presented which illustrate the time-dependent relationships between three air temperatures: external ambient, heat pump evaporator entry, and evaporator exit. Using a deterministic modeling approach, physical interpretations are placed on the results of the multivariate technique. It is concluded that the multivariate Box-Jenkins approach is a suitable technique for building thermal analysis. Application to multivariate model-based control is discussed, with particular reference to building energy management systems. It is further concluded that stochastic modeling of data drawn from a short monitoring period offers a means of retrofitting an advanced model-based control system in existing buildings, which could be used to optimize energy savings. An approach to system simulation is suggested.
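In the same spirit as the Box-Jenkins transfer-function models described above, a lagged linear (ARX) model relating an output temperature to an input temperature can be fitted by ordinary least squares. This is a simplified stand-in for the full multivariate Box-Jenkins procedure, with synthetic data in place of the monitored heat pump temperatures:

```python
import numpy as np

def fit_arx(y, u, na=1, nb=1):
    """Least-squares fit of y[t] = sum_i a_i*y[t-i] + sum_j b_j*u[t-j],
    e.g. y = evaporator exit temperature, u = external ambient temperature.
    Returns the autoregressive (a) and input (b) coefficient estimates."""
    n = max(na, nb)
    rows = [np.concatenate([y[t - na:t][::-1], u[t - nb:t][::-1]])
            for t in range(n, len(y))]
    coef, *_ = np.linalg.lstsq(np.array(rows), y[n:], rcond=None)
    return coef[:na], coef[na:]

# synthetic data generated from a known first-order relation
rng = np.random.default_rng(0)
u = rng.normal(size=300)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.6 * y[t - 1] + 0.3 * u[t - 1]
a, b = fit_arx(y, u)
```

On noiseless synthetic data the fit recovers the generating coefficients exactly; with real monitored data the residual diagnostics of the Box-Jenkins methodology guide the choice of lag orders.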
Peng, Jiegang
2015-11-04
Weakly electric fish sense their surroundings in complete darkness using their active electrolocation system. Biologists have investigated active electrolocation for nearly 60 years, and engineers have investigated bio-inspired active electrolocation sensors for about 20 years. However, how the amplitude response of the active electrolocation system is affected by the frequency of the detecting electric field has rarely been investigated. In this paper, an electrolocation experiment system was built and the amplitude information-frequency characteristics (AIFC) of the electrolocation system were investigated for sinusoidal electric fields of varying frequencies. We find that the AIFC of the electrolocation system depend on the material properties and geometric features of the probed object and on the conductivity of the surrounding water. A detection frequency dead zone (DFDZ) and a frequency inflection point (FIP) of the AIFC were found for the electrolocation system. Analysis models of the electrolocation system have been investigated for many years, but the DFDZ and FIP of the AIFC are difficult to explain with those models. To explain these AIFC phenomena, we advance a simple relaxation model based on the Cole-Cole model that provides a physical, not merely mathematical, explanation for the electrolocation system. We also advance a hypothesis for the physical mechanism of the weakly electric fish electrolocation system, which may serve as a reference for understanding the physical mechanism of active electrolocation in weakly electric fish.
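The Cole-Cole relaxation underlying the proposed model can be sketched as follows; the resistance, time-constant, and dispersion values are illustrative placeholders, not parameters fitted to the electrolocation experiments:

```python
import numpy as np

def cole_cole_magnitude(f, r0=1000.0, rinf=100.0, tau=1e-3, alpha=0.2):
    """|Z(f)| for a Cole-Cole relaxation,
    Z = Rinf + (R0 - Rinf) / (1 + (j*2*pi*f*tau)**(1 - alpha)).
    alpha = 0 recovers the single-time-constant Debye case."""
    jwt = 1j * 2.0 * np.pi * np.asarray(f) * tau
    return np.abs(rinf + (r0 - rinf) / (1.0 + jwt**(1.0 - alpha)))

freqs = np.logspace(-2, 7, 100)   # Hz
mag = cole_cole_magnitude(freqs)
```

The amplitude rolls off between the low-frequency limit R0 and the high-frequency limit Rinf around f ≈ 1/(2πτ), the kind of frequency-dependent transition that a purely resistive description of the probed object cannot produce.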
DAWN (Design Assistant Workstation) for advanced physical-chemical life support systems
NASA Technical Reports Server (NTRS)
Rudokas, Mary R.; Cantwell, Elizabeth R.; Robinson, Peter I.; Shenk, Timothy W.
1989-01-01
This paper reports the results of a project supported by the National Aeronautics and Space Administration, Office of Aeronautics and Space Technology (NASA-OAST) under the Advanced Life Support Development Program. It is an initial attempt to integrate artificial intelligence techniques (via expert systems) with conventional quantitative modeling tools for advanced physical-chemical life support systems. The addition of artificial intelligence techniques will assist the designer in the definition and simulation of loosely/well-defined life support processes/problems as well as assist in the capture of design knowledge, both quantitative and qualitative. Expert system and conventional modeling tools are integrated to provide a design workstation that assists the engineer/scientist in creating, evaluating, documenting and optimizing physical-chemical life support systems for short-term and extended duration missions.
Materials used to simulate physical properties of human skin.
Dąbrowska, A K; Rotaru, G-M; Derler, S; Spano, F; Camenzind, M; Annaheim, S; Stämpfli, R; Schmid, M; Rossi, R M
2016-02-01
For many applications in research, material development and testing, physical skin models are preferable to the use of human skin, because more reliable and reproducible results can be obtained. This article gives an overview of materials applied to model physical properties of human skin to encourage multidisciplinary approaches for more realistic testing and improved understanding of skin-material interactions. The literature databases Web of Science, PubMed and Google Scholar were searched using the terms 'skin model', 'skin phantom', 'skin equivalent', 'synthetic skin', 'skin substitute', 'artificial skin', 'skin replica', and 'skin model substrate.' Articles addressing material developments or measurements that include the replication of skin properties or behaviour were analysed. It was found that the most common materials used to simulate skin are liquid suspensions, gelatinous substances, elastomers, epoxy resins, metals and textiles. Nano- and micro-fillers can be incorporated in the skin models to tune their physical properties. While numerous physical skin models have been reported, most developments are research field-specific and based on trial-and-error methods. As the complexity of advanced measurement techniques increases, new interdisciplinary approaches are needed in future to achieve refined models which realistically simulate multiple properties of human skin. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Schilling, Oliver S.; Gerber, Christoph; Partington, Daniel J.; Purtschert, Roland; Brennwald, Matthias S.; Kipfer, Rolf; Hunkeler, Daniel; Brunner, Philip
2017-12-01
To provide a sound understanding of the sources, pathways, and residence times of groundwater in alluvial river-aquifer systems, a combined multitracer and modeling experiment was carried out in an important alluvial drinking water wellfield in Switzerland. 222Rn, 3H/3He, atmospheric noble gases, and the novel 37Ar method were used to quantify residence times and mixing ratios of water from different sources. With a half-life of 35.1 days, 37Ar made it possible to close a critical observational time gap between 222Rn and 3H/3He for residence times of weeks to months. Covering the entire range of residence times of groundwater in alluvial systems revealed that atmospheric noble gases and helium isotopes are tracers well suited for end-member mixing analysis to quantify the fractions of water from different sources in such systems. A comparison between the tracer-based mixing ratios and mixing ratios simulated with a fully-integrated, physically-based flow model showed that models calibrated only against hydraulic heads cannot reliably reproduce mixing ratios or residence times of alluvial river-aquifer systems. However, the tracer-based mixing ratios allowed the identification of an appropriate flow model parametrization. Consequently, for alluvial systems, we recommend combining multitracer studies that cover all relevant residence times with fully-coupled, physically-based flow modeling to better characterize the complex interactions of river-aquifer systems.
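End-member mixing analysis of the kind used here reduces to a small constrained least-squares problem: given tracer concentrations for each candidate source and for a mixed sample, solve for source fractions that sum to one. The tracer values below are made up for illustration, not data from the Swiss wellfield:

```python
import numpy as np

def mixing_fractions(end_members, sample):
    """Solve end-member mixing: columns of `end_members` are sources,
    rows are tracers; returns source fractions. The mass-balance
    constraint sum(f) = 1 is appended as an extra least-squares row."""
    A = np.asarray(end_members, dtype=float)
    A = np.vstack([A, np.ones(A.shape[1])])
    b = np.append(np.asarray(sample, dtype=float), 1.0)
    f, *_ = np.linalg.lstsq(A, b, rcond=None)
    return f

# two tracers, two end members: young river water vs. older groundwater
river = [10.0, 1.0]    # hypothetical tracer concentrations
ground = [2.0, 5.0]
sample = [0.7 * r + 0.3 * g for r, g in zip(river, ground)]
f = mixing_fractions(np.column_stack([river, ground]), sample)
```

With more tracers than sources the system is overdetermined, and the least-squares residual itself becomes a consistency check on the chosen end members.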
Magnitude, moment, and measurement: The seismic mechanism controversy and its resolution.
Miyake, Teru
This paper examines the history of two related problems concerning earthquakes, and the way in which a theoretical advance was involved in their resolution. The first problem is the development of a physical, as opposed to empirical, scale for measuring the size of earthquakes. The second problem is that of understanding what happens at the source of an earthquake. There was a controversy about what the proper model for the seismic source mechanism is, which was finally resolved through advances in the theory of elastic dislocations. These two problems are linked, because the development of a physically-based magnitude scale requires an understanding of what goes on at the seismic source. I will show how the theoretical advances allowed seismologists to re-frame the questions they were trying to answer, so that the data they gathered could be brought to bear on the problem of seismic sources in new ways. Copyright © 2017 Elsevier Ltd. All rights reserved.
Nonlinear ultrasonics for material state awareness
NASA Astrophysics Data System (ADS)
Jacobs, L. J.
2014-02-01
Predictive health monitoring of structural components will require the development of advanced sensing techniques capable of providing quantitative information on the damage state of structural materials. By focusing on nonlinear acoustic techniques, it is possible to measure absolute, strength-based material parameters that can then be coupled with uncertainty models to enable accurate and quantitative life prediction. Starting at the material level, this review will present current research that involves a combination of sensing techniques and physics-based models to characterize damage in metallic materials. In metals, these nonlinear ultrasonic measurements can sense material state before the formation of micro- and macro-cracks. Typically, cracks of a measurable size appear quite late in a component's total life, while the material's integrity in terms of toughness and strength gradually decreases due to microplasticity (dislocations) and the associated change in the material's microstructure. This review focuses on second harmonic generation techniques. Since these nonlinear acoustic techniques are acoustic wave based, component interrogation can be performed with bulk, surface and guided waves using the same underlying material physics; these nonlinear ultrasonic techniques provide results which are independent of the wave type used. Recent physics-based models consider the evolution of damage due to dislocations, slip bands, interstitials, and precipitates in the lattice structure, which can lead to localized damage.
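In second harmonic generation measurements, the relative acoustic nonlinearity parameter is commonly estimated from the fundamental and second-harmonic amplitudes as β = 8A₂/(k²xA₁²). A minimal sketch (the numerical inputs below are illustrative, not measured values):

```python
import numpy as np

def beta_parameter(a1, a2, freq, velocity, distance):
    """Acoustic nonlinearity parameter beta = 8*A2 / (k**2 * x * A1**2),
    where k = 2*pi*f/c is the wavenumber of the fundamental, x the
    propagation distance, and a1, a2 the measured fundamental and
    second-harmonic displacement amplitudes."""
    k = 2.0 * np.pi * freq / velocity
    return 8.0 * a2 / (k**2 * distance * a1**2)

# illustrative numbers: 5 MHz tone burst over 20 mm of a steel-like material
beta = beta_parameter(a1=1e-9, a2=1e-13, freq=5e6, velocity=5900.0,
                      distance=0.02)
```

Because β depends only on the ratio A₂/A₁² at fixed geometry, it tracks microstructural change (dislocation density, precipitates) well before cracks of measurable size appear.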
Probability for Weather and Climate
NASA Astrophysics Data System (ADS)
Smith, L. A.
2013-12-01
Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error.
The aims of different ensemble strategies, and fundamental differences in ensemble design to support decision making versus to advance science, are noted. It is argued that, just as no point forecast is complete without an estimate of its accuracy, no model-based probability forecast is complete without an estimate of its own irrelevance. The same nonlinearities that made the electronic computer so valuable link the selection and assimilation of observations, the formation of ensembles, the evolution of models, the casting of model simulations back into observables, and the presentation of this information to those who use it to take action or to advance science. Timescales of interest exceed the lifetime of a climate model and the career of a climate scientist, disarming the trichotomy that led to swift advances in weather forecasting. Providing credible, informative climate services is a more difficult task. In this context, the value of comparing the forecasts of simulation models not only with each other but also with the performance of simple empirical models, whenever possible, is stressed. The credibility of meteorology is based on its ability to forecast and explain the weather. The credibility of climatology will always be based on flimsier stuff. Solid insights of climate science may be obscured if the severe limits on our ability to see the details of the future, even probabilistically, are not communicated clearly.
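The role of Monte Carlo ensembles discussed above can be illustrated with a toy chaotic system: the logistic map stands in for a weather model, and the spread of an initial-condition ensemble yields an event probability where the deterministic point forecast alone carries little information. All settings here are illustrative, not drawn from the talk.

```python
import numpy as np

def ensemble_event_probability(x0, sigma=1e-4, n_members=1000,
                               steps=20, threshold=0.5, seed=1):
    """Monte Carlo ensemble forecast for the chaotic logistic map
    x -> 4*x*(1 - x): perturb the initial condition, evolve every member,
    and return the fraction of members ending above `threshold`."""
    rng = np.random.default_rng(seed)
    x = np.clip(x0 + sigma * rng.normal(size=n_members), 1e-9, 1.0 - 1e-9)
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)
    return float((x > threshold).mean())

p = ensemble_event_probability(0.3)
```

After roughly twenty iterations the tiny initial perturbations have grown to fill the attractor, so the single trajectory launched from x0 is uninformative while the ensemble frequency remains a usable probability, illustrating why a point forecast is incomplete without an estimate of its accuracy.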
Bridging the divide: a model-data approach to Polar and Alpine microbiology.
Bradley, James A; Anesio, Alexandre M; Arndt, Sandra
2016-03-01
Advances in microbial ecology in the cryosphere continue to be driven by empirical approaches including field sampling and laboratory-based analyses. Although mathematical models are commonly used to investigate the physical dynamics of Polar and Alpine regions, they are rarely applied in microbial studies. Yet integrating modelling approaches with ongoing observational and laboratory-based work is ideally suited to Polar and Alpine microbial ecosystems given their harsh environmental and biogeochemical characteristics, simple trophic structures, distinct seasonality, often difficult accessibility, geographical expansiveness and susceptibility to accelerated climate changes. In this opinion paper, we explain how mathematical modelling ideally complements field and laboratory-based analyses. We thus argue that mathematical modelling is a powerful tool for the investigation of these extreme environments and that fully integrated, interdisciplinary model-data approaches could help the Polar and Alpine microbiology community address some of the great research challenges of the 21st century (e.g. assessing global significance and response to climate change). However, a better integration of field and laboratory work with model design and calibration/validation, as well as a stronger focus on quantitative information is required to advance models that can be used to make predictions and upscale processes and fluxes beyond what can be captured by observations alone. © FEMS 2016.
A Trial of Physics Education for Liberal Arts Students Using the Advancing Physics
NASA Astrophysics Data System (ADS)
Ochi, Nobuaki
A new approach to physics education for liberal arts students was trialled at a Japanese university. Advancing Physics, a modern textbook developed by the Institute of Physics, was employed as the basis of this approach. The textbook covers a variety of modern topics in science and technology with attractive illustrations, while the use of mathematics is kept to a minimum. Questionnaire results after one semester of lectures showed that students' interest in science and technology rose substantially. On the other hand, there were some difficulties in lecturing, mathematical techniques in particular, which should be addressed in the next trial. This result indicates the potential of Advancing Physics for liberal arts education.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Messner, M. C.; Truster, T. J.; Cochran, K. B.
Advanced reactors designed to operate at higher temperatures than current light water reactors require structural materials with high creep strength and creep-fatigue resistance to achieve long design lives. Grade 91 is a ferritic/martensitic steel designed for long creep life at elevated temperatures. It has been selected as a candidate material for sodium fast reactor intermediate heat exchangers and other advanced reactor structural components. This report focuses on the creep deformation and rupture life of Grade 91 steel. The time required to complete an experiment limits the availability of long-life creep data for Grade 91 and other structural materials. Design methods often extrapolate the available shorter-term experimental data to longer design lives. However, extrapolation methods tacitly assume the underlying material mechanisms causing creep for long-life/low-stress conditions are the same as the mechanisms controlling creep in the short-life/high-stress experiments. A change in mechanism for long-term creep could cause design methods based on extrapolation to be non-conservative. The goal for physically-based microstructural models is to accurately predict material response in experimentally-inaccessible regions of design space. An accurate physically-based model for creep represents all the material mechanisms that contribute to creep deformation and damage and predicts the relative influence of each mechanism, which changes with loading conditions. Ideally, the individual mechanism models adhere to the material physics and not an empirical calibration to experimental data and so the model remains predictive for a wider range of loading conditions. This report describes such a physically-based microstructural model for Grade 91 at 600° C. The model explicitly represents competing dislocation and diffusional mechanisms in both the grain bulk and grain boundaries.
The model accurately recovers the available experimental creep curves at higher stresses and the limited experimental data at lower stresses, predominately primary creep rates. The current model considers only one temperature. However, because the model parameters are, for the most part, directly related to the physics of fundamental material processes, the temperature dependence of the properties is known. Therefore, temperature dependence can be included in the model with limited additional effort. The model predicts a mechanism shift for 600° C at approximately 100 MPa from a dislocation-dominated regime at higher stress to a diffusion-dominated regime at lower stress. This mechanism shift impacts the creep life, notch-sensitivity, and, likely, creep ductility of Grade 91. In particular, the model predicts existing extrapolation methods for creep life may be non-conservative when attempting to extrapolate data for higher stress creep tests to low stress, long-life conditions. Furthermore, the model predicts a transition from notch-strengthening behavior at high stress to notch-weakening behavior at lower stresses. Both behaviors may affect the conservatism of existing design methods.
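The predicted mechanism shift can be sketched with a toy two-mechanism creep law: a power-law dislocation term plus a linear diffusional term. The coefficients below are hypothetical, chosen only so that the crossover falls near the 100 MPa value quoted in the report; they are not the model's calibrated parameters.

```python
def creep_rate(stress_mpa, a_disl=1e-22, n=8.0, a_diff=1e-8):
    """Toy creep law (1/s): power-law dislocation creep plus linear
    diffusional creep. All coefficients are hypothetical placeholders."""
    return a_disl * stress_mpa**n + a_diff * stress_mpa

def crossover_stress(a_disl=1e-22, n=8.0, a_diff=1e-8):
    """Stress at which the two mechanisms contribute equally:
    a_disl * s**n = a_diff * s  =>  s = (a_diff / a_disl)**(1/(n-1))."""
    return (a_diff / a_disl) ** (1.0 / (n - 1.0))
```

Below the crossover the linear term dominates, so extrapolating the steep high-stress power-law slope down to service stresses underestimates the creep rate and overpredicts life, which is the non-conservatism the report warns about.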
Multi-Physics Demonstration Problem with the SHARP Reactor Simulation Toolkit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merzari, E.; Shemon, E. R.; Yu, Y. Q.
This report describes the use of SHARP to perform a first-of-a-kind analysis of the core radial expansion phenomenon in an SFR. This effort required significant advances in the framework used to drive the coupled simulations, manipulate the mesh in response to the deformation of the geometry, and generate the necessary modified mesh files. Furthermore, the model geometry is fairly complex, and consistent mesh generation for the three physics modules required significant effort. Fully-integrated simulations of a 7-assembly mini-core test problem have been performed, and the results are presented here. Physics models of a full-core model of the Advanced Burner Test Reactor have also been developed for each of the three physics modules. Standalone results of each of the three physics modules for the ABTR are presented here, which provides a demonstration of the feasibility of the fully-integrated simulation.
Status Report on NEAMS System Analysis Module Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, R.; Fanning, T. H.; Sumner, T.
2015-12-01
Under the Reactor Product Line (RPL) of DOE-NE’s Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, an advanced SFR System Analysis Module (SAM) is being developed at Argonne National Laboratory. The goal of the SAM development is to provide fast-running, improved-fidelity, whole-plant transient analysis capabilities. SAM utilizes an object-oriented application framework (MOOSE), its underlying meshing and finite-element library libMesh, and the linear and nonlinear solvers of PETSc, to leverage modern advanced software environments and numerical methods. It also incorporates advances in physical and empirical models and seeks closure models based on information from high-fidelity simulations and experiments. This report provides an update on the SAM development and summarizes the activities performed in FY15 and the first quarter of FY16. The tasks include: (1) implement the support of 2nd-order finite elements in SAM components for improved accuracy and computational efficiency; (2) improve the conjugate heat transfer modeling and develop pseudo 3-D full-core reactor heat transfer capabilities; (3) perform verification and validation tests as well as demonstration simulations; (4) develop the coupling requirements for SAS4A/SASSYS-1 and SAM integration.
van der Merwe, Rudolph; Leen, Todd K; Lu, Zhengdong; Frolov, Sergey; Baptista, Antonio M
2007-05-01
We present neural network surrogates that provide extremely fast and accurate emulation of a large-scale circulation model for the coupled Columbia River, its estuary and near ocean regions. The circulation model has O(10^7) degrees of freedom, is highly nonlinear and is driven by ocean, atmospheric and river influences at its boundaries. The surrogates provide accurate emulation of the full circulation code and run over 1000 times faster. Such fast dynamic surrogates will enable significant advances in ensemble forecasts in oceanography and weather.
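A minimal illustration of the surrogate idea: train a small one-hidden-layer network to emulate a cheap stand-in for an expensive simulator. Here a smooth 1-D function plays the role of the circulation code; the real surrogates emulate a model with ~10^7 degrees of freedom, so this sketch only shows the mechanics, not the scale.

```python
import numpy as np

rng = np.random.default_rng(0)

# stand-in for the expensive simulator: a smooth nonlinear response
x = np.linspace(-1.0, 1.0, 200)[:, None]
y = np.sin(3.0 * x) + 0.5 * x**2

# one-hidden-layer tanh network, trained by full-batch gradient descent
h, lr = 16, 0.05
w1 = rng.normal(0.0, 0.5, (1, h)); b1 = np.zeros(h)
w2 = rng.normal(0.0, 0.5, (h, 1)); b2 = np.zeros(1)

def forward(xs):
    return np.tanh(xs @ w1 + b1) @ w2 + b2

mse0 = float(((forward(x) - y) ** 2).mean())   # loss before training
for _ in range(3000):
    a = np.tanh(x @ w1 + b1)             # hidden activations
    err = (a @ w2 + b2) - y              # dL/dpred for mean squared error
    da = (err @ w2.T) * (1.0 - a**2)     # backpropagate through tanh
    w2 -= lr * (a.T @ err) / len(x); b2 -= lr * err.mean(0)
    w1 -= lr * (x.T @ da) / len(x); b1 -= lr * da.mean(0)
mse = float(((forward(x) - y) ** 2).mean())
```

Once trained, evaluating `forward` costs a handful of matrix products, which is the source of the speedup: the cost of the simulator is paid offline, during the generation of training data.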
Frank, Lawrence D; Fox, Eric H; Ulmer, Jared M; Chapman, James E; Kershaw, Suzanne E; Sallis, James F; Conway, Terry L; Cerin, Ester; Cain, Kelli L; Adams, Marc A; Smith, Graham R; Hinckson, Erica; Mavoa, Suzanne; Christiansen, Lars B; Hino, Adriano Akira F; Lopes, Adalberto A S; Schipperijn, Jasper
2017-01-23
Advancements in geographic information systems over the past two decades have increased the specificity by which an individual's neighborhood environment may be spatially defined for physical activity and health research. This study investigated how different types of street network buffering methods compared in measuring a set of commonly used built environment measures (BEMs) and tested their performance on associations with physical activity outcomes. An internationally-developed set of objective BEMs using three different spatial buffering techniques was used to evaluate the relative differences in resulting explanatory power on self-reported physical activity outcomes. BEMs were developed in five countries using 'sausage,' 'detailed-trimmed,' and 'detailed' network buffers at a distance of 1 km around participant household addresses (n = 5883). BEM values were significantly different (p < 0.05) for 96% of sausage versus detailed-trimmed buffer comparisons and 89% of sausage versus detailed network buffer comparisons. Results showed that BEM coefficients in physical activity models did not differ significantly across buffering methods, and in most cases BEM associations with physical activity outcomes had the same level of statistical significance across buffer types. However, BEM coefficients differed in significance for 9% of the sausage versus detailed models, which may warrant further investigation. Results of this study inform the selection of spatial buffering methods to estimate physical activity outcomes using an internationally consistent set of BEMs. Using three different network-based buffering methods, the findings indicate significant variation among BEM values; however, associations with physical activity outcomes were similar across each buffering technique.
The study advances knowledge by presenting consistently assessed relationships between three different network buffer types and utilitarian travel, sedentary behavior, and leisure-oriented physical activity outcomes.
Integrative sensing and prediction of urban water for sustainable cities (iSPUW)
NASA Astrophysics Data System (ADS)
Seo, D. J.; Fang, N. Z.; Yu, X.; Zink, M.; Gao, J.; Kerkez, B.
2014-12-01
We describe a newly launched project in the Dallas-Fort Worth Metroplex (DFW) area to develop a cyber-physical prototype system that integrates advanced sensing, modeling and prediction of urban water, to support its early adoption by a spectrum of users and stakeholders, and to educate a new generation of future sustainability scientists and engineers. The project utilizes the very high-resolution precipitation and other sensing capabilities uniquely available in DFW as well as crowdsourcing and cloud computing to advance understanding of the urban water cycle and to improve urban sustainability from transient shocks of heavy-to-extreme precipitation under climate change and urbanization. All available water information from observations and models will be fused objectively via advanced data assimilation to produce the best estimate of the state of the uncertain system. Modeling, prediction and decision support tools will be developed in the ensemble framework to increase the information content of the analysis and prediction and to support risk-based decision making.
The Effects of Ambient Conditions on Helicopter Rotor Source Noise Modeling
NASA Technical Reports Server (NTRS)
Schmitz, Frederic H.; Greenwood, Eric
2011-01-01
A new physics-based method called Fundamental Rotorcraft Acoustic Modeling from Experiments (FRAME) is used to demonstrate the change in rotor harmonic noise of a helicopter operating at different ambient conditions. FRAME is based upon a non-dimensional representation of the governing acoustic and performance equations of a single rotor helicopter. Measured external noise is used together with parameter identification techniques to develop a model of helicopter external noise that is a hybrid between theory and experiment. The FRAME method is used to evaluate the main rotor harmonic noise of a Bell 206B3 helicopter operating at different altitudes. The variation with altitude of Blade-Vortex Interaction (BVI) noise, known to be a strong function of the helicopter's advance ratio, is dependent upon which definition of airspeed is flown by the pilot. If normal flight procedures are followed and indicated airspeed (IAS) is held constant, the true airspeed (TAS) of the helicopter increases with altitude. This causes an increase in advance ratio and a decrease in the speed of sound, which results in large changes to BVI noise levels. Results also show that thickness noise on this helicopter becomes more intense at high altitudes, where the advancing tip Mach number increases because the speed of sound is decreasing and the advance ratio increasing for the same indicated airspeed. These results suggest that existing measurement-based, empirically derived helicopter rotor noise source models may give incorrect noise estimates when they are used at conditions where data were not measured, and may need to be corrected for mission land-use planning purposes.
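The IAS/TAS/advance-ratio effect described above can be sketched numerically from standard atmosphere relations. This is a back-of-envelope illustration, not the FRAME model: the ISA troposphere formulas are standard, but the tip speed and flight numbers are assumptions for the example, not values from the paper.

```python
import math

def density_ratio(alt_m, lapse=0.0065, T0=288.15):
    """ISA troposphere density ratio sigma = rho/rho0."""
    T = T0 - lapse * alt_m
    return (T / T0) ** (9.80665 / (lapse * 287.05) - 1.0)

def speed_of_sound(alt_m, lapse=0.0065, T0=288.15):
    """Speed of sound a = sqrt(gamma * R * T) for the ISA troposphere."""
    T = T0 - lapse * alt_m
    return math.sqrt(1.4 * 287.05 * T)

def advance_ratio(tas, tip_speed):
    """mu = V / (Omega * R), with tip_speed = Omega * R in m/s."""
    return tas / tip_speed

ias = 50.0          # m/s, held constant by the pilot (assumed value)
tip_speed = 213.0   # m/s, rough 206B3-like main-rotor tip speed (assumed)

for alt in (0.0, 1500.0, 3000.0):
    tas = ias / math.sqrt(density_ratio(alt))  # TAS grows as density falls
    mu = advance_ratio(tas, tip_speed)         # advance ratio grows too
    a = speed_of_sound(alt)                    # speed of sound falls
    adv_tip_mach = (tip_speed + tas) / a       # advancing-tip Mach number rises
```

With these assumed numbers, the same 50 m/s IAS at 3000 m corresponds to roughly 58 m/s TAS, so both the advance ratio and the advancing-tip Mach number increase with altitude, as the abstract describes.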
NASA Technical Reports Server (NTRS)
Veazie, David R.
1998-01-01
Advanced polymer matrix composites (PMC's) are desirable for structural materials in diverse applications such as aircraft, civil infrastructure and biomedical implants because of their improved strength-to-weight and stiffness-to-weight ratios. For example, the next generation military and commercial aircraft requires applications for high strength, low weight structural components subjected to elevated temperatures. A possible disadvantage of polymer-based composites is that the physical and mechanical properties of the matrix often change significantly over time due to the exposure of elevated temperatures and environmental factors. For design, long term exposure (i.e. aging) of PMC's must be accounted for through constitutive models in order to accurately assess the effects of aging on performance, crack initiation and remaining life. One particular aspect of this aging process, physical aging, is considered in this research.
Performance Assessment of New Land-Surface and Planetary Boundary Layer Physics in the WRF-ARW
The Pleim-Xiu land surface model, Pleim surface layer scheme, and Asymmetric Convective Model (version 2) are now options in version 3.0 of the Weather Research and Forecasting model (WRF) Advanced Research WRF (ARW) core. These physics parameterizations were developed for the f...
Yoga as palliation in women with advanced cancer: a pilot study.
Carr, Tracey; Quinlan, Elizabeth; Robertson, Susan; Duggleby, Wendy; Thomas, Roanne; Holtslander, Lorraine
2016-03-01
The purpose of this pilot study was to investigate the palliative potential of home-based yoga sessions provided to women with advanced cancer. Personalised 45-minute yoga sessions were offered to three women with advanced cancer by an experienced yoga teacher. Each woman took part in a one-to-one interview after the completion of the yoga programme and was asked to describe her experiences of the programme's impact. The personalised nature of the yoga sessions resulted in positive physical and psychosocial effects comparable to those demonstrated in other studies with cancer patients. Participants described physical, mental, and emotional benefits as well as the alleviation of illness impacts. The enhancement of mind-body and body-spirit connections was also noted. Personalised home-based yoga programmes for people with advanced cancer may produce benefits, including palliation, similar to those of institutionally based programmes for people with non-advanced cancer.
Physics Literacy for All Students
NASA Astrophysics Data System (ADS)
Hobson, Art
2010-03-01
Physics teachers must broaden their focus from physics for scientists to physics for all. The reason, as the American Association for the Advancement of Science puts it, is: ``Without a scientifically literate population, the outlook for a better world is not promising.'' Physics for all (including the first course for scientists) should be conceptual, not technical. It should describe the universe as we understand it today, including special and general relativity, quantum physics, modern cosmology, the standard model, and quantum fields. Many science writers have shown this is possible. It should include physics-related social topics such as global warming and nuclear weapons, because citizens need to vote on these issues. Above all, it should emphasize the scientific process and the difference between science and nonsense. Science is based not on beliefs but rather on evidence and reason. We should constantly ask ``How do we know?'' and ``What is the evidence?''
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryan, Frank; Dennis, John; MacCready, Parker
This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation. The main computational objectives were: 1. To develop computationally efficient, but physically based, parameterizations of estuary and continental shelf mixing processes for use in an Earth System Model (CESM). 2. To develop a two-way nested regional modeling framework in order to dynamically downscale the climate response of particular coastal ocean regions and to upscale the impact of the regional coastal processes to the global climate in an Earth System Model (CESM). 3. To develop computational infrastructure to enhance the efficiency of data transfer between specific sources and destinations, i.e., a point-to-point communication capability, (used in objective 1) within POP, the ocean component of CESM.
NASA Astrophysics Data System (ADS)
Hong, Y.; Kirschbaum, D. B.; Fukuoka, H.
2011-12-01
The key to advancing the predictability of rainfall-triggered landslides is to use physically based slope-stability models that simulate the dynamical response of the subsurface moisture to spatiotemporal variability of rainfall in complex terrains. An early warning system applying such physical models has been developed to predict rainfall-induced shallow landslides over Java Island in Indonesia and in Honduras. The prototyped early warning system integrates three major components: (1) a susceptibility mapping or hotspot identification component based on a land surface geospatial database (topographical information, maps of soil properties, a local landslide inventory, etc.); (2) a satellite-based precipitation monitoring system (http://trmm.gsfc.nasa.gov) and a precipitation forecasting model (i.e., Weather Research and Forecasting); and (3) a physically based, rainfall-induced landslide prediction model, SLIDE (SLope-Infiltration-Distributed Equilibrium). The system utilizes the modified physical model to calculate a Factor of Safety (FS) that accounts for the contribution of rainfall infiltration and partial saturation to the shear strength of the soil in topographically complex terrains. The system's prediction performance has been evaluated using a local landslide inventory. In Java Island, Indonesia, evaluation of SLIDE modeling results against local news reports shows that the system successfully predicted landslides at times corresponding to the occurrence of the real landslide events. Further study of SLIDE was implemented in Honduras, where Hurricane Mitch triggered widespread landslides in 1998. Results show that within the approximately 1,200-square-kilometer study areas, hit rates reached as high as 78% and 75%, while the error indices were 35% and 49%.
Despite positive model performance, the SLIDE model is limited in the early warning system by several assumptions including, using general parameter calibration rather than in situ tests and neglecting geologic information. Advantages and limitations of this model will be discussed with respect to future applications of landslide assessment and prediction over large scales. In conclusion, integration of spatially distributed remote sensing precipitation products and in-situ datasets and physical models in this prototype system enable us to further develop a regional early warning tool in the future for forecasting storm-induced landslides.
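For orientation, the classic infinite-slope factor of safety that models like SLIDE build upon can be sketched as below. This is the generic textbook formulation, not the modified SLIDE model (which additionally treats infiltration and partial saturation), and all parameter values are invented for illustration.

```python
import math

def factor_of_safety(phi_deg, beta_deg, c, gamma, H, m, gamma_w=9.81):
    """
    Infinite-slope FS:
      FS = tan(phi)/tan(beta)
         + [c - m*H*gamma_w*tan(phi)*cos^2(beta)] / (gamma*H*sin(beta)*cos(beta))
    phi: soil friction angle (deg), beta: slope angle (deg),
    c: cohesion (kPa), gamma: soil unit weight (kN/m^3),
    H: soil depth (m), m: saturated fraction of the soil column (0..1).
    """
    phi, beta = math.radians(phi_deg), math.radians(beta_deg)
    frictional = math.tan(phi) / math.tan(beta)
    pore = m * H * gamma_w * math.tan(phi) * math.cos(beta) ** 2
    cohesive = (c - pore) / (gamma * H * math.sin(beta) * math.cos(beta))
    return frictional + cohesive

# Hypothetical slope: dry (m=0) versus fully saturated (m=1).
fs_dry = factor_of_safety(30, 35, c=5.0, gamma=18.0, H=2.0, m=0.0)
fs_wet = factor_of_safety(30, 35, c=5.0, gamma=18.0, H=2.0, m=1.0)
# Rising pore pressure lowers FS; FS < 1 means failure is predicted.
```

This is the mechanism by which rainfall infiltration drives the FS computed by such systems below the failure threshold.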
Science with the Advanced Gamma Ray Imaging System (AGIS)
NASA Astrophysics Data System (ADS)
Coppi, Paolo
2009-05-01
We present the scientific drivers for the Advanced Gamma Ray Imaging System (AGIS), a concept for the next-generation ground- based gamma-ray experiment, comprised of an array of ˜100 imaging atmospheric Cherenkov telescopes. Design requirements for AGIS include achieving a sensitivity an order of magnitude better than the current generation of space or ground-based instruments in the energy range of 40 GeV to ˜100 TeV. We present here an overview of the scientific goals of AGIS, including the prospects for understanding VHE phenomena in the vicinity of accreting black holes, particle acceleration in a variety of astrophysical environments, indirect detection of dark matter, study of cosmological background radiation fields, and particle physics beyond the standard model.
Probing the scale of new physics by Advanced LIGO/VIRGO
NASA Astrophysics Data System (ADS)
Dev, P. S. Bhupal; Mazumdar, A.
2016-05-01
We show that if the new physics beyond the standard model is associated with a first-order phase transition around 107- 108 GeV , the energy density stored in the resulting stochastic gravitational waves and the corresponding peak frequency are within the projected final sensitivity of the advanced LIGO/VIRGO detectors. We discuss some possible new physics scenarios that could arise at such energies, and in particular, the consequences for Peccei-Quinn and supersymmetry breaking scales.
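The connection between the transition temperature and the detector band follows from the standard redshift scaling for the peak frequency of phase-transition gravitational waves. The prefactor below is the commonly quoted order-of-magnitude value and depends on the dominant source term (bubble collisions, sound waves, or turbulence), so it should be read as a scaling estimate rather than the paper's exact expression:

```latex
% Redshifted peak frequency of GWs from a first-order phase transition
% (order-of-magnitude scaling; prefactor depends on the source term)
f_{\mathrm{peak}} \sim 1.6\times 10^{-5}\,\mathrm{Hz}\,
  \left(\frac{\beta}{H_*}\right)
  \left(\frac{T_*}{100\,\mathrm{GeV}}\right)
  \left(\frac{g_*}{100}\right)^{1/6}
```

For a typical inverse transition duration $\beta/H_* \sim 10^2$ and $T_* \sim 10^7\,\mathrm{GeV}$, this gives $f_{\mathrm{peak}} \sim 10^2\,\mathrm{Hz}$, which lies in the advanced LIGO/VIRGO sensitivity band, consistent with the abstract's claim.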
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDeavitt, Sean; Shao, Lin; Tsvetkov, Pavel
2014-04-07
Advanced fast reactor systems being developed under the DOE's Advanced Fuel Cycle Initiative are designed to destroy TRU isotopes generated in existing and future nuclear energy systems. Over the past 40 years, multiple experiments and demonstrations have been completed using U-Zr, U-Pu-Zr, U-Mo and other metal alloys. As a result, multiple empirical and semi-empirical relationships have been established to develop empirical performance modeling codes. Many mechanistic questions about fission gas mobility, bubble coalescence, and gas release have been answered through industrial experience, research, and empirical understanding. The advent of modern computational materials science, however, opens new doors of development such that physics-based multi-scale models may be developed to enable a new generation of predictive fuel performance codes that are not limited by empiricism.
Physical Analytics: An emerging field with real-world applications and impact
NASA Astrophysics Data System (ADS)
Hamann, Hendrik
2015-03-01
In the past, most information on the internet has originated from humans or computers. With the emergence of cyber-physical systems, however, vast amounts of data are now being created by sensors on devices, machines, etc., digitizing the physical world. While cyber-physical systems are the subject of active research around the world, the vast amount of actual data generated from the physical world has so far attracted little attention from the engineering and physics communities. In this presentation we use examples to highlight the opportunities in this new subject of ``Physical Analytics'' for highly interdisciplinary research (including physics, engineering and computer science), which aims at understanding real-world physical systems by leveraging cyber-physical technologies. More specifically, the convergence of the physical world with the digital domain allows applying physical principles to everyday problems in a much more effective and informed way than was possible in the past. Very much as traditional applied physics and engineering have made enormous advances and changed our lives by making detailed measurements to understand the physics of an engineered device, we can now apply the same rigor and principles to understand large-scale physical systems. In the talk we first present a set of ``configurable'' enabling technologies for Physical Analytics, including ultralow-power sensing and communication technologies, physical big data management technologies, numerical modeling for physical systems, machine-learning-based physical model blending, and physical-analytics-based automation and control. We then discuss in detail several concrete applications of Physical Analytics, ranging from energy management in buildings and data centers, environmental sensing and controls, and precision agriculture to renewable energy forecasting and management.
Physical-scale models of engineered log jams in rivers
USDA-ARS?s Scientific Manuscript database
Stream restoration and river engineering projects are employing engineered log jams increasingly for stabilization and in-stream improvements. To further advance the design of these structures and their morphodynamic effects on corridors, the basis for physical-scale models of rivers with engineere...
Leonard, Tammy; Shuval, Kerem; de Oliveira, Angela; Skinner, Celette Sugg; Eckel, Catherine; Murdoch, James C
2013-01-01
To examine the relationship between physical activity stages of change and preferences for financial risk and time. A cross-sectional, community-based study. A low-income, urban, African-American neighborhood. One hundred sixty-nine adults. Self-reported physical activity stages of change-precontemplation to maintenance, objectively measured body mass index and waist circumference, and economic preferences for time and risk measured via incentivized economic experiments. Multivariable ordered logistic regression models were used to examine the association between physical activity stages of change and economic preferences while controlling for demographic characteristics of the individuals. Individuals who are more tolerant of financial risks (odds ratio [OR] = 1.31, p < .05) and whose time preferences indicate more patience (OR = 1.68, p < .01) are more likely to be in a more advanced physical activity stage (e.g., from preparation to action). The likelihood of being in the maintenance stage increases by 5.6 and 10.9 percentage points for each one-unit increase in financial risk tolerance or one-unit increase in the time preference measure, respectively. Greater tolerance of financial risk and more patient time preferences among this low-income ethnic minority population are associated with a more advanced physical activity stage. Further exploration is clearly warranted in larger and more representative samples.
ERIC Educational Resources Information Center
Lodewyk, Ken R.; Mandigo, James L.
2017-01-01
Physical and Health Education Canada has developed and implemented a formative, criterion-referenced, and practitioner-based national (Canadian) online educational assessment and support resource called Passport for Life (PFL). It was developed to support the awareness and advancement of physical literacy among PE students and teachers. PFL…
RELAP-7 Software Verification and Validation Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling
This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL's modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's capability and extends the analysis capability for all reactor system simulation scenarios.
Engineering uses of physics-based ground motion simulations
Baker, Jack W.; Luco, Nicolas; Abrahamson, Norman A.; Graves, Robert W.; Maechling, Phillip J.; Olsen, Kim B.
2014-01-01
This paper summarizes validation methodologies focused on enabling ground motion simulations to be used with confidence in engineering applications such as seismic hazard analysis and dynamic analysis of structural and geotechnical systems. Numerical simulation of ground motion from large earthquakes, utilizing physics-based models of earthquake rupture and wave propagation, is an area of active research in the earth science community. Refinement and validation of these models require collaboration between earthquake scientists and engineering users, and testing/rating methodologies for simulated ground motions to be used with confidence in engineering applications. This paper provides an introduction to this field and an overview of current research activities being coordinated by the Southern California Earthquake Center (SCEC). These activities are related both to advancing the science and computational infrastructure needed to produce ground motion simulations, as well as to engineering validation procedures. Current research areas and anticipated future achievements are also discussed.
Magnetic geometry and physics of advanced divertors: The X-divertor and the snowflake
NASA Astrophysics Data System (ADS)
Kotschenreuther, Mike; Valanju, Prashant; Covele, Brent; Mahajan, Swadesh
2013-10-01
Advanced divertors are magnetic geometries where a second X-point is added in the divertor region to address the serious challenges of burning plasma power exhaust. Invoking physical arguments, numerical work, and detailed model magnetic field analysis, we investigate the magnetic field structure of advanced divertors in the physically relevant region for power exhaust—the scrape-off layer. A primary result of our analysis is the emergence of a physical "metric," the Divertor Index DI, which quantifies the flux expansion increase as one goes from the main X-point to the strike point. It clearly separates three geometries with distinct consequences for divertor physics—the Standard Divertor (DI = 1), and two advanced geometries—the X-Divertor (XD, DI > 1) and the Snowflake (DI < 1). The XD, therefore, cannot be classified as one variant of the Snowflake. By this measure, recent National Spherical Torus Experiment and DIIID experiments are X-Divertors, not Snowflakes.
Engineering and physical sciences in oncology: challenges and opportunities.
Mitchell, Michael J; Jain, Rakesh K; Langer, Robert
2017-11-01
The principles of engineering and physics have been applied to oncology for nearly 50 years. Engineers and physical scientists have made contributions to all aspects of cancer biology, from quantitative understanding of tumour growth and progression to improved detection and treatment of cancer. Many early efforts focused on experimental and computational modelling of drug distribution, cell cycle kinetics and tumour growth dynamics. In the past decade, we have witnessed exponential growth at the interface of engineering, physics and oncology that has been fuelled by advances in fields including materials science, microfabrication, nanomedicine, microfluidics, imaging, and catalysed by new programmes at the National Institutes of Health (NIH), including the National Institute of Biomedical Imaging and Bioengineering (NIBIB), Physical Sciences in Oncology, and the National Cancer Institute (NCI) Alliance for Nanotechnology. Here, we review the advances made at the interface of engineering and physical sciences and oncology in four important areas: the physical microenvironment of the tumour; technological advances in drug delivery; cellular and molecular imaging; and microfluidics and microfabrication. We discuss the research advances, opportunities and challenges for integrating engineering and physical sciences with oncology to develop new methods to study, detect and treat cancer, and we also describe the future outlook for these emerging areas.
Large Eddy Simulation of High Reynolds Number Complex Flows
NASA Astrophysics Data System (ADS)
Verma, Aman
Marine configurations are subject to a variety of complex hydrodynamic phenomena affecting the overall performance of the vessel. The turbulent flow affects the hydrodynamic drag, propulsor performance and structural integrity, control-surface effectiveness, and acoustic signature of the marine vessel. Due to advances in massively parallel computers and numerical techniques, an unsteady numerical simulation methodology such as Large Eddy Simulation (LES) is well suited to study such complex turbulent flows, whose Reynolds numbers (Re) are typically on the order of 10^6. LES also promises increased accuracy over RANS-based methods in predicting unsteady phenomena such as cavitation and noise production. This dissertation develops the capability to enable LES of high Re flows in complex geometries (e.g. a marine vessel) on unstructured grids and provide physical insight into the turbulent flow. LES is performed to investigate the geometry-induced separated flow past a marine propeller attached to a hull, in an off-design condition called crashback. LES shows good quantitative agreement with experiments and provides a physical mechanism to explain the increase in side-force on the propeller blades below an advance ratio of J = -0.7. Fundamental developments in the dynamic subgrid-scale model for LES are pursued to improve the LES predictions, especially for complex flows on unstructured grids. A dynamic procedure is proposed to estimate a Lagrangian time scale based on a surrogate correlation without any adjustable parameter. The proposed model is applied to turbulent channel, cylinder and marine propeller flows and predicts improved results over other model variants due to a physically consistent Lagrangian time scale. A wall model is proposed for application to LES of high Reynolds number wall-bounded flows.
The wall model is formulated as the minimization of a generalized constraint in the dynamic model for LES and applied to LES of turbulent channel flow at various Reynolds numbers up to Reτ=10000 and coarse grid resolutions to obtain significant improvement.
Temporal and Location Based RFID Event Data Management and Processing
NASA Astrophysics Data System (ADS)
Wang, Fusheng; Liu, Peiya
Advances in sensor and RFID technology provide significant new power for humans to sense, understand and manage the world. RFID provides fast data collection with precise identification of objects with unique IDs without line of sight; thus it can be used for identifying, locating, tracking and monitoring physical objects. Despite these benefits, RFID poses many challenges for data processing and management. RFID data are temporal and history oriented, multi-dimensional, and carry implicit semantics. Moreover, RFID applications are heterogeneous. RFID data management or data warehouse systems need to support generic and expressive data modeling for tracking and monitoring physical objects, and provide automated data interpretation and processing. We develop a powerful temporal and location oriented data model for modeling and querying RFID data, and a declarative event- and rule-based framework for automated complex RFID event processing. The approach is general and can be easily adapted for different RFID-enabled applications, thus significantly reducing the cost of RFID data integration.
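A hypothetical sketch in the spirit of such a temporal, location-oriented model (not the authors' actual schema): raw tag reads are collapsed into "stay" records of the form (object, location, [start, end]), which is the natural shape for tracking and location-history queries. All names and the collapsing rule here are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Read:
    epc: str        # tag ID (Electronic Product Code)
    location: str   # reader location
    t: int          # read timestamp

def to_stays(reads):
    """Collapse time-ordered reads into (epc, location, start, end) stays."""
    stays = []
    for r in sorted(reads, key=lambda r: (r.epc, r.t)):
        if stays and stays[-1][0] == r.epc and stays[-1][1] == r.location:
            stays[-1][3] = r.t          # same tag, same place: extend stay
        else:
            stays.append([r.epc, r.location, r.t, r.t])
    return [tuple(s) for s in stays]

reads = [Read("tag1", "dock", 0), Read("tag1", "dock", 5),
         Read("tag1", "shelf", 9), Read("tag2", "dock", 2)]
stays = to_stays(reads)
# → [('tag1', 'dock', 0, 5), ('tag1', 'shelf', 9, 9), ('tag2', 'dock', 2, 2)]
```

Queries such as "where was tag1 at time 3?" then become simple interval lookups over the stay records instead of scans over raw reads.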
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-10
... ice area are linked in the IPCC climate models to GHG emissions by the physics of radiation processes... scenario), a model that is known for incorporating advanced sea ice physics, and for which snow data were...
Physics-based analysis and control of human snoring
NASA Astrophysics Data System (ADS)
Sanchez, Yaselly; Wang, Junshi; Han, Pan; Xi, Jinxiang; Dong, Haibo
2017-11-01
In order to advance the understanding of biological fluid dynamics and its effects on the acoustics of human snoring, the study pursued a physics-based computational approach. From human magnetic resonance image (MRI) scans, the researchers were able to develop both anatomically and dynamically accurate airway-uvula models. With airways defined as rigid, and the uvula defined as flexible, computational models were created with various pharynx thickness and geometries. In order to determine vortex shedding with prescribed uvula movement, the uvula fluctuation was categorized by its specific parameters: magnitude, frequency, and phase lag. Uvula vibration modes were based on one oscillation, or one harmonic frequency, and pressure probes were located in seven different positions throughout the airway-uvula model. By taking fast Fourier transforms (FFT) from the pressure probe data, it was seen that four harmonics were created throughout the simulation within one oscillation of uvula movement. Of the four harmonics, there were two pressure probes which maintained high amplitudes and led the researcher to believe that different vortices formed with different snoring frequencies. This work is supported by the NSF Grant CBET-1605434.
Developing model-making and model-breaking skills using direct measurement video-based activities
NASA Astrophysics Data System (ADS)
Vonk, Matthew; Bohacek, Peter; Militello, Cheryl; Iverson, Ellen
2017-12-01
This study focuses on student development of two important laboratory skills in the context of introductory college-level physics. The first skill, which we call model making, is the ability to analyze a phenomenon in a way that produces a quantitative multimodal model. The second skill, which we call model breaking, is the ability to critically evaluate if the behavior of a system is consistent with a given model. This study involved 116 introductory physics students in four different sections, each taught by a different instructor. All of the students within a given class section participated in the same instruction (including labs) with the exception of five activities performed throughout the semester. For those five activities, each class section was split into two groups; one group was scaffolded to focus on model-making skills and the other was scaffolded to focus on model-breaking skills. Both conditions involved direct measurement videos. In some cases, students could vary important experimental parameters within the video like mass, frequency, and tension. Data collected at the end of the semester indicate that students in the model-making treatment group significantly outperformed the other group on the model-making skill despite the fact that both groups shared a common physical lab experience. Likewise, the model-breaking treatment group significantly outperformed the other group on the model-breaking skill. This is important because it shows that direct measurement video-based instruction can help students acquire science-process skills, which are critical for scientists, and which are a key part of current science education approaches such as the Next Generation Science Standards and the Advanced Placement Physics 1 course.
DRS: Derivational Reasoning System
NASA Technical Reports Server (NTRS)
Bose, Bhaskar
1995-01-01
The high reliability requirements for airborne systems require fault-tolerant architectures to address failures in the presence of physical faults, and the elimination of design flaws during the specification and validation phase of the design cycle. Although much progress has been made in developing methods to address physical faults, design flaws remain a serious problem. Formal methods provide a mathematical basis for removing design flaws from digital systems. DRS (Derivational Reasoning System) is a formal design tool based on advanced research in mathematical modeling and formal synthesis. The system implements a basic design algebra for synthesizing digital circuit descriptions from high-level functional specifications. DRS incorporates an executable specification language, a set of correctness-preserving transformations, a verification interface, and a logic synthesis interface, making it a powerful tool for realizing hardware from abstract specifications. DRS integrates recent advances in transformational reasoning, automated theorem proving, and high-level CAD synthesis systems in order to provide enhanced reliability in designs with reduced time and cost.
Interdisciplinary cantilever physics: Elasticity of carrot, celery, and plasticware
NASA Astrophysics Data System (ADS)
Pestka, Kenneth A.
2014-05-01
This article presents several simple cantilever-based experiments using common household items (celery, carrot, and a plastic spoon) that are appropriate for introductory undergraduate laboratories or independent student projects. By applying Hooke's law and Euler beam theory, students are able to determine Young's modulus, fracture stress, yield stress, strain energy, and sound speed of these apparently disparate materials. In addition, a cellular foam elastic model is introduced—applicable to biologic materials as well as an essential component in the development of advanced engineering composites—that provides a mechanism to determine Young's modulus of the cell wall material found in celery and carrot. These experiments are designed to promote exploration of the similarities and differences between common inorganic and organic materials, fill a void in the typical undergraduate curriculum, and provide a foundation for more advanced material science pursuits within biology, botany, and food science as well as physics and engineering.
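The Euler-beam analysis underlying these experiments reduces to E = F L³ / (3 δ I) for a tip-loaded rectangular cantilever, with I = b h³ / 12. The numbers below are hypothetical, not taken from the article:

```python
def youngs_modulus_cantilever(force, length, deflection, width, thickness):
    """Euler-Bernoulli estimate: E = F L^3 / (3 * delta * I), I = b h^3 / 12."""
    I = width * thickness**3 / 12.0  # second moment of area (m^4)
    return force * length**3 / (3.0 * deflection * I)

# Hypothetical celery-stalk numbers: a 1 N tip load deflecting a 10 cm stalk
# with a 2 cm x 1 cm cross section by 5 mm.
E = youngs_modulus_cantilever(force=1.0, length=0.10, deflection=0.005,
                              width=0.02, thickness=0.01)
print(f"E = {E/1e6:.1f} MPa")  # -> E = 40.0 MPa
```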
Advanced graphical user interface for multi-physics simulations using AMST
NASA Astrophysics Data System (ADS)
Hoffmann, Florian; Vogel, Frank
2017-07-01
Numerical modelling of particulate matter has gained much popularity in recent decades. Advanced Multi-physics Simulation Technology (AMST) is a state-of-the-art three-dimensional numerical modelling technique combining the eXtended Discrete Element Method (XDEM) with Computational Fluid Dynamics (CFD) and Finite Element Analysis (FEA) [1]. One major limitation of this code has been the lack of a graphical user interface (GUI), meaning that all pre-processing had to be done directly in an HDF5 file. This contribution presents the first graphical pre-processor developed for AMST.
NASA Astrophysics Data System (ADS)
Liao, Z.; Hong, Y.; Kirschbaum, D. B.; Fukuoka, H.; Sassa, K.; Karnawati, D.; Fathani, F.
2010-12-01
Recent advancements in the availability of remotely sensed datasets provide an opportunity to advance the predictability of rainfall-triggered landslides at larger spatial scales. An early-warning system based on a physical landslide model and remote sensing information is used to simulate the dynamical response of the soil water content to the spatiotemporal variability of rainfall in complex terrain. The system utilizes geomorphologic datasets including a 30-meter ASTER DEM, a 1-km downscaled FAO soil map, and satellite-based Tropical Rainfall Measuring Mission (TRMM) precipitation. The applied physical model SLIDE (SLope-Infiltration-Distributed Equilibrium) defines a direct relationship between a factor of safety and the rainfall depth on an infinite slope. This prototype model is applied to a case study in Honduras during Hurricane Mitch in 1998 and a secondary case of typhoon-induced shallow landslides over Java Island, Indonesia. In Honduras, two study areas were selected which cover approximately 1,200 square kilometers and where a high density of shallow landslides occurred. The results were quantitatively evaluated using landslide inventory data compiled by the United States Geological Survey (USGS) following Hurricane Mitch, and show good agreement between the modeling results and observations. The success rate for accurately estimating slope-failure locations reached 78% and 75%, while the error indices were 35% and 49%, respectively, for the two selected study areas. Advantages and limitations of this application are discussed with respect to future assessment and the challenges of performing slope-stability estimation using coarse data over 1,200 square kilometers. In Indonesia, the system has been applied over the whole of Java Island. The prototype early-warning system has been enhanced by integrating susceptibility mapping and a precipitation forecasting model (the Weather Research and Forecasting model). The performance has been evaluated using a local landslide inventory, and results show that the system successfully predicted landslides at times corresponding to the occurrence of the real landslide events.
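The abstract does not reproduce SLIDE's exact equations; as a hedged sketch, the textbook infinite-slope factor of safety that such models build on can be computed as follows (the soil parameters below are hypothetical, not from the study):

```python
import math

def infinite_slope_fs(c, phi_deg, gamma, gamma_w, H, m, beta_deg):
    """Textbook infinite-slope factor of safety with saturated soil fraction m.

    FS = [c' + (gamma*H - m*gamma_w*H) * cos^2(beta) * tan(phi')]
         / (gamma * H * sin(beta) * cos(beta))
    """
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    resisting = c + (gamma * H - m * gamma_w * H) * math.cos(beta)**2 * math.tan(phi)
    driving = gamma * H * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Hypothetical hillslope: 2 m soil column on a 30 degree slope, phi' = 32 deg,
# c' = 5 kPa, unit weights in N/m^3. Rainfall raises m (saturated fraction).
dry = infinite_slope_fs(c=5e3, phi_deg=32, gamma=18e3, gamma_w=9.81e3,
                        H=2.0, m=0.0, beta_deg=30.0)
wet = infinite_slope_fs(c=5e3, phi_deg=32, gamma=18e3, gamma_w=9.81e3,
                        H=2.0, m=1.0, beta_deg=30.0)
print(f"FS dry = {dry:.2f}, FS saturated = {wet:.2f}")  # saturation lowers FS
```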
Kris-Etherton, Penny M; Akabas, Sharon R; Bales, Connie W; Bistrian, Bruce; Braun, Lynne; Edwards, Marilyn S; Laur, Celia; Lenders, Carine M; Levy, Matthew D; Palmer, Carole A; Pratt, Charlotte A; Ray, Sumantra; Rock, Cheryl L; Saltzman, Edward; Seidner, Douglas L; Van Horn, Linda
2014-01-01
Nutrition is a recognized determinant in 3 (ie, diseases of the heart, malignant neoplasms, cerebrovascular diseases) of the top 4 leading causes of death in the United States. However, many health care providers are not adequately trained to address lifestyle recommendations that include nutrition and physical activity behaviors in a manner that could mitigate disease development or progression. This contributes to a compelling need to markedly improve nutrition education for health care professionals and to establish curricular standards and requisite nutrition and physical activity competencies in the education, training, and continuing education for health care professionals. This article reports the present status of nutrition and physical activity education for health care professionals, evaluates the current pedagogic models, and underscores the urgent need to realign and synergize these models to reflect evidence-based and outcomes-focused education. PMID:24717343
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mara, Nathan Allan; Bronkhorst, Curt Allan; Beyerlein, Irene Jane
2015-12-21
The intent of this research effort is to test the hypothesis that, through the employment of controlled processing parameters based upon integrated advanced material characterization and multi-physics material modeling, bulk nanolayered composites can be designed to contain high densities of preferred interfaces that serve as supersinks for the defects responsible for premature damage and failure.
Optimizing zonal advection of the Advanced Research WRF (ARW) dynamics for Intel MIC
NASA Astrophysics Data System (ADS)
Mielikainen, Jarno; Huang, Bormin; Huang, Allen H.
2014-10-01
The Weather Research and Forecasting (WRF) model is the most widely used community weather forecast and research model in the world. There are two distinct varieties of WRF. The Advanced Research WRF (ARW) is an experimental, advanced research version featuring very high resolution, while the WRF Nonhydrostatic Mesoscale Model (WRF-NMM) has been designed for forecasting operations. WRF consists of dynamics code and several physics modules. The WRF-ARW core is based on an Eulerian solver for the fully compressible nonhydrostatic equations. In this paper, we use the Intel Many Integrated Core (MIC) architecture to substantially increase the performance of a zonal advection subroutine, one of the most time-consuming routines in the ARW dynamics core. Advection advances the explicit perturbation horizontal momentum equations by adding in the large-timestep tendency along with the small-timestep pressure gradient tendency. We describe the challenges we met during the development of a high-speed dynamics code subroutine for the MIC architecture and discuss lessons learned from the code optimization process. The results show that the optimizations improved performance of the original code on the Xeon Phi 5110P by a factor of 2.4x.
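As a schematic illustration only (this is not WRF code), the following shows the kind of stencil update an advection routine performs, written as a single vectorized expression over the grid, the pattern that SIMD and MIC optimization targets:

```python
import numpy as np

def upwind_advect(u, courant):
    """One explicit upwind step for du/dt + c du/dx = 0 (periodic, c > 0).

    courant = c * dt / dx must satisfy courant <= 1 for stability.
    The whole-grid expression vectorizes cleanly, unlike an element loop.
    """
    return u - courant * (u - np.roll(u, 1))

# With courant == 1 the scheme shifts the field exactly one cell per step.
u0 = np.zeros(8)
u0[2] = 1.0
u1 = upwind_advect(u0, courant=1.0)
print(int(np.argmax(u1)))  # pulse has moved from cell 2 to cell 3 -> prints 3
```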
NASA Astrophysics Data System (ADS)
Mielikainen, Jarno; Huang, Bormin; Huang, Allen H.-L.
2015-05-01
The most widely used community weather forecast and research model in the world is the Weather Research and Forecasting (WRF) model. Two distinct varieties of WRF exist. The one we are interested in, the Advanced Research WRF (ARW), is an experimental, advanced research version featuring very high resolution; the WRF Nonhydrostatic Mesoscale Model (WRF-NMM) has been designed for forecasting operations. WRF consists of dynamics code and several physics modules. The WRF-ARW core is based on an Eulerian solver for the fully compressible nonhydrostatic equations. In this paper, we optimize a meridional (north-south direction) advection subroutine for the Intel Xeon Phi coprocessor. Advection is one of the most time-consuming routines in the ARW dynamics core. It advances the explicit perturbation horizontal momentum equations by adding in the large-timestep tendency along with the small-timestep pressure gradient tendency. We describe the challenges we met during the development of a high-speed dynamics code subroutine for the MIC architecture and discuss lessons learned from the code optimization process. The results show that the optimizations improved performance of the original code on the Xeon Phi 7120P by a factor of 1.2x.
Advancing our profession: are higher educational standards the answer?
Boyleston, Erin S; Collins, Marie A
2012-01-01
Educational models in the health care professions have changed drastically since the era of on-the-job training. The purpose of this manuscript was to investigate how the professions of physical therapy, occupational therapy, physician assistant practice, nursing, and respiratory therapy have advanced their educational models for entry into practice, and to recommend how dental hygiene can integrate similar models to advance the profession. The recommendations are to create an accreditation council for dental hygiene education and to mandate articulation agreements for baccalaureate degree completion in developing and existing programs. Dental hygiene must continue on the path to advance our profession and glean lessons from other health professions.
Validating a Model for Welding Induced Residual Stress Using High-Energy X-ray Diffraction
NASA Astrophysics Data System (ADS)
Mach, J. C.; Budrow, C. J.; Pagan, D. C.; Ruff, J. P. C.; Park, J.-S.; Okasinski, J.; Beaudoin, A. J.; Miller, M. P.
2017-05-01
Integrated computational materials engineering (ICME) provides a pathway to advance performance in structures through the use of physically-based models to better understand how manufacturing processes influence product performance. As one particular challenge, consider that residual stresses induced in fabrication are pervasive and directly impact the life of structures. For ICME to be an effective strategy, it is essential that predictive capability be developed in conjunction with critical experiments. In the present work, simulation results from a multi-physics model for gas metal arc welding are evaluated through x-ray diffraction using synchrotron radiation. A test component was designed with intent to develop significant gradients in residual stress, be representative of real-world engineering application, yet remain tractable for finely spaced strain measurements with positioning equipment available at synchrotron facilities. The experimental validation lends confidence to model predictions, facilitating the explicit consideration of residual stress distribution in prediction of fatigue life.
Advanced Methodologies for NASA Science Missions
NASA Astrophysics Data System (ADS)
Hurlburt, N. E.; Feigelson, E.; Mentzel, C.
2017-12-01
Most of NASA's commitment to computational space science involves the organization and processing of Big Data from space-based satellites, and the calculation of advanced physical models based on these datasets. But considerable thought is also needed about which computations are required. The science questions addressed by space data are so diverse and complex that traditional analysis procedures are often inadequate. The knowledge and skills of the statistician, applied mathematician, and algorithmic computer scientist must be incorporated into programs that currently emphasize engineering and physical science. NASA's culture and administrative mechanisms take full cognizance of the fact that major advances in space science are driven by improvements in instrumentation. But it is less well recognized that new instruments and science questions give rise to new challenges in the treatment of satellite data after it is telemetered to the ground. These issues might be divided into two stages: data reduction through software pipelines developed within NASA mission centers, and science analysis performed by hundreds of space scientists dispersed through NASA, U.S. universities, and abroad. Both stages benefit from the latest statistical and computational methods; in some cases, the science result is completely inaccessible using traditional procedures. This paper will review the current state of NASA and present example applications using modern methodologies.
Advanced analytical modeling of double-gate Tunnel-FETs - A performance evaluation
NASA Astrophysics Data System (ADS)
Graef, Michael; Hosenfeld, Fabian; Horst, Fabian; Farokhnejad, Atieh; Hain, Franziska; Iñíguez, Benjamín; Kloes, Alexander
2018-03-01
The Tunnel-FET is one of the most promising candidates to succeed the standard MOSFET due to its alternative current-transport mechanism, which allows a smaller subthreshold slope than the MOSFET's physical limit of 60 mV/dec. Recently fabricated devices already show smaller slopes, but mostly not over multiple decades of the current transfer characteristics. In this paper, the performance-limiting effects occurring during the fabrication process of the device, such as doping profiles and midgap traps, are analyzed with physics-based analytical models, and their impact on performance is determined. Additionally, performance-enhancing possibilities, such as hetero-structures and ambipolarity improvements, are introduced and discussed. An extensive double-gate n-Tunnel-FET model is presented, which meets the versatile device requirements and shows a good fit with TCAD simulations and measurement data.
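The 60 mV/dec figure quoted above follows from the Boltzmann tail of the carrier distribution, SS = ln(10) kT/q. A quick numerical check:

```python
import math

k = 1.380649e-23    # Boltzmann constant, J/K
q = 1.602176634e-19  # elementary charge, C

def thermionic_ss_mv_per_dec(T):
    """Minimum MOSFET subthreshold swing SS = ln(10) * k*T/q, in mV/decade."""
    return math.log(10) * k * T / q * 1e3

print(f"{thermionic_ss_mv_per_dec(300):.1f} mV/dec")  # -> 59.5 mV/dec at 300 K
```

This is why sub-60 mV/dec operation requires a different transport mechanism (band-to-band tunneling) rather than thermionic injection.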
Advances in visual representation of molecular potentials.
Du, Qi-Shi; Huang, Ri-Bo; Chou, Kuo-Chen
2010-06-01
The recent advances in visual representations of molecular properties in 3D space are summarized, and their applications in molecular modeling studies and rational drug design are introduced. The visual representation methods provide us with detailed insights into protein-ligand interactions, and hence can play a major role in elucidating the structure or reactivity of a biomolecular system. Three newly developed computation and visualization methods for studying the physical and chemical properties of molecules are introduced, covering their electrostatic potential, lipophilicity potential, and excess chemical potential. The newest application examples of visual representations in structure-based rational drug design are presented. The 3D electrostatic potentials, calculated using the empirical method (EM-ESP), in which the classical Coulomb equation and traditional atomic partial charges are discarded, are highly consistent with the results of higher-level quantum chemical methods. The 3D lipophilicity potentials, computed by the heuristic molecular lipophilicity potential method based on the principles of quantum mechanics and statistical mechanics, are more accurate and reliable than those obtained using traditional empirical methods. The 3D excess chemical potentials, derived by the reference interaction site model-hypernetted chain theory, provide a new tool for computational chemistry and molecular modeling. For structure-based drug design, visual representations of molecular properties will play a significant role in practical applications. It is anticipated that the new advances in computational chemistry will stimulate the development of molecular modeling methods, further enriching the visual representation techniques for rational drug design, as well as other relevant fields in life science.
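For context, the classical Coulomb electrostatic potential that EM-ESP discards is simple to state; a minimal sketch in atomic units with a toy (hypothetical) charge set:

```python
import math

def coulomb_esp(point, charges):
    """Classical point-charge electrostatic potential in atomic units:
    V(r) = sum_i q_i / |r - r_i|, the baseline that EM-ESP refines."""
    v = 0.0
    for (x, y, z, q) in charges:
        v += q / math.dist(point, (x, y, z))
    return v

# Toy dipole: +0.4 e and -0.4 e separated by 2 bohr along x.
dipole = [(-1.0, 0.0, 0.0, +0.4), (1.0, 0.0, 0.0, -0.4)]
# Probe at (3, 0, 0): distances 4 and 2 bohr, so V = 0.4/4 - 0.4/2 = -0.1.
print(round(coulomb_esp((3.0, 0.0, 0.0), dipole), 4))
```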
Sleep Quality Prediction From Wearable Data Using Deep Learning.
Sathyanarayana, Aarti; Joty, Shafiq; Fernandez-Luque, Luis; Ofli, Ferda; Srivastava, Jaideep; Elmagarmid, Ahmed; Arora, Teresa; Taheri, Shahrad
2016-11-04
The importance of sleep is paramount to health. Insufficient sleep can reduce physical, emotional, and mental well-being and can lead to a multitude of health complications among people with chronic conditions. Physical activity and sleep are highly interrelated health behaviors. Our physical activity during the day (ie, awake time) influences our quality of sleep, and vice versa. The current popularity of wearables for tracking physical activity and sleep, including actigraphy devices, can foster the development of new advanced data analytics. This can help to develop new electronic health (eHealth) applications and provide more insights into sleep science. The objective of this study was to evaluate the feasibility of predicting sleep quality (ie, poor or adequate sleep efficiency) given the physical activity wearable data during awake time. In this study, we focused on predicting good or poor sleep efficiency as an indicator of sleep quality. Actigraphy sensors are wearable medical devices used to study sleep and physical activity patterns. The dataset used in our experiments contained the complete actigraphy data from a subset of 92 adolescents over 1 full week. Physical activity data during awake time was used to create predictive models for sleep quality, in particular, poor or good sleep efficiency. The physical activity data from sleep time was used for the evaluation. We compared the predictive performance of traditional logistic regression with more advanced deep learning methods: multilayer perceptron (MLP), convolutional neural network (CNN), simple Elman-type recurrent neural network (RNN), long short-term memory (LSTM-RNN), and a time-batched version of LSTM-RNN (TB-LSTM). Deep learning models were able to predict the quality of sleep (ie, poor or good sleep efficiency) based on wearable data from awake periods. More specifically, the deep learning methods performed better than traditional logistic regression. 
CNN had the highest specificity and sensitivity, and an overall area under the receiver operating characteristic (ROC) curve (AUC) of 0.9449, which was 46% better than traditional logistic regression (0.6463). Deep learning methods can predict the quality of sleep based on actigraphy data from awake periods. These predictive models can be an important tool for sleep research and for improving eHealth solutions for sleep. ©Aarti Sathyanarayana, Shafiq Joty, Luis Fernandez-Luque, Ferda Ofli, Jaideep Srivastava, Ahmed Elmagarmid, Teresa Arora, Shahrad Taheri. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 04.11.2016.
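A minimal sketch of the logistic-regression baseline and the rank-based AUC metric the study reports against, trained on synthetic stand-in features rather than actigraphy data:

```python
import numpy as np

def train_logreg(X, y, lr=0.1, epochs=500):
    """Plain gradient-descent logistic regression (the study's baseline model)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probabilities
        grad = p - y                            # gradient of the log-loss
        w -= lr * X.T @ grad / len(y)
        b -= lr * grad.mean()
    return w, b

def auc(scores, y):
    """Area under the ROC curve via the rank (Mann-Whitney) formulation."""
    pos, neg = scores[y == 1], scores[y == 0]
    return (pos[:, None] > neg[None, :]).mean()

# Synthetic "daytime activity" features for good (1) vs poor (0) sleepers.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(1.0, 1.0, (100, 3)), rng.normal(-1.0, 1.0, (100, 3))])
y = np.concatenate([np.ones(100), np.zeros(100)])
w, b = train_logreg(X, y)
print(f"AUC = {auc(X @ w + b, y):.3f}")  # well above chance on separable data
```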
Advanced Wide-Field Interferometric Microscopy for Nanoparticle Sensing and Characterization
NASA Astrophysics Data System (ADS)
Avci, Oguzhan
Nanoparticles have a key role in today's biotechnological research owing to the rapid advancement of nanotechnology. While metallic, polymer, and semiconductor based artificial nanoparticles are widely used as labels or targeted drug delivery agents, labeled and label-free detection of natural nanoparticles promise new ways for viral diagnostics and therapeutic applications. The increasing impact of nanoparticles in bio- and nano-technology necessitates the development of advanced tools for their accurate detection and characterization. Optical microscopy techniques have been an essential part of research for visualizing micron-scale particles. However, when it comes to the visualization of individual nano-scale particles, they have shown inadequate success due to the resolution and visibility limitations. Interferometric microscopy techniques have gained significant attention for providing means to overcome the nanoparticle visibility issue that is often the limiting factor in the imaging techniques based solely on the scattered light. In this dissertation, we develop a rigorous physical model to simulate the single nanoparticle optical response in a common-path wide-field interferometric microscopy (WIM) system. While the fundamental elements of the model can be used to analyze nanoparticle response in any generic wide-field imaging systems, we focus on imaging with a layered substrate (common-path interferometer) where specular reflection of illumination provides the reference light for interferometry. A robust physical model is quintessential in realizing the full potential of an optical system, and throughout this dissertation, we make use of it to benchmark our experimental findings, investigate the utility of various optical configurations, reconstruct weakly scattering nanoparticle images, as well as to characterize and discriminate interferometric nanoparticle responses. 
This study investigates the integration of advanced optical schemes in WIM with two main goals in mind: (i) increasing the visibility of low-index nanoscale particles via pupil-function engineering, pushing the limit of sensitivity; and (ii) improving the resolution of sub-diffraction-limited, low-index particle images in WIM via reconstruction strategies for shape and orientation information. We successfully demonstrate an overall ten-fold improvement in the visibility of low-index sub-wavelength nanoparticles as well as up to two-fold extended spatial resolution of the interference-enhanced nanoparticle images. We also systematically examine the key factors that determine the signal in WIM: particle type and size, layered-substrate design, defocus, and nanoparticle polarizability. We use the physical model to demonstrate how these factors set signal levels, how the layered substrate can be designed to optimize the overall signal, how a defocus scan can be used to maximize it, and how the defocus signature can be utilized for particle discrimination for both dielectric and resonant metallic particles. Finally, we introduce a machine-learning-based particle characterization algorithm that relies on supervised learning from the model; within the scope of this dissertation, the characterization is limited to discrimination based on nanosphere size and type.
Scruggs, Stacie; Mama, Scherezade K; Carmack, Cindy L; Douglas, Tommy; Diamond, Pamela; Basen-Engquist, Karen
2018-01-01
This study examined whether a physical activity intervention affects transtheoretical model (TTM) variables that facilitate exercise adoption in breast cancer survivors. Sixty sedentary breast cancer survivors were randomized to a 6-month lifestyle physical activity intervention or standard care. TTM variables that have been shown to facilitate exercise adoption and progress through the stages of change, including self-efficacy, decisional balance, and processes of change, were measured at baseline, 3 months, and 6 months. Differences in TTM variables between groups were tested using repeated measures analysis of variance. The intervention group had significantly higher self-efficacy (F = 9.55, p = .003) and perceived significantly fewer cons of exercise (F = 5.416, p = .025) at 3 and 6 months compared with the standard care group. Self-liberation, counterconditioning, and reinforcement management processes of change increased significantly from baseline to 6 months in the intervention group, and self-efficacy and reinforcement management were significantly associated with improvement in stage of change. The stage-based physical activity intervention increased use of select processes of change, improved self-efficacy, decreased perceptions of the cons of exercise, and helped participants advance in stage of change. These results point to the importance of using a theory-based approach in interventions to increase physical activity in cancer survivors.
Source characterization of underground explosions from hydrodynamic-to-elastic coupling simulations
NASA Astrophysics Data System (ADS)
Chiang, A.; Pitarka, A.; Ford, S. R.; Ezzedine, S. M.; Vorobiev, O.
2017-12-01
A major improvement in ground motion simulation capabilities for underground explosion monitoring during the first phase of the Source Physics Experiment (SPE) is the development of a wave propagation solver that can propagate explosion-generated non-linear near-field ground motions to the far field. The calculation is done using a hybrid modeling approach with one-way hydrodynamic-to-elastic coupling in three dimensions, where near-field motions are computed using GEODYN-L, a Lagrangian hydrodynamics code, and then passed to WPP, an elastic finite-difference code for seismic waveform modeling. The advancement in ground motion simulation capabilities gives us the opportunity to assess moment tensor inversion of a realistic volumetric source with near-field effects in a controlled setting, where we can evaluate the recovered source properties as a function of modeling parameters (i.e., the velocity model) and gain insights into previous source studies on SPE Phase I chemical shots and other historical nuclear explosions. For example, moment tensor inversion of far-field SPE seismic data demonstrated that, while vertical motions are well modeled using existing velocity models, large misfits persist in predicting tangential shear-wave motions from explosions. One possible explanation we can explore is errors and uncertainties in the underlying Earth model. Here we investigate the recovered moment tensor solution, particularly the non-volumetric component, by inverting far-field ground motions simulated from physics-based explosion source models in fractured material, where the physics-based source models are based on the modeling of SPE-4P, SPE-5, and SPE-6 near-field data. The hybrid modeling approach provides new prospects for modeling the explosion source and understanding the uncertainties associated with it.
A holistic approach to movement education in sport and fitness: a systems based model.
Polsgrove, Myles Jay
2012-01-01
The typical model used by movement professionals to enhance performance relies on the notion that a linear increase in load results in steady and progressive gains, whereby the greater the effort, the greater the gains in performance. Traditional approaches to movement progression typically rely on the proper sequencing of extrinsically based activities to facilitate the individual in reaching performance objectives. However, physical rehabilitation and physical performance rarely progress in such a linear fashion; instead, they tend to evolve non-linearly and rather unpredictably. A dynamic system can be described as an entity that self-organizes into increasingly complex forms. Applying this view to the human body, practitioners could facilitate non-linear performance gains through a systems-based programming approach. Utilizing a dynamic systems view, the Holistic Approach to Movement Education (HADME) is a model designed to optimize performance by accounting for the non-linear and self-organizing traits associated with human movement. In this model, gains in performance occur through advancing individual perspectives and through optimizing sub-system performance. This inward shift of the focus of performance creates a sharper self-awareness and may lead to more optimal movements. Copyright © 2011 Elsevier Ltd. All rights reserved.
The physical basis and future of radiation therapy.
Bortfeld, T; Jeraj, R
2011-06-01
The remarkable progress in radiation therapy over the last century has been largely due to our ability to more effectively focus and deliver radiation to the tumour target volume. Physics discoveries and technology inventions have been an important driving force behind this progress. However, there is still plenty of room left for future improvements through physics, for example image guidance and four-dimensional motion management and particle therapy, as well as increased efficiency of more compact and cheaper technologies. Bigger challenges lie ahead of physicists in radiation therapy beyond the dose localisation problem, for example in the areas of biological target definition, improved modelling for normal tissues and tumours, advanced multicriteria and robust optimisation, and continuous incorporation of advanced technologies such as molecular imaging. The success of physics in radiation therapy has been based on the continued "fuelling" of the field with new discoveries and inventions from physics research. A key to the success has been the application of the rigorous scientific method. In spite of the importance of physics research for radiation therapy, too few physicists are currently involved in cutting-edge research. The increased emphasis on more "professionalism" in medical physics will tip the situation even more off balance. To prevent this from happening, we argue that medical physics needs more research positions, and more and better academic programmes. Only with more emphasis on medical physics research will the future of radiation therapy and other physics-related medical specialties look as bright as the past, and medical physics will maintain a status as one of the most exciting fields of applied physics.
NASA Astrophysics Data System (ADS)
Vihma, T.; Pirazzini, R.; Fer, I.; Renfrew, I. A.; Sedlar, J.; Tjernström, M.; Lüpkes, C.; Nygård, T.; Notz, D.; Weiss, J.; Marsan, D.; Cheng, B.; Birnbaum, G.; Gerland, S.; Chechin, D.; Gascard, J. C.
2014-09-01
The Arctic climate system includes numerous highly interactive small-scale physical processes in the atmosphere, sea ice, and ocean. During and since the International Polar Year 2007-2009, significant advances have been made in understanding these processes. Here, these recent advances are reviewed, synthesized, and discussed. In atmospheric physics, the primary advances have been in cloud physics, radiative transfer, mesoscale cyclones, coastal, and fjordic processes as well as in boundary layer processes and surface fluxes. In sea ice and its snow cover, advances have been made in understanding of the surface albedo and its relationships with snow properties, the internal structure of sea ice, the heat and salt transfer in ice, the formation of superimposed ice and snow ice, and the small-scale dynamics of sea ice. For the ocean, significant advances have been related to exchange processes at the ice-ocean interface, diapycnal mixing, double-diffusive convection, tidal currents and diurnal resonance. Despite this recent progress, some of these small-scale physical processes are still not sufficiently understood: these include wave-turbulence interactions in the atmosphere and ocean, the exchange of heat and salt at the ice-ocean interface, and the mechanical weakening of sea ice. Many other processes are reasonably well understood as stand-alone processes but the challenge is to understand their interactions with and impacts and feedbacks on other processes. Uncertainty in the parameterization of small-scale processes continues to be among the greatest challenges facing climate modelling, particularly in high latitudes. Further improvements in parameterization require new year-round field campaigns on the Arctic sea ice, closely combined with satellite remote sensing studies and numerical model experiments.
Plasmonics of 2D Nanomaterials: Properties and Applications
Li, Yu; Li, Ziwei; Chi, Cheng; Shan, Hangyong; Zheng, Liheng
2017-01-01
Plasmonics has developed for decades within condensed matter physics and optics. Based on classical Maxwell theory, collective excitations exhibit profound light-matter interaction properties beyond classical physics in many material systems. With the development of nanofabrication and characterization technology, ultra-thin two-dimensional (2D) nanomaterials have attracted tremendous interest and show exceptional plasmonic properties. Here, we elaborate the advanced optical properties of 2D materials, especially graphene and monolayer molybdenum disulfide (MoS2), review the plasmonic properties of graphene, and discuss the coupling effect in hybrid 2D nanomaterials. Then, plasmonic tuning methods for 2D nanomaterials are presented, from theoretical models to experimental investigations. Furthermore, we discuss potential applications in photocatalysis, photovoltaics and photodetection and, based on the development of 2D nanomaterials, offer a prospect for future theoretical physics and practical applications.
Advanced diffusion MRI and biomarkers in the central nervous system: a new approach.
Martín Noguerol, T; Martínez Barbero, J P
The introduction of diffusion-weighted sequences has revolutionized the detection and characterization of central nervous system (CNS) disease. Nevertheless, the assessment of diffusion studies of the CNS is often limited to qualitative estimation. Moreover, the pathophysiological complexity of the different entities that affect the CNS cannot always be correctly explained through classical models. The development of new models for the analysis of diffusion sequences provides numerous parameters that enable a quantitative approach to diagnosis and prognosis as well as to monitoring the response to treatment; these parameters can be considered potential biomarkers of health and disease. In this update, we review the physical bases underlying diffusion studies and diffusion tensor imaging, advanced models for their analysis (intravoxel incoherent motion and kurtosis), and the biological significance of the derived parameters.
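The quantitative core of such diffusion analyses can be illustrated with the simplest mono-exponential decay model, the baseline on which advanced models such as IVIM and kurtosis add further terms. The sketch below uses synthetic values, not data from the article:

```python
import numpy as np

# Illustrative sketch (synthetic values, not from the article): the
# mono-exponential diffusion model S(b) = S0 * exp(-b * ADC), fitted by
# linear regression on the log-signal to recover the apparent diffusion
# coefficient (ADC). Kurtosis models add a b^2 term; IVIM adds a fast
# pseudo-diffusion compartment.
b = np.array([0.0, 500.0, 1000.0])       # b-values in s/mm^2
adc_true = 0.8e-3                        # mm^2/s, a typical tissue value
s = 100.0 * np.exp(-b * adc_true)        # synthetic signal decay

# ln S = ln S0 - b * ADC, so the fitted slope is -ADC.
slope, intercept = np.polyfit(b, np.log(s), 1)
adc_fit = -slope
```

With noise-free synthetic data the fit recovers the ADC exactly; in practice the same regression is run voxel-wise on noisy acquisitions at several b-values.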
USDA-ARS?s Scientific Manuscript database
Progress in the understanding of physical, chemical, and biological processes influencing water quality, coupled with advancements in the collection and analysis of hydrologic data, provide opportunities for significant innovations in the manner and level with which watershed-scale processes may be ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen, Chen; Gupta, Vipul; Huang, Shenyan
The goal of this project is to model long-term creep performance for nickel-base superalloy weldments in high-temperature power generation systems. The project uses physics-based modeling methodologies and algorithms for predicting alloy properties in heterogeneous material structures. The modeling methodology will be demonstrated on a gas turbine combustor liner weldment of Haynes 282 precipitate-strengthened nickel-base superalloy. The major developments are: (1) microstructure-property relationships under creep conditions and microstructure characterization; (2) modeling of the inhomogeneous microstructure in the superalloy weld; (3) modeling of mesoscale plastic deformation in the superalloy weld; and (4) a constitutive creep model that accounts for weld and base metal microstructure and their long-term evolution. The developed modeling technology aims to provide a more efficient and accurate assessment of a material's long-term performance compared with current testing and extrapolation methods. This modeling technology will also accelerate the development and qualification of new materials in advanced power generation systems. This document is a final technical report for the project, covering efforts conducted from October 2014 to December 2016.
Study on miss distance based on projectile shock wave sensor
NASA Astrophysics Data System (ADS)
Gu, Guohua; Cheng, Gang; Zhang, Chenjun; Zhou, Lei
2017-05-01
The paper establishes miss-distance models based on the physical characteristics of shock waves. Aerodynamic theory shows that the shock wave of a supersonic projectile is generated by the projectile compressing and expanding the ambient atmosphere. The method derives miss distance from the arrival intervals at the sensors that first detect the shock wave, addressing problems such as noise filtering against a severe background, dynamic processing of amplified vibration signals, and electromagnetic compatibility, in order to improve the precision and reliability of capturing the N-wave signal. For the first time, the system can automatically identify projectile and firing-unit types and measure miss distance and azimuth as projectiles are fired. Applications show that its tactical and technical performance is internationally advanced.
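The abstract does not give the paper's actual model; as a purely illustrative first step of any such sensor-interval approach, a locally planar shock front crossing a two-sensor baseline yields an incidence angle from the arrival-time difference, and the Mach cone half-angle follows from the projectile speed (all names, parameters, and the planar-front assumption below are ours, not the paper's):

```python
import math

# Hypothetical geometry sketch, not the paper's model: the incidence angle
# of a (locally planar) shock front on a two-sensor baseline, recovered
# from the arrival-time difference, plus the Mach cone half-angle.
def incidence_angle(dt, baseline, c=340.0):
    """Angle (rad) between the front normal and the baseline normal.

    dt: arrival-time difference in seconds; baseline: sensor spacing in m;
    c: assumed constant propagation speed in m/s.
    """
    s = c * dt / baseline
    if abs(s) > 1.0:
        raise ValueError("dt inconsistent with baseline and assumed speed")
    return math.asin(s)

def mach_angle(mach):
    """Half-angle (rad) of the Mach cone trailing a supersonic projectile."""
    return math.asin(1.0 / mach)

# Example: a 0.1 ms arrival interval across a 0.5 m baseline.
theta = incidence_angle(1e-4, 0.5)
mu = mach_angle(2.0)  # cone half-angle for a Mach 2 projectile
```

Combining such angles from several baselines with the cone geometry is what would ultimately yield a miss-distance and azimuth estimate; the details depend on the array layout.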
Coupling biology and oceanography in models.
Fennel, W; Neumann, T
2001-08-01
The dynamics of marine ecosystems, i.e. the changes of observable chemical-biological quantities in space and time, are driven by biological and physical processes. Predictions of future developments of marine systems need a theoretical framework, i.e. models, solidly based on research and understanding of the different processes involved. The natural way to describe marine systems theoretically seems to be the embedding of chemical-biological models into circulation models. However, while circulation models are relatively advanced the quantitative theoretical description of chemical-biological processes lags behind. This paper discusses some of the approaches and problems in the development of consistent theories and indicates the beneficial potential of the coupling of marine biology and oceanography in models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L.; Prescott, Steven; Coleman, Justin
This report describes the current progress and status related to the Industry Application #2 focusing on External Hazards. For this industry application within the Light Water Reactor Sustainability (LWRS) Program Risk-Informed Safety Margin Characterization (RISMC) R&D Pathway, we will create the Risk-Informed Margin Management (RIMM) approach to represent meaningful (i.e., realistic facility representation) event scenarios and consequences by using an advanced 3D facility representation that will evaluate external hazards such as flooding and earthquakes in order to identify, model and analyze the appropriate physics that needs to be included to determine plant vulnerabilities related to external events; manage the communication and interactions between different physics modeling and analysis technologies; and develop the computational infrastructure through tools related to plant representation, scenario depiction, and physics prediction. One of the unique aspects of the RISMC approach is how it couples probabilistic approaches (the scenario) with mechanistic phenomena representation (the physics) through simulation. This simulation-based modeling allows decision makers to focus on a variety of safety, performance, or economic metrics. In this report, we describe the evaluation of various physics toolkits related to flooding representation. Ultimately, we will be coupling the flooding representation with other events such as earthquakes in order to provide coupled physics analysis for scenarios where interactions exist.
Advanced teaching labs in physics - celebrating progress; challenges ahead
NASA Astrophysics Data System (ADS)
Peterson, Richard
A few examples of optical physics experiments may help us first reflect on significant progress on how advanced lab initiatives may now be more effectively developed, discussed, and disseminated - as opposed to only 10 or 15 years back. Many cooperative developments of the last decade are having profound impacts on advanced lab workers and students. Central to these changes are the programs of the Advanced Laboratory Physics Association (ALPhA) (Immersions, BFY conferences), AAPT (advlab-l server, ComPADRE, apparatus competitions, summer workshops/sessions), APS (Reichert Award, FEd activities and sessions), and the Jonathan F. Reichert Foundation (ALPhA support and institution matched equipment grants for Immersion participants). Broad NSF support has helped undergird several of these initiatives. Two of the most significant challenges before this new advanced lab community are (a) to somehow enhance funding opportunities for teaching equipment and apparatus in an era of minimal NSF equipment support, and (b) to help develop a more complementary relationship between research-based advanced lab pedagogies and the development of fresh physics experiments that help enable the mentoring and experimental challenge of our students.
iPSC-based drug screening for Huntington's disease.
Zhang, Ningzhe; Bailus, Barbara J; Ring, Karen L; Ellerby, Lisa M
2016-05-01
Huntington's disease (HD) is an autosomal dominant neurodegenerative disorder caused by an expansion of the CAG repeat in exon 1 of the huntingtin gene. The disease generally manifests in middle age with both physical and mental symptoms. There are no effective treatments or cures, and death usually occurs 10-20 years after initial symptoms. Since the original identification of the Huntington's disease-associated gene in 1993, a variety of models have been created and used to advance our understanding of HD. The most recent advances have utilized stem cell models derived from HD-patient induced pluripotent stem cells (iPSCs), offering a variety of screening and model options that were not previously available. The discovery and advancement of technology to make human iPSCs has allowed for a more thorough characterization of human HD on a cellular and developmental level. The interaction between the genome editing and stem cell fields promises to further expand the variety of HD cellular models available to researchers. In this review, we discuss the history of Huntington's disease models, common screening assays, currently available models, and future directions for modeling HD using iPSCs derived from HD patients.
Development of the US3D Code for Advanced Compressible and Reacting Flow Simulations
NASA Technical Reports Server (NTRS)
Candler, Graham V.; Johnson, Heath B.; Nompelis, Ioannis; Subbareddy, Pramod K.; Drayna, Travis W.; Gidzak, Vladimyr; Barnhardt, Michael D.
2015-01-01
Aerothermodynamics and hypersonic flows involve complex multi-disciplinary physics, including finite-rate gas-phase kinetics, finite-rate internal energy relaxation, gas-surface interactions with finite-rate oxidation and sublimation, transition to turbulence, large-scale unsteadiness, shock-boundary layer interactions, fluid-structure interactions, and thermal protection system ablation and thermal response. Many of the flows have a large range of length and time scales, requiring large computational grids, implicit time integration, and long solution run times. The University of Minnesota NASA US3D code was designed for the simulation of these complex, highly-coupled flows. It has many of the features of the well-established DPLR code, but uses unstructured grids and has many advanced numerical capabilities and physical models for multi-physics problems. The main capabilities of the code are described, the physical modeling approaches are discussed, the different types of numerical flux functions and time integration approaches are outlined, and the parallelization strategy is overviewed. Comparisons between US3D and the NASA DPLR code are presented, and several advanced simulations are presented to illustrate some of the novel features of the code.
Group Theory with Applications in Chemical Physics
NASA Astrophysics Data System (ADS)
Jacobs, Patrick
2005-10-01
Group Theory is an indispensable mathematical tool in many branches of chemistry and physics. This book provides a self-contained and rigorous account of the fundamentals and applications of the subject to chemical physics, assuming no prior knowledge of group theory. The first half of the book focuses on elementary topics, such as molecular and crystal symmetry, whilst the latter half is more advanced in nature. Discussions of more complex material such as space groups, projective representations, magnetic crystals and spinor bases, often omitted from introductory texts, are expertly dealt with. With the inclusion of numerous exercises and worked examples, this book will appeal to advanced undergraduates and beginning graduate students studying physical sciences and is an ideal text for use on a two-semester course. It comprehensively covers both introductory and advanced material, and includes several topics often omitted from introductory texts, such as the rotation group, space groups and spinor bases.
Engineering and physical sciences in oncology: challenges and opportunities
Mitchell, Michael J.; Jain, Rakesh K.; Langer, Robert
2017-01-01
The principles of engineering and physics have been applied to oncology for nearly 50 years. Engineers and physical scientists have made contributions to all aspects of cancer biology, from quantitative understanding of tumour growth and progression to improved detection and treatment of cancer. Many early efforts focused on experimental and computational modelling of drug distribution, cell cycle kinetics and tumour growth dynamics. In the past decade, we have witnessed exponential growth at the interface of engineering, physics and oncology that has been fuelled by advances in fields including materials science, microfabrication, nanomedicine, microfluidics and imaging, and catalysed by new programmes at the National Institutes of Health (NIH), including the National Institute of Biomedical Imaging and Bioengineering (NIBIB), Physical Sciences in Oncology, and the National Cancer Institute (NCI) Alliance for Nanotechnology. Here, we review the advances made at the interface of the engineering and physical sciences and oncology in four important areas: the physical microenvironment of the tumour; technological advances in drug delivery; cellular and molecular imaging; and microfluidics and microfabrication. We discuss the research advances, opportunities and challenges for integrating engineering and physical sciences with oncology to develop new methods to study, detect and treat cancer, and we also describe the future outlook for these emerging areas.
An Improved Mathematical Scheme for LTE-Advanced Coexistence with FM Broadcasting Service
Al-hetar, Abdulaziz M.
2016-01-01
Power spectral density (PSD) overlapping analysis is considered the surest approach to evaluating the feasibility of compatibility between wireless communication systems. In this paper, a new closed form for the Interference Signal Power Attenuation (ISPA) is mathematically derived to evaluate interference caused by Orthogonal Frequency Division Multiplexing (OFDM)-based Long Term Evolution (LTE)-Advanced into the Frequency Modulation (FM) broadcasting service. In this scheme, the ISPA loss due to PSD overlapping of OFDM-based LTE-Advanced and the FM broadcasting service is computed. The proposed model can estimate power attenuation loss more precisely than the Advanced Minimum Coupling Loss (A-MCL) and approximate-ISPA methods. Numerical results demonstrate that the interference power is less than that obtained using the A-MCL and approximate-ISPA methods by 2.8 and 1.5 dB at the co-channel, and by 5.2 and 2.2 dB at the adjacent channel with null guard band, respectively. The scheme's advantage over these methods further reduces the required physical separation distance between the two systems, ultimately supporting efficient use of the radio frequency spectrum.
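The closed form itself is not given in the abstract. As a purely illustrative sketch of the underlying idea, the ISPA can be evaluated numerically as the attenuation of OFDM power falling inside an offset receiver pass-band; the spectral shapes and parameters below are simplified placeholders, not the paper's models:

```python
import numpy as np

# Illustrative numerical sketch (not the paper's closed form): interference
# power attenuation from limited PSD overlap, by brute-force integration.
def ofdm_psd(f, n_sub=12, df=15e3):
    """Toy OFDM PSD: sum of sinc^2 subcarrier spectra centred on 0 Hz."""
    k = np.arange(n_sub) - (n_sub - 1) / 2.0
    x = (f[:, None] - (k * df)[None, :]) / df
    return np.sum(np.sinc(x) ** 2, axis=1)

def ispa_db(delta_f, fm_bw=200e3):
    """Attenuation (dB) of OFDM power seen in an idealised rectangular FM
    pass-band offset by delta_f from the OFDM centre frequency."""
    f = np.linspace(-2e6, 2e6, 8001)
    psd = ofdm_psd(f)
    in_band = (f >= delta_f - fm_bw / 2) & (f <= delta_f + fm_bw / 2)
    step = f[1] - f[0]
    total = psd.sum() * step
    overlap = psd[in_band].sum() * step
    return 10.0 * np.log10(total / overlap)

co_channel = ispa_db(0.0)    # maximum overlap -> least attenuation
adjacent = ispa_db(300e3)    # offset channel -> more attenuation
```

The qualitative behaviour matches the abstract's framing: moving the victim pass-band away from the interferer's centre frequency shrinks the PSD overlap and so increases the attenuation of the interfering power.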
NASA Astrophysics Data System (ADS)
Peck, Myron A.; Arvanitidis, Christos; Butenschön, Momme; Canu, Donata Melaku; Chatzinikolaou, Eva; Cucco, Andrea; Domenici, Paolo; Fernandes, Jose A.; Gasche, Loic; Huebert, Klaus B.; Hufnagl, Marc; Jones, Miranda C.; Kempf, Alexander; Keyl, Friedemann; Maar, Marie; Mahévas, Stéphanie; Marchal, Paul; Nicolas, Delphine; Pinnegar, John K.; Rivot, Etienne; Rochette, Sébastien; Sell, Anne F.; Sinerchia, Matteo; Solidoro, Cosimo; Somerfield, Paul J.; Teal, Lorna R.; Travers-Trolet, Morgan; van de Wolfshaar, Karen E.
2018-02-01
We review and compare four broad categories of spatially-explicit modelling approaches currently used to understand and project changes in the distribution and productivity of living marine resources including: 1) statistical species distribution models, 2) physiology-based, biophysical models of single life stages or the whole life cycle of species, 3) food web models, and 4) end-to-end models. Single pressures are rare and, in the future, models must be able to examine multiple factors affecting living marine resources such as interactions between: i) climate-driven changes in temperature regimes and acidification, ii) reductions in water quality due to eutrophication, iii) the introduction of alien invasive species, and/or iv) (over-)exploitation by fisheries. Statistical (correlative) approaches can be used to detect historical patterns which may not be relevant in the future. Advancing predictive capacity of changes in distribution and productivity of living marine resources requires explicit modelling of biological and physical mechanisms. New formulations are needed which (depending on the question) will need to strive for more realism in ecophysiology and behaviour of individuals, life history strategies of species, as well as trophodynamic interactions occurring at different spatial scales. Coupling existing models (e.g. physical, biological, economic) is one avenue that has proven successful. However, fundamental advancements are needed to address key issues such as the adaptive capacity of species/groups and ecosystems. The continued development of end-to-end models (e.g., physics to fish to human sectors) will be critical if we hope to assess how multiple pressures may interact to cause changes in living marine resources including the ecological and economic costs and trade-offs of different spatial management strategies. 
Given the strengths and weaknesses of the various types of models reviewed here, confidence in projections of changes in the distribution and productivity of living marine resources will be increased by assessing model structural uncertainty through biological ensemble modelling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Ronald W.; Collins, Benjamin S.; Godfrey, Andrew T.
2016-12-09
In order to support engineering analysis of Virtual Environment for Reactor Analysis (VERA) model results, the Consortium for Advanced Simulation of Light Water Reactors (CASL) needs a tool that provides visualizations of HDF5 files that adhere to the VERAOUT specification. VERAView provides an interactive graphical interface for the visualization and engineering analyses of output data from VERA. The Python-based software provides instantaneous 2D and 3D images, 1D plots, and alphanumeric data from VERA multi-physics simulations.
NASA Astrophysics Data System (ADS)
Berendsen, Herman J. C.
2004-06-01
The simulation of physical systems requires a simplified, hierarchical approach which models each level from the atomistic to the macroscopic scale. From quantum mechanics to fluid dynamics, this book systematically treats the broad scope of computer modeling and simulations, describing the fundamental theory behind each level of approximation. Berendsen evaluates each stage in relation to its applications, giving the reader insight into the possibilities and limitations of the models. Practical guidance for applications and sample programs in Python are provided. With a strong emphasis on molecular models in chemistry and biochemistry, this book will be suitable for advanced undergraduate and graduate courses on molecular modeling and simulation within physics, biophysics, physical chemistry and materials science, and will also be a useful reference for all those working in the field. Additional resources for this title, including solutions for instructors and programs, are available online at www.cambridge.org/9780521835275. It is the first book to cover this wide range of modeling and simulations, from the atomistic to the macroscopic scale, in a systematic fashion; providing a wealth of background material, it does not assume advanced knowledge and is eminently suitable for course use.
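As a flavour of the atomistic end of this modeling hierarchy (illustrative only, not one of the book's sample programs), a minimal velocity-Verlet integrator, the workhorse scheme of molecular dynamics, applied to a 1D harmonic oscillator:

```python
import numpy as np

# Illustrative sketch: velocity-Verlet integration of a 1D harmonic
# oscillator. The scheme is symplectic, so total energy stays bounded
# close to its initial value over long runs.
def velocity_verlet(x, v, force, dt, n_steps, m=1.0):
    xs = [x]
    f = force(x)
    for _ in range(n_steps):
        x = x + v * dt + 0.5 * (f / m) * dt * dt   # position update
        f_new = force(x)
        v = v + 0.5 * (f + f_new) / m * dt         # velocity update
        f = f_new
        xs.append(x)
    return np.array(xs), v

k = 1.0
force = lambda x: -k * x                 # Hooke's law
xs, v_end = velocity_verlet(x=1.0, v=0.0, force=force, dt=0.01, n_steps=1000)

e0 = 0.5 * k * 1.0 ** 2                  # initial total energy
e_end = 0.5 * v_end ** 2 + 0.5 * k * xs[-1] ** 2
```

The near-conservation of `e_end` relative to `e0` is the practical signature of the symplectic integrators used throughout atomistic simulation.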
Advances in SiC/SiC Composites for Aerospace Applications
NASA Technical Reports Server (NTRS)
DiCarlo, James A.
2006-01-01
In recent years, supported by a variety of materials development programs, NASA Glenn Research Center has significantly increased the thermostructural capability of SiC/SiC composite materials for high-temperature aerospace applications. These state-of-the-art advances have occurred in every key constituent of the composite: fiber, fiber coating, matrix, and environmental barrier coating, as well as processes for forming the fiber architectures needed for complex-shaped components such as turbine vanes for gas turbine engines. This presentation will briefly elaborate on the nature of these advances in terms of performance data and underlying mechanisms. Based on a list of first-order property goals for typical high-temperature applications, key data from a variety of laboratory tests are presented which demonstrate that the NASA-developed constituent materials and processes do indeed result in SiC/SiC systems with the desired thermal and structural capabilities. Remaining process and microstructural issues for further property enhancement are discussed, as well as on-going approaches at NASA to solve these issues. NASA efforts to develop physics-based property models that can be used not only for component design and life modeling, but also for constituent material and process improvement will also be discussed.
Physical therapy mandates by Medicare administrative contractors: effective or wasteful?
Fehring, Thomas K; Fehring, Keith; Odum, Susan M; Halsey, David
2013-10-01
Documentation of medical necessity for arthroplasty has come under scrutiny by Medicare. In some jurisdictions, three months of physical therapy prior to arthroplasty has been mandated. The purpose of this study was to determine the efficacy and cost of this policy for treating advanced osteoarthritis. A systematic review was performed to assimilate efficacy data for physical therapy in patients with advanced osteoarthritis. The number of arthroplasties performed annually was obtained to calculate cost. Evidence-based studies documenting the efficacy of physical therapy in treating advanced arthritis are lacking, and the mandate carries a potential cost of $36-68 million. Physical therapy mandates by administrative contractors are not only ineffective but costly, without patient benefit. Medical necessity documentation should be driven by orthopedists, not retroactively by Medicare contractors.
The Numerical Propulsion System Simulation: An Overview
NASA Technical Reports Server (NTRS)
Lytle, John K.
2000-01-01
Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, databases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.
NASA Astrophysics Data System (ADS)
Hostache, Renaud; Rains, Dominik; Chini, Marco; Lievens, Hans; Verhoest, Niko E. C.; Matgen, Patrick
2017-04-01
Motivated by climate change and its impact on the scarcity or excess of water in many parts of the world, several agencies and research institutions have taken initiatives in monitoring and predicting the hydrologic cycle at a global scale. Such a monitoring/prediction effort is important for understanding the vulnerability to extreme hydrological events and for providing early warnings. This can be based on an optimal combination of hydro-meteorological models and remote sensing, in which satellite measurements can be used as forcing or calibration data or for regularly updating the model states or parameters. Many advances have been made in these domains, and the near future will bring new opportunities with respect to remote sensing as a result of the increasing number of spaceborne sensors enabling the large-scale monitoring of water resources. Besides these advances, there is currently a tendency to refine and further complicate physically-based hydrologic models to better capture the hydrologic processes at hand. However, this may not necessarily be beneficial for large-scale hydrology, as computational costs increase significantly. A novel science question to be investigated is therefore whether a flexible conceptual model can match the performance of a complex physically-based model for hydrologic simulations at large scale. In this context, the main objective of this study is to investigate how innovative techniques that allow for the estimation of soil moisture from satellite data can help in reducing errors and uncertainties in large-scale conceptual hydro-meteorological modelling. A spatially distributed conceptual hydrologic model has been set up based on recent developments of the SUPERFLEX modelling framework. As it requires limited computational effort, this model enables early warnings for large areas. 
Using the public ERA-Interim dataset as forcing and coupled with the CMEM radiative transfer model, SUPERFLEX is capable of predicting runoff, soil moisture, and SMOS-like brightness temperature time series. Such a model is traditionally calibrated using only discharge measurements. In this study we designed a multi-objective calibration procedure based on both discharge measurements and SMOS-derived brightness temperature observations in order to evaluate the added value of remotely sensed soil moisture data in the calibration process. As a test case we set up the SUPERFLEX model for the large Murray-Darling catchment in Australia (∼1 million km²). When compared to in situ soil moisture time series, model predictions show good agreement, with correlation coefficients exceeding 70% and root mean squared errors below 1%. When benchmarked against the physically based land surface model CLM, SUPERFLEX exhibits similar performance levels. By adapting the runoff routing function within the SUPERFLEX model, the predicted discharge achieves a Nash-Sutcliffe efficiency exceeding 0.7 over both the calibration and the validation periods.
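The Nash-Sutcliffe efficiency used above to score the predicted discharge has a simple closed form: one minus the ratio of the model's squared error to the variance of the observations. A minimal sketch in plain Python, with illustrative data only (not the SUPERFLEX code):

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
    NSE = 1 is a perfect fit; NSE = 0 means the model is no better than
    always predicting the observed mean; values above ~0.7 (as reported
    for SUPERFLEX) indicate good discharge predictions."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var

obs = [1.0, 2.0, 3.0, 4.0]            # illustrative discharge series
print(nash_sutcliffe(obs, obs))        # 1.0 (perfect prediction)
print(nash_sutcliffe(obs, [2.5] * 4))  # 0.0 (predicting the mean)
```

Because NSE normalizes by the observed variance, it allows the calibration and validation periods to be compared on the same scale even when their flow regimes differ.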
Advanced quantitative measurement methodology in physics education research
NASA Astrophysics Data System (ADS)
Wang, Jing
The ultimate goal of physics education research (PER) is to develop a theoretical framework to understand and improve the learning process. In this journey of discovery, assessment serves as our headlamp and alpenstock. It sometimes detects signals in student mental structures, and sometimes presents the difference between expert understanding and novice understanding. Quantitative assessment is an important area in PER. Developing research-based, effective assessment instruments and making meaningful inferences based on these instruments have always been important goals of the PER community. Quantitative studies are often conducted to provide bases for test development and result interpretation. Statistics are frequently used in quantitative studies. The selection of statistical methods and the interpretation of their results should be connected to the educational background. In this connecting process, issues of educational models are often raised. Many widely used statistical methods make no assumptions about the mental structure of subjects, nor do they provide explanations tailored to the educational audience. Other methods do consider the mental structure and are tailored to provide strong connections between statistics and education. These methods often involve model assumptions and parameter estimation, and are mathematically complicated. The dissertation provides a practical view of some advanced quantitative assessment methods. The common feature of these methods is that they all make educational/psychological model assumptions beyond the minimum mathematical model. The purpose of the study is to provide a comparison between these advanced methods and the pure mathematical methods. The comparison is based on the performance of the two types of methods in physics education settings. In particular, the comparison uses both physics content assessments and scientific ability assessments. 
The dissertation includes three parts. The first part involves the comparison between item response theory (IRT) and classical test theory (CTT). The two theories both provide test item statistics for educational inferences and decisions. Both are applied to Force Concept Inventory data obtained from students enrolled in The Ohio State University. Effort was made to examine the similarities and differences between the two theories, and possible explanations for the differences. The study suggests that item response theory is more sensitive to the context and conceptual features of the test items than classical test theory. The IRT parameters provide a better measure than CTT parameters for the educational audience to investigate item features. The second part of the dissertation is on measures of association for binary data. In quantitative assessment, binary data is often encountered because of its simplicity. The currently popular measures of association fail under some extremely unbalanced conditions, and the occurrence of such conditions is not rare in educational data. Two popular association measures, Pearson's correlation and the tetrachoric correlation, are examined. A new method, model-based association, is introduced, and an educational testing constraint is discussed. The existing popular methods are compared with the model-based association measure with and without the constraint. Connections between the value of association and the context and conceptual features of questions are discussed in detail. Results show that all the methods have their advantages and disadvantages, and special attention to the test and data conditions is necessary. The last part of the dissertation is focused on exploratory factor analysis (EFA). The theoretical advantages of EFA are discussed, and typical misunderstandings and misuses of EFA are explored. 
The EFA is performed on Lawson's Classroom Test of Scientific Reasoning (LCTSR), a widely used assessment on scientific reasoning skills. The reasoning ability structures for U.S. and Chinese students at different educational levels are given by the analysis. A final discussion on the advanced quantitative assessment methodology and the pure mathematical methodology is presented at the end.
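The CTT/IRT comparison in the first part rests on two very different item statistics: CTT's item difficulty is just the proportion of correct responses, while the two-parameter logistic (2PL) IRT model expresses the probability of success as a function of student ability. A minimal sketch (the parameter values below are hypothetical, not fitted FCI estimates):

```python
import math

def ctt_difficulty(responses):
    """CTT item difficulty: proportion of correct (1) responses.
    Sample-dependent: it changes with the population tested."""
    return sum(responses) / len(responses)

def irt_2pl(theta, a, b):
    """2PL IRT item characteristic curve: probability that a student of
    ability theta answers correctly an item with discrimination a and
    difficulty b. Parameters are (in principle) population-invariant."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# An average-ability student (theta = 0) facing an average item (b = 0)
# succeeds with probability 0.5 regardless of discrimination.
print(irt_2pl(0.0, 1.5, 0.0))        # 0.5
print(ctt_difficulty([1, 1, 0, 1]))  # 0.75
```

The contrast illustrated here is the abstract's point: CTT statistics are properties of an item *and* a sample, whereas IRT parameters separate item features from the ability distribution, which is why they track context and conceptual features of items more sensitively.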
Trends in Nuclear Explosion Monitoring Research & Development - A Physics Perspective
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maceira, Monica; Blom, Philip Stephen; MacCarthy, Jonathan K.
This document entitled “Trends in Nuclear Explosion Monitoring Research and Development – A Physics Perspective” reviews the accessible literature, as it relates to nuclear explosion monitoring and the Comprehensive Nuclear-Test-Ban Treaty (CTBT, 1996), for four research areas: source physics (understanding signal generation), signal propagation (accounting for changes through physical media), sensors (recording the signals), and signal analysis (processing the signal). Over 40 trends are addressed, such as moving from 1D to 3D earth models, from pick-based seismic event processing to full waveform processing, and from separate treatment of mechanical waves in different media to combined analyses. Highlighted in the document for each trend are the value and benefit to the monitoring mission, key papers that advanced the science, and promising research and development for the future.
Leonard, Tammy; Shuval, Kerem; de Oliveira, Angela; Skinner, Celette Sugg; Eckel, Catherine; Murdoch, James C.
2014-01-01
Purpose: To examine the relationship between physical activity stages of change and preferences for financial risk and time. Design: A cross-sectional, community-based study. Setting: A low-income, urban, African American neighborhood. Subjects: 169 adults. Measures: Self-reported physical activity stages of change (precontemplation to maintenance), objectively measured BMI and waist circumference, and economic preferences for time and risk measured via incentivized economic experiments. Analysis: Multivariable ordered logistic regression models were used to examine the association between physical activity stages of change and economic preferences while controlling for demographic characteristics of the individuals. Results: Individuals who are more tolerant of financial risks (OR=1.31, p<0.05) and whose time preferences indicate more patience (OR=1.68, p<0.01) are more likely to be in a more advanced physical activity stage (e.g., from preparation to action). The likelihood of being in the maintenance stage increases by 5.6 and 10.9 percentage points for each 1-unit increase in financial risk tolerance or in the time preference measure, respectively. Conclusions: Greater tolerance of financial risk and more patient time preferences among this low-income ethnic minority population are associated with a more advanced physical activity stage. Further exploration is clearly warranted in larger and more representative samples. PMID:23448410
NASA Technical Reports Server (NTRS)
Hope, W. W.; Johnson, L. P.; Obl, W.; Stewart, A.; Harris, W. C.; Craig, R. D.
2000-01-01
Faculty in the Department of Physical, Environmental and Computer Sciences strongly believe that undergraduate research and research-related activities must be integrated into the fabric of our undergraduate Science and Technology curricula. High-level skills, such as problem solving, reasoning, collaboration, and the ability to engage in research, prepare students for advanced study in graduate school or for competing for well-paying positions in the scientific community. One goal of our academic programs is to have a pipeline of research activities from high school, to four-year college, to graduate school, based on the GISS Institute on Climate and Planets model.
NASA Technical Reports Server (NTRS)
Salas, Manuel D.
2007-01-01
The research program of the aerodynamics, aerothermodynamics and plasmadynamics discipline of NASA's Hypersonic Project is reviewed. Details are provided for each of its three components: 1) development of physics-based models of non-equilibrium chemistry, surface catalytic effects, turbulence, transition and radiation; 2) development of advanced simulation tools to enable increased spatial and time accuracy, increased geometrical complexity, grid adaptation, increased physical-processes complexity, uncertainty quantification and error control; and 3) establishment of experimental databases from ground and flight experiments to develop better understanding of high-speed flows and to provide data to validate and guide the development of simulation tools.
Xia, Ruiping; Stone, John R; Hoffman, Julie E; Klappa, Susan G
2016-03-01
In physical therapy, there is increasing focus on the need at the community level to promote health, eliminate disparities in health status, and ameliorate risk factors among underserved minorities. Community-based participatory research (CBPR) is the most promising paradigm for pursuing these goals. CBPR stresses equitable partnering of the community and investigators in light of local social, structural, and cultural elements. Throughout the research process, the CBPR model emphasizes coalition and team building that joins partners with diverse skills/expertise, knowledge, and sensitivities. This article presents core concepts and principles of CBPR and the rationale for its application in the management of health issues at the community level. CBPR is now commonly used to address public health issues. A literature review identified limited reports of its use in physical therapy research and services. A published study is used to illustrate features of CBPR for physical therapy. The purpose of this article is to promote an understanding of how physical therapists could use CBPR as a promising way to advance the profession's goals of community health, elimination of health care disparities, and social responsibility. Funding opportunities for the support of CBPR are noted. © 2016 American Physical Therapy Association.
Performance in Physical Science Education by Dint of Advance Organiser Model of Teaching
ERIC Educational Resources Information Center
Bency, P. B. Beulahbel; Raja, B. William Dharma
2010-01-01
Education should be made painless and the teaching must be made effective. Teaching is an activity, which is designed and performed for multiple objectives, in terms of changes in student behaviours. Models of teaching are just a blue print designed in advance for providing necessary structure and direction to the teacher for realizing the…
NASA Astrophysics Data System (ADS)
Turinsky, Paul J.; Kothe, Douglas B.
2016-05-01
The Consortium for Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, all while assuring nuclear safety. Accomplishing this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance, and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which correspond to long-standing issues of the nuclear power industry that M&S can help address. To date CASL has developed a multi-physics "core simulator" based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high-performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso, and macro. CFD R&D has focused on improving closure models for subcooled boiling and bubbly flow, and on formulating robust numerical solution algorithms. For multi-physics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling. 
Industry adoption of CASL's evolving M&S capabilities, which is in progress, will assist in addressing long-standing and future operational and safety challenges of the nuclear industry.
Peck, Kirk; Paschal, Karen; Black, Lisa; Nelson, Kelly
2014-01-01
Prior to graduation, students often express an interest to advance clinical and professional skills in teaching, research, administration, and various niche practice areas. The acquisition of advanced education in selected areas of practice is believed to improve employment opportunities, accelerate career advancement including eligibility for professional certifications, and contribute to personal satisfaction in the profession. The purpose of this paper is to (1) describe an innovative model of education, the Directed Practice Experience (DPE) elective, that incorporates a student-initiated learning process designed to achieve student-identified professional goals, and (2) report the outcomes for graduates who have completed the DPE in an entry-level program in physical therapy education. Students who met select criteria were eligible to complete a DPE. Applicants designed a 4- to 6-week clinical education experience consisting of stated rationale for personal and professional growth, examples of leadership and service, and self-directed objectives that are beyond entry-level expectations as measured by the revised Physical Therapist Clinical Performance Instrument, version 2006. Twenty-six students have completed DPEs since 2005. Fifty percent resulted in new academic partnerships. At least 25% of graduates now serve as clinical instructors for the entry-level program. Those who participated in DPEs have also completed post-graduate residencies, attained ABPTS Board certifications, authored peer-reviewed publications, and taught in both PT and residency programs. The DPE model allows qualified students to acquire advanced personal skills and knowledge prior to graduation in areas of professional practice that exceed entry-level expectations. The model is applicable to all CAPTE accredited physical therapy education programs and is especially beneficial for academic programs desiring to form new community partnerships for student clinical education.
A contact angle hysteresis model based on the fractal structure of contact line.
Wu, Shuai; Ma, Ming
2017-11-01
Contact angle is one of the most widely used concepts in fields such as wetting, transport, and microfluidics. In practice, different contact angles, such as the equilibrium, receding, and advancing contact angles, are observed due to hysteresis. The connection among these contact angles is important in revealing the chemical and physical properties of surfaces related to wetting. Inspired by the fractal structure of the contact line, we propose a single-parameter model depicting the connection of the three angles. This parameter is determined by the fractal structure of the contact line. The results of this model agree with experimental observations. In certain cases, it can be reduced to other existing models. It also provides a new point of view in understanding the physical nature of contact angle hysteresis. Interestingly, some counter-intuitive phenomena, such as binary receding angles, are indicated by this model and await experimental validation. Copyright © 2017 Elsevier Inc. All rights reserved.
Data Assimilation - Advances and Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Brian J.
2014-07-30
This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.
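The ensemble Kalman filter update mentioned at the end can be sketched for a scalar state. In the stochastic (perturbed-observation) variant, each ensemble member is nudged toward a randomly perturbed observation by the Kalman gain; this is a minimal illustration with invented numbers, not the presentation's methodology:

```python
import random

def enkf_update(ensemble, obs, obs_var):
    """One stochastic ensemble Kalman filter analysis step for a scalar
    state. Each member moves toward a perturbed observation by the
    Kalman gain K = P / (P + R), where P is the sample variance of the
    forecast ensemble and R the observation-error variance."""
    n = len(ensemble)
    mean = sum(ensemble) / n
    p = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    k = p / (p + obs_var)
    return [x + k * (obs + random.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]

random.seed(0)
prior = [1.0, 2.0, 3.0, 4.0]                 # forecast ensemble (mean 2.5)
posterior = enkf_update(prior, obs=10.0, obs_var=0.1)
# The analysis mean lands close to the observation (10.0), because the
# forecast spread (P ~ 1.67) is large relative to the obs error (R = 0.1).
```

Because the gain is estimated from the ensemble itself, the same update applies unchanged whether the forecast comes from a cheap surrogate or an expensive physics code, which is what makes the method attractive for computer-experiment calibration.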
The role of data fusion in predictive maintenance using digital twin
NASA Astrophysics Data System (ADS)
Liu, Zheng; Meyendorf, Norbert; Mrad, Nezih
2018-04-01
Modern aerospace industry is migrating from reactive to proactive and predictive maintenance to increase platform operational availability and efficiency, extend its useful life cycle, and reduce its life cycle cost. Multiphysics modeling together with data-driven analytics generates a new paradigm called the "Digital Twin." The digital twin is a living model of the physical asset or system, which continually adapts to operational changes based on the collected online data and information and can forecast the future of its physical counterpart. This paper reviews the overall framework for developing a digital twin coupled with industrial Internet of Things technology to advance aerospace platform autonomy. Data fusion techniques play a particularly significant role in the digital twin framework. The flow of information from raw data to high-level decision making is propelled by sensor-to-sensor, sensor-to-model, and model-to-model fusion. This paper further discusses and identifies the role of data fusion in the digital twin framework for aircraft predictive maintenance.
NASA Technical Reports Server (NTRS)
DiCarlo, James A.
2011-01-01
Under the Supersonics Project of the NASA Fundamental Aeronautics Program, modeling and experimental efforts are underway to develop generic physics-based tools to better implement lightweight ceramic matrix composites into supersonic engine components and to assure sufficient durability for these components in the engine environment. These activities, which have a crosscutting aspect for other areas of the Fundamental Aero program, are focusing primarily on improving the multi-directional design strength and rupture strength of high-performance SiC/SiC composites by advanced fiber architecture design. This presentation discusses progress in tool development with particular focus on the use of 2.5D-woven architectures and state-of-the-art constituents for a generic un-cooled SiC/SiC low-pressure turbine blade.
Advances in water resources research in the Upper Blue Nile basin and the way forward: A review
NASA Astrophysics Data System (ADS)
Dile, Yihun Taddele; Tekleab, Sirak; Ayana, Essayas K.; Gebrehiwot, Solomon G.; Worqlul, Abeyou W.; Bayabil, Haimanote K.; Yimam, Yohannes T.; Tilahun, Seifu A.; Daggupati, Prasad; Karlberg, Louise; Srinivasan, Raghavan
2018-05-01
The Upper Blue Nile basin is considered as the lifeline for ∼250 million people and contributes ∼50 Gm3/year of water to the Nile River. Poor land management practices in the Ethiopian highlands have caused a significant amount of soil erosion, thereby threatening the productivity of the Ethiopian agricultural system, degrading the health of the aquatic ecosystem, and shortening the life of downstream reservoirs. The Upper Blue Nile basin, because of limited research and availability of data, has been considered as the "great unknown." In the recent past, however, more research has been published. Nonetheless, there is no state-of-the-art review that presents research achievements, gaps and future directions. Hence, this paper aims to bridge this gap by reviewing the advances in water resources research in the basin while highlighting research needs and future directions. We report that there have been several research projects that try to understand the biogeochemical processes by collecting information on runoff, groundwater recharge, sediment transport, and tracers. Different types of hydrological models have been applied. Most of the earlier research used simple conceptual and statistical approaches for trend analysis and water balance estimations, mainly using rainfall and evapotranspiration data. More recent research has been using advanced semi-physically/physically based distributed hydrological models using high-resolution temporal and spatial data for diverse applications. We identified several research gaps and provided recommendations to address them. While we have witnessed advances in water resources research in the basin, we also foresee opportunities for further advancement. Incorporating the research findings into policy and practice will significantly benefit the development and transformation agenda of the Ethiopian government.
Leading institutional change: Implementing Studio in physics and beyond
NASA Astrophysics Data System (ADS)
Kohl, Patrick; Kuo, H. Vincent
2013-04-01
The Colorado School of Mines (CSM) teaches its first-year calculus-based introductory physics courses (Physics I and Physics II) using a hybrid of lecture and Studio physics. This model was first implemented in Physics I in 1997, and was established in Physics II in the fall of 2007. In this talk, we highlight the stages of the transformation from traditional to Studio, highlighting what has worked and what has not, and describing methods for assessment and evaluation. Results suggest that Studio has increased student performance and satisfaction despite an aggressive expansion of class sizes in the past few years. Gains have been concentrated mostly in problem-solving skills and exam performance (as opposed to conceptual survey gains), in contrast to what has sometimes been seen in other studies. Most recently, we as a department have been capitalizing on our successes with Studio physics to take a leadership role in disseminating advanced educational methods throughout CSM, both vertically (into upper division physics courses) and horizontally (into various departments outside of physics). We will briefly describe progress so far.
Benitez, Tanya J; Cherrington, Andrea L; Joseph, Rodney P; Keller, Colleen; Marcus, Bess; Meneses, Karen; Marquez, Becky; Pekmezi, Dorothy
2015-07-01
Latinas in the US report high levels of physical inactivity and are disproportionally burdened by related health conditions (eg, type 2 diabetes, obesity), highlighting the need for innovative strategies to reduce these disparities. A 1-month single-arm pretest-posttest design was utilized to assess the feasibility and acceptability of a culturally and linguistically adapted Internet-based physical activity intervention for Spanish-speaking Latinas. The intervention was based on the Social Cognitive Theory and the Transtheoretical Model. Changes in physical activity and related psychosocial variables were measured at baseline and the end of the 1-month intervention. The sample included 24 Latina adults (mean age, 35.17±11.22 years). Most (83.3%) were born outside the continental US. Intent-to-treat analyses showed a significant increase (P=.001) in self-reported moderate- to vigorous-intensity physical activity from a median of 12.5 min/wk at baseline to 67.5 min/wk at the 1-month assessment. Participants reported significant increases in self-efficacy as well as cognitive and behavioral processes of change. Nearly half of the participants (45.8%) reported advancing at least one stage of change during the course of the 1-month intervention. Findings support the feasibility and acceptability of using interactive Internet-based technology to promote physical activity among Latinas in Alabama.
NASA Astrophysics Data System (ADS)
Dietz, Laura
The Science Teaching Advancement through Modeling Physical Science (STAMPS) professional development workshop was evaluated for effectiveness in improving teachers' and students' content knowledge. Previous research has shown modeling to be an effective method of instruction for improving student and teacher content knowledge, evidenced by assessment scores. Data includes teacher scores on the Force Concept Inventory (FCI; Hestenes, Wells, & Swackhamer, 1992) and the Chemistry Concept Inventory (CCI; Jenkins, Birk, Bauer, Krause, & Pavelich, 2004), as well as student scores on a physics and chemistry assessment. Quantitative data is supported by teacher responses to a post workshop survey and classroom observations. Evaluation of the data shows that the STAMPS professional development workshop was successful in improving both student and teacher content knowledge. Conclusions and suggestions for future study are also included.
NASA Astrophysics Data System (ADS)
Polverino, Pierpaolo; Frisk, Erik; Jung, Daniel; Krysander, Mattias; Pianese, Cesare
2017-07-01
The present paper proposes an advanced approach to fault detection and isolation for Polymer Electrolyte Membrane Fuel Cell (PEMFC) systems through a model-based diagnostic algorithm. The algorithm is developed upon a lumped-parameter model simulating a whole PEMFC system oriented towards automotive applications. This model is inspired by other models available in the literature, with further attention to stack thermal dynamics and water management. The developed model is analysed by means of Structural Analysis to identify the correlations among the involved physical variables, the defined equations, and a set of faults which may occur in the system (related to both auxiliary component malfunctions and stack degradation phenomena). Residual generators are designed by means of Causal Computation analysis, and the maximum theoretical fault isolability achievable with a minimal number of installed sensors is investigated. The results prove the capability of the algorithm to theoretically detect and isolate almost all faults using only stack voltage and temperature sensors, with significant advantages from an industrial point of view. The effective fault isolability is proved through fault simulations at a specific fault magnitude with an advanced residual evaluation technique that considers quantitative residual deviations from normal conditions and achieves univocal fault isolation.
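At its simplest, residual-based diagnosis of the kind described compares measured signals against model predictions and flags a fault when a residual exceeds its threshold. A toy sketch (the sensor values and thresholds below are invented for illustration, not taken from the paper's PEMFC model):

```python
def residual(measured, predicted):
    """Residual generator: deviation of a sensor reading from the model
    prediction; close to zero in fault-free operation."""
    return measured - predicted

def detect_fault(residuals, threshold):
    """Flag a fault when any residual magnitude exceeds the threshold.
    Isolation would additionally match the *pattern* of triggered
    residuals against a fault signature matrix."""
    return any(abs(r) > threshold for r in residuals)

# Hypothetical stack readings vs. model predictions.
r_voltage = residual(0.68, 0.70)   # V per cell, within tolerance
r_temp = residual(355.0, 348.0)    # K, abnormal deviation
print(detect_fault([r_voltage, r_temp], threshold=2.0))  # True
```

Which faults can be told apart depends on which residuals each fault affects; that is exactly what the paper's structural analysis determines before any sensor is installed.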
NASA Astrophysics Data System (ADS)
Sakurai, Yoshinori; Tanaka, Hiroki; Takata, Takushi; Fujimoto, Nozomi; Suzuki, Minoru; Masunaga, Shinichiro; Kinashi, Yuko; Kondo, Natsuko; Narabayashi, Masaru; Nakagawa, Yosuke; Watanabe, Tsubasa; Ono, Koji; Maruhashi, Akira
2015-07-01
At the Kyoto University Research Reactor Institute (KURRI), a clinical study of boron neutron capture therapy (BNCT) using a neutron irradiation facility installed at the research nuclear reactor has been performed regularly since February 1990. As of November 2014, 510 clinical irradiations had been carried out using the reactor-based system. The world's first accelerator-based neutron irradiation system for BNCT clinical irradiation was completed at this institute in early 2009, and a clinical trial using this system was started in 2012. A shift of BNCT from a specialized particle therapy to a general one is now in progress. To promote and support this shift, improvements to the irradiation system and its operation, as well as in physical engineering and medical physics processes such as dosimetry systems and quality assurance programs, must be considered. The recent advances in BNCT at KURRI are reported here with a focus on physical engineering and medical physics topics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Rui
The System Analysis Module (SAM) is an advanced and modern system analysis tool being developed at Argonne National Laboratory under the U.S. DOE Office of Nuclear Energy’s Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. SAM development aims for advances in physical modeling, numerical methods, and software engineering to enhance its user experience and usability for reactor transient analyses. To facilitate the code development, SAM utilizes an object-oriented application framework (MOOSE), and its underlying meshing and finite-element library (libMesh) and linear and non-linear solvers (PETSc), to leverage modern advanced software environments and numerical methods. SAM focuses on modeling advanced reactor concepts such as SFRs (sodium fast reactors), LFRs (lead-cooled fast reactors), and FHRs (fluoride-salt-cooled high temperature reactors) or MSRs (molten salt reactors). These advanced concepts are distinguished from light-water reactors in their use of single-phase, low-pressure, high-temperature, and low Prandtl number (sodium and lead) coolants. As a new code development, the initial effort has been focused on modeling and simulation capabilities of heat transfer and single-phase fluid dynamics responses in Sodium-cooled Fast Reactor (SFR) systems. The system-level simulation capabilities of fluid flow and heat transfer in general engineering systems and typical SFRs have been verified and validated. This document provides the theoretical and technical basis of the code to help users understand the underlying physical models (such as governing equations, closure models, and component models), system modeling approaches, numerical discretization and solution methods, and the overall capabilities in SAM. As the code is still under ongoing development, this SAM Theory Manual will be updated periodically to keep it consistent with the state of the development.
Development of a Robust and Efficient Parallel Solver for Unsteady Turbomachinery Flows
NASA Technical Reports Server (NTRS)
West, Jeff; Wright, Jeffrey; Thakur, Siddharth; Luke, Ed; Grinstead, Nathan
2012-01-01
The traditional design and analysis practice for advanced propulsion systems relies heavily on expensive full-scale prototype development and testing. Over the past decade, use of high-fidelity analysis and design tools such as CFD early in the product development cycle has been identified as one way to alleviate testing costs and to develop these devices better, faster and cheaper. In the design of advanced propulsion systems, CFD plays a major role in defining the required performance over the entire flight regime, as well as in testing the sensitivity of the design to the different modes of operation. Increased emphasis is being placed on developing and applying CFD models to simulate the flow field environments and performance of advanced propulsion systems. This necessitates the development of next generation computational tools which can be used effectively and reliably in a design environment. The turbomachinery simulation capability presented here is being developed in a computational tool called Loci-STREAM [1]. It integrates proven numerical methods for generalized grids and state-of-the-art physical models in a novel rule-based programming framework called Loci [2] which allows: (a) seamless integration of multidisciplinary physics in a unified manner, and (b) automatic handling of massively parallel computing. The objective is to be able to routinely simulate problems involving complex geometries requiring large unstructured grids and complex multidisciplinary physics. An immediate application of interest is simulation of unsteady flows in rocket turbopumps, particularly in cryogenic liquid rocket engines. 
The key components of the overall methodology presented in this paper are the following: (a) high fidelity unsteady simulation capability based on Detached Eddy Simulation (DES) in conjunction with second-order temporal discretization, (b) compliance with Geometric Conservation Law (GCL) in order to maintain conservative property on moving meshes for second-order time-stepping scheme, (c) a novel cloud-of-points interpolation method (based on a fast parallel kd-tree search algorithm) for interfaces between turbomachinery components in relative motion which is demonstrated to be highly scalable, and (d) demonstrated accuracy and parallel scalability on large grids (approx 250 million cells) in full turbomachinery geometries.
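The cloud-of-points interface treatment can be illustrated with a toy sketch: values on donor points from one turbomachinery component are interpolated to target points on the neighboring component by inverse-distance weighting of the nearest donors. The Python sketch below uses a brute-force neighbor search standing in for the parallel kd-tree; the function name and the choices of k and eps are illustrative assumptions, not Loci-STREAM's API:

```python
import numpy as np

def cloud_interpolate(donor_pts, donor_vals, target_pts, k=4, eps=1e-12):
    """Inverse-distance-weighted interpolation from a donor point cloud.

    Toy stand-in for a kd-tree-accelerated cloud-of-points interface
    method (brute-force nearest-neighbor search here for brevity).
    """
    out = np.empty(len(target_pts))
    for i, p in enumerate(target_pts):
        d2 = np.sum((donor_pts - p) ** 2, axis=1)   # squared distances
        nearest = np.argsort(d2)[:k]                # k nearest donors
        w = 1.0 / (d2[nearest] + eps)               # inverse-distance weights
        out[i] = np.dot(w, donor_vals[nearest]) / w.sum()
    return out
```

In a production solver the brute-force search is replaced by a parallel kd-tree so that each sliding-interface query scales logarithmically rather than linearly with the donor cloud size.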
Dynamically adaptive data-driven simulation of extreme hydrological flows
NASA Astrophysics Data System (ADS)
Kumar Jain, Pushkar; Mandli, Kyle; Hoteit, Ibrahim; Knio, Omar; Dawson, Clint
2018-02-01
Hydrological hazards such as storm surges, tsunamis, and rainfall-induced flooding are physically complex events that are costly in loss of human life and economic productivity. Many such disasters could be mitigated through improved emergency evacuation in real-time and through the development of resilient infrastructure based on knowledge of how systems respond to extreme events. Data-driven computational modeling is a critical technology underpinning these efforts. This investigation focuses on the novel combination of methodologies in forward simulation and data assimilation. The forward geophysical model utilizes adaptive mesh refinement (AMR), a process by which a computational mesh can adapt in time and space based on the current state of a simulation. The forward solution is combined with ensemble based data assimilation methods, whereby observations from an event are assimilated into the forward simulation to improve the veracity of the solution, or used to invert for uncertain physical parameters. The novelty in our approach is the tight two-way coupling of AMR and ensemble filtering techniques. The technology is tested using actual data from the Chile tsunami event of February 27, 2010. These advances offer the promise of significantly transforming data-driven, real-time modeling of hydrological hazards, with potentially broader applications in other science domains.
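The ensemble half of this coupling can be sketched with a generic stochastic ensemble Kalman filter analysis step, in which each member is nudged toward perturbed observations through a gain built from ensemble covariances. This is a minimal illustration of the general technique under standard EnKF assumptions, not the authors' AMR-coupled implementation; all names are illustrative:

```python
import numpy as np

def enkf_update(ensemble, obs, H, obs_var, rng):
    """One stochastic EnKF analysis step (perturbed observations).

    ensemble : (n_members, n_state) forecast ensemble
    obs      : (n_obs,) observation vector
    H        : (n_obs, n_state) linear observation operator
    obs_var  : scalar observation-error variance
    """
    n = ensemble.shape[0]
    A = ensemble - ensemble.mean(axis=0)          # state anomalies
    HX = ensemble @ H.T                           # observed ensemble
    HA = HX - HX.mean(axis=0)                     # observed anomalies
    cross = A.T @ HA / (n - 1)                    # state-obs covariance
    innov_cov = HA.T @ HA / (n - 1) + obs_var * np.eye(len(obs))
    K = cross @ np.linalg.inv(innov_cov)          # Kalman gain
    obs_pert = obs + rng.normal(0.0, np.sqrt(obs_var), size=(n, len(obs)))
    return ensemble + (obs_pert - HX) @ K.T       # analysis ensemble
```

Coupling this with AMR, as the paper does, additionally requires mapping each member's state between meshes whenever refinement changes between analysis steps.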
Kawamura, Kazuya; Kobayashi, Yo; Fujie, Masakatsu G
2007-01-01
Medical technology has advanced with the introduction of robot technology, making feasible medical treatments that were previously very difficult. However, operation of a surgical robot demands substantial training and continual practice on the part of the surgeon, because it requires techniques that differ from those of traditional surgical procedures. We focused on a simulation technology based on the physical characteristics of organs. In this research, we proposed the development of a surgical simulation, based on a physical model, for intra-operative navigation by a surgeon. In this paper, we describe the design of our system, in particular our organ deformation calculator. The proposed simulation system consists of an organ deformation calculator and virtual slave manipulators. We obtained adequate experimental results for a target node near the point of interaction, where our simulation model ensures better accuracy. The next research step will be to focus on a surgical environment in which internal organ models are integrated into a slave simulation system.
NASA Astrophysics Data System (ADS)
2014-03-01
WE RECOMMEND: Collider: step inside the world's greatest experiment (a great exhibition at the Science Museum in London); Hero Steam Turbine (superb engine model gets up to 2500 rpm); Most of Our Universe is Missing (BBC video explores the dark truth); Serving the Reich (science and morality in Nazi Germany); The Good Research Guide (a non-specialist book for teachers starting out in education research). WORTH A LOOK: Breakthrough to CLIL for Physics (a book based on a physics curriculum for non-English students). WEB WATCH: Electric cycles online (patterns of use). APPS: The virtual laboratory advances personal skills.
Space Electrochemical Research and Technology (SERT)
NASA Technical Reports Server (NTRS)
1987-01-01
The conference provided a forum to assess critical needs and technologies for the NASA electrochemical energy conversion and storage program. It was aimed at providing guidance to NASA on the appropriate direction and emphasis of that program. A series of related overviews were presented in the areas of NASA advanced mission models (space stations, low and geosynchronous Earth orbit missions, planetary missions, and space transportation). Papers were presented and workshops conducted in a variety of technical areas, including advanced rechargeables, advanced concepts, critical physical electrochemical issues, and modeling.
Modeling of Microstructure Evolution During Alloy Solidification
NASA Astrophysics Data System (ADS)
Zhu, Mingfang; Pan, Shiyan; Sun, Dongke
In recent years, considerable advances have been achieved in the numerical modeling of microstructure evolution during solidification. This paper presents the models based on the cellular automaton (CA) technique and lattice Boltzmann method (LBM), which can reproduce a wide variety of solidification microstructure features observed experimentally with an acceptable computational efficiency. The capabilities of the models are addressed by presenting representative examples encompassing a broad variety of issues, such as the evolution of dendritic structure and microsegregation in two and three dimensions, dendritic growth in the presence of convection, divorced eutectic solidification of spheroidal graphite irons, and gas porosity formation. The simulations offer insights into the underlying physics of microstructure formation during alloy solidification.
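The capture rule at the heart of a cellular automaton solidification model can be sketched in a few lines: a liquid cell joins the solid when at least one neighbor has already solidified and the local undercooling is sufficient. The sketch below is a deliberately minimal illustration of the CA idea; real dendrite models of the kind reviewed here also track interface orientation, curvature, and solute rejection, and the threshold value is an arbitrary assumption:

```python
import numpy as np

def ca_growth_step(solid, undercooling, threshold=0.5):
    """One capture step of a toy solidification cellular automaton.

    A liquid cell solidifies when at least one von Neumann neighbor is
    already solid and its local undercooling exceeds `threshold`.
    Periodic boundaries via np.roll.
    """
    has_solid_nbr = (np.roll(solid, 1, axis=0) | np.roll(solid, -1, axis=0)
                     | np.roll(solid, 1, axis=1) | np.roll(solid, -1, axis=1))
    return solid | (has_solid_nbr & (undercooling > threshold))
```

Starting from a single seed, repeated application grows a diamond-shaped solid front; anisotropy and curvature corrections are what turn this skeleton into realistic dendrites.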
Cross-sectional mapping for refined beam elements with applications to shell-like structures
NASA Astrophysics Data System (ADS)
Pagani, A.; de Miguel, A. G.; Carrera, E.
2017-06-01
This paper discusses the use of higher-order mapping functions for enhancing the physical representation of refined beam theories. Based on the Carrera unified formulation (CUF), advanced one-dimensional models are formulated by expressing the displacement field as a generic expansion of the generalized unknowns. According to CUF, a novel physically/geometrically consistent model is devised by employing Legendre-like polynomial sets to approximate the generalized unknowns at the cross-sectional level, whereas a local mapping technique based on the blending functions method is used to describe the exact physical boundaries of the cross-section domain. Classical and innovative finite element methods, including hierarchical p-elements and locking-free integration schemes, are utilized to solve the governing equations of the unified beam theory. Several numerical applications accounting for small displacements/rotations and strains are discussed, including beam structures with cross-sectional curved edges, cylindrical shells, and thin-walled aeronautical wing structures with reinforcements. The results from the proposed methodology are widely assessed by comparisons with solutions from the literature and commercial finite element software tools. The attention is focussed on the high computational efficiency and the marked capabilities of the present beam model, which can deal with a broad spectrum of structural problems with unveiled accuracy in terms of geometrical representation of the domain boundaries.
Theoretical studies of the physics of the solar atmosphere
NASA Technical Reports Server (NTRS)
Hollweg, Joseph V.
1992-01-01
Significant advances in our theoretical basis for understanding several physical processes related to dynamical phenomena on the sun were achieved. We have advanced a new model for spicules and fibrils. We have provided a simple physical view of resonance absorption of MHD surface waves; this allowed an approximate mathematical procedure for obtaining a wealth of new analytical results which we applied to coronal heating and p-mode absorption at magnetic regions. We provided the first comprehensive models for the heating and acceleration of the transition region, corona, and solar wind. We provided a new view of viscosity under coronal conditions. We provided new insights into Alfven wave propagation in the solar atmosphere. And recently we have begun work in a new direction: parametric instabilities of Alfven waves.
A test harness for accelerating physics parameterization advancements into operations
NASA Astrophysics Data System (ADS)
Firl, G. J.; Bernardet, L.; Harrold, M.; Henderson, J.; Wolff, J.; Zhang, M.
2017-12-01
The process of transitioning advances in parameterization of sub-grid scale processes from initial idea to implementation is often much quicker than the transition from implementation to use in an operational setting. After all, considerable work must be undertaken by operational centers to fully test, evaluate, and implement new physics. The process is complicated by the scarcity of like-to-like comparisons, availability of HPC resources, and the "tuning problem" whereby advances in physics schemes are difficult to properly evaluate without first undertaking the expensive and time-consuming process of tuning to other schemes within a suite. To address this process shortcoming, the Global Model TestBed (GMTB), supported by the NWS NGGPS project and undertaken by the Developmental Testbed Center, has developed a physics test harness. It implements the concept of hierarchical testing, where the same code can be tested in model configurations of varying complexity from single column models (SCM) to fully coupled, cycled global simulations. Developers and users may choose at which level of complexity to engage. Several components of the physics test harness have been implemented, including a SCM and an end-to-end workflow that expands upon the one used at NOAA/EMC to run the GFS operationally, although the testbed components will necessarily morph to coincide with changes to the operational configuration (FV3-GFS). A standard, relatively user-friendly interface known as the Interoperable Physics Driver (IPD) is available for physics developers to connect their codes. This prerequisite exercise allows access to the testbed tools and removes a technical hurdle for potential inclusion into the Common Community Physics Package (CCPP). The testbed offers users the opportunity to conduct like-to-like comparisons between the operational physics suite and new development as well as among multiple developments.
GMTB staff have demonstrated use of the testbed through a comparison between the 2017 operational GFS suite and one containing the Grell-Freitas convective parameterization. An overview of the physics test harness and its early use will be presented.
Hybrid Modeling Improves Health and Performance Monitoring
NASA Technical Reports Server (NTRS)
2007-01-01
Scientific Monitoring Inc. was awarded a Phase I Small Business Innovation Research (SBIR) project by NASA's Dryden Flight Research Center to create a new, simplified health-monitoring approach for flight vehicles and flight equipment. The project developed a hybrid physical model concept that provided a structured approach to simplifying complex design models for use in health monitoring, allowing the output or performance of the equipment to be compared to what the design models predicted, so that deterioration or impending failure could be detected before there would be an impact on the equipment's operational capability. Based on the original modeling technology, Scientific Monitoring released I-Trend, a commercial health- and performance-monitoring software product named for its intelligent trending, diagnostics, and prognostics capabilities, as part of the company's complete ICEMS (Intelligent Condition-based Equipment Management System) suite of monitoring and advanced alerting software. I-Trend uses the hybrid physical model to better characterize the nature of health or performance alarms that result in "no fault found" false alarms. Additionally, the use of physical principles helps I-Trend identify problems sooner. I-Trend technology is currently in use in several commercial aviation programs, and the U.S. Air Force recently tapped Scientific Monitoring to develop next-generation engine health-management software for monitoring its fleet of jet engines. Scientific Monitoring has continued the original NASA work, this time under a Phase III SBIR contract with a joint NASA-Pratt & Whitney aviation security program on propulsion-controlled aircraft under missile-damaged aircraft conditions.
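The residual-trending idea behind hybrid physical-model health monitoring can be sketched simply: compare measured performance against the model's prediction and raise an alarm when the rolling mean of the residual drifts beyond a threshold. The function below is an illustrative sketch of that general idea, not I-Trend's actual algorithm; the window and tolerance values are arbitrary and would be tuned per equipment type in practice:

```python
def drift_alarm(predicted, measured, window=5, tol=2.0):
    """Flag deterioration when the rolling mean of the residual
    (measured - predicted) exceeds +/- tol.

    Averaging over a window suppresses one-off sensor noise, which is
    what reduces "no fault found" false alarms relative to thresholding
    raw measurements.
    """
    residuals = [m - p for p, m in zip(predicted, measured)]
    alarms = []
    for i in range(window - 1, len(residuals)):
        rolling_mean = sum(residuals[i - window + 1:i + 1]) / window
        alarms.append(abs(rolling_mean) > tol)
    return alarms
```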
Advanced Ground Systems Maintenance Physics Models for Diagnostics Project
NASA Technical Reports Server (NTRS)
Harp, Janicce Leshay
2014-01-01
The project will use high-fidelity physics models and simulations to simulate real-time operations of cryogenic systems and calculate the status/health of those systems. The project enables the delivery of system health advisories to ground system operators. The capability will also be used to conduct planning and analysis of cryogenic system operations.
Research and technology, fiscal year 1982
NASA Technical Reports Server (NTRS)
1982-01-01
Advanced studies are reviewed. Atmospheric sciences, magnetospheric physics, solar physics, gravitational physics, astronomy, and materials processing in space comprise the research programs. Large space systems, propulsion technology, materials and processes, electrical/electronic systems, data bases/design criteria, and facilities development comprise the technology development activities.
Accelerating advances in continental domain hydrologic modeling
Archfield, Stacey A.; Clark, Martyn; Arheimer, Berit; Hay, Lauren E.; McMillan, Hilary; Kiang, Julie E.; Seibert, Jan; Hakala, Kirsti; Bock, Andrew R.; Wagener, Thorsten; Farmer, William H.; Andreassian, Vazken; Attinger, Sabine; Viglione, Alberto; Knight, Rodney; Markstrom, Steven; Over, Thomas M.
2015-01-01
In the past, hydrologic modeling of surface water resources has mainly focused on simulating the hydrologic cycle at local to regional catchment modeling domains. There now exists a level of maturity among the catchment, global water security, and land surface modeling communities such that these communities are converging toward continental domain hydrologic models. This commentary, written from a catchment hydrology community perspective, provides a review of progress in each community toward this achievement, identifies common challenges the communities face, and details immediate and specific areas in which these communities can mutually benefit one another from the convergence of their research perspectives. Those include: (1) creating new incentives and infrastructure to report and share model inputs, outputs, and parameters in data services and open access, machine-independent formats for model replication or reanalysis; (2) ensuring that hydrologic models have: sufficient complexity to represent the dominant physical processes and adequate representation of anthropogenic impacts on the terrestrial water cycle, a process-based approach to model parameter estimation, and appropriate parameterizations to represent large-scale fluxes and scaling behavior; (3) maintaining a balance between model complexity and data availability as well as uncertainties; and (4) quantifying and communicating significant advancements toward these modeling goals.
SPARK: A Framework for Multi-Scale Agent-Based Biomedical Modeling.
Solovyev, Alexey; Mikheev, Maxim; Zhou, Leming; Dutta-Moscato, Joyeeta; Ziraldo, Cordelia; An, Gary; Vodovotz, Yoram; Mi, Qi
2010-01-01
Multi-scale modeling of complex biological systems remains a central challenge in the systems biology community. A method of dynamic knowledge representation known as agent-based modeling enables the study of higher level behavior emerging from discrete events performed by individual components. With the advancement of computer technology, agent-based modeling has emerged as an innovative technique to model the complexities of systems biology. In this work, the authors describe SPARK (Simple Platform for Agent-based Representation of Knowledge), a framework for agent-based modeling specifically designed for systems-level biomedical model development. SPARK is a stand-alone application written in Java. It provides a user-friendly interface, and a simple programming language for developing Agent-Based Models (ABMs). SPARK has the following features specialized for modeling biomedical systems: 1) continuous space that can simulate real physical space; 2) flexible agent size and shape that can represent the relative proportions of various cell types; 3) multiple spaces that can concurrently simulate and visualize multiple scales in biomedical models; 4) a convenient graphical user interface. Existing ABMs of diabetic foot ulcers and acute inflammation were implemented in SPARK. Models of identical complexity were run in both NetLogo and SPARK; the SPARK-based models ran two to three times faster.
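The core agent-based idea, higher-level behavior emerging from discrete events performed by individual agents in continuous space, can be sketched in a few lines. This is an illustrative toy (SPARK itself is a Java framework with its own interface; these class and function names are invented for illustration), with agents performing a bounded random walk:

```python
import random

class Agent:
    """Point agent in continuous 2-D space, echoing SPARK's continuous-space
    feature (toy illustration, not SPARK's actual API)."""
    def __init__(self, x=0.0, y=0.0):
        self.x, self.y = x, y

    def step(self, rng, step_len=1.0):
        # Each discrete event: a bounded random displacement.
        self.x += rng.uniform(-step_len, step_len)
        self.y += rng.uniform(-step_len, step_len)

def run_abm(n_agents=10, n_steps=100, seed=1):
    """Advance every agent through n_steps discrete events."""
    rng = random.Random(seed)
    agents = [Agent() for _ in range(n_agents)]
    for _ in range(n_steps):
        for agent in agents:
            agent.step(rng)
    return agents
```

Biomedical ABMs replace the random step with rules for cell migration, signaling, and state change, and it is the interaction of many such rule-following agents that produces the emergent system-level behavior the abstract describes.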
NASA Astrophysics Data System (ADS)
Lien, F. S.; Yee, E.; Ji, H.; Keats, A.; Hsieh, K. J.
2006-06-01
The release of chemical, biological, radiological, or nuclear (CBRN) agents by terrorists or rogue states in a North American city (densely populated urban centre) and the subsequent exposure, deposition and contamination are emerging threats in an uncertain world. The modeling of the transport, dispersion, deposition and fate of a CBRN agent released in an urban environment is an extremely complex problem that encompasses potentially multiple space and time scales. The availability of high-fidelity, time-dependent models for the prediction of a CBRN agent's movement and fate in a complex urban environment can provide the strongest technical and scientific foundation for support of Canada's more broadly based effort at advancing counter-terrorism planning and operational capabilities. The objective of this paper is to report the progress of developing and validating an integrated, state-of-the-art, high-fidelity multi-scale, multi-physics modeling system for the accurate and efficient prediction of urban flow and dispersion of CBRN (and other toxic) materials discharged into these flows. Development of this proposed multi-scale modeling system will provide the real-time modeling and simulation tool required to predict injuries, casualties and contamination and to make relevant decisions (based on the strongest technical and scientific foundations) in order to minimize the consequences of a CBRN incident in a populated centre.
Janssens, K; Van Brecht, A; Zerihun Desta, T; Boonen, C; Berckmans, D
2004-06-01
The present paper outlines a modeling approach developed to model the internal dynamics of heat and moisture transfer in an imperfectly mixed ventilated airspace. The approach, which combines the classical heat and moisture balance differential equations with the use of experimental time-series data, provides a physically meaningful description of the process and is very useful for model-based control purposes. The paper illustrates how the modeling approach has been applied to a ventilated laboratory test room with internal heat and moisture production. The results are evaluated and some valuable suggestions for future research are offered. The modeling approach outlined in this study provides an ideal form for advanced model-based control system design. The relatively low number of parameters makes it well suited for model-based control purposes, as a limited number of identification experiments is sufficient to determine these parameters. The model concept provides information about the air quality and airflow pattern in an arbitrary building. By using this model as a simulation tool, the indoor air quality and airflow pattern can be optimized.
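The classical well-mixed heat balance that such approaches start from, rho_cp * V * dT/dt = vdot * rho_cp * (T_in - T) + Q, can be integrated with explicit Euler to show the zone temperature relaxing to its steady state T_in + Q / (vdot * rho_cp). The sketch below is a minimal illustration of that balance only (the paper's contribution is the data-driven treatment of imperfect mixing, which this omits); all parameter values are illustrative assumptions:

```python
def simulate_zone_temp(T0, T_in, vdot, Q, rho_cp=1200.0, V=50.0,
                       dt=1.0, steps=5000):
    """Explicit-Euler integration of the well-mixed single-zone balance
        rho_cp * V * dT/dt = vdot * rho_cp * (T_in - T) + Q
    with T in degC, vdot (ventilation rate) in m^3/s, Q (internal heat
    load) in W, rho_cp (volumetric heat capacity of air) in J/(m^3 K),
    and zone volume V in m^3.
    """
    T = T0
    for _ in range(steps):
        dT_dt = (vdot * rho_cp * (T_in - T) + Q) / (rho_cp * V)
        T += dt * dT_dt
    return T
```

With supply air at 10 degC and a 1.2 kW internal load at vdot = 0.1 m^3/s, the zone settles to T_in + Q / (vdot * rho_cp) = 20 degC; an imperfectly mixed model replaces this single balance with identified zonal dynamics.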
Introduction: Cardiovascular physics
NASA Astrophysics Data System (ADS)
Wessel, Niels; Kurths, Jürgen; Ditto, William; Bauernschmitt, Robert
2007-03-01
The number of patients suffering from cardiovascular diseases is increasing disproportionately with population growth and aging, leading to very high expenses in the public health system. The challenge of cardiovascular physics is therefore to develop highly sophisticated methods that can, on the one hand, supplement or replace expensive medical devices and, on the other hand, improve medical diagnostics while decreasing the patient's risk. Cardiovascular physics, which interconnects medicine, physics, biology, engineering, and mathematics, is based on interdisciplinary collaboration of specialists from these fields and attempts to gain deeper insights into pathophysiology and treatment options. This paper summarizes advances in cardiovascular physics with emphasis on a workshop held in Bad Honnef, Germany, in May 2005. The meeting attracted an interdisciplinary audience and led to a number of papers covering the main research fields of cardiovascular physics, including data analysis, modeling, and medical application. The variety of problems addressed in this issue underlines the complexity of the cardiovascular system. This Focus Issue demonstrates that data analysis and modeling methods from cardiovascular physics can lead to significant improvements in different medical fields. Consequently, this Focus Issue of Chaos is a status report that may invite all interested readers to join the community and find competent discussion and cooperation partners.
Crowell, Michael S; Dedekam, Erik A; Johnson, Michael R; Dembowski, Scott C; Westrick, Richard B; Goss, Donald L
2016-10-01
While advanced diagnostic imaging is a large contributor to the growth in health care costs, direct access to physical therapy is associated with decreased rates of diagnostic imaging. No study has systematically evaluated, with evidence-based criteria, the appropriateness of advanced diagnostic imaging, including magnetic resonance imaging (MRI), when ordered by physical therapists. The primary purpose of this study was to describe the appropriateness of MRI or magnetic resonance arthrogram (MRA) examinations ordered by physical therapists in a direct-access sports physical therapy clinic. Retrospective observational study of practice. It was hypothesized that greater than 80% of advanced diagnostic imaging orders would have an American College of Radiology (ACR) Appropriateness Criteria rating of greater than 6, indicating an imaging order that is usually appropriate. A 2-year retrospective analysis identified 108 MRI/MRA examination orders from four physical therapists. A board-certified radiologist determined the appropriateness of each order based on the ACR Appropriateness Criteria. The principal investigator and co-investigator radiologist assessed agreement between the clinical diagnosis and MRI/surgical findings. Knee (31%) and shoulder (25%) injuries were the most common. Overall, 55% of injuries were acute. The mean ACR rating was 7.7; scores from six to nine are considered appropriate orders, and higher ratings are better. The percentage of orders complying with the ACR Appropriateness Criteria was 83.2%. The physical therapists' clinical diagnosis was confirmed by MRI/MRA findings in 64.8% of cases and by surgical findings in 90% of cases. Physical therapists providing musculoskeletal primary care in a direct-access sports physical therapy clinic appropriately ordered advanced diagnostic imaging in over 80% of cases.
Future research should prospectively compare physical therapist appropriateness and utilization to other groups of providers and explore the effects of physical therapist imaging privileging on outcomes. Diagnosis, Level 3.
WE-F-211-01: The Evolving Landscape of Scientific Publishing.
Armato, S; Hendee, W; Marshall, C; Curran, B
2012-06-01
The dissemination of scientific advances had changed little since the first peer-reviewed journal was published in 1665; that is, until this past decade. The print journal, delivered by mail and stored on office shelves and in library reading rooms around the world, has been transformed by immediate, on-demand access to scientific discovery in electronic form. At the same time, the producers and consumers of that scientific content have greatly increased in number, and the balance between supply and demand has required innovations in the world of scientific publishing. In light of technological advances and societal expectations, the dissemination of scientific knowledge has assumed a new form, one that is dynamic and rapidly changing. The academic medical physicist must understand this evolution to ensure that appropriate decisions are made with regard to journal submission strategies and that relevant information on new findings is obtained in a timely manner. Medical Physics is adapting to these changes in substantive ways. This new scientific publishing landscape has implications for subscription models, targeted access through semantic enrichment, user interactivity with content, customized content delivery, and advertising opportunities. Many organizations, including the AAPM, depend on scientific publishing as a significant source of revenue, but web-based delivery raises the expectation that access should be free and threatens this model. The purpose of this symposium is to explore the factors that have contributed to the current state of scientific publishing, to anticipate future directions in this arena, and to convey how medical physicists may benefit from the expanded opportunities, both as authors and as readers. Learning objectives: 1. To appreciate the importance of scientific and clinical practice communication for the advancement of the medical physics field. 2. To understand the roles of the Editorial Board and the Journal Business Management Committee in the promotion and advancement of Medical Physics. 3. To explore technology-driven content delivery mechanisms and their role in facilitating content access and driving content usage. 4. To understand the potential benefits and pitfalls of various economic and editorial models of scientific publications and the recent shifts away from the traditional role of libraries. © 2012 American Association of Physicists in Medicine.
Wind-US Code Physical Modeling Improvements to Complement Hypersonic Testing and Evaluation
NASA Technical Reports Server (NTRS)
Georgiadis, Nicholas J.; Yoder, Dennis A.; Towne, Charles S.; Engblom, William A.; Bhagwandin, Vishal A.; Power, Greg D.; Lankford, Dennis W.; Nelson, Christopher C.
2009-01-01
This report gives an overview of physical modeling enhancements to the Wind-US flow solver which were made to improve the capabilities for simulation of hypersonic flows and the reliability of computations to complement hypersonic testing. The improvements include advanced turbulence models, a bypass transition model, a conjugate (or closely coupled to vehicle structure) conduction-convection heat transfer capability, and an upgraded high-speed combustion solver. A Mach 5 shock-wave boundary layer interaction problem is used to investigate the benefits of k-ε and k-ω based explicit algebraic stress turbulence models relative to linear two-equation models. The bypass transition model is validated using data from experiments for incompressible boundary layers and a Mach 7.9 cone flow. The conjugate heat transfer method is validated for a test case involving reacting H2-O2 rocket exhaust over cooled calorimeter panels. A dual-mode scramjet configuration is investigated using both a simplified 1-step kinetics mechanism and an 8-step mechanism. Additionally, variations in the turbulent Prandtl and Schmidt numbers are considered for this scramjet configuration.
NASA Astrophysics Data System (ADS)
Huang, Shih-Chieh Douglas
In this dissertation, I investigate the effects of a grounded learning experience on college students' mental models of physics systems. The grounded learning experience consisted of a priming stage and an instruction stage, and within each stage, one of two different types of visuo-haptic representation was applied: visuo-gestural simulation (visual modality and gestures) and visuo-haptic simulation (visual modality, gestures, and somatosensory information). A pilot study involving N = 23 college students examined how using different types of visuo-haptic representation in instruction affected people's mental model construction for physics systems. Participants' abilities to construct mental models were operationalized through their pretest-to-posttest gain scores for a basic physics system and their performance on a transfer task involving an advanced physics system. Findings from this pilot study revealed that, while both simulations significantly improved participants' mental model construction for physics systems, visuo-haptic simulation was significantly better than visuo-gestural simulation. In addition, clinical interviews suggested that participants' mental model construction for physics systems benefited from receiving visuo-haptic simulation in a tutorial prior to the instruction stage. A dissertation study involving N = 96 college students examined how types of visuo-haptic representation in different applications support participants' mental model construction for physics systems. Participants' abilities to construct mental models were again operationalized through their pretest-to-posttest gain scores for a basic physics system and their performance on a transfer task involving an advanced physics system. Participants' physics misconceptions were also measured before and after the grounded learning experience.
Findings from this dissertation study not only revealed that visuo-haptic simulation was significantly more effective than visuo-gestural simulation in promoting mental model construction and remedying participants' physics misconceptions, but also that visuo-haptic simulation was more effective during the priming stage than during the instruction stage. Interestingly, the effects of visuo-haptic simulation in priming and in instruction on participants' pretest-to-posttest gain scores for a basic physics system appeared additive. These results suggested that visuo-haptic simulation is effective in physics learning, especially when it is used during the priming stage.
Validating a Model for Welding Induced Residual Stress Using High-Energy X-ray Diffraction
Mach, J. C.; Budrow, C. J.; Pagan, D. C.; ...
2017-03-15
Integrated computational materials engineering (ICME) provides a pathway to advance performance in structures through the use of physically-based models to better understand how manufacturing processes influence product performance. As one particular challenge, consider that residual stresses induced in fabrication are pervasive and directly impact the life of structures. For ICME to be an effective strategy, it is essential that predictive capability be developed in conjunction with critical experiments. In the present paper, simulation results from a multi-physics model for gas metal arc welding are evaluated through x-ray diffraction using synchrotron radiation. A test component was designed with the intent to develop significant gradients in residual stress, to be representative of real-world engineering application, and yet to remain tractable for finely spaced strain measurements with the positioning equipment available at synchrotron facilities. Finally, the experimental validation lends confidence to model predictions, facilitating the explicit consideration of residual stress distribution in prediction of fatigue life.
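The validation described above rests on converting measured lattice spacings into stresses that can be compared with model predictions. A minimal sketch of that conversion under a biaxial plane-stress assumption follows; all numerical values (d-spacings, elastic constants) are hypothetical illustrations, not data from the paper.

```python
import numpy as np

# Hypothetical inputs for illustration (not values from the paper)
d0 = 1.1702   # strain-free lattice spacing, angstroms
E = 200e9     # Young's modulus (steel), Pa
nu = 0.3      # Poisson's ratio

def lattice_strain(d, d0):
    """Elastic lattice strain from a measured vs. strain-free d-spacing."""
    return (d - d0) / d0

def biaxial_stress(eps_x, eps_y, E, nu):
    """In-plane stresses from two orthogonal strains (plane-stress Hooke's law)."""
    c = E / (1.0 - nu**2)
    return c * (eps_x + nu * eps_y), c * (eps_y + nu * eps_x)

eps_x = lattice_strain(1.1709, d0)   # measured d-spacings (hypothetical)
eps_y = lattice_strain(1.1698, d0)
sigma_x, sigma_y = biaxial_stress(eps_x, eps_y, E, nu)
```

With these inputs the x-direction spacing exceeds d0, giving a tensile sigma_x, while the y-direction is compressive; real synchrotron analyses would repeat this at each measurement point to map the residual stress gradients.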
Development of the Joint NASA/NCAR General Circulation Model
NASA Technical Reports Server (NTRS)
Lin, S.-J.; Rood, R. B.
1999-01-01
The Data Assimilation Office at NASA/Goddard Space Flight Center is collaborating with NCAR/CGD in an ambitious proposal for the development of a unified climate, numerical weather prediction, and chemistry transport model which is suitable for global data assimilation of the physical and chemical state of the Earth's atmosphere. A prototype model based on the NCAR CCM3 physics and the NASA finite-volume dynamical core has been built. A unique feature of the NASA finite-volume dynamical core is its advanced tracer transport algorithm on the floating Lagrangian control-volume coordinate. The model currently has a highly idealized ozone production/loss chemistry derived from the observed 2D (latitude-height) climatology of the recent decades. Nevertheless, the simulated horizontal wave structure of the total ozone is in good qualitative agreement with the observed (TOMS). Long term climate simulations and NWP experiments have been carried out. The current status and future plans will be discussed at the conference.
Simple universal models capture all classical spin physics.
De las Cuevas, Gemma; Cubitt, Toby S
2016-03-11
Spin models are used in many studies of complex systems because they exhibit rich macroscopic behavior despite their microscopic simplicity. Here, we prove that all the physics of every classical spin model is reproduced in the low-energy sector of certain "universal models," with at most polynomial overhead. This holds for classical models with discrete or continuous degrees of freedom. We prove necessary and sufficient conditions for a spin model to be universal and show that one of the simplest and most widely studied spin models, the two-dimensional Ising model with fields, is universal. Our results may facilitate physical simulations of Hamiltonians with complex interactions. Copyright © 2016, American Association for the Advancement of Science.
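The universal model singled out above is the 2D Ising model with fields. For concreteness, a minimal sketch evaluating its Hamiltonian on a periodic lattice follows; a uniform field is used for simplicity, whereas the universality construction generally involves site-dependent fields.

```python
import numpy as np

def ising_energy(spins, J=1.0, h=0.0):
    """Energy of a 2D Ising configuration with nearest-neighbour coupling J
    and uniform external field h, periodic boundaries:
    H = -J * sum_<ij> s_i s_j - h * sum_i s_i
    """
    right = np.roll(spins, -1, axis=1)   # each site's right neighbour
    down = np.roll(spins, -1, axis=0)    # each site's lower neighbour
    interaction = -J * np.sum(spins * (right + down))  # each bond counted once
    field = -h * np.sum(spins)
    return interaction + field

spins = np.ones((4, 4))                       # all-up configuration
print(ising_energy(spins, J=1.0, h=0.5))      # -J*32 bonds - h*16 sites = -40.0
```

Encoding the low-energy sector of an arbitrary spin model into such a Hamiltonian, with only polynomial overhead, is the content of the universality result.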
Hanrahan, Nancy P.; Wu, Evan; Kelly, Deena; Aiken, Linda H.; Blank, Michael B.
2011-01-01
Individuals with serious mental illness are at greater risk of contracting HIV, experience multiple morbidities, and die 25 years younger than the general population. This high-need and high-cost subgroup faces unique barriers to accessing required health care in the current health care system. The effectiveness of an advanced practice nurse model of care management was assessed in a four-year randomized controlled trial; results are reported in this paper. A total of 238 community-dwelling individuals with HIV and serious mental illness (SMI) were randomly assigned to an intervention group (n=128) or to a control group (n=110). Over 12 months, the intervention group received care management from an advanced practice psychiatric nurse, and the control group received usual care. The intervention group showed significant improvement in depression (P=.012) and the physical component of health-related quality of life (P=.03) from baseline to 12 months. The advanced practice psychiatric nurse intervention is a model of care that holds promise for higher quality of care and better outcomes for this vulnerable population. PMID:21935499
NASA Astrophysics Data System (ADS)
Williams, C. Jason; Pierson, Frederick B.; Al-Hamdan, Osama Z.; Robichaud, Peter R.; Nearing, Mark A.; Hernandez, Mariano; Weltz, Mark A.; Spaeth, Kenneth E.; Goodrich, David C.
2017-04-01
Fire activity continues to increase in semi-arid regions around the globe. Private and governmental land management entities are challenged with predicting and mitigating post-fire hydrologic and erosion responses on these landscapes. For more than a decade, a team of scientists with the US Department of Agriculture has collaborated on extensive post-fire hydrologic field research and the application of field research to development of post-fire hydrology and erosion predictive technologies. Experiments funded through this research investigated the impacts of fire on vegetation and soils and the effects of these fire-induced changes on infiltration, runoff generation, erodibility, and soil erosion processes. The distribution of study sites spans diverse topography across grassland, shrubland, and woodland landscapes throughout the western United States. Knowledge gleaned from the extensive field experiments was applied to develop and enhance physically-based models for hillslope- to watershed-scale runoff and erosion prediction. Our field research and subsequent data syntheses have identified key knowledge gaps and challenges regarding post-fire hydrology and erosion modeling. Our presentation details some consistent trends across a diverse domain and varying landscape conditions based on our extensive field campaigns. We demonstrate how field data have advanced our understanding of post-fire hydrology and erosion for semi-arid landscapes and highlight remaining key knowledge gaps. Lastly, we briefly show how our well-replicated experimental methodologies have contributed to advancements in hydrologic and erosion model development for the post-fire environment.
NDE in aerospace - requirements for science, sensors and sense.
Heyman, J S
1989-01-01
The complexity of modern NDE (nondestructive evaluation) arises from four main factors: quantitative measurement science, physical models for computational analysis, realistic interfacing with engineering decisions, and direct access to management priorities. Recent advances in the four factors of NDE are addressed. Physical models of acoustic propagation are presented that have led to the development of measurement technologies advancing the ability to assure that materials and structures will perform as designed. In addition, a brief discussion is given of current research for future mission needs such as smart structures that sense their own health. Such advances permit projects to integrate design for inspection into their plans, bringing NDE into engineering and management priorities. The measurement focus is on ultrasonics with generous case examples. Problem solutions highlighted include critical stress in fasteners, residual stress in steel, NDE laminography, and solid rocket motor NDE.
NDE in aerospace - Requirements for science, sensors and sense
NASA Technical Reports Server (NTRS)
Heyman, Joseph S.
1989-01-01
The complexity of modern nondestructive evaluation (NDE) arises from four main factors: quantitative measurement science, physical models for computational analysis, realistic interfacing with engineering decisions, and direct access to management priorities. Recent advances in the four factors of NDE are addressed. Physical models of acoustic propagation are presented that have led to the development of measurement technologies advancing the ability to assure that materials and structures will perform as designed. In addition, a brief discussion is given of current research for future mission needs such as smart structures that sense their own health. Such advances permit projects to integrate design for inspection into their plans, bringing NDE into engineering and management priorities. The measurement focus is on ultrasonics with generous case examples. Problem solutions highlighted include critical stress in fasteners, residual stress in steel, NDE laminography, and solid rocket motor NDE.
Catchment scale afforestation for mitigating flooding
NASA Astrophysics Data System (ADS)
Barnes, Mhari; Quinn, Paul; Bathurst, James; Birkinshaw, Stephen
2016-04-01
After the 2013-14 floods in the UK there were calls to 'forest the uplands' as a solution to reducing flood risk across the nation. At present, 1 in 6 homes in Britain is at risk of flooding, and current EU legislation demands a sustainable, 'nature-based solution'. However, the role of forests as a natural flood management technique remains highly controversial, owing to a distinct lack of robust evidence on its effectiveness in reducing flood risk during extreme events. Physically-based, spatially-distributed SHETRAN hydrological models of the Irthing catchment and Wark forest sub-catchments (northern England) have been developed in order to test hypotheses about the effect trees have on flood magnitude. These advanced physically-based models have been designed to capture scale-related responses from 1, through 10, to 100 km2, in a first study of the extent to which afforestation and woody debris runoff attenuation features (RAFs) may help to mitigate floods at the full catchment scale (100-1000 km2) and on a national basis. Furthermore, there is a need to analyse the extent to which land management practices, and the installation of nature-based RAFs such as woody debris dams in headwater catchments, can attenuate flood-wave movement and potentially reduce downstream flood risk. The impacts of riparian planting, and the benefits of adding large woody debris of several designs to channels of differing sizes, have also been simulated using advanced hydrodynamic (HiPIMS) and hydrological (SHETRAN) modelling. With the aim of determining the effect forestry may have on flood frequency, 1000 years of generated rainfall data representative of current conditions have been used to determine the difference between current land cover, different distributions of forest cover, and the defining scenarios: complete forest removal and complete afforestation of the catchment.
The simulations show the percentage of forestry required to have a significant impact on mitigating downstream flood risk at sub-catchment and catchment scale. Key words: Flood peak, nature-based solutions, forest hydrology, hydrological modelling, SHETRAN, flood frequency, flood magnitude, land-cover change, upland afforestation.
NASA Technical Reports Server (NTRS)
Mansour, Nagi N.; Wray, Alan A.; Mehrotra, Piyush; Henney, Carl; Arge, Nick; Godinez, H.; Manchester, Ward; Koller, J.; Kosovichev, A.; Scherrer, P.;
2013-01-01
The Sun lies at the center of space weather and is the source of its variability. The primary input to coronal and solar wind models is the activity of the magnetic field in the solar photosphere. Recent advancements in solar observations and numerical simulations provide a basis for developing physics-based models for the dynamics of the magnetic field from the deep convection zone of the Sun to the corona with the goal of providing robust near real-time boundary conditions at the base of space weather forecast models. The goal is to develop new strategic capabilities that enable characterization and prediction of the magnetic field structure and flow dynamics of the Sun by assimilating data from helioseismology and magnetic field observations into physics-based realistic magnetohydrodynamics (MHD) simulations. The integration of first-principle modeling of solar magnetism and flow dynamics with real-time observational data via advanced data assimilation methods is a new, transformative step in space weather research and prediction. This approach will substantially enhance an existing model of magnetic flux distribution and transport developed by the Air Force Research Lab. The development plan is to use the Space Weather Modeling Framework (SWMF) to develop Coupled Models for Emerging flux Simulations (CMES) that couples three existing models: (1) an MHD formulation with the anelastic approximation to simulate the deep convection zone (FSAM code), (2) an MHD formulation with full compressible Navier-Stokes equations and a detailed description of radiative transfer and thermodynamics to simulate near-surface convection and the photosphere (Stagger code), and (3) an MHD formulation with full, compressible Navier-Stokes equations and an approximate description of radiative transfer and heating to simulate the corona (Module in BATS-R-US). CMES will enable simulations of the emergence of magnetic structures from the deep convection zone to the corona. 
Finally, a plan will be summarized on the development of a Flux Emergence Prediction Tool (FEPT) in which helioseismology-derived data and vector magnetic maps are assimilated into CMES that couples the dynamics of magnetic flux from the deep interior to the corona.
Methodologies for Development of Patient Specific Bone Models from Human Body CT Scans
NASA Astrophysics Data System (ADS)
Chougule, Vikas Narayan; Mulay, Arati Vinayak; Ahuja, Bharatkumar Bhagatraj
2016-06-01
This work deals with the development of an algorithm for physical replication of patient-specific human bone and construction of corresponding implant/insert RP models, using a reverse engineering approach applied to non-invasive medical images for surgical purposes. In the medical field, volumetric data, i.e. voxel and triangular-facet-based models, are primarily used for bio-modelling and visualization, which requires huge memory space. On the other hand, recent advances in Computer Aided Design (CAD) technology provide additional facilities/functions for design, prototyping, and manufacturing of objects having freeform surfaces, based on boundary representation techniques. This work presents a process for physical replication of 3D rapid prototyping (RP) models of human bone using CAD modeling techniques applied to 3D point cloud data obtained from non-invasive CT/MRI scans in DICOM 3.0 format. The point cloud data are used for construction of a 3D CAD model by fitting B-spline curves through the points and then fitting surfaces between these curve networks using swept blend techniques. Alternatively, a triangular mesh can be generated directly from the 3D point cloud data, without developing a surface model in commercial CAD software; here the Delaunay tetrahedralization approach is used to process the 3D point cloud data to obtain the STL file. The generated STL file is used as the basic input for the RP process. CT scan data of a metacarpus (human bone) is used as the case study for generation of the 3D RP model. A 3D physical model of the human bone is produced on a rapid prototyping machine and its virtual reality model is presented for visualization. The CAD models generated by the different techniques are compared for accuracy and reliability, and the results of this research work are assessed for clinical reliability in replication of human bone in the medical field.
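As an illustration of the mesh-based route described above, the following sketch tetrahedralizes a point cloud, keeps the triangles that belong to exactly one tetrahedron (the outer surface), and writes them as an ASCII STL. It is a simplified stand-in: the surface recovered this way is the convex hull of the points, so concave bone features would require the surface-fitting route or additional filtering.

```python
import numpy as np
from collections import Counter
from scipy.spatial import Delaunay

def boundary_faces(points):
    """Tetrahedralize a 3D point cloud and return the faces that belong
    to exactly one tetrahedron -- these form the boundary surface."""
    tets = Delaunay(points).simplices          # (n_tet, 4) vertex indices
    faces = Counter()
    for a, b, c, d in tets:
        for tri in ((a, b, c), (a, b, d), (a, c, d), (b, c, d)):
            faces[tuple(sorted(tri))] += 1
    return [f for f, n in faces.items() if n == 1]

def write_ascii_stl(points, faces, path):
    """Emit the surface triangles as an ASCII STL file."""
    with open(path, "w") as fh:
        fh.write("solid bone\n")
        for i, j, k in faces:
            p, q, r = points[i], points[j], points[k]
            n = np.cross(q - p, r - p)
            n = n / (np.linalg.norm(n) or 1.0)   # guard degenerate triangles
            fh.write(f"  facet normal {n[0]} {n[1]} {n[2]}\n    outer loop\n")
            for v in (p, q, r):
                fh.write(f"      vertex {v[0]} {v[1]} {v[2]}\n")
            fh.write("    endloop\n  endfacet\n")
        fh.write("endsolid bone\n")

pts = np.random.default_rng(0).random((40, 3))  # toy stand-in for CT points
faces = boundary_faces(pts)
write_ascii_stl(pts, faces, "surface.stl")
```

The resulting STL can be fed directly to RP slicing software; a real pipeline would first segment and thin the DICOM voxel data to produce the point cloud.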
Modelling vortex-induced fluid-structure interaction.
Benaroya, Haym; Gabbai, Rene D
2008-04-13
The principal goal of this research is developing physics-based, reduced-order, analytical models of nonlinear fluid-structure interactions associated with offshore structures. Our primary focus is to generalize Hamilton's variational framework so that systems of flow-oscillator equations can be derived from first principles. This is an extension of earlier work that led to a single energy equation describing the fluid-structure interaction. It is demonstrated here that flow-oscillator models are a subclass of the general, physics-based framework. A flow-oscillator model is a reduced-order mechanical model, generally comprising two mechanical oscillators: one modelling the structural oscillation and the other a nonlinear oscillator representing the fluid behaviour coupled to the structural motion. Reduced-order analytical model development continues to be carried out using a Hamilton's-principle-based variational approach. This provides flexibility in the long run for generalizing the modelling paradigm to complex, three-dimensional problems with multiple degrees of freedom, although such extension is very difficult. As both experimental and analytical capabilities advance, the critical research path to developing and implementing fluid-structure interaction models entails formulating generalized equations of motion, as a superset of the flow-oscillator models, and developing experimentally derived, semi-analytical functions to describe key terms in the governing equations of motion. The developed variational approach yields a system of governing equations, which will allow modelling of multiple degree-of-freedom systems. The extensions derived generalize Hamilton's variational formulation for such problems. The Navier-Stokes equations are derived and coupled to the structural oscillator. This general model has been shown to be a superset of the flow-oscillator model; based on different assumptions, one can derive a variety of flow-oscillator models.
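A minimal numerical sketch in the spirit of the flow-oscillator subclass described above: a linear structural oscillator coupled to a van der Pol wake oscillator, a common form of such models. The parameter values and couplings here are illustrative only, not taken from this work.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical non-dimensional parameters (illustrative, not from the paper)
zeta = 0.05          # structural damping ratio
delta = 1.0          # structural-to-shedding frequency ratio
eps = 0.3            # van der Pol nonlinearity of the wake variable
A, M = 12.0, 0.01    # structure-to-wake and wake-to-structure couplings

def rhs(t, s):
    """Structural displacement y coupled to a van der Pol wake variable q."""
    y, ydot, q, qdot = s
    yddot = -2 * zeta * delta * ydot - delta**2 * y + M * q
    qddot = eps * (1 - q**2) * qdot - q + A * yddot
    return [ydot, yddot, qdot, qddot]

sol = solve_ivp(rhs, (0, 200), [0.0, 0.0, 0.1, 0.0], max_step=0.05)
amplitude = np.abs(sol.y[0, -1000:]).max()   # late-time structural amplitude
```

The wake variable settles onto a self-limited limit cycle that sustains a finite structural oscillation, the qualitative signature of vortex-induced vibration that the variational framework aims to recover from first principles.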
Analytical and simulator study of advanced transport
NASA Technical Reports Server (NTRS)
Levison, W. H.; Rickard, W. W.
1982-01-01
An analytic methodology, based on the optimal-control pilot model, was demonstrated for assessing longitudinal-axis handling qualities of transport aircraft on final approach. Calibration of the methodology is largely in terms of closed-loop performance requirements, rather than specific vehicle response characteristics, and is based on a combination of published criteria, pilot preferences, physical limitations, and engineering judgment. Six longitudinal-axis approach configurations were studied covering a range of handling qualities problems, including the presence of flexible aircraft modes. The analytical procedure was used to obtain predictions of Cooper-Harper ratings, a scalar quadratic performance index, and rms excursions of important system variables.
Improvements to Nuclear Data and Its Uncertainties by Theoretical Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Danon, Yaron; Nazarewicz, Witold; Talou, Patrick
2013-02-18
This project addresses three important gaps in existing evaluated nuclear data libraries that represent a significant hindrance against highly advanced modeling and simulation capabilities for the Advanced Fuel Cycle Initiative (AFCI). This project will: Develop advanced theoretical tools to compute prompt fission neutrons and gamma-ray characteristics well beyond average spectra and multiplicity, and produce new evaluated files of U and Pu isotopes, along with some minor actinides; Perform state-of-the-art fission cross-section modeling and calculations using global and microscopic model input parameters, leading to truly predictive fission cross-sections capabilities. Consistent calculations for a suite of Pu isotopes will be performed; Implement innovative data assimilation tools, which will reflect the nuclear data evaluation process much more accurately, and lead to a new generation of uncertainty quantification files. New covariance matrices will be obtained for Pu isotopes and compared to existing ones. The deployment of a fleet of safe and efficient advanced reactors that minimize radiotoxic waste and are proliferation-resistant is a clear and ambitious goal of AFCI. While in the past the design, construction and operation of a reactor were supported through empirical trials, this new phase in nuclear energy production is expected to rely heavily on advanced modeling and simulation capabilities. To be truly successful, a program for advanced simulations of innovative reactors will have to develop advanced multi-physics capabilities, to be run on massively parallel super-computers, and to incorporate adequate and precise underlying physics. And all these areas have to be developed simultaneously to achieve those ambitious goals. Of particular interest are reliable fission cross-section uncertainty estimates (including important correlations) and evaluations of prompt fission neutrons and gamma-ray spectra and uncertainties.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahesh, M; Borras, C; Frey, G
This workshop is jointly organized by the AAPM, the Spanish (SEFM) and the Russian (AMPR) medical physics societies, as part of formal educational exchange agreements signed by the AAPM with each of these two societies. With the rapid technological advances in radiation therapy, both for treatment and imaging, teaching physics to medical physicists practicing in radiation therapy is challenging. The main objective of this workshop is to bring forth the current status, challenges, and issues related to the education of radiation therapy physicists in the US, Spain, and Russia. Medical physicists from each of these countries will present the educational requirements of international recommendations and directives and analyze their impact on national legislation. Current and future educational models and plans for harmonization will be described. The role of universities, professional societies, and examination boards, such as the American Board of Radiology, will be discussed. Minimum standards will be agreed upon. Learning Objectives: Review medical physics educational models supported by AAPM, SEFM, and AMPR. Discuss the role of governmental and non-governmental organizations in elaborating and adopting medical physics syllabi. Debate minimum educational standards for medical physics education based on country-specific resources.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amadio, G.; et al.
An intensive R&D and programming effort is required to accomplish new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting in parallel particles in complex geometries exploiting instruction level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both the multi-core and the many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics model effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.
Towards Plasma-Based Water Purification: Challenges and Prospects for the Future
NASA Astrophysics Data System (ADS)
Foster, John
2016-10-01
Freshwater scarcity driven by climate change, pollution, and over-development has led to serious consideration of water reuse. Advanced water treatment technologies will be required to process wastewater slated for reuse. One new and emerging technology that could potentially address the removal of micropollutants in both drinking water and wastewater slated for reuse is plasma-based water purification. Plasma in contact with liquid water generates reactive species that attack and ultimately mineralize organic contaminants in solution. This interaction takes place in a boundary layer centered at the plasma-liquid interface. The physical processes taking place at this interface, though poorly understood, are key to the optimization of plasma water purifiers. High electric field conditions, large density gradients, plasma-driven chemistries, and fluid dynamic effects prevail in this multiphase region. The region is also the source function for longer-lived reactive species that ultimately treat the water. Here, we review the need for advanced water treatment methods and, in the process, make the case for plasma-based methods. Additionally, we survey the basic methods of interacting plasma with liquid water (including a discussion of breakdown processes in water), the current state of understanding of the physical processes taking place at the plasma-liquid interface, and the role that these processes play in water purification. The development of diagnostics usable in this multiphase environment, along with modeling efforts aimed at elucidating physical processes taking place at the interface, is also detailed. Key experiments that demonstrate the capability of plasma-based water treatment are reviewed, and the technical challenges to the implementation of plasma-based water reactors are discussed. NSF CBET 1336375 and DOE DE-SC0001939.
Shallwani, Shirin M; Simmonds, Maureen J; Kasymjanova, Goulnar; Spahija, Jadranka
2016-09-01
Our objectives were: (a) to identify predictors of change in health-related quality of life (HRQOL) in patients with advanced non-small cell lung cancer (NSCLC) undergoing chemotherapy; and (b) to characterize symptom status, nutritional status, physical performance and HRQOL in this population and to estimate the extent to which these variables change following two cycles of chemotherapy. A secondary analysis of a longitudinal observational study of 47 patients (24 men and 23 women) with newly diagnosed advanced NSCLC receiving two cycles of first-line chemotherapy was performed. Primary outcomes were changes in HRQOL (physical and mental component summaries (PCS and MCS) of the 36-item Short-Form Health Survey (SF-36)). Predictors in the models included pre-chemotherapy patient-reported symptoms (Schwartz Cancer Fatigue Scale (SCFS) and Lung Cancer Subscale), nutritional screening (Patient-Generated Subjective Global Assessment) and physical performance measures (6-min Walk Test (6MWT), one-minute chair rise test and grip strength). Mean SF-36 PCS score, 6MWT distance and grip strength declined following two cycles of chemotherapy (p<0.05). Multiple linear regression modelling revealed pre-chemotherapy SCFS score and 6MWT distance as the strongest predictors of change in the mental component of HRQOL accounting for 13% and 9% of the variance, respectively. No significant predictors were found for change in the physical component of HRQOL. Pre-chemotherapy 6MWT distance and fatigue severity predicted change in the mental component of HRQOL in patients with advanced NSCLC undergoing chemotherapy, while physical performance declined during treatment. Clinical management of these factors may be useful for HRQOL optimization in this population. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Cross hole GPR traveltime inversion using a fast and accurate neural network as a forward model
NASA Astrophysics Data System (ADS)
Mejer Hansen, Thomas
2017-04-01
Probabilistically formulated inverse problems can be solved using Monte Carlo sampling methods. In principle, both advanced prior information, such as that based on geostatistics, and complex non-linear forward physical models can be considered. However, in practice these methods can carry huge computational costs that limit their application, not least because of the requirements of solving the forward problem, where the physical response of an earth model has to be evaluated. Here, it is suggested to replace a computationally complex evaluation of the forward problem with a trained neural network that can be evaluated very fast. This introduces a modeling error, which is quantified probabilistically such that it can be accounted for during inversion. This allows very fast and efficient Monte Carlo sampling of the solution to an inverse problem. We demonstrate the methodology for first-arrival traveltime inversion of cross-hole ground-penetrating radar (GPR) data. An accurate forward model, based on 2D full-waveform modeling followed by automatic traveltime picking, is replaced by a fast neural network. This provides a sampling algorithm three orders of magnitude faster than using the full forward model, and considerably faster, and more accurate, than commonly used approximate forward models. The methodology has the potential to dramatically change the complexity of the types of inverse problems that can be solved using non-linear Monte Carlo sampling techniques.
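A toy 1-D illustration of the scheme described above: an "expensive" forward function is replaced by a cheap surrogate (a polynomial fit standing in for the trained network), the surrogate's modeling error is quantified from training residuals, and that error is folded into the likelihood variance during Metropolis sampling. The forward function, noise levels, and sampler settings are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def forward_full(m):
    """Stand-in for the expensive forward model (e.g. full-waveform + picking)."""
    return np.tanh(m) + 0.3 * m

# Cheap surrogate fitted once on a training set (stand-in for a trained network)
m_train = np.linspace(-3, 3, 200)
coeff = np.polyfit(m_train, forward_full(m_train), deg=5)
forward_fast = lambda m: np.polyval(coeff, m)

# Quantify the surrogate's modeling error probabilistically
sigma_model = (forward_full(m_train) - forward_fast(m_train)).std()

# Synthetic observation; the modeling error inflates the likelihood variance
m_true, sigma_obs = 1.2, 0.05
d_obs = forward_full(m_true) + rng.normal(0.0, sigma_obs)
sigma2 = sigma_obs**2 + sigma_model**2

def log_post(m):
    if abs(m) > 3.0:                      # uniform prior on the training range
        return -np.inf
    return -0.5 * (d_obs - forward_fast(m)) ** 2 / sigma2

# Metropolis sampling using only the fast surrogate
m, samples = 0.0, []
for _ in range(5000):
    m_prop = m + rng.normal(0.0, 0.3)
    if np.log(rng.random()) < log_post(m_prop) - log_post(m):
        m = m_prop
    samples.append(m)
```

Every likelihood evaluation here costs a polynomial evaluation rather than a full simulation, which is the source of the speed-up reported in the abstract; accounting for sigma_model keeps the posterior honest about the surrogate's imperfection.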
A Historical Perspective on Dynamics Testing at the Langley Research Center
NASA Technical Reports Server (NTRS)
Horta, Lucas G.; Kvaternik, Raymond G.; Hanks, Brantley R.
2000-01-01
The experience and advancement of structural dynamics testing for space system applications at the Langley Research Center of the National Aeronautics and Space Administration (NASA) over the past four decades is reviewed. This experience began in the 1960s with the development of a technology base using a variety of physical models to explore dynamic phenomena and to develop reliable analytical modeling capability for space systems. It continued through the 1970s and 80s with the development of rapid, computer-aided test techniques; the testing of low-natural-frequency, gravity-sensitive systems; the testing of integrated structures with active flexible motion control; and orbital flight measurements. It extended into the 1990s, when advanced computerized system identification methods were developed for estimating the dynamic states of complex, lightweight, flexible aerospace systems. The scope of discussion in this paper includes ground and flight tests and summarizes lessons learned in both successes and failures.
Advancing investigation and physical modeling of first-order fire effects on soils
William J. Massman; John M. Frank; Sacha J. Mooney
2010-01-01
Heating soil during intense wildland fires or slash-pile burns can alter the soil irreversibly, resulting in many significant long-term biological, chemical, physical, and hydrological effects. To better understand these long-term effects, it is necessary to improve modeling capability and prediction of the more immediate, or first-order, effects that fire can have on...
PHYSICS OF ECLIPSING BINARIES. II. TOWARD THE INCREASED MODEL FIDELITY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prša, A.; Conroy, K. E.; Horvat, M.
The precision of photometric and spectroscopic observations has been systematically improved in the last decade, mostly thanks to space-borne photometric missions and ground-based spectrographs dedicated to finding exoplanets. The field of eclipsing binary stars strongly benefited from this development. Eclipsing binaries serve as critical tools for determining fundamental stellar properties (masses, radii, temperatures, and luminosities), yet the models are not capable of reproducing observed data well, either because of missing physics or because of insufficient precision. This led to a predicament where radiative and dynamical effects, thus far buried in the noise, started showing up routinely in the data, but were not accounted for in the models. PHOEBE (PHysics Of Eclipsing BinariEs; http://phoebe-project.org) is an open source modeling code for computing theoretical light and radial velocity curves that addresses both problems by incorporating missing physics and by increasing the computational fidelity. In particular, we discuss triangulation as a superior surface discretization algorithm, meshing of rotating single stars, light travel time effects, advanced phase computation, volume conservation in eccentric orbits, and improved computation of local intensity across the stellar surfaces that includes the photon-weighted mode, enhanced limb darkening treatment, better reflection treatment, and Doppler boosting. Here we present the concepts on which PHOEBE is built and proofs of concept that demonstrate the increased model fidelity.
Static Behavior of Chalcogenide Based Programmable Metallization Cells
NASA Astrophysics Data System (ADS)
Rajabi, Saba
Nonvolatile memory (NVM) technologies have been an integral part of electronic systems for the past 30 years. The ideal non-volatile memory has minimal physical size, energy usage, and cost while having maximal speed, capacity, retention time, and radiation hardness. A promising candidate for next-generation memory is ion-conducting bridging RAM, referred to as programmable metallization cell (PMC), conductive bridge RAM (CBRAM), or electrochemical metallization memory (ECM), which is likely to surpass flash memory in all the ideal memory characteristics. A comprehensive physics-based model is needed to completely understand PMC operation and assist in design optimization. To advance the PMC modeling effort, this thesis presents a precise physical model parameterizing the materials associated with both the ion-rich and ion-poor layers of the PMC's solid electrolyte, such that it captures the static electrical behavior of the PMC in both its low-resistance on-state (LRS) and high-resistance off-state (HRS). The experimental data are measured from a chalcogenide glass PMC designed and manufactured at ASU. The static on- and off-state resistance of a PMC device composed of a layered (Ag-rich/Ag-poor) Ge30Se70 ChG film is characterized and modeled using three-dimensional simulation code written in the Silvaco Atlas finite element analysis software. Calibrating the model to experimental data enables the extraction of device parameters such as material bandgaps, workfunctions, densities of states, carrier mobilities, dielectric constants, and affinities. The sensitivity of the modeled PMC's HRS and LRS impedance behavior to variation of the extracted material parameters is also examined.
The accurate set of material parameters obtained for both the Ag-rich and Ag-poor ChG systems, together with verification of process-variation effects on the electrical characteristics, enables greater fidelity in PMC device simulation and significantly enhances our ability to understand the underlying physics of ChG-based resistive switching memory.
Faith, Myles S
2008-12-01
This report summarizes emerging opportunities for behavioral science to help advance the field of gene-environment and gene-behavior interactions, based on presentations at The National Cancer Institute (NCI) Workshop, "Gene-Nutrition and Gene-Physical Activity Interactions in the Etiology of Obesity." Three opportunities are highlighted: (i) designing potent behavioral "challenges" in experiments, (ii) determining viable behavioral phenotypes for genetics studies, and (iii) identifying specific measures of the environment or environmental exposures. Additional points are underscored, including the need to incorporate novel findings from neuroimaging studies regarding motivation and drive for eating and physical activity. Advances in behavioral science theory and methods can play an important role in advancing understanding of gene-brain-behavior relationships in obesity onset.
Application of Advanced Process Control techniques to a pusher type reheating furnace
NASA Astrophysics Data System (ADS)
Zanoli, S. M.; Pepe, C.; Barboni, L.
2015-11-01
In this paper an Advanced Process Control system aimed at controlling and optimizing a pusher type reheating furnace located in an Italian steel plant is proposed. The designed controller replaced the previous control system, based on PID controllers manually operated by process operators. A two-layer Model Predictive Control architecture has been adopted that, by exploiting a chemical, physical and economic modelling of the process, overcomes the limitations of plant operators’ mental model and knowledge. In addition, an ad hoc decoupling strategy has been implemented, allowing the selection of the manipulated variables to be used for the control of each single process variable. Finally, in order to improve the system flexibility and resilience, the controller has been equipped with a supervision module. A profitable trade-off between conflicting specifications, e.g. safety, quality and production constraints, energy saving and pollution impact, has been guaranteed. Simulation tests and real plant results demonstrated the soundness and the reliability of the proposed system.
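A minimal receding-horizon sketch of the predictive-control idea, assuming a scalar first-order zone model T[k+1] = a·T[k] + b·u[k]. The dynamics, weights, and setpoint are invented for illustration; the industrial controller described above adds constraints, decoupling, economic objectives, and a supervision layer.

```python
A, B = 0.9, 0.5        # assumed zone dynamics
Q, R = 1.0, 0.01       # tracking weight vs. control-effort weight
N = 10                 # prediction horizon

def solve(mat, rhs):
    """Gaussian elimination with partial pivoting (pure stdlib)."""
    n = len(rhs)
    M = [row[:] + [rhs[i]] for i, row in enumerate(mat)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def mpc_step(T0, T_ref):
    """Minimize sum_k Q*(T_k - T_ref)^2 + R*u_k^2 over the horizon,
    then apply only the first control move (receding horizon)."""
    f = [A ** (k + 1) * T0 for k in range(N)]                     # free response
    G = [[B * A ** (k - i) if i <= k else 0.0 for i in range(N)]  # input response
         for k in range(N)]
    H = [[Q * sum(G[k][i] * G[k][j] for k in range(N)) + (R if i == j else 0.0)
          for j in range(N)] for i in range(N)]
    g = [Q * sum(G[k][i] * (T_ref - f[k]) for k in range(N)) for i in range(N)]
    return solve(H, g)[0]

# Closed loop: drive the zone from 20 degC toward a 1000 degC setpoint.
T, T_ref = 20.0, 1000.0
for _ in range(30):
    T = A * T + B * mpc_step(T, T_ref)
```

At each sample the optimizer replans over the full horizon from the newly measured state, which is what gives MPC its feedback properties despite the open-loop optimization inside.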
Kenny, Joseph P.; Janssen, Curtis L.; Gordon, Mark S.; ...
2008-01-01
Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.
Calculation of protein-ligand binding affinities.
Gilson, Michael K; Zhou, Huan-Xiang
2007-01-01
Accurate methods of computing the affinity of a small molecule with a protein are needed to speed the discovery of new medications and biological probes. This paper reviews physics-based models of binding, beginning with a summary of the changes in potential energy, solvation energy, and configurational entropy that influence affinity, and a theoretical overview to frame the discussion of specific computational approaches. Important advances are reported in modeling protein-ligand energetics, such as the incorporation of electronic polarization and the use of quantum mechanical methods. Recent calculations suggest that changes in configurational entropy strongly oppose binding and must be included if accurate affinities are to be obtained. The linear interaction energy (LIE) and molecular mechanics Poisson-Boltzmann surface area (MM-PBSA) methods are analyzed, as are free energy pathway methods, which show promise and may be ready for more extensive testing. Ultimately, major improvements in modeling accuracy will likely require advances on multiple fronts, as well as continued validation against experiment.
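All of the methods reviewed ultimately aim at a standard binding free energy, which relates to a measured dissociation constant via ΔG° = RT ln(Kd/c°) with standard concentration c° = 1 M. A sketch of that bookkeeping (the example Kd is illustrative):

```python
import math

R_KCAL = 0.0019872   # gas constant in kcal/(mol K)
TEMP = 298.15        # K

def delta_g(kd_molar):
    """Standard binding free energy (kcal/mol) from Kd in molar units
    (1 M standard state, so Kd can be used directly)."""
    return R_KCAL * TEMP * math.log(kd_molar)

dg_nM = delta_g(1e-9)   # a 1 nM binder: roughly -12.3 kcal/mol
```

A useful rule of thumb follows directly: each factor of 10 in Kd corresponds to RT ln 10 ≈ 1.4 kcal/mol at room temperature, which sets the accuracy bar the reviewed methods must meet to rank ligands.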
An assessment of CFD-based wall heat transfer models in piston engines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sircar, Arpan; Paul, Chandan; Ferreyro-Fernandez, Sebastian
The lack of accurate submodels for in-cylinder heat transfer has been identified as a key shortcoming in developing truly predictive, physics-based computational fluid dynamics (CFD) models that can be used to develop combustion systems for advanced high-efficiency, low-emissions engines. Only recently have experimental methods become available that enable accurate near-wall measurements to enhance simulation capability via advancing models. Initial results show crank-angle dependent discrepancies with respect to previously used boundary-layer models of up to 100%. However, available experimental data is quite sparse (only few data points on engine walls) and limited (available measurements are those of heat flux only). Predictive submodels are needed for medium-resolution ("engineering") LES and for unsteady Reynolds-averaged simulations (URANS). Recently, some research groups have performed DNS studies on engine-relevant conditions using simple geometries. These provide very useful data for benchmarking wall heat transfer models under such conditions. Further, a number of new and more sophisticated models have also become available in the literature which account for these engine-like conditions. Some of these have been incorporated while others of a more complex nature, which include solving additional partial differential equations (PDEs) within the thin boundary layer near the wall, are underway. These models will then be tested against the available DNS/experimental data in both SI (spark-ignition) and CI (compression-ignition) engines.
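The baseline boundary-layer models being benchmarked are typically equilibrium wall functions. A hedged sketch of one textbook form (log-law with the Jayatilleke thermal correction; the constants are standard values, not the paper's, and the engine-like numbers are invented):

```python
import math

# q_w = rho * cp * u_tau * (T_gas - T_wall) / T_plus
KAPPA, E_WALL, PR_T = 0.41, 9.8, 0.9   # log-law constants, turbulent Prandtl

def t_plus(y_plus, pr):
    if y_plus < 11.0:                   # conductive sublayer
        return pr * y_plus
    # Jayatilleke P-function corrects for Pr != Pr_t in the log layer.
    p_jay = 9.24 * ((pr / PR_T) ** 0.75 - 1.0) * \
        (1.0 + 0.28 * math.exp(-0.007 * pr / PR_T))
    u_plus = math.log(E_WALL * y_plus) / KAPPA
    return PR_T * (u_plus + p_jay)

def wall_heat_flux(rho, cp, u_tau, t_gas, t_wall, y_plus, pr):
    """Wall heat flux (W/m^2) from near-wall cell values."""
    return rho * cp * u_tau * (t_gas - t_wall) / t_plus(y_plus, pr)

# Engine-like (illustrative) numbers: hot gas at 1500 K over a 450 K wall.
q = wall_heat_flux(rho=5.0, cp=1100.0, u_tau=2.0, t_gas=1500.0,
                   t_wall=450.0, y_plus=100.0, pr=0.7)
```

Models of this type assume a steady, incompressible equilibrium boundary layer, which is exactly the assumption the crank-angle-resolved DNS and heat-flux data call into question.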
Multiscale simulation of molecular processes in cellular environments.
Chiricotto, Mara; Sterpone, Fabio; Derreumaux, Philippe; Melchionna, Simone
2016-11-13
We describe the recent advances in studying biological systems via multiscale simulations. Our scheme is based on a coarse-grained representation of the macromolecules and a mesoscopic description of the solvent. The dual technique handles particles, the aqueous solvent and their mutual exchange of forces, resulting in a stable and accurate methodology allowing biosystems of unprecedented size to be simulated. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2016 The Author(s).
Hill, Jennie L; Chau, Clarice; Luebbering, Candice R; Kolivras, Korine K; Zoellner, Jamie
2012-09-06
Low-income, ethnic/racial minorities and rural populations are at increased risk for obesity and related chronic health conditions when compared to white, urban and higher-socio-economic status (SES) peers. Recent systematic reviews highlight the influence of the built environment on obesity, yet very few of these studies consider rural areas or populations. Utilizing a CBPR process, this study advances community-driven causal models to address obesity by exploring differences in resources for physical activity and food outlets by block group race and income in a small regional city that anchors a rural health-disparate region. To guide this inquiry we hypothesized that lower-income and racially diverse block groups would have fewer food outlets, including fewer grocery stores, and fewer physical activity outlets. We further hypothesized that walkability, as defined by a computed walkability index, would be lower in the lower-income block groups. Using census data and GIS, base maps of the region were created and block groups categorized by income and race. All food outlets and physical activity resources were enumerated and geocoded and a walkability index computed. Analyses included one-way MANOVA and spatial autocorrelation. In total, 49 stores, 160 restaurants and 79 physical activity outlets were enumerated. There were no differences in the number of outlets by block group income or race. Further, spatial analyses suggest that the distribution of outlets is dispersed across all block groups. Under the larger CBPR process, this enumeration study advances the causal models set forth by the community members to address obesity by providing an overview of the food and physical activity environment in this region. These data reflect the food and physical activity resources available to residents in the region and will aid many of the community-academic partners as they pursue intervention strategies targeting obesity.
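Walkability indices of this kind are commonly computed as a sum of z-scores of built-environment measures across the study units. A sketch with invented component names and numbers (the study's exact index may use different inputs or weights):

```python
from statistics import mean, pstdev

block_groups = {
    "bg1": {"intersections_per_km2": 45, "dwellings_per_ha": 12, "land_use_mix": 0.40},
    "bg2": {"intersections_per_km2": 80, "dwellings_per_ha": 30, "land_use_mix": 0.70},
    "bg3": {"intersections_per_km2": 20, "dwellings_per_ha": 5,  "land_use_mix": 0.20},
}

def z_scores(values):
    # Standardize each component so that units and scales are comparable.
    m, s = mean(values), pstdev(values)
    return [(v - m) / s for v in values]

names = list(block_groups)
components = ["intersections_per_km2", "dwellings_per_ha", "land_use_mix"]
walkability = {n: 0.0 for n in names}
for comp in components:
    for n, z in zip(names, z_scores([block_groups[n][comp] for n in names])):
        walkability[n] += z
```

Because each component is standardized before summing, the index ranks block groups relative to the study region itself, which is appropriate for the within-region comparisons by income and race described above.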
Nonequilibrium radiative hypersonic flow simulation
NASA Astrophysics Data System (ADS)
Shang, J. S.; Surzhikov, S. T.
2012-08-01
Nearly all the required scientific disciplines for computational hypersonic flow simulation have been developed within the framework of gas kinetic theory. However, when high-temperature physical phenomena occur beneath the molecular and atomic scales, knowledge of quantum physics and quantum chemical-physics becomes essential. Therefore, the most challenging topics in computational simulation can probably be identified as the chemical-physical models for a high-temperature gaseous medium. Thermal radiation is also associated with quantum transitions of molecular and electronic states. The radiative energy exchange is characterized by the mechanisms of emission, absorption, and scattering. In developing a simulation capability for nonequilibrium radiation, an efficient numerical procedure is equally important both for solving the radiative transfer equation and for generating the required optical data via the ab initio approach. In computational simulation, the initial values and boundary conditions are paramount for physical fidelity. Precise information at the material interface of an ablating environment requires more than just a balance of the fluxes across the interface; it must also consider the boundary deformation. The foundation of this theoretical development shall be built on the eigenvalue structure of the governing equations, which can be described by Reynolds' transport theorem. Recent innovations for possible aerospace vehicle performance enhancement via an electromagnetic effect appear to be very attractive. The effectiveness of this mechanism depends strongly on the degree of ionization of the flow medium, the consecutive interactions of fluid dynamics and electrodynamics, as well as an externally applied magnetic field. Some verified research results in this area will be highlighted.
An assessment of all these most recent advancements in nonequilibrium modeling of chemical kinetics, chemical-physics kinetics, ablation, radiative exchange, computational algorithms, and the aerodynamic-electromagnetic interaction is summarized and delineated. The critical basic research areas for physics-based hypersonic flow simulation should become self-evident through the present discussion. Nevertheless, intensive basic research efforts must be sustained in these areas for fundamental knowledge and future technology advancement.
NASA Astrophysics Data System (ADS)
Gerszewski, Daniel James
Physical simulation has become an essential tool in computer animation. As the use of visual effects increases, the need for simulating real-world materials increases. In this dissertation, we consider three problems in physics-based animation: large-scale splashing liquids, elastoplastic material simulation, and dimensionality reduction techniques for fluid simulation. Fluid simulation has been one of the greatest successes of physics-based animation, generating hundreds of research papers and a great many special effects over the last fifteen years. However, the animation of large-scale, splashing liquids remains challenging. We show that a novel combination of unilateral incompressibility, mass-full FLIP, and blurred boundaries is extremely well-suited to the animation of large-scale, violent, splashing liquids. Materials that incorporate both plastic and elastic deformations, also referred to as elastoplastic materials, are frequently encountered in everyday life. Methods for animating such common real-world materials are useful for effects practitioners and have been successfully employed in films. We describe a point-based method for animating elastoplastic materials. Our primary contribution is a simple method for computing the deformation gradient for each particle in the simulation. Given the deformation gradient, we can apply arbitrary constitutive models and compute the resulting elastic forces. Our method has two primary advantages: we do not store or compare to an initial rest configuration and we work directly with the deformation gradient. The first advantage avoids poor numerical conditioning and the second naturally leads to a multiplicative model of deformation appropriate for finite deformations. One of the most significant drawbacks of physics-based animation is that ever-higher fidelity leads to an explosion in the number of degrees of freedom. This problem leads us to the consideration of dimensionality reduction techniques.
We present several enhancements to model-reduced fluid simulation that allow improved simulation bases and two-way solid-fluid coupling. Specifically, we present a basis enrichment scheme that allows us to combine data-driven or artistically derived bases with more general analytic bases derived from Laplacian Eigenfunctions. Additionally, we handle two-way solid-fluid coupling in a time-splitting fashion---we alternately timestep the fluid and rigid body simulators, while taking into account the effects of the fluid on the rigid bodies and vice versa. We employ the vortex panel method to handle solid-fluid coupling and use dynamic pressure to compute the effect of the fluid on rigid bodies. Taken together, these contributions have advanced the state-of-the art in physics-based animation and are practical enough to be used in production pipelines.
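The quantity at the heart of the elastoplastic method above, a per-particle deformation gradient F mapping rest-space offsets to current offsets (dx ≈ F dX), can be estimated by least squares over a particle's neighbors. This 2-D static fit is only illustrative: the dissertation's contribution is precisely that F is updated multiplicatively each step without storing a rest configuration.

```python
def estimate_F(dX, dx):
    # Normal equations: F = (sum dx dX^T)(sum dX dX^T)^(-1), written out
    # explicitly for the 2x2 case.
    A = [[0.0, 0.0], [0.0, 0.0]]   # sum of dx dX^T
    B = [[0.0, 0.0], [0.0, 0.0]]   # sum of dX dX^T
    for (X1, X2), (x1, x2) in zip(dX, dx):
        A[0][0] += x1 * X1; A[0][1] += x1 * X2
        A[1][0] += x2 * X1; A[1][1] += x2 * X2
        B[0][0] += X1 * X1; B[0][1] += X1 * X2
        B[1][0] += X2 * X1; B[1][1] += X2 * X2
    det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
    Binv = [[B[1][1] / det, -B[0][1] / det],
            [-B[1][0] / det, B[0][0] / det]]
    return [[sum(A[i][k] * Binv[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Recover a known affine deformation from four neighbor offsets.
F_true = [[1.2, 0.1], [0.0, 0.9]]
dX = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (-1.0, 1.0)]
dx = [(F_true[0][0] * a + F_true[0][1] * b,
       F_true[1][0] * a + F_true[1][1] * b) for a, b in dX]
F_est = estimate_F(dX, dx)
```

Given F for each particle, an arbitrary constitutive model turns it into stress and hence elastic forces, which is the pipeline the abstract describes.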
NASA Astrophysics Data System (ADS)
Stafford, Luc
Advances in electronics and photonics critically depend upon plasma-based materials processing, either for transferring small lithographic patterns into underlying materials (plasma etching) or for the growth of high-quality films. This thesis deals with the etching mechanisms of materials using high-density plasmas. The general objective of this work is to provide an original framework for the plasma-material interaction involved in the etching of advanced materials, with emphasis on complex oxides such as SrTiO3, (Ba,Sr)TiO3 and SrBi2Ta2O9 films. Based on a synthesis of the descriptions proposed by different authors to explain the etching characteristics of simple materials in noble and halogenated plasma mixtures, we propose comprehensive rate models for physical and chemical plasma etching processes. These models have been successfully validated using experimental data published in the literature for Si, Pt, W, SiO2 and ZnO. As an example, we have been able to adequately describe the simultaneous dependence of the etch rate on ion and reactive neutral fluxes and on the ion energy. From an exhaustive experimental investigation of the plasma and etching properties, we have also demonstrated that the validity of the proposed models can be extended to complex oxides such as SrTiO3, (Ba,Sr)TiO3 and SrBi2Ta2O9 films. We also report, for the first time, physical aspects involved in plasma etching such as the influence of the film microstructural properties on the sputter-etch rate and the influence of the positive ion composition on the ion-assisted desorption dynamics. Finally, we have used our in-depth investigation of the etching mechanisms of STO films and the resulting excellent control of the etch rate to fabricate a ridge waveguide for photonic device applications. Keywords: plasma etching, sputtering, adsorption and desorption dynamics, high-density plasmas, plasma diagnostics, advanced materials, photonic applications.
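Rate models of the general family described above combine an energy-dependent sputter yield with flux limitation by whichever species (ions or reactive neutrals) is scarce. A hedged sketch with illustrative parameters, not the thesis's fitted values:

```python
import math

def sputter_yield(e_ion, e_th=25.0, a=0.1):
    """Energy-dependent yield (atoms/ion), zero below the threshold energy;
    the sqrt(E) - sqrt(E_th) scaling is the commonly used form."""
    return max(0.0, a * (math.sqrt(e_ion) - math.sqrt(e_th)))

def etch_rate(flux_ion, flux_neutral, e_ion, s=0.1):
    """Flux-limited ion-assisted etch rate (arbitrary units): the harmonic
    combination saturates when either supply channel becomes limiting."""
    ion_term = sputter_yield(e_ion) * flux_ion        # ion-assisted removal
    neutral_term = s * flux_neutral                   # reactant adsorption
    if ion_term == 0.0 or neutral_term == 0.0:
        return 0.0
    return ion_term * neutral_term / (ion_term + neutral_term)
```

In the neutral-starved limit the rate tracks the neutral flux and becomes independent of ion energy, while in the ion-starved limit it tracks the ion flux and yield; capturing both regimes simultaneously is what the abstract means by describing the etch rate's joint dependence on fluxes and ion energy.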
NASA Astrophysics Data System (ADS)
Laïssaoui, Mounir; Mesbah, Mohamed; Madani, Khodir; Kiniouar, Hocine
2018-05-01
To analyze the water budget under human influences in the Isser wadi alluvial aquifer in the northeast of Algeria, we built a mathematical model which can be used for better managing groundwater exploitation. A modular three-dimensional finite-difference groundwater flow model (MODFLOW) was used. The modelling system is largely based on physical laws and employs a finite-difference numerical method to simulate water movement and fluxes in a horizontally discretized field. After calibration in steady state, the model could reproduce the initial heads with rather good precision. It enabled us to quantify the aquifer water balance terms and to obtain a distribution of conductivity zones. The model also highlighted the relevant role of the Isser wadi, which constitutes a drain of great importance for the aquifer, alone ensuring almost all outflows. The scenarios suggested in transient simulations showed that an increase in pumping would only lower the groundwater levels further and disrupt the natural balance of the aquifer. However, it is clear that this situation depends primarily on the position of pumping wells in the plain as well as on the extracted volumes of water. As shown by the promising results of the model, this physically based, distributed-parameter model is a valuable contribution to the ever-advancing technology of hydrological modelling and water resources assessment.
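The finite-difference discretization MODFLOW uses can be illustrated in miniature: steady-state heads on a small homogeneous grid with fixed heads on two sides, a crude stand-in for a recharge boundary and the wadi drain. Grid, head values, and boundary layout are invented.

```python
NX, NY = 10, 10
h = [[0.0] * NX for _ in range(NY)]   # hydraulic head (m)
for j in range(NY):
    h[j][0] = 100.0        # fixed head, recharge side
    h[j][NX - 1] = 90.0    # fixed head, wadi drain side

# Gauss-Seidel iteration of the 5-point Laplace stencil to steady state
# (homogeneous conductivity cancels out of the steady-state equation).
for _ in range(2000):
    for j in range(NY):
        for i in range(1, NX - 1):
            up = h[j - 1][i] if j > 0 else h[j][i]       # no-flow top edge
            dn = h[j + 1][i] if j < NY - 1 else h[j][i]  # no-flow bottom edge
            h[j][i] = 0.25 * (h[j][i - 1] + h[j][i + 1] + up + dn)
```

With no-flow sides the converged solution is a linear head gradient toward the drain, the same drain-dominated outflow pattern the calibrated model revealed at full scale.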
A network model for characterizing brine channels in sea ice
NASA Astrophysics Data System (ADS)
Lieblappen, Ross M.; Kumar, Deip D.; Pauls, Scott D.; Obbard, Rachel W.
2018-03-01
The brine pore space in sea ice can form complex connected structures whose geometry is critical in the governance of important physical transport processes between the ocean, sea ice, and surface. Recent advances in three-dimensional imaging using X-ray micro-computed tomography have enabled the visualization and quantification of the brine network morphology and variability. Using imaging of first-year sea ice samples at in situ temperatures, we create a new mathematical network model to characterize the topology and connectivity of the brine channels. This model provides a statistical framework where we can characterize the pore networks via two parameters, depth and temperature, for use in dynamical sea ice models. Our approach advances the quantification of brine connectivity in sea ice, which can help investigations of bulk physical properties, such as fluid permeability, that are key in both global and regional sea ice models.
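The network abstraction above reduces to a graph whose nodes are pore bodies and whose edges are brine throats; connectivity from the ice surface to the ocean side is what controls transport. A toy, hand-made graph (the paper builds these from micro-CT images):

```python
from collections import deque

edges = [("top", "a"), ("a", "b"), ("b", "c"), ("c", "ocean"),
         ("a", "d"), ("d", "e"),            # dead-end side branch
         ("f", "g")]                        # isolated brine inclusion

graph = {}
for u, v in edges:                          # undirected adjacency sets
    graph.setdefault(u, set()).add(v)
    graph.setdefault(v, set()).add(u)

def connected(graph, src, dst):
    """Breadth-first search for a path between two pores."""
    seen, queue = {src}, deque([src])
    while queue:
        n = queue.popleft()
        if n == dst:
            return True
        for m in graph.get(n, ()):
            if m not in seen:
                seen.add(m)
                queue.append(m)
    return False

percolates = connected(graph, "top", "ocean")
```

Statistics over such graphs (vertex degree, path redundancy, fraction of percolating channels) as functions of depth and temperature are the kind of two-parameter characterization the model feeds to dynamical sea ice simulations.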
A Simulation and Modeling Framework for Space Situational Awareness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olivier, S S
This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.
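The simplest building block in such a framework, an orbital-propagation kernel, can be sketched as planar two-body dynamics integrated with classical RK4. Perturbations a real SSA code must include (J2, drag, third bodies) are omitted; units are km and s.

```python
import math

MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def deriv(state):
    x, y, vx, vy = state
    r3 = (x * x + y * y) ** 1.5
    return [vx, vy, -MU * x / r3, -MU * y / r3]

def rk4_step(state, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = deriv(state)
    k2 = deriv([s + 0.5 * dt * k for s, k in zip(state, k1)])
    k3 = deriv([s + 0.5 * dt * k for s, k in zip(state, k2)])
    k4 = deriv([s + dt * k for s, k in zip(state, k3)])
    return [s + dt / 6.0 * (a + 2 * b + 2 * c + d)
            for s, a, b, c, d in zip(state, k1, k2, k3, k4)]

# Circular orbit at 7000 km radius, propagated for roughly one period.
r0 = 7000.0
state = [r0, 0.0, 0.0, math.sqrt(MU / r0)]
period = 2.0 * math.pi * math.sqrt(r0 ** 3 / MU)
for _ in range(int(period / 10.0)):
    state = rk4_step(state, 10.0)
```

Propagating many thousands of debris objects with kernels like this is what makes the problem a natural fit for the massively parallel systems mentioned above.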
Engineering Cell-Cell Signaling
Milano, Daniel F.; Natividad, Robert J.; Asthagiri, Anand R.
2014-01-01
Juxtacrine cell-cell signaling mediated by the direct interaction of adjoining mammalian cells is arguably the mode of cell communication that is most recalcitrant to engineering. Overcoming this challenge is crucial for progress in biomedical applications, such as tissue engineering, regenerative medicine, immune system engineering and therapeutic design. Here, we describe the significant advances that have been made in developing synthetic platforms (materials and devices) and synthetic cells (cell surface engineering and synthetic gene circuits) to modulate juxtacrine cell-cell signaling. In addition, significant progress has been made in elucidating design rules and strategies to modulate juxtacrine signaling based on quantitative, engineering analysis of the mechanical and regulatory role of juxtacrine signals in the context of other cues and physical constraints in the microenvironment. These advances in engineering juxtacrine signaling lay a strong foundation for an integrative approach to utilizing synthetic cells, advanced ‘chassis’ and predictive modeling to engineer the form and function of living tissues. PMID:23856592
Science and engineering research opportunities at the National Science Foundation.
Demir, Semahat S
2004-01-01
Research at the interface of the physical sciences and life sciences has produced remarkable advances and understanding in biology and medicine over the past fifty years. The bases for many of these healthcare and research advances have been discoveries in the quantitative sciences and engineering approaches to applying them. The National Science Foundation supports research and development in the physical sciences which underpins multi-disciplinary approaches to addressing problems in biology and medicine. This presentation will cover research opportunities offered by the NSF and collaborative programs with the NIH to transfer the resulting advances and technologies.
Winer, Gerald A; Cottrell, Jane E; Bica, Lori A
2009-06-01
A series of studies examined the presence of centralist versus peripheralist responding about the physical location of psychological processes. Centralists respond that processes such as cognition and emotion are a function of the brain. Peripheralists respond that such processes are located in other parts of the body, such as the heart. Although peripheralist responses declined across grade levels, even older children and adults often gave peripheralist answers, depending on the context of the questions. Peripheralist responses occurred when participants were asked about the effect of switching irrelevant body parts between two people and when they were asked to choose a different body part among four choices. Results also showed adults' responses varied with different contextual cues. The findings support a coexistence model of development which argues for the simultaneous presence of developmentally advanced reasoning or scientifically based knowledge along with presumably less advanced, intuitive-based reasoning, or folk beliefs.
NASA Astrophysics Data System (ADS)
Eylon, Bat-Sheva; Bagno, Esther
2006-12-01
How can one increase teachers’ awareness of the existence and importance of knowledge gained through physics education research (PER) and provide them with capabilities to use it? How can one enrich teachers’ physics knowledge and the related pedagogical content knowledge of topics singled out by PER? In this paper we describe a professional development model that attempts to respond to these needs. We report on a study of the model’s implementation in a program for 22 experienced high-school physics teachers. In this program teachers (in teams of 5-6) developed during a year and a half (about 330 h) several lessons (minimodules) dealing with a topic identified as problematic by PER. The teachers employed a systematic research-based approach and used PER findings. The program consisted of three stages, each culminating with a miniconference: 1. Defining teaching and/or learning goals based on content analysis and diagnosis of students’ prior knowledge. 2. Designing the lessons using PER-based instructional strategies. 3. Performing a small-scale research study that accompanies the development process and publishing the results. We describe a case study of one of the groups and bring evidence that demonstrates how the workshop advanced: (a) Teachers’ awareness of deficiencies in their own knowledge of physics and pedagogy, and their perceptions about their students’ knowledge; (b) teachers’ knowledge of physics and physics pedagogy; (c) a systematic research-based approach to the design of lessons; (d) the formation of a community of practice; and (e) acquaintance with central findings of PER. There was a clear effect on teachers’ practice in the context of the study as indicated by the materials brought to the workshop. The teachers also reported that they continued to use the insights gained, mainly in the topics that were investigated by themselves and by their peers.
ERIC Educational Resources Information Center
Pold, Jack; Mulvey, Patrick
2016-01-01
By the time people earn physics PhDs, they have learned a great deal about physics and how research is conducted. However, physics PhDs also develop skills and knowledge in a number of related areas, such as advanced mathematics, programming, modeling, and technical writing. Physics PhDs draw upon an arsenal of skills and knowledge in their…
NASA Astrophysics Data System (ADS)
Ryu, Hoon; Jeong, Yosang; Kang, Ji-Hoon; Cho, Kyu Nam
2016-12-01
Modelling of multi-million-atom semiconductor structures is important as it not only predicts the properties of physically realizable novel materials, but can also accelerate advanced device designs. This work elaborates a new Technology-Computer-Aided-Design (TCAD) tool for nanoelectronics modelling, which uses a sp3d5s* tight-binding approach to describe multi-million-atom structures and simulate electronic structures with high performance computing (HPC), including atomic effects such as alloy and dopant disorders. Named the Quantum simulation tool for Advanced Nanoscale Devices (Q-AND), the tool shows good scalability on traditional multi-core HPC clusters, implying a strong capability for large-scale electronic structure simulations, with particularly remarkable performance enhancement on the latest clusters of Intel Xeon Phi coprocessors. A review of the recent modelling study conducted to understand an experimental work on highly phosphorus-doped silicon nanowires is presented to demonstrate the utility of Q-AND. Having been developed via an Intel Parallel Computing Center project, Q-AND will be open to the public to establish a sound framework for nanoelectronics modelling with advanced many-core HPC clusters. With details of the development methodology and an exemplary study of dopant electronics, this work presents a practical guideline for TCAD development to researchers in the field of computational nanoelectronics.
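Far simpler than the sp3d5s* basis used above, but built on the same idea: in tight binding, on-site energies and hopping integrals set the electronic structure. For a single-orbital 1-D chain with nearest-neighbor hopping t, the band is analytic, E(k) = ε + 2t cos(ka). Parameter values here are arbitrary.

```python
import math

EPS, T_HOP, A0 = 0.0, -1.0, 1.0   # on-site energy (eV), hopping (eV), spacing

def band(k):
    # Dispersion of the infinite chain from Bloch's theorem.
    return EPS + 2.0 * T_HOP * math.cos(k * A0)

kpts = [i * math.pi / (50 * A0) for i in range(51)]  # Gamma to the zone edge
energies = [band(k) for k in kpts]
bandwidth = max(energies) - min(energies)            # 4*|t| for this model
```

In a realistic multi-band, multi-million-atom case no such closed form exists; the Hamiltonian becomes a huge sparse matrix whose eigenstates must be computed numerically, which is exactly the HPC workload Q-AND parallelizes.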
NASA Astrophysics Data System (ADS)
Donnay, Karsten
2015-03-01
The past several years have seen a rapidly growing interest in the use of advanced quantitative methodologies and formalisms adapted from the natural sciences to study a broad range of social phenomena. The research field of computational social science [1,2], for example, uses digital artifacts of human online activity to cast a new light on social dynamics. Similarly, the studies reviewed by D'Orsogna and Perc showcase a diverse set of advanced quantitative techniques to study the dynamics of crime. Methods used range from partial differential equations and self-exciting point processes to agent-based models, evolutionary game theory and network science [3].
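One of the formalisms named above, the self-exciting point process, can be sketched with Ogata's thinning algorithm for a Hawkes process with an exponential kernel. The parameters here are arbitrary illustrations, not values from the reviewed studies:

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, t_max, rng):
    """Ogata thinning for lambda(t) = mu + sum_i alpha*exp(-beta*(t - t_i))."""
    events, t = [], 0.0
    intensity = lambda s: mu + sum(alpha * np.exp(-beta * (s - ti)) for ti in events)
    while True:
        lam_bar = intensity(t)               # intensity decays between events,
        t += rng.exponential(1.0 / lam_bar)  # so its current value is an upper bound
        if t >= t_max:
            break
        if rng.uniform() * lam_bar <= intensity(t):  # accept with prob lambda/lam_bar
            events.append(t)
    return np.array(events)

# Branching ratio alpha/beta = 0.5: each event triggers 0.5 offspring on average.
events = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.6, t_max=200.0,
                         rng=np.random.default_rng(0))
```

With branching ratio n = alpha/beta, the expected event count is mu*t_max/(1-n), which is how self-excitation amplifies a background rate — the mechanism used to model retaliatory crime.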
Specification and Prediction of the Radiation Environment Using Data Assimilative VERB code
NASA Astrophysics Data System (ADS)
Shprits, Yuri; Kellerman, Adam
2016-07-01
We discuss how data assimilation can be used for the reconstruction of long-term evolution, benchmarking of physics-based codes, and improved nowcasting and forecasting of the radiation belts and ring current. We also discuss advanced data assimilation methods such as parameter estimation and smoothing. We present a number of data assimilation applications using the VERB 3D code. The data-assimilative 3D VERB allows us to blend together data from GOES, RBSP A, and RBSP B. 1) The model with data assimilation allows us to propagate data to different pitch angles, energies, and L-shells and blend them together with the physics-based VERB code in an optimal way. We illustrate how to use this capability for the analysis of previous events and for obtaining a global and statistical view of the system. 2) Model predictions depend strongly on the initial conditions supplied to the model, so the model is only as good as the initial conditions it uses. To produce the best possible initial conditions, data from different sources (GOES; RBSP A and B; our empirical model predictions based on ACE) are blended together optimally by means of data assimilation, as described above. The resulting initial conditions have no gaps, which allows us to make more accurate predictions. A real-time prediction framework operating on our website, based on GOES, RBSP A and B, and ACE data and the 3D VERB code, is presented and discussed.
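At its core, the blending step described above is a Kalman analysis update: a model forecast and sparse observations are combined with weights set by their error covariances. A generic linear sketch, not the actual VERB assimilation code, with illustrative matrices:

```python
import numpy as np

def kalman_analysis(x_f, P_f, y, H, R):
    """Combine forecast x_f (covariance P_f) with observations y = H x + noise(R)."""
    S = H @ P_f @ H.T + R                      # innovation covariance
    K = P_f @ H.T @ np.linalg.inv(S)           # Kalman gain
    x_a = x_f + K @ (y - H @ x_f)              # analysis state
    P_a = (np.eye(len(x_f)) - K @ H) @ P_f     # analysis covariance
    return x_a, P_a

# Two correlated state components; only the first is observed. The analysis
# nevertheless corrects the unobserved component through the forecast covariance,
# which is how assimilation "propagates data" across pitch angles and L-shells.
x_f = np.array([0.0, 0.0])
P_f = np.array([[1.0, 0.9], [0.9, 1.0]])
x_a, P_a = kalman_analysis(x_f, P_f, y=np.array([1.0]),
                           H=np.array([[1.0, 0.0]]), R=np.array([[0.1]]))
```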
Graphene Foam: Uniaxial Tension Behavior and Fracture Mode Based on a Mesoscopic Model.
Pan, Douxing; Wang, Chao; Wang, Tzu-Chiang; Yao, Yugui
2017-09-26
Because of the combined advantages of both porous materials and two-dimensional (2D) graphene sheets, the superior mechanical properties of three-dimensional (3D) graphene foams have received much attention from materials scientists and energy engineers. Here, a 2D mesoscopic graphene model (Modell. Simul. Mater. Sci. Eng. 2011, 19, 054003) was extended to a 3D bonded graphene foam system, using physical cross-links and van der Waals forces acting among different mesoscopic graphene flakes and accounting for debonding behavior, to evaluate the uniaxial tension behavior and fracture mode observed in in situ SEM tensile testing (Carbon 2015, 85, 299). We reasonably reproduced a multipeak stress-strain relationship, including its pronounced yielding plateau, and a ductile fracture mode on a plane near 45° from the tensile direction, together with the corresponding fracture morphology. A power scaling law of tensile elastic modulus with mass density and an anisotropic strain-dependent Poisson's ratio were then both deduced. The mesoscopic physical mechanism of tensile deformation was revealed through the local stress state and the evolution of the mesostructure. The fracture features of the bonded graphene foam and its thermodynamic state were traced directly to the tearing patterns of the mesoscopic graphene flakes. This study provides an effective way to understand the mesoscopic physical nature of 3D graphene foams, and hence it may contribute to multiscale computations of micro-, meso-, and macromechanical performance and the optimal design of advanced graphene-foam-based materials.
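A power scaling law such as the modulus-density relation reported above, E ∝ ρⁿ, is conventionally extracted as the slope of a log-log linear fit. A generic sketch with synthetic data; the exponent 2.5 is a made-up illustration, not the paper's result:

```python
import numpy as np

rng = np.random.default_rng(42)
rho = np.logspace(0, 1, 50)                      # densities spanning one decade
E_true = 3.0 * rho**2.5                          # assumed power law E = C * rho^n
E_obs = E_true * rng.lognormal(0.0, 0.05, 50)    # multiplicative measurement noise

# Fit log E = n*log(rho) + log C; the slope recovers the exponent n.
n_fit, logC_fit = np.polyfit(np.log(rho), np.log(E_obs), 1)
```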
Brzezicki, Marcin
2013-01-01
Issues of transparency perception are addressed from an architectural perspective, pointing out previously neglected factors that greatly influence this phenomenon at the scale of a building. The simplified perforated model of a transparent surface presented in the paper is based on previously developed theories and involves the balance of light reflected versus light transmitted. Its aim is to facilitate an understanding of non-intuitive phenomena related to transparency (e.g., dynamically changing reflectance) for readers without advanced knowledge of molecular physics. A verification of the presented model is based on a comparison of its optical performance with the results of Fresnel's equations for light-transmitting materials. The presented methodology is intended to be used both in the design and explanatory stages of architectural practice and in vision research. Incorporation of architectural issues could enrich the perspective of scientists representing other disciplines.
NASA Astrophysics Data System (ADS)
Blasch, Erik; Salerno, John; Kadar, Ivan; Yang, Shanchieh J.; Fenstermacher, Laurie; Endsley, Mica; Grewe, Lynne
2013-05-01
During the SPIE 2012 conference, panelists convened to discuss "Real world issues and challenges in Human Social/Cultural/Behavioral modeling with Applications to Information Fusion." Each panelist presented current trends and issues. The panel agreed on advanced situation modeling, working with users for situation awareness and sense-making, and HSCB context modeling as focuses for research activities. Each panelist added a different perspective based on the domain of interest, such as physical, cyber, and social attacks, from which estimates and projections can be forecast. Additional techniques were also addressed, such as interest graphs, network modeling, and variable-length Markov models. This paper summarizes the panelists' discussions to highlight the common themes and the contrasting approaches to the domains in which HSCB applies to information fusion applications.
Recent progress in design and hybridization of planar grating-based transceivers
NASA Astrophysics Data System (ADS)
Bidnyk, S.; Pearson, M.; Balakrishnan, A.; Gao, M.
2007-06-01
We report on recent progress in the simulation, physical layout, fabrication, and hybridization of planar grating-based transceivers for passive optical networks (PONs). Until recently, PON transceivers have been manufactured using bulk micro-optical components. Today, advancements in modeling and simulation techniques have made it possible to design complex elements in the same silica-on-silicon PLC platform, creating an alternative platform for the manufacturing of bi-directional transceivers. We designed an integrated chip that monolithically combines planar reflective gratings and cascaded Mach-Zehnder interferometers, and we used a combination of the finite element method and the beam propagation method to model cascaded interferometers with enhanced coupling coefficients. Our simulations show that low-diffraction-order planar reflective gratings, designed for small incidence and reflection angles, possess the dispersion strength required to meet the PON specifications. Subsequently, we created structures for passive alignment and hybridized photodetectors and lasers. We believe that advancements in the simulation of planar lightwave circuits with embedded planar reflective gratings will result in the displacement of thin-film filter (TFF) technology in many applications that require a high degree of monolithic and hybrid integration.
Improving Turbine Performance with Ceramic Matrix Composites
NASA Technical Reports Server (NTRS)
DiCarlo, James A.
2007-01-01
Under the new NASA Fundamental Aeronautics Program, efforts are on-going within the Supersonics Project aimed at the implementation of advanced SiC/SiC ceramic composites into hot section components of future gas turbine engines. Due to recent NASA advancements in SiC-based fibers and matrices, these composites are lighter and capable of much higher service temperatures than current metallic superalloys, which in turn will allow the engines to operate at higher efficiencies and reduced emissions. This presentation briefly reviews studies within Task 6.3.3 that are primarily aimed at developing physics-based concepts, tools, and process/property models for micro- and macro-structural design, fabrication, and lifing of SiC/SiC turbine components in general and airfoils in particular. Particular emphasis is currently being placed on understanding and modeling (1) creep effects on residual stress development within the component, (2) fiber architecture effects on key composite properties such as design strength, and (3) preform formation processes so that the optimum architectures can be implemented into complex-shaped components, such as turbine vanes and blades.
Allison, J.; Amako, K.; Apostolakis, J.; ...
2016-07-01
Geant4 is a software toolkit for the simulation of the passage of particles through matter. It is used by a large number of experiments and projects in a variety of application domains, including high energy physics, astrophysics and space science, medical physics and radiation protection. Over the past several years, major changes have been made to the toolkit in order to accommodate the needs of these user communities, and to efficiently exploit the growth of computing power made available by advances in technology. Here we discuss the adaptation of Geant4 to multithreading, advances in physics, detector modeling and visualization, extensions to the toolkit, including biasing and reverse Monte Carlo, and tools for physics and release validation.
Point process modeling and estimation: Advances in the analysis of dynamic neural spiking data
NASA Astrophysics Data System (ADS)
Deng, Xinyi
2016-08-01
A common interest of scientists in many fields is to understand the relationship between the dynamics of a physical system and the occurrences of discrete events within that system. Seismologists study the connection between mechanical vibrations of the Earth and the occurrences of earthquakes so that future earthquakes can be better predicted. Astrophysicists study the association between the oscillating energy of celestial regions and the emission of photons to learn about the Universe's various objects and their interactions. Neuroscientists study the link between behavior and the millisecond-timescale spike patterns of neurons to understand higher brain functions. Such relationships can often be formulated within the framework of state-space models with point process observations. The basic idea is that the dynamics of the physical system are driven by the dynamics of some stochastic state variables, and the discrete events we observe in an interval are noisy observations with distributions determined by the state variables. This thesis proposes several new methodological developments that advance the framework of state-space models with point process observations at the intersection of statistics and neuroscience. In particular, we develop new methods 1) to characterize rhythmic spiking activity using history-dependent structure, 2) to model population spike activity using marked point process models, 3) to allow for real-time decision making, and 4) to take into account the need for dimensionality reduction for high-dimensional state and observation processes. We applied these methods to a novel problem of tracking rhythmic dynamics in the spiking of neurons in the subthalamic nucleus of Parkinson's patients, with the goal of optimizing the placement of deep brain stimulation electrodes.
We developed a decoding algorithm that can make decisions in real time (for example, whether to stimulate the neurons or not) based on the various sources of information present in population spiking data. Lastly, we proposed a general three-step paradigm that allows us to relate behavioral outcomes of various tasks to simultaneously recorded neural activity across multiple brain areas, a step toward closed-loop therapies for psychological diseases using real-time neural stimulation. These methods are suitable for real-time implementation in content-based feedback experiments.
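The core of the state-space point-process framework described above can be sketched as a scalar filter: a random-walk log-rate state updated by binned spike observations. This is a simplified single-neuron version with illustrative parameters, not the thesis's population methods:

```python
import numpy as np

def point_process_filter(spikes, dt, q, x0=3.0, p0=1.0):
    """Track a log-rate state x_t (rate = exp(x)) from a 0/1 spike train."""
    x, p, xs = x0, p0, []
    for dn in spikes:
        p_pred = p + q                       # predict: random-walk state noise
        lam = np.exp(x) * dt                 # expected spikes in this bin
        p = 1.0 / (1.0 / p_pred + lam)       # posterior variance
        x = x + p * (dn - lam)               # update with point-process innovation
        xs.append(x)
    return np.array(xs)

# Synthetic spike train whose rate steps from 20 Hz to 60 Hz halfway through.
rng = np.random.default_rng(1)
dt = 1e-3
rate = np.r_[np.full(1000, 20.0), np.full(1000, 60.0)]
spikes = rng.random(2000) < rate * dt
xhat = point_process_filter(spikes, dt, q=1e-4)
```

The estimated rate exp(xhat) rises after the step, illustrating how discrete events carry information about a continuously evolving hidden state.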
NASA Astrophysics Data System (ADS)
Rouet-Leduc, B.; Hulbert, C.; Riviere, J.; Lubbers, N.; Barros, K.; Marone, C.; Johnson, P. A.
2016-12-01
Forecasting failure is a primary goal in diverse domains that include earthquake physics, materials science, nondestructive evaluation of materials, and other engineering applications. Due to the highly complex physics of material failure and limitations on gathering data in the failure nucleation zone, this goal has often appeared out of reach; however, recent advances in instrumentation sensitivity, instrument density, and data analysis show promise toward forecasting failure times. Here, we show that we can predict frictional failure times of both slow and fast stick slip failure events in the laboratory. This advance is made possible by applying a machine learning approach known as Random Forests (RF) [1] to the continuous acoustic emission (AE) time series recorded by detectors located on the fault blocks. The RF is trained using a large number of statistical features derived from the AE time series signal. The model is then applied to data not previously analyzed. Remarkably, we find that the RF method predicts the upcoming failure time far in advance of a stick slip event, based only on a short time window of data. Further, the algorithm accurately predicts the times of the beginning and end of the next slip event. The predicted time improves as failure is approached, as additional data features contribute to the prediction. Our results show robust predictions of slow and dynamic failure based on acoustic emissions from the fault zone throughout the laboratory seismic cycle. The predictions are based on previously unidentified tremor-like acoustic signals that occur during stress build-up and the onset of macroscopic frictional weakening. We suggest that the tremor-like signals carry information about fault zone processes and allow precise predictions of failure at any time in the slow slip or stick slip cycle [2]. If the laboratory experiments represent Earth frictional conditions, it may well be that signals containing highly useful predictive information are currently being missed.
[1] Breiman, L. Random forests. Machine Learning 45, 5-32 (2001). [2] Rouet-Leduc, B., C. Hulbert, N. Lubbers, K. Barros, and P. A. Johnson, Learning the physics of failure, in review (2016).
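The pipeline described above — statistical features of a short signal window mapped to time-to-failure — can be sketched end-to-end. To keep the sketch dependency-free, the Random Forest is replaced by ordinary least squares, and the synthetic "acoustic emission" simply grows noisier as failure approaches; both substitutions are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
cycle_len, win = 1000, 50

def make_cycle():
    """One synthetic seismic cycle: signal variance rises as failure nears."""
    t = np.arange(cycle_len)
    signal = rng.normal(0.0, 0.1 + t / cycle_len)
    starts = np.arange(0, cycle_len - win, win)
    feats = np.array([[signal[s:s + win].std(),
                       np.abs(signal[s:s + win]).mean(),
                       np.abs(signal[s:s + win]).max()] for s in starts])
    ttf = cycle_len - (starts + win)            # time remaining until failure
    return feats, ttf

train = [make_cycle() for _ in range(5)]
X = np.vstack([f for f, _ in train])
y = np.concatenate([t for _, t in train])
A = np.c_[X, np.ones(len(X))]                   # least-squares stand-in for the RF
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

Xt, yt = make_cycle()                           # held-out cycle
pred = np.c_[Xt, np.ones(len(Xt))] @ coef
r = np.corrcoef(pred, yt)[0, 1]
```

The key point carried over from the paper is that each prediction uses only a short window of the continuous signal, with no knowledge of where in the cycle that window sits.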
Design Considerations of a Virtual Laboratory for Advanced X-ray Sources
NASA Astrophysics Data System (ADS)
Luginsland, J. W.; Frese, M. H.; Frese, S. D.; Watrous, J. J.; Heileman, G. L.
2004-11-01
The field of scientific computation has greatly advanced in the last few years, resulting in the ability to perform complex computer simulations that can predict the performance of real-world experiments in a number of fields of study. Among the forces driving this new computational capability is the advent of parallel algorithms, allowing calculations in three-dimensional space with realistic time scales. Electromagnetic radiation sources driven by high-voltage, high-current electron beams offer an area in which to further push the state of the art in high-fidelity, first-principles simulation tools. The physics of these x-ray sources combines kinetic plasma physics (electron beams) with dense fluid-like plasma physics (anode plasmas) and x-ray generation (bremsstrahlung). There are a number of mature techniques and software packages for dealing with the individual aspects of these sources, such as Particle-In-Cell (PIC), Magneto-Hydrodynamics (MHD), and radiation transport codes. The current effort is focused on developing an object-oriented software environment using the Rational Unified Process and the Unified Modeling Language (UML) to provide a framework where multiple 3D parallel physics packages, such as a PIC code (ICEPIC), an MHD code (MACH), and an x-ray transport code (ITS), can co-exist in a system-of-systems approach to modeling advanced x-ray sources. Initial software design and assessments of the fidelity of the various physics algorithms will be presented.
Recent Advances in Ionospheric Modeling Using the USU GAIM Data Assimilation Models
NASA Astrophysics Data System (ADS)
Scherliess, L.; Thompson, D. C.; Schunk, R. W.
2009-12-01
The ionospheric plasma distribution at low and mid latitudes has been shown to display both a background state (climatology) and a disturbed state (weather). Ionospheric climatology has been successfully modeled, but ionospheric weather has been much more difficult to model because the ionosphere can vary significantly on an hour-by-hour basis. Unfortunately, ionospheric weather can have detrimental effects on several human activities and systems, including high-frequency communications, over-the-horizon radars, and survey and navigation systems using Global Positioning System (GPS) satellites. As shown by meteorologists and oceanographers, the most reliable weather models are physics-based, data-driven models that use Kalman filter or other data assimilation techniques. Since the state of a medium (ocean, lower atmosphere, ionosphere) is driven by complex and frequently nonlinear internal and external processes, it is not possible to accurately specify all of the drivers and initial conditions of the medium. Therefore, physics-based models alone cannot provide reliable specifications and forecasts. In an effort to better understand the ionosphere and to mitigate its adverse effects on military and civilian operations, specification and forecast models are being developed that use state-of-the-art data assimilation techniques. Over the past decade, Utah State University (USU) has developed two data assimilation models for the ionosphere as part of the USU Global Assimilation of Ionospheric Measurements (GAIM) program, and one of these models has been implemented at the Air Force Weather Agency for operational use. The USU-GAIM models are also being used for scientific studies, and this should lead to a dramatic advance in our understanding of ionospheric physics, similar to what occurred in meteorology and oceanography after the introduction of data assimilation models in those fields.
Both USU-GAIM models are capable of assimilating data from a variety of data sources, including in situ electron densities from satellites, bottomside electron density profiles from ionosondes, total electron content (TEC) measurements between ground receivers and the GPS satellites, occultation data from satellite constellations, and ultraviolet emissions from the ionosphere measured by satellites. We will present the current status of the model development and discuss the employed data assimilation technique. Recent examples of the ionosphere specifications obtained from our model runs will be presented with an emphasis on the ionospheric plasma distribution during the current low solar activity conditions. Various comparisons with independent data will also be shown in an effort to validate the models.
Simulating Coupling Complexity in Space Plasmas: First Results from a new code
NASA Astrophysics Data System (ADS)
Kryukov, I.; Zank, G. P.; Pogorelov, N. V.; Raeder, J.; Ciardo, G.; Florinski, V. A.; Heerikhuisen, J.; Li, G.; Petrini, F.; Shematovich, V. I.; Winske, D.; Shaikh, D.; Webb, G. M.; Yee, H. M.
2005-12-01
The development of codes that embrace 'coupling complexity' via the self-consistent incorporation of multiple physical scales and multiple physical processes in models has been identified by the NRC Decadal Survey in Solar and Space Physics as a crucial and necessary development in simulation/modeling technology for the coming decade. The National Science Foundation, through its Information Technology Research (ITR) Program, is supporting our efforts to develop a new class of computational code for plasmas and neutral gases that integrates multiple scales and multiple physical processes and descriptions. We are developing a highly modular, parallelized, scalable code that incorporates multiple scales by synthesizing three simulation technologies: 1) computational fluid dynamics (hydrodynamics or magnetohydrodynamics, MHD) for the large-scale plasma; 2) direct Monte Carlo simulation of atoms/neutral gases; and 3) transport code solvers to model highly energetic particle distributions. We are constructing the code so that a fourth simulation technology, hybrid simulations for microscale structures and particle distributions, can be incorporated in future work, but for the present, this aspect will be addressed at a test-particle level. This synthesis will provide a computational tool that will enormously advance our understanding of the physics of neutral and charged gases. Besides making major advances in basic plasma physics and neutral gas problems, this project will address 3 Grand Challenge space physics problems that reflect our research interests: 1) To develop a temporal global heliospheric model which includes the interaction of solar and interstellar plasma with neutral populations (hydrogen, helium, etc., and dust), test-particle kinetic pickup ion acceleration at the termination shock, anomalous cosmic ray production, and interaction with galactic cosmic rays, while incorporating the time variability of the solar wind and the solar cycle.
2) To develop a coronal mass ejection and interplanetary shock propagation model for the inner and outer heliosphere, including, at a test-particle level, wave-particle interactions and particle acceleration at traveling shock waves and compression regions. 3) To develop an advanced Geospace General Circulation Model (GGCM) capable of realistically modeling space weather events, in particular the interaction with CMEs and geomagnetic storms. Furthermore, by implementing scalable run-time supports and sophisticated off- and on-line prediction algorithms, we anticipate important advances in the development of automatic and intelligent system software to optimize a wide variety of 'embedded' computations on parallel computers. Finally, public domain MHD and hydrodynamic codes had a transforming effect on space and astrophysics. We expect that our new generation, open source, public domain multi-scale code will have a similar transformational effect in a variety of disciplines, opening up new classes of problems to physicists and engineers alike.
An advanced technique for the prediction of decelerator system dynamics.
NASA Technical Reports Server (NTRS)
Talay, T. A.; Morris, W. D.; Whitlock, C. H.
1973-01-01
An advanced two-body, six-degree-of-freedom computer model employing an indeterminate-structures approach has been developed for the parachute deployment process. The program determines both vehicular and decelerator responses to aerodynamic and physical-property inputs, providing better insight into the dynamic processes that occur during parachute deployment. The model is of value in sensitivity studies to isolate important parameters that affect the vehicular response.
Numerical modelling of river morphodynamics: Latest developments and remaining challenges
NASA Astrophysics Data System (ADS)
Siviglia, Annunziato; Crosato, Alessandra
2016-07-01
Numerical morphodynamic models provide scientific frameworks for advancing our understanding of river systems. Research on these topics is an important and socially relevant undertaking with respect to our environment. Nowadays numerical models are used for different purposes, from answering questions in basic morphodynamic research to managing complex river engineering problems. Due to increasing computer power and the development of advanced numerical techniques, morphodynamic models are increasingly used to predict the evolution of bed patterns over a broad spectrum of spatial and temporal scales. The development and successful application of such models rest upon a wide range of disciplines, from applied mathematics for the numerical solution of the equations to geomorphology for the physical interpretation of the results. In this light we organized this special issue (SI), soliciting multidisciplinary contributions that encompass any aspect needed for the development and application of such models. Most of the papers in the SI stem from contributions to session HS9.5/GM7.11 on numerical modelling and experiments in river morphodynamics at the European Geosciences Union (EGU) General Assembly held in Vienna, April 27th to May 2nd, 2014.
Dynamic Modeling, Model-Based Control, and Optimization of Solid Oxide Fuel Cells
NASA Astrophysics Data System (ADS)
Spivey, Benjamin James
2011-07-01
Solid oxide fuel cells are a promising option for distributed stationary power generation that offers efficiencies ranging from 50% in stand-alone applications to greater than 80% in cogeneration. To advance SOFC technology for widespread market penetration, the SOFC should demonstrate improved cell lifetime and load-following capability. This work seeks to improve lifetime through dynamic analysis of critical lifetime variables and advanced control algorithms that permit load-following while remaining in a safe operating zone based on stress analysis. Prior control work has typically addressed SOFC lifetime operability objectives using unconstrained, single-input single-output algorithms that minimize thermal transients, and has not considered maximum radial thermal gradients or limits on absolute temperatures in the SOFC. In particular, as stress analysis demonstrates, the minimum cell temperature is the primary thermal stress driver in tubular SOFCs. This dissertation presents a dynamic, quasi-two-dimensional model for a high-temperature tubular SOFC combined with ejector and prereformer models. The model captures the dynamics of critical thermal stress drivers and is used as the physical plant for closed-loop control simulations. A constrained MIMO model predictive control algorithm is developed and applied to control the SOFC. Closed-loop control simulation results demonstrate effective load-following, constraint satisfaction for critical lifetime variables, and disturbance rejection. Nonlinear programming is applied to find the optimal SOFC size and steady-state operating conditions that minimize total system costs.
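The receding-horizon idea behind constrained MPC can be sketched on a toy scalar plant: at each step, solve a finite-horizon tracking problem, apply only the first input, and repeat. This is a simplified stand-in — a clipped least-squares solve rather than a true constrained QP, and an arbitrary first-order system rather than the SOFC model:

```python
import numpy as np

a, b, ref, u_max, N, r = 0.9, 0.5, 1.0, 0.3, 10, 1e-3

# Prediction matrices: x_k = a^k x0 + sum_{j<k} a^(k-1-j) * b * u_j for k = 1..N
Phi = np.array([[a**(k - 1 - j) * b if j < k else 0.0 for j in range(N)]
                for k in range(1, N + 1)])
free = lambda x0: np.array([a**k * x0 for k in range(1, N + 1)])

def mpc_step(x0):
    """Minimize ||Phi u - (ref - free)||^2 + r||u||^2, then clip to the input bound."""
    A_ls = np.vstack([Phi, np.sqrt(r) * np.eye(N)])
    b_ls = np.concatenate([np.full(N, ref) - free(x0), np.zeros(N)])
    u, *_ = np.linalg.lstsq(A_ls, b_ls, rcond=None)
    return float(np.clip(u[0], -u_max, u_max))   # receding horizon: first move only

x, xs, us = 0.0, [], []
for _ in range(60):
    u = mpc_step(x)
    x = a * x + b * u                            # plant update
    xs.append(x)
    us.append(u)
```

The input bound plays the role of the operating-zone constraints above: the controller saturates at u_max during the transient, then tracks the reference once the unconstrained optimum falls inside the bound.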
NASA Astrophysics Data System (ADS)
Wardaya, P. D.; Noh, K. A. B. M.; Yusoff, W. I. B. W.; Ridha, S.; Nurhandoko, B. E. B.
2014-09-01
This paper discusses a new approach for investigating the seismic wave velocity of rock, specifically carbonates, as affected by their pore structures. While the conventional routine of seismic velocity measurement depends heavily on extensive laboratory experiments, the proposed approach adopts the digital rock physics view, which relies on numerical experiments. Thus, instead of using a core sample, we use a thin-section image of carbonate rock to measure the effective velocity of a seismic wave travelling through it. In the numerical experiment, the thin-section image acts as the medium in which wave propagation is simulated. For the modeling, an advanced technique based on an artificial neural network was employed to build the velocity and density profiles, replacing the image's RGB pixel values with the seismic velocity and density of each rock constituent. An ultrasonic wave was then propagated through the thin-section image using the finite-difference time-domain method, under the assumption of an acoustic, isotropic medium. Effective velocities were drawn from the recorded signal and compared to velocity predictions from the Wyllie time-average model and the Kuster-Toksoz rock physics model. To perform the modeling, image analysis routines were undertaken to quantify the pore aspect ratio, which is assumed to represent the rock's pore structure. In addition, the porosity and mineral fractions required for velocity modeling were quantified using an integrated neural network and image analysis technique. It was found that the Kuster-Toksoz model gives a closer prediction of the measured velocity than the Wyllie time-average model. We also conclude that the Wyllie time-average model, which does not incorporate the pore structure parameter, deviates significantly for samples having more than 40% porosity.
Utilizing this approach, we found good agreement between the numerical experiment and the theoretically derived rock physics model for estimating the effective seismic wave velocity of rock.
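The wave-propagation core of this workflow, an acoustic finite-difference time-domain scheme on a gridded velocity model, can be sketched in 2D. The grid, velocities, and source wavelet below are illustrative, not the thin-section data:

```python
import numpy as np

nx = nz = 120
dx = 1.0                                     # grid spacing (m)
c = np.full((nz, nx), 2000.0)
c[:, nx // 2:] = 3000.0                      # two-layer velocity model (m/s)
dt = 0.3 * dx / c.max()                      # within the 2D CFL limit dx/(c*sqrt(2))

def ricker(t, f0=150.0, t0=0.01):
    """Ricker wavelet source time function."""
    arg = (np.pi * f0 * (t - t0))**2
    return (1.0 - 2.0 * arg) * np.exp(-arg)

u_prev = np.zeros((nz, nx))
u = np.zeros((nz, nx))
src = (nz // 2, nx // 4)
rec = (nz // 2, nx // 4 + 50)                # receiver 50 cells from the source
trace = []
for n in range(400):
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u) / dx**2
    u_next = 2.0 * u - u_prev + (c * dt)**2 * lap   # leapfrog time update
    u_next[src] += ricker(n * dt) * dt**2           # injected point source
    u_prev, u = u, u_next
    trace.append(u[rec])
trace = np.array(trace)
```

Picking the first-arrival time on such a trace and dividing the source-receiver distance by it yields the effective velocity, the quantity compared against the Wyllie and Kuster-Toksoz predictions above.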
Recent progress in the imaging of soil processes at the microscopic scale, and a look ahead
NASA Astrophysics Data System (ADS)
Garnier, Patricia; Baveye, Philippe C.; Pot, Valérie; Monga, Olivier; Portell, Xavier
2016-04-01
Over the last few years, tremendous progress has been achieved in the visualization of soil structures at the microscopic scale. Computed tomography, based on synchrotron X-ray beams or table-top equipment, allows the visualization of pore geometry at micrometric resolution. Chemical and microbiological information obtainable in 2D cuts through soils can now be interpolated, with the support of CT-data, to produce 3-dimensional maps. In parallel with these analytical advances, significant progress has also been achieved in the computer simulation and visualization of a range of physical, chemical, and microbiological processes taking place in soil pores. In terms of water distribution and transport in soils, for example, the use of Lattice-Boltzmann models as well as models based on geometric primitives has been shown recently to reproduce very faithfully observations made with synchrotron X-ray tomography. Coupling of these models with fungal and bacterial growth models allows the description of a range of microbiologically-mediated processes of great importance at the moment, for example in terms of carbon sequestration. In this talk, we shall review progress achieved to date in this field, indicate where questions remain unanswered, and point out areas where further advances are expected in the next few years.
Improving 3D Genome Reconstructions Using Orthologous and Functional Constraints
Diament, Alon; Tuller, Tamir
2015-01-01
The study of the 3D architecture of chromosomes has been advancing rapidly in recent years. While a number of methods for 3D reconstruction of genomic models based on Hi-C data were proposed, most of the analyses in the field have been performed on different 3D representation forms (such as graphs). Here, we reproduce most of the previous results on the 3D genomic organization of the eukaryote Saccharomyces cerevisiae using analysis of 3D reconstructions. We show that many of these results can be reproduced in sparse reconstructions, generated from a small fraction of the experimental data (5% of the data), and study the properties of such models. Finally, we propose for the first time a novel approach for improving the accuracy of 3D reconstructions by introducing additional predicted physical interactions to the model, based on orthologous interactions in an evolutionary-related organism and based on predicted functional interactions between genes. We demonstrate that this approach indeed leads to the reconstruction of improved models. PMID:26000633
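A core step in such reconstructions, recovering 3D coordinates from a matrix of pairwise (contact-derived) distances, can be sketched with classical multidimensional scaling. This is a generic MDS sketch on synthetic "bead" positions, not the paper's reconstruction pipeline:

```python
import numpy as np

def classical_mds(D, dim=3):
    """Recover coordinates (up to rotation/translation) from Euclidean distances D."""
    n = len(D)
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D**2) @ J                  # double-centered Gram matrix
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:dim]            # top eigenpairs span the embedding
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Round-trip check: distances of the reconstruction match the input distances.
rng = np.random.default_rng(3)
X = rng.normal(size=(40, 3))                   # "true" 3D bead positions
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
Y = classical_mds(D)
D_rec = np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)
```

In Hi-C practice the distance matrix is itself inferred (e.g., via a power-law contact-to-distance transform) and is noisy and incomplete, which is exactly where additional constraints such as the orthologous and functional interactions proposed above can help.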
ERIC Educational Resources Information Center
Ahtee, Maija, Ed.; And Others
The main purpose of this symposium was to find new ideas and resources for the evaluation and improvement of physics education on all levels. The papers included in this document are entitled: (1) "Quality of Physics Teaching Through Building Models and Advancing Research Skills"; (2) "Evaluation of Physics Education in Terms of Its…
NASA Astrophysics Data System (ADS)
Elag, M.; Goodall, J. L.
2013-12-01
Hydrologic modeling often requires the re-use and integration of models from different disciplines to simulate complex environmental systems. Component-based modeling introduces a flexible approach for integrating physics-based processes across disciplinary boundaries. Several hydrologic-related modeling communities have adopted the component-based approach for simulating complex physical systems by integrating model components across disciplinary boundaries in a workflow. However, it is not always straightforward to create these interdisciplinary models due to the lack of sufficient knowledge about a hydrologic process. This shortcoming is a result of using informal methods for organizing and sharing information about a hydrologic process. A knowledge-based ontology provides the needed formal standards and is considered the ideal approach for overcoming this challenge. The aims of this research are to present the methodology used in analyzing the basic hydrologic domain in order to identify hydrologic processes, the ontology itself, and how the proposed ontology is integrated with the Water Resources Component (WRC) ontology. The proposed ontology standardizes the definitions of a hydrologic process, the relationships between hydrologic processes, and their associated scientific equations. The objective of the proposed Hydrologic Process (HP) Ontology is to advance the idea of creating a unified knowledge framework for components' metadata by introducing a domain-level ontology for hydrologic processes. The HP ontology is a step toward an explicit and robust domain knowledge framework that can be evolved through the contribution of domain users. Analysis of the hydrologic domain is accomplished using Formal Concept Analysis (FCA), in which the infiltration process, an important hydrologic process, is examined. Two infiltration methods, the Green-Ampt and Philip's methods, were used to demonstrate the implementation of information in the HP ontology. 
Furthermore, a SPARQL service is provided for semantic-based querying of the ontology.
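As a concrete illustration of one of the processes the ontology encodes, here is a minimal sketch of the Green-Ampt infiltration model named above. Cumulative infiltration F(t) satisfies the implicit equation F = K*t + psi*dtheta*ln(1 + F/(psi*dtheta)), solved here by fixed-point iteration; the soil parameter values in the usage note are illustrative, not from the paper.

```python
import math

def green_ampt_F(t, K, psi, dtheta, tol=1e-10):
    """Cumulative infiltration F(t) [cm] from the implicit Green-Ampt
    equation, via fixed-point iteration (a contraction for F > 0).
    K: saturated hydraulic conductivity [cm/h], psi: wetting-front
    suction head [cm], dtheta: moisture deficit [-], t: time [h]."""
    s = psi * dtheta
    F = K * t if K * t > 0 else 1e-6      # initial guess
    for _ in range(200):
        F_new = K * t + s * math.log(1.0 + F / s)
        done = abs(F_new - F) < tol
        F = F_new
        if done:
            break
    return F

def infiltration_rate(F, K, psi, dtheta):
    """Green-Ampt infiltration capacity f = K * (1 + psi*dtheta / F)."""
    return K * (1.0 + psi * dtheta / F)
```

For example, `green_ampt_F(2.0, 1.09, 11.01, 0.3)` returns the cumulative infiltration after two hours for a hypothetical sandy-loam-like parameter set; the capacity always exceeds K and decays toward it as F grows.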
Practical quantum mechanics-based fragment methods for predicting molecular crystal properties.
Wen, Shuhao; Nanda, Kaushik; Huang, Yuanhang; Beran, Gregory J O
2012-06-07
Significant advances in fragment-based electronic structure methods have created a real alternative to force-field and density functional techniques in condensed-phase problems such as molecular crystals. This perspective article highlights some of the important challenges in modeling molecular crystals and discusses techniques for addressing them. First, we survey recent developments in fragment-based methods for molecular crystals. Second, we use examples from our own recent research on a fragment-based QM/MM method, the hybrid many-body interaction (HMBI) model, to analyze the physical requirements for a practical and effective molecular crystal model chemistry. We demonstrate that it is possible to predict molecular crystal lattice energies to within a couple of kJ mol^-1 and lattice parameters to within a few percent in small-molecule crystals. Fragment methods provide a systematically improvable approach to making predictions in the condensed phase, which is critical to making robust predictions regarding the subtle energy differences found in molecular crystals.
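The fragment idea can be caricatured with a toy two-body expansion, in which a Lennard-Jones potential stands in for the quantum-mechanical dimer interaction energies (the actual HMBI model combines QM fragments with an MM treatment of long-range terms; this sketch shows only the bookkeeping of the pairwise sum):

```python
import itertools
import numpy as np

def lj_pair(r, eps=1.0, sigma=1.0):
    """Stand-in 'dimer interaction energy': Lennard-Jones in reduced units."""
    x = (sigma / r) ** 6
    return 4.0 * eps * (x * x - x)

def two_body_lattice_energy(coords):
    """Fragment-style two-body expansion: the lattice energy is
    approximated as a sum over monomer-pair interaction energies
    (monomer internal energies are taken as zero in this toy)."""
    e = 0.0
    for i, j in itertools.combinations(range(len(coords)), 2):
        e += lj_pair(np.linalg.norm(coords[i] - coords[j]))
    return e

# Three 'monomers' on a line at reduced separations 1 and 2.
e_chain = two_body_lattice_energy(np.array([[0., 0., 0.],
                                            [1., 0., 0.],
                                            [2., 0., 0.]]))
```

In a real fragment calculation each `lj_pair` call would be replaced by an electronic-structure dimer computation, and three-body and long-range corrections would be layered on top.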
Physical activity in advanced cancer patients: a systematic review protocol.
Lowe, Sonya S; Tan, Maria; Faily, Joan; Watanabe, Sharon M; Courneya, Kerry S
2016-03-11
Progressive, incurable cancer is associated with increased fatigue, increased muscle weakness, and reduced physical functioning, all of which negatively impact quality of life. Physical activity has demonstrated benefits on cancer-related fatigue and physical functioning in early-stage cancer patients; however, its impact on these outcomes in end-stage cancer has not been established. The aim of this systematic review is to determine the potential benefits, harms, and effects of physical activity interventions on quality of life outcomes in advanced cancer patients. A systematic review of peer-reviewed literature on physical activity in advanced cancer patients will be undertaken. Empirical quantitative studies will be considered for inclusion if they present interventional or observational data on physical activity in advanced cancer patients. Searches will be conducted in the following electronic databases: CINAHL; CIRRIE Database of International Rehabilitation Research; Cochrane Database of Systematic Reviews (CDSR); Database of Abstracts of Reviews of Effects (DARE); Cochrane Central Register of Controlled Trials (CENTRAL); EMBASE; MEDLINE; PEDro: the Physiotherapy Evidence Database; PQDT; PsycInfo; PubMed; REHABDATA; Scopus; SPORTDiscus; and Web of Science, to identify relevant studies of interest. Additional strategies to identify relevant studies will include citation searches and evaluation of reference lists of included articles. Titles, abstracts, and keywords of identified studies from the search strategies will be screened for inclusion criteria. Two independent reviewers will conduct quality appraisal using the Effective Public Health Practice Project Quality Assessment Tool for Quantitative Studies (EPHPP) and the Cochrane risk of bias tool. A descriptive summary of included studies will describe the study designs, participant and activity characteristics, and objective and patient-reported outcomes. 
This systematic review will summarize the current evidence base on physical activity interventions in advanced cancer patients. The findings from this systematic review will identify gaps to be explored by future research studies and inform future practice guideline development of physical activity interventions in advanced cancer patients. PROSPERO CRD42015026281.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryan, Frank; Dennis, John; MacCready, Parker
This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation.
Wang, Hongyuan; Zhang, Wei; Dong, Aotuo
2012-11-10
A modeling and validation method for the photometric characteristics of a space target is presented in order to track and identify different satellites effectively. The background radiation characteristics of the target were modeled based on blackbody radiation theory. The geometric characteristics of the target were described by surface equations in its body coordinate system. The material characteristics of the target surface were described by a bidirectional reflectance distribution function (BRDF) model, which accounts for the Gaussian statistics of the surface and microscale self-shadowing and is obtained by measurement and modeling in advance. The surfaces of the target contributing to the observation system were determined by coordinate transformation according to the relative positions of the space-based target, the background radiation sources, and the observation platform. A mathematical model of the photometric characteristics of the space target was then built by summing the reflection components of all the surfaces. Photometric characteristics of the space-based target were simulated from its given geometrical dimensions, physical parameters, and orbital parameters. Experimental validation was performed with a scale model of the satellite. The calculated results fit well with the measured results, which indicates that the modeling method for the photometric characteristics of the space target is correct.
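A heavily simplified version of such a photometric model, for a diffusely reflecting sphere rather than a faceted satellite with a measured BRDF, can be sketched as follows. The phase function is the standard Lambert-sphere result; the flux normalization, albedo, and the Sun's apparent magnitude constant are illustrative assumptions, not the paper's model.

```python
import math

def lambert_phase(alpha):
    """Normalized phase function of a diffusely reflecting (Lambertian)
    sphere; alpha is the Sun-target-observer phase angle in radians.
    Equals 1 at alpha = 0 (full illumination) and 0 at alpha = pi."""
    return (math.sin(alpha) + (math.pi - alpha) * math.cos(alpha)) / math.pi

def apparent_magnitude(albedo, radius_m, range_m, alpha, m_sun=-26.74):
    """Toy photometric model: magnitude of a spherical satellite from
    reflected sunlight. Far cruder than the paper's facet-by-facet sum."""
    # Fraction of incident sunlight scattered toward the observer.
    flux_ratio = (2.0 / 3.0) * albedo * lambert_phase(alpha) \
                 * (radius_m / range_m) ** 2
    return m_sun - 2.5 * math.log10(flux_ratio)
```

The full model in the abstract replaces the single phase function with a sum of BRDF reflection components over all visible surfaces, but the qualitative behavior (dimmer with increasing phase angle and range) is the same.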
ERIC Educational Resources Information Center
Anderson, Garrett; Mulvey, Patrick
2013-01-01
By the time people earn physics PhDs, they have learned a great deal about physics and how research is conducted. However, physics PhDs also develop skills and knowledge in a number of related areas, such as advanced mathematics, programming, modeling, and technical writing. New physics PhDs draw upon an arsenal of skills and knowledge in their…
A Statistician's View of Upcoming Grand Challenges
NASA Astrophysics Data System (ADS)
Meng, Xiao Li
2010-01-01
In this session we have seen some snapshots of the broad spectrum of challenges, in this age of huge, complex, computer-intensive models, data, instruments, and questions. These challenges bridge astronomy at many wavelengths, basic physics, machine learning, and statistics. At one end of our spectrum, we think of 'compressing' the data with non-parametric methods. This raises the question of creating 'pseudo-replicas' of the data for uncertainty estimates. What would be involved in, e.g., bootstrap and related methods? Somewhere in the middle are non-parametric methods for encapsulating the uncertainty information. At the far end, we find more model-based approaches, with the physics model embedded in the likelihood and analysis. The other distinctive problem is the 'black-box' problem, where one has a complicated computer code, e.g. a fundamental-physics-based simulation, and one needs to know how changing the parameters at input -- due to uncertainties of any kind -- will map to changes in the output. All of these connect to challenges in the complexity of data and in computation speed. Dr. Meng will highlight ways to 'cut corners' with advanced computational techniques, such as Parallel Tempering and Equal Energy methods. There are also cautionary tales of running automated analysis on real data, where "30 sigma" outliers due to data artifacts can be more common than the astrophysical events of interest.
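The 'pseudo-replica' idea mentioned above is, in its simplest form, the percentile bootstrap: resample the data with replacement, recompute the statistic on each replica, and read uncertainty off the quantiles of the replicas. A minimal sketch with synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_ci(data, stat, n_boot=2000, level=0.68):
    """Percentile-bootstrap confidence interval for an arbitrary
    statistic: quantiles of the statistic over resampled replicas."""
    reps = np.array([stat(rng.choice(data, size=len(data), replace=True))
                     for _ in range(n_boot)])
    lo, hi = np.quantile(reps, [(1 - level) / 2, (1 + level) / 2])
    return lo, hi

# Synthetic 'measurements' with true mean 10 and scatter 2.
data = rng.normal(10.0, 2.0, size=500)
lo, hi = bootstrap_ci(data, np.mean)
```

For non-Gaussian statistics (medians, fitted parameters, clipped means) the same code applies unchanged, which is precisely its appeal for complex astronomical pipelines.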
Investigation of mechanistic deterioration modeling for bridge design and management.
DOT National Transportation Integrated Search
2017-04-01
The ongoing deterioration of highway bridges in Colorado dictates that an effective method for allocating limited management resources be developed. In order to predict bridge deterioration in advance, mechanistic models that analyze the physical pro...
Pubertal Development: Correspondence between Hormonal and Physical Development
ERIC Educational Resources Information Center
Shirtcliff, Elizabeth A.; Dahl, Ronald E.; Pollak, Seth D.
2009-01-01
Puberty is advanced by sex hormones, yet it is not clear how it is best measured. The interrelation of multiple indices of puberty was examined, including the Pubertal Development Scale (PDS), a picture-based interview about puberty (PBIP), and a physical exam. These physical pubertal measures were then associated with basal hormones responsible…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wardaya, P. D., E-mail: pongga.wardaya@utp.edu.my; Noh, K. A. B. M., E-mail: pongga.wardaya@utp.edu.my; Yusoff, W. I. B. W., E-mail: pongga.wardaya@utp.edu.my
This paper discusses a new approach for investigating the seismic wave velocity of rock, specifically carbonates, as affected by their pore structures. While the conventional routine of seismic velocity measurement depends heavily on extensive laboratory experiments, the proposed approach adopts the digital rock physics view, which rests on numerical experiments. Thus, instead of using a core sample, we use a thin-section image of carbonate rock to measure the effective seismic wave velocity of a wave travelling through it. In the numerical experiment, thin-section images act as the medium in which wave propagation is simulated. For the modeling, an advanced technique based on an artificial neural network was employed to build the velocity and density profiles, replacing each image's RGB pixel value with the seismic velocity and density of the corresponding rock constituent. Then, an ultrasonic wave was simulated propagating through the thin-section image using the finite-difference time-domain method, under the assumption of an acoustic, isotropic medium. Effective velocities were derived from the recorded signal and compared with velocity predictions from the Wyllie time-average model and the Kuster-Toksoz rock physics model. To perform the modeling, image analysis routines were undertaken to quantify the pore aspect ratio, which is assumed to represent the rock's pore structure. In addition, the porosity and mineral fractions required for velocity modeling were quantified using an integrated neural network and image analysis technique. It was found that the Kuster-Toksoz model gives a closer prediction to the measured velocity than the Wyllie time-average model. We also conclude that the Wyllie time average, which does not incorporate a pore structure parameter, deviates significantly for samples having more than 40% porosity. 
Utilizing this approach, we found good agreement between the numerical experiment and the theoretically derived rock physics models for estimating the effective seismic wave velocity of rock.
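The Wyllie time-average model the authors compare against is a one-line mixing rule: slownesses add in proportion to volume fractions, so it carries no pore-structure information at all, which is consistent with the deviation reported at high porosity. A small sketch (the brine and calcite velocities below are typical textbook values, used only for illustration):

```python
def wyllie_velocity(phi, v_fluid, v_matrix):
    """Wyllie time-average effective P-wave velocity:
    1/V = phi/V_fluid + (1 - phi)/V_matrix.
    Note: no pore-shape (aspect-ratio) parameter enters, unlike
    Kuster-Toksoz-type models."""
    return 1.0 / (phi / v_fluid + (1.0 - phi) / v_matrix)

# Illustrative values: brine ~1500 m/s, calcite matrix ~6640 m/s.
v_at_20pct = wyllie_velocity(0.20, 1500.0, 6640.0)
v_at_40pct = wyllie_velocity(0.40, 1500.0, 6640.0)
```

The model interpolates monotonically between the matrix velocity at zero porosity and the fluid velocity at unit porosity.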
NASA Astrophysics Data System (ADS)
Khuwaileh, Bassam
High fidelity simulation of nuclear reactors entails large scale applications characterized with high dimensionality and tremendous complexity where various physics models are integrated in the form of coupled models (e.g. neutronic with thermal-hydraulic feedback). Each of the coupled modules represents a high fidelity formulation of the first principles governing the physics of interest. Therefore, new developments in high fidelity multi-physics simulation and the corresponding sensitivity/uncertainty quantification analysis are paramount to the development and competitiveness of reactors achieved through enhanced understanding of the design and safety margins. Accordingly, this dissertation introduces efficient and scalable algorithms for performing efficient Uncertainty Quantification (UQ), Data Assimilation (DA) and Target Accuracy Assessment (TAA) for large scale, multi-physics reactor design and safety problems. This dissertation builds upon previous efforts for adaptive core simulation and reduced order modeling algorithms and extends these efforts towards coupled multi-physics models with feedback. The core idea is to recast the reactor physics analysis in terms of reduced order models. This can be achieved via identifying the important/influential degrees of freedom (DoF) via the subspace analysis, such that the required analysis can be recast by considering the important DoF only. In this dissertation, efficient algorithms for lower dimensional subspace construction have been developed for single physics and multi-physics applications with feedback. Then the reduced subspace is used to solve realistic, large scale forward (UQ) and inverse problems (DA and TAA). Once the elite set of DoF is determined, the uncertainty/sensitivity/target accuracy assessment and data assimilation analysis can be performed accurately and efficiently for large scale, high dimensional multi-physics nuclear engineering applications. 
Hence, in this work a Karhunen-Loeve (KL) based algorithm previously developed to quantify the uncertainty for single-physics models is extended to large scale multi-physics coupled problems with feedback effects. Moreover, a non-linear surrogate-based UQ approach is developed, used, and compared to the performance of the KL approach and a brute-force Monte Carlo (MC) approach. In addition, an efficient Data Assimilation (DA) algorithm is developed to assess information about the model's parameters: nuclear data cross-sections and thermal-hydraulics parameters. Two improvements are introduced in order to perform DA on high dimensional problems. First, a goal-oriented surrogate model can be used to replace the original models in the depletion sequence (MPACT - COBRA-TF - ORIGEN). Second, approximating the complex and high dimensional solution space with a lower dimensional subspace makes the sampling process necessary for DA feasible for high dimensional problems. Moreover, safety analysis and design optimization depend on the accurate prediction of various reactor attributes. Predictions can be enhanced by reducing the uncertainty associated with the attributes of interest. Accordingly, an inverse problem can be defined and solved to assess the contributions from sources of uncertainty, and experimental effort can be subsequently directed to further reduce the uncertainty associated with these sources. In this dissertation a subspace-based, gradient-free, nonlinear algorithm for inverse uncertainty quantification, namely Target Accuracy Assessment (TAA), has been developed and tested. The ideas proposed in this dissertation were first validated using lattice physics applications simulated with the SCALE6.1 package (Pressurized Water Reactor (PWR) and Boiling Water Reactor (BWR) lattice models). 
Ultimately, the algorithms proposed here were applied to perform UQ and DA for an assembly-level problem (CASL Progression Problem Number 6) and a core-wide problem representing Watts Bar Nuclear 1 (WBN1) for cycle 1 of depletion (CASL Progression Problem Number 9), modeled and simulated using VERA-CS, which consists of several coupled multi-physics models. The analysis and algorithms developed in this dissertation were encoded and implemented in a newly developed toolkit, the Reduced Order Modeling based Uncertainty/Sensitivity Estimator (ROMUSE).
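The subspace construction at the heart of such reduced-order modeling can be sketched with a truncated SVD of a snapshot matrix, the discrete analogue of the Karhunen-Loeve expansion (synthetic data below; the dissertation's randomized range-finding algorithms are considerably more elaborate):

```python
import numpy as np

rng = np.random.default_rng(1)

def build_reduced_basis(snapshots, rank):
    """Proper orthogonal decomposition (discrete Karhunen-Loeve):
    the leading left singular vectors of a snapshot matrix span the
    dominant 'active subspace' of the model's response."""
    u, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    return u[:, :rank], s

# Synthetic snapshots that (almost) live in a 3-dimensional subspace.
modes = rng.normal(size=(200, 3))
snaps = modes @ rng.normal(size=(3, 50)) \
        + 1e-8 * rng.normal(size=(200, 50))   # tiny 'noise floor'
U, s = build_reduced_basis(snaps, rank=3)
recon = U @ (U.T @ snaps)   # project snapshots onto the reduced subspace
```

Once such a basis is in hand, forward UQ sampling and the inverse DA/TAA analyses operate on the handful of reduced coordinates instead of the full state, which is what makes them tractable at high dimension.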
ERIC Educational Resources Information Center
Bhathal, Ragbir; Sharma, Manjula D.; Mendez, Alberto
2010-01-01
This paper describes an educational analysis of a first year physics experiment on standing waves for engineering students. The educational analysis is based on the ACELL (Advancing Chemistry by Enhancing Learning in the Laboratory) approach which includes a statement of educational objectives and an analysis of student learning experiences. The…
NASA Astrophysics Data System (ADS)
Paganini, Michela; de Oliveira, Luke; Nachman, Benjamin
2018-01-01
The precise modeling of subatomic particle interactions and propagation through matter is paramount for the advancement of nuclear and particle physics searches and precision measurements. The most computationally expensive step in the simulation pipeline of a typical experiment at the Large Hadron Collider (LHC) is the detailed modeling of the full complexity of physics processes that govern the motion and evolution of particle showers inside calorimeters. We introduce CaloGAN, a new fast simulation technique based on generative adversarial networks (GANs). We apply these neural networks to the modeling of electromagnetic showers in a longitudinally segmented calorimeter and achieve speedup factors comparable to or better than existing full simulation techniques on CPU (100×-1000×) and even faster on GPU (up to ~10^5×). There are still challenges for achieving precision across the entire phase space, but our solution can reproduce a variety of geometric shower shape properties of photons, positrons, and charged pions. This represents a significant stepping stone toward a full neural network-based detector simulation that could save significant computing time and enable many analyses now and in the future.
Can Industrial Physics Avoid Being Creatively Destroyed?
NASA Astrophysics Data System (ADS)
Hass, Kenneth C.
2004-03-01
Opportunities abound for physics and physicists to remain vital contributors to industrial innovation throughout the 21st century. The key questions are whether those trained in physics are sufficiently willing and flexible to continuously enhance their value to their companies by adapting to changing business priorities and whether business leaders are sufficiently enlightened to recognize and exploit the unique skills and creativity that physicists often provide. "Industrial physics" today is more diverse than ever, and answers to the above questions will vary with sector, company, and even individual physicists. Such heterogeneity creates new challenges for the physics community in general, which may need to undergo significant cultural change to maintain strong ties between physicists in industry, academia, and government. Insights from the emerging science of complex systems will be used to emphasize the importance of realistic mental models for the interactions between science and technology and the pathways from scientific advance to successful commercialization. Examples will be provided of the ongoing value of physics-based research in the auto industry and of the growing importance of interdisciplinary approaches to the technical needs of industry.
Identifiability Of Systems With Modeling Errors
NASA Technical Reports Server (NTRS)
Hadaegh, Yadolah "Fred"
1988-01-01
Advances in the theory of modeling errors are reported in a recent paper on errors in mathematical models of deterministic linear or weakly nonlinear systems. The paper extends theoretical work described in NPO-16661 and NPO-16785 and presents a concrete way of accounting for the difference in structure between a mathematical model and the physical process or system that it represents.
TU-C-18C-01: Medical Physics 1.0 to 2.0: Introduction and Panel Discussion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samei, E; Pfeiffer, D; Frey, G
2014-06-15
Medical Physics 2.0, a new frontier in clinical imaging physics: Diagnostic imaging has always been a technological highlight of modern medicine. Imaging systems, with their ever-expanding advancement in terms of technology and application, increasingly require skilled expertise to understand the delicacy of their operation, monitor their performance, design their effective use, and ensure their overall quality and safety, scientifically and in quantitative terms. Physicists can play a crucial role in that process. But that role has largely remained a severely untapped resource. Many imaging centers fail to appreciate this potential, with medical physics groups either nonexistent or highly understaffed and their services poorly integrated into the patient care process. As a field, we have yet to define and enact how the clinical physicist can engage as an active, effective, and integral member of the clinical team, and how the services that she/he provides can be financially accounted for. Physicists do and will always contribute to research and development. However, their indispensable contribution to clinical imaging operations is something that has not been adequately established. That, in conjunction with new realities of healthcare practice, indicates a growing need to establish an updated approach to clinical medical imaging physics. This presentation aims to describe a vision of how clinical imaging physics can expand beyond traditional insular models of inspection and acceptance testing, oriented toward compliance, towards team-based models of operational engagement addressing topics such as the non-classical challenges of new technologies, quantitative imaging, and operational optimization. The Medical Physics 2.0 paradigm extends clinical medical physics from isolated characterization of the inherent properties of the equipment to effective use of the equipment and to retrospective evaluation of clinical performance. 
This is an existential transition of the field that speaks to the new paradigms of value-based and evidence-based medicine, comparative effectiveness, and meaningful use. The panel discussion that follows includes prominent practitioners, thinkers, and leaders who will lead the discussion on how Medical Physics 2.0 can be actualized. Topics of discussion will include the administrative, financial, regulatory, and accreditation requirements of the new paradigm, effective models of practice, and the steps that we need to take to make MP 2.0 a reality. Learning Objectives: To understand the new paradigm of clinical medical physics practice, extending from traditional insular models of compliance towards team-based models of operational engagement. To understand how clinical physics can most effectively contribute to clinical care. To recognize the impediments to the Medical Physics 2.0 paradigm.
TerraFERMA: Harnessing Advanced Computational Libraries in Earth Science
NASA Astrophysics Data System (ADS)
Wilson, C. R.; Spiegelman, M.; van Keken, P.
2012-12-01
Many important problems in Earth sciences can be described by non-linear coupled systems of partial differential equations. These "multi-physics" problems include thermo-chemical convection in Earth and planetary interiors, interactions of fluids and magmas with the Earth's mantle and crust and coupled flow of water and ice. These problems are of interest to a large community of researchers but are complicated to model and understand. Much of this complexity stems from the nature of multi-physics where small changes in the coupling between variables or constitutive relations can lead to radical changes in behavior, which in turn affect critical computational choices such as discretizations, solvers and preconditioners. To make progress in understanding such coupled systems requires a computational framework where multi-physics problems can be described at a high-level while maintaining the flexibility to easily modify the solution algorithm. Fortunately, recent advances in computational science provide a basis for implementing such a framework. Here we present the Transparent Finite Element Rapid Model Assembler (TerraFERMA), which leverages several advanced open-source libraries for core functionality. FEniCS (fenicsproject.org) provides a high level language for describing the weak forms of coupled systems of equations, and an automatic code generator that produces finite element assembly code. PETSc (www.mcs.anl.gov/petsc) provides a wide range of scalable linear and non-linear solvers that can be composed into effective multi-physics preconditioners. SPuD (amcg.ese.ic.ac.uk/Spud) is an application neutral options system that provides both human and machine-readable interfaces based on a single xml schema. Our software integrates these libraries and provides the user with a framework for exploring multi-physics problems. A single options file fully describes the problem, including all equations, coefficients and solver options. 
Custom compiled applications are generated from this file but share an infrastructure for services common to all models, e.g. diagnostics, checkpointing and global non-linear convergence monitoring. This maximizes code reusability, reliability and longevity ensuring that scientific results and the methods used to acquire them are transparent and reproducible. TerraFERMA has been tested against many published geodynamic benchmarks including 2D/3D thermal convection problems, the subduction zone benchmarks and benchmarks for magmatic solitary waves. It is currently being used in the investigation of reactive cracking phenomena with applications to carbon sequestration, but we will principally discuss its use in modeling the migration of fluids in subduction zones. Subduction zones require an understanding of the highly nonlinear interactions of fluids with solids and thus provide an excellent scientific driver for the development of multi-physics software.
Supersonic Combustion in Air-Breathing Propulsion Systems for Hypersonic Flight
NASA Astrophysics Data System (ADS)
Urzay, Javier
2018-01-01
Great efforts have been dedicated during the last decades to the research and development of hypersonic aircraft that can fly at several times the speed of sound. These aerospace vehicles have revolutionary applications in national security as advanced hypersonic weapons, in space exploration as reusable stages for access to low Earth orbit, and in commercial aviation as fast long-range methods for air transportation of passengers around the globe. This review addresses the topic of supersonic combustion, which represents the central physical process that enables scramjet hypersonic propulsion systems to accelerate aircraft to ultra-high speeds. The description focuses on recent experimental flights and ground-based research programs and highlights associated fundamental flow physics, subgrid-scale model development, and full-system numerical simulations.
Hybrid Circuits with Nanofluidic Diodes and Load Capacitors
NASA Astrophysics Data System (ADS)
Ramirez, P.; Garcia-Morales, V.; Gomez, V.; Ali, M.; Nasir, S.; Ensinger, W.; Mafe, S.
2017-06-01
The chemical and physical input signals characteristic of micro- and nanofluidic devices operating in ionic solutions should eventually be translated into output electric currents and potentials that are monitored with solid-state components. This crucial step requires the design of hybrid circuits showing robust electrical coupling between ionic solutions and electronic elements. We study experimentally and theoretically the connectivity of the nanofluidic diodes in single-pore and multipore membranes with conventional capacitor systems for the cases of constant, periodic, and white-noise input potentials. The experiments demonstrate the reliable operation of these hybrid circuits over a wide range of membrane resistances, electrical capacitances, and solution pH values. The model simulations are based on empirical equations that have a solid physical basis and provide a convenient description of the electrical circuit operation. The results should contribute to advance signal transduction and processing using nanopore-based biosensors and bioelectronic interfaces.
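The diode-plus-capacitor coupling studied here can be caricatured by its textbook limit: an ideal rectifying element charging a load capacitor under a periodic input. A forward-Euler sketch (all component values are illustrative assumptions, not the membrane parameters of the paper):

```python
import math

def rectify(amplitude=1.0, freq=1.0, r_on=100.0, c=1e-3, r_load=1e6,
            dt=1e-4, cycles=20):
    """Forward-Euler simulation of an ideal-diode half-wave rectifier
    charging a load capacitor: a crude stand-in for the nanofluidic
    diode + capacitor hybrid circuit. Returns the final capacitor
    voltage after the given number of input cycles."""
    v_c = 0.0
    n = int(cycles / freq / dt)
    for k in range(n):
        v_in = amplitude * math.sin(2.0 * math.pi * freq * k * dt)
        i_diode = max(0.0, (v_in - v_c) / r_on)   # conducts one way only
        i_load = v_c / r_load                     # slow discharge
        v_c += dt * (i_diode - i_load) / c
    return v_c
```

Because the charging time constant (r_on * c = 0.1 s) is short compared with the input period and the discharge constant (r_load * c) is long, the capacitor voltage ratchets up toward the input amplitude, which is the rectifying behavior the hybrid circuits exploit.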
A Robotic Coach Architecture for Elder Care (ROCARE) Based on Multi-user Engagement Models
Fan, Jing; Bian, Dayi; Zheng, Zhi; Beuscher, Linda; Newhouse, Paul A.; Mion, Lorraine C.; Sarkar, Nilanjan
2017-01-01
The aging population with its concomitant medical conditions, physical and cognitive impairments, at a time of strained resources, establishes the urgent need to explore advanced technologies that may enhance function and quality of life. Recently, robotic technology, especially socially assistive robotics, has been investigated to address the physical, cognitive, and social needs of older adults. Most systems to date have predominantly focused on one-on-one human-robot interaction (HRI). In this paper, we present a multi-user engagement-based robotic coach system architecture (ROCARE). ROCARE is capable of administering both one-on-one and multi-user HRI, providing implicit and explicit channels of communication, and individualized activity management for long-term engagement. Two preliminary feasibility studies, a one-on-one interaction and a triadic interaction with two humans and a robot, were conducted, and the results indicated potential usefulness and acceptance by older adults, with and without cognitive impairment. PMID:28113672
The effects of physical aging at elevated temperatures on the viscoelastic creep on IM7/K3B
NASA Technical Reports Server (NTRS)
Gates, Thomas S.; Feldman, Mark
1994-01-01
Physical aging at elevated temperature of the advanced composite IM7/K3B was investigated through the use of creep compliance tests. Testing consisted of short-term isothermal creep/recovery, with the creep segments performed at constant load. The matrix-dominated transverse tensile and in-plane shear behavior were measured at temperatures ranging from 200 to 230 C. Through the use of time-based shifting procedures, the aging shift factors, shift rates, and momentary master curve parameters were found at each temperature. These material parameters were used as input to a predictive methodology based upon effective time theory and linear viscoelasticity combined with classical lamination theory. Long-term creep compliance test data were compared to predictions to verify the method. The model was then used to predict the long-term creep behavior for several general laminates.
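The time-based shifting described above can be sketched with a stretched-exponential (KWW) momentary compliance whose retardation time grows with aging time: rescaling the time axis by the aging shift factor then collapses curves measured at different aging times onto a single master curve. The functional form and parameter values below are illustrative, not the measured IM7/K3B values.

```python
import numpy as np

def creep_compliance(t, te, S0=1.0, tau_ref=100.0, beta=0.35, mu=0.9,
                     te_ref=1.0):
    """Momentary creep compliance in KWW (stretched-exponential) form,
    S(t) = S0 * exp((t/tau)**beta), with the retardation time tau growing
    with aging time te as (te/te_ref)**mu; mu plays the role of the
    aging shift rate. All values here are illustrative assumptions."""
    a_te = (te / te_ref) ** mu            # aging shift factor
    return S0 * np.exp((t / (tau_ref * a_te)) ** beta)

# Effective-time idea: a curve measured at te = 10 collapses onto the
# te = 1 curve once time is rescaled by the shift factor 10**mu.
t = np.logspace(-1, 2, 50)
S_young = creep_compliance(t, te=1.0)
S_aged = creep_compliance(t * (10.0 ** 0.9), te=10.0)
```

This collapse is exactly the property that lets short-term momentary tests, combined with the measured shift rate, predict long-term creep.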
A Robotic Coach Architecture for Elder Care (ROCARE) Based on Multi-User Engagement Models.
Fan, Jing; Bian, Dayi; Zheng, Zhi; Beuscher, Linda; Newhouse, Paul A; Mion, Lorraine C; Sarkar, Nilanjan
2017-08-01
The aging population, with its concomitant medical conditions and physical and cognitive impairments, at a time of strained resources, establishes an urgent need to explore advanced technologies that may enhance function and quality of life. Recently, robotic technology, especially socially assistive robotics, has been investigated to address the physical, cognitive, and social needs of older adults. Most systems to date have predominantly focused on one-on-one human-robot interaction (HRI). In this paper, we present a multi-user engagement-based robotic coach system architecture (ROCARE). ROCARE is capable of administering both one-on-one and multi-user HRI, providing implicit and explicit channels of communication, and individualized activity management for long-term engagement. Two preliminary feasibility studies, a one-on-one interaction and a triadic interaction with two humans and a robot, were conducted; the results indicated potential usefulness and acceptance by older adults, with and without cognitive impairment.
Quantum lattice model solver HΦ
NASA Astrophysics Data System (ADS)
Kawamura, Mitsuaki; Yoshimi, Kazuyoshi; Misawa, Takahiro; Yamaji, Youhei; Todo, Synge; Kawashima, Naoki
2017-08-01
HΦ [aitch-phi] is a program package based on a Lanczos-type eigenvalue solver applicable to a broad range of quantum lattice models, i.e., arbitrary quantum lattice models with two-body interactions, including the Heisenberg model, the Kitaev model, the Hubbard model, and the Kondo-lattice model. While it works well on PCs and PC clusters, HΦ also runs efficiently on massively parallel computers, which considerably extends the tractable range of system sizes. In addition, unlike most existing packages, HΦ supports finite-temperature calculations through the method of thermal pure quantum (TPQ) states. In this paper, we explain the theoretical background and user interface of HΦ. We also show benchmark results of HΦ on supercomputers such as the K computer at RIKEN Advanced Institute for Computational Science (AICS) and SGI ICE XA (Sekirei) at the Institute for Solid State Physics (ISSP).
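To illustrate the kind of Lanczos iteration underlying such a solver (this is not HΦ's implementation, which is a compiled, MPI-parallel package), here is a minimal dense-matrix sketch applied to the two-site Heisenberg model H = S1·S2, whose singlet ground state has energy -3/4:

```python
import numpy as np

def lanczos_ground_energy(H, v0, m=20):
    """Plain Lanczos: build an m-step tridiagonal matrix T and return its
    lowest eigenvalue as an estimate of the ground-state energy of H."""
    alphas, betas = [], []
    v_prev = np.zeros_like(v0, dtype=float)
    v = v0 / np.linalg.norm(v0)
    beta = 0.0
    for _ in range(min(m, H.shape[0])):
        w = H @ v - beta * v_prev        # apply H, subtract previous direction
        alpha = v @ w
        w = w - alpha * v                # orthogonalize against current vector
        alphas.append(alpha)
        beta = np.linalg.norm(w)
        if beta < 1e-12:                 # Krylov space exhausted
            break
        betas.append(beta)
        v_prev, v = v, w / beta
    k = len(alphas)
    T = np.diag(alphas) + np.diag(betas[:k - 1], 1) + np.diag(betas[:k - 1], -1)
    return np.linalg.eigvalsh(T)[0]

# Two spin-1/2 sites in the basis |uu>, |ud>, |du>, |dd>
H = np.array([[0.25, 0.0, 0.0, 0.0],
              [0.0, -0.25, 0.5, 0.0],
              [0.0, 0.5, -0.25, 0.0],
              [0.0, 0.0, 0.0, 0.25]])
v0 = np.array([0.0, 1.0, 0.0, 0.0])
E0 = lanczos_ground_energy(H, v0)
print(E0)  # ~ -0.75 (singlet energy)
```

Production codes such as HΦ apply the same recursion matrix-free, storing only a few Krylov vectors, which is what makes billion-dimensional Hilbert spaces tractable.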
DARPA Helicopter Quieting Program W911NF0410424
2009-05-01
Leishman, J. G. and Beddoes, T. S., "A Semi-Empirical Model for Dynamic Stall," Journal of the American Helicopter Society, Vol. 34, No. 3, July 1989 ... of physical phenomena that include transonic and compressibility effects on the advancing blade, dynamic stall on the retreating blades and the ... research approach is that even the most advanced models of a given discipline, e.g., comprehensive structural or flight dynamics codes, concentrate on a very
Tumour and normal tissue radiobiology in mouse models: how close are mice to mini-humans?
Koontz, Bridget F; Verhaegen, Frank; De Ruysscher, Dirk
2017-01-01
Animal modelling is essential to the study of radiobiology and the advancement of clinical radiation oncology by providing preclinical data. Mouse models in particular have been highly utilized in the study of both tumour and normal tissue radiobiology because of their cost effectiveness and versatility. Technology has significantly advanced in preclinical radiation techniques to allow highly conformal image-guided irradiation of small animals in an effort to mimic human treatment capabilities. However, the biological and physical limitations of animal modelling should be recognized and considered when interpreting preclinical radiotherapy (RT) studies. Murine tumour and normal tissue radioresponse has been shown to vary from human cellular and molecular pathways. Small animal irradiation techniques utilize different anatomical boundaries and may have different physical properties than human RT. This review addresses the difference between the human condition and mouse models and discusses possible strategies for future refinement of murine models of cancer and radiation for the benefit of both basic radiobiology and clinical translation.
Atomistic modeling for interfacial properties of Ni-Al-V ternary system
NASA Astrophysics Data System (ADS)
Dong, Wei-ping; Lee, Byeong-Joo; Chen, Zheng
2014-05-01
Interatomic potentials for Ni-Al-V ternary systems have been developed based on the second-nearest-neighbor modified embedded-atom method potential formalism. The potentials can describe various fundamental physical properties of the relevant materials in good agreement with experimental information. The potential is utilized for an atomistic computation of interfacial properties of Ni-Al-V alloys. It is found that vanadium atoms segregate on the γ-fcc/L12 interface and this segregation affects the interfacial properties. The applicability of the atomistic approach to an elaborate alloy design of advanced Ni-based superalloys through the investigation of the effect of alloying elements on interfacial properties is discussed.
Cross-disciplinarity in the advance of Antarctic ecosystem research.
Gutt, J; Isla, E; Bertler, A N; Bodeker, G E; Bracegirdle, T J; Cavanagh, R D; Comiso, J C; Convey, P; Cummings, V; De Conto, R; De Master, D; di Prisco, G; d'Ovidio, F; Griffiths, H J; Khan, A L; López-Martínez, J; Murray, A E; Nielsen, U N; Ott, S; Post, A; Ropert-Coudert, Y; Saucède, T; Scherer, R; Schiaparelli, S; Schloss, I R; Smith, C R; Stefels, J; Stevens, C; Strugnell, J M; Trimborn, S; Verde, C; Verleyen, E; Wall, D H; Wilson, N G; Xavier, J C
2018-02-01
The biodiversity, ecosystem services and climate variability of the Antarctic continent and the Southern Ocean are major components of the whole Earth system. Antarctic ecosystems are driven more strongly by the physical environment than many other marine and terrestrial ecosystems. As a consequence, to understand ecological functioning, cross-disciplinary studies are especially important in Antarctic research. The conceptual study presented here is based on a workshop initiated by the Research Programme Antarctic Thresholds - Ecosystem Resilience and Adaptation of the Scientific Committee on Antarctic Research, which focussed on challenges in identifying and applying cross-disciplinary approaches in the Antarctic. Novel ideas and first steps in their implementation were clustered into eight themes. These ranged from scale problems, through risk maps, and organism/ecosystem responses to multiple environmental changes and evolutionary processes. Scaling models and data across different spatial and temporal scales were identified as an overarching challenge. Approaches to bridge gaps in Antarctic research programmes included multi-disciplinary monitoring, linking biomolecular findings and simulated physical environments, as well as integrative ecological modelling. The results of advanced cross-disciplinary approaches can contribute significantly to our knowledge of Antarctic and global ecosystem functioning, the consequences of climate change, and to global assessments that ultimately benefit humankind. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
2016-03-31
particular physical model under consideration. Therefore, in the following the enrichment functions are discussed with respect to particular ... some domains of influence are extended outside of the physical boundary, the reproducing conditions enforced in Eq. (6) guarantee the order of ... often used in astrophysics problems, where many fluid problems are encountered and even "solid" bodies deform under their own gravity. It can also
NASA Iced Aerodynamics and Controls Current Research
NASA Technical Reports Server (NTRS)
Addy, Gene
2009-01-01
This slide presentation reviews the state of current research in aerodynamics and aircraft control in icing conditions by the Aviation Safety Program, part of the Integrated Resilient Aircraft Controls Project (IRAC). Included in the presentation is an overview of the modeling efforts. The objective of the modeling is to develop experimental and computational methods to model and predict aircraft response during adverse flight conditions, including icing. The aircraft icing modeling efforts include Ice-Contaminated Aerodynamics Modeling, which examines the effects of ice contamination on aircraft aerodynamics and includes CFD modeling of ice-contaminated aircraft aerodynamics, and Advanced Ice Accretion Process Modeling, which examines the physics of ice accretion and works on computational modeling of ice accretions. The IRAC testbed, a Generic Transport Model (GTM), and its use in the investigation of the effects of icing on its aerodynamics are also reviewed. This work has led to a more thorough understanding of icing physics; theoretical and empirical models of ice accretion for airframes; advanced 3D ice accretion prediction codes; CFD methods for iced aerodynamics; and a better understanding of iced aircraft aerodynamics and its effects on control surface effectiveness.
NASA Astrophysics Data System (ADS)
Wu, Haiqing; Bai, Bing; Li, Xiaochun
2018-02-01
Existing analytical or approximate solutions that are appropriate for describing the migration mechanics of CO2 and the evolution of fluid pressure in reservoirs do not consider the high compressibility of CO2, which reduces their calculation accuracy and application value. Therefore, this work first derives a new governing equation that represents the movement of complex fluids in reservoirs, based on the equation of continuity and the generalized Darcy's law. A more rigorous definition of the coefficient of compressibility of fluid is then presented, and a power function model (PFM) that characterizes the relationship between the physical properties of CO2 and the pressure is derived. Meanwhile, to avoid the difficulty of determining the saturation of fluids, a method that directly assumes the average relative permeability of each fluid phase in different fluid domains is proposed, based on the theory of gradual change. An advanced analytical solution is obtained that includes both the partial miscibility and the compressibility of CO2 and brine in evaluating the evolution of fluid pressure by integrating within different regions. Finally, two typical sample analyses are used to verify the reliability, improved nature and universality of this new analytical solution. Based on the physical characteristics and the results calculated for the examples, this work elaborates the concept and basis of partitioning for use in further work.
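The compressibility definition discussed above can be made concrete: if the power function model is taken as rho(p) = a * p**n (an assumed form, consistent with the name but not necessarily the paper's exact expression), the coefficient of compressibility c = (1/rho) d(rho)/dp reduces analytically to n/p, which a finite-difference sketch confirms. Parameter values are illustrative only.

```python
def density_pfm(p, a, n):
    """Power-function model (PFM): rho(p) = a * p**n (assumed illustrative form)."""
    return a * p ** n

def compressibility(p, a, n, dp=100.0):
    """c = (1/rho) * d(rho)/dp via a central finite difference."""
    rho = density_pfm(p, a, n)
    drho = (density_pfm(p + dp, a, n) - density_pfm(p - dp, a, n)) / (2 * dp)
    return drho / rho

# Hypothetical parameters for a compressible fluid such as supercritical CO2
a, n = 3.1, 0.55
p = 12e6  # Pa
print(compressibility(p, a, n))  # numerically ~ n / p
```

The analytic result c = n/p shows why a highly compressible fluid like CO2 cannot be treated with a constant compressibility: c varies strongly along the pressure gradient toward the injection well.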
Some highlights in few-body nuclear physics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holt, R. J.
2000-12-07
During the past five years, there have been tremendous advances in both experiments and theoretical calculations in few-body nuclear systems. Advances in technology have permitted experiments of unprecedented accuracy. Jefferson Laboratory has begun operation and the first round of experimental results has become available. New polarization techniques have been exploited at a number of laboratories, in particular at Jefferson Lab, IUCF, RIKEN, NIKHEF, Mainz, MIT-Bates and HERMES. Some of these results will be shown here. In addition, there have been tremendous advances in few-body theory. Five modern two-nucleon potentials, which describe the nucleon-nucleon data extremely well, have become available. A standard model of nuclear physics based on these two-nucleon potentials as well as modern three-nucleon forces has emerged. This standard model has enjoyed tremendous success in the few-body systems. Exact three-body calculations have been extended into the continuum in order to take full advantage of scattering data in advancing our understanding of the few-nucleon system. In addition, the application of chiral symmetry has become an important constraint on nucleon-nucleon as well as three-nucleon forces. As a result of all these efforts, we have seen rapid developments in the three-body force. Despite these advances, there remain some extremely important open issues: (1) What is the role of quarks and gluons in nuclear structure? (2) Can we distinguish meson exchange from quark interchange? (3) Is few-body theory sufficient to describe simultaneously the mass 2, 3 and 4 form factors? (4) What is the isospin and spin dependence of the three-body force? (5) Are there medium modifications of nucleons and mesons in nuclei? (6) Is there an enhancement of antiquarks or pions in nuclei related to the binding? (7) Are short-range correlations observable in nuclei? In this paper the author summarizes the status of our understanding of these issues.
ERIC Educational Resources Information Center
Young, Robert D.
1973-01-01
Discusses the charge independence, wavefunctions, magnetic moments, and high-energy scattering of hadrons on the basis of group theory and nonrelativistic quark model with mass spectrum calculated by first-order perturbation theory. The presentation is explainable to advanced undergraduate students. (CC)
Multi-scale landslide hazard assessment: Advances in global and regional methodologies
NASA Astrophysics Data System (ADS)
Kirschbaum, Dalia; Peters-Lidard, Christa; Adler, Robert; Hong, Yang
2010-05-01
The increasing availability of remotely sensed surface data and precipitation provides a unique opportunity to explore how smaller-scale landslide susceptibility and hazard assessment methodologies may be applicable at larger spatial scales. This research first considers an emerging satellite-based global algorithm framework, which evaluates how the landslide susceptibility and satellite derived rainfall estimates can forecast potential landslide conditions. An analysis of this algorithm using a newly developed global landslide inventory catalog suggests that forecasting errors are geographically variable due to improper weighting of surface observables, resolution of the current susceptibility map, and limitations in the availability of landslide inventory data. These methodological and data limitation issues can be more thoroughly assessed at the regional level, where available higher resolution landslide inventories can be applied to empirically derive relationships between surface variables and landslide occurrence. The regional empirical model shows improvement over the global framework in advancing near real-time landslide forecasting efforts; however, there are many uncertainties and assumptions surrounding such a methodology that decreases the functionality and utility of this system. This research seeks to improve upon this initial concept by exploring the potential opportunities and methodological structure needed to advance larger-scale landslide hazard forecasting and make it more of an operational reality. Sensitivity analysis of the surface and rainfall parameters in the preliminary algorithm indicates that surface data resolution and the interdependency of variables must be more appropriately quantified at local and regional scales. Additionally, integrating available surface parameters must be approached in a more theoretical, physically-based manner to better represent the physical processes underlying slope instability and landslide initiation. 
Several rainfall infiltration and hydrological flow models have been developed to model slope instability at small spatial scales. This research investigates the potential of applying a more quantitative hydrological model to larger spatial scales, utilizing satellite and surface data inputs that are obtainable over different geographic regions. Due to the significant role that data and methodological uncertainties play in the effectiveness of landslide hazard assessment outputs, the methodology and data inputs are considered within an ensemble uncertainty framework in order to better resolve the contribution and limitations of model inputs and to more effectively communicate the model skill for improved landslide hazard assessment.
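A hedged sketch of the kind of decision rule such a satellite-based framework combines: a static susceptibility index gated by a Caine-type rainfall intensity-duration threshold I = alpha * D**(-beta). The function names, cutoff, and coefficients below are illustrative assumptions, not the calibrated values of the algorithm described above.

```python
def rainfall_threshold(duration_h, alpha=15.0, beta=0.4):
    """Caine-type intensity-duration threshold I = alpha * D**(-beta), in mm/h.
    Coefficients are illustrative, not calibrated values."""
    return alpha * duration_h ** (-beta)

def landslide_nowcast(susceptibility, intensity_mm_h, duration_h,
                      susc_cutoff=0.5):
    """Flag a potential landslide when a susceptible cell also exceeds
    the rainfall intensity-duration threshold."""
    if susceptibility < susc_cutoff:
        return "none"
    if intensity_mm_h >= rainfall_threshold(duration_h):
        return "warning"
    return "watch"

print(landslide_nowcast(0.8, 12.0, 24.0))  # intense storm over a susceptible cell
print(landslide_nowcast(0.3, 12.0, 24.0))  # low susceptibility, no flag
```

The abstract's point about improper weighting of surface observables corresponds here to the choice of `susc_cutoff` and the threshold coefficients, which in an ensemble-uncertainty framework would be treated as distributions rather than fixed numbers.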
Visualizing and Quantifying Pore Scale Fluid Flow Processes With X-ray Microtomography
NASA Astrophysics Data System (ADS)
Wildenschild, D.; Hopmans, J. W.; Vaz, C. M.; Rivers, M. L.
2001-05-01
When using mathematical models based on Darcy's law, it is often necessary to simplify geometry, physics, or both, and the capillary bundle-of-tubes approach neglects a fundamentally important characteristic of porous solids, namely the interconnectedness of the pore space. New approaches to pore-scale modeling that arrange capillary tubes in two- or three-dimensional pore space have been and are still under development. Network models generally represent pore bodies by spheres, while the pore throats are usually represented by cylinders or conical shapes. Lattice Boltzmann approaches numerically solve the Navier-Stokes equations in a realistic, microscopically disordered geometry, which offers the ability to study the microphysical basis of macroscopic flow without the need for simplified geometry or physics. In addition to these developments in numerical modeling techniques, new theories have proposed that interfacial area should be considered a primary variable in the modeling of multi-phase flow systems. In the wake of this progress emerges an increasing need for new ways of evaluating pore-scale models, and for techniques that can resolve and quantify phase interfaces in porous media. The mechanisms operating at the pore scale cannot be measured with traditional experimental techniques; however, x-ray computerized microtomography (CMT) provides non-invasive observation of, for instance, changing fluid phase content and distribution at the pore scale. Interfacial areas have thus far been measured indirectly, but with advances in high-resolution imaging using CMT it is possible to track interfacial area and curvature as a function of phase saturation or capillary pressure. We present results obtained at the synchrotron-based microtomography facility (GSECARS, sector 13) at the Advanced Photon Source at Argonne National Laboratory.
Cylindrical sand samples of either 6 or 1.5 mm diameter were scanned at different stages of drainage and for varying boundary conditions. A significant difference in fluid saturation and phase distribution was observed for different drainage conditions, clearly showing preferential flow and a dependence on the applied flow rate. For the 1.5 mm sample individual pores and water/air interfaces could be resolved and quantified using image analysis techniques. Use of the Advanced Photon Source was supported by the U.S. Department of Energy, Basic Energy Sciences, Office of Science, under Contract No. W-31-109-Eng-38.
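Quantifying phase saturation and interfacial extent from a segmented tomography volume can be sketched with simple voxel counting. The phase labels (0 = solid, 1 = water, 2 = air) and the face-counting proxy for interfacial area are assumptions for illustration, not the authors' actual image-analysis pipeline.

```python
import numpy as np

def saturation(vol, fluid=1, pore_labels=(1, 2)):
    """Fraction of pore voxels occupied by the given fluid phase."""
    pore_count = np.count_nonzero(np.isin(vol, pore_labels))
    return np.count_nonzero(vol == fluid) / pore_count

def interface_voxel_faces(vol, a=1, b=2):
    """Count voxel faces shared between phases a and b along each axis --
    a crude, voxelized proxy for interfacial area."""
    faces = 0
    for axis in range(vol.ndim):
        lo = np.moveaxis(vol, axis, 0)[:-1]
        hi = np.moveaxis(vol, axis, 0)[1:]
        faces += np.count_nonzero((lo == a) & (hi == b))
        faces += np.count_nonzero((lo == b) & (hi == a))
    return faces

# Tiny synthetic volume: solid frame around a pore, half water, half air
vol = np.zeros((4, 4, 4), dtype=int)
vol[1:3, 1:3, 1:3] = 1    # water-filled pore (8 voxels)
vol[1:3, 1:3, 2] = 2      # air invades half of it
print(saturation(vol), interface_voxel_faces(vol))
```

Real analyses refine the face count with marching-cubes surface extraction to reduce voxelization bias, but the bookkeeping (phase fractions per pore region, interface per saturation step) is the same.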
Plasma-based water purification: Challenges and prospects for the future
NASA Astrophysics Data System (ADS)
Foster, John E.
2017-05-01
Freshwater scarcity, driven by seasonal weather variations, climate change, and over-development, has led to serious consideration of water reuse. Water reuse involves the direct processing of wastewater for either indirect or direct potable reuse. In either case, advanced water treatment technologies will be required to process the water to the point that it can be reused in a meaningful way. Additionally, there is growing concern regarding micropollutants, such as pharmaceuticals and personal care products, which have been detected in finished drinking water and are not removed by conventional means. The health impact of these contaminants at low concentration is not well understood. Pending regulatory action, the removal of these contaminants by water treatment plants will also require advanced technology. One new and emerging technology that could potentially address the removal of micropollutants in both finished drinking water and wastewater slated for reuse is plasma-based water purification. Plasma in contact with liquid water generates a host of reactive species that attack and ultimately mineralize contaminants in solution. This interaction takes place in the boundary layer, or interaction zone, centered at the plasma-liquid water interface. The physical processes taking place at the interface are poorly understood, yet an understanding of them is key to the optimization of plasma-based water purifiers. High electric field conditions, large density gradients, plasma-driven chemistries, and fluid dynamic effects prevail in this multiphase region. The region is also the source function for the longer-lived reactive species that ultimately treat the water. Here, we review the need for advanced water treatment methods and, in the process, make the case for plasma-based methods.
Additionally, we survey the basic methods of interacting plasma with liquid water (including a discussion of breakdown processes in water), the current state of understanding of the physical processes taking place at the plasma-liquid interface, and the role these processes play in water purification. The development of plasma diagnostics usable in this multiphase environment along with modeling efforts aimed at elucidating physical processes taking place at the interface are also detailed. Key experiments that demonstrate the capability of plasma-based water treatment are also reviewed. The technical challenges to the implementation of plasma-based water reactors are also discussed. We conclude with a discussion of prospects for the future of plasma-based water purification.
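Treatment performance in such reactors is often summarized with lumped pseudo-first-order kinetics, C(t) = C0 * exp(-k t), where the rate constant k folds together the radical chemistry at the plasma-liquid interface. A minimal sketch with an assumed, not measured, rate constant:

```python
import math

def contaminant_remaining(c0, k, t):
    """Pseudo-first-order decay C(t) = c0 * exp(-k * t): a common lumped
    model for radical-driven degradation (k here is an assumed value)."""
    return c0 * math.exp(-k * t)

c0 = 100.0   # initial micropollutant concentration, ug/L (hypothetical)
k = 0.12     # effective rate constant, 1/min (hypothetical)
for t in (0, 5, 10, 20):
    print(t, round(contaminant_remaining(c0, k, t), 2))
```

Fitting k across operating conditions (power, gas flow, interfacial area) is one practical way such reactor studies compare plasma-liquid contacting schemes.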
Trujillo, Caleb M; Anderson, Trevor R; Pelaez, Nancy J
2016-06-01
In biology and physiology courses, students face many difficulties when learning to explain mechanisms, a topic that is demanding due to the immense complexity and abstract nature of molecular and cellular mechanisms. To overcome these difficulties, we asked the following question: how does an instructor transform their understanding of biological mechanisms and other difficult-to-learn topics so that students can comprehend them? To address this question, we first reviewed a model of the components used by biologists to explain molecular and cellular mechanisms: the MACH model, with the components of methods (M), analogies (A), context (C), and how (H). Next, instructional materials were developed and the teaching activities were piloted with a physical MACH model. Students who used the MACH model to guide their explanations of mechanisms exhibited both improvements and some new difficulties. Third, a series of design-based research cycles was applied to bring the activities with an improved physical MACH model into biology and biochemistry courses. Finally, a useful rubric was developed to address prevalent student difficulties. Here, we present, for physiology and biology instructors, the knowledge and resources for explaining molecular and cellular mechanisms in undergraduate courses with an instructional design process aimed at realizing pedagogical content knowledge for teaching. Our four-stage process could be adapted to advance instruction with a range of models in the life sciences. Copyright © 2016 The American Physiological Society.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snelson, C. M., Chipman, V. D., White, R. L., Emmitt, R. F., Townsend, M. J., Barker, D., Lee, P.
Understanding the changes in seismic energy as it travels from the near field to the far field is the ultimate goal in monitoring for explosive events of interest. This requires a clear understanding of explosion phenomenology as it relates to seismic, infrasound, and acoustic signals. Although there has been much progress in modeling these phenomena, it has been primarily based in the empirical realm. As a result, the logical next step in advancing the seismic monitoring capability of the United States is to conduct field tests that can expand the predictive capability of the physics-based modeling currently under development. The Source Physics Experiment at the Nevada National Security Site (SPE-N) is the first step in this endeavor to link empirically based with physics-based modeling. This is a collaborative project between National Security Technologies (NSTec), Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), Sandia National Laboratories (SNL), the Defense Threat Reduction Agency (DTRA), and the Air Force Technical Applications Center (AFTAC). The test series requires both simple and complex cases to fully characterize the problem: to understand the transition of seismic energy from the near field to the far field, to understand the development of S-waves from explosive sources, and to understand how anisotropy controls seismic energy transmission and partitioning. The current series is being conducted in a granite body called the Climax Stock. This location was chosen for several reasons, including the fairly homogeneous granite, the location of previous nuclear tests in the same rock body, and the generally well-characterized geology. The simple-geology series is planned for seven shots using conventional explosives in the same shot hole, instrumented with Continuous Reflectometry for Radius vs.
Time Experiment (CORRTEX), Time of Arrival (TOA), Velocity of Detonation (VOD), down-hole accelerometers, surface accelerometers, infrasound, and a suite of seismic sensors of various frequency bands from the near field to the far field. This allows the use of a single test bed in the simple-geology case instead of multiple test beds to obtain the same results. The shots are planned at various depths to obtain a Green's function, scaled-depth-of-burial data, nominal-depth-of-burial data, and damage-zone data. SPE1-N was conducted in May 2011 as a 220 lb (100 kg) TNT-equivalent calibration shot at a depth of 180 ft (55 m). SPE2-N was conducted in October 2011 as a 2200 lb (1000 kg) TNT-equivalent calibration shot at a depth of 150 ft (46 m). SPE3-N was conducted in July 2012 as a 2200 lb (1000 kg) TNT-equivalent calibration shot at a depth of 150 ft (46 m) in the damaged zone. Over 400 data channels were recorded for each of these shots, and data recovery was about 95% with a high signal-to-noise ratio. Once the simple-geology site data have been utilized, a new test bed will be developed at a complex-geology site to test these physics-based models. Ultimately, the results from this project will provide the next advances in the science of monitoring and enable a physics-based predictive capability.
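The scaled depth of burial mentioned above is conventionally the emplacement depth divided by the cube root of the yield (cube-root scaling), SDOB = d / W^(1/3). Applying this convention to the shot parameters quoted in the abstract:

```python
def scaled_depth_of_burial(depth_m, yield_kg):
    """SDOB = depth / W**(1/3), in m / kg**(1/3) (cube-root yield scaling)."""
    return depth_m / yield_kg ** (1 / 3)

# Shot parameters as quoted above (TNT-equivalent yields)
shots = {"SPE1-N": (55.0, 100.0),
         "SPE2-N": (46.0, 1000.0),
         "SPE3-N": (46.0, 1000.0)}
for name, (d, w) in shots.items():
    print(name, round(scaled_depth_of_burial(d, w), 2))
```

For the quoted values this gives roughly 11.85 m/kg^(1/3) for SPE1-N and 4.6 m/kg^(1/3) for SPE2-N and SPE3-N, illustrating how the series steps toward shallower scaled depths at larger yield.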
ERIC Educational Resources Information Center
Szott, Aaron
2014-01-01
Traditional physics labs at the high school level are often closed-ended. The outcomes are known in advance and students replicate procedures recommended by the teacher. Over the years, I have come to appreciate the great opportunities created by allowing students investigative freedom in physics laboratories. I have realized that a laboratory…
Developing a predictive model for the chemical composition of soot nanoparticles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Violi, Angela; Michelsen, Hope; Hansen, Nils
In order to provide the scientific foundation to enable technology breakthroughs in transportation fuels, it is important to develop a combustion modeling capability to optimize the operation and design of evolving fuels in advanced engines for transportation applications. The goal of this proposal is to develop a validated predictive model to describe the chemical composition of soot nanoparticles in premixed and diffusion flames. Atomistic studies in conjunction with state-of-the-art experiments are the distinguishing characteristics of this unique interdisciplinary effort. The modeling effort has been conducted at the University of Michigan by Prof. A. Violi. The experimental work has entailed a series of studies using different techniques to analyze gas-phase soot precursor chemistry and soot particle production in premixed and diffusion flames. Measurements have provided spatial distributions of polycyclic aromatic hydrocarbons and other gas-phase species, and the size and composition of incipient soot nanoparticles, for comparison with model results. The experimental team includes Dr. N. Hansen and Dr. H. Michelsen at Sandia National Labs' Combustion Research Facility, and Dr. K. Wilson as a collaborator at Lawrence Berkeley National Lab's Advanced Light Source. Our results show that the chemical and physical properties of nanoparticles affect the coagulation behavior in soot formation, and an experimentally validated, predictive model for the chemical composition of soot nanoparticles will not only enhance our understanding of soot formation but will also allow the prediction of particle size distributions under combustion conditions. These results provide a novel description of soot formation based on the physical and chemical properties of the particles for use in the next generation of soot models and an enhanced capability for facilitating the design of alternative fuels and the engines they will power.
Soil Erosion as a stochastic process
NASA Astrophysics Data System (ADS)
Casper, Markus C.
2015-04-01
The main tools for estimating the risk and amount of erosion are different types of soil erosion models: on the one hand, there are empirically based model concepts; on the other hand, there are more physically based or process-based models. However, both types of models have substantial weak points. All empirical model concepts are only capable of providing rough estimates over larger temporal and spatial scales; they do not account for many driving factors that are in the scope of scenario-related analysis. In addition, the physically based models contain important empirical parts, so their claimed universality and transferability is not achieved. As a common feature, we find that all models rely on parameters and input variables which are, to a certain extent, spatially and temporally averaged. A central question is whether the apparent heterogeneity of soil properties or the random nature of driving forces needs to be better considered in our modelling concepts. Traditionally, researchers have attempted to remove spatial and temporal variability through homogenization. However, homogenization has been achieved through physical manipulation of the system, or by statistical averaging procedures. The price for obtaining these homogenized (averaged) model concepts of soils and soil-related processes has often been a failure to recognize the profound importance of heterogeneity in many of the properties and processes that we study. Soil infiltrability and erosion resistance (also called "critical shear stress" or "critical stream power") are the most important empirical factors of physically based erosion models. The erosion resistance is theoretically a substrate-specific parameter, but in practice the threshold where soil erosion begins is determined experimentally. The soil infiltrability is often calculated with empirical relationships (e.g. based on grain size distribution). 
Consequently, to better fit reality, this value needs to be corrected experimentally. To overcome this disadvantage of our current models, soil erosion models are needed that can directly use stochastic variables and parameter distributions. There are only a few minor approaches in this direction. The most advanced is the model "STOSEM" proposed by Sidorchuk in 2005. In this model, only a small part of the soil erosion process is described: aggregate detachment and aggregate transport by flowing water. The concept is highly simplified; for example, many parameters are temporally invariant. Nevertheless, the main problem is that our existing measurements and experiments are not geared to provide stochastic parameters (e.g. as probability density functions); in the best case they deliver a statistical validation of the mean values. Again, we get effective parameters, spatially and temporally averaged. There is an urgent need for laboratory and field experiments on overland flow structure, raindrop effects, and erosion rate that deliver information on the spatial and temporal structure of soil and surface properties and processes.
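To make the argument concrete, a minimal Python sketch of the stochastic-parameter idea: an excess-threshold detachment law evaluated over a lognormal distribution of the critical threshold, versus the conventional averaged ("effective") value. All distributions, coefficients, and units here are hypothetical illustrations, not measured soil data.

```python
import random
import statistics

def detachment_rate(stream_power, critical_power, k=0.05):
    """Excess-threshold detachment law; k is a hypothetical coefficient
    and all units are purely illustrative."""
    return k * max(stream_power - critical_power, 0.0)

def compare_treatments(stream_power=4.0, n=20000, seed=7):
    """Contrast (a) propagating a lognormal distribution of the critical
    threshold through the detachment law with (b) plugging in the
    averaged 'effective' threshold, as conventional models do."""
    rng = random.Random(seed)
    thresholds = [rng.lognormvariate(1.0, 0.4) for _ in range(n)]
    stochastic = statistics.fmean(
        detachment_rate(stream_power, t) for t in thresholds)
    averaged = detachment_rate(stream_power, statistics.fmean(thresholds))
    return stochastic, averaged
```

Because the detachment law is nonlinear (thresholded), the two treatments disagree: the averaged threshold underestimates the mean erosion rate, which is exactly the argument for stochastic parameters made above.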
NASA Technical Reports Server (NTRS)
Gayda, John
2003-01-01
As part of NASA's Advanced Subsonic Technology Program, a study of stabilization heat treatment options for an advanced nickel-base disk alloy, ME 209, was performed. Using a simple, physically based approach, the effect of stabilization heat treatments on tensile and creep properties was analyzed in this paper. Solution temperature, solution cooling rate, and stabilization temperature/time were found to have a significant impact on tensile and creep properties. These effects were readily quantified using the following methodology. First, the effect of solution cooling rate was assessed to determine its impact on a given property. The as-cooled property was then modified by two multiplicative factors which capture the impact of solution temperature and stabilization parameters. Comparison of experimental data with predicted values showed this physically based analysis produced good results that rivaled the statistical analysis employed, which required numerous changes in the form of the regression equation depending on the property and temperature in question. As this physically based analysis uses the data for input, predictions which attempt to extrapolate beyond the bounds of the data must be viewed with skepticism. Future work aimed at expanding the range of the stabilization/aging parameters explored in this study would be highly desirable, especially at the higher solution cooling rates.
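The multiplicative-factor methodology described above can be sketched as follows. Every functional form and constant below is a placeholder chosen for illustration, not fitted ME 209 data.

```python
import math

def as_cooled_strength(cooling_rate):
    """Hypothetical baseline: property rises with log cooling rate (MPa)."""
    return 800.0 + 60.0 * math.log10(cooling_rate)

def solution_factor(t_solution, t_ref=1150.0, slope=-4e-4):
    """Multiplicative correction for solution temperature (assumed linear;
    equals 1.0 at the reference temperature)."""
    return 1.0 + slope * (t_solution - t_ref)

def stabilization_factor(t_stab_c, hours, k=-1.5e-5):
    """Multiplicative correction via a Larson-Miller-style time-temperature
    parameter; constants are placeholders."""
    p = (t_stab_c + 273.15) * (20.0 + math.log10(hours))
    return 1.0 + k * (p - 20000.0)

def predicted_property(cooling_rate, t_solution, t_stab_c, hours):
    """As-cooled property modified by the two multiplicative factors."""
    return (as_cooled_strength(cooling_rate)
            * solution_factor(t_solution)
            * stabilization_factor(t_stab_c, hours))
```

The appeal of this structure is that each factor can be fit independently from a subset of the data, in contrast to a single regression whose form must change per property and temperature.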
The Source Physics Experiments (SPE) at the Nevada National Security Site (NNSS): An Overview
NASA Astrophysics Data System (ADS)
Snelson, C. M.; Chipman, V.; White, R. L.; Emmitt, R.; Townsend, M.; Barker, D.; Lee, P.
2012-12-01
Understanding the changes in seismic energy as it travels from the near field to the far field is the ultimate goal in monitoring for explosive events of interest. This requires a clear understanding of explosion phenomenology as it relates to seismic, infrasound, and acoustic signals. Although there has been much progress in modeling these phenomena, it has been primarily empirical. As a result, the logical next step in advancing the seismic monitoring capability of the United States is to conduct field tests that can expand the predictive capability of the physics-based modeling currently under development. The Source Physics Experiment (SPE) at the Nevada National Security Site is the first step in this endeavor to link empirically based and physics-based modeling. This is a collaborative project between National Security Technologies (NSTec), Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), Sandia National Laboratories (SNL), the Defense Threat Reduction Agency (DTRA), and the Air Force Technical Applications Center (AFTAC). The test series requires both simple and complex cases to fully characterize the problem, which is to understand the transition of seismic energy from the near field to the far field, the development of S-waves from explosive sources, and how anisotropy controls seismic energy transmission and partitioning. The current series is being conducted in a granite body called the Climax Stock. This location was chosen for several reasons: the fairly homogeneous granite, the location of previous nuclear tests in the same rock body, and generally well-characterized geology. The simple-geology series is planned for 7 shots using conventional explosives in the same shot hole, surrounded by Continuous Reflectometry for Radius vs. 
Time Experiment (CORRTEX), Time of Arrival (TOA), Velocity of Detonation (VOD), down-hole accelerometers, surface accelerometers, infrasound sensors, and a suite of seismic sensors of various frequency bands from the near field to the far field. This allows a single test bed to be used in the simple-geology case instead of multiple test beds to obtain the same results. The shots are planned at various depths to obtain a Green's function, scaled-depth-of-burial data, nominal-depth-of-burial data, and damage zone data. SPE1 was conducted in May 2011 as a 220 lb (100 kg) TNT-equivalent calibration shot at a depth of 180 ft (55 m). SPE2 was conducted in October 2011 as a 2200 lb (1000 kg) TNT-equivalent calibration shot at a depth of 150 ft (46 m). SPE3 was conducted in July 2012 as a 2200 lb (1000 kg) TNT-equivalent calibration shot at a depth of 150 ft (46 m) in the damaged zone. Over 400 data channels were recorded for each of these shots, and data recovery was about 95% with high signal-to-noise ratio. Once the simple-geology site data have been utilized, a new test bed will be developed in a complex-geology site to test these physics-based models. Ultimately, the results from this project will provide the next advances in the science of monitoring to enable a physics-based predictive capability. This work was done by National Security Technologies, LLC, under Contract No. DE-AC52-06NA25946 with the U.S. Department of Energy. DOE/NV/25946--1584
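For reference, the scaled depths of burial implied by the shot parameters above can be computed with the standard cube-root yield scaling; the scaling convention is assumed here, while depths and TNT-equivalent yields are taken from the abstract.

```python
def scaled_depth_of_burial(depth_m, yield_kg):
    """Cube-root yield scaling commonly used for buried explosions:
    SDOB = d / W**(1/3), in m/kg^(1/3)."""
    return depth_m / yield_kg ** (1.0 / 3.0)

# (depth in m, TNT-equivalent yield in kg) from the abstract
shots = {
    "SPE1": (55.0, 100.0),
    "SPE2": (46.0, 1000.0),
    "SPE3": (46.0, 1000.0),
}
sdob = {name: scaled_depth_of_burial(d, w) for name, (d, w) in shots.items()}
```

Under this convention SPE1 is a much more deeply over-buried shot (about 11.8 m/kg^(1/3)) than SPE2 and SPE3 (about 4.6 m/kg^(1/3)).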
NASA Technical Reports Server (NTRS)
Castelli, Michael G.
1990-01-01
A number of viscoplastic constitutive models were developed to describe deformation behavior under complex combinations of thermal and mechanical loading. Questions remain, however, regarding the validity of procedures used to characterize these models for specific structural alloys. One area of concern is that the majority of experimental data available for this purpose are determined under isothermal conditions. This experimental study is aimed at determining whether viscoplastic constitutive theories characterized using an isothermal data base can adequately model material response under the complex thermomechanical loading conditions typical of power generation service. The approach adopted was to conduct a series of carefully controlled thermomechanical experiments on a nickel-based superalloy, Hastelloy Alloy X. Previous investigations had shown that this material experiences metallurgical instabilities leading to complex hardening behavior, termed dynamic strain aging. Investigating this phenomenon under full thermomechanical conditions leads to a number of challenging experimental difficulties which up to the present work were unresolved. To correct this situation, a number of advances were made in thermomechanical testing techniques. Advanced methods for dynamic temperature gradient control, phasing control and thermal strain compensation were developed and incorporated into real time test control software. These advances allowed the thermomechanical data to be analyzed with minimal experimental uncertainty. The thermomechanical results were evaluated on both a phenomenological and microstructural basis. Phenomenological results revealed that the thermomechanical hardening trends were not bounded by those displayed under isothermal conditions. 
For the case of Hastelloy Alloy X (and similar dynamic strain aging materials), this strongly suggests that some form of thermomechanical testing is necessary when characterizing a thermoviscoplastic deformation model. Transmission electron microscopy was used to study the microstructural physics, and analyze the unique phenomenological behavior.
Layer 1 VPN services in distributed next-generation SONET/SDH networks with inverse multiplexing
NASA Astrophysics Data System (ADS)
Ghani, N.; Muthalaly, M. V.; Benhaddou, D.; Alanqar, W.
2006-05-01
Advances in next-generation SONET/SDH along with GMPLS control architectures have enabled many new service provisioning capabilities. In particular, a key services paradigm is the emergent Layer 1 virtual private network (L1 VPN) framework, which allows multiple clients to utilize a common physical infrastructure and provision their own 'virtualized' circuit-switched networks. This precludes expensive infrastructure builds and increases resource utilization for carriers. Along these lines, a novel L1 VPN services resource management scheme for next-generation SONET/SDH networks is proposed that fully leverages advanced virtual concatenation and inverse multiplexing features. Additionally, both centralized and distributed GMPLS-based implementations are described to support the proposed L1 VPN services model. Detailed performance analysis results are presented along with avenues for future research.
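A simplified sketch of the inverse-multiplexing idea behind such a scheme: a client demand is split into virtually concatenated members (VC-4-sized here, by assumption) and spread greedily across parallel links. The greedy policy and member granularity are illustrative choices, not the resource management algorithm actually proposed in the paper.

```python
import math

VC4_MBPS = 150  # approximate payload of one VC-4 member circuit (assumed)

def vcat_members(demand_mbps):
    """Number of concatenated members needed to carry a demand."""
    return math.ceil(demand_mbps / VC4_MBPS)

def inverse_multiplex(demand_mbps, link_free_members):
    """Greedy spread of VCAT members across parallel links, mimicking
    inverse multiplexing over a shared L1 VPN substrate. Returns a
    per-link member allocation, or None if capacity is insufficient."""
    need = vcat_members(demand_mbps)
    alloc = {}
    for link, free in sorted(link_free_members.items(), key=lambda kv: -kv[1]):
        take = min(free, need)
        if take:
            alloc[link] = take
            need -= take
        if need == 0:
            return alloc
    return None
```

For example, a 400 Mb/s demand needs three VC-4 members and can be split 2 + 1 across two links that each have only two members free, which is exactly the gain over contiguous concatenation.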
NASA Astrophysics Data System (ADS)
Moldwin, M.; Morrow, C. A.; White, S. C.; Ivie, R.
2014-12-01
Members of the Education & Workforce Working Group and the American Institute of Physics (AIP) conducted the first-ever national demographic survey of working professionals for the 2012 National Academy of Sciences Solar and Space Physics Decadal Survey to learn about the demographics of this sub-field of space science. The instrument asked participants about the type of workplace; basic demographic information regarding gender and minority status; educational pathways (discipline of undergraduate degree, field of PhD); how their undergraduate and graduate student researchers are funded; participation in NSF- and NASA-funded spaceflight missions and suborbital programs; and barriers to career advancement. Using contact databases from AGU, the American Astronomical Society's Solar Physics Division (AAS-SPD), attendees of NOAA's Space Weather Week, and proposal submissions to NSF's Atmospheric and Geospace Sciences Division, the AIP's Statistical Research Center cross-correlated and culled these databases, resulting in 2776 unique email addresses of US-based working professionals. The survey received 1305 responses (51%) and generated 125 pages of single-spaced answers to a number of open-ended questions. This talk will summarize the highlights of this first-ever demographic survey, including findings extracted from the open-ended responses regarding barriers to career advancement, which showed significant gender differences.
Images of Inherited War: Three American Presidents in Vietnam
2011-06-01
Dependent Realism to demonstrate how theoretical advances in modern physical science correlate to cognitive theories in International Relations. We...Quantum Physics and Model-Dependent Realism In his book, The Grand Design, theoretical physicist and cosmologist Stephen Hawking draws on theoretical...exhibited wave-like properties and that existing scientific laws could not account for their behavior. Newtonian physics was “built on a framework
AGR-1 Thermocouple Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeff Einerson
2012-05-01
This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, the AGR-1 test configuration and test procedure, an overview of AGR-1 measured data, and an overview of physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of the statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within the NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties arising from the combined analysis is identified. 
This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in the presentation of AGR-1 measured data (Chapter 2) and the interpretation of simulation results (Chapter 3). The statistics-based, simulation-aided experimental control procedure for future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters, accounting for possible changes in both physical and thermal conditions and in instrument performance.
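The regression-based control idea in Chapter 4 can be illustrated with a minimal sketch: fit a linear relationship between a thermocouple reading and the simulated fuel peak temperature, then invert it to find the thermocouple setpoint that targets a desired fuel temperature. The data and the single-predictor form are invented for illustration; the actual procedure uses regression on multiple parameters.

```python
from statistics import fmean

# Synthetic (thermocouple reading, simulated fuel peak temperature) pairs
# in deg C; illustrative values, not AGR-1 data.
tc = [950.0, 975.0, 1000.0, 1025.0, 1050.0]
fuel = [1105.0, 1133.0, 1162.0, 1190.0, 1219.0]

# Ordinary least-squares fit: fuel = slope * tc + intercept
mx, my = fmean(tc), fmean(fuel)
slope = (sum((x - mx) * (y - my) for x, y in zip(tc, fuel))
         / sum((x - mx) ** 2 for x in tc))
intercept = my - slope * mx

def tc_setpoint(target_fuel_temp):
    """Thermocouple reading to hold (e.g. by adjusting the capsule gas
    mixture) so the regression predicts the target fuel temperature."""
    return (target_fuel_temp - intercept) / slope
```

Because the fuel peak is not directly instrumented, the inverted regression serves as the control proxy: operators drive the measurable thermocouple value toward the setpoint the fit implies.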
Strategic Planning for Drought Mitigation Under Climate Change
NASA Astrophysics Data System (ADS)
Cai, X.; Zeng, R.; Valocchi, A. J.; Song, J.
2012-12-01
Droughts continue to be a major natural hazard, and mounting evidence of global warming confronts society with a pressing question: Will climate change aggravate the risk of drought at the local scale? It is important to explore what additional risk will be imposed by climate change and what level of strategic measures should be undertaken now to avoid vulnerable situations in the future, given that tactical measures may not avoid large damage. This study addresses the following key questions on strategic planning for drought mitigation under climate change: What combination of strategic and tactical measures will move the societal system response from a vulnerable situation to a resilient one with minimum cost? Are current infrastructures and their operation enough to mitigate the damage of future drought, or do we need in-advance infrastructure expansion for future drought preparedness? To address these questions, this study presents a decision support framework based on a coupled simulation and optimization model. A quasi-physically based watershed model is established for the Frenchman Creek Basin (FCB), part of the Republican River Basin, where groundwater-based irrigation plays a significant role in agricultural production and the local hydrological cycle. The physical model is used to train a statistical surrogate model, which predicts the watershed responses under future climate conditions. The statistical model replaces the complex physical model in the simulation-optimization framework, which makes the models computationally tractable. Decisions for drought preparedness include traditional short-term tactical measures (e.g. facility operation) and long-term or in-advance strategic measures, which require capital investment. A scenario-based three-stage stochastic optimization model assesses the roles of strategic measures and tactical measures in drought preparedness and mitigation. 
Two benchmark climate prediction horizons, the 2040s and 2090s, represent mid-term and long-term planning, respectively, compared to the baseline climate of 1980-2000. To handle uncertainty in climate change projections, outputs from three General Circulation Models (GCMs) with Regional Climate Model (RCM) dynamic downscaling (PCM-RCM, Hadley-RCM, and CCSM-RCM) and four CO2 emission scenarios are used to represent the range of possible climatic conditions in the mid-term (2040s) and long-term (2090s) horizons. The model results show the relative roles of mid- and long-term investments and the complementary relationships between wait-and-see decisions and here-and-now decisions on infrastructure expansion. Even the best tactical measures (irrigation operation) alone are not sufficient for drought mitigation in the future. Infrastructure expansion is critical, especially for environmental conservation purposes. With an increasing budget, investment should be shifted from tactical measures to strategic measures for drought preparedness. Infrastructure expansion is preferred in the long-term plan over the mid-term plan; i.e., larger investment is proposed for the 2040s than at present because drought is more likely in the 2090s than in the 2040s. Thus, larger BMP expansion is proposed in the 2040s for drought preparedness in the 2090s.
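The here-and-now versus wait-and-see structure of the stochastic optimization can be illustrated with a toy two-stage version (the study uses three stages); all scenario probabilities and unit costs below are invented.

```python
# Drought scenarios: (probability, water shortfall without any action);
# illustrative numbers, not FCB results.
scenarios = [(0.5, 0.0), (0.3, 40.0), (0.2, 100.0)]

EXPANSION_COST = 0.8   # per unit of strategic (here-and-now) capacity
TACTICAL_COST = 2.0    # per unit of shortfall covered by recourse
DAMAGE_COST = 5.0      # per unit of shortfall left unmitigated

def expected_cost(expand):
    """Two-stage logic: capacity expansion is committed before the
    scenario is revealed; the tactical (wait-and-see) response is
    chosen afterwards, per scenario, as the cheaper of covering or
    absorbing the residual shortfall."""
    cost = EXPANSION_COST * expand
    for p, shortfall in scenarios:
        residual = max(shortfall - expand, 0.0)
        cost += p * min(TACTICAL_COST, DAMAGE_COST) * residual
    return cost

# Enumerate candidate expansion levels and pick the cheapest in expectation.
best_expansion = min(range(0, 101, 5), key=expected_cost)
```

With these numbers the optimum commits capacity up front to cover the moderate drought scenario and leaves only the extreme scenario to tactical recourse, mirroring the paper's finding that tactical measures alone are insufficient.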
“Elegant Tool” Delivers Genome-Level Science for Electrolytes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keith Arterburn
Now, a ‘disruptive, virtual scientific simulation tool’ delivers a new, genome-level investigation for electrolytes to develop better, more efficient batteries. Dr. Kevin Gering, an Idaho National Laboratory researcher, has developed the Advanced Electrolyte Model (AEM), a copyrighted molecular-based simulation tool that has been scientifically proven and validated using at least a dozen ‘real-world’ physical metrics. Nominated for the 2014 international R&D 100 Award, AEM revolutionizes electrolyte materials selection, optimizing combinations and key design elements to make battery design and experimentation quick, accurate and responsive to specific needs.
ERIC Educational Resources Information Center
de los Santos, Desirée M.; Montes, Antonio; Sánchez-Coronilla, Antonio; Navas, Javier
2014-01-01
A Project Based Learning (PBL) methodology was used in the practical laboratories of the Advanced Physical Chemistry department. The project type proposed simulates "real research" focusing on sol-gel synthesis and the application of the obtained sol as a stone consolidant. Students were divided into small groups (2 to 3 students) to…
NASA Astrophysics Data System (ADS)
Haghnegahdar, Amin; Elshamy, Mohamed; Yassin, Fuad; Razavi, Saman; Wheater, Howard; Pietroniro, Al
2017-04-01
Complex physically based environmental models are being increasingly used as the primary tool for watershed planning and management due to advances in computational power and data acquisition. Model sensitivity analysis plays a crucial role in understanding the behavior of these complex models and improving their performance. Due to the non-linearity and interactions within these complex models, global sensitivity analysis (GSA) techniques should be adopted to provide a comprehensive understanding of model behavior and identify its dominant controls. In this study we adopt a multi-basin, multi-criteria GSA approach to systematically assess the behavior of the Modélisation Environmentale-Surface et Hydrologie (MESH) modelling system across various hydroclimatic conditions in Canada, including areas in the Great Lakes Basin, Mackenzie River Basin, and South Saskatchewan River Basin. MESH is a semi-distributed, physically based, coupled land surface-hydrology modelling system developed by Environment and Climate Change Canada (ECCC) for various water resources management purposes in Canada. We use a novel method, called Variogram Analysis of Response Surfaces (VARS), to perform sensitivity analysis. VARS is a variogram-based GSA technique that can efficiently provide a spectrum of sensitivity information across a range of scales within the parameter space. We use multiple metrics to identify dominant controls of model response (e.g. streamflow) to model parameters under various conditions such as high flows, low flows, and flow volume. We also investigate the influence of initial conditions on model behavior as part of this study. Our preliminary results suggest that this type of GSA can significantly help with estimating model parameters, decreasing the computational burden of calibration, and reducing prediction uncertainty.
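A minimal sketch of the variogram quantity that VARS builds on, using a toy response surface in place of MESH. This simplification uses uniform base points and a single direction per parameter; the actual VARS algorithm uses star-based sampling of the full parameter space.

```python
import math

def model(x1, x2):
    """Toy response surface standing in for a MESH streamflow metric:
    strongly sensitive to x1, weakly (linearly) sensitive to x2."""
    return math.sin(3.0 * x1) + 0.1 * x2

def directional_variogram(f, dim, h, n=200, other=0.5):
    """Half the mean squared increment, 0.5*E[(f(x+h) - f(x))^2], along
    one parameter axis in [0, 1] — the quantity whose dependence on the
    lag h gives VARS its multi-scale sensitivity spectrum."""
    total = 0.0
    for i in range(n):
        x = (i + 0.5) / n * (1.0 - h)   # keep x and x+h inside [0, 1]
        if dim == 0:
            d = f(x + h, other) - f(x, other)
        else:
            d = f(other, x + h) - f(other, x)
        total += d * d
    return 0.5 * total / n
```

Evaluating the variogram at several lags h for each parameter ranks their influence across scales; here the oscillatory x1 direction dominates the weak linear x2 direction at moderate lags.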
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Y. Q.; Shemon, E. R.; Mahadevan, Vijay S.
SHARP, developed under the NEAMS Reactor Product Line, is an advanced modeling and simulation toolkit for the analysis of advanced nuclear reactors. SHARP currently comprises three physics modules: neutronics, thermal hydraulics, and structural mechanics. SHARP empowers designers to produce accurate results for modeling physical phenomena that have been identified as important for nuclear reactor analysis. SHARP can use existing physics codes and take advantage of existing infrastructure capabilities in the MOAB framework and the coupling driver/solver library, the Coupled Physics Environment (CouPE), which utilizes the widely used, scalable PETSc library. This report aims at demonstrating the coupled-physics simulation capability of SHARP by introducing the demonstration example called sahex in advance of the SHARP release expected by March 2016. sahex consists of 6 fuel pins with cladding, 1 control rod, sodium coolant, and an outer duct wall that encloses all the other components. This example is carefully chosen to demonstrate the proof of concept before solving more complex demonstration examples such as an EBR-II assembly and the ABTR full core. The workflow of preparing the input files, running the case, and analyzing the results is demonstrated in this report. Moreover, an extension of the sahex model called sahex_core, which adds six homogenized neighboring assemblies to the fully heterogeneous sahex model, is presented to test homogenization capabilities in both Nek5000 and PROTEUS. Some basic information on the configuration and build aspects of the SHARP toolkit, which includes the capability to auto-download dependencies and configure/install with optimal flags in an architecture-aware fashion, is also covered by this report. Step-by-step instructions are provided to help users create their own cases. Details on these processes will be provided in the SHARP user manual that will accompany the first release.
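The kind of operator-split, fixed-point coupling that a driver such as CouPE orchestrates between physics modules can be sketched with toy neutronics and thermal operators. Both models and their feedback coefficients are invented stand-ins; SHARP's actual modules solve full PDE discretizations.

```python
def neutronics_power(fuel_temp):
    """Toy Doppler-like feedback: power drops as fuel temperature rises."""
    return 100.0 / (1.0 + 0.002 * (fuel_temp - 600.0))

def thermal_fuel_temp(power):
    """Toy conduction model: fuel temperature rises linearly with power."""
    return 600.0 + 4.0 * power

def picard_couple(tol=1e-8, max_iter=100):
    """Picard (fixed-point) iteration between the two 'modules' until the
    exchanged fields stop changing, i.e. a self-consistent coupled state."""
    temp = 600.0
    for _ in range(max_iter):
        power = neutronics_power(temp)     # neutronics solve given temperature
        new_temp = thermal_fuel_temp(power)  # thermal solve given power
        if abs(new_temp - temp) < tol:
            return power, new_temp, True
        temp = new_temp
    return power, temp, False
```

With this feedback strength the iteration contracts and converges in a handful of passes; stronger feedback is where Newton-based coupling (as PETSc enables) pays off over plain Picard.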
Vaquerizo, Beatriz; Theriault-Lauzier, Pascal; Piazza, Nicolo
2015-12-01
Mitral regurgitation is the most prevalent valvular heart disease worldwide. Despite the widespread availability of curative surgical intervention, a considerable proportion of patients with severe mitral regurgitation are not referred for treatment, largely due to the presence of left ventricular dysfunction, advanced age, and comorbid illnesses. Transcatheter mitral valve replacement is a promising therapeutic alternative to traditional surgical valve replacement. The complex anatomical and pathophysiological nature of the mitral valvular complex, however, presents significant challenges to the successful design and implementation of novel transcatheter mitral replacement devices. Patient-specific 3-dimensional computer-based models enable accurate assessment of the mitral valve anatomy and preprocedural simulations for transcatheter therapies. Such information may help refine the design features of novel transcatheter mitral devices and enhance procedural planning. Herein, we describe a novel medical image-based processing tool that facilitates accurate, noninvasive assessment of the mitral valvular complex, by creating precise three-dimensional heart models. The 3-dimensional computer reconstructions are then converted to a physical model using 3-dimensional printing technology, thereby enabling patient-specific assessment of the interaction between device and patient. It may provide new opportunities for a better understanding of the mitral anatomy-pathophysiology-device interaction, which is of critical importance for the advancement of transcatheter mitral valve replacement. Copyright © 2015 Sociedad Española de Cardiología. Published by Elsevier España, S.L.U. All rights reserved.
Computational approaches to substrate-based cell motility
Ziebert, Falko; Aranson, Igor S.
2016-07-15
Substrate-based crawling motility of eukaryotic cells is essential for many biological functions, both in developing and mature organisms. Motility dysfunctions are involved in several life-threatening pathologies such as cancer and metastasis. Motile cells are also a natural realization of active, self-propelled 'particles', a popular research topic in nonequilibrium physics. Finally, from the materials perspective, assemblies of motile cells and evolving tissues constitute a class of adaptive self-healing materials that respond to the topography, elasticity, and surface chemistry of the environment and react to external stimuli. Although a comprehensive understanding of substrate-based cell motility remains elusive, progress has been achieved recently in its modeling on the whole-cell level. Here we survey the most recent advances in computational approaches to cell movement and demonstrate how these models improve our understanding of complex self-organized systems such as living cells.
SIERRA Multimechanics Module: Aria User Manual Version 4.44
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sierra Thermal /Fluid Team
2017-04-01
Aria is a Galerkin finite element based program for solving coupled-physics problems described by systems of PDEs and is capable of solving nonlinear, implicit, transient and direct-to-steady-state problems in two and three dimensions on parallel architectures. The suite of physics currently supported by Aria includes thermal energy transport, species transport, and electrostatics, as well as generalized scalar, vector and tensor transport equations. Additionally, Aria includes support for manufacturing process flows via the incompressible Navier-Stokes equations specialized to a low Reynolds number (Re < 1) regime. Enhanced modeling support of manufacturing processes is made possible through use of either arbitrary Lagrangian-Eulerian (ALE) or level-set-based free and moving boundary tracking in conjunction with quasi-static nonlinear elastic solid mechanics for mesh control. Coupled-physics problems are solved in several ways, including fully-coupled Newton's method with analytic or numerical sensitivities, fully-coupled Newton-Krylov methods, and a loosely-coupled nonlinear iteration about subsets of the system that are solved using combinations of the aforementioned methods. Error estimation, uniform and dynamic h-adaptivity, and dynamic load balancing are some of Aria's more advanced capabilities. Aria is based upon the Sierra Framework.
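One of the solution strategies listed above, fully-coupled Newton with numerical sensitivities, can be sketched on a toy two-equation coupled system; the residuals are invented stand-ins for Aria's discretized PDEs.

```python
def residual(u):
    """Toy coupled system: a 'thermal' and a 'species' equation that
    depend on each other's unknowns."""
    t, c = u
    return [t - 1.0 - 0.5 * c,      # R_thermal(t, c) = 0
            c - 0.25 * t * t]       # R_species(t, c) = 0

def newton_coupled(u0, tol=1e-10, max_iter=50, eps=1e-7):
    """Fully-coupled Newton: both equations are linearized and updated
    together, with the Jacobian obtained by finite differences
    ('numerical sensitivities')."""
    u = list(u0)
    for _ in range(max_iter):
        r = residual(u)
        if max(abs(x) for x in r) < tol:
            return u
        # 2x2 finite-difference Jacobian
        jac = [[0.0, 0.0], [0.0, 0.0]]
        for j in range(2):
            up = list(u)
            up[j] += eps
            rp = residual(up)
            for i in range(2):
                jac[i][j] = (rp[i] - r[i]) / eps
        # solve jac * du = -r by Cramer's rule (fine for a 2x2 system)
        det = jac[0][0] * jac[1][1] - jac[0][1] * jac[1][0]
        du0 = (-r[0] * jac[1][1] + r[1] * jac[0][1]) / det
        du1 = (-r[1] * jac[0][0] + r[0] * jac[1][0]) / det
        u[0] += du0
        u[1] += du1
    return u
```

Solving both residuals simultaneously captures the cross-coupling terms that a loosely-coupled (operator-split) iteration only sees indirectly, which is why fully-coupled Newton converges quadratically near the solution.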
NASA Technical Reports Server (NTRS)
Neitzke, Kurt W.; Guerreiro, Nelson M.
2014-01-01
A design study was completed to explore the theoretical physical capacity (TPC) of the John F. Kennedy International Airport (KJFK) runway system for a north-flow configuration assuming impedance-free (to throughput) air traffic control functionality. Individual runways were modeled using an agent-based airspace simulation tool, the Airspace Concept Evaluation System (ACES), with all runways conducting both departures and arrivals on a first-come first-served (FCFS) scheduling basis. A realistic future flight schedule was expanded to 3.5 times the traffic level of a selected baseline day, September 26, 2006, to provide a steady overdemand state for KJFK runways. Rules constraining departure and arrival operations were defined to reflect physical limits beyond which safe operations could no longer be assumed. Safety buffers to account for all sources of operational variability were not included in the TPC estimate. Visual approaches were assumed for all arrivals to minimize inter-arrival spacing. Parallel runway operations were assumed to be independent based on lateral spacing distances. Resulting time intervals between successive airport operations were primarily constrained by same-runway and then by intersecting-runway spacing requirements. The resulting physical runway capacity approximates a theoretical limit that cannot be exceeded without modifying runway interaction assumptions. Comparison with current KJFK operational limits for a north-flow runway configuration indicates a substantial throughput gap of approximately 48%. This gap may be further analyzed to determine which part may be feasibly bridged through the deployment of advanced systems and procedures, and which part cannot, because it is either impossible or not cost-effective to control. Advanced systems for bridging the throughput gap may be conceptualized and simulated using this same experimental setup to estimate the level of gap closure achieved.
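The FCFS scheduling basis described above can be illustrated with a minimal sketch. This is not the ACES model: it collapses all spacing rules into a single minimum inter-operation separation, whereas real spacing depends on operation type, runway geometry, and the aircraft pair involved; the function name and inputs are hypothetical.

```python
def fcfs_schedule(demand_times, min_sep):
    """Return served times for operations requested at demand_times (seconds),
    delayed as needed to keep at least min_sep seconds between operations."""
    served = []
    t_prev = float("-inf")  # no prior operation constrains the first one
    for t in sorted(demand_times):  # first-come first-served order
        t_serve = max(t, t_prev + min_sep)
        served.append(t_serve)
        t_prev = t_serve
    return served
```

Under steady overdemand, the served rate saturates at one operation per `min_sep` seconds, which is the sense in which a theoretical physical capacity emerges from the spacing constraint alone.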
3D Printing and Digital Rock Physics for Geomaterials
NASA Astrophysics Data System (ADS)
Martinez, M. J.; Yoon, H.; Dewers, T. A.
2015-12-01
Imaging techniques for the analysis of porous structures have revolutionized our ability to quantitatively characterize geomaterials. Digital representations of rock from CT images and physics modeling based on these pore structures provide the opportunity to further advance our quantitative understanding of fluid flow, geomechanics, and geochemistry, and the emergence of coupled behaviors. Additive manufacturing, commonly known as 3D printing, has revolutionized production of custom parts with complex internal geometries. For the geosciences, recent advances in 3D printing technology may be co-opted to print reproducible porous structures derived from CT imaging of actual rocks for experimental testing. The use of 3D printed microstructure allows us to surmount typical problems associated with sample-to-sample heterogeneity that plague rock physics testing and to test material response independent from pore-structure variability. Together, imaging, digital rocks and 3D printing potentially enable a new workflow for understanding coupled geophysical processes in a real but well-defined setting, circumventing typical issues associated with reproducibility and enabling full characterization and thus connection of physical phenomena to structure. In this talk we will discuss the possibilities that these technologies can bring to geosciences and present early experiences with coupled multiscale experimental and numerical analysis using 3D printed fractured rock specimens. In particular, we discuss the processes of selection and printing of transparent fractured specimens based on 3D reconstruction of micro-fractured rock to study fluid flow characterization and manipulation. Micro-particle image velocimetry is used to directly visualize 3D single and multiphase flow velocity in 3D fracture networks. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. 
Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
Robotic Billiards: Understanding Humans in Order to Counter Them.
Nierhoff, Thomas; Leibrandt, Konrad; Lorenz, Tamara; Hirche, Sandra
2016-08-01
Ongoing technological advances in the areas of computation, sensing, and mechatronics enable robotic-based systems to interact with humans in the real world. To succeed against a human in a competitive scenario, a robot must anticipate the human behavior and include it in its own planning framework. Then it can predict the next human move and counter it accordingly, thus not only achieving overall better performance but also systematically exploiting the opponent's weak spots. Pool is used as a representative scenario to derive a model-based planning and control framework where not only the physics of the environment but also a model of the opponent is considered. By representing the game of pool as a Markov decision process and incorporating a model of the human decision-making based on studies, an optimized policy is derived. This enables the robot to include the opponent's typical game style into its tactical considerations when planning a stroke. The results are validated in simulations and real-life experiments with an anthropomorphic robot playing pool against a human.
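The planning framework described above, representing the game as a Markov decision process with the opponent model folded into the dynamics, can be sketched with standard value iteration. This is an illustrative toy, not the authors' implementation: the states, actions, and opponent response here are hypothetical placeholders for abstracted table configurations and strokes.

```python
import numpy as np

def value_iteration(P, R, gamma=0.9, tol=1e-8):
    """P[a] is an (S, S) transition matrix for action a, with the modeled
    opponent's response already marginalized into the dynamics; R[a] is the
    expected reward vector for action a. Returns optimal values and the
    greedy policy."""
    n_actions, n_states = len(P), P[0].shape[0]
    V = np.zeros(n_states)
    while True:
        # Bellman backup over all actions
        Q = np.array([R[a] + gamma * P[a] @ V for a in range(n_actions)])
        V_new = Q.max(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=0)
        V = V_new
```

The key design point mirrored from the abstract is that the opponent is not a separate adversarial solver: their empirically modeled decision tendencies are baked into `P`, so the robot's optimized policy automatically exploits the opponent's typical game style.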
Bridging Empirical and Physical Approaches for Landslide Monitoring and Early Warning
NASA Technical Reports Server (NTRS)
Kirschbaum, Dalia; Peters-Lidard, Christa; Adler, Robert; Kumar, Sujay; Harrison, Ken
2011-01-01
Rainfall-triggered landslides typically occur and are evaluated at local scales, using slope-stability models to calculate coincident changes in driving and resisting forces at the hillslope level in order to anticipate slope failures. Over larger areas, detailed high resolution landslide modeling is often infeasible due to difficulties in quantifying the complex interaction between rainfall infiltration and surface materials as well as the dearth of available in situ soil and rainfall estimates and accurate landslide validation data. This presentation will discuss how satellite precipitation and surface information can be applied within a landslide hazard assessment framework to improve landslide monitoring and early warning by considering two disparate approaches to landslide hazard assessment: an empirical landslide forecasting algorithm and a physical slope-stability model. The goal of this research is to advance near real-time landslide hazard assessment and early warning at larger spatial scales. This is done by employing high resolution surface and precipitation information within a probabilistic framework to provide more physically-based grounding to empirical landslide triggering thresholds. The empirical landslide forecasting tool, running in near real-time at http://trmm.nasa.gov, considers potential landslide activity at the global scale and relies on Tropical Rainfall Measuring Mission (TRMM) precipitation data and surface products to provide a near real-time picture of where landslides may be triggered. The physical approach considers how rainfall infiltration on a hillslope affects the in situ hydro-mechanical processes that may lead to slope failure. Evaluation of these empirical and physical approaches are performed within the Land Information System (LIS), a high performance land surface model processing and data assimilation system developed within the Hydrological Sciences Branch at NASA's Goddard Space Flight Center. 
LIS provides the capabilities to quantify uncertainty from model inputs and calculate probabilistic estimates for slope failures. Results indicate that remote sensing data can provide many of the spatiotemporal requirements for accurate landslide monitoring and early warning; however, higher resolution precipitation inputs will help to better identify small-scale precipitation forcings that contribute to significant landslide triggering. Future missions, such as the Global Precipitation Measurement (GPM) mission will provide more frequent and extensive estimates of precipitation at the global scale, which will serve as key inputs to significantly advance the accuracy of landslide hazard assessment, particularly over larger spatial scales.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strydom, Gerhard; Bostelmann, F.
The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained.) SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on derivative-based methods such as stochastic sampling methods or on generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties), extensive sensitivity and uncertainty studies are needed for quantification of the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies. 
In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on the HTGR Uncertainty Analysis in Modelling (UAM) be implemented. This CRP is a continuation of the previous IAEA and Organization for Economic Co-operation and Development (OECD)/Nuclear Energy Agency (NEA) international activities on Verification and Validation (V&V) of available analytical capabilities for HTGR simulation for design and safety evaluations. Within the framework of these activities different numerical and experimental benchmark problems were performed and insight was gained about specific physics phenomena and the adequacy of analysis methods.
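The stochastic-sampling branch of the SA/UA methods mentioned above can be sketched as a simple Monte Carlo propagation of input uncertainties through a model. This is a minimal sketch under strong assumptions (independent normal input uncertainties, a cheap scalar model standing in for a full core simulation); the function and its inputs are hypothetical.

```python
import numpy as np

def sample_uncertainty(model, means, sds, n=5000, seed=0):
    """Perturb each input with an independent normal uncertainty, evaluate
    the model for every sample, and summarize the output spread."""
    rng = np.random.default_rng(seed)
    X = rng.normal(means, sds, size=(n, len(means)))  # one row per sample
    y = np.array([model(x) for x in X])
    return y.mean(), y.std()
```

Ranking inputs for SA would then repeat this with one input perturbed at a time; generalized perturbation theory, the other branch named in the abstract, instead computes sensitivity coefficients directly without sampling.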
A Simple Mechanical Model for the Isotropic Harmonic Oscillator
ERIC Educational Resources Information Center
Nita, Gelu M.
2010-01-01
A constrained elastic pendulum is proposed as a simple mechanical model for the isotropic harmonic oscillator. The conceptual and mathematical simplicity of this model recommends it as an effective pedagogical tool in teaching basic physics concepts at advanced high school and introductory undergraduate course levels. (Contains 2 figures.)
A 1DVAR-based snowfall rate retrieval algorithm for passive microwave radiometers
NASA Astrophysics Data System (ADS)
Meng, Huan; Dong, Jun; Ferraro, Ralph; Yan, Banghua; Zhao, Limin; Kongoli, Cezar; Wang, Nai-Yu; Zavodsky, Bradley
2017-06-01
Snowfall rate retrieval from spaceborne passive microwave (PMW) radiometers has gained momentum in recent years. PMW can be so utilized because of its ability to sense in-cloud precipitation. A physically based, overland snowfall rate (SFR) algorithm has been developed using measurements from the Advanced Microwave Sounding Unit-A/Microwave Humidity Sounder sensor pair and the Advanced Technology Microwave Sounder. Currently, these instruments are aboard five polar-orbiting satellites, namely, NOAA-18, NOAA-19, Metop-A, Metop-B, and Suomi-NPP. The SFR algorithm relies on a separate snowfall detection algorithm that is composed of a satellite-based statistical model and a set of numerical weather prediction model-based filters. There are four components in the SFR algorithm itself: cloud properties retrieval, computation of ice particle terminal velocity, ice water content adjustment, and the determination of snowfall rate. The retrieval of cloud properties is the foundation of the algorithm and is accomplished using a one-dimensional variational (1DVAR) model. An existing model is adopted to derive ice particle terminal velocity. Since no measurement of cloud ice distribution is available when SFR is retrieved in near real time, such distribution is implicitly assumed by deriving an empirical function that adjusts retrieved SFR toward radar snowfall estimates. Finally, SFR is determined numerically from a complex integral. The algorithm has been validated against both radar and ground observations of snowfall events from the contiguous United States with satisfactory results. Currently, the SFR product is operationally generated at the National Oceanic and Atmospheric Administration and can be obtained from that organization.
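The 1DVAR cloud-properties retrieval at the foundation of the SFR algorithm minimizes a cost balancing a background (a priori) state against the observed brightness temperatures. A single Gauss-Newton iteration of that minimization can be sketched as below; this is the generic 1DVAR update, not the operational code, and all variable names are illustrative.

```python
import numpy as np

def onedvar_step(x, x_b, y, F, H, B, R):
    """One Gauss-Newton step minimizing the 1DVAR cost
    J(x) = (x - x_b)^T B^-1 (x - x_b) + (y - F(x))^T R^-1 (y - F(x)),
    where x_b is the background state, y the observations, F the forward
    model, H its Jacobian at x, B and R the background and observation
    error covariances."""
    B_inv, R_inv = np.linalg.inv(B), np.linalg.inv(R)
    lhs = B_inv + H.T @ R_inv @ H                       # approximate Hessian
    rhs = H.T @ R_inv @ (y - F(x)) - B_inv @ (x - x_b)  # negative gradient
    return x + np.linalg.solve(lhs, rhs)
```

For a linear forward model one step from the background already yields the optimal analysis; in the nonlinear retrieval case, `H` is re-linearized and the step is iterated until the cost converges.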
Rapp, Teresa L; Phillips, Susan R; Dmochowski, Ivan J
2016-12-13
The study of ruthenium polypyridyl complexes can be widely applied across disciplines in the undergraduate curriculum. Ruthenium photochemistry has advanced many fields including dye-sensitized solar cells, photoredox catalysis, light-driven water oxidation, and biological electron transfer. Equally promising are ruthenium polypyridyl complexes that provide a sterically bulky, photolabile moiety for transiently "caging" biologically active molecules. Photouncaging involves the use of visible (1-photon) or near-IR (2-photon) light to break one or more bonds between ruthenium and coordinated ligand(s), which can occur on short time scales and in high quantum yields. In this work we demonstrate the use of a model "caged" acetonitrile complex, Ru(2,2'-bipyridine)2(acetonitrile)2, or RuMeCN, in an advanced synthesis and physical chemistry laboratory. Students made RuMeCN in an advanced synthesis laboratory course and performed UV-vis spectroscopy and electrochemistry. The following semester students investigated RuMeCN photolysis kinetics in a physical chemistry laboratory. These two exercises may also be combined to create a 2-week module in an advanced undergraduate laboratory course.
Landslide hazard assessment: recent trends and techniques.
Pardeshi, Sudhakar D; Autade, Sumant E; Pardeshi, Suchitra S
2013-01-01
Landslide hazard assessment is an important step towards landslide hazard and risk management. There are several methods of Landslide Hazard Zonation (LHZ), viz. heuristic, semi-quantitative, quantitative, probabilistic, and multi-criteria decision-making processes. However, no single method is universally accepted for effective assessment of landslide hazards. In recent years, several attempts have been made to apply different methods of LHZ and to compare results in order to find the best-suited model. This paper presents a review of research on landslide hazard mapping published in recent years. Advanced multivariate techniques have proved effective in the spatial prediction of landslides with a high degree of accuracy. Physical process-based models also perform well in LHZ mapping, even in areas with a poor database. Multi-criteria decision-making approaches also play a significant role in determining the relative importance of landslide causative factors in the slope instability process. Remote sensing and Geographical Information Systems (GIS) are powerful tools for assessing landslide hazards and have been used extensively in landslide research over the last decade. Aerial photographs and high-resolution satellite data are useful in detecting, mapping and monitoring landslide processes. GIS-based LHZ models help not only to map and monitor landslides but also to predict future slope failures. Advances in geospatial technologies have opened the door to detailed and accurate assessment of landslide hazards.
Physical and Chemical Properties of the Copper-Alanine System: An Advanced Laboratory Project
ERIC Educational Resources Information Center
Farrell, John J.
1977-01-01
An integrated physical-analytical-inorganic chemistry laboratory procedure for use with undergraduate biology majors is described. The procedure requires five to six laboratory periods and includes acid-base standardizations, potentiometric determinations, computer usage, spectrophotometric determinations of crystal-field splitting…
Detailed Modeling of Physical Processes in Electron Sources for Accelerator Applications
NASA Astrophysics Data System (ADS)
Chubenko, Oksana; Afanasev, Andrei
2017-01-01
At present, electron sources are essential in a wide range of applications - from common technical use to exploring the nature of matter. Depending on the application requirements, different methods and materials are used to generate electrons. State-of-the-art accelerator applications set a number of often-conflicting requirements for electron sources (e.g., quantum efficiency vs. polarization, current density vs. lifetime, etc). Development of advanced electron sources includes modeling and design of cathodes, material growth, fabrication of cathodes, and cathode testing. The detailed simulation and modeling of physical processes is required in order to shed light on the exact mechanisms of electron emission and to develop new-generation electron sources with optimized efficiency. The purpose of the present work is to study physical processes in advanced electron sources and develop scientific tools, which could be used to predict electron emission from novel nano-structured materials. In particular, the area of interest includes bulk/superlattice gallium arsenide (bulk/SL GaAs) photo-emitters and nitrogen-incorporated ultrananocrystalline diamond ((N)UNCD) photo/field-emitters. Work supported by The George Washington University and Euclid TechLabs LLC.
Physical and mathematical cochlear models
NASA Astrophysics Data System (ADS)
Lim, Kian-Meng
2000-10-01
The cochlea is an intricate organ in the inner ear responsible for our hearing. Besides acting as a transducer to convert mechanical sound vibrations to electrical neural signals, the cochlea also amplifies and separates the sound signal into its spectral components for further processing in the brain. It operates over a broad band of frequency and a huge dynamic range of input while maintaining a low power consumption. The present research takes the approach of building cochlear models to study and understand the underlying mechanics involved in the functioning of the cochlea. Both physical and mathematical models of the cochlea are constructed. The physical model is a first attempt to build a life-sized replica of the human cochlea using advanced micro-machining techniques. The model takes a modular design, with a removable silicon-wafer based partition membrane encapsulated in a plastic fluid chamber. Preliminary measurements in the model are obtained and they compare roughly with simulation results. Parametric studies on the design parameters of the model lead to an improved design of the model. The studies also revealed that the width and orthotropy of the basilar membrane in the cochlea have significant effects on the sharply tuned responses observed in the biological cochlea. The mathematical model is a physiologically based model that includes three-dimensional viscous fluid flow and a tapered partition with variable properties along its length. A hybrid asymptotic and numerical method provides a uniformly valid and efficient solution to the short and long wave regions in the model. Both linear and non-linear activity are included in the model to simulate the active cochlea. The mathematical model has successfully reproduced many features of the response in the biological cochlea, as observed in experiment measurements performed on animals. 
These features include sharply tuned frequency responses, significant amplification with inclusion of activity, and non-linear effects such as compression of response with stimulus level, two-tone suppression and the generation of harmonic and distortion products.
Lithium Gadolinium Borate in Plastic Scintillator as an Antineutrino Detection Material
2010-06-01
advancement of fundamental particle physics, development of the standard model of particle physics, and our understanding of many cosmological processes... (MeVee), where the light produced by a 1 MeV electron is 1 MeVee by definition, but a heavy charged particle would have a kinetic energy of several
ERIC Educational Resources Information Center
Keith, Wayne; Martin, Cynthia; Veltkamp, Pamela
2013-01-01
Using model rockets to teach physics can be an effective way to engage students in learning. In this paper, we present a curriculum developed in response to an expressed need for helping high school students review physics equations in preparation for a state-mandated exam. This required a mode of teaching that was more advanced and analytical…
Medical physics practice in the next decade
Paliwal, Bhudatt
2006-01-01
Impressive advances in computers and materials science have fueled a broad-based confluence of basic science breakthroughs. These advances are making us reformulate our learning, teaching and credentialing methodologies and research and development frontiers. We are now in the age of molecular medicine. In the entire field of health care, a paradigm shift from population-based solutions to individual-specific care is taking place. These trends are reshaping the practice of medical physics. In this short presentation, examples are given to illustrate developments in image-guided intensity-modulated and adaptive helical tomotherapy, and enhanced application of intensity-modulated radiotherapy (IMRT) using adaptive radiotherapy and conformal avoidance. These advances include improved normal tissue sparing and permit dose reconstruction and verification, thereby allowing significant biologically effective dose escalation and reduced radiation toxicity. The intrinsic capability of helical TomoTherapy for megavoltage CT imaging for IMRT image-guidance is also discussed. Finally, developments in motion management are described. PMID:22275799
Development of Advanced Light-Duty Powertrain and Hybrid Analysis Tool (SAE 2013-01-0808)
The Advanced Light-Duty Powertrain and Hybrid Analysis tool was created by the Environmental Protection Agency to evaluate greenhouse gas emissions and fuel efficiency of light-duty vehicles. It is a physics-based, forward-looking, full-vehicle computer simulator, which is cap...
CFD for hypersonic airbreathing aircraft
NASA Technical Reports Server (NTRS)
Kumar, Ajay
1989-01-01
A general discussion is given on the use of advanced computational fluid dynamics (CFD) in analyzing the hypersonic flow field around an airbreathing aircraft. Unique features of the hypersonic flow physics are presented and an assessment is given of the current algorithms in terms of their capability to model hypersonic flows. Several examples of advanced CFD applications are then presented.
Compact modeling of total ionizing dose and aging effects in MOS technologies
Esqueda, Ivan S.; Barnaby, Hugh J.; King, Michael Patrick
2015-06-18
This paper presents a physics-based compact modeling approach that incorporates the impact of total ionizing dose (TID) and stress-induced defects into simulations of metal-oxide-semiconductor (MOS) devices and integrated circuits (ICs). This approach utilizes calculations of surface potential (ψs) to capture the charge contribution from oxide trapped charge and interface traps and to describe their impact on MOS electrostatics and device operating characteristics as a function of ionizing radiation exposure and aging effects. The modeling approach is demonstrated for bulk and silicon-on-insulator (SOI) MOS devices. The formulation is verified using TCAD simulations and through the comparison of model calculations and experimental I-V characteristics from irradiated devices. The presented approach is suitable for modeling TID and aging effects in advanced MOS devices and ICs.
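As a point of reference for the charge contributions the paper folds into its surface-potential formulation, the textbook first-order effect of oxide-trapped charge is a rigid threshold-voltage shift. The sketch below is that simple relation only, not the paper's compact model, and assumes all trapped charge sits in a sheet at the Si/SiO2 interface.

```python
Q_E = 1.602e-19    # C, elementary charge
EPS_OX = 3.45e-11  # F/m, SiO2 permittivity (~3.9 * vacuum permittivity)

def delta_vth_ot(n_ot_cm2, t_ox_m):
    """First-order threshold-voltage shift (V) from net positive
    oxide-trapped charge of areal density n_ot_cm2 (cm^-2) located at the
    Si/SiO2 interface of an oxide t_ox_m meters thick."""
    c_ox = EPS_OX / t_ox_m       # oxide capacitance per area, F/m^2
    q_ot = Q_E * n_ot_cm2 * 1e4  # sheet charge density, C/m^2
    return -q_ot / c_ox          # positive trapped charge lowers Vth
```

The surface-potential approach described in the abstract goes further: interface traps add a bias-dependent charge term, so the I-V distortion cannot be captured by a rigid shift alone, which is precisely why ψs-based modeling is used.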
Spacing distribution functions for 1D point island model with irreversible attachment
NASA Astrophysics Data System (ADS)
Gonzalez, Diego; Einstein, Theodore; Pimpinelli, Alberto
2011-03-01
We study the configurational structure of the point island model for epitaxial growth in one dimension. In particular, we calculate the island gap and capture zone distributions. Our model is based on an approximate description of nucleation inside the gaps. Nucleation is described by the joint probability density p_xy^n(x,y), which represents the probability density to have nucleation at position x within a gap of size y. Our proposed functional form for p_xy^n(x,y) describes excellently the statistical behavior of the system. We compare our analytical model with extensive numerical simulations. Our model retains the most relevant physical properties of the system. This work was supported by the NSF-MRSEC at the University of Maryland, Grant No. DMR 05-20471, with ancillary support from the Center for Nanophysics and Advanced Materials (CNAM).
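The two observables named above have simple geometric definitions in one dimension: a gap is the spacing between neighboring islands, and an island's capture zone is its 1D Voronoi cell, i.e. half of each adjacent gap. A minimal sketch for extracting both from simulated island positions (assuming periodic boundary conditions; not the authors' code):

```python
import numpy as np

def gaps_and_capture_zones(positions, L):
    """Gap sizes between neighboring point islands on a ring of length L,
    and each island's capture zone (half of its left gap plus half of its
    right gap). Returned in order of sorted position."""
    x = np.sort(np.asarray(positions, dtype=float))
    gaps = np.diff(np.append(x, x[0] + L))     # wrap last gap around the ring
    czs = 0.5 * (gaps + np.roll(gaps, 1))      # roll aligns each left gap
    return gaps, czs
```

Histogramming `gaps` and `czs` over many growth runs yields the gap and capture-zone distributions that the analytical p_xy^n(x,y) model is compared against; note the capture zones always sum to the system size L.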
Wilson Prize article: From vacuum tubes to lasers and back again
NASA Astrophysics Data System (ADS)
Madey, John M. J.
2014-07-01
The first demonstration of an optical-wavelength laser by Theodore Maiman in 1960 had a transformational impact on the paths that would be blazed to advance the state of the art of short wavelength coherent electron beam-based radiation sources. Free electron lasers (FELs) emerged from these efforts as the electron beam-based realization of the pioneering model of atom-based "optical masers" by Schawlow and Townes, but with far greater potential for tunable operation at high power and very short wavelengths. Further opportunities for yet greater capabilities may be inherent in our still growing understanding of the underlying physics. This article focuses on the FEL efforts in which the author was directly and personally involved.
Magnetic Excitations and Geometric Confinement; Theory and simulations
NASA Astrophysics Data System (ADS)
Wysin, Gary Matthew
2015-12-01
In this book, author Gary Wysin provides an overview of model systems and their behaviour and effects. The book is intended for advanced students and researchers in physics, chemistry and engineering interested in confined magnetics, and is also suitable as an auxiliary text in a class on magnetism or solid state physics. Previous physics knowledge is expected, along with some basic knowledge of classical electromagnetism and electromagnetic waves for the latter chapters.
Recent Advances in WRF Modeling for Air Quality Applications
The USEPA uses WRF in conjunction with the Community Multiscale Air Quality (CMAQ) model for air quality regulation and research. Over the years we have added physics options and geophysical datasets to the WRF system to enhance model capabilities, especially for extended retrospective...
Advances in Global Full Waveform Inversion
NASA Astrophysics Data System (ADS)
Tromp, J.; Bozdag, E.; Lei, W.; Ruan, Y.; Lefebvre, M. P.; Modrak, R. T.; Orsvuran, R.; Smith, J. A.; Komatitsch, D.; Peter, D. B.
2017-12-01
Information about Earth's interior comes from seismograms recorded at its surface. Seismic imaging based on spectral-element and adjoint methods has enabled assimilation of this information for the construction of 3D (an)elastic Earth models. These methods account for the physics of wave excitation and propagation by numerically solving the equations of motion, and require the execution of complex computational procedures that challenge the most advanced high-performance computing systems. Current research is petascale; future research will require exascale capabilities. The inverse problem consists of reconstructing the characteristics of the medium from -often noisy- observations. A nonlinear functional is minimized, which involves both the misfit to the measurements and a Tikhonov-type regularization term to tackle inherent ill-posedness. Achieving scalability for the inversion process on tens of thousands of multicore processors is a task that offers many research challenges. We initiated global "adjoint tomography" using 253 earthquakes and produced the first-generation model named GLAD-M15, with a transversely isotropic model parameterization. We are currently running iterations for a second-generation anisotropic model based on the same 253 events. In parallel, we continue iterations for a transversely isotropic model with a larger dataset of 1,040 events to determine higher-resolution plume and slab images. A significant part of our research has focused on eliminating I/O bottlenecks in the adjoint tomography workflow. This has led to the development of a new Adaptable Seismic Data Format based on HDF5, and post-processing tools based on the ADIOS library developed by Oak Ridge National Laboratory. We use the Ensemble Toolkit for workflow stabilization & management to automate the workflow with minimal human interaction.
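The minimized functional described above, a data misfit plus a Tikhonov-type regularization term, has a closed-form minimizer in the linear case. The sketch below shows that linear toy problem only; the adjoint tomography in the abstract is nonlinear and iterative, with the gradient supplied by adjoint simulations rather than an explicit operator, and all names here are illustrative.

```python
import numpy as np

def tikhonov_solve(G, d_obs, m_prior, lam):
    """Minimizer of ||G m - d_obs||^2 + lam * ||m - m_prior||^2 for a
    linear forward operator G: solve the regularized normal equations."""
    n = G.shape[1]
    lhs = G.T @ G + lam * np.eye(n)
    rhs = G.T @ d_obs + lam * m_prior
    return np.linalg.solve(lhs, rhs)
```

The regularization weight `lam` controls the trade-off that tackles ill-posedness: as `lam` grows the solution is pulled toward the prior model, and as `lam` shrinks it approaches the unregularized least-squares fit to the (often noisy) observations.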
Nonlinear flight control design using backstepping methodology
NASA Astrophysics Data System (ADS)
Tran, Thanh Trung
The subject of nonlinear flight control design using backstepping control methodology is investigated in the dissertation research presented here. Control design methods based on nonlinear models of the dynamic system provide higher utility and versatility because the design model more closely matches the physical system behavior. Obtaining requisite model fidelity is only half of the overall design process, however. Design of the nonlinear control loops can lessen the effects of nonlinearity, or even exploit nonlinearity, to achieve higher levels of closed-loop stability, performance, and robustness. The goal of the research is to improve control quality for a general class of strict-feedback dynamic systems and provide flight control architectures to augment the aircraft motion. The research is divided into two parts: theoretical control development for the strict-feedback form of nonlinear dynamic systems and application of the proposed theory for nonlinear flight dynamics. In the first part, the research is built on two components: transforming the nonlinear dynamic model to a canonical strict-feedback form and then applying backstepping control theory to the canonical model. The research considers a process to determine when this transformation is possible, and when it is possible, a systematic process to transfer the model is also considered when practical. When this is not the case, certain modeling assumptions are explored to facilitate the transformation. After achieving the canonical form, a systematic design procedure for formulating a backstepping control law is explored in the research. Starting with the simplest subsystem and ending with the full system, pseudo control concepts based on Lyapunov control functions are used to control each successive subsystem. Typically each pseudo control must be solved from a nonlinear algebraic equation. 
At the end of this process, the physical control input must be re-expressed in terms of the physical states by eliminating the pseudo control transformations. In the second part, the research focuses on nonlinear control design for flight dynamics of aircraft motion. Some assumptions on aerodynamics of the aircraft are addressed to transform full nonlinear flight dynamics into the canonical strict-feedback form. The assumptions are also analyzed, validated, and compared to show the advantages and disadvantages of the design models. With the achieved models, investigation focuses on formulating the backstepping control laws and provides an advanced control algorithm for nonlinear flight dynamics of the aircraft. Experimental and simulation studies are successfully implemented to validate the proposed control method. Advancement of nonlinear backstepping control theory and its application to nonlinear flight control are achieved in the dissertation research.
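The recursive procedure described above can be sketched on a toy two-state strict-feedback system (x1' = x2, x2' = u), not the dissertation's aircraft model: the x2 state is treated as a pseudo control for the x1 subsystem, an error variable is defined against that pseudo control, and the true input is chosen to make an augmented Lyapunov function decrease. The gains k1, k2 are illustrative.

```python
import numpy as np

# Minimal backstepping sketch for the toy strict-feedback system
#   x1_dot = x2,  x2_dot = u   (illustrative, not the aircraft model).
# Step 1: treat x2 as a pseudo control for the x1 subsystem and pick
#   alpha(x1) = -k1 * x1   (stabilizes x1 via V1 = x1^2 / 2).
# Step 2: define the error z2 = x2 - alpha and choose u so the
#   augmented Lyapunov function V2 = V1 + z2^2 / 2 decreases.

k1, k2 = 2.0, 2.0

def control(x1, x2):
    alpha = -k1 * x1              # pseudo control for the x1 subsystem
    z2 = x2 - alpha               # backstepping error variable
    alpha_dot = -k1 * x2          # d(alpha)/dt along trajectories
    return -k2 * z2 - x1 + alpha_dot

def simulate(x0, dt=1e-3, steps=10_000):
    x1, x2 = x0
    for _ in range(steps):
        u = control(x1, x2)
        x1, x2 = x1 + dt * x2, x2 + dt * u   # forward-Euler integration
    return x1, x2

x1f, x2f = simulate((1.0, -0.5))   # both states decay to the origin
```

The closed loop in (x1, z2) coordinates is linear with eigenvalues -2 ± i, so both states converge to the origin, mirroring the stability argument that each pseudo-control step provides.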
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpson, L.; Britt, J.; Birkmire, R.
ITN Energy Systems, Inc., and Global Solar Energy, Inc., assisted by NREL's PV Manufacturing R&D program, have continued to advance CIGS production technology by developing trajectory-oriented predictive/control models, fault-tolerance control, control platform development, in-situ sensors, and process improvements. Modeling activities included developing physics-based and empirical models for CIGS and sputter-deposition processing, implementing model-based control, and applying predictive models to the construction of new evaporation sources and for control. Model-based control is enabled by implementing reduced or empirical models into a control platform. Reliability improvement activities include implementing preventive maintenance schedules; detecting failed sensors/equipment and reconfiguring to continue processing; and systematic development of fault prevention and reconfiguration strategies for the full range of CIGS PV production deposition processes. In-situ sensor development activities have resulted in improved control and indicated the potential for enhanced process status monitoring and control of the deposition processes. Substantial process improvements have been made, including significant improvement in CIGS uniformity, thickness control, efficiency, yield, and throughput. In large measure, these gains have been driven by process optimization, which in turn has been enabled by control and reliability improvements due to this PV Manufacturing R&D program.
NASA Astrophysics Data System (ADS)
Lachieze-Rey, Marc
This book delivers a quantitative account of the science of cosmology, designed for a non-specialist audience. The basic principles are outlined using simple maths and physics, while still providing rigorous models of the Universe. It offers an ideal introduction to the key ideas in cosmology, without going into technical details. The approach used is based on the fundamental ideas of general relativity such as the spacetime interval, comoving coordinates, and spacetime curvature. It provides an up-to-date and thoughtful discussion of the big bang, and the crucial questions of structure and galaxy formation. Questions of method and philosophical approaches in cosmology are also briefly discussed. Advanced undergraduates in either physics or mathematics would benefit greatly from its use, either as a course text or as a supplementary guide to cosmology courses.
NASA Astrophysics Data System (ADS)
Chen, Yi-Chieh; Li, Tsung-Han; Lin, Hung-Yu; Chen, Kao-Tun; Wu, Chun-Sheng; Lai, Ya-Chieh; Hurat, Philippe
2018-03-01
As process technology improves and integrated circuit (IC) design complexity increases, the failure rate caused by optical effects grows in semiconductor manufacturing. To enhance chip quality, optical proximity correction (OPC) plays an indispensable role in the manufacturing industry. However, OPC, which includes model creation, correction, simulation, and verification, is a bottleneck between design and manufacture because of its multiple iterations and the advanced mathematical descriptions of physical behavior it requires. This paper therefore presents a pattern-based design technology co-optimization (PB-DTCO) flow that works in cooperation with OPC to find patterns that will negatively affect yield and fix them automatically in advance, reducing OPC run-time. The PB-DTCO flow can generate plentiful test patterns for model creation and yield improvement, classify candidate patterns systematically, and quickly build up banks of matched pattern/optimization pairs. These banks can be used for hotspot fixing and layout optimization, and can also be referenced for the next technology node. The combination of the PB-DTCO flow with OPC therefore not only reduces time-to-market but is also flexible and easily adapted to diverse OPC flows.
Survey of current situation in radiation belt modeling
NASA Technical Reports Server (NTRS)
Fung, Shing F.
2004-01-01
The study of Earth's radiation belts is one of the oldest subjects in space physics. Despite the tremendous progress made in the last four decades, we still lack a complete understanding of the radiation belts in terms of their configurations, dynamics, and detailed physical accounts of their sources and sinks. The static nature of early empirical trapped radiation models, for example the NASA AP-8 and AE-8 models, renders those models inappropriate for predicting short-term radiation belt behaviors associated with geomagnetic storms and substorms. Due to incomplete data coverage, these models are also inaccurate at low altitudes (e.g., <1000 km) where many robotic and human space flights occur. The availability of radiation data from modern space missions and advancement in physical modeling and data management techniques have now allowed the development of new empirical and physical radiation belt models. In this paper, we will review the status of modern radiation belt modeling. Published by Elsevier Ltd on behalf of COSPAR.
NASA Technical Reports Server (NTRS)
1981-01-01
The development of a coal gasification system design and mass and energy balance simulation program for the TVA and other similar facilities is described. The materials-process-product model (MPPM) and the advanced system for process engineering (ASPEN) computer program were selected from available steady state and dynamic models. The MPPM was selected to serve as the basis for development of system level design model structure because it provided the capability for process block material and energy balance and high-level systems sizing and costing. The ASPEN simulation serves as the basis for assessing detailed component models for the system design modeling program. The ASPEN components were analyzed to identify particular process blocks and data packages (physical properties) which could be extracted and used in the system design modeling program. While ASPEN physical properties calculation routines are capable of generating physical properties required for process simulation, not all required physical property data are available, and must be user-entered.
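The block material and energy balances that MPPM provides can be illustrated with a simple closure check over a gasifier block. The stream names and flow values below are made up for illustration; they are not TVA plant data.

```python
# Hypothetical process-block balance check in the spirit of the MPPM /
# ASPEN block material and energy balances (stream values are made up).

def balance_closure(streams_in, streams_out, key):
    """Relative closure error of a conserved quantity across a block."""
    total_in = sum(s[key] for s in streams_in)
    total_out = sum(s[key] for s in streams_out)
    return abs(total_in - total_out) / total_in

# Streams carry mass flow (kg/s) and enthalpy flow (kW); values illustrative.
coal   = {"mass": 10.0, "enthalpy": 250.0}
steam  = {"mass":  4.0, "enthalpy": 120.0}
syngas = {"mass": 12.5, "enthalpy": 340.0}
slag   = {"mass":  1.5, "enthalpy":  30.0}

mass_err = balance_closure([coal, steam], [syngas, slag], "mass")
energy_err = balance_closure([coal, steam], [syngas, slag], "enthalpy")
```

In a converged flowsheet both closure errors should be near zero; a persistent residual flags a missing stream or an inconsistent physical-property evaluation.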
NASA Astrophysics Data System (ADS)
Richmond, P.; Ausloos, M.; Dacorogna, M.
2002-05-01
The area of research described as “econophysics” is renewing a kinship between physicists, economists, and financial practitioners that has been lost since the 19th century, when scientists such as Pascal and Halley made groundbreaking advances in the area. Now, new meetings are revealing new research opportunities outside the established pathways traditionally explored within economics and finance. In December 2001, around 100 researchers from across the world attended the EPS meeting “Applications of Physics to Financial Analysis” (APFA3). This was held in the Museum of London Conference Centre, which was chosen for its proximity to the City of London and its trading centres. The meeting was especially useful in bringing together roughly equal numbers of physicists, mathematicians and financial practitioners. Taking part in the conference we had the impression that, whilst the relation between physics and applied finance may still be at an early stage, it is evolving very quickly. As in nature, a sign of evolution is the emergence of different and specialised branches, each with their own specific character. Papers covered a range of topics, including: market modelling, risk management, agent-based modelling, hedging in incomplete markets, benchmarking, performance measurement, foreign exchange markets, time series analysis and prediction, the efficient market hypothesis, equilibrium and non-equilibrium markets, economic and financial networks, the valuation of derivatives, growth and bankruptcy. The meeting was sponsored by the European Physical Society and the UK Institute of Physics. The invited speakers were J.Ph. Bouchaud, J.F. Muzy, K. Sneppen, G. Iori and S. Solomon. Articles outlining some of the more interesting advances in this field have been selected by the Guest Editors from amongst the submitted articles and, after having been refereed, are presented here in this edition of EPJ B. APFA3 closed on a positive note.
There was a feeling that links between academia and industry are healthy and that these new interactions between Physics and Finance are producing valuable scientific and economic results.
Recent advances in QM/MM free energy calculations using reference potentials.
Duarte, Fernanda; Amrein, Beat A; Blaha-Nelson, David; Kamerlin, Shina C L
2015-05-01
Recent years have seen enormous progress in the development of methods for modeling (bio)molecular systems. This has allowed for the simulation of ever larger and more complex systems. However, as such complexity increases, the requirements needed for these models to be accurate and physically meaningful become more and more difficult to fulfill. The use of simplified models to describe complex biological systems has long been shown to be an effective way to overcome some of the limitations associated with this computational cost in a rational way. Hybrid QM/MM approaches have rapidly become one of the most popular computational tools for studying chemical reactivity in biomolecular systems. However, the high cost involved in performing high-level QM calculations has limited the applicability of these approaches when calculating free energies of chemical processes. In this review, we present some of the advances in using reference potentials and mean field approximations to accelerate high-level QM/MM calculations. We present illustrative applications of these approaches and discuss challenges and future perspectives for the field. The use of physically-based simplifications has shown to effectively reduce the cost of high-level QM/MM calculations. In particular, lower-level reference potentials enable one to reduce the cost of expensive free energy calculations, thus expanding the scope of problems that can be addressed. As was already demonstrated 40 years ago, the usage of simplified models still allows one to obtain cutting edge results with substantially reduced computational cost. This article is part of a Special Issue entitled Recent developments of molecular dynamics. Copyright © 2014. Published by Elsevier B.V.
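The reference-potential idea reviewed above can be sketched in one dimension: sample a cheap low-level potential, then recover the free energy difference to a higher-level target with Zwanzig's exponential average, dA = -kT ln ⟨exp(-(U_high - U_ref)/kT)⟩_ref. The toy potentials below are illustrative (the target differs from the reference by a constant offset, for which the exact answer is that offset), not an actual QM/MM pair.

```python
import numpy as np

# Reference-potential sketch: sample a cheap reference potential U_ref,
# then recover the free energy difference to a "high-level" target
# U_high via Zwanzig's exponential average,
#   dA = -kT * ln < exp(-(U_high - U_ref) / kT) >_ref.
# Toy 1-D potentials; the target differs by a constant c, so dA = c exactly.

rng = np.random.default_rng(0)
kT = 1.0
c = 0.7

def u_ref(x):   return 0.5 * x**2          # cheap reference potential
def u_high(x):  return 0.5 * x**2 + c      # "high-level" target potential

# Exact sampling of the Gaussian reference ensemble (variance kT).
x = rng.normal(0.0, np.sqrt(kT), size=100_000)

dU = u_high(x) - u_ref(x)
dA = -kT * np.log(np.mean(np.exp(-dU / kT)))   # recovers c
```

In real applications the two potentials differ non-trivially, and the variance of dU (not the sample count alone) governs how well the exponential average converges; that is what makes a well-chosen reference potential so valuable.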
Physical biology of human brain development.
Budday, Silvia; Steinmann, Paul; Kuhl, Ellen
2015-01-01
Neurodevelopment is a complex, dynamic process that involves a precisely orchestrated sequence of genetic, environmental, biochemical, and physical events. Developmental biology and genetics have shaped our understanding of the molecular and cellular mechanisms during neurodevelopment. Recent studies suggest that physical forces play a central role in translating these cellular mechanisms into the complex surface morphology of the human brain. However, the precise impact of neuronal differentiation, migration, and connection on the physical forces during cortical folding remains unknown. Here we review the cellular mechanisms of neurodevelopment with a view toward surface morphogenesis, pattern selection, and evolution of shape. We revisit cortical folding as the instability problem of constrained differential growth in a multi-layered system. To identify the contributing factors of differential growth, we map out the timeline of neurodevelopment in humans and highlight the cellular events associated with extreme radial and tangential expansion. We demonstrate how computational modeling of differential growth can bridge the scales, from phenomena at the cellular level to form and function at the organ level, to make quantitative, personalized predictions. Physics-based models can quantify cortical stresses, identify critical folding conditions, rationalize pattern selection, and predict gyral wavelengths and gyrification indices. We illustrate that physical forces can explain cortical malformations as emergent properties of developmental disorders. Combining biology and physics holds promise to advance our understanding of human brain development and enable early diagnostics of cortical malformations with the ultimate goal to improve treatment of neurodevelopmental disorders including epilepsy, autism spectrum disorders, and schizophrenia.
NASA Astrophysics Data System (ADS)
Varade, D. M.; Dikshit, O.
2017-12-01
Modeling and forecasting of snowmelt runoff are significant for understanding hydrological processes in the cryosphere, and require timely information on snow physical properties such as liquid water content and density in the topmost layer of the snowpack. Both seasonal runoff and avalanche forecasting depend heavily on the inherent physical characteristics of the snowpack, which are conventionally measured by field surveys in difficult terrain at considerable cost and manpower. With advances in remote sensing technology and the increasing availability of satellite data, the frequency and extent of these surveys could decline in future. In this study, we present a novel approach for estimating snow wetness and snow density using visible and infrared bands that are available on most multi-spectral sensors. We define a trapezoidal feature space based on the spectral reflectance in the near-infrared band and the Normalized Difference Snow Index (NDSI), referred to as the NIR-NDSI space, in which dry and wet snow are observed in the upper-left and lower-right corners, respectively. The corresponding pixels are extracted by approximating the dry and wet edges, which are used to develop a linear physical model to estimate snow wetness. Snow density is then estimated using the modeled snow wetness. Although the proposed approach uses Sentinel-2 data, it can be extended to incorporate data from other multi-spectral sensors. The estimated values for snow wetness and snow density show a high correlation with in-situ measurements. The proposed model opens a new avenue for remote sensing of snow physical properties using multi-spectral data, which has been limited in the literature.
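The NIR-NDSI construction can be sketched as follows: compute NDSI from green and SWIR reflectances, then place each pixel linearly between a dry edge and a wet edge in NIR-NDSI space. The edge lines below are hypothetical placeholders, not the edges fitted in the study.

```python
import numpy as np

# Sketch of the NIR-NDSI idea: NDSI from green and SWIR reflectances,
# then a linear wetness index between hypothetical "dry" and "wet"
# edges of the trapezoid. Edge coefficients are illustrative, not the
# edges fitted in the study.

def ndsi(green, swir):
    return (green - swir) / (green + swir)

def snow_wetness(nir, green, swir, dry_edge=(1.0, -0.1), wet_edge=(0.3, -0.5)):
    """Linear wetness index in [0, 1]: 0 at the dry edge, 1 at the wet edge.

    Each edge is (intercept, slope) of NIR as a function of NDSI.
    """
    s = ndsi(green, swir)
    nir_dry = dry_edge[0] + dry_edge[1] * s   # dry-edge NIR at this NDSI
    nir_wet = wet_edge[0] + wet_edge[1] * s   # wet-edge NIR at this NDSI
    w = (nir_dry - nir) / (nir_dry - nir_wet)
    return np.clip(w, 0.0, 1.0)

# A bright high-NIR pixel sits near the dry edge (wetness ~ 0);
# a darker pixel with the same NDSI sits closer to the wet edge.
w_dry = snow_wetness(nir=0.95, green=0.8, swir=0.2)
w_wet = snow_wetness(nir=0.40, green=0.8, swir=0.2)
```

This mirrors the physical picture in the abstract: liquid water depresses NIR reflectance, so for a given NDSI a lower NIR value indicates wetter snow.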
Route constraints model based on polychromatic sets
NASA Astrophysics Data System (ADS)
Yin, Xianjun; Cai, Chao; Wang, Houjun; Li, Dongwu
2018-03-01
With the development of unmanned aerial vehicle (UAV) technology, the fields of its application are constantly expanding. Mission planning is especially important, since the planning result directly determines whether the UAV can accomplish its task. To make mission planning results more realistic, it is necessary to consider not only the physical properties of the aircraft but also the constraints among the various equipment on the UAV. However, these constraints are complex, and the equipment has strong diversity and variability, which makes the constraints difficult to describe. To solve this problem, this paper presents a mission constraint model of UAV based on polychromatic sets, a theory used in the advanced manufacturing field to describe complex systems.
Graph-based sensor fusion for classification of transient acoustic signals.
Srinivas, Umamahesh; Nasrabadi, Nasser M; Monga, Vishal
2015-03-01
Advances in acoustic sensing have enabled the simultaneous acquisition of multiple measurements of the same physical event via co-located acoustic sensors. We exploit the inherent correlation among such multiple measurements for acoustic signal classification, to identify the launch/impact of munition (i.e., rockets, mortars). Specifically, we propose a probabilistic graphical model framework that can explicitly learn the class conditional correlations between the cepstral features extracted from these different measurements. Additionally, we employ symbolic dynamic filtering-based features, which offer improvements over the traditional cepstral features in terms of robustness to signal distortions. Experiments on real acoustic data sets show that our proposed algorithm outperforms conventional classifiers as well as the recently proposed joint sparsity models for multisensor acoustic classification. Additionally our proposed algorithm is less sensitive to insufficiency in training samples compared to competing approaches.
Second Law based definition of passivity/activity of devices
NASA Astrophysics Data System (ADS)
Sundqvist, Kyle M.; Ferry, David K.; Kish, Laszlo B.
2017-10-01
Recently, our efforts to clarify the old question, if a memristor is a passive or active device [1], triggered debates between engineers, who have had advanced definitions of passivity/activity of devices, and physicists with significantly different views about this seemingly simple question. This debate triggered our efforts to test the well-known engineering concepts about passivity/activity in a deeper way, challenging them by statistical physics. It is shown that the advanced engineering definition of passivity/activity of devices is self-contradictory when a thermodynamical system executing Johnson-Nyquist noise is present. A new, statistical physical, self-consistent definition based on the Second Law of Thermodynamics is introduced. It is also shown that, in a system with uniform temperature distribution, any rectifier circuitry that can rectify thermal noise must contain an active circuit element, according to both the engineering and statistical physical definitions.
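The thermodynamic system invoked in the argument is a resistor exhibiting Johnson-Nyquist noise, whose open-circuit RMS voltage over a bandwidth B follows the standard formula v_rms = sqrt(4 k T R B). A quick numerical check of the magnitudes involved:

```python
import math

# Johnson-Nyquist thermal noise of a resistor: the fluctuation that the
# Second-Law-based passivity/activity definition must accommodate.
#   v_rms = sqrt(4 * k_B * T * R * B)

k_B = 1.380649e-23  # Boltzmann constant, J/K

def johnson_noise_vrms(T, R, B):
    """Open-circuit RMS noise voltage of a resistor R at temperature T
    over measurement bandwidth B."""
    return math.sqrt(4.0 * k_B * T * R * B)

# A 1 kOhm resistor at room temperature over a 10 kHz bandwidth:
# roughly 0.4 microvolts RMS.
v = johnson_noise_vrms(T=300.0, R=1e3, B=1e4)
```

Small as it is, this fluctuation is exactly what a putative thermal-noise rectifier in a uniform-temperature system would have to exploit, which is why the Second Law forces any such rectifier to contain an active element.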
Learning Quantitative Sequence-Function Relationships from Massively Parallel Experiments
NASA Astrophysics Data System (ADS)
Atwal, Gurinder S.; Kinney, Justin B.
2016-03-01
A fundamental aspect of biological information processing is the ubiquity of sequence-function relationships—functions that map the sequence of DNA, RNA, or protein to a biochemically relevant activity. Most sequence-function relationships in biology are quantitative, but only recently have experimental techniques for effectively measuring these relationships been developed. The advent of such "massively parallel" experiments presents an exciting opportunity for the concepts and methods of statistical physics to inform the study of biological systems. After reviewing these recent experimental advances, we focus on the problem of how to infer parametric models of sequence-function relationships from the data produced by these experiments. Specifically, we retrace and extend recent theoretical work showing that inference based on mutual information, not the standard likelihood-based approach, is often necessary for accurately learning the parameters of these models. Closely connected with this result is the emergence of "diffeomorphic modes"—directions in parameter space that are far less constrained by data than likelihood-based inference would suggest. Analogous to Goldstone modes in physics, diffeomorphic modes arise from an arbitrarily broken symmetry of the inference problem. An analytically tractable model of a massively parallel experiment is then described, providing an explicit demonstration of these fundamental aspects of statistical inference. This paper concludes with an outlook on the theoretical and computational challenges currently facing studies of quantitative sequence-function relationships.
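The mutual-information objective discussed above can be illustrated with a simple plug-in estimator built from a 2-D histogram; an information-based inference scheme would maximize such a quantity over model parameters rather than a likelihood. The data below are synthetic toys, not the massively parallel experimental data the paper considers.

```python
import numpy as np

# Plug-in mutual information (in nats) from a 2-D histogram:
#   I(X;Y) = sum p(x,y) * log( p(x,y) / (p(x) p(y)) ).
# Toy illustration with synthetic Gaussian data.

def mutual_information(x, y, bins=16):
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    nz = p_xy > 0                       # skip empty cells (0 * log 0 = 0)
    return float(np.sum(p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz])))

rng = np.random.default_rng(1)
x = rng.normal(size=50_000)
mi_dep = mutual_information(x, x + 0.3 * rng.normal(size=x.size))  # dependent
mi_ind = mutual_information(x, rng.normal(size=x.size))            # independent
```

For strongly dependent variables the estimate is large, while for independent variables it is near zero (up to a small positive histogram bias); in the inference setting, directions in parameter space that leave this value unchanged are precisely the diffeomorphic modes discussed in the text.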
On multi-site damage identification using single-site training data
NASA Astrophysics Data System (ADS)
Barthorpe, R. J.; Manson, G.; Worden, K.
2017-11-01
This paper proposes a methodology for developing multi-site damage location systems for engineering structures that can be trained using single-site damaged state data only. The methodology involves training a sequence of binary classifiers based upon single-site damage data and combining the developed classifiers into a robust multi-class damage locator. In this way, the multi-site damage identification problem may be decomposed into a sequence of binary decisions. In this paper Support Vector Classifiers are adopted as the means of making these binary decisions. The proposed methodology advances the state of the art in multi-site damage identification, where existing approaches require either: (1) full damaged state data from single- and multi-site damage cases or (2) the development of a physics-based model to make multi-site model predictions. The potential benefit of the proposed methodology is that a significantly reduced number of recorded damage states may be required in order to train a multi-site damage locator without recourse to physics-based model predictions. In this paper it is first demonstrated that Support Vector Classification represents an appropriate approach to the multi-site damage location problem, with methods for combining binary classifiers discussed. Next, the proposed methodology is demonstrated and evaluated through application to a real engineering structure, a Piper Tomahawk trainer aircraft wing, with its performance compared to classifiers trained using the full damaged-state dataset.
Model-Based Systems Engineering in Concurrent Engineering Centers
NASA Technical Reports Server (NTRS)
Iwata, Curtis; Infeld, Samantha; Bracken, Jennifer Medlin; McGuire, Melissa; McQuirk, Christina; Kisdi, Aron; Murphy, Jonathan; Cole, Bjorn; Zarifian, Pezhman
2015-01-01
Concurrent Engineering Centers (CECs) are specialized facilities with a goal of generating and maturing engineering designs by enabling rapid design iterations. This is accomplished by co-locating a team of experts (either physically or virtually) in a room with a focused design goal and a limited timeline of a week or less. The systems engineer uses a model of the system to capture the relevant interfaces and manage the overall architecture. A single model that integrates other design information and modeling allows the entire team to visualize the concurrent activity and identify conflicts more efficiently, potentially resulting in a systems model that will continue to be used throughout the project lifecycle. Performing systems engineering using such a system model is the definition of model-based systems engineering (MBSE); therefore, CECs evolving their approach to incorporate advances in MBSE are more successful in reducing time and cost needed to meet study goals. This paper surveys space mission CECs that are in the middle of this evolution, and the authors share their experiences in order to promote discussion within the community.
Elliott, Lydia; DeCristofaro, Claire; Carpenter, Alesia
2012-09-01
This article describes the development and implementation of integrated use of personal handheld devices (personal digital assistants, PDAs) and high-fidelity simulation in an advanced health assessment course in a graduate family nurse practitioner (NP) program. A teaching tool was developed that can be utilized as a template for clinical case scenarios blending these separate technologies. Review of the evidence-based literature, including peer-reviewed articles and reviews. Blending the technologies of high-fidelity simulation and handheld devices (PDAs) provided a positive learning experience for graduate NP students in a teaching laboratory setting. Combining both technologies in clinical case scenarios offered a more real-world learning experience, with a focus on point-of-care service and integration of interview and physical assessment skills with existing standards of care and external clinical resources. Faculty modeling and advance training with PDA technology was crucial to success. Faculty developed a general template tool and systems-based clinical scenarios integrating PDA and high-fidelity simulation. Faculty observations, the general template tool, and one scenario example are included in this article. ©2012 The Author(s) Journal compilation ©2012 American Academy of Nurse Practitioners.
Model-based Optimization and Feedback Control of the Current Density Profile Evolution in NSTX-U
NASA Astrophysics Data System (ADS)
Ilhan, Zeki Okan
Nuclear fusion research is a highly challenging, multidisciplinary field seeking contributions from both plasma physics and multiple engineering areas. As an application of plasma control engineering, this dissertation mainly explores methods to control the current density profile evolution within the National Spherical Torus eXperiment-Upgrade (NSTX-U), a substantial upgrade of the NSTX device located at the Princeton Plasma Physics Laboratory (PPPL), Princeton, NJ. Active control of the toroidal current density profile is among those plasma control milestones that the NSTX-U program must achieve to realize its next-step operational goals, which are characterized by high-performance, long-pulse, MHD-stable plasma operation with neutral beam heating. Therefore, the aim of this work is to develop model-based, feedforward and feedback controllers that can enable time regulation of the current density profile in NSTX-U by actuating the total plasma current, electron density, and the powers of the individual neutral beam injectors. Motivated by the coupled, nonlinear, multivariable, distributed-parameter plasma dynamics, the first step towards control design is the development of a physics-based, control-oriented model for the current profile evolution in NSTX-U in response to non-inductive current drives and heating systems. Numerical simulations of the proposed control-oriented model show qualitative agreement with the high-fidelity physics code TRANSP. The next step is to utilize the proposed control-oriented model to design an open-loop actuator trajectory optimizer. Given a desired operating state, the optimizer produces the actuator trajectories that can steer the plasma to such a state.
The objective of the feedforward control design is to provide a more systematic approach to advanced scenario planning in NSTX-U since the development of such scenarios is conventionally carried out experimentally by modifying the tokamak's actuator trajectories and analyzing the resulting plasma evolution. Finally, the proposed control-oriented model is embedded in feedback control schemes based on optimal control and Model Predictive Control (MPC) approaches. Integrators are added to the standard Linear Quadratic Gaussian (LQG) and MPC formulations to provide robustness against various modeling uncertainties and external disturbances. The effectiveness of the proposed feedback controllers in regulating the current density profile in NSTX-U is demonstrated in closed-loop nonlinear simulations. Moreover, the optimal feedback control algorithm has been implemented successfully in closed-loop control simulations within TRANSP through the recently developed Expert routine. (Abstract shortened by ProQuest.).
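The "integrator added to the LQ formulation" idea can be sketched on a toy discrete-time double-integrator plant (not the NSTX-U current-profile model): the state is augmented with the integral of the tracking error, and the discrete Riccati recursion is iterated to convergence to obtain a stabilizing gain with built-in integral action.

```python
import numpy as np

# LQR with integral action on a toy discrete-time double integrator
# (illustrative only; not the NSTX-U model). The augmented state is
# [x; z] with z_{k+1} = z_k + dt * C x_k, the integrated output error.

dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt**2], [dt]])
C = np.array([[1.0, 0.0]])

A_aug = np.block([[A, np.zeros((2, 1))],
                  [dt * C, np.eye(1)]])
B_aug = np.vstack([B, np.zeros((1, 1))])

Q = np.diag([10.0, 1.0, 5.0])   # penalize plant state and integrated error
R = np.array([[1.0]])

# Fixed-point iteration of the discrete algebraic Riccati equation:
#   P <- Q + A' P (A - B K),  K = (R + B' P B)^{-1} B' P A.
P = Q.copy()
for _ in range(1000):
    K = np.linalg.solve(R + B_aug.T @ P @ B_aug, B_aug.T @ P @ A_aug)
    P = Q + A_aug.T @ P @ (A_aug - B_aug @ K)

# Closed loop u = -K [x; z] must be Schur stable (spectral radius < 1).
spectral_radius = max(abs(np.linalg.eigvals(A_aug - B_aug @ K)))
```

Because the loop gain now contains a pure integrator, constant disturbances and constant model offsets produce zero steady-state tracking error, which is the robustness property the dissertation's augmented LQG/MPC formulations seek.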
NASA Astrophysics Data System (ADS)
2002-11-01
CD-ROM REVIEW: Essential Physics. BOOK REVIEWS: Collins Advanced Science: Physics, 2nd edition; Quarks, Leptons and the Big Bang, 2nd edition; Do Brilliantly: A2 Physics; IGCSE Physics; Geophysics in the UK; Synoptic Skills in Advanced Physics; Flash! The hunt for the biggest explosions in the universe; Materials; Maths for Advanced Physics.
Quasi-steady aerodynamic model of clap-and-fling flapping MAV and validation using free-flight data.
Armanini, S F; Caetano, J V; Croon, G C H E de; Visser, C C de; Mulder, M
2016-06-30
Flapping-wing aerodynamic models that are accurate, computationally efficient and physically meaningful, are challenging to obtain. Such models are essential to design flapping-wing micro air vehicles and to develop advanced controllers enhancing the autonomy of such vehicles. In this work, a phenomenological model is developed for the time-resolved aerodynamic forces on clap-and-fling ornithopters. The model is based on quasi-steady theory and accounts for inertial, circulatory, added mass and viscous forces. It extends existing quasi-steady approaches by: including a fling circulation factor to account for unsteady wing-wing interaction, considering real platform-specific wing kinematics and different flight regimes. The model parameters are estimated from wind tunnel measurements conducted on a real test platform. Comparison to wind tunnel data shows that the model predicts the lift forces on the test platform accurately, and accounts for wing-wing interaction effectively. Additionally, validation tests with real free-flight data show that lift forces can be predicted with considerable accuracy in different flight regimes. The complete parameter-varying model represents a wide range of flight conditions, is computationally simple, physically meaningful and requires few measurements. It is therefore potentially useful for both control design and preliminary conceptual studies for developing new platforms.
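A generic quasi-steady force decomposition of the kind extended in the paper can be sketched as a circulatory term plus an added-mass term; the coefficients below are illustrative stand-ins, not the parameters estimated from the wind tunnel data, and the fling circulation factor and viscous terms are omitted.

```python
import math

# Generic quasi-steady decomposition of the force on a flapping wing
# into circulatory and added-mass contributions (coefficients are
# illustrative, not the paper's fitted parameters):
#   F(t) = 0.5 * rho * U(t)^2 * S * C_N(alpha) + m_a * U_dot(t)
# with a flat-plate-like normal-force coefficient C_N = c0 * sin(2*alpha).

rho = 1.225      # air density, kg/m^3
S = 0.01         # wing area, m^2 (illustrative)
c0 = 1.8         # circulatory coefficient (illustrative)
m_a = 2e-4       # added-mass coefficient, kg (illustrative)

def quasi_steady_force(U, U_dot, alpha):
    circulatory = 0.5 * rho * U**2 * S * c0 * math.sin(2.0 * alpha)
    added_mass = m_a * U_dot
    return circulatory + added_mass

# Mid-stroke (high speed, zero acceleration) vs. stroke reversal
# (zero speed, high acceleration): different terms dominate.
f_mid = quasi_steady_force(U=3.0, U_dot=0.0, alpha=math.radians(45))
f_rev = quasi_steady_force(U=0.0, U_dot=50.0, alpha=math.radians(45))
```

Splitting the force this way is what makes the model parameters physically meaningful and cheap to estimate from measured kinematics, the two properties the abstract emphasizes for control design.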
On validating remote sensing simulations using coincident real data
NASA Astrophysics Data System (ADS)
Wang, Mingming; Yao, Wei; Brown, Scott; Goodenough, Adam; van Aardt, Jan
2016-05-01
The remote sensing community often requires data simulation, either via spectral/spatial downsampling or through virtual, physics-based models, to assess systems and algorithms. The Digital Imaging and Remote Sensing Image Generation (DIRSIG) model is one such first-principles, physics-based model for simulating imagery for a range of modalities. Complex simulation of vegetation environments has become possible as scene rendering technology and software have advanced. This in turn has created questions related to the validity of such complex models, with potential multiple scattering, bidirectional reflectance distribution function (BRDF), and similar phenomena that could impact results in the case of complex vegetation scenes. We selected three sites, located in the Pacific Southwest domain (Fresno, CA) of the National Ecological Observatory Network (NEON). These sites represent oak savanna, hardwood forests, and conifer-manzanita-mixed forests. We constructed corresponding virtual scenes, using airborne LiDAR and imaging spectroscopy data from NEON, ground-based LiDAR data, and field-collected spectra to characterize the scenes. Imaging spectroscopy data for these virtual sites were then generated using the DIRSIG simulation environment. This simulated imagery was compared to real AVIRIS imagery (15 m spatial resolution; 12 pixels/scene) and NEON Airborne Observation Platform (AOP) data (1 m spatial resolution; 180 pixels/scene). These tests were performed using a distribution-comparison approach for select spectral statistics, e.g., statistics that establish the spectra's shape, for each simulated-versus-real distribution pair. The initial comparison of the spectral distributions indicated that the shapes of spectra between the virtual and real sites were closely matched.
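The distribution-comparison step can be sketched as follows: compute a per-pixel spectral statistic (here, the band index of peak reflectance, one simple proxy for spectral shape) for the simulated and real pixel sets, then compare the two empirical distributions with a two-sample Kolmogorov-Smirnov statistic. The spectra below are synthetic, not the AVIRIS/NEON data.

```python
import numpy as np

# Distribution comparison of a per-pixel spectral statistic between a
# simulated and a real pixel set, via a two-sample KS statistic.
# Spectra are synthetic Gaussians over 50 hypothetical bands.

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov distance between empirical CDFs."""
    grid = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / a.size
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / b.size
    return float(np.max(np.abs(cdf_a - cdf_b)))

rng = np.random.default_rng(3)
n_pixels, n_bands = 180, 50

def synthetic_spectra(peak_band, n):
    bands = np.arange(n_bands)
    peaks = peak_band + rng.normal(0.0, 2.0, size=(n, 1))
    return np.exp(-0.01 * (bands - peaks) ** 2)

def shape_statistic(spectra):
    """Band index of peak reflectance: a crude spectral-shape proxy."""
    return np.argmax(spectra, axis=1).astype(float)

real = shape_statistic(synthetic_spectra(25, n_pixels))
good_sim = shape_statistic(synthetic_spectra(25, n_pixels))  # matched sim
bad_sim = shape_statistic(synthetic_spectra(35, n_pixels))   # mismatched sim

d_good = ks_statistic(real, good_sim)
d_bad = ks_statistic(real, bad_sim)
```

A well-matched simulation yields a small KS distance while a mismatched one yields a distance near 1, which is the sense in which the abstract's "closely matched" distributions validate the virtual scenes.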
The Australian Computational Earth Systems Simulator
NASA Astrophysics Data System (ADS)
Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.
2001-12-01
Numerical simulation of the physics and dynamics of the entire earth system offers an outstanding opportunity for advancing earth system science and technology but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government together with a consortium of Universities and research institutions have funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator or computational virtual earth will provide the research infrastructure to the Australian earth systems science community required for simulations of dynamical earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete elements/lattice solid model, particle-in-cell large deformation finite-element method, stress reconstruction models, multi-scale continuum models etc) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies. 
When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global scale dynamics and mineralisation processes, crustal scale processes including plate tectonics, mountain building, interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic behaviour of earth systems. ACcESS represents a part of Australia's contribution to the APEC Cooperation for Earthquake Simulation (ACES) international initiative. Together with other national earth systems science initiatives including the Japanese Earth Simulator and US General Earthquake Model projects, ACcESS aims to provide a driver for scientific advancement and technological breakthroughs including: quantum leaps in understanding of earth evolution at global, crustal, regional and microscopic scales; new knowledge of the physics of crustal fault systems required to underpin the grand challenge of earthquake prediction; new understanding and predictive capabilities of geological processes such as tectonics and mineralisation.
Structural and congenital heart disease interventions: the role of three-dimensional printing.
Meier, L M; Meineri, M; Qua Hiansen, J; Horlick, E M
2017-02-01
Advances in catheter-based interventions in structural and congenital heart disease have mandated an increased demand for three-dimensional (3D) visualisation of complex cardiac anatomy. Despite progress in 3D imaging modalities, the pre- and periprocedural visualisation of spatial anatomy is relegated to two-dimensional flat screen representations. 3D printing is an evolving technology based on the concept of additive manufacturing, where computerised digital surface renders are converted into physical models. Printed models replicate complex structures in tangible forms that cardiovascular physicians and surgeons can use for education, preprocedural planning and device testing. In this review we discuss the different steps of the 3D printing process, which include image acquisition, segmentation, printing methods and materials. We also examine the expanded applications of 3D printing in the catheter-based treatment of adult patients with structural and congenital heart disease while highlighting the current limitations of this technology in terms of segmentation, model accuracy and dynamic capabilities. Furthermore, we provide information on the resources needed to establish a hospital-based 3D printing laboratory.
NASA Astrophysics Data System (ADS)
Gebregiorgis, A. S.; Peters-Lidard, C. D.; Tian, Y.; Hossain, F.
2011-12-01
Hydrologic modeling has benefited from operational production of high resolution satellite rainfall products. The global coverage, near-real time availability, and spatial and temporal sampling resolutions have advanced the application of physically based semi-distributed and distributed hydrologic models for a wide range of environmental decision-making processes. Despite these successes, uncertainties arising from the indirect nature of satellite rainfall estimation and from the hydrologic models themselves remain a challenge to making meaningful predictions. This study comprises breaking the total satellite rainfall error down into three independent components (hit bias, missed precipitation and false alarm), characterizing them as a function of land use and land cover (LULC), and tracing back the source of simulated soil moisture and runoff error in a physically based distributed hydrologic model. Here, we asked: in what way do the three independent components of the total bias (hit bias, missed precipitation, and false alarms) affect the estimation of soil moisture and runoff in physically based hydrologic models? To address this question, we implemented a systematic approach, characterizing and decomposing the total satellite rainfall error as a function of land use and land cover in the Mississippi basin. This helps identify the major sources of soil moisture and runoff error in hydrologic model simulation and trace that information back to algorithm development and sensor type, which will ultimately help improve the algorithms and improve applications and data assimilation for GPM in the future. For forest, woodland and human land use systems, the soil moisture error was mainly dictated by the total bias for the 3B42-RT, CMORPH, and PERSIANN products. On the other hand, the runoff error was dominated by the hit bias rather than the total bias. 
This difference occurred because of missed precipitation, which is a major contributor to the total bias in both the summer and winter seasons. Missed precipitation, most likely light rain and rain over snow cover, has a significant effect on soil moisture but is less capable of producing runoff, so the runoff error depends on the hit bias only.
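The three-component error decomposition described in this abstract can be sketched directly from paired satellite and reference rain series. The sign convention below (missed precipitation counted as a negative contribution to total bias, false alarms as positive) is an assumption consistent with the description, and the variable names are illustrative:

```python
def decompose_rainfall_error(sat, ref, thresh=0.0):
    """Split total satellite rainfall bias into hit bias, missed
    precipitation and false alarms, given paired time series of
    satellite (sat) and reference (ref) rain rates and a rain/no-rain
    threshold. Illustrative sketch of the decomposition, not the
    study's operational code.
    """
    hit = miss = false = 0.0
    for s, r in zip(sat, ref):
        s_rain, r_rain = s > thresh, r > thresh
        if s_rain and r_rain:
            hit += s - r          # hit bias: error when both detect rain
        elif r_rain:
            miss += r             # missed precipitation (underestimation)
        elif s_rain:
            false += s            # false alarm (overestimation)
    total = hit - miss + false    # identity: total bias = hit - missed + false
    return {"hit_bias": hit, "missed": miss, "false_alarm": false, "total_bias": total}
```

Aggregating these components per land use/land cover class gives the kind of LULC-conditioned characterization the study describes.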
ERIC Educational Resources Information Center
Singh, Gurmukh
2012-01-01
The present article is primarily targeted at advanced college/university undergraduate students of chemistry/physics education, computational physics/chemistry, and computer science. Recent software, MS Visual Studio .NET 2010, is employed to perform computer simulations for modeling Bohr's quantum theory of…
Advances in the physics basis for the European DEMO design
NASA Astrophysics Data System (ADS)
Wenninger, R.; Arbeiter, F.; Aubert, J.; Aho-Mantila, L.; Albanese, R.; Ambrosino, R.; Angioni, C.; Artaud, J.-F.; Bernert, M.; Fable, E.; Fasoli, A.; Federici, G.; Garcia, J.; Giruzzi, G.; Jenko, F.; Maget, P.; Mattei, M.; Maviglia, F.; Poli, E.; Ramogida, G.; Reux, C.; Schneider, M.; Sieglin, B.; Villone, F.; Wischmeier, M.; Zohm, H.
2015-06-01
In the European fusion roadmap, ITER is followed by a demonstration fusion power reactor (DEMO), for which a conceptual design is under development. This paper reports the first results of a coherent effort to develop the relevant physics knowledge for that (DEMO Physics Basis), carried out by European experts. The program currently includes investigations in the areas of scenario modeling, transport, MHD, heating & current drive, fast particles, plasma wall interaction and disruptions.
Non-collinear Generation of Angularly Isolated Circularly Polarized High Harmonics
2015-09-21
collinear HHG using both intuitive physical models as well as advanced numerical calculations. In the photon picture (Fig. 1b), we show that the NCP…
A model to advance nursing science in trauma practice and injury outcomes research.
Richmond, Therese S; Aitken, Leanne M
2011-12-01
This discussion paper reports development of a model to advance nursing science and practice in trauma care based on an analysis of the literature and expert opinion. The continuum of clinical care provided to trauma patients extends from the time of injury through to long-term recovery and final outcomes. Nurses bring a unique expertise to meet the complex physical and psychosocial needs of trauma patients and their families to influence outcomes across this entire continuum. Literature was obtained by searching CINAHL, PubMed and OvidMedline databases for 1990-2010. Search terms included trauma, nursing, scope of practice and role, with results restricted to those published in English. Manual searches of relevant journals and websites were undertaken. Core concepts in this trauma outcomes model include environment, person/family, structured care settings, long-term outcomes and nursing interventions. The relationships between each of these concepts extend across all phases of care. Intermediate outcomes are achieved in each phase of care and influence, and are congruent with, long-term outcomes. Implications for policy and practice: this model is intended to provide a framework to assist trauma nurses and researchers to consider the injured person in the context of the social, economic, cultural and physical environment from which they come and the long-term goals that each person has during recovery. The entire model requires testing in research and assessment of its practical contribution to practice. Planning and integrating care across the trauma continuum and recognition of the role of the injured person's background, family and resources will lead to improved long-term outcomes. © 2011 Blackwell Publishing Ltd.
Recent Advances in Model-Assisted Probability of Detection
NASA Technical Reports Server (NTRS)
Thompson, R. Bruce; Brasche, Lisa J.; Lindgren, Eric; Swindell, Paul; Winfree, William P.
2009-01-01
The increased role played by probability of detection (POD) in structural integrity programs, combined with the significant time and cost associated with the purely empirical determination of POD, provides motivation for alternate means to estimate this important metric of NDE techniques. One approach to make the process of POD estimation more efficient is to complement limited empirical experiments with information from physics-based models of the inspection process or controlled laboratory experiments. The Model-Assisted Probability of Detection (MAPOD) Working Group was formed by the Air Force Research Laboratory, the FAA Technical Center, and NASA to explore these possibilities. Since the 2004 inception of the MAPOD Working Group, 11 meetings have been held in conjunction with major NDE conferences. This paper will review the accomplishments of this group, which includes over 90 members from around the world. Included will be a discussion of strategies developed to combine physics-based and empirical understanding, draft protocols that have been developed to guide application of the strategies, and demonstrations that have been or are being carried out in a number of countries. The talk will conclude with a discussion of future directions, which will include documentation of benefits via case studies, development of formal protocols for engineering practice, as well as a number of specific technical issues.
Review of the Scientific Understanding of Radioactive Waste at the U.S. DOE Hanford Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peterson, Reid A.; Buck, Edgar C.; Chun, Jaehun
This paper reviews the origin and chemical and rheological complexity of radioactive waste at the U.S. Department of Energy’s Hanford Site. The waste, stored in underground tanks, was generated via three distinct processes over decades of plutonium extraction operations. Although close records were kept of original waste disposition, tank-to-tank transfers and conditions that impede equilibrium complicate our understanding of the chemistry, phase composition, and rheology of the waste. Tank waste slurries comprise particles and aggregates from nano to micron scales, with varying densities, morphologies, heterogeneous compositions, and complicated responses to flow regimes and process conditions. Further, remnant or changing radiation fields may affect the stability and rheology of the waste. These conditions pose challenges for transport through conduits or pipes to treatment plants for vitrification. Additionally, recalcitrant boehmite degrades glass quality and must be reduced prior to vitrification, but dissolves much more slowly than predicted given surface normalized rates. Existing empirical models based on ex situ experiments and observations lack true predictive capabilities. Recent advances in in situ microscopy, aberration corrected TEM, theoretical modeling across scales, and experimental methods for probing the physics and chemistry at mineral-fluid and mineral-mineral interfaces are being implemented to build robustly predictive physics-based models.
A pervasive visual-haptic framework for virtual delivery training.
Abate, Andrea F; Acampora, Giovanni; Loia, Vincenzo; Ricciardi, Stefano; Vasilakos, Athanasios V
2010-03-01
Thanks to advances in virtual reality (VR) technologies and haptic systems, virtual simulators are increasingly becoming a viable alternative to physical simulators in medicine and surgery, though many challenges still remain. In this study, a pervasive visual-haptic framework aimed at training obstetricians and midwives in vaginal delivery is described. The haptic feedback is provided by means of two hand-based haptic devices able to reproduce force feedback on fingers and arms, thus enabling much more realistic manipulation with respect to stylus-based solutions. The interactive simulation is not solely driven by an approximated model of complex forces and physical constraints but, instead, is approached by formal modeling of the whole labor and of the assistance/intervention procedures, performed by means of a timed automata network and applied to a parametric 3-D model of the anatomy able to mimic a wide range of configurations. This novel methodology is able not only to represent the sequence of the main events associated with either a spontaneous or an operative childbirth process, but also to help validate the manual intervention, as the actions performed by the user during the simulation are evaluated according to established medical guidelines. A discussion of the first results, as well as of the challenges still unaddressed, is included.
University Research in Support of TREAT Modeling and Simulation, FY 2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeHart, Mark David
Idaho National Laboratory is currently evolving the modeling and simulation (M&S) capability that will enable improved core operation as well as design and analysis of TREAT experiments. This M&S capability primarily uses MAMMOTH, a reactor physics application being developed under the Multi-physics Object Oriented Simulation Environment (MOOSE) framework. MAMMOTH allows the coupling of a number of other MOOSE-based applications. In support of this research, INL is working with four universities to explore advanced solution methods that will complement or augment capabilities in MAMMOTH. This report consists of a collection of year-end summaries of research from the universities performed in support of TREAT modeling and simulation. This research was led by Prof. Sedat Goluoglu at the University of Florida, Profs. Jim Morel and Jean Ragusa at Texas A&M University, Profs. Benoit Forget and Kord Smith at Massachusetts Institute of Technology, Prof. Leslie Kerby of Idaho State University and Prof. Barry Ganapol of University of Arizona. A significant number of students were supported at various levels through the projects and, for some, also as interns at INL.
Surface Rupture Effects on Earthquake Moment-Area Scaling Relations
NASA Astrophysics Data System (ADS)
Luo, Yingdi; Ampuero, Jean-Paul; Miyakoshi, Ken; Irikura, Kojiro
2017-09-01
Empirical earthquake scaling relations play a central role in fundamental studies of earthquake physics and in current practice of earthquake hazard assessment, and are being refined by advances in earthquake source analysis. A scaling relation between seismic moment (M0) and rupture area (A) currently in use for ground motion prediction in Japan features a transition regime of the form M0 ∝ A^2, between the well-recognized small (self-similar) and very large (W-model) earthquake regimes, which has counter-intuitive attributes and uncertain theoretical underpinnings. Here, we investigate the mechanical origin of this transition regime via earthquake cycle simulations, analytical dislocation models and numerical crack models on strike-slip faults. We find that, even if stress drop is assumed constant, the properties of the transition regime are controlled by surface rupture effects, comprising an effective rupture elongation along-dip due to a mirror effect and systematic changes of the shape factor relating slip to stress drop. Based on this physical insight, we propose a simplified formula to account for these effects in M0-A scaling relations for strike-slip earthquakes.
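The three scaling regimes described in this abstract (self-similar M0 ∝ A^1.5, a transition regime M0 ∝ A^2, and a W-model regime M0 ∝ A) can be written as a continuous piecewise relation. The breakpoint areas `A1`, `A2` and the prefactor `C` below are purely illustrative, not the values used for ground motion prediction in Japan; the constants are chosen so the curve is continuous at the breakpoints.

```python
def moment_from_area(A, A1=400.0, A2=4000.0, C=1.0e7):
    """Illustrative piecewise moment-area scaling: M0 ~ A^1.5 for small
    (self-similar) events, M0 ~ A^2 in the transition regime, and
    M0 ~ A for very large (W-model) earthquakes. A in km^2; A1, A2, C
    are assumed values, matched at the breakpoints for continuity.
    """
    if A <= A1:
        return C * A ** 1.5
    k = C * A1 ** 1.5 / A1 ** 2        # match value at A1
    if A <= A2:
        return k * A ** 2
    C2 = k * A2 ** 2 / A2              # match value at A2
    return C2 * A
```

Plotting this on log-log axes reproduces the characteristic steepening (slope 2) between the slope-1.5 and slope-1 branches that the paper attributes to surface rupture effects.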
Reflector and Protections in a Sodium-cooled Fast Reactor: Modelling and Optimization
NASA Astrophysics Data System (ADS)
Blanchet, David; Fontaine, Bruno
2017-09-01
The ASTRID project (Advanced Sodium Technological Reactor for Industrial Demonstration) is a Generation IV nuclear reactor concept under development in France [1]. In this frame, studies are underway to optimize the radial reflectors and protections. Considering radial protections made of natural boron carbide, this study assesses the neutronic performance of MgO as the reference choice of reflector material, in comparison with other possible materials including a more conventional stainless steel. The analysis is based upon simplified 1-D and 2-D deterministic modelling of the reactor, providing simplified interfaces between core, reflector and protections. Such models allow examining detailed reaction rate distributions; they also provide physical insight into local spectral effects occurring at the Core-Reflector and Reflector-Protection interfaces.
NASA Technical Reports Server (NTRS)
Singh, Bhim S.
2003-01-01
NASA is preparing to undertake science-driven exploration missions. The NASA Exploration Team's vision is a cascade of stepping stones. Each stepping stone will build the technical capabilities needed for the next step with multi-use technologies and capabilities. An Agency-wide technology investment and development program is necessary to implement the vision. The NASA Exploration Team has identified a number of areas where significant advances are needed to overcome all engineering and medical barriers to the expansion of human space exploration beyond low-Earth orbit. Closed-loop life support systems and advanced propulsion and power technologies are among the areas requiring significant advances from the current state-of-the-art. Studies conducted by the National Academy of Science's National Research Council and workshops organized by NASA have shown that multiphase flow and phase change play a crucial role in many of these advanced technology concepts. Lack of understanding of multiphase flow, phase change, and interfacial phenomena in the microgravity environment has been a major hurdle. An understanding of multiphase flow and phase change in microgravity is, therefore, critical to advancing many of the technologies needed. Recognizing this, the Office of Biological and Physical Research (OBPR) has initiated a strategic research thrust to augment the ongoing fundamental research in the fluid physics and transport phenomena discipline with research especially aimed at understanding key multiphase flow related issues in propulsion, power, thermal control, and closed-loop advanced life support systems. A plan for integrated theoretical and experimental research that has the highest probability of providing data, predictive tools, and models needed by the systems developers to incorporate highly promising multiphase-based technologies is currently in preparation. This plan is being developed with inputs from the scientific community, NASA mission planners and industry personnel.
The fundamental research in multiphase flow and phase change in microgravity is aimed at developing better mechanistic understanding of pool boiling and ascertaining the effects of gravity on heat transfer and the critical heat flux. Flight experiments conducted in space have shown that nucleate pool boiling can be sustained under certain conditions in the microgravity environment. New space flight experiments are being developed to provide more quantitative information on pool boiling in microgravity. Ground-based investigations are also being conducted to develop mechanistic models for flow and pool boiling. An overview of the research plan and roadmap for the strategic research in multiphase flow and phase change, as well as research findings from the ongoing program, will be presented.
NASA Technical Reports Server (NTRS)
Schenker, Paul S. (Editor)
1992-01-01
Various papers on control paradigms and data structures in sensor fusion are presented. The general topics addressed include: decision models and computational methods, sensor modeling and data representation, active sensing strategies, geometric planning and visualization, task-driven sensing, motion analysis, models motivated by biology and psychology, decentralized detection and distributed decision, data fusion architectures, robust estimation of shapes and features, and application and implementation. Some of the individual subjects considered are: the Firefly experiment on neural networks for distributed sensor data fusion, manifold traversing as a model for learning control of autonomous robots, choice of coordinate systems for multiple sensor fusion, continuous motion using task-directed stereo vision, interactive and cooperative sensing and control for advanced teleoperation, knowledge-based imaging for terrain analysis, and physical and digital simulations for IVA robotics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cai, Jie; Kim, Donghun; Braun, James E.
It is important to have practical methods for constructing a good mathematical model of a building's thermal system for energy audits, retrofit analysis and advanced building controls, e.g. model predictive control. Identification approaches based on semi-physical model structures are popular in building science for those purposes. However, conventional gray-box identification approaches applied to thermal networks fail when significant unmeasured heat gains are present in the estimation data. Although this situation is common in practice, there has been little research tackling this issue in building science. This paper presents an overall identification approach to alleviate the influence of unmeasured disturbances, and hence to obtain improved gray-box building models. The approach was applied to an existing open-space building and its performance is demonstrated.
Effects of rehabilitation among patients with advanced cancer: a systematic review.
Salakari, Minna R J; Surakka, Tiina; Nurminen, Raija; Pylkkänen, Liisa
2015-05-01
In parallel with the rising incidence of cancer and improved treatment, there is a continuous increase in the number of patients living with cancer as a chronic condition. Many cancer patients experience long-term disability and require continuous oncological treatment, care and support. The aim of this review is to evaluate the most recent data on the effects of rehabilitation among patients with advanced cancer. A systematic review was conducted according to Fink's model. Only randomized controlled trials (RCTs) published in 2009-2014 were included. Medline/PubMed and Cochrane databases were searched; five groups of keywords were used. The articles were evaluated for outcome and methodological quality. Thirteen RCTs (1169 participants) were evaluated. Most studies were on the effects of physical exercise in patients with advanced cancer (N = 7). Physical exercise was associated with a significant improvement in general wellbeing and quality of life. Rehabilitation had positive effects on fatigue, general condition, mood, and coping with cancer. Rehabilitation is needed also among patients with advanced disease and in palliative care. Exercise improves physical performance and has positive effects on several other quality of life domains. More data and RCTs are needed, but current evidence gives an indication that rehabilitation is suitable and can be recommended for patients living with advanced cancer.
Computational Nanoelectronics and Nanotechnology at NASA ARC
NASA Technical Reports Server (NTRS)
Saini, Subhash; Kutler, Paul (Technical Monitor)
1998-01-01
Both physical and economic considerations indicate that the scaling era of CMOS will run out of steam around the year 2010. However, physical laws also indicate that it is possible to compute at a rate of a billion times present speeds with the expenditure of only one Watt of electrical power. NASA has long-term needs where ultra-small semiconductor devices are needed for critical applications: high performance, low power, compact computers for intelligent autonomous vehicles and Petaflop computing technology are some key examples. To advance the design, development, and production of future generation micro- and nano-devices, IT Modeling and Simulation Group has been started at NASA Ames with a goal to develop an integrated simulation environment that addresses problems related to nanoelectronics and molecular nanotechnology. Overview of nanoelectronics and nanotechnology research activities being carried out at Ames Research Center will be presented. We will also present the vision and the research objectives of the IT Modeling and Simulation Group including the applications of nanoelectronic based devices relevant to NASA missions.
Virtual suturing simulation based on commodity physics engine for medical learning.
Choi, Kup-Sze; Chan, Sze-Ho; Pang, Wai-Man
2012-06-01
Development of virtual-reality medical applications is usually a complicated and labour intensive task. This paper explores the feasibility of using commodity physics engine to develop a suturing simulator prototype for manual skills training in the fields of nursing and medicine, so as to enjoy the benefits of rapid development and hardware-accelerated computation. In the prototype, spring-connected boxes of finite dimension are used to simulate soft tissues, whereas needle and thread are modelled with chained segments. Spherical joints are used to simulate suture's flexibility and to facilitate thread cutting. An algorithm is developed to simulate needle insertion and thread advancement through the tissue. Two-handed manipulations and force feedback are enabled with two haptic devices. Experiments on the closure of a wound show that the prototype is able to simulate suturing procedures at interactive rates. The simulator is also used to study a curvature-adaptive suture modelling technique. Issues and limitations of the proposed approach and future development are discussed.
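The spring-connected boxes used to simulate soft tissue in this prototype rest on a standard damped-spring force law applied per connection. A one-dimensional sketch is shown below; the function name, parameters, and the 1-D simplification are illustrative, not taken from the simulator's implementation:

```python
def damped_spring_force(xa, xb, va, vb, rest_len, k, damping):
    """Force on particle a from a damped spring connecting it to
    particle b (1-D for brevity). A physics engine applies this per
    connection between the tissue boxes; k is stiffness, damping the
    velocity-proportional damping coefficient. Illustrative values only.
    """
    stretch = (xb - xa) - rest_len    # positive when spring is extended
    rel_vel = vb - va                 # relative velocity along the spring
    return k * stretch + damping * rel_vel
```

At the rest length with no relative motion the force vanishes; stretching the spring pulls particle a toward b, which is what lets the box lattice behave like deformable tissue under needle and thread loads.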
Efficient physics-based tracking of heart surface motion for beating heart surgery robotic systems.
Bogatyrenko, Evgeniya; Pompey, Pascal; Hanebeck, Uwe D
2011-05-01
Tracking of beating heart motion in a robotic surgery system is required for complex cardiovascular interventions. A heart surface motion tracking method is developed, including a stochastic physics-based heart surface model and an efficient reconstruction algorithm. The algorithm uses the constraints provided by the model, which exploits the physical characteristics of the heart. The main advantage of the model is that it is more realistic than most standard heart models. Additionally, no explicit matching between the measurements and the model is required. The application of meshless methods significantly reduces the complexity of physics-based tracking. Based on the stochastic physical model of the heart surface, this approach considers the motion of the intervention area and is robust to occlusions and reflections. The tracking algorithm is evaluated in simulations and experiments on an artificial heart. Providing higher accuracy than the standard model-based methods, it successfully copes with occlusions and provides high performance even when not all measurements are available. Combining the physical and stochastic description of the heart surface motion ensures physically correct and accurate prediction. Automatic initialization of the physics-based cardiac motion tracking enables system evaluation in a clinical environment.
NASA Astrophysics Data System (ADS)
Dong, L.
2017-12-01
Abstract: The original urban surface structure has changed substantially because of rapid urbanization, and the impermeable area has increased considerably, putting great pressure on city flood control and drainage. The Songmushan reservoir basin, which has a high degree of urbanization, is taken as an example. Landsat pixels are decomposed with a linear spectral mixture model, and the proportion of urban area in each pixel is taken as its impervious rate. Based on impervious rate data from before and after urbanization, a physically based distributed hydrological model, the Liuxihe Model, is used to simulate the hydrological processes. The research shows that flood forecasting for this highly urbanized area with the Liuxihe Model performs well and can meet the accuracy requirements of city flood control and drainage. The increase in impervious area speeds up the confluence and increases the peak flow. It also advances the time of the peak flow and increases the runoff coefficient. Key words: Liuxihe Model; Impervious rate; City flood control and drainage; Urbanization; Songmushan reservoir basin
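The per-pixel impervious rate described in this abstract comes from linear spectral unmixing. A minimal two-endmember version (urban versus pervious) has a closed-form least-squares solution, sketched below; the actual study may use more endmembers and constrained solvers, so this is an illustrative simplification:

```python
def impervious_fraction(pixel, urban, pervious):
    """Two-endmember linear spectral mixture model: find the urban
    (impervious) fraction f minimizing
    ||pixel - (f*urban + (1-f)*pervious)||, clipped to [0, 1].
    Each argument is a per-band reflectance sequence. Illustrative
    two-endmember simplification of the unmixing in the study.
    """
    num = sum((p - v) * (u - v) for p, u, v in zip(pixel, urban, pervious))
    den = sum((u - v) ** 2 for u, v in zip(urban, pervious))
    f = num / den
    return min(1.0, max(0.0, f))
```

A pixel whose spectrum matches the urban endmember yields f = 1, a purely pervious pixel yields f = 0, and mixtures fall in between, giving the impervious rate map fed to the hydrological model.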
NASA Astrophysics Data System (ADS)
Shubitidze, Fridon; Barrowes, Benjamin E.; Shamatava, Irma; Sigman, John; O'Neill, Kevin A.
2018-05-01
Processing electromagnetic induction signals from subsurface targets, for purposes of discrimination, requires accurate physical models. To date, successful approaches for on-land cases have entailed advanced modeling of responses by the targets themselves, with quite adequate treatment of instruments as well. Responses from the environment were typically slight and/or were treated very simply. When objects are immersed in saline solutions, however, more sophisticated modeling of the diffusive EMI physics in the environment is required. One needs to account for the response of the environment itself as well as the environment's frequency and time-dependent effects on both primary and secondary fields, from sensors and targets, respectively. Here we explicate the requisite physics and identify its effects quantitatively via analytical, numerical, and experimental investigations. Results provide a path for addressing the quandaries posed by previous underwater measurements and indicate how the environmental physics may be included in more successful processing.
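The frequency dependence of the saline environment is governed by the electromagnetic skin depth, which shrinks as frequency rises; a minimal sketch, assuming a nominal seawater conductivity of 4 S/m (an illustrative value, not taken from the paper):

```python
import math

MU0 = 4e-7 * math.pi   # vacuum permeability (H/m)

def skin_depth(freq_hz, sigma=4.0, mu=MU0):
    """Electromagnetic skin depth (m): delta = sqrt(2 / (mu * sigma * omega)).

    sigma = 4 S/m is an assumed nominal seawater conductivity; for dry soil
    (~0.01 S/m) the skin depth is roughly 20x larger at the same frequency.
    """
    omega = 2.0 * math.pi * freq_hz
    return math.sqrt(2.0 / (mu * sigma * omega))

# Over a typical broadband EMI sweep the environment attenuates the
# high-frequency content of both primary and secondary fields the most.
depths = {f: skin_depth(f) for f in (30.0, 3e3, 300e3)}
```

The delta ∝ f^(-1/2) scaling is why the environment's response cannot be treated as negligible underwater: the sensor-target standoff becomes comparable to the skin depth at the upper end of the band.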
Virtual Engineering and Science Team - Reusable Autonomy for Spacecraft Subsystems
NASA Technical Reports Server (NTRS)
Bailin, Sidney C.; Johnson, Michael A.; Rilee, Michael L.; Truszkowski, Walt; Thompson, Bryan; Day, John H. (Technical Monitor)
2002-01-01
In this paper we address the design, development, and evaluation of the Virtual Engineering and Science Team (VEST) tool - a revolutionary way to achieve onboard subsystem/instrument autonomy. VEST directly addresses the technology needed for advanced autonomy enablers for spacecraft subsystems. It will significantly support the efficient and cost-effective realization of on-board autonomy and contribute directly to realizing the concept of an intelligent autonomous spacecraft. VEST will support the evolution of a subsystem/instrument model that is provably correct and, from that model, the automatic generation of the code needed to support the autonomous operation of what was modeled. VEST will directly support the integration of the efforts of engineers, scientists, and software technologists. This integration of efforts will be a significant advancement over the way things are currently accomplished. The model, developed through the use of VEST, will be the basis for the physical construction of the subsystem/instrument, and the generated code will support its autonomous operation once in space. The close coupling between the model and the code, in the same tool environment, will help ensure that correct and reliable operational control of the subsystem/instrument is achieved. VEST will provide a thoroughly modern interface that will allow users to easily and intuitively input subsystem/instrument requirements and visually get back the system's assessment of the correctness and compatibility of those inputs as the model evolves. User interface/interaction, logic, theorem proving, rule-based and model-based reasoning, and automatic code generation are some of the basic technologies that will be brought into play in realizing VEST.
US hospital-based direct access with radiology referral: an administrative case report.
Keil, Aaron; Brown, Suzanne Robben
2015-01-01
Legislative gains in the US allow physical therapists to function in expanded scopes of practice, including direct access and referral to specialists. The combination of direct access with privileges to order imaging studies directly offers a desirable practice status for many physical therapists, especially in musculoskeletal-focused settings. Although direct access is legal in all US jurisdictions, institution-based physical therapy settings have not embraced these practices. Cited barriers to implementing direct access with advanced practice include concerns over medical and administrative opposition, institutional policies, provider qualifications, and reimbursement. This administrative case report describes the process taken to allow therapists to see patients without a referral and to order diagnostic imaging studies at an academic medical center. Nine-month implementation results show 66 patients seen via direct access, with 15% referred for imaging studies. Claims submitted to 20 different insurance providers were reimbursed at 100%. While institutional regulations and reimbursement are reported as barriers to direct access, this report highlights the process one academic medical center used to implement direct access and advanced-practice radiology referral by updating policies and procedures, identifying advanced competencies, and communicating with necessary stakeholder groups. Favorable reimbursement for services is documented.
Understanding and predicting profile structure and parametric scaling of intrinsic rotation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, W. X.; Grierson, B. A.; Ethier, S.
2017-08-10
This study reports on a recent advance in developing physical understanding and a first-principles-based model for predicting intrinsic rotation profiles in magnetic fusion experiments. It is shown for the first time that turbulent fluctuation-driven residual stress (a non-diffusive component of momentum flux), along with diffusive momentum flux, can account for both the shape and magnitude of the observed intrinsic toroidal rotation profile. Both the turbulence intensity gradient and zonal flow E×B shear are identified as major contributors to the generation of the k∥-asymmetry needed for residual stress generation. The model predictions of core rotation based on global gyrokinetic simulations agree well with the experimental measurements of main-ion toroidal rotation for a set of DIII-D ECH discharges. The validated model is further used to investigate the characteristic dependence of residual stress and intrinsic rotation profile structure on the multi-dimensional parametric space covering the turbulence type, q-profile structure, and up-down asymmetry in magnetic geometry, with the goal of developing the physics understanding needed for rotation profile control and optimization. It is shown that in the flat-q-profile regime, intrinsic rotations driven by ITG and TEM turbulence are in opposite directions (i.e., intrinsic rotation reverses). The predictive model also produces reversed intrinsic rotation for plasmas with weak and normal shear q-profiles.
Modeling of the radiation belt magnetosphere in decisional timeframes
Koller, Josef; Reeves, Geoffrey D; Friedel, Reiner H.W.
2013-04-23
Systems and methods for calculating L* in the magnetosphere with essentially the same accuracy as a physics-based model, at many times the speed, by training a surrogate model to reproduce the physics-based model. The trained surrogate can then beneficially process input data falling within its training range. The surrogate model can be a feedforward neural network, and the physics-based model can be the TSK03 model. Operatively, the surrogate model can use parameters on which the physics-based model was based, and/or spatial data for the location where L* is to be calculated. Surrogate models should be provided for each of a plurality of pitch angles. Accordingly, a surrogate model having a closed drift shell can be used from the plurality of models. The feedforward neural network can have a plurality of input-layer units, there being at least one input-layer unit for each physics-based model parameter, a plurality of hidden-layer units, and at least one output unit for the value of L*.
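A minimal sketch of the surrogate idea (hypothetical throughout: a random-feature feedforward network fitted to a cheap stand-in function, not the patent's TSK03-trained network): sample the expensive model, fit the network's output weights by least squares, then evaluate the surrogate at matrix-multiply speed.

```python
import numpy as np

rng = np.random.default_rng(0)

def physics_model(x):
    """Cheap stand-in for the expensive physics-based model (illustrative)."""
    return np.sin(x[:, 0]) + 0.5 * x[:, 1] ** 2

# Feedforward surrogate: a fixed random tanh hidden layer plus linear output
# weights fitted by least squares to a batch of "expensive" model evaluations.
W = rng.normal(size=(2, 64))     # input-layer weights (one unit per parameter)
b = rng.normal(size=64)

def hidden(x):
    return np.tanh(x @ W + b)

X_train = rng.uniform(-2.0, 2.0, size=(500, 2))   # sampled model parameters
y_train = physics_model(X_train)                  # expensive model runs
beta, *_ = np.linalg.lstsq(hidden(X_train), y_train, rcond=None)

def surrogate(x):
    """Fast approximation: one matrix product and a tanh per query batch."""
    return hidden(x) @ beta
```

Fitting only the output layer keeps training deterministic and cheap; a fully trained multilayer network (as in the patent) follows the same sample-fit-evaluate pattern, and the caveat about querying only inside the training range applies equally.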
Soares, Wuber J. S.; Lima, Camila A.; Bilton, Tereza L.; Ferrioli, Eduardo; Dias, Rosângela C.; Perracini, Monica R.
2015-01-01
Objective: To investigate the relationship between self-perceived fatigue and different physical functioning tests and functional performance scales used for evaluating mobility-related disability among community-dwelling older persons. Method: This is a cross-sectional, population-based study. The sample was composed of persons aged 65 years or older living in Cuiabá, MT, and Barueri, SP, Brazil. The data for this study are from the FIBRA Network Study. The presence of self-perceived fatigue was assessed using self-reports based on the Center for Epidemiologic Studies-Depression Scale. The Lawton instrumental activities of daily living scale (IADL) and the advanced activities of daily living scale (AADL) were used to assess performance and participation restriction. The following physical functioning tests were used: the five-step test (FST), the Short Physical Performance Battery (SPPB), and usual gait speed (UGS). Three models of logistic regression analysis were conducted, with a significance level of α<0.05. Results: The sample was composed of 776 older adults with a mean age (SD) of 71.9 (5.9) years, the majority of whom were women (74%). The prevalence of self-perceived fatigue among the participants was 20%. After adjusting for covariates, SPPB, UGS, IADL, and AADL remained associated with self-perceived fatigue in the final multivariate regression model. Conclusion: Our results suggest an association between self-perceived fatigue and lower extremity function, usual gait speed, and activity limitation and participation restriction in older adults. Further cohort studies are needed to investigate which physical performance measures may be able to predict the negative impact of fatigue in older adults. PMID:26039035
NASA LWS Institute GIC Working Group: GIC science, engineering and applications readiness
NASA Astrophysics Data System (ADS)
Pulkkinen, A. A.; Thomson, A. W. P.; Bernabeu, E.
2016-12-01
In recognition of the rapidly growing interest in the topic, this paper is based on the findings of the first NASA Living With a Star (LWS) Institute Working Group specifically targeting the GIC issue. The new LWS Institutes program element was launched in 2014; the concept is built around small, working-group-style meetings that focus on well-defined problems demanding intense, direct interactions between colleagues in neighboring disciplines, to facilitate a deeper understanding of the variety of processes that link solar activity to Earth's environment. The LWS Institute Geomagnetically Induced Currents (GIC) Working Group (WG), led by A. Pulkkinen (NASA GSFC) and co-led by E. Bernabeu (PJM) and A. Thomson (BGS), was selected competitively as the pilot activity for the new LWS element. The GIC WG was tasked to 1) identify, advance, and address the open scientific and engineering questions pertaining to GIC, 2) advance predictive modeling of GIC, and 3) advocate and act as a catalyst to identify resources for addressing the multidisciplinary topic of GIC. In this paper, we address goal 1) of the GIC WG. More specifically, the goal of this paper is to review the current status of and future challenges pertaining to the science, engineering, and applications of the GIC problem. Science is understood here as the basic space and Earth sciences research that allows improved understanding and physics-based modeling of the physical processes behind GIC. Engineering, in turn, is understood here as the "impact" aspect of GIC: the impact includes any physical effects GIC may have on the performance of manmade infrastructure. Applications are understood as the models, tools, and activities that can provide actionable information to entities such as power system operators, for mitigating the effects of GIC, and government, for managing any potential consequences of GIC impact on critical infrastructure.
In this sense, applications can be considered as the ultimate goal of our GIC work and thus in assessing the status of the field, we specifically will quantify the readiness of various applications in the GIC effects mitigation context.
An atomic clock with 10^-18 instability.
Hinkley, N; Sherman, J A; Phillips, N B; Schioppo, M; Lemke, N D; Beloy, K; Pizzocaro, M; Oates, C W; Ludlow, A D
2013-09-13
Atomic clocks have been instrumental in science and technology, leading to innovations such as global positioning, advanced communications, and tests of fundamental constant variation. Timekeeping precision at 1 part in 10^18 enables new timing applications in relativistic geodesy, enhanced Earth- and space-based navigation and telescopy, and new tests of physics beyond the standard model. Here, we describe the development and operation of two optical lattice clocks, both using spin-polarized, ultracold atomic ytterbium. A measurement comparing these systems demonstrates an unprecedented atomic clock instability of 1.6 × 10^-18 after only 7 hours of averaging.
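Clock instability of this kind is conventionally quantified by the Allan deviation, which for white frequency noise averages down as τ^(-1/2); a minimal sketch on synthetic fractional-frequency data (illustrative, not the authors' analysis):

```python
import numpy as np

def allan_deviation(y, m):
    """Non-overlapping Allan deviation of fractional-frequency data y at
    averaging factor m (tau = m * tau0):

        sigma_y(tau) = sqrt(0.5 * <(ybar_{k+1} - ybar_k)^2>)
    """
    n = len(y) // m
    ybar = y[: n * m].reshape(n, m).mean(axis=1)   # tau-averaged frequencies
    return np.sqrt(0.5 * np.mean(np.diff(ybar) ** 2))

# For white frequency noise sigma_y(tau) falls as tau**-0.5, so quadrupling
# the averaging time roughly halves the Allan deviation.
rng = np.random.default_rng(1)
y = 1e-16 * rng.standard_normal(400_000)
adev_10, adev_40 = allan_deviation(y, 10), allan_deviation(y, 40)
```

The τ^(-1/2) averaging law is the reason 7 hours of data can push the comparison into the 10^-18 range even though the per-second instability is orders of magnitude larger.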
NASA Human Research Program Space Radiation Program Element
NASA Technical Reports Server (NTRS)
Chappell, Lori; Huff, Janice; Patel, Janapriya; Wang, Minli; Hu, Shaowwen; Kidane, Yared; Myung-Hee, Kim; Li, Yongfeng; Nounu, Hatem; Plante, Ianik;
2013-01-01
The goal of the NASA Human Research Program's Space Radiation Program Element is to ensure that crews can safely live and work in the space radiation environment. Current work is focused on developing the knowledge base and tools required for accurate assessment of health risks resulting from space radiation exposure including cancer and circulatory and central nervous system diseases, as well as acute risks from solar particle events. Division of Space Life Sciences (DSLS) Space Radiation Team scientists work at multiple levels to advance this goal, with major projects in biological risk research; epidemiology; and physical, biophysical, and biological modeling.
NASA Astrophysics Data System (ADS)
Aasi, J.; Abadie, J.; Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M.; Accadia, T.; Acernese, F.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Ajith, P.; Allen, B.; Allocca, A.; Amador Ceron, E.; Amariutei, D.; Anderson, S. B.; Anderson, W. G.; Arai, K.; Araya, M. C.; Ast, S.; Aston, S. M.; Astone, P.; Atkinson, D.; Aufmuth, P.; Aulbert, C.; Aylott, B. E.; Babak, S.; Baker, P.; Ballardin, G.; Ballmer, S.; Bao, Y.; Barayoga, J. C. B.; Barker, D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barton, M. A.; Bartos, I.; Bassiri, R.; Bastarrika, M.; Basti, A.; Batch, J.; Bauchrowitz, J.; Bauer, Th. S.; Bebronne, M.; Beck, D.; Behnke, B.; Bejger, M.; Beker, M. G.; Bell, A. S.; Bell, C.; Belopolski, I.; Benacquista, M.; Berliner, J. M.; Bertolini, A.; Betzwieser, J.; Beveridge, N.; Beyersdorf, P. T.; Bhadbade, T.; Bilenko, I. A.; Billingsley, G.; Birch, J.; Biswas, R.; Bitossi, M.; Bizouard, M. A.; Black, E.; Blackburn, J. K.; Blackburn, L.; Blair, D.; Bland, B.; Blom, M.; Bock, O.; Bodiya, T. P.; Bogan, C.; Bond, C.; Bondarescu, R.; Bondu, F.; Bonelli, L.; Bonnand, R.; Bork, R.; Born, M.; Boschi, V.; Bose, S.; Bosi, L.; Bouhou, B.; Braccini, S.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Breyer, J.; Briant, T.; Bridges, D. O.; Brillet, A.; Brinkmann, M.; Brisson, V.; Britzger, M.; Brooks, A. F.; Brown, D. A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Burguet–Castell, J.; Buskulic, D.; Buy, C.; Byer, R. L.; Cadonati, L.; Cagnoli, G.; Calloni, E.; Camp, J. B.; Campsie, P.; Cannon, K.; Canuel, B.; Cao, J.; Capano, C. D.; Carbognani, F.; Carbone, L.; Caride, S.; Caudill, S.; Cavaglià, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C.; Cesarini, E.; Chalermsongsak, T.; Charlton, P.; Chassande-Mottin, E.; Chen, W.; Chen, X.; Chen, Y.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Chow, J.; Christensen, N.; Chua, S. S. Y.; Chung, C. T. Y.; Chung, S.; Ciani, G.; Clara, F.; Clark, D. 
E.; Clark, J. A.; Clayton, J. H.; Cleva, F.; Coccia, E.; Cohadon, P.-F.; Colacino, C. N.; Colla, A.; Colombini, M.; Conte, A.; Conte, R.; Cook, D.; Corbitt, T. R.; Cordier, M.; Cornish, N.; Corsi, A.; Costa, C. A.; Coughlin, M.; Coulon, J.-P.; Couvares, P.; Coward, D. M.; Cowart, M.; Coyne, D. C.; Creighton, J. D. E.; Creighton, T. D.; Cruise, A. M.; Cumming, A.; Cunningham, L.; Cuoco, E.; Cutler, R. M.; Dahl, K.; Damjanic, M.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Dattilo, V.; Daudert, B.; Daveloza, H.; Davier, M.; Daw, E. J.; Dayanga, T.; De Rosa, R.; DeBra, D.; Debreczeni, G.; Degallaix, J.; Del Pozzo, W.; Dent, T.; Dergachev, V.; DeRosa, R.; Dhurandhar, S.; Di Fiore, L.; Di Lieto, A.; Di Palma, I.; Di Paolo Emilio, M.; Di Virgilio, A.; Díaz, M.; Dietz, A.; Donovan, F.; Dooley, K. L.; Doravari, S.; Dorsher, S.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Du, Z.; Dumas, J.-C.; Dwyer, S.; Eberle, T.; Edgar, M.; Edwards, M.; Effler, A.; Ehrens, P.; Endrőczi, G.; Engel, R.; Etzel, T.; Evans, K.; Evans, M.; Evans, T.; Factourovich, M.; Fafone, V.; Fairhurst, S.; Farr, B. F.; Farr, W. M.; Favata, M.; Fazi, D.; Fehrmann, H.; Feldbaum, D.; Feroz, F.; Ferrante, I.; Ferrini, F.; Fidecaro, F.; Finn, L. S.; Fiori, I.; Fisher, R. P.; Flaminio, R.; Foley, S.; Forsi, E.; Forte, L. A.; Fotopoulos, N.; Fournier, J.-D.; Franc, J.; Franco, S.; Frasca, S.; Frasconi, F.; Frede, M.; Frei, M. A.; Frei, Z.; Freise, A.; Frey, R.; Fricke, T. T.; Friedrich, D.; Fritschel, P.; Frolov, V. V.; Fujimoto, M.-K.; Fulda, P. J.; Fyffe, M.; Gair, J.; Galimberti, M.; Gammaitoni, L.; Garcia, J.; Garufi, F.; Gáspár, M. E.; Gelencser, G.; Gemme, G.; Genin, E.; Gennai, A.; Gergely, L. Á.; Ghosh, S.; Giaime, J. A.; Giampanis, S.; Giardina, K. D.; Giazotto, A.; Gil-Casanova, S.; Gill, C.; Gleason, J.; Goetz, E.; González, G.; Gorodetsky, M. L.; Goßler, S.; Gouaty, R.; Graef, C.; Graff, P. B.; Granata, M.; Grant, A.; Gray, C.; Greenhalgh, R. J. S.; Gretarsson, A. 
M.; Griffo, C.; Grote, H.; Grover, K.; Grunewald, S.; Guidi, G. M.; Guido, C.; Gupta, R.; Gustafson, E. K.; Gustafson, R.; Hallam, J. M.; Hammer, D.; Hammond, G.; Hanks, J.; Hanna, C.; Hanson, J.; Harms, J.; Harry, G. M.; Harry, I. W.; Harstad, E. D.; Hartman, M. T.; Haster, C.-J.; Haughian, K.; Hayama, K.; Hayau, J.-F.; Heefner, J.; Heidmann, A.; Heintze, M. C.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M. A.; Heng, I. S.; Heptonstall, A. W.; Herrera, V.; Heurs, M.; Hewitson, M.; Hild, S.; Hoak, D.; Hodge, K. A.; Holt, K.; Holtrop, M.; Hong, T.; Hooper, S.; Hough, J.; Howell, E. J.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh-Dinh, T.; Ingram, D. R.; Inta, R.; Isogai, T.; Ivanov, A.; Izumi, K.; Jacobson, M.; James, E.; Jang, Y. J.; Jaranowski, P.; Jesse, E.; Johnson, W. W.; Jones, D. I.; Jones, R.; Jonker, R. J. G.; Ju, L.; Kalmus, P.; Kalogera, V.; Kandhasamy, S.; Kang, G.; Kanner, J. B.; Kasprzack, M.; Kasturi, R.; Katsavounidis, E.; Katzman, W.; Kaufer, H.; Kaufman, K.; Kawabe, K.; Kawamura, S.; Kawazoe, F.; Keitel, D.; Kelley, D.; Kells, W.; Keppel, D. G.; Keresztes, Z.; Khalaidovski, A.; Khalili, F. Y.; Khazanov, E. A.; Kim, B. K.; Kim, C.; Kim, H.; Kim, K.; Kim, N.; Kim, Y. M.; King, P. J.; Kinzel, D. L.; Kissel, J. S.; Klimenko, S.; Kline, J.; Kokeyama, K.; Kondrashov, V.; Koranda, S.; Korth, W. Z.; Kowalska, I.; Kozak, D.; Kringel, V.; Krishnan, B.; Królak, A.; Kuehn, G.; Kumar, P.; Kumar, R.; Kurdyumov, R.; Kwee, P.; Lam, P. K.; Landry, M.; Langley, A.; Lantz, B.; Lastzka, N.; Lawrie, C.; Lazzarini, A.; Le Roux, A.; Leaci, P.; Lee, C. H.; Lee, H. K.; Lee, H. M.; Leong, J. R.; Leonor, I.; Leroy, N.; Letendre, N.; Lhuillier, V.; Li, J.; Li, T. G. F.; Lindquist, P. E.; Litvine, V.; Liu, Y.; Liu, Z.; Lockerbie, N. A.; Lodhia, D.; Logue, J.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J.; Lubinski, M.; Lück, H.; Lundgren, A. P.; Macarthur, J.; Macdonald, E.; Machenschalk, B.; MacInnis, M.; Macleod, D. 
M.; Mageswaran, M.; Mailand, K.; Majorana, E.; Maksimovic, I.; Malvezzi, V.; Man, N.; Mandel, I.; Mandic, V.; Mantovani, M.; Marchesoni, F.; Marion, F.; Márka, S.; Márka, Z.; Markosyan, A.; Maros, E.; Marque, J.; Martelli, F.; Martin, I. W.; Martin, R. M.; Marx, J. N.; Mason, K.; Masserot, A.; Matichard, F.; Matone, L.; Matzner, R. A.; Mavalvala, N.; Mazzolo, G.; McCarthy, R.; McClelland, D. E.; McGuire, S. C.; McIntyre, G.; McIver, J.; Meadors, G. D.; Mehmet, M.; Meier, T.; Melatos, A.; Melissinos, A. C.; Mendell, G.; Menéndez, D. F.; Mercer, R. A.; Meshkov, S.; Messenger, C.; Meyer, M. S.; Miao, H.; Michel, C.; Milano, L.; Miller, J.; Minenkov, Y.; Mingarelli, C. M. F.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Moe, B.; Mohan, M.; Mohapatra, S. R. P.; Moraru, D.; Moreno, G.; Morgado, N.; Morgia, A.; Mori, T.; Morriss, S. R.; Mosca, S.; Mossavi, K.; Mours, B.; Mow–Lowry, C. M.; Mueller, C. L.; Mueller, G.; Mukherjee, S.; Mullavey, A.; Müller-Ebhardt, H.; Munch, J.; Murphy, D.; Murray, P. G.; Mytidis, A.; Nash, T.; Naticchioni, L.; Necula, V.; Nelson, J.; Neri, I.; Newton, G.; Nguyen, T.; Nishizawa, A.; Nitz, A.; Nocera, F.; Nolting, D.; Normandin, M. E.; Nuttall, L.; Ochsner, E.; O'Dell, J.; Oelker, E.; Ogin, G. H.; Oh, J. J.; Oh, S. H.; Oldenberg, R. G.; O'Reilly, B.; O'Shaughnessy, R.; Osthelder, C.; Ott, C. D.; Ottaway, D. J.; Ottens, R. S.; Overmier, H.; Owen, B. J.; Page, A.; Palladino, L.; Palomba, C.; Pan, Y.; Pankow, C.; Paoletti, F.; Paoletti, R.; Papa, M. A.; Parisi, M.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Pedraza, M.; Penn, S.; Perreca, A.; Persichetti, G.; Phelps, M.; Pichot, M.; Pickenpack, M.; Piergiovanni, F.; Pierro, V.; Pihlaja, M.; Pinard, L.; Pinto, I. M.; Pitkin, M.; Pletsch, H. J.; Plissi, M. V.; Poggiani, R.; Pöld, J.; Postiglione, F.; Poux, C.; Prato, M.; Predoi, V.; Prestegard, T.; Price, L. R.; Prijatelj, M.; Principe, M.; Privitera, S.; Prodi, G. A.; Prokhorov, L. 
G.; Puncken, O.; Punturo, M.; Puppo, P.; Quetschke, V.; Quitzow-James, R.; Raab, F. J.; Rabeling, D. S.; Rácz, I.; Radkins, H.; Raffai, P.; Rakhmanov, M.; Ramet, C.; Rankins, B.; Rapagnani, P.; Raymond, V.; Re, V.; Reed, C. M.; Reed, T.; Regimbau, T.; Reid, S.; Reitze, D. H.; Ricci, F.; Riesen, R.; Riles, K.; Roberts, M.; Robertson, N. A.; Robinet, F.; Robinson, C.; Robinson, E. L.; Rocchi, A.; Roddy, S.; Rodriguez, C.; Rodruck, M.; Rolland, L.; Rollins, J. G.; Romano, R.; Romie, J. H.; Rosińska, D.; Röver, C.; Rowan, S.; Rüdiger, A.; Ruggi, P.; Ryan, K.; Salemi, F.; Sammut, L.; Sandberg, V.; Sankar, S.; Sannibale, V.; Santamaría, L.; Santiago-Prieto, I.; Santostasi, G.; Saracco, E.; Sassolas, B.; Sathyaprakash, B. S.; Saulson, P. R.; Savage, R. L.; Schilling, R.; Schnabel, R.; Schofield, R. M. S.; Schulz, B.; Schutz, B. F.; Schwinberg, P.; Scott, J.; Scott, S. M.; Seifert, F.; Sellers, D.; Sentenac, D.; Sergeev, A.; Shaddock, D. A.; Shaltev, M.; Shapiro, B.; Shawhan, P.; Shoemaker, D. H.; Sidery, T. L.; Siemens, X.; Sigg, D.; Simakov, D.; Singer, A.; Singer, L.; Sintes, A. M.; Skelton, G. R.; Slagmolen, B. J. J.; Slutsky, J.; Smith, J. R.; Smith, M. R.; Smith, R. J. E.; Smith-Lefebvre, N. D.; Somiya, K.; Sorazu, B.; Speirits, F. C.; Sperandio, L.; Stefszky, M.; Steinert, E.; Steinlechner, J.; Steinlechner, S.; Steplewski, S.; Stochino, A.; Stone, R.; Strain, K. A.; Strigin, S. E.; Stroeer, A. S.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Sung, M.; Susmithan, S.; Sutton, P. J.; Swinkels, B.; Szeifert, G.; Tacca, M.; Taffarello, L.; Talukder, D.; Tanner, D. B.; Tarabrin, S. P.; Taylor, R.; ter Braack, A. P. M.; Thomas, P.; Thorne, K. A.; Thorne, K. S.; Thrane, E.; Thüring, A.; Titsler, C.; Tokmakov, K. V.; Tomlinson, C.; Toncelli, A.; Tonelli, M.; Torre, O.; Torres, C. V.; Torrie, C. I.; Tournefier, E.; Travasso, F.; Traylor, G.; Tse, M.; Ugolini, D.; Vahlbruch, H.; Vajente, G.; van den Brand, J. F. J.; Van Den Broeck, C.; van der Putten, S.; van Veggel, A. 
A.; Vass, S.; Vasuth, M.; Vaulin, R.; Vavoulidis, M.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, P. J.; Venkateswara, K.; Verkindt, D.; Vetrano, F.; Viceré, A.; Villar, A. E.; Vinet, J.-Y.; Vitale, S.; Vocca, H.; Vorvick, C.; Vyatchanin, S. P.; Wade, A.; Wade, L.; Wade, M.; Waldman, S. J.; Wallace, L.; Wan, Y.; Wang, M.; Wang, X.; Wanner, A.; Ward, R. L.; Was, M.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Welborn, T.; Wen, L.; Wessels, P.; West, M.; Westphal, T.; Wette, K.; Whelan, J. T.; Whitcomb, S. E.; White, D. J.; Whiting, B. F.; Wiesner, K.; Wilkinson, C.; Willems, P. A.; Williams, L.; Williams, R.; Willke, B.; Wimmer, M.; Winkelmann, L.; Winkler, W.; Wipf, C. C.; Wiseman, A. G.; Wittel, H.; Woan, G.; Wooley, R.; Worden, J.; Yablon, J.; Yakushin, I.; Yamamoto, H.; Yamamoto, K.; Yancey, C. C.; Yang, H.; Yeaton-Massey, D.; Yoshida, S.; Yvert, M.; Zadrożny, A.; Zanolin, M.; Zendri, J.-P.; Zhang, F.; Zhang, L.; Zhao, C.; Zotov, N.; Zucker, M. E.; Zweizig, J.
2013-09-01
Compact binary systems with neutron stars or black holes are one of the most promising sources for ground-based gravitational-wave detectors. Gravitational radiation encodes rich information about source physics; thus parameter estimation and model selection are crucial analysis steps for any detection candidate events. Detailed models of the anticipated waveforms enable inference on several parameters, such as component masses, spins, sky location and distance, that are essential for new astrophysical studies of these sources. However, accurate measurements of these parameters and discrimination of models describing the underlying physics are complicated by artifacts in the data, uncertainties in the waveform models and in the calibration of the detectors. Here we report such measurements on a selection of simulated signals added either in hardware or software to the data collected by the two LIGO instruments and the Virgo detector during their most recent joint science run, including a “blind injection” where the signal was not initially revealed to the collaboration. We exemplify the ability to extract information about the source physics on signals that cover the neutron-star and black-hole binary parameter space over the component mass range 1M⊙-25M⊙ and the full range of spin parameters. The cases reported in this study provide a snapshot of the status of parameter estimation in preparation for the operation of advanced detectors.
Advances in land modeling of KIAPS based on the Noah Land Surface Model
NASA Astrophysics Data System (ADS)
Koo, Myung-Seo; Baek, Sunghye; Seol, Kyung-Hee; Cho, Kyoungmi
2017-08-01
As of 2013, the Noah Land Surface Model (LSM) version 2.7.1 was implemented in a new global model being developed at the Korea Institute of Atmospheric Prediction Systems (KIAPS). This land surface scheme is further refined in two respects: by adding new physical processes and by updating surface input parameters. Thus, glacier land, sea ice, and snow cover are treated more realistically. Inconsistencies between the land surface and radiative processes in the amount of solar flux absorbed at ground level are rectified. In addition, new parameters are derived from 1-km land cover data, which had usually not been possible at a global scale. A land surface albedo/emissivity climatology is newly created using Moderate-Resolution Imaging Spectroradiometer (MODIS) satellite-based data and an adjusted parameterization. These updates have been applied to the KIAPS-developed model and generally provide a positive impact on near-surface weather forecasting.
Magnetic biosensors: Modelling and simulation.
Nabaei, Vahid; Chandrawati, Rona; Heidari, Hadi
2018-04-30
In the past few years, magnetoelectronics has emerged as a promising platform technology for biosensors that detect, identify, localise and manipulate a wide spectrum of biological, physical and chemical agents. The methods are based on detecting the magnetic field of a magnetically labelled biomolecule as it interacts with a complementary biomolecule bound to a magnetic field sensor. This Review presents various schemes of magnetic biosensor techniques from both the simulation and modelling and the analytical and numerical analysis points of view, together with their performance variations under magnetic fields in steady and nonstationary states. This is followed by magnetic sensor modelling and simulations using advanced multiphysics modelling software (e.g. the finite element method, FEM) and in-house tools. Furthermore, the outlook and future directions of modelling and simulation of magnetic biosensors across different technologies and materials are critically discussed. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
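As a minimal example of the kind of model such simulations start from (a sketch with an assumed bead moment, not taken from the Review): the magnetic label is commonly idealised as a point dipole, whose stray field at the sensor follows the dipole equation.

```python
import numpy as np

MU0 = 4e-7 * np.pi   # vacuum permeability (H/m)

def dipole_field(m, r):
    """Flux density (T) of a point dipole with moment m (A*m^2) at displacement r (m):

        B = mu0 / (4*pi) * (3*(m . rhat)*rhat - m) / |r|**3
    """
    m = np.asarray(m, dtype=float)
    r = np.asarray(r, dtype=float)
    d = np.linalg.norm(r)
    rhat = r / d
    return MU0 / (4.0 * np.pi) * (3.0 * np.dot(m, rhat) * rhat - m) / d ** 3

# Assumed moment of ~1e-15 A*m^2 for a micron-scale superparamagnetic label,
# evaluated 500 nm above the sensor surface (both values illustrative).
B = dipole_field([0.0, 0.0, 1e-15], [0.0, 0.0, 500e-9])
```

FEM packages solve the same magnetostatics over the full sensor geometry; the point-dipole closed form is the usual sanity check against which those numerical results are validated.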
Current advancements and challenges in soil-root interactions modelling
NASA Astrophysics Data System (ADS)
Schnepf, Andrea; Huber, Katrin; Abesha, Betiglu; Meunier, Felicien; Leitner, Daniel; Roose, Tiina; Javaux, Mathieu; Vanderborght, Jan; Vereecken, Harry
2015-04-01
Roots change their surrounding soil chemically, physically and biologically. This includes changes in soil moisture and solute concentration, the exudation of organic substances into the rhizosphere, increased growth of soil microorganisms, or changes in soil structure. The fate of water and solutes in the root zone is highly determined by these root-soil interactions. Mathematical models of soil-root systems in combination with non-invasive techniques able to characterize root systems are a promising tool to understand and predict the behaviour of water and solutes in the root zone. With respect to different fields of applications, predictive mathematical models can contribute to the solution of optimal control problems in plant resource efficiency. This may result in significant gains in productivity, efficiency and environmental sustainability in various land use activities. Major challenges include the coupling of model parameters of the relevant processes with the surrounding environment such as temperature, nutrient concentration or soil water content. A further challenge is the mathematical description of the different spatial and temporal scales involved. This includes in particular the branched structures formed by root systems or the external mycelium of mycorrhizal fungi. Here, reducing complexity as well as bridging between spatial scales is required. Furthermore, the combination of experimental and mathematical techniques may advance the field enormously. Here, the use of root system, soil and rhizosphere models is presented through a number of modelling case studies, including image based modelling of phosphate uptake by a root with hairs, model-based optimization of root architecture for phosphate uptake from soil, upscaling of rhizosphere models, modelling root growth in structured soil, and the effect of root hydraulic architecture on plant water uptake efficiency and drought resistance.
Current Advancements and Challenges in Soil-Root Interactions Modelling
NASA Astrophysics Data System (ADS)
Schnepf, A.; Huber, K.; Abesha, B.; Meunier, F.; Leitner, D.; Roose, T.; Javaux, M.; Vanderborght, J.; Vereecken, H.
2014-12-01
Roots change their surrounding soil chemically, physically and biologically. This includes changes in soil moisture and solute concentration, the exudation of organic substances into the rhizosphere, increased growth of soil microorganisms, or changes in soil structure. The fate of water and solutes in the root zone is highly determined by these root-soil interactions. Mathematical models of soil-root systems in combination with non-invasive techniques able to characterize root systems are a promising tool to understand and predict the behaviour of water and solutes in the root zone. With respect to different fields of applications, predictive mathematical models can contribute to the solution of optimal control problems in plant resource efficiency. This may result in significant gains in productivity, efficiency and environmental sustainability in various land use activities. Major challenges include the coupling of model parameters of the relevant processes with the surrounding environment such as temperature, nutrient concentration or soil water content. A further challenge is the mathematical description of the different spatial and temporal scales involved. This includes in particular the branched structures formed by root systems or the external mycelium of mycorrhizal fungi. Here, reducing complexity as well as bridging between spatial scales is required. Furthermore, the combination of experimental and mathematical techniques may advance the field enormously. Here, the use of root system, soil and rhizosphere models is presented through a number of modelling case studies, including image based modelling of phosphate uptake by a root with hairs, model-based optimization of root architecture for phosphate uptake from soil, upscaling of rhizosphere models, modelling root growth in structured soil, and the effect of root hydraulic architecture on plant water uptake efficiency and drought resistance.
A statistical approach for inferring the 3D structure of the genome.
Varoquaux, Nelle; Ay, Ferhat; Noble, William Stafford; Vert, Jean-Philippe
2014-06-15
Recent technological advances allow the measurement, in a single Hi-C experiment, of the frequencies of physical contacts among pairs of genomic loci at a genome-wide scale. The next challenge is to infer, from the resulting DNA-DNA contact maps, accurate 3D models of how chromosomes fold and fit into the nucleus. Many existing inference methods rely on multidimensional scaling (MDS), in which the pairwise distances of the inferred model are optimized to resemble pairwise distances derived directly from the contact counts. These approaches, however, often optimize a heuristic objective function and require strong assumptions about the biophysics of DNA to transform interaction frequencies into spatial distances, and may thereby lead to incorrect structure reconstruction. We propose a novel approach to infer a consensus 3D structure of a genome from Hi-C data. The method incorporates a statistical model of the contact counts, assuming that the counts between two loci follow a Poisson distribution whose intensity decreases with the physical distance between the loci. The method can automatically adjust the transfer function relating the spatial distance to the Poisson intensity and infer a genome structure that best explains the observed data. We compare two variants of our Poisson method, with or without optimization of the transfer function, to four different MDS-based algorithms (two metric MDS methods using different stress functions, a non-metric version of MDS, and ChromSDE, a recently described advanced MDS method) on a wide range of simulated datasets. We demonstrate that the Poisson models reconstruct better structures than all MDS-based methods, particularly at low coverage and high resolution, and we highlight the importance of optimizing the transfer function.
On publicly available Hi-C data from mouse embryonic stem cells, we show that the Poisson methods lead to more reproducible structures than MDS-based methods when we use data generated using different restriction enzymes, and when we reconstruct structures at different resolutions. A Python implementation of the proposed method is available at http://cbio.ensmp.fr/pastis. © The Author 2014. Published by Oxford University Press.
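The Poisson model described above can be sketched in a few lines: contact counts c_ij are modeled as Poisson with intensity β·d_ij^α, and the 3D coordinates are found by minimizing the negative log-likelihood. This is a minimal illustration with a fixed transfer function (α = -3, β = 1) and a generic optimizer, not the PASTIS implementation; the helix test data are invented for the example.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(params, counts, n, alpha=-3.0, beta=1.0):
    """Poisson NLL: counts[i, j] ~ Poisson(beta * d_ij ** alpha)."""
    X = params.reshape(n, 3)                       # candidate 3D coordinates
    diff = X[:, None, :] - X[None, :, :]
    d = np.sqrt((diff ** 2).sum(-1)) + 1e-9        # pairwise distances (regularized)
    iu = np.triu_indices(n, k=1)
    lam = beta * d[iu] ** alpha                    # Poisson intensity per locus pair
    return np.sum(lam - counts[iu] * np.log(lam))

# Toy data: simulate counts from a known helical structure, then refit.
rng = np.random.default_rng(0)
n = 20
t = np.linspace(0.0, 4.0 * np.pi, n)
X_true = np.c_[np.cos(t), np.sin(t), 0.1 * t]
d_true = np.linalg.norm(X_true[:, None] - X_true[None, :], axis=-1)
counts = np.zeros((n, n))
iu = np.triu_indices(n, k=1)
counts[iu] = rng.poisson(d_true[iu] ** -3.0)

x0 = rng.normal(size=3 * n)
res = minimize(neg_log_likelihood, x0, args=(counts, n), method="L-BFGS-B")
X_hat = res.x.reshape(n, 3)                        # inferred consensus structure
print(res.fun)
```

Optimizing the transfer function, as the paper emphasizes, would amount to treating α and β as additional free parameters of the same objective.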
NASA Technical Reports Server (NTRS)
Jankovsky, Robert S.; Jacobson, David T.; Rawlin, Vincent K.; Mason, Lee S.; Mantenieks, Maris A.; Manzella, David H.; Hofer, Richard R.; Peterson, Peter Y.
2001-01-01
NASA's Hall thruster program has base research and focused development efforts in support of the Advanced Space Transportation Program, Space-Based Program, and various other programs. The objective of the base research is to gain an improved understanding of the physical processes and engineering constraints of Hall thrusters to enable development of advanced Hall thruster designs. Specific technical questions that are current priorities of the base effort are: (1) How does thruster life vary with operating point? (2) How can thruster lifetime and wear rate be most efficiently evaluated? (3) What are the practical limitations for discharge voltage as it pertains to high specific impulse operation (high discharge voltage) and high thrust operation (low discharge voltage)? (4) What are the practical limits for extending Hall thrusters to very high input powers? and (5) What can be done during thruster design to reduce cost and integration concerns? The objective of the focused development effort is to develop a 50 kW-class Hall propulsion system, with a milestone of a 50 kW engineering model thruster/system by the end of program year 2006. Specific program year 2001 efforts, along with corporate and academic participation, are described.
Howard, Anita R.
2015-01-01
Oral health is managed based on objective measures such as the presence and severity of dental caries and periodontal disease. In recent years, oral health researchers and practitioners have shown increasing interest in a widened array of physical, psychological, and social factors found to influence patients’ oral health. In this article, we introduce a behavior change coaching approach that can be used to enhance psychosocial diagnosis and client-centered delivery of health-promoting interventions. Briefly, this health coaching approach is based on an interactive assessment (both physical and psychological), a non-judgmental exploration of patients’ knowledge, attitudes, and beliefs, a mapping of patient behaviors that may contribute to disease progression, gauging patient motivation, and tailoring health communication to encourage health-promoting behavior change. Developed in a clinical setting, this coaching model is supported by interdisciplinary theory, research, and practice on health behavior change. We suggest that, with supervision, this coaching process may be learned. PMID:26457237
A fully-implicit high-order system thermal-hydraulics model for advanced non-LWR safety analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Rui
2016-11-19
An advanced system analysis tool is being developed for advanced reactor safety analysis. This paper describes the underlying physics and numerical models used in the code, including the governing equations, the stabilization schemes, the high-order spatial and temporal discretization schemes, and the Jacobian-Free Newton-Krylov solution method. The effects of the spatial and temporal discretization schemes are investigated. Additionally, a series of verification test problems are presented to confirm the high-order schemes. Furthermore, it is demonstrated that the developed system thermal-hydraulics model can be strictly verified with the theoretical convergence rates, and that it performs very well for a wide range of flow problems with high accuracy, efficiency, and minimal numerical diffusion.
Biological materials by design.
Qin, Zhao; Dimas, Leon; Adler, David; Bratzel, Graham; Buehler, Markus J
2014-02-19
In this topical review we discuss recent advances in the use of physical insight into the way biological materials function, to design novel engineered materials 'from scratch', or from the level of fundamental building blocks upwards and by using computational multiscale methods that link chemistry to material function. We present studies that connect advances in multiscale hierarchical material structuring with material synthesis and testing, review case studies of wood and other biological materials, and illustrate how engineered fiber composites and bulk materials are designed, modeled, and then synthesized and tested experimentally. The integration of experiment and simulation in multiscale design opens new avenues to explore the physics of materials from a fundamental perspective, and using complementary strengths from models and empirical techniques. Recent developments in this field illustrate a new paradigm by which complex material functionality is achieved through hierarchical structuring in spite of simple material constituents.
Klassen, Tara D; Semrau, Jennifer A; Dukelow, Sean P; Bayley, Mark T; Hill, Michael D; Eng, Janice J
2017-09-01
Identifying practical ways to accurately measure exercise intensity and dose in clinical environments is essential to advancing stroke rehabilitation. This is especially relevant in monitoring walking activity during inpatient rehabilitation where recovery is greatest. This study evaluated the accuracy of a readily available consumer-based physical activity monitor during daily inpatient stroke rehabilitation physical therapy sessions. Twenty-one individuals admitted to inpatient rehabilitation were monitored for a total of 471 one-hour physical therapy sessions which consisted of walking and nonwalking therapeutic activities. Participants wore a consumer-based physical activity monitor (Fitbit One) and the gold standard for assessing step count (StepWatch Activity Monitor) during physical therapy sessions. Linear mixed modeling was used to assess the relationship of the step count of the Fitbit to the StepWatch Activity Monitor. Device accuracy is reported as the percent error of the Fitbit compared with the StepWatch Activity Monitor. A strong relationship (slope=0.99; 95% confidence interval, 0.97-1.01) was found between the number of steps captured by the Fitbit One and the StepWatch Activity Monitor. The Fitbit One had a mean error of 10.9% (5.3) for participants with walking velocities <0.4 m/s, 6.8% (3.0) for walking velocities between 0.4 and 0.8 m/s, and 4.4% (2.8) for walking velocities >0.8 m/s. This study provides preliminary evidence that the Fitbit One, when positioned on the nonparetic ankle, can accurately measure walking steps early after stroke during inpatient rehabilitation physical therapy sessions. URL: https://www.clinicaltrials.gov. Unique identifier: NCT01915368. © 2017 American Heart Association, Inc.
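The accuracy metric used above is straightforward to reproduce: absolute percent error of the device count against the gold-standard count, summarized within gait-speed bins. The session values below are invented for illustration; only the speed cut-points (0.4 and 0.8 m/s) come from the study.

```python
# Hypothetical sessions: (Fitbit steps, StepWatch steps, gait speed in m/s).
sessions = [
    (410, 455, 0.3),
    (620, 655, 0.6),
    (980, 1015, 0.9),
]

def percent_error(device, reference):
    """Absolute percent error of the device relative to the gold standard."""
    return abs(device - reference) / reference * 100.0

# Bin errors by the study's gait-speed cut-points (0.4 and 0.8 m/s).
bins = {"<0.4 m/s": [], "0.4-0.8 m/s": [], ">0.8 m/s": []}
for fitbit, stepwatch, speed in sessions:
    key = "<0.4 m/s" if speed < 0.4 else ("0.4-0.8 m/s" if speed <= 0.8 else ">0.8 m/s")
    bins[key].append(percent_error(fitbit, stepwatch))

for key, errs in bins.items():
    print(key, round(sum(errs) / len(errs), 1))
```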
ERIC Educational Resources Information Center
Chen, Lung Hung; Wu, Chia-Huei; Kee, Ying Hwa; Lin, Meng-Shyan; Shui, Shang-Hsueh
2009-01-01
In this study, the hierarchical model of achievement motivation [Elliot, A. J. (1997). Integrating the "classic" and "contemporary" approaches to achievement motivation: A hierarchical model of approach and avoidance achievement motivation. In P. Pintrich & M. Maehr (Eds.), "Advances in motivation and achievement"…
Advanced Chemical Modeling for Turbulent Combustion Simulations
2012-05-03
This final report on advanced chemical modeling for turbulent combustion simulations (PI: Heinz Pitsch, Stanford University, Mechanical Engineering / Flow Physics) addresses premixed combustion and the chemistry needed to predict the combustion characteristics of fuel oxidation and pollutant emissions from engines; the relevant fuel chemistry must be accurately modeled. The chemistry work proposes a method for defining jet fuel surrogates and describes how different sub-mechanisms can be incorporated.
2011-04-01
advanced ROMS-CoSiNE-Optics model in a full three-dimensional environment. Besides working closely with the modeling group at the NRL and their BioSpace project, we are collaborating with Dr. Curtis Mobley of Sequoia Scientific to implement...
Phenomenological Modeling of Infrared Sources: Recent Advances
NASA Technical Reports Server (NTRS)
Leung, Chun Ming; Kwok, Sun (Editor)
1993-01-01
Infrared observations from planned space facilities (e.g., ISO (Infrared Space Observatory), SIRTF (Space Infrared Telescope Facility)) will yield a large and uniform sample of high-quality data from both photometric and spectroscopic measurements. To maximize the scientific returns of these space missions, complementary theoretical studies must be undertaken to interpret these observations. A crucial step in such studies is the construction of phenomenological models in which we parameterize the observed radiation characteristics in terms of the physical source properties. In the last decade, models with increasing degree of physical realism (in terms of grain properties, physical processes, and source geometry) have been constructed for infrared sources. Here we review current capabilities available in the phenomenological modeling of infrared sources and discuss briefly directions for future research in this area.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merzari, E.; Yuan, Haomin; Kraus, A.
The NEAMS program aims to develop an integrated multi-physics simulation capability “pellet-to-plant” for the design and analysis of future generations of nuclear power plants. In particular, the Reactor Product Line code suite's multi-resolution hierarchy is being designed to ultimately span the full range of length and time scales present in relevant reactor design and safety analyses, as well as scale from desktop to petaflop computing platforms. Flow-induced vibration (FIV) is a widespread problem in energy systems because they rely on fluid movement for energy conversion. Vibrating structures may be damaged as fatigue or wear occurs. Given the importance of reliable components in the nuclear industry, flow-induced vibration has long been a major concern in the safety and operation of nuclear reactors. In particular, nuclear fuel rods and steam generators have been known to suffer from flow-induced vibration and related failures. Advanced reactors, such as integral Pressurized Water Reactors (PWRs) considered for Small Modular Reactors (SMRs), often rely on innovative component designs to meet cost and safety targets. One component that is the subject of advanced designs is the steam generator, some designs of which forego the usual shell-and-tube architecture in order to fit within the primary vessel. In addition to being more cost- and space-efficient, such steam generators need to be more reliable, since failure of the primary vessel represents a potential loss of coolant and a safety concern. A significant amount of data exists on flow-induced vibration in shell-and-tube heat exchangers, and heuristic methods are available to predict its occurrence based on a set of given assumptions. In contrast, advanced designs have far less data available.
Advanced modeling and simulation based on coupled structural and fluid simulations have the potential to predict flow-induced vibration in a variety of designs, reducing the need for expensive experimental programs, especially at the design stage. Over the past five years, the Reactor Product Line has developed the integrated multi-physics code suite SHARP. The goal of developing such a tool is to perform multi-physics neutronics, thermal/fluid, and structural mechanics modeling of the components inside the full reactor core or portions of it with a user-specified fidelity. In particular SHARP contains high-fidelity single-physics codes Diablo for structural mechanics and Nek5000 for fluid mechanics calculations. Both codes are state-of-the-art, highly scalable tools that have been extensively validated. These tools form a strong basis on which to build a flow-induced vibration modeling capability. In this report we discuss one-way coupled calculations performed with Nek5000 and Diablo aimed at simulating available FIV experiments in helical steam generators in the turbulent buffeting regime. In this regime one-way coupling is judged sufficient because the pressure loads do not cause substantial displacements. It is also the most common source of vibration in helical steam generators at the low flows expected in integral PWRs. The legacy data is obtained from two datasets developed at Argonne and B&W.
NASA Technical Reports Server (NTRS)
1989-01-01
The primary objective of the Center for Turbulence Research (CTR) is to stimulate and produce advances in physical understanding of turbulence, in turbulence modeling and simulation, and in turbulence control. Topics addressed include: fundamental modeling of turbulence; turbulence structure and control; transition and turbulence in high-speed compressible flows; and turbulent reacting flows.
Numerical Analyses for Low Reynolds Flow in a Ventricular Assist Device.
Lopes, Guilherme; Bock, Eduardo; Gómez, Luben
2017-06-01
Scientific and technological advances in blood pump development have been driven by their importance in cardiac patient treatment and in improving the quality of life of assisted people. To improve and optimize design and development, numerical tools were incorporated into the analyses of these mechanisms and have become indispensable to their advancement. This study analyzes flow behavior at low impeller Reynolds numbers, for which there is no consensus on the full development of turbulence in ventricular assist devices (VAD). For supporting analyses, computational numerical simulations were carried out in different scenarios with the same rotation speed. Two modeling approaches were applied: laminar flow, and turbulent flow with the standard, RNG, and realizable κ-ε models, the standard and SST κ-ω models, and the Spalart-Allmaras model. The results agree with the literature for VADs and with the range for transitional flows in stirred tanks, with an impeller Reynolds number around 2800 for the tested scenarios. The turbulence models were compared and, based on the expected physical behavior, the RNG and standard κ-ε, the standard and SST κ-ω, and the Spalart-Allmaras models are suggested for numerical analyses at low impeller Reynolds numbers in the tested flow scenarios. © 2016 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
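The impeller Reynolds number that delimits these regimes can be computed as Re = ρ·N·D²/μ, with N in revolutions per second. The pump diameter and operating speed below are hypothetical values chosen only to show that a blood-pump operating point can land near the ~2800 transitional range cited above.

```python
def impeller_reynolds(rho, n_rev_s, d_m, mu):
    """Impeller Reynolds number Re = rho * N * D**2 / mu (N in rev/s)."""
    return rho * n_rev_s * d_m ** 2 / mu

rho = 1050.0          # blood density, kg/m^3
mu = 3.5e-3           # blood dynamic viscosity, Pa*s
n = 2000.0 / 60.0     # hypothetical rotation speed: 2000 rpm, in rev/s
d = 0.017             # hypothetical impeller diameter, m
print(impeller_reynolds(rho, n, d, mu))  # ~2.9e3, near the transitional range
```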
Time-Dependent Cryospheric Longwave Surface Emissivity Feedback in the Community Earth System Model
NASA Astrophysics Data System (ADS)
Kuo, Chaincy; Feldman, Daniel R.; Huang, Xianglei; Flanner, Mark; Yang, Ping; Chen, Xiuhong
2018-01-01
Frozen and unfrozen surfaces exhibit different longwave surface emissivities with different spectral characteristics, and outgoing longwave radiation and cooling rates are reduced for unfrozen scenes relative to frozen ones. Here physically realistic modeling of spectrally resolved surface emissivity throughout the coupled model components of the Community Earth System Model (CESM) is advanced, and implications for model high-latitude biases and feedbacks are evaluated. It is shown that despite a surface emissivity feedback amplitude that is, at most, a few percent of the surface albedo feedback amplitude, the inclusion of realistic, harmonized longwave, spectrally resolved emissivity information in CESM1.2.2 reduces wintertime Arctic surface temperature biases from -7.2 ± 0.9 K to -1.1 ± 1.2 K, relative to observations. The bias reduction is most pronounced in the Arctic Ocean, a region for which Coupled Model Intercomparison Project version 5 (CMIP5) models exhibit the largest mean wintertime cold bias, suggesting that persistent polar temperature biases can be lessened by including this physically based process across model components. The ice emissivity feedback of CESM1.2.2 is evaluated under a warming scenario with a kernel-based approach, and it is found that emissivity radiative kernels exhibit water vapor and cloud cover dependence, thereby varying spatially and decreasing in magnitude over the course of the scenario from secular changes in atmospheric thermodynamics and cloud patterns. Accounting for the temporally varying radiative responses can yield diagnosed feedbacks that differ in sign from those obtained from conventional climatological feedback analysis methods.
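A kernel-based feedback diagnosis of the kind described above multiplies a radiative kernel (∂R/∂ε) by the change in surface emissivity and area-weights the result. All numbers below (kernel shape, emissivity change, latitude grid) are invented placeholders illustrating only the bookkeeping, not CESM's actual kernels.

```python
import numpy as np

lat = np.linspace(-90.0, 90.0, 37)                  # zonal-mean latitude grid
# Hypothetical longwave emissivity kernel (W m^-2 per unit emissivity change),
# peaking at the poles; sign and shape are placeholders.
K = -0.5 * np.exp(-((np.abs(lat) - 90.0) / 20.0) ** 2)
# Assumed emissivity change: sea-ice loss poleward of 60 degrees slightly
# lowers broadband surface emissivity (placeholder magnitude).
d_emis = np.where(np.abs(lat) > 60.0, -0.02, 0.0)

dR = K * d_emis                                     # zonal radiative response
w = np.cos(np.deg2rad(lat))                         # area weights
feedback = np.sum(dR * w) / np.sum(w)               # global-mean response
print(feedback)
```

The temporal dependence noted in the abstract would enter here as a kernel K that itself changes over the scenario, so the same Δε maps to a different radiative response.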
The thermal and mechanical stability of composite materials for space structures
NASA Technical Reports Server (NTRS)
Tompkins, S. S.; Sykes, G. F.; Bowles, D. E.
1985-01-01
A continuing research objective of the National Aeronautics and Space Administration (NASA) is to develop advanced composite materials for space structures. The thrust of this research is to gain a fundamental understanding of the performance of advanced composites in the space environment. The emphasis has been to identify and model changes in thermal-physical properties due to induced damage and to develop improved materials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zou, Ling; Berry, R. A.; Martineau, R. C.
The RELAP-7 code is the next-generation nuclear reactor system safety analysis code being developed at the Idaho National Laboratory (INL). The code is based on INL's modern scientific software development framework, MOOSE (Multi-Physics Object Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's and TRACE's capabilities and extends their analysis capabilities to all reactor system simulation scenarios. The RELAP-7 code utilizes the well-posed 7-equation two-phase flow model for compressible two-phase flow. Closure models used in the TRACE code have been reviewed and selected to reflect the progress made during the past decades and to provide a basis for the closure correlations implemented in the RELAP-7 code. This document provides a summary of the closure correlations that are currently implemented in the RELAP-7 code. The closure correlations include sub-grid models that describe interactions between the fluids and the flow channel, and interactions between the two phases.
Advanced Electronic Structures
1992-10-01
Physical Electronics Laboratory, SRI Project 2407. Prepared for: Office of Naval Research, 800 North Quincy Street...talk at the March 1992 meeting of the American Physical Society. The subject was the use of pressure as a new variable for testing the underlying...on the MIGS model. We intend to submit to Physical Review Letters, and are only waiting for Eike to complete a draft of the manuscript.
Challenges in process marginality for advanced technology nodes and tackling its contributors
NASA Astrophysics Data System (ADS)
Narayana Samy, Aravind; Schiwon, Roberto; Seltmann, Rolf; Kahlenberg, Frank; Katakamsetty, Ushasree
2013-10-01
Process margin is getting critical in the present node-shrinkage scenario due to the physical limits reached (Rayleigh's criterion) using ArF lithography tools. K1 is pushed to its limit for better resolution and to enhance the process margin (28nm metal patterning k1=0.31). In this paper, we give an overview of the various contributors in advanced technology nodes that limit process margins and how the challenges have been tackled in a modern foundry model. Advanced OPC algorithms are used to make the design content at the mask optimal for patterning. However, as we work at the physical limit, critical features (hot-spots) are very susceptible to litho process variations. Furthermore, etch can have a significant impact as well: patterns that still look healthy at litho can fail due to etch interactions. This makes the traditional 2D contour output from ORC tools unable to predict all defects accurately, and hence unable to fully correct them in the early mask tapeout phase. The above makes a huge difference to fast ramp-up and high yield in a competitive foundry market. We explain in this paper how the early introduction of 3D resist model based simulation of resist profiles (resist top-loss, bottom bridging, top-rounding, etc.) helped in our prediction and correction of hot-spots in the early 28nm process development phase. The paper also discusses the other overall process window reduction contributors due to mask 3D effects and wafer topography (focus shifts/variations), and how these have been addressed with different simulation efforts in a fast and timely manner.
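The k1 value quoted above follows directly from the Rayleigh criterion CD = k1·λ/NA. Assuming a ~45 nm half-pitch for the 28 nm node metal layer and a 1.35 NA ArF immersion scanner (both assumptions, not stated in the abstract), the quoted k1 ≈ 0.31 is recovered:

```python
def k1_factor(half_pitch_nm, na, wavelength_nm=193.0):
    """Rayleigh criterion CD = k1 * lambda / NA, rearranged for k1."""
    return half_pitch_nm * na / wavelength_nm

# Assumed: ~45 nm metal half-pitch at the 28 nm node, 1.35 NA ArF immersion.
print(round(k1_factor(45.0, 1.35), 2))  # -> 0.31
```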
NASA Astrophysics Data System (ADS)
Vishnyakov, G. N.; Levin, G. G.; Minaev, V. L.
2017-09-01
A review of advanced equipment for automated interference measurements developed at the All-Russian Research Institute for Optical and Physical Measurements is given. Three types of interference microscopes based on the Linnik, Twyman-Green, and Fizeau interferometers with the use of the phase stepping method are presented.
A Quantum Chemistry Concept Inventory for Physical Chemistry Classes
ERIC Educational Resources Information Center
Dick-Perez, Marilu; Luxford, Cynthia J.; Windus, Theresa L.; Holme, Thomas
2016-01-01
A 14-item, multiple-choice diagnostic assessment tool, the quantum chemistry concept inventory or QCCI, is presented. Items were developed based on published student misconceptions and content coverage and then piloted and used in advanced physical chemistry undergraduate courses. In addition to the instrument itself, data from both a pretest,…
Top 10 Research Questions Related to Preventing Sudden Death in Sport and Physical Activity
ERIC Educational Resources Information Center
Katch, Rachel K.; Scarneo, Samantha E.; Adams, William M.; Armstrong, Lawrence E.; Belval, Luke N.; Stamm, Julie M.; Casa, Douglas J.
2017-01-01
Participation in organized sport and recreational activities presents an innate risk for serious morbidity and mortality. Although death during sport or physical activity has many causes, advancements in sports medicine and evidence-based standards of care have allowed clinicians to prevent, recognize, and treat potentially fatal injuries more…
Inquiry-Based Practical Work in Physical Sciences: Equitable Access and Social Justice Issues
ERIC Educational Resources Information Center
Tsakeni, Maria
2018-01-01
Physical sciences education comes with high expectations for learners to be successfully placed in tertiary institutions in related fields, and developing countries' aspirations to develop advanced and specialised skills to drive economies. However, some of the prevailing instructional strategies in science classrooms work to marginalise learners.…
From Gene to Protein: A 3-Week Intensive Course in Molecular Biology for Physical Scientists
ERIC Educational Resources Information Center
Nadeau, Jay L.
2009-01-01
This article describes a 3-week intensive molecular biology methods course based upon fluorescent proteins, which is successfully taught at the McGill University to advanced undergraduates and graduates in physics, chemical engineering, biomedical engineering, and medicine. No previous knowledge of biological terminology or methods is expected, so…
Simulation of the hybrid and steady state advanced operating modes in ITER
NASA Astrophysics Data System (ADS)
Kessel, C. E.; Giruzzi, G.; Sips, A. C. C.; Budny, R. V.; Artaud, J. F.; Basiuk, V.; Imbeaux, F.; Joffrin, E.; Schneider, M.; Murakami, M.; Luce, T.; St. John, Holger; Oikawa, T.; Hayashi, N.; Takizuka, T.; Ozeki, T.; Na, Y.-S.; Park, J. M.; Garcia, J.; Tucillo, A. A.
2007-09-01
Integrated simulations are performed to establish a physics basis, in conjunction with present tokamak experiments, for the operating modes in the International Thermonuclear Experimental Reactor (ITER). Simulations of the hybrid mode are done using both fixed and free-boundary 1.5D transport evolution codes including CRONOS, ONETWO, TSC/TRANSP, TOPICS and ASTRA. The hybrid operating mode is simulated using the GLF23 and CDBM05 energy transport models. The injected powers are limited to the negative ion neutral beam, ion cyclotron and electron cyclotron heating systems. Several plasma parameters and source parameters are specified for the hybrid cases to provide a comparison of 1.5D core transport modelling assumptions, source physics modelling assumptions, as well as numerous peripheral physics modelling assumptions. Initial results indicate that very strict guidelines will need to be imposed on the application of GLF23, for example, to make useful comparisons. Some of the variations among the simulations are due to source models which vary widely among the codes used. In addition, there are a number of peripheral physics models that should be examined, some of which include fusion power production, bootstrap current, treatment of fast particles and treatment of impurities. The hybrid simulations project to fusion gains of 5.6-8.3, βN values of 2.1-2.6 and fusion powers ranging from 350 to 500 MW, under the assumptions outlined in section 3. Simulations of the steady state operating mode are done with the same 1.5D transport evolution codes cited above, except the ASTRA code. In these cases the energy transport model is more difficult to prescribe, so that energy confinement models will range from theory based to empirically based. The injected powers include the same sources as used for the hybrid with the possible addition of lower hybrid.
The simulations of the steady state mode project to fusion gains of 3.5-7, βN values of 2.3-3.0 and fusion powers of 290 to 415 MW, under the assumptions described in section 4. These simulations will be presented and compared with particular focus on the resulting temperature profiles, source profiles and peripheral physics profiles. The steady state simulations are at an early stage and are focused on developing a range of safety factor profiles with 100% non-inductive current.
Morphodynamic Modeling Using The SToRM Computational System
NASA Astrophysics Data System (ADS)
Simoes, F.
2016-12-01
The framework of the work presented here is the open source SToRM (System for Transport and River Modeling) eco-hydraulics modeling system, which is one of the models released with the iRIC hydraulic modeling graphical software package (http://i-ric.org/). SToRM has been applied to the simulation of various complex environmental problems, including natural waterways, steep channels with regime transition, and rapidly varying flood flows with wetting and drying fronts. In its previous version, however, the channel bed was treated as static, and the ability to simulate sediment transport rates or bed deformation was not included. The work presented here reports SToRM's newly developed extensions, which expand the system's capability to calculate morphological changes in alluvial river systems. The sediment transport module of SToRM has been developed based on the general recognition that meaningful advances depend on physically solid formulations and robust and accurate numerical solution methods. The basic concepts of mass and momentum conservation are used, where the feedback mechanisms between the flow of water, the sediment in transport, and the bed changes are directly incorporated in the governing equations of the mathematical model. This is accomplished via a non-capacity transport formulation based on the work of Cao et al. [Z. Cao et al., "Non-capacity or capacity model for fluvial sediment transport," Water Management, 165(WM4):193-211, 2012], where the governing equations are augmented with source/sink terms due to water-sediment interaction. The same unsteady, shock-capturing numerical schemes originally used in SToRM were adapted to the new physics, using a control volume formulation over unstructured computational grids. The presentation will include a brief overview of these methodologies and results of applying the model to a number of relevant physical test cases with movable beds, where computational results are compared to experimental data.
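A non-capacity formulation of the kind adopted from Cao et al. can be sketched in 1D: the sediment mass balance carries a source/sink term E − D that relaxes the concentration toward a transport capacity, and the bed evolves by the opposite exchange. Everything below (frozen flow field, first-order upwind scheme, coefficient values) is an illustrative toy, not SToRM's unstructured shock-capturing solver.

```python
import numpy as np

def step(h, u, c, z, dx, dt, alpha=0.1, c_eq=0.01, porosity=0.4):
    """One explicit update of d(hc)/dt + d(huc)/dx = E - D and
    dz/dt = (D - E)/(1 - p); E - D relaxes c toward the capacity c_eq."""
    hc = h * c
    flux = h * u * c                                # advective sediment flux
    dflux = (flux[1:] - flux[:-1]) / dx             # first-order upwind (u > 0)
    exchange = alpha * (c_eq - c) * h               # net entrainment E - D
    hc[1:] += dt * (-dflux + exchange[1:])
    z[1:] -= dt * exchange[1:] / (1.0 - porosity)   # net entrainment lowers the bed
    return hc / h, z

n = 50
h, u = np.full(n, 1.0), np.full(n, 0.5)             # frozen depth and velocity
c, z = np.zeros(n), np.zeros(n)
c[0] = 0.01                                         # sediment-laden inflow cell
for _ in range(100):
    c, z = step(h, u, c, z, dx=1.0, dt=0.5)
print(c.max(), z.min())
```

In the full model the flow field is not frozen: the augmented shallow-water equations feed back on h and u as the bed deforms, which is exactly the coupling the source/sink terms encode.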
Steady State Advanced Tokamak (SSAT): The mission and the machine
NASA Astrophysics Data System (ADS)
Thomassen, K.; Goldston, R.; Nevins, B.; Neilson, H.; Shannon, T.; Montgomery, B.
1992-03-01
Extending the tokamak concept to the steady state regime and pursuing advances in tokamak physics are important and complementary steps for the magnetic fusion energy program. The required transition away from inductive current drive will provide exciting opportunities for advances in tokamak physics, as well as important impetus to drive advances in fusion technology. Recognizing this, the Fusion Policy Advisory Committee and the U.S. National Energy Strategy identified the development of steady state tokamak physics and technology, and improvements in the tokamak concept, as vital elements in the magnetic fusion energy development plan. Both called for the construction of a steady state tokamak facility to address these plan elements. Advances in physics that produce better confinement and higher pressure limits are required for a similar unit size reactor. Regimes with largely self-driven plasma current are required to permit a steady-state tokamak reactor with acceptable recirculating power. Reliable techniques of disruption control will be needed to achieve the availability goals of an economic reactor. Thus the central role of this new tokamak facility is to point the way to a more attractive demonstration reactor (DEMO) than the present data base would support. To meet these challenges, we propose a new 'Steady State Advanced Tokamak' (SSAT) facility that would develop and demonstrate an optimized steady state tokamak operating mode. While other tokamaks in the world program employ superconducting toroidal field coils, SSAT would be the first major tokamak to operate with a fully superconducting coil set in the elongated, divertor geometry planned for ITER and DEMO.
Recent advances in hypersonic technology
NASA Technical Reports Server (NTRS)
Dwoyer, Douglas L.
1990-01-01
This paper will focus on recent advances in hypersonic aerodynamic prediction techniques. Current capabilities of existing numerical methods for predicting high Mach number flows will be discussed and shortcomings will be identified. Physical models available for inclusion into modern codes for predicting the effects of transition and turbulence will also be outlined and their limitations identified. Chemical reaction models appropriate to high-speed flows will be addressed, and the impact of their inclusion in computational fluid dynamics codes will be discussed. Finally, the problem of validating predictive techniques for high Mach number flows will be addressed.
Recent Advances and Future Prospects in Fundamental Symmetries
NASA Astrophysics Data System (ADS)
Plaster, Brad
2017-09-01
A broad program of initiatives in fundamental symmetries seeks answers to several of the most pressing open questions in nuclear physics, ranging from the scale of the neutrino mass, to the particle-antiparticle nature of the neutrino, to the origin of the matter-antimatter asymmetry, to the limits of Standard Model interactions. Although the experimental program is quite broad, with efforts ranging from precision measurements of neutrino properties, to searches for electric dipole moments, to precision measurements of magnetic dipole moments, and to precision measurements of couplings, particle properties, and decays, all of these seemingly disparate initiatives are unified by several common threads. These include the use and exploitation of symmetry principles, novel cross-disciplinary experimental work at the forefront of the precision frontier, and the need for accompanying breakthroughs in development of the theory necessary for an interpretation of the anticipated results from these experiments. This talk will highlight recent accomplishments and advances in fundamental symmetries and point to the extraordinary level of ongoing activity aimed at realizing the development and interpretation of next-generation experiments. This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Nuclear Physics, under Award Number DE-SC-0014622.
Learning Physics-based Models in Hydrology under the Framework of Generative Adversarial Networks
NASA Astrophysics Data System (ADS)
Karpatne, A.; Kumar, V.
2017-12-01
Generative adversarial networks (GANs), which have been highly successful in a number of applications involving large volumes of labeled and unlabeled data, such as computer vision, offer huge potential for modeling the dynamics of physical processes that have traditionally been studied using simulations of physics-based models. While conventional physics-based models use labeled samples of input/output variables for model calibration (estimating the right parametric forms of relationships between variables) or data assimilation (identifying the most likely sequence of system states in dynamical systems), there is a greater opportunity to explore the full power of machine learning (ML) methods (e.g., GANs) for studying physical processes currently suffering from large knowledge gaps, e.g., groundwater flow. However, success in this endeavor requires a principled way of combining the strengths of ML methods with physics-based numerical models that are founded on a wealth of scientific knowledge. This is especially important in scientific domains like hydrology, where the number of data samples is small (relative to Internet-scale applications such as image recognition, where machine learning methods have found great success) and the physical relationships are complex (high-dimensional) and non-stationary. We will present a series of methods for guiding the learning of GANs using physics-based models, e.g., by using the outputs of physics-based models as input data to the generator-learner framework, and by using physics-based models as generators trained using validation data in the adversarial learning framework. These methods are being developed under the broad paradigm of theory-guided data science that we are developing to integrate scientific knowledge with data science methods for accelerating scientific discovery.
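One of the guidance strategies mentioned, feeding physics-based model outputs into the generator, can be sketched with a deliberately tiny one-dimensional GAN. Everything here (the data, the affine generator, the logistic discriminator, and the learning rates) is a synthetic placeholder, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1D sketch: the physics-based model produces biased samples, the
# generator learns an affine correction, and a logistic discriminator
# plays the adversary.

obs = rng.normal(2.0, 1.0, size=512)    # "observations"
phys = rng.normal(0.0, 1.0, size=512)   # physics-model output (biased mean)

a, b = 1.0, 0.0    # generator G(x) = a*x + b, applied to physics output
w, c0 = 0.1, 0.0   # discriminator D(x) = sigmoid(w*x + c0)
sig = lambda t: 1.0 / (1.0 + np.exp(-np.clip(t, -50.0, 50.0)))
lr = 0.05

for _ in range(2000):
    gen = a * phys + b
    # discriminator: gradient ascent on log D(real) + log(1 - D(fake))
    d_real, d_fake = sig(w * obs + c0), sig(w * gen + c0)
    w += lr * np.mean((1.0 - d_real) * obs - d_fake * gen)
    c0 += lr * np.mean((1.0 - d_real) - d_fake)
    # generator: gradient ascent on log D(fake) (non-saturating loss)
    d_fake = sig(w * (a * phys + b) + c0)
    a += lr * np.mean((1.0 - d_fake) * w * phys)
    b += lr * np.mean((1.0 - d_fake) * w)
```

After training, the generated distribution sits closer to the observations than the raw physics output does, i.e., the adversary pushes the generator to correct the physics model's bias.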
Modeling of luminance distribution in CAVE-type virtual reality systems
NASA Astrophysics Data System (ADS)
Meironke, Michał; Mazikowski, Adam
2017-08-01
At present, one of the most advanced types of virtual reality system is the CAVE-type (Cave Automatic Virtual Environment) installation. Such systems usually consist of four, five or six projection screens; in the six-screen case, the screens are arranged in the form of a cube. Providing the user with a strong feeling of immersion in such systems depends largely on the optical properties of the system. The modelling of physical phenomena nowadays plays a huge role in most fields of science and technology, since it allows the operation of a device to be simulated without making any changes to its physical construction. In this paper, the distribution of luminance in CAVE-type virtual reality systems was modelled. Calculations were performed for a model of a six-walled CAVE-type installation based on the Immersive 3D Visualization Laboratory, situated at the Faculty of Electronics, Telecommunications and Informatics of the Gdańsk University of Technology. Tests were carried out for two different scattering distributions of the screen material in order to check how these characteristics influence the luminance distribution of the whole CAVE. The basic assumptions and simplifications of the modelled CAVE-type installation are presented together with the results, along with a brief discussion of the results and of the usefulness of the developed model.
NASA Astrophysics Data System (ADS)
Gianni, Guillaume; Doherty, John; Perrochet, Pierre; Brunner, Philip
2017-04-01
Physical properties of alluvial environments typically feature a high degree of anisotropy and are characterized by dynamic interactions between the surface and the subsurface. A literature review of current modelling practice shows that hydrogeological models are often calibrated using isotropic hydraulic conductivity fields and steady-state conditions. We aim to understand how these simplifications affect the predictions of hydraulic heads and exchange fluxes using fully coupled, physically based synthetic models and advanced calibration approaches. Specifically, we present an analysis of the information content provided by averaged, steady state hydraulic data compared to transient data with respect to the determination of aquifer hydraulic properties. We show that the information content in average hydraulic heads is insufficient to inform anisotropic properties of alluvial aquifers and can lead to important biases on the calibrated parameters. We further explore the consequences of these biases on predictions of fluxes and water table dynamics. The results of this synthetic analysis are considered in the calibration of a highly dynamic and anisotropic alluvial aquifer system in Switzerland (the Rhône River). The results of the synthetic and real-world modelling and calibration exercises provide insight on future data acquisition, modelling and calibration strategies for these environments.
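The information-content argument can be illustrated with a toy linear-reservoir model: the time-averaged head constrains only the product of recharge and resistance, not the storage, so parameter sets with very different dynamics match the same averaged data. All parameters below are hypothetical, not from the Rhône study:

```python
import numpy as np

# Toy linear-reservoir aquifer: S dh/dt = r(t) - h/k  (explicit Euler).
# The time-averaged head ~ k * mean(r) is independent of the storage S,
# so calibration against averaged heads leaves S, and hence the dynamics
# and transient exchange fluxes, unconstrained.

def simulate(k, S, r, dt=1.0, h0=0.0):
    h = np.empty(len(r))
    hc = h0
    for i, ri in enumerate(r):
        hc += dt * (ri - hc / k) / S
        h[i] = hc
    return h

t = np.arange(20000)
r = 1.0 + 0.5 * np.sin(2.0 * np.pi * t / 365.0)   # seasonal recharge

h_fast = simulate(k=2.0, S=1.0, r=r)      # low storage: flashy response
h_slow = simulate(k=2.0, S=500.0, r=r)    # high storage: damped response

burn = 5000                                # discard spin-up
mean_gap = abs(h_fast[burn:].mean() - h_slow[burn:].mean())
amp_fast = np.ptp(h_fast[burn:])
amp_slow = np.ptp(h_slow[burn:])
```

Both parameter sets reproduce essentially the same averaged head (`mean_gap` is tiny), yet their transient amplitudes differ by an order of magnitude, which is exactly the kind of bias that only transient calibration data can expose.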
2013-01-01
Background The wheat genome sequence is an essential tool for advanced genomic research and improvements. The generation of a high-quality wheat genome sequence is challenging due to its complex 17 Gb polyploid genome. To overcome these difficulties, sequencing through the construction of BAC-based physical maps of individual chromosomes is employed by the wheat genomics community. Here, we present the construction of the first comprehensive physical map of chromosome 1BS, and illustrate its unique gene space organization and evolution. Results Fingerprinted BAC clones were assembled into 57 long scaffolds, anchored and ordered with 2,438 markers, covering 83% of chromosome 1BS. The BAC-based chromosome 1BS physical map and gene order of the orthologous regions of model grass species were consistent, providing strong support for the reliability of the chromosome 1BS assembly. The gene space for chromosome 1BS spans the entire length of the chromosome arm, with 76% of the genes organized in small gene islands, accompanied by a two-fold increase in gene density from the centromere to the telomere. Conclusions This study provides new evidence on common and chromosome-specific features in the organization and evolution of the wheat genome, including a non-uniform distribution of gene density along the centromere-telomere axis, abundance of non-syntenic genes, the degree of colinearity with other grass genomes and a non-uniform size expansion along the centromere-telomere axis compared with other model cereal genomes. The high-quality physical map constructed in this study provides a solid basis for the assembly of a reference sequence of chromosome 1BS and for breeding applications. PMID:24359668
Quantifying Quality of Life and Disability of Patients with Advanced Schistosomiasis Japonica
Jia, Tie-Wu; Utzinger, Jürg; Deng, Yao; Yang, Kun; Li, Yi-Yi; Zhu, Jin-Huan; King, Charles H.; Zhou, Xiao-Nong
2011-01-01
Background The Chinese government lists advanced schistosomiasis as a leading healthcare priority due to its serious health and economic impacts, yet it has not been included in the estimates of schistosomiasis burden in the Global Burden of Disease (GBD) study. Therefore, the quality of life and disability weight (DW) for the advanced cases of schistosomiasis japonica have to be taken into account in the re-estimation of burden of disease due to schistosomiasis. Methodology/Principal Findings A patient-based quality-of-life evaluation was performed for advanced schistosomiasis japonica. Suspected or officially registered advanced cases in a Schistosoma japonicum-hyperendemic county of the People's Republic of China (P.R. China) were screened using a short questionnaire and physical examination. Disability and morbidity were assessed in confirmed cases, using the European quality of life questionnaire with an additional cognitive dimension (known as the “EQ-5D plus”), ultrasonography, and laboratory testing. The age-specific DW of advanced schistosomiasis japonica was estimated based on patients' self-rated health scores on the visual analogue scale of the questionnaire. The relationships between health status, morbidity and DW were explored using multivariate regression models. Of 506 candidates, 215 cases were confirmed as advanced schistosomiasis japonica and evaluated. Most of the patients reported impairments in at least one health dimension, such as pain or discomfort (90.7%), usual activities (87.9%), and anxiety or depression (80.9%). The overall DW was 0.447, and age-specific DWs ranged from 0.378 among individuals aged 30–44 years to 0.510 among the elderly aged ≥60 years. DWs are positively associated with loss of work capacity, psychological abnormality, ascites, and active hepatitis B virus, while splenectomy and high albumin were protective factors for quality of life. 
Conclusions/Significance These patient-preference disability estimates could provide updated data for a revision of the GBD, as well as for evidence-based decision-making in P.R. China's national schistosomiasis control program. PMID:21358814
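The DW-from-VAS step described in the methodology (disability weight derived from patients' self-rated health on a 0-100 visual analogue scale) can be sketched directly. The scores below are hypothetical placeholders, not the study's data:

```python
import numpy as np

# Hypothetical self-rated VAS scores (0-100; higher = better health) by
# age group; these are NOT the study's data, just placeholders.
vas = {
    "30-44": np.array([70.0, 65.0, 58.0, 60.0]),
    "45-59": np.array([55.0, 60.0, 50.0, 52.0]),
    ">=60":  np.array([45.0, 50.0, 48.0, 52.0]),
}

# Disability weight from self-rated health: DW = 1 - VAS/100.
dw = {grp: float(np.mean(1.0 - s / 100.0)) for grp, s in vas.items()}
overall = float(np.mean(np.concatenate([1.0 - s / 100.0 for s in vas.values()])))
```

With these placeholder scores the age-specific DWs rise with age, mirroring the gradient reported in the abstract.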
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Goldberg, Robert K.; Lerch, Bradley A.; Saleeb, Atef F.
2009-01-01
Herein a general, multimechanism, physics-based viscoelastoplastic model is presented in the context of an integrated diagnosis and prognosis methodology which is proposed for structural health monitoring, with particular applicability to gas turbine engine structures. In this methodology, diagnostics and prognostics will be linked through state awareness variable(s). Key technologies which comprise the proposed integrated approach include (1) diagnostic/detection methodology, (2) prognosis/lifing methodology, (3) diagnostic/prognosis linkage, (4) experimental validation, and (5) material data information management system. A specific prognosis lifing methodology, experimental characterization and validation and data information management are the focal point of current activities being pursued within this integrated approach. The prognostic lifing methodology is based on an advanced multimechanism viscoelastoplastic model which accounts for both stiffness and/or strength reduction damage variables. Methods to characterize both the reversible and irreversible portions of the model are discussed. Once the multiscale model is validated the intent is to link it to appropriate diagnostic methods to provide a full-featured structural health monitoring system.
Theory of mind selectively predicts preschoolers’ knowledge-based selective word learning
Brosseau-Liard, Patricia; Penney, Danielle; Poulin-Dubois, Diane
2015-01-01
Children can selectively attend to various attributes of a model, such as past accuracy or physical strength, to guide their social learning. There is a debate regarding whether a relation exists between theory-of-mind skills and selective learning. We hypothesized that high performance on theory-of-mind tasks would predict preference for learning new words from accurate informants (an epistemic attribute), but not from physically strong informants (a non-epistemic attribute). Three- and 4-year-olds (N = 65) completed two selective learning tasks, and their theory of mind abilities were assessed. As expected, performance on a theory-of-mind battery predicted children’s preference to learn from more accurate informants but not from physically stronger informants. Results thus suggest that preschoolers with more advanced theory of mind have a better understanding of knowledge and apply that understanding to guide their selection of informants. This work has important implications for research on children’s developing social cognition and early learning. PMID:26211504
Advanced and secure architectural EHR approaches.
Blobel, Bernd
2006-01-01
Electronic Health Records (EHRs), provided as lifelong patient records, are advancing towards core applications of distributed and co-operating health information systems and health networks. To meet the challenge of scalable, flexible, portable, secure EHR systems, the underlying EHR architecture must be based on the component paradigm and be model driven, separating platform-independent and platform-specific models. To allow manageable models, real systems must be decomposed and simplified. The resulting modelling approach has to follow the ISO Reference Model - Open Distributed Processing (RM-ODP). The ISO RM-ODP describes any system component from different perspectives. Platform-independent perspectives comprise the enterprise view (business processes, policies, scenarios, use cases), the information view (classes and associations) and the computational view (composition and decomposition), whereas platform-specific perspectives concern the engineering view (physical distribution and realisation) and the technology view (implementation details from protocols up to education and training) on system components. These views have to be established for components reflecting aspects of all domains involved in healthcare environments, including administrative, legal, medical, technical and others. Thus, security-related component models reflecting all the views mentioned have to be established to enable both application and communication security services as an integral part of the system's architecture. Besides the decomposition and simplification of systems according to the different viewpoints on their components, different levels of system granularity can be defined, hiding internals or focusing on the properties of basic components to form more complex structures. The resulting models describe both the structure and the behaviour of component-based systems. The described approach has been deployed in different projects defining EHR systems and their underlying architectural principles.
In that context, the Australian GEHR project, the openEHR initiative and the revision of CEN ENV 13606 "Electronic Health Record communication", all based on Archetypes, as well as the HL7 version 3 activities, are discussed in some detail. The latter include the HL7 RIM, the HL7 Development Framework, HL7's Clinical Document Architecture (CDA), and the set of models ranging from use cases, activity diagrams and sequence diagrams up to Domain Information Models (DMIMs) and their building blocks, Common Message Element Types (CMETs), which constrain the models to their underlying concepts. A future-proof EHR architecture, as an open, user-centric, user-friendly, flexible, scalable and portable core application in health information systems and health networks, has to follow advanced architectural paradigms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Platania, P., E-mail: platania@ifp.cnr.it; Figini, L.; Farina, D.
The purpose of this work is the optical modeling and physical performance evaluation of the JT-60SA ECRF launcher system. The beams have been simulated with the electromagnetic code GRASP® and used as input for ECCD calculations performed with the beam tracing code GRAY, capable of modeling the propagation, absorption, and current drive of an EC Gaussian beam with general astigmatism. Full details of the optical analysis have been taken into account to model the launched beams. Inductive and advanced reference scenarios have been analysed for physical evaluations in the full poloidal and toroidal steering ranges for two slightly different layouts of the launcher system.
NASA Astrophysics Data System (ADS)
Haworth, Daniel
2013-11-01
The importance of explicitly accounting for the effects of unresolved turbulent fluctuations in Reynolds-averaged and large-eddy simulations of chemically reacting turbulent flows is increasingly recognized. Transported probability density function (PDF) methods have emerged as one of the most promising modeling approaches for this purpose. In particular, PDF methods provide an elegant and effective resolution to the closure problems that arise from averaging or filtering terms that correspond to nonlinear point processes, including chemical reaction source terms and radiative emission. PDF methods traditionally have been associated with studies of turbulence-chemistry interactions in laboratory-scale, atmospheric-pressure, nonluminous, statistically stationary nonpremixed turbulent flames; and Lagrangian particle-based Monte Carlo numerical algorithms have been the predominant method for solving modeled PDF transport equations. Recent advances and trends in PDF methods are reviewed and discussed. These include advances in particle-based algorithms, alternatives to particle-based algorithms (e.g., Eulerian field methods), treatment of combustion regimes beyond low-to-moderate-Damköhler-number nonpremixed systems (e.g., premixed flamelets), extensions to include radiation heat transfer and multiphase systems (e.g., soot and fuel sprays), and the use of PDF methods as the basis for subfilter-scale modeling in large-eddy simulation. Examples are provided that illustrate the utility and effectiveness of PDF methods for physics discovery and for applications to practical combustion systems. These include comparisons of results obtained using the PDF method with those from models that neglect unresolved turbulent fluctuations in composition and temperature in the averaged or filtered chemical source terms and/or the radiation heat transfer source terms. In this way, the effects of turbulence-chemistry-radiation interactions can be isolated and quantified.
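The closure argument above (nonlinear source terms are closed exactly at the particle level in a PDF method) can be illustrated with a minimal Lagrangian particle sketch in a single homogeneous cell: particles mix toward the mean via the IEM model and react with a nonlinear source. The rates and the source law are illustrative assumptions, not from the talk:

```python
import numpy as np

rng = np.random.default_rng(1)

# Minimal composition-PDF sketch: notional particles in one statistically
# homogeneous cell mix toward the mean (IEM mixing model) and react with a
# nonlinear source evaluated exactly per particle, so no closure is needed
# for the averaged reaction rate.

N = 5000
phi = rng.uniform(0.0, 1.0, N)       # particle compositions (progress variable)
C_phi, tau, dt = 2.0, 1.0, 0.01

def source(p):
    return 4.0 * p * (1.0 - p)       # nonlinear (concave) reaction source

for _ in range(50):
    phi += -0.5 * (C_phi / tau) * (phi - phi.mean()) * dt   # IEM mixing
    phi += source(phi) * dt                                  # per-particle reaction
    np.clip(phi, 0.0, 1.0, out=phi)

mean_of_source = source(phi).mean()   # what the PDF method computes
source_of_mean = source(phi.mean())   # what a no-fluctuation model computes
```

For this quadratic source the gap is exactly 4·Var(phi): evaluating the source at the mean composition misstates the averaged rate whenever the composition PDF has finite variance, which is precisely the turbulence-chemistry interaction effect the abstract isolates and quantifies.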
Ghosh, Sujit K
2010-01-01
Bayesian methods are rapidly becoming popular tools for making statistical inference in various fields of science including biology, engineering, finance, and genetics. One of the key aspects of Bayesian inferential method is its logical foundation that provides a coherent framework to utilize not only empirical but also scientific information available to a researcher. Prior knowledge arising from scientific background, expert judgment, or previously collected data is used to build a prior distribution which is then combined with current data via the likelihood function to characterize the current state of knowledge using the so-called posterior distribution. Bayesian methods allow the use of models of complex physical phenomena that were previously too difficult to estimate (e.g., using asymptotic approximations). Bayesian methods offer a means of more fully understanding issues that are central to many practical problems by allowing researchers to build integrated models based on hierarchical conditional distributions that can be estimated even with limited amounts of data. Furthermore, advances in numerical integration methods, particularly those based on Monte Carlo methods, have made it possible to compute the optimal Bayes estimators. However, there is a reasonably wide gap between the background of the empirically trained scientists and the full weight of Bayesian statistical inference. Hence, one of the goals of this chapter is to bridge the gap by offering elementary to advanced concepts that emphasize linkages between standard approaches and full probability modeling via Bayesian methods.
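The prior-plus-likelihood-to-posterior workflow, and the Monte Carlo computation the chapter highlights, can be sketched with a random-walk Metropolis sampler for the simplest case, the mean of normal data. The prior and data here are illustrative, and conjugacy gives a closed-form check:

```python
import numpy as np

rng = np.random.default_rng(2)

# Minimal sketch of prior + likelihood -> posterior via Monte Carlo:
# random-walk Metropolis for the mean mu of normal data with known
# sigma = 1 and prior mu ~ N(0, 10^2).

data = rng.normal(3.0, 1.0, size=50)

def log_post(mu):
    # log prior + log likelihood (up to an additive constant)
    return -0.5 * (mu / 10.0) ** 2 - 0.5 * np.sum((data - mu) ** 2)

mu, chain = 0.0, []
for _ in range(20000):
    prop = mu + rng.normal(0.0, 0.5)          # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
        mu = prop                             # accept
    chain.append(mu)
samples = np.array(chain[5000:])              # discard burn-in

# Closed-form conjugate posterior mean for comparison:
n, s2, t2 = len(data), 1.0, 100.0
post_mean = t2 * data.sum() / (n * t2 + s2)
```

The sampler's mean agrees with the conjugate answer, illustrating how Monte Carlo methods make the posterior computable even when no closed form exists.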
Automated Deployment of Advanced Controls and Analytics in Buildings
NASA Astrophysics Data System (ADS)
Pritoni, Marco
Buildings use 40% of primary energy in the US. Recent studies show that developing energy analytics and enhancing control strategies can significantly improve their energy performance. However, the deployment of advanced control software applications has been mostly limited to academic studies. Larger-scale implementations are prevented by the significant engineering time and customization required, owing to substantial differences among buildings. This study demonstrates how physics-inspired data-driven models can be used to develop portable analytics and control applications for buildings. Specifically, I demonstrate the application of these models in all phases of the deployment of advanced controls and analytics in buildings: in the first phase, "Site Preparation and Interface with Legacy Systems", I used models to discover or map relationships among building components, automatically gathering metadata (information about data points) necessary to run the applications. During the second phase, "Application Deployment and Commissioning", models automatically learn system parameters, used for advanced controls and analytics. In the third phase, "Continuous Monitoring and Verification", I utilized models to automatically measure the energy performance of a building that has implemented advanced control strategies. In the conclusions, I discuss future challenges and suggest potential strategies for these innovative control systems to be widely deployed in the market. This dissertation provides useful new tools in terms of procedures, algorithms, and models to facilitate the automation of deployment of advanced controls and analytics and accelerate their wide adoption in buildings.
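The second phase (models automatically learning system parameters) can be sketched with a lumped 1R1C zone model fit to trend-log data by least squares; the parameter values and the data below are synthetic assumptions, not from the dissertation:

```python
import numpy as np

rng = np.random.default_rng(3)

# Physics-inspired data-driven sketch: a 1R1C zone model,
#   C dT/dt = (T_out - T)/R + Q,
# discretized and fit to synthetic trend-log data by least squares.

dt = 300.0                                    # 5-minute samples [s]
R_true, C_true = 0.005, 2.0e6                 # resistance [K/W], capacitance [J/K]
n = 2000
T_out = 10.0 + 5.0 * np.sin(2.0 * np.pi * np.arange(n) * dt / 86400.0)
Q = np.where(rng.uniform(size=n) < 0.5, 2000.0, 0.0)   # HVAC heat input [W]

T = np.empty(n)
T[0] = 20.0
for k in range(n - 1):                         # forward-Euler "truth"
    T[k + 1] = T[k] + dt * ((T_out[k] - T[k]) / R_true + Q[k]) / C_true
T_meas = T + rng.normal(0.0, 0.02, n)          # sensor noise

# Regress dT/dt on (T_out - T) and Q: the slopes are 1/(R C) and 1/C.
y = np.diff(T_meas) / dt
X = np.column_stack([(T_out - T_meas)[:-1], Q[:-1]])
(a, b), *_ = np.linalg.lstsq(X, y, rcond=None)
R_est, C_est = b / a, 1.0 / b
```

Because the regression structure comes from the physics, the same fitting code ports across buildings; only the learned R and C change, which is the portability argument the dissertation makes.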
Pfalzer, Lucinda A.; Springer, Barbara; Levy, Ellen; McGarvey, Charles L.; Danoff, Jerome V.; Gerber, Lynn H.; Soballe, Peter W.
2012-01-01
Secondary prevention involves monitoring and screening to prevent negative sequelae from chronic diseases such as cancer. Breast cancer treatment sequelae, such as lymphedema, may occur early or late and often negatively affect function. Secondary prevention through prospective physical therapy surveillance aids in early identification and treatment of breast cancer–related lymphedema (BCRL). Early intervention may reduce the need for intensive rehabilitation and may be cost saving. This perspective article compares a prospective surveillance model with a traditional model of impairment-based care and examines direct treatment costs associated with each program. Intervention and supply costs were estimated based on the Medicare 2009 physician fee schedule for 2 groups: (1) a prospective surveillance model group (PSM group) and (2) a traditional model group (TM group). The PSM group comprised all women with breast cancer who were receiving interval prospective surveillance, assuming that one third would develop early-stage BCRL. The prospective surveillance model includes the cost of screening all women plus the cost of intervention for early-stage BCRL. The TM group comprised women referred for BCRL treatment using a traditional model of referral based on late-stage lymphedema. The traditional model cost includes the direct cost of treating patients with advanced-stage lymphedema. The cost to manage early-stage BCRL per patient per year using a prospective surveillance model is $636.19. The cost to manage late-stage BCRL per patient per year using a traditional model is $3,124.92. The prospective surveillance model is emerging as the standard of care in breast cancer treatment and is a potential cost-saving mechanism for BCRL treatment. Further analysis of indirect costs and utility is necessary to assess cost-effectiveness. A shift in the paradigm of physical therapy toward a prospective surveillance model is warranted. PMID:21921254
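The two per-patient-year figures reported above can be turned into a simple program-level comparison. Only the two unit costs and the one-third incidence assumption come from the abstract; the cohort size and the program framing are hypothetical:

```python
# Program-level sketch of the cost comparison between the prospective
# surveillance model (PSM) and the traditional model (TM).

cost_psm = 636.19      # PSM: manage early-stage BCRL [$ / patient / year]
cost_tm = 3124.92      # TM: manage late-stage BCRL [$ / patient / year]

cohort = 1000          # hypothetical women entering surveillance per year
late_rate = 1.0 / 3.0  # abstract's assumption: one third develop BCRL

total_psm = cohort * cost_psm              # surveil everyone, treat early
total_tm = cohort * late_rate * cost_tm    # treat only late-stage referrals
savings = total_tm - total_psm
```

Under this reading of the two model descriptions, surveilling the whole cohort still costs less than treating the third who would otherwise present with late-stage lymphedema, which is the cost-saving mechanism the article argues for.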
TEACHING PHYSICS: The quantum understanding of pre-university physics students
NASA Astrophysics Data System (ADS)
Ireson, Gren
2000-01-01
Students in England and Wales wishing to read for a physics-based degree will, in all but the more exceptional situations, be required to follow the two-year GCE Advanced-level physics course. This course includes, in its mandatory core, material that addresses the topic of `quantum phenomena'. Over the years journals such as this have published teaching strategies, for example Lawrence (1996), but few studies addressing what students understand of quantum phenomena can be found. This paper aims to address just this problem.