The Formation of Igneous CAIs and Chondrules by Impacts?
NASA Technical Reports Server (NTRS)
Connolly, Harold C., Jr.; Love, Stanley G.
2001-01-01
Numerous challenges exist with forming the igneous spheres found within chondrites via collision events in the early solar nebula. We explore these challenges and discuss potential methods to overcome them. Collision models should therefore be viewed with caution. Additional information is contained in the original extended abstract.
A Survey of Terrestrial Approaches to the Challenge of Lunar Dust Containment
NASA Technical Reports Server (NTRS)
Aguilera, Tatiana; Perry, Jay L.
2009-01-01
Numerous technical challenges exist to successfully extend lunar surface exploration beyond the tantalizing first steps of Apollo. Among these is the challenge of lunar dust intrusion into the cabin environment. Addressing this challenge includes the design of barriers to intrusion as well as techniques for removing the dust from the cabin atmosphere. Opportunities exist for adapting approaches employed in dusty industrial operations and pristine manufacturing environments to cabin environmental quality maintenance applications. A survey of process technologies employed by the semiconductor, pharmaceutical, food processing, and mining industries offers insight into basic approaches that may be suitable for adaptation to lunar surface exploration applications.
Career Advancement of Women Senior Academic Administrators in Indonesia: Supports and Challenges
ERIC Educational Resources Information Center
Murniati, Cecilia Titiek
2012-01-01
Increasing numbers of women have gained access to college and the college teaching profession worldwide. However, women continue to be underrepresented in academic, research, and leadership positions. Women who have aspirations for top leadership positions still encounter numerous internal and external challenges. Existent literature on women…
bb̅ud̅ four-quark systems in the Born-Oppenheimer approximation: prospects and challenges
NASA Astrophysics Data System (ADS)
Peters, Antje; Bicudo, Pedro; Wagner, Marc
2018-03-01
We summarize previous work on b̅b̅ud four-quark systems in the Born-Oppenheimer approximation and discuss first steps towards an extension to the theoretically more challenging bb̅ud̅ system. Strategies to identify a possibly existing bb̅ud̅ bound state are discussed and first numerical results are presented.
Assessment and Management of Watershed Microbial Contaminants
Numerous sources of infectious disease causing microorganisms exist in watersheds and can impact recreational and drinking water quality. Organisms of concern include bacteria, viruses, and parasites. The watershed manager is challenged to limit human contact with pathogens, limi...
NASA Technical Reports Server (NTRS)
Shyne, Rickey J.
2002-01-01
The current paper discusses aerodynamic exhaust nozzle technology challenges for aircraft and space propulsion systems. Technology advances in computational and experimental methods have led to more accurate design and analysis tools, but many major challenges continue to exist in nozzle performance, jet noise and weight reduction. New generations of aircraft and space vehicle concepts dictate that exhaust nozzles have optimum performance, low weight and acceptable noise signatures. Numerous innovative nozzle concepts have been proposed for advanced subsonic, supersonic and hypersonic vehicle configurations such as ejector, mixer-ejector, plug, single expansion ramp, altitude compensating, lobed and chevron nozzles. This paper will discuss the technology barriers that exist for exhaust nozzles as well as current research efforts in place to address the barriers.
Taxonomy of Challenges for Digital Forensics.
Karie, Nickson M; Venter, Hein S
2015-07-01
Since its inception over a decade ago, the field of digital forensics has faced numerous challenges. Despite different researchers and digital forensic practitioners having studied and analysed various known digital forensic challenges, as of 2013 there still exists a need for a formal classification of these challenges. This article therefore reviews existing research literature and highlights the various challenges that digital forensics has faced for the last 10 years. In conducting this research study, however, it was difficult for the authors to review all the existing research literature in the digital forensic domain; hence, sampling and randomization techniques were employed to facilitate the review of the gathered literature. A taxonomy of the various challenges is subsequently proposed in this paper, based on our review of the literature. The taxonomy classifies the large number of digital forensic challenges into four well-defined and easily understood categories. The proposed taxonomy can be useful, for example, in future developments of automated digital forensic tools by explicitly describing processes and procedures that focus on addressing specific challenges identified in this paper. However, it should also be noted that the purpose of this paper was not to propose solutions to the individual challenges that digital forensics faces, but to serve as a survey of the state of the art of the research area. © 2015 American Academy of Forensic Sciences.
Digitized Archival Primary Sources in STEM: A Selected Webliography
ERIC Educational Resources Information Center
Jankowski, Amy
2017-01-01
Accessibility and findability of digitized archival resources can be a challenge, particularly for students or researchers not familiar with archival formats and digital interfaces, which adhere to different descriptive standards than more widely familiar library resources. Numerous aggregate archival collection databases exist, which provide a…
Aerothermodynamics of Blunt Body Entry Vehicles. Chapter 3
NASA Technical Reports Server (NTRS)
Hollis, Brian R.; Borrelli, Salvatore
2011-01-01
In this chapter, the aerothermodynamic phenomena of blunt body entry vehicles are discussed. Four topics will be considered that present challenges to current computational modeling techniques for blunt body environments: turbulent flow, non-equilibrium flow, rarefied flow, and radiation transport. Examples of comparisons between computational tools to ground and flight-test data will be presented in order to illustrate the challenges existing in the numerical modeling of each of these phenomena and to provide test cases for evaluation of Computational Fluid Dynamics (CFD) code predictions.
Revealing a Hidden Curriculum of Black Women's Erasure in Sexual Violence Prevention Policy
ERIC Educational Resources Information Center
Wooten, Sara Carrigan
2017-01-01
This article aims to challenge the framework by which rape and sexual assault prevention in higher education are being constituted by centring Black women's experiences of sexual violence within a prevention and response policy framework. Numerous research studies exist in the literature regarding the specific experience of sexual violence for…
Taking Stock: Implications of a New Vision of Science Learning for State Science Assessment
ERIC Educational Resources Information Center
Wertheim, Jill
2016-01-01
This article presents the author's response to the article "Taking Stock: Existing Resources for Assessing a New Vision of Science Learning" by Alonzo and Ke (this issue), which identifies numerous challenges that the Next Generation Science Standards (NGSS) pose for large-scale assessment. Jill Wertheim comments that among those…
ERIC Educational Resources Information Center
Vega, Laura; Bajaj, Monisha
2016-01-01
The challenges of ensuring the right to education are numerous, especially when working with marginalised populations in fragile contexts. Despite having the legislation, strong constitutional support, and even educational innovations designed to guarantee the right to education, a major gap exists in Colombia between political intentions and the…
Beyond the Four Walls: Examining the Use of Authentic Learning Modules
ERIC Educational Resources Information Center
Jagielski, Donna Marie
2016-01-01
While attempting to provide real world experiences in STEM, educators face numerous challenges including adhering to curriculum requirements and working with potentially limited resources. The purpose of this action research study was to examine how the addition of authentic learning modules to the existing University of Arizona Middle School…
Brügmann, B.; Ghez, A. M.; Greiner, J.
2001-01-01
Recent progress in black hole research is illustrated by three examples. We discuss the observational challenges that were met to show that a supermassive black hole exists at the center of our galaxy. Stellar-size black holes have been studied in x-ray binaries and microquasars. Finally, numerical simulations have become possible for the merger of black hole binaries. PMID:11553801
Developing Teaching Material Software Assisted for Numerical Methods
NASA Astrophysics Data System (ADS)
Handayani, A. D.; Herman, T.; Fatimah, S.
2017-09-01
The NCTM vision emphasizes two priorities for school mathematics: knowing the mathematics of the 21st century, and continuing to improve mathematics education to answer the challenges of a changing world. One of the competencies associated with the great challenges of the 21st century is the use of aids and tools (including IT), such as knowing of the existence of various tools for mathematical activity. A significant challenge in mathematics learning is teaching students abstract concepts. Here, technology in the form of mathematics learning software can be used more widely to embed abstract concepts in mathematics. In mathematics learning, mathematical software can make high-level mathematical activity easier for students to grasp. Technology can strengthen student learning by delivering numerical, graphic, and symbolic content without the time otherwise spent calculating complex computing problems manually. The purpose of this research is to design and develop software-assisted teaching materials for numerical methods. The development process starts with the defining step; the learning material is then designed based on information obtained from the early analysis of learners, materials, and supporting tasks; the final step is development. The resulting software-assisted teaching materials for numerical methods are valid in content, and the validators' assessment of the materials is good, so they can be used with little revision.
NASA Astrophysics Data System (ADS)
Perez, J. C.; Chandran, B. D. G.
2016-12-01
As Solar Probe Plus (SPP) explores the near-Sun environment, our ability to obtain meaningful interpretation of in-situ measurements faces two significant challenges. The first challenge is that the Taylor Hypothesis (TH), which is normally used in the interpretation of existing spacecraft data, breaks down at the low heliocentric distances that SPP mission will explore. The second challenge is our limited understanding of turbulence in this region, largely due to the theoretical and numerical difficulties in modeling this problem. In this work we present recent progress towards overcoming these challenges using high-resolution numerical simulations of Alfvenic turbulence in the inner heliosphere. We fly virtual SPP spacecraft in the simulation domain to obtain single-point measurements of the velocity and magnetic field fluctuations at several radial locations relevant to SPP. We use these virtual measurements to 1) validate a recently introduced modified TH that allows one to recover the spatial structure of the dominant (outward-propagating) Alfvenic fluctuations, of the kind SPP will encounter; and 2) to compare these virtual observations with our most recent phenomenological models of reflection-driven Alfven turbulence.
The existing situation and challenges regarding the use of plastic carrier bags in Europe.
Kasidoni, Maria; Moustakas, Konstantinos; Malamis, Dimitris
2015-05-01
Since day one, retailers and consumers have favoured plastic carrier bags. However, owing to their numerous environmental disadvantages, lightweight plastic carrier bags have drawn the attention of the European Union competent authorities. Therefore, many European Union member states have taken action to reduce the use of plastic carrier bags. Based on the existing legislation and voluntary initiatives for the reduction of lightweight plastic carrier bags, the challenges and achieved outcomes of the policy options implemented in the various European Union member states are discussed and evaluated with regard to the forthcoming transposition of the 'Directive 94/62/EC on packaging and packaging waste to reduce the consumption of lightweight plastic carrier bags' into the member states' national law. © The Author(s) 2015.
Man, mind, and machine: the past and future of virtual reality simulation in neurologic surgery.
Robison, R Aaron; Liu, Charles Y; Apuzzo, Michael L J
2011-11-01
To review virtual reality in neurosurgery, including the history of simulation and virtual reality and some of the current implementations; to examine some of the technical challenges involved; and to propose a potential paradigm for the development of virtual reality in neurosurgery going forward. A search was made on PubMed using key words surgical simulation, virtual reality, haptics, collision detection, and volumetric modeling to assess the current status of virtual reality in neurosurgery. Based on previous results, investigators extrapolated the possible integration of existing efforts and potential future directions. Simulation has a rich history in surgical training, and there are numerous currently existing applications and systems that involve virtual reality. All existing applications are limited to specific task-oriented functions and typically sacrifice visual realism for real-time interactivity or vice versa, owing to numerous technical challenges in rendering a virtual space in real time, including graphic and tissue modeling, collision detection, and direction of the haptic interface. With ongoing technical advancements in computer hardware and graphic and physical rendering, incremental or modular development of a fully immersive, multipurpose virtual reality neurosurgical simulator is feasible. The use of virtual reality in neurosurgery is predicted to change the nature of neurosurgical education, and to play an increased role in surgical rehearsal and the continuing education and credentialing of surgical practitioners. Copyright © 2011 Elsevier Inc. All rights reserved.
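Collision detection, one of the technical challenges the review names, reduces in its simplest broad-phase form to overlap tests between bounding volumes. The sketch below is a generic axis-aligned bounding-box (AABB) test, offered only as an illustration of the broad-phase idea; it is not taken from any of the surveyed simulators:

```python
def aabb_overlap(a_min, a_max, b_min, b_max):
    """Broad-phase collision test: do two 3-D axis-aligned boxes overlap?

    Each box is given by its minimum-corner and maximum-corner coordinates.
    Two boxes overlap iff their extents overlap on every axis; boxes that
    merely touch on a face are counted as overlapping.
    """
    return all(a_min[i] <= b_max[i] and b_min[i] <= a_max[i] for i in range(3))
```

Real surgical simulators refine candidate pairs from such cheap broad-phase tests with exact narrow-phase checks against deformable tissue meshes, which is where the real-time rendering cost discussed above actually lies.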
NASA Astrophysics Data System (ADS)
Xia, Xilin; Liang, Qiuhua; Ming, Xiaodong; Hou, Jingming
2017-05-01
Numerical models solving the full 2-D shallow water equations (SWEs) have been increasingly used to simulate overland flows and better understand the transient flow dynamics of flash floods in a catchment. However, there still exist key challenges that have not yet been resolved for the development of fully dynamic overland flow models, related to (1) the difficulty of maintaining numerical stability and accuracy in the limit of disappearing water depth and (2) inaccurate estimation of velocities and discharges on slopes as a result of strong nonlinearity of friction terms. This paper aims to tackle these key research challenges and present a new numerical scheme for accurately and efficiently modeling large-scale transient overland flows over complex terrains. The proposed scheme features a novel surface reconstruction method (SRM) to correctly compute slope source terms and maintain numerical stability at small water depth, and a new implicit discretization method to handle the highly nonlinear friction terms. The resulting shallow water overland flow model is first validated against analytical and experimental test cases and then applied to simulate a hypothetical rainfall event in the 42 km² Haltwhistle Burn, UK.
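The difficulty with nonlinear friction terms that this abstract describes can be illustrated with a generic point-implicit update for Manning friction. This is a sketch of the standard approach, not the authors' actual discretization: treating the friction source term implicitly turns the update of the unit discharge into a scalar quadratic whose positive root stays bounded as the depth vanishes (a Manning coefficient greater than zero is assumed).

```python
import numpy as np

def implicit_friction_update(q, h, dt, n_manning=0.03, g=9.81, h_min=1e-10):
    """Point-implicit update of unit discharge q (m^2/s) under Manning friction.

    Solves the implicit relation
        q_new = q - dt * g * n^2 * q_new * |q_new| / h**(7/3)
    exactly for |q_new| via the positive root of a quadratic. Unlike an
    explicit update, |q_new| <= |q| always holds, and the update stays
    bounded (q_new -> 0) as the depth h -> 0 instead of blowing up.
    """
    h = np.maximum(h, h_min)                      # clamp vanishing depths
    c = dt * g * n_manning**2 / h**(7.0 / 3.0)    # friction coefficient (> 0)
    absq = np.abs(q)
    # c*x**2 + x - |q| = 0  =>  x = (sqrt(1 + 4*c*|q|) - 1) / (2*c)
    q_new_mag = (np.sqrt(1.0 + 4.0 * c * absq) - 1.0) / (2.0 * c)
    return np.sign(q) * q_new_mag                 # friction preserves flow sign
```

An explicit discretization of the same term can reverse the flow direction or diverge at small depths; the implicit root never exceeds the old magnitude, which is why implicit treatment of friction is the common remedy.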
ERIC Educational Resources Information Center
US Government Accountability Office, 2016
2016-01-01
Enacted in 2014, the Workforce Innovation and Opportunity Act (WIOA) brought numerous changes to existing federal employment and training programs, including requiring the Department of Labor (DOL) and the Department of Education (Education) to implement a common performance accountability system across the six WIOA-designated core programs. WIOA…
USDA-ARS's Scientific Manuscript database
Salmonella colonization of food animals is a concern for animal health and public health as a food safety risk. Various obstacles impede the effort to reduce asymptomatic Salmonella carriage in food animals, including the existence of numerous serovars and the ubiquitous nature of Salmonella. To d...
Yuan Fang; Ge Sun; Peter Caldwell; Steven G. McNulty; Asko Noormets; Jean-Christophe Domec; John King; Zhiqiang Zhang; Xudong Zhang; Guanghui Lin; Guangsheng Zhou; Jingfeng Xiao; Jiquan Chen
2015-01-01
Evapotranspiration (ET) is arguably the most uncertain ecohydrologic variable for quantifying watershed water budgets. Although numerous ET and hydrological models exist, accurately predicting the effects of global change on water use and availability remains challenging because of model deficiency and/or a lack of input parameters. The objective of this study was to...
Rational Emotive Approaches to the Problems of Parents with Exceptional Children: A Brief Overview.
ERIC Educational Resources Information Center
McInerney, John F.
Parents of exceptional children face numerous challenges in their efforts to meet the needs of their child. Reaction to the realization that a problem exists in the child's development or educational achievement may lead to emotional distress which can be self-defeating. Such parents often benefit from a direct approach to addressing these issues…
Hydrogel based cartilaginous tissue regeneration: recent insights and technologies.
Chuah, Yon Jin; Peck, Yvonne; Lau, Jia En Josias; Hee, Hwan Tak; Wang, Dong-An
2017-03-28
Hydrogels have been extensively employed as an attractive biomaterial to address numerous existing challenges in the fields of regenerative medicine and research because of their unique properties such as the capability to encapsulate cells, high water content, ease of modification, low toxicity, injectability, in situ spatial fit and biocompatibility. These inherent properties have created many opportunities for hydrogels as a scaffold or a cell/drug carrier in tissue regeneration, especially in the field of cartilaginous tissue such as articular cartilage and intervertebral discs. A concise overview of the anatomy/physiology of these cartilaginous tissues and their pathophysiology, epidemiology and existing clinical treatments will be briefly described. This review article will discuss the current state-of-the-art of various polymers and developing strategies that are explored in establishing different technologies for cartilaginous tissue regeneration. In particular, an innovative approach to generate scaffold-free cartilaginous tissue via a transient hydrogel scaffolding system for disease modeling to pre-clinical trials will be examined. Following that, the article reviews numerous hydrogel-based medical implants used in clinical treatment of osteoarthritis and degenerated discs. Last but not least, the challenges and future directions of hydrogel based medical implants in the regeneration of cartilaginous tissue are also discussed.
Hall, William A; Bergom, Carmen; Thompson, Reid F; Baschnagel, Andrew M; Vijayakumar, Srinivasan; Willers, Henning; Li, X Allen; Schultz, Christopher J; Wilson, George D; West, Catharine M L; Capala, Jacek; Coleman, C Norman; Torres-Roca, Javier F; Weidhaas, Joanne; Feng, Felix Y
2018-06-01
To summarize important talking points from a 2016 symposium focusing on real-world challenges to advancing precision medicine in radiation oncology, and to help radiation oncologists navigate the practical challenges of precision radiation oncology. The American Society for Radiation Oncology, American Association of Physicists in Medicine, and National Cancer Institute cosponsored a meeting on precision medicine in radiation oncology. In June 2016, numerous scientists, clinicians, and physicists convened at the National Institutes of Health to discuss challenges and future directions toward personalized radiation therapy. Various breakout sessions were held to discuss particular components and approaches to the implementation of personalized radiation oncology. This article summarizes the genomically guided radiation therapy breakout session. A summary of existing genomic data enabling personalized radiation therapy, ongoing clinical trials, current challenges, and future directions was collected. The group attempted to provide both a current overview of data that radiation oncologists could use to personalize therapy, along with data that are anticipated in the coming years. It seems apparent from the provided review that a considerable opportunity exists to truly bring genomically guided radiation therapy into clinical reality. Genomically guided radiation therapy is a necessity that must be embraced in the coming years. Incorporating these data into treatment recommendations will provide radiation oncologists with a substantial opportunity to improve outcomes for numerous cancer patients. More research focused on this topic is needed to bring genomic signatures into routine standard of care. Published by Elsevier Inc.
Exploring the Case for a Global Alliance for Medical Diagnostics Initiative
Mugambi, Melissa L.; Palamountain, Kara M.; Gallarda, Jim; Drain, Paul K.
2017-01-01
In recent years, the private and public sectors have increased investments in medical diagnostics for low- and middle-income countries (LMICs). Despite these investments, numerous barriers prevent the adoption of existing diagnostics and discourage the development and introduction of new diagnostics in LMICs. In the late 1990s, the global vaccine community had similar challenges, as vaccine coverage rates stagnated and the introduction of new vaccines was viewed as a distraction to delivering existing vaccines. To address these challenges, the international community came together and formed the Global Alliance for Vaccines Initiative (GAVI). Sixteen years after the formation of GAVI, we see evidence of a healthier global vaccine landscape. We discuss how GAVI’s four guiding principles (product, health systems strengthening, financing and market shaping) might apply to the advancement of medical diagnostics in LMICs. We present arguments for the international community and existing organizations to establish a Global Alliance for Medical Diagnostics Initiative (GAMDI). PMID:28134750
NASA Astrophysics Data System (ADS)
Xie, Dexuan
2014-10-01
The Poisson-Boltzmann equation (PBE) is a widely used implicit-solvent continuum model for calculating the electrostatic potential energy of biomolecules in ionic solvent, but its numerical solution remains a challenge due to the strong singularity and nonlinearity caused by its singular distribution source terms and exponential nonlinear terms. To deal effectively with this challenge, new solution decomposition and minimization schemes are proposed in this paper, together with a new PBE analysis of solution existence and uniqueness. Moreover, a PBE finite element program package is developed in Python based on the FEniCS program library and GAMer, a molecular surface and volumetric mesh generation program package. Numerical tests on proteins and a nonlinear Born ball model with an analytical solution validate the new solution decomposition and minimization schemes, and demonstrate the effectiveness and efficiency of the new PBE finite element program package.
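The exponential nonlinearity that makes the PBE hard to solve can already be seen in a one-dimensional model problem. The sketch below is a generic Newton linearization on finite differences, solving -u'' + kappa^2*sinh(u) = 0 with Dirichlet data; the 1-D reduction and all names are illustrative and have nothing to do with the paper's actual FEniCS-based decomposition and minimization schemes.

```python
import numpy as np

def solve_pbe_1d(kappa=1.0, n=101, u_left=1.0, u_right=0.0, tol=1e-10, max_iter=50):
    """Newton's method for a 1-D model nonlinear Poisson-Boltzmann problem:

        -u'' + kappa^2 * sinh(u) = 0  on (0, 1),  u(0)=u_left, u(1)=u_right.

    Finite differences on a uniform grid; sinh(u) is the same exponential
    nonlinearity that challenges full 3-D PBE solvers.
    """
    h = 1.0 / (n - 1)
    u = np.linspace(u_left, u_right, n)        # initial guess: linear profile
    for _ in range(max_iter):
        # residual F(u) at interior nodes, with -u'' discretized centrally
        F = (-(u[:-2] - 2 * u[1:-1] + u[2:]) / h**2
             + kappa**2 * np.sinh(u[1:-1]))
        if np.max(np.abs(F)) < tol:
            break
        # tridiagonal Jacobian: diag 2/h^2 + kappa^2*cosh(u_i), off-diag -1/h^2
        m = n - 2
        J = np.zeros((m, m))
        np.fill_diagonal(J, 2.0 / h**2 + kappa**2 * np.cosh(u[1:-1]))
        idx = np.arange(m - 1)
        J[idx, idx + 1] = -1.0 / h**2
        J[idx + 1, idx] = -1.0 / h**2
        u[1:-1] -= np.linalg.solve(J, F)       # Newton step on interior nodes
    return u
```

Because cosh(u) grows exponentially, the Jacobian stiffens rapidly for large potentials; the solution-decomposition idea in the paper exists precisely to keep the singular and nonlinear parts of the 3-D problem tractable.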
ERIC Educational Resources Information Center
Ogunleye, Ayodele; Owolabi, Tunde; Adeyemo, Sunday
2013-01-01
In recent times, the role of entrepreneurs has been recognized to be of great significance in accelerating the pace of growth of economic development of any country. Internet-enabled technologies have also challenged existing business models in numerous market sectors and offered innovation opportunities to a variety of stakeholders--not least…
Numerical Simulation of Black Holes
NASA Astrophysics Data System (ADS)
Teukolsky, Saul
2003-04-01
Einstein's equations of general relativity are prime candidates for numerical solution on supercomputers. There is some urgency in being able to carry out such simulations: Large-scale gravitational wave detectors are now coming on line, and the most important expected signals cannot be predicted except numerically. Problems involving black holes are perhaps the most interesting, yet also particularly challenging computationally. One difficulty is that inside a black hole there is a physical singularity that cannot be part of the computational domain. A second difficulty is the disparity in length scales between the size of the black hole and the wavelength of the gravitational radiation emitted. A third difficulty is that all existing methods of evolving black holes in three spatial dimensions are plagued by instabilities that prohibit long-term evolution. I will describe the ideas that are being introduced in numerical relativity to deal with these problems, and discuss the results of recent calculations of black hole collisions.
Computational Astrophysical Magnetohydrodynamics
NASA Astrophysics Data System (ADS)
Norman, M. L.
1994-05-01
Cosmic magnetic fields have intrigued and vexed astrophysicists seeking to understand their complex dynamics in a wide variety of astronomical settings. Magnetic fields are believed to play an important role in regulating star formation in molecular clouds, providing an effective viscosity in accretion disks, accelerating astrophysical jets, and influencing the large scale structure of the ISM of disk galaxies. Radio observations of supernova remnants and extragalactic radio jets prove that magnetic fields are fundamentally linked to astrophysical particle acceleration. Magnetic fields exist on cosmological scales as shown by the existence of radio halos in clusters of galaxies. Theoretical investigation of these and other phenomena requires numerical simulations due to the inherent complexity of MHD, but until now neither the computer power nor the numerical algorithms existed to mount a serious attack on the most important problems. That has now changed. Advances in parallel computing and numerical algorithms now permit the simulation of fully nonlinear, time-dependent astrophysical MHD in 2D and 3D. In this talk, I will describe the ZEUS codes for astrophysical MHD developed at the Laboratory for Computational Astrophysics (LCA) at the University of Illinois. These codes are now available to the national community. The numerical algorithms and the test suite used to validate them are briefly discussed. Several applications of ZEUS to the topics listed above are presented. An extension of ZEUS to model ambipolar diffusion in weakly ionized plasmas is illustrated. I discuss how continuing exponential growth in computer power and new numerical algorithms under development will allow us to tackle two grand challenges: compressible MHD turbulence and relativistic MHD. This work is partially supported by grants NSF AST-9201113 and NASA NAG 5-2493.
Enteral Formulas in Nutrition Support Practice: Is There a Better Choice for Your Patient?
Escuro, Arlene A; Hummell, A Christine
2016-12-01
Over the past few decades, the number of enteral formulas for use in hospitalized, critically ill, and home enteral patients has dramatically increased. Several enteral nutrition (EN) formula categories exist, which makes it challenging for clinicians to sort through the product claims and find the appropriate formula for the patient. Many formulas are available within each category, some of which may be significantly different from one another. Numerous systematic reviews of existing research and clinical practice guidelines evaluate the use of specialty formulas. This review aims to examine the differences in various enteral formula categories, identify applications in clinical practice, and evaluate the existing evidence and guideline recommendations for use of specific types of enteral formulas.
Liu, Jinxuan; Wöll, Christof
2017-10-02
Surface-supported metal-organic framework thin films are receiving increasing attention as a novel form of nanotechnology. New deposition techniques that enable the control of the film thickness, homogeneity, morphology, and dimensions with a huge number of metal-organic framework compounds offer tremendous opportunities in a number of different application fields. In response to increasing demands for environmental sustainability and cleaner energy, much effort in recent years has been devoted to the development of MOF thin films for applications in photovoltaics, CO₂ reduction, energy storage, water splitting, and electronic devices, as well as for the fabrication of membranes. Although existing applications are promising and encouraging, MOF thin films still face numerous challenges, including the need for a more thorough understanding of the thin-film growth mechanism, stability of the internal and external interfaces, strategies for doping and models for charge carrier transport. In this paper, we review the recent advances in MOF thin films, including fabrication and patterning strategies and existing nanotechnology applications. We conclude by listing the most attractive future opportunities as well as the most urgent challenges.
Benchmark Problems of the Geothermal Technologies Office Code Comparison Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Mark D.; Podgorney, Robert; Kelkar, Sharad M.
A diverse suite of numerical simulators is currently being applied to predict or understand the performance of enhanced geothermal systems (EGS). To build confidence and identify critical development needs for these analytical tools, the United States Department of Energy, Geothermal Technologies Office has sponsored a Code Comparison Study (GTO-CCS), with participants from universities, industry, and national laboratories. A principal objective for the study was to create a community forum for improvement and verification of numerical simulators for EGS modeling. Teams participating in the study were those representing U.S. national laboratories, universities, and industries, and each team brought unique numerical simulation capabilities to bear on the problems. Two classes of problems were developed during the study, benchmark problems and challenge problems. The benchmark problems were structured to test the ability of the collection of numerical simulators to solve various combinations of coupled thermal, hydrologic, geomechanical, and geochemical processes. This class of problems was strictly defined in terms of properties, driving forces, initial conditions, and boundary conditions. Study participants submitted solutions to problems for which their simulation tools were deemed capable or nearly capable. Some participating codes were originally developed for EGS applications whereas some others were designed for different applications but can simulate processes similar to those in EGS. Solution submissions from both were encouraged. In some cases, participants made small incremental changes to their numerical simulation codes to address specific elements of the problem, and in other cases participants submitted solutions with existing simulation tools, acknowledging the limitations of the code. The challenge problems were based on the enhanced geothermal systems research conducted at Fenton Hill, near Los Alamos, New Mexico, between 1974 and 1995.
The problems covered two phases of research, involving stimulation, development, and circulation in two separate reservoirs. The challenge problems posed specific questions to be answered via numerical simulation in three topical areas: 1) reservoir creation/stimulation, 2) reactive and passive transport, and 3) thermal recovery. Whereas the benchmark class of problems was designed to test capabilities for modeling coupled processes under strictly specified conditions, the stated objective for the challenge class of problems was to demonstrate what new understanding of the Fenton Hill experiments could be realized via the application of modern numerical simulation tools by recognized expert practitioners.
Orbiter entry aerothermodynamics
NASA Technical Reports Server (NTRS)
Ried, R. C.
1985-01-01
The challenge of defining the entry aerothermodynamic environment, driven by the requirement for a reliable and reusable Orbiter, is reviewed in light of the existing technology. Select problems pertinent to the Orbiter development are discussed with reference to comprehensive treatments. These problems include boundary layer transition, leeward-side heating, shock/shock interaction scaling, tile gap heating, and nonequilibrium effects such as surface catalysis. Sample measurements obtained from test flights of the Orbiter are presented with comparison to preflight expectations. Numerical and wind tunnel simulations provided efficient means for defining the entry environment and an adequate level of preflight confidence. The high-quality flight data provide an opportunity to refine the operational capability of the Orbiter and serve as a benchmark both for the development of aerothermodynamic technology and for use in meeting future entry heating challenges.
[Reimbursement of health apps by the German statutory health insurance].
Gregor-Haack, Johanna
2018-03-01
A reimbursement category for "apps" does not exist in German statutory health insurance. Nevertheless, different routes to reimbursement of digital health care products or processes do exist. This article provides an overview and a description of the most relevant financing and reimbursement categories for apps in German statutory health insurance. The legal qualifications and preconditions for reimbursement are discussed in the context of single contracts with one health insurance fund as well as collective contracts with national statutory health insurance funds. A general outline is particularly valuable given the numerous new players and products in the health care market. The article highlights that health apps can challenge existing legal market access and reimbursement criteria and paths. At the same time, these criteria and paths do exist; in terms of a learning system, they need to be met and followed.
Mobile Multicast in Hierarchical Proxy Mobile IPV6
NASA Astrophysics Data System (ADS)
Hafizah Mohd Aman, Azana; Hashim, Aisha Hassan A.; Mustafa, Amin; Abdullah, Khaizuran
2013-12-01
Mobile Internet Protocol Version 6 (MIPv6) environments have been developing very rapidly. Many challenges arise with the fast progress of MIPv6 technologies and its environment. Therefore the importance of improving the existing architecture and operations increases. One of the many challenges which need to be addressed is the need for performance improvement to support mobile multicast. Numerous approaches have been proposed to improve mobile multicast performance. This includes Context Transfer Protocol (CXTP), Hierarchical Mobile IPv6 (HMIPv6), Fast Mobile IPv6 (FMIPv6) and Proxy Mobile IPv6 (PMIPv6). This document describes multicast context transfer in hierarchical proxy mobile IPv6 (H-PMIPv6) to provide better multicasting performance in PMIPv6 domain.
Novel Numerical Approaches to Loop Quantum Cosmology
NASA Astrophysics Data System (ADS)
Diener, Peter
2015-04-01
Loop Quantum Gravity (LQG) is an (as yet incomplete) approach to the quantization of gravity. When applied to symmetry reduced cosmological spacetimes (Loop Quantum Cosmology or LQC) one of the predictions of the theory is that the Big Bang is replaced by a Big Bounce, i.e. a previously existing contracting universe underwent a bounce at finite volume before becoming our expanding universe. The evolution equations of LQC take the form of difference equations (with the discretization given by the theory) that in the large volume limit can be approximated by partial differential equations (PDEs). In this talk I will first discuss some of the unique challenges encountered when trying to numerically solve these difference equations. I will then present some of the novel approaches that have been employed to overcome the challenges. I will here focus primarily on the Chimera scheme that takes advantage of the fact that the LQC difference equations can be approximated by PDEs in the large volume limit. I will finally also briefly discuss some of the results that have been obtained using these numerical techniques by performing simulations in regions of parameter space that were previously unreachable. This work is supported by a grant from the John Templeton Foundation and by NSF grant PHYS1068743.
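The core numerical difficulty described above is that the lattice step in volume is fixed by the theory, so resolution cannot be refined at will; stability then forces very small time steps at large volume, which is what motivates hybrid schemes like Chimera. A minimal toy sketch of leapfrog evolution of such a difference equation (this is not the actual LQC Hamiltonian constraint; the lattice step, coefficient `C(v)`, and initial data are illustrative placeholders):

```python
import numpy as np

# Toy stand-in for an LQC-style evolution: a wave-like difference equation
#   psi_tt(v) = C(v) * [psi(v+dv) - 2*psi(v) + psi(v-dv)] / dv**2
# where the volume step dv is FIXED by the theory, not a numerical choice.

dv = 4.0                            # lattice step dictated by the (toy) theory
v = np.arange(dv, 4000.0, dv)       # volume lattice, avoiding v = 0
C = v**2                            # placeholder coefficient, growing with volume
dt = 0.4 * dv / np.sqrt(C.max())    # CFL stability forces tiny steps at large v

psi = np.exp(-((v - 2000.0) ** 2) / (2.0 * 100.0**2))  # Gaussian packet
psi_prev = psi.copy()               # packet initially at rest

def lattice_laplacian(f):
    out = np.zeros_like(f)
    out[1:-1] = f[2:] - 2.0 * f[1:-1] + f[:-2]
    return out

for _ in range(2000):               # leapfrog time stepping
    psi_next = (2.0 * psi - psi_prev
                + (dt / dv) ** 2 * C * lattice_laplacian(psi))
    psi_prev, psi = psi, psi_next
```

In the large-volume limit this difference equation is well approximated by a PDE, which a scheme in the spirit of Chimera can exploit by switching to a (cheaper, refinable) PDE solver in the outer region.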
Planning for large epidemics and pandemics: challenges from a policy perspective.
Jain, Vageesh; Duse, Adriano; Bausch, Daniel G
2018-05-24
Less than two decades into the 21st century, the world has already witnessed numerous large epidemics or pandemics. These events have highlighted inadequacies in both national and international capacity for outbreak prevention, detection, and response. Here, we review some of the major challenges from a policy perspective. The most important challenges facing policymakers include financing outbreak preparedness and response in a complex political environment with limited resources, coordinating response efforts among a growing and diverse range of national and international actors, accurately assessing national outbreak preparedness, addressing the shortfall in the global biomedical workforce, building surge capacity of both human and material resources, balancing investments in public health and curative services, building capacity for outbreak-related research and development, and reinforcing measures for infection prevention and control. In recent years, numerous epidemics and pandemics have caused not only considerable loss of life but also billions of dollars of economic loss. Although the events have served as a wake-up call and led to the implementation of relevant policies and counter-measures, such as the Global Health Security Agenda, many questions remain and much work remains to be done. Wise policies and approaches for outbreak control exist, but will require the political will to implement them.
Management of intracerebral hemorrhage.
Thabet, A M; Kottapally, M; Hemphill, J Claude
2017-01-01
Intracerebral hemorrhage (ICH) is a potentially devastating neurologic injury representing 10-15% of stroke cases in the USA each year. Numerous risk factors, including age, hypertension, male gender, coagulopathy, genetic susceptibility, and ethnic descent, have been identified. Timely identification, workup, and management of this condition remain a challenge for clinicians as numerous factors can present obstacles to achieving good functional outcomes. Several large clinical trials have been conducted over the prior decade regarding medical and surgical interventions. However, no specific treatment has shown a major impact on clinical outcome. Current management guidelines do exist based on medical evidence and consensus and these provide a framework for care. While management of hypertension and coagulopathy are generally considered basic tenets of ICH management, a variety of measures for surgical hematoma evacuation, intracranial pressure control, and intraventricular hemorrhage can be further pursued in the emergent setting for selected patients. The complexity of management in parenchymal cerebral hemorrhage remains challenging and offers many areas for further investigation. A systematic approach to the background, pathology, and early management of spontaneous parenchymal hemorrhage is provided. © 2017 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zou, Ling; Zhao, Haihua; Zhang, Hongbin
2016-04-01
The phase appearance/disappearance issue presents serious numerical challenges in two-phase flow simulations. Many existing reactor safety analysis codes use different kinds of treatments for the phase appearance/disappearance problem. However, to our best knowledge, there are no fully satisfactory solutions. Additionally, the majority of the existing reactor system analysis codes were developed using low-order numerical schemes in both space and time. In many situations, it is desirable to use high-resolution spatial discretization and fully implicit time integration schemes to reduce numerical errors. In this work, we adapted a high-resolution spatial discretization scheme on staggered grid mesh and fully implicit time integration methods (such as BDF1 and BDF2) to solve the two-phase flow problems. The discretized nonlinear system was solved by the Jacobian-free Newton Krylov (JFNK) method, which does not require the derivation and implementation of an analytical Jacobian matrix. These methods were tested with a few two-phase flow problems with phase appearance/disappearance phenomena considered, such as a linear advection problem, an oscillating manometer problem, and a sedimentation problem. The JFNK method demonstrated extremely robust and stable behaviors in solving the two-phase flow problems with phase appearance/disappearance. No special treatments such as water level tracking or void fraction limiting were used. High-resolution spatial discretization and second-order fully implicit methods also demonstrated their capabilities in significantly reducing numerical errors.
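The JFNK idea described above can be sketched with SciPy's `newton_krylov`: a fully implicit (BDF1, i.e. backward Euler) step for a toy stiff system, where the Krylov solver only ever evaluates the residual and builds finite-difference Jacobian-vector products internally. The two-equation system is a stand-in, not the two-phase flow model of the abstract:

```python
import numpy as np
from scipy.optimize import newton_krylov

# One fully implicit BDF1 step for du/dt = f(u), solved Jacobian-free:
# no analytical Jacobian is derived or stored anywhere.

def f(u):
    return np.array([-50.0 * u[0] * (u[0] - 1.0),  # stiff logistic-type term
                     u[0] - u[1]])                 # simple relaxation term

dt = 0.1
u_old = np.array([0.2, 0.0])

def residual(u_new):
    # BDF1 residual: (u_new - u_old)/dt - f(u_new) = 0
    return (u_new - u_old) / dt - f(u_new)

u_new = newton_krylov(residual, u_old, f_tol=1e-9)
```

A BDF2 step would simply combine two previous states in the residual; the Jacobian-free Newton-Krylov machinery is unchanged, which is a large part of its appeal for codes with complex closure relations.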
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lipnikov, Konstantin; Moulton, David; Svyatskiy, Daniil
2016-04-29
We develop a new approach for solving the nonlinear Richards’ equation arising in variably saturated flow modeling. The growing complexity of geometric models for simulation of subsurface flows leads to the necessity of using unstructured meshes and advanced discretization methods. Typically, a numerical solution is obtained by first discretizing PDEs and then solving the resulting system of nonlinear discrete equations with a Newton-Raphson-type method. Efficiency and robustness of the existing solvers rely on many factors, including an empiric quality control of intermediate iterates, complexity of the employed discretization method and a customized preconditioner. We propose and analyze a new preconditioning strategy that is based on a stable discretization of the continuum Jacobian. We will show with numerical experiments for challenging problems in subsurface hydrology that this new preconditioner improves convergence of the existing Jacobian-free solvers 3-20 times. Furthermore, we show that the Picard method with this preconditioner becomes a more efficient nonlinear solver than a few widely used Jacobian-free solvers.
How to identify dislocations in molecular dynamics simulations?
NASA Astrophysics Data System (ADS)
Li, Duo; Wang, FengChao; Yang, ZhenYu; Zhao, YaPu
2014-12-01
Dislocations are of great importance in revealing the underlying mechanisms of deformed solid crystals. With the development of computational facilities and technologies, observation of dislocations at the atomic level through numerical simulations has become possible. Molecular dynamics (MD) simulation suggests itself as a powerful tool for understanding and visualizing the creation of dislocations as well as the evolution of crystal defects. However, the numerical results from large-scale MD simulations are not very illuminating by themselves, and various techniques exist for analyzing dislocations and the deformed crystal structures. Thus, it is a big challenge for beginners in this community to choose a proper method to start their investigations. In this review, we summarized and discussed up to twelve existing structure characterization methods in MD simulations of deformed crystal solids. A comprehensive comparison was made between the advantages and disadvantages of these typical techniques. We also examined some of the recent advances in the dynamics of dislocations related to hydraulic fracturing. It was found that the dislocation emission has a significant effect on the propagation and bifurcation of the crack tip in hydraulic fracturing.
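One of the standard structure characterization methods surveyed in such reviews is the centrosymmetry parameter (CSP): for an atom with N nearest neighbours (N = 12 in FCC), pair the most nearly opposite neighbour bonds and sum |r_i + r_j|^2, so CSP is ~0 in a perfect centrosymmetric lattice and grows near dislocations, stacking faults and surfaces. A minimal sketch (the greedy pairing below is a simple approximation of the usual minimum-pairing definition):

```python
import numpy as np

def centrosymmetry(bonds):
    """bonds: (N, 3) vectors from the central atom to its N nearest neighbours."""
    n = len(bonds)
    # all candidate pairs, ranked by how close to anti-parallel they are
    candidates = sorted(
        (float(np.sum((bonds[i] + bonds[j]) ** 2)), i, j)
        for i in range(n) for j in range(i + 1, n)
    )
    used, csp = set(), 0.0
    for s, i, j in candidates:          # greedily pair opposite bonds first
        if i not in used and j not in used:
            used.update((i, j))
            csp += s
        if len(used) == n:
            break
    return csp

# perfect FCC first shell: 12 neighbours of type (+-a/2, +-a/2, 0) etc.
a = 1.0
shell = np.array([(u, v, 0.0) for u in (-a/2, a/2) for v in (-a/2, a/2)]
                 + [(u, 0.0, v) for u in (-a/2, a/2) for v in (-a/2, a/2)]
                 + [(0.0, u, v) for u in (-a/2, a/2) for v in (-a/2, a/2)])

rng = np.random.default_rng(0)
csp_perfect = centrosymmetry(shell)     # exactly 0 for the ideal shell
csp_distorted = centrosymmetry(shell + 0.1 * rng.standard_normal((12, 3)))
```

In practice one computes CSP per atom over the whole MD snapshot and thresholds it to flag defective atoms; tools such as OVITO implement this and the other methods compared in the review.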
NASA Astrophysics Data System (ADS)
Khademian, Amir; Abdollahipour, Hamed; Bagherpour, Raheb; Faramarzi, Lohrasb
2017-10-01
In addition to numerous planning and execution challenges, underground excavation in urban areas is always accompanied by destructive effects, especially on the ground surface; ground settlement is the most important of these effects, and different empirical, analytical and numerical methods exist for estimating it. Since geotechnical models are associated with considerable model uncertainty, this study characterized the model uncertainty of settlement estimation models through a systematic comparison between model predictions and past performance data derived from instrumentation. To do so, the amount of surface settlement induced by excavation of the Qom subway tunnel was estimated via empirical (Peck), analytical (Loganathan and Poulos) and numerical (FDM) methods; the resulting maximum settlement values of the models were 1.86, 2.02 and 1.52 cm, respectively. The comparison of these predicted amounts with the actual data from instrumentation was employed to specify the uncertainty of each model. The numerical model outcomes, with a relative error of 3.8%, best matched reality, while the analytical method, with a relative error of 27.8%, yielded the highest level of model uncertainty.
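The empirical (Peck) method referenced above models the transverse settlement trough as a Gaussian, S(x) = S_max exp(-x^2 / (2 i^2)), with x the horizontal offset from the tunnel axis and i the trough-width parameter. A small sketch, with illustrative numbers: the trough width of 10 m is assumed, and the 1.58 cm "measured" value is inferred from the quoted 3.8% relative error of the 1.52 cm numerical prediction, not taken from the paper:

```python
import numpy as np

def peck_settlement(x, s_max, i_width):
    # Peck's empirical Gaussian settlement trough
    return s_max * np.exp(-x**2 / (2.0 * i_width**2))

def relative_error(predicted, measured):
    # the comparison metric used to rank the three methods
    return abs(predicted - measured) / measured

x = np.linspace(-30.0, 30.0, 121)                       # metres from the axis
trough = peck_settlement(x, s_max=1.86, i_width=10.0)   # cm, illustrative

print(round(relative_error(1.52, 1.58), 3))             # 0.038, i.e. 3.8%
```

The analytical (Loganathan and Poulos) and numerical (FDM) estimates would be compared against instrumentation data with the same relative-error metric.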
Exploring the Concept of HIV-Related Stigma
Florom-Smith, Aubrey L.; De Santis, Joseph P.
2013-01-01
BACKGROUND HIV infection is a chronic, manageable illness. Despite advances in the care and treatment of people living with HIV infection, HIV-related stigma remains a challenge to HIV testing, care, and prevention. Numerous studies have documented the impact of HIV-related stigma among various groups of people living with HIV infection, but the concept of HIV-related stigma remains unclear. PURPOSE Concept exploration of HIV-related stigma via an integrative literature review was conducted in order to examine the existing knowledge base of this concept. METHODS Search engines were employed to review the existing knowledge base of this concept. CONCLUSION After the integrative literature review, an analysis of HIV-related stigma emerged. Implications for future concept analysis, research, and practice are included. PMID:22861652
UltraPse: A Universal and Extensible Software Platform for Representing Biological Sequences.
Du, Pu-Feng; Zhao, Wei; Miao, Yang-Yang; Wei, Le-Yi; Wang, Likun
2017-11-14
With the avalanche of biological sequences in public databases, one of the most challenging problems in computational biology is to predict their biological functions and cellular attributes. Most of the existing prediction algorithms can only handle fixed-length numerical vectors. Therefore, it is important to be able to represent biological sequences with various lengths using fixed-length numerical vectors. Although several algorithms, as well as software implementations, have been developed to address this problem, these existing programs can only provide a fixed number of representation modes. Every time a new sequence representation mode is developed, a new program will be needed. In this paper, we propose the UltraPse as a universal software platform for this problem. The function of the UltraPse is not only to generate various existing sequence representation modes, but also to simplify all future programming works in developing novel representation modes. The extensibility of UltraPse is particularly enhanced. It allows the users to define their own representation mode, their own physicochemical properties, or even their own types of biological sequences. Moreover, UltraPse is also the fastest software of its kind. The source code package, as well as the executables for both Linux and Windows platforms, can be downloaded from the GitHub repository.
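The core idea of sequence representation modes is easy to illustrate: map a variable-length sequence to a fixed-length numerical vector so that downstream predictors can consume uniform inputs. A minimal sketch using k-mer composition for DNA (this illustrates the general principle only; UltraPse's own modes, such as pseudo-composition variants, are richer and user-extensible):

```python
from itertools import product

def kmer_vector(seq, k=2, alphabet="ACGT"):
    """Map a DNA sequence of any length to a fixed 4**k-dimensional vector."""
    kmers = ["".join(p) for p in product(alphabet, repeat=k)]
    index = {m: i for i, m in enumerate(kmers)}
    vec = [0.0] * len(kmers)
    for i in range(len(seq) - k + 1):
        vec[index[seq[i:i + k]]] += 1.0
    total = sum(vec) or 1.0
    return [v / total for v in vec]     # normalise so vectors are comparable

v1 = kmer_vector("ACGTACGT")            # 16-dimensional regardless of length
v2 = kmer_vector("ACGTT")               # also 16-dimensional
```

An extensible platform generalizes exactly this pattern: user-defined alphabets (sequence types), user-defined per-residue properties, and user-defined reduction of counts to a fixed-length vector.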
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dolly, S; Chen, H; Mutic, S
Purpose: A persistent challenge for the quality assessment of radiation therapy treatments (e.g. contouring accuracy) is the absence of the known, ground truth for patient data. Moreover, assessment results are often patient-dependent. Computer simulation studies utilizing numerical phantoms can be performed for quality assessment with a known ground truth. However, previously reported numerical phantoms do not include the statistical properties of inter-patient variations, as their models are based on only one patient. In addition, these models do not incorporate tumor data. In this study, a methodology was developed for generating numerical phantoms which encapsulate the statistical variations of patients within radiation therapy, including tumors. Methods: Based on previous work in contouring assessment, geometric attribute distribution (GAD) models were employed to model both the deterministic and stochastic properties of individual organs via principal component analysis. Using pre-existing radiation therapy contour data, the GAD models are trained to model the shape and centroid distributions of each organ. Then, organs with different shapes and positions can be generated by assigning statistically sound weights to the GAD model parameters. Organ contour data from 20 retrospective prostate patient cases were manually extracted and utilized to train the GAD models. As a demonstration, computer-simulated CT images of generated numerical phantoms were calculated and assessed subjectively and objectively for realism. Results: A cohort of numerical phantoms of the male human pelvis was generated. CT images were deemed realistic both subjectively and objectively in terms of image noise power spectrum. Conclusion: A methodology has been developed to generate realistic numerical anthropomorphic phantoms using pre-existing radiation therapy data.
The GAD models guarantee that generated organs span the statistical distribution of observed radiation therapy patients, according to the training dataset. The methodology enables radiation therapy treatment assessment with multi-modality imaging and a known ground truth, and without patient-dependent bias.
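The PCA-based shape modeling at the heart of this approach can be sketched compactly: learn a mean shape and principal modes from training contours, then generate new, statistically plausible shapes by sampling bounded mode weights. The training data below is synthetic (noisy ellipses standing in for organ contours), and the GAD models themselves include more than this plain PCA sketch:

```python
import numpy as np

rng = np.random.default_rng(42)
n_cases, n_pts = 20, 50
theta = np.linspace(0, 2 * np.pi, n_pts, endpoint=False)

# synthetic "organ" contours: ellipses with random radii plus noise,
# flattened to (n_cases, 2*n_pts) shape vectors
shapes = np.stack([
    np.concatenate([(3 + rng.normal(0, 0.3)) * np.cos(theta),
                    (2 + rng.normal(0, 0.3)) * np.sin(theta)])
    + rng.normal(0, 0.05, 2 * n_pts)
    for _ in range(n_cases)
])

mean = shapes.mean(axis=0)
centered = shapes - mean
U, s, Vt = np.linalg.svd(centered, full_matrices=False)   # PCA via SVD
modes = Vt                                # principal shape modes (rows)
stddev = s / np.sqrt(n_cases - 1)         # per-mode standard deviations

def generate(n_modes=3, z=None):
    """New shape: mean + sum_k z_k * stddev_k * mode_k, weights clipped to +-2."""
    if z is None:
        z = np.clip(rng.normal(size=n_modes), -2.0, 2.0)
    return mean + (z * stddev[:n_modes]) @ modes[:n_modes]

new_shape = generate()                    # one sampled contour, length 2*n_pts
```

Clipping the weights keeps generated organs within the statistical range of the training cohort, which is the property the abstract describes as "statistically sound".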
NASA Technical Reports Server (NTRS)
Chronis, Themis; Case, Jonathan L.; Papadopoulos, Anastasios; Anagnostou, Emmanouil N.; Mecikalski, John R.; Haines, Stephanie L.
2008-01-01
Forecasting atmospheric and oceanic circulations accurately over the Eastern Mediterranean has proved to be an exceptional challenge. The existence of fine-scale topographic variability (land/sea coverage) and seasonal dynamics variations can create strong spatial gradients in temperature, wind and other state variables, which numerical models may have difficulty capturing. The Hellenic Center for Marine Research (HCMR) is one of the main operational centers for wave forecasting in the eastern Mediterranean. Currently, HCMR's operational numerical weather/ocean prediction model is based on the coupled Eta/Princeton Ocean Model (POM). Since 1999, HCMR has also operated the POSEIDON floating buoys as a means of state-of-the-art, real-time observation of several oceanic and surface atmospheric variables. This study makes a first attempt at improving both atmospheric and oceanic prediction by initializing a regional Numerical Weather Prediction (NWP) model with high-resolution sea surface temperatures (SST) from remotely sensed platforms in order to capture the small-scale characteristics.
Making a difference: initiating and maintaining a faith-based free health clinic.
Dunn, Linda L
2009-01-01
This article is a summary of the challenges, struggles, and barriers that a group of churches encountered in developing a faith-based free health clinic. From the inception, this clinic has existed for the uninsured whose total household income aligns with the 2009 Federal Poverty Guidelines. A voluntary interview with the executive director of The Good Samaritan Clinic revealed the experiential evolution of this free health clinic. Numerous examples are shared that depict how this clinic has made a difference in the lives of many people.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-28
... existing paragraph (b)(4) of the Rule, entitled ``Numerical Guidelines Applicable to Volatile Market Opens... existing paragraph (b)(2), which provides flexibility to FINRA to use different Numerical Guidelines or... of paragraph (b)(4) (``Numerical Guidelines Applicable to Volatile Market Opens'') of the existing...
Space-Time Conservation Element and Solution Element Method Being Developed
NASA Technical Reports Server (NTRS)
Chang, Sin-Chung; Himansu, Ananda; Jorgenson, Philip C. E.; Loh, Ching-Yuen; Wang, Xiao-Yen; Yu, Sheng-Tao
1999-01-01
The engineering research and design requirements of today pose great computer-simulation challenges to engineers and scientists who are called on to analyze phenomena in continuum mechanics. The future will bring even more daunting challenges, when increasingly complex phenomena must be analyzed with increased accuracy. Traditionally used numerical simulation methods have evolved to their present state by repeated incremental extensions to broaden their scope. They are reaching the limits of their applicability and will need to be radically revised, at the very least, to meet future simulation challenges. At the NASA Lewis Research Center, researchers have been developing a new numerical framework for solving conservation laws in continuum mechanics, namely, the Space-Time Conservation Element and Solution Element Method, or the CE/SE method. This method has been built from fundamentals and is not a modification of any previously existing method. It has been designed with generality, simplicity, robustness, and accuracy as cornerstones. The CE/SE method has thus far been applied in the fields of computational fluid dynamics, computational aeroacoustics, and computational electromagnetics. Computer programs based on the CE/SE method have been developed for calculating flows in one, two, and three spatial dimensions. Results have been obtained for numerous problems and phenomena, including various shock-tube problems, ZND detonation waves, an implosion and explosion problem, shocks over a forward-facing step, a blast wave discharging from a nozzle, various acoustic waves, and shock/acoustic-wave interactions. The method can clearly resolve shock/acoustic-wave interactions, wherein the difference in magnitude between the acoustic wave and shock can be up to six orders of magnitude. In two-dimensional flows, the reflected shock is as crisp as the leading shock.
CE/SE schemes are currently being used for advanced applications to jet and fan noise prediction and to chemically reacting flows.
Durymanov, Mikhail; Kamaletdinova, Tatiana; Lehmann, Sarah E; Reineke, Joshua
2017-09-10
Over the past few decades, the enhanced permeability of tumor vasculature has been actively exploited for targeted delivery of anticancer nanomedicines, resulting in numerous pharmaceutical products. Formation of new immature and leaky vessels, along with inflammatory remodeling of existing vessels, accompanies the development of numerous diseases beyond cancer and presents an opportunity for passive accumulation of intravenously administered nanomedicines in many pathological tissues. To date, non-cancerous tissues with enhanced permeation have been relatively unexploited as targets and may enable new therapy and prevention technologies for many disorders. Herein, we summarize the current knowledge on the nature of enhanced vascular permeability in multiple non-cancerous pathological tissues. We also discuss the clinical status of nanotherapeutics with selectivity based on passive accumulation in non-cancerous target tissues, their challenges, and prospects. Copyright © 2017 Elsevier B.V. All rights reserved.
Do Open Source LMSs Support Personalization? A Comparative Evaluation
NASA Astrophysics Data System (ADS)
Kerkiri, Tania; Paleologou, Angela-Maria
A number of parameters that support the LMSs capabilities towards content personalization are presented and substantiated. These parameters constitute critical criteria for an exhaustive investigation of the personalization capabilities of the most popular open source LMSs. Results are comparatively shown and commented upon, thus highlighting a course of conduct for the implementation of new personalization methodologies for these LMSs, aligned at their existing infrastructure, to maintain support of the numerous educational institutions entrusting major part of their curricula to them. Meanwhile, new capabilities arise as drawn from a more efficient description of the existing resources -especially when organized into widely available repositories- that lead to qualitatively advanced learner-oriented courses which would ideally meet the challenge of combining personification of demand and personalization of thematic content at once.
Białk-Bielińska, Anna; Kumirska, Jolanta; Borecka, Marta; Caban, Magda; Paszkiewicz, Monika; Pazdro, Ksenia; Stepnowski, Piotr
2016-03-20
Recent developments and improvements in advanced instruments and analytical methodologies have made the detection of pharmaceuticals at low concentration levels in different environmental matrices possible. As a result of these advances, over the last 15 years residues of these compounds and their metabolites have been detected in different environmental compartments and pharmaceuticals have now become recognized as so-called 'emerging' contaminants. To date, a lot of papers have been published presenting the development of analytical methodologies for the determination of pharmaceuticals in aqueous and solid environmental samples. Many papers have also been published on the application of the new methodologies, mainly to the assessment of the environmental fate of pharmaceuticals. Although impressive improvements have undoubtedly been made, in order to fully understand the behavior of these chemicals in the environment, there are still numerous methodological challenges to be overcome. The aim of this paper therefore, is to present a review of selected recent improvements and challenges in the determination of pharmaceuticals in environmental samples. Special attention has been paid to the strategies used and the current challenges (also in terms of Green Analytical Chemistry) that exist in the analysis of these chemicals in soils, marine environments and drinking waters. There is a particular focus on the applicability of modern sorbents such as carbon nanotubes (CNTs) in sample preparation techniques, to overcome some of the problems that exist in the analysis of pharmaceuticals in different environmental samples. Copyright © 2016 Elsevier B.V. All rights reserved.
Monitoring biological diversity: strategies, tools, limitations, and challenges
Beever, E.A.
2006-01-01
Monitoring is an assessment of the spatial and temporal variability in one or more ecosystem properties, and is an essential component of adaptive management. Monitoring can help determine whether mandated environmental standards are being met and can provide an early-warning system of ecological change. Development of a strategy for monitoring biological diversity will likely be most successful when based upon clearly articulated goals and objectives and may be enhanced by including several key steps in the process. Ideally, monitoring of biological diversity will measure not only composition, but also structure and function at the spatial and temporal scales of interest. Although biodiversity monitoring has several key limitations as well as numerous theoretical and practical challenges, many tools and strategies are available to address or overcome such challenges; I summarize several of these. Due to the diversity of spatio-temporal scales and comprehensiveness encompassed by existing definitions of biological diversity, an effective monitoring design will reflect the desired sampling domain of interest and its key stressors, available funding, legal requirements, and organizational goals.
Many-Body Subradiant Excitations in Metamaterial Arrays: Experiment and Theory.
Jenkins, Stewart D; Ruostekoski, Janne; Papasimakis, Nikitas; Savo, Salvatore; Zheludev, Nikolay I
2017-08-04
Subradiant excitations, originally predicted by Dicke, have posed a long-standing challenge in physics owing to their weak radiative coupling to the environment. Here we engineer massive coherently driven classical subradiance in planar metamaterial arrays as a spatially extended eigenmode comprising over 1000 metamolecules. By comparing the near- and far-field response in large-scale numerical simulations with those in experimental observations we identify strong evidence for classically correlated multimetamolecule subradiant states that dominate the total excitation energy. We show that similar spatially extended many-body subradiance can also exist in plasmonic metamaterial arrays at optical frequencies.
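The essence of collective sub- and superradiance can be shown with a toy scalar coupled-dipole model (a scalar-wave approximation, not the full electromagnetic metamolecule response simulated in the paper): for scalar waves the decay matrix is Gamma_ij = gamma sin(k r_ij)/(k r_ij), and its eigenvalues are the collective decay rates, with subradiant eigenmodes decaying far slower than a single emitter:

```python
import numpy as np

N = 30                         # 1D chain of emitters (toy geometry)
wavelength = 1.0
k = 2 * np.pi / wavelength
spacing = 0.2 * wavelength     # subwavelength spacing -> strong correlations
pos = spacing * np.arange(N)

# scalar-wave decay matrix: Gamma_ij = sin(k r_ij)/(k r_ij), Gamma_ii = 1
r = np.abs(pos[:, None] - pos[None, :])
Gamma = np.sinc(k * r / np.pi)           # np.sinc(x) = sin(pi x)/(pi x)

rates = np.sort(np.linalg.eigvalsh(Gamma))   # collective decay rates (units of gamma)
print(rates[0] < 0.1, rates[-1] > 1.0)       # True True: sub- AND superradiance
```

The eigenvectors associated with the smallest rates are the spatially extended subradiant modes; in the paper's metamaterial arrays the analogous modes span over 1000 metamolecules.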
NASA Technical Reports Server (NTRS)
Kalluri, Sreeramesh
2013-01-01
Structural materials used in engineering applications are routinely subjected to repetitive mechanical loads in multiple directions under non-isothermal conditions. Over the past few decades, several multiaxial fatigue life estimation models (stress- and strain-based) have been developed for isothermal conditions. Historically, numerous fatigue life prediction models have also been developed for thermomechanical fatigue (TMF) life prediction, predominantly for uniaxial mechanical loading conditions. Realistic structural components encounter multiaxial loads and non-isothermal loading conditions, which increase the potential for interaction of damage modes. A need exists for mechanical testing and for the development and verification of life prediction models under such conditions.
Computational ecology as an emerging science
Petrovskii, Sergei; Petrovskaya, Natalia
2012-01-01
It has long been recognized that numerical modelling and computer simulations can be used as a powerful research tool to understand, and sometimes to predict, the tendencies and peculiarities in the dynamics of populations and ecosystems. It has been, however, much less appreciated that the context of modelling and simulations in ecology is essentially different from that which normally exists in other natural sciences. In our paper, we review the computational challenges arising in modern ecology in the spirit of computational mathematics, i.e. with our main focus on the choice and use of adequate numerical methods. Somewhat paradoxically, the complexity of ecological problems does not always require the use of complex computational methods. This paradox, however, can be easily resolved if we recall that application of sophisticated computational methods usually requires clear and unambiguous mathematical problem statement as well as clearly defined benchmark information for model validation. At the same time, many ecological problems still do not have mathematically accurate and unambiguous description, and available field data are often very noisy, and hence it can be hard to understand how the results of computations should be interpreted from the ecological viewpoint. In this scientific context, computational ecology has to deal with a new paradigm: conventional issues of numerical modelling such as convergence and stability become less important than the qualitative analysis that can be provided with the help of computational techniques. We discuss this paradigm by considering computational challenges arising in several specific ecological applications. PMID:23565336
Numerical simulation of immiscible viscous fingering using adaptive unstructured meshes
NASA Astrophysics Data System (ADS)
Adam, A.; Salinas, P.; Percival, J. R.; Pavlidis, D.; Pain, C.; Muggeridge, A. H.; Jackson, M.
2015-12-01
Displacement of one fluid by another in porous media occurs in various settings including hydrocarbon recovery, CO2 storage and water purification. When the invading fluid is of lower viscosity than the resident fluid, the displacement front is subject to a Saffman-Taylor instability and is unstable to transverse perturbations. These instabilities can grow, leading to fingering of the invading fluid. Numerical simulation of viscous fingering is challenging. The physics is controlled by a complex interplay of viscous and diffusive forces and it is necessary to ensure physical diffusion dominates numerical diffusion to obtain converged solutions. This typically requires the use of high mesh resolution and high order numerical methods, which is computationally expensive. We demonstrate here the use of a novel control volume finite element (CVFE) method along with dynamic unstructured mesh adaptivity to simulate viscous fingering with higher accuracy and lower computational cost than conventional methods. Our CVFE method employs a discontinuous representation for both pressure and velocity, allowing the use of smaller control volumes (CVs). This yields higher resolution of the saturation field, which is represented CV-wise. Moreover, dynamic mesh adaptivity allows high mesh resolution to be employed where it is required to resolve the fingers and lower resolution elsewhere. We use our results to re-examine the existing criteria that have been proposed to govern the onset of instability.
Mesh adaptivity requires the mapping of data from one mesh to another. Conventional methods such as consistent interpolation do not readily generalise to discontinuous fields and are non-conservative. We further contribute a general framework for interpolation of CV fields by Galerkin projection. The method is conservative, higher order and yields improved results, particularly with higher order or discontinuous elements where existing approaches are often excessively diffusive.
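The conservation property of Galerkin projection can be illustrated in one dimension: for piecewise-constant (CV-wise) fields, the L2 projection onto a new mesh reduces to overlap-weighted averaging of donor cells, which conserves the integral exactly. This is a hedged sketch, not the authors' CVFE implementation; the meshes and field are illustrative.

```python
import numpy as np

# Donor mesh: cell edges and cell-averaged values (piecewise constant).
src_edges = np.linspace(0.0, 1.0, 11)          # 10 uniform donor cells
src_vals = np.sin(np.pi * 0.5 * (src_edges[:-1] + src_edges[1:]))

# Target mesh: non-uniform, as adaptivity would produce (illustrative).
tgt_edges = np.sort(np.concatenate(
    ([0.0, 1.0], np.random.default_rng(0).uniform(0.0, 1.0, 6))))

def project(src_edges, src_vals, tgt_edges):
    """L2 (Galerkin) projection between piecewise-constant spaces:
    each target cell takes the overlap-weighted average of donor cells."""
    tgt_vals = np.zeros(len(tgt_edges) - 1)
    for j in range(len(tgt_vals)):
        a, b = tgt_edges[j], tgt_edges[j + 1]
        acc = 0.0
        for i in range(len(src_vals)):
            lo, hi = max(a, src_edges[i]), min(b, src_edges[i + 1])
            if hi > lo:                        # cells overlap
                acc += src_vals[i] * (hi - lo)
        tgt_vals[j] = acc / (b - a)
    return tgt_vals

tgt_vals = project(src_edges, src_vals, tgt_edges)
src_integral = np.sum(src_vals * np.diff(src_edges))
tgt_integral = np.sum(tgt_vals * np.diff(tgt_edges))
```

The integral of the field is identical on both meshes (to rounding error), which is precisely the property consistent interpolation lacks.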
The Statistical Mechanics of Ideal MHD Turbulence
NASA Technical Reports Server (NTRS)
Shebalin, John V.
2003-01-01
Turbulence is a universal, nonlinear phenomenon found in all energetic fluid and plasma motion. In particular, understanding magnetohydrodynamic (MHD) turbulence and incorporating its effects in the computation and prediction of the flow of ionized gases in space, for example, are great challenges that must be met if such computations and predictions are to be meaningful. Although a general solution to the "problem of turbulence" does not exist in closed form, numerical integrations allow us to explore the phase space of solutions for both ideal and dissipative flows. For homogeneous, incompressible turbulence, Fourier methods are appropriate, and phase space is defined by the Fourier coefficients of the physical fields. In the case of ideal MHD flows, a fairly robust statistical mechanics has been developed, in which the symmetry and ergodic properties of phase space are understood. A discussion of these properties will illuminate our principal discovery: coherent structure and randomness co-exist in ideal MHD turbulence. For dissipative flows, as opposed to ideal flows, progress beyond the dimensional analysis of Kolmogorov has been difficult. Here, some possible future directions that draw on the ideal results will also be discussed. Our conclusion will be that while ideal turbulence is now well understood, real turbulence still presents great challenges.
Volkman, Sarah K.; Ahouidi, Ambroise D.; Ndiaye, Daouda; Mboup, Souleymane; Wirth, Dyann F.
2014-01-01
A challenge to conducting high-impact and reproducible studies of the mechanisms of P. falciparum drug resistance, invasion, virulence, and immunity is the lack of robust and sustainable in vitro culture in the field. While the technology exists and is routinely utilized in developed countries, various factors, from cost to supply to quality, make it hard to implement in malaria endemic countries. Here, we design and rigorously evaluate an adjustable gas-mixing device for the in vitro culture of P. falciparum parasites in the field to circumvent this challenge. The device accurately replicates the gas concentrations needed to culture laboratory isolates, short-term adapted field isolates, cryopreserved previously non-adapted isolates, as well as to adapt ex vivo isolates to in vitro culture in the field. We also show an advantage over existing alternatives both in cost and in supply. Furthermore, the adjustable nature of the device makes it an ideal tool for many applications in which varied gas concentrations could be critical to culture success. This adjustable gas-mixing device will dramatically improve the feasibility of in vitro culture of Plasmodium falciparum parasites in malaria endemic countries given its numerous advantages. PMID:24603696
NASA Astrophysics Data System (ADS)
Akhtar, Taimoor; Shoemaker, Christine
2016-04-01
Watershed model calibration is inherently a multi-criteria problem. Conflicting trade-offs exist between different quantifiable calibration criteria, indicating the non-existence of a single optimal parameterization. Hence, many experts prefer a manual approach to calibration where the inherent multi-objective nature of the calibration problem is addressed through an interactive, subjective, time-intensive and complex decision making process. Multi-objective optimization can be used to efficiently identify multiple plausible calibration alternatives and assist calibration experts during the parameter estimation process. However, there are key challenges to the use of multi-objective optimization in the parameter estimation process, which include: 1) multi-objective optimization usually requires many model simulations, which is difficult for complex simulation models that are computationally expensive; and 2) selection of one from numerous calibration alternatives provided by multi-objective optimization is non-trivial. This study proposes a "Hybrid Automatic Manual Strategy" (HAMS) for watershed model calibration to specifically address the above-mentioned challenges. HAMS employs a 3-stage framework for parameter estimation. Stage 1 incorporates the use of an efficient surrogate multi-objective algorithm, GOMORS, for identification of numerous calibration alternatives within a limited simulation evaluation budget. The novelty of HAMS is embedded in Stages 2 and 3, where an interactive visual and metric-based analytics framework is available as a decision support tool to choose a single calibration from the numerous alternatives identified in Stage 1. Stage 2 of HAMS provides a goodness-of-fit metric-based interactive framework for identification of a small (typically fewer than 10), meaningful, and diverse subset of calibration alternatives from the numerous alternatives obtained in Stage 1.
Stage 3 incorporates the use of an interactive visual analytics framework for decision support in selection of one parameter combination from the alternatives identified in Stage 2. HAMS is applied for calibration of flow parameters of a SWAT (Soil and Water Assessment Tool) model designed to simulate flow in the Cannonsville watershed in upstate New York. Results from the application of HAMS to Cannonsville indicate that efficient multi-objective optimization and interactive visual and metric-based analytics can bridge the gap between the effective use of both automatic and manual strategies for parameter estimation of computationally expensive watershed models.
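The Stage-2 idea of winnowing many optimization results down to a small, diverse subset can be sketched generically: extract the non-dominated (Pareto) alternatives, then greedily pick a spread-out handful. GOMORS itself is not reproduced here; the dominance check, the max-min selection rule, and the synthetic objective values are all generic stand-ins, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(1)
objs = rng.random((200, 2))   # 200 alternatives, 2 minimization objectives (synthetic)

def pareto_front(points):
    # An alternative is non-dominated if no other point is <= in all
    # objectives and strictly < in at least one.
    keep = []
    for i, p in enumerate(points):
        dominated = np.any(np.all(points <= p, axis=1) &
                           np.any(points < p, axis=1))
        if not dominated:
            keep.append(i)
    return np.array(keep)

def diverse_subset(points, k):
    # Greedy max-min selection: repeatedly add the alternative farthest
    # (in objective space) from those already chosen.
    chosen = [int(np.argmin(points[:, 0]))]
    while len(chosen) < min(k, len(points)):
        d = np.min(np.linalg.norm(points[:, None] - points[chosen], axis=2), axis=1)
        chosen.append(int(np.argmax(d)))
    return chosen

front = pareto_front(objs)
subset = diverse_subset(objs[front], 5)   # the small set handed to the expert
```

In HAMS the final choice among these few alternatives is then made interactively; the sketch only covers the automatic filtering step.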
Appalachian residents’ experiences with and management of multiple morbidity
Schoenberg, Nancy E.; Bardach, Shoshana H.; Manchikanti, Kavita N.; Goodenow, Anne C.
2011-01-01
Approximately three quarters of middle-aged and older adults have at least two simultaneously occurring chronic conditions ("multiple morbidity" or MM), a trend expected to increase dramatically throughout the world. Rural residents, who tend to have fewer personal and health resources, are more likely to experience MM. To improve our understanding of the ways in which vulnerable, rural residents in the U.S. experience and manage MM, we interviewed twenty rural Appalachian residents with MM. We identified the following themes: (a) MM has multifaceted challenges and is viewed as more than the sum of its parts; (b) numerous challenges exist to optimal MM self-management, particularly in a rural, under-resourced context; however, (c) participants described strategic methods of managing multiple chronic conditions, including prioritizing certain conditions and management strategies and drawing heavily on assistance from informal and formal sources. PMID:21263063
Multiscale Integration of -Omic, Imaging, and Clinical Data in Biomedical Informatics
Phan, John H.; Quo, Chang F.; Cheng, Chihwen; Wang, May Dongmei
2016-01-01
This paper reviews challenges and opportunities in multiscale data integration for biomedical informatics. Biomedical data can come from different biological origins, data acquisition technologies, and clinical applications. Integrating such data across multiple scales (e.g., molecular, cellular/tissue, and patient) can lead to more informed decisions for personalized, predictive, and preventive medicine. However, data heterogeneity, community standards in data acquisition, and computational complexity are big challenges for such decision making. This review describes genomic and proteomic (i.e., molecular), histopathological imaging (i.e., cellular/tissue), and clinical (i.e., patient) data; it includes case studies for single-scale (e.g., combining genomic or histopathological image data), multiscale (e.g., combining histopathological image and clinical data), and multiscale and multiplatform (e.g., the Human Protein Atlas and The Cancer Genome Atlas) data integration. Numerous opportunities exist in biomedical informatics research focusing on integration of multiscale and multiplatform data. PMID:23231990
Biochemical and physiological MR imaging of skeletal muscle at 7 tesla and above.
Chang, Gregory; Wang, Ligong; Cárdenas-Blanco, Arturo; Schweitzer, Mark E; Recht, Michael P; Regatte, Ravinder R
2010-06-01
Ultra-high field (UHF; ≥7 T) magnetic resonance imaging (MRI), with its greater signal-to-noise ratio, offers the potential for increased spatial resolution, faster scanning, and, above all, improved biochemical and physiological imaging of skeletal muscle. The increased spectral resolution and greater sensitivity to low-gamma nuclei available at UHF should allow techniques such as ¹H MR spectroscopy (MRS), ³¹P MRS, and ²³Na MRI to be more easily implemented. Numerous technical challenges exist in the performance of UHF MRI, including changes in relaxation values, increased chemical shift and susceptibility artifact, radiofrequency (RF) coil design/B₁⁺ field inhomogeneity, and greater RF energy deposition. Nevertheless, the possibility of improved functional and metabolic imaging at UHF will likely drive research efforts in the near future to overcome these challenges and allow studies of human skeletal muscle physiology and pathophysiology to be possible at ≥7 T.
Sparse Coding and Counting for Robust Visual Tracking
Liu, Risheng; Wang, Jing; Shang, Xiaoke; Wang, Yiyang; Su, Zhixun; Cai, Yu
2016-01-01
In this paper, we propose a novel sparse coding and counting method under a Bayesian framework for visual tracking. In contrast to existing methods, the proposed method employs a combination of the L0 and L1 norms to regularize the linear coefficients of an incrementally updated linear basis. The sparsity constraint enables the tracker to effectively handle difficult challenges, such as occlusion or image corruption. To achieve real-time processing, we propose a fast and efficient numerical algorithm for solving the proposed model. Although it is an NP-hard problem, the proposed accelerated proximal gradient (APG) approach is guaranteed to converge to a solution quickly. In addition, we provide a closed-form solution for the combined L0 and L1 regularized representation to obtain better sparsity. Experimental results on challenging video sequences demonstrate that the proposed method achieves state-of-the-art results in both accuracy and speed. PMID:27992474
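The standard ingredients of such a solver can be sketched: the proximal operators of the L1 penalty (soft thresholding) and L0 penalty (hard thresholding), and a FISTA-style accelerated proximal gradient loop for the L1 part. This is a generic illustration of APG sparse coding, not the authors' exact combined-norm algorithm; the dictionary and signal are synthetic.

```python
import numpy as np

def soft_threshold(x, lam):
    # Proximal operator of lam * ||x||_1 (the L1 shrinkage step).
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def hard_threshold(x, lam):
    # Proximal operator of lam * ||x||_0: keep entries with x^2 > 2*lam.
    out = x.copy()
    out[x ** 2 <= 2.0 * lam] = 0.0
    return out

def apg(D, y, lam1, iters=200):
    # Accelerated proximal gradient (FISTA-style) for
    # min_c 0.5*||y - D c||^2 + lam1*||c||_1.
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    c = z = np.zeros(D.shape[1])
    t = 1.0
    for _ in range(iters):
        grad = D.T @ (D @ z - y)
        c_new = soft_threshold(z - grad / L, lam1 / L)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        z = c_new + ((t - 1.0) / t_new) * (c_new - c)
        c, t = c_new, t_new
    return c

rng = np.random.default_rng(0)
D = rng.standard_normal((50, 20))          # synthetic dictionary (basis)
c_true = np.zeros(20)
c_true[[2, 7]] = [1.5, -2.0]               # 2-sparse ground-truth coefficients
y = D @ c_true                             # noiseless observation
c_hat = apg(D, y, lam1=0.1)
```

The recovered coefficients concentrate on the true support, which is the property that lets a tracker discount occluded or corrupted template atoms.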
Howe, Adina; Chain, Patrick S. G.
2015-07-09
Metagenomic investigations hold great promise for informing the genetics, physiology, and ecology of environmental microorganisms. Current challenges for metagenomic analysis are related to our ability to connect the dots between sequencing reads, their population of origin, and their encoding functions. Assembly-based methods reduce dataset size by extending overlapping reads into larger contiguous sequences (contigs), providing contextual information for genetic sequences that does not rely on existing references. These methods, however, tend to be computationally intensive and are again challenged by sequencing errors as well as by genomic repeats. While numerous tools have been developed based on these methodological concepts, they present confounding choices and training requirements to metagenomic investigators. To help with accessibility to assembly tools, this review also includes an IPython Notebook metagenomic assembly tutorial. This tutorial has instructions for execution on any operating system using Amazon Elastic Cloud Compute and guides users through downloading, assembly, and mapping reads to contigs of a mock microbiome metagenome. Despite its challenges, metagenomic analysis has already revealed novel insights into many environments on Earth. As software, training, and data continue to emerge, metagenomic data access and its discoveries will continue to grow.
The exposome concept: a challenge and a potential driver for environmental health research.
Siroux, Valérie; Agier, Lydiane; Slama, Rémy
2016-06-01
The exposome concept was defined in 2005 as encompassing all environmental exposures from conception onwards, as a new strategy to evidence environmental disease risk factors. Although very appealing, the exposome concept is challenging in many respects. In terms of assessment, several hundred time-varying exposures need to be considered, but increasing the number of exposures assessed should not be done at the cost of increased exposure misclassification. Accurately assessing the exposome currently requires numerous measurements, which rely on different technologies, resulting in an expensive set of protocols. In the future, high-throughput 'omics technologies may be a promising technique to integrate a wide range of exposures from a small number of biological matrices. Assessing the association between many exposures and health raises statistical challenges. Due to the correlation structure of the exposome, existing statistical methods cannot fully and efficiently untangle the exposures truly affecting the health outcome from correlated exposures. Other statistical challenges relate to accounting for exposure misclassification or identifying synergistic effects between exposures. Ongoing exposome projects are trying to overcome technical and statistical challenges. From a public health perspective, a better understanding of the environmental risk factors should open the way to improved prevention strategies. Copyright ©ERS 2016.
Accuracy of Time Integration Approaches for Stiff Magnetohydrodynamics Problems
NASA Astrophysics Data System (ADS)
Knoll, D. A.; Chacon, L.
2003-10-01
The simulation of complex physical processes with multiple time scales presents a continuing challenge to the computational plasma physicist due to the co-existence of fast and slow time scales. Within computational plasma physics, practitioners have developed and used linearized methods, semi-implicit methods, and time splitting in an attempt to tackle such problems. All of these methods are understood to generate numerical error. We are currently developing algorithms which remove such error for MHD problems [1,2]. These methods do not rely on linearization or time splitting. We are also attempting to analyze the errors introduced by existing "implicit" methods using modified equation analysis (MEA) [3]. In this presentation we will briefly cover the major findings in [3]. We will then extend this work further into MHD. This analysis will be augmented with numerical experiments with the hope of gaining insight, particularly into how these errors accumulate over many time steps. [1] L. Chacon, D.A. Knoll, J.M. Finn, J. Comput. Phys., vol. 178, pp. 15-36 (2002) [2] L. Chacon and D.A. Knoll, J. Comput. Phys., vol. 188, pp. 573-592 (2003) [3] D.A. Knoll, L. Chacon, L.G. Margolin, V.A. Mousseau, J. Comput. Phys., vol. 185, pp. 583-611 (2003)
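The stiffness problem motivating this work can be seen on the simplest possible example: for a linear test equation with a fast decay scale, explicit Euler blows up at a step size where implicit Euler is perfectly stable. This is a generic textbook illustration, not one of the MHD schemes analyzed in [1-3]; the coefficient and step size are illustrative.

```python
# Stiff test problem y' = -1000*y, y(0) = 1, integrated with step h = 0.01.
lam = -1000.0
h, steps = 0.01, 50
y_exp = y_imp = 1.0
for _ in range(steps):
    # Explicit Euler: amplification factor |1 + h*lam| = 9 > 1, so it diverges.
    y_exp = y_exp + h * lam * y_exp
    # Implicit Euler: factor 1/(1 - h*lam) = 1/11 < 1, unconditionally stable.
    y_imp = y_imp / (1.0 - h * lam)
```

Implicit methods remove the stability restriction, but (as MEA makes precise) they still carry truncation error at large steps; stability is not the same as accuracy.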
Devasenapathy, Deepa; Kannan, Kathiravan
2015-01-01
Traffic in road networks is progressively increasing. Good knowledge of network traffic can minimize congestion using information pertaining to the road network obtained with the aid of communal callers, pavement detectors, and so on. Using these methods, low-featured information is generated with respect to the user in the road network. Although the existing schemes obtain urban traffic information, they fail to calculate the energy drain rate of nodes and to strike a balance between the overhead and the quality of the routing protocol, which poses a great challenge. Thus, an energy-efficient cluster-based vehicle detection in road network using the intention numeration method (CVDRN-IN) is developed. Initially, sensor nodes that detect a vehicle are grouped into separate clusters. Further, we approximate the node drain rate for a cluster using a polynomial regression function. In addition, the total node energy is estimated by taking the integral over the area. Finally, enhanced data aggregation is performed to reduce the amount of data transmission using a digital signature tree. The experimental performance is evaluated with the Dodgers loop sensor data set from the UCI repository, and the evaluation shows that the method outperforms existing work on energy consumption, clustering efficiency, and node drain rate. PMID:25793221
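The drain-rate step can be sketched under the assumption that "polynomial regression function" means an ordinary least-squares polynomial fit to sampled node energies, with the drain rate as the negative derivative of the fit and total energy as its integral over the interval. The data, degree, and interval below are illustrative, not taken from the paper.

```python
import numpy as np

# Synthetic per-node residual energy samples over time (illustrative).
t = np.linspace(0.0, 10.0, 50)
energy = 100.0 - 3.0 * t - 0.1 * t ** 2     # ground truth generating the samples

# Fit a degree-2 polynomial to the sampled energies (the regression step).
coeffs = np.polyfit(t, energy, deg=2)
poly = np.poly1d(coeffs)

drain_rate = -poly.deriv()                  # drain rate = -dE/dt
total_energy = poly.integ()(10.0) - poly.integ()(0.0)   # integral over the interval
```

With real sensor data the fit would smooth out measurement noise; here the quadratic is recovered essentially exactly, so the derivative and integral match the generating model.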
NASA Astrophysics Data System (ADS)
Spannenberg, Jescica; Atangana, Abdon; Vermeulen, P. D.
2017-09-01
Fractional differentiation is well suited to investigating real-world scenarios related to geological formations associated with elasticity, heterogeneity, viscoelasticity, and the memory effect. Since groundwater systems exist in these geological formations, modelling groundwater recharge as a real-world scenario is a challenging task because existing recharge estimation methods are governed by linear equations which make use of constant field parameters. This is inadequate because in reality these parameters are a function of both space and time. This study therefore concentrates on modifying the recharge equation governing the EARTH model by application of the Eton approach. Accordingly, this paper presents a modified equation which is non-linear and accounts for parameters as functions of both space and time. To be more specific, recharge and drainage resistance, which are parameters within the equation, become functions of both space and time. Additionally, the study entailed solving the non-linear equation using an iterative method as well as numerical solutions by means of the Crank-Nicolson scheme. The numerical solutions were used alongside the Riemann-Liouville, Caputo-Fabrizio, and Atangana-Baleanu derivatives, so as to account for elasticity, heterogeneity, viscoelasticity, and the memory effect. In essence, this paper presents a more adequate model for recharge estimation.
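As a hedged sketch of the numerical machinery involved, the Crank-Nicolson scheme can be shown on a classical (integer-order) 1-D diffusion equation; the fractional Riemann-Liouville, Caputo-Fabrizio, and Atangana-Baleanu operators used in the paper are omitted, and all parameters and boundary conditions are illustrative.

```python
import numpy as np

# Crank-Nicolson for u_t = D u_xx on [0, 1] with u = 0 at both ends.
N, D, dx, dt, steps = 50, 1.0, 1.0 / 50, 0.01, 100
r = D * dt / (2.0 * dx ** 2)

# Second-difference (Laplacian) matrix on the interior points.
lap = (np.diag(np.full(N - 1, -2.0))
       + np.diag(np.ones(N - 2), 1)
       + np.diag(np.ones(N - 2), -1))

# CN averages the explicit and implicit updates:
# (I - r*lap) u^{n+1} = (I + r*lap) u^n
A = np.eye(N - 1) - r * lap
B = np.eye(N - 1) + r * lap

x = np.linspace(dx, 1.0 - dx, N - 1)
u = np.sin(np.pi * x)                      # smooth initial condition
for _ in range(steps):
    u = np.linalg.solve(A, B @ u)
```

The scheme is second-order in time and unconditionally stable, which is why it is a natural workhorse even when the time derivative is later replaced by a fractional operator.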
Venturing into new realms? Microorganisms in space.
Moissl-Eichinger, Christine; Cockell, Charles; Rettberg, Petra
2016-09-01
One of the biggest challenges of science is the determination of whether extraterrestrial life exists. Although potential habitable areas might be available for complex life, it is more likely that microbial life could exist in space. Many extremotolerant and extremophilic microbes have been found to be able to withstand numerous, combined environmental factors, such as high or low temperatures and pressures, high-salt conditions, high doses of radiation, desiccation or nutrient limitations. They may even survive the transit from one planet to another. Terrestrial Mars-analogue sites are one focus of researchers, in order to understand the microbial diversity in preparation for upcoming space missions aimed at the detection of life. However, such missions could also pose a risk with respect to contamination of the extraterrestrial environment by accidentally transferred terrestrial microorganisms. Closer to the Earth, the International Space Station is the most enclosed habitat, where humans work and live, and with them numerous microorganisms. It is still unknown how microbes adapt to this environment, possibly even creating a risk for the crew. Information on the microbiology of the ISS will have an impact on the planning and implementation of long-term human spaceflights in order to ensure a safe, stable and balanced microbiome on board. © FEMS 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
The scientific challenges to forecasting and nowcasting the solar origins of space weather (Invited)
NASA Astrophysics Data System (ADS)
Schrijver, C. J.; Title, A. M.
2013-12-01
With the full-sphere continuous coverage of the Sun achieved by combining SDO and STEREO imagery comes the realization that solar activity is a manifestation of local processes that respond to long-range if not global influences. Numerical experiments provide insights into these couplings, as well as into the intricacies of destabilizations of field emerging into pre-existing configurations and evolving within the context of their dynamic surroundings. With these capabilities grows an understanding of the difficulties in forecasting of the solar origins of space weather: we need assimilative global non-potential field models, but our observational resources are too limited to meet that need.
The neurobiology of psychopathy.
Glenn, Andrea L; Raine, Adrian
2008-09-01
Numerous studies have tackled the complex challenge of understanding the neural substrates of psychopathy, revealing that brain abnormalities exist on several levels and in several structures. As we discover more about complex neural networks, it becomes increasingly difficult to clarify how these systems interact with each other to produce the distinct pattern of behavioral and personality characteristics observed in psychopathy. The authors review the recent research on the neurobiology of psychopathy, beginning with molecular neuroscience work and progressing to the level of brain structures and their connectivity. Potential factors that may affect the development of brain impairments, as well as how some systems may be targeted for potential treatment, are discussed.
Challenges in the management of community pharmacies in Malaysia
2017-01-01
Background: The provision of professional pharmacy services by community pharmacists continues to be limited, particularly in low and middle income countries. It was postulated that multiple management challenges faced by community pharmacists contribute to this situation. Objective: The primary aim of the research was to determine the challenges faced in the management of community pharmacies in Sarawak (the largest state in Malaysia), and practical strategies to cope with and overcome the challenges. Methods: Semi-structured interviews were carried out with community pharmacists practising in Sarawak. Purposive and snowball sampling were employed to ensure a diverse group of informants. The interviews were audio-recorded and transcribed verbatim, with the resultant data analysed using thematic analysis. Data collection, coding, and interpretation were carried out iteratively until theoretical saturation. Results: Twenty respondents with different demographic characteristics were recruited. Six major themes were identified. Management challenges faced by community pharmacists traverse five major domains: market competition, legislative issues, customers' knowledge and expectations, macroeconomic impacts and operational challenges. Most of these challenges require government intervention to be resolved. In the meantime, improving customer service and expanding the range of professional services were seen as the most viable strategies to cope with existing challenges. The main concern is that the current legislative and economic landscape may hinder these strategies. Enactment of dispensing separation and more protective measures against market competition were suggested to alleviate the challenges faced. Conclusion: Numerous management challenges faced by community pharmacists that distract them from delivering professional pharmacy services have been highlighted.
Urgent affirmative actions by the government are warranted in supporting community pharmacists to realise and maximise their potentials. PMID:28690697
Manufacturing and Security Challenges in 3D Printing
NASA Astrophysics Data System (ADS)
Zeltmann, Steven Eric; Gupta, Nikhil; Tsoutsos, Nektarios Georgios; Maniatakos, Michail; Rajendran, Jeyavijayan; Karri, Ramesh
2016-07-01
As the manufacturing time, quality, and cost associated with additive manufacturing (AM) continue to improve, more and more businesses and consumers are adopting this technology. Some of the key benefits of AM include customizing products, localizing production and reducing logistics. Due to these and numerous other benefits, AM is enabling a globally distributed manufacturing process and supply chain spanning multiple parties, and hence raises concerns about the reliability of the manufactured product. In this work, we first present a brief overview of the potential risks that exist in the cyber-physical environment of additive manufacturing. We then evaluate the risks posed by two different classes of modifications to the AM process which are representative of the challenges that are unique to AM. The risks posed are examined through mechanical testing of objects with altered printing orientation and fine internal defects. Finite element analysis and ultrasonic inspection are also used to demonstrate the potential for decreased performance and for evading detection. The results highlight several scenarios, intentional or unintentional, that can affect the product quality and pose security challenges for the additive manufacturing supply chain.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lucas, Robert; Ang, James; Bergman, Keren
2014-02-10
Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-28
....'' Second, the Exchange proposes replacing existing paragraph (c)(4) of Rule 52.4, entitled ``Numerical... eliminate the ability of the Exchange to deviate from the Numerical Guidelines contained in paragraph (c)(1... existing paragraph (c)(2), which provides flexibility to the Exchange to use different Numerical Guidelines...
Schöniger, Anneli; Wöhling, Thomas; Samaniego, Luis; Nowak, Wolfgang
2014-01-01
Bayesian model selection or averaging objectively ranks a number of plausible, competing conceptual models based on Bayes' theorem. It implicitly performs an optimal trade-off between performance in fitting available data and minimum model complexity. The procedure requires determining Bayesian model evidence (BME), which is the likelihood of the observed data integrated over each model's parameter space. The computation of this integral is highly challenging because it is as high-dimensional as the number of model parameters. Three classes of techniques to compute BME are available, each with its own challenges and limitations: (1) exact and fast analytical solutions are limited by strong assumptions; (2) numerical evaluation quickly becomes infeasible for expensive models; (3) approximations known as information criteria (ICs), such as the AIC, BIC, or KIC (Akaike, Bayesian, or Kashyap information criterion, respectively), yield contradictory results with regard to model ranking. Our study features a theory-based intercomparison of these techniques. We further assess their accuracy in a simplistic synthetic example where, for some scenarios, an exact analytical solution exists. In more challenging scenarios, we use a brute-force Monte Carlo integration method as reference. We continue this analysis with a real-world application of hydrological model selection. This is a first-time benchmarking of the various methods for BME evaluation against true solutions. Results show that BME values from ICs are often heavily biased and that the choice of approximation method substantially influences the accuracy of model ranking. For reliable model selection, bias-free numerical methods should be preferred over ICs whenever computationally feasible. PMID:25745272
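The three classes of techniques contrasted above can be seen side by side in a toy setting. The sketch below uses an invented one-parameter Gaussian model (not the hydrological models of the study): the exact evidence is analytical, the brute-force Monte Carlo estimate should agree with it, and the AIC is computed from the maximum-likelihood fit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: one observation y ~ N(theta, sigma^2) with prior theta ~ N(0, tau^2).
# The exact Bayesian model evidence is then N(y; 0, sigma^2 + tau^2).
y, sigma, tau = 1.2, 0.5, 1.0

def normal_pdf(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

# Class (1): exact analytical solution, available only for this conjugate toy case.
bme_exact = normal_pdf(y, 0.0, sigma**2 + tau**2)

# Class (2): brute-force Monte Carlo, averaging the likelihood over prior draws.
theta = rng.normal(0.0, tau, size=200_000)
bme_mc = normal_pdf(y, theta, sigma**2).mean()

# Class (3): an information criterion (AIC) from the maximum-likelihood fit
# (theta_hat = y here); note it lives on a different scale than BME itself.
k = 1                                           # number of model parameters
aic = 2 * k - 2 * np.log(normal_pdf(y, y, sigma**2))

print(f"exact BME={bme_exact:.4f}  MC BME={bme_mc:.4f}  AIC={aic:.2f}")
```

With 200,000 prior draws the Monte Carlo estimate typically matches the exact value to well under a percent; for an expensive model each draw would cost one full model run, which is exactly the limitation the abstract points out.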
Gravitational Wave Science: Challenges for Numerical Relativistic Astrophysics
NASA Technical Reports Server (NTRS)
Centrella, Joan
2005-01-01
Gravitational wave detectors on earth and in space will open up a new observational window on the universe. The new information about astrophysics and fundamental physics these observations will bring is expected to pose exciting challenges. This talk will provide an overview of this emerging area of gravitational wave science, with a focus on the challenges it will bring for numerical relativistic astrophysics and a look at some recent results.
2011-01-01
Background Hypertension may increase tortuosity or twistedness of arteries. We applied a centerline extraction algorithm and tortuosity metric to magnetic resonance angiography (MRA) brain images to quantitatively measure the tortuosity of arterial vessel centerlines. The most commonly used arterial tortuosity measure is the distance factor metric (DFM). This study tested a DFM-based measurement's ability to detect increases in arterial tortuosity of hypertensives using existing images. Existing images presented challenges such as different resolutions which may affect the tortuosity measurement, different depths of the area imaged, and different artifacts of imaging that require filtering. Methods The stability and accuracy of alternative centerline algorithms were validated in numerically generated models and test brain MRA data. Existing images were gathered from previous studies and clinical medical systems by manually reading electronic medical records to identify hypertensives and negatives. Images of different resolutions were interpolated to similar resolutions. Arterial tortuosity in MRA images was measured from a DFM curve and tested on numerically generated models as well as MRA images from two hypertensive and three negative control populations. Comparisons were made between different resolutions, different filters, hypertensives versus negatives, and different negative controls. Results In tests using numerical models of a simple helix, the measured tortuosity increased as expected with more tightly coiled helices. Interpolation reduced resolution-dependent differences in measured tortuosity. The Korean hypertensive population had significantly higher arterial tortuosity than its corresponding negative control population across multiple arteries. In addition, one negative control population of different ethnicity had significantly less arterial tortuosity than the other two.
Conclusions Tortuosity can be compared between images of different resolutions by interpolating from lower to higher resolutions. Use of a universal negative control was not possible in this study. The method described here detected elevated arterial tortuosity in a hypertensive population compared to the negative control population and can be used to study this relation in other populations. PMID:22166145
Diedrich, Karl T; Roberts, John A; Schmidt, Richard H; Kang, Chang-Ki; Cho, Zang-Hee; Parker, Dennis L
2011-10-18
Hypertension may increase tortuosity or twistedness of arteries. We applied a centerline extraction algorithm and tortuosity metric to magnetic resonance angiography (MRA) brain images to quantitatively measure the tortuosity of arterial vessel centerlines. The most commonly used arterial tortuosity measure is the distance factor metric (DFM). This study tested a DFM-based measurement's ability to detect increases in arterial tortuosity of hypertensives using existing images. Existing images presented challenges such as different resolutions which may affect the tortuosity measurement, different depths of the area imaged, and different artifacts of imaging that require filtering. The stability and accuracy of alternative centerline algorithms were validated in numerically generated models and test brain MRA data. Existing images were gathered from previous studies and clinical medical systems by manually reading electronic medical records to identify hypertensives and negatives. Images of different resolutions were interpolated to similar resolutions. Arterial tortuosity in MRA images was measured from a DFM curve and tested on numerically generated models as well as MRA images from two hypertensive and three negative control populations. Comparisons were made between different resolutions, different filters, hypertensives versus negatives, and different negative controls. In tests using numerical models of a simple helix, the measured tortuosity increased as expected with more tightly coiled helices. Interpolation reduced resolution-dependent differences in measured tortuosity. The Korean hypertensive population had significantly higher arterial tortuosity than its corresponding negative control population across multiple arteries. In addition, one negative control population of different ethnicity had significantly less arterial tortuosity than the other two.
Tortuosity can be compared between images of different resolutions by interpolating from lower to higher resolutions. Use of a universal negative control was not possible in this study. The method described here detected elevated arterial tortuosity in a hypertensive population compared to the negative control population and can be used to study this relation in other populations.
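The distance factor metric at the heart of the study above is simple to state: centerline path length divided by the straight-line distance between the endpoints (one common convention; the study's exact normalization may differ). The sketch below, with invented helix parameters, reproduces the reported behaviour that more tightly coiled helices yield higher measured tortuosity.

```python
import numpy as np

def distance_factor_metric(points):
    """DFM: centerline path length divided by end-to-end straight-line distance."""
    seg = np.diff(points, axis=0)
    path = np.linalg.norm(seg, axis=1).sum()
    chord = np.linalg.norm(points[-1] - points[0])
    return path / chord

def helix(turns, radius=1.0, height=10.0, n=2000):
    """Sampled centerline of a helix; more turns over the same height = tighter coiling."""
    t = np.linspace(0.0, 2 * np.pi * turns, n)
    z = np.linspace(0.0, height, n)
    return np.column_stack([radius * np.cos(t), radius * np.sin(t), z])

for turns in (1, 3, 6):
    print(f"{turns} turns -> DFM = {distance_factor_metric(helix(turns)):.2f}")
```

For an integer number of turns the chord equals the helix height, so the DFM can be checked against the closed-form arc length of a helix.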
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-28
... Exchange is proposing to replace existing paragraph (c)(4) of Rule 11.13, entitled ``Numerical Guidelines... the ability of the Exchange to deviate from the Numerical Guidelines contained in paragraph (c)(1... of existing paragraph (c)(2), which provides flexibility to the Exchange to use different Numerical...
NASA Astrophysics Data System (ADS)
Singh, Rakesh Kumar; Ramadas, C.; Balachandra Shetty, P.; Satyanarayana, K. G.
2017-04-01
Owing to their superior strength properties compared with metallic materials, polymer-based composites are being used in primary aircraft structures. However, these materials behave in a far more complex way than metallic alloys because of their structural anisotropy and the coexistence of different constituent materials. This poses challenges for flaw detection, residual strength determination, and life assessment of a structure, given the high susceptibility of composites to impact damage in the form of delaminations/disbonds or cracks, which reduces load-bearing capability and can lead to structural failure. With this background, this study presents a method to identify the location of a delamination interface along the thickness of a laminate. Both numerical and experimental studies of the propagation, mode conversion, and scattering characteristics of the fundamental anti-symmetric Lamb mode (A0) passing through a semi-infinite delamination have been carried out with a view to identifying the defect. Further, the reflection and transmission scattering coefficients, based on power and amplitude ratios of the scattered waves, have been computed. The methodology was applied to numerically simulated delaminations to illustrate its efficacy. Results showed that it could successfully identify the delamination interface.
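As a minimal illustration of amplitude-ratio versus power-ratio scattering coefficients: given peak amplitudes of the incident, reflected, and transmitted A0 packets (the values below are invented, and power is assumed to scale with amplitude squared, which holds only for the same mode in the same plate section; the study's actual computation is more involved):

```python
# Hypothetical peak amplitudes of the incident, reflected and transmitted
# A0 wave packets, e.g. taken from the envelope maxima of measured signals.
A_inc, A_ref, A_trans = 1.00, 0.35, 0.80

R_amp = A_ref / A_inc        # amplitude-ratio reflection coefficient
T_amp = A_trans / A_inc      # amplitude-ratio transmission coefficient

# Power-ratio coefficients under the power ~ amplitude^2 assumption.
R_pow, T_pow = R_amp**2, T_amp**2

# Any deficit from 1 is energy mode-converted or trapped at the delamination
# in this toy accounting.
deficit = 1.0 - R_pow - T_pow
print(f"R_amp={R_amp:.2f} T_amp={T_amp:.2f} "
      f"R_pow={R_pow:.3f} T_pow={T_pow:.3f} deficit={deficit:.3f}")
```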
NASA Astrophysics Data System (ADS)
Liu, Changying; Iserles, Arieh; Wu, Xinyuan
2018-03-01
The Klein-Gordon equation with nonlinear potential occurs in a wide range of application areas in science and engineering. Its computation represents a major challenge. The main theme of this paper is the construction of symmetric and arbitrarily high-order time integrators for the nonlinear Klein-Gordon equation by integrating Birkhoff-Hermite interpolation polynomials. To this end, under the assumption of periodic boundary conditions, we begin with the formulation of the nonlinear Klein-Gordon equation as an abstract second-order ordinary differential equation (ODE) and its operator-variation-of-constants formula. We then derive a symmetric and arbitrarily high-order Birkhoff-Hermite time integration formula for the nonlinear abstract ODE. Accordingly, the stability, convergence and long-time behaviour are rigorously analysed once the spatial differential operator is approximated by an appropriate positive semi-definite matrix, subject to suitable temporal and spatial smoothness. A remarkable characteristic of this new approach is that the requirement of temporal smoothness is reduced compared with the traditional numerical methods for PDEs in the literature. Numerical results demonstrate the advantage and efficiency of our time integrators in comparison with the existing numerical approaches.
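For orientation, the baseline that such integrators improve upon can be sketched in a few lines: a symmetric second-order leapfrog (Stormer-Verlet) scheme for a 1D Klein-Gordon equation u_tt = u_xx - u - u^3 with periodic boundary conditions. The cubic potential, grid, and step sizes are illustrative choices, not taken from the paper; the symmetric structure shows up as near-conservation of the discrete energy.

```python
import numpy as np

# Minimal symmetric second-order leapfrog reference integrator for
# u_tt = u_xx - u - u^3 with periodic BCs -- NOT the arbitrarily high-order
# Birkhoff-Hermite scheme of the paper, just the baseline it improves upon.
N, L, dt, steps = 256, 2 * np.pi, 1e-3, 2000
x = np.linspace(0.0, L, N, endpoint=False)
dx = L / N

def laplacian(u):
    return (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2

def accel(u):
    return laplacian(u) - u - u**3       # nonlinear potential V'(u) = u + u^3

u = 0.1 * np.cos(x)                      # smooth initial displacement
v = np.zeros(N)                          # zero initial velocity

def energy(u, v):
    ux = (np.roll(u, -1) - u) / dx
    return dx * np.sum(0.5 * v**2 + 0.5 * ux**2 + 0.5 * u**2 + 0.25 * u**4)

E0 = energy(u, v)
for _ in range(steps):                   # leapfrog: kick-drift-kick
    v += 0.5 * dt * accel(u)
    u += dt * v
    v += 0.5 * dt * accel(u)

drift = abs(energy(u, v) - E0) / E0
print(f"relative energy drift after {steps} steps: {drift:.2e}")
```

The small, bounded energy drift is the practical signature of a symmetric integrator; the paper's contribution is to reach arbitrarily high order while keeping this structure.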
Computational domain discretization in numerical analysis of flow within granular materials
NASA Astrophysics Data System (ADS)
Sosnowski, Marcin
2018-06-01
The discretization of the computational domain is a crucial step in Computational Fluid Dynamics (CFD) because it influences not only the numerical stability of the analysed model but also the agreement between the obtained results and real data. Modelling flow in packed beds of granular materials is a very challenging task in terms of discretization due to the narrow spaces between spherical granules contacting tangentially at a single point. The standard approach to this issue yields a low-quality mesh and, in consequence, unreliable results. Therefore the common method is to reduce the diameter of the modelled granules in order to eliminate the single-point contact between individual granules. The drawback of this method is that it distorts, among other things, the flow and the contact heat resistance. Therefore an innovative method is proposed in the paper: the single-point contact is extended to a cylinder-shaped volume contact. Such an approach eliminates the low-quality mesh elements while introducing only slight distortion to the flow and to the contact heat transfer. The analysis of numerous test cases proves the great potential of the proposed method for meshing packed beds of granular materials.
NASA Astrophysics Data System (ADS)
Fauzi, Ahmad
2017-11-01
Numerical computation has many pedagogical advantages: it develops analytical and problem-solving skills, helps students learn through visualization, and enhances physics education. Unfortunately, numerical computation is not taught to undergraduate physics education students in Indonesia. Incorporating numerical computation into the undergraduate physics education curriculum presents many challenges, chiefly a dense curriculum that leaves little room for a new numerical computation course and the fact that most students have no programming experience. In this research, we used a case study to examine how to integrate numerical computation into the undergraduate physics education curriculum. The participants were 54 fourth-semester students of a physics education department. We concluded that numerical computation can be integrated into the undergraduate physics education curriculum using Excel spreadsheets combined with another course. These results complement existing studies on how to integrate numerical computation into physics learning using spreadsheets.
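To make concrete what a spreadsheet-based numerical exercise looks like, the sketch below mirrors the spreadsheet layout in Python: each list is a column, each loop iteration a row, and the method is Euler's (free fall with linear drag; all physical constants are illustrative).

```python
# Spreadsheet-style Euler integration of dv/dt = g - k*v (free fall with drag).
# Each list corresponds to one spreadsheet column, each loop pass to one row.
g, k, dt = 9.8, 0.25, 0.1       # gravity, drag coefficient per unit mass, step
t, v = [0.0], [0.0]
for _ in range(100):            # 100 rows, i.e. 10 s of simulated fall
    a = g - k * v[-1]           # acceleration from the previous row
    v.append(v[-1] + a * dt)    # next-row velocity = velocity + a*dt
    t.append(t[-1] + dt)

v_terminal = g / k              # analytical terminal velocity for comparison
print(f"v(10 s) = {v[-1]:.2f} m/s, terminal = {v_terminal:.2f} m/s")
```

The same columns and the same row-by-row update translate cell for cell into Excel, which is the point of the curricular approach described above.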
When good times go bad: managing 'legal high' complications in the emergency department.
Caffrey, Charles R; Lank, Patrick M
2018-01-01
Patients can use numerous drugs that fall outside existing regulatory statutes in order to get "legal highs." Legal psychoactive substances represent a challenge to the emergency medicine physician due to the sheer number of available agents, their multiple toxidromes and presentations, their ability to evade traditional methods of analysis, and the reluctance of patients to divulge their use of these agents. This paper endeavors to cover a wide variety of "legal highs," or uncontrolled psychoactive substances that may have abuse potential and may result in serious toxicity. These agents include not only some novel psychoactive substances, also known as "designer drugs," but also a wide variety of over-the-counter medications, herbal supplements, and even a household culinary spice. The care of patients in the emergency department who have used "legal high" substances is challenging: patients may not know which substance they have been exposed to, readily available laboratory confirmatory tests for these substances rarely exist, and the exact substances being abused may change on a near-daily basis. This review attempts to group legal agents into expected toxidromes and discusses associated common clinical manifestations and management. Aggressive symptom-based supportive care, together with management of end-organ dysfunction, is the mainstay of treatment for these patients in the emergency department.
Dahmen, Jessamyn; Cook, Diane J; Wang, Xiaobo; Honglei, Wang
2017-08-01
Smart home design has undergone a metamorphosis in recent years. The field has evolved from designing theoretical smart home frameworks and performing scripted tasks in laboratories; we now find robust smart home technologies commonly used by large segments of the population in a variety of settings. Recent smart home applications are focused on activity recognition, health monitoring, and automation. In this paper, we take a look at another important role for smart homes: security. We first explore the numerous ways smart homes can and do provide protection for their residents. Next, we provide a comparative analysis of the alternative tools and research that have been developed for this purpose. We examine not only the commercial products that have been introduced but also the numerous research efforts focused on detecting and identifying potential threats. Finally, we close with open challenges and ideas for future research that will keep individuals secure and healthy while in their own homes.
NASA Astrophysics Data System (ADS)
Williamson, C. E.; Weathers, K. C.; Knoll, L. B.; Brentrup, J.
2012-12-01
Recent rapid advances in sensor technology and cyberinfrastructure have enabled the development of numerous environmental observatories ranging from local networks at field stations and marine laboratories (FSML) to continental scale observatories such as the National Ecological Observatory Network (NEON) to global scale observatories such as the Global Lake Ecological Observatory Network (GLEON). While divergent goals underlie the initial development of these observatories, and they are often designed to serve different communities, many opportunities for synergies exist. In addition, the use of existing infrastructure may enhance the cost-effectiveness of building and maintaining large scale observatories. For example, FSMLs are established facilities with the staff and infrastructure to host sensor nodes of larger networks. Many field stations have existing staff and long-term databases as well as smaller sensor networks that are the product of a single or small group of investigators with a unique data management system embedded in a local or regional community. These field station based facilities and data are a potentially untapped gold mine for larger continental and global scale observatories; common ecological and environmental challenges centered on understanding the impacts of changing climate, land use, and invasive species often underlie these efforts. The purpose of this talk is to stimulate a dialog on the challenges of merging efforts across these different spatial and temporal scales, as well as addressing how to develop synergies among observatory networks with divergent roots and philosophical approaches. For example, FSMLs have existing long-term databases and facilities, while NEON has sparse past data but a well-developed template and closely coordinated team working in a coherent format across a continental scale. 
GLEON on the other hand is a grass-roots network of experts in science, information technology, and engineering with a common goal of building a scalable network around the world to understand and predict how lakes respond to global change. Creating synergies among networks at these divergent scales requires open discussions ranging from data collection and management to data serving and sharing. Coordination of these efforts can provide an additional opportunity to educate both students and the public in innovative new ways about the broader continental to global scale of ecological and environmental challenges that they have observed in their more local ecosystems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nilsson, Mikael
Advanced nuclear fuel cycles rely on successful chemical separation of various elements in the used fuel. Numerous solvent extraction (SX) processes have been developed for the recovery and purification of metal ions from this used material. However, the predictability of process operations has been challenged by the lack of a fundamental understanding of the chemical interactions in several of these separation systems. For example, gaps in the thermodynamic description of the mechanism and the complexes formed will make predictions very challenging. Recent studies of certain extraction systems under development and a number of more established SX processes have suggested that aggregate formation in the organic phase results in a transformation of its selectivity and efficiency. Aggregation phenomena have consistently been interfering in SX process development, and have, over the years, become synonymous with an undesirable effect that must be prevented. This multiyear, multicollaborative research effort was carried out to study solvation and self-organization in non-aqueous solutions at conditions promoting aggregation phenomena. Our approach to this challenging topic was to investigate extraction systems comprising more than one extraction reagent where synergy of the metal ion could be observed. These systems were probed for the existence of stable microemulsions in the organic phase, and a number of high-end characterization tools were employed to elucidate the role of the aggregates in metal ion extraction. The ultimate goal was to find connections between synergy of metal ion extraction and reverse micellar formation. Our main accomplishment for this project was the expansion of the understanding of metal ion complexation in the extraction system combining tributyl phosphate (TBP) and dibutyl phosphoric acid (HDBP).
We have found that for this system no direct correlation exists between metal ion extraction and the formation of aggregates, meaning that the metal ion is not solubilized in a reverse micelle core. Rather, we have found solid evidence that the metal ions are extracted and coordinated by the organic ligands, as suggested by classic SX theories. However, we have challenged the existence of mixed complexes that have been suggested to exist in this particular extraction system. Most importantly, we have generated a wealth of information, trained students on important laboratory techniques, and strengthened the collaboration between the DOE national laboratories and the U.S. educational institutions involved in this work.
A systematic literature review of Burgers' equation with recent advances
NASA Astrophysics Data System (ADS)
Bonkile, Mayur P.; Awasthi, Ashish; Lakshmi, C.; Mukundan, Vijitha; Aswin, V. S.
2018-06-01
Even though numerical simulation of Burgers' equation is well documented in the literature, a detailed literature survey indicates that gaps still exist for comparative discussion regarding the physical and mathematical significance of Burgers' equation. Recently, increasing interest has developed within the scientific community in studying non-linear convective-diffusive partial differential equations, partly due to the tremendous improvement in computational capacity. Burgers' equation, whose exact solution is well known, is one of the famous non-linear partial differential equations and is suitable for the analysis of various important application areas. A brief historical review of not only the mathematical but also the physical significance of the solution of Burgers' equation is presented, emphasising current research strategies, and the challenges that remain regarding the accuracy, stability, and convergence of various schemes are discussed. One objective of this paper is to discuss recent developments in the mathematical modelling of Burgers' equation and thus open doors for improvement. No claim is made that the content of the paper is new; rather, it is a sincere effort to outline the physical and mathematical importance of Burgers' equation in the most simplified way. We throw some light on the plethora of challenges which need to be overcome in these research areas and give motivation for the next breakthrough in the numerical simulation of ordinary/partial differential equations.
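For readers meeting the equation for the first time, here is the simplest scheme in a survey's taxonomy: an explicit finite-difference discretization (forward Euler in time, central differences in space) of the viscous Burgers' equation u_t + u*u_x = nu*u_xx on a periodic domain. Grid, time step, and viscosity below are illustrative, and the scheme is a pedagogical baseline rather than a recommended method.

```python
import numpy as np

# Explicit finite differences for u_t + u*u_x = nu*u_xx, periodic on [0, 2*pi).
N, nu, dt, steps = 200, 0.1, 2e-4, 5000
x = np.linspace(0.0, 2 * np.pi, N, endpoint=False)
dx = x[1] - x[0]
u = np.sin(x)                       # classic test initial condition u(x,0) = sin(x)

for _ in range(steps):              # forward Euler in time
    ux = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)            # central u_x
    uxx = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2      # central u_xx
    u = u + dt * (nu * uxx - u * ux)

print(f"max |u| at t = {steps * dt:.1f}: {np.max(np.abs(u)):.3f}")
```

The solution steepens near x = pi while viscosity damps the amplitude; by the maximum principle the amplitude stays below its initial value of 1. More accurate and stable alternatives are exactly what the surveyed literature compares.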
A memetic optimization algorithm for multi-constrained multicast routing in ad hoc networks
Hammad, Karim; El Bakly, Ahmed M.
2018-01-01
A mobile ad hoc network is a conventional self-configuring network where the routing optimization problem—subject to various Quality-of-Service (QoS) constraints—represents a major challenge. Unlike previously proposed solutions, in this paper, we propose a memetic algorithm (MA) employing an adaptive mutation parameter, to solve the multicast routing problem with higher search ability and computational efficiency. The proposed algorithm utilizes an updated scheme, based on statistical analysis, to estimate the best values for all MA parameters and enhance MA performance. The numerical results show that the proposed MA improved the delay and jitter of the network, while reducing computational complexity as compared to existing algorithms. PMID:29509760
A memetic optimization algorithm for multi-constrained multicast routing in ad hoc networks.
Ramadan, Rahab M; Gasser, Safa M; El-Mahallawy, Mohamed S; Hammad, Karim; El Bakly, Ahmed M
2018-01-01
A mobile ad hoc network is a conventional self-configuring network where the routing optimization problem, subject to various Quality-of-Service (QoS) constraints, represents a major challenge. Unlike previously proposed solutions, in this paper, we propose a memetic algorithm (MA) employing an adaptive mutation parameter, to solve the multicast routing problem with higher search ability and computational efficiency. The proposed algorithm utilizes an updated scheme, based on statistical analysis, to estimate the best values for all MA parameters and enhance MA performance. The numerical results show that the proposed MA improved the delay and jitter of the network, while reducing computational complexity as compared to existing algorithms.
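The division of labour in a memetic algorithm (global genetic search plus per-offspring local refinement, with an adapting mutation parameter) can be sketched in a few dozen lines. The toy below minimizes a simple continuous function; the actual paper operates on multicast trees under QoS constraints and uses a statistically derived parameter-update scheme, neither of which is reproduced here.

```python
import random

def fitness(x):
    return sum(v * v for v in x)            # minimize; optimum at the origin

def local_search(x, steps=(0.25, 0.05)):
    """Greedy coordinate refinement: the 'memetic' step applied to offspring."""
    for i in range(len(x)):
        for s in steps:
            for d in (-s, s):
                y = list(x)
                y[i] += d
                if fitness(y) < fitness(x):
                    x = y
    return x

random.seed(1)
dim, pop_size, gens = 4, 30, 60
pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
mut = 0.5                                    # adaptive mutation scale

for _ in range(gens):
    pop.sort(key=fitness)
    parents = pop[: pop_size // 2]           # elitist selection
    children = []
    while len(children) < pop_size - len(parents):
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, dim)
        child = a[:cut] + b[cut:]            # one-point crossover
        child = [v + random.gauss(0.0, mut) if random.random() < 0.2 else v
                 for v in child]             # mutation
        children.append(local_search(child)) # local refinement
    pop = parents + children
    mut = max(0.01, mut * 0.95)              # adapt: shrink mutation over time

print(f"best fitness after {gens} generations: "
      f"{fitness(min(pop, key=fitness)):.4f}")
```

The geometric shrinking of `mut` is a stand-in for adaptation; the paper's scheme estimates parameter values from statistical analysis of the search instead.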
Workplace violence in hospitals: safe havens no more.
Warren, Bryan
2011-01-01
Healthcare presents many security challenges, particularly when it comes to workplace violence prevention. With a staff population that is approximately 80% female, 24-hour operations, numerous points of ingress and egress, and the high tension environment that exists in today's hospitals and urgent care centers, the stage is set for the "perfect storm" of workplace violence, the author points out. He cites statistics that healthcare workers are at a much higher risk of victimization than workers in other industries. The best strategy to prevent workplace violence in the healthcare environment, he says, is to develop a corporate culture that supports respect, open communication, employee involvement and participation and an effective training program.
Loisel, Patrick; Buchbinder, Rachelle; Hazard, Rowland; Keller, Robert; Scheel, Inger; van Tulder, Maurits; Webster, Barbara
2005-12-01
The process of returning disabled workers to work presents numerous challenges. In spite of the growing evidence regarding work disability prevention, little uptake of this evidence has been observed. One reason for limited dissemination of evidence is the complexity of the problem, as it is subject to multiple legal, administrative, social, political, and cultural challenges. A literature review and a collection of expert opinion are presented on the current evidence for work disability prevention and on barriers to evidence implementation. Recommendations are presented for enhancing implementation of research results. The current evidence regarding work disability prevention shows that some clinical interventions (advice to return to modified work and graded activity programs) and some non-clinical interventions (at a service and policy/community level, but not at a practice level) are effective in reducing work absenteeism. Implementation of evidence in work disability is a major challenge because intervention recommendations are often imprecise and not yet practical for immediate use, many barriers exist, and many stakeholders are involved. Future studies should involve all relevant stakeholders and aim at developing new strategies that are effective, efficient, and have a potential for successful implementation. These studies should be based upon a clearer conceptualization of the broader context and inter-relationships that determine return-to-work outcomes.
ERIC Educational Resources Information Center
Marchetti, Carol; Foster, Susan; Long, Gary; Stinson, Michael
2012-01-01
Teachers of introductory technical courses such as statistics face numerous challenges in the classroom, including student motivation and mathematical background, and difficulties in interpreting numerical results in context. Cooperative learning through small groups addresses many such challenges, but students for whom spoken English is not their…
NASA Astrophysics Data System (ADS)
Abbasbandy, S.; Van Gorder, R. A.; Hajiketabi, M.; Mesrizadeh, M.
2015-10-01
We consider traveling wave solutions to the Casimir equation for the Ito system (a two-field extension of the KdV equation). These traveling waves are governed by a nonlinear initial value problem with an interesting nonlinearity (which actually amplifies in magnitude as the size of the solution becomes small). The nonlinear problem is parameterized by two initial constant values, and we demonstrate that the existence of solutions is strongly tied to these parameter values. For our interests, we are concerned with positive, bounded, periodic wave solutions. We are able to classify parameter regimes which admit such solutions in full generality, thereby obtaining a nice existence result. Using the existence result, we are then able to numerically simulate the positive, bounded, periodic solutions. We elect to employ a group preserving scheme in order to numerically study these solutions, and an outline of this approach is provided. The numerical simulations serve to illustrate the properties of these solutions predicted analytically through the existence result. Physically, these results demonstrate the existence of a type of space-periodic structure in the Casimir equation for the Ito model, which propagates as a traveling wave.
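The flavour of such computations can be conveyed with a deliberately simple stand-in. The equation below, u'' = -u + 1/u^3, is NOT the Casimir equation of the paper; it is a hypothetical substitute sharing the qualitative feature described above (the nonlinearity amplifies in magnitude as u becomes small) while admitting positive, bounded, periodic solutions. A classical RK4 integrator (not the group-preserving scheme) is used; for the initial values u(0) = 2, u'(0) = 0 this substitute's exact solution oscillates between 0.5 and 2 with period pi.

```python
import math

# Stand-in traveling-wave IVP: u'' = f(u) with f(u) = -u + 1/u^3, a
# HYPOTHETICAL nonlinearity that grows as u -> 0, like the one described above.
def f(u):
    return -u + 1.0 / u**3

def rk4_step(u, v, dt):
    """Classical RK4 for the first-order system u' = v, v' = f(u)."""
    k1u, k1v = v, f(u)
    k2u, k2v = v + 0.5 * dt * k1v, f(u + 0.5 * dt * k1u)
    k3u, k3v = v + 0.5 * dt * k2v, f(u + 0.5 * dt * k2u)
    k4u, k4v = v + dt * k3v, f(u + dt * k3u)
    u += dt * (k1u + 2 * k2u + 2 * k3u + k4u) / 6
    v += dt * (k1v + 2 * k2v + 2 * k3v + k4v) / 6
    return u, v

u, v, dt = 2.0, 0.0, 1e-4          # the two initial constants parameterize the wave
umin = u
for _ in range(int(math.pi / dt)): # integrate over one expected period
    u, v = rk4_step(u, v, dt)
    umin = min(umin, u)

print(f"u after one period: {u:.4f} (started at 2.0), min u: {umin:.4f}")
```

Returning to the starting value after one period, while staying strictly positive and bounded, is the numerical signature of the space-periodic traveling-wave structure the abstract describes.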
A suite of benchmark and challenge problems for enhanced geothermal systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Mark; Fu, Pengcheng; McClure, Mark
A diverse suite of numerical simulators is currently being applied to predict or understand the performance of enhanced geothermal systems (EGS). To build confidence and identify critical development needs for these analytical tools, the United States Department of Energy, Geothermal Technologies Office sponsored a Code Comparison Study (GTO-CCS), with participants from universities, industry, and national laboratories. A principal objective for the study was to create a community forum for improvement and verification of numerical simulators for EGS modeling. Teams participating in the study were those representing U.S. national laboratories, universities, and industries, and each team brought unique numerical simulation capabilities to bear on the problems. Two classes of problems were developed during the study: benchmark problems and challenge problems. The benchmark problems were structured to test the ability of the collection of numerical simulators to solve various combinations of coupled thermal, hydrologic, geomechanical, and geochemical processes. This class of problems was strictly defined in terms of properties, driving forces, initial conditions, and boundary conditions. The challenge problems were based on the enhanced geothermal systems research conducted at Fenton Hill, near Los Alamos, New Mexico, between 1974 and 1995. The problems involved two phases of research (stimulation, development, and circulation) in two separate reservoirs. The challenge problems had specific questions to be answered via numerical simulation in three topical areas: 1) reservoir creation/stimulation, 2) reactive and passive transport, and 3) thermal recovery.
Whereas the benchmark class of problems was designed to test capabilities for modeling coupled processes under strictly specified conditions, the stated objective for the challenge class of problems was to demonstrate what new understanding of the Fenton Hill experiments could be realized via the application of modern numerical simulation tools by recognized expert practitioners. We present the suite of benchmark and challenge problems developed for the GTO-CCS, providing problem descriptions and sample solutions.
Building a genome analysis pipeline to predict disease risk and prevent disease.
Bromberg, Y
2013-11-01
Reduced costs and increased speed and accuracy of sequencing can bring the genome-based evaluation of individual disease risk to the bedside. While past efforts have identified a number of actionable mutations, the bulk of genetic risk remains hidden in sequence data. The biggest challenge facing genomic medicine today is the development of new techniques to predict the specifics of a given human phenome (set of all expressed phenotypes) encoded by each individual variome (full set of genome variants) in the context of the given environment. Numerous tools exist for the computational identification of the functional effects of a single variant. However, the pipelines taking advantage of full genomic, exomic, transcriptomic (and other) sequences have only recently become a reality. This review looks at the building of methodologies for predicting "variome"-defined disease risk. It also discusses some of the challenges for incorporating such a pipeline into everyday medical practice.
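To make the "pipeline" notion concrete, here is a deliberately tiny sketch of one aggregation stage: combining precomputed per-variant functional-effect scores into gene-level burdens and flagging genes for review. Every gene name, score, combination rule, and threshold below is invented for illustration; real pipelines, and the clinical thresholds they use, are far more elaborate.

```python
# Hypothetical per-variant effect scores (0 = benign, 1 = strongly damaging),
# as would be emitted by an upstream variant-effect prediction tool.
variants = [
    {"gene": "GENE_A", "score": 0.92},
    {"gene": "GENE_A", "score": 0.15},
    {"gene": "GENE_B", "score": 0.40},
    {"gene": "GENE_C", "score": 0.88},
    {"gene": "GENE_C", "score": 0.81},
]

# Gene burden as 1 minus the product of per-variant "benign" probabilities
# (an illustrative independence assumption, not a validated clinical rule).
burden = {}
for v in variants:
    burden[v["gene"]] = 1 - (1 - burden.get(v["gene"], 0.0)) * (1 - v["score"])

flagged = sorted(g for g, b in burden.items() if b > 0.9)
print(burden)
print("flag for review:", flagged)
```

A real pipeline would wrap many such stages (alignment, variant calling, annotation, scoring, aggregation, reporting), each with its own validation burden; the challenge the review highlights is making the whole chain trustworthy enough for everyday medical practice.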
Management of refractory pityriasis rubra pilaris: challenges and solutions
Moretta, Gaia; De Luca, Erika V; Di Stefani, Alessandro
2017-01-01
Pityriasis rubra pilaris (PRP) is a rare chronic inflammatory papulosquamous skin disease. Its clinical presentation and evolution are highly variable. The most frequent clinical features are follicular papules, progressing to yellow-orange erythroderma with small round areas of normal skin, and well-demarcated palmoplantar keratoderma. Currently, six different types of PRP have been described based on clinical characteristics, age of onset, and prognosis. The pathogenesis is still unknown, and treatment can be challenging. Available treatments are based mainly on case reports or case series of clinical experience, because no randomized controlled trials have ever been performed owing to the rarity of the condition. Traditional systemic treatment consists of retinoids, which are currently considered first-line therapy, but refractory cases that do not respond, or that relapse after drug interruption, do exist. In recent years, numerous reports have demonstrated the efficacy of new agents such as biological drugs. This article is an overview of available therapeutic options, in particular for refractory forms of PRP. PMID:29184428
Challenges in reducing dengue burden; diagnostics, control measures and vaccines.
Lam, Sai Kit
2013-09-01
Dengue is a major public health concern worldwide, with the number of infections increasing globally. The illness imposes the greatest economic and human burden on developing countries that have limited resources to deal with the scale of the problem. No cure for dengue exists; treatment is limited to rehydration therapy, and with vector control strategies proving to be relatively ineffective, a vaccine is an urgent priority. Despite the numerous challenges encountered in the development of a dengue vaccine, several vaccine candidates have shown promise in clinical development and it is believed that a vaccination program would be at least as cost-effective as current vector control programs. The lead candidate vaccine is a tetravalent, live attenuated, recombinant vaccine, which is currently in Phase III clinical trials. Vaccine introduction is a complex process that requires careful consideration and is discussed here. This review discusses the epidemiology, burden and pathogenesis of dengue, as well as the vaccine candidates currently in clinical development.
Clinical neuroscience of addiction: similarities and differences between alcohol and other drugs.
Karoly, Hollis C; YorkWilliams, Sophie L; Hutchison, Kent E
2015-11-01
Existing pharmacological treatments for alcohol use disorder (AUD) and other substance use disorders (SUDs) have demonstrated only modest efficacy. Although the field has recently emphasized testing and developing new compounds to treat SUDs, there are numerous challenges inherent to the development of novel medications, and this is particularly true for SUDs. Thus, research to date has tended toward the "repurposing" approach, in which medications developed to treat other mental or physical conditions are tested as SUD treatments. Often, potential treatments are examined across numerous drugs of abuse. Several repurposed medications have shown promise in treating a specific SUD, but few have shown efficacy across multiple SUDs. Examining similarities and differences between AUD and other SUDs may shed light on these findings and offer directions for future research. This qualitative review discusses similarities and differences in neural circuitry and molecular mechanism(s) across alcohol and other substances of abuse, and examines studies of pharmacotherapies for AUD and other SUDs. Substances of abuse share numerous molecular targets and involve much of the same neural circuitry, yet compounds tested because they putatively target common mechanisms have rarely indicated therapeutic promise for multiple SUDs. The lack of treatment efficacy across SUDs may be partially explained by limitations inherent in studying substance users, who comprise a highly heterogeneous population. Alternatively, medications may fail to show efficacy across multiple SUDs due to the fact that the differences between drug mechanisms are more important than their commonalities in terms of influencing treatment response. We suggest that exploring these differences could support novel treatment development, aid in identifying existing medications that may hold promise as treatments for specific SUDs, and ultimately advance translational research efforts. 
Copyright © 2015 by the Research Society on Alcoholism.
Modeling and Analysis of Wrinkled Membranes: An Overview
NASA Technical Reports Server (NTRS)
Yang, B.; Ding, H.; Lou, M.; Fang, H.; Broduer, Steve (Technical Monitor)
2001-01-01
Thin-film membranes are basic elements of a variety of space inflatable/deployable structures. Wrinkling degrades the performance and reliability of these membrane structures, and hence has been a topic of continued interest. Wrinkling analysis of membranes of general geometry and arbitrary boundary conditions is quite challenging. The objective of this presentation is two-fold. First, the existing models of wrinkled membranes and related numerical solution methods are reviewed. The important issues to be discussed are the capability of a membrane model to characterize taut, wrinkled, and slack states of membranes in a consistent and physically reasonable manner; the ability of a wrinkling analysis method to predict the formation and growth of wrinkled regions, and to determine out-of-plane deformation and wrinkle waves; the convergence of a numerical solution method for wrinkling analysis; and the compatibility of a wrinkling analysis with general-purpose finite element codes. Based on this review, several open issues in the modeling and analysis of wrinkled membranes to be addressed in future research are summarized. The second objective of this presentation is to introduce a newly developed membrane model with two variable parameters (2-VP model) and an associated parametric finite element method (PFEM) for wrinkling analysis. The innovations and advantages of the proposed membrane model and PFEM-based wrinkling analysis are: (1) via a unified stress-strain relation, the 2-VP model treats the taut, wrinkled, and slack states of membranes consistently; (2) the PFEM-based wrinkling analysis has guaranteed convergence; (3) the 2-VP model along with the PFEM is capable of predicting membrane out-of-plane deformations; and (4) the PFEM can be integrated into any existing finite element code. Preliminary numerical examples are also included in this presentation to demonstrate the 2-VP model and the PFEM-based wrinkling analysis approach.
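A common way to make the taut/wrinkled/slack distinction concrete is the mixed principal stress-strain criterion widely used in the wrinkling literature. The sketch below is that generic criterion, not the 2-VP model itself:

```python
def membrane_state(sigma2, eps1):
    """Classify a membrane element as taut, wrinkled, or slack using
    the standard mixed stress-strain criterion from the wrinkling
    literature (a generic criterion, not the 2-VP model itself).

    sigma2 : minor principal stress in the element
    eps1   : major principal strain in the element
    """
    if sigma2 > 0.0:
        return "taut"      # both principal stresses tensile
    if eps1 <= 0.0:
        return "slack"     # no tensile strain in any direction
    return "wrinkled"      # uniaxial tension: wrinkles form

# Example classifications (units arbitrary)
print(membrane_state(sigma2=5.0, eps1=1e-3))    # taut
print(membrane_state(sigma2=-1.0, eps1=-1e-4))  # slack
print(membrane_state(sigma2=-1.0, eps1=2e-3))   # wrinkled
```

Membrane models such as the one reviewed here replace the stiffness in the "wrinkled" branch so that compressive stresses are relaxed rather than carried.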
Confidentiality in participatory research: Challenges from one study.
Petrova, Elmira; Dewing, Jan; Camilleri, Michelle
2016-06-01
This article presents key ethical challenges that were encountered when conducting a participatory qualitative research project with a very specific, small group of nurses, in this case practice development nurses in Malta. With the small number of nurses employed in practice development roles in Malta, there are numerous difficulties in maintaining confidentiality. Poorly constructed interventions by the researcher could have resulted in detrimental effects on research participants and the overall trustworthiness of the research. Generally, ethical guidelines for research exist to reinforce the validity of research; however, there is no established consensus on how these strategies can be utilised in some types of qualitative fieldwork. The researcher used an exploratory case study methodology. The sample consisted of 10 participants who were interviewed twice using face-to-face interviews, over a period of 2 months. The study was ethically reviewed by the University Research Ethics Committee and the Faculty Research Ethics Committee, University of Malta. The participants referred to in this article were given adequate information about the study, and their consent was obtained. Numerous strategies for ensuring confidentiality during recruitment of the participants, during data collection, during transcription and data analysis, and during dissemination of research results assisted the researcher in responding to potential and actual ethical issues. This article emphasises the main strategies that can be used to respond to ethical challenges when researching with a small, easily identifiable group. The learning discussed here may be relevant or even transferable to other similar research studies or research contexts. These methods fostered greater credibility throughout the research process and predisposed the participants to greater trust; they thus disclosed their experiences and spoke more freely, enhancing the quality of the study.
© The Author(s) 2014.
Radaelli, A G; Augsburger, L; Cebral, J R; Ohta, M; Rüfenacht, D A; Balossino, R; Benndorf, G; Hose, D R; Marzo, A; Metcalfe, R; Mortier, P; Mut, F; Reymond, P; Socci, L; Verhegghe, B; Frangi, A F
2008-07-19
This paper presents the results of the Virtual Intracranial Stenting Challenge (VISC) 2007, an international initiative whose aim was to establish the reproducibility of state-of-the-art haemodynamical simulation techniques in subject-specific stented models of intracranial aneurysms (IAs). IAs are pathological dilatations of the cerebral artery walls, which are associated with high mortality and morbidity rates due to subarachnoid haemorrhage following rupture. The deployment of a stent as flow diverter has recently been indicated as a promising treatment option, which has the potential to protect the aneurysm by reducing the action of haemodynamical forces and facilitating aneurysm thrombosis. The direct assessment of changes in aneurysm haemodynamics after stent deployment is hampered by limitations in existing imaging techniques and currently requires resorting to numerical simulations. Numerical simulations also have the potential to assist in the personalized selection of an optimal stent design prior to intervention. However, from the current literature it is difficult to assess the level of technological advancement and the reproducibility of haemodynamical predictions in stented patient-specific models. The VISC 2007 initiative engaged in the development of a multicentre-controlled benchmark to analyse differences induced by diverse grid generation and computational fluid dynamics (CFD) technologies. The challenge also represented an opportunity to provide a survey of available technologies currently adopted by international teams from both academic and industrial institutions for constructing computational models of stented aneurysms. 
The results demonstrate the ability of current strategies to consistently quantify the performance of three commercial intracranial stents, and they help reinforce confidence in haemodynamical simulation, thus taking a step forward towards the introduction of simulation tools to support diagnostics and interventional planning.
Banger, Kamaljit; Yuan, Mingwei; Wang, Junming; Nafziger, Emerson D.; Pittelkow, Cameron M.
2017-01-01
Meeting crop nitrogen (N) demand while minimizing N losses to the environment has proven difficult despite significant field research and modeling efforts. To improve N management, several real-time N management tools have been developed with a primary focus on enhancing crop production. However, no coordinated effort exists to simultaneously address sustainability concerns related to N losses at field- and regional-scales. In this perspective, we highlight the opportunity for incorporating environmental effects into N management decision support tools for United States maize production systems by integrating publicly available crop models with grower-entered management information and gridded soil and climate data in a geospatial framework specifically designed to quantify environmental and crop production tradeoffs. To facilitate advances in this area, we assess the capability of existing crop models to provide in-season N recommendations while estimating N leaching and nitrous oxide emissions, discuss several considerations for initial framework development, and highlight important challenges related to improving the accuracy of crop model predictions. Such a framework would benefit the development of regional sustainable intensification strategies by enabling the identification of N loss hotspots which could be used to implement spatially explicit mitigation efforts in relation to current environmental quality goals and real-time weather conditions. Nevertheless, we argue that this long-term vision can only be realized by leveraging a variety of existing research efforts to overcome challenges related to improving model structure, accessing field data to enhance model performance, and addressing the numerous social difficulties in the delivery and adoption of such tools by stakeholders. PMID:28804490
NASA Astrophysics Data System (ADS)
Shahzad, M.; Rizvi, H.; Panwar, A.; Ryu, C. M.
2017-06-01
We have revisited the existence criterion of the reverse shear Alfven eigenmodes (RSAEs) in the presence of the parallel equilibrium current by numerically solving the eigenvalue equation using a fast eigenvalue solver code KAES. The parallel equilibrium current can bring in the kink effect and is known to be strongly unfavorable for the RSAE. We have numerically estimated the critical value of the toroidicity factor Qtor in a circular tokamak plasma, above which RSAEs can exist, and compared it to the analytical one. The difference between the numerical and analytical critical values is small for low frequency RSAEs, but it increases as the frequency of the mode increases, becoming greater for higher poloidal harmonic modes.
Automatically Generated Algorithms for the Vertex Coloring Problem
Contreras Bolton, Carlos; Gatica, Gustavo; Parada, Víctor
2013-01-01
The vertex coloring problem is a classical problem in combinatorial optimization that consists of assigning a color to each vertex of a graph such that no adjacent vertices share the same color, minimizing the number of colors used. Despite the various practical applications that exist for this problem, its NP-hardness still represents a computational challenge. Some of the best computational results obtained for this problem are consequences of hybridizing the various known heuristics. Automatically exploring the space formed by combining these techniques to find the most adequate combination has received less attention. In this paper, we propose exploring the heuristics space for the vertex coloring problem using evolutionary algorithms. We automatically generate three new algorithms by combining elementary heuristics. To evaluate the new algorithms, a computational experiment was performed that allowed comparing them numerically with existing heuristics. The obtained algorithms present an average relative error of 29.97%, while four other heuristics selected from the literature present a 59.73% error, considering 29 of the more difficult instances in the DIMACS benchmark. PMID:23516506
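One of the elementary heuristics that such generated algorithms typically build on is first-fit greedy coloring. A minimal sketch of that single building block (the evolutionary machinery that combines heuristics is not shown):

```python
def greedy_coloring(adj, order=None):
    """First-fit greedy coloring: visit vertices in a given order and
    assign each the smallest color not used by an already-colored
    neighbor.  One elementary heuristic of the kind combined by the
    evolutionary approach described above.

    adj : dict mapping each vertex to the set of its neighbors
    """
    order = order or sorted(adj)
    color = {}
    for v in order:
        used = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in used:          # smallest color absent among neighbors
            c += 1
        color[v] = c
    return color

# 4-cycle: two colors suffice, and greedy finds them in this order
adj = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
coloring = greedy_coloring(adj)
print(max(coloring.values()) + 1)  # -> 2
```

The visiting order strongly affects the color count, which is exactly the kind of degree of freedom an evolutionary search can exploit.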
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-11-01
The Data Fusion Modeling (DFM) approach has been used to develop a groundwater flow and transport model of the Old Burial Grounds (OBG) at the US Department of Energy's Savannah River Site (SRS). The resulting DFM model was compared to an existing model that was calibrated via the typical trial-and-error method. The OBG was chosen because a substantial amount of hydrogeologic information is available, a FACT (derivative of VAM3DCG) flow and transport model of the site exists, and the calibration and numerics were challenging with standard approaches. The DFM flow model developed here is similar to the flow model by Flach et al. This allows comparison of the two flow models and validates the utility of DFM. The contaminant of interest for this study is tritium, because it is a geochemically conservative tracer that has been monitored along the seepline near the F-Area effluent and Fourmile Branch for several years.
Very Low Head Turbine Deployment in Canada
NASA Astrophysics Data System (ADS)
Kemp, P.; Williams, C.; Sasseville, Remi; Anderson, N.
2014-03-01
The Very Low Head (VLH) turbine is a recent turbine technology developed in Europe for low head sites in the 1.4 - 4.2 m range. The VLH turbine is primarily targeted for installation at existing hydraulic structures to provide a low impact, low cost, yet highly efficient solution. Over 35 VLH turbines have been successfully installed in Europe and the first VLH deployment for North America is underway at Wasdell Falls in Ontario, Canada. Deployment opportunities abound in Canada with an estimated 80,000 existing structures within North America for possible low-head hydro development. There are several new considerations and challenges for the deployment of the VLH turbine technology in Canada in adapting to the hydraulic, environmental, electrical and social requirements. Several studies were completed to determine suitable approaches and design modifications to mitigate risk and confirm turbine performance. Diverse types of existing weirs and spillways pose certain hydraulic design challenges. Physical and numerical modelling of the VLH deployment alternatives provided for performance optimization. For this application, studies characterizing the influence of upstream obstacles using water tunnel model testing as well as full-scale prototype flow dynamics testing were completed. A Cold Climate Adaptation Package (CCA) was developed to allow year-round turbine operation in ice covered rivers. The CCA package facilitates turbine extraction and accommodates ice forces, frazil ice, ad-freezing and cold temperatures that are not present at the European sites. The Permanent Magnet Generator (PMG) presents some unique challenges in meeting Canadian utility interconnection requirements. Specific attention to the frequency driver control and protection requirements resulted in a driver design with greater over-voltage capability for the PMG as well as other key attributes. 
Environmental studies in Europe included fish friendliness testing comprised of multiple in-river live passage tests for a wide variety of fish species. Latest test results indicate fish passage survivability close to 100%. Further fish studies are planned in Canada later this year. Successful deployment must meet societal requirements to gain community acceptance and public approval. Aesthetics considerations include low noise, disguised control buildings and vigilant turbine integration into the low profile existing structures. The resulting design was selected for deployment at existing historic National Park waterway structures. The integration of all of these design elements permits the successful deployment of the VLH turbine in Canada.
Wu, Hulin; Xue, Hongqi; Kumar, Arun
2012-06-01
Differential equations are extensively used for modeling the dynamics of physical processes in many scientific fields such as engineering, physics, and the biomedical sciences. Parameter estimation for differential equation models is a challenging problem because of the high computational cost and high-dimensional parameter space. In this article, we propose a novel class of methods for estimating parameters in ordinary differential equation (ODE) models, motivated by HIV dynamics modeling. The new methods exploit the form of numerical discretization algorithms for an ODE solver to formulate estimating equations. First, a penalized-spline approach is employed to estimate the state variables, and the estimated state variables are then plugged into a discretization formula of an ODE solver to obtain the ODE parameter estimates via a regression approach. We consider discretization methods of three different orders: Euler's method, the trapezoidal rule, and the Runge-Kutta method. A higher-order numerical algorithm reduces the numerical error in the approximation of the derivative, which produces a more accurate estimate, but its computational cost is higher. To balance computational cost and estimation accuracy, we demonstrate, via simulation studies, that the trapezoidal discretization-based estimate is the best and is recommended for practical use. The asymptotic properties of the proposed numerical discretization-based estimators are established. Comparisons between the proposed methods and existing methods show a clear benefit of the proposed methods with regard to the trade-off between computational cost and estimation accuracy. We apply the proposed methods to an HIV study to further illustrate their usefulness. © 2012, The International Biometric Society.
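The estimating-equation idea can be illustrated on the scalar ODE x' = theta * x: substitute (smoothed) state values into the trapezoidal-rule formula and solve the resulting regression for theta. A toy sketch, skipping the penalized-spline smoothing step by using exact state values:

```python
import numpy as np

def trapezoid_estimate(t, x):
    """Estimate theta in x' = theta * x from state values x at times t
    via the trapezoidal-rule estimating equation

        x_{i+1} - x_i  ~  (h_i / 2) * theta * (x_i + x_{i+1}).

    This is a toy stand-in for the discretization-based regression
    described above; the spline smoothing of noisy data is omitted.
    """
    h = np.diff(t)
    d = np.diff(x)                      # left-hand side of each equation
    m = 0.5 * h * (x[:-1] + x[1:])      # trapezoidal-rule regressor
    return float(np.dot(m, d) / np.dot(m, m))  # least-squares slope

# Exact exponential trajectory with theta = -0.5
t = np.linspace(0.0, 2.0, 41)
x = np.exp(-0.5 * t)
print(round(trapezoid_estimate(t, x), 3))  # -> -0.5
```

With noisy observations, the spline step supplies the smoothed x values, and the same linear regression recovers theta without ever calling an ODE solver inside the optimization loop.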
Observations and global numerical modelling of the St. Patrick's Day 2015 geomagnetic storm event
NASA Astrophysics Data System (ADS)
Foerster, M.; Prokhorov, B. E.; Doornbos, E.; Astafieva, E.; Zakharenkova, I.
2017-12-01
The most severe geomagnetic storm of solar cycle 24 began with a sudden storm commencement (SSC) at 04:45 UT on St. Patrick's Day 2015. It appeared as a two-stage geomagnetic storm with a minimum SYM-H value of -233 nT. In response to the storm commencement, a short-term positive effect in the ionospheric vertical total electron content (VTEC) occurred at low and mid latitudes on the dayside during the first activation. The second phase, commencing around 12:30 UT, lasted longer and caused significant and complex storm-time changes around the globe, with hemispherically different ionospheric storm reactions in different longitudinal ranges. Swarm-C observations of the neutral mass density variation along the orbital path, as well as Langmuir probe plasma and magnetometer measurements from all three Swarm satellites and global TEC records, are used for physical interpretation and modelling of the positive/negative storm scenario. These observations pose a challenge for the global numerical modelling of thermosphere-ionosphere storm processes, as the storm, which occurred around the spring equinox, clearly indicates that factors other than seasonal dependence must drive the hemispheric asymmetries. Numerical simulation trials using the Potsdam version of the Upper Atmosphere Model (UAM-P) are presented to explain these peculiar M-I-T storm processes.
Estimation of cardiac conductivities in ventricular tissue by a variational approach
NASA Astrophysics Data System (ADS)
Yang, Huanhuan; Veneziani, Alessandro
2015-11-01
The bidomain model is the current standard model to simulate cardiac potential propagation. The numerical solution of this system of partial differential equations strongly depends on the model parameters and in particular on the cardiac conductivities. Unfortunately, it is quite problematic to measure these parameters in vivo and even more so in clinical practice, resulting in no common agreement in the literature. In this paper we consider a variational data assimilation approach to estimating those parameters. We consider the parameters as control variables to minimize the mismatch between the computed and the measured potentials under the constraint of the bidomain system. The existence of a minimizer of the misfit function is proved with the phenomenological Rogers-McCulloch ionic model, which completes the bidomain system. We significantly improve on the numerical approaches in the literature by resorting to a derivative-based optimization method, resolving some challenges due to discontinuity. The improvement in computational efficiency is confirmed by a 2D test as a direct comparison with approaches in the literature. The core of our numerical results is in 3D, on both idealized and real geometries, with the minimal ionic model. We demonstrate the reliability and the stability of the conductivity estimation approach in the presence of noise and with an imperfect knowledge of other model parameters.
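The variational idea, treating conductivity as a control variable that minimizes the mismatch between simulated and measured fields, can be sketched on a toy 1-D diffusion problem. Here the heat equation stands in for the bidomain system and SciPy's bounded scalar minimizer stands in for the derivative-based optimizer used in the paper:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def simulate(k, n=21, steps=200, dt=2.5e-4):
    """Explicit finite-difference solve of u_t = k * u_xx on [0, 1]
    with zero boundaries, from a sine initial condition.  A toy
    forward model standing in for the bidomain system."""
    x = np.linspace(0.0, 1.0, n)
    dx = x[1] - x[0]
    u = np.sin(np.pi * x)
    for _ in range(steps):
        u[1:-1] += k * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    return u

# "Measured" data generated with a hidden true conductivity k = 0.7
u_obs = simulate(k=0.7)

# Variational estimate: choose k to minimize the data misfit
res = minimize_scalar(lambda k: np.sum((simulate(k) - u_obs)**2),
                      bounds=(0.1, 2.0), method="bounded")
print(round(res.x, 2))  # -> 0.7
```

In the actual method the forward model is the bidomain system, the misfit involves measured potentials, and gradients come through the adjoint or direct differentiation, but the control-to-misfit structure is the same.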
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Xiang; Yang, Chao; State Key Laboratory of Computer Science, Chinese Academy of Sciences, Beijing 100190
2015-03-15
We present a numerical algorithm for simulating the spinodal decomposition described by the three dimensional Cahn–Hilliard–Cook (CHC) equation, which is a fourth-order stochastic partial differential equation with a noise term. The equation is discretized in space and time based on a fully implicit, cell-centered finite difference scheme, with an adaptive time-stepping strategy designed to accelerate the progress to equilibrium. At each time step, a parallel Newton–Krylov–Schwarz algorithm is used to solve the nonlinear system. We discuss various numerical and computational challenges associated with the method. The numerical scheme is validated by a comparison with an explicit scheme of high accuracy (and unreasonably high cost). We present steady state solutions of the CHC equation in two and three dimensions. The effect of the thermal fluctuation on the spinodal decomposition process is studied. We show that the existence of the thermal fluctuation accelerates the spinodal decomposition process and that the final steady morphology is sensitive to the stochastic noise. We also show the evolution of the energies and statistical moments. In terms of the parallel performance, it is found that the implicit domain decomposition approach scales well on supercomputers with a large number of processors.
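The fully implicit time step can be illustrated in one dimension with SciPy's Jacobian-free Newton-Krylov solver standing in for the parallel Newton-Krylov-Schwarz algorithm. This is a deterministic sketch: the stochastic noise term, the adaptive time stepping, and the domain decomposition are all omitted, and the grid spacing is taken as 1:

```python
import numpy as np
from scipy.optimize import newton_krylov

def lap(u):
    """Periodic 1-D Laplacian with unit grid spacing."""
    return np.roll(u, 1) - 2 * u + np.roll(u, -1)

def backward_euler_residual(u_new, u_old, dt, eps2):
    """Residual of one fully implicit step of the deterministic
    Cahn-Hilliard equation  u_t = lap(u^3 - u - eps2 * lap(u)),
    i.e. the CHC equation with the Cook noise term dropped."""
    mu = u_new**3 - u_new - eps2 * lap(u_new)   # chemical potential
    return u_new - u_old - dt * lap(mu)

rng = np.random.default_rng(0)
u = 0.1 * rng.standard_normal(64)   # small random initial perturbation
dt, eps2 = 0.1, 1.0

# One implicit step solved matrix-free with a Newton-Krylov method
u_next = newton_krylov(lambda v: backward_euler_residual(v, u, dt, eps2), u)
print(np.max(np.abs(backward_euler_residual(u_next, u, dt, eps2))) < 1e-4)
```

Adding the Cook noise amounts to an extra stochastic forcing term in the residual; the nonlinear solve at each step is unchanged.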
When good times go bad: managing ‘legal high’ complications in the emergency department
Caffrey, Charles R; Lank, Patrick M
2018-01-01
Patients can use numerous drugs that fall outside existing regulatory statutes in order to get “legal highs.” Legal psychoactive substances represent a challenge to the emergency medicine physician due to the sheer number of available agents, their multiple toxidromes and presentations, their evasion of traditional methods of analysis, and the reluctance of patients to divulge their use of these agents. This paper endeavors to cover a wide variety of “legal highs,” or uncontrolled psychoactive substances that may have abuse potential and may result in serious toxicity. These agents include not only some novel psychoactive substances, aka “designer drugs,” but also a wide variety of over-the-counter medications, herbal supplements, and even a household culinary spice. The care of patients in the emergency department who have used “legal high” substances is challenging. Patients may misunderstand the substance they have been exposed to, there are rarely any readily available laboratory confirmatory tests for these substances, and the exact substances being abused may change on a near-daily basis. This review will attempt to group legal agents into expected toxidromes and discuss associated common clinical manifestations and management. A focus on aggressive symptom-based supportive care, as well as management of end-organ dysfunction, is the mainstay of treatment for these patients in the emergency department. PMID:29302196
Hansen, Richard A; Droege, Marcus
2005-06-01
Numerous studies have focused on the impact of direct-to-consumer (DTC) prescription drug advertising on consumer behavior and health outcomes. These studies have used various approaches to assess exposure to prescription drug advertising and to measure the subsequent effects of such advertisements. The objectives of this article are to (1) discuss measurement challenges involved in DTC advertising research, (2) summarize measurement approaches commonly identified in the literature, and (3) discuss contamination, time to action, and endogeneity as specific problems in measurement design and application. We conducted a review of the professional literature to identify illustrative approaches to advertising measurement. Specifically, our review of the literature focused on measurement of DTC advertising exposure and effect. We used the hierarchy-of-effects model to guide our discussion of processing and communication effects. Other effects were characterized as target audience action, sales, market share, and profit. Overall, existing studies have used a variety of approaches to measure advertising exposure and effect, yet the ability of measures to produce a valid and reliable understanding of the effects of DTC advertising can be improved. Our review provides a framework for conceptualizing DTC measurement, and can be used to identify gaps in the literature not sufficiently addressed by existing measures. Researchers should continue to explore correlations between exposure and effect of DTC advertising, but are obliged to improve and validate measurement in this area.
The laterality effect: myth or truth?
Cohen Kadosh, Roi
2008-03-01
Tzelgov and colleagues [Tzelgov, J., Meyer, J., and Henik, A. (1992). Automatic and intentional processing of numerical information. Journal of Experimental Psychology: Learning, Memory and Cognition, 18, 166-179.] proposed the existence of the laterality effect as a post-hoc explanation for their results. According to this effect, numbers are classified automatically as small/large relative to a standard point under autonomous processing of numerical information. However, the genuineness of the laterality effect was never examined, or was confounded with the numerical distance effect. In the current study, I controlled for the numerical distance effect and observed that the laterality effect does exist and affects automatic processing of numerical information. The current results suggest that the laterality effect should be taken into account when using paradigms that require automatic numerical processing, such as Stroop-like or priming tasks.
Numbers and space: associations and dissociations.
Nathan, Merav Ben; Shaki, Samuel; Salti, Moti; Algom, Daniel
2009-06-01
A cornerstone of contemporary research in numerical cognition is the surprising link found between numbers and space. In particular, people react faster and more accurately to small numbers with a left-hand key and to large numbers with a right-hand key. Because this contingency is found in a variety of tasks, it has been taken to support the automatic activation of magnitude as well as the notion of a mental number line arranged from left to right. The present study challenges the presence of a link between left-right location, on the one hand, and small-large number, on the other hand. We show that a link exists between space and relative magnitude, a relationship that might or might not be unique to numbers.
OPTIMASS: a package for the minimization of kinematic mass functions with constraints
NASA Astrophysics Data System (ADS)
Cho, Won Sang; Gainer, James S.; Kim, Doojin; Lim, Sung Hak; Matchev, Konstantin T.; Moortgat, Filip; Pape, Luc; Park, Myeonghun
2016-01-01
Reconstructed mass variables, such as M2, M2C, MT*, and MT2W, play an essential role in searches for new physics at hadron colliders. The calculation of these variables generally involves constrained minimization in a large parameter space, which is numerically challenging. We provide a C++ code, Optimass, which interfaces with the Minuit library to perform this constrained minimization using the Augmented Lagrangian Method. The code can be applied to arbitrarily general event topologies, thus allowing the user to significantly extend the existing set of kinematic variables. We describe this code, explain its physics motivation, and demonstrate its use in the analysis of the fully leptonic decay of pair-produced top quarks using M2 variables.
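The Augmented Lagrangian Method used by Optimass can be illustrated with a minimal, generic sketch: a toy quadratic objective with one equality constraint, solved with SciPy. This is not the Optimass C++ implementation; all names and parameter choices below are our own.

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem: minimize f(x) = x0^2 + x1^2 subject to g(x) = x0 + x1 - 1 = 0.
def f(x):
    return x[0] ** 2 + x[1] ** 2

def g(x):
    return x[0] + x[1] - 1.0

def augmented_lagrangian(x0, lam=0.0, mu=10.0, iters=20):
    """Solve a sequence of unconstrained problems
    L(x) = f(x) + lam*g(x) + (mu/2)*g(x)^2, updating lam <- lam + mu*g(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        obj = lambda y: f(y) + lam * g(y) + 0.5 * mu * g(y) ** 2
        x = minimize(obj, x).x   # inner unconstrained minimization (BFGS)
        lam = lam + mu * g(x)    # multiplier update
    return x

x_opt = augmented_lagrangian([0.0, 0.0])   # approaches (0.5, 0.5)
```

The multiplier update drives the constraint violation to zero geometrically, so a modest penalty mu suffices without the ill-conditioning of a pure penalty method.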
Population-based imaging biobanks as source of big data.
Gatidis, Sergios; Heber, Sophia D; Storz, Corinna; Bamberg, Fabian
2017-06-01
Advances of computational sciences over the last decades have enabled the introduction of novel methodological approaches in biomedical research. Acquiring extensive and comprehensive data about a research subject and subsequently extracting significant information has opened new possibilities in gaining insight into biological and medical processes. This so-called big data approach has recently found entrance into medical imaging and numerous epidemiological studies have been implementing advanced imaging to identify imaging biomarkers that provide information about physiological processes, including normal development and aging but also on the development of pathological disease states. The purpose of this article is to present existing epidemiological imaging studies and to discuss opportunities, methodological and organizational aspects, and challenges that population imaging poses to the field of big data research.
The Disposal of Spacecraft and Launch Vehicle Stages in Low Earth Orbit
NASA Technical Reports Server (NTRS)
Johnson, Nicholas L.
2007-01-01
Spacecraft and launch vehicle stages abandoned in Earth orbit have historically been a primary source of debris from accidental explosions. In the future, such satellites will become the principal cause of orbital debris via inadvertent collisions. To curtail both the near-term and far-term risks posed by derelict spacecraft and launch vehicle stages to operational space systems, numerous national and international orbital debris mitigation guidelines specifically recommend actions which could prevent or limit such future debris generation. Although considerable progress has been made in implementing these recommendations, some changes to existing vehicle designs can be difficult. Moreover, the nature of some missions also can present technological and budgetary challenges to be compliant with widely accepted orbital debris mitigation measures.
Numerical computation of orbits and rigorous verification of existence of snapback repellers.
Peng, Chen-Chang
2007-03-01
In this paper we show how analysis from numerical computation of orbits can be applied to prove the existence of snapback repellers in discrete dynamical systems. That is, we present a computer-assisted method to prove the existence of a snapback repeller of a specific map. The existence of a snapback repeller of a dynamical system implies that it has chaotic behavior [F. R. Marotto, J. Math. Anal. Appl. 63, 199 (1978)]. The method is applied to the logistic map and the discrete predator-prey system.
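A minimal numerical sketch of the ingredients involved (not the paper's computer-assisted proof, which requires rigorous error bounds): for the logistic map at r = 4, the fixed point p = 3/4 is repelling, and a finite backward orbit landing exactly on p can be computed from preimages.

```python
import math

def f(x, r=4.0):
    return r * x * (1.0 - x)

def df(x, r=4.0):
    return r * (1.0 - 2.0 * x)

p = 0.75                      # fixed point of f at r = 4
assert abs(f(p) - p) < 1e-15  # f(3/4) = 3/4
assert abs(df(p)) > 1.0       # |f'(p)| = 2, so p is repelling

# Backward orbit: solve r*x*(1 - x) = y on the branch away from p.
def preimage(y, r=4.0):
    return 0.5 * (1.0 - math.sqrt(1.0 - 4.0 * y / r))

x1 = preimage(p)    # 0.25: reaches p in one step
x2 = preimage(x1)   # ~0.0670: reaches p in two steps, well outside a neighborhood of p
assert df(x1) != 0.0 and df(x2) != 0.0   # nonvanishing derivative along the orbit
```

A snapback repeller additionally requires such an orbit to start inside the repelling neighborhood of p; a rigorous proof replaces the floating-point arithmetic above with verified interval computations.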
Colon-Gonzalez, Maria C; El Rayess, Fadya; Guevara, Sara; Anandarajah, Gowri
2015-01-01
Central American countries, like many others, face a shortage of rural health physicians. Most medical schools in this region are located in urban areas and focus on tertiary care training rather than on community health or primary care, which are better suited for rural practice. However, many countries require young physicians to do community service in rural communities to address healthcare provider shortages. This study aimed to: (a) synthesize what is known about the current state of medical education preparing physicians for rural practice in this region, and (b) identify common needs, challenges and opportunities for improving medical education in this area. A comprehensive literature review was conducted between December 2013 and May 2014. The stepwise, reproducible search process included English and Spanish language resources from both data-based web search engines (PubMed, Web of Science/Web of Knowledge, ERIC and Google Scholar) and the grey literature. Search criteria included MeSH terms: 'medical education', 'rural health', 'primary care', 'community medicine', 'social service', in conjunction with 'Central America', 'Latin America', 'Mexico', 'Guatemala', 'Belize', 'El Salvador', 'Nicaragua', 'Honduras', 'Costa Rica' and 'Panama'. Articles were included in the review if they (1) were published after 1984; (2) focused on medical education for rural health, primary care, community health; and (3) involved the countries of interest. A narrative synthesis of the content of resources meeting inclusion criteria was done using qualitative research methods to identify common themes pertaining to the study goals. The search revealed 20 resources that met inclusion criteria. Only four of the 20 were research articles; therefore, information about this subject was primarily derived from expert opinion. 
Thematic analysis revealed the historical existence of several innovative programs that directly address rural medicine training needs, suggesting that expertise is present in this region. However, numerous challenges limit sustainability or expansion of successful programs. Common challenges include: (a) physicians' exposure to rural medicine primarily takes place during social service commitment time, rather than during formal medical training; (b) innovative educational programs are often not sustainable due to financial and leadership challenges; (c) the majority of physician manpower is in urban areas, resulting in few rural physician role models and teachers; and (d) there is insufficient collaboration to establish clinical and educational systems to meet rural health needs. Recurring suggestions for curricular changes include: (a) making primary care training a core component of medical school education; and (b) expanding medical school curricula in cross-cultural communication and social determinants of disease. Suggestions for health system changes include: (a) improving living and working conditions for rural physicians; and (b) establishing partnerships between educational, governmental and non-governmental organizations and rural community leadership, to promote rural health training and systems. Expertise in rural medicine and training exists in continental Central America. However, there are numerous challenges to improving medical education to meet the needs of rural communities. Overcoming these challenges will require creative solutions, new partnerships, and evaluation and dissemination of successful educational programs. There is a great need for further research on this topic.
Incorporating neurophysiological concepts in mathematical thermoregulation models
NASA Astrophysics Data System (ADS)
Kingma, Boris R. M.; Vosselman, M. J.; Frijns, A. J. H.; van Steenhoven, A. A.; van Marken Lichtenbelt, W. D.
2014-01-01
Skin blood flow (SBF) is a key player in human thermoregulation during mild thermal challenges. Various numerical models of SBF regulation exist. However, none explicitly incorporates the neurophysiology of thermal reception. This study tested a new SBF model that is in line with experimental data on thermal reception and the neurophysiological pathways involved in thermoregulatory SBF control. Additionally, a numerical thermoregulation model was used as a platform to test the function of the neurophysiological SBF model for skin temperature simulation. The prediction error of the SBF model was quantified by the root-mean-squared residual (RMSR) between simulations and experimental measurement data. Measurement data consisted of SBF (abdomen, forearm, hand) and core and skin temperature recordings of young males during three transient thermal challenges (one for development and two for validation). Additionally, ThermoSEM, a thermoregulation model, was used to simulate body temperatures using the new neurophysiological SBF model. The RMSR between simulated and measured mean skin temperature was used to validate the model. The neurophysiological model predicted SBF with an accuracy of RMSR < 0.27. Skin temperature simulation results were within 0.37 °C of the measured mean skin temperature. This study shows that (1) thermal reception and the neurophysiological pathways involved in thermoregulatory SBF control can be captured in a mathematical model, and (2) human thermoregulation models can be equipped with SBF control functions that are based on neurophysiology without loss of performance. The neurophysiological approach to modelling thermoregulation is preferable to engineering approaches because it is more in line with the underlying physiology.
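The RMSR validation metric used above is straightforward to compute; a minimal sketch follows (the function name is illustrative, not part of ThermoSEM).

```python
import math

def rmsr(simulated, measured):
    """Root-mean-squared residual between two equal-length series."""
    assert len(simulated) == len(measured)
    sq = [(s - m) ** 2 for s, m in zip(simulated, measured)]
    return math.sqrt(sum(sq) / len(sq))

# e.g. a simulated mean skin temperature trace vs. measurements (deg C)
err = rmsr([33.1, 33.4, 33.2], [33.0, 33.5, 33.6])
```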
Laser heating challenges of high yield MagLIF targets
NASA Astrophysics Data System (ADS)
Slutz, Stephen; Sefkow, Adam; Vesey, Roger
2014-10-01
The MagLIF (Magnetized Liner Inertial Fusion) concept is predicted by numerical simulation to produce fusion yields of about 100 kJ when driven by 25 MA from the existing Z accelerator [S. A. Slutz et al., Phys. Plasmas 17, 056303 (2010)], and much higher yields with future accelerators delivering higher currents [Slutz and Vesey, PRL 108, 025003 (2012)]. The fuel must be heated before compression to obtain significant fusion yields due to the relatively slow implosion velocities (~100 km/s) of magnetically driven liners. Lasers provide a convenient means to accomplish this pre-compression heating of the fusion fuel, but there are challenges. The laser must penetrate a foil covering the laser entrance hole and deposit 20-30 kJ within the ~1 cm length of the liner in fuel at 6-12 mg/cc. Such high densities could result in beam scattering due to refraction and laser-plasma interactions. Numerical simulations of the laser heating process are presented, which indicate that energies as high as 30 kJ could be deposited in the fuel by using two laser pulses of different wavelengths. Simulations of this process will be presented, as well as results for a MagLIF design for a potential new machine delivering 50 MA of current. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000.
The Role of Computer Simulation in Nanoporous Metals—A Review
Xia, Re; Wu, Run Ni; Liu, Yi Lun; Sun, Xiao Yu
2015-01-01
Nanoporous metals (NPMs) have proven to be versatile candidates for diverse applications. In this decade, interest has grown in the fabrication, characterization and applications of these intriguing materials. Most existing reviews focus on experimental and theoretical work rather than numerical simulation. Yet, alongside the numerous experiments and theoretical analyses, studies based on computer simulation, which can model complex microstructures in more realistic ways, play a key role in understanding and predicting the behaviors of NPMs. In this review, we present a comprehensive overview of computer simulations of NPMs that are prepared through chemical dealloying. Firstly, we summarize the various simulation approaches to the preparation, processing, and basic physical and chemical properties of NPMs. In this part, emphasis is placed on works involving dealloying, coarsening and mechanical properties. Then, we conclude with the latest progress as well as the future challenges in simulation studies. We believe that highlighting the importance of simulations will help to better understand the properties of novel materials and help with new scientific research on these materials. PMID:28793491
Constrained orbital intercept-evasion
NASA Astrophysics Data System (ADS)
Zatezalo, Aleksandar; Stipanovic, Dusan M.; Mehra, Raman K.; Pham, Khanh
2014-06-01
An effective characterization of intercept-evasion confrontations in various space environments, and the derivation of corresponding solutions under a variety of real-world constraints, are daunting theoretical and practical challenges. Current and future space-based platforms have to operate as components of satellite formations and/or systems while at the same time retaining a capability to evade potential collisions with other maneuver-constrained space objects. In this article, we formulate and numerically approximate solutions of a Low Earth Orbit (LEO) intercept-maneuver problem in terms of game-theoretic capture-evasion guaranteed strategies. The space intercept-evasion approach is based on Lyapunov methodology that has been successfully implemented in a number of air- and ground-based multi-player, multi-goal game/control applications. The corresponding numerical algorithms are derived using computationally efficient, orbital-propagator-independent methods previously developed for Space Situational Awareness (SSA). This game-theoretic yet robust and practical approach is demonstrated on a realistic LEO scenario using existing Two Line Element (TLE) sets and the Simplified General Perturbation-4 (SGP-4) propagator.
Local Orthogonal Cutting Method for Computing Medial Curves and Its Biomedical Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiao, Xiangmin; Einstein, Daniel R.; Dyedov, Volodymyr
2010-03-24
Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer-assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method for computing them. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques including eigenvalue analysis, weighted least squares approximations, and numerical minimization, resulting in an algorithm that is efficient and noise resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods.
SERVER DEVELOPMENT FOR NSLS-II PHYSICS APPLICATIONS AND PERFORMANCE ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen, G.; Kraimer, M.
2011-03-28
The beam commissioning software framework of the NSLS-II project adopts a client/server-based architecture to replace the more traditional monolithic high-level application approach. The server software under development is available via an open source SourceForge project named epics-pvdata, which consists of the modules pvData, pvAccess, pvIOC, and pvService. Examples of two services that already exist in the pvService module are itemFinder and gather. Each service uses pvData to store in-memory transient data, pvAccess to transfer data over the network, and pvIOC as the service engine. Performance benchmarking for pvAccess and for both the gather service and the item finder service is presented in this paper. A performance comparison between pvAccess and Channel Access is also presented. For an ultra-low-emittance synchrotron radiation light source like NSLS-II, the control system requirements, especially for beam control, are tight. To control and manipulate the beam effectively, a use case study and a theoretical evaluation have been performed. The analysis shows that model-based control is indispensable for beam commissioning and routine operation. However, there are many challenges, such as how to re-use a design model for on-line model-based control, and how to combine numerical methods for modeling a realistic lattice with analytical techniques for analysis of its properties. To satisfy these requirements and challenges, an adequate system architecture for the software framework for beam commissioning and operation is critical. The existing traditional approaches are self-consistent and monolithic. Some of them have adopted the concept of a middle layer to separate low-level hardware processing from numerical algorithm computing, physics modelling, data manipulation and plotting, and error handling. However, none of the existing approaches can satisfy the requirements.
A new design has been proposed by introducing service-oriented architecture technology, and the client interface is under development. The design and implementation adopt a new EPICS implementation, namely epics-pvdata [9], which is under active development. The Java implementation of this project is close to stable, and bindings to other languages such as C++ and/or Python are under development. In this paper, we focus on performance benchmarking and comparison of pvAccess and Channel Access, and on the performance evaluation of two services, gather and item finder, respectively.
NASA Astrophysics Data System (ADS)
Biggin, A. J.; Suttie, N.; Paterson, G. A.; Aubert, J.; Hurst, E.; Clarke, A.
2013-12-01
On timescales over which mantle convection may be affecting the geodynamo (10-100s of millions of years), magnetic reversal frequency is the best documented aspect of geomagnetic behaviour. However, suitable continuous recorders of this parameter become very sparse before a few hundred million years ago, presenting a major challenge to documenting and understanding geomagnetic variations on the timescale of even the most recent supercontinent cycle. It is hypothetically possible to measure the absolute geomagnetic palaeointensity from any geological material that has cooled from above the Curie temperature of its constituent magnetic remanence carriers. Since igneous rocks are abundant in the geological record, estimates of dipole moment from these present a vital resource for documenting geomagnetic variations into deep time. In practice, a host of practical problems makes obtaining such measurements reliably from geological materials challenging. Nevertheless, the absolute palaeointensity database PINT, newly linked to the comprehensive Magnetics Information Consortium (MagIC) database, already contains 3,941 published dipole moment estimates from rocks older than 50,000 years and continues to grow rapidly. In order that even the existing record may be used to maximum effectiveness in characterising geomagnetic behaviour, two challenges must be met: (1) the variable reliability of individual measurements must be reasonably assessed; and (2) the impact of the inhomogeneous distribution of dipole moment estimates in space and time must be ascertained. Here, we will report efforts to address these two challenges using novel approaches. A new set of quality criteria for palaeointensity data (QPI) has been developed and tested by application to studies recently incorporated into PINT. To address challenge 1, we propose that every published dipole moment estimate eventually be given a QPI score indicating the number of these criteria fulfilled.
To begin to address challenge 2, we take an approach using the outputs of numerical dynamo simulations. This involves subsampling synthetic global time series of full-vector magnetic field data, converting these datasets into virtual (axial) dipole moments, and comparing these to the entire distribution to ascertain how well secular variation is averaged and characterised. Finally, the two approaches will be combined. Datasets of real dipole moment estimates, filtered by QPI, will be compared to the synthetic distributions in order to present more robust characterisations of geomagnetic behaviour in different time intervals than has previously been possible.
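Converting palaeointensity data into virtual (axial) dipole moments, as described above, uses the standard geocentric axial dipole relation B = (mu0*m / 4*pi*a^3) * sqrt(1 + 3 sin^2(lat)); the sketch below assumes that relation and is purely illustrative of the conversion step.

```python
import math

MU0 = 4e-7 * math.pi   # vacuum permeability (T m / A)
A_EARTH = 6.371e6      # mean Earth radius (m)

def vadm(intensity_T, lat_deg):
    """Virtual axial dipole moment (A m^2) from field intensity at a latitude,
    assuming a geocentric axial dipole: B = (mu0*m / 4*pi*a^3) * sqrt(1 + 3 sin^2(lat))."""
    s = math.sin(math.radians(lat_deg))
    return 4.0 * math.pi * A_EARTH ** 3 * intensity_T / (
        MU0 * math.sqrt(1.0 + 3.0 * s * s))

m_eq = vadm(30e-6, 0.0)   # a ~30 uT equatorial field, close to the present-day dipole moment
```

Because sqrt(1 + 3 sin^2(lat)) equals 2 at the poles, the same intensity measured at a pole yields exactly half the equatorial VADM.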
A case study of global health at the university: implications for research and action.
Pinto, Andrew D; Cole, Donald C; ter Kuile, Aleida; Forman, Lisa; Rouleau, Katherine; Philpott, Jane; Pakes, Barry; Jackson, Suzanne; Muntaner, Carles
2014-01-01
Global health is increasingly a major focus of institutions in high-income countries. However, little work has been done to date to study the inner workings of global health at the university level. Academics may have competing objectives, with few mechanisms to coordinate efforts and pool resources. To conduct a case study of global health at Canada's largest health sciences university and to examine how its internal organization influences research and action. We drew on existing inventories, annual reports, and websites to create an institutional map, identifying centers and departments using the terms 'global health' or 'international health' to describe their activities. We compiled a list of academics who self-identified as working in global or international health. We purposively sampled persons in leadership positions as key informants. One investigator carried out confidential, semi-structured interviews with 20 key informants. Interview notes were returned to participants for verification and then analyzed thematically by pairs of coders. Synthesis was conducted jointly. More than 100 academics were identified as working in global health, situated in numerous institutions, centers, and departments. Global health academics interviewed shared a common sense of what global health means and the values that underpin such work. Most academics interviewed expressed frustration at the existing fragmentation and the lack of strategic direction, financial support, and recognition from the university. This hampered collaborative work and projects to tackle global health problems. The University of Toronto is not exceptional in facing such challenges, and our findings align with existing literature that describes factors that inhibit collaboration in global health work at universities. Global health academics based at universities may work in institutional siloes and this limits both internal and external collaboration. 
A number of solutions to address these challenges are proposed.
NASA Astrophysics Data System (ADS)
Picot-Colbeaux, Géraldine; Devau, Nicolas; Thiéry, Dominique; Pettenati, Marie; Surdyk, Nicolas; Parmentier, Marc; Amraoui, Nadia; Crastes de Paulet, François; André, Laurent
2016-04-01
The Chalk aquifer is the main water resource for domestic water supply in many parts of northern France. In some basins, groundwater is frequently affected by nitrate quality problems. Often close to or above drinking water standards, nitrate concentrations in groundwater are mainly due to historical agricultural practices, combined with leakage and aquifer recharge through the vadose zone. The complexity of the processes occurring in such an environment means that extensive knowledge of agronomy, geochemistry and hydrogeology must be taken into account in order to understand, model and predict the spatiotemporal evolution of nitrate content and to provide a decision support tool for water producers and stakeholders. To meet this challenge, conceptual and numerical models that accurately represent the specificity of the Chalk aquifer need to be developed. A multidisciplinary approach is developed to simulate storage and transport from the ground surface to groundwater. This involves a new agronomic module, "NITRATE" (NItrogen TRansfer for Arable soil to groundwaTEr), a soil-crop model that calculates the nitrogen mass balance in arable soil, and the "PHREEQC" numerical code for geochemical calculations, both coupled with the 3D transient groundwater numerical code "MARTHE". In addition, new developments in the MARTHE code allow the use of the dual porosity and permeability calculations needed in the fissured Chalk aquifer context. Integrating these existing multidisciplinary tools is a real challenge: the number of parameters must be reduced by selecting the relevant equations and simplifying them without altering the signal. The robustness and validity of these numerical developments are tested step by step with several simulations constrained by climate forcing, land use and nitrogen inputs over several decades.
First, simulations are performed in a 1D vertical unsaturated soil column to represent experimental vertical nitrate soil profiles (0-30 m depth measurements in the Somme region). Second, this approach is used with a 3D model of a drinking water catchment area to compare calculated and measured nitrate content time series at the domestic water pumping well since 1995 (field site in northern France, Avre Basin region). This numerical tool will help decision-making in all activities related to water use.
Implicitly solving phase appearance and disappearance problems using two-fluid six-equation model
Zou, Ling; Zhao, Haihua; Zhang, Hongbin
2016-01-25
Phase appearance and disappearance issues present serious numerical challenges in two-phase flow simulations using the two-fluid six-equation model. Numerical challenges arise from the singular equation system when one phase is absent, as well as from the discontinuity in the solution space when one phase appears or disappears. In this work, a high-resolution spatial discretization scheme on staggered grids and fully implicit methods were applied for the simulation of two-phase flow problems using the two-fluid six-equation model. A Jacobian-free Newton-Krylov (JFNK) method was used to solve the discretized nonlinear problem. An improved numerical treatment was proposed and proved to be effective in handling the numerical challenges. The treatment scheme is conceptually simple, easy to implement, and does not require explicit truncations on solutions, which is essential to conserve mass and energy. Various types of phase appearance and disappearance problems relevant to thermal-hydraulics analysis have been investigated, including a sedimentation problem, an oscillating manometer problem, a non-condensable gas injection problem, a single-phase flow with heat addition problem, and a subcooled flow boiling problem. Successful simulations of these problems demonstrate the capability and robustness of the proposed numerical methods and numerical treatments. As a result, the volume fraction of the absent phase can be calculated effectively as zero.
NASA Astrophysics Data System (ADS)
Lange, Jacob; O'Shaughnessy, Richard; Healy, James; Lousto, Carlos; Shoemaker, Deirdre; Lovelace, Geoffrey; Scheel, Mark; Ossokine, Serguei
2016-03-01
In this talk, we describe a procedure to reconstruct the parameters of sufficiently massive coalescing compact binaries via direct comparison with numerical relativity simulations. For sufficiently massive sources, existing numerical relativity simulations are long enough to cover the observationally accessible part of the signal. Due to the signal's brevity, the posterior parameter distribution it implies is broad, simple, and easily reconstructed from information gained by comparing to only the sparse sample of existing numerical relativity simulations. We describe how followup simulations can corroborate and improve our understanding of a detected source. Since our method can include all physics provided by full numerical relativity simulations of coalescing binaries, it provides a valuable complement to alternative techniques which employ approximations to reconstruct source parameters. Supported by NSF Grant PHY-1505629.
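The comparison against a sparse bank of simulations can be sketched as a discrete likelihood evaluation over templates. The toy below (one parameter, a synthetic damped sinusoid standing in for a waveform, Gaussian noise) only illustrates the idea, not the actual gravitational-wave analysis.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 200)

def template(m):
    """Hypothetical one-parameter signal model standing in for an NR waveform."""
    return np.sin(2.0 * np.pi * m * t) * np.exp(-3.0 * t)

m_true, sigma = 5.0, 0.2
data = template(m_true) + sigma * rng.normal(size=t.size)

# Evaluate a Gaussian log-likelihood only at the sparse "simulation" bank,
# then normalize to a discrete posterior over the bank.
bank = np.linspace(3.0, 7.0, 17)
logL = np.array([-0.5 * np.sum((data - template(m)) ** 2) / sigma ** 2
                 for m in bank])
post = np.exp(logL - logL.max())
post /= post.sum()
m_map = bank[np.argmax(post)]   # posterior peak lands near m_true
```

For short, massive signals the posterior is broad, so even this coarse grid of simulations brackets the peak; followup simulations would refine the grid near the maximum.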
The BACnet Campus Challenge - Part 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masica, Ken; Tom, Steve
Here, the BACnet protocol was designed to achieve interoperability among building automation vendors and to evolve over time to include new functionality as well as support new communication technologies, such as the Ethernet and IP protocols, as they became prevalent and economical in the marketplace. For large multi-building, multi-vendor campus environments, standardizing on the BACnet protocol as an implementation strategy can be a key component in meeting the challenge of an interoperable, flexible, and scalable building automation system. The interoperability of BACnet is especially important when large campuses with legacy equipment have DDC upgrades to facilities performed over different time frames and use different contractors that install equipment from different vendors under the guidance of different campus HVAC project managers. In these circumstances, BACnet can serve as a common foundation for interoperability when potential variability exists in approaches to the design-build process by numerous parties over time. Likewise, BACnet support for a range of networking protocols and technologies can be a key strategy for achieving flexible and scalable automation systems as campuses and enterprises expand networking infrastructures using standard interoperable protocols like IP and Ethernet.
Accurate thermoelastic tensor and acoustic velocities of NaCl
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marcondes, Michel L., E-mail: michel@if.usp.br; Chemical Engineering and Material Science, University of Minnesota, Minneapolis, 55455; Shukla, Gaurav, E-mail: shukla@physics.umn.edu
Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures is still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, the approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamic conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using a highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.
Approach to design neural cryptography: a generalized architecture and a heuristic rule.
Mu, Nankun; Liao, Xiaofeng; Huang, Tingwen
2013-06-01
Neural cryptography, a type of public key exchange protocol, is widely considered an effective method for sharing a common secret key between two neural networks over public channels. How to design neural cryptography remains a great challenge. In this paper, in order to provide an approach to this challenge, a generalized network architecture and a significant heuristic rule are designed. The proposed generic framework is named the tree state classification machine (TSCM), which extends and unifies the existing structures, i.e., the tree parity machine (TPM) and the tree committee machine (TCM). Furthermore, we carefully study and find that the heuristic rule can improve the security of TSCM-based neural cryptography. Therefore, TSCM and the heuristic rule can guide us in designing a great number of effective neural cryptography candidates, among which more secure instances can be achieved. Significantly, in light of TSCM and the heuristic rule, we further show that our designed neural cryptography outperforms TPM (the most secure model at present) in security. Finally, a series of numerical simulation experiments is provided to verify the validity and applicability of our results.
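The tree parity machine that TSCM generalizes can be sketched as a mutual-learning loop. The toy below uses a Hebbian update, with parameters K, N, L and the class layout chosen by us for illustration; it shows synchronization of two TPMs, not the paper's TSCM design.

```python
import numpy as np

rng = np.random.default_rng(0)
K, N, L = 3, 10, 3   # hidden units, inputs per unit, weight bound (our choices)

class TPM:
    """Tree parity machine: K perceptrons over disjoint N-component inputs,
    output tau = product of the hidden-unit signs."""
    def __init__(self):
        self.w = rng.integers(-L, L + 1, size=(K, N))

    def output(self, x):
        self.sigma = np.where((self.w * x).sum(axis=1) > 0, 1, -1)
        return int(self.sigma.prod())

    def update(self, x, tau):
        # Hebbian rule: hidden units agreeing with tau move toward the input.
        for k in range(K):
            if self.sigma[k] == tau:
                self.w[k] = np.clip(self.w[k] + tau * x[k], -L, L)

a, b = TPM(), TPM()
steps = 0
while not np.array_equal(a.w, b.w) and steps < 50000:
    x = rng.choice([-1, 1], size=(K, N))   # public random input
    ta, tb = a.output(x), b.output(x)
    if ta == tb:                           # update only when outputs agree
        a.update(x, ta)
        b.update(x, tb)
    steps += 1
# After synchronization, the identical weight matrices serve as the shared key.
```

Mutual learning pulls the two weight sets together because updates are applied only on agreeing outputs; an attacker who can only listen cannot influence either party, which is the basis of the protocol's (heuristic) security.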
NASA Astrophysics Data System (ADS)
Sanghyun, Ahn; Seungwoong, Ha; Kim, Soo Yong
2016-06-01
A vital challenge for many socioeconomic systems is determining the optimum use of limited information. Traffic systems, wherein the range of resources is limited, are a particularly good example of this challenge. Motivated by bounded information accessibility arising from, for example, high costs or technical limitations, we develop a new optimization strategy to improve the efficiency of a traffic system with signals and intersections. Numerous studies, including the study by Chowdhury and Schadschneider (whose method we denote by ChSch), have attempted to achieve the maximum vehicle speed or the minimum wait time for a given traffic condition. In this paper, we introduce a modified version of ChSch with an independently functioning, decentralized control system. With the new model, we determine the optimization strategy under bounded information accessibility, which proves the existence of an optimal point for phase transitions in the system. The paper also provides insight that traffic engineers can apply to create more efficient traffic systems by analyzing the area and symmetry of local sites. We support our results with a statistical analysis using empirical traffic data from Seoul, Korea.
Lessons Learned from Monitoring Drought in Data Sparse Regions in the United States
NASA Astrophysics Data System (ADS)
Edwards, L. M.; Redmond, K. T.
2011-12-01
Drought monitoring in the geographic domain represented by the Western Regional Climate Center (WRCC) in the United States can serve as an example of many of the challenges that face a global drought early warning system (GDEWS). The WRCC area includes numerous climate regions, such as the Pacific coast of the continental U.S., the lowest elevation in North America, arid and alpine environments, temperate rainforest, Alaska, Hawaii, and the Pacific territories of the U.S. in the tropics. This area is quite diverse in its climatological regimes, from rainforest to high desert to tundra, and covers a large area of land and water. Drought in the WRCC domain affects a wide range of constituents and interests, and the complex interplay between "human-caused" and natural drought cannot be overstated. Data to support a GDEWS, as in the WRCC region, are often nonexistent or unreliable in remote locations. Even in the continental U.S., data are not as dense as the topography and climate zones demand for accurate drought assessment. Challenges and efforts to address drought monitoring at the WRCC will be presented.
The BACnet Campus Challenge - Part 1
Masica, Ken; Tom, Steve
2015-12-01
Here, the BACnet protocol was designed to achieve interoperability among building automation vendors and to evolve over time to include new functionality as well as support new communication technologies such as the Ethernet and IP protocols as they became prevalent and economical in the marketplace. For large multi-building, multi-vendor campus environments, standardizing on the BACnet protocol as an implementation strategy can be a key component in meeting the challenge of an interoperable, flexible, and scalable building automation system. The interoperability of BACnet is especially important when large campuses with legacy equipment have DDC upgrades to facilities performed over different time frames and use different contractors that install equipment from different vendors under the guidance of different campus HVAC project managers. In these circumstances, BACnet can serve as a common foundation for interoperability when potential variability exists in approaches to the design-build process by numerous parties over time. Likewise, BACnet support for a range of networking protocols and technologies can be a key strategy for achieving flexible and scalable automation systems as campuses and enterprises expand networking infrastructures using standard interoperable protocols like IP and Ethernet.
Use of Green's functions in the numerical solution of two-point boundary value problems
NASA Technical Reports Server (NTRS)
Gallaher, L. J.; Perlin, I. E.
1974-01-01
This study investigates the use of Green's functions in the numerical solution of the two-point boundary value problem. The first part deals with the role of the Green's function in solving both linear and nonlinear second order ordinary differential equations with boundary conditions, as well as systems of such equations. The second part describes procedures for the numerical construction of Green's functions and considers briefly the conditions for their existence. Finally, there is a description of some numerical experiments using nonlinear problems for which the known existence, uniqueness, or convergence theorems do not apply. Examples here include some problems in finding rendezvous orbits of the restricted three-body system.
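For a linear model problem the Green's-function approach reduces to a quadrature. The sketch below solves -u''(x) = f(x) with u(0) = u(1) = 0, whose Green's function is known in closed form; the grid size and test problem are illustrative, and the paper's procedures for constructing G numerically are more general:

```python
import numpy as np

def green(x, s):
    # G(x, s) = x(1 - s) for x <= s, and s(1 - x) for x >= s
    return np.where(x <= s, x * (1.0 - s), s * (1.0 - x))

def trapezoid(y, x):
    # simple trapezoidal quadrature
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def solve_bvp(f, n=1001):
    # u(x) = integral_0^1 G(x, s) f(s) ds, evaluated on a uniform grid
    s = np.linspace(0.0, 1.0, n)
    fs = f(s)
    u = np.array([trapezoid(green(x, s) * fs, s) for x in s])
    return s, u

# Check against the exact solution u(x) = sin(pi x) for f(s) = pi^2 sin(pi s)
x, u = solve_bvp(lambda s: np.pi ** 2 * np.sin(np.pi * s))
err = float(np.max(np.abs(u - np.sin(np.pi * x))))
```

The kink of G at s = x falls on a grid node, so the composite trapezoidal rule retains its second-order accuracy here.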
The Spatial-Numerical Congruity Effect in Preschoolers
ERIC Educational Resources Information Center
Patro, Katarzyna; Haman, Maciej
2012-01-01
Number-to-space mapping and its directionality are compelling topics in the study of numerical cognition. Usually, literacy and math education are thought to shape a left-to-right number line. We challenged this claim by analyzing performance of preliterate precounting preschoolers in a spatial-numerical task. In our experiment, children exhibited…
Numerical simulation of synthesis gas incineration
NASA Astrophysics Data System (ADS)
Kazakov, A. V.; Khaustov, S. A.; Tabakaev, R. B.; Belousova, Y. A.
2016-04-01
The authors analyse the expediency of the suggested application method for low-grade fuels: thermal processing of solid raw materials into a gaseous fuel, called synthesis gas, is investigated. The technical challenges concerning the applicability of existing gas equipment, developed and extensively tested exclusively for natural gas, were considered. For this purpose, computer simulation of three-dimensional syngas-incinerating flame dynamics was performed by means of the ANSYS Multiphysics engineering software. The subjects of study were the three-dimensional aerodynamic flame structure, heat-release and temperature fields, and a set of combustion properties: the flare range and the concentration distribution of burnout reagents. The obtained results are presented in the form of time-averaged pathlines with color indexing and can be used for qualitative and quantitative evaluation of the singularities of complex multicomponent gas incineration.
Proton pump inhibitor-refractory gastroesophageal reflux disease: challenges and solutions
Mermelstein, Joseph; Chait Mermelstein, Alanna; Chait, Maxwell M
2018-01-01
A significant percentage of patients with gastroesophageal reflux disease (GERD) will not respond to proton pump inhibitor (PPI) therapy. The causes of PPI-refractory GERD are numerous and diverse, and include adherence, persistent acid, functional disorders, nonacid reflux, and PPI bioavailability. The evaluation should start with a symptom assessment and may progress to imaging, endoscopy, and monitoring of esophageal pH, impedance, and bilirubin. There are a variety of pharmacologic and procedural interventions that should be selected based on the underlying mechanism of PPI failure. Pharmacologic treatments can include antacids, prokinetics, alginates, bile acid binders, reflux inhibitors, and antidepressants. Procedural options include laparoscopic fundoplication and LINX as well as endoscopic procedures, such as transoral incisionless fundoplication and Stretta. Several alternative and complementary treatments of possible benefit also exist. PMID:29606884
NASA Astrophysics Data System (ADS)
Jay, Caroline; Lunn, Darren; Michailidou, Eleni
As new technologies emerge, and Web sites become increasingly sophisticated, ensuring they remain accessible to disabled and small-screen users is a major challenge. While guidelines and automated evaluation tools are useful for informing some aspects of Web site design, numerous studies have demonstrated that they provide no guarantee that the site is genuinely accessible. The only reliable way to evaluate the accessibility of a site is to study the intended users interacting with it. This chapter outlines the processes that can be used throughout the design life cycle to ensure Web accessibility, describing their strengths and weaknesses, and discussing the practical and ethical considerations that they entail. The chapter also considers an important emerging trend in user evaluations: combining data from studies of “standard” Web use with data describing existing accessibility issues, to drive accessibility solutions forward.
NASA Astrophysics Data System (ADS)
Douma, M.; Ligierko, G.; Angelov, I.
2008-10-01
The need for information has increased exponentially over the past decades. The current systems for constructing, exploring, classifying, organizing, and searching information face the growing challenge of enabling their users to operate efficiently and intuitively in knowledge-heavy environments. This paper presents SpicyNodes, an advanced user interface for difficult interaction contexts. It is based on an underlying structure known as a radial map, which allows users to manipulate and interact in a natural manner with entities called nodes. This technology overcomes certain limitations of existing solutions and solves the problem of browsing complex sets of linked information. SpicyNodes is also an organic system that projects users into a living space, stimulating exploratory behavior and fostering creative thought. Our interactive radial layout is used for educational purposes and has the potential for numerous other applications.
OPTIMASS: A package for the minimization of kinematic mass functions with constraints
Cho, Won Sang; Gainer, James S.; Kim, Doojin; ...
2016-01-07
Reconstructed mass variables, such as M_2, M_2C, M*_T, and M^W_T2, play an essential role in searches for new physics at hadron colliders. The calculation of these variables generally involves constrained minimization in a large parameter space, which is numerically challenging. We provide a C++ code, Optimass, which interfaces with the Minuit library to perform this constrained minimization using the Augmented Lagrangian Method. The code can be applied to arbitrarily general event topologies, thus allowing the user to significantly extend the existing set of kinematic variables. Here, we describe this code, explain its physics motivation, and demonstrate its use in the analysis of the fully leptonic decay of pair-produced top quarks using M_2 variables.
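The Augmented Lagrangian Method that Optimass applies can be illustrated on a toy problem. This is a minimal sketch with an assumed quadratic objective and a single equality constraint, not the paper's kinematic mass functions; the exact minimizer is (0.5, 0.5):

```python
import numpy as np

# Toy problem: minimize f(x, y) = x^2 + y^2 subject to c(x, y) = x + y - 1 = 0.
def f(x):       return x[0] ** 2 + x[1] ** 2
def grad_f(x):  return 2.0 * x
def c(x):       return x[0] + x[1] - 1.0
def grad_c(x):  return np.array([1.0, 1.0])

x = np.zeros(2)
lam, mu = 0.0, 10.0          # multiplier estimate and penalty strength
for outer in range(20):
    # inner unconstrained minimization of the augmented Lagrangian
    # L(x) = f(x) + lam * c(x) + (mu / 2) * c(x)^2, here by gradient descent
    for inner in range(500):
        g = grad_f(x) + (lam + mu * c(x)) * grad_c(x)
        x -= 0.01 * g
    lam += mu * c(x)         # first-order multiplier update
```

Each outer iteration tightens the multiplier estimate, so the constraint violation shrinks geometrically without driving the penalty parameter to infinity; in Optimass the inner minimization is delegated to Minuit rather than plain gradient descent.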
Peptide and protein-based nanotubes for nanobiotechnology.
Petrov, Anna; Audette, Gerald F
2012-01-01
The development of biologically relevant nanosystems such as biomolecular probes and sensors requires systems that effectively interface specific biochemical environments with abiotic architectures. Carbon nanotubes, the most widely studied nanomaterial, have proven challenging to adapt for biomedical applications despite their numerous advantageous physical and electrochemical properties. On the other hand, the development of bionanosystems through adaptation of existing biological systems has several advantages, including their adaptability through modern recombinant DNA strategies. Indeed, the use of peptides, proteins, and protein assemblies as nanotubes, scaffolds, and nanowires has shown much promise as a bottom-up approach to the development of novel bionanosystems. We highlight several unique peptide and protein systems that generate protein nanotubes (PNTs) that are being explored for the development of biosensors, probes, bionanowires, and drug delivery systems. Copyright © 2012 Wiley Periodicals, Inc.
Critical Issues in Evaluating National-Level Health Data Warehouses in LMICs: Kenya Case Study.
Gesicho, Milka B; Babic, Ankica; Were, Martin C
2017-01-01
Low- and middle-income countries (LMICs) are beginning to adopt national health data warehouses (NHDWs) for making strategic decisions and for improving health outcomes. Given the numerous challenges likely to be faced in the establishment of NHDWs by LMICs, it is prudent that evaluations are done in relation to the data warehouses (DWs), in order to identify and mitigate critical issues that arise. When critical issues are not identified, DWs are prone to suboptimal implementation with compromised outcomes. Despite the fact that several publications exist on evaluating DWs, evaluations specific to health data warehouses are scarce, with almost none evaluating NHDWs, particularly in LMICs. This paper uses a systematic approach guided by an evaluation framework to identify critical issues to be considered in evaluating Kenya's NHDW.
Peng, Yang; Wu, Chao; Zheng, Yifu; Dong, Jun
2017-01-01
Welded joints are prone to fatigue cracking in the presence of welding defects and bending stress. Fracture mechanics is a useful approach by which the fatigue life of a welded joint can be predicted. The key challenge of such predictions using fracture mechanics is how to accurately calculate the stress intensity factor (SIF). An empirical formula for calculating the SIF of welded joints under bending stress was developed by Baik, Yamada and Ishikawa based on the hybrid method. However, when calculating the SIF of a semi-elliptical crack, this study found that the accuracy of the Baik-Yamada formula was poor when compared with benchmark results, experimental data, and numerical results. The reasons for the reduced accuracy of the Baik-Yamada formula are identified and discussed in this paper. Furthermore, a new correction factor was developed and added to the Baik-Yamada formula by using theoretical analysis and numerical regression. Finally, the predictions using the modified Baik-Yamada formula were compared with the benchmark results, experimental data, and numerical results. It was found that the accuracy of the modified Baik-Yamada formula was greatly improved. Therefore, it is proposed that this modified formula be used to conveniently and accurately calculate the SIF of semi-elliptical cracks in welded joints under bending stress. PMID:28772527
Nonlinear resonances in the ABC-flow
NASA Astrophysics Data System (ADS)
Didov, A. A.; Uleysky, M. Yu.
2018-01-01
In this paper, we study resonances of the ABC-flow in the near-integrable case (C ≪ 1). This is an interesting example of a Hamiltonian system with 3/2 degrees of freedom in which the simultaneous existence of two resonances of the same order is possible. Analytical conditions for the existence of the resonances are derived. It is shown numerically that the largest n:1 (n = 1, 2, 3) resonances exist, and their energies are equal to the theoretical energies in the near-integrable case. We provide analytical and numerical evidence for the existence of two branches of the two largest n:1 (n = 1, 2) resonances in the region of finite motion.
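For reference, the ABC-flow velocity field and the conserved quantity of its integrable limit can be checked numerically. In the sketch below C = 0, so the x-z subsystem is Hamiltonian with H = A cos z + B sin x conserved along trajectories; the coefficients, step size, and initial condition are illustrative:

```python
import numpy as np

A, B, C = 1.0, 0.7, 0.0  # integrable limit C = 0

def abc_rhs(p):
    # ABC-flow: dx/dt = A sin z + C cos y, dy/dt = B sin x + A cos z,
    #           dz/dt = C sin y + B cos x
    x, y, z = p
    return np.array([A * np.sin(z) + C * np.cos(y),
                     B * np.sin(x) + A * np.cos(z),
                     C * np.sin(y) + B * np.cos(x)])

def rk4_step(p, h):
    # classical fourth-order Runge-Kutta step
    k1 = abc_rhs(p)
    k2 = abc_rhs(p + h / 2 * k1)
    k3 = abc_rhs(p + h / 2 * k2)
    k4 = abc_rhs(p + h * k3)
    return p + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def energy(p):
    # conserved in the x-z subsystem when C = 0
    return A * np.cos(p[2]) + B * np.sin(p[0])

p = np.array([0.3, 0.0, 1.2])
e0 = energy(p)
for _ in range(5000):
    p = rk4_step(p, 0.01)
drift = abs(energy(p) - e0)  # should stay near zero for C = 0
```

Turning on a small C perturbs this integrable structure, which is the regime in which the paper locates the n:1 resonances.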
Refining the aggregate exposure pathway.
Tan, Yu-Mei; Leonard, Jeremy A; Edwards, Stephen; Teeguarden, Justin; Egeghy, Peter
2018-03-01
Advancements in measurement technologies and modeling capabilities continue to result in an abundance of exposure information, adding to that currently in existence. However, fragmentation within the exposure science community acts as an obstacle to realizing the vision set forth in the National Research Council's report on Exposure Science in the 21st Century: to consider exposures from source to dose, on multiple levels of integration, and to multiple stressors. The concept of an Aggregate Exposure Pathway (AEP) was proposed as a framework for organizing and integrating diverse exposure information that exists across numerous repositories and among multiple scientific fields. A workshop held in May 2016 followed the introduction of the AEP concept, allowing members of the exposure science community to provide extensive evaluation and feedback regarding the framework's structure, key components, and applications. The current work briefly introduces topics discussed at the workshop and attempts to address key challenges involved in refining this framework. The resulting evolution of the AEP framework's features facilitates the acquisition, integration, organization, and transparent application and communication of exposure knowledge in a manner that is independent of its ultimate use, thereby enabling reuse of such information in many applications.
A Single-Cell Approach to the Elusive Latent Human Cytomegalovirus Transcriptome.
Goodrum, Felicia; McWeeney, Shannon
2018-06-12
Herpesvirus latency has been difficult to understand molecularly due to low levels of viral genomes and gene expression. In the case of the betaherpesvirus human cytomegalovirus (HCMV), this is further complicated by the heterogeneity inherent to hematopoietic subpopulations harboring genomes and, as a consequence, the various patterns of infection that simultaneously exist in a host, ranging from latent to lytic. Single-cell RNA sequencing (scRNA-seq) provides tremendous potential in measuring the gene expression profiles of heterogeneous cell populations for a wide range of applications, including in studies of cancer, immunology, and infectious disease. A recent study by Shnayder et al. (mBio 9:e00013-18, 2018, https://doi.org/10.1128/mBio.00013-18) utilized scRNA-seq to define transcriptomal characteristics of HCMV latency. They conclude that latency-associated gene expression is similar to the late lytic viral program but at lower levels of expression. The study highlights the numerous challenges, from the definition of latency to the analysis of scRNA-seq, that exist in defining a latent transcriptome. Copyright © 2018 Goodrum and McWeeney.
Ethical Challenges in the Teaching of Multicultural Course Work
ERIC Educational Resources Information Center
Fier, Elizabeth Boyer; Ramsey, MaryLou
2005-01-01
The authors explore the ethical issues and challenges frequently encountered by counselor educators of multicultural course work. Existing ethics codes are examined, and the need for greater specificity with regard to teaching courses of multicultural content is addressed. Options for revising existing codes to better address the challenges of…
Applied Geophysics Opportunities in the Petroleum Industry
NASA Astrophysics Data System (ADS)
Olgaard, D. L.; Tikku, A.; Roberts, J. C.; Martinez, A.
2012-12-01
Meeting the increasing global demand for energy over the next several decades presents daunting challenges to engineers and scientists, including geoscientists of all disciplines. Many opportunities exist for geophysicists to find and produce oil and gas in a safe, environmentally responsible and affordable manner. Successful oil and gas exploration involves a 'Plates to Pores' approach that integrates multi-scale data from satellites, marine and land seismic and non-seismic field surveys, lab experiments, and even electron microscopy. The petroleum industry is at the forefront of using high performance computing to develop innovative methods to process and analyze large volumes of seismic data and perform realistic numerical modeling, such as finite element fluid flow and rock deformation simulations. Challenging and rewarding jobs in exploration, production and research exist for students with BS/BA, MS and PhD degrees. Geophysics students interested in careers in the petroleum industry should have a broad foundation in science, math and fundamental geosciences at the BS/BA level, as well as mastery of the scientific method, usually gained through thesis work at MS and PhD levels. Field geology or geophysics experience is also valuable. Other personal attributes typical for geoscientists to be successful in industry include a passion for solving complex geoscience problems, the flexibility to work on a variety of assignments throughout a career and skills such as teamwork, communication, integration and leadership. In this presentation we will give examples of research, exploration and production opportunities for geophysicists in petroleum companies and compare and contrast careers in academia vs. industry.
Zhang, Kaihua; Zhang, Lei; Yang, Ming-Hsuan
2014-10-01
It is a challenging task to develop effective and efficient appearance models for robust object tracking due to factors such as pose variation, illumination change, occlusion, and motion blur. Existing online tracking algorithms often update models with samples from observations in recent frames. Although much success has been demonstrated, numerous issues remain to be addressed. First, while these adaptive appearance models are data-dependent, there does not exist a sufficient amount of data for online algorithms to learn from at the outset. Second, online tracking algorithms often encounter the drift problem: as a result of self-taught learning, misaligned samples are likely to be added and degrade the appearance models. In this paper, we propose a simple yet effective and efficient tracking algorithm with an appearance model based on features extracted from a multiscale image feature space with a data-independent basis. The proposed appearance model employs non-adaptive random projections that preserve the structure of the image feature space of objects. A very sparse measurement matrix is constructed to efficiently extract the features for the appearance model. We compress sample images of the foreground target and the background using the same sparse measurement matrix. The tracking task is formulated as binary classification via a naive Bayes classifier with online update in the compressed domain. A coarse-to-fine search strategy is adopted to further reduce the computational complexity in the detection procedure. The proposed compressive tracking algorithm runs in real time and performs favorably against state-of-the-art methods on challenging sequences in terms of efficiency, accuracy, and robustness.
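The data-independent projection underlying such an appearance model can be sketched with a very sparse random measurement matrix. The construction below is an Achlioptas-style matrix and synthetic feature vectors are used as stand-ins for image features; the paper's exact matrix and feature space may differ:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 10000, 100  # original feature dimension -> compressed dimension

# Very sparse measurement matrix: entries +/- sqrt(s) with probability
# 1/(2s) each and 0 otherwise (s = 3), scaled so that pairwise
# Euclidean distances are preserved in expectation.
s = 3.0
R = rng.choice([np.sqrt(s), 0.0, -np.sqrt(s)],
               size=(m, n),
               p=[1 / (2 * s), 1 - 1 / s, 1 / (2 * s)]) / np.sqrt(m)

# two synthetic high-dimensional feature vectors
u = rng.standard_normal(n)
v = rng.standard_normal(n)
d_orig = float(np.linalg.norm(u - v))
d_proj = float(np.linalg.norm(R @ u - R @ v))
rel_err = abs(d_proj - d_orig) / d_orig  # small with high probability
```

Because roughly two-thirds of the entries of R are zero, the projection costs only a fraction of a dense matrix multiply, which is what makes real-time feature extraction feasible.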
Working and caring for a child with chronic illness: A review of current literature.
Kish, A M; Newcombe, P A; Haslam, D M
2018-05-01
Advances in medical knowledge have contributed to the increase in the number of children living with some form of long-term chronic illness or condition. As a consequence of these advancements, treatments that are more accessible and easier to administer, usually within a child's home, have been developed. However, this may mean that parents take on greater treatment responsibility and require extra time and energy to meet these tasks, in addition to other responsibilities. This review paper aims to summarize and critique existing literature on working parents of children with a chronic condition, focusing on patterns of parent work, the challenges experienced, and the flow-on consequences for well-being. Employing a narrative meta-synthesis of the current literature, this review identified three key themes related to working parents of children with chronic illness. The paper first identifies that although employment is less common, these parents are not necessarily nonworking. Second, these parents experience numerous challenges, including balancing work and family, time constraints, stress, and feelings of "doing it all." Third, these challenges lead to additional impacts on parental quality of life. This review summarizes what is currently known about work patterns, challenges, and consequences in parents of children with chronic conditions. Employment is clearly impacted for these parents. Although workplace challenges have been extensively researched, other challenges (eg, personal and family) and impacts on well-being have not. This review discusses the present standing of this research. It outlines the strengths and limitations of the current literature, makes recommendations for future research, and suggests theoretical and practical implications of the findings. © 2018 John Wiley & Sons Ltd.
Challenges of Integrating NASA's Space Communications Networks
NASA Technical Reports Server (NTRS)
Reinert, Jessica; Barnes, Patrick
2013-01-01
The transition to new technology, innovative ideas, and resistance to change is something that every industry experiences. Recent examples of this shift are changing to using robots in the assembly line construction of automobiles or the increasing use of robotics for medical procedures. Most often this is done with cost-reduction in mind, though ease of use for the customer is also a driver. All industries experience the push to increase efficiency of their systems; National Aeronautics and Space Administration (NASA) and the commercial space industry are no different. NASA space communication services are provided by three separately designed, developed, maintained, and operated communications networks known as the Deep Space Network (DSN), Near Earth Network (NEN) and Space Network (SN). The Space Communications and Navigation (SCaN) Program is pursuing integration of these networks and has performed a variety of architecture trade studies to determine what integration options would be the most effective in achieving a unified user mission support organization, and increase the use of common operational equipment and processes. The integration of multiple, legacy organizations and existing systems has challenges ranging from technical to cultural. The existing networks are the progeny of the very first communication and tracking capabilities implemented by NASA and the Jet Propulsion Laboratory (JPL) more than 50 years ago and have been customized to the needs of their respective user mission base. The technical challenges to integrating the networks are many, though not impossible to overcome. The three distinct networks provide the same types of services, with customizable data rates, bandwidth, frequencies, and so forth. The differences across the networks have occurred in effort to satisfy their user missions' needs. Each new requirement has made the networks more unique and harder to integrate. 
The cultural challenges, however, have proven to be a significant obstacle for integration. Over the past few decades of use, user missions and network personnel alike have grown accustomed to the processes by which services are provided by the NASA communications and navigation networks. The culture established by each network has created several challenges that need to be overcome in order to effectively integrate the networks. As with any change, there has been resistance, an apprehension to explore automation of existing processes, and a working environment that attempts to indirectly influence change without mandating compliance. Overcoming technical and cultural challenges is essential to successfully integrating the networks, and although the challenges are numerous, the integration of the networks promises a more efficient space communications network for NASA and its customers, as well as potential long-term cost savings to the agency. This paper, Challenges of Integrating NASA Legacy Communications Networks, will provide a brief overview of the current NASA space communications networks as well as an overview of the process implemented while performing the SCaN Trade Studies and an introduction to the requirements driving integration of the SCaN Networks. This paper will describe in detail the challenges experienced, both technical and cultural, while working with NASA space communications network-specific personnel. The paper will also cover lessons learned during the performance of architecture trade studies and provide recommendations for ways to improve the process.
Development of Image Segmentation Methods for Intracranial Aneurysms
Qian, Yi; Morgan, Michael
2013-01-01
Though providing vital means for the visualization, diagnosis, and quantification of decision-making processes for the treatment of vascular pathologies, vascular segmentation remains a process that continues to be marred by numerous challenges. In this study, we validate eight aneurysms via the use of two existing segmentation methods: the Region Growing Threshold and the Chan-Vese model. These methods were evaluated by comparison of their results with a manual segmentation. Based upon this validation study, we propose a new Threshold-Based Level Set (TLS) method in order to overcome the existing problems. Across the divergent methods of segmentation, we found that the volumes of the aneurysm models differed by up to 24%. The local anatomical shapes of the arteries at the aneurysms were likewise found to significantly influence the results of these simulations. In contrast, the volume differences calculated via the TLS method remained relatively low, at only around 5%, thereby revealing inherent limitations in the application of cerebrovascular segmentation. The proposed TLS method holds the potential for utilisation in automatic aneurysm segmentation without the setting of a seed point or intensity threshold. This technique will further enable the segmentation of anatomically complex cerebrovascular shapes, thereby allowing for more accurate and efficient simulations of medical imagery. PMID:23606905
Robust volcano plot: identification of differential metabolites in the presence of outliers.
Kumar, Nishith; Hoque, Md Aminul; Sugimoto, Masahiro
2018-04-11
The identification of differential metabolites in metabolomics remains a major challenge and plays a prominent role in metabolomics data analyses. Metabolomics datasets often contain outliers because of analytical, experimental, and biological ambiguity, but the currently available differential metabolite identification techniques are sensitive to outliers. We propose a kernel-weight-based, outlier-robust volcano plot for identifying differential metabolites from noisy metabolomics datasets. Two numerical experiments are used to evaluate the performance of the proposed technique against nine existing techniques, including the t-test and the Kruskal-Wallis test. Artificially generated data with outliers reveal that the proposed method results in a lower misclassification error rate and a greater area under the receiver operating characteristic curve compared with existing methods. An experimentally measured breast cancer dataset to which outliers were artificially added reveals that our proposed method produces only two non-overlapping differential metabolites, whereas the other nine methods produced between seven and 57 non-overlapping differential metabolites. Our data analyses show that the performance of the proposed differential metabolite identification technique is better than that of existing methods. Thus, the proposed method can contribute to the analysis of metabolomics data with outliers. The R package and user manual of the proposed method are available at https://github.com/nishithkumarpaul/Rvolcano.
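For context, the conventional (non-robust) volcano-plot computation that the proposed kernel-weighted method hardens can be sketched as follows (a hedged illustration; the function name, cutoffs, and synthetic data are ours):

```python
import numpy as np
from scipy import stats

def volcano(group_a, group_b, fc_cut=1.0, p_cut=0.05):
    """Per-metabolite log2 fold change and Welch t-test p-value.
    Rows are metabolites, columns are samples. Returns
    (log2 fold changes, p-values, differential mask)."""
    log2fc = np.log2(group_b.mean(axis=1) / group_a.mean(axis=1))
    pvals = stats.ttest_ind(group_a, group_b, axis=1, equal_var=False).pvalue
    diff = (np.abs(log2fc) >= fc_cut) & (pvals <= p_cut)
    return log2fc, pvals, diff

rng = np.random.default_rng(0)
a = rng.normal(100, 5, size=(20, 12))   # 20 metabolites, 12 samples per group
b = rng.normal(100, 5, size=(20, 12))
b[3] *= 4                               # metabolite 3 is truly differential
log2fc, pvals, diff = volcano(a, b)
```

Because the sample mean and the t-test are both outlier-sensitive, a single aberrant sample can move a metabolite across these cutoffs, which is precisely the failure mode the kernel-weighted robust variant addresses.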
Identifying important nodes by adaptive LeaderRank
NASA Astrophysics Data System (ADS)
Xu, Shuang; Wang, Pei
2017-03-01
Spreading processes are a common phenomenon in complex networks, and identifying important nodes in complex networks is of great significance in real-world applications. Based on spreading processes on networks, many measures have been proposed to evaluate the importance of nodes. However, most existing measures are designed for static networks and are fragile to topological perturbations. Many real-world complex networks are dynamic rather than static: their nodes and edges may change with time, which challenges numerous existing centrality measures. Based on a new weighting mechanism and the recently proposed H-index and LeaderRank (LR), this paper introduces a variant of the LR measure, called adaptive LeaderRank (ALR), a new member of the LR family. Simulations on six real-world networks reveal that the new measure strikes a good balance between prediction accuracy and robustness. More interestingly, the new measure adapts better to adjustments or local perturbations of network topologies than existing measures do. By discussing the detailed properties of the measures from the LR family, we illustrate that ALR has competitive advantages over the other measures. The proposed algorithm enriches the measures available for understanding complex networks, and may have potential applications in social networks and biological systems.
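The original LeaderRank procedure that ALR builds on can be sketched in a few lines (a hedged illustration; the ALR weighting itself is not reproduced here, and the function name is ours). A ground node linked bidirectionally to every node makes the random walk converge without a damping parameter:

```python
import numpy as np

def leaderrank(adj, tol=1e-10, max_iter=1000):
    """Standard LeaderRank scores on a directed adjacency matrix.
    A ground node connected bidirectionally to every node makes the
    walk ergodic; its final score is redistributed evenly."""
    n = adj.shape[0]
    g = np.zeros((n + 1, n + 1))
    g[:n, :n] = adj
    g[n, :n] = 1.0                     # ground node -> every node
    g[:n, n] = 1.0                     # every node -> ground node
    p = g / g.sum(axis=1)[:, None]     # row-stochastic transition matrix
    s = np.ones(n + 1)
    s[n] = 0.0                         # ground node starts with no score
    for _ in range(max_iter):
        s_new = p.T @ s
        if np.abs(s_new - s).max() < tol:
            s = s_new
            break
        s = s_new
    return s[:n] + s[n] / n            # redistribute the ground node's score

# Star graph: node 0 linked bidirectionally with everyone else
n = 6
adj = np.zeros((n, n))
adj[0, 1:] = adj[1:, 0] = 1.0
scores = leaderrank(adj)
```

On this star graph the hub receives the highest score, and the total score is conserved at N because the transition matrix is row-stochastic.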
Numerical Relativity, Black Hole Mergers, and Gravitational Waves: Part I
NASA Technical Reports Server (NTRS)
Centrella, Joan
2012-01-01
This series of 3 lectures will present recent developments in numerical relativity, and their applications to simulating black hole mergers and computing the resulting gravitational waveforms. In this first lecture, we introduce the basic ideas of numerical relativity, highlighting the challenges that arise in simulating gravitational wave sources on a computer.
Haser, Grace C.; Tuttle, R. Michael; Su, Henry K.; Alon, Eran E.; Bergman, Donald; Bernet, Victor; Brett, Elise; Cobin, Rhoda; Dewey, Eliza H.; Doherty, Gerard; Dos Reis, Laura L.; Harris, Jeffrey; Klopper, Joshua; Lee, Stephanie L.; Levine, Robert A.; Lepore, Stephen J.; Likhterov, Ilya; Lupo, Mark A.; Machac, Josef; Mechanick, Jeffrey I.; Mehra, Saral; Milas, Mira; Orloff, Lisa A.; Randolph, Gregory; Revenson, Tracey A.; Roberts, Katherine J.; Ross, Douglas S.; Rowe, Meghan E.; Smallridge, Robert C.; Terris, David; Tufano, Ralph P.; Urken, Mark L.
2017-01-01
Objective The dramatic increase in papillary thyroid carcinoma (PTC) is primarily a result of early diagnosis of small cancers. Active surveillance is a promising management strategy for papillary thyroid microcarcinomas (PTMCs). However, as this management strategy gains traction in the U.S., it is imperative that patients and clinicians be properly educated, patients be followed for life, and appropriate tools be identified to implement the strategy. Methods We review previous active surveillance studies and the parameters used to identify patients who are good candidates for active surveillance. We also review some of the challenges to implementing active surveillance protocols in the U.S. and discuss how these might be addressed. Results Trials of active surveillance support nonsurgical management as a viable and safe management strategy. However, numerous challenges exist, including the need for adherence to protocols, education of patients and physicians, and awareness of the impact of this strategy on patient psychology and quality of life. The Thyroid Cancer Care Collaborative (TCCC) is a portable record-keeping system that can manage a mobile patient population undergoing active surveillance. Conclusion With proper patient selection, organization, and patient support, active surveillance has the potential to be a long-term management strategy for select patients with PTMC. In order to address the challenges and opportunities for this approach to be successfully implemented in the U.S., it will be necessary to consider psychological and quality-of-life effects, cultural differences, and the patient’s clinical status. PMID:26799628
ERIC Educational Resources Information Center
King, Cheryl A.; Kramer, Anne C.
2008-01-01
Intervention research with youths at elevated risk for suicidal behavior and suicide--a vulnerable and high risk population--presents investigators with numerous ethical challenges. This report specifically addresses those challenges involving the informed consent and assent process with parents/guardians and youths. The challenges are delineated…
NASA Astrophysics Data System (ADS)
Liang, Qingguo; Li, Jie; Li, Dewu; Ou, Erfeng
2013-01-01
The vibrations of existing service tunnels induced by blast-excavation of adjacent tunnels have attracted much attention from both academics and engineers during recent decades in China. The blasting vibration velocity (BVV) is the most widely used controlling index for in situ monitoring and safety assessment of existing lining structures. Although numerous in situ tests and simulations have been carried out to investigate blast-induced vibrations of existing tunnels due to excavation of new tunnels (mostly by the bench excavation method), research on the overall dynamic response of existing service tunnels, in terms of not only BVV but also stress/strain, remains limited for new tunnels excavated by the full-section blasting method. In this paper, the impacts of blast-induced vibrations from a new tunnel on an existing railway tunnel in Xinjiang, China were comprehensively investigated using laboratory tests, in situ monitoring and numerical simulations. The measured data from laboratory tests and in situ monitoring were used to determine the parameters needed for numerical simulations, and were compared with the calculated results. Based on the results from in situ monitoring and numerical simulations, which were consistent with each other, the original blasting design and corresponding parameters were adjusted to reduce the maximum BVV, which proved to be effective and safe. The effect of both the static stress before blasting vibrations and the dynamic stress induced by blasting on the total stresses in the existing tunnel lining is also discussed. The methods and related results presented could be applied in projects with similar ground conditions and distance between old and new tunnels if the new tunnel is to be excavated by the full-section blasting method.
An improved conjugate gradient scheme to the solution of least squares SVM.
Chu, Wei; Ong, Chong Jin; Keerthi, S Sathiya
2005-03-01
The least squares support vector machine (LS-SVM) formulation corresponds to the solution of a linear system of equations. Several approaches to its numerical solution have been proposed in the literature. In this letter, we propose an improved method for the numerical solution of LS-SVM and show that the problem can be solved using one reduced system of linear equations. Compared with the existing algorithm for LS-SVM, the approach used in this letter is about twice as efficient. Numerical results using the proposed method are provided for comparisons with other existing algorithms.
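The kind of reduction described, training an LS-SVM through solves with the single symmetric positive definite matrix K + I/γ, can be sketched as follows (a hedged illustration using direct solves in place of the letter's conjugate gradient scheme; function names and the toy data are ours):

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM KKT conditions. Eliminating the bias b leaves
    two solves with the same SPD matrix H = K + I/gamma, a common
    reduction to one system matrix."""
    n = len(y)
    H = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    eta = np.linalg.solve(H, np.ones(n))
    nu = np.linalg.solve(H, y)
    b = nu.sum() / eta.sum()
    alpha = nu - b * eta
    return alpha, b

def lssvm_predict(X_train, alpha, b, X_new, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# Toy binary problem: label is the sign of the first coordinate
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 2))
y = np.sign(X[:, 0])
alpha, b = lssvm_train(X, y)
pred = np.sign(lssvm_predict(X, alpha, b, X))
```

An iterative scheme such as conjugate gradient replaces the two direct solves when the kernel matrix is large.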
NASA Astrophysics Data System (ADS)
Wang, Dongling; Xiao, Aiguo; Li, Xueyang
2013-02-01
Based on W-transformation, some parametric symplectic partitioned Runge-Kutta (PRK) methods depending on a real parameter α are developed. For α=0, the corresponding methods become the usual PRK methods, including Radau IA-IA¯ and Lobatto IIIA-IIIB methods as examples. For any α≠0, the corresponding methods are symplectic and there exists a value α∗ such that energy is preserved in the numerical solution at each step. The existence of the parameter and the order of the numerical methods are discussed. Some numerical examples are presented to illustrate these results.
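For separable Hamiltonians, the 2-stage Lobatto IIIA-IIIB pair cited above reduces to the Störmer-Verlet scheme, whose bounded energy error illustrates the symplectic behavior discussed (a hedged sketch; the paper's α-parametric family is not reproduced, and the function name is ours):

```python
import numpy as np

def verlet(q, p, grad_V, h, steps):
    """Stormer-Verlet integration for H(q, p) = p^2/2 + V(q); the
    2-stage Lobatto IIIA-IIIB partitioned RK pair reduces to this
    scheme for separable Hamiltonians."""
    traj = [(q, p)]
    for _ in range(steps):
        p_half = p - 0.5 * h * grad_V(q)   # half kick
        q = q + h * p_half                 # drift
        p = p_half - 0.5 * h * grad_V(q)   # half kick
        traj.append((q, p))
    return traj

# Harmonic oscillator: V(q) = q^2/2, exact energy 0.5 for (q, p) = (1, 0)
traj = verlet(1.0, 0.0, lambda q: q, h=0.05, steps=4000)
energies = [0.5 * p * p + 0.5 * q * q for q, p in traj]
drift = max(abs(e - 0.5) for e in energies)
```

The energy error oscillates at O(h²) but does not drift over long times, in contrast to non-symplectic methods of the same order.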
Rajaraman, Prathish K; Manteuffel, T A; Belohlavek, M; Heys, Jeffrey J
2017-01-01
A new approach has been developed for combining and enhancing the results from an existing computational fluid dynamics model with experimental data using the weighted least-squares finite element method (WLSFEM). Development of the approach was motivated by the existence of both limited experimental blood velocity data in the left ventricle and inexact numerical models of the same flow. Limitations of the experimental data include measurement noise and having data only along a two-dimensional plane. Most numerical modeling approaches do not provide the flexibility to assimilate noisy experimental data. We previously developed an approach that could assimilate experimental data into the process of numerically solving the Navier-Stokes equations, but the approach was limited because it required the use of specific finite element methods for solving all model equations and did not support alternative numerical approximation methods. The new approach presented here allows virtually any numerical method to be used for approximately solving the Navier-Stokes equations, and then the WLSFEM is used to combine the experimental data with the numerical solution of the model equations in a final step. The approach dynamically adjusts the influence of the experimental data on the numerical solution so that more accurate data are more closely matched by the final solution and less accurate data are not closely matched. The new approach is demonstrated on different test problems and provides significantly reduced computational costs compared with many previous methods for data assimilation. Copyright © 2016 John Wiley & Sons, Ltd.
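The final blending step can be illustrated with a generic weighted least-squares sketch (ours, not the paper's WLSFEM; symbols, weights, and data are illustrative): minimize w_model·||x − x_model||² + w_data·||Hx − d||² and solve the normal equations directly.

```python
import numpy as np

def assimilate(x_model, H, d, w_model, w_data):
    """Blend a model solution with observations d = H x + noise by
    weighted least squares: a larger w_data pulls the estimate toward
    the measurements, mimicking how more accurate data should be
    matched more closely."""
    n = len(x_model)
    A = w_model * np.eye(n) + w_data * (H.T @ H)
    rhs = w_model * x_model + w_data * (H.T @ d)
    return np.linalg.solve(A, rhs)

# True field sampled at 3 of 5 points
x_true = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
H = np.zeros((3, 5))
H[0, 0] = H[1, 2] = H[2, 4] = 1.0
d = H @ x_true                 # noise-free observations here
x_model = x_true + 0.5         # biased model output
x_lo = assimilate(x_model, H, d, 1.0, 1.0)
x_hi = assimilate(x_model, H, d, 1.0, 100.0)
```

Raising the data weight pulls the observed entries toward the measurements while leaving unobserved entries at the model value, which is the qualitative behavior the paper's dynamic weighting automates.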
DOT National Transportation Integrated Search
2016-03-11
The purpose of this work is to identify instances where the existing Federal Motor Vehicle Safety Standards may pose challenges to the introduction of automated vehicles. It identifies standards requiring further review - both to ensure that existing...
Cui, Jiaxin; Georgiou, George K; Zhang, Yiyun; Li, Yixun; Shu, Hua; Zhou, Xinlin
2017-02-01
Rapid automatized naming (RAN) has been found to predict mathematics performance. However, the nature of this relationship remains unclear. Thus, the purpose of this study was twofold: (a) to examine how RAN (numeric and non-numeric) predicts a subdomain of mathematics (arithmetic fluency) and (b) to examine what processing skills may account for the RAN-arithmetic fluency relationship. A total of 160 third-year kindergarten Chinese children (83 boys and 77 girls, mean age = 5.11 years) were assessed on RAN (colors, objects, digits, and dice), nonverbal IQ, visual-verbal paired associate learning, phonological awareness, short-term memory, speed of processing, approximate number system acuity, and arithmetic fluency (addition and subtraction). The results indicated first that RAN was a significant correlate of arithmetic fluency and the correlations did not vary as a function of type of RAN or arithmetic fluency tasks. In addition, RAN continued to predict addition and subtraction fluency even after controlling for all other processing skills. Taken together, these findings challenge the existing theoretical accounts of the RAN-arithmetic fluency relationship and suggest that, similar to reading fluency, multiple processes underlie the RAN-arithmetic fluency relationship. Copyright © 2016 Elsevier Inc. All rights reserved.
Computational Relativistic Astrophysics Using the Flowfield-Dependent Variation Theory
NASA Technical Reports Server (NTRS)
Richardson, G. A.; Chung, T. J.; Whitaker, Ann F. (Technical Monitor)
2001-01-01
Theoretical models, observations and measurements have preoccupied astrophysicists for many centuries. Only in recent years has the theory of relativity as applied to astrophysical flows met the challenges of how the governing equations can be solved numerically with accuracy and efficiency. Even without the effects of relativity, the physics of magnetohydrodynamic flow instability, turbulence, radiation, and enhanced transport in accretion disks has not been completely resolved. Relativistic effects become pronounced in such cases as jet formation from black hole magnetized accretion disks and also in the study of gamma-ray bursts (GRBs). Thus, our concern in this paper is to reexamine existing numerical simulation tools as to the accuracy and efficiency of computations and introduce a new approach known as the flowfield-dependent variation (FDV) method. The main feature of the FDV method consists of accommodating discontinuities of shock waves and high gradients of flow variables such as occur in turbulence and unstable motions. In this paper, the physics involved in the solution of relativistic hydrodynamics and solution strategies of the FDV theory are elaborated. The general relativistic astrophysical flow and shock solver (GRAFSS) is introduced, and some simple example problems for Computational Relativistic Astrophysics (CRA) are demonstrated.
Z2Pack: Numerical implementation of hybrid Wannier centers for identifying topological materials
NASA Astrophysics Data System (ADS)
Gresch, Dominik; Autès, Gabriel; Yazyev, Oleg V.; Troyer, Matthias; Vanderbilt, David; Bernevig, B. Andrei; Soluyanov, Alexey A.
2017-02-01
The intense theoretical and experimental interest in topological insulators and semimetals has established band structure topology as a fundamental material property. Consequently, identifying band topologies has become an important, but often challenging, problem, with no exhaustive solution at the present time. In this work we compile a series of techniques, some previously known, that allow for a solution to this problem for a large set of the possible band topologies. The method is based on tracking hybrid Wannier charge centers computed for relevant Bloch states, and it works at all levels of materials modeling: continuum k·p models, tight-binding models, and ab initio calculations. We apply the method to compute and identify Chern, Z2, and crystalline topological insulators, as well as topological semimetal phases, using real material examples. Moreover, we provide a numerical implementation of this technique (the Z2Pack software package) that is ideally suited for high-throughput screening of materials databases for compounds with nontrivial topologies. We expect that our work will allow researchers to (a) identify topological materials optimal for experimental probes, (b) classify existing compounds, and (c) reveal materials that host novel, not yet described, topological states.
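The core idea, tracking the phase of a Wilson loop of occupied Bloch states as a hybrid Wannier charge center, can be sketched on a two-band lattice model (a hedged illustration on the Qi-Wu-Zhang model, which is our choice and not part of Z2Pack; for a single occupied band, the winding of the center as the transverse momentum traverses the Brillouin zone gives the Chern number):

```python
import numpy as np

def qwz_hamiltonian(kx, ky, m):
    # Qi-Wu-Zhang two-band Chern insulator (illustrative model)
    sx = np.array([[0, 1], [1, 0]], complex)
    sy = np.array([[0, -1j], [1j, 0]], complex)
    sz = np.array([[1, 0], [0, -1]], complex)
    return np.sin(kx) * sx + np.sin(ky) * sy + (m + np.cos(kx) + np.cos(ky)) * sz

def occupied_state(kx, ky, m):
    _, v = np.linalg.eigh(qwz_hamiltonian(kx, ky, m))
    return v[:, 0]                      # lower band

def wannier_center(kx, m, n_ky=64):
    """Hybrid Wannier charge center: phase of the Wilson loop along ky."""
    kys = np.linspace(0, 2 * np.pi, n_ky, endpoint=False)
    states = [occupied_state(kx, ky, m) for ky in kys]
    states.append(states[0])            # close the loop (gauge-invariant product)
    w = 1.0 + 0j
    for a, b in zip(states[:-1], states[1:]):
        w *= np.vdot(a, b)
    return np.angle(w) / (2 * np.pi)

def chern_from_wcc(m, n_kx=48):
    """Winding of the Wannier center across the BZ gives the Chern number."""
    kxs = np.linspace(0, 2 * np.pi, n_kx, endpoint=False)
    theta = np.array([2 * np.pi * wannier_center(kx, m) for kx in kxs])
    theta = np.append(theta, theta[0])
    unwrapped = np.unwrap(theta)
    return int(round((unwrapped[-1] - unwrapped[0]) / (2 * np.pi)))
```

For m = 1 the model is topological (|C| = 1) and the Wannier center winds once across the zone; for m = 3 it is trivial and the center does not wind.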
Review of Thawing Time Prediction Models Depending on Process Conditions and Product Characteristics
Kluza, Franciszek; Spiess, Walter E. L.; Kozłowicz, Katarzyna
2016-01-01
Determining thawing times of frozen foods is a challenging problem because the thermophysical properties of the product change during thawing. A number of calculation models and solutions have been developed. The proposed solutions range from relatively simple analytical equations based on a number of assumptions to a group of empirical approaches that sometimes require complex calculations. In this paper analytical, empirical and graphical models are presented and critically reviewed. The solution conditions, limitations, and possible applications of the models are discussed. The graphical and semi-graphical models are derived from numerical methods. Using numerical methods is not always possible, as running the calculations takes time, and the specialized software and equipment are not always cheap. For these reasons, the application of analytical-empirical models is more useful for engineering practice. It is demonstrated that there is no simple, accurate and feasible analytical method for thawing time prediction. Consequently, simplified methods are needed for thawing time estimation of agricultural and food products. The review reveals the need for further improvement of the existing solutions or development of new ones that will enable accurate determination of thawing time within a wide range of practical heat transfer conditions during processing. PMID:27904387
A new weak Galerkin finite element method for elliptic interface problems
Mu, Lin; Wang, Junping; Ye, Xiu; ...
2016-08-26
We introduce and analyze a new weak Galerkin (WG) finite element method in this paper for solving second-order elliptic equations with discontinuous coefficients and interfaces. Compared with the existing WG algorithm for solving the same type of problems, the present WG method has a simpler variational formulation and fewer unknowns. Moreover, the new WG algorithm allows the use of finite element partitions consisting of general polytopal meshes and can be easily generalized to high orders. Optimal-order error estimates in both H1 and L2 norms are established for the present WG finite element solutions. We conducted extensive numerical experiments in order to examine the accuracy, flexibility, and robustness of the proposed WG interface approach. In solving regular elliptic interface problems, high-order convergence is numerically confirmed by using piecewise polynomial basis functions of high degrees. Moreover, the WG method is shown to be able to accommodate very complicated interfaces, due to its flexibility in choosing finite element partitions. Finally, in dealing with challenging problems with low regularities, the piecewise linear WG method is capable of delivering a second order of accuracy in the L∞ norm for both C1 and H2 continuous solutions.
NASA Astrophysics Data System (ADS)
Mercier, Sylvain; Gratton, Serge; Tardieu, Nicolas; Vasseur, Xavier
2017-12-01
Many applications in structural mechanics require the numerical solution of sequences of linear systems typically issued from a finite element discretization of the governing equations on fine meshes. The method of Lagrange multipliers is often used to take mechanical constraints into account. The resulting matrices then exhibit a saddle-point structure, and the iterative solution of such preconditioned linear systems is considered challenging. A popular strategy is to combine preconditioning and deflation to yield an efficient method. We propose an alternative that is applicable to the general case and not only to matrices with a saddle-point structure. In this approach, we update an existing algebraic or application-based preconditioner using specific available information, exploiting knowledge of an approximate invariant subspace or of matrix-vector products. The resulting preconditioner has the form of a limited-memory quasi-Newton matrix and requires a small number of linearly independent vectors. Numerical experiments performed on three large-scale applications in elasticity highlight the relevance of the new approach. We show that the proposed method outperforms the deflation method when considering sequences of linear systems with varying matrices.
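The flavor of such an update can be illustrated with a spectral limited-memory preconditioner built from an approximate invariant subspace (a hedged sketch, not the paper's quasi-Newton formula; function names and the test matrix are ours):

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def spectral_preconditioner(A, k):
    """Limited-memory preconditioner from the k smallest eigenpairs of
    SPD A: it acts as A^{-1} on that invariant subspace and as the
    identity elsewhere, clustering the troublesome eigenvalues at 1."""
    lam, V = np.linalg.eigh(A)
    lam_k, V_k = lam[:k], V[:, :k]
    def apply(x):
        coeffs = V_k.T @ x
        return x + V_k @ ((1.0 / lam_k - 1.0) * coeffs)
    return LinearOperator(A.shape, matvec=apply)

def cg_iters(A, b, M=None):
    # Count conjugate gradient iterations via the callback
    count = [0]
    cg(A, b, M=M, callback=lambda xk: count.__setitem__(0, count[0] + 1))
    return count[0]

# 1-D Laplacian: a standard ill-conditioned SPD test matrix
n = 200
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
plain = cg_iters(A, b)
deflated = cg_iters(A, b, M=spectral_preconditioner(A, 10))
```

Deflating the ten smallest eigenpairs reduces the effective condition number seen by CG, so the preconditioned solve converges in noticeably fewer iterations; in practice the subspace would come cheaply from previous solves in the sequence rather than from a full eigendecomposition.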
A Mechanistic Model of the Actin Cycle
Bindschadler, M.; Osborn, E. A.; Dewey, C. F.; McGrath, J. L.
2004-01-01
We have derived a broad, deterministic model of the steady-state actin cycle that includes its major regulatory mechanisms. Ours is the first model to solve the complete nucleotide profile within filaments, a feature that determines the dynamics and geometry of actin networks at the leading edges of motile cells, and one that has challenged investigators developing models to interpret steady-state experiments. We arrived at the nucleotide profile through analytic and numerical approaches that completely agree. Our model reproduces behaviors seen in numerous experiments with purified proteins, but allows a detailed inspection of the concentrations and fluxes that might exist in these experiments. These inspections provide new insight into the mechanisms that determine the rate of actin filament treadmilling. Specifically, we find that mechanisms for enhancing Pi release from the ADP·Pi intermediate on filaments, for increasing the off rate of ADP-bound subunits at pointed ends, and the multiple, simultaneous functions of profilin, make unique and essential contributions to increased treadmilling. In combination, these mechanisms have a theoretical capacity to increase treadmilling to levels limited only by the amount of available actin. This limitation arises because as the cycle becomes more dynamic, it tends toward the unpolymerized state. PMID:15111391
Linearized lattice Boltzmann method for micro- and nanoscale flow and heat transfer.
Shi, Yong; Yap, Ying Wan; Sader, John E
2015-07-01
The ability to characterize heat transfer in flowing gases is important for a wide range of applications involving micro- and nanoscale devices. Gas flows away from the continuum limit can be captured using the Boltzmann equation, whose analytical solution poses a formidable challenge. An efficient and accurate numerical simulation of the Boltzmann equation is thus highly desirable. In this article, the linearized Boltzmann Bhatnagar-Gross-Krook equation is used to develop a hierarchy of thermal lattice Boltzmann (LB) models based on half-space Gaussian-Hermite (GH) quadrature ranging from low to high algebraic precision, using double distribution functions. Simplified versions of the LB models in the continuum limit are also derived, and are shown to be consistent with existing thermal LB models for noncontinuum heat transfer reported in the literature. Accuracy of the proposed LB hierarchy is assessed by simulating thermal Couette flows for a wide range of Knudsen numbers. Effects of the underlying quadrature schemes (half-space GH vs full-space GH) and continuum-limit simplifications on computational accuracy are also elaborated. The numerical findings in this article provide direct evidence of improved computational capability of the proposed LB models for modeling noncontinuum flows and heat transfer at small length scales.
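A minimal single-relaxation-time (BGK) lattice Boltzmann example, far simpler than the thermal GH-quadrature hierarchy above but exhibiting the same stream-and-collide structure, can be sketched for 1-D diffusion (a hedged illustration; the lattice choice, names, and parameters are ours):

```python
import numpy as np

# D1Q3 lattice: velocities -1, 0, +1 with weights 1/6, 2/3, 1/6
W = np.array([1 / 6, 2 / 3, 1 / 6])
C = np.array([-1, 0, 1])

def lbm_diffusion(rho0, tau, steps):
    """BGK lattice Boltzmann for 1-D diffusion with periodic
    boundaries; the diffusivity is (tau - 1/2)/3 in lattice units."""
    f = W[:, None] * rho0[None, :]          # start at equilibrium
    for _ in range(steps):
        rho = f.sum(axis=0)
        feq = W[:, None] * rho[None, :]
        f += (feq - f) / tau                # collide (BGK relaxation)
        for i, c in enumerate(C):           # stream along each velocity
            f[i] = np.roll(f[i], c)
    return f.sum(axis=0)

rho0 = np.zeros(64)
rho0[32] = 1.0                              # point release
rho = lbm_diffusion(rho0, tau=1.0, steps=200)
```

Collision conserves the density moment and streaming merely shifts populations, so total mass is conserved exactly while the point release spreads into a Gaussian-like profile.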
NASA Astrophysics Data System (ADS)
Sakai, K.; Watabe, D.; Minamidani, T.; Zhang, G. S.
2012-10-01
According to Godunov's theorem for numerical calculations of advection equations, no scheme in the family of polynomial schemes with constant positive difference coefficients can exceed first-order accuracy. We propose a third-order computational scheme for numerical fluxes that guarantees non-negative difference coefficients in the resulting finite difference equations for advection-diffusion equations in a semi-conservative form, in which two kinds of numerical fluxes exist at a cell surface and these two fluxes are not always coincident in non-uniform velocity fields. The present scheme is optimized so as to minimize truncation errors of the numerical fluxes while fulfilling the positivity condition on the difference coefficients, which vary depending on the local Courant number and diffusion number. The distinguishing feature of the present optimized scheme is that it keeps third-order accuracy everywhere without any numerical flux limiter. We extend the present method to multi-dimensional equations. Numerical experiments for advection-diffusion equations yielded nonoscillatory solutions.
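The constraint invoked here, Godunov's theorem, is easiest to see in the classic first-order upwind scheme, whose two difference coefficients are non-negative whenever the Courant number lies in [0, 1], making the scheme monotone (a hedged illustration; the paper's optimized third-order flux scheme is not reproduced):

```python
import numpy as np

def upwind_step(u, velocity, dt, dx):
    """First-order upwind update for u_t + a u_x = 0 (a > 0) with
    periodic boundaries. The coefficients (1 - nu) and nu are both
    non-negative when the Courant number nu = a*dt/dx is in [0, 1],
    so no new extrema can be created."""
    nu = velocity * dt / dx
    return (1 - nu) * u + nu * np.roll(u, 1)

x = np.linspace(0, 1, 100, endpoint=False)
u0 = np.where((x > 0.4) & (x < 0.6), 1.0, 0.0)   # square pulse
u = u0.copy()
for _ in range(50):                              # Courant number 0.5
    u = upwind_step(u, velocity=1.0, dt=0.005, dx=0.01)
```

The pulse stays within its initial bounds and total mass is conserved; the price of these positive coefficients is the first-order smearing that higher-order positivity-constrained schemes, such as the one proposed, are designed to reduce.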
ERIC Educational Resources Information Center
Opfer, John E.; Thompson, Clarissa A.; Furlong, Ellen E.
2010-01-01
Numeric magnitudes often bias adults' spatial performance. Partly because the direction of this bias (left-to-right versus right-to-left) is culture-specific, it has been assumed that the orientation of spatial-numeric associations is a late development, tied to reading practice or schooling. Challenging this assumption, we found that preschoolers…
LLIMAS: Revolutionizing integrating modeling and analysis at MIT Lincoln Laboratory
NASA Astrophysics Data System (ADS)
Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.
2017-08-01
MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.
Practice brief. Securing wireless technology for healthcare.
Retterer, John; Casto, Brian W
2004-05-01
Wireless networking can be a very complex science, requiring an understanding of physics and the electromagnetic spectrum. While the radio theory behind the technology can be challenging, a basic understanding of wireless networking can be sufficient for small-scale deployment. Numerous security mechanisms are available for wireless technologies, making wireless networking practical, scalable, and affordable for healthcare organizations. The decision on the selected security model should take into account the needs for additional server hardware and administrative costs. Where wide area network connections exist between cooperative organizations, deployment of a distributed security model can be considered to reduce administrative overhead. The wireless approach chosen should be dynamic and concentrate on the organization's specific environmental needs. Aspects of organizational mission, operations, service level, and budget allotment as well as an organization's risk tolerance are all part of the balance in the decision to deploy wireless technology.
In Vivo Tumor Vasculature Targeting of CuS@MSN Based Theranostic Nanomedicine.
Chen, Feng; Hong, Hao; Goel, Shreya; Graves, Stephen A; Orbay, Hakan; Ehlerding, Emily B; Shi, Sixiang; Theuer, Charles P; Nickles, Robert J; Cai, Weibo
2015-01-01
Actively targeted theranostic nanomedicine may be the key for future personalized cancer management. Although numerous types of theranostic nanoparticles have been developed in the past decade for cancer treatment, challenges still exist in the engineering of biocompatible theranostic nanoparticles with highly specific in vivo tumor targeting capabilities. Here, we report the design, synthesis, surface engineering, and in vivo active vasculature targeting of a new category of theranostic nanoparticle for future cancer management. Water-soluble photothermally sensitive copper sulfide nanoparticles were encapsulated in biocompatible mesoporous silica shells, followed by multistep surface engineering to form the final theranostic nanoparticles. Systematic in vitro targeting, an in vivo long-term toxicity study, photothermal ablation evaluation, in vivo vasculature targeted imaging, biodistribution and histology studies were performed to fully explore the potential of as-developed new theranostic nanoparticles.
Pacing and Defibrillators in Complex Congenital Heart Disease
Chubb, Henry; O’Neill, Mark; Rosenthal, Eric
2016-01-01
Device therapy in the complex congenital heart disease (CHD) population is a challenging field. There is a myriad of devices available, but none designed specifically for the CHD patient group, and a scarcity of prospective studies to guide best practice. Baseline cardiac anatomy, prior surgical and interventional procedures, existing tachyarrhythmias and the requirement for future intervention all play a substantial role in decision making. For both pacing systems and implantable cardioverter defibrillators, numerous factors impact on the merits of system location (endovascular versus non-endovascular), lead positioning, device selection and device programming. For those with Fontan circulation and following the atrial switch procedure there are also very specific considerations regarding access and potential complications. This review discusses the published guidelines, device indications and the best available evidence for guidance of device implantation in the complex CHD population. PMID:27403295
NASA Astrophysics Data System (ADS)
Vongehr, Sascha; Tang, Shaochun
2016-06-01
Research on hollow nanoshells has, for years, claimed to involve free, pre-existing nanobubbles as soft templates. Demonstrating this is a challenge due to the difficulty of in situ observation during solution-based reactions. We show that no available free-bubble theory can describe the mysterious behavior of the bubble number density n. A new mechanism of collision coalescence of bubble-particle systems is suggested to form hollow nanoshells. By approximating the relative velocity as ~R^(-z) (where R is the bubble radius), numerical simulations can reproduce the counterintuitive observations in the regime 1 < z < 2. We discuss the mechanism based on the successful synthesis of grain-monolayer-thin, fractal-like, incomplete, multi-metallic nanoshells with superior catalytic activity. The behaviors of n, R, and the shell thickness h are closely reproduced by z = 1.6.
Luque, John S; Raychowdhury, Swati; Weaver, Mary
2012-01-01
The objective of this pilot study was to understand, from the Vaccines for Children (VFC) program provider's perspective, issues relating to vaccine access and compliance for Hispanic adolescents in a rural setting. Researchers conducted individual structured interviews with VFC providers and focus groups with Hispanic immigrant parents in rural southern Georgia. Overall, the VFC providers said that their Hispanic patients were very positive toward vaccines in general, but there were cost issues related to stocking the vaccine and reaching the Hispanic population. The focus group discussions revealed that most Hispanic parents were not aware of the existence of the human papillomavirus (HPV) vaccine, nor had they heard about the VFC program. Numerous vaccination barriers continue to impact HPV vaccine uptake in the Hispanic immigrant population in the US South.
A review on the mechanical design elements of ankle rehabilitation robot.
Khalid, Yusuf M; Gouwanda, Darwin; Parasuraman, Subramanian
2015-06-01
Ankle rehabilitation robots are developed to enhance ankle strength, flexibility and proprioception after injury and to promote motor learning and ankle plasticity in patients with drop foot. This article reviews the design elements that have been incorporated into the existing robots, for example, backdrivability, safety measures and type of actuation. It also discusses numerous challenges faced by engineers in designing this robot, including robot stability and its dynamic characteristics, universal evaluation criteria to assess end-user comfort, safety and training performance, and the scientific basis for optimal rehabilitation strategies to improve ankle condition. This article can serve as a reference for designing robots with better stability, dynamic characteristics, and safety measures against internal and external events. It can also serve as a guideline for engineers to report their designs and findings. © IMechE 2015.
Orbit Determination Covariance Analysis for the Europa Clipper Mission
NASA Technical Reports Server (NTRS)
Ionasescu, Rodica; Martin-Mur, Tomas; Valerino, Powtawche; Criddle, Kevin; Buffington, Brent; McElrath, Timothy
2014-01-01
A new Jovian satellite tour is proposed by NASA, which would include numerous flybys of the moon Europa, and would explore its potential habitability by characterizing the existence of any water within and beneath Europa's ice shell. This paper describes the results of a covariance study that was undertaken on a sample tour to assess the navigational challenges and capabilities of such a mission from an orbit determination (OD) point of view, and to help establish a delta V budget for the maneuvers needed to keep the spacecraft on the reference trajectory. Additional parametric variations from the baseline case were also investigated. The success of the Europa Clipper mission will depend on the science measurements that it will enable. Meeting the requirements of the instruments onboard the spacecraft is an integral part of this analysis.
Particulate photocatalysts for overall water splitting
NASA Astrophysics Data System (ADS)
Chen, Shanshan; Takata, Tsuyoshi; Domen, Kazunari
2017-10-01
The conversion of solar energy to chemical energy is a promising way of generating renewable energy. Hydrogen production by means of water splitting over semiconductor photocatalysts is a simple, cost-effective approach to large-scale solar hydrogen synthesis. Since the discovery of the Honda-Fujishima effect, considerable progress has been made in this field, and numerous photocatalytic materials and water-splitting systems have been developed. In this Review, we summarize existing water-splitting systems based on particulate photocatalysts, focusing on the main components: light-harvesting semiconductors and co-catalysts. The essential design principles of the materials employed for overall water-splitting systems based on one-step and two-step photoexcitation are also discussed, concentrating on three elementary processes: photoabsorption, charge transfer and surface catalytic reactions. Finally, we outline challenges and potential advances associated with solar water splitting by particulate photocatalysts for future commercial applications.
Simultaneous co-fermentation of mixed sugars: a promising strategy for producing cellulosic ethanol.
Kim, Soo Rin; Ha, Suk-Jin; Wei, Na; Oh, Eun Joong; Jin, Yong-Su
2012-05-01
The lack of microbial strains capable of fermenting all sugars prevalent in plant cell wall hydrolyzates to ethanol is a major challenge. Although naturally existing or engineered microorganisms can ferment mixed sugars (glucose, xylose and galactose) in these hydrolyzates sequentially, the preferential utilization of glucose to non-glucose sugars often results in lower overall yield and productivity of ethanol. Therefore, numerous metabolic engineering approaches have been attempted to construct optimal microorganisms capable of co-fermenting mixed sugars simultaneously. Here, we present recent findings and breakthroughs in engineering yeast for improved ethanol production from mixed sugars. In particular, this review discusses new sugar transporters, various strategies for simultaneous co-fermentation of mixed sugars, and potential applications of co-fermentation for producing fuels and chemicals. Copyright © 2012 Elsevier Ltd. All rights reserved.
Challenges to Reducing Discrimination and Health Inequity Through Existing Civil Rights Laws
Chandra, Amitabh; Frakes, Michael; Malani, Anup
2017-01-01
Fifty years after the passage of Civil Rights Act, minority healthcare remains separate and unequal. We combine insights from Civil Rights Law and research on racial-disparities to understand whether stronger enforcement of existing Civil Rights laws would improve minority healthcare today, or whether complementary approaches are also necessary. Despite earlier success, modern challenges to improving minority healthcare are different than those confronted during de jure segregation. We review these challenges and the potential effectiveness of existing Civil Rights legislation in overcoming them. We conclude that enforcement could be strengthened by executive orders that strengthen existing laws, but Congressional action would be required to allow private individuals to bring suits against discriminatory providers. We contrast the relative benefits of this approach to wider non-litigation-based solutions. We conclude that a combination of the two approaches would better address the challenge of improving minority healthcare in the 21st century. PMID:28583962
Efficient Low Dissipative High Order Schemes for Multiscale MHD Flows
NASA Technical Reports Server (NTRS)
Sjoegreen, Bjoern; Yee, Helen C.; Mansour, Nagi (Technical Monitor)
2002-01-01
Accurate numerical simulations of complex multiscale compressible viscous flows, especially high speed turbulence combustion and acoustics, demand high order schemes with adaptive numerical dissipation controls. Standard high resolution shock-capturing methods are too dissipative to capture the small scales and/or long-time wave propagations without extreme grid refinements and small time steps. An integrated approach for the control of numerical dissipation in high order schemes for the compressible Euler and Navier-Stokes equations has been developed and verified by the authors and collaborators. These schemes are suitable for the problems in question. Basically, the scheme consists of sixth-order or higher non-dissipative spatial difference operators as the base scheme. To control the amount of numerical dissipation, multiresolution wavelets are used as sensors to adaptively limit the amount and to aid the selection and/or blending of the appropriate types of numerical dissipation to be used. Magnetohydrodynamics (MHD) waves play a key role in drag reduction in highly maneuverable high speed combat aircraft, in space weather forecasting, and in the understanding of the dynamics of the evolution of our solar system and the main sequence stars. Although there exist a few well-studied second and third-order high-resolution shock-capturing schemes for the MHD in the literature, these schemes are too diffusive and not practical for turbulence/combustion MHD flows. On the other hand, extension of higher than third-order high-resolution schemes to the MHD system of equations is not straightforward. Unlike the hydrodynamic equations, the inviscid MHD system is non-strictly hyperbolic with non-convex fluxes. The wave structures and shock types are different from their hydrodynamic counterparts. Many of the non-traditional hydrodynamic shocks are not fully understood. 
Consequently, reliable and highly accurate numerical schemes for multiscale MHD equations pose a great challenge to algorithm development. In addition, controlling the numerical error of the divergence free condition of the magnetic fields for high order methods has been a stumbling block. Lower order methods are not practical for the astrophysical problems in question. We propose to extend our hydrodynamics schemes to the MHD equations with several desired properties over commonly used MHD schemes.
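The wavelet-sensor idea described above can be loosely illustrated in one dimension. The sketch below is a toy, not the authors' scheme: a Harten-style multiresolution detail coefficient, computed as the deviation of each cell value from interpolation of its neighbors, flags non-smooth cells, so that artificial dissipation could be confined to flagged regions while smooth waves are left untouched. The tolerance and test profile are arbitrary choices.

```python
import numpy as np

def multiresolution_sensor(u, tol):
    """Flag non-smooth cells via a first-level multiresolution detail:
    the deviation of each value from the midpoint average of its neighbors
    is small for smooth data and O(jump) at a discontinuity."""
    d = np.abs(u[1:-1] - 0.5 * (u[:-2] + u[2:]))  # detail coefficients
    flag = np.zeros(u.shape, dtype=bool)
    flag[1:-1] = d > tol
    return flag

# Test profile: a discontinuity plus a small-amplitude smooth wave that
# the sensor should NOT flag (so the wave is not artificially damped).
x = np.linspace(0.0, 1.0, 201)
u = np.where(x < 0.5, 1.0, 0.0) + 0.01 * np.sin(20 * np.pi * x)
flag = multiresolution_sensor(u, tol=0.05)
print(flag.sum())  # only the cells adjacent to the jump are flagged
```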
2008-09-30
Nonlinear Internal Tide Generation at the Luzon Strait: Integrating Laboratory Data with Numerics and... laboratory experimental techniques have greatly enhanced the ability to obtain detailed spatiotemporal data for internal waves in challenging regimes... a custom configured wave tank; and to integrate these results with data obtained from numerical simulations, theory and field studies. The principal
MHD stagnation-point flow over a nonlinearly shrinking sheet with suction effect
NASA Astrophysics Data System (ADS)
Awaludin, Izyan Syazana; Ahmad, Rokiah; Ishak, Anuar
2018-04-01
The stagnation-point flow over a shrinking permeable sheet in the presence of a magnetic field is investigated numerically in this paper. The governing partial differential equations are transformed into a nonlinear ordinary differential equation using a similarity transformation and solved numerically with the boundary value problem solver bvp4c in MATLAB. It is found that dual solutions exist for a certain range of the shrinking strength.
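The same solution strategy can be sketched with SciPy's solve_bvp standing in for MATLAB's bvp4c. The similarity equation and parameter values below are generic illustrative choices for an MHD stagnation-point flow with suction and a shrinking wall, not the equations or values from this paper:

```python
import numpy as np
from scipy.integrate import solve_bvp

# Generic MHD stagnation-point similarity problem (illustrative form only):
#   f''' + f f'' - f'^2 + M^2 (1 - f') + 1 = 0
#   f(0) = s (suction), f'(0) = lam (shrinking for lam < 0), f'(inf) -> 1
M2, s, lam, eta_max = 1.0, 2.0, -0.5, 10.0

def ode(eta, y):                         # y = [f, f', f'']
    f, fp, fpp = y
    return np.vstack([fp, fpp, -(f * fpp - fp ** 2 + M2 * (1.0 - fp) + 1.0)])

def bc(ya, yb):
    return np.array([ya[0] - s, ya[1] - lam, yb[1] - 1.0])

eta = np.linspace(0.0, eta_max, 200)
y0 = np.zeros((3, eta.size))
y0[0] = s + eta + np.exp(-eta) - 1.0     # guess consistent with f' below
y0[1] = 1.0 - np.exp(-eta)               # satisfies the far-field condition
y0[2] = np.exp(-eta)
sol = solve_bvp(ode, bc, eta, y0)
print(sol.success, sol.y[2, 0])          # convergence flag and f''(0)
```

Dual solutions, where they exist, would be picked out by supplying different initial guesses to the solver, which is also how they are typically isolated with bvp4c.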
Overcoming Challenges of the Technological Age by Teaching Information Literacy Skills
ERIC Educational Resources Information Center
Burke, Melynda
2010-01-01
The technological age has forever altered every aspect of life and work. Technology has changed how people locate and view information. However, the transition from print to electronic formats has created numerous challenges for individuals to overcome. These challenges include coping with the massive amounts of information bombarding people and…
Powering the Future: A Wind Turbine Design Challenge
ERIC Educational Resources Information Center
Pries, Caitlin Hicks; Hughes, Julie
2011-01-01
Nothing brings out the best in eighth-grade physical science students quite like an engineering challenge. The wind turbine design challenge described in this article has proved to be a favorite among students with its focus on teamwork and creativity and its (almost) sneaky reinforcement of numerous physics concepts. For this activity, pairs of…
ERIC Educational Resources Information Center
Hare, Kathleen A.; Dubé, Anik; Marshall, Zack; Gahagan, Jacqueline; Harris, Gregory E.; Tucker, Maryanne; Dykeman, Margaret; MacDonald, Jo-Ann
2016-01-01
Policy scoping reviews are an effective method for generating evidence-informed policies. However, when applying guiding methodological frameworks to complex policy evidence, numerous, unexpected challenges can emerge. This paper details five challenges experienced and addressed by a policy trainee-led, multi-disciplinary research team, while…
Henriques, C; Garnett, K; Weatherhead, E K; Lickorish, F A; Forrow, D; Delgado, J
2015-04-15
Society gets numerous benefits from the water environment. It is crucial to ensure that water management practices deliver these benefits over the long-term in a sustainable and cost-effective way. Currently, hydromorphological alterations and nutrient enrichment pose the greatest challenges in European water bodies. The rapidly changing climatic and socio-economic boundary conditions pose further challenges to water management decisions and the achievement of policy goals. Scenarios are a strategic tool useful in conducting systematic investigations of future uncertainties pertaining to water management. In this study, the use of scenarios revealed water management challenges for England and Wales to 2050. A set of existing scenarios relevant to river basin management were elaborated through stakeholder workshops and interviews, relying on expert knowledge to identify drivers of change, their interdependencies, and influence on system dynamics. In a set of four plausible alternative futures, the causal chain from driving forces through pressures to states, impacts and responses (DPSIR framework) was explored. The findings suggest that scenarios driven by short-term economic growth and competitiveness undermine current environmental legislative requirements and exacerbate the negative impacts of climate change, producing a general deterioration of water quality and physical habitats, as well as reduced water availability with adverse implications for the environment, society and economy. Conversely, there are substantial environmental improvements under the scenarios characterised by long-term sustainability, though achieving currently desired environmental outcomes still poses challenges. The impacts vary across contrasting generic catchment types that exhibit distinct future water management challenges. 
The findings suggest the need to address hydromorphological alterations, nutrient enrichment and nitrates in drinking water, which are all likely to be exacerbated in the future. Future-proofing river basin management measures that deal with these challenges is crucial moving forward. The use of scenarios to future-proof strategy, policy and delivery mechanisms is discussed to inform next steps. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Roubinet, D.; Russian, A.; Dentz, M.; Gouze, P.
2017-12-01
Characterizing and modeling hydrodynamic reactive transport in fractured rock are critical challenges for various research fields and applications including environmental remediation, geological storage, and energy production. To this end, we consider a recently developed time domain random walk (TDRW) approach, which is adapted to reproduce anomalous transport behaviors and capture heterogeneous structural and physical properties. This method is also very well suited to optimize numerical simulations by memory-shared massive parallelization and provide numerical results at various scales. So far, the TDRW approach has been applied for modeling advective-diffusive transport with mass transfer between mobile and immobile regions and simple (theoretical) reactions in heterogeneous porous media represented as single continuum domains. We extend this approach to dual-continuum representations considering a highly permeable fracture network embedded into a poorly permeable rock matrix with heterogeneous geochemical reactions occurring in both geological structures. The resulting numerical model enables us to extend the range of the modeled heterogeneity scales with an accurate representation of solute transport processes and no assumption on the Fickianity of these processes. The proposed model is compared to existing particle-based methods that are usually used to model reactive transport in fractured rocks assuming a homogeneous surrounding matrix, and is used to evaluate the impact of the matrix heterogeneity on the apparent reaction rates for different 2D and 3D simple-to-complex fracture network configurations.
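As a minimal sketch of the TDRW idea (a single-continuum 1D toy, not the dual-continuum fracture-matrix model described above; rates and geometry are hypothetical): a particle jumps between lattice cells with advection-weighted probabilities, and the time spent in each cell is drawn from an exponential distribution set by the local transition rates, so time advances in variable increments rather than fixed steps.

```python
import numpy as np

def tdrw_1d(ncells=100, nparticles=500, v=0.5, D=1.0, dx=1.0, seed=1):
    """Toy 1D time-domain random walk for advective-dispersive transport:
    jump probabilities and mean residence times come from finite-volume
    transition rates; returns the first-exit time of each particle from
    the domain (either boundary)."""
    rng = np.random.default_rng(seed)
    b_right = D / dx ** 2 + v / (2.0 * dx)   # rate of jumping right
    b_left = D / dx ** 2 - v / (2.0 * dx)    # rate of jumping left
    b_tot = b_right + b_left
    p_right = b_right / b_tot
    times = np.empty(nparticles)
    for k in range(nparticles):
        x, t = 0, 0.0
        while 0 <= x < ncells:
            t += rng.exponential(1.0 / b_tot)          # residence time in cell
            x += 1 if rng.random() < p_right else -1   # advection-biased jump
        times[k] = t
    return times

times = tdrw_1d()
print(times.mean())  # mean first-exit time, a breakthrough-like statistic
```

Heterogeneity enters naturally in this framework by making the rates cell-dependent, which is what makes the TDRW approach attractive for fractured and multi-continuum media.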
Challenges to Applying a Metamodel for Groundwater Flow Beyond Underlying Numerical Model Boundaries
NASA Astrophysics Data System (ADS)
Reeves, H. W.; Fienen, M. N.; Feinstein, D.
2015-12-01
Metamodels of environmental behavior offer opportunities for decision support, adaptive management, and increased stakeholder engagement through participatory modeling and model exploration. Metamodels are derived from calibrated, computationally demanding, numerical models. They may potentially be applied to non-modeled areas to provide screening or preliminary analysis tools for areas that do not yet have the benefit of more comprehensive study. In this decision-support mode, they may be fulfilling a role often accomplished by application of analytical solutions. The major challenge to transferring a metamodel to a non-modeled area is how to quantify the spatial data in the new area of interest in such a way that it is consistent with the data used to derive the metamodel. Tests based on transferring a metamodel derived from a numerical groundwater-flow model of the Lake Michigan Basin to other glacial settings across the northern U.S. show that the spatial scale of the numerical model must be appropriately scaled to adequately represent different settings. Careful GIS analysis of the numerical model, metamodel, and new area of interest is required for successful transfer of results.
Carrell, David S; Schoen, Robert E; Leffler, Daniel A; Morris, Michele; Rose, Sherri; Baer, Andrew; Crockett, Seth D; Gourevitch, Rebecca A; Dean, Katie M; Mehrotra, Ateev
2017-09-01
Widespread application of clinical natural language processing (NLP) systems requires taking existing NLP systems and adapting them to diverse and heterogeneous settings. We describe the challenges faced and lessons learned in adapting an existing NLP system for measuring colonoscopy quality. We used colonoscopy and pathology reports from 4 settings during 2013-2015, varying by geographic location, practice type, compensation structure, and electronic health record. Though successful, adaptation required considerably more time and effort than anticipated. Typical NLP challenges in assembling corpora, diverse report structures, and idiosyncratic linguistic content were greatly magnified. Strategies for addressing adaptation challenges include assessing site-specific diversity, setting realistic timelines, leveraging local electronic health record expertise, and undertaking extensive iterative development. More research is needed on how to make it easier to adapt NLP systems to new clinical settings. A key challenge in widespread application of NLP is adapting existing systems to new clinical settings. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
Antarctic subglacial lake exploration: first results and future plans
Siegert, Martin J.; Priscu, John C.; Wadham, Jemma L.; Lyons, W. Berry
2016-01-01
After more than a decade of planning, three attempts were made in 2012–2013 to access, measure in situ properties and directly sample subglacial Antarctic lake environments. First, Russian scientists drilled into the top of Lake Vostok, allowing lake water to infiltrate, and freeze within, the lower part of the ice-core borehole, from which further coring would recover a frozen sample of surface lake water. Second, UK engineers tried unsuccessfully to deploy a clean-access hot-water drill, to sample the water column and sediments of subglacial Lake Ellsworth. Third, a US mission successfully drilled cleanly into subglacial Lake Whillans, a shallow hydraulically active lake at the coastal margin of West Antarctica, obtaining samples that would later be used to prove the existence of microbial life and active biogeochemical cycling beneath the ice sheet. This article summarizes the results of these programmes in terms of the scientific results obtained, the operational knowledge gained and the engineering challenges revealed, to collate what is known about Antarctic subglacial environments and how to explore them in future. While results from Lake Whillans testify to subglacial lakes as being viable biological habitats, the engineering challenges to explore deeper more isolated lakes where unique microorganisms and climate records may be found, as exemplified in the Lake Ellsworth and Vostok missions, are considerable. Through international cooperation, and by using equipment and knowledge of the existing subglacial lake exploration programmes, it is possible that such environments could be explored thoroughly, and at numerous sites, in the near future. PMID:26667917
Solberg Nes, Lise; Ehlers, Shawna L; Patten, Christi A; Gastineau, Dennis A
2013-03-01
Hematopoietic stem cell transplantation (HSCT) is an intensive cancer therapy entailing numerous physical, emotional, cognitive, and practical challenges. Patients' ability to adjust and cope with such challenges may depend on their ability to exert control over cognitive, emotional, and behavioral processes, that is, ability to self-regulate. Self-regulatory capacity is a limited resource that can be depleted or fatigued (i.e., "self-regulatory fatigue"), particularly in the context of stressful life events such as cancer diagnosis and treatment. This is one of the first studies to examine self-regulatory fatigue in a cancer population. The current study aimed to (1) extract items for a specific scale of self-regulatory capacity and (2) examine the impact of such capacity on adaptation in patients with hematologic malignancies preparing for HSCT. Factor analysis of four existing scales gauging psychological adjustment and well-being in 314 patients preparing for HSCT (63% male and 89% Caucasian) identified 23 items (α = 0.85) related to self-regulatory control or fatigue. This measure was then examined using existing clinical data obtained from 178 patients (57% male and 91% Caucasian) undergoing treatment for hematologic malignancies in relationship to quality of life, coping, and self-reported adherence to physicians' recommendations. Controlling for pain severity, physical fatigue, and depression, self-regulatory fatigue scores were incrementally associated with decreased quality of life, use of avoidance coping strategies, and decreased adherence to physicians' recommendations. These results emphasize the potential role of self-regulatory capacity in coping with and adjusting to hematologic cancers and future research is warranted.
NASA Astrophysics Data System (ADS)
Peysson, Y.; Bonoli, P. T.; Chen, J.; Garofalo, A.; Hillairet, J.; Li, M.; Qian, J.; Shiraiwa, S.; Decker, J.; Ding, B. J.; Ekedahl, A.; Goniche, M.; Zhai, X.
2017-10-01
The Lower Hybrid (LH) wave is widely used in existing tokamaks for tailoring the current density profile or extending pulse duration to steady-state regimes. Its high efficiency makes it particularly attractive for a fusion reactor, which has led to its consideration for this purpose in the ITER tokamak. Nevertheless, while the basics of the LH wave in tokamak plasmas are well known, quantitative modeling of experimental observations based on first principles remains a highly challenging exercise, despite the considerable numerical effort invested so far. In this context, a rigorous methodology must be applied in the simulations to identify the minimum number of physical mechanisms that must be considered to reproduce experimental shot-to-shot observations and also scalings (density, power spectrum). Based on recent simulations carried out for the EAST, Alcator C-Mod and Tore Supra tokamaks, the state of the art in LH modeling is reviewed. The capability of fast electron bremsstrahlung, internal inductance li and LH driven current at zero loop voltage to jointly constrain LH simulations is discussed, as well as the need for further improvements (diagnostics, codes, LH model) for robust interpretative and predictive simulations.
Tubular filamentation for laser material processing
Xie, Chen; Jukna, Vytautas; Milián, Carles; Giust, Remo; Ouadghiri-Idrissi, Ismail; Itina, Tatiana; Dudley, John M.; Couairon, Arnaud; Courvoisier, Francois
2015-01-01
An open challenge in the important field of femtosecond laser material processing is the controlled internal structuring of dielectric materials. Although the availability of high energy high repetition rate femtosecond lasers has led to many advances in this field, writing structures within transparent dielectrics at intensities exceeding 10^13 W/cm^2 has remained difficult as it is associated with significant nonlinear spatial distortion. This letter reports the existence of a new propagation regime for femtosecond pulses at high power that overcomes this challenge, associated with the generation of a hollow uniform and intense light tube that remains propagation invariant even at intensities associated with dense plasma formation. This regime is seeded from higher order nondiffracting Bessel beams, which carry an optical vortex charge. Numerical simulations are quantitatively confirmed by experiments where a novel experimental approach allows direct imaging of the 3D fluence distribution within transparent solids. We also analyze the transitions to other propagation regimes in near and far fields. We demonstrate how the generation of plasma in this tubular geometry can lead to applications in ultrafast laser material processing in terms of single shot index writing, and discuss how it opens important perspectives for material compression and filamentation guiding in atmosphere. PMID:25753215
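The hollow tube geometry follows from the stated seed, a higher-order Bessel beam carrying an optical vortex. A minimal sketch of the idealized transverse profile (the vortex charge and radial wavenumber are arbitrary choices, not values from the paper):

```python
import numpy as np
from scipy.special import jv

# Transverse intensity of an ideal higher-order Bessel beam with vortex
# charge n, E(r, phi) ~ J_n(k_r r) exp(i n phi): for n >= 1 the on-axis
# field vanishes, leaving a bright ring, i.e. a hollow light "tube".
n, k_r = 1, 5.0                          # vortex charge, radial wavenumber
r = np.linspace(0.0, 3.0, 400)
intensity = jv(n, k_r * r) ** 2
r_ring = r[np.argmax(intensity)]         # radius of the bright ring
print(intensity[0], r_ring)              # zero on axis; ring at k_r*r ~ 1.84
```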
Aging in the Republic of Bulgaria.
Pitheckoff, Natalie
2017-10-01
Bulgaria, a southeastern European nation with 7.1 million inhabitants, is ranked 4th in the world for its rate of population aging. Bulgaria has one of the highest proportions of older adults in the world with approximately 20% aged 65 and older. Three main demographic factors have led to rapid population aging. These include emigration, high death rates, and low birth rates. This "perfect storm" of demographic factors has created numerous political, social, and economic challenges for Bulgaria. For example, informal support of older adults is declining as younger generations move abroad or to urban areas for greater employment opportunities. This has increased the need for formal long-term services and supports, which can be at odds with traditional values. Additionally, economic sustainability is a major concern for the nation as population aging and de-population continues. Few gerontological organizations, scholars, or secondary datasets exist in the country. To address these challenges, more research on aging is needed to encourage economic renewal, healthy aging policies, and long-term services and supports. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-28
... ``Numerical Guidelines Applicable to Volatile Market Opens'' with a new paragraph, entitled ``Individual Stock... to eliminate the ability of the Exchange to deviate from the Numerical Guidelines contained in... existing paragraph (c)(2), which provides flexibility to the Exchange to use different Numerical Guidelines...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-06
... replacing existing paragraph (a)(2)(C)(iv) of Rule 3312, entitled ``Numerical Guidelines Applicable to... Exchange to deviate from the Numerical Guidelines contained in paragraph (a)(2)(C)(i) when deciding which... Numerical Guidelines or Reference Prices in various ``Unusual Circumstances.'' The Exchange proposes to...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-28
... proposes replacing existing paragraph (c)(4) of Rule 128, entitled ``Numerical Guidelines Applicable to... the Exchange to deviate from the Numerical Guidelines contained in paragraph (c)(1) (other than under...)(2), which provides flexibility to the Exchange to use different Numerical Guidelines or Reference...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-28
....'' Second, NASDAQ replacing existing paragraph (C)(4) of Rule 11890, entitled ``Numerical Guidelines... NASDAQ to deviate from the Numerical Guidelines contained in paragraph (C)(1) (other than under limited... provides flexibility to NASDAQ to use different Numerical Guidelines or Reference Prices in various...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-29
... proposes replacing existing paragraph (c)(4) of Rule 7.10, entitled ``Numerical Guidelines Applicable to... the Exchange to deviate from the Numerical Guidelines contained in paragraph (c)(1) (other than under...)(2), which provides flexibility to the Exchange to use different Numerical Guidelines or Reference...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-28
... Exchange replacing existing paragraph (c)(4) of Rule 11.19, entitled ``Numerical Guidelines Applicable to... the Exchange to deviate from the Numerical Guidelines contained in paragraph (c)(1) (other than under... provides flexibility to the Exchange to use different Numerical Guidelines or Reference Prices in various...
Aeroelasticity Benchmark Assessment: Subsonic Fixed Wing Program
NASA Technical Reports Server (NTRS)
Florance, Jennifer P.; Chwalowski, Pawel; Wieseman, Carol D.
2010-01-01
The fundamental technical challenge in computational aeroelasticity is the accurate prediction of unsteady aerodynamic phenomena and the effect on the aeroelastic response of a vehicle. Currently, a benchmarking standard for use in validating the accuracy of computational aeroelasticity codes does not exist. Many aeroelastic data sets have been obtained in wind-tunnel and flight testing throughout the world; however, none have been globally presented or accepted as an ideal data set. There are numerous reasons for this. One reason is that often, such aeroelastic data sets focus on the aeroelastic phenomena alone (flutter, for example) and do not contain associated information such as unsteady pressures and time-correlated structural dynamic deflections. Other available data sets focus solely on the unsteady pressures and do not address the aeroelastic phenomena. Other discrepancies can include omission of relevant data, such as flutter frequency and/or the acquisition of only qualitative deflection data. In addition to these content deficiencies, all of the available data sets present both experimental and computational technical challenges. Experimental issues include facility influences, nonlinearities beyond those being modeled, and data processing. From the computational perspective, technical challenges include modeling geometric complexities, coupling between the flow and the structure, grid issues, and boundary conditions. The Aeroelasticity Benchmark Assessment task seeks to examine the existing potential experimental data sets and ultimately choose the one that is viewed as the most suitable for computational benchmarking. An initial computational evaluation of that configuration will then be performed using the Langley-developed computational fluid dynamics (CFD) software FUN3D as part of its code validation process. In addition to the benchmarking activity, this task also includes an examination of future research directions.
Researchers within the Aeroelasticity Branch will examine other experimental efforts within the Subsonic Fixed Wing (SFW) program (such as testing of the NASA Common Research Model (CRM)) and other NASA programs and assess aeroelasticity issues and research topics.
Asteroid Crewed Segment Mission Lean Development
NASA Technical Reports Server (NTRS)
Gard, Joe; McDonald, Mark; Jermstad, Wayne
2014-01-01
The next generation of human spaceflight missions presents numerous challenges to designers that must be addressed to produce a feasible concept. The specific challenges of designing an exploration mission utilizing the Space Launch System and the Orion spacecraft to carry astronauts beyond Earth orbit to explore an asteroid stored in a distant retrograde orbit around the Moon will be addressed. Mission designers must carefully balance competing constraints including cost, schedule, risk, and numerous spacecraft performance metrics including launch mass, nominal landed mass, abort landed mass, mission duration, consumable limits, and many others. The Asteroid Redirect Crewed Mission will be described along with results from the concurrent mission design trades that led to its formulation. While the trades presented are specific to this mission, the integrated process is applicable to any potential future mission. The following trades were critical in the mission formulation and will be described in detail: 1) crew size, 2) mission duration, 3) trajectory design, 4) docking vs. grappling, 5) extravehicular activity tasks, 6) launch mass and integrated vehicle performance, 7) contingency performance, 8) crew consumables including food, clothing, oxygen, nitrogen, and water, and 9) mission risk. The additional Orion functionality required to perform the Asteroid Redirect Crewed Mission, and how it is incorporated while minimizing cost, schedule, and mass impacts, will be identified. Existing investments in the NASA technology portfolio were leveraged to provide the added functionality that will be beneficial to future exploration missions. Mission kits are utilized to augment Orion with the necessary functionality without introducing costly new requirements to the mature Orion spacecraft design effort. The Asteroid Redirect Crewed Mission provides an exciting early mission for Orion and SLS while providing a stepping stone to even more ambitious missions in the future.
Xiao, Li; Luo, Ray
2017-12-07
We explored a multi-scale algorithm for the Poisson-Boltzmann continuum solvent model for more robust simulations of biomolecules. In this method, the continuum solvent/solute interface is explicitly simulated with a numerical fluid dynamics procedure, which is tightly coupled to the solute molecular dynamics simulation. Adopting such a strategy offers multiple benefits, as presented below. At this stage of the development, only nonelectrostatic interactions, i.e., van der Waals and hydrophobic interactions, are included in the algorithm to assess the quality of the solvent-solute interface generated by the new method. Nevertheless, numerical challenges exist in accurately interpolating the highly nonlinear van der Waals term when solving the finite-difference fluid dynamics equations. We were able to bypass the challenge rigorously by merging the van der Waals potential and pressure together when solving the fluid dynamics equations and by considering its contribution in the free-boundary condition analytically. The multi-scale simulation method was first validated by reproducing the solute-solvent interface of a single atom against the analytical solution. Next, we performed the relaxation simulation of a restrained symmetrical monomer and observed a symmetrical solvent interface at equilibrium with detailed surface features resembling those found on the solvent excluded surface. Four typical small molecular complexes were then tested, with both volume and force balancing analyses showing that these simple complexes can reach equilibrium within the simulation time window. Finally, we studied the quality of the multi-scale solute-solvent interfaces for the four tested dimer complexes and found that they agree well with the boundaries as sampled in the explicit water simulations.
Nam, Vu Thanh; van Kuijk, Marijke; Anten, Niels P R
2016-01-01
Allometric regression models are widely used to estimate tropical forest biomass, but balancing model accuracy with efficiency of implementation remains a major challenge. In addition, while numerous models exist for aboveground mass, very few exist for roots. We developed allometric equations for aboveground biomass (AGB) and root biomass (RB) based on 300 (of 45 species) and 40 (of 25 species) sample trees respectively, in an evergreen forest in Vietnam. The biomass estimations from these local models were compared to regional and pan-tropical models. For AGB we also compared local models that distinguish functional types to an aggregated model, to assess the degree of specificity needed in local models. Besides diameter at breast height (DBH) and tree height (H), wood density (WD) was found to be an important parameter in AGB models. Existing pan-tropical models resulted in up to 27% higher estimates of AGB, and overestimated RB by nearly 150%, indicating the greater accuracy of local models at the plot level. Our functional group aggregated local model which combined data for all species, was as accurate in estimating AGB as functional type specific models, indicating that a local aggregated model is the best choice for predicting plot level AGB in tropical forests. Finally our study presents the first allometric biomass models for aboveground and root biomass in forests in Vietnam.
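The model form described above, with AGB driven by DBH, H, and WD, can be sketched as a power-law allometry fitted by ordinary least squares on log-transformed data. This is a generic illustration: the coefficients, tree data, and noise level below are invented placeholders, not the values fitted in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "sample trees": DBH (cm), height H (m), wood density WD (g/cm^3).
n = 300
dbh = rng.uniform(5.0, 60.0, n)
h = 1.5 * dbh ** 0.6                      # crude height-diameter relation
wd = rng.uniform(0.4, 0.8, n)

# Assumed power-law allometry AGB = a * (WD * DBH^2 * H)^b; the
# coefficients are placeholders, not the study's fitted values.
a_true, b_true = 0.06, 0.95
agb = a_true * (wd * dbh**2 * h) ** b_true * rng.lognormal(0.0, 0.05, n)

# Fit by OLS on the log-transformed model:
#   log(AGB) = log(a) + b * log(WD * DBH^2 * H)
x = np.log(wd * dbh**2 * h)
y = np.log(agb)
b_hat, log_a_hat = np.polyfit(x, y, 1)
a_hat = np.exp(log_a_hat)
print(a_hat, b_hat)
```

The log-log fit is the standard route for such models because the multiplicative error structure of biomass data becomes additive after the transform.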
Generalized Redistribute-to-the-Right Algorithm: Application to the Analysis of Censored Cost Data
CHEN, SHUAI; ZHAO, HONGWEI
2013-01-01
Medical cost estimation is a challenging task when censoring of data is present. Although researchers have proposed methods for estimating mean costs, these are often derived from theory and are not always easy to understand. We provide an alternative method, based on a replace-from-the-right algorithm, for estimating mean costs more efficiently. We show that our estimator is equivalent to an existing one that is based on the inverse probability weighting principle and semiparametric efficiency theory. We also propose an alternative method for estimating the survival function of costs, based on the redistribute-to-the-right algorithm, that was originally used for explaining the Kaplan–Meier estimator. We show that this second proposed estimator is equivalent to a simple weighted survival estimator of costs. Finally, we develop a more efficient survival estimator of costs, using the same redistribute-to-the-right principle. This estimator is naturally monotone, more efficient than some existing survival estimators, and has a quite small bias in many realistic settings. We conduct numerical studies to examine the finite sample property of the survival estimators for costs, and show that our new estimator has small mean squared errors when the sample size is not too large. We apply both existing and new estimators to a data example from a randomized cardiovascular clinical trial. PMID:24403869
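The redistribute-to-the-right principle underlying the survival estimators can be illustrated on plain survival data: every observation starts with mass 1/n, and each censored observation passes its mass equally to the observations on its right; the resulting weights reproduce the Kaplan-Meier estimator. A minimal sketch with made-up times, assuming no ties between death and censoring times:

```python
import numpy as np

def rtr_weights(times, events):
    """Redistribute-to-the-right: every observation starts with mass 1/n;
    the mass of each censored observation (event == 0) is passed equally
    to all observations strictly to its right. Assumes no ties between
    death and censoring times."""
    order = np.argsort(times, kind="stable")
    t, d = np.asarray(times, float)[order], np.asarray(events)[order]
    n = len(t)
    w = np.full(n, 1.0 / n)
    for i in range(n):
        if d[i] == 0 and i < n - 1:
            w[i + 1:] += w[i] / (n - i - 1)
            w[i] = 0.0
    return t, d, w

def km_survival(times, events, at):
    """Product-limit (Kaplan-Meier) survival estimate S(at)."""
    times, events = np.asarray(times, float), np.asarray(events)
    s = 1.0
    for u in np.unique(times[events == 1]):
        if u <= at:
            s *= 1.0 - np.sum((times == u) & (events == 1)) / np.sum(times >= u)
    return s

times = np.array([2.0, 3.0, 4.0, 5.0, 7.0, 8.0])
events = np.array([1, 0, 1, 0, 1, 1])         # 1 = death, 0 = censored
t, d, w = rtr_weights(times, events)

# Survival at time x = total redistributed mass strictly to the right of x.
x0 = 4.5
s_rtr = w[t > x0].sum()
print(s_rtr, km_survival(times, events, x0))  # both 0.625 for this data
```

The equivalence shown here is the classical one the abstract builds on; the paper's cost estimators apply the same redistribution idea to censored cost accumulations rather than to survival times alone.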
ERIC Educational Resources Information Center
De Silva, Nilani Ljunggren
2013-01-01
The question of inclusive education is not straightforward. Despite all its good intentions, inclusive education in practice faces numerous challenges today. This study analyses these challenges in the Swedish special education context. The author explores special educators' experiences, possibilities and challenges when applying inclusive…
Challenges to Women's Participation in Senior Administrative Positions in Iranian Higher Education
ERIC Educational Resources Information Center
Mohajeri, Bahieh; Mousavi, Farah
2017-01-01
In the last three decades, growth in the education of women in Iran has led to a significant increase in demand for women professionals and administrators in Iranian universities. However, the path to the top is not easy and numerous challenges must still be overcome. This study explored the challenges of women's participation in senior…
NASA Astrophysics Data System (ADS)
Mudunuru, M. K.; Shabouei, M.; Nakshatrala, K.
2015-12-01
Advection-diffusion-reaction (ADR) equations appear in various areas of the life sciences, hydrogeological systems, and contaminant transport. Obtaining stable and accurate numerical solutions can be challenging as the underlying equations are coupled, nonlinear, and non-self-adjoint. Currently, no robust computational framework or reliable commercial package is known that can handle such complex situations. Herein, we present a novel locally conservative non-negative finite element formulation that preserves the underlying physical and mathematical properties of a general linear transient anisotropic ADR equation. In the continuous setting, governing equations for ADR systems possess various important properties. In general, not all of these properties are inherited during finite difference, finite volume, and finite element discretizations. The objective of this poster presentation is twofold. First, we analyze whether existing numerical formulations (such as SUPG and GLS) and commercial packages provide physically meaningful values for the concentration of chemical species for various realistic benchmark problems. We also quantify the errors incurred in satisfying the local and global species balance for two popular chemical kinetics schemes: CDIMA (chlorine dioxide-iodine-malonic acid) and BZ (Belousov-Zhabotinsky). Based on these numerical simulations, we show that SUPG and GLS produce unphysical values for the concentration of chemical species due to violation of the non-negative constraint, contain spurious node-to-node oscillations, and have large errors in local and global species balance. Second, we propose a novel finite element formulation to overcome the above difficulties. The proposed locally conservative non-negative computational framework, based on low-order least-squares finite elements, is able to preserve these underlying physical and mathematical properties.
Several representative numerical examples are discussed to illustrate the importance of the proposed numerical formulations to accurately describe various aspects of mixing process in chaotic flows and to simulate transport in highly heterogeneous anisotropic media.
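The kind of unphysical behavior attributed to standard formulations can be reproduced in miniature: a central-difference (Galerkin-like) discretization of a 1-D steady advection-diffusion problem develops node-to-node oscillations and negative "concentrations" once the cell Peclet number exceeds 2. A minimal sketch with illustrative parameter values:

```python
import numpy as np

# 1-D steady advection-diffusion v*u' = D*u'' on (0,1), u(0)=0, u(1)=1.
# A central-difference (Galerkin-like) scheme at cell Peclet number
# Pe = v*h/D > 2 yields spurious node-to-node oscillations and negative
# values, the pathology described above. Parameters are illustrative.
v, D = 1.0, 0.01
n = 19                                  # interior nodes
h = 1.0 / (n + 1)
pe = v * h / D                          # cell Peclet number (= 5 here)

lo = -D / h**2 - v / (2 * h)            # coefficient of u_{i-1}
di = 2 * D / h**2                       # coefficient of u_i
hi = -D / h**2 + v / (2 * h)            # coefficient of u_{i+1}
A = np.zeros((n, n))
for i in range(n):
    A[i, i] = di
    if i > 0:
        A[i, i - 1] = lo
    if i < n - 1:
        A[i, i + 1] = hi
b = np.zeros(n)
b[-1] = -hi * 1.0                       # boundary value u(1) = 1
u = np.linalg.solve(A, b)
print(pe, u.min())                      # u.min() < 0: unphysical undershoot
```

The exact solution of this boundary-value problem is monotone between 0 and 1, so any negative nodal value is a discretization artifact, the analogue of the negative concentrations the poster quantifies for SUPG and GLS.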
A case study of global health at the university: implications for research and action
Pinto, Andrew D.; Cole, Donald C.; ter Kuile, Aleida; Forman, Lisa; Rouleau, Katherine; Philpott, Jane; Pakes, Barry; Jackson, Suzanne; Muntaner, Carles
2014-01-01
Background: Global health is increasingly a major focus of institutions in high-income countries. However, little work has been done to date to study the inner workings of global health at the university level. Academics may have competing objectives, with few mechanisms to coordinate efforts and pool resources. Objective: To conduct a case study of global health at Canada's largest health sciences university and to examine how its internal organization influences research and action. Design: We drew on existing inventories, annual reports, and websites to create an institutional map, identifying centers and departments using the terms ‘global health’ or ‘international health’ to describe their activities. We compiled a list of academics who self-identified as working in global or international health. We purposively sampled persons in leadership positions as key informants. One investigator carried out confidential, semi-structured interviews with 20 key informants. Interview notes were returned to participants for verification and then analyzed thematically by pairs of coders. Synthesis was conducted jointly. Results: More than 100 academics were identified as working in global health, situated in numerous institutions, centers, and departments. Global health academics interviewed shared a common sense of what global health means and the values that underpin such work. Most academics interviewed expressed frustration at the existing fragmentation and the lack of strategic direction, financial support, and recognition from the university. This hampered collaborative work and projects to tackle global health problems. Conclusions: The University of Toronto is not exceptional in facing such challenges, and our findings align with existing literature that describes factors that inhibit collaboration in global health work at universities. Global health academics based at universities may work in institutional siloes and this limits both internal and external collaboration.
A number of solutions to address these challenges are proposed. PMID:25172428
Schafheutle, Ellen Ingrid; Hassell, Karen; Noyce, Peter R
2013-01-01
Revalidation is about assuring that health practitioners remain up to date and fit to practice, and demonstrating that they continue to meet the requirements of their professional regulator. To critically discuss issues that need to be considered when designing a system of revalidation for pharmacy professionals. Although providing international context, the article focuses in particular on Great Britain (GB), where both pharmacists (Phs) and pharmacy technicians (PTs) are regulated. Following a brief historical overview, the article draws on emerging evidence in context. Revalidation may involve discrete periodic assessment or a continuous process of assessment against clearly identified standards. The evolving scope of pharmacy practice involves increasingly clinical roles and also practitioners in nonpatient-facing roles. The potential risk to patients and the public may require consideration. Although revalidation, or systems for recertification/relicensure, exist in numerous jurisdictions, most center on the collection of continuing education credits; continuous professional development and reflective practice are increasingly found. Revalidation may involve assessment of other sources, such as appraisals or monitoring visits. Existing revalidation systems are coordinated centrally, but particularly in larger jurisdictions, like GB, where approximately 67,000 pharmacy professionals are regulated, some responsibility may need to be devolved. This would require engagement with employers and contracting organizations to ensure suitability and consistency. Existing systems, such as company appraisals, are unfit for the assessment of fitness to practice owing to a focus on organizational/business targets. Certain groups of pharmacy professionals may pose particular challenges, such as self-employed locums, pharmacy owners, those working in different sectors, or returning after a break. 
To ensure proportionality, it must be considered whether the same standards and/or sources of evidence should apply to all pharmacy professionals, either dependent on whether they are patient facing, their scope of practice, or whether Phs and PTs should be treated differently. Copyright © 2013 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Zubair, Mohammad; Nielsen, Eric; Luitjens, Justin; Hammond, Dana
2016-01-01
In the field of computational fluid dynamics, the Navier-Stokes equations are often solved using an unstructuredgrid approach to accommodate geometric complexity. Implicit solution methodologies for such spatial discretizations generally require frequent solution of large tightly-coupled systems of block-sparse linear equations. The multicolor point-implicit solver used in the current work typically requires a significant fraction of the overall application run time. In this work, an efficient implementation of the solver for graphics processing units is proposed. Several factors present unique challenges to achieving an efficient implementation in this environment. These include the variable amount of parallelism available in different kernel calls, indirect memory access patterns, low arithmetic intensity, and the requirement to support variable block sizes. In this work, the solver is reformulated to use standard sparse and dense Basic Linear Algebra Subprograms (BLAS) functions. However, numerical experiments show that the performance of the BLAS functions available in existing CUDA libraries is suboptimal for matrices representative of those encountered in actual simulations. Instead, optimized versions of these functions are developed. Depending on block size, the new implementations show performance gains of up to 7x over the existing CUDA library functions.
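The multicolor point-implicit idea can be sketched on a toy block-tridiagonal system: unknowns are colored so that no two coupled points share a color, which makes every update within one color independent of the others (and hence parallelizable on a GPU). This is a simplified stand-in, not the FUN3D implementation; the system, block size, and coefficients below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
n, bs = 16, 3          # grid points and block size (coupled unknowns per point)

# Toy block-tridiagonal system; diagonal blocks are made dominant so the
# relaxation converges. It stands in for the block-sparse Jacobians of
# the abstract and is not the FUN3D data structure.
D = [5.0 * np.eye(bs) + 0.1 * rng.standard_normal((bs, bs)) for _ in range(n)]
L = [0.3 * rng.standard_normal((bs, bs)) for _ in range(n)]   # couples i to i-1
U = [0.3 * rng.standard_normal((bs, bs)) for _ in range(n)]   # couples i to i+1
b = rng.standard_normal((n, bs))

# Two colors suffice for a 1-D stencil: even and odd points never couple,
# so all points of one color can be updated independently (in parallel).
colors = [range(0, n, 2), range(1, n, 2)]
Dinv = [np.linalg.inv(Di) for Di in D]

x = np.zeros((n, bs))
for _ in range(200):
    for color in colors:
        for i in color:
            r = b[i].copy()
            if i > 0:
                r -= L[i] @ x[i - 1]
            if i < n - 1:
                r -= U[i] @ x[i + 1]
            x[i] = Dinv[i] @ r       # point-implicit: solve with the diagonal block

# Assemble the full matrix and check the converged residual.
A = np.zeros((n * bs, n * bs))
for i in range(n):
    A[i*bs:(i+1)*bs, i*bs:(i+1)*bs] = D[i]
    if i > 0:
        A[i*bs:(i+1)*bs, (i-1)*bs:i*bs] = L[i]
    if i < n - 1:
        A[i*bs:(i+1)*bs, (i+1)*bs:(i+2)*bs] = U[i]
res = np.abs(A @ x.ravel() - b.ravel()).max()
print(res)
```

On an unstructured grid the coloring is computed by a graph-coloring pass rather than the even/odd split used here, but the structure of the sweep is the same: loop over colors sequentially, update all points of one color concurrently.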
Cutcliffe, John; Happell, Brenda
2009-04-01
Interpersonal relationships, although considered to be the cornerstone of therapeutic engagement, are replete with issues of power; yet, the concept of 'invisible power' within such formal mental health care relationships is seldom explored and/or critiqued in the literature. This paper involves an examination of power in the interpersonal relationship between the mental health nurse and the consumer. Issues of power are emphasized by drawing on examples from clinical experiences, each of which is then deconstructed as an analytical means to uncover the different layers of power. This examination highlights the existence of both obscure and seldom-acknowledged invisible manifestations of power that are inherent in psychiatry and interpersonal mental health nursing. It also identifies that there is an orthodoxy of formal mental health care that perhaps is best described as 'biopsychiatry' (or 'traditional psychiatry'). Within this are numerous serious speech acts, and these provide the power for mental health practitioners to act in particular ways, to exercise control. The authors challenge this convention as the only viable discourse: a potentially viable alternative to the current orthodoxy of formal mental health care does exist and, most importantly, this alternative is less tied to the use of invisible power.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Millis, Andrew
Understanding the behavior of interacting electrons in molecules and solids, so that one can predict new superconductors, catalysts, light harvesters, and energy and battery materials and optimize existing ones, is the 'quantum many-body problem'. This is one of the scientific grand challenges of the 21st century. A complete solution to the problem has been proven to be exponentially hard, meaning that straightforward numerical approaches fail. New insights and new methods are needed to provide accurate yet feasible approximate solutions. This CMSCN project brought together chemists and physicists to combine insights from the two disciplines to develop innovative new approaches. Outcomes included the Density Matrix Embedding method, a new, computationally inexpensive and extremely accurate approach that may enable first-principles treatment of superconducting and magnetic properties of strongly correlated materials; new techniques for existing methods, including an Adaptively Truncated Hilbert Space approach that will vastly expand the capabilities of the dynamical mean field method; a self-energy embedding theory; and a new memory-function based approach to calculations of the behavior of driven systems. The methods developed under this project are now being applied to improve our understanding of superconductivity, to calculate novel topological properties of materials, and to characterize and improve the properties of nanoscale devices.
Establishing NWP capabilities in African Small Island States (SIDs)
NASA Astrophysics Data System (ADS)
Rögnvaldsson, Ólafur
2017-04-01
Íslenskar orkurannsóknir (ÍSOR), in collaboration with Belgingur Ltd. and the United Nations Economic Commission for Africa (UNECA), signed a Letter of Agreement in 2015 regarding collaboration in the "Establishing Operational Capacity for Building, Deploying and Using Numerical Weather and Seasonal Prediction Systems in Small Island States in Africa (SIDs)" project. The specific objectives of the collaboration were the following: - Build capacity of National Meteorological and Hydrological Services (NMHS) staff on the use of the WRF atmospheric model for weather and seasonal forecasting, interpretation of model results, and the use of observations to verify and improve model simulations. - Establish a platform for integrating short to medium range weather forecasts, as well as seasonal forecasts, into already existing infrastructure at NMHS and Regional Climate Centres. - Improve understanding of existing model results and forecast verification, for improving decision-making on the time scale of days to weeks. To meet these challenges, the operational Weather On Demand (WOD) forecasting system, developed by Belgingur, is being installed in a number of SIDs countries (Cabo Verde, Guinea-Bissau, and Seychelles), as well as being deployed for the Pan-Africa region, with forecasts being disseminated to collaborating NMHSs.
NASA Astrophysics Data System (ADS)
Xing, Pengju; Yoshioka, Keita; Adachi, Jose; El-Fayoumi, Amr; Bunger, Andrew P.
2017-07-01
The tip behavior of hydraulic fractures is characterized by a rich nesting of asymptotic solutions, comprising a formidable challenge for the development of efficient and accurate numerical simulators. We present experimental validation of several theoretically-predicted asymptotic behaviors, namely for hydraulic fracture growth under conditions of negligible fracture toughness, with growth progressing from early-time radial geometry to large-time blade-like (PKN) geometry. Our experimental results demonstrate: 1) existence of an asymptotic solution of the form w ∼ s^(3/2) (LEFM) in the near-tip region, where w is the crack opening and s is the distance from the crack tip, 2) transition to an asymptotic solution of the form w ∼ s^(2/3) away from the near-tip region, with the transition length scale also consistent with theory, 3) transition to an asymptotic solution of the form w ∼ s^(1/3) after the fracture attains blade-like (PKN) geometry, and 4) existence of a region near the tip of a blade-like (PKN) hydraulic fracture in which plane strain conditions persist, with the thickness of this region of the same order as the crack height.
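In practice, an asymptotic exponent such as the LEFM tip behavior w ∼ s^(3/2) is identified from the slope of opening versus distance-to-tip on log-log axes. A minimal sketch with synthetic data; the prefactor and noise level are illustrative, not experimental values:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic near-tip opening data following the LEFM form w = C * s^(3/2);
# the prefactor and noise level are invented for illustration.
s = np.logspace(-3, -1, 50)              # distance from the tip
C = 2e-3
w = C * s**1.5 * rng.lognormal(0.0, 0.02, s.size)

# The exponent is the slope of log(w) versus log(s).
slope, intercept = np.polyfit(np.log(s), np.log(w), 1)
print(slope)                             # close to 1.5 in the LEFM region
```

The same log-log slope check distinguishes the 2/3 and 1/3 regimes away from the tip: each asymptote appears as a straight segment with the corresponding slope.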
NASA Astrophysics Data System (ADS)
Chen, Buxin; Zhang, Zheng; Sidky, Emil Y.; Xia, Dan; Pan, Xiaochuan
2017-11-01
Optimization-based algorithms for image reconstruction in multispectral (or photon-counting) computed tomography (MCT) remain a topic of active research. The challenge of optimization-based image reconstruction in MCT stems from the inherently non-linear data model that can lead to a non-convex optimization program for which no mathematically exact solver seems to exist for achieving globally optimal solutions. In this work, based upon a non-linear data model, we design a non-convex optimization program, derive its first-order-optimality conditions, and propose an algorithm to solve the program for image reconstruction in MCT. In addition to consideration of image reconstruction for the standard scan configuration, the emphasis is on investigating the algorithm’s potential for enabling non-standard scan configurations with no or minimum hardware modification to existing CT systems, which has potential practical implications for lowered hardware cost, enhanced scanning flexibility, and reduced imaging dose/time in MCT. Numerical studies are carried out for verification of the algorithm and its implementation, and for a preliminary demonstration and characterization of the algorithm in reconstructing images and in enabling non-standard configurations with varying scanning angular range and/or x-ray illumination coverage in MCT.
1, 2, 3, 4: infusing quantitative literacy into introductory biology.
Speth, Elena Bray; Momsen, Jennifer L; Moyerbrailean, Gregory A; Ebert-May, Diane; Long, Tammy M; Wyse, Sara; Linton, Debra
2010-01-01
Biology of the twenty-first century is an increasingly quantitative science. Undergraduate biology education therefore needs to provide opportunities for students to develop fluency in the tools and language of quantitative disciplines. Quantitative literacy (QL) is important for future scientists as well as for citizens, who need to interpret numeric information and data-based claims regarding nearly every aspect of daily life. To address the need for QL in biology education, we incorporated quantitative concepts throughout a semester-long introductory biology course at a large research university. Early in the course, we assessed the quantitative skills that students bring to the introductory biology classroom and found that students had difficulties in performing simple calculations, representing data graphically, and articulating data-driven arguments. In response to students' learning needs, we infused the course with quantitative concepts aligned with the existing course content and learning objectives. The effectiveness of this approach is demonstrated by significant improvement in the quality of students' graphical representations of biological data. Infusing QL in introductory biology presents challenges. Our study, however, supports the conclusion that it is feasible in the context of an existing course, consistent with the goals of college biology education, and promotes students' development of important quantitative skills.
Statistical Methods Applied to Gamma-ray Spectroscopy Algorithms in Nuclear Security Missions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fagan, Deborah K.; Robinson, Sean M.; Runkle, Robert C.
2012-10-01
In a wide range of nuclear security missions, gamma-ray spectroscopy is a critical research and development priority. One particularly relevant challenge is the interdiction of special nuclear material, for which gamma-ray spectroscopy supports the goals of detecting and identifying gamma-ray sources. This manuscript examines the existing set of spectroscopy methods, attempts to categorize them by the statistical methods on which they rely, and identifies methods that have yet to be considered. Our examination shows that current methods effectively estimate the effect of counting uncertainty but in many cases do not address larger, significantly more complex sources of decision uncertainty. We thus explore the premise that significantly improving algorithm performance requires greater coupling between the problem physics that drives data acquisition and the statistical methods that analyze such data. Untapped statistical methods, such as Bayesian model averaging and hierarchical and empirical Bayes methods, have the potential to reduce decision uncertainty by more rigorously and comprehensively incorporating all sources of uncertainty. We expect that application of such methods will demonstrate progress in meeting the needs of nuclear security missions by improving on the existing numerical infrastructure for which these analyses have not been conducted.
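Bayesian model averaging, one of the untapped methods mentioned, weights candidate source hypotheses by their posterior model probabilities. A minimal sketch with synthetic spectra: the templates, channel count, and intensities are invented for illustration, and the constant log-factorial term of the Poisson likelihood is dropped because it is identical across models.

```python
import numpy as np

rng = np.random.default_rng(3)
ch = np.arange(32)                        # detector channels (invented)

def bump(center):
    """Gaussian-shaped photopeak added to a flat background (illustrative)."""
    return 20.0 * np.exp(-0.5 * ((ch - center) / 2.0) ** 2)

# Candidate "models": fixed expected spectra (counts per channel).
templates = {
    "background": np.full(32, 10.0),
    "source_A": 10.0 + bump(8),
    "source_B": 10.0 + bump(20),
}
data = rng.poisson(templates["source_A"])   # data drawn from source_A

# Poisson log-likelihood up to a model-independent constant
# (the log-factorial term cancels when comparing models).
loglik = {k: float(np.sum(data * np.log(mu) - mu)) for k, mu in templates.items()}

# Posterior model probabilities under a uniform prior.
m = max(loglik.values())
unnorm = {k: np.exp(v - m) for k, v in loglik.items()}
z = sum(unnorm.values())
post = {k: w / z for k, w in unnorm.items()}

# Model-averaged expected spectrum: predictions weighted by posterior probability.
bma_spectrum = sum(post[k] * templates[k] for k in templates)
print(post)
```

Subtracting the maximum log-likelihood before exponentiating is the usual numerical safeguard; with real libraries of nuclide templates the same weighting extends over many more hypotheses.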
An Efficient Numerical Approach for Nonlinear Fokker-Planck equations
NASA Astrophysics Data System (ADS)
Otten, Dustin; Vedula, Prakash
2009-03-01
Fokker-Planck equations that are nonlinear in their probability densities, which occur in many nonequilibrium systems relevant to mean-field interaction models, plasmas, and classical fermions and bosons, can be challenging to solve numerically. To address some underlying challenges in obtaining numerical solutions, we propose a quadrature-based moment method for efficient and accurate determination of transient (and stationary) solutions of nonlinear Fokker-Planck equations. In this approach the distribution function is represented as a collection of Dirac delta functions with corresponding quadrature weights and locations, which are in turn determined from constraints based on the evolution of generalized moments. Properties of the distribution function can be obtained by solution of transport equations for quadrature weights and locations. We will apply this computational approach to study a wide range of problems, including the Desai-Zwanzig model (for nonlinear muscular contraction) and multivariate nonlinear Fokker-Planck equations describing classical fermions and bosons, and will also demonstrate good agreement with results obtained from Monte Carlo and other standard numerical methods.
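The delta-function representation can be checked in a minimal case: a two-node quadrature with nodes μ ± σ and weights 1/2 reproduces the first four moments of a Gaussian exactly, the kind of moment constraint that determines the weights and locations in a quadrature-based moment method. The values of μ and σ below are illustrative.

```python
import numpy as np

# Two-node quadrature representation p(x) ~ w1*delta(x-x1) + w2*delta(x-x2).
# For a Gaussian with mean mu and standard deviation sigma, nodes mu +/- sigma
# with weights 1/2 match the moments m0..m3 exactly. In the full method the
# weights and locations would evolve under transport equations; here we only
# verify the moment-matching constraint. mu and sigma are illustrative.
mu, sigma = 1.0, 0.5
nodes = np.array([mu - sigma, mu + sigma])
weights = np.array([0.5, 0.5])

def quad_moment(k):
    """k-th raw moment of the two-node quadrature."""
    return float(np.sum(weights * nodes**k))

# Analytical Gaussian raw moments m0..m3.
gauss = [1.0, mu, mu**2 + sigma**2, mu**3 + 3.0 * mu * sigma**2]
print([quad_moment(k) for k in range(4)], gauss)
```

With 2N nodes one can match the first 4N moments; algorithms such as the product-difference or Wheeler algorithm generalize this construction to arbitrary moment sets.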
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-01
... an improved understanding of methodological challenges associated with integrating existing tools and... methodological challenges associated with integrating existing tools (e.g., climate models, downscaling... sensitivity to methodological choices such as different approaches for downscaling global climate change...
A Three-Dimensional Linearized Unsteady Euler Analysis for Turbomachinery Blade Rows
NASA Technical Reports Server (NTRS)
Montgomery, Matthew D.; Verdon, Joseph M.
1996-01-01
A three-dimensional, linearized, Euler analysis is being developed to provide an efficient unsteady aerodynamic analysis that can be used to predict the aeroelastic and aeroacoustic response characteristics of axial-flow turbomachinery blading. The field equations and boundary conditions needed to describe nonlinear and linearized inviscid unsteady flows through a blade row operating within a cylindrical annular duct are presented. In addition, a numerical model for linearized inviscid unsteady flow, which is based upon an existing nonlinear, implicit, wave-split, finite volume analysis, is described. These aerodynamic and numerical models have been implemented into an unsteady flow code, called LINFLUX. A preliminary version of the LINFLUX code is applied herein to selected, benchmark three-dimensional, subsonic, unsteady flows, to illustrate its current capabilities and to uncover existing problems and deficiencies. The numerical results indicate that good progress has been made toward developing a reliable and useful three-dimensional prediction capability. However, some problems, associated with the implementation of an unsteady displacement field and numerical errors near solid boundaries, still exist. Also, accurate far-field conditions must be incorporated into the LINFLUX analysis, so that this analysis can be applied to unsteady flows driven by external aerodynamic excitations.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-28
... Securities.'' Second, the Exchange is replacing existing paragraph (C)(4) of Rule 11890, entitled ``Numerical... the ability of the Exchange to deviate from the Numerical Guidelines contained in paragraph (C)(1... flexibility to the Exchange to use different Numerical Guidelines or Reference Prices in various ``Unusual...
ERIC Educational Resources Information Center
Merkley, Rebecca; Shimi, Andria; Scerif, Gaia
2016-01-01
It is not yet understood how children acquire the meaning of numerical symbols and most existing research has focused on the role of approximate non-symbolic representations of number in this process (see Piazza, "Trends in Cognitive Sciences" 14(12):542-551, 2010). However, numerical symbols necessitate an understanding of both order and…
The Effects of Numerical Imbalance and Gender on Tokens: An Examination of Kanter's Theory.
ERIC Educational Resources Information Center
Fairhurst, Gail Theus; Snavely, Bretta Kay
A study examined the effects of gender on social interaction under numerically imbalanced conditions. Specifically, the study tested R. M. Kanter's assumption that all tokens (individuals who enter a work environment where their gender is numerically scarce) respond in a similar manner to token conditions, although evidence exists that males and…
Dung, Van Than; Tjahjowidodo, Tegoeh
2017-01-01
B-spline functions are widely used in many industrial applications such as computer graphic representations, computer aided design, computer aided manufacturing, computer numerical control, etc. Recently, there have been demands, e.g. in the reverse engineering (RE) area, to employ B-spline curves for non-trivial cases that include curves with discontinuous points, cusps or turning points from the sampled data. The most challenging task in these cases is the identification of the number of knots and their respective locations in non-uniform space at the lowest computational cost. This paper presents a new strategy for fitting any form of curve with B-spline functions via a local algorithm. A new two-step method for fast knot calculation is proposed. In the first step, the data is split using a bisecting method with a predetermined allowable error to obtain coarse knots. Secondly, the knots are optimized, for both locations and continuity levels, by employing a non-linear least squares technique. The B-spline function is, therefore, obtained by solving the ordinary least squares problem. The performance of the proposed method is validated using various numerical experimental data, with and without simulated noise, generated by a B-spline function and deterministic parametric functions. This paper also discusses benchmarking of the proposed method against existing methods in the literature. The proposed method is shown to be able to reconstruct B-spline functions from sampled data within acceptable tolerance. It is also shown that the proposed method can be applied for fitting any type of curve, ranging from smooth to discontinuous ones. In addition, the method does not require excessive computational cost, which allows it to be used in automatic reverse engineering applications.
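The benefit of adaptive knot placement followed by an ordinary least-squares coefficient solve can be sketched with SciPy's `LSQUnivariateSpline`, which performs the least-squares fit for a given interior-knot vector. The data, knot locations, and comparison below are illustrative assumptions, not the paper's algorithm or test cases:

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

# Hypothetical data: a smooth sine joined to a straight line, giving a
# slope discontinuity (turning point) at x = 2.
x = np.linspace(0.0, 4.0, 401)
y = np.where(x < 2.0, np.sin(2.0 * x), np.sin(4.0) + 2.0 * (x - 2.0))

def max_err(interior_knots):
    # Given interior knot locations, the B-spline coefficients follow
    # from an ordinary least-squares fit, which LSQUnivariateSpline solves.
    spl = LSQUnivariateSpline(x, y, interior_knots, k=3)
    return np.abs(spl(x) - y).max()

uniform = np.linspace(0.5, 3.5, 7)                    # evenly spaced knots
adapted = [1.0, 1.9, 1.99, 2.01, 2.1, 3.0]            # clustered at the cusp

print(max_err(adapted) < max_err(uniform))
```

Clustering knots near the turning point lets a C2 cubic spline change curvature over very short spans, which is the effect the adaptive bisection step is designed to achieve automatically.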
NASA Astrophysics Data System (ADS)
Cui, Z.; Welty, C.; Maxwell, R. M.
2011-12-01
Lagrangian, particle-tracking models are commonly used to simulate solute advection and dispersion in aquifers. They are computationally efficient and suffer from much less numerical dispersion than grid-based techniques, especially in heterogeneous and advection-dominated systems. Although particle-tracking models are capable of simulating geochemical reactions, these reactions are often simplified to first-order decay and/or linear, first-order kinetics. Nitrogen transport and transformation in aquifers involves both biodegradation and higher-order geochemical reactions. In order to take advantage of the particle-tracking approach, we have enhanced an existing particle-tracking code, SLIM-FAST, to simulate nitrogen transport and transformation in aquifers. The approach we are taking is a hybrid one: the reactive multispecies transport process is operator split into two steps: (1) the physical movement of the particles, including attachment/detachment to solid surfaces, which is modeled by a Lagrangian random-walk algorithm; and (2) multispecies reactions, including biodegradation, which are modeled by coupling multiple Monod equations with other geochemical reactions. The coupled reaction system is solved by an ordinary differential equation solver. In order to solve the coupled system of equations, after step 1 the particles are converted to grid-based concentrations based on the mass and position of the particles, and after step 2 the newly calculated concentration values are mapped back to particles. The enhanced particle-tracking code is capable of simulating subsurface nitrogen transport and transformation in a three-dimensional domain with variably saturated conditions. A potential application of the enhanced code is to simulate subsurface nitrogen loading to the Chesapeake Bay and its tributaries.
Implementation details, verification results of the enhanced code with one-dimensional analytical solutions and other existing numerical models will be presented in addition to a discussion of implementation challenges.
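A minimal sketch of the hybrid operator-split scheme described above — a Lagrangian random-walk step, a particle-to-grid mapping, a Monod reaction step solved with an ODE integrator, and a mapping back to particle masses — might look like the following. This is a single-species 1D illustration; all parameters and the mapping details are assumptions, not SLIM-FAST's implementation:

```python
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(1)

# Illustrative parameters (not from SLIM-FAST)
v, D, dt, nsteps = 1.0, 0.01, 0.1, 50     # velocity, dispersion, time step
mu_max, K_s = 0.5, 0.2                    # Monod kinetics for one species
edges = np.linspace(0.0, 10.0, 101)       # grid for particle -> concentration

x = np.full(2000, 1.0)                    # particle positions
mass = np.full(x.size, 1.0 / x.size)      # equal particle masses, total = 1

def monod(t, c):
    # Monod degradation of a single species (bin width absorbed into units)
    return [-mu_max * c[0] / (K_s + c[0])]

for _ in range(nsteps):
    # Step 1: Lagrangian random walk (advection + dispersion)
    x += v * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal(x.size)
    # Map particle masses to cell-averaged "concentrations"
    conc, _ = np.histogram(x, bins=edges, weights=mass)
    # Step 2: react each occupied cell with the Monod ODE, map back to particles
    for i in np.nonzero(conc)[0]:
        sol = solve_ivp(monod, (0.0, dt), [conc[i]])
        scale = sol.y[0, -1] / conc[i]
        in_cell = (x >= edges[i]) & (x < edges[i + 1])
        mass[in_cell] *= scale

print(mass.sum() < 1.0)   # total mass decreases as the species degrades
```

The particle/grid conversions are the price of the hybrid approach: the random walk preserves low numerical dispersion, while the gridded step lets a standard ODE solver handle the nonlinear reaction network.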
A nested numerical tidal model of the southern New England bight
NASA Technical Reports Server (NTRS)
Gordon, R. B.; Spaulding, M. L.
1979-01-01
Efforts were focused on the development and application of a three-dimensional numerical model for predicting pollutant and sediment transport in estuarine and coastal environments. To successfully apply the pollutant and sediment transport model to Rhode Island coastal waters, it was determined that the flow field in this region had to be better described through the use of existing numerical circulation models. A nested, barotropic numerical tidal model was applied to the southern New England Bight (Long Island, Block Island, Rhode Island Sounds, Buzzards Bay, and the shelf south of Block Island). Forward time and centered spatial differences were employed with the bottom friction term evaluated at both time levels. Using existing tide records on the New England shelf, adequate information was available to specify the tide height boundary condition further out on the shelf. Preliminary results are within the accuracy of the National Ocean Survey tide table data.
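The time-stepping described above — forward time and centered spatial differences, with the linear bottom-friction term averaged over both time levels — can be illustrated on a 1D linearized shallow-water channel forced by a tide-height boundary condition at one end. All parameters are illustrative; this is not the Bight model itself:

```python
import numpy as np

g, H, k = 9.81, 30.0, 1e-4        # gravity, depth, friction coefficient
L, nx = 200e3, 200
dx = L / nx
dt = 30.0                          # well inside the CFL limit dx/sqrt(g*H)

eta = np.zeros(nx)                 # surface elevation
u = np.zeros(nx)                   # depth-averaged velocity
omega = 2.0 * np.pi / 44712.0      # M2 tidal frequency (rad/s)

for n in range(2000):
    t = n * dt
    # centered spatial difference for the pressure-gradient term
    detadx = np.gradient(eta, dx)
    # friction evaluated at both time levels: average k*u over n and n+1
    u = (u * (1.0 - 0.5 * k * dt) - g * dt * detadx) / (1.0 + 0.5 * k * dt)
    # continuity equation advanced with the updated velocity
    eta = eta - H * dt * np.gradient(u, dx)
    eta[0] = np.sin(omega * t)     # tide-height boundary condition offshore

print(np.isfinite(eta).all())
```

Treating friction semi-implicitly (at both time levels) keeps the friction update unconditionally stable regardless of `k*dt`, which is the motivation for evaluating that term this way.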
NASA Technical Reports Server (NTRS)
Abdul, Hameed
2016-01-01
This summer I assisted the RPT Program Office in developing a design plan to update their existing website to current NASA web standards. The finished website is intended for the general public, specifically potential customers interested in learning about NASA's chemical rocket test facility capabilities and test assignment process. The goal of the website is to give the public insight about the purpose and function of the RPT Program. Working on this project gave me the opportunity to learn skills necessary for effective project management. The RPT Program Office manages numerous facilities so they are required to travel often to other sites for meetings throughout the year. Maneuvering around the travel schedule of the office and the workload priority of the IT Department proved to be quite the challenge. I overcame the travel schedule of the office by frequently communicating and checking in with my mentor via email and telephone.
Mechanisms of inhibition within the telencephalon: "where the wild things are".
Fishell, Gord; Rudy, Bernardo
2011-01-01
In this review, we first provide a historical perspective of inhibitory signaling from the discovery of inhibition through to our present understanding of the diversity and mechanisms by which GABAergic interneuron populations function in different parts of the telencephalon. This is followed by a summary of the mechanisms of inhibition in the CNS. With this as a starting point, we provide an overview describing the variations in the subtypes and origins of inhibitory interneurons within the pallial and subpallial divisions of the telencephalon, with a focus on the hippocampus, somatosensory, paleo/piriform cortex, striatum, and various amygdala nuclei. Strikingly, we observe that marked variations exist in the origin and numerical balance between GABAergic interneurons and the principal cell populations in distinct regions of the telencephalon. Finally we speculate regarding the attractiveness and challenges of establishing a unifying nomenclature to describe inhibitory neuron diversity throughout the telencephalon.
Harnessing fluid-structure interactions to design self-regulating acoustic metamaterials
NASA Astrophysics Data System (ADS)
Casadei, Filippo; Bertoldi, Katia
2014-01-01
The design of phononic crystals and acoustic metamaterials with tunable and adaptive wave properties remains one of the outstanding challenges for the development of next generation acoustic devices. We report on the numerical and experimental demonstration of a locally resonant acoustic metamaterial with dispersion characteristics that autonomously adapt in response to changes of an incident aerodynamic flow. The metamaterial consists of a slender beam featuring a periodic array of airfoil-shaped masses supported by linear and torsional springs. The resonance characteristics of the airfoils lead to strong attenuation at frequencies defined by the properties of the airfoils and the speed of the incident fluid. The proposed concept expands the ability of existing acoustic bandgap materials to autonomously adapt their dispersion properties through fluid-structure interactions, and has the potential to dramatically impact a variety of applications, such as robotics, civil infrastructures, and defense systems.
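The band-gap mechanism behind such locally resonant metamaterials can be illustrated with the textbook mass-in-mass lattice: the effective mass diverges near the internal resonance, so a stop band opens just above it. The parameters below are illustrative and unrelated to the airfoil design in the paper:

```python
import numpy as np

# Mass-in-mass lattice: outer mass m1 coupled by springs k, each carrying
# an internal resonator m2 attached through a spring k2.
m1, m2, k, k2 = 1.0, 0.5, 100.0, 50.0
w2 = np.sqrt(k2 / m2)                       # internal resonance frequency

w = np.linspace(0.01, 25.0, 2000)
m_eff = m1 + m2 * w2**2 / (w2**2 - w**2)    # frequency-dependent effective mass
cos_qa = 1.0 - m_eff * w**2 / (2.0 * k)     # lattice dispersion relation

propagating = np.abs(cos_qa) <= 1.0         # real wavenumber -> pass band
gap = (~propagating) & (w > w2) & (w < 1.5 * w2)
print(gap.any())                            # a stop band opens above w2
```

In the adaptive metamaterial, the aerodynamic flow shifts the airfoil resonance, which moves this gap; the lattice model shows why a resonance shift translates directly into a band-gap shift.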
The ‘spiteful’ origins of human cooperation
Marlowe, Frank W.; Berbesque, J. Colette; Barrett, Clark; Bolyanatz, Alexander; Gurven, Michael; Tracer, David
2011-01-01
We analyse generosity, second-party (‘spiteful’) punishment (2PP), and third-party (‘altruistic’) punishment (3PP) in a cross-cultural experimental economics project. We show that smaller societies are less generous in the Dictator Game but no less prone to 2PP in the Ultimatum Game. We might assume people everywhere would be more willing to punish someone who hurt them directly (2PP) than someone who hurt an anonymous third person (3PP). While this is true of small societies, people in large societies are actually more likely to engage in 3PP than 2PP. Strong reciprocity, including generous offers and 3PP, exists mostly in large, complex societies that face numerous challenging collective action problems. We argue that ‘spiteful’ 2PP, motivated by the basic emotion of anger, is more universal than 3PP and sufficient to explain the origins of human cooperation. PMID:21159680
Leveraging natural killer cells for cancer immunotherapy.
Grossenbacher, Steven K; Aguilar, Ethan G; Murphy, William J
2017-05-01
Natural killer (NK) cells are potent antitumor effector cells of the innate immune system. Based on their ability to eradicate tumors in vitro and in animal models, significant enthusiasm surrounds the prospect of leveraging human NK cells as vehicles for cancer immunotherapy. While interest in manipulating the effector functions of NK cells has existed for over 30 years, there is renewed optimism for this approach today. Although T cells receive much of the clinical and preclinical attention when it comes to cancer immunotherapy, new strategies are utilizing adoptive NK-cell immunotherapy and monoclonal antibodies and engineered molecules which have been developed to specifically activate NK cells against tumors. Despite the numerous challenges associated with the preclinical and clinical development of NK cell-based therapies for cancer, NK cells possess many unique immunological properties and hold the potential to provide an effective means for cancer immunotherapy.
Season of birth bias in eating disorders--fact or fiction?
Winje, Eirin; Willoughby, Kate; Lask, Bryan
2008-09-01
A season of birth (SoB) bias is said to be present if the SoB pattern for a particular group varies from the pattern within the normal population. Significant biases have been found for several disorders including eating disorders (EDs). This article critically reviews the existing literature on SoB in ED in order to inform future hypothesis-based research. A literature search identified 12 papers investigating SoB in ED. Despite methodological differences, the studies consistently show a SoB bias for anorexia nervosa (AN) in the spring months, in both the northern and southern Hemispheres. This is especially strong for early-onset and restrictive subtype of AN. These findings suggest that SoB is a risk factor for AN. However, none of the studies have been methodologically satisfactory. Future research needs to overcome numerous methodological challenges and to explore specific hypotheses to explain this bias. (c) 2008 by Wiley Periodicals, Inc.
Robust CO2 Injection: Application of Bayesian-Information-Gap Decision Theory
NASA Astrophysics Data System (ADS)
Grasinger, M.; O'Malley, D.; Vesselinov, V. V.; Karra, S.
2015-12-01
Carbon capture and sequestration has the potential to reduce greenhouse gas emissions. However, care must be taken when choosing a site for CO2 sequestration to ensure that the CO2 remains sequestered for many years, and that the environment is not harmed in any way. Making a rational decision between potential sites for sequestration is not without its challenges because, as in the case of many environmental and subsurface problems, considerable uncertainty exists. A method for making decisions under various types and severities of uncertainty, Bayesian-Information-Gap Decision Theory (BIG DT), is presented. BIG DT was coupled with a numerical model for CO2 well injection, and the resulting framework was then applied to a problem of selecting between two potential sites for CO2 sequestration. The results of the analysis are presented, followed by a discussion of the decision process.
Anticipating issues related to increasing preimplantation genetic diagnosis use: a research agenda.
Klitzman, Robert; Appelbaum, Paul S; Chung, Wendy; Sauer, Mark
2008-01-01
Increasing use of preimplantation genetic diagnosis (PGD) poses numerous clinical, social, psychological, ethical, legal and policy dilemmas, many of which have received little attention. Patients and providers are now considering and using PGD for a widening array of genetic disorders, and patients may increasingly seek 'designer babies.' In the USA, although governmental oversight policies have been discussed, few specific guidelines exist. Hence, increasingly, patients and providers will face challenging ethical and policy questions of when and for whom to use PGD, and how it should be financed. These issues should be better clarified and addressed through collection of data concerning the current use of PGD in the USA, including factors involved in decision making about PGD use, as well as the education of the various communities that are, and should be, involved in its implementation. Improved understanding of these issues will ultimately enhance the development and implementation of future clinical guidelines and policies.
Fight, Flight or Freeze: Common Responses for Follower Coping with Toxic Leadership.
Webster, Vicki; Brough, Paula; Daly, Kathleen
2016-10-01
Sustained destructive leadership behaviours are associated with negative outcomes that produce serious workplace problems, yet there is scant research into how followers effectively cope with toxic leader behaviours. Despite numerous attempts to develop typologies of coping behaviours, there remains much to learn, especially in relation to this specific workplace stressor. This mixed method research investigates the coping strategies reported by 76 followers to cope with the psychological, emotional and physical consequences of their leader's adverse behaviour. Coping instances were categorized using two existing theoretical coping frameworks, and the ability of these frameworks to explain responses to real-world experiences with toxic leadership are discussed. Common coping strategies reported included assertively challenging the leader, seeking social support, ruminating, taking leave and leaving the organization. Organizational interventions to increase effectiveness of follower coping with the impact of toxic leadership are also discussed. Copyright © 2014 John Wiley & Sons, Ltd.
Quantum Ensemble Classification: A Sampling-Based Learning Control Approach.
Chen, Chunlin; Dong, Daoyi; Qi, Bo; Petersen, Ian R; Rabitz, Herschel
2017-06-01
Quantum ensemble classification (QEC) has significant applications in discrimination of atoms (or molecules), separation of isotopes, and quantum information extraction. However, quantum mechanics forbids deterministic discrimination among nonorthogonal states. The classification of inhomogeneous quantum ensembles is very challenging, since there exist variations in the parameters characterizing the members within different classes. In this paper, we recast QEC as a supervised quantum learning problem. A systematic classification methodology is presented by using a sampling-based learning control (SLC) approach for quantum discrimination. The classification task is accomplished via simultaneously steering members belonging to different classes to their corresponding target states (e.g., mutually orthogonal states). First, a new discrimination method is proposed for two similar quantum systems. Then, an SLC method is presented for QEC. Numerical results demonstrate the effectiveness of the proposed approach for the binary classification of two-level quantum ensembles and the multiclass classification of multilevel quantum ensembles.
Fracture propagation and stability of ice shelves governed by ice shelf heterogeneity
NASA Astrophysics Data System (ADS)
Borstad, Chris; McGrath, Daniel; Pope, Allen
2017-05-01
Tabular iceberg calving and ice shelf retreat occurs after full-thickness fractures, known as rifts, propagate across an ice shelf. A quickly evolving rift signals a threat to the stability of Larsen C, the Antarctic Peninsula's largest ice shelf. Here we reveal the influence of ice shelf heterogeneity on the growth of this rift, with implications that challenge existing notions of ice shelf stability. Most of the rift extension has occurred in bursts after overcoming the resistance of suture zones that bind together neighboring glacier inflows. We model the stresses in the ice shelf to determine potential rift trajectories. Calving perturbations to ice flow will likely reach the grounding line. The stability of Larsen C may hinge on a single suture zone that stabilizes numerous upstream rifts. Elevated fracture toughness of suture zones may be the most important property that allows ice shelves to modulate Antarctica's contribution to sea level rise.
Identification of neutral tumor evolution across cancer types
Barnes, Chris P; Graham, Trevor A; Sottoriva, Andrea
2016-01-01
Despite extraordinary efforts to profile cancer genomes, interpreting the vast amount of genomic data in the light of cancer evolution remains challenging. Here we demonstrate that neutral tumor evolution results in a power-law distribution of the mutant allele frequencies reported by next-generation sequencing of tumor bulk samples. We find that the neutral power-law fits with high precision 323 of 904 cancers from 14 types, selected from different cohorts. In malignancies identified as neutral, all clonal selection occurred prior to the onset of cancer growth and not in later-arising subclones, resulting in numerous passenger mutations that are responsible for intra-tumor heterogeneity. Reanalyzing cancer sequencing data within the neutral framework allowed the measurement, in each patient, of both the in vivo mutation rate and the order and timing of mutations. This result provides a new way to interpret existing cancer genomic data and to discriminate between functional and non-functional intra-tumor heterogeneity. PMID:26780609
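The neutral power-law signature can be checked numerically: under neutrality, the cumulative number of mutations M(f) with allele frequency above f is linear in 1/f, so sampling frequencies from the neutral 1/f² density and regressing M(f) against 1/f should give a near-perfect linear fit. The following is a synthetic sketch of that test, not the paper's analysis pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
f_min, f_max = 0.12, 0.5          # detection limit and clonal frequency
n = 5000

# Inverse-CDF sampling from the neutral 1/f^2 allele-frequency density
u = rng.random(n)
f = 1.0 / (1.0 / f_min - u * (1.0 / f_min - 1.0 / f_max))

# Cumulative number of mutations with frequency >= f, on a grid
grid = np.linspace(f_min, 0.4, 50)
M = np.array([(f >= g).sum() for g in grid])

# Neutral model predicts M(f) linear in 1/f: fit and check goodness of fit
x = 1.0 / grid
slope, intercept = np.polyfit(x, M, 1)
pred = slope * x + intercept
r2 = 1.0 - np.sum((M - pred) ** 2) / np.sum((M - M.mean()) ** 2)
print(r2 > 0.98)                   # essentially perfect linear fit
```

In real bulk-sequencing data, departures from this linear fit are what flag the presence of subclonal selection rather than purely neutral growth.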
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kyle, Jennifer E.; Zhang, Xing; Weitz, Karl K.
Understanding how biological molecules are generated, metabolized and eliminated in living systems is important for interpreting processes such as immune response and disease pathology. While genomic and proteomic studies have provided vast amounts of information over the last several decades, interest in lipidomics has also grown due to improved analytical technologies revealing altered lipid metabolism in type 2 diabetes, cancer, and lipid storage disease. Liquid chromatography and mass spectrometry (LC-MS) measurements are currently the dominant approach for characterizing the lipidome by providing detailed information on the spatial and temporal composition of lipids. However, interpreting lipids’ biological roles is challenging due to the existence of numerous structural and stereoisomers (i.e. distinct acyl chain and double-bond positions), which are unresolvable using present LC-MS approaches. Here we show that combining structurally-based ion mobility spectrometry (IMS) with LC-MS measurements distinguishes lipid isomers and allows insight into biological and disease processes.
Zheng, Xueyun; Deng, Liulin; Baker, Erin S; Ibrahim, Yehia M; Petyuk, Vladislav A; Smith, Richard D
2017-07-11
While α-linked amino acids in the l-form are exclusively utilized in mammalian protein building, β-linked and d-form amino acids also have important biological roles. Unfortunately, the structural elucidation and separation of these different amino acid types in peptides has been analytically challenging to date due to the numerous isomers present, limiting our knowledge about their existence and biological roles. Here, we utilized an ultrahigh resolution ion mobility spectrometry platform coupled with mass spectrometry (IMS-MS) to separate amyloid β (Aβ) peptides containing l-aspartic acid, d-aspartic acid, l-isoaspartic acid, and d-isoaspartic acid residues which span α- and β-linked amino acids in both d- and l-forms. The results illustrate how IMS-MS could be used to better understand age-related diseases or protein folding disorders resulting from amino acid modifications.
Reversible patterning of spherical shells through constrained buckling
NASA Astrophysics Data System (ADS)
Marthelot, J.; Brun, P.-T.; Jiménez, F. López; Reis, P. M.
2017-07-01
Recent advances in active soft structures envision the large deformations resulting from mechanical instabilities as routes for functional shape morphing. Numerous such examples exist for filamentary and plate systems. However, examples with double-curved shells are rarer, with progress hampered by challenges in fabrication and the complexities involved in analyzing their underlying geometrical nonlinearities. We show that on-demand patterning of hemispherical shells can be achieved through constrained buckling. Their postbuckling response is stabilized by an inner rigid mandrel. Through a combination of experiments, simulations, and scaling analyses, our investigation focuses on the nucleation and evolution of the buckling patterns into a reticulated network of sharp ridges. The geometry of the system, namely, the shell radius and the gap between the shell and the mandrel, is found to be the primary ingredient to set the surface morphology. This prominence of geometry suggests a robust, scalable, and tunable mechanism for reversible shape morphing of elastic shells.
Nanoparticle-induced unusual melting and solidification behaviours of metals
Ma, Chao; Chen, Lianyi; Cao, Chezheng; Li, Xiaochun
2017-01-01
Effective control of melting and solidification behaviours of materials is significant for numerous applications. It has been a long-standing challenge to increase the melted zone (MZ) depth while shrinking the heat-affected zone (HAZ) size during local melting and solidification of materials. In this paper, nanoparticle-induced unusual melting and solidification behaviours of metals are reported that effectively solve this long-time dilemma. By introduction of Al2O3 nanoparticles, the MZ depth of Ni is increased by 68%, while the corresponding HAZ size is decreased by 67% in laser melting at a pulse energy of 0.18 mJ. The addition of SiC nanoparticles shows similar results. The discovery of the unusual melting and solidification of materials that contain nanoparticles will not only have impacts on existing melting and solidification manufacturing processes, such as laser welding and additive manufacturing, but also on other applications such as pharmaceutical processing and energy storage. PMID:28098147
Efficient estimation of the maximum metabolic productivity of batch systems
St. John, Peter C.; Crowley, Michael F.; Bomble, Yannick J.
2017-01-31
Production of chemicals from engineered organisms in a batch culture involves an inherent trade-off between productivity, yield, and titer. Existing strategies for strain design typically focus on designing mutations that achieve the highest yield possible while maintaining growth viability. While these methods are computationally tractable, an optimum productivity could be achieved by a dynamic strategy in which the intracellular division of resources is permitted to change with time. New methods for the design and implementation of dynamic microbial processes, both computational and experimental, have therefore been explored to maximize productivity. However, solving for the optimal metabolic behavior under the assumption that all fluxes in the cell are free to vary is a challenging numerical task. Previous studies have therefore typically focused on simpler strategies that are more feasible to implement in practice, such as the time-dependent control of a single flux or control variable.
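The growth/production trade-off can be seen in a toy two-stage batch model: grow biomass exponentially until a switch time, then divert all flux to product. The optimal switch time balances building catalyst against leaving time to produce, and for this simple model it has a closed form, T − 1/μ. This is an illustrative model, not the paper's dynamic flux-balance formulation:

```python
import numpy as np

mu, q, X0, T = 0.5, 1.0, 0.01, 24.0   # growth rate, specific productivity,
                                       # inoculum, batch length (illustrative)

def titer(ts):
    # Grow exclusively until the switch time ts, then produce at rate q*X
    X = X0 * np.exp(mu * ts)
    return q * X * (T - ts)

switch_times = np.linspace(0.0, T, 500)
titers = np.array([titer(t) for t in switch_times])
best = switch_times[np.argmax(titers)]

# Analytical optimum: d/dts [exp(mu*ts) * (T - ts)] = 0  ->  ts = T - 1/mu
print(abs(best - (T - 1.0 / mu)) < 0.1)
```

The interior optimum is the essence of the trade-off: switching too early wastes catalytic capacity, switching too late leaves no time to accumulate product.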
Assessing testamentary and decision-making capacity: Approaches and models.
Purser, Kelly; Rosenfeld, Tuly
2015-09-01
The need for better and more accurate assessments of testamentary and decision-making capacity grows as Australian society ages and incidences of mentally disabling conditions increase. Capacity is a legal determination, but one on which medical opinion is increasingly being sought. The difficulties inherent within capacity assessments are exacerbated by the ad hoc approaches adopted by legal and medical professionals based on individual knowledge and skill, as well as the numerous assessment paradigms that exist. This can negatively affect the quality of assessments, and results in confusion as to the best way to assess capacity. This article begins by assessing the nature of capacity. The most common general assessment models used in Australia are then discussed, as are the practical challenges associated with capacity assessment. The article concludes by suggesting a way forward to satisfactorily assess legal capacity given the significant ramifications of getting it wrong.
Chemical alternatives assessment: the case of flame retardants.
Howard, Gregory J
2014-12-01
Decisions on chemical substitution are made rapidly and by many stakeholders; these decisions may have a direct impact on consumer exposures, and, when a hazard exists, to consumer risks. Flame retardants (FRs) represent particular challenges, including very high production volumes, designed-in persistence, and often direct consumer exposure. Newer FR products, as with other industrial chemicals, typically lack data on hazard and exposure, and in many cases even basic information on structure and use in products is unknown. Chemical alternatives assessment (CAA) provides a hazard-focused approach to distinguishing between possible substitutions; variations on this process are used by several government and numerous corporate entities. By grouping chemicals according to functional use, some information on exposure potential can be inferred, allowing for decisions based on those hazard properties that are most distinguishing. This approach can help prevent the "regrettable substitution" of one chemical with another of equal, or even higher, risk. Copyright © 2014 Elsevier Ltd. All rights reserved.
Strategies for the coupling of global and local crystal growth models
NASA Astrophysics Data System (ADS)
Derby, Jeffrey J.; Lun, Lisa; Yeckel, Andrew
2007-05-01
The modular coupling of existing numerical codes to model crystal growth processes will provide for maximum effectiveness, capability, and flexibility. However, significant challenges are posed to make these coupled models mathematically self-consistent and algorithmically robust. This paper presents sample results from a coupling of the CrysVUn code, used here to compute furnace-scale heat transfer, and Cats2D, used to calculate melt fluid dynamics and phase-change phenomena, to form a global model for a Bridgman crystal growth system. However, the strategy used to implement the CrysVUn-Cats2D coupling is unreliable and inefficient. The implementation of under-relaxation within a block Gauss-Seidel iteration is shown to be ineffective for improving the coupling performance in a model one-dimensional problem representative of a melt crystal growth model. Ideas to overcome current convergence limitations using approximations to a full Newton iteration method are discussed.
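The convergence behavior discussed above can be reproduced on a toy pair of coupled "codes". In general, under-relaxed block Gauss-Seidel stabilizes a divergent coupling only when the feedback is weak enough, while a Newton iteration on the coupled residual handles strong coupling; the two scalar equations below are illustrative stand-ins, not the melt-growth model in which under-relaxation was found to fail:

```python
import numpy as np

# "Code A" solves x = cos(y) with y frozen; "code B" solves y = 3*x with
# x frozen. Plain Gauss-Seidel (omega = 1) diverges for this coupling.

def gauss_seidel(omega, iters=200):
    y = 0.0
    for _ in range(iters):
        x = np.cos(y)                          # code A solve
        y = (1.0 - omega) * y + omega * 3.0 * x  # under-relaxed code B update
    return x, y

def newton(iters=20):
    # Newton iteration on the coupled residual F(x, y) = 0
    x, y = 1.0, 1.0
    for _ in range(iters):
        F = np.array([x - np.cos(y), y - 3.0 * x])
        J = np.array([[1.0, np.sin(y)], [-3.0, 1.0]])
        x, y = np.array([x, y]) - np.linalg.solve(J, F)
    return x, y

x_r, y_r = gauss_seidel(omega=0.4)   # converges only for small enough omega
x_n, y_n = newton()
print(abs(y_r - 3.0 * x_r) < 1e-6, abs(y_n - 3.0 * x_n) < 1e-10)
```

The Newton variant requires cross-code Jacobian information that modular couplings usually lack, which is why approximations to a full Newton iteration are attractive for coupled solvers like CrysVUn-Cats2D.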
Probing CP violation in $$h\\rightarrow\\gamma\\gamma$$ with converted photons
Bishara, Fady; Grossman, Yuval; Harnik, Roni; ...
2014-04-11
We study Higgs diphoton decays, in which both photons undergo nuclear conversion to electron-positron pairs. The kinematic distribution of the two electron-positron pairs may be used to probe the CP violating (CPV) coupling of the Higgs to photons, that may be produced by new physics. Detecting CPV in this manner requires interference between the spin-polarized helicity amplitudes for both conversions. We derive leading order, analytic forms for these amplitudes. In turn, we obtain compact, leading-order expressions for the full process rate. While performing experiments involving photon conversions may be challenging, we use the results of our analysis to construct experimental cuts on certain observables that may enhance sensitivity to CPV. We show that there exist regions of phase space on which sensitivity to CPV is of order unity. The statistical sensitivity of these cuts is verified numerically, using dedicated Monte-Carlo simulations.
Reconstructing White Walls: Multi-View Multi-Shot 3d Reconstruction of Textureless Surfaces
NASA Astrophysics Data System (ADS)
Ley, Andreas; Hänsch, Ronny; Hellwich, Olaf
2016-06-01
The reconstruction of the 3D geometry of a scene based on image sequences has been a very active field of research for decades. Nevertheless, challenges remain, in particular for homogeneous parts of objects. This paper proposes a solution to enhance the 3D reconstruction of weakly-textured surfaces by using standard cameras as well as a standard multi-view stereo pipeline. The underlying idea of the proposed method is to improve the signal-to-noise ratio in weakly-textured regions while adaptively amplifying the local contrast to make better use of the limited numerical range in 8-bit images. Based on this premise, multiple shots per viewpoint are used to suppress statistically uncorrelated noise and enhance low-contrast texture. By only changing the image acquisition and adding a preprocessing step, a tremendous increase of up to 300% in the completeness of the 3D reconstruction is achieved.
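The core idea, averaging multiple shots per viewpoint to suppress uncorrelated noise and then stretching the surviving low-contrast texture, can be sketched in a few lines of NumPy. The gain value and the global-mean stretch are simplifications of the paper's adaptive, locally-windowed processing:

```python
import numpy as np

def enhance(shots, gain=4.0):
    """Average N shots of the same viewpoint (noise drops ~ 1/sqrt(N)),
    then amplify contrast around the global mean. A real pipeline would
    stretch adaptively in local windows; gain=4 is an illustrative value."""
    stack = np.mean(shots, axis=0)
    mean = stack.mean()
    out = mean + gain * (stack - mean)
    return np.clip(out, 0, 255)

rng = np.random.default_rng(0)
truth = np.full((8, 8), 120.0)
truth[:, 4:] = 122.0                       # a weak, barely-textured edge
shots = np.array([truth + rng.normal(0, 5, truth.shape) for _ in range(16)])
enhanced = enhance(shots)
```

In a single noisy shot the 2-gray-level edge is buried in sigma-5 noise; after averaging 16 shots and stretching, the edge contrast is amplified well above the residual noise, which is what lets the downstream multi-view stereo matcher lock onto it.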
Managing Emergency Situations in the Smart City: The Smart Signal.
Asensio, Ángel; Blanco, Teresa; Blasco, Rubén; Marco, Álvaro; Casas, Roberto
2015-06-18
In a city there are numerous items, many of them unnoticed but essential; this is the case of signals. Signals are considered objects of little technological interest, but in this paper we show that making them smart and integrating them into the IoT (Internet of Things) could be a relevant contribution to the Smart City. This paper presents the concept of the Smart Signal: a device conscious of its context, with communication skills, able to offer the best message to the user, and a ubiquitous element that contributes information to the city. We present the design considerations and a real implementation and validation of the system in one of the most challenging environments that may exist in a city: a tunnel. The main advantages of the Smart Signal are the improvement of the signal's existing functionality, new interaction capabilities with users, and a new sensory mechanism for the Smart City.
Making Semantic Information Work Effectively for Degraded Environments
2013-06-01
Control Research & Technology Symposium (ICCRTS) held 19-21 June, 2013 in Alexandria, VA. The challenges of effectively managing semantic...technologies over disadvantaged or degraded environments are numerous and complex. One of the greatest challenges is the size of raw data. Large...approach mitigates this challenge by performing data reduction through the adoption of format recognition technologies, semantic data extractions, and the
ERIC Educational Resources Information Center
Marcinkowski, Thomas J.
2009-01-01
Over the past four decades, numerous professionals in the field of environmental education (EE) have attempted to take stock of conditions within and outside of EE. In turn, many used the results of their analyses to describe challenges to and opportunities for EE. Many of these challenges and opportunities continue to ring true today, although…
Haddad, Tarek; Baginska, Ewelina; Kümmerer, Klaus
2015-04-01
Pharmaceuticals may undergo transformation into new products during almost all possible processes along their life-cycle. This could take place either in the natural water environment and/or during water treatment processes. Numerous studies that address the issue of such transformation products (TPs) have been published, describing selected aspects of TPs in the environment and their formation within effluent and water treatment processes. In order to exemplify the number and quality of information published on TPs, we selected 21 active pharmaceutical ingredients from the groups of antibiotics and antineoplastics, and assessed the knowledge about their TPs that had been published by the end of May 2012. The goal of this work was to demonstrate that the quality of data on pharmaceutical TPs greatly differs in terms of the availability of chemical structures for each TP, rather than to provide an exhaustive database of available TPs. The aim was to point out the challenge posed by the many TPs formed under different treatment and environmental conditions. An extensive review in the form of a table showing the existing data on 158 TPs for 15 compounds, out of the 21 investigated, is presented. Numerous TPs are the result of different treatments and environmental processes. However, numerous different TPs may also be formed within a single type of treatment, applied under sometimes very similar treatment conditions and treatment times. In general, the growing number of elucidated TPs is rationalized by ineffective removal treatments. Our results demonstrate a severe risk of drowning in unrelated and non-assessable data, both from a scientific and from a technical treatment-related point of view. Therefore, limiting the input of pharmaceuticals into effluents as well as improving their (bio)degradability and elimination behavior, instead of only relying on advanced effluent treatments, is urgently needed.
Solutions that focus on this "beginning of the pipe" approach should minimize the adverse effects of parent compounds by reducing the formation of TPs and their entrance into the natural environment.
DOT National Transportation Integrated Search
2003-03-01
Small communities have long faced challenges in obtaining or retaining the commercial air service they desire. These challenges are increasing as many U.S. airlines try to stem unprecedented financial losses through numerous cost-cutting measures, in...
Advanced Space Flight and Environmental Concerns
NASA Technical Reports Server (NTRS)
Whitaker, A.
2001-01-01
The aerospace industry has conquered numerous environmental challenges during the last decade. The aerospace industry of today has evolved due in part to the environmental challenges, becoming stronger, more robust, learning to push the limits of technology, materials and manufacturing, and performing cutting edge engineering.
Multifunctional Mesoscale Observing Networks.
NASA Astrophysics Data System (ADS)
Dabberdt, Walter F.; Schlatter, Thomas W.; Carr, Frederick H.; Friday, Elbert W. Joe; Jorgensen, David; Koch, Steven; Pirone, Maria; Ralph, F. Martin; Sun, Juanzhen; Welsh, Patrick; Wilson, James W.; Zou, Xiaolei
2005-07-01
More than 120 scientists, engineers, administrators, and users met on 8-10 December 2003 in a workshop format to discuss the needs for enhanced three-dimensional mesoscale observing networks. Improved networks are seen as being critical to advancing numerical and empirical modeling for a variety of mesoscale applications, including severe weather warnings and forecasts, hydrology, air-quality forecasting, chemical emergency response, transportation safety, energy management, and others. The participants shared a clear and common vision for the observing requirements: existing two-dimensional mesoscale measurement networks do not provide observations of the type, frequency, and density that are required to optimize mesoscale prediction and nowcasts. To be viable, mesoscale observing networks must serve multiple applications, and the public, private, and academic sectors must all actively participate in their design and implementation, as well as in the creation and delivery of value-added products. The mesoscale measurement challenge can best be met by an integrated approach that considers all elements of an end-to-end solution: identifying end users and their needs, designing an optimal mix of observations, defining the balance between static and dynamic (targeted or adaptive) sampling strategies, establishing long-term test beds, and developing effective implementation strategies. Detailed recommendations are provided pertaining to nowcasting, numerical prediction and data assimilation, test beds, and implementation strategies.
Numerical investigation of the dynamical environment of 65803 Didymos
NASA Astrophysics Data System (ADS)
Dell'Elce, L.; Baresi, N.; Naidu, S. P.; Benner, L. A. M.; Scheeres, D. J.
2017-03-01
The Asteroid Impact & Deflection Assessment (AIDA) mission is planning to visit the Didymos binary system in 2022 in order to perform the first demonstration ever of the kinetic impact technique. Binary asteroids are an ideal target for this since the deflection of the secondary body can be accurately measured by a satellite orbiting in the system. However, these binaries offer an extremely rich dynamical environment whose accurate investigation through analytical approaches is challenging at best and requires a significant number of restrictive assumptions. For this reason, a numerical investigation of the dynamical environment in the vicinity of the Didymos system is offered in this paper. After computing various families of periodic orbits, their robustness is assessed in a high-fidelity environment consisting of the perturbed restricted full three-body problem. The results of this study suggest that several nominally stable trajectories, including the triangular libration points, should not be considered as safe as a state vector perturbation may cause the spacecraft to drift from the nominal orbit and possibly impact one of the primary bodies within a few days. Nonetheless, there exist two safe solutions, namely terminator and interior retrograde orbits. The first one is adequate for observation purposes of the entire system and for communications. The second one is more suitable to perform close investigations of the primary body.
Oscillator Seeding of a High Gain Harmonic Generation FEL in a Radiator-First Configuration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gandhi, P.; Wurtele, J.; Penn, G.
2012-05-20
A longitudinally coherent X-ray pulse from a high repetition rate free electron laser (FEL) is desired for a wide variety of experimental applications. However, generating such a pulse with a repetition rate greater than 1 MHz is a significant challenge. The desired high repetition rate sources, primarily high harmonic generation with intense lasers in gases or plasmas, do not currently exist and are likely not feasible with present technology for the multi-MHz bunch trains that superconducting accelerators can potentially produce. In this paper, we propose to place an oscillator downstream of a radiator. The oscillator generates radiation that is used as a seed for a high gain harmonic generation (HGHG) FEL which is upstream of the oscillator. For the first few pulses the oscillator builds up power and, until power is built up, the radiator has no HGHG seed. As power in the oscillator saturates, the HGHG is seeded and power is produced. The dynamics and stability of this radiator-first scheme are explored analytically and numerically. A single-pass map is derived using a semi-analytic model for FEL gain and saturation. Iteration of the map is shown to be in good agreement with simulations. A numerical example is presented for a soft X-ray FEL.
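The iterated single-pass map can be illustrated with a generic saturable-gain recurrence: power builds exponentially from noise and settles where the saturated round-trip gain balances the loss. The gain, saturation power, and loss values below are made up for illustration, not taken from the paper's FEL model:

```python
def pass_map(P, g0=2.0, P_sat=1.0, loss=0.3):
    """One round trip: saturable gain followed by outcoupling loss
    (g0, P_sat, and loss are illustrative values)."""
    gain = g0 / (1.0 + P / P_sat)      # gain compresses as power grows
    return (1.0 - loss) * gain * P

P = 1e-9                                # build-up starts from noise
history = []
for _ in range(200):
    P = pass_map(P)
    history.append(P)
# steady state solves (1 - loss) * g0 / (1 + P/P_sat) = 1, giving P = 0.4 here
```

Iterating a map like this is how the stability of the radiator-first scheme can be probed cheaply: the fixed point and the slope of the map at it determine whether the build-up converges or oscillates.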
NASA Astrophysics Data System (ADS)
Sirikham, Adisorn; Zhao, Yifan; Mehnen, Jörn
2017-11-01
Thermography is a promising method for detecting subsurface defects, but accurate measurement of defect depth is still a big challenge because thermographic signals are typically corrupted by imaging noise and affected by 3D heat conduction. Existing methods based on numerical models are susceptible to signal noise, and methods based on analytical models require rigorous assumptions that usually cannot be satisfied in practical applications. This paper presents a new method to improve the measurement accuracy of subsurface defect depth by determining the thermal wave reflection coefficient, usually assumed to be known a priori, directly from the observed data. This is achieved by introducing a new heat transfer model that includes multiple physical parameters to better describe the observed thermal behaviour in pulsed thermographic inspection. Numerical simulations are used to evaluate the performance of the proposed method against four selected state-of-the-art methods. Results show that the accuracy of depth measurement is improved by up to 10% when the noise level is high and the thermal wave reflection coefficient is low. The feasibility of the proposed method on real data is also validated through a case study on characterising flat-bottom holes in carbon fibre reinforced polymer (CFRP) laminates, which have wide application in various sectors of industry.
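A minimal sketch of the idea of estimating depth and reflection coefficient jointly from observed cooling data, using the standard 1D image-source model for pulsed thermography and a brute-force search. The material constants, grid ranges, and estimator are illustrative simplifications, not the paper's model:

```python
import numpy as np

def surface_temp(t, L, R, alpha=1e-7, A=1.0, n_terms=20):
    """Standard 1D image-source model for the surface temperature above a
    planar reflector at depth L with thermal-wave reflection coefficient R.
    alpha (diffusivity) and A (pulse amplitude) are illustrative values."""
    s = np.ones_like(t)
    for n in range(1, n_terms + 1):
        s = s + 2.0 * R**n * np.exp(-(n * L)**2 / (alpha * t))
    return A / np.sqrt(t) * s

t = np.linspace(0.05, 5.0, 200)
data = surface_temp(t, L=1e-3, R=0.5)       # synthetic "observation"

# jointly estimate depth and reflection coefficient from the data
Ls = np.linspace(2e-4, 2e-3, 50)
Rs = np.linspace(0.1, 0.9, 50)
sse, L_fit, R_fit = min((float(np.sum((surface_temp(t, L, R) - data)**2)), L, R)
                        for L in Ls for R in Rs)
```

The point of fitting R rather than fixing it is visible even in this toy: treating R as unknown lets the estimator absorb material variability that would otherwise bias the recovered depth.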
NASA Astrophysics Data System (ADS)
De Filippis, G.; Noël, J. P.; Kerschen, G.; Soria, L.; Stephan, C.
2017-09-01
The introduction of the frequency-domain nonlinear subspace identification (FNSI) method in 2013 constitutes one in a series of recent attempts toward developing a realistic, first-generation framework applicable to complex structures. While this method showed promising capabilities when applied to academic structures, it is still confronted with a number of limitations that need to be addressed. In particular, the removal of nonphysical poles in the identified nonlinear models is a distinct challenge. In the present paper, it is proposed as a first contribution to operate directly on the identified state-space matrices to carry out spurious pole removal. A modal-space decomposition of the state and output matrices is examined to discriminate genuine from numerical poles, prior to estimating the extended input and feedthrough matrices. The final state-space model thus contains physical information only and naturally leads to nonlinear coefficients free of spurious variations. Besides spurious variations due to nonphysical poles, vibration modes lying outside the frequency band of interest may also produce drifts of the nonlinear coefficients. The second contribution of the paper is to include residual terms accounting for the existence of these modes. The proposed improved FNSI methodology is validated numerically and experimentally using a full-scale structure, the Morane-Saulnier Paris aircraft.
A robust, efficient equidistribution 2D grid generation method
NASA Astrophysics Data System (ADS)
Chacon, Luis; Delzanno, Gian Luca; Finn, John; Chung, Jeojin; Lapenta, Giovanni
2007-11-01
We present a new cell-area equidistribution method for two-dimensional grid adaptation [1]. The method is able to satisfy the equidistribution constraint to arbitrary precision while optimizing desired grid properties (such as isotropy and smoothness). The method is based on the minimization of the grid smoothness integral, constrained to producing a given positive-definite cell volume distribution. The procedure gives rise to a single, non-linear scalar equation with no free parameters. We solve this equation numerically with the Newton-Krylov technique. The ellipticity property of the linearized scalar equation allows multigrid preconditioning techniques to be used effectively. We demonstrate that a solution exists and is unique. Therefore, once the solution is found, the adapted grid cannot be folded, due to the positivity of the constraint on the cell volumes. We present several challenging tests to show that our new method produces optimal grids in which the constraint is satisfied numerically to arbitrary precision. We also compare the new method to the deformation method [2] and show that our new method produces better quality grids. [1] G.L. Delzanno, L. Chacón, J.M. Finn, Y. Chung, G. Lapenta, A new, robust equidistribution method for two-dimensional grid generation, in preparation. [2] G. Liao and D. Anderson, A new approach to grid generation, Appl. Anal. 44, 285-297 (1992).
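A 1D analogue of equidistribution makes the constraint concrete: place grid nodes so every cell carries an equal share of a weight function. Here SciPy's `newton_krylov` stands in for the paper's multigrid-preconditioned Newton-Krylov solver, and the density w(x) = 1 + x is an arbitrary example:

```python
import numpy as np
from scipy.optimize import newton_krylov

N = 20
total = 1.5                      # integral of the sample density w(x) = 1 + x

def W(x):
    """Cumulative weight W(x) = integral of w from 0 to x, for w(x) = 1 + x."""
    return x + 0.5 * x**2

def residual(xi):
    """Equidistribution: interior node i sits at the i/N weight quantile."""
    i = np.arange(1, N)
    return W(xi) - i * total / N

x0 = np.linspace(0.0, 1.0, N + 1)[1:-1]      # uniform initial guess
xi = newton_krylov(residual, x0, f_tol=1e-10)
# exact nodes satisfy W(x_i) = i*total/N, i.e. x_i = sqrt(1 + 3i/N) - 1
```

In 1D the constraint pins the grid completely; the 2D method in the abstract has the extra freedom that the smoothness functional resolves, which is what the single scalar equation encodes. The monotonicity of the solved nodes mirrors the paper's no-folding guarantee.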
NASA Astrophysics Data System (ADS)
Dehghan, Ali Naghi; Goshtasbi, Kamran; Ahangari, Kaveh; Jin, Yan; Bahmani, Aram
2017-02-01
A variety of 3D numerical models were developed based on hydraulic fracture experiments to simulate the propagation of a hydraulic fracture at its intersection with a natural (pre-existing) fracture. Since the interaction between hydraulic and pre-existing fractures is a key condition that causes complex fracture patterns, the extended finite element method was employed in ABAQUS software to simulate the problem. The propagation of the hydraulic fracture in a fractured medium was modeled at two horizontal differential stresses (Δσ) of 5 and 10 MPa, considering different strike and dip angles of the pre-existing fracture. The rate of energy release was calculated in the directions of the hydraulic and pre-existing fractures (G_frac/G_rock) at their intersection point to determine the fracture behavior. Opening and crossing were the two dominant fracture behaviors during the hydraulic and pre-existing fracture interaction at low and high differential stress conditions, respectively. The results of the numerical studies were compared with those of experimental models, showing a good agreement between the two and validating the accuracy of the models. Besides the horizontal differential stress and the strike and dip angles of the natural (pre-existing) fracture, the key finding of this research was the significant effect of the energy release rate on the propagation behavior of the hydraulic fracture. This effect was more prominent under the influence of strike and dip angles, as well as differential stress. The obtained results can be used to predict and interpret the generation of complex hydraulic fracture patterns in field conditions.
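The ratio G_frac/G_rock lends itself to a simple decision rule at the intersection point. The direction of the inequality and the unit threshold below are assumptions for illustration only; the paper's criterion is evaluated from the XFEM solution, not from a fixed cutoff:

```python
def interaction_mode(G_frac, G_rock):
    """Hedged sketch of an energy-release-rate criterion: if the release
    rate along the pre-existing fracture dominates, assume the hydraulic
    fracture diverts into it ("opening"); otherwise it crosses. The
    inequality direction and unit threshold are illustrative assumptions."""
    return "opening" if G_frac / G_rock > 1.0 else "crossing"
```

A rule of this shape reproduces the abstract's trend qualitatively: conditions that raise the relative release rate along the natural fracture favor opening, while high differential stress favors crossing.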
NSLS-II HIGH LEVEL APPLICATION INFRASTRUCTURE AND CLIENT API DESIGN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen, G.; Yang, L.
2011-03-28
The beam commissioning software framework of the NSLS-II project adopts a client/server based architecture to replace the more traditional monolithic high level application approach. It is an open structure platform, and we try to provide a narrow API set for client applications. With this narrow API, existing applications developed in different languages under different architectures can be ported to our platform with minor modification. This paper describes the system infrastructure design, client API and system integration, and the latest progress. As a new 3rd generation synchrotron light source with ultra low emittance, there are new requirements and challenges in controlling and manipulating the beam. A use case study and a theoretical analysis have been performed to clarify the requirements and challenges for the high level application (HLA) software environment. To satisfy those requirements and challenges, an adequate system architecture of the software framework is critical for beam commissioning, study and operation. The existing traditional approaches are self-consistent and monolithic. Some of them have adopted a concept of middle layer to separate low level hardware processing from numerical algorithm computing, physics modelling, data manipulation, plotting, and error handling. However, none of the existing approaches can satisfy the requirements. A new design has been proposed by introducing service oriented architecture technology. The HLA is a combination of tools for accelerator physicists and operators, as in the traditional approach. In NSLS-II, they include monitoring applications and control routines. A scripting environment is very important for the latter part of the HLA, and both parts are designed based on a common set of APIs. Physicists and operators are the users of these APIs, while control system engineers and a few accelerator physicists are their developers.
With our client/server based approach, we leave how to retrieve information to the developers of the APIs, and how to use them to form a physics application to the users. For example, how the channels are related to a magnet, and what the current real-time setting of a magnet is in physics units, are internals of the APIs; a routine that measures chromaticities is a user of the APIs. All users of the APIs work with magnet and instrument names in physics units, and low level communications in current or voltage units are minimized. In this paper, we discuss the recent progress of our infrastructure development and the client API.
Severe storms forecast systems
NASA Technical Reports Server (NTRS)
Kaplan, M.; Zack, J.
1980-01-01
Two research tasks are described: (1) the improvement and enhancement of an existing mesoscale numerical simulation system, and (2) numerical diagnostic studies associated with an individual case of severe storm development (April 10, 1979 in the Red River Valley of Texas and Oklahoma).
A numerical study of mixing in supersonic combustors with hypermixing injectors
NASA Technical Reports Server (NTRS)
Lee, J.
1993-01-01
A numerical study was conducted to evaluate the performance of wall mounted fuel-injectors designed for potential Supersonic Combustion Ramjet (SCRAM-jet) engine applications. The focus of this investigation was to numerically simulate existing combustor designs for the purpose of validating the numerical technique and the physical models developed. Three different injector designs of varying complexity were studied to fully understand the computational implications involved in accurate predictions. A dual transverse injection system and two streamwise injector designs were studied. The streamwise injectors were designed with swept ramps to enhance fuel-air mixing and combustion characteristics at supersonic speeds without the large flow blockage and drag contribution of the transverse injection system. For this study, the Mass-Averaged Navier-Stokes equations and the chemical species continuity equations were solved. The computations were performed using a finite-volume implicit numerical technique and multiple block structured grid system. The interfaces of the multiple block structured grid systems were numerically resolved using the flux-conservative technique. Detailed comparisons between the computations and existing experimental data are presented. These comparisons show that numerical predictions are in agreement with the experimental data. These comparisons also show that a number of turbulence model improvements are needed for accurate combustor flowfield predictions.
A numerical study of mixing in supersonic combustors with hypermixing injectors
NASA Technical Reports Server (NTRS)
Lee, J.
1992-01-01
A numerical study was conducted to evaluate the performance of wall mounted fuel-injectors designed for potential Supersonic Combustion Ramjet (SCRAM-jet) engine applications. The focus of this investigation was to numerically simulate existing combustor designs for the purpose of validating the numerical technique and the physical models developed. Three different injector designs of varying complexity were studied to fully understand the computational implications involved in accurate predictions. A dual transverse injection system and two streamwise injector designs were studied. The streamwise injectors were designed with swept ramps to enhance fuel-air mixing and combustion characteristics at supersonic speeds without the large flow blockage and drag contribution of the transverse injection system. For this study, the Mass-Averaged Navier-Stokes equations and the chemical species continuity equations were solved. The computations were performed using a finite-volume implicit numerical technique and multiple block structured grid system. The interfaces of the multiple block structured grid systems were numerically resolved using the flux-conservative technique. Detailed comparisons between the computations and existing experimental data are presented. These comparisons show that numerical predictions are in agreement with the experimental data. These comparisons also show that a number of turbulence model improvements are needed for accurate combustor flowfield predictions.
Low Reynolds number two-equation modeling of turbulent flows
NASA Technical Reports Server (NTRS)
Michelassi, V.; Shih, T.-H.
1991-01-01
A k-epsilon model that accounts for viscous and wall effects is presented. The proposed formulation does not contain the local wall distance, which greatly simplifies its application to complex geometries. The formulation is based on an existing k-epsilon model that proved to fit very well the results of direct numerical simulation. The new form is compared with nine different two-equation models and with direct numerical simulation for a fully developed channel flow at Re = 3300. The simple flow configuration allows a comparison free from numerical inaccuracies. The computed results show that few of the considered forms exhibit a satisfactory agreement with the channel flow data. The model shows an improvement with respect to the existing formulations.
ERIC Educational Resources Information Center
Agus, Mirian; Peró-Cebollero, Maribel; Penna, Maria Pietronilla; Guàrdia-Olmos, Joan
2015-01-01
This study aims to investigate about the existence of a graphical facilitation effect on probabilistic reasoning. Measures of undergraduates' performances on problems presented in both verbal-numerical and graphical-pictorial formats have been related to visuo-spatial and numerical prerequisites, to statistical anxiety, to attitudes towards…
Effect of Atmospheric Absorption Bands on the Optimal Design of Multijunction Solar Cells
DOE Office of Scientific and Technical Information (OSTI.GOV)
McMahon, William E.; Friedman, Daniel J.; Geisz, John F.
Designing terrestrial multijunction (MJ) cells with 5+ junctions is challenging, in part because the presence of atmospheric absorption bands creates a design space with numerous local maxima. Here we introduce a new taxonomical structure which facilitates both numerical convergence and the visualization of the resulting designs.
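The "numerous local maxima" problem can be illustrated with a toy figure of merit: a smooth bandgap response with a narrow dip standing in for an atmospheric absorption band. All shapes and constants below are made up; the point is only that a coarse global scan must locate the right basin before any local refinement:

```python
import numpy as np

def efficiency(E):
    """Toy single-junction figure of merit vs bandgap E (eV): a smooth
    response minus a narrow notch representing an atmospheric absorption
    band (illustrative shapes and constants, not solar-spectrum data)."""
    base = np.exp(-(E - 1.4)**2 / 0.3)
    notch = 0.5 * np.exp(-(E - 1.15)**2 / 0.002)
    return base - notch

E = np.linspace(0.7, 2.2, 2000)
vals = efficiency(E)
E_best = E[np.argmax(vals)]
# the notch splits the design space into separate basins with their own
# local maxima; a gradient climb started in the wrong basin stalls there
```

With 5+ junctions the same effect compounds across every junction's bandgap, which is why a structure that organizes the basins (the "taxonomy" in the abstract) helps convergence.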
A structural classification for inland northwest forest vegetation.
Kevin L. O' Hara; Penelope A. Latham; Paul Hessburg; Bradley G. Smith
1996-01-01
Existing approaches to vegetation classification range from those based on potential vegetation to others based on existing vegetation composition, or existing structural or physiognomic characteristics. Examples of these classifications are numerous and, in some cases, date back hundreds of years (Mueller-Dombois and Ellenberg 1974). Small-scale or stand level...
CHALLENGES FOR THE FUTURE IN ENVIRONMENTAL MUTAGENESIS
Michael D. Waters
US Environmental Protection Agency, MD-51A, Research Triangle Park, NC 27711 USA
Our rapidly growing understanding of the structure of the human genome is forming the basis for numerous new...
Progress Towards a Time-Dependent Theory of Solar Meridional Flows
NASA Astrophysics Data System (ADS)
Shirley, James H.
2017-08-01
Large-scale meridional motions of solar materials play an important role in flux transport dynamo models. Meridional flows transport surface magnetic flux to polar regions of the Sun, where it may later be subducted and conveyed back towards the equatorial region by a deep return flow in the convection zone. The transported flux may thereafter lead to the generation of new toroidal fields, thereby completing the dynamo cycle. More than two decades of observations have revealed that meridional flow speeds vary substantially with time. Further, a complex morphological variability of meridional flow cells is now recognized, with multiple cell structures detected both in latitude and in depth. 'Countercells' with reversed flow directions have been detected at various times. Flow speeds are apparently influenced by the proximity of flows to active regions. This complexity represents a considerable challenge to dynamo modeling efforts. Flow morphology and speed changes may be arbitrarily prescribed in models, but the physical realism of the model outputs may be questionable and elusive: the models are 'trying to hit a moving target.' Considerations such as these led Belucz et al. (2013; Ap. J. 806:169) to call for "time-dependent theories that can tell us theoretically how this circulation may change its amplitude and form in each hemisphere." Such a theory now exists for planetary atmospheres (Shirley, 2017; Plan. Sp. Sci. 141, 1-16). Proof of concept for the non-tidal orbit-spin coupling hypothesis of Shirley (2017) was obtained through numerical modeling of the atmospheric circulation of Mars (Mischna & Shirley, 2017; Plan. Sp. Sci. 141, 45-72), which demonstrated much-improved correspondence of numerical modeling outcomes with observations. In this presentation we briefly review the physical hypothesis and some prior evidence of its possible role in solar dynamo excitation.
We show a strong correlation between observed meridional flow speeds of magnetic features in Cycle 23 with the putative dynamical forcing function. We will also briefly discuss the potential for incorporating orbit-spin coupling accelerations within existing numerical solar dynamo models.
The ecology, distribution, conservation and management of large old trees.
Lindenmayer, David B; Laurance, William F
2017-08-01
Large old trees are some of the most iconic biota on earth and are integral parts of many terrestrial ecosystems including those in tropical, temperate and boreal forests, deserts, savannas, agro-ecological areas, and urban environments. In this review, we provide new insights into the ecology, function, evolution and management of large old trees through broad cross-disciplinary perspectives from literatures in plant physiology, growth and development, evolution, habitat value for fauna and flora, and conservation management. Our review reveals that the diameter, height and longevity of large old trees varies greatly on an inter-specific basis, thereby creating serious challenges in defining large old trees and demanding an ecosystem- and species-specific definition that will only rarely be readily transferable to other species or ecosystems. Such variation is also manifested by marked inter-specific differences in the key attributes of large old trees (beyond diameter and height) such as the extent of buttressing, canopy architecture, the extent of bark micro-environments and the prevalence of cavities. We found that large old trees play an extraordinary range of critical ecological roles including in hydrological regimes, nutrient cycles and numerous ecosystem processes. Large old trees strongly influence the spatial and temporal distribution and abundance of individuals of the same species and populations of numerous other plant and animal species. We suggest many key characteristics of large old trees such as extreme height, prolonged lifespans, and the presence of cavities - which confer competitive and evolutionary advantages in undisturbed environments - can render such trees highly susceptible to a range of human influences. Large old trees are vulnerable to threats ranging from droughts, fire, pests and pathogens, to logging, land clearing, landscape fragmentation and climate change. 
Tackling such diverse threats is challenging because they often interact and manifest in different ways in different ecosystems, demanding targeted species- or ecosystem-specific responses. We argue that novel management actions will often be required to protect existing large old trees and ensure the recruitment of new cohorts of such trees. For example, fine-scale tree-level conservation such as buffering individual stems will be required in many environments such as in agricultural areas and urban environments. Landscape-level approaches like protecting places where large old trees are most likely to occur will be needed. However, this brings challenges associated with likely changes in tree distributions associated with climate change, because long-lived trees may presently exist in places unsuitable for the development of new cohorts of the same species. Appropriate future environmental domains for a species could exist in new locations where it has never previously occurred. The future distribution and persistence of large old trees may require controversial responses including assisted migration via seed or seedling establishment in new locales. However, the effectiveness of such approaches may be limited where key ecological features of large old trees (such as cavity presence) depend on other species such as termites, fungi and bacteria. Unless other species with similar ecological roles are present to fulfil these functions, these taxa might need to be moved concurrently with the target tree species. © 2016 Cambridge Philosophical Society.
An interlaboratory transfer of a multi-analyte assay between continents.
Georgiou, Alexandra; Dong, Kelly; Hughes, Stephen; Barfield, Matthew
2015-01-01
Alex has worked at GlaxoSmithKline for the past 15 years and currently works within the bioanalytical and toxicokinetic group in the United Kingdom. Alex's role in previous years has been the in-house support of preclinical and clinical bioanalysis, from method development through to sample analysis activities, as well as acting as PI for GLP bioanalysis and toxicokinetics. For the past two years, Alex has applied this analytical and regulatory experience to focus on the outsourcing of preclinical bioanalysis, toxicokinetics and clinical bioanalysis, working closely with multiple bioanalytical and in-life CRO partners worldwide. Alex works to support DMPK and Safety Assessment outsourcing activities for GSK across multiple therapeutic areas, from the first GLP study through to late stage clinical PK studies. Transfer and cross-validation of an existing analytical assay between a laboratory providing current analytical support, and a laboratory needed for new or additional support, can present the bioanalyst with numerous challenges. These challenges can be technical or logistical in nature and may prove to be significant when transferring an assay between laboratories in different continents. Part of GlaxoSmithKline's strategy to improve confidence in providing quality data is to cross-validate between laboratories. If the cross-validation fails predefined acceptance criteria, then a subsequent investigation would follow. This may also prove to be challenging. The importance of thorough planning and good communication throughout assay transfer, cross-validation and any subsequent investigations is illustrated in this case study.
The Forced Soft Spring Equation
ERIC Educational Resources Information Center
Fay, T. H.
2006-01-01
Through numerical investigations, this paper studies examples of the forced Duffing type spring equation with [epsilon] negative. By performing trial-and-error numerical experiments, the existence is demonstrated of stability boundaries in the phase plane indicating initial conditions yielding bounded solutions. Subharmonic boundaries are…
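The trial-and-error experiments the abstract describes are easy to reproduce in outline. The sketch below (an illustration under assumed parameter values, not the paper's own code) integrates a forced soft spring equation x'' + x + [epsilon]x^3 = F cos([omega]t) with [epsilon] = -1 using a classical RK4 stepper, and probes initial conditions on either side of the stability boundary:

```python
# RK4 integration of the forced soft spring  x'' + x + eps*x**3 = F*cos(omega*t),
# with eps < 0 ("soft" spring), to probe which initial conditions stay bounded.
import math

def rk4_trajectory(x0, v0, eps=-1.0, F=0.0, omega=1.0, dt=0.01, t_max=50.0):
    """Integrate with classical RK4; stop early if |x| exceeds an escape radius."""
    def accel(t, x, v):
        return -x - eps * x**3 + F * math.cos(omega * t)
    t, x, v = 0.0, x0, v0
    xs = [x]
    while t < t_max:
        k1x, k1v = v, accel(t, x, v)
        k2x, k2v = v + 0.5*dt*k1v, accel(t + 0.5*dt, x + 0.5*dt*k1x, v + 0.5*dt*k1v)
        k3x, k3v = v + 0.5*dt*k2v, accel(t + 0.5*dt, x + 0.5*dt*k2x, v + 0.5*dt*k2v)
        k4x, k4v = v + dt*k3v, accel(t + dt, x + dt*k3x, v + dt*k3v)
        x += dt * (k1x + 2*k2x + 2*k3x + k4x) / 6
        v += dt * (k1v + 2*k2v + 2*k3v + k4v) / 6
        t += dt
        xs.append(x)
        if abs(x) > 50.0:       # solution has escaped the potential well
            break
    return xs

# Trial-and-error probing of the stability boundary (unforced case, eps = -1):
# the potential barrier sits at x = +/-1, so x0 = 0.5 stays bounded, x0 = 2 escapes.
small = rk4_trajectory(0.5, 0.0)
large = rk4_trajectory(2.0, 0.0)
print(max(abs(x) for x in small), max(abs(x) for x in large))
```

Setting F > 0 and varying (x0, v0) over a grid of initial conditions reproduces, in spirit, the phase-plane stability boundaries the paper maps numerically.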
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bailey, David
In the January 2002 edition of SIAM News, Nick Trefethen announced the '$100, 100-Digit Challenge'. In this note he presented ten easy-to-state but hard-to-solve problems of numerical analysis, and challenged readers to find each answer to ten-digit accuracy. Trefethen closed with the enticing comment: 'Hint: They're hard! If anyone gets 50 digits in total, I will be impressed.' This challenge obviously struck a chord in hundreds of numerical mathematicians worldwide, as 94 teams from 25 nations later submitted entries. Many of these submissions exceeded the target of 50 correct digits; in fact, 20 teams achieved a perfect score of 100 correct digits. Trefethen had offered $100 for the best submission. Given the overwhelming response, a generous donor (William Browning, founder of Applied Mathematics, Inc.) provided additional funds to provide a $100 award to each of the 20 winning teams. Soon after the results were out, four participants, each from a winning team, got together and agreed to write a book about the problems and their solutions. The team is truly international: Bornemann is from Germany, Laurie is from South Africa, Wagon is from the USA, and Waldvogel is from Switzerland. This book provides some mathematical background for each problem, and then shows in detail how each of them can be solved. In fact, multiple solution techniques are mentioned in each case. The book describes how to extend these solutions to much larger problems and much higher numeric precision (hundreds or thousands of digit accuracy). The authors also show how to compute error bounds for the results, so that one can say with confidence that one's results are accurate to the level stated. Numerous numerical software tools are demonstrated in the process, including the commercial products Mathematica, Maple and Matlab. Computer programs that perform many of the algorithms mentioned in the book are provided, both in an appendix to the book and on a website.
In the process, the authors take the reader on a wide-ranging tour of modern numerical mathematics, with enough background material so that even readers with little or no training in numerical analysis can follow. Here is a list of just a few of the topics visited: numerical quadrature (i.e., numerical integration), series summation, sequence extrapolation, contour integration, Fourier integrals, high-precision arithmetic, interval arithmetic, symbolic computing, numerical linear algebra, perturbation theory, Euler-Maclaurin summation, global minimization, eigenvalue methods, evolutionary algorithms, matrix preconditioning, random walks, special functions, elliptic functions, Monte-Carlo methods, and numerical differentiation.
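Among the topics listed, sequence extrapolation and Euler-Maclaurin summation are easy to demonstrate compactly. The Python sketch below (illustrative, not taken from the book) accelerates the slowly convergent series sum 1/k^2 = pi^2/6 by appending an Euler-Maclaurin estimate of the tail:

```python
# Series acceleration: the raw partial sum of 1/k^2 converges like 1/N, but an
# Euler-Maclaurin correction for the tail recovers near machine precision.
import math

def partial_sum(N):
    return sum(1.0 / k**2 for k in range(1, N + 1))

def accelerated(N):
    # Euler-Maclaurin estimate of the tail sum_{k>N} 1/k^2
    tail = 1.0/N - 1.0/(2*N**2) + 1.0/(6*N**3) - 1.0/(30*N**5)
    return partial_sum(N) + tail

exact = math.pi**2 / 6
raw_err = abs(partial_sum(1000) - exact)   # about 1e-3: first-order convergence
acc_err = abs(accelerated(1000) - exact)   # near machine precision
print(raw_err, acc_err)
```

The raw partial sum of 1000 terms is accurate to only about three digits, while the corrected value is accurate to near machine precision; the book's solutions push the same ideas much further using high-precision arithmetic.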
2007-12-01
tremendous opportunity to support telesurgical care using mobile systems, where communication assets are challenging... challenged with constraints of remoteness, limited resources, and limited technical expertise. This telesurgery research, funded by TATRC and reported... utilized iChat (V.2.1.3) for the Apple Macintosh. The challenges of implementing the technology were numerous. Beyond the fact that
ERIC Educational Resources Information Center
Chung, Kyong-Mee; Jung, Woohyun; Yang, Jae-won; Ben-Itzchak, Esther; Zachor, Ditza A.; Furniss, Frederick; Heyes, Katie; Matson, Johnny L.; Kozlowski, Alison M.; Barker, Alyse A.
2012-01-01
Challenging behaviors are deemed extremely common within the autism spectrum disorders (ASD) population. Numerous factors and their effects upon the presence and severity of challenging behaviors within this population have been investigated. However, there has been limited research to investigate the effects of cultural differences on challenging…
Aerodynamic shape optimization using control theory
NASA Technical Reports Server (NTRS)
Reuther, James
1996-01-01
Aerodynamic shape design has long persisted as a difficult scientific challenge due to its highly nonlinear flow physics and daunting geometric complexity. However, with the emergence of Computational Fluid Dynamics (CFD) it has become possible to make accurate predictions of flows which are not dominated by viscous effects. It is thus worthwhile to explore the extension of CFD methods for flow analysis to the treatment of aerodynamic shape design. Two new aerodynamic shape design methods are developed which combine existing CFD technology, optimal control theory, and numerical optimization techniques. Flow analysis methods for the potential flow equation and the Euler equations form the basis of the two respective design methods. In each case, optimal control theory is used to derive the adjoint differential equations, the solution of which provides the necessary gradient information to a numerical optimization method much more efficiently than by conventional finite differencing. Each technique uses a quasi-Newton numerical optimization algorithm to drive an aerodynamic objective function toward a minimum. An analytic grid perturbation method is developed to modify body fitted meshes to accommodate shape changes during the design process. Both Hicks-Henne perturbation functions and B-spline control points are explored as suitable design variables. The new methods prove to be computationally efficient and robust, and can be used for practical airfoil design including geometric and aerodynamic constraints. Objective functions are chosen to allow both inverse design to a target pressure distribution and wave drag minimization. Several design cases are presented for each method illustrating its practicality and efficiency. These include non-lifting and lifting airfoils operating at both subsonic and transonic conditions.
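The efficiency argument for the adjoint approach can be seen in miniature. In the hedged sketch below (a toy linear "state equation", not the report's flow solvers), the objective J(u(a)) is constrained by A(a)u = b; a single adjoint solve yields dJ/da, which is checked against central finite differencing:

```python
# Toy adjoint-based gradient: J(u(a)) with state equation A(a) u = b.
# Differentiating A u = b gives u' = -A^{-1} A' u, so
# dJ/da = -lam . (A' u)  where  A^T lam = dJ/du.

def solve2(A, b):                      # Cramer's rule for a 2x2 system
    det = A[0][0]*A[1][1] - A[0][1]*A[1][0]
    return [(b[0]*A[1][1] - A[0][1]*b[1]) / det,
            (A[0][0]*b[1] - b[0]*A[1][0]) / det]

def state_matrix(a):
    # A is deliberately non-symmetric so the transpose in the adjoint matters
    return [[2.0 + a, 1.0], [0.5, 3.0]]

def objective(a):
    u = solve2(state_matrix(a), [1.0, 2.0])
    return u[0]**2 + u[1]**2

def adjoint_gradient(a):
    A = state_matrix(a)
    u = solve2(A, [1.0, 2.0])
    At = [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]
    lam = solve2(At, [2*u[0], 2*u[1]])        # adjoint solve: A^T lam = dJ/du
    # dA/da has a single nonzero entry (0,0) = 1, so A' u = [u0, 0]
    return -lam[0] * u[0]

a0 = 0.3
fd = (objective(a0 + 1e-6) - objective(a0 - 1e-6)) / 2e-6
print(adjoint_gradient(a0), fd)
```

The payoff mirrors the report's: the adjoint cost is one extra solve regardless of the number of design variables, whereas finite differencing needs two state solves per variable.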
A numerical study of the 3-periodic wave solutions to KdV-type equations
NASA Astrophysics Data System (ADS)
Zhang, Yingnan; Hu, Xingbiao; Sun, Jianqing
2018-02-01
In this paper, by using the direct method of calculating periodic wave solutions proposed by Akira Nakamura, we present a numerical process to calculate the 3-periodic wave solutions to several KdV-type equations: the Korteweg-de Vries equation, the Sawada-Kotera equation, the Boussinesq equation, the Ito equation, the Hietarinta equation and the (2 + 1)-dimensional Kadomtsev-Petviashvili equation. Detailed numerical examples are given to demonstrate the existence of the 3-periodic wave solutions.
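Nakamura's direct method expresses periodic waves through theta functions, which is beyond a short example, but the underlying residual check is simple. As an illustrative stand-in, the sketch below verifies numerically (via central differences) that the one-soliton solution satisfies the KdV equation u_t + 6uu_x + u_xxx = 0:

```python
# Finite-difference residual check of an exact KdV solution.
import math

def u(x, t, c=1.0):
    """One-soliton solution of the KdV equation u_t + 6 u u_x + u_xxx = 0."""
    return 0.5 * c / math.cosh(0.5 * math.sqrt(c) * (x - c * t)) ** 2

def kdv_residual(x, t, h=1e-2):
    u_t = (u(x, t + h) - u(x, t - h)) / (2 * h)
    u_x = (u(x + h, t) - u(x - h, t)) / (2 * h)
    u_xxx = (u(x + 2*h, t) - 2*u(x + h, t)
             + 2*u(x - h, t) - u(x - 2*h, t)) / (2 * h**3)
    return u_t + 6 * u(x, t) * u_x + u_xxx

# Residual is O(h^2)-small at arbitrary sample points, confirming the solution.
print(max(abs(kdv_residual(x, 0.7)) for x in [-2.0, -0.5, 0.0, 1.3, 3.0]))
```

The same residual test, applied to candidate theta-function expressions, is essentially how one confirms a computed 3-periodic wave solution numerically.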
Improved numerical methods for turbulent viscous recirculating flows
NASA Technical Reports Server (NTRS)
Vandoormaal, J. P.; Turan, A.; Raithby, G. D.
1986-01-01
The objective of the present study is to improve both the accuracy and computational efficiency of existing numerical techniques used to predict viscous recirculating flows in combustors. A review of the status of the study is presented along with some illustrative results. The effort to improve the numerical techniques consists of the following technical tasks: (1) selection of numerical techniques to be evaluated; (2) two dimensional evaluation of selected techniques; and (3) three dimensional evaluation of technique(s) recommended in Task 2.
Analysis, results and conclusion of magnetotelluric data acquired in northern Switzerland
NASA Astrophysics Data System (ADS)
Shah, Neeraj; Samrock, Friedemann; Grayver, Alexander; Saar, Martin O.
2017-04-01
In early 2016, a magnetotelluric (MT) survey of the Aargau region of northern Switzerland was performed in order to understand the potential of the magnetotelluric method to characterise the electrical resistivity properties of the subsurface in Switzerland, and more widely, in areas with high amounts of cultural electromagnetic (EM) noise. Subsurface electrical resistivity properties are interesting as they can help identify underground aquifers or geothermal resources and possibly provide insight into the large-scale movement of fluid. The north of Switzerland is a challenging and representative environment, with significant EM infrastructure, including powerlines and numerous other sources of noise related to human activity and use of industrial equipment. Here, we present the results of the survey together with detailed analysis of the issues encountered and challenges faced when doing this survey. In particular, we concentrate on data quality issues in the raw time series, the impact of using a remote reference over single site processing and the distribution of transfer functions. The final set of transfer functions for the survey, which includes twelve successful sites, is shown to suffer from noise issues in certain frequency ranges. A 1-D inversion of SSQ averaged transfer functions and comparison to existing borehole data demonstrates that geologic data is captured in the MT data recorded in northern Switzerland. Further, 2-D forward modelling supports the idea that good geologic information exists in the data despite the noise issues, which, for now, impede a robust multi-dimensional inversion. Finally, suggestions for future work and methods to improve the quality of data when surveying in high EM noise environments are offered with a view to being able to reliably perform MT surveys closer to urban environments.
Building energy governance in Shanghai
NASA Astrophysics Data System (ADS)
Kung, YiHsiu Michelle
With Asia's surging economies and urbanization, the region is adding to its built environment at an unprecedented rate, especially those population centers in China and India. With numerous existing buildings, plus a new building boom, construction in these major Asian cities has caused momentous sustainability challenges. This dissertation focuses on China's leading city, Shanghai, to explore and assess its existing commercial building energy policies and practices. Research estimates that Shanghai's commercial buildings might become a key challenge with regard to energy use and CO2 emissions as compared to other major Asian cities. Relevant building energy policy instruments at national and local levels for commercial buildings are reviewed. In addition, two benchmarks are established to further assess building energy policies in Shanghai. The first benchmark is based on the synthesis of relevant criteria and policy instruments as recommended by professional organizations, while the second practical benchmark is drawn from an analysis of three global cities: New York, London and Tokyo. Moreover, two large-scale commercial building sites - Shanghai IKEA and Plaza 66 - are selected for investigation and assessment of their efforts on building energy saving measures. Detailed building energy savings, CO2 reductions, and management cost reductions based on data availability and calculations are presented with the co-benefits approach. The research additionally analyzes different interventions and factors that facilitate or constrain the implementation process of building energy saving measures in each case. Furthermore, a multi-scale analytical framework is employed to investigate relevant stakeholders that shape Shanghai's commercial building energy governance. Research findings and policy recommendations are offered at the close of this dissertation. 
Findings and policy recommendations are intended to facilitate commercial building energy governance in Shanghai and other rapidly growing second-tier or third-tier cities in China, and to further contribute to the general body of knowledge on Asia's urban building sustainability.
The Challenges in Measuring Local Immunization Coverage: A Statewide Case Study
Rowhani-Rahbar, Ali; Duchin, Jeffrey; DeHart, M. Patricia; Opel, Douglas
2016-01-01
There are many forms of existing immunization surveillance in the United States and Washington state, but all are limited in their ability to provide timely identification of clusters of unimmunized individuals and assess the risk of vaccine-preventable diseases. This article aims to: (1) describe challenges to measuring immunization coverage at a local level in the United States using Washington State as a case study; and (2) propose improvements to existing surveillance systems that address the challenges identified. PMID:27244807
ERIC Educational Resources Information Center
Fay, Temple H.
2010-01-01
Through numerical investigations, we study examples of the forced quadratic spring equation [image omitted]. By performing trial-and-error numerical experiments, we demonstrate the existence of stability boundaries in the phase plane indicating initial conditions yielding bounded solutions, investigate the resonance boundary in the [omega]…
New challenges for Life Sciences flight project management
NASA Technical Reports Server (NTRS)
Huntoon, C. L.
1999-01-01
Scientists have conducted studies involving human spaceflight crews for over three decades. These studies have progressed from simple observations before and after each flight to sophisticated experiments during flights of several weeks up to several months. The findings from these experiments are available in the scientific literature. Management of these flight experiments has grown into a system fashioned from the Apollo Program style, focusing on budgeting, scheduling and allocation of human and material resources. While these areas remain important to the future, the International Space Station (ISS) requires that the Life Sciences spaceflight experiments expand the existing project management methodology. The use of telescience with state-of-the-art information technology and the multi-national crews and investigators challenges the former management processes. Actually conducting experiments on board the ISS will be an enormous undertaking, and International Agreements and Working Groups will be essential in giving guidance to the flight project management. Teams forged in this matrix environment must be competent to make decisions and qualified to work with the array of engineers, scientists, and the spaceflight crews. In order to undertake this complex task, data systems not previously used for these purposes must be adapted so that the investigators and the project management personnel can all share in important information as soon as it is available. The utilization of telescience and distributed experiment operations will allow the investigator to remain involved in their experiment as well as to understand the numerous issues faced by other elements of the program. The complexity in formation and management of project teams will be a new kind of challenge for international science programs. Meeting that challenge is essential to assure success of the International Space Station as a laboratory in space.
New challenges for Life Sciences flight project management.
Huntoon, C L
1999-01-01
Scientists have conducted studies involving human spaceflight crews for over three decades. These studies have progressed from simple observations before and after each flight to sophisticated experiments during flights of several weeks up to several months. The findings from these experiments are available in the scientific literature. Management of these flight experiments has grown into a system fashioned from the Apollo Program style, focusing on budgeting, scheduling and allocation of human and material resources. While these areas remain important to the future, the International Space Station (ISS) requires that the Life Sciences spaceflight experiments expand the existing project management methodology. The use of telescience with state-of-the-art information technology and the multi-national crews and investigators challenges the former management processes. Actually conducting experiments on board the ISS will be an enormous undertaking, and International Agreements and Working Groups will be essential in giving guidance to the flight project management. Teams forged in this matrix environment must be competent to make decisions and qualified to work with the array of engineers, scientists, and the spaceflight crews. In order to undertake this complex task, data systems not previously used for these purposes must be adapted so that the investigators and the project management personnel can all share in important information as soon as it is available. The utilization of telescience and distributed experiment operations will allow the investigator to remain involved in their experiment as well as to understand the numerous issues faced by other elements of the program. The complexity in formation and management of project teams will be a new kind of challenge for international science programs. Meeting that challenge is essential to assure success of the International Space Station as a laboratory in space.
New challenges for life sciences flight project management
NASA Astrophysics Data System (ADS)
Huntoon, Carolyn L.
1999-09-01
Scientists have conducted studies involving human spaceflight crews for over three decades. These studies have progressed from simple observations before and after each flight to sophisticated experiments during flights of several weeks up to several months. The findings from these experiments are available in the scientific literature. Management of these flight experiments has grown into a system fashioned from the Apollo Program style, focusing on budgeting, scheduling and allocation of human and material resources. While these areas remain important to the future, the International Space Station (ISS) requires that the Life Sciences spaceflight experiments expand the existing project management methodology. The use of telescience with state-of-the-art information technology and the multi-national crews and investigators challenges the former management processes. Actually conducting experiments on board the ISS will be an enormous undertaking, and International Agreements and Working Groups will be essential in giving guidance to the flight project management. Teams forged in this matrix environment must be competent to make decisions and qualified to work with the array of engineers, scientists, and the spaceflight crews. In order to undertake this complex task, data systems not previously used for these purposes must be adapted so that the investigators and the project management personnel can all share in important information as soon as it is available. The utilization of telescience and distributed experiment operations will allow the investigator to remain involved in their experiment as well as to understand the numerous issues faced by other elements of the program. The complexity in formation and management of project teams will be a new kind of challenge for international science programs. Meeting that challenge is essential to assure success of the International Space Station as a laboratory in space.
Multidisciplinary Optimization of a Transport Aircraft Wing using Particle Swarm Optimization
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw; Venter, Gerhard
2002-01-01
The purpose of this paper is to demonstrate the application of particle swarm optimization to a realistic multidisciplinary optimization test problem. The paper's new contributions to multidisciplinary optimization are the application of a new algorithm for dealing with the unique challenges associated with multidisciplinary optimization problems, and recommendations as to the utility of the algorithm in future multidisciplinary optimization applications. The selected example is a bi-level optimization problem that demonstrates severe numerical noise and has a combination of continuous and truly discrete design variables. The use of traditional gradient-based optimization algorithms is thus not practical. The numerical results presented indicate that the particle swarm optimization algorithm is able to reliably find the optimum design for the problem presented here. The algorithm is capable of dealing with the unique challenges posed by multidisciplinary optimization as well as the numerical noise and truly discrete variables present in the current example problem.
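A minimal particle swarm optimizer is only a few dozen lines. The sketch below (a generic gbest PSO with standard inertia and acceleration coefficients, not the paper's implementation) minimizes a sphere function contaminated with deterministic high-frequency "noise" of the kind that defeats gradient-based methods:

```python
# Minimal particle swarm optimizer (gbest topology, inertia weight).
import math
import random

def pso(f, dim=4, n_particles=20, iters=200, lo=-5.0, hi=5.0, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration coefficients
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# A smooth objective plus deterministic high-frequency "noise", mimicking the
# numerical noise that makes gradient-based optimizers impractical.
def noisy_sphere(x):
    return sum(xi * xi for xi in x) + 1e-3 * math.sin(1000.0 * sum(x))

best, best_val = pso(noisy_sphere)
print(best_val)
```

Because the swarm uses only function values (never gradients), the oscillatory term does not mislead the search, which is the property the paper exploits for noisy, mixed continuous/discrete multidisciplinary problems.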
Numerical simulations of strongly correlated electron and spin systems
NASA Astrophysics Data System (ADS)
Changlani, Hitesh Jaiprakash
Developing analytical and numerical tools for strongly correlated systems is a central challenge for the condensed matter physics community. In the absence of exact solutions and controlled analytical approximations, numerical techniques have often contributed to our understanding of these systems. Exact Diagonalization (ED) requires the storage of at least two vectors the size of the Hilbert space under consideration (which grows exponentially with system size), which makes it affordable only for small systems. The Density Matrix Renormalization Group (DMRG) uses an intelligent Hilbert space truncation procedure to significantly reduce this cost, but in its present formulation is limited to quasi-1D systems. Quantum Monte Carlo (QMC) maps the Schrodinger equation to the diffusion equation (in imaginary time) and only samples the eigenvector over time, thereby avoiding the memory limitation. However, the stochasticity involved in the method gives rise to the "sign problem" characteristic of fermion and frustrated spin systems. The first part of this thesis is an effort to make progress in the development of a numerical technique which overcomes the above mentioned problems. We consider novel variational wavefunctions, christened "Correlator Product States" (CPS), that have a general functional form which aims to capture essential correlations in the ground states of spin and fermion systems in any dimension. We also consider a recent proposal to modify projector (Green's Function) Quantum Monte Carlo to ameliorate the sign problem for realistic and model Hamiltonians (such as the Hubbard model). This exploration led to our own set of improvements, primarily a semistochastic formulation of projector Quantum Monte Carlo. Despite their limitations, existing numerical techniques can yield physical insights into a wide variety of problems.
The second part of this thesis considers one such numerical technique - DMRG - and adapts it to study the Heisenberg antiferromagnet on a generic tree graph. Our attention turns to a systematic numerical and semi-analytical study of the effect of local even/odd sublattice imbalance on the low energy spectrum of antiferromagnets on regular Cayley trees. Finally, motivated by previous experiments and theories of randomly diluted antiferromagnets (where an even/odd sublattice imbalance naturally occurs), we present our study of the Heisenberg antiferromagnet on the Cayley tree at the percolation threshold. Our work shows how to detect "emergent" low energy degrees of freedom and compute the effective interactions between them by using data from DMRG calculations.
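The exponential memory cost of Exact Diagonalization mentioned above is easy to make concrete: an n-site spin-1/2 system has a 2^n-dimensional Hilbert space. The sketch below (illustrative, not code from the thesis) diagonalizes the Heisenberg antiferromagnet on a 4-site ring, where the 16x16 Hamiltonian is still trivially storable:

```python
# Exact Diagonalization of the spin-1/2 Heisenberg ring H = sum_i S_i . S_{i+1}.
import numpy as np

# spin-1/2 operators (Pauli matrices / 2)
sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2
sy = np.array([[0, -1j], [1j, 0]]) / 2
sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2

def site_op(op, i, n):
    """Embed a single-site operator at site i in an n-site Hilbert space."""
    mats = [np.eye(2, dtype=complex)] * n
    mats[i] = op
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def heisenberg_ring(n):
    dim = 2 ** n                       # Hilbert space grows exponentially
    H = np.zeros((dim, dim), dtype=complex)
    for i in range(n):
        j = (i + 1) % n                # periodic boundary conditions
        for op in (sx, sy, sz):
            H += site_op(op, i, n) @ site_op(op, j, n)
    return H

H = heisenberg_ring(4)
E0 = np.linalg.eigvalsh(H)[0]
print(E0)   # ground-state energy of the 4-site ring
```

For the 4-site ring, H = (S_1+S_3)·(S_2+S_4), giving a ground-state energy of exactly -2; doubling the system to 8 sites already requires 256x256 matrices, which illustrates the scaling that motivates DMRG and QMC.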
ERIC Educational Resources Information Center
Sesno, Alice Healy
A teacher's professional integrity faces numerous challenges in the classroom. To help educators safeguard against potentially career-ending incidents, numerous "survival rules" are provided in this text. It argues that teachers must safeguard themselves with self-protecting knowledge and, in some instances, must reprogram themselves…
Increasing Work Opportunities for Low-Income Workers through TANF and Economic Development Programs.
ERIC Educational Resources Information Center
Friedman, Pamela
2002-01-01
The numerous layoffs of low-income workers that occurred when the nation's economy slowed in 2001 have created numerous challenges for local Temporary Assistance for Needy Families (TANF) programs. By increasing collaboration between community economic development and workforce development efforts to serve low-income residents, states and…
Technology, Policy, and School Change: The Role of Intermediary Organizations
ERIC Educational Resources Information Center
Forthe, Darrell
2012-01-01
As educators work to advance 21st century teaching and learning in schools, numerous reforms are needed but none greater than the necessity to integrate technology. Technology integration presents complex challenges because numerous changes must take place. The National Education Technology Plan 2010 (NETP) provides a road map for these necessary…
Computing Spacetimes: From Cosmology to Black Holes
NASA Technical Reports Server (NTRS)
Centrella, Joan
2007-01-01
Numerical relativity, the solution of the Einstein equations on a computer, is one of the most challenging and exciting areas of physics. Richard Matzner has played a key role in this subject from its birth, roughly 3 decades ago, to the present. This talk will present some of the highlights of Richard's work in numerical relativity.
Nutrient pollution remains one of the most prevalent causes of water quality impairment in the United States. The U.S. Environmental Protection Agency’s (EPA) approach to addressing the challenge of managing nutrient pollution has included supporting development of numeric nutri...
Traffic Flow Density Distribution Based on FEM
NASA Astrophysics Data System (ADS)
Ma, Jing; Cui, Jianming
In the analysis of normal traffic flow, static or dynamic models based on fluid mechanics are usually employed for numerical analysis. In such an approach, however, the modeling and data-handling burden is large and the accuracy is not high. The Finite Element Method (FEM) is a product of the combination of modern mathematics, mechanics and computer technology, and it has been widely applied in various domains such as engineering. Based on existing traffic flow theory, ITS and the development of FEM, a simulation approach that applies FEM to the problems existing in traffic flow analysis is put forward. Based on this approach, and using existing Finite Element Analysis (FEA) software, traffic flow is simulated and analyzed with fluid mechanics and dynamics. The problem of massive data processing in manual modeling and numerical analysis is resolved, and the authenticity of the simulation is enhanced.
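For readers unfamiliar with FEM, its mechanics can be shown on the smallest possible example. The sketch below (a generic 1-D Poisson problem, not the paper's traffic flow model) assembles and solves -u'' = 1 on (0,1) with u(0) = u(1) = 0, using piecewise-linear elements and a Thomas-algorithm tridiagonal solve:

```python
# Minimal 1-D finite element solver for -u'' = 1 on (0,1), u(0) = u(1) = 0,
# with piecewise-linear elements on a uniform mesh.

def fem_poisson_1d(n_elems):
    h = 1.0 / n_elems
    n = n_elems - 1                     # number of interior nodes
    a = [-1.0 / h] * n                  # sub-diagonal of the stiffness matrix
    b = [2.0 / h] * n                   # diagonal
    c = [-1.0 / h] * n                  # super-diagonal
    d = [h] * n                         # load vector: f = 1, integral of phi_i is h
    # Thomas algorithm: forward elimination, then back substitution
    for i in range(1, n):
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    u = [0.0] * n
    u[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return u                            # solution values at interior nodes

exact = lambda x: 0.5 * x * (1.0 - x)   # analytic solution for comparison
u = fem_poisson_1d(10)
err = max(abs(u[i] - exact((i + 1) / 10.0)) for i in range(9))
print(err)
```

In 1-D the linear-element solution is nodally exact for this problem, so the error at the nodes is pure rounding; the paper's point is that commercial FEA machinery automates exactly this assembly-and-solve pipeline for far larger models.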
ERIC Educational Resources Information Center
Wright, Sharon L.
2013-01-01
Businesses and governmental agencies are increasingly reliant on virtual teams composed of team members in different location. However, such virtual teams face all the interpersonal challenges inherent in working in a group, plus additional challenges that are a consequence from communicating through electronic methods. Numerous technological…
Promoting Social and Emotional Growth of Students with Disabilities
ERIC Educational Resources Information Center
Darrow, Alice-Ann
2014-01-01
Students with disabilities are often faced with numerous challenges as they progress through their school years. In addition to disability-related challenges, they may encounter additional difficulties such as bullying in school and lack of social acceptance by their peers. It is important that students with disabilities develop competence in…
Identifying the potential health hazards to the central nervous system of a new family of materials presents many challenges. Whole-animal toxicity testing has been the tradition, but in vitro methods have been steadily gaining popularity. There are numerous challenges in testing...
Graduate Students' Mental Health: Departmental Contexts as a Source of Differential Risk
ERIC Educational Resources Information Center
La Touche, Rachel A.
2017-01-01
Research in higher education acknowledges academic performance, progress and general health as adversely impacted by mental health challenges. These challenges are consistent with numerous life changes that accompany the student experience, including changes related to work, finances, social interactions and living conditions. Current scholarship…
A Working Model of a New American University
ERIC Educational Resources Information Center
Crow, Michael M.; Loui, Kimberly
2006-01-01
American universities confront unique challenges as they move into the twenty-first century. These include rapid population growth, demographic and economic changes on both global and regional levels, and the numerous local challenges that face today's communities. Modern universities must engage, and in turn be engaged by, their communities in…
Autism and Reading: Teaching a Sudanese Refugee Boy
ERIC Educational Resources Information Center
Walker-Dalhouse, Doris; Dalhouse, A. Derick
2015-01-01
Refugee families in the United States face numerous challenges in becoming acculturated. School-age children of refugees face the additional challenges of acquiring academic language and meeting school expectations for behavior and social interactions while attempting to navigate the school curriculum. This case study examines the school and home…
IPv6 Tactical Network Management
2009-09-01
is transitioning to IPv6 networks. While the benefits provided by IPv6 are numerous, its challenges lie in managing a network on the scale...operability, and usability in a tactical network is under way. New challenges are also presented by the need to integrate into the IPv6 segment new...Accessing this information also presents challenges. Feasibility studies are conducted to show that, for these devices, the IPv6 domain is at least
NASA Astrophysics Data System (ADS)
Ganapathy, Vinay; Ramachandran, Ramesh
2017-10-01
The response of a quadrupolar nucleus (nuclear spin with I > 1/2) to an oscillating radio-frequency pulse/field is delicately dependent on the ratio of the quadrupolar coupling constant to the amplitude of the pulse in addition to its duration and oscillating frequency. Consequently, analytic description of the excitation process in the density operator formalism has remained less transparent within existing theoretical frameworks. As an alternative, the utility of the "concept of effective Floquet Hamiltonians" is explored in the present study to explicate the nuances of the excitation process in multilevel systems. Employing spin I = 3/2 as a case study, a unified theoretical framework for describing the excitation of multiple-quantum transitions in static isotropic and anisotropic solids is proposed within the framework of perturbation theory. The challenges resulting from the anisotropic nature of the quadrupolar interactions are addressed within the effective Hamiltonian framework. The possible role of the various interaction frames on the convergence of the perturbation corrections is discussed along with a proposal for a "hybrid method" for describing the excitation process in anisotropic solids. Employing suitable model systems, the validity of the proposed hybrid method is substantiated through a rigorous comparison between simulations emerging from exact numerical and analytic methods.
Oude Weernink, C E; Sweegers, L; Relou, L; van der Zijpp, T J; van Hoof, J
2018-02-06
Modern healthcare, including nursing home care, goes together with the use of technologies to support treatment, the provision of care and daily activities. The challenges concerning the implementation of such technologies are numerous. One of these emerging technologies is real-time location systems (RTLS), which can be utilized in the nursing home for monitoring the use and location of assets. This paper describes a participatory design study of RTLS based on context mapping, conducted in two nursing home organizations. Rather than investigating the technological possibilities, this study investigates the needs and wishes from the perspective of the care professional. The study identified semantic themes that relate to the practicalities of lost and misplaced items in the nursing home, as well as latent themes that cover the wishes regarding technology in the nursing homes. The organizational culture and building typology may play a role in losing items. The participants in this study indicated that RTLS can provide a solution to some of the challenges that they encounter in the workplace. However, the implementation of new technologies should be done with care, and new systems should be integrated into existing ICT systems in order to minimize additional training and avoid adding to the workload.
Oude Weernink, C.E.; Sweegers, L.; Relou, L.; van der Zijpp, T.J.; van Hoof, J.
2018-01-01
INTRODUCTION: Modern healthcare, including nursing home care, goes together with the use of technologies to support treatment, the provision of care and daily activities. The challenges concerning the implementation of such technologies are numerous. One of these emerging technologies is real-time location systems (RTLS), which can be utilized in the nursing home for monitoring the use and location of assets. METHODOLOGY: This paper describes a participatory design study of RTLS based on context mapping, conducted in two nursing home organizations. Rather than investigating the technological possibilities, this study investigates the needs and wishes from the perspective of the care professional. RESULTS: The study identified semantic themes that relate to the practicalities of lost and misplaced items in the nursing home, as well as latent themes that cover the wishes regarding technology in the nursing homes. The organizational culture and building typology may play a role in losing items. CONCLUSION: The participants in this study indicated that RTLS can provide a solution to some of the challenges that they encounter in the workplace. However, the implementation of new technologies should be done with care, and new systems should be integrated into existing ICT systems in order to minimize additional training and avoid adding to the workload. PMID:29527110
Rees, Tim; Hardy, Lew; Güllich, Arne; Abernethy, Bruce; Côté, Jean; Woodman, Tim; Montgomery, Hugh; Laing, Stewart; Warr, Chelsea
2016-08-01
The literature base regarding the development of sporting talent is extensive, and includes empirical articles, reviews, position papers, academic books, governing body documents, popular books, unpublished theses and anecdotal evidence, and contains numerous models of talent development. With such a varied body of work, the task for researchers, practitioners and policy makers of generating a clear understanding of what is known and what is thought to be true regarding the development of sporting talent is particularly challenging. Drawing on a wide array of expertise, we address this challenge by avoiding adherence to any specific model or area and by providing a reasoned review across three key overarching topics: (a) the performer; (b) the environment; and (c) practice and training. Within each topic sub-section, we review and calibrate evidence by performance level of the samples. We then conclude each sub-section with a brief summary, a rating of the quality of evidence, a recommendation for practice and suggestions for future research. These serve to highlight both our current level of understanding and our level of confidence in providing practice recommendations, but also point to a need for future studies that could offer evidence regarding the complex interactions that almost certainly exist across domains.
Heuer, Sabine; Hallowell, Brooke
2015-01-01
Numerous authors report that people with aphasia have greater difficulty allocating attention than people without neurological disorders. Studying how attention deficits contribute to language deficits is important. However, existing methods for indexing attention allocation in people with aphasia pose serious methodological challenges. Eye-tracking methods have great potential to address such challenges. We developed and assessed the validity of a new dual-task method incorporating eye tracking to assess attention allocation. Twenty-six adults with aphasia and 33 control participants completed auditory sentence comprehension and visual search tasks. To test whether the new method validly indexes well-documented patterns in attention allocation, demands were manipulated by varying task complexity in single- and dual-task conditions. Differences in attention allocation were indexed via eye-tracking measures. For all participants significant increases in attention allocation demands were observed from single- to dual-task conditions and from simple to complex stimuli. Individuals with aphasia had greater difficulty allocating attention with greater task demands. Relationships between eye-tracking indices of comprehension during single and dual tasks and standardized testing were examined. Results support the validity of the novel eye-tracking method for assessing attention allocation in people with and without aphasia. Clinical and research implications are discussed. PMID:25913549
NASA Astrophysics Data System (ADS)
Broadhurst, T.; Mattson, E.
2017-12-01
Enhanced geothermal systems (EGS) are gaining in popularity as a technology that can be used to increase areas for geothermal resource procurement. One of the most important factors in the success of an EGS system is the success of the subsurface reservoir that is used for fluid flow and heat mining through advection. There are numerous challenges in stimulating a successful reservoir, including maintaining flow rates, minimizing leak off, preventing short-circuiting, and reducing the risk of microseismicity associated with subsurface activity. Understanding past examples of stimulation can be invaluable in addressing these challenges. This study provides an overview of stimulation methods that have been employed in EGS systems from 1974-2017. We include all geothermal reservoirs and demonstration projects that have experienced hydrofracturing, chemical stimulation, and induced thermal stress for a comprehensive list. We also examine different metrics and measures of success in geothermal reservoir stimulation to draw conclusions and provide recommendations for future projects. Multiple project characteristics are reported including geologic setting, stress conditions, reservoir temperature, injection specifics, resulting microseismicity, and overall project goals. Insight into optimal and unproductive stimulation methods is crucial to conserving mental capital, utilizing project funding, and ensuring EGS technology advances as efficiently as possible.
SEMIPARAMETRIC EFFICIENT ESTIMATION FOR SHARED-FRAILTY MODELS WITH DOUBLY-CENSORED CLUSTERED DATA
Wang, Jane-Ling
2018-01-01
In this paper, we investigate frailty models for clustered survival data that are subject to both left- and right-censoring, termed "doubly-censored data". This model extends the current survival literature by broadening the application of frailty models from right-censoring to the more complicated situation with additional left censoring. Our approach is motivated by a recent Hepatitis B study where the sample consists of families. We adopt a likelihood approach that aims at the nonparametric maximum likelihood estimators (NPMLE). A new algorithm is proposed, which not only works well for clustered data but also improves on the existing algorithm for independent doubly-censored data, a special case in which the frailty variable is a constant equal to one. This special case is well known to be a computational challenge due to the left-censoring feature of the data. The new algorithm not only resolves this challenge but also accommodates the additional frailty variable effectively. Asymptotic properties of the NPMLE are established, along with semiparametric efficiency of the NPMLE for the finite-dimensional parameters. The consistency of bootstrap estimators for the standard errors of the NPMLE is also discussed. We conducted simulations to illustrate the numerical performance and robustness of the proposed algorithm, which is also applied to the Hepatitis B data. PMID:29527068
Van Dyk, Jacob; Meghzifene, Ahmed
2017-04-01
The past few years have seen a significant growth of interest in the global radiation therapy (RT) crisis. Various organizations have quantified the need and are providing aid in support of addressing the shortfalls existing in many low-to-middle income countries. With the tremendous demand for new facilities, equipment, and personnel, it is very important to recognize the quality and safety challenges and to address them directly. An examination of publications on quality and safety in RT indicates a consistency in a number of the recommendations; however, these authoritative reports were generally based on input from high-resourced contexts. Here, we review these recommendations with a special emphasis on issues that are significant in low-to-middle income countries. Although the problem is multidimensional, training and staffing are top priorities; any support provided to lower-resourced settings must address the numerous facets associated with quality and safety indicators. Strong partnerships between high income and other countries will enhance the development of safe and resource-appropriate strategies for advancing the radiation treatment process. The real challenge is the engagement of a strong spirit of cooperation, collaboration, and communication among the multiple organizations in support of reducing the cancer divide and improving the provision of safe and effective RT. Copyright © 2017 Elsevier Inc. All rights reserved.
Estimation of Local Orientations in Fibrous Structures With Applications to the Purkinje System
Plank, Gernot; Trayanova, Natalia A.; Vidal, René
2011-01-01
The extraction of the cardiac Purkinje system (PS) from intensity images is a critical step toward the development of realistic structural models of the heart. Such models are important for uncovering the mechanisms of cardiac disease and improving its treatment and prevention. Unfortunately, the manual extraction of the PS is a challenging and error-prone task due to the presence of image noise and numerous fiber junctions. To deal with these challenges, we propose a framework that estimates local fiber orientations with high accuracy and reconstructs the fibers via tracking. Our key contribution is the development of a descriptor for estimating the orientation distribution function (ODF), a spherical function encoding the local geometry of the fibers at a point of interest. The fiber/branch orientations are identified as the modes of the ODFs via spherical clustering and guide the extraction of the fiber centerlines. Experiments on synthetic data evaluate the sensitivity of our approach to image noise, width of the fiber, and choice of the mode detection strategy, and show its superior performance compared to those of the existing descriptors. Experiments on the free-running PS in an MR image also demonstrate the accuracy of our method in reconstructing such sparse fibrous structures. PMID:21335301
An efficient and accurate 3D displacements tracking strategy for digital volume correlation
NASA Astrophysics Data System (ADS)
Pan, Bing; Wang, Bo; Wu, Dafang; Lubineau, Gilles
2014-07-01
Owing to its inherent computational complexity, practical implementation of digital volume correlation (DVC) for internal displacement and strain mapping faces important challenges in improving its computational efficiency. In this work, an efficient and accurate 3D displacement tracking strategy is proposed for fast DVC calculation. The efficiency advantage is achieved by using three improvements. First, to eliminate the need of updating Hessian matrix in each iteration, an efficient 3D inverse compositional Gauss-Newton (3D IC-GN) algorithm is introduced to replace existing forward additive algorithms for accurate sub-voxel displacement registration. Second, to ensure the 3D IC-GN algorithm that converges accurately and rapidly and avoid time-consuming integer-voxel displacement searching, a generalized reliability-guided displacement tracking strategy is designed to transfer accurate and complete initial guess of deformation for each calculation point from its computed neighbors. Third, to avoid the repeated computation of sub-voxel intensity interpolation coefficients, an interpolation coefficient lookup table is established for tricubic interpolation. The computational complexity of the proposed fast DVC and the existing typical DVC algorithms are first analyzed quantitatively according to necessary arithmetic operations. Then, numerical tests are performed to verify the performance of the fast DVC algorithm in terms of measurement accuracy and computational efficiency. The experimental results indicate that, compared with the existing DVC algorithm, the presented fast DVC algorithm produces similar precision and slightly higher accuracy at a substantially reduced computational cost.
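The reliability-guided displacement tracking strategy described above can be sketched generically: calculation points are processed in descending order of correlation quality, growing outward from a seed, so that each new point inherits its initial guess from an already-computed neighbour. The following is a minimal 2D sketch of that ordering idea only (hypothetical function name and inputs; the actual DVC works on 3D volumes with full correlation criteria):

```python
import heapq
import numpy as np

def reliability_guided_order(quality, seed):
    """Return the order in which grid points would be processed when
    growing out from `seed`, always expanding the best-quality frontier
    point next (a sketch of the strategy, not a full DVC solver)."""
    h, w = quality.shape
    visited = np.zeros((h, w), dtype=bool)
    heap = [(-quality[seed], seed)]   # max-heap via negated quality
    order = []
    while heap:
        _, (i, j) = heapq.heappop(heap)
        if visited[i, j]:
            continue                  # stale duplicate entry
        visited[i, j] = True
        order.append((i, j))
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < h and 0 <= nj < w and not visited[ni, nj]:
                heapq.heappush(heap, (-quality[ni, nj], (ni, nj)))
    return order

quality = np.array([[0.9, 0.5],
                    [0.8, 0.1]])
print(reliability_guided_order(quality, (0, 0)))
# → [(0, 0), (1, 0), (0, 1), (1, 1)]
```

In the full algorithm, each popped point would run the 3D IC-GN refinement seeded with its neighbour's converged displacement, which is what removes the need for integer-voxel searching.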
Exceptional collections in surface-like categories
NASA Astrophysics Data System (ADS)
Kuznetsov, A. G.
2017-09-01
We provide a categorical framework for recent results of Markus Perling on the combinatorics of exceptional collections on numerically rational surfaces. Using it we simplify and generalize some of Perling's results as well as Vial's criterion for the existence of a numerical exceptional collection. Bibliography: 18 titles.
NASA Astrophysics Data System (ADS)
Hashim; Khan, Masood; Alshomrani, Ali Saleh
2017-12-01
This article adopts a realistic approach to examine the magnetohydrodynamic (MHD) flow of a Carreau fluid induced by a shrinking sheet near a stagnation point. The study also explores the impact of non-linear thermal radiation on the heat transfer process. The governing equations of the physical model are expressed as a system of partial differential equations and transformed into non-linear ordinary differential equations by introducing local similarity variables. The reduced equations are numerically integrated using the Runge-Kutta-Fehlberg scheme. We examine the conditions for the existence, non-existence, uniqueness, and duality of numerical solutions. It is found that solutions may possess a dual nature, with upper and lower branches, for a specific range of the shrinking parameter. Results indicate that as the magnetic parameter increases, the range of the shrinking parameter over which dual solutions exist widens. Further, a strong magnetic field enhances the thickness of the momentum boundary layer for the second solution, while for the first solution it reduces it. We further note that fluid suction diminishes the fluid velocity, so the thickness of the hydrodynamic boundary layer decreases as well. A critical comparison with existing works shows that the outcomes agree with those benchmarks.
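Boundary-value problems of this kind are commonly solved by a shooting method built on a Runge-Kutta integrator: guess the unknown wall derivative, integrate outward, and adjust the guess until the far-field condition is met. As a minimal illustration (the classic Blasius boundary-layer equation, NOT the authors' Carreau-fluid MHD system):

```python
# Shooting-method sketch on the Blasius equation f''' + 0.5 f f'' = 0,
# with f(0) = f'(0) = 0 and f'(eta -> infinity) = 1.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

def blasius_rhs(eta, y):
    # y = [f, f', f'']
    return [y[1], y[2], -0.5 * y[0] * y[2]]

def residual(s, eta_max=10.0):
    # Integrate with guessed wall shear f''(0) = s; return the mismatch
    # in the far-field condition f'(eta_max) -> 1.
    sol = solve_ivp(blasius_rhs, (0.0, eta_max), [0.0, 0.0, s],
                    rtol=1e-8, atol=1e-10)
    return sol.y[1, -1] - 1.0

# Bracket the root and solve; the known wall shear is about 0.332057.
s_wall = brentq(residual, 0.1, 1.0)
print(s_wall)
```

Dual solutions, as reported in the article, arise when the residual function admits two distinct roots for the same parameters, so the bracketing step must be repeated on separate intervals.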
A Polynomial Time, Numerically Stable Integer Relation Algorithm
NASA Technical Reports Server (NTRS)
Ferguson, Helaman R. P.; Bailey, Daivd H.; Kutler, Paul (Technical Monitor)
1998-01-01
Let x = (x1, x2, ..., xn) be a vector of real numbers. x is said to possess an integer relation if there exist integers a_i, not all zero, such that a1x1 + a2x2 + ... + anxn = 0. Beginning in 1977, several algorithms (with proofs) have been discovered to recover the a_i given x. The most efficient of these existing integer relation algorithms (in terms of run time and the precision required of the input) has the drawback of being very unstable numerically. It often requires a numeric precision level in the thousands of digits to reliably recover relations in modest-sized test problems. We present here a new algorithm for finding integer relations, which we have named the "PSLQ" algorithm. It is proved in this paper that the PSLQ algorithm terminates with a relation in a number of iterations that is bounded by a polynomial in n. Because this algorithm employs a numerically stable matrix reduction procedure, it is free from the numerical difficulties that plague other integer relation algorithms. Furthermore, its stability admits an efficient implementation with lower run times on average than other algorithms currently in use. Finally, this stability can be used to prove that relation bounds obtained from computer runs using this algorithm are numerically accurate.
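A PSLQ implementation is available in the mpmath library (an independent implementation, not the authors' code), which makes the idea easy to try: given the golden ratio φ, which satisfies φ² = φ + 1, PSLQ recovers the integer relation among (1, φ, φ²):

```python
# Usage sketch of integer relation detection via mpmath's pslq.
from mpmath import mp, mpf, sqrt, pslq

mp.dps = 50                       # working precision, in decimal digits

phi = (1 + sqrt(5)) / 2           # golden ratio; satisfies phi^2 = phi + 1
x = [mpf(1), phi, phi**2]

rel = pslq(x)                     # integers a_i with a1*x1 + a2*x2 + a3*x3 ~ 0
residual = sum(a * xi for a, xi in zip(rel, x))
print(rel)                        # a relation equivalent to 1 + phi - phi^2 = 0
```

As the abstract notes, the precision required of the input is the practical bottleneck: with too few digits, spurious relations appear or true ones are missed.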
Tartarus: A relativistic Green's function quantum average atom code
Gill, Nathanael Matthew; Starrett, Charles Edward
2017-06-28
A relativistic Green’s Function quantum average atom model is implemented in the Tartarus code for the calculation of equation of state data in dense plasmas. We first present the relativistic extension of the quantum Green’s Function average atom model described by Starrett [1]. The Green’s Function approach addresses the numerical challenges arising from resonances in the continuum density of states without the need for resonance tracking algorithms or adaptive meshes, though there are still numerical challenges inherent to this algorithm. We discuss how these challenges are addressed in the Tartarus algorithm. The outputs of the calculation are shown in comparisonmore » to PIMC/DFT-MD simulations of the Principal Shock Hugoniot in Silicon. Finally, we also present the calculation of the Hugoniot for Silver coming from both the relativistic and nonrelativistic modes of the Tartarus code.« less
Tartarus: A relativistic Green's function quantum average atom code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gill, Nathanael Matthew; Starrett, Charles Edward
A relativistic Green’s Function quantum average atom model is implemented in the Tartarus code for the calculation of equation of state data in dense plasmas. We first present the relativistic extension of the quantum Green’s Function average atom model described by Starrett [1]. The Green’s Function approach addresses the numerical challenges arising from resonances in the continuum density of states without the need for resonance tracking algorithms or adaptive meshes, though there are still numerical challenges inherent to this algorithm. We discuss how these challenges are addressed in the Tartarus algorithm. The outputs of the calculation are shown in comparisonmore » to PIMC/DFT-MD simulations of the Principal Shock Hugoniot in Silicon. Finally, we also present the calculation of the Hugoniot for Silver coming from both the relativistic and nonrelativistic modes of the Tartarus code.« less
MFIX simulation of NETL/PSRI challenge problem of circulating fluidized bed
Li, Tingwen; Dietiker, Jean-François; Shahnam, Mehrdad
2012-12-01
In this paper, numerical simulations of NETL/PSRI challenge problem of circulating fluidized bed (CFB) using the open-source code Multiphase Flow with Interphase eXchange (MFIX) are reported. Two rounds of simulation results are reported including the first-round blind test and the second-round modeling refinement. Three-dimensional high fidelity simulations are conducted to model a 12-inch diameter pilot-scale CFB riser. Detailed comparisons between numerical results and experimental data are made with respect to axial pressure gradient profile, radial profiles of solids velocity and solids mass flux along different radial directions at various elevations for operating conditions covering different fluidization regimes. Overall, the numerical results show that CFD can predict the complex gas–solids flow behavior in the CFB riser reasonably well. In addition, lessons learnt from modeling this challenge problem are presented.
Review: Modelling chemical kinetics and convective heating in giant planet entries
NASA Astrophysics Data System (ADS)
Reynier, Philippe; D'Ammando, Giuliano; Bruno, Domenico
2018-01-01
A review of the existing chemical kinetics models for H2 / He mixtures and related transport and thermodynamic properties is presented as a pre-requisite towards the development of innovative models based on the state-to-state approach. A survey of the available results obtained during the mission preparation and post-flight analyses of the Galileo mission has been undertaken and a computational matrix has been derived. Different chemical kinetics schemes for hydrogen/helium mixtures have been applied to numerical simulations of the selected points along the entry trajectory. First, a reacting scheme, based on literature data, has been set up for computing the flow-field around the probe at high altitude and comparisons with existing numerical predictions are performed. Then, a macroscopic model derived from a state-to-state model has been constructed and incorporated into a CFD code. Comparisons with existing numerical results from the literature have been performed as well as cross-check comparisons between the predictions provided by the different models in order to evaluate the potential of innovative chemical kinetics models based on the state-to-state approach.
Clues to the Foundations of Numerical Cognitive Impairments: Evidence From Genetic Disorders
Simon, Tony J.
2011-01-01
Several neurodevelopmental disorders of known genetic etiology generate phenotypes that share the characteristic of numerical and mathematical cognitive impairments. This article reviews some of the main findings that suggest a possible key role that spatial and temporal information processing impairments may play in the atypical development of numerical cognitive competence. The question of what neural substrate might underlie these impairments is also addressed, as are the challenges for interpreting neural structure/cognitive function mapping in atypically developing populations. PMID:21761998
Successful Architectural Knowledge Sharing: Beware of Emotions
NASA Astrophysics Data System (ADS)
Poort, Eltjo R.; Pramono, Agung; Perdeck, Michiel; Clerc, Viktor; van Vliet, Hans
This chapter presents the analysis and key findings of a survey on architectural knowledge sharing. The responses of 97 architects working in the Dutch IT Industry were analyzed by correlating practices and challenges with project size and success. Impact mechanisms between project size, project success, and architectural knowledge sharing practices and challenges were deduced based on reasoning, experience and literature. We find that architects run into numerous and diverse challenges sharing architectural knowledge, but that the only challenges that have a significant impact are the emotional challenges related to interpersonal relationships. Thus, architects should be careful when dealing with emotions in knowledge sharing.
Code of Federal Regulations, 2012 CFR
2012-07-01
... emissions? Compliance with the numerical emission limitations established in this subpart is based on the... Alaska not accessible by the Federal Aid Highway System (FAHS) you do not have to meet the numerical CO...
ERIC Educational Resources Information Center
Ansari, Daniel
2010-01-01
The present paper provides a critical overview of how adult neuropsychological models have been applied to the study of the atypical development of numerical cognition. Specifically, the following three assumptions are challenged: 1. Profiles of strength and weaknesses do not change over developmental time. 2. Similar neuronal structures are…
Review: Milking robot utilization, a successful precision livestock farming evolution.
John, A J; Clark, C E F; Freeman, M J; Kerrisk, K L; Garcia, S C; Halachmi, I
2016-09-01
Automatic milking systems (AMS), one of the earliest precision livestock farming developments, have revolutionized dairy farming around the world. While robots control the milking process, there have also been numerous changes to how the whole farm system is managed. Milking is no longer performed in defined sessions; rather, the cow can now choose when to be milked in AMS, allowing milking to be distributed throughout a 24 h period. Despite this ability, there has been little attention given to milking robot utilization across 24 h. In order to formulate relevant research questions and improve farm AMS management there is a need to determine the current knowledge gaps regarding the distribution of robot utilization. Feed, animal and management factors and their interplay on levels of milking robot utilization across 24 h for both indoor and pasture-based systems are here reviewed. The impact of the timing, type and quantity of feed offered and their interaction with the distance of feed from the parlour; herd social dynamics, climate and various other management factors on robot utilization through 24 h are provided. This novel review draws together both the opportunities and challenges that exist for farm management to use these factors to improve system efficiency and those that exist for further research.
Beukelman, Timothy; Anink, Janneke; Berntson, Lillemor; Duffy, Ciaran; Ellis, Justine A; Glerup, Mia; Guzman, Jaime; Horneff, Gerd; Kearsley-Fleet, Lianne; Klein, Ariane; Klotsche, Jens; Magnusson, Bo; Minden, Kirsten; Munro, Jane E; Niewerth, Martina; Nordal, Ellen; Ruperto, Nicolino; Santos, Maria Jose; Schanberg, Laura E; Thomson, Wendy; van Suijlekom-Smit, Lisette; Wulffraat, Nico; Hyrich, Kimme
2017-04-19
To characterize the existing national and multi-national registries and cohort studies in juvenile idiopathic arthritis (JIA) and identify differences as well as areas of potential future collaboration. We surveyed investigators from North America, Europe, and Australia about existing JIA cohort studies and registries. We excluded cross-sectional studies. We captured information about study design, duration, location, inclusion criteria, data elements and collection methods. We received survey results from 18 studies, including 11 national and 7 multi-national studies representing 37 countries in total. Study designs included inception cohorts, prevalent disease cohorts, and new treatment cohorts (several of which contribute to pharmacosurveillance activities). Despite numerous differences, the data elements collected across the studies were quite similar, with most studies collecting at least 5 of the 6 American College of Rheumatology core set variables and the data needed to calculate the 3-variable clinical juvenile disease activity score. Most studies were collecting medication initiation and discontinuation dates and were attempting to capture serious adverse events. There is a wide range of large, ongoing JIA registries and cohort studies around the world. Our survey results indicate significant potential for future collaborative work using data from different studies and both combined and comparative analyses.
1, 2, 3, 4: Infusing Quantitative Literacy into Introductory Biology
Momsen, Jennifer L.; Moyerbrailean, Gregory A.; Ebert-May, Diane; Long, Tammy M.; Wyse, Sara; Linton, Debra
2010-01-01
Biology of the twenty-first century is an increasingly quantitative science. Undergraduate biology education therefore needs to provide opportunities for students to develop fluency in the tools and language of quantitative disciplines. Quantitative literacy (QL) is important for future scientists as well as for citizens, who need to interpret numeric information and data-based claims regarding nearly every aspect of daily life. To address the need for QL in biology education, we incorporated quantitative concepts throughout a semester-long introductory biology course at a large research university. Early in the course, we assessed the quantitative skills that students bring to the introductory biology classroom and found that students had difficulties in performing simple calculations, representing data graphically, and articulating data-driven arguments. In response to students' learning needs, we infused the course with quantitative concepts aligned with the existing course content and learning objectives. The effectiveness of this approach is demonstrated by significant improvement in the quality of students' graphical representations of biological data. Infusing QL in introductory biology presents challenges. Our study, however, supports the conclusion that it is feasible in the context of an existing course, consistent with the goals of college biology education, and promotes students' development of important quantitative skills. PMID:20810965
Numerical simulations of thermal conductivity in dissipative two-dimensional Yukawa systems.
Khrustalyov, Yu V; Vaulina, O S
2012-04-01
Numerical data on the heat transfer constants in two-dimensional Yukawa systems were obtained. A numerical study of the thermal conductivity and diffusivity was carried out for equilibrium systems with parameters close to the conditions of laboratory experiments with dusty plasma. For calculations of the heat transfer constants the Green-Kubo formulas were used. The influence of dissipation (friction) on the heat transfer processes in nonideal systems was investigated. An approximation for the thermal conductivity coefficient is proposed. A comparison of the obtained results with existing experimental and numerical data is discussed.
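The Green-Kubo route obtains the thermal conductivity by integrating the equilibrium heat-flux autocorrelation function, roughly κ ∝ (1 / (kB T² A)) ∫ ⟨J(0)J(t)⟩ dt for a 2D system of area A. A minimal sketch of that estimator (generic function names; kB, T, and the area are placeholder parameters, not values from the study):

```python
import numpy as np

def heat_flux_acf(j, max_lag):
    # Time-averaged autocorrelation <J(0)J(t)> of a heat-flux sample series
    return np.array([np.mean(j[:len(j) - k] * j[k:]) for k in range(max_lag)])

def green_kubo_kappa(j, dt, kB, T, area, max_lag):
    # kappa = (1 / (kB T^2 A)) * integral over the truncated lag window
    acf = heat_flux_acf(j, max_lag)
    # trapezoidal rule, written out to avoid NumPy version differences
    integral = dt * (acf[0] / 2 + acf[1:-1].sum() + acf[-1] / 2)
    return integral / (kB * T**2 * area)
```

In practice J(t) comes from the microscopic heat-flux vector of the simulated Yukawa system, and the lag window must be truncated before the autocorrelation tail is dominated by statistical noise; dissipation (friction) shortens that correlation time.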
On the mechanics of cerebral aneurysms: experimental research and numerical simulation
NASA Astrophysics Data System (ADS)
Parshin, D. V.; Kuianova, I. O.; Yunoshev, A. S.; Ovsyannikov, K. S.; Dubovoy, A. V.
2017-10-01
This research extends existing experimental data for CA tissues [1, 2] and presents the preliminary results of numerical calculations. Experiments were performed to measure aneurysm wall stiffness and the data obtained was analyzed. To reconstruct the geometry of the CAs, DICOM images of real patients with aneurysms and ITK Snap [3] were used. In addition, numerical calculations were performed in ANSYS (commercial software, License of Lavrentyev Institute of Hydrodynamics). The results of these numerical calculations show a high level of agreement with experimental data from previous literature.
The Wallops Flight Facility Rapid Response Range Operations Initiative
NASA Technical Reports Server (NTRS)
Underwood, Bruce E.; Kremer, Steven E.
2004-01-01
While the dominant focus on short-response missions has appropriately centered on the launch vehicle and spacecraft, launch site operations and the activities of launch range organizations have often been overlooked or treated as an afterthought. Throughout the history of organized spaceflight, launch ranges have been the bane of flight programs as a source of expense, schedule delays, and seemingly endless requirements. Launch ranges provide three basic functions: (1) an appropriate geographical location to meet orbital or other mission trajectory requirements, (2) project services such as processing facilities, launch complexes, tracking and data services, and expendable products, and (3) assurance of safety and property protection for participating personnel and third parties. The challenge with which launch site authorities continuously struggle is the inherent conflict between projects whose singular concern is execution of their own mission and the range's need to support numerous simultaneous customers. So, while the tasks carried out by a launch range committed to a single mission pale in comparison to the efforts of a launch vehicle or spacecraft provider and could normally be completed in a matter of weeks, major launch sites have dozens of active projects from separate sponsoring organizations. Accommodating the numerous tasks associated with each mission, while hardware failures, weather, maintenance requirements, and other factors constantly conspire against the range resource schedulers, makes the launch range as significant an impediment to responsive missions as the launch vehicles and their cargo. The obvious solution to the launch site challenge was implemented years ago when the Department of Defense simply established dedicated infrastructure and personnel for a dedicated mission, namely the Intercontinental Ballistic Missile. This, however, proves prohibitively expensive for all but the most urgent of applications.
So the challenge becomes: how can a launch site provide acceptably responsive mission services to a particular customer without dedicating extensive resources, while continuing to serve other projects? NASA's Wallops Flight Facility (WFF) is pursuing solutions to exactly this challenge. NASA, in partnership with the Virginia Commercial Space Flight Authority, has initiated the Rapid Response Range Operations Initiative (R3Ops). R3Ops is a multi-phased effort to incrementally establish and demonstrate increasingly responsive launch operations, with an ultimate goal of routinely providing ELV-class services within 7-10 days of initial notification, and shorter schedules possible given committed resources. This target will be pursued within the reality of concurrent programs and, ideally, largely independent of specialized flight system configurations. WFF has recently completed Phase 1 of R3Ops: an in-depth collection (through extensive expert interviews) and software modeling of the individual steps performed by the various range disciplines. This modeling is now being used to identify inefficiencies in current procedures, to identify bottlenecks, and to show interdependencies. Existing practices are being tracked to provide a baseline to benchmark against as new procedures are implemented. This paper will describe in detail the philosophies behind WFF's R3Ops, the data collected and modeled in Phase 1, and strategies for meeting responsive launch requirements in a multi-user range environment planned for subsequent phases of this initiative.
Integrated workflows for spiking neuronal network simulations
Antolík, Ján; Davison, Andrew P.
2013-01-01
The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages. 
PMID:24368902
Dynamics of climate-based malaria transmission model with age-structured human population
NASA Astrophysics Data System (ADS)
Addawe, Joel; Pajimola, Aprimelle Kris
2016-10-01
In this paper, we propose to study the dynamics of malaria transmission with a periodic birth rate of the vector and an age structure for the human population. The human population is divided into two compartments: pre-school children (0-5 years) and the rest of the human population. We show the existence of a disease-free equilibrium point. Using published epidemiological parameters, we use numerical simulations to show the potential effect of climate change on the dynamics of age-structured malaria transmission. Numerical simulations suggest that there exists an asymptotically attractive solution that is positive and periodic.
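The qualitative behavior described above can be illustrated with a minimal sketch (not the authors' exact equations): a reduced host-vector system with two human age classes and a seasonally periodic mosquito recruitment rate Λ(t). All parameter values below are hypothetical; the point is only that after transients the trajectory settles onto a positive solution with the 365-day forcing period.

```python
import numpy as np

def rhs(t, y, b=0.3, cc=0.06, ca=0.03, rc=1/30, ra=1/60,
        mu_v=1/14, L0=2.0, eps=0.6, Nc=0.15, Na=0.85):
    """Reduced age-structured host-vector model; y = [Ic, Ia, Iv]."""
    Ic, Ia, Iv = y
    Lam = L0 * (1 + eps * np.cos(2 * np.pi * t / 365))  # periodic birth rate
    Nv = Lam / mu_v                           # quasi-static vector population
    dIc = b * cc * Iv * (Nc - Ic) - rc * Ic   # infectious children (0-5 y)
    dIa = b * ca * Iv * (Na - Ia) - ra * Ia   # infectious older humans
    dIv = b * (Ic + Ia) * (Nv - Iv) - mu_v * Iv
    return np.array([dIc, dIa, dIv])

def rk4(f, y, t, t_end, h):
    """Classical fourth-order Runge-Kutta integration from t to t_end."""
    while t < t_end - 1e-9:
        k1 = f(t, y); k2 = f(t + h/2, y + h/2 * k1)
        k3 = f(t + h/2, y + h/2 * k2); k4 = f(t + h, y + h * k3)
        y = y + h/6 * (k1 + 2*k2 + 2*k3 + k4); t += h
    return y

# integrate past the transient, then compare states one period apart
y5 = rk4(rhs, np.array([0.01, 0.01, 0.1]), 0.0, 5 * 365, 0.25)
y6 = rk4(rhs, y5, 5 * 365, 6 * 365, 0.25)
print(y5, y6)  # after transients the state recurs with period 365 days
```

Comparing the state at five and six years gives a simple numerical check that the attractor is periodic rather than merely bounded.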
Advanced Computational Techniques for Hypersonic Propulsion
NASA Technical Reports Server (NTRS)
Povinelli, Louis A.
1996-01-01
CFD has played a major role in the resurgence of hypersonic flight, on the premise that numerical methods will allow us to perform simulations at conditions for which no ground test capability exists. Validation of CFD methods is being established using the experimental data base available, which is below Mach 8. It is important, however, to realize the limitations involved in the extrapolation process as well as the deficiencies that exist in numerical methods at the present time. Current features of CFD codes are examined for application to propulsion system components. The shortcomings in simulation and modeling are identified and discussed.
Can a Catholic College Exist Today?: Challenges to Religious Identity in the Midst of Pluralism
ERIC Educational Resources Information Center
Cesareo, Francesco C.
2007-01-01
One of the most significant and important challenges facing any president of a Catholic college or university is maintaining and enhancing the religious identity and mission of the institution in the midst of the pluralism that exists on every Catholic campus in the United States. Catholic colleges and universities are at an important crossroad, a…
"How Can I Help?": Practicing Familial Support through Simulation
ERIC Educational Resources Information Center
Coughlin, April B.; Dotger, Benjamin H.
2016-01-01
Teachers face numerous challenges in daily practice, including situations that involve the health, safety, and well-being of students and families. When harassment and physical abuse impact K-12 students, these situations pose unexpected challenges to novice teachers working to support their students. In this article, the authors report on a study…
ERIC Educational Resources Information Center
Ho, Li-Ching; Alviar-Martin, Theresa; Leviste, Enrique Niño P.
2014-01-01
Background/Context: Research indicates that across democratic societies, teachers face numerous intellectual and emotional challenges when handling controversial topics in the classroom. Less attention, however, has been paid to how teachers' willingness to teach controversial topics intersects with political and other societal factors in…
Injury Prevention in Physical Education: Scenarios and Solutions
ERIC Educational Resources Information Center
Merrie, Michael D.; Shewmake, Cole; Calleja, Paul
2016-01-01
The purpose of this article is to provide physical educators with practical strategies that can assist in preventing injuries in the classroom. The dynamic nature of physical education and the numerous tasks physical educators must complete daily can be challenging. Embedded in these challenges is the constant risk of student injury. Fortunately,…
A Quantitative Quality Control Model for Parallel and Distributed Crowdsourcing Tasks
ERIC Educational Resources Information Center
Zhu, Shaojian
2014-01-01
Crowdsourcing is an emerging research area that has experienced rapid growth in the past few years. Although crowdsourcing has demonstrated its potential in numerous domains, several key challenges continue to hinder its application. One of the major challenges is quality control. How can crowdsourcing requesters effectively control the quality…
Rethinking and Restructuring an Assessment System via Effective Deployment of Technology
ERIC Educational Resources Information Center
Okonkwo, Charity
2010-01-01
Every instructional process involves a strategic assessment system for a complete teaching-learning circle. An assessment system that is seriously challenged calls for a change in approach. The National Open University of Nigeria (NOUN) assessment system is at present challenged. The large number of students and numerous courses offered by NOUN…
ERIC Educational Resources Information Center
Loke, Swee-Kin; Al-Sallami, Hesham S.; Wright, Daniel F. B.; McDonald, Jenny; Jadhav, Sheetal; Duffull, Stephen B.
2012-01-01
Complex systems are typically difficult for students to understand and computer simulations offer a promising way forward. However, integrating such simulations into conventional classes presents numerous challenges. Framed within an educational design research, we studied the use of an in-house built simulation of the coagulation network in four…
M-Learning Challenges in Teaching Crosscutting Themes in the Education of Young People and Adults
ERIC Educational Resources Information Center
Ota, Marcos Andrei; de Araujo, Carlos Fernando, Jr.
2016-01-01
The challenges faced in using new technologies in the classroom are numerous, but contributions generated with their resolution can proportionately provide original and efficient teaching practices more in tune to students' eager learning needs. This article presents some strategies developed to help teachers in transversal themes classes using…
"Gaining Power through Education": Experiences of Honduran Students from High Poverty Backgrounds
ERIC Educational Resources Information Center
Mather, Peter C.; Zempter, Christy; Ngumbi, Elizabeth; Nakama, Yuki; Manley, David; Cox, Haley
2017-01-01
This is a study of students from high-poverty backgrounds attending universities in Honduras. Based on a series of individual and focus group interviews, the researchers found students from high-poverty backgrounds face numerous practical challenges in persisting in higher education. Despite these challenges, participants succeeded due to a…
Numerical proof for chemostat chaos of Shilnikov's type.
Deng, Bo; Han, Maoan; Hsu, Sze-Bi
2017-03-01
A classical chemostat model is considered that models the cycling of one essential abiotic element or nutrient through a food chain of three trophic levels. The long-time behavior of the model was known to exhibit complex dynamics more than 20 years ago. It is still an open problem to prove the existence of chaos analytically. In this paper, we aim to solve the problem numerically. In our approach, we introduce an artificial singular parameter to the model and construct singular homoclinic orbits of the saddle-focus type which is known for chaos generation. From the configuration of the nullclines of the equations that generates the singular homoclinic orbits, a shooting algorithm is devised to find such Shilnikov saddle-focus homoclinic orbits numerically which in turn imply the existence of chaotic dynamics for the original chemostat model.
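The model class in question can be sketched as the classical chemostat food chain with nutrient S, prey x, and predator y, Monod uptake, and unit yields. The parameter values below are hypothetical (not the paper's), but the sketch exposes a useful check on any integration: with unit yields, the total mass T = S + x + y obeys dT/dt = D(S0 - T) and so relaxes to the input concentration S0.

```python
import numpy as np

def chemostat(t, u, D=0.1, S0=10.0, m1=1.0, a1=1.0, m2=1.0, a2=2.0):
    """Three-trophic chemostat: nutrient S -> prey x -> predator y,
    Monod uptake functions, unit yields, common dilution rate D."""
    S, x, y = u
    f1 = m1 * S / (a1 + S)          # nutrient uptake by prey
    f2 = m2 * x / (a2 + x)          # prey uptake by predator
    return np.array([D * (S0 - S) - f1 * x,
                     x * (f1 - D) - f2 * y,
                     y * (f2 - D)])

def rk4(f, u, t, t_end, h):
    """Classical fourth-order Runge-Kutta integration."""
    while t < t_end - 1e-9:
        k1 = f(t, u); k2 = f(t + h/2, u + h/2 * k1)
        k3 = f(t + h/2, u + h/2 * k2); k4 = f(t + h, u + h * k3)
        u = u + h/6 * (k1 + 2*k2 + 2*k3 + k4); t += h
    return u

u = rk4(chemostat, np.array([5.0, 1.0, 1.0]), 0.0, 400.0, 0.05)
# Conservation check: dT/dt = D*(S0 - T) for T = S + x + y, so the
# total mass must converge to S0 regardless of the internal dynamics.
print(u.sum())  # ~ S0 = 10.0
```

The conservation law is what makes the limiting system lower-dimensional, which is also the structure the shooting construction for homoclinic orbits exploits.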
Hardware simulation of fuel cell/gas turbine hybrids
NASA Astrophysics Data System (ADS)
Smith, Thomas Paul
Hybrid solid oxide fuel cell/gas turbine (SOFC/GT) systems offer high efficiency power generation, but face numerous integration and operability challenges. This dissertation addresses the application of hardware-in-the-loop simulation (HILS) to explore the performance of a solid oxide fuel cell stack and gas turbine when combined into a hybrid system. Specifically, this project entailed developing and demonstrating a methodology for coupling a numerical SOFC subsystem model with a gas turbine that has been modified with supplemental process flow and control paths to mimic a hybrid system. This HILS approach was implemented with the U.S. Department of Energy Hybrid Performance Project (HyPer) located at the National Energy Technology Laboratory. By utilizing HILS the facility provides a cost effective and capable platform for characterizing the response of hybrid systems to dynamic variations in operating conditions. HILS of a hybrid system was accomplished by first interfacing a numerical model with operating gas turbine hardware. The real-time SOFC stack model responds to operating turbine flow conditions in order to predict the level of thermal effluent from the SOFC stack. This simulated level of heating then dynamically sets the turbine's "firing" rate to reflect the stack output heat rate. Second, a high-speed computer system with data acquisition capabilities was integrated with the existing controls and sensors of the turbine facility. In the future, this will allow for the utilization of high-fidelity fuel cell models that infer cell performance parameters while still computing the simulation in real-time. Once the integration of the numeric and the hardware simulation components was completed, HILS experiments were conducted to evaluate hybrid system performance. The testing identified non-intuitive transient responses arising from the large thermal capacitance of the stack that are inherent to hybrid systems. 
Furthermore, the tests demonstrated the capabilities of HILS as a research tool for investigating the dynamic behavior of SOFC/GT hybrid power generation systems.
NASA Astrophysics Data System (ADS)
Silva, Goncalo; Semiao, Viriato
2017-07-01
The first nonequilibrium effect experienced by gaseous flows in contact with solid surfaces is the slip-flow regime. While the classical hydrodynamic description holds valid in bulk, at boundaries the fluid-wall interactions must consider slip. In comparison to the standard no-slip Dirichlet condition, the case of slip formulates as a Robin-type condition for the fluid tangential velocity. This makes its numerical modeling a challenging task, particularly in complex geometries. In this work, this issue is handled with the lattice Boltzmann method (LBM), motivated by the similarities between the closure relations of the reflection-type boundary schemes equipping the LBM equation and the slip velocity condition established by slip-flow theory. Based on this analogy, we derive, as a central result, the structure of the LBM boundary closure relation that is consistent with the second-order slip velocity condition, applicable to planar walls. Subsequently, three tasks are performed. First, we clarify the limitations of existing slip velocity LBM schemes, based on discrete analogs of kinetic theory fluid-wall interaction models. Second, we present improved slip velocity LBM boundary schemes, constructed directly at the discrete level, by extending the multireflection framework to the slip-flow regime. Here, two classes of slip velocity LBM boundary schemes are considered: (i) linear slip schemes, which are local but retain some calibration requirements and/or operation limitations, and (ii) parabolic slip schemes, which use a two-point implementation but guarantee the consistent prescription of the intended slip velocity condition at arbitrary plane wall discretizations, further dispensing with any numerical calibration procedure. Third and finally, we verify the improvements of our proposed slip velocity LBM boundary schemes against existing ones.
The numerical tests evaluate the ability of the slip schemes to exactly accommodate the steady Poiseuille channel flow solution, over distinct wall slippage conditions, namely, no-slip, first-order slip, and second-order slip. The modeling of channel walls is discussed at both lattice-aligned and non-mesh-aligned configurations: the first case illustrates the numerical slip due to the incorrect modeling of slippage coefficients, whereas the second case adds the effect of spurious boundary layers created by the deficient accommodation of bulk solution. Finally, the slip-flow solutions predicted by LBM schemes are further evaluated for the Knudsen's paradox problem. As conclusion, this work establishes the parabolic accuracy of slip velocity schemes as the necessary condition for the consistent LBM modeling of the slip-flow regime.
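The Poiseuille benchmark invoked above has a closed-form target that any slip scheme must reproduce. As a sketch (with illustrative coefficients A1, A2 and mean free path λ, not the paper's calibrated values), the second-order slip solution is the usual parabola shifted by a wall slip velocity, and one can verify directly that it satisfies the Robin-type condition u_wall = A1·λ·du/dn - A2·λ²·d²u/dn², with n pointing into the fluid:

```python
import numpy as np

# Channel of height H driven by body force G, kinematic viscosity nu.
G, nu, H = 1e-4, 1.0 / 6.0, 1.0
A1, A2, lam = 1.0, 0.5, 0.05   # illustrative slip coefficients

def u_exact(y):
    """Second-order slip Poiseuille profile: parabola + wall slip u_s."""
    u_s = A1 * lam * G * H / (2 * nu) + A2 * lam**2 * G / nu
    return G / (2 * nu) * y * (H - y) + u_s

# Check the Robin condition at the bottom wall (y = 0) with central
# differences, which are exact for a quadratic profile.
h = 1e-4
du = (u_exact(h) - u_exact(-h)) / (2 * h)               # du/dn
d2u = (u_exact(h) - 2 * u_exact(0.0) + u_exact(-h)) / h**2  # d2u/dn2
lhs = u_exact(0.0)
rhs = A1 * lam * du - A2 * lam**2 * d2u
print(lhs, rhs)  # the two sides agree
```

This is exactly the kind of analytic target the paper's tests use: a boundary scheme that is "parabolically accurate" reproduces this profile at machine precision, while a lower-order scheme leaves a spurious numerical slip.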
Use of multivariable asymptotic expansions in a satellite theory
NASA Technical Reports Server (NTRS)
Dallas, S. S.
1973-01-01
The initial conditions and perturbative force of the satellite are restricted to yield the motion of an equatorial satellite about an oblate body. In this manner, an exact analytic solution exists and can be used as a standard of comparison in numerical accuracy comparisons. Detailed numerical accuracy studies of uniformly valid asymptotic expansions were made.
The Challenge Course Experience Questionnaire: A Facilitator's Assessment Tool
ERIC Educational Resources Information Center
Schary, David P.; Waldron, Alexis L.
2017-01-01
Challenge course programs influence a variety of psychological, social, and educational outcomes. Yet, many challenges exist when measuring challenge course outcomes like logistical constraints and a lack of specific assessment tools. This study piloted and tested an assessment tool designed for facilitators to measure participant outcomes in…
A Review of Numerical Simulation and Analytical Modeling for Medical Devices Safety in MRI
Kabil, J.; Belguerras, L.; Trattnig, S.; Pasquier, C.; Missoffe, A.
2016-01-01
Objectives: To review past and present challenges and ongoing trends in numerical simulation for MRI (Magnetic Resonance Imaging) safety evaluation of medical devices. Methods: A wide literature review of numerical and analytical simulation of simple and complex medical devices in MRI electromagnetic fields shows the evolution over time and a growing concern for MRI safety over the years. Major issues and achievements are described, as well as current trends and perspectives in this research field. Results: Numerical simulation of medical devices is constantly evolving, supported by calculation methods that are now well established. Implants with simple geometry can often be simulated in a computational human model, but one remaining issue is the experimental validation of these human models. A great concern is to assess RF heating on implants too complex to be simulated traditionally, like pacemaker leads. Thus, ongoing research focuses on alternative hybrid methods, both numerical and experimental, for example a transfer function method. For the static field and gradient fields, analytical models can be used for dimensioning simple implant shapes, but they are limited for complex geometries that cannot be studied with simplifying assumptions. Conclusions: Numerical simulation is an essential tool for MRI safety testing of medical devices. The main issues remain the accuracy of simulations compared to real life and the study of complex devices; but as the research field is constantly evolving, some promising ideas are now under investigation to take up these challenges. PMID:27830244
Existence of periodic solutions in a model of respiratory syncytial virus RSV
NASA Astrophysics Data System (ADS)
Arenas, Abraham J.; González, Gilberto; Jódar, Lucas
2008-08-01
In this paper we study the existence of positive periodic solutions for nested models of respiratory syncytial virus RSV, by using a continuation theorem based on coincidence degree theory. Conditions for the existence of periodic solutions in the model are given. Numerical simulations related to the transmission of respiratory syncytial virus in Madrid and Rio de Janeiro are included.
Trapped Modes in a Three-Layer Fluid
NASA Astrophysics Data System (ADS)
Saha, Sunanda; Bora, Swaroop Nandan
2018-03-01
In this work, trapped mode frequencies are computed for a submerged horizontal circular cylinder with the hydrodynamic set-up involving an infinite depth three-layer incompressible fluid with layer-wise different densities. The impermeable cylinder is fully immersed in either the bottom layer or the upper layer. The effect of surface tension at the surface of separation is neglected. In this set-up, there exist three wave numbers: the lowest one on the free surface and the other two on the internal interfaces. For each wave number, there exist two modes for which trapped waves exist. The existence of these trapped modes is shown by numerical evidence. We investigate the variation of these trapped modes subject to change in the depth of the middle layer as well as the submergence depth. We show numerically that two-layer and single-layer results cannot be recovered in the double and single limiting cases of the density ratios tending to unity. The existence of trapped modes shows that in general, a radiation condition for the waves at infinity is insufficient for the uniqueness of the solution of the scattering problem.
Opfer, John E; Thompson, Clarissa A; Furlong, Ellen E
2010-09-01
Numeric magnitudes often bias adults' spatial performance. Partly because the direction of this bias (left-to-right versus right-to-left) is culture-specific, it has been assumed that the orientation of spatial-numeric associations is a late development, tied to reading practice or schooling. Challenging this assumption, we found that preschoolers expected numbers to be ordered from left-to-right when they searched for objects in numbered containers, when they counted, and (to a lesser extent) when they added and subtracted. Further, preschoolers who lacked these biases demonstrated more immature, logarithmic representations of numeric value than preschoolers who exhibited the directional bias, suggesting that spatial-numeric associations aid magnitude representations for symbols denoting increasingly large numbers.
Designing Adaptive Low Dissipative High Order Schemes
NASA Technical Reports Server (NTRS)
Yee, H. C.; Sjoegreen, B.; Parks, John W. (Technical Monitor)
2002-01-01
Proper control of the numerical dissipation/filter to accurately resolve all relevant multiscales of complex flow problems while still maintaining nonlinear stability and efficiency for long-time numerical integrations poses a great challenge to the design of numerical methods. The required type and amount of numerical dissipation/filter are not only physical problem dependent, but also vary from one flow region to another. This is particularly true for unsteady high-speed shock/shear/boundary-layer/turbulence/acoustics interactions and/or combustion problems since the dynamics of the nonlinear effect of these flows are not well-understood. Even with extensive grid refinement, it is of paramount importance to have proper control on the type and amount of numerical dissipation/filter in regions where it is needed.
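The idea of region-dependent control of numerical dissipation can be made concrete with a small sketch (illustrative only, not the authors' scheme): a Jameson-type sensor measures local non-smoothness and scales a low-order dissipation term, so a discontinuity is damped while smooth regions are left essentially untouched.

```python
import numpy as np

def jameson_sensor(u, eps=1e-12):
    """Normalized second-difference sensor: ~O(1) at a jump,
    ~O(dx^2) in smooth regions."""
    s = np.zeros_like(u)
    num = np.abs(u[2:] - 2 * u[1:-1] + u[:-2])
    den = np.abs(u[2:]) + 2 * np.abs(u[1:-1]) + np.abs(u[:-2]) + eps
    s[1:-1] = num / den
    return s

def filtered(u, kappa=0.5):
    """Apply second-difference dissipation weighted by the sensor,
    so the filter acts only where the solution is non-smooth."""
    s = jameson_sensor(u)
    d2 = np.zeros_like(u)
    d2[1:-1] = u[2:] - 2 * u[1:-1] + u[:-2]
    return u + kappa * s * d2

x = np.linspace(0.0, 1.0, 201)
u = np.sin(2 * np.pi * x) + np.where(x > 0.5, 1.0, 0.0)  # smooth + jump
s = jameson_sensor(u)
print(s.max(), s[:90].max())  # large at the jump, tiny in the smooth region
```

In a full scheme the same sensor would blend between high-order and low-order dissipation; the sketch only shows the adaptive switching behavior that the passage describes.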
Scientific developments of liquid crystal-based optical memory: a review
NASA Astrophysics Data System (ADS)
Prakash, Jai; Chandran, Achu; Biradar, Ashok M.
2017-01-01
The memory behavior in liquid crystals (LCs), although rarely observed, has made very significant headway over the past three decades since its discovery in nematic-type LCs. It has gone from a mere scientific curiosity to applications in a variety of commodities. Memory elements formed by numerous LCs have been protected by patents, and some have been commercialized and used to complement non-volatile memory devices and as memory in personal computers and digital cameras. They also offer the low cost, large area, high speed, and high density memory needed for advanced computers and digital electronics. Short- and long-duration memory behavior for industrial applications has been obtained from several LC materials, and LC memories with interesting features and applications have been demonstrated using numerous LCs. However, considerable challenges still exist in searching for highly efficient, stable, and long-lifespan materials and methods so that the development of useful memory devices is possible. This review focuses on the scientific and technological approach of fascinating applications of LC-based memory. We address the introduction, development status, novel design and engineering principles, and parameters of LC memory. We also address how the amalgamation of LCs could bring significant change/improvement in memory effects in the emerging field of nanotechnology, and the application of LC memory as the active component for futuristic and interesting memory devices.
Fully-relativistic full-potential multiple scattering theory: A pathology-free scheme
NASA Astrophysics Data System (ADS)
Liu, Xianglin; Wang, Yang; Eisenbach, Markus; Stocks, G. Malcolm
2018-03-01
The Green function plays an essential role in the Korringa-Kohn-Rostoker (KKR) multiple scattering method. In practice, it is constructed from the regular and irregular solutions of the local Kohn-Sham equation, and robust methods exist for spherical potentials. However, when applied to a non-spherical potential, numerical errors from the irregular solutions give rise to pathological behavior of the charge density at small radius. Here we present a full-potential implementation of the fully-relativistic KKR method that performs ab initio self-consistent calculations by directly solving the Dirac differential equations using the generalized variable phase (sine and cosine matrices) formalism of Liu et al. (2016). The pathology around the origin is completely eliminated by carrying out the energy integration of the single-site Green function along the real axis. By using an efficient pole-searching technique to identify the zeros of the well-behaved Jost matrices, we demonstrate that this scheme is numerically stable and computationally efficient, with speed comparable to the conventional contour energy integration method, while free of the charge-density pathology. As an application, this method is utilized to investigate the crystal structures of polonium and their bulk properties, which is challenging for a conventional real-energy scheme. The noble metals are also calculated, both as a test of our method and to study the relativistic effects.
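The pole-searching step, locating the zeros of a well-behaved function along the real energy axis, can be sketched generically. The scalar "Jost determinant" stand-in below and the scan/bisection strategy are assumptions for illustration, not the authors' matrix-valued implementation:

```python
import math

def find_real_zeros(f, a, b, n=2000, tol=1e-12):
    """Locate zeros of a smooth real function on [a, b]: scan a uniform
    grid for sign changes, then refine each bracket by bisection.
    Assumes simple, well-separated zeros (as for well-behaved Jost
    determinants), so the grid spacing must resolve adjacent zeros."""
    zeros = []
    xs = [a + (b - a) * i / n for i in range(n + 1)]
    for lo, hi in zip(xs[:-1], xs[1:]):
        flo = f(lo)
        if flo * f(hi) > 0.0:
            continue  # no sign change in this cell
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if flo * f(mid) <= 0.0:
                hi = mid
            else:
                lo, flo = mid, f(mid)
        zeros.append(0.5 * (lo + hi))
    return zeros

# toy stand-in for a Jost determinant: zeros at e = pi/3 and 2*pi/3 in (0.5, 3)
energies = find_real_zeros(lambda e: math.sin(3.0 * e), 0.5, 3.0)
```

The bisection refinement keeps each function evaluation cheap and the bracketing robust, which is what makes a real-axis integration scheme competitive with contour integration.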
Neural Network Machine Learning and Dimension Reduction for Data Visualization
NASA Technical Reports Server (NTRS)
Liles, Charles A.
2014-01-01
Neural network machine learning in computer science is a continuously developing field of study. Although neural network models have been developed which can accurately predict a numeric value or nominal classification, a general-purpose method for constructing neural network architectures has yet to be developed. Computer scientists are often forced to rely on a trial-and-error process of developing and improving accurate neural network models. In many cases, models are constructed from a large number of input parameters. Determining which input parameters have the greatest impact on the model's prediction is often difficult, especially when the number of input variables is very high. This challenge is often labeled the "curse of dimensionality" in scientific fields. However, techniques exist for reducing the dimensionality of problems to just two dimensions. Once a problem's dimensions have been mapped to two dimensions, it can be easily plotted and understood by humans. The ability to visualize a multi-dimensional dataset provides a means of identifying which input variables have the greatest effect on determining a nominal or numeric output. Identifying these variables enables better training of neural network models; models can be trained more easily and quickly using only the input variables that appear to affect the outcome variable. The purpose of this project is to explore various means of training neural networks and to utilize dimensional reduction for visualizing and understanding complex datasets.
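The mapping-to-two-dimensions idea can be sketched with plain principal component analysis. The abstract does not name a specific reduction technique, so the choice of PCA and the synthetic dataset below are assumptions for illustration:

```python
import numpy as np

def pca_2d(X):
    """Project (n_samples, n_features) data onto its first two principal
    components, giving 2-D coordinates suitable for a scatter plot."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)  # rows of Vt = components
    return Xc @ Vt[:2].T

rng = np.random.default_rng(0)
# synthetic high-dimensional data whose variance lives in two hidden directions
latent = rng.normal(size=(200, 2)) * np.array([5.0, 2.0])
X = latent @ rng.normal(size=(2, 10)) + 0.01 * rng.normal(size=(200, 10))

Y = pca_2d(X)  # shape (200, 2): each row is one sample, ready to plot
explained = np.var(Y, axis=0).sum() / np.var(X, axis=0).sum()
```

Inspecting the component weights in `Vt` then hints at which original input variables drive the dominant structure, which is the variable-selection use the project describes.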
Fully-relativistic full-potential multiple scattering theory: A pathology-free scheme
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Xianglin; Wang, Yang; Eisenbach, Markus
The Green function plays an essential role in the Korringa–Kohn–Rostoker (KKR) multiple scattering method. In practice, it is constructed from the regular and irregular solutions of the local Kohn–Sham equation, and robust methods exist for spherical potentials. However, when applied to a non-spherical potential, numerical errors from the irregular solutions give rise to pathological behavior of the charge density at small radius. Here we present a full-potential implementation of the fully-relativistic KKR method that performs ab initio self-consistent calculations by directly solving the Dirac differential equations using the generalized variable phase (sine and cosine matrices) formalism of Liu et al. (2016). The pathology around the origin is completely eliminated by carrying out the energy integration of the single-site Green function along the real axis. By using an efficient pole-searching technique to identify the zeros of the well-behaved Jost matrices, we demonstrate that this scheme is numerically stable and computationally efficient, with speed comparable to the conventional contour energy integration method, while free of the charge-density pathology. As an application, this method is utilized to investigate the crystal structures of polonium and their bulk properties, which is challenging for a conventional real-energy scheme. The noble metals are also calculated, both as a test of our method and to study the relativistic effects.
Scientific developments of liquid crystal-based optical memory: a review.
Prakash, Jai; Chandran, Achu; Biradar, Ashok M
2017-01-01
The memory behavior in liquid crystals (LCs), although rarely observed, has made very significant headway over the past three decades since its discovery in nematic LCs. It has gone from a mere scientific curiosity to application in a variety of commodities. Memory elements formed from numerous LCs have been protected by patents, and some have been commercialized and used to complement non-volatile memory devices and as memory in personal computers and digital cameras. They also offer the low cost, large area, high speed, and high density needed for memory in advanced computers and digital electronics. Short- and long-duration memory behavior for industrial applications has been obtained from several LC materials, and LC memory with interesting features and applications has been demonstrated using numerous LCs. However, considerable challenges still exist in the search for highly efficient, stable, and long-lifespan materials and methods to make the development of useful memory devices possible. This review focuses on the scientific and technological aspects of the fascinating applications of LC-based memory. We address the introduction, development status, novel design and engineering principles, and parameters of LC memory. We also address how the amalgamation of LCs could bring significant change/improvement in memory effects in the emerging field of nanotechnology, and the application of LC memory as the active component of futuristic and interesting memory devices.
Fully-relativistic full-potential multiple scattering theory: A pathology-free scheme
Liu, Xianglin; Wang, Yang; Eisenbach, Markus; ...
2017-10-28
The Green function plays an essential role in the Korringa–Kohn–Rostoker (KKR) multiple scattering method. In practice, it is constructed from the regular and irregular solutions of the local Kohn–Sham equation, and robust methods exist for spherical potentials. However, when applied to a non-spherical potential, numerical errors from the irregular solutions give rise to pathological behavior of the charge density at small radius. Here we present a full-potential implementation of the fully-relativistic KKR method that performs ab initio self-consistent calculations by directly solving the Dirac differential equations using the generalized variable phase (sine and cosine matrices) formalism of Liu et al. (2016). The pathology around the origin is completely eliminated by carrying out the energy integration of the single-site Green function along the real axis. By using an efficient pole-searching technique to identify the zeros of the well-behaved Jost matrices, we demonstrate that this scheme is numerically stable and computationally efficient, with speed comparable to the conventional contour energy integration method, while free of the charge-density pathology. As an application, this method is utilized to investigate the crystal structures of polonium and their bulk properties, which is challenging for a conventional real-energy scheme. The noble metals are also calculated, both as a test of our method and to study the relativistic effects.
Cheng, Xiaorong; Ge, Hui; Andoni, Deljfina; Ding, Xianfeng; Fan, Zhao
2015-01-01
A recent hierarchical model of numerical processing, initiated by Fischer and Brugger (2011) and Fischer (2012), suggested that situated factors, such as different body postures and body movements, can influence the magnitude representation and bias numerical processing. Indeed, Loetscher et al. (2008) found that participants’ behavior in a random number generation task was biased by head rotations: more small numbers were reported after leftward than rightward head turns, i.e., a motion-numerical compatibility effect. Here, in two experiments, we explored whether similar motion-numerical compatibility effects exist for movements of other important body components, e.g., the arms, and for composite body movements as well, which are the basis of complex human activities in many ecologically meaningful situations. In Experiment 1, a motion-numerical compatibility effect was observed for lateral rotations of two body components, i.e., the head and the arms. Relatively large numbers were reported after making rightward compared to leftward movements for both lateral head and arm turns. The motion-numerical compatibility effect was observed again in Experiment 2 when participants were asked to perform composite body movements with congruent movement directions, e.g., simultaneous head left turns and arm left turns. However, it disappeared when the movement directions were incongruent, e.g., simultaneous head left turns and arm right turns. Taken together, our results extend Loetscher et al.’s (2008) finding by demonstrating that their effect is effector-general and exists for arm movements. Moreover, our study reveals for the first time that the impacts of spatial information on numerical processing induced by two sensorimotor-based situated factors, e.g., a lateral head turn and a lateral arm turn, can cancel each other out. PMID:26594188
Developing and Using an Applet to Enrich Students' Concept Image of Rational Polynomials
ERIC Educational Resources Information Center
Mason, John
2015-01-01
This article draws on extensive experience working with secondary and tertiary teachers and educators using an applet to display rational polynomials (up to degree 7 in numerator and denominator), as support for the challenge to deduce as much as possible about the graph from the graphs of the numerator and the denominator. Pedagogical and design…
Grand challenges in understanding the interplay of climate and land changes
Liu, Shuguang; Bond-Lamberty, Ben; Boysen, Lena R.; Ford, James D.; Fox, Andrew; Gallo, Kevin; Hatfield, Jerry L.; Henebry, Geoffrey M.; Huntington, Thomas G.; Liu, Zhihua; Loveland, Thomas R.; Norby, Richard J.; Sohl, Terry L.; Steiner, Allison L.; Yuan, Wenping; Zhang, Zhao; Zhao, Shuqing
2017-01-01
Half of Earth’s land surface has been altered by human activities, with consequences for the climate and weather systems at local to global scales, which in turn affect a myriad of land surface processes and adaptation behaviors. This study reviews the status and major knowledge gaps in the interactions of land and atmospheric changes and presents 11 grand challenge areas for the scientific research and adaptation community in the coming decade. These land-cover and land-use change (LCLUC)-related areas include 1) impacts on weather and climate, 2) carbon and other biogeochemical cycles, 3) biospheric emissions, 4) the water cycle, 5) agriculture, 6) urbanization, 7) acclimation of biogeochemical processes to climate change, 8) plant migration, 9) land-use projections, 10) model and data uncertainties, and, finally, 11) adaptation strategies. Numerous studies have demonstrated the effects of LCLUC on local to global climate and weather systems, but these putative effects vary greatly in magnitude and even sign across space, time, and scale and thus remain highly uncertain. At the same time, many challenges exist toward improved understanding of the consequences of atmospheric and climate change for land process dynamics and services. Future efforts must improve understanding of the scale-dependent, multifaceted perturbations and feedbacks between land and climate changes in both reality and models. To this end, one critical cross-disciplinary need is to systematically quantify and better understand measurement and model uncertainties. Finally, LCLUC mitigation and adaptation assessments must be strengthened to identify implementation barriers, evaluate and prioritize opportunities, and examine how decision-making processes work in specific contexts.
Vannoy, Steven D; Mauer, Barbara; Kern, John; Girn, Kamaljeet; Ingoglia, Charles; Campbell, Jeannie; Galbreath, Laura; Unützer, Jürgen
2011-07-01
Integration of general medical and mental health services is a growing priority for safety-net providers. The authors describe a project that established a one-year learning collaborative focused on integration of services between community health centers (CHCs) and community mental health centers (CMHCs). Specific targets were treatment for general medical and psychiatric symptoms related to depression, bipolar disorder, alcohol use disorders, and metabolic syndrome. This observational study used mixed methods. Quantitative measures included 15 patient-level health indicators, practice self-assessment of resources and support for chronic disease self-management, and participant satisfaction. Sixteen CHC-CMHC pairs were selected for the learning collaborative series. One pair dropped out because of personnel turnover. All teams increased capacity on one or more patient health indicators. CHCs scored higher than CMHCs on support for chronic disease self-management. Participation in the learning collaborative increased self-assessment scores for CHCs and CMHCs. Participant satisfaction was high. Observations by faculty indicate that quality improvement challenges included tracking patient-level outcomes, workforce issues, and cross-agency communication. Even though numerous systemic barriers were encountered, the findings support existing literature indicating that the learning collaborative is a viable quality improvement approach for enhancing integration of general medical and mental health services between CHCs and CMHCs. Real-world implementation of evidence-based guidelines presents challenges often absent in research. Technical resources and support, a stable workforce with adequate training, and adequate opportunities for collaborator communications are particular challenges for integrating behavioral and general medical services across CHCs and CMHCs.
NASA Astrophysics Data System (ADS)
Jougnot, D.; Roubinet, D.; Linde, N.; Irving, J.
2016-12-01
Quantifying fluid flow in fractured media is a critical challenge in a wide variety of research fields and applications. To this end, geophysics offers a variety of tools that can provide important information on subsurface physical properties in a noninvasive manner. Most geophysical techniques infer fluid flow by data or model differencing in time or space (i.e., they are not directly sensitive to flow occurring at the time of the measurements). An exception is the self-potential (SP) method. When water flows in the subsurface, an excess of charge in the pore water that counterbalances electric charges at the mineral-pore water interface gives rise to a streaming current and an associated streaming potential. The latter can be measured with the SP technique, meaning that the method is directly sensitive to fluid flow. Whereas numerous field experiments suggest that the SP method may allow for the detection of hydraulically active fractures, suitable tools for numerically modeling streaming potentials in fractured media do not exist. Here, we present a highly efficient two-dimensional discrete-dual-porosity approach for solving the fluid-flow and associated self-potential problems in fractured domains. Our approach is specifically designed for complex fracture networks that cannot be investigated using standard numerical methods due to computational limitations. We then simulate SP signals associated with pumping conditions for a number of examples to show that (i) accounting for matrix fluid flow is essential for accurate SP modeling and (ii) the sensitivity of SP to hydraulically active fractures is intimately linked with fracture-matrix fluid interactions. This implies that fractures associated with strong SP amplitudes are likely to be hydraulically conductive, attracting fluid flow from the surrounding matrix.
Tidal Debris from High-Velocity Collisions as Fake Dark Galaxies: A Numerical Model of VIRGOHI 21
NASA Astrophysics Data System (ADS)
Duc, Pierre-Alain; Bournaud, Frederic
2008-02-01
High-speed collisions, although common in clusters of galaxies, have long been neglected, as they are believed to cause little damage to galaxies except when they are repeated, a process called "harassment." In fact, they are able to produce faint but extended gaseous tails. Such low-mass, starless tidal debris may become detached and appear as free-floating clouds in the very deep H I surveys that are currently being carried out. We show in this paper that this debris possesses the same apparent properties as the so-called dark galaxies, objects originally detected in H I, with no optical counterpart, and presumably dark matter-dominated. We present a numerical model of the prototype of such dark galaxies, VIRGOHI 21, that is able to reproduce its main characteristics: the one-sided tail linking it to the spiral galaxy NGC 4254, the absence of stars, and above all the reversal of the velocity gradient along the tail, originally attributed to rotation caused by a massive dark matter halo, which we find to be consistent with simple streaming motions plus projection effects. According to our numerical simulations, this tidal debris was expelled 750 Myr ago during a flyby at 1100 km s-1 of NGC 4254 by a massive companion that should now lie at a projected distance of about 400 kpc. A candidate for the intruder is discussed. The existence of galaxies that have never been able to form stars had already been challenged on theoretical and observational grounds. Tidal collisions, in particular those occurring at high speed, provide a much simpler explanation for the origin of such putative dark galaxies.
Multi-scale modeling of tsunami flows and tsunami-induced forces
NASA Astrophysics Data System (ADS)
Qin, X.; Motley, M. R.; LeVeque, R. J.; Gonzalez, F. I.
2016-12-01
The modeling of tsunami flows and tsunami-induced forces in coastal communities, with the constructed environment included, is challenging for many numerical modelers because of the scale and complexity of the physical problem. A two-dimensional (2D) depth-averaged model can be efficient for modeling waves offshore but may not be accurate enough to predict the complex flow, with its transient variations in the vertical direction, around constructed environments on land. On the other hand, a more complex three-dimensional model is much more computationally expensive and can become impractical due to the size of the problem and the meshing requirements near the built environment. In this study, a 2D depth-integrated model and a 3D Reynolds-Averaged Navier-Stokes (RANS) model are built to model a 1:50 model-scale, idealized community, representative of Seaside, OR, USA, for which existing experimental data are available for comparison. Numerical results from the two models are compared with each other as well as with the experimental measurements. Both models predict the flow parameters (water level, velocity, and momentum flux in the vicinity of the buildings) accurately in general, except for the period near the initial impact, where depth-averaged models can fail to capture the complexities of the flow. Forces predicted by direct integration of the predicted pressure on structural surfaces in the 3D model are compared with forces predicted from the momentum flux in the 2D model with the constructed environment, indicating that force prediction from the 2D model is not always reliable in such a complicated case. Force predictions from integration of the pressure are also compared with forces predicted from bare-earth momentum flux calculations, revealing the importance of incorporating the constructed environment in force prediction models.
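The 2D route to force prediction, converting depth-integrated momentum flux into a drag-type structural load, can be sketched as follows. The drag-coefficient form and all numbers are illustrative assumptions, not the paper's validated procedure or its experimental conditions:

```python
RHO = 1000.0  # water density, kg/m^3

def force_from_momentum_flux(h, u, width, cd=2.0):
    """Drag-type force (N) on a structure of frontal width `width` (m)
    exposed to depth-averaged flow of depth h (m) and speed u (m/s):
    F = 0.5 * Cd * rho * (h * u^2) * width, where h * u^2 is the
    momentum flux per unit width. Cd is an assumed bluff-body drag
    coefficient; u * abs(u) keeps the sign of the flow direction."""
    return 0.5 * cd * RHO * (h * u * abs(u)) * width

# illustrative 1:50 model-scale numbers (not taken from the experiment)
F = force_from_momentum_flux(h=0.02, u=0.5, width=0.1)
```

The appeal of this estimate is that h and u come straight from the depth-averaged solver; its weakness, as the comparison in the study suggests, is that it cannot represent transient vertical pressure structure near the initial impact.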
ERIC Educational Resources Information Center
Cramer, Sharon F.
2012-01-01
As members of enrollment management units look ahead to the next few years, they anticipate many institution-wide challenges: (1) implementation of a new student information system; (2) major upgrade of an existing system; and (3) re-configuring an existing system to reflect changes in academic policies or to accommodate new federal or state…
2009-06-12
disease to name but a few. With every challenge, however, there exist just as many opportunities to enhance strategic partnership, provide...
Asymptotic-preserving Lagrangian approach for modeling anisotropic transport in magnetized plasmas
NASA Astrophysics Data System (ADS)
Chacon, Luis; Del-Castillo-Negrete, Diego
2012-03-01
Modeling electron transport in magnetized plasmas is extremely challenging due to the extreme anisotropy between the parallel (to the magnetic field) and perpendicular directions (the transport-coefficient ratio χ∥/χ⊥ ~ 10^10 in fusion plasmas). Recently, a novel Lagrangian Green's function method has been proposed [D. del-Castillo-Negrete and L. Chacón, Phys. Rev. Lett. 106, 195004 (2011); D. del-Castillo-Negrete and L. Chacón, Phys. Plasmas, submitted (2011)] to solve the local and non-local purely parallel transport equation in general 3D magnetic fields. The approach avoids numerical pollution, is inherently positivity-preserving, and is algorithmically scalable (i.e., work per degree of freedom is grid-independent). In this poster, we discuss the extension of the Lagrangian Green's function approach to include perpendicular transport terms and sources. We present an asymptotic-preserving numerical formulation, which ensures a consistent temporal and spatial discretization for arbitrary χ∥/χ⊥ ratios. We will demonstrate the potential of the approach with various challenging configurations, including the case of transport across a magnetic island in cylindrical geometry.
Numerical simulation of cavitating flows in shipbuilding
NASA Astrophysics Data System (ADS)
Bagaev, D.; Yegorov, S.; Lobachev, M.; Rudnichenko, A.; Taranov, A.
2018-05-01
The paper presents validation of numerical simulations of cavitating flows around different marine objects carried out at the Krylov State Research Centre (KSRC). Preliminary validation was done with reference to international test objects. The main part of the paper contains results of solving practical problems of ship propulsion design. Validation of the numerical simulations by comparison with experimental data shows that the supercomputer technologies available at KSRC predict both hydrodynamic and cavitation characteristics with good accuracy.
Predator-prey models with component Allee effect for predator reproduction.
Terry, Alan J
2015-12-01
We present four predator-prey models with component Allee effect for predator reproduction. Using numerical simulation results for our models, we describe how the customary definitions of component and demographic Allee effects, which work well for single species models, can be extended to predators in predator-prey models by assuming that the prey population is held fixed. We also find that when the prey population is not held fixed, then these customary definitions may lead to conceptual problems. After this discussion of definitions, we explore our four models, analytically and numerically. Each of our models has a fixed point that represents predator extinction, which is always locally stable. We prove that the predator will always die out either if the initial predator population is sufficiently small or if the initial prey population is sufficiently small. Through numerical simulations, we explore co-existence fixed points. In addition, we demonstrate, by simulation, the existence of a stable limit cycle in one of our models. Finally, we derive analytical conditions for a co-existence trapping region in three of our models, and show that the fourth model cannot possess a particular kind of co-existence trapping region. We punctuate our results with comments on their real-world implications; in particular, we mention the possibility of prey resurgence from mortality events, and the possibility of failure in a biological pest control program.
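A minimal sketch of such a model can make the extinction result concrete. The functional forms and parameter values below are assumptions for illustration, not a reproduction of any of the paper's four models: the p/(p + theta) factor is the component Allee effect in predator reproduction, and a small initial predator population collapses to the locally stable extinction state:

```python
def simulate(n0, p0, dt=0.01, steps=100000):
    """Forward-Euler integration of a logistic prey (n) / predator (p)
    model in which predator reproduction carries a component Allee
    effect via the p / (p + theta) factor: per-capita reproduction is
    suppressed when predators are scarce. All parameters are
    illustrative assumptions."""
    r, k, a, e, theta, m = 1.0, 1.0, 2.0, 1.0, 0.5, 0.2
    n, p = n0, p0
    for _ in range(steps):
        dn = r * n * (1.0 - n / k) - a * n * p
        dp = e * a * n * p * (p / (p + theta)) - m * p
        n = max(n + dt * dn, 0.0)  # clamp to keep densities non-negative
        p = max(p + dt * dp, 0.0)
    return n, p

# small initial predator population: reproduction cannot offset mortality,
# so the predator dies out and the prey relaxes to its carrying capacity
n_end, p_end = simulate(1.0, 0.01)
```

With p0 = 0.01 the Allee factor is about 0.02, so the effective birth rate is far below the mortality rate m, matching the paper's claim that extinction occurs whenever the initial predator population is sufficiently small.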
ERIC Educational Resources Information Center
Morrow, Lesley Mandel, Ed.; Woo, Deborah Gee, Ed.
As a result of the America Reads Challenge Act of 1997, numerous tutoring programs have been established to help ensure that every child reads independently by the end of third grade. This book describes exemplary America Reads programs across the country as well as other effective early literacy interventions, including Reading Recovery, the…
ERIC Educational Resources Information Center
Taskin, V.; Bernholt, S.; Parchmann, I.
2015-01-01
Chemical representations play an important role in helping learners to understand chemical contents. Thus, dealing with chemical representations is a necessity for learning chemistry, but at the same time, it presents a great challenge to learners. Due to this great challenge, it is not surprising that numerous national and international studies…
The Unique Challenges of Conserving Large Old Trees.
Lindenmayer, David B; Laurance, William F
2016-06-01
Large old trees play numerous critical ecological roles. They are susceptible to a plethora of interacting threats, in part because the attributes that confer a competitive advantage in intact ecosystems make them maladapted to rapidly changing, human-modified environments. Conserving large old trees will require surmounting a number of unresolved challenges.
ERIC Educational Resources Information Center
Judith, Kate; Bull, David
2016-01-01
The implementation of open educational resources (OER) at the course level in higher education poses numerous challenges to education practitioners--ranging from discoverability challenges to the lack of knowledge on how to best localize and utilize OER as courseware. Drawing on case studies of OER initiatives globally, the article discusses…
Weathering the Storms: Acknowledging Challenges to Learning in Times of Stress
ERIC Educational Resources Information Center
Hubschman, Betty; Lutz, Marilyn; King, Christine; Wang, Jia; Kopp, David
2006-01-01
Students and faculty have had numerous disruptions this academic year with Hurricanes Katrina, Rita, and Wilma developing into major stressors. During this innovative session, we will examine some of the challenges and strategies used by faculty to work with students to maintain empathy and academic rigor in times of stress and disruption, and…
Fire metrology: Current and future directions in physics-based measurements
Robert L. Kremens; Alistair M.S. Smith; Matthew B. Dickinson
2010-01-01
The robust evaluation of fire impacts on the biota, soil, and atmosphere requires measurement and analysis methods that can characterize combustion processes across a range of temporal and spatial scales. Numerous challenges are apparent in the literature. These challenges have led to novel research to quantify the 1) structure and heterogeneity of the pre-fire...
Suphanchaimat, Rapeepong; Kantamaturapoj, Kanang; Putthasri, Weerasak; Prakongsai, Phusit
2015-09-17
In recent years, cross-border migration has gained significant attention in high-level policy dialogues in numerous countries. While there exists some literature describing the health status of migrants and exploring migrants' perceptions of service utilisation in receiving countries, there is still little evidence examining the issue of health services for migrants through the lens of providers. This study therefore aims to systematically review the latest literature investigating the perceptions and attitudes of healthcare providers in managing care for migrants, as well as the challenges and barriers faced in their practices. A systematic review was performed by gathering evidence from three main online databases, Medline, Embase and Scopus, plus a purposive search of the World Health Organization's website and grey literature sources. The articles, published in English since 2000, were reviewed according to the following topics: (1) how healthcare providers interacted with individual migrant patients, (2) how workplace factors shaped services for migrants, and (3) how the external environment, specifically laws and professional norms, influenced their practices. Key messages of the articles were analysed by thematic analysis. Thirty-seven articles were included in the final review. Key findings of the selected articles were synthesised and presented in the data extraction form. The quality of the retrieved articles varied substantially. Almost all the selected articles had congruent findings regarding language and cultural challenges, and a lack of knowledge of the host country's health system amongst migrant patients. Most respondents expressed concerns over in-house constraints resulting from heavy workloads and the inadequacy of human resources. Professional norms strongly influenced the behaviours and attitudes of healthcare providers despite conflicting with laws that limited access to health services for illegal migrants.
The perceptions, attitudes and practices of practitioners in the provision of healthcare services for migrants were mainly influenced by: (1) diverse cultural beliefs and language differences, (2) limited institutional capacity, in terms of time and/or resource constraints, and (3) the contradiction between professional ethics and laws that limited migrants' right to health care. Nevertheless, healthcare providers addressed such problems by partially ignoring the immigrants' precarious legal status and by using numerous tactics, including seeking help from civil society groups, to support their clinical practice. It was evident that healthcare providers faced several challenges in managing care for migrants, including not only language and cultural barriers but also resource constraints within their workplaces and disharmony between the law and their professional norms. Further studies exploring health care management for migrants in countries with different health insurance models are recommended.
ERIC Educational Resources Information Center
Stavrou-Costea, Eleni
2005-01-01
Purpose: The study aims to examine the human resource management challenges in Southern EU and their effect on organizational performance. Design/methodology/approach: First, key challenges were identified in the existing literature. Then, these challenges were matched with those reported most often in the CRANET questionnaire. These challenges…
Satellite Tasking via a Tablet Computer
2015-09-01
connectivity have helped to overcome the challenges of information delivery, but there remains the challenge of real-time information. This thesis examines the...
A Robust Actin Filaments Image Analysis Framework
Alioscha-Perez, Mitchel; Benadiba, Carine; Goossens, Katty; Kasas, Sandor; Dietler, Giovanni; Willaert, Ronnie; Sahli, Hichem
2016-01-01
The cytoskeleton is a highly dynamical protein network that plays a central role in numerous cellular physiological processes, and is traditionally divided into three components according to its chemical composition, i.e. the actin, tubulin and intermediate filament cytoskeletons. Understanding cytoskeleton dynamics is of prime importance to unveil the mechanisms involved in cell adaptation to any type of stress. Fluorescence imaging of cytoskeleton structures allows analysis of the impact of mechanical stimulation on the cytoskeleton, but it also imposes additional challenges in the image processing stage, such as the presence of imaging-related artifacts and heavy blurring introduced by (high-throughput) automated scans. Although a considerable number of image-based analytical tools exist to address such image processing and analysis tasks, most of them are unfit to cope with the aforementioned challenges. Filamentous structures in images can be considered as a piecewise composition of quasi-straight segments (at least at some finer or coarser scale). Based on this observation, we propose a three-step actin filament extraction methodology: (i) the input image is decomposed into a ‘cartoon’ part corresponding to the filament structures in the image and a noise/texture part; (ii) on the ‘cartoon’ image, we apply a multi-scale line detector coupled with (iii) a quasi-straight filament merging algorithm for fiber extraction. The proposed robust actin filament image analysis framework allows the extraction of individual filaments in the presence of noise, artifacts and heavy blurring. Moreover, it provides numerous parameters, such as filament orientation, position and length, useful for further analysis. Cell image decomposition is relatively under-exploited in biological image processing, and our study shows the benefits it provides when addressing such tasks.
Experimental validation was conducted using publicly available datasets, and in osteoblasts grown in two different conditions: static (control) and fluid shear stress. The proposed methodology exhibited higher sensitivity values and similar accuracy compared to state-of-the-art methods. PMID:27551746
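The oriented-line-detection idea in step (ii) above can be sketched compactly. The following is a minimal single-scale illustration (assuming NumPy; four fixed orientations, circular boundary handling via `np.roll`, and a global rather than local background level — all simplifications of the paper's multi-scale detector):

```python
import numpy as np

def line_response(img, length=7):
    half = length // 2
    # line segments at four orientations, as pixel offsets through the center
    offsets = {
        0:   [(0, k) for k in range(-half, half + 1)],
        45:  [(k, k) for k in range(-half, half + 1)],
        90:  [(k, 0) for k in range(-half, half + 1)],
        135: [(k, -k) for k in range(-half, half + 1)],
    }
    responses = []
    for offs in offsets.values():
        # mean intensity along an oriented line segment through each pixel
        acc = sum(np.roll(img, (dy, dx), axis=(0, 1)) for dy, dx in offs)
        responses.append(acc / len(offs))
    # line strength: best oriented mean minus the global mean
    # (a crude background level; a real detector would use a local window)
    return np.stack(responses).max(axis=0) - img.mean()

# Toy image containing a single horizontal filament
img = np.zeros((32, 32))
img[16, 5:27] = 1.0
resp = line_response(img)
```

Pixels on the filament receive a strong response from the matching orientation, while background pixels score near zero, which is the property the subsequent merging step relies on.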
Design sensitivity analysis with Applicon IFAD using the adjoint variable method
NASA Technical Reports Server (NTRS)
Frederick, Marjorie C.; Choi, Kyung K.
1984-01-01
A numerical method is presented to implement structural design sensitivity analysis using the versatility and convenience of existing finite element structural analysis program and the theoretical foundation in structural design sensitivity analysis. Conventional design variables, such as thickness and cross-sectional areas, are considered. Structural performance functionals considered include compliance, displacement, and stress. It is shown that calculations can be carried out outside existing finite element codes, using postprocessing data only. That is, design sensitivity analysis software does not have to be imbedded in an existing finite element code. The finite element structural analysis program used in the implementation presented is IFAD. Feasibility of the method is shown through analysis of several problems, including built-up structures. Accurate design sensitivity results are obtained without the uncertainty of numerical accuracy associated with selection of a finite difference perturbation.
Numerical MHD study for plasmoid instability in uniform resistivity
NASA Astrophysics Data System (ADS)
Shimizu, Tohru; Kondoh, Koji; Zenitani, Seiji
2017-11-01
The plasmoid instability (PI) occurring with uniform resistivity is numerically studied with an MHD code based on the HLLD scheme. It is shown that the PI observed in numerical studies may often include a numerical (non-physical) tearing instability caused by numerical dissipation. By increasing the numerical resolution, the numerical tearing instability gradually disappears and the physical tearing instability remains; hence, convergence of the numerical results is observed. Note that the reconnection rate observed for the numerical tearing instability can be higher than that of the physical tearing instability. On the other hand, regardless of whether it is numerical or physical, the tearing instability can be classified into symmetric and asymmetric types. The symmetric tearing instability tends to occur when the thinning of the current sheet is halted by physical or numerical dissipation, often resulting in drastic changes in the plasmoid chain's structure and activity. In this paper, even after eliminating the numerical tearing instability, we could not identify a critical Lundquist number Sc beyond which the PI is fully developed, suggesting that Sc does not exist, at least around S = 10^5.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-22
... sequence of modifications to a contract or order, a method for determining the order of application for... application for modifications. (a) Circumstances may exist in which the numeric order of the modifications to... date and the same signature date, procuring contracting office modifications will be applied in numeric...
Chaos in the fractional order logistic delay system: Circuit realization and synchronization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baskonus, Haci Mehmet; Hammouch, Zakia; Mekkaoui, Toufik
2016-06-08
In this paper, we present a numerical study and a circuit design to demonstrate the existence of chaos in the fractional-order logistic delay system. In addition, we investigate an active control synchronization scheme for this system. Numerical and circuit simulations show the effectiveness and feasibility of this method.
One Language, Two Number-Word Systems and Many Problems: Numerical Cognition in the Czech Language
ERIC Educational Resources Information Center
Pixner, S.; Zuber, J.; Hermanova, V.; Kaufmann, L.; Nuerk, H.-C.; Moeller, K.
2011-01-01
Comparing numerical performance between different languages does not only mean comparing different number-word systems, but also implies a comparison of differences regarding culture or educational systems. The Czech language provides the remarkable opportunity to disentangle this confound as there exist two different number-word systems within…
Bubble coalescence in a Newtonian fluid
NASA Astrophysics Data System (ADS)
Garg, Vishrut; Basaran, Osman
2017-11-01
Bubble coalescence plays a central role in the hydrodynamics of gas-liquid systems such as bubble column reactors, spargers, and foams. Two bubbles approaching each other at velocity V coalesce when the thin film between them ruptures, which is often the rate-limiting step. Experimental studies of this system are difficult, and recent works provide conflicting results on the effect of V on coalescence times. We simulate the head-on approach of two bubbles of equal radii R in an incompressible Newtonian fluid (density ρ, viscosity μ, and surface tension σ) by solving numerically the free boundary problem comprising the Navier-Stokes and continuity equations. Simulations are made challenging by the existence of highly disparate length scales, i.e. the film thickness and the bubble radii, which are resolved by using the method of elliptic mesh generation. For a given liquid, the bubbles are shown to coalesce for all velocities below a critical value. The effects of the Ohnesorge number Oh = μ/√(ρσR) on coalescence time and critical velocity are also investigated.
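The dimensionless group in the abstract is easy to evaluate numerically. As a quick illustration (the property values below are typical for water at room temperature and are assumed here, not taken from the abstract):

```python
import math

def ohnesorge(mu, rho, sigma, R):
    """Oh = mu / sqrt(rho * sigma * R): viscous forces relative to
    inertial and surface-tension forces for a bubble of radius R."""
    return mu / math.sqrt(rho * sigma * R)

# Illustrative values, roughly water at room temperature (assumed):
# mu in Pa*s, rho in kg/m^3, sigma in N/m, R in m
oh = ohnesorge(mu=1.0e-3, rho=998.0, sigma=0.072, R=1.0e-3)
```

For a millimetre-scale bubble in water, Oh comes out well below one, i.e. the film drainage is dominated by inertia and capillarity rather than viscosity.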
Automatic alignment for three-dimensional tomographic reconstruction
NASA Astrophysics Data System (ADS)
van Leeuwen, Tristan; Maretzke, Simon; Joost Batenburg, K.
2018-02-01
In tomographic reconstruction, the goal is to reconstruct an unknown object from a collection of line integrals. Given a complete sampling of such line integrals for various angles and directions, explicit inverse formulas exist to reconstruct the object. Given noisy and incomplete measurements, the inverse problem is typically solved through a regularized least-squares approach. A challenge for both approaches is that in practice the exact directions and offsets of the x-rays are only known approximately due to, e.g. calibration errors. Such errors lead to artifacts in the reconstructed image. In the case of sufficient sampling and geometrically simple misalignment, the measurements can be corrected by exploiting so-called consistency conditions. In other cases, such conditions may not apply and we have to solve an additional inverse problem to retrieve the angles and shifts. In this paper we propose a general algorithmic framework for retrieving these parameters in conjunction with an algebraic reconstruction technique. The proposed approach is illustrated by numerical examples for both simulated data and an electron tomography dataset.
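The alternation between parameter retrieval and reconstruction described above can be illustrated on a deliberately simplified 1D analogue (assuming NumPy; this toy uses circular shifts in place of projection angles and offsets, and is not the paper's algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1D analogue of misaligned tomography: each "measurement" is the true
# signal circularly shifted by an unknown offset, plus noise.
n = 64
x_true = np.zeros(n)
x_true[20:35] = 1.0
true_shifts = rng.integers(-5, 6, size=8)
data = [np.roll(x_true, s) + 0.05 * rng.standard_normal(n) for s in true_shifts]

# Alternate between (i) estimating each offset by circular cross-correlation
# against the current reconstruction and (ii) re-reconstructing by averaging
# the realigned measurements. The object is recovered up to a global shift.
recon = np.mean(data, axis=0)  # blurred initial guess
for _ in range(5):
    est = []
    for d in data:
        corr = np.fft.ifft(np.fft.fft(recon) * np.conj(np.fft.fft(d))).real
        est.append(int(np.argmax(corr)))  # best circular shift aligning d to recon
    recon = np.mean([np.roll(d, k) for d, k in zip(data, est)], axis=0)
```

The same alternating structure carries over to the full problem, with the shift estimation replaced by optimization over projection angles and offsets and the averaging replaced by an algebraic reconstruction step.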
Exploring a Multiphysics Resolution Approach for Additive Manufacturing
NASA Astrophysics Data System (ADS)
Estupinan Donoso, Alvaro Antonio; Peters, Bernhard
2018-06-01
Metal additive manufacturing (AM) is a fast-evolving technology aiming to efficiently produce complex parts while saving resources. Worldwide, active research is being performed to solve the existing challenges of this growing technique. Constant computational advances have enabled multiscale and multiphysics numerical tools that complement the traditional physical experimentation. In this contribution, an advanced discrete-continuous concept is proposed to address the physical phenomena involved during laser powder bed fusion. The concept treats powder as discrete by the extended discrete element method, which predicts the thermodynamic state and phase change for each particle. The fluid surrounding is solved with multiphase computational fluid dynamics techniques to determine momentum, heat, gas and liquid transfer. Thus, results track the positions and thermochemical history of individual particles in conjunction with the prevailing fluid phases' temperature and composition. It is believed that this methodology can be employed to complement experimental research by analysis of the comprehensive results, which can be extracted from it to enable AM processes optimization for parts qualification.
Materials Informatics: The Materials ``Gene'' and Big Data
NASA Astrophysics Data System (ADS)
Rajan, Krishna
2015-07-01
Materials informatics provides the foundations for a new paradigm of materials discovery. It shifts our emphasis from one of solely searching among large volumes of data that may be generated by experiment or computation to one of targeted materials discovery via high-throughput identification of the key factors (i.e., “genes”) and via showing how these factors can be quantitatively integrated by statistical learning methods into design rules (i.e., “gene sequencing”) governing targeted materials functionality. However, a critical challenge in discovering these materials genes is the difficulty in unraveling the complexity of the data associated with numerous factors including noise, uncertainty, and the complex diversity of data that one needs to consider (i.e., Big Data). In this article, we explore one aspect of materials informatics, namely how one can efficiently explore for new knowledge in regimes of structure-property space, especially when no reasonable selection pathways based on theory or clear trends in observations exist among an almost infinite set of possibilities.
Minerva-Red: Small Planets Orbiting Small Stars
NASA Astrophysics Data System (ADS)
Blake, Cullen
2018-06-01
Recent results from Kepler and ground-based exoplanet surveys suggest that low-mass stars are host to numerous small planets. Since low-mass stars are intrinsically faint at optical wavelengths, obtaining the Doppler precision necessary to detect these companions remains a challenge for existing instruments. I will describe MINERVA-Red, a project to use a robotic, near-infrared optimized 0.7-meter telescope and a specialized Doppler spectrometer to carry out an intensive, multi-year campaign designed to reveal the planetary systems orbiting some of the closest stars to the Sun. The MINERVA-Red cross-dispersed echelle spectrograph is optimized for the “deep red”, between 800 nm and 900 nm, where the stars that will be targeted are relatively bright. The instrument is very compact and designed for the ultimate in Doppler precision – it uses a single-mode fiber input. I will describe the spectrometer and the status of the MINERVA-Red project, which is expected to begin routine operations at Whipple Observatory on Mt Hopkins, Arizona, in 2018.
A Review of Computational Methods for Finding Non-Coding RNA Genes
Abbas, Qaisar; Raza, Syed Mansoor; Biyabani, Azizuddin Ahmed; Jaffar, Muhammad Arfan
2016-01-01
Finding non-coding RNA (ncRNA) genes has emerged over the past few years as a cutting-edge trend in bioinformatics. There are numerous computational intelligence (CI) challenges in the annotation and interpretation of ncRNAs because the task requires domain-related expert knowledge of CI techniques. Moreover, many predicted classes have not yet been experimentally verified by researchers. Recently, researchers have applied many CI methods to predict the classes of ncRNAs. However, the diverse CI approaches lack a definitive classification framework to take advantage of past studies. A few review papers have attempted to summarize CI approaches, but focused on particular methodological viewpoints. Accordingly, in this article, we summarize, in greater detail than previously available, the CI techniques for finding ncRNA genes. We differentiate our work from the existing bodies of research and discuss concisely the technical merits of the various techniques. Lastly, we review the limitations of ncRNA gene-finding CI methods with a view towards the development of new computational tools. PMID:27918472
A Robust Shape Reconstruction Method for Facial Feature Point Detection.
Tan, Shuqiu; Chen, Dongyi; Guo, Chenggang; Huang, Zhiqi
2017-01-01
Facial feature point detection has seen great research advances in recent years. Numerous methods have been developed and applied in practical face analysis systems. However, it is still a quite challenging task because of the large variability in expressions and gestures and the existence of occlusions in real-world photos. In this paper, we present a robust sparse reconstruction method for face alignment problems. Instead of a direct regression between the feature space and the shape space, the concept of shape increment reconstruction is introduced. Moreover, a set of coupled overcomplete dictionaries, termed the shape increment dictionary and the local appearance dictionary, are learned in a regressive manner to select robust features and fit shape increments. Additionally, to make the learned model more generalized, we select the best matched parameter set through extensive validation tests. Experimental results on three public datasets demonstrate that the proposed method achieves a better robustness over the state-of-the-art methods.
Steinbrück, Lars; McHardy, Alice Carolyn
2012-01-01
Distinguishing mutations that determine an organism's phenotype from (near-) neutral ‘hitchhikers’ is a fundamental challenge in genome research, and is relevant for numerous medical and biotechnological applications. For human influenza viruses, recognizing changes in the antigenic phenotype and a strain's capability to evade pre-existing host immunity is important for the production of efficient vaccines. We have developed a method for inferring ‘antigenic trees’ for the major viral surface protein hemagglutinin. In the antigenic tree, antigenic weights are assigned to all tree branches, which allows us to resolve the antigenic impact of the associated amino acid changes. Our technique predicted antigenic distances with comparable accuracy to antigenic cartography. Additionally, it identified both known and novel sites, and amino acid changes with antigenic impact in the evolution of influenza A (H3N2) viruses from 1968 to 2003. The technique can also be applied for inference of ‘phenotype trees’ and genotype–phenotype relationships from other types of pairwise phenotype distances. PMID:22532796
Surface Estimation, Variable Selection, and the Nonparametric Oracle Property.
Storlie, Curtis B; Bondell, Howard D; Reich, Brian J; Zhang, Hao Helen
2011-04-01
Variable selection for multivariate nonparametric regression is an important, yet challenging, problem due, in part, to the infinite dimensionality of the function space. An ideal selection procedure should be automatic, stable, easy to use, and have desirable asymptotic properties. In particular, we define a selection procedure to be nonparametric oracle (np-oracle) if it consistently selects the correct subset of predictors and at the same time estimates the smooth surface at the optimal nonparametric rate, as the sample size goes to infinity. In this paper, we propose a model selection procedure for nonparametric models, and explore the conditions under which the new method enjoys the aforementioned properties. Developed in the framework of smoothing spline ANOVA, our estimator is obtained via solving a regularization problem with a novel adaptive penalty on the sum of functional component norms. Theoretical properties of the new estimator are established. Additionally, numerous simulated and real examples further demonstrate that the new approach substantially outperforms other existing methods in the finite sample setting.
Surface Estimation, Variable Selection, and the Nonparametric Oracle Property
Storlie, Curtis B.; Bondell, Howard D.; Reich, Brian J.; Zhang, Hao Helen
2010-01-01
Variable selection for multivariate nonparametric regression is an important, yet challenging, problem due, in part, to the infinite dimensionality of the function space. An ideal selection procedure should be automatic, stable, easy to use, and have desirable asymptotic properties. In particular, we define a selection procedure to be nonparametric oracle (np-oracle) if it consistently selects the correct subset of predictors and at the same time estimates the smooth surface at the optimal nonparametric rate, as the sample size goes to infinity. In this paper, we propose a model selection procedure for nonparametric models, and explore the conditions under which the new method enjoys the aforementioned properties. Developed in the framework of smoothing spline ANOVA, our estimator is obtained via solving a regularization problem with a novel adaptive penalty on the sum of functional component norms. Theoretical properties of the new estimator are established. Additionally, numerous simulated and real examples further demonstrate that the new approach substantially outperforms other existing methods in the finite sample setting. PMID:21603586
Genetic and metabolic engineering for microbial production of poly-γ-glutamic acid.
Cao, Mingfeng; Feng, Jun; Sirisansaneeyakul, Sarote; Song, Cunjiang; Chisti, Yusuf
2018-05-28
Poly-γ-glutamic acid (γ-PGA) is a natural biopolymer of glutamic acid. The repeating units of γ-PGA may be derived exclusively from d-glutamic acid, or l-glutamic acid, or both. The monomer units are linked by amide bonds between the α-amino group and the γ-carboxylic acid group. γ-PGA is biodegradable, edible and water-soluble. It has numerous existing and emerging applications in processing of foods, medicines and cosmetics. This review focuses on microbial production of γ-PGA via genetically and metabolically engineered recombinant bacteria. Strategies for improving production of γ-PGA include modification of its biosynthesis pathway, enhancing the production of its precursor (glutamic acid), and preventing loss of the precursor to competing byproducts. These and other strategies are discussed. Heterologous synthesis of γ-PGA in industrial bacterial hosts that do not naturally produce γ-PGA is discussed. Emerging trends and the challenges affecting the production of γ-PGA are reviewed.
LOCAL ORTHOGONAL CUTTING METHOD FOR COMPUTING MEDIAL CURVES AND ITS BIOMEDICAL APPLICATIONS
Einstein, Daniel R.; Dyedov, Vladimir
2010-01-01
Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method called local orthogonal cutting (LOC) for computing medial curves. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques and result in an algorithm that is efficient and noise resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods. PMID:20628546
Square ice in graphene nanocapillaries.
Algara-Siller, G; Lehtinen, O; Wang, F C; Nair, R R; Kaiser, U; Wu, H A; Geim, A K; Grigorieva, I V
2015-03-26
Bulk water exists in many forms, including liquid, vapour and numerous crystalline and amorphous phases of ice, with hexagonal ice being responsible for the fascinating variety of snowflakes. Much less noticeable but equally ubiquitous is water adsorbed at interfaces and confined in microscopic pores. Such low-dimensional water determines aspects of various phenomena in materials science, geology, biology, tribology and nanotechnology. Theory suggests many possible phases for adsorbed and confined water, but it has proved challenging to assess its crystal structure experimentally. Here we report high-resolution electron microscopy imaging of water locked between two graphene sheets, an archetypal example of hydrophobic confinement. The observations show that the nanoconfined water at room temperature forms 'square ice'--a phase having symmetry qualitatively different from the conventional tetrahedral geometry of hydrogen bonding between water molecules. Square ice has a high packing density with a lattice constant of 2.83 Å and can assemble in bilayer and trilayer crystallites. Molecular dynamics simulations indicate that square ice should be present inside hydrophobic nanochannels independently of their exact atomic nature.
[Microarray CGH: principle and use for constitutional disorders].
Sanlaville, D; Lapierre, J M; Coquin, A; Turleau, C; Vermeesch, J; Colleaux, L; Borck, G; Vekemans, M; Aurias, A; Romana, S P
2005-10-01
Chip technology has made it possible to miniaturize processes, allowing many chemical reactions to be carried out in a single step using the same device. The application of this technology to molecular cytogenetics led to the development of the comparative genomic hybridization (CGH) on microarrays technique. Using this technique, it is possible to detect very small genetic imbalances anywhere in the genome. Its usefulness has been well documented in cancer and, more recently, in constitutional disorders. In particular, it has been used to detect interstitial and subtelomeric submicroscopic imbalances, to characterize their size at the molecular level, and to define the breakpoints of translocations. The challenge today is to transfer this technology to laboratory medicine. Nevertheless, the technology remains expensive, and the existence of numerous sequence polymorphisms makes interpretation difficult. Finally, it is unlikely to make karyotyping obsolete, as it does not allow the detection of balanced rearrangements, which after meiotic segregation might result in genome imbalance in the progeny.
Wall roughness induces asymptotic ultimate turbulence
NASA Astrophysics Data System (ADS)
Zhu, Xiaojue; Verschoof, Ruben A.; Bakhuis, Dennis; Huisman, Sander G.; Verzicco, Roberto; Sun, Chao; Lohse, Detlef
2018-04-01
Turbulence governs the transport of heat, mass and momentum on multiple scales. In real-world applications, wall-bounded turbulence typically involves surfaces that are rough; however, characterizing and understanding the effects of wall roughness on turbulence remains a challenge. Here, by combining extensive experiments and numerical simulations, we examine the paradigmatic Taylor-Couette system, which describes the closed flow between two independently rotating coaxial cylinders. We show how wall roughness greatly enhances the overall transport properties and the corresponding scaling exponents associated with wall-bounded turbulence. We reveal that if only one of the walls is rough, the bulk velocity is slaved to the rough side, due to the much stronger coupling to that wall by the detaching flow structures. If both walls are rough, the viscosity dependence is eliminated, giving rise to asymptotic ultimate turbulence—the upper limit of transport—the existence of which was predicted more than 50 years ago. In this limit, the scaling laws can be extrapolated to arbitrarily large Reynolds numbers.
ScanImage: flexible software for operating laser scanning microscopes.
Pologruto, Thomas A; Sabatini, Bernardo L; Svoboda, Karel
2003-05-17
Laser scanning microscopy is a powerful tool for analyzing the structure and function of biological specimens. Although numerous commercial laser scanning microscopes exist, some of the more interesting and challenging applications demand custom design. A major impediment to custom design is the difficulty of building custom data acquisition hardware and writing the complex software required to run the laser scanning microscope. We describe a simple, software-based approach to operating a laser scanning microscope without the need for custom data acquisition hardware. Data acquisition and control of laser scanning are achieved through standard data acquisition boards. The entire burden of signal integration and image processing is placed on the CPU of the computer. We quantitate the effectiveness of our data acquisition and signal conditioning algorithm under a variety of conditions. We implement our approach in an open source software package (ScanImage) and describe its functionality. We present ScanImage, software to run a flexible laser scanning microscope that allows easy custom design.
NASA Astrophysics Data System (ADS)
Luo, Chun-Ling; Zhuo, Ling-Qing
2017-01-01
Imaging through atmospheric turbulence is a topic with a long history, and grand challenges still exist in the remote sensing and astronomical observation fields. In this letter, we propose a simple scheme to improve the resolution of imaging through turbulence based on the computational ghost imaging (CGI) and computational ghost diffraction (CGD) setup via laser beam shaping techniques. A unified theory of CGI and CGD through turbulence with a multi-Gaussian shaped incoherent source is developed, and numerical examples are given to show clearly the effects of the system parameters on CGI and CGD. Our results show that the atmospheric effect on the CGI and CGD system is closely related to the propagation distance between the source and the object. In addition, by properly increasing the beam order of the multi-Gaussian source, we can improve the resolution of CGI and CGD through turbulence relative to the commonly used Gaussian source. Our results may therefore find applications in remote sensing and astronomical observation.
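The core CGI reconstruction step is a second-order intensity correlation between computed illumination patterns and a single-pixel "bucket" signal. A minimal turbulence-free sketch (assuming NumPy; the object, pattern statistics, and sizes are illustrative, not the letter's multi-Gaussian setup):

```python
import numpy as np

rng = np.random.default_rng(1)

# Object: a bright rectangle on a 16 x 16 transmissive mask
n = 16
obj = np.zeros((n, n))
obj[5:11, 4:12] = 1.0

# Known random illumination patterns and the single-pixel "bucket" signal
m = 20000
patterns = rng.random((m, n, n))
bucket = np.tensordot(patterns, obj, axes=([1, 2], [0, 1]))  # S_i = sum(I_i * T)

# Second-order correlation reconstruction: G = <S * I> - <S><I>
g = (bucket[:, None, None] * patterns).mean(axis=0) \
    - bucket.mean() * patterns.mean(axis=0)
```

Because the patterns are computed rather than measured, the only per-shot measurement is the scalar bucket value, which is what makes the computational variant attractive for imaging through a distorting medium.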
NASA Technical Reports Server (NTRS)
Shen, Hayley H.
1991-01-01
The liquid fuel combustion process is greatly affected by the rate of droplet evaporation. The heat and mass exchanges between gas and liquid couple the dynamics of both phases in all aspects: mass, momentum, and energy. Correct prediction of the evaporation rate is therefore a key issue in the engineering design of liquid combustion devices. Current analytical tools for characterizing the behavior of these devices are based on results from a single isolated droplet. Numerous experimental studies have challenged the applicability of these results in a dense spray. To account for droplet interactions in a dense spray, a number of theories have been developed in the past decade. Herein, two tasks were examined: one was to study how to implement the existing theoretical results, and the other was to explore the possibility of experimental verification. The current theoretical results of group evaporation are given for a monodispersed cluster subject to adiabatic conditions. The time evolution of the fluid mechanical and thermodynamic behavior of this cluster is derived. The results given are not in the form of a subscale model for CFD codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nichols, Teresa A.; Lapsa, Melissa Voss
Oak Ridge National Laboratory (ORNL) is both the largest science and energy laboratory of the US Department of Energy (DOE) and one of the oldest national laboratories still operating at its original site. These characteristics provide the Sustainable Campus Initiative (SCI) both a unique opportunity and a unique challenge to integrate sustainability into facilities and activities. As outlined in this report, SCI is leveraging the outcomes of ORNL’s DOE-sponsored research and development programs to maximize the efficient use of energy and natural resources across ORNL. Wherever possible, ORNL is integrating technical innovations into new and existing facilities, systems, and processes with a widespread approach to achieving Executive Order 13514. ORNL continues to pursue and deploy innovative solutions and initiatives to advance regional, national, and worldwide sustainability and continues to transform its culture and engage employees in supporting sustainability at work, at home, and in the community. Table 1 summarizes ORNL's FY 2013 performance and planned actions to attain future goals. ORNL has achieved numerous successes during FY 2013, which are described in detail throughout this document.
Boys’ and Young Men’s Perspectives on Violence in Northern Tanzania
Likindikoki, Samuel; Kaaya, Sylvia
2013-01-01
The challenge of violence for youth in low-income countries includes a range of experiences from witnessing to experiencing to participating in violence. Although boys and young men are often the perpetrators of such violence, they may also be its victims. Yet little evidence exists from the voiced experiences of boys themselves on perceptions and interpretations of the violence around them. Given the numerous negative health implications of violence for boys, for the girls and other boys with whom they interact, and for the health of their future partners and families, we conducted an in-depth study in rural and urban Tanzania with adolescent boys on the masculinity norms shaping their transitions through puberty that might be contributing to high-risk behaviours, including engagement in violence. The findings identified underlying societal gendered norms influencing the enactment of violence, and recommendations from the boys on how to diminish the violence around them. Additional research is needed with boys on the social norms and structural factors influencing their engagement in violence. PMID:23586440
MINERVA-Red: A Census of Planets Orbiting the Nearest Low-mass Stars to the Sun
NASA Astrophysics Data System (ADS)
Blake, Cullen; Johnson, John; Plavchan, Peter; Sliski, David; Wittenmyer, Robert A.; Eastman, Jason D.; Barnes, Stuart
2015-01-01
Recent results from Kepler and ground-based exoplanet surveys suggest that low-mass stars host numerous small planets. Since low-mass stars are intrinsically faint at optical wavelengths, obtaining the Doppler precision necessary to detect these companions remains a challenge for existing instruments. We describe MINERVA-Red, a project to use a dedicated, robotic, near-infrared optimized 0.7 meter telescope and a specialized Doppler spectrometer to carry out an intensive, multi-year campaign designed to reveal the planetary systems orbiting some of the closest stars to the Sun. The MINERVA-Red cross-dispersed echelle spectrograph is optimized for the 'deep red', between 800 nm and 900 nm, where these stars are relatively bright. The instrument is very compact and designed for the ultimate in Doppler precision by using single-mode fiber input. We describe the spectrometer and the status of the MINERVA-Red project, which is expected to begin routine operations at Whipple Observatory on Mt Hopkins, Arizona, in 2015.
Network Analyses for Space-Time High Frequency Wind Data
NASA Astrophysics Data System (ADS)
Laib, Mohamed; Kanevski, Mikhail
2017-04-01
Recently, network science has contributed substantially to the analysis, modelling and visualization of complex time series, and numerous methods have been proposed for constructing networks. This work studies spatio-temporal wind data using networks based on the Granger causality test. Furthermore, a visual comparison is carried out across several data frequencies and different sizes of the moving window. The main attention is paid to the temporal evolution of connectivity intensity. The Hurst exponent is applied to the resulting time series in order to explore whether the connectivity exhibits long memory. The results reveal the space-time structure of the wind data, and the approach can be applied to other environmental data. The dataset used presents a challenging case study: high frequency (10 minute) wind data from 120 measuring stations in Switzerland for the period 2012-2013, with stations covering different geomorphological zones and elevation levels. The results are also compared with the Pearson correlation network.
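The core construction, drawing a directed edge wherever one station's series Granger-causes another's, can be sketched with a plain OLS F-test. This is a generic illustration on synthetic data, not the authors' implementation; the lag order and the 0.05 significance level are assumptions:

```python
import numpy as np
from scipy import stats

def granger_pvalue(x, y, lag=2):
    """F-test p-value for the null hypothesis 'y does not Granger-cause x'."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    target = x[lag:]
    lx = [x[lag - k : n - k] for k in range(1, lag + 1)]  # own lags of x
    ly = [y[lag - k : n - k] for k in range(1, lag + 1)]  # lags of the candidate cause
    ones = [np.ones(n - lag)]

    def rss(cols):
        X = np.column_stack(cols)
        beta, *_ = np.linalg.lstsq(X, target, rcond=None)
        return np.sum((target - X @ beta) ** 2)

    rss_r = rss(ones + lx)        # restricted model: past of x only
    rss_u = rss(ones + lx + ly)   # unrestricted model: past of x and y
    df1, df2 = lag, len(target) - (1 + 2 * lag)
    F = ((rss_r - rss_u) / df1) / (rss_u / df2)
    return stats.f.sf(F, df1, df2)

def granger_network(series, lag=2, alpha=0.05):
    """Directed adjacency matrix: adj[j, i] = 1 if column j Granger-causes column i."""
    m = series.shape[1]
    adj = np.zeros((m, m), dtype=int)
    for i in range(m):
        for j in range(m):
            if i != j and granger_pvalue(series[:, i], series[:, j], lag) < alpha:
                adj[j, i] = 1
    return adj

# Demo on synthetic series where y drives x with a one-step lag:
rng = np.random.default_rng(0)
y_s = rng.normal(size=500)
x_s = np.empty(500)
x_s[0] = 0.0
for t in range(1, 500):
    x_s[t] = 0.9 * y_s[t - 1] + 0.1 * rng.normal()
adj = granger_network(np.column_stack([x_s, y_s]))
```

The temporal evolution of connectivity intensity would then follow by applying `granger_network` inside a moving window and tracking, for example, `adj.sum()` over time.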
Treating hearing disorders with cell and gene therapy
NASA Astrophysics Data System (ADS)
Gillespie, Lisa N.; Richardson, Rachael T.; Nayagam, Bryony A.; Wise, Andrew K.
2014-12-01
Hearing loss is an increasing problem for a substantial number of people and, with an aging population, the incidence and severity of hearing loss will become more significant over time. There are very few therapies currently available to treat hearing loss, and so the development of new therapeutic strategies for hearing impaired individuals is of paramount importance to address this unmet clinical need. Most forms of hearing loss are progressive in nature and therefore an opportunity exists to develop novel therapeutic approaches to slow or halt hearing loss progression, or even repair or replace lost hearing function. Numerous emerging technologies have potential as therapeutic options. This paper details the potential of cell- and gene-based therapies to provide therapeutic agents to protect sensory and neural cells from various insults known to cause hearing loss; explores the potential of replacing lost sensory and nerve cells using gene and stem cell therapy; and describes the considerations for clinical translation and the challenges that need to be overcome.
Managing Emergency Situations in the Smart City: The Smart Signal
Asensio, Ángel; Blanco, Teresa; Blasco, Rubén; Marco, Álvaro; Casas, Roberto
2015-01-01
A city contains numerous items that are essential yet often go unnoticed; signals are a case in point. Signals are usually considered objects of little technological interest, but in this paper we show that making them smart and integrating them into the IoT (Internet of Things) can be a relevant contribution to the Smart City. This paper presents the concept of the Smart Signal: a device that is conscious of its context, has communication skills, is able to offer the best message to the user, and, as a ubiquitous element, contributes information to the city. We present the design considerations and a real implementation and validation of the system in one of the most challenging environments a city can offer: a tunnel. The main advantages of the Smart Signal are improved signal functionality, new capabilities for interaction with users, and a new sensory mechanism for the Smart City. PMID:26094626
Gloger, Oliver; Kühn, Jens; Stanski, Adam; Völzke, Henry; Puls, Ralf
2010-07-01
Automatic 3D liver segmentation in magnetic resonance (MR) data sets has proven to be a very challenging task in the domain of medical image analysis. Numerous approaches exist for automatic 3D liver segmentation in computed tomography data sets, and these have influenced the segmentation of MR images. In contrast to previous approaches to liver segmentation in MR data sets, we use all available MR channel information from different weightings and formulate liver tissue and position probabilities in a probabilistic framework. We apply multiclass linear discriminant analysis as a fast and efficient dimensionality reduction technique and generate probability maps that are then used for segmentation. We develop a fully automatic three-step 3D segmentation approach based upon a modified region growing approach and a further threshold technique. Finally, we incorporate characteristic prior knowledge to improve the segmentation results. This novel 3D segmentation approach is modularized and can be applied to both normal and fat accumulated liver tissue.
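The LDA-plus-probability-map step can be illustrated with scikit-learn on synthetic two-class "voxel" data. The channel means, noise level, and class labels below are invented for the illustration and bear no relation to the study's MR data:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic stand-in for multi-channel MR voxels: 3 channel intensities per
# voxel, class 1 = "liver", class 0 = "background" (all values made up).
rng = np.random.default_rng(1)
liver = rng.normal(loc=[2.0, 1.0, 0.5], scale=0.3, size=(500, 3))
background = rng.normal(loc=[0.0, 0.0, 0.0], scale=0.3, size=(500, 3))
X = np.vstack([liver, background])
y = np.array([1] * 500 + [0] * 500)

# LDA reduces the multi-channel features and yields per-voxel class
# probabilities, i.e. a probability map.
lda = LinearDiscriminantAnalysis()
lda.fit(X, y)
prob_map = lda.predict_proba(X)[:, 1]  # per-voxel liver-tissue probability
```

A thresholding or region-growing step, as in the paper's three-step pipeline, would then operate on `prob_map` rather than on raw intensities.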
Stone, David; Bress, William
2007-01-01
Toxigenic cyanobacteria, commonly known as blue green algae, are an emerging public health issue. The toxins produced by cyanobacteria have been detected across the United States in marine, freshwater and estuarine systems and associated with adverse health outcomes. The intent of this paper is to focus on how to address risk in a recreational freshwater scenario when toxigenic cyanobacteria are present. Several challenges exist for monitoring, assessing and posting water bodies and advising the public when toxigenic cyanobacteria are present. These include addressing different recreational activities that are associated with varying levels of risk, the dynamic temporal and spatial aspects of blooms, data gaps in toxicological information and the lack of training and resources for adequate surveillance. Without uniform federal guidance, numerous states have taken public health action for cyanobacteria with different criteria. Vermont and Oregon independently developed a tiered decision-making framework to reduce risk to recreational users when toxigenic cyanobacteria are present. This framework is based on a combination of qualitative and quantitative information.
Dynamic Analysis of a Reaction-Diffusion Rumor Propagation Model
NASA Astrophysics Data System (ADS)
Zhao, Hongyong; Zhu, Linhe
2016-06-01
The rapid development of the Internet, and especially the emergence of social networks, has brought rumor propagation into a new media era. Rumor propagation in social networks poses new challenges to network security and social stability. Based on partial differential equations (PDEs), this paper proposes a new SIS rumor propagation model that accounts for the effect of communication between different rumor-infected users. The stability of the rumor-free equilibrium point and of the rumor-spreading equilibrium point is analysed via the linearization technique and the method of upper and lower solutions, and the existence of a traveling wave solution is established by a cross-iteration scheme combined with upper and lower solutions and Schauder's fixed point theorem. Furthermore, we introduce a time delay into rumor propagation and derive the conditions for Hopf bifurcation and stability switches of the rumor-spreading equilibrium point, taking the time delay as the bifurcation parameter. Finally, numerical simulations are performed to illustrate the theoretical results.
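The reaction-diffusion structure of such a model can be sketched with an explicit finite-difference scheme in one space dimension. The parameter values, domain, and initial condition below are illustrative assumptions, not the paper's:

```python
import numpy as np

# 1D SIS-type rumor model with densities normalized so that S = 1 - I:
#   dI/dt = d * d2I/dx2 + beta * (1 - I) * I - gamma * I
d, beta, gamma = 0.1, 0.8, 0.3            # illustrative rates
nx, dx, dt, steps = 100, 0.1, 0.01, 2000  # d*dt/dx**2 = 0.1 <= 0.5 (stable)
I = np.zeros(nx)
I[45:55] = 0.1                            # rumor starts in a small region
for _ in range(steps):
    lap = (np.roll(I, 1) - 2 * I + np.roll(I, -1)) / dx**2
    lap[0] = (I[1] - I[0]) / dx**2        # zero-flux (Neumann) boundaries
    lap[-1] = (I[-2] - I[-1]) / dx**2
    I = I + dt * (d * lap + beta * (1 - I) * I - gamma * I)
# Because beta > gamma, a travelling front spreads outward and I approaches
# the rumor-spreading equilibrium 1 - gamma/beta = 0.625 everywhere.
```

The existence and speed of the front are exactly what the paper's traveling-wave analysis addresses rigorously; the sketch merely shows the qualitative behavior.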
Requirements for plant coexistence through pollination niche partitioning
Benadi, Gita
2015-01-01
Plant–pollinator interactions are often thought to have been a decisive factor in the diversification of flowering plants, but to be of little or no importance for the maintenance of existing plant diversity. In a recent opinion paper, Pauw (2013 Trends Ecol. Evol. 28, 30–37. (doi:10.1016/j.tree.2012.07.019)) challenged this view by proposing a mechanism of diversity maintenance based on pollination niche partitioning. In this article, I investigate under which conditions the mechanism suggested by Pauw can promote plant coexistence, using a mathematical model of plant and pollinator population dynamics. Numerical simulations show that this mechanism is most effective when the costs of searching for flowers are low, pollinator populations are strongly limited by resources other than pollen and nectar, and plant–pollinator interactions are sufficiently specialized. I review the empirical literature on these three requirements, discuss additional factors that may be important for diversity maintenance through pollination niche partitioning, and provide recommendations on how to detect this coexistence mechanism in natural plant communities. PMID:26108627
Stochastic models for inferring genetic regulation from microarray gene expression data.
Tian, Tianhai
2010-03-01
Microarray expression profiles are inherently noisy, and many different sources of variation exist in microarray experiments. Developing stochastic models that capture this noise remains a significant challenge, and the noise has a profound influence on the reverse engineering of genetic regulation. Using the target genes of the tumour suppressor gene p53 as the test problem, we developed stochastic differential equation models and established the relationship between the noise strength of the stochastic models and the parameters of an error model describing the distribution of microarray measurements. Numerical results indicate that the variance simulated from stochastic models with a stochastic degradation process can be represented by a monomial in the hybridization intensity, with the order of the monomial depending on the type of stochastic process. The developed stochastic models with multiple stochastic processes generated simulations whose variance is consistent with the prediction of the error model. This work also establishes a general method for developing stochastic models from experimental information.
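The stochastic-degradation idea can be sketched with an Euler-Maruyama integration of a chemical-Langevin-style equation. The rates and noise form below are illustrative assumptions, not the fitted p53-pathway models of the paper:

```python
import numpy as np

# dX = (s - d*X) dt + sqrt(s + d*X) dW : synthesis and degradation with
# intrinsic birth-death noise, integrated by the Euler-Maruyama scheme.
rng = np.random.default_rng(42)
s, d = 10.0, 0.1                 # illustrative synthesis/degradation rates
dt, steps, ntraj = 0.01, 5000, 2000
X = np.full(ntraj, s / d)        # start at the deterministic steady state (100)
sqdt = np.sqrt(dt)
for _ in range(steps):
    drift = s - d * X
    diffusion = np.sqrt(np.maximum(s + d * X, 0.0))  # clamp to keep sqrt real
    X = np.maximum(X + drift * dt + diffusion * rng.normal(size=ntraj) * sqdt, 0.0)
# Linearizing around X = s/d gives an OU process with stationary variance
# (s + d*(s/d)) / (2*d) = 100, i.e. a standard deviation of about 10.
```

Fitting the relationship between such simulated variances and measured intensities is the step the paper formalizes through its error model.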
Reputation and Reward: Two Sides of the Same Bitcoin
Delgado-Segura, Sergi; Tanas, Cristian; Herrera-Joancomartí, Jordi
2016-01-01
In Mobile Crowd Sensing (MCS), the power of the crowd, jointly with the sensing capabilities of the smartphones they wear, provides a new paradigm for data sensing. Scenarios involving user behavior or those that rely on user mobility are examples where standard sensor networks may not be suitable, and MCS provides an interesting solution. However, including human participation in sensing tasks presents numerous and unique research challenges. In this paper, we analyze three of the most important: user participation, data sensing quality and user anonymity. We tackle the three as a whole, since all of them are strongly correlated. As a result, we present PaySense, a general framework that incentivizes user participation and provides a mechanism to validate the quality of collected data based on the users’ reputation. All such features are performed in a privacy-preserving way by using the Bitcoin cryptocurrency. Rather than a theoretical one, our framework has been implemented, and it is ready to be deployed and complement any existing MCS system. PMID:27240373
Square ice in graphene nanocapillaries
NASA Astrophysics Data System (ADS)
Algara-Siller, G.; Lehtinen, O.; Wang, F. C.; Nair, R. R.; Kaiser, U.; Wu, H. A.; Geim, A. K.; Grigorieva, I. V.
2015-03-01
Bulk water exists in many forms, including liquid, vapour and numerous crystalline and amorphous phases of ice, with hexagonal ice being responsible for the fascinating variety of snowflakes. Much less noticeable but equally ubiquitous is water adsorbed at interfaces and confined in microscopic pores. Such low-dimensional water determines aspects of various phenomena in materials science, geology, biology, tribology and nanotechnology. Theory suggests many possible phases for adsorbed and confined water, but it has proved challenging to assess its crystal structure experimentally. Here we report high-resolution electron microscopy imaging of water locked between two graphene sheets, an archetypal example of hydrophobic confinement. The observations show that the nanoconfined water at room temperature forms 'square ice', a phase having symmetry qualitatively different from the conventional tetrahedral geometry of hydrogen bonding between water molecules. Square ice has a high packing density with a lattice constant of 2.83 Å and can assemble in bilayer and trilayer crystallites. Molecular dynamics simulations indicate that square ice should be present inside hydrophobic nanochannels independently of their exact atomic nature.
Numerical Validation of Chemical Compositional Model for Wettability Alteration Processes
NASA Astrophysics Data System (ADS)
Bekbauov, Bakhbergen; Berdyshev, Abdumauvlen; Baishemirov, Zharasbek; Bau, Domenico
2017-12-01
Chemical compositional simulation of enhanced oil recovery and surfactant enhanced aquifer remediation processes is a complex task that involves solving dozens of equations for all grid blocks representing a reservoir. In the present work, we perform a numerical validation of a newly developed mathematical formulation which satisfies the conservation laws of mass and energy and allows a sequential solution approach in which the governing equations are solved separately and implicitly. Through application to a numerical experiment using a wettability alteration model, and through comparisons with an existing chemical compositional model's numerical results, the new model has proven to be practical, reliable and stable.
The SIR model of Zika virus disease outbreak in Brazil at year 2015
NASA Astrophysics Data System (ADS)
Aik, Lim Eng; Kiang, Lam Chee; Hong, Tan Wei; Abu, Mohd Syafarudy
2017-05-01
This study presents a numerical model for understanding the spread of the 2015 Zika virus disease outbreak using the standard SIR framework. In modeling virulent disease dynamics, it is important to explore whether the disease spread will reach a pandemic level or can be eradicated. Data from the 2015 Zika virus disease outbreak are used, and Brazil, where the outbreak began, is considered in this study. A three-dimensional system of nonlinear differential equations is formulated and solved numerically using Euler's method in MS Excel. The study shows that, with public health interventions, the effective reproduction number can be reduced, making it possible for the outbreak to die out. It is also shown numerically that the outbreak can only die out when there are no new infected people in the population.
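The framework described, a three-equation SIR system stepped forward with Euler's method, can be sketched directly in code (a spreadsheet performs the same arithmetic column by column). The rates below are illustrative, not the values fitted to the Brazilian outbreak:

```python
# Forward-Euler integration of the standard SIR model:
#   dS/dt = -beta*S*I/N,  dI/dt = beta*S*I/N - gamma*I,  dR/dt = gamma*I
N = 1_000_000
beta, gamma = 0.5, 0.25      # illustrative rates; basic reproduction number R0 = 2
S, I, R = N - 10.0, 10.0, 0.0
dt = 0.1                     # time step in days
peak = 0.0
for _ in range(3000):        # 300 days
    dS = -beta * S * I / N
    dI = beta * S * I / N - gamma * I
    dR = gamma * I
    S, I, R = S + dt * dS, I + dt * dI, R + dt * dR
    peak = max(peak, I)
# The outbreak grows while the effective reproduction number beta*S/(gamma*N)
# exceeds 1 and then dies out, as the abstract describes.
```

Lowering `beta` (a public health intervention) pushes the effective reproduction number below 1 sooner, which is exactly the eradication condition the study explores.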
Resonances and bound states in the continuum on periodic arrays of slightly noncircular cylinders
NASA Astrophysics Data System (ADS)
Hu, Zhen; Lu, Ya Yan
2018-02-01
Optical bound states in the continuum (BICs), especially those on periodic structures, have interesting properties and potentially important applications. Existing theoretical and numerical studies for optical BICs are mostly for idealized structures with simple and perfect geometric features, such as circular holes, rectangular cylinders and spheres. Since small distortions are always present in actual fabricated structures, we perform a high accuracy numerical study for BICs and resonances on a simple periodic structure with small distortions, i.e., periodic arrays of slightly noncircular cylinders. Our numerical results confirm that symmetries are important not only for the so-called symmetry-protected BICs, but also for the majority of propagating BICs which do not have a symmetry mismatch with the outgoing radiation waves. Typically, the BICs continue to exist if the small distortions keep the relevant symmetries, and they become resonant modes with finite quality factors if the small distortions break a required symmetry.
The Construction of 3-d Neutral Density for Arbitrary Data Sets
NASA Astrophysics Data System (ADS)
Riha, S.; McDougall, T. J.; Barker, P. M.
2014-12-01
The Neutral Density variable allows inference of water pathways from thermodynamic properties in the global ocean, and is therefore an essential component of global ocean circulation analysis. The widely used algorithm for the computation of Neutral Density yields accurate results for data sets which are close to the observed climatological ocean. Long-term numerical climate simulations, however, often generate a significant drift from present-day climate, which renders the existing algorithm inaccurate. To remedy this problem, new algorithms which operate on arbitrary data have been developed, which may potentially be used to compute Neutral Density during runtime of a numerical model. We review existing approaches for the construction of Neutral Density in arbitrary data sets, detail their algorithmic structure, and present an analysis of the computational cost for implementations on a single-CPU computer. We discuss possible strategies for the implementation in state-of-the-art numerical models, with a focus on distributed computing environments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zuo, Peng; Fan, Zheng, E-mail: ZFAN@ntu.edu.sg; Zhou, Yu
2016-07-15
Nonlinear guided waves have been investigated widely in simple geometries, such as plates, pipes and shells, where analytical solutions have been developed. This paper extends the application of nonlinear guided waves to waveguides with arbitrary cross sections. The criteria for the existence of nonlinear guided waves were summarized based on the finite deformation theory and nonlinear material properties. Numerical models were developed for the analysis of nonlinear guided waves in complex geometries, including a nonlinear Semi-Analytical Finite Element (SAFE) method to identify internal resonant modes in complex waveguides, and Finite Element (FE) models to simulate the nonlinear wave propagation at resonant frequencies. Two examples, an aluminum plate and a steel rectangular bar, were studied using the proposed numerical model, demonstrating the existence of nonlinear guided waves in such structures and the energy transfer from primary to secondary modes.
The Implicit Function Theorem and Non-Existence of Limit of Functions of Several Variables
ERIC Educational Resources Information Center
dos Santos, A. L. C.; da Silva, P. N.
2008-01-01
We use the Implicit Function Theorem to establish a result of non-existence of limit to a certain class of functions of several variables. We consider functions given by quotients such that both the numerator and denominator functions are null at the limit point. We show that the non-existence of the limit of such function is related with the…
NASA Technical Reports Server (NTRS)
Park, Sang C.; Brinckerhoff, Pamela; Franck, Randy; Schweickart, Rusty; Thomson, Shaun; Burt, Bill; Ousley, Wes
2016-01-01
The James Webb Space Telescope (JWST) Optical Telescope Element (OTE) assembly is the largest optically stable infrared-optimized telescope currently being manufactured and assembled, and is scheduled for launch in 2018. The JWST OTE, including the primary mirrors, secondary mirror, and the Aft Optics Subsystem (AOS), is designed to be passively cooled and to operate near 45 Kelvin. Due to the size of its large sunshield in relation to existing test facilities, JWST cannot be optically or thermally tested as a complete observatory-level system at flight temperatures. As a result, the telescope portion along with its instrument complement will be tested as a single unit very late in the program, on the program schedule critical path. To mitigate schedule risks, a set of 'pathfinder' cryogenic tests will be performed to reduce program risks by demonstrating the optical testing capabilities of the facility, characterizing telescope thermal performance, and allowing project personnel to learn valuable testing lessons off-line. This paper describes the 'pathfinder' cryogenic test program, focusing on the recently completed second test in the series, the Optical Ground Support Equipment 2 (OGSE2) test. The JWST OGSE2 test was successfully completed within the allocated project schedule while facing numerous conflicting thermal requirements during cool-down to the final cryogenic operational temperatures and during warm-up after the cryo-stable optical tests. The challenges included developing pre-test cool-down and warm-up profiles without a reliable method to predict the thermal behavior in a rarefied helium environment, and managing test article hardware safety driven by the project Limits and Constraints (L&C's). Furthermore, the OGSE2 test included the time-critical Aft Optics Subsystem (AOS), a part of the flight Optical Telescope Element that would need to be placed back into the overall telescope assembly integration flow.
The OGSE2 test requirements included strict adherence to the project contamination controls due to the presence of contamination-sensitive flight optical elements. The test operations required close coordination of numerous personnel while they were being trained for the 'final' combined OTE and instrument cryo-test in 2017. This paper also covers the OGSE2 thermal data look-back review.
Revisiting Isotherm Analyses Using R: Comparison of Linear, Non-linear, and Bayesian Techniques
Extensive adsorption isotherm data exist for an array of chemicals of concern on a variety of engineered and natural sorbents. Several isotherm models exist that can accurately describe these data from which the resultant fitting parameters may subsequently be used in numerical ...
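As a concrete illustration of linear versus non-linear fitting, a Langmuir isotherm can be fitted both ways on synthetic data. The example is a sketch in Python with SciPy rather than R, and the data and parameter values are invented:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(C, qmax, KL):
    """Langmuir isotherm: sorbed amount q as a function of concentration C."""
    return qmax * KL * C / (1.0 + KL * C)

# Synthetic "adsorption data" with 3% multiplicative noise (illustrative only,
# true parameters qmax = 8.0, KL = 0.6).
rng = np.random.default_rng(7)
C = np.linspace(0.1, 10.0, 25)
q_obs = langmuir(C, 8.0, 0.6) * (1.0 + 0.03 * rng.normal(size=C.size))

# Non-linear least squares on the untransformed model.
(qmax_nl, KL_nl), _ = curve_fit(langmuir, C, q_obs, p0=[5.0, 1.0])

# Classical linearization: C/q = C/qmax + 1/(qmax*KL), fitted as a line.
slope, intercept = np.polyfit(C, C / q_obs, 1)
qmax_lin, KL_lin = 1.0 / slope, slope / intercept
```

The two routes weight the residuals differently, which is why comparisons of linear, non-linear, and Bayesian approaches, as in the title, are of practical interest.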
NASA Astrophysics Data System (ADS)
Gotz, M.; Karsch, L.; Pawelke, J.
2017-11-01
In order to describe the volume recombination in a pulsed radiation field of high dose-per-pulse this study presents a numerical solution of a 1D transport model of the liberated charges in a plane-parallel ionization chamber. In addition, measurements were performed on an Advanced Markus ionization chamber in a pulsed electron beam to obtain suitable data to test the calculation. The experiment used radiation pulses of 4 μs duration and variable dose-per-pulse values up to about 1 Gy, as well as pulses of variable duration up to 308 μs at constant dose-per-pulse values between 85 mGy and 400 mGy. Those experimental data were compared to the developed numerical model and existing descriptions of volume recombination. At low collection voltages the observed dose-per-pulse dependence of volume recombination can be approximated by the existing theory using effective parameters. However, at high collection voltages large discrepancies are observed. The developed numerical model shows much better agreement with the observations and is able to replicate the observed behavior over the entire range of dose-per-pulse values and collection voltages. Using the developed numerical model, the differences between observation and existing theory are shown to be the result of a large fraction of the charge being collected as free electrons and the resultant distortion of the electric field inside the chamber. Furthermore, the numerical solution is able to calculate recombination losses for arbitrary pulse durations in good agreement with the experimental data, an aspect not covered by current theory. Overall, the presented numerical solution of the charge transport model should provide a more flexible tool to describe volume recombination for high dose-per-pulse values as well as for arbitrary pulse durations and repetition rates.
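To illustrate the kind of calculation involved, here is a drastically simplified zero-dimensional sketch: no spatial transport, no field distortion, symmetric species, and made-up parameter values, so it is a toy version of the recombination problem rather than the paper's 1D transport model. Liberated charge recombines while being swept out with a single collection time constant; for this toy ODE the collected fraction has the Boag-type closed form ln(1+m)/m with m = alpha*n0*tau:

```python
# Zero-dimensional toy model of pulse recombination in an ionization chamber:
# equal densities of positive and negative charge recombine at rate alpha*n_p*n_m
# while being collected with time constant tau. All parameters are illustrative.
def collection_efficiency(n0, alpha=1e-10, tau=1e-4, dt=1e-7, t_end=1e-3):
    n_p = n_m = n0
    collected = 0.0
    for _ in range(int(t_end / dt)):       # explicit Euler stepping
        rec = alpha * n_p * n_m * dt       # volume recombination loss
        out_p = n_p * dt / tau             # charge swept to the electrode
        out_m = n_m * dt / tau
        collected += out_p
        n_p = max(n_p - rec - out_p, 0.0)
        n_m = max(n_m - rec - out_m, 0.0)
    return collected / n0

low = collection_efficiency(1e12)    # m = alpha*n0*tau = 0.01: little recombination
high = collection_efficiency(1e14)   # m = 1: substantial recombination
```

For this ODE the exact efficiency is ln(1+m)/m, so `high` should sit near ln(2) ≈ 0.693. The paper's point is precisely that such effective-parameter descriptions break down once free-electron collection and electric-field distortion matter, which is what its full 1D numerical model captures.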
Meta-synthesis of qualitative research: the challenges and opportunities.
Mohammed, Mohammed A; Moles, Rebekah J; Chen, Timothy F
2016-06-01
Synthesis of qualitative studies is an emerging area that has been gaining more interest as an important source of evidence for improving health care policy and practice. In the last decade there have been numerous attempts to develop methods of aggregating and synthesizing qualitative data. Although numerous empirical qualitative studies have been published about different aspects of health care research, to date, the aggregation and syntheses of these data has not been commonly reported, particularly in pharmacy practice related research. This paper describes different methods of conducting meta-synthesis and provides an overview of selected common methods. The paper also emphasizes the challenges and opportunities associated with conducting meta-synthesis and highlights the importance of meta-synthesis in informing practice, policy and research.
Challenging the paradigm of singularity excision in gravitational collapse.
Baiotti, Luca; Rezzolla, Luciano
2006-10-06
A paradigm deeply rooted in modern numerical relativity calculations prescribes the removal of those regions of the computational domain where a physical singularity may develop. We here challenge this paradigm by performing three-dimensional simulations of the collapse of uniformly rotating stars to black holes without excision. We show that this choice, combined with suitable gauge conditions and the use of minute numerical dissipation, improves dramatically the long-term stability of the evolutions. In turn, this allows for the calculation of the waveforms well beyond what was previously possible, providing information on the black-hole ringing and setting a new mark on the present knowledge of the gravitational-wave emission from the stellar collapse to a rotating black hole.
2007-02-01
and give advice, whether of the scientific or personal kind. She was sensitive to the stresses, challenges, and joys of graduate school, and always...lives and form a family! The addition of Morgen Peet mid-way during my studies was a gift and a challenge. While some would say it is easier to get...limited by numerous logistical and ethical challenges. Marine mammals are protected in the United States by the Endangered Species Act and the Marine
ERIC Educational Resources Information Center
Cepeda, Francisco Javier Delgado
2017-01-01
This work presents a proposed model in blended learning for a numerical methods course evolved from traditional teaching into a research lab in scientific visualization. The blended learning approach sets a differentiated and flexible scheme based on a mobile setup and face to face sessions centered on a net of research challenges. Model is…
On the Reconstruction of Palaeo-Ice Sheets: Recent Advances and Future Challenges
NASA Technical Reports Server (NTRS)
Stokes, Chris R.; Tarasov, Lev; Blomdin, Robin; Cronin, Thomas M.; Fisher, Timothy G.; Gyllencreutz, Richard; Hattestrand, Clas; Heyman, Jakob; Hindmarsh, Richard C. A.; Hughes, Anna L. C.; Jakobsson, Martin; Kirchner, Nina; Livingstone, Stephen J.; Margold, Martin; Murton, Julian B.; Noormets, Riko; Peltier, W. Richard; Peteet, Dorothy M.; Piper, David J. W.; Preusser, Frank; Renssen, Hans; Roberts, David H.; Roche, Didier M.; Saint-Ange, Francky; Stroeven, Arjen P.; Teller, James T.
2015-01-01
On the reconstruction of palaeo-ice sheets: Recent advances and future challenges
Stokes, Chris R.; Tarasov, Lev; Blomdin, Robin; Cronin, Thomas M.; Fisher, Timothy G.; Gyllencreutz, Richard; Hattestrand, Clas; Heyman, Jakob; Hindmarsh, Richard C. A.; Hughes, Anna L. C.; Jakobsson, Martin; Kirchner, Nina; Livingstone, Stephen J.; Margold, Martin; Murton, Julian B.; Noormets, Riko; Peltier, W. Richard; Peteet, Dorothy M.; Piper, David J. W.; Preusser, Frank; Renssen, Hans; Roberts, David H.; Roche, Didier M.; Saint-Ange, Francky; Stroeven, Arjen P.; Teller, James T.
2015-01-01
Reconstructing the growth and decay of palaeo-ice sheets is critical to understanding mechanisms of global climate change and associated sea-level fluctuations in the past, present and future. The significance of palaeo-ice sheets is further underlined by the broad range of disciplines concerned with reconstructing their behaviour, many of which have undergone a rapid expansion since the 1980s. In particular, there has been a major increase in the size and qualitative diversity of empirical data used to reconstruct and date ice sheets, and major improvements in our ability to simulate their dynamics in numerical ice sheet models. These developments have made it increasingly necessary to forge interdisciplinary links between sub-disciplines and to link numerical modelling with observations and dating of proxy records. The aim of this paper is to evaluate recent developments in the methods used to reconstruct ice sheets and outline some key challenges that remain, with an emphasis on how future work might integrate terrestrial and marine evidence together with numerical modelling. Our focus is on pan-ice sheet reconstructions of the last deglaciation, but regional case studies are used to illustrate methodological achievements, challenges and opportunities. Whilst various disciplines have made important progress in our understanding of ice-sheet dynamics, it is clear that data-model integration remains under-used, and that uncertainties remain poorly quantified in both empirically-based and numerical ice-sheet reconstructions. The representation of past climate will continue to be the largest source of uncertainty for numerical modelling. As such, palaeo-observations are critical to constrain and validate modelling. State-of-the-art numerical models will continue to improve both in model resolution and in the breadth of inclusion of relevant processes, thereby enabling more accurate and more direct comparison with the increasing range of palaeo-observations. 
Thus, the capability is developing to use all relevant palaeo-records to more strongly constrain deglacial (and to a lesser extent pre-LGM) ice sheet evolution. In working towards that goal, the accurate representation of uncertainties is required for both constraint data and model outputs. Close cooperation between modelling and data-gathering communities is essential to ensure this capability is realised and continues to progress.
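The closing call for accurate uncertainty representation on both constraint data and model outputs is commonly operationalised as an uncertainty-weighted model-data misfit. A minimal sketch of that idea follows; the function name, variables, and numbers are illustrative only, not taken from any particular ice-sheet model.

```python
def weighted_misfit(model, obs, sigma):
    """Reduced chi-square misfit between model output and observations,
    with each residual down-weighted by the observation's 1-sigma error."""
    if not (len(model) == len(obs) == len(sigma)):
        raise ValueError("inputs must have equal length")
    return sum(((m - o) / s) ** 2 for m, o, s in zip(model, obs, sigma)) / len(obs)

# e.g. modelled vs. dated ice-margin positions (km), with 1-sigma errors:
model = [120.0, 95.0, 60.0]
obs = [118.0, 100.0, 61.0]
sigma = [2.0, 5.0, 1.0]
print(weighted_misfit(model, obs, sigma))  # 1.0 -> consistent within error
```

A value near 1 indicates the model fits the observations to within their stated uncertainties, which is why honest error bars on both sides of the comparison matter.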
5-(2-aminopropyl)indole: a new player in the drama of 'legal highs' alerts the community.
Katselou, Maria; Papoutsis, Ioannis; Nikolaou, Panagiota; Spiliopoulou, Chara; Athanaselis, Sotiris
2015-01-01
5-(2-aminopropyl)indole (5-IT) is a new psychoactive substance, a 'legal high', that recently invaded the drug arena in Europe and has already led to numerous intoxications and fatalities. Knowledge of its pharmacology and toxicity is non-existent or restricted; the only available information comes from very few published scientific articles, official reports from the European Monitoring Centre for Drugs and Drug Addiction, and drug abusers' experiences expressed in online drug forums. A review of the existing knowledge on 5-IT is reported, concerning its chemistry and synthesis, its pharmacological and toxicological aspects, as well as information concerning the fatal and toxic consequences of its use. The existing methodologies for the determination of 5-IT in biological and seized samples, as well as its legal status, are also presented. All the relevant data were gathered through a detailed search of PubMed and the Internet. No original studies have investigated and/or confirmed its pharmacological properties, acute and chronic toxicity, physiological and behavioural effects or the dependence potential of the drug. Thus, it is difficult to specify the physical effects of 5-IT in humans. This drug is a phenomenon with global significance for public health as its use can lead to intoxication and fatalities. Significant information on 5-IT is provided for pharmacologists, toxicologists, forensic pathologists and regulatory authorities. 5-IT is a current public health challenge. Better international collaboration, effective legislation and continuous community alertness are needed to tackle this growing phenomenon. © 2014 Australasian Professional Society on Alcohol and other Drugs.
Challenges and Demands on Automated Software Revision
NASA Technical Reports Server (NTRS)
Bonakdarpour, Borzoo; Kulkarni, Sandeep S.
2008-01-01
In the past three decades, automated program verification has undoubtedly been one of the most successful contributions of formal methods to software development. However, when verification of a program against a logical specification discovers bugs in the program, manual manipulation of the program is needed in order to repair it. Thus, given the existence of numerous unverified and uncertified legacy software systems in virtually any organization, tools that enable engineers to automatically verify and subsequently fix existing programs are highly desirable. In addition, since requirements of software systems often evolve during the software life cycle, incomplete specification has become a customary fact in many design and development teams. Thus, automated techniques that revise existing programs according to new specifications are of great assistance to designers, developers, and maintenance engineers. As a result, incorporating program synthesis techniques, where an algorithm generates a program that is correct by construction, seems to be a necessity. The notion of manual program repair described above turns out to be even more complex when programs are integrated with large collections of sensors and actuators in hostile physical environments in so-called cyber-physical systems. When such systems are safety- or mission-critical (e.g., in avionics systems), it is essential that the system reacts to physical events such as faults, delays, signals, and attacks so that the system specification is not violated. In fact, since it is impossible to anticipate all such physical events at design time, it is highly desirable to have automated techniques that revise programs with respect to newly identified physical events according to the system specification.
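The revision problem described above can be caricatured as a search over candidate programs for one that satisfies the specification. A toy sketch of that idea follows; the specification, candidate list, and `repair` helper are invented for illustration, and real tools work symbolically rather than by brute-force enumeration.

```python
# Specification as input/output examples: ((a, b), expected maximum).
SPEC = [((3, 5), 5), ((7, 2), 7), ((4, 4), 4)]

# A tiny space of candidate revisions, including the buggy original.
CANDIDATES = [
    lambda a, b: a,                  # buggy original: ignores b
    lambda a, b: b,                  # another wrong guess
    lambda a, b: a if a > b else b,  # correct revision
]

def repair(candidates, spec):
    """Return the first candidate program satisfying every spec case."""
    for f in candidates:
        if all(f(*args) == want for args, want in spec):
            return f
    return None

fixed = repair(CANDIDATES, SPEC)
print(fixed(3, 5))  # 5
```

Even this toy makes the abstract's point concrete: once the specification is machine-checkable, revising the program becomes a search problem rather than a manual editing task.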
Stress estimation from borehole scans for prediction of excavation overbreak in brittle rock
NASA Astrophysics Data System (ADS)
LeRiche, Andrew Campbell
In the field of geomechanics, one of the most important considerations during design is the state of stress that exists at the location of a project. Despite this, stress is often the most poorly understood site characteristic, given the current challenges in accurately measuring it. This stems from the fact that stress cannot be measured directly but must be inferred by disturbing the rockmass and recording its response. Although methods do exist for estimating in situ stress, these provide only point estimates and are often plagued by uncertain results and practical limitations in the field. This research proposes a methodology for continuously estimating stress along a borehole through the back analysis of borehole breakout, and shows how the same approach could be employed to predict excavation overbreak. KGHM's Victoria Project in Sudbury, Canada, was the location of data collection, which firstly involved site characterization through common geotechnical core logging procedures and laboratory-scale intact core testing. Testing comprised Brazilian tensile strength and unconfined compressive strength tests, with crack accumulation characterized in both cases. From two pilot holes, acoustic televiewer surveys were completed to characterize the occurrence and geometry of breakout. This was done to predict the orientation of the major principal stresses in the horizontal plane, with the results further validated by the geometry of stress-induced core disking. From the laboratory material properties and breakout geometries, a continuum-based back analysis of breakout was performed through the creation of a generic database of stress-dependent numerical models. When compared with the in situ breakout profiles, this produced an estimate of stress as a function of depth along each hole. The effect of borehole fluid on the stress estimate was also considered; this provided the upper-bound estimate of stress from this methodology.
Given the generic nature of the numerical models, potential shaft overbreak was also assessed using this technique and from the previously described estimate of stress.
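The link from breakout geometry back to far-field stress rests on the classical Kirsch solution for the hoop stress at the wall of a circular borehole; breakout initiates where that stress exceeds the rock strength, 90 degrees from the maximum horizontal stress azimuth. A minimal sketch under that assumption follows (the function name and values are illustrative, not taken from the thesis).

```python
import math

def hoop_stress(sigma_H, sigma_h, theta_deg, p_fluid=0.0):
    """Kirsch hoop stress at the wall of a circular borehole (MPa).

    theta is measured from the azimuth of the maximum horizontal stress
    sigma_H; fluid pressure in the hole reduces the wall stress, which is
    why borehole fluid raised the upper-bound estimate in the study above."""
    theta = math.radians(theta_deg)
    return sigma_H + sigma_h - 2.0 * (sigma_H - sigma_h) * math.cos(2.0 * theta) - p_fluid

# Hoop stress peaks perpendicular to sigma_H (theta = 90 deg), where
# breakout forms, and is smallest parallel to it (theta = 0 deg):
print(hoop_stress(60.0, 30.0, 90.0))  # 3*60 - 30 = 150 (MPa)
print(hoop_stress(60.0, 30.0, 0.0))   # 3*30 - 60 = 30 (MPa)
```

Inverting this relation, the measured breakout width and orientation constrain sigma_H and sigma_h, which is the essence of the back-analysis approach described in the abstract.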
Seismic behavior of an Italian Renaissance Sanctuary: Damage assessment by numerical modelling
NASA Astrophysics Data System (ADS)
Clementi, Francesco; Nespeca, Andrea; Lenci, Stefano
2016-12-01
The paper deals with modelling and analysis of architectural heritage through the discussion of an illustrative case study: the Medieval Sanctuary of Sant'Agostino (Offida, Italy). Using the finite element technique, a 3D numerical model of the sanctuary is built, and then used to identify the main sources of the damages. The work shows that advanced numerical analyses could offer significant information for the understanding of the causes of existing damage and, more generally, on the seismic vulnerability.
Environmental cleanup: The challenge at the Hanford Site, Washington, USA
NASA Astrophysics Data System (ADS)
Gray, Robert H.; Becker, C. Dale
1993-07-01
Numerous challenges face those involved with developing a coordinated and consistent approach to cleaning up the US Department of Energy’s (DOE) Hanford Site in southeastern Washington. These challenges are much greater than those encountered when the site was selected and the world’s first nuclear complex was developed almost 50 years ago. This article reviews Hanford’s history, operations, waste storage/disposal activities, environmental monitoring, and today’s approach to characterizing and cleaning up Hanford under a Federal Facility Agreement and Consent Order, signed by DOE, the Environmental Protection Agency, and the Washington State Department of Ecology. Although cleanup of defense-related waste at Hanford holds many positive benefits, negative features include high costs to the US taxpayer, numerous uncertainties concerning the technologies to be employed and the risks involved, and the high probability that special interest groups and activists at large will never be completely satisfied. Issues concerning future use of the site, whether to protect and preserve its natural features or open it to public exploitation, remain to be resolved.
Numerical simulations of merging black holes for gravitational-wave astronomy
NASA Astrophysics Data System (ADS)
Lovelace, Geoffrey
2014-03-01
Gravitational waves from merging binary black holes (BBHs) are among the most promising sources for current and future gravitational-wave detectors. Accurate models of these waves are necessary to maximize the number of detections and our knowledge of the waves' sources; near the time of merger, the waves can only be computed using numerical-relativity simulations. For optimal application to gravitational-wave astronomy, BBH simulations must achieve sufficient accuracy and length, and all relevant regions of the BBH parameter space must be covered. While great progress toward these goals has been made in the almost nine years since BBH simulations became possible, considerable challenges remain. In this talk, I will discuss current efforts to meet these challenges, and I will present recent BBH simulations produced using the Spectral Einstein Code, including a catalog of publicly available gravitational waveforms [black-holes.org/waveforms]. I will also discuss simulations of merging black holes with high mass ratios and with spins nearly as fast as possible, the most challenging regions of the BBH parameter space.
Parallel Index and Query for Large Scale Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chou, Jerry; Wu, Kesheng; Ruebel, Oliver
2011-07-18
Modern scientific datasets present numerous data management and analysis challenges. State-of-the-art index and query technologies are critical for facilitating interactive exploration of large datasets, but numerous challenges remain in terms of designing a system for processing general scientific datasets. The system needs to be able to run on distributed multi-core platforms, efficiently utilize the underlying I/O infrastructure, and scale to massive datasets. We present FastQuery, a novel software framework that addresses these challenges. FastQuery utilizes a state-of-the-art index and query technology (FastBit) and is designed to process massive datasets on modern supercomputing platforms. We apply FastQuery to the processing of a massive 50TB dataset generated by a large-scale accelerator modeling code. We demonstrate the scalability of the tool to 11,520 cores. Motivated by the scientific need to search for interesting particles in this dataset, we use our framework to reduce search time from hours to tens of seconds.
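The FastBit technology that FastQuery builds on accelerates searches with bitmap indexes: precomputed bitmasks that answer range queries with bitwise operations instead of full scans. A minimal, hypothetical sketch of the bitmap-index idea in pure Python follows; this is not the FastQuery API, just an illustration of the principle.

```python
def build_bitmap_index(values):
    """One integer bitmask per distinct value: bit i set iff values[i] == v."""
    index = {}
    for i, v in enumerate(values):
        index[v] = index.get(v, 0) | (1 << i)
    return index

def range_query(index, lo, hi):
    """Row positions with lo <= value <= hi, via bitwise OR of bitmasks."""
    mask = 0
    for v, bits in index.items():
        if lo <= v <= hi:
            mask |= bits
    return [i for i in range(mask.bit_length()) if (mask >> i) & 1]

data = [5, 12, 7, 30, 12, 3]   # e.g. one particle attribute per row
idx = build_bitmap_index(data)
print(range_query(idx, 5, 12))  # rows with values in [5, 12]: [0, 1, 2, 4]
```

Production systems like FastBit compress these bitmaps and bin continuous values, but the query path is the same: combine a few precomputed bitmaps rather than touch every record, which is what turns hour-long scans into seconds.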
Modeling the Effects of Turbulence in Rotating Detonation Engines
NASA Astrophysics Data System (ADS)
Towery, Colin; Smith, Katherine; Hamlington, Peter; van Schoor, Marthinus; TESLa Team; Midé Team
2014-03-01
Propulsion systems based on detonation waves, such as rotating and pulsed detonation engines, have the potential to substantially improve the efficiency and power density of gas turbine engines. Numerous technical challenges remain to be solved in such systems, however, including obtaining more efficient injection and mixing of air and fuels, more reliable detonation initiation, and better understanding of the flow in the ejection nozzle. These challenges can be addressed using numerical simulations. Such simulations are enormously challenging, however, since accurate descriptions of highly unsteady turbulent flow fields are required in the presence of combustion, shock waves, fluid-structure interactions, and other complex physical processes. In this study, we performed high-fidelity three dimensional simulations of a rotating detonation engine and examined turbulent flow effects on the operation, performance, and efficiency of the engine. Along with experimental data, these simulations were used to test the accuracy of commonly-used Reynolds averaged and subgrid-scale turbulence models when applied to detonation engines. The authors gratefully acknowledge the support of the Defense Advanced Research Projects Agency (DARPA).
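Of the subgrid-scale closures the abstract alludes to, the Smagorinsky eddy-viscosity model is the classic example: the unresolved turbulent mixing is modelled as an extra viscosity proportional to the filter width squared and the resolved strain-rate magnitude. A one-function sketch follows; the coefficient and inputs are illustrative, not from the cited simulations.

```python
def smagorinsky_viscosity(strain_rate_mag, delta, c_s=0.17):
    """Smagorinsky subgrid-scale eddy viscosity: nu_t = (C_s * delta)^2 * |S|.

    strain_rate_mag -- magnitude of the resolved strain-rate tensor (1/s)
    delta           -- grid filter width (m)
    c_s             -- Smagorinsky coefficient (a common literature value)
    """
    return (c_s * delta) ** 2 * strain_rate_mag

# e.g. |S| = 100 1/s on a 1 cm grid:
print(smagorinsky_viscosity(100.0, 0.01))  # 2.89e-4 m^2/s
```

Testing whether such simple closures remain valid in the presence of detonation waves and strong shocks is exactly the kind of model-assessment question the simulations above address.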
Challenging High-Ability Students
ERIC Educational Resources Information Center
Scager, Karin; Akkerman, Sanne F.; Pilot, Albert; Wubbels, Theo
2014-01-01
The existing literature on indicators of an optimal learning environment for high-ability students frequently discusses the concept of challenge. It is, however, not clear what, precisely, constitutes appropriate challenge for these students. In this study, the authors examined an undergraduate honours course, Advanced Cell Biology, which has…
Reflections on Graduate Student PBL Experiences
ERIC Educational Resources Information Center
McDonald, Betty
2008-01-01
The study designed to contribute to existing research on Problem-Based Learning (PBL) chose a focus group comprising 16 MSc. Petroleum Engineering students (six females). Using PBL as the method of instruction, students examined a real-life petroleum engineering problem that highlighted numerous areas of their existing curriculum. They worked in…
This paper employs analytical and numerical general equilibrium models to examine the significance of pre-existing factor taxes for the costs of pollution reduction under a wide range of environmental policy instruments. Pre-existing taxes imply significantly ...
USDA-ARS?s Scientific Manuscript database
Although numerous research studies exist in the literature on the greenhouse gas emission and groundwater pollution potentials of soils amended with plant-based biochar made by traditional dry pyrolysis (hereafter referred to as pyrochar), very few such studies exist for hydrochar made from hydro...
Unresolved Issues and New Challenges in Teaching English to Young Learners: The Case of South Korea
ERIC Educational Resources Information Center
Garton, Sue
2014-01-01
The introduction of languages, especially English, into the primary curriculum around the world has been one of the major language-in-education policy developments in recent years. In countries where English has been compulsory for a number of years, the question arises as to what extent the numerous and well-documented challenges faced by the…
ERIC Educational Resources Information Center
Riddick, Francine Piscitelli
2009-01-01
Large school districts face a number of challenges due to their sheer size. One of these challenges involves staffing the role of the principal. With Baby Boomers reaching retirement age, large school districts, especially those experiencing growth, have to fill numerous leadership positions. In order to fill these positions efficiently and…
Preparing the Teacher to Meet the Challenges of a Changing World
ERIC Educational Resources Information Center
Okogbaa, Veronica
2017-01-01
In this 21st century, there have been changes in almost all aspects of human endeavour. This has created numerous challenges which need to be tackled by the current educational systems of all nations. If robust plans are not put in place to educate the upcoming generation to effectively function and develop their societies, continuity of the…
ERIC Educational Resources Information Center
Dove, Laura R.; Bryant, Natalie P.
2016-01-01
The purpose of this article is to outline the unique challenges faced by international students enrolled in business law or legal environment of business courses. It is also imperative to recognize the numerous opportunities that instructors can create in business law classrooms that will enhance the experience of all students given the…
Ambulatory training in neurology education.
Lukas, Rimas V; Blood, Angela D; Brorson, James R; Albert, Dara V F
2017-01-15
Much of the care provided by practicing neurologists takes place in outpatient clinics. However, neurology trainees often have limited exposure to this setting. Adequate incorporation of outpatient care in neurology training is vital; however, it is often hampered by numerous challenges. We detail a number of these challenges and suggest potential means for improvement. Copyright © 2016 Elsevier B.V. All rights reserved.
The Impact of Bullying on School Performance in Six Selected Schools in South Carolina
ERIC Educational Resources Information Center
Cooper, Stephanie A.
2011-01-01
The nation's K-12 schools are faced with numerous critical challenges, such as elevating academic achievement and meeting No Child Left Behind state standards (Kowalski et al., 2008). But bullying in schools is becoming one of the most challenging issues that school personnel are encountering. In a Stanford University study, it was revealed that…
Numerical Assessment of Rockbursting.
1987-05-27
static equilibrium, nonlinear elasticity, strain-softening material, unstable propagation of pre-existing cracks, and finally surface... structure of LINOS, which is common to most of the large finite element codes, the library of element and material subroutines can be easily expanded... material model subroutines are tested by comparing finite element results with analytical or numerical results derived for hypo-elastic and
NASA Astrophysics Data System (ADS)
Holt, Jason; Icarus Allen, J.; Anderson, Thomas R.; Brewin, Robert; Butenschön, Momme; Harle, James; Huse, Geir; Lehodey, Patrick; Lindemann, Christian; Memery, Laurent; Salihoglu, Baris; Senina, Inna; Yool, Andrew
2014-12-01
It has long been recognised that there are strong interactions and feedbacks between climate, upper ocean biogeochemistry and marine food webs, and also that food web structure and phytoplankton community distribution are important determinants of variability in carbon production and export from the euphotic zone. Numerical models provide a vital tool to explore these interactions, given their capability to investigate multiple connected components of the system and the sensitivity to multiple drivers, including potential future conditions. A major driver for ecosystem model development is the demand for quantitative tools to support ecosystem-based management initiatives. The purpose of this paper is to review approaches to the modelling of marine ecosystems with a focus on the North Atlantic Ocean and its adjacent shelf seas, and to highlight the challenges they face and suggest ways forward. We consider the state of the art in simulating oceans and shelf sea physics, planktonic and higher trophic level ecosystems, and look towards building an integrative approach with these existing tools. We note how the different approaches have evolved historically and that many of the previous obstacles to harmonisation may no longer be present. We illustrate this with examples from the on-going and planned modelling effort in the Integrative Modelling Work Package of the EURO-BASIN programme.
O'Flaherty, Jacqueline A; Laws, Thomas A
2014-11-01
Face-to-face communication with students remains the gold standard in teaching; the effectiveness of this approach to learning is commonly and regularly assessed by students' evaluation of teaching and peer reviews of teaching. Critics note that increases in on-line education are driven more by economic forces than consistent evidence to show their long-term effectiveness or acceptance by students. Numerous studies report that students in higher education found their external studies comparatively more challenging than face-to-face delivery. Identifying how educators might best provide sufficient and effective personal support for students studying in the external mode continues to challenge educators. Opportunities do exist for blending on-line course work with synchronous interactions between students and their teachers but evaluations of these innovations rarely appear in the literature. In this study, a web-based virtual classroom simulated the synchronous face-to-face discussions that occur between Bachelor of Nursing students and tutors. First year students enrolled externally in a biological science course interacted in a virtual classroom for 13 weeks completing an 'evaluation of experience' survey following their final assessment. A comparison was made between 'on-campus' and 'external to campus' students to determine the relationship between i) overall satisfaction with the course and ii) final grades, as well as their experience of the virtual class. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.
Collecting behavioural data using the world wide web: considerations for researchers
Rhodes, S; Bowie, D; Hergenrather, K
2003-01-01
Objective: To identify and describe advantages, challenges, and ethical considerations of web based behavioural data collection. Methods: This discussion is based on the authors' experiences in survey development and study design, respondent recruitment, and internet research, and on the experiences of others as found in the literature. Results: The advantages of using the world wide web to collect behavioural data include rapid access to numerous potential respondents and previously hidden populations, respondent openness and full participation, opportunities for student research, and reduced research costs. Challenges identified include issues related to sampling and sample representativeness, competition for the attention of respondents, and potential limitations resulting from the much cited "digital divide", literacy, and disability. Ethical considerations include anonymity and privacy, providing and substantiating informed consent, and potential risks of malfeasance. Conclusions: Computer mediated communications, including electronic mail, the world wide web, and interactive programs will play an ever increasing part in the future of behavioural science research. Justifiable concerns regarding the use of the world wide web in research exist, but as access to, and use of, the internet becomes more widely and representatively distributed globally, the world wide web will become more applicable. In fact, the world wide web may be the only research tool able to reach some previously hidden population subgroups. Furthermore, many of the criticisms of online data collection are common to other survey research methodologies. PMID:12490652
Jones, Deborah J.
2013-01-01
Treatment outcome research with children and adolescents has progressed to such an extent that numerous handbooks have been devoted to reviewing and summarizing the evidence base. Ensuring that consumers of these advancements in state-of-the-field interventions have the opportunity to access, engage in, and benefit from this evidence base, however, has been fraught with challenge. As such, much discussion exists about innovative strategies for overcoming the gap between research and practice; yet no other potential solution has received more attention, in both the popular and academic press, than technology. The promise of technology is not surprising given the fast-paced evolution in development and, in turn, a seemingly endless range of possibilities for novel service delivery platforms. Yet this is precisely the most formidable challenge threatening to upset the very promise of this potential solution: the rate of emerging technologies is far outpacing the field's capacity to demonstrate the conceptual or empirical benefits of such an approach. Accordingly, this paper aims to provide a series of recommendations that better situate empirical enquiry at the core of a collaborative development, testing, and deployment process that must define this line of work if the promise of mental health technologies is going to be a reality for front-line clinicians and the clients they serve. PMID:24400723
Can beaches survive climate change?
Vitousek, Sean; Barnard, Patrick L.; Limber, Patrick W.
2017-01-01
Anthropogenic climate change is driving sea level rise, leading to numerous impacts on the coastal zone, such as increased coastal flooding, beach erosion, cliff failure, saltwater intrusion in aquifers, and groundwater inundation. Many beaches around the world are currently experiencing chronic erosion as a result of gradual, present-day rates of sea level rise (about 3 mm/year) and human-driven restrictions in sand supply (e.g., harbor dredging and river damming). Accelerated sea level rise threatens to worsen coastal erosion and challenge the very existence of natural beaches throughout the world. Understanding and predicting the rates of sea level rise and coastal erosion depends on integrating data on natural systems with computer simulations. Although many computer modeling approaches are available to simulate shoreline change, few are capable of making reliable long-term predictions needed for full adaptation or to enhance resilience. Recent advancements have allowed convincing decadal- to centennial-scale predictions of shoreline evolution. For example, along 500 km of the Southern California coast, a new model featuring data assimilation predicts that up to 67% of beaches may completely erode by 2100 without large-scale human interventions. In spite of recent advancements, coastal evolution models must continue to improve in their theoretical framework, quantification of accuracy and uncertainty, computational efficiency, predictive capability, and integration with observed data, in order to meet the scientific and engineering challenges produced by a changing climate.
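A first-order link between sea-level rise and shoreline retreat of the kind discussed above is often made with the Bruun rule, which translates a rise S into a landward retreat R = S * L / (h + B) over an active beach profile. A minimal sketch follows; the parameter values are illustrative only, and modern models such as the data-assimilating one cited go well beyond this rule.

```python
def bruun_retreat(sea_level_rise, profile_width, closure_depth, berm_height):
    """Bruun-rule shoreline retreat R = S * L / (h + B).

    sea_level_rise -- rise S (m)
    profile_width  -- active profile width L, shore to closure depth (m)
    closure_depth  -- depth of closure h (m)
    berm_height    -- berm/dune height B above sea level (m)
    """
    return sea_level_rise * profile_width / (closure_depth + berm_height)

# e.g. 0.5 m of rise over a 500 m active profile with h = 8 m, B = 2 m:
print(bruun_retreat(0.5, 500.0, 8.0, 2.0))  # 25.0 m of retreat
```

The rule's amplification factor (L / (h + B), here 50) is why centimetres of sea-level rise can translate into metres of shoreline retreat, even before sand-supply restrictions are considered.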
Collecting behavioural data using the world wide web: considerations for researchers.
Rhodes, S D; Bowie, D A; Hergenrather, K C
2003-01-01
To identify and describe advantages, challenges, and ethical considerations of web based behavioural data collection. This discussion is based on the authors' experiences in survey development and study design, respondent recruitment, and internet research, and on the experiences of others as found in the literature. The advantages of using the world wide web to collect behavioural data include rapid access to numerous potential respondents and previously hidden populations, respondent openness and full participation, opportunities for student research, and reduced research costs. Challenges identified include issues related to sampling and sample representativeness, competition for the attention of respondents, and potential limitations resulting from the much cited "digital divide", literacy, and disability. Ethical considerations include anonymity and privacy, providing and substantiating informed consent, and potential risks of malfeasance. Computer mediated communications, including electronic mail, the world wide web, and interactive programs will play an ever increasing part in the future of behavioural science research. Justifiable concerns regarding the use of the world wide web in research exist, but as access to, and use of, the internet becomes more widely and representatively distributed globally, the world wide web will become more applicable. In fact, the world wide web may be the only research tool able to reach some previously hidden population subgroups. Furthermore, many of the criticisms of online data collection are common to other survey research methodologies.
Peyrard, N; Dieckmann, U; Franc, A
2008-05-01
Models of infectious diseases are characterized by a phase transition between extinction and persistence. A challenge in contemporary epidemiology is to understand how the geometry of a host's interaction network influences disease dynamics close to the critical point of such a transition. Here we address this challenge with the help of moment closures. Traditional moment closures, however, do not provide satisfactory predictions close to such critical points. We therefore introduce a new method for incorporating longer-range correlations into existing closures. Our method is technically simple, remains computationally tractable and significantly improves the approximation's performance. Our extended closures thus provide an innovative tool for quantifying the influence of interaction networks on spatially or socially structured disease dynamics. In particular, we examine the effects of a network's clustering coefficient, as well as of new geometrical measures, such as a network's square clustering coefficients. We compare the relative performance of different closures from the literature, with or without our long-range extension. In this way, we demonstrate that the normalized version of the Bethe approximation-extended to incorporate long-range correlations according to our method-is an especially good candidate for studying influences of network structure. Our numerical results highlight the importance of the clustering coefficient and the square clustering coefficient for predicting disease dynamics at low and intermediate values of transmission rate, and demonstrate the significance of path redundancy for disease persistence.
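The clustering coefficient that the study highlights can be computed directly from an adjacency structure: it is the fraction of a node's neighbour pairs that are themselves connected, i.e. the local density of triangles. A minimal sketch follows; the example graph is invented for illustration.

```python
def clustering_coefficient(adj, node):
    """Local clustering coefficient of `node` in an undirected graph.

    adj maps each node to the set of its neighbours; the coefficient is
    the number of linked neighbour pairs over all possible pairs."""
    nbrs = sorted(adj[node])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(
        1
        for i, u in enumerate(nbrs)
        for w in nbrs[i + 1:]
        if w in adj[u]
    )
    return 2.0 * links / (k * (k - 1))

# A triangle a-b-c with a pendant node d attached to a:
adj = {"a": {"b", "c", "d"}, "b": {"a", "c"}, "c": {"a", "b"}, "d": {"a"}}
print(clustering_coefficient(adj, "a"))  # 1/3: only the b-c pair is linked
```

High clustering means many short redundant transmission paths around each host, which is one reason this geometric quantity shapes epidemic persistence near the critical point.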
2014-03-01
...facility, personnel, and technology needs exist. Consolidation leads to more focused, institutionalized quality control and service improvement...and the numerous variables that exist between 9-1-1 center budgets. Further research is needed to accurately quantify these pre- and post
Educating Transformational Leaders in Mexico at Universidad De Monterrey
ERIC Educational Resources Information Center
Cantón, Alicia
2016-01-01
Mexico faces numerous social, economic, and political challenges. Higher education institutions provide opportunity for change by educating socially responsible leaders to become civically engaged citizens.
Gloyd, Stephen; Wagenaar, Bradley H; Woelk, Godfrey B; Kalibala, Samuel
2016-01-01
HIV programme data from routine health information systems (RHIS) and personal health information (PHI) provide ample opportunities for secondary data analysis. However, these data present unique opportunities and challenges for use in health system monitoring, along with process and impact evaluations. Analyses focused on retrospective case reviews of four of the HIV-related studies published in this JIAS supplement. We identify specific opportunities and challenges with respect to the secondary analysis of RHIS and PHI data. Challenges in working with both HIV-related RHIS and PHI included missing, inconsistent and implausible data; rapidly changing indicators; systematic differences in the utilization of services; and linking patients over time and across different data sources. Specific challenges among RHIS data included numerous registries and indicators, inconsistent data entry, gaps in data transmission, duplicate registry of information, numerator-denominator incompatibility and infrequent use of data for decision-making. Challenges specific to PHI included the time burden for busy providers, a culture of lax charting, overflowing archives of paper charts and infrequent chart review. Many of the challenges that undermine effective use of RHIS and PHI data for analyses relate to the processes and context of data collection, excessive data requirements, limited knowledge of the purpose of the data and limited use of the data by those who generate it. Recommendations include simplifying data sources, analysis and reporting; conducting systematic data quality audits; enhancing the use of data for decision-making; promoting routine chart review linked with simple patient tracking systems; and encouraging open access to RHIS and PHI data for increased use.
2011-11-30
OH: South-Western Cengage Learning. Mankiw, N. G. (2006). Principles of economics (4th ed.). Mason, OH: Thompson South-Western. Private...When the choice to in-source or outsource an installation function or service requirement exists, in these challenging economic times, it is now more...decision uncertainties.
Boundary enhanced effects on the existence of quadratic solitons
NASA Astrophysics Data System (ADS)
Chen, Manna; Zhang, Ting; Li, Wenjie; Lu, Daquan; Guo, Qi; Hu, Wei
2018-05-01
We investigate, both analytically and numerically, the boundary enhanced effects exerted on quadratic solitons consisting of fundamental waves and oscillatory second harmonics in the presence of boundary conditions. The nonlocal analogy predicts that the soliton for the fundamental wave is supported by the balance between equivalent nonlinear confinement and diffraction (or dispersion). Under Snyder and Mitchell's strongly nonlocal approximation, we obtain analytical soliton solutions both with and without boundary conditions to show the impact of the boundaries. We can explicitly distinguish the nonlinear confinement due to the second-harmonic mutual interaction from the enhanced effects caused by remote boundaries. These boundary enhanced effects on the existence of solitons can be positive or negative, depending on both the sample size and the nonlocal parameter. The piecewise existence regime of the solitons can be explained analytically. The analytical soliton solutions are verified against the numerical ones, and the discrepancy between them is also discussed.
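For reference, the strongly nonlocal limit invoked above reduces the nonlocal nonlinear Schrödinger equation to a linear-oscillator form. This is the textbook Snyder-Mitchell reduction, written here without the boundary corrections that are the subject of the paper:

```latex
% Nonlocal NLS for the field envelope \psi with nonlocal response R:
i\frac{\partial \psi}{\partial z}
  + \frac{1}{2}\nabla_{\perp}^{2}\psi
  + \psi \int R(\mathbf{r}-\mathbf{r}')\,\lvert\psi(\mathbf{r}',z)\rvert^{2}\,
    \mathrm{d}\mathbf{r}' = 0 .

% Strongly nonlocal limit (R much wider than the beam): expanding R about
% r = 0 yields the Snyder--Mitchell ``accessible soliton'' model, with
% P_0 the beam power and \gamma set by the curvature of R:
i\frac{\partial \psi}{\partial z}
  + \frac{1}{2}\nabla_{\perp}^{2}\psi
  - \frac{1}{2}\gamma^{2}P_{0}\,r^{2}\,\psi = 0 .
```

In this limit the medium acts as a parabolic waveguide, which is why closed-form (Gaussian-type) soliton solutions become available; the boundary terms studied in the abstract perturb this effective potential.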
Integration of existing systematic reviews into new reviews: identification of guidance needs
2014-01-01
Background An exponential increase in the number of systematic reviews published, and constrained resources for new reviews, means that there is an urgent need for guidance on explicitly and transparently integrating existing reviews into new systematic reviews. The objectives of this paper are: 1) to identify areas where existing guidance may be adopted or adapted, and 2) to suggest areas for future guidance development. Methods We searched documents and websites from healthcare focused systematic review organizations to identify and, where available, to summarize relevant guidance on the use of existing systematic reviews. We conducted informational interviews with members of Evidence-based Practice Centers (EPCs) to gather experiences in integrating existing systematic reviews, including common issues and challenges, as well as potential solutions. Results There was consensus among systematic review organizations and the EPCs about some aspects of incorporating existing systematic reviews into new reviews. Current guidance may be used in assessing the relevance of prior reviews and in scanning references of prior reviews to identify studies for a new review. However, areas of challenge remain. Areas in need of guidance include how to synthesize, grade the strength of, and present bodies of evidence composed of primary studies and existing systematic reviews. For instance, empiric evidence is needed regarding how to quality check data abstraction and when and how to use study-level risk of bias assessments from prior reviews. Conclusions There remain areas of uncertainty for how to integrate existing systematic reviews into new reviews. Methods research and consensus processes among systematic review organizations are needed to develop guidance to address these challenges. PMID:24956937
NASA Astrophysics Data System (ADS)
Suryanto, Agus; Darti, Isnani
2017-12-01
In this paper we discuss a fractional order predator-prey model with ratio-dependent functional response. The dynamical properties of this model are analyzed. Here we determine all equilibrium points of this model, including their existence conditions and their stability properties. It is found that the model has two types of equilibria, namely the predator-free point and the co-existence point. If there is no co-existence equilibrium, i.e. when the coefficient of conversion from the functional response into the growth rate of predator is less than the death rate of predator, then the predator-free point is asymptotically stable. On the other hand, if the co-existence point exists then this equilibrium is conditionally stable. We also construct a nonstandard Grünwald-Letnikov (NSGL) numerical scheme for the proposed model. This scheme is a combination of the Grünwald-Letnikov approximation and the nonstandard finite difference scheme. This scheme is implemented in MATLAB and used to perform some simulations. It is shown that our numerical solutions are consistent with the dynamical properties of our fractional predator-prey model.
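As a concrete illustration of the Grünwald-Letnikov half of such a scheme (a plain explicit GL step, not the authors' nonstandard finite-difference variant), the following sketch integrates a ratio-dependent predator-prey system of the general form described above; the specific model functions and all parameter values are illustrative assumptions.

```python
import numpy as np

def gl_coeffs(alpha, n):
    # Grunwald-Letnikov binomial weights c_j = (-1)^j * C(alpha, j),
    # computed via the recursion c_j = c_{j-1} * (1 - (1 + alpha)/j).
    c = np.empty(n + 1)
    c[0] = 1.0
    for j in range(1, n + 1):
        c[j] = c[j - 1] * (1.0 - (1.0 + alpha) / j)
    return c

def simulate(alpha=0.9, h=0.01, steps=2000, x0=0.8, y0=0.2,
             a=1.0, b=0.5, d=0.3):
    # Explicit GL scheme for a ratio-dependent predator-prey system:
    #   D^alpha x = x(1 - x) - a x y / (x + y)   (prey)
    #   D^alpha y = y(-d + b x / (x + y))        (predator)
    # Co-existence requires the conversion coefficient b to exceed the
    # predator death rate d, matching the condition in the abstract.
    c = gl_coeffs(alpha, steps)
    x = np.empty(steps + 1)
    y = np.empty(steps + 1)
    x[0], y[0] = x0, y0
    ha = h ** alpha
    for n in range(1, steps + 1):
        xp, yp = x[n - 1], y[n - 1]
        fx = xp * (1.0 - xp) - a * xp * yp / (xp + yp)
        fy = yp * (-d + b * xp / (xp + yp))
        # Memory terms: sum_{j=1}^{n} c_j * state_{n-j}
        mx = np.dot(c[1:n + 1], x[n - 1::-1])
        my = np.dot(c[1:n + 1], y[n - 1::-1])
        x[n] = ha * fx - mx
        y[n] = ha * fy - my
    return x, y
```

For alpha = 1 the weights reduce to c_1 = -1 and c_j = 0 for j >= 2, recovering the forward Euler step; for alpha < 1 the full history enters through the memory sums, which is the hallmark of the fractional derivative.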
Unsteady Flow Simulation: A Numerical Challenge
2003-03-01
drive the numerical unsteady term to convergence. The time marching procedure is based on the approximate implicit Newton method for systems of non...computed through analytical derivatives of S. The linear system stemming from equation (3) is solved at each integration step by the same iterative method...significant reduction of memory usage, thanks to the reduced dimensions of the linear system matrix during the implicit marching of the solution. The
ERIC Educational Resources Information Center
Wanjagi, James K.
2013-01-01
Increasingly, organizations are conducting more Enterprise Resource Planning (ERP) projects in order to promote organizational efficiencies. Meanwhile, minimal research has been conducted on the leadership challenges faced by project managers during the ERP project implementations and how these challenges are managed. The existing project…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reckinger, Scott James; Livescu, Daniel; Vasilyev, Oleg V.
A comprehensive numerical methodology has been developed that handles the challenges introduced by considering the compressible nature of Rayleigh-Taylor instability (RTI) systems, which include sharp interfacial density gradients on strongly stratified background states, acoustic wave generation and removal at computational boundaries, and stratification-dependent vorticity production. The computational framework is used to simulate two-dimensional single-mode RTI to extreme late-times for a wide range of flow compressibility and variable density effects. The results show that flow compressibility acts to reduce the growth of RTI for low Atwood numbers, as predicted from linear stability analysis.
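For context, the incompressible baseline against which such compressible results are usually compared can be sketched in a few lines. This is the classical inviscid dispersion relation, not the stratified compressible analysis of the paper:

```python
import math

def atwood_number(rho_heavy, rho_light):
    # Atwood number A = (rho2 - rho1) / (rho2 + rho1),
    # the density contrast parameter used in RTI studies.
    return (rho_heavy - rho_light) / (rho_heavy + rho_light)

def rti_growth_rate(atwood, g, k):
    # Classical incompressible, inviscid RTI linear growth rate:
    #   sigma = sqrt(A * g * k)
    # for gravity g and perturbation wavenumber k. Compressibility and
    # background stratification (the subject of the study) modify this
    # rate; only the textbook limit is kept here.
    return math.sqrt(atwood * g * k)
```

The abstract's finding that compressibility reduces growth at low Atwood number means the measured sigma falls below this sqrt(A g k) baseline in that regime.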
The numerical methods for the development of the mixture region in the vapor explosion simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Y.; Ohashi, H.; Akiyama, M.
A challenging attempt is underway to numerically simulate the vapor explosion process with a general multi-component, multi-dimensional code. Because of the rapid change of the flow field and the extremely nonuniform distribution of components in a vapor explosion, numerical divergence and diffusion occur easily. A dispersed component model and a multiregion scheme, by which these difficulties can be effectively overcome, were proposed. Simulations have been performed for the premixing and fragmentation-propagation processes of the vapor explosion.
Dietary biomarkers: advances, limitations and future directions
2012-01-01
The subjective nature of self-reported dietary intake assessment methods presents numerous challenges to obtaining accurate estimates of dietary intake and nutritional status. This limitation can be overcome by the use of dietary biomarkers, which are able to objectively assess dietary consumption (or exposure) without the bias of self-reported dietary intake errors. The need for dietary biomarkers was addressed by the Institute of Medicine, which recognized the lack of nutritional biomarkers as a knowledge gap requiring future research. The purpose of this article is to review existing literature on currently available dietary biomarkers, including novel biomarkers of specific foods and dietary components, and assess the validity, reliability and sensitivity of the markers. This review revealed several biomarkers in need of additional validation research; research is also needed to produce sensitive, specific, cost-effective and noninvasive dietary biomarkers. The emerging field of metabolomics may help to advance the development of food/nutrient biomarkers, yet advances in food metabolome databases are needed. The availability of biomarkers that estimate intake of specific foods and dietary components could greatly enhance nutritional research targeting compliance to national recommendations as well as direct associations with disease outcomes. More research is necessary to refine existing biomarkers by accounting for confounding factors, to establish new indicators of specific food intake, and to develop techniques that are cost-effective, noninvasive, rapid and accurate measures of nutritional status. PMID:23237668
Williams, Mark R; McKeown, Andrew; Dexter, Franklin; Miner, James R; Sessler, Daniel I; Vargo, John; Turk, Dennis C; Dworkin, Robert H
2016-01-01
Successful procedural sedation represents a spectrum of patient- and clinician-related goals. The absence of a gold-standard measure of the efficacy of procedural sedation has led to a variety of outcomes being used in clinical trials, with the consequent lack of consistency among measures, making comparisons among trials and meta-analyses challenging. We evaluated which existing measures have undergone psychometric analysis in a procedural sedation setting and whether the validity of any of these measures support their use across the range of procedures for which sedation is indicated. Numerous measures were found to have been used in clinical research on procedural sedation across a wide range of procedures. However, reliability and validity have been evaluated for only a limited number of sedation scales, observer-rated pain/discomfort scales, and satisfaction measures in only a few categories of procedures. Typically, studies only examined 1 or 2 aspects of scale validity. The results are likely unique to the specific clinical settings they were tested in. Certain scales, for example, those requiring motor stimulation, are unsuitable to evaluate sedation for procedures where movement is prohibited (e.g., magnetic resonance imaging scans). Further work is required to evaluate existing measures for procedures for which they were not developed. Depending on the outcomes of these efforts, it might ultimately be necessary to consider measures of sedation efficacy to be procedure specific.
An adult ureterocele complicated by a large stone: A case report.
Atta, Omar N; Alhawari, Hussein H; Murshidi, Muayyad M; Tarawneh, Emad; Murshidi, Mujalli M
2018-01-01
Ureterocele is a cystic dilatation of the lower part of the ureter. It is a congenital anomaly that is associated with other anomalies, such as a duplicated system, and other diseases. It poses a great challenge owing to its numerous types and clinical presentations. Its incidence is 1 in every 4000 individuals. One of its presentations in the adult population is the presence of a stone, usually a solitary stone, inside the ureterocele. We are reporting a case of an adult ureterocele complicated by a large calculus, managed endoscopically with transurethral deroofing of the ureterocele followed by cystolitholapaxy. A literature review was also conducted. The pathogenesis of ureteroceles is not well understood; however, many proposed mechanisms exist, with incomplete dissolution of the Chwalla membrane being the most accepted one. The type of ureterocele and age at presentation will help guide the appropriate investigation and management; nevertheless, certain goals of treatment should apply to all cases. Adult ureterocele is usually clinically silent, but it may co-exist with other conditions such as a ureteral calculus, and in these conditions it can be managed endoscopically. Ureteroceles complicated by stones can be effectively managed with endoscopic resection or incision of the ureterocele coupled with stone removal; however, long-term follow-up is required to monitor for hydronephrosis and iatrogenic vesicoureteric reflux. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
Risk Classification with an Adaptive Naive Bayes Kernel Machine Model.
Minnier, Jessica; Yuan, Ming; Liu, Jun S; Cai, Tianxi
2015-04-22
Genetic studies of complex traits have uncovered only a small number of risk markers explaining a small fraction of heritability and adding little improvement to disease risk prediction. Standard single-marker methods may lack power in selecting informative markers or estimating effects. Most existing methods also typically do not account for non-linearity. Identifying markers with weak signals and estimating their joint effects among many non-informative markers remains challenging. One potential approach is to group markers based on biological knowledge such as gene structure. If markers in a group tend to have similar effects, proper usage of the group structure could improve power and efficiency in estimation. We propose a two-stage method relating markers to disease risk by taking advantage of known gene-set structures. Imposing a naive Bayes kernel machine (KM) model, we estimate gene-set specific risk models that relate each gene-set to the outcome in stage I. The KM framework efficiently models potentially non-linear effects of predictors without requiring explicit specification of functional forms. In stage II, we aggregate information across gene-sets via a regularization procedure. Estimation and computational efficiency are further improved with kernel principal component analysis. Asymptotic results for model estimation and gene set selection are derived, and numerical studies suggest that the proposed procedure could outperform existing procedures for constructing genetic risk models.
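The kernel principal component analysis step used to speed up estimation is a generic building block that can be sketched in NumPy. The following is a minimal illustration of that step alone, not the authors' two-stage estimator; the RBF kernel choice and the gamma value are assumptions made for the example.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Pairwise RBF kernel: K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def kernel_pca(K, n_components):
    # Center the kernel matrix in feature space, then keep the leading
    # eigen-directions; rows of the result are the projected samples.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    vals, vecs = np.linalg.eigh(Kc)            # ascending eigenvalues
    order = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[order], vecs[:, order]
    # Scale eigenvectors by sqrt(eigenvalue) to get projected features.
    return vecs * np.sqrt(np.maximum(vals, 0.0))
```

In a gene-set setting one would form one kernel per gene-set from its markers and reduce each to a handful of components before the stage-II regularized aggregation; that pipeline is only gestured at here.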
Palesh, Oxana; Peppone, Luke; Innominato, Pasquale F; Janelsins, Michelle; Jeong, Monica; Sprod, Lisa; Savard, Josee; Rotatori, Max; Kesler, Shelli; Telli, Melinda; Mustian, Karen
2012-01-01
Sleep problems are highly prevalent in cancer patients undergoing chemotherapy. This article reviews existing evidence on the etiology, associated symptoms, and management of sleep problems associated with chemotherapy treatment during cancer. It also discusses limitations and methodological issues of current research. The existing literature suggests that subjectively and objectively measured sleep problems are highest during the chemotherapy phase of cancer treatment. One possible mechanism reviewed here is the rise in circulating proinflammatory cytokines and the associated disruption of circadian rhythm, which may contribute to the development and maintenance of sleep dysregulation in cancer patients during chemotherapy. Various approaches to the management of sleep problems during chemotherapy are discussed, with behavioral intervention showing promise. Exercise, including yoga, also appears to be effective and safe, at least for subclinical levels of sleep problems in cancer patients. Numerous challenges are associated with conducting research on sleep in cancer patients during chemotherapy treatment, and they are discussed in this review. Dedicated intervention trials, methodologically sound and sufficiently powered, are needed to test current and novel treatments of sleep problems in cancer patients receiving chemotherapy. Optimal management of sleep problems in patients with cancer receiving treatment may improve not only the well-being of patients, but also their prognosis, given the emerging experimental and clinical evidence suggesting that sleep disruption might adversely impact treatment and recovery from cancer. PMID:23486503