Multiwavelength Challenges in the Fermi Era
NASA Technical Reports Server (NTRS)
Thompson, D. J.
2010-01-01
The gamma-ray surveys of the sky by AGILE and the Fermi Gamma-ray Space Telescope offer both opportunities and challenges for multiwavelength and multi-messenger studies. Gamma-ray bursts, pulsars, binary sources, flaring Active Galactic Nuclei, and Galactic transient sources are all phenomena that can best be studied with a wide variety of instruments simultaneously or contemporaneously. From the gamma-ray side, a principal challenge is the latency from the time of an astrophysical event to the recognition of this event in the data. Obtaining quick and complete multiwavelength coverage of gamma-ray sources of interest can be difficult both in terms of logistics and in terms of generating scientific interest.
NASA's EOSDIS Near Term Challenges
NASA Technical Reports Server (NTRS)
Behnke, Jeanne
2018-01-01
Given the long-term requirements, the rapid pace of information technology, and the changing expectations of the user community, the ESDIS Project has had to evolve EOSDIS continually over the past three decades. However, many challenges remain. One near-term challenge is the enormous quantity of new data that will need to be managed by the EOSDIS. With the upcoming launch of the latest NASA missions coupled with existing operational missions and field campaigns, EOSDIS can expect to handle as much as 50 petabytes of data per year. To put this in perspective, that is twice the size of the existing archive, which took over 21 years to collect. Another continuing challenge is the disparate requirements of a diverse science community. Maintaining rigorous long-term data preservation, supporting ease of discovery and access, incorporating user feedback, enabling reanalysis/reprocessing, and agile integration of new data sources continue to be the Project's objectives.
The Long Term Agroecosystem Research Network - Shared research strategy
USDA-ARS's Scientific Manuscript database
Agriculture faces tremendous challenges in meeting multiple societal goals, including a safe and plentiful food supply; climate change adaptation and mitigation; supplying sources of bioenergy; improving water, air, and soil quality; and maintaining biodiversity. The Long Term Agroecosystem Research...
The DNAPL Remediation Challenge: Is There a Case for Source Depletion?
Releases of Dense Non-Aqueous Phase Liquids (DNAPLs) at a large number of public and private sector sites in the United States pose significant challenges in site remediation and long-term site management. Extensive contamination of groundwater occurs as a result of significant ...
Making the right long-term prescription for medical equipment financing.
Conbeer, George P
2007-06-01
For hospital financial executives charged with assessing new technologies, obtaining access to sufficient information to support an in-depth analysis can be a daunting challenge. The information should come not only from direct sources, such as the equipment manufacturer, but also from indirect sources, such as leasing companies. A thorough knowledge of financing methods--including tax-exempt bonds, bank debt, standard leasing, tax-exempt leasing, and equipment rental terms--is critical.
Ragweed (Ambrosia) pollen source inventory for Austria.
Karrer, G; Skjøth, C A; Šikoparija, B; Smith, M; Berger, U; Essl, F
2015-08-01
This study improves the spatial coverage of top-down Ambrosia pollen source inventories for Europe by expanding the methodology to Austria, a country that is challenging in terms of topography and the distribution of ragweed plants. The inventory combines annual ragweed pollen counts from 19 pollen-monitoring stations in Austria (2004-2013), 657 geographical observations of Ambrosia plants, a Digital Elevation Model (DEM), local knowledge of ragweed ecology and CORINE land cover information from the source area. The highest mean annual ragweed pollen concentrations were generally recorded in the east of Austria, where the highest densities of possible growth habitats for Ambrosia were situated. Approximately 99% of all observations of Ambrosia populations were below 745 m. The European infection level varies from 0.1% at Freistadt in Northern Austria to 12.8% at Rosalia in Eastern Austria. More top-down Ambrosia pollen source inventories are required for other parts of Europe. The study thus provides a method for constructing top-down pollen source inventories for invasive ragweed plants in a country that is challenging in terms of topography and ragweed distribution. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Piscitelli, F.; Mauri, G.; Messi, F.; Anastasopoulos, M.; Arnold, T.; Glavic, A.; Höglund, C.; Ilves, T.; Lopez Higuera, I.; Pazmandi, P.; Raspino, D.; Robinson, L.; Schmidt, S.; Svensson, P.; Varga, D.; Hall-Wilton, R.
2018-05-01
The Multi-Blade is a Boron-10-based gaseous thermal neutron detector developed to face the challenges arising in neutron reflectometry at neutron sources. Neutron reflectometers are challenging instruments in terms of instantaneous counting rate and spatial resolution. This detector has been designed according to the requirements given by the reflectometers at the European Spallation Source (ESS) in Sweden. The Multi-Blade has been installed and tested on the CRISP reflectometer at the ISIS neutron and muon source in the U.K. The results of the detailed detector characterization are discussed in this manuscript.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Genn Saji
2006-07-01
The term 'ultimate risk' is used here to describe the probabilities and radiological consequences that should be incorporated in siting, containment design and accident management of nuclear power plants for hypothetical accidents. It is closely related to the source terms specified in siting criteria, which assure an adequate separation of the radioactive inventories of the plants from the public in the event of a hypothetical and severe accident situation. The author would like to point out that current source terms, which are based on information from the Windscale accident (1957) through TID-14844, are very outdated and do not incorporate lessons learned from either the Three Mile Island (TMI, 1979) or the Chernobyl (1986) accident, two of the most severe accidents ever experienced. As a result of the observations of benign radionuclides released at TMI, the technical community in the US felt that a more realistic evaluation of severe reactor accident source terms was necessary. Against this background, the 'source term research project' was organized in 1984 to respond to these challenges. Unfortunately, soon after the final report from this project was released, the Chernobyl accident occurred. Due to the enormous consequences of that accident, the once optimistic prospects for establishing a more realistic source term were completely shattered. The Chernobyl accident, with its human death toll and the dispersion of a large part of the fission fragment inventory into the environment, created a significant degradation in the public's acceptance of nuclear energy throughout the world. In spite of this, nuclear communities have been prudent in responding to the public's anxiety about the ultimate safety of nuclear plants, since many unknown points still remained concerning the mechanism of the Chernobyl accident. In order to resolve some of these mysteries, the author has performed a scoping study of the dispersion and deposition mechanisms of fuel particles and fission fragments during the initial phase of the Chernobyl accident. Through this study, it is now possible to generally reconstruct the radiological consequences by using a dispersion calculation technique, combined with the meteorological data at the time of the accident and the ¹³⁷Cs land contamination densities measured and reported around the Chernobyl area. Although it is challenging to incorporate lessons learned from the Chernobyl accident into the source term issues, the author has already developed an example of safety goals incorporating the radiological consequences of the accident. The example provides safety goals by specifying source term releases in a graded approach in combination with probabilities, i.e. risks. The author believes that future source term specifications should be directly linked with safety goals.
NASA Astrophysics Data System (ADS)
Chamakuri, Nagaiah; Engwer, Christian; Kunisch, Karl
2014-09-01
Optimal control for cardiac electrophysiology based on the bidomain equations in conjunction with the Fenton-Karma ionic model is considered. This generic ventricular model approximates well the restitution properties and spiral wave behavior of more complex ionic models of cardiac action potentials. However, it is challenging due to the appearance of state-dependent discontinuities in the source terms. A computational framework for the numerical realization of optimal control problems is presented. Essential ingredients are a shape calculus based treatment of the sensitivities of the discontinuous source terms and a marching cubes algorithm to track iso-surfaces of excitation wavefronts. Numerical results exhibit successful defibrillation by applying an optimally controlled extracellular stimulus.
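A minimal sketch of the iso-surface tracking step, using scikit-image's marching_cubes on a synthetic potential field (our illustration, not the authors' code; the paper's solver, mesh, and threshold differ):

```python
# Hypothetical illustration: extract an excitation wavefront as an
# iso-surface of a "membrane potential" field, in the spirit of the
# marching-cubes step described above.
import numpy as np
from skimage.measure import marching_cubes

# Synthetic potential on a 3-D grid: a spherical shell of excitation.
x, y, z = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
v = np.exp(-((np.sqrt(x**2 + y**2 + z**2) - 0.5) ** 2) / 0.01)

# Triangulate the iso-surface at a chosen excitation threshold.
verts, faces, normals, values = marching_cubes(v, level=0.5)
print(f"wavefront mesh: {len(verts)} vertices, {len(faces)} triangles")
```

Direct access to the triangulated front makes quantities such as front position and area available at each control iteration.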
Xu, Yan; Wang, Yining; Sun, Jian-Tao; Zhang, Jianwen; Tsujii, Junichi; Chang, Eric
2013-01-01
To build large collections of medical terms from semi-structured information sources (e.g. tables, lists, etc.) and encyclopedia sites on the web. The terms are classified into the three semantic categories, Medical Problems, Medications, and Medical Tests, which were used in i2b2 challenge tasks. We developed two systems, one for Chinese and another for English terms. The two systems share the same methodology and use the same software with minimum language dependent parts. We produced large collections of terms by exploiting billions of semi-structured information sources and encyclopedia sites on the Web. The standard performance metric of recall (R) is extended to three different types of Recall to take the surface variability of terms into consideration. They are Surface Recall (R(S)), Object Recall (R(O)), and Surface Head Recall (R(H)). We use two test sets for Chinese. For English, we use a collection of terms in the 2010 i2b2 text. Two collections of terms, one for English and the other for Chinese, have been created. The terms in these collections are classified as Medical Problems, Medications, or Medical Tests in the i2b2 challenge tasks. The English collection contains 49,249 (Problems), 89,591 (Medications) and 25,107 (Tests) terms, while the Chinese one contains 66,780 (Problems), 101,025 (Medications), and 15,032 (Tests) terms. The proposed method of constructing a large collection of medical terms is both efficient and effective, and, most of all, independent of language. The collections will be made publicly available. PMID:23874426
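An illustrative reading of the three recall variants in code (the matching rules here are our assumptions for illustration; the paper's exact definitions may differ):

```python
# Hypothetical sketch of the three recall variants for term collections.
def surface_recall(gold_surfaces, collected):
    # R(S): fraction of gold surface forms found verbatim in the collection.
    return sum(s in collected for s in gold_surfaces) / len(gold_surfaces)

def object_recall(gold_objects, collected):
    # R(O): an object (a concept with several surface forms) counts as
    # recalled if any one of its surface forms was collected.
    return sum(any(s in collected for s in forms)
               for forms in gold_objects.values()) / len(gold_objects)

def surface_head_recall(gold_surfaces, collected):
    # R(H): a gold term counts if its head word (last token, for English)
    # matches the head word of some collected term.
    heads = {t.split()[-1] for t in collected}
    return sum(s.split()[-1] in heads for s in gold_surfaces) / len(gold_surfaces)

gold_objects = {"hypertension": {"hypertension", "high blood pressure"}}
collected = {"high blood pressure"}
print(object_recall(gold_objects, collected))       # 1.0: concept recovered
print(surface_recall({"hypertension"}, collected))  # 0.0: exact surface missed
```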
NASA Astrophysics Data System (ADS)
Xia, Xilin; Liang, Qiuhua; Ming, Xiaodong; Hou, Jingming
2017-05-01
Numerical models solving the full 2-D shallow water equations (SWEs) have been increasingly used to simulate overland flows and better understand the transient flow dynamics of flash floods in a catchment. However, there still exist key challenges that have not yet been resolved for the development of fully dynamic overland flow models, related to (1) the difficulty of maintaining numerical stability and accuracy in the limit of disappearing water depth and (2) inaccurate estimation of velocities and discharges on slopes as a result of the strong nonlinearity of friction terms. This paper aims to tackle these key research challenges and present a new numerical scheme for accurately and efficiently modeling large-scale transient overland flows over complex terrains. The proposed scheme features a novel surface reconstruction method (SRM) to correctly compute slope source terms and maintain numerical stability at small water depth, and a new implicit discretization method to handle the highly nonlinear friction terms. The resulting shallow water overland flow model is first validated against analytical and experimental test cases and then applied to simulate a hypothetical rainfall event in the 42 km² Haltwhistle Burn, UK.
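A hedged sketch of one common way to make the friction source term implicit, a pointwise linearized Manning update (the paper's scheme and its surface reconstruction method are more elaborate; parameter values here are illustrative):

```python
# Illustrative semi-implicit friction update for a shallow water solver.
import numpy as np

def friction_update(u, h, dt, n_manning=0.03, g=9.81, h_min=1e-6):
    """Solve u_new = u - dt*Cf*u_new*|u| pointwise, i.e.
    u_new = u / (1 + dt*Cf*|u|), with Cf = g*n^2 / h^(4/3).
    Remains stable and sign-preserving as h -> 0 or dt grows."""
    h_safe = np.maximum(h, h_min)  # guard the vanishing-depth limit
    cf = g * n_manning**2 / h_safe ** (4.0 / 3.0)
    return u / (1.0 + dt * cf * np.abs(u))

u = np.array([2.0, 0.5, -1.0])  # velocities (m/s)
h = np.array([0.5, 1e-4, 0.2])  # depths (m), including a thin film
print(friction_update(u, h, dt=1.0))
```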
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, David; Bucknor, Matthew; Jerden, James
2016-10-01
The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal-fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal of identifying gaps in the current knowledge base. The second path, performed by an independent contractor, involved sensitivity analyses to determine the importance of particular radionuclides and transport phenomena with regard to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.
Liquid-metal-ion source development for space propulsion at ARC.
Tajmar, M; Scharlemann, C; Genovese, A; Buldrini, N; Steiger, W; Vasiljevich, I
2009-04-01
The Austrian Research Centers have a long history of developing indium Liquid-Metal-Ion Sources (LMIS) for space applications including spacecraft charging compensators, SIMS and propulsion. Specifically, the application as a thruster requires long-term as well as high-current operation, which is very challenging. Recently, we demonstrated the operation of a cluster of single LMIS at an average current of 100 μA each for more than 4800 h and developed models for tip erosion and droplet deposition suggesting that such a LMIS can operate up to 20,000 h or more. In order to drastically increase the current, a porous multi-tip source that allows operation up to several mA was developed. Our paper will highlight the problem areas and challenges from our LMIS development focusing on space propulsion applications.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-07
...--(i) To assist refugees in obtaining the skills which are necessary for economic self-sufficiency... the Ethiopian Community Development Council, Inc. (ECDC), located in Arlington, VA. Current economic... collaboration to meet these challenges. Provision of technical assistance is essential to support the long-term...
Artificial Intelligence and School Library Media Centers.
ERIC Educational Resources Information Center
Young, Robert J.
1990-01-01
Discusses developments in artificial intelligence in terms of their impact on school library media centers and the role of media specialists. Possible uses of expert systems, hypertext, and CD-ROM technologies in school media centers are examined and the challenges presented by these technologies are discussed. Fourteen sources of additional…
Long term field evaluation reveals HLB resistance in Citrus relatives
USDA-ARS's Scientific Manuscript database
Citrus huanglongbing (HLB) is a destructive disease with no known cure. To identify sources of HLB resistance in the subfamily Aurantioideae to which citrus belongs, we conducted a six-year field trial under natural disease challenge conditions in an HLB endemic region. The study included 65 Citrus ...
Strategically Reviewing the Research Literature in Qualitative Research
ERIC Educational Resources Information Center
Chenail, Ronald J.; Cooper, Robin; Desir, Charlene
2010-01-01
Reviewing literature in qualitative research can be challenging in terms of why, when, where, and how we should access third-party sources in our work, especially for novice qualitative researchers. As a pragmatic solution, we suggest qualitative researchers utilize research literature in four functional ways: (a) define the phenomenon in…
NASA Astrophysics Data System (ADS)
Jordan, Phil; Melland, Alice; Shore, Mairead; Mellander, Per-Erik; Shortle, Ger; Ryan, David; Crockford, Lucy; Macintosh, Katrina; Campbell, Julie; Arnscheidt, Joerg; Cassidy, Rachel
2014-05-01
A complete appraisal of material fluxes in flowing waters is really only possible with high time resolution data synchronous with measurements of discharge. Described by Kirchner et al. (2004; Hydrological Processes, 18/7) as the high-frequency wave of the future, and with regard to disentangling signal noise from process pattern, this challenge has been met in terms of nutrient flux monitoring by automated bankside analysis. In Ireland over a ten-year period, time-series nutrient data collected on a sub-hourly basis in rivers have been used to distinguish fluxes from different catchment sources and pathways and to provide more certain temporal pictures of flux for the comparative definition of catchment nutrient dynamics. In catchments where nutrient fluxes are particularly high and exhibit a mix of extreme diffuse and point source influences, high time resolution data analysis indicates that there are no satisfactory statistical proxies for seasonal or annual flux predictions that use coarse datasets, or at least it exposes the limits of statistical approaches at the catchment scale and in terms of hydrological response. This has profound implications for catchment monitoring programmes that rely on modelled relationships. However, using high resolution monitoring for long term assessments of catchment mitigation measures comes with further challenges. Sustaining continuous wet chemistry analysis at river stations is resource intensive in terms of capital, maintenance and quality assurance. Furthermore, big data capture requires investment in data management systems and analysis. These two institutional challenges are magnified when considering the extended time period required to identify the influences of land-based nutrient control measures on water-based systems. Separating the 'climate signal' from the 'source signal' in river nutrient flux data is a major analysis challenge; more so when tackled with anything but higher resolution data. Nevertheless, there is scope to lower costs in bankside analysis through technology development, and the scientific advantages of these data are clear and exciting. When integrating its use with policy appraisal, it must be made clear that the advances in river process understanding from high resolution monitoring data capture come as a package with the ability to make more informed decisions through an investment in better information.
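For concreteness, the flux appraisal described above reduces to integrating concentration times discharge through time; a minimal sketch with synthetic 10-minute data (all names and values are illustrative, not from the Irish programme):

```python
# Illustrative load computation from synchronous high-frequency
# concentration and discharge series.
import numpy as np

t = np.arange(0.0, 24 * 3600, 600.0)                    # 10-min steps, one day (s)
q = 2.0 + 1.5 * np.exp(-((t - 43200.0) / 7200.0) ** 2)  # discharge (m3/s), storm peak at noon
c = 0.05 + 0.10 * (q - q.min()) / (q.max() - q.min())   # nutrient conc. (g/m3), flow-linked

f = c * q                                               # instantaneous flux (g/s)
load_kg = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(t)) / 1000.0  # trapezoidal rule, g -> kg
print(f"daily load: {load_kg:.1f} kg")
```

Coarser sampling misses the covariance of concentration and discharge through the storm peak, which is exactly the error that high-frequency data avoid.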
Applied research opportunities in developed campgrounds
Carl P. Wiedemann
2002-01-01
Developed area camping is an important recreational activity in terms of both participation and as a source of revenue for public agencies. A major challenge for administrators in the public sector is how to increase revenues on limited budgets without sacrificing customer satisfaction. Applied research could make a valuable contribution to decision making, but not...
SoS Navigator 2.0: A Context-Based Approach to System-of-Systems Challenges
2008-06-01
The report grounds its approach in Kolb's Experiential Learning: Experience as the Source of Learning and Development (MIT Press, 1984) and in Rosen (1991) for the relational approach to understanding anticipative systems. Its contents cover supporting techniques and tools, the learning/transformation cycle, a summary of SoS Navigator processes and techniques, and case summaries.
Geocoronal hydrogen studies using Fabry Perot interferometers, part 2: Long-term observations
NASA Astrophysics Data System (ADS)
Nossal, S. M.; Mierkiewicz, E. J.; Roesler, F. L.; Reynolds, R. J.; Haffner, L. M.
2006-09-01
Long-term data sets are required to investigate sources of natural variability in the upper atmosphere. Understanding the influence of sources of natural variability such as the solar cycle is needed to characterize the thermosphere + exosphere, to understand coupling processes between atmospheric regions, and to isolate signatures of natural variability from those due to human-caused change. Multi-year comparisons of thermospheric + exospheric Balmer α emissions require cross-calibrated and well-understood instrumentation, a stable calibration source, reproducible observing conditions, separation of the terrestrial from the Galactic emission line, and consistent data analysis accounting for differences in viewing geometry. We discuss how we address these criteria in the acquisition and analysis of a mid-latitude geocoronal Balmer α column emission data set now spanning two solar cycles and taken mainly from Wisconsin and Kitt Peak, Arizona. We also discuss results and outstanding challenges for increasing the accuracy and use of these observations.
Numerical Simulations of Reacting Flows Using Asynchrony-Tolerant Schemes for Exascale Computing
NASA Astrophysics Data System (ADS)
Cleary, Emmet; Konduri, Aditya; Chen, Jacqueline
2017-11-01
Communication and data synchronization between processing elements (PEs) are likely to pose a major challenge in scalability of solvers at the exascale. Recently developed asynchrony-tolerant (AT) finite difference schemes address this issue by relaxing communication and synchronization between PEs at a mathematical level while preserving accuracy, resulting in improved scalability. The performance of these schemes has been validated for simple linear and nonlinear homogeneous PDEs. However, many problems of practical interest are governed by highly nonlinear PDEs with source terms, whose solution may be sensitive to perturbations caused by communication asynchrony. The current work applies the AT schemes to combustion problems with chemical source terms, yielding a stiff system of PDEs with nonlinear source terms highly sensitive to temperature. Examples shown will use single-step and multi-step CH4 mechanisms for 1D premixed and nonpremixed flames. Error analysis will be discussed in both physical and spectral space. Results show that additional errors introduced by the AT schemes are negligible and the schemes preserve their accuracy. We acknowledge funding from the DOE Computational Science Graduate Fellowship administered by the Krell Institute.
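A toy sketch of one asynchrony-tolerant idea, time-extrapolating a stale neighbor value from two delayed levels rather than using it as-is (our simplification for illustration; the published AT schemes derive general stencils with provable order):

```python
# Illustrative asynchrony-tolerant boundary treatment.
def at_boundary_value(u_stale, u_staler, k):
    """Estimate the current-time neighbor value from values that are
    k and k+1 steps old, via linear extrapolation in time."""
    return u_stale + k * (u_stale - u_staler)

# Neighbor value evolving as u(step) = 1 - 0.1*step, delay k = 3 at step 10.
k = 3
u_true_now = 1.0 - 0.1 * 10
u_stale = 1.0 - 0.1 * (10 - k)       # k steps old
u_staler = 1.0 - 0.1 * (10 - k - 1)  # k+1 steps old
print(at_boundary_value(u_stale, u_staler, k), "vs true", u_true_now)
```

For linearly varying data the extrapolation is exact; for smooth nonlinear data it removes the leading-order delay error that would otherwise degrade accuracy.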
USDA-ARS's Scientific Manuscript database
Relating field observations of plant phenological events to remotely sensed depictions of land surface phenology remains a challenge to the vertical integration of data from disparate sources. This research conducted at the Jornada Basin Long-Term Ecological Research site in southern New Mexico cap...
Two Machine Learning Approaches for Short-Term Wind Speed Time-Series Prediction.
Ak, Ronay; Fink, Olga; Zio, Enrico
2016-08-01
The increasing liberalization of European electricity markets, the growing proportion of intermittent renewable energy being fed into the energy grids, and also new challenges in the patterns of energy consumption (such as electric mobility) require flexible and intelligent power grids capable of providing efficient, reliable, economical, and sustainable energy production and distribution. From the supplier side, particularly, the integration of renewable energy sources (e.g., wind and solar) into the grid imposes an engineering and economic challenge because of the limited ability to control and dispatch these energy sources due to their intermittent characteristics. Time-series prediction of wind speed for wind power production is a particularly important and challenging task, wherein prediction intervals (PIs), rather than point estimates, are the preferable result because they provide information on the confidence in the prediction. In this paper, two different machine learning approaches to assess PIs of time-series predictions are considered and compared: 1) multilayer perceptron neural networks trained with a multiobjective genetic algorithm and 2) extreme learning machines combined with the nearest neighbors approach. The proposed approaches are applied for short-term wind speed prediction from a real data set of hourly wind speed measurements for the region of Regina in Saskatchewan, Canada. Both approaches demonstrate good prediction precision and provide complementary advantages with respect to different evaluation criteria.
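A minimal sketch of the nearest-neighbors side of the comparison, building an empirical prediction interval from the outcomes of the k nearest training cases (features, targets, and k are assumptions for illustration; the extreme-learning-machine stage is not shown):

```python
# Illustrative k-nearest-neighbors prediction interval for wind speed.
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
X = rng.uniform(0, 20, size=(1000, 3))        # e.g. the last 3 hourly speeds
y = X[:, -1] + rng.normal(0, 1.5, size=1000)  # next-hour speed (synthetic)

nn = NearestNeighbors(n_neighbors=50).fit(X)
x_query = np.array([[8.0, 9.0, 10.0]])
_, idx = nn.kneighbors(x_query)
neighbors_y = y[idx[0]]

lo, hi = np.percentile(neighbors_y, [5, 95])  # empirical 90% PI
print(f"point: {neighbors_y.mean():.2f} m/s, 90% PI: [{lo:.2f}, {hi:.2f}]")
```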
O'Connell, Beverly; Guse, Lorna; Greenslade, Loreley
2018-01-30
Bachelor of Nursing (BN) students placed in long-term care encounter residents who exhibit challenging behaviors. Students are often inadequately prepared to manage these behaviors, and this is a source of distress for students. This study explored whether enhancing and restructuring theoretical and clinical courses resulted in student nurses feeling better prepared to manage residents' challenging behaviors and reduced their levels of distress. The study was conducted in two phases, with 116 BN students in the first phase and 99 students in the second phase, in which the course on older adults was restructured. The findings indicated that students who felt less prepared experienced greater distress from residents' behaviors than those who felt better prepared. Scheduling a theoretical course on the care of older adults prior to the clinical course placement, as well as offering an online learning module focused on responsive behaviors, significantly increased students' feelings of preparedness to manage residents' complex behaviors.
NASA Astrophysics Data System (ADS)
Sturmberg, Björn C. P.; Dossou, Kokou B.; Lawrence, Felix J.; Poulton, Christopher G.; McPhedran, Ross C.; Martijn de Sterke, C.; Botten, Lindsay C.
2016-05-01
We describe EMUstack, an open-source implementation of the Scattering Matrix Method (SMM) for solving field problems in layered media. The fields inside nanostructured layers are described in terms of Bloch modes that are found using the Finite Element Method (FEM). Direct access to these modes allows the physical intuition of thin film optics to be extended to complex structures. The combination of the SMM and the FEM makes EMUstack ideally suited for studying lossy, high-index contrast structures, which challenge conventional SMMs.
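The core SMM operation such a solver builds on is combining layer scattering matrices; a generic sketch using the textbook Redheffer star product (this is not EMUstack's API, and block conventions vary between implementations):

```python
# Illustrative Redheffer star product for stacking two layers.
import numpy as np

def star(SA, SB):
    """Combine S-matrices given as dicts of blocks '11', '12', '21', '22'
    (reflection/transmission between incoming and outgoing modal amplitudes)."""
    I = np.eye(SA["22"].shape[0])
    M = np.linalg.inv(I - SB["11"] @ SA["22"])
    N = np.linalg.inv(I - SA["22"] @ SB["11"])
    return {
        "11": SA["11"] + SA["12"] @ M @ SB["11"] @ SA["21"],
        "12": SA["12"] @ M @ SB["12"],
        "21": SB["21"] @ N @ SA["21"],
        "22": SB["22"] + SB["21"] @ N @ SA["22"] @ SB["12"],
    }

n = 2
Z, E = np.zeros((n, n)), np.eye(n)
S_transparent = {"11": Z, "12": E, "21": E, "22": Z}  # a "do nothing" layer
S_layer = {"11": 0.3 * E, "12": 0.7 * E, "21": 0.7 * E, "22": 0.3 * E}
print(np.allclose(star(S_transparent, S_layer)["11"], S_layer["11"]))  # True
```

Unlike transfer-matrix multiplication, the star product stays numerically stable for lossy, high-index-contrast layers because it never exponentiates growing modes.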
Marshall, Harry H; Griffiths, David J; Mwanguhya, Francis; Businge, Robert; Griffiths, Amber G F; Kyabulima, Solomon; Mwesige, Kenneth; Sanderson, Jennifer L; Thompson, Faye J; Vitikainen, Emma I K; Cant, Michael A
2018-01-01
Studying ecological and evolutionary processes in the natural world often requires research projects to follow multiple individuals in the wild over many years. These projects have provided significant advances but may also be hampered by the need to accurately and efficiently collect and store multiple streams of data from multiple individuals concurrently. The increase in the availability and sophistication of portable computers (smartphones and tablets) and the applications that run on them has the potential to address many of these data collection and storage issues. In this paper we describe the challenges faced by one such long-term, individual-based research project: the Banded Mongoose Research Project in Uganda. We describe a system we have developed called Mongoose 2000 that utilises the potential of apps and portable computers to meet these challenges. We discuss the benefits and limitations of employing such a system in a long-term research project. The app and source code for the Mongoose 2000 system are freely available and we detail how it might be used to aid data collection and storage in other long-term individual-based projects.
Bayesian source term estimation of atmospheric releases in urban areas using LES approach.
Xue, Fei; Kikumoto, Hideki; Li, Xiaofeng; Ooka, Ryozo
2018-05-05
The estimation of source information from limited measurements of a sensor network is a challenging inverse problem, which can be viewed as an assimilation process of the observed concentration data and the predicted concentration data. When dealing with releases in built-up areas, the predicted data are generally obtained by the Reynolds-averaged Navier-Stokes (RANS) equations, which yields building-resolving results; however, RANS-based models are outperformed by large-eddy simulation (LES) in the predictions of both airflow and dispersion. Therefore, it is important to explore the possibility of improving the estimation of the source parameters by using the LES approach. In this paper, a novel source term estimation method is proposed based on LES approach using Bayesian inference. The source-receptor relationship is obtained by solving the adjoint equations constructed using the time-averaged flow field simulated by the LES approach based on the gradient diffusion hypothesis. A wind tunnel experiment with a constant point source downwind of a single building model is used to evaluate the performance of the proposed method, which is compared with that of the existing method using a RANS model. The results show that the proposed method reduces the errors of source location and releasing strength by 77% and 28%, respectively. Copyright © 2018 Elsevier B.V. All rights reserved.
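A schematic sketch of the Bayesian estimation step: given a source-receptor matrix (which the paper derives from adjoint equations on the LES time-averaged flow), candidate source sites and strengths are scored against sensor readings with a Gaussian likelihood (all dimensions and values below are synthetic):

```python
# Illustrative grid-based Bayesian source term estimation.
import numpy as np

rng = np.random.default_rng(1)
n_sensors, n_sites = 12, 40
A = rng.uniform(0, 1, (n_sensors, n_sites))  # stand-in source-receptor matrix
true_j, true_q, sigma = 17, 2.5, 0.05
y = true_q * A[:, true_j] + rng.normal(0, sigma, n_sensors)  # sensor data

qs = np.linspace(0.1, 5.0, 200)              # grid over release strength
log_post = np.empty((n_sites, qs.size))
for j in range(n_sites):                     # flat prior over sites and strengths
    resid = y[:, None] - qs[None, :] * A[:, j][:, None]
    log_post[j] = -0.5 * np.sum(resid**2, axis=0) / sigma**2

j_hat, iq = np.unravel_index(np.argmax(log_post), log_post.shape)
print(f"MAP site {j_hat} (true {true_j}), strength {qs[iq]:.2f} (true {true_q})")
```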
Digital Imaging and the Cognitive Revolution: A Media Challenge.
ERIC Educational Resources Information Center
Sartorius, Ute
This paper discusses the role of digital technology within the cognitive revolution of the perception of images. It analyzes the traditional values placed on images as a source of cognition. These values are discussed in terms of the ethical and social issues raised by the use of digital image manipulation in so far as the digital era is falsely…
Mitigating climate change through small-scale forestry in the USA: opportunities and challenges
Susan Charnley; David Diaz; Hannah Gosnell
2010-01-01
Forest management for carbon sequestration is a low-cost, low-technology, relatively easy way to help mitigate global climate change that can be adopted now while additional long-term solutions are developed. Carbon-oriented management of forests also offers forest owners an opportunity to obtain a new source of income, and commonly has environmental co-benefits. The...
Grand challenges in mass storage: A systems integrators perspective
NASA Technical Reports Server (NTRS)
Lee, Richard R.; Mintz, Daniel G.
1993-01-01
Within today's much ballyhooed supercomputing environment, with its GFLOPS of CPU power and Gigabit networks, there exists a major roadblock to computing success: that of Mass Storage. The solution to this mass storage problem is considered to be one of the 'Grand Challenges' facing the computer industry today, as well as long into the future. It has become obvious to us, as well as many others in the industry, that there is no clear single solution in sight. The Systems Integrator today is faced with a myriad of quandaries in approaching this challenge. He must first be innovative in approach; second, choose hardware solutions that are volumetrically efficient, high in signal bandwidth, available from multiple sources, competitively priced, and extendible for forward growth. In addition he must comply with a variety of mandated, and often conflicting, software standards (GOSIP, POSIX, IEEE, MSRM 4.0, and others), and finally he must deliver a systems solution with the 'most bang for the buck' in terms of cost vs. performance factors. These quandaries challenge the Systems Integrator to 'push the envelope' in terms of his or her ingenuity and innovation on an almost daily basis. This dynamic is explored further, and an attempt is made to acquaint the audience with rational approaches to this 'Grand Challenge'.
Trends and drivers of marine debris on the Atlantic coast of the United States 1997-2007.
Ribic, Christine A; Sheavly, Seba B; Rugg, David J; Erdmann, Eric S
2010-08-01
For the first time, we documented regional differences in amounts and long-term trends of marine debris along the US Atlantic coast. The Southeast Atlantic had low land-based and general-source debris loads as well as no increases despite a 19% increase in coastal population. The Northeast (8% population increase) also had low land-based and general-source debris loads and no increases. The Mid-Atlantic (10% population increase) fared the worst, with heavy land-based and general-source debris loads that increased over time. Ocean-based debris did not change in the Northeast where the fishery is relatively stable; it declined over the Mid-Atlantic and Southeast and was correlated with declining regional fisheries. Drivers, including human population, land use status, fishing activity, and oceanic current systems, had complex relationships with debris loads at local and regional scales. Management challenges remain undeniably large but solid information from long-term programs is one key to addressing this pressing pollution issue. Published by Elsevier Ltd.
The DNAPL challenge: Is there a case for partial source removal?
NASA Astrophysics Data System (ADS)
Kavanaugh, M. C.; Rao, P. S. C.
2003-04-01
Despite significant advances in the science and technology of DNAPL source zone characterization, and DNAPL removal technologies over the past two decades, source remediation has not become a standard objective at most DNAPL sites. Few documented cases of DNAPL source removal have been published, and achievement of the usual cleanup metric in these source zones, namely meeting Maximum Contaminant Levels ("MCLs"), is rare. At most DNAPL sites, removal of sufficient amounts of DNAPL from the source zones to achieve MCLs is considered technically impracticable, taking cost into consideration. Leaving substantial quantities of DNAPL in source zones and instituting appropriate technologies to eliminate continued migration of groundwater plumes emanating from these source zones requires long-term reliability of barrier technologies (hydraulic or physical) and the permanence of institutional controls. This strategy runs the risk of technical or institutional failures and possible liabilities associated with natural resource damage claims. To address this challenge, the U.S. Environmental Protection Agency ("EPA") established a panel of experts ("Panel") on DNAPL issues to provide their opinions on the overarching question of whether DNAPL source remediation is feasible. This Panel, co-chaired by the authors of this paper, has now prepared a report summarizing its opinions on the key question of whether DNAPL source removal is achievable. This paper will present the findings of the Panel, addressing such issues as the current status of DNAPL source characterization and remediation technologies, alternative metrics of success for DNAPL source remediation, the potential benefits of partial DNAPL source depletion, and research needs to address data gaps that hinder the more widespread implementation of source removal strategies.
Hydropower Modeling Challenges
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoll, Brady; Andrade, Juan; Cohen, Stuart
Hydropower facilities are important assets for the electric power sector and represent a key source of flexibility for electric grids with large amounts of variable generation. As variable renewable generation sources expand, understanding the capabilities and limitations of the flexibility from hydropower resources is important for grid planning. Appropriately modeling these resources, however, is difficult because of the wide variety of constraints these plants face that other generators do not. These constraints can be broadly categorized as environmental, operational, and regulatory. This report highlights several key issues involved in incorporating these constraints when modeling hydropower operations in terms of production cost and capacity expansion. Many of these challenges involve a lack of data to adequately represent the constraints or issues of model complexity and run time. We present several potential methods for improving the accuracy of hydropower representation in these models to allow for a better understanding of hydropower's capabilities.
Chatterji, Madhabi
2016-12-01
This paper explores avenues for navigating evaluation design challenges posed by complex social programs (CSPs) and their environments when conducting studies that call for generalizable, causal inferences on an intervention's effectiveness. A definition of a CSP is provided, drawing on examples from different fields, and an evaluation case is analyzed in depth to derive seven (7) major sources of complexity that typify CSPs and threaten the assumptions of textbook-recommended experimental designs for performing impact evaluations. Theoretically supported, alternative methodological strategies are discussed to navigate assumptions and counter the design challenges posed by the complex configurations and ecology of CSPs. Specific recommendations include: sequential refinement of the evaluation design through systems thinking; systems-informed logic modeling; and use of extended-term, mixed methods (ETMM) approaches with exploratory and confirmatory phases of the evaluation. In the proposed approach, logic models are refined through direct induction and interactions with stakeholders. To better guide assumption evaluation, question framing, and selection of appropriate methodological strategies, a multiphase evaluation design is recommended. Copyright © 2016 Elsevier Ltd. All rights reserved.
Managing multicentre clinical trials with open source.
Raptis, Dimitri Aristotle; Mettler, Tobias; Fischer, Michael Alexander; Patak, Michael; Lesurtel, Mickael; Eshmuminov, Dilmurodjon; de Rougemont, Olivier; Graf, Rolf; Clavien, Pierre-Alain; Breitenstein, Stefan
2014-03-01
Multicentre clinical trials are challenged by high administrative burden, data management pitfalls and costs. This leads to reduced enthusiasm and commitment of the physicians involved and thus to a reluctance to conduct multicentre clinical trials. The purpose of this study was to develop a web-based open source platform to support a multicentre clinical trial. Following the design science research approach, we developed a web-based, multicentre clinical trial management system on Drupal, open source software distributed under the terms of the General Public License. The system was evaluated by user testing, has supported several completed and ongoing clinical trials well, and is available for free download.
Schlue, Danijela; Mate, Sebastian; Haier, Jörg; Kadioglu, Dennis; Prokosch, Hans-Ulrich; Breil, Bernhard
2017-01-01
Heterogeneous tumor documentation and the challenges of interpreting medical terms lead to problems in analyzing data from clinical and epidemiological cancer registries. The objective of this project was to design, implement and improve a national content delivery portal for oncological terms. Data elements of existing handbooks and documentation sources were analyzed, combined and summarized by medical experts from different comprehensive cancer centers. Informatics experts created a generic data model based on an existing metadata repository. In order to establish a national knowledge management system for standardized cancer documentation, a prototypical tumor wiki was designed and implemented. Requirements engineering techniques were applied to optimize this platform. It is targeted at user groups such as documentation officers, physicians and patients. Linkage to other information sources such as PubMed and MeSH was also realized.
Nuclear Explosion Monitoring Advances and Challenges
NASA Astrophysics Data System (ADS)
Baker, G. E.
2015-12-01
We address the state of the art in areas important to monitoring, current challenges, specific efforts that illustrate approaches addressing shortcomings in capabilities, and additional approaches that might be helpful. The exponential increase in the number of events that must be screened as magnitude thresholds decrease presents one of the greatest challenges. Ongoing efforts to exploit repeat seismic events using waveform correlation, subspace methods, and empirical matched field processing hold as much "game-changing" promise as anything being done, and further efforts to develop and apply such methods efficiently are critical. Greater accuracy of travel time, signal loss, and full waveform predictions is still needed to better locate and discriminate seismic events. Important developments include methods to model velocities using multiple types of data; to model attenuation with better separation of source, path, and site effects; and to model focusing and defocusing of surface waves. Current efforts to model higher frequency full waveforms are likely to improve source characterization while more effective estimation of attenuation from ambient noise holds promise for filling in gaps. Censoring in attenuation modeling is a critical problem to address. Quantifying uncertainty of discriminants is key to their operational use. Efforts to do so for moment tensor (MT) inversion are particularly important, and fundamental progress on the statistics of MT distributions is the most important advance needed in the near term in this area. Source physics is seeing great progress through theoretical, experimental, and simulation studies. The biggest need is to accurately predict the effects of source conditions on seismic generation. Uniqueness is the challenge here. Progress will depend on studies that probe what distinguishes mechanisms, rather than whether one of many possible mechanisms is consistent with some set of observations.
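A minimal sketch of the waveform-correlation screening idea mentioned above: slide a template over a continuous trace and flag windows whose normalized cross-correlation exceeds a threshold (template, noise level, and threshold are illustrative only):

```python
# Illustrative matched-template detection of a repeat event.
import numpy as np

def ncc(stream, template):
    """Normalized cross-correlation of a template against a longer trace."""
    m = len(template)
    t = (template - template.mean()) / template.std()
    out = np.empty(len(stream) - m + 1)
    for i in range(out.size):
        w = stream[i:i + m]
        s = w.std()
        out[i] = 0.0 if s == 0 else np.dot((w - w.mean()) / s, t) / m
    return out

rng = np.random.default_rng(2)
template = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 100)) * np.hanning(100)
trace = rng.normal(0, 0.3, 2000)
trace[700:800] += template                           # buried repeat event
cc = ncc(trace, template)
print("detections near:", np.flatnonzero(cc > 0.6))  # around sample 700
```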
NASA Astrophysics Data System (ADS)
Hewitson, B.; Jack, C. D.; Gutowski, W. J., Jr.
2014-12-01
Possibly the leading complication for users of climate information for policy and adaptation is the confusing mix of contrasting data sets that offer widely differing (and oftentimes fundamentally contradictory) indications of the magnitude and direction of past and future regional climate change. In this light, the most pressing scientific-societal challenge is the need to find new ways to understand the sources of conflicting messages from multi-model, multi-method and multi-scale disparities, to develop and implement new analytical methodologies to address this difficulty, and so to advance the interpretation and communication of robust climate information to decision makers. Compounding this challenge is the growth of climate services which, in view of the confusing mix of climate change messages, raises serious concerns as to the ethics of communication and dissemination of regional climate change data. The Working Group on Regional Climate (WGRC) of the World Climate Research Program (WCRP) oversees the CORDEX downscaling program, which offers a systematic approach to compare the CMIP5 GCMs alongside RCMs and empirical-statistical downscaling (ESD) within a common experimental design, and which facilitates the evaluation and assessment of the relative information content and sources of error. Using results from the CORDEX RCM and ESD evaluation experiment, set against the regional messages from the CMIP5 GCMs, we examine the differing messages that arise from each data source. These are then considered in terms of the consequences if each data source were to be independently adopted in a real-world use-case scenario. This is then cast in the context of the emerging developments on the distillation dilemma, where the pressing need is for multi-method integration, and how this relates to the WCRP regional research grand challenges.
2014-09-01
generation, exotic storage technologies, smart power grid management, and better power sources for directed-energy weapons (DEW). Accessible partner nation...near term will help to mitigate risks and improve outcomes. Forecasting typically extrapolates predictions based...eventually, diminished national power. Within this context, this paper examines policy, legal, ethical, and strategy implications for DoD from the impact...
Use of new scientific developments in regulatory risk assessments: challenges and opportunities.
Tarazona, Jose V
2013-07-01
Since the 1990s, science-based ecological risk assessments have constituted an essential tool for supporting decision making in the regulatory context. Using the European REACH Regulation as an example, this article presents the challenges and opportunities for new scientific developments within the area of chemical control and environmental protection. These challenges can be grouped into three main related topics (sets). In the short term, the challenges are directly associated with the regulatory requirements, needed to facilitate a scientifically sound implementation of the different obligations for industry and authorities. It is important to mention that although the actual tools are different due to the regulatory requirements, the basic needs are still the same as those addressed in the early 1990s: understanding the ecological relevance of the predicted effects, including the uncertainty, and facilitating the link with the socio-economic assessment. The second set of challenges covers the opportunities for getting an added value from the regulatory efforts. The information compiled through REACH registration and notification processes is analyzed as a source for new integrative developments for assessing the combined chemical risk at the regional level. Finally, the article discusses the challenge of inverting the process and developing risk assessment methods that focus on the receptor, the individual or ecosystem, instead of on the stressor or source. These approaches were limited in the past by the lack of information, but the identification and dissemination of standard information, including uses, manufacturing sites, physical-chemical, environmental, ecotoxicological, and toxicological properties as well as operational conditions and risk management measures for thousands of chemicals, combined with the knowledge gathered through large-scale monitoring programs and spatial information systems, is generating new opportunities. The challenge is linking predictions and measured data in an integral "-omic type" approach, considering data from different sources collectively and offering a complete assessment of the chemical risk of individuals and ecosystems, with new conceptual approaches that could be defined as "risk-omics based" paradigms and models. Copyright © 2013 SETAC.
NASA Astrophysics Data System (ADS)
Xie, Dexuan
2014-10-01
The Poisson-Boltzmann equation (PBE) is a widely used implicit solvent continuum model for calculating the electrostatic potential energy of biomolecules in ionic solvent, but its numerical solution remains a challenge due to the strong singularity and nonlinearity caused by its singular distribution source terms and exponential nonlinear terms. To deal effectively with such a challenge, in this paper, new solution decomposition and minimization schemes are proposed, together with a new PBE analysis on solution existence and uniqueness. Moreover, a PBE finite element program package is developed in Python based on the FEniCS program library and GAMer, a molecular surface and volumetric mesh generation program package. Numerical tests on proteins and a nonlinear Born ball model with an analytical solution validate the new solution decomposition and minimization schemes, and demonstrate the effectiveness and efficiency of the new PBE finite element program package.
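A hedged sketch of the decomposition idea: split the potential as u = G + u_reg, where G carries the Coulomb singularities of the fixed partial charges so that the remaining problem for u_reg is smooth (schematic only; the paper's decomposition, dielectric constants, and FEniCS implementation differ):

```python
# Illustrative singular component of a PBE solution decomposition.
import numpy as np

def coulomb_part(x, charges, centers, eps_m=2.0):
    """G(x) = sum_i q_i / (4*pi*eps_m*|x - x_i|): the component that
    absorbs the delta-function source terms of the fixed charges."""
    return sum(q / (4.0 * np.pi * eps_m * np.linalg.norm(x - xi))
               for q, xi in zip(charges, centers))

# One unit charge at the origin, evaluated off-center:
print(coulomb_part(np.array([0.0, 0.0, 1.0]), [1.0], [np.zeros(3)]))
```

Because G satisfies the singular part of the equation analytically, the finite element mesh only needs to resolve the smooth correction u_reg.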
Long-term consequences of pain in human neonates.
Grunau, Ruth E; Holsti, Liisa; Peters, Jeroen W B
2006-08-01
The low tactile threshold in preterm infants when they are in the neonatal intensive care unit (NICU), while their physiological systems are unstable and immature, potentially renders them more vulnerable to the effects of repeated invasive procedures. There is a small but growing literature on pain and tactile responsivity following procedural pain in the NICU, or early surgery. Long-term effects of repeated pain in the neonatal period on neurodevelopment await further research. However, there are multiple sources of stress in the NICU, which contribute to inducing high overall 'allostatic load', therefore determining specific effects of neonatal pain in human infants is challenging.
NASA Astrophysics Data System (ADS)
Pellerin, B. A.; Bergamaschi, B. A.; Downing, B. D.; Saraceno, J.; Fleck, J.; Shanley, J. B.; Aiken, G.; Boss, E.; Fujii, R.
2009-12-01
A critical challenge for understanding the sources, character and cycling of dissolved organic matter (DOM) is making measurements at the time scales in which changes occur in aquatic systems. Traditional approaches for data collection (daily to monthly discrete sampling) are often limited by analytical and field costs, site access and logistical challenges, particularly for long-term sampling at a large number of sites. The ability to make optical measurements of DOM in situ has been known for more than 50 years, but much of the work on in situ DOM absorbance and fluorescence using commercially-available instruments has taken place in the last few years. Here we present several recent examples that highlight the application of in situ measurements for understanding DOM dynamics in riverine systems at intervals of minutes to hours. Examples illustrate the utility of in situ optical sensors for studies of DOM over short-duration events of days to weeks (diurnal cycles, tidal cycles, storm events and snowmelt periods) as well as longer-term continuous monitoring for months to years. We also highlight the application of in situ optical DOM measurements as proxies for constituents that are significantly more difficult and expensive to measure at high frequencies (e.g. methylmercury, trihalomethanes). Relatively simple DOM absorbance and fluorescence measurements made in situ could be incorporated into short and long-term ecological research and monitoring programs, resulting in advanced understanding of organic matter sources, character and cycling in riverine systems.
Approaches to advance scientific understanding of macrosystems ecology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levy, Ofir; Ball, Becky; Bond-Lamberty, Benjamin
Macrosystem ecological studies inherently investigate processes that interact across multiple spatial and temporal scales, requiring intensive sampling and massive amounts of data from diverse sources to incorporate complex cross-scale and hierarchical interactions. Inherent challenges associated with these characteristics include high computational demands, data standardization and assimilation, identification of important processes and scales without prior knowledge, and the need for large, cross-disciplinary research teams that conduct long-term studies. Therefore, macrosystem ecology studies must utilize a unique set of approaches that are capable of encompassing these methodological characteristics and associated challenges. Several case studies demonstrate innovative methods used in current macrosystem ecology studies.
Big Data Analytics in Chemical Engineering.
Chiang, Leo; Lu, Bo; Castillo, Ivan
2017-06-07
Big data analytics is the journey to turn data into insights for more informed business and operational decisions. As the chemical engineering community collects more data (volume) from different sources (variety), this journey becomes more challenging in terms of using the right data and the right tools (analytics) to make the right decisions in real time (velocity). This article highlights recent big data advancements in five industries (chemicals, energy, semiconductors, pharmaceuticals, and food) and then discusses technical, platform, and culture challenges. To reach the next milestone of multiplying these successes at the enterprise level, government, academia, and industry need to collaboratively focus on workforce development and innovation.
Gabbard, Joseph L.; Shukla, Maulik; Sobral, Bruno
2010-01-01
Systems biology and infectious disease (host-pathogen-environment) research and development is becoming increasingly dependent on integrating data from diverse and dynamic sources. Maintaining integrated resources over long periods of time presents distinct challenges. This paper describes experiences and lessons learned from integrating data in two five-year projects focused on pathosystems biology: the Pathosystems Resource Integration Center (PATRIC, http://patric.vbi.vt.edu/), with a goal of developing bioinformatics resources for the research and countermeasures development communities based on genomics data, and the Resource Center for Biodefense Proteomics Research (RCBPR, http://www.proteomicsresource.org/), with a goal of developing resources based on experimental data, such as microarray and proteomics data, from diverse sources and technologies. Challenges include integrating genomic sequence and experimental data, data synchronization, data quality control, and usability engineering. We present examples of a variety of data integration problems drawn from our experiences with PATRIC and RCBPR, as well as open research questions related to long-term sustainability, and describe the next steps to meeting these challenges. Novel contributions of this work include (1) an approach for addressing discrepancies between experiment results and interpreted results and (2) expanding the range of data integration techniques to include usability engineering at the presentation level. PMID:20491070
2013-01-01
Background The objective was to examine the feasibility of using hospital discharge register data for studying fire-related injuries. Methods The Finnish National Hospital Discharge Register (FHDR) was the database used to select relevant hospital discharge data to study usability and data quality issues. Patterns of E-coding were assessed, as well as prominent challenges in defining the incidence of injuries. Additionally, the issue of defining the relevant number of hospital days attributable to injury care was considered. Results Directly after the introduction of the ICD-10 classification system, in 1996, the completeness of E-coding was found to be poor, but it improved dramatically around 2000 and thereafter. The scale of the challenges in defining the incidence of injuries was found to be manageable. In counting the relevant hospital days, psychiatric and long-term care were identified as likely sources of overestimation. Conclusions The FHDR was found to be a feasible data source for studying fire-related injuries so long as potential challenges are acknowledged and taken into account. Hospital discharge data can be a unique and powerful means for injury research, as issues of representativeness and coverage that affect traditional probability samples can frequently be avoided entirely. PMID:23496937
Owen, Julia P; Wipf, David P; Attias, Hagai T; Sekihara, Kensuke; Nagarajan, Srikantan S
2012-03-01
In this paper, we present an extensive performance evaluation of a novel source localization algorithm, Champagne. It is derived in an empirical Bayesian framework that yields sparse solutions to the inverse problem. It is robust to correlated sources and learns the statistics of non-stimulus-evoked activity to suppress the effect of noise and interfering brain activity. We tested Champagne on both simulated and real M/EEG data. The source locations used for the simulated data were chosen to test the performance on challenging source configurations. In simulations, we found that Champagne outperforms the benchmark algorithms in terms of both the accuracy of the source localizations and the correct estimation of source time courses. We also demonstrate that Champagne is more robust to correlated brain activity present in real MEG data and is able to resolve many distinct and functionally relevant brain areas with real MEG and EEG data. Copyright © 2011 Elsevier Inc. All rights reserved.
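For orientation, one standard baseline against which algorithms such as Champagne are evaluated is the regularized minimum-norm inverse. The sketch below uses a synthetic lead field and sources; it shows the generic form of such a linear inverse solution and is not the Champagne algorithm itself.

```python
# Minimal sketch of a regularized minimum-norm inverse, a common baseline for
# M/EEG source localization; not Champagne's empirical Bayesian update.
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_sources = 64, 500
L = rng.standard_normal((n_sensors, n_sources))        # hypothetical lead field
x_true = np.zeros(n_sources)
x_true[[40, 310]] = [1.0, -0.8]                        # two active sources
y = L @ x_true + 0.05 * rng.standard_normal(n_sensors) # noisy sensor data

lam = 1.0                                              # regularization strength
x_hat = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_sensors), y)
print(np.argsort(np.abs(x_hat))[-5:])                  # strongest estimates
```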
Tapuria, Archana; Kalra, Dipak; Kobayashi, Shinji
2013-12-01
The objective is to introduce the 'clinical archetype', which is a formal and agreed way of representing clinical information to ensure interoperability across and within Electronic Health Records (EHRs). The paper also aims to present the challenges of building quality-labeled clinical archetypes and of achieving semantic interoperability between EHRs. Twenty years of international research, various European healthcare informatics projects and the pioneering work of the openEHR Foundation have led to the following results. The requirements for EHR information architectures have been consolidated within ISO 18308 and adopted within the ISO 13606 EHR interoperability standard. However, a generic EHR architecture cannot ensure that the clinical meaning of information from heterogeneous sources can be reliably interpreted by receiving systems and services. Therefore, clinical models called 'clinical archetypes' are required to formalize the representation of clinical information within the EHR. Part 2 of ISO 13606 defines how archetypes should be formally represented. The current challenge is to grow clinical communities to build a library of clinical archetypes, to identify how evidence of best practice and multi-professional clinical consensus should best be combined to define archetypes at the optimal level of granularity and specificity, and to quality-label them for wide adoption. Standardizing clinical terms within EHRs using a clinical terminology such as Systematized Nomenclature of Medicine Clinical Terms is also a challenge. Clinical archetypes would play an important role in achieving semantic interoperability within EHRs. Attempts are being made to explore the design and adoption challenges for clinical archetypes.
Crowd Sourcing to Improve Urban Stormwater Management
NASA Astrophysics Data System (ADS)
Minsker, B. S.; Band, L. E.; Heidari Haratmeh, B.; Law, N. L.; Leonard, L. N.; Rai, A.
2017-12-01
Over half of the world's population currently lives in urban areas, a share predicted to grow to 60 percent by 2030. Urban areas face unprecedented and growing challenges that threaten society's long-term wellbeing, including poverty; chronic health problems; widespread pollution and resource degradation; and increased natural disasters. These are "wicked" problems involving "systems of systems" that require unprecedented information sharing and collaboration across disciplines and organizational boundaries. Cities are recognizing that the increasing stream of data and information ("Big Data"), informatics, and modeling can support rapid advances on these challenges. Nonetheless, information technology solutions can only be effective in addressing these challenges through deeply human and systems perspectives. A stakeholder-driven approach ("crowd sourcing") is needed to develop urban systems that address multiple needs, such as parks that capture and treat stormwater while improving human and ecosystem health and wellbeing. We have developed informatics- and Cloud-based collaborative methods that enable crowd sourcing of green stormwater infrastructure (GSI: rain gardens, bioswales, trees, etc.) design and management. The methods use machine learning, social media data, and interactive design tools (called IDEAS-GI) to identify locations and features of GSI that perform best on a suite of objectives, including life cycle cost, stormwater volume reduction, and air pollution reduction. Insights will be presented on GSI features that best meet stakeholder needs and are therefore most likely to improve human wellbeing and be well maintained.
Bell, Emily; Maxwell, Bruce; McAndrews, Mary Pat; Sadikot, Abbas; Racine, Eric
2011-12-01
Deep brain stimulation (DBS) is an approved neurosurgical intervention for motor disorders such as Parkinson disease. The emergence of psychiatric uses for DBS combined with the fact that it is an invasive and expensive procedure creates important ethical and social challenges in the delivery of care that need further examination. We endeavored to examine health care provider perspectives on ethical and social challenges encountered in DBS. Health care providers working in Canadian DBS surgery programs participated in a semistructured interview to identify and characterize ethical and social challenges of DBS. A content analysis of the interviews was conducted. Several key ethical issues, such as patient screening and resource allocation, were identified by members of neurosurgical teams. Providers described challenges in selecting patients for DBS on the basis of unclear evidence-based guidance regarding behavioral issues or cognitive criteria. Varied contexts of resource allocation, including some very challenging schemas, were also reported. In addition, the management of patients in the community was highlighted as a source of ethical and clinical complexity, given the need for coordinated long-term care. This study provides insights into the complexity of ethical challenges that providers face in the use of DBS across different neurosurgical centers. We propose actions for health care providers for the long-term care and postoperative monitoring of patients with DBS. More data on patient perspectives in DBS would complement the understanding of key challenges, as well as contribute to best practices, for patient selection, management, and resource allocation. Copyright © 2011 Elsevier Inc. All rights reserved.
Zhang, Dongdong; Beck, Benjamin H; Lange, Miles; Zhao, Honggang; Thongda, Wilawan; Ye, Zhi; Li, Chao; Peatman, Eric
2017-01-01
Flavobacterium columnare is the causative agent of columnaris disease and causes tremendous morbidity and mortality in farmed fish globally. Previously, we identified a potential lectin mediator (a rhamnose-binding lectin; RBL1a) of F. columnare adhesion and showed higher RBL1a expression in susceptible channel catfish under basal conditions and following infection. Exposing fish to the carbohydrate ligand l-rhamnose just prior to challenge substantially decreased columnaris mortality and pathogen adherence via the down-regulation of RBL1a. While highly effective in protecting fish from columnaris, l-rhamnose is prohibitively expensive, underscoring the need for alternative cost-effective sources of rhamnose for disease control. One such alternative may be microbially produced glycolipid compounds termed rhamnolipids (RLs), which feature abundant l-rhamnose moieties and are readily available from commercial sources. In the present study, we examined whether commercially available RLs (administered either by immersion or via feed) would function similarly to l-rhamnose in affording host protection against F. columnare. A four-week feeding trial with basal and RL top-coated diets (basal diet + RLs) was conducted in channel catfish fingerlings. Surprisingly, columnaris challenges revealed significantly lower survival over the 10 d challenge period in RL-fed fish when compared with the basal treatment group (p < 0.001). In fish fed RLs, we observed a rapid and large-scale upregulation of RBL1a immediately after challenge combined with a suppression of mucin and lysozyme transcripts. Similarly, fish that were briefly pre-exposed to RLs by immersion and then challenged exhibited lower survival as compared to unexposed fish during a 4 d trial. In conclusion, RLs do not represent an alternative to rhamnose as an experimental treatment for protecting catfish from columnaris mortality. Further research is needed to find other affordable and efficacious alternative sources of l-rhamnose. Copyright © 2016 Elsevier Ltd. All rights reserved.
Endophytic Fungi—Alternative Sources of Cytotoxic Compounds: A Review
Uzma, Fazilath; Mohan, Chakrabhavi D.; Hashem, Abeer; Konappa, Narasimha M.; Rangappa, Shobith; Kamath, Praveen V.; Singh, Bhim P.; Mudili, Venkataramana; Gupta, Vijai K.; Siddaiah, Chandra N.; Chowdappa, Srinivas; Alqarawi, Abdulaziz A.; Abd_Allah, Elsayed F.
2018-01-01
Cancer is a major cause of death worldwide, with an increasing number of cases being reported annually. The elevated rate of mortality necessitates a global challenge to explore newer sources of anticancer drugs. Recent advancements in cancer treatment involve the discovery and development of new and improved chemotherapeutics derived from natural or synthetic sources. Natural sources offer the potential of finding new structural classes with unique bioactivities for cancer therapy. Endophytic fungi represent a rich source of bioactive metabolites that can be manipulated to produce desirable novel analogs for chemotherapy. This review offers a current and integrative account of clinically used anticancer drugs such as taxol, podophyllotoxin, camptothecin, and vinca alkaloids in terms of their mechanism of action, isolation from endophytic fungi and their characterization, yield obtained, and fungal strain improvement strategies. It also covers recent literature on endophytic fungal metabolites from terrestrial, mangrove, and marine sources as potential anticancer agents and emphasizes the findings for cytotoxic bioactive compounds tested against specific cancer cell lines. PMID:29755344
Upper and lower bounds of ground-motion variabilities: implication for source properties
NASA Astrophysics Data System (ADS)
Cotton, Fabrice; Reddy-Kotha, Sreeram; Bora, Sanjay; Bindi, Dino
2017-04-01
One of the key challenges of seismology is to analyse the physical factors that control earthquake and ground-motion variabilities. Such analysis is particularly important for calibrating physics-based simulations and seismic hazard estimations at high frequencies. Within the framework of ground-motion prediction equation (GMPE) development, ground-motion residuals (differences between recorded ground motions and the values predicted by a GMPE) are computed. The exponential growth of seismological near-source records and modern GMPE analysis techniques allow these residuals to be partitioned into between-event and within-event components. In particular, the between-event term quantifies all those repeatable source effects (e.g. related to stress-drop or kappa-source variability) which have not been accounted for by the magnitude-dependent term of the model. In this presentation, we first discuss the between-event variabilities computed both in the Fourier and response-spectra domains, using recent high-quality global accelerometric datasets (e.g. NGA-West2, RESORCE, KiK-net). These analyses lead to the assessment of upper bounds for the ground-motion variability. Then, we compare these upper bounds with lower bounds estimated by analysing seismic sequences which occurred on specific fault systems (e.g., located in central Italy or in Japan). We show that the lower bounds of between-event variabilities are surprisingly large, which indicates a large variability of earthquake dynamic properties even within the same fault system. Finally, these upper and lower bounds of ground-shaking variability are discussed in terms of the variability of earthquake physical properties (e.g., stress-drop and kappa-source).
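The between-event/within-event split described above can be illustrated with a toy moment estimate on synthetic residuals. Production GMPE work uses mixed-effects regression rather than simple event means, and the simple estimator below slightly inflates the between-event term.

```python
# Toy illustration (synthetic residuals): splitting GMPE residuals into
# between-event and within-event parts via event means.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
events = np.repeat(np.arange(20), 15)                  # 20 events, 15 records
event_term = rng.normal(0.0, 0.3, 20)                  # true tau = 0.3
residual = event_term[events] + rng.normal(0.0, 0.5, events.size)  # phi = 0.5
df = pd.DataFrame({"event": events, "residual": residual})

between = df.groupby("event")["residual"].transform("mean")  # event means
within = df["residual"] - between
tau_hat = df.groupby("event")["residual"].mean().std()
phi_hat = within.std()
print(f"tau ~ {tau_hat:.2f}, phi ~ {phi_hat:.2f}")
```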
The Electrification of Energy: Long-Term Trends and Opportunities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsao, Jeffrey Y.; Fouquet, Roger; Schubert, E. Fred
Here, we present and analyze three powerful long-term historical trends in energy, particularly electrical energy, as well as the opportunities and challenges associated with these trends. The first trend is from a world containing a diversity of energy currencies to one whose predominant currency is electricity, driven by electricity's transportability, exchangeability, and steadily decreasing cost. The second trend is from electricity generated from a diversity of sources to electricity generated predominantly by free-fuel sources, driven by their steadily decreasing cost and long-term abundance. These trends necessitate a just-emerging third trend: from a grid in which electricity is transported uni-directionally, traded at near-static prices, and consumed under direct human control; to a grid in which electricity is transported bi-directionally, traded at dynamic prices, and consumed under human-tailored agential control. Early acceptance and appreciation of these trends will accelerate their remaking of humanity's energy landscape into one in which energy is much more affordable, abundant and efficiently deployed than it is today; with major economic, geo-political, and environmental benefits to human society.
Medical concerns for exploration-class missions
NASA Technical Reports Server (NTRS)
Stewart, Donald F.; Lujan, Barbara
1991-01-01
The Space Exploration initiative will challenge life scientists with a diverse set of crew medical risks. The varied sources of this cumulative risk are identified and briefly discussed in terms of risk assessment and preliminary plans for risk management. The roles of Space Station Freedom and other flight programs are discussed in the context of exploration medical objectives. The significant differences between Space Station era (second generation) and exploration medical support systems (third generation) are reviewed.
TaxaGloss - A Glossary and Translation Tool for Biodiversity Studies.
Collin, Rachel; Fredericq, Suzanne; Freshwater, D Wilson; Gilbert, Edward; Madrid, Maycol; Maslakova, Svetlana; Miglietta, Maria Pia; Rocha, Rosana M; Rodríguez, Estefanía; Thacker, Robert W
2016-01-01
Correctly identifying organisms is key to most biological research, and is especially critical in areas of biodiversity and conservation. Yet it remains one of the greatest challenges when studying all but the few well-established model systems. The challenge is due in part to the fact that most species have yet to be described, that taxonomic expertise is vanishing, and that taxonomic information remains relatively inaccessible. Furthermore, identification keys and other taxonomic resources are based on complex, taxon-specific vocabularies used to describe important morphological characters. Using these resources is made difficult by the fact that taxonomic documentation of the world's biodiversity is an international endeavour, and keys and field guides are not always available in the practitioner's native language. To address this challenge, we have developed a publicly available, online, illustrated, multilingual glossary and translation tool for technical taxonomic terms using the Symbiota Software Project biodiversity platform. Illustrations, photographs and translations have been sourced from the global community of taxonomists working with marine invertebrates and seaweeds. These can be used as single-language illustrated glossaries or to make customized translation tables. The glossary has been launched with terms and illustrations for seaweeds, tunicates, sponges, hydrozoans, sea anemones, and nemerteans, and already includes translations into seven languages for some groups. Additional translations and development of terms for more taxa are underway, but the ultimate utility of this tool depends on active participation of the international taxonomic community.
UV fatigue investigations with non-destructive tools in silica
NASA Astrophysics Data System (ADS)
Natoli, Jean-Yves; Beaudier, Alexandre; Wagner, Frank R.
2017-08-01
A fatigue effect is often observed under multiple laser irradiations, especially in the UV. This decrease of the laser-induced damage threshold (LIDT) is a critical parameter for laser sources with high repetition rates and a need for long-term lifetime, as in space applications at 355 nm. A further challenge is to replace excimer lasers with solid-state laser sources, which requires drastically improving the lifetime of optical materials at 266 nm. The main applications of these sources are surface nanostructuring of materials, spectroscopy and medical surgery. In this work we focus on understanding laser-matter interaction at 266 nm in silica in order to predict the lifetime of components, and we study the parameters linked to these lifetimes to provide keys for improvement to material suppliers. In order to study the mechanism involved in the case of multiple irradiations, an interesting approach is to follow the evolution of fluorescence, so as to observe the first stages of material change just before breakdown. We will show that it is sometimes possible to estimate the lifetime of a component from the fluorescence measurement alone, saving time and materials. Moreover, the data from the diagnostics provide relevant information for highlighting "defects" induced by multiple laser irradiations.
Smith, Alexandra E; Slivicki, Richard A; Hohmann, Andrea G; Crystal, Jonathon D
2017-03-01
Chemotherapeutic agents are widely used to treat patients with systemic cancer. The efficacy of these therapies is undermined by their adverse side-effect profiles such as cognitive deficits that have a negative impact on the quality of life of cancer survivors. Cognitive side effects occur across a variety of domains, including memory, executive function, and processing speed. Such impairments are exacerbated under cognitive challenges and a subgroup of patients experience long-term impairments. Episodic memory in rats can be examined using a source memory task. In the current study, rats received paclitaxel, a taxane-derived chemotherapeutic agent, and learning and memory functioning was examined using the source memory task. Treatment with paclitaxel did not impair spatial and episodic memory, and paclitaxel treated rats were not more susceptible to cognitive challenges. Under conditions in which memory was not impaired, paclitaxel treatment impaired learning of new rules, documenting a decreased sensitivity to changes in experimental contingencies. These findings provide new information on the nature of cancer chemotherapy-induced cognitive impairments, particularly regarding the incongruent vulnerability of episodic memory and new learning following treatment with paclitaxel. Copyright © 2016 Elsevier B.V. All rights reserved.
Scarton, Lou Ann; Del Fiol, Guilherme; Oakley-Girvan, Ingrid; Gibson, Bryan; Logan, Robert; Workman, T Elizabeth
2018-01-01
The research examined complementary and alternative medicine (CAM) information-seeking behaviors and preferences from short- to long-term cancer survival, including goals, motivations, and information sources. A mixed-methods approach was used with cancer survivors from the "Assessment of Patients' Experience with Cancer Care" 2004 cohort. Data collection included a mail survey and phone interviews using the critical incident technique (CIT). Seventy survivors from the 2004 study responded to the survey, and eight participated in the CIT interviews. Quantitative results showed that CAM usage did not change significantly between 2004 and 2015. The following themes emerged from the CIT: families' and friends' provision of the initial introduction to a CAM, use of CAM to manage the emotional and psychological impact of cancer, utilization of trained CAM practitioners, and online resources as a prominent source for CAM information. The majority of participants expressed an interest in an online information-sharing portal for CAM. Patients continue to use CAM well into long-term cancer survivorship. Finding trustworthy sources for information on CAM presents many challenges such as reliability of source, conflicting information on efficacy, and unknown interactions with conventional medications. Study participants expressed interest in an online portal to meet these needs through patient testimonials and linkage of claims to the scientific literature. Such a portal could also aid medical librarians and clinicians in locating and evaluating CAM information on behalf of patients.
Scarton, Lou Ann; Del Fiol, Guilherme; Oakley-Girvan, Ingrid; Gibson, Bryan; Logan, Robert; Workman, T. Elizabeth
2018-01-01
Objective The research examined complementary and alternative medicine (CAM) information-seeking behaviors and preferences from short- to long-term cancer survival, including goals, motivations, and information sources. Methods A mixed-methods approach was used with cancer survivors from the “Assessment of Patients’ Experience with Cancer Care” 2004 cohort. Data collection included a mail survey and phone interviews using the critical incident technique (CIT). Results Seventy survivors from the 2004 study responded to the survey, and eight participated in the CIT interviews. Quantitative results showed that CAM usage did not change significantly between 2004 and 2015. The following themes emerged from the CIT: families’ and friends’ provision of the initial introduction to a CAM, use of CAM to manage the emotional and psychological impact of cancer, utilization of trained CAM practitioners, and online resources as a prominent source for CAM information. The majority of participants expressed an interest in an online information-sharing portal for CAM. Conclusion Patients continue to use CAM well into long-term cancer survivorship. Finding trustworthy sources for information on CAM presents many challenges such as reliability of source, conflicting information on efficacy, and unknown interactions with conventional medications. Study participants expressed interest in an online portal to meet these needs through patient testimonials and linkage of claims to the scientific literature. Such a portal could also aid medical librarians and clinicians in locating and evaluating CAM information on behalf of patients. PMID:29339938
Kalra, Dipak; Kobayashi, Shinji
2013-01-01
Objectives The objective is to introduce the 'clinical archetype', which is a formal and agreed way of representing clinical information to ensure interoperability across and within Electronic Health Records (EHRs). The paper also aims to present the challenges of building quality-labeled clinical archetypes and of achieving semantic interoperability between EHRs. Methods Twenty years of international research, various European healthcare informatics projects and the pioneering work of the openEHR Foundation have led to the following results. Results The requirements for EHR information architectures have been consolidated within ISO 18308 and adopted within the ISO 13606 EHR interoperability standard. However, a generic EHR architecture cannot ensure that the clinical meaning of information from heterogeneous sources can be reliably interpreted by receiving systems and services. Therefore, clinical models called 'clinical archetypes' are required to formalize the representation of clinical information within the EHR. Part 2 of ISO 13606 defines how archetypes should be formally represented. The current challenge is to grow clinical communities to build a library of clinical archetypes, to identify how evidence of best practice and multi-professional clinical consensus should best be combined to define archetypes at the optimal level of granularity and specificity, and to quality-label them for wide adoption. Standardizing clinical terms within EHRs using a clinical terminology such as Systematized Nomenclature of Medicine Clinical Terms is also a challenge. Conclusions Clinical archetypes would play an important role in achieving semantic interoperability within EHRs. Attempts are being made to explore the design and adoption challenges for clinical archetypes. PMID:24523993
de Souza, Andrea; Bittker, Joshua; Lahr, David; Brudz, Steve; Chatwin, Simon; Oprea, Tudor I.; Waller, Anna; Yang, Jeremy; Southall, Noel; Guha, Rajarshi; Schurer, Stephan; Vempati, Uma; Southern, Mark R.; Dawson, Eric S.; Clemons, Paul A.; Chung, Thomas D.Y.
2015-01-01
Recent industry-academic partnerships involve collaboration across disciplines, locations, and organizations using publicly funded “open-access” and proprietary commercial data sources. These require effective integration of chemical and biological information from diverse data sources, presenting key informatics, personnel, and organizational challenges. BARD (BioAssay Research Database) was conceived to address these challenges and to serve as a community-wide resource and intuitive web portal for public-sector chemical biology data. Its initial focus is to enable scientists to more effectively use the NIH Roadmap Molecular Libraries Program (MLP) data generated from 3-year pilot and 6-year production phases of the Molecular Libraries Probe Production Centers Network (MLPCN), currently in its final year. BARD evolves the current data standards through structured assay and result annotations that leverage the BioAssay Ontology (BAO) and other industry-standard ontologies, and a core hierarchy of assay definition terms and data standards defined specifically for small-molecule assay data. We have initially focused on migrating the highest-value MLP data into BARD and bringing it up to this new standard. We review the technical and organizational challenges overcome by the inter-disciplinary BARD team, veterans of public and private sector data-integration projects, collaborating to describe (functional specifications), design (technical specifications), and implement this next-generation software solution. PMID:24441647
Mechanisms of motivation–cognition interaction: challenges and opportunities
Krug, Marie K.; Chiew, Kimberly S.; Kool, Wouter; Westbrook, J. Andrew; Clement, Nathan J.; Adcock, R. Alison; Barch, Deanna M.; Botvinick, Matthew M.; Carver, Charles S.; Cools, Roshan; Custers, Ruud; Dickinson, Anthony; Dweck, Carol S.; Fishbach, Ayelet; Gollwitzer, Peter M.; Hess, Thomas M.; Isaacowitz, Derek M.; Mather, Mara; Murayama, Kou; Pessoa, Luiz; Samanez-Larkin, Gregory R.; Somerville, Leah H.
2016-01-01
Recent years have seen a rejuvenation of interest in studies of motivation–cognition interactions arising from many different areas of psychology and neuroscience. The present issue of Cognitive, Affective, & Behavioral Neuroscience provides a sampling of some of the latest research from a number of these different areas. In this introductory article, we provide an overview of the current state of the field, in terms of key research developments and candidate neural mechanisms receiving focused investigation as potential sources of motivation–cognition interaction. However, our primary goal is conceptual: to highlight the distinct perspectives taken by different research areas, in terms of how motivation is defined, the relevant dimensions and dissociations that are emphasized, and the theoretical questions being targeted. Together, these distinctions present both challenges and opportunities for efforts aiming toward a more unified and cross-disciplinary approach. We identify a set of pressing research questions calling for this sort of cross-disciplinary approach, with the explicit goal of encouraging integrative and collaborative investigations directed toward them. PMID:24920442
Observed ground-motion variabilities and implication for source properties
NASA Astrophysics Data System (ADS)
Cotton, F.; Bora, S. S.; Bindi, D.; Specht, S.; Drouet, S.; Derras, B.; Pina-Valdes, J.
2016-12-01
One of the key challenges of seismology is to calibrate and analyse the physical factors that control earthquake and ground-motion variabilities. Within the framework of empirical ground-motion prediction equation (GMPE) development, ground-motion residuals (differences between recorded ground motions and the values predicted by a GMPE) are computed. The exponential growth of seismological near-field records and modern regression algorithms allow these residuals to be decomposed into between-event and within-event components. The between-event term quantifies all the residual effects of the source (e.g. stress-drops) which are not accounted for by the magnitude term, the only source parameter of the model. Between-event residuals provide a new and rather robust way to analyse the physical factors that control earthquake source properties and their associated variabilities. We will first show the correlation between classical stress-drops and between-event residuals. We will also explain why between-event residuals may be a more robust way (compared to classical stress-drop analysis) to analyse earthquake source properties. We will then calibrate between-event variabilities using recent high-quality global accelerometric datasets (NGA-West2, RESORCE) and datasets from recent earthquake sequences (L'Aquila, Iquique, Kumamoto). The obtained between-event variabilities will be used to evaluate the variability of earthquake stress-drops, but also the variability of source properties which cannot be explained by classical Brune stress-drop variations. We will finally use the between-event residual analysis to discuss regional variations of source properties, differences between aftershocks and mainshocks, and potential magnitude dependencies of source characteristics.
Advances in audio source separation and multisource audio content retrieval
NASA Astrophysics Data System (ADS)
Vincent, Emmanuel
2012-06-01
Audio source separation aims to extract the signals of individual sound sources from a given recording. In this paper, we review three recent advances which improve the robustness of source separation in real-world challenging scenarios and enable its use for multisource content retrieval tasks, such as automatic speech recognition (ASR) or acoustic event detection (AED) in noisy environments. We present a Flexible Audio Source Separation Toolkit (FASST) and discuss its advantages compared to earlier approaches such as independent component analysis (ICA) and sparse component analysis (SCA). We explain how cues as diverse as harmonicity, spectral envelope, temporal fine structure or spatial location can be jointly exploited by this toolkit. We subsequently present the uncertainty decoding (UD) framework for the integration of audio source separation and audio content retrieval. We show how the uncertainty about the separated source signals can be accurately estimated and propagated to the features. Finally, we explain how this uncertainty can be efficiently exploited by a classifier, both at the training and the decoding stage. We illustrate the resulting performance improvements in terms of speech separation quality and speaker recognition accuracy.
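The ICA baseline named in the abstract can be demonstrated in a few lines. The sketch below unmixes two synthetic sources from a two-channel instantaneous mixture with scikit-learn's FastICA; it is only the baseline, not FASST, which additionally handles convolutive mixtures and richer spectral cues.

```python
# Minimal sketch of the ICA baseline mentioned above (not FASST itself):
# unmixing two synthetic sources from a two-channel instantaneous mixture.
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0.0, 1.0, 8000)
s1 = np.sin(2 * np.pi * 440 * t)             # tonal source
s2 = np.sign(np.sin(2 * np.pi * 3 * t))      # abrupt "event" source
S = np.c_[s1, s2]
A = np.array([[1.0, 0.5], [0.4, 1.0]])       # hypothetical mixing matrix
X = S @ A.T                                  # two observed channels

S_hat = FastICA(n_components=2, random_state=0).fit_transform(X)
print(S_hat.shape)  # recovered sources, up to permutation and scaling
```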
Room temperature single photon source using fiber-integrated hexagonal boron nitride
NASA Astrophysics Data System (ADS)
Vogl, Tobias; Lu, Yuerui; Lam, Ping Koy
2017-07-01
Single photons are a key resource for quantum optics and optical quantum information processing. The integration of scalable room temperature quantum emitters into photonic circuits remains a technical challenge. Here we utilize a defect center in hexagonal boron nitride (hBN), attached by van der Waals forces onto a multimode fiber, as a single photon source. We perform an optical characterization of the source in terms of spectrum, state lifetime, power saturation and photostability. A special feature of our source is that it allows for easy switching between fiber-coupled and free-space single photon generation modes. In order to prove the quantum nature of the emission we measure the second-order correlation function g^{(2)}(τ). For both fiber-coupled and free-space emission, g^{(2)}(τ) dips below 0.5, indicating operation in the single photon regime. The results so far demonstrate the feasibility of 2D material single photon sources for scalable photonic quantum information processing.
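The g^{(2)}(τ) measurement is, in practice, a histogram of coincidence delays between two detectors. The sketch below estimates it from synthetic, Poissonian timestamps, which give g^{(2)} ≈ 1 at all delays; a true single-photon emitter would show a dip below 0.5 at τ = 0.

```python
# Minimal sketch: estimating g2(tau) by coincidence counting between the two
# detectors of a Hanbury Brown-Twiss setup (synthetic Poissonian timestamps).
import numpy as np

rng = np.random.default_rng(2)
t1 = np.sort(rng.uniform(0.0, 1.0, 20000))   # detector 1 click times (s)
t2 = np.sort(rng.uniform(0.0, 1.0, 20000))   # detector 2 click times (s)

bin_w, max_tau = 1e-6, 50e-6                 # 1 us bins over +/- 50 us
delays = []
j = 0
for t in t1:
    while j < len(t2) and t2[j] < t - max_tau:   # skip clicks too far in past
        j += 1
    k = j
    while k < len(t2) and t2[k] <= t + max_tau:  # record nearby delays
        delays.append(t2[k] - t)
        k += 1

hist, _ = np.histogram(delays, bins=np.arange(-max_tau, max_tau + bin_w, bin_w))
g2 = hist / hist.mean()                      # normalize to the flat background
print(g2[len(g2) // 2])                      # value near tau = 0 (~1 here)
```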
Sabio Paz, Verónica; Panattieri, Néstor D; Cristina Godio, Farmacéutica; Ratto, María E; Arpí, Lucrecia; Dackiewicz, Nora
2015-10-01
Patient safety and quality of care have become a challenge for health systems. Health care is an increasingly complex and risky activity, as it represents a combination of human, technological and organizational processes. It is necessary, therefore, to take effective actions to reduce adverse events and mitigate their impact. This glossary is a local adaptation of key terms and concepts from international bibliographic sources. The aim is to provide a common language for assessing patient safety processes and comparing them.
Matrix effect and recovery terminology issues in regulated drug bioanalysis.
Huang, Yong; Shi, Robert; Gee, Winnie; Bonderud, Richard
2012-02-01
Understanding the meaning of the terms used in the bioanalytical method validation guidance is essential for practitioners to implement best practice. However, terms that have several meanings or that have different interpretations exist within bioanalysis, and this may give rise to differing practices. In this perspective we discuss an important but often confusing term - 'matrix effect (ME)' - in regulated drug bioanalysis. The ME can be interpreted as either the ionization change or the measurement bias of the method caused by the nonanalyte matrix. The ME definition dilemma makes its evaluation challenging. The matrix factor is currently used as a standard method for evaluation of ionization changes caused by the matrix in MS-based methods. Standard additions to pre-extraction samples have been suggested to evaluate the overall effects of a matrix from different sources on the analytical system, because it covers ionization variation and extraction recovery variation. We also provide our personal views on the term 'recovery'.
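The matrix factor mentioned above is a simple ratio; the sketch below spells out the arithmetic with hypothetical peak areas (the numbers are invented for illustration).

```python
# Minimal numeric sketch (hypothetical peak areas): the matrix factor (MF) as
# commonly computed in MS-based bioanalysis, i.e. analyte response in
# post-extraction spiked matrix divided by response in neat solution.
peak_area_matrix = 9.1e5    # analyte spiked into blank matrix extract
peak_area_neat = 1.0e6      # analyte in neat (matrix-free) solution
mf = peak_area_matrix / peak_area_neat
print(f"Matrix factor = {mf:.2f}")  # < 1 suggests ion suppression, > 1 enhancement
```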
Pipelining in a changing competitive environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, E.G.; Wishart, D.M.
1996-12-31
The changing competitive environment for the pipeline industry presents a broad spectrum of new challenges and opportunities: international cooperation; globalization of opportunities, organizations and competition; an integrated systems approach to system configuration, financing, contracting strategy, materials sourcing, and operations; cutting-edge and emerging technologies; adherence to high standards of environmental protection; an emphasis on safety; innovative approaches to project financing; and advances in technology and programs to maintain the long-term, cost-effective integrity of operating pipeline systems. These challenges and opportunities are partially a result of the increasingly competitive nature of pipeline development and the public's intolerance of incidents of pipeline failure. A creative systems approach to these challenges is often the key to a project moving ahead. This usually encompasses collaboration among users of the pipeline, pipeline owners and operators, international engineering and construction companies, equipment and materials suppliers, in-country engineers and constructors, international lending agencies and financial institutions.
Fish-Eye Observing with Phased Array Radio Telescopes
NASA Astrophysics Data System (ADS)
Wijnholds, S. J.
The radio astronomical community is currently developing and building several new radio telescopes based on phased array technology. These telescopes provide a large field-of-view that may in principle span a full hemisphere. This makes calibration and imaging very challenging tasks due to complex source structures and direction-dependent radio wave propagation effects. In this thesis, calibration and imaging methods are developed based on least squares estimation of instrument and source parameters. Monte Carlo simulations and actual observations with several prototypes show that this model-based approach provides statistically and computationally efficient solutions. The error analysis provides a rigorous mathematical framework to assess the imaging performance of current and future radio telescopes in terms of the effective noise, which is the combined effect of propagated calibration errors, noise in the data and source confusion.
Personality disorder assessment: the challenge of construct validity.
Clark, L A; Livesley, W J; Morey, L
1997-01-01
We begin with a review of the data that challenge the current categorical system for classifying personality disorder, focusing on the central assessment issues of convergent and discriminant validity. These data indicate that while there is room for improvement in assessment, even greater change is needed in conceptualization than in instrumentation. Accordingly, we then refocus the categorical-dimensional debate in assessment terms, and place it in the broader context of such issues as the hierarchical structure of personality, overlap and distinctions between normal and abnormal personality, sources of information in personality disorder assessment, and overlap and discrimination of trait and state assessment. We conclude that more complex conceptual models that can incorporate both biological and environmental influences on the development of adaptive and maladaptive personality are needed.
Production and use of estimates for monitoring progress in the health sector: the case of Bangladesh
Ahsan, Karar Zunaid; Tahsina, Tazeen; Iqbal, Afrin; Ali, Nazia Binte; Chowdhury, Suman Kanti; Huda, Tanvir M.; Arifeen, Shams El
2017-01-01
Background: In order to support the progress towards the post-2015 development agenda for the health sector, the importance of high-quality and timely estimates has become evident both globally and at the country level. Objective and Methods: Based on a desk review, key informant interviews and expert panel discussions, the paper critically reviews health estimates from both local sources (i.e. nationally generated information by the government and other agencies) and global sources (mostly modeled or interpolated estimates developed by international organizations based on different sources of information), and assesses the country capacity and monitoring strategies to meet the increasing data demand in the coming years. Primarily, this paper provides a situation analysis of Bangladesh in terms of production and use of health estimates for monitoring progress towards the post-2015 development goals for the health sector. Results: The analysis reveals that Bangladesh is data rich, particularly from household surveys and health facility assessments. Practices of data utilization also exist, with wide acceptability of survey results for informing policy, programme review and course corrections. Despite high data availability from multiple sources, the country capacity for providing regular updates of major global health estimates/indicators remains low. Major challenges also include limited human resources, capacity to generate quality data and multiplicity of data sources, where discrepancy and lack of linkages among different data sources (among local sources and between local and global estimates) present emerging challenges for interpretation of the resulting estimates. Conclusion: To fulfill the increased data requirement for the post-2015 era, Bangladesh needs to invest more in electronic data capture and routine health information systems. Streamlining of data sources, integration of parallel information systems into a common platform, and capacity building for data generation and analysis are recommended as priority actions for Bangladesh in the coming years. In addition to automation of routine health information systems, establishing an Indicator Reference Group for Bangladesh to analyze data, building country capacity in data quality assessment and triangulation, and feeding into global, inter-agency estimates for better reporting would address a number of the mentioned challenges in the short and long run. PMID:28532305
The Mock LISA Data Challenge Round 3: New and Improved Sources
NASA Technical Reports Server (NTRS)
Baker, John
2008-01-01
The Mock LISA Data Challenges are a program to demonstrate and encourage the development of data-analysis capabilities for LISA. Each round of challenges consists of several data sets containing simulated instrument noise and gravitational waves from sources of undisclosed parameters. Participants are asked to analyze the data sets and report the maximum information they can infer about the source parameters. The challenges are being released in rounds of increasing complexity and realism. Challenge 3, currently in progress, brings new source classes, now including cosmic-string cusps and primordial stochastic backgrounds, and more realistic signal models for supermassive black-hole inspirals and galactic double white dwarf binaries.
Simulating the Heliosphere with Kinetic Hydrogen and Dynamic MHD Source Terms
Heerikhuisen, Jacob; Pogorelov, Nikolai; Zank, Gary
2013-04-01
The interaction between the ionized plasma of the solar wind (SW) emanating from the sun and the partially ionized plasma of the local interstellar medium (LISM) creates the heliosphere. The heliospheric interface is characterized by the tangential discontinuity known as the heliopause that separates the SW and LISM plasmas, and a termination shock on the SW side along with a possible bow shock on the LISM side. Neutral Hydrogen of interstellar origin plays a critical role in shaping the heliospheric interface, since it freely traverses the heliopause. Charge-exchange between H-atoms and plasma protons couples the ions and neutrals, but the mean free paths are large, resulting in non-equilibrated energetic ion and neutral components. In our model, source terms for the MHD equations are generated using a kinetic approach for hydrogen, and the key computational challenge is to resolve these sources with sufficient statistics. For steady-state simulations, statistics can accumulate over arbitrarily long time intervals. In this paper we discuss an approach for improving the statistics in time-dependent calculations, and present results from simulations of the heliosphere where the SW conditions at the inner boundary of the computation vary according to an idealized solar cycle.
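One generic way to reduce Monte Carlo noise in a time-dependent run, sketched below, is to exponentially time-average the sampled source term in each cell. This is an illustrative assumption, not necessarily the scheme the authors use, and it trades variance for a lag in the response to solar-cycle variations.

```python
# Minimal sketch (an assumption, not the authors' documented scheme):
# exponential time-averaging of a noisy Monte Carlo source term per grid cell.
import numpy as np

rng = np.random.default_rng(4)
n_cells, alpha = 1000, 0.05            # grid cells; smoothing weight per step
smoothed = np.zeros(n_cells)
for step in range(200):
    sampled = 1.0 + 0.5 * rng.standard_normal(n_cells)  # noisy MC estimate
    smoothed = (1.0 - alpha) * smoothed + alpha * sampled
print(f"single-step sd ~ 0.50, smoothed sd ~ {smoothed.std():.2f}")
```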
Towards sustainable and renewable systems for electrochemical energy storage.
Tarascon, Jean-Marie
2008-01-01
Renewable energy sources and electric automotive transportation are popular topics in our belated energy-conscious society, placing electrochemical energy management as one of the major technological developments for this new century. Besides efficiency, any new storage technologies will have to provide advantages in terms of cost and environmental footprint and thus rely on sustainable materials that can be processed at low temperature. To meet such challenges future devices will require inspiration from living organisms and rely on either bio-inspired or biomimetic approaches.
Who owns the long term? Perspectives from global business leaders.
Lévy, Maurice; Eskew, Mike; Bernotat, Wulf H; Barner, Marianne
2007-01-01
Day-to-day management is challenging enough for CEOs. How do they manage for the long term as well? We posed that question to four top executives of global companies. According to Maurice Levy, chairman and CEO of Publicis Groupe, building the future is really about building the present and keeping close to the front line--those who deal with your customers and markets. He also attributes his company's success in large part to knowing when to take action: In a market where clients' needs steer your long-term future, timing is everything. UPS Chairman and CEO Mike Eskew emphasizes staying true to your vision and values over the long run, despite meeting obstacles along the way. It took more than 20 years, and many lessons learned, to produce consistent profits in what is today the company's fastest-growing and most profitable business: international small packages. Wulf H. Bernotat, CEO of E.ON, examines the challenges facing business leaders and politicians as they try to balance energy needs against potential environmental damage. He calls for educating people about consumption and waste, and he maintains that a diverse and reliable mix of energy sources is the only way to ensure a secure supply while protecting our environment. Finally, Marianne Barner, the director of corporate communications and ombudsman for children's issues at IKEA, discusses how the company is taking steps to improve the environment and be otherwise socially responsible. For example, it's partnering with NGOs to address child labor issues and, on its own, is working to help mitigate climate change. IKEA's goals include using renewable sources for 100% of its energy needs and cutting its overall energy consumption by 25%.
de Souza, Andrea; Bittker, Joshua A; Lahr, David L; Brudz, Steve; Chatwin, Simon; Oprea, Tudor I; Waller, Anna; Yang, Jeremy J; Southall, Noel; Guha, Rajarshi; Schürer, Stephan C; Vempati, Uma D; Southern, Mark R; Dawson, Eric S; Clemons, Paul A; Chung, Thomas D Y
2014-06-01
Recent industry-academic partnerships involve collaboration among disciplines, locations, and organizations using publicly funded "open-access" and proprietary commercial data sources. These require the effective integration of chemical and biological information from diverse data sources, which presents key informatics, personnel, and organizational challenges. The BioAssay Research Database (BARD) was conceived to address these challenges and serve as a community-wide resource and intuitive web portal for public-sector chemical-biology data. Its initial focus is to enable scientists to more effectively use the National Institutes of Health Roadmap Molecular Libraries Program (MLP) data generated from the 3-year pilot and 6-year production phases of the Molecular Libraries Probe Production Centers Network (MLPCN), which is currently in its final year. BARD evolves the current data standards through structured assay and result annotations that leverage BioAssay Ontology and other industry-standard ontologies, and a core hierarchy of assay definition terms and data standards defined specifically for small-molecule assay data. We initially focused on migrating the highest-value MLP data into BARD and bringing it up to this new standard. We review the technical and organizational challenges overcome by the interdisciplinary BARD team, veterans of public- and private-sector data-integration projects, who are collaborating to describe (functional specifications), design (technical specifications), and implement this next-generation software solution. © 2014 Society for Laboratory Automation and Screening.
Seltmann, Katja C.; Pénzes, Zsolt; Yoder, Matthew J.; Bertone, Matthew A.; Deans, Andrew R.
2013-01-01
Hymenoptera, the insect order that includes sawflies, bees, wasps, and ants, exhibits an incredible diversity of phenotypes, with over 145,000 species described in a corpus of textual knowledge since Carolus Linnaeus. In the absence of specialized training, often spanning decades, however, these articles can be challenging to decipher. Much of the vocabulary is domain-specific (e.g., Hymenoptera biology), historically without a comprehensive glossary, and contains much homonymous and synonymous terminology. The Hymenoptera Anatomy Ontology was developed to surmount this challenge and to aid future communication related to hymenopteran anatomy, as well as provide support for domain experts so they may actively benefit from the anatomy ontology development. As part of HAO development, an active learning, dictionary-based, natural language recognition tool was implemented to facilitate Hymenoptera anatomy term discovery in literature. We present this tool, referred to as the ‘Proofer’, as part of an iterative approach to growing phenotype-relevant ontologies, regardless of domain. The process of ontology development results in a critical mass of terms that is applied as a filter to the source collection of articles in order to reveal term occurrence and biases in natural language species descriptions. Our results indicate that taxonomists use domain-specific terminology that follows taxonomic specialization, particularly at superfamily and family level groupings and that the developed Proofer tool is effective for term discovery, facilitating ontology construction. PMID:23441153
Seltmann, Katja C; Pénzes, Zsolt; Yoder, Matthew J; Bertone, Matthew A; Deans, Andrew R
2013-01-01
Hymenoptera, the insect order that includes sawflies, bees, wasps, and ants, exhibits an incredible diversity of phenotypes, with over 145,000 species described in a corpus of textual knowledge since Carolus Linnaeus. In the absence of specialized training, often spanning decades, however, these articles can be challenging to decipher. Much of the vocabulary is domain-specific (e.g., Hymenoptera biology), historically without a comprehensive glossary, and contains much homonymous and synonymous terminology. The Hymenoptera Anatomy Ontology was developed to surmount this challenge and to aid future communication related to hymenopteran anatomy, as well as provide support for domain experts so they may actively benefit from the anatomy ontology development. As part of HAO development, an active learning, dictionary-based, natural language recognition tool was implemented to facilitate Hymenoptera anatomy term discovery in literature. We present this tool, referred to as the 'Proofer', as part of an iterative approach to growing phenotype-relevant ontologies, regardless of domain. The process of ontology development results in a critical mass of terms that is applied as a filter to the source collection of articles in order to reveal term occurrence and biases in natural language species descriptions. Our results indicate that taxonomists use domain-specific terminology that follows taxonomic specialization, particularly at superfamily and family level groupings and that the developed Proofer tool is effective for term discovery, facilitating ontology construction.
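At its core, the dictionary-based recognition described above checks description text against a controlled term list. The sketch below uses a hypothetical four-term mini-glossary; the real Proofer adds an active-learning loop so that near-misses feed back into ontology development.

```python
# Minimal sketch (hypothetical mini-glossary) of dictionary-based term
# recognition: flag known anatomy terms occurring in a species description.
import re

glossary = {"flagellomere", "mesoscutum", "notaulus", "tegula"}
text = ("Mesoscutum smooth; notauli absent; tegula yellow; "
        "first flagellomere longer than pedicel.")

tokens = set(re.findall(r"[a-z]+", text.lower()))
print(sorted(tokens & glossary))   # ['flagellomere', 'mesoscutum', 'tegula']
# Note: the plural 'notauli' is missed by exact matching; handling inflected
# and synonymous forms is one reason an active-learning loop is useful.
```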
NASA Astrophysics Data System (ADS)
Braban, Christine; Tang, Sim; Poskitt, Janet; Van Dijk, Netty; Leeson, Sarah; Dragosits, Ulli; Hutchings, Torben; Twigg, Marsailidh; Di Marco, Chiara; Langford, Ben; Tremper, Anja; Nemitz, Eiko; Sutton, Mark
2017-04-01
Emissions of ammonia affect both rural and urban air quality, primarily via reaction of ammonia in the atmosphere to form secondary ammonium salts in particulate matter (PM). Urban ammonia emissions come from a variety of sources including biological decomposition, human waste, industrial processes and combustion engines. In the UK, the only long-term urban ammonia measurement is made at a UK National Ammonia Monitoring Network site at London Cromwell Road, recording monthly average concentrations. Short-term measurements have also been made in the past decade at Marylebone Road, North Kensington and on the BT Tower. Cromwell Road is a kerbside site operational since 1999. The Cromwell Road data indicate that ammonia concentrations may have been increasing since 2010-2012 after a long period of decrease. Data from the National Atmospheric Emissions Inventory indicate increasing ammonia emissions from diesel fleet exhausts over this period but an overall net decrease in ammonia emissions. With changes in engine and exhaust technology to minimise pollutant emissions, and given the importance of ammonia as a precursor gas for secondary PM, there is a challenge to understand urban ammonia concentrations and their subsequent impacts on urban air quality. In this paper the long-term measurements are assessed in conjunction with the short-term measurements. The challenges of assessing the relative importance of local versus long-range ammonia emissions are discussed.
The Grand Challenge of Basin-Scale Groundwater Quality Management Modelling
NASA Astrophysics Data System (ADS)
Fogg, G. E.
2017-12-01
The last 50+ years of agricultural, urban and industrial land and water use practices have accelerated the degradation of groundwater quality in the upper portions of many major aquifer systems upon which much of the world relies for water supply. In the deepest and most extensive systems (e.g., sedimentary basins) that typically have the largest groundwater production rates and hold fresh groundwaters on decadal to millennial time scales, most of the groundwater is not yet contaminated. Predicting the long-term future groundwater quality in such basins is a grand scientific challenge. Moreover, determining what changes in land and water use practices would avert future, irreversible degradation of these massive freshwater stores is a grand challenge both scientifically and societally. It is naïve to think that the problem can be solved by eliminating or reducing enough of the contaminant sources, for human exploitation of land and water resources will likely always result in some contamination. The key lies in both reducing the contaminant sources and more proactively managing recharge in terms of both quantity and quality, such that the net influx of contaminants is sufficiently moderate and appropriately distributed in space and time to reverse ongoing groundwater quality degradation. Just as sustainable groundwater quantity management is greatly facilitated by groundwater flow management models, sustainable groundwater quality management will require the use of groundwater quality management models. These are a new genre of hydrologic model that does not yet exist, partly because of the lack of modeling tools, and of the supporting research, needed to model non-reactive as well as reactive transport on large space and time scales. It is essential that the contaminant hydrogeology community, which has heretofore focused almost entirely on point-source, plume-scale problems, direct its efforts toward the development of process-based transport modeling tools and analyses capable of appropriately upscaling advection-dispersion and reactions at the basin scale (10^2 km). A road map for research and development in groundwater quality management modeling and its application toward securing future groundwater resources will be discussed.
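The elementary building block such basin-scale models must upscale is the advection-dispersion equation. The sketch below steps a 1-D explicit scheme with hypothetical parameters and a continuous contaminant source at the inflow boundary; real basin-scale work requires 3-D heterogeneous fields, reactions, and far more careful numerics.

```python
# Minimal sketch (hypothetical parameters): explicit 1-D advection-dispersion
# with a continuous contaminant source held at the inflow boundary.
import numpy as np

nx, dx, dt = 200, 10.0, 1.0        # cells, cell size (m), time step (d)
v, D = 0.5, 5.0                    # velocity (m/d), dispersion (m^2/d)
c = np.zeros(nx)
c[0] = 1.0                         # relative source concentration

for _ in range(500):
    adv = -v * np.diff(c, prepend=c[0]) / dx                        # upwind advection
    disp = D * np.diff(c, n=2, prepend=c[0], append=c[-1]) / dx**2  # dispersion
    c = np.clip(c + dt * (adv + disp), 0.0, None)
    c[0] = 1.0                     # hold the boundary source
print(np.round(c[:10], 3))         # plume front advancing downgradient
```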
Source mechanisms of volcanic tsunamis.
Paris, Raphaël
2015-10-28
Volcanic tsunamis are generated by a variety of mechanisms, including volcano-tectonic earthquakes, slope instabilities, pyroclastic flows, underwater explosions, shock waves and caldera collapse. In this review, we focus on the lessons that can be learnt from past events and address the influence of parameters such as volume flux of mass flows, explosion energy or duration of caldera collapse on tsunami generation. The diversity of waves in terms of amplitude, period, form, dispersion, etc. poses difficulties for integration and harmonization of sources to be used for numerical models and probabilistic tsunami hazard maps. In many cases, monitoring and warning of volcanic tsunamis remain challenging (further technical and scientific developments being necessary) and must be coupled with policies of population preparedness. © 2015 The Author(s).
Helicopter external noise prediction and reduction
NASA Astrophysics Data System (ADS)
Lewy, Serge
Helicopter external noise is a major challenge for manufacturers, in both the civil and military domains. The strongest acoustic sources are due to the main rotor. Two flight conditions are analyzed in detail because the radiated sound is then very loud and very impulsive: (1) high-speed flight, with large thickness and shear terms on the advancing-blade side; and (2) descent flight, with blade-vortex interaction at certain rates of descent. In both cases, computational results were obtained and tests on new blade designs have been conducted in wind tunnels. These studies prove that large noise reductions can be achieved. It is shown in conclusion, however, that the other acoustic sources (tail rotor, turboshaft engines) must not be neglected in defining a quiet helicopter.
NASA Astrophysics Data System (ADS)
Gica, E.
2016-12-01
The Short-term Inundation Forecasting for Tsunamis (SIFT) tool, developed by NOAA Center for Tsunami Research (NCTR) at the Pacific Marine Environmental Laboratory (PMEL), is used in forecast operations at the Tsunami Warning Centers in Alaska and Hawaii. The SIFT tool relies on a pre-computed tsunami propagation database, real-time DART buoy data, and an inversion algorithm to define the tsunami source. The tsunami propagation database is composed of 50×100 km unit sources, simulated basin-wide for at least 24 hours. Different combinations of unit sources, DART buoys, and length of real-time DART buoy data can generate a wide range of results within the defined tsunami source. For an inexperienced SIFT user, the primary challenge is to determine which solution, among multiple solutions for a single tsunami event, would provide the best forecast in real time. This study investigates how the use of different tsunami sources affects simulated tsunamis at tide gauge locations. Using the tide gauge at Hilo, Hawaii, a total of 50 possible solutions for the 2011 Tohoku tsunami are considered. Maximum tsunami wave amplitude and root mean square error results are used to compare tide gauge data and the simulated tsunami time series. Results of this study will facilitate SIFT users' efforts to determine if the simulated tide gauge tsunami time series from a specific tsunami source solution would be within the range of possible solutions. This study will serve as the basis for investigating more historical tsunami events and tide gauge locations.
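As a rough sketch of the comparison metrics described above (illustrative code, not part of the SIFT software; all names are hypothetical), the maximum-amplitude mismatch and root mean square error between an observed and a simulated tide-gauge series could be computed as follows:

```python
import numpy as np

def compare_tsunami_series(observed, simulated):
    """Compare an observed tide-gauge series with a simulated one.

    Both inputs are 1-D arrays sampled at the same times; returns the
    difference in maximum wave amplitude and the RMSE of the series.
    """
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    amp_error = simulated.max() - observed.max()           # max-amplitude mismatch
    rmse = np.sqrt(np.mean((simulated - observed) ** 2))   # overall fit
    return amp_error, rmse

# Example: rank 50 candidate source solutions by RMSE at the Hilo gauge,
# where `solutions` holds the 50 simulated series and `gauge` the record.
# best = min(solutions, key=lambda s: compare_tsunami_series(gauge, s)[1])
```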
Communication Challenges in Neonatal Encephalopathy.
Lemmon, Monica E; Donohue, Pamela K; Parkinson, Charlamaine; Northington, Frances J; Boss, Renee D
2016-09-01
Families must process complex information related to neonatal encephalopathy and therapeutic hypothermia. In this mixed methods study, semi-structured interviews were performed with parents whose infants were enrolled in an existing longitudinal cohort study of therapeutic hypothermia between 2011 and 2014. Thematic saturation was achieved after 20 interviews. Parental experience of communicating with clinicians was characterized by 3 principal themes. Theme 1 highlighted that a fragmented communication process mirrored the chaotic maternal and neonatal course. Parents often received key information about neonatal encephalopathy and therapeutic hypothermia from maternal clinicians. Infant medical information was often given to 1 family member (60%), who felt burdened by the responsibility to relay that information to others. Families universally valued the role of the bedside nurse, who was perceived as the primary source of communication for most (75%) families. Theme 2 encompassed the challenges of discussing the complex therapy of therapeutic hypothermia: families appreciated clinicians who used lay language and provided written material, and they often felt overwhelmed by technical information that made it hard to understand the "big picture" of their infant's medical course. Theme 3 involved the uncertain prognosis after neonatal encephalopathy. Parents appreciated specific expectations about their infant's long-term development, and experienced long-term distress about prognostic uncertainty. Communicating complex and large volumes of information in the midst of perinatal crisis presents inherent challenges for both clinicians and families. We identified an actionable set of communication challenges that can be addressed with targeted interventions. Copyright © 2016 by the American Academy of Pediatrics.
NASA Astrophysics Data System (ADS)
Perdigão, R. A. P.
2017-12-01
Predictability assessments are traditionally made on a case-by-case basis, often by running the particular model of interest with randomly perturbed initial/boundary conditions and parameters, producing computationally expensive ensembles. These approaches provide a lumped statistical view of uncertainty evolution, without eliciting the fundamental processes and interactions at play in the uncertainty dynamics. In order to address these limitations, we introduce a systematic dynamical framework for predictability assessment and forecast, analytically deriving governing equations of predictability in terms of the fundamental architecture of dynamical systems, independent of any particular problem under consideration. The framework further relates multiple uncertainty sources along with their coevolutionary interplay, enabling a comprehensive and explicit treatment of uncertainty dynamics over time, without requiring the actual model to be run. In doing so, computational resources are freed and a quick and effective a priori systematic evaluation is made of predictability evolution and its challenges, including aspects of the model architecture and intervening variables that may require optimization ahead of initiating any model runs. It further brings out universal dynamic features of the error dynamics elusive to any case-specific treatment, ultimately shedding fundamental light on the challenging issue of predictability. The formulated approach, framed with broad mathematical-physics generality in mind, is then implemented in dynamic models of nonlinear geophysical systems of various degrees of complexity, in order to evaluate their limitations and provide informed assistance on how to optimize their design and improve their predictability in fundamental dynamical terms.
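A standard entry point to such governing equations (a textbook illustration, not the author's specific derivation) is the tangent-linear evolution of a small error \boldsymbol{\varepsilon} along a model trajectory \mathbf{x}(t):

\[
\dot{\boldsymbol{\varepsilon}} = \mathbf{J}(\mathbf{x}(t))\,\boldsymbol{\varepsilon}, \qquad \|\boldsymbol{\varepsilon}(t)\| \sim \|\boldsymbol{\varepsilon}(0)\|\,e^{\lambda t},
\]

where \mathbf{J} is the Jacobian of the model dynamics and \lambda the leading Lyapunov exponent; a positive \lambda fixes a predictability horizon that can be read off analytically, without running ensembles.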
“Gestaltomics”: Systems Biology Schemes for the Study of Neuropsychiatric Diseases
Gutierrez Najera, Nora A.; Resendis-Antonio, Osbaldo; Nicolini, Humberto
2017-01-01
Different sources of biological information about what defines a behavioral phenotype are difficult to unify into a single entity, which must be more than the arithmetic sum of its individual parts. In this sense, the challenge of Systems Biology for understanding the “psychiatric phenotype” is to provide an improved vision of the shape of the phenotype as it is visualized by “Gestalt” psychology, whose fundamental axiom is that the observed phenotype (behavior or mental disorder) will be the result of the integrative composition of every part. Therefore, we propose “Gestaltomics” as a Systems Biology term for integrating data coming from different sources of information (such as the genome, transcriptome, proteome, epigenome, metabolome, phenome, and microbiome). In addition to this biological complexity, the mind is integrated through multiple brain functions that receive and process complex information through channels and perception networks (e.g., sight, hearing, smell, memory, and attention) that in turn are programmed by genes and influenced by environmental processes (epigenetics). Today, the approach of medical research in human diseases is to isolate one disease for study; however, the presence of an additional disease (co-morbidity) or more than one disease (multimorbidity) adds complexity to the study of these conditions. This review will present the challenge of integrating psychiatric disorders at different levels of information (Gestaltomics). The implications of increasing the level of complexity, for example, studying co-morbidity with another disease such as cancer, will also be discussed. PMID:28536537
Current challenges in meeting global iodine requirements.
Eastman, Creswell J; Jooste, Pieter
2012-01-01
Iodine deficiency is a global problem of immense magnitude afflicting 2 billion of the world's population. The adverse effects of iodine deficiency in humans, collectively termed iodine deficiency disorders, result from decreased thyroid hormone production and action, and vary in severity from thyroid enlargement (goiter) to severe, irreversible brain damage, termed endemic cretinism. Thyroid hormone is essential throughout life, but it is critical for normal brain development in the fetus and throughout childhood. During pregnancy, maternal thyroid hormone production must increase by 25-50% to meet maternal-fetal requirements. The principal sources of iodine in the diet include milk and dairy products, seafoods and foods with added iodized salt. Vegetables, fruits and cereals are generally poor sources of iodine because most of our soils and water supplies are deficient in iodine. The accepted solution to the problem is Universal Salt Iodization where all salt for human and animal consumption is iodized at a level of 20-40 µg/g. In principle, mandatory fortification represents the most effective public health strategy where safety and efficacy can be assured and there is a demonstrated need for the nutrient in the population. Voluntary fortification of salt and other foods has many limitations and few benefits. Iodine supplementation is a useful, but expensive, inefficient and unsustainable strategy for preventing iodine deficiency. The current worldwide push to decrease salt intake to prevent cardiovascular disease presents an entirely new challenge in addressing iodine deficiency in both developing and developed countries. Copyright © 2012 S. Karger AG, Basel.
PDF-ECG in clinical practice: A model for long-term preservation of digital 12-lead ECG data.
Sassi, Roberto; Bond, Raymond R; Cairns, Andrew; Finlay, Dewar D; Guldenring, Daniel; Libretti, Guido; Isola, Lamberto; Vaglio, Martino; Poeta, Roberto; Campana, Marco; Cuccia, Claudio; Badilini, Fabio
In clinical practice, data archiving of resting 12-lead electrocardiograms (ECGs) is mainly achieved by storing a PDF report in the hospital electronic health record (EHR). When available, digital ECG source data (raw samples) are retained only within the ECG management system. Widespread availability of the ECG source data would undoubtedly permit subsequent analysis and facilitate longitudinal studies, with both scientific and diagnostic benefits. PDF-ECG is a hybrid archival format which allows the standard graphical report of an ECG and its source data (waveforms) to be stored in the same file. Using PDF-ECG as a model to address the challenge of ECG data portability, long-term archiving and documentation, a real-world proof-of-concept test was conducted in a hospital in northern Italy. A set of volunteers underwent a basic ECG using routine hospital equipment and the source data were captured. Using dedicated web services, PDF-ECG documents were then generated and seamlessly uploaded into the hospital EHR, replacing the standard PDF reports automatically generated at the time of acquisition. Finally, the PDF-ECG files could be successfully retrieved and re-analyzed. Adding PDF-ECG to an existing EHR had a minimal impact on the hospital's workflow, while preserving the ECG digital data. Copyright © 2017 Elsevier Inc. All rights reserved.
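PDF-ECG itself is a published specification; purely to illustrate the underlying idea (one file carrying both the graphical report and the source waveforms), a generic PDF library such as pypdf can attach raw data to a report PDF. All file names below are hypothetical:

```python
from pypdf import PdfReader, PdfWriter

# Illustrative only: embed raw ECG samples inside the standard PDF report
# so that a single file carries both the trace image and the source data.
reader = PdfReader("ecg_report.pdf")         # hypothetical report file
writer = PdfWriter()
writer.append(reader)                        # copy all report pages

with open("ecg_waveforms.xml", "rb") as f:   # hypothetical raw-waveform file
    writer.add_attachment("ecg_waveforms.xml", f.read())

with open("pdf_ecg.pdf", "wb") as f:
    writer.write(f)
```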
Siano, Gabriel G; Montemurro, Milagros; Alcaráz, Mirta R; Goicoechea, Héctor C
2017-10-17
Higher-order data generation implies some automation challenges, which are mainly related to the hidden programming languages and electronic details of the equipment. When techniques and/or equipment hyphenation are the key to obtaining higher-order data, the required simultaneous control of them demands funds for new hardware, software, and licenses, in addition to very skilled operators. In this work, we present Design of Inputs-Outputs with Sikuli (DIOS), a free and open-source program that provides a general framework for the design of automated experimental procedures without prior knowledge of programming or electronics. Basically, instruments and devices are treated as nodes in a network, and every node is associated with both physical and virtual inputs and outputs. Virtual components, such as the graphical user interfaces (GUIs) of equipment, are handled by means of the image recognition tools provided by the Sikuli scripting language, while their physical counterparts are handled using an adapted open-source three-dimensional (3D) printer. Two previously reported experiments of our research group, related to fluorescence matrices derived from kinetics and high-performance liquid chromatography, were adapted to be carried out in a more automated fashion. Satisfactory results, in terms of analytical performance, were obtained. Similarly, the advantages of open-source tools could be appreciated, mainly in terms of less operator intervention and cost savings.
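Sikuli scripts are written in a Python dialect (Jython) and drive GUIs by image matching; a minimal sketch of the kind of virtual-node handling described here (the .png targets are hypothetical screenshots, not from DIOS itself) might look like:

```python
# Sikuli (Jython) sketch: drive an instrument's GUI by image recognition.
# click(), wait() and type() are Sikuli built-ins; the .png files are
# hypothetical screenshots of the equipment's control software.
wait("instrument_ready.png", 60)    # block until the GUI shows "ready"
click("start_acquisition.png")      # press the acquisition button
wait("run_complete.png", 600)       # wait for the run to finish
click("export_menu.png")
type("run_001.csv" + Key.ENTER)     # save results under a new file name
```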
The Role of NOAA's National Data Centers in the Earth and Space Science Infrastructure
NASA Astrophysics Data System (ADS)
Fox, C. G.
2008-12-01
NOAA's National Data Centers (NNDC) provide access to long-term archives of environmental data from NOAA and other sources. The NNDCs face significant challenges in the volume and complexity of modern data sets. Data volume challenges are being addressed using more capable data archive systems such as the Comprehensive Large Array-Data Stewardship System (CLASS). Assuring data quality and stewardship is in many ways more challenging. In the past, scientists at the Data Centers could provide reasonable stewardship of data sets in their areas of expertise. As staff levels have decreased and data complexity has increased, the Data Centers depend on their data providers and user communities to provide high-quality metadata and feedback on data problems and improvements. This relationship requires strong partnerships between the NNDCs and academic, commercial, and international partners, as well as advanced data management and access tools that conform to established international standards where available. The NNDCs are looking to geospatial databases, interactive mapping, web services, and other Application Program Interface approaches to help preserve NNDC data and information and to make it easily available to the scientific community.
Remotely measuring populations during a crisis by overlaying two data sources
Bharti, Nita; Lu, Xin; Bengtsson, Linus; Wetter, Erik; Tatem, Andrew J.
2015-01-01
Background Societal instability and crises can cause rapid, large-scale movements. These movements are poorly understood and difficult to measure but strongly impact health. Data on these movements are important for planning response efforts. We retrospectively analyzed movement patterns surrounding a 2010 humanitarian crisis caused by internal political conflict in Côte d'Ivoire using two different methods. Methods We used two remote measures, nighttime lights satellite imagery and anonymized mobile phone call detail records, to assess average population sizes as well as dynamic population changes. These data sources detect movements across different spatial and temporal scales. Results The two data sources showed strong agreement in average measures of population sizes. Because the spatiotemporal resolution of the data sources differed, we were able to obtain measurements on long- and short-term dynamic elements of populations at different points throughout the crisis. Conclusions Using complementary, remote data sources to measure movement shows promise for future use in humanitarian crises. We conclude with challenges of remotely measuring movement and provide suggestions for future research and methodological developments. PMID:25733558
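A minimal sketch of the "overlay" step (illustrative arrays, not the study's data): compare the relative population signals derived from the two remote sources across a set of areas.

```python
import numpy as np

# Hypothetical per-area signals: summed nighttime-light radiance and
# count of mobile-phone subscribers observed in each area on one day.
nightlights = np.array([12.1, 48.3, 5.2, 30.7, 22.9])
phone_users = np.array([1500, 6200, 700, 3900, 2800])

# Normalise each source to proportions so the two scales are comparable.
p_lights = nightlights / nightlights.sum()
p_phones = phone_users / phone_users.sum()

# Agreement between the two remote measures of relative population size.
r = np.corrcoef(p_lights, p_phones)[0, 1]
print(f"correlation between sources: r = {r:.2f}")
```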
Bosire Onyancha, Omwoyo
2008-05-01
As channels of communicating HIV/AIDS research information, serial publications and particularly journals are increasingly used in response to the pandemic. The last few decades have witnessed a proliferation of sources of HIV/AIDS-related information, bringing many challenges to collection-development librarians as well as to researchers. This study uses an informetric approach to examine the growth, productivity and scientific impact of these sources during the period 1980 to 2005, and especially to measure performance in the publication and dissemination of HIV/AIDS research about or from eastern or southern Africa. Data were collected from MEDLINE, Science Citation Index (SCI), Social Sciences Citation Index (SSCI), and Ulrich's Periodical Directory. The analysis used Sitkis version 1.5, Microsoft Office Access, Microsoft Office Excel, Bibexcel, and Citespace version 2.0.1. The specific objectives were to identify the number of sources of HIV/AIDS-related information that have been published in the region, the coverage of these in key bibliographic databases, the most commonly used publication type for HIV/AIDS research, the countries in which the sources are published, the sources' productivity in terms of numbers of papers and citations, the most influential sources, the subject coverage of the sources, and the core sources of HIV/AIDS information.
Minimal-Drift Heading Measurement using a MEMS Gyro for Indoor Mobile Robots.
Hong, Sung Kyung; Park, Sungsu
2008-11-17
To meet the challenge of using low-cost MEMS yaw-rate gyros for the precise self-localization of indoor mobile robots, this paper examines a practical and effective method of minimizing drift in the heading angle that relies solely on integration of the rate signals from a gyro. The proposed approach consists of two parts: 1) self-identification of the calibration coefficients that affect long-term performance, and 2) a threshold filter to reject the broadband noise component that affects short-term performance. Experimental results with the proposed method applied to an Epson XV3500 gyro demonstrate that it effectively yields minimal-drift heading angle measurements, overcoming the major error sources in the MEMS gyro output.
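A minimal sketch of the second part (threshold filtering before integration); the bias, scale and threshold values are illustrative placeholders, not those identified in the paper:

```python
import numpy as np

def heading_from_gyro(rates, dt, bias=0.0, scale=1.0, threshold=0.05):
    """Integrate yaw-rate samples into a heading-angle history.

    rates: raw gyro output (rad/s); bias/scale: calibration coefficients
    (long-term errors); threshold: rates below this magnitude are treated
    as broadband noise and zeroed (short-term errors).
    """
    calibrated = (np.asarray(rates, dtype=float) - bias) * scale
    calibrated[np.abs(calibrated) < threshold] = 0.0   # threshold filter
    return np.cumsum(calibrated) * dt                  # heading in radians

# Example: 100 Hz samples from a stationary robot should integrate to
# (near) zero heading drift once bias and threshold are applied.
```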
The Disposal of Spacecraft and Launch Vehicle Stages in Low Earth Orbit
NASA Technical Reports Server (NTRS)
Johnson, Nicholas L.
2007-01-01
Spacecraft and launch vehicle stages abandoned in Earth orbit have historically been a primary source of debris from accidental explosions. In the future, such satellites will become the principal cause of orbital debris via inadvertent collisions. To curtail both the near-term and far-term risks posed by derelict spacecraft and launch vehicle stages to operational space systems, numerous national and international orbital debris mitigation guidelines specifically recommend actions which could prevent or limit such future debris generation. Although considerable progress has been made in implementing these recommendations, some changes to existing vehicle designs can be difficult. Moreover, the nature of some missions also can present technological and budgetary challenges to be compliant with widely accepted orbital debris mitigation measures.
Energy aware swarm optimization with intercluster search for wireless sensor network.
Thilagavathi, Shanmugasundaram; Geetha, Bhavani Gnanasambandan
2015-01-01
Wireless sensor networks (WSNs) are emerging as a popular low-cost solution for many real-world challenges. Their low cost enables the deployment of large sensor arrays for military and civilian tasks. Generally, WSNs are power constrained because their typical deployment makes battery replacement difficult. A key challenge in WSNs is therefore a well-organized communication platform with negligible power utilization. In this work, an improved binary particle swarm optimization (PSO) algorithm with a modified connected dominating set (CDS) based on residual energy is proposed for discovering the optimal number of clusters and cluster heads (CHs). Simulations show that the proposed BPSO-T and BPSO-EADS perform better than LEACH- and PSO-based systems in terms of energy savings and QoS.
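A compact sketch of the binary-PSO ingredient (a generic sigmoid-update BPSO with a toy energy-aware fitness; the paper's CDS logic and parameter values are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_particles, iters = 40, 20, 100
energy = rng.uniform(0.2, 1.0, n_nodes)          # residual energy per node

def fitness(bits):
    """Favour few cluster heads with high residual energy (toy objective)."""
    heads = bits.astype(bool)
    if not heads.any():
        return -np.inf
    return energy[heads].mean() - 0.05 * heads.sum()

X = rng.integers(0, 2, (n_particles, n_nodes))   # CH-membership bit strings
V = np.zeros((n_particles, n_nodes))
pbest, pbest_f = X.copy(), np.array([fitness(x) for x in X])
gbest = pbest[pbest_f.argmax()].copy()

for _ in range(iters):
    r1, r2 = rng.random(V.shape), rng.random(V.shape)
    V = 0.7 * V + 1.5 * r1 * (pbest - X) + 1.5 * r2 * (gbest - X)
    X = (rng.random(V.shape) < 1 / (1 + np.exp(-V))).astype(int)  # sigmoid rule
    f = np.array([fitness(x) for x in X])
    improved = f > pbest_f
    pbest[improved], pbest_f[improved] = X[improved], f[improved]
    gbest = pbest[pbest_f.argmax()].copy()

print("chosen cluster heads:", np.flatnonzero(gbest))
```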
Recent Approaches to Estimate Associations Between Source-Specific Air Pollution and Health.
Krall, Jenna R; Strickland, Matthew J
2017-03-01
Estimating health effects associated with source-specific exposure is important for better understanding how pollution impacts health and for developing policies to better protect public health. Although epidemiologic studies of sources can be informative, these studies are challenging to conduct because source-specific exposures (e.g., particulate matter from vehicles) often are not directly observed and must be estimated. We reviewed recent studies that estimated associations between pollution sources and health to identify methodological developments designed to address important challenges. Notable advances in epidemiologic studies of sources include approaches for (1) propagating uncertainty in source estimation into health effect estimates, (2) assessing regional and seasonal variability in emissions sources and source-specific health effects, and (3) addressing potential confounding in estimated health effects. Novel methodological approaches to address challenges in studies of pollution sources, particularly evaluation of source-specific health effects, are important for determining how source-specific exposure impacts health.
NASA Astrophysics Data System (ADS)
Wang, Chaoen; Chang, Lung-Hai; Chang, Mei-Hsia; Chen, Ling-Jhen; Chung, Fu-Tsai; Lin, Ming-Chyuan; Liu, Zong-Kai; Lo, Chih-Hung; Tsai, Chi-Lin; Yeh, Meng-Shu; Yu, Tsung-Chi
2017-11-01
Excitation of multipacting, enhanced by gas condensation on the cold surfaces of the high-power input coupler in an SRF module, poses the greatest challenge for reliable SRF operation under high average RF power. It could prevent a light-source SRF module from being operated at the desired high beam current. Off-line long-term reliability tests have been conducted on the newly constructed 500-MHz KEKB-type SRF modules at an accelerating RF voltage of 1.6 MV to enable prediction of their operational reliability in the 3-GeV Taiwan Photon Source (TPS), since prediction from production performance alone, via the conventional horizontal test, is presently unreliable. As expected, operational difficulties resulting from multipacting enhanced by gas condensation were identified in the course of the long-term reliability test. Our present hypothesis is that gas condensation can be slowed by keeping the vacuum pressure at the power coupler close to that reached just after its cool-down to liquid helium temperatures. This is achievable by reducing the out-gassing rate of the power coupler through comprehensive warm aging. Its feasibility and effectiveness have been experimentally verified in a second long-term reliability test. Our success opens the possibility of operating the SRF module free of multipacting trouble and points to a new direction for improving the operational performance of next-generation SRF modules in light sources with high beam currents.
Kochanska, Grazyna; Kim, Sanghag; Nordling, Jamie Koenig
2013-01-01
The need for research on potential moderators of personality–parenting links has been repeatedly emphasized, yet few studies have examined how varying stressful or challenging circumstances may influence such links. We studied 186 diverse, low-income mother–toddler dyads. Mothers described themselves in terms of Big Five traits, were observed in lengthy interactions with their children, and provided parenting reports. Ecological adversity, assessed as a cumulative index of known risk factors, and the child's difficulty, observed as negative affect and defiance in interactions with mothers, were posited as sources of parenting challenge. Mothers high in Neuroticism reported more power assertion. Some personality–parenting relations emerged only under challenging conditions. For mothers raising difficult children, higher Extraversion was linked to increased observed power assertion, but higher Conscientiousness was linked to decreased reported power assertion. There were no such relations for mothers of easy children. By contrast, some relations emerged only in the absence of challenge. Agreeableness was associated with more positive parenting for mothers who lived under conditions of low ecological adversity, and with less reported power assertion for those who had easy children, and Openness was linked to more positive parenting for mothers of easy children. Those traits were unrelated to parenting under challenging conditions. PMID:23066882
The successes and challenges of open-source biopharmaceutical innovation.
Allarakhia, Minna
2014-05-01
Increasingly, open-source-based alliances seek to provide broad access to data, research-based tools, preclinical samples and downstream compounds. The challenge is how to create value from open-source biopharmaceutical innovation. This value creation may occur via transparency and usage of data across the biopharmaceutical value chain as stakeholders move dynamically between open source and open innovation. In this article, several examples are used to trace the evolution of biopharmaceutical open-source initiatives. The article specifically discusses the technological challenges associated with the integration and standardization of big data; the human capacity development challenges associated with skill development around big data usage; and the data-material access challenges associated with data and material access and usage rights, particularly as the boundary between open source and open innovation becomes more fluid. In the author's opinion, assessing when and how value creation will occur through open-source biopharmaceutical innovation is paramount. The key is to determine the metrics of value creation and the necessary technological, educational and legal frameworks to support the downstream outcomes of now big-data-based open-source initiatives. Rather than continuing to focus on early-stage value creation, it would be more advisable for stakeholders to transform open-source initiatives into open-source discovery, crowdsourcing and open product development partnerships on the same platform.
Long-term dataset on aquatic responses to concurrent climate change and recovery from acidification
NASA Astrophysics Data System (ADS)
Leach, Taylor H.; Winslow, Luke A.; Acker, Frank W.; Bloomfield, Jay A.; Boylen, Charles W.; Bukaveckas, Paul A.; Charles, Donald F.; Daniels, Robert A.; Driscoll, Charles T.; Eichler, Lawrence W.; Farrell, Jeremy L.; Funk, Clara S.; Goodrich, Christine A.; Michelena, Toby M.; Nierzwicki-Bauer, Sandra A.; Roy, Karen M.; Shaw, William H.; Sutherland, James W.; Swinton, Mark W.; Winkler, David A.; Rose, Kevin C.
2018-04-01
Concurrent regional and global environmental changes are affecting freshwater ecosystems. Decadal-scale data on lake ecosystems that can describe processes affected by these changes are important, as multiple stressors often interact to alter the trajectory of key ecological phenomena in complex ways. Due to the practical challenges associated with long-term data collection, the majority of existing long-term data sets focus on only a small number of lakes or few response variables. Here we present physical, chemical, and biological data from 28 lakes in the Adirondack Mountains of northern New York State. These data span the period 1994-2012 and harmonize multiple open and as-yet unpublished data sources. The dataset creation is reproducible and transparent; R code and all original files used to create the dataset are provided in an appendix. This dataset will be useful for examining ecological change in lakes undergoing multiple stressors.
Evaluation of international recruitment of health professionals in England.
Young, Ruth; Noble, Jenny; Mahon, Ann; Maxted, Mairead; Grant, Janet; Sibbald, Bonnie
2010-10-01
To explore whether a period of intensive international recruitment by the English National Health Service (NHS) achieved its objectives of boosting workforce numbers, and to set this against the wider costs, longer-term challenges and questions arising. A postal survey of all pre-2006 NHS providers, Strategic Health Authorities and Deans of Postgraduate Medical Education obtained information on 284 (45%) organizations (142 completed questionnaires). Eight subsequent case studies (74 interviews) covered medical consultant, general practitioner, nurse, midwife and allied health professional recruitment. Most respondents had undertaken or facilitated international recruitment between 2001 and 2006 and believed that it had enabled them to address immediate staff shortages. Views on longer-term implications, such as recruit retention, were more equivocal. Most organizations had made only a limited value-for-money assessment, balancing direct expenditure on overseas recruitment against savings on temporary staff. Other short- and long-term transaction and opportunity costs arose from pressures on existing staff, time spent on induction/pastoral support, and human resource management and workforce planning challenges. Though recognized, these extensive 'hidden costs' for NHS organizations were harder to assess, as were the implications for source countries and migrant staff. The main achievement of the intensive international recruitment period from a UK viewpoint was that such a major undertaking was seen through without major disruption to NHS services. The wider costs and challenges meant, however, that large-scale international recruitment was not sustainable as a solution to workforce shortages. Should such approaches be attempted in future, a clearer upfront appraisal of all the potential costs and implications will be vital.
An Outlook on Lithium Ion Battery Technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manthiram, Arumugam
2017-09-07
Lithium ion batteries as a power source are dominating in portable electronics, penetrating the electric vehicle market, and on the verge of entering the utility market for grid-energy storage. Depending on the application, trade-offs among the various performance parameters—energy, power, cycle life, cost, safety, and environmental impact—are often needed, which are linked to severe materials chemistry challenges. The current lithium ion battery technology is based on insertion-reaction electrodes and organic liquid electrolytes. With an aim to increase the energy density or optimize the other performance parameters, new electrode materials based on both insertion reaction and dominantly conversion reaction along with solid electrolytes and lithium metal anode are being intensively pursued. This article presents an outlook on lithium ion technology by providing first the current status and then the progress and challenges with the ongoing approaches. In light of the formidable challenges with some of the approaches, the article finally points out practically viable near-term strategies.
NASA Astrophysics Data System (ADS)
Johnes, P.
2013-12-01
Nutrient enrichment of waters from land-based and atmospheric sources presents a significant management challenge, requiring effective stakeholder engagement and policy development, properly underpinned by robust scientific evidence. The challenge is complex, raising significant questions about the specific sources, apportionment and pathways that determine nutrient enrichment and the key priorities for effective management and policy intervention. This paper presents outputs from 4 major UK research programmes: the Defra Demonstration Test Catchments programme (DTC), the Environment Agency's Catchment Sensitive Farming monitoring and evaluation programme (CSF), the Natural Resources Wales Welsh Catchment Initiative (WCI) and the NERC Environmental Virtual Observatory programme (EVOp). Funded to meet this challenge, they are delivering new understanding of the rates and sources of pollutant fluxes from land to water, their impacts on ecosystem goods and services, and likely trends under future climate and land use change from field to national scale. DTC, a £12m investment by the UK Government, has set up long-term, high-resolution research platforms equipped with novel telemetered sensor networks to monitor stream ecosystem responses to on-farm mitigation measures at a representative scale for catchment management. Ecosystem structural and functional responses and bulk hydrochemistry are also being monitored using standard protocols. CSF has set up long-term, enhanced monitoring in 8 priority catchments, with monthly monitoring in a further 72 English catchments and 6 Welsh priority catchments, to identify shifts in pollutant flux to waters resulting from mitigation measures in priority areas and farming sectors. CSF and WCI have contributed to >£50 million of targeted farm improvements to date, representing a significant shift in farming practice. Each programme has generated detailed evidence on stream ecosystem responses to targeted mitigation. However, to provide effective underpinning for policy, the major challenge has been to upscale this knowledge beyond these data-rich systems and identify the dominant contributing areas and priorities for management intervention to control nutrient flux and ecological impacts in data-poor systems which are located downstream from existing monitoring infrastructure or are in unmonitored catchments in remote locations. EVOp has directly addressed this challenge, developing a cloud-computing-enabled National Biogeochemical Modelling Framework to support ensemble modelling, knowledge capture and transfer from DTC, CSF, WCI and data-rich research catchments. This platform provides opportunities for further development of national biogeochemical modelling capability, allowing upscaled predictions from plot to catchment and national scale, enabling knowledge transfer from data-rich to data-poor areas. This paper presents initial findings from these research platforms, identifying the key priorities for action emerging from our national-scale scenario analysis, and future research directions to further improve understanding, prediction and management capability in nutrient-enriched waters and their catchments under changing climate and land use.
Stansfield, Claire; Brunton, Ginny; Rees, Rebecca
2014-06-01
When literature searching for systematic reviews, it is good practice to search widely across different information sources. Little is known about the contributions of different publication formats (e.g. journal article and book chapter) and sources, especially for studies of people's views. Studies from four reviews spanning three public health areas (active transport, motherhood and obesity) were analysed in terms of publication formats and the information sources from which they were identified. They comprised 229 studies exploring people's perceptions, beliefs and experiences ('views studies') and were largely qualitative. Although most (61%) research studies were published within journals, nearly a third (29%) were published as research reports and 5% were published in books. The remainder consisted of theses, conference papers and raw datasets. Two-thirds of studies (66%) were located in a total of 19 bibliographic databases, and 15 databases provided studies that were not identified elsewhere. PubMed was a good source for all reviews. Supplementary information sources were important for identifying studies in all publication formats. Undertaking sensitive searches across a range of information sources is essential for locating views studies in all publication formats. We discuss some benefits and challenges of utilising different information sources. Copyright © 2013 John Wiley & Sons, Ltd.
Evaluating sources and processing of nonpoint source nitrate in a small suburban watershed in China
NASA Astrophysics Data System (ADS)
Han, Li; Huang, Minsheng; Ma, Minghai; Wei, Jinbao; Hu, Wei; Chouhan, Seema
2018-04-01
Identifying nonpoint sources of nitrate has been a long-term challenge in mixed land-use watersheds. In the present study, we combined dual nitrate isotopes with runoff and stream water monitoring to elucidate the nonpoint nitrate sources across land uses, and to determine the relative importance of biogeochemical processes for nitrate export in a small suburban watershed, Longhongjian watershed, China. Our study suggested that NH4+ fertilizer, soil NH4+, litter fall and groundwater were the main nitrate sources in Longhongjian Stream. There were large changes in nitrate sources in response to season and land use. Runoff analysis illustrated that the tea plantation and forest areas contributed a dominant proportion of the TN export. Spatial analysis illustrated that NO3- concentrations were high in the tea plantation and forest areas, and that δ15N-NO3 and δ18O-NO3 were enriched in the step ponds. Temporal analysis showed high NO3- levels in spring, and nitrate isotopes were enriched in summer. The study also showed that the step ponds played an important role in mitigating nitrate pollution. Nitrification and plant uptake were the significant biogeochemical processes contributing to nitrogen transformation, while denitrification hardly occurred in the stream.
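Dual-isotope source apportionment of this kind typically rests on an isotope mass-balance mixing model; a generic form (not necessarily the exact model used in this study) for n sources with mixing fractions f_i is:

\[
\delta^{15}\mathrm{N}_{\text{mix}} = \sum_{i=1}^{n} f_i\, \delta^{15}\mathrm{N}_i, \qquad
\delta^{18}\mathrm{O}_{\text{mix}} = \sum_{i=1}^{n} f_i\, \delta^{18}\mathrm{O}_i, \qquad
\sum_{i=1}^{n} f_i = 1,
\]

where the δ values of the candidate end-members (fertilizer, soil NH4+, litter fall, groundwater) constrain the fractional contributions f_i of each source to the stream nitrate.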
Review of high-sensitivity Radon studies
NASA Astrophysics Data System (ADS)
Wojcik, M.; Zuzel, G.; Simgen, H.
2017-10-01
A challenge in many present cutting-edge particle physics experiments is the stringent requirements in terms of radioactive background. Particularly challenging is radon, a radioactive noble gas that is present in ambient air and is also released by emanation from its omnipresent progenitor, radium. In this paper we review various high-sensitivity radon detection techniques and approaches, applied in experiments looking for rare nuclear processes happening at low energies. They allow one to identify, quantitatively measure and finally suppress the numerous sources of radon in the detectors' components and plants.
Large-scale quantum photonic circuits in silicon
NASA Astrophysics Data System (ADS)
Harris, Nicholas C.; Bunandar, Darius; Pant, Mihir; Steinbrecher, Greg R.; Mower, Jacob; Prabhu, Mihika; Baehr-Jones, Tom; Hochberg, Michael; Englund, Dirk
2016-08-01
Quantum information science offers inherently more powerful methods for communication, computation, and precision measurement that take advantage of quantum superposition and entanglement. In recent years, theoretical and experimental advances in quantum computing and simulation with photons have spurred great interest in developing large photonic entangled states that challenge today's classical computers. As experiments have increased in complexity, there has been an increasing need to transition bulk optics experiments to integrated photonics platforms to control more spatial modes with higher fidelity and phase stability. The silicon-on-insulator (SOI) nanophotonics platform offers new possibilities for quantum optics, including the integration of bright, nonclassical light sources, based on the large third-order nonlinearity (χ(3)) of silicon, alongside quantum state manipulation circuits with thousands of optical elements, all on a single phase-stable chip. How large do these photonic systems need to be? Recent theoretical work on Boson Sampling suggests that even the problem of sampling from ∼30 identical photons, having passed through an interferometer of hundreds of modes, becomes challenging for classical computers. While experiments of this size are still challenging, the SOI platform has the required component density to enable low-loss and programmable interferometers for manipulating hundreds of spatial modes. Here, we discuss the SOI nanophotonics platform for quantum photonic circuits with hundreds-to-thousands of optical elements and the associated challenges. We compare SOI to competing technologies in terms of requirements for quantum optical systems. We review recent results on large-scale quantum state evolution circuits and strategies for realizing high-fidelity heralded gates with imperfect, practical systems. Next, we review recent results on silicon photonics-based photon-pair sources and device architectures, and we discuss a path towards large-scale source integration. Finally, we review monolithic integration strategies for single-photon detectors and their essential role in on-chip feed forward operations.
A Cross-Lingual Similarity Measure for Detecting Biomedical Term Translations
Bollegala, Danushka; Kontonatsios, Georgios; Ananiadou, Sophia
2015-01-01
Bilingual dictionaries for technical terms such as biomedical terms are an important resource for machine translation systems as well as for humans who would like to understand a concept described in a foreign language. Often a biomedical term is first proposed in English and later it is manually translated to other languages. Despite the fact that there are large monolingual lexicons of biomedical terms, only a fraction of those term lexicons are translated to other languages. Manually compiling large-scale bilingual dictionaries for technical domains is a challenging task because it is difficult to find a sufficiently large number of bilingual experts. We propose a cross-lingual similarity measure for detecting most similar translation candidates for a biomedical term specified in one language (source) from another language (target). Specifically, a biomedical term in a language is represented using two types of features: (a) intrinsic features that consist of character n-grams extracted from the term under consideration, and (b) extrinsic features that consist of unigrams and bigrams extracted from the contextual windows surrounding the term under consideration. We propose a cross-lingual similarity measure using each of those feature types. First, to reduce the dimensionality of the feature space in each language, we propose prototype vector projection (PVP)—a non-negative lower-dimensional vector projection method. Second, we propose a method to learn a mapping between the feature spaces in the source and target language using partial least squares regression (PLSR). The proposed method requires only a small number of training instances to learn a cross-lingual similarity measure. The proposed PVP method outperforms popular dimensionality reduction methods such as the singular value decomposition (SVD) and non-negative matrix factorization (NMF) in a nearest neighbor prediction task. Moreover, our experimental results covering several language pairs such as English–French, English–Spanish, English–Greek, and English–Japanese show that the proposed method outperforms several other feature projection methods in biomedical term translation prediction tasks. PMID:26030738
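The PLSR mapping step can be sketched with scikit-learn; in this toy illustration (invented terms, character n-gram counts in place of the paper's full feature set, and the PVP step omitted), a cross-lingual similarity score is obtained by projecting a source-language term into the target feature space and comparing it with a candidate translation:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy parallel term lists standing in for a bilingual training dictionary.
en_terms = ["gene expression", "protein folding", "cell membrane"]
fr_terms = ["expression genique", "repliement des proteines", "membrane cellulaire"]

# Intrinsic features: character n-grams of each term, one space per language.
vec_en = CountVectorizer(analyzer="char", ngram_range=(2, 3))
vec_fr = CountVectorizer(analyzer="char", ngram_range=(2, 3))
X = vec_en.fit_transform(en_terms).toarray()
Y = vec_fr.fit_transform(fr_terms).toarray()

# Learn a mapping from the source to the target feature space via PLSR.
pls = PLSRegression(n_components=2)
pls.fit(X, Y)

# Score a candidate translation: map the English term, then compare.
query = vec_en.transform(["gene expression"]).toarray()
candidate = vec_fr.transform(["expression genique"]).toarray()
score = cosine_similarity(pls.predict(query), candidate)[0, 0]
print(f"translation similarity: {score:.2f}")
```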
Aid to people with disabilities: Medicaid's growing role.
Carbaugh, Alicia L; Elias, Risa; Rowland, Diane
2006-01-01
Medicaid is the nation's largest health care program providing assistance with health and long-term care services for millions of low-income Americans, including people with chronic illness and severe disabilities. This article traces the evolution of Medicaid's now-substantial role for people with disabilities; assesses Medicaid's contributions over the last four decades to improving health insurance coverage, access to care, and the delivery of care; and examines the program's future challenges as a source of assistance to children and adults with disabilities. Medicaid has shown that it is an important source of health insurance coverage for this population, people for whom private coverage is often unavailable or unaffordable, substantially expanding coverage and helping to reduce the disparities in access to care between the low-income population and the privately insured.
NASA Astrophysics Data System (ADS)
Braban, Christine; Tang, Sim; Bealey, Bill; Roberts, Elin; Stephens, Amy; Galloway, Megan; Greenwood, Sarah; Sutton, Mark; Nemitz, Eiko; Leaver, David
2017-04-01
Ambient ammonia measurements have been undertaken to understand sources, to characterise concentrations at background sites and vulnerable ecosystems, and for long-term monitoring of concentrations. As a pollutant whose concentrations are projected to increase in the coming decades, with significant policy challenges to implementing mitigation strategies, it is useful to assess what has been measured, where and why. In this study, a review of the literature has shown that ammonia measurements are frequently not publicly reported and in general are not deposited in open data centres available for research. The specific sectors where measurements have been undertaken are: agricultural point-source assessments, agricultural surface exchange measurements, sensitive ecosystem monitoring, landscape/regional studies and governmental long-term monitoring. Less frequently, ammonia is measured as part of an intensive atmospheric chemistry field campaign. Technology is developing, which means a shift from chemical denuder methods to spectroscopic techniques may be possible; however, chemical denuder techniques with off-line laboratory analysis will likely remain an economical approach for some time to come. This paper reviews existing datasets from the different sectors of research and integrates them into a global picture to allow both a long-term understanding and comparison with future measurements.
Challenges for Synchrotron X-Ray Optics
NASA Astrophysics Data System (ADS)
Freund, Andreas K.
2002-12-01
It is the task of x-ray optics to adapt the raw beam generated by modern sources such as synchrotron storage rings to a great variety of experimental requirements in terms of intensity, spot size, polarization and other parameters. The very high quality of synchrotron radiation (source size of a few microns and beam divergence of a few micro-radians) and the extreme x-ray flux (power of several hundred Watts in a few square mm) make this task quite difficult. In particular, the heat-load aspect is very important in conditioning the raw x-ray power to make it suitable for use on the experimental stations. Cryogenically cooled silicon crystals and water-cooled diamond crystals can presently fulfill this task, but limits will soon be reached and new schemes and materials must be envisioned. A major tendency of instrument improvement has always been to concentrate more photons into a smaller spot utilizing a whole variety of focusing devices such as Fresnel zone plates, refractive lenses and systems based on bent surfaces, for example, Kirkpatrick-Baez systems. Apart from the resistance of the sample, the ultimate limits are determined by the source size and strength on one side, by materials properties, cooling, mounting and bending schemes on the other side, and fundamentally by the diffraction process. There is also the important aspect of coherence, which can be both a nuisance and a blessing for the experiments, in particular for imaging techniques. Its conservation puts additional constraints on the quality of the optical elements. The overview of the present challenges covers the properties of present x-ray sources and also mentions aspects of future sources such as the "ultimate" storage ring and free-electron lasers. These challenges range from the thermal performance of monochromators to the surface quality of mirrors, from coherence preservation by modern multilayers to short-pulse preservation by crystals, and from micro- and nano-focusing techniques to the accuracy and stability of mechanical supports.
Responses of dogs with food allergies to single-ingredient dietary provocation.
Jeffers, J G; Meyer, E K; Sosis, E J
1996-08-01
To characterize specific food ingredients causing allergic reactions in dogs and to assess cross-reactivity between proteins derived from a single animal source or from different plant products. Prospective study. 25 dogs with histories and cutaneous signs consistent with food-allergic dermatitis. Dogs were fed a food-elimination diet until resolution of clinical signs and then challenged with their original diet. A diagnosis of food allergy was made if there was complete return of pruritus within 14 days of challenge exposure. After diagnosis, dogs were fed the food-elimination diet until signs related to dietary challenge abated. The dogs then were fed beef, chicken, chicken eggs, cows' milk, wheat, soy, and corn in single-ingredient provocation trials for 1 week. Any cutaneous reactions to these food ingredients were recorded by their owners. Beef and soy most often caused adverse cutaneous reactions, although all ingredients induced clinical signs in at least 1 dog. Mean number of allergens per dog was 2.4, with 80% reacting to 1 or 2 proteins and 64% reacting to 2 or more of the proteins tested. A significant difference was found between dogs reacting to beef versus cows' milk and between dogs reacting to soy versus wheat; thus, the hypothesis of cross-reactivity to ingredients derived from a single animal source or to different plant products was not supported. Similar differences between chicken meat and eggs were not identified. Long-term management of dogs with food allergies is facilitated by identification of the most commonly encountered food allergens. Because cross-reactivity cannot be verified, each protein source should be included separately in food-provocation trials.
Subsurface Hydrology: Data Integration for Properties and Processes
NASA Astrophysics Data System (ADS)
Hyndman, David W.; Day-Lewis, Frederick D.; Singha, Kamini
Groundwater is a critical resource and the principal source of drinking water for over 1.5 billion people. In 2001, the National Research Council cited as a "grand challenge" our need to understand the processes that control water movement in the subsurface. This volume faces that challenge in terms of data integration between complex, multi-scale hydrologic processes, and their links to other physical, chemical, and biological processes at multiple scales. Subsurface Hydrology: Data Integration for Properties and Processes presents the current state of the science in four aspects: • Approaches to hydrologic data integration • Data integration for characterization of hydrologic properties • Data integration for understanding hydrologic processes • Meta-analysis of current interpretations Scientists and researchers in the field, the laboratory, and the classroom will find this work an important resource in advancing our understanding of subsurface water movement.
Challenges and opportunities of power systems from smart homes to super-grids.
Kuhn, Philipp; Huber, Matthias; Dorfner, Johannes; Hamacher, Thomas
2016-01-01
The world's power systems are facing a structural change including liberalization of markets and integration of renewable energy sources. This paper describes the challenges that lie ahead in this process and points out avenues for overcoming different problems at different scopes, ranging from individual homes to international super-grids. We apply energy system models at those different scopes and find a trade-off between technical and social complexity. Small-scale systems would require technological breakthroughs, especially for storage, but individual agents can and do already start to build and operate such systems. In contrast, large-scale systems could potentially be more efficient from a techno-economic point of view. However, new political frameworks are required that enable long-term cooperation among sovereign entities through mutual trust. Which scope first achieves its breakthrough is not clear yet.
Constructing a philosophy of chiropractic: evolving worldviews and modern foundation.
Senzon, Simon A
2011-12-01
The purpose of this article is to trace the foundations of DD Palmer's sense of self and philosophy of chiropractic to its sources in modern Western philosophy as well as current metatheories about modernity. DD Palmer's sense of self was indicative of a modern self. A modern self is characterized as a self that developed after the Western Enlightenment and must come to terms with the insights of modernity, such as Cartesian dualism, Spinoza's substance, Rousseau's expressivism, and Kant's critiques. It is argued that Palmer's philosophy can be viewed as part of this tradition, alongside his involvement in the 19th-century American metaphysical religious culture, which was itself a response to these challenges of modernity. Palmer's development of chiropractic and its philosophy was a reaction to the challenges and promises of modernity.
Land surface temperature measurements from EOS MODIS data
NASA Technical Reports Server (NTRS)
Wan, Zhengming
1993-01-01
The task objectives of this reporting phase included: (1) completing the draft of the LST Algorithm Theoretical Basis Document by July 30, 1993; (2) making a detailed characterization of the thermal infrared measurement system including spectrometer, blackbody, and radiation sources; (3) making TIR spectral measurements of water and snow-cover surfaces with the MIDAC M2401 spectrometer; and (4) making conceptual and engineering designs of an accessory system for spectrometric measurements at variable angles. These objectives are based on the requirements of the MODIS Science Team and the unique challenge in the development of MODIS LST algorithms: to acquire accurate spectral emissivity data of land covers in the near term and to make ground validations of the LST product in the long term with a TIR measurement system.
The paper examines the quality assurance challenges associated with open path Fourier transform infrared (OPFTIR) measurements of large area pollution sources with plume reconstruction by computed tomography (CT) and how each challenge may be met. Traditionally, pollutant concent...
Praveen, Paurush; Fröhlich, Holger
2013-01-01
Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data, coupled with a limited sample size, reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal-to-noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior for use in graphical model inference. Our first model, called the Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, such as pathway databases, GO terms, and protein domain data, and is flexible enough to integrate new sources, if available.
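The abstract does not give the exact parameterization, but the Noisy-OR combination it names has a standard form: an interaction is considered supported unless every source independently fails to support it. A minimal sketch in Python, with hypothetical edge names and support scores:

```python
import numpy as np

# Hypothetical support scores in [0, 1] from three information sources
# (e.g., a pathway database, GO term co-annotation, shared protein domains)
# for each candidate edge; names and values are illustrative only.
support = {
    ("geneA", "geneB"): [0.8, 0.3, 0.0],
    ("geneA", "geneC"): [0.1, 0.2, 0.05],
}

def noisy_or_prior(scores):
    """Noisy-OR combination: the edge prior is one minus the probability
    that all sources independently fail to support the edge."""
    scores = np.asarray(scores, dtype=float)
    return 1.0 - np.prod(1.0 - scores)

for edge, scores in support.items():
    print(edge, round(noisy_or_prior(scores), 3))
```

For ("geneA", "geneB") this yields 1 - (0.2 × 0.7 × 1.0) = 0.86, so one strong source dominates, which matches the abstract's description of Noisy-OR picking up the strongest support.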
NASA Astrophysics Data System (ADS)
Yager, Kevin; Albert, Thomas; Brower, Bernard V.; Pellechia, Matthew F.
2015-06-01
The domain of Geospatial Intelligence Analysis is rapidly shifting toward a new paradigm of Activity Based Intelligence (ABI) and information-based Tipping and Cueing. General requirements for an advanced ABIAA system present significant challenges in architectural design, computing resources, data volumes, workflow efficiency, data mining and analysis algorithms, and database structures. These sophisticated ABI software systems must include advanced algorithms that automatically flag activities of interest in less time and within larger data volumes than can be processed by human analysts. In doing this, they must also maintain the geospatial accuracy necessary for cross-correlation of multi-intelligence data sources. Historically, serial architectural workflows have been employed in ABIAA system design for tasking, collection, processing, exploitation, and dissemination. These simpler architectures may produce implementations that solve short term requirements; however, they have serious limitations that preclude them from being used effectively in an automated ABIAA system with multiple data sources. This paper discusses modern ABIAA architectural considerations providing an overview of an advanced ABIAA system and comparisons to legacy systems. It concludes with a recommended strategy and incremental approach to the research, development, and construction of a fully automated ABIAA system.
Genomic Selection Improves Heat Tolerance in Dairy Cattle
Garner, J. B.; Douglas, M. L.; Williams, S. R. O; Wales, W. J.; Marett, L. C.; Nguyen, T. T. T.; Reich, C. M.; Hayes, B. J.
2016-01-01
Dairy products are a key source of valuable proteins and fats for many millions of people worldwide. Dairy cattle are highly susceptible to heat-stress-induced decline in milk production, and as the frequency and duration of heat-stress events increase, the long-term security of nutrition from dairy products is threatened. Identification of dairy cattle more tolerant of heat-stress conditions would be an important step towards breeding dairy herds better adapted to future climates. Breeding for heat tolerance could be accelerated with genomic selection, using genome-wide DNA markers that predict tolerance to heat stress. Here we demonstrate the value of genomic predictions for heat tolerance in cohorts of Holstein cows predicted to be heat tolerant and heat susceptible, using controlled-climate chambers simulating a moderate heatwave event. Not only was the heat-challenge-induced decline in milk production smaller in cows genomically predicted to be heat tolerant, but physiological indicators such as rectal and intra-vaginal temperatures also rose less over the 4-day heat challenge. This demonstrates that genomic selection for heat tolerance in dairy cattle is a step towards securing a valuable source of nutrition and improving animal welfare in a future with predicted increases in heat-stress events. PMID:27682591
Measuring Nursing Care Time and Tasks in Long-Term Services and Supports: One Size Does Not Fit All
Sochalski, Julie A.; Foust, Janice B.; Zubritsky, Cynthia D.; Hirschman, Karen B.; Abbott, Katherine M.; Naylor, Mary D.
2015-01-01
Background Although nursing care personnel comprise the majority of staff in long-term care services and supports (LTSS), a method for measuring the provision of nursing care has not yet been developed. Purpose/Methods We sought to understand the challenges of measuring nursing care across different types of LTSS using a qualitative approach that included the triangulation of data from three unique sources. Results Six primary challenges to measuring nursing care across LTSS emerged: the level of detail about time of day, amount of time, or type of tasks varied by type of nursing and organization; time and tasks were documented in clinical records and administrative databases; data existed both on paper and electronically; several sources of information were needed to create the fullest picture of nursing care; data were inconsistently available for contracted providers; and documentation of informal caregiving was unavailable. Differences were observed for assisted living facilities and home- and community-based services compared to nursing homes, and across organizations within a setting. A commonality across settings and organizations was the availability of an electronically stored care plan specifying individual needs but not necessarily how these would be met. Conclusions The findings demonstrate the variability of data availability and specificity across three distinct LTSS settings. This study is an initial step toward establishing a process for measuring the provision of nursing care across LTSS, in order to explore the range of nursing care needs of LTSS recipients and how these needs are fulfilled. PMID:22902975
Deming, Damon; Sheahan, Timothy; Heise, Mark; Yount, Boyd; Davis, Nancy; Sims, Amy; Suthar, Mehul; Harkema, Jack; Whitmore, Alan; Pickles, Raymond; West, Ande; Donaldson, Eric; Curtis, Kristopher; Johnston, Robert; Baric, Ralph
2006-01-01
Background In 2003, severe acute respiratory syndrome coronavirus (SARS-CoV) was identified as the etiological agent of severe acute respiratory syndrome, a disease characterized by severe pneumonia that sometimes results in death. SARS-CoV is a zoonotic virus that crossed the species barrier, most likely originating from bats or from other species including civets, raccoon dogs, domestic cats, swine, and rodents. A SARS-CoV vaccine should confer long-term protection, especially in vulnerable senescent populations, against both the 2003 epidemic strains and zoonotic strains that may yet emerge from animal reservoirs. We report the comprehensive investigation of SARS vaccine efficacy in young and senescent mice following homologous and heterologous challenge. Methods and Findings Using Venezuelan equine encephalitis virus replicon particles (VRP) expressing the 2003 epidemic Urbani SARS-CoV strain spike (S) glycoprotein (VRP-S) or the nucleocapsid (N) protein from the same strain (VRP-N), we demonstrate that VRP-S, but not VRP-N, vaccines provide complete short- and long-term protection against homologous strain challenge in young and senescent mice. To test VRP vaccine efficacy against a heterologous SARS-CoV, we used phylogenetic analyses, synthetic biology, and reverse genetics to construct a chimeric virus (icGD03-S) encoding a synthetic S glycoprotein gene of the most genetically divergent human strain, GD03, which clusters among the zoonotic SARS-CoV. icGD03-S replicated efficiently in human airway epithelial cells and in the lungs of young and senescent mice, and was highly resistant to neutralization with antisera directed against the Urbani strain. Although VRP-S vaccines provided complete short-term protection against heterologous icGD03-S challenge in young mice, only limited protection was seen in vaccinated senescent animals. VRP-N vaccines not only failed to protect from homologous or heterologous challenge, but resulted in enhanced immunopathology with eosinophilic infiltrates within the lungs of SARS-CoV–challenged mice. VRP-N–induced pathology presented at day 4, peaked around day 7, persisted through day 14, and was likely mediated by cellular immune responses. Conclusions This study identifies gaps and challenges in vaccine design for controlling future SARS-CoV zoonosis, especially in vulnerable elderly populations. The availability of a SARS-CoV virus bearing heterologous S glycoproteins provides a robust challenge inoculum for evaluating vaccine efficacy against zoonotic strains, the most likely source of future outbreaks. PMID:17194199
Importance of hard coal in electricity generation in Poland
NASA Astrophysics Data System (ADS)
Plewa, Franciszek; Strozik, Grzegorz
2017-11-01
The Polish energy sector is facing a number of challenges, in particular as regards the reconstruction of production potential, diversification of energy sources, environmental issues, adequate fuel supplies, and others. Mandatory implementation of the Europe 2020 strategy in terms of its "3x20" targets (20% reduction of greenhouse gases, 20% of energy from renewable sources, and 20% increase of efficiency in energy production) requires fast decisions, which have to be coordinated with energy security issues, increasing demand for electric energy, and other factors. In Poland almost 80% of power is installed in coal-fired power plants, and energy from hard coal is relatively less expensive than from other sources, especially renewables. Most renewable power plants are unable to generate power in amounts competitive with coal-fired power stations and are highly expensive, which leads to high prices of electric energy. Alternatively, a new generation of coal-fired power plants is able to significantly increase efficiency, reduce carbon dioxide emissions, and generate less expensive electric power in amounts adequate to the demands of the country.
The Upper Rio Grande Basin as a Long-Term Hydrologic Observatory - Challenges and Opportunities
NASA Astrophysics Data System (ADS)
Springer, E.; Duffy, C.; Phillips, F.; Hogan, J.; Winter, C. L.
2001-12-01
Long-term hydrologic observatories (LTHO) have been identified as a key element to advance hydrologic science. Issues to be addressed are the size and locations of LTHOs to meet research needs and address water resources management concerns. To date, considerable small watershed research has been performed, and these studies have provided valuable insights into processes governing hydrologic response on local scales. For hydrology to advance as a science, more complete and coherent data sets at larger scales are needed to tie together local studies and examine lower frequency, long wavelength processes that may govern the water cycle at the scale of river basins and continents. The objective of this poster is to describe the potential opportunities and challenges for the upper Rio Grande as an LTHO. The presence of existing research programs and facilities can be leveraged by an LTHO to develop the required scientific measurements. Within the upper Rio Grande Basin, there are two Long-Term Ecological Research sites, Jornada and Sevilleta; Los Alamos National Laboratory, which monitors the atmosphere, surface water and groundwater; and a groundwater study being performed by the USGS in the Albuquerque Basin to examine recharge and water quality issues. Additionally, the upper Rio Grande Basin served as a USGS-NAWQA study site starting in the early 1990s and is currently being studied by SAHRA (NSF-STC) to understand sources of salinity of the river system; such studies provide an existing framework on which to base long-term monitoring of water quality. The upper Rio Grande Basin has a wealth of existing long-term climate, hydrologic and geochemical records on which to base an LTHO. Within the basin there are currently 122 discharge gages operated by the USGS, and many of these gages have long-term records of discharge. Other organizations operate additional surface water gages in the lower part of the basin. Long-term records of river chemistry have been kept by the USGS, U. S. Bureau of Reclamation, IBWC and EBID. Significantly, these records extend through periods of climate extremes, notably the 1950s drought. One challenge that the Rio Grande faces as an LTHO is combining datasets maintained by different agencies in order to address research questions at this spatial and temporal scale. Other challenges include instrumentation over the steep topographic and biological gradients present in the basin, and the political issues surrounding any basin, which can create problems for making long-term measurements. Current water resources management requires a greater scientific understanding of coupled processes, serious improvements in predictive capability, and greater computational resources, all of which require a comprehensive hydrologic monitoring system beyond any that exists today.
Successful propagation of shrimp yellow head virus in immortal mosquito cells.
Gangnonngiw, Warachin; Kanthong, Nipaporn; Flegel, Timothy W
2010-05-18
Research on crustacean viruses is hampered by the lack of continuous cell lines susceptible to them. To overcome this problem, we previously challenged immortal mosquito and lepidopteran cell lines with shrimp yellow head virus (YHV), followed by serial, split-passage of whole cells, and showed that this produced cells that persistently expressed YHV antigens. To determine whether such insect cultures positive for YHV antigens could be used to infect shrimp Penaeus monodon with YHV, culture supernatants and whole-cell homogenates were used to challenge shrimp by injection. Shrimp injected with culture supernatants could not be infected. However, shrimp injection-challenged with whole-cell homogenates from Passage 5 (early-passage) of such cultures died with histological and clinical signs typical for yellow head disease (YHD), while homogenates of mock-passaged, YHV-challenged cells did not. By contrast, shrimp challenged with cell homogenates of late-passage cultures became infected with YHV, but survived, suggesting that YHV attenuation had occurred during its long-term serial passage in insect cells. Thus, YHV could be propagated successfully in C6/36 mosquito cells and used at low passage numbers as a source of inoculum to initiate lethal infections in shrimp. This partially solves the problem of lack of continuous shrimp cell lines for cultivation of YHV.
Rebound of a coal tar creosote plume following partial source zone treatment with permanganate.
Thomson, N R; Fraser, M J; Lamarche, C; Barker, J F; Forsey, S P
2008-11-14
The long-term management of dissolved plumes originating from a coal tar creosote source is a technical challenge. For some sites stabilization of the source may be the best practical solution to decrease the contaminant mass loading to the plume and associated off-site migration. At the bench scale, the deposition of manganese oxides, a permanganate reaction byproduct, has been shown to cause pore plugging and the formation of a manganese oxide layer adjacent to the non-aqueous phase liquid creosote, which reduces post-treatment mass transfer and hence mass loading from the source. The objective of this study was to investigate the potential of partial permanganate treatment to reduce the ability of a coal tar creosote source zone to generate a multi-component plume at the pilot scale over both the short term (weeks to months) and the long term (years) at a site where there are >10 years of comprehensive synoptic plume baseline data available. A series of preliminary bench-scale experiments were conducted to support this pilot-scale investigation. The results from the bench-scale experiments indicated that if sufficient mass removal of the reactive compounds is achieved, then the effective solubility, aqueous concentration and rate of mass removal of the more abundant non-reactive coal tar creosote compounds such as biphenyl and dibenzofuran can be increased. Manganese oxide formation and deposition caused an order-of-magnitude decrease in hydraulic conductivity. Approximately 125 kg of permanganate were delivered into the pilot-scale source zone over 35 days, and based on mass balance estimates <10% of the initial reactive coal tar creosote mass in the source zone was oxidized. Mass discharge estimated at a down-gradient fence line indicated a >35% reduction for all monitored compounds except biphenyl, dibenzofuran and fluoranthene 150 days after treatment, which is consistent with the bench-scale experimental results. Pre- and post-treatment soil core data indicated a highly variable and random spatial distribution of mass within the source zone and provided no insight into how much of any of the monitored species had been removed. The down-gradient plume was monitored approximately 1, 2 and 4 years following treatment. The data collected at 1 and 2 years post-treatment showed a decrease in mass discharge (10 to 60%) and/or total plume mass (0 to 55%); however, by 4 years post-treatment both mass discharge and total plume mass had rebounded, for all monitored compounds, to pre-treatment values or higher. The variability of the data collected was too large to resolve subtle changes in plume morphology, particularly near the source zone, that would provide insight into how the manganese oxides formed and deposited during treatment affected mass transfer and/or flow bypassing. Overall, the results from this pilot-scale investigation indicate that there was a significant but short-term (months) reduction of mass emanating from the source zone as a result of permanganate treatment, but there was no long-term (years) impact on the ability of this coal tar creosote source zone to generate a multi-component plume.
Reconstructing Forty Years of Landsat Observations
NASA Astrophysics Data System (ADS)
Meyer, D. J.; Dwyer, J. L.; Steinwand, D.
2013-12-01
In July 1972, NASA launched the Earth Resources Technology Satellite (ERTS), the first of what was to be the series of Earth-observing satellites we now know as the Landsat system. This system, originally conceived in the 1960s within the US Department of the Interior and US Geological Survey (USGS), has continued with little interruption for over 40 years, creating the longest record of satellite-based global land observations. The current USGS archive of Landsat images exceeds 4 million scenes, and the recently launched Landsat 8 platform will extend that archive to nearly 50 years of observations. Clearly, these observations are critical to the study of Earth system processes, and the interaction between these processes and human activities. However, the seven successful Landsat missions represent more of an ad hoc program than a long-term record of consistent observations, due largely to changing Federal policies and challenges finding an operational home for the program. Technologically, these systems evolved from the original Multispectral Scanner System (MSS) through the Thematic Mapper and Enhanced Thematic Mapper Plus (ETM+) systems, to the current Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS) systems. Landsat data were collected globally by a network of international cooperators having diverse data management policies. Much of the oldest data were stored on archaic media that could not be retrieved using modern media readers. Collecting these data from various sensors and sources, and reconstructing them into coherent Earth observation records, posed numerous challenges. We present here a brief overview of work done to overcome these challenges and create a consistent, long-term Landsat observation record. Much of the current archive was 'repatriated' from international cooperators and often required the reconstruction of (sometimes absent) metadata for geo-location and radiometric calibration. The older MSS data, some of which had been successfully retrieved from outdated wide-band video media, required similar metadata reconstruction. TM data from Landsats 4 and 5 relied on questionable on-board lamp data for calibration, thus the calibration history for these missions was reconstructed to account for sensor degradation over time. To improve continuity between platforms, the Landsat 7 and 8 missions employed 'under-flight' maneuvers to reduce inter-calibration error. Data from the various sensors, platforms and sources were integrated into a common metadata standard, with quality assurance information, to ensure understandability of the data for long-term preservation. Because of these efforts, the current Landsat archive can now support the creation of the long-term climate data records and essential climate variables required to monitor changes on the Earth's surface quantitatively over decades of observations.
NEON: Contributing continental-scale long-term environmental data for the benefit of society
NASA Astrophysics Data System (ADS)
Wee, B.; Aulenbach, S.
2011-12-01
The National Ecological Observatory Network (NEON) is an NSF-funded national investment in physical and information infrastructure. Large-scale environmental changes pose challenges that straddle environmental, economic, and social boundaries. As we develop climate adaptation strategies at the Federal, state, local, and tribal levels, accessible and usable data are essential for implementing actions that are informed by the best available information. NEON's goal is to enable understanding and forecasting of the impacts of climate change, land use change, and invasive species on continental-scale ecology by providing physical and information infrastructure. The NEON framework will take standardized, long-term, coordinated measurements of related environmental variables at each of its 62 sites across the nation. These observations, collected by automated instruments, field crews, and airborne instruments, will be processed into more than 700 data products that are provided freely over the web to support research, education, and environmental management. NEON is envisioned to be an integral component of an interoperable ecosystem of credible data and information sources. Other members of this information ecosystem include Federal, commercial, and non-profit entities. NEON is actively involved with the interoperability community via forums like the Foundation for Earth Science Information Partners and the USGS Community for Data Integration in a collective effort to identify the technical standards, best practices, and organizational principles that enable the emergence of such an information ecosystem. These forums have proven to be effective innovation engines for experimenting with new techniques that evolve into emergent standards. These standards are, for the most part, discipline agnostic. It is becoming increasingly evident that we need to include socio-economic and public health data sources in interoperability initiatives, because the dynamics of coupled natural-human systems cannot be understood in the absence of data about the human dimension. Another essential element is the community of tool and platform developers who create the infrastructure for scientists, educators, resource managers, and policy analysts to discover, analyze, and collaborate on problems using the diverse data that are required to address emerging large-scale environmental challenges. These challenges are very unlikely to be confined to this generation: they are urgent, compelling, and long-term problems that require a sustained effort to generate and curate data and information from observations, models, and experiments. NEON's long-term national physical and information infrastructure for environmental observation is one of the cornerstones of a framework that transforms science and information for the benefit of society.
Remotely measuring populations during a crisis by overlaying two data sources.
Bharti, Nita; Lu, Xin; Bengtsson, Linus; Wetter, Erik; Tatem, Andrew J
2015-03-01
Societal instability and crises can cause rapid, large-scale movements. These movements are poorly understood and difficult to measure but strongly impact health. Data on these movements are important for planning response efforts. We retrospectively analyzed movement patterns surrounding a 2010 humanitarian crisis caused by internal political conflict in Côte d'Ivoire using two different methods. We used two remote measures, nighttime lights satellite imagery and anonymized mobile phone call detail records, to assess average population sizes as well as dynamic population changes. These data sources detect movements across different spatial and temporal scales. The two data sources showed strong agreement in average measures of population sizes. Because the spatiotemporal resolution of the data sources differed, we were able to obtain measurements on long- and short-term dynamic elements of populations at different points throughout the crisis. Using complementary, remote data sources to measure movement shows promise for future use in humanitarian crises. We conclude with challenges of remotely measuring movement and provide suggestions for future research and methodological developments. © The Author 2015. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene.
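As a toy illustration of overlaying the two sources, per-area population estimates from nighttime lights and call detail records can be merged and compared; the area names and values below are assumptions, not the study's data:

```python
import pandas as pd

# Hypothetical per-area population estimates from the two remote sources;
# the agreement check mirrors the paper's comparison of average sizes.
lights = pd.DataFrame({"area": ["A", "B", "C", "D"],
                       "pop_lights": [12000, 53000, 8100, 27000]})
phones = pd.DataFrame({"area": ["A", "B", "C", "D"],
                       "pop_cdr": [11500, 49800, 9000, 30500]})

merged = lights.merge(phones, on="area")
print(merged["pop_lights"].corr(merged["pop_cdr"]))  # Pearson correlation
```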
Theoretical considerations for mapping activation in human cardiac fibrillation
NASA Astrophysics Data System (ADS)
Rappel, Wouter-Jan; Narayan, Sanjiv M.
2013-06-01
Defining mechanisms for cardiac fibrillation is challenging because, in contrast to other arrhythmias, fibrillation exhibits complex non-repeatability in spatiotemporal activation but paradoxically exhibits conserved spatial gradients in rate, dominant frequency, and electrical propagation. Unlike animal models, in which fibrillation can be mapped at high spatial and temporal resolution using optical dyes or arrays of contact electrodes, mapping of cardiac fibrillation in patients is constrained practically to lower resolutions or smaller fields-of-view. In many animal models, atrial fibrillation is maintained by localized electrical rotors and focal sources. However, until recently, few studies had revealed localized sources in human fibrillation, so that the impact of mapping constraints on the ability to identify rotors or focal sources in humans was not described. Here, we determine the minimum spatial and temporal resolutions theoretically required to detect rigidly rotating spiral waves and focal sources, then extend these requirements for spiral waves in computer simulations. Finally, we apply our results to clinical data acquired during human atrial fibrillation using a novel technique termed focal impulse and rotor mapping (FIRM). Our results provide theoretical justification and clinical demonstration that FIRM meets the spatio-temporal resolution requirements to reliably identify rotors and focal sources for human atrial fibrillation.
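The paper derives the actual resolution bounds; as a rough, Nyquist-style illustration of the kind of reasoning involved, with assumed (not published) atrial parameters:

```python
# Back-of-envelope spatial and temporal sampling bounds for detecting a
# rigidly rotating spiral wave. Values are illustrative assumptions,
# not numbers from the paper.
conduction_velocity_mm_per_ms = 0.5   # assumed atrial conduction velocity
rotor_period_ms = 200.0               # assumed rotor cycle length

wavelength_mm = conduction_velocity_mm_per_ms * rotor_period_ms  # 100 mm
max_electrode_spacing_mm = wavelength_mm / 2   # Nyquist-style spatial bound
max_sampling_interval_ms = rotor_period_ms / 2 # Nyquist-style temporal bound

print(f"electrode spacing <= {max_electrode_spacing_mm:.0f} mm, "
      f"sampling interval <= {max_sampling_interval_ms:.0f} ms")
```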
Burrage, Rachel L; Gone, Joseph P; Momper, Sandra L
2016-09-01
American Indian (AI) youth have some of the highest rates of suicide of any group in the United States, and the majority of AI youth live in urban areas away from tribal communities. As such, understanding the resources available for suicide prevention among urban AI youth is critical, as is understanding the challenges involved in accessing such resources. Pre-existing interview data from 15 self-identified AI community members and staff from an Urban Indian Health Organization were examined to understand existing resources for urban AI youth suicide prevention, as well as related challenges. A thematic analysis was undertaken, resulting in three principal themes around suicide prevention: formal resources, informal resources, and community values and beliefs. Formal resources that meet the needs of AI youth were viewed as largely inaccessible or nonexistent, and youth were seen as more likely to seek help from informal sources. Community values of mutual support were thought to reinforce available informal supports. However, challenges arose in terms of the community's knowledge of and views on discussing suicide, as well as the perceived fit between community values and beliefs and formal prevention models. © Society for Community Research and Action 2016.
Adolescent health literacy and the Internet: challenges and opportunities.
Jain, Anuja V; Bickham, David
2014-08-01
Adolescents have increasingly turned to the Internet as a resource for insight into their health questions and concerns. However, the extent to which adolescents will benefit from using the Internet as a source for health information will be determined in great part by their level of media literacy and health literacy. The purpose of this review is to explore challenges that adolescents face when using the Internet to access health information and opportunities for intervention. Adolescents must be able to access, understand, analyze, and evaluate health information on the Internet and then apply this information to make appropriate health decisions. Challenges faced by adolescents fall into the realm of functional literacy (e.g., not being able to spell a medical term needed in a search), critical literacy (e.g., not being able to differentiate accurate from inaccurate online health information), and, lastly, interactive literacy (e.g., translating online health information to appropriate health behaviors). More research is needed in this field to better understand the challenges and to propose effective solutions. However, a multifaceted approach that engages policymakers, educators, healthcare providers, online health information providers, and parents may be positioned to make the largest impact.
Verification of target motion effects on SAR imagery using the Gotcha GMTI challenge dataset
NASA Astrophysics Data System (ADS)
Hack, Dan E.; Saville, Michael A.
2010-04-01
This paper investigates the relationship between a ground moving target's kinematic state and its SAR image. While effects such as cross-range offset, defocus, and smearing appear well understood, their derivations in the literature typically employ simplifications of the radar/target geometry and assume point scattering targets. This study adopts a geometrical model for understanding target motion effects in SAR imagery, termed the target migration path, and focuses on experimental verification of predicted motion effects using both simulated and empirical datasets based on the Gotcha GMTI challenge dataset. Specifically, moving target imagery is generated from three data sources: first, simulated phase history for a moving point target; second, simulated phase history for a moving vehicle derived from a simulated Mazda MPV X-band signature; and third, empirical phase history from the Gotcha GMTI challenge dataset. Both simulated target trajectories match the truth GPS target position history from the Gotcha GMTI challenge dataset, allowing direct comparison between all three imagery sets and the predicted target migration path. This paper concludes with a discussion of the parallels between the target migration path and the measurement model within a Kalman filtering framework, followed by conclusions.
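For readers unfamiliar with the Kalman filtering framework the paper invokes, a minimal linear predict/update cycle looks like the following; the matrices and noise levels are illustrative stand-ins, not the paper's actual measurement model:

```python
import numpy as np

# Minimal linear Kalman filter for a constant-velocity target state
# [position, velocity]; a SAR-derived position estimate plays the role
# of the measurement. All values here are illustrative assumptions.
F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition (dt = 1)
H = np.array([[1.0, 0.0]])               # measure position only
Q = 0.01 * np.eye(2)                     # process noise covariance
R = np.array([[0.5]])                    # measurement noise covariance

x = np.array([0.0, 0.0])                 # initial state estimate
P = np.eye(2)                            # initial state covariance

def kalman_step(x, P, z):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with measurement z
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

for z in [np.array([1.1]), np.array([2.0]), np.array([2.9])]:
    x, P = kalman_step(x, P, z)
print(x)  # estimated [position, velocity]
```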
NASA Astrophysics Data System (ADS)
Damiani, F.; Maggio, A.; Micela, G.; Sciortino, S.
1997-07-01
We apply to the specific case of images taken with the ROSAT PSPC detector our wavelet-based X-ray source detection algorithm presented in a companion paper. Such images are characterized by the presence of detector "ribs," strongly varying point-spread function, and vignetting, so that their analysis provides a challenge for any detection algorithm. First, we apply the algorithm to simulated images of a flat background, as seen with the PSPC, in order to calibrate the number of spurious detections as a function of significance threshold and to ascertain that the spatial distribution of spurious detections is uniform, i.e., unaffected by the ribs; this goal was achieved using the exposure map in the detection procedure. Then, we analyze simulations of PSPC images with a realistic number of point sources; the results are used to determine the efficiency of source detection and the accuracy of output quantities such as source count rate, size, and position, upon a comparison with input source data. It turns out that sources with 10 photons or less may be confidently detected near the image center in medium-length (~10^4 s), background-limited PSPC exposures. The positions of sources detected near the image center (off-axis angles < 15') are accurate to within a few arcseconds. Output count rates and sizes are in agreement with the input quantities, within a factor of 2 in 90% of the cases. The errors on position, count rate, and size increase with off-axis angle and for detections of lower significance. We have also checked that the upper limits computed with our method are consistent with the count rates of undetected input sources. Finally, we have tested the algorithm by applying it on various actual PSPC images, among the most challenging for automated detection procedures (crowded fields, extended sources, and nonuniform diffuse emission). The performance of our method in these images is satisfactory and outperforms those of other current X-ray detection techniques, such as those employed to produce the MPE and WGA catalogs of PSPC sources, in terms of both detection reliability and efficiency. We have also investigated the theoretical limit for point-source detection, with the result that even sources with only 2-3 photons may be reliably detected using an efficient method in images with sufficiently high resolution and low background.
The bioartificial pancreas (BAP): Biological, chemical and engineering challenges.
Iacovacci, Veronica; Ricotti, Leonardo; Menciassi, Arianna; Dario, Paolo
2016-01-15
The bioartificial pancreas (BAP) represents a viable solution for the treatment of type 1 diabetes (T1D). By encapsulating pancreatic cells in a semipermeable membrane to allow nutrient, insulin and glucose exchange, the side effects produced by the immunosuppressive therapy that islet and whole-organ transplantation require can be circumvented. Several factors, mainly related to material properties, capsule morphology and the biological environment, play a key role in optimizing BAP systems. The BAP is an extremely complex delivery system for insulin. Despite considerable efforts, which in some instances have met with a limited degree of success, a BAP capable of restoring physiological pancreatic function and of controlling blood glucose levels without the need for immunosuppressive drugs does not yet exist, especially in large animal models and the few clinical trials conducted. The state of the art in terms of materials, fabrication techniques and cell sources, as well as the current status of commercial devices and clinical trials, are described in this overview from an interdisciplinary viewpoint. In addition, challenges to the creation of effective BAP systems are highlighted, including future perspectives in terms of component integration from both a biological and an engineering viewpoint. Copyright © 2015 Elsevier Inc. All rights reserved.
Massoud, May A; Al-Abady, Abdolmonim; Jurdi, Mey; Nuwayhid, Iman
2010-06-01
Adequate and safe water is important for human health and well-being, economic production, and sustainable development. Failure to ensure the safety of drinking water may expose the community to the risk of outbreaks of waterborne and infectious diseases. Although drinking water is a basic human right, many people do not have access to safe and adequate drinking water or proper sanitation facilities. The authors conducted a study to assess the quantity, cost, continuity, coverage, and quality of drinking water in the village of Zawtar El-Charkieh, Lebanon. Their aim was to identify the challenges of sustainable access to safe drinking water in order to determine the short-term management actions and long-term strategies to improve water quality. Results revealed that contamination of the source, absence of any disinfection method or insufficient dose, poor maintenance operations, and aging of the networks are significant factors contributing to water contamination during the storage and distribution process. Establishing a comprehensive drinking water system that integrates water supply, quality, and management as well as associated educational programs in order to ensure the safety and sustainability of drinking water supplies is essential.
Formaldehyde Concentration Dynamics of the International Space Station Cabin Atmosphere
NASA Technical Reports Server (NTRS)
Perry, J. L.
2005-01-01
Formaldehyde presents a significant challenge to maintaining cabin air quality on board crewed spacecraft. Generation sources include offgassing from a variety of non-metallic materials as well as human metabolism. Because generation sources are pervasive and human health can be affected by continual exposure to low concentrations, toxicology and air quality control engineering experts jointly identified formaldehyde as a key compound to be monitored as part of the International Space Station's (ISS) environmental health monitoring and maintenance program. Data acquired from in-flight air quality monitoring methods are the basis for assessing the cabin environment's suitability for long-term habitation and monitoring the performance of passive and active controls that are in place to minimize crew exposure. Formaldehyde concentration trends and dynamics observed in the ISS cabin atmosphere are reviewed, and implications for present and future flight operations are discussed.
Wonder and the clinical encounter.
Evans, H M
2012-04-01
In terms of intervening in embodied experience, medical treatment is wonder-full in its ambition and its metaphysical presumption; yet, wonder's role in clinical medicine has received little philosophical attention. In this paper, I propose, to doctors and others in routine clinical life, the value of an openness to wonder and to the sense of wonder. Key to this is the identity of the central ethical challenges facing most clinicians, which is not the high-tech drama of the popular conceptions of medical ethics but, rather, the routine of patients' undramatic but unremitting demands for the clinician's time and respectful attention. Wonder (conceived as an intense and transfiguring attentiveness) is a ubiquitous ethical source, an alternative to the more familiar respect for rational autonomy, a source of renewal galvanizing diagnostic imagination, and a timely recalling of the embodied agency of both patient and clinician.
Human resources for health (and rehabilitation): Six Rehab-Workforce Challenges for the century.
Jesus, Tiago S; Landry, Michel D; Dussault, Gilles; Fronteira, Inês
2017-01-23
People with disabilities face challenges accessing basic rehabilitation health care. In 2006, the United Nations Convention on the Rights of Persons with Disabilities (CRPD) outlined the global necessity to meet the rehabilitation needs of people with disabilities, but this goal is often challenged by the undersupply and inequitable distribution of rehabilitation workers. While the aggregate study and monitoring of the physical rehabilitation workforce has been mostly ignored by researchers and policy-makers, this paper aims to present the 'challenges and opportunities' for guiding further long-term research and policies on developing the relatively neglected, highly heterogeneous physical rehabilitation workforce. The challenges were identified through a two-phased investigation. Phase 1: critical review of the rehabilitation workforce literature, organized by the availability, accessibility, acceptability and quality (AAAQ) framework. Phase 2: integration of the reviewed data into a SWOT framework to identify the strengths and opportunities to be maximized and the weaknesses and threats to be overcome. The critical review and SWOT analysis identified the following global situation: (i) needs-based shortages and lack of access to rehabilitation workers, particularly in lower income countries and in rural/remote areas; (ii) deficiencies in the data sources and monitoring structures; and (iii) few exemplary innovations, of both national and international scope, that may help reduce supply-side shortages in underserved areas. Based on the results, we have prioritized the following 'Six Rehab-Workforce Challenges': (1) monitoring supply requirements: accounting for rehabilitation needs and demand; (2) supply data sources: the need for structural improvements; (3) ensuring the study of a whole rehabilitation workforce (i.e. not focused on single professions), including across service levels; (4) staffing underserved locations: the rise of education, attractiveness and tele-service; (5) adapting policy options to different contexts (e.g. rural vs urban), even within a country; and (6) developing international solutions within an interdependent world. Concrete examples of feasible local, global and research action toward meeting the Six Rehab-Workforce Challenges are provided. Altogether, these may help advance a policy and research agenda for ensuring that an adequate rehabilitation workforce can meet current and future rehabilitation health needs.
Noninvasive Fetal ECG: the PhysioNet/Computing in Cardiology Challenge 2013.
Silva, Ikaro; Behar, Joachim; Sameni, Reza; Zhu, Tingting; Oster, Julien; Clifford, Gari D; Moody, George B
2013-03-01
The PhysioNet/CinC 2013 Challenge aimed to stimulate rapid development and improvement of software for estimating fetal heart rate (FHR), fetal interbeat intervals (FRR), and fetal QT intervals (FQT), from multichannel recordings made using electrodes placed on the mother's abdomen. For the challenge, five data collections from a variety of sources were used to compile a large standardized database, which was divided into training, open test, and hidden test subsets. Gold-standard fetal QRS and QT interval annotations were developed using a novel crowd-sourcing framework. The challenge organizers used the hidden test subset, which was not available for study by participants, to evaluate FHR, FRR, and FQT estimates from 91 open-source software entries submitted by 53 international teams of participants in three challenge events. Two additional events required only user-submitted QRS annotations to evaluate FHR and FRR estimation accuracy using the open test subset available to participants. The best of the 91 open-source entries achieved average estimation errors of 187 bpm² for FHR, 20.9 ms for FRR, and 152.7 ms for FQT. The open data sets, scoring software, and open-source entries are available at PhysioNet for researchers interested in working on these problems.
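The quantities scored in the challenge follow directly from fetal QRS detection times; a minimal sketch with hypothetical annotation times:

```python
import numpy as np

# Given fetal QRS detection times in seconds (hypothetical values),
# derive interbeat intervals (FRR) and fetal heart rate (FHR), the
# quantities scored in the challenge.
qrs_times_s = np.array([0.00, 0.42, 0.85, 1.27, 1.70, 2.12])

frr_ms = np.diff(qrs_times_s) * 1000.0   # interbeat intervals in ms
fhr_bpm = 60000.0 / frr_ms               # beat-to-beat heart rate in bpm

print(f"mean FRR = {frr_ms.mean():.1f} ms, mean FHR = {fhr_bpm.mean():.1f} bpm")
```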
Smith, Besa; Chu, Laura K; Smith, Tyler C; Amoroso, Paul J; Boyko, Edward J; Hooper, Tomoko I; Gackstetter, Gary D; Ryan, Margaret AK
2008-01-01
Background Self-reported medical history data are frequently used in epidemiological studies. Self-reported diagnoses may differ from medical record diagnoses due to poor patient-clinician communication, self-diagnosis in the absence of a satisfactory explanation for symptoms, or the "health literacy" of the patient. Methods The US Department of Defense military health system offers a unique opportunity to evaluate electronic medical records with near complete ascertainment while on active duty. This study compared 38 self-reported medical conditions to electronic medical record data in a large population-based US military cohort. The objective of this study was to better understand challenges and strengths in self-reporting of medical conditions. Results Using positive and negative agreement statistics for less-prevalent conditions, near-perfect negative agreement and moderate positive agreement were found for the 38 diagnoses. Conclusion This report highlights the challenges of using self-reported medical data and electronic medical records data, but illustrates that agreement between the two data sources increases with increased surveillance period of medical records. Self-reported medical data may be sufficient for ruling out history of a particular condition whereas prevalence studies may be best served by using an objective measure of medical conditions found in electronic healthcare records. Defining medical conditions from multiple sources in large, long-term prospective cohorts will reinforce the value of the study, particularly during the initial years when prevalence for many conditions may still be low. PMID:18644098
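The positive and negative agreement statistics referred to are the standard proportions of specific agreement for a 2x2 table; a minimal sketch with hypothetical counts:

```python
# Positive and negative agreement for a self-report vs. medical-record
# 2x2 table; counts are hypothetical. a = both positive, d = both
# negative, b and c = the two discordant cells.
a, b, c, d = 40, 15, 10, 935

positive_agreement = 2 * a / (2 * a + b + c)   # agreement on "has condition"
negative_agreement = 2 * d / (2 * d + b + c)   # agreement on "no condition"

print(f"positive agreement = {positive_agreement:.2f}, "
      f"negative agreement = {negative_agreement:.2f}")
```

With a rare condition (large d), negative agreement is near perfect even when positive agreement is only moderate, which is the pattern the study reports.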
Uncertainty and risk in wildland fire management: a review.
Thompson, Matthew P; Calkin, Dave E
2011-08-01
Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to treatments, and of spatiotemporal dynamics involving disturbance regimes and climate change. This work attempts to systematically align sources of uncertainty with the most appropriate decision support methodologies, in order to facilitate cost-effective, risk-based wildfire planning efforts. We review the state of wildfire risk assessment and management, with a specific focus on uncertainties challenging implementation of integrated risk assessments that consider a suite of human and ecological values. Recent advances in wildfire simulation and geospatial mapping of highly valued resources have enabled robust risk-based analyses to inform planning across a variety of scales, although improvements are needed in fire behavior and ignition occurrence models. A key remaining challenge is a better characterization of non-market resources at risk, both in terms of their response to fire and how society values those resources. Our findings echo earlier literature identifying wildfire effects analysis and value uncertainty as the primary challenges to integrated wildfire risk assessment and wildfire management. We stress the importance of identifying and characterizing uncertainties in order to better quantify and manage them. Leveraging the most appropriate decision support tools can facilitate wildfire risk assessment and ideally improve decision-making. Published by Elsevier Ltd.
Moment Tensor Analysis of Shallow Sources
NASA Astrophysics Data System (ADS)
Chiang, A.; Dreger, D. S.; Ford, S. R.; Walter, W. R.; Yoo, S. H.
2015-12-01
A potential issue for moment tensor inversion of shallow seismic sources is that some moment tensor components have vanishing amplitudes at the free surface, which can result in bias in the moment tensor solution. The effects of the free surface on the stability of the moment tensor method become important as we continue to investigate and improve the capabilities of regional full moment tensor inversion for source-type identification and discrimination. It is important to understand these free-surface effects on discriminating shallow explosive sources for nuclear monitoring purposes. They may also be important in natural systems that have shallow seismicity, such as volcanoes and geothermal systems. In this study, we apply the moment tensor based discrimination method to the HUMMING ALBATROSS quarry blasts. These shallow chemical explosions, detonated at approximately 10 m depth and recorded at up to several kilometers distance, represent a rather severe source-station geometry in terms of vanishing traction issues. We show that the method is capable of recovering a predominantly explosive source mechanism, and that the combined waveform and first-motion method enables the unique discrimination of these events. Recovering the correct yield using seismic moment estimates from moment tensor inversion remains challenging, but we can begin to put error bounds on our moment estimates using the NSS technique.
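Moment tensor inversion is, at its core, a linear least-squares problem d = Gm. The sketch below uses synthetic stand-ins for the Green's functions and data to illustrate how vanishing free-surface sensitivity destabilizes some components; nothing here reproduces the study's actual setup:

```python
import numpy as np

# Generic linearized moment tensor inversion d = G m, solved by least
# squares. G and d are synthetic stand-ins, not real Green's functions
# or waveforms. For near-surface sources, the columns tied to the
# vertical dip-slip terms (labeled Mxz/Myz here for illustration)
# nearly vanish, which is what destabilizes the inversion.
rng = np.random.default_rng(0)
n_data, n_mt = 200, 6                 # waveform samples, moment tensor elements
G = rng.normal(size=(n_data, n_mt))   # synthetic sensitivity matrix
G[:, 3:5] *= 1e-3                     # mimic vanishing Mxz/Myz sensitivity

m_true = np.array([1.0, 1.0, 1.0, 0.2, 0.1, 0.0])  # explosion-like source
d = G @ m_true + 0.01 * rng.normal(size=n_data)    # noisy synthetic data

m_est, *_ = np.linalg.lstsq(G, d, rcond=None)
print(np.round(m_est, 2))  # the two down-weighted components recover poorly
```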
Koutkias, Vassilis G; Lillo-Le Louët, Agnès; Jaulent, Marie-Christine
2017-02-01
Driven by the need of pharmacovigilance centres and companies to routinely collect and review all available data about adverse drug reactions (ADRs) and adverse events of interest, we introduce and validate a computational framework exploiting dominant as well as emerging publicly available data sources for drug safety surveillance. Our approach relies on appropriate query formulation for data acquisition and subsequent filtering, transformation and joint visualization of the obtained data. We acquired data from the FDA Adverse Event Reporting System (FAERS), PubMed and Twitter. In order to assess the validity and the robustness of the approach, we elaborated on two important case studies, namely, clozapine-induced cardiomyopathy/myocarditis versus haloperidol-induced cardiomyopathy/myocarditis, and apixaban-induced cerebral hemorrhage. The analysis of the obtained data provided interesting insights (identification of potential patient and health-care professional experiences regarding ADRs in Twitter, information/arguments against an ADR existence across all sources), while illustrating the benefits (complementing data from multiple sources to strengthen/confirm evidence) and the underlying challenges (selecting search terms, data presentation) of exploiting heterogeneous information sources, thereby advocating the need for the proposed framework. This work contributes in establishing a continuous learning system for drug safety surveillance by exploiting heterogeneous publicly available data sources via appropriate support tools.
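FAERS data are publicly queryable through the openFDA API; the sketch below mirrors the paper's clozapine/myocarditis case study, though the exact query grammar the authors used is an assumption here:

```python
import requests

# Count FAERS reports pairing a drug with a reaction via the public
# openFDA endpoint. The drug/reaction pair echoes the paper's case
# study; treat the query string itself as illustrative.
url = "https://api.fda.gov/drug/event.json"
params = {
    "search": 'patient.drug.medicinalproduct:"clozapine" AND '
              'patient.reaction.reactionmeddrapt:"myocarditis"',
    "limit": 1,
}
resp = requests.get(url, params=params, timeout=30)
resp.raise_for_status()
total = resp.json()["meta"]["results"]["total"]
print(f"matching FAERS reports: {total}")
```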
A dynamic model of stress and sustained attention
NASA Technical Reports Server (NTRS)
Hancock, P. A.; Warm, Joel S.
1989-01-01
Arguments are presented that an integrated view of stress and performance must consider the task demanding sustained attention as a primary source of cognitive stress. A dynamic model is developed, on the basis of the concept of adaptability in both physiological and psychological terms, that addresses the effects of stress on vigilance and, potentially, a wide variety of attention-demanding performance tasks. The model provides insight into the failure of an operator under the driving influences of stress and opens a number of potential avenues through which solutions to the complex challenge of stress and performance might be posed.
Water resources management. World Bank policy paper; Gestion des ressources en eau
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1993-12-31
The management framework presented in this study addresses the demand for water in Asia caused by rapid population growth and economic development. It focuses on three key actions to meet the challenge: evaluate how the region manages water resources; identify guidelines for the Bank's water resource programs; and develop country-specific strategies and promote joint programs. Reforms built into the framework seek to modernize institutions that affect water sources. The authors suggest ways to improve planning and long-term management, streamline economic and financial policy, and upgrade 'real-time' management, operation, and maintenance.
Edwards, Kathryn M
2005-06-01
Pertussis, or whooping cough, is a bacterial disease characterized by paroxysmal cough often accompanied by inspiratory whoop and posttussive emesis. Although the introduction of whole-cell pertussis vaccine in the 1940s led to a significant decline in the incidence of pertussis, there has been a gradual increase in reported pertussis cases since 1980. Some of these cases are in infants too young to have received routine pertussis vaccination, and many are in adolescents immunized previously as young children. Based on a literature review, an overview of pertussis is provided, focusing on epidemiology, sources of infection, and trends in incidence patterns, particularly among adolescents. Issues surrounding long-term protection after infant vaccination are also discussed. The most dramatic increase in pertussis incidence has been among adolescents and young adults. Waning vaccine-induced immunity and refinements in the diagnosis of pertussis have contributed to the rise in the occurrence of pertussis in older age groups. Disease rates in infants have also increased. Determining the source of infection in infants can be challenging, but studies have demonstrated that many infant cases are attributable to infections in adolescent or adult family members. Pertussis is on the rise, particularly in adolescents. Booster vaccination of adolescents with less-reactogenic acellular pertussis vaccines appears to be the most logical approach to disease prevention in adolescents and reduced transmission to young infants.
Analysis of CERN computing infrastructure and monitoring data
NASA Astrophysics Data System (ADS)
Nieke, C.; Lassnig, M.; Menichetti, L.; Motesnitsalis, E.; Duellmann, D.
2015-12-01
Optimizing a computing infrastructure on the scale of the LHC requires a quantitative understanding of a complex network of many different resources and services. For this purpose the CERN IT department and the LHC experiments are collecting a multitude of logs and performance probes, which are already successfully used for short-term analysis (e.g. operational dashboards) within each group. The IT analytics working group has been created with the goal of bringing together data sources from different services and on different abstraction levels, and of implementing a suitable infrastructure for mid- to long-term statistical analysis. It further provides a forum for joint optimization across single service boundaries and the exchange of analysis methods and tools. To simplify access to the collected data, we implemented an automated repository for cleaned and aggregated data sources based on the Hadoop ecosystem. This contribution describes some of the challenges encountered, such as dealing with heterogeneous data formats and selecting an efficient storage format for MapReduce and external access, and will describe the repository user interface. Using this infrastructure we were able to quantitatively analyze the relationship between the CPU/wall fraction, the latency/throughput constraints of network and disk, and the effective job throughput. In this contribution we will first describe the design of the shared analysis infrastructure and then present a summary of first analysis results from the combined data sources.
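A mid-term analysis over such a cleaned, aggregated Hadoop repository might be expressed as a Spark job; the path, column names, and metrics below are hypothetical, not the actual CERN schema:

```python
from pyspark.sql import SparkSession, functions as F

# Illustrative aggregation over a cleaned monitoring repository; the
# parquet path and columns are hypothetical stand-ins.
spark = SparkSession.builder.appName("it-analytics-sketch").getOrCreate()

jobs = spark.read.parquet("/repo/aggregated/job_monitoring")  # hypothetical path

# Relate CPU/wall-clock fraction to effective job throughput per site.
summary = (jobs
           .withColumn("cpu_wall_fraction", F.col("cpu_time") / F.col("wall_time"))
           .groupBy("site")
           .agg(F.avg("cpu_wall_fraction").alias("avg_cpu_wall"),
                F.avg("jobs_per_hour").alias("avg_throughput")))

summary.orderBy(F.desc("avg_cpu_wall")).show()
```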
A study in the founding of applied behavior analysis through its publications.
Morris, Edward K; Altus, Deborah E; Smith, Nathaniel G
2013-01-01
This article reports a study of the founding of applied behavior analysis through its publications. Our methods included hand searches of sources (e.g., journals, reference lists), search terms (i.e., early, applied, behavioral, research, literature), inclusion criteria (e.g., the field's applied dimension), and challenges to their face and content validity. Our results were 36 articles published between 1959 and 1967 that we organized into 4 groups: 12 in 3 programs of research and 24 others. Our discussion addresses (a) limitations in our method (e.g., the completeness of our search), (b) challenges to the validity of our methods and results (e.g., convergent validity), and (c) priority claims about the field's founding. We conclude that the claims are irresolvable because identification of the founding publications depends significantly on methods and because the field's founding was an evolutionary process. We close with suggestions for future research.
Stem cells, in vitro gametogenesis and male fertility.
Nagamatsu, Go; Hayashi, Katsuhiko
2017-12-01
Reconstitution in culture of biological processes, such as differentiation and organization, is a key challenge in regenerative medicine, and one in which stem cell technology plays a central role. Pluripotent stem cells and spermatogonial stem cells are useful materials for reconstitution of germ cell development in vitro, as they are capable of differentiating into gametes. Reconstitution of germ cell development, termed in vitro gametogenesis, will provide an experimental platform for a better understanding of germ cell development, as well as an alternative source of gametes for reproduction, with the potential to cure infertility. Since germ cells are the cells for 'the next generation', both the culture system and its products must be carefully evaluated. In this issue, we summarize the progress in in vitro gametogenesis, most of which has been made using mouse models, as well as the future challenges in this field. © 2017 Society for Reproduction and Fertility.
Synthetic biology between technoscience and thing knowledge.
Gelfert, Axel
2013-06-01
Synthetic biology presents a challenge to traditional accounts of biology: Whereas traditional biology emphasizes the evolvability, variability, and heterogeneity of living organisms, synthetic biology envisions a future of homogeneous, humanly engineered biological systems that may be combined in modular fashion. The present paper approaches this challenge from the perspective of the epistemology of technoscience. In particular, it is argued that synthetic-biological artifacts lend themselves to an analysis in terms of what has been called 'thing knowledge'. As such, they should neither be regarded as the simple outcome of applying theoretical knowledge and engineering principles to specific technological problems, nor should they be treated as mere sources of new evidence in the general pursuit of scientific understanding. Instead, synthetic-biological artifacts should be viewed as partly autonomous research objects which, qua their material-biological constitution, embody knowledge about the natural world, knowledge that, in turn, can be accessed via continuous experimental interrogation. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Christensen, CarissaBryce; Beard, Suzette
2001-03-01
This paper will provide an overview of the Iridium business venture in terms of the challenges faced, the successes achieved, and the causes of the ultimate failure of the venture — bankruptcy and system de-orbit. The paper will address technical, business, and policy issues. The intent of the paper is to provide a balanced and accurate overview of the Iridium experience, to aid future decision-making by policy makers, the business community, and technical experts. Key topics will include the history of the program, the objectives and decision-making of Motorola, the market research and analysis conducted, partnering strategies and their impact, consumer equipment availability, and technical issues — target performance, performance achieved, technical accomplishments, and expected and unexpected technical challenges. The paper will use as sources trade media and business articles on the Iridium program, technical papers and conference presentations, Wall Street analysts' reports, and, where possible, interviews with participants and close observers.
Scale models: A proven cost-effective tool for outage planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, R.; Segroves, R.
1995-03-01
As generation costs for operating nuclear stations have risen, more nuclear utilities have initiated efforts to improve cost effectiveness. Nuclear plant owners are also being challenged with lower radiation exposure limits and newly revised radiation protection regulations (10 CFR 20), which place further stress on their budgets. As source term reduction activities continue to lower radiation fields, reducing the amount of time spent in radiation fields becomes one of the most cost-effective ways of reducing radiation exposure. An effective approach for minimizing time spent in radiation areas is to use a physical scale model for worker orientation planning and monitoring maintenance, modifications, and outage activities. To meet the challenge of continued reduction in the annual cumulative radiation exposures, new cost-effective tools are required. One field-tested and proven tool is the physical scale model.
Constructing a philosophy of chiropractic: evolving worldviews and modern foundation
Senzon, Simon A.
2011-01-01
Objective: The purpose of this article is to trace the foundations of DD Palmer's sense of self and philosophy of chiropractic to their sources in modern Western philosophy as well as current metatheories about modernity. Discussion: DD Palmer's sense of self was indicative of a modern self. A modern self is characterized as a self that developed after the Western Enlightenment and must come to terms with the insights of modernity such as Cartesian dualism, Spinoza's substance, Rousseau's expressivism, and Kant's critiques. It is argued that Palmer's philosophy can be viewed as part of this tradition, alongside his involvement in the 19th century American metaphysical religious culture, which was itself a response to these challenges of modernity. Conclusion: Palmer's development of chiropractic and its philosophy was a reaction to the challenges and promises of modernity. PMID:22693479
Getting Open Source Software into Schools: Strategies and Challenges
ERIC Educational Resources Information Center
Hepburn, Gary; Buley, Jan
2006-01-01
In this article Gary Hepburn and Jan Buley outline different approaches to implementing open source software (OSS) in schools; they also address the challenges that open source advocates should anticipate as they try to convince educational leaders to adopt OSS. With regard to OSS implementation, they note that schools have a flexible range of…
Benefit transfer protocol for long-term health risk valuation: A case of surface water contamination
NASA Astrophysics Data System (ADS)
Kask, Susan B.; Shogren, Jason F.
1994-10-01
In response to scarce financial resources, economists have promoted the concept of benefit transfer as a cost-effective alternative to new nonmarket valuation studies. Recent discussion on benefit transfer for improved water quality has focused on recreational benefits. While useful, the discussion must now be expanded to include another key benefit from improved water quality: the reduction in risk to public health. This paper develops a protocol for benefit transfer of long-term health risk reduction and presents a case study for surface water contamination. Challenges such as the multiple sources of risk, the mortality and morbidity effects indicated by a variety of symptoms, the long latency period between cause and effect, and an individual's ability to privately or collectively reduce the probability or severity of the risk are discussed.
PHYSICS OF OUR DAYS: Dark energy and universal antigravitation
NASA Astrophysics Data System (ADS)
Chernin, A. D.
2008-03-01
Universal antigravitation, a new physical phenomenon discovered astronomically at distances of 5 to 8 billion light years, manifests itself as cosmic repulsion that acts between distant galaxies and overcomes their gravitational attraction, resulting in the accelerating expansion of the Universe. The source of the antigravitation is not galaxies or any other bodies of nature but a previously unknown form of mass/energy that has been termed dark energy. Dark energy accounts for 70 to 80% of the total mass and energy of the Universe and, in macroscopic terms, is a kind of continuous medium that fills the entire space of the Universe and is characterized by positive density and negative pressure. With its physical nature and microscopic structure unknown, dark energy is among the most critical challenges fundamental science faces in the twenty-first century.
Neubauer, Georg; Feychting, Maria; Hamnerius, Yngve; Kheifets, Leeka; Kuster, Niels; Ruiz, Ignacio; Schüz, Joachim; Uberbacher, Richard; Wiart, Joe; Röösli, Martin
2007-04-01
The increasing deployment of mobile communication base stations led to an increasing demand for epidemiological studies on possible health effects of radio frequency emissions. The methodological challenges of such studies have been critically evaluated by a panel of scientists in the fields of radiofrequency engineering/dosimetry and epidemiology. Strengths and weaknesses of previous studies have been identified. Dosimetric concepts and crucial aspects in exposure assessment were evaluated in terms of epidemiological studies on different types of outcomes. We conclude that in principle base station epidemiological studies are feasible. However, the exposure contributions from all relevant radio frequency sources have to be taken into account. The applied exposure assessment method should be piloted and validated. Short- to medium-term effects on physiology or health-related quality of life are best investigated by cohort studies. For long-term effects, groups with a potential for high exposure need first to be identified; for immediate effects, human laboratory studies are the preferred approach. (c) 2006 Wiley-Liss, Inc.
Specialty pharmacy cost management strategies of private health care payers.
Stern, Debbie; Reissman, Debi
2006-01-01
The rate of increase in spending on specialty pharmaceuticals is outpacing by far the rate of increase in spending for other drugs. This article explores the strategies payers are using in response to challenges related to coverage, cost, clinical management, and access to specialty pharmaceuticals, and describes the potential implications for key stakeholders, including patients, physicians, and health care purchasers. Sources of information were identified in the course of providing consulting services in the subject area of specialty pharmaceuticals to health plans, pharmacy benefit managers, employers, and pharmaceutical manufacturers. Specialty pharmaceuticals represent the fastest growing segment of drug spending due to new product approvals, high unit costs, and increasing use. Health care payers are faced with significant challenges related to coverage, cost, clinical management, and access. A variety of short- and long-term strategies have been employed to address these challenges. Current management techniques for specialty pharmaceuticals often represent a stop-gap approach for controlling rising drug costs. Optimum cost and care management methods will evolve as further research identifies the true clinical and economic value of various specialty pharmaceuticals.
Co-Producing Accessible Climate Knowledge: Case Study of a Scientific Challenge
NASA Astrophysics Data System (ADS)
Bourqui, M.; Charriere, M. K. M.; Bolduc, C.
2016-12-01
This talk presents the process of and the lessons learned from a scientific challenge where climate scientists re-framed their research for the general public in interaction with members of the general public. This challenge was organized by Climanosco in the context of its launch in the fall of 2015 and is due to end in September 2016. It led to the publication of 11 articles from scientific authors spanning 7 countries and engaged the participation of 24 members of the general public. The process of interaction between scientists and members of the general public took place along an extended peer-review process which included on-line community discussions and non-scientific review reports. Details of this interaction, as perceived by the participants and evaluated by a survey, will be discussed in this talk. In the longer term, this co-production of accessible climate knowledge, which represents the main goal of the non-profit association Climanosco, is meant to serve as a reliable, research-based source not only for decision makers but also for journalists, teachers, and communities around the world.
Schiller, Q.; Tu, W.; Ali, A. F.; ...
2017-03-11
The most significant unknown regarding relativistic electrons in Earth's outer Van Allen radiation belt is the relative contribution of loss, transport, and acceleration processes within the inner magnetosphere. Disentangling each individual process is critical to improving the understanding of radiation belt dynamics, but determining a single component is challenging due to sparse measurements in diverse spatial and temporal regimes. However, there are currently an unprecedented number of spacecraft taking measurements that sample different regions of the inner magnetosphere. With the increasing number of varied observational platforms, system dynamics can begin to be unraveled. In this work, we employ in-situ measurements during the 13-14 January 2013 enhancement event to isolate transport, loss, and source dynamics in a one-dimensional radial diffusion model. We then validate the results by comparing them to Van Allen Probes and THEMIS observations, indicating that the three terms have been accurately and individually quantified for the event. Finally, a direct comparison is performed between the model containing event-specific terms and various models containing terms parameterized by geomagnetic index. Models using a simple 3/Kp loss timescale show deviation from the event-specific model of nearly two orders of magnitude within 72 hours of the enhancement event. However, models using alternative loss timescales closely resemble the event-specific model.
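The decomposition above is built on the standard one-dimensional radial diffusion equation, df/dt = L^2 d/dL (D_LL L^-2 df/dL) - f/tau + S. As a minimal sketch (not the authors' code), the following explicit finite-difference integration uses the commonly cited Kp-parameterized D_LL of Brautigam and Albert (2000) and the simple 3/Kp loss timescale that the abstract compares against; the grid, timestep, boundary conditions, and source amplitude are arbitrary choices for illustration.

```python
import numpy as np

L = np.linspace(3.0, 6.5, 36)            # L-shell grid (dL = 0.1)
dL = L[1] - L[0]
f = np.exp(-(L - 4.5) ** 2)              # arbitrary initial phase-space density
dt = 5.0e-4                               # timestep [days], small enough for stability
Kp = 4.0                                  # fixed geomagnetic activity for the sketch

D_LL = 10.0 ** (0.506 * Kp - 9.325) * L ** 10   # Brautigam & Albert (2000) [1/day]
tau = 3.0 / Kp                                   # the "3/Kp" loss timescale [days]
S = 0.01 * np.exp(-(L - 5.0) ** 2 / 0.1)         # hypothetical localized source term

for _ in range(int(3.0 / dt)):            # integrate for three days
    # Radial diffusion operator: L^2 d/dL ( D_LL / L^2 * df/dL ), centered in L.
    g = (D_LL / L ** 2) * np.gradient(f, dL)
    f += dt * (L ** 2 * np.gradient(g, dL) - f / tau + S)
    f[0], f[-1] = f[1], 0.0               # crude inner/outer boundary conditions
```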
NASA Astrophysics Data System (ADS)
Emelko, M.; Silins, U.; Stone, M.
2016-12-01
Wildfire remains the most catastrophic agent of landscape disturbance in many forested source water regions. Notably, while wildfire impacts on water have been well studied, little if any of that work has specifically focused on impacts to drinking water treatability, which show both significant regional differences and similarities. Wildfire effects on water quality, particularly nutrient concentrations and character/forms, can be significant. The longevity and downstream propagation of these effects, as well as the geochemical mechanisms regulating them, have been largely undocumented at larger river basin scales. This work demonstrates that fine sediment in gravel-bed rivers is a significant, long-term source of in-stream bioavailable P that contributes to a legacy of wildfire impacts on downstream water quality, aquatic ecology, and drinking water treatability in some ecoregions. The short- and mid-term impacts include increases in primary productivity and dissolved organic carbon, associated changes in carbon character, and increased potential for the formation of disinfection byproducts during drinking water treatment. The longer term impacts also may include increases in potentially toxic algal blooms and the production of taste and odor compounds. These documented impacts, as well as strategies for assessing the risk of wildfire-associated water service disruptions and opportunities for infrastructure- and land-management-based adaptation to and mitigation of wildfire risk to drinking water supply, will be discussed.
Natalwala, Ammar; Kunath, Tilo
2017-01-01
Parkinson's disease is a complex and progressive neurodegenerative condition that is characterized by the severe loss of midbrain dopaminergic (mDA) neurons, which innervate the striatum. Cell transplantation therapies to rebuild this dopaminergic network have been attempted for over 30 years. The most promising outcomes were observed when human fetal mesencephalic tissue was used as the source of cells for transplantation. However, reliance on terminations for a Parkinson's therapy presents significant logistical and ethical hurdles. An alternative source of transplantable mDA neurons is urgently needed, and the solution may come from human embryonic stem cells (hESCs) and induced pluripotent stem cells (iPSCs). Protocols to differentiate hESCs/iPSCs toward mDA neurons are now robust and efficient, and upon grafting, the cells rescue preclinical animal models of Parkinson's disease. The challenge now is to apply Good Manufacturing Practice (GMP) to the academic discoveries and protocols to produce clinical-grade transplantable mDA cells. Major technical and logistical considerations include (i) source of hESC or iPSC line, (ii) GMP compliance of the differentiation protocol and all reagents, (iii) characterization of the cell product in terms of identity, safety, and efficacy, (iv) characterization of genomic state and stability, and (v) banking of a transplantation-ready cell product. Approaches and solutions to these challenges are reviewed here. © 2017 Elsevier B.V. All rights reserved.
Spiritual well-being in long-term colorectal cancer survivors with ostomies.
Bulkley, Joanna; McMullen, Carmit K; Hornbrook, Mark C; Grant, Marcia; Altschuler, Andrea; Wendel, Christopher S; Krouse, Robert S
2013-11-01
Spiritual well-being (SpWB) is integral to health-related quality of life. The challenges of colorectal cancer (CRC) and subsequent bodily changes can affect SpWB. We analyzed the SpWB of CRC survivors with ostomies. Two hundred eighty-three long-term (≥ 5 years) CRC survivors with permanent ostomies completed the modified City of Hope Quality of Life-Ostomy (mCOH-QOL-O) questionnaire. An open-ended question elicited respondents' greatest challenge in living with an ostomy. We used content analysis to identify SpWB responses and develop themes. We analyzed responses on the three-item SpWB sub-scale. Open-ended responses from 52% of participants contained SpWB content. Fifteen unique SpWB themes were identified. Sixty percent of individuals expressed positive themes such as "positive attitude", "I am fortunate", "appreciate life more", and "strength through religious faith". Negative themes, expressed by only 29% of respondents, included "struggling to cope", "not feeling 'normal'", and "loss". Fifty-five percent of respondents expressed ambivalent themes including "learning acceptance", "an ostomy is the price for survival", "reason to be around despite suffering", and "continuing to cope despite challenges". The majority (64%) had a high SpWB sub-scale score. Although CRC survivors with ostomies infrequently mentioned negative SpWB themes as a major challenge, ambivalent themes were common. SpWB themes were often mentioned as a source of resilience or part of the struggle to adapt to an altered body after cancer surgery. Interventions to improve the quality of life of cancer survivors should contain program elements designed to address SpWB that support personal meaning, inner peace, interconnectedness, and belonging. Copyright © 2013 John Wiley & Sons, Ltd.
Carlson, Sarah J; Nandivada, Prathima; Chang, Melissa I; Mitchell, Paul D; O'Loughlin, Alison; Cowan, Eileen; Gura, Kathleen M; Nose, Vania; Bistrian, Bruce R; Puder, Mark
2015-02-01
Parenteral nutrition associated liver disease (PNALD) is a deadly complication of long term parenteral nutrition (PN) use in infants. Fish oil-based lipid emulsion has been shown in recent years to effectively treat PNALD. Alternative fat sources free of essential fatty acids have recently been investigated for health benefits related to decreased inflammatory response. We hypothesized that the addition of medium-chain triglycerides (MCT) to a purified fish oil-based diet would decrease the response to inflammatory challenge in mice, while allowing for sufficient growth and development. Six groups of ten adult male C57/Bl6 mice were pair-fed different dietary treatments for a period of twelve weeks, varying only in fat source (percent calories by weight): 10.84% soybean oil (SOY), 10% coconut oil (HCO), 10% medium-chain triglycerides (MCT), 3% purified fish oil (PFO), 3% purified fish oil with 3% medium-chain triglycerides (50:50 MCT:PFO) and 3% purified fish oil with 7.59% medium-chain triglycerides (70:30 MCT:PFO). An endotoxin challenge was administered to half of the animals in each group at the completion of dietary treatment. All groups demonstrated normal growth throughout the study period. Groups fed MCT and HCO diets demonstrated biochemical essential fatty acid deficiency and decreased IL-6 and TNF-α response to endotoxin challenge. Groups containing PFO had increased inflammatory response to endotoxin challenge, and the addition of MCT to PFO mitigated this inflammatory response. These results suggest that the addition of MCT to PFO formulations may decrease the host response to inflammatory challenge, which may pose potential for optimized PN formulations. Inclusion of MCT in lipid emulsions given with PN formulations may be of use in therapeutic interventions for disease states resulting from chronic inflammation. Copyright © 2015 Elsevier Inc. All rights reserved.
OntoBrowser: a collaborative tool for curation of ontologies by subject matter experts.
Ravagli, Carlo; Pognan, Francois; Marc, Philippe
2017-01-01
The lack of controlled terminology and ontology usage leads to incomplete search results and poor interoperability between databases. One of the major underlying challenges of data integration is curating data to adhere to controlled terminologies and/or ontologies. Finding subject matter experts with the time and skills required to perform data curation is often problematic. In addition, existing tools are not designed for continuous data integration and collaborative curation. This results in time-consuming curation workflows that often become unsustainable. The primary objective of OntoBrowser is to provide an easy-to-use online collaborative solution for subject matter experts to map reported terms to preferred ontology (or code list) terms and facilitate ontology evolution. Additional features include web service access to data, visualization of ontologies in hierarchical/graph format and a peer review/approval workflow with alerting. The source code is freely available under the Apache v2.0 license. Source code and installation instructions are available at http://opensource.nibr.com. This software is designed to run on a Java EE application server and store data in a relational database. Contact: philippe.marc@novartis.com. © The Author 2016. Published by Oxford University Press.
Demand for Long-Term Care Insurance in China.
Wang, Qun; Zhou, Yi; Ding, Xinrui; Ying, Xiaohua
2017-12-22
The aim of this study was to estimate willingness to pay (WTP) for long-term care insurance (LTCI) and to explore the determinants of demand for LTCI in China. We collected data from a household survey conducted in Qinghai and Zhejiang on a sample of 1842 households. We relied on contingent valuation methods to elicit the demand for LTCI and random effects logistic regression to analyze the factors associated with the demand for LTCI. In addition, we used document analysis to compare the LTCI designed in this study and the current LTCI policies in the pilot cities. More than 90% of the respondents expressed their willingness to buy LTCI. The median WTP for LTCI was estimated at 370.14 RMB/year, accounting for 2.29% of average annual per capita disposable income. Price, age, education status, and income were significantly associated with demand for LTCI. Most pilot cities were found to mainly rely on Urban Employees Basic Medical Insurance funds as the financing source for LTCI. Considering that financing is one of the greatest challenges in the development of China's LTCI, we suggest that policy makers consider individual contributions as an important and feasible source of financing for LTCI.
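For readers unfamiliar with contingent valuation, the sketch below shows one common way a median WTP of this kind can be estimated from single-bounded dichotomous-choice responses: fit a logistic model of bid acceptance and take the bid at which the acceptance probability is 0.5. The data are synthetic, and the elicitation format used in the actual survey may differ.

```python
# Synthetic single-bounded dichotomous-choice WTP estimation sketch.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 1842
bid = rng.choice([100.0, 200.0, 400.0, 800.0], size=n)  # offered premium [RMB/year]
true_wtp = rng.normal(370.0, 250.0, size=n)             # latent willingness to pay
accept = (true_wtp > bid).astype(int)                   # yes/no survey response

# Logistic model of acceptance: P(yes) = Lambda(alpha + beta * bid).
X = sm.add_constant(bid)
fit = sm.Logit(accept, X).fit(disp=0)
alpha, beta = fit.params                                # beta should be negative

# Median WTP is the bid where acceptance probability equals 0.5.
print(f"median WTP ~ {-alpha / beta:.0f} RMB/year")
```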
Praveen, Paurush; Fröhlich, Holger
2013-01-01
Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal-to-noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior for use in graphical model inference. Our first model, called Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms and protein domain data, etc. and is flexible enough to integrate new sources, if available. PMID:23826291
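The Noisy-OR combination can be stated compactly: the consensus prior probability of an interaction is 1 - prod_s (1 - q_s * x_s), where x_s is the support from source s and q_s its reliability. A minimal sketch, with invented sources and reliabilities:

```python
import numpy as np

def noisy_or(support, reliability):
    """Noisy-OR consensus: P(edge) = 1 - prod_s (1 - q_s * x_s)."""
    support = np.asarray(support, dtype=float)
    reliability = np.asarray(reliability, dtype=float)
    return 1.0 - np.prod(1.0 - reliability * support)

# Support for one candidate edge from three hypothetical sources
# (pathway database, GO term similarity, shared protein domains).
p_edge = noisy_or(support=[0.9, 0.4, 0.0], reliability=[0.8, 0.6, 0.7])
print(f"consensus prior probability: {p_edge:.3f}")   # ~0.787
```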
Hydrology and water quality of forested lands in eastern North Carolina
G.M. Chescheir; M.E. Lebo; D.M. Amatya; J. Hughes; J.W. Gilliam; R.W. Skaggs; R.B. Herrmann
2003-01-01
Nonpoint sources of nutrients (NPS) are a widespread source of surface water pollution throughout the United States. Characterizing the sources of this NPS nutrient loading is challenging due to variation in land management practices, physiographic setting, site conditions such as soil type, and climatic variation. For nutrients, there is the added challenge of...
Accuracy-preserving source term quadrature for third-order edge-based discretization
NASA Astrophysics Data System (ADS)
Nishikawa, Hiroaki; Liu, Yi
2017-09-01
In this paper, we derive a family of source term quadrature formulas for preserving third-order accuracy of the node-centered edge-based discretization for conservation laws with source terms on arbitrary simplex grids. A three-parameter family of source term quadrature formulas is derived, and as a subset, a one-parameter family of economical formulas is identified that does not require second derivatives of the source term. Among the economical formulas, a unique formula is then derived that does not require gradients of the source term at neighbor nodes, thus leading to a significantly smaller discretization stencil for source terms. All the formulas derived in this paper do not require a boundary closure, and therefore can be directly applied at boundary nodes. Numerical results are presented to demonstrate third-order accuracy at interior and boundary nodes for one-dimensional grids and linear triangular/tetrahedral grids over straight and curved geometries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paul L. Wichlacz
2003-09-01
This source-term summary document is intended to describe the current understanding of contaminant source terms and the conceptual model for potential source-term release to the environment at the Idaho National Engineering and Environmental Laboratory (INEEL), as presented in published INEEL reports. The document presents a generalized conceptual model of the sources of contamination and describes the general categories of source terms, primary waste forms, and factors that affect the release of contaminants from the waste form into the vadose zone and Snake River Plain Aquifer. Where the information has previously been published and is readily available, summaries of the inventory of contaminants are also included. Uncertainties that affect the estimation of the source term release are also discussed where they have been identified by the Source Term Technical Advisory Group. Areas in which additional information is needed (i.e., research needs) are also identified.
Long-term human exposure to lead from different media and intake pathways.
Pizzol, Massimo; Thomsen, Marianne; Andersen, Mikael Skou
2010-10-15
Lead (Pb) is well known as an environmental pollutant: it can accumulate in various media, so actual lead exposure reflects both historical and present contaminations. Two main challenges then emerge: obtaining updated information to gain an overall picture of the sources of exposure, and predicting the resulting internal body exposure levels and effects that occur under long-term exposure conditions. In this paper, a modeling approach is used to meet these challenges with reference to Danish exposure conditions. Levels of lead content in various media have been coupled with data for lead intake and absorption in the human body, for both children and adults. An age-dependent biokinetic model allows then for determination of the blood lead levels resulting from chronic exposure. The study shows that the actual intake of lead is up to 27% of the Provisional Tolerable Daily Intake (PTDI) for children and around 8% for adults. It is confirmed that the critical route of exposure is via ingestion, accounting for 99% of total lead intake, while inhalation contributes only 1% of total lead intake. The resulting lead levels in the blood after 2 years of exposure to actual contamination conditions have been estimated as up to 2.2 μg/dl in children and almost 1 μg/dl in adults. Impacts from lead can occur even at such levels. The role of historical and present sources of lead in the environment is discussed, and, for specific child and adult exposure scenarios, external-internal concentration relationships for the direct linkage between lead in environmental media and resulting concentrations of lead in blood are then presented. Copyright © 2010 Elsevier B.V. All rights reserved.
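As a toy illustration of the exposure bookkeeping involved (not the age-dependent biokinetic model used in the paper), daily uptake can be summed over media as concentration x intake rate x absorbed fraction and fed into a one-compartment blood model. All values below are invented placeholders, chosen only to land near the ~1 μg/dl adult figure reported above:

```python
import math

media = {   # medium: (concentration, daily intake, absorbed fraction) - all invented
    "diet":  (0.005, 1500.0, 0.1),   # ug/g food, g/day
    "water": (1.0, 1.5, 0.1),        # ug/L, L/day
    "soil":  (20.0, 0.02, 0.3),      # ug/g, g/day
    "air":   (0.01, 15.0, 0.5),      # ug/m3, m3/day (the ~1% inhalation route)
}
uptake = sum(c * q * a for c, q, a in media.values())   # absorbed lead [ug/day]

# One-compartment kinetics: dC/dt = uptake/V - k*C, with an assumed ~30-day
# blood half-life and a blood volume expressed in deciliters.
V_dl, half_life = 50.0, 30.0
k = math.log(2) / half_life
C = 0.0
for _ in range(730):                  # two years of daily Euler steps
    C += uptake / V_dl - k * C
print(f"uptake {uptake:.2f} ug/day -> blood lead ~{C:.2f} ug/dl")
```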
ERIC Educational Resources Information Center
Kouyoumdjian, Claudia; Guzmán, Bianca L.; Garcia, Nichole M.; Talavera-Bustillos, Valerie
2017-01-01
Growth of Latino students in postsecondary education merits an examination of their resources/challenges. A community cultural wealth model provided a framework to examine unacknowledged student resources and challenges. A mixed method approach found that first- and second-generation college students report equal numbers of sources of…
Kronholm, Scott C.; Capel, Paul D.
2015-01-01
Quantifying the relative contributions of different sources of water to a stream hydrograph is important for understanding the hydrology and water quality dynamics of a given watershed. To compare the performance of two methods of hydrograph separation, a graphical program [baseflow index (BFI)] and an end-member mixing analysis that used high-resolution specific conductance measurements (SC-EMMA) were used to estimate daily and average long-term slowflow additions of water to four small, primarily agricultural streams with different dominant sources of water (natural groundwater, overland flow, subsurface drain outflow, and groundwater from irrigation). Because the result of hydrograph separation by SC-EMMA is strongly related to the choice of slowflow and fastflow end-member values, a sensitivity analysis was conducted based on the various approaches reported in the literature to inform the selection of end-members. There were substantial discrepancies among the BFI and SC-EMMA, and neither method produced reasonable results for all four streams. Streams that had a small difference in the SC of slowflow compared with fastflow or did not have a monotonic relationship between streamflow and stream SC posed a challenge to the SC-EMMA method. The utility of the graphical BFI program was limited in the stream that had only gradual changes in streamflow. The results of this comparison suggest that the two methods may be quantifying different sources of water. Even though both methods are easy to apply, they should be applied with consideration of the streamflow and/or SC characteristics of a stream, especially where anthropogenic water sources (irrigation and subsurface drainage) are present.
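The SC-EMMA step rests on a two-component conservative-tracer mass balance: the slowflow fraction is (SC_stream - SC_fast)/(SC_slow - SC_fast). A minimal sketch follows, with placeholder end-member values; as the sensitivity analysis above underscores, results depend strongly on how these end-members are chosen.

```python
import numpy as np

def sc_emma(q_stream, sc_stream, sc_slow, sc_fast):
    """Partition streamflow into slowflow using specific conductance (SC)."""
    frac_slow = (sc_stream - sc_fast) / (sc_slow - sc_fast)
    frac_slow = np.clip(frac_slow, 0.0, 1.0)   # keep mixing fractions physical
    return frac_slow * q_stream

q = np.array([0.8, 2.5, 1.1])         # daily streamflow [m^3/s]
sc = np.array([640.0, 410.0, 580.0])  # stream specific conductance [uS/cm]
q_slow = sc_emma(q, sc, sc_slow=700.0, sc_fast=150.0)
print(q_slow)                          # slowflow contribution for each day
```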
Ramachandran, Varun; Long, Suzanna K.; Shoberg, Thomas G.; Corns, Steven; Carlo, Hector J.
2016-01-01
The majority of restoration strategies in the wake of large-scale disasters have focused on short-term emergency response solutions. Few consider medium- to long-term restoration strategies to reconnect urban areas to national supply chain interdependent critical infrastructure systems (SCICI). These SCICI promote the effective flow of goods, services, and information vital to the economic vitality of an urban environment. To re-establish the connectivity that has been broken during a disaster between the different SCICI, relationships between these systems must be identified, formulated, and added to a common framework to form a system-level restoration plan. To accomplish this goal, a considerable collection of SCICI data is necessary. The aim of this paper is to review what data are required for model construction, the accessibility of these data, and their integration with each other. While a review of publically available data reveals a dearth of real-time data to assist modeling long-term recovery following an extreme event, a significant amount of static data does exist and these data can be used to model the complex interdependencies needed. For the sake of illustration, a particular SCICI (transportation) is used to highlight the challenges of determining the interdependencies and creating models capable of describing the complexity of an urban environment with the data publically available. Integration of such data derived from public domain sources is readily achieved in a geospatial environment; after all, geospatial infrastructure data are the most abundant data source. While significant quantities of data can be acquired through public sources, a significant effort is still required to gather, develop, and integrate these data from multiple sources to build a complete model. Therefore, while continued availability of high quality, public information is essential for modeling efforts in academic as well as government communities, a more streamlined approach to real-time acquisition and integration of these data is needed.
Spatial and temporal dynamics of nitrate fluxes in a mesoscale catchment
NASA Astrophysics Data System (ADS)
Muller, C.; Musolff, A.; Strachauer, U.; Brauns, M.; Tarasova, L.; Merz, R.; Knoeller, K.
2017-12-01
Spatially and temporally variable and often superimposing processes like mobilization and turnover of N-species strongly affect nitrate fluxes at catchment outlets. It thus remains challenging to determine dominant nitrate sources to derive an effective river management. Here, we combine data sets from two spatially highly resolved key-date monitoring campaigns of nitrate fluxes along a mesoscale catchment in Germany with four years of monitoring data from two representative sites within the catchment. The study area is characterized by a strong land use gradient from pristine headwaters to lowland sub-catchments with intense agricultural land use and wastewater sources. Flow conditions were assessed by a hydrograph separation showing the clear dominance of base flow during both investigations. However, the absolute amounts of discharge differed significantly from each other (outlet: 1.42 m³ s-1 versus 0.43 m³ s-1). Nitrate concentration and flux in the headwater was found to be low. In contrast, nitrate loads further downstream originate from anthropogenic sources such as effluents from wastewater treatment plants (WWTP) and agricultural land use. The agricultural contribution did not vary in terms of nitrate concentration and isotopic signature between the years but in terms of flux. The contrasting amounts of discharge between the years led to a strongly increased relative wastewater contribution with decreasing discharge. This was mainly manifested in elevated δ18O-NO3- values downstream from the wastewater discharge. The four-year monitoring at two sites clearly indicates the chemostatic character of the agricultural N-source and its distinct, yet stable isotopic fingerprint. Denitrification was found to play no dominant role in controlling nitrate loads in the river. The spatially highly resolved monitoring approach helped to accurately define hot spots of nitrate inputs into the stream while the long-term information allowed a classification of the results with respect to the seasonal N-dynamics in the catchment.
NASA Astrophysics Data System (ADS)
Liu, Xiaomang; Luo, Yuzhou; Yang, Tiantian; Liang, Kang; Zhang, Minghua; Liu, Changming
2015-10-01
In this study, we investigate the concurrent drought probability between the water source and destination regions of the central route of China's South to North Water Diversion Project. We find that both regions have been drying from 1960 to 2013. The estimated return period of concurrent drought events in both regions is 11 years. However, since 1997, these regions have experienced 5 years of simultaneous drought. The projection results of global climate models show that the probability of concurrent drought events is highly likely to increase during 2020 to 2050. The increasing concurrent drought events will challenge the success of the water diversion project, which is a strategic attempt to resolve the water crisis of North China Plain. The data suggest great urgency in preparing adaptive measures to ensure the long-term sustainable operation of the water diversion project.
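As a back-of-envelope illustration, a concurrent-drought return period like the 11 years quoted above can be estimated as the reciprocal of the empirical joint drought probability of the two annual index series. The series and threshold below are synthetic stand-ins, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
years = 54                                   # e.g., 1960-2013
spi_source = rng.normal(size=years)          # stand-in annual drought indices
spi_destination = 0.5 * spi_source + rng.normal(scale=0.9, size=years)

threshold = -0.8                             # drought year if index < threshold
both_dry = (spi_source < threshold) & (spi_destination < threshold)
p_joint = both_dry.mean()                    # empirical joint drought probability
if p_joint > 0:
    print(f"joint drought probability {p_joint:.3f}, "
          f"return period ~{1.0 / p_joint:.1f} years")
```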
NASA Astrophysics Data System (ADS)
Heath, Julian
2005-10-01
The past decade has seen huge advances in the application of microscopy in all areas of science. This welcome development in microscopy has been paralleled by an expansion of the vocabulary of technical terms used in microscopy: terms have been coined for new instruments and techniques and, as microscopes reach even higher resolution, the use of terms that relate to the optical and physical principles underpinning microscopy is now commonplace. The Dictionary of Microscopy was compiled to meet this challenge and provides concise definitions of over 2,500 terms used in the fields of light microscopy, electron microscopy, scanning probe microscopy, x-ray microscopy and related techniques. Written by Dr Julian P. Heath, Editor of Microscopy and Analysis, the dictionary is intended to provide easy navigation through the microscopy terminology and to be a first point of reference for definitions of new and established terms. The Dictionary of Microscopy is an essential, accessible resource for: students who are new to the field and are learning about microscopes; equipment purchasers who want an explanation of the terms used in manufacturers' literature; scientists who are considering using a new microscopical technique; experienced microscopists as an aide mémoire or quick source of reference; and librarians, the press and marketing personnel who require definitions for technical reports.
Basaruddin, T.
2016-01-01
One essential task in information extraction from the medical corpus is drug name recognition. Compared with text sources from other domains, medical text mining poses more challenges, for example, more unstructured text, the fast-growing addition of new terms, a wide range of name variations for the same drug, the lack of labeled dataset sources and external knowledge, and the multiple token representations for a single drug name. Although many approaches have been proposed to address the task, some problems remain, with poor F-score performance (less than 0.75). This paper presents a new treatment in data representation techniques to overcome some of those challenges. We propose three data representation techniques based on the characteristics of word distribution and word similarities as a result of word embedding training. The first technique is evaluated with the standard NN model, that is, MLP. The second technique involves two deep network classifiers, that is, DBN and SAE. The third technique represents the sentence as a sequence that is evaluated with a recurrent NN model, that is, LSTM. In extracting the drug name entities, the third technique gives the best F-score performance compared to the state of the art, with its average F-score being 0.8645. PMID:27843447
Cost of care of haemophilia with inhibitors.
Di Minno, M N D; Di Minno, G; Di Capua, M; Cerbone, A M; Coppola, A
2010-01-01
In Western countries, the treatment of patients with inhibitors is presently the most challenging and serious issue in haemophilia management, direct costs of clotting factor concentrates accounting for >98% of the highest economic burden absorbed for the healthcare of patients in this setting. Being designed to address questions of resource allocation and effectiveness, decision models are the gold standard to reliably assess the overall economic implications of haemophilia with inhibitors in terms of mortality, bleeding-related morbidity, and severity of arthropathy. However, presently, most data analyses stem from retrospective short-term evaluations, that only allow for the analysis of direct health costs. In the setting of chronic diseases, the cost-utility analysis, that takes into account the beneficial effects of a given treatment/healthcare intervention in terms of health-related quality of life, is likely to be the most appropriate approach. To calculate net benefits, the quality-adjusted life year, which reflects such health gains, has to be compared with specific economic impacts. Differences in data sources, in medical practice and/or in healthcare systems and costs, imply that most current pharmacoeconomic analyses are confined to a narrow healthcare payer perspective. Long-term/lifetime prospective or observational studies, devoted to a careful definition of when to start a treatment; of regimens (dose and type of product) to employ, and of inhibitor population (children/adults, low-responding/high responding inhibitors) to study, are thus urgently needed to allow for newer insights, based on reliable data sources, into resource allocation, effectiveness and cost-utility analysis in the treatment of haemophiliacs with inhibitors.
Lee, Charlotte A; Sinha, Siddharth; Fitzpatrick, Emer; Dhawan, Anil
2018-06-01
Human hepatocyte transplantation has been actively pursued as an alternative to liver replacement for acute liver failure and liver-based metabolic defects. Current challenges in this field include a limited cell source, reduced cell viability following cryopreservation and poor engraftment of cells into the recipient liver, with a consequently limited life span. As a result, alternative stem cell sources such as pluripotent stem cells, fibroblasts, hepatic progenitor cells, amniotic epithelial cells and mesenchymal stem/stromal cells (MSCs) can be used to generate induced hepatocyte like cells (HLC), with each technique exhibiting advantages and disadvantages. HLCs may have comparable function to primary human hepatocytes and could offer patient-specific treatment. However, long-term functionality of transplanted HLCs and the potential oncogenic risks of using stem cells have yet to be established. The immunomodulatory effects of MSCs are promising, and multiple clinical trials are investigating their effect in cirrhosis and acute liver failure. Here, we review the current status of hepatocyte transplantation, alternative cell sources to primary human hepatocytes and their potential in liver regeneration. We also describe recent clinical trials using hepatocytes derived from stem cells and their role in improving the phenotype of several liver diseases.
Single photon sources with single semiconductor quantum dots
NASA Astrophysics Data System (ADS)
Shan, Guang-Cun; Yin, Zhang-Qi; Shek, Chan Hung; Huang, Wei
2014-04-01
In this contribution, we briefly recall the basic concepts of quantum optics and properties of semiconductor quantum dots (QDs) which are necessary for understanding the physics of single-photon generation with single QDs. Firstly, we address the theory of the quantum emitter-cavity system, the fluorescence and optical properties of semiconductor QDs, and the photon statistics as well as optical properties of the QDs. We then review the localization of single semiconductor QDs in quantum confined optical microcavity systems to achieve their overall optical properties and performances in terms of strong coupling regime, efficiency, directionality, and polarization control. Furthermore, we discuss the recent progress on the fabrication of single photon sources, and various approaches for embedding single QDs into microcavities or photonic crystal nanocavities, and show how to extend the wavelength range. We focus in particular on new generations of electrically driven QD single photon sources leading to high repetition rates, strong coupling regime, and high collection efficiencies at elevated temperature operation. In addition, new developments of room temperature single photon emission in the strong coupling regime are reviewed. The generation of indistinguishable photons and remaining challenges for practical single-photon sources are also discussed.
Balancing water scarcity and quality for sustainable irrigated agriculture
NASA Astrophysics Data System (ADS)
Assouline, Shmuel; Russo, David; Silber, Avner; Or, Dani
2015-05-01
The challenge of meeting the projected doubling of global demand for food by 2050 is monumental. It is further exacerbated by the limited prospects for land expansion and rapidly dwindling water resources. A promising strategy for increasing crop yields per unit land requires the expansion of irrigated agriculture and the harnessing of water sources previously considered "marginal" (saline, treated effluent, and desalinated water). Such an expansion, however, must carefully consider potential long-term risks on soil hydroecological functioning. The study provides critical analyses of use of marginal water and management approaches to map out potential risks. Long-term application of treated effluent (TE) for irrigation has shown adverse impacts on soil transport properties, and introduces certain health risks due to the persistent exposure of soil biota to anthropogenic compounds (e.g., promoting antibiotic resistance). The availability of desalinated water (DS) for irrigation expands management options and improves yields while reducing irrigation amounts and salt loading into the soil. Quantitative models are used to delineate trends associated with long-term use of TE and DS considering agricultural, hydrological, and environmental aspects. The primary challenges to the sustainability of agroecosystems lies with the hazards of saline and sodic conditions, and the unintended consequences on soil hydroecological functioning. Multidisciplinary approaches that combine new scientific knowhow with legislative, economic, and societal tools are required to ensure safe and sustainable use of water resources of different qualities. The new scientific knowhow should provide quantitative models for integrating key biophysical processes with ecological interactions at appropriate spatial and temporal scales.
Can fungi compete with marine sources for chitosan production?
Ghormade, V; Pathan, E K; Deshpande, M V
2017-11-01
Chitosan, a β-1,4-linked glucosamine polymer, is formed by deacetylation of chitin. It has a wide range of applications from agriculture to human health care products. Chitosan is commercially produced from shellfish, shrimp waste, crab and lobster processing using strong alkalis at high temperatures for long time periods. The production of chitin and chitosan from fungal sources has gained increased attention in recent years due to potential advantages in terms of homogeneous polymer length, high degree of deacetylation and solubility over the current marine source. Zygomycetous fungi such as Absidia coerulea, Benjaminiella poitrasii, Cunninghamella elegans, Gongrenella butleri, Mucor rouxii, Mucor racemosus and Rhizopus oryzae have been studied extensively. Isolation of chitosan is reported from a few edible basidiomycetous fungi like Agaricus bisporus, Lentinula edodes and Pleurotus sajor-caju. Other organisms from mycotech industries explored for chitosan production are Aspergillus niger, Penicillium chrysogenum, Saccharomyces cerevisiae and other wine yeasts. A number of aspects, such as value addition to the existing applications of fungi, utilization of waste from the agriculture sector, issues and challenges for the production of fungal chitosan to compete with existing sources, metabolic engineering, and novel applications, have been discussed to assess the potential of fungal sources for commercial chitosan production. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Rajaona, Harizo; Septier, François; Armand, Patrick; Delignon, Yves; Olry, Christophe; Albergel, Armand; Moussafir, Jacques
2015-12-01
In the eventuality of an accidental or intentional atmospheric release, the reconstruction of the source term using measurements from a set of sensors is an important and challenging inverse problem. A rapid and accurate estimation of the source allows faster and more efficient action for first-response teams, in addition to providing better damage assessment. This paper presents a Bayesian probabilistic approach to estimate the location and the temporal emission profile of a pointwise source. The release rate is evaluated analytically by using a Gaussian assumption on its prior distribution, and is enhanced with a positivity constraint to improve the estimation. The source location is obtained by the means of an advanced iterative Monte-Carlo technique called Adaptive Multiple Importance Sampling (AMIS), which uses a recycling process at each iteration to accelerate its convergence. The proposed methodology is tested using synthetic and real concentration data in the framework of the Fusion Field Trials 2007 (FFT-07) experiment. The quality of the obtained results is comparable to those coming from the Markov Chain Monte Carlo (MCMC) algorithm, a popular Bayesian method used for source estimation. Moreover, the adaptive processing of the AMIS provides a better sampling efficiency by reusing all the generated samples.
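To make the recycling idea concrete, the toy sketch below adapts a Gaussian proposal over source location and reweights all past samples against the equally weighted mixture of past proposals, the core AMIS mechanism. The inverse-distance forward model, sensor layout, and noise level are deliberate simplifications, not the paper's dispersion physics or data:

```python
import numpy as np
from scipy.stats import multivariate_normal as mvn

sensors = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0], [4.0, 4.0]])

def forward(x):
    # Crude inverse-distance stand-in for an atmospheric dispersion model.
    return 1.0 / (np.linalg.norm(sensors - x, axis=-1) + 0.1)

rng = np.random.default_rng(1)
obs = forward(np.array([2.6, 1.2])) + rng.normal(0.0, 0.02, size=4)

def log_target(x):
    r = obs - forward(x)
    return -0.5 * np.dot(r, r) / 0.02 ** 2     # Gaussian likelihood, flat prior

mean, cov = np.array([2.0, 2.0]), 4.0 * np.eye(2)
past, samples, logp = [], [], []
for _ in range(5):
    past.append((mean.copy(), cov.copy()))
    xs = mvn.rvs(mean, cov, size=200, random_state=rng)
    samples.append(xs)
    logp.append(np.array([log_target(x) for x in xs]))
    all_x, lp = np.vstack(samples), np.concatenate(logp)
    # Recycling step: weight every sample ever drawn against the equally
    # weighted mixture of all proposals used so far.
    mix = np.mean([mvn.pdf(all_x, m, c) for m, c in past], axis=0)
    w = np.exp(lp - lp.max()) / mix
    w /= w.sum()
    mean = w @ all_x                           # adapt the proposal moments
    d = all_x - mean
    cov = (w[:, None] * d).T @ d + 1e-6 * np.eye(2)

print("posterior mean source location ~", mean)
```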
NASA Astrophysics Data System (ADS)
Trugman, Daniel T.; Shearer, Peter M.
2017-04-01
Earthquake source spectra contain fundamental information about the dynamics of earthquake rupture. However, the inherent tradeoffs in separating source and path effects, when combined with limitations in recorded signal bandwidth, make it challenging to obtain reliable source spectral estimates for large earthquake data sets. We present here a stable and statistically robust spectral decomposition method that iteratively partitions the observed waveform spectra into source, receiver, and path terms. Unlike previous methods of its kind, our new approach provides formal uncertainty estimates and does not assume self-similar scaling in earthquake source properties. Its computational efficiency allows us to examine large data sets (tens of thousands of earthquakes) that would be impractical to analyze using standard empirical Green's function-based approaches. We apply the spectral decomposition technique to P wave spectra from five areas of active contemporary seismicity in Southern California: the Yuha Desert, the San Jacinto Fault, and the Big Bear, Landers, and Hector Mine regions of the Mojave Desert. We show that the source spectra are generally consistent with an increase in median Brune-type stress drop with seismic moment but that this observed deviation from self-similar scaling is both model dependent and varies in strength from region to region. We also present evidence for significant variations in median stress drop and stress drop variability on regional and local length scales. These results both contribute to our current understanding of earthquake source physics and have practical implications for the next generation of ground motion prediction assessments.
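To make the decomposition concrete, here is a hedged sketch under strong simplifications: each observed log-amplitude spectrum is modeled as a source term plus a station term (the path term and the paper's uncertainty estimation are omitted), and the two are estimated by alternating averages with a demeaning constraint that breaks the source/receiver trade-off.

```python
# Hedged sketch of iterative spectral decomposition on synthetic data.
import numpy as np

rng = np.random.default_rng(1)
n_eq, n_sta, n_freq = 500, 30, 20
true_src = rng.normal(0.0, 1.0, (n_eq, n_freq))
true_sta = rng.normal(0.0, 0.3, (n_sta, n_freq))
# Each record's log spectrum = source term + station term + noise.
obs = (true_src[:, None, :] + true_sta[None, :, :]
       + 0.1 * rng.normal(size=(n_eq, n_sta, n_freq)))

src = np.zeros((n_eq, n_freq))
sta = np.zeros((n_sta, n_freq))
for _ in range(20):
    src = (obs - sta[None, :, :]).mean(axis=1)   # average residual over stations
    sta = (obs - src[:, None, :]).mean(axis=0)   # average residual over events
    sta -= sta.mean(axis=0)                      # one constraint per frequency

ref = true_sta - true_sta.mean(axis=0)
print("station-term correlation:", np.corrcoef(sta.ravel(), ref.ravel())[0, 1])
```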
Maintaining Quality and Confidence in Open-Source, Evolving Software: Lessons Learned with PFLOTRAN
NASA Astrophysics Data System (ADS)
Frederick, J. M.; Hammond, G. E.
2017-12-01
Software evolution in an open-source framework poses a major challenge to a geoscientific simulator, but when properly managed, the pay-off can be enormous for both the developers and the community at large. Developers must juggle implementing new scientific process models, adopting increasingly efficient numerical methods and programming paradigms, and coping with changing funding sources (or a total lack of funding), all while ensuring that legacy code remains functional and reported bugs are fixed in a timely manner. With robust software engineering and a plan for long-term maintenance, a simulator can evolve over time, incorporating and leveraging many advances in the computational and domain sciences. In this positive light, what practices in software engineering and code maintenance can be employed within open-source development to maximize the positive aspects of software evolution and community contributions while minimizing their negative side effects? This presentation discusses steps taken in the development of PFLOTRAN (www.pflotran.org), an open source, massively parallel subsurface simulator for multiphase, multicomponent, and multiscale reactive flow and transport processes in porous media. As PFLOTRAN's user base and development team continue to grow, it has become increasingly important to implement strategies which ensure sustainable software development while maintaining software quality and community confidence. In this presentation, we will share our experiences and "lessons learned" within the context of our open-source development framework and community engagement efforts. Topics discussed will include how we have leveraged both standard software engineering principles, such as coding standards, version control, and automated testing, as well as the unique advantages of object-oriented design in process model coupling, to ensure software quality and confidence. We will also be prepared to discuss the major challenges faced by most open-source software teams, such as on-boarding new developers or one-time contributors, dealing with competitors or lookie-loos, and other downsides of complete transparency, as well as our approach to community engagement, including a user group email list, hosting short courses and workshops for new users, and maintaining a website. SAND2017-8174A
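As one concrete flavor of the automated testing mentioned above, regression checks against stored "gold" results are a common pattern in simulator development; the sketch below is a generic illustration of that pattern, not PFLOTRAN's actual test harness.

```python
# Generic gold-standard regression test (illustrative pattern only).
import numpy as np

def check_regression(new, gold, rtol=1e-6):
    # Fail loudly if a code change drifts the output from the stored result.
    assert np.allclose(new, gold, rtol=rtol), "output drifted from gold standard"

gold = np.array([1.0, 2.0, 3.0])                     # result from a trusted run
check_regression(np.array([1.0, 2.0, 3.0]), gold)    # passes silently
```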
Neutze, Richard; Moffat, Keith
2012-01-01
X-ray free electron lasers (XFELs) are potentially revolutionary X-ray sources because of their very short pulse duration, extreme peak brilliance and high spatial coherence, features that distinguish them from today’s synchrotron sources. We review recent time-resolved Laue diffraction and time-resolved wide angle X-ray scattering (WAXS) studies at synchrotron sources, and initial static studies at XFELs. XFELs have the potential to transform the field of time-resolved structural biology, yet many challenges arise in devising and adapting hardware, experimental design and data analysis strategies to exploit their unusual properties. Despite these challenges, we are confident that XFEL sources are poised to shed new light on ultrafast protein reaction dynamics. PMID:23021004
Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P
2016-10-01
An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for the operators and the public, and for the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial key safety issue to be addressed in future reactor safety assessments, and the estimates available to date are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms, starting from a broad information-gathering effort. The wide number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.
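As a hedged illustration of the screening step, the sketch below ranks stand-in input parameters by standardized regression coefficients fit to Monte-Carlo samples; the dust-production response and the parameter names are invented, not taken from the paper.

```python
# Parameter screening via standardized regression coefficients (illustrative).
import numpy as np

rng = np.random.default_rng(2)
n = 2000
params = {
    "plasma_energy":    rng.uniform(0.5, 2.0, n),
    "wall_temperature": rng.uniform(300.0, 900.0, n),
    "surface_area":     rng.uniform(10.0, 100.0, n),
}
X = np.column_stack(list(params.values()))
# Stand-in response: dust mass dominated by energy and exposed area.
y = 3.0 * X[:, 0] + 0.001 * X[:, 1] + 0.05 * X[:, 2] + rng.normal(0.0, 0.5, n)

Xs = (X - X.mean(axis=0)) / X.std(axis=0)            # standardize inputs
ys = (y - y.mean()) / y.std()
coefs, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, c in sorted(zip(params, coefs), key=lambda t: -abs(t[1])):
    print(f"{name}: standardized coefficient = {c:+.3f}")
```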
Data Visualization Challenges and Opportunities in User-Oriented Application Development
NASA Astrophysics Data System (ADS)
Pilone, D.; Quinn, P.; Mitchell, A. E.; Baynes, K.; Shum, D.
2015-12-01
This talk introduces the audience to some of the very real challenges associated with visualizing data from disparate data sources as encountered during the development of real-world applications. In addition to the fundamental challenges of dealing with the data and imagery, this talk discusses usability problems encountered while trying to provide interactive and user-friendly visualization tools. At the end of this talk the audience will be aware of some of the pitfalls of data visualization, along with tools and techniques to help mitigate them. There are many sources of variable-resolution visualizations of science data available to application developers, including NASA's Global Imagery Browse Services (GIBS); however, integrating and leveraging visualizations in modern applications faces a number of challenges, including:
- Varying visualized Earth "tile sizes" resulting in challenges merging disparate sources
- Multiple visualization frameworks and toolkits with varying strengths and weaknesses
- Global composite imagery vs. imagery matching EOSDIS granule distribution
- Challenges visualizing geographically overlapping data with different temporal bounds
- User interaction with overlapping or collocated data
- Complex data boundaries and shapes combined with multi-orbit data and polar projections
- Discovering the availability of visualizations and the specific parameters, color palettes, and configurations used to produce them
In addition to discussing the challenges and approaches involved in visualizing disparate data, we will discuss solutions and components we'll be making available as open source to encourage reuse and accelerate application development.
Network Ecology and Adolescent Social Structure
McFarland, Daniel A.; Moody, James; Diehl, David; Smith, Jeffrey A.; Thomas, Reuben J.
2014-01-01
Adolescent societies—whether arising from weak, short-term classroom friendships or from close, long-term friendships—exhibit various levels of network clustering, segregation, and hierarchy. Some are rank-ordered caste systems and others are flat, cliquish worlds. Explaining the source of such structural variation remains a challenge, however, because global network features are generally treated as the agglomeration of micro-level tie-formation mechanisms, namely balance, homophily, and dominance. How do the same micro-mechanisms generate significant variation in global network structures? To answer this question we propose and test a network ecological theory that specifies the ways features of organizational environments moderate the expression of tie-formation processes, thereby generating variability in global network structures across settings. We develop this argument using longitudinal friendship data on schools (Add Health study) and classrooms (Classroom Engagement study), and by extending exponential random graph models to the study of multiple societies over time. PMID:25535409
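As a purely schematic illustration of the paper's core question, the toy simulation below shows how the same micro-mechanisms (here, homophily and a crude triadic-closure bonus) can yield different global segregation levels as a setting-level parameter varies; it is not the authors' ERGM specification.

```python
# Toy tie-formation dynamics: fixed micro-mechanisms, varying global outcomes.
import numpy as np

rng = np.random.default_rng(3)

def simulate(n=60, homophily=1.0, closure=1.0, steps=5000):
    group = rng.integers(0, 2, n)              # two attribute groups
    A = np.zeros((n, n), dtype=int)            # adjacency matrix
    for _ in range(steps):
        i, j = rng.integers(0, n, 2)
        if i == j:
            continue
        same = group[i] == group[j]
        shared = (A[i] & A[j]).sum()           # number of common friends
        logit = -3.0 + homophily * same + closure * (shared > 0)
        p = 1.0 / (1.0 + np.exp(-logit))
        A[i, j] = A[j, i] = rng.random() < p   # tie formed or dissolved
    within = A[group[:, None] == group[None, :]].mean()
    between = A[group[:, None] != group[None, :]].mean()
    return within / max(between, 1e-9)         # segregation ratio

for h in (0.0, 1.0, 2.0):
    print(f"homophily={h}: segregation ratio = {simulate(homophily=h):.2f}")
```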
Beddows, Patricia A; Mallon, Edward K
2018-02-09
A low-cost data logging platform is presented that provides long-term operation in remote or submerged environments. Three premade "breakout boards" from the open-source Arduino ecosystem are assembled into the core of the data logger. Power optimization techniques are presented which extend the operational life of this module-based design to >1 year on three alkaline AA batteries. Robust underwater housings are constructed for these loggers using PVC fittings. Both the logging platform and the enclosures are easy to build and modify without specialized tools or a significant background in electronics. This combination turns the Cave Pearl data logger into a generalized prototyping system, and this design flexibility is demonstrated with two field studies recording drip rates in a cave and water flow in a flooded cave system. This paper describes a complete DIY solution, suitable for a wide range of challenging deployment conditions.
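As a back-of-envelope check on the quoted battery life, the duty-cycle arithmetic can be sketched as below; the current draws and wake time are assumed round numbers, not measurements from the paper.

```python
# Duty-cycle power budget for a sleep-dominated logger (assumed figures).
# Three AAs in series add voltage, not capacity, so the pack is ~2000 mAh.
pack_mAh = 2000.0
sleep_mA, active_mA = 0.05, 20.0        # assumed sleep vs. logging currents
active_s_per_hour = 5.0                 # assumed seconds awake per hour

avg_mA = (active_mA * active_s_per_hour
          + sleep_mA * (3600.0 - active_s_per_hour)) / 3600.0
days = pack_mAh / avg_mA / 24.0
print(f"average draw {avg_mA:.3f} mA -> roughly {days:.0f} days")  # > 1 year
```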
Enhancing managerial effectiveness in dietetics.
Hoover, L W
1983-01-01
Environmental pressures from such sources as economic conditions, the government, third-party payers, and inter-institutional competition create managerial challenges. Although cost-containment has received considerable attention, long-term cost-effectiveness is probably the significant issue. Dietitians must become more cost-conscious and effective in resource management to attain desired performance outcomes. Some of the skills and characteristics essential to managerial effectiveness are a marketing orientation, systems design skill, quantitative operations management techniques, financial expertise, and leadership. These abilities facilitate decision-making and achievement of long-term cost-effectiveness. Curriculum enhancement and continuing education are two strategies for improving managerial competency in the dietetics profession. In dietetics education, study of management topics should be enhanced to provide more advanced coverage of management theories and quantitative models so that managerial performance can be at a higher level of sophistication and competency. To assure the viability of the dietetics profession, the emphasis on management must be more comprehensive and rigorous.
Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong
2015-07-01
This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.
Blöcher, C
2007-01-01
Industrial wastewater, especially from chemical and pharmaceutical production, often contains substances that need to be eliminated before discharge into a biological treatment plant and the receiving water bodies downstream. This can be done within the production process itself, in selected wastewater streams, or in a central treatment plant. Each of these approaches has certain advantages and disadvantages. Furthermore, a variety of wastewater treatment processes exist that can be applied at each stage, making it a challenging task to choose the best one in economic and ecological terms. In this work a general approach for doing so and examples from practice are discussed.
Tracking PACS usage with open source tools.
French, Todd L; Langer, Steve G
2011-08-01
A typical choice faced by Picture Archiving and Communication System (PACS) administrators is deciding how many PACS workstations are needed and where they should be sited. Oftentimes, the social consequences of having too few are severe enough to encourage oversupply and underutilization. This is costly, at best in terms of hardware and electricity, and at worst (depending on the PACS licensing and support model) in capital costs and maintenance fees. The PACS administrator needs tools to accurately assess the use to which her fleet is being subjected, and thus make informed choices before buying more workstations. Lacking a vended solution for this challenge, we developed our own.
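The kind of usage accounting described can be sketched as a small log-aggregation script; the session-log format below is invented for illustration and is not the authors' system.

```python
# Aggregate workstation session events into per-station utilization hours.
from collections import defaultdict
from datetime import datetime

log_lines = [                          # invented log format: station,event,time
    "WS-03,login,2011-05-02T08:01:00",
    "WS-03,logout,2011-05-02T09:30:00",
    "WS-07,login,2011-05-02T08:15:00",
    "WS-07,logout,2011-05-02T08:20:00",
]

usage = defaultdict(float)
open_sessions = {}
for line in log_lines:
    station, event, stamp = line.split(",")
    t = datetime.fromisoformat(stamp)
    if event == "login":
        open_sessions[station] = t
    elif station in open_sessions:     # close the matching session
        usage[station] += (t - open_sessions.pop(station)).total_seconds() / 3600

for station, hours in sorted(usage.items(), key=lambda kv: -kv[1]):
    print(f"{station}: {hours:.2f} h")
```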
Exploratory visualization of astronomical data on ultra-high-resolution wall displays
NASA Astrophysics Data System (ADS)
Pietriga, Emmanuel; del Campo, Fernando; Ibsen, Amanda; Primet, Romain; Appert, Caroline; Chapuis, Olivier; Hempel, Maren; Muñoz, Roberto; Eyheramendy, Susana; Jordan, Andres; Dole, Hervé
2016-07-01
Ultra-high-resolution wall displays feature a very high pixel density over a large physical surface, which makes them well-suited to the collaborative, exploratory visualization of large datasets. We introduce FITS-OW, an application designed for such wall displays, that enables astronomers to navigate in large collections of FITS images, query astronomical databases, and display detailed, complementary data and documents about multiple sources simultaneously. We describe how astronomers interact with their data using both the wall's touch-sensitive surface and handheld devices. We also report on the technical challenges we addressed in terms of distributed graphics rendering and data sharing over the computer clusters that drive wall displays.
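On the data side of such an application, loading a FITS image and its world coordinate system is the step that precedes tiling and source overlays. The hedged sketch below builds a small FITS image in memory with astropy (the header values are illustrative) so that it runs without a data file.

```python
# Load FITS pixels and WCS, the inputs a renderer needs for sky overlays.
import numpy as np
from astropy.io import fits
from astropy.wcs import WCS

# Build a small FITS image in memory; header keywords are illustrative.
hdu = fits.PrimaryHDU(data=np.random.default_rng(6).random((64, 64)))
hdu.header.update({"CTYPE1": "RA---TAN", "CTYPE2": "DEC--TAN",
                   "CRVAL1": 150.0, "CRVAL2": 2.0,
                   "CRPIX1": 32.0, "CRPIX2": 32.0,
                   "CDELT1": -0.0003, "CDELT2": 0.0003})

data = hdu.data                        # pixel array handed to the renderer
wcs = WCS(hdu.header)                  # pixel -> sky mapping for overlays
ra, dec = wcs.wcs_pix2world(32.0, 32.0, 0)
print(data.shape, float(ra), float(dec))
```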
Pereira, André Luiz; de Vasconcelos Barros, Raphael Tobias; Pereira, Sandra Rosa
2017-11-01
Pharmacopollution is a public health and environmental outcome of some active pharmaceutical ingredients (API) and endocrine-disrupting compounds (EDC) dispersed through water and/or soil. Its most important sources are the pharmaceutical industry, healthcare facilities (e.g., hospitals), livestock, aquaculture, and households (patients' excretion and littering). The last source is the focus of this article. Research questions are "What is the Household Waste Medicine (HWM) phenomenon?", "How HWM and pharmacopollution are related?", and "Why is a reverse logistic system necessary for HWM in Brazil?" This article followed the seven steps proposed by Rother (2007) for a systematic review based on the Cochrane Handbook and the National Health Service (NHS) Center for Reviews Dissemination (CDR) Report. The HWM phenomenon brings many environmental, public health, and, social challenges. The insufficient data is a real challenge to assessing potential human health risks and API concentrations. Therefore, the hazard of long-term exposure to low concentrations of pharmacopollutants and the combined effects of API mixtures is still uncertain. HWM are strongly related to pharmacopollution, as this review shows. The Brazilian HWM case is remarkable because it is the fourth pharmaceutical market (US$ 65,971 billion), with a wide number of private pharmacies and drugstores (3.3: 10,000 pharmacy/inhabitants), self-medication habits, and no national take-back program. The HWM generation is estimated in 56.6 g/per capita, or 10,800 t/year. The absence of a reverse logistics for HWM can lead to serious environmental and public health challenges. The sector agreement for HWM is currently under public consultation.
Bogard, Jessica R; Farook, Sami; Marks, Geoffrey C; Waid, Jillian; Belton, Ben; Ali, Masum; Toufique, Kazi; Mamun, Abdulla; Thilsted, Shakuntala H
2017-01-01
Malnutrition is one of the biggest challenges of the 21st century, with one in three people in the world malnourished, combined with poor diets being the leading cause of the global burden of disease. Fish is an under-recognised and undervalued source of micronutrients, which could play a more significant role in addressing this global challenge. With rising pressures on capture fisheries, demand is increasingly being met from aquaculture. However, aquaculture systems are designed to maximise productivity, with little consideration for nutritional quality of fish produced. A global shift away from diverse capture species towards consumption of few farmed species, has implications for diet quality that are yet to be fully explored. Bangladesh provides a useful case study of this transition, as fish is the most important animal-source food in diets, and is increasingly supplied from aquaculture. We conducted a temporal analysis of fish consumption and nutrient intakes from fish in Bangladesh, using nationally representative household expenditure surveys from 1991, 2000 and 2010 (n = 25,425 households), combined with detailed species-level nutrient composition data. Fish consumption increased by 30% from 1991-2010. Consumption of non-farmed species declined by 33% over this period, compensated (in terms of quantity) by large increases in consumption of farmed species. Despite increased total fish consumption, there were significant decreases in iron and calcium intakes from fish (P<0.01); and no significant change in intakes of zinc, vitamin A and vitamin B12 from fish, reflecting lower overall nutritional quality of fish available for consumption over time. Our results challenge the conventional narrative that increases in food supply lead to improvements in diet and nutrition. As aquaculture becomes an increasingly important food source, it must embrace a nutrition-sensitive approach, moving beyond maximising productivity to also consider nutritional quality. Doing so will optimise the complementary role that aquaculture and capture fisheries play in improving nutrition and health.
Large Energy Development Projects: Lessons Learned from Space and Politics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmitt, Harrison H.
2005-04-15
The challenge to the global energy future lies in meeting the needs and aspirations of the ten to twelve billion earthlings that will be on this planet by 2050. At least an eight-fold increase in annual production will be required by the middle of this century. The energy sources that can be considered developed and 'in the box' for consideration as sources for major increases in supply over the next half century are fossil fuels, nuclear fission, and, to a lesser degree, various forms of direct and stored solar energy and conservation. None of these near-term sources of energy will provide an eight-fold or more increase in energy supply, for various technical, environmental and political reasons. Only a few potential energy sources that fall 'out of the box' appear worthy of additional consideration as possible contributors to energy demand in 2050 and beyond. These particular candidates are deuterium-tritium fusion, space solar energy, and lunar helium-3 fusion. The primary advantage that lunar helium-3 fusion will have over other 'out of the box' energy sources in the pre-2050 timeframe is a clear path into the private capital markets. The development and demonstration of new energy sources will require several development paths, each of Apollo-like complexity and each with sub-paths of parallel development for critical functions and components.
Managing and Integrating Open Environmental Data - Technological Requirements and Challenges
NASA Astrophysics Data System (ADS)
Devaraju, Anusuriya; Kunkel, Ralf; Jirka, Simon
2014-05-01
Understanding environment conditions and trends requires information. This information is usually generated from sensor observations. Today, several infrastructures (e.g., GEOSS, EarthScope, NEON, NETLAKE, OOI, TERENO, WASCAL, and PEER-EurAqua) have been deployed to promote full and open exchange of environmental data. Standards for interfaces as well as data models/formats (OGC, CUAHSI, INSPIRE, SEE Grid, ISO) and open source tools have been developed to support seamless data exchange between various domains and organizations. In spite of this growing interest, it remains a challenge to manage and integrate open environmental data on the fly due to the distributed and heterogeneous nature of the data. Intuitive tools and standardized interfaces are vital to hide the technical complexity of underlying data management infrastructures. Meaningful descriptions of raw sensor data are necessary to achieve interoperability among different sources. As raw sensor data sets usually go through several layers of summarization and aggregation, the metadata and quality measures associated with these should be captured. Further processing of sensor data sets requires that they be made compatible with existing environmental models. We need data policies and management plans on how to handle and publish open sensor data coming from different institutions. Clearly, better management and usability of open environmental data are crucial, not only to gather large amounts of data, but also to cater for various aspects such as data integration, privacy and trust, uncertainty, quality control, visualization, and data management policies. The proposed talk presents several key findings in terms of requirements, ongoing developments and technical challenges concerning these aspects from our recent work. This includes two workshops on open observation data and supporting tools, as well as the long-term environmental monitoring initiatives such as TERENO and TERENO-MED. Workshops Details: Spin the Sensor Web: Sensor Web Workshop 2013, Muenster, 21st-22nd November 2013 (http://52north.org/news/spin-the-sensor-web-sensor-web-workshop-2013) Special Session on Management of Open Environmental Observation Data - MOEOD 2014, Lisbon, 8th January 2014 (http://www.sensornets.org/MOEOD.aspx?y=2014) Monitoring Networks: TERENO : http://teodoor.icg.kfa-juelich.de/ TERENO-MED : http://www.tereno-med.net/
NASA Astrophysics Data System (ADS)
Sandefur, Heather Nicole
Microalgal biomass has been identified as a promising feedstock for a number of industrial applications, including the synthesis of new pharmaceutical and biofuel products. However, there are several economic limitations associated with the scale up of existing algal production processes. Critical economic studies of algae-based industrial processes highlight the high cost of supplying essential nutrients to microalgae cultures. With microalgae cells having relatively high nitrogen contents (4 to 8%), the N fertilizer cost in industrial-scale production is significant. In addition, the disposal of the large volumes of cell residuals that are generated during product extraction stages can pose other economic challenges. While waste streams can provide a concentrated source of nutrients, concerns about the presence of biological contaminants and the expense of heat treatment pose challenges to processes that use wastewater as a nutrient source in microalgae cultures. The goal of this study was to evaluate the potential application of ultrafiltration technology to aid in the utilization of agricultural wastewater in the cultivation of a high-value microalgae strain. An ultrafiltration system was used to remove inorganic solids and biological contaminants from wastewater taken from a swine farm in Savoy, Arkansas. The permeate from the system was then used as the nutrient source for the cultivation of the marine microalgae Porphyridium cruentum. During the ultrafiltration system operation, little membrane fouling was observed, and permeate fluxes remained relatively constant during both short-term and long-term tests. The complete rejection of E. coli and coliforms from the wastewater was also observed, in addition to a 75% reduction in total solids, including inorganic materials. The processed permeate was shown to have very high concentrations of total nitrogen (695.6 mg L⁻¹) and total phosphorus (69.1 mg L⁻¹). In addition, the growth of P. cruentum was analyzed in a medium containing swine waste permeate, and was compared to P. cruentum growth in a control medium. A higher biomass productivity, lipid productivity, and lipid content were observed in the microalgae cultivated in the swine waste medium compared to that of the control medium. These results suggest that, through the use of ultrafiltration technology as an alternative to traditional heat treatment, agricultural wastewaters could be effectively utilized as a nutrient source for microalgae cultivation.
NASA Astrophysics Data System (ADS)
Nigam, Kaushal; Kondekar, Pravin; Sharma, Dheeraj; Raad, Bhagwan Ram
2016-10-01
For the first time, a distinctive approach based on the electrically doped concept is used for the formation of a novel double-gate tunnel field-effect transistor (TFET). For this, the initially heavily doped n+ substrate is converted into n+-i-n+-i (drain-channel-source) by the selection of appropriate work functions of the control gate (CG) and polarity gate (PG) as 4.7 eV. Further, the formation of the p+ region for the source is performed by applying -1.2 V at the PG. Hence, the structure behaves like an n+-i-n+-p+ gated TFET, whereas the control gate is used to modulate the effective tunneling barrier width. The physical realization of a delta-doped n+ layer near the source region is a challenging task for improving device performance in terms of ON current and subthreshold slope. Thus, the proposed work provides a better platform for fabrication of n+-i-n+-p+ TFETs with low cost and suppressed random dopant fluctuation (RDF) effects. The ATLAS TCAD device simulator is used to carry out the simulation work.
High speed imaging of dynamic processes with a switched source x-ray CT system
NASA Astrophysics Data System (ADS)
Thompson, William M.; Lionheart, William R. B.; Morton, Edward J.; Cunningham, Mike; Luggar, Russell D.
2015-05-01
Conventional x-ray computed tomography (CT) scanners are limited in their scanning speed by the mechanical constraints of their rotating gantries and as such do not provide the necessary temporal resolution for imaging of fast-moving dynamic processes, such as moving fluid flows. The Real Time Tomography (RTT) system is a family of fast cone beam CT scanners which instead use multiple fixed discrete sources and complete rings of detectors in an offset geometry. We demonstrate the potential of this system for use in the imaging of such high speed dynamic processes and give results using simulated and real experimental data. The unusual scanning geometry results in some challenges in image reconstruction, which are overcome using algebraic iterative reconstruction techniques and explicit regularisation. Through the use of a simple temporal regularisation term and by optimising the source firing pattern, we show that temporal resolution of the system may be increased at the expense of spatial resolution, which may be advantageous in some situations. Results are given showing temporal resolution of approximately 500 µs with simulated data and 3 ms with real experimental data.
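A minimal sketch of the reconstruction idea, under strong simplifications: Landweber-style algebraic iterations over several time frames, with a quadratic temporal-smoothness penalty coupling adjacent frames. The random system matrix below stands in for the RTT geometry, which is far more structured, and the regularizer is simpler than the paper's.

```python
# Algebraic iterative reconstruction with a temporal regularization term.
import numpy as np

rng = np.random.default_rng(4)
n_pix, n_rays, n_frames = 100, 60, 5           # deliberately underdetermined
A = rng.random((n_rays, n_pix))                # stand-in system matrix
x_true = rng.random((n_frames, n_pix))
b = x_true @ A.T + 0.01 * rng.standard_normal((n_frames, n_rays))

x = np.zeros((n_frames, n_pix))
step, lam = 1e-4, 5.0                          # gradient step, temporal weight
for _ in range(500):
    grad = (x @ A.T - b) @ A                   # data-fit gradient per frame
    tgrad = np.zeros_like(x)                   # gradient of sum ||x_t - x_{t-1}||^2
    tgrad[1:] += x[1:] - x[:-1]
    tgrad[:-1] -= x[1:] - x[:-1]
    x -= step * (grad + lam * tgrad)           # Landweber-style update

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

Raising the temporal weight trades spatial fidelity within a frame for smoothness across frames, which mirrors the abstract's point that temporal resolution can be bought at the expense of spatial resolution.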
Gardiner, James; Gunarathne, Nuwan; Howard, David; Kenney, Laurence
2016-01-01
Collecting large datasets of amputee gait data is notoriously difficult. Additionally, collecting data on less prevalent amputations or on gait activities other than level walking and running on hard surfaces is rarely attempted. However, with the wealth of user-generated content on the Internet, the scope for collecting amputee gait data from alternative sources other than traditional gait labs is intriguing. Here we investigate the potential of YouTube videos to provide gait data on amputee walking. We use an example dataset of trans-femoral amputees level walking at self-selected speeds to collect temporal gait parameters and calculate gait asymmetry. We compare our YouTube data with typical literature values, and show that our methodology produces results that are highly comparable to data collected in a traditional manner. The similarity between the results of our novel methodology and literature values lends confidence to our technique. Nevertheless, clear challenges with the collection and interpretation of crowd-sourced gait data remain, including long term access to datasets, and a lack of validity and reliability studies in this area.
Whiskers aid anemotaxis in rats.
Yu, Yan S W; Graff, Matthew M; Bresee, Chris S; Man, Yan B; Hartmann, Mitra J Z
2016-08-01
Observation of terrestrial mammals suggests that they can follow the wind (anemotaxis), but the sensory cues underlying this ability have not been studied. We identify a significant contribution to anemotaxis mediated by whiskers (vibrissae), a modality previously studied only in the context of direct tactile contact. Five rats trained on a five-alternative forced-choice airflow localization task exhibited significant performance decrements after vibrissal removal. In contrast, vibrissal removal did not disrupt the performance of control animals trained to localize a light source. The performance decrement of individual rats was related to their airspeed threshold for successful localization: animals that found the task more challenging relied more on the vibrissae for localization cues. Following vibrissal removal, the rats deviated more from the straight-line path to the air source, choosing sources farther from the correct location. Our results indicate that rats can perform anemotaxis and that whiskers greatly facilitate this ability. Because air currents carry information about both odor content and location, these findings are discussed in terms of the adaptive significance of the interaction between sniffing and whisking in rodents.
Diamond-based single-photon emitters
NASA Astrophysics Data System (ADS)
Aharonovich, I.; Castelletto, S.; Simpson, D. A.; Su, C.-H.; Greentree, A. D.; Prawer, S.
2011-07-01
The exploitation of emerging quantum technologies requires efficient fabrication of key building blocks. Sources of single photons are extremely important across many applications as they can serve as vectors for quantum information—thereby allowing long-range (perhaps even global-scale) quantum states to be made and manipulated for tasks such as quantum communication or distributed quantum computation. At the single-emitter level, quantum sources also afford new possibilities in terms of nanoscopy and bio-marking. Color centers in diamond are prominent candidates to generate and manipulate quantum states of light, as they are a photostable solid-state source of single photons at room temperature. In this review, we discuss the state of the art of diamond-based single-photon emitters and highlight their fabrication methodologies. We present the experimental techniques used to characterize the quantum emitters and discuss their photophysical properties. We outline a number of applications including quantum key distribution, bio-marking and sub-diffraction imaging, where diamond-based single emitters are playing a crucial role. We conclude with a discussion of the main challenges and perspectives for employing diamond emitters in quantum information processing.
Spatial sound field synthesis and upmixing based on the equivalent source method.
Bai, Mingsian R; Hsu, Hoshen; Wen, Jheng-Ciang
2014-01-01
Given a scarce number of recorded signals, spatial sound field synthesis with an extended sweet spot is a challenging problem in acoustic array signal processing. To address the problem, a synthesis and upmixing approach inspired by the equivalent source method (ESM) is proposed. The synthesis procedure is based on the pressure signals recorded by a microphone array and requires no source model. The array geometry can also be arbitrary. Four upmixing strategies are adopted to enhance the resolution of the reproduced sound field when there are more channels of loudspeakers than microphones. Multi-channel inverse filtering with regularization is exploited to deal with the ill-posedness in the reconstruction process. The distance between the microphone and loudspeaker arrays is optimized to achieve the best synthesis quality. To validate the proposed system, numerical simulations and subjective listening experiments are performed. The results demonstrated that all upmixing methods improved the quality of the reproduced target sound field over the original reproduction. In particular, the underdetermined ESM interpolation method yielded the best spatial sound field synthesis in terms of the reproduction error, timbral quality, and spatial quality.
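The core numerical step, a regularized inversion from recorded microphone pressures to equivalent-source strengths, can be sketched in the frequency domain as below; the geometry, frequency, and regularization choice are illustrative assumptions, not the paper's configuration.

```python
# Tikhonov-regularized ESM inversion at a single frequency (illustrative).
import numpy as np

rng = np.random.default_rng(5)
k = 2 * np.pi * 500 / 343.0                    # wavenumber at 500 Hz

def green(src, rec):
    # Free-field Green's function between source and receiver points.
    r = np.linalg.norm(rec[:, None, :] - src[None, :, :], axis=2)
    return np.exp(-1j * k * r) / (4 * np.pi * r)

sources = rng.uniform(-1, 1, (16, 3)) + [0, 0, 2.0]   # equivalent sources
mics = rng.uniform(-0.5, 0.5, (8, 3))                 # microphone array
p = rng.standard_normal(8) + 1j * rng.standard_normal(8)  # recorded pressures

G = green(sources, mics)                       # 8 mics x 16 sources: ill-posed
beta = 1e-3 * np.linalg.norm(G, 2) ** 2        # Tikhonov regularization weight
q = np.linalg.solve(G.conj().T @ G + beta * np.eye(16), G.conj().T @ p)
print("residual:", np.linalg.norm(G @ q - p) / np.linalg.norm(p))
```

Once q is known, the field can be re-synthesized at any loudspeaker layout by evaluating the same Green's functions from the equivalent sources to the loudspeaker positions, which is what enables the upmixing step.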
Borowiak, Malgorzata
2010-01-01
Diabetic patients suffer from the loss of insulin-secreting β-cells, or from an improperly functioning β-cell mass. Due to the increasing prevalence of diabetes across the world, there is a compelling need for a renewable source of cells that could replace pancreatic β-cells. In recent years, several promising approaches to the generation of new β-cells have been developed. These include directed differentiation of pluripotent cells such as embryonic stem (ES) cells or induced pluripotent stem (iPS) cells, or reprogramming of mature tissue cells. High-yield methods to differentiate cell populations into β-cells, definitive endoderm, and pancreatic progenitors have been established using growth factors and small molecules. However, the final step of directed differentiation to generate functional, mature β-cells in sufficient quantities has yet to be achieved in vitro. Besides the needs of transplantation medicine, a renewable source of β-cells would also be important as a platform to study the pathogenesis of diabetes and to seek alternative treatments. Finally, by generating new β-cells, we could learn more details about pancreatic development and β-cell specification. This review gives an overview of pancreas ontogenesis from the perspective of stem cell differentiation, and highlights the critical aspects of small molecules in the generation of a renewable β-cell source. It also discusses longer-term challenges and opportunities in moving towards a therapeutic goal for diabetes.
The Precise Orbit and the Challenge of Long Term Stability
NASA Technical Reports Server (NTRS)
Lemoine, Frank G.; Cerri, Luca; Otten, Michiel; Bertiger, William; Zelensky, Nikita; Willis, Pascal
2012-01-01
The computation of a precise orbit reference is a fundamental component of the altimetric measurement. Since the dawn of the modern altimeter age, orbit accuracy has been determined by the quality of the GPS, SLR, and DORIS tracking systems, the fidelity of the measurement and force models, the choice of parameterization for the orbit solutions, and whether a dynamic or a reduced-dynamic strategy is used to calculate the orbits. At the start of the TOPEX mission, the inaccuracies in the modeling of static gravity, dynamic ocean tides, and the nonconservative forces dominated the orbit error budget. Much of the error due to dynamic mismodeling can be compensated by reduced-dynamic tracking techniques, depending on the measurement system strength. In the last decade, the launch of the GRACE mission has eliminated the static gravity field as a concern, and the background force models and the terrestrial reference frame have been systematically refined. GPS systems have realized many improvements, including better modeling of the forces on the GPS spacecraft, large increases in the ground tracking network, and improved modeling of the GPS measurements. DORIS systems have achieved improvements through the use of new antennae, more stable monumentation, and satellite receivers that can track multiple beacons, as well as through improved modeling of the nonconservative forces. Many of these improvements have been applied in the new reprocessed time series of orbits produced for the ERS satellites, Envisat, TOPEX/Poseidon and the Jason satellites, as well as for the most recent Cryosat-2 and HY2A. We now face the challenge of maintaining a stable orbit reference for these altimetric satellites. Changes in the time-variable gravity field of the Earth and how these are modelled have been shown to affect the orbit evolution, and the calibration of the altimetric data with tide gauges. The accuracy of the reference frame realizations and their projection into the future remains a source of error. Other sources of omission error include the geocenter, for which no consensus model is as yet applied. Although progress has been made in nonconservative force modeling through the use of detailed satellite-specific models, radiation pressure modeling and atmospheric density modeling remain a potential source of orbit error. The longer-term influence of variations in the solar and terrestrial radiation fields over annual and solar cycles remains principally untested. Also, the long-term variation in optical and thermal properties of the space vehicle surfaces would contribute to biases in the orbital frame if ignored. We review the status of altimetric precision orbit determination as exemplified by the recent computations undertaken by the different analysis centers for ERS, Envisat, TOPEX/Poseidon, Jason, Cryosat-2 and HY2A, and we provide a perspective on the challenges for future missions such as Jason-3, SENTINEL-3 and SWOT.
The DOD Humanitarian and Civic Assistance Program Concepts, Trends, Medical Challenges
1997-03-01
program improvements; measuring program performance and effectiveness; and defining military roles relevant to training, long-term benefits, and the... support conclusions relevant to trends, benefits, challenges, suggested improvements, and suggested areas for future research.
Challenges in the 1990's for astronaut training simulators
NASA Technical Reports Server (NTRS)
Brown, Patrick M.; Hajare, Ankur R.; Stark, George E.
1990-01-01
New challenges for the simulation community at the Johnson Space Center, in both the near and long term, are considered. In the near term, the challenges of supporting an increasing flight rate, maintaining operations while replacing obsolete subsystems, and incorporating forthcoming changes to the Space Shuttle are discussed, with focus placed on a change of forward flight-deck instruments from electro-mechanical devices to electronic displays. Training astronauts for complex concurrent missions involving multiple spacecraft and geographically dispersed ground facilities is considered the foremost of the long-term challenges, in addition to the tasks of improving simulator reliability and the operational efficiency of the facilities.
Confronting dynamics and uncertainty in optimal decision making for conservation
Williams, Byron K.; Johnson, Fred A.
2013-01-01
The effectiveness of conservation efforts ultimately depends on the recognition that decision making, and the systems that it is designed to affect, are inherently dynamic and characterized by multiple sources of uncertainty. To cope with these challenges, conservation planners are increasingly turning to the tools of decision analysis, especially dynamic optimization methods. Here we provide a general framework for optimal, dynamic conservation and then explore its capacity for coping with various sources and degrees of uncertainty. In broadest terms, the dynamic optimization problem in conservation is choosing among a set of decision options at periodic intervals so as to maximize some conservation objective over the planning horizon. Planners must account for immediate objective returns, as well as the effect of current decisions on future resource conditions and, thus, on future decisions. Undermining the effectiveness of such a planning process are uncertainties concerning extant resource conditions (partial observability), the immediate consequences of decision choices (partial controllability), the outcomes of uncontrolled, environmental drivers (environmental variation), and the processes structuring resource dynamics (structural uncertainty). Where outcomes from these sources of uncertainty can be described in terms of probability distributions, a focus on maximizing the expected objective return, while taking state-specific actions, is an effective mechanism for coping with uncertainty. When such probability distributions are unavailable or deemed unreliable, a focus on maximizing robustness is likely to be the preferred approach. Here the idea is to choose an action (or state-dependent policy) that achieves at least some minimum level of performance regardless of the (uncertain) outcomes. We provide some examples of how the dynamic optimization problem can be framed for problems involving management of habitat for an imperiled species, conservation of a critically endangered population through captive breeding, control of invasive species, construction of biodiversity reserves, design of landscapes to increase habitat connectivity, and resource exploitation. Although these decision making problems and their solutions present significant challenges, we suggest that a systematic and effective approach to dynamic decision making in conservation need not be an onerous undertaking. The requirements are shared with any systematic approach to decision making--a careful consideration of values, actions, and outcomes.
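To make the dynamic optimization framing concrete, here is a minimal sketch of stochastic dynamic programming (value iteration) for an invented three-state habitat-management problem; the transition probabilities and rewards are placeholders, not from the paper.

```python
# Value iteration for a toy conservation MDP with environmental variation.
import numpy as np

states = 3                      # habitat condition: 0 poor, 1 fair, 2 good
# T[a, s, s']: transition probabilities under each action.
T = np.array([
    [[0.8, 0.2, 0.0], [0.3, 0.6, 0.1], [0.1, 0.3, 0.6]],   # 0: do nothing
    [[0.4, 0.5, 0.1], [0.1, 0.5, 0.4], [0.0, 0.2, 0.8]],   # 1: restore
])
R = np.array([[0.0, 1.0, 2.0], [-0.5, 0.5, 1.5]])          # reward minus cost

V = np.zeros(states)
for _ in range(200):                                        # discounted horizon
    Q = R + 0.95 * T @ V                                    # Q[a, s]
    V = Q.max(axis=0)
policy = Q.argmax(axis=0)
print("state-dependent policy (action per state):", policy)
```

The resulting policy is state-dependent, which is exactly the mechanism the abstract describes for coping with environmental variation when the outcome probabilities are considered reliable.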
Crystal surface integrity and diffusion measurements on Earth and planetary materials
NASA Astrophysics Data System (ADS)
Watson, E. B.; Cherniak, D. J.; Thomas, J. B.; Hanchar, J. M.; Wirth, R.
2016-09-01
Characterization of diffusion behavior in minerals is key to providing quantitative constraints on the ages and thermal histories of Earth and planetary materials. Laboratory experiments are a vital source of the needed diffusion measurements, but these can pose challenges because the length scales of diffusion achievable in a laboratory time are commonly less than 1 μm. An effective strategy for dealing with this challenge is to conduct experiments involving inward diffusion of the element of interest from a surface source, followed by quantification of the resulting diffusive-uptake profile using a high-resolution depth-profiling technique such as Rutherford backscattering spectroscopy (RBS), nuclear reaction analysis (NRA), or ion microprobe (SIMS). The value of data from such experiments is crucially dependent on the assumption that diffusion in the near-surface of the sample is representative of diffusion in the bulk material. Historical arguments suggest that the very process of preparing a polished surface for diffusion studies introduces defects, in the form of dislocations and cracks, in the outermost micrometer of the sample that make this region fundamentally different from the bulk crystal in terms of its diffusion properties. Extensive indirect evidence suggests that, in fact, the near-surface region of carefully prepared samples is no different from the bulk crystal in terms of its diffusion properties. A direct confirmation of this conclusion is nevertheless clearly important. Here we use transmission electron microscopy to confirm that the near-surface regions of olivine, quartz and feldspar crystals prepared using careful polishing protocols contain no features that could plausibly affect diffusion. This finding does not preclude damage to the mineral structure from other techniques used in diffusion studies (e.g., ion implantation), but even in this case the role of possible structural damage can be objectively assessed and controlled. While all evidence points to the reliability of diffusivities obtained from in-diffusion experiments, we do not recommend experiments of this type using a powder source as a means of obtaining diffusant solubility or partitioning information for the mineral of interest.
Aerodynamic sound of flow past an airfoil
NASA Technical Reports Server (NTRS)
Wang, Meng
1995-01-01
The long term objective of this project is to develop a computational method for predicting the noise of turbulence-airfoil interactions, particularly at the trailing edge. We seek to obtain the energy-containing features of the turbulent boundary layers and the near-wake using Navier-Stokes Simulation (LES or DNS), and then to calculate the far-field acoustic characteristics by means of acoustic analogy theories, using the simulation data as acoustic source functions. Two distinct types of noise can be emitted from airfoil trailing edges. The first, a tonal or narrowband sound caused by vortex shedding, is normally associated with blunt trailing edges, high angles of attack, or laminar flow airfoils. The second source is of broadband nature arising from the aeroacoustic scattering of turbulent eddies by the trailing edge. Due to its importance to airframe noise, rotor and propeller noise, etc., trailing edge noise has been the subject of extensive theoretical (e.g. Crighton & Leppington 1971; Howe 1978) as well as experimental investigations (e.g. Brooks & Hodgson 1981; Blake & Gershfeld 1988). A number of challenges exist concerning acoustic analogy based noise computations. These include the elimination of spurious sound caused by vortices crossing permeable computational boundaries in the wake, the treatment of noncompact source regions, and the accurate description of wave reflection by the solid surface and scattering near the edge. In addition, accurate turbulence statistics in the flow field are required for the evaluation of acoustic source functions. Major efforts to date have been focused on the first two challenges. To this end, a paradigm problem of laminar vortex shedding, generated by a two-dimensional, uniform stream past a NACA0012 airfoil, is used to address the relevant numerical issues. Under the low Mach number approximation, the near-field flow quantities are obtained by solving the incompressible Navier-Stokes equations numerically at chord Reynolds number of 10⁴. The far-field noise is computed using Curle's extension to the Lighthill analogy (Curle 1955). An effective method for separating the physical noise source from spurious boundary contributions is developed. This allows an accurate evaluation of the Reynolds stress volume quadrupoles, in addition to the more readily computable surface dipoles due to the unsteady lift and drag. The effect of noncompact source distribution on the far-field sound is assessed using an efficient integration scheme for the Curle integral, with full account of retarded-time variations. The numerical results confirm in quantitative terms that the far-field sound is dominated by the surface pressure dipoles at low Mach number. The techniques developed are applicable to a wide range of flows, including jets and mixing layers, where the Reynolds stress quadrupoles play a prominent or even dominant role in the overall sound generation.
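For reference, the acoustic analogy invoked here is Curle's (1955) extension of Lighthill's theory. In its standard textbook form (stated from the general literature, not quoted from this report), the far-field pressure generated by flow past a stationary rigid surface is

\[
p'(\mathbf{x},t) = \frac{1}{4\pi}\,\frac{\partial^2}{\partial x_i\,\partial x_j}\int_V \frac{T_{ij}\!\left(\mathbf{y},\,t-|\mathbf{x}-\mathbf{y}|/c_0\right)}{|\mathbf{x}-\mathbf{y}|}\,d^3\mathbf{y}
\;-\;\frac{1}{4\pi}\,\frac{\partial}{\partial x_i}\oint_S \frac{f_i\!\left(\mathbf{y},\,t-|\mathbf{x}-\mathbf{y}|/c_0\right)}{|\mathbf{x}-\mathbf{y}|}\,dS(\mathbf{y}),
\]

where \(T_{ij}\) is the Lighthill stress tensor evaluated at the retarded time (the Reynolds stress volume quadrupoles) and \(f_i\) is the force per unit area exerted on the fluid by the surface (the surface dipoles whose unsteady lift and drag components dominate at low Mach number).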
77 FR 19740 - Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant Accident
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-02
... NUCLEAR REGULATORY COMMISSION [NRC-2010-0249] Water Sources for Long-Term Recirculation Cooling... Regulatory Guide (RG) 1.82, ``Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant... regarding the sumps and suppression pools that provide water sources for emergency core cooling, containment...
de Boer, Joop; Schösler, Hanna; Aiking, Harry
2014-05-01
Adapting Western meat consumption to health and sustainability challenges requires an overall reduction of industrially produced animal proteins plus a partial replacement by plant proteins. Combining insights on food, environment, and consumers, this paper aims to explore change strategies that may help to meet these challenges, such as promoting smaller portions of meat ("less"), smaller portions using meat raised in a more sustainable manner ("less but better"), smaller portions and eating more vegetable protein ("less and more varied"), and meatless meals with or without meat substitutes ("veggie-days"). The underlying logic of the strategies was clarified by analyzing dietary choices. A nationwide sample of 1083 Dutch consumers provided information on current eating practices and potential changes. The results show that strategies to change meat eating frequencies and meat portion sizes will appeal to overlapping but partly different segments of consumers and that these strategies can be applied to address consumers in terms of their own preferences. The strategies appeared to have different strengths and weaknesses, making them complementary pathways to facilitate step-by-step changes in the amounts and the sources of protein consumed. Copyright © 2014 Elsevier Ltd. All rights reserved.
Processing Challenges and Opportunities of Camel Dairy Products
Seifu, Eyassu; Ipsen, Richard; Kurtu, Mohamed Y.; Hansen, Egon Bech
2017-01-01
A review on the challenges and opportunities of processing camel milk into dairy products is provided with an objective of exploring the challenges of processing and assessing the opportunities for developing functional products from camel milk. The gross composition of camel milk is similar to bovine milk. Nonetheless, the relative composition, distribution, and the molecular structure of the milk components are reported to be different. Consequently, manufacturing of camel dairy products such as cheese, yoghurt, or butter using the same technology as for dairy products from bovine milk can result in processing difficulties and products of inferior quality. However, scientific evidence points to the possibility of transforming camel milk into products by optimization of the processing parameters. Additionally, camel milk has traditionally been used for its medicinal values and recent scientific studies confirm that it is a rich source of bioactive, antimicrobial, and antioxidant substances. The current literature concerning product design and functional potential of camel milk is fragmented in terms of time, place, and depth of the research. Therefore, it is essential to understand the fundamental features of camel milk and initiate detailed multidisciplinary research to fully explore and utilize its functional and technological properties. PMID:29109953
Ambert, Kyle H; Cohen, Aaron M
2009-01-01
OBJECTIVE: Free-text clinical reports serve as an important part of patient care management and clinical documentation of patient disease and treatment status. Free-text notes are commonplace in medical practice, but remain an under-used source of information for clinical and epidemiological research, as well as personalized medicine. The authors explore the challenges associated with automatically extracting information from clinical reports using their submission to the Integrating Informatics with Biology and the Bedside (i2b2) 2008 Natural Language Processing Obesity Challenge Task. DESIGN: A text mining system for classifying patient comorbidity status, based on the information contained in clinical reports. The authors' approach incorporates a variety of automated techniques, including hot-spot filtering, negated concept identification, zero-vector filtering, weighting by inverse class frequency, and error-correcting output codes with linear support vector machines. MEASUREMENTS: Performance was evaluated in terms of the macroaveraged F1 measure. RESULTS: The automated system performed well against manual expert rule-based systems, finishing fifth in the Challenge's intuitive task and 13th in the textual task. CONCLUSIONS: The system demonstrates that effective comorbidity status classification by an automated system is possible.
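For readers unfamiliar with the final classification stage, the sketch below shows a linear SVM text classifier with inverse class-frequency weighting in the spirit of the system described. It is an illustrative reconstruction (toy notes, standard scikit-learn components), not the authors' code, and the upstream steps named in the abstract (hot-spot filtering, negation detection, zero-vector filtering, error-correcting output codes) are omitted.

```python
# Minimal sketch: linear SVM comorbidity classifier with
# inverse class-frequency weighting (illustrative only).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Hypothetical toy data standing in for de-identified clinical notes.
notes = [
    "patient denies obesity or diabetes",
    "BMI 42, morbid obesity documented",
    "history of asthma, no obesity mentioned",
    "obese patient, weight management discussed",
]
labels = ["N", "Y", "N", "Y"]  # comorbidity status per note

clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),   # unigram/bigram text features
    LinearSVC(class_weight="balanced"),    # weights ~ inverse class frequency
)
clf.fit(notes, labels)
print(clf.predict(["no evidence of obesity"]))  # -> ['N']
```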
Funding options for research: facing the market as well as government.
Mitchell, G; Nossal, G
1999-06-01
Parasitology is a challenge. At one level, the structural and genetic complexities of parasites provide ample technical challenges in regard to an understanding of parasite variability and adaptability, epidemiological diversity, drug resistance, etc. The intricacies of host-parasite relationships, including the immunology of parasitism, will continually surprise yet frustrate the vaccine developer and keep the bravest immunoparasitologist busy and creative for decades. As if the technical considerations were not challenging enough, we see difficulties arising in sustaining a research endeavour and preserving a critical mass of researchers through the generation of high-level, long-term funding support. Contributing to this situation is the fact that most parasitic diseases of major impact in humans are largely centred on the rural poor in tropical, less industrially-developed countries and are therefore of little, or of fickle, interest to the strictly commercially oriented. Moreover, the focus in the rural industries has moved away from aspects of on-farm production, with lower priority given to studies on even the 'economically important' parasites of livestock. It is contended that this may change again, given the pressures on, and clear marketing advantages of, preserving a 'clean and green' image for Australia's primary industries. Overall, the extraordinary technical and conceptual advances of recent times have been tempered by uncertainties in research funding and severe cuts from some traditional sources for both fundamental and strategic/applied research in Parasitology. Several commentators have highlighted the fact that deliverables in terms of new methods of disease control have been sparse, and some claims made in the past have certainly been exaggerated. Yet the prospects and achievements at the front end of the long R&D pathway have never been brighter. In this article we examine the merits of a 'portfolio approach' to generating research funds in Parasitology, and in Science and Technology in Australia more generally, with an emphasis on strategies that, by welding good science to clear, medium-term product objectives, increase research funding opportunities.
NASA Astrophysics Data System (ADS)
Booske, John H.
2008-05-01
Homeland security and military defense technology considerations have stimulated intense interest in mobile, high power sources of millimeter-wave (mmw) to terahertz (THz) regime electromagnetic radiation, from 0.1 to 10 THz. While vacuum electronic sources are a natural choice for high power, the challenges have yet to be completely met for applications including noninvasive sensing of concealed weapons and dangerous agents, high-data-rate communications, high resolution radar, next generation acceleration drivers, and analysis of fluids and condensed matter. The compact size requirements for many of these high frequency sources call for minuscule, microfabricated slow wave circuits. This necessitates electron beams with tiny transverse dimensions and potentially very high current densities for adequate gain. Thus, an emerging family of microfabricated vacuum electronic devices shares many of the same plasma physics challenges that currently confront "classic" high power microwave (HPM) generators, including long-life bright electron beam sources, intense beam transport, parasitic mode excitation, energetic electron interaction with surfaces, and rf air breakdown at output windows. The contemporary plasma physics and other related issues of compact, high power mmw-to-THz sources are compared and contrasted with those of HPM generation, and future research challenges and opportunities are discussed.
The Muon Conditions Data Management:. Database Architecture and Software Infrastructure
NASA Astrophysics Data System (ADS)
Verducci, Monica
2010-04-01
The management of the Muon Conditions Database will be one of the most challenging applications for the Muon System, both in terms of data volumes and rates and in terms of the variety of data stored and their analysis. The Muon conditions database is responsible for almost all of the 'non-event' data and detector quality flags storage needed for debugging the detector operations and for performing the reconstruction and the analysis. In particular for the early data, knowledge of the detector performance and of the corrections in terms of efficiency and calibration will be extremely important for the correct reconstruction of the events. In this work, an overview of the entire Muon conditions database architecture is given, covering the different sources of the data and the storage model used, including the associated database technology. Particular emphasis is given to the Data Quality chain: the flow of the data, the analysis, and the final results are described. In addition, the software interfaces used to access the conditions data are described, in particular within the ATLAS offline reconstruction framework, ATHENA.
ERIC Educational Resources Information Center
Hsieh, Hui-hua
2012-01-01
The recruitment of international academic staff is viewed as one of the strategies to internationalise the universities. International academic staff, however, usually encounter many challenges when in a foreign context. This study aims to investigate the challenges of Chinese academic staff teaching in the UK in terms of language, relationships…
How Big Was It? Getting at Yield
NASA Astrophysics Data System (ADS)
Pasyanos, M.; Walter, W. R.; Ford, S. R.
2013-12-01
One of the most coveted pieces of information in the wake of a nuclear test is the explosive yield. Determining the yield from remote observations, however, is not trivial. For instance, recorded observations of seismic amplitudes, used to estimate the yield, are significantly modified by the intervening media, which vary widely and need to be properly accounted for. Even after correcting for propagation effects such as geometrical spreading, attenuation, and station site terms, getting from the resulting source term to a yield depends on the specifics of the explosion source model, including material properties and depth. Some formulas assume that the explosion has a standard depth of burial, and observed amplitudes can vary if the actual test is significantly overburied or underburied. We will consider the complications and challenges of making these determinations using a number of standard, more traditional methods and a more recent method that we have developed using regional waveform envelopes. We will make this comparison for recent declared nuclear tests from the DPRK. We will also compare the methods using older explosions at the Nevada Test Site with announced yields, materials, and depths, so that actual performance can be measured. In all cases, we also strive to quantify realistic uncertainties on the yield estimation.
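The magnitude-yield step described above is often written as a linear relation in log yield for a well-coupled shot at standard depth of burial; the numerical coefficients below are one commonly quoted calibration for stable continental paths, given purely for illustration and not taken from this abstract:

$$
m_b \;=\; a + b\,\log_{10} W, \qquad \text{e.g.}\;\; m_b \approx 4.45 + 0.75\,\log_{10} W \quad (W\ \text{in kt}).
$$

With a slope near 0.75, a 0.1 magnitude-unit bias from unmodeled attenuation or anomalous burial depth translates into roughly a 35% error in the inferred yield (since 10^{0.1/0.75} is about 1.36), which is why the propagation corrections and source-model assumptions above matter so much.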
Empirical retrieval of sea spray aerosol production using satellite microwave radiometry
NASA Astrophysics Data System (ADS)
Savelyev, I. B.; Yelland, M. J.; Norris, S. J.; Salisbury, D.; Pascal, R. W.; Bettenhausen, M. H.; Prytherch, J.; Anguelova, M. D.; Brooks, I. M.
2017-12-01
This study presents a novel approach to obtaining a global sea spray aerosol (SSA) production source term by relying on direct satellite observations of the ocean surface, instead of more traditional approaches driven by surface meteorology. The primary challenge in developing this empirical algorithm is to compile a calibrated, consistent dataset of SSA surface flux collected offshore over a variety of conditions (i.e., regions and seasons), and thus representative of the global SSA production variability. Such a dataset includes observations from the SEASAW, HiWASE, and WAGES field campaigns, during which the SSA flux was measured from the bow of a research vessel using consistent, state-of-the-art eddy covariance methodology. These in situ data are matched to observations of the state of the ocean surface from the WindSat polarimetric microwave satellite radiometer. Previous studies demonstrated the ability of WindSat to detect variations in surface wave slopes, roughness, and foam, which led to the development of retrieval algorithms for the surface wind vector and, more recently, whitecap fraction. Similarly, in this study, microwave emissions from the ocean surface are matched to and calibrated against in situ observations of the SSA production flux. The resulting calibrated empirical algorithm is applicable for retrieval of the SSA source term throughout the duration of the WindSat mission, from 2003 to present.
Bruppacher, R
1989-01-01
Criteria for epidemiological evidence of effects of elevated dosages of vitamins are basically the same as those for the evidence of effects of other exposures. Given unambiguous classifications of both exposure and cases, they comprise strength, significance, specificity, and consistency of the statistical association, a plausible time relationship as well as a dose-effect relationship, and consistency with other evidence. Today, the term epidemiological evidence usually refers to field experience, often to "observational," i.e., non-experimental, evidence. An extreme example of this is the so-called "ecological study," which is frequently criticized because of its potential for exaggerated interpretations, though such studies can be very helpful in constructing and supporting hypotheses. For very rare and long-term effects, the description and evaluation of individual cases are often combined with attempts at quantification by relating them to the estimated exposure of the source population; this is subject to numerous sources of error. It is difficult to confirm the existence of rare and late effects, and the collection and interpretation of data on the prevention of such effects often present almost insurmountable methodological challenges. However, with correct interpretation and by keeping the quantitative perspective in mind, epidemiological evidence can be extremely helpful in the assessment of the overall importance, i.e., the public health significance, of such effects.
Confronting effective models for deconfinement in dense quark matter with lattice data
NASA Astrophysics Data System (ADS)
Andersen, Jens O.; Brauner, Tomáš; Naylor, William R.
2015-12-01
Ab initio numerical simulations of the thermodynamics of dense quark matter remain a challenge. Apart from the infamous sign problem, lattice methods have to deal with finite volume and discretization effects as well as with the necessity to introduce sources for symmetry-breaking order parameters. We study these artifacts in the Polyakov-loop-extended Nambu-Jona-Lasinio (PNJL) model and compare its predictions to existing lattice data for cold and dense two-color matter with two flavors of Wilson quarks. To achieve even qualitative agreement with lattice data requires the introduction of two novel elements in the model: (i) explicit chiral symmetry breaking in the effective contact four-fermion interaction, referred to as the chiral twist, and (ii) renormalization of the Polyakov loop. The feedback of the dense medium to the gauge sector is modeled by a chemical-potential-dependent scale in the Polyakov-loop potential. In contrast to previously used analytical Ansätze, we determine its dependence on the chemical potential from lattice data for the expectation value of the Polyakov loop. Finally, we propose adding a two-derivative operator to our effective model. This term acts as an additional source of explicit chiral symmetry breaking, mimicking an analogous term in the lattice Wilson action.
Recent Advances in Neural Electrode-Tissue Interfaces.
Woeppel, Kevin; Yang, Qianru; Cui, Xinyan Tracy
2017-12-01
Neurotechnology has seen exponential growth in recent decades, and neural electrode-tissue interface research is well recognized as an instrumental component of its development. While satisfactory long-term performance has been demonstrated in some applications, such as cochlear implants and deep brain stimulators, more advanced neural electrode devices requiring higher resolution for single-unit recording or microstimulation still face significant challenges in reliability and longevity. In this article, we review the most recent findings that contribute to our current understanding of the sources of poor reliability and longevity in neural recording or stimulation, including material failure, the biological tissue response, and the interplay between the two. Newly developed characterization tools are introduced, ranging from electrophysiological models and molecular and biochemical analyses to material characterization and live imaging. Effective strategies that have been applied to improve the interface are also highlighted. Finally, we discuss the challenges and opportunities in improving the interface and achieving seamless integration between the implanted electrodes and neural tissue, both anatomically and functionally.
Managing knowledge business intelligence: A cognitive analytic approach
NASA Astrophysics Data System (ADS)
Surbakti, Herison; Ta'a, Azman
2017-10-01
The purpose of this paper is to identify and analyze the integration of Knowledge Management (KM) and Business Intelligence (BI) in order to achieve a competitive edge in the context of intellectual capital. The methodology includes a review of the literature and an analysis of interview data from managers in the corporate sector, together with models established by different authors. BI technologies are strongly associated with KM processes for attaining competitive advantage. KM is strongly shaped by human and social factors, and efficient systems run under BI tactics and technologies can turn these into the most valuable assets. The term predictive analytics, moreover, is rooted in the field of BI. Extracting tacit knowledge so that it can serve as a new source for BI analysis remains a major challenge. Advanced analytic methods that address the diversity of the data corpus - structured and unstructured - require a cognitive approach to provide estimative results and to yield actionable descriptive, predictive, and prescriptive results. This is a considerable challenge at present, and this paper aims to elaborate on it in this initial work.
Statistical methods and computing for big data.
Wang, Chun; Chen, Ming-Hui; Schifano, Elizabeth; Wu, Jing; Yan, Jun
2016-01-01
Big data are data on a massive scale in terms of volume, intensity, and complexity that exceed the capacity of standard analytic tools. They present opportunities as well as challenges to statisticians. The role of computational statisticians in scientific discovery from big data analyses has been under-recognized even by peer statisticians. This article summarizes recent methodological and software developments in statistics that address the big data challenges. Methodologies are grouped into three classes: subsampling-based, divide and conquer, and online updating for stream data. As a new contribution, the online updating approach is extended to variable selection with commonly used criteria, and their performances are assessed in a simulation study with stream data. Software packages are summarized with a focus on open source R and R packages, covering recent tools that help break the barriers of computer memory and computing power. Some of the tools are illustrated in a case study with a logistic regression for the chance of airline delay. PMID:27695593
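As a concrete illustration of the online-updating idea, here is a minimal sketch for streaming linear regression via accumulated sufficient statistics; the article's methods extend this pattern to general estimating equations and to variable selection, so this is illustrative only.

```python
# Online updating for streaming linear regression:
# accumulate X'X and X'y block by block, never holding all data in memory.
import numpy as np

class OnlineOLS:
    def __init__(self, p):
        self.xtx = np.zeros((p, p))   # running X'X
        self.xty = np.zeros(p)        # running X'y

    def update(self, X, y):
        self.xtx += X.T @ X
        self.xty += X.T @ y

    def coef(self):
        return np.linalg.solve(self.xtx, self.xty)

rng = np.random.default_rng(0)
beta = np.array([1.0, -2.0, 0.5])
model = OnlineOLS(p=3)
for _ in range(100):                  # 100 data blocks arriving as a stream
    X = rng.normal(size=(50, 3))
    y = X @ beta + rng.normal(size=50)
    model.update(X, y)
print(model.coef())                   # close to beta; equals full-data OLS
```

Because X'X and X'y are additive across blocks, the streaming estimate coincides with the full-data least-squares fit while memory stays O(p^2).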
Using ethnography to investigate life scientists' information needs.
Forsythe, D E
1998-01-01
Designing information resources that actually meet the information needs of individuals requires detailed knowledge of these needs. This poses a challenge for developers. Because the meaning of particular terms can vary by field, professional knowledge differs to some extent in different disciplines, and the questions that people ask assume a certain amount of unarticulated background knowledge, understanding the information needs of life scientists is not a trivial undertaking. One source of help in meeting this challenge is ethnography, a set of research methods and an associated conceptual stance developed and used by anthropologists for investigating uncontrolled real-world settings. Drawing on the author's experience in using ethnographic techniques to study clinicians' information needs, this paper describes why such research is necessary, why it requires particular research methods, what an ethnographic perspective has added to the study of information needs, and what this broader approach has revealed about the types of information sought by clinicians in the course of their daily practice. PMID:9681177
Asymptotic-preserving Lagrangian approach for modeling anisotropic transport in magnetized plasmas
NASA Astrophysics Data System (ADS)
Chacon, Luis; Del-Castillo-Negrete, Diego
2012-03-01
Modeling electron transport in magnetized plasmas is extremely challenging due to the extreme anisotropy between the parallel (to the magnetic field) and perpendicular directions (the transport-coefficient ratio chi_parallel/chi_perp ~ 10^10 in fusion plasmas). Recently, a novel Lagrangian Green's function method has been proposed [D. del-Castillo-Negrete, L. Chacón, PRL 106, 195004 (2011); D. del-Castillo-Negrete, L. Chacón, Phys. Plasmas, submitted (2011)] to solve the local and non-local purely parallel transport equation in general 3D magnetic fields. The approach avoids numerical pollution, is inherently positivity-preserving, and is scalable algorithmically (i.e., work per degree of freedom is grid-independent). In this poster, we discuss the extension of the Lagrangian Green's function approach to include perpendicular transport terms and sources. We present an asymptotic-preserving numerical formulation, which ensures a consistent numerical discretization temporally and spatially for arbitrary chi_parallel/chi_perp ratios. We will demonstrate the potential of the approach with various challenging configurations, including the case of transport across a magnetic island in cylindrical geometry.
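For reference, the anisotropic transport equation at issue is usually written by splitting the flux along and across the magnetic field direction (standard form, not quoted from the abstract), with unit vector b = B/|B|:

$$
\frac{\partial T}{\partial t} \;=\; \nabla\cdot\Big[\chi_\parallel\,\hat{\mathbf{b}}\,(\hat{\mathbf{b}}\cdot\nabla T) \;+\; \chi_\perp\big(\nabla T - \hat{\mathbf{b}}\,(\hat{\mathbf{b}}\cdot\nabla T)\big)\Big] + S .
$$

With chi_parallel/chi_perp ~ 10^10, even a tiny numerical leakage of the parallel operator into the perpendicular direction overwhelms the true perpendicular transport, which is why field-aligned Lagrangian integration of the parallel term is attractive.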
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuhlmann, Andreas V.; Houel, Julien; Warburton, Richard J.
Optically active quantum dots, for instance self-assembled InGaAs quantum dots, are potentially excellent single photon sources. The fidelity of the single photons is much improved using resonant rather than non-resonant excitation. With resonant excitation, the challenge is to distinguish between resonance fluorescence and scattered laser light. We have met this challenge by creating a polarization-based dark-field microscope to measure the resonance fluorescence from a single quantum dot at low temperature. We achieve a suppression of the scattered laser exceeding a factor of 10^7 and background-free detection of resonance fluorescence. The same optical setup operates over the entire quantum dot emission range (920-980 nm) and also in high magnetic fields. The major development is the outstanding long-term stability: once the dark-field point has been established, the microscope operates for days without alignment. The mechanical and optical designs of the microscope are presented, as well as exemplary resonance fluorescence spectroscopy results on individual quantum dots to underline the microscope's excellent performance.
Multimodal inspection in power engineering and building industries: new challenges and solutions
NASA Astrophysics Data System (ADS)
Kujawińska, Małgorzata; Malesa, Marcin; Malowany, Krzysztof
2013-09-01
Recently the demand for, and the number of applications of, full-field optical measurement methods based on noncoherent light sources has increased significantly. They include traditional image processing, thermovision, digital image correlation (DIC), and structured light methods. However, numerous challenges remain in implementing these methods for in-situ, long-term monitoring in industrial, civil engineering, and cultural heritage applications, in multimodal measurements of a variety of object features, or simply in adapting instruments to work in harsh environmental conditions. In this paper we focus on the 3D DIC method and present its enhancements concerning software modifications (new visualization methods and a method for automatic merging of data distributed in time) and hardware improvements. The modified 3D DIC system, combined with an infrared camera system, is applied in several interesting cases: measurements of a boiler drum during annealing and of pipelines in heat power stations, monitoring of various steel building struts at a construction site, and validation of numerical models of large building structures constructed of graded metal plate arches.
Benefits and Challenges of the Passport Broadcast Intervention in Long-Term Care
ERIC Educational Resources Information Center
Wittenberg-Lyles, Elaine; Oliver, Debra Parker; Demiris, George; Shaunfield, Sara
2012-01-01
Creative activities are a challenge for long-term care facilities. The Passport intervention uses web-based video technology to provide long-term care residents with a virtual travel experience. Passport broadcasts were conducted and staff and residents were interviewed about the experience. A thematic analysis of interviews was used to discern…
NASA Astrophysics Data System (ADS)
Pilone, D.; Quinn, P.; Mitchell, A. E.; Baynes, K.; Shum, D.
2014-12-01
This talk introduces the audience to some of the very real challenges associated with visualizing data from disparate data sources as encountered during the development of real world applications. In addition to the fundamental challenges of dealing with the data and imagery, this talk discusses usability problems encountered while trying to provide interactive and user-friendly visualization tools. At the end of this talk the audience will be aware of some of the pitfalls of data visualization along with tools and techniques to help mitigate them. There are many sources of variable resolution visualizations of science data available to application developers, including NASA's Global Imagery Browse Services (GIBS); however, integrating and leveraging visualizations in modern applications faces a number of challenges, including:
- Varying visualized Earth "tile sizes" resulting in challenges merging disparate sources
- Multiple visualization frameworks and toolkits with varying strengths and weaknesses
- Global composite imagery vs. imagery matching EOSDIS granule distribution
- Challenges visualizing geographically overlapping data with different temporal bounds
- User interaction with overlapping or collocated data
- Complex data boundaries and shapes combined with multi-orbit data and polar projections
- Discovering the availability of visualizations and the specific parameters, color palettes, and configurations used to produce them
In addition to discussing the challenges and approaches involved in visualizing disparate data, we will discuss solutions and components we'll be making available as open source to encourage reuse and accelerate application development.
Radiological analysis of plutonium glass batches with natural/enriched boron
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rainisch, R.
2000-06-22
The disposition of surplus plutonium inventories by the US Department of Energy (DOE) includes the immobilization of certain plutonium materials in a borosilicate glass matrix, also referred to as vitrification. This paper addresses source terms of plutonium masses immobilized in a borosilicate glass matrix where the glass components include both natural boron and enriched boron. The calculated source terms pertain to neutron and gamma source strength (particles per second) and source spectrum changes. The calculated source terms corresponding to natural boron and enriched boron are compared to determine the benefits (decrease in radiation source terms) of using enriched boron. The analysis of plutonium glass source terms shows that a large component of the neutron source terms is due to (alpha,n) reactions. The americium-241 and plutonium present in the glass emit alpha particles. These alpha particles interact with low-Z nuclides like B-11, B-10, and O-17 in the glass to produce neutrons. The low-Z nuclides are referred to as target particles. The reference glass contains 9.4 wt percent B2O3. Boron-11 was found to strongly support the (alpha,n) reactions in the glass matrix. B-11 has a natural abundance of over 80 percent. The (alpha,n) reaction rates for B-10 are lower than for B-11, and the analysis shows that the plutonium glass neutron source terms can be reduced by artificially enriching natural boron with B-10. The natural abundance of B-10 is 19.9 percent. Boron enriched to 96 wt percent B-10 or above can be obtained commercially. Since lower source terms imply lower dose rates to radiation workers handling the plutonium glass materials, it is important to know the achievable decrease in source terms as a result of boron enrichment. Plutonium materials are normally handled in glove boxes with shielded glass windows, and the work entails both extremity and whole-body exposures. Lowering the source terms of the plutonium batches will make the handling of these materials less difficult and will reduce radiation exposure to operating workers.
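Schematically, the neutron source strength analyzed above decomposes into spontaneous-fission and (alpha,n) contributions, with the latter proportional to the alpha activity and the thick-target neutron yields of the low-Z targets in the glass (generic notation, not the report's):

$$
S_n \;=\; S_{\mathrm{SF}} \;+\; \sum_i A_{\alpha,i}\,\sum_j w_j\,Y_j(E_{\alpha,i}),
$$

where A_alpha,i is the alpha activity of emitter i (Am-241 and the plutonium isotopes), Y_j(E) the thick-target (alpha,n) yield of target nuclide j (B-11, B-10, O-17) at alpha energy E, and w_j an abundance weighting over targets. Enriching the boron in B-10 shrinks the B-11 term, which carries most of the (alpha,n) yield, and thereby lowers S_n.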
Seasonally-Dynamic SPARROW Modeling of Nitrogen Flux Using Earth Observation Data
NASA Astrophysics Data System (ADS)
Smith, R. A.; Schwarz, G. E.; Brakebill, J. W.; Hoos, A. B.; Moore, R. B.; Shih, J.; Nolin, A. W.; Macauley, M.; Alexander, R. B.
2013-12-01
SPARROW models are widely used to identify and quantify the sources of contaminants in watersheds and to predict their flux and concentration at specified locations downstream. Conventional SPARROW models describe the average relationship between sources and stream conditions based on long-term water quality monitoring data and spatially-referenced explanatory information. But many watershed management issues stem from intra- and inter-annual changes in contaminant sources, hydrologic forcing, or other environmental conditions which cause a temporary imbalance between inputs and stream water quality. Dynamic behavior of the system relating to changes in watershed storage and processing then becomes important. In this study, we describe dynamically calibrated SPARROW models of total nitrogen flux in three sub-regional watersheds: the Potomac River Basin, Long Island Sound drainage, and coastal South Carolina drainage. The models are based on seasonal water quality and watershed input data for a total 170 monitoring stations for the period 2001 to 2008. Frequently-reported, spatially-detailed input data on the phenology of agricultural production, terrestrial vegetation growth, and snow melt are often challenging requirements of seasonal modeling of reactive nitrogen. In this NASA-funded research, we use Enhanced Vegetation Index (EVI), gross primary production and snow/ice cover data from MODIS to parameterize seasonal uptake and release of nitrogen from vegetation and snowpack. The spatial reference frames of the models are 1:100,000-scale stream networks, and the computational time steps are 0.25-year seasons. Precipitation and temperature data are from PRISM. The model formulation accounts for storage of nitrogen from nonpoint sources including fertilized cropland, pasture, urban land, and atmospheric deposition. Model calibration is by non-linear regression. Once calibrated, model source terms based on previous season export allow for recursive dynamic simulation of stream flux: gradual increases or decreases in export occur as source supply rates and hydrologic forcing change. Based on an assumption that removal of nitrogen from watershed storage to stream channels and to 'permanent' sinks (e.g. the atmosphere and deep groundwater) occur as parallel first-order processes, the models can be used to estimate the approximate residence times of nonpoint source nitrogen in the watersheds.
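The parallel first-order removal assumption stated at the end of the abstract can be written compactly (schematic notation, not the authors'):

$$
\frac{\mathrm{d}S}{\mathrm{d}t} \;=\; I(t) - (k_{\mathrm{str}} + k_{\mathrm{perm}})\,S,
\qquad q_{\mathrm{str}} = k_{\mathrm{str}}\,S,
\qquad \tau \approx \frac{1}{k_{\mathrm{str}} + k_{\mathrm{perm}}},
$$

where S is nitrogen held in watershed storage, I(t) the seasonal nonpoint-source input, k_str the first-order delivery rate to stream channels, k_perm the rate of loss to permanent sinks (e.g., the atmosphere and deep groundwater), and tau the implied residence time; stepping this balance at 0.25-year intervals yields the recursive dynamic simulation of stream flux described above.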
DOE Office of Scientific and Technical Information (OSTI.GOV)
Litao Wang; Jiming Hao; Kebin He
In the last 10 yr, Beijing has made a great effort to improve its air quality. However, it is still suffering from regional coarse particulate matter (PM10) pollution that could be a challenge to the promise of clean air during the 2008 Olympics. To provide scientific guidance on regional air pollution control, the Mesoscale Modeling System Generation 5 (MM5) and the Models-3/Community Multiscale Air Quality Model (CMAQ) air quality modeling system was used to investigate the contributions of emission sources outside the Beijing area to pollution levels in Beijing. The contributions to the PM10 concentrations in Beijing were assessed for the following sources: power plants, industry, domestic sources, transportation, agriculture, and biomass open burning. In January, it is estimated that on average 22% of the PM10 concentrations can be attributed to outside sources, of which domestic and industrial sources contributed 37 and 31%, respectively. In August, as much as 40% of the PM10 concentrations came from regional sources, of which approximately 41% came from industry and 31% from power plants. However, the synchronous analysis of the hourly concentrations, regional contributions, and wind vectors indicates that in the heaviest pollution periods the local emission sources play a more important role. The implications are that long-term control strategies should be based on regional-scale collaborations, and that emission abatement of local sources may be more effective in lowering the PM10 concentration levels on the heavy pollution days. Better air quality can be attained during the Olympics by placing effective emission controls on the local sources in Beijing and by controlling emissions from industry and power plants in the surrounding regions.
NASA Astrophysics Data System (ADS)
Perez, Pedro B.; Hamawi, John N.
2017-09-01
Nuclear power plant radiation protection design features are based on radionuclide source terms derived from conservative assumptions that envelope expected operating experience. Two parameters that significantly affect the radionuclide concentrations in the source term are the failed fuel fraction and the effective fission product appearance rate coefficients. The failed fuel fraction may be a regulatory based assumption, as in the U.S. Appearance rate coefficients are not specified in regulatory requirements, but have been referenced to experimental data that is over 50 years old. No doubt the source terms are conservative, as demonstrated by operating experience that has included failed fuel, but they may be too conservative, leading for example to over-designed shielding for normal operations. Design basis source term methodologies for normal operations had not advanced until EPRI published an updated ANSI/ANS 18.1 source term basis document in 2015. Our paper revisits the fission product appearance rate coefficients as applied in the derivation of source terms following the original U.S. NRC NUREG-0017 methodology. New coefficients have been calculated based on recent EPRI results, which demonstrate the conservatism in nuclear power plant shielding design.
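In a balance of the kind the paper revisits, the steady-state coolant concentration of a fission product follows from equating an appearance rate with first-order removal; schematically, with generic symbols that are not the paper's notation:

$$
C_i \;=\; \frac{\dot{A}_i}{\lambda_i + \sum_k \beta_k},
\qquad \dot{A}_i \;\propto\; f_{\mathrm{fail}}\; y_i\; \dot{F},
$$

where A-dot_i is the appearance rate into the coolant (set by the failed fuel fraction f_fail, the fission yield y_i, the fission rate F-dot, and the appearance-rate coefficient), lambda_i is the decay constant, and the beta_k are removal constants for purification, leakage, and other cleanup paths. Updated appearance-rate coefficients rescale A-dot_i directly, and with it the design-basis concentrations and shielding source terms.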
NASA Astrophysics Data System (ADS)
Renger, Bernhard; Rummeny, Ernst J.; Noël, Peter B.
2013-03-01
During the last decades, the reduction of radiation exposure, especially in diagnostic computed tomography, has been one of the most explored topics. At the same time, it is challenging to quantify the long-term clinical dose reduction attributable to new hardware and software solutions. To overcome this challenge, we developed a Dose Monitoring System (DMS), which collects information from PACS, RIS, MPPS, and structured reports. The integration of all sources overcomes the weaknesses of any single system. To gather all possible information, we integrated an optical character recognition system to extract, for example, information from the CT dose report. All collected data are transferred to a database for further evaluation, e.g., for calculations of effective as well as organ doses. The DMS provides a single database for tracking all essential study- and patient-specific information across different modalities and vendors. As an initial study, we longitudinally investigated the dose reduction in CT examinations when employing a noise-suppressing reconstruction algorithm. For this examination type, a significant long-term reduction in radiation exposure is reported when comparing to a CT system with standard reconstruction. In summary, our DMS tool not only enables us to track radiation exposure on a daily basis but also enables analysis of the long-term effect of new dose-saving strategies. In the future, the statistical analysis of all retrospective data available in a modern imaging department will provide a unique overview of advances in the reduction of radiation exposure.
Harnessing pluralism for better health in Bangladesh.
Ahmed, Syed Masud; Evans, Timothy G; Standing, Hilary; Mahmud, Simeen
2013-11-23
How do we explain the paradox that Bangladesh has made remarkable progress in health and human development, yet its achievements have taken place within a health system that is frequently characterised as weak, in terms of inadequate physical and human infrastructure and logistics, and low performing? We argue that the development of a highly pluralistic health system environment, defined by the participation of a multiplicity of different stakeholders and agents and by ad hoc, diffused forms of management has contributed to these outcomes by creating conditions for rapid change. We use a combination of data from official sources, research studies, case studies of specific innovations, and in-depth knowledge from our own long-term engagement with health sector issues in Bangladesh to lay out a conceptual framework for understanding pluralism and its outcomes. Although we argue that pluralism has had positive effects in terms of stimulating change and innovation, we also note its association with poor health systems governance and regulation, resulting in endemic problems such as overuse and misuse of drugs. Pluralism therefore requires active management that acknowledges and works with its polycentric nature. We identify four key areas where this management is needed: participatory governance, accountability and regulation, information systems, and capacity development. This approach challenges some mainstream frameworks for managing health systems, such as the building blocks approach of the WHO Health Systems Framework. However, as pluralism increasingly defines the nature and the challenge of 21st century health systems, the experience of Bangladesh is relevant to many countries across the world. Copyright © 2013 Elsevier Ltd. All rights reserved.
The ATLAS conditions database architecture for the Muon spectrometer
NASA Astrophysics Data System (ADS)
Verducci, Monica; ATLAS Muon Collaboration
2010-04-01
The Muon System, facing challenging requirements for conditions data storage, has extensively adopted the conditions database project 'COOL' as the basis for all its conditions data storage, both at CERN and throughout the worldwide collaboration, as decided by the ATLAS Collaboration. The management of the Muon COOL conditions database will be one of the most challenging applications for the Muon System, both in terms of data volumes and rates and in terms of the variety of data stored. The Muon conditions database is responsible for almost all of the 'non event' data and detector quality flags storage needed for debugging the detector operations and for performing reconstruction and analysis. The COOL database allows database applications to be written independently of the underlying database technology and ensures long-term compatibility with the entire ATLAS software. COOL implements an interval-of-validity database, i.e., objects stored or referenced in COOL have an associated start and end time between which they are valid; the data are stored in folders, which are themselves arranged in a hierarchical structure of folder sets. The structure is simple and mainly optimized to store and retrieve the object(s) associated with a particular time. In this work, an overview of the entire Muon conditions database architecture is given, including the different sources of the data and the storage model used. In addition, the software interfaces used to access the conditions data are described, with emphasis on the offline reconstruction framework ATHENA and the services developed to provide the conditions data to the reconstruction.
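To illustrate the interval-of-validity idea in isolation, here is a minimal generic sketch; COOL's actual API, folder-set hierarchy, and database backends are far richer, so this is a conceptual illustration only.

```python
# Interval-of-validity (IOV) lookup: each payload is valid on [since, until).
# A folder keeps IOVs sorted by 'since'; retrieval bisects on the query time.
import bisect

class CondFolder:
    def __init__(self):
        self.since = []   # sorted validity start times
        self.iovs = []    # parallel list of (since, until, payload)

    def store(self, since, until, payload):
        i = bisect.bisect(self.since, since)
        self.since.insert(i, since)
        self.iovs.insert(i, (since, until, payload))

    def retrieve(self, t):
        i = bisect.bisect_right(self.since, t) - 1   # last IOV starting <= t
        if i >= 0 and t < self.iovs[i][1]:
            return self.iovs[i][2]
        raise KeyError(f"no conditions valid at t={t}")

folder = CondFolder()
folder.store(0, 100, {"mdt_efficiency": 0.97})    # hypothetical payloads
folder.store(100, 200, {"mdt_efficiency": 0.95})
print(folder.retrieve(150))   # -> {'mdt_efficiency': 0.95}
```

Reconstruction-time access then reduces to a retrieve(event_time) call against each folder of interest.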
NASA Astrophysics Data System (ADS)
Rivard, Maxime; Villeneuve, Alain; Lamouche, Guy
2017-02-01
For bioimaging applications, commercial swept sources currently provide enough power (tens of milliwatts) to ensure good imaging conditions without damaging the tissues. For industrial applications, more power is needed, since the amount of light collected can be very low due to challenging measurement conditions or poor sample reflectivity. To address this challenge, we explore three different setups to externally amplify the output of a commercial swept source: a booster semiconductor optical amplifier (BOA), an erbium-doped fiber amplifier (EDFA), and a combination of both. These external amplification setups allow the exploration of emerging OCT applications without the need to develop new hardware.
Mirus, Benjamin B.; Nimmo, J.R.
2013-01-01
The impact of preferential flow on recharge and contaminant transport poses a considerable challenge to water-resources management. Typical hydrologic models require extensive site characterization, but can underestimate fluxes when preferential flow is significant. A recently developed source-responsive model incorporates film-flow theory with conservation of mass to estimate unsaturated-zone preferential fluxes with readily available data. The term source-responsive describes the sensitivity of preferential flow in response to water availability at the source of input. We present the first rigorous tests of a parsimonious formulation for simulating water table fluctuations using two case studies, both in arid regions with thick unsaturated zones of fractured volcanic rock. Diffuse flow theory cannot adequately capture the observed water table responses at both sites; the source-responsive model is a viable alternative. We treat the active area fraction of preferential flow paths as a scaled function of water inputs at the land surface, then calibrate the macropore density to fit observed water table rises. Unlike previous applications, we allow the characteristic film-flow velocity to vary, reflecting the lag time between source and deep water table responses. Analysis of model performance and parameter sensitivity for the two case studies underscores the importance of identifying thresholds for initiation of film flow in unsaturated rocks, and suggests that this parsimonious approach is potentially of great practical value.
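Schematically, the parsimonious formulation combines the three quantities named above into a preferential flux (the symbols here are generic placeholders, not the authors' exact notation):

$$
q_{\mathrm{pref}}(t) \;=\; f(t)\; M \; V_{\mathrm{film}},
$$

where f(t) is the active-area fraction of preferential flow paths, scaled to water input at the land surface; M is the calibrated macropore density; and V_film is the characteristic film-flow velocity, allowed here to vary so as to capture the lag between surface input and the deep water-table response.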
Effect of Citric Acid Surface Modification on Solubility of Hydroxyapatite Nanoparticles.
Samavini, Ranuri; Sandaruwan, Chanaka; De Silva, Madhavi; Priyadarshana, Gayan; Kottegoda, Nilwala; Karunaratne, Veranja
2018-04-04
Worldwide, there is heightened interest in nanotechnology-based approaches to develop efficient nitrogen, phosphorus, and potassium fertilizers to address major challenges pertaining to food security. However, there are significant challenges associated with fertilizer manufacture and supply, as well as cost in both economic and environmental terms. The main issues relating to nitrogen fertilizer surround the use of fossil fuels in its production and the emission of greenhouse gases resulting from its use in agriculture; phosphorus, being a mineral source, is nonrenewable, which casts a shadow on its sustainable use in agriculture. This study focuses on the development of an efficient P nutrient system that could overcome the inherent problems arising from current P fertilizers. Attempts are made to synthesize citric acid surface-modified hydroxyapatite nanoparticles using wet chemical precipitation. The resulting nanohybrids were characterized using powder X-ray diffraction to extract crystallographic data, while functional group analysis was done by Fourier transform infrared spectroscopy. Morphology and particle size were studied using scanning electron microscopy, along with elemental analysis using energy-dispersive X-ray spectroscopy. The material's effectiveness as a source of P was investigated using water release studies and bioavailability studies with Zea mays as the model crop. Both tests demonstrated the increased availability of P from the nanohybrids in the presence of an organic acid compared with pure hydroxyapatite nanoparticles and rock phosphate.
Welvaert, Marijke; Caley, Peter
2016-01-01
Citizen science and crowdsourcing have been emerging as methods to collect data for surveillance and/or monitoring activities. They could be gathered under the overarching term 'citizen surveillance'. The discipline, however, still struggles to be widely accepted in the scientific community, mainly because these activities are not embedded in a quantitative framework. This results in an ongoing discussion on how to analyze and make useful inference from these data. Considering the data collection process, we illustrate how citizen surveillance can be classified according to the nature of the underlying observation process, measured in two dimensions: the degree of observer reporting intention and the control over observer detection effort. By classifying the observation process in these dimensions we distinguish between crowdsourcing, unstructured citizen science, and structured citizen science. This classification helps determine the appropriate data processing and statistical treatment of these data for making inference. Using our framework, it is apparent that published studies are overwhelmingly associated with structured citizen science, for which well developed statistical methods exist. In contrast, methods for making useful inference from purely crowd-sourced data remain under development, with the challenges of accounting for the unknown observation process considerable. Our quantitative framework for citizen surveillance calls for an integration of citizen science and crowdsourcing and provides a way forward to solve the statistical challenges inherent to citizen-sourced data.
Walker, Judith-Ann
2016-09-01
As global impact investors gear up to support the roll-out of the Sustainable Development Goals in the developing world, African CSOs are urged to ensure that governments shift health funding away from aid and loans to innovative domestic funding sources which prioritize health. To do so, African CSOs require support to build their capacity for policy and budget advocacy. Governments and development partners have failed to invest in long-term capacity building projects for indigenous NGOs, instead supporting INGOs to push the health advocacy agenda forward. In Nigeria, the Gates Foundation has risen to the challenge of building the capacity of indigenous NGOs for social accountability in child and family health. The 3-year pilot project, the Partnership for Advocacy in Child and Family Health (PACFaH), mainstreams capacity building as an effective implementation strategy for 8 indigenous NGOs to deliver on policy, budgetary, legislative, and administrative advocacy in four issue areas: 1) family planning; 2) nutrition; 3) routine immunization; and 4) reduction of under-5 deaths from diarrhea and pneumonia. This paper documents the achievements of the eight advocacy NGOs in PACFaH at midterm and notes that, while there have been challenges, working through capacity building as an implementation strategy has enabled the local groups to deliver evidence-based advocacy.
Functionalization of mesoporous materials for lanthanide and actinide extraction.
Florek, Justyna; Giret, Simon; Juère, Estelle; Larivière, Dominic; Kleitz, Freddy
2016-10-14
Among the energy sources currently available that could address our insatiable appetite for energy and minimize our CO2 emissions, solar, wind, and nuclear energy occupy an increasing portion of our energy portfolio. The energy associated with these sources can, however, only be harnessed after mineral resources containing valuable constituents such as actinides (Ac) and rare earth elements (REEs) are extracted, purified, and transformed into components necessary for the conversion of energy into electricity. Unfortunately, the environmental impacts resulting from their manufacture, including the generation of undesirable and sometimes radioactive wastes and the non-renewable nature of the mineral resources, to name a few, have emerged as challenges that should be addressed by the scientific community. In this perspective, the recent development of functionalized solid materials dedicated to selective elemental separation/pre-concentration could provide answers to several of the above-mentioned challenges. This review focuses on recent advances in the field of mesoporous solid-phase (SP) sorbents designed for liquid-solid extraction of REEs and Ac. Particular attention is devoted to silica and carbon sorbents functionalized with commonly known ligands, such as phosphorus- or amide-containing functionalities. The extraction performances of these new systems are discussed in terms of sorption capacity and selectivity. In order to support potential industrial applications of the silica- and carbon-based sorbents, their main drawbacks and advantages are highlighted and discussed.
4th Generation ECR Ion Sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyneis, Claude M.; Leitner, D.; Todd, D.S.
2008-12-01
The concepts and technical challenges related to developing a 4th generation ECR ion source with an RF frequency greater than 40 GHz and magnetic confinement fields greater than twice B_ecr will be explored in this paper. Based on the semi-empirical frequency scaling of ECR plasma density with the square of the operating frequency, there should be significant gains in performance over current 3rd generation ECR ion sources, which operate at RF frequencies between 20 and 30 GHz. While the 3rd generation ECR ion sources use NbTi superconducting solenoid and sextupole coils, the new sources will need to use different superconducting materials such as Nb3Sn to reach the required magnetic confinement, which scales linearly with RF frequency. Additional technical challenges include increased bremsstrahlung production, which may increase faster than the plasma density, bremsstrahlung heating of the cold mass, and the availability of high power continuous wave microwave sources at these frequencies. With each generation of ECR ion sources there are new challenges to be mastered, but the potential for higher performance and reduced cost of the associated accelerator continues to make this a promising avenue for development.
Samrat, Nahidul Hoque; Bin Ahmad, Norhafizan; Choudhury, Imtiaz Ahmed; Bin Taha, Zahari
2014-01-01
Today, the whole world faces a great challenge to overcome the environmental problems related to global energy production. Most of the islands throughout the world depend on imported fossil fuel for energy production. Recent development and research on green energy sources can assure a sustainable power supply for the islands. But their unpredictable nature and high dependency on weather conditions are the main limitations of renewable energy sources. To overcome this drawback, different renewable sources and converters need to be integrated with each other. This paper proposes a standalone hybrid photovoltaic- (PV-) wave energy conversion system with energy storage. In the proposed hybrid system, control of the bidirectional buck-boost DC-DC converter (BBDC) is used to maintain a constant dc-link voltage. It also accumulates the excess hybrid power in the battery bank and supplies this power to the system load during shortages of hybrid power. A three-phase voltage source inverter (VSI) with a complex vector control scheme is used to control the load-side voltage in terms of frequency and voltage amplitude. Based on the simulation results obtained from Matlab/Simulink, it has been found that the overall hybrid framework is capable of working under variable weather and load conditions. PMID:24892049
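The dc-link regulation role of the BBDC can be sketched as a PI loop commanding battery current to absorb or supply the instantaneous hybrid power imbalance; the snippet below is an illustrative discrete-time sketch with hypothetical setpoint, gains, and limits, not the controller from the paper.

```python
# PI regulation of dc-link voltage by a bidirectional buck-boost converter:
# positive battery current charges the battery (absorbs surplus hybrid power),
# negative current discharges it to cover a deficit. All values hypothetical.
V_REF, KP, KI, DT = 400.0, 0.8, 40.0, 1e-4   # setpoint (V), gains, step (s)

def make_bbdc_controller():
    integ = 0.0
    def step(v_dc):
        nonlocal integ
        err = v_dc - V_REF              # >0: dc link too high -> charge battery
        integ += err * DT
        i_batt_cmd = KP * err + KI * integ
        return max(min(i_batt_cmd, 50.0), -50.0)   # current limit +/- 50 A
    return step

ctrl = make_bbdc_controller()
for v in (405.0, 402.0, 399.0):         # sample dc-link voltage measurements
    print(f"V_dc={v:.0f} V -> battery current command {ctrl(v):+.2f} A")
```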
Multisource geological data mining and its utilization of uranium resources exploration
NASA Astrophysics Data System (ADS)
Zhang, Jie-lin
2009-10-01
Nuclear energy, as one of the clean energy sources, plays an important role in economic development in China, and according to the national long-term development strategy, many more nuclear power plants will be built in the next few years, so uranium resources exploration faces a great challenge. Research and practice in mineral exploration demonstrate that utilizing modern Earth Observation System (EOS) technology and developing new multi-source geological data mining methods are effective approaches to uranium deposit prospecting. Based on data mining and knowledge discovery technology, this paper uses multi-source geological data to characterize the electromagnetic spectral, geophysical, and spatial information of uranium mineralization factors, and provides technical support for uranium prospecting integrated with field remote sensing geological survey. The multi-source geological data used in this paper include satellite hyperspectral imagery (Hyperion), high spatial resolution remote sensing data, uranium geological information, airborne radiometric data, and aeromagnetic and gravity data, and related data mining methods have been developed, such as fusion of optical data and Radarsat imagery and information integration of remote sensing and geophysical data. Based on the above approaches, the multi-geoscience information of uranium mineralization factors, including complex polystage rock masses, mineralization-controlling faults, and hydrothermal alterations, has been identified, the metallogenic potential of uranium has been evaluated, and some predicted areas have been located.
Emo, love and god: making sense of Urban Dictionary, a crowd-sourced online dictionary.
Nguyen, Dong; McGillivray, Barbara; Yasseri, Taha
2018-05-01
The Internet facilitates large-scale collaborative projects and the emergence of Web 2.0 platforms, where producers and consumers of content unify, has drastically changed the information market. On the one hand, the promise of the 'wisdom of the crowd' has inspired successful projects such as Wikipedia, which has become the primary source of crowd-based information in many languages. On the other hand, the decentralized and often unmonitored environment of such projects may make them susceptible to low-quality content. In this work, we focus on Urban Dictionary, a crowd-sourced online dictionary. We combine computational methods with qualitative annotation and shed light on the overall features of Urban Dictionary in terms of growth, coverage and types of content. We measure a high presence of opinion-focused entries, as opposed to the meaning-focused entries that we expect from traditional dictionaries. Furthermore, Urban Dictionary covers many informal, unfamiliar words as well as proper nouns. Urban Dictionary also contains offensive content, but highly offensive content tends to receive lower scores through the dictionary's voting system. The low threshold to include new material in Urban Dictionary enables quick recording of new words and new meanings, but the resulting heterogeneous content can pose challenges in using Urban Dictionary as a source to study language innovation. PMID:29892417
ERIC Educational Resources Information Center
Cain, Jim
This paper provides information sources and ideas for challenge and adventure activities. Main information sources are listed: libraries, ERIC, and several publishers and programs. Some useful publications are described that provide activities and ideas related to outdoor education, environmental issues, games, special populations, educational…
Automated Source-Code-Based Testing of Object-Oriented Software
NASA Astrophysics Data System (ADS)
Gerlich, Ralf; Gerlich, Rainer; Dietrich, Carsten
2014-08-01
With the advent of languages such as C++ and Java in mission- and safety-critical space on-board software, new challenges arise for testing, and specifically for automated testing. In this paper we discuss some of these challenges, consequences and solutions based on an experiment in automated source-code-based testing for C++.
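The abstract leaves the paper's test-generation method unspecified; as a language-neutral illustration of one ingredient of automated testing — randomized, specification-driven input generation with invariant checks — here is a short Python sketch (the paper itself targets C++). The function under test and its invariants are hypothetical.

```python
import random

def clamp(x, lo, hi):
    """Function under test (hypothetical stand-in for generated C++ code)."""
    return max(lo, min(x, hi))

def test_clamp_randomized(trials=10_000, seed=42):
    rng = random.Random(seed)
    for _ in range(trials):
        lo, hi = sorted(rng.uniform(-1e6, 1e6) for _ in range(2))
        x = rng.uniform(-2e6, 2e6)
        y = clamp(x, lo, hi)
        # Invariants derived from the specification:
        assert lo <= y <= hi
        assert y == x or x < lo or x > hi

test_clamp_randomized()
print("all randomized clamp tests passed")
```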
26 CFR 1.737-3 - Basis adjustments; Recovery rules.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...
Kitinoja, Lisa; Saran, Sunil; Roy, Susanta K; Kader, Adel A
2011-03-15
This article discusses the needs and challenges of developing good, science-based, simple methods for postharvest handling that can be made available in developing countries. Some of the traditional challenges have been successfully met (i.e. identifying causes and sources of losses for key crops, identifying many potential postharvest technologies of practical use for reducing losses), but many challenges remain. These include the characterization of indigenous crops in terms of their unique postharvest physiology (e.g. respiration rate, susceptibility to water loss, chilling sensitivity, ethylene sensitivity), ascertaining the differences between handling recommendations made for well-known varieties and the needs of local varieties of crops, and determining cost effectiveness of scale-appropriate postharvest technologies in each locale and for each crop. Key issues include building capacity at the local level in postharvest science, university teaching and extension, and continued adaptive research efforts to match emerging postharvest technologies to local needs as these continue to change over time. Development of appropriate postharvest technology relies upon many disciplines that are relevant to the overall success of horticulture, i.e. plant biology, engineering, agricultural economics, food processing, nutrition, food safety, and environmental conservation. The expanding pool of new information derived from postharvest research and outreach efforts in these areas can lead in many directions which are likely to have an impact on relieving poverty in developing countries. Copyright © 2011 Society of Chemical Industry.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becker, Kurt H.; McCurdy, C. William; Orlando, Thomas M.
2000-09-01
This report is based largely on presentations and discussions at two workshops and contributions from workshop participants. The workshop on Fundamental Challenges in Electron-Driven Chemistry was held in Berkeley, October 9-10, 1998, and addressed questions regarding theory, computation, and simulation. The workshop on Electron-Driven Processes: Scientific Challenges and Technological Opportunities was held at Stevens Institute of Technology, March 16-17, 2000, and focused largely on experiments. Electron-molecule and electron-atom collisions initiate and drive almost all the relevant chemical processes associated with radiation chemistry, environmental chemistry, stability of waste repositories, plasma-enhanced chemical vapor deposition, plasma processing of materials for microelectronic devices and other applications, and novel light sources for research purposes (e.g. excimer lamps in the extreme ultraviolet) and in everyday lighting applications. The life sciences are a rapidly advancing field where the important role of electron-driven processes is only now beginning to be recognized. Many of the applications of electron-initiated chemical processes require results in the near term. A large-scale, multidisciplinary and collaborative effort should be mounted to solve these problems in a timely way so that their solution will have the needed impact on the urgent questions of understanding the physico-chemical processes initiated and driven by electron interactions.
Doyle, John T; Kindness, Larry; Realbird, James; Eggers, Margaret J; Camper, Anne K
2018-03-21
Disparities in access to safe public drinking water are increasingly being recognized as contributing to health disparities and environmental injustice for vulnerable communities in the United States. As the Co-Directors of the Apsaálooke Water and Wastewater Authority (AWWWA) for the Crow Tribe, with our academic partners, we present here the multiple and complex challenges we have addressed in improving and maintaining tribal water and wastewater infrastructure, including the identification of diverse funding sources for infrastructure construction, the need for many kinds of specialized expertise and long-term stability of project personnel, ratepayer difficulty in paying for services, an ongoing legacy of inadequate infrastructure planning, and a lack of water quality research capacity. As a tribal entity, the AWWWA faces additional challenges, including the complex jurisdictional issues affecting all phases of our work, a lack of authority to create water districts, and additional legal and regulatory gaps, especially with regard to environmental protection. Despite these obstacles, the AWWWA and Crow Tribe have successfully upgraded much of the local water and wastewater infrastructure. We find that ensuring safe public drinking water for tribal and other disadvantaged U.S. communities will require comprehensive, community-engaged approaches across a broad range of stakeholders to successfully address these complex legal, regulatory, policy, community capacity, and financial challenges.
Ferrie, Suzie
2006-04-01
Ethical dilemmas can be challenging for the nutrition support clinician who is accustomed to evidence-based practice. The emotional and personal nature of ethical decision making can present difficulties, and conflict can arise when people have different ethical perspectives. An understanding of ethical terms and ethical theories can be helpful in clarifying the source of this conflict. These may include prominent ethical theories such as moral relativism, utilitarianism, Kantian absolutism, Aristotle's virtue ethics and ethics of care, as well as the key ethical principles in healthcare (autonomy, beneficence, nonmaleficence, and justice). Adopting a step-by-step approach can simplify the process of resolving ethical problems.
An Insight Into Neurophysiology of Pulpal Pain: Facts and Hypotheses
Gupta, Abhishek; N., Meena
2013-01-01
Pain and pain control are important to the dental profession because the general perception of the public is that dental treatment and pain go hand in hand. Successful dental treatment requires that the source of pain be detected. If the origin of pain is not found, inappropriate dental care and, ultimately, extraction may result. Pain experienced before, during, or after endodontic therapy is a serious concern to both patients and endodontists, and the variability of discomfort presents a challenge in terms of diagnostic methods, endodontic therapy, and endodontic knowledge. This review will help clinicians understand the basic neurophysiology of pulpal pain and other painful conditions of the dental pulp that are not well understood. PMID:24156000
Supramolecular Approaches to Nanoscale Morphological Control in Organic Solar Cells
Haruk, Alexander M.; Mativetsky, Jeffrey M.
2015-01-01
Having recently surpassed 10% efficiency, solar cells based on organic molecules are poised to become a viable low-cost clean energy source with the added advantages of mechanical flexibility and light weight. The best-performing organic solar cells rely on a nanostructured active layer morphology consisting of a complex organization of electron donating and electron accepting molecules. Although much progress has been made in designing new donor and acceptor molecules, rational control over active layer morphology remains a central challenge. Long-term device stability is another important consideration that needs to be addressed. This review highlights supramolecular strategies for generating highly stable nanostructured organic photovoltaic active materials by design. PMID:26110382
VLTI + MIDI Study of the High Mass Protostellar Candidate NGC 3603 IRS 9A
NASA Astrophysics Data System (ADS)
Nürnberger, D. E. A.; Vehoff, S.; Hummel, C. A.; Duschl, W. J.
2010-02-01
The formation and early evolution of high-mass stars is among the hottest topics in astrophysics. Interferometric studies of these young stars and their circumstellar environments (envelopes, disks and jets) at near- and mid-infrared wavelengths are still rare and, in terms of data analysis and interpretation, very challenging. Here we report on observations of the high-mass protostellar candidate NGC 3603 IRS 9A which we undertook with VLTI + MIDI in 2005, complemented by near- and mid-infrared imaging and spectroscopic data. We discuss our results obtained from dedicated modeling efforts, employing both the DUSTY and MC3D radiative transfer codes for a selected number of source geometries and surface brightness distributions.
Gerbersdorf, Sabine U; Cimatoribus, Carla; Class, Holger; Engesser, Karl-H; Helbich, Steffen; Hollert, Henner; Lange, Claudia; Kranert, Martin; Metzger, Jörg; Nowak, Wolfgang; Seiler, Thomas-Benjamin; Steger, Kristin; Steinmetz, Heidrun; Wieprecht, Silke
2015-06-01
Anthropogenic Trace Compounds (ATCs), which continuously grow in number and concentration, are an emerging issue for water quality in both natural and technical environments. The complex web of exposure pathways as well as the variety in chemical structure and potency of ATCs represent immense challenges for future research and policy initiatives. This review summarizes current trends and identifies knowledge gaps in innovative, effective monitoring and management strategies, while addressing research questions concerning ATC occurrence, fate, detection and toxicity. We highlight the increasing sensitivity of chemical analytics and the challenges in harmonizing sampling protocols and methods, as well as the need for ATC indicator substances to enable a valid cross-national monitoring routine. Secondly, the status quo in ecotoxicology is described to advocate for better implementation of long-term tests, to address toxicity at the community and environmental as well as human-health levels, and to adapt various test levels and endpoints. Moreover, we discuss potential sources of ATCs and the current removal efficiency of wastewater treatment plants (WWTPs) to indicate the most effective intervention points and elimination strategies. Knowledge gaps in the transport and/or retention of ATCs during their passage through surface waters and groundwaters are further emphasized in relation to their physico-chemical properties, abiotic conditions and biological interactions, in order to highlight fundamental research needs. Finally, we demonstrate the importance and remaining challenges of appropriate ATC risk assessment, since this will greatly assist in identifying the most urgent calls for action, selecting the most promising measures, and evaluating the success of implemented management strategies. Copyright © 2015. Published by Elsevier Ltd.
Crowd Sourcing for Challenging Technical Problems and Business Model
NASA Technical Reports Server (NTRS)
Davis, Jeffrey R.; Richard, Elizabeth
2011-01-01
Crowd sourcing may be defined as the act of outsourcing tasks that are traditionally performed by an employee or contractor to an undefined, generally large group of people or community (a crowd) in the form of an open call. The open call may be issued by an organization wishing to find a solution to a particular problem or complete a task, or by an open innovation service provider on behalf of that organization. In 2008, the Space Life Sciences Directorate (SLSD), with the support of Wyle Integrated Science and Engineering, established and implemented pilot projects in open innovation (crowd sourcing) to determine if these new internet-based platforms could indeed find solutions to difficult technical challenges. These unsolved technical problems were converted to problem statements, also called "Challenges" or "Technical Needs" by the various open innovation service providers, and were then posted externally to seek solutions. In addition, an open call was issued internally to NASA employees Agency-wide (10 Field Centers and NASA HQ) using an open innovation service provider crowd sourcing platform, with each Center's challenges posted for the others to propose solutions. From 2008 to 2010, the SLSD issued 34 challenges, 14 externally and 20 internally. The 14 external problems or challenges were posted through three different vendors: InnoCentive, Yet2.com and TopCoder. The 20 internal challenges were conducted using the InnoCentive crowd sourcing platform designed for internal use by an organization. This platform was customized for NASA use and promoted as NASA@Work. The results were significant. Of the seven InnoCentive external challenges, two full and five partial awards were made in complex technical areas such as predicting solar flares and long-duration food packaging. Similarly, the TopCoder challenge yielded an optimization algorithm for designing a lunar medical kit. The Yet2.com challenges yielded many new industry and academic contacts in bone imaging, microbial detection and even the use of pharmaceuticals for radiation protection. The internal challenges through NASA@Work drew over 6000 participants across all NASA centers. Challenges conducted by each NASA center elicited ideas and solutions from several other NASA centers and demonstrated rapid and efficient participation from employees at multiple centers to contribute to problem solving. Finally, on January 19, 2011, the SLSD conducted a workshop on open collaboration and innovation strategies and best practices through the newly established NASA Human Health and Performance Center (NHHPC). Initial projects will be described, leading to a new business model for SLSD.
Long-term care policy and financing as a public or private matter in the United States.
Yee, D L
2001-01-01
Devising effective approaches to assure adequate resources, infrastructure, and broad societal support for chronic care needs is a volatile and potentially unpopular undertaking that can produce many losers (those getting far less than they want) and few winners (those who gain access to scarce societal resources for care). In the United States, debates on long-term care involve a complex set of issues and services that link health, social services (welfare), and economic policies and that often pit public- and private-sector interests and values against one another. Yet long-term care policies fill a necessary function in society: to clarify the roles, expectations, and functions of the public, non-profit, for-profit, individual, and family sectors of a society. By assessing and developing policy proposals that include all long-term care system dimensions, a society can arrive at systematic, fair, and rational decisions. Limiting decisions to system financing aspects alone is likely to result in unforeseen or unintended effects in a long-term care system that stopgap "fixes" cannot resolve. Three underlying policy challenges are presented: the need for policymakers to consider whether the public sector is the first or last source of payment for long-term care; whether government is seen primarily as a risk or cost manager; and the extent to which choice is afforded to elders and family caregivers with regard to the types, settings, and amount of long-term care desired to complement family care.
Bayesian estimation of a source term of radiation release with approximately known nuclide ratios
NASA Astrophysics Data System (ADS)
Tichý, Ondřej; Šmídl, Václav; Hofman, Radek
2016-04-01
We are concerned with estimation of a source term in the case of an accidental release from a known location, e.g. a power plant. Usually, the source term of an accidental release of radiation comprises a mixture of nuclides. Gamma dose rate measurements do not provide direct information on the source term composition. However, physical properties of the respective nuclides (deposition properties, decay half-life) can be used when uncertain information on nuclide ratios is available, e.g. from the known reactor inventory. The proposed method is based on a linear inverse model where the observation vector y arises as a linear combination y = Mx of a source-receptor-sensitivity (SRS) matrix M and the source term x. The task is to estimate the unknown source term x. The problem is ill-conditioned, and regularization is needed to obtain a reasonable solution. In this contribution, we assume that the nuclide ratios of the release are known with some degree of uncertainty. This knowledge is used to form the prior covariance matrix of the source term x. Due to uncertainty in the ratios, the diagonal elements of the covariance matrix are considered unknown. Positivity of the source term estimate is guaranteed by using a multivariate truncated Gaussian distribution. Following the Bayesian approach, we estimate all parameters of the model from the data, so that y, M, and the known ratios are the only inputs of the method. Since exact inference in the model is intractable, we follow the Variational Bayes method, yielding an iterative algorithm for estimation of all model parameters. Performance of the method is studied on a simulated 6-hour power plant release in which 3 nuclides are released and 2 nuclide ratios are approximately known. A comparison with the method without known nuclide ratios is given to demonstrate the usefulness of the proposed approach. This research is supported by EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
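The estimation problem just described — a linear model y = Mx, a prior built from approximately known nuclide ratios, and a positivity constraint — can be illustrated without the full iterative Variational Bayes machinery. Below is a minimal MAP (maximum a posteriori) sketch in Python; the dimensions, the stand-in SRS matrix, the prior mean and the prior strength are all invented, and a fixed prior variance replaces the unknown diagonal covariance the paper estimates.

```python
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(0)

# Toy problem: 40 gamma dose rate observations, 6 release segments.
n_obs, n_src = 40, 6
M = rng.uniform(0.0, 1.0, (n_obs, n_src))         # stand-in SRS matrix
x_true = np.array([0.0, 2.0, 5.0, 3.0, 1.0, 0.0])
sigma = 0.1                                        # observation noise std
y = M @ x_true + sigma * rng.normal(size=n_obs)

# Approximately known nuclide ratios enter through the prior mean;
# tau encodes how uncertain those ratios are (here fixed, whereas the
# paper treats the prior variances as unknowns to be estimated).
mu = np.array([0.5, 2.5, 4.0, 2.5, 1.5, 0.5])
tau = 2.0

# MAP estimate: minimize ||(y - Mx)/sigma||^2 + ||(x - mu)/tau||^2, x >= 0.
# Stacking the prior as pseudo-observations turns this into a single
# bounded least-squares problem (the truncated-Gaussian analogue).
A = np.vstack([M / sigma, np.eye(n_src) / tau])
b = np.concatenate([y / sigma, mu / tau])
res = lsq_linear(A, b, bounds=(0.0, np.inf))
print("estimated source term:", np.round(res.x, 2))
```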
Salathé, Marcel
2016-01-01
The digital revolution has contributed to very large data sets (ie, big data) relevant for public health. The two major data sources are electronic health records from traditional health systems and patient-generated data. As the two data sources have complementary strengths—high veracity in the data from traditional sources and high velocity and variety in patient-generated data—they can be combined to build more-robust public health systems. However, they also have unique challenges. Patient-generated data in particular are often completely unstructured and highly context dependent, posing essentially a machine-learning challenge. Some recent examples from infectious disease surveillance and adverse drug event monitoring demonstrate that the technical challenges can be solved. Despite these advances, the problem of verification remains, and unless traditional and digital epidemiologic approaches are combined, these data sources will be constrained by their intrinsic limits. PMID:28830106
Jones, Dorothy I; McGee, Charles E; Sample, Christopher J; Sempowski, Gregory D; Pickup, David J; Staats, Herman F
2016-07-01
Modified vaccinia Ankara virus (MVA) is a smallpox vaccine candidate. This study was performed to determine if MVA vaccination provides long-term protection against rabbitpox virus (RPXV) challenge, an animal model of smallpox. Two doses of MVA provided 100% protection against a lethal intranasal RPXV challenge administered 9 months after vaccination. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
Challenges for Professional Organizations: Lessons from the Past
ERIC Educational Resources Information Center
O'Neil, Sharon Lund; Willis, Cheryl L.
2005-01-01
Many challenges face professional organizations. This study focused on the contributions, challenges, and trends in business education professional organizations over the years. Data was gathered from formal and informal sources associated with 17 business education professional organizations. The study showed that primary challenges were…
Aligning vocabulary for interoperability of ISR assets using authoritative sources
NASA Astrophysics Data System (ADS)
Hookway, Steve; Patten, Terry; Gorman, Joe
2017-05-01
The growing arsenal of network-centric sensor platforms shows great potential to enhance situational awareness capabilities. Non-traditional sensors collect a diverse range of data that can provide a more accurate and comprehensive common operational picture when combined with conventional intelligence, surveillance, and reconnaissance (ISR) products. One of the integration challenges is mediating differences in terminology that different data providers use to describe the data they have extracted. A data consumer should be able to reference information using the vocabulary that they are familiar with and rely on the framework to handle the mediation; for example, it should be up to the framework to identify that two different terms are synonyms for the same concept. In this paper we present an approach for automatically performing this alignment using authoritative sources such as Wikipedia (a stand-in for the Intellipedia wiki), and present experimental results that demonstrate that this approach is able to align a large number of concepts between different terminologies.
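The mediation step described here — recognizing that two providers' terms denote the same concept — can be illustrated with a toy canonicalization table. In a real system the table would be derived from an authoritative source's redirect and synonym links (Wikipedia or Intellipedia); the entries below are invented.

```python
# Toy synonym alignment: canonicalize terms via an authoritative redirect table.
REDIRECTS = {
    "uav": "unmanned aerial vehicle",
    "drone": "unmanned aerial vehicle",
    "sam site": "surface-to-air missile site",
}

def canonical(term: str) -> str:
    t = term.strip().lower()
    return REDIRECTS.get(t, t)

def same_concept(term_a: str, term_b: str) -> bool:
    """True if two providers' terms resolve to the same canonical concept."""
    return canonical(term_a) == canonical(term_b)

print(same_concept("UAV", "drone"))       # True: both resolve to the same concept
print(same_concept("UAV", "SAM site"))    # False
```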
Nuclear Power; Past, present and future
NASA Astrophysics Data System (ADS)
Elliott, David
2017-04-01
This book looks at the early history of nuclear power, at what happened next, and at its longer-term prospects. The main question is: can nuclear power overcome the problems that have emerged? It was once touted as the ultimate energy source, freeing mankind from reliance on dirty, expensive fossil energy. Sixty years on, nuclear only supplies around 11.5% of global energy and is being challenged by cheaper energy options. While the costs of renewable sources, like wind and solar, are falling rapidly, nuclear costs have remained stubbornly high. Its development has also been slowed by a range of other problems, including a spate of major accidents, security concerns and the as yet unresolved issue of what to do with the wastes that it produces. In response, a new generation of nuclear reactors is being developed, many of them actually revised versions of the ideas first looked at in the earlier phase. Will this new generation of reactors bring nuclear energy to the forefront of energy production in the future?
10 CFR 50.67 - Accident source term.
Code of Federal Regulations, 2014 CFR
2014-01-01
... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... Title 10 (Energy), NUCLEAR... Conditions of Licenses and Construction Permits, § 50.67 Accident source term. (a) Applicability. The...
Sherman, Martin F; Gershon, Robyn R; Riley, Halley E M; Zhi, Qi; Magda, Lori A; Peyrot, Mark
2017-06-01
We examined psychological outcomes in a sample of participants who evacuated from the World Trade Center towers on September 11, 2001. This study aimed to identify risk factors for psychological injury that might be amenable to change, thereby reducing adverse impacts associated with emergency high-rise evacuation. We used data from a cross-sectional survey conducted 2 years after the attacks to classify 789 evacuees into 3 self-reported psychological outcome categories: long-term psychological disorder diagnosed by a physician, short-term psychological disorder and/or memory problems, and no known psychological disorder. After nonmodifiable risk factors were controlled for, diagnosed psychological disorder was more likely for evacuees who reported lower "emergency preparedness safety climate" scores, more evacuation challenges (during exit from the towers), and evacuation-related physical injuries. Other variables associated with increased risk of psychological disorder outcome included gender (female), lower levels of education, preexisting physical disability, preexisting psychological disorder, greater distance to final exit, and more information sources during egress. Improving the "emergency preparedness safety climate" of high-rise business occupancies and reducing the number of egress challenges are potential strategies for reducing the risk of adverse psychological outcomes of high-rise evacuations. Focused safety training for individuals with physical disabilities is also warranted. (Disaster Med Public Health Preparedness. 2017;11:326-336).
Exploiting semantic linkages among multiple sources for semantic information retrieval
NASA Astrophysics Data System (ADS)
Li, JianQiang; Yang, Ji-Jiang; Liu, Chunchen; Zhao, Yu; Liu, Bo; Shi, Yuliang
2014-07-01
The vision of the Semantic Web is to build a global Web of machine-readable data to be consumed by intelligent applications. As a first step towards making this vision come true, the linked open data initiative has fostered many novel applications aimed at improving data accessibility on the public Web. The enterprise environment, by contrast, is so different from the public Web that most potentially usable business information originates in unstructured form (typically free text), which poses a challenge for the adoption of semantic technologies in the enterprise. Considering that the business information in a company is highly specific and centred around a set of commonly used concepts, this paper describes a pilot study that migrates the concept of linked data into the development of a domain-specific application, a vehicle repair support system. The set of commonly used concepts, including car part names and car-repair phenomenon terms, is employed to build linkages between data and documents distributed among different sources, leading to the fusion of documents and data across source boundaries. We then describe semantic information retrieval approaches that consume these linkages to create value for companies. Experiments on two real-world data sets show that the proposed approaches outperform the best baseline by 6.3-10.8% and 6.4-11.1% in terms of top-5 and top-10 precision, respectively. We believe that our pilot study can serve as an important reference for the development of similar semantic applications in an enterprise environment.
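A minimal sketch of the linkage idea described above: records from different sources are annotated with a shared concept vocabulary (part names, phenomenon terms), and retrieval then follows those links across source boundaries. The vocabulary and records below are invented, not from the paper's data sets.

```python
from collections import defaultdict

# Shared domain vocabulary (part names, phenomenon terms) used to link sources.
VOCAB = {"brake pad", "alternator", "squealing noise", "no-start"}

# Records from two sources: free-text repair reports and structured parts data.
records = [
    ("report-17", "Customer reports squealing noise when braking; brake pad worn."),
    ("report-23", "No-start condition traced to alternator failure."),
    ("part-001",  "brake pad, front axle, P/N BP-1138"),
    ("part-002",  "alternator, 90 A, P/N AL-2207"),
]

# Build the concept -> records linkage (a tiny linked-data layer).
links = defaultdict(set)
for rec_id, text in records:
    for concept in VOCAB:
        if concept in text.lower():
            links[concept].add(rec_id)

# Retrieval across source boundaries: one query concept returns both the
# free-text reports and the structured part entries it links.
print(sorted(links["brake pad"]))    # ['part-001', 'report-17']
print(sorted(links["alternator"]))   # ['part-002', 'report-23']
```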
Challenges in sharing of geospatial data by data custodians in South Africa
NASA Astrophysics Data System (ADS)
Kay, Sissiel E.
2018-05-01
As most development planning and rendering of public services happens at a place or in a space, geospatial data is required. This geospatial data is best managed through a spatial data infrastructure, a key objective of which is to share geospatial data. The collection and maintenance of geospatial data is expensive and time consuming, so the principle of "collect once, use many times" should apply. It is best to obtain geospatial data from the authoritative source: the appointed data custodian. In South Africa, the South African Spatial Data Infrastructure (SASDI) is the means to achieve the requirement for geospatial data sharing. This requires geospatial data sharing to take place between the data custodian and the user. All data custodians are expected to comply with the Spatial Data Infrastructure Act (SDI Act) in terms of geospatial data sharing. Currently, data custodians are experiencing challenges with regard to the sharing of geospatial data. This research is based on the ten data themes currently selected by the Committee for Spatial Information and the organisations identified as the data custodians for these ten data themes. The objectives are to determine whether the identified data custodians comply with the SDI Act with respect to geospatial data sharing, and if not, what the reasons for this are. Through an international comparative assessment, it then determines whether compliance with the SDI Act is too onerous on the data custodians. The research concludes that there are challenges with geospatial data sharing in South Africa and that the data custodians only partially comply with the SDI Act in terms of geospatial data sharing. However, it is shown that the South African legislation is not too onerous on the data custodians.
Grass, Juliane; Kirschbaum, Clemens; Miller, Robert; Gao, Wei; Steudte-Schmiedgen, Susann; Stalder, Tobias
2015-03-01
Hair cortisol concentrations (HCC) are assumed to provide a stable, integrative marker of long-term systemic cortisol secretion. However, contrary to this assumption, some recent observations have raised the possibility that HCC may be subject to acute influences, potentially related to cortisol incorporation from sweat. Here, we provide a first detailed in vivo investigation of this possibility comprising two independent experimental studies: study I (N=42) used a treadmill challenge to induce sweating together with systemic cortisol reactivity while in study II (N=52) a sauna bathing challenge induced sweating without systemic cortisol changes. In both studies, repeated assessments of HCC, salivary cortisol, cortisol in sweat and individuals' sweating rate (single assessment) were conducted on the experimental day and at a next-day follow-up. Results across the two studies consistently revealed that HCC were not altered by the acute interventions. Further, HCC were found to be unrelated to acute salivary cortisol reactivity, sweat cortisol levels, sweating rate or the time of examination. In line with previous data, cortisol levels in sweat were strongly related to total salivary cortisol output across the examined periods. The present results oppose recent case report data by showing that single sweat-inducing interventions do not result in acute changes in HCC. Our data also tentatively speak against the notion that cortisol in sweat may be a dominant source of HCC. Further, our findings also indicate that HCC are not subject to diurnal variation. This research provides further support for hair cortisol analysis as a marker of integrated long-term systemic cortisol secretion. Copyright © 2015 Elsevier Ltd. All rights reserved.
SPIRITUAL WELL-BEING IN LONG-TERM COLORECTAL CANCER SURVIVORS WITH OSTOMIES
Bulkley, Joanna; McMullen, Carmit K.; Hornbrook, Mark C.; Grant, Marcia; Altschuler, Andrea; Wendel, Christopher S.; Krouse, Robert S.
2014-01-01
Objective Spiritual well-being (SpWB) is integral to health-related quality of life (HRQOL). The challenges of colorectal cancer (CRC) and subsequent bodily changes can affect SpWB. We analyzed the SpWB of CRC survivors with ostomies. Methods Two hundred eighty-three long-term (≥5 years) CRC survivors with permanent ostomies completed the modified City of Hope Quality of Life-Ostomy (mCOH-QOL-O) questionnaire. An open-ended question elicited respondents’ greatest challenge in living with an ostomy. We used content analysis to identify SpWB responses and develop themes. We analyzed responses on the 3-item SpWB sub-scale. Results Open-ended responses from 52% of participants contained SpWB content. Fifteen unique SpWB themes were identified. Sixty percent of individuals expressed positive themes such as “positive attitude”, “I am fortunate”, “appreciate life more”, and “strength through religious faith”. Negative themes, expressed by only 29% of respondents, included “struggling to cope”, “not feeling ‘normal’”, and “loss”. Fifty-five percent of respondents expressed ambivalent themes including “learning acceptance”, “an ostomy is the price for survival”, “reason to be around despite suffering”, and “continuing to cope despite challenges”. The majority (64%) had a high SpWB sub-scale score. Conclusions While CRC survivors with ostomies infrequently mentioned negative SpWB themes as a major challenge, ambivalent themes were common. SpWB themes often were mentioned as a source of resilience or part of the struggle to adapt to an altered body after cancer surgery. Interventions to improve the quality of life of cancer survivors should contain program elements designed to address SpWB that support personal meaning, inner peace, inter-connectedness, and belonging. PMID:23749460
Gibson, F; Hibbins, S; Grew, T; Morgan, S; Pearce, S; Stark, D; Fern, L A
2016-11-01
Young people with cancer exhibit unique needs. During a time of normal physical and psychological change, multiple disease- and treatment-related symptoms cause short- and long-term physical and psychosocial effects. Little is known about how young people cope with the impact of cancer and its treatment on daily routines, or about their strategies for managing the challenges of cancer and treatment. We aimed to determine how young people describe these challenges through a social media site. Using the principles of virtual ethnography and watching videos on a social media site, we gathered data from young people describing their cancer experience. Qualitative content analysis was employed to analyse and interpret the narrative from longitudinal 'video diaries' by 18 young people, equating to 156 films and 27 h and 49 min of recording. Themes were described, then organized and clustered into typologies grouping commonalities across themes. Four typologies emerged, reflective of the cancer trajectory: treatment and relenting side effects; rehabilitation and getting on with life; relapse and facing more treatment; and coming to terms with dying. This study confirms the need for young people to strive towards normality and to create a new normal, even where uncertainty prevails. The strategies young people used to gain mastery over their illness and the types of stories they chose to tell provide the focus of the main narrative. Social media sites can be examined as a source of data, to supplement or substitute for more traditional routes of data collection that are known to be practically challenging with this population. Copyright © 2016 John Wiley & Sons, Ltd.
3D Hydrodynamics Simulation of Amazonian Seasonally Flooded Wetlands
NASA Astrophysics Data System (ADS)
Pinel, S. S.; Bonnet, M. P.; Da Silva, J. S.; Cavalcanti, R., Sr.; Calmant, S.
2016-12-01
In the lower Amazon basin, interactions between floodplains and river channels are important in terms of exchanges of water, sediments and nutrients. These wetlands are considered hotspots of biodiversity and are among the most productive in the world. However, they are threatened by climatic changes and anthropic activities. Hence, considering the implications for predicting the inundation status of floodplain habitats and the strong interactions between water circulation, energy fluxes, and biogeochemical and ecological processes, detailed analyses of flooding dynamics are useful and needed. Numerical inundation models offer a means to study the interactions among different water sources. Modeling flood events in this area is challenging because flows respond to dynamic hydraulic controls coming from several water sources, complex geomorphology, and vegetation. In addition, due to the difficulty of access, there is a lack of hydrological data. In this context, the use of monitoring systems based on remote sensing is a good option. In this study, we simulated the filling and drainage processes of an Amazon floodplain (Janauacá Lake, AM, Brazil) over a 6-year period (2006-2012). Common approaches to flow modeling in the Amazon region consist of coupling a 1D simulation of the main-channel flood wave to a 2D simulation of the inundation of the floodplain. Here, our approach differs in that the floodplain is fully simulated. The model used is IPH-ECO, a three-dimensional hydrodynamic module coupled with an ecosystem module. The IPH-ECO hydrodynamic module solves the Reynolds-averaged Navier-Stokes equations using a semi-implicit discretization. After calibrating the simulation against roughness coefficients, we validated the model in terms of vertical accuracy against water levels (daily in situ and altimetric data), in terms of flood extent against inundation maps deduced from available remote-sensed imagery (ALOS-1/PALSAR), and in terms of velocity. We analyzed the inter-annual variability in hydrological fluxes and the inundation dynamics of the floodplain unit. Dominant sources of inflow varied seasonally among direct rain and local runoff (November to April), the Amazon River (May to August), and seepage (September to October).
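The semi-implicit discretization mentioned above is the key to taking large, stable time steps when stiff terms are present. As a generic one-dimensional illustration (not IPH-ECO's actual scheme), the following θ-method step for the diffusion equation treats the stiff term implicitly; all values are invented.

```python
import numpy as np

# Semi-implicit (theta-method) update for 1D diffusion u_t = D u_xx, a toy
# stand-in for the semi-implicit discretization used in 3D hydrodynamic
# models: stiff terms are treated implicitly for stability.
def step(u, D, dx, dt, theta=0.5):
    n = len(u)
    r = D * dt / dx**2
    A = np.eye(n) * (1 + 2 * theta * r)
    B = np.eye(n) * (1 - 2 * (1 - theta) * r)
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = -theta * r
        B[i, i + 1] = B[i + 1, i] = (1 - theta) * r
    return np.linalg.solve(A, B @ u)     # implicit solve each time step

u = np.zeros(50); u[25] = 1.0            # initial spike
for _ in range(100):
    u = step(u, D=1.0, dx=1.0, dt=1.0)   # stable even with a large dt
print(round(u.sum(), 6))                 # close to 1; small boundary losses
```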
Communicating science in politicized environments.
Lupia, Arthur
2013-08-20
Many members of the scientific community attempt to convey information to policymakers and the public. Much of this information is ignored or misinterpreted. This article describes why these outcomes occur and how science communicators can achieve better outcomes. The article focuses on two challenges associated with communicating scientific information to such audiences. One challenge is that people have less capacity to pay attention to scientific presentations than many communicators anticipate. A second challenge is that people in politicized environments often make different choices about whom to believe than do people in other settings. Together, these challenges cause policymakers and the public to be less responsive to scientific information than many communicators desire. Research on attention and source credibility can help science communicators better adapt to these challenges. Attention research clarifies when, and to what type of stimuli, people do (and do not) pay attention. Source credibility research clarifies the conditions under which an audience will believe scientists' descriptions of phenomena rather than the descriptions of less-valid sources. Such research can help communicators stay true to their science while making their findings more memorable and more believable to more audiences.
Siewert, F.; Buchheim, J.; Zeschke, T.; Störmer, M.; Falkenberg, G.; Sankari, R.
2014-01-01
To fully exploit the ultimate source properties of the next-generation light sources, such as free-electron lasers (FELs) and diffraction-limited storage rings (DLSRs), the quality requirements for gratings and reflective synchrotron optics, especially mirrors, have significantly increased. These coherence-preserving optical components for high-brightness sources will feature nanoscopic shape accuracies over macroscopic length scales up to 1000 mm. To enable high efficiency in terms of photon flux, such optics will be coated with application-tailored single or multilayer coatings. Advanced thin-film fabrication of today enables the synthesis of layers on the sub-nanometre precision level over a deposition length of up to 1500 mm. Specifically dedicated metrology instrumentation of comparable accuracy has been developed to characterize such optical elements. Second-generation slope-measuring profilers like the nanometre optical component measuring machine (NOM) at the BESSY-II Optics laboratory allow the inspection of up to 1500 mm-long reflective optical components with an accuracy better than 50 nrad r.m.s. Besides measuring the shape on top of the coated mirror, it is of particular interest to characterize the internal material properties of the mirror coating, which is the domain of X-rays. Layer thickness, density and interface roughness of single and multilayer coatings are investigated by means of X-ray reflectometry. In this publication recent achievements in the field of slope measuring metrology are shown and the characterization of different types of mirror coating demonstrated. Furthermore, upcoming challenges to the inspection of ultra-precise optical components designed to be used in future FEL and DLSR beamlines are discussed. PMID:25177985
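Slope-measuring profilers such as the NOM deliver a slope profile, from which the figure error and height profile follow by simple statistics and integration. The sketch below works through that step on synthetic data; the mirror length, sampling and error levels are invented, chosen only to echo the magnitudes quoted in the abstract.

```python
import numpy as np

# Synthetic slope profile of a 1000 mm mirror sampled every 1 mm:
# a gentle ideal curvature plus ~50 nrad rms of random slope error.
rng = np.random.default_rng(1)
x = np.arange(0, 1000.0, 1.0)                       # position (mm)
slope = 1e-6 * (x - 500) / 500                      # ideal slope (rad)
slope_meas = slope + 50e-9 * rng.standard_normal(x.size)

# rms slope error after removing the ideal shape:
residual = slope_meas - slope
print("rms slope error: %.1f nrad" % (residual.std() * 1e9))

# Height profile by cumulative trapezoidal integration of the slope:
dx_mm = 1.0
height_mm = np.concatenate(
    [[0.0], np.cumsum((slope_meas[1:] + slope_meas[:-1]) / 2 * dx_mm)])
print("height range: %.1f nm"
      % ((height_mm.max() - height_mm.min()) * 1e6))  # mm -> nm
```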
NASA Astrophysics Data System (ADS)
Schuetze, C.; Sauer, U.; Dietrich, P.
2015-12-01
Reliable detection and assessment of near-surface CO2 emissions from natural or anthropogenic sources require the application of various monitoring tools at different spatial scales. In particular, optical remote sensing tools for atmospheric monitoring have the potential to measure CO2 emissions integrally over larger scales (> 10,000 m2). Within the framework of the MONACO project ("Monitoring approach for geological CO2 storage sites using a hierarchical observation concept"), an integrative hierarchical monitoring concept was developed and validated at different field sites, with the aim of establishing a modular observation strategy including investigations in the shallow subsurface, at ground surface level, and in the lower atmospheric boundary layer. The main aims of the atmospheric monitoring using optical remote sensing were to observe gas dispersion into the near-surface atmosphere, to determine maximum concentration values, and to identify the main challenges associated with monitoring extended emission sources with the proposed methodological setup under typical environmental conditions. The presentation will give an overview of several case studies using the integrative approach of open-path Fourier transform infrared spectroscopy (OP FTIR) in combination with in situ measurements. As a main result, the method was validated as a possible approach for continuous monitoring of atmospheric composition, in terms of integral determination of GHG concentrations, and for identifying target areas that need to be investigated in more detail. Data interpretation should, in particular, closely consider the micrometeorological conditions. Technical aspects concerning robust equipment, experimental setup and fast data-processing algorithms have to be taken into account for the enhanced automation of atmospheric monitoring.
Comparing types of local public health agencies in North Carolina.
Markiewicz, Milissa; Moore, Jill; Foster, Johanna H; Berner, Maureen; Matthews, Gene; Wall, Aimee
2013-01-01
Some states are considering restructuring local public health agencies (LPHAs) in hopes of achieving long-term efficiencies. North Carolina's experience operating different types of LPHAs, such as county health departments, district health departments, public health authorities, and consolidated human services agencies, can provide valuable information to policy makers in other states who are examining how best to organize their local public health system. To identify stakeholders' perceptions of the benefits and challenges associated with different types of LPHAs in North Carolina and to compare LPHA types on selected financial, workforce, and service delivery measures. Focus groups and key informant interviews were conducted to identify stakeholders' perceptions of different LPHA types. To compare LPHA types on finance, workforce, and service delivery measures, descriptive statistical analyses were performed on publicly available quantitative data. North Carolina. Current and former state and local public health practitioners, county commissioners, county managers, assistant managers, state legislators, and others. In addition to identifying stakeholders' perceptions of LPHA types, proportion of total expenditures by funding source, expenditures per capita by funding source, full-time equivalents per 1000 population, and percentage of 127 tracked services offered were calculated. Stakeholders reported benefits and challenges of all LPHA types. LPHA types differ with regard to source of funding, with county health departments and consolidated human services agencies receiving a greater percentage of their funding from county appropriations than districts and authorities, which receive a comparatively larger percentage from other revenues. Types of LPHAs are not entirely distinct from one another, and LPHAs of the same type can vary greatly from one another. However, stakeholders noted differences between LPHA types-particularly with regard to district health departments-that were corroborated by an examination of expenditures per capita and full-time equivalents per 1000 population.
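The comparison measures named above are simple ratios; the following sketch shows how they would be computed, with invented numbers rather than North Carolina data.

```python
# Illustrative calculation of the study's comparison measures;
# the agency records below are invented.
agencies = [
    {"type": "county",   "pop": 120_000, "fte": 95,
     "county_funds": 4.2e6, "other": 3.1e6},
    {"type": "district", "pop": 310_000, "fte": 180,
     "county_funds": 5.0e6, "other": 9.4e6},
]
for a in agencies:
    total = a["county_funds"] + a["other"]
    print(a["type"],
          "county share: %.0f%%" % (100 * a["county_funds"] / total),
          "spend/capita: $%.2f" % (total / a["pop"]),
          "FTE per 1000: %.2f" % (1000 * a["fte"] / a["pop"]))
```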
Health care financing in Nigeria: Implications for achieving universal health coverage.
Uzochukwu, B S C; Ughasoro, M D; Etiaba, E; Okwuosa, C; Envuladu, E; Onwujekwe, O E
2015-01-01
The way a country finances its health care system is a critical determinant of reaching universal health coverage (UHC), because it determines whether the health services that are available are affordable to those that need them. In Nigeria, the health sector is financed through different sources and mechanisms. The proportionate contribution from these sources determines the extent to which the health sector can achieve a successful health care financing system. Unfortunately, in Nigeria, achieving the correct blend of these sources remains a challenge. This review draws on relevant literature to provide an overview of the state of health care financing in Nigeria, including policies in place to enhance health care financing. We searched PubMed, Medline, The Cochrane Library, Popline, Science Direct and the WHO Library Database with search terms that included, but were not restricted to, health care financing Nigeria, public health financing, financing health and financing policies. Further publications were identified from references cited in relevant articles and reports. We reviewed only papers published in English, and no date restrictions were placed on searches. The review notes that health care in Nigeria is financed through different sources, including but not limited to tax revenue, out-of-pocket payments (OOPs), donor funding, and health insurance (social and community). In the face of achieving UHC, building a successful health care financing system continues to be a challenge in Nigeria. The review concludes that to achieve universal coverage using health financing as the strategy, there is a dire need to review the system of financing health and to ensure that resources are used more efficiently, while removing financial barriers to access by shifting focus from OOPs to other hidden resources. There is also a need to give presidential assent to the national health bill and to implement it promptly once signed into law.
Oil Based Drilling Fluid Waste: An Overview on Environmentally Persistent Pollutants
NASA Astrophysics Data System (ADS)
Siddique, Shohel; Kwoffie, Lorraine; Addae-Afoakwa, Kofi; Yates, Kyari; Njuguna, James
2017-05-01
Operational discharges of spent drilling fluid, produced water, and accumulated drill cuttings from the oil and gas industry are a continuous point source of environmental pollution. To meet strict environmental standards for waste disposal, the oil and gas industry faces numerous challenges in technological development to ensure a clean and safe environment. The industry generates large amounts of spent drilling fluid, produced water, and drill cuttings, which differ greatly from one drilling operation to another in composition and characteristics. This review article highlights the knowledge gap in identifying the different sources of waste streams in combined drilling waste. The paper also emphasises how chemicals that perform well in oil and gas drilling operations can end up as environmentally significant pollutants. For instance, oil-based drilling fluid performs excellently in deeper drilling and in drilling under harsh geological conditions, but leaves a significant amount of persistent toxic pollutants in the environment. This review provides an overview of the basic concepts of drilling fluids and their functions, the sources and characterisation of drilling wastes, and some environmentally significant elements, including the different minerals present in the drilling waste stream.
Impacts of beach wrack removal via grooming on surf zone water quality.
Russell, Todd L; Sassoubre, Lauren M; Zhou, Christina; French-Owen, Darien; Hassaballah, Abdulrahman; Boehm, Alexandria B
2014-02-18
Fecal indicator bacteria (FIB) are used to assess the microbial water quality of recreational waters. Increasingly, nonfecal sources of FIB have been implicated as causes of poor microbial water quality in the coastal environment. These sources are challenging to quantify and difficult to remediate. The present study investigates one nonfecal FIB source, beach wrack (decaying aquatic plants), and its impacts on water quality along the Central California coast. The prevalence of FIB on wrack was studied using a multibeach survey, collecting wrack throughout Central California. The impacts of beach grooming to remove wrack were investigated at Cowell Beach in Santa Cruz, California, using a long-term survey (two summers, one with and one without grooming) and a 48 h survey during the first-ever intensive grooming event. FIB were prevalent on wrack but highly variable spatially and temporally along the nine beaches sampled in Central California. Beach grooming was generally associated with either no change or a slight increase in coastal FIB concentrations, and with increases in surf zone turbidity and silicate, phosphate, and dissolved inorganic nitrogen concentrations. The findings suggest that beach grooming for wrack removal is not justified as a microbial pollution remediation strategy.
NASA Astrophysics Data System (ADS)
Burrell, Derek J.; Middlebrook, Christopher T.
2017-08-01
Wireless communication systems that employ free-space optical links in place of radio/microwave technologies carry substantial benefits in terms of data throughput, network security and design efficiency. Along with these advantages comes the challenge of counteracting signal degradation caused by atmospheric turbulence in free-space environments. A fully coherent laser source experiences random phase delays along its traversing path in turbulent conditions forming a speckle pattern and lowering the received signal-to-noise ratio upon detection. Preliminary research has shown that receiver-side speckle contrast may be significantly reduced and signal-to-noise ratio increased accordingly through the use of a partially coherent light source. While dynamic diffusers and adaptive optics solutions have been proven effective, they also add expense and complexity to a system that relies on accessibility and robustness for successful implementation. A custom Hadamard diffractive matrix design is used to statically induce partial coherence in a transmitted beam to increase signal-to-noise ratio for experimental turbulence scenarios. Atmospheric phase screens are generated using an open-source software package and subsequently loaded into a spatial light modulator using nematic liquid crystals to modulate the phase.
Exploring consumer exposure pathways and patterns of use for chemicals in the environment.
Dionisio, Kathie L; Frame, Alicia M; Goldsmith, Michael-Rock; Wambaugh, John F; Liddell, Alan; Cathey, Tommy; Smith, Doris; Vail, James; Ernstoff, Alexi S; Fantke, Peter; Jolliet, Olivier; Judson, Richard S
2015-01-01
Humans are exposed to thousands of chemicals in the workplace, home, and via air, water, food, and soil. A major challenge in estimating chemical exposures is to understand which chemicals are present in these media and microenvironments. Here we describe the Chemical/Product Categories Database (CPCat), a new, publicly available (http://actor.epa.gov/cpcat) database of information on chemicals mapped to "use categories" describing the usage or function of the chemical. CPCat was created by combining multiple and diverse sources of data on consumer- and industrial-process-based chemical uses from regulatory agencies, manufacturers, and retailers in various countries. The database uses a controlled vocabulary of 833 terms and a novel nomenclature to capture and streamline descriptors of chemical use for 43,596 chemicals from the various sources. Examples of potential applications of CPCat are provided, including identifying chemicals to which children may be exposed and supporting the prioritization of chemicals for toxicity screening. CPCat is expected to be a valuable resource for regulators, risk assessors, and exposure scientists to identify potential sources of human exposures and exposure pathways, particularly for use in high-throughput chemical exposure assessment.
Hydrologic and geochemical data assimilation at the Hanford 300 Area
NASA Astrophysics Data System (ADS)
Chen, X.; Hammond, G. E.; Murray, C. J.; Zachara, J. M.
2012-12-01
In modeling the uranium migration within the Integrated Field Research Challenge (IFRC) site at the Hanford 300 Area, uncertainties arise from both hydrologic and geochemical sources. The hydrologic uncertainty includes the transient flow boundary conditions induced by dynamic variations in Columbia River stage and the underlying heterogeneous hydraulic conductivity field, while the geochemical uncertainty is a result of limited knowledge of the geochemical reaction processes and parameters, as well as heterogeneity in uranium source terms. In this work, multiple types of data, including the results from constant-injection tests, borehole flowmeter profiling, and conservative tracer tests, are sequentially assimilated across scales within a Bayesian framework to reduce the hydrologic uncertainty. The hydrologic data assimilation is then followed by geochemical data assimilation, where the goal is to infer the heterogeneous distribution of uranium sources using uranium breakthrough curves from a desorption test that took place at a high spring water table. We demonstrate in our study that ensemble-based data assimilation techniques (Ensemble Kalman filter and smoother) are efficient in integrating multiple types of data sequentially for uncertainty reduction. The computational demand is managed by using the multi-realization capability within the parallel PFLOTRAN simulator.
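To illustrate the kind of ensemble-based update this approach relies on, the following is a minimal Python sketch of one stochastic Ensemble Kalman Filter analysis step. It is generic, not the PFLOTRAN-coupled implementation: parameter realizations (e.g., a log-conductivity field) are conditioned on observations (e.g., tracer breakthrough data) using covariances estimated from the ensemble itself, and all array names are illustrative.

```python
import numpy as np

def enkf_update(params, sim_obs, obs, obs_error_sd, rng=None):
    """One stochastic EnKF analysis step (a sketch, not the paper's code).

    params  : (n_param, n_members) prior realizations of uncertain parameters
    sim_obs : (n_obs, n_members) forward-model predictions for each realization
    obs     : (n_obs,) field observations
    """
    rng = rng or np.random.default_rng(0)
    n_members = params.shape[1]

    A = params - params.mean(axis=1, keepdims=True)    # parameter anomalies
    Y = sim_obs - sim_obs.mean(axis=1, keepdims=True)  # predicted-data anomalies

    C_py = A @ Y.T / (n_members - 1)                   # param/data cross-covariance
    C_yy = Y @ Y.T / (n_members - 1)                   # predicted-data covariance
    R = obs_error_sd**2 * np.eye(obs.size)             # observation-error covariance

    K = C_py @ np.linalg.inv(C_yy + R)                 # Kalman gain
    D = obs[:, None] + obs_error_sd * rng.standard_normal((obs.size, n_members))
    return params + K @ (D - sim_obs)                  # conditioned realizations
```

Sequential assimilation of multiple data types amounts to repeating this step with each new data set, re-running the forward model between steps; the smoother variant conditions on all data at once.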
From 2D to 3D modelling in long term tectonics: Modelling challenges and HPC solutions (Invited)
NASA Astrophysics Data System (ADS)
Le Pourhiet, L.; May, D.
2013-12-01
Over the last decades, 3D thermo-mechanical codes have been made available to the long term tectonics community either as open source (Underworld, Gale) or with more limited access (Fantom, Elvis3D, Douar, LaMem, etc.). However, to date, few published results using these methods have included the coupling between crustal and lithospheric dynamics at large strain. The computational expense of these simulations is not the primary reason for the relatively slow development of 3D modeling in the long term tectonics community, as compared to the rapid development observed in the mantle dynamics community or in the short-term tectonics field. Long term tectonics problems have specific issues not found in either of these two fields, including large strain (not an issue for short-term tectonics), the inclusion of a free surface, and the occurrence of large viscosity contrasts. The first issue is typically eliminated using a combined marker-ALE method instead of a fully Lagrangian method; however, the marker-ALE approach can pose algorithmic challenges in a massively parallel environment. The last two issues are more problematic because they affect the convergence of the linear/non-linear solver and the memory cost. Two options have been tested so far: using low order elements and solving with a sparse direct solver, or using higher order stable elements together with a multi-grid solver. The first option is simpler to code and to use but reaches its limit at around 80^3 low order elements. The second option requires more operations but allows the use of iterative solvers on extremely large computers. In this presentation, I will describe the design philosophy and highlight results obtained using a code of the second class. The presentation will be oriented from an end-user point of view, using an application from 3D continental break up to illustrate key concepts. The description will proceed point by point, from implementing the physics in the code to dealing with specific issues related to solving the discrete system of non linear equations.
NASA Astrophysics Data System (ADS)
Osterwalder, Stefan; Fritsche, Johannes; Nilsson, Mats B.; Alewell, Christine; Bishop, Kevin
2015-04-01
The fate of anthropogenic emissions to the atmosphere is influenced by the exchange of elemental mercury (Hg0) with the earth's surface. However, it remains challenging to quantify these exchanges, which hold the key to a better understanding of mercury cycling at different scales, from the entire earth to specific environments. To better test hypotheses about land-atmosphere Hg interactions, we applied dynamic flux chambers (DFCs) for short term measurements and developed a novel Relaxed Eddy Accumulation (REA) design for continuous flux monitoring. Accurate determination of Hg0 fluxes has proven difficult due to the technical challenges presented by the small concentration differences (< 1 ng m-3) between updrafts and downdrafts. To address this we present a dual-intake, single analyzer REA system including a calibration module for periodic quality-control measurements with reference gases. To demonstrate the system performance, we present results from two contrasting environments. In February 2012, the REA system monitored a heterogeneous urban surface in the center of Basel, Switzerland, where an average flux of 14 ng m-2 h-1 was detected with a distinct diurnal pattern. In May 2012, it monitored a boreal mire in northern Sweden with different turbulence regimes and Hg0 sink/source characteristics; during the snowmelt period the Hg0 flux averaged 2 ng m-2 h-1. In order to better quantify inputs and outputs of Hg from boreal landscapes, we subsequently monitored the land-atmosphere exchange of Hg0 over the course of a year and occasionally compared the fluxes with DFC measurements. The amount of Hg0 volatilized from boreal mires was at a similar level as the annual export of Hg in stream water, identifying the mire as a net source of Hg to neighboring environments. We believe that this dual-inlet, single detector approach is a significant innovation which can help realize the potential of REA for continuous, long-term determination of land-atmosphere Hg0 exchange.
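For readers unfamiliar with the technique, the REA flux estimate itself is simple: it scales the updraft/downdraft concentration difference by the standard deviation of the vertical wind. A minimal sketch follows; the empirical coefficient beta ~ 0.56 is the commonly cited value from Businger and Oncley (1990), and all numbers in the example are invented, not measurements from this study.

```python
import numpy as np

# Relaxed eddy accumulation: F = beta * sigma_w * (c_up - c_down), where
# c_up and c_down are concentrations accumulated during updrafts/downdrafts.
BETA = 0.56  # empirical REA coefficient (Businger & Oncley, 1990)

def rea_flux(w, c_up, c_down):
    """w: vertical wind samples (m/s); c_up, c_down: mean Hg0 concentrations
    (ng/m^3) in the updraft and downdraft reservoirs. Returns ng m^-2 s^-1."""
    sigma_w = np.std(w)
    return BETA * sigma_w * (c_up - c_down)

# Illustrative numbers only: 30 min of 10 Hz vertical wind, tiny Hg0 gradient.
w = np.random.default_rng(1).normal(0.0, 0.3, 18000)
print(rea_flux(w, c_up=1.32, c_down=1.30) * 3600)  # flux in ng m^-2 h^-1
```

The sub-ng m-3 concentration difference in the example is exactly why the paper's calibration module and dual-intake design matter: the flux is linear in (c_up - c_down), so any analyzer bias propagates directly into the estimate.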
Piecewise synonyms for enhanced UMLS source terminology integration.
Huang, Kuo-Chuan; Geller, James; Halper, Michael; Cimino, James J
2007-10-11
The UMLS contains more than 100 source vocabularies and is growing via the integration of others. When integrating a new source, the source terms already in the UMLS must first be found. The easiest approach to this is simple string matching. However, string matching usually does not find all concepts that should be found. A new methodology, based on the notion of piecewise synonyms, for enhancing the process of concept discovery in the UMLS is presented. This methodology is supported by first creating a general synonym dictionary based on the UMLS. Each multi-word source term is decomposed into its component words, allowing for the generation of separate synonyms for each word from the general synonym dictionary. The recombination of these synonyms into new terms creates an expanded pool of matching candidates for terms from the source. The methodology is demonstrated with respect to an existing UMLS source. It shows a 34% improvement over simple string matching.
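A minimal Python sketch of the piecewise-synonym expansion described above; the synonym dictionary entries are hypothetical stand-ins for the UMLS-derived general synonym dictionary.

```python
from itertools import product

# Hypothetical entries standing in for the UMLS-derived general synonym dictionary.
word_synonyms = {
    "kidney": {"kidney", "renal"},
    "failure": {"failure", "insufficiency"},
}

def candidate_terms(source_term):
    """Expand a multi-word source term into its piecewise-synonym candidates."""
    words = source_term.lower().split()
    # Each component word maps to its synonym set (or to itself if unknown).
    options = [sorted(word_synonyms.get(w, {w})) for w in words]
    # Recombine one synonym per word position into candidate strings.
    return {" ".join(combo) for combo in product(*options)}

# The expanded candidate pool is then string-matched against UMLS concept names.
print(sorted(candidate_terms("kidney failure")))
# ['kidney failure', 'kidney insufficiency', 'renal failure', 'renal insufficiency']
```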
Sex work and the claim for grassroots legislation.
Fassi, Marisa N
2015-01-01
The aim of this paper is to contribute to understanding of legal models that aim to control sex work, and the policy implications of these, by discussing the experience of developing a grassroots legislation bill proposal by organised sex workers in Córdoba, Argentina. The term 'grassroots legislation' here refers to a legal response that derives from the active involvement of local social movements and thus incorporates the experiential knowledge and claims of these particular social groupings in the proposal. The experience described in this paper excludes approaches that render sex workers as passive victims or as deviant perpetrators; instead, it conceives of sex workers in terms of their political subjectivity and of political subjectivity in its capacity to speak, to decide, to act and to propose. This means challenging current patterns of knowledge/power that give superiority to 'expert knowledge' above and beyond the claims, experiences, knowledge and needs of sex workers themselves as meaningful sources for law making.
Evaluating Sustainability Models for Interoperability through Brokering Software
NASA Astrophysics Data System (ADS)
Pearlman, Jay; Benedict, Karl; Best, Mairi; Fyfe, Sue; Jacobs, Cliff; Michener, William; Nativi, Stefano; Powers, Lindsay; Turner, Andrew
2016-04-01
Sustainability of software and research support systems is an element of innovation that is not often discussed. Yet, sustainment is essential if we expect research communities to make the time investment to learn and adopt new technologies. As the Research Data Alliance (RDA) is developing new approaches to interoperability, the question of uptake and sustainability is important. Brokering software sustainability is one of the areas being addressed in RDA. The Business Models Team of the Research Data Alliance Brokering Governance Working Group examined several support models proposed to promote the long-term sustainability of brokering middleware. The business model analysis includes examination of funding sources, implementation frameworks and challenges, and policy and legal considerations. Results of this comprehensive analysis highlight advantages and disadvantages of the various models with respect to the specific requirements for brokering services. We offer recommendations based on the outcomes of this analysis that suggest that hybrid funding models present the most likely avenue to long-term sustainability.
Fast gene ontology based clustering for microarray experiments.
Ovaska, Kristian; Laakso, Marko; Hautaniemi, Sampsa
2008-11-21
Analysis of a microarray experiment often results in a list of hundreds of disease-associated genes. In order to suggest common biological processes and functions for these genes, Gene Ontology annotations with statistical testing are widely used. However, these analyses can produce a very large number of significantly altered biological processes. Thus, it is often challenging to interpret GO results and identify novel testable biological hypotheses. We present fast software for advanced gene annotation using semantic similarity for Gene Ontology terms combined with clustering and heat map visualisation. The methodology allows rapid identification of genes sharing the same Gene Ontology cluster. Our R-based, open-source semantic similarity package has a speed advantage of over 2000-fold compared to existing implementations. From the resulting hierarchical clustering dendrogram, genes sharing a GO term can be identified, and their differences in gene expression patterns can be seen from the heat map. These methods facilitate advanced annotation of genes resulting from data analysis.
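The published package is R-based; as a rough illustration of the clustering step only, the sketch below hierarchically clusters genes from a hypothetical pairwise GO semantic-similarity matrix using SciPy.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

# Hypothetical pairwise GO semantic similarities (0..1) among four genes.
sim = np.array([
    [1.0, 0.9, 0.2, 0.1],
    [0.9, 1.0, 0.3, 0.2],
    [0.2, 0.3, 1.0, 0.8],
    [0.1, 0.2, 0.8, 1.0],
])

# Convert similarity to a condensed distance matrix and cluster hierarchically;
# the resulting dendrogram is what the heat map visualisation is drawn from.
dist = squareform(1.0 - sim, checks=False)
tree = linkage(dist, method="average")

# Cutting the dendrogram assigns genes sharing a GO cluster the same label.
labels = fcluster(tree, t=0.5, criterion="distance")
print(labels)  # e.g., [1 1 2 2]: genes 0-1 and genes 2-3 share GO clusters
```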
Recruitment of adolescents for a smoking study: use of traditional strategies and social media.
Rait, Michelle A; Prochaska, Judith J; Rubinstein, Mark L
2015-09-01
Engaging and retaining adolescents in research studies is challenging. Social media offers utility for expanding the sphere of research recruitment. This study examined and compared traditional and Facebook-based recruitment strategies on reach, enrollment, cost, and retention. Substance users aged 13-17 years were recruited through several methods, including social media, a study website, fliers, talks in schools, bus ads, and referrals. Study involvement included a one-time visit and semiannual follow-up surveys. 1265 individuals contacted study personnel; 629 were ineligible; 129 declined; and 200 participants enrolled. Facebook drew the greatest volume but had a high rate of ineligibles. Referrals were the most successful and cost-effective ($7 per enrolled participant); school talks were the least. Recruitment source was unrelated to retention success. Facebook may expand recruitment reach, but had greater financial costs and more ineligible contacts, resulting in fewer enrollees relative to traditional interpersonal recruitment methods. Referrals, though useful for study engagement, did not provide a differential benefit in terms of long-term retention.
Sabapathy, Vikram; Ravi, Saranya; Srivastava, Vivi; Srivastava, Alok; Kumar, Sanjay
2012-01-01
Mesenchymal stem cells (MSCs) are an alluring therapeutic resource because of their plasticity, immunoregulatory capacity and ease of availability. Human BM-derived MSCs have limited proliferative capability; consequently, they are challenging to use in tissue engineering and regenerative medicine applications. Hence, placental MSCs of maternal origin, one of the richest sources of MSCs, were chosen to establish long-term cultures from the cotyledons of full-term human placenta. Flow analysis established bona fide MSC phenotypic characteristics, with cells staining positively for CD29, CD73, CD90 and CD105 and negatively for CD14, CD34 and CD45 markers. Pluripotency of the cultured MSCs was assessed by in vitro differentiation not only towards intralineage cells such as adipocytes, osteocytes, chondrocytes and myotubule cells, but also translineage towards pancreatic progenitor cells, neural cells and retinal cells, displaying plasticity. These cells did not significantly alter cell cycle or apoptosis patterns while maintaining a normal karyotype; they also have limited expression of MHC-II antigens and are naive for the costimulatory factors CD80 and CD86. Further, soft agar assays revealed that placental MSCs do not have the ability to form invasive colonies. Taken together, these characteristics indicate that placental MSCs could serve as good candidates for the development and progress of stem-cell based therapeutics. PMID:22550499
NASA Astrophysics Data System (ADS)
Angelakis, E.
2012-01-01
The F-GAMMA program aims at understanding the physics at work in AGN via a multi-frequency monitoring approach. Roughly 65 Fermi-GST detectable blazars have been monitored monthly since January 2007 at radio wavelengths. The core program relies on the 100-m Effelsberg telescope operating at 8 frequencies between 2.6 and 43 GHz, the 30-m IRAM telescope observing at 86, 145 and 240 GHz and the APEX 12-m telescope at 345 GHz. For the targeted sources, the LAT instrument onboard Fermi-GST provides gamma-ray light curves sampled daily. Here we discuss two recent findings. A) On the basis of their variability pattern, the observed quasi-simultaneous broad-band spectra can be classified into merely 5 classes. The variability of the first 4 is clearly dominated by spectral evolution. Sources of the last class vary self-similarly, with almost no apparent shift of the peak frequency. The former classes can be attributed to a two-component principal system made of a quiescent optically thin spectrum and a super-imposed flaring event. The latter class must be interpreted in terms of a completely different mechanism. The apparent differences among the classes are explained in terms of a redshift modulus and an intrinsic-source/flare parameter modulus. Numerical simulations have shown that a shock-in-jet model can describe the observed behavior very well. It is concluded, therefore, that only two mechanisms seem to be producing the variability. None of the almost 90 sources used for this study shows a switch of class, indicating that the variability mechanism is either (a) a fingerprint of the source, or (b) stable on timescales far longer than the monitoring period of almost 4 years. B) Recently it has been disclosed that Narrow Line Seyfert 1 galaxies show gamma-ray emission. Within the F-GAMMA program, radio jet emission has been detected from 3 such sources, challenging the belief that jets are associated with elliptical galaxies. The recent findings in this area will be discussed.
OSRP Source Repatriations-Case Studies: Brazil, Ecuador, Uruguay
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greenberg, Ray Jr.; Abeyta, Cristy; Matzke, Jim
2012-07-01
The Global Threat Reduction Initiative's (GTRI) Offsite Source Recovery Project (OSRP) began recovering excess and unwanted radioactive sealed sources (sources) in 1999. As of February 2012, the project had recovered over 30,000 sources totaling over 820,000 Ci. OSRP grew out of early efforts at Los Alamos National Laboratory (LANL) to recover disused excess Plutonium-239 (Pu-239) sources that were distributed in the 1960's and 1970's under the Atoms for Peace Program. Source recovery was initially considered a waste management activity. However, after the 9/11 terrorist attacks, the interagency community began to recognize that excess and unwanted radioactive sealed sources pose a national security threat, particularly those that lack a disposition path. After OSRP's transfer to the U.S. National Nuclear Security Administration (NNSA) to be part of GTRI, its mission was expanded to include all disused sealed sources that might require national security consideration. Recognizing the transnational threat posed by porous borders and the ubiquitous nature of sources, GTRI/OSRP repatriates U.S.-origin sources based on threat reduction prioritization criteria. For example, several recent challenging source repatriation missions have been conducted by GTRI/OSRP in South America. These include the repatriation of a significant amount of Cs-137 and other isotopes from Brazil; re-packaging of conditioned Ra-226 sources in Ecuador for future repatriation; and multilateral cooperation in the consolidation and export of Canadian, US, and Indian Co-60/Cs-137 sources from Uruguay. In addition, cooperation with regulators and private source owners in other countries presents opportunities for GTRI/OSRP to exchange best practices for managing disused sources. These positive experiences often result in long-term cooperation and information sharing with key foreign counterparts. International source recovery operations are essential to the preservation of U.S. national security interests. They are also mutually beneficial for fostering positive relationships with other governments and private industry, and demonstrate that responsible end-of-life options are given to legacy U.S.-origin sources in other countries. GTRI/OSRP does not take back sources that have a viable path for commercial disposal. Most U.S.-origin sources were sold commercially and were not provided by the US government. Below is a synopsis of cooperative efforts with Brazil, Ecuador, and Uruguay. Bilateral and multilateral efforts have been successful in removing hundreds of U.S.-origin sealed radioactive sources from Latin American countries to the U.S. As many disused sources remain in the region, and since repatriation is not always an option, GTRI will continue to work with those countries to ensure that these sources are stored securely for the long term. Successful Latin America operations should serve as a model for other regional cooperation in the repatriation of sealed sources, encouraging other source-exporting countries to implement similar programs. Securing and removing sources, both domestically and internationally, is crucial to strengthening the life-cycle management of radioactive sources worldwide. Such efforts not only prevent these materials from being used maliciously, but also address public health and safety concerns, and under-gird the IAEA Code of Conduct on the Safety and Security of Radioactive Sources. (authors)
NASA Astrophysics Data System (ADS)
Kotthaus, S.; Grimmond, S.
2013-12-01
Global urbanisation brings increasingly dense and complex urban structures. To manage cities sustainably and smartly, currently and into the future under changing climates, urban climate research needs to advance in areas such as Central Business Districts (CBD) where human interactions with the environment are particularly concentrated. Measurement and modelling approaches may be pushed to their limits in dense urban settings, but if urban climate research is to contribute to the challenges of real cities, those limits have to be addressed. The climate of cities is strongly governed by surface-atmosphere exchanges of energy, moisture and momentum. Observations of the relevant fluxes provide important information for the improvement and evaluation of modelling approaches. Due to the CBD's heterogeneity, a very careful analysis of observations is required to understand the relevant processes. Current approaches used to interpret observations and set them in a wider context may need to be adapted for use in these more complex areas. Here, we present long-term observations of the radiation balance components and turbulent fluxes of latent heat, sensible heat and momentum in the city centre of London. This is one of the first measurement studies in a CBD covering multiple years, with analysis at temporal scales from days to seasons. Data gathered at two sites in close vicinity, but with different measurement heights, are analysed to investigate the influence of source area characteristics on long-term radiation and turbulent fluxes. Challenges of source area modelling and the critical aspect of siting in such a complex environment are considered. Outgoing long- and short-wave radiation are impacted by the anisotropic nature of the urban surface and the high-reflectance materials increasingly being used on buildings. Results highlight the need to consider the source area of radiometers in terms of diffuse and direct irradiance. Sensible heat fluxes (QH) are positive all year round, even at night. QH systematically exceeds the input from net all-wave radiation (Q*), probably sustained by both storage and anthropogenic heat fluxes (QF). Model estimates suggest QF can exceed Q* nearly all year round. The positive QH inhibits stable conditions, but the stability classification is determined predominantly by the pattern of friction velocity over the rough urban surface. Turbulent latent heat flux variations are controlled (beyond the available energy) by rainfall, due to the small vegetation cover. The Bowen ratio is mostly larger than one. Analysis of eddy covariance footprints, which control for the different land cover types sampled by the flow at the two measurement heights, suggests that the observed spatial variations of the sensible heat flux are partly related to changes in surface roughness, even at the local scale. Where the source areas are most homogeneous, flow conditions are vertically consistent - even if initial morphometric parameters suggested the measurements may be below the blending height. Turbulence statistics and momentum flux patterns prove useful for the interpretation of the turbulent heat exchanges observed.
Future Protein Supply and Demand: Strategies and Factors Influencing a Sustainable Equilibrium
Henchion, Maeve; Hayes, Maria; Mullen, Anne Maria; Fenelon, Mark; Tiwari, Brijesh
2017-01-01
A growing global population, combined with factors such as changing socio-demographics, will place increased pressure on the world’s resources to provide not only more but also different types of food. Increased demand for animal-based protein in particular is expected to have a negative environmental impact, generating greenhouse gas emissions, requiring more water and more land. Addressing this “perfect storm” will necessitate more sustainable production of existing sources of protein as well as alternative sources for direct human consumption. This paper outlines some potential demand scenarios and provides an overview of selected existing and novel protein sources in terms of their potential to sustainably deliver protein for the future, considering drivers and challenges relating to nutritional, environmental, and technological and market/consumer domains. It concludes that different factors influence the potential of existing and novel sources. Existing protein sources are primarily hindered by their negative environmental impacts with some concerns around health. However, they offer social and economic benefits, and have a high level of consumer acceptance. Furthermore, recent research emphasizes the role of livestock as part of the solution to greenhouse gas emissions, and indicates that animal-based protein has an important role as part of a sustainable diet and as a contributor to food security. Novel proteins require the development of new value chains, and attention to issues such as production costs, food safety, scalability and consumer acceptance. Furthermore, positive environmental impacts cannot be assumed with novel protein sources and care must be taken to ensure that comparisons between novel and existing protein sources are valid. Greater alignment of political forces, and the involvement of wider stakeholders in a governance role, as well as development/commercialization role, is required to address both sources of protein and ensure food security. PMID:28726744
Low birth weight and air pollution in California: Which sources and components drive the risk?
Laurent, Olivier; Hu, Jianlin; Li, Lianfa; Kleeman, Michael J; Bartell, Scott M; Cockburn, Myles; Escobedo, Loraine; Wu, Jun
2016-01-01
Intrauterine growth restriction has been associated with exposure to air pollution, but there is a need to clarify which sources and components are most likely responsible. This study investigated the associations between low birth weight (LBW, <2500g) in term born infants (≥37 gestational weeks) and air pollution by source and composition in California, over the period 2001-2008. Complementary exposure models were used: an empirical Bayesian kriging model for the interpolation of ambient pollutant measurements, a source-oriented chemical transport model (using California emission inventories) that estimated fine and ultrafine particulate matter (PM2.5 and PM0.1, respectively) mass concentrations (4km×4km) by source and composition, a line-source roadway dispersion model at fine resolution, and traffic index estimates. Birth weight was obtained from California birth certificate records. A case-cohort design was used. Five controls per term LBW case were randomly selected (without covariate matching or stratification) from among term births. The resulting datasets were analyzed by logistic regression with a random effect by hospital, using generalized additive mixed models adjusted for race/ethnicity, education, maternal age and household income. In total 72,632 singleton term LBW cases were included. Term LBW was positively and significantly associated with interpolated measurements of ozone but not total fine PM or nitrogen dioxide. No significant association was observed between term LBW and primary PM from all sources grouped together. A positive significant association was observed for secondary organic aerosols. Exposure to elemental carbon (EC), nitrates and ammonium were also positively and significantly associated with term LBW, but only for exposure during the third trimester of pregnancy. Significant positive associations were observed between term LBW risk and primary PM emitted by on-road gasoline and diesel or by commercial meat cooking sources. Primary PM from wood burning was inversely associated with term LBW. Significant positive associations were also observed between term LBW and ultrafine particle numbers modeled with the line-source roadway dispersion model, traffic density and proximity to roadways. This large study based on complementary exposure metrics suggests that not only primary pollution sources (traffic and commercial meat cooking) but also EC and secondary pollutants are risk factors for term LBW. Copyright © 2016 Elsevier Ltd. All rights reserved.
Bayha, Keith M.; Ortell, Natalie; Ryan, Caitlin N.; Griffitt, Kimberly J.; Krasnec, Michelle; Sena, Johnny; Ramaraj, Thiruvarangan; Takeshita, Ryan; Mayer, Gregory D.; Schilkey, Faye; Griffitt, Robert J.
2017-01-01
Exposure to crude oil or its individual constituents can have detrimental impacts on fish species, including impairment of the immune response. Increased observations of skin lesions in northern Gulf of Mexico fish during the 2010 Deepwater Horizon oil spill indicated the possibility of oil-induced immunocompromisation resulting in bacterial or viral infection. This study used a full factorial design of oil exposure and bacterial challenge to examine how oil exposure impairs southern flounder (Paralichthys lethostigma) immune function and increases susceptibility to the bacterium Vibrio anguillarum, a causative agent of vibriosis. Fish exposed to oil prior to bacterial challenge exhibited 94.4% mortality within 48 hours of bacterial exposure. Flounder challenged with V. anguillarum without prior oil exposure had <10% mortality. Exposure resulted in taxonomically distinct gill and intestine bacterial communities. Mortality strongly correlated with V. anguillarum levels: it comprised a significantly higher percentage of the microbiome in Oil/Pathogen challenged fish and was nearly non-existent in the bacterial community of No Oil/Pathogen challenged fish. Elevated V. anguillarum levels were a direct result of oil exposure-induced immunosuppression. Oil exposure reduced expression of immunoglobulin M, the major systemic fish antibody, and resulted in an overall downregulation of the transcriptome response, particularly in genes related to immune function, response to stimulus and hemostasis. Ultimately, sediment-borne oil exposure impairs immune function, leading to increased incidences of bacterial infections. This type of sediment-borne exposure may result in long-term marine ecosystem effects, as oil-bound sediment in the northern Gulf of Mexico will likely remain a contamination source for years to come. PMID:28464028
USDA-ARS?s Scientific Manuscript database
Objectives were to evaluate how dietary energy intake and source affect immune competence and response to an infectious bovine rhinotracheitis virus (IBRV) challenge in cattle. Forty-eight crossbred beef steers were stratified by body weight within 2 periods and randomized to 1 of 3 dietary treatmen...
USDA-ARS?s Scientific Manuscript database
Objectives were to evaluate how dietary energy level and source affect immune competence and response to a viral challenge in cattle. Forty-eight crossbred beef steers were stratified by BW within 2 periods and randomized to 1 of 3 dietary treatments (8 steers/treatment within period). Treatments we...
Sources and transport pathways of micropollutants into surface waters - an overview
NASA Astrophysics Data System (ADS)
Stamm, Christian
2017-04-01
Micropollutants reach water bodies from a large range of sources through different transport pathways. They consist of hundreds or thousands of compounds, rendering exposure assessment an analytical challenge. Prominent examples of micropollutants are wastewater-borne pharmaceuticals and hormones or plant protection products originating from diffuse agricultural sources. This presentation reviews the possible origins of micropollutants and their transport pathways. It demonstrates that considering only municipal wastewater and agriculture may fall short of comprising all relevant source-pathway combinations in a given watershed, providing examples from industry, animal production, and leaching to groundwater. The diversity of source-pathway combinations leads, on the one hand, to a large number of possible chemicals to be considered, including parent compounds of end products, their transformation products, legacy compounds, and also intermediates used during industrial synthesis processes. On the other hand, it leads to a wide range of temporal dynamics by which these compounds reach streams and rivers. This combination makes a comprehensive exposure assessment for micropollutants a real scientific challenge. An outlook into new developments in sampling and analytics will suggest possible solutions for this challenge.
DOT National Transportation Integrated Search
2010-01-14
Due to the volatility of current highway construction commodity prices, owners, contractors, and designers are facing serious challenges in both short-term estimating and long-term planning. Among these challenges is significant uncertainty about the...
High-performance semiconductor quantum-dot single-photon sources
NASA Astrophysics Data System (ADS)
Senellart, Pascale; Solomon, Glenn; White, Andrew
2017-11-01
Single photons are a fundamental element of most quantum optical technologies. The ideal single-photon source is an on-demand, deterministic, single-photon source delivering light pulses in a well-defined polarization and spatiotemporal mode, and containing exactly one photon. In addition, for many applications, there is a quantum advantage if the single photons are indistinguishable in all their degrees of freedom. Single-photon sources based on parametric down-conversion are currently used, and while excellent in many ways, scaling to large quantum optical systems remains challenging. In 2000, semiconductor quantum dots were shown to emit single photons, opening a path towards integrated single-photon sources. Here, we review the progress achieved in the past few years, and discuss remaining challenges. The latest quantum dot-based single-photon sources are edging closer to the ideal single-photon source, and have opened new possibilities for quantum technologies.
Developing open-source codes for electromagnetic geophysics using industry support
NASA Astrophysics Data System (ADS)
Key, K.
2017-12-01
Funding for open-source software development in academia often takes the form of grants and fellowships awarded by government bodies and foundations, where there is no conflict of interest between the funding entity and the free dissemination of the open-source software products. Conversely, funding for open-source projects in the geophysics industry presents challenges to conventional business models, where proprietary licensing offers value that is not present in open-source software. Such proprietary constraints make it easier to convince companies to fund academic software development under exclusive software distribution agreements. A major challenge for obtaining commercial funding for open-source projects is to offer a value proposition that overcomes the criticism that such funding is a giveaway to the competition. This work draws upon a decade of experience developing open-source electromagnetic geophysics software for the oil, gas and minerals exploration industry, and examines various approaches that have been effective for sustaining industry sponsorship.
High-order scheme for the source-sink term in a one-dimensional water temperature model
Jing, Zheng; Kang, Ling
2017-01-01
The source-sink term in water temperature models represents the net heat absorbed or released by a water system. This term is very important because it accounts for solar radiation that can significantly affect water temperature, especially in lakes. However, existing numerical methods for discretizing the source-sink term are very simplistic, causing significant deviations between simulation results and measured data. To address this problem, we present a numerical method specific to the source-sink term. A vertical one-dimensional heat conduction equation was chosen to describe water temperature changes. A two-step operator-splitting method was adopted as the numerical solution. In the first step, using the undetermined coefficient method, a high-order scheme was adopted for discretizing the source-sink term. In the second step, the diffusion term was discretized using the Crank-Nicolson scheme. The effectiveness and capability of the numerical method was assessed by performing numerical tests. Then, the proposed numerical method was applied to a simulation of Guozheng Lake (located in central China). The modeling results were in excellent agreement with measured data. PMID:28264005
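As a concrete illustration of the two-step splitting described above, here is a small Python sketch. A classical fourth-order Runge-Kutta step stands in for the high-order source-sink discretization (the paper's undetermined-coefficient scheme is not reproduced here), followed by a Crank-Nicolson diffusion step with no-flux boundaries; all parameter values and the heating profile are invented for illustration.

```python
import numpy as np

def step(T, dt, dz, alpha, source):
    """One split step for T_t = alpha*T_zz + source(T) (a generic sketch)."""
    n = T.size

    # Step 1: source-sink term integrated with 4th-order Runge-Kutta.
    k1 = source(T)
    k2 = source(T + 0.5 * dt * k1)
    k3 = source(T + 0.5 * dt * k2)
    k4 = source(T + dt * k3)
    T = T + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

    # Step 2: diffusion with Crank-Nicolson (insulated top/bottom boundaries).
    r = alpha * dt / (2 * dz**2)
    main = np.full(n, 1 + 2 * r); main[[0, -1]] = 1 + r
    A = (np.diag(main) + np.diag(np.full(n - 1, -r), 1)
         + np.diag(np.full(n - 1, -r), -1))
    rhs = T + r * (np.roll(T, -1) - 2 * T + np.roll(T, 1))
    rhs[0] = T[0] + r * (T[1] - T[0])       # no-flux boundary rows
    rhs[-1] = T[-1] + r * (T[-2] - T[-1])
    return np.linalg.solve(A, rhs)

# Example: water column warmed by a hypothetical absorbed-radiation profile.
z = np.linspace(0.0, 10.0, 101)
T = np.full_like(z, 15.0)
heat = lambda T: 0.05 * np.exp(-0.5 * z)    # invented source-sink term (K/s scale arbitrary)
for _ in range(100):
    T = step(T, dt=60.0, dz=z[1] - z[0], alpha=1e-3, source=heat)
```

The point of the splitting is that each sub-problem can use the scheme best suited to it: a high-order explicit integrator for the (possibly stiffly varying) source, and an unconditionally stable implicit scheme for diffusion.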
The global Filipino nurse: An integrative review of Filipino nurses' work experiences.
Montayre, Jed; Montayre, Jasmine; Holroyd, Eleanor
2018-05-01
To understand the work-related experiences of Philippine-trained nurses working globally. The Philippines is a major source country of foreign-trained nurses worldwide; however, there is a paucity of research on the professional factors and career-related issues affecting foreign-trained nurses' work experiences. An integrative review through a comprehensive search of the literature was undertaken from November 2015 and was repeated in August 2016. Seven articles satisfied the selection criteria. Filipino nurses experienced differences in the practice of nursing in terms of work processes, roles and autonomy. Moreover, they encountered challenges such as work-related discrimination and technical difficulties within the organisation. A clear understanding of Filipino nurses' work experiences and the challenges they have encountered points to important constructs influencing the effective translation of nursing practice across cultures and health systems, which can then form the basis for support strategies. It is critical to recognize foreign-trained nurses' experiences of work-related differences and challenges, as these foster favorable conditions for management teams to plan and continually evaluate policies around the recruitment, retention and support offered to these nurses. Furthermore, the findings suggest an internationalization of nursing frameworks and standards integrating a transcultural paradigm among staff members within a work organisation. © 2017 John Wiley & Sons Ltd.
The challenges of transitioning from linear to high-order overlay control in advanced lithography
NASA Astrophysics Data System (ADS)
Adel, M.; Izikson, P.; Tien, D.; Huang, C. K.; Robinson, J. C.; Eichelberger, B.
2008-03-01
In the lithography section of the ITRS 2006 update, at the top of the list of difficult challenges appears the text "overlay of multiple exposures including mask image placement". This is a reflection of the fact that today overlay is becoming a major yield risk factor in semiconductor manufacturing. Historically, lithographers have achieved sufficient alignment accuracy and hence layer to layer overlay control by relying on models which define overlay as a linear function of the field and wafer coordinates. These linear terms were easily translated to correctibles in the available exposure tool degrees of freedom on the wafer and reticle stages. However, as the 45 nm half pitch node reaches production, exposure tool vendors have begun to make available, and lithographers have begun to utilize so called high order wafer and field control, in which either look up table or high order polynomial models are modified on a product by product basis. In this paper, the major challenges of this transition will be described. It will include characterization of the sources of variation which need to be controlled by these new models and the overlay and alignment sampling optimization problem which needs to be addressed, while maintaining the ever tightening demands on productivity and cost of ownership.
NASA Technical Reports Server (NTRS)
Hultgren, Lennart S.
2012-01-01
This presentation is a technical summary of and outlook for NASA-internal and NASA-sponsored external research on core noise funded by the Fundamental Aeronautics Program Subsonic Fixed Wing (SFW) Project. Sections of the presentation cover: the SFW system-level noise metrics for the 2015 (N+1), 2020 (N+2), and 2025 (N+3) timeframes; SFW strategic thrusts and technical challenges; SFW advanced subsystems that are broadly applicable to N+3 vehicle concepts, with an indication where further noise research is needed; the components of core noise (compressor, combustor and turbine noise) and a rationale for NASA's current emphasis on the combustor-noise component; the increase in the relative importance of core noise due to turbofan design trends; the need to understand and mitigate core-noise sources for high-efficiency small gas generators; and the current research activities in the core-noise area, with additional details given about forthcoming updates to NASA's Aircraft Noise Prediction Program (ANOPP) core-noise prediction capabilities, two NRA efforts (Honeywell International, Phoenix, AZ and University of Illinois at Urbana-Champaign, respectively) to improve the understanding of core-noise sources and noise propagation through the engine core, and an effort to develop oxide/oxide ceramic-matrix-composite (CMC) liners for broadband noise attenuation suitable for turbofan-core application. Core noise must be addressed to ensure that the N+3 noise goals are met. Focused, but long-term, core-noise research is carried out to enable the advanced high-efficiency small gas-generator subsystem, common to several N+3 conceptual designs, needed to meet NASA's technical challenges. Intermediate updates to prediction tools are implemented as the understanding of the source structure and engine-internal propagation effects is improved. The NASA Fundamental Aeronautics Program has the principal objective of overcoming today's national challenges in air transportation. The SFW Quiet-Aircraft Subproject aims to develop concepts and technologies to reduce perceived community noise attributable to aircraft with minimal impact on weight and performance. This reduction of aircraft noise is critical to enabling the anticipated large increase in future air traffic.
NASA Technical Reports Server (NTRS)
Leifer, Ira; Tratt, David; Quattrochi, Dale; Bovensmann, Heinrich; Gerilowski, Konstantin; Buchwitz, Michael; Burrows, John
2013-01-01
Methane's (CH4) large global warming potential (Shindell et al., 2012) and likely increasing future emissions due to global warming feedbacks emphasize its importance to anthropogenic greenhouse warming (IPCC, 2007). Furthermore, CH4 regulation has far greater near-term climate change mitigation potential than carbon dioxide (CO2), the other major anthropogenic Greenhouse Gas (GHG) (Shindell et al., 2009). Uncertainties in CH4 budgets arise from the poor state of knowledge of CH4 sources - in part from a lack of sufficiently accurate assessments of the temporal and spatial emissions and controlling factors of highly variable anthropogenic and natural CH4 surface fluxes (IPCC, 2007), and from the lack of global-scale (satellite) data at sufficiently high spatial resolution to resolve sources. Many important methane (and other trace gas) sources arise from urban and mega-urban landscapes where anthropogenic activities are centered - most of humanity lives in urban areas. Studying these complex landscape tapestries is challenged by a wide and varied range of activities at small spatial scales and by the difficulty of obtaining up-to-date landuse data in the developed world - a key desire of policy makers towards the development of effective regulations. In the developing world, these challenges are multiplied by additional political access challenges. As high spatial resolution satellite and airborne data have become available, activity mapping applications have blossomed (e.g., Google maps); however, they tap a minute fraction of remote sensing capabilities due to limited (three band) spectral information. Next generation approaches that incorporate high spatial resolution hyperspectral and ultraspectral data will allow disentangling of the highly heterogeneous usage patterns of megacities by providing diagnostic identification of chemical composition, from solids (refs) to gases (refs). Properly enabling these next generation technologies for megacities requires atmospheric radiative transfer modeling of the complex, often aerosol laden, humid urban microclimates; atmospheric transport and profile monitoring; adequate spatial resolution; attention to temporal cycles (diurnal and seasonal, which involve interactions with the surrounding environment); and representative measurement approaches given traffic realities. Promising approaches incorporate contemporaneous airborne remote sensing and in situ measurements, nocturnal surface surveys, with ground station measurement
Scaling Agile Infrastructure to People
NASA Astrophysics Data System (ADS)
Jones, B.; McCance, G.; Traylen, S.; Barrientos Arias, N.
2015-12-01
When CERN migrated its infrastructure away from homegrown fabric management tools to emerging industry-standard open-source solutions, the immediate technical challenges and motivation were clear. The move to a multi-site Cloud Computing model meant that the tool chains that were growing around this ecosystem would be a good choice, the challenge was to leverage them. The use of open-source tools brings challenges other than merely how to deploy them. Homegrown software, for all the deficiencies identified at the outset of the project, has the benefit of growing with the organization. This paper will examine what challenges there were in adapting open-source tools to the needs of the organization, particularly in the areas of multi-group development and security. Additionally, the increase in scale of the plant required changes to how Change Management was organized and managed. Continuous Integration techniques are used in order to manage the rate of change across multiple groups, and the tools and workflow for this will be examined.
Peek, N; Holmes, J H; Sun, J
2014-08-15
To review technical and methodological challenges for big data research in biomedicine and health. We discuss sources of big datasets, survey infrastructures for big data storage and big data processing, and describe the main challenges that arise when analyzing big data. The life and biomedical sciences are massively contributing to the big data revolution through secondary use of data that were collected during routine care and through new data sources such as social media. Efficient processing of big datasets is typically achieved by distributing computation over a cluster of computers. Data analysts should be aware of pitfalls related to big data such as bias in routine care data and the risk of false-positive findings in high-dimensional datasets. The major challenge for the near future is to transform analytical methods that are used in the biomedical and health domain, to fit the distributed storage and processing model that is required to handle big data, while ensuring confidentiality of the data being analyzed.
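As a toy, single-machine illustration of the distribute-then-merge pattern that cluster frameworks scale up, the sketch below aggregates diagnosis-code counts over data shards in parallel; the shards and codes are invented stand-ins for partitions of routine-care records.

```python
from collections import Counter
from functools import reduce
from multiprocessing import Pool

# Invented shards standing in for partitions of routine-care records.
shards = [
    ["I10", "E11", "I10"],
    ["E11", "J45"],
    ["I10", "J45", "J45"],
]

def count_codes(records):
    """'Map' phase: each worker aggregates its own partition locally."""
    return Counter(records)

if __name__ == "__main__":
    with Pool(processes=3) as pool:
        partial_counts = pool.map(count_codes, shards)
    # 'Reduce' phase: merge the per-partition aggregates.
    totals = reduce(lambda a, b: a + b, partial_counts)
    print(totals)  # Counter({'I10': 3, 'J45': 3, 'E11': 2})
```

The same map-then-merge structure is what allows analyses to move to the data rather than moving big datasets to a single analysis machine.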
Source Term Model for Vortex Generator Vanes in a Navier-Stokes Computer Code
NASA Technical Reports Server (NTRS)
Waithe, Kenrick A.
2004-01-01
A source term model for an array of vortex generators was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the side force created by a vortex generator vane. The model is obtained by introducing a side force to the momentum and energy equations that can adjust its strength automatically based on the local flow. The model was tested and calibrated by comparing data from numerical simulations and experiments of a single low profile vortex generator vane on a flat plate. In addition, the model was compared to experimental data of an S-duct with 22 co-rotating, low profile vortex generators. The source term model allowed a grid reduction of about seventy percent when compared with the numerical simulations performed on a fully gridded vortex generator on a flat plate without adversely affecting the development and capture of the vortex created. The source term model was able to predict the shape and size of the stream-wise vorticity and velocity contours very well when compared with both numerical simulations and experimental data. The peak vorticity and its location were also predicted very well when compared to numerical simulations and experimental data. The circulation predicted by the source term model matches the prediction of the numerical simulation. The source term model predicted the engine fan face distortion and total pressure recovery of the S-duct with 22 co-rotating vortex generators very well. The source term model allows a researcher to quickly investigate different locations of individual or a row of vortex generators. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.
Baisden, W Troy; Keller, Elizabeth D; Van Hale, Robert; Frew, Russell D; Wassenaar, Leonard I
2016-01-01
Predictive understanding of precipitation δ(2)H and δ(18)O in New Zealand faces unique challenges, including high spatial variability in precipitation amounts, alternation between subtropical and sub-Antarctic precipitation sources, and a compressed latitudinal range of 34 to 47 °S. To map the precipitation isotope ratios across New Zealand, three years of integrated monthly precipitation samples were acquired from >50 stations. Conventional mean-annual precipitation δ(2)H and δ(18)O maps were produced by regressions using geographic and annual climate variables. Incomplete data and short-term variation in climate and precipitation sources limited the utility of this approach. We overcome these difficulties by calculating precipitation-weighted monthly climate parameters using national 5-km-gridded daily climate data. These data, plus geographic variables, were regressed to predict δ(2)H, δ(18)O, and d-excess at all sites. The procedure yields statistically valid predictions of the isotope composition of precipitation (long-term average root mean square error (RMSE) for δ(18)O = 0.6 ‰ and δ(2)H = 5.5 ‰; monthly RMSE for δ(18)O = 1.9 ‰ and δ(2)H = 16 ‰). This approach has substantial benefits for studies that require the isotope composition of precipitation during specific time intervals, and may be further improved by comparison to daily and event-based precipitation samples as well as the use of back-trajectory calculations.
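A minimal sketch of the two ingredients described above: precipitation-weighted monthly climate predictors, and a least-squares regression of monthly δ(18)O on those predictors plus geographic variables. All numbers are invented for illustration, not values from the study.

```python
import numpy as np

# One station-month of daily gridded climate data (values invented).
daily_precip = np.array([0.0, 12.5, 3.1, 0.0, 8.4])      # mm
daily_temp = np.array([14.2, 11.8, 12.5, 15.0, 10.9])    # deg C

# Precipitation-weighted monthly temperature: rainy days dominate the climate
# signal carried by the integrated monthly precipitation sample.
t_weighted = np.average(daily_temp, weights=daily_precip)

# Regression of monthly d18O on weighted climate plus geographic predictors
# (here latitude and elevation); rows are station-months, values invented.
X = np.column_stack([
    np.ones(6),                                     # intercept
    [11.9, 12.4, 13.0, 11.2, 12.1, 10.7],           # weighted temperature
    [-36.8, -41.3, -43.5, -45.0, -39.2, -44.1],     # latitude (deg)
    [5, 120, 350, 40, 15, 600],                     # elevation (m)
])
y = np.array([-4.8, -6.1, -7.3, -7.9, -5.6, -8.4])  # monthly d18O (per mil)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # coefficients used to predict d18O at unsampled grid cells
```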
A Well-Balanced Path-Integral f-Wave Method for Hyperbolic Problems with Source Terms
2014-01-01
Systems of hyperbolic partial differential equations with source terms (balance laws) arise in many applications where it is important to compute accurate time-dependent solutions modeling small perturbations of equilibrium solutions in which the source terms balance the hyperbolic part. The f-wave version of the wave-propagation algorithm is one approach, but requires the use of a particular averaged value of the source terms at each cell interface in order to be “well balanced” and exactly maintain steady states. A general approach to choosing this average is developed using the theory of path conservative methods. A scalar advection equation with a decay or growth term is introduced as a model problem for numerical experiments. PMID:24563581
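For the scalar model problem named in the abstract, q_t + a q_x = -λq, the well-balanced requirement is concrete: the f-wave fluctuation a(q_i - q_{i-1}) - Δx·ψ̄ must vanish on the exponential steady state, which pins down the source average ψ̄. The sketch below verifies directly, for this model problem only, that a logarithmic-mean average of the neighboring cell values balances exactly while an arithmetic mean leaves an O(Δx³) residual; this is a hand check consistent with the abstract's setup, not a reproduction of the paper's general path-conservative construction:

```python
import numpy as np

a, lam, dx = 1.0, 0.5, 0.1
x = dx * np.arange(20)
q = np.exp(-lam * x / a)        # exact steady state of a q_x = -lam q

def log_mean(ql, qr):
    """Logarithmic mean (qr - ql) / ln(qr / ql), safe when ql == qr."""
    d = np.log(qr) - np.log(ql)
    safe = np.where(np.abs(d) < 1e-14, 1.0, d)
    return np.where(np.abs(d) < 1e-14, 0.5 * (ql + qr), (qr - ql) / safe)

ql, qr = q[:-1], q[1:]
# f-wave fluctuation: flux difference minus Delta-x times the source average
z_log   = a * (qr - ql) + dx * lam * log_mean(ql, qr)    # well balanced
z_arith = a * (qr - ql) + dx * lam * 0.5 * (ql + qr)     # not balanced

print(np.abs(z_log).max())    # round-off only: steady state maintained exactly
print(np.abs(z_arith).max())  # O(dx^3) residual that slowly perturbs the steady state
```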
NASA Technical Reports Server (NTRS)
Ohring, G.; Wielicki, B.; Spencer, R.; Emery, B.; Datla, R.
2004-01-01
Measuring the small changes associated with long-term global climate change from space is a daunting task. To address these problems and recommend directions for improvements in satellite instrument calibration, some 75 scientists, including researchers who develop and analyze long-term data sets from satellites, experts in the field of satellite instrument calibration, and physicists working on state-of-the-art calibration sources and standards, met on November 12-14, 2002, to discuss the issues. The workshop defined the absolute accuracies and long-term stabilities of global climate data sets that are needed to detect expected trends, translated these data set accuracies and stabilities into required satellite instrument accuracies and stabilities, and evaluated the ability of current observing systems to meet these requirements. The workshop's recommendations include a set of basic axioms or overarching principles that must guide high-quality climate observations in general, and a roadmap for improving satellite instrument characterization, calibration, inter-calibration, and associated activities to meet the challenge of measuring global climate change. It was also recommended that a follow-up workshop be conducted to discuss implementation of the roadmap developed at this workshop.
Park, Eun Sug; Hopke, Philip K; Oh, Man-Suk; Symanski, Elaine; Han, Daikwon; Spiegelman, Clifford H
2014-07-01
There has been increasing interest in assessing health effects associated with multiple air pollutants emitted by specific sources. A major difficulty with achieving this goal is that the pollution source profiles are unknown and source-specific exposures cannot be measured directly; rather, they need to be estimated by decomposing ambient measurements of multiple air pollutants. This estimation process, called multivariate receptor modeling, is challenging because of the unknown number of sources and unknown identifiability conditions (model uncertainty). The uncertainty in source-specific exposures (source contributions), as well as uncertainty in the number of major pollution sources and identifiability conditions, has been largely ignored in previous studies. A multipollutant approach that can deal with model uncertainty in multivariate receptor models, while simultaneously accounting for parameter uncertainty in estimated source-specific exposures when assessing source-specific health effects, is presented in this paper. The methods are applied to daily ambient air measurements of the chemical composition of fine particulate matter (PM2.5), weather data, and counts of cardiovascular deaths from 1995 to 1997 for Phoenix, AZ, USA. Our approach for evaluating source-specific health effects yields not only estimates of source contributions along with their uncertainties and associated health effects estimates, but also estimates of model uncertainty (posterior model probabilities) that have been ignored in previous studies. The results from our methods agreed in general with those from previously conducted workshops/studies on the source apportionment of PM health effects in terms of the number of major contributing sources, estimated source profiles, and contributions. However, some of the adverse source-specific health effects identified in the previous studies were not statistically significant in our analysis, probably because our estimation of the health effects parameters incorporates the parameter uncertainty in estimated source contributions that previous studies ignored. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
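The decomposition at the heart of multivariate receptor modeling factors the ambient data matrix into nonnegative source profiles and daily contributions, X ≈ G F. A frequentist stand-in using scikit-learn's NMF on synthetic data is sketched below; it ignores the model uncertainty (number of sources, identifiability conditions) that the paper's Bayesian approach quantifies, and all shapes and parameters are illustrative:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
true_F = rng.dirichlet(np.ones(8), size=3)        # 3 source profiles x 8 species
true_G = rng.gamma(2.0, 1.0, size=(365, 3))       # daily source contributions
noise = rng.normal(0, 0.01, (365, 8)).clip(min=0) # keep X nonnegative
X = true_G @ true_F + noise                       # ambient measurements (days x species)

model = NMF(n_components=3, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X)    # estimated daily source contributions
F = model.components_         # estimated source profiles
print(G.shape, F.shape)       # (365, 3) (3, 8)
```

In the paper's setting the estimated columns of G would then feed a health-effects regression; propagating their uncertainty into that second stage is precisely what the Bayesian treatment adds over a plug-in factorization like this one.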
Sources and methods to reconstruct past masting patterns in European oak species.
Szabó, Péter
2012-01-01
The irregular occurrence of good seed years in forest trees is known in many parts of the world. Mast year frequency in the past few decades can be examined through field observational studies; however, masting patterns in the more distant past are equally important in gaining a better understanding of long-term forest ecology. Past masting patterns can be studied through the examination of historical written sources. These pose considerable challenges, because the data in them were usually not recorded with the aim of providing information about masting. Several studies have examined masting in the deeper past; however, authors hardly ever considered the methodological implications of using and combining various source types. This paper provides a critical overview of the types of archival written sources that are available for the reconstruction of past masting patterns for European oak species and proposes a method to unify and evaluate different types of data. Available sources cover approximately eight centuries and fall into two basic categories: direct observations on the amount of acorns and references to sums of money received in exchange for access to acorns. Because archival sources differ greatly in origin and quality, the optimal solution for creating databases of past masting data is a three-point scale: zero mast, moderate mast, good mast. When larger amounts of data are available in a unified three-point-scale database, they can be used to test hypotheses about past masting frequencies, the driving forces of masting, or regional masting patterns.
Bisphosphonate Treatment for Children With Disabling Conditions
Boyce, Alison M.; Tosi, Laura L.; Paul, Scott M.
2014-01-01
Fractures are a frequent source of morbidity in children with disabling conditions. The assessment of bone density in this population is challenging, because densitometry is influenced by dynamic forces affecting the growing skeleton and may be further confounded by positioning difficulties and surgical hardware. First-line treatment for pediatric osteoporosis involves conservative measures, including optimizing the management of underlying conditions, maintaining appropriate calcium and vitamin D intake, encouraging weight-bearing physical activity, and monitoring measurements of bone mineral density. Bisphosphonates are a class of medications that increase bone mineral density by inhibiting bone resorption. Although bisphosphonates are commonly prescribed for treatment of adult osteoporosis, their use in pediatric patients is controversial because of the lack of long-term safety and efficacy data. PMID:24368091
A long view of global plutonium management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wagner, R.L. Jr.
1995-10-01
Dealing with the large and growing world inventories of fissile materials from all sources is a major part of the long term challenge of limiting the danger from nuclear weapons. Providing clean, safe nuclear power may also be needed to prevent conditions from arising which could lead to large scale nuclear weapon (re)armament. ADTT technologies might reconcile the seeming dilemma of providing nuclear power while maintaining a very low world inventory of nuclear materials which can be used in weapons. This vision for ADTT should be tested in a variety of ways, including comparisons with competing approaches and with other objectives. Such testing is one part of constructing a path for a decades-long, worldwide implementation campaign for ADTT.
Automatically Detecting Failures in Natural Language Processing Tools for Online Community Text.
Park, Albert; Hartzler, Andrea L; Huh, Jina; McDonald, David W; Pratt, Wanda
2015-08-31
The prevalence and value of patient-generated health text are increasing, but processing such text remains problematic. Although existing biomedical natural language processing (NLP) tools are appealing, most were developed to process clinician- or researcher-generated text, such as clinical notes or journal articles. In addition to being constructed for different types of text, other challenges of using existing NLP tools include constantly changing technologies, source vocabularies, and characteristics of text. These continuously evolving challenges warrant low-cost, systematic assessment. However, the primarily accepted evaluation method in NLP, manual annotation, requires tremendous effort and time. The primary objective of this study is to explore an alternative approach: using low-cost, automated methods to detect failures (eg, incorrect boundaries, missed terms, mismapped concepts) when processing patient-generated text with existing biomedical NLP tools. We first characterize common failures that NLP tools can make in processing online community text. We then demonstrate the feasibility of our automated approach in detecting these common failures using one of the most popular biomedical NLP tools, MetaMap. Using 9657 posts from an online cancer community, we explored our automated failure detection approach in two steps: (1) to characterize the failure types, we first manually reviewed MetaMap's commonly occurring failures, grouped the inaccurate mappings into failure types, and then identified causes of the failures through iterative rounds of manual review using open coding; and (2) to automatically detect these failure types, we then explored combinations of existing NLP techniques and dictionary-based matching for each failure cause. Finally, we manually evaluated the automatically detected failures. From our manual review, we characterized three types of failure: (1) boundary failures, (2) missed term failures, and (3) word ambiguity failures. Within these three failure types, we discovered 12 causes of inaccurate mappings of concepts. We used automated methods to detect almost half of MetaMap's 383,572 mappings as problematic. Word sense ambiguity failure was the most widely occurring, comprising 82.22% of failures. Boundary failure was the second most frequent, amounting to 15.90% of failures, while missed term failures were the least common, making up 1.88% of failures. The automated failure detection achieved precision, recall, accuracy, and F1 scores of 83.00%, 92.57%, 88.17%, and 87.52%, respectively. We illustrate the challenges of processing patient-generated online health community text and characterize failures of NLP tools on this patient-generated health text, demonstrating the feasibility of our low-cost approach to automatically detect those failures. Our approach shows the potential for scalable and effective solutions to automatically assess the constantly evolving NLP tools and source vocabularies used to process patient-generated text.
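The reported evaluation reduces to standard confusion-matrix arithmetic. A small sketch follows, with invented counts chosen only to show that precision, recall, accuracy, and F1 of roughly the reported magnitudes are mutually consistent; these are not the study's actual counts:

```python
def prf(tp, fp, tn, fn):
    """Precision, recall, accuracy, and F1 from confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, accuracy, f1

# Illustrative counts: P ~ 0.83, R ~ 0.926, accuracy ~ 0.88, F1 ~ 0.875,
# matching the internal consistency of the headline numbers above.
print(prf(tp=830, fp=170, tn=933, fn=67))
```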
Salathé, Marcel
2016-12-01
The digital revolution has contributed to very large data sets (ie, big data) relevant for public health. The two major data sources are electronic health records from traditional health systems and patient-generated data. As the two data sources have complementary strengths-high veracity in the data from traditional sources and high velocity and variety in patient-generated data-they can be combined to build more-robust public health systems. However, they also have unique challenges. Patient-generated data in particular are often completely unstructured and highly context dependent, posing essentially a machine-learning challenge. Some recent examples from infectious disease surveillance and adverse drug event monitoring demonstrate that the technical challenges can be solved. Despite these advances, the problem of verification remains, and unless traditional and digital epidemiologic approaches are combined, these data sources will be constrained by their intrinsic limits. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America.
Miller, Dane
2006-01-01
Rob Burns talks with Dane Miller, former CEO of Biomet, about challenges posed by new technology in the orthopedic devices area. One key challenge is the rising cost and use of orthopedic devices at a time when providers are facing decreased profitability and reimbursement for orthopedic services. Another challenge is the long-term time horizon needed to gauge product success that contrasts with payers' and providers' short-term horizon. A third challenge is heightened governmental scrutiny of device makers' relationships with orthopedic surgeons. This interview was conducted before Miller left Biomet in March 2006.
26 CFR 1.737-1 - Recognition of precontribution gain.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Property A1 and Property A2 is long-term, U.S.-source capital gain or loss. The character of gain on Property A3 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real... long-term, U.S.-source capital gain ($10,000 gain on Property A1 and $8,000 loss on Property A2) and $1...
Source term model evaluations for the low-level waste facility performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yim, M.S.; Su, S.I.
1995-12-31
The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.
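As a point of reference for what such codes compute, one common source-term idealization is first-order, solubility-unlimited release: the waste-form inventory declines by leaching and radioactive decay, and the release rate to the facility boundary is the leach rate times the remaining inventory. The sketch below is this generic textbook form with assumed parameter values, not the specific formulation of PRESTO-EPA-CPG, IMPACTS, DUST, or NEFTRAN-II:

```python
import numpy as np

half_life_c14 = 5730.0                  # C-14 half-life [years]
lam = np.log(2) / half_life_c14         # radioactive decay constant [1/yr]
k = 1e-3                                # fractional leach rate [1/yr], assumed
I0 = 1.0                                # initial C-14 inventory [Ci], assumed

t = np.linspace(0.0, 1000.0, 101)       # years after closure
inventory = I0 * np.exp(-(k + lam) * t) # inventory under leaching plus decay
release_rate = k * inventory            # [Ci/yr] leaving the waste form
print(release_rate[0], release_rate[-1])
```

Differences between codes typically enter through how k is derived (diffusion, dissolution, container failure) and how the released activity is transported to the facility's bottom boundary.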
A Systematic Review of Chronic Fatigue Syndrome: Don't Assume It's Depression
Griffith, James P.; Zarrouf, Fahd A.
2008-01-01
Objective: Chronic fatigue syndrome (CFS) is characterized by profound, debilitating fatigue and a combination of several other symptoms resulting in substantial reduction in occupational, personal, social, and educational status. CFS is often misdiagnosed as depression. The objective of this study was to evaluate and discuss different etiologies, approaches, and management strategies of CFS and to present ways to differentiate it from the fatigue symptom of depression. Data Sources: A MEDLINE search was conducted to identify existing information about CFS and depression using the headings chronic fatigue syndrome AND depression. The alternative terms major depressive disorder and mood disorder were also searched in conjunction with the term chronic fatigue syndrome. Additionally, MEDLINE was searched using the term chronic fatigue. All searches were limited to articles published within the last 10 years, in English. A total of 302 articles were identified by these searches. Also, the term chronic fatigue syndrome was searched by itself. This search was limited to articles published within the last 5 years, in English, and resulted in an additional 460 articles. Additional publications were identified by manually searching the reference lists of the articles from both searches. Study Selection and Data Extraction: CFS definitions, etiologies, differential diagnoses (especially depression) and management strategies were extracted, reviewed, and summarized to meet the objectives of this article. Data Synthesis: CFS is underdiagnosed in more than 80% of the people who have it; at the same time, it is often misdiagnosed as depression. Genetic, immunologic, infectious, metabolic, and neurologic etiologies were suggested to explain CFS. A biopsychosocial model was suggested for evaluating, managing, and differentiating CFS from depression. Conclusions: Evaluating and managing chronic fatigue is a challenging situation for physicians, as it is a challenging and difficult condition for patients. A biopsychosocial approach in the evaluation and management is recommended. More studies about CFS manifestations, evaluation, and management are needed. PMID:18458765
NASA Astrophysics Data System (ADS)
Suleiman, Lina
2017-04-01
Planners' and policymakers' concern is escalating over conventional systems that deal with rain in cities through domination and control of nature rather than harmony and design with nature. A new spatial planning paradigm is needed to put in place systems that mimic natural water systems and promise multiple values, instead of systems that treat rain as a source of problems. However, such an approach embodies significant planning challenges. Urban rain harvesting systems (URHs) are inherently 'sociotechnical' systems. As such, planning processes should consider the interdependence of 'social' and 'technical' aspects as essential elements if a transition towards sustainable urban water systems is to be realised. Drawing on a common understanding of what urban rain harvesting systems should deliver in terms of 'functions' and 'added values', a generic planning framework is developed to inform practitioners on how the 'socio' and 'technical' elements should be assimilated in long-term and integrated planning processes for URHs. Using the developed framework, the paper examines the planning and maintenance processes of urban rain harvesting systems in Årstafältet and Hammarby Sjöstad, respectively. Results show that planners lack a common operational understanding of how these systems should be designed holistically in long-term and integrated planning processes, creating working gaps or positional conflicts. In practice, urban planners and water engineers look at these systems as either an urban design component or a water drainage system dealing with technical functions, hindering a smooth transition path towards urban rain harvesting systems. The paper concludes on the urgency of reordering roles and relations within a new organisational set-up to incubate these systems in long-term planning and maintenance processes. Key words: 'sociotechnical' system, water, planning, urban rain harvesting systems (URHs), Hammarby Sjöstad and Årstafältet
ERIC Educational Resources Information Center
Phillips, David
2017-01-01
This paper describes the challenges involved in work in progress on the history of British policy in education in occupied Germany, 1945-1949. The problems centre on the range of archival sources, the structural balance of themes and chronology, and the use of appropriate illustrations. It argues that conclusions about the nature of educational…
Open Source GIS based integrated watershed management
NASA Astrophysics Data System (ADS)
Byrne, J. M.; Lindsay, J.; Berg, A. A.
2013-12-01
Optimal land and water management to address future and current resource stresses and allocation challenges requires the development of state-of-the-art geomatics and hydrological modelling tools. Future hydrological modelling tools should be high-resolution and process-based, with real-time capability to assess changing resource issues critical to short-, medium- and long-term environmental management. The objective here is to merge two renowned, well-published resource modelling programs to create an open-source toolbox for integrated land and water management applications. This work will greatly increase efficiency in land and water resource security, management and planning. Following an 'open-source' philosophy, the tools will be computer platform independent with source code freely available, maximizing knowledge transfer and the global value of the proposed research. The envisioned set of water resource management tools will be housed within 'Whitebox Geospatial Analysis Tools'. Whitebox is an open-source geographical information system (GIS) developed by Dr. John Lindsay at the University of Guelph. The emphasis of the Whitebox project has been to develop a user-friendly interface for advanced spatial analysis in environmental applications. The plugin architecture of the software is ideal for the tight integration of spatially distributed models and spatial analysis algorithms such as those contained within the GENESYS suite. Open-source development extends knowledge and technology transfer to a broad range of end-users and builds Canadian capability to address complex resource management problems with better tools and expertise for managers in Canada and around the world. GENESYS (Generate Earth Systems Science input) is an innovative, efficient, high-resolution hydro- and agro-meteorological model for complex-terrain watersheds developed under the direction of Dr. James Byrne. GENESYS is an outstanding research and applications tool to address challenging resource management issues in industry, government and non-governmental agencies. Current research and analysis tools were developed to manage meteorological, climatological, and land and water resource data efficiently at high resolution in space and time. The deliverable for this work is a Whitebox-GENESYS open-source resource management capacity with routines for GIS-based watershed management, including water in agriculture and food production. We are adding urban water management routines through GENESYS in 2013-15 with an engineering PhD candidate. Both Whitebox-GAT and GENESYS are already well-established tools. The proposed research will combine these products to create an open-source, geomatics-based water resource management tool that is revolutionary in both capacity and availability to a wide array of Canadian and global users.
Observation-based source terms in the third-generation wave model WAVEWATCH
NASA Astrophysics Data System (ADS)
Zieger, Stefan; Babanin, Alexander V.; Erick Rogers, W.; Young, Ian R.
2015-12-01
Measurements collected during the AUSWEX field campaign at Lake George (Australia) resulted in new insights into the processes of wind-wave interaction and whitecapping dissipation, and consequently new parameterizations of the input and dissipation source terms. The newly developed nonlinear wind input term accounts for the dependence of growth on wave steepness, for airflow separation, and for negative growth rates under adverse winds. The new dissipation terms feature an inherent breaking term, a cumulative dissipation term, and a term due to production of turbulence by waves, which is particularly relevant for decaying seas and for swell. The latter is consistent with the observed decay rate of ocean swell. This paper describes these source terms implemented in WAVEWATCH III® and evaluates their performance against existing source terms in academic duration-limited tests, against buoy measurements for windsea-dominated conditions, under conditions of extreme wind forcing (Hurricane Katrina), and against altimeter data in global hindcasts. Results show agreement in terms of growth curves as well as integral and spectral parameters in the simulations and hindcasts.
WebGIVI: a web-based gene enrichment analysis and visualization tool.
Sun, Liang; Zhu, Yongnan; Mahmood, A S M Ashique; Tudor, Catalina O; Ren, Jia; Vijay-Shanker, K; Chen, Jian; Schmidt, Carl J
2017-05-04
A major challenge of high-throughput transcriptome studies is presenting the data to researchers in an interpretable format. In many cases, the outputs of such studies are gene lists, which are then examined for enriched biological concepts. One approach to help the researcher interpret large gene datasets is to associate genes and informative terms (iTerms) that are obtained from the biomedical literature using the eGIFT text-mining system. However, examining large lists of iTerm and gene pairs is a daunting task. We have developed WebGIVI, an interactive web-based visualization tool ( http://raven.anr.udel.edu/webgivi/ ) to explore gene:iTerm pairs. WebGIVI was built with the Cytoscape and Data-Driven Documents (D3) JavaScript libraries and can be used to relate genes to iTerms and then visualize gene and iTerm pairs. WebGIVI can accept a gene list that is used to retrieve the gene symbols and corresponding iTerm list. This list can be submitted to visualize the gene:iTerm pairs using two distinct methods: a Concept Map or a Cytoscape Network Map. In addition, WebGIVI also supports uploading and visualization of any two-column tab-separated data. WebGIVI provides an interactive and integrated network graph of genes and iTerms that allows filtering, sorting, and grouping, which can aid biologists in developing hypotheses based on the input gene lists. In addition, WebGIVI can visualize hundreds of nodes and generate a high-resolution image that is important for most research publications. The source code can be freely downloaded at https://github.com/sunliang3361/WebGIVI . The WebGIVI tutorial is available at http://raven.anr.udel.edu/webgivi/tutorial.php .
Cultural and Cognitive Development in Short-Term Study Abroad: Illuminating the 360 Experience
ERIC Educational Resources Information Center
Burton, Susan
2012-01-01
The three articles in this dissertation investigate leading others through developmental opportunities by facilitating their engagement in intercultural challenges. Specifically the research explores the meaning followers make of developmental challenges during short-term study abroad experiences and encounters with diversity. Data in the form of…
Bayesian source term determination with unknown covariance of measurements
NASA Astrophysics Data System (ADS)
Belal, Alkomiet; Tichý, Ondřej; Šmídl, Václav
2017-04-01
Determination of the source term of a release of hazardous material into the atmosphere is a very important task for emergency response. We are concerned with the problem of estimating the source term in the conventional linear inverse problem, y = Mx, where the relationship between the vector of observations y and the unknown source term x is described by the source-receptor-sensitivity (SRS) matrix M. Since the system is typically ill-conditioned, the problem is recast as an optimization problem: minimize (y - Mx)^T R^-1 (y - Mx) + x^T B^-1 x over the source term x. The first term minimizes the error of the measurements with covariance matrix R, and the second term is a regularization of the source term. Different types of regularization arise for different choices of the matrices R and B; for example, Tikhonov regularization assumes the covariance matrix B to be the identity matrix multiplied by a scalar parameter. In this contribution, we adopt a Bayesian approach to make inference on the unknown source term x as well as the unknown R and B. We assume the prior on x to be Gaussian with zero mean and unknown diagonal covariance matrix B. The covariance matrix of the likelihood, R, is also unknown. We consider two potential choices of the structure of the matrix R: first, a diagonal matrix, and second, a locally correlated structure using information on the topology of the measuring network. Since inference in the model is intractable, an iterative variational Bayes algorithm is used for simultaneous estimation of all model parameters. The practical usefulness of our contribution is demonstrated by applying the resulting algorithm to real data from the European Tracer Experiment (ETEX). This research is supported by the EEA/Norwegian Financial Mechanism under project MSMT-28477/2014, Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
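For fixed R and B the objective above has a closed-form minimizer, x = (M^T R^-1 M + B^-1)^-1 M^T R^-1 y, which is the generalized Tikhonov estimate; the paper's variational Bayes step can be thought of as iterating this solve while also updating R and B. A minimal sketch of the fixed-R, fixed-B solve on synthetic data, with dimensions and noise levels chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 40, 25
M = rng.normal(size=(m, n))              # stand-in SRS matrix
x_true = np.zeros(n); x_true[5] = 3.0    # sparse true source term
y = M @ x_true + rng.normal(0, 0.1, m)   # noisy observations

R_inv = np.eye(m) / 0.1**2               # measurement precision (R fixed here)
B_inv = np.eye(n) * 1.0                  # prior precision (Tikhonov-like B)
x_hat = np.linalg.solve(M.T @ R_inv @ M + B_inv, M.T @ R_inv @ y)
print(np.argmax(x_hat), x_hat.max())     # recovers the dominant source entry
```

The Bayesian machinery earns its keep precisely where this sketch cheats: R and B are not known in practice, and choosing them by hand is what the variational inference replaces.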
Insights and Challenges to Integrating Data from Diverse Ecological Networks
NASA Astrophysics Data System (ADS)
Peters, D. P. C.
2014-12-01
Many of the most dramatic and surprising effects of global change occur across large spatial extents, from regions to continents, and impact multiple ecosystem types across a range of interacting spatial and temporal scales. The ability of ecologists and interdisciplinary scientists to understand and predict these dynamics depends, in large part, on existing site-based research infrastructures that developed in response to historic events. Integrating these diverse sources of data is critical to addressing these broad-scale questions. A conceptual approach is presented to synthesize and integrate diverse sources and types of data from different networks of research sites. This approach focuses on developing derived data products through spatial and temporal aggregation that allow datasets collected with different methods to be compared. The approach is illustrated through the integration, analysis, and comparison of hundreds of long-term datasets from 50 ecological sites in the US that represent ecosystem types commonly found globally. New insights were found by comparing multiple sites using common derived data. In addition to "bringing to light" much dark data in a standardized, open-access, easy-to-use format, a suite of lessons was learned that can be applied to up-and-coming research networks in the US and internationally. These lessons will be described along with the challenges, including cyber-infrastructure, cultural, and behavioral constraints associated with the use of big and little data, that may keep ecologists and interdisciplinary scientists from taking full advantage of the vast amounts of existing and yet-to-be-exposed data.
Findyartini, Ardi; Sudarsono, Nani Cahyani
2018-05-02
Fostering personal identity formation and professional development among undergraduate medical students is challenging. Based on situated learning, experiential learning and role-modelling frameworks, a six-week course was developed to remediate lapses in professionalism among undergraduate medical students. This study aims to explore the students' perceptions of their personal identity formation and professional development following completion of the course. This qualitative study, adopting a phenomenological design, uses the participants' reflective diaries as primary data sources. In the pilot course, field work, role-model shadowing and discussions with resource personnel were conducted. A total of 14 students were asked to provide written self-reflections. Consistent, multi-source feedback was provided throughout the course. A thematic analysis was conducted to identify the key processes of personal and professional development among the students during remediation. Three main themes were revealed. First, students highlighted the strength of small group activities in helping them 'internalise the essential concepts'. Second, the role-model shadowing supported their understanding of 'what kind of medical doctors they would become'. Third, the field work allowed them to identify 'what the "noble values" are and how to implement them in daily practice'. By implementing multimodal activities, the course has high potential in supporting personal identity formation and professional development among undergraduate pre-clinical medical students, as well as remediating their lapses in professionalism. However, there are challenges in implementing the model among a larger student population and in documenting the long-term impact of the course.
Gao, Jinghong; Woodward, Alistair; Vardoulakis, Sotiris; Kovats, Sari; Wilkinson, Paul; Li, Liping; Xu, Lei; Li, Jing; Yang, Jun; Li, Jing; Cao, Lina; Liu, Xiaobo; Wu, Haixia; Liu, Qiyong
2017-02-01
With rapid economic development, China has been plagued by choking air pollution in recent years, and the frequent occurrence of haze episodes has caused widespread public concern. The purpose of this study is to describe the sources and formation of haze, summarize the mitigation measures in force, review the relationship between haze pollution and public health, and discuss the challenges, potential research directions and policy options. Haze pollution has both natural and man-made causes, though anthropogenic sources are the major contributors. Accumulation of air pollutants, secondary formation of aerosols, stagnant meteorological conditions, and trans-boundary transportation of pollutants are the principal causes driving the formation and evolution of haze. In China, haze includes gaseous pollutants and fine particles, of which PM2.5 is the dominant component. Short- and long-term exposure to haze pollution is associated with a range of negative health outcomes, including respiratory diseases, cardiovascular and cerebrovascular diseases, mental health problems, lung cancer and premature death. China has paid increasing attention to the improvement of air quality, and has introduced action plans and policies to tackle pollution, but many interventions have only temporary effects. There may be fierce resistance from industry groups and some government agencies, and it is often challenging to enforce relevant control measures and laws. We discuss the potential policy options for prevention, the need for wider public dialogue and the implications for scientific research. Copyright © 2016 Elsevier B.V. All rights reserved.
Strategic positioning. Part 2: Positioning challenges in an evolving health care marketplace.
Kauer, R T; Berkowitz, E
1997-01-01
Why is strategic positioning so important to health care organizations struggling in a managed care environment and what are the sources of value? In Part 1 of this article, entitled "The Sources of Value under Managed Care," the authors presented four sources of value relative to the evolution of the market from fee-for-service to managed care. These value sources are: (1) assets, (2) price/performance, (3) distribution, and, ultimately, (4) capabilities and brand equity. In this article, the authors further elaborate on the sources of value as the market moves beyond the historical fee-for-service position to a managed care marketplace. Part 2 presents the marketing and financial challenges to organizational positioning and performance across the four stages of managed care.
ERIC Educational Resources Information Center
Barzilai, Sarit; Tzadok, Eynav; Eshet-Alkalai, Yoram
2015-01-01
Sourcing is vital for knowledge construction from online information sources, yet learners may find it difficult to engage in effective sourcing. Sourcing can be particularly challenging when lay readers encounter conflicting expert accounts of controversial topics, a situation which is increasingly common when learning online. The aim of this…
Challenges/issues of NIS used in particle accelerator facilities
NASA Astrophysics Data System (ADS)
Faircloth, Dan
2013-09-01
High-current, high-duty-cycle negative ion sources are an essential component of many high-power particle accelerators. This talk gives an overview of the state-of-the-art sources used around the world. Volume, surface and charge-exchange negative ion production processes are detailed. Cesiated magnetron and Penning surface plasma sources are discussed along with surface converter sources. Multicusp volume sources with filament and LaB6 cathodes are described before moving on to RF inductively coupled volume sources with internal and external antennas. The major challenges facing accelerator facilities are detailed; beam current, source lifetime and reliability are the most pressing. The pros and cons of each source technology are discussed along with their development programs. The uncertainties and unknowns common to these sources are discussed: the dynamics of cesium surface coverage and the causes of source variability are still unknown. Minimizing beam emittance is essential to maximizing the transport of high-current beams; space charge effects are very important. The basic physics of negative ion production is still not well understood; theoretical and experimental programs continue to improve this, but there are still many mysteries to be solved.
Identifying Attributes of CO2 Leakage Zones in Shallow Aquifers Using a Parametric Level Set Method
NASA Astrophysics Data System (ADS)
Sun, A. Y.; Islam, A.; Wheeler, M.
2016-12-01
Leakage through abandoned wells and geologic faults poses the greatest risk to CO2 storage permanence. For shallow aquifers, secondary CO2 plumes emanating from the leak zones may go undetected for a sustained period of time and have the greatest potential to cause large-scale and long-term environmental impacts. Identification of the attributes of leak zones, including their shape, location, and strength, is required for proper environmental risk assessment. This study applies a parametric level set (PaLS) method to characterize the leakage zone. Level set methods are appealing for tracking topological changes and recovering unknown shapes of objects. However, level set evolution using the conventional level set methods is challenging. In PaLS, the level set function is approximated using a weighted sum of basis functions, and the level set evolution problem is replaced by an optimization problem. The efficacy of PaLS is demonstrated through recovering the source zone created by CO2 leakage into a carbonate aquifer. Our results show that PaLS is a robust source identification method that can recover the approximate source locations in the presence of measurement errors, model parameter uncertainty, and inaccurate initial guesses of source flux strengths. The PaLS inversion framework introduced in this work is generic and can be adapted for any reactive transport model by switching the pre- and post-processing routines.
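The core PaLS idea fits in a few lines: the level set function is a weighted sum of radial basis functions and the source zone is a superlevel set, so inversion optimizes a short parameter vector instead of evolving a gridded level set. A sketch with invented centers, weights, widths, and threshold on a 2-D stand-in domain:

```python
import numpy as np

def phi(points, centers, weights, width):
    """Parametric level set: weighted sum of Gaussian RBFs at 2-D points."""
    d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return (weights * np.exp(-d2 / width**2)).sum(axis=1)

# A 50x50 grid over the aquifer domain [0,1]^2
g = np.linspace(0.0, 1.0, 50)
pts = np.array([(xi, yi) for xi in g for yi in g])

centers = np.array([[0.3, 0.4], [0.6, 0.5]])   # RBF centers (parameters)
weights = np.array([1.0, 0.8])                 # RBF weights (parameters)
source_zone = phi(pts, centers, weights, width=0.1) > 0.5  # superlevel set {phi > c}
print(source_zone.reshape(50, 50).sum(), "cells flagged as source zone")
```

An inversion would wrap this representation in a forward transport model and adjust the weights (and optionally the centers) to fit observed concentrations, which is what makes the approach robust to poor initial guesses: the shape can appear, merge, or split as the parameters move.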
Moment tensor analysis of very shallow sources
Chiang, Andrea; Dreger, Douglas S.; Ford, Sean R.; ...
2016-10-11
An issue for moment tensor (MT) inversion of shallow seismic sources is that some components of the Green's functions have vanishing amplitudes at the free surface, which can result in bias in the MT solution. The effects of the free surface on the stability of the MT method become important as we continue to investigate and improve the capabilities of regional full MT inversion for source-type identification and discrimination. It is important to understand free-surface effects on discriminating shallow explosive sources for nuclear monitoring purposes. It may also be important in natural systems that have very shallow seismicity, such as volcanic and geothermal systems. We examine the effects of the free surface on the MT via synthetic testing and apply the MT-based discrimination method to three quarry blasts from the HUMMING ALBATROSS experiment. These shallow chemical explosions at ~10 m depth, recorded up to several kilometers distance, represent a rather severe source-station geometry in terms of free-surface effects. We show that the method is capable of recovering a predominantly explosive source mechanism, and the combined waveform and first-motion method enables the unique discrimination of these events. Furthermore, recovering the design yield using seismic moment estimates from MT inversion remains challenging, but we can begin to put error bounds on our moment estimates using the network sensitivity solution technique.
Source Term Model for Steady Micro Jets in a Navier-Stokes Computer Code
NASA Technical Reports Server (NTRS)
Waithe, Kenrick A.
2005-01-01
A source term model for steady micro jets was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the mass flow and momentum created by a steady blowing micro jet. The model is obtained by adding the momentum and mass flow created by the jet to the Navier-Stokes equations. The model was tested by comparison with data from numerical simulations of a single, steady micro jet on a flat plate in two and three dimensions. The source term model predicted the velocity distribution well compared with the two-dimensional flat plate case, which used a steady mass flow boundary condition to simulate the micro jet. The model was also compared to two three-dimensional flat plate cases using a steady mass flow boundary condition to simulate a steady micro jet. The three-dimensional comparison included a case with a grid generated to capture the circular shape of the jet and a case without a grid generated for the micro jet. The case without the jet grid mimics the application of the source term. The source term model compared well with both of the three-dimensional cases. Comparisons of velocity distribution were made before and after the jet, and Mach and vorticity contours were examined. The source term model allows a researcher to quickly investigate different locations of individual or several steady micro jets, and to conduct a preliminary investigation with minimal grid generation and computational time.
Challenges and opportunities for early-career Teaching-Focussed academics in the biosciences.
Hubbard, Katharine; Gretton, Sarah; Jones, Katherine; Tallents, Lucy
2015-01-01
Twenty-seven percent of academics in UK Higher Education (HE) are in Teaching-Focussed positions, making major contributions to undergraduate programmes in an era of high student expectations when it comes to teaching quality. However, institutional support for Teaching-Focussed academics is often limited, both in terms of peer networking and opportunities for career development. As four early-career stage Teaching-Focussed academics working in a variety of institutions, we explore what motivated our choices to make teaching our primary academic activity, and the challenges that we have faced in doing so. In addition to highlighting the need for universities to fully recognise the achievements of teaching staff, we discuss the role that the various biosciences learned societies have in supporting Teaching-Focussed academics. We identify that there is a need for the learned societies to come together and pool their expertise in this area. The fragmented nature of the Teaching-Focussed academic community means that clear sources of national support are needed in order to best enable the next generation of bioscience educators to reach their full potential.
Yang, Yong
2017-11-01
Most health studies focus on one health outcome and examine the influence of one or multiple risk factors. However, in reality, various pathways, interactions, and associations exist not only between risk factors and health outcomes but also among the risk factors and among the health outcomes. The advance of system science methods, Big Data, and accumulated knowledge allows us to examine how multiple risk factors influence multiple health outcomes at multiple levels (termed a 3M study). Using the study of neighborhood environment and health as an example, I elaborate on the significance of 3M studies. 3M studies may lead to a significantly deeper understanding of the dynamic interactions among risk factors and outcomes and could help us design better interventions, which may be of particular relevance for upstream interventions. Agent-based modeling (ABM) is a promising method for 3M studies, although its potential is far from fully explored. Future challenges include gaps in epidemiologic knowledge and evidence, the lack of empirical data sources, and the technical challenges of ABM. © 2017 New York Academy of Sciences.
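To give a flavor of what an ABM for a 3M study looks like, the toy model below has two neighborhood risk factors (walkability, air quality) jointly influencing two outcomes (BMI, respiratory symptoms) per agent. Every parameter and rule is invented for illustration; a real model would be calibrated to empirical data:

```python
import random

random.seed(0)

class Agent:
    def __init__(self, walkability, air_quality):
        self.walkability = walkability      # 0..1, higher is better
        self.air_quality = air_quality      # 0..1, higher is better
        self.bmi = random.gauss(26, 2)
        self.symptoms = 0

    def step(self):
        # Physical activity depends on walkability; BMI drifts accordingly.
        if random.random() < self.walkability:
            self.bmi -= 0.02
        # Respiratory symptoms accumulate with poor air quality.
        if random.random() > self.air_quality:
            self.symptoms += 1

agents = [Agent(walkability=0.7, air_quality=0.4) for _ in range(1000)]
for _ in range(365):                        # one simulated year of daily steps
    for a in agents:
        a.step()
print(sum(a.bmi for a in agents) / 1000,    # mean BMI after one year
      sum(a.symptoms for a in agents) / 1000)  # mean symptom days
```

The appeal for 3M studies is that both outcomes respond to both exposures through individual-level dynamics, so intervention scenarios (say, raising walkability) can be run directly rather than inferred from single-outcome regressions.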
High performance incandescent lighting using a selective emitter and nanophotonic filters
NASA Astrophysics Data System (ADS)
Leroy, Arny; Bhatia, Bikram; Wilke, Kyle; Ilic, Ognjen; Soljačić, Marin; Wang, Evelyn N.
2017-09-01
Previous approaches for improving the efficiency of incandescent light bulbs (ILBs) have relied on tailoring the emitted spectrum using cold-side interference filters that reflect the infrared energy back to the emitter while transmitting the visible light. Although this approach has, in theory, the potential to surpass light-emitting diodes (LEDs) in luminous efficiency while conserving the excellent color rendering index (CRI) inherent to ILBs, challenges such as low view factor between the emitter and filter, high emitter (>2800 K) and filter temperatures, and emitter evaporation have significantly limited the maximum efficiency. In this work, we first analyze the effect of non-idealities in the cold-side filter, the emitter, and the view factor on the luminous efficiency. Second, we theoretically and experimentally demonstrate that the loss in efficiency associated with low view factors can be minimized by using a selective emitter (e.g., high emissivity in the visible and low emissivity in the infrared) with a filter. Finally, we discuss the challenges in achieving a high-performance, long-lasting incandescent light source, including the emitter and filter thermal stability as well as emitter evaporation.
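The leverage of the view factor can be seen in a back-of-envelope energy balance: infrared power returned to the emitter by the filter reduces the electrical power needed to hold its temperature. The sketch below integrates Planck's law over crude wavelength bands and ignores photopic weighting, emitter selectivity, and real filter edges, so the numbers are indicative only; the view factor VF and reflectance R values are assumptions:

```python
import numpy as np

h, c, kB = 6.626e-34, 2.998e8, 1.381e-23

def planck(lam, T):
    """Blackbody spectral radiance (up to a solid-angle factor)."""
    return (2 * h * c**2 / lam**5) / (np.exp(h * c / (lam * kB * T)) - 1.0)

def band_power(T, lo, hi, n=2000):
    """Integrate the Planck spectrum over one wavelength band."""
    lam = np.linspace(lo, hi, n)
    return np.trapz(planck(lam, T), lam)

T = 2800.0                                  # emitter temperature [K]
total = band_power(T, 0.2e-6, 20e-6)        # (approximately) all emitted power
vis = band_power(T, 0.4e-6, 0.7e-6)         # visible-band power
ir = total - vis

for VF, R in [(0.0, 0.0), (0.5, 0.95), (0.99, 0.95)]:
    supplied = total - VF * R * ir          # returned IR re-heats the emitter
    print(f"VF={VF:.2f}  efficiency={vis / supplied:.2%}")
```

At low view factor most of the reflected infrared misses the emitter and is wasted, which is exactly the loss the selective emitter is proposed to mitigate.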
Covalent Organic Frameworks: From Materials Design to Biomedical Application
Zhao, Fuli; Liu, Huiming; Mathe, Salva D. R.; Dong, Anjie
2017-01-01
Covalent organic frameworks (COFs) are newly emerged crystalline porous polymers with well-defined skeletons and nanopores, consisting mainly of lightweight elements (H, B, C, N and O) linked by dynamic covalent bonds. Compared with conventional materials, COFs possess some unique and attractive features, such as large surface area, pre-designable pore geometry, excellent crystallinity, inherent adaptability and high flexibility in structural and functional design, and thus exhibit great potential for various applications. In particular, their large surface area, tunable porosity and π conjugation with unique photoelectric properties enable COFs to serve as a promising platform for drug delivery, bioimaging, biosensing and theranostic applications. In this review, we trace the evolution of COFs in terms of linkages and highlight the important issues of synthetic method, structural design, morphological control and functionalization. We then summarize the recent advances of COFs in the biomedical and pharmaceutical sectors and conclude with a discussion of the challenges and opportunities of COFs for biomedical purposes. Although currently still at an infancy stage, COFs as an innovative source have paved a new way to meet future challenges in human healthcare and disease theranostics. PMID:29283423
Compilation of climate data from heterogeneous networks across the Hawaiian Islands
Longman, Ryan J.; Giambelluca, Thomas W.; Nullet, Michael A.; Frazier, Abby G.; Kodama, Kevin; Crausbay, Shelley D.; Krushelnycky, Paul D.; Cordell, Susan; Clark, Martyn P.; Newman, Andy J.; Arnold, Jeffrey R.
2018-01-01
Long-term, accurate observations of atmospheric phenomena are essential for a myriad of applications, including historic and future climate assessments, resource management, and infrastructure planning. In Hawai‘i, climate data are available from individual researchers, local, State, and Federal agencies, and from large electronic repositories such as the National Centers for Environmental Information (NCEI). Researchers attempting to make use of available data are faced with a series of challenges that include: (1) identifying potential data sources; (2) acquiring data; (3) establishing data quality assurance and quality control (QA/QC) protocols; and (4) implementing robust gap filling techniques. This paper addresses these challenges by providing: (1) a summary of the available climate data in Hawai‘i including a detailed description of the various meteorological observation networks and data accessibility, and (2) a quality controlled meteorological dataset across the Hawaiian Islands for the 25-year period 1990-2014. The dataset draws on observations from 471 climate stations and includes rainfall, maximum and minimum surface air temperature, relative humidity, wind speed, downward shortwave and longwave radiation data. PMID:29437162
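Step (4), gap filling, can be as simple as regressing a target station on a well-correlated neighbor over their overlap and predicting into the gaps. A minimal pandas sketch with invented station values follows; the published dataset relies on more robust, network-wide methods, so this illustrates the idea only:

```python
import numpy as np
import pandas as pd

dates = pd.date_range("2014-01-01", periods=10)
target = pd.Series([21.0, 20.5, np.nan, 22.1, np.nan, 23.0, 22.4, np.nan, 21.8, 22.0],
                   index=dates)                       # station with gaps [deg C]
ref = pd.Series([20.0, 19.6, 20.9, 21.2, 21.9, 22.1, 21.5, 21.0, 20.9, 21.1],
                index=dates)                          # nearby complete station

both = target.notna() & ref.notna()                   # days where both report
slope, intercept = np.polyfit(ref[both], target[both], 1)  # fit on the overlap
filled = target.fillna(intercept + slope * ref)       # predict into gaps only
print(filled.round(2).tolist())
```

The same pattern generalizes to multiple reference stations and to variables like rainfall, where ratio-based rather than linear relationships are typically preferred.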
Barriers to Women's Participation: Experiences of Volunteers and Community Healthcare Authorities.
Rezakhani Moghaddam, Hamed; Allahverdipour, Hamid; Matlabi, Hossein
2018-01-01
Along with health development in general terms, women's involvement in health programs can be effective in raising their self-confidence and promoting their health. This study was carried out to unveil the barriers to and challenges of the health volunteer program and to propose solutions for its promotion, using the experiences of active women participants and the authorities of the program. The study was carried out using a qualitative method with content analysis in the city of Tabriz, East Azerbaijan province, Iran. Data collection was conducted through semi-structured individual interviews and focus group discussions with the participation of 29 health volunteers and responsible authorities. The participants were selected using purposive sampling with maximum variation. Data analysis implemented conventional content analysis using MAXQDA. Barriers and challenges for health volunteers were categorized into four main themes: volunteers' and trainers' inadequate capabilities, inadequate acceptance of the volunteers, restrictive social norms, and organizational problems. It seems that interaction among the health system, the public, and health volunteers should be improved. Holding training programs about the activities of health volunteers at the society level leads to better utilization of community resources in health programs.
NASA Astrophysics Data System (ADS)
Kuhlmann, Andreas V.; Houel, Julien; Brunner, Daniel; Ludwig, Arne; Reuter, Dirk; Wieck, Andreas D.; Warburton, Richard J.
2013-07-01
Optically active quantum dots, for instance self-assembled InGaAs quantum dots, are potentially excellent single photon sources. The fidelity of the single photons is much improved using resonant rather than non-resonant excitation. With resonant excitation, the challenge is to distinguish between resonance fluorescence and scattered laser light. We have met this challenge by creating a polarization-based dark-field microscope to measure the resonance fluorescence from a single quantum dot at low temperature. We achieve a suppression of the scattered laser exceeding a factor of 10^7 and background-free detection of resonance fluorescence. The same optical setup operates over the entire quantum dot emission range (920-980 nm) and also in high magnetic fields. The major development is the outstanding long-term stability: once the dark-field point has been established, the microscope operates for days without alignment. The mechanical and optical designs of the microscope are presented, as well as exemplary resonance fluorescence spectroscopy results on individual quantum dots to underline the microscope's excellent performance.
Improving certified nurse aide retention. A long-term care management challenge.
Mesirow, K M; Klopp, A; Olson, L L
1998-03-01
In the long-term care industry, the turnover rate among nurse aides is extremely high. This adversely affects resident satisfaction, resident care, morale, and finances. It presents a challenge to long-term care administration. Refusing to accept high turnover as an impossible situation allows changes to be made. The authors describe how the staff at one intermediate care facility identified its problems, assessed the causes, and implemented corrective action.
McQuire, Cheryl; Hassiotis, Angela; Harrison, Bronwyn; Pilling, Stephen
2015-11-26
Psychotropic medications are frequently used to treat challenging behaviour in children with intellectual disabilities, despite a lack of evidence for their efficacy. This systematic review and meta-analysis aimed to determine the safety and efficacy of pharmacological interventions for challenging behaviour among children with intellectual disabilities. Electronic databases were searched and supplemented with a hand search of reference lists and trial registries. Randomised controlled trials of pharmacological interventions for challenging behaviour among children with intellectual disabilities were included. Data were analysed using meta-analysis or described narratively if meta-analysis was not possible. For quality assessment, the Cochrane Risk of Bias tool and the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach were used. Fourteen studies including 912 participants met inclusion criteria. Antipsychotic medication reduced challenging behaviour among children with intellectual disabilities in the short term (SMD = -1.09, p < 0.001 for risperidone; SMD = -0.64, p < 0.001 for aripiprazole). However, there were significant side effects, including elevated prolactin levels (SMD = 3.22, p < 0.001) and weight gain (SMD = 0.82, p < 0.001). Evidence was inconclusive regarding the effectiveness of anticonvulsants and antioxidants for reducing challenging behaviour. The quality of all evidence was low and there were no long-term follow-up studies. Antipsychotic medications appear to be effective for reducing challenging behaviour in the short term among children with intellectual disabilities, but they carry a risk of significant side effects. Findings from this review must be interpreted with caution as studies were typically of low quality and most outcomes were based on a small number of studies. Further long-term, high-quality research is needed to determine the effectiveness and safety of psychotropic medication for reducing challenging behaviour.
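The pooled SMDs quoted above come from inverse-variance weighting of per-study effects. A sketch of the fixed-effect arithmetic follows, with invented study values; the review's actual data and any random-effects choices are not reproduced here:

```python
import numpy as np

smd = np.array([-1.2, -0.9, -1.1])   # per-study standardized mean differences (invented)
var = np.array([0.10, 0.08, 0.12])   # per-study variances of the SMDs (invented)

w = 1.0 / var                        # inverse-variance weights
pooled = np.sum(w * smd) / np.sum(w) # fixed-effect pooled SMD
se = np.sqrt(1.0 / np.sum(w))        # standard error of the pooled estimate
ci = (pooled - 1.96 * se, pooled + 1.96 * se)
print(round(pooled, 2), [round(v, 2) for v in ci])
```

Precise studies dominate the pooled estimate by design; a random-effects model would widen the interval when between-study heterogeneity is present.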
NASA Astrophysics Data System (ADS)
Gao, M.; Huang, S. T.; Wang, P.; Zhao, Y. A.; Wang, H. B.
2016-11-01
The geological disposal of high-level radioactive waste (hereinafter "geological disposal") is a long-term, complex, and systematic scientific project. The data and information resources produced during its research and development (R&D) provide significant support for the R&D of the geological disposal system and lay a foundation for the long-term stability and safety assessment of the repository site. However, the research and engineering data involved in siting the geological disposal repository are complicated (multi-source, multi-dimensional and changeable), and the requirements for data accuracy and comprehensive application have become much higher than before, so the data model design of a geo-information database for the disposal repository faces serious challenges. In this paper, the data resources of the pre-selected areas of the repository are comprehensively surveyed and systematically analyzed. Based on a deep understanding of the application requirements, this work develops solutions to the key technical problems, including a reasonable classification system for multi-source data entities, complex logical relations, and effective physical storage structures. The new solution breaks away from the data classification and conventional spatial data organization models applied in the traditional industry, and realizes data organization and integration at the level of data entities and spatial relationships that are independent, complete, and significant for applications in HLW geological disposal. Reasonable, feasible and flexible conceptual, logical and physical data models have been established so as to ensure the effective integration, and facilitate the application development, of multi-source data in pre-selected areas for geological disposal.
Tracers of the Extraterrestrial Component in Sediments and Inferences for Earth's Accretion History
NASA Technical Reports Server (NTRS)
Kyte, Frank T.
2003-01-01
The study of extraterrestrial matter in sediments began with the discovery of cosmic spherules during the HMS Challenger Expedition (1873-1876), but has evolved into a multidisciplinary study of the chemical, physical, and isotopic properties of sediments. Extraterrestrial matter in sediments comes mainly from dust and large impactors from the asteroid belt and comets. What we know of the nature of these source materials comes from the study of stratospheric dust particles, cosmic spherules, micrometeorites, meteorites, and astronomical observations. The most common chemical tracers of extraterrestrial matter in sediments are the siderophile elements, most commonly iridium and other platinum group elements. Physical tracers include cosmic and impact spherules, Ni-rich spinels, meteorites, fossil meteorites, and ocean-impact melt debris. Three types of isotopic systems have been used to trace extraterrestrial matter. Osmium isotopes cannot distinguish chondritic from mantle sources, but provide a useful tool in modeling long-term accretion rates. Helium isotopes can be used to trace the long-term flux of the fine fraction of the interplanetary dust complex. Chromium isotopes can provide unequivocal evidence of an extraterrestrial source for sediments with high concentrations of meteoritic Cr. The terrestrial history of impacts, as recorded in sediments, is still poorly understood. Helium isotopes, multiple Ir anomalies, spherule beds, and craters all indicate a comet shower in the late Eocene. The Cretaceous-Tertiary boundary impact event appears to have been caused by a single carbonaceous chondrite projectile, most likely of asteroid origin. Little is known of the impact record in sediments from the rest of the Phanerozoic. Several impact deposits are known in the Precambrian, including several possible mega-impacts in the Early Archean.
NASA Astrophysics Data System (ADS)
Crochet, M. W.; Gonthier, K. A.
2013-12-01
Systems of hyperbolic partial differential equations are frequently used to model the flow of multiphase mixtures. These equations often contain sources, referred to as nozzling terms, that cannot be posed in divergence form, and have proven to be particularly challenging in the development of finite-volume methods. Upwind schemes have recently shown promise in properly resolving the steady wave solution of the associated multiphase Riemann problem. However, these methods require a full characteristic decomposition of the system eigenstructure, which may be either unavailable or computationally expensive. Central schemes, such as the Kurganov-Tadmor (KT) family of methods, require minimal characteristic information, which makes them easily applicable to systems with an arbitrary number of phases. However, the proper implementation of nozzling terms in these schemes has been mathematically ambiguous. The primary objectives of this work are twofold: first, an extension of the KT family of schemes is proposed that formally accounts for the nonconservative nozzling sources. This modification results in a semidiscrete form that retains the simplicity of its predecessor and introduces little additional computational expense. Second, this modified method is applied to multiple, but equivalent, forms of the multiphase equations to perform a numerical study by solving several one-dimensional test problems. Both ideal and Mie-Grüneisen equations of state are used, with the results compared to an analytical solution. This study demonstrates that the magnitudes of the resulting numerical errors are sensitive to the form of the equations considered, and suggests an optimal form to minimize these errors. Finally, a separate modification of the wave propagation speeds used in the KT family is also suggested that can reduce the extent of numerical diffusion in multiphase flows.
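To make the structure of such schemes concrete, the sketch below implements the baseline semi-discrete Kurganov-Tadmor update for a scalar law u_t + f(u)_x = s(u) with minmod-limited reconstruction, applied to Burgers' equation with a simple relaxation source. It is only an illustration of the KT machinery the paper extends, not the paper's method: the flux, source, grid and CFL number are invented, and the nonconservative nozzling treatment is not reproduced here.

```python
import numpy as np

def minmod(a, b):
    """Minmod slope limiter used for second-order KT reconstruction."""
    return np.where(a * b > 0.0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def kt_rhs(u, dx, flux, dflux, source):
    """Semi-discrete Kurganov-Tadmor right-hand side for u_t + f(u)_x = s(u), periodic BCs."""
    up, um = np.roll(u, -1), np.roll(u, 1)
    slope = minmod(u - um, up - u)
    uL = u + 0.5 * slope                    # state left of interface j+1/2 (from cell j)
    uR = np.roll(u - 0.5 * slope, -1)       # state right of interface j+1/2 (from cell j+1)
    a = np.maximum(np.abs(dflux(uL)), np.abs(dflux(uR)))   # local wave speed
    H = 0.5 * (flux(uL) + flux(uR)) - 0.5 * a * (uR - uL)  # central flux at j+1/2
    return -(H - np.roll(H, 1)) / dx + source(u)

N = 200
dx = 1.0 / N
x = (np.arange(N) + 0.5) * dx
u = np.sin(2 * np.pi * x) + 0.5
f = lambda v: 0.5 * v**2          # Burgers flux (illustrative)
df = lambda v: v
s = lambda v: -0.1 * v            # placeholder pointwise source, not a nozzling term

t, dt = 0.0, 0.4 * dx             # assumed CFL-safe step
while t < 0.2:
    k1 = kt_rhs(u, dx, f, df, s)  # Heun (second-order) time integration
    k2 = kt_rhs(u + dt * k1, dx, f, df, s)
    u += 0.5 * dt * (k1 + k2)
    t += dt
```

In the paper's extension, the nozzling sources would enter this right-hand side through a dedicated nonconservative discretization rather than the pointwise s(u) used above.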
NASA Astrophysics Data System (ADS)
Bonhoff, H. A.; Petersson, B. A. T.
2010-08-01
For the characterization of structure-borne sound sources with multi-point or continuous interfaces, substantial simplifications and physical insight can be obtained by incorporating the concept of interface mobilities. The applicability of interface mobilities, however, relies upon the admissibility of neglecting the so-called cross-order terms. Hence, the objective of the present paper is to clarify the importance and significance of cross-order terms for the characterization of vibrational sources. From previous studies, four conditions have been identified for which the cross-order terms can become more influential. Such are non-circular interface geometries, structures with distinctively differing transfer paths as well as a suppression of the zero-order motion and cases where the contact forces are either in phase or out of phase. In a theoretical study, the former four conditions are investigated regarding the frequency range and magnitude of a possible strengthening of the cross-order terms. For an experimental analysis, two source-receiver installations are selected, suitably designed to obtain strong cross-order terms. The transmitted power and the source descriptors are predicted by the approximations of the interface mobility approach and compared with the complete calculations. Neglecting the cross-order terms can result in large misinterpretations at certain frequencies. On average, however, the cross-order terms are found to be insignificant and can be neglected with good approximation. The general applicability of interface mobilities for structure-borne sound source characterization and the description of the transmission process thereby is confirmed.
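As a schematic of the order decomposition at issue here, the sketch below maps a point-mobility matrix for N equally spaced contact points into the order domain with a unitary DFT; the diagonal then corresponds to the equal-order interface mobilities and the off-diagonal entries to the cross-order terms. The random matrix and the sign and normalization conventions are assumptions for illustration, not the paper's formulation.

```python
import numpy as np

# Hypothetical N-point interface: Y[m, n] is the (complex) transfer mobility
# between contact points m and n at a single frequency.
N = 8
rng = np.random.default_rng(0)
Y = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
Y = 0.5 * (Y + Y.T)  # enforce reciprocity (symmetric mobility matrix)

# A unitary DFT over the contact points stands in for the spatial Fourier
# series around the interface; conventions here are illustrative.
F = np.exp(-2j * np.pi * np.outer(np.arange(N), np.arange(N)) / N) / np.sqrt(N)
Y_order = F.conj().T @ Y @ F

Y_equal = np.diag(np.diag(Y_order))  # equal-order terms (p == q)
cross_fraction = np.linalg.norm(Y_order - Y_equal) / np.linalg.norm(Y_order)
print(f"relative Frobenius weight of cross-order terms: {cross_fraction:.2f}")
```

For a structure where this fraction stays small across the frequency range, the equal-order approximation examined in the paper would be expected to hold.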
The Denver Aerosol Sources and Health (DASH) Study: Overview and Early Findings
Vedal, S.; Hannigan, M.P.; Dutton, S.J.; Miller, S. L.; Milford, J.B.; Rabinovitch, N.; Kim, S.-Y.; Sheppard, L.
2012-01-01
Improved understanding of the sources of air pollution that are most harmful could aid in developing more effective measures for protecting human health. The Denver Aerosol Sources and Health (DASH) study was designed to identify the sources of ambient fine particulate matter (PM2.5) that are most responsible for the adverse health effects of short-term exposure to PM2.5. Daily 24-hour PM2.5 sampling began in July 2002 at a residential monitoring site in Denver, Colorado, using both Teflon and quartz filter samplers. Sampling is planned to continue through 2008. Chemical speciation is being carried out for mass, inorganic ionic compounds (sulfate, nitrate and ammonium), and carbonaceous components, including elemental carbon, organic carbon, temperature-resolved organic carbon fractions and a large array of organic compounds. In addition, water-soluble metals were measured daily for 12 months in 2003. A receptor-based source apportionment approach utilizing positive matrix factorization (PMF) will be used to identify PM2.5 source contributions for each 24-hour period. Based on a preliminary assessment using synthetic data, the proposed source apportionment should be able to identify many important sources on a daily basis, including secondary ammonium nitrate and ammonium sulfate, diesel vehicle exhaust, road dust, wood combustion and vegetative debris. Meat cooking, gasoline vehicle exhaust and natural gas combustion were more challenging for PMF to accurately identify due to high detection limits for certain organic molecular marker compounds. Measurements of these compounds are being improved and supplemented with additional organic molecular marker compounds. The health study will investigate associations between daily source contributions and an array of health endpoints, including daily mortality and hospitalizations and measures of asthma control in asthmatic children. Findings from the DASH study, in addition to being of interest to policymakers, may, by identifying harmful PM2.5 sources, provide insights into mechanisms of PM effect. PMID:22723735
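As a rough illustration of receptor-based apportionment of this kind, the sketch below factors a synthetic days-by-species concentration matrix into non-negative daily source contributions and source profiles. True PMF (e.g., EPA PMF) minimizes an uncertainty-weighted residual, which the plain scikit-learn NMF used here does not; the data, factor count and parameters are invented for illustration.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
n_days, n_species, n_sources = 365, 20, 4
true_profiles = rng.gamma(2.0, 1.0, size=(n_sources, n_species))  # source fingerprints
true_contrib = rng.gamma(1.5, 1.0, size=(n_days, n_sources))      # daily source strengths
noise = rng.normal(0.0, 0.05, size=(n_days, n_species)).clip(0)   # keep matrix non-negative
X = true_contrib @ true_profiles + noise                          # synthetic speciated PM2.5

model = NMF(n_components=n_sources, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)   # daily contributions, shape (n_days, n_sources)
F = model.components_        # factor profiles, shape (n_sources, n_species)
print("reconstruction error:", round(model.reconstruction_err_, 3))
```

In a health study like DASH, the columns of G would then serve as daily exposure series to be related to mortality or asthma outcomes.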
Source apportionment of airborne particulates through receptor modeling: Indian scenario
NASA Astrophysics Data System (ADS)
Banerjee, Tirthankar; Murari, Vishnu; Kumar, Manish; Raju, M. P.
2015-10-01
Airborne particulate chemistry is mostly governed by the associated sources, and apportionment of specific sources is essential for delineating explicit control strategies. The present submission initially deals with publications (1980s-2010s) of Indian origin which report regional heterogeneities of particulate concentrations with reference to associated species. Such meta-analyses clearly indicate the presence of reservoirs of both primary and secondary aerosols in different geographical regions. Further, the identification of specific signatory molecules for individual source categories was also evaluated in terms of scientific merit and repeatability. Source signatures mostly resemble international profiles, while in selected cases they lack appropriateness. In India, source apportionment (SA) of airborne particulates was initiated as far back as 1985 through factor analysis; however, principal component analysis (PCA) accounts for the major share of applications (34%), followed by enrichment factor (EF, 27%), chemical mass balance (CMB, 15%) and positive matrix factorization (PMF, 9%). Mainstream SA analyses identify earth crust and road dust resuspensions (traced by Al, Ca, Fe, Na and Mg) as a principal source (6-73%), followed by vehicular emissions (traced by Fe, Cu, Pb, Cr, Ni, Mn, Ba and Zn; 5-65%), industrial emissions (traced by Co, Cr, Zn, V, Ni, Mn, Cd; 0-60%), fuel combustion (traced by K, NH4+, SO42-, As, Te, S, Mn; 4-42%), marine aerosols (traced by Na, Mg, K; 0-15%) and biomass/refuse burning (traced by Cd, V, K, Cr, As, TC, Na, K, NH4+, NO3-, OC; 1-42%). In most cases, temporal variations of individual source contributions for a specific geographic region exhibit radical heterogeneity, possibly due to the unscientific assignment of individual tracers to specific sources, exacerbated by methodological weaknesses, inappropriate sample sizes, the implications of secondary aerosols and inadequate emission inventories. Conclusively, a number of challenging issues and specific recommendations have been included which need to be considered for a scientific apportionment of particulate sources in different geographical regions of India.
Sulaiman-Hill, Cheryl M R; Thompson, Sandra C
2012-04-01
To examine the resettlement experiences and provide data on well-being and psychological distress for Afghan and Kurdish refugees settled between eight and 20 years in New Zealand and Australia. Participants completed the Kessler-10 Psychological Distress Scale (K10) and the Personal Well-Being Index (PWI) for subjective well-being. A mixed methods approach was used, with participants also discussing resettlement difficulties, quality of life (QOL) and sources of stress during interviews. Data from 81 Muslim participants are reported; all spoke English, were generally well educated with 88% having secondary or tertiary level education, and the majority of those resettled before 2001 lived in Perth. Although psychological distress levels were mostly within the low-moderate risk range, significant differences were observed by gender and employment status. Participants identified a range of ongoing stressors, with unemployment of particular concern. Social isolation and a sense that they would never really 'fit in' were also reported by some. Participants particularly valued the safety and improved quality of life in their host communities. Despite their appreciation of the overall resettlement experience, too much time to introspect, separation from family, status dissonance and still occasionally feeling overwhelmed by resettlement challenges remain a long-term reality for some former refugees. Former refugees continue to struggle with unemployment, possible discrimination and loss of status long-term. © 2012 The Authors. ANZJPH © 2012 Public Health Association of Australia.
Beam Stability R&D for the APS MBA Upgrade
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sereno, Nicholas S.; Arnold, Ned D.; Bui, Hanh D.
2015-01-01
Beam diagnostics required for the APS multi-bend achromat (MBA) upgrade are driven by ambitious beam stability requirements. The major AC stability challenge is to correct rms beam motion to 10% of the rms beam size at the insertion device source points from 0.01 to 1000 Hz. The vertical plane represents the biggest challenge for AC stability, which is required to be 400 nm rms for a 4-micron vertical beam size. In addition to AC stability, long-term drift over a period of seven days is required to be 1 micron or less. Major diagnostics R&D components include improved rf beam position processing using commercially available FPGA-based BPM processors, new X-ray beam position monitors based on hard X-ray fluorescence from copper and Compton scattering off diamond, mechanical motion sensing to detect and correct long-term vacuum chamber drift, a new feedback system featuring a tenfold increase in sampling rate, and a several-fold increase in the number of fast correctors and BPMs in the feedback algorithm. Feedback system development represents a major effort, and we are pursuing development of a novel algorithm that integrates orbit correction for both slow and fast correctors down to DC simultaneously. Finally, a new data acquisition system (DAQ) is being developed to simultaneously acquire streaming data from all diagnostics as well as the feedback processors for commissioning and fault diagnosis. Results of studies and the design effort are reported.
Plasma Physics Challenges of MM-to-THz and High Power Microwave Generation
NASA Astrophysics Data System (ADS)
Booske, John
2007-11-01
Homeland security and military defense technology considerations have stimulated intense interest in mobile, high power sources of millimeter-wave to terahertz regime electromagnetic radiation, from 0.1 to 10 THz. While sources at the low frequency end, i.e., the gyrotron, have been deployed or are being tested for diverse applications such as WARLOC radar and active denial systems, the challenges for higher frequency sources have yet to be completely met for applications including noninvasive sensing of concealed weapons and dangerous agents, high-data-rate communications, and high resolution spectroscopy and atmospheric sensing. The compact size requirements for many of these high frequency sources require minuscule, micro-fabricated slow wave circuits with high rf ohmic losses. This necessitates electron beams with not only very small transverse dimensions but also very high current density for adequate gain. Thus, the emerging family of mm-to-THz e-beam-driven vacuum electronics devices shares many of the same plasma physics challenges that currently confront 'classic' high power microwave (HPM) generators [1], including bright electron sources, intense beam transport, energetic electron interaction with surfaces, and rf air breakdown at output windows. Multidimensional theoretical and computational models are especially important for understanding and addressing these challenges. The contemporary plasma physics issues, recent achievements, as well as the opportunities and outlook for THz and HPM will be addressed. [1] R.J. Barker, J.H. Booske, N.C. Luhmann, and G.S. Nusinovich, Modern Microwave and Millimeter-Wave Power Electronics (IEEE/Wiley, 2005).
Multi-field query expansion is effective for biomedical dataset retrieval.
Bouadjenek, Mohamed Reda; Verspoor, Karin
2017-01-01
In the context of the bioCADDIE challenge addressing information retrieval of biomedical datasets, we propose a method for retrieval of biomedical data sets with heterogeneous schemas through query reformulation. In particular, the proposed method transforms the initial query into a multi-field query that is then enriched with terms that are likely to occur in the relevant datasets. We compare and evaluate two query expansion strategies, one based on the Rocchio method and another based on a biomedical lexicon. We then perform a comprehensive comparative evaluation of our method on the bioCADDIE dataset collection for biomedical retrieval. We demonstrate the effectiveness of our multi-field query method compared to two baselines, with MAP improved from 0.2171 and 0.2669 to 0.2996. We also show the benefits of query expansion, where the Rocchio expansion method improves the MAP for our two baselines from 0.2171 and 0.2669 to 0.335. We show that the Rocchio query expansion method slightly outperforms the one based on the biomedical lexicon as a source of terms, with an improvement of roughly 3% for MAP. However, the query expansion method based on the biomedical lexicon is much less resource intensive, since it does not require computation of any relevance feedback set or any initial execution of the query. Hence, in terms of the trade-off between efficiency, execution time and retrieval accuracy, we argue that the query expansion method based on the biomedical lexicon offers the best performance for a prototype biomedical data search engine intended to be used at a large scale. In the official bioCADDIE challenge results, although our approach is ranked seventh in terms of the infNDCG evaluation metric, it ranks second in terms of P@10 and NDCG. Hence, the method proposed here provides overall good retrieval performance in relation to the approaches of other competitors. Consequently, the observations made in this paper should benefit the development of a Data Discovery Index prototype or the improvement of the existing one. © The Author(s) 2017. Published by Oxford University Press.
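A minimal sketch of Rocchio-style pseudo-relevance-feedback expansion of the kind evaluated above, assuming tokenized feedback documents; the function name rocchio_expand and the weighting constants are illustrative, not the authors' implementation, which additionally reformulates the result into a multi-field query.

```python
from collections import Counter

def rocchio_expand(query_terms, feedback_docs, alpha=1.0, beta=0.75, top_n=10):
    """Expand a query with terms from (pseudo-)relevant documents, Rocchio style."""
    weights = Counter({t: alpha for t in query_terms})   # original terms, weight alpha
    for doc in feedback_docs:
        tf = Counter(doc)
        total = sum(tf.values())
        for term, count in tf.items():                   # add beta-weighted centroid of docs
            weights[term] += (beta / len(feedback_docs)) * (count / total)
    return [t for t, _ in weights.most_common(len(query_terms) + top_n)]

docs = [["renal", "cell", "carcinoma", "expression", "dataset"],
        ["gene", "expression", "renal", "tumor", "dataset"]]
print(rocchio_expand(["renal", "carcinoma"], docs, top_n=3))
```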
Henry, S B; Holzemer, W L; Reilly, C A; Campbell, K E
1994-01-01
OBJECTIVE: To analyze the terms used by nurses in a variety of data sources and to test the feasibility of using SNOMED III to represent nursing terms. DESIGN: Prospective research design with manual matching of terms to the SNOMED III vocabulary. MEASUREMENTS: The terms used by nurses to describe patient problems during 485 episodes of care for 201 patients hospitalized for Pneumocystis carinii pneumonia were identified. Problems from four data sources (nurse interview, intershift report, nursing care plan, and nurse progress note/flowsheet) were classified based on the substantive area of the problem and on the terminology used to describe the problem. A test subset of the 25 most frequently used terms from the two written data sources (nursing care plan and nurse progress note/flowsheet) were manually matched to SNOMED III terms to test the feasibility of using that existing vocabulary to represent nursing terms. RESULTS: Nurses most frequently described patient problems as signs/symptoms in the verbal nurse interview and intershift report. In the written data sources, problems were recorded as North American Nursing Diagnosis Association (NANDA) terms and signs/symptoms with similar frequencies. Of the nursing terms in the test subset, 69% were represented using one or more SNOMED III terms. PMID:7719788
A Semi-implicit Treatment of Porous Media in Steady-State CFD.
Domaingo, Andreas; Langmayr, Daniel; Somogyi, Bence; Almbauer, Raimund
There are many situations in computational fluid dynamics which require the definition of source terms in the Navier-Stokes equations. These source terms not only allow modelling of the physics of interest but also have a strong impact on the reliability, stability, and convergence of the numerics involved. Therefore, sophisticated numerical approaches exist for the description of such source terms. In this paper, we focus on the source terms present in the Navier-Stokes or Euler equations due to porous media, in particular the Darcy-Forchheimer equation. We introduce a method for the numerical treatment of the source term which is independent of the spatial discretization and based on linearization. In this description, the source term is treated in a fully implicit way, whereas the other flow variables can be computed in an implicit or explicit manner. This leads to a more robust description in comparison with a fully explicit approach. The method is well suited to be combined with coarse-grid CFD on Cartesian grids, which makes it especially favorable for accelerated solution of coupled 1D-3D problems. To demonstrate the applicability and robustness of the proposed method, a proof-of-concept example in 1D, as well as more complex examples in 2D and 3D, is presented.
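The sketch below illustrates the kind of linearized, fully implicit source treatment described, applied to a Darcy-Forchheimer sink in a toy uniform 1D momentum update; the fluid and porous-medium coefficients, the fixed pressure gradient and the time step are assumed values, and the actual method is formulated for general finite-volume steady-state solvers rather than this scalar model.

```python
import numpy as np

# Momentum balance: rho du/dt = -dp/dx - (mu/K) u - (rho*cF/sqrt(K)) |u| u.
# The sink S(u) is linearized about the previous iterate:
#   S(u_new) ~ -(mu/K + rho*cF/sqrt(K) * |u_old|) * u_new,
# so the source is implicit while the rest of the update stays explicit.
mu, rho = 1.8e-5, 1.2           # assumed fluid viscosity [Pa s] and density [kg/m^3]
K, cF = 1e-8, 0.55              # assumed permeability [m^2] and Forchheimer coefficient
dpdx = -50.0                    # fixed driving pressure gradient [Pa/m]
dt, nsteps = 1e-4, 200

u = np.full(50, 2.0)            # initial velocity field [m/s]
for _ in range(nsteps):
    a = mu / K + rho * cF / np.sqrt(K) * np.abs(u)   # linearized sink coefficient
    u = (u + dt * (-dpdx / rho)) / (1.0 + dt * a / rho)
```

Because the denominator grows with the sink strength, the update stays bounded even for stiff porous resistances where a fully explicit treatment would demand a prohibitively small time step.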
Sharma, Ashok K; Cook, Stephen; Tjandraatmadja, Grace; Gregory, Alan
2012-01-01
Water sensitive urban developments are designed with integrated urban water management concepts and water sensitive urban design measures. The initiatives that may be included are the substitution of imported drinking water with alternative sources using a fit-for-purpose approach, and structural and non-structural measures for the source control of stormwater. A water sensitive approach to urban development can help in achieving sustainability objectives by minimising disturbance to ecological and hydrological processes, and can also relieve stress on conventional water systems. Water sensitive urban developments remain novel in comparison with conventional approaches, so understanding and knowledge of these systems with regard to their planning, design, implementation, operation and maintenance, and health and environmental impacts are still developing; the mainstream uptake of these approaches therefore faces many challenges. A study has been conducted to understand these challenges through a detailed literature review, investigation of a large number of local greenfield and infill developments, and extensive consultation with water professionals. This research has identified the social, economic, political, institutional and technological challenges faced in implementing water sensitive urban design in greenfield and infill developments. The research found in particular that there is a need for long-term monitoring studies of water sensitive urban developments. This monitoring is important to validate the performance of novel approaches and to improve associated guidelines, standards, and regulatory and governance frameworks, which can lead to mainstream acceptance of water sensitive urban development approaches. The dissemination of this research will help generate awareness among water professionals, water utilities, developers, planners and regulators of the research challenges to be addressed in order to achieve more mainstream acceptance of water sensitive approaches to urban development. This study is based on existing water sensitive urban developments in Australia; however, the methodology adopted in investigating impediments to the uptake of these developments can be applied globally. It is hoped that insights from this study will benefit water professionals in other countries where there is also a move towards water sensitive urban development.
ERIC Educational Resources Information Center
Roman, Harry T.
2011-01-01
Street intersections are a source of accidents--for both automobiles and pedestrians. This article presents an intersection challenge that allows students to explore some possible ways to change the traditional intersection. In this challenge, teachers open up the boundaries and allow students to redesign their world. The first step is to help…
NASA Astrophysics Data System (ADS)
Winiarek, Victor; Bocquet, Marc; Duhanyan, Nora; Roustan, Yelva; Saunier, Olivier; Mathieu, Anne
2014-01-01
Inverse modelling techniques can be used to estimate the amount of radionuclides and the temporal profile of the source term released into the atmosphere during the accident of the Fukushima Daiichi nuclear power plant in March 2011. In Winiarek et al. (2012b), the lower bounds of the caesium-137 and iodine-131 source terms were estimated with such techniques, using activity concentration measurements. The importance of an objective assessment of prior errors (the observation errors and the background errors) was emphasised for a reliable inversion. In such a critical context, where the meteorological conditions can make the source term partly unobservable and where only a few observations are available, such prior estimation techniques are mandatory, as the retrieved source term is very sensitive to this estimation. We propose to extend the use of these techniques to the estimation of prior errors when assimilating observations from several data sets. The aim is to compute an estimate of the caesium-137 source term jointly using all available data about this radionuclide, such as activity concentrations in the air, but also daily fallout measurements and total cumulated fallout measurements. It is crucial to properly and simultaneously estimate the background errors and the prior errors relative to each data set. A proper estimation of prior errors is also a necessary condition to reliably estimate the a posteriori uncertainty of the estimated source term. Using such techniques, we retrieve a total released quantity of caesium-137 in the interval 11.6-19.3 PBq with an estimated standard deviation range of 15-20% depending on the method and the data sets. The 'blind' time intervals of the source term have also been strongly mitigated compared to the first estimations with only activity concentration data.
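As a schematic of such a linear source-term inversion, the sketch below retrieves a nonnegative vector of release rates from synthetic observations by folding assumed observation and background error magnitudes into a stacked, regularized least-squares problem. Everything here is invented for illustration, including the source-receptor matrix; the actual study estimates the prior error statistics themselves rather than assuming them.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)
n_obs, n_src = 40, 24                       # e.g. 24 hourly release rates to retrieve
# Sparse random stand-in for the source-receptor matrix H of a transport model.
H = rng.gamma(1.0, 1.0, size=(n_obs, n_src)) * (rng.random((n_obs, n_src)) < 0.3)
s_true = np.zeros(n_src)
s_true[6:12] = [5.0, 20.0, 40.0, 25.0, 10.0, 2.0]   # hypothetical release pulse
r, b = 0.5, 10.0                            # assumed observation / background error std devs
y = H @ s_true + rng.normal(0.0, r, n_obs)  # synthetic activity measurements

# Minimize |(y - H s)/r|^2 + |s/b|^2 subject to s >= 0 via a stacked system.
A = np.vstack([H / r, np.eye(n_src) / b])
rhs = np.concatenate([y / r, np.zeros(n_src)])
s_hat, _ = nnls(A, rhs)
print("retrieved total release:", round(s_hat.sum(), 1), "true:", s_true.sum())
```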
Material challenges for transducer designers in the 21st century
NASA Astrophysics Data System (ADS)
Lindberg, Jan F.
2002-07-01
The modern U.S. Navy is rapidly evolving to meet the challenges of operating in the littorals. This focus changes the rules, especially for the designers of sonar systems that now need to aggressively engage quiet diesel-electric submarine threats and neutralize sophisticated underwater mines. These new responsibilities dictate that new concepts be developed. To meet these new demands on the sonar system, transducer designers are being tasked to design transducers and to utilize new materials to address performance requirements that were never even imagined a decade ago. Sensor needs are no longer limited to pressure types but now have to sense velocity or acceleration. Sources are challenged in both frequency extent and power level. The need to physically move sources off submarines and surface combatants and onto vehicles with limited energy capabilities prompts the challenge of efficient bandwidth and high coupling. These are the needs of the 'next Navy'; the needs of the 'Navy after next' will present an even more demanding scenario. The future will demand revolutionary technology at the micro level, with devices utilizing new power sources and new materials.
The challenges of studying visual expertise in medical image diagnosis.
Gegenfurtner, Andreas; Kok, Ellen; van Geel, Koos; de Bruin, Anique; Jarodzka, Halszka; Szulewski, Adam; van Merriënboer, Jeroen Jg
2017-01-01
Visual expertise is the superior visual skill shown when executing domain-specific visual tasks. Understanding visual expertise is important in order to understand how the interpretation of medical images may be best learned and taught. In the context of this article, we focus on the visual skill of medical image diagnosis and, more specifically, on the methodological set-ups routinely used in visual expertise research. We offer a critique of commonly used methods and propose three challenges for future research to open up new avenues for studying characteristics of visual expertise in medical image diagnosis. The first challenge addresses theory development. Novel prospects in modelling visual expertise can emerge when we reflect on cognitive and socio-cultural epistemologies in visual expertise research, when we engage in statistical validations of existing theoretical assumptions and when we include social and socio-cultural processes in expertise development. The second challenge addresses the recording and analysis of longitudinal data. If we assume that the development of expertise is a long-term phenomenon, then it follows that future research can engage in advanced statistical modelling of longitudinal expertise data that extends the routine use of cross-sectional material through, for example, animations and dynamic visualisations of developmental data. The third challenge addresses the combination of methods. Alternatives to current practices can integrate qualitative and quantitative approaches in mixed-method designs, embrace relevant yet underused data sources and understand the need for multidisciplinary research teams. Embracing alternative epistemological and methodological approaches for studying visual expertise can lead to a more balanced and robust future for understanding superior visual skills in medical image diagnosis as well as other medical fields. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.
Patel, Vainav; Jalah, Rashmi; Kulkarni, Viraj; Valentin, Antonio; Rosati, Margherita; Alicea, Candido; von Gegerfelt, Agneta; Huang, Wensheng; Guan, Yongjun; Keele, Brandon F; Bess, Julian W; Piatak, Michael; Lifson, Jeffrey D; Williams, William T; Shen, Xiaoying; Tomaras, Georgia D; Amara, Rama R; Robinson, Harriet L; Johnson, Welkin; Broderick, Kate E; Sardesai, Niranjan Y; Venzon, David J; Hirsch, Vanessa M; Felber, Barbara K; Pavlakis, George N
2013-02-19
We have previously shown that macaques vaccinated with DNA vectors expressing SIVmac239 antigens developed potent immune responses able to reduce viremia upon high-dose SIVmac251 challenge. To further improve vaccine-induced immunity and protection, we combined the SIVmac239 DNA vaccine with protein immunization using inactivated SIVmac239 viral particles as the protein source. Twenty-six weeks after the last vaccination, the animals were challenged intrarectally at weekly intervals with a titrated dose of the heterologous SIVsmE660. Two of the DNA-protein coimmunized macaques did not become infected after 14 challenges, but all controls were infected by 11 challenges. Vaccinated macaques showed modest protection from SIVsmE660 acquisition compared with naïve controls (P = 0.050; stratified for TRIM5α genotype). Vaccinees had significantly lower peak (1.6 log, P = 0.0048) and chronic phase viremia (P = 0.044), with 73% of the vaccinees suppressing viral replication to levels below assay detection during the 40-wk follow-up. Vaccine-induced immune responses associated significantly with virus control: binding antibody titers and the presence of rectal IgG to SIVsmE660 Env correlated with delayed SIVsmE660 acquisition; SIV-specific cytotoxic T cells, prechallenge CD4(+) effector memory, and postchallenge CD8(+) transitional memory cells correlated with control of viremia. Thus, SIVmac239 DNA and protein-based vaccine protocols were able to achieve high, persistent, broad, and effective cellular and humoral immune responses able to delay heterologous SIVsmE660 infection and to provide long-term control of viremia. These studies support a role for DNA and protein-based vaccines in the development of an efficacious HIV/AIDS vaccine.
The Future of Water Security in Metropolitan Region of Sao Paulo Through Different Climate Scenarios
NASA Astrophysics Data System (ADS)
Gesualdo, G. C.; Oliveira, P. T. S.; Rodrigues, D. B. B.
2017-12-01
Achieving a balance between water availability and demand is one of the most pressing environmental challenges of the twenty-first century. This challenge is exacerbated by climate change, which has already affected the water balance of landscapes globally by intensifying runoff, reducing snowpacks, and shifting precipitation regimes. Understanding these changes is crucial to identifying future water availability and developing sustainable management plans, especially in developing countries. Here, we address the developing-country water balance challenge by assessing the influence of climate change on water availability in the Jaguari basin, Southeastern Brazil. The Jaguari basin is one of the main sources of freshwater for 9 million people in the Metropolitan Region of São Paulo. This region represents about 7% of Brazil's Gross Domestic Product. The critical importance of the water balance challenge in this area was highlighted recently when a major drought in southeastern Brazil revealed the vulnerability of current water management systems. Still today, the per capita water availability in the region remains severely limited. To help address this water balance challenge, we use a modeling approach to predict future water vulnerabilities of this region under different climate scenarios. We calibrated and validated a lumped conceptual model using HYMOD to evaluate future scenarios using downscaled climate models resulting from the HadGEM2-ES and MIROC5 GCMs forced by the RCP4.5 and RCP8.5 scenarios. We also present future directions, which include bias correction from long-term weather station data and an empirical uncertainty assessment. Our results provide an important overview of climate change impacts on streamflow and future water availability in the Jaguari basin, which can be used to guide the basin's water security plans and strategies.
Zhao, Lixiang; Gao, Song; Huan, Haixia; Xu, Xiaojing; Zhu, Xiaoping; Yang, Weixia; Gao, Qingqing; Liu, Xiufan
2009-05-01
Avian pathogenic Escherichia coli (APEC) and uropathogenic E. coli (UPEC) establish infections in extraintestinal habitats of different hosts. As the diversity, epidemiological sources and evolutionary origins of extraintestinal pathogenic E. coli (ExPEC) are so far only partially defined, in the present study, 100 APEC isolates and 202 UPEC isolates were compared by their content of virulence genes and phylogenetic groups. The two groups showed substantial overlap in terms of their serogroups, phylogenetic groups and virulence genotypes, including their possession of certain genes associated with large transmissible plasmids of APEC. In a chicken challenge model, both UPEC U17 and APEC E058 had similar LD(50), demonstrating that UPEC U17 had the potential to cause significant disease in poultry. To gain further information about the similarities between UPEC and APEC, the in vivo expression of 152 specific genes of UPEC U17 and APEC E058 in both a murine urinary tract infection (UTI) model and a chicken challenge model was compared with that of these strains grown statically to exponential phase in rich medium. It was found that in the same model (murine UTI or chicken challenge), various genes of UPEC U17 and APEC E058 showed a similar tendency of expression. Several iron-related genes were upregulated in the UTI model and/or chicken challenge model, indicating that iron acquisition is important for E. coli to survive in blood or the urinary tract. Based on these results, the potential for APEC to act as human UPEC or as a reservoir of virulence genes for UPEC should be considered. Further, this study compared the transcriptional profile of virulence genes between APEC and UPEC in vivo.
McMullen, Carmit; Altschuler, Andrea; Bulkley, Joanna; Grant, Marcia; Hornbrook, Mark; Krouse, Robert
2012-01-01
Background Patients surgically treated for rectal cancer receive either an intestinal ostomy (externalization of the bowel to the abdominal wall) or, more frequently, an anastomosis (reconnection) of the rectum. While the challenges of intestinal ostomies have been previously described by this research team, much less is known about the long-term challenges of living with an anastomosis. Understanding the challenges of long-term rectal cancer survivors with both types of surgeries is important for informing and improving current practice. Methods We mailed our survey to 1000 long-term (at least 5 years post-diagnosis) rectal cancer survivors in KP Northern California and KP Northwest during 2010–2011. Our overall response rate was 57.7% (577/1000). The survey contained an open-ended question that asked respondents to write about the greatest challenge they experienced after their cancer surgery. Seventy-three percent of respondents provided a response to this “greatest challenge” question. Responses were analyzed qualitatively to compare the challenges reported by patients with anastomosis vs. ostomy. Results Challenges related to managing bowel function and output were found in both groups. Ostomy patients reported challenges to managing ostomy equipment that were unique to their condition—ostomy appliance failures, skin breakdown around the ostomy, and finding suitable places to empty, clean, and reconnect their appliance. Other notable differences in the greatest challenges among ostomy and anastomosis patients included: patients with an ostomy reported a range of psychosocial challenges relating to depression, shame, stigma, and post-operative psychological trauma about having an ostomy and such psychosocial impacts were notably absent among anastomosis patients; patients with ostomies reported regret about having an ostomy, but patients with anastomosis did not report regret about the surgery they received; and, anastomosis patients mentioned more challenges from radiation after effects, including pain, fistulae, and strictures. Discussion Our findings about rectal cancer survivors with ostomies mirror previously published reports. Even in the face of impaired bowel function, rectal cancer survivors with anastomoses express little psychological distress or regret about treatment choice. The lasting effects of radiation therapy, however, are of special concern to this group.
Financial anatomy of biomedical research.
Moses, Hamilton; Dorsey, E Ray; Matheson, David H M; Thier, Samuel O
2005-09-21
Public and private financial support of biomedical research have increased over the past decade. Few comprehensive analyses of the sources and uses of funds are available. This results in inadequate information on which to base investment decisions, because not all sources allow equal latitude to explore hypotheses having scientific or clinical importance, and creates a barrier to judging the value of research to society. To quantify funding trends from 1994 to 2004 of basic, translational, and clinical biomedical research by principal sponsors based in the United States. Publicly available data were compiled for the federal, state, and local governments; foundations; charities; universities; and industry. Proprietary (by subscription but openly available) databases were used to supplement public sources. Total actual research spending, growth rates, and type of research with inflation adjustment. Biomedical research funding increased from 37.1 billion dollars in 1994 to 94.3 billion dollars in 2003 and doubled when adjusted for inflation. Principal research sponsors in 2003 were industry (57%) and the National Institutes of Health (28%). Relative proportions from all public and private sources did not change. Industry sponsorship of clinical trials increased from 4.0 billion dollars to 14.2 billion dollars (in real terms), while federal proportions devoted to basic and applied research were unchanged. The United States spent an estimated 5.6% of its total health expenditures on biomedical research, more than any other country, but less than 0.1% for health services research. From an economic perspective, biotechnology and medical device companies were most productive, as measured by new diagnostic and therapeutic devices per dollar of research and development cost. Productivity declined for new pharmaceuticals. Enhancing research productivity and evaluation of benefit are pressing challenges, requiring (1) more effective translation of basic scientific knowledge to clinical application; (2) critical appraisal of rapidly moving scientific areas to guide investment where clinical need is greatest, not only where commercial opportunity is currently perceived; and (3) more specific information about sources and uses of research funds than is generally available to allow informed investment decisions. Responsibility falls on industry, government, and foundations to bring these changes about with a longer-term view of research value.
Multisource inverse-geometry CT. Part I. System concept and development
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Man, Bruno, E-mail: deman@ge.com; Harrison, Dan
Purpose: This paper presents an overview of multisource inverse-geometry computed tomography (IGCT) as well as the development of a gantry-based research prototype system. The development of the distributed x-ray source is covered in a companion paper [V. B. Neculaes et al., “Multisource inverse-geometry CT. Part II. X-ray source design and prototype,” Med. Phys. 43, 4617–4627 (2016)]. While progress updates of this development have been presented at conferences and in journal papers, this paper is the first comprehensive overview of the multisource inverse-geometry CT concept and prototype. The authors also provide a review of all previous IGCT related publications. Methods: The authors designed and implemented a gantry-based 32-source IGCT scanner with 22 cm field-of-view, 16 cm z-coverage, 1 s rotation time, 1.09 × 1.024 mm detector cell size, as low as 0.4 × 0.8 mm focal spot size and 80–140 kVp x-ray source voltage. The system is built using commercially available CT components and a custom made distributed x-ray source. The authors developed dedicated controls, calibrations, and reconstruction algorithms and evaluated the system performance using phantoms and small animals. Results: The authors performed IGCT system experiments and demonstrated tube current up to 125 mA with up to 32 focal spots. The authors measured a spatial resolution of 13 lp/cm at 5% cutoff. The scatter-to-primary ratio is estimated at 62% for a 32 cm water phantom at 140 kVp. The authors scanned several phantoms and small animals. The initial images have relatively high noise due to the low x-ray flux levels but minimal artifacts. Conclusions: IGCT has unique benefits in terms of dose-efficiency and cone-beam artifacts, but comes with challenges in terms of scattered radiation and x-ray flux limits. To the authors’ knowledge, their prototype is the first gantry-based IGCT scanner. The authors summarized the design and implementation of the scanner and presented results with phantoms and small animals.
Implementation of MAR within the Rio Grande Basin of Central New Mexico, USA
NASA Astrophysics Data System (ADS)
Marley, Robert; Blandford, T. Neil; Ewing, Amy; Webb, Larry; Yuhas, Katherine
2014-05-01
The U.S. Bureau of Reclamation has identified the Rio Grande basin within Central New Mexico as one of several regions where water supplies are over-allocated and future conflicts over the inadequate resource are highly likely. Local water providers have consistently identified managed aquifer recharge (MAR) as an important tool to provide conjunctive management of surface-water, groundwater, and reclaimed water sources in order to extend the useful life of existing water sources. However, MAR projects have been slow to take root partly due to rigorous demonstration requirements, groundwater quality protection concerns, and ongoing water right uncertainties. At first glance the several thousand meters of unconsolidated basin-fill sediments hosting the regional aquifer appear to provide an ideal environment for the subsurface storage of surplus water. However, the basin has a complex structural and depositional history that impacts the siting and overall effectiveness of MAR systems. Several recharge projects are now in various stages of implementation and are overcoming site specific challenges including source water and ambient groundwater compatibility, low-permeability sediments and compartmentalization of the aquifer by extensive faulting, well clogging, and overall water quality management. This presentation will highlight ongoing efforts of these water providers to develop full-scale recharge facilities. The performance of natural in-channel infiltration, engineered infiltration galleries, and direct injection systems designed to introduce from 500 to 5,000 mega-liters per annum to target intervals present from 150 to 600 meters below ground surface will be described. Source waters for recharge operations include inter-basin transferred surface water and highly treated reclaimed water sources requiring from minor to extensive treatment pre-recharge and post-recovery. Operational complexities have raised concerns related to long-term operation and maintenance and overall economic sustainability of these projects. Further, potential reduction in surface water return flows as a result of recharge operations and impacts to other water users during recovery of the stored water must be considered. Proposed rules for long-term storage, estimating water losses, and eventual water recovery as they relate to water rights administration within stream-connected aquifer systems will also be outlined during the presentation.
Beamed Energy Propulsion: Research Status And Needs--Part 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Birkan, Mitat
One promising solution to operationally responsive space is the application of remote electromagnetic energy to propel a launch vehicle into orbit. With beamed energy propulsion, one can leave the power source stationary on the ground or in space and heat the propellant on the spacecraft with a beam directed from a fixed station. This permits the spacecraft to leave its power source at home, saving significant amounts of mass and greatly improving performance. This concept, which removes the mass penalty of carrying the propulsion energy source on board the vehicle, was first proposed by Arthur Kantrowitz in 1972; he invoked an extremely powerful ground-based laser. The same year, Michael Minovich suggested a conceptually similar 'in-space' laser rocket system utilizing a remote laser power station. In the late 1980s, the Air Force Office of Scientific Research (AFOSR) funded continuous, double-pulse laser and microwave propulsion, while the Strategic Defense Initiative Office (SDIO) funded ablative laser rocket propulsion. Currently, AFOSR has been funding the concept initiated by Leik Myrabo, repetitively pulsed laser propulsion, which has been universally perceived, arguably, to be the closest to mid-term applications. This two-part paper examines the investment strategies in beamed energy propulsion and the technical challenges to be overcome; Part 2 covers the present research status and needs.
NASA Astrophysics Data System (ADS)
Fanos, Ali Mutar; Pradhan, Biswajeet
2018-04-01
Rockfall poses a risk to people, their property and transportation routes in mountainous and hilly regions. This hazard exhibits various characteristics such as wide spatial distribution, sudden occurrence, variable magnitude, high lethality and randomness. Therefore, predicting the rockfall phenomenon both spatially and temporally is a challenging task. The Digital Terrain Model (DTM) is one of the most significant elements in rockfall source identification and risk assessment. Light detection and ranging (LiDAR) is the most advanced and effective technique for deriving a high-resolution, accurate DTM. This paper presents a critical overview of the rockfall phenomenon (definition, triggering factors, motion modes and modeling) and the LiDAR technique in terms of data pre-processing, DTM generation and the factors that can be obtained from this technique for rockfall source identification and risk assessment. It also reviews the existing methods that are utilized for the evaluation of rockfall trajectories and their characteristics (frequency, velocity, bouncing height and kinetic energy), probability, susceptibility, hazard and risk. Detailed consideration is given to quantitative methodologies in addition to qualitative ones. Various methods are demonstrated with respect to their application scales (local and regional). Additionally, attention is given to the latest improvements, particularly the consideration of the intensity of the phenomena and the magnitude of the events at chosen sites.
Lambert, Michelle; Chivers, Paola; Farringdon, Fiona
2018-06-11
University students generally make independent decisions regarding food choices. Current research about knowledge of the Australian Dietary Guidelines (ADG), sources of nutrition information and influences on food choices for this group is scarce. Qualitative data were collected from gender-separated focus groups, four female (n=31) and four male (n=18), to identify: knowledge of the ADG; sources of nutrition information; factors that influence food choices; and perceived relevant nutrition messages and how best to deliver them. Gaps in knowledge were identified, particularly regarding the number of serves and serving sizes for food groups. Social media was the most commonly reported source of knowledge. Social media was also a major influence on food choice due to its impact on body ideals. Current health promotion nutrition messages were perceived as irrelevant given their focus on long-term health risks. Health and adhering to the ADG were not identified as important. The desire to look a particular way was the major influence on food choices. SO WHAT?: While there is an awareness of the ADG, our participants made a deliberate decision not to follow them. This provides a challenge for developing relevant preventive health messages for this target audience. This article is protected by copyright. All rights reserved.
Shoot, shovel and shut up: cryptic poaching slows restoration of a large carnivore in Europe.
Liberg, Olof; Chapron, Guillaume; Wabakken, Petter; Pedersen, Hans Christian; Hobbs, N Thompson; Sand, Håkan
2012-03-07
Poaching is a widespread and well-appreciated problem for the conservation of many threatened species. Because poaching is illegal, there is strong incentive for poachers to conceal their activities, and consequently, little data on the effects of poaching on population dynamics are available. Quantifying poaching mortality should be required knowledge when developing conservation plans for endangered species but is hampered by methodological challenges. We show that rigorous estimates of the effects of poaching relative to other sources of mortality can be obtained with a hierarchical state-space model combined with multiple sources of data. Using the Scandinavian wolf (Canis lupus) population as an illustrative example, we show that poaching accounted for approximately half of total mortality and more than two-thirds of total poaching remained undetected by conventional methods, a source of mortality we term 'cryptic poaching'. Our simulations suggest that without poaching during the past decade, the population would have been almost four times as large in 2009. Such a severe impact of poaching on population recovery may be widespread among large carnivores. We believe that conservation strategies for large carnivores considering only observed data may not be adequate and should be revised by including and quantifying cryptic poaching.
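A toy calculation with assumed rates shows how a modest annual cryptic mortality compounds into a near-fourfold population difference over a decade; the growth and poaching rates below are illustrative only and are not the paper's estimates.

```python
# Geometric growth with and without poaching, all rates hypothetical.
years, r = 10, 0.35                  # assumed intrinsic annual growth rate
p_detected, p_cryptic = 0.07, 0.10   # assumed annual poaching mortality rates

n_with, n_without = 100.0, 100.0
for _ in range(years):
    n_with *= 1.0 + r - p_detected - p_cryptic
    n_without *= 1.0 + r
print(round(n_without / n_with, 2))  # ~3.8, echoing the reported near-fourfold gap
```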
Governing GMOs in the USA: science, law and public health.
Yang, Y Tony; Chen, Brian
2016-04-01
Controversy surrounds the production and consumption of genetically modified organisms (GMOs). Proponents argue that GMO food sources represent the only viable solution to food shortages in an ever-growing global population. Science reports no harm from GMO use and consumption so far. Opponents fear the potentially negative impact that GMO development and use could have on the environment and consumers, and are concerned about the lack of data on the long-term effects of GMO use. We discuss the development of GMO food sources, the history of legislation and policy for the labeling requirements of GMO food products, and the health, environmental, and legal rationale for and against GMO food labeling. The Food and Drug Administration regulates food with GMOs within a coordinated framework of federal agencies. Despite mounting scientific evidence that GMO foods are substantially equivalent to traditionally bred food sources, debate remains over the appropriateness of GMO food labeling. In fact, food manufacturers have mounted a First Amendment challenge against Vermont's passage of a law that requires GMO labeling. Mandatory GMO labeling is not supported by science. Compulsory GMO labels may not only hinder the development of agricultural biotechnology, but may also exacerbate the misconception that GMOs endanger people's health. © 2015 Society of Chemical Industry.
The Swift Supergiant Fast X-ray Transient Project
NASA Astrophysics Data System (ADS)
Romano, P.; Barthelmy, S.; Bozzo, E.; Burrows, D.; Ducci, L.; Esposito, P.; Evans, P.; Kennea, J.; Krimm, H.; Vercellone, S.
2017-10-01
We present the Swift Supergiant Fast X-ray Transients (SFXT) project, a systematic study of SFXTs and classical supergiant X-ray binaries (SGXBs) through efficient long-term monitoring of 17 sources of both classes, spanning more than 4 orders of magnitude in X-ray luminosity on timescales from hundreds of seconds to years. We derived dynamic ranges, duty cycles, and luminosity distributions to highlight systematic differences that help discriminate between the theoretical models proposed to explain the different wind accretion processes in SFXTs and classical SGXBs. Our follow-ups of SFXT outbursts steadily advance our understanding of the mechanisms triggering the high-level X-ray emission of these sources. In particular, the observations of the outburst of the SFXT prototype IGR J17544-2619, when the source reached a peak X-ray luminosity of 3×10^{38} erg s^{-1}, challenged for the first time the maximum theoretical luminosity achievable by a wind-fed neutron star high-mass X-ray binary. We propose that this giant outburst was due to the formation of a transient accretion disc around the compact object. We also created a catalogue of over 1000 BAT flares, which we use to assess the observability of these sources and the prospects for future missions.
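As an illustration of the quantities the project derives, the sketch below computes a dynamic range and a duty cycle from a monitoring light curve. It is a toy calculation with simulated fluxes; the detection threshold and flux distribution are assumptions, not the project's actual values.

```python
# Sketch: dynamic range and duty cycle from a monitoring light curve.
# Hypothetical inputs: `flux` holds X-ray fluxes from repeated snapshots of
# one source; the "active" threshold is illustrative, not the project's.
import numpy as np

rng = np.random.default_rng(0)
flux = 10 ** rng.normal(-11, 1.0, size=500)       # stand-in fluxes (erg/cm^2/s)
threshold = 1e-11                                 # assumed "active" level

dynamic_range = flux.max() / flux.min()           # luminosity span of the source
duty_cycle = np.mean(flux > threshold)            # fraction of time "active"

print(f"dynamic range ~ 10^{np.log10(dynamic_range):.1f}")
print(f"duty cycle    ~ {duty_cycle:.1%}")
```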
Long Term 2 Second Round Source Water Monitoring and Bin Placement Memo
The Long Term 2 Enhanced Surface Water Treatment Rule (LT2ESWTR) applies to all public water systems served by a surface water source or public water systems served by a ground water source under the direct influence of surface water.
2011-11-30
Mankiw, N. G. (2006). Principles of economics (4th ed.). Mason, OH: Thomson South-Western. When the choice to in-source or outsource an installation function or service requirement exists, in these challenging economic times, the decision is subject to significant uncertainties.
Uncertainty, variability, and earthquake physics in ground‐motion prediction equations
Baltay, Annemarie S.; Hanks, Thomas C.; Abrahamson, Norm A.
2017-01-01
Residuals between ground‐motion data and ground‐motion prediction equations (GMPEs) can be decomposed into terms representing earthquake source, path, and site effects. These terms can be cast in terms of repeatable (epistemic) residuals and random (aleatory) components. Identifying the repeatable residuals leads to a GMPE with reduced uncertainty for a specific source, site, or path location, which in turn can yield a lower hazard level at small probabilities of exceedance. We illustrate a schematic framework for this residual partitioning with a dataset from the ANZA network, which straddles the central San Jacinto fault in southern California. The dataset consists of more than 3200 earthquakes (1.15≤M≤3) and their peak ground accelerations (PGAs), recorded at close distances (R≤20 km). We construct a small‐magnitude GMPE for these PGA data, incorporating VS30 site conditions and geometrical spreading. Identification and removal of the repeatable source, path, and site terms yield an overall reduction in the standard deviation from 0.97 (in ln units) to 0.44, for a nonergodic assumption, that is, for a single‐source location, single site, and single path. We give examples of relationships between independent seismological observables and the repeatable terms. We find a correlation between location‐based source terms and stress drops in the San Jacinto fault zone region; an explanation of the site term as a function of kappa, the near‐site attenuation parameter; and a suggestion that the path component can be related directly to elastic structure. These correlations allow the repeatable source location, site, and path terms to be determined a priori using independent geophysical relationships. Those terms could be incorporated into location‐specific GMPEs for more accurate and precise ground‐motion prediction.
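A minimal way to see the effect of removing repeatable terms is to average residuals by event and then by site, and compare standard deviations before and after. The sketch below does this on simulated residuals; it uses plain group means rather than the authors' mixed-effects framework, and all names and numbers are invented.

```python
# Sketch of the residual partitioning idea: peel repeatable event (source)
# and site terms off total GMPE residuals and compare standard deviations.
# A toy illustration with simulated residuals, not the authors' ANZA analysis.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 5000
rec = pd.DataFrame({
    "event": rng.integers(0, 300, n),   # event index
    "site":  rng.integers(0, 40, n),    # station index
})
# total residual = event term + site term + random (aleatory) part
rec["resid"] = (0.5 * rng.standard_normal(300)[rec.event]
                + 0.4 * rng.standard_normal(40)[rec.site]
                + 0.6 * rng.standard_normal(n))

rec["event_term"] = rec.groupby("event")["resid"].transform("mean")
within = rec["resid"] - rec["event_term"]
rec["site_term"] = within.groupby(rec["site"]).transform("mean")
single_path = within - rec["site_term"]           # remaining aleatory part

print(f"total sigma            {rec['resid'].std():.2f}")
print(f"after removing terms   {single_path.std():.2f}")
```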
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maingi, Rajesh; Zinkle, Steven J.; Foster, Mark S.
2015-05-01
The realization of controlled thermonuclear fusion as an energy source would transform society, providing a nearly limitless energy source with renewable fuel. Under the auspices of the U.S. Department of Energy, the Fusion Energy Sciences (FES) program management recently launched a series of technical workshops to “seek community engagement and input for future program planning activities” in the targeted areas of (1) Integrated Simulation for Magnetic Fusion Energy Sciences, (2) Control of Transients, (3) Plasma Science Frontiers, and (4) Plasma-Materials Interactions, aka the Plasma-Materials Interface (PMI). Over the past decade, a number of strategic planning activities [1-6] have highlighted PMI and plasma-facing components as a major knowledge gap, which should be a priority for fusion research towards ITER and future demonstration fusion energy systems. There is a strong international consensus that new PMI solutions are required in order for fusion to advance beyond ITER. The goal of the 2015 PMI community workshop was to review recent innovations and improvements in understanding the challenging PMI issues, identify high-priority scientific challenges in PMI, and to discuss potential options to address those challenges. The community response to the PMI research assessment was enthusiastic, with over 80 participants involved in the open workshop held at Princeton Plasma Physics Laboratory on May 4-7, 2015. The workshop provided a useful forum for the scientific community to review progress in scientific understanding achieved during the past decade, and to openly discuss high-priority unresolved research questions. One of the key outcomes of the workshop was a focused set of community-initiated Priority Research Directions (PRDs) for PMI. Five PRDs were identified, labeled A-E, which represent community consensus on the most urgent near-term PMI scientific issues. For each PRD, an assessment was made of the scientific challenges, as well as a set of actions to address those challenges. No prioritization was attempted amongst these five PRDs. We note that ITER, an international collaborative project to substantially extend fusion science and technology, is implicitly a driver and beneficiary of the research described in these PRDs; specific ITER issues are discussed in the background and PRD chapters. For succinctness, we describe these PRDs directly below; a brief introduction to magnetic fusion and the workshop process/timeline is given in Chapter I, and panelists are listed in the Appendix.
Assessment of vaccine testing at three laboratories using the guinea pig model of tuberculosis.
Grover, Ajay; Troudt, Jolynn; Arnett, Kimberly; Izzo, Linda; Lucas, Megan; Strain, Katie; McFarland, Christine; Hall, Yper; McMurray, David; Williams, Ann; Dobos, Karen; Izzo, Angelo
2012-01-01
The guinea pig model of tuberculosis is used extensively in different locations to assess the efficacy of novel tuberculosis vaccines during pre-clinical development. Two key assays are used to measure protection against virulent challenge: a 30-day post-infection assessment of mycobacterial burden, and a long-term post-infection survival and pathology analysis. To determine the consistency and robustness of the guinea pig model for testing vaccines, a comparative assessment was performed between three sites that are currently involved in testing tuberculosis vaccines from external providers. Each site was asked to test two "subunit"-type vaccines in its routine animal model, as if testing vaccines from a provider. All sites performed a 30-day study, and one site also performed a long-term survival/pathology study. Despite some differences in experimental approach between the sites, such as the origin of the Mycobacterium tuberculosis strain, the type of aerosol exposure device used to infect the animals, and the source of the guinea pigs, the data obtained were consistent across sites with regard to the ability of each "vaccine" tested to reduce the mycobacterial burden. The observations also showed good concurrence between the results of short-term and long-term studies. This validation exercise means that efficacy data can be compared between sites. Copyright © 2011 Elsevier Ltd. All rights reserved.
Teaching Energy to a General Audience
NASA Astrophysics Data System (ADS)
Baski, Alison; Hunnicutt, Sally
2010-02-01
A new, interdisciplinary course entitled ``Energy!'' has been developed by faculty in the physics and chemistry departments to meet the university's science and technology general education requirement. This course now enrolls over 400 students each semester in a single lecture where faculty from both departments co-teach throughout the term. Topics include the fundamentals of energy, fossil fuels, global climate change, nuclear energy, and renewable energy sources. The students represent an impressive range of majors (science, engineering, business, humanities, etc.) and comprise freshmen to seniors. To effectively teach this diverse audience and increase classroom engagement, in-class ``clickers'' are used with guided questions to teach concepts, which are then explicitly reinforced with online LON-CAPA homework (LON-CAPA is a free, open-source, distributed learning content management and assessment system; www.lon-capa.org). This online system enables immediate feedback in a structured manner, where students can practice randomized versions of problems for homework, quizzes, and exams. The course is already in high demand after only two semesters, in part because it is particularly relevant to students given the challenging energy and climate issues facing the nation and world.
The impact of circulation control on rotary aircraft controls systems
NASA Technical Reports Server (NTRS)
Kingloff, R. F.; Cooper, D. E.
1987-01-01
Application of circulation control to rotary wing systems is a new development. Efforts to determine the near- and far-field flow patterns, and to predict those flow patterns analytically, have been underway for some years. Rotary wing applications present a new set of challenges in circulation control technology. Rotary wing sections must accommodate substantial variation in Mach number, free-stream dynamic pressure and section angle of attack at each flight condition within the design envelope. They must also be capable of short-term modulation of circulation blowing to produce control moments and vibration alleviation, in addition to a lift augmentation function. The control system design must provide these primary control moment, vibration alleviation and lift augmentation functions. To accomplish this, one must simultaneously control the compressed-air source and its distribution. The control law algorithm must therefore address the compressor as the air source, the plenum as the air pressure storage, and the pneumatic flow gates or valves that distribute and meter the stored pressure to the rotating blades. Mechanical collective blade pitch, rotor shaft angle of attack and engine power control must also be maintained.
Imagery for Disaster Response and Recovery
NASA Astrophysics Data System (ADS)
Bethel, G. R.
2011-12-01
Making remotely sensed imagery available for disaster response and recovery can provide the basis for an unbiased understanding of current conditions. Having created consolidated remotely sensed and geospatial data source documents for US and foreign disasters over the past six years, availability and usability are continuing to evolve. By documenting all existing sources of imagery and value-added products, the disaster response and recovery community can develop actionable information. The past two years have provided unique situations for using imagery, including a major humanitarian disaster and response effort in Haiti, a major environmental disaster in the Gulf of Mexico, a killer tornado in Joplin, Missouri, and long-term flooding in the Midwest. Each disaster presents different challenges and requires different spatial resolutions, spectral properties and/or multi-temporal collections. The community of data providers continues to expand, with organized activities such as the International Charter for Space and Major Disasters and acquisitions by the private sector for the public good rather than for profit. However, data licensing, the lack of cross-calibration and inconsistent georeferencing hinder optimal use. Recent pre-event imagery is a critical component of any disaster response.
MyDiabetesMyWay: An Evolving National Data Driven Diabetes Self-Management Platform.
Wake, Deborah J; He, Jinzhang; Czesak, Anna Maria; Mughal, Fezan; Cunningham, Scott G
2016-09-01
MyDiabetesMyWay (MDMW) is an award-winning national electronic personal health record and self-management platform for diabetes patients in Scotland. This platform links multiple national institutional and patient-recorded data sources to provide a unique resource for patient care and self-management. This review considers the current evidence for online interventions in diabetes and discusses these in the context of current and ongoing developments for MDMW. Evaluation of MDMW through patient-reported outcomes demonstrates a positive impact on self-management. User feedback has highlighted barriers to uptake and has guided platform evolution from an education resource website to an electronic personal health record now encompassing remote monitoring, communication tools and personalized education links. Challenges in delivering digital interventions for long-term conditions include integration of data between institutional and personally recorded sources to perform big data analytics, and facilitating technology use by those with disabilities, low digital literacy, low socioeconomic status and in minority groups. The potential for technology-supported health improvement is great, but awareness and adoption by health workers and patients remain a significant barrier. © 2016 Diabetes Technology Society.
NASA Astrophysics Data System (ADS)
Bateman, Robert; Harris, Adam; Lee, Linda; Howle, Christopher R.; Ackermann, Sarah L. G.
2016-05-01
The paper reviews the feasibility of adapting the Modified Transient Plane Source (MTPS) method as a screening tool for the early detection of explosives and hazardous materials. Materials can be distinguished from one another based on their inherent thermal properties (e.g., thermal effusivity) when tested through different types of barrier materials. A complementary advantage of this technique relative to traditional detection technologies is that it can easily penetrate reflective barrier materials such as aluminum. A strong proof-of-principle is presented for the application of MTPS transient thermal property measurement to the early screening of liquid explosives. The work demonstrates significant sensitivity in distinguishing a wide range of fluids by their thermal properties through a barrier material, and covers various factors complicating the longer-term adoption of such a method, including the impact of carbonization and viscosity. While some technical challenges remain, the technique offers significant advantages in complementing existing detection methods, being able to penetrate reflective metal containers (e.g., aluminum soft drink cans) with ease.
Technology, conflict early warning systems, public health, and human rights.
Pham, Phuong N; Vinck, Patrick
2012-12-15
Public health and conflict early warning are evolving rapidly in response to technology changes for the gathering, management, analysis and communication of data. It is expected that these changes will provide an unprecedented ability to monitor, detect, and respond to crises. One of the potentially most profound and lasting expected changes affects the roles of the various actors in providing and sharing information and in responding to early warning. Communities and civil society actors have the opportunity to be empowered as a source of information, analysis, and response, while the role of traditional actors shifts toward supporting those communities and building resilience. However, by creating new roles, relationships, and responsibilities, technology changes raise major concerns and ethical challenges for practitioners, pressing the need for practical guidelines and actionable recommendations in line with existing ethical principles. Copyright © 2012 Pham and Vinck. This is an open access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0/), which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original author and source are credited.
CrowdMag - Crowdsourcing magnetic data
NASA Astrophysics Data System (ADS)
Nair, M. C.; Boneh, N.; Chulliat, A.
2014-12-01
In the CrowdMag project, we explore whether the digital magnetometers built into modern mobile phones can be used as scientific instruments to measure Earth's magnetic field. Most modern mobile phones have digital magnetometers to orient themselves. A phone's magnetometer measures three components of the local magnetic field with a typical sensitivity of about 150 to 600 nanotesla (nT). By combining data from the vector magnetometer and the accelerometers, the phone's orientation is determined. Using the phone's Internet connection, magnetic data and location are sent to a central server, where we check the quality of the magnetic data from all users and make the data available to the public as aggregate maps. We have two long-term goals: 1) develop near-real-time models of Earth's time-changing magnetic field by reducing man-made noise in the crowdsourced data and combining it with geomagnetic data from other sources; and 2) improve the accuracy of magnetic navigation by mapping magnetic noise sources (e.g., power transformers and iron pipes). Key challenges to this endeavor are the low sensitivity of the phone's magnetometer and the noisy environment within and surrounding the phone. URL : http://www.ngdc.noaa.gov/geomag/crowdmag.shtml
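One orientation-independent quantity a crowdsourcing server can aggregate without knowing each phone's attitude is the total field magnitude. Below is a minimal sketch of that reduction with invented readings; the 0.01-degree gridding and median aggregation are assumptions, not necessarily CrowdMag's actual processing.

```python
# Sketch: reduce noisy crowdsourced magnetometer samples to a per-cell map.
# The field magnitude is orientation-independent, which sidesteps unknown
# phone attitude; the grid/median choices here are assumptions, not NOAA's.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 10_000
df = pd.DataFrame({
    "lat": rng.uniform(39.9, 40.1, n),
    "lon": rng.uniform(-105.3, -105.1, n),
    "bx": rng.normal(20_000, 400, n),    # nT, stand-in phone readings
    "by": rng.normal(0, 400, n),
    "bz": rng.normal(45_000, 400, n),
})
df["f"] = np.sqrt(df.bx**2 + df.by**2 + df.bz**2)   # total field (nT)

# 0.01-degree cells; the median resists man-made outliers (cars, magnets)
cells = df.groupby([df.lat.round(2), df.lon.round(2)])["f"]
grid = cells.median()
print(grid.head())
```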
Interlaboratory study of the ion source memory effect in 36Cl accelerator mass spectrometry
NASA Astrophysics Data System (ADS)
Pavetich, Stefan; Akhmadaliev, Shavkat; Arnold, Maurice; Aumaître, Georges; Bourlès, Didier; Buchriegler, Josef; Golser, Robin; Keddadouche, Karim; Martschini, Martin; Merchel, Silke; Rugel, Georg; Steier, Peter
2014-06-01
Understanding and minimization of contaminations in the ion source due to cross-contamination and the long-term memory effect is one of the key issues for accurate accelerator mass spectrometry (AMS) measurements of volatile elements. The focus of this work is on the investigation of the long-term memory effect for the volatile element chlorine, and the minimization of this effect in the ion source of the Dresden accelerator mass spectrometry facility (DREAMS). For this purpose, one of the two original HVE ion sources at the DREAMS facility was modified, allowing the use of larger sample holders having individual target apertures. Additionally, a more open geometry was used to improve the vacuum level. To evaluate this improvement in comparison to other up-to-date ion sources, an interlaboratory comparison was initiated. The long-term memory effect of the four Cs sputter ion sources at DREAMS (two sources: original and modified), ASTER (Accélérateur pour les Sciences de la Terre, Environnement, Risques) and VERA (Vienna Environmental Research Accelerator) was investigated by measuring samples of natural 35Cl/37Cl ratio and samples highly enriched in 35Cl (35Cl/37Cl ∼ 999). Besides investigating and comparing the individual levels of long-term memory, recovery time constants could be calculated. The tests show that all four sources suffer from long-term memory, but the modified DREAMS ion source showed the lowest level of contamination. The recovery times of the four ion sources were widely spread, between 61 and 1390 s; the modified DREAMS ion source, with values between 156 and 262 s, showed the fastest recovery in 80% of the measurements.
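Recovery time constants of the kind quoted above can be estimated by fitting an exponential decay to the measured isotopic ratio after switching from an enriched sample back to natural material. A hedged sketch with simulated data follows; the model form, time grid and noise level are assumptions.

```python
# Sketch: estimate an ion-source recovery time constant by fitting an
# exponential decay to the 35Cl/37Cl ratio measured after a sample change.
# Data here are simulated, not the interlaboratory measurements.
import numpy as np
from scipy.optimize import curve_fit

def recovery(t, amp, tau, baseline):
    return amp * np.exp(-t / tau) + baseline    # decay toward natural ratio

t = np.arange(0, 1200, 30.0)                    # seconds after sample change
rng = np.random.default_rng(3)
ratio = recovery(t, amp=600.0, tau=200.0, baseline=3.13)  # natural 35/37 ~ 3.13
ratio += rng.normal(0, 5.0, t.size)             # measurement noise

popt, _ = curve_fit(recovery, t, ratio, p0=(500.0, 150.0, 3.0))
print(f"fitted recovery time constant: {popt[1]:.0f} s")
```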
Families, Schools, and Major Demographic Trends in the United States
ERIC Educational Resources Information Center
Crosnoe, Robert; Benner, Aprile D.
2012-01-01
Although long-term social change is challenging to study, it is theoretically important because its effects are not immediate and it is not usually powered by major social institutions. Another challenge is that individuals are often only vaguely aware of long-term social changes and do not fully consider how they might be affected personally. As…
Using Multiple-Variable Matching to Identify Cultural Sources of Differential Item Functioning
ERIC Educational Resources Information Center
Wu, Amery D.; Ercikan, Kadriye
2006-01-01
Identifying the sources of differential item functioning (DIF) in international assessments is very challenging, because such sources are often nebulous and intertwined. Even though researchers frequently focus on test translation and content area, few actually go beyond these factors to investigate other cultural sources of DIF. This article…
The Chandra Source Catalog : Automated Source Correlation
NASA Astrophysics Data System (ADS)
Hain, Roger; Evans, I. N.; Evans, J. D.; Glotfelty, K. J.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Primini, F. A.; Refsdal, B. L.; Rots, A. H.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; Van Stone, D. W.; Winkelman, S. L.; Zografou, P.
2009-01-01
Chandra Source Catalog (CSC) master source pipeline processing seeks to automatically detect sources and compute their properties. Since Chandra is a pointed mission and not a sky survey, different sky regions are observed different numbers of times, at varying orientations, resolutions, and other heterogeneous conditions. While this provides an opportunity to collect data from a potentially large number of observing passes, it also creates challenges in determining the best way to combine different detection results for the most accurate characterization of the detected sources. The CSC master source pipeline correlates data from multiple observations by updating existing cataloged source information with new data from the same sky region as they become available. This process sometimes leads to relatively straightforward conclusions, such as when single sources from two observations are similar in size and position. Other observation results require more logic to combine, such as when one observation finds a single large source and another identifies multiple smaller sources at the same position. We present examples of different overlapping source detections processed in the current version of the CSC master source pipeline. We explain how they are resolved into entries in the master source database, and examine the challenges of computing source properties for the same source detected multiple times. Future enhancements are also discussed. This work is supported by NASA contract NAS8-03060 (CXC).
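The simplest building block of this kind of master-source correlation is a positional cross-match: a new detection either updates an existing master source within some match radius or creates a new entry. The sketch below shows only that core idea with invented positions and an assumed 2-arcsec radius; the real pipeline also weighs PSF size, position errors and ambiguous overlaps.

```python
# Sketch: match new detections against cataloged master-source positions by
# angular separation. Positions and the match radius are illustrative only.
import numpy as np

def ang_sep_arcsec(ra1, dec1, ra2, dec2):
    """Small-angle separation between positions given in degrees."""
    dra = (ra1 - ra2) * np.cos(np.radians((dec1 + dec2) / 2))
    return 3600.0 * np.hypot(dra, dec1 - dec2)

master = [{"id": 1, "ra": 150.001, "dec": 2.200},
          {"id": 2, "ra": 150.050, "dec": 2.210}]
new_detections = [{"ra": 150.0012, "dec": 2.2001},   # matches source 1
                  {"ra": 150.200,  "dec": 2.300}]    # no counterpart -> new

for det in new_detections:
    seps = [ang_sep_arcsec(det["ra"], det["dec"], m["ra"], m["dec"])
            for m in master]
    best = int(np.argmin(seps))
    if seps[best] < 2.0:        # assumed 2-arcsec match radius
        print(f"update master source {master[best]['id']}")
    else:
        print("create new master source entry")
```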
Radio Sources in the NCP Region Observed with the 21 Centimeter Array
NASA Astrophysics Data System (ADS)
Zheng, Qian; Wu, Xiang-Ping; Johnston-Hollitt, Melanie; Gu, Jun-hua; Xu, Haiguang
2016-12-01
We present a catalog of 624 radio sources detected around the North Celestial Pole (NCP) with the 21 Centimeter Array (21CMA), a radio interferometer dedicated to the statistical measurement of the epoch of reionization (EoR). The data are taken from a 12 hr observation made on 2013 April 13, with frequency coverage from 75 to 175 MHz and an angular resolution of ˜4‧. The catalog includes flux densities at eight sub-bands across the 21CMA bandwidth and provides the in-band spectral indices for the detected sources. To reduce the complexity of interferometric imaging arising from the so-called “w” term and ionospheric effects, the present analysis is restricted to the east-west baselines within 1500 m. The 624 radio sources are found within 5° of the NCP down to ˜0.1 Jy. Our source counts are compared with, and exhibit good agreement with, deep low-frequency observations made recently with the GMRT and MWA. In particular, for fainter radio sources below ˜1 Jy, we find a flattening trend of source counts toward lower frequencies. While the thermal noise (˜0.4 mJy) is well controlled to below the confusion limit, the dynamic range (˜10^4) and sensitivity of current 21CMA imaging are largely limited by calibration and deconvolution errors, especially the grating lobes of very bright sources, such as 3C061.1, in the NCP field, which result from the regular spacings of the 21CMA. We note that particular attention should be paid to extended sources, whose modeling and removal may constitute a large technical challenge for current EoR experiments. Our analysis may serve as a useful guide to the design of next-generation low-frequency interferometers like the Square Kilometre Array.
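An in-band spectral index of the kind tabulated in the catalog can be obtained by fitting a power law to the sub-band flux densities in log-log space. A minimal sketch with invented sub-band centres and fluxes follows (the actual 21CMA sub-band frequencies are not reproduced here).

```python
# Sketch: an in-band spectral index from sub-band flux densities, assuming
# a power law S ~ nu**alpha and fitting log S vs log nu. Values are invented.
import numpy as np

nu = np.array([81, 94, 106, 119, 131, 144, 156, 169])      # MHz, 8 sub-bands
s = 2.0 * (nu / 100.0) ** -0.8                              # stand-in fluxes (Jy)
s *= 1 + np.random.default_rng(4).normal(0, 0.05, nu.size)  # 5% scatter

alpha, log_s100 = np.polyfit(np.log10(nu / 100.0), np.log10(s), 1)
print(f"spectral index alpha = {alpha:.2f}, S_100MHz = {10**log_s100:.2f} Jy")
```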
NASA Technical Reports Server (NTRS)
Yee, H. C.; Shinn, J. L.
1986-01-01
Some numerical aspects of finite-difference algorithms for nonlinear multidimensional hyperbolic conservation laws with stiff nonhomogeneous (source) terms are discussed. If the stiffness is entirely dominated by the source term, a semi-implicit shock-capturing method is proposed, provided that the Jacobian of the source terms possesses certain properties. The proposed semi-implicit method can be viewed as a variant of the Bussing and Murman point-implicit scheme with a more appropriate numerical dissipation for the computation of strong shock waves. However, if the stiffness is not solely dominated by the source terms, a fully implicit method would be a better choice. The situation is complicated in problems of more than one dimension, and the presence of stiff source terms further complicates the solution procedures for alternating direction implicit (ADI) methods. Several alternatives are discussed. The primary motivation for constructing these schemes was to address thermally and chemically nonequilibrium flows in the hypersonic regime. Due to the unique structure of the eigenvalues and eigenvectors for fluid flows of this type, the computation can be simplified, thus providing a more efficient solution procedure than one might have anticipated.
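The point-implicit idea can be shown on a scalar model equation u_t + a u_x = s(u): the convective part is advanced explicitly while the stiff source is linearized about the current state, so each cell solves a local scalar equation. The sketch below is a toy illustration under assumed parameters, not the paper's scheme.

```python
# Sketch of the point-implicit idea for u_t + a u_x = s(u) with a stiff
# source: advection is explicit first-order upwind, the source is linearized
# about u^n, so each cell solves (1 - dt*ds/du) du = dt*rhs locally.
import numpy as np

nx, a, dt, dx, k = 200, 1.0, 0.004, 0.01, 1.0e3   # k makes s(u) stiff (dt*k = 4)
u = np.where(np.linspace(0, 2, nx) < 0.5, 1.0, 0.0)

def s(u):            # stiff relaxation toward u = 0.5
    return -k * (u - 0.5)

def dsdu(u):         # source Jacobian (scalar here)
    return -k * np.ones_like(u)

for _ in range(100):
    flux = -a * (u - np.roll(u, 1)) / dx          # upwind for a > 0, periodic
    rhs = dt * (flux + s(u))
    du = rhs / (1.0 - dt * dsdu(u))               # point-implicit source solve
    u += du

print(f"u in [{u.min():.3f}, {u.max():.3f}] after 100 steps")
```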
Common Calibration Source for Monitoring Long-term Ozone Trends
NASA Technical Reports Server (NTRS)
Kowalewski, Matthew
2004-01-01
Accurate long-term satellite measurements are crucial for monitoring the recovery of the ozone layer. The slow pace of the recovery and the limited lifetimes of satellite monitoring instruments demand that datasets from multiple observation systems be combined to provide the long-term accuracy needed. A fundamental component of accurately monitoring long-term trends is the calibration of these various instruments. NASA's Radiometric Calibration and Development Facility at the Goddard Space Flight Center has provided resources to minimize calibration biases between multiple instruments through the use of a common calibration source and standardized procedures traceable to national standards. The Facility's 50 cm barium sulfate integrating sphere has been used as a common calibration source for both US and international satellite instruments, including the Total Ozone Mapping Spectrometer (TOMS), Solar Backscatter Ultraviolet 2 (SBUV/2) instruments, Shuttle SBUV (SSBUV), Ozone Mapping Instrument (OMI), Global Ozone Monitoring Experiment (GOME) (ESA), and Scanning Imaging SpectroMeter for Atmospheric ChartographY (SCIAMACHY) (ESA), among others. We will discuss the advantages of using a common calibration source and its effects on long-term ozone data sets. In addition, sphere calibration results from various instruments will be presented to demonstrate the accuracy of the long-term characterization of the source itself.
Watershed nitrogen and phosphorus balance: The upper Potomac River basin
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jaworski, N.A.; Groffman, P.M.; Keller, A.A.
1992-01-01
Nitrogen and phosphorus mass balances were estimated for the portion of the Potomac River basin watershed located above Washington, D.C. The total nitrogen (N) balance included seven input source terms, six sinks, and one 'change-in-storage' term, but was simplified to five input terms and three output terms. The phosphorus (P) balance had four input and three output terms. The estimated balances are based on watershed data from seven information sources. Major sources of nitrogen are animal waste and atmospheric deposition; the major sources of phosphorus are animal waste and fertilizer. The major sink for nitrogen is combined denitrification, volatilization, and change-in-storage. The major sink for phosphorus is change-in-storage. River exports of N and P were 17% and 8%, respectively, of the total N and P inputs. Over 60% of the N and P were volatilized or stored. The major input and output terms in the budget are estimated from direct measurements, but the change-in-storage term is calculated by difference. The factors regulating retention and storage processes are discussed and research needs are identified.
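The budget arithmetic, with the change-in-storage term obtained by difference, can be made concrete as below. All numbers are invented placeholders, not the study's values.

```python
# Sketch of the budget-by-difference arithmetic: with inputs and measured
# outputs known, change-in-storage (plus unmeasured losses) is the residual.
# All numbers are hypothetical, chosen only to illustrate the bookkeeping.
inputs_n = {"animal waste": 40.0, "atmospheric deposition": 30.0,
            "fertilizer": 20.0, "other": 10.0}          # kg N/ha/yr, assumed
outputs_n = {"river export": 17.0, "crop harvest": 20.0}

total_in = sum(inputs_n.values())
total_out = sum(outputs_n.values())
residual = total_in - total_out   # denitrification + volatilization + storage

print(f"inputs {total_in}, measured outputs {total_out}, "
      f"residual (by difference) {residual} kg N/ha/yr")
print(f"river export = {outputs_n['river export'] / total_in:.0%} of inputs")
```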
Oxidation of Ammonia in Source Water Using Biological Filtration (slides)
Drinking water utilities are challenged with a variety of contamination issues from both the source water and the distribution system. Source water issues include biological contaminants such as bacteria and viruses as well as inorganic contaminants such as arsenic, barium, and ...
Preamplifiers for non-contact capacitive biopotential measurements.
Peng, GuoChen; Ignjatovic, Zeljko; Bocko, Mark F
2013-01-01
Non-contact biopotential sensing is an attractive measurement strategy for a number of health monitoring applications, primarily the ECG and the EEG. In all such applications a key technical challenge is the design of a low-noise trans-impedance preamplifier for the typically low-capacitance, high source impedance sensing electrodes. In this paper, we compare voltage and charge amplifier designs in terms of their common mode rejection ratio, noise performance, and frequency response. Both amplifier types employ the same operational-transconductance amplifier (OTA), which was fabricated in a 0.35 um CMOS process. The results show that a charge amplifier configuration has advantages for small electrode-to-subject coupling capacitance values (less than 10 pF--typical of noncontact electrodes) and that the voltage amplifier configuration has advantages for electrode capacitances above 10 pF.
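The 10 pF crossover can be illustrated with first-order gain expressions: a voltage amplifier sees the electrode capacitance divided against the input capacitance, while a charge amplifier's virtual ground sets the gain by the feedback capacitor alone. Below is a sketch with assumed component values, not the paper's measured OTA parameters.

```python
# Sketch of why a charge amplifier wins at small coupling capacitance: with
# a voltage amplifier the electrode capacitance Cs divides against the input
# capacitance Cin, while a charge amplifier's gain Cs/Cf is independent of
# Cin. All component values below are illustrative assumptions.
import numpy as np

cin = 10e-12    # amplifier + parasitic input capacitance (F), assumed
cf = 1e-12      # charge-amplifier feedback capacitance (F), assumed

for cs in np.array([1, 5, 10, 50]) * 1e-12:       # electrode coupling (F)
    v_gain = cs / (cs + cin)                      # capacitive divider
    q_gain = cs / cf                              # charge amplifier
    print(f"Cs = {cs*1e12:4.0f} pF: voltage amp {v_gain:.2f}, "
          f"charge amp {q_gain:.1f}")
```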
Use of Mobile Devices to Access Resources Among Health Professions Students: A Systematic Review.
Mi, Misa; Wu, Wendy; Qiu, Maylene; Zhang, Yingting; Wu, Lin; Li, Jie
2016-01-01
This systematic review examines types of mobile devices used by health professions students, kinds of resources and tools accessed via mobile devices, and reasons for using the devices to access the resources and tools. The review included 20 studies selected from articles published in English between January 2010 and April 2015, retrieved from PubMed and other sources. Data extracted included participants, study designs, mobile devices used, mobile resources/apps accessed, outcome measures, and advantages of and barriers to using mobile devices. The review indicates significant variability across the studies in terms of research methods, types of mobile programs implemented, resources accessed, and outcomes. There were beneficial effects of using mobile devices to access resources as well as conspicuous challenges or barriers in using mobile devices.
Interstellar Flight, Imagination and Myth Creation as an Effective Means for Enduring Inspiration
NASA Astrophysics Data System (ADS)
Padowitz, G. H.
Interstellar travel to faraway star systems is humanity's most crucial mission, but we habitually focus on technological and funding challenges instead of deeply exploring the rare essence of creativity that is the source enabling us ultimately to solve all problems. Certainly, if interstellar space flight is to succeed, inspiring and maintaining global and multigenerational support is primary to long-term development. To attract and sustain such extraordinary support, the creative power of the imagination must be harnessed through independent artists. By first attracting and encouraging visionaries, it is possible that we can awaken in the public a new, invigorating sense of adventure with lasting power. Going beyond our solar system to a nearby star is in reality a mythic quest and should be treated as such.
Sex-Linked Behavior: Evolution, Stability, and Variability.
Fine, Cordelia; Dupré, John; Joel, Daphna
2017-07-29
Common understanding of human sex-linked behaviors is that proximal mechanisms of genetic and hormonal sex, ultimately shaped by the differential reproductive challenges of ancestral males and females, act on the brain to transfer sex-linked predispositions across generations. Here, we extend the debate on the role of nature and nurture in the development of traits in the lifetime of an individual, to their role in the cross-generation transfer of traits. Advances in evolutionary theory that posit the environment as a source of trans-generational stability, and new understanding of sex effects on the brain, suggest that the cross-generation stability of sex-linked patterns of behavior are sometimes better explained in terms of inherited socioenvironmental conditions, with biological sex fostering intrageneration variability. Copyright © 2017 Elsevier Ltd. All rights reserved.
Pique, Jean-Paul; Moldovan, Ioana Cristina; Fesquet, Vincent
2006-11-01
One challenge for polychromatic laser guide stars is to create a sufficiently intense source in the UV. The flux required for the measurement of differential tip-tilt is the main issue that we address. We describe a model that has been validated using on-sky data. We present a method that excites the sodium 4P_{3/2} level using one-photon excitation at 330 nm. It is more efficient than the previously suggested two-photon excitation, since its power slope flux is 3×10^{4} photons s^{-1} m^{-2} W^{-1} instead of 1.3×10^{3} photons s^{-1} m^{-2} W^{-1}. This method is very promising both in terms of flux and of system simplicity.
NASA Astrophysics Data System (ADS)
Shen, Ji; Gerard, Libby; Bowyer, Jane
2010-04-01
In this study we investigate how federal and state policy makers and school principals are working to improve science teacher quality. Interviews, focused discussions, and policy documents serve as the primary data sources. Findings suggest that both policy makers and principals prioritize increasing incentives for teachers entering the science teaching profession, providing professional development for new teachers, and using student data to evaluate and improve instruction. Differences between the two leadership groups emerged in terms of the grain size and practicality of their concerns. Our findings indicate that the complexity of the educational challenge of improving science teacher quality calls for the co-construction of policy by multiple constituent groups, including school principals, federal and state policy makers, and science education researchers.
NASA Astrophysics Data System (ADS)
Ramos, Eunice; Sridharan, Vignesh; Howells, Mark
2017-04-01
The distribution of resources in Nicaragua is not even, as is the case in many countries in the world. In the particular case of water resources, commonly used by different sectors and essential to basic human activities, availability differs across the main drainage basins and is often mismatched with sectoral demands. For example, the population is distributed unevenly, with 80% located in the water-scarce areas of the Pacific and Central regions of Nicaragua. Agricultural activities also take place in regions where water resources are vulnerable. The spatial distribution of water and energy resources, population and land use in Nicaragua allowed for the identification of three target regions for the analysis: the Pacific coast, the Dry Corridor zone, and the Atlantic region. Each of these zones has different challenges on which the CLEWs assessment focused. Water sources on the Pacific coast are mostly groundwater, and uncertainty exists about the long-term availability of such a source. This is also the region where most of the sugarcane, an important source of revenue for Nicaragua, is produced. As sugarcane needs to be irrigated, this increases the pressure on water resources. The Dry Corridor is an arid stretch in Central America cyclically affected by droughts that have a severe impact on households whose economy and subsistence depend on the agriculture of grains and coffee beans. Climate change is expected to further exacerbate the food security problem. When water is lacking, the population also experiences limited access to water for drinking and cooking. In addition, two major hydropower plants are located in this zone. Water resources are available from both surface and groundwater sources; however, due to their intensive use and vulnerability to climate, constrained availability can severely affect different sectors, presenting risks to food, water and energy security. Hydropower potential is foreseen to be exploited in the Matagalpa and Escondido River Basins draining to the Atlantic Ocean. Although competition for water resources is not as acute as in other regions, due to abundant surface water and lower population density, climate change and the use of land for grazing could present risks to the exploitation of the renewable energy potential. This could have an impact on medium- and long-term energy planning and on the ambition to decrease fuel imports for electricity generation and increase electricity access. To assess the potential implications of these challenges and provide insights on solutions where conflicts are more stringent, in line with sustainable development priorities, the CLEWs framework was used to integrate resource-system models. WEAP was used to represent the water and land-use systems, and was then soft-linked with the energy-systems model for Nicaragua, developed using the long-term energy planning tool OSeMOSYS. Hydropower expansion, the development of the electricity system, water availability for crop production, water allocation across sectors, sugarcane cultivation and the use of its by-products in electricity generation, and potential impacts of climate change are among the issues investigated with the region-specific scenarios defined for the study.
NASA Astrophysics Data System (ADS)
Morris, C. E.; Sands, D. C.; Bardin, M.; Jaenicke, R.; Vogel, B.; Leyronas, C.; Ariya, P. A.; Psenner, R.
2011-01-01
For the past 200 years, the field of aerobiology has explored the abundance, diversity, survival and transport of micro-organisms in the atmosphere. Micro-organisms have been explored as passive and severely stressed riders of atmospheric transport systems. Recently, an interest in the active roles of these micro-organisms has emerged along with proposals that the atmosphere is a global biome for microbial metabolic activity and perhaps even multiplication. As part of a series of papers on the sources, distribution and roles in atmospheric processes of biological particles in the atmosphere, here we describe the pertinence of questions relating to the potential roles that air-borne micro-organisms might play in meteorological phenomena. For the upcoming era of research on the role of air-borne micro-organisms in meteorological phenomena, one important challenge is to go beyond descriptions of abundance of micro-organisms in the atmosphere toward an understanding of their dynamics in terms of both biological and physico-chemical properties and of the relevant transport processes at different scales. Another challenge is to develop this understanding under contexts pertinent to their potential role in processes related to atmospheric chemistry, the formation of clouds, precipitation and radiative forcing. This will require truly interdisciplinary approaches involving collaborators from the biological and physical sciences, from disciplines as disparate as agronomy, microbial genetics and atmosphere physics, for example.
Kokkonen, Kaija; Rissanen, Sari; Hujala, Anneli
2012-11-08
Elderly care practice and its management, together with policy and research, play a crucial role in responding to increasing challenges in institutional care for elderly people. Successful dialogue between these is necessary. The purpose of this systematic literature review is to compare how institutional elderly care management research meets the care challenges currently emphasized in international long-term care policy documents. This paper was based on a systematic literature review. After screening 1971 abstracts using inclusion/exclusion criteria, 58 refereed articles published between 2000 and 2010 remained for analysis. The articles were analyzed using theory-based content analysis by comparing the results to a framework based on analysis of international long-term care management policy documents. The current challenges of long-term care management identified from policy documents were Integrated Care Management, Productivity Management, Quality Management, Workforce Management and ICT Management. The research on institutional elderly care management responded somewhat to the challenges mentioned in policy documents. However, some of the challenges were studied broadly while others were paid only minor attention. Further, only a few studies focused on the core items of the challenges addressed in policy documents. Institutional care management research needs to focus more on challenges in integrated care, productivity, ICT and division of labor. Managers, researchers and policy-makers should assume more active collaborative roles in the processes of research, policymaking and policy implementation. In addition, managers' and policymakers' scientific literacy needs to be enhanced.
NASA Astrophysics Data System (ADS)
Kolodny, Michael A.
2017-05-01
Today's battlefield space is extremely complex, dealing with an enemy that is neither well-defined nor well-understood. Adversaries are comprised of widely distributed, loosely networked groups engaging in nefarious activities. Situational understanding is needed by decision makers; understanding of adversarial capabilities and intent is essential. The information needed at any time depends on the mission/task at hand. Information sources potentially providing mission-relevant information are disparate and numerous; they include sensors, social networks, fusion engines, the internet, etc. Management of these multi-dimensional informational sources is critical. This paper presents a new approach to the challenge of enhancing battlefield understanding by optimizing the utilization of available informational sources (means) for required missions/tasks, as well as determining the "goodness" of the information acquired in meeting the capabilities needed. Requirements are usually expressed in terms of a presumed technology solution (e.g., imagery). A metaphor of the "magic rabbits" was conceived to remove presumed technology solutions from requirements by claiming the "required" technology is obsolete; instead, intelligent "magic rabbits" are used to provide the needed information. The question then becomes: "WHAT INFORMATION DO YOU NEED THE RABBITS TO PROVIDE YOU?" This paper describes a new approach called Mission-Informed Needed Information - Discoverable, Available Sensing Sources (MINI-DASS), a process that builds information acquisition missions and determines what the "magic rabbits" need to provide in a machine-understandable manner. Also described are the Missions and Means Framework (MMF) model used, the process flow utilized, the approach to developing an ontology of information-source means, and the approach for determining the value of the information acquired.
The big data-big model (BDBM) challenges in ecological research
NASA Astrophysics Data System (ADS)
Luo, Y.
2015-12-01
The field of ecology has become a big-data science in the past decades due to the development of new sensors used in numerous studies across the ecological community. Many sensor networks have been established to collect data. For example, satellites such as Terra and OCO-2, among others, have collected data relevant to the global carbon cycle. Thousands of field manipulative experiments have been conducted to examine feedbacks of the terrestrial carbon cycle to global changes. Networks of observations, such as FLUXNET, have measured land processes. In particular, the implementation of the National Ecological Observatory Network (NEON), which is designed to network different kinds of sensors at many locations across the nation, will generate large volumes of ecological data every day. The raw data from sensors in those networks offer an unprecedented opportunity for accelerating advances in our knowledge of ecological processes, educating teachers and students, supporting decision-making, testing ecological theory, and forecasting changes in ecosystem services. Currently, ecologists do not have the infrastructure in place to synthesize massive yet heterogeneous data into resources for decision support. It is urgent to develop an ecological forecasting system that can make the best use of multiple sources of data to assess long-term biosphere change and anticipate future states of ecosystem services at regional and continental scales. Forecasting relies on big models that describe the major processes underlying complex system dynamics. Ecological system models, despite greatly simplifying the real systems, are still complex in order to address real-world problems. For example, the Community Land Model (CLM) incorporates thousands of processes related to energy balance, hydrology, and biogeochemistry. Integration of massive data from multiple big-data sources with complex models must tackle Big Data-Big Model (BDBM) challenges. These challenges include interoperability of multiple, heterogeneous data sets; intractability of the structural complexity of big models; equifinality in model structure selection and parameter estimation; and the computational demand of global optimization with big models.
Knowledge Evolution in Distributed Geoscience Datasets and the Role of Semantic Technologies
NASA Astrophysics Data System (ADS)
Ma, X.
2014-12-01
Knowledge evolves in geoscience, and the evolution is reflected in datasets. In a context with distributed data sources, the evolution of knowledge may cause considerable challenges to data management and re-use. For example, a short news item published in 2009 (Mascarelli, 2009) revealed the geoscience community's concern that the International Commission on Stratigraphy's change to the definition of the Quaternary may bring heavy reworking of geologic maps. Now we are in the era of the World Wide Web, and geoscience knowledge is increasingly modeled and encoded in the form of ontologies and vocabularies by using semantic technologies. Accordingly, knowledge evolution leads to a consequence called ontology dynamics. Flouris et al. (2008) summarized 10 topics of general ontology changes/dynamics, such as ontology mapping, morphism, evolution, debugging and versioning, etc. Ontology dynamics has impacts at several stages of a data life cycle and causes challenges such as requests for reworking of the extant data in a data center, semantic mismatch among data sources, differentiated understanding of the same dataset between data providers and data users, and error propagation in cross-discipline data discovery and re-use (Ma et al., 2014). This presentation will analyze the best practices in the geoscience community so far and summarize a few recommendations to reduce the negative impacts of ontology dynamics in a data life cycle, including: communities of practice and collaboration on ontology and vocabulary building, linking data records to standardized terms, and methods for (semi-)automatic reworking of datasets using semantic technologies. References: Flouris, G., Manakanatas, D., Kondylakis, H., Plexousakis, D., Antoniou, G., 2008. Ontology change: classification and survey. The Knowledge Engineering Review 23 (2), 117-152. Ma, X., Fox, P., Rozell, E., West, P., Zednik, S., 2014. Ontology dynamics in a data life cycle: Challenges and recommendations from a Geoscience Perspective. Journal of Earth Science 25 (2), 407-412. Mascarelli, A.L., 2009. Quaternary geologists win timescale vote. Nature 459, 624.
Jabbar, Ahmed Najah
2018-04-13
This letter suggests two new types of asymmetrical higher-order kernels (HOK) that are generated using the orthogonal polynomials Laguerre (positive or right skew) and Bessel (negative or left skew). These skewed HOK are implemented in the blind source separation/independent component analysis (BSS/ICA) algorithm. The tests for these proposed HOK are accomplished using three scenarios to simulate a real environment using actual sound sources, an environment of mixtures of multimodal fast-changing probability density function (pdf) sources that represent a challenge to the symmetrical HOK, and an environment of an adverse case (near gaussian). The separation is performed by minimizing the mutual information (MI) among the mixed sources. The performance of the skewed kernels is compared to the performance of the standard kernels such as Epanechnikov, bisquare, trisquare, and gaussian and the performance of the symmetrical HOK generated using the polynomials Chebyshev1, Chebyshev2, Gegenbauer, Jacobi, and Legendre to the tenth order. The gaussian HOK are generated using the Hermite polynomial and the Wand and Schucany procedure. The comparison among the 96 kernels is based on the average intersymbol interference ratio (AISIR) and the time needed to complete the separation. In terms of AISIR, the skewed kernels' performance is better than that of the standard kernels and rivals most of the symmetrical kernels' performance. The importance of these new skewed HOK is manifested in the environment of the multimodal pdf mixtures. In such an environment, the skewed HOK come in first place compared with the symmetrical HOK. These new families can substitute for symmetrical HOKs in such applications.
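A flavor of the construction can be sketched by building a right-skewed kernel from the Laguerre weight e^{-x} and a low-order Laguerre polynomial, then using it in a one-sided kernel density estimate. This is a toy in the spirit of the letter, not its exact HOK recipe; true higher-order kernels admit negative lobes, which the squared form below deliberately avoids for simplicity.

```python
# Sketch: a right-skewed kernel from the Laguerre weight e^{-x} on [0, inf)
# and a low-order Laguerre polynomial, applied in a one-sided KDE. The order,
# squaring and normalization are assumptions, not the letter's HOK recipe.
import numpy as np
from scipy.special import eval_laguerre

def laguerre_kernel(x, order=2):
    k = eval_laguerre(order, x) ** 2 * np.exp(-x)   # nonnegative, right-skewed
    return np.where(x >= 0, k, 0.0)

# normalize numerically so the kernel integrates to 1
grid = np.linspace(0, 40, 4001)
norm = np.trapz(laguerre_kernel(grid), grid)

def kde(x_eval, samples, h=0.5):
    u = (x_eval[:, None] - samples[None, :]) / h
    return laguerre_kernel(u).sum(axis=1) / (norm * h * samples.size)

samples = np.random.default_rng(5).exponential(1.0, 500)
x = np.linspace(0, 6, 61)
print(kde(x, samples)[:5])
```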
Design of 1 MHz Solid State High Frequency Power Supply
NASA Astrophysics Data System (ADS)
Parmar, Darshan; Singh, N. P.; Gajjar, Sandip; Thakar, Aruna; Patel, Amit; Raval, Bhavin; Dhola, Hitesh; Dave, Rasesh; Upadhay, Dishang; Gupta, Vikrant; Goswami, Niranjan; Mehta, Kush; Baruah, Ujjwal
2017-04-01
High Frequency Power Supply (HFPS) is used for various applications such as AM transmitters, metallurgical applications, wireless power transfer, RF ion sources, etc. The ion source for the Neutral Beam Injector at ITER-India uses an inductively coupled power source at high frequency (∼1 MHz). A switching-converter-based topology for generating the 1 MHz sinusoidal output is expected to offer advantages in efficiency and reliability compared with traditional oscillators based on RF tetrode tubes. In terms of power electronics, thermal and power coupling issues are the major challenges at such a high frequency. A conceptual design for a 200 kW, 1 MHz power supply and a prototype design for a 600 W source have been completed. The prototype design uses a Class-E amplifier topology in which a MOSFET is switched resonantly. The prototype uses two low power modules and a ferrite combiner to add the voltage and power at the output. Subsequently, a solution with a Class-D H-bridge configuration has been evaluated through simulation; there the module design is more stable, as the switching devices do not participate in resonance, and the switching device voltage rating is substantially reduced. The rating of the modules is essentially driven by the maximum power handling capacity of the MOSFETs and the ferrites in the combiner circuit. The output passive network, comprising a resonance-tuned network and an impedance matching network, respectively provides soft switching and matches the load impedance to 50 Ω. This paper describes the conceptual design of a 200 kW high frequency power supply and experimental results of the prototype 600 W, 1 MHz source.
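The resonant sizing behind such a design reduces to the series-resonance relation f0 = 1/(2π√(LC)). A back-of-envelope sketch with assumed tank values (hypothetical, not the ITER-India design):

```python
# Back-of-envelope sketch (illustrative values only): pick the resonant
# capacitance for a series-resonant output network at f0 = 1 MHz, given an
# assumed tank inductance, from f0 = 1/(2*pi*sqrt(L*C)).
import math

f0 = 1.0e6           # target resonant frequency, Hz
L = 10e-6            # assumed tank inductance, H (hypothetical)
C = 1.0 / ((2 * math.pi * f0) ** 2 * L)
Q = 2 * math.pi * f0 * L / 50.0   # loaded Q into the 50-ohm matched load
print(f"C = {C * 1e9:.2f} nF, loaded Q = {Q:.2f}")
```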
Wilson, Andrew M; Sims, Erika J; Orr, Linda C; Robb, Fiona; Lipworth, Brian J
2003-01-01
Aims To evaluate the role of AMP nasal challenge as a measure of short-term treatment response in patients receiving intranasal corticosteroids. Adenosine monophosphate (AMP) challenge has been shown to be a good inflammatory surrogate in the lower airways, but it has not been properly evaluated as a nasal challenge test. Methods Fourteen patients with perennial allergic rhinitis (PAR) received 2 weeks' treatment with placebo (PL) or 200 µg intranasal mometasone furoate (MF) once daily in a randomized, single-blind, crossover study. AMP (25–800 mg ml⁻¹) and histamine (0.25–8 mg ml⁻¹) nasal challenge testing was performed after each treatment period, with the endpoint defined as a 30% decrease in minimal cross-sectional area (MCA). Domiciliary symptom data were collected. Results There was a significant (P < 0.05) improvement in PC30 MCA and nasal volume with AMP but not with histamine comparing MF vs PL. This amounted to a 2.8 (95% CI 1.5, 4.0) and 0.7 (95% CI −0.5, 1.9) doubling-dose change for AMP and histamine challenges, respectively. There were also significant (P < 0.05) improvements in nasal symptoms and quality of life. Conclusions AMP nasal challenge using acoustic rhinometry may be a useful test to assess short-term treatment response in patients with PAR. PMID:12680883
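As a reminder of the arithmetic behind the headline figure, a doubling-dose change is the log2 ratio of provocative concentrations; a toy check with invented PC30 values (not the trial's individual data):

```python
# Illustrative arithmetic only: a "doubling-dose" shift is the log2 ratio of
# provocative concentrations (PC30) on active treatment vs placebo. The PC30
# values below are made up, chosen to reproduce the reported 2.8-dd shift.
import math

pc30_placebo = 100.0                 # mg/ml, hypothetical
pc30_mf = 100.0 * 2 ** 2.8           # hypothetical, consistent with 2.8 dd
shift = math.log2(pc30_mf / pc30_placebo)
print(f"doubling-dose shift = {shift:.1f}")   # 2.8
```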
Powering nanorobotic devices: challenges and future strategies
NASA Astrophysics Data System (ADS)
Sankar, Krishna Moorthi
2014-04-01
Nanotechnology, even 55 years after its foundation (Richard Feynman's 1959 speech, 'There's Plenty of Room at the Bottom'), is still in its infancy. However, of late there has been a large increase in the research being done in this field at many prominent universities and research institutions across the globe. Nanorobotics is the combination of nanotechnology and the science of robotics, to create robots that are only a few nanometres (10⁻⁹ metres) in size. Nanobots are yet to be built, but at the current pace of research, scientists predict that they will become a reality within the next ten years. The main proposed function of nanobots is in the medical field, to interact with cells or intra-cellular substances and prevent or reverse structural and genetic problems and diseases. One of the major challenges in creating a nanobot to travel through the human body is powering it. Nanobots would require a very small yet highly potent source of energy. Many energy sources have been hypothesised for nanobots, either already available naturally within the human body or to be supplied externally. All of these energy sources, however, pose challenges that need to be addressed if they are to power nanobots. These challenges can be overcome through a number of strategies aimed at an economically, ecologically and medically viable energy source.
Searching for randomized controlled trials and systematic reviews on exercise. A descriptive study.
Grande, Antonio José; Hoffmann, Tammy; Glasziou, Paul
2015-01-01
The current paradigm of science is to accumulate as much research data as possible, with less thought given to navigation or synthesis of the resulting mass, which hampers locating and using the research. The aim here was to describe the number of randomized controlled trials (RCTs) and systematic reviews (SRs) focusing on exercise, and their journal sources, that have been indexed in PubMed over time. Descriptive study conducted at Bond University, Australia. To find RCTs, a search was conducted in PubMed Clinical Queries, using the category "Therapy" and the Medical Subject Headings (MeSH) term "Exercise". To find SRs, a search was conducted in PubMed Clinical Queries, using the category "Therapy", the MeSH term "Exercise" and various methodological filters. Up until 2011, 9,354 RCTs about exercise had been published in 1,250 journals, and 1,262 SRs in 513 journals. Journals in the area of Sports Science published the greatest number of RCTs, and journals categorized as belonging to the "Other health professions" area (for example, nursing or psychology) published the greatest number of SRs. The Cochrane Database of Systematic Reviews was the principal source for SRs, with 9.8% of the total, while the Journal of Strength and Conditioning Research and Medicine & Science in Sports & Exercise published 4.4% and 5.0% of the RCTs, respectively. The rapid growth and resulting scatter of RCTs and SRs on exercise present challenges for locating and using this research. Solutions for this issue need to be considered.
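A search of this general shape can be reproduced programmatically; a sketch using Biopython's Entrez module (the query string is an approximation of the strategy described, not the authors' exact filter set):

```python
# Sketch of a PubMed query like the one described: the "Exercise" MeSH term
# with a randomized-controlled-trial publication-type filter, via Biopython.
from Bio import Entrez

Entrez.email = "you@example.org"   # required by NCBI; placeholder address
handle = Entrez.esearch(
    db="pubmed",
    term='"Exercise"[MeSH Terms] AND randomized controlled trial[Publication Type]',
    retmax=20,
)
record = Entrez.read(handle)
handle.close()
print(record["Count"], record["IdList"][:5])
```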
NASA Astrophysics Data System (ADS)
Bürmen, Miran; Pernuš, Franjo; Likar, Boštjan
2010-02-01
Near-infrared spectroscopy is a promising, rapidly developing, reliable and noninvasive technique, used extensively in biomedicine and in the pharmaceutical industry. With the introduction of acousto-optic tunable filters (AOTF) and highly sensitive InGaAs focal plane sensor arrays, real-time high resolution hyper-spectral imaging has become feasible for a number of new biomedical in vivo applications. However, due to the specificity of the AOTF technology and the lack of spectral calibration standardization, maintaining long-term stability and compatibility of the acquired hyper-spectral images across different systems is still a challenging problem. Efficiently solving both is essential, as the majority of methods for analysis of hyper-spectral images rely on a priori knowledge extracted from large spectral databases, which serve as the basis for reliable qualitative or quantitative analysis of various biological samples. In this study, we propose and evaluate fast and reliable spectral calibration of hyper-spectral imaging systems in the short wavelength infrared spectral region. The proposed spectral calibration method is based on light sources or materials exhibiting distinct spectral features, which enable robust non-rigid registration of the acquired spectra. The calibration accounts for all the components of a typical hyper-spectral imaging system, such as the AOTF, light source, lens and optical fibers. The obtained results indicated that practical, fast and reliable spectral calibration of hyper-spectral imaging systems is possible, thereby assuring long-term stability and inter-system compatibility of the acquired hyper-spectral images.
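One simplified ingredient of such a calibration can be sketched as follows: estimate the wavelength offset between a measured spectrum and a reference that shares a distinct spectral feature, then resample. This is a rigid-shift toy on synthetic data; the paper's method performs non-rigid registration:

```python
# Toy calibration step (assumed, simplified): estimate a global wavelength
# shift via cross-correlation against a reference spectrum with a distinct
# feature, then resample onto the reference axis.
import numpy as np

def estimate_shift(measured, reference, dl):
    """Rigid shift (in wavelength units) maximizing the cross-correlation."""
    m = measured - measured.mean()
    r = reference - reference.mean()
    lag = np.argmax(np.correlate(m, r, mode="full")) - (len(r) - 1)
    return lag * dl

wl = np.linspace(1000.0, 1700.0, 701)             # nm grid, 1 nm step
ref = np.exp(-0.5 * ((wl - 1450.0) / 8.0) ** 2)   # synthetic spectral feature
meas = np.exp(-0.5 * ((wl - 1453.0) / 8.0) ** 2)  # same feature, shifted 3 nm
shift = estimate_shift(meas, ref, dl=wl[1] - wl[0])
calibrated = np.interp(wl, wl - shift, meas)      # resample onto reference axis
print(f"estimated shift: {shift:.1f} nm")
```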
Ambient ultrafine particle levels at residential and reference sites in urban and rural Switzerland.
Meier, Reto; Eeftens, Marloes; Aguilera, Inmaculada; Phuleria, Harish C; Ineichen, Alex; Davey, Mark; Ragettli, Martina S; Fierz, Martin; Schindler, Christian; Probst-Hensch, Nicole; Tsai, Ming-Yi; Künzli, Nino
2015-03-03
Although there is evidence that ultrafine particles (UFP) do affect human health, there are currently no legal ambient standards. The main reasons are the absence of spatially resolved exposure data to investigate long-term health effects, and the challenge of defining representative reference sites for monitoring given the high dependence of UFP on proximity to sources. The objectives of this study were to evaluate the spatial distribution of UFP in four areas of the Swiss Study on Air Pollution and Lung and Heart Diseases in Adults (SAPALDIA) and to investigate the representativeness of routine air monitoring stations for residential sites in these areas. Repeated UFP measurements during three seasons were conducted at a total of 80 residential sites and four area-specific reference sites over a median duration of 7 days. Arithmetic mean residential particle number concentrations (PNC) scattered around a median of 10,800 particles/cm³ (interquartile range [IQR] = 7,800 particles/cm³). Spatial within-area contrasts (90th/10th percentile ratios) were around two; increased contrasts were observed during weekday rush-hours. Temporal UFP patterns were comparable at reference and residential sites in all areas. Our data show that central monitoring sites can represent residential conditions when locations are well chosen with respect to the local sources, namely traffic. For epidemiological research, locally resolved spatial models are needed to estimate individuals' long-term exposures to UFP of outdoor origin at home, during commutes and at work.
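The quoted summary statistics are simple to compute; a sketch on simulated site means (the distribution parameters are invented, chosen only to echo the reported median):

```python
# Summary statistics of the kind quoted above, on made-up site means
# (the SAPALDIA measurements themselves are not reproduced here).
import numpy as np

rng = np.random.default_rng(0)
site_means = rng.lognormal(mean=np.log(10800), sigma=0.45, size=80)  # /cm3

median = np.median(site_means)
iqr = np.percentile(site_means, 75) - np.percentile(site_means, 25)
contrast = np.percentile(site_means, 90) / np.percentile(site_means, 10)
print(f"median={median:.0f}/cm3, IQR={iqr:.0f}/cm3, 90th/10th={contrast:.1f}")
```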
Harnessing extracellular vesicles to direct endochondral repair of large bone defects
Ferreira, E.
2018-01-01
Large bone defects remain a tremendous clinical challenge. There is growing evidence in support of treatment strategies that direct defect repair through an endochondral route, involving a cartilage intermediate. While culture-expanded stem/progenitor cells are being evaluated for this purpose, these cells would compete with endogenous repair cells for limited oxygen and nutrients within ischaemic defects. Alternatively, it may be possible to employ extracellular vesicles (EVs) secreted by culture-expanded cells for overcoming key bottlenecks to endochondral repair, such as defect vascularization, chondrogenesis, and osseous remodelling. While mesenchymal stromal/stem cells are a promising source of therapeutic EVs, other donor cells should also be considered. The efficacy of an EV-based therapeutic will likely depend on the design of companion scaffolds for controlled delivery to specific target cells. Ultimately, the knowledge gained from studies of EVs could one day inform the long-term development of synthetic, engineered nanovesicles. In the meantime, EVs harnessed from in vitro cell culture have near-term promise for use in bone regenerative medicine. This narrative review presents a rationale for using EVs to improve the repair of large bone defects, highlights promising cell sources and likely therapeutic targets for directing repair through an endochondral pathway, and discusses current barriers to clinical translation. Cite this article: E. Ferreira, R. M. Porter. Harnessing extracellular vesicles to direct endochondral repair of large bone defects. Bone Joint Res 2018;7:263–273. DOI: 10.1302/2046-3758.74.BJR-2018-0006. PMID:29922444
Scientific and technical challenges on the road towards fusion electricity
NASA Astrophysics Data System (ADS)
Donné, A. J. H.; Federici, G.; Litaudon, X.; McDonald, D. C.
2017-10-01
The goal of the European Fusion Roadmap is to deliver fusion electricity to the grid early in the second half of this century. It breaks the quest for fusion energy into eight missions, and for each of them it describes a research and development programme to address all the open technical gaps in physics and technology and estimates the required resources. It points out the need to intensify industrial involvement and to seek all opportunities for collaboration outside Europe. The roadmap covers three periods: the short term, which runs parallel to the European Research Framework Programme Horizon 2020, the medium term and the long term. ITER is the key facility of the roadmap, as it is expected to achieve most of the important milestones on the path to fusion power. Thus, the vast majority of present resources are dedicated to ITER and its accompanying experiments. The medium term is focussed on taking ITER into operation and bringing it to full power, as well as on preparing the construction of a demonstration power plant, DEMO, which will for the first time demonstrate fusion electricity to the grid around the middle of this century. Building and operating DEMO is the subject of the last roadmap phase: the long term. Clearly, the Fusion Roadmap is tightly connected to the ITER schedule. Three key milestones are the first operation of ITER, the start of DT operation in ITER, and reaching the full performance at which the thermal fusion power is 10 times the power put into the plasma. The Engineering Design Activity of DEMO needs to start a few years after the first ITER plasma, while the start of the construction phase will be a few years after ITER reaches full performance. In this way ITER can give viable input to the design and development of DEMO. Because the neutron fluence in DEMO will be much higher than in ITER, it is important to develop and validate materials that can handle these very high neutron loads. For the testing of the materials, a dedicated 14 MeV neutron source is needed. This DEMO Oriented Neutron Source (DONES) is therefore an important facility to support the fusion roadmap.
Efficient Development of High Fidelity Structured Volume Grids for Hypersonic Flow Simulations
NASA Technical Reports Server (NTRS)
Alter, Stephen J.
2003-01-01
A new technique for the control of grid line spacing and intersection angles of a structured volume grid, using elliptic partial differential equations (PDEs), is presented. Existing structured grid generation algorithms make use of source term hybridization to provide control of grid lines, imposing orthogonality implicitly at the boundary and explicitly on the interior of the domain. A bridging function between the two types of grid line control is typically used to blend the different orthogonality formulations. It is shown that utilizing such a bridging function with source term hybridization can result in excessive use of computational resources and diminished robustness. A new approach, Anisotropic Lagrange Based Trans-Finite Interpolation (ALBTFI), is offered as a replacement for source term hybridization. The ALBTFI technique captures the essence of the desired grid controls while improving the convergence rate of the elliptic PDEs when compared with source term hybridization. Grid generation on a blunt cone and a Shuttle Orbiter is used to demonstrate and assess the ALBTFI technique, which is shown to be as much as 50% faster, more robust, and to produce higher quality grids than source term hybridization.
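To make the elliptic-grid vocabulary concrete, here is a minimal sketch of the classical pipeline that methods like ALBTFI build upon: transfinite interpolation for the initial grid, followed by Jacobi sweeps of the zero-source-term (Laplace) smoother. The geometry is invented, and neither source term hybridization nor the ALBTFI control terms are reproduced:

```python
# Classical structured-grid pipeline (illustrative, not ALBTFI): TFI initial
# grid from four boundary curves, then Laplace smoothing of interior nodes.
import numpy as np

def tfi(bottom, top, left, right):
    """2D transfinite interpolation; boundary curves have shape (n, 2)."""
    ni, nj = len(bottom), len(left)
    u = np.linspace(0, 1, ni)[:, None, None]
    v = np.linspace(0, 1, nj)[None, :, None]
    return ((1 - v) * bottom[:, None, :] + v * top[:, None, :]
            + (1 - u) * left[None, :, :] + u * right[None, :, :]
            - (1 - u) * (1 - v) * bottom[0] - u * (1 - v) * bottom[-1]
            - (1 - u) * v * top[0] - u * v * top[-1])   # shape (ni, nj, 2)

def laplace_smooth(g, sweeps=50):
    """Jacobi sweeps of the zero-source-term elliptic (Laplace) system."""
    for _ in range(sweeps):
        g[1:-1, 1:-1] = 0.25 * (g[2:, 1:-1] + g[:-2, 1:-1]
                                + g[1:-1, 2:] + g[1:-1, :-2])
    return g

ni, nj = 21, 11
s, t = np.linspace(0, 1, ni), np.linspace(0, 1, nj)
bottom = np.stack([s, 0.15 * np.sin(np.pi * s)], axis=1)   # curved lower wall
top = np.stack([s, np.ones_like(s)], axis=1)
left = np.stack([np.zeros_like(t), t], axis=1)
right = np.stack([np.ones_like(t), t], axis=1)
grid = laplace_smooth(tfi(bottom, top, left, right))
print(grid.shape)   # (21, 11, 2)
```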
BWR ASSEMBLY SOURCE TERMS FOR WASTE PACKAGE DESIGN
DOE Office of Scientific and Technical Information (OSTI.GOV)
T.L. Lotz
1997-02-15
This analysis is prepared by the Mined Geologic Disposal System (MGDS) Waste Package Development Department (WPDD) to provide boiling water reactor (BWR) assembly radiation source term data for use during Waste Package (WP) design. The BWR assembly radiation source terms are to be used for evaluation of radiolysis effects at the WP surface, and for personnel shielding requirements during assembly or WP handling operations. The objectives of this evaluation are to generate BWR assembly radiation source terms that bound selected groupings of BWR assemblies, with regard to assembly average burnup and cooling time, which comprise the anticipated MGDS BWR commercial spent nuclear fuel (SNF) waste stream. The source term data is to be provided in a form which can easily be utilized in subsequent shielding/radiation dose calculations. Since these calculations may also be used for Total System Performance Assessment (TSPA), with appropriate justification provided by TSPA, or radionuclide release rate analysis, the grams of each element and additional cooling times out to 25 years will also be calculated and the data included in the output files.
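The extension to longer cooling times rests, nuclide by nuclide, on radioactive decay, A(t) = A0·exp(−ln 2 · t/T½). A single-nuclide toy follows (the inventory value is a placeholder; real source-term work sums full isotopic inventories from depletion/decay codes):

```python
# Hedged illustration of extending source terms to longer cooling times:
# exponential decay of a single nuclide's activity. A0 is a placeholder.
import math

A0 = 1.0e15          # assumed activity at discharge, Bq (hypothetical)
half_life = 30.1     # Cs-137 half-life, years (approximate)
for t in (5, 10, 15, 20, 25):           # cooling times out to 25 years
    A = A0 * math.exp(-math.log(2) * t / half_life)
    print(f"t = {t:2d} y: A = {A:.3e} Bq")
```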
Management challenges faced by managers of New Zealand long-term care facilities.
Madas, E; North, N
2000-01-01
This article reports on a postal survey of 78 long-term care managers in one region of New Zealand, of whom 45 (58%) responded. Most long-term care managers (73.2%) were middle-aged females holding nursing but not management qualifications. Most long-term care facilities (69%) tended to be stand-alone facilities providing a single type of care (rest home or continuing care hospital). The most prominent issues facing managers were considered to be inadequate funding to match the growing costs of providing long-term care and occupancy levels. Managers believed that political/regulatory, economic and social factors influenced these issues. Despite a turbulent health care environment and the challenges facing managers, long-term care managers reported they were coping well and valued networking.
Thavamani, Palanisami; Megharaj, Mallavarapu; Naidu, Ravi
2012-11-01
Bioremediation of polyaromatic hydrocarbon (PAH) contaminated soils in the presence of heavy metals has proved to be difficult and often challenging, owing to the ability of toxic metals to inhibit PAH degradation by bacteria. In this study, a mixed bacterial culture designated consortium-5 was isolated from a former manufactured gas plant (MGP) site. The ability of this consortium to utilise high-molecular-weight (HMW) PAHs such as pyrene and benzo[a]pyrene (BaP) as a sole carbon source in the presence of the toxic metal Cd was demonstrated. Furthermore, the consortium proved effective in degrading HMW PAHs even in real long-term contaminated MGP soil. Thus, the results of this study demonstrate the great potential of this consortium for field-scale bioremediation of PAHs in long-term mixed-contaminated soils such as those at MGP sites. To our knowledge, this is the first study to isolate and characterize a metal-tolerant, HMW PAH-degrading bacterial consortium showing such potential for bioremediation of mixed-contaminated soils.
A Microarray Tool Provides Pathway and GO Term Analysis.
Koch, Martin; Royer, Hans-Dieter; Wiese, Michael
2011-12-01
Analysis of gene expression profiles is no longer exclusively a task for bioinformatics experts. However, obtaining statistically significant results is challenging and requires both biological knowledge and computational know-how. Here we present a novel, user-friendly microarray reporting tool called maRt. The software provides access to bioinformatic resources, such as gene ontology terms and biological pathways, through the DAVID and BioMart web services. Results are summarized in structured HTML reports, each presenting a different layer of information. In these reports, content from diverse sources is integrated and interlinked. To speed up processing, maRt takes advantage of the multi-core technology of modern desktop computers by using parallel processing. Since the software is built upon an RCP infrastructure, it may serve as a starting point for developers aiming to integrate novel R-based applications. The installer, documentation and various tutorials are available under the LGPL license at the website of our institute, http://www.pharma.uni-bonn.de/www/mart. This software is free for academic use. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
False alarms, real challenges--one university's communication response to the 2001 anthrax crisis.
Clarke, Christopher E; Chess, Caron
2006-01-01
Considerable research exists on how government agencies at the federal, state, and local levels communicated during the fall 2001 anthrax attacks. However, there is little research on how other institutions handled this crisis, in terms of their response to potential anthrax contamination (aka "white powder scares") and their approach to disseminating important health and safety information. In this article, we investigate a major university's communication response to the anthrax crisis. First, we describe its communication experiences relating to a large white powder scare that occurred in October 2001. Second, we describe the university's broader communication efforts in terms of several important elements of risk communication research, including influence of source attributes, key messages, preferred channels, responses to information requests, and organizational influences. This study underlines that an institution does not have to be directly affected by a crisis to find itself on the communication "front lines." Moreover, other institutions may find it useful to learn from the experiences of this university, so that they may communicate more effectively during future crises.
The intergenerational effects of war on the health of children.
Devakumar, Delan; Birch, Marion; Osrin, David; Sondorp, Egbert; Wells, Jonathan C K
2014-04-02
The short- and medium-term effects of conflict on population health are reasonably well documented. Less considered are its consequences across generations and potential harms to the health of children yet to be born. Looking first at the nature and effects of exposures during conflict, and then at the potential routes through which harm may propagate within families, we consider the intergenerational effects of four features of conflict: violence, challenges to mental health, infection and malnutrition. Conflict-driven harms are transmitted through a complex permissive environment that includes biological, cultural and economic factors, and feedback loops between sources of harm and weaknesses in individual and societal resilience to them. We discuss the multiplicative effects of ongoing conflict when hostilities are prolonged. We summarize many instances in which the effects of war can propagate across generations. We hope that the evidence laid out in the article will stimulate research and, more importantly, contribute to the discussion of the costs of war, particularly in the longer term in post-conflict situations in which interventions need to be sustained and adapted over many years.
The lightest organic radical cation for charge storage in redox flow batteries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Jinhua; Pan, Baofei; Duan, Wentao
2016-08-25
Electrochemically reversible fluids of high energy density are promising materials for capturing the electrical energy generated from intermittent sources like solar and wind. To meet this technological challenge there is a need to understand the fundamental limits and interplay of electrochemical potential, stability and solubility in "lean" derivatives of redox-active molecules. Here we describe the process of molecular pruning, illustrated for 2,5-di-tert-butyl-1,4-bis(2-methoxyethoxy)benzene, a molecule known to produce a persistently stable, high-potential radical cation. By systematically shedding molecular fragments considered important for radical cation steric stabilization, we discovered a minimalistic structure that retains long-term stability in its oxidized form. Interestingly, we find the tert-butyl groups are unnecessary; high stability of the radical cation and high solubility are both realized in derivatives having appropriately positioned arene methyl groups. These stability trends are rationalized by mechanistic considerations of the postulated decomposition pathways. We suggest that the molecular pruning approach will uncover lean redox-active derivatives for electrochemical energy storage, leading to materials with long-term stability and high intrinsic capacity.
Towards Personal Exposures: How Technology Is Changing Air Pollution and Health Research.
Larkin, A; Hystad, P
2017-12-01
We present a review of emerging technologies and how these can transform personal air pollution exposure assessment and subsequent health research. Estimating personal air pollution exposures is currently split broadly into methods for modeling exposures for large populations versus measuring exposures for small populations. Air pollution sensors, smartphones, and air pollution models capitalizing on big/new data sources offer tremendous opportunity for unifying these approaches and improving long-term personal exposure prediction at scales needed for population-based research. A multi-disciplinary approach is needed to combine these technologies to not only estimate personal exposures for epidemiological research but also determine drivers of these exposures and new prevention opportunities. While available technologies can revolutionize air pollution exposure research, ethical, privacy, logistical, and data science challenges must be met before widespread implementations occur. Available technologies and related advances in data science can improve long-term personal air pollution exposure estimates at scales needed for population-based research. This will advance our ability to evaluate the impacts of air pollution on human health and develop effective prevention strategies.
Source Finding in the Era of the SKA (Precursors): Aegean 2.0
NASA Astrophysics Data System (ADS)
Hancock, Paul J.; Trott, Cathryn M.; Hurley-Walker, Natasha
2018-03-01
In the era of the SKA precursors, telescopes are producing deeper, larger images of the sky on increasingly small time-scales. The greater size and volume of images place an increased demand on the software that we use to create catalogues, and so our source finding algorithms need to evolve accordingly. In this paper, we discuss some of the logistical and technical challenges that result from the increased size and volume of images that are to be analysed, and demonstrate how the Aegean source finding package has evolved to address these challenges. In particular, we address the issues of source finding on spatially correlated data, and on images in which the background, noise, and point spread function vary across the sky. We also introduce the concept of forced or prioritised fitting.
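One of the issues named above, a background and noise level that vary across the image, can be illustrated with a simple box-wise, sigma-clipped estimator (an assumed toy, not Aegean's actual BANE algorithm):

```python
# Toy spatially varying background/noise estimation: sigma-clipped statistics
# in coarse boxes, upsampled to full resolution, then a 5-sigma threshold.
import numpy as np

def clipped_stats(vals, kappa=3.0, iters=3):
    vals = vals[np.isfinite(vals)]
    for _ in range(iters):
        med, std = np.median(vals), np.std(vals)
        vals = vals[np.abs(vals - med) < kappa * std]
    return np.median(vals), np.std(vals)

def background_noise_maps(img, box=64):
    ny, nx = img.shape
    gy, gx = ny // box, nx // box
    bkg, rms = np.zeros((gy, gx)), np.zeros((gy, gx))
    for j in range(gy):
        for i in range(gx):
            cell = img[j*box:(j+1)*box, i*box:(i+1)*box]
            bkg[j, i], rms[j, i] = clipped_stats(cell)
    # nearest-neighbour upsampling; a real implementation interpolates smoothly
    return (np.kron(bkg, np.ones((box, box)))[:ny, :nx],
            np.kron(rms, np.ones((box, box)))[:ny, :nx])

rng = np.random.default_rng(1)
image = rng.normal(0.0, 1.0, (256, 256))
image[100:105, 100:105] += 50.0                 # a bright "source"
bkg_map, rms_map = background_noise_maps(image)
detections = image > bkg_map + 5.0 * rms_map    # simple 5-sigma threshold
print(detections.sum(), "pixels above 5 sigma")
```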
The UAB Informatics Institute and 2016 CEGS N-GRID de-identification shared task challenge.
Bui, Duy Duc An; Wyatt, Mathew; Cimino, James J
2017-11-01
Clinical narratives (the text notes found in patients' medical records) are important information sources for secondary use in research. However, in order to protect patient privacy, they must be de-identified prior to use. Manual de-identification is considered the gold standard approach but is tedious, expensive, slow, and impractical for use with large-scale clinical data. Automated or semi-automated de-identification using computer algorithms is a potentially promising alternative. The Informatics Institute of the University of Alabama at Birmingham is applying de-identification to clinical data drawn from the UAB hospital's electronic medical records system before releasing them for research. We participated in the de-identification regular track of the shared task challenge organized by the Centers of Excellence in Genomic Science (CEGS) Neuropsychiatric Genome-Scale and RDoC Individualized Domains (N-GRID) to gain experience developing our own automatic de-identification tool. We focused on the popular and successful methods from previous challenges: rule-based, dictionary-matching, and machine-learning approaches. We also explored new techniques, such as disambiguation rules and term ambiguity measurement, and used a multi-pass sieve framework at a micro level. For the challenge's primary measure (strict entity), our submissions achieved competitive results (f-measures: 87.3%, 87.1%, and 86.7%). For our preferred measure (binary token HIPAA), our submissions achieved superior results (f-measures: 93.7%, 93.6%, and 93%). With those encouraging results, we gained the confidence to improve and use the tool for the real de-identification task at the UAB Informatics Institute. Copyright © 2017 Elsevier Inc. All rights reserved.
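A toy sketch of the rule-based and dictionary-matching ingredients named above (the patterns, dictionary, and note text are all invented; a real system layers many more rules plus machine-learned taggers):

```python
# Toy de-identification: regex rules for dates/phones plus a name dictionary,
# each match replaced by a category placeholder. Illustrative only.
import re

NAME_DICT = {"john doe", "jane roe"}           # hypothetical dictionary
RULES = [
    ("DATE", re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b")),
    ("PHONE", re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b")),
]

def deidentify(text: str) -> str:
    for label, rx in RULES:                    # rule-based pass
        text = rx.sub(f"[{label}]", text)
    for name in NAME_DICT:                     # dictionary-matching pass
        text = re.sub(re.escape(name), "[NAME]", text, flags=re.IGNORECASE)
    return text

note = "Seen 03/14/2016. John Doe, call 205-555-0123 for follow-up."
print(deidentify(note))
# Seen [DATE]. [NAME], call [PHONE] for follow-up.
```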
Air Force Research Laboratory’s Focused Long Term Challenges
2008-04-01
The Air Force Research Laboratory (AFRL) mission is to provide support to the Air Force (AF) and the warfighters with... Leo J Rose, Munitions Directorate, Air Force Research Laboratory, 101 W Eglin Blvd... This technology vision, which was born in our Air Force Research Laboratory, builds on the Air Force's traditional kill
USDA-ARS?s Scientific Manuscript database
Vitamin E, a major natural antioxidant, has previously been shown to attenuate the pro-inflammatory response to immune challenge in cattle. Our objective was to evaluate the effect of short-term treatment with alpha-tocopherol in newborn calves on selected elements of the pro-inflammatory response to LPS...
Gulati, Karan; Ivanovski, Sašo
2017-08-01
The transmucosal nature of dental implants presents a unique therapeutic challenge, requiring not only rapid establishment and subsequent maintenance of osseointegration, but also the formation of resilient soft tissue integration. Key challenges in achieving long-term success are sub-optimal bone integration in compromised bone conditions and impaired transmucosal tissue integration in the presence of a persistent oral microbial biofilm. These challenges can be targeted by employing a drug-releasing implant modification such as TiO2 nanotubes (TNTs), engineered on titanium surfaces via electrochemical anodization. Areas covered: This review focuses on applications of TNT-based dental implants towards achieving optimal therapeutic efficacy. Firstly, the functions of TNT implants are explored in terms of their influence on osseointegration, soft tissue integration and immunomodulation. Secondly, the developmental challenges associated with such implants are reviewed, including sterilization, stability and toxicity. Expert opinion: The potential of TNTs is yet to be fully explored in the context of the complex oral environment, including appropriate modulation of alveolar bone healing, immune-inflammatory processes, and soft tissue responses. Besides long-term in vivo assessment under masticatory loading conditions, investigating drug-release profiles in vivo and addressing various technical challenges are required to bridge the gap between research and clinical dentistry.
Conceptual Questions and Challenge Problems
NASA Astrophysics Data System (ADS)
Nurrenbern, Susan C.; Robinson, William R.
1998-11-01
The JCE Internet Conceptual Question and Challenge Problem Web site is a source of questions and problems that can be used in teaching and assessing conceptual understanding and problem solving in chemistry. Here you can find a library of free-response and multiple-choice conceptual questions and challenge problems, tips for writing these questions and problems, and a discussion of types of conceptual questions. This site is intended to be a means of sharing conceptual questions and challenge problems among chemical educators. This is a living site that will grow as you share conceptual questions and challenge problems and as we find new sources of information. We would like to make this site as inclusive as possible. Please share your questions and problems with us and alert us to references or Web sites that could be included on the site. You can use email, fax, or regular mail. Email: nurrenbern@purdue.edu or wrrobin@purdue.edu Fax: 765/494-0239 Mailing address: Susan C. Nurrenbern or William R. Robinson; Department of Chemistry; Purdue University; 1393 Brown Building; West Lafayette, IN 47907-1393. The Conceptual Questions and Challenge Problems Web site can be found here.
NASA Astrophysics Data System (ADS)
Johanson, I. A.; Miklius, A.; Poland, M. P.
2016-12-01
A sequence of magmatic events in April-May 2015 at Kīlauea Volcano produced a complex deformation pattern that can be described by multiple deforming sources active simultaneously. The 2015 intrusive sequence began with inflation in the volcano's summit caldera near Halema`uma`u (HMM) Crater, which continued over a few weeks, followed by rapid deflation of the HMM source and inflation of a source in the south caldera region during the next few days. In Kīlauea Volcano's summit area, multiple deformation centers are active at varying times, and all contribute to the overall pattern observed with GPS, tiltmeters, and InSAR. Isolating the contribution of the signals related to each source is a challenge and complicates the determination of optimal source geometry for the underlying magma bodies. We used principal component analysis of continuous GPS time series from the 2015 intrusion sequence to determine three basis vectors which together account for 83% of the variance in the data set. The three basis vectors are non-orthogonal and not strictly the principal components of the data set. In addition to separating deformation sources in the continuous GPS data, the basis vectors provide a means to scale the contribution of each source in a given interferogram. This provides an additional constraint in a joint model of GPS and InSAR data (COSMO-SkyMed and Sentinel-1A) to determine source geometry. The first basis vector corresponds to inflation in the south caldera region, an area long recognized as the location of a long-term storage reservoir. The second vector represents deformation of the HMM source, which is in the same location as a previously modeled shallow reservoir; however, InSAR data suggest a more complicated source. Preliminary modeling of the deformation attributed to the third basis vector shows that it is consistent with inflation of a steeply dipping ellipsoid centered below Keanakāko`i crater, southeast of HMM. Keanakāko`i crater is the locus of a known, intermittently active deformation source, which was not previously recognized to have been active during the 2015 event.
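The decomposition style described here can be sketched on synthetic data: stack the station time series into a matrix, remove the means, and take an SVD; the leading right singular vectors play the role of spatial basis vectors (station displacement patterns), the left ones their time histories. Everything below is invented for illustration:

```python
# Synthetic PCA/SVD decomposition of GPS-like time series (not the Kilauea
# data): two overlapping "sources" plus noise, recovered as leading components.
import numpy as np

rng = np.random.default_rng(2)
n_epochs, n_series = 200, 30          # daily epochs x (stations * components)
t = np.linspace(0, 1, n_epochs)

pattern1 = rng.normal(size=n_series)  # spatial pattern of steady inflation
pattern2 = rng.normal(size=n_series)  # spatial pattern of a late rapid event
X = (np.outer(t, pattern1)
     + 0.5 * np.outer(t > 0.7, pattern2)
     + 0.05 * rng.normal(size=(n_epochs, n_series)))

X -= X.mean(axis=0)                   # remove per-series mean
U, S, Vt = np.linalg.svd(X, full_matrices=False)
explained = S**2 / np.sum(S**2)
print("variance explained by first 3 components:", explained[:3].sum())
# Vt[0], Vt[1] ~ spatial basis vectors; U[:, 0] * S[0] ~ source time history
```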
Time-frequency approach to underdetermined blind source separation.
Xie, Shengli; Yang, Liu; Yang, Jun-Mei; Zhou, Guoxu; Xiang, Yong
2012-02-01
This paper presents a new time-frequency (TF) underdetermined blind source separation approach based on the Wigner-Ville distribution (WVD) and the Khatri-Rao product to separate N non-stationary sources from M (M < N) mixtures. First, an improved method is proposed for estimating the mixing matrix, where the negative values of the auto WVD of the sources are fully considered. Then, after extracting all the auto-term TF points, the auto WVD value of the sources at every auto-term TF point can be found exactly with the proposed approach, no matter how many active sources there are, as long as N ≤ 2M−1. Further discussion of the extraction of auto-term TF points is given, and finally numerical simulation results are presented to show the superiority of the proposed algorithm by comparing it with existing ones.
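A quick way to see where the N ≤ 2M−1 bound comes from is the dimension lift provided by the Khatri-Rao product: the column-wise Kronecker product of the M×N mixing matrix with itself yields an M²×N virtual array with more effective rows than physical mixtures. A toy check (illustrative only, not the paper's full WVD algorithm):

```python
# Khatri-Rao "virtual array" lift: an M x N mixing matrix becomes M^2 x N,
# so more sources than sensors can remain identifiable (toy numbers).
import numpy as np
from scipy.linalg import khatri_rao

M, N = 3, 5                             # 3 mixtures, 5 sources (N <= 2M-1)
rng = np.random.default_rng(3)
A = rng.normal(size=(M, N))             # mixing matrix
A_virtual = khatri_rao(A.conj(), A)     # conj() matters in the complex case
print(A_virtual.shape, "rank:", np.linalg.matrix_rank(A_virtual))  # (9, 5), 5
```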
Long-term ecosystem monitoring and assessment of the Detroit River and Western Lake Erie.
Hartig, J H; Zarull, M A; Ciborowski, J J H; Gannon, J E; Wilke, E; Norwood, G; Vincent, A N
2009-11-01
Over 35 years of US and Canadian pollution prevention and control efforts have led to substantial improvements in environmental quality of the Detroit River and western Lake Erie. However, the available information also shows that much remains to be done. Improvements in environmental quality have resulted in significant ecological recovery, including increasing populations of bald eagles (Haliaeetus leucocephalus), peregrine falcons (Falco peregrinus), lake sturgeon (Acipenser fulvescens), lake whitefish (Coregonus clupeaformis), walleye (Sander vitreus), and burrowing mayflies (Hexagenia spp.). Although this recovery is remarkable, many challenges remain, including population growth, transportation expansion, and land use changes; nonpoint source pollution; toxic substances contamination; habitat loss and degradation; introduction of exotic species; and greenhouse gases and global warming. Research and monitoring must be sustained for effective management. Priority research and monitoring needs include: demonstrating and quantifying cause-effect relationships; establishing quantitative endpoints and desired future states; determining cumulative impacts and how indicators relate; improving modeling and prediction; prioritizing geographic areas for protection and restoration; and fostering long-term monitoring for adaptive management. Key management agencies, universities, and environmental and conservation organizations should pool resources and undertake comprehensive and integrative assessments of the health of the Detroit River and western Lake Erie at least every 5 years to practice adaptive management for long-term sustainability.
Wray, Jo; Tregay, Jenifer; Bull, Catherine; Knowles, Rachel L; Crowe, Sonya; Brown, Katherine
2018-03-05
To elicit the perceptions of helpline staff who talk to parents of children discharged after cardiac surgery in infancy about parents' key concerns. A qualitative study involving semistructured interviews with 10 staff at four heart charities. Interviews were recorded, transcribed and analysed using Framework analysis. Staff identified the knowledge, communication and support needs of parents which they described in terms of the impact of patient and family factors, sources of support and systems. Staff perceptions of helplines, in terms of the function of a helpline and the roles of its staff, together with staff's personal views based on their experience of multiple encounters with many families, influenced how they viewed families' needs and responded to their requests. Helpline staff provided important, previously uncaptured evidence about the challenges faced by parents of children discharged after cardiac surgery in infancy. Staff have an important role in supporting communication, in terms of speaking to families about how to talk to professionals and talking to professionals directly to get or give information when parents are unable to do so. Capturing the perspective of helpline staff about communication issues has highlighted the need for interventions with professionals as well as parents. ©2018 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.
26 CFR 31.3401(a)(14)-1 - Group-term life insurance.
Code of Federal Regulations, 2010 CFR
2010-04-01
Collection of Income Tax at Source, § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...
26 CFR 31.3401(a)(14)-1 - Group-term life insurance.
Code of Federal Regulations, 2011 CFR
2011-04-01
Collection of Income Tax at Source, § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...
26 CFR 31.3401(a)(14)-1 - Group-term life insurance.
Code of Federal Regulations, 2012 CFR
2012-04-01
Collection of Income Tax at Source, § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...
26 CFR 31.3401(a)(14)-1 - Group-term life insurance.
Code of Federal Regulations, 2014 CFR
2014-04-01
Collection of Income Tax at Source, § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...
26 CFR 31.3401(a)(14)-1 - Group-term life insurance.
Code of Federal Regulations, 2013 CFR
2013-04-01
Collection of Income Tax at Source, § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...
Freeform étendue-preserving optics for light and color mixing
NASA Astrophysics Data System (ADS)
Sorgato, Simone; Mohedano, Rubén.; Chaves, Julio; Cvetkovic, Aleksandra; Hernández, Maikel; Benítez, Pablo; Miñano, Juan C.; Thienpont, Hugo; Duerr, Fabian
2015-09-01
Today's SSL illumination market shows a clear trend towards high flux packages with higher efficiency and higher CRI, realized by means of multiple color chips and phosphors. Such light sources require the optics to provide both near- and far-field color mixing. This design problem is particularly challenging for collimated luminaires, since traditional diffusers cannot be employed without enlarging the exit aperture and reducing brightness (i.e., increasing étendue). Furthermore, diffusers compromise the light output ratio (efficiency) of the lamps to which they are applied. A solution based on Köhler integration, consisting of a spherical cap comprising spherical microlenses on both its interior and exterior sides, was presented in 2012. When placed on top of an inhomogeneous multichip Lambertian LED, this so-called Shell-Mixer creates a homogeneous (both spatially and angularly) virtual source, also Lambertian, in which the images of the chips merge. The virtual source is located at the same position and has essentially the same size as the original source. The diameter of this optics was 3 times that of the chip-array footprint. In this work, we present a new version of the Shell-Mixer, based on the Edge Ray Principle, in which neither the overall shape of the cap nor the surfaces of the lenses are constrained to spheres or rotational Cartesian ovals. This new Shell-Mixer is freeform, only twice as large as the original chip-array, and equals the original model in terms of brightness, color uniformity and efficiency.
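The étendue bookkeeping that motivates such designs is compact: for a rotationally symmetric Lambertian source, G = πA·sin²θ, and conserving G while collimating fixes the minimum exit aperture. Illustrative numbers (not from the paper):

```python
# Etendue conservation, illustrative only: a Lambertian chip array collimated
# to half-angle theta needs an exit aperture at least A / sin^2(theta).
import math

chip_area_mm2 = 16.0             # assumed 4 mm x 4 mm chip array
theta_out_deg = 10.0             # desired collimation half-angle

G = math.pi * chip_area_mm2 * math.sin(math.radians(90.0)) ** 2  # Lambertian
A_exit = G / (math.pi * math.sin(math.radians(theta_out_deg)) ** 2)
print(f"etendue = {G:.1f} mm^2 sr, min exit aperture = {A_exit:.0f} mm^2")
```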
Bhatia, Vivek N.; Perlman, David H.; Costello, Catherine E.; McComb, Mark E.
2009-01-01
In order that biological meaning may be derived and testable hypotheses may be built from proteomics experiments, assignments of proteins identified by mass spectrometry or other techniques must be supplemented with additional annotation, such as information on known protein functions, protein-protein interactions, or biological pathway associations. Collecting, organizing, and interpreting these data often requires the input of experts in the biological field of study, in addition to the time-consuming search for and compilation of information from online protein databases. Furthermore, visualizing this bulk of information can be challenging due to the limited availability of easy-to-use and freely available tools for this process. In response to these constraints, we have undertaken the design of software to automate annotation and visualization of proteomics data in order to accelerate the pace of research. Here we present the Software Tool for Researching Annotations of Proteins (STRAP), a user-friendly, open-source C# application. STRAP automatically obtains gene ontology (GO) terms associated with proteins in a proteomics results ID list using the freely accessible UniProtKB and EBI GOA databases. STRAP summarizes the results in an easy-to-navigate tabular format that includes meta-information on each protein in addition to complementary GO terminology, and this information can be edited by the user so that in-house expertise on particular proteins may be integrated into the larger dataset. STRAP provides a sortable tabular view of all terms, as well as graphical representations of GO-term association data in pie charts (biological process, cellular component and molecular function) and bar charts (cross-comparison of sample sets) to aid in the interpretation of large datasets and differential analysis experiments. Furthermore, proteins of interest may be exported as a unique FASTA-formatted file to allow customizable re-searching of mass spectrometry data, and gene names corresponding to the proteins in the lists may be encoded in the Gaggle microformat for further characterization, including pathway analysis. STRAP, a tutorial, and the C# source code are freely available from http://cpctools.sourceforge.net. PMID:19839595
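The first step STRAP automates (GO-term retrieval per accession) can be sketched against UniProt's current REST interface; note that this endpoint and its JSON field names postdate STRAP and are our assumption of a modern equivalent, not STRAP's own code:

```python
# Sketch of per-accession GO-term retrieval via UniProt's REST API (assumed
# modern equivalent of the UniProtKB lookup STRAP performs; field names are
# assumptions about the current JSON layout).
import requests

def go_terms(accession: str):
    url = f"https://rest.uniprot.org/uniprotkb/{accession}.json"
    entry = requests.get(url, timeout=30).json()
    terms = []
    for xref in entry.get("uniProtKBCrossReferences", []):
        if xref.get("database") == "GO":
            # properties typically include a "GoTerm" like "C:cytoplasm"
            props = {p["key"]: p["value"] for p in xref.get("properties", [])}
            terms.append((xref["id"], props.get("GoTerm")))
    return terms

for go_id, label in go_terms("P69905")[:5]:   # human hemoglobin alpha
    print(go_id, label)
```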
Challenges to producing a long-term stratospheric aerosol climatology for chemistry and climate
NASA Astrophysics Data System (ADS)
Thomason, Larry; Vernier, Jean-Paul; Bourassa, Adam; Rieger, Landon; Luo, Beiping; Peter, Thomas; Arfeuille, Florian
2016-04-01
Stratospheric aerosol data sets are key inputs for climate models (GCMs, CCMs), particularly for understanding the role of volcanoes in climate and as a surrogate for understanding the potential of human-derived stratospheric aerosol as mitigation for global warming. In addition to supporting activities of individual climate models, the data sets also act as a historical input to the activities of SPARC's Chemistry-Climate Model Initiative (CCMI) and the World Climate Research Programme's Coupled Model Intercomparison Project (CMIP). One such data set was produced in 2004 as a part of the SPARC Assessment of Stratospheric Aerosol Properties (ASAP), extending from 1979 to 2004. It was primarily constructed from the Stratospheric Aerosol and Gas Experiment series of instruments but supplemented by data from other space-based sources and a number of ground-based and airborne instruments. Updates to this data set have expanded the timeframe to span from 1850 through 2014 through the inclusion of data from additional sources, such as photometer data and ice core analyses. Fundamentally, there are limitations to the reliability of the optical properties of aerosol inferred from even the most complete single-instrument data sets. At the same time, the heterogeneous nature of the data underlying this historical data set poses considerable challenges to the production of a climate data set that is both homogeneous and reliable throughout its timespan. In this presentation, we will discuss the impact of this heterogeneity, showing specific examples such as the SAGE II to OSIRIS/CALIPSO transition in 2005. Potential solutions to these issues will also be discussed.
2015-01-01
Linguistic and cultural differences can impede comprehension among potential research participants during the informed consent process, but how researchers and IRBs respond to these challenges in practice is unclear. We conducted in-depth interviews with 15 researchers, research ethics committee (REC) chairs and members from 8 different countries with emerging economies, all involved in HIV-related research sponsored by the HIV Prevention Trials Network (HPTN), about the ethical and regulatory challenges they face in this regard. In the interviews, problems with translating study materials often arose as major concerns. Four sets of challenges were identified concerning linguistic and cultural translations of informed consent documents and other study materials, related to the: (1) context, (2) process, (3) content and (4) translation of these documents. Host-country contextual issues included low literacy rates, education (e.g., documents may need to be written below a 5th-grade reading level), experiences with research, and differing views of written documentation. Certain terms and concepts may not exist in other languages, or may have additional connotations that back translations do not always reveal. Challenges arise not only from the content of word-for-word, literal translation, but also from the linguistic form of the language, such as tone (e.g., appropriate forms of politeness vs. legalese, seen as harsh), syntax, the manner of questions posed, and the concept of consent; the contexts of use also affect meaning. Problems also emerged in bilateral communications: US IRBs may misunderstand local practices, or communicate insufficiently the reasons for their decisions to foreign RECs. In sum, these data highlight several challenges that have received little, if any, attention in past literature on translation of informed consent and study materials, and have crucial implications for improving practice, education, research and policy, suggesting several strategies, including the need for broader open-source multilingual lexicons and more awareness of the complexities involved. PMID:26225759