Sample records for sets global performance

  1. Impact of Uncertainties in Meteorological Forcing Data and Land Surface Parameters on Global Estimates of Terrestrial Water Balance Components

    NASA Astrophysics Data System (ADS)

    Nasonova, O. N.; Gusev, Ye. M.; Kovalev, Ye. E.

    2009-04-01

Global estimates of the components of terrestrial water balance depend on the estimation technique and on the global observational data sets used for this purpose. Land surface modelling is a modern and powerful tool for such estimates. However, the results of modelling are affected by the quality of both the model and its input information (including meteorological forcing data and model parameters). The latter is based on available global data sets containing meteorological data, land-use information, and soil and vegetation characteristics. Many global data sets are now available, differing in spatial and temporal resolution as well as in accuracy and reliability. Evidently, uncertainties in global data sets will influence the results of model simulations, but to what extent? The present work is an attempt to investigate this issue. The work is based on the land surface model SWAP (Soil Water - Atmosphere - Plants) and global 1-degree data sets of meteorological forcing data and land surface parameters provided within the framework of the Second Global Soil Wetness Project (GSWP-2). The 3-hourly near-surface meteorological data (for the period from 1 July 1982 to 31 December 1995) are based on reanalyses and gridded observational data used in the International Satellite Land-Surface Climatology Project (ISLSCP) Initiative II. Following the GSWP-2 strategy, we used a number of alternative global forcing data sets to perform different sensitivity experiments (with six alternative versions of precipitation, four versions of radiation, two pure reanalysis products, and two fully hybridized products of meteorological data). To reveal the influence of model parameters on simulations, in addition to the GSWP-2 parameter data sets, we produced two alternative global data sets of soil parameters derived from their relationships with the clay and sand content of the soil. Sensitivity experiments with three different parameter sets were then performed. As a result, 16 variants of global annual estimates of water balance components were obtained. Applying alternative data sets for radiation, precipitation, and soil parameters allowed us to reveal the influence of uncertainties in input data on global estimates of water balance components.

  2. Global-scale regionalization of hydrological model parameters using streamflow data from many small catchments

    NASA Astrophysics Data System (ADS)

    Beck, Hylke; de Roo, Ad; van Dijk, Albert; McVicar, Tim; Miralles, Diego; Schellekens, Jaap; Bruijnzeel, Sampurno; de Jeu, Richard

    2015-04-01

Motivated by the lack of large-scale model parameter regionalization studies, a large set of 3328 small catchments (<10,000 km²) around the globe was used to set up and evaluate five model parameterization schemes at the global scale. The HBV-light model was chosen because of its parsimony and flexibility to test the schemes. The catchments were calibrated against observed streamflow (Q) using an objective function incorporating both behavioral and goodness-of-fit measures, after which the catchment set was split into subsets of 1215 donor and 2113 evaluation catchments based on the calibration performance. The donor catchments were subsequently used to derive parameter sets that were transferred to similar grid cells based on a similarity measure incorporating climatic and physiographic characteristics, thereby producing parameter maps with global coverage. Overall, there was a lack of suitable donor catchments for mountainous and tropical environments. The schemes with spatially uniform parameter sets (EXP2 and EXP3) achieved the worst Q estimation performance in the evaluation catchments, emphasizing the importance of parameter regionalization. The direct transfer of calibrated parameter sets from donor catchments to similar grid cells (scheme EXP1) performed best, although there was still a large performance gap between EXP1 and HBV-light calibrated against observed Q. The schemes with parameter sets obtained by simultaneously calibrating clusters of similar donor catchments (NC10 and NC58) performed worse than EXP1. The relatively poor Q estimation performance achieved by two (uncalibrated) macro-scale hydrological models suggests there is considerable merit in regionalizing the parameters of such models. The global HBV-light parameter maps and ancillary data are freely available via http://water.jrc.ec.europa.eu.
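The core regionalization step described in this record, transferring a donor catchment's calibrated parameter set to climatically similar grid cells, can be sketched as a nearest-donor lookup in a normalized feature space. The feature columns, parameter values, and the plain z-scored Euclidean similarity below are illustrative assumptions, not the study's actual descriptors or similarity measure:

```python
import numpy as np

# Illustrative donor catchments: rows = catchments, columns = hypothetical
# climatic/physiographic descriptors (aridity, mean temperature, slope).
donor_features = np.array([[0.4, 12.0, 0.10],
                           [1.6,  4.0, 0.30],
                           [0.9, 20.0, 0.05]])
# Calibrated parameter sets for each donor (columns = model parameters).
donor_params = np.array([[250.0, 2.0, 0.05],
                         [120.0, 1.2, 0.20],
                         [400.0, 3.5, 0.02]])

def regionalize(grid_features, donor_features, donor_params):
    """Give every grid cell the calibrated parameter set of its most
    similar donor (z-scored Euclidean distance in feature space)."""
    mu, sd = donor_features.mean(axis=0), donor_features.std(axis=0)
    dz = (donor_features - mu) / sd          # normalized donors
    gz = (grid_features - mu) / sd           # normalized grid cells
    dist = np.linalg.norm(gz[:, None, :] - dz[None, :, :], axis=2)
    nearest = dist.argmin(axis=1)            # index of closest donor per cell
    return donor_params[nearest], nearest

grid = np.array([[0.5, 11.0, 0.12],   # resembles donor 0
                 [1.5,  5.0, 0.28]])  # resembles donor 1
params, idx = regionalize(grid, donor_features, donor_params)
print(idx)   # [0 1]
```

The study additionally screened donors by calibration performance and built clustered variants (NC10, NC58); those refinements are omitted from this sketch.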

  3. Creating Online Training for Procedures in Global Health with PEARLS (Procedural Education for Adaptation to Resource-Limited Settings).

    PubMed

    Bensman, Rachel S; Slusher, Tina M; Butteris, Sabrina M; Pitt, Michael B; On Behalf Of The Sugar Pearls Investigators; Becker, Amanda; Desai, Brinda; George, Alisha; Hagen, Scott; Kiragu, Andrew; Johannsen, Ron; Miller, Kathleen; Rule, Amy; Webber, Sarah

    2017-11-01

The authors describe a multi-institutional collaborative project to address a gap in global health training by creating a free online platform to share a curriculum for performing procedures in resource-limited settings. This curriculum, called PEARLS (Procedural Education for Adaptation to Resource-Limited Settings), consists of peer-reviewed instructional and demonstration videos describing modifications for performing common pediatric procedures in resource-limited settings. Adaptations range from the creation of a low-cost spacer for inhaled medications to a suction chamber for continued evacuation of a chest tube. By describing the collaborative process, we provide a model for educators in other fields to collate and disseminate procedural modifications adapted for their own specialty and location, ideally expanding this crowd-sourced curriculum to reach a wide audience of trainees and providers in global health.

  4. Static evaluation of a NAVSTAR Global Positioning System (GPS) (Magnavox Z-Set) receiver, May-September, 1979

    DOT National Transportation Integrated Search

    1980-05-01

The report documents the results of the static testing of a NAVSTAR Global Positioning System (GPS) single channel sequential receiver (Magnavox Z-Set). These tests were performed at the Coast Guard District 11 office in Long Beach, CA, from May to September 1979.

  5. Tests of a Semi-Analytical Case 1 and Gelbstoff Case 2 SeaWiFS Algorithm with a Global Data Set

    NASA Technical Reports Server (NTRS)

    Carder, Kendall L.; Hawes, Steve K.; Lee, Zhongping

    1997-01-01

A semi-analytical algorithm was tested with a total of 733 points of either unpackaged or packaged-pigment data, with corresponding algorithm parameters for each data type. The 'unpackaged' type consisted of data sets that were generally consistent with the Case 1 CZCS algorithm and other well-calibrated data sets. The 'packaged' type consisted of data sets apparently containing somewhat more packaged pigments, requiring modification of the absorption parameters of the model consistent with the CalCOFI study area. This resulted in two equally divided data sets. A more thorough scrutiny of these and other data sets using a semi-analytical model requires improved knowledge of the phytoplankton and gelbstoff of the specific environment studied. Since the semi-analytical algorithm depends on 4 spectral channels including the 412 nm channel, while most other algorithms do not, a means of testing data sets for consistency was sought. A numerical filter was developed to classify data sets into the above classes. The filter uses reflectance ratios, which can be determined from space. The sensitivity of such numerical filters to measurement errors resulting from atmospheric correction and sensor noise requires further study. The semi-analytical algorithm performed superbly on each of the data sets after classification, resulting in RMS1 errors of 0.107 and 0.121, respectively, for the unpackaged and packaged data-set classes, with little bias and slopes near 1.0. In combination, the RMS1 performance was 0.114. While these numbers appear rather sterling, one must bear in mind what misclassification does to the results. Using an average or compromise parameterization on the modified global data set yielded an RMS1 error of 0.171, while using the unpackaged parameterization on the global evaluation data set yielded an RMS1 error of 0.284. So, without classification, the algorithm performs better globally using the average parameters than it does using the unpackaged parameters. Finally, the effects of even more extreme pigment packaging must be examined in order to improve algorithm performance at high latitudes. Note, however, that the North Sea and Mississippi River plume studies contributed data to the packaged and unpackaged classes, respectively, with little effect on algorithm performance. This suggests that gelbstoff-rich Case 2 waters do not seriously degrade performance of the semi-analytical algorithm.

  6. Does communication help people coordinate?

    PubMed Central

    2017-01-01

Theoretical and experimental investigations have consistently demonstrated that collective performance in a variety of tasks can be significantly improved by allowing communication. We present the results of the first experiment systematically investigating the value of communication in networked consensus. The goal of all tasks in our experiments is for subjects to reach global consensus, even though nodes can only observe the choices of their immediate neighbors. Unlike previous networked consensus tasks, our experiments allow subjects to communicate either with their immediate neighbors (locally) or with the entire network (globally). Moreover, we consider treatments in which essentially arbitrary messages can be sent, as well as those in which only one type of message is allowed, informing others about a node's local state. We find that local communication adds minimal value: the fraction of games solved is essentially identical to treatments with no communication. The ability to communicate globally, in contrast, offers a significant performance improvement. In addition, we find that constraining people to exchange messages only about local state is significantly better than unconstrained communication. We observe that individual behavior is qualitatively consistent across settings: people clearly react to the messages they receive in all communication settings. However, we find that messages received in local communication treatments are relatively uninformative, whereas global communication offers a substantial information advantage. Exploring mixed communication settings, in which only a subset of agents are global communicators, we find that a significant number of global communicators is needed for performance to approach the success achieved when everyone communicates globally.
However, global communicators have a significant advantage: a small tightly connected minority of globally communicating nodes can successfully steer outcomes towards their preferences, although this can be significantly mitigated when all other nodes have the ability to communicate locally with their neighbors. PMID:28178295

  7. An Evaluation of Computerized Tests as Predictors of Job Performance: II. Differential Validity for Global and Job Element Criteria. Final Report.

    ERIC Educational Resources Information Center

    Cory, Charles H.

This report presents data concerning the validity of a set of experimental computerized and paper-and-pencil tests as predictors of on-job performance measured on global and job-element criteria. It reports on the usefulness of 30 experimental and operational variables for predicting marks on 42 job elements and on a global criterion for Electrician's Mate,…

  8. Estimating the potential intensification of global grazing systems based on climate adjusted yield gap analysis

    NASA Astrophysics Data System (ADS)

    Sheehan, J. J.

    2016-12-01

We report here a first-of-its-kind analysis of the potential for intensification of global grazing systems. Intensification is calculated using the statistical yield gap methodology developed previously by others (Mueller et al 2012 and Licker et al 2010) for global crop systems. Yield gaps are estimated by binning global pasture land area into 100 equal-area bins of similar climate (defined by ranges of rainfall and growing degree days). Within each bin, grid cells of pastureland are ranked from lowest to highest productivity. The global intensification potential is defined as the sum of global production across all bins at a given percentile ranking (e.g. performance at the 90th percentile) divided by the total current global production. The previous yield gap studies focused on crop systems because productivity data on these systems are readily available. Nevertheless, global crop land represents only one-third of total global agricultural land, while pasture systems account for the remaining two-thirds. Thus, it is critical to conduct the same kind of analysis on what is the largest human use of land on the planet: pasture systems. In 2013, Herrero et al announced the completion of a geospatial data set that augmented the animal census data with data and modeling about production systems and overall food productivity (Herrero et al, PNAS 2013). With this data set, it is now possible to apply yield gap analysis to global pasture systems. We used the Herrero et al data set to evaluate yield gaps for meat and milk production from pasture-based systems for cattle, sheep and goats. The figure included with this abstract shows the intensification potential for kcal per hectare per year of meat and milk from global cattle, sheep and goats as a function of increasing levels of performance.
Performance is measured as the productivity achieved at a given ranked percentile within each bin. We find that if all pasture land were raised to the 90th percentile of performance, global output of meat and milk could increase 2.8-fold. This is much higher than the increases reported previously for major grain crops like corn and wheat. Our results suggest that efforts to address poor performance of pasture systems around the world could substantially improve the outlook for meeting future food demand.
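The binned yield-gap calculation described in this record can be sketched as follows. The synthetic climate and productivity values, the 10 × 10 decile binning (a small stand-in for the study's 100 climate bins), and the equal-area-per-cell assumption are all illustrative:

```python
import numpy as np

# Synthetic pasture grid cells: climate descriptors plus a productivity
# that trends with climate and has management-related scatter.
rng = np.random.default_rng(42)
n = 5000
rain = rng.uniform(200, 2000, n)       # mm/yr
gdd = rng.uniform(500, 6000, n)        # growing degree days
prod = 0.002 * rain + 0.001 * gdd + rng.uniform(0, 3, n)

def intensification_potential(rain, gdd, prod, pct, nbins=10):
    """Set every cell to the pct-th percentile productivity of its climate
    bin and compare the resulting total with current total production."""
    rbin = np.digitize(rain, np.quantile(rain, np.linspace(0, 1, nbins + 1)[1:-1]))
    gbin = np.digitize(gdd, np.quantile(gdd, np.linspace(0, 1, nbins + 1)[1:-1]))
    bin_id = rbin * nbins + gbin           # joint rainfall x GDD bin
    target = np.empty_like(prod)
    for b in np.unique(bin_id):
        mask = bin_id == b
        target[mask] = np.percentile(prod[mask], pct)
    return target.sum() / prod.sum()

print(round(intensification_potential(rain, gdd, prod, 90), 2))
```

Raising the target percentile raises the computed potential, mirroring the "increasing levels of performance" axis in the record's figure.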

  9. Global ablation techniques.

    PubMed

    Woods, Sarah; Taylor, Betsy

    2013-12-01

Global endometrial ablation techniques are a relatively new surgical technology for the treatment of heavy menstrual bleeding that can now be used even in an outpatient clinic setting. A comparison of global ablation versus earlier ablation technologies notes no significant differences in success rates and some improvement in patient satisfaction. The advantages of the newer global endometrial ablation systems include less operative time, improved recovery time, and decreased anesthetic risk. Ablation procedures performed in an outpatient surgical or clinic setting offer both potential cost savings, for patients and for the health care system, and improved patient convenience. Copyright © 2013. Published by Elsevier Inc.

  10. Estimating the CCSD basis-set limit energy from small basis sets: basis-set extrapolations vs additivity schemes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spackman, Peter R.; Karton, Amir, E-mail: amir.karton@uwa.edu.au

Coupled cluster calculations with all single and double excitations (CCSD) converge exceedingly slowly with the size of the one-particle basis set. We assess the performance of a number of approaches for obtaining CCSD correlation energies close to the complete basis-set limit in conjunction with relatively small DZ and TZ basis sets. These include global and system-dependent extrapolations based on the A + B/Lα two-point extrapolation formula, and the well-known additivity approach that uses an MP2-based basis-set-correction term. We show that the basis-set convergence rate can change dramatically between different systems (e.g., it is slower for molecules with polar bonds and/or second-row elements). The system-dependent basis-set extrapolation scheme, in which unique basis-set extrapolation exponents for each system are obtained from lower-cost MP2 calculations, significantly accelerates the basis-set convergence relative to the global extrapolations. Nevertheless, we find that the simple MP2-based basis-set additivity scheme outperforms the extrapolation approaches. For example, the following root-mean-squared deviations are obtained for the 140 basis-set limit CCSD atomization energies in the W4-11 database: 9.1 (global extrapolation), 3.7 (system-dependent extrapolation), and 2.4 (additivity scheme) kJ mol-1. The CCSD energy in these approximations is obtained from basis sets of up to TZ quality and the latter two approaches require additional MP2 calculations with basis sets of up to QZ quality. We also assess the performance of the basis-set extrapolations and additivity schemes for a set of 20 basis-set limit CCSD atomization energies of larger molecules including amino acids, DNA/RNA bases, aromatic compounds, and platonic hydrocarbon cages. We obtain the following RMSDs for the above methods: 10.2 (global extrapolation), 5.7 (system-dependent extrapolation), and 2.9 (additivity scheme) kJ mol-1.

  11. Estimating the CCSD basis-set limit energy from small basis sets: basis-set extrapolations vs additivity schemes

    NASA Astrophysics Data System (ADS)

    Spackman, Peter R.; Karton, Amir

    2015-05-01

Coupled cluster calculations with all single and double excitations (CCSD) converge exceedingly slowly with the size of the one-particle basis set. We assess the performance of a number of approaches for obtaining CCSD correlation energies close to the complete basis-set limit in conjunction with relatively small DZ and TZ basis sets. These include global and system-dependent extrapolations based on the A + B/Lα two-point extrapolation formula, and the well-known additivity approach that uses an MP2-based basis-set-correction term. We show that the basis-set convergence rate can change dramatically between different systems (e.g., it is slower for molecules with polar bonds and/or second-row elements). The system-dependent basis-set extrapolation scheme, in which unique basis-set extrapolation exponents for each system are obtained from lower-cost MP2 calculations, significantly accelerates the basis-set convergence relative to the global extrapolations. Nevertheless, we find that the simple MP2-based basis-set additivity scheme outperforms the extrapolation approaches. For example, the following root-mean-squared deviations are obtained for the 140 basis-set limit CCSD atomization energies in the W4-11 database: 9.1 (global extrapolation), 3.7 (system-dependent extrapolation), and 2.4 (additivity scheme) kJ mol-1. The CCSD energy in these approximations is obtained from basis sets of up to TZ quality and the latter two approaches require additional MP2 calculations with basis sets of up to QZ quality. We also assess the performance of the basis-set extrapolations and additivity schemes for a set of 20 basis-set limit CCSD atomization energies of larger molecules including amino acids, DNA/RNA bases, aromatic compounds, and platonic hydrocarbon cages. We obtain the following RMSDs for the above methods: 10.2 (global extrapolation), 5.7 (system-dependent extrapolation), and 2.9 (additivity scheme) kJ mol-1.
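The two schemes compared in this record can be sketched directly. The two-point formula E(L) = E_CBS + B/L^α has a closed-form solution given energies at two cardinal numbers L, and the additivity scheme simply corrects CCSD/TZ with an MP2 basis-set increment. The numeric energies below are synthetic, not values from the W4-11 database:

```python
def extrapolate_cbs(e_lo, e_hi, l_lo, l_hi, alpha):
    """Two-point extrapolation E(L) = E_CBS + B / L**alpha: solve the
    pair of equations at cardinal numbers l_lo and l_hi for E_CBS."""
    b = (e_lo - e_hi) / (l_lo ** -alpha - l_hi ** -alpha)
    return e_hi - b / l_hi ** alpha

def additivity_cbs(e_ccsd_tz, e_mp2_tz, e_mp2_cbs):
    """MP2-based additivity: correct CCSD/TZ with the MP2 basis-set increment."""
    return e_ccsd_tz + (e_mp2_cbs - e_mp2_tz)

# Synthetic check: energies generated from a known CBS limit are recovered.
e_cbs, b, alpha = -76.300, 0.5, 3.0
e_dz = e_cbs + b / 2 ** alpha   # L = 2 (DZ)
e_tz = e_cbs + b / 3 ** alpha   # L = 3 (TZ)
print(extrapolate_cbs(e_dz, e_tz, 2, 3, alpha))   # ≈ -76.300, the known limit
```

In the system-dependent variant described above, alpha itself would be fitted per system from lower-cost MP2 energies rather than held fixed.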

  12. Being global in public health practice and research: complementary competencies are needed.

    PubMed

    Cole, Donald C; Davison, Colleen; Hanson, Lori; Jackson, Suzanne F; Page, Ashley; Lencuch, Raphael; Kakuma, Ritz

    2011-01-01

Different sets of competencies in public health, global health and research have recently emerged, including the Core Competencies for Public Health in Canada (CCPHC). Within this context, we believe it is important to articulate competencies for global health practitioners, educators and researchers that are in addition to those outlined in the CCPHC. In global health, we require knowledge and skills regarding: north-south power dynamics, linkages between local and global health problems, and the roles of international organizations. We must be able to work responsibly in low-resource settings, foster self-determination in a world rife with power differentials, and engage in dialogue with stakeholders globally. Skills in cross-cultural communication and the ability to critically self-reflect on one's own social location within the global context are essential. Those in global health must be committed to improving health equity through global systems changes and be willing to be mentored and to mentor others across borders. We call for dialogue on these competencies and for development of ways to assess both their demonstration in academic settings and their performance in global health practice and research.

  13. A Global Rapid Integrated Monitoring System for Water Cycle and Water Resource Assessment (Global-RIMS)

    NASA Technical Reports Server (NTRS)

    Roads, John; Voeroesmarty, Charles

    2005-01-01

The main focus of our work was to solidify the underlying data sets, the data processing tools and the modeling environment needed to perform a series of long-term global and regional hydrological simulations, leading eventually to routine hydrometeorological predictions. A water and energy budget synthesis was developed for the Mississippi River Basin (Roads et al. 2003) in order to better understand what kinds of errors exist in current hydrometeorological data sets. This study is now being extended globally with a larger number of observations and model-based data sets under the new NASA NEWS program. A global comparison of a number of precipitation data sets was subsequently carried out (Fekete et al. 2004), in which it was further shown that reanalysis precipitation has substantial problems; this led us to develop a precipitation assimilation effort (Nunes and Roads 2005). We believe that, at current levels of model skill in predicting precipitation, precipitation assimilation is necessary to get the appropriate land surface forcing.

  14. A global data set of soil particle size properties

    NASA Technical Reports Server (NTRS)

    Webb, Robert S.; Rosenzweig, Cynthia E.; Levine, Elissa R.

    1991-01-01

    A standardized global data set of soil horizon thicknesses and textures (particle size distributions) was compiled. This data set will be used by the improved ground hydrology parameterization designed for the Goddard Institute for Space Studies General Circulation Model (GISS GCM) Model 3. The data set specifies the top and bottom depths and the percent abundance of sand, silt, and clay of individual soil horizons in each of the 106 soil types cataloged for nine continental divisions. When combined with the World Soil Data File, the result is a global data set of variations in physical properties throughout the soil profile. These properties are important in the determination of water storage in individual soil horizons and exchange of water with the lower atmosphere. The incorporation of this data set into the GISS GCM should improve model performance by including more realistic variability in land-surface properties.

  15. A global × global test for testing associations between two large sets of variables.

    PubMed

    Chaturvedi, Nimisha; de Menezes, Renée X; Goeman, Jelle J

    2017-01-01

In high-dimensional omics studies where multiple molecular profiles are obtained for each set of patients, there is often interest in identifying complex multivariate associations, for example, copy number regulated expression levels in a certain pathway or in a genomic region. To detect such associations, we present a novel approach to test for association between two sets of variables. Our approach generalizes the global test, which tests for association between a group of covariates and a single univariate response, to allow a high-dimensional multivariate response. We apply the method to several simulated datasets as well as two publicly available datasets, where we compare the performance of the multivariate global test (G2) with the univariate global test. The method is implemented in R and will be available as part of the globaltest package. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
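As an illustration of the general idea, not the authors' G2 statistic, a permutation test of association between two variable sets can use the squared Frobenius norm of the centred cross-product X'Y as its statistic; permuting the rows of Y destroys any association while preserving each set's internal correlation. All data below are simulated:

```python
import numpy as np

def global_assoc_pvalue(X, Y, n_perm=999, seed=0):
    """Permutation test of association between two variable sets.
    Statistic: squared Frobenius norm of the cross-product X'Y after
    column centring. Rows of Y are permuted under the null."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    stat = np.sum((X.T @ Y) ** 2)
    exceed = sum(np.sum((X.T @ Y[rng.permutation(len(Y))]) ** 2) >= stat
                 for _ in range(n_perm))
    return (exceed + 1) / (n_perm + 1)      # permutation p-value

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 20))               # e.g. copy number features
# Y driven by the first 5 columns of X plus noise: a true association
Y = X[:, :5] @ rng.normal(size=(5, 30)) + 0.5 * rng.normal(size=(60, 30))
print(global_assoc_pvalue(X, Y))            # small p: association detected
```

The published method instead builds on the global test's score-statistic framework, which also yields analytic p-values; the permutation sketch trades that efficiency for simplicity.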

  16. A survey of compiler optimization techniques

    NASA Technical Reports Server (NTRS)

    Schneck, P. B.

    1972-01-01

    Major optimization techniques of compilers are described and grouped into three categories: machine dependent, architecture dependent, and architecture independent. Machine-dependent optimizations tend to be local and are performed upon short spans of generated code by using particular properties of an instruction set to reduce the time or space required by a program. Architecture-dependent optimizations are global and are performed while generating code. These optimizations consider the structure of a computer, but not its detailed instruction set. Architecture independent optimizations are also global but are based on analysis of the program flow graph and the dependencies among statements of source program. A conceptual review of a universal optimizer that performs architecture-independent optimizations at source-code level is also presented.
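Constant folding is a classic instance of the architecture-independent, source-level optimizations this record describes: it depends only on the program's structure, not on any instruction set. The tuple-based expression representation here is invented for illustration:

```python
import operator

# Expression-tree nodes are ('+', left, right), ('*', left, right),
# numeric constants, or variable-name strings.
OPS = {'+': operator.add, '*': operator.mul}

def fold(node):
    """Recursively replace operator nodes whose operands are constants."""
    if not isinstance(node, tuple):
        return node                      # leaf: constant or variable name
    op, left, right = node
    left, right = fold(left), fold(right)
    if isinstance(left, (int, float)) and isinstance(right, (int, float)):
        return OPS[op](left, right)      # both operands known: evaluate now
    return (op, left, right)

# (2 * 3) + x  is simplified to  6 + x
print(fold(('+', ('*', 2, 3), 'x')))   # ('+', 6, 'x')
```

A machine-dependent optimizer would instead work on generated instructions, e.g. exploiting addressing modes, which is why the survey treats the two categories separately.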

  17. A high-resolution global flood hazard model

    NASA Astrophysics Data System (ADS)

Sampson, Christopher C.; Smith, Andrew M.; Bates, Paul D.; Neal, Jeffrey C.; Alfieri, Lorenzo; Freer, Jim E.

    2015-09-01

    Floods are a natural hazard that affect communities worldwide, but to date the vast majority of flood hazard research and mapping has been undertaken by wealthy developed nations. As populations and economies have grown across the developing world, so too has demand from governments, businesses, and NGOs for modeled flood hazard data in these data-scarce regions. We identify six key challenges faced when developing a flood hazard model that can be applied globally and present a framework methodology that leverages recent cross-disciplinary advances to tackle each challenge. The model produces return period flood hazard maps at ˜90 m resolution for the whole terrestrial land surface between 56°S and 60°N, and results are validated against high-resolution government flood hazard data sets from the UK and Canada. The global model is shown to capture between two thirds and three quarters of the area determined to be at risk in the benchmark data without generating excessive false positive predictions. When aggregated to ˜1 km, mean absolute error in flooded fraction falls to ˜5%. The full complexity global model contains an automatically parameterized subgrid channel network, and comparison to both a simplified 2-D only variant and an independently developed pan-European model shows the explicit inclusion of channels to be a critical contributor to improved model performance. While careful processing of existing global terrain data sets enables reasonable model performance in urban areas, adoption of forthcoming next-generation global terrain data sets will offer the best prospect for a step-change improvement in model performance.

18. A high-resolution global flood hazard model

    PubMed Central

    Smith, Andrew M.; Bates, Paul D.; Neal, Jeffrey C.; Alfieri, Lorenzo; Freer, Jim E.

    2015-01-01

Floods are a natural hazard that affect communities worldwide, but to date the vast majority of flood hazard research and mapping has been undertaken by wealthy developed nations. As populations and economies have grown across the developing world, so too has demand from governments, businesses, and NGOs for modeled flood hazard data in these data‐scarce regions. We identify six key challenges faced when developing a flood hazard model that can be applied globally and present a framework methodology that leverages recent cross‐disciplinary advances to tackle each challenge. The model produces return period flood hazard maps at ∼90 m resolution for the whole terrestrial land surface between 56°S and 60°N, and results are validated against high‐resolution government flood hazard data sets from the UK and Canada. The global model is shown to capture between two thirds and three quarters of the area determined to be at risk in the benchmark data without generating excessive false positive predictions. When aggregated to ∼1 km, mean absolute error in flooded fraction falls to ∼5%. The full complexity global model contains an automatically parameterized subgrid channel network, and comparison to both a simplified 2‐D only variant and an independently developed pan‐European model shows the explicit inclusion of channels to be a critical contributor to improved model performance. While careful processing of existing global terrain data sets enables reasonable model performance in urban areas, adoption of forthcoming next‐generation global terrain data sets will offer the best prospect for a step‐change improvement in model performance. PMID:27594719

  19. A Memetic Algorithm for Global Optimization of Multimodal Nonseparable Problems.

    PubMed

    Zhang, Geng; Li, Yangmin

    2016-06-01

Avoiding entrapment in local optima is a major challenge, especially for high-dimensional nonseparable problems where the interdependencies among vector elements are unknown. To improve optimization performance, a novel memetic algorithm (MA) called cooperative particle swarm optimizer-modified harmony search (CPSO-MHS) is proposed in this paper, where the CPSO is used for local search and the MHS for global search. The CPSO, as a local search method, uses a 1-D swarm to search each dimension separately and thus converges fast. Besides, it can obtain global optimum elements according to our experimental results and analyses. MHS implements the global search by recombining different vector elements and extracting global optimum elements. The interaction between local search and global search creates a set of local search zones, where global optimum elements reside within the search space. The CPSO-MHS algorithm is tested and compared with seven other optimization algorithms on a set of 28 standard benchmarks. Meanwhile, some MAs are also compared according to the results derived directly from their corresponding references. The experimental results demonstrate the good performance of the proposed CPSO-MHS algorithm in solving multimodal nonseparable problems.
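A generic memetic loop in the spirit described above, population-level recombination for global search plus coordinate-wise 1-D refinement for local search, can be sketched as follows. This is not the CPSO-MHS algorithm itself; the sphere objective, step sizes, and population settings are illustrative:

```python
import numpy as np

def sphere(x):
    """Separable test objective; a stand-in for the paper's benchmarks."""
    return float(np.sum(x ** 2))

def memetic_minimize(f, dim=5, pop=20, iters=50, seed=0):
    """Generic memetic loop: recombine population members (global search),
    then refine the incumbent one dimension at a time (local search)."""
    rng = np.random.default_rng(seed)
    P = rng.uniform(-5, 5, size=(pop, dim))
    best = min(P, key=f).copy()
    for _ in range(iters):
        # global step: blend a random pair and replace the worst if better
        i, j = rng.integers(pop, size=2)
        child = 0.5 * (P[i] + P[j]) + rng.normal(0, 0.1, dim)
        worst = max(range(pop), key=lambda k: f(P[k]))
        if f(child) < f(P[worst]):
            P[worst] = child
        # local step: try a small move along each dimension separately
        cand = best.copy()
        for d in range(dim):
            for step in (-0.2, 0.2):
                trial = cand.copy()
                trial[d] += step
                if f(trial) < f(cand):
                    cand = trial
        best = min([best, cand, min(P, key=f).copy()], key=f)
    return best, f(best)

x, fx = memetic_minimize(sphere)
print(fx)   # far below the value at a random start
```

CPSO-MHS differs in both halves (a cooperative swarm per dimension, and harmony-search recombination), but the interplay of the two search scales is the same.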

  20. Global Seismic Imaging Based on Adjoint Tomography

    NASA Astrophysics Data System (ADS)

    Bozdag, E.; Lefebvre, M.; Lei, W.; Peter, D. B.; Smith, J. A.; Zhu, H.; Komatitsch, D.; Tromp, J.

    2013-12-01

Our aim is to perform adjoint tomography on a global scale to image the entire planet. We have started elastic inversions with a global data set of 253 CMT earthquakes with moment magnitudes in the range 5.8 ≤ Mw ≤ 7 and used GSN stations as well as regional networks such as USArray and European stations. Using an iterative pre-conditioned conjugate gradient scheme, our initial aim is to obtain a global crustal and mantle model with transverse isotropy confined to the upper mantle. Global adjoint tomography has so far remained a challenge mainly due to computational limitations. Recent improvements in our 3D solvers (e.g., a GPU version) and access to high-performance computational centers (e.g., ORNL's Cray XK7 "Titan" system) now enable us to perform iterations with higher-resolution (T > 9 s) and longer-duration (200 min) simulations to accommodate high-frequency body waves and major-arc surface waves, respectively, which help improve data coverage. The remaining challenge is the heavy I/O traffic caused by the numerous files generated during the forward/adjoint simulations and the pre- and post-processing stages of our workflow. We improve the global adjoint tomography workflow by adopting the ADIOS file format for our seismic data as well as models, kernels, etc., to improve efficiency on high-performance clusters. Our ultimate aim is to use data from all available networks and earthquakes within the magnitude range of our interest (5.5 ≤ Mw ≤ 7), which requires a solid framework to manage big data in our global adjoint tomography workflow. We discuss the current status and future of global adjoint tomography based on our initial results as well as practical issues such as handling big data in inversions and on high-performance computing systems.

  1. Performing an allreduce operation on a plurality of compute nodes of a parallel computer

    DOEpatents

    Faraj, Ahmad

    2013-07-09

    Methods, apparatus, and products are disclosed for performing an allreduce operation on a plurality of compute nodes of a parallel computer, each node including at least two processing cores, that include: establishing, for each node, a plurality of logical rings, each ring including a different set of at least one core on that node, each ring including the cores on at least two of the nodes; iteratively for each node: assigning each core of that node to one of the rings established for that node to which the core has not previously been assigned, and performing, for each ring for that node, a global allreduce operation using contribution data for the cores assigned to that ring or any global allreduce results from previous global allreduce operations, yielding current global allreduce results for each core; and performing, for each node, a local allreduce operation using the global allreduce results.
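    The logical-ring organization above echoes the classic ring allreduce (reduce-scatter followed by allgather). The following single-process simulation sketches that classic scheme for a sum reduction; it illustrates the general ring idea only, not the patented multi-core method:

```python
def ring_allreduce(contributions):
    """contributions: one equal-length vector per simulated process.
    Returns the element-wise sum, as every process would hold it."""
    n = len(contributions)
    length = len(contributions[0])
    size = length // n
    bounds = [(i * size, length if i == n - 1 else (i + 1) * size)
              for i in range(n)]
    # chunks[p][c] is chunk c as currently held by process p
    chunks = [[list(vec[a:b]) for (a, b) in bounds] for vec in contributions]

    # Reduce-scatter: in step s, process p forwards chunk (p - s) mod n
    # to p + 1; after n-1 steps, process p holds fully reduced chunk (p+1) mod n.
    for step in range(n - 1):
        for p in range(n):
            c = (p - step) % n
            dst = (p + 1) % n
            chunks[dst][c] = [a + b for a, b in zip(chunks[p][c], chunks[dst][c])]

    # Allgather: circulate the fully reduced chunks around the ring.
    for step in range(n - 1):
        for p in range(n):
            c = (p + 1 - step) % n
            dst = (p + 1) % n
            chunks[dst][c] = list(chunks[p][c])

    return [x for chunk in chunks[0] for x in chunk]
```

Each simulated process communicates only with its ring neighbor, which is what makes the pattern bandwidth-efficient on real interconnects.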

  2. Psychometric Quality of a Student Evaluation of Teaching Survey in Higher Education

    ERIC Educational Resources Information Center

    Oon, Pey-Tee; Spencer, Benson; Kam, Chester Chun Seng

    2017-01-01

    Student evaluations of teaching (SET) are used globally by higher education institutions for performance assessment of academic staff and evaluation of course quality. Higher education institutions commonly develop their own SETs to measure variables deemed relevant to them. However, "home-grown" SETs are rarely assessed…

  3. Modelling Movement Energetics Using Global Positioning System Devices in Contact Team Sports: Limitations and Solutions.

    PubMed

    Gray, Adrian J; Shorter, Kathleen; Cummins, Cloe; Murphy, Aron; Waldron, Mark

    2018-06-01

    Quantifying the training and competition loads of players in contact team sports can be performed in a variety of ways, including kinematic, perceptual, heart rate or biochemical monitoring methods. Whilst these approaches provide data relevant for team sports practitioners and athletes, their application to a contact team sport setting can sometimes be challenging or illogical. Furthermore, these methods can generate large fragmented datasets, do not provide a single global measure of training load and cannot adequately quantify all key elements of performance in contact team sports. A previous attempt to address these limitations via the estimation of metabolic energy demand (global energy measurement) has been criticised for its inability to fully quantify the energetic costs of team sports, particularly during collisions. This is despite the seemingly unintentional misapplication of the model's principles to settings outside of its intended use. There are other hindrances to the application of such models, which are discussed herein, such as the data-handling procedures of Global Positioning System manufacturers and the unrealistic expectations of end users. Nevertheless, we propose an alternative energetic approach, based on Global Positioning System-derived data, to improve the assessment of mechanical load in contact team sports. We present a framework for the estimation of mechanical work performed during locomotor and contact events with the capacity to globally quantify the work done during training and matches.
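    As a loose illustration of the kind of mechanical-work estimate such a framework might build on, the sketch below sums only the positive changes in kinetic energy of a notional body mass along a GPS speed trace. The function name, the fixed mass, and the neglect of braking costs, running economy, and collision events are all simplifying assumptions for illustration, not the authors' model:

```python
def locomotor_work(speeds, mass=100.0):
    """Rough positive mechanical work (J) of accelerating a body of
    `mass` kg along a GPS speed trace (m/s): the sum of positive
    kinetic-energy changes, 0.5 * m * (v2^2 - v1^2)."""
    work = 0.0
    for v1, v2 in zip(speeds, speeds[1:]):
        dke = 0.5 * mass * (v2 * v2 - v1 * v1)
        if dke > 0:                 # count only accelerations
            work += dke
    return work
```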

  4. Methods and apparatuses for information analysis on shared and distributed computing systems

    DOEpatents

    Bohn, Shawn J [Richland, WA; Krishnan, Manoj Kumar [Richland, WA; Cowley, Wendy E [Richland, WA; Nieplocha, Jarek [Richland, WA

    2011-02-22

    Apparatuses and computer-implemented methods for analyzing, on shared and distributed computing systems, information comprising one or more documents are disclosed according to some aspects. In one embodiment, information analysis can comprise distributing one or more distinct sets of documents among each of a plurality of processes, wherein each process performs operations on a distinct set of documents substantially in parallel with other processes. Operations by each process can further comprise computing term statistics for terms contained in each distinct set of documents, thereby generating a local set of term statistics for each distinct set of documents. Still further, operations by each process can comprise contributing the local sets of term statistics to a global set of term statistics, and participating in generating a major term set from an assigned portion of a global vocabulary.
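    A toy, sequential rendition of the local-to-global term-statistics flow described above (in the real system the per-partition step runs substantially in parallel; reducing the "major term set" to a simple top-k by frequency is an illustrative assumption):

```python
from collections import Counter

def local_term_stats(documents):
    """Term statistics for one process's distinct set of documents."""
    counts = Counter()
    for doc in documents:
        counts.update(doc.lower().split())
    return counts

def analyze(partitions, top_k=3):
    """Each partition plays the role of one process: local term counts
    are contributed to a global set of term statistics, from which a
    'major term set' (here: the top-k most frequent terms) is drawn."""
    global_stats = Counter()
    for docs in partitions:          # in the real system, parallel processes
        global_stats.update(local_term_stats(docs))
    return [term for term, _ in global_stats.most_common(top_k)]
```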

  5. Impairment in local and global processing and set-shifting in body dysmorphic disorder

    PubMed Central

    Kerwin, Lauren; Hovav, Sarit; Helleman, Gerhard; Feusner, Jamie D.

    2014-01-01

    Body dysmorphic disorder (BDD) is characterized by distressing and often debilitating preoccupations with misperceived defects in appearance. Research suggests that aberrant visual processing may contribute to these misperceptions. This study used two tasks to probe global and local visual processing as well as set shifting in individuals with BDD. Eighteen unmedicated individuals with BDD and 17 non-clinical controls completed two global-local tasks. The embedded figures task requires participants to determine which of three complex figures contained a simpler figure embedded within it. The Navon task utilizes incongruent stimuli comprised of a large letter (global level) made up of smaller letters (local level). The outcome measures were response time and accuracy rate. On the embedded figures task, BDD individuals were slower and less accurate than controls. On the Navon task, BDD individuals processed both global and local stimuli slower and less accurately than controls, and there was a further decrement in performance when shifting attention between the different levels of stimuli. Worse insight correlated with poorer performance on both tasks. Taken together, these results suggest abnormal global and local processing for non-appearance related stimuli among BDD individuals, in addition to evidence of poor set-shifting abilities. Moreover, these abnormalities appear to relate to the important clinical variable of poor insight. Further research is needed to explore these abnormalities and elucidate their possible role in the development and/or persistence of BDD symptoms. PMID:24972487

  6. Expression signature as a biomarker for prenatal diagnosis of trisomy 21.

    PubMed

    Volk, Marija; Maver, Aleš; Lovrečić, Luca; Juvan, Peter; Peterlin, Borut

    2013-01-01

    A universal biomarker panel with the potential to predict high-risk pregnancies or adverse pregnancy outcome does not exist. Transcriptome analysis is a powerful tool to capture differentially expressed genes (DEG), which can be used as a biomarker-based diagnostic and predictive tool for various conditions in the prenatal setting. In search of a biomarker set for predicting high-risk pregnancies, we performed global expression profiling to find DEG in Ts21. Subsequently, we performed targeted validation and diagnostic performance evaluation on a larger group of case and control samples. Initially, transcriptomic profiles of 10 cultivated amniocyte samples with Ts21 and 9 with a normal euploid constitution were determined using expression microarrays. Datasets from Ts21 transcriptomic studies from the GEO repository were incorporated. DEG were discovered using linear regression modelling and validated using RT-PCR quantification on an independent sample of 16 cases with Ts21 and 32 controls. Classification of Ts21 status based on expression profiling was performed using a supervised machine learning algorithm and evaluated using a leave-one-out cross-validation approach. Global gene expression profiling revealed significant expression changes between normal and Ts21 samples, which, in combination with data from previously performed Ts21 transcriptomic studies, were used to generate a multi-gene biomarker for Ts21 comprising 9 gene expression profiles. In addition to the biomarker's high performance in discriminating samples from global expression profiling, we were also able to show its discriminatory performance on a larger sample set validated using an RT-PCR experiment (AUC=0.97), while its performance on data from previously published studies reached discriminatory AUC values of 1.00. Our results show that transcriptomic changes might potentially be used to discriminate trisomy of chromosome 21 in the prenatal setting. 
As expressional alterations reflect both causal and reactive cellular mechanisms, transcriptomic changes may thus have future potential in the diagnosis of a wide array of heterogeneous diseases that result from genetic disturbances.
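    The leave-one-out evaluation described above can be sketched in a few lines; the nearest-centroid classifier here is an illustrative stand-in for the authors' supervised learner, and the toy expression values are hypothetical:

```python
def nearest_centroid(train_X, train_y, x):
    """Classify sample x by distance to each class's mean profile."""
    classes = sorted(set(train_y))
    def centroid(c):
        rows = [xi for xi, yi in zip(train_X, train_y) if yi == c]
        return [sum(col) / len(rows) for col in zip(*rows)]
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b))
    return min(classes, key=lambda c: dist(centroid(c), x))

def loocv_accuracy(X, y):
    """Leave-one-out cross-validation: each sample is classified by a
    model trained on all remaining samples."""
    hits = 0
    for i in range(len(X)):
        train_X = X[:i] + X[i + 1:]
        train_y = y[:i] + y[i + 1:]
        hits += nearest_centroid(train_X, train_y, X[i]) == y[i]
    return hits / len(X)
```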

  7. Aiming High: Setting Performance Standards for Student Success

    ERIC Educational Resources Information Center

    Phillips, Gary; Garcia, Alicia N.

    2015-01-01

    Content standards, not performance standards, have been almost the sole focus of state policies and recent conversations about academic standards. Without rigorous content and performance standards, we cannot adequately prepare students for the global marketplace. A recent AIR study shows that state performance standards are consistently low and…

  8. The Global Emergency Observation and Warning System

    NASA Technical Reports Server (NTRS)

    Bukley, Angelia P.; Mulqueen, John A.

    1994-01-01

    Based on an extensive characterization of natural hazards and an evaluation of their impacts on humanity, a set of functional technical requirements for a global warning and relief system was developed. No technological breakthroughs are required to implement a global system capable of providing sufficient information for prevention, preparedness, warning, and relief from natural disaster effects; accordingly, a system is proposed that would combine remote sensing, data processing, information distribution, and communications support on a global scale for disaster mitigation.

  9. Evaluation of high fidelity patient simulator in assessment of performance of anaesthetists.

    PubMed

    Weller, J M; Bloch, M; Young, S; Maze, M; Oyesola, S; Wyner, J; Dob, D; Haire, K; Durbridge, J; Walker, T; Newble, D

    2003-01-01

    There is increasing emphasis on performance-based assessment of clinical competence. The High Fidelity Patient Simulator (HPS) may be useful for assessment of clinical practice in anaesthesia, but needs formal evaluation of validity, reliability, feasibility and effect on learning. We set out to assess the reliability of a global rating scale for scoring simulator performance in crisis management. Using a global rating scale, three judges independently rated videotapes of anaesthetists in simulated crises in the operating theatre. Five anaesthetists then independently rated subsets of these videotapes. There was good agreement between raters for medical management, behavioural attributes and overall performance. Agreement was high for both the initial judges and the five additional raters. Using a global scale to assess simulator performance, we found good inter-rater reliability for scoring performance in a crisis. We estimate that two judges should provide a reliable assessment. High fidelity simulation should be studied further for assessing clinical performance.

  10. God: Do I Have Your Attention?

    ERIC Educational Resources Information Center

    Colzato, Lorenza S.; van Beest, Ilja; van den Wildenberg, Wery P. M.; Scorolli, Claudia; Dorchin, Shirley; Meiran, Nachshon; Borghi, Anna M.; Hommel, Bernhard

    2010-01-01

    Religion is commonly defined as a set of rules, developed as part of a culture. Here we provide evidence that practice in following these rules systematically changes the way people attend to visual stimuli, as indicated by the individual sizes of the global precedence effect (better performance to global than to local features). We show that this…

  11. Effort in Multitasking: Local and Global Assessment of Effort.

    PubMed

    Kiesel, Andrea; Dignath, David

    2017-01-01

    When performing multiple tasks in succession, self-organization of task order might be superior to externally controlled task schedules, because self-organization allows optimizing processing modes and thus reduces switch costs, and it increases commitment to task goals. However, self-organization is an additional executive control process that is not required if task order is externally specified, and as such it is considered time-consuming and effortful. To compare self-organized and externally controlled task scheduling, we suggest assessing global subjective and objective measures of effort in addition to local performance measures. In our new experimental approach, we combined characteristics of dual-tasking and task-switching settings and compared local and global measures of effort in a condition with free choice of task sequence and a condition with cued task sequence. In a multitasking environment, participants chose the task order while the task requirement of the not-yet-performed task remained the same. This task preview allowed participants to work on the previously non-chosen items in parallel and resulted in faster responses and fewer errors in task switch trials than in task repetition trials. The free-choice group profited more from this task preview than the cued group when considering local performance measures. Nevertheless, the free-choice group invested more effort than the cued group when considering global measures. Thus, self-organization in task scheduling seems to be effortful even in conditions in which it is beneficial for task processing. In a second experiment, we reduced the possibility of task preview for the not-yet-performed tasks in order to hinder efficient self-organization. Here neither local nor global measures revealed substantial differences between the free-choice and a cued task sequence condition. 
Based on the results of both experiments, we suggest that global assessment of effort in addition to local performance measures might be a useful tool for multitasking research.

  12. Global Arrays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnamoorthy, Sriram; Daily, Jeffrey A.; Vishnu, Abhinav

    2015-11-01

    Global Arrays (GA) is a distributed-memory programming model that combines shared-memory-style programming with one-sided communication to create a set of tools pairing high performance with ease of use. GA exposes a relatively straightforward programming abstraction while supporting fully distributed data structures, locality of reference, and high-performance communication. GA was originally formulated in the early 1990s to provide a communication layer for the Northwest Chemistry (NWChem) suite of chemistry modeling codes, which was being developed concurrently.
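    The flavor of the one-sided get/put/accumulate abstraction can be suggested with a toy, purely process-local stand-in. This is emphatically not the real GA API (nor its Python bindings); the class, method names, and semantics below are illustrative only, with no actual distribution or communication:

```python
class GlobalArray:
    """Toy stand-in for the GA idea: a logically shared 1-D array with
    one-sided access to sub-ranges. Purely illustrative; a real global
    array distributes its data across processes."""
    def __init__(self, size, fill=0.0):
        self._data = [fill] * size

    def get(self, lo, hi):
        """One-sided read of the patch [lo, hi)."""
        return list(self._data[lo:hi])

    def put(self, lo, values):
        """One-sided write starting at index lo."""
        self._data[lo:lo + len(values)] = values

    def acc(self, lo, values):
        """One-sided accumulate (element-wise +=) starting at lo."""
        for i, v in enumerate(values):
            self._data[lo + i] += v
```

The point of the abstraction is that callers address the logical array by global indices, without knowing which process owns which patch.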

  13. Static Evaluation of a NAVSTAR GPS (Magnavox Z-Set) Receiver - May-September 1979

    DOT National Transportation Integrated Search

    1980-05-01

    The report documents the results of the static testing of a NAVSTAR Global Positioning System (GPS) single channel sequential receiver (Magnavox Z-Set). These tests were performed at the Coast Guard District 11 office in Long Beach, CA from May to Se...

  14. Forecasting Space Weather-Induced GPS Performance Degradation Using Random Forest

    NASA Astrophysics Data System (ADS)

    Filjar, R.; Filic, M.; Milinkovic, F.

    2017-12-01

    Space weather and ionospheric dynamics have a profound effect on the positioning performance of Global Navigation Satellite Systems (GNSS). However, the quantification of that effect is still the subject of scientific activities around the world. In the latest contribution to the understanding of space weather and ionospheric effects on satellite-based positioning performance, we conducted a study of several candidate methods for forecasting space weather-induced GPS positioning performance deterioration. First, a five-day set of experimentally collected data was established, encompassing space weather and ionospheric activity indices (including: readings of the Sudden Ionospheric Disturbance (SID) monitors, components of geomagnetic field strength, the global Kp index, the Dst index, GPS-derived Total Electron Content (TEC) samples, the standard deviation of TEC samples, and sunspot number) and observations of GPS positioning error components (northing, easting, and height positioning error) derived from the Adriatic Sea IGS reference stations' RINEX raw pseudorange files in quiet space weather periods. This data set was split into training and test subsets. Then, a selected set of supervised machine learning methods based on Random Forest was applied to the experimentally collected data set in order to establish appropriate regional (Adriatic Sea) forecasting models for space weather-induced GPS positioning performance deterioration. The forecasting models were developed in the R/rattle statistical programming environment. The forecasting quality of the regional forecasting models was assessed, and conclusions were drawn on the advantages and shortcomings of regional forecasting models for space weather-caused GNSS positioning performance deterioration.
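    To illustrate the modelling approach, here is a miniature random forest regressor (bootstrap row sampling plus random feature subsets over depth-one trees). The authors worked in R/rattle, so everything below is an independent sketch of the technique on hypothetical data, not their code:

```python
import random

def fit_stump(X, y, feats):
    """Best single-feature threshold split minimising squared error."""
    best = None
    for f in feats:
        values = sorted(set(row[f] for row in X))
        for lo, hi in zip(values, values[1:]):
            t = (lo + hi) / 2.0
            left = [yi for row, yi in zip(X, y) if row[f] <= t]
            right = [yi for row, yi in zip(X, y) if row[f] > t]
            ml, mr = sum(left) / len(left), sum(right) / len(right)
            err = (sum((v - ml) ** 2 for v in left)
                   + sum((v - mr) ** 2 for v in right))
            if best is None or err < best[0]:
                best = (err, f, t, ml, mr)
    if best is None:                      # no valid split: predict the mean
        m = sum(y) / len(y)
        return lambda row: m
    _, f, t, ml, mr = best
    return lambda row: ml if row[f] <= t else mr

def fit_forest(X, y, n_trees=25, seed=0):
    """Average of stumps, each fit on a bootstrap sample and a random
    subset of features -- the core random forest idea in miniature."""
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    trees = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]        # bootstrap rows
        feats = rng.sample(range(d), max(1, d // 2))      # feature subset
        trees.append(fit_stump([X[i] for i in idx],
                               [y[i] for i in idx], feats))
    return lambda row: sum(t(row) for t in trees) / len(trees)
```

A production model would of course use deeper trees and a mature library; the sketch only shows why the ensemble's prediction tracks the training relationship.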

  15. A framework for combining multiple soil moisture retrievals based on maximizing temporal correlation

    NASA Astrophysics Data System (ADS)

    Kim, Seokhyeon; Parinussa, Robert M.; Liu, Yi. Y.; Johnson, Fiona M.; Sharma, Ashish

    2015-08-01

    A method for combining two microwave satellite soil moisture products by maximizing the temporal correlation with a reference data set has been developed. The method was applied to two global soil moisture data sets, Japan Aerospace Exploration Agency (JAXA) and Land Parameter Retrieval Model (LPRM), retrieved from the Advanced Microwave Scanning Radiometer 2 observations for the period 2012-2014. A global comparison revealed superior results of the combined product compared to the individual products against the reference data set of ERA-Interim volumetric water content. The global mean temporal correlation coefficient of the combined product with this reference was 0.52 which outperforms the individual JAXA (0.35) as well as the LPRM (0.45) product. Additionally, the performance was evaluated against in situ observations from the International Soil Moisture Network. The combined data set showed a significant improvement in temporal correlation coefficients in the validation compared to JAXA and minor improvements for the LPRM product.
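    The combination principle above can be sketched as a search over a mixing weight w that maximizes the Pearson correlation of w*sm1 + (1-w)*sm2 with the reference series. The paper's actual estimator may be analytic rather than this grid search; the sketch is an illustrative reconstruction:

```python
def pearson(a, b):
    """Plain Pearson correlation of two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def combine(sm1, sm2, reference, steps=100):
    """Grid-search the weight w in [0, 1] that maximises the temporal
    correlation of w*sm1 + (1-w)*sm2 with the reference series."""
    def blend(w):
        return [w * x + (1 - w) * y for x, y in zip(sm1, sm2)]
    best_w = max((i / steps for i in range(steps + 1)),
                 key=lambda w: pearson(blend(w), reference))
    return best_w, blend(best_w)
```

Because w = 0 and w = 1 are in the grid, the combined series can never correlate worse with the reference than the better individual product does.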

  16. Evidence for a Global Sampling Process in Extraction of Summary Statistics of Item Sizes in a Set.

    PubMed

    Tokita, Midori; Ueda, Sachiyo; Ishiguchi, Akira

    2016-01-01

    Several studies have shown that our visual system may construct a "summary statistical representation" over groups of visual objects. Although there is a general understanding that human observers can accurately represent sets of a variety of features, many questions on how summary statistics, such as an average, are computed remain unanswered. This study investigated sampling properties of visual information used by human observers to extract two types of summary statistics of item sets, average and variance. We presented three models of ideal observers to extract the summary statistics: a global sampling model without sampling noise, global sampling model with sampling noise, and limited sampling model. We compared the performance of an ideal observer of each model with that of human observers using statistical efficiency analysis. Results suggest that summary statistics of items in a set may be computed without representing individual items, which makes it possible to discard the limited sampling account. Moreover, the extraction of summary statistics may not necessarily require the representation of individual objects with focused attention when the sets of items are larger than 4.

  17. Increasing signal processing sophistication in the calculation of the respiratory modulation of the photoplethysmogram (DPOP).

    PubMed

    Addison, Paul S; Wang, Rui; Uribe, Alberto A; Bergese, Sergio D

    2015-06-01

    DPOP (∆POP or Delta-POP) is a non-invasive parameter which measures the strength of respiratory modulations present in the pulse oximetry photoplethysmogram (pleth) waveform. It has been proposed as a non-invasive surrogate parameter for pulse pressure variation (PPV) used in the prediction of the response to volume expansion in hypovolemic patients. Many groups have reported on the DPOP parameter and its correlation with PPV using various semi-automated algorithmic implementations. The study reported here demonstrates the performance gains made by adding increasingly sophisticated signal processing components to a fully automated DPOP algorithm. A DPOP algorithm was coded and its performance systematically enhanced through a series of code module alterations and additions. Each algorithm iteration was tested on data from 20 mechanically ventilated OR patients. Correlation coefficients and ROC curve statistics were computed at each stage. For the purposes of the analysis we split the data into a manually selected 'stable' region subset of the data containing relatively noise free segments and a 'global' set incorporating the whole data record. Performance gains were measured in terms of correlation against PPV measurements in OR patients undergoing controlled mechanical ventilation. Through increasingly advanced pre-processing and post-processing enhancements to the algorithm, the correlation coefficient between DPOP and PPV improved from a baseline value of R = 0.347 to R = 0.852 for the stable data set, and, correspondingly, R = 0.225 to R = 0.728 for the more challenging global data set. Marked gains in algorithm performance are achievable for manually selected stable regions of the signals using relatively simple algorithm enhancements. Significant additional algorithm enhancements, including a correction for low perfusion values, were required before similar gains were realised for the more challenging global data set.
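    For reference, DPOP is commonly computed from beat-to-beat pleth pulse amplitudes over a respiratory cycle as the max/min amplitude difference normalised by their mean. A minimal sketch of that core calculation follows; beat detection, windowing, filtering, and the pre/post-processing enhancements the study evaluates are all omitted:

```python
def dpop(beat_amplitudes):
    """DPOP (%) from a window of beat-to-beat pleth pulse amplitudes:
    (POPmax - POPmin) normalised by their mean, times 100."""
    pop_max = max(beat_amplitudes)
    pop_min = min(beat_amplitudes)
    return 100.0 * (pop_max - pop_min) / ((pop_max + pop_min) / 2.0)
```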

  18. APOE Genotypes Associate With Cognitive Performance but Not Cerebral Structure: Diabetes Heart Study MIND

    PubMed Central

    Raffield, Laura M.; Hardy, Joycelyn C.; Hsu, Fang-Chi; Divers, Jasmin; Xu, Jianzhao; Smith, S. Carrie; Hugenschmidt, Christina E.; Wagner, Benjamin C.; Whitlow, Christopher T.; Sink, Kaycee M.; Maldjian, Joseph A.; Williamson, Jeff D.; Bowden, Donald W.; Freedman, Barry I.

    2016-01-01

    OBJECTIVE Dementia is a debilitating illness with a disproportionate burden in patients with type 2 diabetes (T2D). Among the contributors, genetic variation at the apolipoprotein E locus (APOE) is posited to convey a strong effect. This study compared and contrasted the association of APOE with cognitive performance and cerebral structure in the setting of T2D. RESEARCH DESIGN AND METHODS European Americans from the Diabetes Heart Study (DHS) MIND (n = 754) and African Americans from the African American (AA)-DHS MIND (n = 517) were examined. The cognitive battery assessed executive function, memory, and global cognition, and brain MRI was performed. RESULTS In European Americans and African Americans, the APOE E4 risk haplotype group was associated with poorer performance on the modified Mini-Mental Status Examination (P < 0.017), a measure of global cognition. In contrast to the literature, the APOE E2 haplotype group, which was overrepresented in these participants with T2D, was associated with poorer Rey Auditory Verbal Learning Test performance (P < 0.032). Nominal associations between APOE haplotype groups and MRI-determined cerebral structure were observed. CONCLUSIONS Compared with APOE E3 carriers, E2 and E4 carriers performed worse in the cognitive domains of memory and global cognition. Identification of genetic contributors remains critical to understanding new pathways to prevent and treat dementia in the setting of T2D. PMID:27703028

  19. Robust Global Image Registration Based on a Hybrid Algorithm Combining Fourier and Spatial Domain Techniques

    DTIC Science & Technology

    2012-09-01

    Robust global image registration based on a hybrid algorithm combining Fourier and spatial domain techniques. Peter N. Crabtree, Collin Seanor... Results demonstrate the performance of a hybrid algorithm, from analysis of a set of images of an ISO 12233 [12] resolution chart captured in the

  20. A surgical skills laboratory improves residents' knowledge and performance of episiotomy repair.

    PubMed

    Banks, Erika; Pardanani, Setul; King, Mary; Chudnoff, Scott; Damus, Karla; Freda, Margaret Comerford

    2006-11-01

    This study was undertaken to assess whether a surgical skills laboratory improves residents' knowledge and performance of episiotomy repair. Twenty-four first- and second-year residents were randomly assigned to either a surgical skills laboratory on episiotomy repair or traditional teaching alone. Pre- and posttests assessed basic knowledge. Blinded attending physicians assessed performance, evaluating residents on second-degree laceration/episiotomy repairs in the clinical setting with 3 validated tools: a task-specific checklist, global rating scale, and a pass-fail grade. Postgraduate year 1 (PGY-1) residents participating in the laboratory scored significantly better on all 3 surgical assessment tools: the checklist, the global score, and the pass/fail analysis. All the residents who had the teaching laboratory demonstrated significant improvements on knowledge and the skills checklist. PGY-2 residents did not benefit as much as PGY-1 residents. A surgical skills laboratory improved residents' knowledge and performance in the clinical setting. Improvement was greatest for PGY-1 residents.

  1. Modelling machine ensembles with discrete event dynamical system theory

    NASA Technical Reports Server (NTRS)

    Hunter, Dan

    1990-01-01

    Discrete Event Dynamical System (DEDS) theory can be utilized as a control strategy for future complex machine ensembles that will be required for in-space construction. The control strategy involves orchestrating a set of interactive submachines to perform a set of tasks under a given set of constraints, such as minimum time, minimum energy, or maximum machine utilization. Machine ensembles can be hierarchically modeled as a global model that combines the operations of the individual submachines. These submachines are represented in the global model as local models. Local models, from the perspective of DEDS theory, are described by the following: a set of system and transition states, an event alphabet that portrays actions that take a submachine from one state to another, an initial system state, a partial function that maps the current state and event alphabet to the next state, and the time required for each event to occur. Each submachine in the machine ensemble is represented by a unique local model. The global model combines the local models such that they can operate in parallel under the additional logistic and physical constraints due to submachine interactions. The global model is constructed from the states, events, event functions, and timing requirements of the local models. Supervisory control can be implemented in the global model by various methods, such as task scheduling (open-loop control) or a feedback DEDS controller (closed-loop control).
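    The five-part local model enumerated above (states, event alphabet, initial state, partial transition function, event durations) maps directly onto a small data structure. The sketch below, with hypothetical submachine states and events, applies a sequence of events and accumulates their durations:

```python
from dataclasses import dataclass

@dataclass
class LocalModel:
    """A DEDS local model: states, an event alphabet, an initial state,
    a partial transition map, and per-event durations."""
    states: set
    events: set
    initial: str
    delta: dict          # (state, event) -> next state (partial function)
    duration: dict       # event -> time required

    def run(self, event_sequence):
        """Apply events in order; return final state and elapsed time.
        Raises if an event is not enabled in the current state."""
        state, elapsed = self.initial, 0.0
        for e in event_sequence:
            if (state, e) not in self.delta:
                raise ValueError(f"event {e!r} not enabled in state {state!r}")
            state = self.delta[(state, e)]
            elapsed += self.duration[e]
        return state, elapsed
```

A global model would compose several such automata, restricting their joint event sequences by the interaction constraints; the partial `delta` is what lets a supervisor forbid illegal moves.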

  2. European Responses to Global Competitiveness in Higher Education. Research & Occasional Paper Series: CSHE.7.09

    ERIC Educational Resources Information Center

    van der Wende, Marijk

    2009-01-01

    The growing global competition in which knowledge is a prime factor for economic growth is increasingly shaping policies and setting the agenda for the future of European higher education. With its aim to become the world's leading knowledge economy, the European Union is concerned about its performance in the knowledge sector, in particular in…

  3. Applying a global justice lens to health systems research ethics: an initial exploration.

    PubMed

    Pratt, Bridget; Hyder, Adnan A

    2015-03-01

    Recent scholarship has considered what, if anything, rich people owe to poor people to achieve justice in global health and the implications of this for international research. Yet this work has primarily focused on international clinical research. Health systems research is increasingly being performed in low and middle income countries and is essential to reducing global health disparities. This paper provides an initial description of the ethical issues related to priority setting, capacity-building, and the provision of post-study benefits that arise during the conduct of such research. It presents a selection of issues discussed in the health systems research literature and argues that they constitute ethical concerns based on their being inconsistent with a particular theory of global justice (the health capability paradigm). Issues identified include the fact that priority setting for health systems research at the global level is often not driven by national priorities and that capacity-building efforts frequently utilize one-size-fits-all approaches.

  4. Formulating Spatially Varying Performance in the Statistical Fusion Framework

    PubMed Central

    Landman, Bennett A.

    2012-01-01

    To date, label fusion methods have primarily relied either on global (e.g. STAPLE, globally weighted vote) or voxelwise (e.g. locally weighted vote) performance models. Optimality of the statistical fusion framework hinges upon the validity of the stochastic model of how a rater errs (i.e., the labeling process model). Hitherto, approaches have tended to focus on the extremes of potential models. Herein, we propose an extension to the STAPLE approach to seamlessly account for spatially varying performance by extending the performance level parameters to account for a smooth, voxelwise performance level field that is unique to each rater. This approach, Spatial STAPLE, provides significant improvements over state-of-the-art label fusion algorithms in both simulated and empirical data sets. PMID:22438513
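    For contrast with the voxelwise performance field of Spatial STAPLE, the globally weighted vote mentioned above (one global performance weight per rater) can be sketched as follows; the flat label lists stand in for image volumes:

```python
def globally_weighted_vote(rater_labels, rater_weights):
    """rater_labels: one flat list of voxel labels per rater;
    rater_weights: one global performance weight per rater.
    Each voxel receives the label with the highest total weight."""
    n_voxels = len(rater_labels[0])
    fused = []
    for v in range(n_voxels):
        scores = {}
        for labels, w in zip(rater_labels, rater_weights):
            scores[labels[v]] = scores.get(labels[v], 0.0) + w
        fused.append(max(scores, key=scores.get))
    return fused
```

STAPLE-family methods go further by estimating the rater weights (or, for Spatial STAPLE, a smooth per-voxel performance field) jointly with the fused labels via EM, rather than taking them as given.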

  5. Continuation of the NVAP Global Water Vapor Data Sets for Pathfinder Science Analysis

    NASA Technical Reports Server (NTRS)

    VonderHaar, Thomas H.; Engelen, Richard J.; Forsythe, John M.; Randel, David L.; Ruston, Benjamin C.; Woo, Shannon; Dodge, James (Technical Monitor)

    2001-01-01

    This annual report covers August 2000 - August 2001 under NASA contract NASW-0032, entitled "Continuation of the NVAP (NASA's Water Vapor Project) Global Water Vapor Data Sets for Pathfinder Science Analysis". NASA has created a list of Earth Science Research Questions, outlined by Asrar et al. Particularly relevant to NVAP are the following questions: (a) How are global precipitation, evaporation, and the cycling of water changing? (b) What trends in atmospheric constituents and solar radiation are driving global climate? (c) How well can long-term climatic trends be assessed or predicted? Water vapor is a key greenhouse gas, and an understanding of its behavior is essential in global climate studies. NVAP therefore plays a key role in addressing these climate questions by creating a long-term global water vapor dataset and by updating it with recent advances in satellite instrumentation. The NVAP dataset produced for 1988-1998 has found wide use in the scientific community; studies of interannual variability are particularly important. A recent paper by Simpson et al., which examined the NVAP dataset in detail, has shown that its relative accuracy is sufficient for the variability studies that contribute toward meeting NASA's goals. In the past year, we have made steady progress toward continuing production of this high-quality dataset as well as performing our own investigations of the data. This report summarizes the past year's work on production of the NVAP dataset and presents results of analyses performed in the past year.

  6. Assessing Low-Intensity Relationships in Complex Networks

    PubMed Central

    Spitz, Andreas; Gimmler, Anna; Stoeck, Thorsten; Zweig, Katharina Anna; Horvát, Emőke-Ágnes

    2016-01-01

    Many large network data sets are noisy and contain links representing low-intensity relationships that are difficult to differentiate from random interactions. This is especially relevant for high-throughput data from systems biology, large-scale ecological data, but also for Web 2.0 data on human interactions. In these networks with missing and spurious links, it is possible to refine the data based on the principle of structural similarity, which assesses the shared neighborhood of two nodes. By using similarity measures to globally rank all possible links and choosing the top-ranked pairs, true links can be validated, missing links inferred, and spurious observations removed. While many similarity measures have been proposed to this end, there is no general consensus on which one to use. In this article, we first contribute a set of benchmarks for complex networks from three different settings (e-commerce, systems biology, and social networks) and thus enable a quantitative performance analysis of classic node similarity measures. Based on this, we then propose a new methodology for link assessment called z* that assesses the statistical significance of the number of their common neighbors by comparison with the expected value in a suitably chosen random graph model and which is a consistently top-performing algorithm for all benchmarks. In addition to a global ranking of links, we also use this method to identify the most similar neighbors of each single node in a local ranking, thereby showing the versatility of the method in two distinct scenarios and augmenting its applicability. Finally, we perform an exploratory analysis on an oceanographic plankton data set and find that the distribution of microbes follows similar biogeographic rules as those of macroorganisms, a result that rejects the global dispersal hypothesis for microbes. PMID:27096435
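The core of such a link-assessment measure can be sketched as a z-score of the observed common-neighbor count against a random-graph expectation. The binomial null model below is a deliberate simplification for illustration; the z* measure of the article uses a more carefully chosen random graph model.

```python
import math

def z_score_common_neighbors(adj, u, v):
    """Approximate z-score for the observed number of common neighbors.

    Simplified null model (an assumption of this sketch): each of v's
    edges lands on a uniformly random node other than u and v, so the
    common-neighbor count is roughly Binomial(d_v, d_u / (n - 2)).
    """
    n = len(adj)
    d_u, d_v = len(adj[u]), len(adj[v])
    common = len(adj[u] & adj[v])
    p = d_u / (n - 2)
    mean = d_v * p
    var = d_v * p * (1.0 - p)
    return (common - mean) / math.sqrt(var) if var > 0 else 0.0
```

Ranking all node pairs by this score and keeping the top of the list is the "global ranking" use; ranking only the pairs involving one fixed node gives the "local ranking" use.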

  8. Data Sparsity Considerations in Climate Impact Analysis for the Water Sector (Invited)

    NASA Astrophysics Data System (ADS)

    Asante, K. O.; Khimsara, P.; Chan, A.

    2013-12-01

    Scientists and planners are helping governments and communities around the world prepare for climate change by performing local impact studies and developing adaptation plans. Most studies begin by analyzing global climate model outputs to estimate the magnitude of projected change, assessing vulnerabilities, and proposing adaptation measures. In these studies, climate projections from the Intergovernmental Panel on Climate Change (IPCC) Data Distribution Centre (DDC) are either used directly or downscaled using regional models. Since climate projections cover the entire globe, climate change analysis can be performed for any location. However, selecting climate projections for use in historically data-sparse regions presents special challenges. Key questions arise about the impact of historical data sparsity on the quality of climate projections, the spatial consistency of results, and the suitability for applications such as water resource planning. In this paper, a water-sector climate study conducted in a data-rich setting in California is compared to a similar study conducted in a data-sparse setting in Mozambique. The challenges of selecting projections, performing analysis, and interpreting the results for climate adaptation planning are compared to illustrate the decision process and the challenges encountered in these two very different settings.

  9. Improvement of training set structure in fusion data cleaning using Time-Domain Global Similarity method

    NASA Astrophysics Data System (ADS)

    Liu, J.; Lan, T.; Qin, H.

    2017-10-01

    Traditional data cleaning identifies dirty data by classifying original data sequences, which is a class-imbalanced problem, since the proportion of incorrect data is much smaller than that of correct data for most diagnostic systems in Magnetic Confinement Fusion (MCF) devices. When machine learning algorithms are used to classify diagnostic data based on a class-imbalanced training set, most classifiers are biased towards the majority class and show very poor classification rates on the minority class. By transforming the direct classification problem on original data sequences into a classification problem on the physical similarity between data sequences, the class-balancing effect of the Time-Domain Global Similarity (TDGS) method on training set structure is investigated in this paper. Meanwhile, the impact of the improved training set structure on the data cleaning performance of the TDGS method is demonstrated with an application example in the EAST POlarimetry-INTerferometry (POINT) system.
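A simple counting sketch shows why recasting sequence-level labels as pairwise similarity labels softens the class imbalance. The pairing scheme here (similar = both sequences correct; dissimilar = exactly one incorrect) is an assumption chosen for illustration; the actual TDGS construction may differ in detail.

```python
from math import comb

def pair_counts(n_correct, n_incorrect):
    """Count 'similar' vs 'dissimilar' training pairs when sequence labels
    are recast as pairwise physical-consistency labels.  Pairs of two
    incorrect sequences are left out, as their mutual consistency is
    undefined in this sketch."""
    similar = comb(n_correct, 2)          # both members correct
    dissimilar = n_correct * n_incorrect  # exactly one member incorrect
    return similar, dissimilar
```

For 95 correct and 5 incorrect sequences, the raw class ratio is 19:1, while the pairwise ratio drops to roughly 9.4:1, which is the rebalancing effect the abstract investigates.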

  10. Global pediatric environmental health.

    PubMed

    Guidotti, Tee L; Gitterman, Benjamin A

    2007-04-01

    Children are uniquely vulnerable to environmental health problems. In developed countries, the most commonly reported problems are ambient (outdoor) air pollution and lead exposure. Developing countries face a wider range of common problems, including childhood injuries, indoor air pollution, infectious disease, and poor sanitation with unsafe water. Globally, the agencies of the United Nations act to protect children and perform essential reporting and standard-setting functions. Conditions vary greatly among countries and are not always better in developing countries. Protecting the health of children requires strengthening the public health and medical systems in every country, rather than a single global agenda.

  11. Exploring the limit of accuracy for density functionals based on the generalized gradient approximation: Local, global hybrid, and range-separated hybrid functionals with and without dispersion corrections

    DOE PAGES

    Mardirossian, Narbe; Head-Gordon, Martin

    2014-03-25

    The limit of accuracy for semi-empirical generalized gradient approximation (GGA) density functionals is explored in this paper by parameterizing a variety of local, global hybrid, and range-separated hybrid functionals. The training methodology employed differs from conventional approaches in 2 main ways: (1) Instead of uniformly truncating the exchange, same-spin correlation, and opposite-spin correlation functional inhomogeneity correction factors, all possible fits up to fourth order are considered, and (2) Instead of selecting the optimal functionals based solely on their training set performance, the fits are validated on an independent test set and ranked based on their overall performance on the training and test sets. The 3 different methods of accounting for exchange are trained both with and without dispersion corrections (DFT-D2 and VV10), resulting in a total of 491 508 candidate functionals. For each of the 9 functional classes considered, the results illustrate the trade-off between improved training set performance and diminished transferability. Since all 491 508 functionals are uniformly trained and tested, this methodology allows the relative strengths of each type of functional to be consistently compared and contrasted. Finally, the range-separated hybrid GGA functional paired with the VV10 nonlocal correlation functional emerges as the most accurate form for the present training and test sets, which span thermochemical energy differences, reaction barriers, and intermolecular interactions involving lighter main group elements.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habte, A.; Lopez, A.; Sengupta, M.

    Typical Meteorological Year (TMY) data sets provide industry-standard resource information for building designers and are commonly used by the solar industry to estimate photovoltaic and concentrating solar power system performance. Historically, TMY data sets were only available for certain station locations, but current TMY data sets are available on the same grid as the National Solar Radiation Database data and are referred to as the gridded TMY. In this report, a comparison of TMY, typical direct (normal irradiance) year (TDY), and typical global (horizontal irradiance) year (TGY) data sets was performed to better understand the impact of ancillary weather variables upon them. These analyses identified geographical areas of high and low temporal and spatial variability, thereby providing insight into the representativeness of a particular TMY data set for use in renewable energy as well as other applications.

  13. Learning Parsimonious Classification Rules from Gene Expression Data Using Bayesian Networks with Local Structure.

    PubMed

    Lustgarten, Jonathan Lyle; Balasubramanian, Jeya Balaji; Visweswaran, Shyam; Gopalakrishnan, Vanathi

    2017-03-01

    The comprehensibility of good predictive models learned from high-dimensional gene expression data is attractive because it can lead to biomarker discovery. Several good classifiers provide comparable predictive performance but differ in their abilities to summarize the observed data. We extend a Bayesian Rule Learning (BRL-GSS) algorithm, previously shown to be a significantly better predictor than other classical approaches in this domain. It searches a space of Bayesian networks using a decision tree representation of its parameters with global constraints, and infers a set of IF-THEN rules. The number of parameters, and therefore the number of rules, grows combinatorially with the number of predictor variables in the model. We relax these global constraints to a more generalizable local structure (BRL-LSS). BRL-LSS entails a more parsimonious set of rules because it does not have to generate all combinatorial rules. The search space of local structures is much richer than the space of global structures. We design BRL-LSS with the same worst-case time complexity as BRL-GSS while exploring a richer and more complex model space. We measure predictive performance using area under the ROC curve (AUC) and accuracy. We measure model parsimony by noting the average number of rules and variables needed to describe the observed data. We evaluate the predictive and parsimony performance of BRL-GSS, BRL-LSS, and the state-of-the-art C4.5 decision tree algorithm, across 10-fold cross-validation using ten microarray gene-expression diagnostic datasets. In these experiments, we observe that BRL-LSS is similar to BRL-GSS in terms of predictive performance, while generating a much more parsimonious set of rules to explain the same observed data. BRL-LSS also needs fewer variables than C4.5 to explain the data with similar predictive performance.
We also conduct a feasibility study to demonstrate the general applicability of our BRL methods on the newer RNA sequencing gene-expression data.
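The rule-extraction step can be pictured as enumerating root-to-leaf paths of a decision tree: a local structure keeps only the paths the tree actually contains, which is why it yields fewer rules than a global structure that expands every combination of predictor values. The tree encoding below is hypothetical, invented for this sketch.

```python
def extract_rules(tree, conditions=None):
    """Enumerate IF-THEN rules as root-to-leaf paths of a decision tree.

    Hypothetical tree format: ('leaf', label) for leaves, or
    (variable, {value: subtree, ...}) for internal nodes.
    Returns a list of (condition_list, label) rules.
    """
    conditions = conditions or []
    if tree[0] == "leaf":
        return [(list(conditions), tree[1])]
    var, branches = tree
    rules = []
    for value, subtree in branches.items():
        rules += extract_rules(subtree, conditions + [(var, value)])
    return rules
```

A tree that tests a second gene only on one branch produces three rules instead of the four a fully expanded (global) structure over two binary variables would generate.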

  14. Level set segmentation of medical images based on local region statistics and maximum a posteriori probability.

    PubMed

    Cui, Wenchao; Wang, Yi; Lei, Tao; Fan, Yangyu; Feng, Yan

    2013-01-01

    This paper presents a variational level set method for simultaneous segmentation and bias field estimation of medical images with intensity inhomogeneity. In our model, the statistics of image intensities belonging to each different tissue in local regions are characterized by Gaussian distributions with different means and variances. According to maximum a posteriori probability (MAP) and Bayes' rule, we first derive a local objective function for image intensities in a neighborhood around each pixel. This local objective function is then integrated with respect to the neighborhood center over the entire image domain to give a global criterion. In the level set framework, this global criterion defines an energy in terms of the level set functions that represent a partition of the image domain and a bias field that accounts for the intensity inhomogeneity of the image. Therefore, image segmentation and bias field estimation are simultaneously achieved via a level set evolution process. Experimental results for synthetic and real images show the desirable performance of our method.
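At the heart of the energy above is a per-pixel MAP decision between Gaussian tissue models. A stripped-down sketch of that decision (without the level set evolution, neighborhood integration, or bias field of the paper) is:

```python
import math

def map_label(intensity, means, variances, priors):
    """Pick the tissue class maximizing the posterior of a pixel intensity
    under per-class Gaussian models (Bayes' rule in log form)."""
    best, best_score = None, -math.inf
    for k in range(len(means)):
        log_post = (-0.5 * math.log(2.0 * math.pi * variances[k])
                    - (intensity - means[k]) ** 2 / (2.0 * variances[k])
                    + math.log(priors[k]))
        if log_post > best_score:
            best, best_score = k, log_post
    return best
```

In the paper this decision is made with locally estimated means and variances, which is what lets the method cope with intensity inhomogeneity.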

  15. Efficient globally optimal segmentation of cells in fluorescence microscopy images using level sets and convex energy functionals.

    PubMed

    Bergeest, Jan-Philip; Rohr, Karl

    2012-10-01

    In high-throughput applications, accurate and efficient segmentation of cells in fluorescence microscopy images is of central importance for the quantification of protein expression and the understanding of cell function. We propose an approach for segmenting cell nuclei which is based on active contours using level sets and convex energy functionals. Compared to previous work, our approach determines the global solution. Thus, the approach does not suffer from local minima and the segmentation result does not depend on the initialization. We consider three different well-known energy functionals for active contour-based segmentation and introduce convex formulations of these functionals. We also suggest a numeric approach for efficiently computing the solution. The performance of our approach has been evaluated using fluorescence microscopy images from different experiments comprising different cell types. We have also performed a quantitative comparison with previous segmentation approaches. Copyright © 2012 Elsevier B.V. All rights reserved.

  16. A path following algorithm for the graph matching problem.

    PubMed

    Zaslavskiy, Mikhail; Bach, Francis; Vert, Jean-Philippe

    2009-12-01

    We propose a convex-concave programming approach for the labeled weighted graph matching problem. The convex-concave programming formulation is obtained by rewriting the weighted graph matching problem as a least-square problem on the set of permutation matrices and relaxing it to two different optimization problems: a quadratic convex and a quadratic concave optimization problem on the set of doubly stochastic matrices. The concave relaxation has the same global minimum as the initial graph matching problem, but the search for its global minimum is also a hard combinatorial problem. We, therefore, construct an approximation of the concave problem solution by following a solution path of a convex-concave problem obtained by linear interpolation of the convex and concave formulations, starting from the convex relaxation. This method makes it easy to integrate information on graph label similarities into the optimization problem and, therefore, to perform labeled weighted graph matching. The algorithm is compared with some of the best performing graph matching methods on four data sets: simulated graphs, QAPLib, retina vessel images, and handwritten Chinese characters. In all cases, the results are competitive with the state of the art.
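A standard building block when optimizing over the doubly stochastic relaxation is scaling a matrix into the doubly stochastic set; the classic Sinkhorn-Knopp iteration is sketched below. The path-following algorithm itself combines such a step with the convex-concave interpolation F_t = (1 - t) F_convex + t F_concave, which is omitted here.

```python
import numpy as np

def sinkhorn(M, iters=200):
    """Scale a strictly positive matrix to (approximately) doubly
    stochastic form by alternately normalizing rows and columns
    (Sinkhorn-Knopp)."""
    M = np.asarray(M, dtype=float).copy()
    for _ in range(iters):
        M /= M.sum(axis=1, keepdims=True)  # rows sum to 1
        M /= M.sum(axis=0, keepdims=True)  # columns sum to 1
    return M
```

As t grows from 0 to 1 along the path, the minimizer is pushed from the interior of this doubly stochastic set toward a permutation matrix, which is the final matching.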

  17. First Results of the Performance of the Global Forest/Non-Forest Map derived from TanDEM-X Interferometric Data

    NASA Astrophysics Data System (ADS)

    Gonzalez, Carolina; Rizzoli, Paola; Martone, Michele; Wecklich, Christopher; Bueso Bello, Jose Luis; Krieger, Gerhard; Zink, Manfred

    2017-04-01

    The globally acquired interferometric synthetic aperture radar (SAR) data set, used for the recently completed primary goal of the TanDEM-X mission, offers a significant opportunity for scientific geo-applications. Of great importance for land characterization, classification, and monitoring is that the data set was acquired globally without gaps and includes multiple acquisitions of every region with comparable parameters. One of the most valuable maps that can be derived from interferometric SAR data for land classification describes the presence or absence of vegetation. In particular, here we report on the development of the Global Forest/Non-Forest Map, derived from TanDEM-X interferometric SAR quick-look data at a ground resolution of 50 m by 50 m. The presence of structures, and in particular vegetation, produces multiple scattering known as volume decorrelation. Its contribution can be directly estimated from the assessment of coherence loss in the interferometric bistatic pair, by compensating for all other decorrelation sources, such as poor signal-to-noise ratio or quantization noise. Three different forest types have been characterized based on the estimated volume decorrelation: tropical, temperate, and boreal forest. This characterization was then used in a fuzzy clustering approach for the discrimination of vegetated areas on a global scale. Water and cities are filtered out from the generated maps in order to distinguish volume decorrelation from other decorrelation sources. The validation and performance comparison of the delivered product is also presented, and represents a fundamental tool for optimizing the whole algorithm at all stages. Furthermore, as the time interval between acquisitions is almost 4 years, change detection can be performed as well, and examples of deforestation are also included in the final paper.

  18. Reference set for performance testing of pediatric vaccine safety signal detection methods and systems.

    PubMed

    Brauchli Pernus, Yolanda; Nan, Cassandra; Verstraeten, Thomas; Pedenko, Mariia; Osokogu, Osemeke U; Weibel, Daniel; Sturkenboom, Miriam; Bonhoeffer, Jan

    2016-12-12

    Safety signal detection in spontaneous reporting system databases and electronic healthcare records is key to the detection of previously unknown adverse events following immunization. Various statistical methods for signal detection in these different data sources have been developed; however, none are geared to the pediatric population and none specifically to vaccines. A reference set comprising pediatric vaccine-adverse event pairs is required for reliable performance testing of statistical methods within and across data sources. The study was conducted within the context of the Global Research in Paediatrics (GRiP) project, part of the seventh framework programme (FP7) of the European Commission. The criteria for selecting vaccines for the reference set were routine and global use in the pediatric population. Adverse events were primarily selected based on importance. Outcome-based systematic literature searches were performed for all identified vaccine-adverse event pairs and complemented by expert committee reports, evidence-based decision support systems (e.g., Micromedex), and summaries of product characteristics. Classification into positive (PC) and negative control (NC) pairs was performed by two independent reviewers according to a pre-defined algorithm and discussed for consensus in case of disagreement. We selected 13 vaccines and 14 adverse events to be included in the reference set. From a total of 182 vaccine-adverse event pairs, we classified 18 as PC, 113 as NC, and 51 as unclassifiable. Most classifications (91) were based on literature review, 45 were based on expert committee reports, and for 46 vaccine-adverse event pairs an underlying pathomechanism was not plausible, classifying the association as NC. A reference set of vaccine-adverse event pairs was developed. We propose its use for comparing signal detection methods and systems in the pediatric population. Published by Elsevier Ltd.

  19. Microarray missing data imputation based on a set theoretic framework and biological knowledge.

    PubMed

    Gan, Xiangchao; Liew, Alan Wee-Chung; Yan, Hong

    2006-01-01

    Gene expressions measured using microarrays usually suffer from the missing value problem. However, in many data analysis methods, a complete data matrix is required. Although existing missing value imputation algorithms have shown good performance in dealing with missing values, they also have their limitations. For example, some algorithms perform well only when strong local correlation exists in the data, while others provide the best estimate when the data is dominated by global structure. In addition, these algorithms do not take into account any biological constraint in their imputation. In this paper, we propose a set theoretic framework based on projection onto convex sets (POCS) for missing data imputation. POCS allows us to incorporate different types of a priori knowledge about missing values into the estimation process. The main idea of POCS is to formulate every piece of prior knowledge into a corresponding convex set and then use a convergence-guaranteed iterative procedure to obtain a solution in the intersection of all these sets. In this work, we design several convex sets, taking into consideration the biological characteristics of the data: the first set mainly exploits the local correlation structure among genes in microarray data, while the second set captures the global correlation structure among arrays. The third set (actually a series of sets) exploits the biological phenomenon of synchronization loss in microarray experiments. In cyclic systems, synchronization loss is a common phenomenon and we construct a series of sets based on this phenomenon for our POCS imputation algorithm. Experiments show that our algorithm can achieve a significant reduction of error compared to the KNNimpute, SVDimpute and LSimpute methods.
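The POCS machinery itself is compact: cyclically apply the projection onto each convex set until the iterates settle in the intersection. The toy sets below (a hyperplane and a box, standing in for the paper's local-structure, global-structure, and synchronization-loss sets) are hypothetical, chosen so the convergence is easy to see.

```python
import numpy as np

def pocs(x0, projections, iters=100):
    """Projection Onto Convex Sets: cyclically project onto each set.
    When the intersection is nonempty, the iterates converge into it."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        for proj in projections:
            x = proj(x)
    return x

def project_hyperplane(a, b):
    """Projection onto {x : a.x = b}."""
    a = np.asarray(a, dtype=float)
    return lambda x: x - (a @ x - b) / (a @ a) * a

def project_box(lo, hi):
    """Projection onto the box {x : lo <= x_i <= hi}."""
    return lambda x: np.clip(x, lo, hi)
```

In the imputation setting, each prior (observed-entry consistency, correlation structure, synchronization loss) would contribute one such projection operator.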

  20. Developing Diverse Teams to Improve Performance in the Organizational Setting

    ERIC Educational Resources Information Center

    Yeager, Katherine L.; Nafukho, Fredrick M.

    2012-01-01

    Purpose: The use of teams in organizations given the current trend toward globalization, population changes, and an aging workforce, especially in high-income countries, makes the issue of diverse team building critical. The purpose of this paper is to explore the issue of team diversity and team performance through the examination of theory and…

  1. Evaluation of regional climate simulations over the Great Lakes region driven by three global data sets

    Treesearch

    Shiyuan Zhong; Xiuping Li; Xindi Bian; Warren E. Heilman; L. Ruby Leung; William I. Jr. Gustafson

    2012-01-01

    The performance of regional climate simulations is evaluated for the Great Lakes region. Three 10-year (1990-1999) current-climate simulations are performed using the MM5 regional climate model (RCM) with 36-km horizontal resolution. The simulations employed identical configuration and physical parameterizations, but different lateral boundary conditions and sea-...

  2. Global surface temperature/heat transfer measurements using infrared imaging

    NASA Technical Reports Server (NTRS)

    Daryabeigi, Kamran

    1992-01-01

    A series of studies were conducted to evaluate the use of scanning radiometric infrared imaging systems for providing global surface temperature/heat transfer measurements in support of hypersonic wind tunnel testing. The in situ precision of the technique with narrow temperature span setting over the temperature range of 20 to 200 C was investigated. The precision of the technique over wider temperature span settings was also determined. The accuracy of technique for providing aerodynamic heating rates was investigated by performing measurements on a 10.2-centimeter hemisphere model in the Langley 31-inch Mach 10 tunnel, and comparing the results with theoretical predictions. Data from tests conducted on a generic orbiter model in this tunnel are also presented.

  3. Evaluating Gene Set Enrichment Analysis Via a Hybrid Data Model

    PubMed Central

    Hua, Jianping; Bittner, Michael L.; Dougherty, Edward R.

    2014-01-01

    Gene set enrichment analysis (GSA) methods have been widely adopted by biological labs to analyze data and generate hypotheses for validation. Most of the existing comparison studies focus on whether the existing GSA methods can produce accurate P-values; however, practitioners are often more concerned with the correct gene-set ranking generated by the methods. The ranking performance is closely related to two critical goals associated with GSA methods: the ability to reveal biological themes and ensuring reproducibility, especially for small-sample studies. We have conducted a comprehensive simulation study focusing on the ranking performance of seven representative GSA methods. We overcome the limitation on the availability of real data sets by creating hybrid data models from existing large data sets. To build the data model, we pick a master gene from the data set to form the ground truth and artificially generate the phenotype labels. Multiple hybrid data models can be constructed from one data set and multiple data sets of smaller sizes can be generated by resampling the original data set. This approach enables us to generate a large batch of data sets to check the ranking performance of GSA methods. Our simulation study reveals that for the proposed data model, the Q2 type GSA methods have in general better performance than other GSA methods and the global test has the most robust results. The properties of a data set play a critical role in the performance. For the data sets with highly connected genes, all GSA methods suffer significantly in performance. PMID:24558298
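The data-model construction described above (pick a master gene, derive phenotype labels from it, and resample columns to simulate smaller studies) can be sketched as follows; the median threshold and function names are assumptions of this sketch, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def hybrid_data_model(expr, master_idx, n_samples):
    """Build one hybrid data set from a large expression matrix
    (genes x samples): resample columns with replacement to simulate a
    smaller study, and derive phenotype labels by thresholding the chosen
    'master gene' at its median over the original data."""
    cols = rng.choice(expr.shape[1], size=n_samples, replace=True)
    sub = expr[:, cols]
    labels = (sub[master_idx] > np.median(expr[master_idx])).astype(int)
    return sub, labels
```

Repeating this with different master genes and resample sizes yields the large batch of benchmark data sets used to rank the GSA methods.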

  4. Cognitive deficits as an endophenotype for anorexia nervosa: an accepted fact or a need for re-examination?

    PubMed

    Talbot, Amy; Hay, Phillipa; Buckett, Geoffrey; Touyz, Stephen

    2015-01-01

    To investigate whether impaired set shifting and weak central coherence represent state or trait characteristics and, therefore, candidate endophenotypes of anorexia nervosa (AN). Forty-nine individuals with lifetime AN (24 acutely unwell, 10 weight recovered, and 15 fully recovered) and 43 healthy controls completed the Wisconsin Card Sorting Test (WCST), the Matching Familiar Figures Test, and the Rey Complex Figure Task measuring cognitive flexibility, local processing, and global processing, respectively. Participants also completed questionnaires assessing eating disorder, anxiety and depressive symptoms, obsessional traits, interpersonal functioning, and quality of life. Body mass index was calculated from height and weight measurements. Participants with lifetime AN demonstrated poorer set shifting ability than healthy controls as evidenced by a greater number of perseverative errors on the WCST. When participants were grouped according to illness status, only those in the two recovered groups demonstrated poorer set shifting ability than healthy controls while patients with acute AN performed comparably to all other groups. There were no significant differences between groups on measures of local and global processing. No relationship was found between specific clinical features of AN and cognitive performance. The results of this study are consistent with a global trend toward set shifting difficulties in patients with AN but do not support weak central coherence as a candidate endophenotype for AN. These findings have clinical implications in terms of treatment selection and planning, particularly in relation to the use of cognitive remediation therapy with patients with AN. © 2014 Wiley Periodicals, Inc.

  5. Global parameter estimation for thermodynamic models of transcriptional regulation.

    PubMed

    Suleimenov, Yerzhan; Ay, Ahmet; Samee, Md Abul Hassan; Dresch, Jacqueline M; Sinha, Saurabh; Arnosti, David N

    2013-07-15

    Deciphering the mechanisms involved in gene regulation holds the key to understanding the control of central biological processes, including human disease, population variation, and the evolution of morphological innovations. New experimental techniques including whole genome sequencing and transcriptome analysis have enabled comprehensive modeling approaches to study gene regulation. In many cases, it is useful to be able to assign biological significance to the inferred model parameters, but such interpretation should take into account features that affect these parameters, including model construction and sensitivity, the type of fitness calculation, and the effectiveness of parameter estimation. This last point is often neglected, as estimation methods are often selected for historical reasons or for computational ease. Here, we compare the performance of two parameter estimation techniques broadly representative of local and global approaches, namely, a quasi-Newton/Nelder-Mead simplex (QN/NMS) method and a covariance matrix adaptation-evolutionary strategy (CMA-ES) method. The estimation methods were applied to a set of thermodynamic models of gene transcription for regulatory elements active in the Drosophila embryo. Measuring overall fit, the global CMA-ES method performed significantly better than the local QN/NMS method on high quality data sets, but this difference was negligible on lower quality data sets with increased noise or on data sets simplified by stringent thresholding. Our results suggest that the choice of parameter estimation technique for evaluating gene expression models depends on the quality of the data, the nature of the models, and the aims of the modeling effort. Copyright © 2013 Elsevier Inc. All rights reserved.
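A toy, deterministic illustration of why a global strategy can outperform a local one on a multimodal objective (QN/NMS and CMA-ES are far more sophisticated than this; the double-well function is invented for the example):

```python
def f(x):
    """Asymmetric double well: local minimum near x = 1, global near x = -1."""
    return (x * x - 1.0) ** 2 + 0.1 * x

def grad(x):
    return 4.0 * x * (x * x - 1.0) + 0.1

def gradient_descent(x, lr=0.01, iters=2000):
    """Purely local search: slides into whichever basin contains the start."""
    for _ in range(iters):
        x -= lr * grad(x)
    return x

def multistart(starts):
    """Crude 'global' strategy: local descent from several starts, keep the best."""
    return min((gradient_descent(s) for s in starts), key=f)
```

Starting at x = 2, the local search settles in the shallower right-hand well, while the multistart strategy recovers the global minimum near x ≈ -1.01, mirroring the local-versus-global comparison in the abstract.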

  6. Spherical Harmonics Analysis of the ECMWF Global Wind Fields at the 10-Meter Height Level During 1985: A Collection of Figures Illustrating Results

    NASA Technical Reports Server (NTRS)

    Sanchez, Braulio V.; Nishihama, Masahiro

    1997-01-01

    Half-daily global wind speeds in the east-west (u) and north-south (v) directions at the 10-meter height level were obtained from the European Centre for Medium Range Weather Forecasts (ECMWF) data set of global analyses. The data set covered the period 1985 January to 1995 January. A spherical harmonic expansion to degree and order 50 was used to perform harmonic analysis of the east-west (u) and north-south (v) velocity field components. The resulting wind field is displayed, as well as the residual of the fit, at a particular time. The contribution of particular coefficients is shown. The time variability of the coefficients up to degree and order 3 is presented. Corresponding power spectrum plots are given. Time series analyses were applied also to the power associated with degrees 0-10; the results are included.
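The expansion step amounts to a least-squares fit of the gridded field against a spherical harmonic design matrix. A degree-1 sketch with unnormalized real harmonics is shown below (the study used a full expansion to degree and order 50, and proper area weighting of the grid is omitted here).

```python
import numpy as np

def fit_low_degree(field, lat, lon):
    """Least-squares fit of a gridded field (flattened) with real spherical
    harmonics up to degree 1: constant, cos(colat), sin(colat)cos(lon),
    sin(colat)sin(lon).  Returns the coefficients and the fitted field."""
    colat = np.radians(90.0 - lat)
    lam = np.radians(lon)
    ct, st = np.cos(colat), np.sin(colat)
    A = np.column_stack([np.ones_like(ct), ct,
                         st * np.cos(lam), st * np.sin(lam)])
    coef, *_ = np.linalg.lstsq(A, field, rcond=None)
    return coef, A @ coef
```

Subtracting the fitted field from the data gives the residual of the fit, and repeating the fit at each epoch yields the coefficient time series analyzed in the report.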

  7. Global stability and tumor clearance conditions for a cancer chemotherapy system

    NASA Astrophysics Data System (ADS)

    Valle, Paul A.; Starkov, Konstantin E.; Coria, Luis N.

    2016-11-01

    In this paper we study the global dynamics of a cancer chemotherapy system presented by de Pillis et al. (2007). This mathematical model describes the interaction between tumor cells, effector-immune cells, circulating lymphocytes and chemotherapy treatment. By applying the localization method of compact invariant sets, we find lower and upper bounds for these three cell populations. Further, we define a bounded domain in R^4_{+,0} where all compact invariant sets of the system are located and provide conditions under which this domain is positively invariant. We apply LaSalle's invariance principle and one result concerning two-dimensional competitive systems in order to derive sufficient conditions for tumor clearance and global asymptotic stability of the tumor-free equilibrium point. These conditions are computed by using bounds of the localization domain and they are given in terms of the chemotherapy treatment. Finally, we perform numerical simulations in order to illustrate our results.
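    The flavor of such a clearance condition can be seen in a one-equation caricature: logistic tumor growth with a constant chemotherapy-induced kill rate c, dT/dt = r*T*(1 - T/K) - c*T, which clears the tumor exactly when c exceeds the growth rate r. This is a hypothetical illustration with made-up parameters, not the four-dimensional de Pillis et al. model analyzed in the paper.

```python
def simulate(r=0.5, K=1e9, c=0.2, T0=1e7, dt=0.01, t_end=200.0):
    # Forward-Euler integration of dT/dt = r*T*(1 - T/K) - c*T.
    T = T0
    for _ in range(int(t_end / dt)):
        T += dt * (r * T * (1 - T / K) - c * T)
    return T

persistent = simulate(c=0.2)   # c < r: tumor settles near K*(1 - c/r)
cleared = simulate(c=0.8)      # c > r: tumor is driven toward zero
print(f"c=0.2: final tumor burden = {persistent:.3e}")
print(f"c=0.8: final tumor burden = {cleared:.3e}")
```

    The threshold c > r plays the role of the treatment-dependent sufficient conditions derived in the paper: below it the tumor persists at a positive equilibrium, above it the tumor-free state attracts the trajectory.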

  8. Pseudochaotic dynamics near global periodicity

    NASA Astrophysics Data System (ADS)

    Fan, Rong; Zaslavsky, George M.

    2007-09-01

    In this paper, we study a piecewise linear version of the kicked oscillator model: the saw-tooth map. A special case of global periodicity, in which every phase point belongs to a periodic orbit, is presented. With few analytic results known for the corresponding map on the torus, we numerically investigate transport properties and the statistical behavior of Poincaré recurrence times in two cases of deviation from global periodicity. Non-KAM behavior of the system, as well as subdiffusion and superdiffusion, is observed through numerical simulations. The statistics of Poincaré recurrences show that the Kac lemma holds for the system and that there is a relation between the transport exponent and the Poincaré recurrence exponent. We also perform careful numerical computation of the capacity, information and correlation dimensions of the so-called exceptional set in both cases. Our results show that the fractal dimension of the exceptional set is strictly less than 2 and that the fractal structures are unifractal rather than multifractal.
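    The Kac lemma (for an ergodic area-preserving map, the mean Poincaré recurrence time to a set A equals 1/mu(A)) is easy to check numerically. The sketch below uses the Arnold cat map on the torus as a convenient ergodic stand-in, not the saw-tooth map studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def cat_map(x, y):
    # Arnold cat map: hyperbolic, area-preserving, ergodic on the torus.
    return (x + y) % 1.0, (x + 2.0 * y) % 1.0

side = 0.2                       # A = [0, side) x [0, side), mu(A) = side**2
times = []
for _ in range(2000):
    x, y = rng.random(2) * side  # start uniformly inside A
    for t in range(1, 20001):
        x, y = cat_map(x, y)
        if x < side and y < side:   # record the first return to A
            times.append(t)
            break

mean_return = float(np.mean(times))
print(f"Kac prediction 1/mu(A): {1.0 / side**2:.1f}")
print(f"measured mean return:   {mean_return:.1f}")
```

    The measured mean return time lands close to the Kac prediction of 25; the interesting physics in the paper is in the tail of this distribution, whose power-law exponent relates to the transport exponent.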

  9. Performance of the Falling Snow Retrieval Algorithms for the Global Precipitation Measurement (GPM) Mission

    NASA Technical Reports Server (NTRS)

    Skofronick-Jackson, Gail; Munchak, Stephen J.; Ringerud, Sarah

    2016-01-01

    Retrievals of falling snow from space represent an important data set for understanding the Earth's atmospheric, hydrological, and energy cycles, especially during climate change. Estimates of falling snow must be captured to obtain the true global precipitation water cycle, snowfall accumulations are required for hydrological studies, and without knowledge of the frozen particles in clouds one cannot adequately understand the energy and radiation budgets. While satellite-based remote sensing provides global coverage of falling snow events, the science is relatively new and retrievals are still undergoing development, with challenges remaining. This work reports on the development and testing of retrieval algorithms for the Global Precipitation Measurement (GPM) mission Core Satellite, launched February 2014.

  10. An Analysis of San Diego's Housing Market Using a Geographically Weighted Regression Approach

    NASA Astrophysics Data System (ADS)

    Grant, Christina P.

    San Diego County real estate transaction data were evaluated with a set of linear models calibrated by ordinary least squares and geographically weighted regression (GWR). The goal of the analysis was to determine whether the spatial effects assumed to be in the data are best studied globally with no spatial terms, globally with a fixed-effects submarket variable, or locally with GWR. A total of 18,050 single-family residential sales that closed in the six months between April 2014 and September 2014 were used in the analysis. Diagnostic statistics including AICc, R2, Global Moran's I, and visual inspection of diagnostic plots and maps indicate superior model performance by GWR as compared to both global regressions.
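    The mechanics of GWR versus global OLS can be sketched in a few lines: at each regression point, observations are weighted by a distance-decay kernel before solving weighted least squares, so the coefficients become local. The data below are synthetic, with a hypothetical price coefficient that drifts across the study area; they are not the San Diego transactions.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
coords = rng.uniform(0, 10, size=(n, 2))      # hypothetical sale locations
sqft = rng.uniform(1.0, 3.5, size=n)          # living area, 1000s of sqft
beta = 100 + 20 * coords[:, 0]                # price/sqft drifts eastward
price = 50 + beta * sqft + rng.normal(0, 5, size=n)

X = np.column_stack([np.ones(n), sqft])

# Global OLS: a single slope for the whole region.
b_ols = np.linalg.lstsq(X, price, rcond=None)[0]

def gwr_coef(site, bandwidth=1.5):
    # Weighted least squares with a Gaussian distance-decay kernel
    # centered on the regression point.
    d2 = np.sum((coords - site) ** 2, axis=1)
    w = np.sqrt(np.exp(-d2 / (2 * bandwidth**2)))
    return np.linalg.lstsq(w[:, None] * X, w * price, rcond=None)[0]

west = gwr_coef(np.array([1.0, 5.0]))
east = gwr_coef(np.array([9.0, 5.0]))
print(f"OLS slope (global):   {b_ols[1]:.0f}")
print(f"GWR slope, west side: {west[1]:.0f}")
print(f"GWR slope, east side: {east[1]:.0f}")
```

    The global fit averages the drifting coefficient away, while the two local fits recover distinctly different slopes; that recovered spatial variation is what the diagnostics above (AICc, residual Moran's I) reward when GWR outperforms the global regressions.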

  11. Exploration of GPS to enhance the safe transport of hazardous materials

    DOT National Transportation Integrated Search

    1997-12-01

    The report (1) documents a set of requirements for the performance of location systems that utilize the Global Positioning System (GPS), (2) identifies potential uses of GPS in hazardous materials transport, (3) develops service descriptions for the ...

  12. Antimicrobial resistance monitoring in Neisseria gonorrhoeae and strategic use of funds from the Global Fund to set up a systematic Moroccan gonococcal antimicrobial surveillance programme.

    PubMed

    Hançali, Amina; Ndowa, Francis; Bellaji, Bahija; Bennani, Aziza; Kettani, Amina; Charof, Reda; El Aouad, Rajae

    2013-12-01

    The aims of this study were to assess antimicrobial resistance in Neisseria gonorrhoeae infections and update the treatment in the national guidelines for the syndromic management of sexually transmitted infections in Morocco. 171 men complaining of urethral discharge were recruited from basic health services during 2009. Urethral swab samples were collected and N gonorrhoeae identification was performed by culture. Antimicrobial susceptibility testing was performed using the Etest method and the antimicrobial agents tested were ciprofloxacin, penicillin, spectinomycin, tetracycline, ceftriaxone and cefixime. A total of 72 isolates were examined. Significant resistance to tetracycline (92.8%) and ciprofloxacin (86.8%), which was used as first-line treatment in gonococcal infections, was noted. No resistance to spectinomycin, ceftriaxone or cefixime was detected in all the isolates. Following these results the Ministry of Health of Morocco replaced ciprofloxacin and introduced ceftriaxone 250 mg as a single dose in the treatment of gonococcal infections. Using funds from the Global Fund to Fight AIDS, Tuberculosis and Malaria (the Global Fund), a surveillance programme was set up for antimicrobial resistance testing in N gonorrhoeae.

  13. Remote Sensing Information Science Research

    NASA Technical Reports Server (NTRS)

    Clarke, Keith C.; Scepan, Joseph; Hemphill, Jeffrey; Herold, Martin; Husak, Gregory; Kline, Karen; Knight, Kevin

    2002-01-01

    This document is the final report summarizing research conducted by the Remote Sensing Research Unit, Department of Geography, University of California, Santa Barbara under National Aeronautics and Space Administration Research Grant NAG5-10457. This document describes work performed during the period of 1 March 2001 through 30 September 2002. This report includes a survey of research proposed and performed within RSRU and the UCSB Geography Department during the past 25 years. A broad suite of RSRU research conducted under NAG5-10457 is also described under the themes of Applied Research Activities and Information Science Research. This research includes: 1. NASA ESA Research Grant Performance Metrics Reporting. 2. Global Data Set Thematic Accuracy Analysis. 3. ISCGM/Global Map Project Support. 4. Cooperative International Activities. 5. User Model Study of Global Environmental Data Sets. 6. Global Spatial Data Infrastructure. 7. CIESIN Collaboration. 8. On the Value of Coordinating Landsat Operations. 10. The California Marine Protected Areas Database: Compilation and Accuracy Issues. 11. Assessing Landslide Hazard Over a 130-Year Period for La Conchita, California. Remote Sensing and Spatial Metrics for Applied Urban Area Analysis, including: (1) IKONOS Data Processing for Urban Analysis. (2) Image Segmentation and Object Oriented Classification. (3) Spectral Properties of Urban Materials. (4) Spatial Scale in Urban Mapping. (5) Variable Scale Spatial and Temporal Urban Growth Signatures. (6) Interpretation and Verification of SLEUTH Modeling Results. (7) Spatial Land Cover Pattern Analysis for Representing Urban Land Use and Socioeconomic Structures. 12. Colorado River Flood Plain Remote Sensing Study Support. 13. African Rainfall Modeling and Assessment. 14. Remote Sensing and GIS Integration.

  14. Constraints on Smoke Injection Height, Source Strength, and Transports from MISR and MODIS

    NASA Technical Reports Server (NTRS)

    Kahn, Ralph A.; Petrenko, Mariya; Val Martin, Maria; Chin, Mian

    2014-01-01

    The AeroCom BB (Biomass Burning) Experiment AOD (Aerosol Optical Depth) motivation: We have a substantial set of satellite wildfire plume AOD snapshots and injection heights to help calibrate model/inventory performance; We are 1) adding more fire source-strength cases, 2) using MISR to improve the AOD constraints, and 3) adding 2008 global injection heights; We selected GFED3-daily due to good overall source-strength performance, but any inventory can be tested; Joint effort to test multiple global models, to draw robust BB injection height and emission strength conclusions. We provide satellite-based injection height and smoke plume AOD climatologies.

  15. Thalamic Amnesia Mimicking Transient Global Amnesia.

    PubMed

    Giannantoni, Nadia M; Lacidogna, Giordano; Broccolini, Aldobrando; Pilato, Fabio; Profice, Paolo; Morosetti, Roberta; Caliandro, Pietro; Gambassi, Giovanni; Della Marca, Giacomo; Frisullo, Giovanni

    2015-06-01

    Transient global amnesia is a benign syndrome and one of the most frequent discharge diagnoses from the emergency department, yet it can hardly be distinguished from mimicking diseases. No consensus on the evaluation of transient global amnesia in the emergency setting has yet been reached. We describe a 69-year-old woman who presented to our emergency department with an abrupt onset of anterograde amnesia, preceded by a similar amnesic episode misinterpreted as transient global amnesia. Neuroradiologic, neuropsychological, and neurophysiological evaluations supported the diagnosis of vascular thalamic amnesia. We report a patient who clinically fulfilled the criteria for transient global amnesia but in whom neuroimaging nevertheless disclosed a thalamic ischemic lesion. This case report highlights the importance of performing neuroradiologic screening in the emergency department even when the clinical history and physical findings are highly suggestive of transient global amnesia.

  16. Improving the Quality of School Facilities through Building Performance Assessment: Educational Reform and School Building Quality in Sao Paulo, Brazil

    ERIC Educational Resources Information Center

    Ornstein, Sheila Walbe; Moreira, Nanci Saraiva; Ono, Rosaria; Limongi Franca, Ana J. G.; Nogueira, Roselene A. M. F.

    2009-01-01

    Purpose: The paper describes the purpose of and strategies for conducting post-occupancy evaluations (POEs) as a method for assessing school building performance. Set within the larger context of global efforts to develop and apply common indicators of school building quality, the authors describe research conducted within the newest generation of…

  17. Setting up a hydrological model based on global data for the Ayeyarwady basin in Myanmar

    NASA Astrophysics Data System (ADS)

    ten Velden, Corine; Sloff, Kees; Nauta, Tjitte

    2017-04-01

    The use of global datasets in local hydrological modelling can be of great value. It opens up the possibility to include data for areas where local data are unavailable or only sparsely available. In hydrological modelling the existence of both static physical data such as elevation and land use, and dynamic meteorological data such as precipitation and temperature, is essential for setting up a hydrological model, but often such data are difficult to obtain at the local level. For the Ayeyarwady catchment in Myanmar a distributed hydrological model (Wflow: https://github.com/openstreams/wflow) was set up with only global datasets, as part of a water resources study. Myanmar is an emerging economy, which has only recently become more receptive to foreign influences. It has a very limited hydrometeorological measurement network, with large spatial and temporal gaps, and data that are of uncertain quality and difficult to obtain. The hydrological model was thus set up based on resampled versions of the SRTM digital elevation model, the GlobCover land cover dataset and the HWSD soil dataset. Three global meteorological datasets were assessed and compared for use in the hydrological model: TRMM, WFDEI and MSWEP. The meteorological datasets were assessed based on their conformity with several precipitation station measurements, and the overall model performance was assessed by calculating the NSE and RVE based on discharge measurements of several gauging stations. The model was run for the period 1979-2012 on a daily time step, and the results show acceptable performance of the global datasets used in the hydrological model. The WFDEI forcing dataset gave the best results, with an NSE of 0.55 at the outlet of the model and an RVE of 8.5%, calculated over the calibration period 2006-2012. As a general trend the modelled discharge at the upstream stations tends to be underestimated, and at the downstream stations slightly overestimated. 
The quality of the discharge measurements that form the basis for the performance calculations is uncertain; data analysis suggests that rating curves are not frequently updated. The modelling results are not perfect and there is ample room for improvement, but they are reasonable given that setting up a hydrological model for this area would not have been possible without global datasets, due to the lack of available local data. The resulting hydrological model then enabled the set-up of the RIBASIM water allocation model for the Ayeyarwady basin in order to assess its water resources. The study discussed here is a first step; ideally it will be followed up by a more thorough calibration and validation with the limited local measurements available, e.g. a precipitation correction based on the available rainfall measurements, to ensure the integration of global and local data.
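    The two skill scores used in this assessment are straightforward to compute. A minimal sketch with made-up discharge numbers rather than the Ayeyarwady gauge data:

```python
import numpy as np

def nse(obs, sim):
    # Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model
    # is no better than predicting the observed mean.
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rve(obs, sim):
    # Relative volume error in percent; negative means the simulation
    # underestimates the total discharge volume.
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * (np.sum(sim) - np.sum(obs)) / np.sum(obs)

obs = np.array([120.0, 150, 300, 800, 1500, 900, 400, 200])   # m3/s
sim = np.array([100.0, 160, 350, 700, 1400, 950, 450, 250])

print(f"NSE = {nse(obs, sim):.2f}")
print(f"RVE = {rve(obs, sim):.1f}%")
```

    NSE is dominated by errors on the high flows (the squared deviations from the mean), while RVE only checks the water balance, which is why the two are reported together.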

  18. Global Data Toolset (GDT)

    USGS Publications Warehouse

    Cress, Jill J.; Riegle, Jodi L.

    2007-01-01

    According to the United Nations Environment Programme World Conservation Monitoring Centre (UNEP-WCMC), approximately 60 percent of the data contained in the World Database on Protected Areas (WDPA) has missing or incomplete boundary information. As a result, global analyses based on the WDPA can be inaccurate, and professionals responsible for natural resource planning and priority setting must rely on incomplete geospatial data sets. To begin to address this problem, the World Data Center for Biodiversity and Ecology, in cooperation with the U.S. Geological Survey (USGS) Rocky Mountain Geographic Science Center (RMGSC), the National Biological Information Infrastructure (NBII), the Global Earth Observation System, and the Inter-American Biodiversity Information Network (IABIN), sponsored a Protected Area (PA) workshop in Asuncion, Paraguay, in November 2007. The primary goal of this workshop was to train representatives from eight South American countries in the use of the Global Data Toolset (GDT) for reviewing and editing PA data. Use of the GDT will allow PA experts to compare their national data to other data sets, including non-governmental organization (NGO) and WCMC data, in order to highlight inaccuracies or gaps in the data, and then to apply any needed edits, especially in the delineation of the PA boundaries. In addition, familiarizing the participants with the web-enabled GDT will allow them to maintain and improve their data after the workshop. Once data edits have been completed, the GDT will also allow the country authorities to perform any required review and validation processing. Once validated, the data can be used to update the global WDPA and IABIN databases, which will enhance analysis on global and regional levels.

  19. GEO Supersites Data Exploitation Platform

    NASA Astrophysics Data System (ADS)

    Lengert, W.; Popp, H.-J.; Gleyzes, J.-P.

    2012-04-01

    In the framework of the GEO Geohazard Supersite initiative, an international partnership of organizations and scientists involved in the monitoring and assessment of geohazards has been established. The mission is to advance the scientific understanding of geohazards by improving geohazard monitoring through the combination of in-situ and space-based data, and by facilitating the access to data relevant for geohazard research. The stakeholders are: (1) governmental organizations or research institutions responsible for the ground-based monitoring of earthquake and volcanic areas, (2) space agencies and satellite operators providing satellite data, (3) the global geohazard scientific community. Tens of thousands of ESA's SAR products have been accessible since the beginning of 2008 through ESA's "Virtual Archive", a cloud-computing asset that gives the global community very high download performance for these high-volume data sets at mass-market cost. In the GEO collaborative context, the management of ESA's "Virtual Archive" and the ordering of these large data sets are performed by UNAVCO, which also coordinates the data demand for the several hundred co-PIs. ESA envisages providing scientists and developers with access to a highly elastic operational e-infrastructure, offering interdisciplinary data on a large scale as well as tools ensuring innovation and a permanent evolution of the products. Consequently, this science environment will help in defining and testing new applications and technologies, fostering innovation and new science findings. In Europe, EPOS, the "European Plate Observatory System" led by INGV, and ESA, with support from DLR, ASI, and CNES, are the main institutional stakeholders for the GEO Supersites, contributing also to a unifying e-infrastructure. 
The overarching objective of the Geohazard Supersites is: "To implement a sustainable Global Earthquake Observation System and a Global Volcano Observation System as part of the Global Earth Observation System of Systems (GEOSS)." This presentation will outline the overall concept, objectives, and examples of the e-infrastructure, which is currently being set up for the GEO Supersite initiative helping to advance science.

  20. Computer simulations of space-borne meteorological systems on the CYBER 205

    NASA Technical Reports Server (NTRS)

    Halem, M.

    1984-01-01

    Because of the extreme expense involved in developing and flight testing meteorological instruments, an extensive series of numerical modeling experiments to simulate the performance of meteorological observing systems was performed on the CYBER 205. The studies compare the relative importance of different global measurements of individual and composite systems of the meteorological variables needed to determine the state of the atmosphere. The assessments are made in terms of each system's ability to improve 12-hour global forecasts. Each experiment involves the daily assimilation of simulated data obtained from a reference data set called "nature". These data come from two sources: first, a long two-month general circulation integration with the GLAS 4th Order Forecast Model, and second, global analyses prepared twice daily by the National Meteorological Center, NOAA, from the current observing systems.

  1. Global modeling of land water and energy balances. Part II: Land-characteristic contributions to spatial variability

    USGS Publications Warehouse

    Milly, P.C.D.; Shmakin, A.B.

    2002-01-01

    Land water and energy balances vary around the globe because of variations in amount and temporal distribution of water and energy supplies and because of variations in land characteristics. The former (water and energy supplies) explain much more of the variance in water and energy balances than the latter (land characteristics). A largely untested hypothesis underlying most global models of land water and energy balance is the assumption that parameter values based on estimated geographic distributions of soil and vegetation characteristics improve the performance of the models relative to the use of globally constant land parameters. This hypothesis is tested here through an evaluation of the improvement in performance of one land model associated with the introduction of geographic information on land characteristics. The capability of the model to reproduce annual runoff ratios of large river basins, with and without information on the global distribution of albedo, rooting depth, and stomatal resistance, is assessed. To allow a fair comparison, the model is calibrated in both cases by adjusting globally constant scale factors for snow-free albedo, non-water-stressed bulk stomatal resistance, and critical root density (which is used to determine effective root-zone depth). The test is made in stand-alone mode, that is, using prescribed radiative and atmospheric forcing. Model performance is evaluated by comparing modeled runoff ratios with observed runoff ratios for a set of basins where precipitation biases have been shown to be minimal. The withholding of information on global variations in these parameters leads to a significant degradation of the capability of the model to simulate the annual runoff ratio. 
An additional set of optimization experiments, in which the parameters are examined individually, reveals that the stomatal resistance is, by far, the parameter among these three whose spatial variations add the most predictive power to the model in stand-alone mode. Further single-parameter experiments with surface roughness length, available water capacity, thermal conductivity, and thermal diffusivity show very little sensitivity to estimated global variations in these parameters. Finally, it is found that even the constant-parameter model performance exceeds that of the Budyko and generalized Turc-Pike water-balance equations, suggesting that the model benefits also from information on the geographic variability of the temporal structure of forcing.
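    For reference, the two benchmark water-balance relations named above are commonly written in terms of the aridity index phi = PET/P, with runoff ratio = 1 - E/P. The forms below are as usually quoted in the literature (Budyko 1974; Turc-Pike with exponent nu, where nu = 2 gives the classical curve); the paper's exact definitions may differ.

```python
import math

def budyko_evap_ratio(phi):
    # Budyko curve: E/P = sqrt(phi * tanh(1/phi) * (1 - exp(-phi))).
    return math.sqrt(phi * math.tanh(1.0 / phi) * (1.0 - math.exp(-phi)))

def turc_pike_evap_ratio(phi, nu=2.0):
    # Generalized Turc-Pike: E/P = (1 + phi**-nu)**(-1/nu).
    return (1.0 + phi ** (-nu)) ** (-1.0 / nu)

for phi in (0.5, 1.0, 2.0, 4.0):
    print(f"phi={phi:3.1f}  runoff ratio: "
          f"Budyko {1 - budyko_evap_ratio(phi):.3f}, "
          f"Turc-Pike {1 - turc_pike_evap_ratio(phi):.3f}")
```

    Both curves respect the physical limits (E/P approaches phi in wet, energy-limited basins and 1 in arid, water-limited ones); the paper's point is that even a globally calibrated process model beats these zero-parameter-field benchmarks.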

  2. Spectral characteristics of mid-latitude continental convection from a global variable-resolution Voronoi-mesh atmospheric model

    NASA Astrophysics Data System (ADS)

    Wong, M.; Skamarock, W. C.

    2015-12-01

    Global numerical weather forecast tests were performed using the global nonhydrostatic atmospheric model, Model for Prediction Across Scales (MPAS), for the NOAA Storm Prediction Center 2015 Spring Forecast Experiment (May 2015) and the Plains Elevated Convection at Night (PECAN) field campaign (June to mid-July 2015). These two sets of forecasts were performed on 50-to-3 km and 15-to-3 km smoothly-varying horizontal meshes, respectively. Both variable-resolution meshes have nominal convection-permitting 3-km grid spacing over the entire continental US. Here we evaluate the limited-area (vs. global) spectra from these NWP simulations. We will show the simulated spectral characteristics of total kinetic energy, vertical velocity variance, and precipitation during these spring and summer periods when diurnal continental convection is most active over the central US. Spectral characteristics of a high-resolution global 3-km simulation (essentially no nesting) from the 20 May 2013 Moore, OK tornado case are also shown. These characteristics include spectral scaling, shape, and anisotropy, as well as the effective resolution of continental convection representation in MPAS.
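    The basic diagnostic behind such spectral characteristics is a discrete Fourier transform of a velocity field followed by a power-law fit of the spectrum. A minimal one-dimensional sketch, using a synthetic transect built to have a k^(-5/3) inertial range rather than MPAS output:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1024
k = np.arange(1, n // 2)                  # nonzero wavenumbers
amp = k ** (-5.0 / 6.0)                   # amplitude so that E(k) ~ k^(-5/3)
phase = rng.uniform(0, 2 * np.pi, size=k.size)

spec = np.zeros(n // 2 + 1, dtype=complex)
spec[1:n // 2] = amp * np.exp(1j * phase)
u = np.fft.irfft(spec, n)                 # synthetic velocity transect

# Kinetic-energy spectrum and its log-log slope.
E = np.abs(np.fft.rfft(u)[1:n // 2]) ** 2
slope = np.polyfit(np.log(k), np.log(E), 1)[0]
print(f"fitted spectral slope: {slope:.3f}")
```

    The fit recovers the imposed -5/3 slope; in model output, the wavenumber where the computed spectrum falls away from the expected scaling marks the effective resolution discussed above.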

  3. Strengthening and sustainability of national immunization technical advisory groups (NITAGs) globally: Lessons and recommendations from the founding meeting of the global NITAG network.

    PubMed

    Adjagba, Alex; MacDonald, Noni E; Ortega-Pérez, Inmaculada; Duclos, Philippe

    2017-05-25

    National Immunization Technical Advisory Groups (NITAGs) provide independent, evidence-informed advice to assist their governments in immunization policy formation. However, many NITAGs face challenges in fulfilling their roles, hence the many requests for the formation of a network linking NITAGs together so they can learn from each other. To address this request, the Health Policy and Institutional Development (HPID) Center (a WHO Collaborating Center at the Agence de Médecine Préventive - AMP), in collaboration with WHO, organized a meeting in Veyrier-du-Lac, France, on 11 and 12 May 2016, to establish a Global NITAG Network (GNN). The meeting focused on two areas: the requirements for (a) the establishment of a global NITAG collaborative network; and (b) the global assessment/evaluation of the performance of NITAGs. Thirty-five participants from 26 countries reviewed the proposed GNN framework documents and NITAG performance evaluation. Participants recommended that a GNN should be established; agreed on its governance, function, scope and a proposed work plan; and set a framework for NITAG evaluation. Copyright © 2017.

  4. Position space analysis of the AdS (in)stability problem

    NASA Astrophysics Data System (ADS)

    Dimitrakopoulos, Fotios V.; Freivogel, Ben; Lippert, Matthew; Yang, I.-Sheng

    2015-08-01

    We investigate whether arbitrarily small perturbations in global AdS space are generically unstable and collapse into black holes on the time scale set by gravitational interactions. We argue that current evidence, combined with our analysis, strongly suggests that a set of nonzero measure in the space of initial conditions does not collapse on this time scale. We perform an analysis in position space to study this puzzle, and our formalism allows us to directly study the vanishing-amplitude limit. We show that gravitational self-interaction leads to tidal deformations which are equally likely to focus or defocus energy, and we sketch the phase diagram accordingly. We also clarify the connection between gravitational evolution in global AdS and holographic thermalization.

  5. Production of a long-term global water vapor and liquid water data set using ultra-fast methods to assimilate multi-satellite and radiosonde observations

    NASA Technical Reports Server (NTRS)

    Vonderhaar, Thomas H.; Randel, David L.; Reinke, Donald L.; Stephens, Graeme L.; Ringerud, Mark A.; Combs, Cynthia L.; Greenwald, Thomas J.; Wittmeyer, Ian L.

    1995-01-01

    There is a well-documented requirement for a comprehensive and accurate global moisture data set to assist many important studies in atmospheric science. Currently, atmospheric water vapor measurements are made from a variety of sources including radiosondes, aircraft and surface observations, and in recent years, by various satellite instruments. Creating a global data set from a single measuring system produces results that are useful and accurate only in specific situations and/or areas. Therefore, an accurate global moisture data set has been derived from a combination of these measurement systems. Under a NASA peer-reviewed contract, STC-METSAT produced two 5-yr (1988-1992) global data sets. One is the total column (integrated) water vapor data set and the other a global layered water vapor data set, using a combination of radiosonde observations, Television and Infrared Observation Satellite (TIROS) Operational Vertical Sounder (TOVS), and Special Sensor Microwave/Imager (SSM/I) data sets. STC-METSAT also produced a companion global integrated liquid water data set. The complete data set (all three products) has been named NVAP, an acronym for the NASA Water Vapor Project. STC-METSAT developed methods to process the data at a daily time scale and 1 x 1 deg spatial resolution.

  6. High-resolution global climate modelling: the UPSCALE project, a large-simulation campaign

    NASA Astrophysics Data System (ADS)

    Mizielinski, M. S.; Roberts, M. J.; Vidale, P. L.; Schiemann, R.; Demory, M.-E.; Strachan, J.; Edwards, T.; Stephens, A.; Lawrence, B. N.; Pritchard, M.; Chiu, P.; Iwi, A.; Churchill, J.; del Cano Novales, C.; Kettleborough, J.; Roseblade, W.; Selwood, P.; Foster, M.; Glover, M.; Malcolm, A.

    2014-08-01

    The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project constructed and ran an ensemble of HadGEM3 (Hadley Centre Global Environment Model 3) atmosphere-only global climate simulations over the period 1985-2011, at resolutions of N512 (25 km), N216 (60 km) and N96 (130 km) as used in current global weather forecasting, seasonal prediction and climate modelling respectively. Alongside these present climate simulations a parallel ensemble looking at extremes of future climate was run, using a time-slice methodology to consider conditions at the end of this century. These simulations were primarily performed using a 144 million core hour, single year grant of computing time from PRACE (the Partnership for Advanced Computing in Europe) in 2012, with additional resources supplied by the Natural Environment Research Council (NERC) and the Met Office. Almost 400 terabytes of simulation data were generated on the HERMIT supercomputer at the High Performance Computing Center Stuttgart (HLRS), and transferred to the JASMIN super-data cluster provided by the Science and Technology Facilities Council Centre for Data Archival (STFC CEDA) for analysis and storage. In this paper we describe the implementation of the project, present the technical challenges in terms of optimisation, data output, transfer and storage that such a project involves and include details of the model configuration and the composition of the UPSCALE data set. This data set is available for scientific analysis to allow assessment of the value of model resolution in both present and potential future climate conditions.

  7. Innovative Techniques for the Production of Low Cost 2D Laser Diode Arrays. Supplies or Services and Prices/Costs

    DTIC Science & Technology

    1991-12-31

    continue on facet coatings, PL correlation to device performance, and CVD diamond. All global issues mentioned in Section 2.0 will be addressed and...The CVD diamond submounts will be hermetically sealed, electrically isolated and liquid cooled. (Deliverables: 5 5-bar arrays.) The following global ... issues not mentioned above will be investigated continuously throughout all four phases of this program: (1) design and development of a mask set to

  8. Global Lidar Measurements of Clouds and Aerosols from Space Using the Geoscience Laser Altimeter System (GLAS)

    NASA Technical Reports Server (NTRS)

    Hlavka, Dennis L.; Palm, S. P.; Welton, E. J.; Hart, W. D.; Spinhirne, J. D.; McGill, M.; Mahesh, A.; Starr, David OC. (Technical Monitor)

    2001-01-01

    The Geoscience Laser Altimeter System (GLAS) is scheduled for launch on the ICESat satellite as part of the NASA EOS mission in 2002. GLAS will be used to perform high resolution surface altimetry and will also provide a continuously operating atmospheric lidar to profile clouds, aerosols, and the planetary boundary layer with horizontal and vertical resolution of 175 and 76.8 m, respectively. GLAS is the first active satellite atmospheric profiler to provide global coverage. Data products include direct measurements of the heights of aerosol and cloud layers, and the optical depth of transmissive layers. In this poster we provide an overview of the GLAS atmospheric data products, present a simulated GLAS data set, and show results from the simulated data set using the GLAS data processing algorithm. Optical results from the ER-2 Cloud Physics Lidar (CPL), which uses many of the same processing algorithms as GLAS, show algorithm performance with real atmospheric conditions during the Southern African Regional Science Initiative (SAFARI 2000).

  9. Global behavior analysis for stochastic system of 1,3-PD continuous fermentation

    NASA Astrophysics Data System (ADS)

    Zhu, Xi; Kliemann, Wolfgang; Li, Chunfa; Feng, Enmin; Xiu, Zhilong

    2017-12-01

Global behavior of a stochastic system of continuous fermentation in glycerol bio-dissimilation to 1,3-propanediol by Klebsiella pneumoniae is analyzed in this paper. This bioprocess cannot avoid stochastic perturbations caused by internal and external disturbances, which are reflected in the growth rate. These negative factors can limit and degrade the achievable performance of controlled systems. Based on multiplicity phenomena, the equilibria and bifurcations of the deterministic system are analyzed. Then, a stochastic model is presented by a bounded Markov diffusion process. In order to analyze the global behavior, we compute the control sets for the associated control system. The probability distributions of the relative supports are also computed. The simulation results indicate how the disturbed biosystem tends toward stationary behavior globally.
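The qualitative behavior described above (a growth rate perturbed by internal and external noise, with trajectories settling toward a stationary regime) can be illustrated with a toy stochastic growth model; the logistic form, parameter values, and Euler-Maruyama step below are illustrative stand-ins, not the paper's fermentation model:

```python
import math
import random

def simulate_growth(mu=0.3, k=1.0, sigma=0.05, x0=0.1,
                    dt=0.01, steps=5000, seed=42):
    """Euler-Maruyama integration of logistic growth whose growth rate
    is perturbed by multiplicative Gaussian noise. All parameters are
    illustrative, not taken from the 1,3-PD fermentation system."""
    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        drift = mu * x * (1.0 - x / k) * dt          # deterministic part
        noise = sigma * x * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        x = max(x + drift + noise, 0.0)              # biomass stays non-negative
    return x

# After long integration the state fluctuates around the deterministic
# equilibrium x = k: the perturbed system tends to stationary behavior.
final_biomass = simulate_growth()
```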

  10. Next generation of global land cover characterization, mapping, and monitoring

    USGS Publications Warehouse

    Giri, Chandra; Pengra, Bruce; Long, J.; Loveland, Thomas R.

    2013-01-01

Land cover change is increasingly affecting the biophysics, biogeochemistry, and biogeography of the Earth's surface and the atmosphere, with far-reaching consequences for human well-being. However, our scientific understanding of the distribution and dynamics of land cover and land cover change (LCLCC) is limited. Previous global land cover assessments performed using coarse spatial resolution (300 m–1 km) satellite data did not provide enough thematic detail or change information for global change studies and for resource management. High-resolution (∼30 m) land cover characterization and monitoring is needed that permits detection of land change at the scale of most human activity and offers the increased flexibility of environmental model parameterization needed for global change studies. However, there are a number of challenges to overcome before producing such data sets, including the unavailability of consistent global satellite coverage, the sheer volume of data, the lack of timely and accurate training and validation data, difficulties in preparing image mosaics, and high-performance computing requirements. Integration of remote sensing and information technology is needed for process automation and high-performance computing. Recent developments in these areas have created an opportunity for operational high-resolution land cover mapping and monitoring of the world. Here, we report and discuss these advancements and opportunities in producing the next generation of global land cover characterization, mapping, and monitoring at 30-m spatial resolution, primarily in the context of the United States and the Group on Earth Observations Global 30 m land cover initiative (UGLC).

  11. EO-Performance relationships in Reverse Internationalization by Chinese Global Startup OEMs: Social Networks and Strategic Flexibility.

    PubMed

    Chin, Tachia; Tsai, Sang-Bing; Fang, Kai; Zhu, Wenzhong; Yang, Dongjin; Liu, Ren-Huai; Tsuei, Richard Ting Chang

    2016-01-01

Due to the context-sensitive nature of entrepreneurial orientation (EO), it is imperative to explore in depth the EO-performance mechanism in China at its critical, specific stage of economic reform. In the context of "reverse internationalization" by Chinese global startup original equipment manufacturers (OEMs), this paper aims to clarify the unique links and complicated interrelationships between the individual EO dimensions and firm performance. Using structural equation modeling, we found that during reverse internationalization, proactiveness is positively related to performance; risk taking is not statistically associated with performance; and innovativeness is negatively related to performance. The proactiveness-performance relationship is mediated by strategic flexibility and moderated by social networking relationships. The dynamic and complex institutional setting, coupled with the issues of overcapacity and rising labor costs in China, may explain why our distinctive results occur. This research advances the understanding of how contingent factors (social network relationships and strategic flexibility) help entrepreneurial firms break down institutional barriers and reap the most from EO. It brings new insights into how Chinese global startup OEMs draw on EO to undertake reverse internationalization, responding to calls for unraveling the heterogeneous characteristics of EO sub-dimensions and for more contextually embedded treatment of EO-performance associations.

  12. EO-Performance relationships in Reverse Internationalization by Chinese Global Startup OEMs: Social Networks and Strategic Flexibility

    PubMed Central

    Chin, Tachia; Tsai, Sang-Bing; Fang, Kai; Zhu, Wenzhong; Yang, Dongjin; Liu, Ren-huai; Tsuei, Richard Ting Chang

    2016-01-01

Due to the context-sensitive nature of entrepreneurial orientation (EO), it is imperative to explore in depth the EO-performance mechanism in China at its critical, specific stage of economic reform. In the context of “reverse internationalization” by Chinese global startup original equipment manufacturers (OEMs), this paper aims to clarify the unique links and complicated interrelationships between the individual EO dimensions and firm performance. Using structural equation modeling, we found that during reverse internationalization, proactiveness is positively related to performance; risk taking is not statistically associated with performance; and innovativeness is negatively related to performance. The proactiveness-performance relationship is mediated by strategic flexibility and moderated by social networking relationships. The dynamic and complex institutional setting, coupled with the issues of overcapacity and rising labor costs in China, may explain why our distinctive results occur. This research advances the understanding of how contingent factors (social network relationships and strategic flexibility) help entrepreneurial firms break down institutional barriers and reap the most from EO. It brings new insights into how Chinese global startup OEMs draw on EO to undertake reverse internationalization, responding to calls for unraveling the heterogeneous characteristics of EO sub-dimensions and for more contextually embedded treatment of EO-performance associations. PMID:27631368

  13. Defining the minimum temporal and spatial scales available from a new 72-month Nimbus-7 Earth Radiation Budget climate data set

    NASA Technical Reports Server (NTRS)

    Randel, D. L.; Campbell, G. G.; Vonder Haar, T. H.; Smith, L.

    1986-01-01

Scale factors and assumptions applied in calculations of global radiation budget parameters based on ERB data are discussed. The study was performed to examine the relationship between the composite global ERB map that can be generated every six days using all available data and the actual average global ERB. The wide-field-of-view ERB instrument functioned for the first 19 months of Nimbus-7's lifetime and furnished sufficient data for calculating actual ERB averages. The composite was most accurate in regions with the least variation in radiation budget.

  14. On the use of tower-flux measurements to assess the performance of global ecosystem models

    NASA Astrophysics Data System (ADS)

    El Maayar, M.; Kucharik, C.

    2003-04-01

Global ecosystem models are important tools for the study of biospheric processes and their responses to environmental changes. Such models typically translate knowledge gained from local observations into estimates of regional or even global outcomes of ecosystem processes. A typical test of ecosystem models consists of comparing their output against tower-flux measurements of land surface-atmosphere exchange of heat and mass. To perform such tests, models are typically run using detailed information on soil properties (texture, carbon content, ...) and vegetation structure observed at the experimental site (e.g., vegetation height, vegetation phenology, leaf photosynthetic characteristics, ...). In global simulations, however, Earth's vegetation is typically represented by a limited number of plant functional types (PFTs; groups of plant species that have similar physiological and ecological characteristics). Each PFT (e.g., temperate broadleaf trees, boreal conifer evergreen trees, ...), which can cover a very large area, is assigned a set of typical physiological and physical parameters. Thus, a legitimate question arises: How does the performance of a global ecosystem model run using detailed site-specific parameters compare with the performance of a less detailed global version where generic parameters are attributed to a group of vegetation species forming a PFT? To answer this question, we used a multiyear dataset, measured at two forest sites with contrasting environments, to compare seasonal and interannual variability of surface-atmosphere exchange of water and carbon predicted by the Integrated BIosphere Simulator-Dynamic Global Vegetation Model. Two types of simulations were thus performed: a) Detailed runs, in which observed vegetation characteristics (leaf area index, vegetation height, ...) and soil carbon content, in addition to climate and soil type, are specified for the model run; and b) Generic runs, in which only the observed climates and soil types at the measurement sites are used to run the model. The generic runs were performed for a number of years equal to the current age of the forests, initialized with no vegetation and a soil carbon density equal to zero.

  15. Objective structured assessment of nontechnical skills: Reliability of a global rating scale for the in-training assessment in the operating room.

    PubMed

    Dedy, Nicolas J; Szasz, Peter; Louridas, Marisa; Bonrath, Esther M; Husslein, Heinrich; Grantcharov, Teodor P

    2015-06-01

    Nontechnical skills are critical for patient safety in the operating room (OR). As a result, regulatory bodies for accreditation and certification have mandated the integration of these competencies into postgraduate education. A generally accepted approach to the in-training assessment of nontechnical skills, however, is lacking. The goal of the present study was to develop an evidence-based and reliable tool for the in-training assessment of residents' nontechnical performance in the OR. The Objective Structured Assessment of Nontechnical Skills tool was designed as a 5-point global rating scale with descriptive anchors for each item, based on existing evidence-based frameworks of nontechnical skills, as well as resident training requirements. The tool was piloted on scripted videos and refined in an iterative process. The final version was used to rate residents' performance in recorded OR crisis simulations and during live observations in the OR. A total of 37 simulations and 10 live procedures were rated. Interrater agreement was good for total mean scores, both in simulation and in the real OR, with intraclass correlation coefficients >0.90 in all settings for average and single measures. Internal consistency of the scale was high (Cronbach's alpha = 0.80). The Objective Structured Assessment of Nontechnical Skills global rating scale was developed as an evidence-based tool for the in-training assessment of residents' nontechnical performance in the OR. Unique descriptive anchors allow for a criterion-referenced assessment of performance. Good reliability was demonstrated in different settings, supporting applications in research and education. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. Using fluorescent dyes as proxies to study herbicide removal by sorption in buffer zones.

    PubMed

    Dollinger, Jeanne; Dagès, Cécile; Voltz, Marc

    2017-04-01

    The performance of buffer zones for removing pesticides from runoff water varies greatly according to landscape settings, hydraulic regime, and system design. Evaluating the performance of buffers for a range of pesticides and environmental conditions can be very expensive. Recent studies suggested that the fluorescent dyes uranine and sulforhodamine B could be used as cost-effective surrogates of herbicides to evaluate buffer performance. However, while transformation mechanisms in buffers have been extensively documented, sorption processes of both dyes have rarely been investigated. In this study, we measured the adsorption, desorption, and kinetic sorption coefficients of uranine and sulforhodamine B for a diverse range of buffer zone materials (soils, litters, plants) and compared the adsorption coefficients (Kd) to those of selected herbicides. We also compared the global sorption capacity of 6 ditches, characterized by varying proportions of the aforementioned materials, between both dyes and a set of four herbicides using the sorption-induced pesticide retention indicator (SPRI). We found that both the individual Kd of uranine for the diverse buffer materials and the global sorption capacity of the ditches are equivalent to those of the herbicides diuron, isoproturon, and metolachlor. The Kd of sulforhodamine B on plants and soils are equivalent to those of glyphosate, and the global sorption capacities of the ditches are equivalent for both molecules. Hence, we demonstrate for the first time that uranine can be used as a proxy of moderately hydrophobic herbicides to evaluate the performance of buffer systems, whereas sulforhodamine B can serve as a proxy for more strongly sorbing herbicides.
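The adsorption coefficient Kd compared above is conventionally estimated from batch equilibrium experiments as the ratio of sorbed to aqueous concentration; a minimal sketch, with hypothetical function and parameter names and invented numbers:

```python
def batch_kd(c0_mg_l, cw_mg_l, volume_l, mass_kg):
    """Linear sorption coefficient Kd (L/kg) from a batch equilibrium test.

    c0_mg_l  : initial solute concentration in solution (mg/L)
    cw_mg_l  : equilibrium aqueous concentration Cw (mg/L)
    volume_l : solution volume (L)
    mass_kg  : mass of sorbent, e.g. soil, litter or plant material (kg)
    """
    # Mass sorbed per kg of material: Cs = (C0 - Cw) * V / m
    sorbed_mg_per_kg = (c0_mg_l - cw_mg_l) * volume_l / mass_kg
    # Kd = Cs / Cw
    return sorbed_mg_per_kg / cw_mg_l

# Hypothetical batch: 10 mg/L initial, 4 mg/L at equilibrium,
# 40 mL of solution on 10 g of ditch sediment -> Kd = 6 L/kg.
kd = batch_kd(10.0, 4.0, 0.04, 0.01)
```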

  17. Complementarity and Area-Efficiency in the Prioritization of the Global Protected Area Network.

    PubMed

    Kullberg, Peter; Toivonen, Tuuli; Montesino Pouzols, Federico; Lehtomäki, Joona; Di Minin, Enrico; Moilanen, Atte

    2015-01-01

Complementarity and cost-efficiency are widely used principles for protected area network design. Despite their wide use and robust theoretical underpinnings, their effects on the performance and patterns of priority areas are rarely studied in detail. Here we compare two approaches for identifying management priority areas inside the global protected area network: 1) a scoring-based approach, used in a recently published analysis, and 2) a spatial prioritization method that accounts for complementarity and area-efficiency. Using the same IUCN species distribution data, the complementarity method found an equal-area set of priority areas covering twice the mean species range compared to the scoring-based approach. The complementarity set also had 72% more species with their full ranges covered, and only half as many species entirely lacking coverage, compared to the scoring approach. Protected areas in our complementarity-based solution were on average smaller and geographically more scattered. The large difference between the two solutions highlights the need for critical thinking about the chosen prioritization method. According to our analysis, accounting for complementarity and area-efficiency can lead to considerable improvements when setting management priorities for the global protected area network.
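The difference between the two approaches can be illustrated with a toy sketch; the site names and species sets below are invented, and the greedy heuristic stands in for full spatial prioritization software (such as the tools used in the study), not for its exact algorithm:

```python
def scoring_pick(sites, species_by_site, budget):
    """Scoring approach: rank sites by local species richness alone,
    ignoring overlap between the sites' species sets."""
    ranked = sorted(sites, key=lambda s: len(species_by_site[s]), reverse=True)
    return ranked[:budget]

def complementarity_pick(sites, species_by_site, budget):
    """Greedy complementarity: each step adds the site that covers the
    most species not yet represented (an area-efficiency heuristic)."""
    chosen, covered = [], set()
    remaining = list(sites)
    for _ in range(budget):
        best = max(remaining, key=lambda s: len(species_by_site[s] - covered))
        chosen.append(best)
        covered |= species_by_site[best]
        remaining.remove(best)
    return chosen

# Invented example: sites B and C are individually equally rich, but C
# complements A while B duplicates it.
ranges = {"A": {1, 2, 3}, "B": {1, 2}, "C": {4, 5}}
```

With a budget of two sites, scoring selects A and B (covering species 1-3), while the complementarity pick selects A and C (covering all five species).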

  18. Assessing Climate Change Risks Using a Multi-Model Approach

    NASA Astrophysics Data System (ADS)

    Knorr, W.; Scholze, M.; Prentice, C.

    2007-12-01

We quantify the risks of climate-induced changes in key ecosystem processes during the 21st century by forcing a dynamic global vegetation model with multiple scenarios from the IPCC AR4 data archive using 16 climate models, and mapping the proportions of model runs showing exceedance of natural variability in wildfire frequency and freshwater supply or shifts in vegetation cover. Our analysis does not assign probabilities to scenarios. Instead, we consider the distribution of outcomes within three sets of model runs grouped according to the amount of global warming they simulate: <2 degrees C (including committed climate change simulations), 2-3 degrees C, and >3 degrees C. Here, we contrast two different methods for calculating the risks: first, an equal-weighting approach giving every model within one of the three sets the same weight, and second, weighting the models according to their ability to simulate ENSO. The differences underscore the need for more robust performance metrics for global climate models.
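The risk metric described (the proportion of model runs exceeding natural variability, either equally weighted or weighted by model skill) can be sketched as follows; the run values and weights are illustrative, not the study's results:

```python
def exceedance_risk(runs, threshold, weights=None):
    """Fraction of model runs whose simulated change exceeds natural
    variability (threshold). With weights=None every run counts equally;
    otherwise weights (e.g. ENSO skill scores) rescale each run's vote."""
    if weights is None:
        weights = [1.0] * len(runs)
    total = sum(weights)
    exceeding = sum(w for r, w in zip(runs, weights) if r > threshold)
    return exceeding / total

# Illustrative set of runs grouped into one warming class: two of four
# runs exceed the natural-variability threshold under equal weighting.
equal = exceedance_risk([0.5, 1.5, 2.5, 3.5], threshold=2.0)
# Upweighting a skilful exceeding run raises the estimated risk.
weighted = exceedance_risk([1.0, 3.0], threshold=2.0, weights=[1.0, 3.0])
```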

  19. Support and performance improvement for primary health care workers in low- and middle-income countries: a scoping review of intervention design and methods

    PubMed Central

    Mabey, David C.; Chaudhri, Simran; Brown Epstein, Helen-Ann; Lawn, Stephen D.

    2017-01-01

    Abstract Primary health care workers (HCWs) in low- and middle-income settings (LMIC) often work in challenging conditions in remote, rural areas, in isolation from the rest of the health system and particularly specialist care. Much attention has been given to implementation of interventions to support quality and performance improvement for workers in such settings. However, little is known about the design of such initiatives and which approaches predominate, let alone those that are most effective. We aimed for a broad understanding of what distinguishes different approaches to primary HCW support and performance improvement and to clarify the existing evidence as well as gaps in evidence in order to inform decision-making and design of programs intended to support and improve the performance of health workers in these settings. We systematically searched the literature for articles addressing this topic, and undertook a comparative review to document the principal approaches to performance and quality improvement for primary HCWs in LMIC settings. We identified 40 eligible papers reporting on interventions that we categorized into five different approaches: (1) supervision and supportive supervision; (2) mentoring; (3) tools and aids; (4) quality improvement methods, and (5) coaching. The variety of study designs and quality/performance indicators precluded a formal quantitative data synthesis. The most extensive literature was on supervision, but there was little clarity on what defines the most effective approach to the supervision activities themselves, let alone the design and implementation of supervision programs. The mentoring literature was limited, and largely focused on clinical skills building and educational strategies. Further research on how best to incorporate mentorship into pre-service clinical training, while maintaining its function within the routine health system, is needed. 
There is insufficient evidence to draw conclusions about coaching in this setting; however, a review of the corporate and business-school literature is warranted to identify transferable approaches. A substantial literature exists on tools, but significant variation in approaches makes comparison challenging. We found examples of effective individual projects and designs in specific settings, but there was a lack of comparative research on tools across approaches or across settings, and no systematic analysis within specific approaches to provide evidence with clear generalizability. Future research should prioritize comparative intervention trials to establish clear global standards for performance and quality improvement initiatives. Such standards will be critical to creating and sustaining a well-functioning health workforce and for global initiatives such as universal health coverage. PMID:27993961

  20. Assessing Technical Performance and Determining the Learning Curve in Cleft Palate Surgery Using a High-Fidelity Cleft Palate Simulator.

    PubMed

    Podolsky, Dale J; Fisher, David M; Wong Riff, Karen W; Szasz, Peter; Looi, Thomas; Drake, James M; Forrest, Christopher R

    2018-06-01

This study assessed technical performance in cleft palate repair using a newly developed assessment tool and a high-fidelity cleft palate simulator through a longitudinal simulation training exercise. Three residents performed five, and one resident performed nine, consecutive endoscopically recorded cleft palate repairs using a cleft palate simulator. Two fellows in pediatric plastic surgery and two expert cleft surgeons also performed recorded simulated repairs. The Cleft Palate Objective Structured Assessment of Technical Skill (CLOSATS) and end-product scales were developed to assess performance. Two blinded cleft surgeons assessed the recordings and the final repairs using the CLOSATS, the end-product scale, and a previously developed global rating scale. The average procedure-specific (CLOSATS), global rating, and end-product scores increased logarithmically after each successive simulation session for the residents. Reliability of the CLOSATS (average item intraclass correlation coefficient (ICC), 0.85 ± 0.093) and global ratings (average item ICC, 0.91 ± 0.02) among the raters was high. Reliability of the end-product assessments was lower (average item ICC, 0.66 ± 0.15). Standard-setting linear regression using an overall cutoff score of 7 of 10 corresponded to pass scores for the CLOSATS and the global score of 44 (maximum, 60) and 23 (maximum, 30), respectively. Using logarithmic best-fit curves, 6.3 simulation sessions are required to reach the minimum standard. A high-fidelity cleft palate simulator has been developed that improves technical performance in cleft palate repair. The simulator and technical assessment scores can be used to determine performance before operating on patients.
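The logarithmic best-fit learning curve, and the derived number of sessions needed to reach the pass standard, can be sketched as follows; the fitted data here are synthetic, not the study's scores:

```python
import math

def fit_log_curve(sessions, scores):
    """Least-squares fit of score = a*ln(session) + b, the logarithmic
    learning-curve model used to summarize improvement over sessions."""
    xs = [math.log(s) for s in sessions]
    n = len(xs)
    mx, my = sum(xs) / n, sum(scores) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, scores))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b

def sessions_to_standard(a, b, cutoff):
    """Invert the fitted curve to find the session count at which the
    predicted score first reaches the pass standard (cutoff)."""
    return math.exp((cutoff - b) / a)

# Synthetic scores that follow 10*ln(session) + 20 exactly.
sessions = [1, 2, 3, 4, 5]
scores = [10 * math.log(s) + 20 for s in sessions]
a, b = fit_log_curve(sessions, scores)
```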

  1. Means, Variability and Trends of Precipitation in the Global Climate as Determined by the 25-year GEWEX/GPCP Data Set

    NASA Technical Reports Server (NTRS)

    Adler, R. F.; Gu, G.; Curtis, S.; Huffman, G. J.

    2004-01-01

    The Global Precipitation Climatology Project (GPCP) 25-year precipitation data set is used as a basis to evaluate the mean state, variability and trends (or inter-decadal changes) of global and regional scales of precipitation. The uncertainties of these characteristics of the data set are evaluated by examination of other, parallel data sets and examination of shorter periods with higher quality data (e.g., TRMM). The global and regional means are assessed for uncertainty by comparing with other satellite and gauge data sets, both globally and regionally. The GPCP global mean of 2.6 mm/day is divided into values for ocean and land and for major latitude bands (Tropics, mid-latitudes, etc.). Seasonal variations globally and by region are shown and uncertainties estimated. The year-to-year variability of precipitation is shown to be related to ENSO variations and volcanoes and is evaluated in relation to the overall lack of a significant global trend. The GPCP data set necessarily has a heterogeneous time series of input data sources, so part of the assessment described above is to test the initial results for potential influence by major data boundaries in the record.
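A global mean such as the 2.6 mm/day figure must be area-weighted, since grid cells shrink toward the poles; a minimal cos-latitude sketch over latitude-band means (the band values below are illustrative):

```python
import math

def global_mean(band_values, band_lats_deg):
    """Area-weighted (cos-latitude) global mean of a zonal-mean field.

    band_values   : mean value per latitude band (e.g. precipitation, mm/day)
    band_lats_deg : band-centre latitudes in degrees
    """
    weights = [math.cos(math.radians(lat)) for lat in band_lats_deg]
    return (sum(v * w for v, w in zip(band_values, weights))
            / sum(weights))

# A uniform 2.6 mm/day field has a 2.6 mm/day global mean regardless of
# weighting; a wet tropics / dry high-latitude split pulls the weighted
# mean toward the tropical value.
uniform = global_mean([2.6, 2.6, 2.6], [-45.0, 0.0, 45.0])
skewed = global_mean([1.0, 3.0], [0.0, 60.0])
```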

  2. Assessing water resources in Azerbaijan using a local distributed model forced and constrained with global data

    NASA Astrophysics Data System (ADS)

    Bouaziz, Laurène; Hegnauer, Mark; Schellekens, Jaap; Sperna Weiland, Frederiek; ten Velden, Corine

    2017-04-01

    In many countries, data is scarce, incomplete and often not easily shared. In these cases, global satellite and reanalysis data provide an alternative to assess water resources. To assess water resources in Azerbaijan, a completely distributed and physically based hydrological wflow-sbm model was set up for the entire Kura basin. We used SRTM elevation data, a locally available river map and one from OpenStreetMap to derive the drainage direction network at the model resolution of approximately 1x1 km. OpenStreetMap data was also used to derive the fraction of paved area per cell to account for the reduced infiltration capacity (cf. Schellekens et al., 2014). We used the results of a global study to derive root zone capacity based on climate data (Wang-Erlandsson et al., 2016). To account for the variation in vegetation cover over the year, monthly averages of Leaf Area Index, based on MODIS data, were used. For the soil-related parameters, we used global estimates as provided by Dai et al. (2013). This enabled the rapid derivation of a first estimate of parameter values for our hydrological model. Digitized local meteorological observations were scarce and available only for a limited time period. Therefore several sources of global meteorological data were evaluated: (1) EU-WATCH global precipitation, temperature and derived potential evaporation for the period 1958-2001 (Harding et al., 2011), (2) WFDEI precipitation, temperature and derived potential evaporation for the period 1979-2014 (Weedon et al., 2014), (3) MSWEP precipitation (Beck et al., 2016) and (4) local precipitation data from more than 200 stations in the Kura basin, available from the NOAA website for a period up to 1991. The latter, together with data archives from Azerbaijan, were used as a benchmark to evaluate the global precipitation datasets for the overlapping period 1958-1991.
By comparing the datasets, we found that the monthly mean precipitation of EU-WATCH and WFDEI coincided well with the NOAA stations, and that MSWEP slightly overestimated precipitation amounts. On a daily basis, there were discrepancies in peak timing and magnitude between measured precipitation and the global products. A bias between EU-WATCH and WFDEI temperature and potential evaporation was observed, and to model the water balance correctly it was necessary to correct EU-WATCH to WFDEI mean monthly values. Overall, the available sources enabled the rapid set-up of a hydrological model, including its forcing, with relatively good performance for assessing water resources in Azerbaijan with limited calibration effort, and they allow for a similar set-up anywhere in the world. Timing and quantification of peak volume remain a weakness in global data, making the data difficult to use for some applications (flooding) and for detailed calibration. Selecting and comparing different sources of global meteorological data is important to obtain a reliable set that improves model performance. - Beck et al., 2016. MSWEP: 3-hourly 0.25° global gridded precipitation (1979-2014) by merging gauge, satellite, and reanalysis data. Hydrol. Earth Syst. Sci. Discuss. - Dai, Y. et al., 2013. Development of a China Dataset of Soil Hydraulic Parameters Using Pedotransfer Functions for Land Surface Modeling. Journal of Hydrometeorology - Harding, R. et al., 2011. WATCH: Current knowledge of the Terrestrial global water cycle. J. Hydrometeorol. - Schellekens, J. et al., 2014. Rapid setup of hydrological and hydraulic models using OpenStreetMap and the SRTM derived digital elevation model. Environmental Modelling & Software - Wang-Erlandsson, L. et al., 2016. Global Root Zone Storage Capacity from Satellite-Based Evaporation. Hydrology and Earth System Sciences - Weedon, G. et al., 2014. The WFDEI meteorological forcing data set: WATCH Forcing Data methodology applied to ERA-Interim reanalysis data. Water Resources Research.
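The correction of EU-WATCH toward WFDEI mean monthly values can be sketched as a per-month rescaling of the daily series so that each month's mean matches the target climatology; the data below are invented for illustration:

```python
def rescale_to_climatology(daily, months, target_monthly_mean):
    """Scale daily forcing so each month's mean matches a target
    monthly climatology (the kind of bias correction used to align
    one forcing dataset with another's monthly means).

    daily  : daily values (e.g. potential evaporation, mm/day)
    months : month index (1-12) for each daily value
    target_monthly_mean : dict mapping month -> desired monthly mean
    """
    sums, counts = {}, {}
    for value, month in zip(daily, months):
        sums[month] = sums.get(month, 0.0) + value
        counts[month] = counts.get(month, 0) + 1
    # One multiplicative factor per month: target mean / current mean.
    factor = {m: target_monthly_mean[m] / (sums[m] / counts[m]) for m in sums}
    return [value * factor[month] for value, month in zip(daily, months)]

# Invented two-month series: month 1 is doubled, month 2 is halved,
# so each corrected month matches its target mean exactly.
corrected = rescale_to_climatology([1.0, 3.0, 2.0, 2.0],
                                   [1, 1, 2, 2],
                                   {1: 4.0, 2: 1.0})
```

A multiplicative factor preserves zeros and the day-to-day pattern within each month, which is why it is a common choice for non-negative fields; an additive shift is an alternative for temperature.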

  3. Global embedding of fibre inflation models

    NASA Astrophysics Data System (ADS)

    Cicoli, Michele; Muia, Francesco; Shukla, Pramod

    2016-11-01

    We present concrete embeddings of fibre inflation models in globally consistent type IIB Calabi-Yau orientifolds with closed string moduli stabilisation. After performing a systematic search through the existing list of toric Calabi-Yau manifolds, we find several examples that reproduce the minimal setup to embed fibre inflation models. This involves Calabi-Yau manifolds with h^{1,1} = 3 which are K3 fibrations over a ℙ^1 base with an additional shrinkable rigid divisor. We then provide different consistent choices of the underlying brane set-up which generate a non-perturbative superpotential suitable for moduli stabilisation and string loop corrections with the correct form to drive inflation. For each Calabi-Yau orientifold setting, we also compute the effect of higher derivative contributions and study their influence on the inflationary dynamics.

  4. Tower-scale performance of four observation-based evapotranspiration algorithms within the WACMOS-ET project

    NASA Astrophysics Data System (ADS)

    Michel, Dominik; Miralles, Diego; Jimenez, Carlos; Ershadi, Ali; McCabe, Matthew F.; Hirschi, Martin; Seneviratne, Sonia I.; Jung, Martin; Wood, Eric F.; (Bob) Su, Z.; Timmermans, Joris; Chen, Xuelong; Fisher, Joshua B.; Mu, Quiaozen; Fernandez, Diego

    2015-04-01

    Research on climate variations and the development of predictive capabilities largely rely on globally available reference data series of the different components of the energy and water cycles. Several efforts have recently aimed at producing large-scale and long-term reference data sets of these components, e.g. based on in situ observations and remote sensing, in order to allow for diagnostic analyses of the drivers of temporal variations in the climate system. Evapotranspiration (ET) is an essential component of the energy and water cycle, which cannot be monitored directly on a global scale by remote sensing techniques. In recent years, several global multi-year ET data sets have been derived from remote sensing-based estimates, observation-driven land surface model simulations or atmospheric reanalyses. The LandFlux-EVAL initiative presented an ensemble-evaluation of these data sets over the time periods 1989-1995 and 1989-2005 (Mueller et al. 2013). The WACMOS-ET project (http://wacmoset.estellus.eu) started in the year 2012 and constitutes an ESA contribution to the GEWEX initiative LandFlux. It focuses on advancing the development of ET estimates at global, regional and tower scales. WACMOS-ET aims at developing a Reference Input Data Set exploiting European Earth Observations assets and deriving ET estimates produced by a set of four ET algorithms covering the period 2005-2007. The algorithms used are the SEBS (Su et al., 2002), Penman-Monteith from MODIS (Mu et al., 2011), the Priestley and Taylor JPL model (Fisher et al., 2008) and GLEAM (Miralles et al., 2011). The algorithms are run with Fluxnet tower observations, reanalysis data (ERA-Interim), and satellite forcings. They are cross-compared and validated against in-situ data. 
In this presentation the performance of the different ET algorithms with respect to different temporal resolutions, hydrological regimes, land cover types (including grassland, cropland, shrubland, vegetation mosaic, savanna, woody savanna, needleleaf forest, deciduous forest and mixed forest) are evaluated at the tower-scale in 24 pre-selected study regions on three continents (Europe, North America, and Australia). References: Fisher, J. B., Tu, K.P., and Baldocchi, D.D. Global estimates of the land-atmosphere water flux based on monthly AVHRR and ISLSCP-II data, validated at 16 FLUXNET sites, Remote Sens. Environ. 112, 901-919, 2008. Jiménez, C. et al. Global intercomparison of 12 land surface heat flux estimates. J. Geophys. Res. 116, D02102, 2011. 
 Miralles, D.G. et al. Global land-surface evaporation estimated from satellite-based observations. Hydrol. Earth Syst. Sci. 15, 453-469, 2011. 
 Mu, Q., Zhao, M. & Running, S.W. Improvements to a MODIS global terrestrial evapotranspiration algorithm. Remote Sens. Environ. 115, 1781-1800, 2011. 
 Mueller, B., Hirschi, M., Jimenez, C., Ciais, P., Dirmeyer, P. A., Dolman, A. J., Fisher, J. B., Jung, M., Ludwig, F., Maignan, F., Miralles, D. G., McCabe, M. F., Reichstein, M., Sheffield, J., Wang, K., Wood, E. F., Zhang, Y., and Seneviratne, S. I. (2013). Benchmark products for land evapotranspiration: LandFlux-EVAL multi-data set synthesis. Hydrology and Earth System Sciences, 17, 3707-3720. Su, Z. The Surface Energy Balance System (SEBS) for estimation of turbulent heat fluxes. Hydrol. Earth Syst. Sci. 6, 85-99, 2002.
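Tower-scale validation of ET estimates against in-situ flux data typically relies on error and correlation skill scores; a minimal sketch of two standard metrics (the metric choice here is generic, not specific to the WACMOS-ET protocol):

```python
import math

def rmse(obs, sim):
    """Root-mean-square error between tower observations and modelled ET."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def pearson_r(obs, sim):
    """Pearson correlation, a standard skill score in flux validation."""
    n = len(obs)
    mo, ms = sum(obs) / n, sum(sim) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(obs, sim))
    sd_o = math.sqrt(sum((o - mo) ** 2 for o in obs))
    sd_s = math.sqrt(sum((s - ms) ** 2 for s in sim))
    return cov / (sd_o * sd_s)

# Invented daily ET series (mm/day): a model that doubles the observed
# signal correlates perfectly but carries a large amplitude error.
obs = [1.0, 2.0, 3.0]
sim = [2.0, 4.0, 6.0]
```

Reporting both metrics per site and per land cover type separates amplitude errors (RMSE) from timing or variability errors (correlation), which is the kind of stratified evaluation described above.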

  5. Dimensions of Academic Interest among Undergraduate Students: Passion, Confidence, Aspiration and Self-Expression

    ERIC Educational Resources Information Center

    Lee, Jihyun; Durksen, Tracy L.

    2018-01-01

    We investigated psychological dimensions of academic interest among undergraduate students (N = 325) using a global academic interest scale. The scale was administered together with measures of academic performance, educational aspiration, career planning, goal setting, life satisfaction, attitudes towards leisure, personality and value.…

  6. Get Set! E-Ready, ... E-Learn! The E-Readiness of Warehouse Workers

    ERIC Educational Resources Information Center

    Moolman, Hermanus B.; Blignaut, Seugnet

    2008-01-01

    Modern organizations use technology to expand across traditional business zones and boundaries to survive in the global commercial village. While IT systems allow organizations to maintain a competitive edge, South African unskilled labourers performing warehouse operations are frequently retrained to keep abreast of Information Technology.…

  7. Cultural Regulation and the Reshaping of the University

    ERIC Educational Resources Information Center

    O'Brien, Stephen

    2012-01-01

    This paper is set within the context of university change in the Republic of Ireland. Irish third-level institutions are increasingly situated, whilst situating themselves, in the global advance of the so-called "entrepreneurial" university model. This model promotes a utilitarian and performative view of knowledge that, in turn, informs new…

  8. Management Matters: A Leverage Point for Health Systems Strengthening in Global Health

    PubMed Central

    Bradley, Elizabeth H.; Taylor, Lauren A.; Cuellar, Carlos J.

    2015-01-01

    Despite a renewed focus in the field of global health on strengthening health systems, inadequate attention has been directed to a key ingredient of high-performing health systems: management. We aimed to develop the argument that management – defined here as the process of achieving predetermined objectives through human, financial, and technical resources – is a cross-cutting function necessary for success in all World Health Organization (WHO) building blocks of health systems strengthening. Management within health systems is particularly critical in low-income settings where the efficient use of scarce resources is paramount to attaining health goals. More generally, investments in management capacity may be viewed as a key leverage point in grand strategy, as strong management enables the achievement of large ends with limited means. We also sought to delineate a set of core competencies and identify key roles to be targeted for management capacity building efforts. Several effective examples of management interventions have been described in the research literature. Together, the existing evidence underscores the importance of country ownership of management capacity building efforts, which often challenge the status quo and thus need country leadership to sustain despite inevitable friction. The literature also recognizes that management capacity efforts, as a key ingredient of effective systems change, take time to embed, as new protocols and ways of working become habitual and integrated as standard operating procedures. Despite these challenges, the field of health management as part of global health system strengthening efforts holds promise as a fundamental leverage point for achieving health system performance goals with existing human, technical, and financial resources. 
The evidence base consistently supports the role of management in performance improvement but would benefit from additional research with improved methodological rigor and longer time-horizon investigations. Meanwhile, greater emphasis on management as a critical element of global health efforts may open new and sustainable avenues for advancing health systems performance. PMID:26188805

  10. Web-TCGA: an online platform for integrated analysis of molecular cancer data sets.

    PubMed

    Deng, Mario; Brägelmann, Johannes; Schultze, Joachim L; Perner, Sven

    2016-02-06

    The Cancer Genome Atlas (TCGA) is a pool of molecular data sets publicly accessible and freely available to cancer researchers anywhere around the world. However, widespread use is limited, since advanced knowledge of statistics and statistical software is required. To improve accessibility we created Web-TCGA, a web-based, freely accessible online tool (which can also be run in a private instance) for integrated analysis of molecular cancer data sets provided by TCGA. In contrast to already available tools, Web-TCGA utilizes different methods for analysis and visualization of TCGA data, allowing users to generate global molecular profiles across different cancer entities simultaneously. In addition to global molecular profiles, Web-TCGA offers highly detailed gene- and tumor-entity-centric analyses through interactive tables and views. As a supplement to other available tools, such as cBioPortal (Sci Signal 6:pl1, 2013; Cancer Discov 2:401-4, 2012), Web-TCGA offers an analysis service for the molecular data sets available at the TCGA that does not require any installation or configuration. Individual processing requests (queries) are generated by the user for mutation, methylation, expression and copy number variation (CNV) analyses. The user can focus analyses on results from single genes and cancer entities or perform a global analysis (multiple cancer entities and genes simultaneously).

  11. Relationship between global structural parameters and Enzyme Commission hierarchy: implications for function prediction.

    PubMed

    Boareto, Marcelo; Yamagishi, Michel E B; Caticha, Nestor; Leite, Vitor B P

    2012-10-01

    In protein databases there is a substantial number of proteins that are structurally determined but lack function annotation. Understanding the relationship between function and structure can be useful for predicting function on a large scale. We have analyzed the similarities in global physicochemical parameters for a set of enzymes which were classified according to the four Enzyme Commission (EC) hierarchical levels. Using relevance theory we introduced a distance between proteins in the space of physicochemical characteristics. This was done by minimizing a cost function of the metric tensor built to reflect the EC classification system. Using an unsupervised clustering method on a set of 1025 enzymes, we obtained no relevant cluster formation compatible with the EC classification. The distance distributions between enzymes from the same EC group and from different EC groups were compared by histograms. Such analysis was also performed using sequence alignment similarity as a distance. Our results suggest that global structure parameters are not sufficient to segregate enzymes according to the EC hierarchy. This indicates that features essential for function are local rather than global. Consequently, methods for predicting function based on global attributes should not obtain high accuracy in predicting the main EC classes without relying on similarities between enzymes from the training and validation datasets. Furthermore, these results are consistent with a substantial number of studies suggesting that function evolves fundamentally by recruitment, i.e., the same protein motif or fold can be used to perform different enzymatic functions, and a few specific amino acids (AAs) are actually responsible for enzyme activity. These essential amino acids should belong to active sites, and an effective method for predicting function should be able to recognize them. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Protein Loop Structure Prediction Using Conformational Space Annealing.

    PubMed

    Heo, Seungryong; Lee, Juyong; Joo, Keehyoung; Shin, Hang-Cheol; Lee, Jooyoung

    2017-05-22

    We have developed a protein loop structure prediction method by combining a new energy function, which we call E_PLM (energy for protein loop modeling), with the conformational space annealing (CSA) global optimization algorithm. The energy function includes stereochemistry, dynamic fragment assembly, distance-scaled finite ideal gas reference (DFIRE), and generalized orientation- and distance-dependent terms. For the conformational search of loop structures, we used the CSA algorithm, which has been quite successful in dealing with various hard global optimization problems. We assessed the performance of E_PLM with two widely used loop-decoy sets, Jacobson and RAPPER, and compared the results against the DFIRE potential. The accuracy of model selection from a pool of loop decoys, as well as of de novo loop modeling starting from randomly generated structures, was examined separately. For the selection of a nativelike structure from a decoy set, E_PLM was more accurate than DFIRE in the case of the Jacobson set and had similar accuracy in the case of the RAPPER set. In terms of sampling more nativelike loop structures, E_PLM outperformed E_DFIRE for both decoy sets. This new approach, equipped with E_PLM and CSA, can serve as a state-of-the-art de novo loop modeling method.

  13. Impact of bias-corrected reanalysis-derived lateral boundary conditions on WRF simulations

    NASA Astrophysics Data System (ADS)

    Moalafhi, Ditiro Benson; Sharma, Ashish; Evans, Jason Peter; Mehrotra, Rajeshwar; Rocheta, Eytan

    2017-08-01

    Lateral and lower boundary conditions derived from a suitable global reanalysis data set form the basis for deriving a dynamically consistent finer-resolution downscaled product for climate and hydrological assessment studies. A problem with this, however, is that systematic biases are present in the global reanalysis data sets that form these boundaries, biases which can be carried into the downscaled simulations, thereby reducing their accuracy or efficacy. In this work, three Weather Research and Forecasting (WRF) model downscaling experiments are undertaken to investigate the impact of bias-correcting European Centre for Medium-Range Weather Forecasts (ECMWF) ERA-Interim (ERA-I) reanalysis atmospheric temperature and relative humidity using Atmospheric Infrared Sounder (AIRS) satellite data. The downscaling is performed over a domain centered over southern Africa between the years 2003 and 2012. Two corrections are applied at each grid cell for each variable: one using the sample mean only, and one using both the mean and the standard deviation. The resultant WRF simulations of near-surface temperature and precipitation are evaluated seasonally and annually against global gridded observational data sets and compared with the ERA-I reanalysis driving field. The study reveals inconsistencies between the impact of the bias correction prior to downscaling and the resultant model simulations after downscaling. Mean and standard deviation bias-corrected WRF simulations are, however, found to be marginally better than mean-only bias-corrected WRF simulations and raw ERA-I reanalysis-driven WRF simulations. Performances, however, differ when assessing different attributes in the downscaled field. This raises questions about the efficacy of the correction procedures adopted.

  14. The WACMOS-ET project – Part 1: Tower-scale evaluation of four remote-sensing-based evapotranspiration algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michel, D.; Jimenez, C.; Miralles, D. G.

    The WAter Cycle Multi-mission Observation Strategy – EvapoTranspiration (WACMOS-ET) project has compiled a forcing data set covering the period 2005–2007 that aims to maximize the exploitation of European Earth Observation data sets for evapotranspiration (ET) estimation. The data set was used to run four established ET algorithms: the Priestley–Taylor Jet Propulsion Laboratory model (PT-JPL), the Penman–Monteith algorithm from the MODerate resolution Imaging Spectroradiometer (MODIS) evaporation product (PM-MOD), the Surface Energy Balance System (SEBS) and the Global Land Evaporation Amsterdam Model (GLEAM). In addition, in situ meteorological data from 24 FLUXNET towers were used to force the models, with results from both forcing sets compared to tower-based flux observations. Model performance was assessed on several timescales using both sub-daily and daily forcings. The PT-JPL model and GLEAM provide the best performance for both satellite- and tower-based forcing, as well as for the considered temporal resolutions. Simulations using the PM-MOD mostly underestimated ET, while SEBS was characterized by a systematic overestimation. In general, all four algorithms produce the best results in wet and moderately wet climate regimes. In dry regimes, the correlation and the absolute agreement with the reference tower ET observations were consistently lower. While ET derived with in situ forcing data agrees best with the tower measurements (R² = 0.67), the agreement of the satellite-based ET estimates is only marginally lower (R² = 0.58). Results also show similar model performance at daily and sub-daily (3-hourly) resolutions. Overall, our validation experiments against in situ measurements indicate that there is no single best-performing algorithm across all biome and forcing types. 
In conclusion, an extension of the evaluation to a larger selection of 85 towers (with model inputs resampled to a common grid to facilitate global estimates) confirmed the original findings.

  16. Pneumothorax detection in chest radiographs using local and global texture signatures

    NASA Astrophysics Data System (ADS)

    Geva, Ofer; Zimmerman-Moreno, Gali; Lieberman, Sivan; Konen, Eli; Greenspan, Hayit

    2015-03-01

    A novel framework for automatic detection of pneumothorax abnormality in chest radiographs is presented. The suggested method is based on a texture analysis approach combined with supervised learning techniques. The proposed framework consists of two main steps: first, a texture analysis process is performed for detection of local abnormalities. Labeled image patches are extracted in the texture analysis procedure, following which local analysis values are incorporated into a novel global image representation. The global representation is used for training and detection of the abnormality at the image level. The global representation is designed around the distinctive shape of the lung, taking into account the characteristics of typical pneumothorax abnormalities. A supervised learning process was performed on both the local and global data, leading to a trained detection system. The system was tested on a dataset of 108 upright chest radiographs. Several state-of-the-art texture feature sets were evaluated (Local Binary Patterns, Maximum Response filters). The optimal configuration yielded a sensitivity of 81% with a specificity of 87%. The results of the evaluation are promising, establishing the current framework as a basis for additional improvements and extensions.

  17. Pheromone Static Routing Strategy for Complex Networks

    NASA Astrophysics Data System (ADS)

    Hu, Mao-Bin; Henry, Y. K. Lau; Ling, Xiang; Jiang, Rui

    2012-12-01

    We adopt the concept of using pheromones to generate a set of static paths that can reach the performance of the global dynamic routing strategy [Phys. Rev. E 81 (2010) 016113]. The path generation method consists of two stages. In the first stage, pheromone is deposited on the nodes by packets forwarded according to the global dynamic routing strategy. In the second stage, pheromone static paths are generated according to the pheromone density. The output paths can greatly improve the overall capacity of traffic systems on different network structures, including scale-free networks, small-world networks and random graphs. Because the paths are static, the system needs far less computational resources than the global dynamic routing strategy.

  18. Team Formation in Partially Observable Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Agogino, Adrian K.; Tumer, Kagan

    2004-01-01

    Sets of multi-agent teams often need to maximize a global utility rating the performance of the entire system, in settings where a team cannot fully observe other teams' agents. Such limited observability hinders team members, who pursue their own team utilities, from taking actions that also help maximize the global utility. In this article, we show how team utilities can be used in partially observable systems. Furthermore, we show how team sizes can be manipulated to provide the best compromise between having easy-to-learn team utilities and having them aligned with the global utility. The results show that optimally sized teams in a partially observable environment outperform one team in a fully observable environment by up to 30%.

  19. Component-Level Selection and Qualification for the Global Ecosystem Dynamics Investigation (GEDI) Laser Altimeter Transmitter

    NASA Technical Reports Server (NTRS)

    Frese, Erich A.; Chiragh, Furqan L.; Switzer, Robert; Vasilyev, Aleksey A.; Thomes, Joe; Coyle, D. Barry; Stysley, Paul R.

    2018-01-01

    Flight-quality solid-state lasers require a unique and extensive set of testing and qualification processes, at both the system and component levels, to ensure the laser's promised performance. As important as the overall laser transmitter design is, the quality and performance of individual subassemblies, optics, and electro-optics dictate the final laser unit's quality. The Global Ecosystem Dynamics Investigation (GEDI) laser transmitters employ all the usual components typical of a diode-pumped, solid-state laser, yet each must go through its own individual process of specification, modeling, performance demonstration, inspection, and destructive testing. These qualification processes and results for the laser crystals, laser diode arrays, electro-optics, and optics will be reviewed, along with the relevant critical issues encountered prior to their installation in the GEDI flight laser units.

  20. The Role of a Multidimensional Concept of Trust in the Performance of Global Virtual Teams

    NASA Technical Reports Server (NTRS)

    Bodensteiner, Nan Muir; Stecklein, Jonette M.

    2002-01-01

    This paper focuses on the concept of trust as an important ingredient of effective global virtual team performance. Definitions of trust and virtual teams are presented. The concept of trust is developed from its unilateral application (trust, absence of trust) to a multidimensional concept including cognitive and affective components. The special challenges of a virtual team are then discussed, with particular emphasis on how a multidimensional concept of trust impacts these challenges. Propositions are stated suggesting that the multidimensional concept of trust moderates the negative impacts on virtual team performance of distance, cross-cultural and organizational differences, the effects of electronically mediated communication, reluctance to share information, and a lack of history/future. The paper concludes with recommendations and a set of techniques to build both cognitive and affective trust in virtual teams.

  1. An analysis of IGBP global land-cover characterization process

    USGS Publications Warehouse

    Loveland, Thomas R.; Zhu, Zhiliang; Ohlen, Donald O.; Brown, Jesslyn F.; Reed, Bradley C.; Yang, Limin

    1999-01-01

    The International Geosphere-Biosphere Programme (IGBP) has called for the development of improved global land-cover data for use in increasingly sophisticated global environmental models. To meet this need, the staff of the U.S. Geological Survey and the University of Nebraska-Lincoln developed and applied a global land-cover characterization methodology using 1992-1993 1-km resolution Advanced Very High Resolution Radiometer (AVHRR) and other spatial data. The methodology, based on unsupervised classification with extensive postclassification refinement, yielded a multi-layer database consisting of eight land-cover data sets, descriptive attributes, and source data. An independent IGBP accuracy assessment reports a global accuracy of 73.5 percent, with continental results varying from 63 percent to 83 percent. Although data quality, methodology, interpreter performance, and logistics affected the results, significant problems were associated with the relationship between AVHRR data and fine-scale, spectrally similar land-cover patterns in complex natural or disturbed landscapes.

  2. The Global Menace

    PubMed Central

    Hodges, Sarah

    2015-01-01

    Summary The history of medicine has gone ‘global.’ Why? Can the proliferation of the ‘global’ in our writing be explained away as a product of staying true to our historical subjects’ categories? Or has this historiography in fact delivered a new ‘global’ problematic or performed serious ‘global’ analytic work? The situation is far from clear, and it is the tension between the global as descriptor and an analytics of the global that concerns me here. I have three main concerns: (1) that there is an epistemic collusion between the discourses of universality that inform medical science and global-talk; (2) that the embrace of the ‘global’ authorises a turning away from analyses of power in history-writing in that (3) this turning away from analyses of power in history-writing leads to scholarship that reproduces rather than critiques globalisation as a set of institutions, discourses and practices. PMID:26345469

  3. Evaluation of an innovative sensor for measuring global and diffuse irradiance, and sunshine duration

    NASA Astrophysics Data System (ADS)

    Muneer, Tariq; Zhang, Xiaodong; Wood, John

    2002-03-01

    Delta-T Devices Ltd of Cambridge, UK has developed an integrated device that enables simultaneous measurement of horizontal global and diffuse irradiance, as well as sunshine status, at any given instant in time. To evaluate the performance of this new device, horizontal global and diffuse irradiance data were simultaneously collected from the Delta-T device and Napier University's CIE First Class daylight monitoring station. To enable a cross-check, a Kipp & Zonen CM11 global irradiance sensor was also installed in Currie, south-west Edinburgh. Sunshine duration data were recorded at the Royal Botanic Garden, Edinburgh using its Campbell-Stokes recorder. Hourly data sets were analysed and plotted within the Microsoft Excel environment. Using the common statistical measures Root Mean Square Difference (RMSD) and Mean Bias Difference (MBD), the accuracy of the Delta-T sensor's measurements of horizontal global and diffuse irradiance, and of sunshine duration, was investigated. The results show good performance on the part of the Delta-T device for the measurement of global and diffuse irradiance. The sunshine measurements were found to lack consistency and accuracy. It is argued herein that the distance between the respective sensors and the poor accuracy of the Campbell-Stokes recorder may be contributing factors to this phenomenon.
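    The two agreement measures named in the abstract have standard definitions and can be sketched in a few lines. The irradiance values below are made up for illustration, not data from the study.

```python
import math

def rmsd(measured, reference):
    """Root Mean Square Difference between a test series and a reference."""
    diffs = [m - r for m, r in zip(measured, reference)]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def mbd(measured, reference):
    """Mean Bias Difference: positive when the test sensor over-reads."""
    diffs = [m - r for m, r in zip(measured, reference)]
    return sum(diffs) / len(diffs)

# Hypothetical hourly global irradiance in W/m^2.
sensor = [105.0, 310.0, 498.0, 702.0]
reference = [100.0, 300.0, 500.0, 700.0]
print(round(rmsd(sensor, reference), 2), round(mbd(sensor, reference), 2))
```

    RMSD penalizes scatter regardless of sign, while MBD exposes a systematic offset; a sensor can have a near-zero MBD and still a large RMSD.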

  4. Learning Human Actions by Combining Global Dynamics and Local Appearance.

    PubMed

    Luo, Guan; Yang, Shuang; Tian, Guodong; Yuan, Chunfeng; Hu, Weiming; Maybank, Stephen J

    2014-12-01

    In this paper, we address the problem of human action recognition by combining global temporal dynamics and local visual spatio-temporal appearance features. For this purpose, in the global temporal dimension, we propose to model the motion dynamics with robust linear dynamical systems (LDSs) and use the model parameters as motion descriptors. Since LDSs live in a non-Euclidean space and the descriptors are in non-vector form, we propose a shift-invariant, subspace-angles-based distance to measure the similarity between LDSs. In the local visual dimension, we construct curved spatio-temporal cuboids along the trajectories of densely sampled feature points and describe them using histograms of oriented gradients (HOG). The distance between motion sequences is computed with the chi-squared histogram distance in the bag-of-words framework. Finally, we perform classification using the maximum margin distance learning method, combining the global dynamic distances and the local visual distances. We evaluate our approach for action recognition on five short-clip data sets, namely Weizmann, KTH, UCF sports, Hollywood2 and UCF50, as well as three long continuous data sets, namely VIRAT, ADL and CRIM13. We show competitive results as compared with current state-of-the-art methods.
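    The chi-squared histogram distance used in the bag-of-words comparison step has a simple closed form; a minimal sketch (the 1/2 factor and the epsilon guard against empty bins are common conventions, not details taken from the paper):

```python
def chi_squared_distance(h, g, eps=1e-12):
    """Chi-squared distance between two non-negative histograms
    of equal length (e.g. bag-of-words codeword counts)."""
    return 0.5 * sum((a - b) ** 2 / (a + b + eps) for a, b in zip(h, g))

# Identical histograms have distance 0; disjoint unit histograms give ~1.
print(chi_squared_distance([0.5, 0.5], [0.5, 0.5]))
print(chi_squared_distance([1.0, 0.0], [0.0, 1.0]))
```

    Unlike the Euclidean distance, the per-bin normalization makes differences in rarely populated bins count as much as differences in heavily populated ones, which is why it is popular for comparing visual-word histograms.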

  5. Aerosol and Cloud Observations and Data Products by the GLAS Polar Orbiting Lidar Instrument

    NASA Technical Reports Server (NTRS)

    Spinhirne, J. D.; Palm, S. P.; Hlavka, D. L.; Hart, W. D.; Mahesh, A.; Welton, E. J.

    2005-01-01

    The Geoscience Laser Altimeter System (GLAS), launched in 2003, is the first polar-orbiting satellite lidar. The instrument was designed for high-performance observations of the distribution and optical scattering cross sections of clouds and aerosol. The backscatter lidar operates at two wavelengths, 532 and 1064 nm. Both receiver channels meet and exceed their design goals, and beginning with a two-month period through October and November 2003, an excellent global lidar data set now exists. The data products for atmospheric observations include the calibrated, attenuated backscatter cross section for cloud and aerosol; height detection for multiple cloud layers; planetary boundary layer height; cirrus and aerosol optical depth; and the height distribution of aerosol and cloud scattering cross section profiles. The data sets are now in open release through the NASA data distribution system. Initial results on global statistics for cloud and aerosol distribution have been produced and, in some cases, compared to other satellite observations. The sensitivity of the cloud measurements is such that the 70% global cloud coverage result should be the most accurate to date. Results on the global distribution of aerosol are the first to produce the true height distribution for model inter-comparison.

  6. A Round Robin evaluation of AMSR-E soil moisture retrievals

    NASA Astrophysics Data System (ADS)

    Mittelbach, Heidi; Hirschi, Martin; Nicolai-Shaw, Nadine; Gruber, Alexander; Dorigo, Wouter; de Jeu, Richard; Parinussa, Robert; Jones, Lucas A.; Wagner, Wolfgang; Seneviratne, Sonia I.

    2014-05-01

    Large-scale and long-term soil moisture observations based on remote sensing are promising data sets for investigating and understanding various processes of the climate system, including the water and biochemical cycles. Currently, the ESA Climate Change Initiative for soil moisture is developing and evaluating a consistent global long-term soil moisture data set, based on merging passive and active remotely sensed soil moisture. Within this project, an inter-comparison of algorithms for AMSR-E and ASCAT Level 2 products was conducted separately to assess the performance of different retrieval algorithms. Here we present the inter-comparison of AMSR-E Level 2 soil moisture products. These include the public data sets from the University of Montana (UMT), the Japan Aerospace Exploration Agency (JAXA), VU University of Amsterdam (VUA; two algorithms) and the National Aeronautics and Space Administration (NASA). All participating algorithms are applied to the same AMSR-E Level 1 data set. Ascending and descending paths of scaled surface soil moisture are considered and evaluated separately at daily and monthly resolution over the 2007-2011 time period. Absolute values of soil moisture as well as their long-term anomalies (i.e. removing the mean seasonal cycle) and short-term anomalies (i.e. removing a five-week moving average) are evaluated. The evaluation is based on conventional measures, like correlation and unbiased root-mean-square differences, as well as on the application of the triple collocation method. As a reference data set, surface soil moisture from 75 quality-controlled soil moisture sites of the International Soil Moisture Network (ISMN) is used, covering a wide range of vegetation density and climate conditions. For the application of the triple collocation method, surface soil moisture estimates from the Global Land Data Assimilation System are used as a third independent data set. 
We find that the participating algorithms generally display a better performance for the descending compared to the ascending paths. A first classification of the sites by geographical location shows that the algorithms have a very similar average performance. Further classifications of the sites by land cover type and climate region will be conducted, which might reveal a more diverse performance of the algorithms.
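The triple collocation method used in this evaluation estimates each product's error variance from the pairwise covariances of three independent, collinear estimates of the same signal. A minimal sketch with synthetic data (the product names and error variances are illustrative, not the study's):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
truth = rng.standard_normal(n)            # common soil-moisture signal
# three products with independent zero-mean errors (illustrative variances)
x = truth + rng.normal(0.0, 0.3, n)       # error variance 0.09
y = truth + rng.normal(0.0, 0.5, n)       # error variance 0.25
z = truth + rng.normal(0.0, 0.4, n)       # error variance 0.16

def triple_collocation(x, y, z):
    """Estimate each product's error variance from pairwise covariances."""
    c = np.cov(np.vstack([x, y, z]))
    ex2 = c[0, 0] - c[0, 1] * c[0, 2] / c[1, 2]
    ey2 = c[1, 1] - c[0, 1] * c[1, 2] / c[0, 2]
    ez2 = c[2, 2] - c[0, 2] * c[1, 2] / c[0, 1]
    return ex2, ey2, ez2

ex2, ey2, ez2 = triple_collocation(x, y, z)
```

With truly independent error sources, each estimate converges to the corresponding product's error variance; the method fails when two products share correlated errors, which is why an independent third data set (here GLDAS in the study) is required.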

  7. Development of responder criteria for multicomponent non-pharmacological treatment in fibromyalgia.

    PubMed

    Vervoort, Vera M; Vriezekolk, Johanna E; van den Ende, Cornelia H

    2017-01-01

    There is a need to identify individual treatment success in patients with fibromyalgia (FM) who received non-pharmacological treatment. The present study described responder criteria for multicomponent non-pharmacological treatment in FM, and estimated and compared their sensitivity and specificity. Candidate responder sets were 1) identified in the literature; and 2) formulated by expert group consensus. All candidate responder sets were tested in a cohort of 129 patients with FM receiving multicomponent non-pharmacological treatment. We used two gold standards (the therapist's and the patient's perspective), assessed at six months after the start of treatment. Seven responder sets were defined (three identified in the literature and four formulated by expert group consensus), comprising combinations of the domains of 1) pain; 2) fatigue; 3) patient global assessment (PGA); 4) illness perceptions; 5) limitations in activities of daily living (ADL); and 6) sleep. The sensitivity and specificity of the literature-based responder sets (n=3) ranged from 17% to 99% and from 15% to 95%, respectively, whereas the expert-based responder sets (n=4) performed slightly better with regard to sensitivity (range 41%-81%) and specificity (range 50%-96%). Of the literature-based responder sets, the OMERACT-OARSI responder set with the patient's gold standard performed best (sensitivity 63%, specificity 75%, ROC area = 0.69). Overall, the expert-based responder set comprising the domains illness perceptions and limitations in ADL with the patient's gold standard performed best (sensitivity 47%, specificity 96%, ROC area = 0.71). We defined sets of responder criteria for multicomponent non-pharmacological treatment in fibromyalgia. Further research should focus on the validation of those sets with acceptable performance.
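The sensitivity and specificity figures reported above follow from the standard 2×2 confusion counts of responder predictions against a gold standard. A minimal sketch with made-up labels (not the study's cohort data):

```python
def sens_spec(pred, gold):
    """Sensitivity and specificity of binary predictions vs. a gold standard."""
    tp = sum(1 for p, g in zip(pred, gold) if p and g)          # true positives
    fn = sum(1 for p, g in zip(pred, gold) if not p and g)      # missed responders
    tn = sum(1 for p, g in zip(pred, gold) if not p and not g)  # true negatives
    fp = sum(1 for p, g in zip(pred, gold) if p and not g)      # false alarms
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical responder classifications (1 = responder)
sens, spec = sens_spec([1, 1, 0, 0, 1], [1, 0, 0, 1, 1])
```

Sweeping a responder threshold and plotting sensitivity against 1 − specificity at each cut-off yields the ROC curve whose area is reported for each responder set.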

  8. Interdependencies and Causalities in Coupled Financial Networks.

    PubMed

    Vodenska, Irena; Aoyama, Hideaki; Fujiwara, Yoshi; Iyetomi, Hiroshi; Arai, Yuta

    2016-01-01

    We explore the foreign exchange and stock market networks for 48 countries from 1999 to 2012 and propose a model, based on complex Hilbert principal component analysis, for extracting significant lead-lag relationships between these markets. The global set of countries, including large and small countries in Europe, the Americas, Asia, and the Middle East, is contrasted with the limited scopes of targets, e.g., G5, G7 or the emerging Asian countries, adopted by previous works. We construct a coupled synchronization network, perform community analysis, and identify formation of four distinct network communities that are relatively stable over time. In addition to investigating the entire period, we divide the time period into "mild crisis" (1999-2002), "calm" (2003-2006) and "severe crisis" (2007-2012) sub-periods and find that the severe crisis period behavior dominates the dynamics in the foreign exchange-equity synchronization network. We observe that in general the foreign exchange market has predictive power for global stock market performances. In addition, the United States, German, and Mexican markets have forecasting power for the performances of other global equity markets.

  9. Improvement of research quality in the fields of orthopaedics and trauma: a global perspective.

    PubMed

    Fayaz, Hangama C; Haas, Norbert; Kellam, James; Bavonratanavech, Suthorn; Parvizi, Javad; Dyer, George; Pohlemann, Tim; Jerosch, Jörg; Prommersberger, Karl-Josef; Pape, Hans Christoph; Smith, Malcolm; Vrahas, Marc; Perka, Carsten; Siebenrock, Klaus; Elhassan, Bassem; Moran, Christopher; Jupiter, Jesse B

    2013-07-01

    The international orthopaedic community aims to achieve the best possible outcome for patient care by constantly modifying surgical techniques and expanding the surgeon's knowledge. These efforts require proper reflection within a setting that demands a higher quality standard for global orthopaedic publication. Furthermore, these techniques demand that surgeons acquire information at a rapid rate while enforcing higher standards in research performance. An international consensus exists on how to perform research and what rules should be considered when publishing a scientific paper. Despite this global agreement, in today's "Cross Check Era", too many authors pay no attention to the current standards of systematic research. Thus, the purpose of this paper is to describe these performance standards, the choices available to orthopaedic surgeons, and the current learning curve, drawing on seasoned teams of researchers and orthopaedic surgeons with more than three decades of experience. Together, these provide an accessible overview of all important aspects of the topics that will significantly influence research development as we arrive at an important era of globalisation in orthopaedics and trauma-related research.

  10. Quantum computation over the butterfly network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soeda, Akihito; Kinjo, Yoshiyuki; Turner, Peter S.

    2011-07-15

    In order to investigate distributed quantum computation under restricted network resources, we introduce a quantum computation task over the butterfly network where both quantum and classical communications are limited. We consider deterministically performing a two-qubit global unitary operation on two unknown inputs given at different nodes, with outputs at two distinct nodes. By using a particular resource setting introduced by M. Hayashi [Phys. Rev. A 76, 040301(R) (2007)], which is capable of performing a swap operation by adding two maximally entangled qubits (ebits) between the two input nodes, we show that unitary operations can be performed without adding any entanglement resource, if and only if the unitary operations are locally unitary equivalent to controlled unitary operations. Our protocol is optimal in the sense that the unitary operations cannot be implemented if we relax the specifications of any of the channels. We also construct protocols for performing controlled traceless unitary operations with a 1-ebit resource and for performing global Clifford operations with a 2-ebit resource.

  11. Improved Hydrology over Peatlands in a Global Land Modeling System

    NASA Technical Reports Server (NTRS)

    Bechtold, M.; Delannoy, G.; Reichle, R.; Koster, R.; Mahanama, S.; Roose, Dirk

    2018-01-01

    Peatlands of the Northern Hemisphere represent an important carbon pool that mainly accumulated since the last ice age under permanently wet conditions in specific geological and climatic settings. The carbon balance of peatlands is closely coupled to water table dynamics. Consequently, the future carbon balance over peatlands is strongly dependent on how hydrology in peatlands will react to changing boundary conditions, e.g. due to climate change or regional water level drawdown of connected aquifers or streams. Global land surface modeling over organic-rich regions can provide valuable global-scale insights on where and how peatlands are in transition due to changing boundary conditions. However, current global land surface models are not able to reproduce typical hydrological dynamics in peatlands well. We implemented specific structural and parametric changes to account for key hydrological characteristics of peatlands in NASA's GEOS-5 Catchment Land Surface Model (CLSM, Koster et al. 2000). The main modifications pertain to the modeling of partial inundation, and the definition of peatland-specific runoff and evapotranspiration schemes. We ran a set of simulations on a high-performance cluster using different CLSM configurations and validated the results with a newly compiled global in-situ dataset of water table depths in peatlands. The results demonstrate that an update of soil hydraulic properties for peat soils alone does not improve the performance of CLSM over peatlands. However, structural model changes for peatlands are able to improve the skill metrics for water table depth. The validation results for the water table depth indicate a reduction of the bias from 2.5 m to 0.2 m and an improvement of the temporal correlation coefficient from 0.5 to 0.65 (from 0.4 to 0.55 for the anomalies). Our validation data set includes both bogs (rain-fed) and fens (ground and/or surface water influence) and reveals that the metrics improved less for fens.
In addition, a comparison of evapotranspiration and soil moisture estimates over peatlands will be presented, albeit only with limited ground-based validation data. We will discuss strengths and weaknesses of the new model by focusing on time series of specific validation sites.

  12. Public health agenda setting in a global context: the International Labor Organization's decent work agenda.

    PubMed

    Di Ruggiero, Erica; Cohen, Joanna E; Cole, Donald C; Forman, Lisa

    2015-04-01

    We drew on two agenda-setting theories usually applied at the state or national level to assess their utility at the global level: Kingdon's multiple streams theory and Baumgartner and Jones's punctuated equilibrium theory. We illustrate our analysis with findings from a qualitative study of the International Labor Organization's Decent Work Agenda. We found that both theories help explain the agenda-setting mechanisms that operate in the global context, including how windows of opportunity open and what role institutions play as policy entrepreneurs. Future application of these theories could help characterize power struggles between global actors, whose voices are heard or silenced, and their impact on global policy agenda setting.

  13. Is inefficient cognitive processing in anorexia nervosa a familial trait? A neuropsychological pilot study of mothers of offspring with a diagnosis of anorexia nervosa.

    PubMed

    Lang, Katie; Treasure, Janet; Tchanturia, Kate

    2016-06-01

    Inefficient set shifting and poor global processing are thought to be possible traits in anorexia nervosa (AN). This study aimed to investigate the neuropsychological processing style of unaffected mothers of offspring with AN (unaffected AN mothers). The performance of 21 unaffected AN mothers was compared to that of 20 mothers of healthy control offspring on neuropsychological measures of set shifting (Wisconsin Card Sorting Test, WCST) and central coherence (Fragmented Pictures Task, FPT, and Rey Osterrieth Complex Figures Task, ROCFT). Associations between neuropsychological performance and clinical measures were examined in the unaffected AN mothers group. There were significant differences in perseverative errors on the WCST (P ≤ 0.01), with the unaffected mothers displaying a more inflexible thinking style compared to the control group. There were also significant differences on the FPT (P ≤ 0.01) and the ROCFT (P ≤ 0.01), whereby unaffected AN mothers showed lower levels of global processing. The results of this study support the idea of the familial nature of cognitive styles in AN. The implications of these findings are discussed.

  14. The Global Oscillation Network Group site survey. 1: Data collection and analysis methods

    NASA Technical Reports Server (NTRS)

    Hill, Frank; Fischer, George; Grier, Jennifer; Leibacher, John W.; Jones, Harrison B.; Jones, Patricia P.; Kupke, Renate; Stebbins, Robin T.

    1994-01-01

    The Global Oscillation Network Group (GONG) Project is planning to place a set of instruments around the world to observe solar oscillations as continuously as possible for at least three years. The Project has now chosen the sites that will comprise the network. This paper describes the methods of data collection and analysis that were used to make this decision. Solar irradiance data were collected with a one-minute cadence at fifteen sites around the world and analyzed to produce statistics of cloud cover, atmospheric extinction, and transparency power spectra at the individual sites. Nearly 200 reasonable six-site networks were assembled from the individual stations, and a set of statistical measures of the performance of the networks was analyzed using a principal component analysis. An accompanying paper presents the results of the survey.

  15. Global 21 cm Signal Extraction from Foreground and Instrumental Effects. I. Pattern Recognition Framework for Separation Using Training Sets

    NASA Astrophysics Data System (ADS)

    Tauscher, Keith; Rapetti, David; Burns, Jack O.; Switzer, Eric

    2018-02-01

    The sky-averaged (global) highly redshifted 21 cm spectrum from neutral hydrogen is expected to appear in the VHF range of ∼20–200 MHz and its spectral shape and strength are determined by the heating properties of the first stars and black holes, by the nature and duration of reionization, and by the presence or absence of exotic physics. Measurements of the global signal would therefore provide us with a wealth of astrophysical and cosmological knowledge. However, the signal has not yet been detected because it must be seen through strong foregrounds weighted by a large beam, instrumental calibration errors, and ionospheric, ground, and radio-frequency-interference effects, which we collectively refer to as “systematics.” Here, we present a signal extraction method for global signal experiments that uses Singular Value Decomposition of “training sets” to produce systematics basis functions specifically suited to each observation. Instead of requiring precise absolute knowledge of the systematics, our method effectively requires precise knowledge of how the systematics can vary. After calculating eigenmodes for the signal and systematics, we perform a weighted least-squares fit of the corresponding coefficients and select the number of modes to include by minimizing an information criterion. We compare the performance of the signal extraction when minimizing various information criteria and find that minimizing the Deviance Information Criterion most consistently yields unbiased fits. The methods used here are built into our widely applicable, publicly available Python package, pylinex, which analytically calculates constraints on signals and systematics from given data, errors, and training sets.
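The pipeline described above, SVD of a training set to obtain basis functions followed by a weighted least-squares fit of the coefficients, can be sketched in a few lines. This is a stand-alone illustration with a hypothetical power-law "foreground" training set; it is not pylinex's actual API:

```python
import numpy as np

rng = np.random.default_rng(1)
freq = np.linspace(20, 200, 100)          # MHz, the VHF band of interest

# Hypothetical training set: smooth power-law curves with a randomly
# perturbed spectral index (a stand-in for simulated systematics).
training = np.array([(freq / 100.0) ** -(2.5 + 0.1 * rng.standard_normal())
                     for _ in range(500)])
mean_curve = training.mean(axis=0)

# SVD of the centered training set yields systematics basis functions.
_, sing, vt = np.linalg.svd(training - mean_curve, full_matrices=False)
basis = vt[:3]                            # keep the 3 leading eigenmodes

# Weighted least-squares fit of an observed spectrum onto the basis
# (weights = inverse noise variance; uniform here for simplicity).
observed = (freq / 100.0) ** -2.5
weights = np.full(freq.size, 1e6)
design = (basis * weights).dot(basis.T)   # normal equations: B W B^T
rhs = (basis * weights).dot(observed - mean_curve)
coeffs = np.linalg.solve(design, rhs)
reconstruction = mean_curve + basis.T.dot(coeffs)
```

In the method of the paper, the number of retained modes is not fixed at 3 but chosen by minimizing an information criterion over fits with increasing numbers of modes.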

  16. Methane emissions from global wetlands: An assessment of the uncertainty associated with various wetland extent data sets

    NASA Astrophysics Data System (ADS)

    Zhang, Bowen; Tian, Hanqin; Lu, Chaoqun; Chen, Guangsheng; Pan, Shufen; Anderson, Christopher; Poulter, Benjamin

    2017-09-01

    A wide range of estimates of global wetland methane (CH4) fluxes has been reported over the past two decades. This gives rise to an urgent need to identify the sources of uncertainty and reach a reconciled estimate of global CH4 fluxes from wetlands. Most bottom-up estimates rely on wetland data sets, but these data sets are largely inconsistent in terms of both wetland extent and spatiotemporal distribution. A quantitative assessment of the uncertainties associated with these discrepancies among wetland data sets has not been well investigated yet. By comparing five widely used global wetland data sets (GISS, GLWD, Kaplan, GIEMS, and SWAMPS-GLWD) in this study, we found large differences in wetland extent, ranging from 5.3 to 10.2 million km2, as well as in the spatial and temporal distributions among the five data sets. These discrepancies in wetland data sets resulted in large biases in model-estimated global wetland CH4 emissions as simulated by the Dynamic Land Ecosystem Model (DLEM). The model simulations indicated that the mean global wetland CH4 emissions during 2000-2007 were 177.2 ± 49.7 Tg CH4 yr-1, based on the five different data sets. The tropical regions contributed the largest portion of estimated CH4 emissions from global wetlands, but also had the largest discrepancy. Among the six continents, the largest uncertainty was found in South America. Thus, improved estimates of wetland extent and CH4 emissions in the tropical regions and South America would be a critical step toward an accurate estimate of global CH4 emissions. This uncertainty analysis also reveals an important need for our scientific community to generate a global-scale wetland data set with higher spatial resolution and shorter time interval, by integrating multiple sources of field and satellite data with modeling approaches, for cross-scale extrapolation.

  17. An efficient global energy optimization approach for robust 3D plane segmentation of point clouds

    NASA Astrophysics Data System (ADS)

    Dong, Zhen; Yang, Bisheng; Hu, Pingbo; Scherer, Sebastian

    2018-03-01

    Automatic 3D plane segmentation is necessary for many applications including point cloud registration, building information model (BIM) reconstruction, simultaneous localization and mapping (SLAM), and point cloud compression. However, most existing 3D plane segmentation methods still suffer from low precision and recall, and from inaccurate and incomplete boundaries, especially for low-quality point clouds collected by RGB-D sensors. To overcome these challenges, this paper formulates the plane segmentation problem as a global energy optimization, because this formulation is robust to high levels of noise and clutter. First, the proposed method divides the raw point cloud into multiscale supervoxels, and treats planar supervoxels and individual points corresponding to nonplanar supervoxels as basic units. Then, an efficient hybrid region growing algorithm is utilized to generate an initial plane set by incrementally merging adjacent basic units with similar features. Next, the initial plane set is further enriched and refined in a mutually reinforcing manner under the framework of global energy optimization. Finally, the performance of the proposed method is evaluated with respect to six metrics (i.e., plane precision, plane recall, under-segmentation rate, over-segmentation rate, boundary precision, and boundary recall) on two benchmark datasets. Comprehensive experiments demonstrate that the proposed method achieves good performance both on high-quality TLS point clouds (i.e., http://SEMANTIC3D.NET) and on low-quality point clouds collected by RGB-D sensors.
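A basic primitive underlying both the planar-supervoxel features and the region-growing merges is a least-squares plane fit. One common approach (a sketch, not the paper's exact implementation) fits the plane via PCA, taking the normal as the direction of least variance:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane via PCA: returns (centroid, unit normal)."""
    centroid = points.mean(axis=0)
    # the normal is the right singular vector of least variance
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    return centroid, vt[-1]

# synthetic noisy samples of the plane z = 0.2x - 0.1y + 1
rng = np.random.default_rng(2)
xy = rng.uniform(-1.0, 1.0, size=(1000, 2))
z = 0.2 * xy[:, 0] - 0.1 * xy[:, 1] + 1.0 + rng.normal(0.0, 0.01, 1000)
points = np.column_stack([xy, z])
centroid, normal = fit_plane(points)
```

Region growing can then merge an adjacent unit into a plane when its normal deviates by less than an angle threshold and its points lie within a distance threshold of the fitted plane.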

  18. Low-cost photodynamic therapy devices for global health settings: Characterization of battery-powered LED performance and smartphone imaging in 3D tumor models

    PubMed Central

    Hempstead, Joshua; Jones, Dustin P.; Ziouche, Abdelali; Cramer, Gwendolyn M.; Rizvi, Imran; Arnason, Stephen; Hasan, Tayyaba; Celli, Jonathan P.

    2015-01-01

    A lack of access to effective cancer therapeutics in resource-limited settings is implicated in global cancer health disparities between developed and developing countries. Photodynamic therapy (PDT) is a light-based treatment modality that has exhibited safety and efficacy in the clinic using wavelengths and irradiances achievable with light-emitting diodes (LEDs) operated on battery power. Here we assess low-cost enabling technology to extend the clinical benefit of PDT to regions with little or no access to electricity or medical infrastructure. We demonstrate the efficacy of a device based on a 635 nm high-output LED powered by three AA disposable alkaline batteries, to achieve strong cytotoxic response in monolayer and 3D cultures of A431 squamous carcinoma cells following photosensitization by administering aminolevulinic acid (ALA) to induce the accumulation of protoporphyrin IX (PpIX). Here we characterize challenges of battery-operated device performance, including battery drain and voltage stability specifically over relevant PDT dose parameters. Further motivated by the well-established capacity of PDT photosensitizers to serve as tumour-selective fluorescence contrast agents, we demonstrate the capability of a consumer smartphone with low-cost add-ons to measure concentration-dependent PpIX fluorescence. This study lays the groundwork for the on-going development of image-guided ALA-PDT treatment technologies for global health applications. PMID:25965295

  19. Low-cost photodynamic therapy devices for global health settings: Characterization of battery-powered LED performance and smartphone imaging in 3D tumor models.

    PubMed

    Hempstead, Joshua; Jones, Dustin P; Ziouche, Abdelali; Cramer, Gwendolyn M; Rizvi, Imran; Arnason, Stephen; Hasan, Tayyaba; Celli, Jonathan P

    2015-05-12

    A lack of access to effective cancer therapeutics in resource-limited settings is implicated in global cancer health disparities between developed and developing countries. Photodynamic therapy (PDT) is a light-based treatment modality that has exhibited safety and efficacy in the clinic using wavelengths and irradiances achievable with light-emitting diodes (LEDs) operated on battery power. Here we assess low-cost enabling technology to extend the clinical benefit of PDT to regions with little or no access to electricity or medical infrastructure. We demonstrate the efficacy of a device based on a 635 nm high-output LED powered by three AA disposable alkaline batteries, to achieve strong cytotoxic response in monolayer and 3D cultures of A431 squamous carcinoma cells following photosensitization by administering aminolevulinic acid (ALA) to induce the accumulation of protoporphyrin IX (PpIX). Here we characterize challenges of battery-operated device performance, including battery drain and voltage stability specifically over relevant PDT dose parameters. Further motivated by the well-established capacity of PDT photosensitizers to serve as tumour-selective fluorescence contrast agents, we demonstrate the capability of a consumer smartphone with low-cost add-ons to measure concentration-dependent PpIX fluorescence. This study lays the groundwork for the on-going development of image-guided ALA-PDT treatment technologies for global health applications.

  20. Low-cost photodynamic therapy devices for global health settings: Characterization of battery-powered LED performance and smartphone imaging in 3D tumor models

    NASA Astrophysics Data System (ADS)

    Hempstead, Joshua; Jones, Dustin P.; Ziouche, Abdelali; Cramer, Gwendolyn M.; Rizvi, Imran; Arnason, Stephen; Hasan, Tayyaba; Celli, Jonathan P.

    2015-05-01

    A lack of access to effective cancer therapeutics in resource-limited settings is implicated in global cancer health disparities between developed and developing countries. Photodynamic therapy (PDT) is a light-based treatment modality that has exhibited safety and efficacy in the clinic using wavelengths and irradiances achievable with light-emitting diodes (LEDs) operated on battery power. Here we assess low-cost enabling technology to extend the clinical benefit of PDT to regions with little or no access to electricity or medical infrastructure. We demonstrate the efficacy of a device based on a 635 nm high-output LED powered by three AA disposable alkaline batteries, to achieve strong cytotoxic response in monolayer and 3D cultures of A431 squamous carcinoma cells following photosensitization by administering aminolevulinic acid (ALA) to induce the accumulation of protoporphyrin IX (PpIX). Here we characterize challenges of battery-operated device performance, including battery drain and voltage stability specifically over relevant PDT dose parameters. Further motivated by the well-established capacity of PDT photosensitizers to serve as tumour-selective fluorescence contrast agents, we demonstrate the capability of a consumer smartphone with low-cost add-ons to measure concentration-dependent PpIX fluorescence. This study lays the groundwork for the on-going development of image-guided ALA-PDT treatment technologies for global health applications.

  1. Spotting words in handwritten Arabic documents

    NASA Astrophysics Data System (ADS)

    Srihari, Sargur; Srinivasan, Harish; Babu, Pavithra; Bhole, Chetan

    2006-01-01

    The design and performance of a system for spotting handwritten Arabic words in scanned document images is presented. The three main components of the system are a word segmenter, a shape-based matcher for words, and a search interface. The user types a query in English within a search window; the system finds the equivalent Arabic word (e.g., by dictionary look-up) and locates word images in an indexed (segmented) set of documents. A two-step approach is employed in performing the search: (1) prototype selection: the query is used to obtain a set of handwritten samples of that word from a known set of writers (these are the prototypes), and (2) word matching: the prototypes are used to spot each occurrence of those words in the indexed document database. A ranking is performed on the entire set of test word images, where the ranking criterion is a similarity score between each prototype word and the candidate words based on global word shape features. A database of 20,000 word images contained in 100 scanned handwritten Arabic documents written by 10 different writers was used to study retrieval performance. With five writers providing prototypes and the other five used for testing, on manually segmented documents, 55% precision is obtained at 50% recall. Performance increases as more writers are used for training.

  2. Public Health Agenda Setting in a Global Context: The International Labor Organization’s Decent Work Agenda

    PubMed Central

    Cohen, Joanna E.; Cole, Donald C.; Forman, Lisa

    2015-01-01

    We drew on two agenda-setting theories usually applied at the state or national level to assess their utility at the global level: Kingdon’s multiple streams theory and Baumgartner and Jones’s punctuated equilibrium theory. We illustrate our analysis with findings from a qualitative study of the International Labor Organization’s Decent Work Agenda. We found that both theories help explain the agenda-setting mechanisms that operate in the global context, including how windows of opportunity open and what role institutions play as policy entrepreneurs. Future application of these theories could help characterize power struggles between global actors, whose voices are heard or silenced, and their impact on global policy agenda setting. PMID:25713966

  3. Financial Capacity Following Traumatic Brain Injury: A Six-Month Longitudinal Study

    PubMed Central

    Dreer, Laura E.; DeVivo, Michael J.; Novack, Thomas A.; Marson, Daniel C.

    2015-01-01

    Objective To longitudinally investigate financial capacity (FC) following traumatic brain injury (TBI). Design Longitudinal study comparing FC in cognitively healthy adults and persons with moderate to severe TBI at time of acute hospitalization (Time 1) and at six months post injury (Time 2). Setting Inpatient brain injury rehabilitation unit. Participants Twenty healthy adult controls and 24 adult persons with moderate to severe TBI. Main Outcome Measures Participants were administered the Financial Capacity Instrument (FCI-9), a standardized instrument that measures performance on eighteen financial tasks, nine domains, and two global scores. Between- and within-group differences were examined for each FCI-9 domain and global score. Using control group referenced cut scores, participants with TBI were also assigned an impairment rating (intact, marginal, or impaired) on each domain and global score. Results At Time 1, participants with TBI performed significantly below controls on the majority of financial variables tested. At Time 2, participants with TBI demonstrated within-group improvement on both simple and complex financial skills, but continued to perform below adult controls on complex financial skills and both global scores. Group by time interactions were significant for five domains and both global scores. At Time 1, high percentages of participants with TBI were assigned either ‘marginal’ or ‘impaired’ ratings on the domains and global scores, with significant percentage increases of ‘intact’ ratings at Time 2. Conclusions Immediately following acute injury, persons with moderate to severe TBI show global impairment of FC. Findings indicate improvement of both simple and complex financial skills over a six-month period, but continued impairment on more complex financial skills. Future studies should examine loss and recovery of FC following TBI over longer time periods and a wider range of injury severity. PMID:22369113

  4. General, crystallized and fluid intelligence are not associated with functional global network efficiency: A replication study with the human connectome project 1200 data set.

    PubMed

    Kruschwitz, J D; Waller, L; Daedelow, L S; Walter, H; Veer, I M

    2018-05-01

    One hallmark example of a link between global topological network properties of complex functional brain connectivity and cognitive performance is the finding that general intelligence may depend on the efficiency of the brain's intrinsic functional network architecture. However, although this association has been featured prominently over the course of the last decade, the empirical basis for this broad association of general intelligence and global functional network efficiency is quite limited. In the current study, we set out to replicate the previously reported association between general intelligence and global functional network efficiency using the large sample size and high quality data of the Human Connectome Project, and extended the original study by testing for separate associations of crystallized and fluid intelligence with global efficiency, characteristic path length, and global clustering coefficient. We were unable to provide evidence for the proposed association between general intelligence and functional brain network efficiency, as was demonstrated by van den Heuvel et al. (2009), or for any other association with the global network measures employed. More specifically, across multiple network definition schemes, ranging from voxel-level networks to networks of only 100 nodes, no robust associations and only very weak non-significant effects with a maximal R² of 0.01 could be observed. Notably, the strongest (non-significant) effects were observed in voxel-level networks. We discuss the possibility that the low power of previous studies and publication bias may have led to false positive results, fostering the widely accepted notion that general intelligence is associated with global functional network efficiency. Copyright © 2018 Elsevier Inc. All rights reserved.
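Global efficiency, the central topological measure in this study, is defined as the mean inverse shortest-path length over all node pairs. A minimal sketch for an unweighted graph (illustrative only; connectome analyses typically use weighted networks):

```python
from collections import deque

def global_efficiency(adj):
    """Mean inverse shortest-path length over all ordered node pairs.

    adj maps each node to a list of its neighbours (unweighted graph);
    unreachable pairs contribute zero efficiency.
    """
    n = len(adj)
    total = 0.0
    for src in adj:
        dist = {src: 0}
        queue = deque([src])
        while queue:                      # breadth-first search from src
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(1.0 / d for node, d in dist.items() if node != src)
    return total / (n * (n - 1))

# 4-node ring: each node has two neighbours at distance 1, one at distance 2
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
```

For the ring, each node's pair efficiencies are 1, 1, and 1/2, so the global efficiency is (1 + 1 + 1/2)/3 = 5/6.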

  5. Accuracy assessment of the global TanDEM-X Digital Elevation Model with GPS data

    NASA Astrophysics Data System (ADS)

    Wessel, Birgit; Huber, Martin; Wohlfart, Christian; Marschalk, Ursula; Kosmann, Detlev; Roth, Achim

    2018-05-01

    The primary goal of the German TanDEM-X mission is the generation of a highly accurate and global Digital Elevation Model (DEM) with global accuracies of at least 10 m absolute height error (linear 90% error). The global TanDEM-X DEM acquired with single-pass SAR interferometry was finished in September 2016. This paper provides a unique accuracy assessment of the final TanDEM-X global DEM using two different GPS point reference data sets, which are distributed across all continents, to fully characterize the absolute height error. Firstly, the absolute vertical accuracy is examined by about three million globally distributed kinematic GPS (KGPS) points derived from 19 KGPS tracks covering a total length of about 66,000 km. Secondly, a comparison is performed with more than 23,000 "GPS on Bench Marks" (GPS-on-BM) points provided by the US National Geodetic Survey (NGS), scattered across 14 different land cover types of the US National Land Cover Database (NLCD). Both GPS comparisons show an absolute vertical mean error of the TanDEM-X DEM smaller than ±0.20 m, a Root Mean Square Error (RMSE) smaller than 1.4 m and an excellent absolute 90% linear height error below 2 m. The RMSE values are sensitive to land cover types. For low vegetation the RMSE is ±1.1 m, whereas it is slightly higher for developed areas (±1.4 m) and for forests (±1.8 m). This validation confirms an outstanding absolute height error at the 90% confidence level of the global TanDEM-X DEM, outperforming the requirement by a factor of five. Due to its extensive and globally distributed reference data sets, this study is of considerable interest for scientific and commercial applications.
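The error statistics used in such validations (mean error, RMSE, and the 90% linear error, commonly taken as the 90th percentile of the absolute height differences) can be computed directly from DEM-minus-GPS differences. A minimal sketch with hypothetical check-point heights, not the study's data:

```python
import numpy as np

def height_error_stats(dem_heights, gps_heights):
    """Mean error, RMSE, and 90% linear error (LE90) of DEM minus GPS."""
    diff = np.asarray(dem_heights, dtype=float) - np.asarray(gps_heights, dtype=float)
    mean_error = diff.mean()
    rmse = np.sqrt((diff ** 2).mean())
    le90 = np.percentile(np.abs(diff), 90)   # 90th percentile of |errors|
    return mean_error, rmse, le90

# hypothetical check-point heights (metres)
mean_err, rmse, le90 = height_error_stats([2.0, 0.0, 3.0, -1.0, 1.0],
                                          [1.0, 1.0, 1.0, 1.0, 1.0])
```

Stratifying the difference array by land cover class (as done here with the NLCD types) simply means applying the same statistics to each class's subset of check points.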

  6. Visiting Trainees in Global Settings: Host and Partner Perspectives on Desirable Competencies.

    PubMed

    Cherniak, William; Latham, Emily; Astle, Barbara; Anguyo, Geoffrey; Beaunoir, Tessa; Buenaventura, Joel; DeCamp, Matthew; Diaz, Karla; Eichbaum, Quentin; Hedimbi, Marius; Myser, Cat; Nwobu, Charles; Standish, Katherine; Evert, Jessica

    Current competencies in global health education largely reflect perspectives from high-income countries (HICs). Consequently, there has been underrepresentation of the voices and perspectives of partners in low- and middle-income countries (LMICs) who supervise and mentor trainees engaged in short-term experiences in global health (STEGH). The objective of this study was to better understand the competencies and learning objectives that are considered a priority from the perspective of partners in LMICs. A review of current interprofessional global health competencies was performed to design a web-based survey instrument in English and Spanish. Survey data were collected from a global convenience sample. Data underwent descriptive statistical analysis and logistic regression. The survey was completed by 170 individuals; 132 in English and 38 in Spanish. More than 85% of respondents rated cultural awareness and respectful conduct while on a STEGH as important. None of the respondents said trainees arrive as independent practitioners to fill health care gaps. Of 109 respondents, 65 (60%) reported that trainees gaining fluency in the local language was not important. This study found different levels of agreement between partners across economic regions of the world when compared with existing global health competencies. By gaining insight into host partners' perceptions of desired competencies, global health education programs in LMICs can be more collaboratively and ethically designed to meet the priorities, needs, and expectations of those stakeholders. This study begins to shift the paradigm of global health education program design by encouraging North-South/East-West shared agenda setting, mutual respect, empowerment, and true collaboration. Copyright © 2017 Icahn School of Medicine at Mount Sinai. Published by Elsevier Inc. All rights reserved.

  7. Cost-effective priorities for global mammal conservation.

    PubMed

    Carwardine, Josie; Wilson, Kerrie A; Ceballos, Gerardo; Ehrlich, Paul R; Naidoo, Robin; Iwamura, Takuya; Hajkowicz, Stefan A; Possingham, Hugh P

    2008-08-12

    Global biodiversity priority setting underpins the strategic allocation of conservation funds. In identifying the first comprehensive set of global priority areas for mammals, Ceballos et al. [Ceballos G, Ehrlich PR, Soberón J, Salazar I, Fay JP (2005) Science 309:603-607] found much potential for conflict between conservation and agricultural human activity. This is not surprising because, like other global priority-setting approaches, they set priorities without socioeconomic objectives. Here we present a priority-setting framework that seeks to minimize the conflicts and opportunity costs of meeting conservation goals. We use it to derive a new set of priority areas for investment in mammal conservation based on (i) agricultural opportunity cost and biodiversity importance, (ii) current levels of international funding, and (iii) degree of threat. Our approach achieves the same biodiversity outcomes as Ceballos et al.'s while reducing the opportunity costs and conflicts with agricultural human activity by up to 50%. We uncover shortfalls in the allocation of conservation funds in many threatened priority areas, highlighting a global conservation challenge.
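
    Priority setting that weighs biodiversity benefit against agricultural opportunity cost can be illustrated with a generic benefit-to-cost greedy heuristic; this is a hypothetical stand-in, not the authors' actual framework:

```python
def prioritize(areas, budget):
    """Greedy cost-effectiveness ranking: pick areas with the highest
    biodiversity benefit per unit opportunity cost until the budget runs out."""
    ranked = sorted(areas, key=lambda a: a["benefit"] / a["cost"], reverse=True)
    chosen, spent = [], 0.0
    for a in ranked:
        if spent + a["cost"] <= budget:
            chosen.append(a["name"])
            spent += a["cost"]
    return chosen

# Hypothetical candidate areas: benefit in species-weighted units,
# cost as forgone agricultural value.
areas = [
    {"name": "A", "benefit": 10.0, "cost": 2.0},   # ratio 5.0
    {"name": "B", "benefit": 9.0,  "cost": 9.0},   # ratio 1.0
    {"name": "C", "benefit": 4.0,  "cost": 1.0},   # ratio 4.0
]
picked = prioritize(areas, budget=4.0)
```

    The abstract's point is exactly this trade-off: ranking by benefit alone would favour area B, while accounting for opportunity cost shifts investment toward areas with the same biodiversity outcome at lower conflict.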

  8. A Framework For Analysis Of Coastal Infrastructure Vulnerability To Global Sea Level Rise

    NASA Astrophysics Data System (ADS)

    Obrien, P. S.; White, K. D.; Veatch, W.; Marzion, R.; Moritz, H.; Moritz, H. R.

    2017-12-01

    Recorded impacts of global sea level rise on coastal water levels have been documented over the past 100 to 150 years. In the past 40 years the assumption of hydrologic stationarity has been recognized as invalid. New coastal infrastructure designs must recognize the paradigm shift from hydrologic stationarity to non-stationarity in coastal hydrology. A framework for the evaluation of existing coastal infrastructure is proposed to effectively assess design vulnerability. Two data sets developed from existing structures are chosen to test a proposed framework for vulnerability to global sea level rise, with the proposed name Climate Preparedness and Resilience Register (CPRR). The CPRR framework consists of four major elements: Datum Adjustment, Coastal Water Levels, Scenario Projections, and Performance Thresholds.

  9. Variations in Global Precipitation: Climate-scale to Floods

    NASA Technical Reports Server (NTRS)

    Adler, Robert

    2006-01-01

    Variations in global precipitation from climate-scale to small scale are examined using satellite-based analyses of the Global Precipitation Climatology Project (GPCP) and information from the Tropical Rainfall Measuring Mission (TRMM). Global and large regional rainfall variations and possible long-term changes are examined using the 27-year (1979-2005) monthly dataset from the GPCP. In addition to global patterns associated with phenomena such as ENSO, the data set is explored for evidence of long-term change. Although the global change of precipitation in the data set is near zero, the data set does indicate a small upward trend in the Tropics (25S-25N), especially over ocean. Techniques are derived to isolate and eliminate variations due to ENSO and major volcanic eruptions, and the significance of the trend is examined. The status of TRMM estimates is examined in terms of evaluating and improving the long-term global data set. To look at rainfall variations on a much smaller scale, TRMM data are used in combination with observations from other satellites to produce a 3-hr resolution, eight-year data set for examination of weather events and for practical applications such as detecting floods. Characteristics of the data set are presented and examples of recent flood events are examined.
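
    A trend of the kind described (a small upward tendency in a long monthly series) is typically estimated by ordinary least squares against time. A self-contained sketch on synthetic anomalies, not GPCP data:

```python
import math

def linear_trend(y):
    """Ordinary least-squares slope of y against the time index 0..n-1."""
    n = len(y)
    t_mean = (n - 1) / 2
    y_mean = sum(y) / n
    num = sum((t - t_mean) * (v - y_mean) for t, v in enumerate(y))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

# Hypothetical 27 years of monthly anomalies: a small upward trend
# of 0.002 units/month plus an annual cycle.
y = [0.002 * t + 0.5 * math.sin(2 * math.pi * t / 12) for t in range(324)]
slope = linear_trend(y)
```

    Removing ENSO and volcanic signals before fitting, as the abstract describes, amounts to subtracting regressions on those index series first so the residual slope reflects the underlying trend.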

  10. Assimilation of global versus local data sets into a regional model of the Gulf Stream system. 1. Data effectiveness

    NASA Astrophysics Data System (ADS)

    Malanotte-Rizzoli, Paola; Young, Roberta E.

    1995-12-01

    The primary objective of this paper is to assess the relative effectiveness of data sets with different space coverage and time resolution when they are assimilated into an ocean circulation model. We focus on obtaining realistic numerical simulations of the Gulf Stream system typically of the order of 3-month duration by constructing a "synthetic" ocean simultaneously consistent with the model dynamics and the observations. The model used is the Semispectral Primitive Equation Model. The data sets are the "global" Optimal Thermal Interpolation Scheme (OTIS) 3 of the Fleet Numerical Oceanography Center providing temperature and salinity fields with global coverage and with bi-weekly frequency, and the localized measurements, mostly of current velocities, from the central and eastern array moorings of the Synoptic Ocean Prediction (SYNOP) program, with daily frequency but with a very small spatial coverage. We use a suboptimal assimilation technique ("nudging"). Even though this technique has already been used in idealized data assimilation studies, to our knowledge this is the first study in which the effectiveness of nudging is tested by assimilating real observations of the interior temperature and salinity fields. This is also the first work in which a systematic assimilation is carried out of the localized, high-quality SYNOP data sets in numerical experiments longer than 1-2 weeks, that is, not aimed at forecasting. We assimilate (1) the global OTIS 3 alone, (2) the local SYNOP observations alone, and (3) both OTIS 3 and SYNOP observations. We assess the success of the assimilations with quantitative measures of performance, both on the global and local scale. The results can be summarized as follows. The intermittent assimilation of the global OTIS 3 is necessary to keep the model "on track" over 3-month simulations on the global scale. 
As OTIS 3 is assimilated at every model grid point, a "gentle" weight must be prescribed to it so as not to overconstrain the model. However, in these assimilations the predicted velocity fields over the SYNOP arrays are greatly in error. The continuous assimilation of the localized SYNOP data sets with a strong weight is necessary to obtain local realistic evolutions. Then assimilation of velocity measurements alone recovers the density structure over the array area. However, the spatial coverage of the SYNOP measurements is too small to constrain the model on the global scale. Thus the blending of both types of data sets is necessary in the assimilation as they constrain different time and space scales. Our choice of "gentle" nudging weight for the global OTIS 3 and "strong" weight for the local SYNOP data provides for realistic simulations of the Gulf Stream system, both globally and locally, on the 3- to 4-month-long timescale, the one governed by the Gulf Stream jet internal dynamics.
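
    Nudging, as used above, adds a relaxation term that pulls the model state toward observations, with the weight controlling how strongly. A toy scalar illustration under assumed linear dynamics and hypothetical weights (nothing like the full primitive-equation model):

```python
# Toy nudging: the model tendency gets an extra relaxation term k*(obs - x)
# pulling the state toward an observation. A "gentle" weight (small k)
# constrains the model loosely; a "strong" weight tracks the data closely.
def integrate(x0, obs, k, steps=200, dt=0.1):
    """Euler integration of dx/dt = -0.5*x + k*(obs - x) from x0 (toy dynamics)."""
    x = x0
    for _ in range(steps):
        x += dt * (-0.5 * x + k * (obs - x))
    return x

gentle = integrate(x0=10.0, obs=2.0, k=0.1)   # weak pull toward obs
strong = integrate(x0=10.0, obs=2.0, k=5.0)   # strong pull toward obs
```

    The steady state is k*obs/(0.5+k): the gentle run settles far from the observation while the strong run sits near it, which is the trade-off the paper tunes separately for the global OTIS 3 fields and the local SYNOP arrays.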

  11. On the accuracy of density-functional theory exchange-correlation functionals for H bonds in small water clusters: Benchmarks approaching the complete basis set limit

    NASA Astrophysics Data System (ADS)

    Santra, Biswajit; Michaelides, Angelos; Scheffler, Matthias

    2007-11-01

    The ability of several density-functional theory (DFT) exchange-correlation functionals to describe hydrogen bonds in small water clusters (dimer to pentamer) in their global minimum energy structures is evaluated with reference to second order Møller-Plesset perturbation theory (MP2). Errors from basis set incompleteness have been minimized in both the MP2 reference data and the DFT calculations, thus enabling a consistent systematic evaluation of the true performance of the tested functionals. Among all the functionals considered, the hybrid X3LYP and PBE0 functionals offer the best performance and among the nonhybrid generalized gradient approximation functionals, mPWLYP and PBE1W perform best. The popular BLYP and B3LYP functionals consistently underbind and PBE and PW91 display rather variable performance with cluster size.

  12. On the accuracy of density-functional theory exchange-correlation functionals for H bonds in small water clusters: benchmarks approaching the complete basis set limit.

    PubMed

    Santra, Biswajit; Michaelides, Angelos; Scheffler, Matthias

    2007-11-14

    The ability of several density-functional theory (DFT) exchange-correlation functionals to describe hydrogen bonds in small water clusters (dimer to pentamer) in their global minimum energy structures is evaluated with reference to second order Møller-Plesset perturbation theory (MP2). Errors from basis set incompleteness have been minimized in both the MP2 reference data and the DFT calculations, thus enabling a consistent systematic evaluation of the true performance of the tested functionals. Among all the functionals considered, the hybrid X3LYP and PBE0 functionals offer the best performance and among the nonhybrid generalized gradient approximation functionals, mPWLYP and PBE1W perform best. The popular BLYP and B3LYP functionals consistently underbind and PBE and PW91 display rather variable performance with cluster size.

  13. Global land cover mapping and characterization: present situation and future research priorities

    USGS Publications Warehouse

    Giri, Chandra

    2005-01-01

    The availability and accessibility of global land cover data sets play an important role in many global change studies. The importance of such science‐based information is also reflected in a number of international, regional, and national projects and programs. Recent developments in earth observing satellite technology, information technology, computer hardware and software, and infrastructure development have helped produce better-quality land cover data sets. As a result, such data sets are increasingly becoming available, the user base is ever widening, application areas have been expanding, and the potential of many other applications is enormous. Yet, we are far from producing high-quality global land cover data sets. This paper examines the progress in the development of digital global land cover data, their availability, and current applications. Problems and opportunities are also explained. The overview sets the stage for identifying future research priorities needed for operational land cover assessment and monitoring.

  14. A Novel Hybrid Firefly Algorithm for Global Optimization.

    PubMed

    Zhang, Lina; Liu, Liqiang; Yang, Xin-She; Dai, Yuntao

    Global optimization is challenging to solve due to its nonlinearity and multimodality. Traditional algorithms such as the gradient-based methods often struggle to deal with such problems and one of the current trends is to use metaheuristic algorithms. In this paper, a novel hybrid population-based global optimization algorithm, called hybrid firefly algorithm (HFA), is proposed by combining the advantages of both the firefly algorithm (FA) and differential evolution (DE). FA and DE are executed in parallel to promote information sharing among the population and thus enhance searching efficiency. In order to evaluate the performance and efficiency of the proposed algorithm, a diverse set of selected benchmark functions are employed and these functions fall into two groups: unimodal and multimodal. The experimental results show better performance of the proposed algorithm compared to the original version of the firefly algorithm (FA), differential evolution (DE) and particle swarm optimization (PSO) in the sense of avoiding local minima and increasing the convergence rate.
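
    The parallel FA/DE scheme described here can be sketched as two subpopulations that evolve independently and exchange their best members each generation. This is an illustrative simplification (standard firefly moves and DE/rand/1/bin on a sphere benchmark), not the authors' exact HFA:

```python
import math
import random

random.seed(1)

def sphere(x):                       # unimodal benchmark function
    return sum(v * v for v in x)

def clamp(x, lo=-5.0, hi=5.0):
    return [min(hi, max(lo, v)) for v in x]

def fa_step(pop, fit, alpha=0.2, beta0=1.0, gamma=1.0):
    """One firefly move: each firefly drifts toward every brighter (lower-cost) one."""
    new = []
    for i, xi in enumerate(pop):
        x = list(xi)
        for j, xj in enumerate(pop):
            if fit[j] < fit[i]:
                r2 = sum((a - b) ** 2 for a, b in zip(x, xj))
                beta = beta0 * math.exp(-gamma * r2)
                x = [a + beta * (b - a) + alpha * (random.random() - 0.5)
                     for a, b in zip(x, xj)]
        new.append(clamp(x))
    return new

def de_step(pop, fit, f=0.5, cr=0.9):
    """One DE/rand/1/bin generation with greedy selection."""
    n, dim = len(pop), len(pop[0])
    new = []
    for i in range(n):
        r1, r2, r3 = random.sample([k for k in range(n) if k != i], 3)
        mutant = [pop[r1][d] + f * (pop[r2][d] - pop[r3][d]) for d in range(dim)]
        jrand = random.randrange(dim)
        trial = clamp([mutant[d] if (random.random() < cr or d == jrand) else pop[i][d]
                       for d in range(dim)])
        new.append(trial if sphere(trial) < fit[i] else pop[i])
    return new

def hfa(dim=5, n=20, gens=100):
    pop_fa = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    pop_de = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    for _ in range(gens):
        pop_fa = fa_step(pop_fa, [sphere(x) for x in pop_fa])
        pop_de = de_step(pop_de, [sphere(x) for x in pop_de])
        # information sharing: swap each group's best into the other group.
        best_fa = min(pop_fa, key=sphere)
        best_de = min(pop_de, key=sphere)
        pop_fa[0], pop_de[0] = list(best_de), list(best_fa)
    return min(sphere(x) for x in pop_fa + pop_de)

best = hfa()
```

    On this easy unimodal function the DE half alone would suffice; the information sharing matters most on multimodal landscapes, where the stochastic firefly moves help the combined population escape local minima.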

  15. A Novel Hybrid Firefly Algorithm for Global Optimization

    PubMed Central

    Zhang, Lina; Liu, Liqiang; Yang, Xin-She; Dai, Yuntao

    2016-01-01

    Global optimization is challenging to solve due to its nonlinearity and multimodality. Traditional algorithms such as the gradient-based methods often struggle to deal with such problems and one of the current trends is to use metaheuristic algorithms. In this paper, a novel hybrid population-based global optimization algorithm, called hybrid firefly algorithm (HFA), is proposed by combining the advantages of both the firefly algorithm (FA) and differential evolution (DE). FA and DE are executed in parallel to promote information sharing among the population and thus enhance searching efficiency. In order to evaluate the performance and efficiency of the proposed algorithm, a diverse set of selected benchmark functions are employed and these functions fall into two groups: unimodal and multimodal. The experimental results show better performance of the proposed algorithm compared to the original version of the firefly algorithm (FA), differential evolution (DE) and particle swarm optimization (PSO) in the sense of avoiding local minima and increasing the convergence rate. PMID:27685869

  16. An Improved Quantum-Behaved Particle Swarm Optimization Algorithm with Elitist Breeding for Unconstrained Optimization.

    PubMed

    Yang, Zhen-Lun; Wu, Angus; Min, Hua-Qing

    2015-01-01

    An improved quantum-behaved particle swarm optimization with elitist breeding (EB-QPSO) for unconstrained optimization is presented and empirically studied in this paper. In EB-QPSO, the novel elitist breeding strategy acts on the elitists of the swarm to escape from likely local optima and guide the swarm to perform a more efficient search. During the iterative optimization process of EB-QPSO, when the criteria are met, the personal best of each particle and the global best of the swarm are used to generate new diverse individuals through the transposon operators. The newly generated individuals with better fitness are selected to be the new personal best particles and global best particle to guide the swarm for further solution exploration. A comprehensive simulation study is conducted on a set of twelve benchmark functions. Compared with five state-of-the-art quantum-behaved particle swarm optimization algorithms, the proposed EB-QPSO performs more competitively on all of the benchmark functions in terms of better global search capability and faster convergence rate.

  17. Bifurcation analysis of eight coupled degenerate optical parametric oscillators

    NASA Astrophysics Data System (ADS)

    Ito, Daisuke; Ueta, Tetsushi; Aihara, Kazuyuki

    2018-06-01

    A degenerate optical parametric oscillator (DOPO) network realized as a coherent Ising machine can be used to solve combinatorial optimization problems. Both theoretical and experimental investigations into the performance of DOPO networks have been presented previously. However, a problem remains: the dynamics of the DOPO network itself can lower the search success rates of globally optimal solutions for Ising problems. This paper shows that the problem is caused by pitchfork bifurcations due to the symmetry structure of coupled DOPOs. Two-parameter bifurcation diagrams of equilibrium points express the performance deterioration. It is shown that the emergence of non-ground states corresponding to local minima hampers the system from reaching the ground states corresponding to the global minimum. We then describe a parametric strategy for leading the system to the ground state by actively utilizing the bifurcation phenomena. By adjusting the parameters to break particular symmetries, we find appropriate parameter sets that allow the coherent Ising machine to obtain the globally optimal solution alone.

  18. Support and performance improvement for primary health care workers in low- and middle-income countries: a scoping review of intervention design and methods.

    PubMed

    Vasan, Ashwin; Mabey, David C; Chaudhri, Simran; Brown Epstein, Helen-Ann; Lawn, Stephen D

    2017-04-01

    Primary health care workers (HCWs) in low- and middle-income settings (LMIC) often work in challenging conditions in remote, rural areas, in isolation from the rest of the health system and particularly specialist care. Much attention has been given to implementation of interventions to support quality and performance improvement for workers in such settings. However, little is known about the design of such initiatives and which approaches predominate, let alone those that are most effective. We aimed for a broad understanding of what distinguishes different approaches to primary HCW support and performance improvement and to clarify the existing evidence as well as gaps in evidence in order to inform decision-making and design of programs intended to support and improve the performance of health workers in these settings. We systematically searched the literature for articles addressing this topic, and undertook a comparative review to document the principal approaches to performance and quality improvement for primary HCWs in LMIC settings. We identified 40 eligible papers reporting on interventions that we categorized into five different approaches: (1) supervision and supportive supervision; (2) mentoring; (3) tools and aids; (4) quality improvement methods, and (5) coaching. The variety of study designs and quality/performance indicators precluded a formal quantitative data synthesis. The most extensive literature was on supervision, but there was little clarity on what defines the most effective approach to the supervision activities themselves, let alone the design and implementation of supervision programs. The mentoring literature was limited, and largely focused on clinical skills building and educational strategies. Further research on how best to incorporate mentorship into pre-service clinical training, while maintaining its function within the routine health system, is needed. 
There is insufficient evidence to draw conclusions about coaching in this setting; however, a review of the corporate and business school literature is warranted to identify transferable approaches. A substantial literature exists on tools, but significant variation in approaches makes comparison challenging. We found examples of effective individual projects and designs in specific settings, but there was a lack of comparative research on tools across approaches or across settings, and no systematic analysis within specific approaches to provide evidence with clear generalizability. Future research should prioritize comparative intervention trials to establish clear global standards for performance and quality improvement initiatives. Such standards will be critical to creating and sustaining a well-functioning health workforce and for global initiatives such as universal health coverage. © The Author 2016. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.

  19. Big Data and High-Performance Computing in Global Seismology

    NASA Astrophysics Data System (ADS)

    Bozdag, Ebru; Lefebvre, Matthieu; Lei, Wenjie; Peter, Daniel; Smith, James; Komatitsch, Dimitri; Tromp, Jeroen

    2014-05-01

    Much of our knowledge of Earth's interior is based on seismic observations and measurements. Adjoint methods provide an efficient way of incorporating 3D full wave propagation in iterative seismic inversions to enhance tomographic images and thus our understanding of processes taking place inside the Earth. Our aim is to take adjoint tomography, which has been successfully applied to regional and continental scale problems, further to image the entire planet. This is one of the extreme imaging challenges in seismology, mainly due to the intense computational requirements and vast amount of high-quality seismic data that can potentially be assimilated. We have started low-resolution inversions (T > 30 s and T > 60 s for body and surface waves, respectively) with a limited data set (253 carefully selected earthquakes and seismic data from permanent and temporary networks) on Oak Ridge National Laboratory's Cray XK7 "Titan" system. Recent improvements in our 3D global wave propagation solvers, such as a GPU version of the SPECFEM3D_GLOBE package, will enable us to perform higher-resolution (T > 9 s) and longer duration (~180 m) simulations to take advantage of high-frequency body waves and major-arc surface waves, thereby improving imbalanced ray coverage resulting from the uneven global distribution of sources and receivers. Our ultimate goal is to use all earthquakes in the global CMT catalogue within the magnitude range of our interest and data from all available seismic networks. To take full advantage of computational resources, we need a solid framework to manage big data sets during numerical simulations, pre-processing (i.e., data requests and quality checks, processing data, window selection, etc.) and post-processing (i.e., pre-conditioning and smoothing kernels, etc.). 
We address the bottlenecks in our global seismic workflow, which are mainly coming from heavy I/O traffic during simulations and the pre- and post-processing stages, by defining new data formats for seismograms and outputs of our 3D solvers (i.e., meshes, kernels, seismic models, etc.) based on ORNL's ADIOS libraries. We will discuss our global adjoint tomography workflow on HPC systems as well as the current status of our global inversions.

  20. Assessing deep and shallow learning methods for quantitative prediction of acute chemical toxicity.

    PubMed

    Liu, Ruifeng; Madore, Michael; Glover, Kyle P; Feasel, Michael G; Wallqvist, Anders

    2018-05-02

    Animal-based methods for assessing chemical toxicity are struggling to meet testing demands. In silico approaches, including machine-learning methods, are promising alternatives. Recently, deep neural networks (DNNs) were evaluated and reported to outperform other machine-learning methods for quantitative structure-activity relationship modeling of molecular properties. However, most of the reported performance evaluations relied on global performance metrics, such as the root mean squared error (RMSE) between the predicted and experimental values of all samples, without considering the impact of sample distribution across the activity spectrum. Here, we carried out an in-depth analysis of DNN performance for quantitative prediction of acute chemical toxicity using several datasets. We found that the overall performance of DNN models on datasets of up to 30,000 compounds was similar to that of random forest (RF) models, as measured by the RMSE and correlation coefficients between the predicted and experimental results. However, our detailed analyses demonstrated that global performance metrics are inappropriate for datasets with a highly uneven sample distribution, because they show a strong bias for the most populous compounds along the toxicity spectrum. For highly toxic compounds, DNN and RF models trained on all samples performed much worse than the global performance metrics indicated. Surprisingly, our variable nearest neighbor method, which utilizes only structurally similar compounds to make predictions, performed reasonably well, suggesting that information of close near neighbors in the training sets is a key determinant of acute toxicity predictions.
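
    The paper's point about global metrics masking errors on sparsely populated parts of the activity range can be illustrated with a hypothetical, deliberately skewed data set:

```python
import math

def rmse(pred, true):
    return math.sqrt(sum((p - t) ** 2 for p, t in zip(pred, true)) / len(true))

# Hypothetical toxicity values: 90 low-toxicity compounds predicted well,
# 10 highly toxic compounds predicted poorly.
true = [1.0] * 90 + [5.0] * 10
pred = [1.1] * 90 + [3.0] * 10           # large errors confined to the toxic tail

overall = rmse(pred, true)               # dominated by the populous class
toxic = rmse(pred[90:], true[90:])       # reveals the real weakness
```

    The overall RMSE looks respectable even though every highly toxic compound is badly mispredicted, which is exactly why stratified (per-range) evaluation matters for uneven sample distributions.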

  1. Reculturing Schools in England: How "Cult" Values in Education Policy Discourse Influence the Construction of Practitioner Identities and Work Orientations

    ERIC Educational Resources Information Center

    Bates, Agnieszka

    2016-01-01

    The imperative of continuous improvement has now become normative in education policy discourse, typically framed as setting "aspirational" targets for pupil performance as a prerequisite for gaining competitive advantage in the global economy. In this context, teachers, leaders, teacher assistants and other practitioners working in…

  2. Global Maritime Partnerships Game

    DTIC Science & Technology

    2010-10-08

    develop concepts of operations, doctrine and TTPs to perform various maritime security mission sets. Close collaboration with the USCG is vital to...issues of common concern in order to develop compatible doctrine. Objectives: examine the interaction between...Senkaku Islands (PRC vs. Japan), Indonesia archipelagic sea lanes passage, Northern Territories Dispute (Japan vs. Russia)

  3. A Living Library: New Model for Global Electronic Interactivity and Networking in the Garden.

    ERIC Educational Resources Information Center

    Sherk, Bonnie

    1995-01-01

    Describes the Living Library, an idea to create a network of international cultural parks in different cities of the world using new communications technologies on-line in a garden setting, bringing the humanities, sciences, and social sciences to life through plants, visual and performed artworks, lectures, and computer and on-line satellite…

  4. Heritability of growth traits and correlation with hepatic gene expression among hybrid striped bass exhibiting extremes in performance

    USDA-ARS?s Scientific Manuscript database

    We set out to better understand the genetic basis behind growth variation in hybrid striped bass (HSB) by determining whether gene expression changes could be detected between the largest and smallest HSB in a population using a global gene expression approach by RNA sequencing of liver. Fingerling...

  5. Standardized Individuality: Cosmopolitanism and Educational Decision-Making in an Atlantic Canadian Rural Community

    ERIC Educational Resources Information Center

    Corbett, Michael J.

    2010-01-01

    With the rise of network society, consumerism, individualization, globalization and contemporary change forces, students are pressured both to perform well in standardized academic assessments and, at the same time, to construct a non-standard, unique project of the self. I argue that this generates a particular set of place-based tensions for…

  6. An Automated Algorithm to Screen Massive Training Samples for a Global Impervious Surface Classification

    NASA Technical Reports Server (NTRS)

    Tan, Bin; Brown de Colstoun, Eric; Wolfe, Robert E.; Tilton, James C.; Huang, Chengquan; Smith, Sarah E.

    2012-01-01

    An algorithm is developed to automatically screen outliers from massive training samples for the Global Land Survey - Imperviousness Mapping Project (GLS-IMP). GLS-IMP aims to produce a global 30 m spatial resolution impervious cover data set for the years 2000 and 2010 based on the Landsat Global Land Survey (GLS) data set. This unprecedented high resolution impervious cover data set is not only significant to urbanization studies but also desired by global carbon, hydrology, and energy balance research. A supervised classification method, the regression tree, is applied in this project. A set of accurate training samples is the key to supervised classifications. Here we developed the global-scale training samples from fine resolution (about 1 m) satellite data (QuickBird and WorldView-2) and then aggregated the fine resolution impervious cover maps to 30 m resolution. In order to improve the classification accuracy, the training samples should be screened before being used to train the regression tree. It is impossible to manually screen 30 m resolution training samples collected globally. For example, in Europe alone, there are 174 training sites; the size of the sites ranges from 4.5 km by 4.5 km to 8.1 km by 3.6 km, and the training samples number over six million. Therefore, we developed this automated, statistics-based algorithm to screen the training samples at two levels: the site level and the scene level. At the site level, all the training samples are divided into 10 groups according to the percentage of impervious surface within a sample pixel; the samples falling in each 10% interval form one group. For each group, both univariate and multivariate outliers are detected and removed. Then the screening process escalates to the scene level, where a similar screening process, but with a looser threshold, is applied to account for the possible variance due to site differences. 
We do not perform the screening process across scenes because the scenes may vary due to factors such as phenology, solar-view geometry, and atmospheric conditions rather than actual land cover differences. Finally, we will compare the classification results from screened and unscreened training samples to assess the improvement achieved by cleaning up the training samples.
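
    The site-level step (group samples into 10% impervious-fraction bins, then drop outliers per group) can be sketched with a simple univariate z-score screen. This is a hypothetical simplification: the actual GLS-IMP algorithm also removes multivariate outliers and repeats the screen at the scene level:

```python
import statistics

def screen_samples(samples, z_max=3.0):
    """Group samples into 10% impervious-fraction bins and drop univariate
    outliers (spectral value more than z_max standard deviations from the
    bin mean). Each sample is (impervious_fraction, spectral_value)."""
    bins = {}
    for frac, val in samples:
        bins.setdefault(min(int(frac * 10), 9), []).append((frac, val))
    kept = []
    for group in bins.values():
        vals = [v for _, v in group]
        if len(vals) < 3:            # too few samples to estimate spread
            kept.extend(group)
            continue
        mu, sd = statistics.mean(vals), statistics.pstdev(vals)
        kept.extend(s for s in group if sd == 0 or abs(s[1] - mu) / sd <= z_max)
    return kept

# Hypothetical bin of 19 consistent low-impervious samples plus one bad one,
# and a small high-impervious bin that is kept untouched.
samples = ([(0.05 + 0.001 * i, 10.0) for i in range(19)]
           + [(0.08, 80.0)]                      # outlier in the 0-10% bin
           + [(0.95, 50.0), (0.97, 51.0)])
clean = screen_samples(samples)
```

    Binning before screening matters because a spectral value that is anomalous for a 5%-impervious pixel can be perfectly normal for a 95%-impervious one.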

  7. Context factors in general practitioner-patient encounters and their impact on assessing communication skills--an exploratory study.

    PubMed

    Essers, Geurt; Kramer, Anneke; Andriesse, Boukje; van Weel, Chris; van der Vleuten, Cees; van Dulmen, Sandra

    2013-05-22

    Assessment of medical communication performance usually focuses on rating generically applicable, well-defined communication skills. However, in daily practice, communication is determined by (specific) context factors, such as acquaintance with the patient, or the presented problem. Merely valuing the presence of generic skills may not do justice to the doctor's proficiency. Our aim was to perform an exploratory study on how assessment of general practitioner (GP) communication performance changes if context factors are explicitly taken into account. We used a mixed method design to explore how ratings would change. A random sample of 40 everyday GP consultations was used to see if previously identified context factors could be observed again. The sample was rated twice using a widely used assessment instrument (the MAAS-Global), first in the standard way and secondly after context factors were explicitly taken into account, by using a context-specific rating protocol to assess communication performance in the workplace. In between the first and second rating, the presence of context factors was established. Item score differences were calculated using paired sample t-tests. In 38 out of 40 consultations, context factors prompted application of the context-specific rating protocol. The mean overall score on the 7-point MAAS-Global scale increased from 2.98 in the standard rating to 3.66 in the context-specific rating (p<0.00); the effect size for the total mean score was 0.84. In earlier research the minimum standard score for adequate communication was set at 3.17. Applying the protocol, the mean overall score rose above the level set in an earlier study for the MAAS-Global scores to represent 'adequate GP communication behaviour'. Our findings indicate that incorporating context factors in communication assessment makes a meaningful difference and shows that context factors should be considered as 'signal' instead of 'noise' in GP communication assessment. 
Explicating context factors leads to a more deliberate and transparent rating of GP communication performance.
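The paired comparison reported above rests on an effect-size calculation over per-consultation score differences. A minimal sketch, using hypothetical scores rather than the study's data: Cohen's d for paired ratings is the mean difference divided by the standard deviation of the differences.

```python
import statistics

def paired_effect_size(before, after):
    """Cohen's d for paired ratings: mean of the per-item differences
    divided by the standard deviation of those differences."""
    diffs = [a - b for a, b in zip(after, before)]
    return statistics.mean(diffs) / statistics.stdev(diffs)

# Hypothetical per-consultation MAAS-Global mean scores (not the study's data):
# standard rating vs. context-specific rating of the same consultations.
standard = [2.8, 3.1, 2.9, 3.0, 3.2, 2.7]
contextual = [3.5, 3.8, 3.4, 3.7, 3.9, 3.6]
d = paired_effect_size(standard, contextual)
```

With these made-up numbers the differences are nearly uniform, so d is large; the study's reported 0.84 reflects far more varied real ratings.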

  8. Health-promoting schools: an opportunity for oral health promotion.

    PubMed Central

    Kwan, Stella Y. L.; Petersen, Poul Erik; Pine, Cynthia M.; Borutta, Annerose

    2005-01-01

    Schools provide an important setting for promoting health, as they reach over 1 billion children worldwide and, through them, the school staff, families and the community as a whole. Health promotion messages can be reinforced throughout the most influential stages of children's lives, enabling them to develop lifelong sustainable attitudes and skills. Poor oral health can have a detrimental effect on children's quality of life, their performance at school and their success in later life. This paper examines the global need for promoting oral health through schools. The WHO Global School Health Initiative and the potential for setting up oral health programmes in schools using the health-promoting school framework are discussed. The challenges faced in promoting oral health in schools in both developed and developing countries are highlighted. The importance of using a validated framework and appropriate methodologies for the evaluation of school oral health projects is emphasized. PMID:16211159

  9. Pareto-Optimal Estimates of California Precipitation Change

    NASA Astrophysics Data System (ADS)

    Langenbrunner, Baird; Neelin, J. David

    2017-12-01

In seeking constraints on global climate model projections under global warming, one commonly finds that different subsets of models perform well under different objective functions, and these trade-offs are difficult to weigh. Here a multiobjective approach is applied to a large set of subensembles generated from the Coupled Model Intercomparison Project phase 5 (CMIP5) ensemble. We use observations and reanalyses to constrain tropical Pacific sea surface temperatures, upper-level zonal winds in the midlatitude Pacific, and California precipitation. An evolutionary algorithm identifies the set of Pareto-optimal subensembles across these three measures, and these subensembles are used to constrain end-of-century California wet season precipitation change. This methodology narrows the range of projections throughout California, increasing confidence in estimates of positive mean precipitation change. Finally, we show how this technique complements and generalizes emergent constraint approaches for restricting uncertainty in end-of-century projections within multimodel ensembles using multiple criteria for observational constraints.
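The core of the multiobjective step is filtering candidates down to the nondominated (Pareto-optimal) set. A minimal sketch, assuming lower error is better in every objective and using made-up subensemble error values; the paper's evolutionary search itself is not reproduced here.

```python
def pareto_optimal(points):
    """Return indices of nondominated points, assuming lower is better in
    every objective (e.g. error against each observational constraint).
    A point is dominated if some other point is at least as good everywhere."""
    front = []
    for i, p in enumerate(points):
        dominated = any(
            all(q[k] <= p[k] for k in range(len(p))) and q != p
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(i)
    return front

# Hypothetical subensemble errors against three constraints
# (SST, upper-level winds, precipitation) -- not values from the paper.
errors = [
    (0.2, 0.5, 0.3),
    (0.4, 0.2, 0.6),
    (0.5, 0.6, 0.7),  # dominated by the first point in all three objectives
    (0.1, 0.9, 0.4),
]
front = pareto_optimal(errors)
```

The third candidate is strictly worse than the first everywhere, so it drops out; the remaining three each trade one objective against another.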

  10. Decision-theoretic methodology for reliability and risk allocation in nuclear power plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cho, N.Z.; Papazoglou, I.A.; Bari, R.A.

    1985-01-01

This paper describes a methodology for allocating reliability and risk to various reactor systems, subsystems, components, operations, and structures in a consistent manner, based on a set of global safety criteria which are not rigid. The problem is formulated as a multiattribute decision analysis paradigm; the multiobjective optimization, which is performed on a PRA model and reliability cost functions, serves as the guiding principle for reliability and risk allocation. The concept of noninferiority is used in the multiobjective optimization problem. Finding the noninferior solution set is the main theme of the current approach. The assessment of the decision maker's preferences could then be performed more easily on the noninferior solution set. Some results of the methodology applications to a nontrivial risk model are provided, and several outstanding issues, such as generic allocation and preference assessment, are discussed.

  11. Global Collaborative STEM Education

    NASA Astrophysics Data System (ADS)

    Meabh Kelly, Susan; Smith, Walter

    2016-04-01

Global Collaborative STEM Education, as the name suggests, simultaneously supports two sets of knowledge and skills. The first set is STEM -- science, technology, engineering and math. The other set of content knowledge and skills is that of global collaboration. Successful global partnerships require awareness of one's own culture and the biases embedded within that culture, as well as developing awareness of the collaborators' culture. Workforce skills fostered include open-mindedness, perseverance when faced with obstacles, and resourceful use of technological "bridges" to facilitate and sustain communication. In keeping with the focus of the 2016 GIFT Workshop, Global Collaborative STEM Education projects dedicated to astronomy research will be presented. The projects represent different benchmarks within the Global Collaborative STEM Education continuum, culminating in an astronomy research experience that fully reflects how the global STEM workforce collaborates. To facilitate wider engagement in Global Collaborative STEM Education, project summaries, classroom resources and contact information for established international collaborative astronomy research projects will be disseminated.

  12. Testing a Coupled Global-limited-area Data Assimilation System using Observations from the 2004 Pacific Typhoon Season

    NASA Astrophysics Data System (ADS)

    Holt, C. R.; Szunyogh, I.; Gyarmati, G.; Hoffman, R. N.; Leidner, M.

    2011-12-01

Tropical cyclone (TC) track and intensity forecasts have improved in recent years due to increased model resolution, improved data assimilation, and the rapid increase in the number of routinely assimilated observations over oceans. The data assimilation approach that has received the most attention in recent years is Ensemble Kalman Filtering (EnKF). The most attractive feature of the EnKF is that it uses a fully flow-dependent estimate of the error statistics, which can have important benefits for the analysis of rapidly developing TCs. We implement the Local Ensemble Transform Kalman Filter algorithm, a variation of the EnKF, on a reduced-resolution version of the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS) model and the NCEP Regional Spectral Model (RSM) to build a coupled global-limited-area analysis/forecast system. This is the first time, to our knowledge, that such a system has been used for the analysis and forecast of tropical cyclones. We use data from summer 2004 to study eight tropical cyclones in the Northwest Pacific. The benchmark data sets that we use to assess the performance of our system are the NCEP Reanalysis and the NCEP operational GFS analyses from 2004. These benchmark analyses were both obtained by the Spectral Statistical Interpolation, which was the operational data assimilation system of NCEP in 2004. The GFS operational analysis assimilated a large number of satellite radiance observations in addition to the observations assimilated in our system. All analyses are verified against the Joint Typhoon Warning Center Best Track data set. The errors are calculated for the position and intensity of the TCs. The global component of the ensemble-based system shows improvement in position analysis over the NCEP Reanalysis, but shows no significant difference from the NCEP operational analysis for most of the storm tracks. The regional component of our system improves position analysis over all the global analyses. The intensity analyses, measured by the minimum sea level pressure, are of similar quality in all of the analyses. Regional deterministic forecasts started from our analyses are generally not significantly different from those started from the GFS operational analysis. On average, the regional experiments performed better for sea level pressure forecasts beyond 48 h, while the global forecasts performed better in predicting position beyond 48 h.
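Position errors against a best-track data set are typically great-circle distances between analysed and observed storm centers. A minimal sketch of that verification step, with hypothetical coordinates; the abstract does not specify the exact distance formula used.

```python
import math

def great_circle_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Haversine great-circle distance in km between two (lat, lon) points,
    e.g. an analysed TC center and the corresponding best-track position."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Hypothetical analysed vs. best-track positions (deg N, deg E),
# roughly in the Northwest Pacific study region.
err_km = great_circle_km(20.0, 135.0, 20.3, 135.4)
```

A 0.3 deg by 0.4 deg offset at this latitude corresponds to a position error of roughly 50 km.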

  13. Too Exhausted to Perform at the Highest Level? On the Importance of Self-control Strength in Educational Settings

    PubMed Central

    Englert, Chris; Zavery, Alafia; Bertrams, Alex

    2017-01-01

    In order to perform at the highest level in educational settings (e.g., students in testing situations), individuals often have to control their impulses or desires (e.g., to study for an upcoming test or to prepare a course instead of spending time with the peer group). Previous research suggests that the ability to exert self-control is an important predictor of performance and behavior in educational contexts. According to the strength model, all self-control acts are based on one global energy pool whose capacity is assumed to be limited. After having performed a first act of self-control, this resource can become temporarily depleted which negatively affects subsequent self-control. In such a state of ego depletion, individuals tend to display impaired concentration and academic performance, fail to meet academic deadlines, or even disengage from their duties. In this mini-review, we report recent studies on ego depletion which have focused on children as well as adults in educational settings, derive practical implications for how to improve self-control strength in the realm of education and instruction, and discuss limitations regarding the assumptions of the strength model of self-control. PMID:28790963

  15. Gadolinia fuel performance in BWRs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, W.E.; Crowther, R.L.

    1985-11-01

Gadolinia has the unique property of a high neutron absorption cross section coupled with a burnup rate that can approximately match uranium-235 depletion. These qualities and others make gadolinia an ideal burnable absorber, and it has been used in all General Electric-designed boiling water reactors. Fabrication, corrosion properties, and performance of gadolinia-containing fuel elements are discussed. Development of a reliable and efficient set of local and global gadolinia-urania design methods has been an arduous process that has taken approximately 15 years to accomplish.

  16. Optimal Wonderful Life Utility Functions in Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Tumer, Kagan; Swanson, Keith (Technical Monitor)

    2000-01-01

    The mathematics of Collective Intelligence (COINs) is concerned with the design of multi-agent systems so as to optimize an overall global utility function when those systems lack centralized communication and control. Typically in COINs each agent runs a distinct Reinforcement Learning (RL) algorithm, so that much of the design problem reduces to how best to initialize/update each agent's private utility function, as far as the ensuing value of the global utility is concerned. Traditional team game solutions to this problem assign to each agent the global utility as its private utility function. In previous work we used the COIN framework to derive the alternative Wonderful Life Utility (WLU), and experimentally established that having the agents use it induces global utility performance up to orders of magnitude superior to that induced by use of the team game utility. The WLU has a free parameter (the clamping parameter) which we simply set to zero in that previous work. Here we derive the optimal value of the clamping parameter, and demonstrate experimentally that using that optimal value can result in significantly improved performance over that of clamping to zero, over and above the improvement beyond traditional approaches.
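The Wonderful Life Utility can be illustrated on a toy world utility. A sketch under stated assumptions: the congestion-style global utility below is hypothetical, not the paper's experimental setup, and the clamping parameter defaults to zero as in the earlier work the abstract describes.

```python
from collections import Counter

def global_utility(actions):
    """Toy world utility: penalize agents crowding onto the same action
    (negative sum of squared occupancy counts)."""
    counts = Counter(actions)
    return -sum(n * n for n in counts.values())

def wonderful_life_utility(actions, agent, clamp=0):
    """WLU for one agent: world utility minus the utility of the same
    world with that agent's action replaced by the clamping parameter."""
    clamped = list(actions)
    clamped[agent] = clamp
    return global_utility(actions) - global_utility(clamped)

actions = [1, 2, 2, 3]
wlu0 = wonderful_life_utility(actions, 0)  # agent 0 occupies action 1 alone
wlu1 = wonderful_life_utility(actions, 1)  # agent 1 crowds action 2
```

Agent 0's WLU is zero (clamping it away changes nothing), while agent 1's is negative because it congests an occupied action; the private signal isolates each agent's marginal effect on the world utility.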

  17. Intercomparison of 4 Years of Global Formaldehyde Observations from the GOME-2 and OMI Sensors

    NASA Astrophysics Data System (ADS)

    De Smedt, Isabelle; Van Roozendael, Michel; Stravrakou, Trissevgeni; Muller, Jean-Francois; Chance, Kelly; Kurosu, Thomas

    2012-11-01

    Formaldehyde (H2CO) tropospheric columns have been retrieved since 2007 from backscattered UV radiance measurements performed by the GOME-2 instrument on the EUMETSAT METOP-A platform. This data set extends the successful time-series of global H2CO observations established with GOME/ ERS-2 (1996-2003), SCIAMACHY/ ENVISAT (2003-2012), and OMI on the NASA AURA platform (2005-now). In this work, we perform an intercomparison of the H2CO tropospheric columns retrieved from GOME-2 and OMI between 2007 and 2010, respectively at BIRA-IASB and at Harvard SAO. We first compare the global formaldehyde data products that are provided by each retrieval group. We then investigate each step of the retrieval procedure: the slant column fitting, the reference sector correction and the air mass factor calculation. New air mass factors are computed for OMI using external parameters consistent with those used for GOME-2. By doing so, the impacts of the different a priori profiles and aerosol corrections are quantified. The remaining differences are evaluated in view of the expected diurnal variations of the formaldehyde concentrations, based on ground-based measurements performed in the Beijing area.

  18. Ligand Binding Site Detection by Local Structure Alignment and Its Performance Complementarity

    PubMed Central

    Lee, Hui Sun; Im, Wonpil

    2013-01-01

Accurate determination of potential ligand binding sites (BS) is a key step for protein function characterization and structure-based drug design. Despite promising results of template-based BS prediction methods using global structure alignment (GSA), there is room to improve performance by properly incorporating local structure alignment (LSA), because BS are local structures and are often similar across proteins with dissimilar global folds. We present a template-based ligand BS prediction method using G-LoSA, our LSA tool. A large benchmark set validation shows that G-LoSA predicts drug-like ligands’ positions in single-chain protein targets more precisely than TM-align, a GSA-based method, while the overall success rate of TM-align is better. G-LoSA is particularly efficient for accurate detection of local structures conserved across proteins with diverse global topologies. Recognizing the performance complementarity of G-LoSA to TM-align and to a non-template geometry-based method, fpocket, a robust consensus scoring method, CMCS-BSP (Complementary Methods and Consensus Scoring for ligand Binding Site Prediction), is developed and shows improved prediction accuracy. The G-LoSA source code is freely available at http://im.bioinformatics.ku.edu/GLoSA. PMID:23957286

  19. Interdependencies and Causalities in Coupled Financial Networks

    PubMed Central

    Vodenska, Irena; Aoyama, Hideaki; Fujiwara, Yoshi; Iyetomi, Hiroshi; Arai, Yuta

    2016-01-01

We explore the foreign exchange and stock market networks for 48 countries from 1999 to 2012 and propose a model, based on complex Hilbert principal component analysis, for extracting significant lead-lag relationships between these markets. The global set of countries, including large and small countries in Europe, the Americas, Asia, and the Middle East, is contrasted with the limited scopes of targets, e.g., G5, G7 or the emerging Asian countries, adopted by previous works. We construct a coupled synchronization network, perform community analysis, and identify the formation of four distinct network communities that are relatively stable over time. In addition to investigating the entire period, we divide it into “mild crisis” (1999–2002), “calm” (2003–2006) and “severe crisis” (2007–2012) sub-periods and find that the severe-crisis-period behavior dominates the dynamics in the foreign exchange-equity synchronization network. We observe that in general the foreign exchange market has predictive power for global stock market performance. In addition, the United States, German, and Mexican markets have forecasting power for the performance of other global equity markets. PMID:26977806
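The paper extracts lead-lag structure with complex Hilbert principal component analysis. As a far simpler stand-in for the same idea, one can estimate the lag at which one series best predicts another via lagged correlation; this sketch is not the paper's method, and the series below are made up.

```python
import statistics

def correlation(x, y):
    """Pearson correlation of two equal-length sequences."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x)
           * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def best_lag(x, y, max_lag=5):
    """Lag (in steps) at which x best predicts y; x is said to lead y."""
    scores = {lag: correlation(x[:-lag], y[lag:])
              for lag in range(1, max_lag + 1)}
    return max(scores, key=lambda k: abs(scores[k]))

# Hypothetical return series where y echoes x two steps later
# (a caricature of an FX series leading an equity series).
x = [0.1, -0.2, 0.3, 0.0, -0.1, 0.2, -0.3, 0.1, 0.0, -0.2, 0.3, -0.1]
y = [0.0, 0.0] + x[:-2]
lag = best_lag(x, y)
```

The lagged-correlation scan recovers the two-step lead built into the toy data; the complex Hilbert approach generalizes this to many markets and phase relationships at once.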

  20. Identifying Critical Success Factors for TQM and Employee Performance in Malaysian Automotive Industry: A Literature Review

    NASA Astrophysics Data System (ADS)

    Nadia Dedy, Aimie; Zakuan, Norhayati; Zaidi Bahari, Ahamad; Ariff, Mohd Shoki Md; Chin, Thoo Ai; Zameri Mat Saman, Muhamad

    2016-05-01

TQM is a management philosophy embracing all activities through which the needs and expectations of the customer and the community, and the goals of the company, are satisfied in the most efficient and cost-effective way, by maximizing the potential of all workers in a continuing drive for total quality improvement. TQM is very important for companies, especially in the automotive industry, seeking to survive in the competitive global market. The main objective of this study is to review the relationship between TQM and employee performance. The authors review updated literature on TQM with two main targets: (a) the evolution of TQM considered as a set of practices, and (b) its impact on employee performance. Two research questions are therefore proposed in order to review TQM constructs and employee performance measures: (a) Is the set of critical success factors associated with TQM valid as a whole? (b) Which critical success factors should be considered to measure employee performance in the automotive industry?

  1. Assessing the performance of community-available global MHD models using key system parameters and empirical relationships

    NASA Astrophysics Data System (ADS)

    Gordeev, E.; Sergeev, V.; Honkonen, I.; Kuznetsova, M.; Rastätter, L.; Palmroth, M.; Janhunen, P.; Tóth, G.; Lyon, J.; Wiltberger, M.

    2015-12-01

Global magnetohydrodynamic (MHD) modeling is a powerful tool in space weather research and predictions. Several advanced and still-developing global MHD (GMHD) models are publicly available via the Community Coordinated Modeling Center's (CCMC) Run on Request system, which allows users to simulate the magnetospheric response to different solar wind conditions, including extraordinary events such as geomagnetic storms. Systematic validation of GMHD models against observations remains a challenge, as does comparative benchmarking of different models against each other. In this paper we describe and test a new approach in which (i) a set of critical large-scale system parameters is tested, produced by (ii) a specially designed set of computer runs that simulate realistic statistical distributions of critical solar wind parameters, and compared to (iii) observation-based empirical relationships for these parameters. Tested under approximately similar conditions (similar inputs, comparable grid resolution, etc.), the four models publicly available at the CCMC predict rather well the absolute values and variations of those key parameters (magnetospheric size, magnetic field, and pressure) that are directly related to the large-scale magnetospheric equilibrium in the outer magnetosphere, for which MHD is supposed to be a valid approach. At the same time, the models show systematic differences in other parameters, being especially different in predicting the global convection rate, total field-aligned current, and magnetic flux loading into the magnetotail after a north-south interplanetary magnetic field turning. According to the validation results, none of the models emerges as an absolute leader.
The new approach suggested for evaluating the models' performance against reality may be used by model users when planning their investigations, as well as by model developers and those interested in quantitatively evaluating progress in magnetospheric modeling.

  2. Global Sensitivity Analysis and Parameter Calibration for an Ecosystem Carbon Model

    NASA Astrophysics Data System (ADS)

    Safta, C.; Ricciuto, D. M.; Sargsyan, K.; Najm, H. N.; Debusschere, B.; Thornton, P. E.

    2013-12-01

We present uncertainty quantification results for a process-based ecosystem carbon model. The model employs 18 parameters and is driven by meteorological data corresponding to years 1992-2006 at the Harvard Forest site. Daily Net Ecosystem Exchange (NEE) observations were available to calibrate the model parameters and test the performance of the model. Posterior distributions show good predictive capabilities for the calibrated model. A global sensitivity analysis was first performed to determine the important model parameters based on their contribution to the variance of NEE. We then calibrate the model parameters in a Bayesian framework. The daily discrepancies between measured and predicted NEE values were modeled as independent and identically distributed Gaussians with prescribed daily variance according to the recorded instrument error. All model parameters were assumed to have uninformative priors with bounds set according to expert opinion. The global sensitivity results show that the rate of leaf fall (LEAFALL) is responsible for approximately 25% of the total variance in the average NEE for 1992-2005. A set of four other parameters, nitrogen use efficiency (NUE), base rate for maintenance respiration (BR_MR), growth respiration fraction (RG_FRAC), and allocation to plant stem pool (ASTEM), contribute between 5% and 12% to the variance in average NEE, while the rest of the parameters have smaller contributions. The posterior distributions, sampled with a Markov Chain Monte Carlo algorithm, exhibit significant correlations between model parameters. However, LEAFALL, the most important parameter for the average NEE, is not informed by the observational data, while less important parameters show significant updates between their prior and posterior densities.
The Fisher information matrix values, indicating which parameters are most informed by the experimental observations, are examined to augment the comparison between the calibration and global sensitivity analysis results.
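The variance contributions reported above (e.g. LEAFALL's ~25%) are first-order sensitivity indices of the form Var(E[Y|Xi])/Var(Y). A crude Monte Carlo sketch on a hypothetical two-parameter toy model, not the ecosystem model itself; for this toy the analytic index of the dominant parameter is 0.9.

```python
import random
import statistics

def model(x1, x2):
    # Toy stand-in with one dominant and one weak parameter;
    # analytic first-order index of x1 is 9/(9+1) = 0.9 for uniform inputs.
    return 3.0 * x1 + 1.0 * x2

def first_order_index(n_outer=1000, n_inner=100, seed=1):
    """Crude Monte Carlo estimate of S1 = Var(E[Y|X1]) / Var(Y):
    freeze X1, average over X2 to get E[Y|X1], then take variances."""
    rng = random.Random(seed)
    cond_means, all_y = [], []
    for _ in range(n_outer):
        x1 = rng.random()  # uniform [0, 1) sample of the studied parameter
        ys = [model(x1, rng.random()) for _ in range(n_inner)]
        cond_means.append(statistics.mean(ys))
        all_y.extend(ys)
    return statistics.variance(cond_means) / statistics.variance(all_y)

s1 = first_order_index()
```

Production Sobol estimators use more efficient sampling schemes than this brute-force double loop, but the variance-decomposition idea is the same.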

  3. Global Change adaptation in water resources management: the Water Change project.

    PubMed

    Pouget, Laurent; Escaler, Isabel; Guiu, Roger; Mc Ennis, Suzy; Versini, Pierre-Antoine

    2012-12-01

In recent years, water resources management has been facing new challenges due to increasing changes and their associated uncertainties, such as changes in climate, water demand, or land use, which can be grouped under the term Global Change. The Water Change project (LIFE+ funding) developed a methodology and a tool to assess Global Change impacts on water resources, thus helping river basin agencies and water companies in their long-term planning and in the definition of adaptation measures. The main result of the project was the creation of a step-by-step methodology to assess Global Change impacts and define adaptation strategies. This methodology was tested in the Llobregat river basin (Spain) with the objective of being applicable to any water system. It includes several steps, such as setting up the problem with a DPSIR framework, developing Global Change scenarios, running river basin models, and performing a cost-benefit analysis to define optimal adaptation strategies. The methodology is supported by a flexible modelling system that can link a wide range of models, such as hydrological, water quality, and water management models. The tool allows users to integrate their own models into the system, which can then exchange information among themselves automatically. This makes it possible to simulate the interactions among multiple components of the water cycle and to run a large number of Global Change scenarios quickly. The outcomes of this project make it possible to define and test different sets of adaptation measures for the basin, which can be further evaluated through cost-benefit analysis. The integration of the results contributes to efficient decision-making on how to adapt to Global Change impacts. Copyright © 2012 Elsevier B.V. All rights reserved.

  4. Global health in the European Union--a review from an agenda-setting perspective.

    PubMed

    Aluttis, Christoph; Krafft, Thomas; Brand, Helmut

    2014-01-01

This review analyses the global health agenda-setting process in the European Union (EU). We give an overview of the European perspective on global health, with reference to the developments that led the EU to acknowledge its role as a global health actor. The article focuses in particular on the European interpretation of its role in global health from 2010, which was formalised through a European Commission Communication and European Council Conclusions, respectively. Departing from there, and based on Kingdon's multiple streams theory of agenda setting, we identify barriers that appear to hinder the further establishment and promotion of a solid global health agenda in the EU. The main barriers to creating a strong European global health agenda are the fragmentation of the policy community and the lack of a common definition of global health in Europe. Advancing the global health agenda in Europe requires clarification of the common goals and perspectives of the policy community and the use of arising windows of opportunity.

  5. Estimating global distribution of boreal, temperate, and tropical tree plant functional types using clustering techniques

    NASA Astrophysics Data System (ADS)

    Wang, Audrey; Price, David T.

    2007-03-01

A simple integrated algorithm was developed to relate global climatology to distributions of tree plant functional types (PFT). Multivariate cluster analysis was performed to analyze the statistical homogeneity of the climate space occupied by individual tree PFTs. Forested regions identified from the satellite-based GLC2000 classification were separated into tropical, temperate, and boreal sub-PFTs for use in the Canadian Terrestrial Ecosystem Model (CTEM). Global data sets of monthly minimum temperature, growing degree days, an index of climatic moisture, and estimated PFT cover fractions were then used as variables in the cluster analysis. The statistical results for individual PFT clusters were found to be consistent with other global-scale classifications of dominant vegetation. Improving on previous quantifications of the climatic limitations on PFT distributions, the results also demonstrated overlap of PFT cluster boundaries that reflects vegetation transitions, for example between tropical and temperate biomes. The resulting global database should provide a better basis for simulating the interaction of climate change and terrestrial ecosystem dynamics using global vegetation models.
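The clustering step can be sketched with a minimal k-means over a two-dimensional climate space. The grid cells below are hypothetical (minimum temperature in deg C, growing degree days), and the study's actual multivariate analysis used more variables and is not reproduced here.

```python
import math

def kmeans(points, k, iters=50):
    """Minimal k-means over climate-space points. Deterministic
    initialisation from the first k points keeps the demo reproducible."""
    centers = points[:k]
    clusters = []
    for _ in range(iters):
        # Assign each point to its nearest current center.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[nearest].append(p)
        # Recompute each center as the mean of its cluster.
        centers = [
            tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
    return centers, clusters

# Hypothetical boreal (cold, few degree days) vs. tropical grid cells.
cells = [(-30.0, 800.0), (25.0, 9200.0), (-28.0, 900.0),
         (-32.0, 850.0), (24.0, 9000.0), (26.0, 9500.0)]
centers, clusters = kmeans(cells, k=2)
```

With two well-separated climates the algorithm splits the cells cleanly; on real data, variables would first need scaling so that no single axis (here, degree days) dominates the distance.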

  6. A framework for global river flood risk assessments

    NASA Astrophysics Data System (ADS)

    Winsemius, H. C.; Van Beek, L. P. H.; Jongman, B.; Ward, P. J.; Bouwman, A.

    2012-08-01

There is an increasing need for strategic global assessments of flood risks in current and future conditions. In this paper, we propose a framework for global flood risk assessment for river floods, which can be applied in current conditions, as well as in future conditions due to climate and socio-economic changes. The framework's goal is to establish flood hazard and impact estimates at a high enough resolution to allow for their combination into a risk estimate. The framework estimates hazard at high resolution (~1 km2) using global forcing datasets of the current (or, in scenario mode, future) climate, a global hydrological model, a global flood routing model, and, importantly, a flood extent downscaling routine. The second component of the framework combines hazard with flood impact models at the same resolution (e.g. damage, affected GDP, and affected population) to establish indicators for flood risk (e.g. annual expected damage, affected GDP, and affected population). The framework has been applied using the global hydrological model PCR-GLOBWB, which includes an optional global flood routing model DynRout, combined with scenarios from the Integrated Model to Assess the Global Environment (IMAGE). We performed downscaling of the hazard probability distributions to 1 km2 resolution with a new downscaling algorithm, applied to Bangladesh as a first case-study area. We demonstrate the risk assessment approach in Bangladesh based on GDP per capita data, population, and land use maps for 2010 and 2050. Validation of the hazard and damage estimates has been performed using the Dartmouth Flood Observatory database and damage estimates from the EM-DAT database and World Bank sources. We discuss and show sensitivities of the estimated risks with regard to the use of different climate input sets, decisions made in the downscaling algorithm, and different approaches to establish impact models.
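Combining hazard probabilities with damages into an annual-expected-damage indicator is, in essence, an integration of damage over exceedance probability. A minimal sketch with hypothetical return periods and damage values (not the paper's estimates):

```python
def expected_annual_damage(return_periods, damages):
    """Trapezoidal integration of damage over exceedance probability.
    return_periods: ascending, in years; damages: same order, any currency."""
    probs = [1.0 / t for t in return_periods]  # annual exceedance probabilities
    ead = 0.0
    for i in range(len(probs) - 1):
        width = probs[i] - probs[i + 1]        # probs descend as periods ascend
        ead += 0.5 * (damages[i] + damages[i + 1]) * width
    return ead

# Hypothetical damages (million USD) for the 2-, 10-, and 100-year floods.
ead = expected_annual_damage([2, 10, 100], [5.0, 40.0, 250.0])
```

Here the expected annual damage comes to about 22 million USD per year; real frameworks integrate over many more return periods and truncate or extrapolate the tails.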

  7. Compiling global name-space programs for distributed execution

    NASA Technical Reports Server (NTRS)

    Koelbel, Charles; Mehrotra, Piyush

    1990-01-01

Distributed memory machines do not provide hardware support for a global address space. Programmers are thus forced to partition the data across the memories of the architecture and to use explicit message passing to communicate data between processors. The compiler support required to allow programmers to express their algorithms using a global name-space is examined. A general method is presented for the analysis of a high-level source program and its translation to a set of independently executing tasks that communicate via messages. If the compiler has enough information, this translation can be carried out at compile time. Otherwise, run-time code is generated to implement the required data movement. The analysis required in both situations is described, and the performance of the generated code on the Intel iPSC/2 is presented.
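Central to such a translation is mapping each global array index to an owning processor and a local index, so that the compiler knows where data lives and which messages to generate. A minimal sketch of the mapping for a block distribution (a generic convention, not necessarily the paper's exact scheme):

```python
def block_owner(global_index, n, p):
    """Map a global index into an n-element array, block-distributed over
    p processors, to a (processor, local index) pair. Each processor owns
    a contiguous block of ceil(n / p) elements."""
    block = -(-n // p)  # ceiling division without floats
    return global_index // block, global_index % block

# A 10-element global array over 4 processors: blocks of 3 elements,
# so global index 7 lives on processor 2 at local offset 1.
owner, local = block_owner(7, 10, 4)
```

Given this mapping, an "owner computes" translation assigns each assignment statement to the processor owning its left-hand side and emits messages for any right-hand-side operands owned elsewhere.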

  8. Deriving global parameter estimates for the Noah land surface model using FLUXNET and machine learning

    NASA Astrophysics Data System (ADS)

    Chaney, Nathaniel W.; Herman, Jonathan D.; Ek, Michael B.; Wood, Eric F.

    2016-11-01

    With their origins in numerical weather prediction and climate modeling, land surface models aim to accurately partition the surface energy balance. An overlooked challenge in these schemes is the role of model parameter uncertainty, particularly at unmonitored sites. This study provides global parameter estimates for the Noah land surface model using 85 eddy covariance sites in the global FLUXNET network. The at-site parameters are first calibrated using a Latin Hypercube-based ensemble of the most sensitive parameters, determined by the Sobol method, to be the minimum stomatal resistance (rs,min), the Zilitinkevich empirical constant (Czil), and the bare soil evaporation exponent (fxexp). Calibration leads to an increase in the mean Kling-Gupta Efficiency performance metric from 0.54 to 0.71. These calibrated parameter sets are then related to local environmental characteristics using the Extra-Trees machine learning algorithm. The fitted Extra-Trees model is used to map the optimal parameter sets over the globe at a 5 km spatial resolution. The leave-one-out cross validation of the mapped parameters using the Noah land surface model suggests that there is the potential to skillfully relate calibrated model parameter sets to local environmental characteristics. The results demonstrate the potential to use FLUXNET to tune the parameterizations of surface fluxes in land surface models and to provide improved parameter estimates over the globe.
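The Kling-Gupta Efficiency used as the calibration metric combines three diagnostics: correlation, a variability ratio, and a bias ratio. A minimal sketch of the 2009 formulation on toy series (not FLUXNET data):

```python
import statistics

def kling_gupta_efficiency(sim, obs):
    """KGE = 1 - sqrt((r - 1)^2 + (alpha - 1)^2 + (beta - 1)^2), where
    r is the linear correlation, alpha the ratio of standard deviations
    (sim/obs), and beta the ratio of means (sim/obs). 1 is a perfect score."""
    ms, mo = statistics.mean(sim), statistics.mean(obs)
    ss, so = statistics.stdev(sim), statistics.stdev(obs)
    r = sum((s - ms) * (o - mo) for s, o in zip(sim, obs)) / (
        (len(sim) - 1) * ss * so)
    alpha, beta = ss / so, ms / mo
    return 1 - ((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2) ** 0.5

obs = [1.0, 2.0, 3.0, 4.0, 5.0]
kge_perfect = kling_gupta_efficiency(obs, obs)           # exactly 1 by design
kge_biased = kling_gupta_efficiency([1.2 * x for x in obs], obs)
```

A uniformly 20% inflated simulation keeps r = 1 but pushes both alpha and beta to 1.2, dropping the score to about 0.72, which is roughly the pre-calibration mean reported above.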

  9. Habitat and environment of islands: primary and supplemental island sets

    USGS Publications Warehouse

    Matalas, Nicholas C.; Grossling, Bernardo F.

    2002-01-01

    The original intent of the study was to develop a first-order synopsis of island hydrology with an integrated geologic basis on a global scale. As the study progressed, the aim was broadened to provide a framework for subsequent assessments, on large regional or global scales, of island resources and of the impacts on those resources derived from global changes. Fundamental to the study was the development of a comprehensive framework: a wide range of parameters describing a set of 'saltwater' islands sufficiently large to (1) characterize the spatial distribution of the world's islands; (2) account for all major archipelagos; (3) account for almost all oceanically isolated islands; and (4) account collectively for a very large proportion of the total area of the world's islands, such that additional islands would only marginally contribute to the representativeness and accountability of the island set. The comprehensive framework, referred to as the 'Primary Island Set,' is built on 122 parameters that describe 1,000 islands. To complement the investigations based on the Primary Island Set, two supplemental island sets, Set A (other islands, not in the Primary Island Set) and Set B (lagoonal atolls), are included in the study. The Primary Island Set, together with Supplemental Island Sets A and B, provides a framework that can be used in various scientific disciplines for island-based studies on broad regional or global scales. The study uses an informal, coherent, geophysical organization of the islands belonging to the three island sets. The organization takes the form of a global island chain, a particular sequential ordering of the islands referred to as the 'Alisida.' The Alisida was developed through a trial-and-error procedure, seeking to strike a balance between minimizing the length of the global chain and maximizing the chain's geophysical coherence.
The fact that an objective function cannot be minimized and maximized simultaneously indicates that the Alisida is not unique. Global island chains other than the Alisida may better serve disciplines other than those of hydrology and geology.

  10. Invited review: A position on the Global Livestock Environmental Assessment Model (GLEAM).

    PubMed

    MacLeod, M J; Vellinga, T; Opio, C; Falcucci, A; Tempio, G; Henderson, B; Makkar, H; Mottet, A; Robinson, T; Steinfeld, H; Gerber, P J

    2018-02-01

    The livestock sector is one of the fastest growing subsectors of the agricultural economy and, while it makes a major contribution to global food supply and economic development, it also consumes significant amounts of natural resources and alters the environment. In order to improve our understanding of the global environmental impact of livestock supply chains, the Food and Agriculture Organization of the United Nations has developed the Global Livestock Environmental Assessment Model (GLEAM). The purpose of this paper is to provide a review of GLEAM. Specifically, it explains the model architecture, methods and functionality, that is, the types of analysis that the model can perform. The model focuses primarily on the quantification of greenhouse gas emissions arising from the production of the 11 main livestock commodities. The model inputs and outputs are managed and produced as raster data sets, with a spatial resolution of 0.05 decimal degrees. GLEAM v1.0 consists of five distinct modules: (a) the Herd Module; (b) the Manure Module; (c) the Feed Module; (d) the System Module; and (e) the Allocation Module. In terms of the modelling approach, GLEAM has several advantages. For example, spatial information on livestock distributions and crop yields enables rations to be derived that reflect the local availability of feed resources in developing countries. GLEAM also contains a herd model that enables livestock statistics to be disaggregated, and variation in livestock performance and management to be captured. Priorities for future development of GLEAM include: improving data quality and the methods used to perform emissions calculations; extending the scope of the model to include selected additional environmental impacts and to enable predictive modelling; and improving the utility of GLEAM output.

  11. Performance Incentives to Improve Community College Completion: Learning from Washington State's Student Achievement Initiative. A State Policy Brief

    ERIC Educational Resources Information Center

    Shulock, Nancy; Jenkins, Davis

    2011-01-01

    Amid growing signs of America's weakening position in the global economy, federal and state policymakers and major foundations have set ambitious goals for increasing postsecondary attainment in the United States. Given changing U.S. demographics, it has become clear that these national goals are attainable only with vastly improved outcomes among…

  12. Analysis of engagement behavior in children during dyadic interactions using prosodic cues

    PubMed Central

    Gupta, Rahul; Bone, Daniel; Lee, Sungbok; Narayanan, Shrikanth

    2017-01-01

    Child engagement is defined as the interaction of a child with his/her environment in a contextually appropriate manner. Engagement behavior in children is linked to socio-emotional and cognitive state assessment with enhanced engagement identified with improved skills. A vast majority of studies however rely solely, and often implicitly, on subjective perceptual measures of engagement. Access to automatic quantification could assist researchers/clinicians to objectively interpret engagement with respect to a target behavior or condition, and furthermore inform mechanisms for improving engagement in various settings. In this paper, we present an engagement prediction system based exclusively on vocal cues observed during structured interaction between a child and a psychologist involving several tasks. Specifically, we derive prosodic cues that capture engagement levels across the various tasks. Our experiments suggest that a child’s engagement is reflected not only in the vocalizations, but also in the speech of the interacting psychologist. Moreover, we show that prosodic cues are informative of the engagement phenomena not only as characterized over the entire task (i.e., global cues), but also in short term patterns (i.e., local cues). We perform a classification experiment assigning the engagement of a child into three discrete levels achieving an unweighted average recall of 55.8% (chance is 33.3%). While the systems using global cues and local level cues are each statistically significant in predicting engagement, we obtain the best results after fusing these two components. We perform further analysis of the cues at local and global levels to achieve insights linking specific prosodic patterns to the engagement phenomenon. We observe that while the performance of our model varies with task setting and interacting psychologist, there exist universal prosodic patterns reflective of engagement. PMID:28713198
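    The 55.8% figure above is an unweighted average recall (UAR): the mean of the per-class recalls, which is why chance level for three engagement classes is 1/3 regardless of class priors. A minimal sketch (labels and predictions here are illustrative, not from the study):

```python
def unweighted_average_recall(y_true, y_pred):
    """Mean of per-class recalls; insensitive to class imbalance."""
    classes = sorted(set(y_true))
    recalls = []
    for c in classes:
        idx = [i for i, label in enumerate(y_true) if label == c]
        hits = sum(1 for i in idx if y_pred[i] == c)
        recalls.append(hits / len(idx))
    return sum(recalls) / len(recalls)

# Three discrete engagement levels (0, 1, 2), toy predictions:
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 0, 1, 0, 2, 1]
print(unweighted_average_recall(y_true, y_pred))  # (1 + 0.5 + 0.5) / 3
```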

  13. Analysis of engagement behavior in children during dyadic interactions using prosodic cues.

    PubMed

    Gupta, Rahul; Bone, Daniel; Lee, Sungbok; Narayanan, Shrikanth

    2016-05-01

    Child engagement is defined as the interaction of a child with his/her environment in a contextually appropriate manner. Engagement behavior in children is linked to socio-emotional and cognitive state assessment with enhanced engagement identified with improved skills. A vast majority of studies however rely solely, and often implicitly, on subjective perceptual measures of engagement. Access to automatic quantification could assist researchers/clinicians to objectively interpret engagement with respect to a target behavior or condition, and furthermore inform mechanisms for improving engagement in various settings. In this paper, we present an engagement prediction system based exclusively on vocal cues observed during structured interaction between a child and a psychologist involving several tasks. Specifically, we derive prosodic cues that capture engagement levels across the various tasks. Our experiments suggest that a child's engagement is reflected not only in the vocalizations, but also in the speech of the interacting psychologist. Moreover, we show that prosodic cues are informative of the engagement phenomena not only as characterized over the entire task (i.e., global cues), but also in short term patterns (i.e., local cues). We perform a classification experiment assigning the engagement of a child into three discrete levels achieving an unweighted average recall of 55.8% (chance is 33.3%). While the systems using global cues and local level cues are each statistically significant in predicting engagement, we obtain the best results after fusing these two components. We perform further analysis of the cues at local and global levels to achieve insights linking specific prosodic patterns to the engagement phenomenon. We observe that while the performance of our model varies with task setting and interacting psychologist, there exist universal prosodic patterns reflective of engagement.

  14. Global fund financing of tuberculosis services delivery in prisons.

    PubMed

    Lee, Donna; Lal, S S; Komatsu, Ryuichi; Zumla, Alimuddin; Atun, Rifat

    2012-05-15

    Despite concerted efforts to scale up tuberculosis control with large amounts of international financing in the last 2 decades, tuberculosis continues to be a social issue affecting the world's most marginalized and disadvantaged communities. This includes prisoners, estimated at about 10 million globally, for whom tuberculosis is a leading cause of mortality and morbidity. The Global Fund to Fight AIDS, Tuberculosis and Malaria has emerged as the single largest international donor for tuberculosis control, including funding support in delivering tuberculosis treatment for the confined population. The Global Fund grants database, with an aggregate approved investment of $21.7 billion in 150 countries by the end of 2010, was reviewed to identify tuberculosis and human immunodeficiency virus/tuberculosis grants and activities that monitored the delivery of tuberculosis treatment and support activities in penitentiary settings. The distribution and trend of number of countries with tuberculosis prison support was mapped by year, geographic region, tuberculosis or multidrug-resistant tuberculosis burden, and prison population rate. We examined the types of grant recipients managing program delivery, their performance, and the nature and range of services provided. Fifty-three of the 105 countries (50%) with Global Fund-supported tuberculosis programs delivered services within prison settings. Thirty-two percent (73 of 228) of tuberculosis grants, representing $558 million of all disbursements of Global Fund tuberculosis support by the end of 2010, included output indicators related to tuberculosis services delivered in prisons. Nearly two-thirds (64%) of these grants were implemented by governments, with the remaining by civil society and other partners. In terms of services, half (36 of 73) of grants provided diagnosis and treatment and an additional 27% provided screening and monitoring of tuberculosis for prisoners. 
The range of services tracked was limited in scope and scale, with 69% offering only 1 type of service and less than one-fifth offering 2 types of service. This study is a preliminary attempt to examine Global Fund investments in the fight against tuberculosis in prison settings. Tuberculosis services delivered in prisons have increased in the last decade, but systematic information on funding levels and gaps, services provided, and cost-effective delivery models for delivering tuberculosis services in prisons are lacking.

  15. The Prefrontal Model Revisited: Double Dissociations Between Young Sleep Deprived and Elderly Subjects on Cognitive Components of Performance

    PubMed Central

    Tucker, Adrienne M.; Stern, Yaakov; Basner, Robert C.; Rakitin, Brian C.

    2011-01-01

    Study Objectives: The prefrontal model suggests that total sleep deprivation (TSD) and healthy aging produce parallel cognitive deficits. Here we decompose global performance on two common tasks into component measures of specific cognitive processes to pinpoint the source of impairments in elderly and young TSD participants relative to young controls and to each other. Setting: The delayed letter recognition task (DLR) was performed in 3 studies. The psychomotor vigilance task (PVT) was performed in 1 of the DLR studies and 2 additional studies. Subjects: For DLR, young TSD (n = 20, age = 24.60 ± 0.62 years) and young control (n = 17, age = 24.00 ± 2.42); elderly (n = 26, age = 69.92 ± 1.06). For the PVT, young TSD (n = 18, age = 26.65 ± 4.57) and young control (n = 16, age = 25.19 ± 2.90); elderly (n = 21, age = 71.1 ± 4.92). Measurements and Results: Both elderly and young TSD subjects displayed impaired reaction time (RT), our measure of global performance, on both tasks relative to young controls. After decomposing global performance on the DLR, however, a double dissociation was observed as working memory scanning speed was impaired only in elderly subjects while other components of performance were impaired only by TSD. Similarly, for the PVT a second double dissociation was observed as vigilance impairments were present only in TSD while short-term response preparation effects were altered only in the elderly. Conclusions: The similarity between TSD and the elderly in impaired performance was evident only when examining global RT. In contrast, when specific cognitive components were examined double dissociations were observed between TSD and elderly subjects. This demonstrates the heterogeneity in those cognitive processes impaired in TSD versus the elderly. Citation: Tucker AM; Stern Y; Basner RC; Rakitin BC. The prefrontal model revisited: double dissociations between young sleep deprived and elderly subjects on cognitive components of performance. 
SLEEP 2011;34(8):1039-1050. PMID:21804666

  16. Progressive learning in endoscopy simulation training improves clinical performance: a blinded randomized trial.

    PubMed

    Grover, Samir C; Scaffidi, Michael A; Khan, Rishad; Garg, Ankit; Al-Mazroui, Ahmed; Alomani, Tareq; Yu, Jeffrey J; Plener, Ian S; Al-Awamy, Mohamed; Yong, Elaine L; Cino, Maria; Ravindran, Nikila C; Zasowski, Mark; Grantcharov, Teodor P; Walsh, Catharine M

    2017-11-01

    A structured comprehensive curriculum (SCC) that uses simulation-based training (SBT) can improve clinical colonoscopy performance. This curriculum may be enhanced through the application of progressive learning, a training strategy centered on incrementally challenging learners. We aimed to determine whether a progressive learning-based curriculum (PLC) would lead to superior clinical performance compared with an SCC. This was a single-blinded randomized controlled trial conducted at a single academic center. Thirty-seven novice endoscopists were recruited and randomized to either a PLC (n = 18) or to an SCC (n = 19). The PLC comprised 6 hours of SBT, which progressed in complexity and difficulty. The SCC included 6 hours of SBT, with cases of random order of difficulty. Both groups received expert feedback and 4 hours of didactic teaching. Participants were assessed at baseline, immediately after training, and 4 to 6 weeks after training. The primary outcome was participants' performance during their first 2 clinical colonoscopies, as assessed by using the Joint Advisory Group Direct Observation of Procedural Skills assessment tool (JAG DOPS). Secondary outcomes were differences in endoscopic knowledge, technical and communication skills, and global performance in the simulated setting. The PLC group outperformed the SCC group during first and second clinical colonoscopies, measured by JAG DOPS (P < .001). Additionally, the PLC group had superior technical and communication skills and global performance in the simulated setting (P < .05). There were no differences between groups in endoscopic knowledge (P > .05). Our findings demonstrate the superiority of a PLC for endoscopic simulation, compared with an SCC. Challenging trainees progressively is a simple, theory-based approach to simulation whereby the performance of clinical colonoscopies can be improved. (Clinical trial registration number: NCT02000180.)
Copyright © 2017 American Society for Gastrointestinal Endoscopy. Published by Elsevier Inc. All rights reserved.

  17. Decomposed direct matrix inversion for fast non-cartesian SENSE reconstructions.

    PubMed

    Qian, Yongxian; Zhang, Zhenghui; Wang, Yi; Boada, Fernando E

    2006-08-01

    A new k-space direct matrix inversion (DMI) method is proposed here to accelerate non-Cartesian SENSE reconstructions. In this method a global k-space matrix equation is established on basic MRI principles, and the inverse of the global encoding matrix is found from a set of local matrix equations by taking advantage of the small extension of k-space coil maps. The DMI algorithm's efficiency is achieved by reloading the precalculated global inverse when the coil maps and trajectories remain unchanged, such as in dynamic studies. Phantom and human subject experiments were performed on a 1.5T scanner with a standard four-channel phased-array cardiac coil. Interleaved spiral trajectories were used to collect fully sampled and undersampled 3D raw data. The equivalence of the global k-space matrix equation to its image-space version was verified via conjugate gradient (CG) iterative algorithms on 2x undersampled phantom and numerical-model data sets. When applied to the 2x undersampled phantom and human-subject raw data, the decomposed DMI method produced images with small errors (≤ 3.9%) relative to the reference images obtained from the fully sampled data, at a rate of 2 s per slice (excluding 4 min for precalculating the global inverse at an image size of 256 × 256). The DMI method may be useful for noise evaluations in parallel coil designs, dynamic MRI, and 3D sodium MRI with fixed coils and trajectories. Copyright 2006 Wiley-Liss, Inc.
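    The efficiency argument in this abstract is that inverting the encoding operator is expensive but reusable. The toy sketch below shows only that economics on a small dense system; the actual DMI method works locally in k-space and exploits the small extension of coil maps, which this stand-in does not:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a SENSE-style encoding operator: maps an image vector to
# multi-coil k-space samples (dense and random here, purely illustrative).
n_samples, n_pixels = 48, 32
E = (rng.standard_normal((n_samples, n_pixels))
     + 1j * rng.standard_normal((n_samples, n_pixels)))

# Precompute the pseudoinverse once while coil maps/trajectories are fixed...
E_pinv = np.linalg.pinv(E)

# ...then each dynamic frame costs only one matrix-vector product.
truth = rng.standard_normal(n_pixels)
data = E @ truth
recon = E_pinv @ data
print(np.max(np.abs(recon - truth)))  # near machine precision for this toy system
```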

  18. A framework for global river flood risk assessments

    NASA Astrophysics Data System (ADS)

    Winsemius, H. C.; Van Beek, L. P. H.; Jongman, B.; Ward, P. J.; Bouwman, A.

    2013-05-01

    There is an increasing need for strategic global assessments of flood risks in current and future conditions. In this paper, we propose a framework for global flood risk assessment for river floods, which can be applied in current conditions, as well as in future conditions due to climate and socio-economic changes. The framework's goal is to establish flood hazard and impact estimates at a high enough resolution to allow for their combination into a risk estimate, which can be used for strategic global flood risk assessments. The framework estimates hazard at a resolution of ~1 km² using global forcing datasets of the current (or, in scenario mode, future) climate, a global hydrological model, a global flood-routing model, and, most importantly, an inundation downscaling routine. The second component of the framework combines hazard with flood impact models at the same resolution (e.g. damage, affected GDP, and affected population) to establish indicators for flood risk (e.g. annual expected damage, affected GDP, and affected population). The framework has been applied using the global hydrological model PCR-GLOBWB, which includes an optional global flood-routing model, DynRout, combined with scenarios from the Integrated Model to Assess the Global Environment (IMAGE). We performed downscaling of the hazard probability distributions to 1 km² resolution with a new downscaling algorithm, applied to Bangladesh as a first case study application area. We demonstrate the risk assessment approach in Bangladesh based on GDP per capita data, population, and land use maps for 2010 and 2050. Validation of the hazard estimates has been performed using the Dartmouth Flood Observatory database. This was done by comparing a high return period flood with the maximum observed extent, as well as by comparing a time series of a single event with Dartmouth imagery of the event.
Validation of modelled damage estimates was performed using observed damage estimates from the EM-DAT database and World Bank sources. We discuss and show sensitivities of the estimated risks with regard to the use of different climate input sets, decisions made in the downscaling algorithm, and different approaches to establish impact models.
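    The "annual expected damage" risk indicator mentioned above is conventionally obtained by integrating damage over annual exceedance probability (p = 1/T for return period T). A minimal sketch with hypothetical numbers, not the paper's Bangladesh results:

```python
import numpy as np

def expected_annual_damage(return_periods, damages):
    """Collapse a hazard-damage curve into annual expected damage by
    integrating damage over annual exceedance probability p = 1/T
    with the trapezoidal rule."""
    p = 1.0 / np.asarray(return_periods, float)
    d = np.asarray(damages, float)
    order = np.argsort(p)
    p, d = p[order], d[order]
    return float(np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p)))

# Hypothetical damages (million USD) at 10-, 100- and 1000-year return periods.
ead = expected_annual_damage([10, 100, 1000], [5.0, 60.0, 200.0])
print(ead)  # ~4.1 million USD per year
```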

  19. From a declaration of values to the creation of value in global health: a report from Harvard University's Global Health Delivery Project.

    PubMed

    Kim, J Y; Rhatigan, J; Jain, S H; Weintraub, R; Porter, M E

    2010-01-01

    To make best use of the new dollars available for the treatment of disease in resource-poor settings, global health practice requires a strategic approach that emphasises value for patients. Practitioners and global health academics should seek to identify and elaborate the set of factors that drives value for patients through the detailed study of actual care delivery organisations in multiple settings. Several frameworks can facilitate this study, including the care delivery value chain. We report on our efforts to catalyse the study of health care delivery in resource-limited settings in the hope that this inquiry will lead to insights that can improve the health of the neediest worldwide.

  20. A Voxel-Based Filtering Algorithm for Mobile LiDAR Data

    NASA Astrophysics Data System (ADS)

    Qin, H.; Guan, G.; Yu, Y.; Zhong, L.

    2018-04-01

    This paper presents a stepwise voxel-based filtering algorithm for mobile LiDAR data. In the first step, to improve computational efficiency, the mobile LiDAR points are partitioned in the xy-plane into a set of two-dimensional (2-D) blocks of a given block size, in each of which all laser points are further organized into an octree partition structure with a set of three-dimensional (3-D) voxels. Then, a voxel-based upward-growing process is performed to roughly separate terrain from non-terrain points using global and local terrain thresholds. In the second step, the extracted terrain points are refined by computing voxel curvatures. The voxel-based filtering algorithm is comprehensively discussed in analyses of parameter sensitivity and overall performance. An experimental study performed on multiple point cloud samples, collected by different commercial mobile LiDAR systems, showed that the proposed algorithm provides a promising solution to terrain point extraction from mobile point clouds.
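    The first stage of such a pipeline, binning points into voxels by integer index, can be sketched as follows (a hypothetical minimal version; the paper's octree organization, upward-growing separation and curvature refinement are not shown):

```python
import numpy as np

def voxelize(points, voxel_size):
    """Group 3-D points into voxel cells by integer index;
    returns a dict mapping (i, j, k) -> list of point indices."""
    keys = np.floor(np.asarray(points, float) / voxel_size).astype(int)
    voxels = {}
    for idx, key in enumerate(map(tuple, keys)):
        voxels.setdefault(key, []).append(idx)
    return voxels

pts = np.array([[0.1, 0.2, 0.05], [0.12, 0.18, 0.07], [2.0, 2.0, 2.0]])
v = voxelize(pts, voxel_size=0.5)
print(len(v))  # 2 occupied voxels: the first two points share a cell
```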

  1. Peritoneal Dialysis in Austere Environments: An Emergent Approach to Renal Failure Management

    PubMed Central

    Gorbatkin, Chad; Finkelstein, Fredric O.; Gorbatkin, Steven M.

    2018-01-01

    Peritoneal dialysis (PD) is a means of renal replacement therapy (RRT) that can be performed in remote settings with limited resources, including regions that lack electrical power. PD is a mainstay of end-stage renal disease (ESRD) therapy worldwide, and the ease of initiation and maintenance has enabled it to flourish in both resource-limited and resource-abundant settings. In natural disaster scenarios, military conflicts, and other austere areas, PD may be the only available life-saving measure for acute kidney injury (AKI) or ESRD. PD in austere environments is not without challenges, including catheter placement, availability of dialysate, and medical complications related to the procedure itself. However, when hemodialysis is unavailable, PD can be performed using generally available medical supplies including sterile tubing and intravenous fluids. Amidst the ever-increasing global burden of ESRD and AKI, the ability to perform PD is essential for many medical facilities. PMID:29760854

  2. Statistical analysis of global horizontal solar irradiation GHI in Fez city, Morocco

    NASA Astrophysics Data System (ADS)

    Bounoua, Z.; Mechaqrane, A.

    2018-05-01

    Accurate knowledge of the solar energy reaching the ground is necessary for sizing solar installations and optimizing their performance. This paper describes a statistical analysis of the global horizontal solar irradiation (GHI) at Fez city, Morocco. For better reliability, we first applied a set of check procedures to test the quality of the hourly GHI measurements, eliminating erroneous values, which are generally due to measurement or cosine-effect errors. The statistical analysis shows that the annual mean daily value of GHI is approximately 5 kWh/m²/day. Monthly mean daily values and other parameters are also calculated.
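    The aggregation from quality-controlled hourly GHI to a mean daily total can be sketched as follows (hypothetical data; the paper's quality checks are more elaborate than the single plausibility bound used here):

```python
import numpy as np

# Hypothetical hourly GHI (W/m^2) for two 12-hour daylight records; with
# hourly sampling, each value also equals the energy in Wh/m^2 for that hour.
hourly = np.array([0, 0, 50, 300, 600, 800, 900, 800, 600, 300, 50, 0] * 2, float)

# Crude quality check: reject negative values and values above the solar constant.
valid = hourly[(hourly >= 0) & (hourly <= 1367)]

daily_kwh = valid.reshape(2, 12).sum(axis=1) / 1000.0  # Wh/m^2 -> kWh/m^2 per day
print(daily_kwh.mean())  # mean daily GHI in kWh/m^2/day
```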

  3. Diffraction-geometry refinement in the DIALS framework

    DOE PAGES

    Waterman, David G.; Winter, Graeme; Gildea, Richard J.; ...

    2016-03-30

    Rapid data collection and modern computing resources provide the opportunity to revisit the task of optimizing the model of diffraction geometry prior to integration. A comprehensive description is given of new software that builds upon established methods by performing a single global refinement procedure, utilizing a smoothly varying model of the crystal lattice where appropriate. This global refinement technique extends to multiple data sets, providing useful constraints to handle the problem of correlated parameters, particularly for small wedges of data. Examples of advanced uses of the software are given and the design is explained in detail, with particular emphasis on the flexibility and extensibility it entails.

  4. Fast Gaussian kernel learning for classification tasks based on specially structured global optimization.

    PubMed

    Zhong, Shangping; Chen, Tianshun; He, Fengying; Niu, Yuzhen

    2014-09-01

    For a practical pattern classification task solved by kernel methods, the computing time is mainly spent on kernel learning (or training). However, current kernel learning approaches are based on local optimization techniques and struggle to achieve good time performance, especially for large datasets, so the existing algorithms cannot easily be extended to large-scale tasks. In this paper, we present a fast Gaussian kernel learning method that solves a specially structured global optimization (SSGO) problem. We optimize the Gaussian kernel function using the formulated kernel target alignment criterion, which is a difference of increasing (d.i.) functions. Through a power-transformation-based convexification method, the objective criterion can be represented as a difference of convex (d.c.) functions with a fixed power-transformation parameter, and the objective programming problem can then be converted to an SSGO problem: globally minimizing a concave function over a convex set. The SSGO problem is classical and has good solvability. Thus, to find the global optimal solution efficiently, we can adopt the improved Hoffman's outer approximation method, which need not repeat the searching procedure with different starting points to locate the best local minimum. The proposed method can also be proven to converge to the global solution for any classification task. We evaluate the proposed method on twenty benchmark datasets and compare it with four other Gaussian kernel learning methods. Experimental results show that the proposed method stably achieves both good time-efficiency and good classification performance. Copyright © 2014 Elsevier Ltd. All rights reserved.
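    The kernel target alignment criterion that the method optimizes measures how well a kernel matrix matches the label structure. A sketch of its evaluation for one fixed Gaussian width (the paper's global optimization over the kernel parameter itself is not reproduced here; data are illustrative):

```python
import numpy as np

def kernel_target_alignment(K, y):
    """Alignment <K, yy^T>_F / (||K||_F * ||yy^T||_F): a cosine-like score,
    larger when the kernel matrix matches the +/-1 label structure."""
    y = np.asarray(y, float)
    Y = np.outer(y, y)
    return float(np.sum(K * Y) / (np.linalg.norm(K) * np.linalg.norm(Y)))

def gaussian_kernel(X, gamma):
    """K_ij = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

# Two well-separated classes in one dimension:
X = np.array([[0.0], [0.1], [5.0], [5.1]])
y = np.array([1, 1, -1, -1])
alignment = kernel_target_alignment(gaussian_kernel(X, gamma=1.0), y)
print(alignment)  # close to its maximum when the kernel separates the classes
```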

  5. The Uses of Globalization in the (Shifting) Landscape of Educational Studies

    ERIC Educational Resources Information Center

    Tarc, Paul

    2012-01-01

    The term "globalization" does more than represent a set of material (and ideological) processes that have impacts on education and schooling. Additionally, "globalization" operates as a conceptual lens or set of interventions, which is significantly impacting academic discourses in Education and in other disciplines. Not only…

  6. Setting the Stage: Global Competition in Higher Education

    ERIC Educational Resources Information Center

    Bagley, Sylvia S.; Portnoi, Laura M.

    2014-01-01

    In this chapter, the issue editors set the stage for the chapters that follow by delineating recent developments in higher education and common strategies for creating globally competitive higher education institutions. The editors consider social justice concerns that arise with global competition and contend that contextualized priorities can…

  7. A global data set of soil hydraulic properties and sub-grid variability of soil water retention and hydraulic conductivity curves

    NASA Astrophysics Data System (ADS)

    Montzka, Carsten; Herbst, Michael; Weihermüller, Lutz; Verhoef, Anne; Vereecken, Harry

    2017-07-01

    Agroecosystem models, regional and global climate models, and numerical weather prediction models require adequate parameterization of soil hydraulic properties. These properties are fundamental for describing and predicting water and energy exchange processes at the transition zone between solid earth and atmosphere, and they regulate evapotranspiration, infiltration and runoff generation. Hydraulic parameters describing the soil water retention curve (WRC) and hydraulic conductivity curve (HCC) are typically derived from soil texture via pedotransfer functions (PTFs). Resampling of those parameters for specific model grids is typically performed by different aggregation approaches such as spatial averaging and the use of dominant textural properties or soil classes. These aggregation approaches introduce uncertainty, bias and parameter inconsistencies across spatial scales due to nonlinear relationships between hydraulic parameters and soil texture. Therefore, we present a method to scale hydraulic parameters to individual model grids and provide a global data set that overcomes these problems. The approach is based on Miller-Miller scaling in the relaxed form by Warrick, which fits the parameters of the WRC through all sub-grid WRCs to provide an effective parameterization for the grid cell at model resolution; at the same time it preserves information on the sub-grid variability of the water retention curve by deriving local scaling parameters. Based on the Mualem-van Genuchten approach we also derive the unsaturated hydraulic conductivity from the water retention functions, assuming that the local parameters are also valid for this function. In addition, via the Warrick scaling parameter λ, information on global sub-grid scaling variance is given that enables modellers to improve dynamical downscaling of (regional) climate models or to perturb hydraulic parameters for model ensemble output generation.
The present analysis is based on the ROSETTA PTF of Schaap et al. (2001) applied to the SoilGrids1km data set of Hengl et al. (2014). The example data set is provided at a global resolution of 0.25° at https://doi.org/10.1594/PANGAEA.870605.
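    The van Genuchten form of the water retention curve underlying this data set can be sketched as follows (parameter values here are illustrative loam-like numbers, not taken from the data set):

```python
import numpy as np

def van_genuchten_wrc(h, theta_r, theta_s, alpha, n):
    """Volumetric water content theta(h) for pressure head h (cm):
    theta = theta_r + (theta_s - theta_r) * [1 + (alpha*|h|)^n]^(-m),
    with m = 1 - 1/n (Mualem constraint)."""
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * np.abs(h)) ** n) ** (-m)  # effective saturation
    return theta_r + (theta_s - theta_r) * se

# Illustrative parameters: theta_r, theta_s, alpha (1/cm), n.
h = np.array([0.0, -100.0, -15000.0])  # saturation, moist, near wilting point
theta = van_genuchten_wrc(h, 0.05, 0.43, 0.036, 1.56)
print(theta)  # monotonically decreasing from theta_s toward theta_r
```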

  8. Promoting medical students' reflection on competencies to advance a global health equities curriculum.

    PubMed

    Mullan, Patricia B; Williams, Joy; Malani, Preeti N; Riba, Michelle; Haig, Andrew; Perry, Julie; Kolars, Joseph C; Mangrulkar, Rajesh; Williams, Brent

    2014-05-03

    The move to frame medical education in terms of competencies - the extent to which trainees "can do" a professional responsibility - is congruent with calls for accountability in medical education. However, the focus on competencies might be a poor fit with curricula intended to prepare students for responsibilities not emphasized in traditional medical education. This study examines an innovative approach to the use of potential competency expectations related to advancing global health equity to promote students' reflections and to inform curriculum development. In 2012, 32 medical students were admitted into a newly developed Global Health and Disparities (GHD) Path of Excellence. The GHD program takes the form of mentored co-curricular activities built around defined competencies related to professional development and leadership skills intended to ameliorate health disparities in medically underserved settings, both domestically and globally. Students reviewed the GHD competencies from two perspectives: a) their ability to perform the identified competencies that they perceived themselves as holding as they began the GHD program and b) the extent to which they perceived that their future career would require these responsibilities. For both sets of assessments the response scale ranged from "Strongly Disagree" to "Strongly Agree." Wilcoxon's paired T-tests compared individual students' ordinal rating of their current level of ability to their perceived need for competence that they anticipated their careers would require. Statistical significance was set at p < .01. Students' ratings ranged from "strongly disagree" to "strongly agree" that they could perform the defined GHD-related competencies. However, on most competencies, at least 50 % of students indicated that the stated competencies were beyond their present ability level. 
For each competency, the results of the Wilcoxon signed-rank tests indicate - at statistically significant levels - that students perceive a greater need for the GHD-program-defined competencies in their careers than the ability they currently possess. This study suggests congruence between student and program perceptions of the scope of practice required for GHD. Students report the need for enhanced skill levels in the careers they anticipate. This approach to formulating and reflecting on competencies will guide the program's design of learning experiences aligned with students' career goals.
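
    The paired, non-parametric comparison described above can be sketched in a few lines. The ratings below are invented placeholders on a hypothetical 5-point scale (1 = strongly disagree, 5 = strongly agree), not the study's data:

```python
# Paired comparison of current ability vs. anticipated career need,
# using a Wilcoxon signed-rank test (appropriate for ordinal ratings).
# All rating values here are illustrative assumptions.
from scipy.stats import wilcoxon

current_ability = [2, 3, 2, 4, 1, 3, 2, 2, 3, 2]
career_need     = [4, 5, 4, 4, 3, 5, 4, 3, 5, 4]

stat, p = wilcoxon(current_ability, career_need)
print(f"W = {stat}, p = {p:.4f}")
if p < 0.01:  # the study's significance threshold
    print("Perceived career need exceeds current ability")
```

    With ratings that consistently favor career need, the test statistic collapses toward zero and the p-value falls below the study's threshold.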

  9. Power in global health agenda-setting: the role of private funding Comment on "Knowledge, moral claims and the exercise of power in global health".

    PubMed

    Levine, Ruth E

    2015-03-04

    The editorial by Jeremy Shiffman, "Knowledge, moral claims and the exercise of power in global health", highlights the influence on global health priority-setting of individuals and organizations that do not have a formal political mandate. This sheds light on the way key functions in global health depend on private funding, particularly from the Bill & Melinda Gates Foundation. © 2015 by Kerman University of Medical Sciences.

  10. Application of a derivative-free global optimization algorithm to the derivation of a new time integration scheme for the simulation of incompressible turbulence

    NASA Astrophysics Data System (ADS)

    Alimohammadi, Shahrouz; Cavaglieri, Daniele; Beyhaghi, Pooriya; Bewley, Thomas R.

    2016-11-01

This work applies a recently developed derivative-free optimization algorithm to derive a new mixed implicit-explicit (IMEX) time integration scheme for computational fluid dynamics (CFD) simulations. The algorithm allows a specified order of accuracy for the time integration, together with other important stability properties, to be imposed as nonlinear constraints within the optimization problem. In this procedure, the coefficients of the IMEX scheme must satisfy a set of constraints simultaneously. At each iteration, the optimization process therefore estimates the location of the optimal coefficients using a set of global surrogates, for both the objective and constraint functions, as well as a model of the uncertainty of these surrogates based on the concept of Delaunay triangulation. This procedure has been proven to converge to the global minimum of the constrained optimization problem provided the constraints and objective functions are twice differentiable. As a result, a new third-order, low-storage IMEX Runge-Kutta time integration scheme is obtained with remarkably fast convergence. Numerical tests on turbulent channel flow simulations then validate the theoretical order of accuracy and stability properties of the new scheme.
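
    To make the implicit-explicit splitting concrete, here is a minimal sketch of first-order IMEX Euler on a scalar model problem u' = lam*u + f(t), treating the stiff linear term implicitly and the forcing explicitly. This toy scheme is illustrative only; the paper derives an optimized third-order, low-storage IMEX Runge-Kutta scheme, not this one:

```python
# First-order IMEX Euler: implicit in the stiff term lam*u,
# explicit in the non-stiff forcing f(t).
import math

def imex_euler(lam, f, u0, dt, nsteps):
    """Advance u' = lam*u + f(t) with an IMEX-split Euler step."""
    u, t = u0, 0.0
    for _ in range(nsteps):
        # (u_new - u) / dt = lam*u_new + f(t)  ->  solve for u_new
        u = (u + dt * f(t)) / (1.0 - dt * lam)
        t += dt
    return u

# Stiff decay with forcing; remains stable even when dt >> 1/|lam|.
u_end = imex_euler(lam=-1e4, f=lambda t: math.cos(t), u0=1.0, dt=0.01, nsteps=100)
print(u_end)
```

    The implicit treatment of the stiff term is what permits time steps far larger than the explicit stability limit, which is the motivation for IMEX schemes in incompressible turbulence simulation.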

  11. Setting Priorities in Global Child Health Research Investments: Guidelines for Implementation of the CHNRI Method

    PubMed Central

    Rudan, Igor; Gibson, Jennifer L.; Ameratunga, Shanthi; El Arifeen, Shams; Bhutta, Zulfiqar A.; Black, Maureen; Black, Robert E.; Brown, Kenneth H.; Campbell, Harry; Carneiro, Ilona; Chan, Kit Yee; Chandramohan, Daniel; Chopra, Mickey; Cousens, Simon; Darmstadt, Gary L.; Gardner, Julie Meeks; Hess, Sonja Y.; Hyder, Adnan A.; Kapiriri, Lydia; Kosek, Margaret; Lanata, Claudio F.; Lansang, Mary Ann; Lawn, Joy; Tomlinson, Mark; Tsai, Alexander C.; Webster, Jayne

    2008-01-01

This article provides detailed guidelines for implementing a systematic method for setting priorities in health research investments that was recently developed by the Child Health and Nutrition Research Initiative (CHNRI). The target audience for the proposed method is international agencies, large research funding donors, and national governments and policy-makers. The process has the following steps: (i) selecting the managers of the process; (ii) specifying the context and risk management preferences; (iii) discussing criteria for setting health research priorities; (iv) choosing a limited set of the most useful and important criteria; (v) developing means to assess the likelihood that proposed health research options will satisfy the selected criteria; (vi) systematically listing a large number of proposed health research options; (vii) pre-scoring checks of all competing health research options; (viii) scoring of health research options using the chosen set of criteria; (ix) calculating intermediate scores for each health research option; (x) obtaining further input from the stakeholders; (xi) adjusting intermediate scores to take into account the values of stakeholders; (xii) calculating overall priority scores and assigning ranks; (xiii) performing an analysis of agreement between the scorers; (xiv) linking computed research priority scores with investment decisions; and (xv) feedback and revision. The CHNRI method is a flexible process that enables prioritization of health research investments at any level: institutional, regional, national, international, or global. PMID:19090596
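
    Steps (viii)-(xii) of the process amount to scoring options against criteria, weighting by stakeholder values, and ranking. A hypothetical sketch of that arithmetic follows; the option names, criterion scores, and weights are all invented for illustration:

```python
# Toy CHNRI-style scoring: criterion scores per research option are
# combined using stakeholder-derived criterion weights, then ranked.
options = {
    "option_A": {"answerability": 0.9, "effectiveness": 0.6, "equity": 0.7},
    "option_B": {"answerability": 0.5, "effectiveness": 0.8, "equity": 0.9},
    "option_C": {"answerability": 0.7, "effectiveness": 0.7, "equity": 0.4},
}
# Stakeholder values expressed as relative weights (step xi), summing to 1.
weights = {"answerability": 0.2, "effectiveness": 0.5, "equity": 0.3}

def priority_score(scores, weights):
    """Overall priority score: weighted sum over the chosen criteria."""
    return sum(weights[c] * scores[c] for c in weights)

ranked = sorted(options, key=lambda o: priority_score(options[o], weights),
                reverse=True)
for rank, name in enumerate(ranked, 1):
    print(rank, name, round(priority_score(options[name], weights), 3))
```

    Real CHNRI applications add pre-scoring checks, agreement analysis among scorers, and feedback loops around this core calculation.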

  12. Accuracy Evaluation of Two Global Land Cover Data Sets Over Wetlands of China

    NASA Astrophysics Data System (ADS)

    Niu, Z. G.; Shan, Y. X.; Gong, P.

    2012-07-01

Although wetlands are well known as one of the most important ecosystems in the world, there are still few global wetland mapping efforts at present. To evaluate the accuracy of the wetland-related types in both the Global Land Cover 2000 (GLC2000) data set and the MODIS land cover data set (MOD12Q1), we used the China wetland map of 2000, which was interpreted manually from Landsat TM images, to examine the precision of these global land cover data sets across China from two aspects: class area accuracy and spatial agreement. The results show that the area consistency coefficients of wetland-related types between the two global data sets and the reference data are 77.27% and 56.85%, respectively. However, based on the confusion matrix of spatial consistency, the overall accuracy of the relevant wetland types in GLC2000 is only 19.81%, and that of MOD12Q1 is merely 18.91%. Furthermore, the accuracy of the peatlands is much lower than that of the water bodies according to the results of per-pixel comparison. The categories in which errors occurred frequently mainly include grasslands, croplands, bare lands and parts of woodland (deciduous coniferous forest, deciduous broadleaf forest and open shrubland). The possible reasons for the low precision of wetland-related land cover types include (1) the different aims of the various products and therefore the inconsistent wetland definitions in their classification systems; (2) the coarse spatial resolution of the satellite images used in the global data sets; and (3) discrepancies in the dates when images were acquired between the global data sets and the reference data. Overall, the unsatisfactory results highlight that more attention should be paid when applying these two global data products, especially for wetland-relevant types across China.
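
    The per-pixel spatial agreement reported above rests on a confusion matrix between the reference map and each product, with overall accuracy as the diagonal fraction. A minimal sketch, using tiny invented arrays in place of the real rasters:

```python
# Confusion-matrix overall accuracy for a per-pixel map comparison.
# Class codes and pixel values are invented stand-ins for real rasters.
import numpy as np

reference = np.array([0, 0, 1, 1, 2, 2, 2, 1])   # reference classes per pixel
product   = np.array([0, 1, 1, 2, 2, 0, 2, 1])   # land cover product classes

n_classes = 3
cm = np.zeros((n_classes, n_classes), dtype=int)
for r, p in zip(reference, product):
    cm[r, p] += 1   # rows: reference, columns: product

overall_accuracy = np.trace(cm) / cm.sum()
print(cm)
print(f"overall accuracy: {overall_accuracy:.2%}")
```

    Per-class (e.g., peatland vs. water body) accuracies come from the same matrix by normalizing each row or column instead of taking the global diagonal fraction.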

  13. Hybrid-view programming of nuclear fusion simulation code in the PGAS parallel programming language XcalableMP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsugane, Keisuke; Boku, Taisuke; Murai, Hitoshi

Recently, the Partitioned Global Address Space (PGAS) parallel programming model has emerged as a usable distributed memory programming model. XcalableMP (XMP) is a PGAS parallel programming language that extends base languages such as C and Fortran with directives in OpenMP-like style. XMP supports a global-view model that allows programmers to define global data and to map them to a set of processors, which execute the distributed global data as a single thread. In XMP, the concept of a coarray is also employed for local-view programming. In this study, we port Gyrokinetic Toroidal Code - Princeton (GTC-P), which is a three-dimensional gyrokinetic PIC code developed at Princeton University to study the microturbulence phenomenon in magnetically confined fusion plasmas, to XMP as an example of hybrid memory model coding with the global-view and local-view programming models. In local-view programming, the coarray notation is simple and intuitive compared with Message Passing Interface (MPI) programming while the performance is comparable to that of the MPI version. Thus, because the global-view programming model is suitable for expressing the data parallelism for a field of grid space data, we implement a hybrid-view version using a global-view programming model to compute the field and a local-view programming model to compute the movement of particles. Finally, the performance is degraded by 20% compared with the original MPI version, but the hybrid-view version facilitates more natural data expression for static grid space data (in the global-view model) and dynamic particle data (in the local-view model), and it also increases the readability of the code for higher productivity.

  14. Forest, Trees, Dynamics: Results from a Novel Wisconsin Card Sorting Test Variant Protocol for Studying Global-Local Attention and Complex Cognitive Processes

    PubMed Central

    Cowley, Benjamin; Lukander, Kristian

    2016-01-01

Background: Recognition of objects and their context relies heavily on the integrated functioning of global and local visual processing. In a realistic setting such as work, this processing becomes a sustained activity, implying a consequent interaction with executive functions. Motivation: There have been many studies of either global-local attention or executive functions; however, it is relatively novel to combine these processes to study a more ecological form of attention. We aim to explore the phenomenon of global-local processing during a task requiring sustained attention and working memory. Methods: We develop and test a novel protocol for global-local dissociation, with a task structure including phases of divided (“rule search”) and selective (“rule found”) attention, based on the Wisconsin Card Sorting Task (WCST). We test it in a laboratory study with 25 participants, and report on behavioral measures (physiological data were also gathered, but are not reported here). We develop novel stimuli with more naturalistic levels of information and noise, based primarily on face photographs, with consequently greater ecological validity. Results: We report behavioral results indicating that sustained difficulty while participants test their hypotheses impacts matching-task performance and diminishes the global precedence effect. Results also show a dissociation between subjectively experienced difficulty and objective dimensions of performance, and establish the internal validity of the protocol. Contribution: We contribute an advance in the state of the art for testing global-local attention processes in concert with complex cognition. With three results we establish a connection between global-local dissociation and aspects of complex cognition. Our protocol also improves ecological validity and opens options for testing additional interactions in future work. PMID:26941689

  15. Hybrid-view programming of nuclear fusion simulation code in the PGAS parallel programming language XcalableMP

    DOE PAGES

    Tsugane, Keisuke; Boku, Taisuke; Murai, Hitoshi; ...

    2016-06-01

Recently, the Partitioned Global Address Space (PGAS) parallel programming model has emerged as a usable distributed memory programming model. XcalableMP (XMP) is a PGAS parallel programming language that extends base languages such as C and Fortran with directives in OpenMP-like style. XMP supports a global-view model that allows programmers to define global data and to map them to a set of processors, which execute the distributed global data as a single thread. In XMP, the concept of a coarray is also employed for local-view programming. In this study, we port Gyrokinetic Toroidal Code - Princeton (GTC-P), which is a three-dimensional gyrokinetic PIC code developed at Princeton University to study the microturbulence phenomenon in magnetically confined fusion plasmas, to XMP as an example of hybrid memory model coding with the global-view and local-view programming models. In local-view programming, the coarray notation is simple and intuitive compared with Message Passing Interface (MPI) programming while the performance is comparable to that of the MPI version. Thus, because the global-view programming model is suitable for expressing the data parallelism for a field of grid space data, we implement a hybrid-view version using a global-view programming model to compute the field and a local-view programming model to compute the movement of particles. Finally, the performance is degraded by 20% compared with the original MPI version, but the hybrid-view version facilitates more natural data expression for static grid space data (in the global-view model) and dynamic particle data (in the local-view model), and it also increases the readability of the code for higher productivity.

  16. A gridded global data set of soil, intact regolith, and sedimentary deposit thicknesses for regional and global land surface modeling

    DOE PAGES

    Pelletier, Jon D.; Broxton, Patrick D.; Hazenberg, Pieter; ...

    2016-01-22

Earth’s terrestrial near-subsurface environment can be divided into relatively porous layers of soil, intact regolith, and sedimentary deposits above unweathered bedrock. Variations in the thicknesses of these layers control the hydrologic and biogeochemical responses of landscapes. Currently, Earth System Models approximate the thickness of these relatively permeable layers above bedrock as uniform globally, despite the fact that their thicknesses vary systematically with topography, climate, and geology. To meet the need for more realistic input data for models, we developed a high-resolution gridded global data set of the average thicknesses of soil, intact regolith, and sedimentary deposits within each 30 arcsec (~1 km) pixel using the best available data for topography, climate, and geology as input. Our data set partitions the global land surface into upland hillslope, upland valley bottom, and lowland landscape components and uses models optimized for each landform type to estimate the thicknesses of each subsurface layer. On hillslopes, the data set is calibrated and validated using independent data sets of measured soil thicknesses from the U.S. and Europe and on lowlands using depth to bedrock observations from groundwater wells in the U.S. As a result, we anticipate that the data set will prove useful as an input to regional and global hydrological and ecosystems models.

  17. A gridded global data set of soil, intact regolith, and sedimentary deposit thicknesses for regional and global land surface modeling

    NASA Astrophysics Data System (ADS)

    Pelletier, Jon D.; Broxton, Patrick D.; Hazenberg, Pieter; Zeng, Xubin; Troch, Peter A.; Niu, Guo-Yue; Williams, Zachary; Brunke, Michael A.; Gochis, David

    2016-03-01

Earth's terrestrial near-subsurface environment can be divided into relatively porous layers of soil, intact regolith, and sedimentary deposits above unweathered bedrock. Variations in the thicknesses of these layers control the hydrologic and biogeochemical responses of landscapes. Currently, Earth System Models approximate the thickness of these relatively permeable layers above bedrock as uniform globally, despite the fact that their thicknesses vary systematically with topography, climate, and geology. To meet the need for more realistic input data for models, we developed a high-resolution gridded global data set of the average thicknesses of soil, intact regolith, and sedimentary deposits within each 30 arcsec (~1 km) pixel using the best available data for topography, climate, and geology as input. Our data set partitions the global land surface into upland hillslope, upland valley bottom, and lowland landscape components and uses models optimized for each landform type to estimate the thicknesses of each subsurface layer. On hillslopes, the data set is calibrated and validated using independent data sets of measured soil thicknesses from the U.S. and Europe and on lowlands using depth to bedrock observations from groundwater wells in the U.S. We anticipate that the data set will prove useful as an input to regional and global hydrological and ecosystems models. This article was corrected on 2 FEB 2016. See the end of the full text for details.

  18. Global health in the European Union – a review from an agenda-setting perspective

    PubMed Central

    Aluttis, Christoph; Krafft, Thomas; Brand, Helmut

    2014-01-01

    This review attempts to analyse the global health agenda-setting process in the European Union (EU). We give an overview of the European perspective on global health, making reference to the developments that led to the EU acknowledging its role as a global health actor. The article thereby focusses in particular on the European interpretation of its role in global health from 2010, which was formalised through, respectively, a European Commission Communication and European Council Conclusions. Departing from there, and based on Kingdon's multiple streams theory on agenda setting, we identify some barriers that seem to hinder the further establishment and promotion of a solid global health agenda in the EU. The main barriers for creating a strong European global health agenda are the fragmentation of the policy community and the lack of a common definition for global health in Europe. Forwarding the agenda in Europe for global health requires more clarification of the common goals and perspectives of the policy community and the use of arising windows of opportunity. PMID:24560264

  19. Distributed Parallel Processing and Dynamic Load Balancing Techniques for Multidisciplinary High Speed Aircraft Design

    NASA Technical Reports Server (NTRS)

    Krasteva, Denitza T.

    1998-01-01

Multidisciplinary design optimization (MDO) for large-scale engineering problems poses many challenges, such as the design of an efficient concurrent paradigm for global optimization based on disciplinary analyses and expensive computations over vast data sets. This work focuses on the application of distributed schemes for massively parallel architectures to MDO problems as a tool for reducing computation time and solving larger problems. The specific problem considered here is configuration optimization of a high speed civil transport (HSCT), and the efficient parallelization of the embedded paradigm for reasonable design space identification. Two distributed dynamic load balancing techniques (random polling and global round robin with message combining) and two necessary termination detection schemes (global task count and token passing) were implemented and evaluated in terms of effectiveness and scalability to large problem sizes and a thousand processors. The effect of certain parameters on execution time was also inspected. Empirical results demonstrated stable performance and effectiveness for all schemes, and the parametric study showed that the selected algorithmic parameters have a negligible effect on performance.
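
    Two of the schemes named above, random polling and global-task-count termination detection, can be illustrated with a toy single-process simulation. This is an invented model of the idea, not the HSCT code or its message-passing implementation:

```python
# Toy simulation of random-polling dynamic load balancing with
# global-task-count termination detection. Worker counts and task
# counts are arbitrary illustrative choices.
import random
from collections import deque

random.seed(0)
n_workers = 4
# All work starts on worker 0; the others must acquire tasks by polling.
queues = [deque(range(10)) if w == 0 else deque() for w in range(n_workers)]
remaining = sum(len(q) for q in queues)   # the global task count
done = 0

while remaining > 0:                      # terminate when the count hits zero
    for w in range(n_workers):
        if queues[w]:
            queues[w].popleft()           # "execute" one task
            done += 1
            remaining -= 1
        else:
            victim = random.randrange(n_workers)   # random polling
            if victim != w and len(queues[victim]) > 1:
                queues[w].append(queues[victim].pop())  # migrate one task

print("tasks completed:", done)
```

    A real distributed implementation replaces the shared counter with messages (or with token passing, the alternative termination scheme the study evaluates), but the control flow is the same: idle workers poll randomly chosen victims until the global count of outstanding tasks reaches zero.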

  20. Differential impacts of global change variables on coastal South Atlantic phytoplankton: Role of seasonal variations.

    PubMed

    Cabrerizo, Marco J; Carrillo, Presentación; Villafañe, Virginia E; Helbling, E Walter

    2017-04-01

Global change is associated with increases in temperature (T), nutrient inputs (Nut) and solar radiation in the water column. To address their joint impact on the net community production (NCP), community respiration (CR) and PSII performance (ΦPSII) of coastal phytoplankton communities from the South Atlantic Ocean over a seasonal succession, we performed a factorial experiment using a 2 × 2 × 2 matrix set-up: with and without UVR, ambient and enriched nutrients, and in situ T and in situ T + 3 °C. The future global change scenario exerted a dual impact, from an enhancement of NCP and ΦPSII during the pre-bloom to an inhibition of both processes towards the bloom period, when the in situ T and irradiances were lower and the community was dominated by diatoms. The increased inhibition of NCP and ΦPSII during the most productive stage of the annual succession could significantly alter the CO2-sink capacity of coastal areas in the future. Copyright © 2017 Elsevier Ltd. All rights reserved.
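
    The 2 × 2 × 2 matrix described above enumerates eight treatments, one per combination of the three factors. A short sketch of that enumeration (the level labels follow the abstract; the code itself is only illustrative):

```python
# Enumerate the eight treatments of a 2 x 2 x 2 factorial design:
# radiation x nutrients x temperature.
from itertools import product

factors = {
    "radiation":   ["PAR only", "PAR + UVR"],
    "nutrients":   ["ambient", "enriched"],
    "temperature": ["in situ T", "in situ T + 3 degC"],
}
treatments = list(product(*factors.values()))
for t in treatments:
    print(t)
print(len(treatments), "treatments")  # 2 * 2 * 2 = 8
```

    Fully crossing the factors like this is what lets the study separate main effects of warming, nutrients, and UVR from their interactions.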

  1. Global Security Rule Sets An Analysis of the Current Global Security Environment and Rule Sets Governing Nuclear Weapons Release

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mollahan, K; Nattrass, L

America is in a unique position in its history. In maintaining its position as the world's only superpower, the US consistently finds itself taking on the role of a global cop, chief exporter of hard and soft power, and primary impetus for globalization. A view of the current global situation shows an America that can benefit greatly from the effects of globalization and soft power. Similarly, America's power can be reduced significantly if globalization and its soft power are not handled properly. At the same time, America has slowly come to realize that its next major adversary is not a near-peer competitor but terrorism and disconnected nations that seek nuclear capabilities. In dealing with this new threat, America needs to come to terms with its own nuclear arsenal and build a security rule set that will establish for the world explicitly what actions will cause the US to consider nuclear weapons release. This rule set, however, needs to be established with sensitivity to the US's international interests in globalization and soft power. The US must find a way to establish its doctrine governing nuclear weapons release without threatening other peaceful nations in the process.

  2. Level set method for image segmentation based on moment competition

    NASA Astrophysics Data System (ADS)

    Min, Hai; Wang, Xiao-Feng; Huang, De-Shuang; Jin, Jing; Wang, Hong-Zhi; Li, Hai

    2015-05-01

We propose a level set method for image segmentation that introduces moment competition and weakly supervised information into the construction of the energy functional. Unlike region-based level set methods that use force competition, moment competition is adopted to drive the contour evolution. Here, a so-called three-point labeling scheme is proposed to manually label three independent points (the weakly supervised information) on the image. The intensity differences between the three points and the unlabeled pixels are then used to construct the force arms for each image pixel. The corresponding force is generated from the global statistical information of a region-based method and weighted by the force arm. As a result, the moment can be constructed and incorporated into the energy functional to drive the evolving contour towards the object boundary. In our method, the force arm takes full advantage of the three-point labeling scheme to constrain the moment competition. Additionally, the global statistical information and weakly supervised information are successfully integrated, which makes the proposed method more robust to initial contour placement and parameter settings than traditional methods. Experimental results and performance analysis also show the superiority of the proposed method in segmenting different types of complicated images, such as noisy images, three-phase images, images with intensity inhomogeneity, and texture images.
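
    A very rough sketch of the force-arm idea follows: per-pixel intensity differences to three manually labeled points weight a region-based force. The tiny image, the point choices, and the "force" used here are invented placeholders, far simpler than the paper's energy-functional terms:

```python
# Force arms from a three-point labeling scheme: for each labeled point,
# the arm at a pixel is its absolute intensity difference to that point.
import numpy as np

image = np.array([[0.1, 0.2, 0.9],
                  [0.1, 0.8, 0.9],
                  [0.2, 0.9, 1.0]])
# Three manually labeled points (the weakly supervised information).
points = [(0, 0), (2, 2), (1, 1)]

# One force-arm map per labeled point.
arms = np.stack([np.abs(image - image[r, c]) for r, c in points])

# A stand-in region-based force: signed deviation from the global mean,
# weighted by each arm to form a moment-like quantity.
force = image - image.mean()
moments = arms * force
print(moments.shape)  # one moment map per labeled point
```

    In the actual method these moments enter the energy functional and compete to drive the evolving contour toward the object boundary.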

  3. The CAFE model: A net production model for global ocean phytoplankton

    NASA Astrophysics Data System (ADS)

    Silsbe, Greg M.; Behrenfeld, Michael J.; Halsey, Kimberly H.; Milligan, Allen J.; Westberry, Toby K.

    2016-12-01

The Carbon, Absorption, and Fluorescence Euphotic-resolving (CAFE) net primary production model is an adaptable framework for advancing global ocean productivity assessments by exploiting state-of-the-art satellite ocean color analyses and addressing key physiological and ecological attributes of phytoplankton. Here we present the first implementation of the CAFE model that incorporates inherent optical properties derived from ocean color measurements into a mechanistic and accurate model of phytoplankton growth rates (μ) and net phytoplankton production (NPP). The CAFE model calculates NPP as the product of energy absorption (QPAR) and the efficiency (ϕμ) with which absorbed energy is converted into carbon biomass (CPhyto), while μ is calculated as NPP normalized to CPhyto. The CAFE model performance is evaluated alongside 21 other NPP models against a spatially robust and globally representative set of direct NPP measurements. This analysis demonstrates that the CAFE model explains the greatest amount of variance and has the lowest model bias relative to the other NPP models analyzed with this data set. Global oceanic NPP from the CAFE model (52 Pg C yr-1) and mean division rates (0.34 day-1) are derived from climatological satellite data (2002-2014). This manuscript discusses and validates individual CAFE model parameters (e.g., QPAR and ϕμ), provides detailed sensitivity analyses, and compares the CAFE model results and parameterization to other widely cited models.
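
    The two model relations stated in the abstract can be written directly in code: NPP = QPAR × ϕμ, and μ = NPP / CPhyto. The numeric values below are invented placeholders in arbitrary but mutually consistent units:

```python
# The CAFE model's two stated relations, in code form.
def net_primary_production(q_par, phi_mu):
    """NPP: absorbed energy times energy-to-carbon conversion efficiency."""
    return q_par * phi_mu

def division_rate(npp, c_phyto):
    """Growth rate mu: NPP normalized to phytoplankton carbon biomass."""
    return npp / c_phyto

npp = net_primary_production(q_par=100.0, phi_mu=0.05)
mu = division_rate(npp, c_phyto=20.0)
print(npp, mu)
```

    In the full model, QPAR and ϕμ are resolved through the euphotic zone from satellite-derived inherent optical properties rather than supplied as scalars.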

  4. Global dynamic optimization approach to predict activation in metabolic pathways.

    PubMed

    de Hijas-Liste, Gundián M; Klipp, Edda; Balsa-Canto, Eva; Banga, Julio R

    2014-01-06

During the last decade, a number of authors have shown that the genetic regulation of metabolic networks may follow optimality principles. Optimal control theory has been successfully used to compute optimal enzyme profiles for simple metabolic pathways. However, applying this optimal control framework to more general networks (e.g. branched networks, or networks incorporating enzyme production dynamics) yields problems that are analytically intractable and/or numerically very challenging. Further, these previous studies have only considered a single-objective framework. In this work we consider a more general multi-objective formulation and present solutions based on recent developments in global dynamic optimization techniques. We illustrate the performance and capabilities of these techniques on two sets of problems. First, we consider a set of single-objective examples of increasing complexity taken from the recent literature. We analyze the multimodal character of the associated nonlinear optimization problems, and we also evaluate different global optimization approaches in terms of numerical robustness, efficiency and scalability. Second, we consider generalized multi-objective formulations for several examples, and we show how this framework yields more biologically meaningful results. The proposed strategy was used to solve a set of single-objective case studies related to unbranched and branched metabolic networks of different levels of complexity. All problems were successfully solved in reasonable computation times with our global dynamic optimization approach, reaching solutions comparable to or better than those reported in the previous literature. Further, we considered, for the first time, multi-objective formulations, illustrating how activation in metabolic pathways can be explained in terms of the best trade-offs between conflicting objectives.
This new methodology can be applied to metabolic networks with arbitrary topologies, non-linear dynamics and constraints.
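
    The trade-off analysis between conflicting objectives can be hinted at with a weighted-sum sweep over a toy bi-objective problem (maximize a saturating pathway benefit vs. minimize a quadratic enzyme cost). Both objective functions are invented and far simpler than the dynamic optimization problems in the study:

```python
# Weighted-sum scalarization sweep over a toy enzyme-investment problem,
# tracing how the optimum moves as the objective weighting changes.
import numpy as np

enzyme = np.linspace(0.0, 1.0, 101)        # candidate enzyme investment
benefit = enzyme / (0.2 + enzyme)          # saturating pathway benefit
cost = enzyme ** 2                         # quadratic enzyme cost

trade_offs = []
for w in np.linspace(0.0, 1.0, 11):        # weight on benefit vs. cost
    objective = w * benefit - (1.0 - w) * cost
    best = enzyme[np.argmax(objective)]
    trade_offs.append((round(float(w), 1), round(float(best), 2)))

print(trade_offs)
```

    As the weight on benefit grows, the optimal investment moves from zero toward full allocation, which is the kind of trade-off curve a multi-objective formulation exposes; the study's methods additionally handle dynamics and constraints that this sketch omits.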

  5. An updated geospatial liquefaction model for global application

    USGS Publications Warehouse

    Zhu, Jing; Baise, Laurie G.; Thompson, Eric M.

    2017-01-01

We present an updated geospatial approach to estimation of earthquake-induced liquefaction from globally available geospatial proxies. Our previous iteration of the geospatial liquefaction model was based on mapped liquefaction surface effects from four earthquakes in Christchurch, New Zealand, and Kobe, Japan, paired with geospatial explanatory variables including slope-derived VS30, compound topographic index, and magnitude-adjusted peak ground acceleration from ShakeMap. The updated geospatial liquefaction model presented herein improves the performance and the generality of the model. The updates include (1) expanding the liquefaction database to 27 earthquake events across 6 countries, (2) addressing the sampling of nonliquefaction for incomplete liquefaction inventories, (3) testing interaction effects between explanatory variables, and (4) overall improving model performance. While we test 14 geospatial proxies for soil density and soil saturation, the most promising geospatial parameters are slope-derived VS30, modeled water table depth, distance to coast, distance to river, distance to closest water body, and precipitation. We found that peak ground velocity (PGV) performs better than peak ground acceleration (PGA) as the shaking intensity parameter. We present two models which offer improved performance over prior models. We evaluate model performance using the area under the Receiver Operating Characteristic (ROC) curve (AUC) and the Brier score. The best-performing model in a coastal setting uses distance to coast but is problematic for regions away from the coast. The second-best model, using PGV, VS30, water table depth, distance to closest water body, and precipitation, performs better in noncoastal regions and is thus the model we recommend for global implementation.
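
    The two evaluation metrics named above, ROC AUC and the Brier score, are straightforward to compute directly. The predicted liquefaction probabilities and binary observations below are invented for illustration:

```python
# ROC AUC (via the rank-sum formulation) and the Brier score for
# probabilistic binary predictions, implemented without external
# ML libraries.
import numpy as np

y_true = np.array([0, 0, 1, 1, 0, 1, 0, 1])                  # observed
p_pred = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.9, 0.5, 0.6])  # predicted prob.

# Brier score: mean squared error of the probabilistic predictions.
brier = np.mean((p_pred - y_true) ** 2)

# AUC: probability a random positive outranks a random negative
# (ties count half), per the Mann-Whitney formulation.
pos, neg = p_pred[y_true == 1], p_pred[y_true == 0]
auc = np.mean([(p > n) + 0.5 * (p == n) for p in pos for n in neg])

print(f"Brier = {brier:.4f}, AUC = {auc:.3f}")
```

    Lower Brier scores and higher AUC both indicate better probabilistic classification, which is why the study reports the two together.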

  6. Comparative health system performance in six middle-income countries: cross-sectional analysis using World Health Organization study of global ageing and health.

    PubMed

    Alshamsan, Riyadh; Lee, John Tayu; Rana, Sangeeta; Areabi, Hasan; Millett, Christopher

    2017-09-01

Objective To assess and compare health system performance across six middle-income countries that are strengthening their health systems in pursuit of universal health coverage. Design Cross-sectional analysis from the World Health Organization Study on global AGEing and adult health, collected between 2007 and 2010. Setting Six middle-income countries: China, Ghana, India, Mexico, Russia and South Africa. Participants Nationally representative sample of adults aged 50 years and older. Main outcome measures We present achievement against key indicators of health system performance across effectiveness, cost, access, patient-centredness and equity domains. Results We found areas of poor performance in prevention and management of chronic conditions, such as hypertension control and cancer screening coverage. We also found that cost remains a barrier to healthcare access in spite of insurance schemes. Finally, we found evidence of disparities across many indicators, particularly in the effectiveness and patient-centredness domains. Conclusions These findings identify important focus areas for action and shared learning as these countries move towards achieving universal health coverage.

  7. Comparative health system performance in six middle-income countries: cross-sectional analysis using World Health Organization study of global ageing and health

    PubMed Central

    Alshamsan, Riyadh; Lee, John Tayu; Rana, Sangeeta; Areabi, Hasan; Millett, Christopher

    2017-01-01

Objective To assess and compare health system performance across six middle-income countries that are strengthening their health systems in pursuit of universal health coverage. Design Cross-sectional analysis from the World Health Organization Study on global AGEing and adult health, collected between 2007 and 2010. Setting Six middle-income countries: China, Ghana, India, Mexico, Russia and South Africa. Participants Nationally representative sample of adults aged 50 years and older. Main outcome measures We present achievement against key indicators of health system performance across effectiveness, cost, access, patient-centredness and equity domains. Results We found areas of poor performance in prevention and management of chronic conditions, such as hypertension control and cancer screening coverage. We also found that cost remains a barrier to healthcare access in spite of insurance schemes. Finally, we found evidence of disparities across many indicators, particularly in the effectiveness and patient-centredness domains. Conclusions These findings identify important focus areas for action and shared learning as these countries move towards achieving universal health coverage. PMID:28895493

  8. Synchronizing compute node time bases in a parallel computer

    DOEpatents

    Chen, Dong; Faraj, Daniel A; Gooding, Thomas M; Heidelberger, Philip

    2015-01-27

    Synchronizing time bases in a parallel computer that includes compute nodes organized for data communications in a tree network, where one compute node is designated as a root, and, for each compute node: calculating data transmission latency from the root to the compute node; configuring a thread as a pulse waiter; initializing a wakeup unit; and performing a local barrier operation; upon each node completing the local barrier operation, entering, by all compute nodes, a global barrier operation; upon all nodes entering the global barrier operation, sending, to all the compute nodes, a pulse signal; and for each compute node upon receiving the pulse signal: waking, by the wakeup unit, the pulse waiter; setting a time base for the compute node equal to the data transmission latency between the root node and the compute node; and exiting the global barrier operation.
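The synchronization scheme in this abstract reduces to a simple invariant: if every node adopts its measured root-to-node latency as its time base, all clocks agree at the instant the pulse arrives. A minimal sketch in Python, with invented node names and latency values:

```python
# Hypothetical sketch (invented node names and latencies) of the scheme in the
# patent abstract: each compute node measures its data transmission latency
# from the root, and on receiving the root's pulse adopts that latency as its
# time base, so every node's adjusted clock agrees at pulse arrival.

def synchronize_time_bases(latencies_from_root):
    """Map each node to its time base: the measured root-to-node latency."""
    return {node: latency for node, latency in latencies_from_root.items()}

latencies = {"root": 0.0, "node1": 1.5, "node2": 2.25, "node3": 3.0}
bases = synchronize_time_bases(latencies)

# The pulse leaves the root at t = 0 and reaches node i at t = latencies[i];
# subtracting the time base leaves every node reading the same value (zero).
aligned = {node: latencies[node] - bases[node] for node in latencies}
```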

  9. Synchronizing compute node time bases in a parallel computer

    DOEpatents

    Chen, Dong; Faraj, Daniel A; Gooding, Thomas M; Heidelberger, Philip

    2014-12-30

    Synchronizing time bases in a parallel computer that includes compute nodes organized for data communications in a tree network, where one compute node is designated as a root, and, for each compute node: calculating data transmission latency from the root to the compute node; configuring a thread as a pulse waiter; initializing a wakeup unit; and performing a local barrier operation; upon each node completing the local barrier operation, entering, by all compute nodes, a global barrier operation; upon all nodes entering the global barrier operation, sending, to all the compute nodes, a pulse signal; and for each compute node upon receiving the pulse signal: waking, by the wakeup unit, the pulse waiter; setting a time base for the compute node equal to the data transmission latency between the root node and the compute node; and exiting the global barrier operation.

  10. Rich client data exploration and research prototyping for NOAA

    NASA Astrophysics Data System (ADS)

    Grossberg, Michael; Gladkova, Irina; Guch, Ingrid; Alabi, Paul; Shahriar, Fazlul; Bonev, George; Aizenman, Hannah

    2009-08-01

    Data from satellites and model simulations are increasing exponentially as observations and model computing power improve rapidly. Not only is technology producing more data, but the data often come from sources all over the world. Researchers and scientists who must collaborate are also located globally. This work presents a software design and technologies that will make it possible for groups of researchers to explore large data sets visually together without the need to download these data sets locally. The design will also make it possible to exploit high performance computing remotely and transparently to analyze and explore large data sets. Computer power, high quality sensing, and data storage capacity have improved at a rate that outstrips our ability to develop software applications that exploit these resources. It is impractical for NOAA scientists to download all of the satellite and model data that may be relevant to a given problem, and the computing environments available to a given researcher range from supercomputers to nothing more than a web browser. The size and volume of satellite and model data are increasing exponentially. There are at least 50 multisensor satellite platforms collecting Earth science data. On the ground and in the sea there are sensor networks, as well as networks of ground based radar stations, producing a rich real-time stream of data. This new wealth of data would have limited use were it not for the arrival of large-scale high-performance computation provided by parallel computers, clusters, grids, and clouds. With these computational resources and vast archives available, it is now possible to analyze subtle relationships which are global, multi-modal and cut across many data sources. Researchers, educators, and even the general public need tools to access, discover, and use vast data center archives and high performance computing through a simple yet flexible interface.

  11. Verifying Diurnal Variations of Global Precipitation in Three New Global Reanalyses

    NASA Astrophysics Data System (ADS)

    Wu, S.; Xie, P.; Sun, F.; Joyce, R.

    2013-12-01

    Diurnal variations of global precipitation and their representation in three sets of new generation global reanalyses are examined using the reprocessed and bias corrected CMORPH satellite precipitation estimates. The CMORPH satellite precipitation estimates are produced on an 8 km by 8 km grid over the globe (60°S-60°N) at a 30-min interval covering a 15-year period from 1998 to the present, by combining information from IR and PMW observations from all available satellites. Bias correction is performed on the raw CMORPH precipitation estimates through calibration against a gauge-based analysis over land and against the pentad GPCP analysis over ocean. The reanalyses examined here include the NCEP CFS reanalysis (CFSR), NASA/GSFC MERRA, and ECMWF ERA-Interim. The bias-corrected CMORPH is aggregated from its original resolution to the reanalysis grid systems to facilitate the verification. First, quantitative agreements between the reanalysis precipitation fields and the CMORPH satellite observations are examined over the global domain. Precipitation structures associated with large-scale topography are well reproduced when compared against the observations. The evolution of precipitation patterns with the development of transient weather systems is captured by the CFSR and the two other reanalyses. The reanalyses tend to generate precipitation fields with wider raining areas and reduced intensity for heavy rainfall cases compared with the observations over both land and ocean. The seasonal migration of global precipitation depicted in the 15-year CMORPH satellite observations is very well captured by the three sets of new reanalyses, although the magnitude of precipitation is larger, especially in the CFSR, than in the observations. In general, the three sets of new reanalyses exhibit substantial improvements in their ability to represent global precipitation distributions and variations.
In particular, the new reanalyses reproduce precipitation variations at the fine time/space scales seen in the observations. The diurnal cycle of precipitation is reasonably well reproduced by the reanalyses over many global oceanic and land areas. The diurnal amplitude of the reanalysis precipitation, defined as the standard deviation of the 24 hourly mean values, is smaller than that in the observations over most oceanic regions, attributable largely to the continuous weak precipitation throughout the diurnal cycle in all three reanalyses. Over ocean, the pattern of diurnal variations of precipitation in the reanalyses is quite similar to that in the observations, with the timing of maximum precipitation shifted by 1-3 hours. Over land, especially over Africa, the reanalyses tend to produce maximum precipitation around noon, much earlier than in the observations. Particularly noticeable is the diurnal cycle of warm season precipitation over CONUS associated with the eastward propagation of mesoscale systems, which is distinct in the observations; none of the three new reanalyses is capable of reproducing this pattern of diurnal variations. A comprehensive description and diagnostic discussions will be given at the AGU meeting.
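The diurnal-amplitude metric used in this abstract is easy to state concretely: the standard deviation of the 24 hourly-mean precipitation values. A minimal sketch, with synthetic hourly series invented for illustration:

```python
# Minimal sketch of the diurnal amplitude metric defined in the abstract: the
# standard deviation of the 24 hourly-mean precipitation values. The synthetic
# hourly series below are invented for illustration.
import math

def diurnal_amplitude(hourly_series):
    """Std. dev. of the 24 hourly means of an hourly series (whole days)."""
    assert len(hourly_series) % 24 == 0
    days = len(hourly_series) // 24
    # Mean precipitation for each hour of day (0..23), averaged over all days.
    hourly_means = [sum(hourly_series[d * 24 + h] for d in range(days)) / days
                    for h in range(24)]
    mean = sum(hourly_means) / 24
    return math.sqrt(sum((x - mean) ** 2 for x in hourly_means) / 24)

flat = [1.0] * 48                              # no diurnal cycle: amplitude 0
peaked = [2.0 if 12 <= (i % 24) < 18 else 0.5  # afternoon maximum
          for i in range(48)]
```

A series with continuous weak precipitation and no hourly contrast, like `flat` above, yields zero amplitude, which is the weak-diurnal-cycle signature the abstract attributes to the reanalyses.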

  12. Healthcare hashtag index development: Identifying global impact in social media.

    PubMed

    Pinho-Costa, Luís; Yakubu, Kenneth; Hoedebecke, Kyle; Laranjo, Liliana; Reichel, Christofer Patrick; Colon-Gonzalez, Maria Del C; Neves, Ana Luísa; Errami, Hassna

    2016-10-01

    Create an index of global reach for healthcare hashtags and tweeters therein, filterable by topic of interest. For this proof-of-concept study we focused on the field of Primary Care and Family Medicine. Six hashtags were selected, based on their importance, from those included in the 'Healthcare Hashtag Project'. Hashtag Global Reach (HGR) was calculated as the additive aggregation of five weighted, normalized indicator variables: number of impressions, tweets, tweeters, user locations, and user languages. Data were obtained for the last quarter of 2014 and the first quarter of 2015 using Symplur Signals. Topic-specific HGR scores were calculated for the top 10 terms and for sets of quotes mapped after a thematic analysis. Individual Global Reach (IGR) was calculated across hashtags as an additive index of three indicators: replies, retweets and mentions. Using the HGR score we were able to rank the six selected hashtags and observe their performance throughout the study period. We found that #PrimaryCare and #FMRevolution had the highest HGR scores in both quarters; interestingly, #FMChangeMakers experienced a marked increase in its global visibility during the study period. "Health Policy" was the commonest theme, while "Care", "Family" and "Health" were the most common terms. This is the first study describing an altmetric hashtag index. Assuming analytical soundness, the index might prove generalizable to other healthcare hashtags. If released as a real-time business intelligence tool with customizable settings, it could aid publishing and strategic decisions by netizens, organizations, and analysts. IGR could also serve to augment academic evaluation and professional development. Our study demonstrates the feasibility of an index of the global reach of healthcare hashtags and tweeters. Copyright © 2016 Elsevier Inc. All rights reserved.
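The HGR construction described above (additive aggregation of five weighted, normalized indicators) can be sketched as follows; the equal weights and example counts are assumptions for illustration, not values from the study:

```python
# Hedged sketch of the Hashtag Global Reach (HGR) construction: min-max
# normalize five indicators across hashtags, then aggregate additively with
# weights. Equal weights and all counts below are invented for illustration;
# the study's actual weighting scheme and data are not reproduced here.

INDICATORS = ["impressions", "tweets", "tweeters", "locations", "languages"]
WEIGHTS = {ind: 0.2 for ind in INDICATORS}  # assumption: equal weights

def hgr_scores(hashtag_stats):
    """hashtag_stats: {hashtag: {indicator: raw count}} -> {hashtag: HGR}."""
    scores = {tag: 0.0 for tag in hashtag_stats}
    for ind in INDICATORS:
        vals = [stats[ind] for stats in hashtag_stats.values()]
        lo, hi = min(vals), max(vals)
        for tag, stats in hashtag_stats.items():
            # Min-max normalization to [0, 1] across the compared hashtags.
            norm = (stats[ind] - lo) / (hi - lo) if hi > lo else 0.0
            scores[tag] += WEIGHTS[ind] * norm
    return scores

stats = {  # invented example counts
    "#PrimaryCare":    {"impressions": 9e6, "tweets": 12000, "tweeters": 3000,
                        "locations": 80, "languages": 20},
    "#FMRevolution":   {"impressions": 5e6, "tweets": 8000, "tweeters": 2000,
                        "locations": 50, "languages": 12},
    "#FMChangeMakers": {"impressions": 1e6, "tweets": 2000, "tweeters": 600,
                        "locations": 15, "languages": 5},
}
scores = hgr_scores(stats)  # a hashtag maxing every indicator scores 1.0
```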

  13. Tuning collective communication for Partitioned Global Address Space programming models

    DOE PAGES

    Nishtala, Rajesh; Zheng, Yili; Hargrove, Paul H.; ...

    2011-06-12

    Partitioned Global Address Space (PGAS) languages offer programmers the convenience of a shared memory programming style combined with the locality control necessary to run on large-scale distributed memory systems. Even within a PGAS language, programmers often need to perform global communication operations such as broadcasts or reductions, which are best performed as collective operations in which a group of threads work together to perform the operation. In this study we consider the problem of implementing collective communication within PGAS languages and explore some of the design trade-offs in both the interface and implementation. In particular, PGAS collectives have semantic issues that are different from those in send-receive style message passing programs, and different implementation approaches that take advantage of the one-sided communication style in these languages. We present an implementation framework for PGAS collectives as part of the GASNet communication layer, which supports shared memory, distributed memory and hybrids. The framework supports a broad set of algorithms for each collective, over which the implementation may be automatically tuned. In conclusion, we demonstrate the benefit of optimized GASNet collectives using application benchmarks written in UPC, and demonstrate that the GASNet collectives can deliver scalable performance on a variety of state-of-the-art parallel machines including a Cray XT4, an IBM BlueGene/P, and a Sun Constellation system with InfiniBand interconnect.
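The "broad set of algorithms for each collective, over which the implementation may be automatically tuned" can be illustrated with a toy selection loop; the algorithm names and cost formulas below are invented stand-ins, not GASNet's actual code, and real tuners time candidates empirically rather than using analytic models:

```python
import math

# Toy cost models (invented, order-of-magnitude only) for three broadcast
# algorithms; a real autotuner would time candidate implementations, not
# evaluate closed-form formulas.
def cost_flat_tree(nbytes, nprocs):
    # Root sends the full message to every process directly.
    return nprocs * (1.0 + nbytes * 0.001)

def cost_binomial_tree(nbytes, nprocs):
    # ceil(log2(P)) forwarding rounds, each carrying the full message.
    return math.ceil(math.log2(nprocs)) * (1.0 + nbytes * 0.001)

def cost_scatter_allgather(nbytes, nprocs):
    # Bandwidth-friendly for large messages, but pays extra latency terms.
    return 2 * math.ceil(math.log2(nprocs)) + 2 * nbytes * 0.001

CANDIDATES = {
    "flat_tree": cost_flat_tree,
    "binomial_tree": cost_binomial_tree,
    "scatter_allgather": cost_scatter_allgather,
}

def tune_broadcast(nbytes, nprocs):
    """Select the cheapest candidate for this message size and process count."""
    return min(CANDIDATES, key=lambda name: CANDIDATES[name](nbytes, nprocs))
```

In this toy model the latency-bound binomial tree wins for small messages while the bandwidth-oriented scatter/allgather wins for large ones, which is the kind of size-dependent crossover an autotuner discovers.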

  14. Global Health: Pediatric Neurology.

    PubMed

    Bearden, David R; Ciccone, Ornella; Patel, Archana A

    2018-04-01

    Neurologic disorders contribute significantly to both morbidity and mortality among children in resource-limited settings, but there are few succinct studies summarizing the epidemiology of neurologic disorders in these settings. A review of available literature was performed to identify data on the prevalence, etiology, outcomes, and treatment of neurologic disorders in children in resource-limited settings. The burden of neurologic disorders in children is high in resource-limited settings. Barriers to optimal care include lack of trained personnel, limited access to diagnostic technology, and limited availability of drugs used to treat common conditions. Several solutions have been suggested to deal with these challenges, including increased collaborations to train neurologists willing to practice in resource-limited settings and increased training of physician extenders or community health workers. Further studies are necessary to improve our understanding of the epidemiology of neurologic disorders in resource-limited settings. Future epidemiologic studies should incorporate multiple countries in resource-limited settings and utilize standardized definitions and methodologies to enable comparison across regions. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.

  15. Education for public health in Europe and its global outreach.

    PubMed

    Bjegovic-Mikanovic, Vesna; Jovic-Vranes, Aleksandra; Czabanowska, Katarzyna; Otok, Robert

    2014-01-01

    At the present time, higher education institutions dealing with education for public health in Europe and beyond are faced with a complex and comprehensive task of responding to global health challenges. Literature reviews in public health and global health and exploration of internet presentations of regional and global organisations dealing with education for public health were the main methods employed in the work presented in this paper. Higher academic institutions are searching for appropriate strategies in competence-based education, which will increase the global attractiveness of their academic programmes and courses for continuous professional development. Academic professionals are taking advantage of blended learning and new web technologies. In Europe and beyond they are opening up debates about the scope of public health and global health. Nevertheless, global health is bringing revitalisation of public health education, which is recognised as one of the core components by many other academic institutions involved in global health work. More than ever, higher academic institutions for public health are recognising the importance of institutional partnerships with various organisations and efficient modes of cooperation in regional and global networks. Networking in a global setting is bringing new opportunities, but also opening debates about global harmonisation of competence-based education to achieve functional knowledge, greater mobility of public health professionals, better employability and affordable performance. As public health opportunities and threats are increasingly global, higher education institutions in Europe and in other regions have to look beyond national boundaries and participate in networks for education, research and practice.

  16. Are global warming and ocean acidification conspiring against marine ectotherms? A meta-analysis of the respiratory effects of elevated temperature, high CO2 and their interaction.

    PubMed

    Lefevre, Sjannie

    2016-01-01

    With the occurrence of global change, research aimed at estimating the performance of marine ectotherms in a warmer and acidified future has intensified. The concept of oxygen- and capacity-limited thermal tolerance, which is inspired by the Fry paradigm of a bell-shaped increase-optimum-decrease-type response of aerobic scope to increasing temperature, but also includes proposed negative and synergistic effects of elevated CO2 levels, has been suggested as a unifying framework. The objectives of this meta-analysis were to assess the following: (i) the generality of a bell-shaped relationship between absolute aerobic scope (AAS) and temperature; (ii) to what extent elevated CO2 affects resting oxygen uptake MO2rest and AAS; and (iii) whether there is an interaction between elevated temperature and CO2. The behavioural effects of CO2 are also briefly discussed. In 31 out of 73 data sets (both acutely exposed and acclimated), AAS increased and remained above 90% of the maximum, whereas a clear thermal optimum was observed in the remaining 42 data sets. Carbon dioxide caused a significant rise in MO2rest in only 18 out of 125 data sets, and a decrease in 25, whereas it caused a decrease in AAS in four out of 18 data sets and an increase in two. The analysis did not reveal clear evidence for an overall correlation with temperature, CO2 regime or duration of CO2 treatment. When CO2 had an effect, additive rather than synergistic interactions with temperature were most common and, interestingly, they even interacted antagonistically on MO2rest and AAS. The behavioural effects of CO2 could complicate experimental determination of respiratory performance. Overall, this meta-analysis reveals heterogeneity in the responses to elevated temperature and CO2 that is not in accordance with the idea of a single unifying principle and which cannot be ignored in attempts to model and predict the impacts of global warming and ocean acidification on marine ectotherms.

  17. Are global warming and ocean acidification conspiring against marine ectotherms? A meta-analysis of the respiratory effects of elevated temperature, high CO2 and their interaction

    PubMed Central

    Lefevre, Sjannie

    2016-01-01

    Abstract With the occurrence of global change, research aimed at estimating the performance of marine ectotherms in a warmer and acidified future has intensified. The concept of oxygen- and capacity-limited thermal tolerance, which is inspired by the Fry paradigm of a bell-shaped increase–optimum–decrease-type response of aerobic scope to increasing temperature, but also includes proposed negative and synergistic effects of elevated CO2 levels, has been suggested as a unifying framework. The objectives of this meta-analysis were to assess the following: (i) the generality of a bell-shaped relationship between absolute aerobic scope (AAS) and temperature; (ii) to what extent elevated CO2 affects resting oxygen uptake MO2rest and AAS; and (iii) whether there is an interaction between elevated temperature and CO2. The behavioural effects of CO2 are also briefly discussed. In 31 out of 73 data sets (both acutely exposed and acclimated), AAS increased and remained above 90% of the maximum, whereas a clear thermal optimum was observed in the remaining 42 data sets. Carbon dioxide caused a significant rise in MO2rest in only 18 out of 125 data sets, and a decrease in 25, whereas it caused a decrease in AAS in four out of 18 data sets and an increase in two. The analysis did not reveal clear evidence for an overall correlation with temperature, CO2 regime or duration of CO2 treatment. When CO2 had an effect, additive rather than synergistic interactions with temperature were most common and, interestingly, they even interacted antagonistically on MO2rest and AAS. The behavioural effects of CO2 could complicate experimental determination of respiratory performance. Overall, this meta-analysis reveals heterogeneity in the responses to elevated temperature and CO2 that is not in accordance with the idea of a single unifying principle and which cannot be ignored in attempts to model and predict the impacts of global warming and ocean acidification on marine ectotherms. 
PMID:27382472

  18. The Importance of Global Health Experiences in the Development of New Cardiologists

    PubMed Central

    Abdalla, Marwah; Kovach, Neal; Liu, Connie; Damp, Julie B.; Jahangir, Eiman; Hilliard, Anthony; Gopinathannair, Rakesh; Abu-Fadel, Mazen S.; El Chami, Mikhael F.; Gafoor, Sameer; Vedanthan, Rajesh; Sanchez-Shields, Monica; George, Jon C.; Priester, Tiffany; Alasnag, Mirvat; Barker, Colin; Freeman, Andrew M.

    2016-01-01

    As the global burden of cardiovascular disease continues to increase worldwide, nurturing the development of early-career cardiologists interested in global health is essential in order to create a cadre of providers with the skill set to prevent and treat cardiovascular diseases in international settings. As such, interest in global health has increased among cardiology trainees and early-career cardiologists over the past decade. International clinical and research experiences abroad present an additional opportunity for growth and development beyond traditional cardiovascular training. We describe the American College of Cardiology International Cardiovascular Exchange Database, a new resource for cardiologists interested in pursuing short-term clinical exchange opportunities abroad, and report some of the benefits and challenges of global health cardiovascular training in both resource-limited and resource-abundant settings. PMID:26763797

  19. Influence of Thermal Contact Resistance of Aluminum Foams in Forced Convection: Experimental Analysis

    PubMed Central

    Venettacci, Simone

    2017-01-01

    In this paper, the heat transfer performance of aluminum metal foams placed on a horizontal plane surface was evaluated under forced convection conditions. Three different types of contact between the sample and the heated base plate were investigated: simple contact, brazed contact and grease paste contact. First, an ad hoc experimental set-up was built to perform the study. Second, the value of thermal contact resistance was estimated. The results show that both the use of a conductive paste and the brazed contact, realized by means of a copper electro-deposition, allow a great reduction of the global thermal resistance, increasing the global heat transfer coefficient by almost 80% compared to the simple contact case. Finally, it was shown that, while the contribution of thermal contact resistance is negligible for the brazed and grease paste contacts, it is significantly high for simple contact. PMID:28783052

  20. A Simple Global View of Fuel Burnup

    NASA Astrophysics Data System (ADS)

    Sekimoto, Hiroshi

    2017-01-01

    Reactor physics and fuel burnup are discussed in order to obtain a simple global view of the effects of nuclear reactor characteristics on fuel cycle system performance. This may encourage free thinking and an overall vision, though it covers only a small part of the nuclear energy system. At the beginning of this lecture, governing equations for nuclear reactors are presented. Since the full set of these equations is large and complicated, it is simplified by imposing some extreme conditions, and the nuclear equilibrium equation is derived. Some features of a future nuclear equilibrium state are obtained by solving this equation. The contribution of a nuclide charged into the reactor core to system performance indexes such as criticality is useful for understanding the importance of each nuclide. It is called nuclide importance and can be evaluated by using the equations adjoint to the nuclear equilibrium equation. Examples of some importances and their application to a criticality search problem are presented.

  1. Optimization and performance of bifacial solar modules: A global perspective

    DOE PAGES

    Sun, Xingshu; Khan, Mohammad Ryyan; Deline, Chris; ...

    2018-02-06

    With the rapidly growing interest in bifacial photovoltaics (PV), a worldwide map of their potential performance can help assess and accelerate the global deployment of this emerging technology. However, the existing literature only highlights optimized bifacial PV for a few geographic locations or develops worldwide performance maps for very specific configurations, such as the vertical installation. It is still difficult to translate these location- and configuration-specific conclusions to a general optimized performance of this technology. In this paper, we present a global study and optimization of bifacial solar modules using a rigorous and comprehensive modeling framework. Our results demonstrate that with a low albedo of 0.25, the bifacial gain of ground-mounted bifacial modules is less than 10% worldwide. However, increasing the albedo to 0.5 and elevating modules 1 m above the ground can boost the bifacial gain to 30%. Moreover, we derive a set of empirical design rules, which optimize bifacial solar modules across the world and provide the groundwork for rapid assessment of the location-specific performance. We find that ground-mounted, vertical, east-west-facing bifacial modules will outperform their south-north-facing, optimally tilted counterparts by up to 15% below the latitude of 30 degrees, for an albedo of 0.5. The relative energy output is reversed in latitudes above 30 degrees. A detailed and systematic comparison with data from Asia, Africa, Europe, and North America validates the model presented in this paper.
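The bifacial gain reported above is the fractional extra energy from the rear face relative to a monofacial module. A back-of-envelope sketch, using an invented rear-irradiance model (rear energy = albedo x view factor x bifaciality x front energy) rather than the paper's rigorous framework:

```python
# Back-of-envelope sketch of the bifacial gain metric: rear-side energy as a
# fraction of front-side energy. The rear-irradiance model and the parameter
# values (view factor, bifaciality) are invented illustrations, not the
# paper's modeling framework.

def bifacial_gain(front_kwh, albedo, rear_view_factor=0.8, bifaciality=0.9):
    """Fractional extra energy of a bifacial module over a monofacial one."""
    rear_kwh = front_kwh * albedo * rear_view_factor * bifaciality
    return rear_kwh / front_kwh

# Doubling the albedo from 0.25 to 0.5 doubles the gain in this toy model,
# echoing the albedo sensitivity described in the abstract.
```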

  2. Optimization and performance of bifacial solar modules: A global perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Xingshu; Khan, Mohammad Ryyan; Deline, Chris

    With the rapidly growing interest in bifacial photovoltaics (PV), a worldwide map of their potential performance can help assess and accelerate the global deployment of this emerging technology. However, the existing literature only highlights optimized bifacial PV for a few geographic locations or develops worldwide performance maps for very specific configurations, such as the vertical installation. It is still difficult to translate these location- and configuration-specific conclusions to a general optimized performance of this technology. In this paper, we present a global study and optimization of bifacial solar modules using a rigorous and comprehensive modeling framework. Our results demonstrate that with a low albedo of 0.25, the bifacial gain of ground-mounted bifacial modules is less than 10% worldwide. However, increasing the albedo to 0.5 and elevating modules 1 m above the ground can boost the bifacial gain to 30%. Moreover, we derive a set of empirical design rules, which optimize bifacial solar modules across the world and provide the groundwork for rapid assessment of the location-specific performance. We find that ground-mounted, vertical, east-west-facing bifacial modules will outperform their south-north-facing, optimally tilted counterparts by up to 15% below the latitude of 30 degrees, for an albedo of 0.5. The relative energy output is reversed in latitudes above 30 degrees. A detailed and systematic comparison with data from Asia, Africa, Europe, and North America validates the model presented in this paper.

  3. Recognizing the bank robber and spotting the difference: emotional state and global vs. local attentional set.

    PubMed

    Pacheco-Unguetti, Antonia Pilar; Acosta, Alberto; Lupiáñez, Juan

    2014-01-01

    In two experiments (161 participants in total), we investigated how current mood influences processing styles (global vs. local). Participants watched a video of a bank robbery before receiving a positive, negative or neutral mood induction, and then performed two tasks: a face-recognition task about the bank robber as a global processing measure, and a spot-the-difference task using neutral pictures (Experiment 1) or emotional scenes (Experiment 2) as a local processing measure. Results showed that positive mood induction favoured a global processing style, enhancing participants' ability to correctly identify a face even though they had watched the video before the mood induction. This shows that, besides influencing encoding processes, mood state can also be related to retrieval processes. On the contrary, negative mood induction enhanced a local processing style, making the detection of differences between nearly identical pictures easier and faster, independently of their valence. This dissociation supports the hypothesis that current mood modulates processing through the activation of different cognitive styles.

  4. Attitudes towards Internationalism through the Lens of Cognitive Effort, Global Mindset, and Cultural Intelligence

    ERIC Educational Resources Information Center

    Romano, Joan; Platania, Judith

    2014-01-01

    In the current study we examine attitudes towards internationalism through the lens of a specific set of constructs necessary in defining an effective global leader. One hundred fifty-nine undergraduates responded to items measuring need for cognition, cultural intelligence, and a set of items measuring the correlates of global mindset. In…

  5. High-Resolution Regional Reanalysis in China: Evaluation of 1 Year Period Experiments

    NASA Astrophysics Data System (ADS)

    Zhang, Qi; Pan, Yinong; Wang, Shuyu; Xu, Jianjun; Tang, Jianping

    2017-10-01

    Globally, reanalysis data sets are widely used in assessing climate change, validating numerical models, and understanding the interactions between the components of a climate system. However, due to their relatively coarse resolution, most global reanalysis data sets are not directly suitable for application at local and regional scales, given their inadequate descriptions of mesoscale systems and climatic extreme incidents such as mesoscale convective systems, squall lines, tropical cyclones, regional droughts, and heat waves. In this study, by using the Gridpoint Statistical Interpolation data assimilation system and the Weather Research and Forecasting mesoscale atmospheric model, we build a regional reanalysis system. This is a preliminary and first experimental attempt to construct a high-resolution reanalysis for mainland China. Four regional test bed data sets are generated for the year 2013 via three widely used methods (classical dynamical downscaling, spectral nudging, and data assimilation) and a hybrid method coupling data assimilation with spectral nudging. Temperature at 2 m, precipitation, and upper level atmospheric variables are evaluated by comparing against observations for the year-long tests. It can be concluded that the regional reanalyses with assimilation and nudging methods better reproduce the atmospheric variables from the surface to upper levels, and regional extreme events such as heat waves, than classical dynamical downscaling. Compared to the ERA-Interim global reanalysis, the hybrid nudging method performs slightly better in reproducing upper level temperature and low-level moisture over China, which improves regional reanalysis data quality.

  6. The Neuropsychology of Starvation: Set-Shifting and Central Coherence in a Fasted Nonclinical Sample

    PubMed Central

    Pender, Sarah; Gilbert, Sam J.; Serpell, Lucy

    2014-01-01

    Objectives Recent research suggests certain neuropsychological deficits occur in anorexia nervosa (AN). The role of starvation in these deficits remains unclear. Studies of individuals without AN can elucidate our understanding of the effect of short-term starvation on neuropsychological performance. Methods Using a within-subjects repeated measures design, 60 healthy female participants were tested once after fasting for 18 hours, and once when satiated. Measures included two tasks to measure central coherence and a set-shifting task. Results Fasting exacerbated set-shifting difficulties on a rule-change task. Fasting was associated with stronger local and impaired global processing, indicating weaker central coherence. Conclusions Models of AN that propose a central role for set-shifting difficulties or weak central coherence should also consider the impact of short-term fasting on these processes. PMID:25338075

  7. Empirical Modeling of the Plasmasphere Dynamics Using Neural Networks

    NASA Astrophysics Data System (ADS)

    Zhelavskaya, I. S.; Shprits, Y.; Spasojevic, M.

    2017-12-01

    We present a new empirical model for reconstructing the global dynamics of the cold plasma density distribution based only on solar wind data and geomagnetic indices. Utilizing the density database obtained using the NURD (Neural-network-based Upper hybrid Resonance Determination) algorithm for the period of October 1, 2012 - July 1, 2016, in conjunction with solar wind data and geomagnetic indices, we develop a neural network model that is capable of globally reconstructing the dynamics of the cold plasma density distribution for 2 ≤ L ≤ 6 and all local times. We validate and test the model by measuring its performance on independent datasets withheld from the training set and by comparing the model predicted global evolution with global images of He+ distribution in the Earth's plasmasphere from the IMAGE Extreme UltraViolet (EUV) instrument. We identify the parameters that best quantify the plasmasphere dynamics by training and comparing multiple neural networks with different combinations of input parameters (geomagnetic indices, solar wind data, and different durations of their time history). We demonstrate results of both local and global plasma density reconstruction. This study illustrates how global dynamics can be reconstructed from local in-situ observations by using machine learning techniques.
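One concrete piece of the setup above is combining inputs with "different durations of their time history" into feature vectors. A minimal sketch of lagged-feature construction, with a toy index series whose values are invented:

```python
# Illustrative sketch (toy data) of building model inputs that include a
# duration of time history: each feature row holds the current value of an
# index plus its previous `lags` values.

def lagged_features(series, lags):
    """Rows [x_t, x_(t-1), ..., x_(t-lags)] for t = lags .. len(series)-1."""
    return [[series[t - k] for k in range(lags + 1)]
            for t in range(lags, len(series))]

kp = [1, 2, 3, 4, 5, 6]          # toy stand-in for a geomagnetic index series
X = lagged_features(kp, lags=2)  # e.g. first row is [3, 2, 1]
```

Training several networks on feature sets built with different `lags` values, then comparing held-out performance, is one way to identify which history durations best quantify the dynamics.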

  8. Aligning corporate greenhouse-gas emissions targets with climate goals

    NASA Astrophysics Data System (ADS)

    Krabbe, Oskar; Linthorst, Giel; Blok, Kornelis; Crijns-Graus, Wina; van Vuuren, Detlef P.; Höhne, Niklas; Faria, Pedro; Aden, Nate; Pineda, Alberto Carrillo

    2015-12-01

    Corporate climate action is increasingly considered important in driving the transition towards a low-carbon economy. For this, it is critical to ensure translation of global goals to greenhouse-gas (GHG) emissions reduction targets at company level. At the moment, however, there is a lack of clear methods to derive consistent corporate target setting that keeps cumulative corporate GHG emissions within a specific carbon budget (for example, 550-1,300 GtCO2 between 2011 and 2050 for the 2 °C target). Here we propose a method for corporate emissions target setting that derives carbon intensity pathways for companies based on sectoral pathways from existing mitigation scenarios: the Sectoral Decarbonization Approach (SDA). These company targets take activity growth and initial performance into account. Next to target setting on company level, the SDA can be used by companies, policymakers, investors or other stakeholders as a benchmark for tracking corporate climate performance and actions, providing a mechanism for corporate accountability.
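The SDA's core idea, convergence of each company's carbon intensity towards the sector's 2050 intensity, can be sketched as below. The linear convergence rule, base year and numbers are simplifications chosen for illustration; the published method additionally accounts for activity growth and initial performance, as the abstract notes.

```python
def intensity_pathway(company_base, sector_2050, base_year=2015, target_year=2050):
    """Linear convergence of a company's carbon intensity (tCO2 per unit of
    activity) to the sector's 2050 intensity: a simplified stand-in for the
    full Sectoral Decarbonization Approach described in the abstract."""
    span = target_year - base_year
    return {y: sector_2050 + (company_base - sector_2050) * (target_year - y) / span
            for y in range(base_year, target_year + 1)}

# hypothetical company: 0.80 tCO2/unit today, sector reaches 0.05 by 2050
path = intensity_pathway(company_base=0.80, sector_2050=0.05)
# path[2015] -> 0.80, path[2050] -> 0.05, monotonically decreasing
```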

  9. Global Survey of Protein Expression during Gonadal Sex Determination in Mice*

    PubMed Central

    Ewen, Katherine; Baker, Mark; Wilhelm, Dagmar; Aitken, R. John; Koopman, Peter

    2009-01-01

    The development of an embryo as male or female depends on differentiation of the gonads as either testes or ovaries. A number of genes are known to be important for gonadal differentiation, but our understanding of the regulatory networks underpinning sex determination remains fragmentary. To advance our understanding of sexual development beyond the transcriptome level, we performed the first global survey of the mouse gonad proteome at the time of sex determination by using two-dimensional nanoflow LC-MS/MS. The resulting data set contains a total of 1037 gene products (154 non-redundant and 883 redundant proteins) identified from 620 peptides. Functional classification and biological network construction suggested that the identified proteins primarily serve in RNA post-transcriptional modification and trafficking, protein synthesis and folding, and post-translational modification. The data set contains potential novel regulators of gonad development and sex determination not revealed previously by transcriptomics and proteomics studies and more than 60 proteins with potential links to human disorders of sexual development. PMID:19617587

  10. Use of Massive Parallel Computing Libraries in the Context of Global Gravity Field Determination from Satellite Data

    NASA Astrophysics Data System (ADS)

    Brockmann, J. M.; Schuh, W.-D.

    2011-07-01

    The estimation of the global Earth gravity field, parametrized as a finite spherical harmonic series, is computationally demanding. The computational effort depends on the one hand on the maximal resolution of the spherical harmonic expansion (i.e. the number of parameters to be estimated) and on the other hand on the number of observations (several million for, e.g., the GOCE satellite mission). To cope with these demands, massively parallel software based on high-performance computing (HPC) libraries such as ScaLAPACK, PBLAS and BLACS was designed in the context of GOCE HPF WP6000 and the GOCO consortium. A prerequisite for the use of these libraries is that all matrices are block-cyclic distributed on a processor grid composed of a large number of (distributed-memory) computers. Using this set of standard HPC libraries has the benefit that, once the matrices are distributed across the computer cluster, a huge set of efficient and highly scalable linear algebra operations becomes available.
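The 2-D block-cyclic layout that these libraries require can be illustrated with the standard owner-computes mapping from a global matrix index to a process-grid coordinate. The block sizes and grid shape below are arbitrary examples, not values from the GOCE processing.

```python
def block_cyclic_owner(i, j, mb, nb, p_rows, p_cols):
    """Return the (row, col) coordinates of the process that owns global
    matrix entry (i, j) under a ScaLAPACK-style 2-D block-cyclic layout
    with block size mb x nb on a p_rows x p_cols process grid
    (zero-based indices, first block on process (0, 0))."""
    return ((i // mb) % p_rows, (j // nb) % p_cols)

# 8x8 matrix, 2x2 blocks, 2x2 process grid: blocks cycle over the grid,
# so every process holds a scattered but balanced share of the matrix.
print(block_cyclic_owner(2, 0, 2, 2, 2, 2))  # -> (1, 0)
print(block_cyclic_owner(4, 5, 2, 2, 2, 2))  # -> (0, 0)
```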

  11. Mental chronometry with simple linear regression.

    PubMed

    Chen, J Y

    1997-10-01

    Typically, mental chronometry is performed by introducing an independent variable postulated to affect selectively some stage of a presumed multistage process. However, the effect could be a global one that spreads proportionally over all stages of the process. Currently there is no method to test this possibility, although simple linear regression might serve the purpose. In the present study, the regression approach was tested with tasks (memory scanning and mental rotation) that, according to the dominant theories, involve a selective effect, and with a task (word superiority effect) that involves a global effect. The results indicate that (1) the manipulation of the size of a memory set or of angular disparity affects the intercept of the regression function that relates the times for memory scanning with different set sizes or for mental rotation with different angular disparities, and (2) the manipulation of context affects the slope of the regression function that relates the times for detecting a target character under word and nonword conditions. These results ratify the regression approach as a useful method for mental chronometry.
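The logic of the regression approach can be reproduced with a toy example: regressing condition times on baseline times, an additive (stage-selective) effect shifts the intercept while leaving the slope at 1, whereas a proportional (global) effect changes the slope. The reaction times below are invented for illustration.

```python
import numpy as np

# Hypothetical mean reaction times (ms) for five set sizes in a baseline
# condition and under two kinds of manipulation.
baseline = np.array([450.0, 490.0, 530.0, 570.0, 610.0])
additive = baseline + 80.0        # selective effect: one stage slowed by a constant
proportional = baseline * 1.2     # global effect: every stage slowed by 20%

def fit(x, y):
    slope, intercept = np.polyfit(x, y, 1)
    return slope, intercept

s1, i1 = fit(baseline, additive)      # slope 1.0, intercept 80: intercept moves
s2, i2 = fit(baseline, proportional)  # slope 1.2, intercept 0: slope moves
```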

  12. Transcriptome meta-analysis reveals common differential and global gene expression profiles in cystic fibrosis and other respiratory disorders and identifies CFTR regulators.

    PubMed

    Clarke, Luka A; Botelho, Hugo M; Sousa, Lisete; Falcao, Andre O; Amaral, Margarida D

    2015-11-01

    A meta-analysis of 13 independent microarray data sets was performed and gene expression profiles from cystic fibrosis (CF), similar disorders (COPD: chronic obstructive pulmonary disease, IPF: idiopathic pulmonary fibrosis, asthma), environmental conditions (smoking, epithelial injury), related cellular processes (epithelial differentiation/regeneration), and non-respiratory "control" conditions (schizophrenia, dieting), were compared. Similarity among differentially expressed (DE) gene lists was assessed using a permutation test, and a clustergram was constructed, identifying common gene markers. Global gene expression values were standardized using a novel approach, revealing that similarities between independent data sets run deeper than shared DE genes. Correlation of gene expression values identified putative gene regulators of the CF transmembrane conductance regulator (CFTR) gene, of potential therapeutic significance. Our study provides a novel perspective on CF epithelial gene expression in the context of other lung disorders and conditions, and highlights the contribution of differentiation/EMT and injury to gene signatures of respiratory disease. Copyright © 2015 Elsevier Inc. All rights reserved.
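The abstract does not specify its novel standardization approach, so the sketch below shows only the simplest per-gene z-scoring, which already makes two hypothetical platforms with different affine scales directly comparable; the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

def standardize(expr):
    """Z-score each gene (row) within one data set so that expression
    values from platforms with different scales become comparable."""
    mu = expr.mean(axis=1, keepdims=True)
    sd = expr.std(axis=1, keepdims=True)
    return (expr - mu) / sd

# Two toy data sets measuring the same 100 genes in 6 samples, on
# different affine scales (platform offset and gain).
signal = rng.standard_normal((100, 6))
set_a = 3.0 + 1.5 * signal
set_b = 10.0 + 0.2 * signal
za, zb = standardize(set_a), standardize(set_b)
max_diff = float(np.abs(za - zb).max())   # ~0: the platforms now agree
```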

  13. Observation Data Model Core Components, its Implementation in the Table Access Protocol Version 1.1

    NASA Astrophysics Data System (ADS)

    Louys, Mireille; Tody, Doug; Dowler, Patrick; Durand, Daniel; Michel, Laurent; Bonnarel, François; Micol, Alberto; IVOA DataModel Working Group

    2017-05-01

    This document defines the core components of the Observation data model that are necessary to perform data discovery when querying data centers for astronomical observations of interest. It exposes use cases to be carried out, explains the model and provides guidelines for its implementation as a data access service based on the Table Access Protocol (TAP). It aims at providing a simple model that is easy to understand and to implement by data providers wishing to publish their data into the Virtual Observatory. This interface integrates data modeling and data access aspects in a single service and is named ObsTAP; it will be referenced as such in the IVOA registries. In this document, the Observation Data Model Core Components (ObsCoreDM) defines the core components of queryable metadata required for global discovery of observational data. It is meant to allow a single query to be posed to TAP services at multiple sites to perform global data discovery without having to understand the details of the services present at each site. It defines a minimal set of basic metadata and thus allows for a reasonable cost of implementation by data providers. The combination of the ObsCoreDM with TAP is referred to as an ObsTAP service. As with most of the VO Data Models, ObsCoreDM makes use of STC, Utypes, Units and UCDs. The ObsCoreDM can be serialized as a VOTable. ObsCoreDM can make reference to more complete data models such as the Characterisation DM, Spectrum DM or Simple Spectral Line Data Model (SSLDM). ObsCore shares a large set of common concepts with the DataSet Metadata Data Model (Cresitello-Dittmar et al. 2016), which binds together most of the data model concepts from the above models in a comprehensive and more general framework. The current specification, by contrast, provides guidelines for implementing these concepts using the TAP protocol and answering ADQL queries; it is dedicated to global discovery.
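A minimal example of the single global query that ObsTAP enables: every compliant service exposes the standard ivoa.ObsCore table, so the same ADQL cone search can be posed to any site. The coordinates and product type below are arbitrary; the query is built as a plain string and no particular service endpoint is assumed.

```python
def cone_search_adql(ra_deg, dec_deg, radius_deg, product="image"):
    """Build an ADQL cone search against the ivoa.ObsCore table that every
    ObsTAP service exposes, using mandatory ObsCore columns
    (dataproduct_type, s_ra, s_dec, obs_collection, access_url)."""
    return (
        "SELECT TOP 100 obs_collection, dataproduct_type, access_url "
        "FROM ivoa.ObsCore "
        f"WHERE dataproduct_type = '{product}' "
        "AND 1 = CONTAINS(POINT('ICRS', s_ra, s_dec), "
        f"CIRCLE('ICRS', {ra_deg}, {dec_deg}, {radius_deg}))"
    )

query = cone_search_adql(83.63, 22.01, 0.5)
# the same string can be submitted unchanged to any ObsTAP service
```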

  14. Assessing effects of variation in global climate data sets on spatial predictions from climate envelope models

    USGS Publications Warehouse

    Romañach, Stephanie; Watling, James I.; Fletcher, Robert J.; Speroterra, Carolina; Bucklin, David N.; Brandt, Laura A.; Pearlstine, Leonard G.; Escribano, Yesenia; Mazzotti, Frank J.

    2014-01-01

    Climate change poses new challenges for natural resource managers. Predictive modeling of species–environment relationships using climate envelope models can enhance our understanding of climate change effects on biodiversity, assist in assessment of invasion risk by exotic organisms, and inform life-history understanding of individual species. While increasing interest has focused on the role of uncertainty in future conditions on model predictions, models also may be sensitive to the initial conditions on which they are trained. Although climate envelope models are usually trained using data on contemporary climate, we lack systematic comparisons of model performance and predictions across alternative climate data sets available for model training. Here, we seek to fill that gap by comparing variability in predictions between two contemporary climate data sets to variability in spatial predictions among three alternative projections of future climate. Overall, correlations between monthly temperature and precipitation variables were very high for both contemporary and future data. Model performance varied across algorithms, but not between two alternative contemporary climate data sets. Spatial predictions varied more among alternative general-circulation models describing future climate conditions than between contemporary climate data sets. However, we did find that climate envelope models with low Cohen's kappa scores made more discrepant spatial predictions between climate data sets for the contemporary period than did models with high Cohen's kappa scores. We suggest conservation planners evaluate multiple performance metrics and be aware of the importance of differences in initial conditions for spatial predictions from climate envelope models.
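Cohen's kappa, used above to flag models whose spatial predictions diverge between climate data sets, is agreement corrected for chance; a minimal sketch for binary presence/absence maps (the inputs below are illustrative, not the study's data):

```python
import numpy as np

def cohens_kappa(pred, obs):
    """Cohen's kappa for binary presence/absence maps: observed agreement
    corrected for the agreement expected by chance alone."""
    pred, obs = np.asarray(pred), np.asarray(obs)
    po = float((pred == obs).mean())                 # observed agreement
    pe = (pred.mean() * obs.mean()
          + (1 - pred.mean()) * (1 - obs.mean()))   # chance agreement
    return (po - pe) / (1 - pe)

print(cohens_kappa([1, 1, 0, 0], [1, 1, 0, 0]))  # -> 1.0 (perfect agreement)
```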

  15. Building the foundation to generate a fundamental care standardised data set.

    PubMed

    Jeffs, Lianne; Muntlin Athlin, Asa; Needleman, Jack; Jackson, Debra; Kitson, Alison

    2018-06-01

    This paper provides an overview of the current state of performance measurement, key trends and a methodological approach to leverage in efforts to generate a standardised data set for fundamental care. Considerable transformation is occurring in health care globally, with organisations focusing on achieving the quadruple aim of improving the experience of care, the health of populations, and the experience of providing care while reducing per capita costs of health care. In response, healthcare organisations are employing performance measurement and quality improvement methods to achieve the quadruple aim. Despite the plethora of measures available to health managers, there is no standardised data set and virtually no indicators reflecting how patients actually experience the delivery of fundamental care, such as nutrition, hydration, mobility, respect, education and psychosocial support. Given the linkages of fundamental care to safety and quality metrics, efforts to build the evidence base and knowledge that captures the impact of enacting fundamental care across the healthcare continuum and lifespan should include generating a routinely collected data set of relevant measures. Standardised data sets enable comparability of data across clinical populations, healthcare sectors, geographic locations and time, and provide data about care to support clinical, administrative and health policy decision-making. © 2018 John Wiley & Sons Ltd.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peressutti, D; Schipaanboord, B; Kadir, T

    Purpose: To investigate the effectiveness of atlas selection methods for improving atlas-based auto-contouring in radiotherapy planning. Methods: 275 clinically delineated H&N cases were employed as an atlas database from which atlases would be selected. A further 40 previously contoured cases were used as test patients against which atlas selection could be performed and evaluated. 26 variations of selection methods proposed in the literature and used in commercial systems were investigated. Atlas selection methods comprised either global or local image similarity measures, computed after rigid or deformable registration, combined with direct atlas search or with an intermediate template image. Workflow Box (Mirada Medical, Oxford, UK) was used for all auto-contouring. Results on brain, brainstem, parotids and spinal cord were compared to random selection, a fixed set of 10 “good” atlases, and optimal selection by an “oracle” with knowledge of the ground truth. The Dice score and the average ranking with respect to the “oracle” were employed to assess the performance of the top 10 atlases selected by each method. Results: The fixed set of “good” atlases outperformed all of the atlas-patient image similarity-based selection methods (mean Dice 0.715 c.f. 0.603 to 0.677). In general, methods based on exhaustive comparison of local similarity measures showed better average Dice scores (0.658 to 0.677) than the use of either a template image (0.655 to 0.672) or global similarity measures (0.603 to 0.666). The performance of image-based selection methods was found to be only slightly better than random (0.645). The Dice scores given relate to the left parotid, but similar result patterns were observed for all organs. Conclusion: Intuitively, atlas selection based on the patient CT is expected to improve auto-contouring performance. However, it was found that published approaches performed only marginally better than random, while the use of a fixed set of representative atlases showed favourable performance. This research was funded via InnovateUK Grant 600277 as part of Eurostars Grant E!9297. DP, BS, MG and TK are employees of Mirada Medical Ltd.
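The Dice score used to rank the selection methods measures volume overlap between an auto-generated and a reference contour; a minimal sketch for binary masks (the tiny masks below are illustrative only):

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks
    (1 = voxel inside the contour); 1.0 for identical masks,
    0.0 for disjoint ones."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

print(dice([1, 1, 0, 0], [1, 0, 0, 0]))  # -> 2*1/(2+1) = 0.666...
```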

  17. Global Conservation Priorities for Marine Turtles

    PubMed Central

    Wallace, Bryan P.; DiMatteo, Andrew D.; Bolten, Alan B.; Chaloupka, Milani Y.; Hutchinson, Brian J.; Abreu-Grobois, F. Alberto; Mortimer, Jeanne A.; Seminoff, Jeffrey A.; Amorocho, Diego; Bjorndal, Karen A.; Bourjea, Jérôme; Bowen, Brian W.; Briseño Dueñas, Raquel; Casale, Paolo; Choudhury, B. C.; Costa, Alice; Dutton, Peter H.; Fallabrino, Alejandro; Finkbeiner, Elena M.; Girard, Alexandre; Girondot, Marc; Hamann, Mark; Hurley, Brendan J.; López-Mendilaharsu, Milagros; Marcovaldi, Maria Angela; Musick, John A.; Nel, Ronel; Pilcher, Nicolas J.; Troëng, Sebastian; Witherington, Blair; Mast, Roderic B.

    2011-01-01

    Where conservation resources are limited and conservation targets are diverse, robust yet flexible priority-setting frameworks are vital. Priority-setting is especially important for geographically widespread species with distinct populations subject to multiple threats that operate on different spatial and temporal scales. Marine turtles are widely distributed and exhibit intra-specific variations in population sizes and trends, as well as reproduction and morphology. However, current global extinction risk assessment frameworks do not assess conservation status of spatially and biologically distinct marine turtle Regional Management Units (RMUs), and thus do not capture variations in population trends, impacts of threats, or necessary conservation actions across individual populations. To address this issue, we developed a new assessment framework that allowed us to evaluate, compare and organize marine turtle RMUs according to status and threats criteria. Because conservation priorities can vary widely (i.e. from avoiding imminent extinction to maintaining long-term monitoring efforts) we developed a “conservation priorities portfolio” system using categories of paired risk and threats scores for all RMUs (n = 58). We performed these assessments and rankings globally, by species, by ocean basin, and by recognized geopolitical bodies to identify patterns in risk, threats, and data gaps at different scales. This process resulted in characterization of risk and threats to all marine turtle RMUs, including identification of the world's 11 most endangered marine turtle RMUs based on highest risk and threats scores. This system also highlighted important gaps in available information that is crucial for accurate conservation assessments. 
Overall, this priority-setting framework can provide guidance for research and conservation priorities at multiple relevant scales, and should serve as a model for conservation status assessments and priority-setting for widespread, long-lived taxa. PMID:21969858
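The paired risk/threats scoring behind the portfolio can be sketched as a simple quadrant assignment. The 0-1 scaling and the mid-point threshold below are hypothetical; the paper's actual criteria are more detailed than this.

```python
def portfolio_category(risk, threats, cut=0.5):
    """Place a regional management unit (RMU) into one of four paired
    risk/threats categories; a simplified stand-in for the conservation
    priorities portfolio described in the abstract (scores on a
    hypothetical 0-1 scale, 'cut' an assumed mid-point)."""
    r = "high risk" if risk >= cut else "low risk"
    t = "high threats" if threats >= cut else "low threats"
    return f"{r} / {t}"

# RMUs scoring high on both axes correspond to the most endangered group
print(portfolio_category(0.8, 0.9))  # -> high risk / high threats
```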

  18. Reference evapotranspiration forecasting based on local meteorological and global climate information screened by partial mutual information

    NASA Astrophysics Data System (ADS)

    Fang, Wei; Huang, Shengzhi; Huang, Qiang; Huang, Guohe; Meng, Erhao; Luan, Jinkai

    2018-06-01

    In this study, reference evapotranspiration (ET0) forecasting models are developed for the least economically developed regions, which are subject to meteorological data scarcity. Firstly, the partial mutual information (PMI), capable of capturing both linear and nonlinear dependence, is investigated regarding its utility to identify relevant predictors and exclude redundant ones, through comparison with partial linear correlation. An efficient input selection technique is crucial for decreasing model data requirements. Then, the interconnection between global climate indices and regional ET0 is identified. Relevant climate indices are introduced as additional predictors to supply information regarding ET0 that would otherwise have to come from the unavailable meteorological data. The case study in the Jing River and Beiluo River basins, China, reveals that PMI outperforms partial linear correlation in excluding redundant information, yielding smaller predictor sets. The teleconnection analysis identifies a correlation between Nino 1 + 2 and regional ET0, indicating influences of ENSO events on the evapotranspiration process in the study area. Furthermore, introducing Nino 1 + 2 as a predictor helps to yield more accurate ET0 forecasts. A model performance comparison also shows that nonlinear stochastic models (SVR or RF with input selection through PMI) do not always outperform linear models (MLR with inputs screened by linear correlation). However, the former can offer quite comparable performance while relying on smaller predictor sets. Therefore, efforts such as screening model inputs through PMI and incorporating global climate indices interconnected with ET0 can benefit the development of ET0 forecasting models suitable for data-scarce regions.
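Partial mutual information extends plain mutual information by conditioning on predictors already selected; the sketch below implements only the unconditional histogram estimate, which is enough to show how an informative predictor is separated from an irrelevant one. All series are synthetic.

```python
import numpy as np

def mutual_info(x, y, bins=10):
    """Histogram estimate of the mutual information (in nats) between two
    series. The partial MI used in the paper additionally conditions on
    predictors already selected; that step is omitted here."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float((pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])).sum())

rng = np.random.default_rng(2)
target = rng.standard_normal(5000)                   # stand-in for ET0
relevant = target + 0.3 * rng.standard_normal(5000)  # informative predictor
noise = rng.standard_normal(5000)                    # irrelevant predictor
mi_rel = mutual_info(relevant, target)
mi_noise = mutual_info(noise, target)                # mi_rel >> mi_noise
```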

  19. Status and Plans for the WCRP/GEWEX Global Precipitation Climatology Project (GPCP)

    NASA Technical Reports Server (NTRS)

    Adler, Robert F.

    2007-01-01

    The Global Precipitation Climatology Project (GPCP) is an international project under the auspices of the World Climate Research Program (WCRP) and GEWEX (Global Water and Energy Experiment). The GPCP group consists of scientists from agencies and universities in various countries who work together to produce a set of global precipitation analyses at monthly, pentad, and daily time scales. The status of the current products will be briefly summarized, focusing on the monthly analysis. Global and large regional rainfall variations and possible long-term changes are examined using the 27-year (1979-2005) monthly dataset. In addition to global patterns associated with phenomena such as ENSO, the data set is explored for evidence of long-term change. Although the global change of precipitation in the data set is near zero, the data set does indicate a small upward change in the Tropics (25S-25N) during the period, especially over ocean. Techniques are derived to isolate and eliminate variations due to ENSO and major volcanic eruptions, and the significance of the linear change is examined. Plans for a GPCP reprocessing for a Version 3 of products, potentially including a fine-time-resolution product, will be discussed. Current and future links to IPWG will also be addressed.
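The kind of linear-change estimate described here can be sketched with an ordinary least-squares trend fit. The anomaly series below is synthetic, with a weak prescribed trend plus noise; the actual GPCP tropical means are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic annual tropical-mean precipitation anomalies (mm/day),
# standing in for the 1979-2005 GPCP record.
years = np.arange(1979, 2006)
anoms = 0.002 * (years - 1979) + 0.005 * rng.standard_normal(years.size)

slope, intercept = np.polyfit(years, anoms, 1)  # least-squares linear trend
trend_per_decade = 10.0 * slope                 # a small upward change
```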

  20. Exercise Performance Measurement with Smartphone Embedded Sensor for Well-Being Management

    PubMed Central

    Liu, Chung-Tse; Chan, Chia-Tai

    2016-01-01

    Regular physical activity reduces the risk of many diseases and improves physical and mental health. However, physical inactivity is widespread globally. Improving physical activity levels is a global concern in well-being management. Exercise performance measurement systems have the potential to improve physical activity by providing feedback and motivation to users. We propose an exercise performance measurement system for well-being management that is based on the accumulated activity effective index (AAEI) and incorporates a smartphone-embedded sensor. The proposed system generates a numeric index that is based on users’ exercise performance: their level of physical activity and number of days spent exercising. The AAEI presents a clear number that can serve as a useful feedback and goal-setting tool. We implemented the exercise performance measurement system by using a smartphone and conducted experiments to assess the feasibility of the system and investigated the user experience. We recruited 17 participants for validating the feasibility of the measurement system and a total of 35 participants for investigating the user experience. The exercise performance measurement system showed an overall precision of 88% in activity level estimation. Users provided positive feedback about their experience with the exercise performance measurement system. The proposed system is feasible and has a positive effect on well-being management. PMID:27727188

  1. Exercise Performance Measurement with Smartphone Embedded Sensor for Well-Being Management.

    PubMed

    Liu, Chung-Tse; Chan, Chia-Tai

    2016-10-11

    Regular physical activity reduces the risk of many diseases and improves physical and mental health. However, physical inactivity is widespread globally. Improving physical activity levels is a global concern in well-being management. Exercise performance measurement systems have the potential to improve physical activity by providing feedback and motivation to users. We propose an exercise performance measurement system for well-being management that is based on the accumulated activity effective index (AAEI) and incorporates a smartphone-embedded sensor. The proposed system generates a numeric index that is based on users' exercise performance: their level of physical activity and number of days spent exercising. The AAEI presents a clear number that can serve as a useful feedback and goal-setting tool. We implemented the exercise performance measurement system by using a smartphone and conducted experiments to assess the feasibility of the system and investigated the user experience. We recruited 17 participants for validating the feasibility of the measurement system and a total of 35 participants for investigating the user experience. The exercise performance measurement system showed an overall precision of 88% in activity level estimation. Users provided positive feedback about their experience with the exercise performance measurement system. The proposed system is feasible and has a positive effect on well-being management.

  2. Scale-dependent performances of CMIP5 earth system models in simulating terrestrial vegetation carbon

    NASA Astrophysics Data System (ADS)

    Jiang, L.; Luo, Y.; Yan, Y.; Hararuk, O.

    2013-12-01

    Mitigation of global changes will depend on reliable projections of future conditions. As the major tools for predicting future climate, the Earth System Models (ESMs) used in the Coupled Model Intercomparison Project Phase 5 (CMIP5) for the IPCC Fifth Assessment Report have incorporated carbon cycle components, which account for the important fluxes of carbon between the ocean, atmosphere, and terrestrial biosphere carbon reservoirs, and are therefore expected to provide more detailed and more certain projections. However, ESMs are never perfect, and evaluating them can help us identify uncertainties in prediction and set priorities for model development. In this study, we benchmarked the carbon in live vegetation in terrestrial ecosystems simulated by 19 ESMs from CMIP5 against an observationally estimated data set of the global vegetation carbon pool, 'Olson's Major World Ecosystem Complexes Ranked by Carbon in Live Vegetation: An Updated Database Using the GLC2000 Land Cover Product' (Gibbs, 2006). Our aim is to evaluate the ability of ESMs to reproduce the global vegetation carbon pool at different scales and to identify possible causes of the biases. We found that the performance of the CMIP5 ESMs is strongly scale-dependent. CESM1-BGC, CESM1-CAM5, CESM1-FASTCHEM and CESM1-WACCM, together with NorESM1-M and NorESM1-ME (which share the same model structure), produce global sums very similar to the observational data but usually perform poorly at the grid-cell and biome scales. In contrast, MIROC-ESM and MIROC-ESM-CHEM simulate vegetation carbon best at the grid-cell and biome scales but show larger differences in global sums than the other models. Our results will help improve CMIP5 ESMs for more reliable prediction.

  3. Numerical and analytical investigation towards performance enhancement of a newly developed rockfall protective cable-net structure

    NASA Astrophysics Data System (ADS)

    Dhakal, S.; Bhandary, N. P.; Yatabe, R.; Kinoshita, N.

    2012-04-01

    In a previous companion paper, we presented a three-tier modelling of a particular type of rockfall protective cable-net structure (barrier), developed newly in Japan. Therein, we developed a three-dimensional, Finite Element based, nonlinear numerical model having been calibrated/back-calculated and verified with the element- and structure-level physical tests. Moreover, using a very simple, lumped-mass, single-degree-of-freedom, equivalently linear analytical model, a global-displacement-predictive correlation was devised by modifying the basic equation - obtained by combining the principles of conservation of linear momentum and energy - based on the back-analysis of the tests on the numerical model. In this paper, we use the developed models to explore the performance enhancement potential of the structure in terms of (a) the control of global displacement - possibly the major performance criterion for the proposed structure owing to a narrow space available in the targeted site, and (b) the increase in energy dissipation by the existing U-bolt-type Friction-brake Devices - which are identified to have performed weakly when integrated into the structure. A set of parametric investigations have revealed correlations to achieve the first objective in terms of the structure's mass, particularly by manipulating the wire-net's characteristics, and has additionally disclosed the effects of the impacting-block's parameters. Towards achieving the second objective, another set of parametric investigations have led to a proposal of a few innovative improvements in the constitutive behaviour (model) of the studied brake device (dissipator), in addition to an important recommendation of careful handling of the device based on the identified potential flaw.
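The basic equation underlying the global-displacement correlation, equating the block's kinetic energy with the strain energy of an equivalently linear single-degree-of-freedom system, can be written out directly. The mass, speed and stiffness below are arbitrary illustrative values, not parameters of the tested barrier, and the paper's modified correlation contains additional terms not shown here.

```python
import math

def peak_displacement(mass, velocity, k_eq):
    """Peak elongation of an equivalently linear single-degree-of-freedom
    model: the impacting block's kinetic energy 0.5*m*v**2 is equated with
    the stored strain energy 0.5*k*x**2, giving x = sqrt(2*E/k). This is
    the textbook starting point, not the paper's modified correlation."""
    energy = 0.5 * mass * velocity ** 2
    return math.sqrt(2.0 * energy / k_eq)

# a 2000 kg block at 10 m/s against an equivalent stiffness of 100 kN/m
x = peak_displacement(2000.0, 10.0, 100e3)  # -> sqrt(2) ~ 1.41 m
```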

  4. Junior doctors' extended work hours and the effects on their performance: the Irish case.

    PubMed

    Flinn, Fiona; Armstrong, Claire

    2011-04-01

    To explore the relationship between junior doctors' long working hours and their performance in a variety of cognitive and clinical decision-making tests, and to consider the implications of performance decrements in such tests for healthcare quality. A within-subject design was used to eliminate variation related to individual differences. Each participant was tested twice, once post call and once rested. At each session, participants were tested on cognitive functioning and clinical decision-making. The study was based in six acute Irish hospitals during 2008. Thirty junior hospital doctors took part, aged 23 to 30 years; 17 were female and 13 male. Cognitive functioning was measured by the MindStreams Global Assessment Battery (NeuroTrax Corp., NY, USA), a set of computerized tests designed for use in medical settings that assesses memory, executive function, visual spatial perception, verbal function, attention, information processing speed and motor skills. Clinical decision-making was tested using Key Features Problems, each consisting of a case scenario followed by three to four questions about that scenario. To make the test more realistic, the speed with which participants completed the three problems was also recorded. Participants' global cognitive scores, attention, information processing speed and motor skills were significantly worse post call than when rested. Participants also took longer to complete the clinical decision-making questions in the post-call condition and obtained lower scores than when rested. There are significant negative changes in doctors' cognitive functioning and clinical decision-making performance that appear to be attributable to long working hours. This raises the important question of whether working long hours decreases healthcare quality and compromises patient safety.

  5. Metric learning for automatic sleep stage classification.

    PubMed

    Phan, Huy; Do, Quan; Do, The-Luan; Vu, Duc-Lung

    2013-01-01

    We introduce in this paper a metric learning approach for automatic sleep stage classification based on single-channel EEG data. We show that, by learning a global metric from training data instead of using the default Euclidean metric, the k-nearest neighbor classification rule outperforms state-of-the-art methods on the Sleep-EDF dataset under various classification settings. The overall accuracies for the Awake/Sleep and 4-class classification settings are 98.32% and 94.49%, respectively. Furthermore, this superior accuracy is achieved by performing classification on a low-dimensional feature space derived from the time and frequency domains, without the need for artifact removal as a preprocessing step.
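The effect of replacing the Euclidean metric with a learned one can be demonstrated with a toy k-NN classifier under a Mahalanobis-style distance. The diagonal "learned" metric below is hand-set to down-weight a noise feature, standing in for a metric actually learned from training data; the data are synthetic, not EEG features.

```python
import numpy as np

rng = np.random.default_rng(4)

def knn_predict(train_X, train_y, test_X, M, k=3):
    """k-NN under the squared Mahalanobis-style metric
    d(a, b) = (a - b)^T M (a - b); M = I recovers plain Euclidean k-NN."""
    preds = []
    for x in test_X:
        diff = train_X - x
        dist = np.einsum('ij,jk,ik->i', diff, M, diff)
        votes = train_y[np.argsort(dist)[:k]]
        preds.append(np.bincount(votes).argmax())
    return np.array(preds)

# Two classes separated only along the first feature; the second feature
# is high-variance noise that swamps the Euclidean distance.
n = 200
y = rng.integers(0, 2, n)
X = np.column_stack([y + 0.1 * rng.standard_normal(n),
                     30.0 * rng.standard_normal(n)])
tr, te = np.arange(150), np.arange(150, 200)
acc_e = (knn_predict(X[tr], y[tr], X[te], np.eye(2)) == y[te]).mean()
acc_m = (knn_predict(X[tr], y[tr], X[te], np.diag([1.0, 1e-4])) == y[te]).mean()
# acc_m (learned-style metric) is far higher than acc_e (Euclidean)
```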

  6. Evaluation of observation-driven evaporation algorithms: results of the WACMOS-ET project

    NASA Astrophysics Data System (ADS)

    Miralles, Diego G.; Jimenez, Carlos; Ershadi, Ali; McCabe, Matthew F.; Michel, Dominik; Hirschi, Martin; Seneviratne, Sonia I.; Jung, Martin; Wood, Eric F.; (Bob) Su, Z.; Timmermans, Joris; Chen, Xuelong; Fisher, Joshua B.; Mu, Qiaozhen; Fernandez, Diego

    2015-04-01

    Terrestrial evaporation (ET) links the continental water, energy and carbon cycles. Understanding the magnitude and variability of ET at the global scale is an essential step towards reducing uncertainties in our projections of climatic conditions and water availability for the future. However, the need for global observational data of ET can be satisfied neither by our sparse global in-situ networks nor by the existing satellite sensors (which cannot measure evaporation directly from space). This situation has led to the recent rise of several algorithms dedicated to deriving ET fields from satellite data indirectly, based on the combination of ET-drivers that can be observed from space (e.g. radiation, temperature, phenological variability, water content, etc.). These algorithms can either be based on physics (e.g. Priestley and Taylor or Penman-Monteith approaches) or be purely statistical (e.g., machine learning). However, and despite the efforts from different initiatives like GEWEX LandFlux (Jimenez et al., 2011; Mueller et al., 2013), the uncertainties inherent in the resulting global ET datasets remain largely unexplored, partly due to a lack of inter-product consistency in forcing data. In response to this need, the ESA WACMOS-ET project started in 2012 with the main objectives of (a) developing a Reference Input Data Set to derive and validate ET estimates, and (b) performing a cross-comparison, error characterization and validation exercise of a group of selected ET algorithms driven by this Reference Input Data Set and by in-situ forcing data. The algorithms tested are SEBS (Su et al., 2002), the Penman-Monteith approach from MODIS (Mu et al., 2011), the Priestley and Taylor JPL model (Fisher et al., 2008), the MPI-MTE model (Jung et al., 2010) and GLEAM (Miralles et al., 2011). In this presentation we will show the first results from the ESA WACMOS-ET project.
The performance of the different algorithms at multiple spatial and temporal scales for the 2005-2007 reference period will be presented. The skill of these algorithms in closing the water balance over the continents will be assessed by comparisons to runoff data. The consistency in forcing data will allow us to (a) evaluate the skill of these five algorithms in producing ET over particular ecosystems, (b) facilitate the attribution of the observed differences to either algorithms or driving data, and (c) set up a solid scientific basis for the development of global long-term benchmark ET products. Project progress can be followed on our website http://wacmoset.estellus.eu. REFERENCES Fisher, J. B., Tu, K.P., and Baldocchi, D.D. Global estimates of the land-atmosphere water flux based on monthly AVHRR and ISLSCP-II data, validated at 16 FLUXNET sites. Remote Sens. Environ. 112, 901-919, 2008. Jiménez, C. et al. Global intercomparison of 12 land surface heat flux estimates. J. Geophys. Res. 116, D02102, 2011. Jung, M. et al. Recent decline in the global land evapotranspiration trend due to limited moisture supply. Nature 467, 951-954, 2010. Miralles, D.G. et al. Global land-surface evaporation estimated from satellite-based observations. Hydrol. Earth Syst. Sci. 15, 453-469, 2011. Mu, Q., Zhao, M. & Running, S.W. Improvements to a MODIS global terrestrial evapotranspiration algorithm. Remote Sens. Environ. 115, 1781-1800, 2011. Mueller, B. et al. Benchmark products for land evapotranspiration: LandFlux-EVAL multi-dataset synthesis. Hydrol. Earth Syst. Sci. 17, 3707-3720, 2013. Su, Z. The Surface Energy Balance System (SEBS) for estimation of turbulent heat fluxes. Hydrol. Earth Syst. Sci. 6, 85-99, 2002.
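    The runoff-based closure check mentioned in this record amounts to comparing long-term P - ET against observed discharge, since basin storage change is near zero over multi-year means. A minimal sketch with illustrative variable names and units (not the WACMOS-ET evaluation code):

```python
import numpy as np

def water_balance_residual(precip, et, runoff):
    """Long-term water-balance closure check for a river basin: over
    multi-year means, P - ET should approximate observed runoff R.
    Inputs are per-period basin means (e.g. mm/yr); returns P - ET - R."""
    return np.mean(precip) - np.mean(et) - np.mean(runoff)
```

    A large positive residual suggests the ET product underestimates evaporation for that basin (or the forcing precipitation is biased high), which is how runoff comparisons help attribute errors.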

  7. Seasonal to interannual Arctic sea ice predictability in current global climate models

    NASA Astrophysics Data System (ADS)

    Tietsche, S.; Day, J. J.; Guemas, V.; Hurlin, W. J.; Keeley, S. P. E.; Matei, D.; Msadek, R.; Collins, M.; Hawkins, E.

    2014-02-01

    We establish the first intermodel comparison of seasonal to interannual predictability of present-day Arctic climate by performing coordinated sets of idealized ensemble predictions with four state-of-the-art global climate models. For Arctic sea ice extent and volume, there is potential predictive skill for lead times of up to 3 years, and potential prediction errors have similar growth rates and magnitudes across the models. Spatial patterns of potential prediction errors differ substantially between the models, but some features are robust. Sea ice concentration errors are largest in the marginal ice zone, and in winter they are almost zero away from the ice edge. Sea ice thickness errors are amplified along the coasts of the Arctic Ocean, an effect that is dominated by sea ice advection. These results give an upper bound on the ability of current global climate models to predict important aspects of Arctic climate.

  8. Orthogonal recursive bisection data decomposition for high performance computing in cardiac model simulations: dependence on anatomical geometry.

    PubMed

    Reumann, Matthias; Fitch, Blake G; Rayshubskiy, Aleksandr; Keller, David U J; Seemann, Gunnar; Dossel, Olaf; Pitman, Michael C; Rice, John J

    2009-01-01

    The orthogonal recursive bisection (ORB) algorithm can be used as a data decomposition strategy to distribute the large data set of a cardiac model across a distributed memory supercomputer. It has been shown previously that good scaling results can be achieved using the ORB algorithm for data decomposition. However, the ORB algorithm depends on the distribution of computational load over the elements in the data set. In this work we investigated the dependence of data decomposition and load balancing on different rotations of the anatomical data set, with the aim of optimizing load balancing. The anatomical data set was given by both ventricles of the Visible Female data set at 0.2 mm resolution. Fiber orientation was included. The data set was rotated by 90 degrees around the x, y and z axes, respectively. By either translating or simply taking the magnitude of the resulting negative coordinates, we were able to create 14 data sets of the same anatomy with different orientations and positions in the overall volume. Computational load ratios for non-tissue vs. tissue elements used in the data decomposition were 1:1, 1:2, 1:5, 1:10, 1:25, 1:38.85, 1:50 and 1:100, to investigate the effect of different load ratios on the data decomposition. The ten Tusscher et al. (2004) electrophysiological cell model was used in monodomain simulations of 1 ms simulation time to compare performance across the different data sets and orientations. The simulations were carried out for load ratios 1:10, 1:25 and 1:38.85 on a 512-processor partition of the IBM Blue Gene/L supercomputer. The results show that the data decomposition does depend on the orientation and position of the anatomy in the global volume. The difference in total run time between the data sets is 10 s for a simulation time of 1 ms. This yields a difference of about 28 h for 10 s of simulation time. However, given larger processor partitions, the difference in run time decreases and becomes less significant.
Depending on the processor partition size, future work will have to consider the orientation of the anatomy in the global volume for longer simulation runs.
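    The ORB strategy described above can be sketched in a few lines: recursively cut the element set along its longest spatial axis so that each half carries roughly half of the computational load (rather than half of the elements). This is a generic illustration, not the Blue Gene/L implementation; the power-of-two restriction is a simplifying assumption.

```python
import numpy as np

def orb_partition(coords, weights, n_parts):
    """Orthogonal recursive bisection sketch: recursively split a weighted
    element set with an axis-aligned cut so each side carries ~half the
    total computational load. n_parts must be a power of two here."""
    parts = [np.arange(len(coords))]
    while len(parts) < n_parts:
        new_parts = []
        for idx in parts:
            pts = coords[idx]
            axis = np.argmax(pts.max(axis=0) - pts.min(axis=0))  # longest extent
            order = idx[np.argsort(pts[:, axis], kind="stable")]
            cum = np.cumsum(weights[order])
            cut = np.searchsorted(cum, cum[-1] / 2.0)  # balance load, not count
            new_parts.append(order[: cut + 1])
            new_parts.append(order[cut + 1:])
        parts = new_parts
    return parts
```

    Because the cut position follows the cumulative load, rotating the anatomy changes which axis is longest and where the load-balanced cuts fall, which is the effect the study measures.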

  9. A fuzzy controller with nonlinear control rules is the sum of a global nonlinear controller and a local nonlinear PI-like controller

    NASA Technical Reports Server (NTRS)

    Ying, Hao

    1993-01-01

    The fuzzy controllers studied in this paper are the ones that employ N trapezoidal-shaped members for input fuzzy sets, Zadeh fuzzy logic and a centroid defuzzification algorithm for output fuzzy set. The author analytically proves that the structure of the fuzzy controllers is the sum of a global nonlinear controller and a local nonlinear proportional-integral-like controller. If N approaches infinity, the global controller becomes a nonlinear controller while the local controller disappears. If linear control rules are used, the global controller becomes a global two-dimensional multilevel relay which approaches a global linear proportional-integral (PI) controller as N approaches infinity.
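    The class of controllers analyzed above can be illustrated with a minimal sketch: three input sets per variable, Zadeh's min for rule firing, and centroid (weighted-average) defuzzification over singleton rule outputs. The triangular set shapes and the linear rule table below are illustrative simplifications, not the trapezoidal configuration of the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_pi_increment(error, derror):
    """Fuzzy PI-like controller sketch: returns the control increment for
    the current error and change of error. Illustrative parameters only."""
    sets = {"N": (-2.0, -1.0, 0.0), "Z": (-1.0, 0.0, 1.0), "P": (0.0, 1.0, 2.0)}
    out = {"N": -1.0, "Z": 0.0, "P": 1.0}   # singleton output levels
    idx = {"N": -1, "Z": 0, "P": 1}
    inv = {-1: "N", 0: "Z", 1: "P"}
    num = den = 0.0
    for le, (a1, b1, c1) in sets.items():
        for lde, (a2, b2, c2) in sets.items():
            w = min(tri(error, a1, b1, c1), tri(derror, a2, b2, c2))  # Zadeh AND
            o = inv[max(-1, min(1, idx[le] + idx[lde]))]  # linear rule table
            num += w * out[o]
            den += w
    return num / den if den > 0 else 0.0
```

    Near the origin the output varies smoothly with both inputs (the local PI-like part), while saturation of the outer sets supplies the global nonlinear behavior the paper decomposes.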

  10. Child health in low-resource settings: pathways through UK paediatric training.

    PubMed

    Goenka, Anu; Magnus, Dan; Rehman, Tanya; Williams, Bhanu; Long, Andrew; Allen, Steve J

    2013-11-01

    UK doctors training in paediatrics benefit from experience of child health in low-resource settings. Institutions in low-resource settings reciprocally benefit from hosting UK trainees. A wide variety of opportunities exist for trainees working in low-resource settings, including clinical work, research and the development of transferable skills in management, education and training. This article explores a range of pathways for UK trainees to develop experience in low-resource settings. It is important for trainees to begin planning early and to build a robust rationale for global child health activities via established pathways, in the interests of their own professional development as well as UK service provision. In the future, run-through paediatric training may include core elements of global child health, as well as designated 'tracks' for those wishing to develop their career in global child health further. Hands-on experience in low-resource settings is a critical component of these training initiatives.

  11. On the fall 2010 Enhancements of the Global Precipitation Climatology Centre's Data Sets

    NASA Astrophysics Data System (ADS)

    Becker, A. W.; Schneider, U.; Meyer-Christoffer, A.; Ziese, M.; Finger, P.; Rudolf, B.

    2010-12-01

    Precipitation is now a top-listed parameter on the WMO GCOS list of 44 essential climate variables (ECVs). This is easily justified by its crucial role in sustaining any form of life on earth as the major source of fresh water, and by its major impact on weather, climate, climate change and related issues of society's adaptation to the latter. Finally, its occurrence is highly variable in space and time, bearing the potential to trigger major flood- and drought-related disasters. Since its start in 1989, the Global Precipitation Climatology Centre (GPCC) has performed global analyses of monthly precipitation for the earth's land surface on the basis of in-situ measurements. The effort was inaugurated as part of the Global Precipitation Climatology Project of the WMO World Climate Research Program (WCRP). Since then, the data set has grown continuously both in temporal coverage (the original start of the evaluation period was 1986) and in the extent and quality of the underlying data base. The number of stations involved has approximately doubled in the past 8 years, passing the 40,000, 60,000 and 80,000 thresholds in 2002, 2006 and 2010. The core data source of the GPCC analyses is the station networks operated by the National Meteorological Services worldwide; data deliveries have been received from ca. 190 countries. The GPCC also integrates other global precipitation data collections (i.e. FAO, CRU and GHCN), as well as regional data sets. Currently the Africa data set from S. Nicholson (Univ. Tallahassee) is being integrated. As a result of these efforts the GPCC holds the world's largest and most comprehensive collection of precipitation data, which is continuously updated and extended. Due to the high spatial-temporal variability of precipitation, even a global analysis requires this high number of stations to provide a sufficient density of measurement data at almost any place on the globe.
The acquired data sets are pre-checked, reformatted and then imported into a relational database, where they are archived separately in source-specific slots, thus allowing an inter-comparison of data from the different sources. Whenever new data sets are imported into the database, the metadata in the input data set are compared to those already available. In case of discrepancies (e.g. deviating coordinates), external geographical sources of information are used to decide whether a correction of the metadata in the database is required, resulting in a perpetual improvement of the station metadata. The presentation gives an account of the four major products derived from the GPCC database: two near real-time products comprising the precipitation data retrieved from the GTS, and two offline products that allow for hydro-climatological assessments. The real-time products are used, for example, to calibrate satellite-based precipitation measurements. To illustrate the potential of the offline (Full Data) products, we present an assessment of the strong 2010 La Niña season, which apparently caused severe weather patterns worldwide, including the flood disasters in Pakistan and Wuhan, China.

  12. Multiscale moment-based technique for object matching and recognition

    NASA Astrophysics Data System (ADS)

    Thio, HweeLi; Chen, Liya; Teoh, Eam-Khwang

    2000-03-01

    A new method is proposed to extract features from an object for matching and recognition. The features proposed are a combination of local and global characteristics -- local characteristics from the 1-D signature function that is defined at each pixel on the object boundary, global characteristics from the moments that are generated from the signature function. The boundary of the object is first extracted, then the signature function is generated by computing the angle between two lines from every point on the boundary as a function of position along the boundary. This signature function is position, scale and rotation invariant (PSRI). The shape of the signature function is then described quantitatively using moments. The moments of the signature function are global characteristics of a local feature set. Using moments as the eventual features instead of the signature function reduces the time and complexity of an object matching application. Multiscale moments are implemented to produce several sets of moments that yield more accurate matching. The multiscale technique is basically a coarse-to-fine procedure and makes the proposed method more robust to noise. This method is proposed to match and recognize objects under simple transformations, such as translation, scale changes, rotation and skewing. A simple logo indexing system is implemented to illustrate the performance of the proposed method.
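    The signature-then-moments pipeline can be sketched as follows: compute a turning-angle signature along the boundary (translation- and rotation-invariant by construction), then summarize it with a few central moments. The specific angle definition and moment orders here are illustrative choices, not the paper's exact formulation.

```python
import math

def turning_angles(boundary):
    """Signature sketch: signed turning angle at each boundary point,
    computed from the two incident edges of a closed polygon given as a
    CCW list of (x, y) tuples."""
    n = len(boundary)
    sig = []
    for i in range(n):
        x0, y0 = boundary[i - 1]
        x1, y1 = boundary[i]
        x2, y2 = boundary[(i + 1) % n]
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        d = a2 - a1
        while d <= -math.pi:   # normalize to (-pi, pi]
            d += 2 * math.pi
        while d > math.pi:
            d -= 2 * math.pi
        sig.append(d)
    return sig

def central_moments(sig, orders=(2, 3, 4)):
    """Global descriptors: central moments of the 1-D signature."""
    n = len(sig)
    mean = sum(sig) / n
    return [sum((s - mean) ** p for s in sig) / n for p in orders]
```

    For any simple closed CCW boundary the turning angles sum to 2*pi, and the moment vector is unchanged by rotating or translating the shape, which is the invariance the matching step relies on.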

  13. Fast kinematic ray tracing of first- and later-arriving global seismic phases

    NASA Astrophysics Data System (ADS)

    Bijwaard, Harmen; Spakman, Wim

    1999-11-01

    We have developed a ray tracing algorithm that traces first- and later-arriving global seismic phases precisely (traveltime errors of the order of 0.1 s) and with great computational efficiency (about 15 rays per second). To achieve this, we have extended and adapted two existing ray tracing techniques: a graph method and a perturbation method. The two resulting algorithms are able to trace (critically) refracted, (multiply) reflected, some diffracted (Pdiff), and (multiply) converted seismic phases in a 3-D spherical geometry, thus including the largest part of seismic phases that are commonly observed on seismograms. We have tested and compared the two methods in 2-D and 3-D Cartesian and spherical models, for which both algorithms have yielded precise paths and traveltimes. These tests indicate that only the perturbation method is computationally efficient enough to perform 3-D ray tracing on global data sets of several million phases. To demonstrate its potential for non-linear tomography, we have applied the ray perturbation algorithm to a data set of 7.6 million P and pP phases used by Bijwaard et al. (1998) for linearized tomography. This showed that the expected heterogeneity within the Earth's mantle leads to significant non-linear effects on traveltimes for 10 per cent of the applied phases.

  14. A theory for protein dynamics: Global anisotropy and a normal mode approach to local complexity

    NASA Astrophysics Data System (ADS)

    Copperman, Jeremy; Romano, Pablo; Guenza, Marina

    2014-03-01

    We propose a novel Langevin equation description for the dynamics of biological macromolecules by projecting the solvent and all atomic degrees of freedom onto a set of coarse-grained sites at the single-residue level. We utilize a multi-scale approach in which molecular dynamics simulations are performed to obtain equilibrium structural correlations, which are input to a modified Rouse-Zimm description that can be solved analytically. The normal mode solution provides a minimal basis set to account for important properties of biological polymers such as the anisotropic global structure, and internal motion on a complex free-energy surface. This multi-scale modeling method predicts the dynamics of both global rotational diffusion and constrained internal motion from the picosecond to the nanosecond regime, and is quantitative when compared to both simulation trajectories and NMR relaxation times. Utilizing non-equilibrium sampling techniques and an explicit treatment of the free-energy barriers in the mode coordinates, the model is extended to include biologically important fluctuations in the microsecond regime, such as bubble and fork formation in nucleic acids, and protein domain motion. This work was supported by the NSF under the Graduate STEM Fellows in K-12 Education (GK-12) program, grant DGE-0742540, and NSF grant DMR-0804145, with computational support from XSEDE and ACISS.

  15. Education for public health in Europe and its global outreach

    PubMed Central

    Bjegovic-Mikanovic, Vesna; Jovic-Vranes, Aleksandra; Czabanowska, Katarzyna; Otok, Robert

    2014-01-01

    Introduction: At the present time, higher education institutions dealing with education for public health in Europe and beyond are faced with a complex and comprehensive task of responding to global health challenges. Review: Literature reviews in public health and global health and exploration of internet presentations of regional and global organisations dealing with education for public health were the main methods employed in the work presented in this paper. Higher academic institutions are searching for appropriate strategies in competence-based education, which will increase the global attractiveness of their academic programmes and courses for continuous professional development. Academic professionals are taking advantage of blended learning and new web technologies. In Europe and beyond they are opening up debates about the scope of public health and global health. Nevertheless, global health is bringing revitalisation of public health education, which is recognised as one of the core components by many other academic institutions involved in global health work. More than ever, higher academic institutions for public health are recognising the importance of institutional partnerships with various organisations and efficient modes of cooperation in regional and global networks. Networking in a global setting is bringing new opportunities, but also opening debates about global harmonisation of competence-based education to achieve functional knowledge, increased mobility of public health professionals, better employability and affordable performance. Conclusions: As public health opportunities and threats are increasingly global, higher education institutions in Europe and in other regions have to look beyond national boundaries and participate in networks for education, research and practice. PMID:24560263

  16. A Review On Missing Value Estimation Using Imputation Algorithm

    NASA Astrophysics Data System (ADS)

    Armina, Roslan; Zain, Azlan Mohd; Azizah Ali, Nor; Sallehuddin, Roselina

    2017-09-01

    The presence of missing values in a data set has always been a major problem for precise prediction. A method for imputing missing values needs to minimize the effect of incomplete data sets on the prediction model. Many algorithms have been proposed as countermeasures to the missing value problem. In this review, we provide a comprehensive analysis of existing imputation algorithms, focusing on the techniques used and on whether global or local information in the data set is exploited for missing value estimation. In addition, validation methods for imputation results and ways to measure the performance of imputation algorithms are described. The objective of this review is to highlight possible improvements to existing methods, and it is hoped that it gives the reader a better understanding of trends in imputation methods.
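    The global-versus-local distinction the review draws can be made concrete with two toy imputers: one fills a missing entry with the global column mean, the other with the mean over the k nearest complete rows. Both are illustrative sketches (missing values encoded as None), not any specific surveyed algorithm.

```python
import math

def impute_global_mean(rows):
    """Global imputation sketch: replace each None with its column mean over
    all observed values, i.e. using global information only."""
    ncol = len(rows[0])
    means = []
    for j in range(ncol):
        vals = [r[j] for r in rows if r[j] is not None]
        means.append(sum(vals) / len(vals))
    return [[means[j] if r[j] is None else r[j] for j in range(ncol)] for r in rows]

def impute_knn(rows, k=2):
    """Local imputation sketch: fill a missing entry with the mean of that
    column over the k nearest rows (distance over shared observed columns).
    Assumes at least one candidate row observes the missing column."""
    out = [list(r) for r in rows]
    for i, r in enumerate(rows):
        for j, v in enumerate(r):
            if v is not None:
                continue
            cands = []
            for r2 in rows:
                if r2 is r or r2[j] is None:
                    continue
                shared = [(a, b) for a, b in zip(r, r2)
                          if a is not None and b is not None]
                if not shared:
                    continue
                d = math.sqrt(sum((a - b) ** 2 for a, b in shared) / len(shared))
                cands.append((d, r2[j]))
            cands.sort(key=lambda t: t[0])
            nbrs = [v2 for _, v2 in cands[:k]]
            out[i][j] = sum(nbrs) / len(nbrs)
    return out
```

    On clustered data the local imputer recovers values the global mean smears out, which is why the review tracks whether each algorithm uses global or local information.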

  17. The Importance of Global Health Experiences in the Development of New Cardiologists.

    PubMed

    Abdalla, Marwah; Kovach, Neal; Liu, Connie; Damp, Julie B; Jahangir, Eiman; Hilliard, Anthony; Gopinathannair, Rakesh; Abu-Fadel, Mazen S; El Chami, Mikhael F; Gafoor, Sameer; Vedanthan, Rajesh; Sanchez-Shields, Monica; George, Jon C; Priester, Tiffany; Alasnag, Mirvat; Barker, Colin; Freeman, Andrew M

    2016-06-14

    As the global burden of cardiovascular disease continues to increase worldwide, nurturing the development of early-career cardiologists interested in global health is essential to create a cadre of providers with the skill set to prevent and treat cardiovascular diseases in international settings. As such, interest in global health has increased among cardiology trainees and early-career cardiologists over the past decade. International clinical and research experiences abroad present an additional opportunity for growth and development beyond traditional cardiovascular training. We describe the American College of Cardiology International Cardiovascular Exchange Database, a new resource for cardiologists interested in pursuing short-term clinical exchange opportunities abroad, and report some of the benefits and challenges of global health cardiovascular training in both resource-limited and resource-abundant settings. Copyright © 2016 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  18. Evaluation of integrated assessment model hindcast experiments: a case study of the GCAM 3.0 land use module

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snyder, Abigail C.; Link, Robert P.; Calvin, Katherine V.

    Hindcasting experiments (conducting a model forecast for a time period in which observational data are available) are being undertaken increasingly often by the integrated assessment model (IAM) community, across many scales of models. When they are undertaken, the results are often evaluated using global aggregates or otherwise highly aggregated skill scores that mask deficiencies. We select a set of deviation-based measures that can be applied on different spatial scales (regional versus global) to make evaluating the large number of variable–region combinations in IAMs more tractable. We also identify performance benchmarks for these measures, based on the statistics of the observational dataset, that allow a model to be evaluated in absolute terms rather than relative to the performance of other models at similar tasks. An ideal evaluation method for hindcast experiments in IAMs would feature both absolute measures for evaluation of a single experiment for a single model and relative measures to compare the results of multiple experiments for a single model or the same experiment repeated across multiple models, such as in community intercomparison studies. The performance benchmarks highlight the use of this scheme for model evaluation in absolute terms, providing information about the reasons a model may perform poorly on a given measure and therefore identifying opportunities for improvement. To demonstrate the use of and types of results possible with the evaluation method, the measures are applied to the results of a past hindcast experiment focusing on land allocation in the Global Change Assessment Model (GCAM) version 3.0. The question of how to more holistically evaluate models as complex as IAMs is an area for future research. We find quantitative evidence that global aggregates alone are not sufficient for evaluating IAMs that require global supply to equal global demand at each time period, such as GCAM.
The results of this work indicate it is unlikely that a single evaluation measure for all variables in an IAM exists, and therefore sector-by-sector evaluation may be necessary.« less

  19. Evaluation of integrated assessment model hindcast experiments: a case study of the GCAM 3.0 land use module

    DOE PAGES

    Snyder, Abigail C.; Link, Robert P.; Calvin, Katherine V.

    2017-11-29

    Hindcasting experiments (conducting a model forecast for a time period in which observational data are available) are being undertaken increasingly often by the integrated assessment model (IAM) community, across many scales of models. When they are undertaken, the results are often evaluated using global aggregates or otherwise highly aggregated skill scores that mask deficiencies. We select a set of deviation-based measures that can be applied on different spatial scales (regional versus global) to make evaluating the large number of variable–region combinations in IAMs more tractable. We also identify performance benchmarks for these measures, based on the statistics of the observational dataset, that allow a model to be evaluated in absolute terms rather than relative to the performance of other models at similar tasks. An ideal evaluation method for hindcast experiments in IAMs would feature both absolute measures for evaluation of a single experiment for a single model and relative measures to compare the results of multiple experiments for a single model or the same experiment repeated across multiple models, such as in community intercomparison studies. The performance benchmarks highlight the use of this scheme for model evaluation in absolute terms, providing information about the reasons a model may perform poorly on a given measure and therefore identifying opportunities for improvement. To demonstrate the use of and types of results possible with the evaluation method, the measures are applied to the results of a past hindcast experiment focusing on land allocation in the Global Change Assessment Model (GCAM) version 3.0. The question of how to more holistically evaluate models as complex as IAMs is an area for future research. We find quantitative evidence that global aggregates alone are not sufficient for evaluating IAMs that require global supply to equal global demand at each time period, such as GCAM.
The results of this work indicate it is unlikely that a single evaluation measure for all variables in an IAM exists, and therefore sector-by-sector evaluation may be necessary.

  20. Evaluation of integrated assessment model hindcast experiments: a case study of the GCAM 3.0 land use module

    NASA Astrophysics Data System (ADS)

    Snyder, Abigail C.; Link, Robert P.; Calvin, Katherine V.

    2017-11-01

    Hindcasting experiments (conducting a model forecast for a time period in which observational data are available) are being undertaken increasingly often by the integrated assessment model (IAM) community, across many scales of models. When they are undertaken, the results are often evaluated using global aggregates or otherwise highly aggregated skill scores that mask deficiencies. We select a set of deviation-based measures that can be applied on different spatial scales (regional versus global) to make evaluating the large number of variable-region combinations in IAMs more tractable. We also identify performance benchmarks for these measures, based on the statistics of the observational dataset, that allow a model to be evaluated in absolute terms rather than relative to the performance of other models at similar tasks. An ideal evaluation method for hindcast experiments in IAMs would feature both absolute measures for evaluation of a single experiment for a single model and relative measures to compare the results of multiple experiments for a single model or the same experiment repeated across multiple models, such as in community intercomparison studies. The performance benchmarks highlight the use of this scheme for model evaluation in absolute terms, providing information about the reasons a model may perform poorly on a given measure and therefore identifying opportunities for improvement. To demonstrate the use of and types of results possible with the evaluation method, the measures are applied to the results of a past hindcast experiment focusing on land allocation in the Global Change Assessment Model (GCAM) version 3.0. The question of how to more holistically evaluate models as complex as IAMs is an area for future research. We find quantitative evidence that global aggregates alone are not sufficient for evaluating IAMs that require global supply to equal global demand at each time period, such as GCAM. 
The results of this work indicate it is unlikely that a single evaluation measure for all variables in an IAM exists, and therefore sector-by-sector evaluation may be necessary.
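    The kind of deviation-based measure with an observation-derived benchmark described in these records can be sketched as an RMSE-to-observed-standard-deviation ratio: a ratio below 1 means the model beats the trivial "predict the observed mean" baseline, an absolute criterion that needs no comparison model. This is an illustration of the idea, not the paper's exact measures.

```python
import math

def rmse(model, obs):
    """Root-mean-square deviation between modeled and observed series."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

def benchmark_ratio(model, obs):
    """Deviation measure with an observation-based benchmark: RMSE divided
    by the standard deviation of the observations. Ratio < 1 indicates the
    model is more informative than the observed-mean baseline."""
    mean = sum(obs) / len(obs)
    sd = math.sqrt(sum((o - mean) ** 2 for o in obs) / len(obs))
    return rmse(model, obs) / sd
```

    Applied per region and per variable, such ratios expose deficiencies that a single global aggregate would mask, which is the paper's central point.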

  1. An analysis of parameter sensitivities of preference-inspired co-evolutionary algorithms

    NASA Astrophysics Data System (ADS)

    Wang, Rui; Mansor, Maszatul M.; Purshouse, Robin C.; Fleming, Peter J.

    2015-10-01

    Many-objective optimisation problems remain challenging for many state-of-the-art multi-objective evolutionary algorithms. Preference-inspired co-evolutionary algorithms (PICEAs) which co-evolve the usual population of candidate solutions with a family of decision-maker preferences during the search have been demonstrated to be effective on such problems. However, it is unknown whether PICEAs are robust with respect to the parameter settings. This study aims to address this question. First, a global sensitivity analysis method - the Sobol' variance decomposition method - is employed to determine the relative importance of the parameters controlling the performance of PICEAs. Experimental results show that the performance of PICEAs is controlled for the most part by the number of function evaluations. Next, we investigate the effect of key parameters identified from the Sobol' test and the genetic operators employed in PICEAs. Experimental results show improved performance of the PICEAs as more preferences are co-evolved. Additionally, some suggestions for genetic operator settings are provided for non-expert users.
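    The Sobol' variance decomposition used in the study can be illustrated with a small pick-freeze estimator of first-order indices for a function of independent Uniform(0,1) inputs. This is a minimal sketch of the general technique (Saltelli-style estimator), not the study's implementation, and the function below is a made-up example.

```python
import numpy as np

def sobol_first_order(f, n_dim, n_samples, rng):
    """Pick-freeze estimator of first-order Sobol' indices S_i =
    Var(E[Y|X_i]) / Var(Y) for f with independent Uniform(0,1) inputs.
    Illustrative sketch, not a production implementation."""
    A = rng.random((n_samples, n_dim))
    B = rng.random((n_samples, n_dim))
    yA, yB = f(A), f(B)
    var = np.var(np.concatenate([yA, yB]))
    S = np.empty(n_dim)
    for i in range(n_dim):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # freeze all factors except the i-th
        S[i] = np.mean(yB * (f(ABi) - yA)) / var
    return S
```

    For Y = 3*X1 + X2 the analytic indices are 0.9 and 0.1; the estimator recovers the ranking of factor importance, just as the study ranks the parameters controlling PICEA performance.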

  2. User's guide to the Nimbus-4 backscatter ultraviolet experiment data sets

    NASA Technical Reports Server (NTRS)

    Lowrey, B. E.

    1978-01-01

    The first year's data from the Nimbus 4 backscatter ultraviolet (BUV) experiment have been archived in the National Space Science Data Center (NSSDC). Backscattered radiances in the ultraviolet measured by the satellite were used to compute the global total ozone for the period April 1970 - April 1971. The data sets now in the NSSDC are the results obtained by the Ozone Processing Team, which has processed the data with the purpose of determining the best quality of the data. There are four basic sets of data available in the NSSDC representing various stages in processing. The primary data base contains organized and cleaned data in telemetry units. The radiance data has had most of the engineering calibrations performed. The detailed total ozone data is the result of computations to obtain the total ozone; the Compressed Total Ozone data is a convenient condensation of the detailed total ozone. Product data sets are also included.

  3. Global Assessment of Seasonal Potential Distribution of Mediterranean Fruit Fly, Ceratitis capitata (Diptera: Tephritidae)

    PubMed Central

    Szyniszewska, Anna M.; Tatem, Andrew J.

    2014-01-01

    The Mediterranean fruit fly (Medfly) is one of the world's most economically damaging pests. It displays highly seasonal population dynamics, and the environmental conditions suitable for its abundance are not constant throughout the year in most places. An extensive literature search was performed to obtain the most comprehensive data on the historical and contemporary spatio-temporal occurrence of the pest globally. The database constructed contained 2328 unique geo-located entries on Medfly detection sites from 43 countries and nearly 500 unique localities, as well as information on hosts, life stages and capture method. Of these, 125 localities had information on the month when Medfly was recorded, and these data were complemented by additional material found in comprehensive databases available online. Records from 1980 to the present were used for Medfly environmental niche modeling. The Maximum Entropy algorithm (MaxEnt) and a set of seasonally varying environmental covariates were used to predict the fundamental niche of the Medfly on a global scale. Three seasonal maps were also produced: January-April, May-August and September-December. Models performed significantly better than random, achieving high accuracy scores and indicating good discrimination of suitable versus unsuitable areas for the presence of the species. PMID:25375649
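
MaxEnt presence-only modelling is closely related to fitting a logistic model that separates presence records from background points sampled from the study region. A minimal sketch of that idea, with synthetic covariates standing in for the real bioclimatic layers (all variable names and values here are assumptions, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two synthetic environmental covariates (say, a temperature and a moisture
# layer); presence is driven almost entirely by the first one.
n = 2000
X = rng.normal(size=(n, 2))
p_true = 1.0 / (1.0 + np.exp(-(2.5 * X[:, 0] - 0.2)))
y = (rng.uniform(size=n) < p_true).astype(float)   # 1 = presence, 0 = background

Xb = np.hstack([X, np.ones((n, 1))])               # add an intercept column
w = np.zeros(3)
for _ in range(500):                               # gradient ascent on the log-likelihood
    p = 1.0 / (1.0 + np.exp(-Xb @ w))
    w += 0.1 * Xb.T @ (y - p) / n

suitability = 1.0 / (1.0 + np.exp(-Xb @ w))        # suitability score in [0, 1]
```

The fitted suitability surface plays the role of the seasonal habitat maps; running the fit separately on covariates for each season would give the January-April, May-August and September-December layers described above.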

  4. Developing a global mixed-canopy, height-variable vegetation structure dataset for estimating global vegetation albedo by a clumped canopy radiative transfer scheme in the NASA Ent Terrestrial Biosphere Model and GISS GCM

    NASA Astrophysics Data System (ADS)

    Montes, Carlo; Kiang, Nancy Y.; Ni-Meister, Wenge; Yang, Wenze; Schaaf, Crystal; Aleinov, Igor; Jonas, Jeffrey A.; Zhao, Feng; Yao, Tian; Wang, Zhuosen; Sun, Qingsong; Carrer, Dominique

    2016-04-01

    Processes determining biosphere-atmosphere coupling are strongly influenced by vegetation structure. Thus, ecosystem carbon sequestration and evapotranspiration affecting global carbon and water balances will depend upon the spatial extent of vegetation, its vertical structure, and its physiological variability. To represent this globally, Dynamic Global Vegetation Models (DGVMs) coupled to General Circulation Models (GCMs) make use of satellite and/or model-based vegetation classifications often composed of homogeneous communities. This work aims at developing a new Global Vegetation Structure Dataset (GVSD) incorporating varying vegetation heights for mixed plant communities, to be used as boundary conditions to the Analytical Clumped Two-Stream (ACTS) canopy radiative transfer scheme (Ni-Meister et al., 2010) incorporated into the NASA Ent Terrestrial Biosphere Model (TBM), the DGVM coupled to the NASA Goddard Institute for Space Studies (GISS) GCM. Information sources about land surface and vegetation characteristics, obtained from a number of Earth observation platforms and algorithms, include the Moderate Resolution Imaging Spectroradiometer (MODIS) land cover and plant functional types (PFTs) (Friedl et al., 2010), soil albedo derived from MODIS (Carrer et al., 2014), and vegetation height from the Geoscience Laser Altimeter System (GLAS) on board ICESat (Ice, Cloud, and land Elevation Satellite) (Simard et al., 2011; Tang et al., 2014). Three widely used Leaf Area Index (LAI) products are compared as input to the GVSD and ACTS forcing in terms of vegetation albedo: Global Data Sets of Vegetation (LAI)3g (Zhu et al., 2013), Beijing Normal University LAI (Yuan et al., 2011), and the MODIS MOD15A2H product (Yang et al., 2006). Further PFT partitioning is performed according to a climate classification utilizing the Climatic Research Unit (CRU; Harris et al., 2013) and the NOAA Global Precipitation Climatology Centre (GPCC; Schneider et al., 2014) data. The final product is a GVSD consisting of mixed plant communities (e.g. mixed forests, savannas, mixed PFTs) following the Ecosystem Demography model approach (Moorcroft et al., 2001), represented by multi-cohort community patches at the sub-grid level of the GCM, which are ensembles of identical individuals whose differences are represented by PFTs, canopy height, density, and the sensitivity of vegetation structure to allometric parameters. The performance of the Ent TBM in estimating VIS-NIR vegetation albedo with the new GVSD and ACTS is assessed first by comparison against the previous GISS GCM vegetation classification and the prescribed Lambertian albedos of Matthews (1984), and secondly against global MODIS estimates and FLUXNET site-scale observations. Ultimately, this GVSD will serve as a template for community data sets and be used as boundary conditions to the Ent TBM for prediction of biomass, carbon balances, and GISS GCM climate.

  5. Global integrated drought monitoring and prediction system

    PubMed Central

    Hao, Zengchao; AghaKouchak, Amir; Nakhjiri, Navid; Farahmand, Alireza

    2014-01-01

    Drought is by far the most costly natural disaster that can lead to widespread impacts, including water and food crises. Here we present data sets available from the Global Integrated Drought Monitoring and Prediction System (GIDMaPS), which provides drought information based on multiple drought indicators. The system provides meteorological and agricultural drought information based on multiple satellite- and model-based precipitation and soil moisture data sets. GIDMaPS includes a near real-time monitoring component and a seasonal probabilistic prediction module. The data sets include historical drought severity data from the monitoring component, and probabilistic seasonal forecasts from the prediction module. The probabilistic forecasts provide essential information for early warning, taking preventive measures, and planning mitigation strategies. GIDMaPS data sets are a significant extension to current capabilities and data sets for global drought assessment and early warning. The presented data sets would be instrumental in reducing drought impacts especially in developing countries. Our results indicate that GIDMaPS data sets reliably captured several major droughts from across the globe. PMID:25977759
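
Many of the meteorological drought indicators behind systems like GIDMaPS standardize a precipitation record by mapping empirical percentiles onto standard normal deviates. A minimal SPI-style sketch; the gamma-distributed record and its length are illustrative assumptions, not GIDMaPS data:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(2)

# Hypothetical 40-year record of seasonal precipitation totals (mm) for one grid cell.
precip = rng.gamma(shape=2.0, scale=50.0, size=40)

# Empirical (Weibull) plotting-position percentile of each season's total,
# mapped to a standard normal deviate: an SPI-like standardized index.
ranks = precip.argsort().argsort() + 1             # 1 = driest season on record
prob = ranks / (len(precip) + 1)
spi = np.array([NormalDist().inv_cdf(p) for p in prob])

driest = int(np.argmin(precip))                    # most severe drought season
```

Negative index values flag drier-than-normal seasons; operational indices fit a parametric (e.g. gamma) distribution rather than using raw empirical ranks, but the standardization step is the same.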

  7. Global Characterization of Protein Altering Mutations in Prostate Cancer

    DTIC Science & Technology

    2011-08-01

    To date, we have performed integrative analyses of somatic mutation with gene expression and copy number change data collected on the same samples… implications for resistance to cancer therapeutics. We have also identified a subset of genes that appear to be recurrently mutated in our discovery set…

  8. Computer-aided global breast MR image feature analysis for prediction of tumor response to chemotherapy: performance assessment

    NASA Astrophysics Data System (ADS)

    Aghaei, Faranak; Tan, Maxine; Hollingsworth, Alan B.; Zheng, Bin; Cheng, Samuel

    2016-03-01

    Dynamic contrast-enhanced breast magnetic resonance imaging (DCE-MRI) has been used increasingly in breast cancer diagnosis and assessment of cancer treatment efficacy. In this study, we applied a computer-aided detection (CAD) scheme to automatically segment breast regions depicted on MR images and used the kinetic image features computed from the global breast MR images acquired before neoadjuvant chemotherapy to build a new quantitative model to predict the response of breast cancer patients to the chemotherapy. To assess the performance and robustness of this new prediction model, an image dataset involving breast MR images acquired from 151 cancer patients before undergoing neoadjuvant chemotherapy was retrospectively assembled and used. Among them, 63 patients had "complete response" (CR) to chemotherapy, in which the enhanced contrast levels inside the tumor volume (pre-treatment) were reduced to the level of the normal enhanced background parenchymal tissue (post-treatment), while 88 patients had "partial response" (PR), in which high contrast enhancement remained in the tumor regions after treatment. We analyzed the correlations among the 22 global kinetic image features and then selected a set of 4 optimal features. Applying an artificial neural network trained with the fusion of these 4 kinetic image features, the prediction model yielded an area under the ROC curve (AUC) of 0.83±0.04. This study demonstrated that, by avoiding tumor segmentation, which is often difficult and unreliable, fusion of kinetic image features computed from global breast MR images can still generate a useful clinical marker for predicting the efficacy of chemotherapy.
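
The prediction pipeline described, a small neural network over a handful of kinetic features evaluated by AUC, can be sketched as follows. The synthetic features, class separation, and network size are assumptions for illustration, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in for 151 patients x 4 selected kinetic features:
# class 1 = complete response (63 patients), class 0 = partial response (88).
n1, n0 = 63, 88
X = np.vstack([rng.normal(0.8, 1.0, (n1, 4)), rng.normal(0.0, 1.0, (n0, 4))])
y = np.array([1.0] * n1 + [0.0] * n0)

# One-hidden-layer network trained by plain batch backpropagation.
W1 = rng.normal(scale=0.5, size=(4, 8))
b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=8)
b2 = 0.0
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    g = (p - y) / len(y)                       # dLoss/dlogit for cross-entropy
    gh = np.outer(g, W2) * (1.0 - h ** 2)      # backprop through tanh
    W2 -= 0.5 * h.T @ g
    b2 -= 0.5 * g.sum()
    W1 -= 0.5 * X.T @ gh
    b1 -= 0.5 * gh.sum(axis=0)

# AUC as the probability that a responder outranks a non-responder.
order = np.argsort(p)
ranks = np.empty(len(p))
ranks[order] = np.arange(1, len(p) + 1)
auc = (ranks[y == 1].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)
```

Note this evaluates on the training set for brevity; the study's 0.83±0.04 figure implies a cross-validated estimate.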

  9. The internal gravity wave spectrum in two high-resolution global ocean models

    NASA Astrophysics Data System (ADS)

    Arbic, B. K.; Ansong, J. K.; Buijsman, M. C.; Kunze, E. L.; Menemenlis, D.; Müller, M.; Richman, J. G.; Savage, A.; Shriver, J. F.; Wallcraft, A. J.; Zamudio, L.

    2016-02-01

    We examine the internal gravity wave (IGW) spectrum in two sets of high-resolution global ocean simulations that are forced concurrently by atmospheric fields and the astronomical tidal potential. We analyze global 1/12th and 1/25th degree HYCOM simulations, and global 1/12th, 1/24th, and 1/48th degree simulations of the MITgcm. We are motivated by the central role that IGWs play in ocean mixing, by operational considerations of the US Navy, which runs HYCOM as an ocean forecast model, and by the impact of the IGW continuum on the sea surface height (SSH) measurements that will be taken by the planned NASA/CNES SWOT wide-swath altimeter mission. We (1) compute the IGW horizontal wavenumber-frequency spectrum of kinetic energy, and interpret the results with linear dispersion relations computed from the IGW Sturm-Liouville problem, (2) compute and similarly interpret nonlinear spectral kinetic energy transfers in the IGW band, (3) compute and similarly interpret IGW contributions to SSH variance, (4) perform comparisons of modeled IGW kinetic energy frequency spectra with moored current meter observations, and (5) perform comparisons of modeled IGW kinetic energy vertical wavenumber-frequency spectra with moored observations. This presentation builds upon our work in Muller et al. (2015, GRL), who performed tasks (1), (2), and (4) in 1/12th and 1/25th degree HYCOM simulations, for one region of the North Pacific. New for this presentation are tasks (3) and (5), the inclusion of MITgcm solutions, and the analysis of additional ocean regions.
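
Task (1), a horizontal wavenumber-frequency spectrum of kinetic energy, amounts to a 2-D Fourier transform of a space-time velocity field. A sketch on a synthetic single-wave field; the grid spacing, output interval, and wave indices are illustrative assumptions, not the HYCOM/MITgcm configurations:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic space-time velocity field: one progressive wave plus weak noise.
nx, nt = 128, 256
dx, dt = 5.0e3, 3600.0                       # 5 km grid, hourly output (assumed)
x = np.arange(nx) * dx
t = np.arange(nt) * dt
k0 = 2 * np.pi * 10 / (nx * dx)              # 10th horizontal wavenumber
w0 = 2 * np.pi * 20 / (nt * dt)              # 20th frequency bin (a "tidal" line, say)
u = np.cos(k0 * x[None, :] - w0 * t[:, None]) + 0.1 * rng.normal(size=(nt, nx))

# Wavenumber-frequency spectrum of kinetic energy: 2-D FFT in (t, x),
# then power at each (frequency, wavenumber) pair.
U = np.fft.fft2(u) / (nt * nx)
P = np.abs(U) ** 2

iw, ik = np.unravel_index(np.argmax(P[1:, 1:]), P[1:, 1:].shape)
peak_freq_index, peak_wavenumber_index = iw + 1, ik + 1
```

In the real analysis the spectral peaks are then compared against the linear dispersion curves from the IGW Sturm-Liouville problem to identify the vertical-mode ridges.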

  10. Evolving Multi Rover Systems in Dynamic and Noisy Environments

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Agogino, Adrian

    2005-01-01

    In this chapter, we address how to evolve control strategies for a collective: a set of entities that collectively strives to maximize a global evaluation function that rates the performance of the full system. Addressing such problems by directly applying a global evolutionary algorithm to a population of collectives is unworkable because the search space is prohibitively large. Instead, we focus on evolving control policies for each member of the collective, where each member is trying to maximize the fitness of its own population. The main difficulty with this approach is creating fitness evaluation functions for the members of the collective that induce the collective to achieve high performance with respect to the system-level goal. To overcome this difficulty, we derive member evaluation functions that are both aligned with the global evaluation function (ensuring that members achieving high fitness results in a collective with high fitness) and sensitive to the fitness of each member (a member's fitness depends more on its own actions than on the actions of other members). In a difficult rover coordination problem in dynamic and noisy environments, we show how to construct evaluation functions that lead to good collective behavior. The control policy evolved using aligned and member-sensitive evaluations outperforms global evaluation methods by up to a factor of four. In addition, we show that the collective continues to perform well in the presence of high noise levels and when the environment is highly dynamic. More notably, in the presence of a larger number of rovers or rovers with noisy sensors, the improvements due to the proposed method become significantly more pronounced.
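
Aligned, member-sensitive evaluations of the kind described are commonly realized as difference evaluations: a member's fitness is the global score minus the global score recomputed with that member removed. A toy sketch, with a set-coverage objective standing in for the rover observation task (the specific objective is an illustrative assumption):

```python
# Global evaluation: number of distinct points of interest observed by the
# team; redundant observations add nothing (a toy stand-in for the rover task).
def global_eval(observed_sets):
    covered = set().union(*observed_sets)
    return len(covered)

# Difference evaluation for member i: global score minus the global score with
# member i removed. It is aligned with G and sensitive to i's own actions.
def difference_eval(observed_sets, i):
    without_i = observed_sets[:i] + observed_sets[i + 1:]
    return global_eval(observed_sets) - global_eval(without_i)

team = [{1, 2}, {2, 3}, {3}]                 # what each rover observed
G = global_eval(team)
D = [difference_eval(team, i) for i in range(len(team))]
```

Here the first rover is the only one contributing a unique observation, so only it receives credit; rewarding every rover with the raw global score would blur that signal, which is exactly the alignment-versus-sensitivity trade-off the chapter addresses.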

  11. Parameter Estimation and Sensitivity Analysis of an Urban Surface Energy Balance Parameterization at a Tropical Suburban Site

    NASA Astrophysics Data System (ADS)

    Harshan, S.; Roth, M.; Velasco, E.

    2014-12-01

    Forecasting of urban weather and climate is of great importance as our cities become more populated, considering the combined effects of global warming and local land-use changes, which make urban inhabitants more vulnerable to, e.g., heat waves and flash floods. In meso- and global-scale models, urban parameterization schemes are used to represent urban effects. However, these schemes require a large set of input parameters related to urban morphological and thermal properties. Obtaining all these parameters through direct measurements is usually not feasible. A number of studies have reported on parameter estimation and sensitivity analysis to adjust and determine the most influential parameters for land surface schemes in non-urban areas. Similar work for urban areas is scarce; in particular, studies on urban parameterization schemes in tropical cities have so far not been reported. In order to address the above issues, the town energy balance (TEB) urban parameterization scheme (part of the SURFEX land surface modeling system) was subjected to a sensitivity and optimization/parameter estimation experiment at a suburban site in tropical Singapore. The sensitivity analysis was carried out as a screening test to identify the most sensitive or influential parameters. Thereafter, an optimization/parameter estimation experiment was performed to calibrate the input parameters. The sensitivity experiment was based on the improved Sobol' global variance decomposition method. The analysis showed that parameters related to road, roof and soil moisture have a significant influence on the performance of the model. The optimization/parameter estimation experiment was performed using the AMALGAM (a multi-algorithm genetically adaptive multi-objective method) evolutionary algorithm. The experiment showed a remarkable improvement compared to the simulations using the default parameter set. The calibrated parameters from this optimization experiment can be used for further model validation studies to identify inherent deficiencies in model physics.

  12. LandScan 2016 High-Resolution Global Population Data Set

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bright, Edward A; Rose, Amy N; Urban, Marie L

    The LandScan data set is a worldwide population database compiled on a 30" x 30" latitude/longitude grid. Census counts (at sub-national level) were apportioned to each grid cell based on likelihood coefficients, which are based on land cover, slope, road proximity, high-resolution imagery, and other data sets. The LandScan data set was developed as part of the Oak Ridge National Laboratory (ORNL) Global Population Project for estimating ambient populations at risk.
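
The apportionment step can be sketched directly: each census unit's total is spread over its grid cells in proportion to the cells' likelihood coefficients. All values below are invented for illustration:

```python
import numpy as np

# One hypothetical census unit with a known total, apportioned over its grid
# cells in proportion to likelihood coefficients derived from land cover,
# slope, road proximity, etc.
census_total = 10_000
likelihood = np.array([0.0, 2.0, 5.0, 3.0])   # 0.0 = uninhabitable (e.g. water)

weights = likelihood / likelihood.sum()
cell_pop = census_total * weights             # ambient population per cell
```

By construction the cell populations sum back to the census total, so the dasymetric redistribution never creates or loses people.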

  13. Word embeddings and recurrent neural networks based on Long-Short Term Memory nodes in supervised biomedical word sense disambiguation.

    PubMed

    Jimeno Yepes, Antonio

    2017-09-01

    Word sense disambiguation helps identify the proper sense of ambiguous words in text. With large terminologies such as the UMLS Metathesaurus, ambiguities appear and highly effective disambiguation methods are required. Supervised learning methods are one of the approaches used to perform disambiguation. Features extracted from the context of an ambiguous word are used to identify the proper sense of such a word. The type of features has an impact on machine learning methods and thus affects disambiguation performance. In this work, we have evaluated several types of features derived from the context of the ambiguous word, and we have also explored more global features derived from MEDLINE using word embeddings. Results show that word embeddings improve the performance of more traditional features and also allow using recurrent neural network classifiers based on Long-Short Term Memory (LSTM) nodes. The combination of unigrams and word embeddings with an SVM sets a new state-of-the-art performance with a macro accuracy of 95.97 on the MSH WSD data set. Copyright © 2017 Elsevier Inc. All rights reserved.
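
A common way to turn word embeddings into disambiguation features, as described above, is to average the vectors of the words surrounding the ambiguous term and compare against per-sense profiles. A nearest-centroid sketch with a toy random embedding table; in the paper the vectors are trained on MEDLINE and the classifier is an SVM, both of which are replaced here by simpler stand-ins:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy embedding table; in practice these would be pre-trained vectors
# (e.g. learned from MEDLINE), not random ones.
dim = 20
vocab = {w: rng.normal(size=dim)
         for w in ["virus", "fever", "cough", "winter", "ice", "temperature"]}

def context_vector(tokens):
    # Context feature: mean of the embeddings of the surrounding words.
    return np.mean([vocab[t] for t in tokens], axis=0)

# Two senses of an ambiguous word such as "cold", with training contexts each.
train = {
    "illness": [["virus", "fever"], ["fever", "virus", "cough"]],
    "weather": [["winter", "ice"], ["temperature", "winter"]],
}
centroids = {s: np.mean([context_vector(c) for c in ctxs], axis=0)
             for s, ctxs in train.items()}

def disambiguate(tokens):
    v = context_vector(tokens)
    return max(centroids, key=lambda s: (v @ centroids[s])
               / (np.linalg.norm(v) * np.linalg.norm(centroids[s])))
```

The averaged context vector is exactly the kind of dense "global" feature that, concatenated with unigram counts, feeds the SVM or LSTM classifiers evaluated in the paper.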

  14. Heidelberg Retina Tomograph 3 machine learning classifiers for glaucoma detection

    PubMed Central

    Townsend, K A; Wollstein, G; Danks, D; Sung, K R; Ishikawa, H; Kagemann, L; Gabriele, M L; Schuman, J S

    2010-01-01

    Aims To assess performance of classifiers trained on Heidelberg Retina Tomograph 3 (HRT3) parameters for discriminating between healthy and glaucomatous eyes. Methods Classifiers were trained using HRT3 parameters from 60 healthy subjects and 140 glaucomatous subjects. The classifiers were trained on all 95 variables and smaller sets created with backward elimination. Seven types of classifiers, including Support Vector Machines with radial basis (SVM-radial), and Recursive Partitioning and Regression Trees (RPART), were trained on the parameters. The area under the ROC curve (AUC) was calculated for classifiers, individual parameters and HRT3 glaucoma probability scores (GPS). Classifier AUCs and leave-one-out accuracy were compared with the highest individual parameter and GPS AUCs and accuracies. Results The highest AUC and accuracy for an individual parameter were 0.848 and 0.79, for vertical cup/disc ratio (vC/D). For GPS, global GPS performed best with AUC 0.829 and accuracy 0.78. SVM-radial with all parameters showed significant improvement over global GPS and vC/D with AUC 0.916 and accuracy 0.85. RPART with all parameters provided significant improvement over global GPS with AUC 0.899 and significant improvement over global GPS and vC/D with accuracy 0.875. Conclusions Machine learning classifiers of HRT3 data provide significant enhancement over current methods for detection of glaucoma. PMID:18523087

  15. Setting research priorities by applying the combined approach matrix.

    PubMed

    Ghaffar, Abdul

    2009-04-01

    Priority setting in health research is a dynamic process. Different organizations and institutes have been working in the field of research priority setting for many years. In 1999 the Global Forum for Health Research presented a research priority setting tool called the Combined Approach Matrix or CAM. Since its development, the CAM has been successfully applied to set research priorities for diseases, conditions and programmes at global, regional and national levels. This paper briefly explains the CAM methodology and how it could be applied in different settings, giving examples and describing challenges encountered in the process of setting research priorities and providing recommendations for further work in this field. The construct and design of the CAM is explained, along with the different steps needed, including planning and organization of a priority-setting exercise. The application of the CAM is described using three examples. The first concerns setting research priorities for a global programme, the second describes application at the country level and the third setting research priorities for diseases. Effective application of the CAM in different and diverse environments proves its utility as a tool for setting research priorities. Potential challenges encountered in the process of research priority setting are discussed and some recommendations for further work in this field are provided.

  16. Promoting medical students’ reflection on competencies to advance a global health equities curriculum

    PubMed Central

    2014-01-01

    Background The move to frame medical education in terms of competencies, the extent to which trainees "can do" a professional responsibility, is congruent with calls for accountability in medical education. However, the focus on competencies might be a poor fit with curricula intended to prepare students for responsibilities not emphasized in traditional medical education. This study examines an innovative approach to the use of potential competency expectations related to advancing global health equity to promote students' reflections and to inform curriculum development. Methods In 2012, 32 medical students were admitted into a newly developed Global Health and Disparities (GHD) Path of Excellence. The GHD program takes the form of mentored co-curricular activities built around defined competencies related to professional development and leadership skills intended to ameliorate health disparities in medically underserved settings, both domestically and globally. Students reviewed the GHD competencies from two perspectives: a) their ability to perform the identified competencies as they began the GHD program and b) the extent to which they perceived that their future career would require these responsibilities. For both sets of assessments, the response scale ranged from "Strongly Disagree" to "Strongly Agree." Wilcoxon's paired T-tests compared individual students' ordinal ratings of their current level of ability to the need for competence that they anticipated their careers would require. Statistical significance was set at p < .01. Results Students' ratings ranged from "strongly disagree" to "strongly agree" that they could perform the defined GHD-related competencies. However, on most competencies, at least 50% of students indicated that the stated competencies were beyond their present ability level. For each competency, the results of Wilcoxon paired T-tests indicate, at statistically significant levels, that students perceive more need in their careers for the GHD-program-defined competencies than competence they currently possess. Conclusion This study suggests congruence between student and program perceptions of the scope of practice required for GHD. Students report the need for enhanced skill levels in the careers they anticipate. This approach to formulating and reflecting on competencies will guide the program's design of learning experiences aligned with students' career goals. PMID:24886229
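
The paired comparison described, per-student ordinal ratings of current ability versus anticipated career need, is the setting for the Wilcoxon signed-rank test (the test usually meant by "Wilcoxon paired T-test"). A sketch with invented 5-point ratings for a single competency:

```python
import numpy as np
from scipy.stats import wilcoxon

# Invented paired ratings on a 1-5 scale ("strongly disagree" = 1 to
# "strongly agree" = 5): one pair per student for one competency.
ability = np.array([2, 3, 2, 1, 3, 2, 3, 2, 3, 2, 1, 3, 2, 2, 3, 2])
need    = np.array([4, 5, 4, 3, 4, 3, 4, 4, 5, 3, 3, 4, 4, 3, 5, 4])

# Wilcoxon signed-rank test on the per-student differences (need - ability).
stat, p = wilcoxon(need, ability)
```

Because every student here rates need above ability, the sum of negative ranks is zero and the p-value falls well below the study's .01 threshold; the real data would of course contain a mix of signs.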

  17. Evaluating a Satellite-derived Time Series of Inundation Dynamics

    NASA Astrophysics Data System (ADS)

    Matthews, E.; Papa, F.; Prigent, C.; McDonald, K.

    2006-12-01

    A new data set of inundation dynamics derived from a suite of satellites (Prigent et al.; Papa et al.) provides the first global, multi-year observations of monthly inundation extent. Initial global and regional evaluation of the data set using data on wetland/vegetation distributions from traditional and remote-sensing sources, GPCP rainfall, and altimeter-derived river heights indicates reasonable spatial distributions and seasonality. We extend the evaluation of this new data set - using independent multi-date, high-resolution satellite observations of inundated ecosystems and freeze-thaw dynamics, as well as climate data - focusing on a variety of boreal and tropical ecosystems representative of global wetlands. The goal is to investigate the strengths of the new data set, and develop strategies for improving weaknesses where identified.

  18. Towards identification of relevant variables in the observed aerosol optical depth bias between MODIS and AERONET observations

    NASA Astrophysics Data System (ADS)

    Malakar, N. K.; Lary, D. J.; Gencaga, D.; Albayrak, A.; Wei, J.

    2013-08-01

    Measurements made by the satellite-based Moderate Resolution Imaging Spectroradiometer (MODIS) and the globally distributed Aerosol Robotic Network (AERONET) are compared. Comparison of the aerosol optical depth values from the two data sets shows that there are biases between the two data products. In this paper, we present a general framework for identifying the set of variables most relevant to the observed bias, considering factors associated with the measurement conditions such as the solar and sensor zenith angles, the solar and sensor azimuths, scattering angles, and surface reflectivity at the various measured wavelengths. Specifically, we performed the analysis for the remote sensing Aqua-Land data set and used a machine learning technique, a neural network in this case, to perform multivariate regression between the ground truth and the training data sets. Finally, we used the mutual information between the observed and predicted values as the measure of similarity to identify the most relevant set of variables. The search is brute force, as we have to consider all possible combinations; the computation involves a huge amount of number crunching, which we implemented as a job-parallel program.
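
The relevance measure described, mutual information between observed and predicted values, can be estimated from a 2-D histogram. A sketch with synthetic "good" and "bad" predictors standing in for two candidate variable subsets (the bin count and noise levels are assumptions):

```python
import numpy as np

rng = np.random.default_rng(6)

def mutual_information(x, y, bins=16):
    # Histogram estimate of I(X;Y) in nats.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Score candidate variable subsets by the mutual information between the
# "ground truth" and each regression's prediction (synthetic stand-ins here).
truth = rng.normal(size=5000)
good_pred = truth + 0.3 * rng.normal(size=5000)   # subset with relevant variables
bad_pred = rng.normal(size=5000)                  # subset with irrelevant variables

mi_good = mutual_information(truth, good_pred)
mi_bad = mutual_information(truth, bad_pred)
```

In the brute-force search, each combination of candidate variables gets such a score, and the subset maximizing the mutual information is taken as most relevant to the bias.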

  19. An Improved Empirical Harmonic Model of the Celestial Intermediate Pole Offsets from a Global VLBI Solution

    NASA Astrophysics Data System (ADS)

    Belda, Santiago; Heinkelmann, Robert; Ferrándiz, José M.; Karbon, Maria; Nilsson, Tobias; Schuh, Harald

    2017-10-01

    Very Long Baseline Interferometry (VLBI) is the only space geodetic technique capable of measuring all the Earth orientation parameters (EOP) accurately and simultaneously. Modeling the Earth's rotational motion in space within the stringent consistency goals of the Global Geodetic Observing System (GGOS) makes VLBI observations essential for constraining the rotation theories. However, the inaccuracy of early VLBI data and the outdated products could cause non-compliance with these goals. In this paper, we perform a global VLBI analysis of sessions with different processing settings to determine a new set of empirical corrections to the precession offsets and rates, and to the amplitudes of a wide set of terms included in the IAU 2006/2000A precession-nutation theory. We discuss the results in terms of consistency, systematic errors, and physics of the Earth. We find that the largest improvements w.r.t. the values from IAU 2006/2000A precession-nutation theory are associated with the longest periods (e.g., 18.6-yr nutation). A statistical analysis of the residuals shows that the provided corrections attain an error reduction at the level of 15 μas. Additionally, including a Free Core Nutation (FCN) model into a priori Celestial Pole Offsets (CPOs) provides the lowest Weighted Root Mean Square (WRMS) of residuals. We show that the CPO estimates are quite insensitive to TRF choice, but slightly sensitive to the a priori EOP and the inclusion of different VLBI sessions. Finally, the remaining residuals reveal two apparent retrograde signals with periods of nearly 2069 and 1034 days.
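
Estimating an empirical correction to a single nutation term, as done here for a wide set of terms, reduces to linear least squares for the in-phase and out-of-phase amplitudes at a known period. A sketch on a synthetic celestial-pole-offset series; the amplitudes, noise level, and time span are illustrative assumptions, not the VLBI solution's values:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic daily celestial-pole-offset series (microarcseconds) containing a
# single 18.6-year harmonic plus noise.
t = np.arange(0.0, 25 * 365.25, 1.0)               # days
period = 18.6 * 365.25
a_true, b_true = 120.0, -80.0
y = (a_true * np.cos(2 * np.pi * t / period)
     + b_true * np.sin(2 * np.pi * t / period)
     + 15.0 * rng.normal(size=t.size))

# Empirical correction to one nutation term: linear least squares for the
# in-phase and out-of-phase amplitudes at the known period.
A = np.column_stack([np.cos(2 * np.pi * t / period),
                     np.sin(2 * np.pi * t / period)])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
a_hat, b_hat = coef
```

The full adjustment stacks one cosine/sine column pair per nutation term (plus precession offset and rate terms) into a single design matrix, which is why the long-period terms, sampled over fewer full cycles, carry the largest corrections.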

  20. Indicators of Family Care for Development for Use in Multicountry Surveys

    PubMed Central

    Kariger, Patricia; Engle, Patrice; Britto, Pia M. Rebello; Sywulka, Sara M.; Menon, Purnima

    2012-01-01

    Indicators of family care for development are essential for ascertaining whether families are providing their children with an environment that leads to positive developmental outcomes. This project aimed to develop indicators from a set of items, measuring family care practices and resources important for caregiving, for use in epidemiologic surveys in developing countries. A mixed method (quantitative and qualitative) design was used for item selection and evaluation. Qualitative and quantitative analyses were conducted to examine the validity of candidate items in several country samples. Qualitative methods included the use of global expert panels to identify and evaluate the performance of each candidate item as well as in-country focus groups to test the content validity of the items. The quantitative methods included analyses of item-response distributions, using bivariate techniques. The selected items measured two family care practices (support for learning/stimulating environment and limit-setting techniques) and caregiving resources (adequacy of the alternate caregiver when the mother worked). Six play-activity items, indicative of support for learning/stimulating environment, were included in the core module of UNICEF's Multiple Cluster Indictor Survey 3. The other items were included in optional modules. This project provided, for the first time, a globally-relevant set of items for assessing family care practices and resources in epidemiological surveys. These items have multiple uses, including national monitoring and cross-country comparisons of the status of family care for development used globally. The obtained information will reinforce attention to efforts to improve the support for development of children. PMID:23304914

  1. A semantic data dictionary method for database schema integration in CIESIN

    NASA Astrophysics Data System (ADS)

    Hinds, N.; Huang, Y.; Ravishankar, C.

    1993-08-01

    CIESIN (Consortium for International Earth Science Information Network) is funded by NASA to investigate the technology necessary to integrate and facilitate the interdisciplinary use of Global Change information. A key part of this mission is providing a link between the various global change data sets, in particular between the physical sciences and the human (social) sciences. The typical scientist using the CIESIN system will want to know how phenomena in an outside field affect his/her work. For example, a medical researcher might ask: how does air quality affect emphysema? This and many similar questions will require sophisticated semantic data integration. The researcher who raised the question may be familiar with medical data sets containing emphysema occurrences. But this same investigator may know little, if anything, about the existence or location of air-quality data. It is easy to envision a system which would allow that investigator to locate and perform a "join" on two data sets, one containing emphysema cases and the other containing air-quality levels. No such system exists today. One major obstacle to providing such a system is overcoming heterogeneity, which falls into two broad categories. "Database system" heterogeneity involves differences in data models and packages. "Data semantic" heterogeneity involves differences in terminology between disciplines, as well as varying levels of data refinement, from raw to summary. Our work investigates a global data dictionary mechanism to facilitate a merged data service. Specifically, we propose using a semantic tree during schema definition to aid in locating and integrating heterogeneous databases.
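
The envisioned emphysema/air-quality "join" reduces, in miniature, to reconciling the two schemas' keys through a shared dictionary and then matching records. All data and naming conventions below are invented for illustration:

```python
# Emphysema case rates and air-quality readings from two hypothetical data
# sets, linked through a shared region key.
emphysema = {"region_a": 41, "region_b": 12, "region_c": 33}     # cases per 100k
air_quality = {"REGION_A": 80, "REGION_B": 25, "REGION_C": 60}   # pollution index

# The semantic data dictionary's job, in miniature: reconcile the two schemas'
# naming conventions so they agree on a join key.
def normalize(key):
    return key.lower()

aq = {normalize(k): v for k, v in air_quality.items()}
joined = {k: (cases, aq[k]) for k, cases in emphysema.items() if k in aq}
```

The hard part the paper addresses is, of course, not the join itself but discovering that the two keys denote the same concept, which is what the semantic tree built during schema definition is for.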

  2. Local and global inhibition in bilingual word production: fMRI evidence from Chinese-English bilinguals

    PubMed Central

    Guo, Taomei; Liu, Hongyan; Misra, Maya; Kroll, Judith F.

    2011-01-01

    The current study examined the neural correlates associated with local and global inhibitory processes used by bilinguals to resolve interference between competing responses. Two groups of participants completed both blocked and mixed picture naming tasks while undergoing functional magnetic resonance imaging (fMRI). One group first named a set of pictures in L1, and then named the same pictures in L2. The other group first named pictures in L2, and then in L1. After the blocked naming tasks, both groups performed a mixed language naming task (i.e., naming pictures in either language according to a cue). The comparison between the blocked and mixed naming tasks, collapsed across groups, was defined as the local switching effect, while the comparison between blocked naming in each language was defined as the global switching effect. Distinct patterns of neural activation were found for local inhibition as compared to global inhibition in bilingual word production. Specifically, the results suggest that the dorsal anterior cingulate cortex (ACC) and the supplementary motor area (SMA) play important roles in local inhibition, while the dorsal left frontal gyrus and parietal cortex are important for global inhibition. PMID:21440072

  3. A fast global fitting algorithm for fluorescence lifetime imaging microscopy based on image segmentation.

    PubMed

    Pelet, S; Previte, M J R; Laiho, L H; So, P T C

    2004-10-01

    Global fitting algorithms have been shown to effectively improve the accuracy and precision of the analysis of fluorescence lifetime imaging microscopy data. Global analysis performs better than unconstrained data fitting when prior information exists, such as the spatial invariance of the lifetimes of individual fluorescent species. The highly coupled nature of global analysis often results in significantly slower convergence of the data fitting algorithm as compared with unconstrained analysis. Convergence speed can be greatly accelerated by providing appropriate initial guesses. Realizing that image morphology often correlates with fluorophore distribution, a global fitting algorithm has been developed that assigns initial guesses throughout an image based on a segmentation analysis. This algorithm was tested on both simulated data sets and time-domain lifetime measurements. We have successfully measured fluorophore distribution in fibroblasts stained with Hoechst and calcein. This method further allows second harmonic generation from collagen and elastin autofluorescence to be differentiated in fluorescence lifetime imaging microscopy images of ex vivo human skin. On our experimental measurements, this algorithm increased convergence speed by over two orders of magnitude and achieved significantly better fits. Copyright 2004 Biophysical Society
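    The shared-lifetime constraint at the heart of global analysis can be illustrated with a minimal sketch: for a single-exponential decay, linearizing gives ln I_p(t) = ln A_p - t/tau, so one least-squares problem can share a single lifetime tau across all pixels while each pixel keeps its own amplitude. This is a simplified, noise-free stand-in (one species, no segmentation step), not the authors' algorithm:

```python
import numpy as np

# Sketch: global fit of a single-exponential decay across pixels, sharing one
# lifetime (slope) while each pixel keeps its own amplitude (intercept).
def global_lifetime_fit(t, decays):
    # Linearize: ln I_p(t) = ln A_p - t / tau. Build a design matrix with one
    # intercept column per pixel and a single shared slope column (= 1/tau).
    n_pix, n_t = decays.shape
    y = np.log(decays).ravel()
    X = np.zeros((n_pix * n_t, n_pix + 1))
    for p in range(n_pix):
        X[p * n_t:(p + 1) * n_t, p] = 1.0    # per-pixel ln A_p
        X[p * n_t:(p + 1) * n_t, -1] = -t    # shared 1/tau
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    tau = 1.0 / coef[-1]
    amps = np.exp(coef[:n_pix])
    return tau, amps

# Two synthetic "pixels" with the same lifetime but different amplitudes.
t = np.linspace(0.0, 10.0, 50)
true_tau = 2.5
decays = np.array([3.0 * np.exp(-t / true_tau), 0.5 * np.exp(-t / true_tau)])
tau, amps = global_lifetime_fit(t, decays)
```

    In the real setting, segmentation would supply per-region initial guesses for an iterative nonlinear fit; here the linearized problem simply makes the shared-parameter idea explicit.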

  4. Global chemical profiling based quality evaluation approach of rhubarb using ultra performance liquid chromatography with tandem quadrupole time-of-flight mass spectrometry.

    PubMed

    Zhang, Li; Liu, Haiyu; Qin, Lingling; Zhang, Zhixin; Wang, Qing; Zhang, Qingqing; Lu, Zhiwei; Wei, Shengli; Gao, Xiaoyan; Tu, Pengfei

    2015-02-01

    A global chemical profiling approach using ultra performance liquid chromatography with tandem quadrupole time-of-flight mass spectrometry was developed for the quality evaluation of three rhubarb species, including Rheum palmatum L., Rheum tanguticum Maxim. ex Balf., and Rheum officinale Baill. Considering that comprehensive detection of chemical components is crucial for the global profile, a systematic column performance evaluation method was developed. Based on this, a Cortecs column was used to acquire the chemical profile, and Chempattern software was employed to conduct similarity evaluation and hierarchical cluster analysis. The results showed that R. tanguticum could be differentiated from R. palmatum and R. officinale at a similarity value of 0.65, but R. palmatum and R. officinale could not be distinguished effectively. Therefore, a common pattern based on the three rhubarb species was developed for the quality evaluation, and a similarity value of 0.50 was set as an appropriate threshold to control the quality of rhubarb. A total of 88 common peaks were identified by their accurate mass and fragmentation, and partially verified against reference standards. This verification showed that the newly developed method can be successfully used for evaluating the holistic quality of rhubarb and can provide a reference for the quality control of other herbal medicines. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Complexity transitions in global algorithms for sparse linear systems over finite fields

    NASA Astrophysics Data System (ADS)

    Braunstein, A.; Leone, M.; Ricci-Tersenghi, F.; Zecchina, R.

    2002-09-01

    We study the computational complexity of a very basic problem, namely that of finding solutions to a very large set of random linear equations over a finite Galois field GF(q). Using tools from statistical mechanics we are able to identify phase transitions in the structure of the solution space and to connect them to changes in the performance of a global algorithm, namely Gaussian elimination. Crossing the phase boundaries produces a dramatic increase in the memory and CPU requirements of the algorithm. In turn, this causes the saturation of the upper bounds for the running time. We illustrate the results on the specific problem of integer factorization, which is of central interest for deciphering messages encrypted with the RSA cryptosystem.
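    The global algorithm studied here, Gaussian elimination over a finite field, can be sketched for a prime modulus p, where pivot inverses come from Fermat's little theorem. A minimal dense solver for illustration only (the paper concerns very large sparse systems, for which this naive version would be impractical):

```python
def solve_mod_p(A, b, p):
    # Gauss-Jordan elimination over GF(p), p prime, for a nonsingular system.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]   # augmented matrix
    for col in range(n):
        # Find a row with a nonzero pivot in this column and swap it up.
        piv = next(r for r in range(col, n) if M[r][col] % p != 0)
        M[col], M[piv] = M[piv], M[col]
        inv = pow(M[col][col], p - 2, p)               # inverse via Fermat
        M[col] = [(v * inv) % p for v in M[col]]
        # Eliminate this column from every other row.
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col]
                M[r] = [(M[r][c] - f * M[col][c]) % p for c in range(n + 1)]
    return [M[r][n] for r in range(n)]

# x1 + x2 = 3, x1 + 2*x2 = 5 over GF(7)
x = solve_mod_p([[1, 1], [1, 2]], [3, 5], 7)
```

    The memory blow-up discussed in the abstract comes from fill-in: elimination destroys sparsity, so row storage grows as dense rows accumulate.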

  6. Towards Validation of SMAP: SMAPEX-4 & -5

    NASA Technical Reports Server (NTRS)

    Ye, Nan; Walker, Jeffrey; Wu, Xiaoling; Jackson, Thomas; Renzullo, Luigi; Merlin, Olivier; Rudiger, Christoph; Entekhabi, Dara; DeJeu, Richard; Kim, Edward

    2016-01-01

    L-band (1-2 GHz) microwave remote sensing is widely acknowledged as the most promising method for monitoring regional to global soil moisture. Consequently, the Soil Moisture Active Passive (SMAP) satellite applies this technique to provide global soil moisture every 2 to 3 days. To verify the performance of SMAP, the fourth and fifth SMAP Experiment campaigns (SMAPEx-4 and -5) were carried out at the beginning of the SMAP operational phase in the Murrumbidgee River catchment, southeast Australia. Airborne radar and radiometer observations, together with ground sampling of soil moisture, vegetation water content, and surface roughness, were collected in coincidence with SMAP overpasses. The SMAPEx-4 and -5 data sets will benefit SMAP post-launch calibration and validation under Australian land surface conditions.

  7. Localized Principal Component Analysis based Curve Evolution: A Divide and Conquer Approach

    PubMed Central

    Appia, Vikram; Ganapathy, Balaji; Yezzi, Anthony; Faber, Tracy

    2014-01-01

    We propose a novel localized principal component analysis (PCA) based curve evolution approach which evolves the segmenting curve semi-locally within various target regions (divisions) in an image and then combines these locally accurate segmentation curves to obtain a global segmentation. The training data for our approach consists of training shapes and associated auxiliary (target) masks. The masks indicate the various regions of the shape exhibiting highly correlated variations locally which may be rather independent of the variations in the distant parts of the global shape. Thus, in a sense, we are clustering the variations exhibited in the training data set. We then use a parametric model to implicitly represent each localized segmentation curve as a combination of the local shape priors obtained by representing the training shapes and the masks as a collection of signed distance functions. We also propose a parametric model to combine the locally evolved segmentation curves into a single hybrid (global) segmentation. Finally, we combine the evolution of these semilocal and global parameters to minimize an objective energy function. The resulting algorithm thus provides a globally accurate solution, which retains the local variations in shape. We present some results to illustrate how our approach performs better than the traditional approach with fully global PCA. PMID:25520901

  8. Assessing the Performance of GPS Precise Point Positioning Under Different Geomagnetic Storm Conditions during Solar Cycle 24.

    PubMed

    Luo, Xiaomin; Gu, Shengfeng; Lou, Yidong; Xiong, Chao; Chen, Biyan; Jin, Xueyuan

    2018-06-01

    The geomagnetic storm, an abnormal space weather phenomenon, can sometimes severely affect GPS signal propagation, thereby impacting the performance of GPS precise point positioning (PPP). However, investigation of GPS PPP accuracy on the global scale under different geomagnetic storm conditions has been very limited. This paper for the first time presents the performance of GPS dual-frequency (DF) and single-frequency (SF) PPP under moderate, intense, and super storm conditions during solar cycle 24, using a large data set collected from about 500 International GNSS Service (IGS) stations. The global root mean square (RMS) maps of GPS PPP results show that stations with degraded performance are mainly distributed at high latitudes, and the degradation level generally depends on the storm intensity. The three-dimensional (3D) RMS errors of GPS DF PPP at high latitudes during moderate, intense, and super storms are 0.393 m, 0.680 m, and 1.051 m, respectively, compared with only 0.163 m on a quiet day. RMS errors at mid- and low latitudes show less dependence on storm intensity, with values less than 0.320 m, compared to 0.153 m on a quiet day. Compared with DF PPP, the performance of GPS SF PPP is inferior under both quiet and disturbed conditions. The degraded performance of GPS positioning during geomagnetic storms is attributed to increased ionospheric disturbances, which are confirmed by our global rate of TEC index (ROTI) maps. Ionospheric disturbances lead not only to deteriorated ionospheric corrections but also to frequent cycle-slip occurrence. Statistical results show that, compared with a quiet day, cycle-slip occurrence increased by 13.04%, 56.52%, and 69.57% under moderate, intense, and super storm conditions, respectively.
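    The 3D RMS statistic quoted above combines the per-epoch east, north, and up position errors into one number; a minimal sketch (the coordinate names and sample values are illustrative, not data from the study):

```python
import math

# Hedged sketch of the three-dimensional RMS positioning-error statistic.
def rms_3d(errors):
    """errors: iterable of (east, north, up) position errors in metres."""
    errors = list(errors)
    return math.sqrt(sum(e * e + n * n + u * u for e, n, u in errors)
                     / len(errors))

# Made-up quiet-day error samples for two epochs.
quiet_day = [(0.05, 0.04, 0.12), (0.03, 0.06, 0.10)]
rms = rms_3d(quiet_day)
```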

  9. Improved Algorithms for Accurate Retrieval of UV - Visible Diffuse Attenuation Coefficients in Optically Complex, Inshore Waters

    NASA Technical Reports Server (NTRS)

    Cao, Fang; Fichot, Cedric G.; Hooker, Stanford B.; Miller, William L.

    2014-01-01

    Photochemical processes driven by high-energy ultraviolet radiation (UVR) in inshore, estuarine, and coastal waters play an important role in global biogeochemical cycles and biological systems. A key to modeling photochemical processes in these optically complex waters is an accurate description of the vertical distribution of UVR in the water column, which can be obtained using the diffuse attenuation coefficients of downwelling irradiance (Kd(λ)). The SeaUV/SeaUVc algorithms (Fichot et al., 2008) can accurately retrieve Kd(λ) (λ = 320, 340, 380, 412, 443, and 490 nm) in oceanic and coastal waters using multispectral remote sensing reflectances (Rrs(λ), SeaWiFS bands). However, the SeaUV/SeaUVc algorithms are currently not optimized for use in optically complex, inshore waters, where they tend to severely underestimate Kd(λ). Here, a new training data set of optical properties collected in optically complex, inshore waters was used to re-parameterize the published SeaUV/SeaUVc algorithms, resulting in improved Kd(λ) retrievals for turbid, estuarine waters. Although the updated SeaUV/SeaUVc algorithms perform best in optically complex waters, the published SeaUV/SeaUVc models still perform well in most coastal and oceanic waters. Therefore, we propose a composite set of SeaUV/SeaUVc algorithms, optimized for Kd(λ) retrieval in almost all marine systems, ranging from oceanic to inshore waters. The composite algorithm set can retrieve Kd(λ) from ocean color with good accuracy across this wide range of water types (e.g., within 13% mean relative error for Kd(340)). A validation step using three independent, in situ data sets indicates that the composite SeaUV/SeaUVc can generate accurate Kd values from 320 to 490 nm using satellite imagery on a global scale. Taking advantage of the inherent benefits of our statistical methods, we pooled the validation data with the training set, obtaining an optimized composite model for estimating Kd(λ) in UV wavelengths for almost all marine waters. This optimized composite set of SeaUV/SeaUVc algorithms will provide the optical community with improved ability to quantify the role of solar UV radiation in photochemical and photobiological processes in the ocean.

  10. Pyrolysis of reinforced polymer composites: Parameterizing a model for multiple compositions

    NASA Astrophysics Data System (ADS)

    Martin, Geraldine E.

    A single set of material properties was developed to describe the pyrolysis of fiberglass reinforced polyester composites at multiple composition ratios. Milligram-scale testing was performed on the unsaturated polyester (UP) resin using thermogravimetric analysis (TGA) coupled with differential scanning calorimetry (DSC) to establish and characterize an effective semi-global reaction mechanism consisting of three consecutive first-order reactions. Radiation-driven gasification experiments were conducted on the UP resin and the fiberglass composites at compositions ranging from 41 to 54 wt% resin and at external heat fluxes from 30 to 70 kW m-2. The back surface temperature was recorded with an infrared camera and used as the target for inverse analysis to determine the thermal conductivity of the systematically isolated constituent species. Manual iterations were performed in a comprehensive pyrolysis model, ThermaKin. The complete set of properties was validated by its ability to reproduce the mass loss rate during gasification testing.
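    The three-consecutive-first-order mechanism can be sketched as a simple ODE system A -> B -> C -> residue integrated with explicit Euler at a fixed temperature. The rate constants below are illustrative placeholders, not the fitted ThermaKin parameters (which in the actual model are temperature-dependent Arrhenius rates):

```python
import math

# Sketch of a semi-global mechanism of three consecutive first-order
# reactions A -> B -> C -> residue, integrated with explicit Euler at a
# fixed temperature. Rate constants are illustrative only.
def consecutive_first_order(k, dt=1e-3, t_end=5.0):
    kA, kB, kC = k
    A, B, C = 1.0, 0.0, 0.0          # start from pure component A
    steps = int(round(t_end / dt))
    for _ in range(steps):
        dA = -kA * A                  # A consumed
        dB = kA * A - kB * B          # B produced from A, consumed to C
        dC = kB * B - kC * C          # C produced from B, consumed to residue
        A += dA * dt
        B += dB * dt
        C += dC * dt
    return A, B, C

A, B, C = consecutive_first_order((1.0, 0.5, 0.2))
```

    For the first species the Euler result can be checked against the exact solution A(t) = exp(-kA t).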

  11. Stochastic Leader Gravitational Search Algorithm for Enhanced Adaptive Beamforming Technique

    PubMed Central

    Darzi, Soodabeh; Islam, Mohammad Tariqul; Tiong, Sieh Kiong; Kibria, Salehin; Singh, Mandeep

    2015-01-01

    In this paper, a stochastic leader gravitational search algorithm (SL-GSA) based on a randomized k is proposed. Standard GSA (SGSA) utilizes the best agents without any randomization, and is thus more prone to converging to suboptimal results. Initially, the new approach randomly chooses k agents from the set of all agents to improve the global search ability. Gradually, the set of agents is reduced by eliminating the agents with the poorest performance to allow rapid convergence. The performance of the SL-GSA was analyzed for six well-known benchmark functions, and the results are compared with SGSA and some of its variants. Furthermore, the SL-GSA is applied to the minimum variance distortionless response (MVDR) beamforming technique to ensure compatibility with real world optimization problems. The proposed algorithm demonstrates a superior convergence rate and quality of solution for both real world problems and benchmark functions compared to the original algorithm and other recent variants of SGSA. PMID:26552032
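    The stochastic-leader idea, attracting agents only toward a randomly chosen subset of k leaders rather than the deterministic best agents, can be sketched in a minimal GSA-style loop. This is a simplified illustration (the progressive elimination of poor agents and other SL-GSA details are omitted), not the authors' exact algorithm; all parameter values are placeholders:

```python
import math
import random

# Minimal GSA-style search with a randomly chosen leader subset per iteration.
def sl_gsa(f, dim=2, n_agents=20, k=5, iters=100, seed=1):
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_agents)]
    V = [[0.0] * dim for _ in range(n_agents)]
    for it in range(iters):
        fit = [f(x) for x in X]
        best, worst = min(fit), max(fit)
        # Normalized masses: better fitness -> larger mass.
        m = [(worst - fi) / (worst - best + 1e-12) for fi in fit]
        M = [mi / (sum(m) + 1e-12) for mi in m]
        G = 2.0 * math.exp(-5.0 * it / iters)        # decaying gravity constant
        leaders = rng.sample(range(n_agents), k)     # stochastic leader subset
        for i in range(n_agents):
            for d in range(dim):
                acc = sum(rng.random() * G * M[j] * (X[j][d] - X[i][d])
                          for j in leaders if j != i)
                V[i][d] = rng.random() * V[i][d] + acc
                X[i][d] += V[i][d]
    return min(X, key=f)

best = sl_gsa(lambda x: sum(v * v for v in x))       # sphere benchmark
```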

  12. Barriers to using eHealth data for clinical performance feedback in Malawi: A case study.

    PubMed

    Landis-Lewis, Zach; Manjomo, Ronald; Gadabu, Oliver J; Kam, Matthew; Simwaka, Bertha N; Zickmund, Susan L; Chimbwandira, Frank; Douglas, Gerald P; Jacobson, Rebecca S

    2015-10-01

    Sub-optimal performance of healthcare providers in low-income countries is a critical and persistent global problem. The use of electronic health information technology (eHealth) in these settings is creating large-scale opportunities to automate performance measurement and provision of feedback to individual healthcare providers, to support clinical learning and behavior change. An electronic medical record system (EMR) deployed in 66 antiretroviral therapy clinics in Malawi collects data that supervisors use to provide quarterly, clinic-level performance feedback. Understanding barriers to provision of eHealth-based performance feedback for individual healthcare providers in this setting could present a relatively low-cost opportunity to significantly improve the quality of care. The aims of this study were to identify and describe barriers to using EMR data for individualized audit and feedback for healthcare providers in Malawi and to consider how to design technology to overcome these barriers. We conducted a qualitative study using interviews, observations, and informant feedback in eight public hospitals in Malawi where an EMR system is used. We interviewed 32 healthcare providers and conducted seven hours of observation of system use. We identified four key barriers to the use of EMR data for clinical performance feedback: provider rotations, disruptions to care processes, user acceptance of eHealth, and performance indicator lifespan. Each of these factors varied across sites and affected the quality of EMR data that could be used for the purpose of generating performance feedback for individual healthcare providers. Using routinely collected eHealth data to generate individualized performance feedback shows potential at large-scale for improving clinical performance in low-resource settings. However, technology used for this purpose must accommodate ongoing changes in barriers to eHealth data use. 
Understanding the clinical setting as a complex adaptive system (CAS) may enable designers of technology to effectively model change processes to mitigate these barriers. Copyright © 2015. Published by Elsevier Ireland Ltd.

  13. Barriers to using eHealth data for clinical performance feedback in Malawi: A case study

    PubMed Central

    Landis-Lewis, Zach; Manjomo, Ronald; Gadabu, Oliver J; Kam, Matthew; Simwaka, Bertha N; Zickmund, Susan L; Chimbwandira, Frank; Douglas, Gerald P; Jacobson, Rebecca S

    2016-01-01

    Introduction Sub-optimal performance of healthcare providers in low-income countries is a critical and persistent global problem. The use of electronic health information technology (eHealth) in these settings is creating large-scale opportunities to automate performance measurement and provision of feedback to individual healthcare providers, to support clinical learning and behavior change. An electronic medical record system (EMR) deployed in 66 antiretroviral therapy clinics in Malawi collects data that supervisors use to provide quarterly, clinic-level performance feedback. Understanding barriers to provision of eHealth-based performance feedback for individual healthcare providers in this setting could present a relatively low-cost opportunity to significantly improve the quality of care. Objective The aims of this study were to identify and describe barriers to using EMR data for individualized audit and feedback for healthcare providers in Malawi and to consider how to design technology to overcome these barriers. Methods We conducted a qualitative study using interviews, observations, and informant feedback in eight public hospitals in Malawi where an EMR is used. We interviewed 32 healthcare providers and conducted seven hours of observation of system use. Results We identified four key barriers to the use of EMR data for clinical performance feedback: provider rotations, disruptions to care processes, user acceptance of eHealth, and performance indicator lifespan. Each of these factors varied across sites and affected the quality of EMR data that could be used for the purpose of generating performance feedback for individual healthcare providers. Conclusion Using routinely collected eHealth data to generate individualized performance feedback shows potential at large-scale for improving clinical performance in low-resource settings. However, technology used for this purpose must accommodate ongoing changes in barriers to eHealth data use. 
Understanding the clinical setting as a complex adaptive system (CAS) may enable designers of technology to effectively model change processes to mitigate these barriers. PMID:26238704

  14. Towards monitoring land-cover and land-use changes at a global scale: the global land survey 2005

    USGS Publications Warehouse

    Gutman, G.; Byrnes, Raymond A.; Masek, J.; Covington, S.; Justice, C.; Franks, S.; Headley, Rachel

    2008-01-01

    Land cover is a critical component of the Earth system, influencing land-atmosphere interactions, greenhouse gas fluxes, ecosystem health, and availability of food, fiber, and energy for human populations. The recent Integrated Global Observations of Land (IGOL) report calls for the generation of maps documenting global land cover at resolutions between 10m and 30m at least every five years (Townshend et al., in press). Moreover, despite 35 years of Landsat observations, there has not been a unified global analysis of land-cover trends nor has there been a global assessment of land-cover change at Landsat-like resolution. Since the 1990s, the National Aeronautics and Space Administration (NASA) and the U.S. Geological Survey (USGS) have supported development of data sets based on global Landsat observations (Tucker et al., 2004). These land survey data sets, usually referred to as GeoCover™, provide global, orthorectified, typically cloud-free Landsat imagery centered on the years 1975, 1990, and 2000, with a preference for leaf-on conditions. Collectively, these data sets provided a consistent set of observations to assess land-cover changes at a decadal scale. These data are freely available via the Internet from the USGS Center for Earth Resources Observation and Science (EROS) (see http://earthexplorer.usgs.gov or http://glovis.usgs.gov). This has resulted in unprecedented downloads of data, which are widely used in scientific studies of land-cover change (e.g., Boone et al., 2007; Harris et al., 2005; Hilbert, 2006; Huang et al. 2007; Jantz et al., 2005, Kim et al., 2007; Leimgruber, 2005; Masek et al., 2006). NASA and USGS are continuing to support land-cover change research through the development of GLS2005 - an additional global Landsat assessment circa 2005.
Going beyond the earlier initiatives, this data set will establish a baseline for monitoring changes on a 5-year interval and will pave the way toward continuous global land-cover monitoring at Landsat-like resolution in the next decade.

  15. Diagnostic Performance of Tuberculosis-Specific IgG Antibody Profiles in Patients with Presumptive Tuberculosis from Two Continents.

    PubMed

    Broger, Tobias; Basu Roy, Robindra; Filomena, Angela; Greef, Charles H; Rimmele, Stefanie; Havumaki, Joshua; Danks, David; Schneiderhan-Marra, Nicole; Gray, Christen M; Singh, Mahavir; Rosenkrands, Ida; Andersen, Peter; Husar, Gregory M; Joos, Thomas O; Gennaro, Maria L; Lochhead, Michael J; Denkinger, Claudia M; Perkins, Mark D

    2017-04-01

    Development of rapid diagnostic tests for tuberculosis is a global priority. A whole proteome screen identified Mycobacterium tuberculosis antigens associated with serological responses in tuberculosis patients. We used World Health Organization (WHO) target product profile (TPP) criteria for a detection test and triage test to evaluate these antigens. Consecutive patients presenting to microscopy centers and district hospitals in Peru and to outpatient clinics at a tuberculosis reference center in Vietnam were recruited. We tested blood samples from 755 HIV-uninfected adults with presumptive pulmonary tuberculosis to measure IgG antibody responses to 57 M. tuberculosis antigens using a field-based multiplexed serological assay and a 132-antigen bead-based reference assay. We evaluated single antigen performance and models of all possible 3-antigen combinations and multiantigen combinations. Three-antigen and multiantigen models performed similarly and were superior to single antigens. With specificity set at 90% for a detection test, the best sensitivity of a 3-antigen model was 35% (95% confidence interval [CI], 31-40). With sensitivity set at 85% for a triage test, the specificity of the best 3-antigen model was 34% (95% CI, 29-40). The reference assay also did not meet study targets. Antigen performance differed significantly between the study sites for 7/22 of the best-performing antigens. Although M. tuberculosis antigens were recognized by the IgG response during tuberculosis, no single antigen or multiantigen set performance approached WHO TPP criteria for clinical utility among HIV-uninfected adults with presumed tuberculosis in high-volume, urban settings in tuberculosis-endemic countries. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.

  16. Developing a global mixed-canopy, height-variable vegetation structure dataset for estimating global vegetation albedo and biomass in the NASA Ent Terrestrial Biosphere Model and GISS GCM

    NASA Astrophysics Data System (ADS)

    Montes, C.; Kiang, N. Y.; Yang, W.; Ni-Meister, W.; Schaaf, C.; Aleinov, I. D.; Jonas, J.; Zhao, F. A.; Yao, T.; Wang, Z.; Sun, Q.

    2015-12-01

    Processes determining biosphere-atmosphere coupling are strongly influenced by vegetation structure. Thus, ecosystem carbon sequestration and evapotranspiration affecting global carbon and water balances will depend upon the spatial extent of vegetation, its vertical structure, and its physiological variability. To represent this globally, Dynamic Global Vegetation Models (DGVMs) coupled to General Circulation Models (GCMs) make use of satellite and/or model-based vegetation classifications often composed by homogeneous communities. This work aims at developing a new Global Vegetation Structure Dataset (GVSD) by incorporating varying vegetation heights for mixed plant communities to be used as input to the Ent Terrestrial Biosphere Model (TBM), the DGVM coupled to the NASA Goddard Institute for Space Studies (GISS) GCM. Information sources include the Moderate Resolution Imaging Spectroradiometer (MODIS) land cover and plant functional types (PFTs) (Friedl et al., 2010), vegetation height from the Geoscience Laser Altimeter System (GLAS) on board ICESat (Ice, Cloud, and land Elevation Satellite) (Simard et al., 2011; Tang et al., 2014) along with the Global Data Sets of Vegetation Leaf Area Index (LAI)3g (Zhu et al. 2013). Further PFT partitioning is performed according to a climate classification utilizing the Climate Research Unit (CRU) and the NOAA Global Precipitation Climatology Centre (GPCC) data. Final products are a GVSD consisting of mixed plant communities (e.g. mixed forests, savannas, mixed PFTs) following the Ecosystem Demography model (Moorcroft et al., 2001) approach represented by multi-cohort community patches at the sub-grid level of the GCM, which are ensembles of identical individuals whose differences are represented by PFTs, canopy height, density and vegetation structure sensitivity to allometric parameters. 
To assess the sensitivity of the GISS GCM to vegetation structure, we produce a range of estimates of Ent TBM biomass and plant densities by varying allometric specifications. Ultimately, this GVSD will serve as a template for community data sets, and be used as boundary conditions to the Ent TBM for prediction of canopy albedo in the Analytical Clumped Two-Stream canopy radiative transfer scheme, biomass, primary productivity, respiration, and GISS GCM climate.

  17. Building global models for fat and total protein content in raw milk based on historical spectroscopic data in the visible and short-wave near infrared range.

    PubMed

    Melenteva, Anastasiia; Galyanin, Vladislav; Savenkova, Elena; Bogomolov, Andrey

    2016-07-15

    A large set of fresh cow milk samples collected from many suppliers over a large geographical area in Russia during a year has been analyzed by optical spectroscopy in the range 400-1100 nm in accordance with a previously developed scatter-based technique. The global (i.e. resistant to seasonal, genetic, regional and other variations of the milk composition) models for fat and total protein content, which were built using partial least-squares (PLS) regression, exhibit satisfactory prediction performance, enabling their practical application in the dairy industry. The root mean-square errors of prediction (RMSEP) were 0.09 and 0.10 for fat and total protein content, respectively. The issues of raw milk analysis and multivariate modelling based on historical spectroscopic data are considered, and approaches to the creation of global models and their transfer between instruments are proposed. The availability of global models should significantly facilitate the dissemination of optical spectroscopic methods for laboratory and in-line quantitative milk analysis. Copyright © 2016. Published by Elsevier Ltd.
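    The RMSEP figure of merit used to score the global models is straightforward to compute; a minimal sketch, with made-up predicted and reference values (not the paper's data):

```python
import math

# RMSEP: root mean-square error of prediction over a test set.
def rmsep(predicted, reference):
    n = len(reference)
    return math.sqrt(sum((p - r) ** 2 for p, r in zip(predicted, reference)) / n)

# Illustrative fat-content predictions vs. laboratory reference values (wt%).
fat_pred = [3.62, 4.11, 3.48]
fat_ref = [3.50, 4.00, 3.55]
error = rmsep(fat_pred, fat_ref)
```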

  18. A Comprehensive Radial Velocity Error Budget for Next Generation Doppler Spectrometers

    NASA Technical Reports Server (NTRS)

    Halverson, Samuel; Ryan, Terrien; Mahadevan, Suvrath; Roy, Arpita; Bender, Chad; Stefansson, Guomundur Kari; Monson, Andrew; Levi, Eric; Hearty, Fred; Blake, Cullen; hide

    2016-01-01

    We describe a detailed radial velocity error budget for the NASA-NSF Extreme Precision Doppler Spectrometer instrument concept NEID (NN-explore Exoplanet Investigations with Doppler spectroscopy). Such an instrument performance budget is a necessity for both identifying the variety of noise sources currently limiting Doppler measurements, and estimating the achievable performance of next generation exoplanet hunting Doppler spectrometers. For these instruments, no single source of instrumental error is expected to set the overall measurement floor. Rather, the overall instrumental measurement precision is set by the contribution of many individual error sources. We use a combination of numerical simulations, educated estimates based on published materials, extrapolations of physical models, results from laboratory measurements of spectroscopic subsystems, and informed upper limits for a variety of error sources to identify likely sources of systematic error and construct our global instrument performance error budget. While natively focused on the performance of the NEID instrument, this modular performance budget is immediately adaptable to a number of current and future instruments. Such an approach is an important step in charting a path towards improving Doppler measurement precisions to the levels necessary for discovering Earth-like planets.
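    When the individual error sources are independent, a modular budget of this kind rolls up into a single figure by adding terms in quadrature. The term names and magnitudes below are illustrative placeholders, not NEID's actual budget entries:

```python
import math

# Combine independent instrumental error terms (e.g. in cm/s) in quadrature:
# sigma_total = sqrt(sum_i sigma_i^2).
def total_rv_error(terms_cm_s):
    return math.sqrt(sum(v * v for v in terms_cm_s.values()))

# Hypothetical budget entries, for illustration only.
budget = {"photon noise": 25.0, "wavelength calibration": 10.0,
          "detector effects": 5.0, "telluric residuals": 8.0}
total = total_rv_error(budget)
```

    A useful property of the quadrature sum is that it is dominated by the largest terms, which is why no single source needs to set the overall floor for the floor to be well defined.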

  19. Local Table Condensation in Rough Set Approach for Jumping Emerging Pattern Induction

    NASA Astrophysics Data System (ADS)

    Terlecki, Pawel; Walczak, Krzysztof

    This paper extends the rough set approach to JEP induction based on the notion of a condensed decision table. The original transaction database is transformed to a relational form and patterns are induced by means of local reducts. The transformation employs an item aggregation obtained by coloring a graph that reflects conflicts among items. For efficiency reasons we propose to perform this preprocessing locally, i.e. at the transaction level, to achieve a higher dimensionality gain. A special maintenance strategy is also used to avoid graph rebuilds. Both the global and local approaches have been tested and discussed for dense and synthetically generated sparse datasets.
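    The conflict-graph coloring step can be sketched with a greedy coloring: conflicting items share an edge, and each resulting color class groups non-conflicting items that can be aggregated into one attribute. A minimal illustration (the paper's transaction-level local variant and its maintenance strategy are not shown):

```python
# Greedy coloring of an item-conflict graph. Items are 0..n_items-1 and
# conflicts is a set of undirected edges stored as sorted tuples; each color
# class collects items that never conflict and can be condensed together.
def greedy_coloring(n_items, conflicts):
    color = {}
    for v in range(n_items):
        used = {color[u] for u in range(n_items)
                if (min(u, v), max(u, v)) in conflicts and u in color}
        c = 0
        while c in used:                 # smallest color absent among neighbors
            c += 1
        color[v] = c
    return color

# Four hypothetical items; item 3 only conflicts with item 2.
edges = {(0, 1), (1, 2), (0, 2), (2, 3)}
coloring = greedy_coloring(4, edges)
```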

  20. Experimental test of an online ion-optics optimizer

    NASA Astrophysics Data System (ADS)

    Amthor, A. M.; Schillaci, Z. M.; Morrissey, D. J.; Portillo, M.; Schwarz, S.; Steiner, M.; Sumithrarachchi, Ch.

    2018-07-01

    A technique has been developed and tested to automatically adjust multiple electrostatic or magnetic multipoles on an ion optical beam line - according to a defined optimization algorithm - until an optimal tune is found. This approach simplifies the process of determining high-performance optical tunes, satisfying a given set of optical properties, for an ion optical system. The optimization approach is based on the particle swarm method and is entirely model independent, thus the success of the optimization does not depend on the accuracy of an extant ion optical model of the system to be optimized. Initial test runs of a first order optimization of a low-energy (<60 keV) all-electrostatic beamline at the NSCL show reliable convergence of nine quadrupole degrees of freedom to well-performing tunes within a reasonable number of trial solutions, roughly 500, with full beam optimization run times of roughly two hours. Improved tunes were found both for quasi-local optimizations and for quasi-global optimizations, indicating a good ability of the optimizer to find a solution with or without a well defined set of initial multipole settings.
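    The particle swarm method underlying such an online tuner can be sketched compactly. The following Python sketch is illustrative only: the objective function, bounds, swarm size and coefficients are assumptions, not the NSCL beamline configuration, where the objective would be a live beam-quality figure of merit.

```python
import random

random.seed(0)  # reproducible demo

def pso(objective, dim, bounds, n_particles=20, iters=100,
        w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer. Model independent: it needs only
    an objective callable, e.g. a measured beam-quality figure of merit."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # personal best positions
    pbest_val = [objective(p) for p in pos]
    g_i = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g_i][:], pbest_val[g_i]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # pull each particle towards its own best and the swarm best
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest_val[i], pbest[i] = val, pos[i][:]
                if val < gbest_val:
                    gbest_val, gbest = val, pos[i][:]
    return gbest, gbest_val

# Toy stand-in for the figure of merit: nine "quadrupole" settings,
# with the optimum at the origin.
best, best_val = pso(lambda x: sum(v * v for v in x), dim=9,
                     bounds=(-1.0, 1.0))
```

    Because the objective is treated as a black box, the same loop applies whether the figure of merit comes from an optics model or, as in the experiment, from measurements on the actual beam.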

  1. External quality assessment study for ebolavirus PCR-diagnostic promotes international preparedness during the 2014 – 2016 Ebola outbreak in West Africa

    PubMed Central

    Jacobsen, Sonja; Patel, Pranav; Rieger, Toni; Eickmann, Markus; Becker, Stephan; Günther, Stephan; Naidoo, Dhamari; Schrick, Livia; Keeren, Kathrin; Targosz, Angelina; Teichmann, Anette; Formenty, Pierre; Niedrig, Matthias

    2017-01-01

    During the recent Ebola outbreak in West Africa several international mobile laboratories were deployed to the mainly affected countries Guinea, Sierra Leone and Liberia to provide ebolavirus diagnostic capacity. Additionally, imported cases and small outbreaks in other countries required global preparedness for Ebola diagnostics. Detection of viral RNA by reverse transcription polymerase chain reaction has proven effective for diagnosis of ebolavirus disease and several assays are available. However, the reliability of these assays is largely unknown and requires serious evaluation. Therefore, a proficiency test panel of 11 samples was generated and distributed on a global scale. Panels were analyzed by 83 expert laboratories and 106 data sets were returned. Of these, 78 results were rated optimal, 3 acceptable, and 25 indicated a need for improvement. While the performance of the laboratories deployed to West Africa was superior to the overall performance, there was no significant difference between the different assays applied. PMID:28459810

  2. External quality assessment study for ebolavirus PCR-diagnostic promotes international preparedness during the 2014 - 2016 Ebola outbreak in West Africa.

    PubMed

    Ellerbrok, Heinz; Jacobsen, Sonja; Patel, Pranav; Rieger, Toni; Eickmann, Markus; Becker, Stephan; Günther, Stephan; Naidoo, Dhamari; Schrick, Livia; Keeren, Kathrin; Targosz, Angelina; Teichmann, Anette; Formenty, Pierre; Niedrig, Matthias

    2017-05-01

    During the recent Ebola outbreak in West Africa several international mobile laboratories were deployed to the mainly affected countries Guinea, Sierra Leone and Liberia to provide ebolavirus diagnostic capacity. Additionally, imported cases and small outbreaks in other countries required global preparedness for Ebola diagnostics. Detection of viral RNA by reverse transcription polymerase chain reaction has proven effective for diagnosis of ebolavirus disease and several assays are available. However, the reliability of these assays is largely unknown and requires serious evaluation. Therefore, a proficiency test panel of 11 samples was generated and distributed on a global scale. Panels were analyzed by 83 expert laboratories and 106 data sets were returned. Of these, 78 results were rated optimal, 3 acceptable, and 25 indicated a need for improvement. While the performance of the laboratories deployed to West Africa was superior to the overall performance, there was no significant difference between the different assays applied.

  3. Modeling and simulation of surfactant-polymer flooding using a new hybrid method

    NASA Astrophysics Data System (ADS)

    Daripa, Prabir; Dutta, Sourav

    2017-04-01

    Chemical enhanced oil recovery by surfactant-polymer (SP) flooding has been studied in two space dimensions. A new global pressure for incompressible, immiscible, multicomponent two-phase porous media flow has been derived in the context of SP flooding. This has been used to formulate a system of flow equations that incorporates the effect of capillary pressure and also the effect of polymer and surfactant on viscosity, interfacial tension and relative permeabilities of the two phases. The coupled system of equations for pressure, water saturation, polymer concentration and surfactant concentration has been solved using a new hybrid method in which the elliptic global pressure equation is solved using a discontinuous finite element method and the transport equations for water saturation and concentrations of the components are solved by a Modified Method Of Characteristics (MMOC) in the multicomponent setting. Numerical simulations have been performed to validate the method, both qualitatively and quantitatively, and to evaluate the relative performance of the various flooding schemes for several different heterogeneous reservoirs.

  4. The assessment of Global Precipitation Measurement estimates over the Indian subcontinent

    NASA Astrophysics Data System (ADS)

    Murali Krishna, U. V.; Das, Subrata Kumar; Deshpande, Sachin M.; Doiphode, S. L.; Pandithurai, G.

    2017-08-01

    Accurate and real-time precipitation estimation is a challenging task for current and future spaceborne measurements, which is essential to understand the global hydrological cycle. Recently, the Global Precipitation Measurement (GPM) satellites were launched as a next-generation rainfall mission for observing the global precipitation characteristics. The purpose of the GPM is to enhance the spatiotemporal resolution of global precipitation. The main objective of the present study is to assess the rainfall products from the GPM, especially the Integrated Multi-satellitE Retrievals for the GPM (IMERG) data, by comparing with ground-based observations. Multitemporal evaluations of rainfall at subdaily, diurnal, monthly, and seasonal scales were performed over the Indian subcontinent. The comparison shows that the IMERG performed better than the Tropical Rainfall Measuring Mission (TRMM)-3B42, although both rainfall products underestimated the observed rainfall compared to the ground-based measurements. The analyses also reveal that the TRMM-3B42 and IMERG data sets are able to represent the large-scale monsoon rainfall spatial features but have region-specific biases. The IMERG shows significant improvement in low rainfall estimates compared to the TRMM-3B42 for selected regions. In the spatial distribution, the IMERG shows higher rain rates compared to the TRMM-3B42, due to its enhanced spatial and temporal resolutions. Apart from this, the characteristics of raindrop size distribution (DSD) obtained from the GPM mission dual-frequency precipitation radar are assessed over a complex mountain terrain site in the Western Ghats, India, using the DSD measured by a Joss-Waldvogel disdrometer.

  5. Global surgery in a postconflict setting - 5-year results of implementation in the Russian North Caucasus

    PubMed Central

    Lunze, Fatima I.; Lunze, Karsten; Tsorieva, Zemfira M.; Esenov, Constantin T.; Reutov, Alexandr; Eichhorn, Thomas; Offergeld, Christian

    2015-01-01

    Background Collaborations for global surgery face many challenges to achieve fair and safe patient care and to build sustainable capacity. The 2004 terrorist attack on a school in Beslan in North Ossetia in the Russian North Caucasus left many victims with complex otologic barotrauma. In response, we implemented a global surgery partnership between the Vladikavkaz Children's Hospital, international surgical teams, the North Ossetian Health Ministry, and civil society organizations. This study's aim was to describe the implementation and 5-year results of capacity building for complex surgery in a postconflict, mid-income setting. Design We conducted an observational study at the Children's Hospital in Vladikavkaz in the autonomous Republic of North Ossetia-Alania, part of the Russian Federation. We assessed the outcomes of 15 initial patients who received otologic surgeries for complex barotrauma resulting from the Beslan terrorism attack and for other indications, and report the incidence of intra- and postoperative complications. Results Patients were treated for trauma related to terrorism (53%) and for indications not related to violence (47%). None of the patients developed peri- or postoperative complications. Three patients (two victims of terrorism) who underwent repair of tympanic perforations presented with re-perforations. Four junior and senior surgeons were trained on-site and in Germany to perform and teach similar procedures autonomously. Conclusions In mid-income, postconflict settings, complex surgery can be safely implemented and achieve patient outcomes comparable to global standards. Capacity building can build on existing resources, such as operation room management, nursing, and anesthesia services. In postconflict environments, substantial surgical burden is not directly attributable to conflict-related injury and disease, but to health systems weakened by conflicts. 
Extending training and safe surgical care to include specialized interventions such as microsurgery is an integral component of strengthening local capacity and ownership. Our experience identified strategies for fair patient selection and might provide a model for potentially sustainable surgical system building in postconflict environments. PMID:26498745

  6. Two global data sets of daily fire emission injection heights since 2003

    NASA Astrophysics Data System (ADS)

    Rémy, Samuel; Veira, Andreas; Paugam, Ronan; Sofiev, Mikhail; Kaiser, Johannes W.; Marenco, Franco; Burton, Sharon P.; Benedetti, Angela; Engelen, Richard J.; Ferrare, Richard; Hair, Jonathan W.

    2017-02-01

    The Global Fire Assimilation System (GFAS) assimilates fire radiative power (FRP) observations from satellite-based sensors to produce daily estimates of biomass burning emissions. It has been extended to include information about injection heights derived from fire observations and meteorological information from the operational weather forecasts of ECMWF. Injection heights are provided by two distinct methods: the Integrated Monitoring and Modelling System for wildland fires (IS4FIRES) parameterisation and the one-dimensional plume rise model (PRM). A global database of daily biomass burning emissions and injection heights at 0.1° resolution has been produced for 2003-2015 and is continuously extended in near-real time with the operational GFAS service of the Copernicus Atmospheric Monitoring Service (CAMS). In this study, the two injection height data sets were compared with the new MPHP2 (MISR Plume Height Project 2) satellite-based plume height retrievals. The IS4FIRES parameterisation showed better overall agreement with the observations, while the PRM was better at capturing the variability of injection heights. The performance of both parameterisations also depends on the type of vegetation. Furthermore, the use of biomass burning emission heights from GFAS in atmospheric composition forecasts was assessed in two case studies: the South AMerican Biomass Burning Analysis (SAMBBA) campaign, which took place in September 2012 in Brazil, and a series of large fire events in the western USA in August 2013. For these case studies, forecasts of biomass burning aerosol species by the Composition Integrated Forecasting System (C-IFS) of CAMS were found to better reproduce the observed vertical distribution when using PRM injection heights from GFAS compared to aerosol emissions prescribed at the surface. 
The globally available GFAS injection heights introduced and evaluated in this study provide a comprehensive data set for future fire and atmospheric composition modelling studies.

  7. Educating Globally Competent Citizens: A Toolkit. Second Edition

    ERIC Educational Resources Information Center

    Elliott-Gower, Steven; Falk, Dennis R.; Shapiro, Martin

    2012-01-01

    Educating Globally Competent Citizens, a product of AASCU's American Democracy Project and its Global Engagement Initiative, introduces readers to a set of global challenges facing society based on the Center for Strategic and International Studies' 7 Revolutions. The toolkit is designed to aid faculty in incorporating global challenges into new…

  8. Addressing the Global Burden of Breast Cancer

    Cancer.gov

    The US National Cancer Institute’s Center for Global Health (CGH) has been a key partner in a multi-institutional expert team that has developed a set of publications to address foundational concerns in breast cancer care across the cancer care continuum and within limited resource settings.

  9. Algorithms for optimization of branching gravity-driven water networks

    NASA Astrophysics Data System (ADS)

    Dardani, Ian; Jones, Gerard F.

    2018-05-01

    The design of a water network involves the selection of pipe diameters that satisfy pressure and flow requirements while considering cost. A variety of design approaches can be used to optimize for hydraulic performance or reduce costs. To help designers select an appropriate approach in the context of gravity-driven water networks (GDWNs), this work assesses three cost-minimization algorithms on six moderate-scale GDWN test cases. Two algorithms, a backtracking algorithm and a genetic algorithm, use a set of discrete pipe diameters, while a new calculus-based algorithm produces a continuous-diameter solution which is mapped onto a discrete-diameter set. The backtracking algorithm finds the global optimum for all but the largest of cases tested, for which its long runtime makes it an infeasible option. The calculus-based algorithm's discrete-diameter solution produced slightly higher-cost results but was more scalable to larger network cases. Furthermore, the new calculus-based algorithm's continuous-diameter and mapped solutions provided lower and upper bounds, respectively, on the discrete-diameter global optimum cost, where the mapped solutions were typically within one diameter size of the global optimum. The genetic algorithm produced solutions even closer to the global optimum with consistently short run times, although slightly higher solution costs were seen for the larger network cases tested. The results of this study highlight the advantages and weaknesses of each GDWN design method including closeness to the global optimum, the ability to prune the solution space of infeasible and suboptimal candidates without missing the global optimum, and algorithm run time. We also extend an existing closed-form model of Jones (2011) to include minor losses and a more comprehensive two-part cost model, which realistically applies to pipe sizes that span a broad range typical of GDWNs of interest in this work, and for smooth and commercial steel roughness values.
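    The backtracking strategy described above (exhaustive search over a discrete diameter set, pruning branches that are infeasible or already costlier than the incumbent) can be sketched as follows. The diameter catalogue, head-loss model and all numbers here are invented for illustration and are not those of the study.

```python
# Hypothetical discrete diameter catalogue: (diameter in m, cost per m)
CATALOGUE = [(0.05, 2.0), (0.075, 3.5), (0.10, 5.0), (0.15, 9.0)]

def head_loss(d, length, q):
    # Lumped d**-5 friction-loss model (illustrative, not the paper's model)
    return 0.002 * length * q ** 2 / d ** 5

def backtrack(lengths, q, h_budget):
    """Assign a diameter to every pipe of a series line, pruning any branch
    that exceeds the head-loss budget or cannot beat the best cost so far."""
    best = {"cost": float("inf"), "sizes": None}

    def recurse(i, cost, loss, sizes):
        if cost >= best["cost"] or loss > h_budget:
            return  # prune: infeasible, or cannot improve on the incumbent
        if i == len(lengths):
            best["cost"], best["sizes"] = cost, sizes[:]
            return
        for d, c in CATALOGUE:
            sizes.append(d)
            recurse(i + 1, cost + c * lengths[i],
                    loss + head_loss(d, lengths[i], q), sizes)
            sizes.pop()

    recurse(0, 0.0, 0.0, [])
    return best["sizes"], best["cost"]

# Two pipes in series, fixed flow, total head-loss budget of 30 m
sizes, cost = backtrack(lengths=[100.0, 50.0], q=0.01, h_budget=30.0)
```

    The pruning is what distinguishes this from brute force: a partial assignment is abandoned as soon as it is provably dominated, without missing the global optimum.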

  10. A computational investigation on the structure, global parameters and antioxidant capacity of a polyphenol, Gallic acid.

    PubMed

    Rajan, Vijisha K; Muraleedharan, K

    2017-04-01

    A computational DFT-B3LYP structural analysis of the polyphenol gallic acid (GA) has been performed using the 6-311++G(df,p) basis set. GA is a relatively stable molecule with considerable radical scavenging capacity and is a well-known antioxidant. The NBO analysis shows that the aromatic system is delocalized. The results reveal that the most stable radical is formed at the O3 atom upon scavenging of free radicals. Global descriptive parameters show that GA acts as an acceptor center in charge-transfer complex formation, which is supported by ESP and contour diagrams and by the Qmax value. GA is a good antioxidant, and its behaviour can be better understood through the HAT and TMC mechanisms, as it has low BDE, ΔH acidity and ΔG acidity values. The ΔBDE and ΔAIP values also confirm that the antioxidant capacity of GA is better explained by HAT than by the SET-PT mechanism. Copyright © 2016 Elsevier Ltd. All rights reserved.
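    The global descriptive parameters mentioned in such conceptual-DFT studies are conventionally obtained from frontier orbital energies via Koopmans-type relations. A minimal sketch, using illustrative orbital energies that are not the paper's values:

```python
def global_descriptors(e_homo, e_lumo):
    """Koopmans-style conceptual-DFT descriptors from frontier orbital
    energies (in eV): chemical potential, hardness, electrophilicity."""
    mu = (e_homo + e_lumo) / 2.0    # chemical potential (= -electronegativity)
    eta = (e_lumo - e_homo) / 2.0   # chemical hardness
    omega = mu ** 2 / (2.0 * eta)   # electrophilicity index
    return mu, eta, omega

# Illustrative HOMO/LUMO energies, not values from the paper
mu, eta, omega = global_descriptors(e_homo=-6.3, e_lumo=-1.2)
```

    A large electrophilicity index is one indicator of acceptor character in charge-transfer complex formation, as invoked in the abstract.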

  11. Intrasite motions and monument instabilities at Medicina ITRF co-location site

    NASA Astrophysics Data System (ADS)

    Sarti, Pierguido; Abbondanza, Claudio; Legrand, Juliette; Bruyninx, Carine; Vittuari, Luca; Ray, Jim

    2013-03-01

    We process the total-station surveys performed at the ITRF co-location site Medicina (Northern Italy) over the decade (2001-2010) with the purpose of determining the extent of local intrasite motions and relating them to local geophysical processes, the geological setting and the design of the ground pillars. In addition, continuous observations acquired by two co-located GPS stations (MEDI and MSEL separated by ≈27 m) are analysed and their relative motion is cross-checked with the total-station results. The local ground control network extends over a small area (<100 × 100 m) but the results demonstrate significant anisotropic deformations with rates up to 1.6 mm a⁻¹, primarily horizontal, a value comparable to intraplate tectonic deformations. The results derived from GPS and total-station observations are consistent and point to the presence of horizontal intrasite motions over very short distances possibly associated with varying environmental conditions in a very unfavourable local geological setting and unsuitable monument design, these latter being crucial aspects of the realization and maintenance of global permanent geodetic networks and the global terrestrial reference frame.

  12. Effective seeding strategy in evolutionary prisoner's dilemma games on online social networks

    NASA Astrophysics Data System (ADS)

    Xu, Bo; Shi, Huibin; Wang, Jianwei; Huang, Yun

    2015-04-01

    This paper explores effective seeding strategies in prisoner's dilemma game (PDG) on online social networks, i.e. the optimal strategy to obtain global cooperation with minimum cost. Three distinct seeding strategies are compared by performing computer simulations on real online social network datasets. Our finding suggests that degree centrality seeding outperforms other strategies regardless of the initial payoff setting or network size. Celebrities of online social networks play key roles in preserving cooperation.
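    Degree-centrality seeding of the kind compared in such simulations can be sketched as below. The weak prisoner's dilemma payoffs and the imitate-if-better update rule are common conventions assumed here for illustration, not details taken from the paper.

```python
import random

random.seed(1)  # reproducible demo

def seed_by_degree(graph, k):
    """Start cooperation at the k best-connected nodes (the 'celebrities'),
    defection everywhere else."""
    ranked = sorted(graph, key=lambda v: len(graph[v]), reverse=True)
    seeds = set(ranked[:k])
    return {v: ("C" if v in seeds else "D") for v in graph}

def payoffs(graph, strategy, b):
    """Weak prisoner's dilemma payoffs: R=1, T=b>1, S=P=0."""
    pay = {}
    for v in graph:
        coop_neigh = sum(1 for u in graph[v] if strategy[u] == "C")
        pay[v] = coop_neigh * (1.0 if strategy[v] == "C" else b)
    return pay

def imitation_step(graph, strategy, b):
    """Each player copies a random neighbour's strategy if it earned more."""
    pay = payoffs(graph, strategy, b)
    new = dict(strategy)
    for v in graph:
        u = random.choice(graph[v])
        if pay[u] > pay[v]:
            new[v] = strategy[u]
    return new

# Toy network: node 0 is the best-connected 'celebrity'
g = {0: [1, 2, 3, 4], 1: [0, 2], 2: [0, 1], 3: [0], 4: [0]}
s = seed_by_degree(g, k=1)      # only node 0 starts as a cooperator
s = imitation_step(g, s, b=1.3)
```

    Comparing strategies then amounts to swapping the seed-selection function (degree-based, random, etc.) while holding the game dynamics fixed.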

  13. Neurodynamical model of collective brain

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    1992-01-01

    A dynamical system which mimics collective purposeful activities of a set of units of intelligence is introduced and discussed. A global control of the unit activities is replaced by the probabilistic correlations between them. These correlations are learned during a long term period of performing collective tasks, and are stored in the synaptic interconnections. The model is represented by a system of ordinary differential equations with terminal attractors and repellers, and does not contain any man-made digital devices.

  14. “Violence. Enough already”: findings from a global participatory survey among women living with HIV

    PubMed Central

    Orza, Luisa; Bewley, Susan; Chung, Cecilia; Crone, E Tyler; Nagadya, Hajjarah; Vazquez, Marijo; Welbourn, Alice

    2015-01-01

    Introduction Women living with HIV are vulnerable to gender-based violence (GBV) before and after diagnosis, in multiple settings. This study's aim was to explore how GBV is experienced by women living with HIV, how this affects women's sexual and reproductive health (SRH) and human rights (HR), and the implications for policymakers. Methods A community-based, participatory, user-led, mixed-methods study was conducted, with women living with HIV from key affected populations. Simple descriptive frequencies were used for quantitative data. Thematic coding of open qualitative responses was performed and validated with key respondents. Results In total, 945 women living with HIV from 94 countries participated in the study. Eighty-nine percent of 480 respondents to an optional section on GBV reported having experienced or feared violence, either before, since and/or because of their HIV diagnosis. GBV reporting was higher after HIV diagnosis (intimate partner, family/neighbours, community and health settings). Women described a complex and iterative relationship between GBV and HIV occurring throughout their lives, including breaches of confidentiality and lack of SRH choice in healthcare settings, forced/coerced treatments, HR abuses, moralistic and judgemental attitudes (including towards women from key populations), and fear of losing child custody. Respondents recommended healthcare practitioners and policymakers address stigma and discrimination, training, awareness-raising, and HR abuses in healthcare settings. Conclusions Respondents reported increased GBV with partners and in families, communities and healthcare settings after their HIV diagnosis and across the life-cycle. Measures of GBV must be sought and monitored, particularly within healthcare settings that should be safe. Respondents offered policymakers a comprehensive range of recommendations to achieve their SRH and HR goals. 
Global guidance documents and policies are more likely to succeed for the end-users if lived experiences are used. PMID:26643458

  15. "Violence. Enough already": findings from a global participatory survey among women living with HIV.

    PubMed

    Orza, Luisa; Bewley, Susan; Chung, Cecilia; Crone, E Tyler; Nagadya, Hajjarah; Vazquez, Marijo; Welbourn, Alice

    2015-01-01

    Women living with HIV are vulnerable to gender-based violence (GBV) before and after diagnosis, in multiple settings. This study's aim was to explore how GBV is experienced by women living with HIV, how this affects women's sexual and reproductive health (SRH) and human rights (HR), and the implications for policymakers. A community-based, participatory, user-led, mixed-methods study was conducted, with women living with HIV from key affected populations. Simple descriptive frequencies were used for quantitative data. Thematic coding of open qualitative responses was performed and validated with key respondents. In total, 945 women living with HIV from 94 countries participated in the study. Eighty-nine percent of 480 respondents to an optional section on GBV reported having experienced or feared violence, either before, since and/or because of their HIV diagnosis. GBV reporting was higher after HIV diagnosis (intimate partner, family/neighbours, community and health settings). Women described a complex and iterative relationship between GBV and HIV occurring throughout their lives, including breaches of confidentiality and lack of SRH choice in healthcare settings, forced/coerced treatments, HR abuses, moralistic and judgemental attitudes (including towards women from key populations), and fear of losing child custody. Respondents recommended healthcare practitioners and policymakers address stigma and discrimination, training, awareness-raising, and HR abuses in healthcare settings. Respondents reported increased GBV with partners and in families, communities and healthcare settings after their HIV diagnosis and across the life-cycle. Measures of GBV must be sought and monitored, particularly within healthcare settings that should be safe. Respondents offered policymakers a comprehensive range of recommendations to achieve their SRH and HR goals. Global guidance documents and policies are more likely to succeed for the end-users if lived experiences are used.

  16. Global Gravity Field Determination by Combination of terrestrial and Satellite Gravity Data

    NASA Astrophysics Data System (ADS)

    Fecher, T.; Pail, R.; Gruber, T.

    2011-12-01

    A multitude of impressive results document the success of the satellite gravity field mission GOCE, with a wide field of applications in geodesy, geophysics and oceanography. The high performance of GOCE gravity field models can be further improved by combination with GRACE data, which contributes the long-wavelength signal content of the gravity field with very high accuracy. Examples of such a consistent combination of satellite gravity data are the satellite-only models GOCO01S and GOCO02S. However, only the further combination with terrestrial and altimetric gravity data makes it possible to expand gravity field models up to very high spherical harmonic degrees and thus to achieve a spatial resolution down to 20-30 km. First numerical studies for high-resolution global gravity field models combining GOCE, GRACE and terrestrial/altimetric data on the basis of the DTU10 model have already been presented. Computations up to degree/order 600 based on full normal equation systems, to preserve the full variance-covariance information resulting mainly from different weights of individual terrestrial/altimetric data sets, have been successfully performed. We could show that such large normal equation systems (degree/order 600 corresponds to a memory demand of almost 1 TByte), which represent an immense computational challenge as computation time and memory requirements place high demands on computational resources, can be handled. The DTU10 model includes gravity anomalies computed from the global model EGM08 in continental areas. Therefore, the main focus of this presentation lies on the computation of high-resolution combined gravity field models based on real terrestrial gravity anomaly data sets. This is a challenge due to the inconsistency of these data sets, which also include systematic error components, but it is a further step towards a truly independent gravity field model. 
This contribution will present our recent developments and progress by using independent data sets at certain land areas, which are combined with DTU10 in the ocean areas, as well as satellite gravity data. Investigations have been made concerning the preparation and optimum weighting of the different data sources. The results, which should be a major step towards a GOCO-C model, will be validated using external gravity field data and by applying different validation methods.

  17. Variability and Extremes of Precipitation in the Global Climate as Determined by the 25-Year GEWEX/GPCP Data Set

    NASA Technical Reports Server (NTRS)

    Adler, R. F.; Gu, G.; Curtis, S.; Huffman, G. J.; Bolvin, D. T.; Nelkin, E. J.

    2005-01-01

    The Global Precipitation Climatology Project (GPCP) 25-year precipitation data set is used to evaluate the variability and extremes of precipitation on global and regional scales. The year-to-year variability of precipitation is evaluated in relation to the overall lack of a significant global trend and to climate events such as ENSO and volcanic eruptions. The validity of conclusions and limitations of the data set are checked by comparison with independent data sets (e.g., TRMM). The GPCP data set necessarily has a heterogeneous time series of input data sources, so part of the assessment described above is to test the initial results for potential influence by major data boundaries in the record. Regional trends, or inter-decadal changes, are also analyzed to determine validity and correlation with other long-term data sets related to the hydrological cycle (e.g., clouds and ocean surface fluxes). Statistics of extremes (both wet and dry) are analyzed at the monthly time scale for the 25 years. A preliminary result indicating an increasing frequency of extreme monthly values will be a particular focus of the validity assessment. Daily values for an eight-year period are also examined for variation in extremes and compared to the longer monthly-based study.

  18. Current Issues and Challenges in Global Analysis of Parton Distributions

    NASA Astrophysics Data System (ADS)

    Tung, Wu-Ki

    2007-01-01

    A new implementation of precise perturbative QCD calculation of deep inelastic scattering structure functions and cross sections, incorporating heavy quark mass effects, is applied to the global analysis of the full HERA I data sets on NC and CC cross sections, in conjunction with other experiments. Improved agreement between the NLO QCD theory and the global data sets is obtained. A comparison of the new results with those of the previous analysis, based on the conventional zero-mass parton formalism, is made. Exploratory work on the implications of new fixed-target neutrino scattering and Drell-Yan data for the global analysis is also discussed.

  19. Sensitive study of the climatological SST by using ATSR global SST data sets

    NASA Astrophysics Data System (ADS)

    Xue, Yong; Lawrence, Sean P.; Llewellyn-Jones, David T.

    1995-12-01

    Climatological sea surface temperature (SST) is an initial step for global climate process monitoring. A comparison has been made by using Oberhuber's SST data set and two years of monthly averaged SST from ATSR thermal band data to force the OGCM. In the eastern Pacific Ocean, the two data sets make only a small difference to the model SST. In the western Pacific Ocean, the use of Oberhuber's data set gives higher climatological SSTs than the ATSR data. SSTs were also simulated for 1992 using climatological SSTs from two years of monthly averaged ATSR data and from Oberhuber's data. Forcing with SST from the ATSR data was found to give a better SST simulation than forcing with Oberhuber's data. Our study confirms that ATSR can provide accurate monthly averaged global SSTs for global climate process monitoring.

  20. Online Low-Rank Representation Learning for Joint Multi-subspace Recovery and Clustering.

    PubMed

    Li, Bo; Liu, Risheng; Cao, Junjie; Zhang, Jie; Lai, Yu-Kun; Liu, Xiuping

    2017-10-06

    Benefiting from global rank constraints, the low-rank representation (LRR) method has been shown to be an effective solution to subspace learning. However, the global mechanism also means that the LRR model is not suitable for handling large-scale data or dynamic data. For large-scale data, the LRR method suffers from high time complexity, and for dynamic data, it has to recompute a complex rank minimization for the entire data set whenever new samples are dynamically added, making it prohibitively expensive. Existing attempts at online LRR either take a stochastic approach or build the representation purely based on a small sample set and treat new input as out-of-sample data. The former often requires multiple runs for good performance and thus takes longer to run, and the latter formulates online LRR as an out-of-sample classification problem and is less robust to noise. In this paper, a novel online low-rank representation subspace learning method is proposed for both large-scale and dynamic data. The proposed algorithm is composed of two stages: static learning and dynamic updating. In the first stage, the subspace structure is learned from a small number of data samples. In the second stage, the intrinsic principal components of the entire data set are computed incrementally by utilizing the learned subspace structure, and the low-rank representation matrix can also be incrementally solved by an efficient online singular value decomposition (SVD) algorithm. The time complexity is reduced dramatically for large-scale data, and repeated computation is avoided for dynamic problems. We further perform theoretical analysis comparing the proposed online algorithm with the batch LRR method. Finally, experimental results on typical tasks of subspace recovery and subspace clustering show that the proposed algorithm performs comparably or better than batch methods including the batch LRR, and significantly outperforms state-of-the-art online methods.

  1. Learning in Global Settings: Developing Transitions for Meaning-Making

    ERIC Educational Resources Information Center

    Norden, Birgitta; Avery, Helen; Anderberg, Elsie

    2012-01-01

    Global teaching and learning for sustainable development reaches from the classroom to the world outside, and is therefore a particularly interesting setting for practising transition skills. The article suggests a number of features perceived as crucial in developing young people's capability to act in a changing world and under circumstances…

  2. Oceanic Fluxes of Mass, Heat and Freshwater: A Global Estimate and Perspective

    NASA Technical Reports Server (NTRS)

    MacDonald, Alison Marguerite

    1995-01-01

    Data from fifteen globally distributed, modern, high-resolution hydrographic ocean transects are combined in an inverse calculation using large-scale box models. The models provide estimates of the global meridional heat and freshwater budgets and are used to examine the sensitivity of the global circulation, both inter- and intra-basin exchange rates, to a variety of external constraints provided by estimates of Ekman, boundary current, and throughflow transports. A solution is found which is consistent with both the model physics and the global data set, despite a twenty-five-year time span and a lack of seasonal consistency among the data. The overall pattern of the global circulation suggested by the models is similar to that proposed in previously published local studies and regional reviews. However, significant qualitative and quantitative differences exist. These differences are due both to the model definition and to the global nature of the data set.

  3. 10 rules for managing global innovation.

    PubMed

    Wilson, Keeley; Doz, Yves L

    2012-10-01

    More and more companies recognize that their dispersed, global operations are a treasure trove of ideas and capabilities for innovation. But it's proving harder than expected to unearth those ideas or exploit those capabilities. Part of the problem is that companies manage global innovation the same way they manage traditional, single-location projects. Single-location projects draw on a large reservoir of tacit knowledge, shared context, and trust that global projects lack. The management challenge, therefore, is to replicate the positive aspects of colocation while harnessing the opportunities of dispersion. In this article, INSEAD's Wilson and Doz draw on research into global strategy and innovation to present a set of guidelines for setting up and managing global innovation. They explore in detail the challenges that make global projects inherently different and show how these can be overcome by applying superior project management skills across teams, fostering a strong collaborative culture, and using a robust array of communication tools.

  4. Risk factors and global cognitive status related to brain arteriolosclerosis in elderly individuals

    PubMed Central

    Ighodaro, Eseosa T; Abner, Erin L; Fardo, David W; Lin, Ai-Ling; Katsumata, Yuriko; Schmitt, Frederick A; Kryscio, Richard J; Jicha, Gregory A; Neltner, Janna H; Monsell, Sarah E; Kukull, Walter A; Moser, Debra K; Appiah, Frank; Bachstetter, Adam D; Van Eldik, Linda J

    2016-01-01

    Risk factors and cognitive sequelae of brain arteriolosclerosis pathology are not fully understood. To address this, we used multimodal data from the National Alzheimer's Coordinating Center and Alzheimer's Disease Neuroimaging Initiative data sets. Previous studies showed evidence of distinct neurodegenerative disease outcomes and clinical-pathological correlations in the "oldest-old" compared to younger cohorts. Therefore, using the National Alzheimer's Coordinating Center data set, we analyzed clinical and neuropathological data from two groups according to age at death: <80 years (n = 1008) and ≥80 years (n = 1382). In both age groups, severe brain arteriolosclerosis was associated with worse performance on global cognition tests. Hypertension (but not diabetes) was a brain arteriolosclerosis risk factor in the younger group. In the ≥80 years age-at-death group, an ABCC9 gene variant (rs704180), previously associated with aging-related hippocampal sclerosis, was also associated with brain arteriolosclerosis. A post-hoc arterial spin labeling neuroimaging experiment indicated that ABCC9 genotype is associated with cerebral blood flow impairment; in a convenience sample from the Alzheimer's Disease Neuroimaging Initiative (n = 15, homozygous individuals), non-risk genotype carriers showed higher global cerebral blood flow than risk genotype carriers. We conclude that brain arteriolosclerosis is associated with altered cognitive status and a novel vascular genetic risk factor. PMID:26738751

  5. Mapping the global health employment market: an analysis of global health jobs.

    PubMed

    Keralis, Jessica M; Riggin-Pathak, Brianne L; Majeski, Theresa; Pathak, Bogdan A; Foggia, Janine; Cullinen, Kathleen M; Rajagopal, Abbhirami; West, Heidi S

    2018-02-27

    The number of university global health training programs has grown in recent years. However, there is little research on the needs of the global health profession. We therefore set out to characterize the global health employment market by analyzing global health job vacancies. We collected data from advertised, paid positions posted to web-based job boards, email listservs, and global health organization websites from November 2015 to May 2016. Data on requirements for education, language proficiency, technical expertise, physical location, and experience level were analyzed for all vacancies. Descriptive statistics were calculated for the aforementioned job characteristics. Associations between technical specialty area and requirements for non-English language proficiency and overseas experience were calculated using Chi-square statistics. A qualitative thematic analysis was performed on a subset of vacancies. We analyzed the data from 1007 global health job vacancies from 127 employers. Among private and non-profit sector vacancies, 40% (n = 354) were for technical or subject matter experts, 20% (n = 177) for program directors, and 16% (n = 139) for managers, compared to 9.8% (n = 87) for entry-level and 13.6% (n = 120) for mid-level positions. The most common technical focus area was program or project management, followed by HIV/AIDS and quantitative analysis. Thematic analysis demonstrated a common emphasis on program operations, relations, design and planning, communication, and management. Our analysis shows a demand for candidates with several years of experience with global health programs, particularly program managers/directors and technical experts, with very few entry-level positions accessible to recent graduates of global health training programs. It is unlikely that global health training programs equip graduates to be competitive for the majority of positions that are currently available in this field.

  6. Global How?--Linking Practice to Theory: A Competency Model for Training Global Learning Facilitators

    ERIC Educational Resources Information Center

    Büker, Gundula; Schell-Straub, Sigrid

    2017-01-01

    Global learning facilitators from civil society organizations (CSOs) design and enrich educational processes in formal and non-formal educational settings. They need to be empowered through adequate training opportunities in global learning (GL) contexts. The project Facilitating Global Learning--Key Competences from Members of European CSOs (FGL)…

  7. Automatic control of cryogenic wind tunnels

    NASA Technical Reports Server (NTRS)

    Balakrishna, S.

    1989-01-01

    Inadequate Reynolds number similarity in testing of scaled models affects the quality of aerodynamic data from wind tunnels. This is due to scale effects of boundary-layer shock wave interaction, which is likely to be severe at transonic speeds. The idea of operating wind tunnels with test gas cooled to cryogenic temperatures has yielded a quantum jump in the ability to realize full-scale Reynolds number flow similarity in small transonic tunnels. In such tunnels, the basic flow control problem consists of obtaining and maintaining the desired test section flow parameters. Mach number, Reynolds number, and dynamic pressure are the three flow parameters that are usually required to be kept constant during the period of model aerodynamic data acquisition. The sequence of activities involved in modeling, control law development, mechanization of the control laws on a microcomputer, and the performance of a globally stable automatic control system for the 0.3-m Transonic Cryogenic Tunnel (TCT) are discussed. A lumped multi-variable nonlinear dynamic model of the cryogenic tunnel, generation of a set of linear control laws for small perturbations, and a nonlinear control strategy for large set point changes including tunnel trajectory control are described. The mechanization of the control laws on a 16-bit microcomputer system, the software features, operator interface, displays, and safety provisions are discussed. The controller is shown to provide globally stable and reliable control of temperature to ±0.2 K, pressure to ±0.07 psi, and Mach number to ±0.002 of the set point value. This performance is obtained both during large set point commands, as for a tunnel cooldown, and during aerodynamic data acquisition with intrusive activity such as geometrical changes in the test section, including angle of attack changes, drag rake movements, wall adaptation, and sidewall boundary-layer removal.
Feasibility of the use of an automatic Reynolds number control mode with fixed Mach number control is demonstrated.
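The linearized small-perturbation control laws described above can be illustrated at their simplest with a discrete proportional-integral (PI) regulator holding a single set point. This is a minimal sketch only; the gains and the toy plant in the usage below are illustrative, not the actual multivariable 0.3-m TCT design.

```python
def pi_step(setpoint, measured, integral, kp, ki, dt):
    """One update of a discrete PI regulator: proportional action for fast
    response, integral action to remove steady-state error at the set point."""
    err = setpoint - measured
    integral += err * dt           # accumulate the error (integral action)
    u = kp * err + ki * integral   # control command
    return u, integral
```

Driving a first-order plant with this law, the integral term ensures the output settles exactly on the commanded set point, which is the behavior the tunnel controller must provide for Mach number, pressure, and temperature simultaneously.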

  8. Implementation and performance of shutterless uncooled micro-bolometer cameras

    NASA Astrophysics Data System (ADS)

    Das, J.; de Gaspari, D.; Cornet, P.; Deroo, P.; Vermeiren, J.; Merken, P.

    2015-06-01

    A shutterless algorithm is implemented in the Xenics LWIR thermal cameras and modules. Based on a calibration set and a global temperature coefficient, the optimal non-uniformity correction is calculated on board the camera. The limited resources in the camera require a compact algorithm, hence the efficiency of the coding is important. The performance of the shutterless algorithm is studied by comparing the residual non-uniformity (RNU) and signal-to-noise ratio (SNR) between the shutterless and shuttered correction algorithms. From this comparison we conclude that the shutterless correction is only slightly less performant than the standard shuttered algorithm, making it very interesting for thermal infrared applications where low weight, small size, and continuous operation are important.
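A plausible shape for such a correction is sketched below: the per-pixel offset map from a factory calibration is shifted by a single global temperature coefficient times the change in focal-plane temperature, then the usual per-pixel gain correction is applied. The model and all names here are an illustrative guess at the scheme outlined in the abstract, not Xenics' actual onboard algorithm.

```python
import numpy as np

def shutterless_nuc(raw, gain, offset_cal, temp_coeff, t_fpa, t_cal):
    """Shutterless non-uniformity correction sketch: shift the calibrated
    per-pixel offsets by a global temperature coefficient times the FPA
    temperature change, then apply the standard per-pixel gain correction."""
    offset = offset_cal + temp_coeff * (t_fpa - t_cal)  # temperature-shifted offsets
    return (raw - offset) * gain                        # two-point correction
```

The appeal of a single global coefficient is that it replaces the shutter's flat-field reference with one scalar that is cheap to store and apply on the camera's limited hardware.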

  9. Investing for Impact: The Global Fund Approach to Measurement of AIDS Response.

    PubMed

    Jain, Suman; Zorzi, Nathalie

    2017-07-01

    The Global Fund raises and invests nearly US$4 billion a year to support programs run in more than 140 countries. The Global Fund strategy 2012-2016 is focused on "Investing for Impact". To accomplish this, timely and accurate data are needed to inform strategies and prioritize activities to achieve greater coverage with quality services. Monitoring and evaluation (M&E) is intrinsic to the Global Fund's system of performance-based funding. The Global Fund invests in strengthening measurement and reporting of results at all stages of the grant cycle. The Global Fund approach to measurement is based on three key principles: (1) Simplified reporting: the Global Fund has updated its measurement guidance to focus on impact, coverage, and quality with the use of a harmonized set of indicators. (2) Supporting data systems: based on a common framework developed and supported by partners, it promotes investment in five common data systems: routine reporting, including HMIS; population-based and risk-group surveys; analysis, reviews, and transparency; administrative and financial data sources; and vital registration systems. (3) Strengthening data use: Global Fund funding encourages use of data at all levels: national, subnational, and site. Countries do not automatically prioritize M&E, but when guidance, tools, and investments are available, there is high-level utilization of M&E systems in program design, planning, implementation, and results reporting. An in-depth analysis of the available data helps the Global Fund and countries direct investments towards interventions where impact can be achieved and focus on the target population groups and geographic areas that are most affected.

  10. A globally calibrated scheme for generating daily meteorology from monthly statistics: Global-WGEN (GWGEN) v1.0

    NASA Astrophysics Data System (ADS)

    Sommer, Philipp S.; Kaplan, Jed O.

    2017-10-01

    While a wide range of Earth system processes occur at daily and even subdaily timescales, many global vegetation and other terrestrial dynamics models historically used monthly meteorological forcing both to reduce computational demand and because global datasets were lacking. Recently, dynamic land surface modeling has moved towards resolving daily and subdaily processes, and global datasets containing daily and subdaily meteorology have become available. These meteorological datasets, however, cover only the instrumental era of the last approximately 120 years at best, are subject to considerable uncertainty, and represent extremely large data files with associated computational costs of data input/output and file transfer. For periods before the recent past or in the future, global meteorological forcing can be provided by climate model output, but the quality of these data at high temporal resolution is low, particularly for daily precipitation frequency and amount. Here, we present GWGEN, a globally applicable statistical weather generator for the temporal downscaling of monthly climatology to daily meteorology. Our weather generator is parameterized using a global meteorological database and simulates daily values of five common variables: minimum and maximum temperature, precipitation, cloud cover, and wind speed. GWGEN is lightweight, modular, and requires a minimal set of monthly mean variables as input. The weather generator may be used in a range of applications, for example, in global vegetation, crop, soil erosion, or hydrological models. While GWGEN does not currently perform spatially autocorrelated multi-point downscaling of daily weather, this additional functionality could be implemented in future versions.
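The core idea of temporal downscaling can be sketched with a toy single-variable generator: a first-order Markov chain decides wet versus dry days, daily amounts are drawn for wet days, and the result is rescaled to match the prescribed monthly total. This is only a minimal illustration; GWGEN's actual parameterization is far richer (gamma/GP amount distributions, five cross-correlated variables, parameters fitted to a global station database), and the transition probabilities below are made up.

```python
import numpy as np

def daily_precip(monthly_total, wet_frac, ndays=30, seed=0):
    """Toy downscaling of a monthly precipitation total to daily values:
    two-state Markov occurrence chain plus exponential daily amounts,
    rescaled so the days sum to the prescribed monthly total."""
    rng = np.random.default_rng(seed)
    # First-order persistence: wet days tend to follow wet days (toy values)
    p_wet_given_wet = min(0.9, wet_frac + 0.3)
    p_wet_given_dry = max(0.05, wet_frac - 0.1)
    wet = np.zeros(ndays, dtype=bool)
    state = rng.random() < wet_frac
    for d in range(ndays):
        wet[d] = state
        p = p_wet_given_wet if state else p_wet_given_dry
        state = rng.random() < p
    amounts = np.where(wet, rng.exponential(1.0, ndays), 0.0)
    if amounts.sum() > 0:
        amounts *= monthly_total / amounts.sum()  # conserve the monthly total
    return amounts
```

A generator of this kind needs only monthly means as input, which is why it can serve paleo and future periods where daily forcing is unavailable or unreliable.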

  11. Parameter Set Cloning Based on Catchment Similarity for Large-scale Hydrologic Modeling

    NASA Astrophysics Data System (ADS)

    Liu, Z.; Kaheil, Y.; McCollum, J.

    2016-12-01

    Parameter calibration is a crucial step in ensuring the accuracy of hydrological models. However, streamflow gauges are not available everywhere for calibrating a large-scale hydrologic model globally. Thus, assigning parameters appropriately in regions where calibration cannot be performed directly has been a challenge for large-scale hydrologic modeling. Here we propose a method to estimate model parameters in ungauged regions based on the values obtained through calibration in areas where gauge observations are available. This parameter set cloning is performed according to a catchment similarity index, a weighted-sum index based on four catchment characteristic attributes: IPCC climate zone, soil texture, land cover, and topographic index. The catchments with calibrated parameter values are donors, while the uncalibrated catchments are candidates. Catchment characteristic analyses are first conducted for both donors and candidates. For each attribute, we compute a characteristic distance between donors and candidates. Next, for each candidate, weights are assigned to the four attributes such that higher weights are given to properties more directly linked to the dominant hydrologic processes. This ensures that the parameter set cloning emphasizes the dominant hydrologic process in the region where the candidate is located. The catchment similarity index for each donor-candidate pair is then computed as the sum of the weighted distances over the four properties. Finally, parameters are assigned to each candidate from the donor that is "most similar" (i.e., with the shortest weighted distance sum). For validation, we applied the proposed method to catchments where gauge observations are available and compared streamflows simulated using parameters cloned from other catchments with results obtained by calibrating the hydrologic model directly against gauge data.
The comparison shows good agreement between the two simulations across different river basins. The method has been applied globally to the Hillslope River Routing (HRR) model using gauge observations from the Global Runoff Data Center (GRDC). As a next step, more catchment properties can be taken into account to further improve the representation of catchment similarity.
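The donor-selection step above reduces to a weighted nearest-neighbor search over attribute distances, which can be sketched directly. The numeric attribute encodings and the weight values below are illustrative placeholders, not the operational ones.

```python
def most_similar_donor(candidate, donors, weights):
    """Pick the donor catchment minimizing the weighted sum of per-attribute
    characteristic distances (the catchment similarity index), so that the
    candidate inherits that donor's calibrated parameter set."""
    best_name, best_score = None, float("inf")
    for name, attrs in donors.items():
        # Weighted sum of attribute distances between donor and candidate
        score = sum(w * abs(candidate[a] - attrs[a]) for a, w in weights.items())
        if score < best_score:
            best_name, best_score = name, score
    return best_name, best_score
```

In the scheme described above, the weights would be set per candidate so that the attributes most tied to the locally dominant hydrologic process dominate the index.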

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fadika, Zacharia; Dede, Elif; Govindaraju, Madhusudhan

    MapReduce is increasingly becoming a popular framework and a potent programming model. The most popular open-source implementation of MapReduce, Hadoop, is based on the Hadoop Distributed File System (HDFS). However, as HDFS is not POSIX compliant, it cannot be fully leveraged by applications running on a majority of existing HPC environments such as Teragrid and NERSC. These HPC environments typically support globally shared file systems such as NFS and GPFS. On such resourceful HPC infrastructures, the use of Hadoop not only creates compatibility issues but also affects overall performance due to the added overhead of the HDFS. This paper not only presents a MapReduce implementation directly suitable for HPC environments, but also exposes the design choices that yield better performance in those settings. By leveraging the functions of the inherent distributed file systems, and abstracting them away from its MapReduce framework, MARIANE (MApReduce Implementation Adapted for HPC Environments) both allows the model to be used in an expanding number of HPC environments and delivers better performance in such settings. This paper shows the applicability and high performance of the MapReduce paradigm through MARIANE, an implementation designed for clustered and shared-disk file systems and as such not tied to a specific MapReduce solution. The paper identifies the components and trade-offs necessary for this model, and quantifies the performance gains exhibited by our approach in distributed environments over Apache Hadoop in a data-intensive setting, on the Magellan testbed at the National Energy Research Scientific Computing Center (NERSC).
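The programming model MARIANE adapts is simple enough to state in a few lines: a map phase emits key-value pairs, a shuffle groups values by key, and a reduce phase folds each group. The sketch below is an in-process illustration of that model only, with no distribution or file-system I/O, which is precisely the part MARIANE delegates to the shared file system.

```python
from collections import defaultdict

def mapreduce(records, mapper, reducer):
    """Minimal in-process MapReduce: map each record to (key, value) pairs,
    shuffle by grouping values under their key, then reduce each group."""
    groups = defaultdict(list)
    for rec in records:                 # map phase
        for k, v in mapper(rec):
            groups[k].append(v)         # shuffle: group values by key
    return {k: reducer(k, vs) for k, vs in groups.items()}  # reduce phase
```

A word count is the canonical usage: the mapper emits `(word, 1)` per token and the reducer sums the counts.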

  13. Global estimation of CO emissions using three sets of satellite data for burned area

    NASA Astrophysics Data System (ADS)

    Jain, Atul K.

    Using three sets of satellite data for burned area together with tree cover imagery and a biogeochemical component of the Integrated Science Assessment Model (ISAM), global CO emissions and associated uncertainties are estimated for the year 2000. The available fuel load (AFL) is calculated using the ISAM biogeochemical model, which accounts for the aboveground and surface fuel removed by land clearing for croplands and pasturelands, as well as the influence on fuel load of various ecosystem processes (such as stomatal conductance, evapotranspiration, plant photosynthesis and respiration, litter production, and soil organic carbon decomposition) and important feedback mechanisms (such as climate and fertilization feedbacks). The ISAM-estimated global total AFL in the year 2000 was about 687 Pg. Forest ecosystems account for about 90% of the global total AFL. The estimated global CO emissions based on three global burned-area satellite data sets (GLOBSCAR, GBA, and Global Fire Emissions Database version 2 (GFEDv2)) for the year 2000 range between 320 and 390 Tg CO. Emissions from open fires are highest in tropical Africa, primarily due to forest cutting and burning. The estimated overall uncertainty in global CO emissions is about ±65%, with the highest uncertainty occurring in the North Africa and Middle East region (±99%). The results of this study suggest that the uncertainties in the calculated emissions stem primarily from the burned-area data.
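The arithmetic behind estimates of this kind is the standard bottom-up Seiler-Crutzen form, E = A × B × C × EF (burned area × fuel load × combustion completeness × emission factor). The sketch below shows the unit bookkeeping only; the input numbers in the usage are illustrative, not the paper's values.

```python
def co_emission_tg(burned_area_km2, fuel_load_kg_m2, combustion_completeness,
                   ef_g_co_per_kg):
    """Bottom-up fire CO emission in the Seiler-Crutzen form E = A*B*C*EF,
    with explicit unit conversions from km^2 and g to Tg."""
    burned_area_m2 = burned_area_km2 * 1e6
    dry_matter_burned_kg = (burned_area_m2 * fuel_load_kg_m2
                            * combustion_completeness)
    co_g = dry_matter_burned_kg * ef_g_co_per_kg
    return co_g / 1e12  # grams -> teragrams
```

For example, 10^6 km^2 burned at 1 kg/m^2 fuel load, 50% combustion completeness, and 100 g CO per kg dry matter yields 50 Tg CO, the same order as the per-dataset global totals quoted above.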

  14. Effects of non-neuronal components for functional connectivity analysis from resting-state functional MRI toward automated diagnosis of schizophrenia

    NASA Astrophysics Data System (ADS)

    Kim, Junghoe; Lee, Jong-Hwan

    2014-03-01

    A functional connectivity (FC) analysis from resting-state functional MRI (rsfMRI) is gaining popularity for clinical applications such as the diagnosis of neuropsychiatric disease. To delineate brain networks from rsfMRI data, non-neuronal components, including head motion and physiological artifacts mainly observed in cerebrospinal fluid (CSF) and white matter (WM), along with a global brain signal, have been regarded as nuisance variables in calculating the FC level. However, it is still unclear to what extent these non-neuronal components affect diagnostic performance. In this study, a systematic comparison of classification performance for schizophrenia patients is provided, employing partial correlation coefficients (CCs) as feature elements. Pair-wise partial CCs were calculated between brain regions, in which six combinatorial sets of nuisance variables were considered. The partial CCs were used as candidate feature elements, followed by feature selection based on a statistical significance test between the two groups in the training set. Once a linear support vector machine was trained using the selected features from the training set, classification performance was evaluated using features from the test set (i.e., a leave-one-out cross-validation scheme). The error rate using all non-neuronal components as nuisance variables (12.4%) was significantly lower than those using the remaining combinations of non-neuronal components (13.8-20.0%). In conclusion, the non-neuronal components substantially degraded automated diagnosis performance when not fully removed, which supports our hypothesis that accounting for non-neuronal components is crucial to the performance of automated diagnosis of neuropsychiatric disease using fMRI.
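The feature computation described above — correlating two regional time series after regressing out nuisance signals — is a partial correlation, and can be sketched in a few lines. This is a simplified stand-in for the study's pipeline (no feature selection or SVM), and the signals in the usage are synthetic.

```python
import numpy as np

def partial_corr(x, y, nuisance):
    """Correlation between two regional time series after regressing out
    nuisance signals (e.g. motion, CSF/WM averages, global signal):
    project both series off the nuisance subspace, then correlate residuals."""
    Z = np.column_stack([np.ones(len(x)), nuisance])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return float(rx @ ry / (np.linalg.norm(rx) * np.linalg.norm(ry)))
```

Two regions that merely share a global signal show a high raw correlation but a near-zero partial correlation once that signal is included as a nuisance regressor, which is exactly why the choice of nuisance set changes the downstream classifier's behavior.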

  15. Designing Training for Global Environments: Knowing What Questions To Ask.

    ERIC Educational Resources Information Center

    Gayeski, Diane M.; Sanchirico, Christine; Anderson, Janet

    2002-01-01

    Presents a framework for identifying important issues for instructional design and delivery in global settings. Highlights include cultural factors in global training; an instructional design model; corporate globalization strategy; communication and training norms; language barriers; implicit value differences; and technical and legal…

  16. Representation of fine scale atmospheric variability in a nudged limited area quasi-geostrophic model: application to regional climate modelling

    NASA Astrophysics Data System (ADS)

    Omrani, H.; Drobinski, P.; Dubos, T.

    2009-09-01

    In this work, we consider the effect of indiscriminate nudging time on the large and small scales of an idealized limited-area model simulation. The limited-area model (LAM) is a two-layer quasi-geostrophic model on the beta-plane, driven at its boundaries by its "global" version with periodic boundary conditions. This setup mimics the configuration used for regional climate modelling. Compared to a previous study by Salameh et al. (2009), who investigated the existence of an optimal nudging time minimizing the error on both large and small scales in a linear model, we here use a fully non-linear model which allows us to represent the chaotic nature of the atmosphere: given the perfect quasi-geostrophic model, errors in the initial conditions, concentrated mainly in the smaller scales of motion, amplify and cascade into the larger scales, eventually resulting in a prediction with low skill. To quantify the predictability of our quasi-geostrophic model, we measure the rate of divergence of the system trajectories in phase space (the Lyapunov exponent) from a set of simulations initiated with perturbations of a reference initial state. Predictability of the "global", periodic model is mostly controlled by the beta effect. In the LAM, predictability decreases as the domain size increases. The effect of large-scale nudging is then studied using the "perfect model" approach. Two sets of experiments were performed: (1) the effect of nudging is investigated with a "global" high-resolution two-layer quasi-geostrophic model driven by a low-resolution two-layer quasi-geostrophic model; (2) similar simulations are conducted with the two-layer quasi-geostrophic LAM, where the size of the LAM domain comes into play in addition to the factors in the first set of experiments. In both sets of experiments, the best spatial correlation between the nudged simulation and the reference is observed with a nudging time close to the predictability time.
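The phase-space divergence measurement described above can be demonstrated on a toy system: track two initially close trajectories, accumulate the logarithmic stretching of their separation, and renormalize each step. The sketch below uses the logistic map rather than the quasi-geostrophic model, purely to illustrate the estimator.

```python
import math

def lyapunov_estimate(f, x0, eps=1e-9, steps=2000, discard=100):
    """Estimate the largest Lyapunov exponent of a 1-D map from the rate of
    divergence of two nearby trajectories, renormalizing their separation
    to eps after every step so it never saturates."""
    x, y = x0, x0 + eps
    for _ in range(discard):                  # let the orbit settle
        x, y = f(x), f(y)
        y = x + math.copysign(eps, y - x)
    total = 0.0
    for _ in range(steps):
        x, y = f(x), f(y)
        d = abs(y - x)
        total += math.log(d / eps)            # local stretching rate
        y = x + math.copysign(eps, y - x)     # renormalize the separation
    return total / steps
```

For the fully chaotic logistic map f(x) = 4x(1-x), the exact exponent is ln 2 ≈ 0.693, so the estimator can be checked against a known answer; in the LAM study the same quantity defines the predictability time to which the optimal nudging time is compared.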

  17. Development and evaluation of CAHPS survey items assessing how well healthcare providers address health literacy.

    PubMed

    Weidmer, Beverly A; Brach, Cindy; Hays, Ron D

    2012-09-01

    The complexity of health information often exceeds patients' skills to understand and use it. To develop survey items assessing how well healthcare providers communicate health information. Domains and items for the Consumer Assessment of Healthcare Providers and Systems (CAHPS) Item Set for Addressing Health Literacy were identified through an environmental scan and input from stakeholders. The draft item set was translated into Spanish and pretested in both English and Spanish. The revised item set was field tested with a randomly selected sample of adult patients from 2 sites using mail and telephonic data collection. Item-scale correlations, confirmatory factor analysis, and internal consistency reliability estimates were estimated to assess how well the survey items performed and identify composite measures. Finally, we regressed the CAHPS global rating of the provider item on the CAHPS core communication composite and the new health literacy composites. A total of 601 completed surveys were obtained (52% response rate). Two composite measures were identified: (1) Communication to Improve Health Literacy (16 items); and (2) How Well Providers Communicate About Medicines (6 items). These 2 composites were significantly uniquely associated with the global rating of the provider (communication to improve health literacy: P<0.001, b=0.28; and communication about medicines composite: P=0.02, b=0.04). The 2 composites and the CAHPS core communication composite accounted for 51% of the variance in the global rating of the provider. A 5-item subset of the Communication to Improve Health Literacy composite accounted for 90% of the variance of the original 16-item composite. This study provides support for reliability and validity of the CAHPS Item Set for Addressing Health Literacy. These items can serve to assess whether healthcare providers have communicated effectively with their patients and as a tool for quality improvement.
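The variance-explained analysis above (the composites accounting for 51% of the variance in the global rating) is an ordinary R² computation, sketched here on synthetic data. The composite scores and coefficients below are made up for illustration, not the CAHPS values.

```python
import numpy as np

def variance_explained(X, y):
    """Fraction of variance in a rating explained by composite scores via
    ordinary least squares with an intercept (in-sample R^2)."""
    X1 = np.column_stack([np.ones(len(y)), X])      # add an intercept column
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ coef
    return 1.0 - resid.var() / y.var()
```

With real survey data one would also inspect the per-composite coefficients and their significance, as the study does for the two health-literacy composites.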

  18. Day 1 for the Integrated Multi-Satellite Retrievals for GPM (IMERG) Data Sets

    NASA Astrophysics Data System (ADS)

    Huffman, G. J.; Bolvin, D. T.; Braithwaite, D.; Hsu, K. L.; Joyce, R.; Kidd, C.; Sorooshian, S.; Xie, P.

    2014-12-01

    The Integrated Multi-satellitE Retrievals for GPM (IMERG) is designed to compute the best time series of (nearly) global precipitation from "all" precipitation-relevant satellites and global surface precipitation gauge analyses. IMERG was developed to use GPM Core Observatory data as a reference for the international constellation of satellites of opportunity that constitute the GPM virtual constellation. Computationally, IMERG is a unified U.S. algorithm drawing on strengths of the three contributing groups, whose previous work includes: 1) the TRMM Multi-satellite Precipitation Analysis (TMPA); 2) the CPC Morphing algorithm with Kalman Filtering (K-CMORPH); and 3) the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks with a Cloud Classification System (PERSIANN-CCS). We review the IMERG design, development, testing, and current status. IMERG provides 0.1°x0.1° half-hourly data and will be run at multiple latencies, providing successively more accurate estimates 4 hours, 8 hours, and 2 months after observation time. In Day 1 the spatial extent is 60°N-S, for the period March 2014 to the present. In subsequent reprocessing the data will extend to fully global coverage for the period 1998 to the present. Both the input data set retrievals and the IMERG system are substantially different from those used in previous U.S. products. The input passive microwave data are all produced with GPROF2014, which is substantially upgraded compared to previous versions and, for the first time, includes microwave sounders. Accordingly, there is a strong need to carefully check the initial test data sets for performance. IMERG output will be illustrated using pre-operational test data, including the variety of supporting fields, such as the merged microwave and infrared estimates and the precipitation type. Finally, we summarize the expected release of the various output products and the subsequent reprocessing sequence.

  19. A global reconstruction of climate-driven subdecadal water storage variability

    NASA Astrophysics Data System (ADS)

    Humphrey, V.; Gudmundsson, L.; Seneviratne, S. I.

    2017-03-01

    Since 2002, the Gravity Recovery and Climate Experiment (GRACE) mission has provided unprecedented observations of global mass redistribution caused by hydrological processes. However, there are still few sources on pre-2002 global terrestrial water storage (TWS). Classical approaches to retrieve past TWS rely on either land surface models (LSMs) or basin-scale water balance calculations. Here we propose a new approach which statistically relates anomalies in atmospheric drivers to monthly GRACE anomalies. Gridded subdecadal TWS changes and time-dependent uncertainty intervals are reconstructed for the period 1985-2015. Comparisons with model results demonstrate the performance and robustness of the derived data set, which represents a new and valuable source for studying subdecadal TWS variability, closing the ocean/land water budgets and assessing GRACE uncertainties. At midpoint between GRACE observations and LSM simulations, the statistical approach provides TWS estimates (doi:10.5905/ethz-1007-85) that are essentially derived from observations and are based on a limited number of transparent model assumptions.
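The statistical approach described above can be sketched as a linear map fitted from driver anomalies to GRACE TWS anomalies over the observed period and then applied to pre-GRACE forcing. This is deliberately minimal: the published method is more elaborate (per-grid-cell models, decaying precipitation memory, time-dependent uncertainty intervals), and the variable names here are illustrative.

```python
import numpy as np

def reconstruct_tws(drivers_grace, tws_grace, drivers_past):
    """Fit an ordinary least-squares map from atmospheric-driver anomalies
    (e.g. precipitation and temperature) to GRACE TWS anomalies, then apply
    the fitted coefficients to the pre-GRACE driver record."""
    X = np.column_stack([np.ones(len(tws_grace)), drivers_grace])
    coef, *_ = np.linalg.lstsq(X, tws_grace, rcond=None)
    Xp = np.column_stack([np.ones(len(drivers_past)), drivers_past])
    return Xp @ coef  # reconstructed TWS anomalies for the earlier period
```

The attraction of this design is that the reconstruction is driven almost entirely by observations, with only the (transparent) linearity assumption standing in for a land surface model.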

  20. Modeling Physiological Systems in the Human Body as Networks of Quasi-1D Fluid Flows

    NASA Astrophysics Data System (ADS)

    Staples, Anne

    2008-11-01

    Extensive research has been done on modeling human physiology. Most of this work has been aimed at developing detailed, three-dimensional models of specific components of physiological systems, such as a cell, a vein, a molecule, or a heart valve. While efforts such as these are invaluable to our understanding of human biology, if we were to construct a global model of human physiology with this level of detail, computing even a nanosecond in this computational being's life would certainly be prohibitively expensive. With this in mind, we derive the Pulsed Flow Equations, a set of coupled one-dimensional partial differential equations, specifically designed to capture two-dimensional viscous, transport, and other effects, and aimed at providing accurate and fast-to-compute global models for physiological systems represented as networks of quasi one-dimensional fluid flows. Our goal is to be able to perform faster-than-real time simulations of global processes in the human body on desktop computers.

  1. Global models for synthetic fuels planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamontagne, J.

    1983-10-01

    This study was performed to identify the set of existing global models with the best potential for use in the US Synthetic Fuels Corporation's strategic planning process, and to recommend the most appropriate model. The study was limited to global models with representations that encompass time horizons beyond the year 2000, multiple fuel forms, and significant regional detail. Potential accessibility to the Synthetic Fuels Corporation and adequate documentation were also required. Four existing models (LORENDAS, WIM, IIASA, and IEA/ORAU) were judged to be the best candidates for the SFC's use at this time; none of the models appears to be ideal for the SFC's purposes. On the basis of currently available information, the most promising short-term option open to the SFC is the use of LORENDAS, with careful attention to definition of alternative energy demand scenarios. Longer-term options which deserve further study are coupling LORENDAS with an explicit model of energy demand, and modification of the IEA/ORAU model to include finer time-period definition and additional technological detail.

  2. Global Microwave Imager (GMI) Spin Mechanism Assembly Design, Development, and Performance Test Results

    NASA Technical Reports Server (NTRS)

    Kubitschek, Michael; Woolaway, Scott; Guy, Larry; Dayton, Chris; Berdanier, Barry; Newell, David; Pellicciotti, Joseph W.

    2011-01-01

    The GMI Spin Mechanism Assembly (SMA) is a precision bearing and power transfer drive assembly mechanism that supports and spins the Global Microwave Imager (GMI) instrument at a constant rate of 32 rpm continuously for the 3-year-plus mission life. The GMI instrument will fly on the core Global Precipitation Measurement (GPM) spacecraft and will be used to make calibrated radiometric measurements at multiple microwave frequencies and polarizations. The GPM mission is an international effort managed by the National Aeronautics and Space Administration (NASA) to improve climate, weather, and hydro-meteorological predictions through more accurate and frequent precipitation measurements [1]. Ball Aerospace and Technologies Corporation (BATC) was selected by NASA Goddard Space Flight Center (GSFC) to design, build, and test the GMI instrument. The SMA design has to meet a challenging set of requirements and is based on BATC space-mechanism heritage and on lessons-learned design changes to the WindSat BAPTA mechanism, which is currently operating on orbit and has recently surpassed 8 years of flight operation.

  3. Long-wave instabilities of two interlaced helical vortices

    NASA Astrophysics Data System (ADS)

    Quaranta, H. U.; Brynjell-Rahkola, M.; Leweke, T.; Henningson, D. S.

    2016-09-01

    We present a comparison between experimental observations and theoretical predictions concerning long-wave displacement instabilities of the helical vortices in the wake of a two-bladed rotor. Experiments are performed with a small-scale rotor in a water channel, using a set-up that allows the individual triggering of various instability modes at different azimuthal wave numbers, leading to local or global pairing of successive vortex loops. The initial development of the instability and the measured growth rates are in good agreement with the predictions from linear stability theory, based on an approach where the helical vortex system is represented by filaments. At later times, local pairing develops into large-scale distortions of the vortices, whereas for global pairing the non-linear evolution returns the system almost to its initial geometry.

  4. Robust distributed model predictive control of linear systems with structured time-varying uncertainties

    NASA Astrophysics Data System (ADS)

    Zhang, Langwen; Xie, Wei; Wang, Jingcheng

    2017-11-01

    In this work, synthesis of robust distributed model predictive control (MPC) is presented for a class of linear systems subject to structured time-varying uncertainties. By decomposing a global system into smaller dimensional subsystems, a set of distributed MPC controllers, instead of a centralised controller, are designed. To ensure the robust stability of the closed-loop system with respect to model uncertainties, distributed state feedback laws are obtained by solving a min-max optimisation problem. The design of robust distributed MPC is then transformed into solving a minimisation optimisation problem with linear matrix inequality constraints. An iterative online algorithm with adjustable maximum iteration is proposed to coordinate the distributed controllers to achieve a global performance. The simulation results show the effectiveness of the proposed robust distributed MPC algorithm.

  5. The Mpi-M Aerosol Climatology (MAC)

    NASA Astrophysics Data System (ADS)

    Kinne, S.

    2014-12-01

    Monthly gridded global data-sets for aerosol optical properties (AOD, SSA and g) and for aerosol microphysical properties (CCN and IN) offer a (less complex) alternate path to include aerosol radiative effects and aerosol impacts on cloud-microphysics in global simulations. By merging AERONET sun-/sky-photometer data onto background maps provided by AeroCom phase 1 modeling output, the MPI-M Aerosol Climatology (MAC) version 1 was developed and applied in IPCC simulations with ECHAM and as an ancillary data-set in satellite-based global data-sets. An updated version 2 of this climatology will be presented, now applying central values from the more recent AeroCom phase 2 modeling and utilizing the better global coverage of trusted sun-photometer data, including statistics from the Marine Aerosol Network (MAN). Applications include spatial distributions of estimates for aerosol direct and aerosol indirect radiative effects.

  6. Probing the functions of long non-coding RNAs by exploiting the topology of global association and interaction network.

    PubMed

    Deng, Lei; Wu, Hongjie; Liu, Chuyao; Zhan, Weihua; Zhang, Jingpu

    2018-06-01

    Long non-coding RNAs (lncRNAs) are involved in many biological processes, such as immune response, development, differentiation and gene imprinting, and are associated with diseases and cancers. However, the functions of the vast majority of lncRNAs are still unknown. Predicting the biological functions of lncRNAs is one of the key challenges in the post-genomic era. In our work, we first build a global network including a lncRNA similarity network, a lncRNA-protein association network and a protein-protein interaction network according to the expressions and interactions, and then extract the topological feature vectors of the global network. Using these features, we present an SVM-based machine learning approach, PLNRGO, to annotate human lncRNAs. In PLNRGO, we construct a training data set according to the proteins with GO annotations and train a binary classifier for each GO term. We assess the performance of PLNRGO on our manually annotated lncRNA benchmark and a protein-coding gene benchmark with known functional annotations. As a result, the performance of our method is significantly better than that of other state-of-the-art methods in terms of maximum F-measure and coverage. Copyright © 2018 Elsevier Ltd. All rights reserved.
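The maximum F-measure used above to compare methods can be computed by scanning decision thresholds over the classifier scores. A minimal sketch, assuming a flat score-and-label representation (the function name is illustrative, not from PLNRGO):

```python
def max_f_measure(scores, labels):
    """Scan every distinct score as a threshold and return the best F1."""
    best = 0.0
    for t in sorted(set(scores)):
        tp = sum(1 for s, l in zip(scores, labels) if s >= t and l)
        fp = sum(1 for s, l in zip(scores, labels) if s >= t and not l)
        fn = sum(1 for s, l in zip(scores, labels) if s < t and l)
        if tp == 0:
            continue
        p = tp / (tp + fp)   # precision at this threshold
        r = tp / (tp + fn)   # recall at this threshold
        best = max(best, 2 * p * r / (p + r))
    return best

# a perfect ranking of two positives above two negatives gives F = 1
print(max_f_measure([0.9, 0.8, 0.3, 0.1], [1, 1, 0, 0]))  # → 1.0
```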

  7. New results on global exponential dissipativity analysis of memristive inertial neural networks with distributed time-varying delays.

    PubMed

    Zhang, Guodong; Zeng, Zhigang; Hu, Junhao

    2018-01-01

    This paper is concerned with the global exponential dissipativity of memristive inertial neural networks with discrete and distributed time-varying delays. By constructing appropriate Lyapunov-Krasovskii functionals, some new sufficient conditions ensuring global exponential dissipativity of memristive inertial neural networks are derived. Moreover, the globally exponential attractive sets and positive invariant sets are also presented here. In addition, the new proposed results here complement and extend the earlier publications on conventional or memristive neural network dynamical systems. Finally, numerical simulations are given to illustrate the effectiveness of obtained results. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Global-mean BC lifetime as an indicator of model skill? Constraining the vertical aerosol distribution using aircraft observations

    NASA Astrophysics Data System (ADS)

    Lund, M. T.; Samset, B. H.; Skeie, R. B.; Berntsen, T.

    2017-12-01

    Several recent studies have used observations from the HIPPO flight campaigns to constrain the modeled vertical distribution of black carbon (BC) over the Pacific. Results indicate a relatively linear relationship between global-mean atmospheric BC residence time, or lifetime, and bias in current models. A lifetime of less than 5 days is necessary for models to reasonably reproduce these observations. This is shorter than what many global models predict, which will in turn affect their estimates of BC climate impacts. Here we use the chemistry-transport model OsloCTM to examine whether this relationship between global BC lifetime and model skill also holds for a broader set of flight campaigns from 2009-2013 covering both remote marine and continental regions at a range of latitudes. We perform four sets of simulations with varying scavenging efficiency to obtain a spread in the modeled global BC lifetime and calculate the model error and bias for each campaign and region. Vertical BC profiles are constructed using an online flight simulator, as well as by averaging and interpolating monthly mean model output, allowing us to quantify sampling errors arising when measurements are compared with model output at different spatial and temporal resolutions. Using the OsloCTM coupled with a microphysical aerosol parameterization, we investigate the sensitivity of the modeled BC vertical distribution to uncertainties in the aerosol aging and scavenging processes in more detail. From this, we can quantify how model uncertainties in the BC life cycle propagate into uncertainties in its climate impacts. For most campaigns and regions, a short global-mean BC lifetime corresponds with the lowest model error and bias. On an aggregated level, sampling errors appear to be small, but larger differences are seen in individual regions.
However, we also find that model-measurement discrepancies in BC vertical profiles cannot be uniquely attributed to uncertainties in a single process or parameter, at least in this model. Model development therefore needs to focus on improvements to individual processes, supported by a broad range of observational and experimental data, rather than tuning individual, effective parameters such as global BC lifetime.
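The global-mean lifetime discussed above is, at steady state, simply the atmospheric burden divided by the total removal flux. A one-line sketch with purely illustrative numbers (not OsloCTM output):

```python
def residence_time_days(burden_tg, removal_tg_per_day):
    """Steady-state global-mean lifetime: atmospheric burden divided by
    the total removal flux (wet + dry deposition)."""
    return burden_tg / removal_tg_per_day

# illustrative values only: 0.12 Tg burden removed at 0.024 Tg/day
print(residence_time_days(0.12, 0.024))  # ≈ 5 days
```

A diagnosed lifetime near or below this 5-day mark is the skill indicator the abstract discusses.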

  9. Parton Distributions based on a Maximally Consistent Dataset

    NASA Astrophysics Data System (ADS)

    Rojo, Juan

    2016-04-01

    The choice of data that enters a global QCD analysis can have a substantial impact on the resulting parton distributions and their predictions for collider observables. One of the main reasons for this has to do with the possible presence of inconsistencies, either internal within an experiment or external between different experiments. In order to assess the robustness of the global fit, different definitions of a conservative PDF set, that is, a PDF set based on a maximally consistent dataset, have been introduced. However, these approaches are typically affected by theory biases in the selection of the dataset. In this contribution, after a brief overview of recent NNPDF developments, we propose a new, fully objective, definition of a conservative PDF set, based on the Bayesian reweighting approach. Using the new NNPDF3.0 framework, we produce various conservative sets, which turn out to be mutually in agreement within the respective PDF uncertainties, as well as with the global fit. We explore some of their implications for LHC phenomenology, finding also good consistency with the global fit result. These results provide a non-trivial validation test of the new NNPDF3.0 fitting methodology, and indicate that possible inconsistencies in the fitted dataset do not affect substantially the global fit PDFs.
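Bayesian reweighting assigns each PDF replica a weight determined by its χ² against the new data. A sketch of the Giele-Keller-style weight formula commonly quoted for NNPDF reweighting, normalised here to sum to 1; the function name is illustrative:

```python
import math

def reweight(chi2, ndata):
    """Weights w_k proportional to (chi2_k)^((n-1)/2) * exp(-chi2_k/2),
    where n is the number of new data points; normalised to sum to 1."""
    raw = [c ** ((ndata - 1) / 2) * math.exp(-c / 2) for c in chi2]
    total = sum(raw)
    return [w / total for w in raw]

# replicas with lower chi2 against the new data receive larger weights
w = reweight([3.0, 5.0, 10.0], ndata=4)
print(round(sum(w), 6))  # → 1.0
```

Replicas that fit the retained (maximally consistent) data set poorly are thereby suppressed without refitting.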

  10. The 1 km resolution global data set: needs of the International Geosphere Biosphere Programme

    USGS Publications Warehouse

    Townshend, J.R.G.; Justice, C.O.; Skole, D.; Malingreau, J.-P.; Cihlar, J.; Teillet, P.; Sadowski, F.; Ruttenberg, S.

    1994-01-01

    Examination of the scientific priorities for the International Geosphere Biosphere Programme (IGBP) reveals a requirement for global land data sets in several of its Core Projects. These data sets need to be at several space and time scales. Requirements are demonstrated for the regular acquisition of data at spatial resolutions of 1 km and finer and at high temporal frequencies. Global daily data at a resolution of approximately 1 km are sensed by the Advanced Very High Resolution Radiometer (AVHRR), but they have not been available in a single archive. It is proposed that a global data set of the land surface be created from remotely sensed data from the AVHRR to support a number of IGBP's projects. This data set should have a spatial resolution of 1 km and should be generated at least once every 10 days for the entire globe. The minimum length of record should be a year, and ideally a system should be put in place which leads to the continuous acquisition of 1 km data to provide a baseline data set prior to the Earth Observing System (EOS) towards the end of the decade. Because of the high cloud cover in many parts of the world, it is necessary to plan for the collection of data from every orbit. Substantial effort will be required in the preprocessing of the data set, involving radiometric calibration, atmospheric correction, geometric correction, and temporal compositing, to make it suitable for the extraction of information.
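Temporal compositing of AVHRR data over a 10-day window is commonly done by per-pixel maximum-value compositing, which tends to select the least cloud-contaminated observation. A minimal sketch under that assumption (the fill value and toy scenes are illustrative):

```python
def max_value_composite(daily_scenes):
    """Per-pixel maximum over a stack of daily scenes. Cloud-contaminated
    pixels carry a low fill value (here -1.0), so the maximum selects the
    clearest observation, the standard maximum-value compositing idea."""
    return [max(pixel_stack) for pixel_stack in zip(*daily_scenes)]

day1 = [0.2, -1.0, 0.5]   # -1.0 marks a cloudy (invalid) pixel
day2 = [0.4, 0.3, -1.0]
day3 = [-1.0, 0.1, 0.6]
print(max_value_composite([day1, day2, day3]))  # → [0.4, 0.3, 0.6]
```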

  11. Control of free-flying space robot manipulator systems

    NASA Technical Reports Server (NTRS)

    Cannon, Robert H., Jr.

    1988-01-01

    The focus of the work is to develop and perform a set of research projects using laboratory models of satellite robots. These devices use air cushion technology to simulate in two dimensions the drag-free, zero-g conditions of space. Five research areas are examined: cooperative manipulation on a fixed base; cooperative manipulation on a free-floating base; global navigation and control of a free-floating robot; an alternative transport mode called Locomotion Enhancement via Arm Push-Off (LEAP); and adaptive control of LEAP.

  12. A variable resolution nonhydrostatic global atmospheric semi-implicit semi-Lagrangian model

    NASA Astrophysics Data System (ADS)

    Pouliot, George Antoine

    2000-10-01

    The objective of this project is to develop a variable-resolution finite difference adiabatic global nonhydrostatic semi-implicit semi-Lagrangian (SISL) model based on the fully compressible nonhydrostatic atmospheric equations. To achieve this goal, a three-dimensional variable resolution dynamical core was developed and tested. The main characteristics of the dynamical core can be summarized as follows: Spherical coordinates were used in a global domain. A hydrostatic/nonhydrostatic switch was incorporated into the dynamical equations to use the fully compressible atmospheric equations. A generalized horizontal variable resolution grid was developed and incorporated into the model. For a variable resolution grid, in contrast to a uniform resolution grid, the order of accuracy of finite difference approximations is formally lost but remains close to the order of accuracy associated with the uniform resolution grid provided the grid stretching is not too significant. The SISL numerical scheme was implemented for the fully compressible set of equations. In addition, the generalized minimum residual (GMRES) method with restart and preconditioner was used to solve the three-dimensional elliptic equation derived from the discretized system of equations. The three-dimensional momentum equation was integrated in vector-form to incorporate the metric terms in the calculations of the trajectories. Using global re-analysis data for a specific test case, the model was compared to similar SISL models previously developed. Reasonable agreement between the model and the other independently developed models was obtained. The Held-Suarez test for dynamical cores was used for a long integration and the model was successfully integrated for up to 1200 days. Idealized topography was used to test the variable resolution component of the model. Nonhydrostatic effects were simulated at grid spacings of 400 meters with idealized topography and uniform flow. 
Using a high-resolution topographic data set and the variable resolution grid, sets of experiments with increasing resolution were performed over specific regions of interest. Using realistic initial conditions derived from re-analysis fields, nonhydrostatic effects were significant for grid spacings on the order of 0.1 degrees with orographic forcing. If the model code was adapted for use in a message passing interface (MPI) on a parallel supercomputer today, it was estimated that a global grid spacing of 0.1 degrees would be achievable for a global model. In this case, nonhydrostatic effects would be significant for most areas. A variable resolution grid in a global model provides a unified and flexible approach to many climate and numerical weather prediction problems. The ability to configure the model from very fine to very coarse resolutions allows for the simulation of atmospheric phenomena at different scales using the same code. We have developed a dynamical core illustrating the feasibility of using a variable resolution in a global model.
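The semi-Lagrangian transport underlying the SISL scheme can be illustrated in one dimension: trace each grid point upstream along the wind to its departure point, then interpolate the field there. This is a generic periodic-grid sketch with linear interpolation, not the model's actual code:

```python
import math

def semi_lagrangian_step(field, u, dt, dx):
    """One semi-Lagrangian advection step on a periodic 1-D grid:
    trace each arrival point back along the wind to its departure
    point, then linearly interpolate the field there."""
    n = len(field)
    new = []
    for i in range(n):
        xd = i - u * dt / dx          # departure point, in grid units
        j = math.floor(xd)            # left neighbour of departure point
        frac = xd - j
        new.append((1 - frac) * field[j % n] + frac * field[(j + 1) % n])
    return new

# a unit pulse advected exactly one grid cell to the right
print(semi_lagrangian_step([1.0, 0.0, 0.0, 0.0], u=1.0, dt=1.0, dx=1.0))
# → [0.0, 1.0, 0.0, 0.0]
```

Because the departure-point trace is unconditionally stable, the time step is not bound by the advective CFL limit, which is what makes the SISL combination attractive for variable-resolution grids.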

  13. Do International Cocurricular Activities Have an Impact on Cultivating a Global Mindset in Business School Students?

    ERIC Educational Resources Information Center

    Le, Quan; Ling, Teresa; Yau, Jot

    2018-01-01

    In today's integrated global economy, business executives of multinational corporations are required to have a flexible global mindset in order to cope with the driving forces of globalization. Thus, the global market forces stress the importance for business schools to graduate students with skill sets pertinent to functioning competitively in…

  14. Developing Scientific Literacy Skills through Interdisciplinary, Technology-Based Global Simulations: GlobalEd 2

    ERIC Educational Resources Information Center

    Lawless, Kimberly A.; Brown, Scott W.

    2015-01-01

    GlobalEd 2 (GE2) is a set of technology-mediated, problem-based learning (PBL) simulations for middle-grade students, that capitalises on the multidisciplinary nature of the social sciences as an expanded curricular space for students to learn and apply scientific literacies and concepts, while simultaneously also enriching their understanding of…

  15. Financing tuberculosis control: the role of a global financial monitoring system.

    PubMed

    Floyd, Katherine; Pantoja, Andrea; Dye, Christopher

    2007-05-01

    Control of tuberculosis (TB), like health care in general, costs money. To sustain TB control at current levels, and to make further progress so that global targets can be achieved, information about funding needs, sources of funding, funding gaps and expenditures is important at global, regional, national and sub-national levels. Such data can be used for resource mobilization efforts; to document how funding requirements and gaps are changing over time; to assess whether increases in funding can be translated into increased expenditures and whether increases in expenditure are producing improvements in programme performance; and to identify which countries or regions have the greatest needs and funding gaps. In this paper, we discuss a global system for financial monitoring of TB control that was established in WHO in 2002. By early 2007, this system had accounted for actual or planned expenditures of more than US$ 7 billion and was systematically reporting financial data for countries that carry more than 90% of the global burden of TB. We illustrate the value of this system by presenting major findings that have been produced for the period 2002-2007, including results that are relevant to the achievement of global targets for TB control set for 2005 and 2015. We also analyse the strengths and limitations of the system and its relevance to other health-care programmes.

  16. Modeling global mangrove soil carbon stocks: filling the gaps in coastal environments

    NASA Astrophysics Data System (ADS)

    Rovai, A.; Twilley, R.

    2017-12-01

    We provide an overview of contemporaneous global mangrove soil organic carbon (SOC) estimates, focusing on a framework to explain disproportionate differences among observed data as a way to improve global estimates. This framework is based on a former conceptual model, the coastal environmental setting, in contrast to the more popular latitude-based hypotheses largely believed to explain hemispheric variation in mangrove ecosystem properties. To demonstrate how local and regional estimates of SOC linked to coastal environmental settings can render more realistic global mangrove SOC extrapolations, we combined published and unpublished data, yielding a total of 106 studies reporting on 552 sites from 43 countries. These sites were classified into distinct coastal environmental setting types according to two concurrent worldwide typologies of nearshore coastal systems. Mangrove SOC density varied substantially across coastal environmental settings, ranging from 14.9 ± 0.8 mg cm-3 (mean ± SE) in river-dominated (deltaic) soils to 53.9 ± 1.6 mg cm-3 in karstic coastlines. Our findings reveal striking differences between published values and contemporary global mangrove SOC extrapolation based on country-level mean reference values, particularly for karstic-dominated coastlines where mangrove SOC stocks have been underestimated by up to 50%. Correspondingly, climate-based global estimates predicted lower mangrove SOC density values (32-41 mg C cm-3) for mangroves in karstic environments, differing from published (21-126 mg C cm-3) and unpublished (47-58 mg C cm-3) values. Moreover, climate-based projections yielded higher SOC density values (27-70 mg C cm-3) for river-dominated mangroves compared to lower ranges reported in the literature (11-24 mg C cm-3).
We argue that this inconsistent reporting of SOC stock estimates between river-dominated and karstic coastal environmental settings is likely due to the omission of geomorphological and geophysical environmental drivers, which control C storage in coastal wetlands. We encourage the science community to utilize coastal environmental settings and new inventories of geomorphological typologies more closely, to build more robust local and regional estimates of SOC that can be extrapolated to global C estimates.

  17. Assessing Performance in Shoulder Arthroscopy: The Imperial Global Arthroscopy Rating Scale (IGARS).

    PubMed

    Bayona, Sofia; Akhtar, Kash; Gupte, Chinmay; Emery, Roger J H; Dodds, Alexander L; Bello, Fernando

    2014-07-02

    Surgical training is undergoing major changes with reduced resident work hours and an increasing focus on patient safety and surgical aptitude. The aim of this study was to create a valid, reliable method for an assessment of arthroscopic skills that is independent of time and place and is designed for both real and simulated settings. The validity of the scale was tested using a virtual reality shoulder arthroscopy simulator. The study consisted of two parts. In the first part, an Imperial Global Arthroscopy Rating Scale for assessing technical performance was developed using a Delphi method. Application of this scale required installing a dual-camera system to synchronously record the simulator screen and body movements of trainees to allow an assessment that is independent of time and place. The scale includes aspects such as efficient portal positioning, angles of instrument insertion, proficiency in handling the arthroscope and adequately manipulating the camera, and triangulation skills. In the second part of the study, a validation study was conducted. Two experienced arthroscopic surgeons, blinded to the identities and experience of the participants, each assessed forty-nine subjects performing three different tests using the Imperial Global Arthroscopy Rating Scale. Results were analyzed using two-way analysis of variance with measures of absolute agreement. The intraclass correlation coefficient was calculated for each test to assess inter-rater reliability. The scale demonstrated high internal consistency (Cronbach alpha, 0.918). The intraclass correlation coefficient demonstrated high agreement between the assessors: 0.91 (p < 0.001). Construct validity was evaluated using Kruskal-Wallis one-way analysis of variance (chi-square test, 29.826; p < 0.001), demonstrating that the Imperial Global Arthroscopy Rating Scale distinguishes significantly between subjects with different levels of experience utilizing a virtual reality simulator. 
The Imperial Global Arthroscopy Rating Scale has a high internal consistency and excellent inter-rater reliability and offers an approach for assessing technical performance in basic arthroscopy on a virtual reality simulator. The Imperial Global Arthroscopy Rating Scale provides detailed information on surgical skills. Although it requires further validation in the operating room, this scale, which is independent of time and place, offers a robust and reliable method for assessing arthroscopic technical skills. Copyright © 2014 by The Journal of Bone and Joint Surgery, Incorporated.
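The internal-consistency statistic reported above (Cronbach alpha) has a compact closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of totals). A sketch over raw item scores; the toy two-rater data are illustrative, not IGARS ratings:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for k items (or raters) scored over n subjects."""
    k = len(items)
    def var(xs):                       # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    totals = [sum(subject) for subject in zip(*items)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# two raters in perfect agreement up to a constant offset → alpha = 1
print(cronbach_alpha([[1, 2, 3], [2, 3, 4]]))  # → 1.0
```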

  18. A Review of Pediatric Critical Care in Resource-Limited Settings: A Look at Past, Present, and Future Directions

    PubMed Central

    Turner, Erin L.; Nielsen, Katie R.; Jamal, Shelina M.; von Saint André-von Arnim, Amelie; Musa, Ndidiamaka L.

    2016-01-01

    Fifteen years ago, United Nations world leaders defined millennium development goal 4 (MDG 4): to reduce under-5-year mortality rates by two-thirds by the year 2015. Unfortunately, only 27 of 138 developing countries are expected to achieve MDG 4. The majority of childhood deaths in these settings result from reversible causes, and developing effective pediatric emergency and critical care services could substantially reduce this mortality. The Ebola outbreak highlighted the fragility of health care systems in resource-limited settings and emphasized the urgent need for a paradigm shift in the global approach to healthcare delivery related to critical illness. This review provides an overview of pediatric critical care in resource-limited settings and outlines strategies to address challenges specific to these areas. Implementation of these tools has the potential to move us toward delivery of an adequate standard of critical care for all children globally, and ultimately decrease global child mortality in resource-limited settings. PMID:26925393

  19. Adaptive Local Realignment of Protein Sequences.

    PubMed

    DeBlasio, Dan; Kececioglu, John

    2018-06-11

    While mutation rates can vary markedly over the residues of a protein, multiple sequence alignment tools typically use the same values for their scoring-function parameters across a protein's entire length. We present a new approach, called adaptive local realignment, that in contrast automatically adapts to the diversity of mutation rates along protein sequences. This builds upon a recent technique known as parameter advising, which finds global parameter settings for an aligner, to now adaptively find local settings. Our approach in essence identifies local regions with low estimated accuracy, constructs a set of candidate realignments using a carefully-chosen collection of parameter settings, and replaces the region if a realignment has higher estimated accuracy. This new method of local parameter advising, when combined with prior methods for global advising, boosts alignment accuracy as much as 26% over the best default setting on hard-to-align protein benchmarks, and by 6.4% over global advising alone. Adaptive local realignment has been implemented within the Opal aligner using the Facet accuracy estimator.
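The core loop of adaptive local realignment, replacing a region only when a candidate realignment scores higher under the accuracy estimator, can be sketched as follows. The setting names and the toy estimator/realigner are hypothetical stand-ins for Opal parameter settings and the Facet estimator:

```python
def adaptive_local_realignment(regions, estimate, realign, settings):
    """For each region, try candidate parameter settings and keep a
    realignment only when the estimator scores it above the incumbent."""
    out = []
    for region in regions:
        best, best_acc = region, estimate(region)
        for setting in settings:
            cand = realign(region, setting)
            acc = estimate(cand)
            if acc > best_acc:
                best, best_acc = cand, acc
        out.append(best)
    return out

# toy stand-ins: "accuracy" counts matches to a reference column of A's
estimate = lambda r: r.count("A")
realign = lambda r, s: r.replace("B", "A") if s == "permissive" else r
print(adaptive_local_realignment(["AB", "BB"], estimate, realign,
                                 ["strict", "permissive"]))  # → ['AA', 'AA']
```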

  20. Climate Trend Detection using Sea-Surface Temperature Data-sets from the (A)ATSR and AVHRR Space Sensors.

    NASA Astrophysics Data System (ADS)

    Llewellyn-Jones, D. T.; Corlett, G. K.; Remedios, J. J.; Noyes, E. J.; Good, S. A.

    2007-05-01

    Sea-Surface Temperature (SST) is an important indicator of global change, designated by GCOS as an Essential Climate Variable (ECV). The detection of trends in global SST requires rigorous measurements that are not only global, but also highly accurate and consistent. Space instruments can provide the means to achieve these required attributes in SST data. This paper presents an analysis of 15 years of SST data from two independent data sets, generated from the (A)ATSR and AVHRR series of sensors respectively. The analyses reveal trends of increasing global temperature of 0.13°C to 0.18°C per decade, closely matching that expected from some current predictions. A high level of consistency in the results from the two independent observing systems is seen, which gives increased confidence in data from both systems and also enables comparative analyses of the accuracy and stability of both data sets to be carried out. The conclusion is that these satellite SST data-sets provide important means to quantify and explore the processes of climate change. An analysis based upon singular value decomposition, allowing the removal of gross transitory disturbances, notably the El Niño, in order to examine regional areas of change other than the tropical Pacific, is also presented. Interestingly, although El Niño events clearly affect SST globally, they are found to have a non-significant (within error) effect on the calculated trends, which changed by only 0.01 K/decade when the pattern of El Niño and the associated variations was removed from the SST record. Although similar global trends were calculated for these two independent data sets, larger regional differences are noted. Evidence of decreased temperatures after the eruption of Mount Pinatubo in 1991 was also observed. The methodology demonstrated here can be applied to other data-sets covering long time-series of geophysical observations in order to characterise long-term change.
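A per-decade SST trend like those quoted above is an ordinary least-squares slope scaled by ten years. A sketch on a synthetic series built with a known trend; the numbers are illustrative, not (A)ATSR/AVHRR data:

```python
def trend_per_decade(years, sst):
    """OLS slope of SST on year, scaled to degrees C per decade."""
    n = len(years)
    my, ms = sum(years) / n, sum(sst) / n
    num = sum((y - my) * (s - ms) for y, s in zip(years, sst))
    den = sum((y - my) ** 2 for y in years)
    return 10.0 * num / den

# 15-year synthetic series constructed with a 0.15 °C/decade warming trend
years = list(range(1991, 2006))
sst = [15.0 + 0.015 * (y - 1991) for y in years]
print(round(trend_per_decade(years, sst), 3))  # → 0.15
```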

  1. Cognitive Performance Scores for the Pediatric Automated Neuropsychological Assessment Metrics in Childhood-Onset Systemic Lupus Erythematosus.

    PubMed

    Vega-Fernandez, Patricia; Vanderburgh White, Shana; Zelko, Frank; Ruth, Natasha M; Levy, Deborah M; Muscal, Eyal; Klein-Gitelman, Marisa S; Huber, Adam M; Tucker, Lori B; Roebuck-Spencer, Tresa; Ying, Jun; Brunner, Hermine I

    2015-08-01

    To develop and initially validate a global cognitive performance score (CPS) for the Pediatric Automated Neuropsychological Assessment Metrics (PedANAM) to serve as a screening tool of cognition in childhood lupus. Patients (n = 166) completed the 9 subtests of the PedANAM battery, each of which provides 3 principal performance parameters (accuracy, mean reaction time for correct responses, and throughput). Cognitive ability was measured by formal neurocognitive testing or estimated by the Pediatric Perceived Cognitive Function Questionnaire-43 to determine the presence or absence of neurocognitive dysfunction (NCD). A subset of the data was used to develop 4 candidate PedANAM-CPS indices with supervised or unsupervised statistical approaches: PedANAM-CPSUWA, i.e., unweighted averages of the accuracy scores of all PedANAM subtests; PedANAM-CPSPCA, i.e., accuracy scores of all PedANAM subtests weighted through principal components analysis; PedANAM-CPSlogit, i.e., algorithm derived from logistic models to estimate NCD status based on the accuracy scores of all of the PedANAM subtests; and PedANAM-CPSmultiscore, i.e., algorithm derived from logistic models to estimate NCD status based on select PedANAM performance parameters. PedANAM-CPS candidates were validated using the remaining data. PedANAM-CPS indices were moderately correlated with each other (|r| > 0.65). All of the PedANAM-CPS indices discriminated children by NCD status across data sets (P < 0.036). The PedANAM-CPSmultiscore had the highest area under the receiver operating characteristic curve (AUC) across all data sets for identifying NCD status (AUC > 0.74), followed by the PedANAM-CPSlogit, the PedANAM-CPSPCA, and the PedANAM-CPSUWA, respectively. Based on preliminary validation and considering ease of use, the PedANAM-CPSmultiscore and the PedANAM-CPSPCA appear to be best suited as global measures of PedANAM performance. © 2015, American College of Rheumatology.

  2. Climate pattern-scaling set for an ensemble of 22 GCMs - adding uncertainty to the IMOGEN version 2.0 impact system

    NASA Astrophysics Data System (ADS)

    Zelazowski, Przemyslaw; Huntingford, Chris; Mercado, Lina M.; Schaller, Nathalie

    2018-02-01

    Global circulation models (GCMs) are the best tool to understand climate change, as they attempt to represent all the important Earth system processes, including anthropogenic perturbation through fossil fuel burning. However, GCMs are computationally very expensive, which limits the number of simulations that can be made. Pattern scaling is an emulation technique that takes advantage of the fact that local and seasonal changes in surface climate are often approximately linear in the rate of warming over land and across the globe. This allows interpolation away from a limited number of available GCM simulations, to assess alternative future emissions scenarios. In this paper, we present a climate pattern-scaling set consisting of spatial climate change patterns along with parameters for an energy-balance model that calculates the amount of global warming. The set, available for download, is derived from 22 GCMs of the WCRP CMIP3 database, setting the basis for similar eventual pattern development for the CMIP5 and forthcoming CMIP6 ensemble. Critically, it extends the use of the IMOGEN (Integrated Model Of Global Effects of climatic aNomalies) framework to enable scanning across full uncertainty in GCMs for impact studies. Across models, the presented climate patterns represent consistent global mean trends, with a maximum of 4 (out of 22) GCMs exhibiting the opposite sign to the global trend per variable (relative humidity). The described new climate regimes are generally warmer, wetter (but with less snowfall), cloudier and windier, and have decreased relative humidity. Overall, when averaging individual performance across all variables, and without considering co-variance, the patterns explain one-third of regional change in decadal averages (mean percentage variance explained, PVE, 34.25 ± 5.21), but the signal in some models exhibits much more linearity (e.g. MIROC3.2(hires): 41.53) than in others (GISS_ER: 22.67). 
The two most often considered variables, near-surface temperature and precipitation, have a PVE of 85.44 ± 4.37 and 14.98 ± 4.61, respectively. We also provide an example assessment of a terrestrial impact (changes in mean runoff) and compare projections by the IMOGEN system, which has one land surface model, against direct GCM outputs, which all have alternative representations of land functioning. The latter is noted as an additional source of uncertainty. Finally, current and potential future applications of the IMOGEN version 2.0 modelling system in the areas of ecosystem modelling and climate change impact assessment are presented and discussed.
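
    The linearity assumption at the heart of pattern scaling can be sketched in a few lines; the grid size, per-kelvin pattern, and warming trajectory below are hypothetical stand-ins for a GCM-derived pattern and the output of IMOGEN's energy-balance component.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical pattern on a coarse 4x5 grid: local temperature change (K)
# per kelvin of global mean warming, as would be derived from one GCM.
pattern = rng.uniform(0.5, 2.0, size=(4, 5))

# Global mean warming (K) at three future dates, as would come from an
# energy-balance emulator for a chosen emissions scenario.
dT_global = np.array([0.8, 1.6, 3.2])

# Pattern scaling: local change is assumed linear in global warming.
local_change = dT_global[:, None, None] * pattern[None, :, :]
print(local_change.shape)  # (3, 4, 5): one map per date
```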

  3. Cognitive and Psychomotor Entrustable Professional Activities: Can Simulators Help Assess Competency in Trainees?

    PubMed

    Dwyer, Tim; Wadey, Veronica; Archibald, Douglas; Kraemer, William; Shantz, Jesse Slade; Townley, John; Ogilvie-Harris, Darrell; Petrera, Massimo; Ferguson, Peter; Nousiainen, Markku

    2016-04-01

    An entrustable professional activity describes a professional task that postgraduate residents must master during their training. The use of simulation to assess performance of entrustable professional activities requires further investigation. (1) Is simulation-based assessment of resident performance of entrustable professional activities reliable? (2) Is there evidence of important differences between Postgraduate Year (PGY)-1 and PGY-4 residents when performing simulated entrustable professional activities? Three entrustable professional activities were chosen from a list of competencies: management of the patient for total knee arthroplasty (TKA); management of the patient with an intertrochanteric hip fracture; and management of the patient with an ankle fracture. Each assessment of entrustable professional activity was 40 minutes long with three components: preoperative management of a patient (history-taking, examination, image interpretation); performance of a technical procedure on a sawbones model; and postoperative management of a patient (postoperative orders, management of complications). Residents were assessed by six faculty members who used checklists based on a modified Delphi technique, an overall global rating scale as well as a previously validated global rating scale for the technical procedure component of each activity. Nine PGY-1 and nine PGY-4 residents participated in our simulated assessment. We assessed reliability by calculating the internal consistency of the mean global rating for each activity as well as the interrater reliability between the faculty assessment and blinded review of videotaped encounters. We sought evidence of a difference in performance between PGY-1 and PGY-4 residents on the overall global rating scale for each station of each entrustable professional activity. 
The reliability (Cronbach's α) for the hip fracture activity was 0.88, it was 0.89 for the ankle fracture activity, and it was 0.84 for the TKA activity. A strong correlation was seen between blinded observer video review and faculty scores (mean 0.87 [0.07], p < 0.001). For the hip fracture entrustable professional activity, the PGY-4 group had a higher mean global rating scale than the PGY-1 group for preoperative management (3.56 [0.5] versus 2.33 [0.5], p < 0.001), postoperative management (3.67 [0.5] versus 2.22 [0.7], p < 0.001), and technical procedures (3.11 [0.3] versus 3.67 [0.5], p = 0.015). For the TKA activity, the PGY-4 group scored higher for postoperative management (3.5 [0.8] versus 2.67 [0.5], p = 0.016) and technical procedures (3.22 [0.9] versus 2.22 [0.9], p = 0.04) than the PGY-1 group, but no difference for preoperative management with the numbers available (PGY-4, 3.44 [0.7] versus PGY-1 2.89 [0.8], p = 0.14). For the ankle fracture activity, the PGY-4 group scored higher for postoperative management (3.22 [0.8] versus 2.33 [0.7], p = 0.18) and technical procedures (3.22 [1.2] versus 2.0 [0.7], p = 0.018) than the PGY-1 groups, but no difference for preoperative management with the numbers available (PGY-4, 3.22 [0.8] versus PGY-1, 2.78 [0.7], p = 0.23). The results of our study show that simulated assessment of entrustable professional activities may be used to determine the ability of a resident to perform professional tasks that are critical components of medical training. In this manner, educators can ensure that competent performance of these skills in the simulated setting occurs before actual practice with patients in the clinical setting.

  4. A soft damping function for dispersion corrections with less overfitting

    NASA Astrophysics Data System (ADS)

    Ucak, Umit V.; Ji, Hyunjun; Singh, Yashpal; Jung, Yousung

    2016-11-01

    The use of damping functions in empirical dispersion correction schemes is common and widespread. These damping functions contain scaling and damping parameters, and they are usually optimized for the best performance in practical systems. In this study, it is shown that the overfitting problem can be present in current damping functions, which can sometimes yield erroneous results for real applications beyond the scope of the training sets. To this end, we present a damping function called linear soft damping (lsd) that suffers less from this overfitting. This linear damping function damps the asymptotic curve more softly than existing damping functions, attempting to minimize the usual overcorrection. The performance of the proposed damping function was tested with benchmark sets for thermochemistry, reaction energies, and intramolecular interactions, as well as intermolecular interactions including nonequilibrium geometries. For noncovalent interactions, all three damping schemes considered in this study (lsd, lg, and BJ) roughly perform comparably (approximately within 1 kcal/mol), but for atomization energies, lsd clearly exhibits a better performance (up to 2-6 kcal/mol) compared to other schemes due to an overfitting in lg and BJ. The number of unphysical parameters resulting from global optimization also supports the overfitting symptoms shown in the latter numerical tests.
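
    As an illustration of how scaling and damping parameters enter a dispersion correction, the sketch below evaluates a Becke-Johnson (rational) damped pairwise -C6/R^6 term; the C6, R0, a1, and a2 values are arbitrary placeholders, and this is not the lsd function proposed in the paper.

```python
import numpy as np

def e_disp_bj(r, c6, r0, s6=1.0, a1=0.4, a2=4.8):
    """Pairwise dispersion energy with Becke-Johnson (rational) damping.

    r and r0 in bohr, c6 in atomic units; a1 and a2 are the kind of
    fitted damping parameters whose overfitting the abstract discusses.
    """
    return -s6 * c6 / (r**6 + (a1 * r0 + a2)**6)

r = np.linspace(0.5, 20.0, 200)
e = e_disp_bj(r, c6=40.0, r0=3.0)
# The damping keeps the correction finite at short range while
# recovering the -C6/R^6 asymptote at long range.
```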

  5. Implementing Global Fund programs: a survey of opinions and experiences of the Principal Recipients across 69 countries.

    PubMed

    Wafula, Francis; Marwa, Charles; McCoy, David

    2014-03-24

    Principal Recipients (PRs) receive money from the Global Fund to Fight AIDS, Tuberculosis and Malaria (Global Fund) to manage and implement programs. However, little research has gone into understanding their opinions and experiences. This survey set out to describe these, thereby providing a baseline against which changes in PR opinions and experiences can be assessed as the recently introduced new funding model is rolled out. An internet-based questionnaire was administered to 315 PRs. A total of 115 responded from 69 countries in Africa, Asia, Eastern Europe and Latin America. The study was conducted between September and December 2012. Three quarters of PRs thought the progress update and disbursement request (PU/DR) system was a useful method of reporting grant progress. However, most felt that the grant negotiation processes were complicated, and that the grant rating system did not reflect performance. While nearly all PRs were happy with the work being done by sub-Recipients (92%) and Fund Portfolio Managers (86%), fewer were happy with the Office of the Inspector General (OIG). Non-government PRs were generally less happy with the OIG's work compared to government PRs. Most PRs thought the Global Fund's Voluntary Pooled Procurement system made procurement easier. However, only 29% said the system should be made compulsory. When asked which aspects of the Global Fund's operations needed improvement, most PRs said that the Fund should re-define and clarify the roles of different actors, minimize staff turnover at its Secretariat, and shorten the grant application and approval processes. All these are currently being addressed, either directly or indirectly, under a new funding model. Vigorous assessments should nonetheless follow the roll-out of the new model to ensure the areas that are most likely to affect PR performance realize sustained improvement. 
Opinions and experiences with the Global Fund were varied, with PRs having good communication with Fund Portfolio Managers and sub-Recipients, but being unhappy with the grant negotiation and grant rating systems. Recommendations included simplifying grant processes, finding performance assessment methods that look beyond numbers, and employing Local Fund Agents who understand public health aspects of programs.

  6. New version of 1 km global river flood hazard maps for the next generation of Aqueduct Global Flood Analyzer

    NASA Astrophysics Data System (ADS)

    Sutanudjaja, Edwin; van Beek, Rens; Winsemius, Hessel; Ward, Philip; Bierkens, Marc

    2017-04-01

    The Aqueduct Global Flood Analyzer, launched in 2015, is an open-access and free-of-charge web-based interactive platform which assesses and visualises current and future projections of river flood impacts across the globe. One of the key components in the Analyzer is a set of river flood inundation hazard maps derived from the global hydrological model simulation of PCR-GLOBWB. For the current version of the Analyzer, accessible on http://floods.wri.org/#/, the early generation of PCR-GLOBWB 1.0 was used and simulated at 30 arc-minute (~50 km at the equator) resolution. In this presentation, we will show the new version of these hazard maps. This new version is based on the latest version of PCR-GLOBWB 2.0 (https://github.com/UU-Hydro/PCR-GLOBWB_model, Sutanudjaja et al., 2016, doi:10.5281/zenodo.60764) simulated at 5 arc-minute (~10 km at the equator) resolution. The model simulates daily hydrological and water resource fluxes and storages, including the simulation of overbank volume that ends up on the floodplain (if flooding occurs). The simulation was performed for the present day situation (from 1960) and future climate projections (until 2099) using the climate forcing created in the ISI-MIP project. From the simulated flood inundation volume time series, we then extract annual maxima for each cell, and fit these maxima to a Gumbel extreme value distribution. This allows us to derive flood volume maps of any hazard magnitude (ranging from 2-year to 1000-year flood events) and for any time period (e.g. 1960-1999, 2010-2049, 2030-2069, and 2060-2099). The derived flood volumes (at 5 arc-minute resolution) are then spread over the high resolution terrain model using an updated GLOFRIS downscaling module (Winsemius et al., 2013, doi:10.5194/hess-17-1871-2013). 
The updated version spreads flood volumes sequentially from upstream to downstream basins, hence enabling a better inclusion of smaller streams, and takes into account spreading of water over diverging deltaic regions. This results in a set of high resolution hazard maps of flood inundation depth at 30 arc-second (~1 km at the equator) resolution. Together with many other updates and new features, the resulting flood hazard maps will be used in the next generation of the Aqueduct Global Flood Analyzer.
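
    The annual-maxima workflow described above can be sketched as follows; the series length, parameter values, and the simple method-of-moments estimator are illustrative assumptions (the project's actual fitting procedure may differ).

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical annual-maximum flood volumes for one grid cell (40 years).
annual_max = rng.gumbel(loc=100.0, scale=20.0, size=40)

# Method-of-moments Gumbel fit (0.5772 is the Euler-Mascheroni constant).
beta = annual_max.std(ddof=1) * np.sqrt(6.0) / np.pi
mu = annual_max.mean() - 0.5772 * beta

def return_level(T):
    """Flood volume exceeded on average once every T years."""
    return mu - beta * np.log(-np.log(1.0 - 1.0 / T))

# Hazard magnitudes from 2-year to 1000-year events, as in the text.
levels = {T: return_level(T) for T in (2, 10, 100, 1000)}
```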

  7. Causal involvement of visual area MT in global feature-based enhancement but not contingent attentional capture.

    PubMed

    Painter, David R; Dux, Paul E; Mattingley, Jason B

    2015-09-01

    When visual attention is set for a particular target feature, such as color or shape, neural responses to that feature are enhanced across the visual field. This global feature-based enhancement is hypothesized to underlie the contingent attentional capture effect, in which task-irrelevant items with the target feature capture spatial attention. In humans, however, different cortical regions have been implicated in global feature-based enhancement and contingent capture. Here, we applied intermittent theta-burst stimulation (iTBS) to assess the causal roles of two regions of extrastriate cortex - right area MT and the right temporoparietal junction (TPJ) - in both global feature-based enhancement and contingent capture. We recorded cortical activity using EEG while participants monitored centrally for targets defined by color and ignored peripheral checkerboards that matched the distractor or target color. In central vision, targets were preceded by colored cues designed to capture attention. Stimuli flickered at unique frequencies, evoking distinct cortical oscillations. Analyses of these oscillations and behavioral performance revealed contingent capture in central vision and global feature-based enhancement in the periphery. Stimulation of right area MT selectively increased global feature-based enhancement, but did not influence contingent attentional capture. By contrast, stimulation of the right TPJ left both processes unaffected. Our results reveal a causal role for the right area MT in feature-based attention, and suggest that global feature-based enhancement does not underlie the contingent capture effect. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. "Functional" Inspiratory and Core Muscle Training Enhances Running Performance and Economy.

    PubMed

    Tong, Tomas K; McConnell, Alison K; Lin, Hua; Nie, Jinlei; Zhang, Haifeng; Wang, Jiayuan

    2016-10-01

    Tong, TK, McConnell, AK, Lin, H, Nie, J, Zhang, H, and Wang, J. "Functional" inspiratory and core muscle training enhances running performance and economy. J Strength Cond Res 30(10): 2942-2951, 2016-We compared the effects of two 6-week high-intensity interval training interventions. Under the control condition (CON), only interval training was undertaken, whereas under the intervention condition (ICT), interval training sessions were followed immediately by core training, which was combined with simultaneous inspiratory muscle training (IMT)-"functional" IMT. Sixteen recreational runners were allocated to either ICT or CON groups. Before the intervention phase, both groups undertook a 4-week program of "foundation" IMT to control for the known ergogenic effect of IMT (30 inspiratory efforts at 50% maximal static inspiratory pressure [P0] per set, 2 sets per day, 6 days per week). The subsequent 6-week interval running training phase consisted of 3-4 sessions per week. In addition, the ICT group undertook 4 inspiratory-loaded core exercises (10 repetitions per set, 2 sets per day, inspiratory load set at 50% post-IMT P0) immediately after each interval training session. The CON group received neither core training nor functional IMT. After the intervention phase, global inspiratory and core muscle functions increased in both groups (p ≤ 0.05), as evidenced by P0 and a sport-specific endurance plank test (SEPT) performance, respectively. Compared with CON, the ICT group showed larger improvements in SEPT, running economy at the speed of the onset of blood lactate accumulation, and 1-hour running performance (3.04% vs. 1.57%, p ≤ 0.05). The changes in these variables were interindividually correlated (r ≥ 0.57, n = 16, p ≤ 0.05). 
Such findings suggest that the addition of inspiratory-loaded core conditioning into a high-intensity interval training program augments the influence of the interval program on endurance running performance and that this may be underpinned by an improvement in running economy.

  9. A Review of Global Precipitation Data Sets: Data Sources, Estimation, and Intercomparisons

    NASA Astrophysics Data System (ADS)

    Sun, Qiaohong; Miao, Chiyuan; Duan, Qingyun; Ashouri, Hamed; Sorooshian, Soroosh; Hsu, Kuo-Lin

    2018-03-01

    In this paper, we present a comprehensive review of the data sources and estimation methods of 30 currently available global precipitation data sets, including gauge-based, satellite-related, and reanalysis data sets. We analyzed the discrepancies between the data sets from daily to annual timescales and found large differences in both the magnitude and the variability of precipitation estimates. The magnitude of annual precipitation estimates over global land deviated by as much as 300 mm/yr among the products. Reanalysis data sets had a larger degree of variability than the other types of data sets. The degree of variability in precipitation estimates also varied by region. Large differences in annual and seasonal estimates were found in tropical oceans, complex mountain areas, northern Africa, and some high-latitude regions. Overall, the variability associated with extreme precipitation estimates was slightly greater at lower latitudes than at higher latitudes. The reliability of precipitation data sets is mainly limited by the number and spatial coverage of surface stations, the satellite algorithms, and the data assimilation models. The inconsistencies described limit the capability of the products for climate monitoring, attribution, and model validation.

  10. The 1 km AVHRR global land data set: first stages in implementation

    USGS Publications Warehouse

    Eidenshink, J.C.; Faundeen, J.L.

    1994-01-01

    The global land 1 km data set project represents an international effort to acquire, archive, process, and distribute 1 km AVHRR data of the entire global land surface in order to meet the needs of the international science community. A network of 26 high resolution picture transmission (HRPT) stations, along with data recorded by the National Oceanic and Atmospheric Administration (NOAA), has been acquiring daily global land coverage since 1 April 1992. A data set of over 30000 AVHRR images has been archived and made available for distribution by the United States Geological Survey, EROS Data Center and the European Space Agency. Under the guidance of the International Geosphere Biosphere programme, processing standards for the AVHRR data have been developed for calibration, atmospheric correction, geometric registration, and the production of global 10-day maximum normalized difference vegetation index (NDVI) composites. The major uses of the composites are related to the study of surface vegetation cover. A prototype 10-day composite was produced for the period of 21–30 June 1992. Production of an 18-month time series of 10-day composites is underway.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pelletier, Jon D.; Broxton, Patrick D.; Hazenberg, Pieter

    Earth’s terrestrial near-subsurface environment can be divided into relatively porous layers of soil, intact regolith, and sedimentary deposits above unweathered bedrock. Variations in the thicknesses of these layers control the hydrologic and biogeochemical responses of landscapes. Currently, Earth System Models approximate the thickness of these relatively permeable layers above bedrock as uniform globally, despite the fact that their thicknesses vary systematically with topography, climate, and geology. To meet the need for more realistic input data for models, we developed a high-resolution gridded global data set of the average thicknesses of soil, intact regolith, and sedimentary deposits within each 30 arcsec (~1 km) pixel using the best available data for topography, climate, and geology as input. Our data set partitions the global land surface into upland hillslope, upland valley bottom, and lowland landscape components and uses models optimized for each landform type to estimate the thicknesses of each subsurface layer. On hillslopes, the data set is calibrated and validated using independent data sets of measured soil thicknesses from the U.S. and Europe and on lowlands using depth to bedrock observations from groundwater wells in the U.S. As a result, we anticipate that the data set will prove useful as an input to regional and global hydrological and ecosystems models.

  12. Steady-state global optimization of metabolic non-linear dynamic models through recasting into power-law canonical models

    PubMed Central

    2011-01-01

    Background Design of newly engineered microbial strains for biotechnological purposes would greatly benefit from the development of realistic mathematical models for the processes to be optimized. Such models can then be analyzed and, with the development and application of appropriate optimization techniques, one could identify the modifications that need to be made to the organism in order to achieve the desired biotechnological goal. As appropriate models to perform such an analysis are necessarily non-linear and typically non-convex, finding their global optimum is a challenging task. Canonical modeling techniques, such as Generalized Mass Action (GMA) models based on the power-law formalism, offer a possible solution to this problem because they have a mathematical structure that enables the development of specific algorithms for global optimization. Results Based on the GMA canonical representation, we have developed in previous works a highly efficient optimization algorithm and a set of related strategies for understanding the evolution of adaptive responses in cellular metabolism. Here, we explore the possibility of recasting kinetic non-linear models into an equivalent GMA model, so that global optimization on the recast GMA model can be performed. With this technique, optimization is greatly facilitated and the results are transposable to the original non-linear problem. This procedure is straightforward for a particular class of non-linear models known as Saturable and Cooperative (SC) models that extend the power-law formalism to deal with saturation and cooperativity. Conclusions Our results show that recasting non-linear kinetic models into GMA models is indeed an appropriate strategy that helps overcoming some of the numerical difficulties that arise during the global optimization task. PMID:21867520
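
    A minimal numerical example of the recasting idea: a saturable Michaelis-Menten-type rate v = V*S/(K+S) becomes a product of power laws after introducing the auxiliary variable z = K+S. The rate constants, the constant inflow, and the crude Euler integrator below are illustrative assumptions, not the authors' optimization algorithm.

```python
# Recasting a saturable rate into GMA power-law form:
# v = V*S/(K+S) becomes v = V * S * z**-1 with z = K + S, so the
# rate is a product of power laws and dz/dt = dS/dt by construction.
V, K, inflow = 2.0, 0.5, 1.0

def step(S, z, dt=1e-3):
    v = V * S * z**-1.0          # power-law (GMA) form of the rate
    dS = inflow - v              # dS/dt for a constant-inflow pathway
    return S + dS * dt, z + dS * dt

S, z = 1.0, K + 1.0              # z initialised consistently as K + S
for _ in range(20000):           # integrate to (near) steady state
    S, z = step(S, z)

# The recast system reproduces the original kinetics: at steady state
# inflow = V*S/(K+S), i.e. S = inflow*K/(V - inflow) = 0.5 here.
```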

  13. QNOTE: an instrument for measuring the quality of EHR clinical notes.

    PubMed

    Burke, Harry B; Hoang, Albert; Becher, Dorothy; Fontelo, Paul; Liu, Fang; Stephens, Mark; Pangaro, Louis N; Sessums, Laura L; O'Malley, Patrick; Baxi, Nancy S; Bunt, Christopher W; Capaldi, Vincent F; Chen, Julie M; Cooper, Barbara A; Djuric, David A; Hodge, Joshua A; Kane, Shawn; Magee, Charles; Makary, Zizette R; Mallory, Renee M; Miller, Thomas; Saperstein, Adam; Servey, Jessica; Gimbel, Ronald W

    2014-01-01

    The outpatient clinical note documents the clinician's information collection, problem assessment, and patient management, yet there is currently no validated instrument to measure the quality of the electronic clinical note. This study evaluated the validity of the QNOTE instrument, which assesses 12 elements in the clinical note, for measuring the quality of clinical notes. It also compared its performance with a global instrument that assesses the clinical note as a whole. Retrospective multicenter blinded study of the clinical notes of 100 outpatients with type 2 diabetes mellitus who had been seen in clinic on at least three occasions. The 300 notes were rated by eight general internal medicine and eight family medicine practicing physicians. The QNOTE instrument scored the quality of the note as the sum of a set of 12 note element scores, and its inter-rater agreement was measured by the intraclass correlation coefficient. The Global instrument scored the note in its entirety, and its inter-rater agreement was measured by the Fleiss κ. The overall QNOTE inter-rater agreement was 0.82 (CI 0.80 to 0.84), and its note quality score was 65 (CI 64 to 66). The Global inter-rater agreement was 0.24 (CI 0.19 to 0.29), and its note quality score was 52 (CI 49 to 55). The QNOTE quality scores were consistent, and the overall QNOTE score was significantly higher than the overall Global score (p=0.04). We found the QNOTE to be a valid instrument for evaluating the quality of electronic clinical notes, and its performance was superior to that of the Global instrument. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  14. Which Skills are Associated with Residents’ Sense of Preparedness to Manage Chronic Pain?

    PubMed Central

    Fox, Aaron D.; Kunins, Hillary V.; Starrels, Joanna L.

    2013-01-01

    Objective: To identify gaps in residents’ confidence and knowledge in managing chronic non-malignant pain (CNMP) and to explore whether specific skills or pain knowledge were associated with global preparedness to manage CNMP. Design: Cross-sectional web-based survey. Setting & Participants: Internal medicine residents in Bronx, NY. Main Outcome Measures: We assessed: (1) confidence in skills within four content areas: physical examination, diagnosis, treatment, and safer opioid prescribing; (2) pain-related knowledge on a 16-item scale; and (3) global preparedness to manage CNMP (agreement with, “I feel prepared to manage CNMP”). Gaps in confidence were skills in which fewer than 50% reported confidence. Gaps in knowledge were items in which fewer than 50% answered correctly. Using logistic regression, we examined whether skills or knowledge were associated with global preparedness. Results: Of 145 residents, 92 (63%) responded. Gaps in confidence included diagnosing fibromyalgia, performing corticosteroid injections, and using pain medication agreements. Gaps in knowledge included pharmacotherapy for neuropathic pain and interpreting urine drug test results. Twenty-four residents (26%) felt globally prepared to manage CNMP. Confidence using pain medication agreements (AOR 5.99, 95% CI: 2.02, 17.75), prescribing long-acting opioids (AOR 5.85, 95% CI: 2.00, 17.18), and performing corticosteroid injection of the knee (AOR 5.76, 95% CI: 1.16, 28.60) were strongly associated with global preparedness. Conclusions: Few internal medicine residents felt prepared to manage CNMP. Our findings suggest that educational interventions to improve residents’ preparedness to manage CNMP should target complex pain syndromes (e.g., fibromyalgia and neuropathic pain), safer opioid prescribing practices, and alternatives to opioid analgesics. PMID:23247909

  15. Production of long-term global water vapor and liquid water data set using ultra-fast methods to assimilate multi-satellite and radiosonde observations

    NASA Technical Reports Server (NTRS)

    Vonderhaar, Thomas H.; Randel, David L.; Reinke, Donald L.; Stephens, Graeme L.; Ringerud, Mark A.; Combs, Cynthia L.; Greenwald, Thomas J.; Wittmeyer, Ian L.

    1994-01-01

    In recent years, climate research scientists have recognized the need for precipitable water and liquid water data sets with increased temporal and spatial resolution. This project is designed to meet those needs. Specifically, NASA is funding STC-METSAT to develop a total integrated column and layered precipitable water data set. This is complemented by a total column liquid water data set. These data are global in extent, 1 deg x 1 deg in resolution, with daily grids produced. Precipitable water is measured by a combination of in situ radiosonde observations and satellite-derived infrared and microwave retrievals from four satellites. This project combines these data into a coherent merged product for use in global climate research. This report is the Year 2 Annual Report from this NASA-sponsored project and includes progress-to-date on the assigned tasks.

  16. A proposal to extend our understanding of the global economy

    NASA Technical Reports Server (NTRS)

    Hough, Robbin R.; Ehlers, Manfred

    1991-01-01

    Satellites acquire information on a global and repetitive basis. They are thus ideal tools for use when global scale and analysis over time is required. Data from satellites comes in digital form which means that it is ideally suited for incorporation in digital data bases and that it can be evaluated using automated techniques. The development of a global multi-source data set which integrates digital information is proposed regarding some 15,000 major industrial sites worldwide with remotely sensed images of the sites. The resulting data set would provide the basis for a wide variety of studies of the global economy. The preliminary results give promise of a new class of global policy model which is far more detailed and helpful to local policy makers than its predecessors. The central thesis of this proposal is that major industrial sites can be identified and their utilization can be tracked with the aid of satellite images.

  17. Globally Averaged Atmospheric CFC-11 Concentrations: Monthly and Annual Data for the Period 1975-1992 (DB1010)

    DOE Data Explorer

    Khalil, M. A. K. [Oregon Graduate Institute of Science and Technology, Portland, Oregon (USA)]; Rasmussen, R. A. [Oregon Graduate Institute of Science and Technology, Portland, Oregon (USA)]

    1996-01-01

    This data set presents globally averaged atmospheric concentrations of chlorofluorocarbon 11, known also as CFC-11 or F-11 (chemical name: trichlorofluoromethane; formula: CCl3F). The monthly global average data are derived from flask air samples collected at eight sites in six locations over the period August 1980-July 1992. The sites are Barrow (Alaska), Cape Meares (Oregon), Cape Kumukahi and Mauna Loa (Hawaii), Cape Matatula (American Samoa), Cape Grim (Tasmania), Palmer Station, and the South Pole (Antarctica). At each collection site, monthly averages were obtained from three flask samples collected every week. In addition to the monthly global averages available for 1980-1992, this data set also contains annual global average data for 1975-1985. These annual global averages were derived from January measurements at the South Pole and in the Pacific Northwest of the United States (specifically, Washington state and the Oregon coast).

  18. Mapping a Global Agenda for Adolescent Health

    PubMed Central

    Patton, George C.; Viner, Russell M.; Linh, Le Cu; Ameratunga, Shanthi; Fatusi, Adesegun O.; Ferguson, B. Jane; Patel, Vikram

    2016-01-01

    Major changes in health are underway in many low- and middle-income countries that are likely to bring greater focus on adolescents. This commentary, based on a 2009 London meeting, considers the need for strategic information for future global initiatives in adolescent health. Current coverage of adolescent health in global data collections is patchy. There is both the need and scope to extend existing collections into the adolescent years as well as to achieve greater harmonization of measures between surveys. The development of a core set of global adolescent health indicators would aid this process. Other important tasks include adapting and testing interventions in low- and middle-income countries, growing research capacity in those settings, better communication of research from those countries, and building structures to implement future global initiatives. A global agenda needs more than good data, but sound information about adolescent health and its social and environmental determinants will be important in both advocacy and practice. PMID:20970076

  19. Comprehensive evaluation of long-term hydrological data sets: Constraints of the Budyko framework

    NASA Astrophysics Data System (ADS)

    Greve, Peter; Orlowsky, Boris; Seneviratne, Sonia I.

    2013-04-01

    An accurate estimate of the climatological land water balance is essential for a wide range of socio-economic issues. Despite the simplicity of the underlying water balance equation, its individual variables are of complex nature. Global estimates, whether derived from observations or from models, of precipitation (P) and especially evapotranspiration (ET) are characterized by high uncertainties. This leads to inconsistent results in determining conditions related to the land water balance and its components. In this study, we consider the Budyko framework as a constraint to evaluate long-term hydrological data sets within the period from 1984 to 2005. The Budyko framework is a well-established, empirically based relationship between ET/P and Ep/P, with Ep being the potential evaporation. We use estimates of ET associated with the LandFlux-EVAL initiative (Mueller et al., 2012), either derived from observations, CMIP5 models or land-surface models (LSMs) driven with observation-based forcing or atmospheric reanalyses. Data sets of P comprise all commonly used global observation-based estimates. Ep is determined by methods of differing complexity with recent global temperature and radiation data sets. Based on this comprehensive synthesis of data sets and methods to determine Ep, more than 2000 possible combinations of ET/P in conjunction with Ep/P are created. All combinations are validated against the Budyko curve and against physical limits within the Budyko phase space. For this purpose we develop an error measure based on the root mean square error which combines both constraints. We find that uncertainties are mainly induced by the ET data sets. In particular, reanalysis and CMIP5 data sets are characterized by low realism. Furthermore, the realism of LSMs is not primarily controlled by the forcing, as different LSMs driven with the same forcing show significantly different error measures.
Our comprehensive approach is thus suitable to detect uncertainties associated with individual data sets. Furthermore, combinations performing well within the Budyko phase space are identified and could be used in future studies, such as investigating decadal changes of the land water balance. Reference: B. Mueller, M. Hirschi, C. Jimenez, P. Ciais, P. A. Dirmeyer, A. J. Dolman, J. B. Fisher, Z. Guo, M. Jung, F. Ludwig, F. Maignan, D. Miralles, M. F. McCabe, M. Reichstein, J. Sheffield, K. Wang, E. F. Wood, Y. Zhang, S. I. Seneviratne (2012): Benchmark products for land evapotranspiration: LandFlux-EVAL multi-dataset synthesis, Hydrol. Earth Syst. Sci., submitted.
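    The validation described in this abstract can be illustrated with a short sketch. The error measure below is a hypothetical stand-in for the paper's combined measure: an RMSE of the points against the Budyko curve, plus a penalty for points violating the water limit (ET/P ≤ 1) or the energy limit (ET/P ≤ Ep/P). Function names and the additive penalty weighting are assumptions, not the authors' implementation.

```python
import numpy as np

def budyko_curve(aridity):
    # Budyko curve: long-term ET/P as a function of the aridity index Ep/P
    return np.sqrt(aridity * np.tanh(1.0 / aridity) * (1.0 - np.exp(-aridity)))

def budyko_error(et_over_p, ep_over_p):
    # RMSE of the points against the curve, plus the fraction of points
    # violating the water limit (ET/P <= 1) or energy limit (ET/P <= Ep/P).
    # The additive combination is illustrative only.
    et_over_p = np.asarray(et_over_p, dtype=float)
    ep_over_p = np.asarray(ep_over_p, dtype=float)
    rmse = np.sqrt(np.mean((et_over_p - budyko_curve(ep_over_p)) ** 2))
    violation = np.mean((et_over_p > 1.0) | (et_over_p > ep_over_p))
    return rmse + violation

# toy "data set combination": three grid cells with increasing aridity
print(budyko_error([0.5, 0.7, 0.9], [0.8, 1.5, 3.0]))  # → ~0.10
```

A combination whose points fall close to the curve and inside the physical limits scores low; any physically impossible point adds a large penalty.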

  20. Round Robin evaluation of soil moisture retrieval models for the MetOp-A ASCAT Instrument

    NASA Astrophysics Data System (ADS)

    Gruber, Alexander; Paloscia, Simonetta; Santi, Emanuele; Notarnicola, Claudia; Pasolli, Luca; Smolander, Tuomo; Pulliainen, Jouni; Mittelbach, Heidi; Dorigo, Wouter; Wagner, Wolfgang

    2014-05-01

    Global soil moisture observations are crucial to understand hydrologic processes, earth-atmosphere interactions and climate variability. ESA's Climate Change Initiative (CCI) project aims to create a globally consistent long-term soil moisture data set based on the merging of the best available active and passive satellite-based microwave sensors and retrieval algorithms. Within the CCI, a Round Robin evaluation of existing retrieval algorithms for both active and passive instruments was carried out. In this study we present the comparison of five different retrieval algorithms, covering three different modelling principles, applied to active MetOp-A ASCAT L1 backscatter data. These models include statistical models (Bayesian Regression and Support Vector Regression, provided by the Institute for Applied Remote Sensing, Eurac Research, Italy, and an Artificial Neural Network, provided by the Institute of Applied Physics, CNR-IFAC, Italy), a semi-empirical model (provided by the Finnish Meteorological Institute), and a change detection model (provided by the Vienna University of Technology). The algorithms were applied to L1 backscatter data within the period 2007-2011, resampled to a 12.5 km grid. The evaluation was performed over 75 globally distributed, quality-controlled in situ stations drawn from the International Soil Moisture Network (ISMN), using surface soil moisture data from the Global Land Data Assimilation System (GLDAS) Noah land surface model as a second independent reference. The temporal correlation between the data sets was analyzed and random errors of the different algorithms were estimated using the triple collocation method. Absolute soil moisture values as well as soil moisture anomalies were considered, including both long-term anomalies from the mean seasonal cycle and short-term anomalies from a five-week moving-average window. Results show a very high agreement between all five algorithms for most stations.
A slight vegetation dependency of the errors and a spatial decorrelation of the performance patterns of the different algorithms were found. We conclude that future research should focus on understanding, combining and exploiting the advantages of all available modelling approaches rather than trying to optimize one approach to fit every possible condition.
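    The triple collocation step mentioned above can be sketched in its classic covariance-based form, which assumes three collocated estimates of the same signal with mutually independent errors. The variable names and synthetic data below are illustrative, not the study's actual products.

```python
import numpy as np

def triple_collocation_errors(x, y, z):
    # Covariance-based triple collocation: estimate the random error
    # variance of each of three collocated, independently erroneous
    # estimates of the same variable (e.g. satellite, model, in situ).
    c = np.cov(np.vstack([x, y, z]))
    err_x = c[0, 0] - c[0, 1] * c[0, 2] / c[1, 2]
    err_y = c[1, 1] - c[0, 1] * c[1, 2] / c[0, 2]
    err_z = c[2, 2] - c[0, 2] * c[1, 2] / c[0, 1]
    return err_x, err_y, err_z

# synthetic demo: one shared "true" signal plus independent noise per product
rng = np.random.default_rng(0)
truth = rng.standard_normal(10_000)
sat = truth + 0.3 * rng.standard_normal(10_000)     # true error variance 0.09
model = truth + 0.5 * rng.standard_normal(10_000)   # true error variance 0.25
insitu = truth + 0.2 * rng.standard_normal(10_000)  # true error variance 0.04
print(triple_collocation_errors(sat, model, insitu))
```

With enough samples, the three estimated error variances converge to the true noise variances without any product being treated as the error-free reference.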

  1. Global and Temporal Cortical Folding in Patients with Early-Onset Schizophrenia

    ERIC Educational Resources Information Center

    Penttila, Jani; Paillere-Martinot, Marie-Laure; Martinot, Jean-Luc; Mangin, Jean-Francois; Burke, Lisa; Corrigall, Richard; Frangou, Sophia; Cachia, Arnaud

    2008-01-01

    Disturbances in the temporal lobes and alterations in cortical folding in early-onset schizophrenia are studied using magnetic resonance T1 images of 51 patients. The study showed that patients with early-onset schizophrenia had lower global sulcal indices in both hemispheres and that the left collateral sulcus had a lower sulcal index irrespective…

  2. Development of a New Research Data Infrastructure for Collaboration in Earth Observation and Global Change Science

    NASA Astrophysics Data System (ADS)

    Wagner, Wolfgang; Briese, Christian

    2017-04-01

    With the global population having surpassed 7 billion people in 2012, the impacts of human activities on the environment have started to be noticeable almost everywhere on our planet. Yet, while pressing social problems such as mass migration may at least partly be a consequence of these impacts, many are still elusive, particularly when one tries to quantify them on larger scales. Therefore, it is essential to collect verifiable observations that allow tracing environmental changes from a local to a global scale over several decades. Complementing in situ networks, this task is increasingly fulfilled by earth observation satellites, which have been acquiring measurements of the land, atmosphere and oceans since the beginning of the 1970s. While many multi-decadal data sets are already available, the major limitation hindering their effective exploitation in global change studies is the lack of dedicated data centres offering the high-performance processing capabilities needed to process multi-year global data sets at a fine spatial resolution (Wagner, 2015). Essentially the only platform which currently offers these capabilities is Google's Earth Engine. From a scientific perspective there is undoubtedly a high need to build up independent, science-driven platforms that are transparent for their users and offer a higher diversity and flexibility in terms of the data sets and algorithms used. Recognizing this need, TU Wien founded the EODC Earth Observation Data Centre for Water Resources Monitoring together with other Austrian partners in May 2014 as a public-private partnership (Wagner et al. 2014). Thanks to its integrative governance approach, EODC has succeeded in quickly developing an international cooperation consisting of scientific institutions, public organisations and several private partners.
Making best use of their existing infrastructures, the EODC partners have already created the first elements of a federated IT infrastructure capable of storing and processing Petabytes of satellite data. One central site of this infrastructure is the Science Centre Arsenal in Vienna, where a cloud platform and storage system were set up and connected to the Vienna Scientific Cluster (VSC). To provide functionality, this facility connects several hardware components, including a Petabyte-scale frontend storage for making data available for scientific analysis and high-performance computing on the VSC, and robotic tape libraries for mirroring and archiving tens of Petabytes of data. In this contribution, the EODC approach to building a federated IT infrastructure and the collaborative data storage and analysis capabilities are presented. REFERENCES Wagner, W. (2015) Big Data infrastructures for processing Sentinel data, in Photogrammetric Week 2015, Dieter Fritsch (Ed.), Wichmann/VDE, Berlin Offenbach, 93-104. Wagner, W., J. Fröhlich, G. Wotawa, R. Stowasser, M. Staudinger, C. Hoffmann, A. Walli, C. Federspiel, M. Aspetsberger, C. Atzberger, C. Briese, C. Notarnicola, M. Zebisch, A. Boresch, M. Enenkel, R. Kidd, A. von Beringe, S. Hasenauer, V. Naeimi, W. Mücke (2014) Addressing grand challenges in earth observation science: The Earth Observation Data Centre for Water Resources Monitoring, ISPRS Commission VII Symposium, Istanbul, Turkey, 29 September-2 October 2014, ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences (ISPRS Annals), Volume II-7, 81-88.

  3. Point-of-Care Diagnostics for Improving Maternal Health in South Africa

    PubMed Central

    Mashamba-Thompson, Tivani P.; Sartorius, Benn; Drain, Paul K.

    2016-01-01

    Improving maternal health is a global priority, particularly in high HIV-endemic, resource-limited settings. Failure to use health care facilities due to poor access is one of the main causes of maternal deaths in South Africa. “Point-of-care” (POC) diagnostics are an innovative healthcare approach to improve healthcare access and health outcomes in remote and resource-limited settings. In this review, POC testing is defined as a diagnostic test that is carried out near patients and leads to rapid clinical decisions. We review the current and emerging POC diagnostics for maternal health, with a specific focus on the World Health Organization (WHO) quality-ASSURED (Affordability, Sensitivity, Specificity, User friendly, Rapid and robust, Equipment free and Delivered) criteria for an ideal point-of-care test in resource-limited settings. The performance of POC diagnostics, barriers and challenges related to implementing POC diagnostics for maternal health in rural and resource-limited settings are reviewed. Innovative strategies for overcoming these barriers are recommended to achieve substantial progress on improving maternal health outcomes in these settings. PMID:27589808

  4. Global Improvement in Genotyping of Human Papillomavirus DNA: the 2011 HPV LabNet International Proficiency Study

    PubMed Central

    Eklund, Carina; Forslund, Ola; Wallin, Keng-Ling

    2014-01-01

    Accurate and internationally comparable human papillomavirus (HPV) DNA genotyping is essential for HPV vaccine research and for HPV surveillance. The HPV Laboratory Network (LabNet) has designed international proficiency studies that can be issued regularly and in a reproducible manner. The 2011 HPV genotyping proficiency panel contained 43 coded samples composed of purified plasmids of 16 HPV types (HPV6, -11, -16, -18, -31, -33, -35, -39, -45, -51, -52, -56, -58, -59, -66, -68a, and -68b) and 3 extraction controls. Tests that detected 50 IU of HPV16 and HPV18 and 500 genome equivalents for the other 14 HPV types in both single and multiple infections were considered proficient. Ninety-six laboratories worldwide submitted 134 data sets. Twenty-five different HPV genotyping assay methods were used, including the Linear Array, line blot/INNO-LiPA, PapilloCheck, and PCR Luminex assays. The major oncogenic HPV types, HPV16 and HPV18, were proficiently detected in 97.0% (113/116) and 87.0% (103/118) of the data sets, respectively. In 2011, 51 data sets (39%) were 100% proficient for the detection of at least one HPV type, and 37 data sets (28%) were proficient for all 16 HPV types; this was an improvement over the panel results from the 2008 and 2010 studies, when <25 data sets (23% and 19% for 2008 and 2010, respectively) were fully proficient. The improvement was also evident for the 54 laboratories that had also participated in the previous proficiency studies. In conclusion, a continuing global proficiency program has documented worldwide improvement in the comparability and reliability of HPV genotyping assay performances. PMID:24478473

  5. Global Landscape of a Co-Expressed Gene Network in Barley and its Application to Gene Discovery in Triticeae Crops

    PubMed Central

    Mochida, Keiichi; Uehara-Yamaguchi, Yukiko; Yoshida, Takuhiro; Sakurai, Tetsuya; Shinozaki, Kazuo

    2011-01-01

    Accumulated transcriptome data can be used to investigate regulatory networks of genes involved in various biological systems. Co-expression analysis data sets generated from comprehensively collected transcriptome data sets now represent efficient resources that are capable of facilitating the discovery of genes with closely correlated expression patterns. In order to construct a co-expression network for barley, we analyzed 45 publicly available experimental series, which are composed of 1,347 sets of GeneChip data for barley. On the basis of a gene-to-gene weighted correlation coefficient, we constructed a global barley co-expression network and classified it into clusters of subnetwork modules. The resulting clusters are candidates for functional regulatory modules in the barley transcriptome. To annotate each of the modules, we performed comparative annotation using genes in Arabidopsis and Brachypodium distachyon. On the basis of a comparative analysis between barley and two model species, we investigated functional properties from the representative distributions of the gene ontology (GO) terms. Modules putatively involved in drought stress response and cellulose biogenesis have been identified. These modules are discussed to demonstrate the effectiveness of the co-expression analysis. Furthermore, we applied the data set of co-expressed genes coupled with comparative analysis in attempts to discover potentially Triticeae-specific network modules. These results demonstrate that analysis of the co-expression network of the barley transcriptome together with comparative analysis should promote the process of gene discovery in barley. Furthermore, the insights obtained should be transferable to investigations of Triticeae plants. The associated data set generated in this analysis is publicly accessible at http://coexpression.psc.riken.jp/barley/. PMID:21441235
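    As a minimal sketch of the network-construction idea in this abstract, one can threshold a gene-by-gene correlation matrix to obtain an adjacency matrix. The barley study uses a gene-to-gene weighted correlation coefficient and subsequent module clustering, so the plain Pearson correlation, the threshold value, and the toy data below are purely illustrative.

```python
import numpy as np

def coexpression_network(expr, threshold=0.9):
    # expr: genes x samples expression matrix.  Genes are nodes; an edge
    # links two genes whose expression profiles correlate (in absolute
    # value) above the threshold.
    adj = np.abs(np.corrcoef(expr)) >= threshold
    np.fill_diagonal(adj, False)  # no self-edges
    return adj

# toy matrix: 4 genes x 6 samples; genes 0 and 1 co-vary perfectly
expr = np.array([[1, 2, 3, 4, 5, 6],
                 [2, 4, 6, 8, 10, 12],
                 [6, 5, 4, 3, 2, 1],
                 [1, 3, 2, 4, 3, 5]], dtype=float)
print(coexpression_network(expr)[0, 1])  # → True
```

Connected components (or denser clusters) of such an adjacency matrix are the candidate subnetwork modules described above.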

  6. 3D multimodal MRI brain glioma tumor and edema segmentation: a graph cut distribution matching approach.

    PubMed

    Njeh, Ines; Sallemi, Lamia; Ayed, Ismail Ben; Chtourou, Khalil; Lehericy, Stephane; Galanaud, Damien; Hamida, Ahmed Ben

    2015-03-01

    This study investigates a fast distribution-matching, data-driven algorithm for 3D multimodal MRI brain glioma tumor and edema segmentation in different modalities. We learn non-parametric model distributions which characterize the normal regions in the current data. Then, we state our segmentation problems as the optimization of several cost functions of the same form, each containing two terms: (i) a distribution matching prior, which evaluates a global similarity between distributions, and (ii) a smoothness prior to avoid the occurrence of small, isolated regions in the solution. Obtained following recent bound-relaxation results, the optima of the cost functions yield the complement of the tumor region or edema region in nearly real time. Based on global rather than pixel-wise information, the proposed algorithm does not require external learning from a large, manually segmented training set, as is the case for existing methods. Therefore, the ensuing results are independent of the choice of a training set. Quantitative evaluations over the publicly available training and testing data sets from the MICCAI multimodal brain tumor segmentation challenge (BraTS 2012) demonstrated that our algorithm yields highly competitive performance for complete edema and tumor segmentation among nine existing competing methods, with a short execution time (less than 0.5 s per image). Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Galactic cosmic-ray model in the light of AMS-02 nuclei data

    NASA Astrophysics Data System (ADS)

    Niu, Jia-Shu; Li, Tianjun

    2018-01-01

    Cosmic ray (CR) physics has entered a precision-driven era. With the latest AMS-02 nuclei data (boron-to-carbon ratio, proton flux, helium flux, and antiproton-to-proton ratio), we perform a global fitting and constrain the primary source and propagation parameters of cosmic rays in the Milky Way, considering 3 schemes with different data sets (with and without p̄/p data) and different propagation models (diffusion-reacceleration and diffusion-reacceleration-convection models). We find that the data set with p̄/p data can effectively remove the degeneracy between the propagation parameters, and that it favors a model with a very small value of convection (or disfavors the model with convection). Separate injection spectrum parameters are used for protons and the other nucleus species, which reveal the different breaks and slopes among them. Moreover, the helium abundance, antiproton production cross sections, and solar modulation are parametrized in our global fitting. Owing to the self-consistency of the new data set, the fitting results show a little bias, and thus the disadvantages and limitations of the existing propagation models become apparent. Comparing the best-fit results for the local interstellar spectra (ϕ = 0) with the VOYAGER-1 data, we find that the primary sources or propagation mechanisms should be different between protons and helium (or other heavier nucleus species). Thus, how to explain these results properly is an interesting and challenging question.

  8. Comparison of the New Leaf Area Index (LAI 3g) with the Kazakhstan-Wide Leaf Area Index Data Set (GGRS-LAI) over Central Asia

    NASA Astrophysics Data System (ADS)

    Kappas, M.; Propastin, P.; Degener, J.; Renchin, T.

    2014-12-01

    Long-term global data sets of Leaf Area Index (LAI) are important for monitoring global vegetation dynamics. LAI, indicating the phenological development of vegetation, is an important state variable for modeling land surface processes. The comparison of long-term data sets is based on two recently available data sets, both derived from AVHRR time series. The LAI 3g data set introduced by Zaichun Zhu et al. (2013) is developed from the new, improved third-generation Global Inventory Modeling and Mapping Studies (GIMMS) Normalized Difference Vegetation Index (NDVI3g) and best-quality MODIS LAI data. The second long-term data set is based on the 8 km spatial resolution GIMMS-AVHRR data (GGRS data set by Propastin et al. 2012). The GGRS-LAI product uses a three-dimensional physical radiative transfer model which establishes a relationship between LAI, vegetation fractional cover and given patterns of surface reflectance, view-illumination conditions and optical properties of vegetation. The model incorporates a number of site/region-specific parameters, including vegetation architecture variables such as leaf angle distribution, clumping index, and light extinction coefficient. For the application of the model to Kazakhstan, the vegetation architecture variables were computed at the local (pixel) level based on extensive field surveys of the biophysical properties of vegetation in representative grassland areas of Kazakhstan. The comparison of both long-term data sets will be used to interpret their quality for scientific research in other disciplines. References: Propastin, P., Kappas, M. (2012). Retrieval of coarse-resolution leaf area index over the Republic of Kazakhstan using NOAA AVHRR satellite data and ground measurements, Remote Sensing, vol. 4, no. 1, pp. 220-246. Zaichun Zhu, Jian Bi, Yaozhong Pan, Sangram Ganguly, Alessandro Anav, Liang Xu, Arindam Samanta, Shilong Piao, Ramakrishna R. Nemani and Ranga B. Myneni (2013).
Global Data Sets of Vegetation Leaf Area Index (LAI)3g and Fraction of photosynthetically Active Radiation (FPAR)3g Derived from Global Inventory Modeling and Mapping Studies (GIMMS) Normalized Difference Vegetation Index (NDVI3g) for the Period 1981 to 2011. Remote Sens. 2013, 5, 927-948; doi:10.3390/rs5020927

  9. The unfunded priorities: an evaluation of priority setting for noncommunicable disease control in Uganda.

    PubMed

    Essue, Beverley M; Kapiriri, Lydia

    2018-02-20

    The double burden of infectious diseases coupled with noncommunicable diseases poses unique challenges for priority setting and for achieving equitable action to address the major causes of disease burden in health systems already impacted by limited resources. Noncommunicable disease control is an important global health and development priority. However, there are challenges in translating this global priority into local priorities and action. The aim of this study was to evaluate the influence of national, sub-national and global factors on priority setting for noncommunicable disease control in Uganda and to examine the extent to which priority setting was successful. A mixed methods design was employed, using the Kapiriri & Martin framework for evaluating priority setting in low-income countries. The evaluation period was 2005-2015. Data collection included a document review (policy documents (n = 19); meeting minutes (n = 28)), media analysis (n = 114) and stakeholder interviews (n = 9). Data were analysed according to the Kapiriri & Martin (2010) framework. Priority setting for noncommunicable diseases was neither entirely fair nor successful. While there were explicit processes that incorporated relevant criteria, evidence and wide stakeholder involvement, these criteria were not used systematically or consistently in the consideration of noncommunicable diseases. There were insufficient resources for noncommunicable diseases, despite their being a priority area. There were weaknesses in the priority setting institutions, and insufficient mechanisms to ensure accountability for decision-making. Priority setting was influenced by the priorities of major stakeholders (i.e. development assistance partners), which were not always aligned with national priorities. There were major delays in the implementation of noncommunicable disease-related priorities and, in many cases, a failure to implement.
This evaluation revealed the challenges that low income countries are grappling with in prioritizing noncommunicable diseases in the context of a double disease burden with limited resources. Strengthening local capacity for priority setting would help to support the development of sustainable and implementable noncommunicable disease-related priorities. Global support (i.e. aid) to low income countries for noncommunicable diseases must also catch up to align with NCDs as a global health priority.

  10. Coordinating Multi-Rover Systems: Evaluation Functions for Dynamic and Noisy Environments

    NASA Technical Reports Server (NTRS)

    Turner, Kagan; Agogino, Adrian

    2005-01-01

    This paper addresses the evolution of control strategies for a collective: a set of entities that collectively strives to maximize a global evaluation function that rates the performance of the full system. Directly addressing such problems by having a population of collectives and applying an evolutionary algorithm to that population is appealing, but the search space is prohibitively large in most cases. Instead, we focus on evolving control policies for each member of the collective. The fundamental issue in this approach is how to create an evaluation function for each member of the collective that is both aligned with the global evaluation function and sensitive to the fitness changes of that member, while remaining relatively insensitive to the fitness changes of other members. We show how to construct evaluation functions in dynamic, noisy and communication-limited collective environments. On a rover coordination problem, a control policy evolved using aligned and member-sensitive evaluations outperforms global evaluation methods by up to 400%. More notably, in the presence of a larger number of rovers, or rovers with noisy and communication-limited sensors, the proposed method outperforms global evaluation by a higher percentage than in noise-free conditions with a small number of rovers.
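    One well-known construction with the alignment and sensitivity properties described above is the difference evaluation: a member's score is the global evaluation with that member present minus the global evaluation with the member's contribution removed. The toy point-of-interest coverage function below is a hypothetical illustration of the principle, not the paper's rover simulation.

```python
def global_eval(observations):
    # toy global evaluation: total value of the DISTINCT points of
    # interest observed by any rover (rewards coverage, not duplication)
    return sum(set(observations))

def difference_eval(observations, i):
    # difference evaluation for member i: G with i present minus G with
    # i's contribution removed -- aligned with G (improving it never
    # hurts G), yet sensitive mainly to member i's own action
    without_i = observations[:i] + observations[i + 1:]
    return global_eval(observations) - global_eval(without_i)

# three rovers; rovers 0 and 1 redundantly observe the same POI (value 5)
obs = [5, 5, 3]
print([difference_eval(obs, i) for i in range(3)])  # → [0, 0, 3]
```

The redundant rovers receive zero credit while the rover covering a unique point receives full credit, which is exactly the signal an evolutionary search needs to spread the collective out.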

  11. Climate Prediction Center global monthly soil moisture data set at 0.5° resolution for 1948 to present

    NASA Astrophysics Data System (ADS)

    Fan, Yun; van den Dool, Huug

    2004-05-01

    We have produced a 0.5° × 0.5° monthly global soil moisture data set for the period from 1948 to the present. The land model is a one-layer "bucket" water balance model, while the driving input fields are Climate Prediction Center monthly global precipitation over land, which uses over 17,000 gauges worldwide, and monthly global temperature from the global Reanalysis. The output consists of global monthly soil moisture, evaporation, and runoff, starting from January 1948. A distinguishing feature of this data set is that all fields are updated monthly, which greatly enhances its utility for near-real-time purposes. Data validation shows that the land model does well; both the simulated annual cycle and the interannual variability of soil moisture are reasonably good against the limited observations in different regions. A data analysis reveals that, on average, the land surface water balance components have a stronger annual cycle in the Southern Hemisphere than in the Northern Hemisphere. From the point of view of soil moisture, climates can be characterized into two types, monsoonal and midlatitude, with the monsoonal ones covering most of the low-latitude land areas and showing a more prominent annual variation. A global soil moisture empirical orthogonal function analysis and time series of hemispheric means reveal some interesting patterns (such as El Niño-Southern Oscillation) and long-term trends at both regional and global scales.
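    A one-layer bucket model of the kind described can be sketched as follows. The capacity, initial state, and the linear evaporation-efficiency assumption are illustrative choices in the spirit of such models, not the settings used for the CPC data set.

```python
def bucket_model(precip, pet, capacity=150.0, w0=75.0):
    # Minimal one-layer "bucket" water balance.  Monthly inputs in mm.
    # Evaporation scales with relative soil wetness (beta-type
    # efficiency); water above capacity becomes runoff.  Assumes PET is
    # below capacity so the store cannot go negative.
    w, soil, evap, runoff = w0, [], [], []
    for p, e_pot in zip(precip, pet):
        e = e_pot * (w / capacity)      # actual evaporation
        w = w + p - e
        r = max(w - capacity, 0.0)      # saturation-excess runoff
        w -= r
        soil.append(w); evap.append(e); runoff.append(r)
    return soil, evap, runoff

# four illustrative months: wet, average, dry, wet
soil, evap, runoff = bucket_model([100, 30, 0, 120], [40, 60, 80, 50])
print(round(soil[-1], 1))  # → 150.0
```

By construction every time step conserves water: storage change equals precipitation minus evaporation minus runoff, which is the property that lets the model partition the observed forcing into the three output fields.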

  12. Tropospheric and Lower Stratospheric Temperature Anomalies Based on Global Radiosonde Network Data (1958 - 2005)

    DOE Data Explorer

    Sterin, Alexander M. [Russian Research Institute for Hydrometeorological Information--World Data Center

    2007-01-01

    The observed radiosonde data from the Comprehensive Aerological Reference Data Set (CARDS) (Eskridge et al. 1995) were taken as the primary input for obtaining the series. These data cover the global radiosonde observational network through 2001. Since 2002, the AEROSTAB data (upper-air observations obtained through communication channels), collected at RIHMI-WDC in Obninsk, have been used. Both of these data sources cover the global radiosonde observational network. The CARDS data set is known as the most complete collection of radiosonde data.

  13. Material discovery by combining stochastic surface walking global optimization with a neural network.

    PubMed

    Huang, Si-Da; Shang, Cheng; Zhang, Xiao-Jie; Liu, Zhi-Pan

    2017-09-01

    While the underlying potential energy surface (PES) determines the structure and other properties of a material, it has been frustrating to predict new materials from theory even with the advent of supercomputing facilities. The accuracy of the PES and the efficiency of PES sampling are two major bottlenecks, not least because of the great complexity of the material PES. This work introduces a "Global-to-Global" approach for material discovery by combining, for the first time, a global optimization method with neural network (NN) techniques. The novel global optimization method, named the stochastic surface walking (SSW) method, is carried out massively in parallel for generating a global training data set, the fitting of which by the atom-centered NN produces a multi-dimensional global PES; the subsequent SSW exploration of large systems with the analytical NN PES can provide key information on the thermodynamic and kinetic stability of unknown phases identified from global PESs. We describe in detail the current implementation of the SSW-NN method with particular focus on the size of the global data set and the simultaneous energy/force/stress NN training procedure. An important functional material, TiO2, is used as an example to demonstrate the automated global data set generation, the improved NN training procedure and the application in material discovery. Two new TiO2 porous crystal structures are identified, which have thermodynamic stability similar to that of the common TiO2 rutile phase, and the kinetic stability of one of them is further proved by SSW pathway sampling. As a general tool for material simulation, the SSW-NN method provides an efficient and predictive platform for large-scale computational material screening.

  14. Homogeneity testing of the global ESA CCI multi-satellite soil moisture climate data record

    NASA Astrophysics Data System (ADS)

    Preimesberger, Wolfgang; Su, Chun-Hsu; Gruber, Alexander; Dorigo, Wouter

    2017-04-01

    ESA's Climate Change Initiative (CCI) creates a global, long-term data record by merging multiple available earth observation products, with the goal of providing a product for climate studies, trend analysis, and risk assessments. The blending of soil moisture (SM) time series derived from different active and passive remote sensing instruments with varying sensor characteristics, such as microwave frequency, signal polarization or radiometric accuracy, could potentially lead to inhomogeneities in the merged long-term data series, undercutting the usefulness of the product. To detect the spatio-temporal extent of contiguous periods without inhomogeneities, as well as to subsequently minimize their negative impact on the data records, different relative homogeneity tests (namely the Fligner-Killeen test of homogeneity of variances and the Wilcoxon rank-sums test) are implemented and tested on the combined active-passive ESA CCI SM data set. Inhomogeneities are detected by comparing the data against in situ reference data from the ISMN and model-based estimates from GLDAS-Noah and MERRA-Land. Inhomogeneity testing is performed over the ESA CCI SM time frame of 38 years (from 1978 to 2015), on a global quarter-degree grid, and with regard to six alterations in the combination of observation systems used in the data blending process. This study describes and explains observed variations in the spatial and temporal patterns of inhomogeneities in the combined products. In addition, we propose methodologies for measuring and reducing the impact of inhomogeneities on trends derived from the ESA CCI SM data set, and suggest the use of inhomogeneity-corrected data for future trend studies. This study is supported by the European Union's FP7 EartH2Observe "Global Earth Observation for Integrated Water Resource Assessment" project (grant agreement number 603608).
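    The relative homogeneity testing described above can be sketched with SciPy's implementations of the two named tests. The breakpoint handling, significance level, and toy data below are assumptions for illustration, not the ESA CCI processing code.

```python
import numpy as np
from scipy.stats import fligner, ranksums

def detect_breakpoint(candidate, reference, break_idx, alpha=0.01):
    # Relative test: work on the candidate-minus-reference difference
    # series, split at the suspected observation-system change.
    # Wilcoxon rank-sums detects a shift in the mean; Fligner-Killeen
    # detects a change in variance.
    q = np.asarray(candidate) - np.asarray(reference)
    before, after = q[:break_idx], q[break_idx:]
    _, p_mean = ranksums(before, after)
    _, p_var = fligner(before, after)
    return {"mean_break": p_mean < alpha, "var_break": p_var < alpha}

# toy series: sensor change at index 200 introduces a bias and extra noise
rng = np.random.default_rng(1)
sat = np.concatenate([rng.normal(0.20, 0.02, 200),
                      rng.normal(0.25, 0.05, 200)])
model = rng.normal(0.20, 0.02, 400)   # homogeneous reference
print(detect_breakpoint(sat, model, 200))
```

Differencing against a homogeneous reference removes the shared climate signal, so any detected mean or variance break can be attributed to the merged satellite record rather than to real geophysical change.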

  15. Quality Indicators for Global Benchmarking of Localized Prostate Cancer Management.

    PubMed

    Sampurno, Fanny; Zheng, Jia; Di Stefano, Lydia; Millar, Jeremy L; Foster, Claire; Fuedea, Ferran; Higano, Celestia; Hulan, Hartwig; Mark, Stephen; Moore, Caroline; Richardson, Alison; Sullivan, Frank; Wenger, Neil S; Wittmann, Daniela; Evans, Sue

    2018-03-01

    We sought to develop a core set of clinical indicators to enable international benchmarking of localized prostate cancer management using data available in the TrueNTH (True North) Global Registry. An international expert panel completed an online survey and participated in a face-to-face meeting. Participants included 3 urologists, 3 radiation oncologists, 2 psychologists, 1 medical oncologist, 1 nurse and 1 epidemiologist with prostate cancer expertise from a total of 7 countries. Current guidelines on prostate cancer treatment and potential quality indicators were identified from a literature review. These potential indicators were refined and developed through a modified Delphi process, during which each panelist independently and repeatedly rated each indicator based on importance (satisfying the indicator demonstrated a provision of high quality care) and feasibility (the likelihood that data used to construct the indicator could be collected at a population level). The main outcome measure was items with panel agreement, indicated by a disagreement index less than 1, median importance 8.5 or greater and median feasibility 9 or greater. The expert panel endorsed 33 indicators. Seven of these 33 prostate cancer quality indicators assessed care relating to diagnosis, 7 assessed primary treatment, 1 assessed salvage treatment and 18 assessed health outcomes. We developed a set of quality indicators to measure prostate cancer care using numerous international evidence-based clinical guidelines. These indicators will be pilot tested in the TrueNTH Global Registry. Reports comparing indicator performance will subsequently be distributed to groups at participating sites with the purpose of improving the consistency and quality of prostate cancer management on a global basis. Copyright © 2018 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
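
    The panel's endorsement rule stated above can be written compactly; a minimal sketch, where the function name and input layout are illustrative rather than the registry's:

    ```python
    from statistics import median

    def endorsed(importance_ratings, feasibility_ratings, disagreement_index):
        """Endorsement rule from the study: disagreement index < 1,
        median importance >= 8.5, and median feasibility >= 9."""
        return (disagreement_index < 1
                and median(importance_ratings) >= 8.5
                and median(feasibility_ratings) >= 9)
    ```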

  16. Diagnostic Capability of Peripapillary Three-dimensional Retinal Nerve Fiber Layer Volume for Glaucoma Using Optical Coherence Tomography Volume Scans.

    PubMed

    Khoueir, Ziad; Jassim, Firas; Poon, Linda Yi-Chieh; Tsikata, Edem; Ben-David, Geulah S; Liu, Yingna; Shieh, Eric; Lee, Ramon; Guo, Rong; Papadogeorgou, Georgia; Braaf, Boy; Simavli, Huseyin; Que, Christian; Vakoc, Benjamin J; Bouma, Brett E; de Boer, Johannes F; Chen, Teresa C

    2017-10-01

    To determine the diagnostic capability of peripapillary 3-dimensional (3D) retinal nerve fiber layer (RNFL) volume measurements from spectral-domain optical coherence tomography (OCT) volume scans for open-angle glaucoma (OAG). Assessment of diagnostic accuracy. Setting: Academic clinical setting. Total of 180 patients (113 OAG and 67 normal subjects). One eye per subject was included. Peripapillary 3D RNFL volumes were calculated for global, quadrant, and sector regions, using 4 different-size annuli. Peripapillary 2D RNFL thickness circle scans were also obtained. Outcome measures were area under the receiver operating characteristic curve (AUROC) values, sensitivity, specificity, positive and negative predictive values, and positive and negative likelihood ratios. Among all 2D and 3D RNFL parameters, the best diagnostic capability was associated with the inferior quadrant 3D RNFL volume of the smallest annulus (AUROC value 0.977). Otherwise, global 3D RNFL volume AUROC values were comparable to global 2D RNFL thickness AUROC values for all 4 annulus sizes (P values: .0593 to .6866). When comparing the 4 annulus sizes for global RNFL volume, the smallest annulus had the best AUROC values (P values: .0317 to .0380). The smallest-size annulus may have the best diagnostic potential, partly owing to having no areas excluded for being larger than the 6 × 6 mm² scanned region. Peripapillary 3D RNFL volume showed excellent diagnostic performance for detecting glaucoma. Peripapillary 3D RNFL volume parameters have the same or better diagnostic capability compared with peripapillary 2D RNFL thickness measurements, although differences were not statistically significant. Copyright © 2017 Elsevier Inc. All rights reserved.
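
    AUROC, the study's main accuracy metric, has a convenient rank interpretation: it is the probability that a randomly chosen diseased eye receives a more abnormal score than a randomly chosen normal eye. A minimal sketch of the empirical estimator (not the study's software, which presumably used a standard statistics package):

    ```python
    def auroc(pos_scores, neg_scores):
        """Empirical AUROC: P(pos > neg) + 0.5 * P(tie), over all pairs.

        pos_scores: abnormality scores of diseased eyes,
        neg_scores: abnormality scores of normal eyes.
        """
        wins = sum((p > n) + 0.5 * (p == n)
                   for p in pos_scores for n in neg_scores)
        return wins / (len(pos_scores) * len(neg_scores))
    ```

    With, say, lower RNFL volume coded as a higher abnormality score, perfect separation yields 1.0 and chance-level discrimination yields 0.5.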

  17. Internal validation of the GlobalFiler™ Express PCR Amplification Kit for the direct amplification of reference DNA samples on a high-throughput automated workflow.

    PubMed

    Flores, Shahida; Sun, Jie; King, Jonathan; Budowle, Bruce

    2014-05-01

    The GlobalFiler™ Express PCR Amplification Kit uses 6-dye fluorescent chemistry to enable multiplexing of 21 autosomal STRs, 1 Y-STR, 1 Y-indel and the sex-determining marker amelogenin. The kit is specifically designed for processing reference DNA samples in a high-throughput manner. Validation studies were conducted to assess the performance and define the limitations of this direct amplification kit for typing blood and buccal reference DNA samples on various punchable collection media. Studies included thermal cycling sensitivity, reproducibility, precision, sensitivity of detection, minimum detection threshold, system contamination, stochastic threshold and concordance. Results showed that the optimal amplification and injection parameters for a 1.2 mm punch from blood and buccal samples were 27 and 28 cycles, respectively, combined with a 12 s injection on an ABI 3500xL Genetic Analyzer. Minimum detection thresholds were set at 100 and 120 RFU for 27 and 28 cycles, respectively, and it was suggested that data from positive amplification controls provided a better threshold representation. Stochastic thresholds were set at 250 and 400 RFU for 27 and 28 cycles, respectively, as stochastic effects increased with cycle number. The minimum amount of input DNA resulting in a full profile was 0.5 ng; however, the optimum range determined was 2.5-10 ng. Profile quality from the GlobalFiler™ Express Kit and the previously validated AmpFlSTR(®) Identifiler(®) Direct Kit was comparable. The validation data support that reliable DNA typing results from reference DNA samples can be obtained using the GlobalFiler™ Express PCR Amplification Kit. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
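
    The detection and stochastic thresholds quoted above lend themselves to a simple peak-classification rule; a sketch using the study's RFU values (the function and category names are illustrative, not part of the kit's analysis software):

    ```python
    # Thresholds from the validation study, keyed by PCR cycle number.
    DETECTION_RFU = {27: 100, 28: 120}   # minimum detection thresholds
    STOCHASTIC_RFU = {27: 250, 28: 400}  # stochastic thresholds

    def classify_peak(rfu, cycles):
        """Classify an electropherogram peak against the study's thresholds."""
        if rfu < DETECTION_RFU[cycles]:
            return "noise"  # below the minimum detection threshold
        if rfu < STOCHASTIC_RFU[cycles]:
            # Detected, but a sister allele may have dropped out.
            return "below stochastic threshold"
        return "reliable"
    ```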

  18. Optical Algorithms at Satellite Wavelengths for Total Suspended Matter in Tropical Coastal Waters.

    PubMed

    Ouillon, Sylvain; Douillet, Pascal; Petrenko, Anne; Neveux, Jacques; Dupouy, Cécile; Froidefond, Jean-Marie; Andréfouët, Serge; Muñoz-Caravaca, Alain

    2008-07-10

    Is it possible to accurately derive Total Suspended Matter concentration, or its proxy, turbidity, from remote sensing data in tropical coastal lagoon waters? To investigate this question, hyperspectral remote sensing reflectance, turbidity and chlorophyll pigment concentration were measured in three coral reef lagoons. The three sites enabled us to collect data over very diverse environments: oligotrophic and sediment-poor waters in the southwest lagoon of New Caledonia, eutrophic waters in Cienfuegos Bay (Cuba), and sediment-rich waters in Laucala Bay (Fiji). In this paper, optical algorithms for turbidity are presented per site, based on 113 stations in New Caledonia, 24 stations in Cuba and 56 stations in Fiji. Empirical algorithms are tested at satellite wavebands useful for coastal applications. Global algorithms are also derived for the merged data set (193 stations), and the performance of the global and local regression algorithms is compared. The best one-band algorithms over all the measurements are obtained at 681 nm using either a polynomial or a power model. The best two-band algorithms are obtained with R412/R620, R443/R670 and R510/R681. Two three-band algorithms based on Rrs620.Rrs681/Rrs412 and Rrs620.Rrs681/Rrs510 also give fair regression statistics. Finally, we propose a global algorithm based on one or three bands: turbidity is first calculated from Rrs681 and then, if < 1 FTU, it is recalculated using an algorithm based on Rrs620.Rrs681/Rrs412. On our data set, this algorithm is suitable for the 0.2-25 FTU turbidity range and for the three sites sampled (mean bias: 3.6%, rms: 35%, mean quadratic error: 1.4 FTU). This shows that defining global empirical turbidity algorithms in tropical coastal waters is within reach.
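
    The proposed global algorithm is a two-step lookup: a one-band estimate at 681 nm, recomputed from a three-band ratio when the first pass falls below 1 FTU. A sketch of that control flow, where the coefficients A1, B1, A2, B2 are hypothetical placeholders (the paper fits actual values by regression on the 193-station data set):

    ```python
    # Hypothetical power-model coefficients; the paper's fitted values differ.
    A1, B1 = 1.0e4, 1.5   # one-band model at 681 nm
    A2, B2 = 5.0e2, 1.2   # three-band ratio model for low turbidity

    def turbidity(rrs412, rrs620, rrs681):
        """Two-step global turbidity algorithm (FTU).

        Step 1: power model on Rrs(681).
        Step 2: if the result is < 1 FTU, recompute from
                Rrs(620) * Rrs(681) / Rrs(412), which is more
                sensitive in clear water.
        """
        t = A1 * rrs681 ** B1
        if t < 1.0:
            t = A2 * (rrs620 * rrs681 / rrs412) ** B2
        return t
    ```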

  19. SACRA - global data sets of satellite-derived crop calendars for agricultural simulations: an estimation of a high-resolution crop calendar using satellite-sensed NDVI

    NASA Astrophysics Data System (ADS)

    Kotsuki, S.; Tanaka, K.

    2015-01-01

    To date, many studies have performed numerical estimations of food production and agricultural water demand to understand the present and future supply-demand relationship. A crop calendar (CC), which defines the date or month when farmers plant and harvest in cropland, is an essential input for estimating food production and agricultural water demand accurately with such numerical estimations. This study aims to develop a new global data set of a satellite-derived crop calendar for agricultural simulations (SACRA) and to reveal the advantages and disadvantages of the satellite-derived CC compared to other global products. We estimate the global CC at a spatial resolution of 5 min (≈10 km) using satellite-sensed NDVI data, which correspond well to vegetation growth and death on the land surface. We first demonstrate that SACRA shows a spatial pattern of planting dates similar to that of a census-based product. Moreover, SACRA reflects the variety of CCs within the same administrative unit, since it uses high-resolution satellite data. A disadvantage, however, is that the mixture of several crops within a grid cell is not considered in SACRA. We also show that the cultivation period in SACRA clearly corresponds to the time series of NDVI; the accuracy of SACRA therefore depends on the accuracy of the NDVI used for the CC estimation. Although SACRA shows a different CC from a census-based product in some regions, using the two products together is useful for taking the uncertainty of the CC into consideration. An advantage of SACRA over census-based products is that it provides not only planting/harvesting dates but also a peak date from the time series of NDVI data.
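
    A threshold-crossing rule on a seasonal NDVI series is one simple way to recover planting, peak, and harvest timing of the kind SACRA provides; a sketch under the assumptions that the seasonal peak exceeds the green-up threshold and that the 0.4 threshold is illustrative, not SACRA's calibrated value:

    ```python
    def crop_calendar(ndvi, thresh=0.4):
        """Return (planting, peak, harvest) indices of a seasonal NDVI series:
        first rise to >= thresh, the maximum, and the first fall below thresh
        after the peak (last index if NDVI never falls back below thresh)."""
        peak = max(range(len(ndvi)), key=ndvi.__getitem__)
        planting = next(i for i in range(peak + 1) if ndvi[i] >= thresh)
        harvest = next((i for i in range(peak, len(ndvi)) if ndvi[i] < thresh),
                       len(ndvi) - 1)
        return planting, peak, harvest
    ```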

  20. NASA Langley Atmospheric Science Data Center's Near Real-Time Data Products

    NASA Astrophysics Data System (ADS)

    Davenport, T.; Parker, L.; Rinsland, P. L.

    2014-12-01

    Over the past decade the Atmospheric Science Data Center (ASDC) at NASA Langley Research Center has archived and distributed a variety of satellite mission data sets. NASA's goal in Earth science is to observe, understand, and model the Earth system to discover how it is changing, to better predict change, and to understand the consequences for life on Earth. The ASDC has collaborated with Science Teams to accommodate emerging science users in the climate and modeling communities. The ASDC has expanded its original role to support operational usage by related Earth Science satellites, land and ocean assimilations, field campaigns, outreach programs, and application projects for the agriculture and energy industries, bridging the gap between Earth science research results and the adoption of data and prediction capabilities for reliable and sustained use in Decision Support Systems (DSS). For example, these products are being used by the data assimilation community to regulate aerosol mass in global transport models to improve model response and forecast accuracy, to assess the performance of components of a global coupled atmospheric-ocean climate model, to improve the impact of atmospheric motion vectors (winds) on numerical weather prediction models, and to provide internet-based access to parameters specifically tailored to assist in the design of solar- and wind-powered renewable energy systems. These more focused applications often require Near Real-Time (NRT) products, and generating NRT products poses its own unique set of challenges for the ASDC and the Science Teams. Examples of ASDC NRT products and challenges will be discussed.

  1. Application of Deep Learning in GLOBELAND30-2010 Product Refinement

    NASA Astrophysics Data System (ADS)

    Liu, T.; Chen, X.

    2018-04-01

    GlobeLand30, one of the best Global Land Cover (GLC) products at 30-m resolution, has been widely used in many research fields. Owing to significant spectral confusion among different land cover types and the limited textural information of Landsat data, the overall accuracy of GlobeLand30 is about 80%. Although such accuracy is much higher than that of most other global land cover products, it cannot satisfy various applications, and there is still a great need for an effective method to improve the quality of GlobeLand30. The explosive growth of high-resolution satellite imagery and the remarkable performance of deep learning on image classification provide a new opportunity to refine GlobeLand30. However, the performance of deep learning depends on the quality and quantity of training samples as well as the model training strategy. Therefore, this paper 1) proposed an automatic training sample generation method via Google Earth to build a large training sample set; and 2) explored the best training strategy for land cover classification using GoogleNet (Inception V3), one of the most widely used deep learning networks. The results show that fine-tuning from the first layer of Inception V3 using the rough large sample set is the best strategy. The retrained network was then applied to one selected area of Xi'an city as a case study of GlobeLand30 refinement. The experiment results indicate that the proposed approach, combining deep learning and Google Earth imagery, is a promising solution for further improving the accuracy of GlobeLand30.

  2. Technical note: Coordination and harmonization of the multi-scale, multi-model activities HTAP2, AQMEII3, and MICS-Asia3: simulations, emission inventories, boundary conditions, and model output formats.

    PubMed

    Galmarini, Stefano; Koffi, Brigitte; Solazzo, Efisio; Keating, Terry; Hogrefe, Christian; Schulz, Michael; Benedictow, Anna; Griesfeller, Jan Jurgen; Janssens-Maenhout, Greet; Carmichael, Greg; Fu, Joshua; Dentener, Frank

    2017-01-31

    We present an overview of the coordinated global numerical modelling experiments performed during 2012-2016 by the Task Force on Hemispheric Transport of Air Pollution (TF HTAP), the regional experiments by the Air Quality Model Evaluation International Initiative (AQMEII) over Europe and North America, and the Model Intercomparison Study for Asia (MICS-Asia). To improve model estimates of the impacts of intercontinental transport of air pollution on climate, ecosystems, and human health and to answer a set of policy-relevant questions, these three initiatives performed emission perturbation modelling experiments consistent across the global, hemispheric, and continental/regional scales. In all three initiatives, model results are extensively compared against monitoring data for a range of variables (meteorological, trace gas concentrations, and aerosol mass and composition) from different measurement platforms (ground measurements, vertical profiles, airborne measurements) collected from a number of sources. Approximately 10 to 25 modelling groups have contributed to each initiative, and model results have been managed centrally through three data hubs maintained by each initiative. Given the organizational complexity of bringing together these three initiatives to address a common set of policy-relevant questions, this publication provides the motivation for the modelling activity, the rationale for specific choices made in the model experiments, and an overview of the organizational structures for both the modelling and the measurements used and analysed in a number of modelling studies in this special issue.
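
    Emission perturbation experiments of the kind coordinated here are typically analysed by linearly scaling the concentration response to the fractional emission change; a sketch of that scaling (the 20% perturbation is an assumption for illustration, not stated in this abstract):

    ```python
    def scaled_response(base_conc, perturbed_conc, fraction=0.2):
        """Linearized concentration response to a full (100%) emission change,
        scaled up from a fractional emission-reduction experiment.

        base_conc:      concentration in the base simulation
        perturbed_conc: concentration with emissions reduced by `fraction`
        """
        return (base_conc - perturbed_conc) / fraction
    ```

    Source-receptor relationships between regions are then built from many such pairs of simulations, one per perturbed source region.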

  3. Technical note: Coordination and harmonization of the multi-scale, multi-model activities HTAP2, AQMEII3, and MICS-Asia3: simulations, emission inventories, boundary conditions, and model output formats

    NASA Astrophysics Data System (ADS)

    Galmarini, Stefano; Koffi, Brigitte; Solazzo, Efisio; Keating, Terry; Hogrefe, Christian; Schulz, Michael; Benedictow, Anna; Griesfeller, Jan Jurgen; Janssens-Maenhout, Greet; Carmichael, Greg; Fu, Joshua; Dentener, Frank

    2017-01-01

    We present an overview of the coordinated global numerical modelling experiments performed during 2012-2016 by the Task Force on Hemispheric Transport of Air Pollution (TF HTAP), the regional experiments by the Air Quality Model Evaluation International Initiative (AQMEII) over Europe and North America, and the Model Intercomparison Study for Asia (MICS-Asia). To improve model estimates of the impacts of intercontinental transport of air pollution on climate, ecosystems, and human health and to answer a set of policy-relevant questions, these three initiatives performed emission perturbation modelling experiments consistent across the global, hemispheric, and continental/regional scales. In all three initiatives, model results are extensively compared against monitoring data for a range of variables (meteorological, trace gas concentrations, and aerosol mass and composition) from different measurement platforms (ground measurements, vertical profiles, airborne measurements) collected from a number of sources. Approximately 10 to 25 modelling groups have contributed to each initiative, and model results have been managed centrally through three data hubs maintained by each initiative. Given the organizational complexity of bringing together these three initiatives to address a common set of policy-relevant questions, this publication provides the motivation for the modelling activity, the rationale for specific choices made in the model experiments, and an overview of the organizational structures for both the modelling and the measurements used and analysed in a number of modelling studies in this special issue.

  4. Technical note: Coordination and harmonization of the multi-scale, multi-model activities HTAP2, AQMEII3, and MICS-Asia3: simulations, emission inventories, boundary conditions, and model output formats

    PubMed Central

    Galmarini, Stefano; Koffi, Brigitte; Solazzo, Efisio; Keating, Terry; Hogrefe, Christian; Schulz, Michael; Benedictow, Anna; Griesfeller, Jan Jurgen; Janssens-Maenhout, Greet; Carmichael, Greg; Fu, Joshua; Dentener, Frank

    2018-01-01

    We present an overview of the coordinated global numerical modelling experiments performed during 2012–2016 by the Task Force on Hemispheric Transport of Air Pollution (TF HTAP), the regional experiments by the Air Quality Model Evaluation International Initiative (AQMEII) over Europe and North America, and the Model Intercomparison Study for Asia (MICS-Asia). To improve model estimates of the impacts of intercontinental transport of air pollution on climate, ecosystems, and human health and to answer a set of policy-relevant questions, these three initiatives performed emission perturbation modelling experiments consistent across the global, hemispheric, and continental/regional scales. In all three initiatives, model results are extensively compared against monitoring data for a range of variables (meteorological, trace gas concentrations, and aerosol mass and composition) from different measurement platforms (ground measurements, vertical profiles, airborne measurements) collected from a number of sources. Approximately 10 to 25 modelling groups have contributed to each initiative, and model results have been managed centrally through three data hubs maintained by each initiative. Given the organizational complexity of bringing together these three initiatives to address a common set of policy-relevant questions, this publication provides the motivation for the modelling activity, the rationale for specific choices made in the model experiments, and an overview of the organizational structures for both the modelling and the measurements used and analysed in a number of modelling studies in this special issue. PMID:29541091

  5. Integration of Network Topological and Connectivity Properties for Neuroimaging Classification

    PubMed Central

    Jie, Biao; Gao, Wei; Wang, Qian; Wee, Chong-Yaw

    2014-01-01

    Rapid advances in neuroimaging techniques have provided an efficient and noninvasive way of exploring the structural and functional connectivity of the human brain. Quantitative measurements of abnormality of brain connectivity in patients with neurodegenerative diseases, such as mild cognitive impairment (MCI) and Alzheimer's disease (AD), have also been widely reported, especially at a group level. Recently, machine learning techniques have been applied to the study of AD and MCI, i.e., to identify individuals with AD/MCI from healthy controls (HCs). However, most existing methods focus on using only a single property of a connectivity network, although multiple network properties, such as local connectivity and global topological properties, can potentially be used. In this paper, employing a multikernel-based approach, we propose a novel connectivity-based framework that integrates multiple properties of a connectivity network to improve classification performance. Specifically, two different types of kernels (i.e., a vector-based kernel and a graph kernel) are used to quantify two different yet complementary properties of the network, i.e., local connectivity and global topological properties. Then, the multikernel learning (MKL) technique is adopted to fuse these heterogeneous kernels for neuroimaging classification. We test the performance of our proposed method on two different data sets. First, we test it on the functional connectivity networks of 12 MCI and 25 HC subjects. The results show that our method achieves significant performance improvement over those using only one type of network property. Specifically, our method achieves a classification accuracy of 91.9%, which is 10.8% better than those of single network-property-based methods. Then, we test our method for gender classification on a large set of functional connectivity networks with 133 infants scanned at birth, 1 year, and 2 years, also demonstrating very promising results. PMID:24108708
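
    At its core, the multikernel fusion step amounts to a weighted combination of precomputed kernel matrices; a minimal sketch with a fixed weight (in MKL the weight is learned from the training data, and the combined kernel is then fed to a kernel classifier such as an SVM):

    ```python
    def combine_kernels(k_local, k_topo, beta=0.5):
        """Convex combination of two precomputed kernel matrices (lists of rows).

        k_local: vector-based kernel on local connectivity features
        k_topo:  graph kernel on global topological properties
        beta:    weight of the local-connectivity kernel (fixed here; learned in MKL)
        """
        return [[beta * a + (1 - beta) * b for a, b in zip(row_l, row_t)]
                for row_l, row_t in zip(k_local, k_topo)]
    ```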

  6. Wearable Technology for Global Surgical Teleproctoring.

    PubMed

    Datta, Néha; MacQueen, Ian T; Schroeder, Alexander D; Wilson, Jessica J; Espinoza, Juan C; Wagner, Justin P; Filipi, Charles J; Chen, David C

    2015-01-01

    In underserved communities around the world, inguinal hernias represent a significant burden of surgically-treatable disease. With traditional models of international surgical assistance limited to mission trips, a standardized framework to strengthen local healthcare systems is lacking. We established a surgical education model using web-based tools and wearable technology to allow for long-term proctoring and assessment in a resource-poor setting. This is a feasibility study examining wearable technology and web-based performance rating tools for long-term proctoring in an international setting. Using the Lichtenstein inguinal hernia repair as the index surgical procedure, local surgeons in Paraguay and Brazil were trained in person by visiting international expert trainers using a formal, standardized teaching protocol. Surgeries were captured in real-time using Google Glass and transmitted wirelessly to an online video stream, permitting real-time observation and proctoring by mentoring surgeon experts in remote locations around the world. A system for ongoing remote evaluation and support by experienced surgeons was established using the Lichtenstein-specific Operative Performance Rating Scale. Data were collected from 4 sequential training operations for surgeons trained in both Paraguay and Brazil. With continuous internet connectivity, live streaming of the surgeries was successful. The Operative Performance Rating Scale was immediately used after each operation. Both surgeons demonstrated proficiency at the completion of the fourth case. A sustainable model for surgical training and proctoring to empower local surgeons in resource-poor locations and "train trainers" is feasible with wearable technology and web-based communication. Capacity building by maximizing use of local resources and expertise offers a long-term solution to reducing the global burden of surgically-treatable disease. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  7. Sentinel-2A image quality commissioning phase final results: geometric calibration and performances

    NASA Astrophysics Data System (ADS)

    Languille, F.; Gaudel, A.; Dechoz, C.; Greslou, D.; de Lussy, F.; Trémas, T.; Poulain, V.; Massera, S.

    2016-10-01

    In the frame of the Copernicus program of the European Commission, Sentinel-2 offers multispectral high-spatial-resolution optical images over global terrestrial surfaces. In cooperation with ESA, the Centre National d'Etudes Spatiales (CNES) is in charge of the image quality of the project, and so ensures the CAL/VAL commissioning phase during the months following the launch. Sentinel-2 is a constellation of 2 satellites on a polar sun-synchronous orbit with a revisit time of 5 days (with both satellites), a wide field of view (290 km), 13 spectral bands in the visible and shortwave infrared, and high spatial resolution (10 m, 20 m and 60 m). The Sentinel-2 mission offers global coverage over terrestrial surfaces. The satellites systematically acquire terrestrial surfaces under the same viewing conditions in order to build temporal image stacks. The first satellite was launched in June 2015, and the CAL/VAL commissioning phase for geometric calibration then lasted 6 months. This paper reports observations and results from Sentinel-2 images during the commissioning phase and explains the geometric corrections applied to delivered Sentinel-2 products. It details the calibration sites and the methods used for calibrating the geometric parameters, and presents the associated results on the following topics: viewing frame orientation assessment, focal plane mapping for all spectral bands, geolocation assessment, and multispectral registration. Images are also systematically recalibrated against a common ground reference, a set of S2 images produced during the 6 months of CAL/VAL; this set of images is presented, together with the geolocation and multitemporal performance after refinement over this ground reference.

  8. Development of an Evaluation Methodology for Triple Bottom Line Reports Using International Standards on Reporting

    NASA Astrophysics Data System (ADS)

    Skouloudis, Antonis; Evangelinos, Konstantinos; Kourmousis, Fotis

    2009-08-01

    The purpose of this article is twofold. First, evaluation scoring systems for triple bottom line (TBL) reports to date are examined and potential methodological weaknesses and problems are highlighted. In this context, a new assessment methodology is presented, based explicitly on the most widely acknowledged standard on non-financial reporting worldwide, the Global Reporting Initiative (GRI) guidelines. The set of GRI topics and performance indicators was converted into scoring criteria, while the generic scoring device was set from 0 to 4 points. Second, the proposed benchmark tool was applied to the TBL reports published by Greek companies. Results reveal major gaps in reporting practices, stressing the need for the further development of internal systems and processes in order to collect essential non-financial performance data. A critical overview of the structure and rationale of the evaluation tool in conjunction with the Greek case study is discussed, while recommendations for future research in this relatively new field of reporting are suggested.
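
    With each GRI criterion scored on the 0-4 scale described above, a report's overall result can be normalised against the attainable maximum; a sketch of one such aggregation (the percentage formula is our illustration, not necessarily the authors' exact method):

    ```python
    def report_score(criterion_scores):
        """Aggregate per-criterion scores (each 0-4) into a percentage of the
        maximum attainable for the criteria assessed."""
        assert all(0 <= s <= 4 for s in criterion_scores), "scores are 0-4"
        return 100.0 * sum(criterion_scores) / (4 * len(criterion_scores))
    ```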

  9. Development of an evaluation methodology for triple bottom line reports using international standards on reporting.

    PubMed

    Skouloudis, Antonis; Evangelinos, Konstantinos; Kourmousis, Fotis

    2009-08-01

    The purpose of this article is twofold. First, evaluation scoring systems for triple bottom line (TBL) reports to date are examined and potential methodological weaknesses and problems are highlighted. In this context, a new assessment methodology is presented, based explicitly on the most widely acknowledged standard on non-financial reporting worldwide, the Global Reporting Initiative (GRI) guidelines. The set of GRI topics and performance indicators was converted into scoring criteria, while the generic scoring device was set from 0 to 4 points. Second, the proposed benchmark tool was applied to the TBL reports published by Greek companies. Results reveal major gaps in reporting practices, stressing the need for the further development of internal systems and processes in order to collect essential non-financial performance data. A critical overview of the structure and rationale of the evaluation tool in conjunction with the Greek case study is discussed, while recommendations for future research in this relatively new field of reporting are suggested.

  10. Interdecadal variability in pan-Pacific and global SST, revisited

    NASA Astrophysics Data System (ADS)

    Tung, Ka-Kit; Chen, Xianyao; Zhou, Jiansong; Li, King-Fai

    2018-05-01

    Interest in the "Interdecadal Pacific Oscillation (IPO)" in the global SST has surged recently, on suggestions that the Pacific may be the source of prominent interdecadal variations observed in the global-mean surface temperature, possibly through low-frequency modulation of the interannual El Nino-Southern Oscillation (ENSO) phenomenon. The IPO was defined by performing empirical orthogonal function (EOF) analysis of low-pass filtered SST. The low-pass filtering creates its own set of mathematical problems, in particular mode mixing, and has led to some questions, many unanswered. To understand what these EOFs are, we express them first in terms of the recently developed pairwise rotated EOFs of the unfiltered SST, which can largely separate the high and low frequency bands without resorting to filtering. As reported elsewhere, the leading rotated dynamical modes (after the global warming trend) of the unfiltered global SST are: ENSO, the Pacific Decadal Oscillation (PDO), and the Atlantic Multidecadal Oscillation (AMO). The IPO is not among them. The leading principal component (PC) of the low-pass filtered global SST is usually defined as the IPO, and it is seen to comprise ENSO, PDO and AMO in various proportions depending on the filter threshold. With decadal filtering, the contribution of the interannual ENSO is understandably negligible. The leading dynamical mode of the filtered global SST is mostly AMO, and therefore should not have been called the Interdecadal "Pacific" Oscillation. The leading dynamical mode of the filtered pan-Pacific SST is mostly PDO. This and other low-frequency variability with centers of action in the Pacific, from either the pan-Pacific or global SST, have a near-zero global mean.

  11. Transcontinental anaesthesia: a pilot study.

    PubMed

    Hemmerling, T M; Arbeid, E; Wehbe, M; Cyr, S; Giunta, F; Zaouter, C

    2013-05-01

    Although telemedicine is one of the key initiatives of the World Health Organization, no study has explored the feasibility and efficacy of teleanaesthesia. This bi-centre pilot study investigates the feasibility of transcontinental anaesthesia. Twenty patients aged ≥ 18 yr undergoing elective thyroid surgery for ≥ 30 min were enrolled in this study. The remote and local set-up was composed of a master-computer (Montreal) and a slave-computer (Pisa). A standard Internet connection, remote desktop control, and video conference software were used. All patients received total i.v. anaesthesia controlled remotely (Montreal). The main outcomes were feasibility, clinical performance, and controller performance of transcontinental anaesthesia. The clinical performance of hypnosis control was the efficacy in maintaining the bispectral index (BIS) at 45: 'excellent', 'good', 'poor', and 'inadequate' control represented BIS values within 10, from 11 to 20, from 21 to 30, or >30% from target. The clinical performance of analgesia was the efficacy in maintaining Analgoscore values at 0 (-9 to 9); -3 to +3 representing 'excellent' pain control, -3 to -6 and +3 to +6 representing 'good' pain control, and -6 to -9 and +6 to +9 representing 'insufficient' pain control. The controller performance was evaluated using Varvel parameters. Transcontinental anaesthesia was successful in all 20 consecutive patients. The clinical performance of hypnosis showed 'excellent and good' control for 69% of maintenance time, and the controller performance showed an average global performance index of 57. The clinical performance of analgesia was 'excellent and good' for 92% of maintenance time, and the controller performance showed a global performance index of 1118. Transcontinental anaesthesia is feasible; control of anaesthesia shows good performance indexes. Clinical registration number NCT01331096.
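
    The clinical-performance bands above translate directly into classification rules; a sketch, where reading the BIS bands as percent deviation from the target is our interpretation of the abstract's wording:

    ```python
    def bis_band(bis, target=45):
        """Band hypnosis control quality by percent deviation of BIS from target
        (<=10 excellent, <=20 good, <=30 poor, >30 inadequate)."""
        dev = 100 * abs(bis - target) / target
        if dev <= 10:
            return "excellent"
        if dev <= 20:
            return "good"
        if dev <= 30:
            return "poor"
        return "inadequate"

    def pain_control(analgoscore):
        """Band analgesia quality by |Analgoscore| on the -9..9 scale
        (<=3 excellent, <=6 good, otherwise insufficient)."""
        a = abs(analgoscore)
        if a <= 3:
            return "excellent"
        if a <= 6:
            return "good"
        return "insufficient"
    ```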

  12. Geoid undulation computations at laser tracking stations

    NASA Technical Reports Server (NTRS)

    Despotakis, Vasilios K.

    1987-01-01

Geoid undulation computations were performed at 29 laser stations distributed around the world using a combination of terrestrial gravity data within a cap of radius 2 deg and a potential coefficient set up to degree 180. The traditional methods of Stokes' and Meissl's modification, together with the Molodenskii method and the modified Sjoberg method, were applied. From numerical tests based on global error assumptions regarding the terrestrial data and the geopotential set, it was concluded that the modified Sjoberg method is the most accurate and promising technique for geoid undulation computations. The numerical computations of the geoid undulations using all four methods resulted in agreement with the ellipsoidal-minus-orthometric value of the undulations on the order of 60 cm or better for most of the laser stations in the eastern United States, Australia, Japan, Bermuda, and Europe. A systematic discrepancy of about 2 m for most of the western United States stations was detected and verified by using two relatively independent data sets. For oceanic laser stations in the western Atlantic and Pacific oceans that have no terrestrial data available, the adjusted GEOS-3 and SEASAT altimeter data were used for the computation of the geoid undulation in a collocation method.

  13. Choosing Sensor Configuration for a Flexible Structure Using Full Control Synthesis

    NASA Technical Reports Server (NTRS)

    Lind, Rick; Nalbantoglu, Volkan; Balas, Gary

    1997-01-01

    Optimal locations and types for feedback sensors which meet design constraints and control requirements are difficult to determine. This paper introduces an approach to choosing a sensor configuration based on Full Control synthesis. A globally optimal Full Control compensator is computed for each member of a set of sensor configurations which are feasible for the plant. The sensor configuration associated with the Full Control system achieving the best closed-loop performance is chosen for feedback measurements to an output feedback controller. A flexible structure is used as an example to demonstrate this procedure. Experimental results show sensor configurations chosen to optimize the Full Control performance are effective for output feedback controllers.

  14. Synthesis, crystal structure investigation, spectroscopic characterizations and DFT computations on a novel 1-(2-chloro-4-phenylquinolin-3-yl)ethanone

    NASA Astrophysics Data System (ADS)

    Murugavel, S.; Stephen, C. S. Jacob Prasanna; Subashini, R.; Reddy, H. Raveendranatha; AnanthaKrishnan, Dhanabalan

    2016-10-01

The title compound 1-(2-chloro-4-phenylquinolin-3-yl)ethanone (CPQE) was synthesised effectively by chlorination of 3-acetyl-4-phenylquinolin-2(1H)-one (APQ) using POCl3 reagent. Structural and vibrational spectroscopic studies were performed using single-crystal X-ray diffraction, FTIR and NMR spectral analysis, along with the DFT method as implemented in the GAUSSIAN 03 software. The VEDA program has been employed to perform a detailed interpretation of the vibrational spectra. Mulliken population analysis of atomic charges, MEP, HOMO-LUMO, NBO, global chemical reactivity descriptors and thermodynamic properties have been examined by the DFT/B3LYP method with the 6-311G(d,p) basis set.

  15. Land Ice Verification and Validation Kit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-07-15

To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community-prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.

  16. Long-Term Efficacy of Psychosocial Treatments for Adults With Attention-Deficit/Hyperactivity Disorder: A Meta-Analytic Review

    PubMed Central

    López-Pinar, Carlos; Martínez-Sanchís, Sonia; Carbonell-Vayá, Enrique; Fenollar-Cortés, Javier; Sánchez-Meca, Julio

    2018-01-01

Background: Recent evidence suggests that psychosocial treatments, particularly cognitive-behavioral therapy (CBT), are effective interventions for adult attention deficit hyperactivity disorder (ADHD). The objective of this review was to determine the long-term efficacy of psychosocial interventions in improving clinically relevant variables, including ADHD core symptoms, clinical global impression (CGI), and global functioning. Methods: In total, nine randomized controlled trials and three uncontrolled single-group pretest-posttest studies were included. The data from these studies were combined using the inverse variance method. Heterogeneity and risk of bias were assessed. Subgroup analyses and meta-regressions were performed to determine the influence of different potential moderator variables (risk of bias, medication status, follow-up length, therapy type and setting, and control group type) on effect size (ES) estimates. Results: Up to 680 of a total of 1,073 participants assessed pre-treatment were retained at follow-up. Treatment groups showed greater improvement than control groups in self-reported total ADHD symptoms, inattention, and hyperactivity/impulsivity, in addition to CGI and global functioning. Blind assessors also reported a large ES in within-subject outcomes. Studies using dialectical behavioral therapy (DBT) in a group setting, with active control matching, and that were rated as having an unclear risk of bias, achieved significantly lower ES estimates for most outcomes. Treatment effectiveness, according to the CGI measure, and global functioning were significantly increased when the percentage of medicated participants was greater. Conclusions: Our results indicate that the post-treatment gains reported in previous reviews are sustained for at least 12 months. Nevertheless, these results must be interpreted with caution, because of a high level of heterogeneity among studies and the risk of bias observed in the majority of outcomes. Thus, these findings indicate that psychological interventions are a highly valuable and stable clinical tool for the treatment of core symptoms and global functioning in adults with ADHD. PMID:29780342
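
The inverse variance method mentioned in the Methods pools study-level effect sizes by weighting each with the reciprocal of its variance. A minimal fixed-effect sketch (illustrative numbers, not data from the review):

```python
import math

def inverse_variance_pool(effects, variances):
    """Fixed-effect pooled estimate: each study is weighted by 1/variance,
    so more precise studies contribute more to the combined effect size."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # standard error of the pooled effect
    return pooled, se
```

For example, pooling effects 0.5 (variance 0.04) and 0.3 (variance 0.01) gives a combined estimate of 0.34, pulled toward the more precise study.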

  18. Crew fatigue safety performance indicators for fatigue risk management systems.

    PubMed

    Gander, Philippa H; Mangie, Jim; Van Den Berg, Margo J; Smith, A Alexander T; Mulrine, Hannah M; Signal, T Leigh

    2014-02-01

Implementation of Fatigue Risk Management Systems (FRMS) is gaining momentum; however, agreed safety performance indicators (SPIs) are lacking. This paper proposes an initial set of SPIs based on measures of crewmember sleep, performance, and subjective fatigue and sleepiness, together with methods for interpreting them. Data were included from 133 landing crewmembers on 2 long-range and 3 ultra-long-range trips (4-person crews, 3 airlines, 220 flights). Studies had airline, labor, and regulatory support, and underwent independent ethical review. SPIs evaluated preflight and at top of descent (TOD) were: total sleep in the prior 24 h and time awake at duty start and at TOD (actigraphy); subjective sleepiness (Karolinska Sleepiness Scale) and fatigue (Samn-Perelli scale); and psychomotor vigilance task (PVT) performance. Kruskal-Wallis nonparametric ANOVA with post hoc tests was used to identify significant differences between flights for each SPI. Visual and preliminary quantitative comparisons of SPIs between flights were made using box plots and bar graphs. Statistical analyses identified significant differences between flights across a range of SPIs. In an FRMS, crew fatigue SPIs are envisaged as a decision aid alongside operational SPIs, which need to reflect the relevant causes of fatigue in different operations. We advocate comparing multiple SPIs between flights rather than defining safe/unsafe thresholds on individual SPIs. More comprehensive data sets are needed to identify the operational and biological factors contributing to the differences between flights reported here. Global sharing of an agreed core set of SPIs would greatly facilitate implementation and improvement of FRMS.
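
The Kruskal-Wallis test used to compare SPIs between flights reduces to the H statistic over pooled ranks. A stdlib-only sketch (average ranks for ties; no tie-variance correction):

```python
from itertools import chain

def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic: rank all observations together,
    then compare mean ranks across groups."""
    pooled = sorted(chain.from_iterable(groups))
    # assign each distinct value the average of the ranks it occupies
    ranks = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2.0  # mean of ranks i+1 .. j
        i = j
    n = len(pooled)
    h = 0.0
    for g in groups:
        r = sum(ranks[x] for x in g)  # rank sum for this group
        h += r * r / len(g)
    return 12.0 / (n * (n + 1)) * h - 3.0 * (n + 1)
```

In practice one would compare H against a chi-squared distribution with (number of groups - 1) degrees of freedom, as a library routine such as `scipy.stats.kruskal` does.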

  19. Achieving a Global Mind-Set at Home: Student Engagement with Immigrant Children

    ERIC Educational Resources Information Center

    Dallinger, Carolyn

    2017-01-01

    Developing a global mind-set in college students is a goal of many colleges and universities. Most often this goal is met by encouraging students to study abroad. This article explains how a service learning student engagement program at home achieves this goal by pairing Introduction to Sociology students with young immigrant children in a weekly…

  20. Satellite Instrument Calibration for Measuring Global Climate Change. Report of a Workshop at the University of Maryland Inn and Conference Center, College Park, MD, November 12-14, 2002

    NASA Technical Reports Server (NTRS)

    Ohring, G.; Wielicki, B.; Spencer, R.; Emery, B.; Datla, R.

    2004-01-01

Measuring the small changes associated with long-term global climate change from space is a daunting task. To address these problems and recommend directions for improvements in satellite instrument calibration, some 75 scientists, including researchers who develop and analyze long-term data sets from satellites, experts in the field of satellite instrument calibration, and physicists working on state-of-the-art calibration sources and standards, met November 12-14, 2002, and discussed the issues. The workshop defined the absolute accuracies and long-term stabilities of global climate data sets that are needed to detect expected trends, translated these data set accuracies and stabilities into required satellite instrument accuracies and stabilities, and evaluated the ability of current observing systems to meet these requirements. The workshop's recommendations include a set of basic axioms or overarching principles that must guide high-quality climate observations in general, and a roadmap for improving satellite instrument characterization, calibration, inter-calibration, and associated activities to meet the challenge of measuring global climate change. It is also recommended that a follow-up workshop be conducted to discuss implementation of the roadmap developed at this workshop.

  1. Recommended acetylene line list in the 20-240 cm-1 and 400-630 cm-1 regions: New measurements and global modeling

    NASA Astrophysics Data System (ADS)

    Jacquemart, David; Lyulin, Oleg; Perevalov, Valery I.

    2017-12-01

A new recommended 12C2H2 line list for the 13-248 cm-1 and 390-634 cm-1 regions is presented. It is based on the results of the global modeling of the line positions and intensities performed in Tomsk within the framework of the method of effective operators. To validate the Tomsk calculations, new measurements of both line positions and intensities were performed using acetylene spectra recorded between 25 and 680 cm-1 with the AILES-A beamline of the SOLEIL synchrotron. Line positions and intensities of 627 transitions belonging to 9 bands have been measured for the first time in this region. Using the results of these new measurements and the published results of measurements in the 13-248 cm-1 and 390-634 cm-1 regions performed with the same facilities, new fittings of the line intensities for the ΔP=0 and ΔP=1 series of transitions have been performed. Here P=5v1+3v2+5v3+v4+v5 is a polyad number, where v1, v2, v3, v4, and v5 are the principal quantum numbers of the acetylene harmonic oscillators. These new sets of effective dipole moment parameters were used to generate the line list, which contains the line positions and intensities of 39 and 29 bands, respectively, for the ΔP=0 and ΔP=1 series of transitions. None of these bands is present in the HITRAN 2012 [8] and GEISA 2015 [9] databases. This paper presents the first part of a global work on the validation of the Tomsk calculations.
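
The polyad number defined in the abstract is a simple weighted sum of the vibrational quantum numbers:

```python
def polyad_number(v1, v2, v3, v4, v5):
    """Polyad number P = 5*v1 + 3*v2 + 5*v3 + v4 + v5 for 12C2H2,
    where v1..v5 are the principal harmonic-oscillator quantum numbers."""
    return 5 * v1 + 3 * v2 + 5 * v3 + v4 + v5
```

A ΔP=1 transition then connects states whose polyad numbers differ by one, e.g. the bending fundamental (v4=1, P=1) relative to the ground state (P=0).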

  2. A hybrid SVM-FFA method for prediction of monthly mean global solar radiation

    NASA Astrophysics Data System (ADS)

    Shamshirband, Shahaboddin; Mohammadi, Kasra; Tong, Chong Wen; Zamani, Mazdak; Motamedi, Shervin; Ch, Sudheer

    2016-07-01

In this study, a hybrid support vector machine-firefly optimization algorithm (SVM-FFA) model is proposed to estimate monthly mean horizontal global solar radiation (HGSR). The merit of SVM-FFA is assessed statistically by comparing its performance with three previously used approaches. Using each approach and long-term measured HGSR, three models are calibrated by considering different sets of meteorological parameters measured for Bandar Abbass, situated in Iran. It is found that model (3), utilizing the combination of relative sunshine duration, difference between maximum and minimum temperatures, relative humidity, water vapor pressure, average temperature, and extraterrestrial solar radiation, shows superior performance based upon all approaches. Moreover, the extraterrestrial radiation is introduced as a significant parameter to accurately estimate the global solar radiation. The survey results reveal that the developed SVM-FFA approach is highly capable of providing favorable predictions with significantly higher precision than other examined techniques. For the SVM-FFA (3), the statistical indicators of mean absolute percentage error (MAPE), root mean square error (RMSE), relative root mean square error (RRMSE), and coefficient of determination (R2) are 3.3252%, 0.1859 kWh/m2, 3.7350%, and 0.9737, respectively, which according to the RRMSE indicates an excellent performance. As a further evaluation of SVM-FFA (3), the ratio of estimated to measured values is computed; 47 out of 48 months considered as testing data fall between 0.90 and 1.10. Also, by performing a further verification, it is concluded that SVM-FFA (3) offers absolute superiority over the empirical models using relatively similar input parameters. In a nutshell, the hybrid SVM-FFA approach would be considered highly efficient to estimate the HGSR.
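
The four statistical indicators reported for SVM-FFA (3) have conventional definitions, which can be sketched as follows (function name is illustrative, not from the paper):

```python
import math

def solar_metrics(measured, predicted):
    """MAPE, RMSE, RRMSE and R^2 under their conventional definitions."""
    n = len(measured)
    errors = [p - m for m, p in zip(measured, predicted)]
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    mape = 100.0 * sum(abs(e) / m for e, m in zip(errors, measured)) / n
    mean_m = sum(measured) / n
    rrmse = 100.0 * rmse / mean_m          # RMSE relative to the mean, in %
    ss_res = sum(e * e for e in errors)    # residual sum of squares
    ss_tot = sum((m - mean_m) ** 2 for m in measured)
    r2 = 1.0 - ss_res / ss_tot             # coefficient of determination
    return mape, rmse, rrmse, r2
```

An RRMSE below 10% is commonly taken to indicate excellent model performance, which is the criterion the abstract appears to invoke.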

  3. Neural network river forecasting through baseflow separation and binary-coded swarm optimization

    NASA Astrophysics Data System (ADS)

    Taormina, Riccardo; Chau, Kwok-Wing; Sivakumar, Bellie

    2015-10-01

The inclusion of expert knowledge in data-driven streamflow modeling is expected to yield more accurate estimates of river quantities. Modular models (MMs) designed to work on different parts of the hydrograph are preferred ways to implement such an approach. Previous studies have suggested that better predictions of total streamflow could be obtained via modular Artificial Neural Networks (ANNs) trained to perform an implicit baseflow separation. These MMs fit separately the baseflow and excess flow components as produced by a digital filter, and reconstruct the total flow by adding these two signals at the output. The optimization of the filter parameters and ANN architectures is carried out through global search techniques. Despite the favorable premises, the real effectiveness of such MMs has been tested only on a few case studies, and the quality of the baseflow separation they perform has never been thoroughly assessed. In this work, we compare the performance of MMs against global models (GMs) for nine different gaging stations in the northern United States. Binary-coded swarm optimization is employed for the identification of filter parameters and model structure, while Extreme Learning Machines, instead of ANNs, are used to drastically reduce the large computational times required to perform the experiments. The results show no evidence that MMs outperform GMs for predicting the total flow. In addition, the baseflow produced by the MMs largely underestimates the actual baseflow component expected for most of the considered gages. This occurs because the values of the filter parameters maximizing overall accuracy do not reflect the geological characteristics of the river basins. The results indeed show that setting the filter parameters according to expert knowledge results in accurate baseflow separation but lower accuracy of total flow predictions, suggesting that these two objectives are intrinsically conflicting rather than compatible.

  4. Validation of a global scale to assess the quality of interprofessional teamwork in mental health settings.

    PubMed

    Tomizawa, Ryoko; Yamano, Mayumi; Osako, Mitue; Hirabayashi, Naotugu; Oshima, Nobuo; Sigeta, Masahiro; Reeves, Scott

    2017-12-01

    Few scales currently exist to assess the quality of interprofessional teamwork through team members' perceptions of working together in mental health settings. The purpose of this study was to revise and validate an interprofessional scale to assess the quality of teamwork in inpatient psychiatric units and to use it multi-nationally. A literature review was undertaken to identify evaluative teamwork tools and develop an additional 12 items to ensure a broad global focus. Focus group discussions considered adaptation to different care systems using subjective judgements from 11 participants in a pre-test of items. Data quality, construct validity, reproducibility, and internal consistency were investigated in the survey using an international comparative design. Exploratory factor analysis yielded five factors with 21 items: 'patient/community centred care', 'collaborative communication', 'interprofessional conflict', 'role clarification', and 'environment'. High overall internal consistency, reproducibility, adequate face validity, and reasonable construct validity were shown in the USA and Japan. The revised Collaborative Practice Assessment Tool (CPAT) is a valid measure to assess the quality of interprofessional teamwork in psychiatry and identifies the best strategies to improve team performance. Furthermore, the revised scale will generate more rigorous evidence for collaborative practice in psychiatry internationally.

  5. The Utility of Local Anesthesia for Neurosurgical Interventions in a Low-Resource Setting: A Case Series.

    PubMed

    Eaton, Jessica; Hanif, Asma Bilal; Mzumara, Suzgisam; Charles, Anthony

    2018-05-01

Trauma is a major contributor to global morbidity and mortality, and injury to the central nervous system is the most common cause of death in these patients. While the provision of surgical services is being recognized as essential to global public health efforts, specialty areas such as neurosurgery remain overlooked. This is a retrospective case review of patients with operable lesions, such as extra-axial hematomas and unstable depressed skull fractures, who underwent neurosurgical interventions under local anesthesia. A total of 13 patients underwent neurosurgical intervention under local anesthesia: two and three patients underwent burr-hole decompression of epidural and subdural hematomas, respectively; seven patients had elevation of depressed skull fractures; and one patient had aspiration of a brain abscess. All patients survived, with or without residual neurological deficits. Access to the resources and staff required to deliver general anesthesia is challenging in resource-poor settings. We have therefore begun performing emergent interventions under local anesthesia, with or without conscious sedation. While some patients had minor residual weakness after the procedure, the degree of neurological deficit was improved from that observed before the procedure in all patients.

  6. Alternative Approaches to Land Initialization for Seasonal Precipitation and Temperature Forecasts

    NASA Technical Reports Server (NTRS)

    Koster, Randal; Suarez, Max; Liu, Ping; Jambor, Urszula

    2004-01-01

    The seasonal prediction system of the NASA Global Modeling and Assimilation Office is used to generate ensembles of summer forecasts utilizing realistic soil moisture initialization. To derive the realistic land states, we drive offline the system's land model with realistic meteorological forcing over the period 1979-1993 (in cooperation with the Global Land Data Assimilation System project at GSFC) and then extract the state variables' values on the chosen forecast start dates. A parallel series of forecast ensembles is performed with a random (though climatologically consistent) set of land initial conditions; by comparing the two sets of ensembles, we can isolate the impact of land initialization on forecast skill from that of the imposed SSTs. The base initialization experiment is supplemented with several forecast ensembles that use alternative initialization techniques. One ensemble addresses the impact of minimizing climate drift in the system through the scaling of the initial conditions, and another is designed to isolate the importance of the precipitation signal from that of all other signals in the antecedent offline forcing. A third ensemble includes a more realistic initialization of the atmosphere along with the land initialization. The impact of each variation on forecast skill is quantified.

  7. Development of low-cost devices for image-guided photodynamic therapy treatment of oral cancer in global health settings

    NASA Astrophysics Data System (ADS)

    Liu, Hui; Rudd, Grant; Daly, Liam; Hempstead, Joshua; Liu, Yiran; Khan, Amjad P.; Mallidi, Srivalleesha; Thomas, Richard; Rizvi, Imran; Arnason, Stephen; Cuckov, Filip; Hasan, Tayyaba; Celli, Jonathan P.

    2016-03-01

Photodynamic therapy (PDT) is a light-based modality that shows promise for adaptation and implementation as a cancer treatment technology in resource-limited settings. In this context PDT is particularly well suited for treatment of pre-cancer and early-stage malignancy of the oral cavity, which present a major global health challenge, but for which light delivery can be achieved without major infrastructure requirements. In recent reports we demonstrated that a prototype low-cost battery-powered 635 nm LED light source for ALA-PpIX PDT achieves tumoricidal efficacy in vitro and in vivo, comparable to a commercial turn-key laser source. Here, building on these reports, we describe the further development of a prototype PDT device to enable intraoral light delivery, designed for ALA-PDT treatment of precancerous and cancerous lesions of the oral cavity. We evaluate light delivery via fiber bundles and customized 3D-printed light applicators for flexible delivery to lesions of varying size and position within the oral cavity. We also briefly address performance requirements (output power, stability, and light delivery) and present validation of the device for ALA-PDT treatment in monolayer squamous carcinoma cell cultures.

  8. The Global Seismographic Network (GSN): Deployment of Next Generation VBB Borehole Sensors and Improving Overall Network Noise Performance

    NASA Astrophysics Data System (ADS)

    Hafner, K.; Davis, P.; Wilson, D.; Sumy, D.

    2017-12-01

The Global Seismographic Network (GSN) recently received delivery of the next-generation Very Broadband (VBB) borehole sensors purchased through funding from the DOE. Deployment of these sensors will be underway during the late summer and fall of 2017, and they will eventually replace the aging KS54000 sensors at approximately one-third of the GSN network stations. We will present the latest methods of deploying these sensors in the existing deep boreholes. At some sites, emplacement in shallow boreholes might result in lower noise performance for the existing site conditions. In some cases shallow borehole installations may be adapted to vault stations (which make up two-thirds of the network) as a means of reducing tilt-induced signals on the horizontal components. The GSN is creating a prioritized list of equipment upgrades at selected stations with the ultimate goal of optimizing overall network data availability and noise performance. For an overview of the performance of the current GSN relative to a selected set of metrics, we are utilizing data quality metrics and Probability Density Functions (PDFs) generated by the IRIS Data Management Center's (DMC) MUSTANG (Modular Utility for Statistical Knowledge Gathering) and LASSO (Latest Assessment of Seismic Station Observations) tools. We will present our metric analysis of GSN performance in 2016, and show the improvements at GSN sites resulting from recent instrumentation and infrastructure upgrades.

  9. Advances in infrastructure support for flat panel display manufacturing

    NASA Astrophysics Data System (ADS)

    Bardsley, James N.; Ciesinski, Michael F.; Pinnel, M. Robert

    1997-07-01

    The success of the US display industry, both in providing high-performance displays for the US Department of Defense at reasonable cost and in capturing a significant share of the global civilian market, depends on maintaining technological leadership and on building efficient manufacturing capabilities. The US Display Consortium (USDC) was set up in 1993 by the US Government and private industry to guide the development of the infrastructure needed to support the manufacturing of flat panel displays. This mainly involves the supply of equipment and materials, but also includes the formation of partnerships and the training of a skilled labor force. Examples are given of successful development projects, some involving USDC participation, others through independent efforts of its member companies. These examples show that US-based companies can achieve leadership positions in this young and rapidly growing global market.

  10. Exceptional suffering? Enumeration and vernacular accounting in the HIV-positive experience.

    PubMed

    Benton, Adia

    2012-01-01

Drawing on 17 months of ethnographic fieldwork in Freetown, Sierra Leone, I highlight the recursive relationship between Sierra Leone as an exemplary setting and HIV as an exceptional disease. Through this relationship, I examine how HIV-positive individuals rely on both enumerative knowledge (seroprevalence rates) and vernacular accounting (NGO narratives of vulnerability) to communicate the uniqueness of their experience as HIV sufferers and to demarcate the boundaries of their status. Various observers' enumerative and vernacular accounts of Sierra Leone's decade-long civil conflict, coupled with global health accounts of HIV as exceptional, reveal the calculus of power through which global health projects operate. The contradictions between the exemplary and the exceptional, and the accompanying tension between quantitative and qualitative facts, are mutually constituted in performances and claims made by HIV-positive individuals themselves.

  11. God: Do I have your attention?

    PubMed

    Colzato, Lorenza S; van Beest, Ilja; van den Wildenberg, Wery P M; Scorolli, Claudia; Dorchin, Shirley; Meiran, Nachshon; Borghi, Anna M; Hommel, Bernhard

    2010-10-01

    Religion is commonly defined as a set of rules, developed as part of a culture. Here we provide evidence that practice in following these rules systematically changes the way people attend to visual stimuli, as indicated by the individual sizes of the global precedence effect (better performance to global than to local features). We show that this effect is significantly reduced in Calvinism, a religion emphasizing individual responsibility, and increased in Catholicism and Judaism, religions emphasizing social solidarity. We also show that this effect is long-lasting (still affecting baptized atheists) and that its size systematically varies as a function of the amount and strictness of religious practices. These findings suggest that religious practice induces particular cognitive-control styles that induce chronic, directional biases in the control of visual attention. Copyright 2010 Elsevier B.V. All rights reserved.

  12. Frequency-independent radiation modes of interior sound radiation: Experimental study and global active control

    NASA Astrophysics Data System (ADS)

    Hesse, C.; Papantoni, V.; Algermissen, S.; Monner, H. P.

    2017-08-01

    Active control of structural sound radiation is a promising technique to overcome the poor passive acoustic isolation performance of lightweight structures in the low-frequency region. Active structural acoustic control commonly aims at the suppression of the far-field radiated sound power. This paper is concerned with the active control of sound radiation into acoustic enclosures. Experimental results of a coupled rectangular plate-fluid system under stochastic excitation are presented. The amplitudes of the frequency-independent interior radiation modes are determined in real-time using a set of structural vibration sensors, for the purpose of estimating their contribution to the acoustic potential energy in the enclosure. This approach is validated by acoustic measurements inside the cavity. Utilizing a feedback control approach, a broadband reduction of the global acoustic response inside the enclosure is achieved.

  13. Numerical Parameter Optimization of the Ignition and Growth Model for HMX Based Plastic Bonded Explosives

    NASA Astrophysics Data System (ADS)

    Gambino, James; Tarver, Craig; Springer, H. Keo; White, Bradley; Fried, Laurence

    2017-06-01

    We present a novel method for optimizing parameters of the Ignition and Growth (I&G) reactive flow model for high explosives. The I&G model can yield accurate predictions of experimental observations. However, calibrating the model is a time-consuming task, especially with multiple experiments. In this study, we couple the differential evolution global optimization algorithm to simulations of shock initiation experiments in the multi-physics code ALE3D. We develop parameter sets for the HMX-based explosives LX-07 and LX-10. The optimization finds the I&G model parameters that globally minimize the difference between calculated and experimental shock time of arrival at embedded pressure gauges. This work was performed under the auspices of the U.S. DOE by LLNL under contract DE-AC52-07NA27344. LLNS, LLC LLNL-ABS-724898.
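    The optimization loop described here can be sketched with SciPy's `differential_evolution`. The arrival-time function, gauge depths, and parameter bounds below are invented stand-ins for the ALE3D shock-initiation simulation and the I&G parameters, purely for illustration:

```python
import numpy as np
from scipy.optimize import differential_evolution

# Gauge depths (mm) and a toy arrival-time model standing in for an ALE3D
# shock-initiation simulation -- both are invented for illustration.
gauge_depths = np.array([1.0, 2.0, 3.0, 4.0])

def simulated_arrival(params, depths):
    a, b = params
    return depths / (a + b * depths)  # toy kinematics, not the real I&G model

# Synthetic "experimental" arrival times generated from known parameters
true_params = (3.0, 0.5)
t_exp = simulated_arrival(true_params, gauge_depths)

def misfit(params):
    # Squared difference between calculated and measured shock arrival times
    return np.sum((simulated_arrival(params, gauge_depths) - t_exp) ** 2)

result = differential_evolution(misfit, bounds=[(0.1, 10.0), (0.0, 2.0)], seed=1)
print(result.x)  # recovers values near (3.0, 0.5)
```

    In the real workflow each misfit evaluation would launch an ALE3D run, so the population-based search is typically parallelized across simulations.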

  14. Reframing undergraduate medical education in global health: Rationale and key principles from the Bellagio Global Health Education Initiative.

    PubMed

    Peluso, Michael J; van Schalkwyk, Susan; Kellett, Anne; Brewer, Timothy F; Clarfield, A Mark; Davies, David; Garg, Bishan; Greensweig, Tobin; Hafler, Janet; Hou, Jianlin; Maley, Moira; Mayanja-Kizza, Harriet; Pemba, Senga; Jenny Samaan, Janette; Schoenbaum, Stephen; Sethia, Babulal; Uribe, Juan Pablo; Margolis, Carmi Z; Rohrbaugh, Robert M

    2017-06-01

    Global health education (GHE) continues to be a growing initiative in many medical schools across the world. This focus is no longer limited to participants from high-income countries and has expanded to institutions and students from low- and middle-income settings. With this shift has come a need to develop meaningful curricula through engagement between educators and learners who represent the sending institutions and the diverse settings in which GHE takes place. The Bellagio Global Health Education Initiative (BGHEI) was founded to create a space for such debate and discussion and to generate guidelines towards a universal curriculum for global health. In this article, we describe the development and process of our work and outline six overarching principles that ought to be considered when adopting an inclusive approach to GHE curriculum development.

  15. Multi objective climate change impact assessment using multi downscaled climate scenarios

    NASA Astrophysics Data System (ADS)

    Rana, Arun; Moradkhani, Hamid

    2016-04-01

    Global Climate Model (GCM) projections are often downscaled to provide climatic parameters at regional scales. In the present study, we analyzed changes in precipitation and temperature for the future scenario period 2070-2099 relative to the historical period 1970-2000, using a set of statistically downscaled GCM projections for the Columbia River Basin (CRB). The analysis uses two different statistically downscaled climate products, namely the Bias Correction and Spatial Downscaling (BCSD) technique generated at Portland State University and the Multivariate Adaptive Constructed Analogs (MACA) technique generated at the University of Idaho, totaling 40 different scenarios. Spatial, temporal and frequency-based parameters were analyzed for the future period at a scale of 1/16th degree over the entire CRB. Results indicate varied spatial patterns of change across the basin, especially in its western part. At temporal scales, winter precipitation shows higher variability than summer, and vice versa for temperature. Frequency analysis provided insights into possible explanations for the changes in precipitation.
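    The future-minus-historical comparison across a downscaled ensemble can be sketched with NumPy. The gridded precipitation arrays below are random stand-ins for the 40 BCSD/MACA scenarios, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical downscaled annual precipitation (mm) on a small grid, with
# dimensions (scenario, year, lat, lon) -- placeholders for the 40 projections.
hist = rng.gamma(shape=4.0, scale=200.0, size=(40, 31, 8, 8))       # 1970-2000
fut = 1.1 * rng.gamma(shape=4.0, scale=200.0, size=(40, 30, 8, 8))  # 2070-2099

# Per-scenario change in the time-mean field, then ensemble statistics
hist_mean = hist.mean(axis=1)                  # (scenario, lat, lon)
delta = fut.mean(axis=1) - hist_mean
pct_change = 100.0 * delta / hist_mean

# Ensemble mean and spread summarize agreement across the 40 scenarios
print(pct_change.mean(), pct_change.std())
```

    The same pattern (average over the time axis, difference the periods, then aggregate over the scenario axis) extends directly to temperature and to per-season subsets.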

  16. Aerosol and gamma background measurements at Basic Environmental Observatory Moussala

    NASA Astrophysics Data System (ADS)

    Angelov, Christo; Arsov, Todor; Penev, Ilia; Nikolova, Nina; Kalapov, Ivo; Georgiev, Stefan

    2016-03-01

    Transboundary and local pollution, global climate change and cosmic rays are the main areas of research performed at the regional Global Atmospheric Watch (GAW) station Moussala BEO (2925 m a.s.l., 42°10'45'' N, 23°35'07'' E). Real-time measurements and observations are performed in the fields of atmospheric chemistry and physics. Complex information about the aerosol is obtained using a three-wavelength integrating nephelometer for measuring the scattering and backscattering coefficients, a continuous light-absorption photometer and a scanning mobility particle sizer. The system for measuring radioactivity and heavy metals in aerosols allows us to monitor large-scale radioactive aerosol transport. Measurements of the gamma background and the gamma-ray spectrum in the air near Moussala peak are carried out in real time. The HYSPLIT back-trajectory model is used to determine the origin of the registered air masses. DREAM code calculations [2] are used to forecast the air mass trajectory. The information obtained, combined with a full set of corresponding meteorological parameters, is transmitted via a high-frequency radio telecommunication system to the Internet.

  17. Statistical classification of drug incidents due to look-alike sound-alike mix-ups.

    PubMed

    Wong, Zoie Shui Yee

    2016-06-01

    It has been recognised that medication names that look or sound similar are a cause of medication errors. This study builds statistical classifiers for identifying medication incidents due to look-alike sound-alike mix-ups. A total of 227 patient safety incident advisories related to medication were obtained from the Canadian Patient Safety Institute's Global Patient Safety Alerts system. Eight feature selection strategies based on frequent terms, frequent drug terms and constituent terms were applied. Statistical text classifiers based on logistic regression, support vector machines with linear, polynomial, radial-basis and sigmoid kernels, and decision trees were trained and tested. The models achieved an average accuracy above 0.8 across all model settings. The receiver operating characteristic curves indicated that the classifiers performed reasonably well. The results suggest that statistical text classification is a feasible method for identifying medication incidents due to look-alike sound-alike mix-ups based on a database of advisories from Global Patient Safety Alerts. © The Author(s) 2014.
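    A minimal version of such a text classifier can be sketched with scikit-learn. The advisory snippets and labels below are invented examples, not records from Global Patient Safety Alerts:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented advisory snippets (label 1 = look-alike sound-alike mix-up)
texts = [
    "hydroxyzine dispensed instead of hydralazine due to similar name",
    "confusion between celebrex and celexa look alike packaging",
    "zantac selected for zyrtec sound alike drug names",
    "dose calculation error led to tenfold overdose",
    "scheduled medication omitted during shift change",
    "wrong infusion rate programmed on the pump",
]
labels = [1, 1, 1, 0, 0, 0]

# Term-frequency features + logistic regression, one of the model families tested
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)

pred = clf.predict(["mix up between lamictal and lamisil look alike names"])
print(pred)  # → [1]
```

    The study's other model settings swap the final estimator for SVMs with different kernels or a decision tree; the pipeline structure stays the same.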

  18. Efficient visibility encoding for dynamic illumination in direct volume rendering.

    PubMed

    Kronander, Joel; Jönsson, Daniel; Löw, Joakim; Ljung, Patric; Ynnerman, Anders; Unger, Jonas

    2012-03-01

    We present an algorithm that enables real-time dynamic shading in direct volume rendering using general lighting, including directional lights, point lights, and environment maps. Real-time performance is achieved by encoding local and global volumetric visibility using spherical harmonic (SH) basis functions stored in an efficient multiresolution grid over the extent of the volume. Our method enables high-frequency shadows in the spatial domain, but is limited to a low-frequency approximation of visibility and illumination in the angular domain. In a first pass, level of detail (LOD) selection in the grid is based on the current transfer function setting. This enables rapid online computation and SH projection of the local spherical distribution of visibility information. Using a piecewise integration of the SH coefficients over the local regions, the global visibility within the volume is then computed. By representing the light sources using their SH projections, the integral over lighting, visibility, and isotropic phase functions can be efficiently computed during rendering. The utility of our method is demonstrated in several examples showing the generality and interactive performance of the approach.
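    The core operation of projecting a spherical visibility function onto a low-order SH basis can be sketched as a Monte Carlo integration. The two-band real SH basis and the hemisphere visibility function below are illustrative simplifications of the paper's multiresolution scheme, not its implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 20000

# Uniform random directions on the unit sphere
z = rng.uniform(-1.0, 1.0, N)
phi = rng.uniform(0.0, 2.0 * np.pi, N)
r = np.sqrt(1.0 - z**2)
x, y = r * np.cos(phi), r * np.sin(phi)

def sh_basis(x, y, z):
    # Real spherical harmonics for bands l = 0 and l = 1 (Cartesian form)
    return np.stack([
        0.5 * np.sqrt(1.0 / np.pi) * np.ones_like(z),  # Y_0^0
        np.sqrt(3.0 / (4.0 * np.pi)) * y,              # Y_1^-1
        np.sqrt(3.0 / (4.0 * np.pi)) * z,              # Y_1^0
        np.sqrt(3.0 / (4.0 * np.pi)) * x,              # Y_1^1
    ])

# Toy visibility: upper hemisphere open, lower hemisphere fully occluded
V = (z > 0.0).astype(float)

# Monte Carlo projection: c_i = (4*pi / N) * sum(V * Y_i)
coeffs = (4.0 * np.pi / N) * (sh_basis(x, y, z) @ V)
print(coeffs)  # c_00 ≈ sqrt(pi) ≈ 1.77, c_10 ≈ 1.53, odd-symmetry terms ≈ 0
```

    Shading then reduces to dot products between such visibility coefficient vectors and the SH projections of the light sources, which is what makes the per-frame cost low.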

  19. Global metabolic profiling procedures for urine using UPLC-MS.

    PubMed

    Want, Elizabeth J; Wilson, Ian D; Gika, Helen; Theodoridis, Georgios; Plumb, Robert S; Shockcor, John; Holmes, Elaine; Nicholson, Jeremy K

    2010-06-01

    The production of 'global' metabolite profiles involves measuring low molecular-weight metabolites (<1 kDa) in complex biofluids/tissues to study perturbations in response to physiological challenges, toxic insults or disease processes. Information-rich analytical platforms, such as mass spectrometry (MS), are needed. Here we describe the application of ultra-performance liquid chromatography-MS (UPLC-MS) to urinary metabolite profiling, including sample preparation, stability/storage and the selection of chromatographic conditions that balance metabolome coverage, chromatographic resolution and throughput. We discuss quality control and metabolite identification, as well as provide details of multivariate data analysis approaches for analyzing such MS data. Using this protocol, the analysis of a sample set in 96-well plate format would take ca. 30 h, including 1 h for system setup, 1-2 h for sample preparation, 24 h for UPLC-MS analysis and 1-2 h for initial data processing. The use of UPLC-MS for metabolic profiling in this way is not faster than the conventional HPLC-based methods but, because of improved chromatographic performance, provides superior metabolome coverage.

  20. A model of global citizenship: antecedents and outcomes.

    PubMed

    Reysen, Stephen; Katzarska-Miller, Iva

    2013-01-01

    As the world becomes increasingly interconnected, exposure to global cultures affords individuals opportunities to develop global identities. In two studies, we examine the antecedents and outcomes of identifying with a superordinate identity--global citizen. Global citizenship is defined as awareness, caring, and embracing cultural diversity while promoting social justice and sustainability, coupled with a sense of responsibility to act. Prior theory and research suggest that being aware of one's connection with others in the world (global awareness) and embedded in settings that value global citizenship (normative environment) lead to greater identification with global citizens. Furthermore, theory and research suggest that when global citizen identity is salient, greater identification is related to adherence to the group's content (i.e., prosocial values and behaviors). Results of the present set of studies showed that global awareness (knowledge and interconnectedness with others) and one's normative environment (friends and family support global citizenship) predicted identification with global citizens, and global citizenship predicted prosocial values of intergroup empathy, valuing diversity, social justice, environmental sustainability, intergroup helping, and a felt responsibility to act for the betterment of the world. The relationship between antecedents (normative environment and global awareness) and outcomes (prosocial values) was mediated by identification with global citizens. We discuss the relationship between the present results and other research findings in psychology, the implications of global citizenship for other academic domains, and future avenues of research. Global citizenship highlights the unique effect of taking a global perspective on a multitude of topics relevant to the psychology of everyday actions, environments, and identity.

  1. Global data set of biogenic VOC emissions calculated by the MEGAN model over the last 30 years

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sindelarova, K.; Granier, Claire; Bouarar, I.

    The Model of Emissions of Gases and Aerosols from Nature (MEGANv2.1) together with the Modern-Era Retrospective Analysis for Research and Applications (MERRA) meteorological fields were used to create a global emission dataset of biogenic VOCs available on a monthly basis for the period 1980-2010. This dataset is called MEGAN-MACC. The model estimated a mean annual total BVOC emission of 760 Tg(C) yr⁻¹, consisting of isoprene (70%), monoterpenes (11%), methanol (6%), acetone (3%), sesquiterpenes (2.5%) and other BVOC species each contributing less than 2%. Several sensitivity model runs were performed to study the impact of different model inputs and model settings on isoprene estimates, resulting in differences of ±17% of the reference isoprene total. A greater impact was observed for the sensitivity run applying a parameterization of soil moisture deficit, which led to a 50% reduction of isoprene emissions on a global scale, most significantly in specific regions of Africa, South America and Australia. MEGAN-MACC estimates are comparable to results of previous studies. A more detailed comparison with other isoprene inventories indicated significant spatial and temporal differences between the datasets, especially for Australia, Southeast Asia and South America. MEGAN-MACC estimates of isoprene and α-pinene showed a reasonable agreement with surface flux measurements in the Amazon, and the model was able to capture the seasonal variation of emissions in this region.

  2. Evaluating the Credibility of Transport Processes in Simulations of Ozone Recovery using the Global Modeling Initiative Three-dimensional Model

    NASA Technical Reports Server (NTRS)

    Strahan, Susan E.; Douglass, Anne R.

    2004-01-01

    The Global Modeling Initiative (GMI) has integrated two 36-year simulations of an ozone recovery scenario with an offline chemistry and transport model using two different meteorological inputs. Physically based diagnostics, derived from satellite and aircraft data sets, are described and then used to evaluate the realism of temperature and transport processes in the simulations. Processes evaluated include barrier formation in the subtropics and polar regions, and extratropical wave-driven transport. Some diagnostics are especially relevant to simulation of lower stratospheric ozone, but most are applicable to any stratospheric simulation. The global temperature evaluation, which is relevant to gas-phase chemical reactions, showed that both sets of meteorological fields have near-climatological values at all latitudes and seasons at 30 hPa and below. Both simulations showed weakness in upper stratospheric wave driving. The simulation using input from a general circulation model (GMI(GCM)) showed a very good residual circulation in the tropics and Northern Hemisphere. The simulation with input from a data assimilation system (GMI(DAS)) performed better in the midlatitudes than it did at high latitudes. Neither simulation forms a realistic barrier at the vortex edge, leading to uncertainty in the fate of ozone-depleted vortex air. Overall, tracer transport in the offline GMI(GCM) has greater fidelity throughout the stratosphere than it does in the GMI(DAS).

  3. Developing core elements and checklist items for global hospital antimicrobial stewardship programmes: a consensus approach.

    PubMed

    Pulcini, C; Binda, F; Lamkang, A S; Trett, A; Charani, E; Goff, D A; Harbarth, S; Hinrichsen, S L; Levy-Hara, G; Mendelson, M; Nathwani, D; Gunturu, R; Singh, S; Srinivasan, A; Thamlikitkul, V; Thursky, K; Vlieghe, E; Wertheim, H; Zeng, M; Gandra, S; Laxminarayan, R

    2018-04-03

    With increasing global interest in hospital antimicrobial stewardship (AMS) programmes, there is a strong demand for core elements of AMS to be clearly defined on the basis of principles of effectiveness and affordability. To date, efforts to identify such core elements have been limited to Europe, Australia, and North America. The aim of this study was to develop a set of core elements and their related checklist items for AMS programmes that should be present in all hospitals worldwide, regardless of resource availability. A literature review was performed by searching Medline and relevant websites to retrieve a list of core elements and items that could have global relevance. These core elements and items were evaluated by an international group of AMS experts using a structured modified Delphi consensus procedure, using two-phased online in-depth questionnaires. The literature review identified seven core elements and their related 29 checklist items from 48 references. Fifteen experts from 13 countries in six continents participated in the consensus procedure. Ultimately, all seven core elements were retained, as well as 28 of the initial checklist items plus one that was newly suggested, all with ≥80% agreement; 20 elements and items were rephrased. This consensus on core elements for hospital AMS programmes is relevant to both high- and low-to-middle-income countries and could facilitate the development of national AMS stewardship guidelines and adoption by healthcare settings worldwide. Copyright © 2018 European Society of Clinical Microbiology and Infectious Diseases. All rights reserved.

  4. Relative performance of academic departments using DEA with sensitivity analysis.

    PubMed

    Tyagi, Preeti; Yadav, Shiv Prasad; Singh, S P

    2009-05-01

    The process of liberalization and globalization of the Indian economy has brought new opportunities and challenges in all areas of human endeavor, including education. Educational institutions have to adopt new strategies to make the best use of the opportunities and counter the challenges. One of these challenges is how to assess the performance of academic programs based on multiple criteria. Keeping this in view, this paper evaluates the performance efficiencies of 19 academic departments of IIT Roorkee (India) through the data envelopment analysis (DEA) technique. The technique has been used to assess the performance of academic institutions in a number of countries such as the USA, UK and Australia, but to the best of our knowledge this is its first application in the Indian context. Applying DEA models, we calculate technical, pure technical and scale efficiencies and identify the reference sets for inefficient departments. Input and output projections are also suggested for inefficient departments to reach the frontier. Overall performance, research performance and teaching performance are assessed separately using sensitivity analysis.
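    The input-oriented CCR envelopment model underlying a DEA study of this kind can be sketched as one linear program per department. The single-input, single-output data below are invented for illustration, not the IIT Roorkee figures:

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: one input (faculty) and one output (publications) per department
X = np.array([[5.0], [10.0], [8.0]])    # inputs, shape (n_dmu, n_inputs)
Y = np.array([[10.0], [10.0], [16.0]])  # outputs, shape (n_dmu, n_outputs)
n = len(X)

def ccr_efficiency(o):
    # Input-oriented CCR: min theta s.t. sum_j lam_j x_j <= theta * x_o,
    # sum_j lam_j y_j >= y_o, lam >= 0.  Variables: [theta, lam_1..lam_n].
    c = np.r_[1.0, np.zeros(n)]
    A_in = np.hstack([-X[o].reshape(-1, 1), X.T])       # inputs <= theta*x_o
    A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])  # outputs >= y_o
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[1]), -Y[o]],
                  bounds=[(0, None)] * (n + 1))
    return res.fun  # efficiency score theta in (0, 1]

effs = [ccr_efficiency(o) for o in range(n)]
print([round(e, 3) for e in effs])  # → [1.0, 0.5, 1.0]
```

    Departments with θ = 1 lie on the frontier; for the others, the optimal λ weights identify the reference set and the projections needed to reach it.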

  5. Evaluation of various LandFlux evapotranspiration algorithms using the LandFlux-EVAL synthesis benchmark products and observational data

    NASA Astrophysics Data System (ADS)

    Michel, Dominik; Hirschi, Martin; Jimenez, Carlos; McCabe, Mathew; Miralles, Diego; Wood, Eric; Seneviratne, Sonia

    2014-05-01

    Research on climate variations and the development of predictive capabilities largely rely on globally available reference data series of the different components of the energy and water cycles. Several efforts have aimed at producing large-scale and long-term reference data sets of these components, e.g. based on in situ observations and remote sensing, in order to allow for diagnostic analyses of the drivers of temporal variations in the climate system. Evapotranspiration (ET) is an essential component of the energy and water cycle, which cannot be monitored directly on a global scale by remote sensing techniques. In recent years, several global multi-year ET data sets have been derived from remote sensing-based estimates, observation-driven land surface model simulations or atmospheric reanalyses. The LandFlux-EVAL initiative presented an ensemble-evaluation of these data sets over the time periods 1989-1995 and 1989-2005 (Mueller et al. 2013). Currently, a multi-decadal global reference heat flux data set for ET at the land surface is being developed within the LandFlux initiative of the Global Energy and Water Cycle Experiment (GEWEX). This LandFlux v0 ET data set comprises four ET algorithms forced with a common radiation and surface meteorology. In order to estimate the agreement of the LandFlux v0 ET data with existing data sets, it is compared to the recently available LandFlux-EVAL synthesis benchmark product. Additional evaluation of the LandFlux v0 ET data set is based on a comparison to in situ observations of a weighing lysimeter from the hydrological research site Rietholzbach in Switzerland. These analyses serve as a test bed for similar evaluation procedures that are envisaged for ESA's WACMOS-ET initiative (http://wacmoset.estellus.eu). Reference: Mueller, B., Hirschi, M., Jimenez, C., Ciais, P., Dirmeyer, P. A., Dolman, A. J., Fisher, J. B., Jung, M., Ludwig, F., Maignan, F., Miralles, D. G., McCabe, M. 
F., Reichstein, M., Sheffield, J., Wang, K., Wood, E. F., Zhang, Y., and Seneviratne, S. I. (2013). Benchmark products for land evapotranspiration: LandFlux-EVAL multi-data set synthesis. Hydrology and Earth System Sciences, 17(10): 3707-3720.

  6. Production of long-term global water vapor and liquid water data set using ultra-fast methods to assimilate multi-satellite and radiosonde observations

    NASA Technical Reports Server (NTRS)

    Vonderhaar, T. H.; Reinke, Donald L.; Randel, David L.; Stephens, Graeme L.; Combs, Cynthia L.; Greenwald, Thomas J.; Ringerud, Mark A.; Wittmeyer, Ian L.

    1993-01-01

    During the next decade, many programs and experiments under the Global Energy and Water Cycle Experiment (GEWEX) will utilize present day and future data sets to improve our understanding of the role of moisture in climate, and its interaction with other variables such as clouds and radiation. An important element of GEWEX will be the GEWEX Water Vapor Project (GVaP), which will eventually initiate a routine, real-time assimilation of the highest quality, global water vapor data sets including information gained from future data collection systems, both ground and space based. The comprehensive global water vapor data set being produced by METSAT Inc. uses a combination of ground-based radiosonde data, and infrared and microwave satellite retrievals. This data is needed to provide the desired foundation from which future GEWEX-related research, such as GVaP, can build. The first year of this project was designed to use a combination of the best available atmospheric moisture data, including radiosonde (balloon/acft/rocket), HIRS/MSU (TOVS) retrievals, and SSM/I retrievals, to produce a one-year, global, high resolution data set of integrated column water vapor (precipitable water) with a horizontal resolution of 1 degree and a temporal resolution of one day. The time period of this pilot product was to be determined by the availability of all the input data sets; January 1988 through December 1988 was selected. In addition, a sample of vertically integrated liquid water content (LWC) was to be produced with the same temporal and spatial parameters. This sample was to be produced over ocean areas only. Three main steps are followed to produce a merged water vapor and liquid water product. Input data from radiosondes, TOVS, and SSM/I are quality-checked in steps one and two. Processing is done in step two to generate individual total column water vapor and liquid water data sets. 
The third step, and final processing task, involves merging the individual output products to produce the integrated water vapor product. A final quality control is applied to the merged data sets.
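    The merge step described above can be sketched as a per-cell combination of whatever sources are valid. The 1-degree grids, fill values, and coverage fractions below are hypothetical placeholders for the radiosonde, TOVS, and SSM/I products, not the METSAT processing:

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (180, 360)  # 1-degree global grid

# Hypothetical daily total-column water vapor grids (mm); NaN = no retrieval
raob = np.full(shape, np.nan)
raob[::10, ::10] = 25.0                                  # sparse sonde network
tovs = np.where(rng.random(shape) < 0.7, 24.0, np.nan)   # IR, gaps under cloud
ssmi = np.where(rng.random(shape) < 0.5, 26.0, np.nan)   # microwave retrievals

# Merge: average whichever sources are valid in each 1-degree cell
merged = np.nanmean(np.stack([raob, tovs, ssmi]), axis=0)
coverage = np.isfinite(merged).mean()
print(f"merged daily coverage: {coverage:.0%}")
```

    A production merge would weight sources by their error characteristics rather than averaging equally; the grid-stacking structure is the same either way.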

  7. Translating Globalization Theories into Educational Research: Thoughts on Recent Shifts in Holocaust Education

    ERIC Educational Resources Information Center

    Macgilchrist, Felicitas; Christophe, Barbara

    2011-01-01

    Much educational research on globalization aims to prepare students to be successful citizens in a global society. We propose a set of three concepts, drawing on systems theory (Nassehi, Stichweh) and theories of the subject (Butler, Foucault), to think the global which enables educational research to step back from hegemonic discourses and…

  8. The Global Targeting of Education and Skill: Policy History and Comparative Perspectives

    ERIC Educational Resources Information Center

    King, Kenneth

    2016-01-01

    This analysis covers the period from 1925 to 2016 in respect of constructing national and global goals and targets in education and training. Tensions between global and national approaches to target-setting are identified. Equally, the ownership of the global target discourse is discussed along with its contested relevance to both developed and…

  9. Ambiguity resolution in precise point positioning with hourly data for global single receiver

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaohong; Li, Pan; Guo, Fei

    2013-01-01

    Integer ambiguity resolution (IAR) can significantly improve precise point positioning (PPP) performance, and IAR for PPP has become a prominent topic in the global positioning system (GPS) community, with substantial progress made in recent years. In this paper, we investigate and demonstrate the performance of a global zero-differenced (ZD) PPP IAR service for GPS users by providing routine ZD uncalibrated fractional offsets (UFOs) for the wide-lane and narrow-lane. Data sets from all IGS stations collected on DOY 1, 100, 200 and 300 of 2010 are used to validate and demonstrate this global service. Static experiment results show that an accuracy better than 1 cm in the horizontal and 1-2 cm in the vertical can be achieved in the ambiguity-fixed PPP solution with only hourly data. Compared with the PPP float solution, the average improvement reaches 58.2% in east, 28.3% in north and 23.8% in vertical for all tested stations. Kinematic experiments show that the RMS of kinematic PPP solutions can be improved from 21.6, 16.6 and 37.7 mm to 12.2, 13.3 and 34.3 mm for the fixed solutions in the east, north and vertical components, respectively. Both static and kinematic experiments show that wide-lane and narrow-lane UFO products for all satellites can be generated and provided routinely, accompanying satellite orbit and clock products, enabling PPP users anywhere in the world to obtain accurate and reliable ambiguity-fixed solutions.
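    The quoted kinematic improvements follow directly from the RMS values; a quick check of the percentage reductions:

```python
import numpy as np

# RMS of kinematic PPP solutions quoted above (mm): east, north, vertical
rms_float = np.array([21.6, 16.6, 37.7])
rms_fixed = np.array([12.2, 13.3, 34.3])

improvement_pct = 100.0 * (rms_float - rms_fixed) / rms_float
print(improvement_pct)  # roughly 43.5%, 19.9% and 9.0% reductions
```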

  10. Towards defining interprofessional competencies for global health education: drawing on educational frameworks and the experience of the UW-Madison Global Health Institute.

    PubMed

    Brown, Lori DiPrete

    2014-12-01

    The experience and lessons to date from the University of Wisconsin-Madison Global Health Institute's global health programs, considered together with more recently published competency frameworks related to global health practice, can provide important insights into the development of a core set of interprofessional competencies for global health that can be used across disciplines and professions. © 2014 American Society of Law, Medicine & Ethics, Inc.

  11. Global lightning studies

    NASA Technical Reports Server (NTRS)

    Goodman, Steven J.; Wright, Pat; Christian, Hugh; Blakeslee, Richard; Buechler, Dennis; Scharfen, Greg

    1991-01-01

    Global lightning signatures were analyzed from the DMSP Optical Linescan System (OLS) imagery archived at the National Snow and Ice Data Center. Analysis will transition to the digital archive as it becomes available, allowing comparison of annual, interannual, and seasonal variations with other global data sets. An initial survey of the quality of the existing film archive was completed and lightning signatures were digitized for the summer months of 1986 to 1987. The relationships studied are between: (1) global and regional lightning activity and rainfall, and (2) storm electrical development and environment. Remote sensing data sets obtained from field programs are used in conjunction with satellite/radar/lightning data to develop and improve precipitation estimation algorithms, and to provide a better understanding of the co-evolving electrical, microphysical, and dynamical structure of storms.

  12. How Can a Global Social Support System Hope to Achieve Fairer Competitiveness? Comment on "A Global Social Support System: What the International Community Could Learn From the United States' National Basketball Association".

    PubMed

    Goldblatt, Peter

    2015-12-25

    Ooms et al set out some good general principles for a global social support system to improve fairer global competitiveness as a result of redistribution. This commentary summarizes some of the conditions that would need to be satisfied for such a social support system to level up gradients in inequality, using the National Basketball Association (NBA) example as a point of reference. From this, the minimal conditions required for the support system proposed by Ooms et al to succeed are described. © 2016 by Kerman University of Medical Sciences.

  13. Paediatric Palliative Care in Resource-Poor Countries

    PubMed Central

    Boucher, Sue; Daniels, Alex; Nkosi, Busi

    2018-01-01

    There is a great need for paediatric palliative care (PPC) services globally, but access to services is lacking in many parts of the world, particularly in resource-poor settings. Globally it is estimated that 21.6 million children need access to palliative care, with 8.2 million needing specialist services. Palliative care has been identified as important within the global health agenda, e.g. within universal health coverage, and a recent Lancet commission report recognised the need for PPC. However, a variety of challenges to PPC development globally have been identified, such as: access to treatment, access to medications such as oral morphine, opiophobia, a lack of trained health and social care professionals, a lack of PPC policies and a lack of awareness about PPC. These challenges can be overcome using a variety of strategies, including advocacy and public awareness, education, access to medications, implementation and research. Examples impacting the provision of PPC in resource-poor settings will be discussed. High-quality PPC services can be provided in resource-poor settings, and there is an urgent need to scale up affordable, accessible, and quality PPC services globally to ensure that all children needing palliative care can access it. PMID:29463065

  14. Mapping Impervious Surfaces Globally at 30m Resolution Using Global Land Survey Data

    NASA Technical Reports Server (NTRS)

    DeColstoun, Eric Brown; Huang, Chengquan; Tan, Bin; Smith, Sarah Elizabeth; Phillips, Jacqueline; Wang, Panshi; Ling, Pui-Yu; Zhan, James; Li, Sike; Taylor, Michael P.

    2013-01-01

    Impervious surfaces, mainly artificial structures and roads, cover less than 1% of the world's land surface (1.3% over USA). Regardless of the relatively small coverage, impervious surfaces have a significant impact on the environment. They are the main source of the urban heat island effect, and affect not only the energy balance, but also hydrology and carbon cycling, and both land and aquatic ecosystem services. In the last several decades, the pace of converting natural land surface to impervious surfaces has increased. Quantitatively monitoring the growth of impervious surface expansion and associated urbanization has become a priority topic across both the physical and social sciences. The recent availability of consistent, global scale data sets at 30m resolution such as the Global Land Survey from the Landsat satellites provides an unprecedented opportunity to map global impervious cover and urbanization at this resolution for the first time, with high detail and accuracy. Moreover, the spatial resolution of Landsat is absolutely essential to accurately resolve urban targets such as buildings, roads and parking lots. With long term GLS data now available for the 1975, 1990, 2000, 2005 and 2010 time periods, the land cover/use changes due to urbanization can now be quantified at this spatial scale as well. In the Global Land Survey - Imperviousness Mapping Project (GLS-IMP), we are producing the first global 30 m spatial resolution impervious cover data set. We have processed the GLS 2010 data set to surface reflectance (8500+ TM and ETM+ scenes) and are using a supervised classification method based on a regression tree to produce continental scale impervious cover data sets. A very large set of accurate training samples is the key to the supervised classifications and is being derived through the interpretation of high spatial resolution (approx. 
2 m or less) commercial satellite data (Quickbird and Worldview2) available to us through the unclassified archive of the National Geospatial Intelligence Agency (NGA). For each continental area several million training pixels are derived by analysts using image segmentation algorithms and tools and then aggregated to the 30m resolution of Landsat. Here we will discuss the production/testing of this massive data set for Europe, North and South America and Africa, including assessments of the 2010 surface reflectance data. This type of analysis is only possible because of the availability of long term 30m data sets from GLS and shows much promise for integration of Landsat 8 data in the future.

  15. Mapping Impervious Surfaces Globally at 30m Resolution Using Landsat Global Land Survey Data

    NASA Astrophysics Data System (ADS)

    Brown de Colstoun, E.; Huang, C.; Wolfe, R. E.; Tan, B.; Tilton, J.; Smith, S.; Phillips, J.; Wang, P.; Ling, P.; Zhan, J.; Xu, X.; Taylor, M. P.

    2013-12-01

    Impervious surfaces, mainly artificial structures and roads, cover less than 1% of the world's land surface (1.3% over the USA). Despite this relatively small coverage, impervious surfaces have a significant impact on the environment. They are the main source of the urban heat island effect, and they affect not only the energy balance but also hydrology, carbon cycling, and both land and aquatic ecosystem services. In the last several decades, the pace of converting natural land surface to impervious surfaces has increased. Quantitatively monitoring impervious surface expansion and the associated urbanization has become a priority topic across both the physical and social sciences. The recent availability of consistent, global-scale data sets at 30 m resolution, such as the Global Land Survey (GLS) from the Landsat satellites, provides the opportunity to map global impervious cover and urbanization at this resolution for the first time, with unprecedented detail and accuracy. Moreover, the spatial resolution of Landsat is essential to accurately resolve urban targets such as buildings, roads and parking lots. With long-term GLS data now available for the 1975, 1990, 2000, 2005 and 2010 time periods, the land cover/use changes due to urbanization can now be quantified at this spatial scale as well. In the Global Land Survey - Imperviousness Mapping Project (GLS-IMP), we are producing the first global 30 m spatial resolution impervious cover data set. We have processed the GLS 2010 data set to surface reflectance (8500+ TM and ETM+ scenes) and are using a supervised classification method based on a regression tree to produce continental-scale impervious cover data sets. A very large set of accurate training samples is the key to the supervised classifications and is being derived through the interpretation of high spatial resolution (~2 m or less) commercial satellite data (Quickbird and Worldview2) available to us through the unclassified archive of the National Geospatial Intelligence Agency (NGA). For each continental area several million training pixels are derived by analysts using image segmentation algorithms and tools and then aggregated to the 30 m resolution of Landsat. Here we will discuss the production and testing of this massive data set for Europe, North and South America and Africa, including assessments of the 2010 surface reflectance data. This type of analysis is only possible because of the availability of long-term 30 m data sets from GLS and shows much promise for integration of Landsat 8 data in the future.
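
    The tree-based supervised classification described above can be sketched in miniature. This is not the GLS-IMP production code: the reflectance values are synthetic, and a depth-1 "stump" stands in for the full regression tree, but the split-selection logic is the same least-squares criterion a regression tree uses at each node.

```python
import numpy as np

# Toy sketch of tree-based impervious-cover classification. All reflectance
# values are synthetic; a depth-1 stump stands in for the full regression tree.
rng = np.random.default_rng(0)

# Synthetic "training pixels": mean surface reflectance per 30 m pixel.
impervious = rng.normal(0.25, 0.04, 1000)  # bright built surfaces (assumed)
pervious = rng.normal(0.10, 0.04, 1000)    # darker vegetation/soil (assumed)
X = np.concatenate([impervious, pervious])
y = np.concatenate([np.ones(1000), np.zeros(1000)])  # 1 = impervious

def fit_stump(x, labels):
    """Pick the split threshold minimizing within-leaf squared error."""
    best_t, best_sse = None, np.inf
    for t in np.linspace(x.min(), x.max(), 256):
        left, right = labels[x <= t], labels[x > t]
        if left.size == 0 or right.size == 0:
            continue
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_t, best_sse = t, sse
    return best_t

threshold = fit_stump(X, y)
pred = (X > threshold).astype(float)  # brighter side classified impervious
accuracy = (pred == y).mean()
```

    In the real project this split search is repeated recursively over many spectral and temporal features, with the millions of interpreter-derived training pixels providing the labels.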

  16. Survey Definitions of Gout for Epidemiologic Studies: Comparison With Crystal Identification as the Gold Standard.

    PubMed

    Dalbeth, Nicola; Schumacher, H Ralph; Fransen, Jaap; Neogi, Tuhina; Jansen, Tim L; Brown, Melanie; Louthrenoo, Worawit; Vazquez-Mellado, Janitzia; Eliseev, Maxim; McCarthy, Geraldine; Stamp, Lisa K; Perez-Ruiz, Fernando; Sivera, Francisca; Ea, Hang-Korng; Gerritsen, Martijn; Scire, Carlo A; Cavagna, Lorenzo; Lin, Chingtsai; Chou, Yin-Yi; Tausche, Anne-Kathrin; da Rocha Castelar-Pinheiro, Geraldo; Janssen, Matthijs; Chen, Jiunn-Horng; Cimmino, Marco A; Uhlig, Till; Taylor, William J

    2016-12-01

    To identify the best-performing survey definition of gout from items commonly available in epidemiologic studies. Survey definitions of gout were identified from 34 epidemiologic studies contributing to the Global Urate Genetics Consortium (GUGC) genome-wide association study. Data from the Study for Updated Gout Classification Criteria (SUGAR) were randomly divided into development and test data sets. A data-driven case definition was formed using logistic regression in the development data set. This definition, along with the definitions used in GUGC studies and the 2015 American College of Rheumatology (ACR)/European League Against Rheumatism (EULAR) gout classification criteria, was applied to the test data set, using monosodium urate crystal identification as the gold standard. Of all tested GUGC definitions, the simple definition of "self-report of gout or urate-lowering therapy use" had the best test performance characteristics (sensitivity 82%, specificity 72%). The simple definition had similar performance to a SUGAR data-driven case definition with 5 weighted items: self-report, self-report of doctor diagnosis, colchicine use, urate-lowering therapy use, and hyperuricemia (sensitivity 87%, specificity 70%). Both of these definitions performed better than the 1977 American Rheumatism Association survey criteria (sensitivity 82%, specificity 67%). Of all tested definitions, the 2015 ACR/EULAR criteria had the best performance (sensitivity 92%, specificity 89%). A simple definition of "self-report of gout or urate-lowering therapy use" has the best test performance characteristics of existing definitions that use routinely available data. A more complex combination of features is more sensitive, but still lacks good specificity. If a more accurate case definition is required for a particular study, the 2015 ACR/EULAR gout classification criteria should be considered. © 2016, American College of Rheumatology.
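
    The test characteristics reported above reduce to simple counts against the crystal gold standard. A minimal sketch of how the composite definition and its sensitivity/specificity are computed (the example records are invented, not SUGAR data):

```python
# Sketch with invented records (not SUGAR data): evaluate the composite
# definition "self-report of gout OR urate-lowering therapy (ULT) use"
# against a monosodium urate crystal gold standard.
def sensitivity_specificity(pred, gold):
    tp = sum(1 for p, g in zip(pred, gold) if p and g)
    tn = sum(1 for p, g in zip(pred, gold) if not p and not g)
    fn = sum(1 for p, g in zip(pred, gold) if not p and g)
    fp = sum(1 for p, g in zip(pred, gold) if p and not g)
    return tp / (tp + fn), tn / (tn + fp)

# (self_report, ult_use, crystal_proven) -- illustrative records only
records = [
    (True, False, True), (False, True, True), (False, False, True),
    (True, False, False), (False, False, False), (False, False, False),
]
pred = [sr or ult for sr, ult, _ in records]
gold = [g for _, _, g in records]
sens, spec = sensitivity_specificity(pred, gold)
```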

  17. Model-data fusion across ecosystems: from multisite optimizations to global simulations

    NASA Astrophysics Data System (ADS)

    Kuppel, S.; Peylin, P.; Maignan, F.; Chevallier, F.; Kiely, G.; Montagnani, L.; Cescatti, A.

    2014-11-01

    This study uses a variational data assimilation framework to simultaneously constrain a global ecosystem model with eddy covariance measurements of daily net ecosystem exchange (NEE) and latent heat (LE) fluxes from a large number of sites grouped into seven plant functional types (PFTs). It is an attempt to bridge the gap between the numerous site-specific parameter optimization works found in the literature and the generic parameterization used by most land surface models within each PFT. The present multisite approach allows the derivation of PFT-generic sets of optimized parameters that enhance the agreement between measured and simulated fluxes at most of the sites considered, with performance often comparable to that of the corresponding site-specific optimizations. Besides reducing the PFT-averaged model-data root-mean-square difference (RMSD) and the associated daily output uncertainty, the optimization improves the simulated CO2 balance at tropical and temperate forest sites. The major site-level NEE adjustments at the seasonal scale are reduced amplitude in C3 grasslands and boreal forests, increased seasonality in temperate evergreen forests, and better model-data phasing in temperate deciduous broadleaf forests. Conversely, the poorer performance in tropical evergreen broadleaf forests points to deficiencies in the modelling of phenology and soil water stress for this PFT. An evaluation with data-oriented estimates of photosynthesis (GPP, gross primary productivity) and ecosystem respiration (Reco) rates indicates distinctively improved simulations of both gross fluxes. The multisite parameter sets are then tested against CO2 concentrations measured at 53 locations around the globe, showing significant adjustments of the modelled seasonality of atmospheric CO2 concentration, whose relevance seems PFT-dependent, along with an improved interannual variability.
Lastly, a global-scale evaluation with remote sensing NDVI (normalized difference vegetation index) measurements indicates an improvement of the simulated seasonal variations of the foliar cover for all considered PFTs.
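
    The RMSD metric that the optimization reduces is straightforward to state; here is a minimal sketch with synthetic flux values (the numbers are illustrative, not outputs of the study):

```python
import numpy as np

# Minimal sketch of the model-data root-mean-square difference (RMSD) used
# as the fit metric. All flux values below are illustrative, not from the study.
def rmsd(model, obs):
    model, obs = np.asarray(model), np.asarray(obs)
    return float(np.sqrt(np.mean((model - obs) ** 2)))

obs_nee = np.array([-2.1, -1.5, 0.3, 1.0])    # daily NEE "observations"
prior = np.array([-3.0, -2.0, 0.8, 1.6])      # model, generic PFT parameters
posterior = np.array([-2.3, -1.4, 0.4, 1.1])  # model, optimized parameters

improvement = rmsd(prior, obs_nee) - rmsd(posterior, obs_nee)
```

    The assimilation adjusts the parameters so that the posterior simulation lowers this score across all sites of a PFT simultaneously, rather than site by site.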

  18. Regional Ventilation Changes in the Lung: Treatment Response Mapping by Using Hyperpolarized Gas MR Imaging as a Quantitative Biomarker.

    PubMed

    Horn, Felix C; Marshall, Helen; Collier, Guilhem J; Kay, Richard; Siddiqui, Salman; Brightling, Christopher E; Parra-Robles, Juan; Wild, Jim M

    2017-09-01

    Purpose To assess the magnitude of regional response to respiratory therapeutic agents in the lungs by using treatment response mapping (TRM) with hyperpolarized gas magnetic resonance (MR) imaging. TRM was used to quantify regional physiologic response in adults with asthma who underwent a bronchodilator challenge. Materials and Methods This study was approved by the national research ethics committee and was performed with informed consent. Imaging was performed in 20 adult patients with asthma by using hyperpolarized helium-3 (3He) ventilation MR imaging. Two sets of baseline images were acquired before inhalation of a bronchodilating agent (salbutamol 400 μg), and one set was acquired after. All images were registered for voxelwise comparison. Regional treatment response, ΔR(r), was calculated as the difference in regional gas distribution (R[r] = ratio of inhaled gas to total volume of a voxel when normalized for lung inflation volume) before and after intervention. A voxelwise activation threshold from the variability of the baseline images was applied to ΔR(r) maps. The summed global treatment response map (ΔRnet) was then used as a global lung index for comparison with metrics of bronchodilator response measured by using spirometry and the global imaging metric percentage ventilated volume (%VV). Results ΔRnet showed significant correlation (P < .01) with changes in forced expiratory volume in 1 second (r = 0.70), forced vital capacity (r = 0.84), and %VV (r = 0.56). A significant (P < .01) positive treatment effect was detected with all metrics; however, ΔRnet showed a lower intersubject coefficient of variation (64%) than all of the other tests (coefficient of variation, ≥99%). Conclusion TRM provides regional quantitative information on changes in inhaled gas ventilation in response to therapy. This method could be used as a sensitive regional outcome metric for novel respiratory interventions. © RSNA, 2017. Online supplemental material is available for this article.
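
    The voxelwise mapping described can be sketched with numpy. This is a hedged reconstruction, not the authors' pipeline: the array size, noise level, and the 2-sigma activation rule below are our own assumptions; only the structure (normalize, difference, threshold from baseline repeatability, sum) follows the text.

```python
import numpy as np

# Hedged sketch of treatment response mapping: R(r) is each voxel's share of
# total inhaled gas, dR = post - pre, and voxels below a baseline-variability
# threshold are zeroed before summing dR_net. Sizes/noise are assumptions.
rng = np.random.default_rng(1)

shape = (8, 8, 8)
pre1 = rng.gamma(2.0, 1.0, shape)                 # baseline ventilation image 1
pre2 = pre1 + rng.normal(0, 0.1, shape)           # baseline image 2 (repeat)
post = pre1 * rng.uniform(1.0, 1.5, shape)        # after bronchodilator

def regional_distribution(img):
    return img / img.sum()  # each voxel as a fraction of total inhaled gas

# Activation threshold from baseline repeatability (2-sigma rule assumed).
baseline_noise = regional_distribution(pre2) - regional_distribution(pre1)
threshold = 2.0 * baseline_noise.std()

dR = regional_distribution(post) - regional_distribution(pre1)
dR[np.abs(dR) < threshold] = 0.0   # keep only supra-threshold responses
dR_net = dR.sum()                  # summed global treatment response
```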

  19. Review of comparative LCAs of food waste management systems - Current status and potential improvements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernstad, A., E-mail: anna.bernstad@chemeng.lth.se; Cour Jansen, J. la

    2012-12-15

    Highlights: • GHG emissions from different treatment alternatives vary largely in 25 reviewed comparative LCAs of bio-waste management. • System-boundary settings often vary largely between the reviewed studies. • Existing LCA guidelines give varying recommendations in relation to several key issues. - Abstract: Twenty-five comparative life cycle assessments (LCAs) addressing food waste treatment were reviewed, covering the treatment alternatives landfill, thermal treatment, composting (small and large scale) and anaerobic digestion. The global warming potential attributed to these treatment alternatives varies largely amongst the studies. Large differences in the setting of system boundaries, methodological choices and the input data used were seen between the studies. A number of internal contradictions were also identified, often resulting in biased comparisons between alternatives. Thus, the observed differences in global warming potential are found to result not from actual differences in the environmental impacts of the studied systems, but rather from differences in how the studies were performed. A number of key issues with high impact on the overall global warming potential of different food waste treatment alternatives were identified through one-way sensitivity analyses applied to a previously performed LCA of food waste management. Assumptions related to the characteristics of the treated waste; losses and emissions of carbon, nutrients and other compounds during collection, storage and pretreatment; potential energy recovery through combustion; emissions from composting; emissions from storage and land use of bio-fertilizers and chemical fertilizers; and the eco-profiles of substituted goods were all identified as highly relevant to the outcomes of such comparisons. As the use of LCA in this area is likely to increase in coming years, it is highly relevant to establish more detailed guidelines in this field in order to increase both the general quality of assessments and the potential for cross-study comparisons.
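
    A one-way sensitivity analysis of the kind mentioned varies one assumption at a time while holding the rest at baseline. The toy global-warming-potential model below is invented for illustration (parameter names, values, and the linear form are not from the reviewed studies):

```python
# Illustrative one-way sensitivity analysis on a toy GWP model of a
# food-waste treatment chain. All parameters and coefficients are invented.
def gwp(methane_leak=0.05, energy_substitution=0.8, fertilizer_offset=0.3):
    # kg CO2-eq per tonne of waste: methane losses (GWP100 factor of 25)
    # minus credits for substituted energy and mineral fertilizer.
    return 100 * methane_leak * 25 - 50 * energy_substitution - 20 * fertilizer_offset

base = gwp()
ranges = {
    "methane_leak": (0.01, 0.10),
    "energy_substitution": (0.5, 1.0),
    "fertilizer_offset": (0.1, 0.5),
}
swings = {}
for name, (lo, hi) in ranges.items():
    low, high = gwp(**{name: lo}), gwp(**{name: hi})
    swings[name] = abs(high - low)   # output swing attributable to this input
most_influential = max(swings, key=swings.get)
```

    Ranking the swings (as in a tornado diagram) identifies which assumptions dominate the comparison and therefore most need harmonized guidelines.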

  20. Efficiency of extracting stereo-driven object motions

    PubMed Central

    Jain, Anshul; Zaidi, Qasim

    2013-01-01

    Most living things and many nonliving things deform as they move, requiring observers to separate object motions from object deformations. When the object is partially occluded, the task becomes more difficult because it is not possible to use two-dimensional (2-D) contour correlations (Cohen, Jain, & Zaidi, 2010). That leaves dynamic depth matching across the unoccluded views as the main possibility. We examined the role of stereo cues in extracting motion of partially occluded and deforming three-dimensional (3-D) objects, simulated by disk-shaped random-dot stereograms set at randomly assigned depths and placed uniformly around a circle. The stereo-disparities of the disks were temporally oscillated to simulate clockwise or counterclockwise rotation of the global shape. To dynamically deform the global shape, random disparity perturbation was added to each disk's depth on each stimulus frame. At low perturbation, observers reported rotation directions consistent with the global shape, even against local motion cues, but performance deteriorated at high perturbation. Using 3-D global shape correlations, we formulated an optimal Bayesian discriminator for rotation direction. Based on rotation discrimination thresholds, human observers were 75% as efficient as the optimal model, demonstrating that global shapes derived from stereo cues facilitate inferences of object motions. To complement reports of stereo and motion integration in extrastriate cortex, our results suggest the possibilities that disparity selectivity and feature tracking are linked, or that global motion selective neurons can be driven purely from disparity cues. PMID:23325345

  1. How Can a Global Social Support System Hope to Achieve Fairer Competiveness?

    PubMed Central

    Goldblatt, Peter

    2016-01-01

    Ooms et al set out some good general principles for a global social support system to achieve fairer global competitiveness through redistribution. This commentary summarizes some of the conditions that would need to be satisfied for such a social support system to level up gradients in inequality, using the National Basketball Association (NBA) example as a point of reference. From this, the minimal conditions are described that would be required for the support system proposed in the article by Ooms et al to succeed. PMID:26927594

  2. Predicting the seismic performance of typical R/C healthcare facilities: emphasis on hospitals

    NASA Astrophysics Data System (ADS)

    Bilgin, Huseyin; Frangu, Idlir

    2017-09-01

    Reinforced concrete (RC) buildings constitute an important part of the current building stock in earthquake-prone countries such as Albania. The seismic response of structures during a severe earthquake plays a vital role in the extent of structural damage and the resulting injuries and losses. In this context, this study evaluates the expected performance of a five-story RC healthcare facility, representative of common practice in Albania, designed according to older codes. The design was based on the code requirements used in this region during the mid-1980s. Non-linear static and dynamic time history analyses were conducted on the structural model using the Zeus NL computer program. The dynamic time history analysis was conducted with a set of ground motions from real earthquakes. The building responses were estimated at the global level. FEMA 356 criteria were used to predict the seismic performance of the building. Structural response measures such as the capacity curve and inter-story drift under the set of ground motions and the pushover analysis results were compared, and a detailed seismic performance assessment was carried out. The main aim of this study is to demonstrate the application of a methodology for the earthquake performance assessment of existing buildings. The seismic performance of the structural model varied significantly under different ground motions. Results indicate that the case-study building exhibits inadequate seismic performance under different seismic excitations. In addition, reasons for the poor performance of the building are discussed.

  3. Global interrupt and barrier networks

    DOEpatents

    Blumrich, Matthias A.; Chen, Dong; Coteus, Paul W.; Gara, Alan G.; Giampapa, Mark E; Heidelberger, Philip; Kopcsay, Gerard V.; Steinmacher-Burow, Burkhard D.; Takken, Todd E.

    2008-10-28

    A system and method for generating global asynchronous signals in a computing structure. In particular, a global interrupt and barrier network implements logic for generating global interrupt and barrier signals that control global asynchronous operations performed by processing elements at selected processing nodes of a computing structure in accordance with a processing algorithm; it physically interconnects the processing nodes for communicating the global interrupt and barrier signals to the elements via low-latency paths. The global asynchronous signals respectively initiate interrupt and barrier operations at the processing nodes at times selected to optimize performance of the processing algorithms. In one embodiment, the global interrupt and barrier network is implemented in a scalable, massively parallel supercomputing device structure comprising a plurality of processing nodes interconnected by multiple independent networks, with each node including one or more processing elements for performing computation or communication activity as required when performing parallel algorithm operations. One such independent network is a global tree network enabling high-speed global tree communications among global tree network nodes or sub-trees thereof. The global interrupt and barrier network may operate in parallel with the global tree network, providing global asynchronous sideband signals.
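
    The barrier semantics the patent describes in hardware, namely that no processing element proceeds past the synchronization point until all have arrived, can be illustrated with a software analogue. This thread-based sketch is purely didactic; the patented network achieves the same guarantee with dedicated low-latency sideband signals rather than shared memory.

```python
import threading

# Software analogue of a global barrier: all "nodes" must arrive before
# any may proceed. Illustrative only; the patent's mechanism is hardware.
N = 4
barrier = threading.Barrier(N)
order = []
lock = threading.Lock()

def node(rank):
    with lock:
        order.append(("compute", rank))  # local phase, any interleaving
    barrier.wait()                       # global synchronization point
    with lock:
        order.append(("after", rank))    # may only run once all have arrived

threads = [threading.Thread(target=node, args=(r,)) for r in range(N)]
for t in threads:
    t.start()
for t in threads:
    t.join()

computes = [i for i, (phase, _) in enumerate(order) if phase == "compute"]
afters = [i for i, (phase, _) in enumerate(order) if phase == "after"]
```

    The invariant checked below — every "compute" entry precedes every "after" entry — is exactly the property the global barrier network enforces across processing nodes.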

  4. Vienna FORTRAN: A FORTRAN language extension for distributed memory multiprocessors

    NASA Technical Reports Server (NTRS)

    Chapman, Barbara; Mehrotra, Piyush; Zima, Hans

    1991-01-01

    Exploiting the performance potential of distributed memory machines requires a careful distribution of data across the processors. Vienna FORTRAN is a language extension of FORTRAN which provides the user with a wide range of facilities for such mapping of data structures. However, programs in Vienna FORTRAN are written using global data references. Thus, the user has the advantage of a shared memory programming paradigm while explicitly controlling the placement of data. The basic features of Vienna FORTRAN are presented along with a set of examples illustrating the use of these features.

  5. Conjugate gradient heat bath for ill-conditioned actions.

    PubMed

    Ceriotti, Michele; Bussi, Giovanni; Parrinello, Michele

    2007-08-01

    We present a method for sampling from the Boltzmann distribution of an ill-conditioned quadratic action. The method is based on heat-bath thermalization along a set of conjugate directions, generated via a conjugate-gradient procedure. The resulting scheme outperforms local updates for matrices with very high condition numbers, since it avoids the slowing down of modes with low eigenvalues, and it has some advantages over the global heat-bath approach, compared to which it is more stable and allows more freedom in devising case-specific optimizations.
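
    A minimal numpy reconstruction of the idea (ours, under stated assumptions, not the authors' implementation): for a set of mutually A-conjugate directions d_i, independent Gaussian heat-bath moves along each direction yield samples with covariance A^{-1}, because A^{-1} = Σ_i d_i d_iᵀ / (d_iᵀ A d_i). Here the conjugate set is built by Gram-Schmidt conjugation of the standard basis rather than by the CG recursion itself.

```python
import numpy as np

# Sketch: sample from exp(-x^T A x / 2) by heat-bath moves along a full set
# of A-conjugate directions (Gram-Schmidt conjugation of the standard basis).
rng = np.random.default_rng(2)

n = 4
M = rng.normal(size=(n, n))
A = M @ M.T + n * np.eye(n)   # symmetric positive-definite action matrix

# Build mutually A-conjugate directions: d_i^T A d_j = 0 for i != j.
dirs = []
for k in range(n):
    d = np.eye(n)[k]
    for dj in dirs:
        d = d - (dj @ A @ d) / (dj @ A @ dj) * dj
    dirs.append(d)

def heat_bath_sample():
    # Independent Gaussian update along each conjugate direction.
    x = np.zeros(n)
    for d in dirs:
        x += rng.normal() / np.sqrt(d @ A @ d) * d
    return x

samples = np.array([heat_bath_sample() for _ in range(20000)])
emp_cov = samples.T @ samples / len(samples)   # should approximate inv(A)
```

    Because each conjugate mode is thermalized independently, the mixing time does not degrade with the condition number, which is the advantage over local updates claimed in the abstract.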

  6. Social motivation in Qatari schools and their relation to school achievement.

    PubMed

    Nasser, Ramzi

    2014-10-01

    This study assessed the relation between school-social motivation and student academic achievement. A factor analysis was performed on a set of school-social items selected a priori from three measures of school motivation: the Inventory of School Motivation, the General Achievement Goals Orientation Scale, and the Facilitating Conditions Scale. Three factors with fewer items represented Global Motivation, Peer Help, and Social Power. Hierarchical regression analysis showed social motivation measures were weak predictors of achievement scores in the various content areas. Findings are discussed in the context of Qatari education and culture.

  7. Modeling, Simulation, and Operations Analysis in Afghanistan and Iraq: Operational Vignettes, Lessons Learned, and a Survey of Selected Efforts

    DTIC Science & Technology

    2014-01-01

    For example, see DoD, Sustaining U.S. Global Leadership: Priorities for 21st Century Defense, January 2012; U.S. Joint Chiefs of Staff, 2011. ... projects whenever possible. And most of them recognized a need for a common set of tools and capabilities. Competence with the Microsoft Excel and

  8. Nuclear Data Needs for the Neutronic Design of MYRRHA Fast Spectrum Research Reactor

    NASA Astrophysics Data System (ADS)

    Stankovskiy, A.; Malambu, E.; Van den Eynde, G.; Díez, C. J.

    2014-04-01

    A global sensitivity analysis of the effective neutron multiplication factor (keff) to the change of nuclear data library has been performed. It revealed that the test version of the JEFF-3.2 neutron-induced evaluated data library produces results closer to ENDF/B-VII.1 than JEFF-3.1.2 does. Analysis of the contributions of individual evaluations to the keff sensitivity yielded a priority list of nuclides whose cross-section and fission neutron multiplicity uncertainties need to be reduced by setting up dedicated differential and integral experiments.

  9. Aircraft measurements of trace gases and particles near the tropopause

    NASA Technical Reports Server (NTRS)

    Falconer, P.; Pratt, R.; Detwiler, A.; Chen, C. S.; Hogan, A.; Bernard, S.; Krebschull, K.; Winters, W.

    1983-01-01

    Research activities which were performed using atmospheric constituent data obtained by the NASA Global Atmospheric Sampling Program are described. The characteristics of the particle size spectrum in various meteorological settings from a special collection of GASP data are surveyed. The relationship between humidity and cloud particles is analyzed. Climatological and case studies of tropical ozone distributions measured on a large number of flights are reported. Particle counter calibrations are discussed as well as the comparison of GASP particle data in the upper troposphere with other measurements at lower altitudes over the Pacific Ocean.

  10. Optimal design for robust control of uncertain flexible joint manipulators: a fuzzy dynamical system approach

    NASA Astrophysics Data System (ADS)

    Han, Jiang; Chen, Ye-Hwa; Zhao, Xiaomin; Dong, Fangfang

    2018-04-01

    A novel fuzzy dynamical system approach to the control design of flexible joint manipulators with mismatched uncertainty is proposed. Uncertainties of the system are assumed to lie within prescribed fuzzy sets. The desired system performance includes a deterministic phase and a fuzzy phase. First, by creatively implanting a fictitious control, a robust control scheme is constructed to render the system uniformly bounded and uniformly ultimately bounded. Both the manipulator model and the control scheme are deterministic, not based on heuristic IF-THEN rules. Next, a fuzzy-based performance index is proposed. An optimal design problem for a control design parameter is formulated as a constrained optimisation problem. The global solution to this problem can be obtained by solving two quartic equations. The fuzzy dynamical system approach is systematic and is able to assure the deterministic performance as well as to minimise the fuzzy performance index.

  11. Impact of Surface Roughness and Soil Texture on Mineral Dust Emission Fluxes Modeling

    NASA Technical Reports Server (NTRS)

    Menut, Laurent; Perez, Carlos; Haustein, Karsten; Bessagnet, Bertrand; Prigent, Catherine; Alfaro, Stephane

    2013-01-01

    Dust production models (DPM) used to estimate vertical fluxes of mineral dust aerosols over arid regions need accurate data on soil and surface properties. The Laboratoire Inter-Universitaire des Systemes Atmospheriques (LISA) data set was developed for Northern Africa, the Middle East, and East Asia. This regional data set was built through dedicated field campaigns and includes, among other variables, the aerodynamic roughness length, the smooth roughness length of the erodible fraction of the surface, and the dry (undisturbed) soil size distribution. Recently, satellite-derived roughness length and high-resolution soil texture data sets at the global scale have emerged and provide the opportunity for the use of advanced schemes in global models. This paper analyzes the behavior of the ERS satellite-derived global roughness length and the State Soil Geographic database - Food and Agriculture Organization of the United Nations (STATSGO-FAO) soil texture data set (based on wet techniques) using an advanced DPM in comparison to the LISA data set over Northern Africa and the Middle East. We explore the sensitivity of the drag partition scheme (a critical component of the DPM) and of the dust vertical fluxes (intensity and spatial patterns) to the roughness length and soil texture data sets. We also compare the use of the drag partition scheme to a widely used preferential source approach in global models. Idealized experiments with prescribed wind speeds show that the ERS and STATSGO-FAO data sets provide realistic spatial patterns of dust emission and friction velocity thresholds in the region. Finally, we evaluate a dust transport model for the period of March to July 2011 with observed aerosol optical depths from Aerosol Robotic Network sites. Results show that ERS and STATSGO-FAO provide realistic simulations in the region.

  12. Neonatal hypothermia in low-resource settings.

    PubMed

    Mullany, Luke C

    2010-12-01

    Hypothermia among newborns is considered an important contributor to neonatal morbidity and mortality in low-resource settings. However, in these settings only limited progress has been made towards understanding the risk of mortality after hypothermia, describing how this relationship is dependent on both the degree or severity of exposure and the gestational age and weight status of the baby, and implementing interventions to mitigate both exposure and the associated risk of poor outcomes. Given the centrality of averting neonatal mortality to achieving global milestones towards reductions in child mortality by 2015, recent years have seen substantial resources and efforts implemented to improve understanding of global epidemiology of neonatal health. In this article, a summary of the burden, consequences, and risk factors of neonatal hypothermia in low-resources settings is presented, with a particular focus on community-based data. Context-appropriate interventions for reducing hypothermia exposure and the role of these interventions in reducing global neonatal mortality burden are explored. Copyright © 2010 Elsevier Inc. All rights reserved.

  13. A space-based climatology of diurnal MLT tidal winds, temperatures and densities from UARS wind measurements

    NASA Astrophysics Data System (ADS)

    Svoboda, Aaron A.; Forbes, Jeffrey M.; Miyahara, Saburo

    2005-11-01

    A self-consistent global tidal climatology, useful for comparing and interpreting radar observations from different locations around the globe, is created from space-based Upper Atmosphere Research Satellite (UARS) horizontal wind measurements. The climatology created includes tidal structures for horizontal winds, temperature and relative density, and is constructed by fitting local (in latitude and height) UARS wind data at 95 km to a set of basis functions called Hough mode extensions (HMEs). These basis functions are numerically computed modifications to Hough modes and are globally self-consistent in wind, temperature, and density. We first demonstrate this self-consistency with a proxy data set from the Kyushu University General Circulation Model, and then use a linear weighted superposition of the HMEs obtained from monthly fits to the UARS data to extrapolate the global, multi-variable tidal structure. A brief explanation of the HMEs’ origin is provided as well as information about a public website that has been set up to make the full extrapolated data sets available.
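
    The fitting idea, representing observed structure as a weighted superposition of precomputed basis functions and recovering the weights by least squares, can be sketched compactly. The trigonometric basis below merely stands in for the numerically computed Hough mode extensions, and all amplitudes are invented:

```python
import numpy as np

# Hedged sketch: fit an observed latitudinal wind structure as a weighted
# superposition of basis functions (trig functions stand in for the HMEs).
lat = np.linspace(-80, 80, 33) * np.pi / 180.0
basis = np.column_stack([np.cos(lat), np.sin(2 * lat), np.cos(3 * lat)])

true_w = np.array([10.0, -4.0, 2.0])                 # "tidal" amplitudes, m/s
rng = np.random.default_rng(3)
obs = basis @ true_w + rng.normal(0, 0.3, len(lat))  # winds at 95 km + noise

# Recover the superposition weights by linear least squares.
w, *_ = np.linalg.lstsq(basis, obs, rcond=None)
reconstructed = basis @ w   # extrapolate the full latitudinal structure
```

    Because each real HME carries self-consistent wind, temperature, and density structure, weights fitted to winds alone suffice to extrapolate the other tidal fields, which is the key step the abstract validates with the Kyushu University model.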

  14. Global precipitation measurements for validating climate models

    NASA Astrophysics Data System (ADS)

    Tapiador, F. J.; Navarro, A.; Levizzani, V.; García-Ortega, E.; Huffman, G. J.; Kidd, C.; Kucera, P. A.; Kummerow, C. D.; Masunaga, H.; Petersen, W. A.; Roca, R.; Sánchez, J.-L.; Tao, W.-K.; Turk, F. J.

    2017-11-01

    The advent of global precipitation data sets with increasing temporal span has made it possible to use them for validating climate models. In order to fulfill the requirement of global coverage, existing products integrate satellite-derived retrievals from many sensors with direct ground observations (gauges, disdrometers, radars), which are used as reference for the satellites. While the resulting product can be deemed the best-available source of quality validation data, awareness of the limitations of such data sets is important to avoid drawing wrong or unsubstantiated conclusions when assessing climate model abilities. This paper provides guidance on the use of precipitation data sets for climate research, including model validation and verification for improving physical parameterizations. The strengths and limitations of the data sets for climate modeling applications are presented, and a protocol for quality assurance of both observational databases and models is discussed. The paper helps elaborate on the recent IPCC AR5 acknowledgment of large observational uncertainties in precipitation observations for climate model validation.

  15. CLIVAR-GSOP/GODAE Ocean Synthesis Inter-Comparison of Global Air-Sea Fluxes From Ocean and Coupled Reanalyses

    NASA Astrophysics Data System (ADS)

    Valdivieso, Maria

    2014-05-01

    The GODAE OceanView and CLIVAR-GSOP ocean synthesis program has been assessing the degree of consistency between global air-sea flux data sets obtained from ocean or coupled reanalyses (Valdivieso et al., 2014). So far, fifteen global air-sea heat flux products obtained from ocean or coupled reanalyses have been examined: seven are from low-resolution ocean reanalyses (BOM PEODAS, ECMWF ORAS4, JMA/MRI MOVEG2, JMA/MRI MOVECORE, Hamburg Univ. GECCO2, JPL ECCOv4, and NCEP GODAS), five are from eddy-permitting ocean reanalyses developed as part of the EU GMES MyOcean program (Mercator GLORYS2v1, Reading Univ. UR025.3, UR025.4, UKMO GloSea5, and CMCC C-GLORS), and the remaining three are coupled reanalyses based on coupled climate models (JMA/MRI MOVE-C, GFDL ECDA and NCEP CFSR). The global heat closure in the products over the period 1993-2009 spanned by all data sets is presented in comparison with observational and atmospheric reanalysis estimates. Then, global maps of ensemble spread in the seasonal cycle, and of the signal-to-noise ratio of interannual flux variability over the 17-yr common period, are shown to illustrate the consistency between the products. We have also studied regional variability in the products, particularly at the OceanSITES project locations (such as, for instance, the TAO/TRITON and PIRATA arrays in the Tropical Pacific and Atlantic, respectively). Comparisons are being made with other products such as OAFlux latent and sensible heat fluxes (Yu et al., 2008) combined with ISCCP satellite-based radiation (Zhang et al., 2004), the ship-based NOC2.0 product (Berry and Kent, 2009), the Large and Yeager (2009) hybrid flux dataset CORE.2, and two atmospheric reanalysis products, the ECMWF ERA-Interim reanalysis (referred to as ERAi, Dee et al., 2011) and the NCEP/DOE reanalysis R2 (referred to as NCEP-R2, Kanamitsu et al., 2002). Preliminary comparisons with the observational flux products from OceanSITES are also underway.
References:
Berry, D.I. and E.C. Kent (2009), A New Air-Sea Interaction Gridded Dataset from ICOADS with Uncertainty Estimates. Bull. Amer. Meteor. Soc., 90(5), 645-656, doi:10.1175/2008BAMS2639.1.
Dee, D.P., et al. (2011), The ERA-Interim reanalysis: configuration and performance of the data assimilation system. Q. J. R. Meteorol. Soc., 137, 553-597, doi:10.1002/qj.828.
Kanamitsu, M., W. Ebisuzaki, J. Woollen, S.-K. Yang, J.J. Hnilo, M. Fiorino, and G. Potter (2002), NCEP-DOE AMIP-II reanalysis (R-2). Bull. Amer. Meteor. Soc., 83, 1631-1643.
Large, W. and S. Yeager (2009), The global climatology of an interannually varying air-sea flux data set. Clim. Dyn., 33, 341-364.
Valdivieso, M., and co-authors (2014), Heat fluxes from ocean and coupled reanalyses. CLIVAR Exchanges, Issue 64.
Yu, L., X. Jin, and R.A. Weller (2008), Multidecade Global Flux Datasets from the Objectively Analyzed Air-sea Fluxes (OAFlux) Project: Latent and Sensible Heat Fluxes, Ocean Evaporation, and Related Surface Meteorological Variables. Technical Report OAFlux Project (OA2008-01), Woods Hole Oceanographic Institution.
Zhang, Y., W.B. Rossow, A.A. Lacis, V. Oinas, and M.I. Mishchenko (2004), Calculation of radiative fluxes from the surface to top of atmosphere based on ISCCP and other global data sets. J. Geophys. Res., 109(D19).

  16. Global optimization methods for engineering design

    NASA Technical Reports Server (NTRS)

    Arora, Jasbir S.

    1990-01-01

The problem is to find a global minimum for Problem P. Necessary and sufficient conditions are available for local optimality; however, a global solution can be assured only under the assumption of convexity of the problem. If the constraint set S is compact and the cost function is continuous on it, the existence of a global minimum is guaranteed. However, since no global optimality conditions are available, a global solution can be found only by an exhaustive search to satisfy the Inequality. The exhaustive search can be organized so that the entire design space need not be searched, which somewhat reduces the computational burden. It is concluded that the zooming algorithm for global optimization appears to be a good alternative to stochastic methods, although more testing is needed and a general, robust, and efficient local minimizer is required. IDESIGN, which is based on a sequential quadratic programming algorithm, was used in all numerical calculations. Since the feasible set keeps shrinking, a good algorithm for finding an initial feasible point is required; such algorithms need to be developed and evaluated.
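The zooming strategy can be sketched with a toy implementation (hypothetical code, not the IDESIGN/SQP machinery of the report): after each local minimum with value f*, the algorithm demands an improvement target f(x) <= f* - gamma*|f*|, and accepts the incumbent as the approximate global minimum when no feasible point meeting the target can be found.

```python
import random

def local_min_1d(f, x, step=0.1, tol=1e-6):
    # Simple derivative-free local descent; a stand-in for a real local
    # minimizer such as the SQP-based IDESIGN mentioned in the abstract.
    while step > tol:
        if f(x + step) < f(x):
            x += step
        elif f(x - step) < f(x):
            x -= step
        else:
            step *= 0.5
    return x

def zooming_global_min(f, sample, gamma=0.05, tries=200):
    # Zooming: each accepted improvement shrinks the feasible set by the
    # constraint f(x) <= f(best) - gamma*|f(best)|. When no sampled local
    # minimum satisfies the target, the incumbent is returned.
    best = local_min_1d(f, sample())
    while True:
        target = f(best) - gamma * max(abs(f(best)), 1e-8)
        found = None
        for _ in range(tries):
            x = local_min_1d(f, sample())
            if f(x) <= target:
                found = x
                break
        if found is None:
            return best
        best = found
```

On a multimodal test function such as f(x) = (x^2 - 1)^2 + 0.3x, which has a shallow local minimum near x = +1 and a deeper one near x = -1, the zooming loop rejects the shallow basin because it cannot meet the improvement target.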

  17. A Bayesian ensemble data assimilation to constrain model parameters and land-use carbon emissions

    NASA Astrophysics Data System (ADS)

    Lienert, Sebastian; Joos, Fortunat

    2018-05-01

    A dynamic global vegetation model (DGVM) is applied in a probabilistic framework and benchmarking system to constrain uncertain model parameters by observations and to quantify carbon emissions from land-use and land-cover change (LULCC). Processes featured in DGVMs include parameters which are prone to substantial uncertainty. To cope with these uncertainties Latin hypercube sampling (LHS) is used to create a 1000-member perturbed parameter ensemble, which is then evaluated with a diverse set of global and spatiotemporally resolved observational constraints. We discuss the performance of the constrained ensemble and use it to formulate a new best-guess version of the model (LPX-Bern v1.4). The observationally constrained ensemble is used to investigate historical emissions due to LULCC (ELUC) and their sensitivity to model parametrization. We find a global ELUC estimate of 158 (108, 211) PgC (median and 90 % confidence interval) between 1800 and 2016. We compare ELUC to other estimates both globally and regionally. Spatial patterns are investigated and estimates of ELUC of the 10 countries with the largest contribution to the flux over the historical period are reported. We consider model versions with and without additional land-use processes (shifting cultivation and wood harvest) and find that the difference in global ELUC is on the same order of magnitude as parameter-induced uncertainty and in some cases could potentially even be offset with appropriate parameter choice.
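The Latin hypercube step used to build the perturbed parameter ensemble can be sketched as follows (illustrative only; the study's actual sampler and parameter bounds are not specified here). Each parameter's range is cut into as many equal strata as there are ensemble members, every stratum is sampled exactly once, and strata are paired randomly across parameters:

```python
import random

def latin_hypercube(n_samples, bounds, seed=0):
    """Draw a Latin hypercube sample.

    bounds: list of (lo, hi) tuples, one per uncertain parameter.
    Returns n_samples parameter vectors; along each dimension, each of
    the n_samples equal-width strata contains exactly one sample.
    """
    rng = random.Random(seed)
    dims = len(bounds)
    samples = [[0.0] * dims for _ in range(n_samples)]
    for d, (lo, hi) in enumerate(bounds):
        strata = list(range(n_samples))
        rng.shuffle(strata)  # random pairing of strata across dimensions
        for i, s in enumerate(strata):
            u = (s + rng.random()) / n_samples  # uniform point within stratum s
            samples[i][d] = lo + u * (hi - lo)
    return samples
```

Compared with plain Monte Carlo sampling, this stratification guarantees that even a 1000-member ensemble covers the full marginal range of every parameter, which matters when each member is an expensive DGVM simulation.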

  18. Global controls on carbon storage in mangrove soils

    NASA Astrophysics Data System (ADS)

    Rovai, André S.; Twilley, Robert R.; Castañeda-Moya, Edward; Riul, Pablo; Cifuentes-Jara, Miguel; Manrow-Villalobos, Marilyn; Horta, Paulo A.; Simonassi, José C.; Fonseca, Alessandra L.; Pagliosa, Paulo R.

    2018-06-01

    Global-scale variation in mangrove ecosystem properties has been explained using a conceptual framework linking geomorphological processes to distinct coastal environmental settings (CES) for nearly 50 years. However, these assumptions have not been empirically tested at the global scale. Here, we show that CES account for global variability in mangrove soil C:N:P stoichiometry and soil organic carbon (SOC) stocks. Using this ecogeomorphology framework, we developed a global model that captures variation in mangrove SOC stocks compatible with distinct CES. We show that mangrove SOC stocks have been underestimated by up to 50% (a difference of roughly 200 Mg ha-1) in carbonate settings and overestimated by up to 86% (around 400 Mg ha-1) in deltaic coastlines. Moreover, we provide information for 57 nations that currently lack SOC data, enabling these and other countries to develop or evaluate their blue carbon inventories.

  19. Final Report from The University of Texas at Austin for DEGAS: Dynamic Global Address Space programming environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erez, Mattan; Yelick, Katherine; Sarkar, Vivek

The Dynamic, Exascale Global Address Space programming environment (DEGAS) project will develop the next generation of programming models and runtime systems to meet the challenges of Exascale computing. Our approach is to provide an efficient and scalable programming model that can be adapted to application needs through the use of dynamic runtime features and domain-specific languages for computational kernels. We address the following technical challenges: Programmability: a rich set of programming constructs based on a Hierarchical Partitioned Global Address Space (HPGAS) model, demonstrated in UPC++. Scalability: hierarchical locality control, lightweight communication (extended GASNet), and efficient synchronization mechanisms (Phasers). Performance Portability: just-in-time specialization (SEJITS) for generating hardware-specific code and scheduling libraries for domain-specific adaptive runtimes (Habanero). Energy Efficiency: communication-optimal code generation to optimize energy efficiency by reducing data movement. Resilience: Containment Domains for flexible, domain-specific resilience, using state capture mechanisms and lightweight, asynchronous recovery mechanisms. Interoperability: runtime and language interoperability with MPI and OpenMP to encourage broad adoption.
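The partitioned-global-address-space idea behind HPGAS can be illustrated with a toy single-process sketch (hypothetical Python, not the UPC++ API): a single global index range is partitioned across ranks, any rank may read or write any element, and ownership of each element stays explicit so locality can be exploited.

```python
class ToyPGAS:
    """Toy illustration of a partitioned global address space.

    A global range of n_elems indices is block-partitioned across
    n_ranks. Any rank can put/get any index (the 'global' part), but
    each index has exactly one owning rank (the 'partitioned' part).
    Real PGAS runtimes turn remote accesses into one-sided communication.
    """

    def __init__(self, n_ranks, n_elems):
        self.n_ranks = n_ranks
        self.chunk = (n_elems + n_ranks - 1) // n_ranks  # block size per rank
        self.parts = [dict() for _ in range(n_ranks)]    # one partition per rank

    def owner(self, i):
        # Block distribution: consecutive indices live on the same rank.
        return i // self.chunk

    def put(self, i, value):
        # Write to index i, local or remote (analogous to a one-sided rput).
        self.parts[self.owner(i)][i] = value

    def get(self, i):
        # Read index i from whichever rank owns it (analogous to rget).
        return self.parts[self.owner(i)].get(i)
```

The hierarchical variant (HPGAS) extends this flat ownership map with nested locality domains (node, socket, core), so that the runtime can distinguish cheap intra-node accesses from expensive inter-node ones.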

  20. Citizen Science Seismic Stations for Monitoring Regional and Local Events

    NASA Astrophysics Data System (ADS)

    Zucca, J. J.; Myers, S.; Srikrishna, D.

    2016-12-01

The earth has tens of thousands of seismometers installed on its surface or in boreholes that are operated by many organizations for many purposes, including the study of earthquakes, volcanoes, and nuclear explosions. Although global networks such as the Global Seismic Network and the International Monitoring System do an excellent job of monitoring nuclear test explosions and other seismic events, their thresholds could be lowered with the addition of more stations. In recent years there has been interest in citizen-science approaches to augment government-sponsored monitoring networks (see, for example, Stubbs and Drell, 2013). A modestly priced seismic station that could be purchased by citizen scientists could enhance regional and local coverage of the GSN, IMS, and other networks if those stations are of high enough quality and distributed optimally. In this paper we present a minimum set of hardware and software specifications that a citizen seismograph station would need in order to add value to global networks. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
